A Couple Questions About HD

tygrus2000 wrote on 1/22/2005, 10:03 AM
I have heard that 2005 is the year this gets introduced. How much will the HD DVD players cost when they come out, and what about an HD burner for your PC? What will the burner and the media be priced at?

What template in VV provides a true HD-spec video? I see there are WMV-HD formats, but that sounds like it's highly compressed. What would give you something that could be directly burned to HD media one day when it's available?

The whole HD format could be expensive for another year or two after its initial release, but consumers already have a way to view HD video - their PC. Is anyone making PC-only versions of their stuff for consumers to view?


riredale wrote on 1/22/2005, 10:17 AM
HDTV used to mean a couple of things, including doubled vertical and horizontal resolution, a wide-screen aspect ratio, and multichannel digital audio. Nowadays HDTV means pretty much any system delivering quality better than standard NTSC or PAL.

You can do HD now, using, for example, the new Sony camcorder. The trick is how to distribute and play what you shoot. The HD material can be encoded any number of ways, one of them being Microsoft's WMV format (which, incidentally, did not do well when compared to alternative codecs in Doom9's recent shootout).

Once encoded, you can distribute the data on any number of vehicles, including CD, DVD-5, DVD-9, or the upcoming HD-DVD or BluRay. The latter two are not necessarily "HDTV" formats, they just have a lot more space available per disk.

Then the question is how to play a disk. Right now, you can use a PC, as you mentioned. In the near future, DVD-type players might have any number of codecs installed. Finally, of course, you need an HD-capable display. Fifteen years ago that would have meant a 1920x1080 display, but now it could mean pretty much anything higher than 640x480.
B_JM wrote on 1/22/2005, 11:07 AM
you can buy an HD DVD player here for $249 ... works very well and is native HD and SD ..

AVeL Linkplayer2 http://www.iodata.com/
Barry_Green wrote on 1/22/2005, 2:15 PM
HDTVs have been available for sale in the USA since 1998... most cities already have HD broadcasts on the air. It's already available and released (although market penetration is very low).

In Europe, they haven't even settled on a recommended HD standard, but it is expected that the EBU's recommendation will be issued soon.

As far as when HDTV will be prevalent, that depends on the pace of consumer adoption, and it's a quite confusing situation. There is no standardized consumer HD distribution medium (such as HD-DVD or Blu-ray) on the market yet... JVC does offer D-VHS recorders, but they haven't been widely adopted by consumers. Furthermore, there are sets out there that are "HD Ready" but don't have a tuner in them, so they can't receive HD broadcasts! And since there's no HD-DVD player, it's probably a safe bet that people with "HD Ready" sets aren't able to use the HD capability at all yet. As well, some sets like plasmas may actually be EDTVs, but because they can accept an HD signal (by downrezzing it to standard-def) people think they're HDTVs when they're actually not.

The major USA broadcast networks have rolled out some HD programming (some sports, The Tonight Show, American Idol etc) so if you have an HD-capable set, an HDTV tuner (or VOOM service), and your local stations are broadcasting HDTV signals, then HD can be a reality for you today.
B_JM wrote on 1/22/2005, 3:18 PM
as i pointed out above -- there are already HD-DVD players ... several already, and a lot more to come in the next few weeks ... cheap really (these are not up-convert but true HD output, with storage on normal dvd disks)

and you've been able to buy a sony blu-ray for some time now (grey market in USA) - though still pricey at this point ..

the dust hasn't settled yet though on this whole thing ...

BillyBoy wrote on 1/22/2005, 4:31 PM
Until I got my HD plasma I had no idea how all over the map broadcasting is at this point in time. Funny, I'm starting to rediscover "nature" type broadcasts. They are breathtakingly real on the local public TV HD channel, WTTW Digital. They also seem to have the strongest signal.

Most of the major networks (ABC, CBS, NBC) at least have some "real" HD TV, like ER, and there is one HBO HD channel in the starter package I got.

File under the "you never know till you try" category.

The cable guy hooked up the cable box with component connections through my A/V receiver. But he only came with "Hi-Fi" red and white audio cables. So after he left (didn't want to hurt his feelings) I looked at the back of the cable box, and it has both coaxial and optical audio outputs. So I hooked that up and WOW - digital audio for TV, what a difference in quality!

Related question.

The cable box is a Scientific Atlanta 3250 HD Explorer. In addition to digital audio output it also has a DVI output. Anyone care to comment on the quality difference between the component and DVI outputs for TV broadcast via cable? The cable is all fiber-optic, put in by Ameritech several years ago but never used until WOW came along.
epirb wrote on 1/22/2005, 4:38 PM
I truly feel the 16x9 format will be adopted faster than true HDTV, for one because of, as Barry pointed out, the sudden rise in EDTVs on the market. People are buying these TVs for many reasons. For example, I know a few people that bought one thinking it was HDTV simply cuz it was a plasma screen. On top of consumer ignorance there is sometimes dealer "misinformation," such as a salesman pushing EDTVs with "see, this is the Discovery HD channel." In addition, many people just want the cool 16x9 look and don't really care to have HDTV. Heck, even my bank has 3 EDTV plasma screens behind the counter showing CNN or CNBC. Many of the manufacturers are offering more and more EDTVs because of the price point.

I also feel that Moore's Law not only applies to technology, but to the retail market as well in many ways.
Barry has given us good info in other posts as to the adoption of DVD by the masses, and the time it took. But look at the iPod (PLD) craze now and how little time it has been since they first hit the market.
Heck, my 20 gig HD player was $$$ a few years ago when they hit the market - now look at the cost of them. In just a few short years, way faster than DVD players did, iPods and the like have become a "what, you don't have one?!" item.
Simply put, the more the cost of high-tech stuff comes down, including HDTVs, LCDs, etc., the faster it will be adopted by consumers. That applies at the manufacturing as well as the retail level.

Ya never know - we may see HDTV adoption climb much more sharply than it has over its first years, given the decreasing costs of LCD TVs and other high-def-capable TVs.

BillyBoy wrote on 1/22/2005, 4:48 PM
One thing I found amusing was in the Panasonic manual, under troubleshooting.

"Some parts of the screen do not light up"

"The plasma display panel is manufactured using an extremely high level of precision technology, however, sometimes some parts of the screen may be missing picture elements or have luminous spots. This is not a malfunction."

I guess the pixels "missing" or being non-luminous is due to the extremely high level of precision technology.

I didn't check mine - there are "only" 1,049,088 of them...
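As an aside, that pixel count factors neatly. A quick arithmetic sketch (1366x768 is an assumption on my part; the manual quotes only the total):

```python
# Quick arithmetic check: the manual's 1,049,088 "picture elements"
# matches a 1366x768 widescreen panel exactly.  (The 1366x768 figure
# is an assumption; the manual itself gives only the total count.)
total_pixels = 1_049_088
width, height = 1366, 768
assert width * height == total_pixels
print(f"{width} x {height} = {width * height:,} pixels")
```

1366x768 was a common plasma panel resolution of the era, so the round total is no accident.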
epirb wrote on 1/22/2005, 4:54 PM
Well, get to counting, BB! I hear it's snowing like crazy in my old home town.

I read a while back that one of the reasons LCD panels (I would assume plasmas are no different) are still so expensive is for that reason.
They say that during manufacturing the failure rate of screens is as high as 50%, meaning the criteria for failed pixels are not met. Many times that can be one reason why some LCDs are cheaper than others. The company with a high standard for the fewest dead pixels will sell their "failed" screens to companies that say "we can live with that."
Most times people will not notice, unless of course there are a bunch of dead ones together in the middle of the screen. Then everybody on TV has the Kate Moss beauty mark :)
farss wrote on 1/22/2005, 4:55 PM
To answer your previous question: DVI, being digital, is better than component. However, 1080 is, it seems, right at the limits of what single-link DVI can handle; that's why the Apple monster needs dual-link DVI to drive it.
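To put rough numbers on that, here's a back-of-envelope sketch (two assumptions flagged in the comments: the 165 MHz single-link ceiling from the DVI spec, and a ~20% blanking overhead typical of display timings of the day):

```python
# Back-of-envelope DVI bandwidth check.  The 165 MHz single-link
# pixel-clock ceiling comes from the DVI 1.0 spec; the ~20% blanking
# overhead is an assumed figure typical of timings of the era.
SINGLE_LINK_LIMIT_MHZ = 165.0

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.20):
    """Approximate pixel clock required, including blanking intervals."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

for name, w, h in [("720p60", 1280, 720),
                   ("1080p60", 1920, 1080),
                   ("Apple 30in Cinema", 2560, 1600)]:
    clk = pixel_clock_mhz(w, h, 60)
    verdict = "fits single-link" if clk <= SINGLE_LINK_LIMIT_MHZ else "needs dual-link"
    print(f"{name}: ~{clk:.0f} MHz -> {verdict}")
```

By this estimate 1080p60 lands around 149 MHz, squeezing under the single-link limit, while the 2560x1600 Apple display blows well past it - hence the dual-link requirement.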
epirb wrote on 1/22/2005, 4:57 PM
But do you think it would be noticeable given the bandwidth of cable TV? Even for HDTV, I don't know - that's why I'm asking.
farss wrote on 1/22/2005, 4:58 PM
I'm told that in China they're flattening city blocks to build factories to build LCDs. Also, recent advances mean they can make them much larger than was previously possible. And on the horizon we have OLED starting to hit the market, though only on small screens so far.
farss wrote on 1/22/2005, 5:03 PM
Well, I can see the difference even on my 17" LCDs between VGA and DVI feeds. You are cutting out a D->A->A->D conversion; how noticeable, I couldn't say, but hey, if you've got the interconnect I'd sure go that way. If you had the choice between two systems, one with and one without, but you had to pay a much higher price for the DVI interconnect, I'd say do a side-by-side comparison before you parted with the money.

But component video, being analogue, is subject to many factors, including cable quality, the bandwidth of the drivers, and the quality of the filters after the D->A converters. With DVI, if it doesn't make the grade it all falls apart; with component you may not see the loss until it's extreme or you do an A/B comparison.
epirb wrote on 1/22/2005, 6:27 PM
BB, do you remember about 10-15 years ago when WTTW's airtime was pirated?
I saw it. I was watching an episode of Dr. Who (Tom Baker) when all of a sudden the picture went fuzzy and a bad imitation of Max Headroom came on. He did some goofy things, then MOONED the camera! Man, am I glad WTTW wasn't high-def then.
tygrus2000 wrote on 1/22/2005, 6:30 PM
Thanks for all the input. I have read that some experts think HD will take a long time to catch on, seeing as DVD is just reaching market saturation. I totally disagree... anyone who thinks this format won't catch on hasn't seen a "true" high-def signal. If the HD DVD players, blank media, PC burners, and a good collection of Hollywood titles are all available at the same time at a reasonable price, this will take off like wildfire.

I just want to make sure my slideshow project is ready to go over to HD when it comes through.
BillyBoy wrote on 1/22/2005, 8:44 PM
I remember that! I used to watch Dr. Who every Sunday night on WTTW. Loved K-9 the wonder dog, (robotic)...

I also remember when WTTW was so small a station that it was located in the Museum of Science and Industry at 57th and Lake Shore Drive. Once upon a time you could stand there in front of a "live" TV camera and see yourself on TV - an in-house broadcast, museum only.


I'll always have a soft spot for that museum, not just because of all the trips I made there like most Chicagoans have, but because of an interesting fact. My great uncle, same name as mine down to the middle initial, was CEO of Illinois Brick Co. Originally, the Museum of Science and Industry, as it's still known, was the home of the Field Museum until the 1920s (Marshall Field of department store fame), and before that it was part of the 1893 World's Columbian Exposition as the Palace of Fine Arts, but the buildings weren't designed to last.

Anyhow, if it wasn't for my great uncle donating the bricks to totally rebuild the place, one of the world's most popular museums might never have been. There used to be a small exhibit near the entrance to the coal mine that mentioned it. Go now and ask what they did with the exhibit, let alone the history, and they just look at you funny - and now you've got to pay to get in. It used to be free admission. Oh well...

JJKizak wrote on 1/23/2005, 6:59 AM
DVI has been discarded in favor of HDMI (video & sound), and all of the new-model HDTVs and DVD players are using these connections instead of DVI. This happened in a couple of weeks. Yuk! I saw no difference in quality between DVI and component, even though some DVI equipment can handle only 720p.


BillyBoy wrote on 1/23/2005, 8:49 AM
Which is another reason I bought the "professional" monitor version as opposed to the "consumer" TV version. The consumer version has the newer HDMI, while the monitor has a replaceable card option, so you can stick in a card that has a DVI input, or one that has HDMI, or just use the component input that's built in. Of course they could have put both DVI and HDMI, or even all three, on one card, but they didn't. They want you to pay another $100+ for that. <wink>

It's interesting that you see no difference in quality between component and DVI, since everyone selling a DVI- or HDMI-capable set is hyping it because it's a straight digital-to-digital connection if you use either DVI or HDMI, as opposed to component, which takes the signal analog again.
JJKizak wrote on 1/23/2005, 8:59 AM
I have a Sony tube-model HDTV and I couldn't see any difference, although I use component all the time because the TV will only handle 720p through the DVI even though it's a 1080i TV.

epirb wrote on 1/23/2005, 9:12 AM
Personally, with the right media or show in HDTV the picture is so awesome, be it component, HDMI, or DVI, that I don't "really" see a difference, just due to the WOW factor of such an incredible picture compared to NTSC "blurvision".
BillyBoy wrote on 1/23/2005, 9:27 AM
I'll second that. I was just flipping through channels the other day and stopped on WTTW-D and ended up watching an entire nature movie (it was about exploring the Grand Canyon) simply because the picture quality was so much better. Then later they had a country-western outdoor event on, and I ended up watching a lot of that, and I'm not that big a fan of that kind of music - but again, both the picture quality and the digital sound held my attention.

Funny observation... watching Bush's inaugural on the ABC broadcast in HD, you could clearly see that Bush's barber didn't bother to remove the hair from inside his ears. Just never saw it before - and you can't on "blurvision".
VegasVidKid wrote on 1/23/2005, 2:23 PM
Reminds me a lot of when color TV was first becoming popular. We had some Zenith 25" round tube "consolette" model, which at the time was about $550 (about the equivalent of what a decent HDTV setup would go for in today's dollars).

My parents would watch any show that was broadcast in color (there weren't many to choose from), including The Flintstones. I remember my dad yelling at me once because I was watching something in black and white on this TV (the only set in the house, BTW).

He was an electrical engineer, and was more concerned with adjusting the quality of the picture than watching the content. Every channel and every show required some kind of adjustment, and you had to sit on the floor to do it, because there was no remote.

For the initial setup/”calibration”, you needed a TV repair guy to come with a device called a color bar generator. Luckily, our set was one of the newer models, with a built-in degausser, so you didn’t need to call a technician whenever you moved the set!
riredale wrote on 1/23/2005, 9:10 PM
Down memory lane...

I can remember my first exposure to color TV was on a Halloween, probably around 1960 (I was 10). My best friend and I were trick-or-treating in the neighborhood, and as the lady opened the door to one house, I looked inside just as their new round-tube RCA color set was showing the NBC station-break peacock in glorious color. Both our jaws dropped to the floor.

A few years later when our family had our own round-tube RCA color set, I can remember that "Bonanza" was extremely popular primarily because it was in full color. I've seen a couple of old episodes recently, and was surprised that the quality of the show itself wasn't all that good. It was apparently the novelty of gorgeous color that made them fabulous back then.
Coursedesign wrote on 1/24/2005, 1:32 PM
When it comes to the insulting "blurvision" nickname for NTSC, let me point out one thing:

When the U.S. analog TV standard was finalized in the early 1940s, they had to decide on the number of scan lines to use. They decided to make it totally future-proof by allowing for screens up to 20 inches diagonally. The optimal seating distance was then calculated to be 8 feet at that size, and it was determined that no normal human being would be able to see the 525 interlaced scan lines at that distance, thus offering a smooth image of the highest definition the human eye allowed.

"Future-proof standards" have a tendency to run away from us...

BillyBoy wrote on 1/24/2005, 2:44 PM
I wonder how they came to such a conclusion. Today the optimal distance is supposed to be 2.5 times the diagonal width of the screen or beyond.

Reminds me.... back in the old West, trains traveling over 40 miles per hour (faster than any horse) were considered by many to be dangerous to one's health.

I wonder if man will ever exceed the speed of light like in all the science fiction movies. For decades "science" laughed at such an idea, but now a few are saying that, at least in theory, it may be possible.