OT: Need an explanation please

p@mast3rs wrote on 12/7/2005, 11:33 PM
Ok, I asked here once before but somehow I lost the thread I had bookmarked and nothing turned up in my search.

Here it goes. How come the video I get through my cable service looks good on my TV (no noise), but if I feed the same cable signal into my TV tuner card, the noise is way more apparent? Surely TVs don't have a more powerful denoiser built in, do they?

Can someone please explain why this is? I just can't believe that my five-year-old TV looks better viewing the signal than my computer, which has far more processing power. Maybe I'm overlooking a simple explanation, but this is driving me crazy.

Comments

johnmeyer wrote on 12/7/2005, 11:39 PM
Do you have an ATI card? The tuner, believe it or not, is absolutely terrible. Part of this is software (the later drivers are better) and part is hardware. The AVS Forums are full of chatter about these problems.

p@mast3rs wrote on 12/7/2005, 11:50 PM
Well I have two tuners. One is the ATI HDTV Wonder and then I have the Emuzed Maui card that came with my MediaCenter PC back in 2003. I have another HD Tuner card (DVico Fusion5) that I am probably going to swap out with the ATI card and try using the Maui device again and see if that improves my captures.

Here's a dumb question: am I correct in assuming I should see the same quality on the PC that I do on my TV? And if not, how does a TV eliminate noise better than a PC tuner card?
farss wrote on 12/8/2005, 12:02 AM
Is the cable feed analogue or digital? Down here I think it's mostly all digital, so no noise, period.
If it's analogue, it has nothing to do with CPU power; it's all down to the quality of the tuner, better RF filters, etc.

Bob.
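Bob's distinction can be sketched in a few lines of code. This is a hypothetical illustration (not from the thread): in an analogue channel, noise shifts every sample of the picture directly, while a digital channel sends discrete levels that the receiver re-thresholds, so moderate noise is discarded entirely and the picture is reconstructed exactly. The function names and noise levels are my own assumptions for demonstration.

```python
# Illustrative sketch: why analogue channel noise is visible on screen,
# while a digital channel (up to a point) shows none at all.
import random

random.seed(42)  # fixed seed so the demo is repeatable

def send_analogue(samples, noise):
    # Each sample's brightness is shifted directly by channel noise,
    # so every bit of RF noise ends up in the displayed picture.
    return [v + random.gauss(0, noise) for v in samples]

def send_digital(bits, noise, threshold=0.5):
    # Bits are sent as 0.0/1.0 levels; the receiver re-thresholds them,
    # so noise smaller than the decision margin is thrown away completely.
    return [1 if (b + random.gauss(0, noise)) > threshold else 0 for b in bits]

picture = [0.2, 0.5, 0.8, 0.3]   # toy "brightness" samples
bits = [0, 1, 1, 0]              # toy digital payload

analogue_out = send_analogue(picture, noise=0.05)  # every value slightly wrong
digital_out = send_digital(bits, noise=0.05)       # bits recovered exactly
```

The same logic explains "no noise, period": digital stays perfect until the noise exceeds the decision margin, at which point it fails abruptly rather than gradually getting snowy.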
p@mast3rs wrote on 12/8/2005, 12:14 AM
Well, I pay for "digital" cable, but the signal goes into the box digitally and comes out analogue. But here's something funny: if the signal feed coming into my house is indeed "digital," why can't it remain digital? Or does it have to be converted to analogue in the box in order for my TV to display it?
p@mast3rs wrote on 12/8/2005, 1:45 AM
Just changed out the ATI and put in my Fusion5 card, and my god, I can't believe the incredible quality difference between the two cards. Wow, now I definitely won't be getting any sleep tonight LOL.
p@mast3rs wrote on 12/8/2005, 2:51 AM
I also switched from the coax cable to S-Video, and that seemed to make a difference as well.

My viewing has gotten much better.
JJKizak wrote on 12/8/2005, 6:13 AM
You might even have been happier with the MY-HD-130 Tuner card.

JJK
johnmeyer wrote on 12/8/2005, 8:36 AM
Moral of the story: Some products really stink, and the ATI tuners are one such product.

Again, the AVS forums, to which I linked in my earlier post, are full of stories and rants, just like pmasters', about the horrendous RF sensitivity (noise in the tuner) of the ATI Radeon AIW cards. I have one, and know first-hand how bad it is. I've had it for three years, and software upgrades about a year ago made it somewhat better, but it still stinks. The baseband analog capture (composite video) can be pretty good, but as Marquat pointed out, audio sync can be a real problem, depending on what codec you use, and at what data rate.

Also, the ATI bundled capture software absolutely stinks. The later versions are finally functional (i.e., you can capture something without audio-sync problems, frame judder, dropped frames, etc.), but they're still not very flexible.