How fast a CPU do I need to eliminate the slow flicker while playing through
a fade or transition on the timeline prior to rendering? I presently have a
1 GHz CPU with 768 MB of PC100 SDRAM. My system is all SCSI and can capture at
18 MB per second with no dropped frames. The video card is an ATI All-In-Wonder with 32 MB.
The CPU is definitely the bottleneck. I just built an Athlon XP 2000+ with 333 MHz RAM. The frame rate bottoms out at 23 fps during a transition with the preview at full size in 'preview' mode; over 1394 it bottoms out at 15 fps.
At that speed the fps shortage is barely noticeable, but for solid real-time transitions we'll need quite a bit more horsepower. I'm thinking CPUs will be close enough in six months, so I'm holding off on upgrading until then.
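To put a rough number on "quite a bit more horsepower": assuming an NTSC target of 29.97 fps and that preview frame rate scales roughly linearly with CPU speed (a big simplification, since memory and the video card matter too), the shortfall works out like this:

    # Back-of-envelope estimate of the CPU headroom needed to hold full
    # frame rate during a transition. The 29.97 fps NTSC target is standard;
    # the linear fps-vs-CPU-speed scaling is an assumption for illustration.

    TARGET_FPS = 29.97  # NTSC full frame rate

    def speedup_needed(observed_fps: float) -> float:
        """Factor by which the CPU would need to be faster to hit the target."""
        return TARGET_FPS / observed_fps

    for label, fps in [("full-size preview", 23.0), ("1394 output", 15.0)]:
        print(f"{label}: {fps:.0f} fps -> need ~{speedup_needed(fps):.1f}x the CPU")

    # Output:
    # full-size preview: 23 fps -> need ~1.3x the CPU
    # 1394 output: 15 fps -> need ~2.0x the CPU

By that estimate, a CPU about 1.3x faster covers clean full-size preview, and about 2x covers the 1394 output.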
I'm running an AMD 900 with 512 MB SDRAM and a 64 MB GeForce2 video card and have no trouble with flickering during transitions before rendering. I recently stepped up from a 32 MB GeForce, and I don't remember any flickering before the change either.
Are you using the ATI AIW to capture? If so, what codec?
With DV source (FireWire capture) I get 28 fps through a dissolve on a 629x430 preview window at Draft quality through my video card. That's a P4 1.3 GHz, 128 MB RDRAM, Windows ME, GeForce2 MX.
I am using the FireWire port on my Audigy card with the Canopus
ADVC-100 converter. The ATI card works OK on S-Video input, but I
have been recording the S-Video material to my Panasonic DV-1000 and then
sending it over FireWire to Vegas. I wasn't really upset too much by the
flickering, since rendering cured the problem; it only occurs on transitions,
fades, and text titles. The external monitor is a Sony 13-inch TV.
Using the S-Video input on the ATI card at maximum quality, it was
capturing at 18 MB per second and filling up my hard drives faster
than you could blink an eye. Also, it takes me about 11 hours to render
an hour-and-a-half movie, and I wondered how I could cut that time.
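To put some rough numbers on the disk usage and render time (a back-of-envelope sketch; the ~3.6 MB/s figure for DV and the linear render-time scaling are my assumptions, not anything Vegas guarantees):

    # Rough disk-usage and render-time arithmetic for the figures above.
    # The ~3.6 MB/s DV stream rate (25 Mbit/s video plus audio and overhead)
    # is the usual rule of thumb; treating render time as scaling linearly
    # with CPU speed is an assumption for illustration.

    UNCOMPRESSED_RATE = 18.0  # ATI AIW S-Video capture, MB per second
    DV_RATE = 3.6             # DV over FireWire, MB per second

    def gb_per_hour(mb_per_second: float) -> float:
        """Convert a capture rate in MB/s to GB consumed per hour of footage."""
        return mb_per_second * 3600 / 1024

    print(f"AIW capture: ~{gb_per_hour(UNCOMPRESSED_RATE):.0f} GB per hour of footage")
    print(f"DV capture:  ~{gb_per_hour(DV_RATE):.0f} GB per hour of footage")

    # Render-time ratio: 11 hours for a 90-minute movie.
    render_ratio = 11.0 / 1.5
    print(f"Render speed: ~{render_ratio:.1f}x slower than realtime")

    # Output:
    # AIW capture: ~63 GB per hour of footage
    # DV capture:  ~13 GB per hour of footage
    # Render speed: ~7.3x slower than realtime

If rendering really is CPU-bound, that 7.3x ratio should drop roughly in proportion to CPU speed, so a machine twice as fast would still take around 5.5 hours for the same movie.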