ideal / best bitrate for full HD50i project?

Mindmatter wrote on 11/12/2010, 10:16 PM
Hello,
I'm just getting my head around the intricacies of DVD making, so I'm running into a few probably fundamental questions...
My first attempt at getting this particular 50-minute project onto DVD looks OK, but just OK. It was filmed in full HD and looks great in Vegas, but once on DVD it's less crisp, less defined. It looks OK on the computer and laptop, but very noisy on my older regular TV when played on a standard DVD player, at least far noisier than the regular satellite TV program on the same TV. From Vegas, I rendered the project using the DVDA MPEG-2 PAL widescreen template.
I used the 8 Mbps setting in the DVDA template, and as far as I can see, I can only go up to 9.8?
How high can / should I set the bitrate to do the HD material justice? What are the typical bitrates on, say, commercial movie DVDs? Should I use the HD 25 Mbps template, with constant or variable bitrate? Should I set "DVD" as the output type? Can regular TV sets or DVD players handle those HD settings?
Questions, questions... as you can see, I'm still somewhat confused. Thanks for any help!

AMD Ryzen 9 5900X, 12x 3.7 GHz
32 GB DDR4-3200 MHz (2x16GB), Dual-Channel
NVIDIA GeForce RTX 3070, 8GB GDDR6, HDMI, DP, studio drivers
ASUS PRIME B550M-K, AMD B550, AM4, mATX
7.1 (8-channel) Surround-Sound, Digital Audio, onboard
Samsung 970 EVO Plus 250GB, NVMe M.2 PCIe x4 SSD
be quiet! System Power 9 700W CM, 80+ Bronze, modular
2x WD Red 6TB
2x Samsung 2TB SSD

Comments

TOG62 wrote on 11/13/2010, 12:18 AM
I suspect you're getting about as good a result as possible. The best quality you can get on a DVD is well below that of an HD source. You might well find that the result would look significantly better played on an HD TV which upscales the SD.

There is no way to play HD material on a standard DVD player. You can make an AVCHD-on-DVD disc, but you would need a Blu-ray player and an HD TV to view it.
PeterDuke wrote on 11/13/2010, 2:37 AM
The normal consequence of converting HD to SD to put on a DVD is that the result by comparison looks less sharp, not noisy. Can you describe the noise? Is it flickering? Speckled?

A typical bit rate for a DVD is 8000 kbps. DVD players have difficulty with bit rates much higher than that.
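As a rough sanity check on those numbers, here's a back-of-the-envelope bitrate budget for a 50-minute program on a single-layer disc (a sketch in Python only; the 4.37 GiB usable capacity, single 224 kbps audio stream and 4% muxing overhead are assumed figures, not anything taken from Vegas or DVDA):

# Back-of-the-envelope: highest average video bitrate that still fits
# a program of a given length on a single-layer DVD-5.
# Assumed: ~4.37 GiB usable, one 224 kbps audio stream, ~4% mux overhead.
def max_video_kbps(minutes, capacity_gib=4.37, audio_kbps=224, overhead=0.04):
    usable_bits = capacity_gib * 1024**3 * 8 * (1 - overhead)
    total_kbps = usable_bits / (minutes * 60) / 1000
    return total_kbps - audio_kbps

print(round(max_video_kbps(50)))  # roughly 11,800 kbps

In other words, a 50-minute project isn't capacity-limited on a single-layer disc at all; the practical limits are the 9.8 Mbps DVD video ceiling and, more importantly, what the player will reliably sustain.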

Some people consider the MPEG2 encoder in Vegas to be superior to that in DVDA. Have you tried doing the down-conversion in Vegas? Be sure to set the video quality to "Best" and set the deinterlace method to something other than "none", e.g. "interpolate".
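For what it's worth, here's roughly the same down-conversion expressed as an ffmpeg call driven from Python - purely as an illustration of what those settings control, not of what Vegas does internally; the file names, filter and bitrate choices below are my own assumptions:

import subprocess

# Illustration only: interlace-aware 1080i -> PAL DVD down-conversion.
# File names, filter and bitrate choices are assumptions, not Vegas internals.
cmd = [
    "ffmpeg", "-i", "input_1080i.m2t",            # hypothetical HD source
    "-vf", "scale=720:576:interl=1,setdar=16/9",  # field-aware scaling, keep 16:9
    "-target", "pal-dvd",                         # DVD-compliant MPEG-2 defaults
    "-b:v", "6000k",                              # average video bitrate
    "-maxrate", "8000k", "-bufsize", "1835k",     # stay within the DVD VBV buffer
    "-b:a", "224k",
    "output_pal_dvd.mpg",
]
subprocess.run(cmd, check=True)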
musicvid10 wrote on 11/13/2010, 4:44 AM
"Some people consider the MPEG2 encoder in Vegas to be superior to that in DVDA."

It's the same encoder. Custom controls are not exposed in DVDA . . .

Also, many DVD players will choke on sustained 8 Mbps. 6 Mbps CBR or ABR is a lot safer.
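If you're not sure what average bitrate a finished encode actually ended up at (peaks aside), dividing file size by running time gets you close enough - a quick sketch, with a made-up file name and duration:

import os

# Average total (video + audio) bitrate of a rendered file, estimated from
# its size and running time. The file name and duration are placeholders.
def avg_kbps(path, duration_seconds):
    return os.path.getsize(path) * 8 / duration_seconds / 1000

print(round(avg_kbps("project_pal_dvd.mpg", 50 * 60)), "kbps average")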
PeterDuke wrote on 11/13/2010, 3:55 PM
"It's the same encoder"

Another difference is that the Vegas encoder is multi-threaded, whereas the DVDA encoder is single-threaded. DVDA takes over four times longer to encode on my quad-core PC.
Mindmatter wrote on 11/14/2010, 9:02 AM
Thanks a lot everyone for your help! I'm getting cleverer by the day thanks to these forums...
I had already gathered that it's wiser to encode in Vegas and import into DVDA, which is what I've done for this project. On my first attempt, I had a feeling that the DVD player was kind of choking on the HD parts of the project and was smoother in the SD sequences.
As I thought I had messed up some settings, I've re-rendered the project, and now things seem much better overall. The only "problem" I see, and which other viewers here have confirmed, is one scene from the HD cam showing stony forest ground with a lot of light and dark sun/shadow patches. The moving animals and people seem OK, but as soon as the camera pans over the ground, there's a lot of... well, I'd call it digital flickering(?). It looks as if you were seeing the Vegas preview screen at very low resolution, where things start to pixelate.

It is very strange and annoying, as if those high-contrast images couldn't be handled fast enough by the player. The weird thing is, all the other scenes in that clip, away from the stony forest ground, are fine...
There's also what I call "noise" - it's hard to define. It's as if black weren't really black but sort of mixed with faint white static, if that makes any sense. It looks a bit as if you had pushed the camera gain up by 9 dB.
I'll check it later on my laptop and see if it's maybe TV-related.

A side question: what sort of bitrate does satellite TV broadcast at, and why does that picture look so clean and smooth on my TV?

Thanks again everyone for your help!
