Annoyed and confused with 1080p encoding...

Sunflux wrote on 12/3/2009, 1:09 AM
For years I've been rendering my 1080i HDV projects to 720p WMV for web distribution (along with lower quality levels, of course). With much experimentation I keep managing to tweak the bitrates down as my projects get larger - I originally started at 4000 kbit, but with my latest one I've got it down to 2500 kbit without it looking OVERLY horrible.
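For reference, here's the back-of-the-envelope math for scaling a bitrate between resolutions at constant bits-per-pixel (a rough sketch - I'm assuming 30 fps here, HDV is really 25 or 29.97, and a more efficient codec like AVC usually lets you undercut the scaled number):

```python
# Rough bits-per-pixel comparison: what a 720p bitrate implies for 1080p.
# Assumes 30 fps; HDV material is really 25 or 29.97 fps, so adjust.

def bits_per_pixel(kbps, width, height, fps=30.0):
    return (kbps * 1000) / (width * height * fps)

bpp_720 = bits_per_pixel(2500, 1280, 720)
print(f"720p @ 2500 kbit: {bpp_720:.3f} bpp")   # ~0.090 bpp

# Bitrate needed for the same bpp at 1920x1080 (2.25x the pixels):
kbps_1080 = bpp_720 * 1920 * 1080 * 30 / 1000
print(f"1080p at the same bpp: ~{kbps_1080:.0f} kbit")  # ~5625 kbit
```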

Now I would like to start offering a 1080p version, but I'm not so sure that WMV is the best option for this. Yes, it's the most widely supported, but I was thinking of giving MPEG-4/AVC a shot. The problem is it seems I have to completely start over on figuring out how to tweak this beast.

So I check out 8.0c's Mainconcept AVC/AAC codec, and can only find preset SD iPod profiles. Yes, I can tweak it to 1080p resolution, but is this really what I should be using?

So, next: the Sony AVC codec. Ah, proper HD profiles. Except it doesn't seem to support VBR, which I feel is a must for something pushing the bottom of acceptable bitrates.

Finally, over to TMPGEnc which I use for encoding my WMVs (due to quirks with Vegas' black level output with WMVs, as well as its generally poor deinterlacer). Checking out *their* Mainconcept MPEG-4 AVC codec, I again find only low-end SD profiles, but can crank up the resolution and bitrate. But here there are numerous settings that I have no idea what they should be set to - motion search range, GOP length, quantization, entropy coding mode, motion estimation subpel mode... there are default values, but what works for an iPhone may not be ideal for low bitrate 1080p.
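For orientation, here's my current understanding of what those knobs control, annotated with the defaults the open-source x264 encoder uses - illustrative only, not tuned values for MainConcept or TMPGEnc:

```python
# What the common AVC encoder settings control. The values shown are
# x264's defaults, NOT recommendations for MainConcept or TMPGEnc.
avc_settings = {
    "gop_length": 250,          # max frames between keyframes; shorter helps seeking, hurts compression
    "entropy_coding": "CABAC",  # CABAC typically saves 5-15% bitrate over CAVLC at the same quality
    "me_range": 16,             # motion search range in pixels; larger helps fast motion, slows encoding
    "subpel_me": 7,             # sub-pixel motion refinement level; higher is slower but more precise
    "b_frames": 3,              # bidirectional frames between references; usually a free compression win
}
for name, value in avc_settings.items():
    print(f"{name}: {value}")
```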

Alas searching Google is no help, as various entities have littered the net with tens of thousands of poorly written video conversion guides.

Anyone here know some decent recommended settings for 1080p web distribution? So far I've established some baseline bitrates for both WMV and MP4, but little else.

Thanks for any assistance...

Comments

farss wrote on 12/3/2009, 2:25 AM
I suspect one of the hurdles involved is licensing. The other issue is I doubt there's a lot of push for streaming 1080p, as the bandwidth requirements and the horsepower needed to decode it mean Joe Average will not have a pleasant viewing experience.

As for your troubles with WMV, I cannot see how using a different codec would change black levels or de-interlacing. If you shoot 1080i and want to deliver 1080p, something has to give: by design, 1080i cannot deliver the same resolution as 1080p, and that's before you consider the issues of de-interlacing. Mike Crash's Smart De-Interlacer plugin for Vegas does a pretty good job, though. In areas of motion it interpolates, which does halve the resolution, but the motion blur makes this largely irrelevant.
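To illustrate the idea (just the general weave-vs-bob concept, not Mike Crash's actual algorithm):

```python
import numpy as np

def smart_deinterlace(prev_frame, cur_frame, threshold=12):
    """Toy motion-adaptive deinterlace on 2-D grayscale uint8 frames:
    keep full resolution (weave) where the picture is static, and
    interpolate between scan lines (bob) where motion is detected."""
    out = cur_frame.astype(np.float32)
    moved = np.abs(cur_frame.astype(np.int16) - prev_frame.astype(np.int16)) > threshold
    # Average of the lines above and below (wraps at the frame edges).
    interp = (np.roll(out, 1, axis=0) + np.roll(out, -1, axis=0)) / 2
    # Replace only the moving pixels on the stale field's lines.
    out[1::2][moved[1::2]] = interp[1::2][moved[1::2]]
    return out.astype(np.uint8)
```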

Bob.
Sunflux wrote on 12/3/2009, 2:56 AM
Well, I guess my concern with distributing 1080i is I'm not sure how well the "mass audience" will deinterlace it. Although my 1080i M2T files look *wonderful* on my system...

Also regarding the black levels - what happens is, with WMV the black levels (I guess gamma or contrast, I haven't really done a full greyscale analysis) of all of my video clips are pushed higher, so what was near black (say RGB 250-250-250) is much brighter (say 235-235-235) when played back. This doesn't happen with AVC, MPG, Cineform, etc., and doesn't happen if I render to M2T and then render to WMV using some other encoder.

The main result is that WMV files appear to lack saturation or "punch" when played back, since the contrast is lower.
amendegw wrote on 12/3/2009, 3:31 AM
"The other issue is I doubt there's a lot of push for streaming 1080p as the bandwidth requirements and the horsepower needed to decode it mean Joe Average will not a have a pleasant viewing experience"

One thing to consider is IIS Smooth Streaming where the bitrate on the video stream is adjusted for the client's connection speed. Here's a clever demo: http://www.iis.net/media/experiencesmoothstreaming
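The core trick, as I understand it, is that the server keeps several bitrate renditions of the same video and the client picks one per chunk based on measured throughput - something like this hypothetical selection logic:

```python
# Hypothetical sketch of adaptive-streaming client logic: pick the
# highest rendition the measured throughput can sustain, with headroom.
RENDITIONS_KBPS = [500, 1000, 2500, 5000, 8000]  # example bitrate ladder

def pick_rendition(measured_kbps, headroom=0.8):
    affordable = [r for r in RENDITIONS_KBPS if r <= measured_kbps * headroom]
    return max(affordable) if affordable else min(RENDITIONS_KBPS)

print(pick_rendition(3500))  # -> 2500: a 3.5 Mbit connection gets the 2.5 Mbit stream
```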

"Now I would like to start offering a 1080p version, but I'm not so sure that WMV is the best option for this"

A while back I did a little testing on mp4 vs wmv and found that wmv played more smoothly using Silverlight (note: this is NOT IIS Smooth Streaming, as my hosting service doesn't support it, and it's 720p, not 1080p). Both clips were rendered using the respective Vegas 9.0c encoders (the mp4 version was rendered with the Sony encoder, but maybe by the time you read this I'll have a Mainconcept option on this web page. Edit: now using the Mainconcept encoder.) See: http://www.jazzythedog.com/ducks.aspx

Another thing to consider is Microsoft's Expression Encoder. See: http://blogs.msdn.com/expressionencoder/archive/2009/10/09/9905564.aspx I don't have a whole lot of experience with it, so I don't know if it is better or worse than your other options. You can download a 60-day trial here: http://expression.microsoft.com/en-us/cc136533.aspx (Edit: there is also a free version of the Encoder if you don't want the entire Expression suite of products.)

Unfortunately, none of this answers your specific questions, but hopefully it's of some help.

Good Luck,
...Jerry


farss wrote on 12/3/2009, 5:12 AM
"so what was near black (say RGB 250-250-250) is much brighter (say 235-235-235) when played back"

Now I'm really confused as those values are pretty close to white!
Also 235 is darker than 250!

If what you mean is that the blacks from your camera end up slightly grey, then this is understandable, as I think the WMV player uses Computer RGB rather than Studio RGB. To compound the problem, many cameras today push the whites to Y' = 255 or close to it. That can be a problem because if you use the Studio to Computer RGB conversion you'll clip your whites.
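The math behind that, for reference - the standard 16..235 studio-range expansion, with the white clipping falling straight out of it:

```python
def studio_to_computer(y):
    """Expand studio-range luma (16..235) to full range (0..255).
    Anything the camera recorded above Y' = 235 maps past 255 and
    clips - exactly the blown-whites problem described above."""
    return min(255, max(0, round((y - 16) * 255 / 219)))

print(studio_to_computer(16))   # -> 0   (studio black -> full black)
print(studio_to_computer(235))  # -> 255 (studio white -> full white)
print(studio_to_computer(250))  # -> 255, clipped: highlight detail lost
```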
The way to fix this is to use the Color Curves FX. Open the waveform monitor and switch it to Computer RGB. Find a scene with solid blacks. In the Color Curves FX, add a node close to the bottom of the line and pull it down to the X axis. Now slide the node along the X axis while watching the waveform monitor. You want the bottom (black) of the waveform just kissing 0%. That's it.
If that is too confusing send me an email and I'll send you a project file with the CC preset in it.

Bob.
Jøran Toresen wrote on 12/3/2009, 7:05 AM
Bob: "Open the waveform monitor and switch it to Computer RGB."

When I open Video Scopes - Waveform I only have two options: Luminance and Composite. Where do I find Computer RGB?

Jøran Toresen
farss wrote on 12/3/2009, 11:58 AM
At the top of the window there's a drop-down to select the type of scope (waveform, vectorscope etc.). Next to that is a grey icon with an upwards-left arrow, and then an icon with a little scope in it.

The grey one lets you select 7.5 IRE setup and Computer/Studio RGB. You should never select 7.5 setup, as there's no setup in digital video.

The other icon switches between 'live' scopes and static. Note that running the scopes live costs CPU cycles and can affect preview performance.

Bob.

Jøran Toresen wrote on 12/3/2009, 1:17 PM
Thank you, Bob. I wasn't aware of those Video Scope Settings.

Jøran Toresen
Sunflux wrote on 12/3/2009, 3:51 PM
farss: Sorry, that's what I get for posting at 6 AM... was up encoding all night. I do mean near-black.

The thing is, I don't select any specific color space - everything is left at default. Previews are normal (which means fixing it would be a crapshoot and would make any OTHER encodes inaccurate). Renders to other formats are all normal. Renders to WMV using other encoders are normal. And with other encoders, blacks from the camera DO end up slightly grey. But with Vegas, they're VERY grey.

This is with a Sony FX1 (I've had it since it and the JVC were the only HDV options on the market).

It was incredibly annoying for several years, but now I've grown to appreciate an adaptive deinterlacer (though I guess I should really try the smart filter mentioned).

With that said - I know I had this problem from around version 5 up to 8.0b. However, last night I tried encoding a greyscale graphic to show the problem in 8.0c and it didn't seem to be happening. But I need to encode some actual nighttime footage to confirm, and right now my PC has been encoding for the last 18 hours, so I can't.
fausseplanete wrote on 12/3/2009, 11:26 PM
Luma ranges: the older WMV wants the "Full RGB" / "PC" / "Computer" 0..255 range, while the newer H.264 etc. want the "Studio" 16..235 range. Computers are becoming more Studio, it seems.

Quickest conversion method: apply the Levels FX (e.g. as a Master FX, on the Preview) and select the appropriate preset/template - it has them for both Computer->Studio and Studio->Computer. You can always add one for Camcorder->Studio.

I think Levels is more appropriate than Curves here, because a linear mapping is what's needed for this kind of standards conversion.

Grading is another matter: I used to rely mostly on curves for altering picture balance, e.g. raising shadowy areas, but even here I now find the Gamma slider in Levels often does the job, so that's now my first port of call. Levels eats up less CPU power than Curves, presumably because it involves a simpler formula. Curves I reserve for things like "look and feel", especially camera-matching, and wrangling hard scenes with faces, bright sunlight and shadowy areas.
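To illustrate why Levels is cheap: it boils down to a linear remap plus a single gamma power per pixel, whereas a curve has to evaluate a spline. A sketch with normalized 0..1 values (the "gamma > 1 brightens" convention is my assumption about how the slider maps):

```python
def levels(x, in_lo=0.0, in_hi=1.0, out_lo=0.0, out_hi=1.0, gamma=1.0):
    """What a Levels-style FX amounts to: linear input remap,
    one gamma power, linear output remap."""
    x = (x - in_lo) / (in_hi - in_lo)            # linear input remap
    x = max(0.0, min(1.0, x)) ** (1.0 / gamma)   # gamma > 1 lifts shadows
    return out_lo + x * (out_hi - out_lo)        # linear output remap

# Raising shadowy areas with the gamma slider instead of a curve:
print(round(levels(0.1, gamma=1.5), 3))  # 0.1 -> ~0.215
```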