Comments

adis-a3097 wrote on 8/29/2019, 11:47 AM

Youtube has been doing HDR for a year or more - it's even a filter setting. Dolby Vision has been around 4 to 5 years. There's tons of content out there. HDR TVs are flying off the shelves (okay, okay, shuffling slowly here and there). There are already companies out there producing lots of HDR content. There are many super-detailed tutorials out there on HDR... I may buy a $200 4K HDR TV just to use as an output monitor. This isn't new. NEW to Vegas, finally. I could just stick with DaVinci, but I enjoy the feel of Vegas and want to push its envelope. Bleeding edge sucks most when confronted with "you can't"-ism. I only live in "How CAN we?" Find a way. :)

Cameras don't SHOOT HDR. They have to be graded into a range closer to our eyesight because of their shortcomings. Exactly like LOG has been handled since the dawn of LOG :D

Um, convert lead (Pb) into gold (Au), while you're at it? 🙂
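
The quoted "graded into range... like LOG" idea is easy to see with a toy curve. A minimal sketch, assuming a made-up 12-stop log curve (this is NOT any real camera's transfer function, just the shape of the pack-then-unpack idea):

import numpy as np

# Toy log curve -- purely illustrative, NOT any real camera's transfer function.
# A scene range of ~12 stops is squeezed into a flat 0..1 signal, then
# stretched back out ("graded") afterwards.
STOPS = 12.0
FLOOR = 2.0 ** -STOPS          # darkest scene value the toy curve encodes

def toy_log_encode(linear):
    """Scene-linear light (1.0 = clip) -> flat log signal in 0..1."""
    return np.log2(np.clip(linear, FLOOR, 1.0) / FLOOR) / STOPS

def toy_log_decode(signal):
    """Inverse: expand the flat signal back to scene-linear values."""
    return FLOOR * 2.0 ** (signal * STOPS)

scene = 2.0 ** -np.arange(13)      # one sample per stop, from 1.0 down to 2^-12
flat = toy_log_encode(scene)
back = toy_log_decode(flat)
for s, f, b in zip(scene, flat, back):
    print(f"scene {s:9.6f} -> log signal {f:5.3f} -> decoded {b:9.6f}")

The curve only redistributes values that were actually captured; decoding stretches them back out, it doesn't add dynamic range.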

Musicvid wrote on 8/29/2019, 12:33 PM

Um, convert lead (Pb) into gold (Au), while you're at it? 🙂

I thought it was H2O into Ri (Ripple).

adis-a3097 wrote on 8/29/2019, 1:12 PM

I thought it was H2O into Ri (Ripple).

Nah, that's a common mistake. Originally, it was converting it to Wi (Wine), but that wasn't challenging enough. What exciting times we live in...😂

wwjd wrote on 8/29/2019, 4:36 PM

you guys are killing me! hahahahahahhaahhahaa :D

So, cameras do not REALLY shoot HDR... yes, HLG helps get there, but it's not HDR until processed in post. And log, of course, is the other direction: all flattened out to capture more, and stretched out in post... so HOW do they get HDR imagery? It's in the grading. Sure, 10 bit works better than 8, but 10 is STILL expanded in grading. I don't think we have to suffer posterization.

Surely, someone here has done 8 into HDR. Happy to blaze that path, just not on my radar ATM, so just asking others. I'll get to it.

Musicvid wrote on 8/29/2019, 5:13 PM

@adis-a3097

So, after scouring the internet and finding nothing, I stumbled across the perfect solution to filling in all those colors -- on a Windows 95b hard disk!

Wolfgang S. wrote on 8/30/2019, 2:01 AM

So, cameras do not REALLY shoot HDR... Sure, 10 bit works better than 8, but 10 is STILL expanded in grading. I don't think we have to suffer posterization. Surely, someone here has done 8 into HDR.


Read the mysterybox articles that you have mentioned - and you will see that HDR typically requires 10 bit (HLG, HDR10) or even 12 bit (Dolby Vision). BUT that does not mean that we are unable to show a higher dynamic range with 8 bit too (but with a lot of banding in many cases - that is why we do not intend to do that).

And log is a format that saves the potential 12-15 stops that we can record today. In my opinion it is HDR simply because we have more than 6 stops in that footage. But it is not compliant with the existing HDR standards, and log must be graded to the existing PQ or HLG formats. So no issue here, no surprise here, and no grading from Pb to Au. If you really want to go for HDR, start shooting in 10 bit or higher and use log, or cameras that shoot to PQ. Try to avoid the HLG 8 bit consumer cameras.

That's it.
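
The banding point is easy to demonstrate numerically. A rough sketch (the stretch below is a generic power curve standing in for an aggressive HDR grade, not anything Vegas specifically does): count how many distinct output codes survive when an 8-bit ramp is expanded, versus a 10-bit ramp.

import numpy as np

# Why stretching 8-bit material toward HDR tends to band: an 8-bit ramp has at
# most 256 distinct levels, and expanding its contrast cannot invent in-between
# values -- it only widens the gaps between the levels that are already there.
ramp_8bit = np.arange(256) / 255.0      # every possible 8-bit level
ramp_10bit = np.arange(1024) / 1023.0   # every possible 10-bit level

def expand(signal, gamma=0.5):
    """Stand-in for an aggressive grade: simple power stretch to 10-bit codes."""
    return np.round(1023 * signal ** gamma).astype(int)

print("distinct output codes from an 8-bit source :", len(set(expand(ramp_8bit))))
print("distinct output codes from a 10-bit source :", len(set(expand(ramp_10bit))))
# The 8-bit source can never occupy more than 256 of the 1024 output codes, so
# neighbouring values jump in visible steps -- banding/posterization.

Dithering or noise can mask the steps somewhat, but the missing in-between values themselves never come back.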

Desktop: PC AMD 3960X, 24x3,8 Mhz * RTX 3080 Ti (12 GB)* Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb Lan * Resolve Studio 18 * Edius X* Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED * internal HDR preview * i9 12900H with i-GPU Iris XE * 32 GB Ram) * Geforce RTX 3070 TI 8GB * internal HDR preview on the laptop monitor * Blackmagic Ultrastudio 4K mini

HDR monitor: ProArt Monitor PA32 UCG-K 1600 nits, Atomos Sumo

Others: Edius NX (Canopus NX)-card in an old XP-System. Edius 4.6 and other systems

wwjd wrote on 8/30/2019, 8:05 AM

(but with a lot of banding in many cases -

Sounds like no one has tried it yet. Happy to when I get to it. Just thought someone might have already tried.

 

Musicvid wrote on 8/30/2019, 8:10 AM

So what's not good about HLG?

It looks like the best of both worlds; fat bottom, skinny top where the colors are, and 18 stops. Just look at what it does. Yes, flat log is real 10 bit, think of it as clever use of real estate.
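
That "fat bottom, skinny top" shape is literally how the HLG OETF is written down in ITU-R BT.2100: a square-root (gamma-like) segment for the bottom twelfth of the scene range and a log segment above it. A small sketch:

import math

# HLG OETF per ITU-R BT.2100 / ARIB STD-B67: square-root below 1/12 of nominal
# peak, logarithmic above it.
A = 0.17883277
B = 1.0 - 4.0 * A                   # 0.28466892
C = 0.5 - A * math.log(4.0 * A)     # 0.55991073

def hlg_oetf(e):
    """Scene-linear e in [0, 1] (1.0 = nominal peak) -> HLG signal in [0, 1]."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return A * math.log(12.0 * e - B) + C

# Scene light up to 1/12 of peak lands in the bottom half of the signal;
# everything brighter shares the top half logarithmically.
for e in (0.0, 0.01, 1.0 / 12.0, 0.25, 0.5, 1.0):
    print(f"scene {e:6.4f} -> HLG signal {hlg_oetf(e):.4f}")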

adis-a3097 wrote on 8/30/2019, 9:12 AM


Please, wwjd, consider this:

https://poynton.ca/notes/Timo/Weber_and_contrast_ratio.html

This guy really, and I mean really, knows his shite, to say the least. Pardon my French! 🙂

Browse through the whole site, it's worth its weight in gold!

And there's this:

https://4kmedia.org/real-or-fake-4k/

It's not about fake HDR, it is about fake resolution, but you'll get the point.

And this guy:

Again, it's about fake vs. real. And if you're OK with fake HDR, I'm also OK with you being OK with fake HDR. Just mentioning it so you don't get deluded into thinking it can be done the way you think it can. It can't; you can only fake it, and with 8 bit video (previously compressed-to-death MPEG/AVC/HEVC) it's like trying to upgrade prerecorded cassette sound (remember those? frequency-limited, with dynamics equaling 8 bit) into huge cinema sound - that's all I'm saying. Not worth the effort... 😉
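
For what it's worth, the Weber-contrast argument behind the Poynton link above reduces to a bit of arithmetic: if the eye can distinguish roughly 1% luminance steps, spanning a given contrast ratio without visible steps takes at least log(ratio)/log(1.01) ratiometrically spaced codes. A rough sketch, assuming that ~1% figure:

import math

# Rough Weber/contrast-ratio arithmetic, assuming a ~1% just-noticeable
# luminance step.
WEBER = 0.01

def steps_needed(contrast_ratio):
    """Minimum number of 1%-spaced (ratiometric) steps spanning a contrast ratio."""
    return math.ceil(math.log(contrast_ratio) / math.log(1.0 + WEBER))

for label, ratio in [("SDR-ish   1 000:1", 1_000),
                     ("HDR-ish 100 000:1", 100_000)]:
    print(f"{label}: ~{steps_needed(ratio)} steps "
          f"(8 bit offers 256 codes, 10 bit offers 1024)")

Gamma, log, and PQ curves space their codes close to ratiometrically, which is why 10 bit plus a log-ish curve can cover an HDR range while a linearly stretched 8-bit source cannot.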

 

Musicvid wrote on 8/30/2019, 1:34 PM

You mean like HDR-"Capable" teevees?

Hype has become its own commodity.

adis-a3097 wrote on 8/30/2019, 2:54 PM

You mean like HDR-"Capable" teevees?

Hype has become its own commodity.

Yeah, wwjd does sound like a salesperson. Kinda. 🙂

No pun intended.

Musicvid wrote on 8/30/2019, 5:29 PM

Ok, matching wits is fun, but let's leave the personal comments out of it. He's a creative contributor, and I can't match that.

Wolfgang S. wrote on 8/31/2019, 2:16 AM

So what's not good about HLG?

It looks like the best of both worlds; fat bottom, skinny top where the colors are, and 18 stops. Just look at what it does. Yes, flat log is real 10 bit, think of it as clever use of real estate.

There is nothing bad about HLG as a delivery format (besides the potential issue that it should come with the Rec. 2020 gamut, which is a limitation for backward compatibility with Rec. 709 TVs).

But HLG is not so nice to grade due to the limited number of luminance values in the highlights.


Musicvid wrote on 8/31/2019, 8:04 AM

There is nothing bad about HLG as a delivery format (besides the potential issue that it should come with the Rec. 2020 gamut, which is a limitation for backward compatibility with Rec. 709 TVs).

But HLG is not so nice to grade due to the limited number of luminance values in the highlights.

Guess I'll have to learn more.

I never imagined HLG as a delivery format with limited upper luminance values. I haven't examined one in the wild, but such a bottom-heavy tonal print sounds unexciting. I assumed the upper log domain would be unpacked for grading.

fr0sty wrote on 9/2/2019, 5:06 PM

"yes, HLG helps get there but its not HDR until processed in post. "

This is entirely incorrect. I can shoot HLG with my GH5, play the file straight out of the camera to my LG OLED display, and get beautiful 10 bit HDR video right out of the camera. HDR is not a grading thing; it is, at a bare minimum, 10-BIT VIDEO (a requirement specified in the HDR10 specification) in the Rec. 2020 color space, displayed on a screen capable of 1000 nits of peak brightness and a 0.05 nit black level (0.0005 for OLED screens). HLG has every one of those things ready to go out of the box, provided it is played back on a capable display. It was designed by the BBC and NHK to be a broadcast delivery format for HDR that is also backwards compatible with Rec. 709 8 bit displays without the need for conversion.
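
Taking the display numbers in that post at face value, the displayed dynamic range they imply is just log2(peak / black); a quick sketch (the SDR row assumes a ~100-nit reference white and ~0.1-nit black, for comparison only):

import math

# Displayed dynamic range in stops from the peak/black numbers quoted above.
def stops(peak_nits, black_nits):
    return math.log2(peak_nits / black_nits)

print(f"LCD-style HDR : 1000 / 0.05   nits -> {stops(1000, 0.05):.1f} stops")
print(f"OLED-style HDR: 1000 / 0.0005 nits -> {stops(1000, 0.0005):.1f} stops")
print(f"typical SDR   :  100 / 0.1    nits -> {stops(100, 0.1):.1f} stops")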


Systems:

Desktop: AMD Ryzen 7 1800x 8 core 16 thread at stock speed, 64GB 3000MHz DDR4, GeForce RTX 3090, Windows 10

Laptop: ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

wwjd wrote on 9/3/2019, 6:32 AM

See how we can all learn and grow from my technical ignorance? :D :D
Until someone actually SEES how Vegas handles the banding (yes, I understand banding, am aware of it, and dislike it), I will still believe in impossible miracles. Plus, I never said I was leaving it all in 8 bit. ;)

I'd heard HLG was better in some ways, worse in others, but NEVER heard it was HDR right out of the box. That is very interesting! Any other cameras doing that? Blackmagic, etc.?

Still pretty sure cameras do not have the dynamic range in them that HDR can show. Is that wrong?

fr0sty wrote on 9/3/2019, 2:33 PM

There are many newer cameras, even consumer models and some phones, that support HLG.

You are correct about the dynamic range. My GH5 only has about 10 stops; the S1 I'm upgrading to (one of my crew already has one) is 12.2 stops (more than the FAR more expensive Sony FS7, and almost equal to a BM Ursa Mini Pro 4.6, also far more expensive). HDR technically allows for around 16 stops, if my memory serves me right.


Wolfgang S. wrote on 9/3/2019, 3:09 PM

I tend not to shoot HLG due to the limitations in grading. My FS7, with its typical 14 stops, did not have HLG but S-Log2/3; here I prefer S-Log3, shooting to 10 bit XAVC-I or to ProRes with the Shogun Inferno.

With my EVA1, which has a similar number of stops, I shoot V-Log - typically recorded internally with the All-I codec. The EVA1 now also has UHD 50p/60p, similar to the S1H, but in 10 bit HEVC only - or again with the Shogun over HDMI. HEVC may be fine, but it is so highly compressed that I do not really like it.

For me the trend is toward uncompressed - I rejected ProRes RAW since for a long time there was no grading solution for Windows (Edius handles it by now, but not Vegas), even though the 5.7K raw from the EVA1 and the Shogun Inferno would be attractive. The S1H will also deliver ProRes RAW, for the first time over HDMI.

Another nice solution is the Blackmagic Pocket 4K, and now the 6K. Unfortunately it can only be graded in Resolve up to now, but it delivers 6K 50p BRAW. You could burn in LUTs for either HLG or PQ, but thanks to the good color management in Resolve that is not really necessary. I would like to see Vegas handle BRAW in the future, but we will see if that comes.


adis-a3097 wrote on 9/3/2019, 4:56 PM

Still pretty sure cameras do not have the dynamic range in them that HDR can show. Is that wrong?

Yes, it is. With modern "pro" cameras, it's more a question of how high you want it. 🙂

Look:

You "tonemap" that to "709", using "proper transforms", and you get this:

Say, one (only one!) 8K 16 bit TIFF frame is about 200 MB, making a 24 fps video a roughly 4.7 GB/s data stream (37.5 Gb/s). A 2-hour movie would be something like 33 TB. That's huge!!! The stuff you lose by going from RAW to a delivery format (say 8 bit AVC at 30 Mb/s) can't be "recovered" or "reverse-engineered". What's gone is gone, be it dynamics or resolution; it doesn't work like that... 🙂
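
Those back-of-the-envelope numbers hold up; a quick recalculation (8K RGB, 16 bits per channel, 24 fps, 2 hours):

# Recheck of the uncompressed 8K/16-bit numbers above.
width, height = 7680, 4320
bytes_per_pixel = 3 * 2                  # RGB, 16 bits (2 bytes) per channel

frame_bytes = width * height * bytes_per_pixel
stream_Bps = frame_bytes * 24            # bytes per second at 24 fps
movie_bytes = stream_Bps * 2 * 3600      # 2-hour runtime

print(f"one frame : {frame_bytes / 1e6:6.1f} MB")
print(f"data rate : {stream_Bps / 1e9:6.2f} GB/s ({stream_Bps * 8 / 1e9:.1f} Gb/s)")
print(f"2 h movie : {movie_bytes / 1e12:6.1f} TB")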

 

 

 

wwjd wrote on 9/4/2019, 6:36 AM

I get the numbers. Until someone ACTUALLY tries it (me, I guess), I'll keep the faith. :)

fr0sty wrote on 9/4/2019, 10:22 AM

I've done it before, in projects where I also had non-SDR content. It didn't make the SDR video look any better. If anything, it just highlighted its flaws more. The effect of going from native HDR to SDR graded into HDR was also quite jarring when the scene changed from one to the other. It was obvious they didn't belong together, and I ended up scrapping the entire grade over it.



wwjd wrote on 9/4/2019, 2:02 PM

THAT! ^^^ That's valuable information. Did you try upscaling the SDR at the same time - say 2K to 4K or similar - and did you use 8 bit, 32 bit full range, floating point, or convert to 10-12 bit video first?

adis-a3097 wrote on 9/4/2019, 3:01 PM

I get the numbers. Until someone ACTUALLY tries it (me, I guess), I'll keep the faith. :)

Do you?

Do you also get that, with digital, the colors are numbered? Like, literally! And not only the colors... 🙂

fr0sty wrote on 9/4/2019, 4:08 PM

"THAT! ^^^ That's valuable information. Did you try upscaling the SDR at the same time - say 2k to 4k or similar, and did you use 8bit, 32 bit full range, floating, or convert to 10-12 bit video first?"

All the SDR content was native 4K resolution, as was the HDR footage. You don't gain anything by "converting to 10 bit first": the conversion happens anyway when you grade the SDR content alongside the HDR, so you're basically just wasting a render generation of quality on a step that Vegas would have done anyway.
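
A minimal sketch of that point, assuming a plain rescale from 8-bit to 10-bit codes: the 256 possible input values land on 256 of the 1024 output codes, and the in-between codes stay empty until grading, noise, or dithering puts something there.

import numpy as np

# Converting 8-bit to 10-bit adds no information by itself: the same 256
# values just get renumbered, and 768 of the 1024 possible codes stay unused.
eight_bit = np.arange(256, dtype=np.int64)       # every possible 8-bit value
ten_bit = (eight_bit * 1023 + 127) // 255        # straight rescale to 10-bit

print("unique source values :", len(np.unique(eight_bit)))   # 256
print("unique 10-bit values :", len(np.unique(ten_bit)))     # still 256
print("10-bit codes unused  :", 1024 - len(np.unique(ten_bit)))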

