Comments

Musicvid wrote on 8/27/2019, 6:59 AM

Show us, please?

adis-a3097 wrote on 8/27/2019, 7:19 AM

:D Some HDR TVs can already do this conversion on the fly. Looks like Vegas can accomplish this. It is USING a 100W bulb, whereas in the past, all we were allowed was 60W.

Not that it can't, but out of curiosity, are you planning to use a 709 source and go HDR, to its max?

If yes, don't forget the sunglasses... 🙂

fr0sty wrote on 8/27/2019, 9:26 AM

What you get with 8 bit:

8 bit mapped to hdr... just adding air to the same signal:

10bit (different image, to illustrate what it should look like):

Last changed by fr0sty on 8/27/2019, 9:34 AM, changed a total of 3 times.

Systems:

Desktop

AMD Ryzen 7 1800x 8 core 16 thread at stock speed

64GB 3000MHz DDR4

Geforce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

fr0sty wrote on 8/27/2019, 9:37 AM

So, at best you might get some unnaturally bright highlights that stand way apart from the rest of the footage, and flatter looking footage in general, but not anything I'd call an improvement.


john_dennis wrote on 8/27/2019, 10:05 AM

"These go to 11."

wwjd wrote on 8/27/2019, 6:39 PM

So, at best you might get some unnaturally bright highlights that stand way apart from the rest of the footage, and flatter looking footage in general, but not anything I'd call an improvement.

UNLESS you grade it as such bringing the middle area up to fill out the image, utilizing the new expanded range

wwjd wrote on 8/27/2019, 6:47 PM

Show us, please?

Musicvid, I'm sure you know it is literally AND physically impossible to show HDR properly on our SDR screens. All I can do is provide my unscientific example of the difference. First, I took my clip and squashed it to about 1/3 of what it really looks like, to represent REC709. Then I added some contrast and color saturation to make it reach something normal-ISH looking within the "SDR" range. And it looked okay, about like the original, somewhat flat, footage.

Then, I opened the gates and pushed everything as far as I could, levels- and color-wise, which is similar to expanding up into HDR. Actually it looks better than the real standard 709 grade. Anyway, this is the difference I am talking about: not insane, retina-scalding colors, just expansion back to (almost) real-world dynamic ranges, closer to what we can actually see. Not like 709 with its 6-8 stops of dynamic range.

So, if we expand it up, it's not going to leave blank or dull spots, there are no holes to fill. Just brighter levels of the same color.
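To put rough numbers on the squash-and-expand grade described above, here is a minimal Python sketch. It assumes a simple linear scale in and out (an actual grade is of course not a plain multiply), and just counts how many distinct 8-bit levels survive the round trip:

```python
# Squash 8-bit levels to about 1/3 of the range, then expand back out,
# and count how many distinct levels survive the round trip.
# Hypothetical linear model of the "squash then open the gates" grade.

levels = range(256)  # all possible 8-bit code values

squashed = [round(v / 3) for v in levels]       # "squash to about 1/3"
expanded = [min(255, v * 3) for v in squashed]  # "open the gates" again

distinct_before = len(set(levels))   # 256
distinct_after = len(set(expanded))  # far fewer; the gaps are where banding lives

print(distinct_before, distinct_after)  # → 256 86
```

The expanded image spans the full range again, but under this model only 86 of the original 256 levels remain distinct, which is the point Musicvid raises below about the waveform densities.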

Musicvid wrote on 8/27/2019, 11:11 PM

So I should cut the dynamic range in half, put some lipstick on the pig, expand it back out, and that will give me a ten-bit render? Sorry, I must have missed something, because the files are exactly the same size, and full of hot air.

The vectorscope and waveform densities in your graphic are a dead giveaway, but I'll pretend I didn't notice that. Along with the luminance histo.

I must just not have the right "enhancements" to my imagination.

wwjd wrote on 8/28/2019, 6:41 AM

Yes, you are not getting it. Let me know when you get an HDR screen and I will send you examples. :)

Wolfgang S. wrote on 8/28/2019, 7:26 AM

Yes, you are not getting it. Let me know when you get an HDR screen and I will send you examples. :)


Agree with you. It seems to be hard to get. Maybe that's because looking only at vectorscope and waveform pictures takes away from seeing what it really means on an HDR screen. Even with 8-bit footage and only 7 stops of dynamic range in the original, there may be an improvement. Clearly one cannot expect the same result as footage shot in 10-bit 4:2:2 with 14 stops, but it can be an improvement. But not more.

Last changed by Wolfgang S. on 8/28/2019, 9:02 AM, changed a total of 1 times.

Desktop: PC AMD 3960X, 24x 3.8 GHz * RTX 3080 Ti (12 GB) * Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb LAN * Resolve Studio 18 * Edius X * Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED * internal HDR preview * i9 12900H with iGPU Iris Xe * 32 GB RAM * GeForce RTX 3070 Ti 8GB * Blackmagic Ultrastudio 4K mini

HDR monitor: ProArt Monitor PA32 UCG-K 1600 nits, Atomos Sumo

Others: Edius NX (Canopus NX)-card in an old XP-System. Edius 4.6 and other systems

Musicvid wrote on 8/28/2019, 9:13 AM

Yes, you are not getting it. Let me know when you get an HDR screen and I will send you examples. :)

Since you demonstrated positively through your example that you are reducing bit depth horribly, not adding colors, I had assumed this discussion was finished. At least my role in it is finished. And I agree, one of us isn't getting it. If you think I must wait to buy an expensive screen to see your illusion, it's going to be a long wait. Bet it looks really cool, though. My eyes lie to me every day. Real numbers don't.

The illusion you have created is one of fewer colors, not more. The remaining colors take on a garish, oversaturated look, which is usually referred to as "posterization," or sometimes "artificial banding." The effects are often wild and exaggerated, and I suggest you explore their possibilities using just the controls you are using now, knowing your taste in effects.

Here: I first published a little tutorial on this in 2012, before you joined the forum, and I even used it as a darkroom technique for college publications in the 1960s.

https://www.vegascreativesoftware.info/us/forum/posterization-in-vegas--111932/

And just look at what's on the internet!

https://www.google.com/search?q=posterization&rlz=1C1CHBF_enUS856US856&source=lnms&tbm=isch&sa=X&ved=0ahUKEwiw1Ky61KXkAhVKY6wKHZf7C08Q_AUIESgB&biw=994&bih=451

 The left frame shows the out-of-focus flowers as they should appear with proper 10-bit processing. To the right of that is the frame processed at 8 bits per color.

 You can clearly see halos around the edge of the 8-bit flowers as well as a visible dark band that outlines it. Also notice the missing details within the 8-bit flowers.

So yeah, game up, and here are some takeaways from all this rumination:

-- I'm glad you discovered the technique on your own. It confirms my belief in your creativity, which I've held in high esteem all along.

-- When 32 bit monitors were new and really bad, we used to set them at 16 bits so we could see the colors. Some guys here will remember this.

-- Sometimes less is more. Your eyes' hyperactive reaction to fewer colors, rather than more, confirms this beyond a doubt.

-- Finally, you can call it posterization, Cartoonr, PhotoRealism, whatever you will. Just please don't call it HDR; that name is already taken, and it defines exactly the opposite of what you are doing. Might confuse some amateurs.

Only thing I would add is that you should start posting more examples and images of your own, rather than talking eternal pipe dreams. I can't continue dragging examples out of you, even though each one you actually create is fresh, revealing, and always gives me ideas to fill in my own creative insufficiency.

In fact, this suggests a possible agreement between us -- I won't try to teach you art if you don't try to teach me physics. Deal?

Thanks again for keeping my teacher-claws sharp over the summer. At seventy, I just got my first paycheck of the season, from paying clients. And for some reason, this year, it feels especially good...

Best as always.

wwjd wrote on 8/28/2019, 12:04 PM

All the TV manufacturers got it wrong? Idiots! Why are they even making HDR TVs, since SDR to HDR doesn't actually work? :D

You are saying it is fruitless; I'm saying others are doing it now... Can Vegas? Has anyone here tried? I haven't even bought an HDR screen yet, but soon, as they are dirt cheap now.

https://arstechnica.com/information-technology/2017/05/sdr-to-hdr-conversion/

OldSmoke wrote on 8/28/2019, 12:19 PM

 

All the TV manufacturers got it wrong?

SDR fits within HDR, and I doubt that any TV does up-conversion. If the signal is SDR it will be displayed as such, but I might be wrong. The only way I can see interpolation being possible would be by taking 4K SDR and maybe trying to convert it to 1080 HDR?

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A

Ram: G.Skill 4x8GB DDR4 2666 XMP

CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970Pro NVME, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner

PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

Musicvid wrote on 8/28/2019, 12:27 PM

Then let your hardware do the upsample and let it go.

Musicvid wrote on 8/28/2019, 3:08 PM

 I haven't even bought an HDR screen yet

What? After chiding me repeatedly for not having one so I could see your illusion? How would you even know what you have? Abominable.

Ghosted.

 

adis-a3097 wrote on 8/28/2019, 5:24 PM

All the TV manufacturers got it wrong? Idiots! Why are they even making HDR TVs, since SDR to HDR doesn't actually work? :D

You are saying it is fruitless; I'm saying others are doing it now... Can Vegas? Has anyone here tried? I haven't even bought an HDR screen yet, but soon, as they are dirt cheap now.

https://arstechnica.com/information-technology/2017/05/sdr-to-hdr-conversion/

No, it's not the TV manufacturers who are idiots, it's the ones that swallow...without chewing. 🤓

fr0sty wrote on 8/28/2019, 5:26 PM

"The only way I can see interpolation as possible would be by using 4K SDR and maybe trying to convert it to 1080 HDR?"

HDR increases the number of steps between black and white, or black and any solid primary color, and it also significantly increases the gamut of colors that can be produced. This is not tied to resolution. Dropping to 1080p, you'd still have the rec709 color space and you'd still only have 8 bits describing those colors/shades, which means you will still only be able to produce 256 discrete shades. Mapping them over the 1024 steps HDR provides just changes what is defined as white or what is defined as black, but you aren't changing the number of steps there are between white and black (or any primary). Is it possible to interpolate some numbers in there and have the computer try to guess? Of course, but when you're giving it 1 thing and expecting to get 4 times that thing out of it, you can't expect those guesses to be right. So until I can see with my own eyes on my HDR screen that grading SDR to HDR will produce a better end result than leaving it as is, I remain skeptical. I may do some tests on it, though, just for fun.
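The point about mapping 256 shades over a 1024-step range can be sketched in a few lines of Python. The 1023/255 factor below is just the straightforward linear map, not any particular TV's tone-mapping algorithm:

```python
# Map 8-bit code values (0-255) into a 10-bit range (0-1023) by simple scaling.
# The count of distinct output values stays at 256; only the step size grows.

sdr = range(256)
hdr_mapped = [round(v * 1023 / 255) for v in sdr]

print(len(set(hdr_mapped)))   # → 256 distinct values, same as before
steps = sorted(set(hdr_mapped))
print(steps[1] - steps[0])    # → 4: a gap of ~4 codes between neighbours
```

The signal now occupies 10-bit code space, but there are still only 256 discrete shades in it; the "air" is the empty codes in between.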

Some have argued that dropping resolution can be a way to upscale YUV compression from 4:2:0 4k to 4:4:4 1080p, but that is a completely different debate. HDR can't be created by messing with resolution.

Last changed by fr0sty on 8/28/2019, 5:27 PM, changed a total of 1 times.


OldSmoke wrote on 8/28/2019, 6:26 PM

Some have argued that dropping resolution can be a way to upscale YUV compression from 4:2:0 4k to 4:4:4 1080p, but that is a completely different debate. HDR can't be created by messing with resolution.

That's what I was referring to, but I thought it was 4K 4:2:0 to 1080 4:2:2?
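Counting samples shows why the claim is usually 4:4:4 rather than 4:2:2: in 4:2:0 there is one chroma sample per 2x2 luma block, so a clean 2x downscale leaves one source chroma sample per output pixel. A quick sketch (resolution arithmetic only, no actual resampling):

```python
# Count luma and chroma samples to see why UHD 4:2:0 downscaled 2x
# can, in principle, carry 1080p 4:4:4 chroma (not just 4:2:2).

uhd_w, uhd_h = 3840, 2160
luma_4k = uhd_w * uhd_h
chroma_4k = (uhd_w // 2) * (uhd_h // 2)  # 4:2:0 - one Cb/Cr pair per 2x2 block

hd_w, hd_h = 1920, 1080
luma_1080 = hd_w * hd_h

print(chroma_4k == luma_1080)  # → True: one source chroma sample per 1080p pixel
```

Whether real-world scalers actually preserve that chroma is another matter, but the sample counts line up, and either way this says nothing about bit depth or dynamic range.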


wwjd wrote on 8/28/2019, 8:49 PM

that one dude has the answer

 

Wolfgang S. wrote on 8/29/2019, 12:47 AM

For a lot of people the increased color space of rec2020 is regarded as more important than the increased dynamic range in the gamma curve. I think both are important. Besides that, do not overlook that we have no monitors up to now that are really able to show rec2020! Not even the beloved reference monitor, the Sony BVM-X300.

So from that side you are talking about something that nobody has seen yet.

Besides that, let's face it: PQ is not fully defined yet, since the producers of hardware are still free in how they define the rolloff in the highlights. That is better with HLG. So why do you really think that everything is clear here?


wwjd wrote on 8/29/2019, 6:55 AM

YouTube has been doing HDR for a year or more; it's even a filter setting. Dolby Vision has been around 4 to 5 years. There's tons of content out there. HDR TVs are flying off the shelves (okay okay, shuffling slowly here and there). There are already companies out there producing lots of HDR content. There are many super-detailed tutorials out there on HDR... I may buy a $200 4K HDR TV just to use as an output monitor. This isn't new. NEW to Vegas, finally. I could just stick with DaVinci, but I enjoy the feel of Vegas, and want to push its envelope. Bleeding edge sucks most when confronted with "you can't"-ism. I only live in "How CAN we?" Find a way. :)

Cameras don't SHOOT HDR. The footage has to be graded into a range closer to our eyesight due to the cameras' shortcomings. Exactly like LOG has been done since the dawn of LOG :D

wwjd wrote on 8/29/2019, 7:01 AM

This is an absolutely RIVETING set of articles if anyone is interested

https://www.mysterybox.us/blog/2016/10/18/hdr-video-part-1-what-is-hdr-video

Wolfgang S. wrote on 8/29/2019, 7:12 AM

These famous 5 Mysterybox articles are from 2016, and they are still one of the best possible sources for an overview of HDR. Be aware that some of the writing is outdated today (especially the parts about Resolve).


Wolfgang S. wrote on 8/29/2019, 7:14 AM

Cameras don't SHOOT HDR. The footage has to be graded into a range closer to our eyesight due to the cameras' shortcomings. Exactly like LOG has been done since the dawn of LOG :D

In the meantime there are some cameras that shoot directly to PQ or HLG; for HLG especially, you will find enough of them. But be aware that log is still a better choice compared with HLG if you wish to grade the footage.

 
