:D Some HDR TVs can already do this conversion on the fly. Looks like Vegas can accomplish this. It is like USING a 100 W bulb, whereas in the past, all we were allowed was 60 W.
Not that it can't, but out of curiosity, are you planning to use a 709 source and take it to HDR's max?
So, at best you might get some unnaturally bright highlights that stand way apart from the rest of the footage, and flatter looking footage in general, but not anything I'd call an improvement.
UNLESS you grade it as such, bringing the middle area up to fill out the image and utilizing the new expanded range.
Musicvid, I'm sure you know it is literally AND physically impossible to show HDR properly on our SDR screens. All I can do is provide my unscientific example of the difference. First, I took my clip and squashed it to about a third of what it really looks like, to represent REC709. Then I added some contrast and color saturation to make it reach something normal-ISH looking within the "SDR" range. And it looked okay, about like the original, somewhat flat, footage.
Then I opened the gates and pushed everything as far as I could, levels- and color-wise, which is similar to expanding up into HDR. Actually it looks better than the real standard 709 grade. Anyway, this is the difference I am talking about, not insane, retina-scalding colors, just an expansion back toward real-world (almost) dynamic ranges, closer to what we can actually see. Not like 709 with its 6-8 stops of dynamic range.
So, if we expand it up, it's not going to leave blank or dull spots, there are no holes to fill. Just brighter levels of the same color.
So I should cut the dynamic range in half, put some lipstick on the pig, expand it back out, and that will give me a ten-bit render? Sorry, I must have missed something, because the files are exactly the same size, and full of hot air.
The vectorscope and waveform densities in your graphic are a dead giveaway, but I'll pretend I didn't notice that. Along with the luminance histo.
I must just not have the right "enhancements" to my imagination.
Yes, you are not getting it. Let me know when you get an HDR screen and I will send you examples. :)
Agree with you. It seems to be hard to get across. Maybe that's because looking only at vectorscope and waveform pictures keeps you from seeing what it really means on an HDR screen. Even with 8-bit footage and only 7 stops of dynamic range in the original, there may be an improvement. It is clear that one cannot expect the same result as footage shot in 10-bit 4:2:2 with 14 stops, but it can be an improvement. But not more.
Since you demonstrated positively through your example that you are reducing bit depth horribly, not adding colors, I had assumed this discussion was finished. At least my role in it is finished. And I agree, one of us isn't getting it. If you think I must wait to buy an expensive screen to see your illusion, it's going to be a long wait. Bet it looks really cool, though. My eyes lie to me every day. Real numbers don't.
The illusion you have created is one of fewer colors, not more. The remaining colors take on a garish, oversaturated look, which is usually referred to as "posterization," or sometimes "artificial banding." The effects are often wild and exaggerated, and I suggest you explore their possibilities using just the controls you are using now, knowing your taste in effects.
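That loss of levels is easy to put a number on. Here's a hypothetical Python sketch of the squash-then-expand round trip described earlier in the thread (the one-third squash and the stretch back out), just counting how many of the 256 possible 8-bit levels survive:

```python
import numpy as np

# Hypothetical sketch of the "squash then expand" grade described above.
# Start with all 256 possible 8-bit code values.
levels = np.arange(256, dtype=np.float64)

# "Squash" the range down to roughly a third (the Rec.709 simulation step),
# quantizing back to whole code values as any 8-bit pipeline must...
squashed = np.round(levels / 3.0)

# ...then "open the gates" and stretch it back out.
expanded = np.round(squashed * 3.0)

# Count how many distinct levels survive the round trip.
print(len(np.unique(levels)))    # 256 levels going in
print(len(np.unique(expanded)))  # 86 levels coming out
```

Roughly two-thirds of the levels are gone for good; stretching back out only spreads the survivors further apart, which is exactly where the halos and banding come from.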
Here, I first published a little tutorial on this in 2012, before you joined the forum, and even used it as a darkroom technique for college publications, in the 1960s.
The left frame shows the out-of-focus flowers as they should appear with proper 10-bit processing. To the right of that is the frame processed at 8 bits per color.
You can clearly see halos around the edge of the 8-bit flowers as well as a visible dark band that outlines it. Also notice the missing details within the 8-bit flowers.
So yeah, game up, and here are some takeaways from all this rumination:
-- I'm glad you discovered the technique on your own. It confirms my belief in your creativity, which I've held in high esteem all along.
-- When 32 bit monitors were new and really bad, we used to set them at 16 bits so we could see the colors. Some guys here will remember this.
-- Sometimes less is more. Your eyes' hyperactive reaction to fewer colors, rather than more, confirms this beyond a doubt.
-- Finally, you can call it posterization, Cartoonr, PhotoRealism, whatever you will. Just please don't call it HDR; that name is already taken, and it defines exactly the opposite of what you are doing. Might confuse some amateurs.
Only thing I would add is that you should start posting more examples and images of your own, rather than talking eternal pipe dreams. I can't continue dragging examples out of you, even though each one you actually create is fresh, revealing, and always gives me ideas to fill in my own creative insufficiency.
In fact, this suggests a possible agreement between us -- I won't try to teach you art if you don't try to teach me physics. Deal?
Thanks again for keeping my teacher-claws sharp over the summer. At seventy, I just got my first paycheck of the season, from paying clients. And for some reason, this year, it feels especially good...
All the TV manufacturers got it wrong? Idiots! Why are they even making HDR TVs, since SDR to HDR doesn't actually work? :D
You are saying it is fruitless, I'm saying others are doing it now... Can Vegas? Has anyone here tried? I haven't even bought an HDR screen yet, but soon, as they are dirt cheap now.
SDR fits within HDR, and I doubt that any TV does up-conversion. If the signal is SDR it will be displayed as such, but I might be wrong? The only way I can see interpolation as possible would be by using 4K SDR and maybe trying to convert it to 1080 HDR?
"The only way I can see interpolation as possible would be by using 4K SDR and maybe trying to convert it to 1080 HDR?"
HDR increases the number of steps between black and white, or black and any solid primary color, but it also significantly increases the gamut of colors that can be produced. This is not tied to resolution. Dropping to 1080p, you'd still have the rec709 color space and you'd still only have 8 bits describing those colors/shades, which means you will still only be able to produce 256 discrete shades. Mapping them over the 1024 steps HDR provides just changes what is defined as white or what is defined as black, but you aren't changing the number of steps there are between white and black (or any primary). Is it possible to interpolate some numbers in there and have the computer try to guess? Of course, but when you're giving it one thing and expecting to get four times that thing out of it, you can't expect those guesses to be right. So until I can see with my own eyes on my HDR screen that grading SDR to HDR produces a better end result than leaving it as is, I remain skeptical. I may do some tests on it, though, just for fun.
Some have argued that dropping resolution can be a way to upscale YUV compression from 4:2:0 4k to 4:4:4 1080p, but that is a completely different debate. HDR can't be created by messing with resolution.
That's what I was referring to but I thought it was 4K 4:2:0 to 1080 4:2:2?
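For what it's worth, the sample-count arithmetic behind that 4K-to-1080p argument is easy to check. A hypothetical sketch, pure bookkeeping about sample counts, not a claim about what any real scaler actually does:

```python
# Hypothetical back-of-the-envelope check of the 4K 4:2:0 -> 1080p argument.
# In 4:2:0, each 2x2 block of luma pixels shares one chroma sample per plane.
luma_4k = 3840 * 2160
chroma_4k_420 = (3840 // 2) * (2160 // 2)  # one Cb and one Cr per 2x2 block

# A 1080p frame has exactly that many pixels...
pixels_1080 = 1920 * 1080
print(chroma_4k_420 == pixels_1080)  # True: one original chroma sample per pixel

# ...so in raw sample-count terms the downscale could be 4:4:4, not just 4:2:2.
# (Whether real scalers preserve that alignment is a separate question, and
# none of this changes bit depth or dynamic range.)
```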
For a lot of people the increased color space of rec2020 is regarded as more important than the increased dynamic range in the gamma curve. I think both are important. Besides that, do not overlook that we have no monitors so far that can really show rec2020! Not even the beloved reference monitor, the Sony BVM X300.
So in that respect, you are talking about something that nobody has seen yet.
Besides that, let's face it: PQ is not fully defined yet, since the hardware producers are still free in how they define the rolloff in the highlights. That is handled better by HLG. So why do you really think that everything is clear here?
Youtube has been doing HDR for a year or more; it's even a filter setting. Dolby Vision has been around 4 to 5 years. There's tons of content out there. HDR TVs are flying off the shelves (okay okay, shuffling slowly here and there). There are already companies out there producing lots of HDR content. There are many super detailed tutorials out there on HDR... I may buy a $200 4K HDR TV just to use as an output monitor. This isn't new. NEW to Vegas, finally. I could just stick with Davinci, but I enjoy the feel of Vegas, and want to push its envelope. Bleeding edge sucks most when confronted with "you can't"-ism. I only live in "How CAN we?" Find a way. :)
Cameras don't SHOOT HDR. Their footage has to be graded into a range closer to our eyesight because of their shortcomings. Exactly like LOG has been handled since the dawn of LOG :D
These famous five Mysterybox articles are from 2016, and they are still one of the best possible sources for an overview of HDR. Be aware that some of the writing is outdated today (especially the parts about Resolve).
In the meantime there are some cameras that shoot directly to PQ or HLG; for HLG especially, you will find enough of them. But be aware that log is still a better choice than HLG if you wish to grade the footage.