HDR means over a billion possible colors: 1,024 possible shades of each primary, or between black and white. SDR has only 256. There is no way to add the missing 768 shades to match that contrast, and HDR also uses a wider color gamut than SDR, so a converter would have to create colors that aren't even reachable in the Rec709 color space. This isn't currently possible on any system I'm aware of, and I wouldn't expect the results to be particularly impressive even if something claimed to do it. You can grade SDR video in an HDR project, but you're not adding anything; you're remapping 256 possible shades per channel onto 1,024.
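Here's a minimal numpy sketch of that remapping (my own illustration, not anything Vegas does internally). Scaling 8-bit codes onto the 10-bit scale leaves the distinct level count stuck at 256:

import numpy as np

sdr = np.arange(256, dtype=np.uint16)                 # every level an 8-bit source can hold
hdr = np.round(sdr * (1023 / 255)).astype(np.uint16)  # scale onto the 10-bit range

print(len(np.unique(hdr)))   # still 256 distinct codes, not 1024
print(hdr[:5])               # [0 4 8 12 16] -- the gaps are where banding lives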
You must start with 10-bit, Rec2020 (or similar; I use AdobeRGB for stills) source footage to do proper HDR.
There are tools, like Edius, that allow a conversion from SDR to HDR. How do they do that? Well, they cannot add the 768 missing tones, but they can spread the luminance over a wider range.
Is that really HDR? Well, in terms of the number of stops after that transformation, it is, since it goes beyond the 6-7 stops of SDR. In terms of bits, maybe, because you can do that with 10 or 12 bits, or with only 8 bits too. In terms of the original dynamic range captured during shooting, one could argue it is only fake HDR.
Sure, that can be done with Vegas too. But take care that white typically stays around 100 nits (even if some push that up to 200 nits in some cases).
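For the curious, here is a rough Python/numpy sketch of that kind of stretch (my own toy math, not what Edius or Vegas actually implement; the function names are made up): decode SDR to linear light, pin diffuse white near 100 nits, and encode into a PQ (SMPTE ST 2084) container. Note that the distinct tone count doesn't grow:

import numpy as np

def sdr_to_hdr_nits(code8, white_nits=100.0, gain=1.0):
    # decode the SDR signal to linear light (BT.1886-like power curve)
    lin = (code8 / 255.0) ** 2.4
    # pin diffuse white at ~100 nits; crank gain to stretch highlights further
    return lin * white_nits * gain

def pq_encode(nits):
    # SMPTE ST 2084 (PQ) encoding constants
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = np.clip(nits / 10000.0, 0.0, 1.0)
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

codes = np.arange(256)
pq10 = np.round(pq_encode(sdr_to_hdr_nits(codes)) * 1023).astype(int)

print(pq10.max())            # 100-nit white lands around PQ code ~520 of 1023
print(len(np.unique(pq10)))  # still at most 256 distinct tones in the HDR container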
My own opinion: you should avoid that and shoot for a genuinely increased dynamic range if you are going for HDR in the end. And takes where the original scene has no more than the dynamic range of SDR should stay as they are, generally speaking.
I agree with both you guys' input there, but I know it can be done. Yes, it won't fill in the color tweens accurately, but it can boost, say, red from 30 nits to 80 (making numbers up for the example), giving a wider range. I'm good with that. So, nobody has messed with 17 to do that yet?
You're not really adding contrast, you're just... changing it. The same 256 levels are still all that is there. Even if you interpolate it like Wolf says Edius can, it still isn't going to be as accurate as 10-bit sensor data of what was actually there. I have tried mixing HDR and non-HDR media; the results didn't look that good.
Let me put it this way... I know it is not making "REAL" HDR... but if you change contrast from 100% in 709 to the 400% that HDR can take, you end up with an output HDR contrast of 400%. Has anyone done this with SDR footage to HDR output?
Even if color interpolation in float space were feasible, which I haven't seen yet, it would still be filler (noise), not data. There would be a negative impact on the integrity of the SDR source.
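A toy demonstration of that point (Python/numpy, purely illustrative): quantize a signal with subtle texture to 8 bits, then "fill the tweens" by smoothing in float. The result is plausible filler, not the lost detail:

import numpy as np

rng = np.random.default_rng(1)
original = np.linspace(0.2, 0.21, 1000) + rng.normal(0, 0.0005, 1000)  # ramp with subtle texture
quantized = np.round(original * 255) / 255                   # the 8-bit capture loses that texture
smoothed = np.convolve(quantized, np.ones(25) / 25, "same")  # float-space "tween filling"

# residual error stays nonzero: the interpolation invents values, it does not recover them
print(np.abs(smoothed - original)[50:-50].max())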
"Let me put it this way... I know it is not making 'REAL' HDR... but if you change contrast from 100% in 709 to the 400% that HDR can take, you end up with an output HDR contrast of 400%. Has anyone done this with SDR footage to HDR output?"
Sure, you can do that, regardless of what the people here say. If it makes some sense to you, well, that is really up to you.
You can also shoot 1080 and edit and finish in 4K, if you see any benefit in that. I would do the reverse, which suits my workflow. Then again, if you look at Smooth Motion slo-mo, it creates new frames by interpolation. So I guess you can always make yourself an HDR project, throw your files in, tweak, and see if it's worth the pain?
There's an old saying in television: "You can't polish a turd" ;-)
Tests with file sizes, encoder data, and histograms would be welcome in this kind of discussion. Otherwise it's speculative at best.
There is nothing that really must be proved. If you have some experience in HDR grading, then it is quite clear how you can do the required operations. So no speculation here.
But whether such "fake HDR" makes sense to you or me or anybody here - well, that is another question. Up to you or me... but it is a decision where you will not find an answer or guideline in a file size. I would not spend much effort here.
"it is a decision where you will not find an answer or guideline in a file size. I would not spend much effort here."
I quite agree with you -- since the tests have already been run with SDR->SDR, HDR->HDR, and SDR->HDR, comparing histograms, file sizes, and difference masking, I really don't need to run them again, as the outcomes remain consistent.
However, since the subject has been raised again, I am merely expecting counterexamples from you guys, which to my knowledge have never been published, despite persistent speculation.
Specifically, show us the third histogram with all the holes (banding) filled in, and a 30-35% larger file size (for example, 8-bit 4:2:0 to 10-bit 4:2:2), not the same as the source. Change the contrast, or do whatever you need to do, but show us rather than talking about it. Good idea to upload the test files, too.
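For anyone who wants to run that histogram test themselves, here is a little numpy sketch (synthetic stand-in data, just to show the mechanics) that counts the empty 10-bit bins after a naive 8->10 upconvert:

import numpy as np

rng = np.random.default_rng(0)
frame8 = rng.integers(0, 256, size=1_000_000)           # stand-in for 8-bit luma samples
frame10 = np.round(frame8 * (1023 / 255)).astype(int)   # naive 8->10 bit upconvert

hist = np.bincount(frame10, minlength=1024)
print((hist == 0).sum())   # 768 empty bins: the "holes" (banding) are still there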
Then we can discuss whether interpolated colors, if they can be made to exist, constitute data or noise. The ball is in your court, Wolfgang -- wwjd already understands, and as per precedent, no tests = no claims entertained.
Thanks, wwjd, for keeping an open mind to divergent outcomes.
8-bit -> 8-bit: 3.47 GB
10-bit -> 10-bit: 4.63 GB
8-bit -> 10-bit: 3.48 GB
I see these tests were run in 2014, and confirmed in 2018 and 2019. Maybe that's a message for me to get on with my life. Carry forward, gentlemen.
I did similar tests years ago. And again, they fail to answer the question. The question was simple: can SDR be converted to HDR in Vegas? And the answer given was yes, similar to what Edius can do. The limitations were mentioned too.
If you desire to run some semi-scientific tests, feel free to do so. Up to you. Whatever you wish to show us.
I'm not sure why this is such a weird question. It's not visible resolution, and it's not pixel color depth, although that's related. Yes, numbers are numbers. I'm just asking if anyone has TRIED turning it up to 11. If a scene is log flat, we add contrast (as fake as it is) to make it look good; now that contrast/dynamic range and all things related are allowed to color outside the old NORMAL lines, why not do it?* If Rec709's max contrast or luminance (and all related parts) only goes to 6 MAX, but we can tweak to 11 now, why not?
Sounds like no one here has done this. I will blaze this trail once I get time and figure it out. :) It will be hard to show or prove anything without an HDR screen, though.
*For TVs and monitors with HDR capability only, of course.
Of course I tried. Many times. The best I've gotten since 2014 was an 8-gallon file in a 10-gallon bucket. There was no interpolation or "filling in" of missing colors or "tweens," as you call them, just Swiss cheese full of air. 8-bit banding was not reduced.
I can put on a pair of size 18 shoes. It doesn't make my feet big.
Rather than asking the same question and expecting different answers, offer tests of your own. Any tests. I truly feel my job here will be done once you've proven me wrong.
"5 gallons of water in a ten gallon bucket is still 5 gallons" - Great analogy MV!
This reminds me of a post in another group asking about rendering a 16 kbps MP3 to a PCM audio file. The OP assumed that would restore the original quality. By the sound of it, it was a lost cause to begin with (mic placement).
Of course this video thread is a little more complex.
Well, the online culture can turn water into wine, lead into gold, and yesterday's 8 bit into tomorrow's HDR. So sad for those of us born into the physical universe.
"If a scene is log flat, we add contrast (as fake as it is) to make it look good; now that contrast/dynamic range and all things related are allowed to color outside the old NORMAL lines, why not do it?"
Following that logic, if we impressed an 8-bit file on a linear gamma slope (y = x), we would wind up with 6.4 bits. And it would look "log flat." (Nice term!)
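That figure is easy to sanity-check in the same spirit (a numpy sketch; the exact count depends on the curve and direction of the transform, so don't hold me to 6.4 exactly): push all 256 codes through a transfer curve, re-quantize, and count the survivors. The general point stands: re-quantizing after a strong nonlinearity collapses codes together.

import numpy as np

codes = np.arange(256)
# gamma 2.2 decode to linear, then re-quantize back to 8 bits
linearized = np.round(((codes / 255.0) ** 2.2) * 255).astype(int)
survivors = len(np.unique(linearized))

print(survivors, np.log2(survivors))  # fewer than 256 levels, so fewer than 8 effective bits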
You're telling me that on 120V, when we swap a 60W bulb (Rec709) for a new 100W bulb (Rec2020), it will still only put out 60W? :D :D I don't think so. I have upscaled HD and 2K up to 4K. It did not leave black space, nor copy the same nearby color; it interpolated in-between colors. Was it true to life? Of course not, but it looked great to the eye, with no issues. This is the same deal. This means the new slo-mo doesn't work either, right? Because adding stuff in is wrong? All we are doing is turning up the wattage where previously we were never able to. Same red, just brighter.
:D Some HDR TVs can already do this conversion on the fly. Looks like Vegas can accomplish this too. It is USING a 100W bulb, whereas in the past, all we were allowed was 60W.