I get the numbers. Until someone ACTUALLY tries it (me, I guess), I'll keep the faith. :)
Do you?
Do you also get that, with digital, the colors are numbered? Like, literally! And not only the colors...
Yes, I understand numbers, and also when to ignore them and jump in. I won't say I will succeed, but I won't let numbers slow me down. My initial question was answered: yes, Vegas can send SDR to HDR. Maybe it won't look good. Don't know, don't care at this point. I don't care about numbers. I like to SEE results, not spreadsheets.
I've heard the following over the years: "can't send more than 9600 bps across copper wire" (from a level-three network tech), "no one needs HD - SD is fine", "no one needs 4K", "no one needs 8K", "the world market is maybe 5 computers", "Orville, that will NEVER fly"... see where I'm going? :) There is already other software doing a great job of converting, and there are TVs doing it. Not rocket science. Just wondered if Vegas was up to the task.
When I've upscaled HD to 4K, Vegas filled in the tween blanks with tween colors, not just copies of what was in the nearest pixel - that would have looked like crap.
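A toy illustration of the difference, in plain Python (this is not Vegas's actual resampler, which is presumably something fancier like bicubic - just the nearest-vs-tween idea):

```python
# Toy 1-D upscale: nearest-neighbour just copies the closest sample,
# while linear interpolation fills the "tween" positions with
# in-between values, which is what avoids the blocky look.
def upscale_nearest(row, factor):
    return [row[i // factor] for i in range(len(row) * factor)]

def upscale_linear(row, factor):
    out = []
    for i in range(len(row) * factor):
        pos = i / factor
        lo = min(int(pos), len(row) - 1)
        hi = min(lo + 1, len(row) - 1)
        t = pos - lo
        out.append(row[lo] * (1 - t) + row[hi] * t)
    return out

row = [0, 100]                   # two neighbouring pixel values
print(upscale_nearest(row, 2))   # [0, 0, 100, 100]   blocky copies
print(upscale_linear(row, 2))    # [0.0, 50.0, 100.0, 100.0] smooth tween
```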
Would be sad not to be able to mix all the old stuff up to modern, better tech.
If you are looking for visual "perfection", yes, don't do this. But, can it just look great? Probably, I have yet to see that. Currently picking out an HDR screen for this test. :)
OOooh ooooh! What's this?? lots of loss shown in the luminance scope forcing 8 bit into a 10 bit, >>SUDDENLY<< all evened out and filled in by a secret combination of native Vegas effects? Not impossible. I'd show pics, except VEGAS CRASHED before I could screen shot anything. :P
Also learned my GFX and monitor show 10 bit nice and smooth -- Vegas PREVIEW keeps things 8 bit stepped - I don't know if it does on external monitor, not using one ATM. Does V17 have a 10 bit preview if using 10 bit clips?
This!
You really don't seem to understand what the numbers imply, do you?
And no, it's not "maybe it won't look good"; it simply cannot look good (given you really mean what you say, which is HDR out of SDR).
More (fun) facts: a good 709 monitor should be able to reach 100 nits in the whites and 0.05 nits in the blacks. That's a contrast ratio of 2000:1. 2000:1!!! And that's SDR. Think 8 bit is enough? Even for SDR?
If one is to maintain a 2% Weber fraction:
The ability of the visual system to discern intensities - the Weber fraction - is about 1%. Some people claim 2%.
one would need 383.83 code values (9 bits) to cover the entire range of intensities (no wonder they're making more and more 10-bit SDR monitors now). And that's with 2%, which may still be visible (banding!).
With 8 bit there are only 256, soo...
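The arithmetic behind that 383.83 figure, if anyone wants to check it: with steps that each grow by a fixed Weber fraction over the previous intensity, the number of steps needed to span a contrast ratio is log(ratio) / log(1 + fraction).

```python
import math

# Steps needed to span a contrast ratio when each step is a fixed
# Weber fraction above the previous intensity.
def steps_needed(contrast_ratio, weber_fraction):
    return math.log(contrast_ratio) / math.log(1 + weber_fraction)

steps = steps_needed(2000, 0.02)      # the 2000:1 monitor, 2% fraction
print(round(steps, 2))                # ~383.83
print(math.ceil(math.log2(steps)))    # 9 -> needs 9 bits (2**9 = 512 codes)
```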
Moving on: as per Mr. Poynton:
A CRT in an office environment rarely achieves a contrast ratio better than 10 to 1. A ratio of 30:1 is typical of viewing television in a living room. A very good cinema can achieve a contrast ratio of 100:1
Meaning that the only reason you don't get to see banding, even with 8 bit, is your "environmental constraints", which makes the whole thing anything but SDR.
And don't get me started on monitors capable of contrast ratios of 2,000,000:1...
I do see banding, on screen, on 8-bit sources - in or out of Vegas. My screen shows the 10-bit sources to be smooth and band-free, even in motion. VEGAS shows the 10-bit as 8-bit in preview - it bands my smooth 10-bit source. With some tricks it is easy to REDUCE banding, at least to a point where no one but pixel peepers will notice it.
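For anyone curious what "8-bit stepped" means in code-value terms, a quick sketch: quantising the same smooth ramp to 8 and 10 bits shows how many distinct levels survive. This is a toy model of quantisation, not what the Vegas preview actually does internally.

```python
# Quantise a smooth 0..1 ramp to n-bit code values. Fewer code values
# means wider runs of identical output, i.e. visible bands.
def quantize(values, bits):
    levels = (1 << bits) - 1
    return [round(v * levels) / levels for v in values]

ramp = [i / 1000 for i in range(1001)]     # a smooth gradient
levels_8 = len(set(quantize(ramp, 8)))     # distinct output levels
levels_10 = len(set(quantize(ramp, 10)))
print(levels_8, levels_10)                 # 256 vs 1001 on this ramp
```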
Done with numbers. Don't care about numbers. I have the clay in hand, gonna play! :D
Assuming you have Vegas 16+, your GPU is HDR capable, your external display is hooked up to an HDR TV or monitor, Windows 10 is in HDR mode on that screen, and Vegas' project settings are also in HDR mode, then yes, Vegas will give a 10-bit HDR display on a secondary monitor.
AFAIK only pro-level GPUs will display 10 bit in Vegas' preview window. Consumer-level cards can do 8-bit preview with 10-bit external (Vegas is the only NLE I know of that can do this; the rest require pro GPUs or other display adapters like BM output cards for 10-bit external preview, at least the last time I checked).
Dipping the toe in... temperature is PERFECT! I think this is going to be amazing! I can FEEL the extra colors oozing while editing. Don't have an HDR monitor yet, but have friends who can play the YouTube renders on their TV, so will test that soon. Not seeing broken scopes, but I have added garlic, BBQ sauce, virgin olive oil, and my special secret seasoning combination to end up with a "look" I want for this project. LOT of learning to do before I do a final HDR render. (PS: that shot is an early-morning wake-up and is supposed to be dark like that)
EDIT: and I see the pic above is destroyed on upload and looks pretty hideous. Looks better in person
You're definitely in the HDR range, going up to 1k nits, but keep in mind when grading HDR that not every scene has to fill the entire dynamic range. If you come at it from that perspective, then when you do have highlights that should be extremely bright (a flashlight pointed at the camera, a flame, the sun, etc.), they will pop that much more than if you saturate the viewer's eyes with 1k-nit brightness in every shot. Your shot is kinda dark, so maybe roll off those highlights just a bit.
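The "roll off those highlights" idea can be sketched as a simple shoulder curve. The knee and peak numbers below are illustrative choices, not Vegas parameters:

```python
import math

# A simple highlight "shoulder": values below the knee pass through,
# values above are asymptotically squeezed toward the display peak.
# The knee/peak numbers are illustrative, not Vegas parameters.
def rolloff(nits, knee=400.0, peak=1000.0):
    if nits <= knee:
        return nits
    span = peak - knee
    return knee + span * (1 - math.exp(-(nits - knee) / span))

print(rolloff(200.0))          # 200.0  (mid-tone passes through untouched)
print(round(rolloff(1000.0)))  # ~779   (bright highlight, compressed)
print(round(rolloff(5000.0)))  # ~1000  (the sun still fits under the peak)
```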
As for monitors, I'm using an LG OLED display. You can get them in the $1200-1500 range these days... though I might go for a Quantum Dot display instead if I were you, so you can reach into those upper nit levels (OLEDs peak around 750 nits, but can dive way deeper into the blacks (essentially pure black) and have rock-solid color reproduction), and you can pick those QD displays up a bit cheaper on average as well.
If you want true pro monitors, not consumer grade displays, you may want to ask Musicvid to chime in on what industry standards to check out. My OLED does what I need for my purposes, and everyone seems to really like the results, but your mileage may vary.
Frosty, I understand not having a REAL HDR monitor will not show my physical eyes REAL output brightness levels - I do get that. But what does the view transform show me as-is in Vegas? It should be a dumbed-down version, but should still be in the ballpark for the "look", correct? I.e., blacks won't be buried in HDR or in SDR, nor whites blown any further - it's just way brighter and more contrasty in HDR, right?
The setup video, which I think is from Vegas, said to leave it at sRGB (ACES) if viewing on a regular computer monitor.
Colors will be limited, in addition to brightness. You'll lose detail in the highlights especially, so shots that look fine in HDR will look overexposed in SDR. The view transform helps to negate that, but you're still flying blind in those shades that SDR cannot resolve.
It's a very meaningful topic. The conclusion first: it is possible, you can get an HDR look from SDR footage. However, in the case of an 8-bit clip, banding is likely to occur, and you cannot restore clipped whites or blacks.
First of all, I would like you to know that SDR is generally understood as information up to 100 cd/m2. That is correct, but it does not mean the image is actually clipped at 100 cd/m2. This is easy to understand if you consider what happens when you clip an HDR image at 100 cd/m2: as a matter of course, all areas outside the SDR range will be blown out, making it unsuitable for viewing.
In reality, however, this is not what happens. In most cases the high-luminance side is gradually rolled off and desaturated, as if compressed. This is a very sensible approach for processing and display in SDR.
For example, suppose you gain an image that reaches 1000 cd/m2 down to 100 cd/m2. The image will be very dark and have low contrast. This is close to the so-called log state, and it can be output as a Rec.709 image while maintaining a very high dynamic range as a ratio.
However, as mentioned above, an image made this way is very dark and loses contrast, because SDR TVs usually display at 100 cd/m2, and even a bright consumer TV reaches at most about 300 cd/m2. Still, a 1000 cd/m2 image stored in the SDR range by this method can be restored to the original levels up to 1000 cd/m2 simply by adjusting the gain. This process can be regarded as the same idea as log encoding.
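The gain trick described above, in a few lines (plain nits arithmetic, not any particular NLE's implementation):

```python
# A constant gain changes absolute brightness but preserves dynamic
# range as a ratio -- the "log-like" idea described above.
def apply_gain(nits, gain):
    return [v * gain for v in nits]

scene = [0.5, 100.0, 1000.0]           # linear light, up to 1000 nits
sdr_safe = apply_gain(scene, 0.1)      # now under 100 nits, but dark
restored = apply_gain(sdr_safe, 10.0)  # gain back up: levels return

ratio = lambda vals: max(vals) / min(vals)
print(round(ratio(scene)), round(ratio(sdr_safe)))  # 2000 2000
```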
Now think again about normal SDR video. As I mentioned earlier, the low- and high-brightness ends are curved and gradually compressed. This is necessary to prioritize the appearance in SDR.
Should we conclude that such a clip cannot be converted to HDR because all the high-brightness information has been discarded? The answer, I would say, is no.
Since the high-brightness side is only compressed, not completely saturated, considerable information can be extracted by converting it to scene linear.
This idea has also been adopted in the ACES workflow. Even for images with exactly the same content, if an image reaching only 100 cd/m2 is input to ACES in PQ format, the IDT converts it to the ACES color space without any expansion; it remains just a dim image up to 100 cd/m2.
In the case of SDR video, however, if Rec.709 is specified as the input color space, the IDT linearizes it on conversion to ACES and treats it as an HDR image reaching roughly 600 cd/m2.
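A rough sketch of that expansion: linearising a Rec.709-coded value with the inverse BT.709 OETF, then scaling to a peak. The ~600 cd/m2 figure is taken from the post above and used here as an assumed peak; real ACES IDTs are more involved than this.

```python
# Inverse of the BT.709 OETF: recover scene-linear light (0..1) from a
# Rec.709-coded signal value (0..1).
def rec709_to_linear(v):
    if v < 0.081:
        return v / 4.5
    return ((v + 0.099) / 1.099) ** (1 / 0.45)

# Assumed peak: the ~600 cd/m2 figure mentioned above.
PEAK_NITS = 600.0

def sdr_code_to_nits(v):
    return rec709_to_linear(v) * PEAK_NITS

print(round(sdr_code_to_nits(1.0)))  # 600  (full-scale white)
print(round(sdr_code_to_nits(0.5)))  # 156  (mid signal: far below half)
```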
Of course, depending on the shooting conditions and so on, there may be cases where a higher brightness would be appropriate. In such cases, you can get the look you want by adjusting the brightness with the color-correction tools as needed.