Where we at with HLG?

AVsupport wrote on 3/17/2019, 7:00 AM

I know Vegas is no longer a Sony product, but HDR and HLG acquisition are on their way into living rooms and offices. What's the way forward, the vision, the direction?

As a Sony A7iii user, I could shoot HLG now, but what's the point? Should I stay with Cine4?

my current Win10/64 system (latest drivers, water cooled) :

Intel Coffee Lake i5 Hexacore (unlocked, but not overclocked) 4.0 GHz on Z370 chipset board,

32GB (4x8GB Corsair Dual Channel DDR4-2133) XMP-3000 RAM,

Intel 600series 512GB M.2 SSD system drive running Win10/64 home automatic driver updates,

Crucial BX500 1TB EDIT 3D NAND SATA 2.5-inch SSD

2x 4TB 7200RPM NAS HGST data drive,

Intel HD630 iGPU - currently disabled in Bios,

nVidia GTX1060 6GB, always on latest [creator] drivers. nVidia HW acceleration enabled.

main screen 4K/50p 1ms scaled @175%, second screen 1920x1080/50p 1ms.

Comments

JackW wrote on 3/17/2019, 1:33 PM

This article -- https://www.digitaltrends.com/home-theater/what-is-hlg-hdr/ may help in answering this question. And I would be interested to know how many on this forum have had paying customers ask for delivery in HDR and/or HLG formats.

eikira wrote on 3/17/2019, 2:06 PM

I know Vegas is no longer a Sony product, but HDR and HLG acquisition are on their way into living rooms and offices. What's the way forward, the vision, the direction?

As a Sony A7iii user, I could shoot HLG now, but what's the point? Should I stay with Cine4?

Well, technically you can now edit HDR and export it from Vegas Pro (16 at least; I don't remember about 15) in all kinds of HDR formats and specs. But in reality, the workflow and the performance drop are just a nightmare in Vegas.
I mean, I have a fairly decent PC (6-core Intel 6850K at 4 GHz, 32GB 3200MHz RAM and a new RTX 2070 GPU), but the reality is that you need to set the project to the full-range BT.2020 pixel format, and then it takes forever to render; it's like rendering a UHD video CPU-only on a single core from 12 years ago.

And don't forget, you also need very decent displays to see exactly what you are doing. So at least in the semi-pro segment, HDR in general is a pain in the ass and will cost you some money for new hardware.

Unless you plan it out really well and your project needs to be state of the art, yes, stay with Cine4.
Magix clearly needs to do something to make the 32-bit full-range pixel format workable.

fr0sty wrote on 3/17/2019, 5:05 PM

I shoot HLG on my GH5 when I don't feel like grading my video into HDR from VLOG; it works just fine.

As for 32 bit performance, you get your editing done first in 8 bit, then switch to 32 for the final coloring phase. The performance isn't good, but it isn't unusable. I know Magix is putting priority on improving 32 bit performance in future versions.

As for who asks for HDR, I usually sell it to my clients by explaining what it is and showing it to them on my OLED monitor. Doesn't usually take much convincing once I fire that TV up.


Systems:

Desktop

AMD Ryzen 7 1800x 8 core 16 thread at stock speed

64GB 3000mhz DDR4

Geforce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

AVsupport wrote on 3/17/2019, 5:13 PM

Yes, good points @eikira. I feel like there's not a lot of difference in acquisition between Cine4 and HLG3, as they look and behave roughly the same, apart from a slightly different (log) highlight rolloff with HLG.

I guess the biggest difference is the metadata, which lets HLG be identified as such on a compatible screen.

And yes, I definitely don't want to edit in 32-bit until VP has a better core (NVidia already has hardware encode support for HDR). What happened to 10-bit, seeing that this is where everyone is headed?

Cine4 and HLG3 are 8-bit XAVC-S coming out of my Sony A7iii. I would love to keep the project and export format the same, but HDR-conformant?

(@JackW your site doesn't like ad-blockers, sorry what's the story?)


eikira wrote on 3/17/2019, 5:31 PM

I shoot HLG on my GH5 when I don't feel like grading my video into HDR from VLOG; it works just fine.

VLOG on the GH5 does not mean HDR by default. So I think you are mixing things up here.

As for 32 bit performance, you get your editing done first in 8 bit, then switch to 32 for the final coloring phase. The performance isn't good, but it isn't unusable. I know Magix is putting priority on improving 32 bit performance in future versions.

That only works if your footage is consistent overall, like an interview with an unchanging environment, or if you have done absolutely perfect preparation and set everything up right, meaning very good lighting, exposure etc. If you have a whole movie but don't have the time, money and personnel, that is a very annoying "workaround".

It's only "usable" without effects and only at poor preview quality.

I hope Magix puts some serious effort into fixing these performance issues BEFORE they ask for money again for a version 17...


As for who asks for HDR, I usually sell it to my clients by explaining what it is and showing it to them on my OLED monitor. Doesn't usually take much convincing once I fire that TV up.

Again, OLED alone does not mean HDR in general.
Overall it looks like 2-3 years have to pass before there is broad support for all the HDR versions and standards, OR one standard just kicks out all the others. Because as it is right now we have so many things to consider:
BT.2020, full or limited color range, BT.709, HDR10, VESA DisplayHDR 200/400/1000, 1000 nits, 2000 nits, 4000 nits, HLG, Dolby Vision, some at 8 bits per channel, some at 10 bits per channel, and in the future we already have to think about 12-14 bits per channel, etc., just to name the best known; and some even overlap and are compatible with each other. It is a mess right now.

So as long as you don't have a really good multi-standard HDR OLED AND good footage that shows what your equipment is capable of delivering, sure, the client will not be convinced much by it.

JackW wrote on 3/17/2019, 7:22 PM

@AVsupport: Sorry, you've lost me here.

AVsupport wrote on 3/17/2019, 7:57 PM

Sorry @JackW, I was unclear: I couldn't read the link you provided without turning off the ad-blockers in my browser, hence I couldn't read the article...


fr0sty wrote on 3/18/2019, 6:04 PM

I shoot HLG on my GH5 when I don't feel like grading my video into HDR from VLOG; it works just fine.

VLOG on the GH5 does not mean HDR by default. So I think you are mixing things up here.

- I am well aware of what VLOG is vs. HDR. VLOG must be graded into Rec2020 color space (assuming it also was shot in VGamut), but you can produce HDR from VLOG, and it's in fact the preferred way to go as HLG doesn't grade as well.

As for 32 bit performance, you get your editing done first in 8 bit, then switch to 32 for the final coloring phase. The performance isn't good, but it isn't unusable. I know Magix is putting priority on improving 32 bit performance in future versions.

That only works if your footage is consistent overall, like an interview with an unchanging environment, or if you have done absolutely perfect preparation and set everything up right, meaning very good lighting, exposure etc. If you have a whole movie but don't have the time, money and personnel, that is a very annoying "workaround".

It's only "usable" without effects and only at poor preview quality.

- It works in any scenario. You cut your video in 8 bit, as cutting has nothing at all to do with color grading or correcting and you do not need the extra accuracy. When it comes time to color, switch to 32. There is no reason to cut the video in 32 bit pixel format. Smooth framerate is more important to the editing process, higher resolution is more important to the coloring phase, so I usually edit at a lower preview quality in 8 bit mode, then color at higher quality in 32 bit mode.

I hope Magix puts some serious effort into fixing these performance issues BEFORE they ask for money again for a version 17...


As for who asks for HDR, I usually sell it to my clients by explaining what it is and showing it to them on my OLED monitor. Doesn't usually take much convincing once I fire that TV up.

Again, OLED alone does not mean HDR in general.

- Again, I am well aware of what OLED is and the fact that it can display non-HDR content. I am talking about producing actual HDR content and displaying it on my actual HDR TV to clients, who upon seeing it actually want to buy it.


Overall it looks like 2-3 years have to pass before there is broad support for all the HDR versions and standards, OR one standard just kicks out all the others. Because as it is right now we have so many things to consider:
BT.2020, full or limited color range, BT.709, HDR10, VESA DisplayHDR 200/400/1000, 1000 nits, 2000 nits, 4000 nits, HLG, Dolby Vision, some at 8 bits per channel (What HDR standard is 8 bit? Every one I've ever seen is either 10 or 12 bit.), some at 10 bits per channel, and in the future we already have to think about 12-14 bits per channel, etc., just to name the best known; and some even overlap and are compatible with each other. It is a mess right now.

- I totally agree, though it is encouraging that HDR10 and HLG seem to be leading the pack. My guess is that HDR10/10+/etc will prevail as the standard for films and online distribution, and HLG will be the standard for broadcast. Proprietary formats rarely succeed, so I don't see Dolby gaining any ground unless it goes open source.

So as long as you don't have a really good multi-standard HDR OLED AND good footage that shows what your equipment is capable of delivering, sure, the client will not be convinced much by it.

I have several clients, and several thousand dollars' worth of additional income I've charged them for HDR productions, who beg to differ. If you know how to sell it to them, people actually like future-proofing their productions, especially when it comes to things like weddings, concerts, or other large events. As for the format confusion, some clients have opted to have me record using 10 bit formats but not bother grading into HDR yet; they just keep the 10 bit raw masters around for later. When they get around to buying an HDR set, they can pay me a bit extra to grade it for them. I had to deal with similar challenges convincing clients to buy my DVDs back in the 90s, then blu-rays in the 2000s... at one point I was even giving blu-ray players away as part of the package if they chose to have me make HD videos of their event. Now about 30% of the discs I produce are BDs, though I'm producing fewer and fewer discs at all these days.

I agree the tech is still early, and isn't an easy sell yet, but I'm glad to see Magix leading the way with adopting it and producing the easiest and most feature-rich HDR workflow of any reasonably priced NLE.

 



AVsupport wrote on 3/19/2019, 12:57 AM

"What HDR standard is 8 bit?" @fr0sty I have to admit this is all relatively new to me, and I probably haven't done my homework;

But how come Sony has an A7iii (the camera I am now using) that shoots 'HLG' using 8-Bit XAVC-S? Is that cheating? Is it not true and non-conformant HLG? Just curious..


eikira wrote on 3/21/2019, 9:17 AM

But how come Sony has an A7iii (the camera I am now using) that shoots 'HLG' using 8-Bit XAVC-S? Is that cheating? Is it not true and non-conformant HLG? Just curious..

HLG is just the "color amount" information. Instead of BT.709 colors, with HLG you can get BT.2020.

8-bit vs 10-bit means the number of steps between the colors is increased; in other words, you can make smoother color transitions because you have more levels. Instead of each color channel (RGB) having only 256 levels, with 10-bit you increase that to 1024 levels per channel.

And I don't see anything in the HLG specification that makes 10-bit mandatory.
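
A quick arithmetic sketch of the levels point in Python (purely illustrative, not tied to any camera or NLE):

# Number of code values per channel at a given bit depth, and the size of one
# quantization step if the 0..1 signal range is spread evenly across them.
def levels(bits: int) -> int:
    return 2 ** bits

for bits in (8, 10, 12):
    n = levels(bits)
    step = 1 / (n - 1)
    print(f"{bits}-bit: {n} levels per channel, step = {step:.5f} of full range")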

AVsupport wrote on 3/21/2019, 4:59 PM

@eikira, HLG is Hybrid Log-Gamma, not a gamut.

https://en.wikipedia.org/wiki/Hybrid_Log-Gamma

But yes, on the A7iii you can choose different colour gamuts in combination with HLG.

And yes, 10-Bit will give you more graduated information. But the same applies for shooting 10-Bit in 709, Cine or LOG.



fr0sty wrote on 3/21/2019, 6:07 PM

It is either incorrectly reporting 8 bit when it is really 10, or HLG in the a7iii doesn't conform to HLG spec, which is 10 or 12 bit.

AVsupport wrote on 3/21/2019, 6:12 PM

It is either incorrectly reporting 8 bit when it is really 10, or HLG in the a7iii doesn't conform to HLG spec, which is 10 or 12 bit.

The XAVC-S codec only supports 8-bit. I have posted this question on the Sony Community forum, but I don't expect to get any answers on the topic of 'misleading advertising'...


eikira wrote on 3/21/2019, 9:38 PM

@eikira, HLG is Hybrid Log-Gamma, not a gamut.

https://en.wikipedia.org/wiki/Hybrid_Log-Gamma

But yes, on the A7iii you can choose different colour gamuts in combination with HLG.

And yes, 10-Bit will give you more graduated information. But the same applies for shooting 10-Bit in 709, Cine or LOG.

I said nothing about gamut at all, but in the case of HLG it is part of the HDR standard. To be precise, it is even more than just BT.2020, it's BT.2100. So please don't school me on things you yourself asked about; that seems a bit twisted.
I just explained the issue of HLG as an HDR option and what the deal is with 8-bit vs 10-bit, which are not the same thing, because you asked whether your camera is cheating by only recording in 8-bit. Don't mix them up by default...

It is either incorrectly reporting 8 bit when it is really 10, or HLG in the a7iii doesn't conform to HLG spec, which is 10 or 12 bit.

Can you point out the line in the specification where that is the case? It uses the colour palette of BT.2100, yes, but I can't see that this is a must-have of HLG itself. Those are not the same things.

The XAVC-S codec only supports 8-bit. I have posted this question on the Sony Community forum, but I don't expect to get any answers on the topic of 'misleading advertising'...

Because it is not. Again, I can't see any line in the specification of HLG itself that would not allow Sony to do that. In fact, if you look at the Wikipedia page you recommended yourself, you will see that there is a 10-bit specification mentioned in source no. 38, called HLG10. THAT one needs to be 10-bit or it does not meet the standard's specifications.

In the end, everything is all right and Sony did not cheat; that is the answer to your question.

fr0sty wrote on 3/21/2019, 10:01 PM

No need to be condescending to people who are just asking questions and trying to understand something. You may not intend it, but you're coming across unnecessarily harsh.

As for HLG 10 bit,

"Hybrid Log-Gamma (HLG) is a HDR standard jointly developed by the BBC and NHK.[59] It is compatible with standard dynamic range (SDR) displays, although it requires 10-bit color depth."

https://en.wikipedia.org/wiki/High-dynamic-range_video#Hybrid_Log-Gamma

And from the BBC's website, talking about why they created HLG:

"Finally we wanted an approach that was compatible with our current 10 bit infrastructure and only needed changes to the cameras and critical monitoring displays. This led us, and NHK who shared many of our concerns, to invent the Hybrid Log-Gamma (HLG) system for HDR."

There are no 8 bit flavors of HDR that I can find.



AVsupport wrote on 3/21/2019, 10:27 PM

HLG is just the "color amount" information

Well, I believe this is not true; as per the above, the HLG gamma describes an image dynamic range far greater than 709, hence this being an HDR standard.

Sony XAVC-S can deliver dynamic range greater than 709 (with Cine and LOG curves), but only in 8-bit. The A7iii can do HLG and, within that gamma, different color gamuts such as 709 and BT.2020.

But "Rec. 2020 defines a bit depth of either 10 bits per sample or 12 bits per sample.". So, technically, I believe they couldn't call it conformant..


eikira wrote on 3/21/2019, 11:50 PM

No need to be condescending to people who are just asking questions and trying to understand something. You may not intend it, but you're coming across unnecessarily harsh.

I am not really harsh. Just wanted to point things out directly.

As for HLG 10 bit,

"Hybrid Log-Gamma (HLG) is a HDR standard jointly developed by the BBC and NHK.[59] It is compatible with standard dynamic range (SDR) displays, although it requires 10-bit color depth."

https://en.wikipedia.org/wiki/High-dynamic-range_video#Hybrid_Log-Gamma

I think it is still a misunderstanding. It was probably intended that way in the beginning, but it clearly is not the case now, otherwise the explicit definition of HLG10 would not be necessary. Don't you think?

And from the BBC's website, talking about why they created HLG:

"Finally we wanted an approach that was compatible with our current 10 bit infrastructure and only needed changes to the cameras and critical monitoring displays. This led us, and NHK who shared many of our concerns, to invent the Hybrid Log-Gamma (HLG) system for HDR."

There are no 8 bit flavors of HDR that I can find.

The flavor is hidden in the word compatible.

HLG is just the "color amount" information

Well, I believe this is not true; as per the above, the HLG gamma describes an image dynamic range far greater than 709, hence this being an HDR standard.

I would agree, if your Sony gives you only BT.709. That's what I've been trying to get across all along: you can't really have a higher dynamic range with only the already established colors (amount of colors).

Let me try to describe it in words and numbers alone. Let's define BT.2020 (or BT.2100) as the number 200 and BT.709 as the number 150.
BT.709 sits within the range and boundaries of BT.2020: 150 fits inside 200, but with space on all sides. The 200 range starts at 0 and ends at 200, while 150 starts somewhere inside that space, say at 25, and ends at 175; 200 reaches further toward the "edge".

So with 150 you can only reach down to 25 and up to 175 of the 200 range.

And while you can consider 150 to be SDR, with the extra 50 of the 200 range the dynamics are greater. Where 150 alone could not pick a specific color, 200 with its increased dynamics could choose, say, 79 or 81 at the spot where 150 only offers 80, or go lower or higher at the ends to represent a more accurate color.
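
To put the "smaller range inside a bigger range" analogy in concrete terms, here is a short Python sketch comparing the CIE 1931 xy chromaticity triangles of BT.709 and BT.2020. The primary coordinates are the published values from the two standards; the triangle-area ratio is only a rough illustration of relative gamut size:

# CIE 1931 xy chromaticity coordinates of the R, G, B primaries.
REC709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def triangle_area(primaries):
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

a709, a2020 = triangle_area(REC709), triangle_area(REC2020)
print(f"BT.709 gamut triangle area:  {a709:.4f}")
print(f"BT.2020 gamut triangle area: {a2020:.4f}")
print(f"BT.2020 covers roughly {a2020 / a709:.1f}x the xy area of BT.709")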

So if your Sony gives you only BT.709, then I would consider it cheating; at least it gives you no real way to use HLG in editing at all. It would look like the camera captures the HLG range itself, defines the colors on the palette, assigns them their spots and burns that information into the file. If you only have BT.709, I would consider it useless. Well, not entirely useless: if you just show it without any editing at all, the picture straight from the camera would probably look rich in dynamics on screen, so for personal use it could be OK. But Sony calls it HLG3, so there is probably a reason for that specific name. Right now I could not find clear information about HLG3 and why Sony uses that specific thing.

Sony XAVC-S can deliver dynamic range greater than 709 (with Cine and LOG curves), but only in 8-bit. The A7iii can do HLG and, within that gamma, different color gamuts such as 709 and BT.2020.

Well, is the Cine profile really a different dynamic range than BT.709?

As an example I'll use my GH5.
If I use CineD (Cine Dynamic) it's clearly a flat image, giving me the possibility to grade a bit myself.
But if I then look into the file itself in terms of colors and bit depth, this is what's in it:

Chroma subsampling             : 4:2:0
Bit depth                      : 8 bits
Color range                    : Full
Color primaries                : BT.709
Transfer characteristics       : BT.709
Matrix coefficients            : BT.709

If we speak about LOGs, they are only useful if the software has a clearly matching profile to choose from. Vegas has quite a few LOG profiles for transforming a file from a specific LOG. So far I have not seen that Vegas is able to recognize which LOG it is; instead it just manages to distinguish between BT.709 and BT.2020, and even that does not seem to work every time.

But "Rec. 2020 defines a bit depth of either 10 bits per sample or 12 bits per sample.". So, technically, I believe they couldn't call it conformant..

That I already agreed on, but HLG, as I mentioned too, does not claim to be BT.2020; instead it can handle it within its specification. That is the reason HLG can even use BT.2100: it is not bound to those alone, but uses them only as it needs.

 

Look, I would also prefer much clearer-cut specifications, but it is what it is: we have a mess, and your Sony most likely "gets away" with its HLG3 without breaking any required specifications. My GH5, again as an example, does not allow HLG in 8-bit, so Panasonic has clearly set up stricter rules.
In your case, if your Sony A7III does not use BT.2020 for its HLG but only BT.709, I would simply use S-Log3, just because Vegas at least has a matching profile to select for exactly that purpose, so that color grading makes sense, and use HLG only if I did not want to grade anything at all and just wanted a "pre-graded", balanced, dynamic picture.

As we have it now, and that was your initial question, we are at a big giant pile of confusion and an unnecessary mix of all kinds of standards, and we have to choose something out of this mess that has a good chance nobody will bother with in 5 years. HLG will probably be the thing for the broader market; I mean, we are talking about a system by the BBC and NHK that will be used at practically every gigantic global sports event, so it will define something. And when a TV says HLG we can be pretty much guaranteed it will display something better than SDR, no matter whether it's 10-bit or 8-bit. That's pretty much the selling point of HLG: future pictures will look very good on a good HDR display, but it can still handle "crappy" SDR displays without destroying the picture for all the viewers with old equipment.

AVsupport wrote on 3/22/2019, 12:45 AM

It's important not to confuse 7-stop dynamic range 709, which can come at various bit rates and resolutions, with a 14-stop dynamic range at various bit rates and resolutions. Recording 14-stop S-Log on an 8-bit camera doesn't make a lot of sense because of banding and posterization in the shadows and highlights, caused by the lack of data in those areas. However, Cine profiles and HLG can make sense and increase the dynamic range compressed into the recording standard by 'flattening' the image on the acquisition side.

All HLG really is, is pretty much 709 with a log highlights curve. The embedded metadata then triggers the HDR display to show it correctly.
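
For reference, the HLG transfer function itself (BT.2100 / ARIB STD-B67) matches that description: up to 1/12 of the normalized scene-linear range it is a plain square-root (gamma-like) curve, and above that it switches to a logarithmic segment for the highlights. A minimal Python sketch of the reference OETF, using the published constants:

import math

# BT.2100 HLG OETF constants.
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e):
    """Map normalized scene-linear light e in [0, 1] to the HLG signal value."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)            # SDR-compatible, gamma-like segment
    return A * math.log(12 * e - B) + C    # logarithmic highlight rolloff

for e in (0.0, 0.01, 1 / 12, 0.25, 0.5, 1.0):
    print(f"linear {e:.4f} -> signal {hlg_oetf(e):.4f}")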

I am keen to know if this [HLG] metadata can be read and written by VP in the rendering process, even when editing 8-bit, or if there are moves to do this in the next version, as I am all about future-proofing.

Hence my question in the OP: "What's the way forward, the vision, the direction?"

FYI, my Sony A7iii can record in both colour spaces, BT.2020 and 709, when choosing HLG. You can find more info on the Sony website: https://helpguide.sony.net/di/pp/v1/en/contents/TP0000909109.html

 



eikira wrote on 3/22/2019, 1:52 AM

It's important not to confuse 7-stop dynamic range 709, which can come at various bit rates and resolutions, with a 14-stop dynamic range at various bit rates and resolutions.

Sure, but nobody really said that here. But we also don't need to confuse how the sensor gathers the image information with how the camera processes that information and stores it.

Recording 14-stop S-Log on an 8-bit camera doesn't make a lot of sense because of banding and posterization in the shadows and highlights, caused by the lack of data in those areas.

It depends. Let's bring more confusion into this to show further problems in understanding it. Logs are used to color grade footage. It is a matter of color accuracy, because without color accuracy the grading can be difficult and can distort the colors themselves if you bend them too harshly one way or the other. That's why you also have the color format in this whole game.
That's why older cameras in the semi-pro segment also had RAW formats, to enable at least some sort of accurate grading while still remaining in the BT.709 field and not distorting colors too greatly, with 4:2:2 in MPEG-2 for example; those cameras still used more stops than the average camcorder for home use, etc.

But yes, I agree with you: today it doesn't seem to make much sense to grade from BT.709 if you have the possibility to do it with better inputs.

However, Cine profiles and HLG can make sense and increase the dynamic range compressed into the recording standard by 'flattening' the image on the acquisition side.

I think you are confusing things again, and I already explained that, at least with the example of my GH5. But if we look at Sony and Vegas, you are limited to S-Log3 for grading, so your Cine profile will not do much for you here anyway; just because it's flat does not mean the software can magically regain information that is not there. It only makes sense if you know exactly what you want and how the flat Cine profile works for you, your system and your software.

But let's say you are working on a feature film project and you have no clue who is going to do the editing/color grading on it. Do you think you can get a better result in the end with S-Log3, a quite well-defined LOG standard, or with a Cine profile tuned to a specific camera? In the case of a Cine profile, you could also just use the Standard profile and simply prepare proper lighting, camera positioning and a spot-on white balance in the camera settings, etc.

All HLG really is, is pretty much 709 with a log highlights curve. The embedded metadata then triggers the HDR display to show it correctly.

No, we already established that HLG is not that. I don't understand why there is still confusion about it. HLG is simply able to handle BT.709, BT.2020 and BT.2100. To now say it's just an enhanced BT.709 makes no sense to me.

I am keen to know if this [HLG] metadata can be read and written by VP in the rendering process, even when editing 8-bit, or if there are moves to do this in the next version, as I am all about future-proofing.

That's a good question. Since Vegas is not really able to recognize it automatically, I would say no. Well, you have to define what your source is, and that was my point before with S-Log3 and Cine.


I don't know all the settings needed for HLG in the metadata segment, but VP simply disables metadata if you edit in 8-bit; it even disables HDR10 in the rendering profiles automatically if your project is set up for 8-bit editing. Which makes sense in a way: whether you want some kind of metadata passthrough or not, Vegas does not care; it more or less says you either build your project in 8-bit or in the full 32-bit pixel format. Otherwise, in 8-bit edit mode, the metadata option is simply grayed out in the export dialog.
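
If you want to verify what a rendered file actually got flagged with, one option outside Vegas is to read the stream's colour metadata with FFmpeg's ffprobe (this assumes ffprobe is installed and on the PATH, and 'render.mp4' is just a placeholder file name). An HLG/BT.2100 render would typically report the 'arib-std-b67' transfer characteristic. A rough Python sketch:

import json
import subprocess

def color_metadata(path):
    """Return the colour-related fields of the first video stream via ffprobe."""
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=pix_fmt,color_transfer,color_primaries,color_space",
         "-of", "json", path],
        capture_output=True, text=True, check=True)
    return json.loads(result.stdout)["streams"][0]

info = color_metadata("render.mp4")  # placeholder file name
# Expected for an HLG render: color_transfer "arib-std-b67",
# color_primaries "bt2020", color_space "bt2020nc".
print(info)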

Hence my question in the OP: "What's the way forward, the vision, the direction?"

 

Only Magix could give you a clear answer to that, and to how far they want to go with HLG and sweep the confusion away by supporting it fully. But they won't really be able to either, because you also need displays that can handle the exact specifications and recommendations of the standards. And as long as manufacturers can't say "this display is HLG future-proof", Magix will not invest too much effort, time and resources in supporting everything and defining their software around it. And nobody can really blame them for not caring too much about supporting the whole jungle of confusion.

I personally also want my stuff to be future-proof, but I still haven't managed to convince myself to go down the HDR trail at all, simply because there is just too much to consider about the display alone. Not to mention the cost of something when you have no idea how it's going to look in 5 years. And in 5 years Vegas will be at version 18 or so, so you will be spending on software again anyway.

I mean, look at the big studios alone; they also can't really decide which path they should all go down. Should it be HDR10+, or rather Dolby Vision, or should everyone just go with HLG together? And even if fr0sty thinks HLG will only be a broadcast thing, it says a lot that Netflix goes with HLG for its streaming, and they also produce big movies and shows.

In short, to try to answer your question with regard to Vegas Pro: you can't be future-proof with the mess we have right now if you buy VP16, but you will have certain options to try things out. And it makes little to no sense to me to judge VP16 on whether it is future-proof for HDR in general or not, since nobody knows which formats/standards will "win".

AVsupport wrote on 3/22/2019, 3:51 AM

To now say it's just an enhanced BT.709 makes no sense to me.

I'd encourage you to watch 'Talking HLG with Alister Chapman' on YouTube.

Alister is a very knowledgeable man, a Sony ambassador and the founder of xdcam-user.com, a site which contains a lot of valuable information that I can thoroughly recommend digesting at your own leisure.

He explains very well why HLG has become what it is: because of the need for [709] backwards compatibility.

Yes I would like to know from Magix where we're going, because I would like to know if I am investing my $$ in the right product moving forward.

Eventually HDR will arrive, whether you're ready or not; I'd rather be.

You can't tell me Netflix invested all that money without knowing where they're going. I already have a 4K screen; that wasn't too hard. It's obvious to me that before we see any other major change like 8K (which is already in the works, as you know), there'll be other 'cheaper' improvements like HFR, HDR, codecs, and bitrates.


eikira wrote on 3/22/2019, 4:04 AM
 

You can't tell me Netflix invested all that money without knowing where they're going. I already have a 4K screen; that wasn't too hard.

First, I don't know how much money they invested in every aspect of their streaming business.
As far as I know, they are also into HDR10 and multiple container formats, to support as many end-user devices as possible. And that's probably part of the reason why they are in enormous debt. They too don't know where HDR is going, so they try pretty much as many things as possible.

It's obvious to me that before we see any other major change like 8K (which is already in the works, as you know), there'll be other 'cheaper' improvements like HFR, HDR, codecs, and bitrates.

Actually, nothing is really cheap in that regard. HFR is still not a topic for anyone besides Peter Jackson =). That's why Netflix is very, very happy that AV1 is coming up; they think they can just dismiss all the other formats in 4-7 years. Until then, they have to pay top technicians and software engineers to make it work on most devices. But at the moment it can't be that cheap for them.

Thanks for the video link, I will watch it later; it's always good to learn new things, hopefully with more depth.