push the envelope, bleed on the edge.

wwjd wrote on 7/5/2014, 11:18 AM
summary: how can I push everything beyond current technology?

I have a 4k camera and want to experiment PAST where things are today. for the next test/fun project, I want to go beyond: beyond current limiting standards, beyond current display abilities, etc etc

This is not about making it workable/viewable now, but pushing everything to the width of what it can do.

My camera can shoot UHD(4k) at 30p, 0-255 luminance, and Vegas can do this 32-bit output render. I plan to double the frame rate to 60, not restrict anything to levels of 16-235 (this is NOT about coloring within the lines)... if there is a range, I want to maximize it, not limit it.
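For anyone wondering what "not restricting to 16-235" actually means numerically, here is a minimal pure-Python sketch of the studio-range-to-full-range luma remap. The function name and the round-to-nearest choice are my own illustration, not anything Vegas-specific:

```python
# Hypothetical sketch: remapping studio-range (16-235) luma codes to
# full-range (0-255). The 219-code studio span is stretched across the
# full 255-code span; rounding mode is an arbitrary choice here.

def studio_to_full(y):
    """Map a studio-range luma code (16..235) to full range (0..255)."""
    return round((y - 16) * 255 / 219)

assert studio_to_full(16) == 0      # studio black -> full black
assert studio_to_full(235) == 255   # studio white -> full white
```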

Yes, disc space is considered, and I do not have 4k monitoring yet with wider color space to SEE the final product.
Dolby Vision got me thinking about advanced stuff, rather than staying within the limits...

Any advice, input, ideas I may not have thought of to achieve this?

Comments

riredale wrote on 7/5/2014, 12:45 PM
Interesting idea. But I'd also think that, for example, from a normal viewing distance one simply cannot see detail beyond a certain threshold. So shooting and displaying an image with a gazillion pixels would be a waste of resources simply because the viewer couldn't tell the difference.

As a kid I always enjoyed the "Circlerama" show at Disneyland, where they had perhaps 12 synchronized films running on large screens surrounding the audience. I've already gotten pretty decent at shooting surround-sound and it makes an amazing difference. Perhaps going the surround-video route would be the next logical step for me.

As an aside, the effectiveness of the Disney Circlerama was evident in a scene where a national park (Yosemite, perhaps?) was being filmed from the air. When the aircraft suddenly makes a sharp left banking turn, the entire audience gasps and practically falls down. That's why they installed the handrails throughout the stand-up theater and urged everyone at the beginning to hold onto them.
wwjd wrote on 7/5/2014, 4:32 PM
I remember the circle-rama stuff!

Any technical issues with shooting and rendering the full 0-255 range at the 32bit mode? I discovered my current monitor fully displays 24-bit color thanks to some tests I ran....

are there some file formats that are stuck in 8-bit or anything like that? how can one tell how many bits are produced in the final file?
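One practical way to answer "how many bits are in the final file" is to inspect the pixel format a tool like ffprobe or MediaInfo reports and read the depth off the format name. A minimal sketch below, covering a few common FFmpeg pixel-format names; the helper function and the small lookup table are my own:

```python
# Sketch: map a reported FFmpeg pixel-format name to its bit depth.
# Only a few common formats are listed; extend as needed.

PIX_FMT_BITS = {
    "yuv420p": 8,       # typical 8-bit 4:2:0 delivery format
    "yuv422p10le": 10,  # 10-bit 4:2:2, e.g. external 10-bit recording
    "yuv444p12le": 12,  # 12-bit 4:4:4
}

def bit_depth(pix_fmt):
    """Return the per-channel bit depth for a known pixel format, else None."""
    return PIX_FMT_BITS.get(pix_fmt)

assert bit_depth("yuv422p10le") == 10
```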
videoITguy wrote on 7/5/2014, 8:23 PM
wwjd, the way I see it, you are doing a lot more bleeding NOW than such thoughts require. I go with riredale's observation - for goodness' sake, examine what your outcome is supposed to be and then work backwards to engineer that outcome.

From the standpoint of processing, the issues around larger chroma sampling like 4:2:2 or using 10-bit processing throughout production and distribution have been well covered by those who care for many, many years. Not rocket science, and not a secret - these are the issues that get in the way of reaching a higher outcome than the standard of the day.

I suggest to not chase your tail and do some serious study about the larger view of video production and distribution.
wwjd wrote on 7/5/2014, 10:25 PM
I kinda agree with that, but this is more experiment than worrying about current distributions. I'm just a hobbyist, and not concerned with how or if anyone will ever see it. Worst case I can always render it normally also.



Dolby Vision aims to bypass and exceed current limits in displaying colors, dynamic range, brightness, etc., instead of sticking with the old-school thought of "this is what we have to work with now." I think this will be game-changing: no longer having to work within the limits of decades-old technology. Now, my camera is not an Arri or RED or anything, but I aim to push what I have to its absolute limits.

I need to rephrase my question I think: Can VEGAS keep up? Will it render more color, higher framerates, larger resolutions?
Last time I tried to render something with the 32-bit (video levels or full range) pixel format, I got odd blue washouts in the highlight areas. Not sure why something like that even happens; I'm sort of seeking advice from anyone who has done extended-settings rendering beyond the normal kind.

videoITguy wrote on 7/6/2014, 9:43 AM
This has been hashed out in this forum for many years. Briefly: Vegas works in an 8-bit pipeline, 8 bits in, 8 bits out. Now, 32-bit handling can occur mid-stream and is useful in a very few situations of complex, high-quality compositing. Likewise, low-loss or lossless digital intermediates are a good thing for workflows. Cineform is a good example of digital-intermediate handling at greater than 8 bits.
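The cost of an 8-bit pipeline is easy to demonstrate without any video at all: once values are rounded back to 8-bit codes, intermediate detail is gone, even if one stage did its math in float. A minimal sketch, with a made-up gain value purely for illustration:

```python
# Sketch: darken a ramp, store the result as 8-bit codes, then try to
# restore it. The 8-bit round-trip collapses distinct input levels
# (banding); a float pipeline keeps them all.

def darken_then_restore_8bit(values, gain=0.25):
    darkened = [round(v * gain) for v in values]           # stored as 8-bit
    return [min(255, round(d / gain)) for d in darkened]   # restore

def darken_then_restore_float(values, gain=0.25):
    return [v * gain / gain for v in values]  # float keeps full precision

src = list(range(256))
assert len(set(darken_then_restore_8bit(src))) < len(set(src))   # banding
assert darken_then_restore_float(src) == [float(v) for v in src]
```

With a gain of 0.25, the 256 input levels collapse to at most 65 after the 8-bit round-trip, which is exactly the posterization/banding people see.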

The large problem with Vegas as the NLE in this discussion is that there is no foreseeable way to monitor the workflow at greater than 8 bits at any step. Yes, you can output through some intermediate connections, but it all reaches the human eye as essentially 8-bit.
wwjd wrote on 7/6/2014, 10:27 AM
so when I am switching from 8 to 32 bit, and my monitor shows more than 8 bit, and I can SEE a difference in banding or whatnot, it is still only previewing in 8-bit?

Is the final render done at 32? or 24 or how do we know?

I can live with not seeing it, as long as it is saving all the info for playback on future devices with better color spaces.

I guess forward thinking, if I buy the 10-bit recorder for the GH4 (it can do 10-bit recorded externally) will Vegas squash that back into 8-bit, LOSING fidelity? or do you just mean the preview part?

BTW thanks for the info!!!
videoITguy wrote on 7/6/2014, 1:45 PM
Just like I said, all processing on the Vegas timeline will ultimately net 8 bits. You can shove in more, you can manipulate more in some digital intermediates and 32-bit activity - but your outcome in distribution/preview is ultimately 8-bit.

You may think you see more, and you should see slight improvements if you do your workflow right, but what you actually end up with is just a "better" 8 bits.
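That "better 8 bits" point can be sketched numerically: doing several operations in float and rounding once at the end is more accurate than rounding after every 8-bit step, even though both end up as 8-bit codes. The gain values below are arbitrary, chosen only to make the effect visible:

```python
# Sketch of "better 8 bits": same operations, same 8-bit output,
# but rounding once (float pipeline) beats rounding at every step.

gains = [0.7, 1.9, 0.8]  # arbitrary illustrative adjustments

def chain_8bit(v):
    for g in gains:
        v = max(0, min(255, round(v * g)))  # re-quantize at every step
    return v

def chain_float(v):
    x = float(v)
    for g in gains:
        x *= g                              # stay in float throughout
    return max(0, min(255, round(x)))       # quantize once at the end

# Accumulated disagreement across all 8-bit input codes:
errors_8bit = sum(abs(chain_8bit(v) - chain_float(v)) for v in range(256))
assert errors_8bit > 0  # per-step rounding drifts from the float result
```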

The preview pipes, even to an external monitor, are essentially clamped - so no amount of fiddling there is going to tell you much. In the hardware chain of external preview you must have a card processor like BlackMagic capable of passing 10 bits - but in effect it is just handing off the 8 bits of Vegas Pro withOUT artifact generation.
wwjd wrote on 7/6/2014, 1:56 PM
Just to be PERFECTLY CLEAR - and I know you have told me the same thing 10 times in this thread already - ....

Vegas can NOT output any file types greater than 8-bit color depth?

I would need to buy a new NLE that does?
If that is the case, what would be able to deal with 10-bit, RAW, etc file types keeping the colors and ranges intact for future display spectrums? Premiere?

I read this in Sony specs: High-precision, enhanced 32-bit floating point video levels mode for 10-bit and higher sources and render formats
so sounds like it could render 10-bit whenever I find which ones do. :)
videoITguy wrote on 7/6/2014, 2:20 PM
Rendering Cineform as an output codec (for a digital intermediate - it's not a distribution file type) will give you essentially a 9-bit result out of Vegas. Of course people call it 10-bit - but that is not quite true in Vegas' case.

To get around the preview-to-external-monitor pipeline issue, you can go into the latest version of Premiere and, through external card processing like BlackMagic, you may get a 10-bit feed out to a pro 10-bit monitor. All of this costs very many times the software investment - in fact, more than your platform investment, period.
larry-peter wrote on 7/6/2014, 3:02 PM
videoITguy, I want to make sure I'm not misunderstanding, too. I realize that Vegas' preview pipe is 8 bit limited, but wwjd also asks about outputting files greater than 8 bit.

Uncompressed files are rarely used for delivery, but I've always been under the assumption that if I work with Sony 10bit YUV .avi or any of AJA's 10bit avi codecs in a 32 bit float project and render to a similar 10 bit codec the rendered files will be proper 10 bit files. Am I missing some important information?
wwjd wrote on 7/6/2014, 4:05 PM
right Atom. I'm not worried about what I am seeing - just make sure the OUTPUT FILE can be higher than normal.

Again, this is for me, not for any distribution. I will render typical 8-bit for online, but when the DOLBYVISION drops, I want to try to extend the levels of everything.
videoITguy wrote on 7/6/2014, 5:45 PM
You can use 10-bit YUV for digital intermediates - but the question for you is: does it maintain any real increased precision over the source? The answer is no for an 8-bit source. But what about a 10-bit source? Well, absent any edit or effect that forces a re-render of the source, perhaps you have pristine 10 bits. Maybe - but did you then use Vegas to go into 32-bit processing while it was on the timeline? That remains an open question, and only holds up as long as you apply nothing but 32-bit-capable processing effects. Kind of tricky.

Given the above, you are still monitoring in VegasPro external preview with effective 8bits.

As atom12 is about to point out - the 10bit stream workflow can only be really purposeful when doing the most basic manipulation within VegasPro and then passing the digital intermediate off to something else for further work.

Premiere is a possibility - processing by BlackMagic Design Resolve is another.
wwjd wrote on 7/6/2014, 6:47 PM
hmmmm, this is a most excellent discussion. Exactly what I was looking for. Sounds like this tool IS the weakest link in the chain. Looks like most of the 32 bit plugs are the ones I use already, so I'm good to go there.

From what I gather here: http://www.cambridgeincolour.com/tutorials/bit-depth.htm
My monitor can display full 24bit - even though I know Vegas CAN'T display any more, at least I can edit, render and playback full level stuff.
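The arithmetic behind those labels is worth spelling out, since "24-bit" (monitor marketing) and "8-bit" (video pipeline) describe the same thing: 24-bit color means 8 bits per RGB channel, and a 10-bit pipeline means 10 bits per channel (30 bits total). A quick sketch:

```python
# "24-bit" monitor color = 8 bits per channel x 3 channels.
# A "10-bit" pipeline = 10 bits per channel x 3 channels = 30 bits.

colors_24bit = (2 ** 8) ** 3    # 16,777,216 colors
colors_30bit = (2 ** 10) ** 3   # 1,073,741,824 colors

assert colors_24bit == 16_777_216
assert colors_30bit == 1_073_741_824
```

So a 24-bit monitor showing a difference between 8-bit and 32-bit projects isn't displaying more than 8 bits per channel; it's displaying a better-computed 8 bits.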

This will be an interesting experiment. Can't wait to start! Waiting on getting the prop car fixed.
videoITguy wrote on 7/6/2014, 7:53 PM
I can tell from your previous post above, that you are confused, so I will go back to my very first comment - you need to study the details of video production and distribution- essentially we just made a big discussion circle returning to the beginning statement of this thread.
wwjd wrote on 7/6/2014, 7:58 PM
I don't want to learn/study anything... I'm asking for the max limits of Vegas.
And I am finding them. For example: I can't render "8K"... 4096 seems to be the max setting it falls back to when I try to render or make a project at 8K. I was able to produce a 60p UHD 4K output.

Anyone else pushed the limits to their max or found limitations in Vegas? Very few people have colored outside these lines. But the lines will be changing soon.
larry-peter wrote on 7/7/2014, 8:34 AM
@videoITguy, thanks for the confirmation. I wanted to make sure I hadn't missed some important discussion about Vegas' precision in rounding/truncating from a 32 bit float project.

You make an important point that I think many miss along the way - be very aware of the 8 bit limited plug ins you may be using. I've seen many posts over the years promoting the workflow, "edit in 8 bit, render in 32 bit" but if you begin working in an 8bit project and have certain plugins on the timeline, switching to a 32 bit project pretty much means you're just increasing rendering time.
john_dennis wrote on 7/7/2014, 9:00 AM
Come on, Greg. Join us.
Accept mediocrity.
You'll be much happier.



P.S.
Today's mediocrity was the hot new thing six months ago.
wwjd wrote on 7/7/2014, 9:29 AM
hahahahahaha! Someone's gotta break the box :)
wwjd wrote on 7/8/2014, 8:51 AM
anyone else coloring outside and above the lines?


(Darn, these crickets....) :D
Rory Cooper wrote on 7/9/2014, 1:33 AM
Oohhh so that’s what those lines were for.

I suppose ultimately you would need a display device that can handle the content color space and a media player that can handle the codec.

What camera do you have I missed it?

Have a go at rendering out to a log-type PNG sequence, then run the sequence through Photomatix - it will blow your mind. You don't have to go for a grunge look;
keep it natural.

For TV? Well, with the absolute mindless garbage they pump out these days, we might as well go back to black and white without audio - then I could maybe sit through an episode of Keeping Up with the Cantfashians and add my own words. It would be a lot more fun and creative anyway.
ushere wrote on 7/9/2014, 2:47 AM
+ 1 rory's last statement

to paraphrase palin, you can put lipstick on a pig, but it's still a pig ;-)

i am so sick and tired of BEAUTIFULLY shot cr@p that i really don't care anymore.

farss wrote on 7/9/2014, 5:25 AM
I tested this when Vegas first came out with a 32bit float pipeline.
Vegas will read, process and output the full 10 bits to/from a 10-bit codec. The use of 10-bit codecs is as old as Vegas itself; the bog-standard Digital Betacam cameras and VCRs were all 10-bit, though they used a linear gamma curve. There are cameras around today, such as the F65, recording 4K RAW at 16 bits, and anyone who cannot see the difference on a big OLED monitor has a problem. The extent to which the captured RAW images can be graded is pretty astounding.
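The "pretty simple to test" idea can be sketched without any video I/O: quantize a smooth ramp at 8 and at 10 bits and count the distinct steps. A file that genuinely carries 10 bits should hold four times as many steps in a rendered gradient as an 8-bit one. A pure-Python illustration (the function and sample count are my own):

```python
# Sketch: quantize a smooth 0..1 ramp at a given bit depth and count
# the distinct code values that survive. 10-bit keeps 4x the steps of
# 8-bit, which is what a gradient test render should show.

def quantize_ramp(samples, bits):
    levels = 2 ** bits - 1
    return {round(i / (samples - 1) * levels) for i in range(samples)}

ramp8 = quantize_ramp(4096, 8)
ramp10 = quantize_ramp(4096, 10)
assert len(ramp8) == 256    # 8-bit: 256 distinct steps
assert len(ramp10) == 1024  # 10-bit: 1024 distinct steps
```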

TBH I'm amazed that people here are saying otherwise, it's pretty simple to test and verify.


Bob.
videoITguy wrote on 7/9/2014, 9:01 AM
As some have just pointed out, in the early roots of digital video production, assembly for distribution was handled years ago by Targa photo sequences rather than the highly compressed video streams we commonly see today. At low cost, and with less demand on file space, you can do the same with .png sequences today.

Of course you can also go back to uncompressed 4:4:4 workflows if you want pristine quality. The point is that video production and distribution is always a grand compromise between the doable and the practical.

Get yourself a 4k monitor - place it 20 feet away in the living room and then do some testing.

Relevant thread remarks:
Q:Why can't they just agree on 1 standard for all video and audio footage for christ sake, that would make life a lot easier...

A:The problem is that a file optimized for hi-def theater playback will perform very badly streaming to a small mobile device, while a file optimized for the small mobile device will look very poor on a BluRay. A compression algorithm that creates excellent quality small files may be too complex to run in real time on a camcorder, while the algorithm that can run on the camcorder produces files too large to distribute.

wwjd wrote on 7/9/2014, 9:20 AM
thanks for the extra input guys...

Camera: Panasonic GH4 4K (can do 10-bit 422 when the recorder starts shipping)

yeah, this is just about having fun, not producing a quality TV show or anything, just doing it to my next project to flex the tools and push the envelope.... to prepare for future changes in TVs and viewing devices that will soon be easily capable of more than the past restrictions.

http://gizmodo.com/how-dolby-vision-works-and-how-it-could-revolutionize-1594894563