Is it just me or does anyone else notice a dramatic loss of resolution using the Mini35 in those shots?
I'd seen the same thing through a monitor at IBC a few years ago on a DVX100, but this looks even more dramatic in 720! Even the colors seem to get washed out. Maybe it's the loss of light, or just having too much glass in the optical path, or the ground glass causing diffusion?
Bob.
I guess it means one needs a final definition of "resolution". At face value, shooting direct to HD res is the same "resolution" no matter what you put in front of the lens. From the description they were using DV Rack and recording direct to the hard drive... so one could ask if the loss of resolution is due to... um... lack of tape stock? OK, I'm reaching there, but you know certain companies push how their tape stock will result in better colors and the like - I have yet to see a hard drive company say the same thing. Would there be any "resolution" change between an ATA drive, a SCSI drive and a SATA drive, for example?
I didn't notice anything dramatic resolution-wise, but if you read the whole thing they do mention they were working with a prototype camera and that there was some sort of black "noise" thing going on in half the frame, gain up or not. Maybe this is what you saw/mean.
As for the "washed out" look - speaking of a 'film look' concept here, I think most film is somewhat washed out anyway, along with being a bit soft and diffused. And from the get-go HD was defined as sharp, sharp, sharp! One wrong tweak of the focus and it was soft. So keep in mind that some of the tests here were done with the film look gamma setting as well as the Mini35, and, if I was reading the flow part correctly, they tried to figure out the whole soft skin setting as well... so yeah, there is some diffusion going on, no doubt.

Going back to my "most film is somewhat washed out anyway" comment - I mean that as a DP choice first, and then the colorist's. When I saw a restored 70mm print of Oklahoma! many years back I was blown away by the color of it all, but it was the process at the time that allowed for the most vivid colors. And I just saw "Charlie and the Chocolate Factory" and that had a funky color scheme as well once they got into the factory - a nice tip to The Wizard of Oz, I think... vivid colors inside the fantasy as opposed to the bland mono colors of the "real" world. I like the vivid look, and if I judged these tests by it I would think this camera and the Mini35 sucked. But I know it wasn't shot that way... and most films are not shot that way. So by comparison most films look a bit washed out in a sense.

(To answer your question: I did not think the colors looked washed out at all... maybe a bit dull overall, but there was a lot of green in these test shots. I would have used Fuji stock, had it been a 'film' shoot, to make the greens pop more... but for video it looks fine to me.)
And FWIW - I thought the night shot was pretty awesome on a video level. But they called it a "magic hour shot" and I would have just called it a night shot. "Magic hour" to me brings up thoughts of warm, rich yellows and golds, and a nice soft feel overall. And I am from the age where video meant no blacks - just shades of grey and muddiness in the dark areas. Shooting at night meant nothing, or muddy, pixelated "surveillance" type video. So compared to those standards this video was mind-blowing.
Yes, I think I noticed the same. It looks like a slight blur added to the shots. It's quite visible in the one shot that is provided with and without the adapter.
To be honest, this 35mm adapter didn't get much of my attention. But the rest... ;-)
The shots were done in 23.976 mode, progressive scan of course. So I carefully watched any movement in the shots - and it is all very smooth; interlaced video couldn't be any smoother, in my opinion.
When working on these original clips in Vegas, performance is much better than with 1080i footage. It's not a big problem to find a PC capable of editing the 720p m2t streams directly instead of going the intermediate route.
720p monitors are easy to find nowadays - even rather affordable ones.
And when it comes to a final render format: I tried rendering these files to WMV at the same HD resolution, and in contrast to rendering 1080i clips to progressive WMV, these 720p clips showed no remarkable losses.
If I had to pick an HDV camera right NOW, I think I would take this JVC HD100.
I guess if you're in NTSC land then the added res of 720p is a big plus. I only have three issues with this camera:
1) It's made by JVC, and we've had very bad experiences with their gear - and we're not alone. JVC claim to have fixed their QA; I hope they've done that.
2) It doesn't shoot interlaced. For a lot of things that can be a problem: watching an hour of close-shot sports action at 25p isn't a good thing.
3) Compared to SD PAL, 720 isn't that big a step up in resolution; to really get the HiRes hit over PAL you need 1080.
Apart from that there are some very nice features on that camera; even if it were only an SD camera it would be worth a look.
Bob.
Yes it does - frame rates at 60/30:
DV-60I: DV format, shoots using a 480/60i signal (U model only)
HDV-SD60P: HDV format, shoots using a 480/60p signal
HDV-HD30P: HDV format, shoots using a 720/30p signal
Frame rates at 50/25:
DV-50I: DV format, shoots using a 576/50i signal (E model only)
DV-25P: DV format, shoots using a 576/25p signal (E model only)
HDV-SD50P: HDV format, shoots using a 576/50p signal
HDV-HD25P: HDV format, shoots using a 720/25p signal
Frame rates at 24:
DV-24P: DV format, shoots at 480/24p with 2:3:2:3 pulldown (U model only)
DV-24PA: DV format, shoots at 480/24p with 2:3:3:2 pulldown (U model only; see the pulldown sketch below)
HDV-HD24P: HDV format, shoots using a 720/24p signal
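For anyone wondering about those pulldown notes: here's a rough sketch of how four 24p frames map onto ten 60i fields under each cadence. This is just my own illustration of the general idea in Python, not JVC's actual processing.

# How four 24p frames (A, B, C, D) become ten 60i fields.
# 2:3:2:3 is the "normal" cadence, 2:3:3:2 the "advanced" one.

frames = ["A", "B", "C", "D"]

def pulldown(frames, cadence):
    fields = []
    for frame, count in zip(frames, cadence):
        fields.extend([frame] * count)  # repeat each frame for 2 or 3 fields
    return fields

print(pulldown(frames, (2, 3, 2, 3)))  # ['A','A','B','B','B','C','C','D','D','D']
print(pulldown(frames, (2, 3, 3, 2)))  # ['A','A','B','B','B','C','C','C','D','D']

The point of 2:3:3:2 is that only one of the resulting video frames mixes fields from two different film frames, so the pulldown is much easier to remove in post.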
Only in SD, though. Also, the other downside compared to the Z1: you have to buy either a PAL or an NTSC camera. I would have thought Sony has set a new standard with the Z1; if they can build a dual-format camera then I'd suggest that should be what everyone's doing. It cuts down inventory costs, and in the global village we now inhabit it makes our lives that much easier.
Bob.
I also analysed the HD100 material last night - and I must say: wooah!
Marco is completely right:
- playback behavior of the 720p footage is much better than 1080i. On my P4 3.2 GHz I get 23.976 fps, or the 30 fps with NTSC preview capability, while with 1080i I only get a preview of 10-12 fps. So it is much easier to edit this material, even without intermediates or proxy files.
- I also analysed the material frame by frame, looking at how well the progressive material smooths fast movements. And I was surprised: the material looks like interlaced video in terms of fast movement. This "smooth motion" function, part of JVC's ProHD, seems to deliver very good results.
This camcorder is superior, I think. It is also a professional camcorder, which seems to make a difference.
The HD100 uses a new smoothing method for its progressive mode. This is far beyond any progressive mode we had in consumer cameras before. Watching the HD100 clips, in my opinion there is no noticeable difference compared to interlaced clips as far as movement is concerned. I can't see a disadvantage here.
OK - probably many more tests are needed. I think I will rent this camera soon for a production and see how it works.
Not much difference in resolution to SD??? It's more than double the pixels, and it provides real 16:9! And it is not just a matter of the format's resolution but also of the lenses used. It's not just math; it is a visual process in the end. And this visual process also depends on the viewing distance. What we have in our living rooms, even at the best viewing distance, makes 1080 resolution rather useless for most cases. To really benefit from viewing 1080 you'd need to be closer to your monitor than you probably would like to be. This is one of the reasons the EBU strongly recommended using 720p as the production and broadcast format over 1080i.
1080i must be deinterlaced for monitoring, and this forces a loss of sharpness. Encoding progressive video is also more efficient than encoding interlaced video. Take this together with the JVC smoothing technology, and shooting in progressive mode is actually a big advantage in most cases.
But this again begins to drift into theory.
What I saw convinced me. I have seen lots of footage shot with different 1080i cameras, and I saw this HD100 720p stuff. I find the quality of the HD100 better in the end, and it is easier to process. I also made some tests encoding this 720p and various 1080i shots to HD WMV. Again the 720p scores a point over 1080i here.
Real HiRes? What is that? Maybe what we see in the cinema. That again is far beyond 1080i, because it is 2K and 4K and all progressive. That is quite another league - apples and oranges. It is not what people watch in their living rooms.
To me 1080i is something in between which does not really work the way it is praised. The resolution can't be used for regular TV purposes, and the interlace property prevents it from stepping up to higher stages.
I'm really curious what it is like to make a production with the HD100.
Marco,
the issue isn't progressive versus interlaced. The issue is temporal resolution. Full-screen fast action at 25fps, no matter what you do with it, involves a LOT of motion blur, which is not natural. This causes our brains to work overtime trying to follow what's happening. That's OK for short periods but not for extensive coverage such as sports action. Any cinematographer knows this, and movies are shot accordingly; they don't want the patrons running to the toilets with motion sickness.
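To put rough numbers on it - my own back-of-envelope, assuming a 180 degree shutter (exposure equals half the frame interval) and a made-up subject speed:

# Streak length of a moving subject, assuming a 180 degree shutter.

def blur_pixels(speed_px_per_s, rate):
    exposure = 1.0 / (2 * rate)    # 180 degree shutter
    return speed_px_per_s * exposure

speed = 2000.0                     # subject crossing ~2000 px/s, e.g. a fast pan
print(blur_pixels(speed, 25))      # 40 px streak at 25 images/s
print(blur_pixels(speed, 50))      # 20 px streak at 50 images/s

Halve the capture interval and you halve the streak, which is why 50 or 60 images per second is so much easier on the eye for sports.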
And I think what the EBU wanted was 1080p instead of 1080i, and I agree with them; the problem was that Sony at the time had nothing that did 1080p.
Your argument regarding perceived resolution and image size is correct; you need quite a large screen before the improved res becomes noticeable to the average viewer. On a cinema screen, though, the difference is quite dramatic - the Z1 looks pretty impressive for the money at that size.
My argument about SD PAL wasn't about DV; I was talking about the 4:2:2 broadcast standard shot with cameras with expensive glass. Yes, those cameras are way out of our league, but for a network they're only part of the equation. Converting to HiDef involves costs orders of magnitude greater than just the price of the cameras: all their decks, routers, cabling and edit suites have to be replaced.
Their thinking relates to how much bang and future-proofing they get for their dollars, and the thinking of at least one of them down here is that 720 just isn't a big enough step up. In fact it looks like even HDCAM falls short of what they want, and they'll opt for HDCAM SR. Just to recable one station for that will cost millions.
You'd be amazed at what people want in their homes. People down here are spending as much on home theatre systems as others spend on houses, and these aren't just the uber-rich; 2K projectors are now pretty much consumer items, even at $20K. At the same time, cinemas are going under.
Bob.
Maybe, or probably, you are right about the temporal resolution issue. I'll wait and see how it turns out when there are long-form productions made with the HD100.
The longest video I've seen so far was only a few minutes.
The EBU definitely proposes 720p, not 1080p. The problem wasn't a lack of 1080 monitors; they simply analysed what resolution makes the most sense in the end. I'm not sure about the frame rate - I think it was a 50p format they recommended, not 25p. But 720p.
I know about the home cinema freaks who are willing to buy the best equipment available. But they are a minority. I can't speak for other countries, but here people like that are below one percent. Here people usually have their TV placed somewhere in the living room or bedroom at distances of 4 to 5H, which is beyond the ideal viewing distance. So even with 720 lines there is still some buffer. ;-)
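The arithmetic behind that buffer, if anyone cares - my own calculation assuming the usual one arcminute acuity criterion, not the EBU's exact figures:

import math

# Distance (in picture heights H) at which single lines of an N-line
# image shrink to the ~1 arcminute limit of normal visual acuity.
def critical_distance_in_H(lines):
    one_arcmin = math.radians(1.0 / 60.0)
    return 1.0 / (lines * math.tan(one_arcmin))

print(critical_distance_in_H(720))   # ~4.8 H
print(critical_distance_in_H(1080))  # ~3.2 H

So at the 4 to 5H people actually sit at, 720 lines is already at or beyond the acuity limit; 1080 only pays off at around 3H or closer.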
Have you downloaded the test clips from the HD100 or not? If you have, you will have seen that motion blur is reduced dramatically with the HD100. Do a frame-by-frame analysis and you will see what I mean. 1080i has a lot more motion blur, given the interlaced frames.
1080p would not have been possible on a consumer DV tape - maybe one reason why Sony did not go for it. In addition, it only makes sense if you go for 1080/60p or 1080/50p, which blows up data rates dramatically.
Image size: we know from empirical research today that a 50-inch plasma with 1366x768 delivers a superior picture with 720/50p compared with 1080i. Empirical testing by www.film-tv-video.de has shown that clearly. However, on a beamer with 1920x1080, 1080i is preferred in most cases according to that testing, with one exception: sport is better in 720p, given the lower amount of motion blur.
Other empirical tests show that - for the same distance to the projection - an interlaced source requires about 1.5 times the number of lines of a progressive source to come up with similar perceived resolution. By that rule, 720 progressive lines x 1.5 = 1080 interlaced lines, so 1080i only just matches the perceived resolution of 720p.
So it may depend on what we expect: do we foresee a world where everybody has a full-HD projector? Or do we foresee a world where people use plasmas and LCDs at home? I think the latter is more likely, since most people cannot afford to spend endless money on home cinema.
In addition, we see significant advantages today in cutting 720p versus 1080i. Do a chroma key on interlaced material and compare that with progressive streams. And best of all: download the clips and check for yourself.
Given the significant advantages I saw last night in the 720p material, I will go for 720p and not 1080i, for sure. At least as long as we do not have 1080/50p and a quad-core Pentium 8.
;)
Here's where I have to differ with you for once, Wolfgang. :-)
1080i is better for capturing motion, and 1080i converted to 1080p30 is stunning. I've keyed with it; it looks wonderful. On the other hand, 720p upconverted to 1080p looks quite poor, with washout and blocking. I've upconverted using a couple of different methods, and getting it to look good just takes a lot of time. I'd rather acquire at the higher format and drop down than acquire at a lower rez and go up. Yes, progressive will be better than interlaced at the same resolution, but 1080i60 is better than 720p30 for capturing motion.
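Just to illustrate what I mean by "a couple different methods" - a quick sketch with Pillow, where the file names are made up:

from PIL import Image

# Quick 720p -> 1080p upconversion test with two resampling filters.
# "frame_720p.png" is a hypothetical 1280x720 frame grab.
src = Image.open("frame_720p.png")

for name, filt in (("bicubic", Image.BICUBIC), ("lanczos", Image.LANCZOS)):
    up = src.resize((1920, 1080), filt)
    up.save("frame_1080p_%s.png" % name)

Even the better filters only interpolate; they can't invent real detail, which is why going up never matches acquiring high and scaling down.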
Once upon a time, 720p was a needed intermediate step. Now that we've got low-cost 1080p displays from 11 different manufacturers (OK, not all are low cost), the impetus and energy on the display side are fairly evident. Look at 720p on a 1080 display - you won't be all that thrilled, IMO.
I can't see the need for upsampling 720 to 1080, or for viewing 720p on a 1080p monitor in the first place. That about settles it for me. I regard 720 lines as sufficient resolution, because it fits best with the way video is viewed in most circumstances.
Recording interlaced first and then deinterlacing in post-processing - I doubt the motion will look better compared to a smooth-motion recording process like the HD100's.
Deinterlacing in post has always been regarded as a weak point; people preferred progressive recording to avoid deinterlacing later, even when only rather poor pseudo-progressive recording technologies existed. Even the Canon XL-1 frame mode gave better results than many deinterlacers (though I never liked the XL-1 frame mode output).
If you focus on the p-against-i question to judge the motion: comparing what I saw from 1080i that was deinterlaced later with the 720p, which does not need any deinterlacing, I find the 720p quality better.
1080 is nice; 2K and 4K even nicer. But this isn't what most people are, or even will be, able to use at home.
I agree that the point about "better for capturing motion" is very likely true for generic 720p. But it is not true for the HD100, thanks to the smooth motion function, about which JVC writes:
"JVC's exclusive smooth motion function captures images at double the normal rate when shooting in 30p or 25p (that is, at 60p or 50p). When the doubled images are merged, they are passed through a newly developed filter that smoothes out the subject's motion by retaining a small percentage of residual image. This eliminates the motion judder that typically appear in images shot at 30p or 25p."
So if you capture 720/30p in reality as 720/60p, you get wonderfully low motion blur. And that is what you see in the test videos - the quality of the material is terribly good.
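If I read that description correctly, the merge could look something like this - a very simplified sketch of my own; the residual weight of 0.2 is purely my guess, since JVC does not publish the actual filter:

import numpy as np

# Guess at the "smooth motion" merge: pair up the 50p/60p frames and
# blend each pair into one 25p/30p frame, keeping a small residual
# of the second frame of the pair.
RESIDUAL = 0.2

def smooth_motion(frames):
    out = []
    for a, b in zip(frames[0::2], frames[1::2]):
        out.append((1.0 - RESIDUAL) * a + RESIDUAL * b)
    return out

# toy example: four flat grey 720-line "frames"
clip = [np.full((720, 1280), v, dtype=np.float32) for v in (10, 50, 90, 130)]
for f in smooth_motion(clip):
    print(f.mean())  # 18.0 and 98.0 - two blended output frames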
1080i will be deinterlaced on all our progressive plasmas and LCDs and beamers. That is where you will lose quality too.
On the upconversion you may have a point. If you believe in a future where most homes will have a large home cinema, 1080i could be superior. But I think 42- and 50-inch plasmas will dominate the market for the foreseeable future - and then 720p from an HD100 is perfect.
Well, that's a bit of a laugh. In other words, the camera scans the CCDs interlaced at 50i or 60i, then ditches the second field after blending a small amount of it in to produce 'smooth motion', and takes the resulting frame and rescales it to 720.
Me thinks someone with a bit of clever code could do the same thing with Z1 footage in post.
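Something along these lines, I'd imagine - split each 50i frame into its two fields, line-double them into full frames, then blend each pair with a small residual. All the weights and sizes here are my guesses:

import numpy as np

# Rough idea for faking "smooth motion" 25p from 50i in post:
# 1) bob-deinterlace each interlaced frame into two full frames
# 2) blend the pair, keeping a small residual of the second field
RESIDUAL = 0.2

def bob(frame):
    top = np.repeat(frame[0::2], 2, axis=0)     # top field, line-doubled
    bottom = np.repeat(frame[1::2], 2, axis=0)  # bottom field, line-doubled
    return top, bottom

def smooth_25p(interlaced):
    out = []
    for frame in interlaced:
        f1, f2 = bob(frame)
        out.append((1.0 - RESIDUAL) * f1 + RESIDUAL * f2)
    return out

# toy 1440x1080 frame (what the Z1 records); the result would still
# need a resize to 720 lines afterwards
frame = np.random.rand(1080, 1440).astype(np.float32)
print(smooth_25p([frame])[0].shape)  # (1080, 1440)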
I'm not saying that isn't a very clever bit of engineering/marketing. It does get around a big problem: scanning a high-res CCD in progressive mode is very difficult. As I understand it, each CCD element is read out sequentially, and that takes a lot of time - too much time to scan all the elements at 30p. Obviously it can be done; there are cameras out there capable of HiDef at 60p, but they're using custom-built CCDs.
Bob.
>>In other words, the camera scans the CCDs interlaced at 50i or 60i, then ditches the second field after blending a small amount of it in to produce 'smooth motion', and takes the resulting frame and rescales it to 720.<<
No, that's not what it does at all. In "motion smoothing" it scans the CCD at 50p or 60p, not interlaced - full progressive frames. Then it blends two frames together to create one, and records that at the lower frame rate. No rescaling, no fields, none of that.
The camera is incapable of recording 720/50p or 720/60p, so in-camera it shoots and processes 720/50p (or 60p) and blends every two frames together to create and record a 720/25p (or 30p) stream.
It looks like the "motion smoothing" process isn't much more than just OR'ing the two frames together, although it seems like one of the frames may have a lower opacity set.
I haven't been too impressed by "motion smoothing" yet.
I tested the camera this morning - it is a great professional tool as camcorders go. Vegas recognized the camcorder immediately; capturing the m2t footage was no problem. Funnily enough, we were not successful in capturing the m2t footage as widescreen DV AVI - but we did not have enough time for testing.
I have not analysed the material yet - and I will get the camcorder over the weekend in two weeks, to be able to shoot more material and familiarize myself with it properly. It is hard to produce high-quality material with a camcorder you do not really know. Where other people tend to laugh, I tend to analyse. I apologize if that generates bad feelings elsewhere.
Meanwhile I have carefully reread some EBU papers, and it's rather clear that monitoring at more than 1280x720 isn't of much use. This is based on careful research the EBU did in Europe, so at least in my surroundings it is something to take note of.
If you ask people here what size of monitor they would choose - independent of price - most would pick sizes of 30 up to 40 inches.
If you ask people what their desired viewing distance for a 30- to 40-inch monitor would be, they are at about 2.7 m (which should be about 9 feet, I think).
If you show highly detailed pictures, increase the resolution step by step, and ask people at which point they can no longer recognize the increase: even on a 50-inch monitor people end up at 1280x720.
Furthermore, they tested whether the result differs if a 1080 monitor is used for the 1280x720 signal: it doesn't. It only would if the viewing distance were closer.
So using 1280x720 for broadcasting and monitoring is regarded as more than sufficient in EBU countries.
The EBU also mentions there are lots of industry efforts to push a 1080 format anyway. The EBU doesn't try to prevent higher resolutions, because during production they can provide a kind of quality buffer in some cases.
So in certain production surroundings you may benefit from a 1080 format during production. But remember: interlaced isn't a topic at all. Every suggestion, every thought there is based on progressive properties. A 1080i signal that must be deinterlaced, either in post-production or while monitoring, eats up too much quality to apply those progressive-based findings to it. And here again, progressive always means 50p, not 25p.
Whether progressive or interlaced is - besides some advantages of progressive signals in processing - often just a matter of taste, I find. I even liked the plain progressive look without any smoothing in many cases (as long as there isn't too much fast motion). I am very curious whether I'll like or dislike the smoothed progressive mode of the HD100, which isn't based on interlaced fields but on 50/60 full frames.
I for one am no fan of anything interlaced; it was a solution to a technical limitation which largely no longer exists. I'd argue much the same for YUV over RGB too. What I want to see are frame rates high enough to be at the limit of what we need, and that's around 60fps.
As for resolution, well, I've made the same argument about FM radio: how many people have the radio on all day through a system capable of a sound quality that does FM justice? Extremely few; it's simply too high a fidelity, it demands our attention. The same goes for high-definition moving images. I think there's still a large proportion of TV that's 'watched' on small B&W sets in kitchens. It's there, but only being watched out of the periphery of the brain. Put a floor-to-ceiling HiDef image with surround sound in its place and it'd get turned off real quick.
I think the problem is that here we're mostly technocrats (myself for sure); we strive for the best possible image and sound, but that doesn't mean that Joe Average or Mrs Housewife wants to be bombarded with that 10 hours per day.
Home theatre is a very different beast. Should free-to-air be trying to compete with that, or provide content for it? I don't really know. Certainly in that environment, the more resolution you can give the viewer the better; they want the same experience they get in the cinema.
Bob.
>>>I think the problem is that here we're mostly technocrats (myself for sure); we strive for the best possible image and sound, but that doesn't mean that Joe Average or Mrs Housewife wants to be bombarded with that 10 hours per day.<<<
I agree with that. We have said this in so many words in other threads - wasn't there one about going to Circuit City with HD material and testing it on their HD setup? Same concept here with what Bob said. I myself want to get one of these JVCs, but I have to keep saying to myself over and over, "No one cares at the moment" - and I mean that in the sense that having it as a "toy" is fine, but if people really only ask for non-HD material at the moment, I would not be able to charge more just because the camera is better... did that make sense?