OT - A rather strange review.

GeeBax wrote on 12/17/2014, 6:04 PM
Reviewer Justin Craig in reviewing the latest Hobbit release 'The Battle of the Five Armies' says:

Once again the high frame rate format is as distracting as ever. Jackson implemented the 48fps during the first “Hobbit” film and created a shockwave of criticism. The same criticism applies to “Battle of the Five Armies.” The frame rate takes the entire cinematic feel out of the experience. Instead, “Five Armies” looks like behind-the-scenes video in the DVD extra features or worse, that TruMotion feature on HD TVs. The cinematic magic is stripped: sets, costumes and movements look fake. Not to demean plays, but it looks like a play, which is not the best use for cinema.

This is the full review: http://www.foxnews.com/entertainment/2014/12/16/hobbit-battle-five-armies-is-disappointing-final-film/

Comments

PeterWright wrote on 12/17/2014, 6:49 PM
Mmm - yes, a bit strange.

Because the increased frame rate makes it look more "realistic" -- more like real life -- he equates this to looking worse, because it's not as degraded as "normal" movies. He even says this more realistic product "looks fake"!

This reminds me of those who prefer the hum of vinyl audio recordings to the purer sound of CDs.

Content-wise, I agree with him that this looks like an extremely boring film.
GeeBax wrote on 12/17/2014, 7:15 PM
Like you Peter, I have no interest in the content, but thought the views on the 'look' to be somewhat elitist. Sort of reeks of the 'Golden Ear Brigade'.
PeterDuke wrote on 12/17/2014, 7:42 PM
Ah, the look and feel! The environment is more important than content!

I suppose he misses those little circles/ovals in the top right corner before each film segment is switched in. And frame dust and vertical scratches.

Does he still watch TV on a 4:3 CRT with rounded corners in B&W?

But don't take any notice of me. I also dislike shallow depth of field, where a focus puller dictates what part of the screen I will watch.
johnmeyer wrote on 12/17/2014, 8:09 PM
We've had this discussion before. Higher frame rate is not "better;" it is simply different.

The reverse is more important: lower frame rate (24p) is not worse; it is simply different.

There is absolutely no question that 24p imparts a "once-removed" feeling to the program material, whereas 60i/60p makes the material seem much more immediate and in the present.

Marshall McLuhan attempted to describe the different impact that various media have. He did it very poorly, but since he was the first person to explore the concepts, he will forever be remembered for "the medium is the message."

Finally, if any of you spend time over at the AVS Forum, you will find that most people there absolutely hate the "soap opera effect" that is available on many modern LCD TVs where the TV firmware generates "tween" frames to create an artificial higher frame rate. This technology lets you take pretty much any 24p movie and change it so that it looks like the 48 fps "Hobbit" movie.
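The "tween"-frame idea behind the soap opera effect can be sketched with a naive blend. Real TVs use motion-compensated interpolation, so this plain average is only an illustration of the principle, with made-up helper names:

```python
import numpy as np

def tween(frame_a, frame_b, t):
    """Naively blend two frames; t=0.5 gives the midpoint 'tween' frame.
    (Real TV firmware estimates motion vectors instead of averaging.)"""
    return ((1.0 - t) * frame_a + t * frame_b).astype(frame_a.dtype)

def double_frame_rate(frames):
    """Turn a 24p sequence into 48p by inserting a blended frame
    between each pair of originals."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(tween(a, b, 0.5))
    out.append(frames[-1])
    return out
```

Plain averaging produces ghosting on fast motion, which is exactly why the real implementations go to the trouble of motion compensation.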

GeeBax wrote on 12/17/2014, 8:27 PM
And yet for years our American friends could not stand the low frame rate of the PAL system at 25 fps compared to the 'proper' rate of 30 fps. They said the flicker drove them nuts. So which one is it?
musicvid10 wrote on 12/17/2014, 8:33 PM
This is a built-in feature of warm-blooded creatures -- it's called "imprinting."
The earliest experience is the one that is right, and that is often for life.
It's the way we learn to identify our mommies.

Does the author realize how maddening it would be for someone raised on 48fps the first time they were forced to watch 24fps cinema?



OldSmoke wrote on 12/17/2014, 8:38 PM
[I]There is absolutely no question that 24p imparts a "once-removed" feeling to the program material, whereas 60i/60p makes the material seem much more immediate and in the present.[/I]

I doubt that feeling will go on much longer. The younger generation has no relation to "old" movies and actually finds them rather disturbingly bad; just ask my 16-year-old daughter. To her, even CDs look antique.

I personally feel the same and would love to see "Out of Africa" in 60fps in an IMAX theater... sorry.. I loved that movie when it came out.

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A

Ram: G.Skill 4x8GB DDR4 2666 XMP

CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970Pro NVME, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner

PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

CJB wrote on 12/17/2014, 9:07 PM
I happen to agree with the article and find the high frame rate "realistic", not "cinematic". JMO. My son who is of this new generation thinks the same.

Same can be said for 3D. Still hasn't caught on so much. They try every 10-20 years, but in general it's a novelty.
johnmeyer wrote on 12/17/2014, 9:29 PM
[I]And yet for years our American friends could not stand the low frame rate of the PAL system at 25 fps compared to the 'proper' rate of 30 fps. They said the flicker drove them nuts. So which one is it?[/I]

PAL at 25 fps is a totally different thing than film at 24 fps.

What? Why?

Because the flicker rate of 24 fps film is not 24 fps, whereas the "flicker rate" (it doesn't really flicker) of 25 fps PAL is 25 fps. This is because a movie projector's shutter opens and closes at least once during the projection of each image -- and sometimes twice, or more -- thus doubling or tripling the flicker rate. This is done because people get sick watching 24 black frames per second interspersed with the actual images: 24 Hz flicker gives people headaches, but 48 or 72 Hz flicker does not.
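That arithmetic is easy to check: the perceived flicker frequency is just the frame rate times the number of shutter interruptions per frame. A toy calculation, not projector firmware:

```python
def flicker_rate(frame_rate, blades_per_frame):
    """Perceived flicker frequency in Hz: each shutter blade interrupts
    the light once per frame, so flicker = frame rate x interruptions."""
    return frame_rate * blades_per_frame

# 24 fps with a 2-blade shutter flickers at 48 Hz; a 3-blade shutter gives 72 Hz,
# both comfortably above the rate that causes headaches.
```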

There are other things going on when comparing PAL and NTSC images that may contribute to liking or disliking one vs. the other. They are different in many ways.


[I]This is a built-in feature of warm-blooded creatures -- it's called "imprinting."[/I]

That is an interesting question. Do I prefer 24 fps traditional film because that is all I've seen?

My answer: no, it is not imprinting.

Why do I say that?

Because I was born in 1952, after TV had become popular, and I was "imprinted" with that media just as much as I was imprinted with the film look. In fact, based solely on time, I'm sure I saw far more hours of true video-based TV than I did movies, even if I include movies telecined for broadcast.

So I think I am equally imprinted with both 29.97 interlaced (60i) and 24 fps -- both 24 fps film viewed from a projector, with its 48 Hz or 72 Hz shutter flicker, and 24p viewed on a TV with no flicker whatsoever.

And speaking of flicker, nothing flickers except for film viewed from a projector because none of the video technologies involve periods of blackness. 24p in a theater is different from 24p on an LCD screen.

There are most definitely various artifacts of the slower fps, such as "judder," but those are quite different from flicker.
NickHope wrote on 12/18/2014, 12:15 AM
I absolutely hated the high frame rate of the first Hobbit movie. I was really surprised how much I disliked it. It was basically just too real. They looked like they were running around in a hangar with unconvincing background scenery painted on the walls. I couldn't get immersed in the story at all.

Likewise a mate of mine recently showed me his new Samsung TV which has "Smooth Motion" enabled by default (i.e. extra frames added by interpolation). It's awful. It really makes Hollywood movies look like home movies. Didn't seem to bother him too much though. So I might be a visual snob. Or I might just see things differently to others.

[url=https://www.change.org/p/hdtv-manufacturers-please-stop-making-smooth-motion-the-default-setting-on-all-hdtvs]Petition: HDTV manufacturers, please stop making smooth motion the default setting on all HDTVs[/url]
Serena Steuart wrote on 12/18/2014, 12:36 AM
Perhaps the issue is that the reviewer was mindful of the higher frame rate, and because this made him conscious of the technology it broke the spell. This always happens when the technology attracts attention to itself, whether it is extremely shallow DoF, cunning camera moves and viewpoints, etc., or when one starts identifying locations and playing the game of identifying actors. It's all smoke and mirrors, and the magician must never call attention to himself in any way that causes you to really watch what he's doing.
Certainly we commonly speak of the cinematic cadence of 24fps, although I admit I've suggested that is rubbish. The success of cinema relies on making the artificial seem real, so the downside of greater visual fidelity is greater difficulty in fooling the eye (brain). There must be some space for us to fill in through imagination.
PeterDuke wrote on 12/18/2014, 1:45 AM
It is not the flicker of 24fps I hate, it is the judder. Multiple shutter exposures won't fix that.

Also, PAL is not really 25 frames per second, it is 50 fields per second. The experience is quite different.
ushere wrote on 12/18/2014, 4:18 AM
actually if it's worth watching it really doesn't matter what the frame rate is ;-)

...i find i watch less and less of the 'spectacular', 'awe-inspiring', etc., etc., simply because the content doesn't interest me. i also find the obsession with 'image' over nearly everything else in corporates equally trying.

i want to be entertained, educated, or informed as an ADULT, not treated like some mindless, multitasking airhead.

farss wrote on 12/18/2014, 6:19 AM
Given that this movie was shot at 48fps what happens when it's shown at 24fps?

Bob.
Rob Franks wrote on 12/18/2014, 7:22 AM
24p is so yesterday. Let it die and move on.
OldSmoke wrote on 12/18/2014, 7:43 AM
+1 Rob!

I am also sure that 24fps was chosen for technical reasons rather than aesthetics. If filmmaking were starting today, I am very sure it wouldn't be at 24fps.

I also agree with Kimberly that film makers face bigger challenges with higher frame rates and it will take a while to overcome those, similar to how HD impacted studios. Makeup artists had to change their way of working, and studio sets had to be improved to cope with the higher resolution.


TheHappyFriar wrote on 12/18/2014, 8:40 AM
PC games have been displaying 60p (and over) for over a decade, so I don't believe it's the frame rate itself. I've gotten to watch 60fps PC games on the "big screen" too; it doesn't look bad, just bigger. The Hobbit made the same cinema changes that Phantom Menace did when it came out. PM was digital acquisition. You could SEE the artifacts on the big screen -- stuff that had been on SDTV & PCs for years and is still on HDTVs (and computers) now -- and very few people, if any, cared. But it was painfully obvious. Now nobody mentions that a film is shot digital. Back then (when PM was released) movies were still delivered on film; now it's digital on both ends. Tech changed to solve the problems, and recording techniques and styles changed to make it look good.

Give it ~8 more years and 48 frames per second won't bother anyone, because most movie makers will know how to use it and get good results. I don't know why we bothered with 48 vs 60 though, except that it converts easily to 24 for playback devices that don't do 48. Or maybe it was to save space, like 24. :?
Chienworks wrote on 12/18/2014, 8:45 AM
"actually if it's worth watching it really doesn't matter what the frame rate is ;-)"

I say the same thing about frame resolution. SD vs. HD doesn't matter to me. YES, i CAN see the difference ... if i look for it. However, the content should be gripping enough that your mind doesn't wonder about resolution or frame rate.

Heck, i have a TB of old video files stored on a hard drive that are all encoded at 15fps so that i could use half the bitrate and save drive space. Consider that when i started the collection i had a 4GB hard drive instead of the 11TB currently spinning in my desktop, so space was at quite a premium. Do i notice the slower frame rate? If i care to be bothered to notice, yes. In normal viewing i just enjoy the material and am not affected at all by it.
Chienworks wrote on 12/18/2014, 8:50 AM
"This is due to the fact that a movie projector's shutter opens and closes at least once during the projection of each image -- and sometimes twice, or more -- thus doubling or tripling the flicker rate. This is done because people get sick watching 24 black frames per second interspersed with the actual images. 24 Hz flicker gives people headaches, but 48 or 72 Hz. flicker does not."

This is very strange! If they can move the shutter fast enough to make it open and close 3 times per frame then why not use that same speed to only close it while the frame is changing and leave it open for 83.33% of the time instead of 50%? That would result in less flicker and less black time, and more image-on-screen time.
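The 83.33% figure checks out arithmetically, assuming each closure lasts 1/6 of the frame period (the duration a three-blade, 50%-duty shutter implies). A sketch with made-up names:

```python
def open_fraction(closes_per_frame, closed_fraction_each):
    """Fraction of each frame period the image is on screen, given how many
    times the shutter closes per frame and how long each closure lasts
    (expressed as a fraction of the frame period)."""
    return 1.0 - closes_per_frame * closed_fraction_each

# Conventional 3-blade shutter: three closures of 1/6 frame each -> 50% open.
# The single-closure proposal: one closure of the same length -> ~83.3% open.
```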
fldave wrote on 12/18/2014, 8:53 AM
I didn't see the 48fps Hobbit movie, but did watch it in 3D IMAX. I loved the experience. I call it an experience because it's more than just "a movie" and more immersive. Better than non-3D? Just a different experience.
We also loved 3D IMAX Gravity. What a ride. We watched the movie again on a 67" HD screen at home, and it wasn't interesting at all. Glasses are a bit annoying, but much better than in earlier years.

As stated earlier, 3D's a novelty. I may watch the new Hobbit movie in both 3D and 48fps and compare.
wwjd wrote on 12/18/2014, 9:16 AM
SICK of juddering 24. LOVE the new framerates. That's all I have to say about that.
Hulk wrote on 12/18/2014, 9:27 AM
Once I'm "into" a movie I have to admit I don't notice the production quality unless it's bad enough to disrupt my focus on the story. If I go into pixel peeping mode then I can't really watch the movie for content.

That being said I didn't notice anything annoying with the Hobbit movies except for the fact that I didn't think they were very good. 24fps, 48fps, whatever, the movies just didn't do it for me the way the Lord of the Rings did.

While I personally generally prefer higher frame rates, what is more distracting to me is bad focus and inconsistent lighting. Frame rate is pretty far down my list of production priorities. Actually, now that I think about it, lighting is the most important factor in good video production. Both technical excellence and drama are born from great lighting. Just look at some of the old classic black and white movies -- all done with a keen eye for light. I think we've lost some of that with millions of pixels, 3D, CGI, etc...
johnmeyer wrote on 12/18/2014, 9:40 AM
[I]This is very strange! If they can move the shutter fast enough to make it open and close 3 times per frame then why not use that same speed to only close it while the frame is changing and leave it open for 83.33% of the time instead of 50%? That would result in less flicker and less black time, and more image-on-screen time.[/I]

The flicker problem has little to do with the amount of time the shutter is closed while the film advances; it is a problem because of the frequency (number per second) of the "black flashes."

If you've ever experimented with a strobe light, you know that as you slowly increase the flash rate, at some point you no longer perceive the flashing, but somewhere between 10 and 30 flashes per second it gets really annoying and can actually cause convulsions in susceptible people.

Exactly the same thing happens when you increase the number of flashes per second in a movie projector: adding blades to the rotating shutter creates additional artificial "black flashes."

BTW, the "shutter" on a movie camera or a movie projector is almost always a rotating wheel with blades which block the light, not a focal plane or leaf shutter like you might find in a still camera.



[edit]Something I forgot to mention: with a strobe light, the duration of the bright flash is unbelievably short, measured in microseconds, so the duration of the flash has nothing to do with all the nasty side-effects.

Tim L wrote on 12/18/2014, 11:39 AM
[I]"Given that this movie was shot at 48fps what happens when it's shown at 24fps?"[/I]

The movie would be twice as long, right? (just kidding)

I've wondered about this. If they just used every other frame for the 24 fps version, wouldn't it resemble a normal 24 fps film? I assume the 48fps "video" is probably shot with a 1/48th second exposure, so using every other frame at 24 fps would resemble a conventional 24 fps film shot with a 180° shutter, right? (i.e. each frame exposed for half the frame interval?)
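That reasoning holds up, at least under the 1/48 s exposure assumption. A sketch of the shutter-angle relationship and the every-other-frame decimation (illustrative only; the helper names are made up):

```python
def shutter_angle(frame_rate, exposure_time):
    """Shutter angle in degrees: 360 * (exposure time / frame duration),
    where frame duration = 1 / frame rate."""
    return 360.0 * exposure_time * frame_rate

def decimate(frames):
    """Keep every other frame: turns 48 fps footage into 24 fps."""
    return frames[::2]

# Frames exposed for 1/48 s, played back at 24 fps, correspond to a
# ~180-degree shutter -- the classic film look Tim L describes.
```

The same exposure at the original 48 fps would be a full 360-degree shutter, which is part of why 48 fps footage looks so smooth in the first place.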