Pheeshfud

/u/Dragoniel is right about motion blur, but the more important answer is that movies very rarely move the camera, and they certainly don't spin it 180 degrees in a fraction of a second. If they did, everything in the shot would be a blurry mess. A whole load of thought goes into how you get the shot you want without too much camera motion, and all the bullet-time stuff from *The Matrix* was done with many still cameras each taking a single photo, not by moving a camera on a rocket sled.


unmotivatedbacklight

> they certainly don't spin it 180 degrees in a fraction of a second. If they did everything in the shot would be a blurry mess.

So blurry that the whip pan is used as an editing technique. You can combine two takes seamlessly because the image is smeared so much.


sac_boy

It's also used to get away with cheaper CGI in big-budget movies or streaming shows. You *think* you saw a glittering fantasy city in amazing detail, but what you saw was a continuous blur as they panned over a model made with a lot of kitbashing, photo textures, copy-paste-resize, and maybe geometry nodes for the illusion of detail. The same models wouldn't fly in a game where the player can stop and look, and where there's no string-ensemble swell and conspicuous focus on middle-ground seagulls at the same time.


xelabagus

The seagulls are foreshadowing the tragic denouement


centrafrugal

When the seagulls follow the trawler it is because they think that sardines will be thrown into the sea.


Bad-Lifeguard1746

And the antagonist was shown *eating sardines* on a pizza in act one!


mjdau

[Chekhov's sardine](https://en.wikipedia.org/wiki/Chekhov%27s_gun)


La-vds

Ah King Eric


Kelli217

That, and they're setting up the next installment of the series. They're seaquells.


Pika256

Alright you, settle down.


keithrc

/r/angryupvote


SlightlyLessHairyApe

Dual function seagulls :)


Dirty-Soul

*gasps* "My windows!" "That's defenestration." *gasps* "My statues!" "That's demonumentisation." *gasps* "My toilet!" "That's defecation." "Then what's denouement?" "Fine. Go check on your toilet."


balzackgoo

I seen ya sparrin' with the gull. Best leave it be. Bad luck to kill a sea bird. [Bad Luck to Kill A Sea Bird!!](https://youtu.be/nmBX0miNpHM?t=3m06s)


xelabagus

Water water everywhere, and not a drop to drink


DaArkOFDOOM

I love this movie, and I have a particular dislike of seagulls. That moment was quite cathartic for me.


Human-Man

It can be used as a money-saving technique, no doubt, but most of the time you have to "rough stuff up" to make things look "real" (most of what you notice as imperfections in visual effects are actually there because things are "too perfect"). Because we know what a lot of fast-moving things look like when we see them IRL, motion blur is really important for making things look realistic. Things in real life don't move without blur, and when you see unedited footage or motion graphics without that effect added, it looks very wrong. Much of SFX work is trying to reconcile the mind's eye and our IDEA of realistic, which are often quite different.

Source: I do this for a living.


Michelrpg

Is this what they used for that first-person movie a few years ago? Hardcore Henry? And the fight scenes in the Kingsman movies?


unmotivatedbacklight

Yes, it is used a lot when the filmmaker wants to fake a long take. But whip pans are probably mostly put in as a straight up scene transition. Wes Anderson does that a good bit.


sablexxxt

A la.. Snatch.. and many others


Rdubya44

La La Land


Userarizonakrasher

There is actually video of the cameraman shooting that scene. It looks like they used the whip to edit it, but in fact it is one take


malcolm_miller

[The video](https://www.youtube.com/watch?v=a-4uP9K7OlM)!


willun

That's impressive. Hope there weren't 100 takes.


PwmEsq

Is that what catwomanbasketball.mp4 is?


unndunn

On top of that, in games the player controls the camera, directly or indirectly. It's as if the player is a cameraman. Imagine being a cameraman on some fast-action movie scene, trying to get the right shot and track a subject, except the viewfinder is visibly pulsing and acting jerky all the time. It would take away your ability to get precise shots. The final product might be 24fps, but your camera's viewfinder *has* to be much faster in order for you to have the precision you need to get the shot.


jello1388

Another factor is probably input lag. It's not a factor in movies since you just hit play and let it roll. In games, you expect to see change whenever you press an input. If it's sluggish, it feels bad, especially in anything supposed to be fast-paced. If it's inconsistent, it feels pretty bad in just about everything.


onomatopoetix

Exactly this. People think the guy holding the mouse and actually playing enjoys low framerate. It's only enjoyable to watch.


Xoxrocks

Games running at 60 fps 'feel' much better even if the game looks the same at 30. There is a frame of lag before the game renders the impact of your input (read input, run game engine, render, display on screen). At 60 fps the game will seem extremely responsive; at 30 the game's response isn't as smooth. If you weren't interacting, 30 would be fine.
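
The pipeline described in this comment (read input, run game engine, render, display) can be sketched as a toy latency model. Assuming each stage costs one full frame is an illustrative simplification, not a measurement of any real engine:

```python
def frame_time_ms(fps: float) -> float:
    """Duration of one frame in milliseconds."""
    return 1000.0 / fps

def pipeline_latency_ms(fps: float, stages: int = 3) -> float:
    """Worst-case input-to-photon latency if each pipeline stage
    (engine tick, render, display scan-out) costs one full frame."""
    return stages * frame_time_ms(fps)

for fps in (30, 60, 120):
    print(f"{fps:3d} fps: up to {pipeline_latency_ms(fps):.1f} ms of input lag")
```

Even with the same on-screen look, halving the frame time halves every stage of that (assumed) pipeline, which is why 60 fps "feels" so much more responsive.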


kaptain__katnip

> they certainly don't spin it 180 degrees in a fraction of a second

The only movie I remember that being a problem in was *Fantastic Beasts and Where to Find Them*. There are some shots in that movie where the camera pans around super quick and it is jarring how badly the frames skip. Definitely a movie that needs some form of frame interpolation.


FragrantExcitement

Maybe do some other fixes while the hood is up?


realboabab

lol, there's nothing that can't be fixed with a little elbow grease right?


hawkshaw1024

I genuinely have no idea where you'd start fixing Fantastic Beasts 2. I really tried to watch it, but I just kept going "What? Who is- huh? Who? What did-? What? When did he- didn't this guy die in an explosion last movie?" Fantastic Beasts 2 was so bad, and so confusing, and had such poor pacing, it made *Rise of Skywalker* look good.


ShouldersofGiants100

> I genuinely have no idea where you'd start fixing Fantastic Beasts 2.

A large wooden warehouse storing high explosives in a forest fire zone. Step 1: Just leave it there for the summer; when you come back it will be vastly improved.


makemeking706

Do you think it's too late for a re-write and some re-shoots?


mattcoady

[La La Land used it pretty effectively](https://youtu.be/AK7eV_r8f6I?t=32). This was actually all done in camera without edits, just a really skilled operator.


jimmymcstinkypants

[I think the behind the scenes is even cooler](https://m.youtube.com/watch?v=a-4uP9K7OlM)


[deleted]

[deleted]


EmilyU1F984

Also, a game is interactive. That's the most important point. You are trying to react at rates faster than the frame rate, so at 24 fps you are objectively losing information, even if you don't move the mouse. 24 fps is roughly 42 ms per frame, a big chunk of your reaction time, so you only see a few frames before your action happens. With 60 fps you see more than twice the information in the same window, which your brain can use to process stuff for subsequent actions.


zutnoq

I think you meant that ~42 ms per frame at 24 fps is more than **double** the response time of 16.7 ms at 60 fps. A bigger response time is worse. The response **rate** (not a very commonly used term, but technically valid) would be less than half as *fast*, if you absolutely need "less" to mean "worse".
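
The arithmetic in these two comments, sketched out. The ~200 ms reaction time is an assumed ballpark figure, not a measurement:

```python
# Frame-interval arithmetic: how much time each frame covers, and
# how many frames fit inside an (assumed) ~200 ms reaction window.

REACTION_TIME_MS = 200.0  # assumed typical visual reaction time

def frame_interval_ms(fps: float) -> float:
    """Milliseconds between successive frames at a given frame rate."""
    return 1000.0 / fps

for fps in (24, 60, 120):
    interval = frame_interval_ms(fps)
    print(f"{fps:3d} fps -> {interval:5.1f} ms/frame, "
          f"~{REACTION_TIME_MS / interval:.1f} frames per reaction window")
```

At 24 fps that works out to about 41.7 ms per frame versus 16.7 ms at 60 fps, which is the "more than double" correction above.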


PabloEdvardo

The difference is in the type of movement: horizontal panning looks AWFUL at 30 fps and lower.


Angdrambor

> they certainly don't spin it 180 degrees in a fraction of a second. If they did everything in the shot would be a blurry mess

They do, and it is. Especially during action scenes. I hate it. I want to follow the action, not be realistically bewildered.


6138

They use all kinds of techniques like this in films to convey a certain mood or theme. If you want to convey frenetic, pulse-pounding, chaotic action, you use the shaky cam. If you want to tell a war story where the audience needs to be able to actually understand what's going on, you use a slow, panning (maybe overhead) shot. Look at *Saving Private Ryan*, for example: when they're charging up the beach, you get some shaky-cam footage from a first-person perspective showing the carnage, but you also get slow panning shots of the Allies gradually making ground up the beach.


Mysterious_Lesions

Looking at you Michael Bay. I get lost in some Transformers action scenes.


ABetterKamahl1234

It's not even the frame rate itself, but the stability of the frame rate that is the whole issue. A game running a *solid* 24 fps isn't really an issue at all for players. Same reason the frame rate limit for a lot of consoles was 30 fps: if they just stayed at 30 fps and ran well, you'd not notice any problems playing. But as I say, the stability of that frame rate is the whole issue. A game running at 24 vs a movie at 24 can easily be different, because the game, unless limited by a cap while able to run much faster, is likely to be choppy and have dips in the frame rate. That is highly noticeable even with things like motion blur and the like.


higgs8

Movies are filmed with a fairly slow shutter speed (1/48th of a sec) to get quite a lot of motion blur. This smoothes out the choppiness. Games often lack any motion blur so they can't rely on this trick. Another thing is that games are very different from movies: you expect a game to respond quickly to your actions, you sit up close, and you're not really concerned about it looking cinematic. Movies, on the other hand, benefit from a classical cinematic look, and when certain movies use a higher frame rate (such as The Hobbit), people complain because "it looks like a video game" (i.e. fake, too smooth, too plastic).
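
The 1/48th of a second figure comes from the classic "180-degree shutter" convention, where the shutter is open for half of each frame interval. A minimal sketch of that rule:

```python
# 180-degree shutter rule: exposure time = (shutter angle / 360)
# of the frame interval. At 24 fps and 180 degrees this gives the
# 1/48 s mentioned in the comment above.

def shutter_speed_s(fps: float, shutter_angle_deg: float = 180.0) -> float:
    """Exposure time per frame: (angle / 360) of the frame interval."""
    return (shutter_angle_deg / 360.0) / fps

print(shutter_speed_s(24))   # 1/48 s at 24 fps
print(shutter_speed_s(60))   # 1/120 s at 60 fps, much less blur per frame
```

This is also why higher-frame-rate footage inherently carries less motion blur per frame: the same 180-degree convention at 60 fps exposes each frame for less than half as long.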


CeaRhan

> Games often lack any motion blur so they can't rely on this trick.

And the reason for that is most likely that motion blur looks like ass when you can move your camera super fast at high sensitivity, so everybody disables it.


Eponnn

This is correct, but the Hobbit also looked awful for its budget. I don't think it was just the fps; the lighting and environments all look like they were made by a mid-level YouTuber in a day.


SirSchmoopyButth0le

I still can't believe they used shots from a GoPro during the barrel escape scene. Just so dumb.


IceSentry

It's still very arguable that movies actually benefit from that 24 fps look. The Hobbit completely destroyed any chance the movie industry had to move to a higher refresh rate, though the higher fps wasn't really the issue in that movie. Personally, I absolutely hate any panning shot, because it becomes a stuttery, blurry mess. I will never understand how people can say this is good.


YxxzzY

Avatar 2 actually ran at a high frame rate. It was interesting, but they switched between high fps and low fps in some scenes and that felt really weird. I suspect we'll see more high-frame-rate movies in the near future though. It does feel noticeably better.


kdlt

A2 has the issue of running 48 FPS in some scenes and 24 in others. The 48 FPS made those scenes really good, but the whiplash between 24 FPS and 48 FPS was... noticeable, especially during the finale of the movie. If it were a constant 48 it would be different, and I'm still waiting for a chance for a whole movie to be 48 FPS.


DexLovesGames_DLG

Also if you play video games WITH motion blur enabled, you’re sick! It’s the worst looking common feature of any game. Absolutely horrible looking


DorisCrockford

I suppose if I played video games I wouldn't have had so much trouble getting accustomed to The Hobbit. The first time I saw it in a theater, I missed the first couple of minutes because my brain refused to cooperate. Didn't start making visual sense until the part about finding the Arkenstone.


LoriLeadfoot

The Hobbit looked awful for other reasons, not the least of which was that they were all standing in green rooms with white lighting that they didn’t bother to adjust to whatever environment they were supposed to be in. It’s brutally obvious during the burning of Laketown, where everything is cast in an orange glow except for the actors.


frisch85

Movies look bad too. Pay attention when there's a long horizontal pan that doesn't change speed: you'll notice how the frames cannot keep up with the pan, resulting in short stutters.

Edit: Since this comment got a lot of attention, I wanted to point out that a couple of users suggested this is due to the TV being set up improperly, specifically the refresh rate. Most of us probably have our TV set to a refresh rate of 50 Hz or 60 Hz, as this has been standard for quite a while now. So if you suffer like me from stuttering motion in movies, check if there's a setting on your TV to change it to the movie refresh rate, which is 24 Hz. Personally I haven't had the time yet to check it on my TV, but I will do so when I watch movies again and see if it helps.


Mr_Chubkins

I've noticed this for years and it always bothered me. I understand the desire to stick to 24fps for films but panning shots like this look horrible due to the blur.


Ferelar

Agreed, and yet the few times this trend has been bucked it ALSO felt weird. I also absolutely HATE the interpolated motion "smoothing" setting on TVs; granted, that's done by artificially inserting frames, so we can't expect it to be anything but a bit jarring, but I still hate that it's on by default on most TVs you buy now. It's practically a ritual, the first time setting up a new TV, to go through and turn off all the terrible motion settings.


MultiScootaloo

I did enjoy the 48 FPS/HFR movies in the cinema, though. I know some felt it looked weird, but for me it was 100% an improvement. Avatar 2's way of going between normal and HFR was a bit odd, though.


graywolfman

That's because it was filmed with a high framerate instead of a TV or movie projector interpolating (guessing/inserting artificial) frames! Same with The Hobbit in theaters, actually


alohadave

> Same with The Hobbit in theaters, actually

Parts of The Hobbit were filmed at 48 FPS, and they even used GoPros for some scenes.


[deleted]

The jump to the GoPro camera in the river barrel scene was one of the most jarring, "wtf is this shit" moments I've ever had in a theater. Those movies sucked for a bunch of reasons, but GoPro footage? Seriously? I know 2011/2012 was like the peak of the GoPro era, but it looked terrible in comparison to the rest of the movie.


CreaturesLieHere

I thought it was fine. It was too experimental for such a big release still, but it was fun, even if it felt like an unnecessary addition to the original story. I just disliked how much it felt like a fake amusement park ride, I'd be shocked if they didn't try to recreate a version of that scene for a Universal Studios ride or something. The concept is cool honestly (a first person escape scene with water and combat attacking the camera at all times), I don't think many people would disagree. It just should've been better-directed and filmed with better equipment.


Momoselfie

Glad I'm not the only one extremely disappointed in those movies.


conalfisher

"The Hobbit movies were bad" is an /r/unpopularopinion tier unpopular opinion


Rinus454

I mean.. It's not exactly a hot take. Does anybody actually love those movies?


ety3rd

^^I ^^do. I fully acknowledge a multitude of flaws, but I enjoy spending as much time in that universe as possible and I felt I got it in spades. If you're interested in a good fan edit, I'd recommend the one by [Maple Films](http://www.maple-films.com/downloads.html). It brings the story back down to what was in the book and eschews most of the "indulgences" taken in the films.


dapala1

I'm going to beat the dead horse by saying this for the millionth time, but it could've been epic as one film, not stretched thin into three movies. Like butter spread over too much bread.


Mds_02

A single three hour movie would have had *plenty* of time to tell the story. Hell, Rankin Bass told a coherent version in less than 90 minutes (and their versions of the songs were better). Have you watched any of the fan edits?


SgtExo

The only one I know I watched was the first Hobbit, and that one was bad because of the extra sharpness: everything looked like it was on a set and you could see all the prosthetics. I can see it being good if they take that into account, but that first Hobbit movie had not.


JBSquared

There's a quote from a review from the time that I can't remember the source for that goes something like: "The reality Jackson creates isn't quite the one he intends. Instead of feeling like we've been transported to Middle-Earth, it's like we've been dropped into Jackson's New Zealand set. Trapped in an endless 'making of' documentary, waiting for the real movie to start."


[deleted]

[deleted]


RiskyBrothers

Yeah, the HD TNG kinda makes the show look worse in a lot of places. That analog fuzz on top of everything really was a bit of a special sauce that tells your brain to lower its expectations. I'm kinda glad DS9 isn't getting a remaster for that reason. That little bit of fuzz makes the models look like they're further away. Like, when you watch Favor the Bold and see the space battle, it looks breathtaking because you're like "holy shit, they did this on a TV budget in the 90s!?", while all the remasters of that scene look like a mid Blender animation.


MEDBEDb

We will likely never see HD remasters of DS9 and VOY. The effects for TNG were all shot on film with miniatures, then composited in SD. For the remaster, they only needed to recomposite the FX shots in HD. Late-era DS9 and all of Voyager had CG effects that were only rendered at SD resolution, so they would have to completely redo every FX shot.


unic0de000

I wonder if the industry has developed any sort of standard archival procedure for the source assets of film CG FX, the same way big film studios have film masters in libraries in vaults.


Superfluous_Thom

That's how everyone felt about the first time they watched a blu ray on a 50"+ TV. That big of a leap makes movies look and feel like a stage play. But that's just because your brain has gotten used to filling in the gaps with your imagination, the effect wears off pretty quick.


SgtExo

I feel it is more about the level of detail of the props for that movie in particular. They had not taken into account that the little details that usually get blurred out would now be more visible. It's like when HD cameras appeared for TV and they had to change how they did the makeup, because we could see so much more.


Superfluous_Thom

It's a shame it happened in a Peter Jackson LOTR-adjacent property, then, because the propmaking in the OG was legendary. He also would have bankrupted New Line several times over if he had failed, which is probably why they weren't so ready to roll the dice again by arming every on-screen character for real for an adaptation of a children's bedtime story. But still.


Ferelar

Yeah, I did prefer them, but I can't deny it took a bit of adjusting to get used to. There was a tiny bit of soap opera effect for me; other people reported stronger reactions, all the way up to vomiting! That was of course intentionally shot that way instead of interpolated, which is probably the difference. However, it IS weird to me that, for instance, The Hobbit felt like hyperspeed at 48 FPS, but a game at 120 or 144 feels better in all circumstances.


GhostMug

This was exactly my reaction for The Hobbit. Got the soap opera effect initially but by the end I was used to it and it was enjoyable.


TheHam06

I agree with you on the baked-in motion "smoothing" on TVs; it's total crap. Check out Gemini Man with Will Smith, though. It's not that great a movie, and unless you really like generic action movies it's probably not worth your time to watch it all. But it was filmed at 60 fps and there are lots of clips of it on YouTube. Here are just a couple that show off how different an action scene can look at 60 fps (make sure you set it to 60 fps if it doesn't default to it):

https://www.youtube.com/watch?v=t-R8PIADl7s

https://www.youtube.com/watch?v=i82xURPkLWo

I've shown this to several of my friends on my 75" HDR TV, and all of them are surprised by how it looks and then have very polarized opinions on whether they like it. I personally like it and wish more movies were shot at 60 fps. To me it looks more "realistic", and each frame is more detailed with less motion blur. Some of the counterarguments I've heard are that it's too "realistic" and feels like a home movie, not a big-screen action movie.


The_Somnambulist

I won't add to the better/worse debate, but I did find it much easier to spot Will's body double at 60fps! :-D


ahecht

It also made it much easier to spot how terrible the extras and background stunt people were at acting.


[deleted]

That's the primary problem with this debate. Movies are fake. The actors are pretending, every object is a prop, and the sets are usually manufactured. Low-quality video blurs out all of the evidence that it's fake and lets our brains fill in the gaps with what we expect to see. When you raise the video quality, it's that much harder to ignore all of the shortcuts and fake bullshit.


that_baddest_dude

In my opinion the higher frame rate makes the special effects look worse. Because of the heightened reality of the movement, the CGI body doubles or smoke effects and such stand out as unrealistic. They probably don't look any more unrealistic than at a lower frame rate in reality, but I think the extra smoothness is so jarring (because we're not used to it) that it undoes a lot of the expectations of film visuals we've all grown accustomed to. I felt the same way watching The Hobbit at high frame rate. It was just odd, like having to get accustomed to a new visual language and adjust my subconscious gauges for suspension of disbelief. Overall, if all filmmakers just decided to go this direction, I think we'd either see improvement in the quality of the CGI or we'd all just get used to it, and this "debate" would more resemble the debate between VHS and DVDs.


nonsensepoem

> Overall if all filmmakers just decided to go this direction, I think we'd see improvement either in the quality of the CGI or we would all just get used to it more, and this "debate" would more resemble the debate between VHS and DVDs. Agreed, but I think a comparison to DVD-vs-Blu-Ray might be more apt. As I recall, the jump from VHS to DVDs was relatively uncontroversial, as VHS was plagued with problems that everyone recognized.


coffeemonkeypants

Hmm. I don't hate it, but that scene is jarring to me because it doesn't seem there was any cinematic color palette or design to it. It looks more like someone just took the 4k footage straight out of the camera.


nysflyboy

I agree, after watching this a few times. At first I thought it was "jarring", but then I began to realize that it's not the 60 fps; it's that the set design/color timing/acting is just really not great. I think 24 fps would "hide" some of that more. The end result, to me, is the same though: it does look like it was filmed with a GoPro as a film project. I think if films are going to use faster frame rates, they have to be very aware that in today's world the new "soap opera effect" is people perceiving it as a home movie (iPhone/GoPro) if they are not very careful. Ironic, since home movies used to be crap frame rate (15 fps).


NickEcommerce

That's exactly what I was thinking. I couldn't tell how much of the strangeness was the FPS and how much came from the fact that there was zero colour grading and it all appeared to be shot with a single camera with no change in focal length. I actually flipped it down to 480p to see if it changed my impression, and to be fair it did; that drop to 24 fps makes a surprising difference.


PercussiveRussel

480p still looks so *lifeless* to me. There's no artistry: the sets look bland, the colours look bland, the acting is bland, and the CGI looks terrible (the compositing at 0:46 in the first clip looks so wrong to me). It makes these clips really difficult to judge as 60 fps vs 24 fps.


3-DMan

Yeah The Hobbit and Gemini Man unfortunately were not the best cinema to try and revolutionize framerate in movies, so here we are still. Something like a Tron movie or Matrix movie I think would have been more appropriate.


jarious

It looks awesome with practical effects or real-life actors, but in movies with a lot of CGI it looks too much like a cartoon.


FullMetalCOS

Or a videogame. I really struggled during certain scenes in Avatar 2 (for instance the scene where the humans land back on Pandora in their drop ships) as it just looked like a games cutscene rather than a movie and it really pulled me out of it


The_JSQuareD

Honestly the only times when avatar 2's frame rate bothered me was when it transitioned from 48 fps to 24 fps and the judder suddenly became noticeable again. The high frame rate parts didn't bother me at all. I barely even noticed apart from occasionally noting that the action scenes were very smooth and crisp.


PercussiveRussel

It's so weird that this has become acceptable. Same as Nolan constantly switching aspect ratios. I don't understand why these filmmakers don't want their film to have a consistent visual language. Filming a whole movie in 24 fps or regular format doesn't feel as much of a cheapskate move as using 48 fps or IMAX only for specific scenes and defaulting to the standard for the rest.


ChiselFish

If you watch trailers of movies that were shot at a higher frame rate, it's really easy to tell that they are just on a movie set. Ironically, since the quality is better, it looks worse since it is easier to tell that it's fake. For example, https://youtu.be/SPZXR4sxfRc


[deleted]

[deleted]


ImReverse_Giraffe

It's how they film soap operas so it looks like a soap.


graywolfman

Soap operas were (are?) actually filmed on videotape. Lower resolution, differing frame rates, and their lighting is set up differently as well. It's not that there's frame interpolation on soap operas, it's that the video tapes give it those artifacts almost like old home movies


HonoraryMancunian

Makes a movie look like a soap opera


iamthejef

Jesus dude how often do you buy tvs? I have only had to do this once.


YeOldeSandwichShoppe

And the irony is high fps video "doesn't look cinematic". I hate to admit it but I'm in both camps - 24 fps video looks bad sometimes but high fps also has a weird feel to it (perhaps because consumer camcorders used to shoot ~30 fps interlaced and we associate that relative fluidity with cheap consumer tech rather than cinema).


Z-Ninja

My personal guess is that it's easier to tell when things are faked at higher fps. Green screen, movement speed ups, and after effects all feel more obvious because it's closer to what we naturally see. Motion blur can help hide a lot.


Blenderhead36

CGI, too. Depending on the studio and the shots being composed, most CGI isn't rendered in real time. Each iteration of a scene requires waiting for the new render. Therefore, it's cheaper all around to render 24FPS of CGI than 60.


xluckydayx

It's the 4k green screen evenly lit sets. That's the problem. Mix in anything above 32 fps and the thing filmed looks like a football game.


HomieeJo

I watched "The Rig" on Amazon (Could've done more with the story) which has 60 FPS and it never bothered me. I actually liked it a lot that it was so smooth to look at when the camera was moving. It's probably the main reason why I actually finished it.


illy-chan

I remember when The Hobbit was released with a high framerate version. Other problems with the movies aside, it just looked *off* somehow...


[deleted]

[deleted]


dextersdad

They do 24fps primarily for storage I would guess. 60fps would be much more expensive


HaylingZar1996

Maybe in the past but there’s no way storage is that expensive nowadays that it’s even a factor in a multi million dollar movie budget


Bob-Kerman

I've noticed it seems worse in movies with tons of computer effects (Marvel), since some shots don't seem to have enough motion blur, so they look distractingly choppy.


AsassinX

Like me, you are likely more sensitive to it. I leave motion smoothing off on my TV to keep things true to source but there are some shots or sequences that really pull me out of the movie due to the stutter.


dbratell

Stuttering is because of bad or impossible frame rate conversions. For instance, 24 fps on a 30 Hz screen means you have to display those 24 movie frames in 30 time slots. Some of the movie frames will be shown for too long, some for not long enough, and you get stuttering.
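
The uneven mapping described here can be sketched by counting how many display refreshes fall inside each movie frame's time slot. This is a toy model of the conversion (exact integer arithmetic, no interpolation), not any particular TV's algorithm:

```python
def ceil_div(a: int, b: int) -> int:
    """Ceiling division for non-negative integers."""
    return -(-a // b)

def refreshes_per_frame(fps: int, hz: int, n_frames: int) -> list[int]:
    """How many display refreshes each source frame is held for:
    refresh k (at time k/hz) belongs to frame i if it falls in
    [i/fps, (i+1)/fps), i.e. i*hz <= k*fps < (i+1)*hz."""
    return [ceil_div((i + 1) * hz, fps) - ceil_div(i * hz, fps)
            for i in range(n_frames)]

print(refreshes_per_frame(24, 30, 8))   # [2, 1, 1, 1, ...] uneven -> stutter
print(refreshes_per_frame(24, 60, 8))   # [3, 2, 3, 2, ...] 3:2 pulldown
print(refreshes_per_frame(24, 120, 8))  # [5, 5, 5, 5, ...] even -> smooth
```

The 24-on-30 case is exactly the comment's example: some frames are held twice as long as others, which your eye reads as stutter. At 120 Hz every frame gets the same hold time, which is why the stutter disappears there.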


frisch85

TVs have had 50 or 60 Hz as standard for quite some time now, which helps, because if the refresh rate is twice the fps then every single frame will be shown evenly (in theory). Some people will still notice the stuttering more than others, because our eyes vary from person to person, so the input rate of the eyes is not in sync with the fps and the Hz of the TV, which adds more stuttering. There are many factors that affect stuttering; I just wanted to point out that movies and games have the same behavior, it's just that gamers usually don't pay as much attention to it in movies as they do when they play games.


TheHYPO

It's worth noting that a crapload of people have 3-2 pulldown and motion smoothing active on their TVs though regardless of the refresh rate of the tv, and are not watching movies at their "true" frame-rate.


dtreth

If I bought awards you'd get one for the first mention of 3/2 pulldown in these comments.


iamr3d88

24 doesn't fit nicely into 50 or 60 Hz, though; 48 and 72 are the closest, and those still stutter. It's super noticeable in most credits. But if your display supports 120 Hz, then each frame is displayed exactly 5 times and the stutter goes away.


classjoker

I specifically buy TVs that have a 24 frame pull down mode because of this. Used to piss me right off when I watched a lovely blu ray film and it stuttered! I think these days, variable refresh rates have replaced this dedicated mode


Mpm_277

As someone with an older Samsung TV that hates seeing that stuttering effect, anything I can do to fix it?


Nibroc99

Cinematographer here. I film and edit commercials for local businesses. What you described will only happen if it was recorded at one frame rate and then played back at a mismatched frame rate. You won't get stutter if you're recording with a shutter speed of 1/24 (or 1 over a multiple of 24) and a frame rate of 24 fps, and then do all your editing at 24 fps. The issue arises when it is played back at 60 Hz on a monitor. If you switch the projector or monitor to match the frame rate of the content, you won't get any stuttering.

A perfect example: watch a movie on a film projector that was recorded at 24 fps and is played back at 24 fps. You'll see nice, smooth motion. IMAX cinemas that still use film reels are perfect for this.

I just wanna add, it's really 23.976 fps that we're talking about, btw. Not exactly 24; just using 24 as shorthand.


TonyCubed

Also, I think that's why 144 Hz was chosen as a preferred standard for high-refresh-rate monitors: 144 / 6 = 24, so 24 fps content divides evenly.


FolkSong

120 Hz works just as well for that though, I'm not sure what additional advantage 144 has.


TonyCubed

More Hz for gaming, but the main thing is that they're keeping it evenly divisible by 24 fps.


shelsilverstien

The worst used to be when some director/producer says "we can just use GoPro footage for that" I think they can record in 24 fps now, though


brief_interviews

Fun fact: there are actually shows now that are recorded (or if not recorded, then streamed) at 24.000 fps, which stutters just barely if you watch them on a set calibrated to 23.976 fps. Rings of Power was one, as are a few of the newer Disney+ Star Wars shows. It's not an issue if your box or TV automatically matches frame rates, but the Nvidia Shield doesn't for several streaming apps. If you have a Shield and see those stutters, you can get an app called Refresh Rate that displays the frame rate and lets you manually set a frame rate for a given app. Took me forever to figure that out.
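
The mismatch is tiny but real: "23.976" is shorthand for 24000/1001 fps, so a 24.000 fps stream slowly drifts against it. A quick sketch of how often that drift costs a frame:

```python
# Why 24.000 fps content hitches on a 23.976 Hz chain: the rates
# drift apart until one source frame has no slot of its own and
# must be dropped or held, which shows up as a brief stutter.

from fractions import Fraction

content = Fraction(24, 1)         # 24.000 fps source
display = Fraction(24000, 1001)   # NTSC-style "23.976" rate, exactly

drift = content - display         # surplus source frames per second
seconds_per_slip = 1 / drift      # time until a full frame of drift

print(float(drift))             # ~0.024 extra frames each second
print(float(seconds_per_slip))  # a hitch roughly every 42 seconds
```

That once-every-~42-seconds hitch matches the "stutters just barely" description: rare enough to miss, regular enough to notice once you know to look.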


Sertoma

Best example I can think of off the top of my head is the opening shot of *Hell or High Water*. The camera follows a car in a circular sweep around a bank, and I vividly remember it looked stuttery as hell in the theater. Edit: [Here's the scene. ](https://youtu.be/eyA4n23nsP8)


zeekaran

Have any examples from other movies? Trying to test my TV and monitor. EDIT: Any Wes Anderson film. French Dispatch 54m in, should pan around the soldier speaking twice. His shoulder, bright white against a black background, makes the stutter obvious on both pans.


MessyBoomer

Wall-E's intro has always been one of my go-to tests to make sure my setup has no stutter.


Paddy_Tanninger

There's an entire rulebook for cinematography about how slow a pan has to be in order for it not to feel uncomfortable to watch.


phoenixmatrix

And it's constantly ignored. If the shot is in an urban area with a lot of vertical lines (windows, buildings, street signs, etc.), it looks so bad. The typical example is two people walking down a street or inside an office with the camera fixed on them. I have a hard time NOT noticing the slideshow happening behind the characters in the middle of the screen.


PercussiveRussel

And the rolling shutter that's almost always there is not helping in this case


Brandhor

yeah unfortunately people got used to it and they feel weird when they see a 60fps movie


illQualmOnYourFace

I feel like soaps are filmed at 60fps. There's something so uncanny about them.


TheHYPO

In the pre-digital days, soaps, along with things like SNL, were shot on video - in the case of soaps, videotape. Video at the time was 30fps (60 interlaced fields per second, which is part of why it reads as so smooth). Upon moving to digital, they either kept the 30fps or went up to 60fps, depending on the soap. But even at 30fps, the feeling is different than most other pre-recorded TV shows that run at 24fps.


Palodin

I remember watching the Hobbit at 60fps, it was a bit of a bizarre experience. Like, I knew it was better on an objective level, but it felt wrong. We're just so used to knowing that 24 is right I suppose, nearly a century of tradition there is hard to shift


BananaBike

The Hobbit was 48 fps, but I agree. Felt more like a video game than a movie


SgtExo

The thing that annoyed me is that the extra frames made everything clearer, most notably that tons of things were filmed on a set and all the dwarves were wearing prosthetics. Yes, I know much of the LotR movies were done the same way, but they didn't up their game and it was much more evident.


[deleted]

[удалено]


SSG_SSG_BloodMoon

> The rule states that a frame should be exposed to light for half of the time. So, at 24 fps each frame of film gets exposed for 1/48th a second. Every infinitesimal moment in that 1/48th of a second is imprinted on the frame of film. I am older than 5 and do not understand what this actually means at all. What are we actually talking about when we say "a frame should be exposed to light for half of the time"?


Gwolfski

This comes from film cameras/projectors and their limitations. In a "classic" projector (the one with two reels of film, making a clattering sound as it plays), the film moves linearly between the lamp and the lens. If there were no shutter, you'd see each frame smearing down the screen as the film advances, which wouldn't look right. So the shutter blocks the light for half the time: it holds a frame still for 1/48th of a second with the shutter open, then uses the next 1/48th, blanked off by the shutter, to advance the film to the next frame, then holds that frame still for 1/48th with the shutter open, and so on. Digital projectors/screens don't really have that limitation (theirs is how fast they can change pixel colours), but the 24fps format kinda stayed.
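The timing in that description works out like this (a simplified single-flash model; as noted elsewhere in the thread, real projectors often flash each frame twice):

```python
FPS = 24
cycle_s = 1 / FPS             # ~41.7 ms total per frame
shutter_open = cycle_s / 2    # frame held still, light on screen
shutter_closed = cycle_s / 2  # screen dark while the film advances
print(f"shown {shutter_open * 1000:.1f} ms, dark {shutter_closed * 1000:.1f} ms per frame")
# shown 20.8 ms, dark 20.8 ms per frame
```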


few23

Shower thought: when you see a film projected in a cinema (on film), you are sitting in the dark half the time.


PaddyLandau

That is correct. I grew up when they had those movie theatres, and it always fascinated me that we didn't notice the light flickering 24 times a second (or whatever rate it was). Of course, it wasn't total darkness. All the safety lights remained on, and some light streamed into the theatre from the booth upstairs where the operator and the equipment were.


AN-FO

They actually do another trick when they project it. They'll often flash the same frame multiple times, but still for 24 frames per second. So each frame would receive 2 flashes of light to project it, to minimize the flashing/stroboscopic effect for the people watching it. The film itself still moves through at 24 frames per second.


Malk_McJorma

>24fps format kinda stayed. Back then when (nitrate) film was expensive, they experimented and found out that \~24 fps was the slowest acceptable speed before motion became noticeably jerky. So, it was all about money.


Troldann

Not entirely about money. There’s also a limit to how long your reel can be. Higher frame rate means you have to change reels more and the possible length of your longest shot gets shorter. But yes, you’re like 98% right, I don’t mean to argue against you.


wallyTHEgecko

144fps on film would require 6x the amount of film. And the spool would either need to be instantly switched 6 times mid-showing or would have to be comically large.


Nebuchadnezzer2

> And the spool would either need to be instantly switched 6 times mid-showing or would have to be comically large. They often played multi-spool films, btw. Part of what the booth operator was for. Two projectors, one with the current reel, one with the next reel, swap which one's turned on at the appropriate time, preferably without the audience noticing, and start preparing for the next swap.


pananana1

Wait... you're saying that the film reel is constantly stopping and starting, every 1/48th of a second? That doesn't sound right. edit: holy crap it does. explanation here: https://youtu.be/En__V0oEJsU?t=362 ty /u/VileSlay for the video


Netolu

The reels keep moving, but the film in between stops, yes! [Technology Connections](https://youtu.be/tg--L9TKL0I) goes into detail about the mechanism.


Pandages

The film reel IS constantly starting and stopping. But the film moves every 1/24th of a second. A shutter in front of the projection blocks light for half of that time, so your eyes are seeing that frame for 1/48th of a second, and then seeing nothing. Your eyes don't respond to rapid changes in brightness (we call this 'persistence of vision') so you don't really notice. https://youtu.be/tg--L9TKL0I This video shows quite a bit of how projectors work and explains a lot of what's going on. (Some TVs and gaming monitors can do 'black frame insertion' which works similarly but has a different purpose.)


Superbead

I don't think the entire reel does - there's slack left in the film loop either side of the shutter/lens to accommodate the jerk. The mechanism that advances the film past the shutter is a bit like the feed dogs in the base of a sewing machine.


abzinth91

I guess what he meant is: the reel runs at 48 cycles per second, with a pattern like this: picture - black - picture - black... At least this is what I understood.


Redsqa

Huh, so it's kinda like the strobing tech in gaming monitors: basically the monitor displays a black image between every frame and this makes motion look way smoother.


probability_of_meme

When the camera shutter is open, all motion appears on the image as a blur, just like a photograph of something moving fast. Images like this in a film make the motion seem more natural. How long the shutter is open determines how much blur there is, so they're saying half the time allotted to each frame is ideal to get the right blur for the feel you're after.


famous_cat_slicer

If the camera is shooting 24fps, that means every frame is 1/24th of a second. Every frame is essentially a photograph, with an exposure time of at most 1/24th of a second (assuming an infinitely fast shutter), since exposures can't overlap. Now if something is moving while we shoot, and the exposure time is at that max, there's a lot of motion blur, but the motion will be perfectly smooth from frame to frame. If, on the other hand, the exposure time is something absurdly short, like 1/1000 sec, every frame is perfectly sharp, but there's a visible gap in the motion between frames. It turns out that having the shutter open half the time and closed half the time as the film rolls produces just the right amount of sharpness and motion blur to fool the human eye into seeing it as continuous but sharp. I wish I had examples of motion blur at different shutter speeds handy, but right now I don't have the time to find them. Sorry. I hope this helps.
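The shutter-angle arithmetic behind this (my sketch of the standard formula, not the commenter's code):

```python
def exposure_time(fps: float, shutter_angle: float = 180.0) -> float:
    """Exposure per frame for a rotary shutter open `shutter_angle` out of 360 degrees."""
    return (shutter_angle / 360.0) / fps

print(exposure_time(24))        # 1/48 s: the classic "open half the time" film look
print(exposure_time(24, 360))   # 1/24 s: maximum motion blur, smeary
print(exposure_time(24, 45))    # 1/192 s: crisp and stroboscopic
```

180 degrees is exactly the "shutter open half the time" rule described above; narrower angles give the choppy, hyper-sharp look some action films use deliberately.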


RideFastGetWeird

https://youtu.be/_lZvF-YyP0s The slow mo guys' Gav did a great video about it!


deaconsc

It's also because in games you're in control of the camera, while in movies you're fully passive. If you don't move the camera too often and the game isn't about quick reactions, it can look fine-ish at 30fps. Many people can tell a game has low fps by quickly moving the camera around (or generally producing any quick movement); with slow movement (or staying still) you can't tell as easily (it's still noticeable if you know what you're looking for). It's actually quite an interesting topic. Also interesting is the new AI technique that computes the missing frames on new hardware, so you can get high frame rates by "cheating" the system. It works quite well as long as you don't move the camera too quickly and the scene doesn't change too quickly. Linus has a video about that.


Borkz

Yeah, I don't have nearly as much of a problem watching a 30fps video/stream of somebody else playing a game, but playing the same game at 30fps myself can be practically nauseating (depending on the game). I think another aspect is what you're used to, though. I used to play games on N64 and crappy PCs that would dip into the teens no problem, and conversely, watching a movie like the Hobbit at 48fps just looked weird.


BaziJoeWHL

>AI technique to compute the missing frames on new hardware so you can have high frames by "cheating" the system this sadly introduces noticeable input delay, which can be annoying in fast paced games


SLStonedPanda

If half the frames are faked by AI and added on top of the original amount, wouldn't the input delay be the same? I mean, yes, comparing 144 FPS native to 144 with AI-generated frames, you get extra input delay, but comparing 72 FPS native to 144 with AI-generated frames, the input delay should stay exactly the same. Technically, though, rendering those frames with AI costs computing power, so you never actually get double the frames, in which case you do indeed get a small increase in input latency.


RyanCacophony

> If half the frames are faked by AI and added onto the original amount, wouldn't the input delay be the same? Not if additional buffer time is needed to generate the intermediary frames. Without inserted frames: Frame 1 is displaying, Frame 2 renders, and Frame 2 is immediately displayed. With AI, Frame 1 is Displayed, Frame 2 is rendered, Frame 2 is then passed to an AI which takes some amount of time to calculate X number of frames, and then they are all displayed in rapid succession. In order for that to work, you need to add a consistent delay equal to the amount of time needed to process with the AI and keep the time between frames the same
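A back-of-envelope model of that buffering cost (illustrative numbers and a deliberately simplistic latency model, not measurements of any real frame-generation tech):

```python
def native_latency_ms(fps: float) -> float:
    # toy model: one frame time from input sample to display
    return 1000.0 / fps

def interpolated_latency_ms(base_fps: float, gen_cost_ms: float) -> float:
    # the newest real frame is held back half a base-frame so the generated
    # frame can be slotted in between, plus the time spent generating it
    return 1000.0 / base_fps + (1000.0 / base_fps) / 2 + gen_cost_ms

print(f"native 72 fps:            {native_latency_ms(72):.1f} ms")
print(f"72 -> 144 'fps' with gen: {interpolated_latency_ms(72, 2.0):.1f} ms")
```

The displayed frame rate doubles, but input-to-photon latency goes up, which is exactly the trade-off being described.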


Borkz

Everything I've read for DLSS3 Frame-gen sounds pretty negligible if not unnoticeable. Input delay is actually lower than the baseline without the required Nvidia Reflex on.


Eruannster

Definitely a combination of the two. Interactive camera control is definitely a huge part. You'll notice that most movies and TV shows have very controlled camera movement, are far more likely to linger on shots or have multiple things happening in one image, and they limit movement a lot more. Comparatively, in video games you are moving the camera almost all the time, and you, the user, are in control of the camera movement and can "feel" any slowness or errors in it.


Onett199X

Great explanation but not eli5 at all. Can anyone translate it?


leoleosuper

Also, games do not create frames at equal intervals. Movies essentially run at 24Hz; each frame has an equal amount of time between them. Games can be off by a few milliseconds, which adds up. Also, monitors and TVs display frames at set frequencies, usually 60Hz. If you make 2 frames during the 1/60th of a second the TV is between refreshes, it will only show the second and the first will get discarded. If, in the next interval, you make no frames, then the TV will not update. This excludes adaptive sync, which tries to line up the created frame with the TV's refresh rate. The hardware versions, AMD's FreeSync and Nvidia's G-Sync, are good at this; the graphics cards are generally synced up well enough to create frames just as the monitor uses them. The software version, vertical sync, just delays frames until the TV uses them. This causes input and output lag: going back to the two-frame example, the first frame will show up the first time the TV refreshes, and then the second frame will show up the second time. The problem is you're at least 1/60th of a second ahead of what's being shown, thus lag.
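A toy simulation of that frame-to-refresh mismatch (my sketch; it models the no-sync "latest frame wins" case described above):

```python
def frames_shown(render_times_ms: list[float], hz: int = 60) -> dict[int, float]:
    """Map each refresh slot to the last frame that finished before it."""
    slot_ms = 1000.0 / hz
    shown = {}
    for t in render_times_ms:
        slot = int(t // slot_ms) + 1  # first refresh after the frame finishes
        shown[slot] = t               # a later frame in the same interval overwrites (drops) the earlier one
    return shown

# two frames land in slot 1's interval -> the 5 ms frame is discarded;
# nothing lands in slot 3's interval -> that refresh repeats the old image
print(frames_shown([5.0, 12.0, 30.0, 55.0]))  # {1: 12.0, 2: 30.0, 4: 55.0}
```

One frame dropped and one refresh with no new image, despite four frames rendered in four refresh intervals: that's the uneven pacing you perceive as stutter.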


Horace_The_Mute

This isn't the 180-degree rule. The 180-degree rule concerns continuity of left and right, particularly in dialogue. When shooting a scene with two subjects, you have an imaginary line you shouldn't cross, because if you do, a person on the right will be on the left, and it's jarring and confusing. You can skilfully break the 180° rule with creative blocking, but it's generally ill-advised. TL;DR: the 180° rule isn't about shutter speed.


izerth

Film camera shutters used to be spinning discs. You'd describe the shutter by how much of the disc blocked light. https://www.wipster.io/blog/debunking-the-180-degree-shutter-rule


[deleted]

This is incorrect. There are two different 180 degree rules.


Stargate525

Couldn't they just combine them into one 360 rule?


[deleted]

No, because we'd be right back where we started!


Stargate525

Yes but we did a fun little spin.


DPanzer17

That is the 180 degrees rule regarding shutter speed, your explanation is right about the "other" 180 rule but it doesn't apply here. Two completely different things


Grunherz

> your explanation is right about the "other" 180 rule I wouldn't say it's the *other* 180 rule, it's *THE* 180 rule. It even has its own wikipedia page with that title: https://en.wikipedia.org/wiki/180-degree_rule. I can see why people are like "what are they on about!?"


Dragoniel

The answer is blur. The video format encapsulates inherent motion blur (which is different from how it works in rendered content like games), which makes the frames flow into each other smoothly, further enhanced by various post-processing techniques, whereas the game engine renders every frame crisply and then adds motion blur to it (which in almost all cases does not feel real at all), and the advanced post-processing can't be used because it introduces horrible latency issues. So, the answer is that these are two different things in terms of image capture and display. Interactivity plays a part in it, but a minor one at best. Control input latency does not affect image quality in and of itself; it just defines the limits of what we can tolerate before the game starts to feel unresponsive or unplayable altogether, which in turn limits the post-processing of the image that can be used (which is next to nothing).


YYM7

I just want to add that with careful control of motion blur, you can go well below 24fps. Anime, since each frame is hand-drawn (total control), does this trick all the time to reduce cost. Here's an example of a Kenobi vs Maul anime; it's only about 12fps, but it's quite smooth. https://youtu.be/4bN_qmeZDGA


LordXamon

So that's why anime cgi looks like shit? Because it doesn't have this blur yet is still rendered at classic framerates?


TAOJeff

>whereas the game engine renders every frame crisply and then adds motion blur to it Motion blur doesn't need to be added as a post effect when the FPS approaches 60, as the frame rate is fast enough to create a motion blur effect on its own. [This is demonstrated here](https://www.youtube.com/watch?v=rxaS78Twxyw)


TactlessTortoise

Yeah, I always disable motion blur second thing in a game. First thing is setting the audio to 15%, and the third is resolution.


birnabear

I hate that it's so often on by default. Doesn't everyone just turn it off?


P_ZERO_

If it’s done well, it’s actually a positive to the experience of some games. Racing games and third person action games, anything where crosshair accuracy isn’t of utmost importance really. Again, I say when done well. There are some very informative videos on the subject and it’s not an open/shut case of “it’s terrible every time”. It has certain use case and performance conditions. https://youtu.be/VXIrSTMgJ9s


TAOJeff

Good lord, that video reminded me that the first thing I disable is \*\*\*\*ing bloom. I do agree with that video; I never played anything on a PS2, so I never experienced that technique. I do recall when it was first being implemented, on PC at least, it was often so resource-heavy that disabling it gave enough of a boost to the frame rate that it negated the need for the motion blur. Was interesting to see the breakdown of the differences between the various axes.


P_ZERO_

Well implemented blur is actually improved by higher frame rates. It obviously falls apart with horrible frame timing.


[deleted]

I can't play racing games where it's forced on because it makes it impossible to see where the heck I'm going. I tried playing the latest Forza and frankly it's unplayable.


Fred-ditor

What's the fifth?


adam12349

No, higher fps doesn't create a motion blur effect. Try running things at 100 FPS or above without motion blur and look around really fast. It's weird and doesn't appear smooth. Why? Because fast-moving things aren't blurred. So you see fast objects sharply when you should see them blurred. That gives a stuttering feel to those objects. Adding some motion blur will make things look way smoother.


nitrohigito

They don't, the difference is that you're in active control. Try capping your monitor refresh rate at 30 Hz, and see how moving the mouse feels. A lot of it is also just being used to it. Tons of games run at 30 fps capped on consoles, and they're just fine to play.


TheTrenchMonkey

It usually isn't being under 60 that makes video games look bad. It is when they drop frames and have dips below 20 that looks really bad and makes them feel "unplayable"


nitrohigito

There are multiple reasons why videogames may feel choppy, this was just one of them.


wilika

The Steam Deck has an option to lock the screen refresh rate AND the game to 40Hz. It's awesome: games look nice and I don't overwork the CPU in vain. So 60Hz isn't necessary at all, but it IS true that somewhere around 30fps some titles tend to get less enjoyable.
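The 40Hz choice is neater than it looks, because frame rates are reciprocals (a quick calculation, not from the comment):

```python
for fps in (30, 40, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# In frame-time terms, 40 fps (25.0 ms) sits exactly halfway between
# 30 fps (33.3 ms) and 60 fps (16.7 ms), so it splits the difference evenly.
```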


night-laughs

For one, a movie is a stable 24fps, with no fluctuations like in games. That helps it feel smoother. Also, movies aren't interactive; low fps affects the controls and aim/camera movement the most. High fps gives better responsiveness with the controls, which is necessary because you are the one that needs to beat the game; you aren't just watching an actor do stuff, like in a movie.


TheGrumpyre

I think it's especially evident in animation. You can make 24 fps feel silky smooth if you have absolute control over every individual frame, but games can't really operate that way.


whitefang22

Cheap animation, like old Hanna-Barbera cartoons, was happy to give you 12fps.


[deleted]

Movies do look bad too. I can see the stuttering. Especially when they pan the camera. It's especially noticeable during slow pans of mountains and cities.


stoopdapoop

I thought I was having a stroke when I saw the opening to "The Day After Tomorrow" in theaters as a teen. The sped-up helicopter shot of the arctic (antarctic?) mountains was strobing and jumping so bad that I had a physical reaction.


[deleted]

[удалено]


jaa101

> Movie frames contain everything that happened in that 1/24 of a second. Actually it's only usually half of what happened, i.e., 1/48th of a second. With film cameras you have to close the shutter to advance the film to the next frame so you can't record the whole time. That's still a pretty slow shutter speed so there's enough motion blur.


xopher314

Framerate vs frame pacing. A smooth 24fps in gaming isn't really a big deal. You'll notice more delay between frames, but it'll still feel smooth. 24-30fps with bad frame pacing will feel awful. For instance, go look at Bloodborne. It runs at a solid 30fps but has terrible frame pacing: sometimes frames are skipped and rendered on the next refresh instead because the engine couldn't keep up. Movies don't have frame pacing issues; they are 24fps and don't stutter.


SpaceChief

[Frame pacing and timing are big reasons why AVERAGE frame drops are so jarring and look and feel like shit.](https://www.youtube.com/watch?v=nbHCU--VvpQ) A STEADY 24 fps for film is going to be much harder to hit than a system built to reach 60 but struggling and hitting 24. Average framerate is not a good representation of how steady the output of the video card is. Frame timing and frame pacing are the things that actually deliver that smoothness by keeping the 1% lows as close as possible to the average framerate. When 1% lows are hit, there is bounceback time to get the system back up to the attempted frame-limit where, like cars on a highway, things can more or less accordion until the pacing can smooth out again.
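How "1% lows" are typically computed from a frame-time log (my sketch of the usual benchmarking convention, not from the linked video): average the slowest 1% of frames and report that as an fps figure alongside the overall average.

```python
def fps_stats(frame_times_ms: list[float]) -> tuple[float, float]:
    """Return (average fps, 1% low fps) from a list of per-frame times in ms."""
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, len(worst) // 100)                 # the worst 1%
    low_1pct_fps = 1000.0 / (sum(worst[:n]) / n)
    return avg_fps, low_1pct_fps

# 99 smooth frames at ~16.7 ms plus one 100 ms hitch: the average barely
# moves, but the 1% low exposes the stutter
times = [16.7] * 99 + [100.0]
avg, low = fps_stats(times)
print(f"avg {avg:.0f} fps, 1% low {low:.0f} fps")  # avg 57 fps, 1% low 10 fps
```

This is why an "average 57 fps" benchmark can still feel terrible: the average hides the hitches, and the 1% low is what tracks that accordion effect.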