FPS

I'm a dumbass, please someone explain it to me.
Why is it that when I watch a 27 fps or 30 fps video, it seems smooth on my moniter?
When I play a video game at 30 fps, it feels laggy and jittery.

You're not controlling the content in the video

Not your personal text support, go back to Sup Forums or reddit or wherever you came from.

This. No feedback from your motor skills involved.

Wrong

Aight why

I actually thought it was because 30fps videos were constant (and the source certainly captured more than 30fps), which helped keep everything smooth.
And when you played something, you could notice the framerate change, which gave you this impression.

This is part of it,

But also, when you play a 30fps game it's not always 30fps; it will dip down to 20fps now and then, and you will notice that.

But when a 60fps game dips down to 50fps, it won't be as noticeable.

Your game just feeds frames to the monitor directly while most media players will apply some sort of frame-blending/smooth-motion algorithm before feeding it to the monitor if there is a frame rate mismatch.

Well, it's true that a game is generally "smoother" by keeping a constant framerate, which is why some games have frame limiting options. I'm one of those people that would rather play something at 30 constant FPS than 30-45 with frequent drops.

Besides isn't 24 the default for movies?

1. Motion blur
In a video, the content doesn't change, so it's trivial to blend two frames together to create a smooth transition. In a video game - or, for that matter, when moving your mouse around on your monitor - the next frame can be drastically different depending on user input, so this kind of technique is far more limited. To simplify things, it's enough to assume that each frame has to be discrete, with a much rougher result. This is a big reason why the cinema industry is stagnating on 24FPS as the standard while computer monitors are 60Hz minimum and 120Hz+ is more and more widely adopted: even just moving your mouse at 120 or 144 FPS makes you not want to go back to 60, and I'm not even talking about 30 here.
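
To make the blending idea concrete, here's a toy sketch in Python (the function name and the tiny 2x2 "frames" are made up purely for illustration, not from any real video pipeline):

```python
# Toy frame blending: each "frame" is a grid of 0-255 grayscale pixel values.
def blend_frames(a, b, t=0.5):
    """Linearly interpolate two same-sized frames; t=0 gives a, t=1 gives b."""
    return [[round((1 - t) * pa + t * pb) for pa, pb in zip(ra, rb)]
            for ra, rb in zip(a, b)]

# A bright pixel moving one step to the right between two frames.
frame_a = [[255, 0], [0, 0]]
frame_b = [[0, 255], [0, 0]]

# The blended in-between frame smears the pixel across both positions,
# which reads as motion blur instead of a hard jump.
mid = blend_frames(frame_a, frame_b)
```

This only works well when you know both frames in advance, which is exactly the point: a video player has the next frame sitting in a buffer, a game doesn't.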

2. Input lag
As others have mentioned, at 30FPS you will on average have about 16ms of input lag, with a maximum of 33.3ms. That's noticeable in almost any game; in multiplayer games, consider that over 30ms of ping is considered suboptimal, over 60ms is meh, and over 100ms is really bad. At 60Hz that's halved, and meme 165Hz gaming monitors cut it down to 6ms max, 3ms average - not nonexistent, but nearly negligible.
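
The numbers above fall straight out of the frame time; a quick sketch (function name is just for illustration):

```python
def frame_lag_ms(hz):
    """Worst-case and average wait (in ms) before an input can appear in a frame."""
    frame_time = 1000.0 / hz           # worst case: input arrives just after a frame starts
    return frame_time, frame_time / 2  # average: input arrives mid-interval

for hz in (30, 60, 165):
    worst, avg = frame_lag_ms(hz)
    print(f"{hz:>3} Hz: worst {worst:.1f} ms, average {avg:.1f} ms")
```

At 30Hz that's roughly 33.3/16.7 ms, at 165Hz roughly 6.1/3.0 ms, matching the figures quoted above.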

Check camera movement. Most movies (of the TV/cinema variety) use very, very little movement.

Now go play a video game and do a 90 degree switch.

Nobody's talking about this, dipshit. If your GPU can render at 200FPS and you have a 144Hz monitor, you'll get 144 frames, maybe slightly less (like 142) every now and then due to vsync.

You'll get all 200 frames but some will manifest as screen tearing. Vsync only forces the game engine to render frames according to the refresh rate.

>referring to low fps as "lag"
Unironically defenestrate yourself.

youtube.com/watch?v=eXJh9ut2hrc
in short, movies are usually recorded at higher frame rates, but they're processed to be output at 24FPS.
imagine they're recorded at 48FPS. you pair all the frames (1,2), (3,4), ... and then mix each pair. the output has half the frame rate (24).
this mixing process more or less mimics how our eyes see the world. if you move your head really fast, you don't see a few very sharp images, you get blurred forms
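
The pairing trick described above fits in a few lines of Python (frames here are just flat lists of grayscale pixel values, a made-up toy representation):

```python
def halve_framerate(frames):
    """Average consecutive pairs of frames: e.g. 48 fps in -> 24 fps out.
    Each frame is a flat list of 0-255 pixel values; the motion between
    the two source frames is baked into the output as blur."""
    return [[(p1 + p2) / 2 for p1, p2 in zip(f1, f2)]
            for f1, f2 in zip(frames[0::2], frames[1::2])]

# Four 1-pixel "frames" of something brightening over time at 48 fps...
clip_48fps = [[0], [100], [200], [300]]
# ...become two blended frames at 24 fps.
clip_24fps = halve_framerate(clip_48fps)
```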

Movies are recordings whose frames blend into one another. Also, frametimes are 100% consistent. A video game will have variable frametimes (even locked at 30 or 60 fps), and frames aren't blended together.

Using vsync or double buffering at 30/60 fps will be "as smooth" in theory, but won't be as good in practice due to too many variables.

100hz+ and things like g-sync are good alternatives and will be very smooth, hence being popular.

a game running at 30fps with vsync will look just as smooth

>moniter
literally gas yourself.

I've been refusing to do console gaming for the last 8 or so years. I've been clinging to my 60fps, and have refused to touch anything that has been locked to 30fps. But honestly, I can't really remember if 30fps is really that bad. I should go pirate a title that is locked and really see for myself. I went from wanting to buy Okami to not even wanting to pirate it after they announced that it was locked.

youtube.com/watch?v=YCWZ_kWTB9w

>want to buy okami
>finds out it's locked at 30FPS
>doesn't want to pirate it anymore
what did he mean by this?

That's kind of autistic, user. I can understand this for competitive console games, but fucking Okami? Stop being a sperg.

>I've been clinging to my 60fps
I also cannot understand how someone can be such a pleb yet act so high and mighty. High refresh rate monitors are very cheap and common nowadays. Go get yourself one; if framerate is so important to you that the difference between 30 and 60 is enough to make you not want to play a single-player game, then you'll thank me after using 144Hz as your daily driver.

I already own a 144Hz monitor, user.
What made you jump to the conclusion that I didn't own one?
And also, I can admit that I'm being kind of autistic about it. That's why I posted about wanting to give it a chance.

>What made you jump to the conclusion that i didn't own one?
You literally said "I've been clinging to my 60FPS". Why the fuck are you clinging to 60FPS if you have a 144Hz monitor?

I suppose that makes sense. Most triple-A single-player titles seem to be locked to 60 fps. I suppose I could have written 60fps OR HIGHER.

He asked a question purely about cognition, not tech support, you knee-jerking jerk

>Most triple-A single-player titles seem to be locked to 60 fps.
I have literally never heard of a PC game being framelocked, except some really shitty old console ports. What the fuck are you talking about?

Aside from the already mentioned reasons like a lack of feedback, motion blur, and limiting the depth of field, film can seem fine as slow as 24fps because there is over a hundred years of cinematography practice that goes into dictating how the camera is allowed to move and how fast during any take. In fact, temporarily breaking out of these standards can be used as an effect in the right circumstances, like cutting the blur in a war film to make all motion seem hypersensitive.

In a game there is an expected level of feedback with relation to your motor skills, blur and a limited depth of field only serve to disorient you, and you are often in control of the camera at all times, with no real limitation to how fast you're allowed to move it around. The only way to satisfy this situation is with a high frame rate.

real life things have motion blur, video games don't

Wait, I just remembered Skyrim's physics engine is gay and can shit itself if you play above 60. That's the only prominent example I know of, though.

>moniter

Video is not interactive and video is likely to have near-perfect frame delivery. In games low FPS is much easier to notice because it's interactive and frame delivery may also be fucked up to some degree (large frame time variance). Low FPS also affects input lag since the average frame time can be much higher, so the game is literally less responsive beyond the mere fact that it isn't animating smoothly.

>all fighting games are locked to 60fps
>most console-ported games are locked to 60fps
I could give you specific names, but I bet you could do a Google search yourself, user.

The Okami HD port on PC is god awful when it's capped at 30fps like that; it feels like I'm playing a PS1 game. It's very noticeable.

>>all fighting games are locked to 60fps
Oh, so irrelevant stuff
Even Brawl has a 120Hz patch by the way. This is pathetic
>most console ported games are locked to 60fps
I have never encountered one (other than the aforementioned Skyrim) desu. Even the early assassin's creed games, which literally have playstation button icons in some of the tutorials/help screens, are unlocked. Maybe you're just playing shit games?

>irrelevant
>shit games
This is going nowhere user.

>most console ported games are locked to 60fps
That's simply untrue, most multiplats are in fact not FPS locked at all.

Because a video is out of your control. Once it's in your control, at 30fps you will have 33ms of absolute lag between when you start to move and when your monitor can display the resulting frame. In the worst case, your monitor displayed a frame at the very last moment possible and then didn't have a new frame ready until the next tick, so you get 66ms of lag.
Then there's the fixed lag of the monitor itself, which takes X ms to display no matter what you do, and no monitor is better than about 9ms here, so you are looking at 42ms of lag no matter what. This feels like fucking hell to play.
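
That arithmetic, written out explicitly (a sketch; the 9ms monitor figure is this poster's claim, not a measured constant):

```python
def lag_bounds_ms(fps, monitor_ms=9):
    """Best- and worst-case input-to-screen lag for a frame-locked game."""
    interval = 1000.0 / fps
    best = interval + monitor_ms       # one full frame interval is always spent
    worst = 2 * interval + monitor_ms  # input just missed a frame: wait out two intervals
    return best, worst

best, worst = lag_bounds_ms(30)  # roughly 42 ms best case at 30 fps
```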

Now, if you are playing a game like this, you most likely have either a CPU bottleneck or a GPU bottleneck. If it's GPU bound, the game won't feel smooth, but it still plays well; if it's a CPU bottleneck, you now have an experience where the game is just not fucking workable no matter what you do.

hope this helps a bit.

Back when GTA5 came out, I had a shit setup; it could do 60fps but would drop to ~30fps, so I decided that instead of having that wide a range, why not tune all my settings to get that frame variance down. So now it was at 20-35 fps instead: overall a better gameplay experience, but honestly, not worth it at all.

you will notice the 50 fps drop depending on the game

Motion blur and input lag play a part, but the reality is that 30fps does look jittery even in video. It's just not as noticeable to you because in filmmaking, movements that make it obvious are usually avoided or disguised, whereas in gaming, moving the camera almost always involves wide panning movement, and the focal point is the environment around your character (whereas in films it would usually be the characters, with panning happening in the background).

...

>frame-bending
is this the Avatar of the 21st century

gif doesn't support 60fps

it's set to 50fps, so it's actually 50/25/12.5
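
For context on those odd numbers: GIF stores each frame's delay as a whole number of hundredths of a second, so only rates of 100/n fps are actually representable (the helper below is just an illustration):

```python
def gif_fps_options(max_delay_cs=8):
    """GIF frame delays are whole centiseconds, so only 100/n fps are possible."""
    return [100 / d for d in range(1, max_delay_cs + 1)]

rates = gif_fps_options()  # 100, 50, 33.3..., 25, 20, ..., 12.5 - no exact 60
```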

Video frames blur into each other as a consequence of the way they're recorded, so the video appears continuous. Game frames don't, so they appear choppy.

read again

Retard. Record some 30 fps capped gameplay of yours and watch it back; it'll feel a lot less shitty than playing at 30 fps.

In GTA 5 there is a sync mod that helps make 30 fps look like it's a video.

Motion blur in games just makes things blurry or introduces extra input lag.
Besides if we're talking GTAV in particular there's literally zero (0) reason to play it at 30fps

This*1000

Personally I can understand this, 30fps games make me want to hurl these days.

I would look into the release and see if people come up with a way to manually uncap the framerate on the port; someone almost always does.