Tfw can't tell a difference between 30 and 60fps even in side by side comparison

>tfw can't tell a difference between 30 and 60fps even in side by side comparison


You got ripped off when you bought your eyes, man

Cute bird

Shit sucks, man.

literally how

>Face in the bottom corner

Every time

this means you have poor eyesight, or your tiny brain is bottlenecking your eyesight

I want to see a 60fps vs 120, I know anything higher than 60 is a meme

What if he has a 30hz monitor, or is playing on a TV?

You won't see anything unless you have a 120-144Hz monitor.

And how would it prove your point if you don't have a 120fps monitor to check it yourself?

hertz

I can see fps, it's resolutions that all look basically the same to me. Everyone is so excited for 4k shit and I really couldn't care less.

Meh, I don't know if that's the correct term for recent OLED/LED/LCD monitors, since they can switch output refresh rate freely, unlike CRTs which had to keep scanning at third/half/max speed (I have to admit I'm lost on recent CRT tech).
Recent flat screens just output the entire screen and can benefit from relying solely on what the GPU outputs (FreeSync/G-Sync).
So is it really a "refresh" rate, since a flat screen will never flicker?

>"realer than real life"
This phenomenon is not unique to 60fps WebMs; it applies to anything (live action) digitally recorded at a high framerate with no interlacing or motion blur.

In "real life," there are essentially infinite "moments" available for the human eye to see, but the eye/brain can only process a certain amount in a span of time. In the process of perception, the brain interpolates everything, essentially adding motion blur. This way of seeing motion is what we're used to, and is typically replicated in cinema and other forms of digital video where, atop a somewhat low framerate, motion blur is intentionally added.

In a high framerate, non-motion blurred recording, there are 60 frames per second available for the eye to see, and none of them are connected or interpolated, because of the nature of higher-framerate, lower-exposure capture. As a result, your eyes/brain process the 60 frames just like a very fast slideshow, creating a strange perceptual effect. This is essentially the same feeling as stop-motion captured video, just faster, since the fluidity of stop-motion is limited by how minute the change in poses from frame to frame is.
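The "fast slideshow" point comes down to how long each frame sits on screen. A quick sketch of the arithmetic (plain Python, rates picked only as common examples):

```python
# How long each frame is held on screen at common frame rates.
for fps in (24, 30, 60, 120, 144):
    frame_ms = 1000 / fps
    print(f"{fps:>3} fps -> each frame shown for {frame_ms:.1f} ms")
```

At 30 fps every image sits for ~33 ms, long enough for the brain to register discrete steps in fast motion; at 60 fps it's ~17 ms, which is why the slideshow starts to read as continuous.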

The effect, which is typically described as a good thing and an indication of how great 60 fps video is, is really just your brain being confused by a series of images that is quick, but not so quick that the brain can't tell it's not real life.

This is an unfair comparison precisely because it's side by side, though. If you're just watching the animation in 30 fps alone it doesn't look nearly as bad.

Compare 45 to 60 actually. You won't notice a difference

This. 1080p is fine for me for now. 720p gets kinda blurry and I couldn't run anything above 1080p decently anyway except maybe overwatch. But nothing like a sweet fps.

...

You can already notice some stuttering in the 60fps example.

fuck, I never noticed until now

>.gif

This is cool.

That's why it's called a comparison. One is going to be better or worse than the other.

>Its not shit if you don't KNOW its shit.

Your eyes seeing both at the same time makes you perceive the 30 fps one as much choppier than you would perceive the same 30 fps animation on its own. It's just a fact. Similar principle as pic related, and it applies to everything: motion, color, brightness, sharpness, etc...

I never said that, but I figure you're just memeing. Obviously, 60 fps is better. Twice as good, in fact!

...

WARNING
WARNING
WARNING

You will perceive the previous picture as worse after having seen this! Open at your own risk!

I can't play 60fps more than an hour without getting dizzy.

Like said.
144Hz fag here. You notice the difference above 60 mostly in very fast movements. Like you can see in the webm, with very slow movement there is only a "small" difference in how it looks, and it becomes more apparent when it starts to move faster. Similarly, when you have something that moves very fast across your screen, you start to notice the difference between 60 and 120 way more.
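The fast-movement point can be put in numbers: the faster something moves, the bigger the jump between consecutive frames, and higher refresh rates shrink that jump. A rough sketch (Python; the speed of one screen width per second is just an illustrative assumption):

```python
# An object crossing a 1920-px-wide screen in one second:
# how far does it jump between consecutive frames?
speed_px_per_s = 1920  # assumption: one screen width per second

for fps in (30, 60, 120, 144):
    step = speed_px_per_s / fps
    print(f"{fps:>3} fps -> {step:.0f} px jump per frame")
```

A 64 px jump per frame (30 fps) reads as visible stutter on fast pans; 16 px (120 fps) is much closer to smooth, which matches why the difference only shows up in fast motion.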

That's legitimately abnormal. I'd get that checked out.

How in the fuck dude

Your genetics are weak as fuck

I'm kind of glad I still haven't seen a 144hz monitor as it means I don't have to buy one

>It's another fps thread
This shit was never, not once, brought up before this generation. It's just a fucking buzzword. That's all it is.
Sure, you might be able to spot a small difference between 30fps and 60fps when looking at a test, but in-game? No. That's a whole other story. You have to have some kind of autism to even give a fuck.

how about rapid changes in perspective, like turning a quick 180 in an fps? Would you say 120 has a noticeable improvement?

someone post the webm of that panning shot in some movie

I would say so, yeah. But I have to admit, while I have a 144Hz monitor and enjoy playing games at 120, it isn't as "necessary" (at least to me) as 60 FPS is over 30. It's more of a nice addition. Something that makes your game look a bit nicer, a bit smoother, but won't ruin your experience if it is missing, unlike 30 fps and lower, which actively annoys me because it is so noticeable for me.

no need to be in denial, this isn't anything new. it's the consoles that slowed down fps due to hardware limitations. the NATURAL rate for vidya should be 60 fps, but generation after generation of 20-30 fps tricked our brains into thinking the other way around.

>tfw you've been using a 144hz monitor for so long that when you update your drivers you think your graphics card broke, when actually the refresh rate defaulted back to 60.

Seriously after using 144hz for so long 60hz is no longer acceptable for me.

Okay, kid.

>download a free game like TF2
>join or create your own server

>open console
>type "fps_max 60"
>play for a bit
>type "fps_max 30"
>play for a bit
>kill self
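The `fps_max` cvar in the greentext above is just a frame limiter. A minimal sketch of the idea in Python (not Source engine code, obviously; the rendering step is stubbed out):

```python
import time

def run_frames(fps_max, n_frames):
    """Run n_frames, sleeping so the rate never exceeds fps_max."""
    target = 1.0 / fps_max
    start = time.perf_counter()
    for _ in range(n_frames):
        frame_start = time.perf_counter()
        # ... a real engine would render the frame here ...
        elapsed = time.perf_counter() - frame_start
        if elapsed < target:
            time.sleep(target - elapsed)  # idle away the leftover frame time
    return n_frames / (time.perf_counter() - start)

print(f"effective fps: {run_frames(60, 30):.0f}")
```

On most systems this lands a little under 60 because `time.sleep` overshoots slightly; real engines use the same loop but with finer-grained waits.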

>fucking implying that you regularly do anything that refreshes as fast as your monitor anyway

shit ass, somebody posted a good video yesterday that tested the differences and reaction times of 144/120/60/30/23 hertz monitors. Where did it go?

>buzzword
retard faggot, maybe you're content with shitty frame rate but not everyone is

>This shit was never, not once, brought up before this generation


sup underage

Show me one fucking post from a gaming magazine pre-2004 talking about ''60fps'' shit. Go ahead, faggot. I'll wait.

Oh. You can't.

...yes?

Maybe it was never brought up because up until 5 years ago, 55Hz and 60Hz were what everyone used. The rate is not only noticeable, but it also gives you a clear advantage over someone who doesn't have the high refresh rate, especially in situations that can be ended with a single button click.
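That "clear advantage" is quantifiable: after the game registers your input, the result can wait up to one full refresh interval before it appears on screen. A back-of-envelope sketch (Python, display latency only, ignoring engine and input lag):

```python
# Worst-case and average extra wait before an input can show up
# on screen, from the refresh interval alone.
for hz in (60, 144):
    worst_ms = 1000 / hz      # input just missed the last refresh
    avg_ms = worst_ms / 2     # input arrives at a random moment
    print(f"{hz:>3} Hz -> up to {worst_ms:.1f} ms (avg ~{avg_ms:.1f} ms)")
```

So 144Hz shaves roughly 10 ms off the worst case versus 60Hz. Small, but in a flick-shot situation decided by a single click it's a real edge.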

Seriously though. When my refresh rate went back to 60hz my mouse cursor felt laggy and unresponsive. It literally made me nauseous and I thought my computer was broken. So I tried to roll back my driver and that's when I discovered I wasn't running at 144hz

released 2002

take a look at that back cover page.

Where is your god now?

youtube.com/watch?v=T2TMcUhibkg

Is there a way to experience the difference between 60 and 120 fps if you only have a 60 Hz display?

no

>This shit was never, not once, brought up before this generation.
People thought 15fps was smooth back when 3D graphics in PC games were still new.

Shut the fuck up.

People have literally always jerked about graphics for as long as videogames have existed.

This is so fucking pathetic. People like will defend shitty framerate to death thinking that 60 fps is just a "new buzzword". DMC1, Onimusha, MGS2 (kinda) and some other games had it back in fucking 2001 and it was pretty damn noticeable.

You might have a 30Hz monitor.

I can't show you because this was back when magazines were still a thing, but I remember a few articles about Timesplitters and Madden on PS2 having a high frame rate.

Cool. I remember the exact same thing, but they were talking about anything but framerate. Guess your argument doesn't hold up very well?

Didn't old 2D games have pretty good FPS?
I remember being staggered at how shit it was when we went 3D.

You remember not remembering something?

Yes and no. The difference between games' frame rates back then was much more dependent on the skill of the programmer than on the hardware it ran on, as opposed to today.

Yes, go and buy a 144hz display.

Even NES games are 60fps, nigga.

This. I just get motion sickness from the unnatural speed. Looks weird.

*with massive slowdown or nothing going on

There doesn't need to be a debate about this anymore. The reason some people are fine with 30fps and some aren't is due to a psychological phenomenon called "habituation" which occurs in many situations of our day-to-day life.

It's simply the process of subconsciously "getting used to" changes made to our environment. If a PC gamer played a game that was suddenly locked to 30fps, it would have a jarring effect. They would instantly notice "choppiness" in the gameplay, maybe even feel that the game was temporally slower. But with prolonged exposure to this 30fps motion they would gradually stop being bothered by it (habituation).

The reverse can also happen to a console gamer playing a game that was suddenly locked to 60fps. It would feel smoother and maybe even faster (temporally).

Another case is if a PC gamer decides to purchase a higher refresh rate monitor. I personally cannot go back to playing first-person shooter games at 60fps now that my default refresh rate is 96Hz.

So next time this argument comes up remember there's a thorough explanation as to why people feel 30fps is fine in their game experience.

No.