High Frame Rate vs. High Resolution?

Which matters more and why?

Specifically 1080p @ >60 fps
vs. Resolutions >1080p @ 60 fps. Let's throw in 75 fps as well, since I've seen that on some 4k monitors.

Had it all: 4K, ultrawide, 144Hz.
High frame rate is a meme, don't bother even if you're a /v/ermin.
Ultrawide is great for productivity even though retards are going to meme spout about muh vertical workspace when compared to 4K.
4K is the most universal go-to thing atm.

All of this doesn't matter if you're a richfag and can spend 100 grand on a 5k ultrawide 144hz monitor.

I personally wouldn't consider 1440p because of the scaling issue with everything from the last 10 years. But 4k costs a pretty penny.

High frame rate is useful for some games and makes things like scrolling feel better but is wasted at all other times. High resolution improves literally everything provided you're not one of the idiots who bought a gigantic screen to emulate four normal sized 1080p panels.

It's like the TN vs IPS argument. Yeah sure TN has a better response time but it's not much and it certainly doesn't make up for looking like dogshit. Once you go hidpi you won't go back.

Ultrawide is the one true king. Everything else is meme shit.

Maybe it depends on what you're using it for. Both have pros and cons.

You bring up a good point. High frame rate isn't useful all the time. High resolution is. Ironically I have two 27 inch 1080p monitors and every time I mention that here, people REE out about DPI. I've never been bothered by it. I have a 2010 1680x1050, 15.4 inch Macbook Pro at 128.65 dpi. Using my 81.59 dpi 27 inch monitors doesn't bother me one bit.
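
If anyone wants to check those numbers, PPI is just the diagonal in pixels divided by the diagonal in inches. Quick sketch (the third line is a hypothetical 27 inch 4K thrown in only for comparison):

# Pixels per inch = diagonal resolution in pixels / diagonal size in inches.
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    return hypot(width_px, height_px) / diagonal_in

print(round(ppi(1680, 1050, 15.4), 2))  # ~128.65 (the 2010 MacBook Pro)
print(round(ppi(1920, 1080, 27.0), 2))  # ~81.59  (27 inch 1080p)
print(round(ppi(3840, 2160, 27.0), 2))  # ~163.18 (27 inch 4K, for comparison)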

It's true that once you go high refresh rate you can't go back. 60Hz feels like dogshit after using a 120Hz+ monitor. The biggest difference is scrolling and moving the mouse, and of course games will look much nicer. Be warned if you use Nvidia on Linux, the mouse will move faster but moving windows and scrolling still run at 60Hz for some reason. Never could fix it, tried literally everything.

I feel like you'd have to run into diminishing returns with frame rate, more so than resolution. Is there really much of a difference between 60fps and 120fps? What about 144 & 240?

Before trying 120 or 144 you'll think 60 is good enough (and it really is), but after trying 120+Hz at 100+FPS you'll start noticing how much better everything is at higher refresh rates. Many people can't tell the difference between 144 and 240, but 240 is supposedly close to the number of distinct "frames" a human brain can identify and process (the claim is that most brains "run" at 235~260 "Hz").

Depends on your use case.
I much prefer framerate, personally.

High framerate will give you lower latency in the case of buffering due to shortening the timespan of each frame. But whether you actually use that latency or not is another matter. And oftentimes you should opt to limit prerendered frames, or block sync and limit framerate, on a driver level anyway.
High framerate motion is also easier to follow visually. Especially with strobe backlighting, but that's really quite a meme and could also strain your eyes anyway.
I'd say it feels like everything gives less resistance, so while it's less noticeable, it is appreciable.
Lower framerates like 24 and 30 also scale evenly to 120 without pulldown jitter (25/50 PAL content needs 100Hz instead).
Using standard 1080p means you don't need to deal with UI/DPI scaling issues. So while fewer things actually make use of it, having it is less intrusive than having a higher resolution.

A higher resolution gives more clarity, when programs actually output more visual information.
It is definitely a more noticeable benefit.
Removes a lot of issues from aliasing, makes a lot of things actually tolerable without AA. Even fonts, to some degree.
It gives you more pixels to work with, simple as is. So if you're multitasking windowed or tiled programs, that's a straight up benefit to consider.
But it still needs software support.
Scaling unsupported 1080p to 4k is going to make it blurry with linear, blocky with nearest, or tiny with no scaling, depending on settings.
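
To make the scaling part concrete: 1080p to 4K is an exact 2x in each direction, so nearest just duplicates every pixel into a 2x2 block (sharp but blocky), while linear filtering blends neighbours (soft/blurry). Toy sketch with a made-up 2x2 "image" of brightness values:

# Nearest-neighbour 2x upscale: every source pixel becomes a 2x2 block.
def upscale_nearest_2x(img):
    out = []
    for row in img:
        doubled = [px for px in row for _ in (0, 1)]
        out.append(doubled)
        out.append(list(doubled))
    return out

src = [[10, 200],
       [60, 120]]
for row in upscale_nearest_2x(src):
    print(row)
# [10, 10, 200, 200]
# [10, 10, 200, 200]
# [60, 60, 120, 120]
# [60, 60, 120, 120]
# Linear filtering would instead fill the new pixels with in-between values
# (e.g. ~105 between 10 and 200), which is what reads as blur on a desktop.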

Then there's the performance aspect. Since 4k60 is twice the pixels per second compared to 1080p120, you could expect as bad as twice the performance cost, though it's obviously not linear like that.
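
To put numbers on that, pixel throughput is just width x height x refresh; real GPU cost obviously doesn't scale exactly with it, but it's a decent first approximation:

# Pixels pushed per second for a given mode.
def pixels_per_second(width, height, fps):
    return width * height * fps

uhd60 = pixels_per_second(3840, 2160, 60)    # ~498 million px/s
fhd120 = pixels_per_second(1920, 1080, 120)  # ~249 million px/s
print(uhd60 / fhd120)  # 2.0 -> 4k60 is twice the pixel throughput of 1080p120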

In either case, if the software you're using doesn't support the one you picked, it's annoying. With high resolution it becomes a noticeable nuisance due to settings and visual differences.
With high refresh rate it's just disappointing.

You do run into diminishing returns. 240 is probably noticeable, but it's not worth it unless you're a pro CS:GO player, since the only panels right now are 1080p TN. You can figure out how much of a theoretical difference there is by looking at the frametimes.
24fps=41.7ms
30fps=33.3ms
60fps=16.7ms
120fps=8.3ms
144fps=6.9ms
165fps=6.1ms
240fps=4.2ms
So going from 60 to 120 is an 8ms drop in latency, and 120 to 240 is a 4ms drop. So there's definitely diminishing returns, and then of course there's the question of how fast it can get before your eyes don't notice any difference.
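
Those frametimes are just 1000/fps if anyone wants to recompute them or extend the list:

# Frametime in milliseconds, and the latency saved when stepping up.
def frametime_ms(fps):
    return 1000.0 / fps

for fps in (24, 30, 60, 120, 144, 165, 240):
    print(f"{fps}fps = {frametime_ms(fps):.1f}ms")

print(f"60 -> 120: {frametime_ms(60) - frametime_ms(120):.1f}ms saved")    # ~8.3ms
print(f"120 -> 240: {frametime_ms(120) - frametime_ms(240):.1f}ms saved")  # ~4.2ms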

Same person, but I'd still say 120-144Hz is worth it due to the diminished motion blur, and no judder when you play 24fps content. (24 goes into 120 and 144 evenly)

>the manlet monitor meme

3840x1440/1080 can be ok but 2560xanything is fucking meme tier

resolution and screen quality. I bought a 32" Samsung 2560x1440 back when 4K was just a dream. My 60Hz 1440p panel looks a million times better than my buddy's 144Hz 1440p monitor due to panel differences.

If I were to buy a new monitor tomorrow, it'd probably be a non-curved high end 4K 60Hz screen. OLED or that new quantum dot shit.

Pretty much everything you can do with a computer will benefit from a higher resolution. Not so much with high framerate. I don't play gayms so I really don't care, but if you do, high framerate can make a noticeable difference.
When in doubt, however, go for high resolution. You can't go wrong.

Unless you are actually srs enough to organize your sleep schedule around playing competitive games and measure and calculate whatever you need to for a competitive edge, you'll likely never really experience an actual competitive difference above 60fps.

Just make sure you got 60 fps for no real handicap and then learn how to play better.

720p@60
or
1080p@30

why would you have to choose? Are you using a TV or something? Just get a decent 1080p 60Hz panel. They're like $120 now.

>High frame rate is a meme, don't bother even if you're a /v/ermin.
Confirmed for casual gamer with shit eyes. Opinion on high refresh rates invalid.

All monitors and TVs should be 120Hz so you can play 24fps, 30fps, and 60fps content on the same screen. (Downclock to 100Hz to play 50fps PAL content.)
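
The 120Hz argument is just divisibility: content only plays judder-free when the refresh rate is an integer multiple of the framerate. Quick check over common rates (a rough sketch that ignores the fractional 23.976/29.97 NTSC rates):

# A refresh rate shows content evenly (no pulldown) when it is an integer
# multiple of the content framerate.
def judder_free(refresh_hz, content_fps):
    return refresh_hz % content_fps == 0

for hz in (60, 100, 120, 144):
    ok = [fps for fps in (24, 25, 30, 50, 60) if judder_free(hz, fps)]
    print(hz, ok)
# 60  [30, 60]      (24fps needs 3:2 pulldown)
# 100 [25, 50]      (the PAL rates)
# 120 [24, 30, 60]
# 144 [24]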

TVs that claim to be 120+Hz usually can't display 120+ real unique frames each second, they rely on shitty tricks to make their real 50-60Hz refresh rate look higher.

My old monitor is only 720p and I can't afford to buy a new one since my savings will be spent on a new PC instead.

What 50FPS PAL content would you actually use?
Only thing off the top of my head is emulation of old games, which often have NTSC versions or patches anyway.

While it's true that many use interpolation to take low framerate content and bring it up to 120fps, a lot of new TVs have a 24p mode for Blu-rays so you can play native 24fps movies without any 3:2 pulldown. At least I know the Samsung 4k I got for my mom for Christmas has a 24p Blu-ray mode.
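
For anyone who hasn't seen what 3:2 pulldown actually is: on a 60Hz panel each 24fps film frame is held for alternately 3 and 2 refreshes (24 x 2.5 = 60), and that uneven cadence is the judder. Toy illustration:

# 3:2 pulldown: map 24fps film frames onto 60Hz refreshes by alternating
# 3-refresh and 2-refresh holds.
from itertools import cycle

def pulldown_32(film_frames):
    out = []
    for frame, hold in zip(film_frames, cycle((3, 2))):
        out.extend([frame] * hold)
    return out

print(pulldown_32([0, 1, 2, 3]))  # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
# On a 120Hz panel every film frame is simply held for 5 refreshes,
# so the motion stays even -- that's the 24p benefit.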

So you could diminish total input lag by increasing frame rate? I never thought of that.

I also never thought about frame rates not scaling properly. I guess that would make those 75Hz monitors slightly off for movies and such.

I've never understood monitor display quality. There's not really any way to tell. You can't trial them. The stores all have some shitty low bitrate video loop playing on them.

Yeah, watching something in 30/60fps will have judder on 144 or 165Hz monitors. Although it's not as noticeable since the frames are updating so fast. The same is true of screen tearing, the frames are refreshing faster so if a game is running at 144fps and tears, you probably won't notice it. Gsync and Freesync are still great for games where you can only hit 60fps though.

>tfw god-tier AH-IPS 100% sRGB 4k monitor
>tfw gtx 680 so I can't power any game at all @4k

If I ever get a 144hz monitor should I run it at 120hz to avoid jitter in 30 and 60 fps content?

I'd try both just to see if it makes a difference to you. Just think, most people don't even notice anything wrong with 24fps movies on 60Hz TVs. If you're marathoning YouTube or something then yeah, 120Hz is probably better though.

Wow, 144hz is dumb as fuck. They should've stayed at 120hz, then made the jump straight to 240 or 180 instead of an awkward number like 144.
I mean, MOST of the content is still 30 or 60 fps.

Just pony up for gsync, Juan.

So 120 or 240 is the true masterrace?

144Hz exists because of 3D actually. Movie producers (and Nvidia, back when Nvidia was still pushing 3D gaming) noticed that 72fps made 3D look more realistic than 60fps. And so 144Hz monitors were born: they could play 72fps 3D movies, alternating between the left and right eye. It took off for gamers since 144>120, even though it's probably almost impossible to tell the difference. 165Hz exists because it's about the upper limit of DisplayPort 1.2: 1440p at 165Hz is roughly the maximum the bandwidth allows.
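
Rough back-of-the-envelope on that bandwidth claim. The assumptions here are mine, not exact timings: 24-bit color, roughly 12% blanking overhead, and DP 1.2's ~17.28 Gbit/s of usable data rate (HBR2 x4 lanes after 8b/10b coding):

# Very rough check of what fits in DisplayPort 1.2.
DP12_USABLE_GBPS = 17.28   # HBR2, 4 lanes, after 8b/10b coding
BLANKING_OVERHEAD = 1.12   # rough blanking estimate (assumption)
BITS_PER_PIXEL = 24

def required_gbps(width, height, refresh_hz):
    return width * height * refresh_hz * BITS_PER_PIXEL * BLANKING_OVERHEAD / 1e9

print(round(required_gbps(2560, 1440, 165), 2))  # ~16.35 -> just squeezes under 17.28
print(round(required_gbps(2560, 1440, 180), 2))  # ~17.84 -> over the limit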

>Not talking about refresh rate and response time.
These two things are the only fucking things that matter if you're a gamer. High frame rate is also pretty nice for the way your game moves.

1080p monitors with a 240Hz refresh rate and a 1ms response time are for gaming, 4K and widescreen for anything else.

Response time matters less than correct strobing.
Refresh rate only amplifies that advantage.
That said, 1440p + increased desktop PPI and decent refresh is extremely beautiful in motion.

>want an ultrawide for muh immersion
>dual screens with ultrawide is unrealistic or looks fuck awful

Why not both? 1440p is a good mix of resolution and can reach higher refresh. If you're bottlenecked at the hardware that's where you should aim first, though. A 1080 ti is more than adequate to push 144+ in most games, exceptions being SP high fidelity games which are few and far between. More importantly the size of the display is going to be a significant variable, a 27" 2160p is going to look very similar to a 1440p without close examination, but scaling to something like 35" you'll notice a stark contrast between the two.

>buying monitors in stores
You look at reliable review sites like tftcentral that actually measure the panels through a battery of tests. If your monitor isn't there it's probably shit.

Yeah, you get less input lag by increasing your framerate and refresh rate. A really cool bonus feature of HFR monitors is the ability to play locked 60fps with lower input lag than 60Hz monitors. This means that even if your GPU isn't strong enough to run above 60, you still get benefits. This is really useful for mouse driven games, since your mouse will feel less laggy than it usually does when switching from 120 to 60.

There are still no video players that support variable refresh rate technologies.

What the fuck? I got my first 144hz monitor this weekend and I cannot go back. Even for browsing and general desktop usage the difference is immediately obvious

720p@60, of course.

144 is a multiple of 24, the frame rate of movies.

Freesync/Gsync should work on most movie players in fullscreen. I know the Windows 10 Movie app works at least. Freesync will lower the refresh rate to 48Hz when playing 24fps content fullscreen. Tried MPC-HC but it didn't appear to work, I believe that's probably MadVR's fault. MadVR is doing the upscaling so there's nothing for Freesync to actually do.

Went from 1680x1050 60hz to 2560x1440 144hz, second user is right. Going from 144 to 60 is like going from 60 to 30, you can totally feel the difference. That being said, I noticed the res bump + pixel density of the 2K a lot more instantly, and could only really feel the 144 after going back.

>Even for browsing and general desktop usage the difference is immediately obvious
He's obviously an idiot, but HFR price premiums are pretty steep just for smoother mouse movements and window dragging.

They only get significantly more expensive with IPS panels and/or goysync

4k 30 fps > 1080p 144fps

only people who have sub-4k monitors will argue this point

yep, switch on that ULMB and you'll never settle for anything less.

Do you, like,
use a wall projector
and have a quite flat painted wall?

Otherwise that looks like a horrible idea.

These.

Using a 144hz monitor or a CRT at 100hz is worth it for the desktop experience alone. Moving a mouse and dragging windows has never been more fun. It just feels nice.

30Hz is unusable unless you only use the console.

High refresh rate for vidya
High resolution for everything else

I came from CRT and I've always wondered if it was my aging eyes, because scrolling in games and browsers was so blurry.

Then ULMB happened.

I'm just making a point since most demanding games are unable to attain full 60fps @ 4k unless you go all out with SLI or a 1080ti, and consoles are only 4k 30fps at the moment

still far better than 1080p no matter the fps

samefag here.

I mean from CRT to a typical LCD then to one with ULMB.

Framerate, up to 240. The human brain can't perceive more than 220-250 fps and you wouldn't see a difference between 240fps and 100000fps. (Or maybe it was around 500 max?)
Resolution depends on the screen size and distance from the viewer.

I think some military pilots got tested and could distinguish up to about 600. That was the last relevant study I read.

It's just training. Fighter jet pilots can distinguish more frames than normal people, for instance. No idea where the limit of perception is. After all, the human visual system doesn't run on a clock as far as I know.

Pilot mind.

your cone and rod cells are still bound to the laws of physics.

But how many times a second can they transmit information?

what if man is wrong about physics?

>full 60fps @ 4k unless you go all out with SLI or a 1080ti
You forgot "at ultra settings." Lower the settings nigga. Or take the console route and play at 1800p upscaled. The DPI at 4k is so high that you probably won't notice the upscaling honestly.

I want a 1440p gsync 144hz ips monitor

The biggest issue is motion blur. Either all screens need ULMB or black frame insertion, or we'll need 1000fps LCDs. That's according to VR guys anyway; for VR to look like real life a game would probably need to be 8k/1000fps.
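
You can put rough numbers on the motion blur thing. On a sample-and-hold LCD each frame stays lit for the whole refresh period, so while your eye tracks a moving object the image smears by roughly speed x persistence. Very simplified model (the speed value is just an example, an object crossing a 1080p-wide screen in one second):

# Approximate eye-tracking motion blur on a sample-and-hold display:
# smear (px) ~= motion speed (px/s) * time each frame stays lit (s).
def blur_px(speed_px_per_s, persistence_ms):
    return speed_px_per_s * persistence_ms / 1000.0

speed = 1920  # example: object crossing a 1080p-wide screen in one second
print(round(blur_px(speed, 1000 / 60)))    # ~32 px of smear at 60Hz
print(round(blur_px(speed, 1000 / 120)))   # ~16 px at 120Hz
print(round(blur_px(speed, 1000 / 1000)))  # ~2 px at 1000Hz
print(round(blur_px(speed, 1.0)))          # ~2 px with a ~1ms strobe (ULMB-style)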

lower settings @ 4k are easily noticeable vs 1080p

you don't lower shadows and textures @ 4k

>for VR to look like real life a game would probably need to be 8k/1000fps.
There is no way we can reach this before hitting a wall with silicon transistor size.

Your brain also approximates "frames" in between all the time

1440p 100Hz+ free or g-sync is the sweet spot right now and the 1080ti does pretty well at Ultra.

Intent is everything.
For gaming, 1080p and a high frame rate is currently considered best. GSync and FreeSync popularity in professional circles backs this up.

But if you're talking about work space 4K is really fucking nice to have.

That said, I'm fine running 1080p across 3 screens... I guess.