What do you think of 240hz?

Amazing, but we need new panel types like OLED, microLED, or QD, plus new ultra-low-overhead graphics APIs and new display connector standards.

Placebo like anything above 60hz.

isn't it just 144hz with LightBoost or some gay shit

144hz has deeper blacks

Rip dust 2
.____.

t. Stevie Wonder

This meme should've stopped at 120hz

>240hz
I've never tried it, but I doubt I could tell the difference between 240 and 144.

While the "human eyes can't see above 30fps" meme is ridiculous, human vision does have its limits nonetheless.

Marketing threads exist and push gamer shit to hungry redditors on Sup Forums

Too few games benefit from it.
Even 144hz would be basically useless tech if it weren't for CSGO. CSGO is the one game used in their marketing, and people buy these monitors for that game alone.

It's not that screen refresh rate should stop accelerating - human perception needs to start catching up with it.
We need to find a way around that: food supplements that make the brain perceive faster, or cognitive exercises that push its capability through training.
We've got that 600Hz coming by 2020? Better make sure the next generation of gamers makes the most of it!

And get a couple of good lawyers to have us covered on all the possible side effects in the small print.

You don't exactly need a full 1080p screen to watch movies, either. A 768p TN panel is fine.

sorry but my 144hz freesync monitor is great for all games

really makes your cpu sweat trying to put out more than double the frames though

You can see well over 1000fps. I could instantly tell the difference between 240 and 144.

isn't CSGO dying?

kek

Okay, what fucking game?
Do you think that many people own a GTX 1080 Ti?
At 1440p you basically need a 1070 just to hold a consistent 60 fps.

If you can't see the benefits of having more frames you aren't playing fast enough games and are by extension a slowpoke shitter.

Quake.

HOWLING with laughter

shit.

youtube.com/watch?v=rQY8hSZ9xNE

120 or 144hz is enough. Anything above that is unnecessary.

yes and no, the army already did the tests: humans can accurately identify things at around ~480 fps, with the upper limit of what can be perceived being somewhere along the lines of 600-800fps

I would personally take 60fps with pixels that actually have the response time they claim, so every change is an instant shift. But since that's not on the table, a 240fps panel usually has faster-responding pixels, so everything looks less blurry in motion even without black frame insertion.

This. I don't understand why everyone is complaining: no one is forcing you to buy high refresh rate monitors, and it's well within human abilities.
On top of that it drives down the price of all other monitors, so everyone wins.
We can stop once we reach 600hz.

you guys will never stop

1000/60 (frames per second) = 1 frame per 16.67 milliseconds
1000/144 = 1 frame per 6.94 milliseconds
1000/240 = 1 frame per 4.17 milliseconds
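If you want to sanity-check those numbers, here's a throwaway Python sketch (my own example, nothing from the thread; the refresh rates are the only inputs):

# milliseconds per frame at each refresh rate
for hz in (60, 144, 240):
    print(f"{hz}hz: {1000 / hz:.2f} ms per frame")

# the deltas people actually argue about
print(f"60hz -> 144hz saves {1000 / 60 - 1000 / 144:.2f} ms per frame")
print(f"144hz -> 240hz saves {1000 / 144 - 1000 / 240:.2f} ms per frame")

It prints a ~9.72 ms gain going 60 -> 144 but only ~2.78 ms going 144 -> 240, which is the whole diminishing-returns argument below in two lines.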

Anyone who's ever played online games, particularly at a competitive level, will tell you 10 milliseconds is noticeable even as network latency, even if only slightly. So it's definitely noticeable when it's your entire display: going from 60hz to 144hz means seeing a new frame every 6.9 milliseconds instead of every 16.7 milliseconds, a nearly 10 millisecond improvement between every single frame on your screen.

60hz to 144hz is definitely not a placebo, that's a near 10 millisecond improvement in the time for every frame.

240hz on the other hand is a very minor improvement compared to the one between 60hz and 144hz.

Imho 144hz is worth paying a bit extra for, but 240hz isn't worth paying a lot extra for over 144hz.

Take 240hz if you have a bunch of money to burn and nothing better to spend it on, but skip it if you're on a budget, since 144hz is virtually the same for all intents and purposes.

Not sure if trolling, but refresh rate/frame rate has nothing to do with your image quality or blacks; that's all down to the panel and your image settings on your monitor (contrast etc.)

What does this mean for 8k?

7680*4320*32*144 = 152'882'380'800 ≈ 153G
7680*4320*32*240 = 254'803'968'000 ≈ 255G

This means the graphics card needs 153 Gbit (144Hz) or 255 Gbit (240Hz) of RAM bandwidth every second just to fill the 8k display. Absolutely nothing else has been done yet: whatever application or game you run that needs 3d calculations, pre- and post-processing, or who knows, a gravity engine or something, none of that is in the calculation. Roughly speaking, a 32 Gigabyte graphics card is barely enough to display a picture, but not enough for actual gaming.
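Same arithmetic as a quick Python sketch, assuming uncompressed 32 bits per pixel and ignoring blanking intervals and things like DSC that would lower the real requirement:

# raw scanout bandwidth: width * height * bits per pixel * refresh rate
def scanout_gbits(w, h, bpp, hz):
    return w * h * bpp * hz / 1e9

for hz in (144, 240):
    print(f"8k @ {hz}Hz: {scanout_gbits(7680, 4320, 32, hz):.0f} Gbit/s")

which gives the 153 and 255 Gbit/s figures above.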

For Christmas I got myself a gift. I had a few choices:

1) get a high end monitor that would do 240fps
2) get a lower end monitor that would do 144fps, but it would be an IPS
3) get a 4k tv with amazing contrast ratios

each one was within $100 of my spending limit, and I went with the TCL P605

fuck me, even with non-hdr content that contrast ratio pops like nothing else. it has accurate colors and is at least technically a 10 bit panel, and when you see hdr demos on it it's night and day compared to non-hdr. it does shift colors a bit when I'm up close, nothing as bad as my 24 inch TN did, fuck me was that bad, but it shifts, and the very edges have a bit of dimness since it's not edge lit.

But I can say without a shadow of a doubt that contrast is king; there is not a single factor that matters more in your perception of a monitor than contrast. sure, hdr will introduce new colors for it to play with, but demo content like
youtube.com/watch?v=74SZXCQb44s
is not indicative of the actual content you consume. having an IPS and this monitor side by side while watching that made it pop, but with actual hdr content, much of which is mastered for Dolby HDR which demands a 4000 nit panel, it's kind of a pyrrhic victory; what remains is the contrast ratio.

long story short, I find it very hard to recommend pc monitors as monitors because of how shit they all collectively are. once hdmi 2.1 hits, you will have TVs that are 4k 120hz and 10 bit (all oleds are 120hz panels, many other tvs are too, but they are throttled by their output methods) and all will have outstanding contrast, while monitors have a few TNs that are great for games but that's it. I honestly recommend anyone getting a monitor to plan for two monitors: one for gaming, one for literally everything else

are you fucking stupid?
144hz makes even fucking solitaire and chess feel smooth, even fucking moving windows around on my desktop feels fucking smoother and better at 144hz...
fucking hearthstone at 144hz, you cuck

men of culture like me play Quake, but in any FPS/RTS you will immediately notice the 144hz, not just counterSHIT

if you want to feel the best input ever, play QW with the software renderer at a low resolution, preferably on a CRT. 1000+ fps at something like 320x240@160hz is insane, so much better than anything else that exists. You start to get the sense that the pipelines in graphics cards are way too long.

I just came off a monitor with 50ms of input lag and 4 frames of blur from slow pixels, to one that is 15ms with under a frame of blur.

The difference is fairly noticeable, but I can't stand playing online. I get good enough that lag becomes a factor, and I fucking hate being able to blame my sucking on anything but myself, because I can always get better, but I can't get less lag.

8k isn't going to happen
I mean, it will happen, the hardware will get there, the shit will be made at some point, but mark my words there will be no push for it outside of professional applications and theaters.

tv shows will be recorded at 8k just so they can sample it down to 4k, or have room to play with digital pans at 4k, and movies will be shot in 8k because the theater is literally the only application where 8k gets shown off properly.

but in the home? Even in cramped apartments 4k is already more than people can see, not to mention a real house's living room where you are 8-12 feet away from the tv and even 720p is overkill unless you get something 100+ inches

a movie room may be an application for practical 8k, but projectors are expensive as fuck, and unless laser ones come down in price I just can't see it being a home option. for pcs? 4k-5k is going to be the stopping point for now, with 8k-10k as the highest end of the pro line so they can edit in 8k without having to scale down. we already have 22-24 inch "I want shit sharper" 4k and 49-55 inch "I want more screen real estate" 4k; sure, the bigger panel users could use 8k, but they got a big screen for a reason, sharpness isn't the priority, and 4k at 22-30 inch is going to look the same as 8k at any non screen-licking distance.

possibly vr could use 8k, but 4k, or 4k per eye, may be just as good; I'd have to see that in person. I have heard great things about 1440x1440.

point being, 8k will happen at some point, but there's not going to be a massive push for it.
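You can put rough numbers on the "you can't see it from the couch" part with a bit of angular-resolution math. This is my own back-of-the-envelope sketch; the 55 inch screen and 10 foot distance are just example figures, and ~60 pixels per degree is roughly the 20/20 acuity limit:

import math

# pixels per degree of horizontal field of view at a given viewing distance
def pixels_per_degree(h_pixels, screen_width_in, distance_in):
    h_fov = 2 * math.degrees(math.atan((screen_width_in / 2) / distance_in))
    return h_pixels / h_fov

# horizontal width of a 55" 16:9 panel: diagonal * 16 / sqrt(16^2 + 9^2), about 47.9"
width = 55 * 16 / math.hypot(16, 9)
for name, px in (("4k", 3840), ("8k", 7680)):
    print(f"{name} at 10 ft: {pixels_per_degree(px, width, 120):.0f} px/deg")

Both come out at a multiple of the 60 px/deg limit at that distance, which is exactly the point: from the couch the extra pixels are invisible.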

What is cache, user

Golden

...

I instantly know when my monitor isn't using the 100hz profile and reverts to 60hz.

been using a $180 korean 27" 1440p IPS overclocked to 100hz for years now, never going back to 60hz peasantry

human eyes can't see past 60hz

It's a gimmick, just like 144Hz. 120Hz is okay though.

In 2080 cyborg tech will be mainstream, and people will be buying 7G wireless compatible Nvidia eyeballs

literally any game that isn't turn-based. first person games are the most noticeable, but platformers and any other movement-based game with good controls are enhanced too. I find it hard to play Hollow Knight at 60hz.
even outside of games, having 144 is nice. I have 2 monitors with different refresh rates. win10 fucks it up sometimes and sets my 144hz monitor to 60hz on startup, and the moment I move my mouse I notice. it's night and day.

HOOMAN EYEZ CONT SEE PASST SIRTY R T Z ANYWEY

Overkill. Most game engines max out at around 140~200fps in internal tick rate, so a lot of that extra frame rate will go to waste even if you have the hardware for it. Source may literally be the only engine still in wide use that can actually push over 200fps.
120/144hz is more reasonable, and as long as it doesn't come at the expense of visual quality, it's a nice addition. Meanwhile 240hz is only possible on a tiny shitty TN panel, so if you're not actively taking advantage of the refresh rate, you're basically just using a tiny shitty TN display.
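To illustrate the tick rate point: if the simulation only updates N times a second, any refreshes past that mostly repeat old game state. A rough sketch with made-up example tick rates, ignoring client-side interpolation and the fact that mouse/view input usually does update every rendered frame:

# fraction of refreshes that can show a fresh simulation tick
def frames_with_new_state(tick_hz, refresh_hz):
    return min(tick_hz, refresh_hz) / refresh_hz

for tick in (64, 128):
    for hz in (144, 240):
        print(f"{tick} tick @ {hz}hz: {frames_with_new_state(tick, hz):.0%} of frames are new")

At a 128 tick rate, barely half of a 240hz monitor's refreshes carry new server state.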

If you cannot aim at 120 or 144Hz, you cannot aim at 240Hz either.

It's great! Thanks to my 240 Hz monitor I went from Bronze to GM in Overwatch in just a couple of days :)

N-no!!!!

It's like the 4k of 60Hz :)

spbp