Will we see a single gpu within this year that can run 4k and keep a steady 60 fps or higher? Is 4k just a meme?

4k just a forced meme
and it's not for gaems

60 fps or higher for what? There are already GPUs which can do it with older or less demanding games.

Games become more demanding over time too, not only GPUs get better.
You can already run lots of games at 4K 60FPS, just not the newest ones. In 4 years, you'll be able to run today's games at 4K 60FPS or higher.

It's a moving target: GPUs will keep chasing it and eventually reach it, but it also depends on how much games evolve in the near future.

games are memes.

it's not a meme, but it demands performance that is currently unavailable. The upcoming top-end GPUs will barely handle 4K without MSAA.
Only this. But the newest ones won't work.

Not this year, no. The 1080 overclocked is pretty close though: 58 fps on Ultra with no AA in GTA V, per the Strix review I just saw. So next year the 1080 Ti will do it, unless games get more demanding, but the trend seems to be less demanding right now.

>4k

fuck off with your meme resolution. Unless you run a fucking theatre, you don't need anything over 1080p. Literally the only reason it's being pushed is that 1080p@60 has been available for 5+ years and no one would buy new cards if there wasn't a new gimmick around. Even VR is less of a joke than "4k gaming" on some fucking 28" monitor.

>a single gpu within this year that can run 4k and keep a steady 60 fps or higher?
>single gpu
>60fps 4k
Not this year, nor the year after that, nor the year after.

Explain your stance.

The full Pascal chip (GP100) should be able to do it, if Nvidia ever releases a GPU with all 56 SMs enabled.

The GTX 1080 only has 20 SMs.

youtube.com/watch?v=fYal2RtkysI

retarded

Will this meme die already?
People have been gaming at 5760x1080 (75% of 4K) for half a decade.
4k monitors have been around for a while, too, and people play on them with a single GPU just fine.

Vega 10 should be capable of performing like that in some games.

>within this year
Unfortunately not. Q1 2017.

For games, we need higher refresh rates, not higher resolutions.

>tfw 1440p 144Hz sweet spot

yes
overclocked cpu and 1080
there's no reason to have AA at 4k

the real meme is 1440p
>ooga booga muh video games

>Is 4k just a meme?
Yes. A meme is anything that you personally don't like, or anything that is technically more advanced than previously existing technologies. If you already have a set of hardware that you personally like, you should keep using it for the rest of your life in order to prevent unwanted contact with memes.

Turn your shaders down to medium and keep everything else on max; you should be able to pull it off with any last-gen enthusiast card (pre GTX 1xxx and RX 4xx).

Pic related, R9 390X. Not exactly 4k but close enough.


4k is still a meme, you won't be getting 144hz for cheap anytime soon

Oh wow It didn't include my pic

> looked at 1440p 144Hz Freesync monitors
> like $600+ Dollarydoos
jfc I didn't know they were that expensive

Not 3840 X 2160
Not 60 fps

Did you read what OP asked?

Yeah, that's the main thing that's going to keep 4k 144Hz out of most people's price range. Monitor will be about as expensive as the GPU.

Shame. I would've liked one, but I'd rather snag an RX 480 (or whatever Vega comes out as, if it's low-priced enough) and simply smash all of my games at 1080p. I'm happy with that resolution, and while higher ones like 1440p or 4K would be nice, they don't make or break the experience for me.

I '''''overclock''''' my monitor's refresh rate and resolution anyway. Supersample (I think?) to 1440p on a 1080p monitor and running at 75Hz actually looks and plays decent, not even memeing.

Someone's gonna roll in and tell me I'm retarded and should just shell out for a new monitor I don't need, though.

m8 I'm doing that with my 380. If it looks a bit fuzzy I use reshade to sharpen everything a bit and it looks spot on.

I'm using a GTX 760 hahaha. Has mad issues for games even like Dawn of War II at 1440p. Looks pretty decent, though - especially on things like League.

>it's a meme if I can't play muh gaymes on it

>SLI 1080s 60ish fps at 4K in TW3
>game still has annoying texture pop-in

Spending more than $250 on a GPU is a fucking meme when games are just so poorly coded. Graphics are a fucking meme.

>It didn't include my pic

Yeah I'm sure the computer fucked up and not you.

no, captcha fucked up and I had to redo the post

Not him but you're a super fucking idiot. Your original post is basically a confirmation that we are still a ways off from OP's question. Nobody cares about 50fps with medium settings at your weirdo resolution. Ultra settings are implied.

Clearly you've never used a resolution over 1080p, or you're just blind. 1080p at 20"+ is nowhere near the pixel density where individual pixels stop being noticeable. 1440p and 4k look so much better than 1080p.

>23 ms frametime

Is this an AMD card or something?

That being said, there's USUALLY very little difference, graphically, between High (or Very High, depending on the game) and Ultra settings, even at high resolutions.

I'm not saying nothing's different, because obviously there are changes, but they're usually so insignificant that the FPS drop isn't worth the graphical gain.

The gtx 1080 is close, and definitely can handle 4k 60fps if you don't need to max out every setting.

Not him but I have a 24" 1080p monitor. I don't wear any kind of corrective lenses or anything and as far as I know have about 20/20 vision. I cannot distinguish individual pixels unless I'm about 5-6" from my screen. 4K is a fucking meme at PC monitor sizes.

And aliasing?

That's very easy to say until you've used a higher DPI screen. You'll immediately notice how much sharper everything looks.

yup, pre 16.5.1 hotfix

...

I have a 24" 1440p monitor and I can see the pixels clearly from over a foot away.

The 1080 is already temp throttling with 20 SMs; why would they ever bother with 56? That's a literal housefire waiting to happen.

For the price, at this moment in time, I'd say you get a better, more noticeable improvement from higher framerates than from higher resolution (especially for a 24" panel). 1080p 144hz looks great, and doesn't cost nearly as much in terms of monitor + gpu as 4K 60hz.

That's how the fan profile is set in the BIOS; aftermarket cards average 70°C. People on Sup Forums should at least know how reference cards work. All you have to do is take a look at the BIOS file with a hex editor.

Then you should go to a doctor because you clearly have superhuman eyesight, and they'll want to check exactly how that occurred.

I have a 22" 1920x1080 monitor and I can see individual pixels from an arm and a half away, and I have astigmatism.

>4k 144Hz

won't 4k top out at 120 Hz?

IIRC, DP 1.3/1.4 is 25.92 Gbps, which is only 130 Hz at UHD@24 bpp before packet and timing overhead.
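To sanity-check that figure, here's the arithmetic in a minimal Python sketch. It assumes DP 1.3/1.4's effective payload rate of 25.92 Gbps (32.4 Gbps raw minus 8b/10b encoding overhead) and ignores blanking and packet overhead, so real limits sit a bit lower:

```python
# Rough upper bound on refresh rate from raw link bandwidth.
# Ignores packet/timing overhead and blanking intervals.

def max_refresh_hz(link_bps: float, width: int, height: int, bpp: int) -> float:
    bits_per_frame = width * height * bpp
    return link_bps / bits_per_frame

# DP 1.3/1.4 effective payload rate, UHD at 24 bpp
print(max_refresh_hz(25.92e9, 3840, 2160, 24))  # ~130 Hz
```

Bumping to 30 bpp for HDR drops that bound to ~104 Hz, which is why HDR at 4K eats into achievable refresh rates.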

Butthurt poorfag neet detected.

My 4k monitor is 40 inches

>1440p at 24" gives a pixel pitch of 0.2075mm
>0.2075mm at 1 foot subtends 0.039 degrees
>Visual acuity of the human eye is 0.02 to 0.03 degrees
Sorry to hear about your sub-par vision, user
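The math in that greentext, as a small sketch anyone can rerun with their own panel size (the 1-foot viewing distance is the poster's assumption):

```python
import math

def pixel_pitch_mm(diag_in: float, w: int, h: int) -> float:
    """Pixel pitch = physical diagonal / diagonal resolution in pixels."""
    return diag_in * 25.4 / math.hypot(w, h)

def angular_size_deg(size_mm: float, distance_mm: float) -> float:
    """Angle subtended by one pixel at a given viewing distance."""
    return math.degrees(math.atan(size_mm / distance_mm))

pitch = pixel_pitch_mm(24, 2560, 1440)   # ~0.2075 mm
angle = angular_size_deg(pitch, 304.8)   # ~0.039 degrees at 1 foot
```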

If it's not for games I don't see what it is for. Games are probably the most reasonable application of 4K because you can generate the absurd video data at runtime, trying to store or transmit true 4K video is stupid.

Work space

I've never actually taken a forreal eye test to get any specific measurements or answers. Would be interesting, though.

Have you actually seen a "Retina" display before?

mine too! Using the philips 40". I love it.

Fair enough.

Monitor bros.

It's so comfy. It's like being IN the computer.

I guess Vega if it comes out this year or the Titan 2016 Apple edition will probably do it. I'm more interested in constant 120fps on 1440p though.

GP104's SMs have 128 CUDA cores each, so the 1080 does have 20 SMs (2560 / 128); the 64-core SM layout is GP100's.

Only in Apple stores when I dick around on their display models. Why is that?

Like, don't get me wrong, I'm not saying higher resolutions don't look good, because they certainly do. I was just saying that I, personally, can't pick out individual pixels at like a foot and a half away on a 24" screen at 1080p. Well, not on the current monitor I'm using. I can on a different monitor I use at home.

Doing some more math for fun...
Based on my monitor
>4k at 32"
>Pixelpitch = .1845mm
>Visual Acuity of 6/6 vision = 1 minute of arc ( = pi / 10800 radians)

A person with 6/6 vision would be able to discern a single pixel at a maximum distance of ~63.4cm

That is a lot further away than I thought it would be
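The same calculation in code, generalized to any panel, assuming the 1-arcminute (6/6 acuity) figure used above:

```python
import math

ONE_ARCMIN = math.pi / 10800  # 1 arc minute in radians

def max_discern_distance_mm(diag_in: float, w: int, h: int) -> float:
    """Farthest distance at which a 6/6 eye can resolve one pixel."""
    pitch_mm = diag_in * 25.4 / math.hypot(w, h)
    return pitch_mm / math.tan(ONE_ARCMIN)

print(max_discern_distance_mm(32, 3840, 2160) / 10)  # ~63.4 cm
```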

Vega 11 and big Pascal might, but they are not coming out this year.

Wait for Vega

How do vision measurements work, i.e. 20/20, 10/20, 6/6, etc.?

Can you point me to where Vega 11 and 10 are a thing? Polaris 10 is the one we're going to see with the RX480 and Polaris 11 is what is heading to notebooks and slim devices. However, Vega 10 should be what replaces Hawaii and Vega 11 should be what replaces Fiji? When did this happen?

technically that would be 5.7K (150% of 4K's width)

Take 20/70 for example (my vision): what a person with 20/20 can see from 70 feet away, I have to be within 20 feet of to see as clearly.

>2016
>60 fps
I want 4k @ 120+ fps.

>Games become more demanding over time too, not only GPUs get better.
en.wikipedia.org/wiki/Wirth's_law

6/6 means you have a visual acuity of 1 arc minute: you can discern details as small as 1.75mm at 6 meters.
20/20 means the same thing, but uses feet instead of meters for the reference distance.

6/12 means that what you could discern at 6 meters, the person with 6/6 would be able to discern at 12 meters.
6/4 means that what you could discern at 6 meters, the person with 6/6 would need to be standing at 4 meters to discern.
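Sketched in code (the function name and signature are mine, not standard):

```python
import math

ONE_ARCMIN = math.pi / 10800  # 1 arc minute in radians

def snellen_detail_mm(num_m: float, den_m: float) -> float:
    """Smallest detail (mm) a num/den eye resolves at the num_m test
    distance: den/num times what a 1-arcminute (6/6) eye resolves."""
    return num_m * 1000 * math.tan(ONE_ARCMIN) * (den_m / num_m)

print(snellen_detail_mm(6, 6))   # ~1.75 mm, the figure quoted above
print(snellen_detail_mm(6, 12))  # ~3.49 mm, twice as coarse
```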

You do realise that the whole point of AA is to make up for the lack of pixels on the screen, right? As pixel density increases, we won't need anywhere near as much AA to smooth out games.

I know DP 1.3 is stuck at 120Hz, but I thought DP 1.4 bumped us up. I could be totally wrong about that though.
Either way prepare to pay out the nose.

When AMD pushed the RX 490 up to October. People are now deciding that means Vega 10/11 because they don't actually have any idea what they're talking about.

DP 1.4 is just new color profiles for HDR, which will cut framerates back down to what normal 24bpp got on DP 1.2.

I see. That's cool as fuck. I wonder what my eyesight is like. I don't believe I need glasses, and things are rarely blurry or out of focus, but I've never been actually measured.

You're not taking the vertical into account. 5760x1080 = 6,220,800 total pixels; 3840x2160 = 8,294,400 total pixels. 3840x2160 is literally 4x 1920x1080.

Forgot to throw in that 1440p is 4x 720p.
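For the record, the pixel counts being thrown around, checked in a few lines:

```python
# Total pixel counts for the resolutions argued about in this thread.
surround = 5760 * 1080   # triple-wide 1080p: 6,220,800 px
uhd      = 3840 * 2160   # "4K" UHD:          8,294,400 px

print(surround / uhd)                  # 0.75 -> the "75% of 4K" figure
print(uhd == 4 * 1920 * 1080)          # UHD is exactly 4x 1080p
print(2560 * 1440 == 4 * 1280 * 720)   # and 1440p is exactly 4x 720p
```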

GTX 1080 can keep Witcher 3 at 40+ FPS on 4K. On 99% of all other games, it will produce 60+ FPS.

techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/22.html

Probably the next dual gpu card by AMD or Nvidia on the new die in 2017-2018.

>Will we see a single gpu
>Probably the next dual gpu
You what.

This reminds me of pic related

If it's not for games, then what have I been playing at 4k for the past 2 years?

Ruses