Get 1080 ti

>get 1080 ti
>still can't run games at a solid 60fps playing at 4k
Is 4k a joke?

Other urls found in this thread:

techpowerup.com/reviews/EVGA/GTX_1070_Ti_FTW2/24.html

Yeah.

> he fell for the 4k meme

There just isn't enough computer juice to make 4K work decently for games.

If you had looked at benchmarks, you'd have known this.

>games

4K gaming is probably one of the dumbest memes you could fall for right now, especially when you could be witnessing the beauty of 144Hz @ 1080 or 1440p with adaptive sync

Is this pasta?

No just the truth.

Lower the settings from max to high, you'll be playing at 90fps on average.
Doubt you'll notice the difference either.

Until we get 24GB graphics cards (hopefully soon), just turn down the settings a bit

What can't you run? I'm on a 1080Ti and I haven't noticed any of my games dropping below 60 @ 4k

>get 1080 ti
>dont even play games

For games? Yeah. For office work? No, it's fucking lovely.

Run no aa and medium settings and every game except ones from the past year or so will run fine even on a lesser card.

yup. lol

Hopefully the Volta-based gaming cards will perform better; the 4K 144Hz displays probably got delayed because of this

Why would you use antialiasing at 4k?
Are you retarded?

At 4K, you can reduce AA. 1080 Ti can run Witcher 3 on 4K at 60FPS.

techpowerup.com/reviews/EVGA/GTX_1070_Ti_FTW2/24.html

Just reduce some graphics settings, especially AA and motion blur, since you don't need them anyway.

>Hey bro check out my new 4K TV!
>It's 120 inches diagonal!
Resolution != Density
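This anon is right, and it's easy to check. A quick Python sanity check of pixel density (the 24" and 120" sizes are just illustrative examples, not anything from the thread):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 24)))   # ~92 PPI, 1080p on a 24" monitor
print(round(ppi(3840, 2160, 24)))   # ~184 PPI, 4K on the same 24" monitor
print(round(ppi(3840, 2160, 120)))  # ~37 PPI: a 120" 4K TV is LESS dense than a 1080p monitor
```

Same resolution, wildly different density depending on the panel size.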

3840 x 2160 / (1920 x 1080) = 4

You quadrupled your graphical requirements. Is it worth it? You tell me.

> 1920 * 2 = 3840
> 1080 * 2 = 2160
>Durr, double means quadruple!
Fucking retard.

...

I pray to God you're trolling...
yes quadruple you 5th grade dropout. if each SIDE is doubled then the AREA (the total number of pixels that need to be generated) is quadrupled.

How is that possible if both the height and the width are only doubled then retard? They would have to be quadrupled for it to be 4 times as much.

Oh wait, it's not because you're retarded.

Ironic shitposting is still shitposting.

lol its basic fucking math

>Sup Forums in 2017
pic related

what the fuck did you expect? FHD has been around for ages and not everyone has it yet

buy the new titan, goyim

I'm 4k 60fps'ing everything I throw at it. What the hell are you playing? Alternatively, what shit CPU do you have?

fuckface, you multiply the resolution together. why do you think there's an x in the middle of the two numbers? it gives you the pixel count if you compute it.
1280 x 720 = 921,600 pixels
1920 x 1080 = 2,073,600 pixels
and so on...
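For anyone still arguing about it, the whole "double means quadruple" thing is one line of arithmetic:

```python
# Doubling both dimensions quadruples the pixel count,
# because (2w) * (2h) = 4 * (w * h).
fhd = 1920 * 1080   # 2,073,600 pixels
uhd = 3840 * 2160   # 8,294,400 pixels
print(uhd // fhd)   # 4
```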

...

>60hz

You need SLI for that goy. Stop being a dumb peasant and buy another one.

Area is the total amount of pixels displayed
1920x1080 literally means 1920 multiplied by 1080 pixels. If you take (1920 x 2) x (1080 x 2), you can simplify that to (1920 x 1080) x 4
Go back to middle school

plus the eyes can't see past 1080p

le bait

Include me in your screenshot

Sup Forums 2017

>Get 1080ti
>Considering selling it for a Vega 64 instead

I shouldn't feel this way since the 1080ti is anywhere from 20-30% faster than the Vega, but they just look so nice. Especially the watercooled version.

or 60 Hz

1440p is the perfect resolution for the time being. 4K requires absurd processing power for the diminishing returns of higher resolution.

1440 is also better, because you don't have to deal with Windows DPI scaling.

>get 1080ti
>play games that run on a 10 year old hardware

>the resolution will greatly enhance the current state of gameplay

Everything over 1920x1080 is a joke. For competitive games 144Hz with 300fps is ideal.
Resolution and stable framerates affect input lag. The only tricky part is finding a monitor with accurate blacks so you can see poorly lit enemies.
Also a 2500k paired with a 770 2GB is more than enough power for anyone.

kek

I don't get it. How is this not wrong?
4K should be 1080*4=4320p.

Fingers can't smell past 20 kHz, and it gets worse as you age.

My nose used to mine at >500MH/s just two years ago too

1. >>> Sup Forums
2. Depends on the game. I enjoy the enchanting crispiness of NES/SNES/Megadrive's magnified pixels and it still runs fine.

Wtf I have a GTX 1080 and an Intel Core i7 4770K and I can play most games at 60 fps on max settings in 4k.

>multiply by 2, twice
>thinks it's still only 2

>Is 4k a joke?
it is a resolution

>buy 1080ti
>only shitpost about vega on Sup Forums
what am I doing with my life

>500MHS
Leave it to the pros, goy.

1080p was never a good resolution. 1600x1200 and 1920x1200 are far superior due to more vertical space. If you want 16:9 at least get 1440p.

4k means approximately 4000 horizontal pixels. DCI 4k is 4096x2160 and UHD 4k is 3840x2160
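Since the two "4K"s get mixed up constantly, here's the comparison spelled out (pure arithmetic on the two standard resolutions, nothing assumed beyond them):

```python
dci_4k = 4096 * 2160   # 8,847,360 pixels, cinema standard
uhd_4k = 3840 * 2160   # 8,294,400 pixels, consumer "4K"
# Only DCI actually reaches 4096 horizontal pixels; UHD falls 256 short.
print(dci_4k - uhd_4k)        # 552960 extra pixels in DCI
print(round(4096 / 2160, 2))  # 1.9  (wider than 16:9)
print(round(3840 / 2160, 2))  # 1.78 (exactly 16:9)
```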

I would agree, but 1600x1200 and 1920x1200 are hard if not impossible to find on a panel suited for gaming. If it has a high refresh rate, it might still suck because of blacks or other issues.
1440 really doesn't offer you much in terms of looks, but it requires a lot more power. I pretty much stopped playing newer games, so I might get a 1440 but not for games, more because i just want bigger and "better".

Yeah it will play a lot of games at 4k 60 max but there are a few modern games that just won't work. I have an 8700k so I know it can't be a bottleneck.

>get gtx1090ti faggot extreme edition
>only play minecraft, deus ex, or system shock 2
>i-i swear i'll use this card for all it's worth some day

Monitors are progressing faster than GPUs. Well, modern games are progressing faster anyway. I'm pretty sure even integrated graphics can handle just your general desktop workspaces at 4k or whatever. Maybe if things were more optimized (like if games were all free software and people could contribute to improving them), it would be a bit better.

Anyway,

Didn't fall for the meme here.

My Trinitron CRT runs at 1600x1200, looks great, instant response time, and has perfect blacks.

Brains can't deal with speeds above 1hz because the human has only one heart

>CRT
My last Viewsonic died. I envy you. What Hz at 1600x1200?

85hz. I have a Sony E500. I also have a spare E540 in the closet. Shit is awesome for games and movies. I can't even play games on my LCDs anymore because they just look like shit in comparison. Buddy of mine had his Mitsubishi CRT die so we are replacing the bad filter caps.

Consumer whore

85Hz is the cutoff for refreshlets in my opinion, but for competitive games i'd throw that into 1024x768 for 100/120Hz. Unfortunately that'd be a bit ugly for newer games, but Quake and Brood War would still look good.
My old Viewsonic did 100Hz in 1600x1200, and it made LCD owners uncomfortable when i brought it to LAN, i loved it to death. I literally got it for free (stole it, kinda) when companies started just throwing out CRT's.
As great as it was, i don't want to go down to 768 for 100Hz if i got a new one, and especially not 600 for 160Hz. Better to just get used to shitty TFT's where you have to squint to see baddies hiding in the shadows.

Most games outside of Ass Creed Origins aren't even that demanding. The issue is people are obsessed with maxing out every game, even if the ultra settings look exactly the same as very high but cut your framerate in half. I'm willing to bet a 1080Ti can easily play any game at 4k/60 with a few meme settings turned off.

4k only makes sense for TVs.
2K is ideal for anything that involves a computer/battlestation.

Being this much of a capitalist.
Mao and Stalin should have won.

hows the

everyday this

>You quadrupled your graphical requirements

Technically it isn't quadrupled

Fuck off back to Sup Forums manchild

>not getting two 1080ti in SLI/Crossfire

user, I...

>the current state of Sup Forums

Literally not a single dev in the world still implements SLI in a worthwhile way anymore. There is absolutely no reason to do so.

lol what? just because windows 10 shits itself in high DPI doesn't mean it's not worthwhile.

also 2k is 2048x1080. maybe you're talking about 2.5k

BAAAHAHAHAHAHAHAHAHAHAHAHA!!!!

Holy fucking shit thank you for the lulz

iktf

Yeah, for gaming. It ain't a meme for anything else tho

I mean on a ~24" monitor it's effectively antialiasing by supersampling

are you the one who asked about microwave time?