Wait for the benchmarks. Everyone keeps repeating that it's faster than a 980 Ti and 980 SLI, but speed =/= power. Wait for benchmarks to see if there's a real difference between a 980 Ti and a 1080.
Logan Williams
What, the 1080?
Xavier Fisher
People are already saying that the benchmarks for Doom are unreal.
I just checked my receipt, I actually only have 5 days to decide if I can return this shit or not. Fuck, the past month flew by.
WHAT DO?!?!?
Jaxson Collins
it runs the new doom at 150-200 fps, unless the 980ti is pulling the same numbers there's no point waiting
Jacob Ross
Fuck so I have to return this thing don't I?
That's what I was afraid of. That means I have to go 3+ weeks of no vidya....
Jose Campbell
>return your card
>check the benchmarks
>buy the best card
can't you wait 2 weeks without playing minecraft
Christian Martinez
it runs Doom at 150-200 fps, but that was on a heavily overclocked 1080 running at 2400MHz; the stock boost frequency is 1700MHz
Jonathan Ward
simple answer is no. complicated answer is no.
Henry Green
A logical man would:
- Return it, since it is now obsolete
- Wait for specs of the supposedly great new card
- Wait for news of hardware problems
- Wait for a GTX 980 Ti discount, before buying an SLI motherboard
- NEVER install new drivers
Luis Morales
so just pray for not being cucked by nvidia
Isaiah Howard
this is why you get a 3ds
Jose Allen
you never, ever buy a high-end card at the end of a generation, period. take it back and get credit; don't even weigh it against the next generation: you simply don't do what you did.
It's like paying full price for a 2016 car when a 2017 car is about to come to the lot. What do you think happens to the price of the 2016 car when the 2017 arrives?
Easton Moore
rises for being vintage?
Christopher Stewart
haven't played vidya in 6 months, you aren't missing much.
Christian Cruz
What are you going to be doing vidya-wise that justifies what the 1080 is supposedly selling? Do you ever go past anything in the 2560x1440 ballpark? Do you like to downsample? Are you future-proofing for VR? There's a good chance the next wave of cards isn't going to be radically different from the current high-end options like the 980 and Fury family.
They also seem to be going for the crazy-bandwidth marketing hook, so this is probably aimed at enjoyable single-card 4K, since VR doesn't seem to be an issue for current high-end cards and probably won't be for a while, at least until the next major revision of any given headset.
Just hold onto it; there's guaranteed to be a crazy-ass markup and nonexistent supply due to the gaymer rush and frothing demand, and current high-end cards should hold their own for a long time.
Christian Campbell
It was supposed to be a better clocked version...it's the newest one by gigabyte.
I know it was dumb but I wasn't willing to wait an unknown amount of time till they released new cards. I was in a position where I had to get one then and there, and at the time we had no clue when the 1080 was going to be announced...it could have been the end of the summer for all we knew.
Adrian Baker
>- NEVER install new drivers
Why? What do new drivers do?
Ethan Ortiz
>It was supposed to be a better clocked version...it's the newest one by gigabyte.
doesn't matter whatsoever, you don't buy high end cards at the end of a generational cycle. you never ever buy high end ANYTHING at the end of a generational cycle
just sit on that credit or preorder, 1070 will annihilate 980 anyway
Jayden Collins
>he fell for nVidia hoax.
Latest games will run like dog shit, even on your overpriced GPUs. Enjoy your broken games/delays.
Luis Thomas
You should definitely return it and get something else.
The 1080 is good, but when you consider it's a die shrink the gains aren't actually that massive. The flagship AMD card will probably be superior, especially in DX12. Then Nvidia will release the 1080 Ti to fight back, so that's what to look out for.
Easton Hughes
>The 1080 is good, but when you consider it's a die shrink the gains aren't actually that massive
nope, outright lie.
this is one of the biggest generational jumps in hardware of all time. the numbers are crushing.
AMD won't be able to keep up in the enthusiast market, and DX12 will never matter; Vulkan is out now and devs do not want to side with MS, who is actively trying to kill PC gaming with UWP.
Colton Gonzalez
20%-30% with an overclock is not that great, dude. There's going to be a lot of buyer's remorse, just like when the 980 got cucked.
Nvidia never puts all their eggs in the first card to market.
Alexander Cooper
you have a 980 ti.
you'll be fine either way.
Benjamin Mitchell
I was going to get a new monitor soon so this only justifies maybe getting a 2560x1440 to go with a 1080..
Juan White
Would an i5 4690k at 4GHz bottleneck a 1080?
Michael Morgan
Except it's not 20-30% it's more like 70%?
Isaac Adams
This pretty much. It's going to be like 10 years before a game comes out that you can't max with a 980 Ti.
Christopher Gray
yes you need a minimum of 3 i5s to avoid bottlenecking this gpu
Charles White
tell me how a cpu bottlenecks a gpu
or optionally explain to me what your question meant
Easton Rivera
All CPUs bottleneck current GPUs, but it won't be an issue.
Just know that 1080 is complete overkill for literally anything but 4k, 144hz or VR.
Tyler Robinson
>All CPUs bottleneck current GPUs
tell me how that is true
i want you to explain what that means since you typed it and hit post
Isaiah Ward
>Just know that 1080 is complete overkill for literally anything but 4k, 144hz or VR
i guarantee there are games a 1080 won't be able to max at 1080p and maintain a constant 60 fps.
Carson Jones
poor optimization is a factor you control for when making those types of statements.
Zachary Perez
That's the fault of the games, not the hardware. A GTX 460 should be able to max all current games.
Adrian Price
>All CPUs bottleneck current GPUs
u high, kid?
Noah Turner
>nvidia
>drops prices
Don't listen to this guy op. He sounds like one of those guys who is always going to wait for the newest card but never actually buys one.
I would return it, wait for real numerical benchmarks (not graphs that have the y-axis labeled as "power") and get what works best for what you need
Eli Wilson
Google is your friend.
Mason Peterson
I need to get a new GPU so I can play DaS3.
What's a good cheap one under $200?
Logan Barnes
>I would return it, wait for real numerical benchmarks (not graphs that have the y-axis labeled as "power")
if you actually knew how to digest the numbers you'd realize that the real *hardware numbers* are infinitely more important than *benchmarks* which are for *stupid people* who *don't understand hardware*
benchmarks conflate hardware performance with game code performance. ALL YOU NEED is the hardware numbers. The number of shaders, the fill rate, texture rate, bandwidths etc etc etc.
if you cannot digest and interpret those numbers, do not give video card advice.
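For anyone who actually wants to "digest the hardware numbers", here's a rough back-of-the-envelope sketch. The spec values plugged in at the bottom are illustrative placeholders roughly in 1080 territory, not official figures, so swap in whatever the whitepaper actually lists:

```python
# Back-of-the-envelope throughput math from spec-sheet numbers.
# The figures used below are illustrative placeholders, not official specs.

def theoretical_numbers(shaders, boost_mhz, rops, tmus, mem_eff_mhz, bus_bits):
    clock_ghz = boost_mhz / 1000.0
    tflops = shaders * 2 * clock_ghz / 1000.0      # FP32, 2 ops per shader per clock (FMA)
    pixel_fill = rops * clock_ghz                  # gigapixels/s
    texel_fill = tmus * clock_ghz                  # gigatexels/s
    bandwidth = mem_eff_mhz * bus_bits / 8 / 1000  # GB/s, from effective memory clock
    return tflops, pixel_fill, texel_fill, bandwidth

# Placeholder values, roughly 1080-shaped:
print(theoretical_numbers(shaders=2560, boost_mhz=1733, rops=64, tmus=160,
                          mem_eff_mhz=10000, bus_bits=256))
```

Of course, none of that tells you how a specific game's code behaves on the card, which is the whole disagreement in this thread.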
Josiah Wilson
geforce 960 maxes it out.
1060(?) won't be out for close to a year so there is no danger buying a 960 now.
Daniel Ortiz
Yeah and Google tells me there are very few cpu bottlenecked games. Nearly every game that has some sort of bottleneck is almost always gpu bottlenecked.
Ryder Anderson
Do some research next time faglord
Ayden Ortiz
>tfw bought a 980 a year ago
i-it's still going to l-last me quite a long time, r-right bros?
Luke Gutierrez
Don't be a retard and buy the 1070.
There's your answer.
Camden Edwards
Yes, it's still a top card only really bested by Nvidia flagships like Titan X and 980ti. The AMD equivalents (Fury/Fury X) are more powerful but can't OC for dog shit as well as being >amd.
OC it when it starts slowing down but as long as those consoles exist you likely won't need an upgrade for a while. At least not for another GPU generation or 2.
Wyatt Phillips
Yes. It'll be years before a game can't be maxed by a 980 (disregarding ass optimization).
Dylan Williams
It's still one of the strongest cards around. Unless you're rocking a 960 you likely won't need an upgrade any time soon.
Alexander Peterson
my 3.1 GHz CPU would be too much of a bottleneck for me to bother wanting a 1070, that's why I'm staying cheap. Might spend big when I buy a new PC in like 2 years.
Juan Moore
>Yes, it's still a top card only really bested by Nvidia flagships like Titan X and 980ti.
the 1070 is significantly faster than the 980
Leo Cook
R9 380
You should probably just wait though
Nathan Martinez
thank you
Jonathan Moore
Why don't you want to install the new driver (I'm a newfag so I can't help it)
Kevin Garcia
Same here. I'm probably going to get a 1070 in a year's time though.
Isaac Cox
>my 3.1 GHz CPU would be too much of a bottleneck for me to bother wanting a 1070
oh god it's starting to infect people
they're starting to repeat stupid shit they don't understand at all!
Zachary Wood
>use 980
>when it starts showing its age, OC it for a huge boost in performance
>continue using it
980 overclocks like a MOTHERFUCKER. Simply OC it when you feel as if you can't max out games (provided they don't have shit-tier optimization, in which case even 1080s/1070s will likely not perform optimally).
Ryder Johnson
1070 isn't out yet, i'm talking about right now and if that user is gaming at 1080p, why would he upgrade to a 1070?
Jacob Richardson
I got the 980 just a week or so ago and I'm sending it back, getting $200 back and buying the 1070
Luke Moore
>just bought a graphics card
>is it good
Do people seriously do this? I fucking spend days/weeks debating whether or not to spend a hundred bucks, but I guess that's just my inner jew.
Jaxon Davis
i didn't say anything about him upgrading. i corrected your wrong statement. the 1070 is not a flagship card (80 cards are flagship) and is significantly faster than a 980
Christopher Morales
please, educate me, or show me where I can educate myself on PCs.
Anthony Miller
The 1070 isn't out yet, nothing I said was wrong.
Jackson Rivera
Nigga you don't just be like "my 3.1 ghz cpu will bottleneck it". Clock speeds aren't universal. Every cpu performs differently when at the same clock speed. An i5 6600k at 4ghz won't perform the same as an i5 4690k at the same clock speed.
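Crude illustration of why the GHz number alone means nothing (the IPC values here are made up purely to show the idea, not measured figures):

```python
# Rough model: useful work per second ~ IPC x clock.
# IPC values below are invented for illustration only.

def relative_perf(ipc, clock_ghz):
    return ipc * clock_ghz  # arbitrary units

older_core = relative_perf(ipc=1.0, clock_ghz=3.1)  # hypothetical older architecture
newer_core = relative_perf(ipc=1.3, clock_ghz=3.1)  # hypothetical newer architecture, same clock
print(f"same 3.1 GHz, ~{(newer_core / older_core - 1) * 100:.0f}% more work per second")
```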
Leo Wright
it doesn't have to be out to understand how fast it is. once again.
holy shit you "benchmarks" kids that can't analyze a fucking technical whitepaper. we knew how fast these cards were two weeks ago.
Juan Kelly
Will my 3.5 ghz be good for a 1080/1070...?
Lucas Evans
ok, I should probably specify that it's an i5 4400
Luke Lee
Regarding the cpu shitposters
Will an AMD FX 8350 bottleneck a 1080? I built this pc before I knew about the amd cpu meme but it's served me well on my 760, and I would really rather not spend money replacing it if I don't have to.
If the cpu isn't good enough, what's the highest tier gfx I can get without bothering to replace the cpu?
Brandon Gray
>educate me
tell me why you think your 3.1ghz cpu is a bottleneck. tell me what it would bottleneck, and tell me why it would bottleneck a 1070 or how this would affect your performance.
don't know a good technical explanation for any of these things, right?
then why did you say it in the first place? you're just repeating what someone here told you. People on Sup Forums don't understand computers, dude.
CPU cannot even fucking "bottleneck" GPU anymore because northbridges operate asynchronously these days. The concept is laughable in 2016. It's something I see frequently on Sup Forums: kids dredging up computer concepts that were relevant in the 90s in order to try to sound knowledgeable.
your 3.1ghz cpu isn't going to bottleneck anything, tl;dr.
Asher Richardson
drivers sometimes fix performance issues and shit like that. They sometimes fuck everything up and cause bluescreens. Most importantly, above all: we're too fucking lazy to install them
James Watson
>Pushing 4 year old hardware minus the meme70
>memory errors out the ass because win10
>didn't even want win10
>seems like it's about the right time to build a new rig
>willing to wait just a bit longer after the release
Learn what hardware consists of and its features before you learn about how the hardware affects software.
Benjamin Kelly
Current PC is using an AMD FX 6300. I kinda regret not getting Intel cause I hear they're better for games but it's been alright.
Looking to upgrade but I'm stuck with an AM3+ socket. Would it be worth it to spend the additional cash to change my mobo as well? I'm planning on getting an 8350.
Anthony Brown
you don't buy an absolute powerhouse of one component when the rest of your pc doesn't match.
do you have awesome SSDs? awesome monitors? awesome bookshelf speakers with an audio receiver? awesome keyboard, mouse, desk?
no to any of those things? then why would you buy a 1080? so you can play games at 300 fps instead of 200 fps on your 60hz monitor?
Lucas Morris
thank you guys so much
Jeremiah Jones
>it's still a top card bested by Nvidia flagships like 980ti and Titan X
>"HURR, THERE'S SOME CARDS WITH NVIDIA SHILL BENCHES THAT HAVE YET TO RELEASE THAT ARE APPARENTLY SIGNIFICANTLY MORE POWERFUL"
Go home nvidia shill
Michael Morales
it's not
>The GTX 1080 was 13% faster than the GTX 980 Ti and 11% faster than the R9 Fury X at 1920×1080.
>At 2560×1440 and the same preset the GTX 1080 was 9% faster than the GTX 980 Ti and 11% faster than the R9 Fury X.
it is pretty much never, ever worth upgrading CPU. you are most likely going to get like a 20% performance increase max while investing money that could have paid for most of a new motherboard
Carson Baker
Yes. Because that gets me hard.
Ethan Reed
The AMD 8350 bottlenecks 980s
Daniel Young
>synthetic benchmarks
laughing reaction image right back atcha
Landon Turner
>no reading comprehension
you tried, shill
Jose Parker
oh, i'm sorry, ashes of singularity is a game?
not only one that nobody has ever heard of, but one that advertises for AMD. ah, the plot thickens
>a...async will really matter guys!! in these games where we paid for it to matter!
Gavin Reed
Most of this is right, but when you say CPUs can no longer bottleneck GPUs you're not fully correct. A CPU can bottleneck a GPU in very CPU-demanding parts of a game. If the CPU is struggling with draw calls then the GPU will be idling more than it should, creating a form of bottleneck. This is why Nvidia cards always work better with worse CPUs compared to AMD: Nvidia drivers have much less overhead than AMD drivers, which results in more data transferred per draw call. Lower-level APIs like Mantle, DX12 or Vulkan eliminate these overhead problems though.
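To put that in crude numbers (every timing and draw-call count below is invented purely for illustration, not a measurement), a frame can't finish faster than whichever side is slower, the CPU's submission work or the GPU's render work:

```python
# Crude frame-time model: the frame takes as long as the slower side.
# Every number here is invented purely for illustration.

def frame_time_ms(game_logic_ms, draw_calls, driver_us_per_call, gpu_render_ms):
    cpu_ms = game_logic_ms + draw_calls * driver_us_per_call / 1000.0
    return max(cpu_ms, gpu_render_ms)

# Same GPU load and game logic; only the per-draw-call driver overhead differs.
heavy = frame_time_ms(game_logic_ms=6.0, draw_calls=5000, driver_us_per_call=2.0, gpu_render_ms=10.0)
light = frame_time_ms(game_logic_ms=6.0, draw_calls=5000, driver_us_per_call=0.8, gpu_render_ms=10.0)
print(f"{1000 / heavy:.0f} fps (CPU-bound) vs {1000 / light:.0f} fps (GPU-bound)")
```

A lower-level API mostly just shrinks that per-call overhead term, which is why it shifts the bottleneck back onto the GPU.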
Blake Wilson
This nigga here is hilarious! He can read a graph with an x and y axis and now little man thinks he can give video card advice.
Give your ipad back to your parents and go to bed squirt
Daniel Richardson
Well I was planning on swapping CPU and GPU (and PSU) which would be like $450. Planning on a 960 or maybe 1060 depending on how cheap it is.
FX 6300 is pretty low bar, but I think what I'll do is buy the GPU and then see how stuff runs before considering changing the CPU.
Colton Campbell
>A cpu can bottleneck a gpu when in very cpu demanding parts of a game.
for all those super cpu demanding games right now
Benjamin Jackson
>don't read the article
>don't know what he's talking about
>still thinks it's somehow a rival company secret conspiracy
it's like the pc market got infected with console-tier brand loyalty
Julian Anderson
1060 will not be out for a year. they don't come out with the midrange cards at the same time as the flagships
you don't even need a core i5, i3 is fine for most shit. spend more on your motherboard than cpu if you go that route.
Zachary Reed
it's literally the first DirectX 12 title to launch m8
Alexander Brown
Well I'd only be swapping mobo if I was switching to Intel. Is that really worth it? Surely an 8350 isn't bad at all.
Ryder Nelson
it's some shit nobody ever heard of, and nobody cares about dx12
dx12 is DOA, microsoft is actively attacking PC gaming with win10 and windows store. Do you think game devs are going to be on the side of lost profits? hilarious.
Brayden Thompson
It's going to run like shit in anything that's CPU heavy. I have an 8350 running at 4.5 GHz paired with a slightly OC'd 390, and my framerates in games like Crysis 3, AC: Syndicate, Dark Souls 3 and Arkham Knight are pretty meh at times, whereas people with my GPU and Intel CPUs seem to be doing much better. Switching from 1080p to 1440p usually makes zero difference whatsoever in most games. If you have a miracle chip that you can OC to 5.0+ you might be fine, but as it is you may as well just get a 1070 instead and spend the rest on a 6600k and mobo or 4690k and mobo. Then again, DX12 seems to be doing god's work for AMD CPUs so who fucking knows at this point.
Adrian Baker
I was going to make fun of you but then I realized that's all I use my 3ds for anymore
My keyboard shit the bed last week and I spent like 6 days playing 3ds while the new one came in the mail
Dominic Jenkins
Apparently under DX12 these older multi-core AMD CPUs compete with high-end Intel chips in games. Nvidia has low driver overhead, so if you do plan on getting anything to pair with that 8350 I'd say the 1070 or any other Nvidia card. I would recommend getting a good aftermarket cooler and overclocking the shit out of that CPU to lessen the chance of CPU bottlenecking in poorly optimized games or very CPU-intensive ones like GTA 5.
Ethan Sullivan
So you think they purposely made the game in a way that doesn't benefit that much from a specific card (that a specific company hadn't even announced), while somehow running almost the same as the previous top-tier card
because of some sort of secret AMD plot with Microsoft and against the buyers
fucking wow
Carson Fisher
>390
There's the problem. That driver overhead is killing your performance.
Dominic Brown
no, i think a clearly amd-leaning site picked a clearly amd-leaning benchmark for an irrelevant game
pretending as if it matters because of dx12 shows you don't understand that dx12 is doa
oh and for the third time this thread: people who can understand technical specs don't need benchmarks to understand how fast a card truly is. benchmarks give you nothing but a warped perspective, ever
Gavin Foster
>nobody cares about dx12!
>buys latest top gamma graphics card
Grayson Murphy
>benchmarks give you nothing but a warped perspective
>b-but only if it's Nvidia™ and it's bad!