GTX 1070 on FX CPU

Alright, Sup Forums. I'm seriously considering getting pic related. Problem is, I've got an FX350 for a CPU and am afraid it might bottleneck the 1070 a bit. I have it at 4.6 GHz right now. It's running at that frequency at 1.35 V under a Corsair H80i cooler, so I guess I could push it to near 5 GHz without a significant rise in temps.

If I do the latter, could I expect to minimize bottlenecking to tolerable levels and therefore enjoy a decent framerate with the GTX 1070?

I love it when I fuck up my first post. Meant FX8350, of course.

no

...

Is this a legit answer? 'Cause I would also like to know. I have pretty much the same setup as OP.

Depends entirely on the game. Look at benchmarks for FX-9590 for a good idea of what you can expect.

>I have it at 4.6 GHz
Jesus fucking Christ, your CPU will not bottleneck anything for the next 10 years.

just get it and you'll still have the card when you upgrade your CPU

>FX 8350
for what purpose?

As long as you're fine with the framerate occasionally dipping below 60fps, it's fine.

FX 8350 and FX 9590 bottleneck GTX 960 and R9 370 even at 480p. Don't waste your money.

At 480p, even a 6700K will bottleneck any semi-capable GPU.

>480p

OK. Back to 1997 with you.

Here are some tests by people who own an i7 and an FX playing GTA V.

FX
youtube.com/watch?v=UTQhlNoBnhk
i7 (the guy is annoying, but it's basically the only one with an i7 I found in 2 minutes)
youtube.com/watch?v=I7eG2lEgfTI

From skipping through the videos, the FPS looks to be in the same ballpark.

It's all about resolution, baby.

1080

The bigger question is why you are using a GTX 1070 at 1920x1080. But going beyond that: yes, an FX chip will hold back a 1070 - but so will an Intel chip (just to a lesser degree).

What Sup Forums doesn't understand is that, due to a combination of hardware and software, modern CPUs can't keep up with the latest and greatest GPUs at most resolutions. 4K testing is good for highlighting this to a point: while overall performance will be lower than at 1920x1080, the delta between the various GPUs changes, as the more powerful ones aren't held back as much anymore (it's why you see a Fury X, for example, generally overtake a stock 980 Ti at 4K).

Overclocking your CPU as far as you can is going to help, but you damn well better have a motherboard capable of handling the power draw - an 8350 at 5 GHz (aka a 9590) will flat-out kill many motherboards if you can't cool them.

tl;dr yes it will but you're stuck between a rock and a hard place.

>an 8350 at 5 GHz (aka a 9590) will flat-out kill many motherboards if you can't cool them
OP here. Are you referring to power phases and VRMs, or PSU-wise?

Anyway, would a GTX 1070 be a decent purchase as it is, considering my GTX 770 is really starting to show its age lately?

Motherboard VRMs - you're going to be pumping nearly 1.5 V through them, and you need to keep them cool to prevent thermal throttling or explosions. There is a fucking good reason why maybe 6 AM3+ boards are rated to take a 9590.

>Anyway, would a GTX 1070 be a decent purchase as it is, considering my GTX 770 is really starting to show its age lately?

That is for you to decide. It is certainly a considerably more powerful chip, but equally it's a lot of money, and playing at 1920x1080 is going to hold it back one way or another.

More like 2009.

>8350
>4.6 stable on water

Nah, you're good to go for this gen and the next one for sure.

>More like 1997

I went 1024x768 in 1998

Probably computing. If he took it out, the computer wouldn't work so well...

OP, I have the exact same thing. I bought an MSI Armor 1070 OC two weeks ago and it's kind of a mess.

I'm running it at 4.0 GHz and, while I do get high frames in pretty much everything, some games shit the bed hard. I never get what benchmarks say I should. While I do get 60 fps in Witcher 3 at all times, I can't get over 50 in GTA V with anti-aliasing turned all the way off. Same goes for Dark Souls 3. I read from some guy on [spoiler]reddit[/spoiler] that he got ~70 FPS at 1440p in DaS3, but I can't get over 50 at 1080p. And this is all with the card overclocked as high as it can handle without giving me issues.

Get the card and then save up for an intel CPU. That's what I'm doing.

have you tried unparking your cores?

worth a try.
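
If you want to actually do that instead of clicking around in the registry, here's a rough Python sketch that bumps the "core parking min cores" setting to 100% via powercfg. The GUID is, as far as I know, the stock Windows one for that setting - verify it with powercfg /q on your own machine before trusting it, and run this from an elevated prompt.

    # Rough sketch: keep all cores unparked by raising "Processor performance
    # core parking min cores" to 100% on the active power plan.
    # The GUID below is assumed to be the stock Windows one for that setting;
    # check it with `powercfg /q` first. Needs an elevated (admin) prompt.
    import subprocess

    CORE_PARKING_MIN_CORES = "0cc5b647-c1df-4637-891a-dec35c318583"  # assumed GUID

    def unpark_all_cores():
        # Apply to both the AC and battery profiles of the current scheme.
        for flag in ("-setacvalueindex", "-setdcvalueindex"):
            subprocess.run(
                ["powercfg", flag, "scheme_current", "sub_processor",
                 CORE_PARKING_MIN_CORES, "100"],
                check=True,
            )
        # Re-apply the plan so the change takes effect immediately.
        subprocess.run(["powercfg", "-setactive", "scheme_current"], check=True)

    if __name__ == "__main__":
        unpark_all_cores()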

Have you tried Windows 7? The botnet might be slowing you down.

That's true: GPU load increases and CPU load decreases with larger resolutions.

Why don't you overclock that 8350, though? I reckon you could go as high as 4.3-4.4 GHz without liquefying that fucker. There are some settings you could tweak in the BIOS which could make your CPU run better and more stable in general. Look around the internetz about this.

I mean, the FX 8350 is definitely not an i7 6700K, but getting 50 fps in GTA V with a GTX 1070 is absolutely not normal. Try to monitor your CPU/GPU usage - if the latter is nowhere near max while the former is close to 100%, you're experiencing a severe bottleneck. An OC could definitely help with that.
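
If you don't want to eyeball an overlay while playing, here's a quick-and-dirty Python logger for exactly that check. It assumes psutil is installed and that nvidia-smi is on your PATH (it ships with the NVIDIA driver) - treat it as a sketch, not gospel.

    # Log CPU vs GPU utilisation once a second while the game is running.
    # CPU pinned near 100% while the GPU sits well below max => CPU bottleneck.
    import subprocess

    import psutil  # pip install psutil

    def gpu_util_percent():
        # Query current GPU load from the driver's nvidia-smi tool.
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        )
        return int(out.stdout.strip().splitlines()[0])

    while True:
        cpu = psutil.cpu_percent(interval=1)  # averaged over the last second
        gpu = gpu_util_percent()
        print(f"CPU {cpu:5.1f}%  GPU {gpu:3d}%")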

>CPU load decreases with larger resolutions.

This is inaccurate - CPU load (more or less) remains fixed regardless of resolution (though not framerate). As resolution increases, the GPU has to work harder and harder, and typically you reach the point where the GPU is being hammered so much that the CPU is waiting for the GPU to finish.

This is a good thing, as a GPU doesn't render shit without commands from the CPU. So in an ideal setup your CPU will always be one step ahead of your GPU, and if you want faster framerates/higher resolutions, simply adding more GPU power will net you those results. Of course reality doesn't work that way, and it's generally the software fucking things up, but still.
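
If a toy model helps: frame time is basically whichever side takes longer, the CPU preparing the frame or the GPU drawing it, and only the GPU side grows with resolution. The numbers in this Python sketch are invented purely for illustration - they aren't benchmarks of any real chip.

    # Toy frame-time model: CPU cost per frame is roughly fixed, GPU cost scales
    # with pixel count, and the frame takes as long as the slower of the two.
    # All numbers are made up for illustration only.

    CPU_MS_PER_FRAME = 12.0   # hypothetical CPU time to prepare one frame
    GPU_MS_AT_1080P = 8.0     # hypothetical GPU time to draw one 1080p frame

    def fps(width, height):
        scale = (width * height) / (1920 * 1080)   # GPU work scales with pixels
        gpu_ms = GPU_MS_AT_1080P * scale
        frame_ms = max(CPU_MS_PER_FRAME, gpu_ms)   # slower side sets the pace
        return 1000.0 / frame_ms

    for res in [(1920, 1080), (2560, 1440), (3840, 2160)]:
        print(res, f"{fps(*res):.0f} fps")

    # At 1080p the CPU side (12 ms) dominates: ~83 fps, and a faster GPU changes nothing.
    # At 4K the GPU side (~32 ms) dominates: ~31 fps, and the CPU barely matters.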

>buying nvidia gpus
>buying amd cpus
wew lad you couldn't fuck up harder than this