24fps on a 1080 with an i7 in Witcher 2

>24fps on a 1080 with an i7 in Witcher 2

What the fuck is wrong with my PC? I just upgraded to an i7 6700k, a 1080 and 32 gigs of RAM on Windows 7 and some games are still running like fucking shit. At first I thought I might have a bottleneck since my GPU seemed to be performing fine, but anything CPU heavy was literally shitting the bed; I average about 20fps in Men of War as well. But then I realized my CPU isn't even using itself to max capacity, it's barely using half of its power here. Why the fuck is my i7 not working properly? Or is Witcher 2 just an extremely badly optimized game? Note that I can play Metro 2033 at like 155 frames per second.

Will post more pics.

turn off ubersampling
that thing slaughters your fps

I'm getting 60 fps 1080p on ultra settings with my 7 year old PC (i7 2600 8gb ram) and a new 3gb GTX 1060.

Seems like you're fucking something up with drivers or some shit

Here is Men of War. I even had to put the draw distance down; not even 10 frames per second in this shot, and the CPU is barely even working.

deactivate ubersampling

Why would anybody play 2 when 3 exists?

It's off; turning it off gave me maybe 10-15 extra fps in Witcher 2.

With ubersampling on, it averages around 20-30. With it off it averages around 40-50 fps, but with drops to 30 and lower, like in pic related, every now and then.

I downloaded brand new drivers today.

>Windows 7

Yeah no wonder you're getting low FPS. Upgrade to windows 10.

Download CPUID's CPU-Z and check your CPU speed etc.

And here is Metro Last Light after I shot some targets, making PhysX objects bounce all over the floor. Notice my framerate, and also the fact that my CPU is going into overdrive.

Why does my CPU go all out for Last Light, but for W2 and MoW it just sits doing a half-assed job and leaves me with a shit framerate?

disable Nvidia PhysX shit, and drop some things a bit.

>"""Upgrade""" to windows 10.
fuck off. Even my old i5 + GTX 550ti could run TW2 at 1080p, Medi-High settings at steady 50+ framerates.

What GPU do you have?

Are you sure the game is not accidentally using your integrated HD Graphics?

Reinstall your drivers. Check that your cooler is working properly and check the processor temperature with Speccy.
It's probably an overheating problem; I had a similar problem with an older PC. I just changed the CPU's thermal paste and it worked fine.

>Metro Last Light
Stop playing console ports, and play the REAL Metro 2033 instead.

And here is last light when there are no physx objects on screen.

See that post and this one. PhysX can run on max and it still goes above 60fps in other games.

This is very common with modern GPUs for TW2

I have both. Metro 2033 works like a dream, smooth as butter. I dunno how this works so fucking fine yet Witcher 2 shits the bed.

did you plug your monitor into the motherboard or some stupid shit like that?

This, OP. I remember that you can choose which GPU to use in the game's launcher.

This is the only part of Metro 2033 I have tested so far where my framerate even went over 100.

Yet in shit like Far Cry Blood Dragon I am struggling to maintain a constant 60fps. I don't get it.

different games, different engines, different optimization levels.

FC3-4 are just terrible console trash, with bizarre variable V-sync. Sometimes it literally locks the fps to 30, even when just staring at the fucking ground.

It really sounds like you're running the game on the i7's HD Graphics chip, not your GPU. Like that anon said, some 6-year-old mid-tier GPUs can run it just fine.

>spend almost 2 grand on a pc
>can't play ten year old games above 25 fps

Just buy a ps4, anon.

A GTX 1080, like I said in the OP. It's the Gigabyte G1 Gaming if that means anything, but I think the problem is with my CPU; the GPU seems to be working fine. Also yes, I plugged my monitor into the GPU.

What do you mean?

>Even my old i5 + GTX 550ti could run TW2 at 1080p, Medi-High settings at steady 50+ framerates.

I can barely get 60 on absolute minimum graphics in Witcher 2. I put everything to low and I barely gain 30 frames. Yet my old setup, an i5 2500k and a 660 ti, could play it at around 40fps maxed out aside from ubersampling, which would put it around 15 fps. Why is my new rig barely better than my old one for Witcher 2?

did you run those games in Administrator mode?
what are your nvidia settings for those two games?

your rig is miles better. Something's just off software-wise, either on the game's or on the driver's side.

Try to completely disable the Intel HD Graphics from your Hardware management settings.

Yeah, something's up with that. I only just finished a Witcher 3 playthrough with a 1080 and i5 4590, playing at 4k60fps constantly. Turn off ubersampling and the blur options for both gameplay and cutscenes. W2 had some pretty fucked up blur and DoF settings that stacked on top of each other and really fuck up performance.

why would you even live when suicide exists?

Men of War isn't on PS4, anon. MoW: Assault Squad 2 mods and Call to Arms were my main reasons to upgrade, but I also wanted to play some other graphically intensive games my old i5 and 660 couldn't run, and I'm just pissed off that after waiting 5 years to upgrade, Witcher 2 still runs like fucking ass.

AS2 runs better when I lower the AI count; this is a 16 player AI match. When I play the campaign it runs at 60fps with drops to 40 when the action is heavy. I expected some big frame drops with mods and this many AI units, but it's still giving me the shits how big they are. I expected at least 30fps, and my CPU is just sitting around, not even running at full capacity.

My performance was equally strange on a 970. It played fine for a while, but one day I loaded it up and the framerate was all over the place. Playing windowed was the only solution I found. Luckily it was on my second playthrough, trying the terrible Roche route, so it wasn't much of a loss.

>Try to completely disable the Intel HD Graphics from your Hardware management settings.

How do I do that?

Also I decided to test out Cryostasis since it was notorious for being unoptimized, I actually almost got to 60fps though it regularly dropped to 30. I'd say it was an average of 40-50fps in Cryostasis.

Disable V-Sync and ambient occlusion, and make sure that rendering distance is below medium.
You can also go into the User.ini file and set AllowSharpen to 0.
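If you want to do that ini edit, it looks something like this (rough sketch; the exact path and section header are from memory, so double-check them in your own file, the AllowSharpen key is the one mentioned above):

; assumed location: %USERPROFILE%\Documents\Witcher 2\Config\User.ini
[Rendering]            ; section name from memory, may differ in your file
AllowSharpen=0         ; disables the sharpen filter
; other Allow* post-processing toggles live in the same file if you want to experiment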

If that doesn't help then you're on your own

Sounds to me like something isn't properly installed in your PC.
Even in Metro you're getting pretty sub-par performance (unless that's absolute max settings including SSAA).

>buying nvidia
lmao you should have known you'd only be getting 24fps

Go to the Windows Device Manager, check the Graphics Adapters section, right click & disable Intel HD Graphics.
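If you want to double-check which adapters Windows actually sees first, a quick Python sketch like this works (it just calls the built-in wmic tool; the adapter name in the comment is only an example):

import subprocess

# list every video adapter Windows knows about, plus its status
out = subprocess.run(
    ["wmic", "path", "win32_VideoController", "get", "Name,Status"],
    capture_output=True, text=True, check=True,
)
print(out.stdout)
# if something like "Intel(R) HD Graphics 530" shows up with Status OK,
# the iGPU is still enabled and a game could be latching onto it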

>This is the only part of Metro 2033 I have tested so far where my framerate even went over 100.

Sorry I meant under 100.


Anyways, I also have a massive RAM problem with Men of War. As you can see it's got about 10 gigs loaded into RAM. I have 32 gigs installed, but every time it goes over 10 gigs in the cache, the game crashes. This game crashed about a minute after I took this screenshot. How the hell do I allow the games (and Firefox) to use all my 32 gigs instead of being limited?

Check your power management mode in Windows settings. I had the exact same problem: mine was set to "balanced" to try and conserve power. Windows had literally been throttling my GPU and CPU since I bought them.
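If you want to check or switch it without clicking through Control Panel, something like this works (just a sketch; powercfg is the standard Windows tool and SCHEME_MIN is its alias for the High performance plan):

import subprocess

# show the currently active power plan
print(subprocess.run(["powercfg", "/getactivescheme"],
                     capture_output=True, text=True).stdout)

# uncomment to switch to High performance (easy to undo in Control Panel)
# subprocess.run(["powercfg", "/setactive", "SCHEME_MIN"], check=True)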

Make sure you have HDMI 2.0, I think.
Make sure VSYNC in your nvidia options is off, and that your refresh rate is set to 60Hz, not 30Hz.

One thing I can suggest you try to check is the power cable to your GPU. A few months back, when I was installing my new SSD and had to take my 1080 out, my power cables weren't completely in when I reinstalled it. This caused the 1080 to run in constant low power mode and I was getting bad performance and constant frame pacing issues in almost everything I played. This continued for like a week until I decided to see what's wrong and noticed the power cable being a bit loose. Tightened that shit in and have been getting max performance in everything.

Do you have the Cinematic Depth of Field set to on?
It is pure murder on fps

turn off hair works

I'm gonna test this right now.

No I hate DOF and turn it off in every game that features it.

pc gaming everyone

I'm getting around 40 fps on near-max settings at 4k with a 1080 and a 4690k @ 4.2GHz.

that's cus you fell for the pee sea gay ming meme my friend. good job gobbling /v/'s bullshit, please do so with gusto in the future.

How many virtual cores does your CPU have? I'm not sure how many cores Witcher 3 actually uses.

What is your gpu?

>console gaming: 720 - 900p @ ~Low & cinematic 24fps average

wtf DELET THIS

At least it's fixed.

>Intel
>Nvidia
How the tides have turned.

at least it works unlike le pc games crashing on startup and not working at all, or 10 year old games running at 11 fps at 720p cus memes

to be quite frank, if you compare games gettin fukt over time with new OS and drivers and shit, getting a stable 900p 28fps on consoles is a much better deal

case in short: vidja is fucking shit. >>click here to die

...

>But then I realized my CPU isn't even using itself to max capacity, it's barely using half of its power here
No, you bloody moron.
For the last time: you DO NOT want your CPU or GPU to be at 100% usage! That would mean they are literally pushing their limits to give you a playable experience.

If your game keeps usage below 100%, that is a GOOD thing, because it means the game's a piece of cake for your components to run and there is no bottleneck.
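That said, the total number can hide one maxed-out core, which is how a game can still feel CPU-bound while Task Manager says "50%". If you want to check, here's a minimal sketch assuming Python and psutil (pip install psutil) are installed:

import psutil

# sample per-core load for one second while the game is running
per_core = psutil.cpu_percent(interval=1.0, percpu=True)
total = sum(per_core) / len(per_core)
print(f"total: {total:.0f}%  per core: {per_core}")
# if one or two cores sit near 100% while the rest idle,
# the game is bottlenecked on a single thread even though total usage looks low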

help i'm drunk

>at least it works unlike le pc games crashing on startup and not working at all
riiiight....
Let's not forget the dozens of game-breaking bugs, crashes and auto-deleting saves we've had to deal with ever since 7th gen. Online connectivity and the "patch it later" mentality have ruined console games.

On consoles, you can't even play older games on newer hardware unless the devs give you the option to buy the game again. I know you'd like to believe every single new game just breaks the fuck down on PC based on users complaining, but the fact is that even in a broken state, they work on par with or better than the console counterpart.

I get 30fps at 720p with everything but hairworks set to Ultra on an i5 @ 3.2GHz and a 750ti in Witcher 3. Can we take a moment to appreciate this fucking ancient budget card being based as fuck?

It's not there, just my 1080 is. Though there is no graphics adapters setting, just display adapters.

I have rendering distance on max; I thought a 1080 should be able to handle that? Is it really that intensive and should I put it down? I will try it.

I don't know what version my HDMI is, how much would it matter? My screen is 60hz though.

Unfortunately this didn't work, though maybe I need to shut off Firefox to fully make sure. Firefox is a resource hog and I have 600+ tabs active and I'm not closing any of them. It takes me a minute to start Firefox every time I open it.

Your hardware specs are more than fine. Something's not right, drivers? Ubersampling? AA? I'm not sure what, but something from a software standpoint is messing you up. There's no way that a 1080 would pull such low fps in that game.

It's 4 cores with 8 threads or something. It's an i7 6700k @ 4.00 GHz.

Also, Ass Creed: shitty game, but I wanted to test it, and in certain areas it runs like fucking shit too. Literally 29fps for a bit here, though it can just barely get up to 60fps in other areas.

This is absolutely true. I'm still the guy you replied to, but I absolutely believe we were in a better time when consoles couldn't be connected to the internet and games either got a rerelease, a la MGS3 or Persona 3, with updated everything and extra content, or were actually playtested and bugtested to be a complete experience before release. Nowadays devs rely on the fact that everybody and their console has internet to release broken as fuck games that they day 1 patch into something at least technically stable.

wtf has happened to vidya? also delete this

>It's not there, just my 1080 is. Though there is no graphics adapters setting, just display adapters.
you may have to do it via BIOS then.

If that does not cure the thing, run one of those driver cleaner programs, and install latest safe drivers + DirectX10 separately.

>/v/ trying to into pc everyone

the op is either 14 or has recently switched from consoles.

Have you considered maybe the temperature is the problem? Your card will roll back its performance if it gets too hot.

What are your CPU temperatures? Install Core Temp and see. Did you buy a heatsink or are you using stock? Was there thermal paste on it or did you apply it incorrectly?

There's literally no reason your games should be running that bad. I have an i5 2500k and a GTX 970 and I'm outperforming the fuck out of you.

>If your game keeps usage below 100%, that is a GOOD thing, because it means the game's a piece of cake for your components to run and there is no bottleneck.

Oh okay, well that's good then. But what I don't understand is why it's still at shitty frames. I'd rather it push itself to at least 80% if that could get me a steady 60fps. Why is it letting me play at 30fps while sitting at 50% usage? Is this normal for CPUs?

Yeah something's up. I can run that at 4k60fps on maxed settings, even with the Physx stuff on my 1080 and i5 4590.

Check how many saves you have. Regularly delete saves, as Witcher 2 doesn't save over them.
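If you want a quick count instead of digging through the menu, a rough Python sketch (the save folder path and the .sav extension are guesses on my part, check where your copy actually puts them):

from pathlib import Path

# assumed location of Witcher 2 saves; adjust if yours differs
save_dir = Path.home() / "Documents" / "Witcher 2" / "gamesaves"
saves = sorted(save_dir.glob("*.sav"), key=lambda p: p.stat().st_mtime)
print(f"{len(saves)} save files, oldest: {saves[0].name if saves else 'none'}")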

>I have rendering distance on max
well now that explains everything, jesus

This may sound weird OP, but install the latest Windows updates. I had a very similar problem with my computer. Couldn't figure it out. Then I just decided to check Windows updates for no reason, and after the updates everything worked. Hopefully your shit isn't pirated so you can get the newest updates.

Can you actually take a screenshot of the settings you're using. This is really bizarre. My immediate guess is you're using some crazy setting that just kills performance and doesn't make the game look all that much better.

try some new well-optimized game like Prey and see if it's good or not

hold on shill

drawing attention to this

ya got the latest geforce exp?

3dpd? delete this you fucking inhuman normalfag.

Disable onboard video. For some reason you have games latching onto it. MSI Afterburner will still see your video card but won't be showing you what the game is actually using. Not sure why the games are latching onto the 'almighty intel hd', but seriously, just turn it off man.

I've got 3 saves. All from 2015 when I bought the game, still haven't played past the first area because it ran like shit on my old card too.

Is a 40 degree CPU and a 58 degree GPU really bad though? I thought it was when it got past 70 degrees that things got really bad and started scaling back.

I will go turn it down and test it.


Will do this too.


I still want to know how I can get MoW to accept all 32 gigs of my ram though, I dunno why it crashes if it goes over 10 gigs.

>I dont like games that you do

Doesn't make sense since I can run it with that setting on max despite having a weaker CPU than he does.

Turn off 'boost mode' and the Intel GPU in BIOS. Do a clean install of your drivers. Are you sure you put thermal paste on your CPU? Are you sure the PSU is powerful enough?

h o l d o n n i g g e r

first off: """"""""""""""""""""""""""""immersive "sims" """"""""""""""""""""""" are pretentious garbage. deus ex is the only good one. engaging in them is retarded fart huffery and extreme pretention. i bet you think le epic single player rpgs are "hardcore" despite having never played a fighter or a shmup in you are life. first off: kill urself. Second off: warren spector is a hack and you wank borderlands fps-rpg "le pc xD" games off to feel better bout urself. delete this and kill yourself. thank you.

also Prey is shit and you wouldn't know good gunplay if it stepped on your balls and then cummed on your face.

ps the entire game is a simulation and it took ZERO (0) effort to type out this post.

I didn't notice you had temperatures up in your screenshots. 40 and 58 isn't bad at all. As far as GPUs go, below 70 is ideal for maximizing longevity, but the card is cleared to go up to 90. Don't go up to 90, though.

kys faggot, go to Prey thread and hate on a game there, this is not thread for you to sperg out, op just wants some help

Jesus christ, autismo, calm the fuck down. The guy was just making a simple recommendation for OP and you went full fucking sperg.

op is incapable of help, and so are u.

Is your hard drive failing?

I just put it down from max to normal and it doesn't do much.


Here. I have tried lowering a bunch of things and on lowest settings it does go above 60fps easy, but on a gtx 1080 with an i7 I shouldn't have to play on lowest fucking settings. It's bullshit.

I think MY hard drive is failing after reading this thread...

Turn off the windows 10 game DVR and shadowplay / nvidia streaming service and see if one of those is doing it.

Wait a minute, you're getting sub-60fps at 1080p? On a gtx 1080?

I put too much thermal paste on my CPU, and when I put the fan on it almost went over the side, so I scraped some of it off with a card; it's a thin layer now but it covers the whole thing. The temperatures seem fine though.

Also got a 700 watt PSU.

Something strange is going on, I ran it pretty much maxed (other than ubersampling) on a 570 gtx. Is some kind of power saving feature turned on maybe? Did you check your temps?

okay go back to your waifus faggot nier tier gamer guy

I have 9 hard drives and one is a brand new SSD. One of them might be close to failing but it's an old one that I use for steam back ups and Witcher 2 isn't installed on that drive and neither are the other problem games.

god, you weebs are miserable

it seems like something serious

just save yourself the hassle, scrub your entire PC and reinstall Windows; then if it's still not fixed you'll at least know for sure it's a hardware problem

Yes. That's why I made this thread because boy am I mad I wasted 2k on this shit if I can't get 60fps on a 4 year old game.

You fucked up something, that's for sure

Didn't W2 come out in like 2011? It's like a 6 year old game that ran on a heckin Xbox 360. There's zero excuse for it to run this poorly on high-end modern hardware.