Playing Witcher 3

>playing witcher 3
>beeping sound from mobo
>CPU temp is at 99C
It'll be f-fine right g-guys?

I'll be watching the news when your house burns down. Where are you from?

...

>probably picked triss
>CPU overheating

All is well in the world.


On topic: if you have a stock Intel HSF, then you probably didn't seat one of the corner push pins into the motherboard properly. Shit happens with those types of coolers.

poo in it

Pretty good for integrated graphics, user

>picking Yenn

cuck confirmed, enjoy being used


OP, stock CPU fan or aftermarket?

lel

It's an i5 3570K with a meme Hyper 212 EVO cooler.

GPU is an R9 290, and it also gets hot (around 90C)

Make some tea on that CPU, you've got the temp for it

You must not have installed it correctly. My 4790K never goes above 75C under full load with a Hyper 212

>Picking Triss merry-slut

Try replacing the thermal paste and re-seating the cooler if the problem persists

I've already changed the thermal paste three times and it didn't improve temps at all.

screenshot is idle temps

Clean/dust your stuff I guess

>overclocks CPU
>it gets too hot
>"g-guys what should I do?"

reduce the fucking voltage and multiplier if you have to.

>reduce OC
I'd rather have it melt than lose fps in game

Winfag cuck detected, go bother the other gamer kiddies in Sup Forums.

learn to undervolt

>playing video games
>having a meme cpu with a meme cooler
>overclocking a cpu for 2fps

Sup Forums in 2016

I'm already on the lowest stable voltage.
Don't you have a desktop thread to circlejerk about your shitty meme OS in?

what the fuck do you want from us then?

Your CPU will NOT be fine if it regularly reaches those temps, and we already told you how to fix it.

now fuck off.

>playing overwatch ultra detail
>gtx 760 is 81C
>lower the details to high
>81C
>lower them to lowest
>79C
I'm starting to suspect that Overwatch's GPU temp monitor is broken.
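
If you don't trust the in-game readout, poll the driver directly and compare. Here's a rough Python sketch that just shells out to nvidia-smi every few seconds (assuming it's on your PATH, which it should be since it ships with the NVIDIA driver; the gpu_temp_c helper is just something made up for this):

```python
# Rough sketch: log the GPU temperature straight from the driver every few
# seconds so it can be compared against the in-game readout.
# Assumes an NVIDIA card with nvidia-smi on PATH (it ships with the driver).
import subprocess
import time

def gpu_temp_c():
    # "temperature.gpu" with nounits prints a bare integer in Celsius
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip().splitlines()[0])  # first GPU only

while True:
    print(f"GPU temp: {gpu_temp_c()} C")
    time.sleep(5)
```

If the driver numbers match the overlay, the monitor isn't broken; sitting around ~80C regardless of settings is probably just GPU Boost holding the card at its default temperature target.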

Well, I opened the case and now it stays below 90C, so it should be fine

What fps are you getting? I have a 770 and it dips on ultra

Perfect temperature for black tea though.

Not even the worst 3570K needs 1.3V for 4.1GHz. Lower your fucking voltage.

>b-but I just left it at stock voltage and turned up the multiplier....

Learn to overclock, dummy

So if he said he was doing video editing and had the same issue, you'd be fine with it, right?

Kill yourself you tween faggot. You are as disliked here as you are at school you little fuckwad.

Get the fuck out of Sup Forums and don't let the door hit you in the ass on the way.

>tfw 2x 670
Feels good. At max settings, if I unlock the frame rate it never dips below 100fps. They sit around 70C with the fps locked at 70.

It's unstable if I reduce the voltage; at 1.28V it even crashes when entering the BIOS.

Oh my god, you're actually pissing me off. If you're complaining about the temperature it's getting, DON'T FUCKING OVERCLOCK. SIMPLE AS THAT. I bet the fucking "improvements" you're getting are barely even noticeable.

>b-b-bbut my fps

Fuck off you illiterate child

>b-barely noticeable
50% increase according to some benchmarks

>ArmA
It would also help if Bohemia Interactive were competent at writing a properly functional game.

>one side has 70% GPU usage
>the other one has 38%
>23 fps difference

Hmmm I wonder why

Only in CPU-bound games like ArmA.

Stock clock?

Jesus Christ, user, are you really this fucking retarded?

GPU utilization only goes up if the CPU can feed the GPU work fast enough; in a CPU-bound game, a slower CPU leaves the GPU sitting idle for part of every frame.
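
Rough toy model of what's going on, if it helps: assume frame time is just max(CPU prep time, GPU render time) per frame. The numbers are made up for illustration and it ignores pipelining and driver overhead:

```python
# Toy model (made-up numbers): per frame the GPU has to wait for the CPU to
# prepare the frame, so frame time is roughly max(cpu_ms, gpu_ms) and the
# GPU is only busy for gpu_ms out of each frame. Ignores pipelining etc.
def frame_stats(cpu_ms, gpu_ms):
    frame_ms = max(cpu_ms, gpu_ms)        # the slower side sets the pace
    fps = 1000.0 / frame_ms
    gpu_util = 100.0 * gpu_ms / frame_ms  # GPU idles the rest of the frame
    return fps, gpu_util

for label, cpu_ms in [("stock CPU", 40.0), ("overclocked CPU", 22.0)]:
    fps, util = frame_stats(cpu_ms, gpu_ms=15.0)  # same GPU in both cases
    print(f"{label}: {fps:.0f} fps, GPU at {util:.0f}%")
```

Same GPU in both cases; the slower CPU just leaves it idle for most of every frame, which shows up as low utilization and low fps.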