Buddy of mine claims that a refresh rate on a monitor makes games run better...

Buddy of mine claims that a refresh rate on a monitor makes games run better, and that integrated graphics is better as well since it uses the CPU.

How autistic is he?

Other urls found in this thread:

outervision.com/power-supply-calculator
youtu.be/sph6cjJeRdI
youtu.be/xp6ltBCMDCE

Very

On a scale of 1 to 10, I'd say potato

Checked.

He's a massive idiot, OP. Refresh rate does jack shit for performance.

If you have a game running at 120 FPS and a monitor with a refresh rate of 60Hz, then you will only see 60 images per second. However, if you have a monitor with a 120Hz refresh rate, you're able to get the full effect.
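
Quick way to think about it (a rough sketch in Python, made-up numbers, ignoring vsync/tearing details):

# Frames you actually see per second are capped by whichever is lower:
# the game's render rate or the monitor's refresh rate.
def visible_fps(game_fps, refresh_hz):
    return min(game_fps, refresh_hz)

print(visible_fps(120, 60))   # 60  -> a 60Hz monitor throws away half the rendered frames
print(visible_fps(75, 144))   # 75  -> a 144Hz monitor shows every frame the game makes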

all idiots confirmed

Integrated graphics is better if you have a good CPU but a bad graphics card, right?

Or am I buttfuck retarded?

>all idiots confirmed
[citation needed]

Where's your proof nigger?

The monitor thing I could understand someone being misinformed or uneducated about, but that "integrated is better than discrete" comment makes him an actual tard. Sounds like the kind of idiot who couldn't even change a tire on a car if he needed to.

Extremely retarded. The integrated graphics on a CPU won't get any modern game even close to the framerate needed to make use of a high refresh rate monitor, and the monitor does absolutely nothing to improve performance. It's not even part of the fucking PC.

If your graphics card is extremely shit, yes, but it's unlikely that any gamer would have that setup.

"Refresh Rate = Number of times a screen is capable of DISPLAYING, per second.
Hence, if you have a 60Hz moniter, and are outputting 120 FPS, you only actually display 60 frames. Likewise, if you have a 120Hz moniter and 75 FPS, you will output 75 unique frames."
Didn't realise Sup Forums was this stupid...
Hz means the number of times the monitor refreshes image in a second, ideally your refresh rate should match your FPS.

A dedicated GPU will almost always be better, man; common sense. Unless the GPU is an actual potato. A discrete card has its own video memory, while integrated graphics has to share memory with the CPU; that alone accounts for a massive performance loss.
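
To put rough numbers on the memory point (an illustrative sketch; the card and RAM speeds below are just examples, check your own parts):

# Dedicated: a GTX 1080-class card, GDDR5X at 10 Gbps effective on a 256-bit bus
dedicated_gb_s = 10 * 256 / 8        # = 320 GB/s, all of it for the GPU
# Integrated: dual-channel DDR4-2400, 8 bytes per transfer per channel
shared_gb_s = 2400 * 8 * 2 / 1000    # = 38.4 GB/s, shared with the CPU
print(dedicated_gb_s, shared_gb_s)   # 320.0 38.4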

Nice bait

Not bait at all.
The frames per second you actually see are capped at your refresh rate.

The amount of frames your monitor displays has nothing to do with how well the game is running, you retard faggot.

Didn't claim that. I said FPS doesn't matter once it surpasses your monitor's refresh rate.

You're wasting your time arguing in these threads, man. It's a bunch of kids who loosely watched a couple of Linus videos and suddenly think they're experts.

>Buddy of mine claims that a refresh rate on a monitor makes games run better
>people disagree
>you claim people are idiots
Good job.

On the refresh rate thing, he has a point.

If your monitor only runs at 60Hz then you are only really seeing 60 frames per second, even if you have 300 FPS in game.

The CPU thing, I have no idea where he got that from. Dedicated GPUs destroy any CPU when it comes to gaming; that is literally what they are designed for.

To clarify what OP said:
>Buddy of mine claims that a refresh rate on a monitor makes games run better

A monitor with a high refresh rate plays absolutely no part whatsoever in making a game run better; the CPU, the GPU, and to a certain degree the RAM do that. However, a high refresh rate monitor displays more frames per second, making the image much smoother than a 60Hz monitor - I have a 144Hz monitor and it's dope as fuck.

>can't think of a counterpoint
>deny claims
>give no reasoning
>call him a faggot

This. I couldn't be bothered to type it out.

This is either bait or you need to go back to elementary school and improve your reading comprehension.

Autism prevents you from picking up social cues and isolates you from society. It doesn't make you computer illiterate. Obviously he doesn't have autism, because if he did, he probably would know SOMETHING about computer components.

You, however, OP... I have some bad news for you.

Newfag when it comes to computers; I was raised by someone who only used typewriters and 90s comps. Forgive me.

I don't have a gaming computer, but apparently mine is "top of the line" and the only way to go higher is a dedicated gaming computer. But some games (especially DS3) can't run for shit.

Would I be able to upgrade my graphics card and would this allow me to run games better?

Well, if he means ray tracing, a GPGPU-architecture CPU would be superior to a graphics card. There are only a few on the market and they don't run the standard Windows OS. It's a big pain.

The short answer is: it should do, yes.

However, if your CPU is a bag of dicks, you still won't be running games at a high FPS.

A buddy of mine has an Nvidia 1080 graphics card, but because his processor is old as balls he still doesn't get high FPS in CSGO.

It fluctuates around 30-60 FPS, which is pretty bad.

You would have to show your specs for that to be answered (download something like Speccy and run it), but if you can't run DS3 for shit then I doubt anything in your setup is "top of the line". A good graphics card is the most important part of a gaming PC but your performance could still get bottlenecked by other shit components if you have them.

What is the point of such high refresh rates when human sensory inputs are relatively limited?

GPU-CPU cards such as the Cell processor.

How hard is it to replace a CPU?
I've been told a graphics card isn't too hard to replace; idk about the CPU.

This.
OP is baiting retards with basic psychology

>refresh rate on a monitor makes games run better
Fact.
There are also several videos about CSGO showing that with a better monitor you'll have a slight advantage spotting the enemy.

>integrated graphics is better as well since it uses the CPU.
False.

Since people tend to focus more on the last claim than the first, they judge everything by the last thing they read (it works better if they don't know shit about computers).

I don't think monitor refresh rate technology has hit the point yet where we can't see a difference from going higher.

There is a big difference from 60Hz to 144Hz, and soon there will be 240Hz monitors hitting the market.
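
The gap is easy to put in milliseconds; a quick sketch (Python, just arithmetic):

# How long each refresh stays on screen at a given rate.
for hz in (60, 144, 240):
    print(f"{hz}Hz -> {1000 / hz:.1f} ms per refresh")
# 60Hz -> 16.7 ms, 144Hz -> 6.9 ms, 240Hz -> 4.2 ms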

You are correct in saying that there will eventually be a point where we don't see any difference from going higher; this is exactly what the Apple Retina display is about. They worked out the highest amount of pixels we can actually notice with the naked eye, which is dependent on the size of the screen.

It depends on your motherboard, since the CPU is just a little square which is placed into a socket on the motherboard. There's also a CPU cooler on your CPU, which is probably slightly harder to remove, but since you're new to this you'll almost certainly have a stock cooler that easily pops into some holes on the motherboard, making the whole process very simple. You can look all this up on YouTube.

You have a point there, but for example we can see a dramatic difference between 24, 30, and 60 FPS. From there up to about 120 FPS the ability to notice it starts to drop off. Scientifically speaking, something like a 144Hz or higher monitor would be indistinguishable, but there is also the placebo effect where we "see" an improvement we physically can't, and it still adds the sensation of it being better, which for some people is arguably worth it. Of course, as with many things PC related, in most cases it's more of a numbers war, a primordial dick swinging of sorts.

Integrated graphics just means a babby GPU on the same chip as your CPU. They're equivalent to other babby GPU models. Integrated shares system RAM rather than having its own VRAM.

The GPU is pretty easy, as it just slots in and out of a PCIe x16 slot.

The CPU on the other hand will require you to do the following:

- Firstly, make sure your motherboard has a compatible socket for it. In most cases you will need to buy a new motherboard; Google PCPartPicker for this.

- You will need to install a CPU cooler or use the standard one which is included; I would recommend buying an aftermarket one, to be honest. Installing the CPU cooler means you need to put a bracket on the back of the mobo (motherboard), which can be a bit tricky, but it is pretty easy when you get the hang of it.

- You will need to apply a thin layer of thermal paste to the top of the CPU just before you install the cooler. This allows the heat to dissipate more evenly and will prevent your CPU from overheating.

I would Google or YouTube how to do this as well, as words can only get you so far; a physical demonstration will be better for you.

Don't forget about power supply needs with an upgraded GPU. Don't assume that a higher-end card will work without checking power.

Ah, excellent point. I completely forgot about that.

outervision.com/power-supply-calculator

This website will help you find out if you need a new PSU.
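
If you'd rather eyeball it first, the back-of-the-envelope version looks something like this (all the wattages below are made-up examples, not from that calculator):

# Rough PSU sizing: sum the big consumers' TDPs and leave ~40% headroom.
parts = {"CPU": 95, "GPU": 180, "mobo/RAM/drives/fans": 75}  # watts, hypothetical
total = sum(parts.values())                                  # 350 W estimated draw
print(f"Recommended PSU: ~{total * 1.4:.0f} W")              # ~490 W, round up to 500-550 W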

This. I didn't think about it during my last upgrade, and my old PSU didn't even have an 8-pin PCIe cable for my new graphics card.

Also note that the integrated GPU will be next to worthless once you add a dedicated GPU. In fact, having both can sometimes cause issues where the Intel chip will try to override the dedicated GPU. Still, almost all systems have some kind of integrated GPU, so it isn't always an issue, but Fallout 4, for example, had all kinds of problems with this. If you are having trouble running something that you shouldn't have trouble with, make sure that the application is using the correct GPU.

Almost as autistic as you for starting a thread about him

...

youtu.be/sph6cjJeRdI

youtu.be/xp6ltBCMDCE