Shitty Intel TIM

>Late 2016
>AMD Zen nowhere in sight
>Everyone tells me to buy Skylake
>Buy 6600K
>Intel re-releases Skylake with higher clocks
>Want to overclock 6600K to 4.5GHz to try and catch up to Kaby Lake a bit
>Can't, because Intel cheaped out and used shitty thermal paste under the IHS
>Arctic Silver, good cooler, doesn't matter
>Don't want to delid because warranty
>Stuck with shit clocks so Intel could save a few cents
>Problem is worse with Kaby Lake
>Ryzen (finally) comes out, every CPU uses proper solder
Why do people defend this corporation's fuckery?

Because I get 140 frames instead of 120 frames whenever I play a game with my 1080Ti. Ryzen a poo.

You can probably sell your parts and get enough for a Ryzen build.

Don't worry, guy.
Skylake-X and Kabylake-X are coming out in a month on a whole new socket.

Sell that shit and buy a 1600(X) + b350 for the money.

My friend has this now; I want to switch but don't want to go through the hassle of selling and buying. Maybe Zen 2.

Stockholm syndrome.

People unironically want AMD to go out of business when they have done everything the community has asked for.

More IPC, check
More cores, check
Better power efficiency, check
Undercutting prices, check
Soldered IHS, check
Cheap unlocked chipset, check
More PCIE lanes on Naples, check
Free version of GSYNC, check
Proper Async support, check
Proper video drivers, check
More VRAM, check

Or

Gameworks(tm)
Gsync(tm)
Driver gimping (tm)
64X tesselation (tm)
Hyperthreading (tm)
Several sockets(tm)
Z series boards(tm)
Intel management engine(tm)
Quad cores(tm)


I could keep going, but you get my point. If you don't despise Intel and Nvidia, you are not a fan of tech. You are a mindless consumer.

I really don't know; I've got an R5 1600X. It just works, and really well at that. My only complaint with Ryzen is the way the CPU cooler has to be mounted and why the fuck it has to be done in such an infuriating way. Other than that it's pretty solid.

Using 144Hz over 120Hz LightBoost.

What kills me is that you esports fags don't even know what the fuck you are doing. If you are going to invest in snake oil, at least invest in the proper snake oil.

God damn the fucking gaming community to hell.

>People unironically want AMD to go out of business
Nah, it's just shitposting.

Surely you know that you need v-sync if you're gonna use strobing properly, otherwise tearing artifacts are incredibly obvious and almost equally as distracting as the non-strobed image.

And surely you know that if you drop frames during v-sync, you're gonna be in crosstalk city for a few moments, not to mention the feeling of inconsistent tracking. And on top of that, you've got input lag which comes with the v-sync territory.

No you don't.

And why the hell would you be dropping frames? You spend $300 on a TN panel but you can't buy a modern low end GPU!?

I wish Nvidia supported FreeSync so I didn't have to sell a kidney for a G-Sync monitor.

Witnessed

What did he mean by this

because you get 121 frames instead of 120 in Call of Dooty and Cosmetic Skins: Global Offensive

>No you don't.

Strobing makes tearing incredibly obvious and distracting. Sample-and-hold blur masks it to a degree.

>And why the hell would you be dropping frames? You spend $300 on a TN panel but you can't buy a modern low end GPU!?

If you are getting a maximum of 140 frames per second in a game, you will very likely be dropping below 120. If you've got V-sync on (which you should, if you're strobing), your framerate will halve at that point.

And in case you want to stomach tearing and decide not to use V-sync, whenever your framerate drops below 120, you will get lots of crosstalk which is by far more distracting than sample-and-hold blur.
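The halving behaviour described above can be sketched with a toy model (my own illustration, not from any poster; function name and numbers are made up): under strict double-buffered v-sync, a finished frame has to wait for the next vblank, so the effective frame rate snaps down to refresh/1, refresh/2, refresh/3, and so on.

```python
import math

def vsync_fps(render_fps: float, refresh_hz: float) -> float:
    """Effective frame rate under strict double-buffered v-sync."""
    refresh_interval = 1.0 / refresh_hz
    frame_time = 1.0 / render_fps
    # Frame presentation snaps to whole vblank intervals; the epsilon
    # keeps exact multiples from being pushed into the next bucket.
    vblanks = math.ceil(frame_time / refresh_interval - 1e-9)
    return refresh_hz / vblanks

print(vsync_fps(140, 120))  # renders faster than refresh -> capped at 120
print(vsync_fps(119, 120))  # just misses a vblank -> drops to 60
```

So a machine that averages 140 FPS but dips to 119 doesn't show 119 on a strobed 120Hz panel; it shows 60, which is exactly the crosstalk-and-stutter scenario the post describes.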

Even if they did, neither FreeSync nor G-Sync can be used in tandem with strobing. It has been shown to be technically possible (with G-Sync at least), but it's not officially supported, it's difficult to set up, and there are issues with it.

I've tried strobing at 144Hz, but I'll stick to 240Hz + G-Sync.

Because Intel shills will defend literally anything.

> >Want to overclock
> >Don't want to delid because warranty
Wat

>INTEL CAN'T EVEN USE DECENT THERMAL PASTE OR SOLDER THEIR CPUS

>TFW THERE'S NO DECENT CPU

>SHITTY AMD RELEASING MORE CORES MEME CPUS WITH SUBPAR SINGLE CORE PERFORMANCE AND UNSTABLE MOTHERBOARDS

END IT END IT NOW

>unstable mobos
What was X99 then?

>B..BUT INTEL DID IT

THAT'S NOT THE POINT, YOU DISGUSTING AMD SHILL. DON'T YOU GET IT? BOTH COMPANIES ARE FUCKING US, RELEASING UNFINISHED SHIT

>unfinished
No launch was ever perfect, especially when a genuinely new arch launches.

I think it's just bad luck in the silicon lottery. Even my 3570k clocks higher with decent cooling.

The 3570K is just a better CPU than the 7700K.

just buy vega

I wish I fucking could.

Since what generation has Intel cheaped out and used paste rather than solder?

Ivy.

>Intel uses shitty TIM on their newer CPUs
>this means if I get one I need to delid it to get decent temperatures with a higher overclock
>Ryzen is trash for gaming because it can't overclock past 4.2GHz and games aren't optimised for it

Fuck why are these companies making it so hard to spend my money on them. Just make a fucking cpu that doesn't have big downsides.

At this rate I will still be using my 2600k over 5 years

He's talking about CPUs, you dumb fuck

How can they prove you overclocked the CPU?

Do you even know what delidding is?

>If you've got V-sync on (which you should, if you're strobing)
>v-sync
>ever
M8, are you having a giggle? V-sync makes shit unplayable.

Ergo, strobing is not all it's cracked up to be. That's my entire point.

>arctic silver
>good
Might as well use cum, toothpaste or original Intel™ thermal solution.

>Don't want to delid because warranty
In 15 years I have had every single component fail except a CPU.
You don't need the warranty because it won't die from anything it's covered for.

fuck off and die.

>t. triggered temperaturelet
Go burn with your CPU, faggot.

> For extra-cool temps you can also leave your intel pc off

>don't want to delid because of warranty
"My CPU died before it became obsolete", says no one ever.

Arctic Silver 5 is outdated, has a long burn-in time, and is electrically conductive.

>I accidentally the die
Every second delid.

With SUPER reduced power consumption!

amdrones are too dumb to make a difference

Can some paid shills use their time to explain a couple of things:

Is the 4GHz wall people are hitting with Ryzen a hard limit, or have they just not had time to learn how to OC it properly?

As much as I want to go back to AMD, I can't imagine how I could live with a 3.9GHz processor in 2017, especially if I want max FPS in GO and Quake Champions, if it's any good.

...

The best bit is that it's not even from the process of delidding, but from mounting the cooler.

Seems like a voltage/heat limit for the moment, since most people aren't willing to risk values higher than 1.4V for a long-term OC until they're sure it's completely safe. In any case, the higher thread count, bigger caches and a much better SMT implementation (following IBM's approach rather than Intel's abomination called HT) seem to make up for it, unless you compare against an i5/i7 OC'd to the moon (so watercooling + delid involved) in very specific cherry-picked games and scenarios.

The 4GHz limit seems to be due to the immaturity of the silicon. Look at Haswell on release: terrible OC, partly due to the TIM issue, but as time went on, higher and higher OCs became the norm.
Once the silicon has a chance to mature and the fabs make their tweaks, we may well see higher OC results.
But for the moment most people can only get their 1800Xs past 4.1 by going crazy on voltage.

What did he mean by this pic?

Don't forget about IPC (instructions per clock).
You're not going to have a problem in CS. If anything Ryzen still has a major advantage over the 7700K because of the stutter; I haven't seen tests to see if it stutters in CS, but it's still worth noting.
Zen+ is another 8 months or whatever away and was cited for another 15% IPC. Can't vouch for that any more, given Ryzen came in 10% over its target.
There are still a couple of features that were left out of Ryzen because of time constraints that will be included in Zen+, so it will still be a significant improvement.

>GHz myth

He is asking how they can prove you overclocked, not delidded, retard. And it is an interesting question: how can they tell if the CPU was overclocked?

No, it actually makes sense for some things, and CS is one of them. If you had two processors with the same total throughput, but one ran at 100MHz and the other at 6GHz, the frames from the 100MHz one would come in clusters rather than being dispersed smoothly across the second.
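The clustering argument can be shown with a toy sketch (my own illustration; the timestamps are invented, not measured): two runs with the same average FPS, but very different worst-case frame times, and the worst case is what you perceive as stutter.

```python
def frame_times(timestamps):
    """Gaps (in seconds) between consecutive frame presentation times."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

# Evenly paced: one frame every 10 ms over 60 ms.
smooth = [i * 0.010 for i in range(7)]
# Clustered: bursts of 3 frames, then a 24 ms stall; same 6 frames in 60 ms.
bursty = [0.000, 0.003, 0.006, 0.030, 0.033, 0.036, 0.060]

print(max(frame_times(smooth)))  # worst gap ~10 ms
print(max(frame_times(bursty)))  # worst gap ~24 ms: visible stutter
```

Same average frame rate either way; the bursty run just hides a repeated 24 ms hitch inside it.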

>MUH GIGAHURTZ

Stop talking about shit you know nothing about.

You are fucking retarded.

> I can't imagine how I could live with a 3.9GHz processor in 2017
Buy an FX-9590. 8 cores AND 5GHz, easy.

THEN PLEASE TELL US ALL ABOUT IT YOU CONDESCENDING MOTHERFUCKER

Sup Forums THE MOST CONDESCENDING BOARD OF ALL OF THEM, WITH NONE OF THE POSTERS ACTUALLY EXPLAINING WHAT THEY TALK SHIT ABOUT

YOU DUMB BITCH

Do you really think a 100MHz chip would keep up when you swing the camera 180°?

I hope you got paid to reply to my over 1 hour old post asking a couple genuine questions.

You know, for most people (including me), which $200-300 CPU I buy is not the defining moment of my life, so your autistic screeching is a waste of time.

I remember nearly falling for the Bulldozer meme, with people going on and on about how very soon you would need more than 4 cores because every program was going to be multithreaded, and how AMD's new 'HT but not HT but better than HT' HT was going to be a game changer.

That is why I am wary of trusting people on AMD again. They did the same thing with the 1090T: multithreading was going to explode and it would destroy Intel's puny quad cores.

An AMD Bulldozer 5GHz 8-core CPU can't even get close to an Intel 4-core 4GHz one. Gigahertz (a measure of frequency, Hz being cycles per second) means nothing if the number of instructions you can execute per clock (IPC) differs, as it does between completely different architectures.

There's your spoonfed middle-school physics. Now GTFO back to Sup Forums.
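That post's point reduces to a one-line model (my own sketch; the IPC figures are illustrative guesses, not benchmarks): single-thread throughput is roughly IPC times clock, so a high-clock, low-IPC core can still lose to a lower-clocked, higher-IPC one.

```python
def throughput(ipc: float, ghz: float) -> float:
    """Rough single-core throughput in billions of instructions per second."""
    return ipc * ghz

bulldozer = throughput(ipc=1.0, ghz=5.0)  # hypothetical FX-9590-like core
skylake   = throughput(ipc=2.0, ghz=4.0)  # hypothetical i7-like core

print(bulldozer, skylake)  # 5.0 vs 8.0: the 4GHz chip wins per thread
```

With those (assumed) IPC numbers, the 5GHz part is 20% faster on paper and 37% slower in practice, which is the whole "MUH GIGAHURTZ" argument in two lines.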

You missed the part about RAM speeds, FSB, bandwidth, cache systems, and how the arch is CMT, not SMT like Ryzen.

Stupid brainlet.
"Dun know nothing", yeah, that's basically Sup Forums in a nutshell.
It's time to get off YouTube and read Mike Meyers' A+ Guide to Hardware; that one book is 600 pages of tech goodness.

Fucking kids these days can't even read a darn book... rude, man.

Should I delid my G3258?
I'm at 4.3GHz.

I've seen both Intel and AMD CPUs bite the dust. They're not gonna die at the rate hard drives do, but they're not indestructible.

Wouldn't surprise me if the CPU or management engine recorded the max frequency it has hit in some embedded flash or something. That's how I would design it if I were trying to screw over consumers.

This.

I've been running my 3570K at 4.6GHz for three years on a closed-loop liquid cooler.

I got 4.5 easy on my 6600K and my cooler isn't even that great (bought it mostly for aesthetics).
Did I win the silicon lottery?

>If you don't despise Intel and Nvidia, you are not a fan of tech. You are a mindless consumer.
/thread

actually /board

>Caring about CPU warranty

This is such a shitty argument. By the time your warranty is up, your CPU will NEED to be delidded and OC'd to the max to keep up with the latest Sandy Bridge rehash and Zen+ anyway, so just do it now. CPUs simply don't break under normal use and never will, even shitty slapped-together modern Intel trash.

>Intel numbering (((scheme)))

>CPUs don't break under normal use
Yeah, but they do break from idiots messing up their delid, which we wouldn't have to do if Intel stopped fucking around with the TIM

A great post he witnessed

>1.452
your 3570k gonna die any day now

Intel how-to guide to BTFO Ryzen

1. Play the silicon lottery
2. Lose
3. Repeat from Step 1 until broke or Ryzen BTFO

>More IPC
Useless
>More cores
Useless, LE COARZ
>better power efficiency
False, 7700k has lower tdp
>Under cutting prices
Bad, they're destroying the market, and hurting Intel
>Soldered IHS
Bad, doesn't give customer option to delid for better performance
>Cheap unlocked chipset
Bad, doesn't separate consumer from professional
>More PCIE lanes
Useless, nobody needs that many
>Free version Of GSYNC
Useless
>Proper async support
Useless bully tactics
>Proper video drivers
Blatant lie
>More Vram
Useless, still loses to less from Nvidia

Or
>Gameworks
Good way of managing drivers and games
>Gsync
Good frame rates
>Driver gimping
Fake AMD shilling lie
>64x tesselation
AMD cards aren't capable of that
>Hyper threading
Godly power and good tdp
>Several sockets
Gives users options and choice
>Z series boards
Good motherboards
>IME
Good for keeping computer working good
>Quad cores
Still beat AMD "8 core" shit processors


Fuck off shill

You don't have to delid, lol. Intel is still faster than Ryzen even without the K series.

Low quality bait

6600K at 4.5GHz @ 1.32V, 55°C on an H240-X.
70°C at 4.8GHz @ 1.38V; not stable, but Jesus, the temp difference.

...

You put so much effort into this bait, but it's just so boring and tryhard.

It's like you come on this board just to get BTFO.

What is it with intel peasantry, are they just acting like retards to provoke reaction?

>Proper video drivers, check
LOL

Really low effort. Could have at least used a benchmark, let alone more descriptive rebuttals.
Apply yourself.

Same here, but my pc died and I had no other options so I don't feel too bad about it

Kill yourself you perma-BTFO'd tripfaggot

Intel can't solder Kaby Lake trash because it's too small and shaped like a long rectangle. X99 CPUs are soldered, because they are much larger.

And that's why they stopped soldering with Ivy.

>If you don't despise Intel and Nvidia, you are not a fan of tech. You are a mindless consumer.

HE'S GOING ALL-OUT!

How the fuck is Ryzen soldered despite being a SMALLER DIE?

It's a lame excuse by Intel to justify their cost cutting.

oh you're back again for some more BTFO collage like this eh?