Are 32 Epyc cores at 2.7GHz really going to use around the same power as a 2 core i3-7350K at 5GHz?
That just seems silly.
>Are 32 Epyc cores at 2.7GHz really going to use around the same power as a 2 core i3-7350K at 5GHz?
>That just seems silly.
...
why not ?
Yes, that's how voltage works.
i3-7350K is comically inefficient and overpriced. There is no reason for it to exist, at all.
Yeah, AMD can have 600 cores, they're still slower in games than 2 Intel cores
TDP = heat watts generated by the cpu not power draw
>Thread about server CPUs
>MUH GAYYYYYYYYMES
Kill yourself back to Sup Forums
You think gaming wouldn't be listed there if AMD was able to list it? Memes like "VDI, dense, DBMS, analytics" are there to misdirect you from its lack of gaming performance
reminder to ignore and report trolls
heat and power are related user.
can you into physics?
>The thermal design power (TDP), sometimes called thermal design point, is the maximum amount of heat generated by a computer chip or component (often the CPU or GPU) that the cooling system in a computer is designed to dissipate in typical operation. Rather than specifying CPU's real power dissipation, TDP serves as the nominal value for designing CPU cooling systems.
silentpcreview.com
Guess where that heat comes from...
I wonder which gayme needs 5GHz single-thread performance.
Dorf Dortress.
fake news
>why 32 cores cpu use more power than 2 cores cpu?
Out of your ass
Exactly
This is a shitty meme.
...
>it's just coincidence that under typical load the chip draws as much power in watts as the manufacturer's specified required heat dissipation.
You do realize that the rated TDP does not matter to overclocks, right? So long as cooling is sufficient to keep the chip from throttling or shutting down, it'll run at full speed. Electricity and heat are both energy: apply energy to something and it gets hot. The heat generated is directly linked to the electricity drawn. Whether or not that's how they define TDP doesn't change this fact.
energy in>resistance occurs>heat out
That's how a cpu works user. More power = more heat = higher "TDP"
Back to school you go
CPU isn't providing power to other components in any meaningful way.
The power draw is directly going to heat.
>seems silly
Zen is retardedly efficient at lower clocks/voltages, user.
A while back I had an argument with someone going to school for semiconductor engineering over this. I'm glad I'm not the only person not retarded on this planet. I don't have a suitable image so have this.
>Thread about server cpus
>i3 is server cpu now, my dude
that's not how it works
Zen is stupid efficient. Their 8 cores sometimes draw less power than Intel's quad cores, that's just insane.
I mean you need to pour satanic voltages into an 1800X for it to reach a 6900K's power consumption.
>thread about epyc power consumption is actually about kaby lake i3 it was compared to
Politifact literally is fake news though. I live in the city of the fucking newspaper that owns it.
We're starting to see some Intel price cuts in the HPC sector -- most likely in anticipation of the upcoming Epyc release.
I think this is the first Intel price cut of the year -- somebody please correct me if I'm wrong about this.
lmao this one is good
Yes, but a THIRTY TWO core using the same power as a slightly overclocked 2 core is more insane.
That's KNL and it has nothing to do with mainline CPUs.
Probably because nobody has found a use for Xeon Phis yet. Aren't those the shitty Atom cores or something?
This thread is about CPUs that do actual work, friend.
>Memes like "VDI, dense, DBMS, analystics" are there to misdirect you from its lack of gaming performance
Yeah server grade chips having bad gaymen performance will be the death of AMD
They know it too which is why they list useless features instead of average frame times in AAA games
>they're still slower in games than 2 Intel cores
lol, even the similarly priced 1500X/1600 can beat that shit in basically everything.
What part of
>STUPID efficient
did you not understand?
>That's KNL and it has nothing to do with mainline CPUs.
Yeah, but there has to be some kind of reason for this **extremely** **rare** Intel price cut.
What would explain it better than fear that Epyc could cut into the lower end of the HPC space?
No, these are PCI-E coprocessors and no one simply buys them.
>what is the G4560
>1/4th the price
>1/2 the cores
>1/4th the threads
>110% the FPS
It's one of the very fastest-growing segments of the market. It's true that currently, HPC revenues are not a high percentage of overall revenues, but this market is a future play for Intel.
The timing of the price cuts is extremely suspicious. Notice that Intel released a new price list on the 18th, just 11 days after their previous price list -- just to make drastic cuts in only Xeon Phi. That smells of desperation, timed to coincide exactly with AMD making a big new reach for the low end of the HPC market.
How can anyone seriously claim that's a coincidence?
Maybe the price cuts are simply because Intel started feeling guilty about overcharging so much all these years?
G4560 gets rekt by even the lowest end R5 1400. And Raven Ridge APUs will compete with it directly.
>The last good thing from intel is the £50 g4560 that cannibalises intel's entire i3 lineup
Because motherfucking V100, user. Not like NVIDIA will ever yield enough of these fuckers for at least a few server rooms, but come on.
Yeah, they pretty much had to obsolete the entire line of i3's to "win".
if you pay more than $120 for a CPU, someone should cut off your hands
Ty for admitting
£150 = dominated by 7700k
Literally no reason for AYMD to exist in the consumer market atm.
Server CPUs aren't your typical toaster oven PC fare.
It's more like those are the only two intel CPUs still worth half a shit, and everything else is garbage compared to Ryzen.
we are not talking about server hardware
>4560 gets rekt by even the lowest end R5 1400.
If by "rekt" you mean being equal and superior in some gayms then yea
For 1/3rd the price of the shitty 1400 mind you.
intel doesn't advertise Xeons for gaming either, you mindbogglingly dumb motherfucker
Not really. youtube.com
And like I said, the real competitor to the G4560 is the Raven Ridge APUs which aren't even out yet. It's doomed once those come out.
I mean if you want
>MUH FPS
then you should get back to your containment board
>onee-chan
>youtube.com
Pajeet needs his little pro-AMD safespace waaaaah
A xeon phi is Intel learning the hard way that a gpu isn't just thousands of tiny cpu cores glued together.
Didn't even do this on purpose
HOLY KEK
>rooting for intel
>the year of our lord
>2 shekels have been placed into your account
How GNU are you?
...
>gets btfo
>resorts to memes
Yup, this is Sup Forums
>The power draw is directly going to heat.
No, it's going into computation and heat.
Learn thermodynamics. Work is being done, a CPU is not a perfect resistor.
>thread about server cpus
>Sup Forums kids turn it into another "MUH FPSSS" episode
Fuck off already
>converting energy to "computation"
Lol, retard. You are the one who needs to leas thermodynamics.
these CPU specs look more like GPU specs, but really shitty ones. And hugely overpriced.
>learn
FTFM
Cherry picking is a meme here newfriend
But this user is right.
The power in != heat produced.
Yes, they're related, but the heat is a function of power in and the resistance in the circuits.
A chip could draw the same power as another chip but have less resistance (i.e. fewer transistors) and therefore produce less heat.
retard
correct
But it doesn't really matter anyway. In the very near future everything will place a high premium on multi-threaded rendering.
Vulkan, DX12, and Metal are all extremely focused on enabling multi-threaded rendering. UE4, Unity, Source 2, and id tech all support Vulkan already. Every other engine is doing the same. On every platform.
Multi-core is the future. Period.
le epic win XDDDD
>I live in the city of the fucking newspaper that owns it.
And how is that relevant at all?
xcelerit.com
The Xeon Phi costs literally 1/10 the price of the Tesla it's being compared to, and it's not 1/10 the speed.
Nobody cares about price when it comes to accelerators.
No. Do you know what's silly? Using way too much voltage on your 7350k to achieve 5GHz. Anything above 1.35V is just asking to grenade it; even with liquid cooling you shouldn't go past 1.36V. I'm sitting comfortably at 4.6GHz at 1.335V without even using 50W during CPU benchmarks. AMD shills will do anything to defend their 180W stock clocklets.
>m-muh DEGRADATION
Literally reduces the lifespan of your chip from 15 years to 11, who fucking cares
Clocklets, ha. That's a new one. Still, I'd rather be down 500-600mhz on the approach to 5ghz than have a quarter of the threads. Corelets are truly the greater evil.
I completely burnt an AMD CPU within 6 months of purchase by running it at 1.5V, to the point where it wouldn't even stay stable underclocked. It matters a lot more than you think.
Because there's a difference between near suicide voltages and relatively safe voltages.
And I've seen people run Phenoms, Sandybridges and Bulldozers for years clocked to the fuck wall with nothing happening.
>I completely burnt an AMD CPU within 6 months of purchase by running it at 1.5V
I take it it wasn't a Bulldozer-based chip then. That arch was designed to handle way, way beyond what anyone can actually cool in a home environment (to wit, if you go for a motherboard block as well, the likes of a 9590 can handle nearly 1.6V). In fact the highest single-core OC ever still belongs to a Bulldozer-based chip.
what's with the obsession with 5GHz? The voltage needed is insane anyway
just get a 4790K and 5GHz is easily reachable under 1.4V with a top mobo and PSU
but yeah, below 1.3V is the normie safe zone. Over 1.3V 24/7 eats into lifespan, although the makers themselves spec the chips at around 1.45V, since heat degrades them much more than voltage does. But like he said, 1.3-1.35V.
Again, most 5.0GHz 7350Ks are not running at sane voltages. The one I saw on a review site had it running at around 1.75V. Those power draws are an unrealistic representation of Kaby Lake overclocks at sane voltages.
this
>what's with the obsession for 5Ghz
5GHz was irrelevant when AMD got there. Now that Intel can finally hit such clocks reasonably reliably, actual megahertz (in b4 megahertz myth) is all that matters.
That's an extremely simplified way of stating it.
Right now, we don't really have a way of performing computations without generating heat (look up Landauer's principle). Thus, heat is a byproduct of computation; some form of "work" is being done to produce heat. Electrons are being forced around through transistors in meaningful ways, but since they're not physically doing any work, we technically consider all power to be converted directly to heat.
The problem is that it's much more complex than that. We know something happens because of the power draw. It's one of those things where we need to expand our terminology to explain it.
Actually, there is electrical work being done.
Why are you faggots talking about consumer budget CPUs in a server CPU thread?
I bet you thought Fallout 4 was good
...
Kek, but what's the sauce on the game, assuming there is one?
I'd guess Higurashi.
t. don't play games, didn't watch the animu
there is a reason, intel binning
if it weren't for i3s padding their bottom line, i7s and i5s would cost far more
because it's all intel has to offer at this point. You'd have to be insanely high to ever consider using xeons or servers now.
It's called switching you retard. It takes energy to turn a FET on and off or hold it on because gates have resistance, capacitance, and leakage to the source and drain. The faster you switch, the more you lose to the gate resistance and capacitance because the reactance goes down as frequency goes up. This is basic EE, and all the terminology is well established.
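The switching cost described above is usually summarized as the CMOS dynamic power formula P ≈ a·C·V²·f. A minimal sketch of how that plays out; the activity factor, capacitance, and operating points here are purely illustrative assumptions, not measurements of any real chip:

```python
def dynamic_power(activity, capacitance_f, voltage_v, freq_hz):
    """CMOS switching power in watts: P = a * C * V^2 * f,
    where a is the activity factor and C the switched capacitance."""
    return activity * capacitance_f * voltage_v ** 2 * freq_hz

# Same hypothetical block of logic at two operating points.
# Raising V and f together makes power grow much faster than clock speed.
low = dynamic_power(0.2, 1e-9, 1.00, 2.7e9)   # ~0.54 W
high = dynamic_power(0.2, 1e-9, 1.35, 5.0e9)  # ~1.82 W
print(high / low)  # ~3.4x the power for ~1.85x the clock
```

Leakage and gate resistance add on top of this, but the V² term alone is why "a bit more voltage" costs so much.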
Power per core is something that everyone would do well to understand, along with power increasing as the square of voltage.
Desktop Ryzen CPUs show a power per core of about 5.5W at 3GHz nominally. Some obviously better, some a little worse.
If you have 4 Zeppelin dies, each with 8 cores, and each core is pulling ~4.5W at 2.7GHz, you have a power consumption of 144W without any uncore. Factor in uncore of roughly 10W per die and you hit the 180W TDP mark.
Of course power consumption does not directly correlate to TDP, but the two can be the same if the safe temperature range and thermal conductivity of the die and package allow. In the case of AMD's 14nm LPP parts the TDP does closely fall in line with average power consumption.
In the case of Intel's i3, just look at the voltage scaling of the other Skylake and Kaby Lake chips. A Kaby Lake core requires 12.5W to hit 3.6GHz. Reaching 4.2GHz requires 21.5W. A 600MHz clock increase requires almost a 75% increase in power drawn per core.
As clocks increase further, the voltage required increases, and power scaling as the square of voltage really starts exploding.
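The arithmetic above is easy to check. A quick sketch; the per-core and uncore wattages are the thread's estimates, not official AMD figures:

```python
# Epyc back-of-envelope: 4 Zeppelin dies x 8 cores, estimated draws.
DIES = 4
CORES_PER_DIE = 8
WATTS_PER_CORE = 4.5   # assumed per-core draw at ~2.7 GHz
UNCORE_PER_DIE = 10.0  # assumed fabric/IO power per die

core_power = DIES * CORES_PER_DIE * WATTS_PER_CORE
package_power = core_power + DIES * UNCORE_PER_DIE
print(core_power, package_power)  # 144.0 184.0 -> right around the 180 W TDP

# Kaby Lake scaling cited above: 12.5 W at 3.6 GHz vs 21.5 W at 4.2 GHz.
scaling = 21.5 / 12.5
print(scaling)  # 1.72 -> ~72% more power for a 600 MHz bump
```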
AMD generally goes by "average peak usage" for their TDP.
Which means either the (8?) 2 core max turbos run through some common programs and how much that uses, or the 2.7GHz all-core turbo for some programs.
So yeah, it'll be a bit less than the theoretical TDP of the all-core. Though if I'm not mistaken, they traditionally measured Opteron TDP differently since it's more common for HPC to run flat out a lot.
And for those that are wondering, Intel TDPs are based on their base clocks. That's why they're so hilariously off.
>50% price cut
god damn, the intel jews were really fucking you.