Barely faster than a 6950X

>barely faster than a 6950X
>a $1500 processor that has similar performance to a $500 Ryzen 1800X


HAHAHAHAHAHAHAHAHAH

*breathes in*

HAHAHAHAHAHAHAHAHAHA


Geekbench3 is cache bound garbage and that's the reason Apple uses it all the time.
It's inflating scores like old CPU-z did.

DESI/g/nated

Wait for Cannonlake

>slower than 14nm

Mobile only.
You've got another Skylake refresh before Icelake hits in late 2019

DELET THIS.

You weren't fucking joking, lol

...

>308W
Sides status: currently in orbit
House status: currently burnt down to ashes

>308w

delid

>$500 Ryzen 1800X
You mean $450

>three hundred and eight watts

outside the US it's still $500 or barely above

>308W
what le fug

>Skylake-X
>Coffeelake
>Cannonlake
>Icelake
>Geminilake
i honestly lost track of what's what

>Applebench

haha... surely that simply has to be something like Nvidia's power limit setting, right? That's more power for the CPU than most PCs draw in total.

they like lakes apparently

yellowlake when?

...

THREE HUNDRED AND EIGHT WATTS

NOT EVEN THE 5GHZ FX SERIES WAS THIS RETARDED

Performance-lack
Efficiency-lack
Competitiveness-lack
Integrity-lack
Honesty-lack
Freedom-lack
Goy-lack

I'm pretty sure that's not correct, but the performance being so mediocre is pretty much expected. All they did was drop the price from $1700 to $1000, really.

the funny thing is that most amd products carry so many driver and overheating problems that people (including me) are willing to overpay just to get intel/nvidia.
>t. my first amd gpu

AMD's GPUs and drivers aren't doing that badly lately. They just have nothing to compete with in the high end until Vega.

You will NEVER convince me to EVER go back to amd. I will just wait till the gen following this to upgrade my 6700k

A sheckel for the good goy.

>I will just wait till the gen following this to upgrade my 6700k
Coffee Lake S just got delayed to Feb 2018. And it's only going to be 14nm+ to 14nm++. So basically, it's fucking nothing. 3-5% perf. increase if you're lucky. 10nm on desktops is still MIA.

You're going to be waiting until 2010 or 2011 to see anything interesting out of Intel again. When their new architecture comes out.

Oh ok I will just hop into my tardis and go back in time to pick up those upgrades then

Oh fuck, I derped. 2020/2021. wccftech.com/intel-developing-new-x86-uarch-succeed-core-generation/

This is what I get for shitposting before my morning coffee.

AMD drivers actually have lower overhead because they implement a lot of things in hardware instead of in drivers like Nvidia does, which really shows with new efficient low-level APIs like Vulkan and D3D12.

>just wait 4 years goyim
Not gonna work, schlomo.

So you prefer buying from a company that gimps their GPUs once their new line releases
>t.970 owner

I have a 1070 and I have not noticed any gimping as you claim

Because 1070 is still current gen?

Did the 1170 get released?

That is the newest GPU from them you butt. I'm still on Kepler, this thing is practically worthless even though theoretically it's still not that bad of a GPU.

No, the Titan is. So is current gen just the forerunner?

Titan is the flagship, retard, not the entirety of the current gen.

Volta GPUs when

Strange, my 780 lasted around 4 years before I moved to the 1070 when it finally started to show its age

It was outperformed by the fucking 7970 in most new games for at least two of those years.

Ya sure pal, I had no issues with it. The one amd card I owned was slow, shitty, and ultimately died on me. I have never had an nvidia die on me, plus amd has no cards in the high-end

You guys know whats the funniest part.

AMD basically glued 2 R7 CPUs together, right? Nothing wrong with that, Intel did the same a couple of years ago

The reason Intel can't do this anymore is that its TDP would be around 360-400W

Just Wait goy, you're a good goy right?

>I got one bad GPU therefore the entire company and every product they make is shit for all eternity
Are you like, 12? By any chance? What partner brand was it anyway?

Q2 2018.

You need to wait, gweilo.

>308 watts
It would be more efficient to run my small air conditioner at 69 F during the winter than this heate- CPU...

T H R E E H U N D R E D E I G H T W A T T S

>The reason Intel can't do this anymore is that its TDP would be around 360-400W
I thought it's because their arch literally can't support it.

It was an Asus, and every Asus nvidia card I have owned has been stellar. I am 30, actually, and like I said, amd has no high end; they have things that barely compete with the 1070

>mfw could ask someone what has a lower wattage, a 9570 or an intel chip
>it's actually the 9570

You know ASUS isn't saving up better components for their NVidia cards. You just got unlucky.

Something seems really off in (), do processors normally use such high amperage?

Below the "308W", there is "Max Current = 1023.875 Amps".. Welders don't even go that high.

Yeah that is definitely a bug.
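For what it's worth, 1023.875 A smells like a saturated telemetry register rather than a real measurement. A quick sketch, assuming (and this is purely an assumption, not anything documented for this chip) a 13-bit raw current field with a 1/8 A LSB, an all-ones value decodes to exactly that number:

```python
# Hypothetical decode of a saturated current-telemetry register.
# Assumed: 13-bit raw field, 0.125 A per count. These parameters are
# a guess chosen because they make "1023.875 A" the field's maximum.
LSB_AMPS = 0.125
FIELD_BITS = 13

raw_max = (1 << FIELD_BITS) - 1   # all bits set -> 8191
amps = raw_max * LSB_AMPS

print(amps)                       # 1023.875, matching the bogus reading
```

A maxed-out field usually means the monitoring software is reading the wrong register or misparsing it, not that the chip is actually pulling a kiloamp.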

You still have not addressed the no high end argument

It sucks, but AMD is a smaller company, and they're split between CPUs and GPUs. They can't push products out as fast as NVidia. Vega should have been out 2 months ago. Shit happens.

Thanks for the advice. I will never buy slow, shitty cards that die on me. Fuck AMD!!

So if their test software is buggered, can we even trust the performance score it shits out?

(the proper answer is: "No, no we can't")

The 308W seems accurate, since the officially declared TDP of 140W is measured at base frequency only. Once it starts turbo boosting, the TDP and power go through the roof.

The end result Cinebench score is a different thing from a misread voltage level.

Nothing stops them from making MCMs of LCC dies and linking them with QPI.

So why does wattage shoot into the stratosphere when they try and do this?

Except maybe the dies melting through the board unless they target 2 GHz clocks.

It's a reporting error. But you can expect like 250-300A. It's 300W at 1-1.2V
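Quick sanity check on that 250-300 A ballpark, assuming a realistic Vcore range under load (the 1.0-1.2 V values are assumptions, not measured figures for this chip):

```python
# At roughly 1.0-1.2 V Vcore, a 308 W package draw implies
# somewhere around 257-308 A of current (I = P / V).
P = 308.0                      # reported package power, watts
for vcore in (1.0, 1.1, 1.2):  # plausible load voltages, assumed
    print(vcore, "V ->", round(P / vcore, 1), "A")
```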

Yes we can. Sensor data being misreported or misunderstood by software has 0 impact on the performance. Sensor reading is an isolated activity so to speak, in that it has no impact on the rest of the function of the CPU and in many cases even older CPU/chipsets still have reporting errors.

It's believed one of the sensors that AMD uses still misreports even though it's been like 5 years. We can still trust performance scores.

>JUST WAIT(tm)

Haven't noticed any significant performance loss on my 980.
Will be upgrading to Vega for 4K/VR and freesync soon if it performs well for the price. Tempted to just get a 1080Ti and call it a day though, the wait is killing me.

they should bring the bridges back, at least they were good

ASUS's AMD cards pre-Polaris had literally no effort whatsoever put into them.

The Hawaii cards specifically had the same heatpipe layout as Nvidia cards, so it completely missed the VRM.

My last AMD card was a Sapphire HD6870 and it was pretty dope.

So how long until Intel officially recommends that these only be used in an air conditioned server room with a fire suppression system?

Too late ..

1023 A / 308 W = 3.2 V

3.2 Volts is normal for a CPU. Not a problem.

>308 Watts
I think that even an OC'd dual Nehalem rig can't reach that consumption on the CPU alone...

FUD
U U
DUF

personally i like msi amd cards, but sapphire is fine

P=UI

It's not 1023A/308W but 308W/1023A = 300mV or so.

But U=P/I, it should be 0.3 V.
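Spelling out the arithmetic from the two posts above, it's just P = U·I rearranged:

```python
# P = U * I, so U = P / I. Plugging in the reported numbers:
P = 308.0        # reported package power, watts
I = 1023.875     # (mis)reported current, amps

U = P / I
print(round(U, 4))   # ~0.3008 V
```

0.3 V is far below any plausible Vcore, which is one more hint that the current reading, not the power figure, is the bogus one.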

I just followed a guide online and failed :(
Okay, so, how many volts are flowing through?

Are you retarded? Both of the people you quoted told you the exact voltage.

I mean, what is the correct voltage that the CPU is using, because my equation is wrong. So, what's the correct equation to find the correct amps and volts the CPU is using?

The one we used. That is, if we had correct data which we don't.

>just wait

Wasn't that supposed to be the AMD fanboy line?

Yes. The irony is quite delicious.

>wait
so Intel is now Just Wait ? Slurppppppsss

>300watts
WHAT THE FUCK ?

|
|>
|
|3
|
|

3 0 8 W A T T S
0 8 T
8 T
W A T T S
A
T
T
S

Lmao do I even need to run the heat during the winter with this?

INTEL IS GETTING BTFO QUICK SWITCH THE TOPIC TO AMD GPUS

HAVE YOU EVER ASKED YOURSELF, WHAT IF MY CPU CONSUMED MORE POWER THAN A 290X OR A GTX 480 WELL INTEL HAS THE ANSWER FOR YOU

INTRODUCING THE NEW SKYLAKE-X CPU NOW WITH MOAR CORES THAN AMD AND MORE POWER CONSUMPTION THAN ANY CPU OR GPU ON THE MARKET

380x works on my machine

Sometimes I can't imagine how you can go beyond 200 watts in a CPU but then I remember the 295x2 exists and shudder a little.

...

Nice update on that old maymay.

OY VEY
SHUT THIS THREAD DOWN NOW

I mostly post this to remember that it is seven years old now.

Having a million fucking cores, what do you expect? Why did they sell this to consumers?