Bad multicore benchmarks

>bad multicore benchmarks
>"our engineers will optimize it"
>no single-core benchmarks
>building a fucking computer in VR because it's The Future™
>playing battlefield as a CPU benchmark
>dota streamer who can't into hardware encoding
>a fucking obese female "gamer" playing battlefront
That's what you get with a female CEO.

literally incomprehensible

I'm talking about AMD's stream where they showed Ryzen. It had all those elements described in the OP. The woman in the picture is the CEO of AMD.

what did you expect? they had to cherrypick scenarios that hid the fact that zen only has clock for clock perf equivalent to sandy bridge while making it appealing to gaymers.

If you are this fat you should consider suicide.

so? what's your point? back to if you want 100 guaranteed replies under one hour.

what's with the lesbian style? She would be qt with normal long hair

Equal number of cores between both CPUs, they're destroying Intel IPC-wise, and probably clock-wise too
With so few replies you won't get paid

why? at that size you only have a few years to live anyway

>madkek

>That's what you get with a female CEO.
still better than what was happening with previous limpdick CEO

>bad multicore benchmarks
uh, I'm pro Intel but AMD had the best multicore til 2013

>dota streamer who can't into hardware encoding

Hardware encoding is shit for streaming.

You need at least 15 Mb/s to achieve decent quality compared to CPU x264 at 3 Mb/s.

I rarely see equipment that can do H.265 HW encoding...

How well do newer 2-pass HW solutions fare in comparison to x264?

DELET THIS NUMALE!

>building a fucking computer in VR
wat

>stock value quadrupled in a year
>huge ass SoC contracts
>company making actual money for the first time in forever
>GPU market share up 10%
>caught up on massive technical debt

by all metrics she is doing a terrific job

Zen isn't even faster than a stock 2600k in AMD's own cherrypicked benchmark.

Let's hope AMD at least sells off RTG to someone competent before they go bankrupt.

Who was that obese woman doing the battlefront demo? She couldn't even work out how to start the game. Reminded me of that ugly fat one from Lady Ghostbusters who had her nudes leaked

>that ugly fat one from Lady Ghostbusters who had her nudes leaked
the one on the right here?

that's not a female.

>implying Sup Forums, I mean Sup Forums, knows or cares about anything aside from muh gaymes

Their market strategy is great. Basically, market their CPUs on the basis that the majority doesn't know anything about the technology and is only looking at buzzword metrics, comparing them against other products' buzzword metrics and the cost.

"ooooo, dis one has 4.5 gb speed and is $200 cheaper than that one with 3.5 gb. It must be really good! Look, it even has more cores!"

>>bad multicore benchmarks
>beating a $1.2k Intel CPU
>bad

>>"our engineers will optimize it"
Happens all the time with any semiconductor company, so yes, they will optimize the final silicon.

>>no single-core benchmarks
This is worrying indeed.
Still, assuming the chosen multicore benchmark workload is somewhat representative of Zen's general performance and their SMT implementations are about the same, you can assume that Zen's single core performance is in the ballpark of Intel's.

>>building a fucking computer in VR because it's The Future™
The demo chosen was retarded, but VR/AR definitely is The Future™

>>playing battlefield as a CPU benchmark
It's something many potential buyers will do.
The event was a "Fan Event" after all.

>>dota streamer who can't into hardware encoding
It's a CPU benchmark.

>>a fucking obese female "gamer" playing battlefront
pic related

>This is worrying indeed.

Pick your poison, since Intel recommends Blender as a benchmarking tool.

1) Lower IPC but a superior SMT implementation
2) Worse SMT but better single core

You can't have both worse SMT and worse ST while beating a 6900k in a piece of software AMD's older architectures get absolutely slaughtered in - even when they were new.

...

thanks for the advice
i will consider it

ebay.com/sch/i.html?_from=R40&_sacat=0&_nkw=2600k&LH_Complete=1&LH_Sold=1&rt=nc&_trksid=p2045573.m1684

you can get a 2600k for $100-140 these days, which is pretty good considering the terrible stagnation in newer uarchs put out by intel and the fact that zen is already disappointing

>Zen isn't even faster than a stock 2600k
Source?

Multicore is a meme. They should figure out a way to reduce the latency from shoving your lambda to the threads and getting the result back down to at most 1 microsecond. This latency is 20-50 microseconds at the moment, which is good enough for the things that need to be parallelised, but slow enough that you can't just parallelise everything by default and expect it to be faster than the single-core version.

What most developers want is to say: I want to run these four lambdas on different cores RIGHT FUCKING NOW, not after the OS decides to schedule the damn threads.
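If you want to see that dispatch overhead for yourself, here's a minimal C++ sketch (the 20-50 microsecond figure above is the post's claim; actual numbers vary wildly by OS, CPU, and load) comparing an inline lambda call against spawning a fresh thread for it:

#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;

    volatile int sink = 0;
    auto work = [&] { sink = sink + 1; };

    // Inline call: effectively free.
    auto t0 = clock::now();
    work();
    auto inline_ns = std::chrono::duration_cast<std::chrono::nanoseconds>(
                         clock::now() - t0).count();

    // Spawn-and-join: pays for thread creation plus OS scheduling latency,
    // typically tens of microseconds -- the overhead being complained about.
    auto t1 = clock::now();
    std::thread t(work);
    t.join();
    auto spawn_ns = std::chrono::duration_cast<std::chrono::nanoseconds>(
                        clock::now() - t1).count();

    std::printf("inline: %lld ns, spawn+join: %lld ns\n",
                (long long)inline_ns, (long long)spawn_ns);
}

Real code amortizes this by dispatching to an already-running thread pool instead of spawning, but even a pool handoff doesn't get you anywhere near the 1 microsecond the post is asking for.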

His rectum.

The intel defence force has been working overtime these last few days, and now the hot new thing to claim intel's superiority with is power usage delta.

Sup Forums is the same as OCN, anandtech and the like (but better than Hardocp) in that it's currently trying to disprove what AMD has shown of zen with memes. Hardocp is actually convinced zen is slower than sandy vagina.

>anyone who doesn't blindly believe AMD's marketing BS is shilling for intel

kek, imagine the AMD fanboy butthurt if it was NVIDIA or Intel putting out these hilarious demos that don't represent real world perf.

every CPU with any sort of cache is optimized for the case of hitting peak performance after "warming up" the critical event processing loop.

you're never gonna see the day where threads can be spawned, scheduled, and execute meaningfully in a few thousand cycles, since even normal single threaded code isn't expected to warm up in a core that fast.

even AMD shills are starting to admit that Zen is DOA.

soon they'll switch to their shilling tactic of 'it was never meant to beat Intel in the first place, AMD wanted to make an inferior product'.

>vlc

Huh, it's almost like EVERY company biases benchmarks in favor of their own products...

Maybe instead of sitting around arguing over made-up numbers, we could wait until some third-party reviewers have got hold of the thing and given us less-biased comparisons?

Just up the TDP to 140W like the i7 6900K. AMD has XFR, which automatically raises clocks when there's thermal headroom.

Remember how AMD cherrypicked Blender benchmarks with Bulldozer too?

This, it's like everyone has the memory of a bird lol

That's what you think. My grandfather has been this big for my whole life and he's still alive at 74.

it makes sense that they would try to get whatever cash they could out of faildozer while they scrambled to make zen for half a decade. it's still pretty staggering how many excavator chips they sold, considering how awful it was. I guess we should thank consolefags for ryzen.

but I don't think the "market strategy" you're talking about was particularly effective. they lost nearly all their market share in the PC market. that isn't great by any stretch of the imagination.

my money's on better SMT

either that, or their branch prediction is fucking amazing and zen "ramps up" ridiculously well

Things that never happened: The Posts

The Bulldozer arch did terribly in Blender. It never favored their arch at all.

>Intel recommends Blender to benchmark their high end CPUs
>B-B-B-B_B_B_B-B-UUUUUUUUUUT AMD CHERRYPICKED

Also, Bulldozer was never competing against a CPU with the same core/thread count.
It was always 8c Bulldozer vs 4c/4t or 4c/8t Intel, where BD will obviously have the advantage.

Proof

>less is better
are you trying to be retarded?

Well, at the very least, ramping up can't mean adjusting clocks, as AMD demonstrated Zen locked to 3.4GHz.

Piledriver doesn't even compete with Sandy Bridge, 4 modules vs 4 cores without HT.
Blender isn't a cherrypicked, AMD-favoring benchmark. It isn't a synthetic, it's a real-world workload, and it heavily leverages FPU ops that the entire BD family of architectures did terribly in.

Intel shills are totally BTFO over Ryzen performing so well in Blender.

You're illiterate.

Then I guess handbrake must be AMD cherrypicking because integer workloads are a strong point for bull- i'm sorry, I can't keep a straight face while typing that.

>Intel shills are totally BTFO over Ryzen performing so well in Blender.

Pretty much.

any price leaks?

This

>Woman as CEO
No wonder their drivers are terrible.

Nothing substantiated, but it's been expected for a long time that the top-end SKU would sell for around $500.
Some Chinese "leak" showed as much, though it was nothing but a collection of rumors and already-known info, so it's entirely baseless.

Given how they positioned it for streaming I don't think they're going to go near the upper i7E prices. It'll probably be between $350 and $500.

I think he meant from Lisa Su onwards.

There is not much AMD could have done on the CPU segment without Zen, as Bulldozer and its derivatives were a disaster, and they didn't quite have the money to force OEMs to use it, like Intel did with the Pentium 4.

They have recovered a bit on the GPU segment though, with a different approach to it.

You are fucking retarded

butthurt intel shill

>Just up the TDP to 140W like the i7 6900K.

The Broadwell-E chips only have higher TDPs from their massive (but rarely active) AVX units, and integer workloads won't even come close to them.

Zen is decent, but it's not like magically 33% more energy efficient than Broadwell.

This. My 955BE from 8 years ago still eats the i3-6100 for breakfast.

Battlefield 1 is very CPU intensive.

By the way, some guy who was at the presentation and got to play BF1 on both the Intel and Zen systems commented that the Zen system had FPS drops below 60, into the high 50s, unlike the Intel system, which dropped to the 60s but stayed mostly in the 70s.
Both systems were running a Pascal Titan X.

It looks like it will have lower single core performance.

There is more to the TDP disparity than just AVX. The Broadwell-E line typically pulls 100-110W under a nominal heavy load. Big AVX will stress them to 120-130W, and clock rates will drop.
The logic is simply dense and takes a bigger heatsink to cool effectively, so the TDP rating is higher. Under lighter workloads they can pull ~90W.

Zen-based parts do have lower TDP, but Zen cores are also smaller than Intel's Core i arch. Intel has native 256bit datapaths. Zen is only 128bit. The FPU isn't nearly as big.
It's just a smaller core, though it's still the biggest thing AMD has ever made.

>Intel has native 256bit datapaths. Zen is only 128bit.

which datapaths are you even talking about?
Everything I saw for Zen showed 32B/clock transfers except L1D loads.

>cache
LOL, I see you're confused.
I'm referring to the FPU itself. Zen has two 128-bit FMACs.
Intel's latest Core i arch can natively handle 256-bit ops. Zen cannot do the same; it has to take a performance hit by splitting them across the two 128-bit FMACs.
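For illustration, here's what one of those 256-bit ops looks like in code (a sketch only; compile with -mfma, and note the split-into-two-halves behavior is a property of the hardware, not anything visible in the source):

// One 256-bit fused multiply-add. On a core with native 256-bit FPU
// datapaths this executes as a single op; on Zen's 2x128-bit FMACs the
// hardware cracks it into two halves, halving peak 256-bit throughput.
#include <immintrin.h>
#include <cstdio>

int main() {
    __m256 a = _mm256_set1_ps(1.5f);
    __m256 b = _mm256_set1_ps(2.0f);
    __m256 c = _mm256_set1_ps(0.5f);

    __m256 r = _mm256_fmadd_ps(a, b, c);  // r = a * b + c across 8 floats

    float out[8];
    _mm256_storeu_ps(out, r);
    std::printf("%f\n", out[0]);  // 3.5
}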

See this is why I preferred the real world software demo. I have no fucking clue what any of this means, and I couldn't give two shits either.

Does it run the programs I intend to use better than an i7 for the same price?

Yes - I buy it. No - I buy Intel.

Being that fat is suicide

people don't usually refer to ALUs as datapaths, but whatever floats your boat, champ.

Software should always be designed to utilise multiple cores. Ever wonder why servers and supercomputers mostly use Linux? Because Linux offers better performance on multi-core systems. For commercial systems, evolution has completely disregarded single-core perf. Intel and brainwashed gamers are obsessed with a false economy of improving single-core performance, encouraged by lazy game design.

Multiple cores have greater potential than single-core optimisation; only lazy programming holds the current dynamic in place. It will be broken soon, since both major consoles now follow the multiple-core approach, themselves exposing the bullshit of single-core focus.
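As a trivial sketch of the kind of parallelism being argued for (standard C++ only; the buffer size and chunking are illustrative, and it only pays off when per-chunk work dwarfs the thread dispatch cost):

#include <algorithm>
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

// Split a reduction across all hardware threads.
int main() {
    std::vector<int> data(1 << 24, 1);
    unsigned n = std::max(1u, std::thread::hardware_concurrency());

    std::vector<long long> partial(n, 0);
    std::vector<std::thread> workers;
    size_t chunk = data.size() / n;

    for (unsigned i = 0; i < n; ++i) {
        size_t begin = i * chunk;
        size_t end = (i + 1 == n) ? data.size() : begin + chunk;
        // Each worker sums its own slice into its own slot (no sharing).
        workers.emplace_back([&, i, begin, end] {
            partial[i] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0LL);
        });
    }
    for (auto& t : workers) t.join();

    long long total = std::accumulate(partial.begin(), partial.end(), 0LL);
    std::printf("sum = %lld\n", total);  // 16777216
}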

My grandfather was that fat until he was 87 years old, when smoking killed him, a habit he'd had since he was 16. I was obese all my life, my arteries tested totally clean a few months ago, and my blood tested like that of a sportsman, were it not for the high uric acid content.

Don't believe everything doctors tell you.

It's not just ALUs, champ. It's everything in the pipeline that feeds instructions to the executing logic, from decode to retire.
The entire physical path that said ops take. The datapath.

Google the term and learn something.

But the doctors told you there was nothing wrong with you?

Tests told me there was nothing wrong with me. Values on a paper do not lie. Doctors who tell you being fat is a health risk just by looking at you do.

WHY MUST EVERYTHING BE SO STUPID NOWADAYS

I REALLY NEED TO PUNCH SOMETHING SOON

awww a new pepe how cuuuuuuute

Tbh Hardocp was on point with the RX 480 prediction.

Not everyone has a dedicated encoding/capture rig. You'd be mental to encode in software on the same system that you're using to play games. It's just asking for frame drops.

>4 terabytes of captures in OBS the last 4 years
>0 frames dropped
At least you tried...

Yea okay, maybe if you play some games which don't use the CPU at all. Even I get 15% CPU usage with some really simple settings.

Replace your shit CPU then. Chrome with shockwave eats more CPU than OBS.

Okay, I'll get right on buying a 1800 dollar CPU.

Don't you see the arithmetic here? If Chrome eats way more CPU with a shitty flash plugin than OBS does, then you are doing something very wrong in OBS. This is a core i5 on base clock for god's sake.

Then why don't you reveal your magical settings that not only use low CPU but also produce good results? An i5 at base clocks will drop frames like crazy on the SW encoder if you run GTA5, BTW. The game alone will max out all 4 cores at 100%.

Retard

No it doesn't, it runs at 30-60fps on an i3 dual-core with the highest texture settings and the rest on medium, and if you're too cheap to buy broadcasting software with GPU encoding support, then it's just your own fault.

I tolerate a lot, but talking shit about GTA V? Not on our watch, kiddo.

>inb4 hurr durr what are multipliers

Maybe you should stop trying to deliver the same thing you see on your monitor to the stream? I mean give me a break, it's a fucking stream. You don't need to show people what 60FPS with 1440p feels like in a fucking window that is not even fullHD in size. Do you know how many people watch streams in full screen?

Zero point fucking zero something percent.

>4k gayming meme

I want it to stop.

We are talking about CPU encoding here scumbag. I can encode on GPU just fine if I want to.

And yea, it runs 30-60 FPS... if you turn the CPU-intensive settings down.

I've never watched a stream not full screen.

>We are talking about CPU encoding here
How fucking stupid are you?

>he has no quicksync

lmaoing@your life

The fuck is your problem, retard? Why do you butt into conversations you have nothing to do with and which don't interest you? You fucking autistic or something?

you're all retards. i want amd to do extremely well with zen, just so intel has to actually do something rather than just the bare minimum.

for too long intel has done just enough, so they produce shit that's slightly better each time. if amd delivers, i want intel to go full fucking force.

and not even considering how prices would drop. everything is looking bright.

>Tests told me there was nothing wrong with me.
Tests that doctors did for you, moron.

Also, it doesn't matter if you think you're healthy. You're still a fat sack of shit.

Because I have no respect for you, shithead.

>I have no fucking clue what any of this means, and I couldn't give two shits either.

They're arguing about terminology for the floating-point SIMD for the processors.
It barely matters, since more and more things are moving to GPU offload (for fp32, at least).

More like you're trying to save face because you have no idea what was being talked about until it was pointed out to you.

If you can't see the difference between drawing conclusions from a personalized test result and someone just generally assuming something and forcibly trying to apply it to you without ever having looked at your test results, then I am sorry for you.

You deserve your quack doctor.

Less is better. Those both show the AMD CPU's doing the worst of all the tested ones for those graphs.

Retard.

How are you this fucking illiterate?
How is everyone in this thread so illiterate?

>Battlefield 1 is very CPU intensive.
yeah nah, it's not; my i5 handles it just fine, it's more GPU-intensive

>Do you know how many people watch streams in full screen?

That's how I watch them.

Intel shills in full damage control mode.