Overclocked, power-hungry housefire

>overclocked power hungry housefire
>still loses to stock, locked i5
>not even better value sans electricity cost

What was the point of Ryzen?

>720p

>being this ignorant

>muh 720p

S-SOPA!?

If you read your own benchmarks, you can see the 1600 is barely more $/frame. Considering you're getting double the threads of the 8400, plus the reassurance that the socket isn't going to get switched immediately, the 1600 is still a decent buy.
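The $/frame claim is easy to sanity-check yourself. A quick sketch in Python; the prices and fps figures below are made-up placeholders, not the thread's actual benchmark numbers, so plug in the real ones:

```python
# Cost-per-frame comparison. Prices and average fps here are
# illustrative placeholders, not real benchmark results.
def cost_per_frame(price_usd, avg_fps):
    """Dollars paid per frame-per-second delivered."""
    return price_usd / avg_fps

r5_1600 = cost_per_frame(190, 100)   # hypothetical: $190 CPU, 100 fps avg
i5_8400 = cost_per_frame(200, 106)   # hypothetical: $200 CPU, 106 fps avg

print(f"1600: ${r5_1600:.2f}/fps, 8400: ${i5_8400:.2f}/fps")
```

With numbers in that ballpark the two land within a couple of cents of each other per frame, which is the "barely more $/frame" point.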

From a simple gaming-by-the-numbers perspective the 8400 is still good, though; it's all just preference now, unlike earlier this year.

the problem is AMD couldn't create a special chip like this even if they wanted to, because they don't have the R&D budget.

The only problem is that Ryzen's clocks are limited to about 4GHz by the process. Once Ryzen 2 comes out on a high-performance node (as opposed to the 14nm LPP used for Ryzen), Intel will be BTFO again.

we are definitely in for some leapfrogging, which is always good for consumers, but I'm wondering whether Intel will keep up in the longer term.

seems too good to happen for AMD. I expect problems to arise.

I'll take 12 threads and not having to give my money to the silicon Jew over a few fps any day.

But that's without considering power draw. The 8400 uses half the power of a 1600 at 4GHz, meaning the AMD chip gets even more expensive over time.
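The "more expensive over time" part is just arithmetic on the wattage gap. A rough sketch; the 65 W delta, hours per day, electricity price, and ownership period are all assumptions for illustration, not measured figures:

```python
# Rough extra electricity cost from a higher-power CPU over its lifetime.
# All inputs are assumptions: a 65 W load delta, 4 h/day of gaming,
# $0.12 per kWh, 3 years of ownership.
def extra_energy_cost(watt_delta, hours_per_day, usd_per_kwh, years):
    kwh = watt_delta / 1000 * hours_per_day * 365 * years
    return kwh * usd_per_kwh

print(f"${extra_energy_cost(65, 4, 0.12, 3):.2f} extra over 3 years")
```

At those assumed numbers the difference comes out in the tens of dollars over three years, so whether it flips the value argument depends entirely on your usage and local rates.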

>would rather buy an inferior product instead of paying ther jew in the industry
tfw amd is full blown muh diverse lgbt tier jew

Depends. AMD already has IPC parity, or is very close to it, thanks to Intel's complacency, so to speak. If they manage the same clocks at slightly lower IPC for half the price, and Intel doesn't come out with some Core 2-level leap, I can't see a bright future for Intel. AMD proved they can go six years without making money; I don't think Intel execs are as committed. Then again, we're assuming Intel and AMD are competitors and not two heads of the same hydra.

>1600 at 4GHz,
But why are you comparing it at 4GHz?

The thing is, and this is important: both products are performing well above what's necessary. No one is pairing a $200 CPU with a 120+ Hz monitor, and no one is gaming at 720p, so the gap would be even smaller. And if you put the Intel CPU at $200 and the AMD at $190, like they really are, they flip positions on cost per frame. But here's the thing:

AMD and Intel are ballpark equal in performance; there isn't a 100% disparity anymore, and in actual gaming scenarios the CPUs are closer to a 7-10% difference. You now get to pay whichever company you want to reward... Personally, I've been fucked over by Intel so many times, for so long, that I'd go AMD regardless. Zen was my limit: either Zen was good, or I was getting a used dual-CPU Xeon.

Now I own a 1700 and couldn't be happier with it.

Same with Nvidia: I got fucked over for years by their business practices, and by them in general. I only have a 1060 6GB in my system because MSI would not replace my 280X with an AMD part equivalent in either performance or price. I would never use Nvidia by choice, and their fucking shit drivers cement that for my next purchase.

Braindead fucks

it's irrelevant how much something is """""""""bottlenecked""""""""" at some lower resolution; no one plays at 720p, so it's irrelevant

Intel could have 20000000000000 more fps at 720p; if the results are different at higher res, IT DOES NOT MATTER

>Dilusional blogpost by r/user/mouthfullofsperm

et tu monkeys

then why is the 8400 at similar clocks shitting on ryzen?

>8400
kek
Anyhow, Glofo has IBM nazi magic on their side, I believe them
good goy

...

>Buying directly from jackie chan's female clone, and her personal street shitter's office in israel.
Nah man you seem like the top goy here.

You have to be a special kind of retard to buy intel now.

>doesn't mention when honest amd colluded with nvidia on price fixing

...

>plobbit.
typical amd poster

because when you actually stop being a braindead reviewer and start doing benchmarks that make sense, you realise the 8400 is literally shit
a 6% difference is literally nothing

>What's the point of gaming?
ftfy

>720p
Oh god it's this thread again

6% is 6%
what sort of ram was used for these benchmarks you posted?

it's TPU, who the fuck knows

also, in the HU video that OP posted, the difference at 1080p is literally 1.5%

>spend $340 on just the cpu
>have no money left for a monitor with a resolution higher than 720p
sure is great being an intel corporation shill

>6 threads vs. 12
>the difference is minimal!
it shouldn't be, and a loss is a loss

6 threads vs 12 threads

there we have it boys

CORES DONT MATTER ANYMORE intel reinvented the cpu and only has threads

not a single game in his collection is able to fully utilise a single thread, let alone the 6 that the 1600 has.

>720p
shilling this early in the morning pajeet?

Turns out at resolutions from this decade the 1600 wins.

da silva pls stop

>that 7700K score
Muh coars fags btfo

so wait a minute

cost per fps ON A FUCKING CPU?
WHAT?
WHAHHAHHAHAHHTTTTTT?
HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

DELET

OP isn't even shilling right

He's supposed to be using 360 x 640 resolution in CS:GO with an OC'd GTX 1080 ti

makes sense to me. all of this does. AMD shills like to act like long-established benchmark methods suddenly aren't necessary.

The scariest part is that he really did nothing wrong

That's what we get for not supporting the terrorists

Amd plz gib vega drivers already

>720p

...

it helps give a look at which CPU will be better once the GPU is no longer the bottleneck at higher resolutions.

So why wasn't 180 x 320 res used instead?

Isn't 720p a little too high for that?

yes lets assume you are right

cost per fps at 720p
find me a single person on this rock we call Earth that will build a 720p system

find me a single person on this rock that cares about fps at 720p
find me another example where that specific kind of benchmark seriously has an impact on real-world usage

oh wait, you can't
it's like when TPU used Euler as a benchmark (for those that don't know, Euler is built with a very specific Intel Fortran compiler that uses a particular AVX workload with kernels small enough to fit in L1) and gave a massive advantage to Intel (62%), only to end up with a fucking insane average number

if you wanna talk about serious stuff get your ass back to 2017

and in what game did this ever happen?
is it because Intel is still pushing developers to use AVX in their code so that their CPUs can be relevant again at higher resolutions?

because if you are implying that the GPU won't be able to feed fast enough and the CPU will pick up the task at 2K and 4K, then you are a moron

games that cache textures so basically all of them

>720p in a time when we already have 8K 120Hz monitors and 16K monitors being developed as we speak

read your comment loud and clear again

you aren't a moron, you are something worse:
a tech illiterate

whats wrong with my comment?

We had 1080p TVs in 1991.
Your point?

the cpu loads the game textures?

THE CPU LOADS THE GAME TEXTURES?
THE CPU HAS ANYTHING TO DO WITH TEXTURES WHATSOEVER?

Other than OP is a faggot?

Well higher resolutions are vastly superior to AA'd vidya or graphics.

what do you think tells the gpu to load a texture?

>720p benchmarks
>2017

let me try to dumb it down for you

game engines don't use the CPU to load any texture; it's HDD -> HDD cache -> temporary memory location -> GPU memory allocation
the CPU doesn't do shit regarding texture loading; it never did

?

I never claimed they did. I'm telling you why 720p is used: it removes the drawing side of GPU load, leaving the CPU/GPU interaction, i.e. how fast the CPU can run the game engine and issue calls to the GPU.
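The argument above boils down to a simple frame-time model: each frame takes whichever is longer, the CPU's work or the GPU's work, and shrinking the resolution shrinks only the GPU term. A toy sketch with made-up timings (the 6.0 ms vs 6.4 ms CPUs and the 8 ms GPU are hypothetical, just to show the shape of the effect):

```python
# Toy CPU-vs-GPU bottleneck model. Frame time is whichever side is
# slower; resolution scales only the GPU's per-frame work.
def fps(cpu_ms, gpu_ms_at_1080p, res_scale):
    """res_scale: pixel count relative to 1080p (720p is roughly 0.44)."""
    frame_ms = max(cpu_ms, gpu_ms_at_1080p * res_scale)
    return 1000 / frame_ms

# Two hypothetical CPUs: 6.0 ms vs 6.4 ms of game-logic work per frame.
for scale, name in [(0.444, "720p"), (1.0, "1080p"), (4.0, "4K")]:
    a, b = fps(6.0, 8.0, scale), fps(6.4, 8.0, scale)
    print(f"{name}: fast CPU {a:.0f} fps, slow CPU {b:.0f} fps")
```

In this model the two CPUs tie at 1080p and 4K because the GPU term dominates; only at 720p does the CPU gap show up, which is exactly why reviewers drop the resolution and exactly why the numbers don't transfer to real-world settings.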

NOBODY but idiots and 'pro' (cough) gamers plays at 720p.

Is this an indicator that future *FASTER* GPUs will be slower on Ryzen? Nobody fucking knows, and there is no indication that will be the case.

When cheap H370 and B370 motherboards come out, Ryzen+ will already be getting ready for launch, and a 1600+ will equal or beat an 8400 anyhow, and probably be cheaper overall.

Plus availability of the 8400 is somewhat limited right now anyhow.

In other words: it's fucking nothing. Buy whatever the fuck you think you need/want, Intel or AMD. It's all the fucking same.

>for half the price
Thing is, if it had that kind of performance, I highly doubt it would be much cheaper than Intel. A bit cheaper, sure.

what is pcie bus.

what is root complex

>6 threads vs. 12
It's 6 cores vs 6 cores.
The fuck does hyperthreading matter here? In some benches it even lowers performance.

>1660X
I don't see a 1600 there goy

>1600 at 4GHz
The 1600 is not a 4GHz chip, user.

lol

threads vs. 12
intel shills don't see actual tech information, user
they see numbers and just attach random words to them

Yeah, I was also under that impression.
They're the guys that go into a shop, look at specs and buy based on those numbers.

>gpu bottleneck post
like clockwork

Oh wow! AMD did something bad like once. Intel, on the other hand... bribes and anti-competitive behaviour on a daily fucking basis!

why stop at 720p then, should go lower

you could and AMD would still lose

>THICC

lose on a setting only shills use

>once
>not realizing that it's still going on
>actually defending gpu bottlenecked benchmarks
The absolute state of amd

...

Well of course it will. But it is not real-life usage. I already said in the comments on his previous video that 720p only feeds the fanboys. It's pointless in real-life scenarios, and it does not indicate what performance increases, if any, future GPUs will give. It's pure pandering to the fanboys, and Steve really should know better.

To add to my comment: the only indicators that should be used for CPU performance are non-gaming applications.

Gaming comparisons are a shill's game.

>there is no indication that that will be the case
that's fucking bullshit
just because it hasn't been a 100% accurate indicator doesn't mean it isn't a good indicator

The whole point of using games as benchmark is to see real world performance. 720p is not a real world use case.

Nobody benchmarks video-encoding software with 240p video.

So is Sup Forums now officially just a subreddit of Sup Forums?

grab two 1080s and RX Vegas, SLI and Crossfire them, find some games that use two GPUs well. Prove it.

r/applereviews

I've been noticing that too.

So which is it?

What's going to be amd's excuse when the next gen gpus are out?

GAYMES!!! XD

>MUH MULTITHREADED SERIOUS BUSINESS APPLICATIONS FOR A CONSUMER CHIP

MUH 720P RESOLUTION THAT TOTALLY REFLECTS REAL WORLD APPLICATIONS

HAHA I 720 INSTA SCOPE YOU NEWB! XD

COWADOOTY IS FUN! XDDD

I LIKE GAYMES HABAAHAHSHAHAHAHAHA XDDDDDDDDDD-----

typical

Ryzen beats the 8400 on price in two of the three most-used scenarios (1080p and 1440p) versus one.

But enjoy your 720p gaming faggots.

stop posting these threads user we know its just you

crickets

Fuck you I'm going to buy a Coffee Lake processor!

No wait...

We'll already be on Zen+ and ZenToo by then, so it won't fucking matter.