JAYZ ON RYZEN

>you're in for a shock. Buckle up

What did he mean by this?

GET HYPED FOR ANOTHER MASSIVE DISAPPOINTMENT

i thought jayz was a rapper
what happened

See:

Also, DRAM is the true last level cache in Zen, and its absolutely fast enough for it

Ryzen will be shit and all the madcucks will commit sudoku

>Snake oil salesman who actually attempted to make an argument for AMD's failed APUs strikes again

Don't care, he and all his e-celeb friends will say whatever lines their pockets.

Shock?

Pretty fishy

He meant: go back to Sup Forums you e-celeb shitter.

Watching Gamers Nexus' new episode, it seemed like Steve was kinda wary of the results his benches garnered for common use. But who knows, really, since he couldn't really expound.

Uh oh

No you idiot you're thinking of the engine

I don't know what any of that means aside from 60ns of latency

well...

This is a feature on high-end Intel boards too. It's nothing new.

Hell, it came on my bargain bin Z170 too

He discovered Sup Forums

intel about to get BTFO'd really really hard

>e-celebs

Well that's fucked up

Literally rigging benchmark results

Explains why they do so well in Cinebench and other synthetics, and then fail in real-world tests

...

Reminder that youtubers are even below tech journalists in the grand scheme of things.
Intel trying to bribe all of them to benchmark 5 year old games at ultra low resolutions

lower resolutions tax the CPU more
and what are you talking about 5 year old games

iranian "review"
it got shredded to pieces though due to being run on stupid RAM, an ES sample, and an ES motherboard

>lower resolutions tax the CPU more
please explain your reasoning, I'd love to understand that.

changing resolution won't affect the cpu performance in most cases, but lowering it certainly won't tax it more

boy howdy can't wait for these youtubers to test how it performs in Hitman, Crysis 3, Rise of the Tomb Raider, and GTA V. Those are the games I play!

Lowering the resolution yields a higher framerate meaning the CPU is taxed more

>AMD game
>ok
>recent
>still relevant
what games do you play

to be honest, Bulldozer was decent for games and pretty good for other things for the price. Even today you can play games at 60 fps with it

You are retarded, try it yourself. Lowering the resolution removes the GPU bottleneck, which means the CPU is tested more.
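The bottleneck argument can be sketched as a toy model: a frame is limited by whichever of the CPU or GPU takes longer, so shrinking the GPU's per-frame cost (lower resolution) exposes the CPU. All numbers here are made up for illustration, not real benchmarks.

```python
# Toy model of a game frame: each frame has a CPU cost and a GPU cost,
# and the slower of the two stages sets the framerate.
# Every number below is hypothetical, purely for illustration.

def fps(cpu_ms, gpu_ms_per_mpixel, width, height):
    """Effective FPS is limited by whichever stage takes longer per frame."""
    megapixels = width * height / 1e6
    gpu_ms = gpu_ms_per_mpixel * megapixels
    return 1000 / max(cpu_ms, gpu_ms)

GPU_COST = 4.0  # hypothetical ms of GPU work per megapixel

for name, cpu_ms in [("fast CPU", 4.0), ("slow CPU", 6.0)]:
    for w, h in [(3840, 2160), (1280, 720)]:
        print(f"{name} @ {w}x{h}: {fps(cpu_ms, GPU_COST, w, h):.0f} fps")
```

With these made-up costs, both CPUs land at the same ~30 fps at 4K (GPU-bound), but at 720p the faster CPU pulls ahead, which is exactly the difference reviewers are trying to surface.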

What?
Why?

A review using a game should use the settings most people would use, and I don't tend to run my shit at 800x600 because it's not 1998 anymore.

If they want to test raw CPU compute performance run synthetic benchmarks, that's literally the reason they exist. If you want to give a practical performance benchmark run the fucking game at 1920x1080.

This is the dumbest shit I've ever heard. They're just ignoring the point of a benchmark on a game. If you run with the same GPU, the difference, if any, will come down to the CPU.

>AMD sponsored
>Low CPU usage
>Medium CPU usage
>High CPU usage
What's the issue here?

>Bulldozer was decent for games

It's almost as if production code is different from code meant to artificially stress the system...

Prove it.

moron

Read some more CPU reviews, this is common practice

It isolates the CPU gaming performance.

If you don't do this, all you are getting is a GPU-limited benchmark.

Calm down
It's the way CPU benchmarks have been done for a while. Running lower resolutions shows CPU bottlenecks in gaming performance, not synthetic scores

>Jay Z

There's only one rapper whose tech opinions I care about.

No it shouldn't. We went over this shit after the shitdozer release.

You're a fucking idiot or a troll

Synthetics test nothing, just made up shit


They've literally been testing in low res for CPU performance since the mid-90s


U a straight busta me n my boys gone ride and u gonna get clappd CLACK-CLACK-CLACK

My point is, the test doesn't make any sense then. If a game running at 1080p, which I'd say is the minimum resolution for anyone gaming, doesn't stress the CPU enough for it to be a bottleneck, then what's the point of using that to test the CPU?

Again, the testing methodology is just dumb. The reason I look at game benchmarks is because I want to know how well the game would run if I had a specific part. If it ran the same way at a realistic resolution as another CPU, then the information being given to me isn't very useful.

It lets you know how the CPU performs in a non-synthetic gaming scenario

Is this the first CPU review you've ever read?

You are seriously new and need to educate yourself.

Intel was better for games, sure. But Bulldozer was good enough. Again, those processors still run games at 60 fps today if you pair them up with a good graphics card. That's 6+ years of good service

>But Bulldozer was good enough.

The everlasting cry of the AMDtard

"it's good enough!"

Oh well, you're trolling. I thought we were talking.

>Bulldozer was decent for games

Even after all these years we still have AMDrones full in denial mode.

Well then just transcode a standard video file and give me a time comparison. That's a non-synthetic, CPU bound metric. That's a realistic scenario of something CPU bound as well.

My point is that if a CPU just does not make a difference in a game unless you're running at 800x600@700FPS, then it's not really realistic to say that one is better than the other because of this. However, if CPU A transcodes a standard video file 10 seconds faster than CPU B, that's a useful fact that I could take into account when making a purchase.
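For what it's worth, that kind of comparison is easy to script. This is only a sketch with a stand-in CPU-bound workload (pure arithmetic, not an actual transcode); a real test would time the same encode job on each CPU instead.

```python
# Sketch of a wall-clock CPU comparison. The workload below is a
# stand-in for a transcode: any repeatable, CPU-bound job works,
# as long as the identical job runs on every CPU being compared.
import time

def cpu_bound_work(n=500_000):
    # placeholder CPU-bound job (hypothetical, not a real transcode)
    total = 0
    for i in range(n):
        total += i * i
    return total

start = time.perf_counter()
cpu_bound_work()
elapsed = time.perf_counter() - start
print(f"elapsed: {elapsed:.3f} s")  # lower is better; compare across CPUs
```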

Good for you

Trolling? It's the truth, you're in denial. Bulldozer was good enough for games like a McDonald's happy meal is good enough for your health.

Dude, I bought an Intel CPU back then because I like to play at 90+ fps. But what I say is true; check the game benchmarks, those processors still play games at 60 fps. Enough for more than 90% of people out there.

How is transcoding helpful when determining gaming performance? Are you trolling? This is literally how CPU gaming benchmarks are performed: you lower the resolution to create a CPU bottleneck. This isn't controversial at all.

Gaming at 1280x720 is just as synthetic as any other synthetic benchmark.
I don't want to run Skyrim at 250fps or 300fps. I want to know how processor choice influences current games at WQHD or 4K.

>gaming at a low resolution is synthetic

4k cpu benchmarks are worthless since your GPU will be the bottleneck in 99% of games.

>ITT: synthetic benchmarks are mistaken as real world performance and low resolution gaming is unrealistic and useless

You just don't get it.

You are seriously delusional and have no idea what you're talking about. Go read some more CPU reviews to understand how it's done.

>I want a CPU benchmark where every single benchmark has the exact same FPS no matter what CPU is being tested

Be retarded somewhere else please

Games that are limited by the GPU anyway are not relevant to my CPU buying decision.
There are multiple current games that are influenced by CPU choice even at 4k (see: Battlefield 1). Those matter.

Wrong.

A transcode obviously isn't a measure of gaming performance, that wasn't my point.

What I'm saying is that if you have to push framerates to 300fps to start seeing a divide between CPUs, then the metric isn't realistic. When was the last time you ran a modern game at 300fps at 1080p? I'm willing to bet never. If CPUs aren't a factor while gaming unless you're pushing retarded framerates at frankly unplayable resolutions, then gaming isn't a very good CPU benchmark.

You're right, I don't understand why playing a game at a resolution no one runs is a viable method of reporting to a consumer what they should buy.

>yet listens to synthetic benchmarks that are not indicative of real performance at all
>finds scores in benchmarks a viable method of reporting to a consumer

Go back to Sup Forums, holy fucking shit.

You are the goofiest idiot I have seen here in a long time. Thanks for the chuckles.

playing it at sub-normal resolutions is only good if you show a bunch of resolutions.

So like 1280x720, 1600x900, 1920x1080, 2560x1440, 3840x2160, just to show at what point it's GPU bound vs CPU bound.
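That sweep can be mocked up the same way: assume a fixed hypothetical CPU cost per frame and a GPU cost that scales with pixel count, and the crossover point falls out directly. The costs are illustrative only, not measurements.

```python
# Sketch of the resolution sweep described above. For each resolution,
# whichever stage (CPU or GPU) takes longer per frame is the bottleneck.
# Both cost constants are made up for illustration.

CPU_MS = 5.0            # hypothetical CPU cost per frame, in ms
GPU_MS_PER_MPIX = 4.0   # hypothetical GPU cost per megapixel, in ms

for w, h in [(1280, 720), (1600, 900), (1920, 1080), (2560, 1440), (3840, 2160)]:
    gpu_ms = GPU_MS_PER_MPIX * (w * h / 1e6)
    bound = "CPU" if CPU_MS >= gpu_ms else "GPU"
    print(f"{w}x{h}: {bound}-bound ({max(CPU_MS, gpu_ms):.1f} ms/frame)")
```

With these made-up costs only 720p ends up CPU-bound; everything from 1600x900 up is GPU-bound, which is the crossover the sweep is meant to expose.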

Are you new to PC gaming? Because this is how CPU benchmarks have been performed since the advent of discrete graphics accelerators. I find it bizarre that you would think this is Intel paying reviewers.

Testing PCs is beyond his comprehension.

But then he owns a 350Z... He coulda had a V8.

>then the metric isn't realistic.
OF COURSE IT'S NOT REALISTIC
THE IMPORTANT THING IS THAT THE METRIC IS USEFUL, AND GPU-LIMITED BENCHMARKS TEST THE GPU, NOT THE CPU.
WHICH MAKES THEM EXTREMELY USELESS FOR A CPU BENCHMARK.

IS YOUR BRAIN FULL OF CHEESE

DO YOU HAVE PROBLEMS FIGURING OUT WHERE TO PUT YOUR HEAD WHEN WEARING CLOTHES?

He is seriously out of his mind. This is probably the first CPU review he has ever read.

I don't even care about the Intel v AMD thing, my point is that even if this was the way they've been doing it for 100 years, I think it's not very logical nowadays.

I'm not saying they should use a regular resolution for testing while gaming, I'm saying that they shouldn't use gaming as a test at all. If by your own admission running a game at, say 1080p, doesn't push enough frames to stress the CPU to the point where it makes a difference, then why force the situation into something that isn't realistic in order to stress the CPU?

Just use a different test. Do a rendering workload, or an encoding workload. Something that stresses the CPU enough so that there's a marked difference between different products, but is still a realistic scenario someone in the real world would do.

You don't know anything about CPUs in general, so why would anyone care what you think is logical? Your logic is retarded, that's why it doesn't make any sense to you.

Does anyone know when exactly the reviewers like Linus will publish decent benchmarks?

I use After Effects and Premiere, need to see the benchmarks before selling my 5960x system to build a new 1800x system.

Please stop before you hurt yourself

>I'm saying that they shouldn't use gaming as a test at all.

Congratulations, you've now gone full retard.

>watching linus reviews
HAHAHAHHAHAHAHAHA

I heard tomorrow 9am

>Linus
half the hardware industry is on his sponsor list. He has never published an unbiased review

>Linus
>ever

I have it on good authority that it rendered a 320x240 video CD in 38 seconds.

while the 1800X is better than the 5960X in general, I would really question how it will do with Adobe
AMD CPUs are historically bad with the Adobe suite

I'm more waiting for Paul's Hardware (and maybe JayzSeveralCents) also did Kyle get one?

Yeah does Linus even do technical reviews?

Personally I'm waiting on techpowerup and anandtech.

Exactly why I'm impatiently waiting for benchmarks!

Well, fuck Linus. I just gave him as an example.

Out of curiosity though, which source is best for me to check tomorrow for the most unbiased reviews?

Why do you watch any of those shitty channels
literally no redeeming qualities about any of them
they consistently break hardware and fuck up all the time, literal label readers

You seem to be completely at odds with logic. I state this seriously. It was fun and a good laugh at first, but now I am concerned for your wellbeing.

>I have never watched Paul's Hardware

Oh and guru3d is usually pretty reliable

find the average from multiple sources to avoid bias

Paul is a moron as well

Could be... but I'll still trust him more than most of the others

Man's got integrity

guru3d or some germans if you can read that

Intel fired its whole PR department in Europe except the UK, so everyone from the EU except the UK is fine

>JayzTwoCents
He somehow manages to be worse than Linus

Official list of people/sites you can kinda maybe trust tomorrow:

Steve Burke - GamersNexus
Paul's Hardware
Anandtech
Tech City (Aussie dude who isn't that technical but is honest as fuck)

People to avoid:
Linus
Jay
[H]ardforums

Feel free to agree/disagree or add more.

He is just Linus in 10 years, once the children grow up a bit.

>historically bad
>x was bad once so it will always be bad
>you can't change anything
So wouldn't this then be a reason to change it with a new product?

add to avoid:
toms hardware

GN is surprisingly not bad. I don't like their methodology for games (buy FCAT already, god) but they seem honest