How can they make amazing CPUs but then make horrible GPUs?

They don't even make amazing cpus bruh.

still better than jewtel's CPUs

Fuck you mean? Vega is curb-stomping the 1080 Ti right now.

That's not the only game, either.

>Don't make amazing CPUs
That's why every major server OEM and server farm is already locked in for EPYC? Even Intel butt-sluts like Dell are offering EPYC-based rack-mount servers over Xeons these days. Ryzen getting more and more popular in the consumer market is just icing on the cake.

As for their GPUs, it's because they're an entirely different section of the company, one that just isn't doing well. I'm really hoping the increased profits AMD has from Ryzen will partially go towards making ATI great again. Vega could have been great had AMD stuck to GDDR5, or even GDDR5X if they wanted to be fancy. While HBM/HBM2 is good on paper, it's such a pain in the ass to make, and thus much more expensive. Why do you think AMD is having problems keeping up with demand? On top of all that you also have a lack of software support for both gaming (lackluster drivers) and professional applications (FirePro being trumped by Nvidia Quadros/Teslas for both GPU acceleration and GPU compute, OpenCL being trumped by CUDA, etc.). So not only would it take AMD fixing their hardware, it would take a complete overhaul to fix their software suite and end-user experience.

This is all coming from a guy who owned a Radeon 7870, 7970, R9 290X, 390X, and a gen1 Fury. I now own a 980 Ti and the experience and ease of use is unmatched. I'd gladly buy a Vega 64 if it meant everything Just Werked.

This. Intel offers CPUs with 5% better performance. Yeah, you need to pay $100-200 more, buy a $300 AIO, and never even attempt to OC, but what are you gonna do, go with AMD? lmao

The problem with Vega was that AMD rushed the pro drivers and optimized the shit out of it for buttcoin mining (ETH is like 50 MH/s on a Vega 56 now).

depends entirely on the game

>stomping
>less than 10 FPS average lead on both low and high end
Yea nah. Not to mention DOOM 2016 is easily one of the best-optimized games we've seen in quite some time, so that's not really all that impressive. Even more so when you consider the game was built heavily with AMD in mind, what with the DX12/Vulkan implementation.
Forza has already shown it favors AMD cards due to how the cards handle the amount of shaders or some shit. I forget the specifics, but there was an article about why AMD does so well in Forza.

>inb4 Nvidia shill
I'm rooting for AMD, I really am. But those games aren't the greatest examples.

Let me guess, Dirt 4 is also heavily optimized for Vega, right?

It's 42 MH/s tops.
Yes, the global illumination it uses is shader-heavy.
Their Pro GPUs are fucking amazing this year, just look at fucking SSG.
Consumer stuff?
Let's not talk of it right now.

>6c/12t CPU running almost 25C hotter than a 16c/32t 180W workstation CPU
>At stock

Literally how.
yea, it fucking kills me that's how it is these days, but I'll hand it to AMD: they know their audience/user base. They know their cards' gaming capabilities aren't on par with Nvidia's. But going by how every single RX 470/480/570/580 sold out everywhere for mining, they figured why not just pander to the miners to move as many units as possible? Their Vega arch may not be the best, but you can't deny it's selling like hotcakes.

Vega is literally the most interesting thing that has happened to GPUs since the introduction of unified shaders by ATi in 2005.
Shame they aren't talking about the most interesting details for now.

Is Civ 6 heavily optimized for Vega as well?

Actually, yes. If you recall, Dirt 3 was an AMD "Gaming Evolved" sponsored game. They even sold graphics cards with the game included as a free bundle. It wouldn't surprise me if Dirt 4 is also sponsored by AMD.

Besides, it's another racing game, similar to Forza in that regard. They use similarly shader-intensive rendering, as the other user mentioned.

Maybe?
I don't know.
Outliers are pretty pointless; for all we know Vega is still not working properly.

The silliest part is that they NDA'd their alpha-testers for drivers.
That's gay.

Well I guess I'll just have to make do playing my games at 127FPS instead of 130FPS, but that's a small price to pay for free drivers.

It is indeed one of the most interesting. But then again, AMD has always been the one throwing everyone into the future, regardless of the blowback in the now. Look at the 290X offering an 8GB VRAM variant, and the 390X offering 8GB as the standard after that. At the time it was competing with the 3GB 780 Ti and everyone SWORE even 6GB was worthless. Look where we're at now. The same could be said for the CPUs: AMD has been playing the moar coars game forever, and we're finally starting to see the payoff. As much as I'd like AMD to discuss the nitty-gritty of their new cards, most people don't care. They just want the end result: more FPS or more performance, plain and simple. It doesn't help that the power consumption of Vega is a bit high for the performance it offers.
No, that game is actually a good example of Vega playing strong. The insane memory bandwidth of the HBM2 stacks on Vega really shines in a resource-heavy game like Civ 6.

Nonononono, Vega is literally about doing things NOW.
RIGHT THE FUCK HERE.
BY TELLING MS AND KHRONOS TO FUCK OFF.
Shame it's not working properly yet.

When are we getting primitive shaders enabled so game devs can stop fucking shit up?

When the AiB cards are launching?
Late October?
Probably that.
Mid-December at very worst.

>>At stock
It's not stock, and it also has shitty TIM

It's telling not the gamedevs, but MS and Khronos, to stop fucking living in the DX9 era.
Vega says bye-bye to hardcoded shader stages.
Long live not-Larrabee!

>BY TELLING MS AND KHRONOS TO FUCK OFF.
How?

They are replacing the fucking retarded ancient shader stages with something tailored to their hardware.
Only pixel shader remains, the rest are compute shaders with inputs and outputs defined at runtime.
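
To make that concrete, here's a rough CPU-side sketch (plain Python, every name in it is made up, and this is NOT AMD's actual implementation) of what pulling a fixed-function job like backface culling into ordinary programmable code means:

[code]
# Conceptual sketch only: a fixed-function stage (backface culling)
# redone as ordinary programmable code.

def cross_z(ax, ay, bx, by):
    # Z component of the 2D cross product; its sign gives the winding.
    return ax * by - ay * bx

def cull_backfaces(positions, indices):
    # positions: list of (x, y) screen-space vertex coords
    # indices: flat list of vertex indices, 3 per triangle
    # Returns a compacted index list with back-facing triangles dropped,
    # so later stages never see them.
    surviving = []
    for t in range(0, len(indices), 3):
        i0, i1, i2 = indices[t], indices[t + 1], indices[t + 2]
        (x0, y0), (x1, y1), (x2, y2) = positions[i0], positions[i1], positions[i2]
        # Counter-clockwise winding = front-facing; everything else gets
        # discarded early, before any shading work is wasted on it.
        if cross_z(x1 - x0, y1 - y0, x2 - x0, y2 - y0) > 0:
            surviving.extend((i0, i1, i2))
    return surviving

# Two triangles sharing vertices: the CCW one is kept, the CW one culled.
verts = [(0, 0), (1, 0), (0, 1), (1, 1)]
tris = [0, 1, 2,  # CCW, front-facing
        1, 0, 3]  # CW, back-facing
print(cull_backfaces(verts, tris))  # -> [0, 1, 2]
[/code]

The point is that once this lives in a compute-style shader instead of fixed hardware, the driver (or the dev) can fuse it with whatever other per-primitive work it wants.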

Is this... what primitive shaders are? Woah, I thought it was just another stage lmao

Don't they still need to work with Khronos and MS to get them included in the APIs?

The cavemen are supposed to work by themselves without help from the gods.

Yes, that's what primitive shaders are. They are literally calling them compute shaders in the whitepaper.
They can make some extensions for Vulkan (think a custom shader language that gets translated into SPIR-V).
DX12? Don't think so.

>Not stock
Yes it is. According to that picture, the thermal chart for the synthetic load says it right in the title: "Stock".

Whoever the fuck even thought about ditching parts of the D3D pipeline (no matter how difficult it actually is) should get a fucking medal.
This is A+ grade out of the box thinking.

It's not. They had some auto-OC shit set in the motherboard BIOS settings. youtube.com/watch?v=0juO5KuwBX4

Sauce? Reverse searching comes up with nothing

I still don't understand why modern APIs still use inflexible hardcoded shader stages when GPUs have become general-purpose SIMD processors with some fixed-function hardware dedicated to graphics.
Use Google search.

not to mention delidding it to put $10 liquid metal on it so it won't thermal throttle

exhentai.org/g/1041914/efa0664b3b

although what was posted is edited afaik

kek at your pathetic lives.
AMD finally releases Haswell-tier CPUs, and you faggots defend them to death. Hats off to you for tenaciously marketing that piece of shit though, because it seems like a soul-sucking job to do so.

AMD GPUs aren't horrible though. Vega 64 rivals GP102 at professional graphics and general compute. The problem is the software side.

It is currently functioning like an OC'd Fury at gaming stuff until primitive shaders are working. It should at least distance itself from GP104; I doubt it will catch up to GP102 in terms of gaming performance, but it will get within grasping distance.

(You)

Catching up to GP102 is piss easy assuming it all works.
Making it work is difficult due to the nature of the changes Vega brings to the table.

>synthetic benchmarks
>any year

>it's not stock, it's auto-oc
>auto-oc is enabled by default
so, it's stock? just like turbo boost.

This, what we really need are gaymen benchmarks XD.

>x265
>synthetic
bruhhh

>whats price to performance

>Haswell-tier
>implying Haswell Socket H CPUs have 8 cores
>implying Skylake has 8-core options on the mainstream platform

>CRF encoding
>Quality is not the same between presets
What the fuck is x265 doing? The point of CRF is consistent quality. Presets use more or less features to achieve the same quality at different sizes.

Fucking pajeets.
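
That part is easy to check yourself, by the way. Minimal sketch (Python; assumes an ffmpeg build with libx265 on your PATH, and clip.y4m is a hypothetical test clip, substitute whatever you have):

[code]
# Same CRF should mean roughly the same quality, with slower presets
# spending more effort to hit it in FEWER bits, i.e. a smaller file.
import os
import subprocess

SRC = "clip.y4m"  # hypothetical test clip, substitute your own

for preset in ("fast", "slow"):
    out = f"out_{preset}.mp4"
    subprocess.run(
        ["ffmpeg", "-y", "-i", SRC,
         "-c:v", "libx265", "-crf", "23", "-preset", preset,
         "-an", out],
        check=True,
    )
    print(preset, os.path.getsize(out), "bytes")

# If "slow" comes out larger than "fast" at the same CRF, the encoder's
# rate control is misbehaving, which is exactly the complaint above.
[/code]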

>What the fuck is x265 doing?
Still a work in progress like x264 was for a decade.

>The point of CRF is consistent quality. Presets use more or less features to achieve the same quality at different sizes.
>Fucking pajeets.
What's worse is that the one with the slow preset resulted in a larger file size instead of a smaller one.

AFAIK Vega's hardware is actually amazing; the FE is basically as good as a Quadro that costs twice as much, but the problem is DRIVERS. You know what's missing in them? Oh, just how to use every hardware gimmick they crammed in. It's bad, but at least they can eventually fix it.

I've heard that's about when the 1070 Ti launches

>cherrypicked benchmark

you idiot

With this being the hot new meme game of the year, I'm surprised to see Vega with such a lead.

Source for that please; as far as I remember PUBG runs like shit on all AMD GPUs.

>depends entirely on the game
>the most competent game developers alive
>the most technically polished game to ever be made
>better on Vega

It's pretty much the fault of the games.

Here you go.

pcgameshardware.de/Playerunknowns-Battlegrounds-Spiel-60812/Specials/Playerunknowns-Battlegrounds-neue-Benchmarks-Vega-1236260/

You will also note that anything that isn't Vega on the AMD side is running like shit.

welp

How can those exist in the same timeline?

what do you mean?

Oh boy, it's going to be silly.

It has something to do with Vega not working the way it was intended to.

For whatever reason Vega is not utilising its bandwidth efficiently: there is a clear trend (one not shared by older GCN revisions) that as resolution climbs, performance drops off at a faster rate than the competition's. For Fiji/Polaris and older, resolution increases typically let the cards stretch their legs a bit.

I do believe Vega (under the assumption AMD gets all of its features working) can compete with Volta-based GTX cards, especially since Vega does fight the Quadro/Tesla Volta cards in a lot of workloads right now.

>horrible GPUs
nah, it's just that Nvidia pays gayme devs to use engines that favor Nvidia. Look at how AMD cards wreck in mining: a fucking RX 580 beats a 1080 Ti in compute performance.

Bandwidth is not the problem.
The way Vega works now is totally unintended.
It's not supposed to waste processing cycles and trash caches performing things like fixed-function culling.
Also, shade-once is disabled.

But NDAing alpha-testers for drivers is maximum gay.

>whats price to performance
It's an excuse that amdshills make for not being able to hit 60+ FPS in most games, so they cry "price/perf" and post GPU-bottlenecked benchmarks all day while pretending they didn't want to play those other games at all.

Are you even 18 years old?

Doom is all corridors and closed spaces. It would take a special kind of fuck-up to make a game like that run like dogshit.

>Are you even 18 years old?
Yep, and I don't pretend on /nu/g that I don't play a few fucking games on the computer machine.

[citation needed]