AMD Giving Up on CrossFire with RX Vega

techpowerup.com/235699/amd-giving-up-on-crossfire-with-rx-vega

THANK YOU BASED AMD

NVIDIA BTFO & FINISHED. WHAT NOW NVIDIOTS? HAHAHAHAHAH

You don't need CF support when you don't have drivers!

why would you even use 2 cards when you can have 1 powerful card. newfags

>vega
>powerful

Now instead of crossfire they will sell dual gpu cards.

tfw AMD cares about our safety and won't let some lighthearted users get burned alive while trying to CF two Vegas

AYYMD IS FINISHED & BANKRUPT

AYYMDPOORFAGS CONFIRMED ON SUICIDE WATCH

But even Raja showed that 2x Polaris was faster than a GTX 1080

>based AMD killing off the sli/crossfire meme
Thank fucking god

They won't need it if Navi works as intended.

God just finish off AMD already.

That was just their marketing spin on Polaris, because it was a shitty architecture that didn't scale at all; hence the whole "85% of the market is under $299" bullshit, and then they turn around with Vega and go "well, the top end of the market represents 66% of the margin, so we're shooting for that now!"

Yeah it was a stupid move, they bent over and basically asked Nvidia to fuck them in the ass by raping their profit margins and making up for it in the high end.

it was a shitty uarch..

you literally dont know shit about gcn eh

>GCN
>good
lol

>autistically lists a bunch of impressive stats
>no drivers

lol, the tripfaggot is craving some of those (you)s

It's a bit ironic that the AMD fanboys started hyping up the Crimson series drivers yet they use incomplete drivers as an excuse for Vega's absolutely garbage performance.

To be fair, Crimson drivers are very good now. My biggest gripe with my 1080 Ti is the garbage drivers: a control panel that looks like Windows XP software, and the DPC latency issues which still haven't been fixed. But lo and behold, you can't have the best of both worlds.

So AMD lied about the drivers,
the people that tested FE and said it performed almost like a Fiji in gaming lied,
but as usual Sup Forums was right!
Why even bother going on Beyond3D to discuss the tech there when your shills on Sup Forums have better knowledge, judging by pictures?

So AMD lied about RX Vega performance in their own launch slides?

>unironically thinking AMD are sandbagging
Stop posting.

The BF 1 results they gave are the closest we have to real numbers. They were done on prototype RX Vega and prototype drivers, though.

>They were done on prototype RX Vega and prototype drivers, though.
lol

Zen is calling, they want your tears to cool them down

Friendly PSA: Don't forget to sage, hide and filter the tripfaggot.

AMD cares more about rebrands and marketing than drivers.

Literally right after Jensen Huang gave away a bunch of AI cards to a bunch of white CS students with his signature, Koduri gave away a bunch of Vega cards to minorities with his signature.

They are a fucking joke.

What tears? I'm not the one in denial about RX Vega's garbage performance and high power draw

Gaymers BTFO

...

instead of money they should pay you in history books

None of these endnotes are related to the BF1 results.

I wonder what your excuse about the power draw will be once Wolfenstein comes out and Nvidia has to use the CUDA cores to get near AMD.

>nvidia will have to use the cuda cores
????

You mean use their shader cores like any GPU would?

If you don't know what I meant then don't fucking reply.
FP16 on Pascal runs through the CUDA cores, and the consumer chips are gimped: FP64 runs at 1/32 rate and native FP16 at 1/64 rate, so they'll have to burn a lot of power to run any kind of reduced-precision FP on the 1070/1080/1080 Ti.
It was like when Nvidia was running certain Rage features on CUDA... their fucking power draw skyrocketed.
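The rate claims can be sanity-checked with a back-of-envelope calculation. A minimal sketch, assuming the public spec-sheet numbers rather than anything from this thread: roughly 8.9 TFLOPS FP32 for the GTX 1080 with native FP16 at 1/64 rate (per the AnandTech review linked later in the thread), and roughly 12.7 TFLOPS FP32 for Vega 10 with double-rate packed FP16:

```python
# Back-of-envelope FP16 throughput comparison. The spec numbers below
# are assumed from public spec sheets, not measured in this thread.
GTX1080_FP32_TFLOPS = 8.9
PASCAL_FP16_RATE = 1 / 64          # native FP16 rate on consumer Pascal
VEGA_FP32_TFLOPS = 12.7
VEGA_FP16_RATE = 2.0               # Rapid Packed Math: 2x the FP32 rate

gtx1080_fp16 = GTX1080_FP32_TFLOPS * PASCAL_FP16_RATE
vega_fp16 = VEGA_FP32_TFLOPS * VEGA_FP16_RATE

print(f"GTX 1080 native FP16: {gtx1080_fp16:.2f} TFLOPS")  # ~0.14
print(f"Vega 10 packed FP16:  {vega_fp16:.1f} TFLOPS")     # ~25.4
```

In practice a game on Pascal would just promote FP16 math to FP32 and run it at full rate, so the real cost is losing the packed-math efficiency win, not an actual 1/64 slowdown.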

Get fucked shitcoin miners

Enjoy your Nvidia mining rigs

>two games will use FP16 for some calculations
kek

>meanwhile NVIDIA can just unlock performance because AMD are so far behind

FP16 is required by Vulkan, moron.
But hey, you aren't really the brightest around here, so I won't laugh at you.

wew, the tripfag is on summer mode

He really is. Don't forget to hide and filter him, it's for the good of all.

>FP16 is the latest AMDrone meme

>using old-ass SPECviewperf instead of SPECapc

tomshardware.de/vega-benchmarks-workstation-leistungsaufnahme-gaming,testberichte-242375-6.html

>using cherrypicked benches to push an agenda nobody is buying
nice try

Mining doesn't need CF/SLI. The mining platforms recognize each individual GPU separately.
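That point can be illustrated: each card grinds a private slice of the nonce search space with zero inter-GPU communication, which is why miners never needed CF/SLI. A hypothetical Python sketch (the `search` function stands in for one GPU's kernel; the header string and difficulty prefix are made up):

```python
# Hypothetical sketch: why mining needs no CrossFire/SLI.
# Each "GPU" is an independent worker grinding its own nonce range;
# there is no shared frame and no inter-GPU traffic to synchronize.
import hashlib

def search(device_id, nonce_start, nonce_end, target_prefix="000"):
    """Stand-in for one GPU's kernel: scan a private nonce range."""
    for nonce in range(nonce_start, nonce_end):
        digest = hashlib.sha256(f"block-header:{nonce}".encode()).hexdigest()
        if digest.startswith(target_prefix):
            return nonce, digest
    return None

# Four "GPUs", each assigned a disjoint slice of the search space.
ranges = [(i * 25_000, (i + 1) * 25_000) for i in range(4)]
for dev, (lo, hi) in enumerate(ranges):
    hit = search(dev, lo, hi)
    print(f"GPU {dev}: {'found nonce ' + str(hit[0]) if hit else 'no hit'}")
```

Each worker only needs its own device handle and its own slice of work, which is exactly how mining software enumerates GPUs.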

Ah, all that shitposting about Nvidia dropping 3/4-card SLI, and now it's TOTALLY OKAY when AMD does it.

Last time AMD enabled FP16 for gaming use, this happened:
arstechnica.com/civis/viewtopic.php?f=6&t=1122459

history repeats itself

Triple/Quad SLI was retarded, though.

I agree, but that doesn't negate all the AMD shill shitposting over it

>Anything NVidia isn't good at is now a meme
FP16 is for deep learning/AI and NVIDIA own that market.

>Being this much of a buttblasted NVidiot Sup Forumsermin drone

I want you to go to Tom's Hardware right now, email the editor and writer of this article, and tell them what agenda-pushing shills they are for posting cherrypicked benchmarks.

So why did they stop?

ATi played just as dirty as NVIDIA did.

Holy shit, you don't even know what the fuck you are talking about, eh?
Take a look:
anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/5
Enjoy the cuckery that is called Nvidia.

Because the moron AMD had as a CEO back then was the one that bought ATi, and the one that pushed for Bulldozer...

Yeah, buying out ATi was AMD's biggest mistake. Look what happened: they had to split them off as RTG, and they're eventually going to sell them off because they're dragging the company into the shit.

Counterpoint.

Buying ATi was OK; the problem was the CEO wanted epically stupid designs, the bigger the better.
I swear he was literally going Hitler's way.
Lisa literally took AMD from near collapse to what it is now.
Fun fact: he is CIO at Dell now, and he was the one that burned the evidence that Intel was giving money to Dell.

As compelling a product as it is, nobody important is going to buy it, all the big studios are already invested in the CUDA ecosystem.

Makes sense when you consider the possibility that combining two of these in a cramped space could set off nuclear fission.

www facebook com/jarred.land/posts/10154878409465415?comment_id=10154878470535415&notif_t=like&notif_id=1501276190497990

CUDA is just a set of APIs and nothing more. Currently there isn't a single app that runs purely on CUDA, and since AMD already told the press that Sony, Pixar and Adobe are already developing support... well, no. Compared to what the SSG offers, CUDA is literally ancient.

You type like a 13 year old who has had one too many energy drinks

>It's another episode of a retarded AMD poster spamming same picture of a $7000 GPU as an argument in a discussion about consumer GPUs.
I hate reruns.

I guess trying to debunk RED's CEO, the company that literally holds the render segment in Hollywood hostage, is too much.

CUDA can be compiled into OpenCL pretty easily now, can't it?

Reminder that Zen runs cooler than Kaby Lake, which can literally burst into flames

Blame the guy claiming AMD is going to sell RTG off because reasons.

Oh no less gaymurs are buying our workstation focused GPU architecture, clearly we need to get rid of these fgts before they stink up the place with their anti-gaymur agenda.

Why have two housefires when you can have one giant housefire? Raja messed up big time; the only way I'm putting that shit in my system is if I get it free.

Are you retarded? The graphics division and miners have been the only things keeping AMD afloat until just recently

Of course it's not supported, no residential home could provide enough power to drive 2 Vegas under load.

I get if gamers are mad they waited for a year to get an arch that's better suited to workstation tasks (pretty much what Zen was, really)

But let's face it, hardly any gamers were buying discrete add-in board AMD GPUs anyway.

If AMD were smart they'd just use all those GPUs to mine cryptocurrency for themselves

Don't really care. Crossfire and SLI have been more trouble than they were worth every time I've tried them, it would be good to see both die to be honest.

why would a shovel company dig when they can just make truckloads off of selling the shovels to fools

There are a couple of ways to look at this.

1) SLI is a pain in the dick to write drivers for and the whole DX12/Vulkan thing was to offload that work to developers. Since multi-gpu support is baked into those APIs it might still be a pain in the dick, but it's a much smaller pain in the dick for the developers.
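Point 1 can be sketched: under DX12/Vulkan explicit multi-GPU, the application, not the driver, decides which adapter gets which work. A hypothetical Python sketch of app-managed alternate-frame rendering (the `Device` class and round-robin policy are illustrative, not any real API):

```python
# Hypothetical sketch of the explicit multi-GPU model in DX12/Vulkan:
# the application owns the frame-to-adapter assignment instead of the
# driver guessing behind its back.
class Device:
    def __init__(self, name):
        self.name = name
        self.frames_rendered = []

    def render(self, frame_id):
        # In a real engine this would record and submit command lists
        # on this adapter's queue.
        self.frames_rendered.append(frame_id)

def present_frames(devices, frame_count):
    """App-managed alternate-frame rendering: round-robin over adapters."""
    for frame in range(frame_count):
        devices[frame % len(devices)].render(frame)

gpus = [Device("GPU0"), Device("GPU1")]
present_frames(gpus, 6)
for gpu in gpus:
    print(gpu.name, gpu.frames_rendered)  # GPU0 gets 0,2,4; GPU1 gets 1,3,5
```

The point of the API change is that this scheduling logic moves out of the driver and into engine code like the above, which is why it's a smaller pain for everyone.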

2) Raja has a tweet about Infinity Fabric and Vega somewhere. It was about enabling IF in Vega drivers being hard or something. It's entirely possible that they are fiddling with IF in such a way that multiple GPUs don't look like multiple GPUs. Perhaps they are using IF to make GPUs function like the CCXs in Zeppelin.

When Nvidia puts a hardware scheduler back in, their power draw will skyrocket too, just like it did on Kepler.
But hey, it's AMD, blah blah.

NVidia's "simple hardware complex software" approach can't last forever. I'm looking forward to the day hardware schedulers return to NVidia.

DELET DIS

AMD ALWAYS DOES THIS

IT'S FINE WHEN AMD CREATES HOUSEFIRES WITH EVERY MODEL

BUT IT'S NOT FINE WHEN INTEL OR NVIDIA HAS THIS ONE SHITTY MODEL THAT DOES IT

well duh
DX12 and Vulkan support mixed multi-GPU across manufacturers;
of course both AMD and Nvidia will want to kill it dead.

>muh fine prints

:^)

It was

And then they fired the original engineers and replaced them with rtg china

>But let's face it, hardly any gamers were buying discrete add-in board AMD GPUs anyway.
This, nobody buys AMD cards. It's time AMD realized who their real consumer base is

>It's time AMD realized who their real consumerbase is
Judging by Vega's architecture design, they already have.

Except with DX12 you don't even need an SLI cable or CF mobo to make it work.

Cute girl.

>mfw the last good AMD card was the HD7990 and I had it
Was nice to watch it beat the GTX

I miss dual GPU cards desu

>jews not caring about saving hundred thousands of dollars

They aren't sandbagging; they are (or were) deathmarching the driver team to finish implementing DSBR before the RX Vega launch, which they essentially stealth-delayed twice, to August 14, to give them extra time.

Vega FE is like Maxwell with TBR turned off. Until we see how good RTG's implementation of DSBR is we don't know fuck all about how RX Vega will perform relative to Vega FE as already tested.

>I'm looking forward to the day hardware schedulers return to NVidia.

I'm not.

>Not using the massively outdated version used in SPECviewperf is cherry picking.
?

Oh, it will be glorious.

Good
CfX and Sli are outdated garbage
Single GPU is the way to go.

>400 W power draw
>no SLI

No shit, you'd need a 1200 W PSU to run that shit comfortably lol

Not very surprising, since the market is full of games with effects and rendering techniques that are completely incompatible with AFR multi-GPU (basically all temporal effects: they require data from the previous frame, so rendering of the new frame can't start before the old one is finished).

NVIDIA already pulled the plug on low end and mid range SLI.

>of course both AMD and Nvidia will want to kill it dead.

If they wanted it dead then they would be shilling SLI/CF even harder.

If Vega is the same price as a 1080 with the same performance, it's totally worth buying, cuz FreeSync is easier and cheaper to get than Gay-Sync.
Would totally get Vega myself, but I'm currently trying to buy a 570/580 for a normal price. Fuck you, miners :)

>cuz Freesync is easier and cheaper

Any of that saved money is lost the moment you get your power bill

>power bill

I don't trust Vega FE simulated benchmarks, sorry.
Let's just wait for legit RX tests.

>Gamersnexus OCs FE Vega to 400 watts
>HURR DURR VEGA IS 400 WATTS
You retards have brains smaller than a fucking peanut.

>Simulated

AMD said the fucking power draw themselves: 345W for Vega 10 XTX. 190% of the average draw of a 1080, for the same performance, a year late.

That is the watercooled version. Air-cooled is 295 watts, and that's still 60% more power than a 1080.