What happened to that rumoured 490 which was going to be two 480 GPUs on one card?

Also is AMD actually doing anything at all at the moment?

It'll be a shittier GTX 1080 that uses twice as much power.

I don't give a shit if it's better or worse, but Nvidia needs some competition at the high end.

AMD can't compete

AMD can compete in amount of fecal matter dumped on the designated streets.

>nvidia need some competition at the high end

Apple will make an A-series GPU to compete with high end Nvidia before modern AMD does.

Vega

The 480x2 was never on the table.

Requesting the "Mehmet, my son" version with Raja Koduri and 6/8-pin PCIe.

does anyone actually take AMD seriously anymore?

but will it just werk?

AMD has great GPGPU value.

I suspect that Zen is a giant mess, and that all manpower and funds are going toward trying to limit the damage. So get ready for 2017. AMD is dead, long live ATI!

AMD has been the best value for mid-range (~$200) GPUs for a long time.

...

Thank you.

Came a little late.

8 pins doesn't mean more power, it just means more grounding. A 6-pin can still pull as much power as an 8-pin. Also, that's an out-of-date meme.

forums.geforce.com/default/topic/472194/geforce-500-400-series/pci-e-power-cable-8-pin-vs-6-pin/

> However, the 8-pin is designed to supply up to 150W of power, the 6-pin only 75W.

You can't even read

>A 6 pin can still pull as much power as an 8 pin
Not really, it can't. At least not if it's meant to stay within spec.

But the card could pull all the power it needed from the 6-pin; that was never the issue.

Just because it's rated for that doesn't mean it can't pull more. Just saying, bruh.

And indeed, a simple driver fix solved it anyway. It was a non-issue.
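
Since the spec numbers keep getting thrown around, here's a rough power-budget sketch in Python. The 75W slot / 75W 6-pin / 150W 8-pin limits are the nominal PCIe figures, and the ~165W board draw is an assumed worst case for the reference 480, not a measurement pulled from this thread.

# Rough PCIe power-budget sketch. The spec limits are the commonly quoted
# nominal figures; the ~165 W board draw is an assumed worst case.

PCIE_SLOT_W = 75    # x16 slot limit, per spec (nominal)
AUX_6PIN_W = 75     # 6-pin auxiliary connector, per spec
AUX_8PIN_W = 150    # 8-pin auxiliary connector, per spec

def budget(aux_connectors):
    """Total in-spec power budget for a card with the given aux connectors."""
    return PCIE_SLOT_W + sum(aux_connectors)

reference_480 = budget([AUX_6PIN_W])   # slot + one 6-pin = 150 W
aib_480 = budget([AUX_8PIN_W])         # slot + one 8-pin = 225 W
assumed_draw = 165                     # assumed worst-case board power

print(f"Reference (6-pin) budget: {reference_480} W, headroom {reference_480 - assumed_draw} W")
print(f"AIB (8-pin) budget:       {aib_480} W, headroom {aib_480 - assumed_draw} W")

Negative headroom on the reference budget is exactly why the extra draw ended up going through the slot.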

Meanwhile over in Nvidia land

youtube.com/watch?v=WAbl0fLY06U

>buyers remorse

I have a 1070 lol

we shall wait for a vega nvidia housefires and scams/gimping is just not worth it

Here's your ','.

one 480 is already destroying motherboards
2 x 480 will be hotter than the core of the sun.

>gpu kills motherboard
vs
>gpu kills itself

This is the choice we have to make. GPUs were truly a mistake.

Nvidia with the house fires again.

Are you too retarded to shitpost properly? They were killing motherboards because they were drawing too much power from the slot, not because they were running too hot.

Hey retard. Power draw over long periods = heat. If the power draw were too much for the mobo to handle, it would go poof immediately. It didn't. The mobos got fucked because constantly high power draw at the slot melted the contacts and fucked the PCI-E slot.
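
To put "power draw at the slot = heat at the slot" in concrete terms, here's an illustrative I²R sketch. The pin count and contact resistances are assumptions picked to show the effect, not measurements of any real board.

# Illustrative contact-heating sketch (P = I^2 * R). The pin count and
# per-contact resistances below are assumptions, not measured values.

def contact_heat_w(total_current_a, n_pins, contact_resistance_ohm):
    """Heat dissipated in the connector contacts carrying the current."""
    per_pin = total_current_a / n_pins
    return n_pins * (per_pin ** 2) * contact_resistance_ohm

current_a = 80 / 12   # say the card pulls ~80 W from the slot's 12 V rail

fresh = contact_heat_w(current_a, n_pins=5, contact_resistance_ohm=0.02)
worn = contact_heat_w(current_a, n_pins=5, contact_resistance_ohm=0.10)

print(f"Heat in slot contacts (fresh): {fresh:.2f} W")
print(f"Heat in slot contacts (worn):  {worn:.2f} W")

The heat shows up in the slot contacts and climbs as they degrade, regardless of how cool the GPU die itself runs.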

480 only destroys motherboards if you use the reference design, and even then only if you put 3 of them in the same board for bitcoin mining.

With the 8-pin cards this was literally never an issue.

Except that exploding GPUs are only from EVGA, you can get a properly working Nvidia card from another AIB.

>They were killing motherboards because they were drawing too much power from the slot, not because they were running too hot.
THEY ARE THE EXACT SAME THING

user, that retard lives in a universe where higher power draw doesn't generate more heat. He's just from a different dimension with different physics.

Post benchmarks.

>THEY ARE THE EXACT SAME THING
Not that guy, but you're trolling, right? There's a pretty big difference between "using too much power in general" and "pulling too much power through an area that wasn't designed for it".
I don't think you're in a position to call others retarded.

>through an area that wasn't designed for it
Yeah and what exactly damages the area that it wasn't designed for?

Heat, fucktard.

no no no that's not it it's the electrons moving too fast that's damaging the motherboard

I'm fucking done with this shitty board

Does Nvidia have decent open source drivers? No? Fuck off.

Ok, let me spoon feed you. The motherboard was not designed for that amount of current. Yes, this would result in more heat than the motherboard was designed to handle. No, this heat was not made by or on the GPU itself. That means that the temperature of the motherboard and the temperature of the GPU are two separate things. The GPU could be cooled by a sun made of ice and the motherboard would still fry itself. The load on the motherboard being damaging does not mean the GPU runs hot. Again, you're not in a position to call anyone a fucktard.
>I'm fucking done with this shitty board
Yes, please leave Sup Forums.

There are no reports whatsoever of a 480 destroying even a single motherboard.

Yeah everyone knows that the socket heats up too much and the contacts melt you fucktard. The bottom line is the socket can't get rid of the heat fast enough as it's not DESIGNED to handle that amount of load. YES WE KNOW THE GPU DOESN'T HEAT UP FUCKER EVERYONE IS SAYING THE CARD DESTROYS MOBOS. STICKING TWO TOGETHER WILL LEAVE YOU WITH A CUTE AND NICE FUNCTIONAL GPU AND A FRIED MOTHERBOARD. THANKS AMD

>Does Nvidia have decent open source drivers?
Not that guy, but no one has decent open source drivers. Nvidia are especially bad because they don't publish any documentation on their stuff, meaning no one can make open source drivers. No one makes them for AMD either, who happily publish their documentation, but that's beside the point.

>YES WE KNOW THE GPU DOESN'T HEAT UP FUCKER
APPARENTLY WE DON'T
See here? This guy says that a GPU drawing too much power from the slot is the same as the GPU running too hot.
See here? This guy says that since one 480 will destroy a motherboard, two will be hotter than the core of the sun.

So, in conclusion, not everyone knows that the two are separate things, which is what this entire chain of replies has been about. It took an insane amount of spoon feeding to get that point through to you.

Wow, no idea how I managed to fuck up my post that bad.

>Apparently we don't

Maybe you don't, fucktard.

>Two will be hotter than the core of the sun

Yes at the slot.

>Yes at the slot.
Nice backtracking. GG no re.

Neither does AMD. The radeon and amdgpu drivers don't support OpenCL, for example.

>backtracing

Fuck off pajeet, you have 2 months before deportation.

>Lost the argument? Angrily call him a pajeet!
So angry you can't even type properly, impressive.

Can't stump the trump

I don't have to, I just have to stump a leftard pretending to be a trump supporter as a false flag.

AMD's free drivers are almost as good as the closed ones. Nvidia is dogshit unless you use the proprietary driver.

>AMD's free drivers are almost as good as the closed ones
With some GPUs and some Linux configurations. It can be quite a nightmare. Then again, it's Linux, so that's to be expected; novidya are no better on that front. Haven't tried Intel, but unless their Linux drivers are drastically better than their Windows drivers, I don't see them doing too well either.

better than AMD
AMD can't into OpenGL either. Nvidia does everything; AMD only bothers with DX12, so AMDrones assume that Nvidia is gimping GPUs.

>caring about muh open source when the proprietary driver is on par with the windows driver in terms of performance

You freedumbs communist idiots are the worst. Thank God we elected Trump to get rid of fifth-columnist dipshits like you once and for all.

B..but muh botnet

AMD still has a better chance of developing a top end GPU.

My GTX 660 Ti works flawlessly in Ubuntu.

>nvidia stock just went up 20%
How can AMD compete?

> freedumbs
> communist
DIVISION BY ZERO

Considering the Fury's HBM1 is actually holding up in benches, Vega with HBM2 could seriously shake the market up.
Especially if it allows compact PCBs with lower TDP.

>What happened to that rumoured 490 which was going to be two 480 GPUs on one card?

It was a stupid rumour from a clickbait site.

>one 480 is already destroying motherboards
>2 x 480 will be hotter than the core of the sun.

They could just put two 8-pin connectors on it and it would run fine. The 480 only had problems because the reference design was stupid and used 6-pin power and a $5 cooler; third-party designs all use a proper cooler and 8-pin power, and NONE of them have temperature or power drain damage issues.

Unlike Nvidia cards, which have so far experienced VRMs blowing up, VRAM corruption by firmware, and drivers bricking cards (again).
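
Rough math on the "two 8-pins would be fine" claim, using the same nominal spec limits as the sketch earlier in the thread; the ~150 W per GPU is an assumption, not an announced spec for any dual card.

# Hypothetical dual-GPU card, nominal spec limits (assumed figures).
PCIE_SLOT_W = 75
AUX_8PIN_W = 150

budget = PCIE_SLOT_W + 2 * AUX_8PIN_W   # 375 W available in spec
assumed_draw = 2 * 150                  # assumed ~150 W per GPU

print(f"In-spec budget: {budget} W, assumed draw: {assumed_draw} W, headroom: {budget - assumed_draw} W")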

Fiji lives or dies depending on how heavily an engine relies on deferred shading and avoids excessive geometry loads.

The things Vega needs to do to succeed are:
> don't fucking skimp on geometry and rasterization again
> don't sit at ~65% of theoretical memory bandwidth again (see the sketch after this list)
> get relatively close to GP104/GP102 in perf/Watt
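
For a sense of scale on the bandwidth point: Fiji's 512 GB/s is the well-known Fury X spec, the ~65% figure is the claim from the list above, and the 85% target is just an assumed "better" number for contrast, not anything AMD has announced for Vega.

# Effective vs. theoretical memory bandwidth. 512 GB/s is the Fury X spec;
# the efficiency percentages are the thread's claim plus an assumed target.

FIJI_THEORETICAL_GBPS = 512   # 4 HBM1 stacks x 128 GB/s

def effective(theoretical_gbps, efficiency):
    return theoretical_gbps * efficiency

print(f"Fiji at ~65% efficiency: {effective(FIJI_THEORETICAL_GBPS, 0.65):.0f} GB/s usable")
print(f"Same memory at 85%:      {effective(FIJI_THEORETICAL_GBPS, 0.85):.0f} GB/s usable")

Closing that gap is bandwidth Vega could claw back without needing faster memory at all.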

It could only compete on price, not performance.

And the ROPs.