RX490

>RX490
>HBM2
>can't beat GTX1080 with GDDRX5


is AMD even trying anymore?


DELET

Source on any of that?

>AMD innovates
>REEE WHY DID THEY INNOVATE I HATE AMD NOW
>

AYYMD HOUSEFIRES

worst
>4096 cores vs 2560 cores
>amd can't win

not OP but probably

videocardz.com/64475/are-those-radeon-rx-490-vega10-benchmarks-results

wccftech.com/amd-radeon-rx-490-benchmark-performance-dx12/

>AYYMD
>housefires
>Indian memes for some reason
>goyworks
>3.5
>woodscrews
>1.7%
>The™ way™ it's™ meant™ to™ be™ played™
Kill yourselves. Fuck this disgusting board.

>install getoo
>gayming
>gaymen laptops
>ssd lol
>tfw 16 GiB
>mpv command line elitism
>botnet
>windows kek
Kill yourselves. Fuck this disgusting board.

200w TDP

Probably.

and a burnt PCIE 3 port on mb

Probably needs a separate PSU.

>mfw mpv requires no cmd line and never has

>mfw this user is right
I just edited the conf file like I would configure MPC-HC with its gui. Then it's all the same by using a file manager.

Wait, the rx480 is a $200 card, so why would the rx490 all of a sudden be $600 and compete with the 1080? Where's amd's 1070 competitor?

HBM is a meme

this is the actual 1070 competitor. probably a cut-down Rage/Fury/Radeon XXX die or something.

/thread

better than a burnt down house
youtu.be/WAbl0fLY06U?t=1m14s

It's actually the future but won't be mainstream till 2020

why would they compete for power?
Has the market fucking EVER rewarded them when they were clearly better in performance AND price?

All I want from amd is for them to make a gpu that doesn't make me feel like I'm getting ripped the fuck off when I buy it.

I got a 5770 for $150, and I got a 280X for $250; before that I was using a 6800 Ultra. amd makes me feel like I got a deal for the price... I honestly love their current strategy of hitting certain segments and hitting them hard. If they pull off a profit, they will continue to make gpus like this.

If they do what it looks like they're going to do, as in an mcm gpu that doesn't run under crossfire but runs as a single die, that is when they may compete for power again, but till that happens, literally why the fuck would they?

Wait, vega is out?

TDP is not power consumption.

Numbers and words is all AMDrones have going for them
>but muh 8 cores
>but muh Async
>but muh HBM
>but muh Vram

Nvidia and Intel speak for themselves with performance

it's related

Sure, but not in the way people think. TDP is purely heat energy.

this tbqh

There aren't any reliable sources. We're not even sure if the RX 490 will use Polaris or Vega architecture. Wait until the card comes out and prices are public before sperging about Ayymd.

>480 releases
>well it will be good SOON, especially with DX12 and Vulkan
>late 2016
>DX12 and Vulkan, SOON! ;_;

no its just another """leak""" from pajeettech

I don't even know what hardware news website to follow anymore

>All this shitposting
Nvidia must be very afraid

follow the ones that only say good things about your favorite hardware vendor

The card isn't overhyped like fury and poolaris, I wonder why. Fail incoming

why would they be? normal people refuse to buy amd regardless.

If anything I'd expect something better since they're so quiet. Too much hype already blew in their face with the Fury X. The Polaris hype was just fans being too overhyped, false flaggers, and jokes being taken as serious.

>TDP is not power consumption.
>TDP is purely heat energy.

where do you geniuses think the energy goes?

literally every nanowatt of electrical energy consumed becomes useless heat in one step, unless you're powering a tiny little wind turbine from the exhaust from the heat sink, in which case you're just delaying things.

some power might be pushed out external IO to be consumed elsewhere, and some energy might be fleetingly stored in DRAM capacitors, but you're not exactly hoisting weights up mountains or winding springs with your GPU.

FUCK OFF Sup Forums FAGGOTS
GET OUT OF HERE

>where do you geniuses think the energy goes?
Hey, genius, go learn what a TDP is.

If the card is the large Vega (12 TFLOPS, 2x HBM2 modules), AMD is completely fucked.
If the card is the rumored smaller Vega (~7 TFLOPS, 1x HBM2), AMD is completely saved.
The card is almost definitely the 2x Polaris 10, which nobody is gonna buy anyway.

if vega is just scaled-up poolaris the way fiji was to hawaii and hawaii was to tahiti, then amd will need an approx ~600mm2 die just to compete with the 1080, which is a chip half the size. amd is probably fucked since they would never be able to sell such a large chip profitably for less than $500.

I thought the 480 was supposed to compete with the 1070. Why are you overhyping this again, only to get disappointed like when you hyped the 480 as a card that'd beat the 1070/980 Ti?

>Being years behind your competitors in terms of performance, efficiency, and stability is innovation

TDP is the maximum average heat over some period of seconds that the cooler will have to dissipate. The heat MUST be converted from electricity, thus TDP is a good indicator of power draw maximums at a glance.

And yes, the heat output of a microchip is so close to the power it draws that it's completely useless to distinguish the two. See entropy.
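
To put numbers on the "TDP ≈ heat ≈ power draw" point above, here's a trivial sketch. The 250 W figure is just an illustrative assumption, not the spec of any card in this thread:

```python
def heat_energy_kwh(power_watts, hours):
    """Essentially all electrical power a chip draws leaves it as heat,
    so sustained draw x time = heat the cooler must move."""
    return power_watts * hours / 1000.0

# A card sustaining 250 W dumps a quarter kilowatt-hour of heat every hour
print(heat_energy_kwh(250, 1))  # 0.25 kWh
```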

roy taylor, the alcoholic 'genius' behind amd's marketing, thinks it's a good idea to pay shills to generate fake hype

How did anyone think it would compete with the 1070 when the leaked benchmarks showed it in the ballpark of the 970 and 980... which is exactly where it landed.

What over hype are you talking about?

/thread, upboated etc.
Tbh pricing is really what counts also. I see the 1060 being priced better than the 480 outside of the US.

I thought vega was the fury branding, while the 490 was gonna be two binned polaris chips on one card like the 7990

It's the usual Sup Forums habit of going into a hype loop. Some leaks, some speculation. Some bullshit people say on here.

All feeding into each other and going from "decent midrange" to "titan killer"

Vega will be the 500 line, or whatever it'll be called.

I just want another 390 value card really.

I think if anything was over hyped on the 480, it was the power efficiency. And interestingly, there are chips that pull less than 100W now and overclock quite well. I'm very curious about how much of a role drivers had in that power consumption drop vs. ASIC quality.

SHOTS FIRED

GET BACK TO YOUR KEYBOARDS, THE GOYIM KNOW!!!!

>TDP is the maximum average heat over some period of seconds that the cooler will have to dissipate
Where did you get this information? It's a figure generated by considering nominal loads, but it's not an average itself.
>The heat MUST be converted from electricity, thus TDP is a good indicator of power draw maximums at a glance.
Yes, but no.
>And yes, the heat output of a microchip is so close to the power it draws that it's completely useless to distinguish the two
So if I have two dies that are identical, but I need one to be at a lower temperature, they will have identical TDPs?

this.

the hsf on video card X can dissipate 450 watts yet the card itself consumes 250 watts.

retards will assume it is a 450 watt card.

most people on here don't know what watts, amps, volts, or anything electrical is without relating it to something else. we're dealing with people whose only metrics are resolution and fps.

It's the silicon lottery, not drivers.
the process amd is using is a real silicon lottery, not intel's 'every cpu hits 4.6ghz' lottery

it's the 'oh shit, this is 160 watts without OC' to 'I'm barely breaking 90 without OC'

as the process matures it will get better, but as of now it's a lottery, with the top bins going to oems first, gamers second.

what we knew was it was going to be amazing for the price, and it was. however, someone with a high-bin gpu kept showing benches, which do put it around a 980ti in DX11 and can put it above a 1070 in DX12.

however everyone assumed every chip would be that OCable; mix in nvidia fanboys fanning the hype and saying retarded things, and people calling it shit.

Its a shit show all around.

now we have 2 options I know of

1) they show vega, not as interesting
2) they show a dual gpu polaris, potentially more interesting than anything else, because there is potential they went mcm with it instead of having it work in crossfire. if they did, this would change how gpus are built forever.

if they go dual polaris through crossfire, who the fuck would buy it? multi gpu is fucking retarded for the price.

MCM Polaris will never happen.
There's nothing on the die to make it worth it, just 16 fat PCIe bit lane controllers (bottom right).
If you're gluing shit together on an MCM, you better have some huge buses (with simplified physical layers because of having like

>MCM
What the fuck does this mean?

Multi-cock Mother

It's a term your dad came up with after watching your mom get slammed out by two big dicked niggers.

>200 dollar card competing with a 400 dollar card

4870 round 2 confirmed?

>double GPU video cards
Why the fuck is this still a thing

Useful for lots of things that aren't games. Plus Nvidia got fed up with actually having to design good coolers for them, so they ragequit after the Titan Z and jacked prices up to balance it out.

Why are nvidiafags so quiet about DX12 benchmarks?

The GTX 1080 costs 3X as much yet only gives you 40% better performance.

The RX 490 will most likely be a little faster than the GTX 1080 in DX12 and cost less than one too.
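
Taking the post's own numbers at face value (a $200 card vs a $600 GTX 1080 that is 40% faster; both figures are the thread's claims, not confirmed specs), the perf-per-dollar gap is easy to sketch:

```python
def perf_per_dollar(relative_perf, price_usd):
    # normalized performance divided by price
    return relative_perf / price_usd

baseline = perf_per_dollar(1.0, 200)   # claimed $200 card as the baseline
gtx1080 = perf_per_dollar(1.4, 600)    # claimed 40% faster at 3x the price
print(round(baseline / gtx1080, 2))    # the cheaper card gives ~2.14x the perf/$
```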

don't give 2 shits about dx12
quantum break dx11

...

>don't give 2 shits about dx12
You gonna be saying that when DX12 titles outnumber DX11 titles in a few years? good goy

holy shit, how much is nvidia paying you?

>AMDrones need to meme about DX12 and Async to get people to buy AMD

>Nvidia speaks for itself with performance

every thread.

Because dual GPU is better than two GPUs in SLI/crossfire

...

hmm? there are dx12 benchmarks where they are kinda equal and a couple where amd pulls very slightly ahead.
(protip: posting the same Hitman or Ashes benchmark for the past 6 months doesn't mean shit though)
Nvidia read the adoption rate of gaming API tech and went with a card that crushes dx11 (which is what most gamers should want at this point in time) and will crush dx12 with a new gen of cards, timing their release better than AMD, again!

you see, if you are amd and are getting crushed, your competitive strategy is to innovate in whatever features you can to try to differentiate.

Think about it this way: you think amd is choosing not to have a competitive product that beats nvidia at the vast majority of games? no, they can't, it's not a choice they made. Their only option is competing in a niche.

...

DX12 is very popular among new games, it's one of the fastest adopted DXes.

>where do you geniuses think the energy goes?

Not all energy is transferred into heat; some is transferred into movement, some into electromagnetic radiation, etc etc

excess power is transferred to heat.

FUN FACT: until it died, Mantle had a faster adoption rate than DX11 did when it was released. For DX12 to have as many supporting games as it does this soon is basically unheard of (as you say).

Yet now they produce desktop chips that can go into 0.88" thick Razer laptops without thermal issue, so why don't they try dual cards again?

>Yet another "totally legit" video of hardware failures happening on a cold boot

Anyone have that last one where the supposed Fermi driver related failure happens seconds after a cold boot, before a driver is even close to being loaded?

Nvidia wants to get rid of SLI as they are running into bandwidth issues because they haven't done anything with the technology for years. Plus they like disabling it on lower tier cards as it makes their upper tier cards look bad value.

Outside of gaming it's pretty useful.
But in gaming it's god awful and only used by companies that can't make 1 big gpu.

Why can't amd just die so that someone competent buys it and starts putting out some actual fucking competition to nvidia and intel already? No seriously, jesus christ.

Fuck, this waiting is killing me.

>Why can't ATi die so that it could start producing something incredible
AMD Buys ATi
>Why can't AMD die so that it could start producing something incredible

This is a never-ending cycle; nVidia has normie brand recognition on their side.

And yet it hardly ever improves anything and its implementation is more often than not buggy as fuck.

DX12 isn't this magical switch you just flip and that's it just like people think it is. It's actually pretty hard to properly implement.

what does HBM2 have to do with computing power?

This is an apples to oranges comparison.


>why did my car that takes premium gas lose to your twin turbo V8?

if you don't understand parallel programming maybe.

But then again I was lusting for the gaming scene to have some bloodletting.

But it still died.
That's like saying "I was winning the race until I was losing it."
Your mom might give you a gold star for the 2 seconds you were ahead but no one else cares.

>If they do what it looks like they're going to do, as in an mcm gpu that doesn't run under crossfire but runs as a single die, that is when they may compete for power again, but till that happens, literally why the fuck would they?
This is what they really need to work on. Raja teased the future with the RX 480, they just need to implement it. The yields will be much better for them, but I don't know if it's going to be that cheap given that they need to waste shekels implementing it.

>without thermal issue
You don't honestly believe this, do you? Nvidia have been upping the TDP every generation. Go compare the 1080 laptop version to the 980M to the 780M. It's quite hilarious. Any cooling advantage is due to Razer, not novidya.

>nvidia
>innovates by actually making their cards faster

>amd
>innovates by finding new and exciting ways to make you pay more for worse shit every year

really gets those neural connections activating

>in a few years
then I'll have the Nvidia GTX 1280, which will trash whatever AMD puts out then in both DX11 and DX12, poorfag.

name one nvidia cpu

thats right, you cant
NVIDIOTS BLOWN THE FUGG OUT

Multi-chip module
Think what the core 2 quads were and what the 9+ core zens are going to be

you make a smaller chip, and put them together as though they were one.

Something I had a good idea about before, but someone who went into the why recently confirmed it to me: even when amd is clearly better, people refuse to buy them. But amd is fantastic at making sweet-spot gpus, best bang for buck.

If they mcm these, instead of 2304:144:32 you would have 4608:288:64
Now, let's look at this from a gpu cost perspective: a 470 is the same as a 480 just with less vram, so, assuming 4gb of ram is $40 based on the 4gb and 8gb price difference, that would make the gpu alone $160 including everything but ram.
$320 excluding ram, then either $40-80 more for 4/8gb respectively, and you have a $400 mcm gpu.
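
The cost arithmetic above, using the post's assumed prices (none of these are confirmed BOM figures, just the poster's back-of-the-envelope numbers), works out like this:

```python
card_price = 200   # rumored RX 480 4GB street price
ram_4gb = 40       # assumed cost of 4GB, inferred from the 4GB/8GB price gap
gpu_and_board = card_price - ram_4gb   # $160 for everything but memory
dual_die = 2 * gpu_and_board           # $320 before memory
print(dual_die + 2 * ram_4gb)          # $400 for a hypothetical 8GB MCM card
```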

Now, I haven't the slightest fuckin clue what the disadvantages could be from this, but looking at the core 2 quad, when done right, there are next to none.

If amd do this, and make it so there are next to no (if not no) crossfire-like problems, they would likely be able to fire some shots at nvidia for a generation or two, and at the very least gain some new customers among people who could use that brute force.

This method could potentially make a monstrous gpu while keeping yields high, without the need to waste silicon on higher-end chips: you make one scalable chip that can be deployed in 1-4 chip modules, with failed chips either being put in in-between cards or relegated to bargain-bin gpus.

amd had a clear lead for 2 generations and was cheaper, then they were equal for 2 more, and then the 900 series came around and now the 1000 series; no one bought amd because of marketing, 'the way it's meant to be played', and normal people.

why would amd waste money being competitive when bang for buck nets them more money overall... it's why an mcm gpu without sli/crossfire disadvantages seems the most likely route for amd: they make a single gpu that can be put in many configurations, which makes the r&d cheaper along with making the yields better through smaller chips.

it didn't die, they killed it off for some stupid reason.

if amd dies we will never see competition in cpus again because either apple or someone will make the cpus/gpus in house and only use it for their shit.

Crossfire and SLI will always be subpar, but at least now with DirectX 12 the devs have control over multi-GPU, so they can be the deciding factor in whether or not the game has good support for multiple GPUs.

It's hinted that Navi will have "scalability" as a primary new design feature, and it's broadly suspected that there will be an industrial Zen + Vega/HBM2 MCM APU, but it's simply not happening for Polaris.

Traditional GCN has a central Graphics Command Processor that drives 1-4 independent Shader Engines (each with 1 Rasterizer and Geometry engine, a handful of Render Backends (with 4 ROPs each) and a cluster of 8-16 Compute Units (4 * 64 slot * fp32 SIMDs)).

~100% scaling efficiency would require a GCP to drive remote CUs or have multiple GCPs coordinate work intelligently as well as mirror memory effectively to avoid bottlenecking everything through the interlink.
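
For scale, peak FP32 throughput of a GCN part follows directly from the CU layout described above: each CU has 64 fp32 lanes, each doing one fused multiply-add (2 FLOPs) per clock. The RX 480 numbers below (36 CUs, 1266 MHz boost) are the shipped specs:

```python
def peak_fp32_tflops(compute_units, clock_hz, lanes_per_cu=64, flops_per_lane=2):
    # CUs * lanes * 2 FLOPs (FMA) per clock, scaled to teraFLOPS
    return compute_units * lanes_per_cu * flops_per_lane * clock_hz / 1e12

# RX 480: 36 CUs at a 1266 MHz boost clock
print(round(peak_fp32_tflops(36, 1.266e9), 2))  # ~5.83 TFLOPS
```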

it's still alive but it's called vulkan now

it was hinted for volta in 2010, things aren't going that well