RX Vega 56 - More efficient than a GTX 1070

>Vega is ineffic-

Only if you're a nigger that doesn't know how to undervolt and overclock.

>Wattman Settings:
>GPU Core:
>P-State 6: 1352 MHz @ 810 mV
>P-State 7: 1402 MHz @ 820 mV
>HBM2:
>P-State 3: 960 MHz @ 820 mV

>Witcher 3 Max Settings (no Hairworks) 1080p
>Benchmarking Novigrad City Run Through
>Average FPS: 95
>1% Low FPS: 70
>0.1% Low FPS: 53
>Power Draw Average: ~120-130 Watts
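For anyone wondering where those lows come from: they're derived from the frame-time log, not the FPS counter. A minimal sketch of one common definition (capture tools differ, and I'm not claiming this is exactly what my benchmark run uses; the frame_times_ms list is just dummy data):

[code]
# Rough sketch: 1% / 0.1% low FPS from a frame-time log in milliseconds.
# One common convention: take the 99th / 99.9th percentile frame time
# and convert it back to FPS. Tools (OCAT, FRAPS, etc.) differ slightly.

def low_fps(frame_times_ms, percentile):
    times = sorted(frame_times_ms)                    # slowest frames at the end
    idx = min(len(times) - 1, int(len(times) * percentile / 100.0))
    return 1000.0 / times[idx]

frame_times_ms = [10.5, 11.0, 9.8, 14.3, 18.9, 10.2]  # dummy data
avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
print(f"avg {avg_fps:.0f} fps, "
      f"1% low {low_fps(frame_times_ms, 99):.0f} fps, "
      f"0.1% low {low_fps(frame_times_ms, 99.9):.0f} fps")
[/code]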

Stock settings get me around 99 FPS average, so I'm losing ~4% versus the stock configuration while cutting power consumption by ~33%. Stock Vega 56 is also ~7% faster than a stock GTX 1070 (it depends on the game, but across a large enough selection that's how it plays out).
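If anyone wants to check the arithmetic, here's roughly how those two percentages fall out. The stock board power below is not a number I'm quoting from a measurement; it's just back-calculated from the ~33% figure, so treat it as an assumption:

[code]
# Sketch of the stock vs undervolted comparison above.
# stock_power_w is NOT measured -- it's implied by the ~33% claim.
stock_fps, uv_fps = 99.0, 95.0      # Witcher 3 Novigrad averages
uv_power_w = 125.0                  # midpoint of the ~120-130 W average
stock_power_w = 185.0               # assumed; roughly 125 / (1 - 0.33)

perf_loss = (stock_fps - uv_fps) / stock_fps * 100              # ~4%
power_cut = (stock_power_w - uv_power_w) / stock_power_w * 100  # ~32%
efficiency_gain = (uv_fps / uv_power_w) / (stock_fps / stock_power_w)

print(f"perf loss {perf_loss:.1f}%, power cut {power_cut:.1f}%, "
      f"{efficiency_gain:.2f}x fps per watt vs stock")
[/code]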

Attached: silicon wars.jpg (1920x2160, 2.1M)

Get the fuck out

Fuck you nigger, this is tech related. AMD's problem is that they over-volt EVERYTHING at stock settings. The Vega architecture is a lot more power efficient than people think.

You don't own a gtx 1070...

>AMD kills Intel
>Nvidia kills AMD
Pottery

Nigger you can undervolt the 1070 too.

Then it just makes Pooga 56 look even worse.

>Nigger nigger
>Muh games

I've read tons of 1070 reviews and watched the Digital Foundry video that covers the same area I benchmarked. Besides, I'm only losing ~4% performance versus the stock configuration, and all the reviews suggest Vega 56 is ~7% faster than the 1070.

I think the main problem with Vega is the core being massively overvolted. Mine can do 1400 MHz with just 820 mV.

Show me what the results are (and not for mining).

Everyone has always gotta talk smack about AMD, but I really believe they'll win in the end.
When they announce Q is 2019, that'll be the game changer.

t. salty linux toddler

t. butthurt underage nigger.

nvidia pays for better support from developers than amd does, and that often makes a bigger difference than a sub-10% raw performance gap. if you're doing something other than gaming, vega is probably worth it. but not for gaming.

I don't regret the two 1070's I bought for my two rigs. not right now at least.

Vega 56 beats the 1070 in most gaming benchmarks, you're retarded.

sure, when the amd supporters choose all the benchmarks.

if you're gaming why would you underclock anyway

you're gaming, if you cared about actually saving power you'd just NOT be gaming

t. cia nigger

You know what they say, if a product can't stand on its own..

Actually, it's hit and miss when it comes to the 1070 and Vega 56. In some games NVIDIA wins and in some AMD does. Hard to tell which GPU is better.

Same settings (except mine can't hold 960 MHz HBM stable, so 945 MHz instead).

Fire Strike Extreme stress test = 202 watts max. Gaming will be lower for the most part though. Especially if you enable Radeon Chill.

The Virgin Intel

Attached: Hmmmmm.png (947x368, 463K)

The chad ryzen
>The Vega architecture is a lot more power efficient than people think.
Isn't it the Vega architecture that they're shoving in their APUs, cellphones and laptops? Obviously it must be really efficient. The only problem, I thought, was that it doesn't scale properly

Attached: Chad amd.png (1052x879, 1.06M)

>I don't regret the two 1070's I bought for my two rigs. not right now at least.
do you really need two? There are poor niggers like me that have to spend their whole xmas bonus on a 1060 3gb.
And here you are just talking nonchalantly about $1400 worth of GPUs that you don't even need. Capitalist pig.

You're a faggot. Kill yourself.

>poor voltage

t. inhell nigger

You poorfags are hilarious. You buy garbage "underdog" hardware and then spend your free time trying to convince yourself and others that you actually made the best choice.

Post-purchase rationalization is a hell of a thing.

And how much was a Vega 56?
I got my 1070 for 400 bucks a month ago; the Vega was around 650-700 bucks.

t. inhell peasant

I don't know what's more hilarious: buying a superior product price-to-performance-wise, or giving your money to criminal organisations that exploit and extort the market into a monopoly only to charge their customers twice the product's worth.

I got mine for £380. Flashed the 64 BIOS and can nearly match a 1080.

this

t. Angry poster

t. Samefag

t. Ornery 4chan.org/g/ user

Wrong :^)

Attached: Screenshot_20180320-221041.jpg (509x337, 65K)

>Inspect element
Really nigger?

Is this bait?
[spoiler] you can't inspect element on Clover or mobile phones [/spoiler]

Admit it you got totally trolled by the anonymous 4chan!!!!!!!!!!!!!!!!!!!!

shut up you lying phagget

top kek get BTFO'd cunt!!!!!!!!!!!!!!!!!!!!

t. enraged troll victim

PRAISE KEK JOIN ME ON REDDIT :^))))

Attached: Select_all_images_with_targets.jpg (1440x540, 67K)

>mentioning plebbit
KYS you cockroach.

Attached: images (5).jpg (384x384, 12K)

wake me up when I can get vega 56 for gtx1070 price

Does AMD hardware still run like a Siberian oven?

All me baby, cry more.

where the fuck do I even buy a vega they're out of stock everywhere

Intel and Nvidia hardware have taken the job as the heater for people in Siberia.

>amd has to overvolt TO COMPETE
>which means they can't compete without inefficiency
>which means the product is inefficient, and shit

Congrats on tweaking your shit to make up for being a poorfag. It's still shit.

I'd rather have barely 7% less performance with more features.

>Shadowplay
>Shield
>Drivers that aren't fucking garbage
>I can actually afford a 1070

Plus if I develop autism and start mining crypto the 1070 is the better choice.

Inefficient.
Compared to 1070 it costs 3x more.

I should mention that the total power draw of my card (including PCIE draw) is ~160 Watts while testing with Witcher 3. I'll see what I get with other GPU stress tests. Timespy results with my undervolted card in the pic.

It's shit in the sense that you have to tweak the settings to get decent performance per watt, but I got a cheap FreeSync monitor and bought my Vega 56 for 470 euros (with Wolfenstein 2 and Prey included, though I didn't want them). So it's a decent deal when you consider that FreeSync monitors are much cheaper than GoySync monitors.

Vega was let down by HBM2 (it was supposed to offer 1.2 times the bandwidth it actually does), and its poor frequency scaling made it even worse with the GPU core overvolted.
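For context on the bandwidth point: HBM2 throughput is just effective clock times bus width, and Vega 56/64 sit on a 2048-bit interface, so every memory MHz gained or lost shows up directly in bandwidth. A rough sketch (the clocks other than Vega 56's 800 MHz stock are only there for comparison, not to endorse the 1.2x figure):

[code]
# Rough sketch: HBM2 bandwidth from memory clock.
# HBM2 is double data rate and Vega 56/64 use a 2048-bit bus.
BUS_WIDTH_BITS = 2048

def hbm2_bandwidth_gbs(clock_mhz):
    # MHz * 2 transfers per clock * bus width in bytes, scaled to GB/s
    return clock_mhz * 2 * (BUS_WIDTH_BITS / 8) / 1000

for mhz in (800, 945, 960):   # V56 stock, V64 stock, the overclock in this thread
    print(f"{mhz} MHz -> {hbm2_bandwidth_gbs(mhz):.0f} GB/s")
[/code]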

Vega should have been a mid-to-high-end card. Vega 64 is an embarrassment.

Attached: timespy.png (1526x1276, 1.17M)

can someone just clock it to like 50% stock frequency and make it draw 20W
i'm considering buying one but 600€ is a lot for a toy

back to containment board

Nigger, AMD has had a feature identical to shadowplay for more than a year now. Try to stay on top of your memes.

t. ganoo pluhs loonix plebeian

Are you a shareholder, dude? I think you have a bigger problem now with the GPP stuff

>reducing power consumption by ~33%
Who in the hell actually cares about power consumption? Like, seriously. I have never once worried myself about how much electricity my rig uses.

AMD has created a muh underdog cult.

I know, but many of them are actual shareholders who bought shares in 2015/16. They're all over tech websites and YouTube's comment sections speculating and it's kinda funny to see it was all in vain

feature identical... ReLive works better than ShadowPlay

fuck your shield

considering nvidia drivers are the only ones I crash playing video with, and they'll drop my monitor till I unplug it and plug it back in, I'll call amd better. just note I'm on a 1060 6gb right now.

390X here, ReLive is a mess.

more power means more heat, and some of us have UPSes so lower watts means longer use.

had access to a 290 and 290x, was better than shadowplay by a significant margin.

>had access to
Well I have access to it right fucking now and it's still much worse than OBS

Booty blasted

nvidia/intel would have the same 'underdog cult' if they were on the bottom.

Anyone who isn't a teenager trying to justify what their mom bought them understands that competition is great and would like to see each of them have a fair and equal share of the market.

This.

AMD needs to stop fucking around, do a complete redesign on an ultra-high-clockspeed, ultra-efficient 2,000-shader design, and then make MCM chips out of multiples of those dies.

>Anyone who isn't a teenager trying to justify what their mom bought them understands that competition is great and would like to see each of them have a fair and equal share of the market.
And of course the mature thing to do is spazzing, shitposting and defending your favorite company online, right? AMD is fucked because of their own stupidity

t. inhell plebeian

Only the first post you quoted is me. To answers your question, I was but I traded AMD for Micron last December.

*answer

I may get back in but only if they show data center market share gain by Q3.

>I was but I traded AMD for Micron last December
Dodged a bullet

Yeah we knew 64 was overpriced and the 56 was the sweet deal. I'm on the 64 BIOS and get some pretty good results. Not quite as good as yours (silicon lottery) but good enough. If I overclock I can get to GTX 1080 levels of performance (at least in 3DMark).

Attached: Result_-_2017-11-05_07.15.25.png (977x821, 78K)

V56/64 is a MASSIVE chip vs the 1070...Vega is a mess OP

Vega 56 is good. 64 is trash.

Or buying the better product instead of buying amd garbage and then acquiring Stockholm syndrome to delude yourself that the shit you bought is actually gold.

>good
It was released more than one year after its rival, it's hotter and consumes more power and you still have the usual driver overhead and shit OpenGL support (goodbye emulators)

they aren't both just for me, and they have different uses in different rooms.

One of them runs servers and stuff, doubles as the gf's PC sometimes and (eventually) a VR rig; the other is my personal PC for gaming and browsing in a small form factor case, which I'll be switching to the NFC S4 Mini case next month.

I'm a bit new money (got a raise from 40k to 110k annually early last year) so I wanted to build some nice pc's that I could use to replace my old rig with a really shitty passively cooled card and 1155-socket cpu.

I can't wait to get fucked by taxes in april

t. nvidiot

what are the odds OP bought a vega and forgot to upgrade his psu?

Why did you switch the color of their lightsabers?

They were already green and blue

You're free to prove me wrong :^)

You want to be proven wrong. Okay pleb. Look at pic. It's you.

Attached: image.jpg (796x805, 112K)

k

Attached: nvidia.png (1083x585, 142K)

Okay this you then

Attached: image.jpg (653x726, 105K)

Now I agree, I am extremely disappointed by AMD and my next GPU is going to be a Nvidia

I have a GTX 770 and my next GPU will be AMD because I have become a proud AMD cocksucker. Although my 770 is great, I regret not getting the R9 280 at the time.

If only you got a 290, the last great AMD chip...

Oh Pajeet, you are silly. Look my GTX 1080:

Attached: GRW_2018-03-21_01-16-37.jpg (2560x1440, 1.67M)

>sharpening filter

Attached: 1431729954297.jpg (678x519, 37K)

The 480/580 are good. You are an NVIDIA shill.

>The 480/580 are good
I agree, but good and great are two different things

If you game at 1440p or lower resolution on a monitor that has a refresh rate of 75 Hz or lower there's no need for that kind of OC. You're probably drawing 270 Watts for the GPU alone at that clock speed.

After further testing I've found the sweet spot for Vega: core 1250 MHz @ 820 mV, HBM2 960 MHz. This pulls roughly 155 Watts for the card alone, though I'm not exactly sure how to get the best estimate for GPU-only power draw. I'm using Corsair Link to monitor system power draw along with a socket power meter and GPU-Z's GPU power draw reading.

When idle, Corsair Link shows ~79 Watts average draw, and when I stress the CPU alone at 100% it reads ~157 Watts (R5 1600X @ 3.8 GHz, 1.3 V core). Running The Witcher 3 it averages 260 Watts. Gaming doesn't stress the CPU 100% (~35% I think), so the basic math puts the GPU power consumption at ~155 Watts. GPU-Z says ~122 Watts average, but that's probably an underestimate.
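Spelling out that basic math: the big assumption is that CPU package power scales roughly linearly with load, so the result is an estimate, not a measurement:

[code]
# Rough GPU-only estimate from the Corsair Link readings above.
# Assumes CPU power scales about linearly with load -- only an approximation.
idle_w       = 79.0    # whole system at idle
cpu_stress_w = 157.0   # CPU at 100%, GPU idle
gaming_w     = 260.0   # Witcher 3 average
cpu_load     = 0.35    # rough in-game CPU utilisation

cpu_full_delta = cpu_stress_w - idle_w            # ~78 W for the CPU at 100%
cpu_in_game    = cpu_full_delta * cpu_load        # ~27 W while gaming
gpu_estimate   = gaming_w - idle_w - cpu_in_game  # ~154 W for the card

print(f"estimated GPU draw: ~{gpu_estimate:.0f} W")
[/code]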

I'm getting 72 FPS average in The Witcher 3 at 1440p ultra in Novigrad, which is fine as I've got an AOC Q3279VWF 32" VA 75 Hz FreeSync monitor.

I've got an HX750i. More than enough power.

Attached: v shills.jpg (883x704, 204K)

Bitch please

Attached: kGwkdhd[1].png (876x797, 58K)

Attached: lrzodkJ[1].png (898x797, 57K)

290 was masterrace bang for buck gpu desu

...

That was just to see how high it would clock on air on the stock GPU without crashing. In reality I keep it a lot lower and try to conserve power and keep it quiet like I posted prior to that.

Probably some sort of advanced joke about how AMD kills Intel but then gets killed by Nvidia? Which wouldn't have worked with the original colors.

Vega would be fine if custom cards had been available 18 months earlier and it had been priced appropriately.

Unfortunately it uses twice as much die as a 1080 for the same performance, plus some VRAM that's crazy expensive, so that's never going to happen. Unless AMD pulls some driver magic out of their ass or mining demand takes off again, it'll fade away after Volta drops and get relegated to being a 2060/2070 competitor.

The package is also incredibly delicate and I regularly see idiots destroy their cards trying to do basic repasting stuff that is trivial on most other cards.

Polaris is fine and a 28-32 CU Vega would also be fine. GCN just gets unbalanced when you try and scale it past that point.

>something would be fine if it was available before the tapeout
No shit.

You fucking retard, that Wattman reading is only GPU power consumption. You need an external watt meter like the reviews use to compare.