If I wanted power efficiency I'd have a Celeron and a GT 750.

Other URLs found in this thread:

hothardware.com/news/amd-radeon-vega-frontier-edition-hbm2-preorder

>That 300-375W number is in all likelihood the peak load the board is designed for. Add to that AMD's notorious practice of bumping up voltage to harvest as many dies as possible at high stock clocks, which increases power draw by a sizable amount. Initial cards will be just like the RX 480 launch, with power consumption improving later on and people finding even more reductions through undervolting. Keep in mind that semiconductor firms have varying methods for how they classify their 'TDPs', 'TBPs', etc. Peak consumption of a chip is typically 1.5x its rated TDP, and well-designed boards are built with margins so they can handle more than the typical TDP, i.e. close to that 1.5x value. So the average you should expect for the claimed 300W is anywhere around ~200-225W, and ~250-275W for the 375W SKU. Initially, though, the high voltages AMD will inevitably put on the early batches will push the averages closer to 275-290W.
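
A quick sanity check of that math, as a minimal Python sketch. The 1.5x peak-to-average factor and all the wattage figures are the poster's claims, not official AMD numbers:

# Back-of-the-envelope: if the board rating covers worst-case peaks,
# and peak draw is ~1.5x the sustained average (the poster's heuristic),
# then the expected average is roughly rating / 1.5.

def expected_average_draw(board_rating_w: float, peak_factor: float = 1.5) -> float:
    """Estimate sustained average draw from a peak-oriented board rating."""
    return board_rating_w / peak_factor

for rating_w in (300, 375):
    print(f"{rating_w} W rating -> ~{expected_average_draw(rating_w):.0f} W average")
# 300 W rating -> ~200 W average
# 375 W rating -> ~250 W average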

This is just a Fury 2.0 with an overclock.

Vega is a meme! Wait for Navi™.

The Xbox One X uses AMD and can run 4K@60fps for $500. Get me an Nvidia PC that can do this, even at twice the price.

AND STILL SLOWER THAN A 1080 HAHAAHAHAHAHAHAHAHAHAAH

AHAHAHAHAHAHA LOLOLOLOLOL

VEGA IS DOA!

Guess it was too much to ask to spank both Intel and Nvidia.
Hopefully it responds nicely to undervolting at stock frequencies; I don't know why Fiji and Polaris shipped with such high voltages.

Wait™ for Navi™
(But seriously, wait if you don't need to upgrade yet.)

>60fps
No.

Navi is a small-scale GPU though.

The 1080 Ti hits 280W, right?
I don't see the problem you're all trying to blow out of proportion again.

Vega is a high end monster.
ayynvidya fanboys have been complaining that AMD sucks because it doesn't have a Titan-like card. Now that they're getting one, of course they'll complain anyway.

Reminder to report samefagging shitposters.

oh wait...

Aren't these the compute cards?

No one knows what Vega FE is for: a pro card or a Titan-like flagship.
Either way, tests on it aren't run with pro drivers even though it's on AMD's pro line page, so we don't fucking know.

Marketing being clueless again, probably, or sandbagging.

Some MSI guy said something about RX Vega, which is the consumer card. Click the damn link.

>clicking links on Sup Forums

>samefag

Get lost.

lol

"60 fps"

clicking links on Sup Forums > Sup Forums > Sup Forums

>literally zero information other than Vega will probably require 300-375W
Nice bait.

>require
Not even that, it's maximum board power.

>Vega will be another 300W housefire with worse performance than 2016 Nvidia GPUs

It's just getting embarrassing now. AMD should quit making GPUs.

>worse performance
Benchmarks please.

Reading blocks of green text is informative sometimes.

See above. They run a shit ton of voltage to harvest as many dies as possible.

XBOXONEXEONEXOBX uses a "Jaguar" APU, which is an AMD construction-core chip and an RX 580 glued together. No 8320E-plus-580 setup is doing 4K 60fps at any reasonable settings.

>Ryzen was better than expected and BTFO'd Intel in certain segments
>Vega's gonna be shit
Nvidia is going to be the winner in all of this

Hopefully their effort is going into MCM GPUs for Navi or whatever comes later, so they can overwhelm Nvidia with cheap silicon.

A lot of "Source: My Ass" going on ITT.

>costs 2-3x the 1080ti
>is slower than it
>uses more power

One of these is wrong.

The thing with Nvidia is that even when they do fuck up, it never costs them much money. I wouldn't be surprised if the GTX 970 is the most popular discrete GPU of all time, even with the memory issues. Sure, they had to pay $30 per card, but how many people honestly bothered with the rebate? Even if the Shield wasn't super successful, they managed to dump all of their unsold Tegras on Nintendo, and the Switch is now a pretty big success despite being a rebranded Shield. They had the biggest fuckup in the company's history at the start with the Saturn and the NV1, but bounced back and have been unstoppable ever since.

It's not an 8320E; it uses much later cores (Excavator, I think, the last generation of the Bulldozer line). It has much higher performance per watt, and IIRC its GPU has more cores than the RX 580.

But anything can do 4K 60fps as long as it has enough bandwidth to draw that many frames. The only question is what kind of graphics it can push at that resolution. For example, my 10-year-old PC can run Touhou at 4K 60fps just fine.
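
To put a number on the bandwidth part, here is a minimal sketch of the scanout arithmetic, assuming a 32-bit RGBA framebuffer (my assumption, purely for illustration):

# Pushing a bare 4K framebuffer 60 times a second is only ~2 GB/s,
# so raw resolution alone proves nothing; the rendering work per
# frame is the real limit, which is this poster's point.
WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 4  # 32-bit RGBA, assumed for illustration
FPS = 60

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
print(f"{frame_bytes / 2**20:.1f} MiB per frame")        # ~31.6 MiB
print(f"{frame_bytes * FPS / 1e9:.2f} GB/s of scanout")  # ~1.99 GB/s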

You forgot NV30.

If RX Vega has the same fps/flop ratio as Tahiti had, that power consumption is more than fucking justified.
That perf/flop ratio would wipe the floor with the 1080 Ti, which is why I think it won't happen even with Vega's massive bottleneck removals.

Polaris was not efficient; Vega will not be either.

Tell me what Polaris has to do with Vega.
Try to at least sound technical.

Nothing. Polaris does not deliver the goods, so we should not expect more from Vega.

I said try to sound technical not fanboyish.

>not fanboyish
I own a 390

>I said try to sound technical
With a TDP of 375W for the Frontier Edition, I did not expect better for the gaming version.

First of all, TBP is not TDP.
Secondly, there are two Frontier Editions, spanning 300-375W.
And we've already been shown an 8+6-pin card.

So the clock will be lower than the Frontier Edition's? Also, 300W is a lot, more than the Fury X.

No they won't, and TBP/TDP is not power consumption.

>No they won't

How?

I'll let you in on a little hint: both the 375W and the 300W Vega have the same TFLOPS, meaning they have the same clocks.

Think about why that is, and think hard about what "TDP" means.

The AIO version is faster.

It's not magically faster, they have the same clockspeed.

I'm asking if you know why that is.

>I wouldn't be surprised if the GTX 970 is the most popular discrete GPU of all time

GeForce2 MX
GeForce4 MX
Radeon 9600
The GeForce 8600 was super popular too
The Radeon 4000 and 5000 series sold great as well
And right now the 470/480/570/580 is impossible to find due to high demand

I'd be surprised if a high-end card like the 970 could ever outsell cheap mid-range cards.

More heat ≈ more power draw

>It's not magically faster, they have the same clockspeed
Still faster...

What a flipping disappointment.
Oh well, then a 1080 it is.

If it has the same clockspeed it's not faster, since it's the same chip; you're not making any sense.
If they have the same clockspeed, the heat is also the same, since it's the exact same chip.


Now I'm pretty sure you're not really versed in this topic, so I'll ask you again: why do the 300W and the 375W card have the same clockspeed and the same TFLOPS, but a 25% difference in TDP?

There are actually 3 reasons; can you name at least one?
I'll fill in the other two.
I'll also give you a hint: TDP != power consumption.

God damn this is the technology board, learn what the fuck TDP means already.

>same clockspeed
source

>Both graphics cards include 16GB of HBM2 memory, 483GB/sec of memory bandwidth, 4096 stream processors, 13.1 TFLOPS of single-precision compute performance and 26.2 TFLOPS of half-precision (FP16) compute performance.

hothardware.com/news/amd-radeon-vega-frontier-edition-hbm2-preorder

Same everything.
Come on, this is basic tech stuff.
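
For anyone following along, you can back out the implied clock from those quoted specs with the standard peak-throughput formula (2 FLOPs per shader per clock, counting an FMA as two operations); the formula is standard, the numbers are from the quote above:

# Peak FP32 FLOPS = shaders * 2 (FMA) * clock, so the quoted specs
# pin the clock down exactly, for both the 300W and the 375W SKU.
SHADERS = 4096
PEAK_FP32_FLOPS = 13.1e12

clock_hz = PEAK_FP32_FLOPS / (SHADERS * 2)
print(f"Implied clock: ~{clock_hz / 1e6:.0f} MHz")  # ~1599 MHz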

The AYYMD diehards already bought a 1080 Ti. The meme is over. No one is getting butthurt anymore. No one is arguing that AMD will deliver a better product than Nvidia at this point, and anyone who is, is delusional.

And what is the clock speed?

Xbox doesn't matter when everyone uses a Sony PS. Only 8-year-olds care about Xbox.

Oh god, you're hopeless.

If the TDP is higher, the clockspeeds aren't the same.

We know nothing about the GPU clock.

...

I really never understood why anybody acts like such a toxic shitter over PC part specs.

You guys are literally cancer on all sides.

>Which megacorp is best megacorp, Sup Forums?

is what you sound like. Get a grip, all of you.

TFLOPS ratings are a function of clockspeed, you nitwit; if they're the same, the clockspeed is the same.

TDP has nothing to do with it. I already said there are 3 reasons why the TDP is 75W higher, and neither you nor the guy I was arguing with has mentioned a single one of them, because you clearly have no clue how any of this works.

Excuse me if you're not the same brain-dead guy from above.

TDP = Thermal Design Power = "design the card for these maximum specs so that the actual average power draw is safely sustained for a very long time"

TDP =/= expected power use

Fiji (Fury X) averaged something like 240W.
The 290 non-X was still rated for something like 275W but averaged 215-225W.

Stop believing the lies
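
Plugging in this post's own figures (the Fury X's official 275W rating plus the averages claimed above, none of which I can verify here), the rating-to-average gap looks like this:

# Rated board power vs the averages claimed in this thread.
# The avg_w values are the poster's claims, used purely for illustration.
claimed = {
    "Fury X": {"rated_w": 275, "avg_w": 240},
    "R9 290": {"rated_w": 275, "avg_w": 220},  # midpoint of the claimed 215-225W
}
for card, p in claimed.items():
    headroom = 1 - p["avg_w"] / p["rated_w"]
    print(f"{card}: rated {p['rated_w']} W, avg ~{p['avg_w']} W ({headroom:.0%} headroom)")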

Because we've been FUCKING WAITING™ for Vega, and this thing will probably have very high power consumption and still won't beat the 1080 Ti.

Asperger's causes people to have mouth-foaming fanboyisms over niche products and characters in shows (hence the obsession with anime on Sup Forums boards, regardless of the board).

I just buy whatever card is best for the money. My first card was a GTX 460, then a 570 later on, then a 7950, and now I'm rocking an RX 480. Granted, I'll probably sell the RX 480 and slap my 7950 back in, considering the crypto-mining boom.

>7900X has a TDP of 140W
>1600X has a TDP of 95W
>1700X has a TDP of 95W
>7700k has a TDP of 91W
>69** has a TDP of 140W


Take a good look.

GPU buyers are not a finite resource

Still spanks Poozen.

I'll give you one more hint, since you're so adorably dense.

A card that pulls 200W can have a TDP of 100W, heck, actually even 75W, but it's inadvisable for many reasons that have nothing to do with power consumption.

The pure friction sets it on fire.

About as much as you'd expect from the two extra cores.

TR is going to fuck Skylake-X in the ass though.

...

Why is Sup Forums such a cancer? Literally clueless about how hardware works.

we're too busy making love to our qt bfs to care

That's not an explosion, that's a cow.

The 970 was the most common card on the Steam hardware survey for years. Not sure if it still is.

Also, it had a $329 MSRP at launch and went down in price after a few months.

What kind of power?

Could be CPU power.

Is anyone surprised? Vega FE is rated at 300W, with the liquid-cooled version at 375W. I recall Raja, or someone else from RTG, saying consumer RX Vega would have a liquid-cooled version as well. 300W is fair if it beats the 1080 Ti.

Raja said nothing about reference RX Vega editions.

This.

Personally I can't wait to buy my $499 4K Minecraft machine.

All Intel had to do was drop prices to be called a winner. Instead they keep releasing nuclear editions of their already hot CPUs.

Nvidia is another beast. I know it's fun to overhype the things you cheer for, but let's be honest: AMD didn't beat Intel, Intel beat itself by releasing the same CPUs with the same performance for the last 5 years, with the only real difference in frames coming from faster memory.

Nvidia, on the other hand, keeps releasing cards 50% faster than their previous same-numbered models, even if they raised prices as a consequence of AMD's inability to compete with them.

I bet Vega will land somewhere between the 1080 and the 1080 Ti at best, being closer to the 1080 Ti in Vulkan and DX12 games and closer to the 1080 in DX11/GameWorks games.

Then Volta comes along and the cycle repeats itself, with AMD unable to produce new chips while releasing 3 series of slightly faster and hotter rebrands as Nvidia once again gets 2 years ahead.

>All Intel had to do was drop prices to be called a winner
Muh 60% margins.
>Nvidia is another beast.
Literally marketing: the company.
>Nvidia keeps releasing cards 50% faster than their previous same-numbered models
Because making more powerful GPUs is piss easy, since you can just throw more hardware at it.
>I bet Vega will land somewhere between the 1080 and the 1080 Ti at best
You should bet your anus and then re-read what Vega is all about.
>Then Volta comes along
Oversized 1.7%-yielding dies on a """""new""""" node? Lmao, smells like Thermi 2.0.
3/10 bait, made me reply.

I, too, have a phone to post from

>muh watts

It's almost as if the two products are in two different categories.

Ryzen + Nvidia is truly the best.

The 1080 Ti sucks 410 watts.
This is the current norm.

>Polaris was not efficient
Not the leftovers sold as gamer cards.

The embedded/OEM version of the Polaris 10/RX 480 GPU was rated at 95W TDP instead of the 150W the consumer cards got.

Same clocks and everything too, just better-binned parts, while RX boards got whatever was left.

Combine Vega with i9 and win and lose at the same time?

I'm considering this (because X299 ITX), but I'm pretty sure it'd void my home insurance the moment I powered up my new mini-housefire.

I'm pretty sure that's what I'm doing too. I really want quad-channel. Bandwidth is important for 4K, whether or not you're using Vega.

Two words:
1700-level performance

>robo

Source:
My
Ass

First one is definitely wrong
Second is probably wrong
Third is probably right

Hey guys,

resident AMD engineer shitposter here again.

Ask me anything about Vega and I will respond with cryptic memes.

Your guess is as good as mine

>GeForce4 MX
I'm still butthurt.