>>63351351

>not buying the Nitro+ OC

>it's real
oh no no no AHAHAHAHAHAHA

MY FUCKING GIGABYTE 1080 G1 ONLY HAS 1X8 PIN AND IT IS FASTER. HAHAHAHAHAHAHAHAHAHAHAHAHAHA

Is it a new record for single GPU?

Enjoy worse overclocking desu

Oh my god.
Is that picture real?

this has to be a photoshop

good thing amd botched the vega aftermarket card launch back in september lel, i would have fallen for their meme back then

>not running your video card in energy saving mode to save half the noise and 1/3 the energy for a 10% loss of fps which you won't notice anyway

That looks like a nightmare. Glad I went with the standard liquid cooled one desu.

Pascal barely scales with extra power; more than one 8-pin on a 1080 is literal snake oil.

>youtube.com/watch?v=kOVj_btNHzQ
Looks pretty real to me.

Yeah it's the 390X2 I think

Look at the bench of the OP card, they push Vega to the limit with no headroom for the overclockers

release when?

Now actually look at the OP and notice that it's still clocked at stock clocks, not overclocked like the final product.

You really want this meme? Get a 1080 or maybe a cheap V56 desu.

>Is it a new record for single GPU?

No. The Galax 1080 Ti HoF has three eight-pins. There are probably others.

>they removed the tachometer
but still have lights to spare to cover the fans

Holy jebus, how much fucking power does that thing draw on idle?

>they push Vega to the limit with no headroom for the overclockers
>barely any faster than reference
So which is it? Either they didn't or Vega is already pretty much maxed.

Not much. No modern card does, since they shut down most of the chip and clock down, to the point that the difference between a 1080 Ti or Vega 64 and a 1050 is ~10W. How many power connectors it has doesn't change anything.

Most likely with vega it doesn't consume much more than the reference card. Adding more 8-pins is marketing bullshit aimed at overclockers more than anything else

>when you're old enough to have GPUs that ran full bore no matter what
>not even any over temp protection

It's not 300 watts for Vega 64?

My reference 4870 idled in the mid-70s.

>discussing idle power consumption
>posts a graph showing gaming power consumption

What did (You) mean by this?

>Average (Gaming)
>Idle
Okay.

hurr

what's that right angle tool?

Stop bully me :(

This? The GPU brace.

A prop so it doesn't bend your house.

Looks silly, but I don't care. This is what I've been wanting to upgrade to from my current Fury Nitro.

I see.
kinda smart, since it looks like you fasten it with the GPU screw holes atop the back panel, instead of the other stand-type anti-bending thingamajigs

This thread takes me back...

>3 8-pins

What is this, a R9 295X2?

I think not even that had three 8-pins

That was just before they went tits up.
I wonder whether there's a parallel.

You're right
It had four

these threads became much funnier since I got a 1070, weird how that works

I thought it had 8+8+6 connectors

Now I want one even more, together with the FX9590

>Two GPUs require twice as many connectors
Wow!

Get on my level.

would be hilarious if someday that thing started
a fire, wouldn't it?

>would be hilarious if someday that thing started
I'm going to guess it starts every day.

ahem, lemme just connect my gpu

that's not enough, actually

At least it would give me an excuse to upgrade; currently there's not really a significant enough gain to be made from anything available. it's a 390X.

Also the ciggy pack is reinforced inside, it's just the perfect size to wedge in there so i went with it. Card doesn't even get that hot, highest i recorded was 74 celsius taken straight after a session of GTA V. Might be a bit hotter nowadays, had the card over a year. Also i say that's not that hot but im not really sure what's considered hot for a gpu. My cousin's 970 went over 80 though and that's the closest matched nvidia card i can think of.

the reference amd card this time around had a top notch pcb and components
it's hard to surpass it in any way unless amd bins the hell out of them

But there are 24 pins on the ATX connector and there are 24 pins on the gpu.

just...tell me the date

Pretty sure there's a Kingpin edition of an nvidia card that can take 1.2 kW of power, so this is nowhere near full retard.

You know that more headroom means nothing in terms of heat right?
Also,
>it is faster
Wait(tm)

>31cm
It will fit into a Define C, right? The case manual says it can hold any GPU shorter than 31.5cm.
A-asking for a friend.

Kek, 64/10

>Enjoy worse overclocking desu
Every Pascal AIB is the same

>believes that 4 8-pin connectors belong to a single gpu

it's a dual GPU, retards, why is everyone so fucking stupid these days?

Don't quote me if you can't even comprehend my post, you fucking retard. I was responding to the question of whether a single GPU card has ever had three eight-pins before.

Nice get

But you're right, almost all 1080s hit the same clock speeds.
Nvidia's binning is top tier.
My 1080 under water hits 2150MHz and that's it.
MSI Afterburner says power usage is still 60-70% of the max.
37C load temps.

>someone somewhere actually bought vega

Why you quote me???

you can saw it shorter if it doesn't fit :^)

I guess I'll be buying Vega to replace my 290x... I can now use freesync in the 100 fps range which should be sweet.

>1080 performance level
>100 fps

Does free sync even work at those fps?

Also, I have a gsync monitor and you can't tell if it's turned on or off.
It's a marketing gimmick

>videocardz.com/73947/sapphire-radeon-rx-vega-64-nitro-pictured-and-tested
>[Edit] We were asked by HWBattle to remove the charts, as they were misleading (final product will have higher clocks).

Freesync can, but the monitor might not.

Normies will unironically think this card is higher performance because of muh 3x8 pin.

Mine goes from 48-120Hz

I can 100% tell apart even tiny changes in frame rate at the highest level. It can get pretty shit if it dips from max to below and I don't have it on. Freesync definitely helps out.

delet

Simply using an FPS cap with MSI Afterburner gets you the same frame consistency.
Make a cap at, say, 144fps, then tune settings so you get ~90% GPU usage at 144fps.
Fps stays in that window and everything is smooth.
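Back-of-envelope, that ~90% target works out like this. Just a sketch of the arithmetic; the actual limiting happens inside Afterburner/RTSS, and the 6.25ms render time below is a made-up example, not a measured figure:

```python
# Frame-pacing arithmetic behind an fps cap (illustrative only).
def frame_budget_ms(fps_cap):
    """Time the GPU gets per frame at a given cap."""
    return 1000.0 / fps_cap

def gpu_usage(render_ms, fps_cap):
    """Fraction of the frame budget spent rendering. Staying under
    ~90% leaves headroom, so frame times stay consistent instead of
    spiking when a heavy scene pushes the GPU to its limit."""
    return render_ms / frame_budget_ms(fps_cap)

budget = frame_budget_ms(144)   # ~6.94 ms per frame at a 144fps cap
usage = gpu_usage(6.25, 144)    # hypothetical 6.25 ms render -> ~90%
print(f"{budget:.2f} ms budget, {usage:.0%} GPU usage")
```

If the render time ever exceeds the budget, the cap stops mattering and frame times wander, which is where the choppiness comes from.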

Neither I, my brother, nor my roommate could tell the difference between my settings and having Gsync on.
Freesync and Gsync are marketing gimmicks.

are you sure it's even working? getting a freesync monitor was the difference between night and day for me. instead of horrible stuttering every time I drop 5-10fps, I now get buttery smooth gameplay until the fps tanks out of the freesync range, which is 35fps on my monitor. you can still tell when fps is tanking because animation feels like it's slowing down, but it's not the obnoxious and jarring stuttering.
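The ranges posters quote (a 35fps floor here, 48-120Hz earlier in the thread) are exactly what decides whether adaptive sync is doing anything. A minimal sketch, with the bounds as assumptions you'd swap for your own panel's spec:

```python
def vrr_active(fps, vrr_min=35, vrr_max=120):
    """True when the frame rate sits inside the monitor's variable
    refresh window. The 35fps floor and 120Hz ceiling are figures
    quoted in this thread, not universal values."""
    return vrr_min <= fps <= vrr_max

print(vrr_active(90))   # True: adaptive sync smooths this out
print(vrr_active(30))   # False: below the floor, stutter comes back
```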

Do you really think the performance will be better?...3x8 pins for +1%

On that bench both cards are running at the same clocks, and the small difference is explained by the better thermals of the custom card. If the final product runs at higher clocks, of course it's going to have better performance.

No. Some do have higher power limits.
Reviewers always get sent the best binned chips, so they are less affected by power limits when overclocking.

So ugly..

I think you're a fool if you're turning on adaptive sync for a game that will run above 144 fps. That's absolutely not the point of having it.

It's especially useful for games with variable frame rates, for example GTA V. I can get anywhere between 60 and 90 depending on where I am on the map. It can look really choppy when the frames aren't consistent. What's your solution there? Vsync? Capping it to 60? Why, when I can get it to 90 sometimes and it looks smooth because my monitor has adaptive sync.

Did you seriously think it was supposed to be used if you're above the max fps the monitor can output?

Another good example is overwatch. If I play it at like medium settings it will stay above 144 most of the time and look fine. Some effects will cause it to drop and you can immediately tell it dropped because it looks choppy and laggy.

It doesn't work that way my guy

>no DVI

>No Galax
>No EVGA

Oy Vey!

PRIMITIVE SHADER DRIVERS? NO?

FUCK OFF

This is not the final product

Has anyone done a Morpheus II mod on Vega? I heard there were some issues with VRM cooling since Vega has a slightly different layout.

The best use for Gsync and Freesync is actually running your monitor at 120-144Hz for games locked at 60 or games that are so graphically intense you can't hit over 100fps. Running a 60fps game at 144Hz has less input lag than 60fps at 60Hz.
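The input-lag part of that claim is simple arithmetic: if a finished frame has to wait for the next refresh, the average wait is half the refresh interval. A rough sketch that ignores scanout time and game-engine latency:

```python
def avg_wait_ms(refresh_hz):
    """Average time a finished frame waits for the next refresh,
    assuming frames complete at uniformly random points in the cycle."""
    return 1000.0 / refresh_hz / 2

# The same 60fps content on a 60Hz vs a 144Hz panel:
print(f"60Hz panel:  {avg_wait_ms(60):.2f} ms avg wait")   # ~8.33 ms
print(f"144Hz panel: {avg_wait_ms(144):.2f} ms avg wait")  # ~3.47 ms
```

Roughly 5ms shaved off per frame just from refreshing more often, before adaptive sync does anything at all.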

I wish I could get AC:O to run above 60 FPS at all times without sacrificing settings. It's a bitch to run.

AYYMD HOUSEFIRES