Daily Wait for Vega Thread /w4v/

Hi all,

This is where we wait for the latest AMD graphics card which AMD will surely release in a timely manner and will definitely blow all of our minds, no questions asked, 100% guaranteed, no chance it'll fail, just you wait.

Latest information leaked:

hothardware.com/news/amd-radeon-rx-vega-specs-leak-256-tmus-and-shader-engines

hexus.net/tech/news/graphics/105295-rx-vega-gpu-specs-uncovered-via-amd-linux-patch/


Not sure how to feel about these unverified leaks. It would make sense for the weaker version of Vega to be the one that leaked first.

Looks like they're directly competing with 1070, 1080 and 1080ti

>I-It's going to bankrupt nvidia

>weaker
that score is higher than the 1080ti's lol

That's baby Vega 11 for you.
I don't think we've seen any leaks of Vega 10, or at least none of them line up with the advertised specs that leaked.

Yeah, I meant the original benchmarks that leaked, which show performance just above a 1070.

Yeah I think we've seen the lowest and highest powered so far. One above 1070 and one above 1080ti.

I hope Vega bombs, I already spent $1k on a gsync monitor

I've got a freesync monitor and I'm willing to buy a 1080ti if vega has 10% less performance.

Bad move, that shit is so over-priced.

G-sync is already dead you dumb cuck, it's just a question of how soon it falls over.

The HDMI 2.1 spec has VESA Adaptive-sync style VRR.

>goysync in 2017

Man, Nvidia is gonna end up looking real stupid once Intel picks up adaptive sync

Let's all overhype it so that when it comes out it can in no way meet the unrealistic expectations we've made in our heads.

Still waiting

MAY 9TH
A
Y
.
9
T
H

how does this work then? is there some sort of setting that you enable that just causes "freesync/gsync" to kick in or does it just do it automatically?

That's my birthday. Shiggity.

vega c3 is what i want, and i want it at $350 msrp; if i can get it for that much i will buy it
i want it to be roughly 1070-tier performance on average and better in vulkan and dx12

Dunno how HDMI 2.1 VRR will work, but VESA Adaptive-sync works by having the monitor send an extra flag at connection handshake time saying that it's able to accept updates anywhere within a certain frequency range, then the GPU or whatever just sends frames whenever it feels like, so long as the timing windows match.

Dead simple.

G-sync is a hack where the display scaler communicates back and forth over the DP link to tell the GPU when it's allowed to display a frame, which adds some tiny fractional millisecond of extra latency but mostly is just sloppy engineering.

It's currently broadly assumed that HDMI VRR will just build off the FreeSync over HDMI stuff that AMD already developed, which is again basically just the VAS protocol translated to HDMI.
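
Rough toy model of the pacing side, if anyone wants it spelled out (the 40-144Hz range and the frame times are made up, and this is obviously nowhere near real driver code):

[code]
# Toy model of VESA Adaptive-Sync style frame pacing - illustrative only.
# The range below is a made-up example of what a display might advertise at
# handshake time (via EDID/DisplayID); the GPU then presents frames whenever
# they're ready, as long as the frame-to-frame interval stays inside that window.

MIN_HZ, MAX_HZ = 40, 144
MIN_INTERVAL = 1.0 / MAX_HZ   # can't present sooner than this after the last frame
MAX_INTERVAL = 1.0 / MIN_HZ   # must refresh again before this much time passes

def next_present_time(last_present, frame_ready):
    """When the next frame actually gets scanned out."""
    if frame_ready < last_present + MIN_INTERVAL:
        return last_present + MIN_INTERVAL   # rendered too fast: wait for the window to open
    if frame_ready > last_present + MAX_INTERVAL:
        return last_present + MAX_INTERVAL   # rendered too slow: panel repeats the old frame
    return frame_ready                       # in the window: scan out immediately, no tear, no wait

t = 0.0
for render_time in (0.004, 0.012, 0.030, 0.009):   # frames finishing at different speeds
    t = next_present_time(t, t + render_time)
    print("present at %.1f ms" % (t * 1000))
[/code]

If a frame takes longer than the slowest allowed interval, the panel just refreshes with the old frame again, which is also roughly what LFC-style frame doubling smooths over below the minimum refresh rate.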

So is there gonna be a $300 vega? Will it be much better than the 4/580?

I'm a huge bang-for-buck style guy, usually purchasing something new every 2 years. However, I stayed with the pic-related setup for the past 3 years and am now feeling the age.

Was thinking of a whole new setup with the Ryzen 1600 that was suggested, but I'm undecided between going for a 580 or waiting to see if Vega has a lower-end/bang-for-buck model.

so it'll just be plug and play for the display to synchronize its refresh rate with the amount of frames that the GPU is outputting?

pretty much.

awesome, last question but when will monitors start supporting HDMI 2.1 and DP 1.4?

If Vega releases at Computex, that lines up well with my college rebate.
Hopefully their 1070-equivalent isn't ridiculously expensive in comparison.
>unintentionally being a wait™ fag

DP 1.3/1.4 is (barely) already happening.

HDMI 2.1 is taking a while to finalize, so 2018 model TVs in the absolute best case with 2019 probably being more likely.

hbmeme are expensive

beats my gtx 760

how many TMUs did Hawaii, Fiji, and Polaris have?

>That's baby Vega 11 for you.

No, this is all vega 10. vega 11 is next year.

They are just making multiple SKUs, like how they had Fury, Fury Nano, Fury X, and Fury X2. We'll have Vega, Vega Nano, Vega X, and Lou Vega Mambo No. 5.

you mean 8th.

4 GB of HBM2
8 GB of HBM2
16 GB of HBM2

Nah, it costs the same as any decent GDDR5 setup. Especially considering it's only two stacks.

As good as the 1070 with twice the TDP, bet on it

>twice the TDP
user, Vega uses tiled rasterisation.

>thermal design power emission instead of actual wattage

you don't truly belong to Sup Forums anymore

just curious, what are you all even waiting for?

what amazing game is currently out that a GPU in a similar price range couldn't easily handle?

there are literally NO GOOD GAMES OUT that need a next gen GPU

what the FUCK do you even want vega for

just buy a 1080 or a 1080 ti and be done with it

>just buy nvidia goyim
Happy merchant pls go.

I'm waiting for a 1070-equivalent to compare the 2 choices, and grab whichever suits me best.
Then, I wouldn't have to upgrade for ages.

>Nah, it costs the same as any decent GDDR5 setup. Especially considering it's only two stacks.

HBM itself is cheap, but the problem is that it needs an interposer so it can be put next to the GPU, and this completely ruins whatever yields they may have for the GPU itself (which might be pretty low too, since it is a huge chip).

Vega will start at $500, minimum.

I'd say $550-650 for big Vega.

youtu.be/b_ZMOn0X6jw

4u

There was a time when amd used them interchangeably

the moment you buy some tech shit, it is already outdated. i'm waiting until the very last second to buy shit, and because these tards are throwing shit closer and closer together, i don't really know anymore.

i hope to buy a ryzen + vega some day, or an 8th gen intel with volta. i live in a third world shithole and have to plan US trips months in advance. i wish they made the launch dates clear so it wasn't so unexpected for me. and that's assuming they let me in in the first place. it is cheaper to fly to the US and buy computer shit there than to import it and pay absurd taxes to a corrupt government.

fuck

AMD sometimes uses TBP and sometimes TDP; anyone with basic hardware knowledge will know the difference.

I'd need an edit of that for Vega.

Hi guys Raja here. Soon very soon we will release powerful graphics cart, but need to wait. It will kick nvidia ass, but please wait. We need optimization for ryzen drivers, need to wait. Please vait guys, soon we release graphics card. nvidia is shit

Just wait™ for AMD™ Radeon™ (RTG™) brand new release of Vega™ guys. It'll be what everyone's talking about!

This ... isn't real, right?

Raja while you're still samefagging pls explain what's dat.

It's about as real as anything posted so far.

Well it's fake, but at least someone put some effort into it.

it means

R -> R

Geometry culling

Or it's not and the whole board will be full of asspained nvidiots on Vega launch day.

>primitive shader

It would be a nice change in the market, but Nvidia would still sell more.

do we get some caspian spice and ice cream with vega too?

and that poo in the loo game too?

NVIDIA always sells more, people bought fucking Thermis instead of Evergreens.
No, only Quake: Overwatch edition.

not raja, but looking at the pic it looks like it's hardware overdraw elimination. It is probably related to their tile based renderer, which works by drawing small chunks of the entire scene, like 16x16 or 32x32, in the ultra fast L1 or L2 cache, then copying the output to vram at the end. It is still bruteforcing draws (i.e. it still draws stuff that isn't visible in the end because it gets covered up), but by doing so in internal cache, it can be done orders of magnitude faster than in the framebuffer. And since it takes many tiles to make up the full framebuffer, it can also be parallelized, say, across 64 compute clusters.

... that, or they have some other magic that culls unused geometry even before rendering anywhere, tiles or framebuffer, but I don't know how the hell you'd do that.
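
Here's a toy software version of the tiling idea in case the wall of text doesn't click (not Vega's actual hardware, obviously; the "primitives" are just flat quads and the scene is made up):

[code]
# Toy tile-based (binned) rasterizer with a per-tile depth test - illustrative only.
# Each tile's depth and colour buffers are small enough to live in fast local memory
# while every primitive covering that tile gets resolved; only the finished tile is
# written back to the big framebuffer at the end.

WIDTH, HEIGHT, TILE = 64, 64, 16

# "Primitives" reduced to flat axis-aligned quads (x0, y0, x1, y1, depth, name)
# so the example stays short; a real rasterizer walks triangle edges instead.
prims = [
    (0, 0, 64, 64, 0.9, "sky"),
    (8, 8, 40, 40, 0.5, "building"),
    (16, 16, 24, 24, 0.1, "player"),
]

framebuffer = [["-"] * WIDTH for _ in range(HEIGHT)]

for ty in range(0, HEIGHT, TILE):
    for tx in range(0, WIDTH, TILE):
        # Per-tile working buffers: small enough to sit in L1/L2 instead of VRAM.
        depth = [[float("inf")] * TILE for _ in range(TILE)]
        colour = [[None] * TILE for _ in range(TILE)]
        for x0, y0, x1, y1, z, c in prims:           # a real binner pre-sorts primitives per tile
            for y in range(max(y0, ty), min(y1, ty + TILE)):
                for x in range(max(x0, tx), min(x1, tx + TILE)):
                    if z < depth[y - ty][x - tx]:    # overdraw still happens, but only in cheap cache
                        depth[y - ty][x - tx] = z
                        colour[y - ty][x - tx] = c
        for y in range(TILE):                        # one burst write of the finished tile to VRAM
            for x in range(TILE):
                if colour[y][x] is not None:
                    framebuffer[ty + y][tx + x] = colour[y][x][0]

print("\n".join("".join(row) for row in framebuffer[::8]))
[/code]

Point being: the depth test still wastes work on pixels that end up covered, but it all happens in a tiny per-tile buffer that fits in cache instead of hammering VRAM, and every tile is independent so you can spread them across however many CUs you have.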

And that was when ATi was a pretty powerful brand name; now the mass of the market is Generation Z underage who don't even know ATI or AMD exist.

AMD needs to straight up lead for 3-4 gens before it can reach 50-50 marketshare again

Na they will be first to have 144hz 4k ips Q2. You can't find anything about freesync2 monitors.

t. Freesync monitor owner looking for a 144hz version

yup this, then blame amd

I wonder how many people had 970 as their first video card.

Except this time no one expects anything because AMD was incredibly vague about Vega.

This.

The only thing to expect is performance slightly worse than 1080ti, like every other god damn release

>incredibly vague about Vega
>literally detailed the arch 5 months ago

It's only vague to idiots.

290x raped Titan and 780ti.

It's idiots that fall for marketing tricks and buy most high-end video cards. I mean Vega is already awesome if you know what made Maxwell awesome.

I don't care about idiot's opinion, I care about interesting technology be it Nvidia or AMD, and Vega is interesting technology, just as you said, Maxwell was.

Not at launch

But you see, these idiots buy these cards, and AMD needs to sell them. Awesome tech does not sell, marketing sells.

I can't help AMD there; I'm more interested in what they produce than in their standing as a company, and if you're worried about a lack of R&D, it seems the CPU division will now have more than enough cash to feed the GPU one.

Instinct cards will sell like hotcakes bundled with MI25's, so the GPU division will have money too. But I wish AMD had more influence over the gaymur market, since things like Mantle/Vulkan are awesome, and the software and APIs they develop are open-sourced, compared to one faggy green company with an insane boner for proprietary frameworks.

bundled with Naples*

No point in bundling it with Naples for servers that don't need an accelerator.

There could be some 2U+ combos with them but the ML market is still tiny compared to the traditional x86 one

>ML market is tiny
Yet it grows, surely and steadily. And Naples is still the king of traditional x86 market due to sheer amount of cores and I/O.

UHD@144Hz = chroma subsampling garbage

DP 1.3/1.4 peaks out at UHD@120Hz for 24bpp and 96Hz for 30bpp.
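
Napkin math if anyone wants to check that (the h_blank/v_blank values are my own rough reduced-blanking guess, not exact VESA timings, so treat it as ballpark):

[code]
# Rough DisplayPort 1.3/1.4 bandwidth check - ballpark numbers only.
DP_HBR3_PAYLOAD_GBPS = 32.4 * 8 / 10      # 4 lanes x 8.1 Gbit/s raw, minus 8b/10b overhead = 25.92

def required_gbps(h_active, v_active, refresh_hz, bpp, h_blank=160, v_blank=62):
    # h_blank/v_blank are approximate reduced-blanking totals, not exact CVT-R2 timings
    total_pixels = (h_active + h_blank) * (v_active + v_blank)
    return total_pixels * refresh_hz * bpp / 1e9

for refresh, bpp in [(120, 24), (144, 24), (96, 30), (144, 30)]:
    need = required_gbps(3840, 2160, refresh, bpp)
    verdict = "fits" if need <= DP_HBR3_PAYLOAD_GBPS else "does NOT fit"
    print(f"UHD@{refresh}Hz {bpp}bpp needs ~{need:.1f} Gbit/s -> {verdict} in {DP_HBR3_PAYLOAD_GBPS:.2f} Gbit/s")
[/code]

Which is roughly why the 144Hz panels fall back to 4:2:2 (around 16 bits per pixel instead of 24), or would need DSC's claimed 2-3x compression to do 4:4:4.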

nice strawman faggot

On pure specs you're right, but I'm worried about AMD's on-site support, RMA and software compatibility. Then again, Lisa Su was doing these things at IBM for years, so she should know that pure specs aren't all there is to the datacenter.

DP 1.4 has lossless compression

Considering the entire focus of Zen lies in dominating the mobile and server markets, my guess is they were preparing for it. And Su is no retard.

My 7950's are waiting!

>not flashing his Tahiti PROs to XT

VESA Display Stream Compression is "visually lossless" 2x-3x compression, but it's optional and nobody uses it yet, either in GPUs or displays.

The new UHD@144Hz displays do 4:2:2 subsampling at their max refresh rate.

Wasn't 4:2:2 normal? Isn't 4:4:4 just on expensive as shit 12bit+ displays?

No, 4:4:4 is normal unless you're using a UHD@60Hz setup over HDMI 1.4 or similar garbage.

4:2:2 and 4:2:0 both make color text and linework look like complete shit, and I hope you'd notice it if you saw it.

So there are no 4k 144Hz 4:4:4 displays out or even coming until DP 1.5, or until someone adopts VESA Display Stream Compression?

wew lad

3dmark.com/spy/1595952

Right, excluding multiple DP links. The new models now should do UHD@120Hz/4:4:4 though, unless the manufacturers are completely retarded.

But the next DP is at least several years away, and the next opportunity for DSC adoption is probably Volta or Navi.

The new Nvidia Gsync 4k 144hz IPS are DP 1.4

Seems like 1.4 can handle 4k @ 120hz no problem:

vesa.org/featured-articles/vesa-publishes-displayport-standard-version-1-4/

There's no difference between 120 and 144Hz and the former is better for videos so that's fine.

>3 drivers within one month
Raja finally hired some driver guys it seems.

It's been what, a fucking year already since AMD had better drivers than novideo.

120 Hz is still a little too low for people with flicker sensitivity when ULMB or whatever strobing is enabled.

144 Hz is slightly better in that regard, but honestly just getting things up to 240 Hz would be good enough for anybody.

I thought Gsync/AdaptiveSync + ULMB don't work together?

Crimson is definitely better than recent Nvidia drivers, but honestly AMD was close to parity in stability and functionality for maybe closer to 2 years.

The new GUI and ReLive were just the final nails in the coffin.

For me 120hz is the sweet spot, since videos look good in it and 60fps-capped japanese games look good in it, all without needing to constantly switch my refresh rate.

I've tried ULMB in the form of LightBoost, and to be honest I wasn't impressed. The screen gets way too dim, the input lag is annoying, and I feel no appreciable gain from nuking motion blur from things.

I was really disappointed when I read that those monitors won't be using DSC, I was really looking forward to them.