This is where we wait for the latest AMD graphics card which AMD will surely release in a timely manner and will definitely blow all of our minds, no questions asked, 100% guaranteed, no chance it'll fail, just you wait.
Not sure how to feel about these unverified leaks. It would make sense that the weaker version of Vega is the one that leaked first.
Looks like they're directly competing with 1070, 1080 and 1080ti
Jackson Morales
>I-It's going to bankrupt nvidia
Nathaniel Martinez
>weaker
that score is higher than the 1080ti's lol
Jackson Wood
That's baby Vega 11 for you. I don't think we've seen any leaks of Vega 10, or at least none of them line up with the leaked advertised specs.
Josiah Morgan
Yeah, I meant the original benchmarks that leaked, which show performance just above a 1070.
Yeah I think we've seen the lowest and highest powered so far. One above 1070 and one above 1080ti.
Cooper Reyes
I hope Vega bombs, I already spent $1k on a gsync monitor
Caleb Allen
I've got a freesync monitor and I'm willing to buy a 1080ti if vega has 10% less performance.
Joshua Williams
Bad move, that shit is so over-priced.
Jason Nelson
G-sync is already dead you dumb cuck, it's just a question of how soon it falls over.
The HDMI 2.1 spec has VESA Adaptive-sync style VRR.
Angel Cruz
>goysync in 2017
Alexander Campbell
Man, Nvidia is gonna end up looking real stupid once Intel picks up adaptive sync
Eli Murphy
Let's all over hype it so when it comes out it can in no way meet the unrealistic expectations we've made in our heads
Jeremiah Jackson
Still waiting
Zachary Long
MAY 9TH
M A Y . 9 T H
David Kelly
how does this work then? is there some sort of setting that you enable that just causes "freesync/gsync" to kick in or does it just do it automatically?
Ryan Davis
That's my birthday. Shiggity.
Jack Johnson
vega c3 is what i want, and i want it at $350 msrp. if i can get it for that much i will. i want it to be roughly 1070-tier performance on average, and better in vulkan and dx12.
Nicholas Carter
Dunno how HDMI 2.1 VRR will work, but VESA Adaptive-sync works by having the monitor send an extra flag at connection handshake time saying that it's able to accept any updates that occur in a certain frequency range, then the GPU or whatever just sends frames whenever it feels like, so long as the timing windows match.
Dead simple.
G-sync is a hack where the display scaler communicates back and forth over the DP link to tell the GPU when it's allowed to display a frame, which adds some tiny fractional millisecond of extra latency but mostly is just sloppy engineering.
It's currently broadly assumed that HDMI VRR will just build off the FreeSync over HDMI stuff that AMD already developed, which is again basically just the VAS protocol translated to HDMI.
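The behavior described above can be sketched as a toy model (assumption: this is simplified illustration, not real driver code; the 48-144 Hz range and all function names are made up). The display reports a VRR range at handshake time, and the GPU presents each frame whenever it's ready, clamped so the interval between refreshes stays inside that range:

```python
# Toy model of Adaptive-Sync frame pacing (illustrative, not real driver code).
MIN_HZ, MAX_HZ = 48, 144          # hypothetical range reported at handshake
MIN_INTERVAL = 1.0 / MAX_HZ       # can't refresh faster than this
MAX_INTERVAL = 1.0 / MIN_HZ       # must refresh at least this often

def next_present_time(last_refresh, frame_ready):
    """When to actually scan out a frame that finished rendering at frame_ready."""
    earliest = last_refresh + MIN_INTERVAL
    latest = last_refresh + MAX_INTERVAL
    if frame_ready <= earliest:
        return earliest           # frame came too fast: hold until min interval
    if frame_ready >= latest:
        return latest             # frame is late: repeat the previous frame
    return frame_ready            # in range: present immediately, no tearing

print(next_present_time(0.0, 0.010))  # in range, presents at 0.010
```

If the frame lands anywhere in the window, it scans out immediately, which is why there's no fixed refresh clock to tear against.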
Christian Lee
So there's gonna be a $300 vega? Will it be much better than the 480/580?
I'm a huge bang-for-buck style guy, usually purchasing something new every 2 years. However I stayed with the pic related setup for the past 3 years and am now feeling its age.
Was thinking of a whole new setup with the Ryzen 1600 that was suggested, but undecided between going for a 580 or waiting to see if Vega has a lower-end/bang-for-buck model.
Gabriel Roberts
so it'll just be plug and play for the display to synchronize its refresh rate with the number of frames that the GPU is outputting?
Ian Adams
pretty much.
Thomas Thompson
awesome, last question but when will monitors start supporting HDMI 2.1 and DP 1.4?
Julian Butler
If Vega releases at Computex, that lines up well with my college rebate. Hopefully their 1070-equivalent isn't ridiculously expensive in comparison.
>unintentionally being a wait™ fag
Jose Martinez
DP 1.3/1.4 is (barely) already happening.
HDMI 2.1 is taking a while to finalize, so 2018 model TVs in the absolute best case with 2019 probably being more likely.
Brayden Young
hbmeme are expensive
Kevin Thompson
beats my gtx 760
Jackson Harris
how many TMUs did Hawaii, Fiji, and Polaris have?
Adrian Jackson
>That's baby Vega 11 for you.
No, this is all vega 10. vega 11 is next year.
They are just making multiple SKUs, like how they had Fury, Fury Nano, Fury X, and Fury X2. We'll have Vega, Vega Nano, Vega X, and Lou Vega Mambo No. 5.
Logan Scott
you mean 8th.
Jeremiah Cruz
4 GB of HBM2
8 GB of HBM2
16 GB of HBM2
Daniel Russell
Nah, it costs the same as any decent GDDR5 setup. Especially considering it's only two stacks.
Mason Jones
As good as the 1070 with twice the TDP, bet on it.
Cameron Martin
>twice the TDP
user, Vega uses tiled rasterisation.
Brayden Cooper
>thermal design power emission instead of actual wattage
you don't truly belong to Sup Forums anymore
Colton Brown
just curious, what are you all even waiting for?
what amazing game is currently out that a GPU in a similar price range couldn't easily handle?
there are literally NO GOOD GAMES OUT that need a next gen GPU
what the FUCK do you even want vega for
just buy a 1080 or a 1080 ti and be done with it
Carson Davis
>just buy nvidia goyim
Happy merchant pls go.
Nolan Gomez
I'm waiting for a 1070-equivalent to compare the 2 choices, and grab whichever suits me best. Then, I wouldn't have to upgrade for ages.
Henry Howard
>Nah, it costs the same as any decent GDDR5 setup. Especially considering it's only two stacks.
HBM itself is cheap, but the problem is that it needs an interposer so it can be put next to the GPU, and this completely ruins whatever yields they may have for the GPU itself (which might be pretty low too, since it is a huge chip).
There was a time when amd used them interchangeably
Austin Williams
the moment you buy some tech shit, it is already outdated. i'm waiting until the very last second to buy shit, and because these tards are throwing shit closer and closer together, i don't really know.
i hope to some day buy a ryzen + vega, or an 8th gen intel with volta. i live in a third world shithole and have to plan US trips months in advance. i wish they made the launch dates clear so it wasn't so unexpected for me. and that's assuming they let me in in the first place. it is cheaper to fly to the US and buy computer shit there than to have it shipped and pay absurd taxes to a corrupt government.
fuck
Juan Brown
AMD sometimes uses TBP and sometimes TDP, anyone with basic hardware knowledge will know the difference.
Ayden Gonzalez
I'd need an edit of that for Vega.
Jayden Edwards
Hi guys Raja here. Soon very soon we will release powerful graphics cart, but need to wait. It will kick nvidia ass, but please wait. We need optimization for ryzen drivers, need to wait. Please vait guys, soon we release graphics card. nvidia is shit
Evan Sanchez
Just wait™ for AMD™ Radeon™ (RTG™) brand new release of Vega™ guys. It'll be what everyone's talking about!
Cameron Moore
This ... isn't real, right?
Gabriel Bell
Raja while you're still samefagging pls explain what's dat.
Bentley Adams
It's about as real as anything posted so far.
Christian Peterson
Well it's fake, but at least someone put some effort into it.
Gavin Sanchez
it means
R -> R
Nolan Mitchell
Geometry culling
Liam Davis
Or it's not and the whole board will be full of asspained nvidiots on Vega launch day.
Logan Price
>primitive shader
Jaxson Torres
It would be a nice change in the market, but Nvidia would still sell more.
Daniel Ross
do we get some caspian spice and ice cream with vega too?
and that poo in the loo game too?
Hunter Allen
NVIDIA always sells more, people bought fucking Thermis instead of Evergreens.
No, only Quake: Overwatch edition.
Joshua Sullivan
not raja, but looking at the pic it looks like it's hardware overdraw elimination. It is probably related to their tile-based renderer, which works by drawing small chunks of the entire scene, like 16x16 or 32x32, in the ultra-fast L1 or L2 cache, then copying the output to vram at the end. It is still bruteforcing draws (ie. it still draws stuff that isn't visible in the end because it gets covered up), but by doing so in internal cache, it can be done orders of magnitude faster than in the framebuffer. And since it takes many tiles to make up the full framebuffer, it can also be parallelized, say, across 64 compute clusters.
... that, or they have some other magic that culls unused geometry even before rendering anywhere, tiles or framebuffer, but I don't know how the hell you'd do that.
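The tile-in-cache idea above can be sketched as a toy renderer (assumption: everything here is illustrative, including the pre-rasterized fragment format and the dict standing in for cache; real GPU pipelines are organized nothing like this code):

```python
# Toy tile-based renderer: each tile gets its own depth buffer in "cache"
# (a local dict), so overdraw is resolved before anything hits "VRAM".
TILE = 4
W = H = 8

def render(triangles):
    vram = {}
    for ty in range(0, H, TILE):
        for tx in range(0, W, TILE):
            depth = {}   # per-tile depth buffer, lives in fast local cache
            color = {}
            for tri in triangles:
                for (x, y, z, c) in tri:  # pre-rasterized fragments
                    if tx <= x < tx + TILE and ty <= y < ty + TILE:
                        if z < depth.get((x, y), float("inf")):
                            depth[(x, y)] = z    # closer fragment wins
                            color[(x, y)] = c
            # one burst write per tile: covered-up fragments never
            # touch external memory bandwidth
            vram.update(color)
    return vram

far  = [[(1, 1, 0.9, "blue")]]   # drawn first, then covered up
near = [[(1, 1, 0.2, "red")]]
fb = render(far + near)
print(fb[(1, 1)])  # -> red; the blue overdraw died inside the tile cache
```

The hidden blue fragment is still "drawn", just cheaply in cache, which matches the bruteforce-but-local description above.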
Jackson Thompson
And that was when ATi was a pretty powerful brand name; now the mass of the market is Generation Z underage kids who don't even know ATI or AMD exist.
AMD needs to straight up lead for 3-4 gens before it can reach 50-50 marketshare again
William Harris
Nah, they will be first to have a 144Hz 4K IPS in Q2. You can't find anything about FreeSync 2 monitors.
t. Freesync monitor owner looking for a 144hz version
Andrew Diaz
yup this, then blame amd
Nicholas Harris
I wonder how many people had 970 as their first video card.
Samuel Rodriguez
Except this time no one expects anything because AMD was incredibly vague about Vega.
Nathaniel Phillips
This.
The only thing to expect is performance slightly worse than 1080ti, like every other god damn release
Brandon Price
>incredibly vague about Vega
>literally detailed the arch 5 months ago
It's only vague to idiots.
Tyler Campbell
290x raped Titan and 780ti.
Oliver Robinson
It's idiots that fall for marketing tricks and buy most high-end video cards. I mean Vega is already awesome if you know what made Maxwell awesome.
Dylan Stewart
I don't care about idiots' opinions, I care about interesting technology, be it Nvidia or AMD, and Vega is interesting technology, just as Maxwell was, like you said.
Cooper Myers
Not at launch
Jaxon Scott
But you see, these idiots buy these cards, and AMD needs to sell them. Awesome tech does not sell, marketing sells.
Nathan Brooks
I can't help AMD there, I'm more interested in what they produce than in their standing as a company, and if you're worried about a lack of R&D, it seems the CPU division will now have more than enough cash to feed the GPU one.
Matthew Gutierrez
Instinct cards will sell like hotcakes bundled with MI25's so GPU division will have money too. But i wish AMD had more influence over gaymur market, since things like Mantle/Vulkan are awesome, and software and APIs they develop are open-sourced, compared to one faggy green company with insane boner for proprietary frameworks.
Dominic Cook
bundled with Naples*
Mason Flores
No point in bundling it with Naples for servers that don't need an accelerator.
There could be some 2U+ combos with them but the ML market is still tiny compared to the traditional x86 one
Nathan Carter
>ML market is tiny
Yet it grows, surely and steadily. And Naples is still the king of traditional x86 market due to sheer amount of cores and I/O.
Oliver Kelly
UHD@144Hz = chroma subsampling garbage
DP 1.3/1.4 tops out at UHD@120Hz for 24bpp and 96Hz for 30bpp.
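Those limits are easy to sanity-check with rough arithmetic, assuming HBR3 at 4 lanes x 8.1 Gbps with 8b/10b encoding and roughly 3% blanking overhead (the overhead figure is an approximation, not from any spec table):

```python
# Rough bandwidth check for uncompressed RGB over DP 1.3/1.4 (HBR3).
HBR3_EFFECTIVE_GBPS = 32.4 * 8 / 10   # 4 x 8.1 Gbps raw, 8b/10b encoding

def link_rate_gbps(width, height, refresh_hz, bits_per_pixel, blanking=1.03):
    """Approximate link bandwidth needed, with ~3% blanking overhead."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

for hz, bpp in [(120, 24), (144, 24), (96, 30), (120, 30)]:
    need = link_rate_gbps(3840, 2160, hz, bpp)
    fits = "fits" if need <= HBR3_EFFECTIVE_GBPS else "does NOT fit"
    print(f"UHD@{hz}Hz {bpp}bpp needs ~{need:.1f} Gbps -> {fits}")
```

UHD@120Hz/24bpp and UHD@96Hz/30bpp both squeak in under the ~25.9 Gbps effective limit; 144Hz/24bpp and 120Hz/30bpp blow past it, which is why those modes need subsampling or compression.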
Jaxson Richardson
nice strawman faggot
Nolan Smith
In pure specs, you're right, but I'm worried about AMD's on-site support, RMA, and software compatibility. Then again, Lisa Su did these things at IBM for years, so she should know that pure specs aren't all there is to the datacenter.
Bentley Evans
DP 1.4 has lossless compression
Joseph Martinez
Considering the entire focus of Zen lies in dominating the mobile and server markets, my guess is they were preparing for it. And Su is no retard.
Thomas Hall
My 7950's are waiting!
Luis Perry
>not flashing his Tahiti PROs to XT
Thomas Edwards
VESA Display Stream Compression is "visually lossless" 2x-3x compression, but it's optional and nobody uses it yet, either in GPUs or displays.
The new UHD@144Hz displays do 4:2:2 subsampling at their max refresh rate.
Owen Barnes
Wasn't 4:2:2 normal? Isn't 4:4:4 just on expensive as shit 12bit+ displays?
Christopher Ward
No, 4:4:4 is normal unless you're using a UHD@60Hz setup over HDMI 1.4 or similar garbage.
4:2:2 and 4:2:0 both make color text and linework look like complete shit, and I hope you'd notice it if you saw it.
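Toy example of why subsampling wrecks colored text and linework (assumption: plain decimation and pixel-doubling here, not any real codec's filtering): a 1-pixel-wide chroma stroke can disappear entirely.

```python
# Toy 4:2:2-style horizontal chroma subsampling on one channel of one row.
def subsample_422(chroma_row):
    """Keep one chroma sample per 2-pixel group (simple decimation)."""
    return chroma_row[::2]

def reconstruct_422(samples, width):
    """Upsample by pixel doubling, as a cheap scaler might."""
    out = []
    for s in samples:
        out += [s, s]
    return out[:width]

row = [0, 0, 0, 255, 0, 0, 0, 0]   # a crisp 1px-wide color stroke at x=3
rebuilt = reconstruct_422(subsample_422(row), len(row))
print(rebuilt)  # -> [0, 0, 0, 0, 0, 0, 0, 0]: the stroke is gone
```

Luma keeps full resolution so the shape survives as a gray smear, but the color edge itself is destroyed, which is exactly what makes subsampled text look fringed and muddy.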
Benjamin Ward
So there are no 4k 144Hz 4:4:4 displays out or even coming until DP 1.5 or someone adopts VESA Display Stream Compression?
There's no difference between 120 and 144Hz and the former is better for videos so that's fine.
Hunter Robinson
>3 drivers within one month
Raja finally hired some driver guys it seems.
Brayden Wright
It's been what, a fucking year already since AMD had better drivers than novideo.
Benjamin Baker
120 Hz is still a little too low for people with flicker sensitivity when ULMB or whatever strobing is enabled.
144 Hz is slightly better in that regard, but honestly just getting things up to 240 Hz would be good enough for anybody.
Luke Ross
I thought Gsync/AdaptiveSync + ULMB don't work together?
Leo Johnson
Crimson is definitely better than recent Nvidia drivers, but honestly AMD was close to parity in stability and functionality for maybe closer to 2 years.
The new GUI and ReLive were just the final nails in the coffin.
William Reed
For me 120hz is the sweet spot, since videos look good in it and 60fps-capped japanese games look good in it, all without needing to constantly switch my refresh rate.
I've tried ULMB in the form of LightBoost, and to be honest I wasn't impressed. The screen gets way too dim, the input lag is annoying, and I feel no appreciable gain from nuking motion blur.
Dylan Garcia
I was really disappointed when I read that those monitors won't be using DSC, I was really looking forward to them.