What do we know about AMD's upcoming "Vega12" GPUs?

I'm actually curious. I haven't read anything about AMD releasing new GPUs any time soon. But I did catch this:
lists.freedesktop.org/archives/mesa-dev/2018-March/189756.html

AMD just added support for some new line of Vega GPUs named VEGA12. This isn't "Vega 12" in the same sense as "Vega 64": the RX 4xx cards were POLARIS10 and the RX 5xx cards POLARIS12. So we're talking about some kind of new line of Vega cards.
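For reference, here's a rough sketch of how those driver codenames map to retail products. This is compiled from memory for illustration, so treat it as a guess rather than anything authoritative:

```python
# Rough map of AMD driver codenames to retail products. Compiled from
# memory for illustration -- may be incomplete or slightly off.
CHIP_TO_PRODUCT = {
    "POLARIS10": "RX 470 / RX 480 (and RX 570/580 refreshes)",
    "POLARIS11": "RX 460 / RX 560",
    "POLARIS12": "RX 550",
    "VEGA10":    "RX Vega 56 / Vega 64",
    "RAVEN":     "Raven Ridge APUs (Ryzen 3 2200G / Ryzen 5 2400G)",
    "VEGA12":    "???",  # the unknown newcomer this thread is about
}

def lookup(chip: str) -> str:
    """Return the retail product(s) for a driver codename, if known."""
    return CHIP_TO_PRODUCT.get(chip.upper(), "unknown codename")

print(lookup("vega12"))  # ???
```

The point being: VEGA12 is an internal ASIC codename, not a marketing name, so it tells us nothing about what the box will say.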

Any additional details on these anywhere?

Attached: 104510864-GettyImages-657222368-AMD.530x298.jpg (530x298, 35K)


Mobile

They've been very quiet since the flop of Vega Frontier. I wouldn't be surprised if they let Nvidia release their new line and then one-up them, or at least try.

Unlikely. I see those mobile APUs referenced as "CHIP_RAVEN". "CHIP_VEGA12" is clearly something else and it looks like it's something very similar to "CHIP_VEGA10". It appears to be more of a small improvement like RX 480 to RX 580.

I do like this one for VEGA12 and the lines that follow, btw:
+ /* Temporary workaround to prevent VM faults and hangs. */

Yeah, that's basically why I'm asking. There was this huge hype-machine before the Vega56 and Vega64 cards were released (and a huge disappointment when they were). I have seen absolutely nothing in news or anywhere else about new AMD or Vega cards.

>I have seen absolutely nothing in news or anywhere else about new AMD or Vega cards.
Probably because it doesn't matter what they bring out. They will still be out of stock everywhere and no one can use them, thanks to miners. I have the RX480 and it's just not possible to upgrade since everything that came after it is sold out constantly.

Vega64 cards are not out of stock right now locally. They were last week, but this week local stores in my country clearly got some pallets of both Vega64 cards and RX 580s, because suddenly all of them have plenty (well, like 20-50) in stock. And the prices are... about the same for an NVidia 1080ti and a Vega64, and about the same for an RX 580 and an NVidia 1070. They might as well not have them as far as I'm concerned. The Vega64 ain't on par with a 1080ti; it should compete against the 1070. And I consider the RX580 to be on par with the 1060 3GB version.

You're right, it won't matter much what VEGA12 turns out to be if it's out of stock or this insanely overpriced.

I haven't heard a peep from AMD about their GPUs since Vega launched. Not anything substantial anyway. I doubt they have any new designs coming apart from whatever they have planned for 7nm. They could potentially release 12nm refreshes of some ASICs though.

>Unlikely
Vega M

>Vega M
Yes, go on? What is Vega M?

>They could potentially release 12nm refreshes
It could be that simple. Regardless, them putting code into MESA for some unreleased future VEGA12 chip now does point to something being released perhaps 2-3 months from now. Yet nobody's talking.

MESA commits are interesting to me because both Intel and AMD tend to put support in place quite some time before launch.
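To put a number on that, here's a quick sketch of the kind of back-of-the-envelope projection you can do from a driver commit date. The 2-4 month lead time is just a guess based on past Mesa/kernel enablement, not anything official, and the exact commit day below is illustrative:

```python
from datetime import date, timedelta

def projected_launch_window(commit_date, lead_min_days=60, lead_max_days=120):
    """Project a rough launch window from the date driver support lands.

    Assumes the (guessed) pattern that AMD and Intel land open driver
    support roughly 2-4 months before the hardware ships.
    """
    return (commit_date + timedelta(days=lead_min_days),
            commit_date + timedelta(days=lead_max_days))

# The VEGA12 patches hit mesa-dev in March 2018; the day-of-month here
# is a placeholder, not the actual commit date.
early, late = projected_launch_window(date(2018, 3, 19))
print(early, late)  # 2018-05-18 2018-07-17
```

Which lines up with the "2-3 months from now" guess above, for whatever that's worth.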

mid/late 2018
As the APUs show, it scales down well enough; Polaris will probably get replaced.

Since we're on the GPU topic, what's with the ray tracing boom out of nowhere?

It's probably the 7nm Vega compute accelerator / deep learning card with 4 stacks of HBM.

From the looks of it, the rumors of a hardware-level fuck-up seem to be true.

They kept promising tiled rasterization, then they said it's coming later, then they dumped it.

It seems they will need to rework their architecture to function properly.

Right now Vega offers a pretty poor die size/performance ratio, which means there's something wrong with it; it's a giant chip that doesn't perform like a giant chip.

On the other hand, it means AMD has room for improvement and has made a mistake, rather than having hit a brick wall they can't overcome.

It's not out of nowhere. Imagination Technologies has been working on real-time ray tracing accelerators for a very long time. They've been showing off hardware for years. anandtech.com/show/7870/imagination-announces-powervr-wizard-gpu-family-rogue-learns-ray-tracing

A year or so back they produced a 10w discrete ray tracing accelerator and it shit all over one of Nvidia's professional cards. It showed that the era of pure triangle rendering is coming to an end. The software is ready, and now the hardware is appearing. Both AMD and Nvidia are going to struggle to keep up on this front.

>Yes, go on? What is Vega M?
The "M" could stand for "mesktop", or "morkstation", but I could be mrong.

Attached: sat-chan.png (1754x1620, 2.42M)

>Both AMD and Nvidia are going to struggle
Doubt it. There's going to be a transition period when they'll have to cater to both methods before triangles go away.

it stands for Motherboard

The current rasterization methods are a legacy nobody is willing to abandon.

The next step will be MCM GPGPUs with massive compute performance; after that, new standards will start appearing and getting implemented in GPGPU architectures.

A ray tracing ASIC won't take off widely due to compatibility limitations:
>buy an ASIC that can run nothing but these two games on this specific version of Unreal
>a new Unreal version comes out and future games adopt it

There's already a game in early access utilizing real-time volumetric ray tracing. Funnily enough, it's not using a rasterization hybrid or either of the new APIs on PC. It's called Claybook; it runs all the ray tracing and fluid physics on the GPU, and they've even gotten it running on the Xbone so far. Nothing super fancy, but pretty impressive.

Ray tracing is a primitive technology and can be run on any GPGPU that supports OpenCL.

Real-time tracing is basically the same old shit, except it uses the lowest possible sample count, some smart machine guesswork to fill in the gaps, and noise filtering to smooth out the shitty grainy image.

There's no space magic in it.
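That loop -- very few samples per pixel, then filter the grain away -- can be sketched in a few lines. This is a toy illustration (a random visibility estimate plus a box filter), nothing like the learned denoisers real renderers use:

```python
import numpy as np

def render_noisy(h, w, spp=1, seed=0):
    """Estimate per-pixel light visibility with a handful of random
    samples -- the classic source of Monte Carlo grain."""
    rng = np.random.default_rng(seed)
    _, xs = np.mgrid[0:h, 0:w]
    # ground truth: fraction of an area light visible per pixel,
    # modeled here as a smooth left-to-right ramp
    truth = xs / (w - 1)
    # each sample is a Bernoulli trial: light sample visible or not
    samples = rng.random((spp, h, w)) < truth
    return samples.mean(axis=0), truth

def box_denoise(img, k=5):
    """Dumb box-filter 'denoiser'; real ones are far smarter."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

noisy, truth = render_noisy(64, 64, spp=4)
mse = lambda a: float(((a - truth) ** 2).mean())
print(mse(noisy), mse(box_denoise(noisy)))  # filtering cuts the error
```

With 4 samples per pixel the raw estimate is visibly grainy; even this naive filter drops the error by an order of magnitude, which is the whole trick behind "real-time" ray tracing.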

>Since we're on the GPU topic, what's with the ray tracing boom out of nowhere?
It's just a matter of time until GPUs are fast enough. High-end GPUs can probably real-time raytrace at a few FPS now, so they are making hybrid renderers with only small portions traced. Another few GPU generations and you will probably see completely ray traced games. Old hardware and consoles will hold back adoption, though. Things could also change rapidly if large companies hurl money at real-time raytracing.

so, VEGA32/40/48 12nm @ 1800mhz soon?

A denoiser based on deep learning (a recurrent LSTM network):

youtube.com/watch?v=HSmm_vEVs10

research.nvidia.com/publication/interactive-reconstruction-monte-carlo-image-sequences-using-recurrent-denoising

It allows real-time feedback for designers and production movies: using only a few rays still yields high-quality images, and it even works in real time.

Microsoft added a ray tracing API to DX12, called DXR, to make it easy to implement ray tracing in current game engines and apps.

but nvidia cant into dx12

I would consider one if there were GPUs at a reasonable price: 970-level performance in the 100-200 € range, where it belongs. I'm not willing to spend 200+ on a dated shit GPU with faulty RAM; I could need more than 4 GB for 4K.

This will be called VEGA Volcano.

You thought the last generation consumed a lot of electricity. Think again, the VEGA 12 Volcano will consume upwards of 1000watts. It's a new line from AMD, a personal space heater that mines crypto currency as well.

Get your pre-order in boys.

>I could need more than 4 GB for 4K
You will absolutely want to get a GPU with more than 4 GB. I run a 3x4k 27" monitor setup on an RX470 with 8 GB RAM. Needless to say, I can't use that GPU for 3x4k gaming, but it is fine for all kinds of desktop use. I check radeontop from time to time and I typically use between 2 and 3 GB of GPU VRAM, sometimes more than 4. And that's NOT gaming, just regular desktop stuff. 1x4k would obviously use less, but still: if you buy a GPU you'll probably keep it a while, and 4 GB really is the minimum if you ever want to do something like 3x1440p or 3x4k.

As for prices... yeah, GPUs are so overpriced right now it's not even funny, especially on the AMD side of things. I guess your only hope would be that NVidia and possibly AMD release a new generation of cards that are a lot more efficient at mining, and that miners then sell off some older-generation cards. You'd have to be sure you can replace the fans on them, though. Cards used for mining are usually just fine, even though many on Sup Forums like to claim otherwise. But the bearings in the fans on those cards will be worn out.

holy fucking shit this spacing seriously consider suicide

no u

Yes, adults use spaces between paragraphs of text. Feel free to grow up.

I currently have a 960 in my HTPC. It's fine for HEVC playback (the main reason I bought it, and only 60 € in January), but Warframe and Mass Effect 3 still stutter. I just want a cheap card with HDMI 2.0, HEVC and enough VRAM/performance to play older titles in my bed. The 960 sells at 120 € now.

Maybe RX6xx series? I'd like to see how a mid range Vega performs when not overvolted and overclocked to hell.

>One, sometimes two sentences
>"paragraphs"
The second post he quoted does have a paragraph, though; he's the exception.

Looks like AMD is trying to get VEGA12 support included in kernel 4.17.
lists.freedesktop.org/archives/dri-devel/2018-March/170534.html
So new AMD GPUs around July/August?

>amd has shit support on linux
>amd doesnt work with tensorflow
whats the point

Have you been living under a rock for the past 2 years? AMD support on Linux is great now with amdgpu + mesa/radv. And they do support TensorFlow with ROCm.

An AMD dev already denied that those patches have anything to do with Kaby Lake. So no, they aren't for mobile cards. The only logical thing would be RX 5xx successors. Hopefully I manage to grab some before the miners cause a price hike.

AMD is actually better on Linux than Nvidia now. I am actually considering a Pro 9100 because of it.

what are you talking about? amd's linux support is great

If you mean a GTX 960: I had a GTX 960M and I can assure you that it should be able to run Warframe or Mass Effect 3 without any problems. You should look into your driver version or enable/disable VSync.
I could play Battlefield 4 at 60 FPS and even Dying Light without any problems (it only started to show weakness with Battlefield 1, playing at 20 FPS minimum).

>They kept promising tiled rasterization
It already works.
Saves bandwidth, even.

>paragraphs can't be one sentence

Attached: 1509188624739.jpg (221x250, 5K)

Do you even read? This thread is about whatever information there is about AMD's upcoming GPUs. And we can assume there are some, due to the simple fact that support for some VEGA12 chip was added to both MESA and the kernel this week. This indicates that some Vega refresh or replacement for the RX 5xx line-up will be in stores June/July/August or thereabouts. I'm just guessing based on previous git commits vs. releases.

As for support, AMD cards are very well supported on GNU/Linux these days and OpenGL runs faster than it does on Windows with their binary blob driver, especially at higher resolutions. TensorFlow is specifically made for NVidia's proprietary CUDA so that will never work with AMD cards or anything else that uses the free-to-use OpenCL standard.

>TensorFlow is specifically made for NVidia's proprietary CUDA so that will never work with AMD cards or anything else that uses the free-to-use OpenCL standard.
AMD have just released TF port for ROCm.
And it even works!
Not as fast as on NV cards, but it's something already.

I didn't say they can't, but they generally aren't. Besides, these aren't paragraphs even if you consider that.