Ryzen 7 APU spotted

gfxbench.com/compare.jsp?benchmark=gfx40&did1=53156431&os1=Windows&api1=gl&hwtype1=iGPU&hwname1=AMD Radeon(TM) Vega 10 Mobile Graphics&D2=AMD A12-9800E RADEON R7, 12 COMPUTE CORES 4C+8G

Well, goodbye Intel. Tessellation is almost 3x higher than on the A12-9800E, which is otherwise holding its own.

>2700U
>U

That's a low voltage mobile chip.
Vega fucking rocks at low clocks.

Isn't Vega a meme? I'm probably going to buy a 1080ti when they drop.

At the clocks and voltages these mobile chips are running? It's great.
The only thing that's a meme is Intel's non-eDRAM SKUs.

For mobile chips Vega seems to be a good choice. For desktop, eh, not so much. You might be better off with the 1080 Ti if you can afford it.

It's a meme until you undervolt it back to its normal settings (since it's basically Fiji).
Currently Vega runs at 1.2 V, which is way too high; normally it needs to be at 1.0 V. People who have already undervolted it saw big performance gains while reducing power draw to 1070 levels.
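Back-of-envelope on why the undervolt matters so much: a sketch that just plugs the two voltages quoted above (assumed, not measured) into the usual dynamic-power relation P ≈ C·f·V², holding clock and capacitance constant.

```python
# Dynamic power scales roughly with frequency * voltage^2.
# Voltages are the ones claimed in the post; only the ratio matters here.
stock_v = 1.2      # claimed stock Vega voltage (V)
undervolt_v = 1.0  # claimed stable undervolt (V)

power_ratio = (undervolt_v / stock_v) ** 2
print(f"Dynamic power at {undervolt_v} V is ~{power_ratio:.0%} of stock")  # ~69%
print(f"That's a ~{1 - power_ratio:.0%} cut at the same clock")            # ~31%
```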

>R7 2700U
i-is this an 8c/16t APU?? Holy fuck

I'm a brainlet, just realised this is about mobile chips. Standard 1080s will drop soon right?

I wonder how this compares to a standard PS4.

>Standard 1080s will drop soon right?
...what? Have you been living under a rock? The GTX 1000 lineup is over a year old at this point; you can get the 1080/Ti anytime you want.

The PS4's power budget is incomparable, unless mobile Vega is 9 times more efficient than a 1,200-shader Polaris part.
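For a rough sense of the gap, here's a theoretical FP32 sketch. The shader counts and clocks below are commonly cited spec-sheet figures, used purely as assumptions; this says nothing about real-world efficiency.

```python
# Theoretical FP32 throughput: shaders * 2 FLOPs per clock (FMA) * clock.
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

ps4_gpu = tflops(1152, 0.8)   # 18 CUs * 64 lanes @ 800 MHz, console-class power budget
vega10m = tflops(640, 1.3)    # 10 CUs * 64 lanes @ ~1.3 GHz boost, sharing a 15 W APU package

print(f"PS4 GPU:        ~{ps4_gpu:.2f} TFLOPS")
print(f"Vega 10 Mobile: ~{vega10m:.2f} TFLOPS")
```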

these things don't massively drop in price if there's nothing driving the price down
Vega is an expensive dud, so Nvidia will keep milking us

He means the price

If only AMD used the Hovis method for their desktop GPUs too.

Looking forward to getting my Raven Ridge ThinkPad!

Shame. I'm not satisfied with my R9 390, but I want to do a decently large upgrade or not bother; maybe I should just wait for next gen.

Inb4 single channel, eMMC, 6 pounds, $800

Good. Can't wait until some laptops start shipping with the new APUs. With DDR4 memory now standard even in laptops, I can see the AMD iGPUs being monsters. They may even be competitive enough to edge out laptops shipping with dedicated GPUs like the GTX 1050.

Actually, if they use HBM2 memory, there could be a bizarre class of laptops that have no memory on the board whatsoever.

This. I just got a second hand 980Ti because I'm sick of it.

maybe one day
don't think it will happen in mainstream stuff for the next two years
the thought of it makes me rock hard though

Not yet, maybe next year.

Just get the 1080 Ti.
I waited this long for Vega; I've been stuck with a 7950 since it came out.

But in my country the Vega 64 is more fucking expensive than a 1080 Ti, so fuck this shit, I might as well get fleeced by Nvidia.

All the APUs use 2xxx naming instead of 1xxx; it's probably Zen+.

I imagine the Dell guys probably have a boner for this possibility that can't be contained even by a building.

>zen+
it's just fucking current-gen zen with vega graphics

1080ti is like $1400 here still, 1080 I can get for like 790

You don't know that.

>AMD Ryzen 7 2700U with Radeon Vega Graphics
That doesn't mean it has HBM, just that it has a Vega GPU.
As for it being Zen+, please...

Why would it not be Zen+? Give me a good reason.
It's launching not far off Pinnacle Ridge; it's unlikely, but it is possible.

If it were Zen 2, we would have fucking seen a lot of leaks already.

Zen+ will be on a new process node. AMD already said that. This is just regular Zen. It might have a small improvement here or there, but not enough to warrant a new classification.

For some reason AMD and Intel can't get a halfway decent naming scheme to save their lives. Nvidia has it figured out pretty well, it seems. AMD's graphics division may have finally settled on the naming scheme RX [GPU name] [CU count], which is great because it's very straightforward. But then again, they've changed their branding like five times over the past few years, so who can say.

They should have had a different name for the APU line, though, for sure. The reason they've chosen to go to Ryzen 7 2xxx is that some laptops are going to have actual Ryzen 3 and 5 chips in them, not just APUs. By trying to avoid confusion, they're confusing people. Brilliant.

Zen2 is not zen+

Just buy it now. AMD hasn't provided strong enough competition with Vega to make Nvidia drop their prices, and because Pascal cards are selling so well at their current prices, they have no reason to release Volta for the foreseeable future.

There is no Zen+ anymore, unless you think AMD will launch two CPU generations within a single year.

Zen+ referred to Zen's successors, the first of which is Zen 2. So yes, they're the same thing.

Vega is good competition, just not right now.
Don't forget Vega is the only card at the moment fully compliant with both Vulkan and DX12.1; it will probably get a lot of performance gains later on, as usual with AMD.

You think AMD won't launch a bugfix refresh in Q1 2018? Are you dumb?

Yes, I'm dumb.
You know why Threadripper doesn't have any of the previous bugs? It's the B1 stepping; the bugs are already fixed, moron.

The next Zen is Zen 2, already confirmed by AMD countless times; nothing in between from them on the consumer side.

They launched it with the drivers literally half finished. There are core uarch features still not enabled in the drivers, so it's probably going to be another 2-6 weeks before you can know how fast Vega actually is in games.

Basically, Vega is massively front-end bottlenecked by only being able to deal with 4 triangles per clock at the moment, but once primitive shaders are enabled in the drivers Vega should no longer be front-end bottlenecked and the real performance will show.
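To put that cap in numbers, a quick sketch: the legacy GCN path peaks at 4 primitives per clock no matter how many CUs sit behind it, so the front end tops out at roughly 4 x clock. The clocks below are ballpark boost figures, assumed for illustration.

```python
# Peak front-end primitive rate on the legacy GCN path: 4 prims/clock.
PRIMS_PER_CLOCK = 4

def peak_gtris_per_sec(clock_ghz):
    # With clock in GHz, the result is already in billions of triangles per second.
    return PRIMS_PER_CLOCK * clock_ghz

for name, clock_ghz in [("Vega 64 @ ~1.6 GHz", 1.6), ("Vega 56 @ ~1.5 GHz", 1.5)]:
    print(f"{name}: ~{peak_gtris_per_sec(clock_ghz):.1f} Gtris/s front-end peak")
```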

Just to show how much there still is on the table, Vega gained this much performance from undervolt on the second public driver.

He has a point. Usually a bugfix stepping would be in the works, but Zen 2 is close enough that it might not be necessary. Just spend the time ramping up Zen 2 production and getting it more efficient to make.

it sounds too good to be true, if you ask me

Why would they even show Vega in that state? It completely killed the hype. Also I've heard this
>just wait for drivers then amd will be better
meme one time too many

B2 is EPYC-only; it's an SoC/uncore bugfix stepping, not the core.
AMD needs something 'fresh' in Q1 to keep the momentum that Coffee Lake will disrupt.

For fuck's sake, they pushed out band-aids during Bulldozer every year with 10% of the CPU division working on it; they will do the same with Zen.

Most importantly, shareholders and OEMs want a refreshed lineup. This is business.

forum.beyond3d.com/posts/1997699/
>Quick note on primitive shaders from my end: I had a chat with AMD PR a bit ago to clear up the earlier confusion. Primitive shaders are definitely, absolutely, 100% not enabled in any current public drivers.
That is Ryan Smith (the editor-in-chief of AnandTech) confirming on the record, directly from Mike Mantor, that primitive shaders are not enabled yet in Vega drivers.

Because they figured they'd have the drivers feature complete by the time AIB cards launch in September.

just
stop
posting

This meme is generally true.

Great rebuttal.

Because the mining bubble is going to burst soon and they want to sell as many GPUs as possible before then.

I'm not contesting prim shaders not being enabled. I'm saying I have doubts in how meaningful the results of them will even be, or how well they can get it to work.

Yes, AMD will release Zen 2 in mid-2018, and before that they will release Zen+ with bugfixes.
You don't even realise how stupid you sound.

Even realizing half of the claimed benefits in the whitepaper would be enough to free GCN from its inherent front-end geometry bottleneck. Literally half of the claimed benefit would take Vega straight past Pascal in dealing with culled geometry.

We already know how good they are because they work in the pro drivers; it literally shits on the P6000 at 4K with 8x MSAA.

Wow, Vega sure is awesome when it no longer chokes on triangles.
And yes, A475 when?

Zen2 is late 2018 or 2019

Source? Magic and speculation, as always.
You do realise AMD is going for a 6-core CCX, thus avoiding the impact of IF?
Do you even understand what is actually going on?

It all depends on the faggots making the motherboard

In workloads that are heavy on geometry, sure.
It's not going to be magic in games.
It might take it past the 1080 Ti, but I doubt it.

>I have doubts about the feature that fixes the inherent bottleneck of GCN
?
DSBR fixes any bandwidth problems, IWD helps with ALU saturation.
Vega is as good as it gets, hardware-wise.
The software stack surrounding it has to mature.

Source is GloFo's 7nm HVM date.

They're still going to combine two 6-core CCXs.
Yields with Zen were fucking great and they still combined two CCXs.

There's more to Vega than just the new geometry pipeline.
But you can't see any of the other improvements because the front end chokes right now.

Which will always cut corners.
Intel is smart here, they specify a MINIMUM HARDWARE REQUIREMENT for Ultrabooks.
Which I don't think AMD can push with its measly war chest

yeah, because it sounds too good to be true, and like a band-aid on an inherently flawed design approach

Primitive shaders also work on shadows and pretty much anything that passes through the compute path.
Per the whitepaper, culling with those shaders gives almost a 130% jump over Polaris' traditional culling.
On paper this is almost 3x faster than Nvidia.
But given how complex this shit is, I have very little faith in AMD's driver team (unless they got some new minds with the Ryzen money).
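Taking those claimed ratios at face value, a tiny arithmetic sketch: the 4 prims/clock baseline is the legacy GCN figure mentioned earlier in the thread; everything else here is just the quoted claim, not a measurement.

```python
polaris_rate = 4.0   # prims/clock on the legacy GCN path (baseline)
claimed_jump = 1.30  # the "+130%" culling claim quoted above

full_claim = polaris_rate * (1 + claimed_jump)
half_claim = polaris_rate * (1 + claimed_jump / 2)  # the "even half of it" scenario

print(f"Full claim:      ~{full_claim:.1f} culled prims/clock")
print(f"Half the claim:  ~{half_claim:.1f} culled prims/clock")
```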

>HVM date
2H means late 2018, because surely it's pretty convenient to pick the later end so your statement gets some validation.

Wow, a new, efficient shader path that can handle geometry processing sure sounds like a band-aid!

it really fucking does
Nvidia doesn't seem to need this voodoo nonsense

Yes, nVidia uses the other voodoo nonsense, Polymorph Engines.
Please stop being retarded.

It's almost as if AMD uses DSBR and Nvidia uses TBR.

sure looks like magic

alright, pal
prim shaders will be the second coming of christ
enjoy your delusion

It means later rather than sooner, else they'd specify a quarter.

Also you need to build up stock.

Yes it is, because it needs a high level of parallelism in hardware.
vulkan.lunarg.com/doc/view/1.0.30.0/linux/vkspec.chunked/ch19s02.html
There is a reason why Nvidia would choke to death if they used it, just as was the case with async.

>Nvidia doesn't seem to need this voodoo nonsense
Is that a joke?

>no arguments
HAHAHAHAHAHA
That's a usual per-primitive shader.
You just invoke the vertex shader for each primitive in the batch.

>Nvidia doesn't seem to need this voodoo nonsense
>need this voodoo nonsense
>voodoo nonsense
>voodoo

Why did you have to remind me...

Let's just put it in a very simplistic way:

Everything that passes through the compute path on AMD is going to get the shader treatment, and literally most of the games being developed for 2018 will have compute shaders in one way or another.

Oh yes, NVIDIA is making the same mistakes SGI and 3dfx did.
I very much hope Jensen is sane enough to not do that.

You mean trying to lock the industry onto an API that is more closed down than a virgin pussy?
Or creating a module for G-Sync while FreeSync 2 already has a response time of 0.3 ms without a module?
Hmmm, literally sounds like shit 3dfx would have done.

Yeah, or shit SGI did.
I seriously hope Jensen regains his sanity within two years.
Because the market will tolerate proprietary walled-garden bullshit for only SO long.

> Raven Ridge graphics
> matching a 1050
calm down user... calm down

Vega 56, undervolted and OC'd, is at 1080 levels.
Both SKUs look to be bottlenecked by a feature that isn't in the drivers yet and was confirmed not to be.

Something is going to happen with Vega and it's going to be fun to watch... Better than a 1080 Ti? No idea, but performance above a 1080 on average from both SKUs? Most likely.

Maybe a bit higher than 1030.

Should be close to a 1030. I wonder when AMD will make a 300 mm² APU with 220 mm² dedicated to the iGPU.

Now that would be a midrange killer.

Won't help as long as it's using DDR4 for memory.
Though you could make a premium APU with one 4-Hi or 8-Hi HBM2 stack.
Would be nice.
REALLY nice.
Like a market in itself.
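Rough bandwidth math on why that would matter: the bus widths and transfer rates below are typical published figures, assumed here for illustration.

```python
# Dual-channel DDR4-2400: 2 channels * 64 bits * 2400 MT/s, divided by 8 bits/byte.
ddr4_gbs = 2 * 64 * 2400e6 / 8 / 1e9    # ~38.4 GB/s

# One HBM2 stack: 1024-bit bus at 2.0 Gb/s per pin.
hbm2_gbs = 1024 * 2.0e9 / 8 / 1e9       # ~256 GB/s

print(f"Dual-channel DDR4-2400: ~{ddr4_gbs:.0f} GB/s")
print(f"Single HBM2 stack:      ~{hbm2_gbs:.0f} GB/s ({hbm2_gbs / ddr4_gbs:.1f}x)")
```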

It seems like Vega is massively bottlenecked by not having the primitive shaders, to the point that both Vega 56 and 64 perform about equal even though one has around 15% of the die cut down.

Once the bottleneck is alleviated, both 56 and 64 will likely jump ahead. Will the 64 beat out a 1080 Ti? Fuck knows, but it will shit all over a 1080, while the 56 shits on it harder, you know, being significantly cheaper (at MSRP).

I'm not expecting miracles like a 30-40% jump, but I am expecting enough bottleneck alleviation that the card doesn't just look like an overclocked Fury.
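For context on the 56 vs 64 point, a small on-paper sketch: shader counts are the published specs, clocks are reference boost figures used as assumptions. On paper the 64 has ~14% more shaders and around 20% more compute, so if they land at the same framerate today, neither is ALU-bound and the shared front end is the obvious suspect.

```python
# On-paper FP32 difference between the two Vega SKUs.
cards = {
    "Vega 56": {"shaders": 3584, "clock_ghz": 1.47},  # reference boost, assumed
    "Vega 64": {"shaders": 4096, "clock_ghz": 1.55},  # reference boost, assumed
}

def tflops(c):
    return c["shaders"] * 2 * c["clock_ghz"] / 1000  # shaders * 2 FLOPs (FMA) * clock

t56, t64 = tflops(cards["Vega 56"]), tflops(cards["Vega 64"])
print(f"Vega 56: ~{t56:.1f} TFLOPS on paper")
print(f"Vega 64: ~{t64:.1f} TFLOPS on paper ({t64 / t56 - 1:.0%} more)")
```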

>I'm not expecting miracles like a 30-40% jump
But that's what prim shaders are all about.
GCN has an inherent bottleneck of only 4 front ends and no additional ways to chew geometry.
Prim shaders fix it.

I refuse to expect a massive fuck off jump, that just isn't happening but I do expect to see a jump.

You see, Maxwell was a big fuckoff jump because Fermi-likes have no problems chewing geometry.
Vega introduces both tiling and new wacky ways to chew lotsa triangles.
Everything combined results in a very nice jump.

>RAM encryption in a laptop
>Insanely power efficient
>(Possibly HBM powered) APU

Just a 10% faster 56 would be a really good GPU.

Assuming no fuckoff prices(hahaha)

Last Crimson update boosted PUBG scores by like 18% on Vega...

Tells you everything about drivers

Or PUBG is just that shittily coded.

Well, it's early-access trash, and it's on fucking UE4 of all things.
Of course it's spaghetti code incarnate.

UE4 doesn't historically play nice with AMD for whatever reason.

Because Sweeney sucks Jensen's cock for breakfast.
Like, really.
But anyway, it's going to change soon since Sweeney fell for AMD's tricks and appeared at their keynote.

I assume $430-450 will be the norm when the dust settles and aftermarket cards hit, and those cards will likely all be undervolted overclocks if the board partners are worth anything, so even at stock they'll be about equal to a 1080.

>yes but you see Vegas TRUE performance has yet to be realized, prim shaders, whitepapers...
>it will get better soon, Just Wait
Yes but why the fuck wasn't it RELEASED in a working state? Another disastrous shitty product launch of unnecessary controversy, rumor mongering and confusion, and spicy memes. Most people aren't going to buy into this when the first thing they see is that it fucking sucks.

Good thing Ryzen is making money. The big Chinese companies are all buying into EPYC.

You'd be looking at Xmas for fully working drivers.

The performance was enough to warrant a launch.
Any further improvements provided by NGG pipeline are nice bonuses for (You), dear customer.
t. Raja Koduri