VEGA is DOA

>show Zen
>...but with an NVIDIA Titan X (Pascal)

The CPU looks nice but the GPU will be shit.

All consoles will be AMD

You mad bro?

>consoles use Vega

We've seen their DOOM demo and AotS results leaked, Vega seems to be around GTX 1080 performance level. That's not too shabby, but pretty underwhelming for a product coming out in Q1 2017, so basically a year after the 1080.

I can't stop laughing about that

youtube.com/watch?v=4DEfj2MRLtA&t=39m0s

@39min

>Vega seems to be around GTX 1080 performance level

That was on debug drivers

Why would they want to risk a driver crash on stage by showing an incomplete (and crippled) product?

They will use low-power derivatives

They didn't want to use an AMD GPU with an AMD CPU, so as to avoid people thinking it was rigged, idiot.

>That was on debug drivers
Yeah, you're right, I'm sure AMD wanted their card to look underwhelming. Take a step back and realize that they made a conscious decision to show that DOOM demo with the FPS counter up on the screen. Why would they do this if the card were underperforming significantly? They probably aren't going to be squeezing much more out of it; if anything it should be a bit faster than a 1080, since the Vulkan render path heavily favors AMD. So if Vega is on average as fast as a 1080, it should pull ahead in DOOM.

Literally what?

It did over 60fps at 4K. It'll be fine.

Vega may match the GTX 1080 but will never touch the Titan X

jesus christ

Fuck me, GPUs still can't handle WQHD at a decent framerate?

I was going to buy a 1080 to use with my PG279Q, but it hardly seems worth it now. 1080 Ti when?

>Vega seems to be around GTX 1080 performance level. That's not too shabby,

It's a complete failure considering Vega 10 is a 450mm2 die vs GP104 at 300mm2; that's 1.5x the silicon for roughly the same performance.

Just wait, it'll be good, you'll see.

JUST

Or maybe AMD's processor division wanted to demo their CPU on the fastest stable hardware and wanted to avoid using products still under development. Plus, they're cozying up to Nvidia for the Christmas season.

Not him, but if AMD had used an AMD GPU in the rigs while comparing CPUs, then Intel fanboys could cry foul and say that AMD gimped performance by introducing some sort of CPU bottleneck in the GPU drivers when Intel CPUs are detected.

By using an Nvidia GPU, that is one less way in which they could have skewed the results to make their CPU look better. It's as apples to apples as you can get.

This may or may not have been why they did it. They could have just done it for Nvidia's shekels. Who knows. If impartiality is why they did it, THIS is the reason why.

HOW THE FUCK CAN IT STILL BE UNDER DEVELOPMENT?

THE RX480 IS LIKE FUCKING 6 MONTHS OLD WHERE IS THE 490 REEEEEEEEEE

cont. Also, when comparing CPUs in video games, you want to use the absolute best GPU you can in order to push the CPU to its limit, making a comparison possible. Also, you want to use a GPU that is already on the market, so the difference between the CPUs means something. Had they used an in-development 490, the results would mean LITERALLY nothing.

As easy as it is to meme AMD over this, it was the scientifically minded choice to use a Titan.

Even a Titan XP can't max games at 4K to the point where the CPU has a meaningful effect on performance. If they weren't trying to misrepresent their uarch's perf, they would have been testing at 1080p or 720p, which is what every credible reviewer does when comparing CPU performance in >muh gaymes.
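
To put numbers on that (a toy sketch with made-up frame times, not real benchmark data): fps is capped by whichever of the CPU or GPU takes longer per frame, so at 4K the GPU term swamps any CPU difference.

# Toy bottleneck model: fps is capped by the slower of CPU and GPU per frame.
# All frame times below are invented for illustration, not real benchmarks.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpus = {"fast CPU": 4.0, "slow CPU": 8.0}           # ms of CPU work per frame
gpu_load = {"720p": 3.0, "1080p": 6.0, "4K": 24.0}  # ms of GPU work per frame

for res, g in gpu_load.items():
    for name, c in cpus.items():
        print(f"{res:>5} | {name}: {fps(c, g):6.1f} fps")
# At 720p the two CPUs differ by 2x (250 vs 125 fps); at 4K both land
# at ~41.7 fps, because the GPU bottleneck hides the CPU entirely.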

>DOOM and AotS
So the two most AMD-favorable games on the market?

I think they wanted to show that Zen can be used with other configurations, not just AMD hardware.

Plus, AMD's Radeon Technologies Group is "different" from the AMD CPU division... did you see how AMD Radeon partnered up with Intel for a holiday sale?

Now you've got the CPU division showing Nvidia stuff.

AMD is smart and is playing the "if we can't outright beat them, let's work with them!" angle.

Also, Intel is going to be using AMD iGPUs in their CPUs.

Don't be surprised if you see an Nvidia/AMD Zen partnership coming up.

It's not AMD GPUs; it's a licence from AMD that allows Intel to keep using their own GPUs in Intel processors.

Isn't the Nintendo Switch using Nvidia's Shield GPU or whatever?

>Fuck me, GPUs still can't handle WQHD at a decent framerate?
It's just Watch Dogs 2. I got it for free with my overclocked 1070, and it runs like shit and looks like shit compared to GTA V, which I play at 2560x1440 with everything on max (2xMSAA + FXAA). Pretty embarrassing that a game that originally launched three years ago still looks better on PC today than the latest high-budget garbage Ubisoft shits out.

The Nintendo Switch isn't a console, no matter how they try to spin it.

It's literally a tablet with detachable controllers; just search "tablet controllers" and you'll see plenty.

Yes, the Nintendo Switch uses an Nvidia Tegra chip, the same line that's in the Nvidia Shield tablet, which further backs up that the Switch is just a tablet.

It makes sense for Nintendo to go with the Tegra chip because it was already proven in the Shield tablet/TV box.

Um... sources already say they were using a modified Fiji driver. They're using it because Vega isn't completely finished and validated yet. Not precisely sure why AMD is releasing this so late in the game. Might as well double down on the mainstream market with Vega.

At $400, 1080 performance is not underwhelming.

You do realize the 1080's price will go down as a result, right? It's not like they can't compete.

Last generation consoles are all AMD and 100% flawless.

Is AMD surviving, or are they making superior products compared to the competition?

amd cuckoldry personified

The chip in the Switch is a new Tegra chip, not the current one used in the Shield.

I know; it's still a Tegra chip though.

Your days are numbered, fucko.
What will you do when Nvidia stops paying you to shill on forums? Degenerate sociopath.

Damn I love this argument

AMD never stated a price, but that's probably the most they could charge if they want to shake up both the 1070 and the 1080.

If that's true then I should probably return the 1070 I bought last week

AMD master race, only stupid goyim pay for trash like Intel.

>$1000 video card for gaymz

Does anyone even know someone who actually purchased a Titan X card?

Didn't think so.

Probably the trust fund babby that bought Extreme Edition Intel chips.

The future 1080 Ti...

1080 Ti
>800+ dollars
>Only 200 dollars short of the Titan XP

Lmao we're not even a day removed from New Horizon and already Intel/Nvidia fancucks are raging on Sup Forums.

I think Vega is going to be a great card. Jewvidia needs competition so prices stay reasonable and there are actual gains in performance from one generation to the next. (Look at the CPU side, for example: there's absolutely no reason to sink money into Intel's Kaby Lake if you're already sporting Skylake, or fuck, even Haswell, unless you absolutely need "muh tickboxes".)

Came here to say this. Using 1080s isolates the CPUs and heads off cries of foul play.

If they're so worried about it crashing and burning, why not pre-record a test clip with some Kill-A-Watts like they did with Polaris?

You're right to point this out, though I hope no one thought Nintendo would be dumb enough to use a years-old GPU.

I mean, I hope they (Nintendo) aren't that dumb...

Drop 250 to 300 dollars off a 1080? Holy shit, talk about a dream come true after all that greed. Make it happen.

DELET

Fucking Ubisoft, goddammit.

Did you miss the real conference?

> I'm going to ignore that they did a small demo showing Zen and an RX 490 running
>I'm going to ignore that it was Battlefront at 4K maxed out, getting 60fps
>I'm going to ignore that a GTX 1080 can barely keep 50fps at 4K at those settings.

BTFO

My 290X performed worse than a 780 Ti at the time of its launch. Now look at how it compares to the 780 Ti today; it's not even close.

Might just turn out to be a better investment in the long run.

They did it with a Pascal Titan X to remove any possible GPU bottleneck, so the results would be strictly based on the CPU.

Titan X isn't a consumer card.

Yes it is, it's a gaming card. NVIDIA stopped pushing the Titan as a prosumer/budget workstation card after the original Titan. The Titan X can't even do full-rate double precision.

>with everything on max (2xMSAA + FXAA)

Why do people keep doing this? If you're not using the highest setting, you don't have 'everything on max'.

8x MSAA would be maxed out.

This is the attitude that brings us "my two-generations-old midrange GPU still handles everything at max on 1080p" posts where half of the performance-intensive options are actually set to medium.

Consumer cards aren't priced at $1K+, and neither are consumer CPUs.

AA is not a graphics option.
It doesn't increase graphics quality directly:
it doesn't make more shadows or sharper textures.
All it does is soften the picture (read: worsen perceived resolution) as a trade-off for no jaggies.

They're called 'enthusiast'. Don't try to imply that it's not a consumer product, because it sure as hell isn't an enterprise/workstation product in any way.

>nvidia apologist

??

geforce.com/hardware/10series/titan-x-pascal

"LATEST GAMING TECHNOLOGY"

You're just an idiot who doesn't know what he's fucking talking about. I have, and always will have, AMD GPUs because they aren't locked-down garbage.

My point is: the enthusiast market is overpriced.
Margins are worse than the audiophile market at this point.

You went from 'it's not a consumer card if it's $1K+' to 'it's overpriced'.

Why would they show their hand this early? If they had already shown full performance, Nvidia could fine-tune the 1080 Ti to respond. Better to leave Nvidia in the dark.

If the first samples out of the factory beat the 1080 before driver optimization, then the final product will definitely be in Titan X territory.

Isn't it a weak-ass chip graphically? How are they gonna compete with consoles pushing 4K pixels? It's 2016, ffs, nobody cares about gameplay.

>flawless
Around the same graphical fidelity as my GTX 750 Ti in most titles

Try buying a PC with a 750 Ti for $250 that comes with a bundled game and controller.

The reason AMD rules the console market is APUs. So much cheaper than two separate chips.

Nvidia has their Tegra SoCs and Intel has their mobile i7s with Iris Pro, and then there's the older Nintendo fuckfest of a PowerPC CPU and an AMD GPU on an interposer.
AMD rules the console market because they had a better offering than the competition.

>...but with an NVIDIA Titan X (Pascal)

I still can't fucking believe they did this.

Is AMD just giving up? We have monopoly laws; why doesn't the gov't step in and do something about this?

>not the guy you talked to

>With Nvid
I don't know which reaction is more accurate

...

>just wait until VEGA to match the Titan

MSAA doesn't count as a setting, you retarded piece of shit

Why not set 8x DSR in the Nvidia control panel for all of your benchmarks?

>Bundled game
TF2

Real talk, does any AMD GPU even touch the 980 Ti yet? And Nvidia has released three cards a tier higher than that since then? And AMD just made another card in the same tier as the 290X again?

What is this samefag laughing about?
Also, they obviously didn't enable the QuickSync encoder on the Intel CPU; it does 1080p transcoding very well.

Oh shit, I got it.

AMD cards that beat the 980 Ti:

The 295X2 still can.
The Fury X2 can in DX12.

But that's it.
This is why we need Vega now, so it forces Nvidia to get off its arse.
52 min in.
That 4K gaming.

Sold, I'm buying Vega.
I like being a snowflake.

I hope it will work with my 6700K.

>The 295X2 still can
At the cost of awful frame times, high latency, and garbage 1% lows.
Average fps is meaningless on its own.

Because surely you want a product that isn't finished yet to showcase another product that isn't finished yet, on four-month-old drivers...
People still didn't learn from Polaris...

Nice try, retard.

If they had used AMD GPUs, they would have been accused of rigging the benchmarks.

this

The Titan X is currently the highest-performing consumer GPU on the market. People know what it can push and what numbers to expect from it. So introducing a new CPU with unknown performance means you can work out where it sits relative to the market competitors, based on the framerate and the one known factor, which in this case is the GPU.

Had they shown Vega, there would be two unknowns: the CPU and the GPU. We wouldn't know if the CPU was struggling to run the game, or if the GPU simply didn't have enough kick.

Why people need this explained to them I'll never understa- Oh. Yeah, everyone here is from Sup Forums, I remember now.
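
To make the "one unknown" logic concrete (a toy sketch; the 120 fps GPU ceiling and 90 fps measurement are invented numbers, not from the demo): once the GPU's ceiling in a title is known, any measured fps below it tells you roughly how much time the unknown CPU needs per frame.

# Toy sketch: with a known GPU ceiling, back out the unknown CPU's frame time.
# Both numbers below are invented for illustration.
GPU_CEILING_FPS = 120.0   # known: what this GPU does when the CPU is no limit
measured_fps = 90.0       # observed with the unknown CPU in the rig

if measured_fps < GPU_CEILING_FPS:
    cpu_ms = 1000.0 / measured_fps  # the CPU is the bottleneck
    print(f"CPU-bound: ~{cpu_ms:.1f} ms of CPU work per frame")
else:
    print("GPU-bound: the CPU is at least fast enough to saturate the GPU")
# With two unknowns (new CPU *and* new GPU) the 90 fps couldn't be
# attributed to either part, which is exactly the point above.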

>amd use amd gpu for cpu comparison
hurr biased bullshit
>amd use nvidia gpu for cpu comparison
lol amd's gpu sucks

HOW ABOUT YOU ACTUALLY SHOW THE FRAMERATE THEN YOU FUCKING PAJEET?!

I've been using an AMD CPU with an Nvidia GPU for years now; it's the true patrician way. GPU poorfags spewing their shill-crafted videos made by some Raja-praying Indian should be permabanned.

They showed a gimped Vega with only 3,500 shaders and half of its memory bus disabled, running on Fiji 1.2 drivers with a fucking debugging layer added on top, and it beat a GTX 1080 overclocked to 1900MHz at 4K resolution.

That is fucking insane. Even minus the boost from Vulkan, when this card is running regular DX11/12 it's gonna be a monster.

Nvidia is about to get its shit slapped hard.

Lol, when will you just admit AMD is shit, m8?

So you went from the official "it's ~10% below a 1080 on debug drivers" to "it beat a 1080 in X" and try to hype it up... Just wait for benchmarks, faggot. If they hold up to what AMD (not some random clickbait site like currytek) tells us, I'm happy; if not, I just stay with my old GPU and it's fine.

Zen just beat Broadwell on an inferior process, and Vega just stomped the GTX 1080 with shit-tier Fiji drivers and a gimped core and memory bus.

AMD is about to bend you over and rape you

le reddit fags do

>Also, they obviously didn't enable the QuickSync encoder on the Intel CPU; it does 1080p transcoding very well.
The point was to compare CPU performance, not QuickSync/VCE performance; both rigs were running x264 for software encoding.
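
If you want to see the difference yourself, here's a rough sketch (assuming an ffmpeg build with libx264 and Intel's h264_qsv encoder on PATH; "input.mp4" is a placeholder clip): software x264 loads the CPU cores, while QuickSync offloads to the fixed-function block.

# Rough sketch: software (x264) vs QuickSync hardware encoding via ffmpeg.
# Assumes ffmpeg was built with libx264 and h264_qsv; input.mp4 is a placeholder.
import subprocess, time

def encode(codec, outfile):
    start = time.time()
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.mp4", "-c:v", codec, outfile],
        check=True, capture_output=True,
    )
    return time.time() - start

print(f"x264 (CPU):       {encode('libx264', 'out_sw.mp4'):.1f} s")
print(f"QuickSync (ASIC): {encode('h264_qsv', 'out_qsv.mp4'):.1f} s")
# The demo ran software x264 on both rigs precisely so the fixed-function
# encoders (QuickSync/VCE) stayed out of the comparison.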

NO NO NO THIS CAN'T BE REAL

Does that somehow make it not shit?

I'd agree with you if you were talking about the first Titan with its heavy FP64, but the Pascal Titan X is a real gayming card.

290X2, Fury Pro Duo

The Fury X trades blows with the 980 Ti in some games.

>Real talk, does any AMD GPU even touch the 980 Ti yet?

The 1070, 1080, and Titan X are all faster than the 980 Ti.

>believing amd's ministry of truth

The AMD fanboy delusion has reached a new level. Can't wait to see Zen performing only on par with Sandy Vagina and AMD fanboys damage-controlling with 'this is the performance AMD was targeting all along!'