THANK YOU BASED NVIDIA

...

so that is the power of volta woah

>AMD's most powerful card in 7th place

guru3d.com/news-story/nvidia-titan-v-graphics-card-benchmarks.html

Where are the games?

what the fuck is unigine superposition and why the fuck should i care?

reddit.com/r/nvidia/comments/7ikhyp/nvidia_titan_v_fire_strike_benchmarks_oc_non_oc/dqzfe8p/

WHY THE FUCK IS NVIDIA THE ONLY COMPANY THAT IS ABLE TO PRODUCE SOMETHING LIKE THAT????

Because 800mm^2 of silicon.

Because it's xbox hueg

And it only draws 250 watts while Vega is a housefire

Is firestrike a game?

Apparently Linus bought one. We will see benchmarks in a week or so.

>B-but it's not for gaming!!!
Why do retards think professional GPUs can't play games? Sure the price/performance is bad for gaming, but that doesn't mean it can't be indicative of future gaming performance. People said this nonstop with Vega FE and it was retarded then too.

Does this mean VR gaming is finally feasible?

>3000 dollaridoos

benchmark.unigine.com/leaderboards/superposition/1.0/1080p-extreme/single-gpu/page-1

God, this shit is fast, even the 3 GHz Evga 1080ti is slower

Who was talking about vega user?
Anyway my point is very rarely does a company have the resources and brand recognition to be able to pull something like V100 off. It's like the original Titan was them just testing the waters of what they could get away with, and now they're just going to run with it. Reticle-limit GPUs every year for everyone; an 1180 Ti at $1180 is a bargain compared to the Titan V.

No, 1180 Ti/2080 Ti will be $699 because it will not use GV100, it will be GV102 or GA102, depending on whether it's Volta or Ampere

Would be nice if Nvidia could make an 800mm^2 die with only FP32 cores.

That would be a monster gaming GPU

You can pray for that or you can pray for AMD to glue 4x 240mm^2 dies with Navi or whatever it is after that.

>says he's happy AMD is competitive and that Nvidia was allowed to do shit like Titans for far too long
>buys a 3000 smackaroo Titan
I get that having a review up for the "most powerful gpu" in time will make their money... Hold on a second, is Nvidia trading failed Teslas for Google's money via techtuber proxy?

Because it relies on a DirectX implementation that is over a decade old.

That will cost you a lot.

It correlates with increase in FP32.
So, where's the nVidia magic?

Nvidia has no competition. They can charge whatever the fuck they like and gamers will swallows the Jewish sperm like the good goy they are.

That's great!
It's SGI 2.0 now.
Hopefully this cancerous tumor will die the way of SGI 1.0.

With the whole new gluing meme, I could actually see AMD doing Polaris 11-sized FP32-only chips and sticking them together for gaymen-only cards, doing the same for FP16 accelerators, and general-purpose 3D professional cards could have a healthy mix of all.

They won't use small chips for MCM, that's pointless given the Vega design.
But something ~350mm^2 (including SerDes die overhead) will do.

I was laughing when people said vega had a chance
kek

ATI/AMD HAVE BEEN DOGSHIT AFTER THE BATMOBILE

P O O R
V O L T A

Are Rise of the Tomb Raider and Gears of War 4 not games?

forums.anandtech.com/threads/nvidia-volta-rumor-thread.2499125/page-15#post-39207683

>Well, it's SGI 2.0 for a reason.

Bondrewd, fuck off back to whatever shithole you came from

You do understand that nVidia's founders are ex-SGI?

en.wikipedia.org/wiki/Jensen_Huang

en.wikipedia.org/wiki/Chris_Malachowsky

en.wikipedia.org/wiki/Curtis_Priem

You don't even know what you are shitposting about, none of them have worked for SGI

Bondrewd, fuck off back to your shithole

>Bondrewd
I hate that meme spewing little shit

Its like a GTX 1080 Ti Ti.
Call it the Big TiTi.

I'm hoping the standard run temp isn't so close to the thermal threshold.

>DX11

Just found I have $4000 worth of bitcoins, should I get one of these bad boys?

yes, and an AIO liquid cooler conversion kit for it

Cool. Will probably pick up a 4k monitor too

>7680x4320
>high settings
>30fps MINIMUM

This isn't even their gaming card. What the unholy fuck. AMD should just liquidate their GPU division.

87°C is not bad for a video card on air while benching Superposition; most cards above 150W fall into the 75-95°C range on air

wait for the monitor, 4k 120hz will launch at CES in jan.

>4k 120hz
That's what I want my dude, thanks for the heads up

Assblasted pajeet detected.

I don't even understand why AMD bothers to "compete" in that market. They get btfo with every gen.

For good boy points?

Source on monitor?

not for gaming

>tftcentral.co.uk/news_archive/38.htm#auo_roadmap_oct17

Asus and Acer showed off 4k+HDR+Gsync monitor last year at CES, but delayed the launch. The panels come from AUO who said production should start in November.

poor vega

At least the poo in loo is gone.

I'm starting to think Raja sabotaged AMD. How can someone shit up the GPU division so badly since his arrival back in 2013? He couldn't fuck up the 290X, which was competitive, but fucked up all the cards afterwards.

He most likely did it because he wanted to devalue the division so they would sell it off to Intel. But that failed so he just decided to move there himself instead.

He left because RTG is literally designated shitting division comprised of mainland chink """"engineers"""and pajeet driver team.

That's enough to generate console and iGPU IP.

If anything the driver team is the only good part of RTG now, meanwhile Nvidia's drivers have been shitting bricks for the last few years.

Completely different from what happened a decade ago with ATi, good hardware, shit drivers.

A driver team that can't deliver advertised features in software for a new uArch can't be good.

Very interesting. I don't think AMD will pull a Ryzen in the GPU division for a while now; they should just focus on the low-end market since Nvidia is on a roll with their architecture. Maybe they have a deal with the ayys? Who knows?

That's likely what the customer GV102 is going to perform like.

They just remove some blocks (smaller dies = easier to fab) and increase clockspeed to compensate (less overclocking headroom).
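The "smaller dies = easier to fab" point follows from basic defect statistics. A minimal sketch using the classic Poisson yield model; the defect density and the cut-down die area below are illustrative assumptions, not foundry numbers:

```python
import math

# Naive Poisson yield model: yield = exp(-defect_density * die_area).
# DEFECT_DENSITY and the 470mm^2 figure are assumptions for illustration.
DEFECT_DENSITY = 0.002  # defects per mm^2 (assumed)

def die_yield(area_mm2: float) -> float:
    """Fraction of dies expected to come out defect-free."""
    return math.exp(-DEFECT_DENSITY * area_mm2)

big = die_yield(815)    # full GV100-class die
small = die_yield(470)  # assumed cut-down gaming die

print(f"815 mm^2 yield: {big:.1%}")
print(f"470 mm^2 yield: {small:.1%}")
```

Under these assumed numbers the smaller die yields roughly twice as often, before even counting that more of them fit on a wafer, which is the whole economic case for not selling the full reticle-limit chip to gamers.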

Volta is mostly about general compute, not silly gayming. Nvidia wants a slice of the crypto-currency market. I wouldn't be too shocked if Volta SKUs became the new power/performance sweet spot for mining.

I'm talking about stability; that's a completely different thing from having too little manpower and too much workload.

Performance per dollar on this is ridiculous, that $3k card isn't 3 times faster than the $1k card.
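The perf-per-dollar complaint is easy to quantify. The scores below are placeholder assumptions for illustration; only the rough $3k-vs-$1k price points come from the thread:

```python
# Back-of-envelope perf per dollar. Scores are normalized placeholders
# (the cheaper card is assumed ~80% as fast), not measured benchmarks.
cards = {
    "Titan V ($3k)": {"price": 2999, "score": 100.0},
    "$1k card":      {"price": 999,  "score": 80.0},
}

for name, c in cards.items():
    c["perf_per_dollar"] = c["score"] / c["price"]
    print(f"{name}: {c['perf_per_dollar']:.4f} score/$")
```

Even with the generous assumption above, the $1k card delivers well over twice the score per dollar; the Titan V would need to be ~3x faster just to break even on value.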

And $1k is pretty darn expensive for a GPU. Keep in mind that the XBox One X is half of that and that's a complete NEET system.

The only good thing about this card is that it gives an indication of how the mid-range and high-end cards will perform.

Too bad nvidia's mid-range GPUs are now priced like high-end GPUs due to no real competition from AMD.

That's a very sweet cherry you picked there

It's merely a question of AMD suddenly being eager to throw a fuckton of money at the GPU division.
The question is why would they do that?

Firestrike performance is shit and Firestrike is shit.
Timespy only 20% better while having more OC headroom than Pascal, what?

That huge a difference between 8k and FullHD on superposition? Certainly not due to VRAM or bandwidth, the TitanV doesn't have that much more.
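For reference, the pixel-count arithmetic alone accounts for a big 8K-vs-FullHD gap; nothing here is assumed beyond the resolutions themselves:

```python
# "8K" UHD is 7680x4320; Full HD is 1920x1080.
pixels_8k = 7680 * 4320    # 33,177,600 pixels per frame
pixels_fhd = 1920 * 1080   # 2,073,600 pixels per frame

ratio = pixels_8k / pixels_fhd
print(ratio)  # 16.0 -- 8K shades 16x as many pixels per frame
```

So a 16x shading workload is baked in before VRAM capacity or memory bandwidth even enter the picture.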

>Daily reminder Nvidia's absolute market domination would not have been possible without these guys

No, it wouldn't have been possible without their marketing department.
nVidia can probably sell literal shit for $499 a pop and people will buy it.

but their literal shit is still better than all the other shits

It doesn't matter.
Their marketing department can make people eat shit and ask for more.
They can comfortably make smaller heaps of shit pricier each gen, even.
56% gross margin is no fucking joke.

If I wrote a memory-dependent application, which AI with linking can be, Instinct kicks this thing's ass.

well said, fellow radeon rebellion comrade
let's make some noise

Dependent on AMD memory... "nVidia"

>"""people"""
Is that what Razer kids call themselves these days? Funny.

The memory is made by Samsung.

>The absolute state of AYYMDPOORFAGS

HBM2 is a JEDEC standard, it was not designed by AMD

Joe Macri is very much an AMD fellow.

>1 person did everything

Kill yourself, Bondrewd

Joe Macri and his team designed and specced HBM.
Stop denying reality.

>DirectX 11
Gee, I wonder why they chose it?

how so? the GeForce 256 came out before 3dfx shat the bed with the V5

Because they are not stacked on top of themselves.

AMD didn't design shit

It's a JEDEC standard that other companies like Samsung & SK Hynix did the real work

If it was not a JEDEC standard, no one would have adopted it

Stop being an AYYMD asslicker, they did nothing

Don't be a faggots, we raise up $20 000 split it in half, one team build nVidia chess robot, another one build AMD chess robot. We'll see how much they compete with each other.

So those are the guys who did all the biased benchmarks?

Nvidia backed HMC. JEDEC chose HBM instead.

And back to the cards: Crossfire RX 580 is still cheaper than a single 1080 Ti...

You just want to justify money wrongly spent.

Actually there is an idea of stacking cores on top of each other, cooled by graphene nanotubes, patented to Sup Forums now.

>AMD didn't design shit
>except when they did and submitted it to JEDEC
HOL UP
Where's HMC?

Crossfire and SLI are literally dead now outside of compute.

Compute does not use CF/SLI.

You are absolutely right, they should quit this market and leave it all to nVidia to have for themselves.

the powah of volta!
3dmark.com/3dm/23849552
vs 3dmark.com/fs/13515881

13% difference for 4x the money

nvidia is going full intel

Oh, looks like they are hitting the same scaling up problems as AMD with Fiji.

no, the irony is elsewhere
amd has the problem that no games up until now were able to use its massive parallel hardware whatsoever

now that games are actually moving to the compute path, nvidia has to come up with a hardware scheduler similar to amd's and ditch their cpu-hogging warp scheduling shit. when the consumer volta hits the floor you're gonna see a sudden DECREASE of "serious" youtubers doing live capture through obs and shit, because of the cpu spikes

amd really dropped the ball

It's crazy that nvidia keeps on delivering 40%+ gains each year, even with no competition

>reference blower
Looks pretty good honestly.
Triple fan could reach

Oops, I'm using 2 graphics cards for PhysX and 1 for rendering the visuals in my game engine designs; too bad real games don't have real particles in them, like realistic shatter models, damage models, etc... With that, a game is better at 1080p 60fps than without it at 4K 120fps

Also I start buying games, when they use Instinct as PhysX... but you are too young to know what PhysX is.

No 1050 Ti - no parTi