THANK YOU BASED NVIDIA

youtube.com/watch?v=tjf-1BxpR9c

THANK YOU BASED NVIDIA

Attached: maxresdefault.jpg (1280x720, 108K)


BASED

Nice.

AMDRUMPFT IS FINISHED


sage

go be constipated somewhere else, pajeet

SAD

Attached: SAD.jpg (688x267, 58K)

Games right now don't have shadows?

So, the robots are homosex, right?

please upgrade

The point is that these are supposed to be pretty shadows. Ray-traced shadows aren't anything new, though. What's promised is an SDK, but until we get our hands on it we won't know if it's even workable. And if it is, you'd have to accept spending development resources on replacing something as basic as shadows for Nvidia users only, or not selling to AMD people at all. That won't happen, trust me. Look at PhysX.

OP has no perspective on graphics demos vs reality in games though. That we know for a fact.

Nvidia will become the next Windows, with every new product updating GameWorks with arbitrary upgrades.

The issue, too, is that unless this is supported by an engine, a dev is never going to bother implementing it. Unreal 4 comes with free GameWorks ray tracing out of the box? Might get a menu tickbox. It doesn't? You're never seeing anyone do it.

>BUTTMAD AYYMDPOORFAGS WITH NO RAYTRACING TECHNOLOGY DETECTED

Thank you NVidia(r) and Microsoft(r) for gifting us a technology which will only work with DirectX(tm) 12(c) on Windows(tm) 10!

Is there any news about AMD's Vulkan equivalent?

Well, you absolutely might see bigger devs do this for payment. Nvidia pushed PhysX this way. Can't say if it was a good idea for them, but they kept doing it.
>checkbox
It'll never be fast enough like this. We're talking about ray-traced lighting/global illumination solutions in real time.
If that's not just a deceptive description, you'd probably have to specify actor-scene-lightsource collections in some way to simplify things for the tracer. It's also very likely that you'd specify simple bounding volumes for all of these things.
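To make "simple bounding volumes" concrete, here's a rough Python sketch (hypothetical helper, nothing from any actual SDK): the tracer runs a cheap slab test against an axis-aligned box and only touches the real geometry inside on a hit.

# Slab method for ray-vs-AABB: a few multiplies and min/max per axis.
# If this cheap test misses, none of the triangles inside the box are touched.
def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    tmin, tmax = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

# Ray from the origin along +x, box straddling x = 5: hit.
# (1e30 stands in for 1/0 on the axes the ray doesn't travel along.)
print(ray_hits_aabb((0, 0, 0), (1.0, 1e30, 1e30), (4, -1, -1), (6, 1, 1)))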
>idiot consumer detected

No one cares about your ignorant, uneducated opinions, AYYMDPOORFAGS

Nvidia will get all the big gaming companies to use raytracing on post-Volta consumer GPUs for better graphics, and there's nothing you can do to stop progress.

This looks nothing like fp64 cpu ray tracing from a decade ago.

Attached: Glasses_800_edit.png (2048x1536, 2.9M)

I'm just suggesting with 'checkbox' that even if it does get supported, it's going to be an afterthought kind of thing as far as developers are concerned. Like how some games right now TECHNICALLY support a DX12 render path, but you're actually getting worse performance versus your DX11 or Vulkan options. Even if devs do get behind this a bit, you're never going to see anything realizing its potential outside tech demos. Or until raytraced solutions become mainstream, probably in another console hardware cycle or two.

>SAD
AMD's Vega will crush Nvidia in that field.

Besides, this is literally a 40-year-old rendering method done at the lowest quality, just with some fancy-ass raster noise reduction.

I bet it kills the small details, textures and such

Attached: 90098.png (650x337, 25K)

>simple integration
Yeah ok. I don't see us getting there.

Your picture simply has poorly done materials and textures.

It is technologically superior BUT the artist has no idea what the fuck he is doing.

hardocp.com/article/2018/03/20/nvidia_titan_v_video_card_gaming_review

> NVIDIA is so far ahead of AMD on performance and the NVIDIA TITAN V shows everyone what this looks like. With performance 40% faster than GeForce GTX 1080 Ti FE AMD has nothing even close to that level of performance in a GPU today. Even if you shrink Vega to 12nm, and boost clocks, it will still never make it to near the performance that NVIDIA TITAN V with Volta can achieve in games.

STAY MAD, AYYMDPOORFAGS

OpenCL Cycles in 2.79 doesn't have transparency/refraction support.

Yeah. And if we don't get to the point of it being that simple, I have a real hard time seeing it end up in anything.

What?
It absolutely does.
>With performance 40% faster than GeForce GTX 1080 Ti
So it is almost as fast as Vega? Good for them lol.

Stop posting fake news, faggot

Blender Foundation's chairman himself already said Titan V is the performance champion

twitter.com/tonroosendaal/status/943845517301338113

>Nvidia sent us a new GPU - the stylish Titan V. Ranks #1 in three of 6 benchmarks here

AYYMD getting DESTROYED in gaming, compute, you name it

Reminds me of the Unreal 4 engine demo, and we still haven't seen any games that look that good...

>these people were paid to say this
>it must be true
What? Are you even listening to yourself?

I fucking bet that as long as the materials and shaders are standardized, next year Unity and UE4 will simply add a "raytracing" button in the graphics options.
The settings will simply look like:
>low-med-high-ultra-epic-Raytracing

nvidia.com/en-us/gtc/

Next week is GTC, can't wait for Jen-Hsun to announce the next Nvidia GPU microarchitecture on the 27th

Nvidia, winning and winning all the way

>$3000 unicorn chip
vs
>actual consumer products

Oh neat! I don't know why I thought 2.79 still didn't support transparency. Was it 2.78?

>Ranks #1 in three of 6 benchmarks
So what ranks #1 in the other three?

>the next Nvidia GPU microarchitecture on the 27th

Ampere: the voltage-unlocked 12nm Pascal all over again, with the flagship sporting GDDR6.

There, I announced it today.

Yet it still looks better than this hardware "raytracing" trash.

I really don't know what you're talking about, but OpenCL in Blender has worked just great for years already and is objectively superior to CUDA.

>Being this stupid and retarded
>Nvidia already announced RTX technology for Volta & FUTURE GPUS, which means there's no way it's any Pascal refresh

Keep on posting fake news, AYYMDPOORFAGS :^)

Your pathetic behaviour really fuels this board for years to come.

>hardware
>raytracing
>this
>hardware

WHAT?

delete

>looks better

can it run at 60fps?

GPU-assisted anything is considered "hardware", like encoding/decoding HEVC on phones.

Do you think Nvidia can't just block older GPUs from running their new software?
They'll either do that, or they'll gimp the performance beyond usability.

SURPRISE: ATI COULD RUN PHYSX IN THE PAST, THEN BOOM, IT GOT BANNED

But don't worry, when AMD gets their own path tracing renderer they'll let everyone use it, even the Nvcucks. Mommy Su is so kind.

If we had enough x86 CPU FP64 throughput, then yeah. Couldn't Intel's Knights Landing CPU do this? 3+ TFLOPS of x86 FP64 seems like a lot of juice.

Attached: Knights_Landing.jpg (3000x1688, 1.29M)

Everything is hardware, even the CPU.
Even the PSU

AYYMD garbage can never run PhysX in the first place, please stop lying and embarrassing yourself if you don't know anything

>gpu assisted is considered hardware.
Interesting. What do you suggest we call specialized hardware solutions then? Silicon-assisted?

Don't be fucking dumb.
>hevc
That's usually about separate chips for specialized decompression/compression tasks, H.264 chips most commonly. Like the Broadcoms.

We don't need high precision for that.

In fact, AMD's Rapid Packed Math will fucking shine in this application. Path tracing can run on half-precision and maybe even fucking quarter-precision operations.

I like how no one here understands how this works at all. The raytracing stuff is part of DirectX; it's going to be supported by both parties as a hardware extension, and probably also by OpenGL. I'm pretty sure Nvidia was the first to push tessellation back in the DX11 days, but it's on all cards now.

lmao, no it can't. The difference between fp64 and fp32 ray tracing is noticeable, but between fp64 and fp16 it's insanely huge. A better approach would be to ray trace only certain spots or objects and rasterize everything else with some fancy GPU shadow/diffusion effects.
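You can eyeball the gap with a toy numpy sketch (nobody's actual renderer, just the failure mode): intersect a ray with a unit sphere 50 units out and watch fp16 lose the hit point to cancellation.

import numpy as np

# Nearest ray-sphere hit via the quadratic formula. The b*b - c subtraction
# cancels catastrophically at low precision; the true answer is t = 49.
def hit_t(dtype):
    b = dtype(-50.0)                            # dot(dir, origin - center)
    c = dtype(50.0) * dtype(50.0) - dtype(1.0)  # |origin - center|^2 - r^2
    disc = b * b - c
    return -b - np.sqrt(disc)

print(hit_t(np.float64))  # 49.0
print(hit_t(np.float16))  # 50.0 on this toy case -- a full unit of error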

It's GameWorks, so you know it won't run well even on Nvidia's own GPUs.

AMD is already working on a Vulkan feature like this, so you can take this as a preview of that.
When AMD releases it, it will run better than Nvidia's shit even on Nvidia GPUs.

You really have no fucking idea what you are talking about.

It's all going to be trash anyway; unless GPUs start doing CPU-level complex math at FP64 precision, don't hold your breath.

You are really fucking stupid.

I think you mean thank you, Microsoft.

Thanks to Nvidia only apply if you buy Volta, since anything older than that just uses the software fallback, like AMD cards.

Stop lying, AYYMDPOORFAGS

Nvidia GPUs have special hardware features to accelerate raytracing, which AYYMD garbage with no vision doesn't have.

The fact that AYYMD can barely release a statement just shows they got caught with their pants down and simply have no answer for years to come, until they try to copy Nvidia's innovations as usual.

Doubt this will be used for games, if OptiX/Redshift is anything to go off of.

The ray tracing in OP's video definitely isn't being done with double-precision floating point math, and it looks like ass.

Pic related was done in fucking 2005, over a goddamned decade ago, yet it still looks better than anything in OP's video.

hof.povray.org/kitchen.html

Attached: kitchen.jpg (1440x960, 511K)

>Nvidia GPUs have special hardware
Only Volta

Don't pretend your GTX 1070 or some shit is going to magically have ray tracing hardware acceleration.

Not exactly new.

github.com/straaljager/GPU-path-tracing-tutorial-4

Again, I never said Pascal, stop trying to put fake news in other people's mouths, faggot

Yeah, you make ambiguous statements like
>Nvidia GPUs have special hardware

Which is simply not true; brand-new Volta Nvidia GPUs will have it, nothing else.

Blanket implying JUST GO NVIDIA GUYS THEY'RE THE BEST is fucking retarded.

>ITT: 16 year olds with no clue

You're retarded, that's for sure.

Volta & FUTURE GPUs, which we already know are coming

anandtech.com/show/11913/nvidia-announces-drive-px-pegasus-at-gtc-europe-2017-feat-nextgen-gpus

But keep posting fake news I guess, Drumpflets

Fake news?

Nothing I said is wrong.

I never said Nvidia GPUs past Volta wouldn't have it.

But your statement of
>Nvidia GPUs have special hardware
Is just plain wrong.

Volta or newer, sure. But you can't just say "oh well, Nvidia GPUs have special hardware that AMD doesn't, SUCK IT AMD!"

We have this every year, and every year it's just the same thing slightly optimized.
There hasn't been anything new and groundbreaking in graphics for over a decade. Ray tracing is still developing, just as one might guess; nothing to get excited about.

AMD

youtu.be/cRFwzf4BHvA

Attached: 1521241529326.webm (1400x1080, 1.64M)

>to bake lighting

Attached: trashman4.jpg (650x429, 167K)

It's great. Unreal does light baking on the CPU only, and it takes hours.

Fuck ray tracing I want to be in that hella sick android party

Attached: 9b60e89863c2c12cf97e047b413809ac.jpg (2000x2000, 1.64M)

>static light/shadow-map baking is now innovation
Even fucking Half-Life 1 did that shit during map loading. How is this impressive in any way, shape or form in {{$scope.current_year}}?

>shadow-map
You don't know what this means, do you?

It's not impressive but it's pretty fucking great for productivity.

And the Swarm Agents for distributing/accelerating the process can only run on Windows.

Using anything other than windows means you're a brainlet anyway

Yeah, sure. Building a render farm for Lightmass baking and buying a Windows license for each node is exactly what an intelligent person would do.

Enterprise doesn't buy singular licenses.

CPUs have been adding feature-specific hardware to the die for years as a way to improve performance; how are GPUs any different?

>why does this old render that took 7 hours look more accurate than this real-time demo?

This has already been a thing in Blender since version 2.7, which came out last year.
Nvidia just made a much better version that works on the fly.

>"The radiosity data was saved in a first-pass render with plain textures and withouth area light nor antialiasing, at 720x480. This precalculated data was copied along with the scene on my 2 computers to distribute the render. The computers where an AMD XP 2,01Ghz and an AMD Athlon 1,2Ghz, both running Fedora Core 2 Linux. The final render at 1440x960 took 154 hours and 18 minutes, using +A0.0 +R4 for the antialiasing."
the point isn't that it's groundbreaking /quality/, but rather /speed/

Raytracing is '80s technology.
Raytracing is seriously an ancient technique.

My dude, this was done on hardware weaker than today's $20 Walmart Android phones. We have Threadrippers with 32 logical cores at 4GHz. Are you saying we still don't have the CPU brute force to render accurate ray tracing?

> Titan V: $3,000 .. GTX 1080 Ti: $700
> 4x the cost for a 40% bump in performance
Ohhh yeah, BTFO totally

Great, another BASED Nvidia technology that will be used in 5 or 6 games and then abandoned.

Good luck rendering that scene at 30+ fps on any modern single consumer CPU.
Sure, those are slow as shit compared to modern CPUs, but going from 154 hours to 33 milliseconds (30fps) is nearly 17 million times faster; CPUs haven't improved /that/ much.
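The arithmetic checks out, for what it's worth (quick Python sanity check):

# 154 hours per frame then vs. one frame per ~33.3 ms (30fps) now.
ms_per_frame_then = 154 * 3600 * 1000        # 554,400,000 ms
ms_per_frame_now = 1000 / 30                 # ~33.3 ms
print(ms_per_frame_then / ms_per_frame_now)  # ~1.66e7, i.e. nearly 17 million x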

Yup, how the fuck do people keep missing the obvious?

What about Knights Landing processors?

Attached: intel-knights-landing-overview.jpg (812x448, 127K)

CPUs have improved a lot, but the point is that they can't do real-time raytracing: they can't push that much data fast enough, and they can't do that many operations in parallel. It's why we have GPUs in the first place; they're SIMD architectures with insanely fast memory.
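The "that many operations in parallel" part is the whole story: every primary ray is independent, so the job is one giant SIMD batch. A toy numpy sketch of the shape of the work (made-up scene, hypothetical values):

import numpy as np

# Test every primary ray of a 1080p frame against one unit sphere in a
# single vectorized pass -- no pixel depends on any other, which is exactly
# the kind of work a wide SIMD machine with fast memory is built for.
h, w = 1080, 1920
ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
dirs = np.stack([(xs - w / 2) / w, (ys - h / 2) / w, np.ones_like(xs)], axis=-1)
dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)

center = np.array([0.0, 0.0, 5.0], dtype=np.float32)  # sphere 5 units ahead
b = dirs @ center                                     # per-pixel dot(dir, center)
hit = b * b - (center @ center - 1.0) >= 0            # discriminant test, all pixels at once
print(hit.mean())                                     # fraction of the frame the sphere covers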

Real-time raytracing was getting added to DX, wasn't it?

Is this kid larping or actually being paid 10c for every time he mentions ayyyymdpoorfags

no, 0.000,000,6 shekels from shintel probably

It's far ahead indeed:
theregister.co.uk/2018/03/21/nvidia_titan_v_reproducibility/

It's so far ahead it can't even make sense of its own calculations.

Looks noisy and shit.

A completely still picture, like toward the end, still has moving light and shadow, since the source rays being gathered and denoised are still noisy.

All these artifacts are completely unacceptable. Even worse than the shimmering pixels that deferred rendering with poor antialiasing gives.

The only demo that looked good was the UE4 one. The Nvidia and Futuremark demos looked god-awful.

It's more the fault of game consoles than Nvidia, really.

Consoles can hardly handle any of the GameWorks tech or the AMD equivalent; it's why PhysX died out. Game studios had to specifically add PhysX features to games on PC, only to remove them from console ports that'd outsell the PC version 30:1.

No sane game studio is going to waste time and money adding GameWorks features just to appeal to like 15% of the overall gaming market.

It's obvious to anyone who isn't completely ignorant that gaming GPUs are inaccurate because they use lower-precision numbers for faster calculation, since games don't require high precision. Literally EVERY consumer GPU from Nvidia and AMD does this; it's why there's a split between consumer GPUs (GTX/Titan) and workstation GPUs, or whatever you want to call them.

But hey, let's write a sensationalist article cuz the GPU is really hyped up and expensive!!! That'll get us some free clicks xDDDD
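The non-sensationalist version of that article: float addition isn't associative, so a parallel reduction that gets scheduled differently between runs can legitimately return different last bits. Quick illustration (the three sums usually differ in the trailing digits):

import numpy as np

# The same million float32 values summed in three different orders.
# Each order is a different tree of roundings, so the results need not match --
# which is all "non-reproducible" means for a massively parallel GPU reduction.
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000).astype(np.float32)
print(x.sum(), np.sort(x).sum(), x[::-1].sum())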

How many houses burned down to render this?

>Besides, this is literally a 40-year-old rendering method done at the lowest quality, just with some fancy-ass raster noise reduction.
>I bet it kills the small details, textures and such

The polygons and albedo are still raster rendered.
They're just doing things like light bouncing, ambient occlusion, shadows, etc., and applying that on top.
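In pipeline terms the hybrid setup is roughly this (toy numpy sketch with made-up buffers, not the actual SDK):

import numpy as np

# The rasterizer still writes the G-buffer (albedo here); the traced passes
# only contribute per-pixel light terms that get composited on top.
h, w = 720, 1280
albedo = np.full((h, w, 3), 0.5, dtype=np.float32)  # raster pass output
shadow = np.ones((h, w, 1), dtype=np.float32)       # ray-traced visibility, 0..1
ao     = np.ones((h, w, 1), dtype=np.float32)       # ray-traced ambient occlusion
bounce = np.zeros((h, w, 3), dtype=np.float32)      # ray-traced indirect light

final = albedo * shadow * ao + bounce               # the whole "hybrid" composite
print(final.shape, final.dtype)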

The problem is that there are still extreme artifacts with anything less than using 4 Volta GPUs to achieve 24fps. The demos using only a single Volta GPU have been garbage with very obvious and distracting artifacts.

This is nice tech for using UE4 to create cinematics that the artists can view in real time and tweak and refine more easily: they still look like the game art itself but are prerendered at higher quality. It's still 5+ years away for games, outside of using VXGI and calling it ray tracing, or using screen-space shadows and calling it ray tracing, etc.

Ray tracing is old hat.

The problem is that it requires an insane amount of computing power versus rasterization.

It belongs firmly in professional graphics unless you want to go back to gaming at 640x480/320x240 again...

Suddenly, everyone is on the raytracing train now?
What happened to bring this about?

Holy shit, a tech from ages ago is finally making its way into GPUs?!?!?! FUCKING HELL MAN