Raytracing

What's going on here?

Attached: directx.png (276x183, 2K)

You have the good old rasterization pipeline (which sits on the 3D queue). And we also have the newer compute "pipeline" (which sits on the compute queue). Now we also get a raytracing "pipeline", which grotesquely sits on either the 3D or compute queue, as one wishes.

It is a hybrid somewhere between a "middleware" on top of the lower level APIs (setting instanced objects, building their bounding volume hierarchies, starting the ray tracing process), and an extension of the lower APIs (they have introduced new HLSL intrinsic functions).
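For anyone wondering what "building their bounding volume hierarchies" actually buys you: a BVH lets traversal skip whole groups of objects whose bounding box the ray misses. DXR builds and traverses the structure for you; the sketch below (plain Python, nothing to do with the actual DXR interface) is just the classic slab test that such a traversal bottoms out in. `inv_dir` is the precomputed per-axis reciprocal of the ray direction (zero components would need the usual IEEE-infinity handling, skipped here).

```python
def ray_aabb(origin, inv_dir, lo, hi):
    """Slab test: does the ray origin + t*dir hit the box [lo, hi] for t >= 0?
    inv_dir is 1/dir per axis, precomputed (assumed nonzero here)."""
    tmin, tmax = -float("inf"), float("inf")
    for o, inv, l, h in zip(origin, inv_dir, lo, hi):
        t0, t1 = (l - o) * inv, (h - o) * inv
        if t0 > t1:
            t0, t1 = t1, t0  # the ray may enter the slab from either side
        tmin, tmax = max(tmin, t0), min(tmax, t1)
    return tmax >= max(tmin, 0.0)

# Ray along (1,1,1) from the origin hits a box spanning [2,3] on every axis:
print(ray_aabb((0, 0, 0), (1.0, 1.0, 1.0), (2, 2, 2), (3, 3, 3)))  # True
```

If this test fails for an internal node's box, every triangle below that node can be skipped, which is the whole point of the acceleration structure.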

I would be wary of new highly managed layers and of introducing special instructions for only one purpose. Remember the Retained Mode aka Retarded Mode of Direct3D 3.0? This DirectX Raytracing is a Retained Mode for raytracing.

Needless to say, the Retained Mode was mostly unused and later removed. This API was built in cooperation with engine developers, though, so we will see how it turns out.

It will be interesting to see how the API ages. My bet is that pretty soon the raytracing API will not be used, because it is too inflexible compared to the rendering and compute pipelines. The problem again is that the people on Twitter -- especially the gamedevs -- are too easily excited about stuff.

Attached: 1516098272683.jpg (722x349, 58K)

I could swear I saw some real time OpenGL/Vulkan raytracing demos a year ago.

>they have introduced new HLSL intrinsic functions
Ah so that's what this is about, Vulkan 1.1 just added HLSL support, so of course HLSL gets updated to break compatibility.

Raytracing is pretty API-agnostic; in its real-time flavor (read: mostly only primary rays) it can be implemented in Direct3D, OpenGL, Vulkan, whatever you like.
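To make "mostly only primary rays" concrete: a primary ray is just camera origin + direction, intersected against the scene, one per pixel. A minimal ray-sphere intersection in plain Python (obviously not how you'd run it on a GPU, but the math is the same in any API):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Nearest positive hit distance t along the ray, or None on a miss.
    direction is assumed normalized, so the quadratic's a-coefficient is 1."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # ray line misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# Primary ray straight down -z from the camera, unit sphere 5 units away:
print(ray_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0
```

Loop that over every pixel's camera ray and you have the "real-time flavor" the post is talking about; the expensive part is doing it against millions of triangles instead of one sphere.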

The thing is that last year at HPG an intern from Nvidia published a paper on denoising raytraced scenes in a way that is both spatially and temporally stable. I personally don't really like that paper: they give performance numbers only for the denoising step, not for denoising + raytracing, in order to claim it is "real-time". The raytracing is most of the work here, so that's a bit misleading.
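For reference, the usual shape of such a denoiser (this is a generic sketch, not the method from that paper) is a temporal pass that blends the new noisy frame into an accumulated history, plus a spatial filter over the result:

```python
def temporal_accumulate(history, current, alpha=0.1):
    """Blend the new noisy frame into the running history (exponential
    moving average). Lower alpha = smoother but more ghosting on motion."""
    return [[h * (1.0 - alpha) + c * alpha for h, c in zip(hr, cr)]
            for hr, cr in zip(history, current)]

def box_blur3(img):
    """Crude 3x3 spatial box filter (clamped at the borders); real denoisers
    use edge-aware filters so lighting doesn't bleed across geometry."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += img[yy][xx]
                        n += 1
            out[y][x] = acc / n
    return out
```

The denoising pass really is cheap, which is exactly why quoting only its timing while the raytracing that feeds it dominates the frame budget is misleading.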

On an additional note: This raytracing pipeline only works on Direct3D 12. When you look at the list of games released with Direct3D 12 support on Wikipedia, that list pretty much resembles a tumbleweed rolling over the prairie.

Attached: 1516557278870.png (591x591, 22K)

Lord, not this raytracing crap again. Every few years somebody pipes up with some new study or piece of software.

>Omigurd, raytracing will revolutionize 3d-graphics
No, it won't. Raytracing is not a 'be all end all' solution. It's interesting and really nothing else.

Using raytracing with AI denoising as a preview for the render.

youtube.com/watch?v=6xE3J56pabk

I used to know his brother
Brian Tracing

here's another one :^)
vimeo.com/180284417

>No, it won't. Raytracing is not a 'be all end all' solution. It's interesting and really nothing else.
You do realize raytracing is heavily used outside of gaming right? Like in movies and TV and shit right?

This time it's hardware too.

Which means that no one can use it until the new hardware is adopted, plus Microsoft tied it to their malware platform which will also hurt adoption.

So it's just another attempted distraction from the new version of Vulkan, in an attempt to lock more developers in.

We'll see what happens. Realtime raytracing is too valuable to be wasted on just gaymes.

SHINY BALLS

Attached: Raytracing_reflection.png (1024x768, 501K)

Raytracing, or rather forward path tracing, really could be the magic solution, if only hardware could support it in real time. The reason being what you get for free. With other techniques you can certainly have myriad cool effects cheaply, but you have to consider them all separately. With fully physics-based path tracing, you don't have to keep them in mind; they're a result of, well, physics. Proper reflection, refraction, caustics, global illumination, translucency, subsurface scattering, diffraction, all of that is just along for the ride. Then you really only have to focus on geometry and ensuring the proper properties of materials.
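The "physics for free" point in a nutshell: you don't hand-code diffuse lighting as a formula, you get it by averaging what rays bring back. A toy Monte Carlo estimate of the cosine-weighted irradiance over a hemisphere, whose analytic answer is pi, in plain Python (the sample count is an arbitrary choice here):

```python
import math
import random

def sample_hemisphere():
    """Uniformly sample a direction on the unit hemisphere around +z."""
    z = random.random()                 # uniform z gives uniform area density
    phi = 2.0 * math.pi * random.random()
    r = math.sqrt(max(0.0, 1.0 - z * z))
    return (r * math.cos(phi), r * math.sin(phi), z)

def irradiance_estimate(n_samples=200_000):
    """Monte Carlo estimate of the integral of cos(theta) over the hemisphere.
    This is the Lambertian shading integral; the analytic value is pi."""
    total = 0.0
    for _ in range(n_samples):
        d = sample_hemisphere()
        total += d[2]  # cos(theta) against a +z surface normal
    # Divide by the uniform-hemisphere pdf, 1 / (2*pi):
    return total / n_samples * 2.0 * math.pi
```

Replace "add cos(theta)" with "trace a bounce ray and add what it sees" and the same loop produces global illumination, soft shadows, and caustics without any of them being coded up individually; the catch, of course, is the number of rays it takes to converge.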

I remember talking about ray tracing a decade ago, it's only now that they're going to implement it in games?

This opens the door to so many more things than just accurate reflections. It opens the door to subsurface scattering, refraction, colored transparent shadows, caustics, way better shadows with attenuation, good lens effects, volumetrics, and of course global illumination. I am sure I forgot some other rendering techniques which rely on raytracing as a foundation.

Yes, this isn't a trivial challenge

Just Microsoft trying to claim/hijack what various people and businesses have been working on since at least the 90s.

>Raytracing is not a 'be all end all' solution.
It literally is. It simulates the path of light, and all possible effects come out of it for free and with perfect accuracy.

So you could just ditch your normal * light direction calculations and instead go for rays?
Or is DirectX going for per-pixel ray casting anyway?

Shit if I know. But raytracing by nature produces perfect photorealistic graphics, unless you deliberately fuck it up.

It would be interesting to see if you could forego the current lighting methods in 3D.

Something about shading the polygons according to the rays cast.

That's essentially what this is.
The scene is rendered as usual, just without any shaders, then raytracing is used to do the lighting.