>Unreal Engine + $150,000 GPU = Real-time Raytraced Star Wars
youtube.com
have we finally reached graphics that look as good as prerendered cgi?
Though that cutscene was cringeworthy, it looks good. They say it's realtime reflections. Better show me a scene with an actual mirror instead of shiny metal.
>$150,000 GPU
?
That's the price of a GTX960 these days
Not to be a dick OP, but we were kinda discussing ray tracing here as well
oh well,
read the description.
zimbabwe dollars i guess
>who should present our engine to the world?
>well we've got this guy who dresses like a homeless person, has a lisp and green hair
>perfect
this industry
>lisping, blue-haired dude
Unreal
>actual mirror
Their GPU setup would melt.
Maybe he's one of the guys that worked on it
"Getting a “cinematic” 24fps with real-time raytracing"
>24fps
>ray tracing
If Moore's law holds true we'll have this sort of processor in our consumer computers in about eight years.
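For what it's worth, here's the back-of-envelope version of that Moore's law estimate. Every number below is an assumption for illustration (ballpark consumer-card and V100 fp32 figures, classic two-year doubling), not a vendor spec; with different assumptions you get the eight-year guess above.

```python
import math

# Rough Moore's-law estimate for when one consumer GPU matches the demo box.
# All numbers are ballpark assumptions, not measured or official figures.
consumer_tflops = 12.0       # assumed high-end consumer card, fp32
demo_box_tflops = 4 * 15.0   # assumed 4x Tesla V100 at ~15 fp32 TFLOPS each
years_per_doubling = 2.0     # classic Moore's-law cadence

doublings = math.log2(demo_box_tflops / consumer_tflops)
print(round(doublings * years_per_doubling, 1))  # ~4.6 years at this cadence
```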
>7th gen: motion blur
>8th gen: chromatic aberration
>9th gen: raytracing meme
can't wait for everything looking 10x more wet than it does now...
Thank the inflation due to bitcoin mining fucktards.
That's probably a 480 of all the fucking things, enjoy your god damn house fires for fucks sake.
God fucking damn it I am mad
>improved cinematic depth of field to simulate real world cameras
just what I wanted to hear, more interactive movies
I don't understand why you think this can't be used to create normal real time games like Wild or Bloodborne?
"raytracing" doesn't mean much, there are tons of varieties of it, and some date back to the '80s.
real-time does tho
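For anyone wondering what "ray tracing" minimally means: every variant, '80s or not, is built on the same primitive of shooting a ray and finding the nearest hit. A rough sketch in Python (illustrative names, nobody's actual engine code):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t to the nearest intersection of the ray
    origin + t*direction with a sphere, or None if it misses.
    direction is assumed normalized."""
    # Solve |origin + t*d - center|^2 = r^2, a quadratic in t.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c  # a == 1 because direction is normalized
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None  # only hits in front of the origin count

# A ray cast down +z from the origin hits a unit sphere centered at (0, 0, 5):
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```

The real-time part is doing billions of these intersection tests (against triangles, through acceleration structures) per second, which is what the GPU farm is for.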
Raytracing isn't a meme you retard. Unlike the ones you listed and the brown and bloom you forgot to list, Raytracing is actually needed to achieve realistic results that aren't just trying to emulate shitty cinematic effects. If NVidia did actually engineer something that achieves real time raytracing without needing 12 state of the art GPUs then that's fucking impressive. Though at the moment I have my doubts.
the wall reflects off that wall and reflects off that wall!!!!!!
games are saved!!!!!!!!!!!
>If NVidia did actually engineer something that achieves real time raytracing without needing 12 state of the art GPUs then that's fucking impressive.
GPUs - 4x Tesla V100
CPU Intel® Xeon® E5-2698 v4 2.2 GHz (20-Core)
System Memory - 256 GB LRDIMM DDR4
except they did user
Are you retarded?
>CPU Intel® Xeon® E5-2698 v4 2.2 GHz (20-Core)
desu this is the thing that sets off the BS alarm. Whatever process they use must also eat a fuckload of CPU resources, which means you won't see it in games.
I wonder how much of that is actual new tech aimed at optimizing ray tracing and how much is just brute forcing.
Either way, it seems we won't be seeing it in games this decade.
I CLAPPED WHEN I SAW STAR WARS
ok i got a chub
8 Tesla V100s. That's about 1,000 teraflops of tensor throughput to brute force with.
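Quick sanity check on what that buys per pixel. The ~125 tensor TFLOPS per V100 is a marketing peak figure, so treat everything here as an upper bound, not a measurement:

```python
# Back-of-envelope: what ~1000 TFLOPS buys per pixel at "cinematic" rates.
# Per-GPU throughput is a peak marketing number; real utilization is lower.
tflops = 8 * 125.0    # assumed 8x V100 at ~125 tensor TFLOPS each
fps = 24              # the "cinematic" frame rate quoted in the thread
pixels = 1920 * 1080  # one 1080p frame

flops_per_pixel = (tflops * 1e12) / (fps * pixels)
print(f"{flops_per_pixel:.2e}")  # ~2e7 FLOPs per pixel per frame, at peak
```

Tens of millions of operations per pixel per frame sounds huge, but full path tracing happily eats all of it, which is why only parts of the frame get traced.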
A normal 4- or 8-core CPU probably can't drive four Tesla V100s in SLI. That doesn't necessarily mean the task itself is CPU-intensive; the GPUs also need CPU time just to be fed frames to render.
Call me when games look like the opening cinematics of the PS3 Armored Cores
>games will never look as good as this
youtube.com
Oh boy I can't wait for this to usher in even more games that try to emulate reality instead of something completely unreal
They put 4 GPUs that aren't even available to the normal consumer in a PC.
>2018
>caring about NuWars
sad
>Though that cutscene was cringeworthy
you really need to eat shit
>dumb post processing
>dumb post procesing
>actually fucking simulating how light works IRL
If you have a 1000 Tflop deep learning server what would you do with it?
>Tech demo shows impressive graphical shit
>Always in a fucking corridor or a room
EVERY. FUCKING. TIME
Is everyone retarded or something?
it took 9hrs to render that scene user.
That's a lot of flops.
Give them to Microsoft.
Realistic ray traced rape simulator.
>implying playing a stealth game and seeing an enemy through a multi-mirror setup wouldn't be the coolest shit
I just did by watching that part.
>adds to cart
Then it’s not real time rendering. Am I being tricked?
And my point is that people always lose their shit when devs show a pretty room or corridor with shiny effects, but never any big interactable open world. Why bother, then? A room is easy to light so it looks good; people have done that in plenty of games.
how fucking stupid are you?
They have enough flops as is.
It took 9 real-time hours ;^)
No you're wrong, this is an Nvidia DX technology that was implemented in the engine. It's all in the GPU
But can it run Crysis?
Shame we'll never have anything like this since games have to be held back from their true potential for the sake of consoles
Enlighten me then. Show me that I'm wrong. Show me the reason they only render rooms for these technologies instead of a bigger environment like forests or cities.
The answer is that even the beefiest machine would render those scenes at 3 frames per second.
UE4 supports planar reflections that are effectively mirrors already.
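On the planar reflection point: conceptually that feature re-renders the scene from a camera mirrored across the mirror plane, no ray tracing needed. A hedged sketch of just the mirroring math (illustrative, not UE4's actual code):

```python
# Mirror a point across the plane n.x = d, the core of planar reflections:
# reflect the camera, re-render, and project the result onto the mirror.

def reflect_point(p, n, d):
    """Reflect point p across the plane with unit normal n and offset d."""
    dist = sum(pi * ni for pi, ni in zip(p, n)) - d  # signed distance to plane
    return tuple(pi - 2 * dist * ni for pi, ni in zip(p, n))

# A camera 3 units in front of a mirror at z = 0 ends up 3 units behind it:
print(reflect_point((0, 0, 3), (0, 0, 1), 0))  # -> (0, 0, -3)
```

The catch is cost: each planar mirror means rendering the scene again, which is why games use them sparingly while ray tracing handles arbitrary shiny surfaces in one pass.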
Run 10 crysis at the same time
>adds tampons to cart
>Show me the reason they only render rooms for these technologies instead of a bigger environment like forests or cities.
Probably nothing to do with them creating a quick tech demo that showcases the particular tech as well as possible, hence why they even used the reflective interiors of the starkiller base
They should have used all the assets from the now dead Paragon and gave a comparison of rendering quality based on that since it'd be closer to a real world example instead of some VFX garbage.
I don't think you know what tech demo is. And I don't think you realize just how new this Nvidia ray tracing tech is. GDC is not for you, stick to VGA.
>nearing 15h GMT
>still no spoiler images
>OP threads on Sup Forums are mostly shit now so i'll just skim through it
>chapter will only be out tomorrow and i'll have to read it on my phone during classes as in the last weeks
when did it all go wrong?
oops, wrong tab
From the vid
>The best way to show ray tracing is through reflections so let’s get someone shinier
>Put in the metal girl
>The scene is still a small fucking elevator
Sure it looks pretty and all but my point still stands. The performance will tank outside of those corridors they created. They always introduce the tech too early and people have to wait for a sufficiently strong enough card to come out before it can go mainstream.
>The performance will tank outside of those corridors they created.
And? What the fuck does this have to do with the technology? Do these fucking naysayers have nothing better to do than whine about literally anything they see? This board spends every single horror game thread circlejerking the PS2 Silent Hill games for how good they still look thanks to the graphical tricks they pull, but when a TECH DEMO limits itself to one corridor to showcase some technology, these whiners are in actual tears over not getting 10 more of those GPUs and CPUs to pull off the same thing in an open level.
>something about video game technology
>better care about how someone looks!
It's not about performance you dumbshit, it's a proof of concept for something revolutionary: real-time ray tracing. It's not about how efficient or fast it is; it's about being able to do it at all. This is way above your level, you don't even know what you're talking about.
It was pretty bad. It was like a scene out of some cheesy anime.
So pretty much any anime.
>have we finally reached graphics that look as good as prerendered cgi?
What's the point if people are still poor?
Sup Forums is so fucking sad
More like 1.
The first games that utilize this tech will be limited to tight corridors.
>a true return to survival horror?
>the retard(s) in this thread complaining about the proof of concept being in the corridor
Having several reflective surfaces and light sources in close proximity is the best way to show off this technology. In a large open area, light would diffuse so much that ray tracing wouldn't be nearly as noticeable.
GDC threads should be on Sup Forums, not on Sup Forums. This board isn't mature enough for it, they don't even get the point of the conference
WOW A PRERENDERED CUTSCENE WOW GUYS SUCH ADVANCEMENT
The number of retards posting on this board is truly staggering. No wonder the industry is going to shit when it panders to dumbasses like this.
>you (You)
>waa waa why cant technology be made fast enough so that as soon as a concept is proven we can already have a full AAA videogame made from it
I hate retards who try to act as if they understand how the real world works
>thread about new technology
>4 posts talking about muh crypto
>1 post sperging out about Star Wars
>some posts sperging out about muh corridors
kill me
>In a large open area, light would diffuse so much that ray tracing wouldn't be nearly as noticeable.
This. There is nowhere on earth suitable enough to show off all the fancy light, shadows and reflections that raytracing would offer. A tiny cramped corridor is a much better option.
>cute deep voice
>it's a 40yo guy with green hair and dressed like an emo teenager
Gamers... Pfft.
Watched the livestream of the UE4 GDC presentation, and it was painful. The presentation was okay, but the comment section was almost completely kids asking when the (non-existent) Fortnite announcement was coming. There was even some kid that kept asking for Spiderman PS4, which of course isn't even UE.
Sup Forums is maybe a teeny bit better than youtube commenters... but not much.
As someone who understands how ray tracing works, how the hell would this be possible in real time? It just doesn't sound plausible
Fucking magnets
Like previous attempts at communism, it's not true raytracing. They're only using it for certain visual elements in a scene and not to render the entire image itself.
Server thing nvidia sells with 2 xeon cpu, 8x V100 workstation gpus.
nvidia.com
Only a percentage of the frame would be ray traced; the rest would be filled in with other algorithms. It's more believable that way if it's ever going to make it into vidya.
I want to see this shit done with water, that would look amazing.
How many light sources do you count there? How many reflective surfaces?
There's some algorithm that prioritizes rays, which allows a near-realistic look for a fraction of the computational requirements, I believe.
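That prioritizing idea is roughly: spend a fixed ray budget on the pixels where tracing matters (shiny or noisy areas) and let a denoiser fill in the rest. A toy sketch, with made-up importance weights and budget:

```python
# Toy version of "prioritize rays": allocate a fixed ray budget per frame
# proportionally to a per-pixel importance weight. Weights and budget are
# made up for illustration; real systems derive them from material/variance.

def allocate_rays(importance, total_budget, min_rays=1):
    """Split total_budget rays across pixels proportionally to importance,
    guaranteeing each pixel at least min_rays (integer truncation may
    leave a few rays unspent)."""
    n = len(importance)
    spare = total_budget - min_rays * n
    total_w = sum(importance) or 1.0
    return [min_rays + int(spare * w / total_w) for w in importance]

# Four pixels: a mirror-like surface (high importance) vs. three flat walls.
importance = [0.7, 0.1, 0.1, 0.1]
print(allocate_rays(importance, total_budget=20))  # mirror pixel gets most rays
```

The sparse, noisy result then goes through a denoiser, which is a big part of how "real-time" ray tracing is plausible at all.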
nvidia has been going at this since at least 5 years ago
youtube.com
>How many light sources do you count there?
One: strong sunlight casting shadows across the entire scene from all the buildings, people and surrounding objects, and with it comes both large-scale and fine GI and AO on every last inch of a wide open space.
>How many reflective surfaces?
Hundreds. Can't you see all the windows?
But does it have lootboxes tho
It doesn't matter how pretty games look if they're going to get mangled by microtransactions and countless other shady monetization schemes.
no if you're not poor
Well that shit isn't going anywhere, at least at a basic level, so might as well keep improving the other aspects.
Render farm for all the SFM porn artists. Just email me the files and have all your stuff rendered at the highest settings, super fast.
Exactly. One light source. Not a great way to show multiple sources interacting.
>windows
Can you see the reflections from the windows on that art piece? Even if you got the perfect angle and zoomed in enough, no, because the reflection from the window would have diffused so much it would be blurry as shit.
To reiterate: ray tracing is to simulate multiple light sources and reflective surfaces INTERACTING with each other.
Due to how PBR is designed to work, every single pixel in a scene is both a source of some form of light as well as a reflective surface and they all interact with each other. Just because it's not a giant shiny mirror doesn't mean it's not contributing or being affected by the environment.
There's a reason they added the one stormtrooper who's literally a walking mirror to the tech demo. How many people would care about the light from a regular surface just barely reflecting onto another surface? How many people would go crazy over shadows from a single light source?
The tech is most noticeable in a small, controlled environment.
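The "interacting light sources" point has a simple arithmetic core: direct light at a point is a sum of shadow-ray contributions from every light, so a cramped scene with several nearby lights shows far more variation than one distant sun. Illustrative numbers only (occlusion omitted):

```python
# Why a cramped, shiny corridor shows off ray tracing: shading a point means
# summing a contribution per light, so nearby lights interact visibly while
# one distant sun barely varies across the scene. No occlusion, toy numbers.

def direct_light(point, lights):
    """Sum inverse-square contributions from point lights.

    lights is a list of ((x, y, z), intensity) pairs."""
    total = 0.0
    for pos, intensity in lights:
        d2 = sum((p - q) ** 2 for p, q in zip(point, pos))
        total += intensity / max(d2, 1e-6)  # clamp to avoid divide-by-zero
    return total

# Two close corridor lights produce strong, distinct contributions:
corridor = [((0, 2, 0), 10.0), ((1, 2, 1), 10.0)]
print(direct_light((0, 0, 0), corridor))
```

Add reflective surfaces and each one effectively becomes another light, which is exactly the combinatorial interaction the elevator demo was built to show.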
BASED Nvidia making the future possible meanwhile AyyMD hasn't even got to the present.
AMD is making their own and it'll work with Vulkan. I'd rather have that than DX12.
They're just going to use this tech, when it's commercially viable, to make some SJW garbage.