60 triangles best.
>>393498094
.
inb4 pic that disproves that shitty one
never understood the point of that pic anyway, as if added triangles all go into one model instead of into more models in a given area
...
there we go
Have we hit the ceiling of how good games can look?
I mean, games have barely improved at all visually since Crysis came out.
Am I dumb for thinking the 20000 and 40000 ones look basically no different
I thought the PS2 was supposed to have Toy Story graphics, but even the PS4 doesn't.
Read the image.
No that's the whole point, they are simply adding triangles without adding details so the result doesn't change. In the bottom row you see what happens when you actually use those extra triangles.
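To make it concrete (toy sketch, all names made up): naive midpoint subdivision multiplies the triangle count without adding any detail, which is exactly what the top row of that pic is doing:

```python
def midpoint(a, b):
    return tuple((a[i] + b[i]) / 2 for i in range(3))

def subdivide(tri):
    """Split one triangle into 4 by edge midpoints (no smoothing)."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

def subdivide_mesh(tris, levels):
    for _ in range(levels):
        tris = [t for tri in tris for t in subdivide(tri)]
    return tris

flat = [((0, 0, 0), (1, 0, 0), (0, 1, 0))]
dense = subdivide_mesh(flat, 5)   # 4**5 = 1024 triangles
# every new vertex still has z == 0: 1024x the triangles, identical shape
assert all(v[2] == 0 for tri in dense for v in tri)
print(len(dense))
```

The bottom row is what you get when the extra vertices are actually moved to new positions instead of sitting on the old surface.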
>le games still haven't beaten toy story yet
here's a game that actually cares about graphics unlike autism hearts
How are they able to nail the eyeballs down but not the ears?
came here to explain this
glad someone already did it
We have a long way to go. Hence why we still see massive downgrades between E3 trailers and release.
I know nothing about models. I read it twice and I didn't get it.
see
also say you could have a dude with 6000 triangles, then you could have 10 of these dudes with 60000 triangles
Time taken to render:
>2017: ~33 ms per frame
>1995: several days per frame
Consoles still for some reason can't into tessellation.
Just for the record, the thing on the second row, where his ears look faceted, is going away soon: Pixar released their OpenSubdiv code, and it will be integrated into engines for those sweet smooth edges. I heard Unreal Engine is already working on it.
Then again, it's >kingdom hearts, so I wouldn't expect them to give a shit about it. The whole point of the franchise is minimum effort maximum profit from autists
>Autism Hearts
>2017 rendered on an apple
>1995 rendered on an orange
not an argument
What if we used circles instead of triangles?
>FMV is the same as real time
Neither is your stupid molymeme phrase
>2017 rendered on a cheap consumer machine
>1995 rendered on a specially built render farm
See this for good graphics. Or Battlefront, or most AAA titles
>2017 is the same as 1995
>Same as it ever was
Shit engine most likely. Tons of games graphically blow Toy Story the fuck out.
No, shit devs. KH3 is using UE4.
Well, there are NURBS, which are perfectly smooth, but they're almost impossible to animate and very hard for artists to work with. Also, because nobody gives a shit about them, the rendering technology is very outdated and would hardly work in real time. But yes, they exist
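You can't demo real NURBS in ten lines (knot vectors, weights, all that), but the core idea, a perfectly smooth curve from a handful of control points instead of a pile of line segments, is easy to show with their simpler cousin: a Bézier curve evaluated with de Casteljau's algorithm. Toy sketch, illustration only:

```python
def de_casteljau(points, t):
    """Evaluate a Bézier curve at parameter t in [0, 1]
    by repeatedly lerping between neighbouring control points."""
    pts = [tuple(p) for p in points]
    while len(pts) > 1:
        pts = [tuple((1 - t) * a[i] + t * b[i] for i in range(len(a)))
               for a, b in zip(pts, pts[1:])]
    return pts[0]

ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
print(de_casteljau(ctrl, 0.0))  # hits the first control point exactly
print(de_casteljau(ctrl, 1.0))  # and the last
print(de_casteljau(ctrl, 0.5))  # smooth in between, no facets anywhere
```

The curve is exact at every t, which is why you can't see polygons on it; the tradeoff is everything else the post mentions.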
>here's a game that actually cares about graphics unlike autism hearts
and it looks like dog shit
>Devs don't make 1 triangle 3D models for their games
Just look at how good it looks
>Dragon's Lair still is the pinnacle of graphics
Jesus christ, they're retarded then.
the fastest supercomputer in 1996, ASCI Red, pushed 1 TFLOPS, though I think it was in FP64
God, toy story looks like fucking trash.
I mean it's the cuphead meme of the 80s
you could subdivide that triangle into 40k triangles and it'd look identical, diminishing returns!
The main thing holding back real-time from surpassing pre-rendered is lighting and reflections.
That was Mickey-Mania.
Are you pretending to be stupid or are you really this stupid?
KH3 looks better though in that
For the room itself, the lighting, shaders, and textures are WAY better in KH3 than in TS1. There are outright objects in the TS1 image that are flat diffuse textures with no lighting or shaders on them.
For Woody, his ear is circled to show fewer polygons along his upper ear, but it completely ignores that the KH3 model has way better polygon counts and normals along the back of his neck, jawline, nose, eyebrows, and sideburns. And because it's not using raytracing, it lacks the noise artifacts along the brim of his hat/forehead and in the background; in general the TS1 version has inferior shaders and lighting.
Also TS1 looked like ass outside of the main toys in general and even mid to late gen 6 games BTFO most of the movie (pic related)
>The farm itself is essentially a wall with 117 SPARCstations configured as headless servers, each with 192 to 384 megabytes of RAM (each processor has an average of 96 megabytes of RAM) and three to five gigabytes of local disk storage.
>The movie's final image rendering was accomplished on a "farm" of 87 dual-processor and 30 quad-processor 100-MHz SPARCstation 20s -- representing more computing power than 300 Cray 1's.
>SPARCstation 20 (single processor) had SunOS 5.4 installed and used a HyperSPARC @100 MHz with 27.5066 MFLOPS
>Theoretical maximum performance of the setup used by PIXAR: 294 * 27.5066 = 8086.94 MFLOPS
>Movie was rendered at 1526x922 pixels using stochastic anti-aliasing; scan-line rendering was used, with shadow mapping for shadows (no ray tracing)
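You can sanity-check that FLOPS math yourself; the numbers below are just the post's own claims plugged in, not independently verified:

```python
# 87 dual-processor + 30 quad-processor SPARCstation 20s
dual, quad = 87, 30
cpus = dual * 2 + quad * 4          # total processors in the farm
mflops_per_cpu = 27.5066            # claimed HyperSPARC @ 100 MHz figure
farm_mflops = cpus * mflops_per_cpu
print(cpus)                         # 294 processors
print(round(farm_mflops, 2))        # ~8086.94 MFLOPS, i.e. ~8.1 GFLOPS
```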
The eyeball is a simple sphere while the ear is a much more complex mesh. When you do 3D, you either have to smooth your mesh or put in a lot of polygons to make it appear smooth.
Smoothing doesn't add polys but gives a "gummy" look to your object if it's low poly to begin with; you can't smooth a low poly chair for example because you want to keep the sharp edges.
Video games can't have as many polygons as 3D movies because consoles can't support it.
So in conclusion, in a video game a sphere can be smoothed without a lot of polygons and without consequences on its look, while an ear must stay unsmoothed in order to look "good"
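For the curious: "smoothing" without adding polys is mostly just smooth shading, where a vertex's normal becomes the average of the normals of the faces around it. Rough sketch with hand-rolled vectors, illustration only:

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def vertex_normal(face_normals):
    """Average the adjoining face normals into one smooth vertex normal."""
    summed = tuple(sum(n[i] for n in face_normals) for i in range(3))
    return normalize(summed)

# two faces meeting at a 90 degree edge, think a hard chair edge
faces = [(0.0, 0.0, 1.0), (1.0, 0.0, 0.0)]
print(vertex_normal(faces))  # points halfway between them
```

The lighting now blends across the edge, so the edge reads as rounded ("gummy") even though the geometry is still a hard corner, which is exactly why you don't smooth the chair.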
nuh uh, toy story one reflects the room in the floor
checkmate nerd
We're playing checkers, my dude.
>because consoles can't support it
PCs can't either
you've been playing checkers but I've been playing chess
>117 SPARCstations
ill sparc your station if you know what im saying
this is why pc gaming is a ruse
You are dumb for not knowing how to read
in terms of flops the base PS4 is around 230x more powerful
What about voxel shit?
>only has shitty western multiplats
>they look exactly the same on console as they do on PC
>many of the best games are Japanese, many of which will never come to PC for one reason or another
>PC would be lucky to get a game over a decade after consoles had them
Why do people fall for the PC meme?
3d artist here
basically
instead of "tris" or "polys" imagine words
and the number of words is an erotic story of your waifu doing your dream fetish to you
60 tris is the story told in 60 words
6000 tris is the story told in 6000 words etc
In the original example the story in 6,000 words is the way the story was written, and the 60,000 tri one is the same story copy-pasted 10 times and shipped as the full story.
It doesn't make any use of the given word limit and wastes literally 90% of the story on saying the same shit over and over.
Whereas if it were done properly those 60,000 words would've been used to flesh out all the details of every penetration instead of copy-pasting them to do the same thing.
Basically it's just an image made to trick normies like you.
games have come really far from crysis days.
How so?
we've solved
>pixel sized voxels
>compression and efficiency
there's still some open issues about animating that stuff decently
voxels could be the future if geometry was our bottleneck in graphics but it's not
need arm hair
fucking kek
pbr to name one.
I think people are mad since most of the major leaps forward and improvements have been for the tools that us devs use, and the refinement has been very subtle.
What bugs me is when people argue that this is just about graphics, but in reality what it means to render 60k triangles instead of 6k triangles isn't the small increase in detail, it's that you could render 10 times as many objects of that 6k quality. The number of orcs walking around shadow of mordor is a lot more than the number of guards in Metal Gear Solid at any one time, on top of background effects like the stupid leaves people love in Battlefront II's Naboo level.
Better textures, better models, better lighting. Better implementations of things like ambient occlusion. Better background stuff like deferred rendering and physically based rendering. Games just don't have the same care put into them generally but the tech itself is better.
that's pretty much what the current advancement has been. The programmers have been working on trying to cram as much detail onto that 6k model and making them as easy to make as possible.
You just helped his argument you tard
>1995: several days per frame
More like 3 hours.
Also:
>2017: 4000000MFLOPS
>1995: 40MFLOPS
crysis was one of the first games to do ambient occlusion
games really haven't come that far; things still work mostly the same as they did before. I guess you could say PBR is a technological advancement but it really isn't that big compared to everything else
thanks to tools like the stuff Allegorithmic offers we can put more care into our models since we have more time to do so. Also we have some neat tricks to fake putting care into them.
Honestly even a random pump in some shitty CoD gets a week's worth of effort put into it.
I know cause I did.
This doesn't even go into how high quality models are made first nowadays, baked into a normal map, which is then overlayed on top of a lower poly model.
By putting all the curvature detail into what is essentially a texture, we don't need to waste cycles rendering all that extra geometry.
Someone more knowledgeable could probably better explain how you combine the diffuse texture, specular, and normal map on the GPU, as I'm fairly certain engines are designed to load all texture layers onto the GPU in one go.
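Not an engine guy either, but per pixel it roughly boils down to: sample the maps, feed them into a lighting equation. One-pixel Lambert diffuse plus a Phong-style specular highlight; every value here is made up for illustration:

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade(diffuse_rgb, specular_strength, normal, light_dir, view_dir):
    """Combine the three 'map samples' for a single pixel."""
    n = normalize(normal)          # from the normal map
    l = normalize(light_dir)
    lambert = max(dot(n, l), 0.0)  # diffuse term
    # reflect the light direction about the normal for the highlight
    r = tuple(2 * dot(n, l) * n[i] - l[i] for i in range(3))
    spec = specular_strength * max(dot(r, normalize(view_dir)), 0.0) ** 16
    return tuple(c * lambert + spec for c in diffuse_rgb)

# light and camera straight down the normal: full diffuse + full highlight
pixel = shade((0.8, 0.2, 0.2), 0.5, (0, 0, 1), (0, 0, 1), (0, 0, 1))
print(pixel)
```

In a real engine this runs in a fragment shader with all the maps bound at once, which is the "one go" part.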
You can take a nicer shot, but I can't think of a modern game with the same amount of TECHNOLOGY as Crysis
video games have been stuck with small numbers of characters on screen since we entered the 3d era
hardware gets faster but people make more detailed models to compensate
most games still only handle 10-20 dudes on screen
because consoles are holding back PC, we've lost a good 10 years of graphical advancement thanks to consoles.
nah just play crysis in 2017 and you'll realize it's starting to age, the environments are really rough and blocky, it's a 2007 game that feels like a 2010 game
Deferred rendering is one of the largest steps forward for video games in the last decade. Crysis is forward rendered.
That has nothing to do with tech and more to do with the care put into the game like I said.
arma3
Pixar's renderfarm during Toy Story 1 was only ~12GFLOPS.
retarded meme
if PC was the primary platform, devs would target the average user's hardware, which would be low range PCs with console tier specs
...
t.idiot
You can massively detail that model by doubling the polygons, but whoever made that "infographic" is an idiot.
>Someone more knowledgeable could probably better explain how you combine the diffuse texture, specular, and normal map on the GPU, as I'm fairly certain engines are designed to load all texture layers onto the GPU in one go.
idk, not an engine programmer, but we combine them into materials.
Where it gets interesting is that we don't only get those 3 maps; we can also get masks and alphas and shit in there and instance those into other materials, which basically means we can use the same 4 base textures, do some retarded math with them I don't even understand, and end up with 40 completely unique materials that use very little resources.
looks like shit and is also a cutscene
>we've lost a good 10 years of graphical advancement thanks to consoles
congrats on the Sup Forums tier uneducated post
Crysis did some form of basic deferred rendering (you need to for SSAO), but deferred rendering isn't so much an upgrade as it is different. A lot of people these days are coming back to forward rendering because of the limitations of deferred
>those disks that have a useless polygon in the middle
>while railings in all areas are 4 sided
makes me just mad
Ehh doesn't really bother me much. I still play old games from time to time. Modern game worlds are not impressive and look generic as fuck.
>Crysis did some form of basic deferred rendering (you need to for SSAO), but deferred rendering isn't so much an upgrade as it is different. A lot of people these days are coming back to forward rendering because of the limitations of deferred
probably used the z-buffer for SSAO
>Those shoulder edge loops.
WHAT
How soon before we get these vidya implants that connect directly to our brains via the nervous system?
Forward rendering severely limits how many lighting sources you can have at one time. Each light source gives a larger performance hit in forward rendering than in deferred rendering. There are limitations in deferred rendering I'm sure but modern games are possible because of its benefits. I can't think of any game made in the last few years that is forward rendered.
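Back-of-envelope version of why many lights hurt forward rendering more. Units are arbitrary "shading ops", not a benchmark, and every number is invented:

```python
def forward_cost(shaded_frags, lights):
    # every rasterized fragment (including overdraw) runs the full light loop
    return shaded_frags * lights

def deferred_cost(shaded_frags, lights, light_footprint_px):
    # G-buffer pass touches each fragment once; each light then only
    # shades the screen pixels it actually covers
    return shaded_frags + lights * light_footprint_px

# 1080p-ish scene with 2x overdraw, 64 small point lights
frags = 1920 * 1080 * 2
print(forward_cost(frags, 64))           # light loop runs everywhere
print(deferred_cost(frags, 64, 50_000))  # far cheaper with many small lights
```

With one or two big lights the forward number is fine; the gap only opens up as the light count grows, which is the tradeoff both posts are circling around.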
This.
I'm as PCfag as they come but consoles are good as far as GPU targets go.
They sure as hell need more CPU power, but limitations are what lead devs to come up with innovative solutions.
Doom's temporal supersampling is fucking fantastic and it comes at an exact 5 frame fps loss.
Checkerboard rendering is fucking cool shit and is the equivalent of what Carmack pulled off to get Commander Keen to run on PCs way back when.
Better copypasta:
Toy Story 1 can almost run realtime on a Titan Xp.
>87 dual-processor and 30 quad-processor 100-MHz SPARCstation 20s
>film's 110,000 frames required the equivalent of 46 days of continuous processing
>rendering each frame took one to three hours of [a single] SPARC processor's time
sunsite.uakom.sk
>One 100MHz HyperSPARC CPU (used in SPARCstation 20) runs at 40.9795 MFLOPS:
performance.netlib.org
>The original movie was rendered at 1536x922 (1416192 pixels)
en.wikipedia.org
That's ~12GFLOPS for the render farm. A Titan Xp can do ~12TFLOPS (~1,000x more).
So a Titan Xp could render the movie in ~3974 seconds, or 66 minutes, or just over 1.2x faster than realtime at ~29fps (compared to 1h 21m at 24fps).
But the frames in the movie vary in complexity, with a variance of +/-100%, meaning the more complex scenes would run at ~14fps, and the simpler ones ~43fps.
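The arithmetic above, reproduced so anyone can check it. Inputs are the copypasta's claims (12 GFLOPS farm, ~12 TFLOPS Titan Xp, 46 days of continuous rendering), not independently verified:

```python
farm_seconds = 46 * 24 * 3600    # 46 days of continuous processing
speedup = 12e12 / 12e9           # Titan Xp vs the whole farm: 1000x
titan_seconds = farm_seconds / speedup
print(round(titan_seconds))               # ~3974 s
print(round(titan_seconds / 60))          # ~66 minutes

runtime_minutes = 81             # 1h 21m movie runtime
print(round(runtime_minutes * 60 / titan_seconds, 2))  # ~1.22x realtime
```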
the eyeballs are totally different, are you blind?
underrated post
>Titan Xp
>unironically falling for a meme card
Doom has their whole forward + thing going on.
...
What is it, advanced cost-free tesselation?
Doom 2016 uses forward rendering. I think the technique they use now is called clustered rendering, where they divide the screen into squares and pre-calculate the lights that are going to be in each square, so it turns out to be not much more expensive than deferred
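The "divide the screen into squares and pre-calculate the lights per square" part in miniature. 2D screen-space only, every name made up, and real implementations do this on the GPU:

```python
TILE = 16  # pixels per tile side

def bin_lights(screen_w, screen_h, lights):
    """lights: list of (x, y, radius) in pixels.
    Returns a map from (tile_x, tile_y) to the light indices it must shade."""
    tiles_x = (screen_w + TILE - 1) // TILE
    tiles_y = (screen_h + TILE - 1) // TILE
    grid = {(tx, ty): [] for tx in range(tiles_x) for ty in range(tiles_y)}
    for i, (x, y, r) in enumerate(lights):
        # conservative: every tile overlapped by the light's bounding box
        for tx in range(max(0, int(x - r)) // TILE,
                        min(screen_w - 1, int(x + r)) // TILE + 1):
            for ty in range(max(0, int(y - r)) // TILE,
                            min(screen_h - 1, int(y + r)) // TILE + 1):
                grid[(tx, ty)].append(i)
    return grid

# 64x64 screen, two small lights in opposite corners
grid = bin_lights(64, 64, [(8, 8, 4), (60, 60, 4)])
print(grid[(0, 0)], grid[(3, 3)], grid[(1, 1)])
```

Each tile's shader then only loops over its own short light list instead of every light in the scene, which is why the per-light cost stays low even in a forward pipeline.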
titans are literally dev cards
Just because you have no use for it doesn't mean the same goes for everyone
>Using apple to work or game
I thought this meme died long ago
>The virgin real-time
>The chad pre-rendered
prerendered vs real-time is apples vs oranges