I hate how people compare real rendered forests like my pic to other games that fake it, like resident evil
So you're upset that games don't have endless landscapes, and that the environments eventually end...?
Nah, mostly just people saying "Hurr, see, the PS4 can do amazing forests, look how good this RE forest looks!" when it's literally just two trees and a cardboard cutout.
I hate when people think good graphics make up for a shit game
Nothing titty & horse dick mods won't fix.
The ps4 seems to be capable even if devs are too lazy to make it more than two trees deep. It doesn't really matter. Who plays a game for a forest that is just a backdrop?
Game is great now. That WebM is from the first day it launched. It was patched ages ago.
Plays and looks miles ahead of anything I've ever played.
You must not have played many games
It's a boring piece of shit and tracking is literally just witcher senses
This is what I envisioned as the bare-minimum standard for in-game graphics within just a couple more years after Crysis.
It's been 10 years now, and they still are rare exceptions.
that roadrunner wind-up at the end
top kek
It still hurts.
Why are consoles stagnating games so hard?
Rather than consoles per se, it's the lazy-as-fuck Western AAA devs taking every possible shortcut, since they know the console kids will buy it anyway. At a 70-dollar price, at that.
This here, for example, is a 16-year-old console game running at modern UHD resolution. It barely looks even half that old.
Because gameplay > graphics?
There are PC exclusives that have amazing graphics and subpar gameplay, you can "play" those.
It's a pretty comfy game. Just walking around the woods. Finding indian artifacts.
>optimization is a secret now
But most games with shit graphics still have shit gameplay, and Crysis actually has great gameplay.
>hey guys, let's release a game that only a few PCs can run.
>Surprised it doesn't sell great
>"muh consoles"
Crysis is a garbage game. It's easy to release super shiny tech stuff that even a high-end rig can barely run 4 years later (yes, I'm talking about Crysis 3).
Comparing modern AAA devs to Team Silent simply isn't fair, user. Silent Hill (1-4) is what happens when devs are truly dedicated to their games.
I was more impressed by RE7's graphics than your garbage walking simulator with washed-out lighting, plastic foliage, and shit shaders.
>it's been 10 years
>games still rarely have this level of technology in them
Fuck man, this sucks.
How's the multiplayer? I heard it's very buggy.
I pity you, poorfag.
But if they have bad gameplay, why do you care if they have bad graphics?
You wouldn't want to play a shit game for the graphics, right?
You can load up Cryengine or UE4, fill a map with trees, grass and bushes and look at that if you want pretty visuals.
Just go to the forest nigga
>yfw entire franchise was outdone by a single movie
Why are jap devs so bad at games?
Seriously, dude, if you're going to bait you need to try harder. I know there's a lot of retards using this site, but c'mon, you make it too obvious. You're probably another retard.
What does that even mean.
That you can't have played a lot of games if you think that.
>too lazy
Why model something you'll never see, which will just make the game run slower by rendering unnecessary shit?
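To put that same runtime argument in code, here's a toy sketch (not any actual engine's code): anything the camera can't meaningfully see just gets skipped, so geometry nobody will ever look at is pure wasted budget.

```python
# Toy distance-culling pass: drop objects that are too far from the camera
# before they reach the renderer. Real engines layer frustum and occlusion
# culling plus LOD swaps on top of this, but the principle is the same:
# don't spend frame time on things the player can't see.
import math
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    position: tuple  # (x, y, z) in world space

def visible_objects(objects, camera_pos, max_draw_distance=500.0):
    """Return only the objects close enough to the camera to be worth drawing."""
    return [
        obj for obj in objects
        if math.dist(obj.position, camera_pos) <= max_draw_distance
    ]

scene = [SceneObject("tree_near", (10, 0, 20)), SceneObject("tree_far", (900, 0, 40))]
print([o.name for o in visible_objects(scene, camera_pos=(0, 2, 0))])  # ['tree_near']
```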
It had a shit story though that was changed like 6 months before release. Also fuck the zero g and flying sections near the end.
Give me a list of games in the last year that give you near-fully destructible and deformable environments and objects with at least realistic graphics/art style.
Weird considering RE7 is a walking simulator with washed out lighting, plastic foliage, and shit shaders.
You mean like Just Cause 3?
Not all games benefit from that stuff.
I hope he says The Division so I can laugh in his face.
Does Crysis 1 have dynamic lighting/open world/GI/PBR/tessellation, etc.?
No, then fuck off.
Takayoshi Sato is one dedicated mofo. Taught himself 3D animation in the late 90s when you couldn't Google that shit, while in Japan.
Obviously not, but when technology evolves iteratively, a lot more games like that should become possible, and yet they aren't being made.
Nice excuse.
No wonder the gameplay is so shit. He was experimenting with 3D.
He did the cut scenes.
What excuse is that, user? I gave you a list of standard techs that Crysis 1 doesn't have. So don't talk about technology just because your garbage has some destruction here and there (Battlefield does it 10x better, btw).
>"Hey guys look quake 3, wow amazing graphics, what a revolution"
>"yea but worms 2d has destruction, therefore its better xd"
Fucking poorfag.
You mean the terrible uncanny valley scenes?
Games can't have physics as part of gameplay on PC because fewer people would be able to play. Nvidia owning PhysX and not letting AMD use it pretty much ruins it.
Games like BotW, which is on a really weak system, can have interactivity with physics in interesting ways because the developers know exactly what platform everyone will play it on.
If there were only two different PCs, one mid-low and one high end, I'm sure more developers would make interesting games, where the difference was only in better graphics between the two PCs.
This is also why high-end PCs are a joke currently. Too few people have one for developers to make good use of them, so medium settings end up equal to PS4/Xbox settings, and the higher settings only add higher-res textures, shadows, and LOD distance, stuff you can easily change in .ini files.
High-end PCs are no longer a joke now that 4k exists, but yeah, before that they were.
People love to rag on Crysis for being a soulless tech demo, but the game clearly was a labour of love. Still prefer Far Cry, though.
Yes, it is better. Reactive environments should be the bare minimum.
Graphics don't mean shit.
Man the guy doing that is so annoying. It is really obvious that he is pandering to children.
Games are almost equally good at 4k and 1080p.
4k is also a joke because you notice pop-in, lower-res textures, and jaggies more easily, and no developers account for the higher resolution.
Also, very minor but if you're just barely supposed to see something at 1080p, 4k kind of ruins it. Similar to picking a brighter brightness in games that are supposed to be dark.
But how much better do games today look than they did 10 years ago? The jump in graphics is not as large as the jump in graphics in the 10 years previous. But things like physics could have been improved a lot and made many kinds of great games possible. The industry has clearly moved towards creating games more focused on looks and not on innovative ways to use technology.
>Games can't have physics as part of gameplay on PC because fewer people would be able to play. Nvidia owning PhysX
You don't need PhysX to do physics sims on a GPU, what are you talking about? PhysX is literally just Nvidia's API for it; anyone can write their own.
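For what it's worth, here's a minimal, vendor-agnostic sketch of the kind of per-particle update a physics sim runs every frame (NumPy on the CPU purely for brevity; the same kernel could just as well be a compute shader on any GPU, no PhysX required):

```python
# One semi-implicit Euler step over a big batch of particles. Nothing here is
# tied to any vendor's API; on a GPU this would be one thread per particle.
import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])

def step(positions, velocities, dt=1.0 / 60.0):
    """Advance every particle by one timestep."""
    velocities = velocities + GRAVITY * dt
    positions = positions + velocities * dt
    # crude ground plane: clamp to y >= 0 and zero out downward velocity
    below = positions[:, 1] < 0.0
    positions[below, 1] = 0.0
    velocities[below, 1] = 0.0
    return positions, velocities

pos = np.random.rand(100_000, 3) * 10.0  # 100k particles scattered in a 10m cube
vel = np.zeros_like(pos)
pos, vel = step(pos, vel)
```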
>jaggies
Meant edges with few polygons that look round at 1080p but not at 4k.
>Games are almost equally good at 4k and 1080p.
God no.
Here's GTA V at 1080p with maxed settings and 8x MSAA.
The locked down technology and the whole deal with Nvidia is pretty shitty, yeah. Definitely adds a lot to the problem.
Still, the physics in Crysis wasn't really that intense. It has been 10 years and we should be able to do a lot more with physics now, even if the graphics we choose to implement in a game are a bit below the current state of the art.
You'll find that games with individually designed graphical assets are shorter games with fewer exploration choices.
youtu.be
The industry moves the way a fat man's bowels do, constipated at best. They get so backed up that they shit stupid crap no one wants out of their mouths, like waggle controls, tablet consoles, and VR.
And here's GTA V at 4k with lowered settings.
The Mass Effect Andromeda mess is a perfect example of how terrible the industry is at iterating on past work and sharing base technologies.
Think about what would happen if everyone just pooled their collective efforts on the software instead of reinventing the wheel for every new entry or IP.
The funny thing here is you literally just proved his point. Side by side the difference in those pictures is very minimal. The power required for decent performance at 4k is not worth it at all. 4k is a fucking joke.
Good job proving his point by accident.
Kind of already do that when most just use Unreal
That's one way to put it.
>Side by side the difference in those pictures is very minimal.
Performance is also nearly exactly the same.
Running 4k with slightly lowered settings is much preferable to running 1080p maxed out considering the extra detail and crispness you get out of such a huge resolution bump.
People don't have computers capable of running games with absurd physics/graphics in them. Just look at Steam's hardware survey as an example: 3.39% of people have a 1070, and 1.66% have a 1080, so about 5%, and even then, at 1440p it's not hard to max out, or at least come close to maxing out, a 1080's performance.
Most devs could produce games that looked better, but almost no one would be able to run it, just like how for Crysis most people ran at low settings and the game looked completely different than screenshots posted now.
>It has been 10 years
And it was way ahead of its time. You've seen those fairly recent water physics simulator videos? You could probably include that in a game and say it's amazing and have three people run it at 30 FPS while playing.
First of all, you're posting still images of a video game that's supposed to be in motion.
Secondly, they look equally fun to play.
Thirdly, imagine if there was a detail where there was a small note stapled to a tree and when you went up to look at it, a bunch of thugs would sneak up and ambush you. That scenario wouldn't work as well on 4k because you can stand twice as far away and read it.
>being this stupid
AMD has their own version of Gameworks, GPUOpen, which is open source.
>most
No, most high budget companies use their own engines or some other engine that's not publicly available. If we're talking about independent developers, then yes that is the case. This is actually a reason why I prefer a lot of their work over AAA developers'.
I mean, yeah, if you can run it smooth at 4k then there's no reason not to; it's technically superior, that's a fact. My point is that although it is better, the difference is very minimal to the point that most of the time you don't even notice unless you're looking side by side and squinting. Why buy a 1080 Ti to try and force games to run at 4k, not knowing how smooth it'll be, when you can just buy a 1070 and play at a guaranteed 60+ fps at all times at 1080p?
>People don't have computers capable of running games with absurd physics/graphics in them.
You don't need a 1070 or higher for absurd physics/graphics.
Considering this demo: youtube.com
And Crysis on release could run maxed (or near maxed) on a single 8800 GTX at 1600x1200.
>the difference is very minimal
The difference is only minimal when comparing absolute maxed 1080p with insane amounts of anti-aliasing to 4k, which would result in almost the exact same performance (or worse in some games depending on how well MSAA is implemented), so it makes it kind of moot.
It's preferable to play at 4k with 60 FPS and slightly lower post processing than 1080p with all the bells and whistles at the same framerate, and they're both often just as demanding, so you'd need a 1070 or greater regardless.
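Back-of-the-envelope numbers on why those two targets end up similarly heavy (rough sketch only: MSAA usually shades once per pixel per covered triangle rather than per sample, so real costs vary per game, but stored sample count is a reasonable proxy for the raster/bandwidth side):

```python
# Compare raw raster sample counts: 1080p with 8x MSAA vs native 4k with no AA.

def samples(width, height, msaa=1):
    """Total stored raster samples for a given resolution and MSAA factor."""
    return width * height * msaa

p1080_msaa8 = samples(1920, 1080, msaa=8)  # ~16.6 million samples
uhd_no_aa = samples(3840, 2160)            # ~8.3 million samples

print(f"1080p + 8x MSAA: {p1080_msaa8 / 1e6:.1f}M samples")
print(f"4k native:       {uhd_no_aa / 1e6:.1f}M samples")
print(f"ratio: {p1080_msaa8 / uhd_no_aa:.1f}x")  # 2.0x
```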
>People don't have computers capable of running
Even if that's true, now we're talking about a company's desire for profits. IIRC Crysis was made on a pretty low budget compared to most AAA games even of its time, but they still were able to get by. Studios and companies obviously want money, but that's the problem. Very few developers with money, like the AAA companies, have the desire to make games that are innovative, only the desire to satisfy large target audiences.
The problem isn't so much graphics anymore; even Crysis-level graphics wouldn't be THAT bad today. Dynamic elements in games could be so much better, and if developers chose to, they could make some really interesting gameplay with those elements in mind.
That's a 680 in 2012, aka the equivalent of owning a 1080 now. Most games do look better than that currently; nothing about that even looks good compared to what we have now.
>Most games do look better than that currently
No, not really, I can think of plenty of games that look quite a bit worse than that tech demo.
Hell, when it comes to particles the only games that even come close are games that use PhysX.
>implying Worms isn't better than Quake 3
The lighting looks worse?
Game?
Just kidding, you post these threads every day.
Fuck you shill/dev.
REPORTED
this is pathetic
A forest with deciduous trees like that would be much thicker. If they wanted to be cheap on resources, they should have made a coniferous forest; those can actually be thin like that.
Not OP, but the forests are very dense in this game; it gets a lot denser than in the OP pic. I don't have a screenshot of it though, so here is a picture I took on my safari instead.
>if you're just barely supposed to see something at 1080p, 4k kind of ruins it.
This user is correct. CoTW has the most authentic feeling forests I've seen in a video game to date.
Read the example I wrote here related to GTA V.
>imagine if there was a detail where there was a small note stapled to a tree and when you went up to look at it, a bunch of thugs would sneak up and ambush you. That scenario wouldn't work as well on 4k because you can stand twice as far away and read it.
There are a ton of similar instances, where traps are more visible the more pixels that show them.
>Can barely see something in low res
>Crystal clear in high res
That's not how it works user.
>enhance this blurry security camera footage!
>ENHANCE!
>The jump in graphics is not as large as the jump in graphics in the 10 years previous
Yes it is.
The problem with you guys is that you take Crysis 1 as the average game in 2007, when it wasn't.
>2k texture of a note on 400x400 pixels (part of 1080p screen)
vs
>2k texture of a note on 800x800 pixels (equal part of 4k screen)
Of course it doesn't matter if the texture is low-res, but you graphics boys don't like that, do you?
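Same comparison in code form (using the 400x400 / 800x800 figures from the post above): the note covers the same fraction of the screen either way, the 4k framebuffer just spends four times as many pixels on it, so more of the 2k source texture actually survives to the screen.

```python
# Screen pixels spent on the same in-world note at each output resolution,
# assuming it covers the same fraction of the screen in both cases.

def on_screen_pixels(frac_w, frac_h, screen_w, screen_h):
    """Pixels covered by an object occupying the given fraction of the screen."""
    return round(frac_w * screen_w) * round(frac_h * screen_h)

note_frac = (400 / 1920, 400 / 1080)  # the post's hypothetical 400x400 patch at 1080p

at_1080p = on_screen_pixels(*note_frac, 1920, 1080)  # 400 * 400 = 160,000 px
at_4k = on_screen_pixels(*note_frac, 3840, 2160)     # 800 * 800 = 640,000 px

print(at_1080p, at_4k, at_4k / at_1080p)  # 4x the pixels for the same object
```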