Why are games still using pre-rendered cutscenes...

Why are games still using pre-rendered cutscenes? I thought graphics nowadays were good enough that things didn't need to be pre-rendered anymore. What gives?

pre-rendered looks better, loads faster, and is sometimes easier to make

I'm pretty sure that it isn't about graphics, it's about saving space or something.

So you can get a 100% guaranteed good framerate on weaker hardware.

Opposite: Pre-rendered cutscenes are basically video files, so they take up a lot of space compared to rendering the scene in real time with assets the game already has.

Because why add extra resources and files when you can just have the one in-engine scene.

depends on the situation. sometimes it's used as a way to hide loading.

sometimes it's a way to have vastly different scenes play out seamlessly instead of loading, like one scene is in Italy and another is in Antarctica. it's a way to cut down on loading.

though sometimes it makes no sense to use it when the scene is just talking heads.

The main reason is because consoles are shit.

Didn't PCs use pre-rendered graphics long before consoles ever did?

The advantages to not using pre-rendered cutscenes are that it saves an immense amount of space, allows the player's choices to show (even in how the environment is affected by the player), automatically scales with framerate, and makes the transition to gameplay smoother.
These aren't that important, though. Storage space, drive capacity, and download speeds have never been larger, cheaper, and faster respectively (though some people have to worry about data caps); most games' cutscenes wouldn't take player choice into account anyway; consoles are the standard for almost all games, and they aim for a locked frame rate; and most games don't bother with a smooth transition to gameplay even when cutscenes are real-time (which is a shame).
Still, there are plenty of games where real-time cutscenes make plenty of sense (games where costumes are important, games where cutscenes are interactive, etc.), so there shouldn't be a hard rule for whether cutscenes should be pre-rendered or not.
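To put rough numbers on the space argument, here's a back-of-the-envelope sketch in Python (the bitrate and runtime are made-up example values, not figures from any particular game):

# Rough estimate of the disk space pre-rendered cutscene video eats.
bitrate_mbps = 8     # assumed 1080p30 video bitrate, in megabits per second
minutes = 60         # assumed total cutscene runtime

size_gb = bitrate_mbps * 60 * minutes / 8 / 1000   # megabits -> gigabytes
print(f"{size_gb:.1f} GB of video")                # -> 3.6 GB of video

Double the resolution or the bitrate and it balloons fast, which is how a game's install can end up being mostly video.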

>The main reason is because consoles are shit.
>All cutscenes in Horizon, Uncharted 4, and GTA V are rendered in real time

>saving space or something
>MGRR bloated with pre-rendered cutscenes, increasing the filesize to 20GB when the main files are probably only 5GB
sure m8

*autistic screeching*

Pre-rendered cutscenes are used to hide loading screens and to allow for visuals that couldn't be rendered in real time. Some games use them primarily for the first reason; in others it's a mixture of both.
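As a sketch of the loading-screen trick (purely illustrative; the function names are hypothetical and the sleeps stand in for real work):

import threading, time

def load_next_level():
    time.sleep(5)   # stand-in for streaming the next area's assets (hypothetical)

def play_cutscene_video():
    time.sleep(8)   # stand-in for decoding and presenting the pre-rendered video

# kick off the level load in the background, then play the video over it
loader = threading.Thread(target=load_next_level)
loader.start()
play_cutscene_video()
loader.join()       # by the time the video ends, the load is (hopefully) done
print("gameplay resumes, no visible loading screen")

If the load takes longer than the video, you still end up with a loading screen before or after the cutscene anyway.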

Yeah, but we've been in an age where gameplay visuals are better than pre-renders for a while. Like why do Platinum games have ugly 30fps cutscenes when gameplay graphics, on PC at least, are better? There are other reasons too, like paying CGI dudes so they don't starve to death. Economy boosting stuff. And because CGI dudes have better tools for framing cinematic shit.

But in most cases from what I can tell it's because consoles can't render particular scenes, though I'm sure there are some games with scenes even modern PCs couldn't render too. Who knows.

Got a source on that? Just curious.

>Pre-rendered cutscenes are used to hide loading screens
You wish. Games still have loading screens before and after them.

>he can't take a joke
I was clearly exaggerating. It's a bizarre thing though. Take ROTTR for example: cutscene Lara and the general graphics look much worse in the FMVs compared to gameplay.

What's the purpose?

>loads faster
This is the key reason here, along with performance concerns. Nobody wants loading times before cutscenes, pop-in or fade-in, or framerate hitches screwing up the flow of cinematics. That's why you often see in-engine cutscenes that are presented as pre-rendered video anyway.

>Game allows you to give your character different costumes or equip them with different weapons.
>If you do, though, you have to deal with jarring switches in appearance between cutscenes and gameplay.

that ass is as flat as a pancake

>enjoying Valkyria Chronicles at 144hz with nice 4k graphics
>720p 30fps FMV comes on after every mission

>>enjoying Valkyria Chronicles at 144hz with nice 4k graphics
Must be nice, but I find that a little hard to believe.

Actually, piggybacking on this, I heard that display connections with the bandwidth that would allow such a setup don't exist yet, at least not for general consumer use.

???

And you're playing this at 144hz?

Yeah.

Whatever you say, fatty.

What's your setup, including the type of cables and monitor you use?

You're getting creepy.

Okay, how about just the monitor?

Apparently, current cables can only do 4K @ 120Hz; they can't do 4K @ 144.
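Quick back-of-the-envelope check on that (a sketch assuming uncompressed 8-bit RGB; it ignores blanking overhead, which only makes things tighter):

# Does uncompressed 4K fit in DisplayPort 1.4's usable bandwidth?
width, height, bpp = 3840, 2160, 24   # 8-bit RGB
dp14_gbps = 25.92                     # HBR3 payload rate after 8b/10b encoding

for hz in (120, 144):
    need = width * height * bpp * hz / 1e9
    print(f"{hz}Hz needs {need:.2f} Gbps ->", "fits" if need <= dp14_gbps else "doesn't fit")
# 120Hz needs 23.89 Gbps -> fits
# 144Hz needs 28.67 Gbps -> doesn't fit

The 4K 144Hz monitors that do exist get around this by dropping to 4:2:2 chroma subsampling (or, on later hardware, display stream compression) at the top refresh rates.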

You don't have to answer, Anonymous, we just don't believe you.

The pre-rendered cutscenes near the end of the Witcher 3 (I think during the siege of Kaer Morhen) were fucking god awful.

I went from running that game at ultra settings to looking at some grainy, 720p, 30fps bullshit that even had a different volume level than the normal game so I could barely hear it.

Honestly, pre-rendered cutscenes need to die.

I'll pass, stalker.

You need to do just a bit more research, you're almost there.

Plebs

That's what I thought (and still suspect), but why are there 144Hz 4K monitors, then?

Do you really need a fucking source for Horizon, Uncharted 4, and GTA 5's cutscenes being in real time? It's incredibly fucking obvious. Like what the fuck would make you think they'd be pre-rendered? Jesus fuck, dude.

It's not obvious to me because I haven't 100%'d any of those games, and I recall some of the videos not looking anything like gameplay at all.

When the entire rendering pipeline has changed, I really start to wonder, don't you?

Consumers' standards have risen, while consoles aren't that powerful.

Because that anon and you are dumb.

You can't just answer the question? DisplayPort 1.4 only supports 120Hz at 4K.

...

you're an idiot and don't know what you're talking about.

i like the feeling