Most movies filmed in 4k now

>most movies filmed in 4k now
>digital instead of film
>HDR
>Dolby Atmos surround sound
>Projection technology better than ever, better contrast and deeper blacks.
>TVs are moving to OLED, with great blacks and fast response times
>But film is still stuck at 24fps and TV is stuck at 30fps/60 fields

Why is framerate the one thing studios refuse to budge on? Films keep pushing higher and higher resolution, but apart from Peter Jackson no one even tries to shoot at high framerates outside of slow motion scenes.

because Hackson tried and it looked really bad

Because the soap opera effect looks terrible and fake. They only need 60fps for panning shots.

Higher framerates don't work for most films because it alters the tone. Comedy programmes work at higher framerates though, a lot of the 80's programmes were shot that way in the UK.

Get

How does it change the tone?

Sup Forums's 3MB limit is fucking garbage, it's almost impossible to make movie footage look good.
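
Back-of-the-envelope math on why (Python; the 30 second clip length is just an assumed example):

    # Rough video bitrate available under a 3MB size cap (container overhead ignored).
    SIZE_LIMIT_BYTES = 3 * 1024 * 1024   # the 3MB upload cap
    CLIP_SECONDS = 30                    # assumed clip length

    total_kbits = SIZE_LIMIT_BYTES * 8 / 1000
    video_kbps = total_kbits / CLIP_SECONDS
    print(f"~{video_kbps:.0f} kbps for video")   # ~839 kbps for a 30 s clip

That's not a lot of bits for high-motion footage, which is why it falls apart.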

Interesting factoid. They were broadcast at the usual 25 fps though, right?

why don't they just aim in front of it?

Because it doesn't look "cinematic", it looks weird because we are used to seeing films at 24 fps.

24fps in film is the sweet spot for atmosphere and action. Also, framerate isn't everything - read this to see why.

cinemashock.org/2012/07/30/45-degree-shutter-in-saving-private-ryan/
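
The core of that article is the shutter angle math. Quick sketch (standard formula; the 45° figure is the Saving Private Ryan example):

    # Exposure time per frame from shutter angle: t = (angle / 360) / fps
    def exposure_time(shutter_angle_deg: float, fps: float) -> float:
        return (shutter_angle_deg / 360.0) / fps

    print(1 / exposure_time(180, 24))   # 48.0  -> the normal 1/48 s blur at 24fps
    print(1 / exposure_time(45, 24))    # 192.0 -> 1/192 s, the crisp staccato look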

Because people will complain that it looks too smooth. They complain now with 120Hz 4K HDTVs. You have to turn off some motion settings, and the picture still looks smoother and clearer than ever before. It's great imo, but the average person hates a change like that.

Different person, but I've noticed that on BBC America most shows look smoother than American shows, not sure if it's 50fps or just the way they were filmed. Ever since I was a kid I've always wondered why British TV looked different from American TV, guessing it's the framerate.

why can't he lead his target

IIRC the guy is a scientist who created the beastosaurus rex but isn't weapons trained, he's an egghead way outta his league.

There are other factors as well like motion blur. It's why on a lot of old programmes you notice ghosting.

Underage faggot detected

Speaking of ghosting, remember plasma screen TVs and all the complaints about them? It seems like LED has totally taken over.

Why does OP webm look so wonky, but the 60+fps webms I've seen anons post look wonderful?

He's talking about the gunner, who in the movie claimed to be a combat vet, yet couldn't manage to hit a huge target not even 100m away.

There was a time when nature shows that wanted "slow motion" footage of small animals risked the subject bursting into flames under the insanely bright lights needed to film it. Higher speed filming means a lot more light, unless you like all those colorful high-ISO noise artifacts.
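
To put numbers on the light problem (assuming a 180° shutter, the usual case):

    import math

    # Light lost per frame when you raise the frame rate, in stops.
    # With a 180-degree shutter, exposure time is 1 / (2 * fps).
    def stops_lost(old_fps: float, new_fps: float) -> float:
        old_t = 1 / (2 * old_fps)   # 1/48 s at 24fps
        new_t = 1 / (2 * new_fps)   # 1/240 s at 120fps
        return math.log2(old_t / new_t)

    print(stops_lost(24, 120))   # ~2.32 stops

So 120fps needs roughly 5x the light, or 5x the ISO - hence the noise.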

I avoided buying a plasma TV at the time. I nearly did but I was advised not to so I didn't get to see the ghosting personally.

LCD TVs were well fucking expensive back then; it cost me £1000 for a 37" in 2006. It's still going strong even now.

It's a movie. Turn off your brain, bro.

Really man? Are electronics typically so expensive in England? I just picked up a 48" LED 4K "smart" TV for $370, courtesy of Costco. There were some absolutely massive ones there too, like 65" and up. Where you would put such a monster I don't know.

This will be handy for those who don't know how simple filming and effects choices can change the tone of the product.

You were misled, only OLED can top Plasma in picture quality.

Pretty much. In truth though the Aussies have it far worse.

It's gotta be import costs or tariffs or something. It's not like American Samsungs, Vizios, or Toshibas are any different from English or Australian ones.

The framerate is too low.

>Most movies filmed in 4K now

No.

>digital instead of film

Yes

>HDR

No.

>Dolby Atmos surround sound

Is the mix

>Projection technology better than ever, better contrast and deeper blacks

No, actually, if it's digital.

>But film is still stuck at 24fps

No, never has been.

Going from film to digital means worse resolution. Enjoy your 1080p hell.

No.

That user doesn't know what he's talking about. Old British TV was 25fps, Old American TV was 30 (or 29.blah blah) FPS. Old British TV had a smidgen more resolution than Old American TV.

The downgraded image quality in earlier British productions was down to crappier lighting, equipment and other budgetary constraints.

They are different actually. But yes, it's the earlier things you mentioned. Taxes and simple distribution differences (smaller market).

Mainstream digital cinema cameras are available in 6K now. It's not a big issue. You can have a post production pipeline in 6K that is still cheaper than film, now, too.

It's just not that common yet, because of money and "it's good enough".

High frame rate in movies makes them look like daytime TV. It looks amateurish, almost as if it was filmed on a camcorder. Film *needs* 24fps to look and feel correct. Because of the lack of frames, motion blur is used. Your brain fills in the gaps and it's a less stressful experience for your mind.

Because it's more expensive. That's literally the only reason. And 60fps isn't high frame rate, it's medium frame rate. High frame rate starts at 120fps.

GoTG2 was shot in 4K yet it was 1080p after the CGI had been applied. So any attempt to make it higher is simply an upscale and nothing more.

Meanwhile, movies filmed on film from 50 years ago can with ease be remastered into 4K releases.

>parents have huge high frame rate TV in living room
>i guess it computes missing frames to make everything 60fps
>my mom has been into movies her whole life and doesn't even notice how bad it looks
>every time I look at it I want to vomit

>Old British TV was 25fps, Old American TV was 30(or 29.blah blah) FPS
True, but only because of a weird and counterintuitive definition of "frame". Unless it was converted from film, the interlaced fields were offset in time, so the temporal resolution was double the frame rate.
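
Putting numbers on it (standard PAL/NTSC rates):

    # Each interlaced "frame" is two fields captured at different instants,
    # so motion is actually sampled at the field rate.
    SYSTEMS = {"PAL (UK)": 25.0, "NTSC (US)": 30000 / 1001}

    for name, frame_rate in SYSTEMS.items():
        print(f"{name}: {frame_rate:.2f} frames/s -> {frame_rate * 2:.2f} motion samples/s")
    # PAL (UK): 25.00 -> 50.00, NTSC (US): 29.97 -> 59.94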

WHAT? 200 million American shekels and the movie is 1080p?

Because I'm still trying to learn the best way to make webms. Also, interpolation works better for some movies than others.
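
For what it's worth, here's one way people do the interpolated ones (a sketch driving ffmpeg from Python; the filenames, 720p scale and 800k bitrate are just assumptions to tweak):

    import subprocess

    # Motion-interpolate a clip to 60fps, downscale, and encode a VP9 webm with no audio.
    cmd = [
        "ffmpeg", "-i", "clip.mkv",
        "-vf", "minterpolate=fps=60,scale=-2:720",
        "-c:v", "libvpx-vp9", "-b:v", "800k",
        "-an",
        "out.webm",
    ]
    subprocess.run(cmd, check=True)

minterpolate is slow and smears fast cuts and complex motion, which is probably why some movies come out looking much better than others.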

No.

It was shot in 8K, and the CGI was applied as part of the typical post production pipeline, which was 2K (the resolution the film was finalized at).

Just about every movie shot on film in the last 10 years, and 99.9% of those with extensive CGI in the last 15 years, has also been finalized in 2K.

Now, the resolution of film is (potentially) higher than a lot of the movies shot on 2K in that time period, so the image quality should be better.

But right now you are quite capable of shooting in 6K (which resolution wise is going to match or surpass any currently available cinema film stocks, in as much as you can compare digital and analogue) and finish in 6K if you want.

It's just that no one wants to.

>With ease be remastered into 4K.

It's not with ease. It's actually very difficult, and sadly, for the same reasons studios don't shoot and master in 6K, it's often deemed not worth it.

They should learn how to make better movies before trying to make them look better.

People used to shoot deep focus using incandescent lights. We have high CRI LEDs now. Sticking with 24fps is pure cheapness, and audiences are dumb enough to fall for the industry propaganda tricking them into thinking it's "artistic", so it's unlikely to change soon.

Because for some reason a higher framerate draws attention to effects looking fake. I don't know why either.

Also, your post makes no sense at the beginning.

>Any attempt to make it higher is simply an upscale

That is true of any film finished in "x" resolution.

Hence why, if you watch a lot of 90's films with CGI, the CGI elements will often feature a degraded, often very grainy/noisy image quality - the CGI elements were scanned at 2K (often lower), and thus when photographed back out they were "blown up", or simply not at the same level of resolution and quality as the rest of the film, which had been finished photo-chemically.

This started to change around the time of O Brother, Where Art Thou?, when they started scanning and finishing entire films digitally as a digital intermediate (for color correction).

>watch 24 fps rotating panning shot of landscape in theater
>visibly choppy as fuck, to the point of being distracting
>beautiful scene is now ruined

I can't wait for the "choppy 24 fps cinematic experience" meme to end. Jackson did God's work

Right. One big factor that many people ignore is that a lot of British drama was shot on video, while in the U.S., due to higher budgets, shooting on film was common for drama shows.

They don't want higher framerate because it makes shitty CGI look even worse than it otherwise would. That's all it really comes down to.

It's not.

It's 2K, like most movies now.

Like higher resolutions, it also simply costs more.

Capping at 2K early on was due in part to post production houses not wanting to deal with extra storage and render farms, and higher frame rates were discarded in the early years because it was more expensive and 24 was "good enough". Edison was quite mad about it.

2K is just 2048x1080. On a 1080p TV it gets shrunk to 1920x800 though.
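
Roughly where that 1920x800 figure comes from (assuming a 2.39:1 scope film; a flat 1.85 picture scales differently):

    # Fit a DCI 2K image onto a 1920-wide 1080p screen, preserving aspect ratio.
    def fit_to_width(src_w: int, src_h: int, dst_w: int = 1920) -> tuple[int, int]:
        return dst_w, round(src_h * dst_w / src_w)

    print(fit_to_width(2048, 1080))   # full 2K container -> (1920, 1012)
    print(fit_to_width(2048, 858))    # DCI scope 2.39:1  -> (1920, 804), i.e. ~1920x800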

The Hobbit looked like shit, absolute shit.

>devices decoding 4k video at 60 fps
um user

Right. It's not 2K, it's a lower resolution. No major cinema release from a studio in Current Year is below 2K. Many, like for example, The Last Jedi, are finishing in 4K now.

Even Attack Of The Clones (which was captured in 1080) was finished in 2K, along with its FX work.

My god. Beautiful digits, user, where'd you get those?

It didn't look MUCH worse than the regular 24fps release I watched at home.

Billy Lynn was interesting. I found the HFR version I watched kind of odd, but it looked totally "normal" at 24fps, so there definitely is an effect with HFR that puts you off slightly - and that wasn't a CGI heavy movie or anything. It was a realistic looking war picture (in part).

Drop the resolution. 24fps all blurs to shit whenever anything moves, so it's obvious nobody really cares about HD anyway.
480p120 > 8Kp24
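
For scale, the raw numbers behind that trade (854x480 and 7680x4320 assumed for 480p and 8K):

    # Spatial throughput vs temporal sampling for the two extremes.
    def pixels_per_second(w: int, h: int, fps: int) -> int:
        return w * h * fps

    print(pixels_per_second(854, 480, 120))    # ~49 million px/s, 120 motion samples/s
    print(pixels_per_second(7680, 4320, 24))   # ~796 million px/s, 24 motion samples/s

8K wins on raw pixel count by a mile; the argument is that the 5x temporal sampling matters more once anything moves.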

I should add I had to watch both in 3D, and I really don't like 3D, so that's going to affect my viewing experience hugely.

>panning shots are lovely instead of distracting
>can actually tell what is happening in action scenes

It looks too real. You suddenly feel like you're watching actors on a set rather than characters in a story.

2K has the same vertical resolution as 1080p, but with 2048 horizontal pixels instead of 1920.

kek.

480 would be unbearable on a cinema screen.

Just shoot 4K, 120FPS for cinema release (since projectors can currently handle those specs) and scale back to 1080, 24fps for the plebs at home.
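
The frame rate half of that downconversion is trivial, since 120 is an exact multiple of 24 (sketch; the frame list is a placeholder):

    # Derive 24fps from a 120fps capture by keeping every 5th frame (120 / 24 = 5).
    def decimate(frames_120: list, step: int = 5) -> list:
        return frames_120[::step]

    capture = list(range(120))        # stand-in for one second of 120fps frames
    print(len(decimate(capture)))     # 24 -> one second at 24fps

The catch is each kept frame was only exposed for ~1/240 s, so the 24fps version looks like it was shot with a very narrow shutter unless you blend neighbouring frames back together.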

this

No it doesn't. I'm not sure where you've gotten this idea from or what you're basing it on.

HFR looks like shit.

Why does it give me such an uneasy feeling?

Because it looks weird.

I doubt you've even seen any. There's only been one true high frame rate movie (Billy Lynn's Long Halftime Walk), and the HFR version had a very limited release. 48fps barely counts as medium frame rate.

A lifetime of industry brainwashing. It's the same trick they used to make people think 3:2 pulldown was acceptable.
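
For anyone who hasn't seen it spelled out, 3:2 pulldown is just an uneven repetition pattern to cram 24fps into ~60 fields per second (simplified sketch that ignores the top/bottom field split):

    from itertools import cycle

    # Hold each film frame for 3 fields, then 2, alternating:
    # 4 frames -> 10 fields, so 24 frames/s -> 60 fields/s.
    def pulldown(frames: list[str]) -> list[str]:
        fields = []
        for frame, hold in zip(frames, cycle([3, 2])):
            fields.extend([frame] * hold)
        return fields

    print(pulldown(["A", "B", "C", "D"]))
    # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D'] - the uneven cadence is the judder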

HFR looks odd in a movie. Jackson tried it with The Hobbit and it just didn't look right. None of the other HFR movies have felt right either.

But I do recall Cameron insisting on using it for Avatar$ so who knows. Maybe something will come from it.

it has 2000 horizontal pixels.
is this bait

2K literally does have the exact same vertical resolution as 1080p.

Is 4K the absolute limit we can push film resolutions? I feel like we're reaching a point where the studios don't want to invest in higher resolutions.

I saw it, and I thought it was weird. I did mention above that the 3D is a problem for me. Also, Billy Lynn was variable. You could tell what was 24fps vs what was higher, but fucked if I could tell what was, say, 48 vs 120fps.

While it did look better than The Hobbit, I still found it disappointing. The battle scenes had a sports game quality to them. Ironically, the stadium scenes looked really good with the fireworks.

I still want to see more stuff done in it to get a feeling for it. Maybe something LIKE Lawrence Of Arabia - break out the 8K and finish in 4K, 120fps. Don't mind if it's 3D even if it's shot in that slow, wide style.

I really don't know why it's so hard for Hollywood to do. You can still do action scenes, just do them like Jackie Chan flicks where it's a lot of long takes with smooth/slow camera movement.

good chart, saved

8K is possible, but like 70mm, it's a waste of money on today's narrow FOV screens. We need to go full dome (obviously after we've fixed the low frame rate problem) to make good use of 8K.

Sure. The pixel count is different, hence the higher resolution, hence its choice for projection/cinema capture.

That also assumes the "standard", when in reality 2K can be higher than 1080 vertical lines.

>Also Billy Lynn was variable.
Shit. I never saw the HFR version and now I learn it was a scam all along. Variable frame rate is even worse than low frame rate (except in anime, where I consider shit motion quality part of the style, because it's more like upgraded manga than downgraded real-life).

Eh, not really. A film shot on 65mm, even if projected on smaller screens, still looked better than one shot on 35mm.

Capturing and finishing at higher resolutions is good regardless of how low you go, while of course, the bigger you go the worse the lower resolution material will look.

As mentioned above, they shot Guardians in 8K. Just as in the olden days, even if you were finishing for TV, many shot on 35mm simply because the image is sharper, with better contrast and better quality overall, the less you need to blow it up.

This will all happen eventually. The Star Wars films are being shot in 35mm/65mm and finished at 4K now. 6K capture and 4K mastering will be the blockbuster standard, I assume.

Infinity War will be 6.5K shooting, 4K mastering.

I don't know. In fact, you know, that could be WHY the HFR seemed weirder than I expected. Varying that shit is kind of odd and I guess it wouldn't let my eye "settle".

I'd like to just see someone shoot an action film that isn't CGI heavy, something like a Bond movie, with lots of car chases and martial arts, at 60fps. No 3D. Just to get a feel for it.

Huh. Reading the Wiki, it says it was shot in 120fps. But I remember reading it was variable. If that's the case, it definitely looks shit, because there were parts that looked 100% like a "normal" film and then parts that had that hyperreal quality.

Variable frame rate ruined Coraline for me. So close to being my favorite stop motion movie, but they throw in random 12fps scenes for no obvious reason. Not even expensive looking scenes most of the time. Every 12fps scene broke my immersion, just like the frame skipping scenes in Fury Road.

Variable framerate when?

When the fuck is Hiro going to add vp9 support?

I don't remember that in Coraline.

I don't think that's "variable" frame rates though - wasn't that just fast motion?

In stop motion and hand drawn animation already, and hopefully nowhere else.

It's mostly 24fps, but every so often there's a 12fps shot. It's hugely distracting. Maybe I'll convert the whole thing to 12fps if I watch it again.
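
If you do go that route, the conversion is basically a one-liner (sketch; the filenames are placeholders):

    import subprocess

    # Re-time the whole film to a constant 12fps by dropping frames (no interpolation).
    subprocess.run(
        ["ffmpeg", "-i", "coraline.mkv", "-vf", "fps=12", "-c:a", "copy", "coraline_12fps.mkv"],
        check=True,
    )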

I guess if they have variable resolution (hello Nolan), variable frame rates for action sequences wouldn't be utterly bizarre.

That's not "variable" in the sense I'm talking about. That's just fast/slow motion.

I meant variable in that the playback frame rate actually changes.

Higher framerate looks bad, in movies.

On a non-strobed display the frame rate actually does change. You could argue that the strobe rate has to change on a strobed display for it to count, which it would not by default, but you could theoretically set it up with the right player software.

Because low fps in movies (24) makes them better.

Not really related, but fuck, RLM need to change the way they film. Either the framerate is way too low or they're just using a fast shutter speed. It looks like a slideshow whenever they do any panning shots or movements.

Gladiator has a 12fps sequence in the opening battle IIRC. It looks like shit. And there are those cheap-ass pseudo-slow-mo shots you see sometimes. The Spy Who Loved Me has a few particularly egregious examples.

Sure, but no feature films (Coraline included) have variable playback speeds. They're all 24fps. You can't change that (otherwise the film speeds up or slows down).

God how I fucking hate when movies do this.
>Shooting at a running target, but always aiming right at where it is, so the shots always miss by just a bit. Same shit in the WW webm.

It's like you never played things like Freelancer or other games with a similar targeting system.

Duplicating frames is literally the same as halving the frame rate, unless you're strobing, or unless you have noticeably bad compression artifacts.

This is so right. I cannot stand 60 fps films. It looks bad

>actually trying to watch their disasters

No.

All of Gladiator is 24fps.

What they do at the beginning of Gladiator is low frame rate + step printing.

You actually have the right idea - step printing is simply reproducing more frames to "fake" slow motion. If you shoot at 24fps and play back at 24fps, it looks normal. If you shoot at 120fps and play back at 24fps, you get slow motion. If you shot at 24fps but want slow motion (say three times slower), you simply repeat every frame 3 times. It looks blurry, because you're "smearing" rather than truly slowing it down.

What they did in Gladiator (which copied Saving Private Ryan in more than just this manner) was shoot at a low frame rate (say 12), ALSO step print it (say to 48 frames), and then play it back at 24. It gives it a jerky, blurry effect.

In Fear & Loathing they take it even further - even lower frame rates still, and step printing with even more repeats - so when you play it back at 24 it has a really smeared, trailing effect.
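
Same thing as frame indices, if that's easier to see (the 12fps / 4x numbers are the "say" values from above, not the film's actual settings):

    # Step printing: repeat each captured frame so the strip runs longer at 24fps playback.
    def step_print(frames: list, repeats: int) -> list:
        printed = []
        for f in frames:
            printed.extend([f] * repeats)   # each unique image is held on screen
        return printed

    shot_12fps = list(range(12))            # one real-world second captured at 12fps
    printed = step_print(shot_12fps, 4)     # 48 printed frames

    print(len(printed) / 24)                # 2.0 -> plays back twice as slow, but each
                                            # image sits for 4 frames, hence the jerky smear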

>hurr durr the TIE Fighter pilot saw the gap right after the X-wing vanished into it, it's not like the X-wing changed its roll axis kilometers before it.