HDTV PC Refresh Rate

I have my PC hooked up to my HDTV so I can watch movies and shit that I've pirated. In the NVIDIA control panel I can set the refresh rate for the TV. Do I set it at 23.976 when watching movies for smooth playback and then switch it depending on the content?

No

What the fuck is the point of 8K even? You can't see the pixels on 4K unless you have your face right up against a 55-inch+ screen.

There isn't. They should have stopped at 1080p and worked on a lossless compression format instead, so everything you watch is in perfect quality.

Nobody has an answer to my question I guess

are you fucking retarded?

Now try and actually explain what you're talking about

did you ask if you should lower the refresh rate to get smoother playback? are you fucking serious?

Movies play back at 23.976 fps; most monitors run at 60 Hz. Obviously some bullshit happens and you get judder. I'm asking: since I can set my TV to 24 Hz, should I do that, since it's practically the same rate as the movie, to reduce judder during playback?
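Back-of-envelope version of why that judder happens, in case anyone doubts it. This is just a rough Python sketch of the timing math, not anything pulled from a player or driver:

```
# Sketch: why 24 fps content judders on a 60 Hz display but not at 24 Hz.
# Pure arithmetic, no player APIs involved.

FILM_FPS = 24000 / 1001           # ~23.976 fps
DISPLAY_HZ = 60.0

frame_ms = 1000.0 / FILM_FPS      # ~41.7 ms per film frame
refresh_ms = 1000.0 / DISPLAY_HZ  # ~16.7 ms per refresh

# A frame can only flip onto the screen on a refresh boundary, so map each
# frame's ideal start time to the nearest refresh tick.
ticks = [round(i * frame_ms / refresh_ms) for i in range(9)]
for i in range(8):
    held = ticks[i + 1] - ticks[i]   # refreshes this frame stays on screen
    print(f"frame {i}: held {held} refreshes = {held * refresh_ms:.1f} ms")
```

The output alternates 3 and 2 refreshes (50 ms / 33 ms), which is the 3:2 pulldown cadence you perceive as judder during pans. At a 24 Hz refresh every frame is held exactly one refresh (~41.7 ms), so motion stays even.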

..... dude.what....what....what......whatwhatwhatwhatwhatwhwtahwtah. thats ......what?

this isn't a video game. set it at the highest it can go and change your TV's smooth-motion setting to whichever one you prefer.

smooth-motion bullshit makes everything overly smooth and fake, commonly referred to as the soap-opera effect. nothing moves like it should because it's all interpolated.

you clearly have no idea what judder is or anything I'm talking about.

You best be trolling

Try watching a movie at 60 Hz and compare that to 24 Hz. The latter will appear much smoother, especially during camera pans.

Of course this is with any frame-interpolation tech disabled. The only one of those that doesn't look horrendous is madVR's, because it's designed to undo the stutter effect of 60 Hz displays rather than make everything look soap-opera smooth.
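For reference, the frame-blending idea (as opposed to motion interpolation) works roughly like the toy sketch below: each output refresh shows a weighted mix of whichever source frames overlap its time slot, so the cadence is even without inventing new in-between motion. This is just an illustration of the general technique, not madVR's actual code:

```
# Toy sketch of blend-based 24 fps -> 60 Hz frame-rate conversion.
from fractions import Fraction

SRC_FPS = Fraction(24)
DST_HZ = Fraction(60)

def blend_weights(refresh_index):
    """Return {source_frame: weight} for one output refresh."""
    start = refresh_index / DST_HZ            # this refresh's time slot
    end = (refresh_index + 1) / DST_HZ
    weights = {}
    frame = int(start * SRC_FPS)
    while Fraction(frame) / SRC_FPS < end:
        f_start = Fraction(frame) / SRC_FPS   # the source frame's own slot
        f_end = Fraction(frame + 1) / SRC_FPS
        overlap = min(end, f_end) - max(start, f_start)
        if overlap > 0:
            weights[frame] = overlap / (end - start)
        frame += 1
    return weights

for r in range(5):
    print(r, {f: float(w) for f, w in blend_weights(r).items()})
```

With 24 -> 60, every fifth refresh ends up a 50/50 blend of two neighbouring frames and the rest show a single frame, so you trade judder for a bit of blur on those blended refreshes instead of soap-opera motion.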

i'd buy this monitor if this chink girl will be included

wtf is wrong with you guys? higher frame rates are always better. it's smoother. in what world is a lower frame rate smoother? never. please delete this thread.

troll or dumbass?

I have interpolation enabled in mpv, but only with oversample, as anything else is too blurry and gives a weird effect.
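Roughly, that setup amounts to these mpv options. The flags are the standard ones for it; the filename is just a placeholder:

```
# Launch mpv with display-resampled interpolation and the oversample tscale.
import subprocess

subprocess.run([
    "mpv",
    "--video-sync=display-resample",  # lock the video clock to the display refresh
    "--interpolation=yes",            # enable temporal interpolation
    "--tscale=oversample",            # repeat/blend frames instead of smearing them
    "movie.mkv",                      # placeholder path
])
```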

you cant be serious?

Lossless video compression will never match the efficiency of lossy compression: you can obviously do better if you're allowed to throw some data out, and compression isn't magic, so you can't just throw money at algorithm designers and expect arbitrarily small file sizes. Not to mention you'd probably want to throw chroma subsampling out the window too, which increases file size further.
Going from FHD to 4K isn't even a problem; in the worst case you just need 4 times as much storage space, which is nothing.
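Some back-of-envelope numbers on the raw data rates involved, assuming 24 fps 8-bit video (rough Python, nothing exotic):

```
# Raw (uncompressed) video data rates for FHD vs 4K UHD.

def raw_gb_per_hour(width, height, fps, bits_per_pixel):
    bits_per_sec = width * height * fps * bits_per_pixel
    return bits_per_sec * 3600 / 8 / 1e9   # gigabytes per hour

# 8-bit 4:2:0 = 12 bits/pixel; 8-bit 4:4:4 (no chroma subsampling) = 24 bits/pixel
for name, w, h in [("1080p", 1920, 1080), ("4K UHD", 3840, 2160)]:
    print(name,
          f"4:2:0 ~{raw_gb_per_hour(w, h, 24, 12):.0f} GB/h,",
          f"4:4:4 ~{raw_gb_per_hour(w, h, 24, 24):.0f} GB/h")
```

Raw 1080p 4:2:0 already comes out around 270 GB per hour, and 4K 4:4:4 is roughly 2.1 TB per hour. Lossless codecs typically only manage maybe 2-3x off those figures, while a decent lossy encode is tens of times smaller, which is why lossless delivery never happened.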

I'm pretty sure whatever frame rate upscaling algorithm a TV uses would produce artifacts similar to the ones in this video: youtube.com/watch?v=-wg9CN8vrIg

A 24 fps movie is still a 24 fps movie, even when watched on a 60 Hz screen, but now some frames are displayed longer than others.

>lossless video

I'm saying that instead of increasing resolutions which nobody really needs, why not find a better way to compress video losslessly and deliver it, so you have perfect video? Along with that, make a storage medium that can easily hold files that large, to replace Blu-ray. The only thing 4K is good for is HDR, that's it. The vast majority of movies are already shown in theaters at 2K on fuck-huge 35' screens.

depends on the tv. the video you linked is trash. I'd never recommend that to anyone. op asked about smoothness. higher refresh always has smoother playback. I've never seen a tv or monitor with terrible interpolation or whatever.

Yes. mpv on GNU/Linux even has a userscript called xrandr.lua which does that automatically for you; it's pretty neat. On Windows I think you can also set up madVR to do it for you. I think Kodi supports this as well.
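Under the hood that kind of auto-switching boils down to something like this on X11. Rough sketch only; the output name, mode, and rate list are placeholders for whatever your TV actually reports, and a real userscript reads the fps from the player instead of hardcoding it:

```
# Pick the refresh rate that best fits the video's frame rate, then set it.
import subprocess

def pick_rate(video_fps, rates):
    """Choose the rate whose nearest whole multiple of video_fps fits best."""
    def misfit(rate):
        n = max(1, round(rate / video_fps))   # refreshes per film frame
        return abs(rate - n * video_fps)
    return min(rates, key=misfit)

tv_rates = [23.976, 24.0, 50.0, 59.94, 60.0]   # example list; check `xrandr` for yours
rate = pick_rate(24000 / 1001, tv_rates)       # 23.976 fps movie -> picks 23.976

subprocess.run(["xrandr", "--output", "HDMI-1",
                "--mode", "1920x1080", "--rate", str(rate)])
```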

>I've never seen a tv or monitor with terrible interpolation or whatever.
I don't really have experience with frame-rate upscaling on TVs, but I don't see how any algorithm could handle things like the satellite at the beginning, which is motion-blurred to hell in the source.

The source content is 24 fps, with motion blur baked in for that exact frame rate. As soon as you start displaying it at a refresh rate that isn't a multiple of 24, some frames have to be shown longer or shorter than intended, and the motion becomes uneven and stuttery to the eye.

Same principle behind 60 fps without vsync looking less smooth than 60 fps with vsync enabled, and also the reason gsync/freesync exists.