4k playback

So what happened in the last year that 4k video playback has become so commonplace? Was it HEVC?

I remember downloading a movie not too long ago and it barely played at all. Now I'm seeing them pop up everywhere.

They finally managed to get AACS 2.0 encryption keys to make actual UHD remuxes.

Previous 4k rips were screen caps and therefore not source quality.

Sure there are also more people with GPUs or CPUs capable of hardware HEVC decoding. But any modern desktop CPU from the past few years can handle HEVC software decoding if needed anyway, so that wasn't really an issue.


Mostly it's just there are now more source quality UHD rips available.

People generally prefer the higher quality thing. The demand caused devs and other relevant parties to get their acts together, and here we are.

In a few years, 1080p will be like using a CRT television.

I hope not. I got show rips that ain't even DVD quality (cause anything higher doesn't exist; no DVD, no torrent). They look fine on my 65 inch TV, but just like trying to upscale a jpeg, past a certain point the results look like shit.

>People generally prefer the higher quality thing.
Yeah, at this point HDD space is so cheap, and with the proliferation of gigabit internet speeds, 4k seems to be gaining more and more ground.

I myself only download 1080p+ rips of movies and TV shows, but now I'm gonna dip my toes into 4k.

Honestly, there isn't even THAT much of a size difference. h264-encoded stuff at 1080p is roughly the same size as 4k stuff encoded with h265.
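
As a rough sanity check (the bitrates here are just assumed ballpark figures, not pulled from any particular release):

# back-of-envelope file sizes for a 2 hour movie at assumed average bitrates
RUNTIME_S = 2 * 60 * 60

def size_gib(bitrate_mbps):
    """Size in GiB of a stream at the given average bitrate (Mbit/s)."""
    return bitrate_mbps * 1_000_000 * RUNTIME_S / 8 / 1024**3

print(f"1080p h264 @ 12 Mbps: {size_gib(12):.1f} GiB")  # ~10 GiB
print(f"2160p h265 @ 15 Mbps: {size_gib(15):.1f} GiB")  # ~12.6 GiB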

jelly?

Why can't AMD get into proper 10 bit 4K HDR playback and VP9 4K HDR HW decoding?

About time. I'm sick of people saying we don't need it and that it won't be adopted any time soon. As always, you're wrong. You can't stop technology.

that's actually shit....

My GPU has no issue with the jellyfish file at 250mbps.

How shit something looks when upscaled depends mostly on the screen size and the viewing distance; your 480p rips would look almost identical on a 65" 4K screen.
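
To put rough numbers on that (screen size from the post above, the 2.5 m couch distance is just an assumed example):

import math

# horizontal pixels per degree of viewing angle on a 16:9 screen
def pixels_per_degree(h_pixels, diag_inches, distance_m=2.5, aspect=16 / 9):
    width_m = diag_inches * 0.0254 * aspect / math.hypot(aspect, 1)
    fov_deg = 2 * math.degrees(math.atan(width_m / (2 * distance_m)))
    return h_pixels / fov_deg

print(f'65" panel at 4K:        {pixels_per_degree(3840, 65):.0f} px/deg')  # ~120
print(f'480p source, same seat: {pixels_per_degree(854, 65):.0f} px/deg')   # ~27

The 480p source tops out around the same ~27 px/deg no matter what panel it's on; the extra panel resolution just gives the upscaler more pixels to spread the same detail across.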

>and with the proliferation of gigabit internet speeds, 4k seems to be gaining more and more ground
100 mbps is enough for 4K streaming, and 4K blu-rays take a bit more than an hour to download, which is fine for me as I tend to download shit ahead of time.
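
The "bit more than an hour" checks out if you assume a ~60 GB disc (a made-up but typical-ish size):

# download time for an assumed ~60 GB UHD remux on a 100 mbps line
size_bytes = 60e9
line_mbps = 100
minutes = size_bytes * 8 / (line_mbps * 1e6) / 60
print(f"{minutes:.0f} minutes")  # ~80 minutes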

>100 mbps is enough for 4K streaming
Youtube 4k is ~15-25mbps
Netflix 4k is similar, maxes around 35mbps, but averages around 20mbps.

4k streaming doesn't even require 50mbps. At least not in its current state.

what's your mpv.conf?

# high quality rendering preset
profile=gpu-hq
# sharper luma/chroma upscaling
scale=ewa_lanczossharp
cscale=ewa_lanczossharp
# resample video timing to the display refresh rate and interpolate between frames
video-sync=display-resample
interpolation
tscale=oversample
ytdl-format=bestvideo[height

Do you guys feel old yet? I remember when 1080p televisions were thousands of dollars and the next big thing.

>100 mbps is enough for 4K streaming, and 4K blu-rays take a bit more than an hour to download, which is fine for me as I tend to download shit ahead of time.
Yeah, for me file size is a non-issue, not to even mention streaming.

>cache=4000
that's a tiny cache.

I was using the default MPV conf file with a much larger cache (cache=2147483).

Plugging in your conf settings (besides the smaller cache), my GPU use actually went down by about 10%.

1080ti/8700k

For the most part the GPU usage would go down to as low as 7%.

i'm surprised your CPU use is so high.

I'm running a 5820k and a GTX 960

here is the 400mbps version.

>i'm surprised your CPU use is so high.
Probably because I had just skipped forward in the video. I'll try the 400mbps one

It was HEVC when it comes to files you download (or the ability to rip them, anyway). I say this is the case because every single 4k video file I've downloaded that didn't come from YouTube has been HEVC-encoded.

But it's also sites making 4k videos available in the formats they commonly use. YouTube has some kpop music videos and a few other things available in 4k and those are encoded with h264 and VP9.

>In a few years, 1080p will be like using a CRT television.
I disagree. I used to watch divx dvd-rips all the time. The difference between one of those files and a good 1080p Blu-ray rip is huge. This was something I immediately noticed when I watched my first 1080p movie. It's like night and day.

I downloaded like 10 4k movies when I got my 27" 4k IPS display and watched quite a few. The difference wasn't all that. Then I watched a 1080p movie. I can't really say I noticed much of a difference.

huh? Good 1080p's are typically 10-20 GB. The 4k's I've got are like 50-80 GB. That's a pretty huge difference.

>The 4k's I've got are like 50-80 GB
Because they probably aren't encodes. They're remuxes.

Harry potter for example. There are UHD copies of most of the movies out there now, but from what i've seen they're all remuxes, not encodes. Same goes for most titles from what i've seen.

Waiting ... 8K

>doubles resolution
>doubles screen size
>yo brah check out how clear the picture is this is one of those new state of the art high end 4k screens
>whaaa dog the shits like real life
why do people do this? How do we mandate the ppi standard?

The 400mbps video chugs on mpv, and the CPU pretty much caps out in the first half of the video.

Second half is fine.

I played the same video on MPC-HC and I only see the GPU go up to like 40%. Plays perfectly.

I don't know why mpv does that.

What's your conf?
Sounds like it's attempting CPU decoding.

I looked for it in the Roaming folder but it's not there. According to documentation that's where it should be...

So I'm guessing that's the problem?

potentially, mine is super basic if you want to use it.

# use hardware decoding where available, copying frames back for the gpu renderer
hwdec=auto-copy
hwdec-codecs=all
vo=gpu
profile=gpu-hq
# don't let the decoder drop frames while seeking
hr-seek-framedrop=no
# very large cache
cache=2147483

Thanks dude, this did it. Task Manager even shows video decode now instead of 3D.

no worries, should drop your power draw by a good amount as well.

um how come mine doesn't say gpu video decode and only gpu 3d? Do you need an actual discrete gpu to use it? I can only use hwdec=auto on an 8700k

The iGPU on the 8700k can easily handle 4k 10 bit HEVC decoding. Even VP9 hardware decoding.

yes it can use the regular hardware decoding but then if I use a profile like gpu-hq it shits the bed

Well gpu-hq is meant for discrete GPUs, not iGPUs.

Even for discrete GPUs it's meant for mid range and higher cards.

even opengl-hq shits the bed

Yes because that's not using hardware decoders anymore, that's doing opengl rendering.

opengl doesn't use the gpu?

Wtf I didn't think there were even consumer 10G connections (not to mention network cards)

Uni backbone or something?

>10G
What?

>Wtf I didn't think there were even consumer 10G connections (not to mention network cards)
not him but that's gigabit, not 10g

l2read

it uses the GPU, just not the hardware decode block.

so what are the best settings for non-hardware-decode 4k hevc main10 at 100+ mbps bitrates?

No idea, I don't have that hardware

he probably meant 10gbE

It's obviously not.

1gbps is ~950mbps when accounting for TCP/IP frame overhead.
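
That number falls straight out of the per-frame overhead. Quick sketch assuming a standard 1500-byte MTU and TCP over IPv4 with no options or jumbo frames:

# goodput of a gigabit link: TCP payload per frame vs bytes on the wire
mtu = 1500                     # IP packet size
ip_tcp = 20 + 20               # IPv4 + TCP headers, no options
eth = 14 + 4 + 8 + 12          # Ethernet header + FCS + preamble + inter-frame gap

payload = mtu - ip_tcp         # 1460 bytes of actual data per frame
on_wire = mtu + eth            # 1538 bytes occupying the wire
print(f"{1000 * payload / on_wire:.0f} mbps")  # ~949 mbps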

good dvd versions scale very well to higher resolutions, bluray adds detail but it means fuck all if the detail isn't there in the first place.

4k, as it stands now, is nice but not the be all end all for video. streaming wise, it's just a higher bitrate video as far as i'm concerned, higher quality.

as far as movies go, no display properly handles HDR yet, as Dolby Vision and HDR10 both demand higher peak brightness than we currently get.

OLEDs are the best you've got right now, and in a year or two, per-pixel local dimming will likely be a thing if the cost isn't too high (Panasonic just made a per-pixel-dimmed IPS, so it gets OLED blacks while being able to shoot a fucking retarded amount of light out the back, so it gets the brightness).

Per-pixel local dimming will be what makes 4k look amazing, and is currently what makes 4k UHD movies look good on OLEDs; this is something that doesn't translate to normal monitors.

Hell, I have a TCL P605, and I fully understand HDR and what it wants to do, but the TV has 1/4th to 1/6th the peak brightness that Dolby HDR demands, and that's what most video is mastered to. Currently no TV even gets 1/2 of what Dolby HDR wants.

If ppi was enforced, I would likely never move off 1080p, as I don't want hardware-based AA. I'm very happy having my 55 inch 4k screen's real estate to work with, as opposed to just making text a bit sharper.

112mbps is what it would take to stream disc quality.

also, isn't netflix ~12 mbits for 4k, and youtube tells you up to 25?

Average bitrates are different from peak bitrate.

Just depends on the content you're streaming to be honest.

Not all content can be compressed to the same degree.

given I have a 120mbit line, and netflix shouldn't even be an issue, their compression on video is fucking shit.

Of course it is, they don't have unlimited bandwidth either.

Highest streaming bitrate i've seen is with youtube 8K, 70-80mbps

I can't speak for 4K, but The Matrix Revolutions in SD looks better on Netflix than it does on the DVD.

larger screen real estate is a great thing, but selling it as higher definition while constantly moving up in screen size is bullshit

I hate these stupid fucking trends, especially with anime CGI backgrounds or the shitty tumblr filters like in VEG. It's supposed to be 2d vectors and solid blocks of color; it should all be released as 240p vector graphics and scaled to fit any arbitrary resolution you want by the software you use.
fuck bloat

I know, I just didn't get why he would think that user has 10 gbE when the speedtest clearly shows him having a

55" 4K is like 27" FHD, what the fuck are you doing?
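
Both work out to roughly the same pixel density, for what it's worth:

import math

def ppi(h, v, diag_inches):
    """Pixels per inch of an h x v panel with the given diagonal."""
    return math.hypot(h, v) / diag_inches

print(f'55" 4K:    {ppi(3840, 2160, 55):.0f} ppi')  # ~80
print(f'27" 1080p: {ppi(1920, 1080, 27):.0f} ppi')  # ~82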

Most 4K blu-rays I pirate are about 50 GiB in size, which works out to about 60 mbps for a 2 hour movie. Are there 100 GiB ones, or am I missing something?

I'm pretty sure most current PCs would have trouble rendering vectorized animes in real time. They may be able to do it most of the time, but some scenes with a ton of effects would make your framerate drop.

Wanted a 48 inch monitor, but the TCL P605 only came in 55 inch, and as far as I can see, if you are going cheaper, why the fuck even bother, and if you are going more expensive, it's only surpassed when you go over $1500.

that local dimming
That static contrast ratio
and colors that are represented accurately enough for me to be able to see a difference between 187 187 187 and 187 187 186

It's honestly shocking how much banding you start to notice when you have good color representation. Granted, it's still a minor thing that's largely overlooked, but it's there.

my tv is that sweet spot as far as price performance is concerned, I would have loved to have had a displayport or hdmi 2.1 but that's just not happening.

probably because when they get the high res master, they downscale it from there while using higher end codecs than we had when DVD was a standard, so they are able to push a same-or-higher bitrate version of SD while still having it look better than the DVD counterpart.

It's an advantage of not being constrained by a standard from inception to obsolescence

That said, with netflix 4k hdr you really start to notice how it's not good when a scene has lots of fast-moving detail, the kind of shit that would be sharp on a bluray or even most bluray encodes but just isn't what you expect from them.

UHD bluray goes up to ~100gb a disc. The 112mbit is the absolute max, assuming a worst case scenario.
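
One way to land on a number like that 112 (assuming the ~100gb figure and a 2 hour runtime; real discs and runtimes vary):

# average bitrate needed to stream an entire disc over its runtime
runtime_s = 2 * 60 * 60

def avg_mbps(size_gb):
    return size_gb * 1e9 * 8 / runtime_s / 1e6

print(f"100 GB disc: {avg_mbps(100):.0f} mbps")  # ~111 mbps, the worst-case ceiling
print(f"50 GB movie: {avg_mbps(50):.0f} mbps")   # ~56 mbps, closer to a typical remux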

have a vector cull that would stop at native/screen resolution.

You could also have the video encode itself and not watch it real time.

anyone remember the name of that website that lists real 4k encodes and the fake upscales?

realorfake4k.com/

2016, Pascal GPUs have full HEVC Main10 & VP9 hardware decoding

2017, Kaby Lake and Coffee Lake CPUs have full HEVC Main10 & VP9 hardware decoding

Everyone can enjoy 4K videos now easily

thanks, enjoy the free (you)