Why is this the default codec for anything 60fps on youtube?

>watch 60fps vids on chrome
>stuttering & buffering, more than half of the frames are dropped
>90% CPU usage on Broadwell i5 ULV
>install h264ify
>butter-smooth playback without any issues
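
For reference, h264ify (as far as I can tell) just makes YouTube's player believe your browser can't handle VP9/WebM, so it falls back to the H.264 streams, which basically every GPU from the last decade can decode in hardware. Rough TypeScript sketch of the probe the player relies on; this is the mechanism, not the actual extension code:

[code]
// What an h264ify-style extension effectively overrides: the MSE codec probe.
// If this reports false for VP9, YouTube's player picks the H.264 streams instead.
const probes: Record<string, string> = {
  'VP9 (WebM)': 'video/webm; codecs="vp9"',
  'H.264 (MP4)': 'video/mp4; codecs="avc1.640028"',
};

for (const [label, mime] of Object.entries(probes)) {
  console.log(label, MediaSource.isTypeSupported(mime));
}
[/code]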

your hardware is the problem

Tell me, what is the least powerful CPU required to play back VP9 60fps vids without hardware acceleration?
I have a Ryzen 1600 for gayman, would that be enough?

microsoft edge doesn't have this problem, time to get a real browser

>4 threads corelet

>using a web browser for media playback

Outside of gaymen 60fps video is absolute garbage anyway

VP9 is a shitty meme codec because Google is too jewish to pay for the h265 licence

60fps VP9 video works fine on my machine. i7-920 @ 3.8ghz.

500,000,000,000,000,000 Rupees (0.00 USD) have been deposited to your account by Motion Picture Experts Group®.

It's not about how powerful the CPU is. It's whether or not the CPU has support for VP9 decoding.

You can play 60fps videos on shitty dual-core Skylake Celerons, or even the shitty dual-core ARM CPUs used in smartphones.

Also, here's the list of hardware that supports it:

>en.wikipedia.org/wiki/VP9#Hardware_implementations

tl;dr: any reasonably modern (~2 years or newer) CPU/GPU
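
If you want to check what your own setup reports instead of trusting a wiki table, the Media Capabilities API will tell you whether VP9 decode is flagged as power-efficient (i.e. hardware). Sketch only; the resolution/bitrate/framerate numbers are just placeholders for a 4K60 stream:

[code]
// Rough browser-side check for hardware VP9 decode. The width/height/bitrate/
// framerate values below are illustrative placeholders for a 4K60 stream.
async function checkVp9(): Promise<void> {
  const info = await navigator.mediaCapabilities.decodingInfo({
    type: 'media-source',
    video: {
      contentType: 'video/webm; codecs="vp9"',
      width: 3840,
      height: 2160,
      bitrate: 20_000_000,
      framerate: 60,
    },
  });
  // powerEfficient usually means a fixed-function decoder is in play;
  // smooth: false is the "half the frames are dropped" situation from the OP.
  console.log(info.supported, info.smooth, info.powerEfficient);
}

checkVp9();
[/code]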

Learn to read you mouth breather. The person you're responding to asked specifically how much horsepower is required to decode VP9 in software. Your reply is a non-sequitur.

You fags are not smart enough to buy superior Intel Kaby Lake or Nvidia Pascal GPUs with real VP9 hardware decoding, AYYMD does not have real hardware decoding of VP9

No CPU has fixed-function video decoding. It's the integrated GPUs that do. So if you use a dedicated GPU, all those iGPU decoders are useless because you're not using the iGPU.

Your *GPU* must support VP9 decoding and as of now only Pascal and Polaris (and Vega) can do it for dedicated GPUs.

A bunch of Intel HD/Iris can do it for laptops, too, though.

What about nature documentaries and porn?

>being a corelet
My two Xeons (that's 16 cores total) don't break a sweat decoding 60fps VP9 videos.

Yeah it does.

Watch it in Firefox. FF has a better decoder (FFmpeg) that is faster and scales better across more cores.

You've just got a slow dual-core, but it might be able to cross the line, or at least it will reduce the amount of dropped frames.
Chrome has the Google shit (libvpx), which is slower for no good reason. I wish I was making this up.

Are you stupid or retarded?

community.amd.com/thread/208915

AYYMD has no VP9 hardware decoding, otherwise the CPU usage wouldn't be high at all.

You're doing something wrong, OP.
>7yo CPU
>PII 1055T @ Stock
>VP9 4K 60fps
>0 frame drop
>vivaldi
youtube.com/watch?v=wTcNtgA6gHs&index=4&list=PLyqf6gJt7KuEtjkT-oK4v7AIINtup4bBC

In my case, it plays fine on EDGE/Chrome/Vivaldi; Firefox just lags like crazy, skips frames, and drops a bunch.

FF has tons of issues for me with Youtube. I'm now stuck with Chrome because it's that bad. I'd rather have a botnet and entertainment than neither.

It's like you didn't even read the forum thread you posted.

Firefox works just fine

post your task manager screenshot

...

Not for me, I get 60% CPU usage with software decoding and like 10% CPU with hwdec (media.wmf.vp9.enabled).
It does not play smoothly, though. There are clearly frames being skipped, but for some reason YouTube doesn't count them. It's obvious enough to notice without any side-by-side comparison (which I did anyway to confirm). I'm talking about 10+ frames/sec being skipped.
Kinda weird since it plays fine on EDGE (WMF).
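
If you think frames are being dropped but the player stats don't agree, you can ask the video element directly instead of trusting "stats for nerds". Rough sketch to paste into the devtools console on the watch page; the selector is an assumption:

[code]
// Dropped-frame counters straight from the <video> element. The 'video'
// selector is an assumption; adjust it to whatever the page actually uses.
const video = document.querySelector<HTMLVideoElement>('video');

if (video) {
  const q = video.getVideoPlaybackQuality();
  // Counts frames decoded but never displayed, which can disagree with
  // whatever the player's own stats overlay reports.
  console.log(`dropped ${q.droppedVideoFrames} of ${q.totalVideoFrames} frames`);
}
[/code]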

>claim 60 fps
>stats show 30 fps cap

Either way, 8% CPU usage is still relatively low for 4K VP9 playback. It would have been a lot higher if it were using software decoding.

It's not 4K, the stats even show what dimensions the video was downscaled to.

(Downscaled to 480p)
(Downscaled to 1080p)

Chances are browser scaling uses the cheapest algorithms too, to keep CPU usage down.

The 4K video still needs to be decoded as 4K, dumbass. That's where most of the compute power goes. It doesn't matter if you're downscaling that shit to 240p, it will still be stuttery if it's being decoded in software on a shit CPU.
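
Quick back-of-the-envelope on why the decode cost doesn't shrink with the player window:

[code]
// Decode workload tracks the encoded resolution, not the size you watch it at.
// 4K60 pushes 4x the pixel rate of 1080p60 through the decoder.
const pixelRate = (w: number, h: number, fps: number): number => w * h * fps;

const decodeRate = pixelRate(3840, 2160, 60);  // what the decoder has to chew through
const displayRate = pixelRate(1920, 1080, 60); // what you actually see after downscaling

console.log((decodeRate / 1e6).toFixed(1), 'Mpixel/s decoded');        // ~497.7
console.log((decodeRate / displayRate).toFixed(1) + 'x the display');  // 4.0x
[/code]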