Video codec of choice

So Sup Forums, which codec would you prefer for video files, in which use cases, and for which reasons?

h/x264
h/x265
mpeg4
theora
vp8/9

Arguments like
open standards > corporate patent-encumbered cuckoldry
image quality
file size
encoding and decoding cpu/gpu requirements
degree of support in various software and devices
are welcome

I just tried out x265 and it does seem to deliver on h265's promise of the same image quality at half the file size compared to h264, but h* and mpeg are maximum corporate licensing/royalty bullshit. Theora, vp*, etc. seem like more interesting choices, but they're also much less supported in general. I use ogg, opus and flac for audio too though and it works fine for me on my devices and software, so I suppose the same will be true for video codecs.
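
If you want to reproduce the comparison, something like this works (a rough sketch, assuming an ffmpeg built with libx264 and libx265 is on PATH; the filenames are placeholders, and CRF values aren't directly comparable between the two encoders, so treat them as starting points):

```python
# Encode the same source with x264 and x265 and compare output sizes.
import os
import subprocess

SOURCE = "source.mkv"  # placeholder input

encodes = {
    "out_h264.mkv": ["-c:v", "libx264", "-crf", "20", "-preset", "medium"],
    "out_h265.mkv": ["-c:v", "libx265", "-crf", "22", "-preset", "medium"],
}

for out, vopts in encodes.items():
    # -an drops audio so only the video stream size is compared
    subprocess.run(["ffmpeg", "-y", "-i", SOURCE, *vopts, "-an", out], check=True)
    print(out, round(os.path.getsize(out) / 1e6, 1), "MB")
```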

.wmv

VP9 because it's free

*vorbis, opus and flac
ogg is the container... mkv ftw btw

Fuck off. Shitty image quality at twice the file size and badly supported in most software. WMV is the worst thing ever.

Is there any support for it at all yet though? It's like webm iirc, literally nothing works with it yet.

>like webm
It is webm, but not supported by everything that supports webm

In terms of hardware decoding for VP9 or webm, I think that's very minimal at this moment. Remember that hardware technology normally has a turnaround time of two years from design to manufacture, then another two years until that hardware actually sits in a consumer's possession. There's also the fact that a significant portion of hardware manufacturers are MPEG-LA licensees, so they'll make use of the money they've invested into it.

At the moment, I've only seen Chinese quality hardware make use of webm and vp8/9.

VP9 but it's soooo slowwwwwwwwwww

Also should have good hardware support by the end of this year.

so what's a good option for now then? Let's say I want to create a video file today, what do I render and archive it as?

x265 seems nice, the image quality/filesize is as promised but the encoding time is twice as long of course, and it wouldn't be a big jump from h264. Theora is slow as fuck. mpeg is shit. vp8 started out good but apparently slows down to a crawl later in the process.

Seems like the only good choice for now is x26*.

I've used mjpeg in the past and it was good but software support is glitchy.

There are only 2 video codecs I would ever recommend. Those are 8-bit H264 and 10-bit HEVC.

8-bit H264 provides the fastest encoding times and has hardware decoding support in like 90% of devices out there so it has the greatest compatibility as well. MPEG4 is faster to encode but it's just too shit to be relevant anymore.

10-bit HEVC provides slow encoding times but has the highest compression efficiency of any video codec out there. However, hardware decoding support is nearly non-existent.

Finally, whichever one you use, just make sure to use a CRF and the fast preset. A ton of newbies make the fatal mistake of choosing an ABR and using something like the placebo preset. What they end up with is a video that took 20 hours to encode and looks like shit. The only reason to ever choose an ABR is that you're YouTube or Netflix aiming for specific internet connections (i.e. 1, 2, 20 Mbps).
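
To make that concrete, here's a minimal sketch of both approaches via ffmpeg (assuming ffmpeg with libx264 on PATH; filenames and the CRF/bitrate values are placeholders):

```python
import subprocess

# Recommended: constant quality (CRF) plus a preset. Quality stays
# consistent and the encoder spends bits only where they're needed.
subprocess.run([
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "libx264", "-crf", "21", "-preset", "fast",
    "-c:a", "copy", "crf_out.mkv",
], check=True)

# Discouraged for personal encodes: average bitrate (ABR). Only useful
# when you must hit a fixed bandwidth target, e.g. 2 Mbps streaming.
subprocess.run([
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "libx264", "-b:v", "2M",
    "-c:a", "copy", "abr_out.mkv",
], check=True)
```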

>so what's a good option for now then? Let's say I want to create a video file today, what do I render and archive it as?
Depends on what you care about more, compatibility (8-bit H264) or the smallest file size possible (10-bit HEVC).

>x265 seems nice, the image quality/filesize is as promised but the encoding time is twice as long of course
More like 4-6X slower

>and it wouldn't be a big jump from h264.
Actually it is. Current x265 encoders have 40-50% improved compression efficiency over H264.

>8-bit H264 provides the fastest encoding times and has hardware decoding support in like 90% of devices out there so it has the greatest compatibility as well. MPEG4 is faster to encode but it's just too shit to be relevant anymore.
H.264 is MPEG-4, Part 10 to be exact.

>>and it wouldn't be a big jump from h264.
>Actually it is. Current x265 encoders have 40-50% improved compression efficiency over H264.
I meant in terms of being a standard. h264 to h265 is a jump with less potential for problems than jumping to vp* or theora.

>>x265 seems nice, the image quality/filesize is as promised but the encoding time is twice as long of course
>More like 4-6X slower
yeah possibly, it's taking

>>so what's a good option for now then? Let's say I want to create a video file today, what do I render and archive it as?
>Depends on what you care about more, compatibility (8-bit H264) or the smallest file size possible (10-bit HEVC).
Ok so you only recommend one of two and no others, as you said before

I like your opinions. Anyone got others? x265 is starting to win the argument given its prospect for wide adoption. The other codecs are good, especially vp9, but their future doesn't seem as certain to me.

Did not know that. Anyway, most people think of MPEG-4 Part 2 when someone mentions MPEG4. Nobody calls H.264 "MPEG-4 Part 10" lol.

>Ok so you only recommend one of two and no others, as you said before
Yup, every other video codec is essentially a meme.

>I like your opinions. Anyone got others? x265 is starting to win the argument given its prospect for wide adoption.
You can actually play HEVC video on most devices today. The problem is that since nearly all devices today lack HEVC hardware decoding, the CPU must suffer and decode the video alone, which results in a huge battery drain, especially on phones.

HEVC isn't as CPU intensive as people make it out to be. A phone with a Snapdragon 410 should be able to software decode 480p HEVC video just fine. A modern flagship from 2016 should be able to software decode 1080p HEVC video just fine. However, like I mentioned, there will be a huge battery drain, and some phones can get uncomfortably hot if the CPU is constantly stressed (e.g. the HTC One M9).

>The other codecs are good, especially vp9, but their future doesn't seem as certain to me.
They're all memes. The only reason to use VP9 is to not pay royalty fees for commercial use, which is the main reason YouTube decided to adopt it.

listen it's as simple as this: h.265 in the mkv container. that is literally the apex of video files.

It's not that simple. While h.265 may be the most efficient video codec out there, it still has problems, especially on mobile devices. For one thing, literally no fucking hardware decoding exists for it.

DON'T CARE, H.265 IS THE COMFIEST CODEC.

I've spent a fair amount of time getting good quality/bitrate ratios with various codecs.

I started before VP9 & h.265 were a thing, where h.264 was definitely king.

I switched to VP9 before h.265 had any real GNU+Linux support. I am pretty happy with VP9. It is supposedly the best high efficiency codec for streaming and mobile, even now.
After h.265 got more traction I experimented with it. It seems to have a slightly better quality/bitrate ratio, but at the time I had already piled up a bunch of VP9 stuff, so I didn't feel like making the switch.

>tl;dr: h.265 is slightly better in picture quality, VP9 is open and more suited for streaming/mobile.

I've rarely known a Sup Forums thread to be this worth reading

>VP9 is open and more suited for streaming/mobile.
For streaming on PCs sure but not for mobile. VP9 has no hardware decoding support on mobile so it murders your battery and/or stutters badly if your phone is a little old.

Informative. Is there any hope for VP9 HW decoding on mobile in the future?

h.264 works just fine - if I had a machine that could crank out h.265 encodes at 500 fps then yes I'd use it but alas, I don't.

VP9 encoding achieves 70-80% speed of h.264 in my experience, at better quality and smaller output filesize.

VP9 support isn't nearly as good though.
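
If you want to check the speed claim on your own material, a crude timing harness like this works (assuming ffmpeg with libx264 and libvpx on PATH; the clip name and CRF values are placeholders):

```python
import subprocess
import time

tests = {
    "libx264": ["-c:v", "libx264", "-crf", "20"],
    "libvpx-vp9": ["-c:v", "libvpx-vp9", "-crf", "31", "-b:v", "0"],
}

for name, opts in tests.items():
    t0 = time.perf_counter()
    # -an drops audio so only video encoding speed is measured
    subprocess.run(
        ["ffmpeg", "-y", "-i", "clip.mkv", *opts, "-an", f"{name}.mkv"],
        check=True,
    )
    print(name, round(time.perf_counter() - t0, 1), "seconds")
```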

>x265 is starting to win the argument given its prospect for wide adoption

SAY IT ALONG WITH ME:

x265 IS NOT A GOD DAMNED VIDEO CODEC, IT'S AN ENCODER FOR CREATING H.265 MATERIAL WHICH IS.

VP9 is irrelevant, period. Don't believe me?

Show me any modern smartphone or tablet or most anything else that does hardware VP9 decoding.

I fucking dare you. I fucking DOUBLE dog dare you.

Maybe. YouTube would probably have to pay Qualcomm to do that though. Phones already play H264 YouTube streams with hardware decoding, so there might not be enough incentive for VP9 hardware decoding.

What: x265/HEVC 1920x1080
+Very heavily compressed

Why: Just so I can laugh at VLC users.

You're late, user. We're already discussing if there is any hope for mobile VP9 decoding.

I even mentioned bad support in that post.

h.265 hardware decoding support exists in several products at this point (smartphones, tablets, etc). 10bit decoding support will happen soon enough (and it does exist on one or two of the flagships on the market today)

VP9 is starting to be adopted by hardware manufacturers. Unfortunately, the current state of HW Decode is that it's limited to high end, flagship phones which is not something many people purchase.

en.wikipedia.org/wiki/VP9#Hardware_encoding.2Fdecoding_support

The MS Surface 4s are tablets. Skylake decodes VP9.

S6, S7

Note 6, Note 7

Pretty much any CPU produced this year, like Skylake.

I use VP9 8-bit @ whatever CQ setting looks nice + 160 kbps Opus.
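
For reference, that setup maps to something like this in ffmpeg (a sketch, assuming a build with libvpx and libopus; the CRF value and filenames are placeholders):

```python
import subprocess

subprocess.run([
    "ffmpeg", "-i", "input.mkv",
    # -b:v 0 switches libvpx-vp9 into constant-quality (CQ) mode
    "-c:v", "libvpx-vp9", "-crf", "31", "-b:v", "0",
    "-c:a", "libopus", "-b:a", "160k",
    "output.webm",
], check=True)
```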

Believe it or not, I don't have any device that supports hardware H264 decoding.

Also, AV1 will be here in late 2016/early 2017, so there's no hope for hardware VP9 decoding on phones, since vendors will just skip VP9 and go straight to AOMedia's AV1.

Very informative (thanks for the source!), but I won't consider x86 mobile anytime soon.

>high end, flagship phones which is not something many people purchase
>he's saying not many people buy the best smartphones manufacturers make

That's a joke, right?

Why Opus at 160 kbps? I'm an avid Opus user myself and have concluded that anything above 128 kbps is a waste of bits.
72 kbps VBR Opus sounds better to me than 128 kbps CBR MP3, which is used almost universally on the web (SoundCloud, etc.)

Are you using more bits to accommodate a faster algorithm in order to speed up encoding?

While Opus at 128 kbps is just fine for almost anything, it doesn't hurt to bump the bitrate to 160 kbps. It adds roughly a megabyte to a typical song and a couple dozen megabytes to a full-blown movie, so it's practically irrelevant, but it does technically provide better quality even if you can't actually hear it given your ears.
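
The back-of-the-envelope math, if you want to check it:

```python
# Extra size from bumping Opus 128 kbps -> 160 kbps (a 32 kbps delta).
def extra_megabytes(extra_kbps, seconds):
    # kilobits/s -> bytes/s -> total megabytes
    return extra_kbps * 1000 / 8 * seconds / 1e6

print(extra_megabytes(32, 4 * 60))    # ~0.96 MB for a 4-minute song
print(extra_megabytes(32, 2 * 3600))  # ~28.8 MB for a 2-hour movie
```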

x265 is an implementation that produces h265

potato, potato

Normies won't shell out $700+ for a smartphone. They'll purchase MacBooks instead.

Well, I normally use 128kbps, but I used 160 for a few music videos, where I used the same bitrate "table" that I use for music (160VBR Opus/Q91 AAC/V0 MP3). For things like TV shows, yeah, bitrates over 128 can be overkill.

Whether or not you can hear it depends a lot on your audio equipment as well as your ears. If you use run-of-the-mill speakers or a headset, you're much less likely to spot a difference than with, say, a K702 and an E10K.

>the best phones they make *that year
>implying 95% of the reason to buy them isn't muh prestige
Phones are so damn short-lived and gimped (just look at the awful rate of updates) that I wouldn't buy a high-end one, ever.

x265 is a computer program, h.265 is a format. Learn the difference, you illiterate.

I've got HD598s with the gear to drive them. Nothing special, but spending more on headphones to get a better sound is placebo.
I can confidently say that 96Kbps opus compares to 256Kbps CBR MP3, if encoded with the most expensive algorithm.

I know what they are. You just said the same thing as I did: x265 is a program that produces h265.

Stop being such a fucking uptight nitpicking twat. I was referring to x265 because I had its performance on my machine in mind at that moment, not the general format. Yeah yeah, h265 is what needs to be adopted, and that will be done through x265. Quit being an autistic cunt.

Nobody sane uses CBR. Use V2 or V0.

That said, there definitely is an audible difference between 96 kbps Opus and flac for me.

>x265 IS NOT A GOD DAMNED VIDEO CODEC, IT'S AN ENCODER FOR CREATING H.265 MATERIAL WHICH IS.
>x265 is the program that encodes and decodes a video stream to and from h265
>x265 is not a codec

>h265 is a codec

>standard
vp8 for webm
h264 for mp4/mkv

>ideal

vp9
h265

drop mp4 from that
and h264 in a few years
then it's perfect

I really wish your first statement were true, but on the web (where I personally consume the most MP3), everything uses CBR.

I won't argue there is no difference between 96 opus and lossless, the same way I won't argue there is no difference between 256 MP3 and lossless.
I personally keep an un-encoded copy of all my music in case I stop getting full enjoyment out of my transcoded stuff.

>any year
>CBR mp3

V0 MP3 is normally around 256Kbps, and I ended up with 160 Opus/Q91 AAC/V0 MP3 after failing ABX tests with those settings but not with the ones right before them.

AAC and MP3 seem to have issues with muffled audio and stereo separation at lower bitrates, while Opus just sounds "different". In a spectrogram you can see that it takes a chunk out of the 12.5-15.5 kHz range (96 kbps) and stores 15.5-20 kHz with low accuracy (96 and 128 kbps), so maybe the reason I pass ABX tests at higher bitrates than other people is that I can still hear up to 18.5-19 kHz.
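
If you want to see that for yourself, ffmpeg's showspectrumpic filter (available in ffmpeg 3.0+) renders a spectrogram you can compare against one made from the lossless source (filenames are placeholders):

```python
import subprocess

# Render a spectrogram of the encoded file; the behaviour described
# above should be visible in the 12.5-20 kHz band.
subprocess.run([
    "ffmpeg", "-i", "encoded_96k.opus",
    "-lavfi", "showspectrumpic=s=1024x512",
    "spectrum.png",
], check=True)
```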

CBR is what I come across most when consuming MP3 on the web (soundcloud comes to mind)
I don't store MP3s myself.

Nobody said that CBR isn't common. But it's used by people who don't know what the fuck they're doing.

I agree, it's just that anyone sane enough to know not to use CBR is probably using AAC.

*probably using vorbis or opus

AAC isn't bad, but it's only a last resort imho because it's shitty proprietary crap and you have to jump through hoops to get a codec. I don't get why anyone in the fucking world puts up with all that licensing and other encumbrance when free stuff exists and is just as good.

I only download YIFY.

You're not thinking like a plebeian.
Knowing not to use CBR doesn't necessarily mean you're smart. AAC has a very high adoption rate.

AAC outperforms Vorbis.
At least Opus is the best codec overall, the only downside being that it's more taxing on low-end CPUs than MP3 or straight lossless.

>AV1 will be here in late 2016/early 2017
March 2017 is the current expected date for a finalized spec. Polaris already has AV1 support, and I wouldn't be shocked if Pascal does too, considering both AMD and Nvidia are members of the consortium behind the AV1 codec. Google has also announced they'll be converting all 2k, 4k, and 8k YouTube videos to AV1 within 6 months of the final spec's release.

The industry appears fully willing to move to AV1 on both hardware and software ends as soon as they can.

How do we know AV1 won't be another meme video codec like Daala?

It has huge industry support from streaming giants (Google, Netflix, Amazon, Mozilla, Microsoft) as well as hardware from Nvidia, AMD, and Intel.

Did I just see a thread of diverse opinion, sane argumentation, respect, and a lot of stuff to learn on Sup Forums?
Thanks everyone, I learned stuff today. Sadly I can't participate any other way than thanking all of you guys.

en.wikipedia.org/wiki/AOMedia_Video_1#Adoption

This should explain it.

How do H264, H265 and VP9 compare in 10-bit CPU decoding performance?

If you wanna compare h.264 and h.265 yourself:
jell.yfish.us/
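
A crude way to measure it yourself with those test files: ffmpeg's -benchmark flag decodes to a null sink and reports CPU time at the end (the filenames are placeholders for whatever clips you grab):

```python
import subprocess

clips = ["jellyfish-h264.mkv", "jellyfish-hevc.mkv", "jellyfish-vp9.webm"]

for clip in clips:
    print("==>", clip)
    # "-f null -" throws the decoded output away, so the run measures
    # pure decode cost; ffmpeg prints a "bench:" summary when done.
    subprocess.run(
        ["ffmpeg", "-benchmark", "-i", clip, "-f", "null", "-"],
        check=True,
    )
```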

>How do H264, H265 and VP9 compare in 10-bit CPU decoding performance?
H264 is the easiest on the CPU. I have a Moto E (SD 410) that plays 720p30fps h264 video in hardware mode with no stutter whatsoever. However, on that same device, 720p30fps h265 video will stutter often; it's the same story for 720p30fps VP9, and even 480p30fps VP9 video will stutter sometimes.

All videos played in MX Player btw.

A modern SoC will have h.264 hardware decoding, and flagship phones will have HEVC and VP9 decode built in as well; it won't even use the CPU.

I'm trying to build an archive of my favorite movies/shows from when I was young. I intend to store these on a custom-built HTPC for 30+ years.
What is the most future-proof codec?
Let's say you don't give a shit about encode time or file size.
I want high quality video/audio that will be playable a long time in the future.

just store remuxes of the DVDs/Blu-rays directly

>What is the most future-proof codec?
10-bit HEVC, it's gonna be here for a very long time.

>Let's say you don't give a shit about encode time or file size.
Then just encode the stuff with CRF 0 (lossless encoding) 10-bit HEVC on the slow preset. This will often give you smaller-than-source file sizes while maintaining the source quality. DO NOT just mux the video; that guarantees bloated file sizes, since DVDs/Blu-rays use some shitty 8-bit H264 or MPEG-2 video.

>I want high quality video/audio that will be playable a long time in the future.
What I said above + 192 kbps VBR Opus audio. There's no point in using a higher Opus VBR setting or FLAC, since at that point the audio is already transparent to human ears. This assumes stereo audio; if it's something like 5.1 surround, then use something like 512 kbps VBR Opus.
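
As a concrete sketch of that recipe (assuming ffmpeg built with a 10-bit-capable libx265 and libopus; filenames are placeholders). One caveat: with ffmpeg's libx265, true lossless is requested via the explicit lossless option rather than CRF 0 alone, unlike x264:

```python
import subprocess

subprocess.run([
    "ffmpeg", "-i", "source_remux.mkv",
    "-c:v", "libx265", "-preset", "slow",
    "-pix_fmt", "yuv420p10le",          # 10-bit output
    "-x265-params", "lossless=1",       # mathematically lossless mode
    "-c:a", "libopus", "-b:a", "192k",  # libopus defaults to VBR
    "archive.mkv",
], check=True)
```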

Download the latest 10-bit HEVC encoder here: builds.x265.eu

Just replace the stock HEVC video encoder in MeGUI with the latest one and you'll be ready to go. The reason I say to use the latest one is that it's more efficient than the first few releases.

>HEVC
While I want to agree, I can't. HEVC will be good, but the whole point of AV1 is to make an equivalent codec without the royalties.

>While I want to agree, I can't. HEVC will be good, but the whole point of AV1 is to make an equivalent codec without the royalties.
The royalty fees are for commercial use. Since you're encoding media with HEVC for personal use, you won't be charged royalty fees.

Anyway, AV1 is meant to compete with HEVC, not replace it, so at best it will be like 10-20% more efficient. In addition, it will be a while before AV1 is finalized, stable, and ready for the general public to use. Even after all that, it will still be a few years before hardware AV1 decoders exist.

So basically you have 2 options:

A. Use 10-bit HEVC to encode your media library today
or
B. Wait 4-8 years for AV1 to mature into a non-meme video codec

The question was future-proofing. Knowing Intel, Nvidia, ARM, and AMD are involved means the hardware encoders and decoders will come.

Google and Netflix mean the media will come

I know I won't have to pay, and I do consider HEVC the best right now, but for the future AV1 has the most potential to overthrow HEVC because those companies have to pay (or have Cisco pay)

With Mozilla and Google in the fray, I fully expect them to implement it in their browsers too, even if HTML5 or HTML6 doesn't spec it.

>The question was future-proofing. Knowing Intel, Nvidia, ARM, and AMD are involved means the hardware encoders and decoders will come.
True, but that shit takes time, like years. Also, in general HEVC will have more hardware decoding support than AV1 simply because more companies will use it.

>Google and Netflix mean the media will come
We don't know when, though. This could be as little as 4 years if companies invest a ton of money and research into AV1, which may or may not happen.

>I know I won't have to pay, and I do consider HEVC the best right now, but for the future AV1 has the most potential to overthrow HEVC because those companies have to pay (or have Cisco pay)
AV1 is meant to compete with HEVC in streaming scenarios; it won't magically replace HEVC in everything.

Anyway, I think you're misunderstanding the point of AV1. It's not supposed to overthrow HEVC, and it won't. It's simply meant to be a better video codec than VP9 with more hardware support. It's also not even meant for personal use; its main goal is to create a better video streaming codec.

Anyway, depending on how much money companies throw at AV1, it will take ~4-8 years before this video codec becomes usable and has some hardware decoding support.

My advice: just use HEVC like everyone else. The wait is simply not worth it. From the sound of things AV1 will take longer to encode, and it may be a shit video codec like VP9 for personal use. With HEVC you just select a preset and a CRF and you're ready to go. With codecs like VP9 you have to jump through many hoops to get similar output quality; AV1 will probably be the same.

Google and Netflix are among the founders

The plan is a March 2017 release

>The project will release new video codecs as free software under the Apache License 2.0
Consumers will use it

I didn't know this but Adobe recently jumped on board too

I also agree use HEVC right now.

>Google and Netflix are among the founders
>The plan is a March 2017 release
That's going to be the first version of AV1, which won't have any hardware decoding support at all and will be riddled with bugs and problems.

>Consumers will use it
lolno. Most people don't want to spend days reading manuals and playing with 10+ parameters to get good video encodes. Hell, most can't even use a CLI.

>I didn't know this but Adobe recently jumped on board too
Talk is cheap and actions speak louder than words, ya know? If we expect AV1 to become a worthy competitor to HEVC as a video streaming codec, then all of these companies will have to pool hundreds of millions of dollars to pay for devs and shit. That's a pretty big commitment desu.

>I also agree use HEVC right now.
Yeah, it's just simply better for personal use, and AV1 will most likely bring more problems for a regular user. This is, after all, a complex video codec that will mainly be used on servers.

Anyway, AV1 seems like an interesting video codec for streaming if companies add hardware decoding ASAP; we may see it become a worthy codec in ~4 years if companies throw lots of money at it. Hopefully it won't turn out to be another huge meme like Daala was.

Good luck user, hope your HEVC encodes don't take too long. Also remember to use the 10-bit encoder, as it reduces color banding across all CRF values. 10-bit HEVC hardware decoding has little support now but will grow significantly in the next year.

Intel and AMD have been involved with the project for how long? What makes you so sure Vega and Cannonlake won't have decoders in their SIPs?

By "consumers" I mean it will be implemented in popular media programs like ffmpeg, HandBrake, and OBS

We also have never seen an alliance for media like this before. Remember, this is an answer to the royalties, an answer so they no longer have to spend hundreds of millions of dollars.

I don't use HEVC because I have a shit CPU and my (old) GPU still has a dedicated h.264 encoder. I don't hate HEVC or anything, it's just not as old as h.264 yet.

You keep saying 4 years as if, again, you're comparing this to some other situation. This project has been going on for nearly two years and was born of the unreleased VP10 and others. I may be overestimating it, but you're clearly underestimating the alliance too.

user said size wasn't a problem so why couldn't he use 10bit h.264?

I double-checked the date; it wasn't nearly two years, my mistake.

>and it wouldn't be a big jump from h264

You are seriously wrong. With a slow preset and a proper RF, I can get files down to 40% of their H264 size with nearly unnoticeable quality loss.

Not the same user btw.

>Intel and AMD have been involved with the project for how long? What makes you so sure Vega and Cannonlake won't have decoders in their SIPs?
Because they haven't even released the first version of AV1 yet. Even after they do, it will take at least a year to design, test, and finalize a working hardware decoder for AV1.

>By "consumers" I mean it will be implemented in popular media programs like ffmpeg, HandBrake, and OBS
Even if those programs get AV1 encoders, it doesn't mean people will use it. AV1 isn't designed to replace HEVC, so it may not be as user-friendly with the parameters.

>We also have never seen an alliance for media like this before. Remember, this is an answer to the royalties, an answer so they no longer have to spend hundreds of millions of dollars.
Talk is cheap, like user said. Remember how Anonymous was gonna change the world, and how they made all of those great big speeches and even showed up to a few protests IRL? Now they're just a joke.

>I don't use HEVC because I have a shit CPU and my (old) GPU still has a dedicated h.264 encoder. I don't hate HEVC or anything, it's just not as old as h.264 yet.
You really shouldn't be using GPU encoders. They offer the worst quality possible for any given bitrate. It's like using the ultrafast preset with the baseline profile on the CPU.

>You keep saying 4 years as if, again, you're comparing this to some other situation. This project has been going on for nearly two years and was born of the unreleased VP10 and others. I may be overestimating it, but you're clearly underestimating the alliance too.
4 years is a good estimate though. It took HEVC almost 4 years to become relevant. AV1 will take just as long, if not longer.

>user said size wasn't a problem so why couldn't he use 10bit h.264?
The other user probably said that because there are more 10-bit HEVC HW decoders than 10-bit h.264 decoders. Companies pretty much gave up on 10-bit h.264 but not 10-bit HEVC.

Are you using the latest x265 2.0 encoder? I heard they've reached 50% compression efficiency over H264 now.

builds.x265.eu

YouTube adopted it because vp9 is developed by Google.

I'm saying they can test before its release; there's no reason people involved in the project have to wait until day 1 to start testing.

This is true

Anonymous isn't an alliance of corporations fed up with paying royalties. The hardware vendors have been making codecs in hardware for years; they're fully capable of pulling this off.

YouTube already compresses my videos to shit anyway; even at 1080p @ 50 Mbps bitrate, they just destroy quality.

support.google.com/youtube/answer/1722171?hl=en

HEVC launched in 2013. It's in the Skylake SIPs, which launched in 2015,

and in the 900 series Maxwell GPUs (albeit not 10-bit until Pascal) in 2014,

and in 2015 in the R9 300 series (again not yet 10-bit). None of these took 4 years.

That was without being a part of the creation process

Ah, I see, thank you for clarifying.