10-bit HEVC video encoding thread

Video Encoder: sourceforge.net/projects/megui/
Latest 10-bit x265 encoder: builds.x265.eu/
Why 10-bit encoders are better than 8-bit ones: x264.nl/x264/10bit_02-ateme-why_does_10bit_save_bandwidth.pdf

720p 800 Kbps 10-bit HEVC sample: mediafire.com/download/6fob4pzy0voaapu/10-bit HEVC.7z
720p 10-bit H264 samples: mediafire.com/download/oo5luitse6izc2c/10-bit H264.7z
The 10-bit H264 samples include 800, 1144, 1344, and 1600 Kbps bitrates. All samples include 96 Kbps VBR stereo Opus audio.

10-bit HEVC encoding speed estimates when using the fast preset, an ABR of 800 Kbps, and transcoding an HQ 720p version of Big Buck Bunny (a rough example command is sketched after the list):

Intel Pentium 4 (1 physical core @ 3.6 GHz): ~1.5 FPS

Intel Atom x5-Z8300 (4 physical cores @ 1.4-1.8 GHz): ~3.5 FPS

AMD Athlon 5350 (4 physical cores @ 2.1 GHz): ~8 FPS

Intel i3-4170 (2 physical cores @ 3.7 GHz): ~16 FPS

AMD A8-7600 (4 physical cores @ 3.2-3.8 GHz): ~16 FPS

Intel i7-4770K (4 physical cores @ 3.5-3.9 GHz): ~30 FPS

AMD FX-9590 (8 physical cores @ 4.7-5.0 GHz): ~30 FPS

Intel i7-5960X (8 physical cores @ 3.0-3.5 GHz): ~50 FPS
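
For reference, the sort of command these estimates assume looks roughly like this. A sketch only, not the exact command I used: it assumes an ffmpeg build linked against 10-bit libx265, and the file names are placeholders:

  ffmpeg -i bbb_720p.mkv \
    -c:v libx265 -preset fast -b:v 800k -pix_fmt yuv420p10le \
    -c:a libopus -b:a 96k bbb_720p_hevc10.mkv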

Other urls found in this thread:

youtube.com/watch?v=gPbVRpRgHso
en.wikipedia.org/wiki/Unified_Video_Decoder
en.wikipedia.org/wiki/Unified_Video_Decoder#UVD_6
screenshotcomparison.com/comparison/174224
screenshotcomparison.com/comparison/174225
wccftech.com/amd-zen-cpu-performance-double-fx-8350/
forum.doom9.org/showthread.php?t=167963
wiki.webmproject.org/ffmpeg/vp9-encoding-guide

Neat

Like so many have said in the past thread, we are not 100% ready for HEVC, but we are getting there. There is still no HW support for 10-bit HEVC decode on most laptops and virtually all phones. Encoding times are still rather long even on desktop processors. I would advise you all NOT to attempt to encode 1080p HEVC unless you have something better than an i3/A8 CPU, as encoding times will be excruciatingly painful.

The encoders on public trackers fucking trigger me. They often use an x264 file as a source, and instead of the medium preset they choose superfast with a 2-pass encode.

I thought I knew nothing about encoding, but apparently these kiddos are worse.

I want to apologize for the pic I posted last thread; I made a mistake in what preset I used to encode. I used the fast preset, not the slow preset. Again, sorry for that fuck-up.

pic related is the corrected version

Their mission is to upload the file asap.

>Intel i7-5960X (8 physical cores @ 3.0-3.5 GHz
Literally why? Getting 4.4GHz is trivial.

When will x265 become mainstream?

>They often use an x264 file as a source, and instead of the medium preset they choose superfast with a 2-pass encode.
lol yeah, you never use a 2-pass ABR or choose something faster than the "fast" preset.

It's in vain since they use a shitty 2-pass ABR desu. They would have much better quality and maybe even take less time if they used the fast preset with a CRF value.

My estimates only reflect the processors at stock settings because not all of them can be overclocked much. However, if you did manage to overclock an i7-5960X to 4.4GHz, then you might get close to 60 FPS, which is pretty cool.

When the anime and piracy groups properly adopt it.

>all those tears when Daiz drops the HEVC on the lowtier weebs

Nice lucky trips.

Well, it all depends on how soon phone/laptop manufacturers adopt HW HEVC decoding for their products. Maybe in the next 1-2 years we will see HW HEVC decoding on most phones and laptops.

See, for HEVC to go mainstream, people must be able to play their chinese cartoons/movies on their phones and laptops without sacrificing too much battery or constantly overheating. Cheap (maybe $300) 8-core Zen processors with the stock multi-core performance of an i7-5960X are supposed to go on sale as soon as Q4 2017, so encoding HEVC won't really be a problem. The real problem preventing HEVC from going mainstream is HW decoding support.

>if you did manage to overclock an i7-5960X to 4.4GHz

It's not even a question of if, really; as long as you have a halfway decent cooler, 4.4GHz is dead simple.

4.5GHz+ can be tricky on Haswell-E, especially on the 5960X, but I have yet to see a single 5960X that couldn't get stable at 4.4GHz and 1.3v. I'm sure there are a few out there that just have shit OC potential, but Intel bins most of those as 5930K/5820K stock.

HEVC is pretty neat. Some shows have been getting great-looking ~800 MB 1080p releases for a 40-minute show.

HW decoding support doesn't help with 10 bit, does it? I don't think anyone builds 10 bit decoders in hardware.

>When the anime and piracy groups properly adopt it.
This too.

>all those tears when Daiz drops the HEVC on the lowtier weebs
Hopefully Daiz will put HEVC on hold until more HW decoding support arrives. We already have HEVC HW support on Carrizo APU laptops, I think.

Realistically the 5960X tops out at 4.2 on average. A chip that does 4.4~4.5 sells for above retail and is considered golden.

>HEVC
this is a free software board desu.

>tfw your 5820k is stable at 4.7GHz and under 1.3v

Unfortunately the encoders don't know what the fuck they're doing. They could lower that file size further and still maintain relatively good quality.

Are the chinese making fun of Americans in that gif?

youtube.com/watch?v=gPbVRpRgHso

Doubt it's stable. Try running OCCT.

THANK YOU BASED NVIDIA

I read GPU encoding leads to lower compression and more keyframes (lower quality).

>HW decoding support doesn't help with 10 bit, does it?
It does, HW decoding in general extends battery life and reduces heat output on devices.

>I don't think anyone builds 10 bit decoders in hardware.
For phones and laptops? Pretty much, which is a shame; 10-bit video is far superior to 8-bit video.

It's CPU-Z validated and can run Cinebench R11.5 and R15 without crashing. I will try an encode later tonight at those settings and see if it crashes. If it doesn't, then it's stable enough for my uses. I'm not doing 24/7 number crunching; I do encoding and some desktop usage.

Why is HEVC bigger than H.264?

The hwbot encoding test is actually very difficult to pass. I'd say if you can pass that at least a few times it's probably stable enough for daily use.

I mean, it's already fairly impressive that it boots to 4.7GHz at 1.274v; I've seen a 5820k struggle at 4.5GHz with 1.3v.

Nvidia will have GP107 for notebooks, HEVC Main10/Main12 hardware decoding will be available

Intel will have HEVC Main10 hardware decoding in Kaby Lake Core iX CPUs & Apollo Lake SoCs for low cost PCs

>It does, HW decoding in general extends battery life and reduces heat output on devices.
In theory, but it's not worth anything if nothing has 10-bit decoding hardware.

Are 8-bit decoders totally useless for 10-bit video? Is there no way whatsoever to take advantage of the 8-bit hardware with a 10-bit file?

Skylake has hybrid/partial Main10 decoding but it's horribly slow and useless for 4K and not power efficient at all compared to a full hardware decoder

My 5930K boots to 4.7 as well, but I could only get 4.4 at 1.38v out of it. I beat it to hell, though, and passed every stress test. OCCT's large data set was the hardest and forced me down from a stable 4.5 at 1.25v.

If you were trying to maximize core count for encoding purposes, what kind of setup would be most cost effective?

Does RAM matter at all?
I imagine regular spinning disks would be fine because of how shit slow the process is.

I'm thinking some kind of Xeon E5 v1.
Engineering samples sound somewhat dangerous for this task though.

Not bad, was expecting worse.

Those $50 12-core Sandy Bridge Xeons.

>"UVD 6.1: STONEY (HEVC=Yes, Max.Size: 4K, Supports 10bit Color Channel Depth for Ultra Blu-ray Video)"

en.wikipedia.org/wiki/Unified_Video_Decoder

It looks like Stoney Ridge APUs (x86 Excavator) will also feature 10-bit HEVC HW decoding support. Carrizo APUs and Rx-3xx GPUs already support 8-bit HEVC HW decoding.

NO

None of the Rx-3xx cards support HEVC decoding, not even Main (8-bit); Tonga does not support Main at all.

If you don't need it 24/7, I'd consider renting a server. A beefy EC2 instance is cheap enough if you only need it for the occasional encode.

>If you were trying to maximize core count for encoding purposes, what kind of setup would be most cost effective?
I'd wait for AMD to deliver ZEN. Their 8-core flagship is supposed to compete with the i7-5960X at least on stock settings.

>Does RAM matter at all?
Not really. Video encoding doesn't use a lot of RAM or bandwidth. As long as you have about 1GB of RAM free, you should be okay.

>I imagine regular spinning disks would be fine because of how shit slow the process is.
This I'm not so sure about. Maybe an SSD might speed up encoding, but probably not by much.

>"The UVD version in "Fiji"- and "Carizzo"-based graphics controller hardware is also announced to provide support for High Efficiency Video Coding (HEVC, H.265) hardware video decoding"

en.wikipedia.org/wiki/Unified_Video_Decoder#UVD_6

Are you sure? Did AMD not keep their word? I heard you have to be using the latest Catalyst driver and a compatible video player (MPC-HC, I think) for HEVC HW decoding to be available.

Tonga is UVD 5.0, don't you even see the entry above UVD 6.0?

The wiki page is wrong, period.

Should I be using hyperthreading when encoding?

VP9 is a nice video codec that all streaming websites should adopt because you don't have to pay anything to use it. However for personal use HEVC beats VP9 in terms of ease of use and encoding speed. We also have growing HW decoding support for HEVC (mostly 8-bit for now).

Not really. Hyperthreading is only really useful when a program can only efficiently use 1 or 2 threads (e.g. ARMA 3). HEVC encoding can efficiently use all the threads you have, so whether you turn hyperthreading on or not won't really affect performance much.

Nice. Looks like you can get away with encoding 1080p HEVC video.

When will the 720p meme end?

Come on, it's fucking $current_year, guys.

Hahaha. No, but seriously, encoding HEVC is a fucking painful experience unless you have a really nice CPU like that anon does.

Hopefully AMD Zen will change all of that.

Is AMD Zen 8 real cores or 8 fake cores?

Here is a screenshot comparison of the Big Buck Bunny rips I did:

800 Kbps 10-bit HEVC vs 800 Kbps 10-bit H264: screenshotcomparison.com/comparison/174224

800 Kbps 10-bit HEVC vs 1600 Kbps 10-bit H264: screenshotcomparison.com/comparison/174225

>reencoding

LMAO
M
A
O

They're real.

wccftech.com/amd-zen-cpu-performance-double-fx-8350/

Go away, placebofag. I bet you use 384 kHz sample rate/128-bit FLACs.

Sorry for being clueless, but what is the approximate bitrate for "normal" (typical quality with typical encoding) 720p video?

Is 800 Kbps a lot or a little compared to the files most people play?

Thanks YIFY

>Sorry for being clueless, but what is the approximate bitrate for "normal" (typical quality with typical encoding) 720p video?
It depends on the source and can vary greatly, but generally 1000-2000 Kbps 10-bit H264 is good for chinese cartoons and 2000-4000 Kbps is good for action movies. Because we cannot gauge the exact best bitrate for encoding a specific video, and because ABR is inherently inefficient, we use a CRF value instead. Generally for 10-bit H264, 22 CRF will give you okay quality, 16 CRF will give you high quality, and 28 CRF will give you YIFY quality across all resolutions and source types. You can see why using a CRF is the smarter choice.

>Is 800 Kbps a lot or a little compared to the files most people play?
Depends on the source. This would be okay for chinese cartoons, most likely. But again, using a fixed bitrate is inefficient, and you'll get better quality output if you use a CRF value.
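
To make that concrete, a CRF encode through ffmpeg might look something like this (a sketch, assuming an ffmpeg build with 10-bit libx264; file names are placeholders):

  ffmpeg -i input.mkv \
    -c:v libx264 -preset medium -crf 16 -pix_fmt yuv420p10le \
    -c:a libopus -b:a 96k output.mkv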

Not saying we should all aspire to encode videos at YIFY quality. Even the creator of YIFY himself said his movies looked like shit. His goal was to distribute small-file-size movies that even people with dial-up connections could download. He chose compactness over quality.

Anyway, there is a point where the video will look exactly the same as the source while being orders of magnitude smaller. A ~4-6GB 720p 10-bit H264 rip will look pretty much the same as the 15-20GB 720p Blu-ray source if encoded right.

Why have you not joined the Nvidia HEVC Main12 master race, Sup Forums?

>Why have you not joined the Nvidia HEVC Main12 master race, Sup Forums?
12-bit HEVC is pretty fucking dumb; the benefits stop at 10-bit. Anything above that is placebo.

I'm sorry, I forgot to answer you. Well, honestly I'm not sure, but the file size difference was very little and insignificant in the end (like 2%). The ABR in 10-bit HEVC might work a little differently than the ABR in 10-bit H264; I know the CRF values are not compatible. I did input 800 as the ABR for both 10-bit HEVC and 10-bit H264 when I did those encodes.

What exactly is the point of 10-bit anyway? I was under the impression this referred to color space. I have a few Blu-Ray movies, but most of my stuff is on DVD. I don't think MPEG2 had 10-bit color per channel. If anyone could enlighten me on the subject of why it's so important when the original source was 8-bit, I would be grateful.

>What exactly is the point of 10-bit anyway?
To reduce color banding and improve the quality of video at the same bitrate compared to 8-bit video. Even with an 8-bit source, encoding at 10-bit gives the encoder more internal precision, which means less rounding error, less banding, and better compression.

>I was under the impression this referred to color space. I have a few Blu-Ray movies, but most of my stuff is on DVD. I don't think MPEG2 had 10-bit color per channel. If anyone could enlighten me on the subject of why it's so important when the original source was 8-bit, I would be grateful.
See the PDF link I posted at the start of the thread up there.

Whoops, meant to quote you too.

I'd rather they improve the quality as much as possible and make the releases 1GB. It's not like anyone's short on space/bandwidth these days.
I'm not sure I've ever seen a "perfect" 1080p TV release where textures don't shift around due to compression, especially in faces. Those really make bad compression stand out.

Figure I will ask here.

I'm on an ancient CPU and am waiting on Zen, so upgrading, while it would be the best option, isn't one I'm taking.

I have a 280X.

Is there ANY video editing program that uses the fucking GPU to render the video? I don't care about it being "lower quality"; all I want is to encode at least 720p in real time. God knows my 5770 could do that for 1080p, but for some reason the 280X refuses to do anything.

Hell, is there an open source video editor that isn't a complete bitch to use?

>I'd rather they improve the quality as much as possible and make the releases 1GB. It's not like anyone's short on space/bandwidth these days.
True, we even have unlimited internet on phones now. However, making the releases 1GB for the sake of making them 1GB doesn't really make much sense. Instead they should use a CRF value, which will make each episode vary in file size based on how much motion it has. This improves overall quality and keeps file sizes down. Nobody should follow in the steps of bloatgirls.

>I'm not sure I've ever seen a "perfect" 1080p TV release where textures don't shift around due to compression, especially in faces. Those really make bad compression stand out.
Actually, they make encodes that do not use a CRF stand out. When you use an ABR, the encoder targets a specific bitrate, not quality, so if there is a scene that temporarily needs 2-5 Mbps it will be bitrate-starved and you will notice artifacts. Even 2-pass ABR suffers the same problem, though quality will be a little better.

Fact is, when you use a CRF the encoder targets a specific quality and only uses the bitrate (derived from things like motion and video dimensions) needed to match that quality. The only reason ABR exists is for when you need to maintain a specific bitrate (i.e. streaming / fitting video on a DVD).

I don't think any video editor does that. GPU HW encoding has always been spotty at best with AMD GPUs.

If you're going the open source route, then you're gonna have to tough it out and learn how to use Blender as a video editor.

As for the real-time 720p encoding thing, you can already do that by using the ultrafast preset.

What CPU do you have? I can get more than 20 FPS encoding 720p 10-bit HEVC video with the ultrafast preset and 22 CRF.

GPU encoding is like using the ultrafast preset btw.
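
Something in this direction, roughly (again a sketch, assuming a 10-bit libx265 ffmpeg build; file names are placeholders):

  ffmpeg -i input.mkv \
    -c:v libx265 -preset ultrafast -crf 22 -pix_fmt yuv420p10le \
    -c:a copy output.mkv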

bump

page nine save rave

>Even 2-pass ABR suffers the same problem, though quality will be a little better.

Do you even know how 2-pass ABR works? The 1st pass calculates a CRF that fits the filesize, and the second pass is a CRF encode at that chosen CRF. If it's anything like x264, there's literally no difference between a 1-pass CRF that gets you that filesize and 2-pass ABR.

1-pass CRF will get you the exact same results as 2-pass ABR. If you don't care about a certain filesize, then sure go ahead with CRF. If you are a professional mastering studio and you need to cram a studio ProRes 4:2:2 onto a BD disk at the maximum quality, you probably will encode at 80 Mbps 2-pass ABR instead of doing trial and error to find an optimal CRF that will max out bitrates on the disk.
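
For reference, the 2-pass dance through ffmpeg looks roughly like this (a sketch only; the first pass writes an x265_2pass.log stats file, and on Windows you'd write to NUL instead of /dev/null):

  ffmpeg -y -i input.mkv -c:v libx265 -preset fast -b:v 800k \
    -x265-params pass=1 -an -f null /dev/null
  ffmpeg -i input.mkv -c:v libx265 -preset fast -b:v 800k \
    -x265-params pass=2 -c:a libopus -b:a 96k output.mkv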

Phenom II 955 Black.

Thing is good enough for general computing and works for playing games, but when I want to do anything remotely CPU intensive it shows its lack of power. Zen is the deadline: I either get Zen or I get a 6-core Intel.

>Do you even know how 2-pass ABR works? The 1st pass calculates a CRF that fits the filesize, and the second pass is a CRF encode at that chosen CRF. If it's anything like x264, there's literally no difference between a 1-pass CRF that gets you that filesize and 2-pass ABR.
WRONG. CRF targets quality, whereas 2-pass ABR targets a file size down to the MB. Yes, both will look identical in the end, but the 2-pass ABR takes longer and you have no fucking idea what the overall quality of the video will be. With CRF you will know the quality but not the filesize, i.e. you know CRF 28 will look like shit and CRF 16 won't.

Whoa, thanks for saving my thread, user-kun. I thought it was finished and bankrupt.

Sorry, I got that confused with 2-pass VBR/bitrate.
But you should read this.

forum.doom9.org/showthread.php?t=167963

Why and how does 10 bit help me?

>If you are a professional mastering studio and you need to cram a studio ProRes 4:2:2 onto a BD disk at the maximum quality, you probably will encode at 80 Mbps 2-pass ABR instead of doing trial and error to find an optimal CRF that will max out bitrates on the disk.
Good point, but this rarely applies to the average user/pirate. By the time you finish one 2-pass ABR encode you could have done two CRF encodes.

It's not twice as slow; maybe 1.5x or something.
The 1st pass to determine what CRF to use is much faster than a normal CRF encode.

And even if it was twice as slow, 2-pass VBR is still beneficial for targeting a certain filesize. If you're trying to cram 80 Mbps onto a disk, CRF 16 and CRF 17 might be a +/- 5 Mbps difference, which is gigantic.
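
Rough back-of-envelope math for hitting a size target, for anyone who hasn't done it:

  video bitrate ≈ (target size in kilobits / duration in seconds) - audio bitrate
  e.g. a 4 GB disc and a 2-hour movie: (4,000,000 kB x 8) / 7,200 s ≈ 4,444 Kbps total, minus 96 Kbps audio ≈ 4,348 Kbps for video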

>Why and how does 10 bit help me?
PDF link is at the start of the thread but generally it's because "muh reduced color banding".

>And even if it was twice as slow, 2-pass VBR is still beneficial for targeting a certain filesize. If you're trying to cram 80 Mbps onto a disk, CRF 16 and CRF 17 might be a +/- 5 Mbps difference, which is gigantic.
It is twice as slow, unless you're dumb enough to turn turbo first-pass on. Targeting a file size is rarely needed now; I mean, most people have probably never used a CD/DVD in the last 5 or so years.

I'm not saying 2-pass ABR is completely useless, but for the average user it's not really needed anymore.

Look on the bright side: At least they're better than the encoders in actual encoding studios

>OP still using lossy material for the encodes
>I told you to use the lossLESS version of Big Buck Bunny to do your testing
>anything less is just fucking stupid
>stupid fucking people will be the death of us all

>HW decoding support doesn't help with 10 bit, does it? I don't think anyone builds 10 bit decoders in hardware.
HEVC hardware decoders will support 10 bit, whether by choice or otherwise.

>HEVC beats VP9 in terms of ease of use and encoding speed
>encoding speed
does it really? all I see is people not knowing how to use the proper flags to encode VP9, like -speed

VP9 isn't ready for personal use yet. Last I checked it doesn't even have a CRF mode.

YouTube already uses VP9 by default on Chrome and Firefox.

>crf is the quality value (0-63 for VP9). To trigger this mode, you must use a combination of crf and b:v 0. bv MUST be 0.
wiki.webmproject.org/ffmpeg/vp9-encoding-guide

It looks like a CRF mode is already available for VP9.
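
Going by that guide, a constant-quality VP9 encode would be something like this (a sketch; file names are placeholders):

  ffmpeg -i input.mkv -c:v libvpx-vp9 -crf 33 -b:v 0 \
    -c:a libopus -b:a 96k output.webm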

How do I select H.265 in MEGUI?

Go to: settings
Go to: external program configuration
Check: x265
Click: save

It should now download the latest 8-bit x265 encoder to the "tools" folder in the MEGUI directory.

Finally, just replace the old x64 x265 encoder there with the latest one from: builds.x265.eu

What is this?

New video format? New player?

I can't open any of these links at the moment, dumb it down for me

I don't really know if I'm doing this right.

HEVC used to be a piece of shit. It would take forever to encode, and in the end you would only shave like 30% off the total file size. Now it's faster to encode and the compression gain seems to have hit 50%.

Also, it turns out 8-bit encoders are a piece of shit because of color banding and because they need higher bitrates for the same quality.

>babbys first encoder
Use MEGUI, family.

>43fps
holy shit mang, what cpu do you have? That's fucking fast for hevc.

Yeah, GPU encoding is garbage currently, other than Intel Quick Sync or w/e the fuck it is.

I'm trying it out and the GUI is shit and confusing. I'm sure it allows a lot more fine control over your settings, but it could use a lot of work on organization.

2x Xeon X5650

>an HQ 720p version of Big Buck Bunny
Which one of the sample files does this correspond to?

also the motherboard I'm using has a feature called "Intel Enhanced Turbo Boost" that lets all cores run at the turbo clock all the time. It might help.

>using a lossy encode the OP made from a lossy encode to make yet another lossier encode
>stupid fucking people will be the death of us all