HEVC / x265: what's the hype about?

Sup Forums, daily lurker reporting in with a question

So I recently bought a new TV, an LG OLED55C6V. After doing all the setup etc., video is hooked up over HDMI to the PC and audio over HDMI to the amp.

I start downloading some high-file-size "HEVC 2160p" releases, one of the files being something like 70GB (the new Batman movie). I have quite a good PC, but my CPU instantly went to 95-100% during playback and couldn't properly play the video in mpv, VLC or MPC (i5 4590K OC'd to 4.2GHz, GPU GTX 770 2GB; I don't think the GPU matters). But I was surprised to see that my PC couldn't handle the file. I put the file on an external HDD, connected it straight to my TV, and it suddenly plays flawlessly... I find it hard to believe this TV has more processing power than my PC, but this is actually beside the point.

I don't SEE a major difference when I play a normal 1080p x264 12GB release vs the 2160p x265 10-bit 70GB release, apart from the clearly audible audio difference (DTS vs DTS-HD). Even when my TV jumps into HDR mode after automatically recognizing the x265 file, the on-screen difference is minimal.

So what exactly is the great benefit of playing a 70GB HEVC 10-bit file on a 10-bit TV? And if it's "highly efficient", why the fuck is it such a large file for such minimal gains?

Back in the day when x264 came out it made such a massive difference, but with x265 I just don't see what the hype is all about (literally).

Can someone enlighten me? Googling didn't help.

Other urls found in this thread:

pastebin.com/sRpVCjCa
thepiratebay.org/torrent/15544777/Batman_v_Superman_Dawn_of_Justice_(2016)_2160p_4K_UltraHD_BluRay
youtube.com/watch?v=fHa10EbyKTk
draw.io/
builds.x265.eu
en.wikipedia.org/wiki/Ultra_HD_Blu-ray
youtube.com/watch?v=cI-a4WZWwZc
demo-uhd3d.com/fiche.php?cat=uhd&id=145
twitter.com/NSFWRedditVideo

tl;dr

wow, that's a lot of words that I didn't read

I do apologise, as I copied the post from a forum.

I've adjusted it to "0-second attention span" standards for the usual Sup Forums crowd.

What exactly is the great benefit of playing a 70GB HEVC 10-bit file on a 10-bit TV if the difference from a standard x264 1080p file is minimal?
And if it's "highly efficient", why the fuck is it such a large file for such minimal gains?

idk

The benefit is being stuck with a patent-encumbered codec and gaining 2 bits more color.
It's useless, and your TV probably has hardware support for it while your computer doesn't, or you haven't turned it on.

Sounds like a shit encode. The resolution improvement alone should be noticeable.

>x265 encodes

Utter trash. All good encodes are still x264.

>And if it's "highly efficient", why the fuck is it such a large file for such minimal gains?

An equivalent x264 encode would be 150GB+.

H.265 is going to be deprecated next year when AV1 releases; all the major players in tech and media are pushing AV1 because H.265 is patent-encumbered bullshit.

The human eye can't tell the difference between a 2GB file and a 100GB HEVC 10-bit file.

x265 encodes are substantially smaller in file size at roughly the same quality

Tfw HEVC is a meme; I could've known

I would like to think the same, but the TV upscales 1080p releases and makes them look fantastic. Honestly, the difference I'm seeing between upscaled 1080p and 2160p is very minimal.

Get 1080p remuxes, they should upscale pretty well.

You won't "see" much of a difference between a 12GB H.264 file and anything else, because the H.264 file is already rather high bitrate.
Compare a 2GB H.264 movie to a 2GB H.265 movie and you will definitely see a difference.


You were just an idiot and downloaded a fuckhuge file for no reason. It's like people downloading FLAC 24-bit 120kHz rips of rap music.

Don't bother downloading anything over 8GB when it comes to pirated movies.

>Don't bother downloading anything over 8GB when it comes to pirated movies.

t. public tracker cuck

>takes 6 years to encode something that takes 30 seconds on x264

into the trash

>You were just an idiot and downloaded a fuckhuge file for no reason.

I probably was... Someone who calls himself Zeus on public trackers is uploading these 70GB HEVC files with 2ch audio, literally why?

About your TV playing the file while your PC maxes out its CPU: your TV probably has a hardware decoder for HEVC, while your computer only has one for H.264. That means all of the decoding on your computer has to be done on the CPU, unlike the TV, where it's all done on a custom ASIC made for decoding HEVC.

AFAIK the DRM on 4K Blu-rays hasn't been cracked yet, so you were most likely downloading a 1080p upscale or a fucking web transcode.

Just download HDB or AHD internals and you're set.

I tend to do this.

8gb for kino (more if you want dat 4k)
4gb for general viewing
2gb if i need it fast and dont really care.

Stay the fuck away from anything that is less than a gig per hour.


Pretty sure the 770 can do hardware-accelerated HEVC decoding... maybe not at 4K though.

>I don't think the GPU matters. But I was surprised to see that my PC couldn't handle the file
HEVC is pretty demanding at 4K; your TV has a fixed-function decoder dedicated to HEVC, which is why it's faster than your CPU.

>1080p x264 12GB release vs the 2160p x265 10-bit 70GB
2160p HEVC releases are either low-bitrate webrips or re-encoded screen captures, so they look like shite compared to a true UHD Blu-ray. Also, the jump from HD to UHD is not as visually great as SD to HD.

>x264 came out it made such a massive difference but with this x265
x264 is an encoder implementation of H.264 and x265 is an encoder implementation of HEVC. x264 is quality-wise much more mature than x265; the advantage of HEVC is better compression and smaller file sizes.

these are the only encode groups you should download

anything else is pure shit

The 960 was the first Nvidia card to have hwdec for HEVC.

>handjob
really? I thought HandBrake's deinterlacer was shit; guess it's OK if you're encoding progressive material

it's a minor step up from "1080p" YIFY releases

2 years later, is this list still up to date?

pastebin.com/sRpVCjCa

test123

roll

>70GB HEVC
That just means someone was retarded and didn't want to wait on their encode, so they told it to run the ultrafast preset or similar.

No way should an HEVC encode, even at 4K, be 70GB unless they did dick-all for compression.

they actually released it backwards for the GTX 750Ti.

The better question is: how fast do you watch flicks, movies, film and kino?

Flicks: 2.0x
Movies: 1.5x
Film: 1.0x
Kino: 0.5x

Those are hybrid decoders, still dependent on the CPU.

>AV1
literal meme tier
Google can't develop a good codec even if their lives depended on it

Nope
The GTX 750 SE, GTX 950, and GTX 960 all have the same hardware decode/encode block.

750Ti != 750 SE
The Maxwell v1 750Ti and other non-GM206 Maxwell v2 cards only have hybrid decoding

>750Ti != 750 SE
thank god the autismo saved me from making a serious mistake

My original reply, which was about the 750Ti, is true, yet you moronically claimed "nope".

m8

again, thank god we have you here

No one else could have waded through that conundrum and figured it all out without your being here to guide us.

You sound perturbed; has something good happened?

If you knew the 750 SE had hardware encoders and just didn't mention it because the 750Ti was specifically mentioned, then you're autistic.

You either didn't know about the 750 SE, or you did and specifically didn't mention it because you wanted to be contrarian.

You should rather ask about the point of 4K video, since the codec itself doesn't make it look any better.
HEVC allows making video ~30% smaller than H.264 at the same quality, but that completely loses its point at such a ridiculously huge bitrate. At this file size it would look stunning even encoded with DivX.
H.264 made a difference only because it was introduced together with the new HD resolutions. 720p looks way better than 480p, but the difference between 4K (2160p) and 1080p isn't as big. With HEVC I'm sure this movie would look about as good as an 8GB 1080p release.
BTW, did you download this release:
thepiratebay.org/torrent/15544777/Batman_v_Superman_Dawn_of_Justice_(2016)_2160p_4K_UltraHD_BluRay ?
Seriously, this is just a waste of bits.

Wew, spoken like a true butthurt. Simply mentioning the 750Ti didn't make me remember the 750 SE, which wasn't widely released.

>Batman
>Superman

>waste of bits
No kidding, LOL.

I even called it a re-released 750Ti, which is what it is; I just didn't call it the SE.

well meme'd young lad :^)

You said "released it backwards for", where "it" sounded like a decoder added with a driver update.

And yet you and I both know of the 750 SE with a real hardware encode/decode block for HEVC.

HEVC, when properly encoded, should produce smaller file sizes at the same quality.

Your TV has hardware decoding for HEVC; your PC does not.

8-bit vs 10-bit and HDR are marketing bullshit so you buy a new TV. I'm not wading through this thread.

Yeah, that was the one... I later downloaded a different version from Zeus plus some guy that put DTS-HD behind it. I usually stick to my private tracker, but similar versions at this file size would be a hit on my ratio.

I don't really mind the flop of a download; I've got amazing speed and enough storage. It's just the WHY: why bother releasing something like this? It's almost like he's trolling, but the sad truth is we all know he isn't.

roll

There actually is one possible reason to make such a humongous release: watching it on a standalone 4K player.
Achieving better compression requires more demanding techniques that are usually unsupported by embedded hardware decoders.

That ends up fucking those of us with 1440p and 4K monitors and CPUs that can do software decoding.

Guy who uploads HEVC encodes here.

>What exactly is the great benefit of playing a 70GB HEVC 10-bit file on a 10-bit TV if the difference from a standard x264 1080p file is minimal?

At that file size there really is none I guess.

>And if it's "highly efficient", why the fuck is it such a large file for such minimal gains?
You're not using either encoder correctly. You are supposed to use a CRF, not some arbitrary bitrate you pulled out of your ass.

Currently it kinda goes like this for a typical 720p two-hour movie using the fast preset:

28 CRF H264: ~800 MB
28 CRF HEVC: ~400 MB

22 CRF H264: ~1.5 GB
22 CRF HEVC: ~800 MB

16 CRF H264: ~3 GB
16 CRF HEVC: ~1.5 GB

As a sane human being, 28, 22, and 16 are the only reasonable CRF values you will ever use to encode video, with 28 being the visually worst and 16 the visually best. Any CRF lower than 16 is a placebo.
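For reference, CRF mode in both encoders can be driven through ffmpeg's libx264/libx265 wrappers. A minimal sketch of building such a command, assuming an ffmpeg build with both encoders (the filenames and values are illustrative, not from this thread):

```python
# Sketch: constant-rate-factor encode commands via ffmpeg's libx264/libx265
# wrappers. CRF targets a quality level; the encoder picks the bitrate.

def crf_encode_cmd(src, dst, encoder="libx265", crf=22, preset="fast"):
    """Return an ffmpeg argument list for a CRF-based encode."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", encoder,      # libx264 or libx265
        "-crf", str(crf),     # quality target, not an arbitrary bitrate
        "-preset", preset,    # speed/compression trade-off
        "-c:a", "copy",       # pass the audio through untouched
        dst,
    ]

cmd = crf_encode_cmd("movie.mkv", "movie.x265.mkv", crf=22)
print(" ".join(cmd))
```

Same CRF number, different codec: as the tiers above show, the HEVC file comes out roughly half the size of the H.264 one.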

Well, it can also be used as a source by those who make smaller releases. They also need to get a copy of the movie...

>encoding from an encode

Please don't do this :\

Always encode from source.

>encoding from an encode
>Please dont do this \:
>Always encode from source.
Blu-Rays are encodes, typically H.264 or VC-1, sometimes MPEG-2

It's not always feasible, and in the end most people don't give a rat's ass as long as the content doesn't look too bad and is a small file size.

Most of my content comes from 5-10 GB 720p rips that are already out there.

True, but when the encode is THAT big, it doesn't really make a difference. Especially if you're going to downscale it.

He probably means 2nd-gen encodes (Blu-ray transcodes).

It won't make MUCH difference, but encode from a remux, then encode from that encode, and compare the two results: there will be differences. Probably not noticeable without comparing frame by frame, but the quality is still not going to be as good.

wow that's a lot of shit waifus

Those are counted as source material by sane people.

>encode from a remux
= transcode (if blu-ray remux)

that is correct

>factually wrong
>sane

youtube.com/watch?v=fHa10EbyKTk

AYYMD IS FINISHED & BANKRUPT

AM1POORFAGS CONFIRMED ON SUICIDE WATCH

Here

Did you use Inkscape to make that?

draw.io/

At minimum it was posted back in August.

Name one good encode group that's common on public trackers.

Comparing CRF between two codecs is retarded. Besides, HEVC is still shit at encoding grain; even with --tune grain, H.264 is much better.

cool

Most HEVC encodes out there will be 8-bit ones, including mine, so not really. I'm pretty sure AMD can at least do GPU-assisted decoding of 8-bit HEVC by now.

Isn't there some new format that's not fucked by patents, though? Why would I want to use this over that...

You are absolutely right, but you know, at some point it becomes art for art's sake. When you want to make a low (but reasonable) bitrate release, transcoding from a decent source becomes a sufficient alternative. Sometimes it makes no difference at all, even when comparing frame by frame.

AV1 is backed by Google, Amazon, Netflix, Intel, Nvidia, AMD, Microsoft, Mozilla, Cisco, and others.

AV1 however isn't out yet. But since it's royalty-free, I'd expect it to be the primary codec for internet-served media for the next decade.

The biggest downside to AV1 is the lack of any decent encoders. This might change when the codec is finalized, but if VP9 encoding is anything to go by, it will be very resource-intensive compared to HEVC/H.264.

Roll

>Comparing CRF between two codecs is retarded. Besides, HEVC is still shit at encoding grain; even with --tune grain, H.264 is much better.
You haven't been paying attention to x265 updates.

See for yourself: builds.x265.eu

The x265 encoder, as of version 2.1, now has very similar visual quality at the same CRF as H.264. It didn't used to be like that. Encoding video with grain has also dramatically improved.

But don't take my word for it; go ahead and replace the stock x265 encoder in MeGUI with the latest one and see for yourself.

OP here. Thanks for your explanation, tripbuddy.

Wrong, most HEVC encodes out there are Main10 because that's what Ultra HD Blu-ray uses:

en.wikipedia.org/wiki/Ultra_HD_Blu-ray

AYYMD can't do hybrid decoding for HEVC Main10; it's basically unable to handle it because it's extremely demanding. No CPUs can handle HEVC Main10 at high bitrates.

youtube.com/watch?v=cI-a4WZWwZc

So yeah, educate yourself and stop lying for AYYMD shit

version 2.2 just recently released

>No CPUs can handle HEVC Main10 at high bitrates.
lol uhh?

What do you consider high bitrates?

I am watching a UHD HEVC 140Mbps encode (the UHD spec allows for 125Mbps, so 140Mbps is a safe burst-bitrate test file) and my CPU use is at 60-75%. It's certainly not great considering it's a single video file being played back, but there is 0% load on my GPU video engine, so it's doing pure software decoding.

Who cares? I'll just watch the yiffy release; there's literally no difference in quality. It all looks the same!

>I am watching a UHD HEVC 140Mbps encode
Why would you even have something like that?
Unless you're using it for benchmarking, then it's ok.

>Wrong, most HEVC encodes out there are Main10 because thats what Ultra HD Blu-ray uses
Nope, I've been doing encodes for a while now and I can guarantee you that most are still using 8-bit H.264, and some are using 10-bit H.264 and 8-bit HEVC. It is very rare to see 10-bit HEVC encodes.


>en.wikipedia.org/wiki/Ultra_HD_Blu-ray
>AYYMD can't do hybrid decoding for HEVC Main10, basically unable to handle it because it's extremely demanding.
I never denied that, but 10-bit HEVC won't be a thing for a while, mostly due to smartphones not being able to decode it.

>No CPUs can handle HEVC Main10 at high bitrates.
Depends; an 8-core FX from AMD will be able to handle 100Mbps of 10-bit 4K HEVC video just fine.

>So yeah, educate yourself and stop lying for AYYMD shit
You honestly sound like the shill here.

It's just a 30-second test file; I have 140Mbps, 160Mbps, 180Mbps, 200Mbps, 250Mbps, and 400Mbps versions.

My CPU can handle anything up to 160Mbps without stuttering, 180Mbps if I stop background tasks from using CPU resources.

200Mbps+ requires GPU decoding.

What about my buddies SPARKS? They usually get fast releases and good quality.

A decent scene encoder, probably the best you can commonly find on a public tracker.

See if you can play this without dropping frames.

demo-uhd3d.com/fiche.php?cat=uhd&id=145

Especially if you use madVR as the renderer. I can't, using a 4770K at 4.2GHz and an RX 480 at 1400MHz.

It's under 70Mbps; my CPU can decode up to 160Mbps no problem, 180Mbps if I stop background tasks.

With MadVR set to NNEDI?

> One of the files being something like 70gb
Sounds like someone, somewhere, fucked up.
Like giving the transcode a higher bitrate than the original.

The reason x265 is being pushed is that it's pretty much required if you want 4K content.
>The video sequences were encoded using the HM-12.1 HEVC encoder and the JM-18.5 H.264/MPEG-4 AVC encoder. The comparison used a range of resolutions and the average bit rate reduction for HEVC was 59%. The average bit rate reduction for HEVC was 52% for 480p, 56% for 720p, 62% for 1080p, and 64% for 4K UHD

HEVC adoption is slow, but steady.
I know the 10XX series from Nvidia supports it, and Kaby Lake is supposed to support it as well.
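The bitrate-reduction figures quoted above make for a quick sanity check: at equal subjective quality, HEVC needs roughly (1 - reduction) of the AVC bitrate. A minimal sketch, with the example AVC bitrate made up for illustration:

```python
# Per-resolution average bitrate reductions for HEVC vs AVC, as quoted
# from the HM-12.1 vs JM-18.5 comparison above.
reduction = {"480p": 0.52, "720p": 0.56, "1080p": 0.62, "2160p": 0.64}

def hevc_equiv_mbps(avc_mbps, resolution):
    """Approximate HEVC bitrate matching a given AVC bitrate's quality."""
    return avc_mbps * (1 - reduction[resolution])

# e.g. a ~15 Mbps 1080p AVC stream would need only ~5.7 Mbps in HEVC
print(round(hevc_equiv_mbps(15, "1080p"), 1))
```

Which is exactly why a 70GB HEVC file is backwards: the codec's whole selling point is needing fewer bits, not more.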


>So what exactly is the great benefit of playing a 70GB HEVC 10-bit file on a 10-bit TV? And if it's "highly efficient", why the fuck is it such a large file for such minimal gains?

Don't put the blame on HEVC when it's really the fault of a shitty transcoder that doesn't know what they're doing.
Like, how the fuck do you create a 70GB file when most retail BD discs are only 50GB???
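A back-of-the-envelope check makes the point. Assuming decimal gigabytes and a ~150 minute runtime (both assumptions for illustration), the 70GB release averages over 60 Mbps combined, more than a full 50GB disc could sustain over the same runtime:

```python
# Back-of-the-envelope: average combined bitrate (Mbps) of a release,
# assuming decimal gigabytes and an illustrative ~150 minute runtime.
def avg_mbps(size_gb, runtime_min):
    bits = size_gb * 1000**3 * 8       # file size in bits
    return bits / (runtime_min * 60) / 1e6

print(round(avg_mbps(70, 150)))        # the 70GB release: ~62 Mbps average
print(round(avg_mbps(50, 150)))        # a full BD-50: ~44 Mbps average
```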

No, because you'd be goddamn retarded to use those settings on a UHD high-bitrate file.

Image doubling and chroma upscaling are only going to be useful image-quality enhancements for standard def, and to some extent 720p.

Don't be retarded.

>Making a FREE codec without any royalties
>Google

>vs literal patent trolls trying to rip off the entire industry.

Gee, it's almost like you're an MPEG shill.