>>57633406

>4k meme
or you could just pirate it

Other urls found in this thread:

jell.yfish.us/
microsoft.com/playready/
en.wikipedia.org/wiki/Nvidia_PureVideo#The_seventh_generation_PureVideo_HD
blogs.windows.com/windowsexperience/
geforce.com/whats-new/articles/pascal-video-playback

yes

they're asking for piracy

Sure, but most people pay for Netflix anyway. It would be nice if I could watch 4K content with my GTX 1080 and 6700K, since I don't plan on upgrading anytime soon.

Now I'm forced to pirate.

So glad I dropped netflix a few months back. They have become pretty bad.

>spoof user agent as edge
das it mayne?
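
For what it's worth, spoofing the UA from a userscript is trivial; a minimal sketch (the Edge UA string is illustrative, and this only changes what page scripts see, not the User-Agent header the server receives, let alone any hardware DRM check):

// Override what page JavaScript reads from navigator.userAgent. This does
// NOT change the HTTP User-Agent header, and it cannot pass a DRM check.
const edgeUA =
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 " +
  "(KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36 Edge/15.15063";

Object.defineProperty(navigator, "userAgent", {
  get: () => edgeUA,
  configurable: true,
});

console.log(navigator.userAgent); // now reports Edge to page scripts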

The Edge browser isn't the issue, anon; it's the 7th-generation CPU requirement.

You're telling me that even though my GTX 1080 and i7-6700K play Netflix just fine, I have to upgrade to an i7-7700K to watch Netflix in 4K.

It's bullshit

Ugh, I reactivated my year-old Netflix subscription a few months ago to watch Stranger Things.

I went all in since I had a 4K monitor by then, and shelled out for the highest plan, not even thinking there would be anything preventing me from watching 4K.

But no: you pay for the highest plan, but no 4K for you. Not even 1080p. It was limited to 720p on the PC, despite my 150 Mbps connection, Skylake i5, and GTX 1070.

Go suck a dick, Netflix.

>But no: you pay for the highest plan, but no 4K for you. Not even 1080p. It was limited to 720p on the PC, despite my 150 Mbps connection, Skylake i5, and GTX 1070.
With Windows 8 or Windows 10 you can get 1080p on the PC.

However, 4K requires Kaby Lake, which is what this thread is about.

They'll probably update to support newer Nvidia GPUs for 4K eventually, since those support HDCP 2.2 and have compatible hardware decoders, which was the hold-up publishers had with 4K on PC anyway.

They might, but officially right now they are not on the supported device list.

Everyone complaining about 4K not being available is a technological Luddite who thinks his rig is fit enough. No, it's not, and the video engineers at Netflix are telling you why. Torrent monkeys are not engineers; they encode at the worst quality with the least compression and no 10-bit support.

(you)

>No it's not
You really believe a GTX 1070 or GTX 1080 can't output high-quality 4K, BUT a Kaby Lake iGPU can?

Kill yourself.

There is literally no excuse not to have 1080p on Linux. That's what made me drop my subscription.

Not my problem Nvidia didn't build 4K support into their newest line of deprecated GPUs.

>You MUST have a Kaby Lake CPU
Pretty sure it's not about performance at all, but about some cancerous form of DRM that requires hardware support to make sure you can't record their shit from the screen. That's why it's Edge-only too; other browsers probably don't have the DRM cancer required.

I have a 4790K and a 4K monitor; if this worked, I most likely would've at least given it a try. But it doesn't. They don't want my money because they think I'll rip their epic 4K shows.

It has nothing to do with the GPU cores; it's about the onboard hardware video decoder. He's still wrong, though, because the GTX 960 and newer do have a capable hardware decoder.

>a browser gets access to my system specs and restricts content if i'm not using Recommended And Approved Hardware (TM)

They did, that's my fucking point: the hardware decoders do the SAME 10-bit HEVC decoding as the Kaby Lake iGPU.

>nothing to do with the GPU cores, it's about the hardware video decoder onboard
I hope you realize they're basically one and the same; the video decode block is located on the GPU die.

>can still play 4k with your shitty smart tv netflix app
Netflix 4K is still bit-starved trash anyway; you're better off waiting for the 1080p season Blu-rays, which have way higher quality because they aren't bit-starved to death.

Of course, but my point was that starting with the 960, the decoder no longer uses general compute cores at all and is fully dedicated, meaning the specific GPU model is irrelevant as long as it has the same decoder.
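
If you want to see whether your own browser agrees that the dedicated decoder will be used, newer browsers expose the Media Capabilities API; a powerEfficient result generally means the fixed-function decode block rather than CPU or shader decode. A sketch to paste into the devtools console (the HEVC Main 10 codec string is my assumption):

// Ask the browser whether 4K 10-bit HEVC decode would be supported, smooth,
// and power-efficient (i.e., handled by the hardware decode block).
const info = await navigator.mediaCapabilities.decodingInfo({
  type: "media-source",
  video: {
    contentType: 'video/mp4; codecs="hev1.2.4.L153.B0"', // HEVC Main 10, assumed string
    width: 3840,
    height: 2160,
    bitrate: 16_000_000, // ~16 Mbps, roughly Netflix 4K territory
    framerate: 24,
  },
});
console.log(info); // { supported, smooth, powerEfficient }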

>bit-starved HEVC trash that is worse than a regular 1080p Blu-ray and can be played fine by a $400 Chinese TV is somehow too powerful for my """rig"""

call me when they offer shit >10 Mbps

jell.yfish.us/

No 4K Roku?

Fug

So basically the future is going to be locked down. Everyone will bend over for Intel and their CPUs, which will be able to block content you don't own and observe and report everything you do. Damn.

THIS is why I illegally download things.
I couldn't even get the Amazon player to stream 1080p for The Grand Tour.

>the decoder no longer uses general compute cores at all and is fully dedicated
This has been true for H.264 for almost a decade. Pretty sure the HD 4xxx series from AMD had dedicated decode hardware blocks.

This is the DRM being used: microsoft.com/playready/ which indeed requires Edge + Kaby Lake.
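
You can probe what a browser will actually grant through the EME API. A sketch, using the "com.microsoft.playready" key system and the "3000" (hardware, SL3000) robustness level; whether Netflix requests exactly this configuration is my assumption, and only Edge implements this key system at all:

// Ask for hardware-backed PlayReady. A rejection means the browser/OS/
// hardware chain won't grant the robustness level that 4K reportedly needs.
const playreadyConfig: MediaKeySystemConfiguration[] = [{
  initDataTypes: ["cenc"],
  videoCapabilities: [{
    contentType: 'video/mp4; codecs="hev1.2.4.L153.B0"', // 10-bit HEVC, assumed string
    robustness: "3000", // hardware security level; "2000" is software PlayReady
  }],
}];

navigator
  .requestMediaKeySystemAccess("com.microsoft.playready", playreadyConfig)
  .then(() => console.log("hardware PlayReady granted"))
  .catch(() => console.log("no hardware PlayReady on this stack"));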

Except the 10xx series GPUs from Nvidia ARE PlayReady 3.0 certified devices.

>People who pay cannot watch 4k
>People who don't pay can

wew lad

I was talking about 10-bit HEVC specifically there, but yes, full hardware H.264 support goes back even further, to the HD 3000 series.

>acquire half a dozen stolen netflix accounts owned by normie idiots just for shits and giggles
>can't even play 1080p without using edge or getting a smart tv
>go back to downloading shit from BTN

>HEVC 10-bit specifically there
If you were, you were mistaken. The GTX 960 is only 8-bit HEVC decode/encode.

10xx series is 10-bit HEVC.

>>can't even play 1080p without using edge or getting a smart tv
actually the windows netflix app also gives you 1080p

isn't that only for windows 10? that's worse than edge

WHERE WERE YOU when you realized piracy is the only option?

W10 or 8

Not true, the 960 and 950 use newer hardware than the rest of the Maxwell GPUs.

en.wikipedia.org/wiki/Nvidia_PureVideo#The_seventh_generation_PureVideo_HD

Lorelei is a slut.

It's likely drm bullshit.

As an average user, there is no way in hell I would upgrade my CPU and switch my browser just to watch 4K Netflix.

There are too many options out there for me to bend over backwards like that.

>blogs.windows.com/windowsexperience/
>window sex

Total bullshit: an Intel Atom or Kaby Lake chip with an onboard GPU can play 4K, but my Skylake i7 + Nvidia GTX 1080 can't? That's marketing bullshit.

>Lorelei is a slut
facts tho

...

>Kaby Lake gets bigger upgrades, such as hardware support for encoding and decoding 10-bit 4K HEVC video codecs as well as 4K VP9.


That's all it really comes down to; there is no conspiracy here.

Just try to watch a 4K HEVC video without hardware decode and watch your CPU catch fire or the video stutter.

As for the Edge requirement, MS probably spent the resources, hence 4K Netflix coming to Edge first. Eventually this support will make its way to other browsers, when Firefox/Chrome/whoever decides to spend the resources and make it work there too.
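
If you want to test that claim yourself, a rough sketch in Node-flavored TypeScript that shells out to ffmpeg's benchmark mode (the filename is a placeholder for one of the jell.yfish.us samples linked above; -benchmark and "-f null -" are standard ffmpeg options):

// Time a pure software decode of a 4K HEVC sample with null output.
// If the reported speed stays below 1x realtime, playback will stutter.
import { spawnSync } from "node:child_process";

const result = spawnSync(
  "ffmpeg",
  ["-benchmark", "-i", "jellyfish-120-mbps-4k-uhd-hevc-10bit.mkv", "-f", "null", "-"],
  { stdio: "inherit" }, // pass ffmpeg's progress/benchmark output through
);
process.exit(result.status ?? 1);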

based Luke

This has NOTHING to do with that; the GTX 1060/1070/1080 all support 10-bit 4K HEVC and 4K VP9 decode.

Pair a GTX 1070 with a Skylake i5, though, and there's still no 4K Netflix for you.

He's too pure. That whore doesn't deserve him.

Worst thing is that there are probably people who find this lack of freedom acceptable.
Guess I'll keep downloading my shit; no restrictions for me.

>140 Mbps HEVC is unplayable
>140 Mbps x264 is smooth
>20 MB size difference

What did they mean by this?

The fucking Xbox One S streams Netflix in 4K, my PC should fly through this shit.

Why can't the movie industry come up with something like Steam or Spotify?

I'm not sacrificing quality with either; both are DRM'd, but not to the point where they lower my enjoyment of the music or the games. With both I can watch or listen offline just fine, I can transfer things between devices, and they support such an abundance of devices that you'd have to go looking for one that DOESN'T support them rather than one that does.

All the movie industry came up with is pieces of trash that are only supported on special-snowflake smart TVs, require an always-on connection, cost significantly more than just buying physical discs, and interfere with the viewing experience in many ways. They don't offer even 25% of the bitrate that modern devices could already easily handle. Want to watch it in bed on your laptop? Too fucking bad. Want to watch it on your desktop PC, or on your phone on the plane? WELL TOO FUCKING BAD, YOU BETTER TAKE YOUR 50" SMART TV ON THAT PLANE.

So much this. The movie industry just has its head so far up its own ass that they don't realize that if they made paying for their content more convenient than pirating it, people would pay.

...

>use stolen netflix account
>use btn
wow :^)

geforce.com/whats-new/articles/pascal-video-playback

All Pascal GPUs are PlayReady 3.0 DRM-certified devices; MS & Netflix selectively enabling 4K only for Kaby Lake is BULLSHIT.

It's just the latest effort to force people into upgrading to a new Windows 10 machine. It might suck, but the public let them get away with creating a literal botnet in Windows 10, so now they don't need to show any restraint.

Why do these fucks hide the real reason for the shit they do behind bullshit phrases like "meeting standards for premium content playback"? There is not a single mention of DRM or content protection in the article, which is the fucking thing PlayReady has always been about since its inception.

btw the CPU requirement is likely because of Intel SGX

DRM is a bad word now, one that the mass consumer market recognizes as something they don't want.

"PlayReady" is just a rebranding of DRM that most people won't start to recognize for at least a few months.

The average Netflix subscriber doesn't give a fuck about DRM, because he's okay with not owning content and understands it's a rental model; many people who pay for things say they're okay with DRM as long as it doesn't inconvenience them.

DRM affects consumers who buy digital downloads and discs, and they're slowly becoming a minority because owning stuff is not cool these days.

>SGX
SGX was introduced with Skylake, so that still doesn't explain why a GTX 1070 plus a Skylake CPU can't play it back.

Yeah, that's true.

But to some degree, it doesn't even matter, because DRM is a negative buzzword now, so they're not going to put it right in the advertising.

Netflix's 4K is bullshit anyway, since the bitrate is only slightly better than YouTube's. I'd rather just pirate Blu-ray ISO images and get a true 4K picture.

Fuck streaming content

sounds like scripting bullshit

There's no way they'll figure out a way to block you from watching Netflix in 4K just because it magically thinks it requires Windows 10 to run.

Dumb, ignorant poster.

It has nothing to do with that if they're hardware-locking it to 7xxx-series Kaby Lake CPUs.

Kaby Lake probably has some bullshit DRM in it to protect their IP. I can think of no reason other than that they are plebs. I bet it's that.

Then maybe it has to do with Intel's part of the protected video path (since all hardware components involved in getting video to the output have to be "secure"). Anyway, this new protection is the fucking nuclear option, and I doubt we're going to see a universal crack for a few years at least; all pirated 4K content is going to be HDMI captures, and I'm not sure how long until a new HDCP revision that fixes the current loophole comes along.

I've heard talk they won't allow UHD Blu-ray playback on any computer without a TPM 2.0 chip installed, so the AACS 2.0 encryption key can be kept in a secure, encrypted environment where it can't be extracted.

The CPU (or the GPU, or both) has a secure TPM-like block that stores the device's private key, which is impossible for a mere mortal to extract. You have to remotely attest to the streaming server before it allows you to access the stream. It has to be a "trusted" key, and they probably have the ability to verify authenticity (so you can't just generate a bullshit one and emulate the process) or even ban devices.
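
A conceptual sketch of that challenge/response flow, in Node-flavored TypeScript. Every name here is hypothetical and real PlayReady 3.0 provisioning is far more involved; the point is just why a key you can't extract, plus server-side verification, shuts out emulation.

// All identifiers below are made up for illustration; only node:crypto is real.
import { generateKeyPairSync, createSign, createVerify, randomBytes } from "node:crypto";

// Stand-in for the key pair burned into the CPU/GPU at manufacture time.
const device = generateKeyPairSync("ec", { namedCurve: "P-256" });

// Server side: issue a fresh challenge so responses can't be replayed.
const challenge = randomBytes(32);

// Device side: prove possession of the embedded key by signing the challenge.
const signer = createSign("SHA256");
signer.update(challenge);
const signature = signer.sign(device.privateKey);

// Server side: verify against the device's certified public key, which the
// licensing service learned from the manufacturer (and could blacklist).
const verifier = createVerify("SHA256");
verifier.update(challenge);
const attested = verifier.verify(device.publicKey, signature);
console.log(attested ? "device attested, license can be delivered" : "rejected");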

Older CPUs don't have the hardware DRM :^)

So they won't be able to play the content due to "compatibility" reasons

Someone should specify which DRM, specifically, Skylake and older CPUs don't have. As far as I'm aware, Skylake has the same DRM as Kaby Lake apart from the iGPU having HDCP 2.2, which wouldn't matter if you're using a dGPU like the GTX 1070 anyway.

Then just spoof your hardware certs, mate.
You can literally do it with a browser plugin.