HDMI 2.1 announced, supports 8K and 4K 120Hz

Will this kill DisplayPort?

hdmi.org/manufacturer/hdmi_2_1/index.aspx

No.

Maybe

isn't that exactly what DisplayPort supports

Didn't know DisplayPort was ever alive

So when are we gonna see the new 5,000 USD HDMI 2.1 cable?

HDMI 2.1 supports 8K 60Hz, 48 Gbps

DisplayPort 1.4 supports 8K 30Hz, 32.4 Gbps

Roughly 50% more bandwidth with HDMI 2.1
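
Napkin math for where those numbers come from, ignoring blanking intervals and link-layer encoding overhead (so real requirements run a bit higher) and assuming plain 8-bit RGB; the little gbps() helper and the example modes are just for illustration:

# Rough uncompressed video bandwidth: pixels x refresh x bits per pixel.
# Ignores blanking and link encoding, so real requirements are somewhat higher.
def gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"8K60  8-bit RGB: {gbps(7680, 4320, 60):.1f} Gbps")   # ~47.8 Gbps
print(f"8K30  8-bit RGB: {gbps(7680, 4320, 30):.1f} Gbps")   # ~23.9 Gbps
print(f"4K120 8-bit RGB: {gbps(3840, 2160, 120):.1f} Gbps")  # ~23.9 Gbps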

DisplayPort already supports 32 Gbps and HDMI was previously at 18 Gbps; this brings HDMI up to 48 Gbps, which surpasses DisplayPort. But HDMI wasn't "dead" at 18 Gbps even though DP had 32 Gbps, so I doubt DP will die while HDMI has 48 Gbps. Just wait until DP 1.5, where it gets doubled to 64 Gbps.

What does the bandwidth mean, beyond what's reflected in the max resolution/refresh rate?

if it can't do 4K 144hz I don't buy it

Bandwidth is the ability to carry more data between the two devices.

High bandwidth = higher image clarity and smoothness

Why can't they make a standard that would last for 20 years, like supporting a 3,000 Gbps data rate?

Is that what it means?
If a card can display 3440x2160 of pixels 120 times a second, what does a few extra Gbps do for the image quality?

analogue master race

DVI master race

8K60hz / 8K30hz != 48Gb / 32.4Gb
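
In other words: doubling the refresh rate doubles the pixel data, but the quoted link rates only differ by about 48%, so the two ratios don't line up. Quick check:

print(60 / 30)     # 2.0   -> 8K60 carries twice the pixel data of 8K30
print(48 / 32.4)   # ~1.48 -> but the quoted link rate is only ~48% higher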

I find your post unsettling.

DP 1.4 can. It can even do 120Hz 5K.

To me display port was always dead. I don't even remember seeing a male connector IRL.

The monitor's refresh rate and pixel response times actually determine the clarity; the smoothness is a factor of frame pacing, which is handled on your GPU

I thought it was just 4K 120hz?
Isn't that why the 4K 144hz monitor has two DP ports?

mfw most TVs still don't have DisplayPort

Why? Is it the Jews?

I know it costs a bit more to add another connector, but still...

Are you dumb?

In any comparison between X and Y, you try to eliminate all other variables. This means, if all things are equal, with the exception of HDMI 2.1 and DisplayPort 1.4, then blah blah blah.

If you wish to nitpick about different monitors or different GPUs or a different sun or the position of the moon, that's your discretion. However, that will not affect the absolute difference.

> Having both DisplayPort and HDMI and additionally having multiple versions of them in different sizes is just a huge scam so we have to buy countless adapters/cables for our graphics cards, laptops, projectors, screens and TVs.
We are victims.

How is that dumb, if I may ask?

Trying to keep TVs and monitors separate. Imagine what could happen if normies found out they are basically the same!

May allow it to do the same resolution/frame rate with higher bitrate

I still don't get how 4K@120hz + data is better than 4K@120hz
Is there some metadata that gets analyzed by the monitor?
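
One way to read it (my reading, not gospel): at a fixed 4K 120Hz, extra link bandwidth can go into deeper colour (10/12-bit for HDR) or full 4:4:4 chroma instead of subsampled 4:2:0, rather than into more pixels. Rough numbers, blanking and encoding ignored, helper just for illustration:

# Same 4K @ 120 Hz, different per-pixel payloads.
# Chroma subsampling keeps full luma but reduces colour resolution:
# 4:4:4 = 3 samples/pixel, 4:2:2 = 2, 4:2:0 = 1.5 on average.
def gbps(width, height, hz, bits_per_sample, samples_per_pixel):
    return width * height * hz * bits_per_sample * samples_per_pixel / 1e9

print(gbps(3840, 2160, 120, 8, 1.5))   # ~11.9 Gbps  8-bit 4:2:0
print(gbps(3840, 2160, 120, 8, 3))     # ~23.9 Gbps  8-bit 4:4:4 (RGB)
print(gbps(3840, 2160, 120, 10, 3))    # ~29.9 Gbps  10-bit 4:4:4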

But they are not:
Everything sold as a TV is generally too blurry to be used as a PC monitor, and everything sold as a PC monitor is generally too small to be used as a TV.

But doesn't the GPU handle that? Are there different codecs being received by the monitor? I would've guessed that raw DisplayPort output wouldn't have any extra data past the pixels but I don't know shit

Not to be a dick, but... Wasn't display port always dead?

>8k is actually much closer to 7.5k
fucking marketers

It's always been this way. 1920x1080 is 2K but it's not actually 2000 pixels.

Who the fuck called it 2k though?

>1920x1080 = "1K"
A "K" is not a unit of pixels, it's an advertisement

Newegg

2K was never supposed to mean 2000 pixels, dumbass.

That's literally the point I'm making.

But your entire post is still wrong. 1080p is 1K. 4K = 1080p x 4.

I meant SI prefix (not unit). And it stands for kilo, which stands for 1000.

Ok my bad

4K doesn't mean 4 Kilopixels
It doesn't mean 4 Kilo-horizontal-pixels
It's a marketing term so dumb normies could easily differentiate 16:9 resolutions without remembering 8 whole digits

That's not how it works, lol.

Oh my god wow I never knew that everything suddenly makes sense why didn't anybody mention that wow 1080 vertical pixels means 1000, and 2160 vertical pixels means 4000, it all makes sense now

That's how marketing works

...

>4K = 1080p x 4.

1920x1080 x 2 (per axis) = 3840x2160
1920x1080 x 4 (per axis) = 8K

>TV marketing K must mean kilo because I learned that in school!
>7680 = 8000
>3440 = 4000
>1920 = 2000
>1280 = "1K 720p"
It's all just marketing because resolutions are big scary numbers

Are you fucking retarded or did you drop out or something?

...

so 8K should be called 16K since it's 4K times 4 ?

retard

So a 1K monitor is 720p?

It is marketing, and it means kilo, just as a kilobyte is often (and wrongly) said to be 1024 bytes.
Whatever definition suits marketers best is used: either the actual kilo (1000) when the convention would mean more, or the convention (whatever would round to 1000) when the actual kilo would mean more.

>1920x1080 is 2K
Only in marketing meme land.

It's not referring to kilo or any bastardization of kilo

See

This
2K, 4K and 8K are already coined resolutions, and none of them are 16:9 1080p or 2160p

(You)

are you actually retarded?

>1080p is 1K

but that's wrong you retard

>in the order of 8000
So not 8000, but close to it.

Nobody calls 1920x1080 2K. It's called 1080p.
Autists ITT hear the new marketing terms 4K and 8K and try to apply the same naming convention to the previous resolutions, when it's not even the same convention.

...

A kilobyte is not 1000 bytes, but close to it.

A kilobyte is actually 1000 bytes
A kibibyte is not actually 1000 bytes, but close to it
Like I said, it's marketing semantics that you shouldn't concern yourself with

windows says otherwise

Well Windows is clearly not referring to kilo-, just like 4K isn't

Windows is wrong and having properly labeled GiB is one of the best reasons to use a GNU + Linux desktop operating system

...

Gee it was a marketing gimmick all along

who knew!

Yes it is, you retard. Do the math. 4K has EXACTLY 4 times the amount of pixels 1080p does.

Not even close.

DisplayPort is a superior interface made for computer monitors

while HDMI is an inferior interface geared towards A/V equipment.

Why?

>8K
Literally why

>HDMI 2.1 supports 8K 60Hz, 48 Gbps

Is this before or after PHY-level coding? (8b10b, 64b66b, etc.)
Do the quoted resolution modes require bullshit chroma subsampling?
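
For what it's worth, the headline figures are raw link rates before line coding. With the commonly cited schemes (8b/10b for TMDS-era HDMI and DP 1.4, 16b/18b for HDMI 2.1 FRL), the usable payload works out roughly like this (a sketch, exact figures approximate):

# Headline link rates are raw; line coding reduces the usable payload.
links = {
    "HDMI 2.0 (TMDS)": (18.0,  8 / 10),   # 8b/10b coding
    "DP 1.4 (HBR3)":   (32.4,  8 / 10),   # 8b/10b coding
    "HDMI 2.1 (FRL)":  (48.0, 16 / 18),   # 16b/18b coding
}
for name, (raw, eff) in links.items():
    print(f"{name}: {raw} Gbps raw -> {raw * eff:.1f} Gbps payload")
# HDMI 2.0: 14.4, DP 1.4: 25.9, HDMI 2.1: ~42.7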

1920 x 1080 = 2073600
3840 x 2160 = 8294400
8294400 / 4 = 2073600

Again, 4K is 1080p x 4. This is fucking elementary math. Try not to overthink it.
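
Spelling the same arithmetic out for the other 16:9 resolutions in the thread (consumer resolutions, not the DCI 2048/4096 ones):

# Pixel counts relative to 1920x1080.
base = 1920 * 1080
for name, (w, h) in {
    "720p":   (1280, 720),
    "1080p":  (1920, 1080),
    "1440p":  (2560, 1440),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
}.items():
    print(f"{name:7s} {w}x{h} = {w * h:>10,} px  ({w * h / base:.2f}x 1080p)")
# 720p 0.44x, 1440p 1.78x, 4K UHD 4.00x, 8K UHD 16.00x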

Not him but different use cases require different optimizations.

A/V requires longer cables, more audio channels, etc.
DisplayPort requires higher refresh rates, a wider variety of resolutions, etc.

It does make sense to have two standards instead of just one.
And it really triggers me that more people use HDMI than DP for computer purposes now.

>DisplayPort requires

Meant to say: computers require

So with "8K" they finally dropped the autistic "K = 1024" meme?

This thread is hilarious.

It never was 1024.

2K = 2 x 1024 = 2048
4K = 4 x 1024 = 4096

Now look at the picture again.

Are you stupid? K is the 1080p multiplier.
1K = 1080p
2K = 1440p (2 times as many pixels as the 1K)
4K = 2160p (4 times as many pixels as 1K)
etc.

Then 8k should be 16k.

Marketers literally played themselves this time.

Except certain people (not me) freak out when you call 2160p "4K".
They insist you call it "UHD", and only use "4K" for resolutions that are 4096 pixels wide.

So I was wondering if with "8K" this distinction is finally gone.
But after following that anon's recommendation I guess there will be both an "8K UHD" that is 7680 wide and an "8K" that is 8192 wide.

They will probably call an ultrawide 4K an ultrashort 8K

No argument there.

Also, "2K" actually has 1.777... times as many pixels as 1080p (1.333...^2). For it to have twice as many each dimension would have to be root 2 times the size of 1080p, which would result in a very strange resolution.

No, because: no DRM, it's not even out, DP can do 4K 144Hz, and why 8K?

1440p masterrace coming through

Fucking marketing terms.
They call UHD 4K when it's not.

All I want is 240hz.

I don't care if there's no way to distinguish it, I just want to see it.

I've always wondered: do images sent through a display cable actually use 32 bits per pixel?

Since the alpha byte isn't needed (as the image is already processed), wouldn't it make more sense to just use 24 bits per pixel?
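
As far as I know that's exactly what happens: the link carries RGB (or YCbCr) with no alpha, so standard 8-bit colour goes out as 24 bits per pixel, and deep-colour modes use 30/36 bpp. The difference matters for bandwidth; quick example at 4K 60Hz, blanking ignored:

# 24 bpp (8-bit RGB, no alpha on the wire) vs a hypothetical 32 bpp with a
# wasted alpha byte, plus 30 bpp (10-bit deep colour), at 4K 60 Hz.
pixels_per_sec = 3840 * 2160 * 60
print(pixels_per_sec * 24 / 1e9)   # ~11.9 Gbps
print(pixels_per_sec * 32 / 1e9)   # ~15.9 Gbps if an alpha byte were sent
print(pixels_per_sec * 30 / 1e9)   # ~14.9 Gbps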

BenQ makes one

Feels comfy

>acer
>not westinghouse
>not viewsonic

I'm excited for TVs to finally support high frame rates and adaptive sync.

>not 60hz