Will we ever see 16K or 32K resolution TVs?


Other urls found in this thread:

twitter.com/TimothyLottes/status/935704999912443904
twitter.com/idSoftwareTiago/status/937776357764976642
displayhdr.org/performance-criteria/

After 2040, maybe, for 16K.
32K only as some monumental luxury display.

>16K (15360x8640 pixels)
>mfw my monitor is 1366x768

TVs? Nope. Monitors? Not a chance.

Fucking smartphone? Absolutely.

we must abolish pixels and use vectors instead

that would be great to reduce screen door effect in VR though

except VR is a meme

it'd be way too power hungry

Until I can jack it directly into my brain/nervous system fuck """VR"""

who let the brainlets out?

It's a bit of a toss-up.


For things like monitors and handheld devices there doesn't seem to be any benefit to going that high once you achieve the 'retina' effect of high PPI.

But since TVs tend to be huge as shit these days, they can benefit from 8K and beyond resolutions.


I think we will ultimately see 8K emerge in a few years time and maybe in 10-15 years it will be the standard for TVs

>use vectors instead
please explain what you mean by this to a brainlet such as myself

Not him, but vectors are always perfect lines, as opposed to finite pixels, which at some point you can see, turning diagonal lines into "saws" or "ladders".
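A toy sketch of that "ladder" effect: a shallow ideal line snapped to a coarse pixel grid (the endpoints and grid size are made up for illustration, and real rasterizers use Bresenham or coverage-based antialiasing instead of this crude walk):

```python
# Snap an ideal line onto a coarse pixel grid with a crude sampling walk.
def rasterize_line(x0, y0, x1, y1, samples=16):
    pixels = []
    for i in range(samples + 1):
        t = i / samples
        cell = (round(x0 + t * (x1 - x0)), round(y0 + t * (y1 - y0)))
        if cell not in pixels:
            pixels.append(cell)
    return pixels

# A shallow line from (0,0) to (7,3): the occupied cells form a staircase,
# while the vector description "line from (0,0) to (7,3)" has no artifact
# until it is sampled.
cells = rasterize_line(0, 0, 7, 3)
for y in range(3, -1, -1):
    print("".join("#" if (x, y) in cells else "." for x in range(8)))
```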

>16K or 32K

Well, high-end TVs aren't going to just stagnate at 8K for the rest of time. That number will either go up, or display technology will be fundamentally changed in some way where it's no longer measured in distinct resolutions. Since your eyes are only focused on one part of a display at a time, I could see some sort of adaptive resolution technology catching on at some point, where the resolution is high at the point of focus and drops off toward the periphery, accomplished through specialized headwear, contacts, or optical implants or something.
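That adaptive-resolution idea looks something like this as a toy falloff curve. The curve shape and constants are invented for illustration; real foveated rendering in eye-tracked headsets tunes this empirically:

```python
# Toy foveated-rendering falloff: full resolution at the gaze point,
# degrading toward the periphery. Numbers are made up, not from any
# shipping headset.
def local_resolution_scale(eccentricity_deg, fovea_deg=5.0, floor=0.1):
    """Fraction of full resolution to render at a given angular
    distance from the point of focus."""
    if eccentricity_deg <= fovea_deg:
        return 1.0
    # Linear falloff past the fovea, clamped to a minimum quality floor.
    return max(1.0 - (eccentricity_deg - fovea_deg) / 60.0, floor)

for ecc in (0, 5, 20, 45, 90):
    print(f"{ecc:3d} deg from gaze: render at {local_resolution_scale(ecc):.0%}")
```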

yes

I don't actually get it, why not use vectors for text rendering? You'd only need to create one set of glyphs that can be infinitely stretched.

>paint the inside of a piece of glass with quantum dots
>use a scanning electron beam to activate the quantum dots
>infinite resolution

Internally all fonts are vectors.
But all screens they're displayed on are raster
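Right. TrueType outlines, for instance, are quadratic Bezier curves in abstract font units, so the same outline serves any size and only the final rasterization step commits to a pixel grid. A rough sketch with made-up control points (not a real glyph):

```python
# Evaluate a quadratic Bezier curve: the building block of TrueType
# glyph outlines.
def quad_bezier(p0, p1, p2, t):
    """Point on a quadratic Bezier at parameter t in [0, 1]."""
    x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
    y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
    return (x, y)

# One arch-like segment, defined once in font units (100 units per em here).
p0, p1, p2 = (0, 0), (50, 100), (100, 0)

# "Render" the same segment at 12 px and at 1200 px: just scale the samples.
for size_px in (12, 1200):
    scale = size_px / 100
    outline = [quad_bezier(p0, p1, p2, i / 4) for i in range(5)]
    print(size_px, [(x * scale, y * scale) for x, y in outline])
```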

>But all screens they're displayed on are raster

The hardware manufacturers would certainly want it eventually to sell.

The problem is the content and applications. Regular TV/home theater is kind of dead at this point since people would rather stream to their tablets. Video games have been struggling miserably to get over 1080p

Why not just skip to 1024k and be done with it?

that would be nice. the native resolution meme needs to die.

that's literally a CRT though

Wrong. Quantum dots are immune to burn-in and don't dim with time (on a human time-scale, at least).

they haven't even got 4K working properly on PCs yet. You need expensive multi-GPU setups for it. It needs to be usable with an iGPU just like 1080p is.

what for? Tentpole movies are still mastered at 2K. CGI rendering at 4K is still prohibitively expensive for every studio.

32 MP, aka 8K, is enough for VR if you've got 20/20 vision
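Back-of-the-envelope check on that claim, using the usual ~1 arcminute per pixel for 20/20 acuity; the FOV values are assumptions for a typical headset, not specs:

```python
# Pixels needed per eye to match 20/20 acuity across the field of view.
# 20/20 resolves roughly 1 arcminute, i.e. about 60 pixels per degree.
PX_PER_DEG = 60    # ~1 arcminute per pixel
H_FOV_DEG = 100    # assumed horizontal FOV per eye
V_FOV_DEG = 90     # assumed vertical FOV per eye

width = H_FOV_DEG * PX_PER_DEG     # 6000 px
height = V_FOV_DEG * PX_PER_DEG    # 5400 px
megapixels = width * height / 1e6
print(f"{width} x {height} per eye = {megapixels:.1f} MP")
```

Which lands right around the 32 MP figure, per eye at least.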

>Video games have been struggling miserably to get over 1080p

Video games have no reason to go over 1080p anyway, ignoring VR of course. 1080p@144Hz would be hugely preferable to 4K@24FPS or whatever the console manufacturers want for pretty promo videos.

640K is enough -- for anyone.

the extra pixels are needed to display text properly. 1080p on a 23" screen looks like shit compared to the same text printed on paper.
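Quick sanity check on the paper comparison, taking ~300 DPI as a common rule of thumb for printed text (not an exact figure):

```python
# Pixel density of a 23" 1080p monitor vs the ~300 DPI ballpark for print.
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

monitor = ppi(1920, 1080, 23)
print(f'23" 1080p: {monitor:.0f} PPI, vs ~300 DPI for print')
```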

exactly what part of 4K isn't working on PCs?

We have been at a point for several years now where mid-range cards play nearly any game at 1080p very comfortably.
Even the budget 1050 manages 60+ FPS in some games at medium settings.


Just because current flagship cards can barely handle 4K doesn't mean it's a goal we should give up on.
Eventually there will be a time when mid-range cards handle 4K easily, and maybe by then 8K will be what we're striving to achieve.

so do quantum dots actually exist, then? I just thought it was something theoretical the other user made up. Would such a monitor technology actually be possible some time?

I just want this to get cheaper

Thanks doc

I'd rather get more hz.

I want all computer monitors to get cheaper. They're all absurdly expensive compared to TVs

Who, who who!!1

what monitor is it? I searched for dell original png but apparently that's not a monitor

>igpu
This and its defenders are a cancer upon gaming. Microsoft needs to stop blessing Intel's shitchips if they're going to soak up half the market.

Not him, but hardware scaling, OS support, and engine development have been incredibly stagnant for years since Microsoft gutted the PC market to push the 360. DX12 is practically DX11 service pack 2. New compression algorithms and other techniques to scale those resolution highs just haven't been arriving.

Most of the old players in the GRFX arena are dead or somewhere else. SGI went bankrupt because the hardware crew went full retard. Matrox does medical imaging and billboards. Trident's XGI was a flop. John Carmack is too busy diddling around with his VR headset. Ken Silverman is screwing around with voxel shit and so on.

honest question how did MS gut the PC market ?


As for the "old players" leaving the market, yeah more like the free market decided to not buy their shit.

>forgot the 4th who
>:^((((((((((((((((((

>1 gorillion dollars tv 32k display
>streams with codec that displays half HD images with noise over it
>muh resolution

Looking good is more important than resolution. Here are some recent tweets about the insanity that is cranking up the resolution.

twitter.com/TimothyLottes/status/935704999912443904
twitter.com/idSoftwareTiago/status/937776357764976642

it's the only 8K monitor in production

currently $3800

>honest question how did MS gut the PC market ?
When the 360 released, they made a massive push to move their PC devs over to the console. A good deal of this included buying exclusives and paying devs to retool for the 360. The other part was dropping support for the PC side of things and leaving DX to stagnate. Third-party library makers tried to step in with things that converted Xbox API calls and whatnot to DirectX, and the result was a bloated, under-performing mess.

What MS did not expect was multi-platform development becoming the norm and Sony dragging out the console life cycle to a full decade.

I'd rather see 4K or 8K becoming the norm for both displays and content across the board. Yes that means TV broadcasts and streaming too.

but muh frames and muh gaming

Wouldn't it be fair to say that trend is now being reversed, with the latest-gen consoles being x86 and DX12?

120hz should be fine for most games, higher rendering resolutions can be quite beneficial though, especially with regards to aliasing. Tons of other gaming technologies that still eat up resources though, so some more work on reducing performance impact would be ideal. There's a reason why a handful of console games give you the option to reduce graphics settings to improve performance, beyond underpowered console hardware.

>Will we ever see 16K or 32K resolution TVs?
I've never even seen an 8K tv.

yes, we'll have 512k tvs by 2020

They're not actually, if you want high-end stuff. Early adoption is always cost-inefficient. I wish there were some HDR monitors though.

But at some point you've got to account for all the "legacy" resolutions, and the fact that due to (a) licensing, (b) profitability, or (c) just laziness, a lot of content doesn't exist at anything other than whatever res it's at now. How are you gonna display all that on your fancy 8K TV without it looking like shit? Most rips under DVD res will look like utter garbage on a big-ass 8K set; even 720x480 may look like shit displayed on an 80"+ 8K TV. What will you do then? You can't just go out and buy the "Remastered" disc set, because it doesn't exist.

it all depends how far away you sit from your TV.


It is only a recent thing that people have huge-ass TVs in their living rooms.

Most of this SD-quality stuff looked just fine to people when their TVs were half their current size.
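You can put rough numbers on the viewing-distance point, assuming the usual ~1 arcminute acuity figure (a common approximation, not a hard threshold):

```python
# Distance beyond which a single 1080p pixel subtends less than
# 1 arcminute, for a couple of TV sizes.
import math

ARCMIN_RAD = math.radians(1 / 60)

def retina_distance_in(diagonal_in, width_px=1920, height_px=1080):
    """Viewing distance (inches) at which one pixel spans 1 arcminute."""
    pixel_pitch_in = diagonal_in / math.hypot(width_px, height_px)
    return pixel_pitch_in / math.tan(ARCMIN_RAD)

for size in (32, 65):
    feet = retina_distance_in(size) / 12
    print(f'{size}" 1080p TV: pixels blend together beyond ~{feet:.1f} ft')
```

So a 32" 1080p set looks "retina" from across a small room, while a 65" one needs a couple more feet.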

It's pretty much the same issue we have now with DVDs: short of very good upscaling technology, it's just not gonna look good on higher-res displays. Heck, there are still plenty of movies that aren't even on DVD. This is why pushing for content to be produced at higher resolutions is important, if only for archival purposes.

I know, I've got a 65" now. I sit a decent distance away from it, so even 480x360 stuff, depending on how it was encoded, doesn't look bad when upscaled to fill the whole screen. Can I tell a difference between 480x360, 720x480, etc. and native 1080p content? Yeah, but it's not so bad that I can't watch it or make out details.

Probably not. If it does happen, I won't buy it. I won't be able to tell the difference between 4k and 16k at 12 feet, so why wouldn't I save money? What matters to me is HDR, OLED, and frame rate, not a higher resolution, and certainly not a meme curve.

Vectrex. Release date: 1982.

i don't care about modern games and am not going to spend hundreds on a GPU that's obsolete in less than a year.

Old old old fag witnessed

the 7700K has an integrated GPU that outputs 4K

>FHD
FHD
>4K
UHD
>8K
QUHD
>16K
OUHD?
It's not 16K, it's 15.36K, thus closer to 15K.
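The same rounding applies down the whole ladder, since the "K" label just rounds the horizontal pixel count:

```python
# Marketing "K" labels vs actual horizontal pixel counts; the absolute
# rounding error grows with each doubling.
widths = {"FHD (1080p)": 1920, "UHD (4K)": 3840, "8K": 7680, "16K": 15360}
for name, w in widths.items():
    print(f"{name}: {w} px wide = {w / 1000:.2f}K")
```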

Hard to say to be honest.

The modern dev doesn't care about x86 or whatever ISA the console itself uses. Most of them rely on a licensed engine or a set of middleware to do the heavy lifting for them. It is a boon to low-level people, though, since x86 is well understood, with lots of optimizations and techniques to apply.

DX12 was a concession that they had bloat and obsolete hardware assumptions baked into the API. It is a lot more modern and much better designed to leverage modern parallelism. Still, there was no real reason not to release it for 7 and 8 other than upgrade bait.

The release of 4k consoles seems to have brought standards in development pipelines back up. If they want to target multiple resolutions, they have to put in a little more effort...just like in the PC halcyon days.

PC gaming seems to be back on an upswing since devs are doing proper releases instead of lazy ports. The market is too big to ignore, but will give a shrill reaction to shoddy work.

Why bother. The human eye sees in 480p

Vector fags, please go away. I might consider listening the day you can digitize an image into a format suitable for vector displays.

>7700k
>Thinking your unusable poor fag garbage is even capable of running windows
Lmao at your life

Nope. 7680 * 4320 is the end of resolution. Displays have reached their logical conclusion at that point.

resolution is a fucking meme

i literally see no difference vs my 1080p tv in terms of clarity. HDR however... holy fucking shit it's like the jump from black and white to technicolor. shit is amazing. now i just need more content in HDR. every movie in history needs to be remastered in HDR.

>barely any VA
>barely any 10-bit
>fake HDR VESA standard (anything less than native 10-bit panels are fake HDR)
>see displayhdr.org/performance-criteria/
>TN still 6bit+FRC
>IPS everywhere with shit contrast ratios (HDR correlates with DR)
yeah no. You can get a FALD VA 10-bit HDR TV at 50" for around $1K now (X900E). Contrast ratio is over 5000:1 too.

Maybe.

Meant to say HDR correlates with contrast ratio, but you get the point.

Even Dell's $5000 8K 31-inch monitor doesn't hit the density required to completely fool the eye. It probably doesn't matter when you're middle-aged or need glasses to achieve 20/20, but 280 PPI just isn't there yet.

So when we eventually get 16K, what happens to 720p or 1080p movies and TV shows? The bulk of our media is mastered for 1080p, and at such high resolutions it'd be like watching a show from 1980 on a 4K monitor.

We'll just run the 720p video through a highly advanced neural network that pulls a bajillion pixels out of its ass to make everything look good. Besides, you can already fill your whole vision with an 8K screen and it should be sharp enough to make pixels unnoticeable. By that logic, 16K should barely bring any noticeable improvement unless you find it important to stare at the screen from up close.

I think the race for the highest resolution is so that, even if we don't notice it, we get as close to real-life clarity as we can.

The only reason low-res stuff looks shit on high-res screens is when there's something to directly compare it to. So when Apple started shipping the Retina MacBook Pro and everyone was seeing upscaled shit next to new vector icons, they could tell it was shit.
1080p on a 4K TV is gonna look just like it does anywhere else; it just means the normie has probably seen 4K content as well. Doesn't matter if they saw it on their own screen or another.

yes.. 10 years tops.