Buy 4K monitor

>Buy 4K monitor
>Open MSPaint.exe
>Pick line tool, 1px, black
>Draw a diagonal line
>Can clearly see the jagged edges

8K monitors, when?

>he doesn't know about pixel density
>he fell for the 4K meme

>he doesn't use drawing software with anti-aliasing forced upon every tool available

it's your fault for not using the right software; my arbitrary choice of hardware and software is objectively the only valid solution

>Having a PPI so low that he needs to blur everything to not see jagged edges

Apart from the 5K iMacs, is there a screen out there that has a high enough PPI that you can't see the individual pixels?

>having a PPI so high he can't see the thin line between insanity and satire

Well, that's definitely not how that works...

but it is user

LG Electronics Ultrafine 5K 27"

Technically that's the same display as the 5k iMac...

take a picture of your screen.

Best Buy is selling a 4k Toshiba TV for 700 and giving a free PS4 Pro.

Somehow, I don't think it's that great a deal.

i fucking hate Sup Forumstards who will say that whatever they have now is pretty much the best you will ever get. fucking retards, it will be fucking nice to have ridiculous (by today's standards) resolution and color depth and refresh rate etc

Sup Forums used to bitch and moan about not needing 1080p, and that 1680x1050 was the sweet spot for price/resolution. Guess they all sat there with their shitty "in the middle" resolution screens while the rest of the world got on the 1080p train.

You'll need an 8k to 12k monitor depending on the screen size.
>5k garbage
No, not nearly enough.

Never, and if it was, it was some beaner or troll.

Remember, always do the opposite of what Sup Forums says.

5K monitors already exist and they're significantly more pixel-dense: 14.7 million pixels vs 8.3 million pixels

That's actually a very good and simple way of testing resolution and judging whether there's any possible benefit in going higher from what you currently have.

>if it was it was some beaner or troll.
I wouldn't be so sure. Remember not to fall for the SSD meme, ok?

SSDs were trash till a few years ago.

>Not wanting to see individual pixels
how will you look at pixel art then?

>not buying 12k monitor
lmao

soon...
The flood is coming!

2x nearest neighbour scaling
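For the anons who haven't tried it: 2x nearest-neighbour just repeats every pixel twice along both axes, no blending, so pixel art stays razor sharp. A minimal sketch (numpy and the function name are my own choice, not from the thread):

```python
import numpy as np

def scale_nearest_2x(img):
    # 2x nearest-neighbour upscale: duplicate each pixel
    # along both axes, no interpolation, no blur
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

# 2x2 checkerboard "pixel art"
art = np.array([[0, 255],
                [255, 0]], dtype=np.uint8)
big = scale_nearest_2x(art)
print(big.shape)  # each source pixel becomes a crisp 2x2 block
```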

I'm looking at new monitors right now, I really don't know if I should go for a more conventional 1080p, an in-between 1440p, or 4K for possible longevity. At 27 inches or so.

What distance are you from your screen?
At 50 cm, a 25 inch 4K has a dot pitch just under 1'
5K can go up to 33 inches and still be below 1'
8K is under 1' up to 50 inches... that's just excessive, desu
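If anyone wants to sanity-check those 1' figures, the geometry is simple: panel width from the diagonal and aspect ratio, pixel pitch from width over horizontal pixels, then the angle that pitch subtends at your viewing distance. Rough Python sketch, function name mine:

```python
import math

def pixel_angle_arcmin(diag_in, horiz_px, distance_cm, aspect=(16, 9)):
    # angular size of one pixel, in arcminutes, at a given viewing distance
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)     # panel width from diagonal
    pitch_cm = width_in / horiz_px * 2.54         # one pixel, in cm
    return math.degrees(math.atan(pitch_cm / distance_cm)) * 60

# 25" 4K at 50 cm: just under one arcminute
print(round(pixel_angle_arcmin(25, 3840, 50), 2))
# 33" 5K at 50 cm: still below 1'
print(round(pixel_angle_arcmin(33, 5120, 50), 2))
```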

Higher is always better, but you really should judge other aspects than resolution too.

>1 degree
>a whole fucking degree
fuck off luddite

wait, you're not talking about degrees. what even is 1', fucking shart in mart

non widescreen organic led screen when

So the dot pitch is about half the screen size?

1 minute of arc...
The visual acuity of a person with normal vision

just because you can't actively distinguish each individual low-contrast pixel doesn't mean it doesn't affect the image quality

One internets for you good sir.

Of course not, image quality is absolute, not relative.
It just means you can't notice the technical flaws from that distance.

when they get good response times, when they stop burning in, when they don't have color shift across different brightness settings, and when the near-black to black transition doesn't look like a smudgy mess.

>non-widescreen
as a monitor? probably never, unless you mount it vertically.

What is a good PPI?

autist still using a crt in moms basement

k

what if I didn't care about that shit and would early adopt just because I want organic LED

well it won't help them to be released any time soon.

you can get OLED panel TVs, laptops, tablets and mobile devices of many types. buy a TV and use it as a monitor?

>buy a tv and use it as a monitor?
I will

...

only phone screens have that high PPI

Even 8K monitors won't make that go away. You'll need 16K resolution for a 24" monitor and 32K resolution for a 48" monitor.

Until those things become affordable you'll have to make do with anti-aliasing.

If you want 1337 diagonal lines in 1080p-1440p monitors then you need to check out inkscape (open source, free).

I can't see jagged edges on the diagonal line on my 3200x1800 13" XPS unless I get close enough that my nose touches the screen

we also need ultra high refresh rates/low persistence instead of this sample-and-hold blurry bullshit

Nope, the average human eye can't see past 60-90 FPS. 60 FPS monitors are more than enough for most.

>eyes are so shit can't appreciate 4k much less 8k on anything less than 45 inches

nice meme dipshit

just because you can't distinguish each individual frame doesn't mean it doesn't make a difference. and those studies were probably done with CRTs, not LCD displays

testufo.com/#test=eyetracking
testufo.com/#test=blackframes
reddit.com/r/oculus/comments/2xoscc/is_90hz_a_fast_enough_refresh_rate_that_low/
>Nope. Carmack thought full persistence 120 FPS OLED would be enough and Abrash convinced him he was wrong (source: QuakeCon 2013). They had a 95 Hz prototype with a button to turn on/off low persistence and the difference was huge.
>To achieve 3 ms persistence with a full-persistence display you would need 333 Hz/FPS. Abrash believes VR should go below 1 ms (full persistence: 1000+ Hz/FPS).
>The longer the persistence the more wrong data your eyes see (the data is correct only in the short moment of the refresh). 11 ms (90 Hz full persistence mode) is just too long for our brains.

also there are countless blind tests on youtube where they tell the difference between 60/120/144 Hz
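The persistence arithmetic in those quotes is just 1000 ms divided by the refresh rate: a sample-and-hold display at N Hz shows each frame for 1000/N ms. Quick sketch (function name mine):

```python
def full_persistence_hz(target_ms):
    # refresh rate a sample-and-hold display needs so that
    # each frame is held for only target_ms milliseconds
    return 1000.0 / target_ms

print(round(full_persistence_hz(3)))   # 3 ms persistence -> ~333 Hz, as quoted
print(round(full_persistence_hz(1)))   # sub-1 ms target -> 1000+ Hz
print(round(1000.0 / 90, 1))           # 90 Hz holds each frame ~11.1 ms
```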

Or just a strobe.

dude higher resolution is literally a meme lmao

I don't think you understand. While the human eye can "observe" shit at thousands of FPS, it cannot interpret all of that information.

The only information it is able to detect is around 60-90 FPS. Beyond that, human eyes will guess visual information in a desperate attempt to detect what they saw. See UFO sighting reports that turned out to be regular/experimental aircraft flying by quickly.

You can easily see this by moving your fingers in front of your eyes side to side quickly.

There are exceptions, you may be born with mutated eyes that allow you to clearly interpret like 200FPS but for most people this just simply isn't true.

no one has to be able to pick out a single frame from a continuous video stream, that's completely irrelevant, what's relevant is "the big picture" and that you're almost always seeing objectively incorrect data, especially in a video game where the data is only valid for an infinitesimal period of time

Likely not happening until the processing power to drive it is trivial, or until the manufacturing technique makes lower resolution screens the more costly option.

4k already has such a high ppi on sub 30 inch screens that you need to scale the ui to comfortably read and use it.

get your fucking nose off the screen

pic related looks nothing like watching a horse run in real life. temporal resolution matters. being able to pick out individual frames is about as useful as being able to pick out individual pixels. you're incredibly autistic if you insist that this is the only thing that matters.

It's within your observation capacity so does it really matter?

You can show someone a video playing at 1,000,000 FPS on a 1,000,000 Hz monitor while listening to a recording recorded at 1,000,000 kHz, and then show them that same video scaled down to 90 FPS while listening to audio scaled down to 48 kHz.

Do you honestly think they will be able to tell the difference?

There's a point where you have to stop and think about the human limitations in the media you want to present. Of course this will change when we become machines but that's still decades away and probably not in our lifetimes.

they were never trash, just ungodly expensive for what they did.

>Do you honestly think they will be able to tell the difference?
yes, especially on a sample-and-hold LCD displays

look at the fucking ufo tests i linked, see it for yourself, and start your own VR company while you're at it, dumb shit

I really hope you die a horrible death. People like you are exactly why the monitor market is as shit as it is.

>PS4 Pro
that can be sold for ~$400, so the TV effectively costs only $300

They were garbage with a very limited number of write cycles.

Fuck off placebofags. Go listen to your 1Thz/1,048,576-bit lossless music with cable lifters and magic stones.

no it's not; remember, higher resolutions demand more processing power, and depending on the application and the power of the PC, this could really fuck with it. personally I'm productivity over looks, so I want a 48 inch 4K so I can have 4 bezel-less monitors' worth of screen space.

what do you normally do?
if it's just general computer use, 1440p is likely the better option
do you play games? 1080p at 120/144 Hz would be better
do you watch videos? 4K would be better due to how the scaling works full screen.

>you're incredibly autistic if you insist that this is the only thing that matters

enjoy your $1 chink earbuds lol

Any credible test utterly shows how ignorant you are, kiddo.

Dell has a $5,000 100% Adobe RGB panel, pretty sure it's up for sale now.

I have both a 20-something″ 4K monitor and a 13″ 3200×1800 laptop. The jagged edges are almost impossible to see from a normal viewing distance on the laptop, so we don't have far to go.

Hey what's the magic stones to get for my setup guys? :^)

I already have a $40,000 setup with ceramic and wooden cable lifters (used both for maximum quality). :^)

Considering I mostly watch videos on my computer, I could justify a tv as a second screen if its oled.

OK

what about vidya

100% depends on viewing distance.

80-120 is the sweet spot for not needing to scale anything in the os though.

300-1200 is what pictures are printed at in print media.
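PPI is just the diagonal pixel count over the diagonal size, so you can check those sweet-spot numbers yourself. Sketch (the panel sizes are common examples, not anything from this thread):

```python
import math

def ppi(h_px, v_px, diag_in):
    # pixels per inch: diagonal resolution over diagonal size
    return math.hypot(h_px, v_px) / diag_in

print(round(ppi(1920, 1080, 24)))  # ~92, inside the 80-120 no-scaling zone
print(round(ppi(2560, 1440, 27)))  # ~109, still inside it
print(round(ppi(3840, 2160, 27)))  # ~163, you'll be scaling the UI
```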

I mean maybe you're a degenerate? Sorry if you are. 120fps for me.

Strobing is a method of achieving low persistence, it's not an either/or thing.

Personally, for that i would just get a 1080p 120/144 hz display.

My main monitor is a 24 inch 1920x1200, and I would never want a vertical resolution lower than that, so if I upgrade this, it would be to a 1440p 27 inch, or if I had money to burn, the Philips 40 inch 4K

But I'm OK with having a monitor dedicated to gaming if it could be had cheap enough.

If you want an all-in-one, go with a high contrast 120/144 Hz screen. I think BenQ makes a few that are either 3000:1 or 5000:1 contrast and are some of the best displays you can get in terms of contrast, but I'm not sure if they are over 60 Hz... that contrast would be worth dropping the fps for though.

If your goal is less blur, then it is.

The Z5 Premium has 800 PPI

Get a custom made one.

luddites BTFO

jesus christ i need this

I don't think you understand. Saying "Or just use a strobe." to somebody promoting low persistence is completely redundant because strobing is a method of low persistence.

Nigger everybody can see that this shit is resampled from a lower resolution.

the military did the research; humans can accurately identify shit up to 300 fps, and it's thought that the max perceivable fps is somewhere around 500

I want an organic LED display that actually has a small computing unit behind each pixel acting as the graphics card. This could lead to immense power and cost savings and could mean a far better optimized result. Basically, GPUs have so many cores because we have so many pixels.

When they begin to be able to print screens on glass surfaces, the technology could be further advanced, leading to the application of an additional silicon layer. This silicon layer would contain controlling as well as computing circuits for each individual pixel.

Is it? :^)
Sup Forums limits the file size; the JPG compression and reduced quality are to fit under 4 MB

The first 720p phone I got legitimately blew me away at how good that dense of a PPI looked. I upgraded to a 1440p phone recently and I could honestly barely tell the difference from 720p.

Unrelated to your post but I honestly don't think phones need to go above 1080p. I know we'll still be getting 4k phones that are completely overkill though.

>doesn't post source
whatever you say, whiteflagger

the first ones were all SLC that had millions of writes, but cost something like $40-80 a GB, at least this is where I started to pay attention to SSDs

then TLC came along, cut the cost of manufacture down but also limited writes to about 5000 cycles

current SSDs have sub-1000 write cycles, but are big enough to never die unless you use them as a scratch disk or purposefully break them.
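A back-of-the-envelope endurance figure is just capacity times rated P/E cycles; it ignores write amplification and over-provisioning, and real drives usually far outlast the rating. Sketch, function name mine:

```python
def drive_lifetime_tb(capacity_gb, pe_cycles):
    # naive total-writes ceiling: capacity times program/erase cycles
    # (ignores write amplification and over-provisioning)
    return capacity_gb * pe_cycles / 1000  # in TB

print(drive_lifetime_tb(256, 1000))  # ~1000-cycle flash on 256 GB: 256 TB of writes
print(drive_lifetime_tb(256, 5000))  # 5000-cycle era: 1280 TB
```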

1080p on a 6 inch phone looks kinda shitty, could definitely be better, and that's with all kinds of anti-aliasing, it's quite bad without anti-aliasing

you're on your own looking for the source; it was the Air Force if I remember right, and they used slides, having people identify what the planes were, and most could do it accurately up to 300 fps.

>purposefully break it

Gee, I wonder who would want that

i mean, if you think back to some years ago, we were completely fine with 320x480 and such resolutions, now in hindsight it looks like complete ass

I'm thinking of the people who run review sites, who just wrote random data to a few SSDs to see when they would die... I think Samsung managed to get a 256 GB drive out that had 1.4 PB of writes before it gave up completely.

Yeah it's not like a tri-core ARM chipset with shitloads of RAM can have an algorithm that mixes a timer with I/O so that it only breaks after a few years

I'm not projecting my own ideas onto corporations; corporations made me start thinking this way.

And it's not like I'm the first man on the moon, just having invented that idea lmao

right one looks best tbqhwyfam

>jpeg compression makes gigantic oversized blue/orange text sharpening artifacts and uniform three pixel wide aliasing before there's major macroblocking

Wow amazing :v)

"looks like ass" is very relative; remember, we also had much smaller screens, and the screens were CRTs that handled non-native resolutions like a pro.

I had a 17 inch 1600x1200 CRT that I ran at 1024x768, because text got too small to read at the higher resolution.

as for playing games, I remember Carmageddon, one of my favorite games at the time. I only ever used the software renderer even after I could 3D accelerate, because it looked like shit when everything had a sharp edge.

the phone would be useless far before the drive would ever die, and if someone did this with a PC, there would be an industry-wide shitstorm the likes of which we have never seen before.

>there would be an industry wide shitstorm
Yeah, just like with the suicidal Foxconn employees, Nvidia with 3.5 GB VRAM, non-overclockable CPUs from Intel, missing updates on smartphones of any kind, Apple wanting $150 for something they fucked up themselves...

A shitstorm would change so much!

Those companies must be really scared of fucking the customer over already. Just look how many people are avoiding Apple, Nvidia or Intel already!!

>use shitty drawing software
>output looks shitty
WOW

pic related made in inkscape

No one gives a fuck about foxconn employees outside of a small vocal minority, that and "Apple pays a company who drives employees to suicide money"

The Nvidia shitstorm ended in a class action that Nvidia lost, and they had to pay everyone who bought a 970 what was it, $20 or $40

Intel making non-overclockable CPUs: you realize you are a vast, VAST minority if you overclock. a small shitstorm over that, and what are you going to do, use AMD? at least till Zen, Intel has no incentive to make all CPUs overclockable, especially when a dual core i3 could compare favorably against an i5 if you could OC the shit out of it. why get anything below an i7? you would completely kill off everything outside of their E lines

smartphones, at least on the Android side, have had this problem from day one, and you either accept it or get Apple; it's not like 10 years in they start fucking you on updates.

and what Apple thing was $150 for something they fucked up? also, it's Apple, when has there ever been a shitstorm in their userbase when they fuck them over?

Basically all phones from over a foot away
VR from over 3 inches away
A 24" 4K monitor from over two feet away
Or pretty much any TV in a normal sized living room
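Those distance claims all come from the same 1-arcminute rule: take the pixel pitch (1/PPI) and find the distance where it subtends under one arcminute. Rough sketch, function name mine:

```python
import math

def invisible_pixel_distance_in(ppi):
    # distance (inches) beyond which one pixel subtends
    # less than one arcminute (~normal visual acuity)
    return (1.0 / ppi) / math.tan(math.radians(1 / 60))

print(round(invisible_pixel_distance_in(184)))  # 24" 4K (~184 PPI): ~19", under two feet
print(round(invisible_pixel_distance_in(92)))   # 24" 1080p (~92 PPI): ~37", about 3 feet
```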