Why is all hardware improving/advancing at a competent rate except for monitors?


>terrible black levels
>dead pixels
>corner bleeding
>all "innovations" are just marketing memes

*blocks your argument*

I honestly don't care about the shitty contrast, corner bleeding or anything

What I CAN'T stand is the motion blur that is present on every single sample-and-hold display.
Not even 144Hz helps.

It's literally the only reason I keep my CRT around.
Anything that moves will look like a blurry mess on any LCD

Black levels have improved; you're just buying the wrong panel tech. If you take a look at some of the expensive LCD TVs from Samsung/Sony you'll see they're all VA, at around 5000:1 contrast ratio or more (some of the recent ones reach 7000:1, which is insane for LCD), all 120Hz with high brightness output and very good coverage of DCI-P3.
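To put those contrast numbers in perspective: contrast ratio is just white luminance divided by black luminance, so you can back out the implied black level. A minimal sketch, with an assumed 300-nit white point (the ratios are from the post above, the white level is illustrative):

```python
def black_level(white_nits, contrast_ratio):
    """Black luminance (in nits) implied by a contrast ratio at a given white level."""
    return white_nits / contrast_ratio

# Assumed 300-nit white point; contrast ratios from the post
print(black_level(300, 1000))  # typical IPS ~1000:1 -> 0.3 nits
print(black_level(300, 7000))  # high-end VA ~7000:1 -> ~0.043 nits
```

The 7x contrast advantage translates directly into a 7x darker black at the same brightness, which is why VA looks so much less grey in a dark room.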

On top of that, this exists now:
news.panasonic.com/global/press/data/2016/11/en161128-4/en161128-4.html
eizo.com/products/coloredge/cg3145/

And assuming it isn't a total meme (it probably isn't), LCDs will stop being absolutely ridiculous when it comes to black levels soon.

Anyway, LCD as we know it is fated to die anyway; we'll start using microLEDs instead of backlights and create a new emissive display tech that hopefully won't have any of the OLED drawbacks, just the motion problem from sample-and-hold, which can be largely fixed with BFI (black frame insertion).
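The sample-and-hold blur complained about above can be roughly quantified: when your eye tracks a moving object, perceived blur width is about tracking speed times pixel persistence, and BFI/strobing shrinks persistence by only lighting the frame for part of its interval. A minimal sketch; the speed, refresh rates, and BFI duty cycle below are illustrative assumptions, not measurements:

```python
# Rough model of eye-tracking motion blur on a sample-and-hold display:
# blur width (px) ~= tracking speed (px/s) x persistence (s).
# duty_cycle < 1 models strobing/BFI, where the image is lit only part of the frame.

def blur_px(speed_px_per_s, refresh_hz, duty_cycle=1.0):
    """Estimated blur width in pixels for a tracked moving object."""
    persistence_s = duty_cycle / refresh_hz
    return speed_px_per_s * persistence_s

speed = 960  # assumed: object crossing the screen at 960 px/s

print(blur_px(speed, 60))        # full-persistence 60 Hz -> 16.0 px of smear
print(blur_px(speed, 144))       # 144 Hz -> ~6.7 px (better, but still visible)
print(blur_px(speed, 144, 0.25)) # 144 Hz with BFI at 25% duty -> ~1.7 px
```

This is why raising refresh rate alone never fully fixes it: blur only scales down linearly with frame time, whereas strobing (or a CRT's inherently short phosphor persistence) cuts the lit time directly.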

>he doesn't have a CRT

enjoying my deep blacks, lack of motion blur and higher refresh rates here (v:

The first 3 things listed are more of a quality control thing; LCDs can vary in quality even within the same batch. Don't cheap out on a monitor or fall for the "gaming" monitor meme and you'll often get better panels.

If you really care about this you can still find some of the legendary 2013 Plasmas (Panasonic ZT60, Samsung F8500).

Because consumers only care about "K's" not dynamic range or color gamut.

I don't see you dropping 500+ $/€ for a monitor.
See Can't have motion blur when all you see is blur.

OLED just needs a decent way of dealing with burn-in, then manufacturers actually need to invest in making them. Maybe QLED will lead to a better opportunity for monitors as well, but not in its current form.

I don't see myself upgrading for another 3 years possibly. Sitting on a 1440 IPS 60hz, and 1080 TN 144hz.

I guess I could steal my dad's old 24" ViewSonic flat-screen CRT...

what would you consider to be a "competent rate" of technological advancement of computer monitors?

I don't think it's even a question of competency

>gets 4K TV
>has to upsample from a 1080i signal upsampled from a 720p source (at best)

No blur here friend

The fact that a 20 year old technology is still capable of better contrast, colors and crystal clear motion than current displays is shameful

Hey, I'm still running an HD 5670 and i3 530. Works perfectly for CAD drawing and the games I play.
Don't ever trust salesmen who try to convince you that you need something newer.

Yeah but imagine if newer CPUs had to give up single core performance for more multi-tasking and ... oh

1. technology. OLED just not there yet, MicroLED not around yet.

2. Both the marketing and consumers are retarded when it comes to monitors.
With a CPU or GPU you just buy the newest one / series, with or without reading the "% better than last gen" benchmarks.
But with monitors, consumers just keep the same old shit for 10 years and then buy whatever is popular and cheap without regard to progress. I am offended that old shit like the VG248QE and GN246HL are still on the market, but really, if Asus and Acer wanted they could just sell only those and still be in business for years to come.
The only thing companies and consumers seem to understand is higher numbers. Can't go lower than 1ms (or promising 1ms with your strobed VA or IPS panel), but more Hz, resolution, premium features like HDR or RGB shit, etc. That's all well and good, except instead of replacing the older models, they just slap a $1500 price tag on these and call it a day.

Personally I agree with . I have an FW900 and 144hz monitors just aren't comparable.
Can't afford one of the high res high hz IPS models, so I instead opted for a 32" 75hz 1440p VA for my general use (new AOC Q3279VFW, finally one with Freesync and inputlag-free 75hz, at 200€ shit was so cash)

Just because it's not better at the things you want it to be better at does not mean that no improvements have been made.

You can't ignore that monitors have become so portable that you carry one in your pocket every day, that most personal computers come with built-in monitors these days, and that you can mount a monitor flush against the wall.
For a lot of people, reducing bulk was the biggest advancement that could have been made in computer monitors.

Size and geometry were the ONLY things they did better. And now they're trying to convince people to give up the geometry with the meme curved displays.

LCD is hitting its limits. Some new things like quantum dot are coming next year; microLED or pure quantum-dot displays are 5 to 10 years away.

>literally shoots laser in your face

>tfw no 4k 144hz no backlight bleed 1ms input lag 1ms response time monitor

Can someone explain why curved is a meme? Better yet, explain why it isn't?

Obviously not too desirable if you're a graphic designer, and curved TVs were definitely a "what can we meme to get people to buy new TVs?" tier.

But curved monitors are not too bad since you sit up close to them. Tried one, felt natural within a day. It probably also helps with some of the subtle shifting and black crush VA panels inherently have.
Not much of a difference at 24" or 27" 16:9 though.

Bro bought an OLED TV to use as a monitor a few months ago.
Any other monitor or tv looks like washed out garbage in comparison.
Can't wait for OLED or comparable tech to become a standard. Picture quality is leagues above anything else out there.

Just ordered a Dell UltraSharp U2718Q. Did I just buy a meme, or is it actually decent?

Unironically wait for Micro LED.

You just can't afford the monitors without that.
Also, OLED burn-in is the newest meme.
I feel like all CRTs are constantly blurry.

Bought one and returned it due to dead pixels... so good luck.

See you in 30 years

>>terrible black levels
my current VA has pretty good black levels. not kuro or OLED blacks but much better than TN or IPS
>>dead pixels
never had them actually, I must be lucky. worst I've had is a pressure spot on a monitor I basically got for free.
>>corner bleeding
unnoticeable on mine; you could probably bring some out in a dark room with a camera, but it's nothing like IPS bleed.
>>all "innovations" are just marketing memes
curved is nice and doesn't really cost extra.
144hz, freesync, and BLB are great for gaming, everything feels super smooth.

I have had 2 FW-900s, and while I love them, CRTs need to warm up, and get convergence problems over time that need work I don't know how to do and don't think I could hire anyone to do. Mine was starting to look distinctly fuzzy/unconverged at the edges. One day one of mine turned on and gave an out-of-range message. I would need a special cable with a mobo that has that port to even start fine-tuning the internal settings. At that point the bother was too much.

LCD tech does blur a bit more than CRT, maybe even my Plasma looks a bit faster in pure motion transition, but it's a close thing. my 144hz VA, simply by being able to present more frames, all smoothly timed with one another, feels smoother than my CRT ever did at 85hz.

one just feels faster and smoother, and it's no longer the CRT. If I want to watch some passive content or a game that is slower, and I also want more of a "CRT look" in terms of blacks and colors as well as motion, I watch it on my plasma.

LCD still has plenty of room to progress, but my only current complaint that seems like an easy fix is I would like the option of no AG coating.

>Just wait
Ahahahahahahahahahahahahahaha

because you're a poor bitch nigger who cant afford eizo

because you're too poor for oled.

>neural interfacing VR will become affordable before a physical monitor with high contrast, brightness, refresh rate, resolution, accurate colors, low input lag and response time.

Because you're too poor for MicroLED.

It's called a CRT honey :)

I have one stuck green pixel on my new 27" 1440 144Hz IPS panel, don't even notice it unless I stick my face pretty close and look for it.
If anything, IPS bleed is what's irritating; I had to drop the brightness to ~30 before I stopped noticing it on a black background around the bottom left corner.
Don't think it is IPS glow, because if I distance myself it sticks out like a sore thumb, yet if I stick my head close to the centre of the display it disappears.
There is an ever so slight yellowish tint to it; it's like two glowing bubbles sticking out of the side of the panel.

Haven't been having much luck with IPS lately, my last 27" 4K LG panel was fine for a while until it started exhibiting a red tint on black background slowly growing from the top corners, it was small but growing more and more noticeable until I got fed up and sent it back for warranty.
Should do more research prior to purchase.

>27"
>4k

Meme.

LCD panels are a commodity and consumers are dumb. The HD meme retarded progress in monitors for over a decade because it allowed the LCD makers to sell cheap 1920x1080 or 1280x720 panels with atrocious PPI and color depth as if they were top-of-the line. The best thing about 4K becoming a thing is it allows them to do the same thing again but with higher PPI displays.
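The PPI complaint above (and the 27"/4K argument elsewhere in the thread) is easy to make concrete: pixel density is just the diagonal pixel count over the diagonal size. A small sketch; the sizes and resolutions picked below are common examples mentioned in the thread, not anyone's specific monitor:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch given resolution and diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Common sizes/resolutions discussed in the thread
print(round(ppi(1920, 1080, 24)))  # 24" 1080p -> ~92 PPI
print(round(ppi(2560, 1440, 27)))  # 27" 1440p -> ~109 PPI
print(round(ppi(3840, 2160, 27)))  # 27" 4K -> ~163 PPI
```

That ~92 PPI figure is why a big 1080p panel looks coarse up close, and the jump to ~163 PPI is why 27" 4K either looks very sharp or needs scaling, depending on who you ask.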

The UHD alliance is attempting to fix some of the issues with piss poor specs on displays outside of resolution.
Ultra HD Premium is supposed to be a seal of approval from the UHD Alliance where displays have to meet a list of specifications.

I doubt this will make much of a difference. The vendors will push for the proliferation of certification levels that can accommodate any device they make, no matter how cheap. This results in confusion and consumers ignoring the certification labels. Case in point: USB.

Perfect unless you're a gamer

sounds like you just got really unlucky. I have heard of and seen pics of effects like that with IPS, where you not only get minor glow, but big, colored splotches that appear on very dark or very bright blank space. from what I have seen, IPS has the worst backlight problems of any of the available display tech. I personally never got into IPS because I rarely saw examples that looked hugely better than a good-color TN except for quite expensive stuff. and fast IPS are all quite expensive. as for the stuck pixel, I have heard displaying certain patterns in that area combined with massaging where the pixel is with an eraser head or something can unstick a stuck pixel.

I like that the luminance requirement is tied to black level, but still, am I the only one who thinks TVs have been more than bright enough for many years? If anything, all displays are too bright by default. I know normies like watching TV in direct sunlight, but still...
I lower brightness on all my displays by some amount. even a generally dark old plasma at full brightness settings looks fatiguing to me.

>displaying certain patterns in that area combined with massaging where the pixel is with an eraser head or something can unstick a stuck pixel
Was a little tricky to get with my hands shaking all over the place, but I just checked it with a digital microscope I have lying around. It is odd; first time I've seen a defect like this, I don't think it is a stuck subpixel, otherwise it would be easier for me to spot from a normal viewing distance.
Was tinkering with UDPixel's single-pixel flash under the microscope and noticed that the green dot is smaller and fainter than the subpixel arrangement. When the presumably "stuck" green subpixel is actually lit, it is a lot larger and easier to notice by eye from typical viewing distance.
It is far from easy to notice, so I will avoid physically prodding the display to reduce the risk of fucking up other pixels. Flashing with UDPixel for a few minutes had no effect. Hopefully it will disappear on its own, but it is not distracting, since the only way to spot it is to stick your face in the display and look real hard on a black/dark blue background.

It's like there is a small tear in the green subpixel and green light always leaks out of it, rather than the full green subpixel being lit.
Pic related shows my mouse pointer and the stuck pixel is the little spec of green light on top.

It's mainly to be able to work in all light scenarios, like you said, watching TV in direct sunlight or someplace like a well-lit living room with lots of natural sunlight. But yes, you're right in that TVs and most monitors are sometimes too bright by default. I'm using my BenQ VA monitor at brightness level 7 out of 100 only because I'm under around 80W of incandescent light and don't need high brightness, but if I was under heavy fluorescent tubes that brightness is getting jacked right up.
You also have to think that HDR might need higher brightness in general

That kind of blur is solved by those strobe backlit LCDs, gets rid of the perceived blur caused by sample-and-hold.

Kek

Why? Too big or too small for the resolution?

jews

Chemistry innovation is slow.
This is why battery technology takes decades too.

>why is all hardware improving/advancing...
*ahem*

>1366x768 laptops still being sold 20+ years after first availability