What is the fucking point?
real 2k 2:1 displays when
for higher resolution you dumbass
>release 4k screen
>It's 24 inch
What's the fucking point if you can't read anything?
It can be done and it sells. That's enough.
I'm clearly able to read though...?
Dude, Musk has put a car on the moon and our children will be playing on MARS !!!
BIBI is making israel great again !!
Consumerism is Jewish, Goy.
Because you're a dumbass who uses scaling to make his 4k monitor look like a 1080p monitor
mfw when the israeli jet downing is the beginning of world war 3...
mfw when trump and bibi will become the new hitlers of their time after they lose ww3
You may have autism.
>ur a dumbass for making things more detailed and readable
You literally can not tell the difference
If you can't mate you need some better eyes/genes holy fuck
False, 4k unscaled is entirely readable without eyestrain with practice. I work using that setup every day and it's productivity nirvana.
that battery consumption tho
Maybe you should stop reading your screen from millimeters away, dumbfucks
Wow. The absolute retardation of this post. I'm sorry you're so poor, user, and have to justify your shitty 1080p screen. Maybe in 10 years you'll see how much better 4k is.
Sup Forums agrees with me idiot
Wow you're fucking retarded. also
If you're curious, this is what it looks like. The text in the debugger is scaled up a bit for more comfortable reading, but even the MacOS top bar is perfectly readable from a comfortable distance
Indiscernible pixels are wasted pixels
FullHD is good until 24"
2560x1440 needs at least 27"
4k requires at least a motherfucking 42" TV; better to get 3 separate monitors instead: more flexible, works better with fullscreen applications, and is cheaper
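For what it's worth, here's the rough pixel density math behind those size cutoffs, a quick sketch in Python (assuming plain 16:9 panels, and treating ~90-110 PPI as comfortable unscaled density, which is my assumption, not gospel):

import math

def ppi(width_px, height_px, diagonal_inches):
    # pixels along the diagonal divided by the diagonal length in inches
    return math.hypot(width_px, height_px) / diagonal_inches

print(ppi(1920, 1080, 24))   # ~92 PPI, FullHD at 24"
print(ppi(2560, 1440, 27))   # ~109 PPI, 1440p at 27"
print(ppi(3840, 2160, 42))   # ~105 PPI, 4k at 42"

All three land in roughly the same density band, which is the whole argument.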
Selling more hardware. They require better hardware, and your old shit could become unusable when those high resolutions start getting popular.
>that resolution does not even EXIST
This one gets me every time :^)
Nope, having two applications side by side on one monitor is vastly superior
On a laptop, sure. I'm talking 27" and greater.
Increasing the resolution is easier than increasing the quality
...
distance from the display OBVIOUSLY affects perception. a user who sits closer will see more detail and more easily be able to discern pixels
you clearly haven't taken this into account
a TV must be much larger than a computer monitor to take advantage of the same resolution because viewers sit so much farther back from it. If you're not on a couch on the other end of a room, you can see detail in much smaller screens
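Rough numbers on that, a sketch in Python assuming the usual ~1 arcminute figure for 20/20 acuity (that figure is the standard rule of thumb, not something anyone here measured):

import math

def pixels_vanish_beyond_inches(ppi):
    # distance past which a single pixel subtends less than ~1 arcminute
    pixel_pitch = 1.0 / ppi                          # inches per pixel
    return pixel_pitch / math.tan(math.radians(1.0 / 60.0))

print(pixels_vanish_beyond_inches(92))    # ~37", a 24" 1080p monitor
print(pixels_vanish_beyond_inches(163))   # ~21", a 27" 4k monitor
print(pixels_vanish_beyond_inches(40))    # ~86", a 55" 1080p TV

Which is the point: a 55" 1080p TV only stops showing pixels from couch distance, while a 27" 4k panel is already past that at normal desk distance.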
can you run a fullscreen game and fullscreen youtube on one monitor?
(seriously, I looked for a solution to this 10 years ago, found other people asking that even earlier, and the only "solutions" are "hurr, tiling manager", except none of them can tile fullscreen stuff)
I don't see how or why you would want to run two 16:9 applications at the same time, you're going to have to cut quite a bit. Most monitors do have PIP mode though that you could attempt to do that on
You'd have to run the game in a borderless window, and YouTube with scale to browser window. That's the closest you can get
dude, I have a 55 inch 4k on my computer, even scaling that text up to fit my monitor that shit was too fucking small to read comfortably, much less at sub 40 inches.
Well maybe using a TV as a monitor isn't that great of an idea
because you get a tcl p605 and have a near perfect monitor, I would have liked 42-48 inches more, but it was either get the best, take a significant drop in quality for a smaller screen, or pay a fuck load more for an equal if not slightly worse screen that's also the smaller size.
this is Sup Forums and pre /mlp/ being banned faggot.
The whole point of 4k for computer monitors should be banishing useless small displays forever. No more having to buy 2-3 displays just to be able to get stuff done, one nice reasonably sized 40"+ display with enough usable area to actually get shit done.
you scaled it 200-250% larger than it actually is
this is what wiki 4k looks like
>The whole point of 4k for computer monitors should be banishing useless small displays forever.
oh, and if anyone is wondering, scale this to 50% if you are on 1080p to see what 4k would look like for you.
yea, the text being bigger on my screen than it is on yours, and it STILL being too fucking small to comfortably read, screams my 55 inch is the problem, and not text at 4k sub 40 inches.
>lmao tinyfonts
>that resolution does not even EXIST
That is kinda the whole point of using 4k you dumb faggot. High dpi with upscaled text makes for exceptionally good reading. Your phone does the exact same thing and it is the reason 'retina' displays really WERE a good thing.
You should probably get your eyes tested
I can easily notice a difference between a retina macbook with everything scaled up and one of the older macbooks without a high res screen. The retina screen leads to less eye strain.
GOD DAMN RIGHT
tcl p605 is perfect for this purpose.
Yeah, that's the default setting for a 4k monitor on windows, literally plug it in and it looks like what I have.
so piss away every advantage the hardware gives you for screen real estate in favor of hardware based anti aliasing? are you fucking stupid?
Who the fuck wants to actually see each pixel?
in 1080p I can see the pixels, at 4K you couldn't even see any unless its a huge screen
your nose shouldn't be touching the screen,
It looks far superior and is easier to read than any antialiased fonts at lower resolutions. I am sorry you are too poor to afford a good monitor. Actually 4k is too small for optimal dpi for a 24-27" monitor, 8k is optimal.
yea, piss that screen real estate away, let's have a 1080p monitor but make its user experience even shittier.
>What is the fucking point?
Of living? I reckon you have to figure that out for yourself.
Are you stupid? I can see the pixels in 1080p from 6 feet away. You really do need to get your eyes checked.
why is no-one bothered by the fact that "2k" (what I expect to be 2560x1440) is displayed as about as big as 1080p?
congratulations on being a liar on the internet, your parents must be so proud of you.
Yes, because hardware based anti aliasing in this form is the best. In a perfect world we would have molecule-sized pixels, but a high pixel density is going to be close enough.
He is literally retarded, there have been plenty of studies on this shit that show optimal dpi, and it's the entire reason apple went with 5k for their retina iMac a while back and not 4k, because 4k was too small. He just whines about 'muh real estate' as if you can't scale text down in certain programs if you need more space.
55" is a little too big for 2160p, only 80ppi, ideally you want somewhere between 38-43
2k is typically 2048 × 1080, but could literally just be 2000x1000~
I've never seen 2048x1080, but thanks for the info
resources.printhandbook.com
really, I wanted 48, but like I said, literally everything in that range is either a VERY significant drop in quality, or is $1000+ more expensive.
seeing that I'm 3-5 feet away from it at all times, and would likely be further away if not for my desk and how it is, I get the advantage of very nice text and not needing to scale at all; if something is too small I sit up and lean forward.
The fucking screen lickers I will never understand though.
It's literally for movies mostly
then it makes sense, I've seen people referring to 1440p as 2K before (for whatever reason)
because 2k is damn near 1080p,
what you are thinking of is wqhd
2k is specifically 2048x1080
4k gets a fun bit where it's used for UHD, which is 3840x2160, when 4k is really 4096x2160
I think technically it would be 2.5k since it's typically a reference to the horizontal pixels
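Since half the thread is arguing past each other on names, here are the figures being thrown around as a small lookup (these are the commonly used definitions; 'DCI' means the cinema standard):

# resolution names that keep coming up in this thread
RESOLUTIONS = {
    "FullHD / 1080p":    (1920, 1080),
    "DCI 2K":            (2048, 1080),
    "WQHD / 1440p":      (2560, 1440),
    "UHD (consumer 4k)": (3840, 2160),
    "DCI 4K":            (4096, 2160),
}

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {w}x{h}")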
You fucktards probably also think 720p on a >5" phone looks good.
It doesn't look bad
t. used 1366x768 on a laptop for years
I have bad eyesight, 1440p is at the limit of what my eyes can resolve on a 27" monitor at whatever a normal viewing distance is. Anything more is a waste of pixels
>2048x1080
2048x1024 would be a much better resolution that could appease my autism
...
tons of shit would be a better resolution, but I believe this is a film resolution more than a normal hardware one.
/thread
thanks Sup Forums
That resolution should be outlawed. It's insane that it's 2018 and that's still just about the most common resolution on laptops.
I know Windows sucks for resolution independence but seriously it shouldn't be so hard to find a reasonably priced laptop with a screen that doesn't suck.
past 4k I say there is no point
I'll be satisfied with 8k
this
There is, but it's stuff like a single screen that can treat both 2160p and 1440p as essentially 'native' resolutions.
>quality
What are you even talking about? colors and contrast on monitors?
>wasting gobs of gpu time on things that make no practical difference, while at the same time breaking programs that don't use vector graphics
10 bit 4k tv here, granted it's 8bit+FRC, but from what ratings have said, it handles color better than quite a few of the 10bit native panels.
it's 6500:1 without local dimming, and 7500:1 with, and has an absolute lag of ~14 ms, something many monitors fall short on, funny enough. yea, a monitor may get 120/144/165/240/480hz, but you still have 2-3 frames of input lag.
hdr is forcing quality to get cheaper,
dolby hdr is 12bit color and 4000 cd/m2 (is that the term?) brightness, with a contrast range of, was it 17 or 21 stops, which I think is something like a 12-27000 contrast ratio
shit will target that in the coming years, and we will see displays go down in price to compete with china tvs or move the fuck out of the way.
>hdr is forcing quality to get cheaper,
>dolby hdr is 12bit color and 4000 cd/m2 (is that the term?) brightness, with a contrast range of, was it 17 or 21 stops, which I think is something like a 12-27000 contrast ratio
HDR is a meme. All the cheap TVs already are "hdr ready" and that's all they need to sell. The only thing HDR actually adds is 10bit (or 12bit I guess, despite that being completely useless)
He means incrementally we have gone from PS1 style graphics to "4K" graphics today.
Looking at the Witcher 3, the only way to improve graphics is to start simulating reality. So more physics on objects, textures need to be replaced by something else, etc...
Actually producing better graphics has become incredibly difficult. Technically speaking, from Ultra graphics on Crysis 1 to Witcher 3 there has not been much visible improvement.
So what they do is bump up the textures and resolution and call it a day.
An actual improvement in graphics would have been Zelda: BOTW with Witcher 3 graphics. So physics combined with high textures and shit.
>The only thing HDR actually adds is 10bit (or 12bit I guess, despite that being completely useless)
How can a technology board be this retarded?
Witcher 3 doesn't look anything like reality.
I didn't imply it did. It's just a really good looking game. But it's very far from reality. That's what I mean. Still, images of it look really nice. But the physics in it suck dick compared to reality.
The metadata is trash, and the "specifications" aren't exactly being enforced. You're the retard for gobbling up the marketing.
Do you even play video games?
because hdr is so fucking stupidly advertised and explained everywhere that you have to see the difference between hdr and non hdr to see the difference,
that in and of itself is fairly damning, but from my understanding, and correct me if I'm wrong,
non hdr video was mastered to a contrast ratio of ~72:1 and has at best 8bit color, if even managing to hit that.
hdr brings more colors, so you see less banding, and allows more contrast to be displayed at once.
granted there is no real reason hdr content shouldn't have some benefits from even non hdr displays, but 10 bit is a sticking point that most displays can't do.
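The banding part at least is just arithmetic: levels per channel double with every extra bit (sketch):

# levels per color channel at each bit depth
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels per channel, {levels ** 3:,} colors total")

8-bit gets you 256 steps per channel, 10-bit gets 1024, 12-bit gets 4096, which is why gradients band less the higher you go, assuming the panel can actually display it.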
HDR is the future of displays. It's like saying "well all content is 256 colors at most why would you even waste with true color?". The only problem is OSes can't handle HDR monitors yet, they fuck it all up.
Why are we still using pixels when voxels are a thing nowadays
Cinema projectionist here.
You expect full HD/2k on your phone. You expect 2k-8k on your monitor.
What do you expect on the big screen then?
What kind of dumb question is that?
the point is to have higher dpi, you fucking brainlet
4K is literally a scam to sell more expensive GPUs and monitors
32k
480p from a DVD judging by the last few times i've gone to the cinema
I've got a 65" HDTV. I sit 8 ft away from it. Picture looks damn good to me, even upscaled stuff looks decent. 4:3 content is displayed full screen (aka no letter boxing) and it ain't all stretched/distorted. Some show the encoded res is 352x240, even upscale to fit the whole screen the result ain't bad. But again I don't sit so close my nose is hitting the screen. This show in question, there is no DVD release of it, this torrent is all that exists, so your stuck with the resolution given. Just like trying to print a small ass 4x6 jpeg to 24 x 36, sooner or later you can't upscale/convert anymore without the results looking like shit. The pixels/bitrate/resolution just ain't there.
>hdr is so fucking stupidly advertised and explained everywhere
Yes, they lie in the advertisements. No other way to put it.
>you have to see the difference between hdr and non hdr to see the difference,
Yeah and it's honestly nothing special. The only time HDR makes a difference is when the TV has intentionally gimped SDR content by not displaying it in full brightness and saturation.
>and allows more contrast to be displayed at once.
Well no, but like you said it will reduce banding in those situations.
in 10 years time, witcher 3 will look like dog shit compared to zelda though.
art direction trumps realism every time.
we are hitting a limit though. we can't push poly counts much higher, and tessellation done right solves what would normally need larger poly counts.
textures honestly are just fine, all we need is cameras that don't allow you to wall lick, or in bethesda's case, hire the modders to properly uv map your fucking textures.
with the newer apis, they should handle draw counts better, so you could add more grass without it being the massive fuck off choking point it is today.
the last thing we really need is animation that is more fluid, which will take some software to process mo cap better, and path tracing, which with what amd is doing, the rapid math push, may be possible soon. talking to someone who worked on path tracing either a game engine or a quake engine, I forget which one, video game quality path tracing could possibly be done with quarter precision math and not be a detriment to quality. that would effectively turn an amd gpu into a 50tflop card, and brigade demos were done on what was it, 6 tflops?
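For anyone wanting the back-of-the-envelope version of that last claim, the idea seems to be packed quarter-precision math running at roughly 4x the FP32 rate; the ~12.5 TFLOPS figure below is my assumption for a Vega-class card, not something stated above:

# rough sketch of the "50 tflop" claim
fp32_tflops = 12.5          # assumed FP32 rate of a Vega-class GPU
packing_factor = 4          # 4 quarter-precision ops packed per FP32 op
effective_tflops = fp32_tflops * packing_factor
print(effective_tflops)            # ~50 "effective" TFLOPS
print(effective_tflops / 6)        # ~8x whatever the Brigade demos supposedly ran on

Treat it as napkin math; whether quarter precision is actually usable for path tracing without artifacts is the open question.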
What a shill post. 10bit color has been around for a looong time, HDR is a locked down marketing standard.