Is 4K ULTRA HD a meme?

5K is the future.

higher resolution isn't a meme

It's the new 1680x1050
5k is what's gonna be the new 1080p in the end.
1440p is the new 1440x900

yeah cuz it's half the product

ULTRA HD PREMIUM is what ya want if ya don't want to get ripped the fuck off

4k ultra hd is not everything.

when you buy a 4K Blu-ray disc, you actually get better color and contrast information in your video as well; it's not only more pixels but also much more color information

On anything smaller than a desktop monitor, yes. It definitely makes a difference on large screens.

8k will be here in 5 years. 8k is the final frontier in resolution. Might as well wait it out.

Yes. 1080p is optimal for 24/27in monitors, which are optimal for most desk configurations.

Except for 10k, 12k, 16k, 18k.

It's great honestly, but the only real use is gaming, streaming, or torrented content. Buying Blu-ray is bullshit price gouging that bundles extra inferior-version copies just to inflate the price.

maybe for computer screens, but for media it is

>27 inch
>1080p

Yikes.

assuming you have an HDR player as well as an HDR TV.

if your TV panel can reproduce 10-bit color depth in H.264 or H.265 content, it already has one advantage.

the rest depends on different technologies like local dimming and a wider color gamut through a phosphor layer, sometimes called quantum dot

Nah. 4k on a 24" display is about 184ppi, which is getting close to the point at which higher resolution is pointless. 1080p is around 92ppi at 24", which is shit tier.
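The PPI figures being thrown around here are easy to sanity-check: pixels per inch is just the diagonal pixel count divided by the diagonal size. A minimal sketch (panel sizes are only the ones under discussion):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 4K vs 1080p on a 24" panel
print(round(ppi(3840, 2160, 24)))  # 184
print(round(ppi(1920, 1080, 24)))  # 92
```

The same function also checks the "91 dpi" claim further down the thread (1080p at 24" is ~92).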

4k netflix content is fucking gorgeous on my 4k tv

never going back to shitty 1080p

all you anti-4k people are in denial

if the media is proper 4k, it looks amazing on a proper 4k display

its DPI is 91, which is acceptable. Everything above 90 is totally fine in my opinion if you're on a budget.


the perfect resolution this decade, in my opinion, is WQHD in terms of price/performance when you start with 27" at 16:9. You only need a tiny bit of temporal anti-aliasing, and games could hardly look any better at decent high FPS and settings with a 350€+ GPU.

>if you're on a budget
But if you have the money, 4k looks better.

last time I ran the numbers, ~6K was the "retina point" for a 24in panel. They'll likely sell 8K panels anyway, since TV panel design is king in this market.

still pictures look better, but when the image is in motion, 144Hz is a much better thing to look at in my opinion.

my opinion might change once 144hz 4k gets available

They already have 144Hz 1440p, the problem is I don't believe there is a video transport capable of enough bandwidth to do 4k 144Hz at the moment.

nope, last I checked it's 240Hz @ 1080p or 60@2160p

the limit is 4096x2160 @ 75Hz or 3440x1440 @ 100Hz with current cables.
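Those limits can be sanity-checked with rough arithmetic. A minimal sketch of the uncompressed pixel data rate, assuming 8 bits per channel and ignoring blanking overhead (real links need a bit more headroom than this):

```python
def data_rate_gbps(w: int, h: int, hz: int, bits_per_px: int = 24) -> float:
    """Uncompressed video data rate in Gbit/s (blanking intervals ignored)."""
    return w * h * hz * bits_per_px / 1e9

print(data_rate_gbps(4096, 2160, 75))   # ~15.9
print(data_rate_gbps(3840, 2160, 144))  # ~28.7
```

~15.9 Gbit/s fits within DisplayPort 1.2's ~17.3 Gbit/s payload, while 4K 144Hz at ~28.7 Gbit/s exceeds both DP 1.2 and HDMI 2.0 (~14.4 Gbit/s effective), which matches the limits quoted above.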

the manufacturers are milking us with their last reservoir of input connection

they could easily design a cable to handle the bandwidth required for 144hz 4k

they just don't because they aren't done milking us for everything they can with 1080p 144hz and 4k 60hz

monitor industry is a fucking JOKE

144hz oled ultrasharps WHEN

they finished designing the DisplayPort 1.4 standard, but they're sure they can still milk us with 1.2a and HDMI 2.0

Animu isn't made in 4K therefore 4K is a meme.

since it's done digitally now, it should be easy to upscale as much as they'd like.

Yeah but nobody on nyaa is gonna seed higher than 720p.

the issue with torrented anime is that it often gets re-encoded by subbers to 720p. Even if the source was 1080p, you might not be able to torrent it that way until the sub teams decide you're worthy

>watching anime

who gives a fuck

its movies, tv shows, basically every other form of media that need to make the collective jump

anime production usually lags behind quite a few years in terms of new resolutions. smaller studios are only now moving to 1080p

Most Hollywood movies aren't even made in 4K. At best, certain A-grade films like Taxi Driver will get a 4K remaster, that's it.

I still don't understand why 1440p isn't the new "HD" standard already.

most of them are, actually. digital projectors have been 4K for years. rumor has it that the movie theater industry is trying to slow down the move to 4K until they can get better equipment and higher res films

No. 8k is the resolution needed to get maximum image quality on a screen that will still fit inside an average person's living room. Or an auditorium. You'd need a 20' screen to start to be able to tell the difference between 8k and 16k. If you could at all.

I watched an anime film in a theater on the big screen and it wasn't until the BD came out that I realized it was mastered at 720p. Feel free to call me blind but if you also saw Kizumonogatari, was it obvious?

Cinemas are really overrated in terms of image quality, which is a shame. Only a few cinemas these days provide a better picture than your living room television.

720p anime looks like ass on my 5k screen :(

idk man, 1080p looks pretty good... doubt 2160p will look any better

still we need a better compression method

h264 kinda sucks for 4k

>streaming
>good quality
pick one

I mean I'd rather run a game with superior rendering algorithms at 1080p than spend over four times the pixel processing to render a slightly crisper image (though obviously the bigger the screen relative to your eyes, the more noticeable the difference), but it's not to the point where only the most obnoxious audiophiles can convince themselves of its merit. Though we're quickly approaching that point.

Nintendo Switch can't do 4k, therefore it's a meme

why does it look so good then?

yes. Maybe I'm just getting old, but even my 23 inch 1080p monitor is overkill

You're a meme.

4K on small screens is probably a meme. on big monitors it's definitely not. it will catch on for gaming once the hardware for it is cheaper.

>Is 4K ULTRA HD a meme?

Yes. And I will explain why.

4k isn't just a meme. Resolution is a meme.

What matters is the clarity of the image. And that comes from many factors separate from resolution. The first factor that matters is the type of lens which was used on the camera that shot the video in the first place! If you don't use a sharp lens then all the resolution in the world will never help you. This is one of, and a big reason why, the "HD" video from your phone doesn't look as good as a lot of SD videos which were shot on conventional video cameras. Because the lens on your phone sucks and the video camera probably has a decent lens for shooting video. And professional equipment is even better.

Now a lot of people on this board are familiar with bitrate as a means of measuring video quality, but there is something else to take into consideration, and that is noise. Video noise gathered by the sensor in the camera when the video was recorded. More noise makes the image seem fuzzier. Cameras have a setting called ISO which roughly equates to how sensitive the camera is to light; the lower the ISO, the less noise you will have in the image, but you will also need more light to get an acceptable image. The size and type of sensor in the camera will also affect noise. If you have a shit sensor you'll have more noise, and that will bottleneck any amount of detail you see in your video. If you're not working with digital you'll be limited by film grain in a similar fashion (and also remember any film has to be scanned and digitized by, guess what: a digital camera, unless you want to watch it on a film projector).

Then there is the issue of chroma (color) sampling, and whether you are looking at 4:4:4 or 4:2:2 or 4:2:0 color subsampling and that will make a difference to the overall quality of the image.

(1/2)

(2/2)

Most of the time you are not going to be looking at 4:4:4 video because the file sizes were inconveniently large in the SD era. But the best cameras recorded in this format and the master tapes and discs from that time still have it and can serve as good source material for further releases.
If you look at SD video samples which were shot with GOOD lenses, good sensors, the right camera settings and operators, and the right post-production pipeline, you will be blown away by the possibilities. Also keep in mind SDTV is not 640x480. It's slightly more than that at 720x480, with the aspect ratio anamorphically corrected by your television (if you have a CRT; if you have a rare LCD SDTV you might be screwed out of some detail).
The quality of your television or monitor matters too. High quality SD video (or downscaled HD video in most practical cases) being played back on a professional studio monitor like a Sony BVM is like looking through a glass window. It helps that many demo reels are shot with great equipment and have excellent contrast and color to make the image "pop" as annoying people love to say. And there is a world of difference between composite video and true RGB.
And all of this is not even starting on frame rates. A 60 fps broadcast with the above qualities can look like you're right there on a large sized CRT like a Sony XBR series. (or a mythical 720x480 40" LCD screen with great contrast that does not actually exist in the real world)

So yes, to a great extent 4k is a meme, and so is any resolution above it if you don't take all the things I've just mentioned into account when viewing 4k video. If they are all equal, of course the higher resolution is the winner. But side by side the difference will not be as much as you think.
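The chroma subsampling figures mentioned above (4:4:4 vs 4:2:2 vs 4:2:0) translate directly into bits spent per pixel. A small sketch of the standard J:a:b accounting, assuming 8-bit samples:

```python
def bits_per_pixel(j: int, a: int, b: int, depth: int = 8) -> float:
    """Average bits per pixel for J:a:b chroma subsampling.

    Over a J-wide, 2-row reference block there are 2*J luma samples,
    plus (a + b) samples for each of the two chroma channels.
    """
    samples = 2 * j + 2 * (a + b)
    return depth * samples / (2 * j)

print(bits_per_pixel(4, 4, 4))  # 24.0 -- full chroma resolution
print(bits_per_pixel(4, 2, 2))  # 16.0
print(bits_per_pixel(4, 2, 0))  # 12.0 -- typical consumer video
```

So 4:2:0 carries half the data of 4:4:4 per pixel, which is why master tapes in full chroma were inconveniently large in the SD era.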

the spelling and gramm0r in this post be distractingly bad.

It's great for programming. I have a 32 inch 4K and it's nice. Not much other use for it besides work.

It doesn't. You're just sitting far away from your screen.

Nope, I have a 4K 27" and a 1080 24" and the difference is night and day.

Text on the 4K panel looks so much sharper; on the 1080 panel I can easily notice the pixel grid pattern at a viewing distance of around 0.5m. On the 4K panel it is quite hard to notice at that distance unless you have pretty damned good eyesight.

1:1 scaling at that size/resolution can be tough to read if sitting further away, but you can fit a hell of a lot more.

For instance if I am to view this thread with my 24" 1080 panel in PORTRAIT mode from topmost part of the page the last reply I can view is
meanwhile with the 4K panel in LANDSCAPE mode at native scaling I can view
4K would be quite taxing for newly released video games; 1440 is more feasible at the moment. But it is more than enough for older games.

For video 4K scales well natively with both 720 and 1080 content, 1440 would struggle with 1080 if you are simply resizing the image to full screen without any software trickery.

144Hz seems better for video that plays at 24fps since it is a direct multiple; the closest multiple above 60Hz would be 72Hz. A lot of panning scenes and the like look juddery on a 60Hz panel, and I couldn't for the life of me figure out how to resolve it with software.
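The judder complaint can be made concrete: each source frame has to occupy a whole number of refreshes, so when the refresh rate isn't a multiple of the frame rate, frames get uneven screen time. A small sketch:

```python
import math

def repeat_pattern(fps: int, hz: int, frames: int = 6) -> list[int]:
    """How many refreshes each source frame stays on screen.

    A constant pattern means smooth motion; an alternating one means judder.
    """
    ratio = hz / fps
    return [math.floor((i + 1) * ratio) - math.floor(i * ratio)
            for i in range(frames)]

print(repeat_pattern(24, 60))   # [2, 3, 2, 3, 2, 3] -- uneven 3:2 cadence, judders
print(repeat_pattern(24, 144))  # [6, 6, 6, 6, 6, 6] -- even, smooth
print(repeat_pattern(24, 72))   # [3, 3, 3, 3, 3, 3] -- even, smooth
```

The alternating 2/3 pattern at 60Hz is the classic 3:2 pulldown, and it's exactly what makes 24fps pans stutter on a 60Hz panel.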

Highest refresh rate I have ever witnessed is 90Hz but I can't notice a difference because that was through the HTC Vive which is stereo.

I'll wait for decent OLED and GPUs capable of 4K/144Hz before even considering an upgrade. It's going to be a long 5~10 years.

is there any specific reason why dell doesn't make 144hz ultrasharps?

they would make a killing

is it like some anti competition shit or something?

Shoot for 1440p instead, 4K is a fucking mess because the MPAA hasn't been glassed.

well panning juddery could theoretically be eliminated with SVP

Dell S2716DG

If you have to have a 50+inch screen and can only sit 5 feet away from it to see the advantages, it's a very expensive meme for plebs that enjoy being ripped off.

On another tangent: the people that make games won't even scale UI and fonts for 1080p. I can only imagine the hell for anons that use higher resolutions and sit far away from their screen.

t. 50" 1080p sitting at ~10ft with usb extenders in comfiness

Yes
t. have a 3k, literally no content even for this resolution

He said Ultrasharps, not TN paneled garbage.

cheers big ears

Also it looks like shit, I own one... The Acer XB271HU sucks with backlight bleed, but I prefer that compared to Dell's 144Hz that looks grainy af

>4K
>doesn't fucking exist
>it's 3.84K if you have to call it that
>technically it shouldn't be referred to by the number of horizontal pixels
>it's 2.16K more than anything else
>stupid fucking people will be the death of us all

Yes. As a computer screen it has been really poorly released
> need expensive gpu
> desktop screens have a soft size limit because of desk size
> 1080p has been around for too long
> no clear flagship technology because of high refresh rate / adaptive sync
> scaling problems

To be successful a product needs to be adopted by normies, anything above 1080p is too complicated for normies

>2.16K
2160p, you fucking retard.

Why wouldn't it look good?
As long as you have a decent connection, it should be good quality.

4K TVs are cheap as fuck anyway. No reason to get full HD in this day and age.

yes, until 144Hz monitors for it exist and the GPUs to drive it, or at least 60ish fps

Fucking retard

Are 8294400 pixels that noticeable over 2073600? Not very often. Is the color gamut and black level far superior? You bet your ass it is.

>It's the new 1680x1050
>5k what's gonna be the new 1080p in the end.
No, it's 8K.
Why not 5? Japan intends to have the necessary broadcasting infrastructure by 2020 (for the Olympics) and continue to use it afterwards.
4K test broadcasting stations are already getting fewer; Japan intends to skip 4K completely.
Sony and some other company (forgot the name) are expected to hit the market with 8K setups this year. Of course they will be pretty expensive, but just wait a bit.
And no, it won't be the new 1680x1050: recent high definition standards (and stuff like that sticks around in the industry) define 4K and 8K.
4K really is the 720p of today; 8K will be 1080p.

*necessary broadcasting infrastructure for 8K

Yes. As simple as that. Stick to 1080p.

Not at all, it's way easier to view portrait length pictures that would otherwise have to be scrolled around

For instance it's very important to be able to sense the true depth of things