Why is 144hz a thing instead of 120hz?

MOAR IS BETTA

144 is divisible by 24

120Hz is a thing if you use ULMB

12 * 12 = 144

144 is divisible by 9.

is that sam hyde?

144 is a cuck refresh rate
240 master race
divisible by 24
divisible by 60

144 is divisible by 2

But it's not divisible by the utility frequency

yeah someone punched that dumb nazi in the face

he has fallen on hard times

Twice as good as what 60hz plebs get plus an additional cinematic 24hz on top.

>dumb nazi in the face
Source?

Holy shit. That's the most JUST I've seen anyone for a while.

120 is also divisible by 24 you retard

not as an integer.

Holy kek

uhh
someone wanna teach this guy division

There were 120Hz panels for a while; it's just a marketing gimmick

also you can set a 144hz panel to 120hz (at least the asus vg248qe can) and that's actually the only way to use the ulmb feature

5 isn't an integer?

5 is a whole number.

I got u senpai

   24
5 |120
  -10
  ---
   20
  -20
  ---
    0
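
Since people are still arguing, here's a trivial Python sanity check — nothing beyond the arithmetic already in the thread:

```python
# Both 120 and 144 divide evenly by 24 fps film content.
for rate in (120, 144):
    quotient, remainder = divmod(rate, 24)
    print(rate, "/ 24 =", quotient, "remainder", remainder)
# 120 / 24 = 5 remainder 0
# 144 / 24 = 6 remainder 0
```

So a 120Hz panel shows each film frame exactly 5 times and a 144Hz panel exactly 6 times. Both remainders are zero, whatever "not as an integer" was supposed to mean.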

...

>human eye can only see 40 Hz
Ey goy, wanna buy some of that 144 Hz for a reasonable premium?

thank you for your contributions user

There are many different standards from multiple markets and even more legacy systems.

They have each argued that their standard is better and everyone should adopt it. Well, now new tech offering new options and growing economies of scale favor a single display standard (because who wants to buy a new monitor for each use case), while market forces keep pushing margins and costs down.

It all came to the point where it got them to the table to agree on a new standard. And 144 was the smallest number that could be easily divided by all the existing standards, so we got 144. From what I know it was a great thing, as it should put an end to all this infighting that has been wasting resources.

Remember how huge USB was? This won't be as big, but it is still huge in terms of a global multi market standard that all the major players agreed on.

Because Overwatch hardcore pros say that there's a difference.

...

Because gamers are gross themselves and respond well to other gross things.

>my $5 Patreon contribution will be there soon, Sam
fucking lol

Which one of you changed his Wikipedia page

Where did you get this 40Hz? From your damaged brain?

ahahaha

144 can't be divided by 60

>who did this

saved

My fucking sides, it's real.

>it's locked

lol

The caption is a joke. I know it seems like it might be real but he's doing fine

i did MUHAHAH BXZ

We haven't used the AC mains frequency for timings since the early 90s.

I love this man

Since wall sockets run on 60Hz, wouldn't refresh rates that are multiples of 60Hz pick up more interference from the power supply? How noticeable would it be?

This is because 1920x1080 @ 144Hz uses the full bandwidth of a dual-link DVI-D connection. Anything above this needs a DisplayPort connector or HDMI 2.0.
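
Back-of-the-envelope numbers on that bandwidth claim — a sketch that only counts active pixels and compares against dual-link DVI's two 165 MHz TMDS links, ignoring the exact blanking timings:

```python
# Active pixel rate for 1920x1080 @ 144 Hz vs. dual-link DVI's ceiling.
# Dual-link DVI-D tops out at two 165 MHz TMDS links = 330 Mpx/s combined.
width, height, hz = 1920, 1080, 144
active_rate = width * height * hz          # pixels per second, no blanking
dvi_dual_link_limit = 2 * 165_000_000      # 330 MHz combined pixel clock

print(active_rate)                          # 298598400 active pixels/s
print(active_rate / dvi_dual_link_limit)    # ~0.90 of the link already used
```

That ~10% headroom is roughly what blanking intervals eat, which is why 1080p144 is about the top of what dual-link DVI can carry.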

6*24(f/s)=144(Hz)

Sharpie in pooper.

And 6 is an integer but 5 isn't?

5 is not a multiple of 2

Pretty sure that's not what makes an integer

Yeah, but I wanted to tell you

>this "viral" self-marketing

Man, the FBI must be closing in on him.

That's fair enough

Kek

Why don't we use, for frequencies
>numbers that are divisible by 1000 (1 sec in milliseconds)
>numbers that are powers of two (32 etc)
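
Mostly because neither of those divides evenly by the frame rates content actually ships at. A quick Python comparison — 128 stands in for the nearest power of two, and the content rates assumed here are film 24 plus PAL/NTSC video 25/30/50/60:

```python
# Which candidate refresh rates evenly divide common content frame rates?
content = (24, 25, 30, 50, 60)
for rate in (120, 128, 144, 240):
    fits = [fps for fps in content if rate % fps == 0]
    print(rate, "evenly fits:", fits)
# 120 evenly fits: [24, 30, 60]
# 128 evenly fits: []
# 144 evenly fits: [24]
# 240 evenly fits: [24, 30, 60]
```

A power of two like 128 fits nothing, so every source would judder.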

>buy 144hz or higher monitor
>set to 120hz

u stoopid?

OP CAN’T AFFORD A 144 HZ MONITOR HAHAHAH

LAUGH AT HIM WITH ME

>Why is 144hz a thing instead of 120hz?

It's a holdover from ca. 2000-era shutter 3D glasses.

72 fps per eye was the lowest nicely divisible frame rate Nvidia found they could get away with that wouldn't cause crippling flicker-induced eye fatigue in most users.

Honestly even 144 Hz (when strobed) isn't quite enough to get rid of flicker problems for everybody. 165 Hz is a bit better, but just jumping to 240 Hz would pretty much put the entire issue to rest forever.

I know we're going to start seeing 3840x2160@144Hz at some point, but I'd actually like a monitor with 3840x2400@120Hz

Hey dude, I have a question for you. What is the answer to 24 x 5?

60hz
100hz
144hz
256hz

These are the most common refresh rates available on the market.
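
For scale, the frame times those rates work out to — just 1000 ms divided by Hz, with 240 swapped in for 256 since that's what actually ships:

```python
# Milliseconds per refresh for each rate: 1000 ms / Hz.
for hz in (60, 100, 144, 240):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per refresh")
# 60 Hz -> 16.67 ms per refresh
# 100 Hz -> 10.00 ms per refresh
# 144 Hz -> 6.94 ms per refresh
# 240 Hz -> 4.17 ms per refresh
```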

Although it doesn't actually matter what number divides exactly into what other number if you have a variable refresh rate.

this is his wiki page RIGHT NOW

...

This is heavily photoshopped right?

no

>256HZ
I have literally never heard of a single monitor with that refresh rate.

What does refresh rate mean, anyway? Does that mean the monitor is blinking 120/144 times per second? Why don't they just make one that stays on all the time?

Hard to argue against that

CRTs blink. LCDs don't.

>40Hz
Don't project your deficiencies onto the whole human race.

cnet.com/news/ultra-hd-4k-tv-refresh-rates/
>The fact is that nearly all of these new 4K TVs -- which now make up the increasing majority of all TVs priced over $1,000 in the US -- have, at best, a 120Hz refresh rate. Actually, many of the least expensive 4K sets are 60Hz, and none that we know about are 240Hz.

120+ Hz is a meme

Refresh rate is the rate at which the monitor refreshes the displayed frame. Each Hz is one refresh per second.

Dunno about 256 but doesn't asus have that 240hz one

Assuming this isn't bait: the simplest explanation is that it's how often a new image can be displayed on the monitor.

B4?

Is that why my monitor buzzes when set to 144?

>only on bright mostly white screens.

>his wiki page still hasn't been reverted

Why would it be? It's a picture he posted online of himself.

>it's fucking real
i'm fucking dead

Wikimods are all no fun allowed faggots.

Sounds like your monitor is fucked. I had a 144Hz monitor from BenQ and that shit was flawless for the whole 3 years.

Then why is everything still 60hz?

In Europe we have a 50 hz grid and still we have 60 hz displays. It's just a number. It doesn't matter anymore since displays generate their own clocks.

Why is it 120/144 and not 100Hz?

This. ~220 images per second is often cited as the limit the human eye can see, or to be more precise, the limit of images per second a human brain can register. So 240Hz is a superior choice.

Didn't all Yurope displays in the 80s and 90s use 50hz? How do you play a PAL snes on a new tv?

It doesn't matter on modern devices because they're made to compensate for the refresh rate.

Ah cool

I might be mistaken but I think all modern TVs support 50Hz modes.

He is probably talking about needing to use a component cable adapter, which most of the time (because it's chink shit) only outputs 60hz. Fucks with the image really bad.

>Sam currently does not have health insurance.
My fucking sides!

>it's still up

And don't accept that they are wrong sometimes.

Wikipedia has simply turned into shit. Instead of editing an article, you can alter it by simply deleting it and rewriting it. That gets rid of the article's entire history and prevents comparing old and new versions with Wikipedia's comparison/diff tool.

Articles on nootropics were long a few years ago and contained all kinds of information. Nowadays the articles for some substances span a few sentences, and there is no history, since the article claims to be "new". Hmmm... who could have an interest in hiding studies showing negative effects of untested medicines/drugs? Maybe the multi-billion-dollar big pharma (out of Russia, in this case) standing behind them?
It's something that could happen to your drive for knowledge as well.

Soon old wikipedia DVDs from years ago will be worth millions...

>year of our lord and savior 2017
>not overclocking your monitor up to 150
>ishygddt

...

*skips frames*

fraps tells me I am getting the 150 fps in some games though

>now it doesn't even have a picture
:( rip