4K+60hz
or
1440p+~144hz

Other urls found in this thread:

anandtech.com/show/2794/2

4k

this
anyone who disagrees should go back to

even at 27"?

No, get the 28" 4K Samsung monitor that is on Amazon right now. Goes for about 319 GBP; no idea what it's going for in USA dollaroos though.

1440p @144hz is better for gaming

4k for work
2k for games

>no adaptive sync
nope

It's 60hz; if you can't power that, you shouldn't be getting a 4K monitor

I don't think you know what vsync or adaptive sync does

Nothing at 60fps, which is where you should be

How does this differ from just enabling it in the GPU control panel?

The monitor needs dedicated hardware to use adaptive sync. Vsync can be done on any screen, but only at the refresh rate or integer divisions of it (60 and 30, or 144 and 72). Adaptive sync is basically vsync but at any fps level, so you don't have to worry about holding max fps quite as much.
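To make those divisions concrete, here's a toy Python calculation (my own sketch, not any real graphics API) of the fps classic vsync snaps you to for a given frame render time:

import math

# With classic vsync, a frame that misses a refresh waits for the next
# one, so effective fps snaps to refresh_hz divided by a whole number.
def vsync_effective_fps(refresh_hz, render_ms):
    refresh_ms = 1000.0 / refresh_hz
    intervals = max(1, math.ceil(render_ms / refresh_ms))  # refreshes spent per frame
    return refresh_hz / intervals

for ms in (10, 17, 20, 35):
    print(ms, "ms render ->", round(vsync_effective_fps(60, ms), 1), "fps at 60hz")
# 10 -> 60.0, 17 -> 30.0, 20 -> 30.0, 35 -> 20.0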

That actually makes a lot of sense.

Is it not possible to emulate adaptive sync by having the GPU time-slice frames in such a way that it appears to refresh at any fps, using whatever combination of time interval and frames being pulled?

that causes screen tearing

and you've confirmed you don't know what v-sync or adaptive sync does

Monitors that don't have adaptive sync only have one refresh rate. You're comparing a monitor that can only refresh at 60hz with one that's made to refresh anywhere from 10-60hz.

Even with triple buffering or prerendered frames enabled?

Seems strange to me. I can change the refresh rate of my chink monitor to whatever I'd like, not just in fixed intervals. I'm not seeing why all this can't be done from the GPU and OS side, but I'm probably missing something here, or it's just a gimmick. Dunno.

You can do that, but because it isn't synced to the monitor's refresh rate, some frames will get interrupted by new frames mid-scanout, causing screen tearing
The only thing g/freesync does is prevent screen tearing, but at any fps
Vsync prevents screen tearing by adding micro-lag to hold each frame until the next refresh, even when the GPU's framerate is too high
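To make the tearing part concrete, here's a toy model in Python (my own sketch, no real display API) of where the tear line lands when an unsynced GPU swaps buffers mid-scanout:

# Toy model: the panel scans out top-to-bottom once per refresh; an
# unsynced buffer swap partway through a scanout puts a tear at that height.
REFRESH_MS = 1000.0 / 60.0  # 60hz panel

def tear_positions(frame_times_ms):
    """Fraction down the screen where each unsynced swap lands."""
    positions, t = [], 0.0
    for ft in frame_times_ms:
        t += ft
        positions.append((t % REFRESH_MS) / REFRESH_MS)
    return positions

# GPU pushing ~90fps into a 60hz scanout: the tear line wanders down the screen
print([round(p, 2) for p in tear_positions([11.1] * 6)])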

27" 4k is kinda dumb, I'd go 1440p higher refresh rate

Well, you start buffering frames and you introduce quite a bit of input lag

But he's right: if you have 60fps on a 60hz screen, gsync won't do jack shit for you
But it's also really hard to hold 60fps at 4K, meaning it would probably be useful to have

good luck maintaining that near perfect 60fps without going too far over or under

I think someone has it mixed up

Frames > refresh rate = Vsync
Frames < refresh rate = Gsync

So cap your frame rate just under your refresh rate so you stay in the Gsync range, as sketched below.
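In practice that just means an fps limiter (RivaTuner-style) set a few frames under the ceiling. A sketch of the rule, with illustrative numbers that aren't from any vendor tool:

# Sketch of the usual capping rule for a VRR monitor: stay a few fps
# under the panel's max so the driver never leaves the adaptive sync
# window and falls back to vsync behavior (frames > refresh).
def pick_fps_cap(refresh_hz, margin=3):
    return refresh_hz - margin

print(pick_fps_cap(144))  # 141, the cap people commonly set for a 144hz panel
print(pick_fps_cap(60))   # 57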

I'm only saying this because I buffered a few frames in Witcher 3 and ran the adaptive sync option even though I don't have the monitor for it. The input lag with, say, 2 buffered frames is completely unnoticeable to me. It worked too, much better than uncapped.

The monitor has its own refresh rate (usually it cannot be changed)
The GPU has its own framerate output (which can be changed)
If your GPU puts out fps exactly as fast as the monitor's Hz, good for you, you don't need gsync at all
If the GPU is pushing out frames faster than the monitor can refresh them, you're gonna lose frames or get tearing
If the GPU is pushing out frames slower than the monitor is refreshing, you're gonna get screen tearing or stutter
Gsync makes it so the monitor changes its refresh rate to match what the GPU is putting out (up to a certain limit): no screen tearing, no lost frames
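Those three cases, restated as a toy Python function (the names and the 30fps VRR floor are my assumptions, not from any spec):

# The three situations from the post above, as a toy classifier.
def sync_situation(gpu_fps, monitor_hz, vrr_floor=30):
    if gpu_fps > monitor_hz:
        return "GPU outruns the panel: dropped frames or tearing without a cap"
    if gpu_fps >= vrr_floor:
        return "inside the VRR window: the panel retimes to the GPU, no tearing"
    return "below the VRR floor: the panel falls back to fixed-rate behavior"

for fps in (90, 60, 45, 20):
    print(fps, "fps on a 60hz g/freesync panel ->", sync_situation(fps, 60))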

The point of variable refresh rate is that it's variable: your monitor is matched to your fps, but both are still variable, not "locked"

You're probably just using vsync
Just use vsync and aim for 60 or 30 fps; you don't have an adaptive refresh rate monitor

Still unclear to me. Not saying I'm necessarily right; I've just tested a lot with adaptive sync, RivaTuner, etc.

My chink monitor (QNIX 2710) lets me set it to ANY refresh rate. I'm at 110 as it's the sweet spot for me. Why can't it do the same thing as adaptive sync if I have both OS and GPU control along with the monitor's refresh rate?

My only hypothesis as to why this can't be real adaptive sync is that the interval I chose (110) is just a false refresh rate emulated through the various ratios possible. But still, with buffered frames the adaptive sync DOES work, and it's noticeably better than uncapped and definitely not vsync. The added input lag is small enough that I can't notice it at all.
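For scale, the worst-case delay that buffering adds is simple arithmetic (computed here, not measured on this monitor): each buffered frame can sit for up to one refresh interval.

# Each buffered frame can wait up to one full refresh before being shown.
def buffer_latency_ms(refresh_hz, buffered_frames):
    return buffered_frames * 1000.0 / refresh_hz

print(round(buffer_latency_ms(110, 2), 1))  # ~18.2 ms worst case at 110hz
print(round(buffer_latency_ms(60, 2), 1))   # ~33.3 ms worst case at 60hz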

I don't even game much, but it seems like a gimmick to me. Maybe it's just a cheap implementation for hardware manufacturers, but then why don't they all do it?

Online I'm seeing it's 60hz but "overclockable"
That's not variable refresh rate, that's just adjustable: you can make it 90hz, but you can't have it change in-game like with adaptive sync. The software side can be made to output any fps you want; that still doesn't change the monitor's refresh rate.
Do you know exactly what your monitor's refresh rate is right now, or can you adjust/overclock it?
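The difference in one toy contrast (pure illustration, no real display API): an overclocked fixed rate is set once and stays put, while a VRR panel retimes every single refresh to whenever the frame actually lands.

# Illustration only: a fixed panel holds one rate until reconfigured;
# a VRR panel effectively changes rate frame by frame.
fixed_hz = 110                       # set once in the driver, then constant
frame_times_ms = [9.0, 14.5, 11.2]   # what the GPU actually delivered

for ft in frame_times_ms:
    vrr_hz = 1000.0 / ft             # VRR scans out the moment the frame lands
    print(f"frame took {ft} ms -> VRR refreshes at {vrr_hz:.0f}hz, "
          f"fixed panel ticks on at {fixed_hz}hz regardless")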

Buffered frames is vsync, you dummy

Triple buffering is not v-sync, user.

Honestly the terminology with all of these is so ridiculous and convoluted, probably necessary considering the number of techniques possible.

Yeah, it's overclockable. I think you're implying that I can't change it instantaneously to cap at x fps? Dunno about that, you might be right.
>That's not variable refresh rate, that's just adjustable
Isn't adjustable just variable refresh rate being applied constantly with a time interval?

Triple buffering seems to be a slightly delayed version of adaptive sync from what I'm gathering. Not a big concern to me, I just thought it was interesting. From a lot of testing, triple buffering was amazing at eliminating the tearing in Witcher 3, with unnoticeable input lag, and smooth as fuck too. Maybe adaptive sync does it all slightly better; I'd be impressed if an adaptive sync monitor did better in a noticeable way. It costs a tiny bit of VRAM for the buffered frames, but that's not a big factor at all to me.

anandtech.com/show/2794/2

Really interesting article about triple buffering and some others.
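The gist of the article, as my own sketch (not the article's code): keep two back buffers so the GPU never waits, and at each refresh flip to whichever back buffer finished most recently.

# Sketch of the triple-buffering idea: two back buffers so the renderer
# never blocks, and each vblank displays the freshest completed frame.
back_buffers = [None, None]  # front buffer being scanned out is implicit
freshest = None              # slot holding the most recently finished frame

def gpu_finish_frame(frame):
    """Render into the slot NOT holding the freshest frame, then mark it."""
    global freshest
    slot = 0 if freshest != 0 else 1
    back_buffers[slot] = frame
    freshest = slot

def vblank_flip():
    """At each refresh, show the most recently completed back buffer."""
    return back_buffers[freshest]

# The GPU finishes frames 1..5 between vblanks; stale frames get overwritten
# instead of stalling the renderer (which is what plain vsync would do).
for frame in range(1, 6):
    gpu_finish_frame(frame)
print("displayed at next vblank: frame", vblank_flip())  # frame 5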

I have both, I prefer the 1440p@144 Hz screen.

Just google these terms, they're very simple to understand. Your monitor has neither a freesync nor a gsync module; it doesn't even have the right inputs to transmit that metadata. It's a fixed refresh rate monitor, probably bought used, that somebody made run at non-60hz. Vsync is delaying the frame to match the refresh rate, and that's exactly what you're doing; with your monitor at 110hz, vsync would probably drop you to 55fps whenever you miss a refresh.
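That 110-to-55 step is just the divisor math again; a quick check (render times here are made-up numbers):

import math

# A frame even slightly longer than one 110hz interval (~9.09 ms) waits
# for the next refresh, so vsync lands on 110 / 2 = 55 fps.
refresh_hz = 110
for render_ms in (9.0, 9.5, 12.0):
    n = max(1, math.ceil(render_ms / (1000.0 / refresh_hz)))
    print(render_ms, "ms ->", round(refresh_hz / n, 1), "fps")
# 9.0 -> 110.0, 9.5 -> 55.0, 12.0 -> 55.0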

I prefer 1440p at 165hz

>Your monitor has neither freesync nor gsync modules
I know this, that's not my point.

>it's a fixed refresh rate monitor you probably bought used that somebody made non-60Hz
It's overclockable and not bought used. It's a very common value Korean monitor. I can change the refresh rate at any time, and yes, they're real refresh rates: 110, 90, and 60hz are all noticeably different configurations, both within Windows and in games.

I see the point of freesync now that I've read further into it. My entire argument was that it seems (and from some testing) that you can emulate the same thing at the cost of a wee bit of VRAM and some time delay. I'd have to get a freesync monitor, as I'm curious whether it's better than my current control panel configuration; the tearing and lag aren't noticeable now, but they very much were in the vanilla control panel config. Not saying I've found some super secret built-in freesync.

I wonder how much it costs monitor manufacturers to include freesync/gsync technology. Anyone know?

Are you gaming or a liberal arts fag?