G-Sync will be an extra $400

Really makes you think

Sure, shill some more NVIDIA stuff

>34" with 1080p vertical resolution
nice ultrawide meme

I actually have the Freesync one.

It's good for watching movies; it's not much worse than a 24-inch 1080p screen.

idk why you would spend big bucks on a monitor with a shit res
kinda regret getting a Dell U2715H (1440p) now, should've saved up and gone 4K.
I guess it's not as important for gaming, so whatever makes you happy

Honestly it really doesn't look bad. I'd never used 1440p/4K monitors before, but I've heard some people can't go back to lower resolutions. It's really blissful when you watch movies though. It's like the cinema.

g-sync is a premium technology for premium users; if you don't like it you can stick with commiesync

I'm pretty sure FreeSync and G-Sync are gamer features that people who only watch movies on their monitors wouldn't even be able to notice

ofc free/g-sync is useless for movies since they're fps-locked.
They meant that particular monitor

When a 27" 4k 120hz HDR monitor is $250 then maybe i'll upgrade

> IPS
> gaming
> 2k
> LG
And these are only some of the wrong things I saw there...

Good little consumer.

stop spamming memes

that just means freesync monitors sell more, and monitor brands want higher margins from the lower sales of G-Sync models

IPS is good for gaming these days. Gaming IPS panels have very low input lag; don't confuse them with basic 60 Hz IPS monitors.

If Vega had been on fucking time instead of massively late, coupled with FreeSync 2, this argument against G-Sync would have been a better one.

Instead, with Vega being massively fucking late, potentially a complete power-hungry housefire of a trainwreck, and with FreeSync 2 being mostly worthless without said new GPU to go with it, G-Sync is the only game in town.

Yeah, it's crazy fucking expensive, but so far it's the best solution given the circumstances.

It's at most 2 months late. AMD can't push products out the door as quickly as NVidia.

>5 ms

buying the G-Sync module from NVIDIA costs the monitor company around $200 each. They resell for $200 more to cover the additional planning, assembly, and so on, plus some profit.

meanwhile freesync requires no additional hardware and is almost free for the monitor company to implement. The scaler just needs firmware support.

>34 inch 1080p monitor

Enjoy your pixels I guess.

A 34" ultrawide is equivalent to a 27" 16:9 panel in terms of pixel density. Still not amazing for 1080p, but not too bad.

> 1080p 21:9
> 34"
Pretty sure you can spot individual pixels at that point. The 21:9 1080p sweet spot is 29".
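The density comparison above is easy to check yourself. A minimal sketch (the sizes and resolutions are the ones from this thread):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 34" 21:9 ultrawide at 2560x1080
print(round(ppi(2560, 1080, 34), 1))  # ~81.7 PPI
# 27" 16:9 at 1920x1080 -- almost identical density
print(round(ppi(1920, 1080, 27), 1))  # ~81.6 PPI
# 29" 21:9 at 2560x1080 -- the claimed "sweet spot"
print(round(ppi(2560, 1080, 29), 1))  # ~95.8 PPI
```

So the 34" ultrawide really does match a 27" 1080p panel almost exactly, while the 29" sits near a standard 24" 1080p screen (~92 PPI).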

depends on the viewing distance

it's a monitor, not a TV

what does gsync actually give you? i'm using a 21" monitor from 8 years ago and it seems fine IMO but i don't know shit

so what is the MANDATORY distance all monitors have to be used @

It's NVIDIA's answer to AMD's FreeSync. Did you ever play New Vegas with the anti-stutter plugin? That's roughly what the monitor has built in, so gameplay stays smoother when your frames dip. It doesn't work with every card though.

haven't done that so i still have no clue what this is about

the only game where i saw any ghosting on my cheap 5 ms IPS was Undertale
when playing anything like CS:GO or other fast games i didn't notice anything wrong

If you drop from 60 fps to 30 you'll see a difference; if you pay NVIDIA a few hundred bucks extra, they can make that difference less noticeable.

It adjusts the monitor's refresh rate to match the frame rate of the video signal. No more tearing or vsync stutter when your frame rate drops.
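A toy model (not any vendor's actual algorithm) shows why fixed-refresh vsync stutters: a frame's on-screen time gets rounded up to a whole number of refresh intervals, so barely missing one 60 Hz tick doubles the frame time, while adaptive sync just refreshes when the frame is ready. The 144 Hz upper bound here is an assumed panel limit for illustration:

```python
import math

REFRESH_MS = 1000 / 60     # 16.67 ms per scanout on a fixed 60 Hz panel
MIN_FRAME_MS = 1000 / 144  # assumed top of the panel's adaptive-sync range

def vsync_display_time(render_ms: float) -> float:
    """With vsync, a finished frame waits for the next refresh tick,
    so its on-screen time rounds UP to a multiple of the interval."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def adaptive_display_time(render_ms: float) -> float:
    """Adaptive sync scans out when the frame is ready (within the
    panel's range), so display time tracks render time."""
    return max(render_ms, MIN_FRAME_MS)

# A frame that just misses a 60 Hz tick (17 ms) stays on screen 33.3 ms
# with vsync, but only 17 ms with adaptive sync.
for render in (10.0, 17.0, 25.0):
    print(render, vsync_display_time(render), adaptive_display_time(render))
```

That jump from 16.7 ms to 33.3 ms on-screen time is the "vsync stutter" the post above refers to; without vsync you avoid the stutter but get tearing instead.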

>at most
>2 months
>when it was hyped since Pascal's release
>when NVIDIA had Pascal out for more than a year before AMD could respond with details about Vega
>when NVIDIA released the GTX 1080 Ti and re-released the TITAN XP to make the GTX 1080 a mid-range card and the GTX 1070 a low-end card so AMD's 500 series could look like complete trash in comparison, as well as make AMD look like fools for still not having Vega out more than a year later
Yeah, no.

so why not just buy better hardware so you won't have these drops in the first place?

>GTX 1080 a mid-range card and the GTX 1070 a low-end card
Your post just kinda fell apart, bro.

Idiot.

>gtx 1070
>$500
>low-end

To be fair, even 1060s are like that right now as well. Everything is super inflated because of the miners.

It's a really awful time to build or upgrade.

user, the problem here is not a lack of frames per second. The problem is that certain parts of gameplay stress the GPU more and can drop frames. It gets especially noticeable when you have lots of post-processing enabled, like ambient occlusion, anti-aliasing (MSAA/FXAA), bloom, etc.

well even without the inflated prices the 1070 was around $400 while the 1060 was around $200-280 depending on the version

>buying nYidia
>ever