How am I supposed to connect my Dual Link DVI Korean monitor?

Other urls found in this thread:

ebay.com/itm/Dell-BIZLINK-DisplayPort-to-DVI-Dual-Link-Adapter-USB-Powered-XT625-/301964564302

Use a DP to DVI converter or just get a proper monitor that has a DP input

Not going to work.

All DP adapters that you can get are single link only even if the chink scammer says it is dual link compatible.

DP -> DL-DVI adapters exist, it's just that they still cost like $100 and need a USB plug for power, etc.
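
The dual-link requirement is pure bandwidth math; here's a quick sketch (the 2720x1481 total timing is an assumed CVT reduced-blanking figure for 2560x1440, exact totals vary by timing standard):

```python
# Whether a mode fits in single-link DVI is a pixel clock check.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Required TMDS pixel clock in MHz for a given total timing."""
    return h_total * v_total * refresh_hz / 1e6

SINGLE_LINK_MAX_MHZ = 165.0               # DVI spec limit per TMDS link
DUAL_LINK_MAX_MHZ = 2 * SINGLE_LINK_MAX_MHZ

needed = pixel_clock_mhz(2720, 1481, 60)  # assumed CVT-RB total for 2560x1440
print(f"2560x1440@60 needs ~{needed:.0f} MHz")               # ~242 MHz
print("single link enough:", needed <= SINGLE_LINK_MAX_MHZ)  # False
print("dual link enough:", needed <= DUAL_LINK_MAX_MHZ)      # True
```

So the Korean 1440p panels genuinely can't work over single-link, which is why cheap passive adapters don't cut it.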

Buy a non-shitty graphics card?

I think you mean "buy a non-shitty monitor". Shitty monitors are laden with DVI and VGA ports everywhere. Quality monitors have a DP in and out.

You're either a troll or retarded. The aftermarket cards always change the I/O config.
>buying reference
>not retarded
Pick one.

>Not using a FreeSync monitor
You might as well not use a Radeon then.

Dude, are you serious? DVI? Do they even make 1440p monitors with DVI only?

They did, and a whole bunch of retards fell for the Korean 1440p panel meme. Now they're stuck with a piece of crap that needs outdated connectors that even back then were on the verge of disappearing. Everybody with half a brain knew what would happen.

Actual quality monitors have both. And HDMI.

3rd party ones have DVI, just wait

THANK YOU BASED NVIDIA FOR GOING TRIPLE DISPLAYPORT BACK IN 2014

Variable sync is just a mitigation method for shitty low framerates.

The real feature that most normalfags don't begin to appreciate is strobing, but that requires running a game with a locked 120/144 fps.

You are retarded.

go cry to your whore mother, you 40 fps poorfag.

>being poor

You bring up a good point OP, I know DP is the future but it's too soon to dump DVI

Ideal connector set IMO is 1x dual-link DVI-I, 3x miniDP.

VGA and HDMI folks can use passive dongles, or buy active adapters if they need more than 1 legacy display.

>Be AMD
>Make poorfag GPU
>Don't put DVI on it so the poorfags that buy it can't connect it to their ancient poorfag monitors
The absolute madmen

I have a 7970

Will I even benefit from buying this thing and selling my thing?

anybody who spends less on displays than on GPUs is a moron with zero sense of tech depreciation.

Buy a model of the GPU from a different maker. Different companies put on different connectors.

> 7970: 30 GP/s, 118 GT/s, 264 GB/s, 3.8 TFLOPS, 250 W
> RX480: ?? GP/s, ?? GT/s, 256 GB/s, 5.5 TFLOPS,
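
Those TFLOPS figures fall out of shaders x 2 ops/clock (FMA) x clock; the 7970 numbers below are its published specs, while the RX 480 shader count is a rumored figure, so treat that part as an assumption:

```python
# Peak FP32 throughput for a GCN card: shaders x 2 ops/clock (FMA) x clock.
def tflops(shaders, clock_ghz):
    """Peak single-precision TFLOPS, assuming one FMA per shader per clock."""
    return shaders * 2 * clock_ghz / 1000.0

# HD 7970: 2048 shaders at 925 MHz -> matches the thread's 3.8 TFLOPS.
print(f"HD 7970: {tflops(2048, 0.925):.1f} TFLOPS")
# Working backwards from the 5.5 TFLOPS figure with the rumored
# 2304-shader count (assumption) implies a clock of roughly:
print(f"implied RX 480 clock: {5500 / (2304 * 2) * 1000:.0f} MHz")
```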

If I'm a poorfag casual, will this card be worth buying?

I would see what Nvidia is offering in the mid range first.

Cards will probably come soon.

very, very likely yes.

Nvidia priced the 1070 and 1080 expecting this to sell for $300-$350, not $200.

Unless a 1050/1060 is released or 1070 prices are slashed ASAP, Nvidia's sales are going to be wrecked by this.

is this a good monitor to go with the new gpu?? It being out of stock does say something

That 4k shit is useless, better get a 24 inch 1440p monitor. Seriously, why even have FreeSync on a 60Hz monitor, btw 5ms response time, kek!

Yeah, get 1440p 144Hz IPS Freesync.

Customs will have DVI 100%. All necessary elements are on the card.

It might be the best poorfag card ever made. Time will tell.

How exactly is "that 4k shit" useless? Satisfied owner of a 4k monitor here, please tell me.

read

get a non reference model with dvi

he means for gaming, since you cant play on native resolution (maybe if you buy a 1080, but still would have below 50-60 fps on modern games)

I have a 980ti, I can play at 3820p 60fps just fine on 99% of everything I've tried. Most people think 4k is harder to run than it actually is, because they try maxing it like you would for 1080p, but at 4k you don't need nearly as much AA, so crank that down and you'll see no noticeable difference in quality, but you'll gain a good FPS bump.

Is free sync/g sync a meme? I don't know what that shit is

*3840x2160

You can't differentiate 1440p vs 4k on a 24 inch monitor anyways.
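
The claim is about pixel density; a quick PPI comparison using the standard diagonal-based formula:

```python
import math

# PPI = pixel diagonal / physical diagonal. Same panel size with more
# pixels -> higher density, which is the whole 24" 1440p-vs-4k argument.
def ppi(w_px, h_px, diag_in):
    return math.hypot(w_px, h_px) / diag_in

print(f'24" 1440p: {ppi(2560, 1440, 24):.0f} PPI')
print(f'24" 4k:    {ppi(3840, 2160, 24):.0f} PPI')
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')
```

Whether ~122 vs ~184 PPI is distinguishable at desk distance is exactly what the posters below go on to argue about.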

Playing 4k at 60FPS is stupid because higher framerate is always better, especially if you can't even notice the resolution difference.

Kind of - it's a good feature that should become standard, but being such a new technology it's still ridiculously overpriced for what it is. It essentially does the same thing as v-sync (eliminates screen tearing) without the lag caused by it. If you think the few ms of lag is worth a few hundred dollars, then that's your prerogative. For most people, however, that's just unnecessary.
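
The "few ms of lag" from v-sync is bounded by the refresh period; a minimal sketch of the worst case:

```python
# Worst case for v-sync on a fixed 60Hz display: a frame that just
# misses a refresh has to wait a full refresh period before it's shown.
refresh_hz = 60
worst_case_ms = 1000.0 / refresh_hz
print(f"up to {worst_case_ms:.1f} ms of added latency")  # ~16.7 ms
# Adaptive sync flips this around: the display waits for the frame,
# so a finished frame is scanned out immediately instead of queuing.
```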

If you gayme it's not a meme: no input latency or tearing with 9-144FPS (depends on the monitor, mostly it's 38-144Hz, aka fine with 38-144FPS)

It's silky smooth and much better than v-sync input-wise.

Wrong. I upgraded from a 1440p to 2160p, and still have both running side by side. You can absolutely see the difference.

If you game on a budget get a AOC monitor, those monitors are like 150€.

>tfw instead of going for 1440p I went 144Hz

Did I fuck up Sup Forums?

You can also see the difference between 144Hz/144FPS and 60Hz/60FPS.

I have tried all kinds of monitors; a 1440p with overscaling and 144Hz running games at 144FPS is a much greater experience than 2160p at 60FPS.

You know there are 1440p 144Hz monitors, right?

Not in the same price range it's not. The cheapest 144Hz 1440p cost $100-150 and had shittier color accuracy than my 144Hz TN.

1080p at 144Hz and 2160p at 60Hz are the only good choices. 1440p is a meme and will be the new 720p - obsolete and discarded as 4k becomes the new standard.

>Buy an ultrasharp
>has DP/HDMI/mini-DP

DVI is dead. Fuck off.

use a non-reference card which has DVI?

If that's your personal experience, more power to you. Personally, I had scaling issues all the time with 1440p outside of gaming, and got fed up with dealing with it. That's the single reason I recommend 4k to people over 1440 - it scales perfectly from 1080.
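
The scaling complaint comes down to integer vs fractional factors; a minimal illustration:

```python
# 1080p -> 2160p doubles each dimension exactly, so one source pixel
# maps to a clean 2x2 block. 1080p -> 1440p is a 1.33x factor, so
# source pixels get smeared across neighbors and the result looks soft.
def scale_factor(src_px, dst_px):
    factor = dst_px / src_px
    return factor, factor.is_integer()

print(scale_factor(1080, 2160))  # integer factor: sharp
print(scale_factor(1080, 1440))  # fractional factor: blurry
```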

>he means for gaming, since you cant play on native resolution (maybe if you buy a 1080, but still would have below 50-60 fps on modern games)

You can play at native 4k, the only thing you need to do is disable AA (which is pointless at 4k anyway since you have a way higher DPI and jaggies aren't that visible).

I have been using 1440p/144Hz for two years now.
Way better than 1080p/144Hz, how can you deny that? 4k monitors just aren't ready yet; they need to run at 144Hz first.

but it's still at most 60 fps, and after being used to 144 fps for years you just can't go back to 60...

Well a gtx 1080 won't get you 144fps either so I don't see the problem.

Yes I can.

Not true, you need an active DP to dual-link DVI adapter. They usually need USB for power. I use one to connect to my Dell 3007 WFP. I got the adapter used for $35. Here is an example:

ebay.com/itm/Dell-BIZLINK-DisplayPort-to-DVI-Dual-Link-Adapter-USB-Powered-XT625-/301964564302

Vega will.
It's designed for 5k, so there's a pretty damn good chance we'll see solid >60Hz 4k.
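
Whether >60Hz 4k fits comes down to link bandwidth; a rough sketch ignoring blanking overhead (the HBR2/HBR3 payload rates are the DP 1.2 / DP 1.3 figures after 8b/10b coding):

```python
# Raw pixel data rate in Gbit/s at 24 bits per pixel, no blanking.
def gbps(w, h, hz, bpp=24):
    return w * h * hz * bpp / 1e9

HBR2, HBR3 = 17.28, 25.92  # DP 1.2 / DP 1.3 payload bandwidth, Gbit/s
for hz in (60, 120, 144):
    need = gbps(3840, 2160, hz)
    print(f"4k@{hz}: {need:.1f} Gbit/s "
          f"(HBR2 ok: {need <= HBR2}, HBR3 ok: {need <= HBR3})")
```

So 4k@120 only became plausible with the newer HBR3 link rates, which is why nothing in 2016 shipped it yet.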

You can easily run 144FPS at 1440p even on med class cards.

Why are the clock speeds on this thing so low?

I thought they used a die shrink? Is GCN already stretched to its limits?

4k monitors can't run at 144hz, but they can run at 4k. I've personally found that 60fps is plenty, and I prefer the trade off for better visual fidelity.

On low settings yeah

And high with a good card

Are you insane, 144hz on 1440p in games like Witcher 3 and Battlefront with med class cards?

lol nope

Is a 5ms response time plenty too?
The trade-off of responsiveness, blurring, and low FPS I think is not worth an upgrade from 1440p to 2160p, especially if you get a monitor with high DPI.

They don't need high clocks; just like how the Athlon64 at 2GHz was as fast as the P4 at 3.5GHz.
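
That comparison is just performance ~ IPC x clock; a one-liner sketch with an assumed baseline IPC:

```python
# Performance ~ IPC x clock, so a 2 GHz chip matches a 3.5 GHz chip
# if its IPC is 3.5/2 = 1.75x higher. The baseline IPC of 1.0 for the
# P4 is an arbitrary normalization, not a measured figure.
def perf(ipc, clock_ghz):
    return ipc * clock_ghz

p4 = perf(1.0, 3.5)
athlon64 = perf(1.75, 2.0)
print(p4 == athlon64)  # True: same throughput at a much lower clock
```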

Mad class? I said good card, not med.
Aka high settings with high end (good) card.
I mentioned med before when you said low settings.

>muh giguhurtz
Kek'd

>Mad class? I said good card, not med.
No you did say med cards
That doesn't answer the question: why aren't the clock speeds higher?

And then I said good card; med being an OK card, good being high end. Misunderstanding.

Literally autistic

That still won't cut it; even 980 Tis don't get 144Hz at 1440p in games like Witcher 3, unless it's on low settings.

They don't have to get 144Hz, but 144FPS; also, with an adaptive sync monitor, playing even at 120FPS is silky smooth.
Yeah, maybe a single 980 Ti won't do Witcher 3 on Ultra settings at that framerate, but Fallout 4, for example? Totally.

xd