VGA/DSUB & CRTs

>Best GPU I can find with a native VGA port is the GTX 750 Ti

Fewer and fewer GPUs come with a VGA port natively on the card.

They're trying to take my CRT away from me, guys. I love my CRT. There's nothing wrong with it and I want to use it until it finally goes bust.

There are adapters, aren't there?

Switching from CRT to LCD was one of the greatest moments of my life. What's wrong with you?

amazon.com/StarTech-DVI-Cable-Adapter-DVIVGAMF/dp/B000067SOH

you can't buy one of these why?

...

>CRT
enjoy your eye cancer you hipster dumbass

Kill yourself user.

VGA is a 30-year-old technology that has been replaced 5 times over.

If you're still using a CRT you obviously don't care about visuals anyway and you could probably just get by with Intel graphics.

How does that even work? How do you convert from digital to analogue with such a simple and cheap device?

>Switching from CRT to LCD was one of the greatest moments of my life
This

I'd like to add getting a Dreamcast to that list too

Because DVI natively carries VGA as well.

That's the reason it has so many pins.
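Here's roughly what that adapter is doing: pure pin routing, no conversion. Sketch in Python just to lay out the mapping; pin assignments are from the published DVI-I and DE-15 (VGA) pinouts, so treat it as a cheat sheet rather than gospel.

```python
# A passive DVI-to-VGA adapter just reroutes the analog pins on a
# DVI-A/DVI-I connector to the matching VGA pins. No signal conversion.
DVI_TO_VGA = {
    "C1 (analog red)":    "1 (red)",
    "C2 (analog green)":  "2 (green)",
    "C3 (analog blue)":   "3 (blue)",
    "C4 (analog h-sync)": "13 (h-sync)",
    "8  (analog v-sync)": "14 (v-sync)",
    "C5 (analog ground)": "6/7/8 (RGB returns)",
}

for dvi, vga in DVI_TO_VGA.items():
    print(f"DVI pin {dvi:22} -> VGA pin {vga}")
```

The digital TMDS pins simply aren't connected to anything, which is why the adapter costs pocket change.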

>tfw the switch ruined my xXxnoscopexXx skillz for a while.
>tfw my aim on the CRT was heaps better.

There is no signal conversion being done by that adapter. It's just a connector converter.

cause DVI carries a VGA signal too buddo

now stop being a memester and get a real monitor

how have you never heard of DVI to VGA adapters? every card i bought since ~2005 had one shipped with it.

DVI or HDMI while gaming on PC?

Does it even matter

>i don't know about the dvi-vga adapter that was bundled with every single graphics card for the last 15 years

how fucking young do you have to be in order not to have at least 10 or more of these adapters sitting in a drawer?

This, I got one with every GPU I ever bought. What's wrong with you, OP?

You may want to google the differences between DVI-A, -I and -D

Just a note so OP doesn't go buy a card that doesn't do the job, DVI has 3 different flavours, A, I and D. A and I support analog and I and D support digital. I've never seen an A one (cause why bother?) but I is quite common. That said, I believe the 10x0 range only supports DVI-D, so the adaptors won't work.
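To put the flavours in cheat-sheet form (a rough sketch, Python just for notation):

```python
# The three DVI flavours and what they carry. A passive VGA adapter
# only works when the port actually has the analog pins.
DVI_VARIANTS = {
    "DVI-A": {"analog": True,  "digital": False},  # rare: analog only
    "DVI-I": {"analog": True,  "digital": True},   # "integrated": both
    "DVI-D": {"analog": False, "digital": True},   # digital only
}

def passive_vga_adapter_works(variant: str) -> bool:
    """A passive DVI->VGA adapter just reroutes the analog pins,
    so it only works if those pins exist on the port."""
    return DVI_VARIANTS[variant]["analog"]

assert passive_vga_adapter_works("DVI-I")
assert not passive_vga_adapter_works("DVI-D")  # e.g. the GTX 10x0 cards
```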

For AMD, it appears the RX 480 doesn't even have DVI.

>never had a DVI cable cause they were too expensive
>stole some from work one day
>mfw I swapped the vga cable in my PC at home

It's been 5 years and every time the boss wants to speak to me I still think it's about those cables.

He just wants your dick.

I expected amazing improvement when I finally switched from vga to dvi.
I couldn't notice a single difference.

fucking neurotypicals
your boss would only commend you for that

vga only starts losing quality past a certain resolution

i can see a good difference on my 24" 1920x1080 LCD

couldn't see any difference on my old 19" 1680x1050 LCD
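Makes sense if you do the math. The pixel clock the analog link has to push grows with resolution times refresh rate, so cable and DAC flaws that are invisible at low res start to smear at high res. Back-of-envelope sketch, assuming ~25% blanking overhead since real timings vary:

```python
# Rough estimate of the analog bandwidth a VGA link needs. The ~25%
# blanking factor is a ballpark assumption; actual modelines differ.
def approx_pixel_clock_mhz(w: int, h: int, hz: int, blanking: float = 1.25) -> float:
    return w * h * hz * blanking / 1e6

for w, h in [(1024, 768), (1680, 1050), (1920, 1080), (2048, 1536)]:
    print(f"{w}x{h}@60: ~{approx_pixel_clock_mhz(w, h, 60):.0f} MHz pixel clock")
```

Every one of those MHz has to survive the cable and connectors as a clean analog waveform, which is why quality falls apart past a certain point.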

I'm so glad for DVI and legacy standards.

I'm still running a CRT as a secondary monitor. Not because of the picture quality, or the refresh rate, although they are miles better than my top-of-the-line LCD. But because I like light gun games.

And thanks to ports, emulation, and the odd rare peripheral, I can play the majority of them on my Windows 7 rig without even having to move. I just pick up my gun, switch my primary monitor, and I can blast zombos, criminals, or raptors in the face all day long.

But don't you know that lightgun games ruin the TV, user? Stop doing it immediately!

I still don't understand if I am supposed to use HDMI for my desktop or something like DVI for the best picture.

It's digital, they're identical. HDMI carries audio too, that's about the only positive.

the best one is DisplayPort, but it is equivalent to HDMI if you are not doing 4K

Not really, DisplayPort just works (tm) while HDMI carries over stupid legacy bullshit like the limited color range and overscan, which might or might not end up being on by default depending on your luck and the current orientation of the planets.
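To show what limited range actually does: black is 16 and white is 235 instead of 0 and 255, and if one end guesses the range wrong you get grey blacks or crushed shadows. Sketch of the standard expansion back to full range:

```python
# HDMI "limited range" puts black at 16 and white at 235. This is the
# standard linear expansion back to full-range 0-255 PC levels.
def limited_to_full(v: int) -> int:
    full = round((v - 16) * 255 / (235 - 16))
    return max(0, min(255, full))  # clamp out-of-range inputs

print(limited_to_full(16))   # 0   (video black -> PC black)
print(limited_to_full(235))  # 255 (video white -> PC white)
print(limited_to_full(128))  # 130 (mid grey, stretched slightly)
```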

you're right, sorry
Also DP has support for higher resolutions and framerates, if I remember correctly

>dvi
fucking horrible connector, I don't know who thought it was a good idea

>we'll take Dsub
>we'll make it bigger
>we'll make it so there are different pin combinations
>ports will be different as well
>now you can buy something that looks right, but it may or may not fit the connection you want to plug into because there are 3+ different versions of the "standard" that only superficially look identical

Are Asus and Acer monitors good?

>no signal conversion being done by that adapter. It's just a connector converter.
DVI-A carries the analog signal, but other versions of DVI don't

If you live in the UK I'll sell you my 1080p60 24 inch for £50

DVI-I does as well.
If it has the little pin cross it works with analog

Okay noob siddown.
I'm going to reach my hand out to you in hopes you learn something.
DVI is natively digital. There are adapters out there that are a total scam in that they DO NOT COME WITH the pins needed to carry the analogue signal over. How do you tell?

Look at the 4 pins around the horizontal bar... if it has pins there then it's VGA compatible.

If it does not it's Chinese garbage scam.
Also look at your video card and make sure it has the female DVI-I and NOT DVI-D.

Posting examples of ones that work and don't work next.

L O N D O N

Switching from LCD to plasma was like a creampie of colors I never knew even existed displayed in my video games. It's like receiving a free graphical update that doesn't have a performance hit.

You're in London? Or you're just memeposting?

This one has no 4 pins and is therefore pure garbage that needs to be tossed in the trash. It simply doesn't work, as it doesn't carry any analogue signal at all.

Why do they make these you ask? Because unsuspecting customers buy them for cheap thinking they work when they don't at all.

This almost identical-looking adapter, with 4 pins around the horizontal bar, is guaranteed to work and is not a scam. When you buy one of these adapters, confirm it has the 4 pins shown in the photos.
Good luck to you.

Some versions of DVI do, not all

It really depends heavily on the quality of the VGA cable. I had some that would make my PC->TV output look like utter shit and some that looked as crystal clear as HDMI did.

It does. The downside of DP is that it doesn't carry audio, and I'm not even sure that's true for the more recent versions.

it actually does now
source: my new monitor shits sound through it

It indeed can.
>[...] the audio path can have up to eight channels of 24-bit 192 kHz uncompressed PCM audio or can encapsulate compressed audio formats in the audio stream.
It can also do USB and a whole lot of other things; since it's packet-based, adding new features to the standard is easy.
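Quick arithmetic on those quoted figures, just to show how little of the link the audio actually eats:

```python
# Worst-case audio stream from the spec quote above: eight channels
# of 24-bit / 192 kHz uncompressed PCM.
channels, bit_depth, sample_rate = 8, 24, 192_000
audio_mbps = channels * bit_depth * sample_rate / 1e6
print(f"{audio_mbps:.2f} Mbit/s")  # 36.86 Mbit/s -- a rounding error
# next to the ~17,280 Mbit/s a 4-lane DP 1.2 main link can move.
```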

I don't know shit about this topic, how come (for example) an HDMI cable made 10 years ago can work as well on HDMI 2.0 as 1.0, despite the former having a much larger maximum bandwidth?

Fuck, I just noticed it actually uses a separate auxiliary channel for non-audio/video things. Oh well, at least it does actually have USB support baked in the standard.

The cable hasn't changed that much, the connections and pinouts are still the same. 2.0 gets the performance boost by cranking up the clock rate. I'd wager the version branding is mainly to attract consumers, though it might also have something to do with more stringent EM shielding requirements or something.
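Back-of-envelope on the clock cranking, using the TMDS scheme HDMI has always used (3 data channels, 10 bits on the wire for every 8 bits of pixel data):

```python
# Same wires, faster clock: HDMI carries 10 bits per clock on each of
# its 3 TMDS data channels, of which 8 bits are actual pixel data.
def hdmi_gbps(tmds_clock_mhz: float) -> tuple[float, float]:
    raw = tmds_clock_mhz * 1e6 * 10 * 3 / 1e9  # line rate on the wire
    data = raw * 8 / 10                        # after TMDS 8b/10b coding
    return raw, data

for version, clock in [("HDMI 1.0", 165), ("HDMI 2.0", 600)]:
    raw, data = hdmi_gbps(clock)
    print(f"{version}: {raw:.2f} Gbit/s raw, {data:.2f} Gbit/s of pixels")
```

Same connector, same pinout, roughly 3.6x the clock. An old cable either keeps up with the higher rate or it doesn't, which is where the cable certification tiers come in.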

>want to upgrade my rig
>think it's just simple as getting a new monitor with HDMI and new GPU
>thread talking about display ports and adaptors and sound and HDMI problems and shit

What the fuk man.

just use an adapter, dingus.

what are adapters

>>think it's just simple as getting a new monitor with HDMI and new GPU
It is.
Unless you have a >60Hz monitor.

DVI-I (the 24+5 pin one) does carry BOTH
DVI-D (24+1), nope, only digital

Who /fw900/ here?

What if I wanted to get higher Hz monitor for futureproofing?
What should I look out for in that case?

You're not gonna future-proof with a high refresh rate monitor but, to answer your question, you need to use the DisplayPort on your GPU and monitor in order to run resolutions above 60 Hz.
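Rough numbers on why, assuming ~20% blanking overhead (real timings vary, so take it as a sketch):

```python
# Why >60 Hz wants DisplayPort rather than the HDMI 1.4 ports common
# on monitors of this era. Blanking factor is a ballpark assumption.
def needed_gbps(w: int, h: int, hz: int, bpp: int = 24, blanking: float = 1.2) -> float:
    return w * h * hz * bpp * blanking / 1e9

need = needed_gbps(2560, 1440, 144)
print(f"2560x1440@144: ~{need:.1f} Gbit/s needed")  # ~15.3 Gbit/s
print("HDMI 1.4 data rate:  8.16 Gbit/s -> not enough")
print("DP 1.2 data rate:   17.28 Gbit/s -> fits")
```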

The main downside to DisplayPort is the pin 20 issue: the spec says pin 20 (DP_PWR) shouldn't be wired in a cable, but plenty of cheap cables connect it anyway, which can feed power between devices and cause flaky behaviour.

just use a vga to dvi-i adapter

Future cards are getting rid of analog output altogether. So no DVI-I.

well i guess you're stuck with an nvidia 900 series card

I'll never buy a card that can't even support basic legacy hardware.

Jesus christ, the VGA standard is fucking ancient and the only reason to even use it is if you have a CRT monitor, which 99% of people today don't. It's way too niche for manufacturers to shell out money on.

In any case, if you run an Intel CPU chances are you have a VGA connector on the motherboard for the Intel HD graphics. I use that for my secondary CRT monitor and it works fine. I can't plug it into my GTX 1080 as it doesn't carry any analog signals at all.

>VGA

It's the current year