Hey Sup Forumsuys. I heard that PCIe 3.0 hardware can work (backward compatible) in PCIe 2.0 slots. Is this true?

Also, are there any disadvantages or limitations to putting a PCIe 3.0 card in a 2.0 slot?

Not really. I used a PCI-E 3.0 card (GTX 980) at PCI-E 1.1 speeds.

How?

for most consumer tasks there isn't a difference. pcie 3 is technically faster (double the per-lane transfer rate of 2.0), but 2.0 x16 is already fast enough that you won't notice

the slots are physically the same, the card just negotiates whatever speed both ends support

True. Most cheap mobos i see have 2.0, but even the cheap gpu cards are 3.0 lol

Oh shit I misread that lol. I was thinking pcie x1 instead of pcie 1.1 x16. Whoops.

>Hey Sup Forumsuys. I heard that PCIe 3.0 hardware can work (backward compatible) in PCIe 2.0 slots. Is this true?
Yes.
>Also, are there any disadvantages or limitations to putting a PCIe 3.0 card in a 2.0 slot?
Technically, yes. But these disadvantages and limitations don't apply to any consumer hardware that you care about, as even a modern high-end GPU won't saturate the full bandwidth of a PCI-e 2.0 x16 slot. The difference in bandwidth only really becomes apparent in HPC hardware.
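If anyone wants the actual numbers, here's a quick back-of-the-envelope sketch in Python (the per-lane transfer rates and encoding overheads are from the PCIe 2.0/3.0 specs; real sustained throughput runs somewhat lower):

# Effective one-direction bandwidth: transfer rate x lanes x encoding efficiency.
# PCIe 2.0 uses 8b/10b encoding, PCIe 3.0 uses 128b/130b.
def pcie_bandwidth_gbit(gt_per_s, lanes, efficiency):
    return gt_per_s * lanes * efficiency

gen2 = pcie_bandwidth_gbit(5.0, 16, 8 / 10)      # ~64 Gbit/s (~8 GB/s)
gen3 = pcie_bandwidth_gbit(8.0, 16, 128 / 130)   # ~126 Gbit/s (~15.8 GB/s)
print(f"PCIe 2.0 x16: {gen2:.0f} Gbit/s (~{gen2 / 8:.1f} GB/s)")
print(f"PCIe 3.0 x16: {gen3:.0f} Gbit/s (~{gen3 / 8:.1f} GB/s)")

That ~8 GB/s is the point above: no consumer GPU sustains anywhere near that, so the slot generation barely shows up in game benchmarks.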

ive got a 750ti in a 2.0 slot, not much of a performance problem when it's only being used at sub-1080p anyway

>he thinks the PCI-e bus gives a shit about resolution
Spoiler alert: It doesn't. Rendering happens out of the card's own VRAM; the bus mostly carries textures, geometry, and draw calls, which don't scale with output resolution. Your GPU pulls roughly the same bandwidth at 640x480 as at 8k.
Another spoiler alert: You won't get more frames per second by putting your 750 Ti in a 3.0 x16 slot.

Oh. Thanks.

I might have to use the shitty onboard graphics that comes with a cpu i might get, but apparently it has a fixed memory allocation of 512mb. Are there any cpus that can use more than that, and if there are, any where you can change how much memory it gets?

Is Gentoo PCI 2.0 compatible?

I feel like there's a picture floating around out there where someone cut/sawed off the extra pins of their GPU to fit it in a 1x slot.

Lol really? Seems kinda extreme.

Modern GPUs still do not use enough bandwidth to max out PCIe 2.0 x16, much less 3.0. You are safe OP.

Technically you can do that. They make adapters for it, and they also make PCIe x1 GPUs. And I've seen people either cut the pins off their GPU or cut open the end of a PCIe x1 slot to make the card fit.

Coin miners do stupid shit.

...

How would this not break the card?

Depending on what you're going to do with the pcie slot it might become a huge limitation.
Example for a fileserver:
>Raid controller with 16x 12 Gbit/s SAS ports = 192 Gbit/s aggregate
>Usually 8 lanes
>PCIe 3.0 x8 ~ 64 Gbit/s effective
>PCIe 2.0 x8 ~ 32 Gbit/s effective
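Rough math on those numbers in Python (the 192 Gbit/s is a theoretical aggregate across all 16 ports, real arrays rarely push every port at once; encoding overhead is 8b/10b for 2.0 and 128b/130b for 3.0):

sas_aggregate = 16 * 12                 # 192 Gbit/s theoretical max
gen2_x8 = 5.0 * 8 * (8 / 10)            # ~32 Gbit/s effective
gen3_x8 = 8.0 * 8 * (128 / 130)         # ~63 Gbit/s effective
for name, bw in (("PCIe 2.0 x8", gen2_x8), ("PCIe 3.0 x8", gen3_x8)):
    print(f"{name}: {bw:.0f} Gbit/s, covers "
          f"{100 * bw / sas_aggregate:.0f}% of the SAS aggregate")

So even 3.0 x8 only covers about a third of what the controller could theoretically move, and 2.0 x8 halves that again.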

Electrically, it works. The pins in the short segment before the key notch are the essential ones (power, SMBus, reference clock, and lane 0); the rest of the connector just carries the additional lanes. You can test it by putting tape over part of the pin edge to force the card to negotiate a narrower link.
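If you try the tape trick on Linux you can check what actually got negotiated through sysfs. Quick sketch; the device address 0000:01:00.0 is just an example, find your card's with lspci:

from pathlib import Path

dev = Path("/sys/bus/pci/devices/0000:01:00.0")  # example address
for attr in ("current_link_speed", "current_link_width",
             "max_link_speed", "max_link_width"):
    print(attr, "=", (dev / attr).read_text().strip())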

en.wikipedia.org/wiki/PCI_Express#Pinout
He basically cut off everything but lane 0 and the other necessary pins. On serverboards you often see x16 mechanical slots with only 8 lanes electrically connected.

The standard is actually designed to allow this; the two ends negotiate whatever link width and speed they both support during training.

Interesting. But what's the point of doing it if it slows down ur card?

It's cheaper for the manufacturer to have just one type of slot.
On servers you only need a gpu for the initial setup; after that you can use ssh, rdp, vnc or whatever you want. Wiring all 16 lanes would waste 8 of them, since most add-on cards are physically and electrically PCIe x8.

it just werks

bobbleneck :-DDD

Are you PCI-E compatible?