RIP GPU

> left in the morning, turned my pc on to let some downloads run
> come back in the evening, screen is black
> desperately wiggle the mouse, nothing
> try all the troubleshooting steps, eventually isolate it to my gpu

It served me well. Thanks to the unimpressive tech of the Xbone and PS4, this 5-year-old card ran any new game without a problem on second-to-highest settings.

My question is, what causes a card to just die like that while idly displaying a desktop? I wasn't around for the actual death so I can't say exactly how it happened, but why did it?

Could be anything.
Disassemble it, remove the thermal paste, wrap it in aluminium foil, then put it in the oven for 30 minutes at 150 °C or even higher. Take it out, wait till it has cooled down, reassemble it, and maybe your card will be up and running again. Worked for me and many others.
No, it won't create nerve gas.

time to buy a 480 my friend

Don't listen to this shill
try what said

M8, I wouldn't have said a thing if you were still using a 670, but a 570? Damn, that shit is old architecture-wise. Let it rest in piss and get something like a 480/1060. Both cards are a huge jump in performance for you.

don't listen to this dumbass

just use your integrated graphics. it's not even a big deal, integrated can run just about anything if you don't have a fucking 10 year old cpu

OP is obviously a poorfag. He should wait for the rx 460 or a Zen APU.

This. I can run Overwatch at 1080p* 60fps on integrated graphics.

Not a poorfag, I just saw no reason to upgrade as my card ran everything fine. Now that it's dead I'll be getting a gtx1080 or 1070. I'm just curious what, physically, is the reason a card "dies". Solder melts somewhere inside and a connection is lost?

>still running my GTX 460
>it's gotten 6+ hours gaming use every day for 6 years

Sh-should I be worried, Sup Forums?

that's a strong lifespan

i just bought an MSI 1070, hoping it will live as long as yours
my previous card was a Gigabyte 660, it died recently, i got it in '13 so im not too impressed by that

As time goes on and electrons move through the traces and contacts on the board, atoms shift, and sometimes that ends with a broken trace or something of that nature. Putting it in an oven lets the solder reflow, but the card will only last another 30-90 days. Then it'll just die again, because the bonds were weakened.
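For the curious, the "atoms shift" failure mode described above is electromigration, and its temperature dependence is usually modeled with Black's equation, MTTF = A * J^-n * exp(Ea / kT). A minimal sketch of what that implies (the activation energy and the two temperatures below are assumed textbook numbers, not measurements from any actual card):

```python
import math

# Black's equation models electromigration mean time to failure:
#   MTTF = A * J**(-n) * exp(Ea / (k * T))
# A, n (~2), and Ea (~0.7 eV, a typical textbook value for aluminium
# interconnects) are assumptions here, not data from any real GPU.
K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

def mttf_ratio(cool_c: float, hot_c: float, ea_ev: float = 0.7) -> float:
    """How many times longer the expected life is at cool_c vs hot_c,
    holding current density constant (the A * J**-n factor cancels)."""
    t_cool = cool_c + 273.15
    t_hot = hot_c + 273.15
    return math.exp(ea_ev / K_B_EV * (1.0 / t_cool - 1.0 / t_hot))

# A card idling at 40 C vs one gaming at 85 C:
print(f"{mttf_ratio(40, 85):.0f}x")  # roughly 26x longer expected life at idle
```

The exponential temperature term is why an idle desktop death is surprising, and why heavy daily gaming on an old card (like the GTX 460 anon's) is the more typical wear scenario.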

it looks like TF2 running on a pentium 4 computer

>GTX 460
It'd probably be doing you a favor if it just went ahead and died, m8

There's something really comfy about old outdated graphics cards.

No? If you're gonna worry about it eventually dying, you should also worry about your motherboard and everything else for no reason, because when it decides to die, you'll most likely never see it coming. Unless it's already overheating or something, in which case you should just change the thermal paste.

fucking bake it before you throw it away.......

That's not actually 1080p though, your render scale isn't at 100%

>nVidia

Found your problem.

So it can last another 30-90 days and die again? Just get a new fucking card and recycle this bitch.

> see rip gpu
> must be nvidia
> open image
> nvidia
> afterBURNer

How about wait and get the GTX 1060?

Or you can go for AMD, I heard Polaris (or is it Zen?) will be pretty close to NVIDIA.

That's the joke.

I feel like I'm getting déjà vu, like a thread like this was created some time ago

F

Don't listen to this, you will break it for good.
Rossmann says 5 minutes at 120 °C to be safe, and nothing more.
That way you will find out if it's a joint problem or an on-GPU problem that can't be fixed.

Dude, I know of people going as high as 220 °C when it didn't work at lower temps.
I personally did it at 150 °C, and at 200 °C on a PS3. Both worked afterwards.

Of course it worked. It allowed the solder to reflow and make connections to the contacts. But most of the time when people decide to do this, they kill the actual chip while trying to do it, or the hardware fails again after a month or two. But of course there will be those lucky times that it'll just keep working.

Mine just died a month ago. It couldn't handle the extreme demand put on it by Netflix.

F

If you see no need to purchase a new GPU when it runs anything 'fine' I would suggest steering clear of Nvidia for the time being.

My 580 also died last week. It started with the driver crashing; then after installing a new driver, the PC just froze and reset itself. Then I noticed weird dots on the POST screen and couldn't boot into Windows until I deleted the driver in safe mode. But it would just go to a fucked-up BSOD with lines everywhere when installing any driver. RIP. At least I'm using my old GTX 285 for now. That thing was loud as shit but it ran games pretty smoothly.

en.wikipedia.org/wiki/Electromigration

>still running a GT 220
fight me

i saw this and thought you meant a 480 fermi

i'm old D:

my GTX 560 Ti is still running
Please kill me
After 30 minutes of CS:GO it goes up to 89 °C
and the fan spins the fucking hot air into my room
cooking my legs and making me sweat as if I'm in a sauna

i replaced my '09 5850 a few weeks ago with an r7 370

acbn#1634

add me on overwatch

>6+ hours gaming use every day for 6 years
how? how can someone play THAT MUCH games and not feel like a 6 year old GPU just isn't cutting it? do you only play Source games?

>have 780ti
>playan games
>suddenly the screen shuts off and the card starts using max fan speed
>shut down computer and reboot
>computer wont boot
>every time i turn on the power the PC only flashes it lights for a fraction of a second
>remove gpu
>pc turns on
>replace gpu, change the slot
>wont power on
>replace gpu with an older pig disgusting gpu
>pc powers on
>visible concern
>remove the heat sink from the 780ti and inspect for damage
>zero visible damage
still stuck with a 5770. heavily considering not purchasing any new nvidia cards.

Kek I'm still rocking a gtx480

>wanting lead solder fumes in your oven
I bet you eat lead paint chips too, retard.

No idea, the 550Ti in my home theater is still chugging away. It's probably a bit overkill since it's just running MPV on high.

>lead
Lead hasn't been used for solder since the early '90s. Solder is tin-based now, and that's why everything goes bad due to cracked solder joints. '80s electronics still work fine thanks to their lead solder.

Wrong, most solder is a lead tin mix.

>Solder is tin based now
Err, not exactly. There are a number of different alloys; some are tin and copper. Leaded solder is absolutely still used, just mostly not in consumer electronics (I use leaded solder quite a bit in my job).

Regardless, you also have the electrolyte from the caps, which will partially evaporate if you put it in the oven. Copper and tin aren't exactly great to heat up in the oven you cook with either; there are a number of toxic things on that board that will make you very sick if you keep using that oven for anything. Wrapping it in tinfoil isn't gonna do shit to stop that. But sure, go for it.

The most common solder alloy is 60/40 tin/lead. Also, the melting point is higher for tin-based solder, and it corrodes less, meaning tin likely extends the life of the joint.
The higher melting point is more annoying to work with, though.

The issue is tin is less flexible and less resilient to heat/cool cycles. Pure lead solder didn't have this problem.

>Lead hasn't been used for solder since the early '90s

Mid 00's, you dumb fuck

Nvidia picked off your GPU as sacrifice to the new 1000 series. They periodically do this. Go AMD or learn to upgrade every year. RX480 awaits friend.

The easiest method to prevent your GPU from going bad is to never peel off the plastic. It will think it's new forever.

Have you tried using the video output from the motherboard/using the computer without a GPU at all? It could be a problem with your motherboard not playing nice with your GPU, so if everything works without a GPU, you can probably rule that out.

I do 8 min at 250 °C; that actually puts the least stress on the other components, since it just melts the solder quickly.

Yeah, just put it in an oven at a temperature high enough to melt solder (lead-free tin solder melts around 450 °F); it will reflow the solder, which will totally fix the problem.
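To put the temperatures being thrown around in this thread side by side, here's a minimal sketch comparing common solder melting points (textbook values assumed for illustration; the exact alloy on any given card varies). Note that a 120-150 °C bake never actually melts either family of alloys, which is why the short low-temperature bake is more of a diagnostic than a true reflow:

```python
# Approximate melting points of common solder alloys in Celsius
# (assumed textbook values, not measured from any particular card).
SOLDERS_C = {
    "Sn63/Pb37": 183,  # eutectic tin-lead, classic hand-soldering alloy
    "Sn60/Pb40": 188,  # common 60/40 hand solder
    "SAC305": 217,     # lead-free Sn/Ag/Cu typical of modern consumer boards
}

def c_to_f(celsius: float) -> float:
    """Convert Celsius to Fahrenheit."""
    return celsius * 9 / 5 + 32

for alloy, mp in SOLDERS_C.items():
    print(f"{alloy}: melts around {mp} C ({c_to_f(mp):.0f} F)")
# SAC305's 217 C is about 423 F, close to the "450 Fahrenheit"
# figure quoted above for lead-free tin solder.
```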

>480 and not 1060.
Kek'd but checked.

Solder used for BGA and surface-mount parts on those cards is lead-free.

I've done lead solder reflows in my oven too, the same one I cook in, and in the past 10 years I haven't died.

I know it's stupid, but just let the oven run for a few minutes with the door open and the room window open; nothing bad comes from it.

Kek, how can you fail with something that easy.

Don't get AMD unless you want an unstable system

I own several CPUs and GPUs from both companies; they all run just fine.

AMD/Nvidia is just a hyped meme.

>electrolytic fluid from the caps
Most graphics cards use polymer caps these days.

>Most common soldering alloy used is 60/40 tin/lead.
For manual soldering, yes. In manufactured consumer products you'll basically never see leaded solder; the EU's RoHS directive restricts it, and similar rules exist elsewhere.

Wouldn't it make more sense to buy solder and a soldering iron and resolder the card in your garage instead of baking it in an oven?

>Pure lead solder
Electronics have never been soldered with pure lead. I don't know if that might have been so at some point in history, but if so, that would have been at least a hundred years ago, literally.

AMD is shit tier, uses way too much power and produces way too much heat, and bloated buggy as fuck drivers

No, because a soldering iron is useless with BGA's.
Google it.

>lead solder fumes
Are you retarded? Metals don't evaporate at any temperature you can produce at home. When you solder manually you'll get flux fumes, but on manufactured units any flux that didn't evaporate during reflow will have been cleaned off.

I've had a 290 for over a year now and I've never had a single issue with the drivers.

I have an FX machine I use occasionally. Sure, the CPU is not top-notch anymore vs Intel, but it does run cool and eats the same amount of power at a higher clock rate than a comparable Intel from the same time, i.e. 4-5 years ago.

As for GPUs, the AMD box is AMD-only, so it has an R9 290. I used it in a gaming PC years ago and it did its job just fine.

I can't recommend AMD right now; there's not really anything good on the market, except maybe the 480 looking at its price/performance ratio. But I don't like to cheap out on hardware.

That picture gets me every time.

I'm surprised the AMD shill post is the second post not the first one.

Just reinforces the belief that AMD promotes unhelpful responses like this.

me too...

I had trouble with the Game DVR thing slowing everything down when I was running on a G3258, but otherwise, no problems.

Wtf has happened here? Did someone put it into the oven with the plastic shroud still on?

fermi's burnin' on the dance floor

jen-hsun let's go!

Try an Nvidia card, the difference is night and day. AMD drivers are so shit, I have to uninstall and change all the options just to make them usable.

Dunno, never had a problem with either of them.

My old card was a GTX 670, literally no difference in the drivers.

AMD is 6 years behind as usual.

no i'm six years older than you are

i'm in my twenties

Will graphics cards fanboys please kill themselves? This is getting beyond cancerous.

I was trying to make the joke that nVidia had a card named 480 6 years prior to AMD having one.

It's more complicated than that. The lead does react with oxygen, and hence there is risk of leaded dust particles. Not to mention that you don't need to get close to boiling point for evaporation to occur. There is a lot of chemistry involved with this and I could not find a definitive answer from the research I've done, it's mostly hearsay and nonscientific.

1. Even if that is true, the effects are going to be so negligibly small that it's like worrying about the radioactive potassium in bananas.
2. Fortunate, then, that all graphics cards are manufactured with lead-free solder.

you show me how to hand solder gpus and i'll do it

youtube.com/watch?v=1AcEt073Uds

What are you trying to say?
I think we all know it's just going to buy maybe a few more weeks or months of operation, or did you just now find that out?

I actually had a laptop with a fucked-up GPU, unstable and with the colors off; after treating it with the oven I used the same laptop for the next 4 years, running perfectly.