Is this a good idea?
A PCIe card with interchangeable ASIC coprocessors, for various encoding or computing purposes.


I think so

Why not an FPGA that the user can just reprogram to whatever he wants? Actually, I wonder how hard it would be to do this with off-the-shelf parts. One issue is that FPGA tooling is really proprietary and secretive.

>muh proprietary algorithms

Nope, PCIe latency and bandwidth make that unworkable. IBM is researching ultra-fast, low-latency, memory-coherent interconnects instead:

opencapi.org/about/

What are the benefits from this over a videocard?

No, because it would raise the cost of the final product and would not sell compared to contemporary options.

Not to mention you would have reliability issues, and you have to design it so that an idiot would understand how to use it properly (one thing I learned as a fabricator). The risk of high return rates is too great.

cheaper
faster for some applications
more energy efficient

nice
this is the future

Videocards are general purpose. ASICs/FPGAs can theoretically be much faster and more energy efficient by putting very specialized algorithms in raw silicon. Look at how bitcoin mining is 100x more expensive on GPUs now.
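The "much faster/energy efficient" claim is easy to sanity-check with back-of-the-envelope math. All the figures below are rough assumptions for illustration, not measured specs of any real GPU or miner:

```python
# Back-of-the-envelope hashes-per-watt comparison.
# Every number here is an illustrative assumption, not a datasheet value.
gpu_hashrate_mhs = 50.0           # GPU SHA-256d rate, MH/s (assumed)
gpu_power_w = 250.0               # GPU board power, watts (assumed)
asic_hashrate_mhs = 14_000_000.0  # ASIC miner rate, ~14 TH/s in MH/s (assumed)
asic_power_w = 1400.0             # ASIC miner power, watts (assumed)

gpu_eff = gpu_hashrate_mhs / gpu_power_w     # MH/s per watt
asic_eff = asic_hashrate_mhs / asic_power_w  # MH/s per watt

print(f"GPU:  {gpu_eff:.2f} MH/s per watt")
print(f"ASIC: {asic_eff:.2f} MH/s per watt")
print(f"ASIC is ~{asic_eff / gpu_eff:,.0f}x more energy efficient")
```

With these made-up but plausible numbers the ASIC comes out tens of thousands of times more efficient per watt, which is why GPU mining of SHA-256 coins died.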

Imagine being able to just download a hardware update and have your 20 year old ThinkPad be able to play movies and YouTube videos with a hardware decoder for modern codecs.

Call me when normies actually use their GPUs for compute

>download a hardware update

I think he meant firmware

Normies don't even know what "PCI" or "GPU" is.

no, just make multiple cards

You want a standardized LGA socket for ASIC chips? Other than that, there are separate GPU power boards that you can buy, like the Galax HOF and EPOWER V.

btw, Fujifilm recently made a firmware update that lets you process X-Trans RAWs (pictures from Fuji's weird sensor) on a PC using the camera's processor, via USB. It's pretty neato.
That's why I put a 'fuji xtrans' processor in my wonderful drawing.

In the past there were also GPUs that had expandable vram capacity. Like 128 MB modules.

It's been done with CPUs before and wasn't a success.

ASICs are orders of magnitude more efficient. FPGAs are for prototyping ASICs.

Bad idea because each chip may need different peripherals to work, such as memory or providing some output (video connector). That, and you'd run into heat problems as well if you had so many chips on there.

Simpler to just have an entire PCIe slot dedicated to the coprocessor, then you can mix and match cards easy peasy.

How do you even connect that to a card?

Soundcards too, but that was back when they all used socketed DRAM. After SDRAM became a standard, the chips had to be soldered on, and I don't know of any card that actually allowed that kind of expansion.
Then with DDR you had BGA memory chips, you can't even solder those unless you have a reballing station (that might kill the card too).

Why would anyone buy an ASIC for anything other than mining buttcoin? Do you really need to run bzip or gcc 50 brazillion times a day?

Developing ASICs is not cheap.

Welcome to the weird wacky world of programmable hardware!

>Why would anyone buy an ASIC for anything other than mining buttcoin?

Practically any PCIe card that is of any use has an ASIC on it: GPUs, soundcards, network controllers, storage controllers, RAID cards, USB controllers, and so on and on and on.

>and you'd run into heat problems as well if you had so many chips on there.
I thought ASICs are supposed to be fucking energy efficient?
Pic related is literally 150x faster than a GTX 1080Ti at bitcoin mining and is passively cooled (probably cheapo hardware dying in 3 weeks tho).

>mining bitcoins with a GPU

Retard

What I meant is that ASICs aren't supposed to heat up that much, because they're so efficient that they don't need quadrigorillions of cores and gigahertz.

>ASICs are orders of magnitude more efficient. FPGAs are for prototyping ASICs.
They are also orders of magnitude more expensive; millions of dollars of investment is the bare minimum. If we are gonna make a niche product that only a handful of autists will ever use, it needs to be as cheap to develop as possible.

And it also can't be upgraded. The user has to buy a physical card, open up their computer, and put it in, for every single feature they want, until they run out of slots. Wouldn't it be cool to just freely download new hardware features whenever you want?
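The NRE (one-time engineering/mask cost) point is the crux of the ASIC vs FPGA argument above. Here's a toy amortization sketch; every dollar figure is a made-up assumption, purely to show the shape of the problem:

```python
# Toy ASIC cost amortization. All dollar figures are hypothetical
# assumptions, not real quotes from any fab or vendor.
nre_cost = 2_000_000.0   # one-time design + mask cost, dollars (assumed)
unit_cost = 20.0         # marginal cost per ASIC once in production (assumed)
fpga_unit_cost = 200.0   # comparable off-the-shelf FPGA board (assumed)

def asic_cost_per_unit(units: int) -> float:
    """Effective per-unit cost once NRE is spread over a production run."""
    return unit_cost + nre_cost / units

# The ASIC only beats the FPGA once the run is big enough to absorb the NRE.
break_even = nre_cost / (fpga_unit_cost - unit_cost)

print(f"cost per unit at a 1,000-unit run: ${asic_cost_per_unit(1_000):,.2f}")
print(f"break-even run size vs the FPGA:   ~{break_even:,.0f} units")
```

At a 1,000-unit run the ASIC costs over $2,000 apiece here, while the break-even against the FPGA is north of 11,000 units, which is exactly why a niche product for "a handful of autists" can't justify custom silicon.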

>you'd run into heat problems as well if you had so many chips on there.

I doubt you'd be running them all at the same time, and as others mentioned, ASICs are ridiculously energy efficient.

>Bad idea because each chip may need different peripherals to work, such as memory or providing some output (video connector)

It could have some minimal memory built in and otherwise share with the CPU. I don't see why it can't just pipe the output to the CPU.

This is weird