YOU WERE WRONG

THE WEEBS WERE RIGHT

Soon you'll be able to:
1) download more ram
2) download a faster cpu
3) download a bigger gpu
4) download more battery

are you gonna eat your hats now?

>guys, I fundamentally misunderstand technology and am easily sold on marketing bullshit

Are you smoking the drugs, user?

This won't make anime real so it doesn't matter.

intel.com/content/www/us/en/servers/accelerators/deep-learning-inference-accelerator-product-detail.html
Powered by corporate buzzwords.

Intel is selling FPGA-accelerated processors, IIRC Xeons packaged with Altera FPGAs rather than Xeon Phis. That would be more in line with what OP is talking about.

I'm guessing this is their answer to Google's TPUs, which is actually pretty cool. Hopefully it won't be long before we can rip them out of decommissioned servers.

It seems like a lot of companies are going with what are basically extended GPUs for AI acceleration. Does anyone know of any progress on true neuromorphic architectures? TrueNorth seems to be stuck in development hell.

I'll have what he's having

>Hi Bill, I heard you had a problem?
>Yeah Mike, my car is leaking oil.
>Wow, that sucks. I tell you what, I have a solution looking for any problem out there! FPGAs!

I'm a neural VHDL developer bruvalove

it's better than anime :D

nah I think it's real, and about to get much better

TrueNorth and all that was a good idea at the time, but the reality is that machine learning leaps ahead way too fast for any fixed hardware implementation to stay relevant for long. (it's not all neural, either)

Xilinx is also launching cost-competitive (their words) hybrid FPGAs (the Zynq series), so you may see something in consumer devices soon enough.

Oh yeah, good news for developers too! You can now start patenting your programs :D

In case you're wondering what you can do with an FPGA (a minimal sketch follows the list):

1) you can use an FPGA to store data
2) you can flash a CPU with advanced ALUs onto an FPGA
3) you can flash a GPU with many primitive cores onto an FPGA
4) you can alter the concurrent utilization factor of the FPGA so the processor uses less power, produces less heat, and extends your battery life as a result
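
To make point 2 concrete, here's a minimal Verilog sketch of the kind of logic you'd flash onto the fabric: a toy ALU that could be one building block of a soft CPU. The module name, widths, and opcode map are made up for illustration; this isn't any vendor's core.

// Toy ALU: one building block of a soft CPU synthesized onto FPGA fabric.
// WIDTH and the opcode encoding are illustrative assumptions, not a real ISA.
module toy_alu #(
    parameter WIDTH = 8
) (
    input  wire [WIDTH-1:0] a,
    input  wire [WIDTH-1:0] b,
    input  wire [1:0]       op,     // 00 add, 01 sub, 10 and, 11 or
    output reg  [WIDTH-1:0] result
);
    always @(*) begin               // combinational: every opcode covered, so no latch
        case (op)
            2'b00: result = a + b;
            2'b01: result = a - b;
            2'b10: result = a & b;
            2'b11: result = a | b;
        endcase
    end
endmodule

Change WIDTH and the fabric just allocates more or fewer LUTs; that reconfigurability is the whole point.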

unless i can use it to procedurally generate VNs that are at least half decent it's not worth the money.

can someone just recommend me a pci-e x16 fpga card

Unfortunately you can't grow transistors.

just a matter of time until we generate porn suited exactly to what makes you tick.

but the question is, do you really want that? you'll deviate pretty rapidly, chasing that endless stream of procedurally optimized endorphins until you end up in a state where you don't want to interact with reality at all anymore, because it doesn't do anything for you.

1) improve the maths
2) reduce silicon area per FLOP

is that a problem?

no, that is the desired outcome. the death of the white race via porn and VR overload.

tangential FYI: there's a movement to reduce floating point precision. less precision means smaller logic circuits (sketch below).
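
A hedged illustration of that trade-off, assuming a plain combinational multiplier (real reduced-precision formats like fp16/bfloat16 also shrink exponent and mantissa handling, which isn't shown here): multiplier area grows roughly with the square of the operand width, so halving the width cuts the circuit to about a quarter.

// Combinational multiplier whose logic footprint scales roughly with WIDTH^2.
// Halving WIDTH (e.g. 16 -> 8) cuts the area to about a quarter.
// WIDTH is an illustrative parameter, not a standard number format.
module narrow_mul #(
    parameter WIDTH = 8
) (
    input  wire [WIDTH-1:0]   a,
    input  wire [WIDTH-1:0]   b,
    output wire [2*WIDTH-1:0] p    // full-width product, no rounding
);
    assign p = a * b;
endmodule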

To be fair, I can see them being useful for certain purposes, mainly in versatile ultraportable devices. I really don't think they'll be the be-all and end-all that the FPGA crowd suggests, but they have their uses.

Improved maths will go some way, but it's still nothing compared to transistor density. And there are still plenty of things where ASICs will have lower latency and better performance. What's much more interesting is improving hardware versatility by being able to dynamically re-optimise: isolating failed transistor clusters, perhaps changing load distribution if electromigration can be detected. Supposing you have an equally capable heatsink across the whole chip, of course. Plenty of long-life applications for robust hardware, not least in satellites.

I'll skewer anyone that tries to take my FP precision away.

As you say, advantages of ASICs:
1) smaller packaging
2) as a result, higher frequencies, faster buses, etc.

GPUs and CPUs will likely not go away, for the above reason.

Let's not talk about satellites or other high-budget one-off products. They are a completely different beast and are already using FPGAs and ASICs wherever they need to be used.

BUT, as coprocessors, FPGAs will become ubiquitous soon.

>hardware components come down to a single model in stores every 2 years
>each hardware component has the ability to be the best on the market
>its capability is locked behind online authorization
>suddenly you can download more of anything
you know it's going to happen at some point

>wherever they need to be

Nope, they're not. FPGAs are still considered a bit untested for those applications. Also don't be so sure that they're one-offs. More and more go up, and if we're going to get this free-space QComms network going, many more will be needed, probably in clusters.

Military usage also falls into that. They have huge reliability issues with conventional hardware, and good FPGA systems definitely have a role to play there, but we're still a good few years from the mil sector trusting them enough to adopt them.

I agree we'll see them as co-processors, and I think GPU design might shift in the long run, moving a lot of calculations and storage onto FPGAs. I also think you might be able to get complete FPGA solutions for mobile devices: perhaps a bit of dedicated flash memory and some controller, but mostly FPGA fabric reconfiguring to whatever app is running.

On a different tangent, they're going to be a security nightmare. The consequences of a compromised FPGA are huge.

>implying you can't use an FPGA to patch a leak?
What's it like being too retarded to comprehend something as childlike as Verilog?
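
For reference, "childlike" looks something like this: the canonical LED blinker. The 50 MHz clock and counter width are assumptions for a generic dev board, not any specific part.

// Canonical Verilog hello-world: blink an LED off a free-running counter.
// Assumes a 50 MHz board clock; counter[24] then toggles every 2^24 cycles,
// about every 0.34 s, which is slow enough to see.
module blink (
    input  wire clk,
    output wire led
);
    reg [24:0] counter = 0;
    always @(posedge clk)
        counter <= counter + 1;
    assign led = counter[24];
endmodule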

>it's better than anime

But when can I download a car?

The future is now, user.