>Google's Tensor Processing Unit could advance Moore's Law 7 years into the future
Google unveils a custom chip, which it says advances computing performance by three generations.
Forget the CPU, GPU, and FPGA: Google says its Tensor Processing Unit, or TPU, advances machine learning capability by a factor of three generations.
>Both AMD and Intel BTFO by Google
>Vidyafags on suicide watch
Jaxon Long
>which it says advances computing performance by three generations.
If they're going by Intel's generations, that's like a 6% increase in performance :^)
Xavier Collins
kek
Jeremiah Roberts
>Intel on suicide watch
Why? It's not a CPU.
Henry Robinson
...
Nathan Sanchez
>made for a specific field
>Intel on suicide watch
Are you retarded, OP?
Jason Watson
>Moore's Law isn't dead meme
Aaron Morgan
OP doesn't understand the difference between an ASIC and a general-purpose processor. He probably doesn't even understand the difference between server chips and consumer chips.
Gabriel Morgan
>ASIC is faster than a CPU
OH NO, IT'S OVER
Hudson Mitchell
Intel fanbois making noise
Cute
Alexander Garcia
>advances machine learning capability by a factor of three generations.
So does a LISP machine.
Aiden Wilson
save us, IBM and/or Google.
just give us x86_64 backwards compatibility
Brody Lewis
>LISP machine
Literally slower *for running LISP programs* than a 1987 PC or Mac.
>In 1987, three years after Minsky and Schank's prediction, the market for specialized AI hardware collapsed. Workstations by companies like Sun Microsystems offered a powerful alternative to LISP machines and companies like Lucid offered a LISP environment for this new class of workstations. The performance of these general workstations became an increasingly difficult challenge for LISP Machines. Companies like Lucid and Franz Lisp offered increasingly more powerful versions of LISP. For example, benchmarks were published showing workstations maintaining a performance advantage over LISP machines.[27] Later desktop computers built by Apple and IBM would also offer a simpler and more popular architecture to run LISP applications on. By 1987 they had become more powerful than the more expensive Lisp machines. The desktop computers had rule-based engines such as CLIPS available.[28] These alternatives left consumers with no reason to buy an expensive machine specialized for running LISP. An entire industry worth half a billion dollars was replaced in a single year.[29] en.wikipedia.org/wiki/AI_winter
Cooper Garcia
It's not dead, they just had more trouble than expected switching from immersion lithography to EUV.
TSMC already announced they plan to catch up.
Jonathan Nelson
>advances Moore's Law
>reduces transistors per operation
uh
Also, they don't replace the CPU or GPU. If the article is right, they're a cheaper ASIC. They do one thing really well: grunt work. Still cool.
Jacob Nguyen
Exactly my point.
Brody Jones
>retard believes he'll soon have a GOOGLE ISA CPU in his GOOGLE PC running GOOM
how retarded.
Aiden Turner
>Custom built ASIC
>Forget CPU, GPU, FPGA
>Implying they didn't use a computer with a CPU and GPU to design the ASIC
>Implying they didn't use an FPGA to test their design first
>Implying technology 'journalists' aren't completely retarded
Juan Hill
In other news, GPUs are faster at rendering graphics than CPUs.
Jaxon Scott
>they don't replace the CPU
>It's not a CPU
You just wait until Google AIOS comes out.
Levi Perry
Also, production transistor size is basically limited by ASML's R&D. Moore's Law is about transistor density, and Google doesn't have a lab that works on transistors.
If you want to stretch it further to speed instead of transistor size: ASIC speed isn't comparable to CPU speed, because the former is designed for one specific purpose while a CPU has a very general ALU design.
Nicholas Nguyen
>Moore's law
>law
Nathan Howard
How many quantum computers can you fit in there?
Lucas Harris
About as likely as a massively parallel OS coming out for stand-alone GPUs.
>A specialty piece of silicon does a job better than a generalized piece of silicon
Would never have thought. This is like being amazed that a GPU performs 3D transformations better than a CPU, or that a dedicated DAC chipset does audio better than an FPGA.
Don't get me wrong, it's awesome that Google is advancing machine learning with dedicated hardware, but it's far from putting "Intel on suicide watch". Or AMD. Or nVidia. VIA is always on suicide watch, tho.
Connor Green
Okay, let's just explain what's going on:
Google is literally ignoring the IEEE 754 standard, and that works out just fine.
These TPUs calculate things like sqrt(d) with much less accuracy.
Literally all Google did was make a half-precision floating point GPU.
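Quick numpy sketch of why nobody cares about the lost accuracy (toy illustration, obviously not what the actual silicon runs):
[code]
import numpy as np

d = np.linspace(0.1, 1000.0, 5, dtype=np.float64)

exact = np.sqrt(d)                       # IEEE double: ~16 significant digits
sloppy = np.sqrt(d.astype(np.float16))   # half precision: ~3 significant digits

rel_err = np.abs(sloppy.astype(np.float64) - exact) / exact
print(rel_err)  # on the order of 1e-4: useless for physics, fine for neural nets
[/code]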
Kevin Hall
Will it blend?
Cooper Young
The x86 manuals are 460+1529+1964 = 3953 pages. It's time to throw that junk in the trash.
William Bailey
I've had an idea rolling around, but it really needs fleshing out. No doubt it could integrate basically all these different methods.
Ryder Davis
Obviously, as the fallout of the silicon cul-de-sac we drove our CPU designs into continues to spread through the tech sector, casting a pall of crushed hope over everything, innovative intermediary parties such as Alphabet will design specialized processing hardware to solve the pressing problems faced by their software engineers while they wait for a process-material switch.
The interim period will, in other words, see systems become complex webs of interposers linking a plethora of interlocking ASICs, coordinated by the CPU, each solving a specific problem.
Software companies, increasingly tied to specialized hardware platforms, will become specialized affairs themselves, allowing for greater market diversity and competition.
The immediate future is bright.
Brayden Murphy
So it's pretty much a GPU that takes shortcuts to make specific AI computations faster?
Noah Martinez
It says in the article it's not meant to replace the CPU and is an ASIC for machine learning.
Henry Young
tree murderer
Nathaniel Powell
Doesn't IBM already have an x86 translation solution for their POWER processors?
Luis Sanchez
>OP, not happy with being only a huge fucking faggot, makes a post to prove he is also a dense motherfucker.
Nigger, that's a great chip for doing shit like bitcoin mining or even bruteforcing something; it has little to do with a CPU or a GPU. It would be interesting if anyone could buy these retail like a regular consumer CPU at a competitive price. I bet a lot of people doing research work and simulations would be salivating over them, besides the usual ruskie/chinaman scammer.
Kevin Gray
>using google perma-beta products
LMAO
Andrew Ortiz
>TPU Tensor Processing Unit
>PCI-E interface
>ASIC technology
>inaccurate 16bit floating point
>sized to fit a hard drive caddy
>8 year tech advance
>hardware in use for over a *year*
This is old info. They must have revved this 5 times by now.
Michael Rodriguez
It's just clickbait. Why are you still in this thread?
sage
Noah King
Yeah, because I really need to be watt-efficient doing all my machine learning while running the browser and Steam.
Ayden Howard
...
Jose Cook
1st post
enjoy your botnet!
Jose Ortiz
>botnet
but all it can do is work on tensors
Ethan Bennett
>inaccurate float calculations
>bitcoin mining
>bruteforcing ANYTHING
I got some bad news for you
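Hashing needs bit-exact 32-bit integer math, and half precision can't even represent the constants. Quick numpy check (illustrative, not actual mining code):
[code]
import numpy as np

# SHA-256 round constants are exact 32-bit integers.
k0 = 0x428a2f98          # = 1116352408

print(np.float16(k0))    # inf: doesn't even fit in half precision (max 65504)
print(np.float16(2049))  # 2048.0: integers above 2**11 already get rounded
[/code]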
Carter Gonzalez
Yes, AI training needs to burn through a ton of data, but the precision required for 90% of the workload is very low, which means you benefit from combining compute precisions. Nvidia made a point of this last year when they claimed Pascal would have 10x the AI training performance of Maxwell, primarily due to its ability to process multiple precision levels from FP16 to FP64 simultaneously, using the one best suited to each task.
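The classic illustration: store in FP16, accumulate in something wider. Toy numpy sketch (my own made-up example, not Nvidia's actual scheme):
[code]
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000).astype(np.float16)  # data stored in fp16

# Naive fp16 accumulation: every add rounds to ~3 decimal digits.
acc16 = np.float16(0)
for v in x:
    acc16 = np.float16(acc16 + v)

# Mixed precision: same fp16 data, fp64 accumulator.
acc64 = x.astype(np.float64).sum()

print(acc16, acc64)  # the fp16 accumulator drifts visibly from the fp64 answer
[/code]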
Evan Rivera
Skynet confirmed, need to stop Google, save the world.
Sick of hearing about technology from the media, they're fucking useless.
Juan Morris
Reminder that it took Intel a whole decade to make the Pentium 4 finally faster than the Pentium III.
Isaiah Kelly
...
Easton Long
>not quantum
meh
Cooper Lopez
ayy lmao
Thomas Martin
>Google's Tensor Processing Unit could advance Moore's Law 7 years into the future
Oh, the clickbait and sensationalism.
David Evans
It's over, the NSA won. See ya in the FEMA death camps.
Jaxon Cox
8-bit fixed point; you don't even need floating point.
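For inference you just map each weight onto 256 evenly spaced levels. Minimal sketch of generic affine quantization (textbook scheme, no claim this is Google's actual one):
[code]
import numpy as np

def quantize(w, bits=8):
    """Map float weights onto 2**bits evenly spaced levels."""
    lo, hi = float(w.min()), float(w.max())
    scale = (hi - lo) / (2**bits - 1)
    q = np.round((w - lo) / scale).astype(np.uint8)
    return q, scale, lo

def dequantize(q, scale, lo):
    return q.astype(np.float32) * scale + lo

w = np.random.randn(6).astype(np.float32)
q, scale, lo = quantize(w)
print(w)
print(dequantize(q, scale, lo))  # off by at most scale/2 per weight
[/code]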
Adam Ward
>advances machine learning capability by a factor of three generations.
I don't care.
Isaac Hughes
By how many generations does it advance the botnet?
Xavier Harris
Really? I had AMD during that time so I can't recall.
Ethan Anderson
So what sort of FPS boost will I see when I buy one at Best Buy? Should I replace my CPU or GPU?
Henry Diaz
Where do I find these three?
Benjamin Hernandez
>Google randomly coming out with a world-breaking CPU now
How many people sold their souls to the devil to make Google what it is today? The rate at which they advance into markets out of nowhere is astounding!
Jackson Miller
It's for a specific purpose, machine learning, and since it's probably not going to be compatible with consumer motherboards, it's just another ASIC that will never be used in consumer OR enthusiast PCs. It's nothing fucking new.