Why did the cell processor fail?

It really seemed like the processor of the future!

Other urls found in this thread:

youtube.com/watch?v=xHXrBnipHyA
yarchive.net/comp/instr_per_clock.html
www-03.ibm.com/ibm/history/ibm100/us/en/icons/petaflopbarrier/
one-blue.com/royalty-rates/royalty_rates.html
ign.com/articles/2013/10/08/playstation-3-was-delayed-originally-planned-for-2005

Did it fail? It seems like a lot of Power8 chips use the same PPE and SPE design.

The short answer is that it was hard to program for.

Mark Cerny gives a pretty good talk about how they got to the PS4 and why they decided to move away from CELL

youtube.com/watch?v=xHXrBnipHyA

>why did the cell processor fail
>PS3 sold more than 85 million units
??? It was a commercial success. It just never took off on other platforms because developers like x86 better.

Bad yields for IBM nodes.
After PS3, bad reputation for being hard to code well for, since the bingbus (the EIB ring bus) connecting the SPEs had terrible latency.
Even worse when Sony cut out one SPE for the shipping PS3, but never defined which one (since it was cut for yield), so you could never quite be sure of the latency between each SPE.

Is this more of an indication of the weakness of modern software engineering or something intrinsic to the technology itself?

>lost $200 per sale
>commercial success

if the ps3 wasn't so shit, it would have sold like the ps4 off its brand loyalty alone

I didn't.

What 16 year old told you that?

How does that count as success? Cell wasn't widely adopted; hell, it was abandoned by its own creators.

Low IPC, shit is worse than AMD's Bulldozer crap.

Most consoles lose money on sales, though with the Xbox One and PS4 they've definitely been better about not losing nearly as much, which is why their hardware is nearly identical.

I think Nintendo's consoles have always been profitable, though I could be wrong on that one.

Hey man this is a super interesting talk, just wanted to say that.

Easy: too difficult to program. Thus not enough programmers...

IBM broke the petaflop barrier with PS3s

It is good as a GPU replacement, but not as a CPU replacement.
It trips like a bitch on conditional branching, making it run certain tasks really, really badly.
I read once that an SPE performed about as well as a Wii CPU running emulator code, for example.
Not "with the clocks scaled": 3.2 GHz vs 729 MHz.

because everyone calls them smartphones now, not cellphones

>in the beginning was the word

How does it perform at cryptomining?

>2006 vs 2011
no shit

Cell was bad in IPC even compared to the CPUs of its time.
It was the Pentium 4 problem, but significantly worse.

yeah, companies are pretty much bruteforcing their way, then when components get cheaper they recoup the losses, hence the 10 year life cycle of consoles

but thanks to cryptocucks expect prices to rise again

Is it a question of quantity or quality?

I remember hearing something back in the day about Sony going to use it for graphics, but they ended up getting an Nvidia GPU since the Xbox 360's GPU performed a lot better.

...

What failed about it? The PlayStation 3 it was built for did just fine, and the handful of Cell-based supercomputers seemed to do their jobs well enough for those who invested in them. It even saw some limited use in niche servers.

Why does everything have to take over and totally reshape the computing landscape like some kind of cancer to be considered a "success" in the eyes of so many people?

It was a weird time.
I think the original plan was to use four Cell CPUs, but with those initial yields, they probably gave up on the idea.

It was a total flop. PS3 only kept going so long because Sony invested way too much into it and were knee deep in shit trying to make back money on software sales while selling the systems at a loss

reminder that IPC has been a shit metric for comparing architectures since the 80s
yarchive.net/comp/instr_per_clock.html

the PPE at least was comparable to K8 though?

The PS3 was pretty good at that protein folding thing (Folding@home), better than any CPU of the time. So it was probably great for crypto too.

Well, I was talking more on a "real world scenario" basis, rather than BogoMIPS crap.
The lack of a branch predictor, the long pipeline, and no out-of-order execution ruin the Cell's performance quite badly, given its DSP-like nature.
But for DSP stuff, the thing is a monster.
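For the DSP side of that, here is a rough sketch of the kind of straight-line SIMD work an SPE chews through: quad-word vectors, fused multiply-add, no branches worth mentioning in the inner loop. The names and the fixed length are just illustrative, and the data is assumed to already sit in local store:

```c
#include <spu_intrinsics.h>

#define N 1024  /* number of 4-float vectors, assumed resident in local store */

/* dst[i] = src[i] * gain + dst[i], four floats at a time */
void scale_and_accumulate(vector float *dst, const vector float *src,
                          vector float gain)
{
    for (int i = 0; i < N; i++)
        dst[i] = spu_madd(src[i], gain, dst[i]);
}
```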

I guess it really depends on what "success" means to you. I don't believe the Cell had any wide-reaching ambitions or goals like, say, the Itanium; it was built for a specific system that still sold in the tens of millions and found its way into several publicized supercomputers, and just looking at the chip itself like that, I wouldn't really call that a flop. Sony might have lost money on the deal, but I don't think IBM did.

It lacked the a17 and a18 coprocessors to make it perfect.

cell wishes it had even 1/4 the IPC of a pentium 4

yeah I can see where you meant it in a ballpark sense now
(couldn't help myself with the opportunity to spam a yarchive tl;dr though, some of that shit is pretty cool to read)

IBM and co did have some pretty large ambitions, given that the processor was set up for supercomputing tasks and there were Cell servers and PCIe cards for sale in the mid-2000s.

No, they had really big ambitions for Cell; it was supposed to go into everything from computers to appliances to cars. The idea was they'd all link together, and shit like your microwave could make your PS3 faster.

Because it's a hybrid between a GPU and CPU, but it got the worst of both worlds.
The future was in CPUs with a handful of powerful cores, and highly parallel calculations being performed on the GPU.

But was that intent, or just the usual press release marketing bullshit that follows any kind of technology product? I never remember IBM pushing the Cell in a way you'd expect them to if they genuinely wanted it to become widely adopted. I've never personally heard of any variant of the Cell that was actually viable for use in embedded applications (or any attempt at one) or really any kind of effort to sell them outside of the PS3 and a few niche systems.

I'm not trying to say you're wrong here, I just really don't remember IBM or anyone else giving a shit about the Cell outside of its intended application (the PS3) and a few specific HPC applications.

Furthermore, they wanted to use Cell for both CPU and GPU tasks by creating a special game engine themselves for all developers to use. Then developers with some sense objected to the absurdity of this idea, because Cell was simply not enough to handle both, no matter how "powerful" it was or how well you optimized the code. Yes, the CPU was powerful, as we have seen in The Last of Us or Uncharted; it was able to produce quality games even 8 years after its release. But it was simply not enough. Maybe they were planning to use multiple Cells, but they probably realized it would be a real oven to put a few Cells on the same board. I can't imagine the price of the VRMs to handle the power draw, let alone the cooling solution, which would make the console 10 kg and still wouldn't be enough. Also remember the fat PS3s that kept failing due to heat.

So it was a dream. Maybe more modern Cell implementations and redesigns on smaller process nodes would have met the goal, like a Cell 2 or something. But Sony basically didn't anticipate that GPGPUs would become a thing around that time.

RIP Cell, it was used for teaching MIT students multithreaded programming.

>Also remember the fat PS3s that kept failing due to heat.
I've got the latest one (super slim) and it has overheated 3 times already and needed its thermal paste reapplied on top of cleaning. I'm not alone in this; there are even fan speed mods you can buy, and the slim also had overheating issues.
The PS3 is planned obsolescence in all of its variants.
Not as bad as the Xbox 360, which has a failure rate of about 1/3, but too close to be disregarded.

So long story short:
That made me think that cell processors are absolute dogshit and must go extinct.

the PPU on the Cell was outstanding when it was designed in 2003, total dogshit when it came out in 2005-2006.

also, not enough SPUs - they should've put in hundreds of smaller ones and kickstarted the whole pixel shader meme.

> processor of the future
> not IA-64
absolute state of Sup Forums

>Sold 85 million units
>Every game comparison showed a notable drop in performance and quality which brought it almost in line with the Nintendo Wii
>Wii has sold ~100 million units

I think I see a pattern.
Now introducing: THE POTATO VISION CONSOLE. EVERYTHING IS PIXELS, BUT REALLY BIG ONES. OH WOW.
>200 million units later.

>why did the cell processor fail?
The design was absolutely terrible. It lacked basic stuff like branch prediction that other processors have had for decades. The memory model pushed a lot of work back up to the software developer that would normally be handled by a proper memory management unit. The SPEs, the units that were supposed to make parallelism magic, were too weak for the task.
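For anyone wondering what "the memory model pushed back up to the developer" means in practice: SPE code can't just dereference a main-memory pointer, it has to DMA data into its 256KB local store, wait on a tag, do the work, then DMA the results back out. Rough sketch only; the buffer size and names are made up:

```c
#include <spu_intrinsics.h>
#include <spu_mfcio.h>

#define CHUNK 4096
static volatile char buf[CHUNK] __attribute__((aligned(128)));

void process_chunk(unsigned long long ea /* main-memory address from the PPE */)
{
    const unsigned int tag = 1;

    mfc_get(buf, ea, CHUNK, tag, 0, 0);   /* main memory -> local store     */
    mfc_write_tag_mask(1 << tag);
    mfc_read_tag_status_all();            /* block until the transfer lands */

    /* ... compute on buf entirely inside local store ... */

    mfc_put(buf, ea, CHUNK, tag, 0, 0);   /* local store -> main memory     */
    mfc_write_tag_mask(1 << tag);
    mfc_read_tag_status_all();
}
```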

Middleware developers managed to abstract away the worst of it. Neither IBM nor Sony fielded a major revision of it.

I don't get it. You have to code assembler stuff for Cell in normal game development?
Is it hard to code in C++? I don't get it.

They demonstrated it running better than the 7800 GTX core they wound up using for the GPU, but only with specifically optimized custom engines. The real problem was that the Xbox 360 used the first unified-shader GPU from ATi, whereas the 7800 GTX had a fixed split between vertex and pixel shaders, which meant it was stuck being slower no matter what.
I remember when they first came out with fully programmable shaders, it was an exciting development. Overnight, DX9 games went from needing $600 GPUs to running extremely well on $200 GPUs.
Half-Life 2 had never run so smooth for so cheap.

Game developers were not competent at good software architecture design and especially multithreaded programming, so they failed to adapt to Cell early on. The ones that were competent really got ahead in the following years; it was a leap for them. Nowadays, game developers are even shittier: they simply use shitty engines with zero optimization, and games literally consume 10GB of memory while pegging a single core at 100% load. They are not even trying to optimize their shit anymore. The most recent optimized engine I know of is the Fox Engine, I guess. Works like a charm, even on PS3.

It was a good idea at a bad time; GPGPUs started becoming a thing at around the same time Cell came out and overtook it.
The PPE's sort of neat. I recently found out it's an in-order core with 2-way SMT, and in addition to the 7 SPEs it's also got an AltiVec unit.
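Side note on that AltiVec unit: the PPE's VMX does the same kind of 4-wide float math as the SPEs, just through altivec.h instead of the SPU intrinsics. Tiny sketch, name made up:

```c
#include <altivec.h>

/* per-lane a*b + c on the PPE's VMX/AltiVec unit */
vector float fma4(vector float a, vector float b, vector float c)
{
    return vec_madd(a, b, c);
}
```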

Building an engine from scratch is expensive and difficult. Doing so for a weird, and non-portable architecture ups the business risk ante even more.

Most devs want to make games not fuss around with low level engine stuff.

Yes, but the massively pipelined Pentium 4 sucked dog testicles when it came to system call performance. So much so that they actually reduced the number of pipeline stages going from NetBurst to Core.

>now after Spectre they all know about speculative execution

I've still got my original preordered 60GB PS3 and it's in fine working order. I think the issue was assembly/manufacturing standards dropping in the late 2000s, with the advent of Apple's Chinese sweatshop products being sold at a fucking premium and making billions.

I don't believe you, but if you're happy I'm happy.

I've heard the issue was the solder. The 7th console gen was the first to be affected by RoHS compliance and early lead-free solder sucked.

I would never tell lies on the internet, of all places

Sounds about right. Mine was probably a demonstration model they put together before they got the sweatshop workers to copy it, lol

There was no simple "plug this into the code and it will manage everything" for a programmer on Cell; you had to say which code would run on which core, and you had to make everything work in tandem.
It didn't help that Sony also wouldn't give out full documentation for it, because they wanted devs to learn it over time and make better and better games as a result of knowing the hardware better.

I can't imagine it was much friendlier on the programmer side in non-gaming applications.
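On the PPE/host side, "saying what runs on which core" looked roughly like this with IBM's libspe2: you explicitly create an SPE context, load an SPE ELF image into it, and run it. Sketch only; the embedded program handle (my_spe_kernel) is a made-up name, and real code would wrap this in one pthread per SPE:

```c
#include <stdio.h>
#include <libspe2.h>

extern spe_program_handle_t my_spe_kernel;   /* SPE ELF embedded at link time */

int main(void)
{
    unsigned int entry = SPE_DEFAULT_ENTRY;
    spe_context_ptr_t spe = spe_context_create(0, NULL);
    if (!spe) { perror("spe_context_create"); return 1; }

    spe_program_load(spe, &my_spe_kernel);

    /* blocks until the SPE program stops; work is usually fed to it
       through mailboxes or DMA lists while it runs */
    if (spe_context_run(spe, &entry, 0, NULL, NULL, NULL) < 0)
        perror("spe_context_run");

    spe_context_destroy(spe);
    return 0;
}
```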

Only the launch units had that high of a BOM. The revised units lowered costs, and they eventually sold the hardware for a small profit.
The original 60GB model had a bill of materials of $840. The slim version, launched a couple of years later, was $336 at launch, and that cost dropped further over time. There was another revision I'm aware of that was cheaper still.

The first Xbox was an $800 piece of kit they sold at $300.
I don't know about the other gens, other than that the 360 and PS3 sold at losses, though not as heavy; the PS3's loss may have been heavy because standalone Blu-ray players cost around $1000 at the time the PS3 came out.

Whoa sauce?

Cell was a Larrabee before Larrabee. And Larrabee failed.
>Xbox GPU performed a lot better.
No shit it would; it was the first chip with a unified shader model.

>trying to do graphics with a cpu.

If you don't have the infrastructure around an architecture, it's going to fail no matter how superior the new architecture is compared to the old one.
x86 should've been dead a long time ago, but since most software written in the past 30 years was written for x86, and it has the most money poured into supporting it, it isn't.

Larrabee!

What's a cpu? Some kind of ass speculum?

Don't forget about balancing your instruction pipelines; if you didn't, you lost half of your potential performance.
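For anyone who hasn't touched it: the SPU can dual-issue one even-pipe instruction (arithmetic like spu_madd) and one odd-pipe instruction (loads/stores, shuffles, branches) per cycle, so code that only hammers one pipe throws away up to half the throughput. Rough sketch of mixing the two kinds of work; the compiler's scheduler does the actual pairing, and the shuffle pattern and names are made up:

```c
#include <spu_intrinsics.h>

void madd_and_swizzle(vector float *acc, const vector float *a,
                      const vector float *b, vector unsigned char pattern,
                      int n)
{
    for (int i = 0; i < n; i++) {
        /* even pipe: fused multiply-add */
        vector float t = spu_madd(a[i], b[i], acc[i]);
        /* odd pipe: shuffle/permute, plus the surrounding loads and stores */
        acc[i] = spu_shuffle(t, t, pattern);
    }
}
```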

>pic
This is fucking retarded.
Who greenlit this fucking design?

Ken Kutaragi and IBM.

It's conceptually sound; it was the first step towards heterogeneous computing between CPUs and GPUs.
Today we have it with HSA and it's much more streamlined. Both the Xboner and PS4 benefit from what started with the CellBE.

Fuck them.
>It's conceptually sound
Just like Larratrash!
>Today we have it with HSA and it's much more streamlined.
No shit, but this is done with a _separate_ CPU & GPU.
>Both the Xboner and PS4 benefit from what started with the CellBE.
They benefit from AMD doing all the weird Fusion stuff.
CBE left no legacy.

GPGPU

What makes people think it was any good?

>Ken Kutaragi
He genuinely thought forcing developers to use weird hardware would produce gaming miracles. Using a major platform as a testing ground for chip architecture experiments is a bad idea. Sony rightfully fired the guy shortly after Microsoft started seriously taking chunks out of their market share. MS SDKs have always been top notch, and devs will gravitate towards tools they like. IBM's octopiler never materialized.

Lock-free parallel algorithms and the rest of modern multithreading wouldn't be ready for another decade. Even then, a revised Cell would still be weak compared to its competitors.

>IBM broke the petaflop barrier with PS3s
www-03.ibm.com/ibm/history/ibm100/us/en/icons/petaflopbarrier/

IBM. Blame those dumb cunts.
Their monolithic nature led to Cell not going anywhere.

Only Sony really did shit with it, and they made a killing off it.
A big issue, though, was that Sony gimped Cell by making the SPEs' local memory too small. It really needed to be 512KB. That would have worked insanely better than the shitheap it ended up being.
Cell in the PS3 was horribly underperforming because of that.
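The standard way devs coped with the small local store was double-buffered DMA: kick off the transfer for chunk i+1, then compute on chunk i while it flies. Rough sketch; the chunk size, names, and the compute() callback are made up:

```c
#include <spu_intrinsics.h>
#include <spu_mfcio.h>

#define CHUNK 16384
static volatile char buf[2][CHUNK] __attribute__((aligned(128)));

extern void compute(volatile char *data, unsigned int size);

void stream_from_main_memory(unsigned long long ea, unsigned int nchunks)
{
    unsigned int cur = 0;

    mfc_get(buf[cur], ea, CHUNK, cur, 0, 0);          /* prime buffer 0     */

    for (unsigned int i = 0; i < nchunks; i++) {
        unsigned int next = cur ^ 1;

        if (i + 1 < nchunks)                          /* prefetch chunk i+1 */
            mfc_get(buf[next], ea + (i + 1ULL) * CHUNK, CHUNK, next, 0, 0);

        mfc_write_tag_mask(1 << cur);                 /* wait for chunk i   */
        mfc_read_tag_status_all();

        compute(buf[cur], CHUNK);                     /* overlaps with DMA  */
        cur = next;
    }
}
```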

Cell was actually a bit of a success on the architecture front.
Most of the ideas are used in modern hardware to various extents, GPGPUs most of all (mainly the pipeline, admittedly, but simple programmable cores connected to said pipeline).

I will forever hate IBM for letting Cell die.
I used to be active on developerWorks for a while. No longer. Fuck them.

This was after the initial hurdle the first gen developers had to deal with.
Once the octopiler came out, it was trivial to program for.
It is no more complicated than any multi-core design of today.
It was just procedural-babbies whining. They still whine today, in fact.

>RoHS
Fuck that shit meme.
ALL it has done is lead to MORE electrical waste sitting around, causing more damage to the environment than the fucking toxic elements!
FUCK ROHS.

You'd think that the industry would learn. Precisely zero out of zero times has a "weird" architecture been successful.

Worse than GPUs at doing GPU shit
Worse than CPUs at doing CPU shit

Faggots never learn.
I want to kill every nigger shilling VLIW abortions like the Mill.

"Here, have a data center gpcpu with a weird architecture and absolutely zero documentation, dont worry tho, we will give some documentation in like 4 years lol, only we have access to it right now"

But can I mine bitcoin with it efficiently?

No, you can't mine on it.

>He genuinely thought forcing developers to use weird hardware would produce gaming miracles.
He believed this because of the PS2's success. What he never realized was that the PS2 succeeded for various reasons IN SPITE OF its assfuck retarded hardware. The PS3 was doomed before it ever left the drawing board.

PS3 was decently successful in the end though.

What would it be like at its full planned potential?

And then Microsoft went and put their very own Crazy Ken in charge of Xbox and completely tanked their business, pissing away all their hard work during the previous generation.

Pottery.

>Blaming a chip for poorly designed cooling

>Blu-Ray drive @ 66 USD.. When Sony owns blu-ray kek.
also
>Other Materials 79 USD, is that 79 USD worth of plastic..?

>When Sony owns blu-ray kek.
You're a retard, Sony is a member of the Blu-ray consortium, they don't own Blu-ray.

Sony owns about a tenth of the patents and rights to Blu-ray; seven members of the consortium are companies and the last member is a university. It is interesting that Sony is paying more for a Blu-ray drive than a cheap Blu-ray player costs in total, considering the consortium members don't have to pay royalties, and even if they did, the license would cost $8.00. But again, Sony, Thomson, Samsung, LG, etc. do not have to pay the rate due to cross-licensing.

one-blue.com/royalty-rates/royalty_rates.html

The PS1 was weird as fuck too, but so was everything else at the time; everyone had weird stuff out. The PS1 really only stood out for getting the amount of innovative stuff per dollar right, enough to be a moderate success.

Gamedevs hate weird stuff.
That's why the 360, with its simple tri-core PPC CPU and a good GPU, was oh so much easier to develop for.

Not really.
It was the most "vanilla" console released until the Sega Dreamcast.
Regular-ass T&L math, regular-ass texture support, the hardware does ALL the 3D work; you could extract the whole 3D performance of the system with C.
People only got lost with it back then because six months earlier they were doing 16-color sprites, and finding 3D artists was an impossible task because only SGI machines could run decent 3D modeling software, and those cost a fortune.

I don't understand. He came straight off working on the PS2, the most successful console ever, and then didn't think once that the PS3's design was a massive mistake?

Like, I can understand being sequestered enough to not really care, but you would think five minutes away from the project would be enough to realize how much bullshit it was.

Reminder that the original PS3 concept was to use the Cell's SPEs in place of the GPU

I don't get this meme rumour, that was never the case.

ign.com/articles/2013/10/08/playstation-3-was-delayed-originally-planned-for-2005

the PPC core scheduling the SPEs seemed like a good idea until they got engineering samples and realized it was a catastrophe of a design

people make mistakes when they like something and want it to work out

ya but he also says how proud he was of having a game engine that stomped all over other 3rd parties who hadn't even seen the hardware or anything yet. Like, no shit: you had direct access to the development of the hardware, I would fucking hope your shit would be good.

Wouldn't you want 3rd parties to actually be able to develop games that are as good as or even better than your own?