Why did IBM abandon Cell? It was way ahead of its time...

Why did IBM abandon Cell? It was way ahead of its time. I mean, the year is 2004 or 2005 and game developers were still strangers to multicore programming concepts. And it had eight pretty strong cores that could consume loads of data per cycle. A pretty different architecture, designed solely for multithreaded workloads. All the developers did was whine about how hard it was.

They were still planning a second revision of Cell, but for some reason they abandoned it.

Now you see PS4 hardware became deprecated within a year due to bloated x86 shit. Nothing special anymore, no flexibility.

Cell was shit when it was new, and it's still shit today

Because x86 is cheaper now

cell was powerful for its time. it had an original concept that mainly focused on huge data processing.

>Now you see PS4 hardware quickly became deprecated in one year due to bloated x86 shit.
>due to bloated x86 shit.
No, it's because consoles have always been outdated pieces of shit but with the new games it really shows

IBM abandoned the Cell because people simply weren't interested in it. Sure, it had good floating point maths performance and all, but GPGPU clusters were coming in around that time and pretty much killed Cell setups in floating point performance and did so without the vendor lock-in.

Sorry to break it to you, but big mainframe style hardware is no longer the way to go in scientific computing. Sure, they still have their uses in things like super high speed transaction processing, but commodity or commodity derived hardware has taken over most of the specialized hardware markets.

Sony sold the PS3 at a huge loss at the beginning, I bet they are making big profits with selling the PS4 hardware now

The reason PS4 became outdated so fast is because Sony didn't want a repeat of the PS3.

When the PS3 first came out Sony was losing money on each console sold and hoping to make it back on game licensing. This didn't pan out and Sony wanted to avoid this on the PS4. So they opted to use shit hardware to try and profit on each console sold.

GPGPUs

Cell couldn't compete with much cheaper GPUs from Nvidia and AMD.

I think the uncharted devs made really good use of it

It failed because it had no hardware mutex locks to synchronize memory across cores. The major pain point for libraries like MPI is that they require explicit communication between cores/clusters. People were writing mutexes in software, which is ungodly slow.

With only 8 cores, something a shared cache can support easily, it has little to offer. There are plenty of processors and clusters with high-speed communication and no shared memory that spanked Cell even when it was new.

>Tldr: cell is shit now and was shit then.

Isn't the Cell CPU almost as powerful as the PS4's Jaguar too?

haha
no

Cell only had one CPU.

The 8 other cores were SPEs (an SPE is kind of like a gimped CPU that can only do floating-point math).

x86 and desktop hardware is a competitive scene where even the highest-end hardware can be totally deprecated and replaced the next year. PS4 hardware was born dead because of that. They should have gone with specialized hardware, focused solely on game programming. Well, it's their decision if they want to release tons of revisions of PS hardware.


SPE cores had no branch predictors, and that was one of the hugest pains in the ass. Just like shader programming, your game workload had to be as branchless as possible.

Cell does not = Gaming
It's good for servers and shit, that is why the 3 PowerPC Xbox 360 cores were better.
When they managed to master the Cell though, they came out with some amazing games like The Last of Us.

>Naughty Dog

No surprises there, they've always been at the top of trying to squeeze performance out of consoles.

the SPEs on cell were watered-down weakshit: half-assed, unable to do anything well on their own, and hard to utilise because of the extreme verbosity required to use them.

They're less capable than a full CPU core, and less abundant than programmable GPU compute hardware (which had just made its first real entry in the form of ATI's Xenos Xbox 360 GPU, then later on PCs in Nvidia's Tesla GPUs and ATI's own TeraScale).

who gives a fuck i've got 11 teraflops of theoretical performance.

and we still can't emulate ps2 games perfectly.

call me when 20-30 teraflop cards are in stock and a console breaks 5.

>no branch predictors

Does any GPGPU have branch predictor?

Or, do you have to use Xeon Phi if you want something like that?

>It's good for servers and shit

nope

Even on ps2 they had great success for performance

Not even in specific tasks?

all-things-andy-gavin.com/2011/02/02/making-crash-bandicoot-part-1/

Very interesting read.

>SPE cores had no branch predictors that's one of the hugest pain in the ass. Just like shader programming, your game work load had to be as branchless as possible.

I didn't know that. That's fucking hilarious. I mean, I guess that's okay for games and streams where branches are rare... But holy fuck. Branch prediction has been around solidly since the 90s.

No idea, but it doesn't matter anyway since the system memory was only 256MB in total, which is ridiculous. Also the GPU in the PS3 was crap.

pcsx2 emulates perfectly with its software renderer. HW rendering is another matter: they're trying to create a scalable emulation that works on the GPU, which means you're "theoretically" able to run Metal Gear Solid 3 at 4K @ 60fps.

and the metal gear games of course. they squeezed the performance out of the ps2 and ps3.

let's say the branch prediction was weak because of the pipelining; flushing the SPE pipeline is a HUUUUGE hit.

I'm amazed just how powerful the Xbox 360 was

Branch prediction is incredibly cheap and lightweight to do. It's literally just keeping tabs on how many times you branched at a certain instruction and lazily clearing the pipeline if you guessed wrong.

How long was the spe pipeline? It's designed as a stream processor, so I'd assume pretty short.

It seems like the Cell is able to do some things better than the Jaguar on the PS4.
But the higher clocked same Jaguar on the Xbone is a bit stronger.

Yeah, but it's significant if it happens enough.

PCSX2's software renderer is definitely not perfect.

The funny thing is that the Xbox 360 has 3 Cell general purpose cores but with improved floating point.

Ironically, Sony helped to fund the development of the Cell CPU but Microsoft arguably got a better gaming console CPU out of it.

I don't know, but as far as I know SPEs could only access 256KB of memory (that includes both instructions and data). Cell had two parallel pipelines, probably one for consuming data and doing calculations, the other for memory fetches etc. It's still amazing how The Last of Us or MGS4 were able to work with such minimalistic hardware.

It's the most accurate emulation: slow, but everything works as it was supposed to look on the PS2.

Again, stream processor. It expects to do the same shit all day long. However, branch predictors are incredibly low footprint in gate complexity. They could've shoe-horned one in at the last minute and no one would've cared.

>tfw it's been too long since I've seen a thread that doesn't get derailed by GPU fanboys, desktop posters, "guys should i buy it", gaymers. Instead something is actually being discussed.

Devs did not whine about multithreading, op, but about how difficult the arch was to dev for.

it was a pain to program properly compared to x86
the ppu had to send pointers to the spes, and the spes had to read data through DMA transfers into a small 256KB local memory. the ppu and the spes communicated through slow 32-bit mailboxes.
it was very difficult to properly design the memory access patterns, and videogames are pretty random and varied in their access and computation patterns
it is very difficult to make 8 high-throughput FPUs with little memory and no branch prediction run a video game efficiently

Because it discusses something with shit GPU, no desktop and definitely shouldn't be bought nowadays?

why did sony literally waste $400M funding the development of this shit? couldn't they foresee this? I mean, they had been in the game industry for many years; how could they not see how poorly this fit game programming?

They probably realized it at the $200 million point, after having designed a console around it.

In retrospect they should have pulled a Microsoft and asked for more PPEs.

honestly I don't know
maybe the (potential) price/performance was too good for Sony to turn down

>Xbox 360 has 3 Cell
No.

>Xbox 360 has 3 Cell general purpose cores
>core

Jesus Christ, learn to read.

the difficulty of developing games on cell scared many small developers away

No. The SPEs weren't limited to floating point at all.

>tfw PlayStation SDK is still locked up behind 7 layers of NDAs, and you have to e-mail someone at SCEA to get a phone call to discuss how much money they want you to pay before they even give you the NDA papers to sign

It has no Cell anything, they're just PPC.

360 is not cell
it does however share the powerpc architecture with cell but it is not cell
it's amd

In specific tasks it might be a little faster. In most code the Jaguar will be multiple times faster, and that's taking into account the heavy optimization required to get decent perf out of the ppe/spes.

>people saying that ps2 had a 128-bit cpu just because it had 128-bit simd registers
that was bad, I didn't like it

Holy shit, you people are retarded.

en.wikipedia.org/wiki/Power_Processing_Element

>it's amd
you especially

>but everything works natively as it was supposed to be shown in PS2.

No, the software renderer has problems too

>amd

wut? it's still IBM. they literally waited until Cell was developed and picked the good parts out of it. It's PPC with general purpose cores.

wrong article
again it shared elements with cell but it is not a cell processor

en.wikipedia.org/wiki/Xenon_(processor)

this is the xbox 360 processor and it is not cell

please point on the doll where it says cell

>en.wikipedia.org/wiki/Power_Processing_Element
And yet, it is still not Cell.

>big endian

trash it goes

And if you could read and/or had eyes in your skull, you'd see it does not list Xenon as a Cell, and nowhere in the Xenon article is any Cell architecture mentioned, let alone the word Cell

The Xenon is a triple-core PPE with improved floating point.

The PPE was developed for the Cell.

The PPE is the general purpose core of the Cell.

Which part of this do you not understand?

en.wikipedia.org/wiki/Cell_(microprocessor)#Xenon_in_Xbox_360

the article you linked only backs up what everyone else is saying: they share the Power architecture, but Xenon is not Cell, which was made between Sony and IBM

>nowehere on the Xenon article does it have any Cell architecture mentioned let alone the word Cell

en.wikipedia.org/wiki/Xenon_(processor)
>These cores are slightly modified versions of the PPE in the Cell processor used on the PlayStation 3.

Read the article before you talk shit.

No one said it was a Cell, just that it has the Cell general processor unit, the PPE, in it.

Microsoft saw what a steaming pile of shit the Cell was and only wanted the PPE.

>the ppe was designed for cell
meaning ppe would become cell
>the ppe was then put into xenon
meaning it branched into xenon before it went on to become cell

what do YOU not understand?
if ppe is charizard, and cell is charizard x and xenon is charizard y, does that help you understand?

Fuck the Cell

because of it we will never have PS3 emulation

No one is saying that the Xenon is a Cell CPU. My point in the original post was that Microsoft poached the PPE after Sony had paid IBM to develop it.

>No one said it was a Cell

Xbox 360 has 3 Cell

Did you just arrive in the discussion or what? This is the post that started the whole debate, wrongly stating the 360 is Cell

We already have

>has 3 Cell general purpose cores but with improved floating point.

which it fucking does, the PPE is the general purpose core of the Cell.

The PS3's chipset is nearly as powerful as the PS4's. The PS4/ XBone just have more RAM.

There's already one under active development
youtube.com/watch?v=Y6vysVDQJjQ

There's an xbox 360 emulator as well
youtube.com/watch?v=4V8Uy_d6jzA

Yes it is, you idiot. Why do you think the U.S. made a supercomputer from a PS3 cluster?

i see a few things that were wrong with cell:
1. RAMBUS for Cell and GDDR3 for RSX, meaning data would need to be re-encapsulated when copying from CELL → RSX
2. separate address spaces for RSX and CELL
3. only 512MB of RAM
4. weird master-slave (client-server?) architecture
5. 256KB of L2 cache
6. only two instructions per cycle

>only used for sat image processing

Cell was only good at very certain, specific workloads where the task could be coupled tightly to the design of the chip.

It was pretty shit as a general purpose processor.

>It's the most accurate emulation, slow but everything works natively as it was supposed to be shown in PS2.
Wtf, what games have you been playing? The games I wanna play have serious issues, especially Burnout 3.

It was shit, but they really managed to get a hold of the power of the chip.

As a gaming console, this is a good thing
Its processor was great at doing one thing

Get a load of this fucking idiot

well..
remember how many people bought PS2s partly because they were DVD players (there weren't ultra-cheap standalone DVD players from China back then)?
sony wanted to make the PS3 the ultimate entertainment center, having people's consoles hooked up in vast networks to share the load (cell "broadband" engine) and so on

it would be the heart of your home, so sony thought it was worth taking a loss on every console sold in order to gain that presence

it was to be more than just a simple game console

Except it was a 128bit cpu in practice, unlike, say, the n64

Because the PS3 was heavily subsidised. It was not a particularly good design, it just had a good price:performance ratio thanks to Sony spending so much money on it.

lel

so your desktop cpu is 256-bit, because it supports 256-bit avx instructions

you can't create a cpu specialized for running games
depending on the game the cpu would need:
- multiple cores
- good branch prediction (game logic, AI, scripting)
- good cache to help memory access patterns
- good compute performance (physics, matrix computations)
- large amounts of memory
- good I/O performance

also it would need to run the underlying operating system which is exactly what a general purpose cpu is good for

many game optimizations for CPU code are just heuristics, because no one can predict all the possible scenarios a game may run into or what kind of memory access patterns will take place

a cpu specialized for games is just a good general purpose cpu

You can take out the AI bit, no one programs that shit anymore.

>Why did IBM abandon Cell?
Because it's crap.
>And it had a pretty strong 8 cores
lolno. Its one PPE core was weaker than a Pentium III clock for clock, and the SPEs were much, much weaker than it.

>but everything works natively as it was supposed to be shown in PS2.
Ratchet and clank would like to have a few words with you. Those words may include mipmapping.

Jesus, I forgot all about that shit. Why did you have to remind me OP.

Mipmapping and Ratchet and Clank works properly with the software renderer.

Does it? Last I checked it still messed up the textures somewhat horribly.

That was only the hardware renderer. GSDX in hardware mode still fucks up custom mipmaps and will always fuck up custom mipmaps, but the slow software renderer works fine.

Maybe, but unlike AVX, 128-bit SIMD instructions were used constantly on the PS2

Lol, it's always worked perfectly in software

it's perfect in software mode, you just need a cpu with retarded single threaded performance to run it at full speed

>restricted to in-order execution

Last time I tried I was getting some messed up textures but I'll have to try it again then. Yes, I'm reasonably certain that I used software rendering.

IBM shopped the Xbox 360 CPU to Sony, but they wanted IBM to partner with Toshiba to come up with the Cell instead. The end result was 1 super-anemic PPC core and 8 Toshiba SPU cores. They aren't even the same instruction set. There's a reason why Intel doesn't combine x86 cores with ARM cores, even though they could. It's a retarded idea, and it needlessly overcomplicates things.

>you just need a cpu with retarded single threaded performance to run it at full speed
And lots of cores/hyperthreading. I drop down to ~40fps in areas on an OCed 6600k on R&C1 in SW mode.

moderncrypto.org/mail-archive/noise/2016/000699.html

CPUs are optimized for video games. Video games are a huge market---a
market where people pay close attention to performance and adjust their
purchasing decisions accordingly.

Yes, yes, some buyers pay attention to CPU performance for weather
prediction, or movie rendering, or various other important applications.
But most of these applications have the same basic bottlenecks as video
games: most importantly, low-precision multiplications and additions.

Do CPU designers spend area on niche operations such as _binary-field_
multiplication? Sometimes, yes, but not much area. Given how CPUs are
actually used, CPU designers see vastly more benefit to spending area
on, e.g., vectorized floating-point multipliers.

You have that backwards. PPC came before Cell. The Xbox 360 doesn't have Cell cores. It has PPC cores. I'm sure Sony played it as though the Xbox 360 had modified Cell cores, but PPE is Sony marketing speak.