What went wrong?

weak cores

black

They named them Black edition. Common knowledge that blacks don't work and even if they do, they don't do it well.

The idea that processors were already fast enough, and that all that was left to do was cram in many cores for a few bucks. So AMD shat out an architecture with two integer clusters but only one floating-point cluster per module. They also didn't improve IPC or efficiency; Bulldozer was worse than the previous generation in many respects.

Tbh, they weren't wrong. For any normal user, Bulldozer desktop and laptop products still perform well enough today, but obviously it's nice to have an even faster CPU for a better experience. Intel's huge lead in single-core performance and efficiency made AMD's effort obsolete, especially since Intel would also sell their own bad chips (the "modern" Celerons, Pentiums, and whatever crap there is) for cheap, and even their budget chips often surpassed AMD's midrange offerings.

/thread

This. Poor single-thread performance combined with an insanely high TDP made it an epic failure.

>making many cores
I don't think it's that they didn't try to make their cores faster; I think they simply weren't able to, and tried to make up for it with more cores. And they figured developers would "catch up" and start utilizing them.

The funny thing is that these FX CPUs will probably perform better in new games and applications than they did at release. Intel finally got a clue and made their i5s and i7s 6-core. Future software will probably be a lot better at utilizing all cores than what was common when those FX CPUs were popular.


Hardware Unboxed has benchmarked a few FX chips in modern games, and they lose just as badly to Intel chips of the same era, if not worse. Good multi-core scaling requires good single-core performance, at least in games.
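The scaling claim above is basically Amdahl's law: if some fraction of each frame can only run on one core, piling on cores stops helping fast. A quick sketch with a made-up serial fraction (not a measured FX number):

```python
# Amdahl's law: overall speedup from N cores when a fraction
# `serial_fraction` of the work can only run on one core.
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Hypothetical game where 40% of each frame is single-threaded:
# even 8 slow cores give barely 2x, so per-core speed still rules.
for cores in (2, 4, 8):
    print(cores, round(amdahl_speedup(0.4, cores), 2))
```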

whatever you say Lenny.
youtube.com/watch?v=eu8Sekdb-IE

>2013

lel

Yeah, in a thread about the 6300, I'd better post benchmarks from the future, not from the year the CPU was released.

the topic was modern games faggot

kek, I thought they were 'fast'

Nothing. Still running one on my second computer. Performs beautifully. A little too weak for multitasking and shit but it's more than enough for music editing and even substantial gaming.

Housefire and terrible IPC due to a meme architecture.

The name is a warning.
>F indicates failure
>X is used to cross out a mistake

I don't recall Nvidia's FX series being too great either. The name is cursed.

The original Athlon FX's were good if you had $1000.

IPC per ALU/AGU actually improved with Bulldozer; it's just that each integer core had only 2 ALUs and 2 AGUs versus 3 and 3 on Phenom. The fact that IPC came out only about 5-10% lower is remarkable.

Zen has 4 ALUs and 2 AGUs, btw, but each AGU is dual-issue on loads.

Literally nothing. My 6300 is a beast that handled pretty much anything for a long time and overclocked like a champ.

I knew a guy that had one back in the day. His mom had won like 3mil in lotto. FarCry never looked so slick on his preemo lcd

The money tore his family apart (dad got cucked lol). Interesting seeing an FX in the wild tho

>design, build and sell multi core CPUs
>majority of software can only use one core
What does the computer industry mean by this?

Multithreaded design was in its infancy, and doing it right still isn't exactly easy. It takes time to adapt to new things.

>Good multi core scaling requires good single core performance, at least in games.
That's only because most games, especially of that era, were not truly multithreaded. The main loop was, and still is, largely tied to a single (hopefully powerful) core that offloads only tiny pieces of external calculation to other cores.

With multithreaded rendering coming, however, this ought to change.
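That offload pattern (one main loop, small jobs farmed out to other cores) can be sketched with a thread pool. Everything here is illustrative, not from any real engine:

```python
from concurrent.futures import ThreadPoolExecutor

def expensive_side_task(n: int) -> int:
    # Stand-in for pathfinding, audio mixing, decompression, etc.
    return sum(i * i for i in range(n))

results = []
# The main loop owns the game state; workers only get small,
# independent jobs, mirroring how games of the era used extra cores.
with ThreadPoolExecutor(max_workers=4) as pool:
    for frame in range(3):
        job = pool.submit(expensive_side_task, 1000)  # offloaded work
        # ... the main thread would run simulation/rendering here ...
        results.append(job.result())  # sync back on the main loop

print(results)
```

The main thread still gates every frame at the `job.result()` sync point, which is exactly why a slow core hurts even with offloading.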

no white edition

>haswell perf 4 years later
>moar coars
It's not really that different.

Nothing? I see them used all the time to this day

1. Bad architecture type: Bulldozer was a "speed demon" type where you have long pipelines with longer latencies clocked at high frequency to make up for it. Except they couldn't clock it high enough to make up for the single-core performance gap versus Intel. The high frequencies also meant more heat which made it really suck for notebooks.

2. Bad cache structure: Bulldozer had very little instruction and data cache compared to, for example, Phenom. They thought adding a fuckload of L2 and L3 cache would offset it, but AMD's L2 and L3 caches have traditionally had more latency than Intel's, and it sucked. This was fixed in later revisions, I think.

3. Only one dispatch unit per module: this meant it had to alternate between the two integer "cores", and coupled with Windows's pants-on-head retarded scheduler, performance suffered across the board. This was fixed after Piledriver, so the later APUs didn't have this problem and performance went up 30% in some tasks.
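Point 3 can be turned into a toy model: with one dispatcher alternating between the module's two integer cores, each core is fed only every other cycle. This is a deliberately crude sketch with hypothetical numbers, not a real front-end simulation:

```python
# Toy model of a shared vs. per-core dispatch unit.
# Assumes one instruction dispatched per dispatcher per cycle.
def cycles_to_dispatch(instructions_per_core: int, cores: int,
                       shared_dispatcher: bool) -> int:
    if shared_dispatcher:
        # One dispatcher alternates between cores: total work serializes.
        return instructions_per_core * cores
    # A dispatcher per core feeds every core each cycle.
    return instructions_per_core

print(cycles_to_dispatch(100, 2, shared_dispatcher=True))   # 200
print(cycles_to_dispatch(100, 2, shared_dispatcher=False))  # 100
```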

My FX-4300 did well in most games and is certainly not a housefire with a 95 W TDP maximum.

>haswell performance when it easily outdoes Kaby Lake on a lot of workloads with equal threads and cores

That was a lay up

Moar cores aren't always the best option when they execute significantly fewer instructions per clock than the competition.

What AMD did right with Ryzen was exactly fixing that, losing only a small percentage of performance to Intel.
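The arithmetic behind that trade-off: rough throughput scales with cores used × IPC × clock, so in lightly threaded work extra cores can't cover an IPC deficit. The chip figures below are invented for illustration, not real FX or Intel specs:

```python
def throughput(cores_used: int, ipc: float, ghz: float) -> float:
    # Billions of instructions per second, ignoring memory effects.
    return cores_used * ipc * ghz

# Hypothetical chips: many weak cores vs. fewer strong ones.
weak = {"ipc": 1.0, "ghz": 4.0, "cores": 8}
strong = {"ipc": 1.8, "ghz": 3.5, "cores": 4}

# A game using only 2 threads: per-core strength decides the winner.
print(throughput(2, weak["ipc"], weak["ghz"]))
print(throughput(2, strong["ipc"], strong["ghz"]))
```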

I have one, it works great

>easily outdoes Kaby Lake on a lot of workloads with equal threads and cores
Joke of the year

Nothing, because you can pick up this CPU along with a mobo and memory for less than $100.

I still use an FX-6300 Black Edition for gaming, rendering, and music.

Nothing, still does the job.

Nothing. People who knew what they were buying and why got incredible value for their money across the system. Motherboards were half the price of Intel mobos, in addition to the FX line of chips being dirt cheap in comparison. If you wanted a multi-core chip to run multi-threaded applications, you saved hundreds of dollars and still have a beast of a system today, especially if you took the time to learn how to overclock your chip properly.

I've been running an FX 8350 at 4.7 GHz for about 6 years now, with no planned retirement of the system. Around 2020 I'll (maybe?) assemble a Ryzen 2 system, but the FX 8350 will just be moved to a different role. I doubt it will fail before 2025, and it has more than enough computing power to stay useful to 2030 or later.

The people who get all salty over the FX line are the fagchildren that bought the chip thinking they could use it for "whatever" for "free" instead of what it was designed to do. And didn't bother to get educated on how to tweak and use it properly. The same fagchildren that look at a Chevy 450 cubic inch V8 pickup and say to themselves "Well, fuck. If it's got enough balls to do heavy farm work, it will be amazing when I run it in the Indy 500 and BTFO all those retarded rednecks that don't into building race cars."

> "What went wrong"

Morons that have no fucking idea what they're doing complaining that someone else didn't design what they wanted and give it to them for free. Poorfags who took a cursory look at all the performance benchmarks & tech specs, then decided to ignore them because "Im'a cheap the fuck out and be amaze because not one of the tens of thousands of industry professionals know more about how to use these chips than I do."

99.9% of the crybaby faggots are angry tweens and basement dwellers that built performance workstations and tried to use them as 133tz gaymeme rigs.

No sympathy. Shut the fuck up and neck yourselves already.

There is not one single game this cheap-ass CPU can't handle, so nothing really went wrong.
Hell, I still think I should've paid extra for it and OC'd.
t. athlon x4 845

>built performance workstations and tried to use them as 133tz gaymeme rigs.

But that's where you're wrong kiddo, if they built performance workstations they would have gone with intel and overclocked the shit out of their cpus.
t. user with an overclocked X5670 that beats any Bulldozer CPU even though it's almost 10 years old by now.

If it performs better then, it will perform better now. Vulkan and dx12 is only a bonus.

>X5670
>comparing a server processor with a gazillion cores to a desktop one with only 8
500 horsepower is more than 90, I guess.

New? Where?

Nothing. I used to make extra money building PCs because these were so cheap.

I constantly got comments about how blazing fast and awesome the computer was too.

You can't be serious

Because it was a failure to the point that no one wanted to think about it. Hell, I remember at one point Intel had well over 80% of the market share.

>failure
Name one game or piece of software this processor can't run

"Run" even a Core2Quad can.

>Over a decade later
>Still can't get multithreading right
>Can't even get x64 right
The absolute state of the Tech industry

Yeah, the difference being the FX-6300 will never drop below 30 fps in any game, whereas the Core2Quad will freeze your OS as soon as you open the Steam overlay.

Commercial failure, retard. And you must be kidding if you think it can run modern games just fine.

This.

Brainwashed intel shill. Told you fag, name one game the FX-6300 can't run (at least at 30fps stable)

You must be kidding.
youtu.be/9r84HcI9cf4

Bad branding, big TDP

>90+% CPU usage
AAAAAAAAHAHAHHAHA
Yes, that's what I was talking about

>it can run every game!
>at least at 30fps 800x600 low of course
kys

Sorry, AMD doesn't suffer from Intel's fixes, so I don't get that.


>intel_customer.png

Is that all you got? Go poo elsewhere.

The GeForce FX 5200 with 128 MB of DDR RAM was the best low-end card ever released. Nothing compared to it in its price range; it was destroying the Radeon 9250 in everything. I even played Doom 3 and Oblivion (with the Oldblivion mod) on it.

It's a core2quad, what did you expect?

but 750ti tho

To not be better than an fx-6300

55-60fps vs 40-45fps kek, no wonder no one but absolute poorfags bought that thing back then.
youtube.com/watch?v=9r84HcI9cf4

Wrong video.
youtube.com/watch?v=ARAvaKcoZB8

Nice GPU bottleneck example, dirtbag

R7 360 is similar.
There was nothing similar to fx5200.

>1060
>bottlenecking a FX

>GPU at solid 99% (100%)
OOH MAMA

I don't know what all the shitting on Bulldozer is about; I bought an 8350 in 2014 and it's handled everything I've thrown at it in stride.

As an anecdote, I know not everyone here games, but I'm running PUBG at 100-120 FPS with solid temps.

Would I buy AMD again? Probably not; Intel owns the market now and there's really no competition. But compared to the other CPUs of the era there was no contest: these 8350s won out by a long shot.

>being this retarded