ARM desktop GPUs when?

wat

I'm assuming you meant "CPUs". If so, never: they offer worse IPC than AMD's FX processors and would consume way too much electricity with their dogshit performance per watt.

1980

You can already plug a monitor into a Windows phone.

[citation needed]
If they were actually worse then why are data centers considering them?

amd.com/en-us/products/server/opteron-a-series

Shit, meant Allwinner/MediaTek/Qualcomm.
They must have decent performance per watt since they're used in mobile devices. They could just slap 20 of them on a PCB and call it a day; single-threaded performance on GPUs is largely irrelevant.

For personal computing? Never. It could have happened if Windows RT hadn't flopped, but it did, so it won't.
>If they were actually worse then why are data centers considering them?
Not that guy but performance per watt for low power applications and lower initial cost. ARM is not currently geared for HPC applications.

lenovator.com/product/103.html

>windows
What does Windows have to do with personal computing?
Install GNU.

>singlethreaded performance on GPUs is largely irrelevant
t. AMD

Oh, so you actually did mean the GPUs. Well, there's the whole driver support thing, plus competing with two other well-established companies that both work very closely with Microsoft and Khronos, among other things. It's a large investment for very little potential return.

Not an argument

>GNU
Never has that been less relevant than when discussing ARM. Android is a far better bet.

What GPU would you want on the desktop, Sup Forums?

Mali - PowerVR - Adreno - VideoCore - Think Silicon - Vivante

PowerVR for that sweet tile-based rendering
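For anyone wondering why tile-based matters: the win is framebuffer bandwidth, which is why the design took over mobile. A back-of-the-envelope sketch of the traffic difference (tile size and the traffic model are illustrative assumptions, not PowerVR's actual pipeline):

```python
TILE = 32  # tile edge in pixels; illustrative, real tilers use sizes like 16x16 or 32x32

def tiles_for_screen(width, height, tile=TILE):
    """Split the framebuffer into tiles small enough to live in on-chip memory."""
    return [(x, y) for y in range(0, height, tile) for x in range(0, width, tile)]

def tiled_dram_traffic(width, height, overdraw):
    """Tile-based: each tile is shaded entirely on-chip and resolved to DRAM once,
    so external traffic is one write per tile regardless of overdraw."""
    return len(tiles_for_screen(width, height))

def immediate_dram_traffic(width, height, overdraw):
    """Immediate-mode: every overlapping fragment does a read-modify-write against
    external memory, so traffic scales with overdraw (heavily simplified)."""
    return len(tiles_for_screen(width, height)) * overdraw
```

At 1920x1080 with 4x overdraw the tiled model touches DRAM roughly 2,000 times per frame in tile-sized chunks versus roughly 8,000 for the immediate-mode model; on a battery-powered device that bandwidth saving is the whole point.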

$75 Arm machine ($100 with eMMC)

It can handle 4-5 open tabs in Chromium, office software and a music player, all at the same time.

hardkernel.com/main/products/prdt_info.php?g_code=G143452239825

Well, I keep hearing about Apple possibly moving their Macs over to ARM. Maybe it'll be soon?

dfrobot.com/index.php?route=product/product&search=LattePanda&description=true&product_id=1498
For four dollars more than your base price, you'd get eMMC and a much more useful chip.

They're Chinese companies; they can just reverse engineer Nvidia/AMD stuff.
The consumer GPU market is easier than selling to companies.
Not a GPU.

See, Apple won't move their iMac or MacBook lines over; ARM just doesn't have the performance.

They could pull an AMD and just add more cores/CPUs. They're power-efficient, so it doesn't pose the same problems.

Good find user.

>they can just reverse engineer Nvidia/AMD stuff.
No, no they can't. What exactly do you expect them to do with reverse-engineered AMD/Nvidia tech? Get the same fabs that produce their GPUs to produce clones a year after AMD and Nvidia have moved on? Even then, the drivers are going to be a huge problem. Ignoring the memes, Nvidia, AMD and Intel have all had huge problems with their drivers, ranging from fried hardware to only partial API support.

They would have much lower R&D costs; they could make dual/quadruple GPU versions of Nvidia GPUs. Anyway, that wasn't the question. Their GPUs already have OpenGL support; how hard would it realistically be to create drivers? Making GPUs physically is cheap, and they could pack more of them on a chip to compensate for the lower power, since they have good power efficiency.

>how hard would it realistically be to create drivers?
Hard enough that three massive, experienced tech companies have failed massively on several occasions each.
>they could pack more of them on a chip to compensate for the lower power
That opens another massive can of worms with money having to be spent on making them communicate, distribute the workload and share memory. It also doesn't scale very well beyond a certain point.
>They would have much lower R&D costs, they could make dual/quadruple GPU versions of Nvidia GPUs
Again, ripping off Nvidia designs is a terrible idea because of the drivers and how long after the real deal it would hit the market, not to mention how much trouble they'd have selling it in western countries.
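On the "doesn't scale beyond a certain point" part specifically: that's just Amdahl's law. If even a small slice of frame time goes to the serial glue (distributing work, synchronization, shared-memory traffic), "slap 20 of them on a PCB" gets you nowhere near 20x. A toy calculation (the 95% parallel fraction is an assumed, illustrative number):

```python
def multi_gpu_speedup(n_gpus, parallel_fraction=0.95):
    """Amdahl's law: speedup = 1 / (serial + parallel / n).
    The serial share models work distribution, sync and shared-memory
    traffic that can't be parallelized across the GPUs."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_gpus)

# With a 5% serial share: 4 GPUs give ~3.5x, 20 GPUs only ~10.3x,
# and the curve flattens toward a hard ceiling of 20x no matter how
# many chips you put on the board.
```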

>Making GPUs physically is cheap
The start-up costs for a modern foundry that can make 10,000 300 mm wafers a month run into the hundreds of millions...
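And even ignoring the fab itself, per-die cost falls straight out of wafer geometry. A rough estimate using the standard gross-die approximation (the die size, yield and wafer price below are illustrative assumptions, not any vendor's real figures):

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Standard gross-die approximation: wafer area over die area,
    minus an edge-loss term proportional to the circumference."""
    d, s = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / s - math.pi * d / math.sqrt(2 * s))

gross = dies_per_wafer(300, 300)   # ~197 gross dies for a 300 mm^2 GPU die
good = int(gross * 0.6)            # assumed 60% yield -> ~118 good dies
cost_per_die = 8000 / good         # assumed $8,000 per processed wafer
```

That's already tens of dollars of silicon per die before packaging, testing, memory and board costs, so "making GPUs physically is cheap" only holds if someone else has eaten the foundry capex.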

They have factories.
>Hard enough that three massive, experienced tech companies have failed massively on several occasions each.
They already have drivers though, just for mobile.
>That opens another massive can of worms with money having to be spent on making them communicate, distribute the workload and share memory. It also doesn't scale very well beyond a certain point.
Dual GPU support already exists.
>Again, ripping off Nvidia designs is a terrible idea because of the drivers and how long after the real deal it would hit the market, not to mention how much trouble they'd have selling it in western countries.
If they'd do that:
1) they would rip off their drivers as well
2) they'd at least try to hide it, drawing Nvidia into a long and unprofitable legal battle on Chinese soil, where the legal system is notorious for protecting its own

>They already have drivers though, just for mobile.
Yes, because those are good enough and porting them will be easy.
>Dual GPU support already exists.
Oh, so Mali, Adreno and the others have dual GPU support? Wow!
>they'd at least try to hide it, drawing Nvidia into a long and unprofitable legal battle on Chinese soil, where the legal system is notorious for protecting its own
No, they'd be blocked from selling it in any western country. Congratulations, you've now spent fucktons of money making something you can't sell outside China, with the exception of the very few people willing to order a Chinese rip-off GPU online.

You clearly have no idea what you're talking about, just stop user.

>server SoC with dual 10 GBit and 14 SATA ports
>the dev kits only have dual 1 GBit and maybe 3 SATA ports
The CPU itself is also pretty outdated. Even a low-end ARM server should have at least eight A72 cores. That would actually be pretty nice.