Desktop grade components in the palms of your hand

>Desktop grade components in the palms of your hand

Why even buy a desktop when we have smartphones in 2016?


If you don't need one, don't buy one.

That's funny, I don't remember desktops having shitty ARM octa-cores that can barely decode an HD video in software

Same reason you didn't replace your desktop with a pocket PC.

>angry birds processor
>desktop grade
Kek

>Gimped no interface board
>Desktop grade

I started using teamviewer on my phone to connect to my desktop and holy FUCK what a time it is to be alive, there is almost no latency, quality is great, and it's super easy on battery life.

LITERALLY desktop components IN MY POCKET. Absolutely blew my mind when I started using it.

Sadly it has rendered my ThinkPad useless.

OP here

Before anybody scrutinizes me over this: this is from an HTC 8X, a WINDOWS product. It's basically the same as any other enthusiast-grade desktop.

WEW LAD

Yes, yes it is... except it can't run any respectable software.

> desktop-grade
A bad desktop, perhaps.

>qualcomm
>samsung
>synaptics

That's hardly even laptop-grade hardware.

no interfaces
no expandability
useless architecture
limp-dick operating system
limp-dick software hobbled by oversized touch interfaces for ham planet fingers
cancerous dev communities
ugly styling
uselessly small for anything productive

you know the drill, that's why you made this a bait thread instead of an actual discussion thread

You're great at spouting memes and opinions.

It is the cutting edge of technology. We are all benefiting from the tech.

>someone says something I don't like but I have no counterpoints against it
>"that's a meme!"
you sure showed me
even though OP's "smartphones are basically desktop-level now guys I swear!" bait is one of the oldest memes in the book

>it's another Sup Forums jerks off to smartphones because they have BIG MEGAHERTZ thread

>>Desktop grade

Some people need to do more than fuck off on Facebook, Instagram, and Twitter.

That's literally all smart phones are: social media devices and police detective aids.

>Desktop grade
By definition you can't have desktop grade in the palm of your hand; there will always be something better ON THE DESKTOP.

...

worse than thermi, confirmed

If you are a manchild with no job sure.

I prefer my keyboard and mouse to actually be productive and not some smudgy touchscreen.

>inb4 hook up a bluetooth kb + mouse

That still doesn't change the fact that any smartphone gives a subpar user experience for productivity.

You forgot to add "shitpost on a Tibetan monk study guide BBS assistance device"

>Newest android phones will have 6 GB of RAM
Well now low and middle tier PCs can be replaced by high tier phones I think.
Only if you can connect them to a mouse, keyboard, and monitor easily, that is.

>Well now low and middle tier PCs can be replaced by high tier phones I think.
Please be bait.

A phone you can dock into a laptop or desktop would be neat.

Specs are definitely there for it

because some of us really work.

I don't see the GP102 3840Shader HBM2 on that pic. Did I miss it? How can it run ME:Andromeda in 4k?

>6gb of ram in a smartphone

Too bad Android is too limited an OS for this to be useful

Fucking this, Android is a waste with anything more than 2 GB RAM.

>muh vidya

>Muh Octacore smartphone
Who needs more than 2 cores and 1 GB RAM on a phone to browse websites and read Emails?

Ugh

Not even a bad desktop. Even a ten-year-old Pentium 4 machine with 512 MB of RAM is better than OP's Angry Birds phone motherboard.

At least with the desktop you can run x86 programs.

Everyone does. Websites are bloated and app devs can't optimize for shit. Only solution is more CPU and RAM resources.

Granted, those things have become relatively cheap. You can get phones with 2 GB of RAM and four A53 cores for less than $100 nowadays (e.g. the ZTE Warp Elite).

The next Lumia will basically be a Supercharged PC

I got a Z2 with a 2.3 GHz quad-core and 3 GB RAM. I don't know how to utilize all that. I hope it will serve me at least until 2020; otherwise I'm going back to my 2006 phone.
>Spending 600 € on a fucking phone

Windows phones are dead. I sold my barely used 640 on eBay for 5 fucking dollars. Nobody wants them; the buyer probably just wanted the phone for spare parts.

>I hope it will serve me at least until 2020
lmao it won't. Remember when the ZTE Blade used to be hot shit and everyone thought it would last until 2015? It didn't.

>Spending 600 € on a fucking phone
You don't need to. You can get phones with good enough specs for $100 every year. By year 2020 you'll probably be able to buy a 4GB RAM, 8-core CPU, 1440p res phone for $100.

Thing is, the manufacturers have us by the balls with technology, and the only way to win is to buy the bare essentials every 2-3 years instead of the latest flagship phone.

Buying 10 $100 bare essential phones over your lifetime: $1,000

Buying 10 $700 flagship phones over your lifetime: $7,000
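
The arithmetic above is simple enough to sketch (the $100 and $700 price points and the ten-phone lifetime are the poster's assumptions, not market data):

```python
# Lifetime phone spend under two buying strategies.
# Prices and phone count are assumptions taken from the post above.
def lifetime_cost(price_per_phone, phones_per_lifetime=10):
    """Total spent on phones over a lifetime at a fixed price point."""
    return price_per_phone * phones_per_lifetime

budget = lifetime_cost(100)    # bare-essentials phone every 2-3 years
flagship = lifetime_cost(700)  # latest flagship every time

print(budget, flagship, flagship - budget)  # 1000 7000 6000
```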

>crossboarders continue to fall for the desktop smartphone meme because they look at component and see big numbers

>I can't into concurrent programming
>it's the CPU's fault!

Go away, intelkid.

No, but for real: ARM processors are dogshit. Yes, you can build mobile operating systems around them, but in the end they're still shit. This is why 99% of servers and supercomputers don't use ARM. ARM is more power hungry compared to x86, and you have to recompile software for it in order to do anything besides play Angry Birds.

You've obviously never dealt with ARM and are just talking out of your own shitty experience with Android.

Android is painfully slow because of Java. ARM CPUs aren't great, but they aren't dogshit either.

>Android is painfully slow because of Java.
You've obviously never programmed in Java, the current Java VM is fast as fuck, like near native code speed.

>ARM CPUs aren't great, but they aren't dogshit either.
They are dogshit, which is why performance is so much better on the Zenfone 2. Most ARM processors excel at everything except raw integer/floating-point performance, which is what actually matters in the end. The only reason we use them in phones is that Core M processors would be prohibitively expensive.

good luck doing something useful with it

>the current Java VM
That's super nice, isn't it? Too bad Android uses Dalvik/ART...

>muh zenfone meme
Haha you caught me, took me long enough to realize you were just trolling! ;^)

>ARM is more power hungry compared to x86

When did ZenFone become such a meme here on Sup Forums? Sometimes it seems as if this blog is full of Intel/Asus reps...

>You've obviously never programmed in Java
You say that as if it was a bad thing.

>rows of caps around the SoC

modern PCBs are legit art

Call me when ARM has something that competes against the Xeon-D 1540, which has the multi-core performance of a desktop i7 and a max TDP of 45 watts.

>Sup Forums retards still fall for the bait
>every
>time

Because it can literally run Windows 7 in a VM and no samshit/lg phone comes close to the bang for the buck it offers.

youtube.com/watch?v=g2Smcq8qLEY

Because RISC isn't CISC

Actually, modern "CISC" processors (i.e. amd64 ones) are really just a RISC core behind a CISC decoder.

Strangely, though, ARM processors manage to be more power hungry than amd64 ones.

CISC is incredibly outdated

Indeed, but like I said, modern CISC processors are not 100% CISC; 100% CISC processors don't exist anymore.
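
The "RISC core behind a CISC decoder" idea can be sketched with a toy model (the micro-op names and the `decode_add_mem_reg`/`run` helpers here are made up for illustration; real x86 micro-ops are undocumented and far more complex):

```python
# Toy sketch: decode one CISC-style instruction, `add [addr], reg`,
# into three RISC-like micro-ops, then execute them on a toy machine.
def decode_add_mem_reg(addr, reg):
    """Split a read-modify-write CISC op into load / ALU / store micro-ops."""
    return [
        ("load", "tmp", addr),    # tmp <- memory[addr]
        ("alu_add", "tmp", reg),  # tmp <- tmp + regs[reg]
        ("store", addr, "tmp"),   # memory[addr] <- tmp
    ]

def run(micro_ops, memory, regs):
    """Execute micro-ops against a dict-based memory and register file."""
    for op in micro_ops:
        kind = op[0]
        if kind == "load":
            regs[op[1]] = memory[op[2]]
        elif kind == "alu_add":
            regs[op[1]] += regs[op[2]]
        elif kind == "store":
            memory[op[1]] = regs[op[2]]
    return memory, regs

memory = {0x10: 5}
regs = {"eax": 3}
run(decode_add_mem_reg(0x10, "eax"), memory, regs)
print(memory[0x10])  # 5 + 3 = 8
```

One complex instruction, three simple internal operations: that is roughly why the CISC/RISC distinction matters less at the core than at the decoder.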

You can't play DOOM 2016 on your overpriced Android phone

>because it can run Windows
Wintel shill detected, opinion discarded! Fuck off, Rajesh!

It can also run a VM of gentoo you autistic piece of shit

So can any ARM chip with Hyp mode, you braindead mouthbreather.

Fuck you, you ARM shill. That joke of a CPU architecture is what caused phones to only be useful for making calls and playing Angry Birds.

You mean Java.

Here we go again. Back to your cuckshed you icuck.

Not an argument.

I guess that's why it's being so widely embraced, right? Because it's so bad?

lol

Classical CPUs are incredibly outdated. The future is quantum computing.

>top 500 memes

Nice memeing, but I happen to work with scientific computing, and I'm familiar with dozens of supercomputers running MIPS, ARM and SPARC.

Fuck off, Intel shareholder!

>i-it has lots of gigabytes guys
why are smartphone fanboys literally soccer-mom-tier

>muh 0.01% of market
lol

>500 meme computers vs. every single cellphone out there counting up to a billion
I see now that I'm arguing with a tech-illiterate teenage Intel fanboy. Call me back when you're an actual IT professional who works with those machines and speaks from personal experience instead of what you've read online.

When I can run Windows 7 from a device in the palm of my hand with the same or better performance as my current laptop (quad i7/16GB/etc), set the phone on a cradle/dock (using pogo pins, not a hardline connector like microUSB or even USB-C), and do everything I need with an attached keyboard/mouse/monitor/etc, then let me know and I'll buy it at any price up to maybe $1000 USD.

Until that happens, I don't give a fuck about anything else on the market today or in the near future.

The Ubuntu Edge phone a few years ago had the right idea but it was obviously going to be running Ubuntu - the smartest fucking thing Microsoft could do right this second would be this:

- buy the design/form factor/patents on everything related to that Ubuntu Edge device (pictured)
- get an Intel Atom processor in it
- make it workable with the desktop version of Windows 10 not some fucking ARM based bullshit like the Surface RT devices (and then I'd find a workaround to get Windows 7 on it)
- market it as the Surface Phone (I mean, look at the god damned picture: it looks like the original Surface tablet, has the same design, same styling, same angles, same everything, just in the palm of your hand)

Microsoft, what the fuck are you idiots doing? Why won't you do what's smart and make the god damned Surface Phone using that dead Ubuntu Edge design/form factor? It's fucking perfect for a Surface Phone. If I didn't know better I'd say someone at Microsoft based the Surface design on the Ubuntu Edge in the first fucking place.

I MEAN REALLY, LOOK AT THE PICTURE, IT'S PERFECT FOR A SURFACE PHONE.

>stupid fucking people will be the death of us all

Nice meme.

Wow, I'm glad to see the success of ARM and MIPS SoCs and Android is really making some fanboys butthurt.

Success breeds jealousy!

>dotcom bubble and PCs surpassing the "good enough" threshold severely hit non-x86 vendors
>Intel and Red Hat finish them off with the Itanium hype train
>Intel ditches Itanium, leaving IBM and Oracle as the last non-x86 vendors with HPC-suitable products
>Oracle, being Oracle, shoots SPARC in the foot
>IBM supercomputing designs are expensive and often custom-built for a particular customer
>the rest of the vendors are forced into sub-par x86 because that's all that's left and it's cheap enough that you can just order thousands of them and outdo better architectures by sheer brute force

>somehow this even has any bearing on drastically different consumer use cases especially when most supercomputing applications are better done on GPUs anyway

Wow, I'm glad to see the success of Facebook and Google services and datamining platforms is really making some privacy fanboys butthurt.

Success breeds jealousy!

en.wikipedia.org/wiki/Argument_from_analogy#False_analogy

>I don't like that! MEME MEME MEME MEME MEME
when will reddit leave forever
if RAM were all that mattered for performance, then we'd all still be running Core 2 Duos with 8-16 GB

but alas, all the RAM in the world doesn't matter for shit when the rest of your hardware is shit, it's slow, or your use case simply doesn't utilize it

not that you'll agree with that, because anything that doesn't justify your mindless consumption based on BIG NUMBERS is a "meme"

Because touch screen, how the fuck can any sane person enjoy doing anything with that shit?

Because in my leisure I'd rather sit in a comfy computer chair at a healthy distance from a 20 inch monitor with an efficient comfortable keyboard and mouse.

Learn2ergonomics

>Intel and Red Hat finish them off with the Itanium hype train
>Oracle, being Oracle, shoots SPARC in the foot

these hurt the worst

>wiki link to a logical fallacy

This would be a 10/10 shitpost, but this is modern-day Sup Forums

>kids who do nothing but play videogames all day and believe in the "16GiB of RAM is a meme" meme

RAM IS all that matters in the virtualization age.

In one of my recent jobs I took a crappy dual-core second-generation Opteron Supermicro server, loaded it up with tons of RAM and made it into a virtualization powerhouse.

Why are we letting tech illiterate teens control the frame of debate on Sup Forums? If you're not an IT professional, you shouldn't be here! Go back to your GPU thread!

This. Memory capacity is everything these days because most of the time your CPU is massively underutilized, so you just virtualize everything, and RAM dictates how many virtual machines you can run.
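
The RAM-bound sizing claim above reduces to a back-of-envelope calculation (the host reserve and per-VM figures below are illustrative assumptions, not a rule):

```python
# Capacity planning sketch: on an underutilized-CPU host, the VM count
# is limited by memory, not cores. Figures are example assumptions.
def max_vms(host_ram_gb, per_vm_gb, host_reserve_gb=4):
    """How many fixed-size VMs fit before the host runs out of RAM.

    host_reserve_gb is RAM held back for the hypervisor/OS itself.
    """
    usable = host_ram_gb - host_reserve_gb
    return int(usable // per_vm_gb)

# A 128 GB server carved into 4 GB guests, keeping 4 GB for the host:
print(max_vms(128, 4))  # 31
```

Doubling the host's RAM roughly doubles that number; doubling the core count usually doesn't, which is the poster's point.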

Oh yeah, I forgot neo-Sup Forums doesn't care about logic. Sorry about that mistake, won't happen again.

B-but I am still running on old Core 2 Duos. I run a server farm on old laptops. AMA.

>b-but servers and virtualization...
Just what the fuck does that have to do with anything? We're not talking about servers; we're talking about phones and consumer desktops with drastically different target use cases. Having 6 GB of RAM in your Facebook telescreen is not going to trample all over a desktop solely because it has more memory, the same way my NetBurst server with 18 GB can't touch a modern system with 8 GB.

That is currently physically impossible, and likely always will be.

Unless you are only talking about limited web browsing and other light usage, which a tablet or net book do better given screen space and typically more power.

If you remember Landauer's principle from basic circuitry class, and adjust for current hardware realities, you'll see that such claims are thermodynamically unsound.

Given that node size has one of the largest influences on energy consumption (with a ceteris paribus assumption for software, which of course can have significant effects of its own), and noting that node size across chips at any given time is roughly the same, since manufacturing advances cross market boundaries very quickly, the power needed to flip a bit is roughly the same regardless of the chip's casing. This goes for both memory and processing. We will ignore classical hard drive arm movements in the calculation, given that solid-state drives are widely available for desktops as well. So, as you can see, even in such a basic framing, computational power is physically dependent on electrical power, and the rest of the argument should be self-evident.
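
For reference, the Landauer bound invoked above works out as follows (300 K is an assumed room temperature; real CMOS dissipates many orders of magnitude more per bit):

```python
# Landauer's principle: the minimum energy to erase one bit of
# information at temperature T is E = k_B * T * ln 2.
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in SI since 2019)

def landauer_limit_joules(temp_kelvin=300.0):
    """Minimum erasure energy per bit at the given temperature."""
    return K_B * temp_kelvin * math.log(2)

# At ~300 K this is about 2.87e-21 J per bit erased.
print(landauer_limit_joules())
```

Note this is a floor on irreversible bit erasure, not on computation in general, which is exactly why reversible computing is raised as a counterpoint below.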

>Landauer's principle
Nigga, current silicon processors don't even work at the Landauer limit.

Oh and look we actually made some magnetic circuits that operate fucking close to the Landauer limit:
news.berkeley.edu/2016/03/11/magnetic-chips-low-power-computing/

Reversible computation also beats it: en.wikipedia.org/wiki/Reversible_computing

Of course, that is why I said to adjust for "current hardware realities". You can use a simple empirically based multiplier to scale for an accurate approximation.

Also, if you had bothered reading the original Berkeley papers your article references, you would know it was a state-change demonstration with a low-energy laser, not a working chip. Very impressive, but nowhere near practical for actual use, even in the extreme military and industrial markets.

Yes, before you go citing some of the actual single-atom systems: I already know of several. Most notably a similar-sized system that was experimented with in 2004 for military use by Sandia, but production defects were near 100%, to the point that each individual chip needed custom architecture and BIOS modification, making it non-viable; also, any UV exposure or temperatures over 50C permanently destabilized the few that survived the messy manufacturing method.

Do you even know the difference between informational entropy and physical entropy?
Yes, they are related, and such an informationally reversible system has advantages. But it is a huge, and sadly common, mistake to mix the two distinct ideas.

I suggest reading Carnot's original notes, which his brother published after he died; they help complete the historical development of the ideas we now call entropy. I found them most insightful on the matter.

Desktop CPUs haven't seen actual speed increases in half a decade now, only core-count increases. Phones these days can run Microsoft Office and NOVA 2, what more could you ask for? OP is right.

>somehow this tiny piece of shit is able to match 600 W of top-notch computing power
kek, enjoy playing Angry Birds at 30 fps

If you can do all your computing on a phone, then that's fine.

I need my desktop to edit documents, write code, edit video, and do 3D modeling.

They're different tools, with different capabilities. Choose what suits your needs.

Mobiles have been able to run Office and other traditionally desktop applications forever; WinMo even had an official Photoshop port and shipped with Office as standard.

But just being able to run it doesn't mean it runs well or effectively. Even when the market was dominated by denser, more accurate stylus interfaces and software tried to be as desktop-like as possible, it was still shit, just like it always will be, because phone screens are too tiny to be useful for anything but dicking around, and nobody wants to deal with a touch keyboard for anything more than firing off a shitty tweet.

That's why Android desktops/laptops are taking over the market, kiddo. See pic related.

Holy shit doomfags sure are rectally shattered over Angry Birds becoming a bigger more successful franchise than their shitty first-person shooters that need cutting edge CPUs and GPUs but are no fun to play except for a few select screwed up teenagers raised by single moms.

Video encoding/decoding is pretty much as parallel as you can get. So no, it's the processors that are shit.

Still up to the dev to implement it properly, smegmalord.
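
That "up to the dev" point can be sketched with a process pool (the per-frame `encode_frame` transform here is a hypothetical stand-in, not a real codec; real encoders split work across slices or GOPs the same way):

```python
# Sketch: frame-parallel "encoding" across worker processes.
# encode_frame is a toy stand-in for per-frame work (DCT, quantization,
# entropy coding...); the parallel structure is what matters.
from concurrent.futures import ProcessPoolExecutor

def encode_frame(frame):
    """Stand-in transform: reduce a frame (list of samples) to one byte."""
    return sum(frame) % 256

def encode_video(frames, workers=4):
    """Encode frames in parallel; order of results matches input order."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(encode_frame, frames, chunksize=8))

if __name__ == "__main__":
    frames = [[i, i + 1, i + 2] for i in range(32)]
    print(encode_video(frames)[:4])
```

The hardware offers the parallelism either way; code written as one serial loop over frames leaves it unused, which is the reply's point.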