What, in the long term, is the most important work being done in computing right now? What should highly talented programmers work on to have the greatest impact on computing right now?

Definitely working on advanced AI research.

You could also say that making crypto accessible for normies is important to fuck with the NSA.

I'd say it's the two sides of the crypto coin: "quantum computing" and robust cryptographic methods.

I'd say machine learning and all that garbage too but the work being done RIGHT NOW is not particularly impactful imo.

Faster than light travel. Nothing else makes sense from a technological advancement perspective.

>computing

web development with php

Do you think this has nothing to do with programmers?

Humanity needs every good hand to solve this problem.

do your own damn homework

It's summer, this doesn't work anymore

Machine Learning, especially Deep Reinforcement Learning. See deepmind.com/publications.html

/thread

The reason why big software companies are researching machine learning is that they need someone to analyze all the data the botnet collects from you. That "someone" will be AI.

There are other, even more important applications of AGI: producing scientific research, for example.

>ai
See
The field has stagnated. Show me current, groundbreaking research. Protip: you can't.

Oh, if you knew how bored I am when I hear about ML stagnation from you, user. In the past 5 years there was a continuous stream of breakthroughs in ML. It's hard to pick the best ones, but I'd pick DeepMind's A3C, the Neural GPU, NeuralTalk and visual QA. But there is more...

Follow the field, user.

Plz don't answer with shallow "it's not AI" criticism.

The current state of the art is very impressive. There is an ML model that can learn to answer arbitrary questions about arbitrary images. It generalizes to images it was never trained on. It's from arxiv.org/abs/1506.07285

Again, with such a state of the art it shouldn't be impossible to develop sub-human but general AGI in the coming 10-20 years.

Robot waifu.

Once it exists it won't be AI anymore. It's only AI until we understand it, then we can reclassify it.

>summer classes don't exist

>field has stagnated, no robot waifus
>*buys from amazon*
>eeeyeaaaah this was 100% processed by humans
>*hops in self driving car*
>can't believe how stagnant AI is right now
>*waves at automated NSA surveillance drone*
>*upvotes targeted ad*

Lack of robot waifus is a symptom of another, separate problem: true stagnation in physical technology. The progress in the field of computing is exceptional.

>t. i dunno shit about tech i just know the benchmarks got marginally better

>highly talented programmers
Research is a different skill than programming.

yeah programming is all about learning libs and gluing APIs together to make GUI furry porn browsers for mac

Nope, I mean "we were promised flying cars. We got 140 characters". I don't deny progress in computing, software and machine learning. But everything else has slowed down.
medium.com/conversations-with-tyler/peter-thiel-on-the-future-of-innovation-77628a43c0dd#.icv7eub6k

>It generalizes to images it was never trained on
Obviously... that's the point of ML
And that's not new... it's an RCNN.

>not processed by humans
Like all purchase requests in the last ten years
>self driving car
Ring ring, 2005 is calling
>Govt surveillance drone
K

>breakthrough in AI is adding another layer and putting it on a faster GPU.
Call me when another Vapnik arrives

Vapnik's theory gives very crude bounds. If we talk about theoretical ML, then Hutter and Solomonoff are way more interesting.
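For reference, this is the kind of statement Vapnik's theory gives (the classical VC generalization bound, quoted from memory, so treat it as a sketch rather than the exact theorem; R is the true risk, \hat{R} the empirical risk, d the VC dimension, m the sample size):

\Pr\Big[\, \forall h \in \mathcal{H}:\; R(h) \le \hat{R}(h) + \sqrt{\tfrac{d\left(\ln(2m/d)+1\right) + \ln(4/\delta)}{m}} \,\Big] \ge 1 - \delta

For modern deep nets d is enormous, so the bound says essentially nothing about them, which is what makes it "crude" in practice.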

Also
>denying the significance of new deep learning model architectures, optimizers, regularization tricks and the sheer power of engineering required to train these models.

I can't understand how one can not be amazed by the Neural GPU or a visual QA model. These are like magic. Learning sequence transduction algorithms really is beyond ordinary statistics, whose primary concern is fixed-length vectors.
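To make "sequence transduction vs. fixed-length vectors" concrete, here is a minimal encoder-decoder sketch, assuming PyTorch; every name and size below is made up for illustration, it is not the Neural GPU or any of the papers above:

# Minimal encoder-decoder ("sequence transduction") sketch in PyTorch.
# All sizes and names are illustrative only.
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, vocab_in=100, vocab_out=100, hidden=64):
        super().__init__()
        self.embed_in = nn.Embedding(vocab_in, hidden)
        self.embed_out = nn.Embedding(vocab_out, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.proj = nn.Linear(hidden, vocab_out)

    def forward(self, src, tgt):
        # src: (batch, src_len) token ids; tgt: (batch, tgt_len) token ids
        _, state = self.encoder(self.embed_in(src))        # summarize the input sequence
        out, _ = self.decoder(self.embed_out(tgt), state)  # condition the output on it
        return self.proj(out)                              # (batch, tgt_len, vocab_out)

model = Seq2Seq()
src = torch.randint(0, 100, (8, 12))  # batch of 8 input sequences, length 12
tgt = torch.randint(0, 100, (8, 7))   # output sequences, length 7
logits = model(src, tgt)              # (8, 7, 100): one distribution per output step

The point is that src and tgt can have any lengths; classical fixed-design statistics has no natural slot for that.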

Okay you win

Let's read Solomonoff together!

Any favorite publications? I love looking back at old papers.

It's sooo easy to spot Neil Degrasse Sagan reddit-tier retards on this board, because they're always the ones that bring up AI and machine learning as if either of those things means fuck-all.

The Jewish AI scam was exposed in the late 80s. I can't believe an entire new generation is falling for it
Keep smoking that weed and jerking off with your lesswrong buddies about how smart you all think you are, you retarded nu males

This one. world.std.com/~rjs/publications/nips02.pdf

Some deepminders are heavily influenced by his work; I hope they complete it in some form. It's not deep learning, it's a top-down approach.

I'm going to bed.

>programmers
HAHAHHAHAHAHAHAHAHAHAHA
Fuck off pajeet, the computer scientists are talking.

LW is a meme.

ML is not a meme, it's a $10B industry.

>people are spending money on it, so it must be legit!

Oh you poor, poor child

Thanks user :)

>jerking off with your lesswrong buddies

The most important thing in computing right now is starting non-profit tax shelters to "teach women to code".

>not exploiting the ml meme to earn mad cash

In my uni they taught us the future is big data management. Bash my education Sup Forums

AI learning.
Once we have robots doing all the jobs, we can start educating people instead of training them, and thus focus on more important tasks like space travel and Dyson swarm construction.

People saying AI are retarded memelords who don't know anything about how computers work.
The correct answer is probably quantum computing.

Machine learning is useful, it's just not "the future"; it's a well-understood technology that's been used for decades. The problem is people who think that because Google made a Go AI, the new era of AI is upon us and soon we'll have real thinking computers.

Again, you are clearly ignorant of the development of the state of the art.

We already have computers "thinking" in some sense (i.e., having short-term memory full of representations of the outside world).

Sup Forums ignoring and denying machine learning is hilarious. I suspect this is due to tech conservatism and disdain for math, which are prevalent here.

Rubbish.
There is no artificial intelligence of the type people see in sci-fi. There is no consciousness, no self-awareness, no evil robotic overlord.

There is just pattern matching, like an extremely advanced spam filter.
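To make the "advanced spam filter" comparison concrete, this is roughly what that kind of pattern matching looks like with scikit-learn (the data is invented, purely for illustration):

# A toy "spam filter" style pattern matcher; the texts and labels are made up.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

texts = ["win free money now", "meeting moved to friday",
         "free prize click here", "lunch at noon?"]
labels = [1, 0, 1, 0]  # 1 = spam, 0 = not spam

vec = CountVectorizer()
clf = MultinomialNB().fit(vec.fit_transform(texts), labels)
print(clf.predict(vec.transform(["claim your free prize"])))  # most likely [1]

Scale the same idea up to millions of parameters and pixels instead of word counts and you get the systems being argued about in this thread; whether that counts as "thinking" is the philosophical part.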

ML has disappointed me over and over whenever I tried to use it for my latest project, which involved finding simple objects in a field. Ironically, the simpler the object, the harder it is (if not downright impossible) for machine learning to work. All ML depends on complex features to be useful; this is why facial recognition is easy for a program. The features in a face are extremely unlikely to be found in nature.

Simple objects, however, are the bane of ML. If you can find me an ML implementation that can reliably detect the objects shown in the video, I'll smack myself.

youtube.com/watch?v=wGkLo0DrJmQ&list=PL2dAOzo3a1SoQ4AqFCHb04yzQn8VnfK8l&index=1

Also, if any of you are serious about discussing this further, feel free to drop me a msg.

Machine learning is like an IDE: it cuts down on work, it doesn't replace it.

Proof please.

>What, in the long term, is the most important work being done in computing right now?
Achieving gender and racial equality.

> If you can find me an ML implementation that can reliably detect the objects shown in the video

A small convnet trained on 1000 positive and negative examples should be enough. Feature selection is not ML, it's CV. ML is training models on data. No data, no ML.
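Concretely, something like the following, assuming PyTorch and that you've already cropped the positive/negative examples yourself; all sizes and names are illustrative, not a recipe tuned to that video:

# Tiny binary convnet for "object present / not present" crops.
# Assumes ~1000 positive and ~1000 negative 32x32 crops; adjust to your data.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 32 -> 16
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
    nn.Flatten(),
    nn.Linear(32 * 8 * 8, 2),  # two classes: object / background
)

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_step(images, labels):
    # images: (batch, 3, 32, 32) float tensor; labels: (batch,) of 0/1
    opt.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    opt.step()
    return loss.item()

# dummy batch to show the expected shapes; replace with real crops
print(train_step(torch.randn(16, 3, 32, 32), torch.randint(0, 2, (16,))))

At detection time you would run it over sliding windows or region proposals; whether that beats a hand-tuned CV pipeline on near-featureless objects is exactly what's being argued here.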

>proof
arxiv.org/abs/1605.09128

Something like

docs.opencv.org/2.4/doc/user_guide/ug_traincascade.html#positive-samples

?
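(For reference, using a cascade trained that way looks roughly like this in OpenCV's Python bindings; "cascade.xml" and "frame.png" are just placeholders for your own trained classifier and input image.)

# Detecting objects with a cascade produced by opencv_traincascade.
import cv2

cascade = cv2.CascadeClassifier("cascade.xml")  # placeholder path
frame = cv2.imread("frame.png")                 # placeholder image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# scaleFactor and minNeighbors usually need tuning per object
boxes = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in boxes:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("detections.png", frame)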

I've tried many many many different kinds and types and it was too unreliable.

OpenCV has an old, shallow Haar classifier. Try a Caffe convnet.

> I've tried many many many different kinds and types and it was too unreliable.
Use an appropriate classifier model and training data. Look outside OpenCV.

Other people have tried similar approaches to this problem and didn't get very far. In lab conditions ML works well, but in the open world it tends to be unpredictable.

Self-driving cars use ML => it can be used IRL

Facebook classifies natural photos using ML.

If you can't apply it right, that doesn't mean it's broken.

Yes, but as I mentioned before, those are all complex objects. Machine learning fails to work reliably on simple objects such as the white cylinder or the blue rock because they have so few features.

Utilizing light (the photon) as a means of storage and computation.

The limiting factors at the moment for optical computing are creating the lenses, the sensors to detect the light, and the fact that electrons still have to be used at some stages, which limits the speed of transmission.

Anything FLOSS is the most important. Anything non-FLOSS is the cancer.

Besides, classification in images is downright solved and has been for half a decade (rough sketch at the end of this post).
>I'm retarded that means ML is a memee!111
And this is why Sup Forums is 100% consumer bullshit nowadays. I mean hell, look at the rest of the thread, 80% of responses are on the same tier as that.
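Here is that sketch: "solved" in the sense that a strong classifier is an off-the-shelf download now. Assuming torchvision; "photo.jpg" is a placeholder and the ImageNet label lookup is omitted.

# Classifying an image with a pretrained ImageNet model via torchvision.
import torch
from torchvision import models, transforms
from PIL import Image

model = models.resnet18(pretrained=True).eval()
prep = transforms.Compose([
    transforms.Resize(256), transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

img = prep(Image.open("photo.jpg").convert("RGB")).unsqueeze(0)
with torch.no_grad():
    print(model(img).argmax(dim=1))  # ImageNet class index of the top prediction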

>it's summer in the whole world

>And this is why Sup Forums is 100% consumer bullshit nowadays. I mean hell, look at the rest of the thread, 80% of responses are on the same tier as that.

What to read, then? I'm reading r/MachineLearning, news.ycombinator.com, and r/programming (in order of decreasing tech sophistication). Lainchan looks like a more polite Sup Forums; the people are almost the same.

I don't know other tech/CS/programming/electronics communities.

Quantum computing (in programming - parallelization)

arxiv.org/abs/1605.06065

>I'm reading r/MachineLearning, news.ycombinator.com, and r/programming
Let me laugh even harder.

One-shot learning.

I'm ok with you laughing, but it is a fact that some high-profile researchers (r/ML) and many programmers/entrepreneurs (HN) read and comment on these sites.

Talk about Poe's law
>someone else does the abstraction
>you make the product

For programmers, any work towards making trustworthy crypto ubiquitous.

If we include the hardware side of things, then it's definitely creating blob-free SoCs and other core components. The way the world is heading, it's critical the user gets control over their own devices.

>If we include the hardware side of things, then it's definitely creating blob-free SoCs and other core components. The way the world is heading, it's critical the user gets control over their own devices.

This. OpenCores and RISC-V are doing very important work. It's a shame that FOSS people care so little about their software running on thoroughly backdoored, SMMed and MEed Intel/AMD hardware.

Awesome, thanks user