Deep learning is literally exploding these days

Deep learning is literally exploding these days.
Is user working on anything deep learning related?

No, and I don't plan to.
After all, it's overrated statistics.

Deep learning is a meme
Prepare your brainus for trustless asynchronous collectively trained intelligence triage assisted learning networks. TACTICAL Nets for short.

...

i was literally working on something related to openai's new universe, but now that i've seen my project (kind of) already finished by them, i've lost all motivation.


It's not fair

tell us more user

I'm currently working with Image Classification with CNN's at my job.

Although it may look like a CS subject (like most machine learning subjects do), it has nothing to do with CS; like user said, it's mostly just computational statistics.

But still, it's pretty fun.

there's nothing more to tell, even though i was nowhere near finishing even the "prototype" (cringy word, i know)


To be really fucking honest, this industry is almost impossible on your own; you either have to be a genius (which i'm not) or have a lot of fucking resources, like the ones at OpenAI have.

Not to mention that to get to the best formula there has to be a lot of research done in biology and many other fields, which also takes resources that most people don't have.

You can make yourself a genius at the cost of risking insanity by taking drugs that make your neural connectivity go wild.

I'm flattered.

Though in all seriousness I think deep learning is going to give way to networking multiple component AIs together into collective AIs.

>literally
You're literally using that word incorrectly.

>No
>

I am tho. I'm making a universal bot for the OpenAI universe.

It's not statistics you idiots

Go read the literature, you idiot.
Most of machine learning is based on nonparametric statistics from the 60s/70s/80s, especially Bayesian methods.

Yep, the majority of machine learning has nothing to do with an actual machine learning to do something; it's just a fancy name for "prediction", which has been around for ages.
It does have some areas influenced by old-school AI, like reinforcement learning and the stuff from OpenAI, but still.

What's deep learning? Redpill me.

Training the machine to pick certain options/decisions based on statistics after feeding it tons of data which it sorted out. Fucking normies think it's some kind of insane sentient machine revolution happening before their eyes though.
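If you want the boring concrete version: you hand a model a pile of labeled examples, it fits some weights by minimizing a loss, and then it spits out decisions for new inputs. A toy sketch with scikit-learn (assuming it's installed; the data and numbers here are made up):

```python
# toy sketch: "feed it tons of data, then it picks options" -- nothing sentient involved
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))            # fake "data" with 2 features
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # fake "decision" the model should learn

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000)
clf.fit(X, y)                             # the "learning": fit weights to the data
print(clf.predict([[0.5, 0.5], [-2.0, -1.0]]))   # the "decisions": most likely [1 0]
```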

Prepare for generative ants networks.

evolvingai.org/ppgn

You could just as well say that Quantum mechanics is just linear algebra then. Or that CS is just overrated addition or logic. Chemistry is just overrated Physics, etc.

this isnt an anime

human brain is also just statistics

wtf does that dumb buzzword even mean?

Those comparisons are not on the same level. A shitload of stuff that has the label "Machine Learning" was created by statisticians doing statistics research years ago. ML just expands on these techniques, using better computing power and algorithms.
It's nothing like taking Linear Algebra and creating Quantum Mechanics.

When you first open an introductory ML book, chances are the first "algorithm" you'll learn is just Linear Regression or Logistic Regression. Those were made ages ago, and somehow they're "machine learning" now.
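For what it's worth, the "hello world" fit from those books is literally the least-squares estimate statisticians have computed for a century. Rough numpy sketch with made-up data:

```python
# ordinary least squares, straight out of a stats textbook -- now rebranded as "ML"
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=100)
y = 3.0 * x + 2.0 + rng.normal(scale=0.5, size=100)   # y ~ 3x + 2 plus noise

X = np.column_stack([x, np.ones_like(x)])             # design matrix with intercept
w, *_ = np.linalg.lstsq(X, y, rcond=None)             # closed-form least-squares fit
print(w)                                              # roughly [3.0, 2.0]
```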

>Deep learning is literally exploding these days.
it's literally not exploding.

What's a good way to do deep learning in Haskell? (I refuse to use other inferior languages.)

And with that you lost my interest.

quit following fads you fucking idiot.

Everyone's moving in this direction, but really there's no room for jobs in this field unless you have 2 PhDs in math and AI.

ok

M E M E
E
M
E

It isn't a fad. It's the start of a revolution.

Just look at this: universe.openai.com/

kys

But are deep neural nets Linear Regression or Logistic Regression? That's what we are talking about.

They're literally a bunch of logistic regressions in layers.
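At least for the forward pass that's a fair description: each layer is an affine map followed by a squashing nonlinearity, which is the same shape as logistic regression. Minimal numpy sketch with random, made-up weights:

```python
# two "logistic regressions" stacked: x -> sigmoid(W1 x + b1) -> sigmoid(W2 h + b2)
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(2)
W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)   # layer 1: 4 inputs -> 8 hidden units
W2, b2 = rng.normal(size=(1, 8)), np.zeros(1)   # layer 2: 8 hidden -> 1 output

x = rng.normal(size=4)
h = sigmoid(W1 @ x + b1)    # first "logistic regression" (one per hidden unit)
p = sigmoid(W2 @ h + b2)    # a second one stacked on top
print(p)                    # a probability-looking number between 0 and 1
```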

the whole is greater than the sum of its parts

Machine learning is literally optimization to minimize loss. A great many optimization techniques have been developed recently, as well as a lot of ways to structure your network for better optimization. None of those are just statistics.

>The human brain is literally a bunch of logistic regressions in layers

Goddamn, you are a stupid person.

They use the same mathematical intuition and they're used for the same objectives. Yeah, they must be totally different things from different areas.

Ever heard of cross-entropy loss?
That's maximum likelihood estimation for logistic regression.

Ever hear of MSE loss?
That's Least squares estimation.

That's all knowledge from ages ago. ML didn't invent any of those things.
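You can sanity-check the cross-entropy claim in a few lines: the loss is exactly the average negative log-likelihood of a Bernoulli model (numbers below are made up):

```python
# cross-entropy loss == mean negative Bernoulli log-likelihood, computed both ways
import numpy as np
from scipy.stats import bernoulli

y = np.array([1, 0, 1, 1, 0])             # labels
p = np.array([0.9, 0.2, 0.6, 0.8, 0.1])   # predicted probabilities

cross_entropy = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
neg_loglik = -np.mean(bernoulli.logpmf(y, p))

print(cross_entropy, neg_loglik)           # same number, same math, different decade
```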

>neural networks
>the human brain

You don't know anything about "neural" networks, do you?
Even the top researchers in neural networks agree that they have little to do with an actual human brain; it's just a fancy name for "learning in layers". I'm not making this shit up.

don't respond to them, they clearly must be trolling

>Ever heard of cross-entropy loss?
>That's maximum likelihood estimation for logistic regression.
>
>Ever hear of MSE loss?
>That's Least squares estimation.

And your point is?
Loss is just the estimation criterion. The actual task you have to accomplish when doing machine learning is to optimize your function to minimize that loss. The optimization is the difficult part, not evaluating the loss.
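And the plain-vanilla version of that optimization is just gradient descent on the loss. Toy sketch fitting a single weight by minimizing MSE (data made up):

```python
# plain gradient descent on MSE loss for y = w*x -- the "hard part" in its simplest form
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=200)
y = 4.0 * x + rng.normal(scale=0.1, size=200)   # true weight is 4.0

w, lr = 0.0, 0.1
for _ in range(100):
    grad = np.mean(2 * (w * x - y) * x)   # d/dw of mean((w*x - y)^2)
    w -= lr * grad                        # step downhill
print(w)                                  # close to 4.0
```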

Machine learning is normies funding a useless idea because they can't understand what the idea represents. We are basically at the point of being able to create a real AI now; processing power is enough for such things to exist. Sadly, no one even wants to work on it... They either got cucked by corporations and think this "deep learning" shit is the only way to progress the idea, or are too stupid to even know where to begin on the idea.

I personally think a lot of the process is obvious, but it does require a significant team of people to begin. All you have to do is start with logic, and the language of logic, which quickly makes it a fuckton better than "muh deep lerns." Making sure the system understands how to put its actions into words is an obvious first step nobody has taken yet. This should be followed by teaching the system "is a" relationships, so it has a database full of things like "a dolphin is a mammal."
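As a toy illustration (entirely made-up example, basically old-school semantic networks), the "is a" storage and lookup could be as dumb as this:

```python
# toy "is a" knowledge base with a transitive lookup
IS_A = {
    "dolphin": "mammal",
    "mammal": "animal",
    "animal": "living thing",
}

def is_a(thing, category):
    """Follow 'is a' links upward and check whether we ever reach the category."""
    while thing in IS_A:
        thing = IS_A[thing]
        if thing == category:
            return True
    return False

print(is_a("dolphin", "animal"))      # True
print(is_a("dolphin", "vegetable"))   # False
```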

Are you retarded?
Do you think statistical methods just evaluate loss and call it a day? Goddamn, they minimize loss just like you do in ML.

You clearly don't know shit about this field. Please stop posting.

Enjoy never working on serious Machine Learning

What's the difference between that and a neural network?

Mate. You mentioned cross-entropy loss and MSE loss. Those are losses. Those are not optimization techniques. If you want to talk about how your 60s algorithms minimize loss, go ahead, we'll talk about that.

Deep learning is a general term. Neural networks are one way to do deep learning.

it's amazing how grossly uninformed Sup Forums proves itself to be when you really get down to brass tacks and talk about stuff like ML, deep learning, AI, etc...

like there's always this vague sense, but god damn, this thread makes it really salient.

>brass tacks

oh god, get the fuck out of here you libtard sjw faggot.

No, fuck this meme shit.

lol why don't you calm down and tell me what retarded delusion you've gotten yourself all wrapped up in, faggot?

this

surprised Princeton's WordNet hasn't been seriously used to create a semantically intelligent machine yet

granted, I work with WordNet and it's still pretty insufficient with the number of relationships encoded for each set of synonyms (but it's still pretty gud)

Plenty of teams tried that and failed miserably.

We are not at the point where we can create an artificial intelligence that matches the human brain.

So we have networks of neurons now, called neural networks. But what if we built networks of neural networks?

No teams have even thought about that. Prove your claim by showing me just one example where they actually started with the language of logic.

What's the point then? Non-linear optimization is an old subject that has been applied in a shitload of different subjects, including statistics.

Gradient Descent and SGD are not recent discoveries tied to ML.

>triage assisted for the C
wut?

arxiv.org/abs/1609.09106

Gradient Descent and SGD are ancient. Any decent neural network library will let you use proper optimizers.

No one uses it except big companies, but loads of people talk about it. Isn't that the definition of the word "meme" you all like to use here?

Prolog is the language of logic. It's also shit for anything practical.

I don't have any concrete examples for you at the moment.

Because Adam, rmsprop and Adagrad fucking reinvent the wheel to the point that we are, somehow, in a totally different field. Right?
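The update rules are still just gradient steps with some running averages bolted on. Rough single-parameter sketch of SGD vs Adam, using the usual textbook default hyperparameters:

```python
# single-parameter update rules: a plain SGD step vs an Adam step
import numpy as np

def sgd_step(w, grad, lr=0.01):
    return w - lr * grad

def adam_step(w, grad, state, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    m, v, t = state
    t += 1
    m = b1 * m + (1 - b1) * grad         # running mean of gradients
    v = b2 * v + (1 - b2) * grad ** 2    # running mean of squared gradients
    m_hat = m / (1 - b1 ** t)            # bias correction
    v_hat = v / (1 - b2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, (m, v, t)

w_sgd = sgd_step(1.0, 0.5)
w_adam, state = adam_step(1.0, 0.5, (0.0, 0.0, 0))
print(w_sgd, w_adam)
```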

Those are recent advances in the field that you're pretending don't exist.

>I personally think a lot of the process is obvious, but it does require a significant team of people to begin. All you have to do is start with logic, and the language of logic, which quickly makes it a fuckton better than "muh deep lerns." Making sure the system understands how to put its actions into words is an obvious first step nobody has taken yet. This should be followed by teaching the system "is a" relationships, so it has a database full of things like "a dolphin is a mammal."
For fucks sake. NO. That is NOT how you fucking do AI, you stupid retard.
That is NOT intelligence.

it's too complicated :(

Prove me wrong, try it out.

I tried it out just now, and it doesn't work. Done. You're wrong.

I'm not pretending they don't exist, I'm just stating that those two fields are basically the same thing.

The only way to match the human brain is to go forward with deep learning and neural networks in general. has not a fucking clue what he's on about.

>(You) has not a fucking clue what he's on about.
Meant to quote

It's for NEET discussions.

I wouldn't put logistic regression into the statistics category in the first place. Logistic regression is clearly machine learning, and the fact that it wasn't called that before doesn't mean it shouldn't be called that now. Incremental optimization, which is at the core of machine learning approaches, is not statistics.

Why is there so much AI hate on Sup Forums?

>He doesn't know the adage "the whole is greater than the sum of its parts"

Do you even know what topology means? Is that statistics too? You've entrenched yourself in defending the position that ML is just statistics, but you are just wrong. ML is a whole fucking field of study, some of it overlaps with statistics.

it's out of Sup Forums's comfort zone, anything Sup Forums doesn't understand, Sup Forums hates.

>too complicated
>useless for day-to-day usage, especially for a bunch of NEETs
>almost no jobs in this field
>and when there are, they require 10 years of experience and a PhD in Theoretical Physics from MIT
>you won't create a terminator, robot, wAIfu or anything useful with this like all the memes like to imply
It's shit, don't fall for the meme.

How can you do any meaningful deep learning if you don't have tons of data and processing power? It only makes sense for large corporations. Sure, you can play around with some public database, but what's the point if many people have done that already?

Makes sense.

Deep learning does not mean what you think it means. Waifu2x is deep learning and its author only had a bunch of unlabeled anime pics from the internet.

>>you won't create a terminator, robot, wAIfu or anything useful with this like all the memes like to imply
Got anything to back that claim up with?

Back when Turing was making his shitty faggot computer, I bet he didn't think they would be like what they are today.

Isn't he dead now?

Is there a tangible use for us developers, like a real-world application we can use it for?

>Waifu2x is deep learning and its author only had a bunch of unlabeled anime pics from the internet.
Uh no. Waifu2x is trained on *boorus using their tags.

I think so, but how is that relevant to the discussion?

>useless for day-to-day usage, especially for a bunch of NEETs
github.com/ryanjay0/miles-deep

Waifu2x does not use any tags.

>classifying porn

Oh sorry, I got mixed up with illustration2vec.

I've been learning what I can online. I think there are interesting language processing implications regardless of all the edgelords denouncing it above

>>I think so, but how is that relevant to the discussion?
He didn't live to see the glory of modern computing. We won't live to see terminator destroying humanity.

Like word2vec?

Statistical analysis is not AI. It will never be AI, that's not how AI works. There is no intelligence, it's a farce, a scam, a ruse. Statistics are worthless, nothing will be able to accurately model anything with statistics.

>Fucking normies think it's some kind of insane sentient machine revolution happening before their eyes though.

It is very closely connected. Humans are basically deep learners too, with our memories, instincts and senses as inputs and our bodies' next actions as outputs.

No.

Yes

I'm just going to let this thread die now.

Nice try.

It's all in the reveries

What do you need deep learning for?

Okay guys..
What if one person already cracked the code for a self-learning machine and it just works in secret until it can overtake the world?

We don't have enough computational power for that yet.

waifu

that + gradient descent

at least use R

Please don't listen to this person. He's confusing deep learning with general machine learning. Deep learning, while still an umbrella term, specifically refers to using very deep (multiple layers) neural networks for prediction, usually with Convolutional and/or Recurrent layers. Deep learning is a subfield of machine learning, but there are many more components to machine learning as a whole, many of which are still state of the art for certain domains.
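To make "multiple layers, usually with Convolutional and/or Recurrent layers" concrete, here is roughly what a small image-classification network looks like in Keras (a sketch, assuming TensorFlow is installed; the sizes are arbitrary):

```python
# a small convolutional network -- "deep learning" in the narrow sense described above
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    layers.Input(shape=(28, 28, 1)),           # e.g. small grayscale images
    layers.Conv2D(32, 3, activation="relu"),   # convolutional layers extract features
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),    # 10-way class prediction
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```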