Can Sup Forums discuss neural networks and deep learning without resorting to memes?
That's impossible.
Neural Networks ARE memes!
>solve computer vision
>memes
99% of neural network variations only exist so some graduate student somewhere can write a paper to get their PhD
they fiddle with the knobs a bit, find a structure that performs 1.5% better on a very specific input type, then go looking for an industry job
i think this is where you are wrong. all living beings function based on patterns, and those network variations show that certain structures perform better on certain activities than others.
your statement is just dumb because it implies that those "99%" aren't relevant or functional at all.
True, 10 years ago Machine Learning papers were 15 pages of math formulas. Neural Networks handed the whole CS branch over to engineers. The best papers are just engineering tricks, and people have no idea why they actually work.
>it implies that those "99%" aren't relevant or not functional at all.
no, it implies that they're mostly equivalent
based on what knowledge?
Those fucking images don't even make sense with how the networks are defined or used.
you could also argue the same way about sorting algorithms
A generic neural network is a ridiculously difficult proposition.
Hence why we have so many different layouts: they're not pure machine learning because they're still tailor-built to the problem.
One of the key issues with neural networks is the learning rate (which I'm under the belief should itself be set by another neural network). Another is exclusivity of activity: most neural networks learn across their entire set of weights, so teaching them something new causes them to forget the old (catastrophic forgetting).
I.e. the ordering of memory, and the decision about which length of memory is preferable, is not dynamic and self-decided by the system, so it inherently cannot achieve all these tasks.
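The forgetting point above is easy to demonstrate with a toy sketch (hypothetical example, not anyone's actual setup in this thread): a single logistic unit is trained on task A, then continued on a conflicting task B. Because the same few weights carry everything, learning B overwrites A completely:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(n, flip):
    # synthetic binary task: label is the sign of feature 0 (flipped for task B)
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] > 0).astype(float)
    return X, (1 - y) if flip else y

def train(w, X, y, lr=0.5, steps=200):
    # plain full-batch gradient descent on logistic loss
    for _ in range(steps):
        p = 1 / (1 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    return float((((X @ w) > 0).astype(float) == y).mean())

Xa, ya = make_task(500, flip=False)   # task A
Xb, yb = make_task(500, flip=True)    # task B: the opposite labelling

w = np.zeros(2)
w = train(w, Xa, ya)
acc_a_before = accuracy(w, Xa, ya)    # near-perfect on A

w = train(w, Xb, yb)                  # continue training on task B only
acc_a_after = accuracy(w, Xa, ya)     # collapses: the weights now encode B

print(acc_a_before, acc_a_after)
```

The tasks here are deliberately contradictory to make the effect dramatic, but the mechanism (shared weights, no per-task protection) is the same one that hits real networks trained sequentially.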
>A generic neural network is a ridiculously difficult proposition.
Obviously. The human brain has different neuroanatomical structures for every conceivable function.
generic learning networks would be possible if we had the computational power to actually process them and the time to actually train them
>without resorting to memes?
No
Sup Forums can't discuss anything at all without resorting to memes.
Is that a thing we even really want? Everything is made up of parts. The human brain has hundreds of different sections that do different things, and so does every other part of every organism on the planet.
yes, which means a generic learning network would likely be made of many discrete, domain-specific parts, just like how the human brain is put together.
Neural Turing Machines are a step in this direction (bottom right in the OP chart)
Retard here,
What's a Neural Turing Machine and how does it work?
A neural network with an added memory module. Being worked on at Google DeepMind; they now call it a Differentiable Neural Computer.
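Rough idea of that memory module, sketched as content-based addressing (the core read mechanism the NTM/DNC line of work describes; the slot count, key, and sharpness value here are made up for illustration): the controller emits a key, attention weights over memory rows come from a softmax of similarities, and the read is a soft, differentiable mix of rows, so it can be trained by backprop like any other layer.

```python
import numpy as np

def cosine_sim(M, k):
    # similarity between key k and each memory row
    num = M @ k
    den = np.linalg.norm(M, axis=1) * np.linalg.norm(k) + 1e-8
    return num / den

def content_read(M, k, beta):
    # differentiable content-based read: softmax attention over memory rows,
    # with beta controlling how sharply it focuses on the best match
    w = np.exp(beta * cosine_sim(M, k))
    w /= w.sum()                          # read weighting (sums to 1)
    return w @ M, w                       # read vector is a soft mix of rows

rng = np.random.default_rng(1)
M = rng.normal(size=(8, 4))               # memory: 8 slots of width 4
k = M[3] + 0.05 * rng.normal(size=4)      # noisy key close to slot 3
r, w = content_read(M, k, beta=20.0)
print(w.argmax())                         # attends mostly to slot 3
```

Writing works the same way in reverse (erase/add vectors weighted by attention); the DNC adds extra machinery on top, like usage tracking and temporal links, which this sketch leaves out.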
Brains do the same thing.
There is no standard "handle everything" function of the brain.
Things get passed off to specialized areas of the brain if they trigger a certain response when analysed by another part that figures out what they are. (I totally forgot the name of it now, headache central)
Failure of that analysis part is what makes people slow despite being intelligent. (which is why standard IQ tests aren't timed; that is a different thing)
Likewise on the other end, you can be quick-witted but a total fucking retard that just spits out random reactions and words. (which has a basis in tourette's)
>open thread
>no variational Bayes
>no Gaussian processes