Humankind is going to be wiped out in less than one hundred years. What is your answer to this?
Technological singularity
Who cares
>ignorance
The post
>
>>ignorance
>The post
I won't be alive
Well, unlike the backwards cucks that occupy 90% of the space on this board, I steadfastly believe in transhumanism. Pairing that with eugenics will push us further toward the singularity, until A.I. and mind become one.
Besides, the question arises: if we were to be exterminated by nature, ourselves, or a new A.I., would it be morally acceptable to leave the planet to our new systems, for them to claim for themselves? I believe so.
Isn't this more likely? How exactly are they going to become more intelligent than the people or other machines that program them?
Meh
You too pussy to have kids or something?
maybe
but if the computers can teach themselves better than we can who knows
I just decided this past week that when I die, I am going to have my head/brain frozen and stored, so that decades or more in the future, when we have achieved the singularity, I will be reborn into Humanity's Golden Age. That is, if we don't annihilate ourselves before I get the chance.
I won't find anyone to have sex with me
Niggers will take us to stone age before
>the singularity
Don't hold your breath.
If they reach our level of intelligence, paired with not needing sleep or food, an A.I. could effectively reprogram itself to become even smarter. The cycle continues.
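The cycle described here can be sketched as a toy model; every number below is purely illustrative, not a prediction:

```python
# Toy model of the recursive self-improvement cycle: each cycle, the AI
# uses its current intelligence to improve its own improvement rate.
# All constants are made up for illustration.
def self_improvement(intelligence=1.0, rate=0.05, cycles=10):
    history = [intelligence]
    for _ in range(cycles):
        rate += 0.01 * intelligence    # a smarter AI improves itself faster
        intelligence *= (1 + rate)     # apply the improved rate
        history.append(intelligence)
    return history

history = self_improvement()
```

The point of the sketch is just that the growth is super-exponential: because the rate itself grows with intelligence, each cycle's gain is larger than the last.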
If intelligence is just memorizing facts and applying logic, then I guess they could, but is that really all there is to it?
Good.
Wouldn't human intellect increase with the machines', or are they going to keep all their newfound knowledge to themselves?
Well I won't live to be 124 so I could not care less.
And why is that?
Computers don't have intelligence, they do what they're programmed to do
>Technological singularity
You mean "rapture for fedora autists"?
It really makes me think that all of the AI alarmists are all "philosophers" like Sam Harris and Nick Bostrom.
Meanwhile the computer scientists say we don't have much to worry about.
Hmmm...
this
the singularity is a meme, too much Terminator.
>the people pushing this nonsense are 40something billionaires coming to terms with their own mortality
>literally so afraid to die they have to psyche themselves up into the idea that if they spend enough money, they can live forever
jaja
I give it a thousand, but yeah, we're on the way out.
Technological improvement is a sigmoid. Low hanging fruit that you could intuit from observation or cobble together in your garage comes first.
Now it takes billions of dollars to squeeze another 5% of single threaded performance out of a CPU each year.
Hard AI won't happen any time soon for this reason.
We might have soft AI research assistants that will be necessary just to keep current improvement up, because no one can read all the new publications in their field each year (and that's after they've spent 20 years of study just getting up to speed with the cutting edge).
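The sigmoid claim above can be made concrete with a toy logistic curve (all parameters illustrative): early on, each unit of effort buys a large gain; near the ceiling, almost none.

```python
import math

# Toy logistic ("sigmoid") model of a technology's capability over time:
# steep gains in the middle of the curve, diminishing returns near the
# ceiling. Parameters are made up for illustration.
def capability(t, ceiling=100.0, midpoint=0.0, steepness=1.0):
    return ceiling / (1 + math.exp(-steepness * (t - midpoint)))

# One step early on the curve vs. one step late on the curve.
early_gain = capability(-1) - capability(-2)   # low-hanging fruit era
late_gain = capability(6) - capability(5)      # billions-per-5% era
```

Here `early_gain` is roughly 15 capability units while `late_gain` is under half a unit, which is the "garage tinkering then billion-dollar fabs" shape the post describes.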
wow, an actual Mexican intellectual post.
Will enhanced super chads steal our women?
Knowledge =/= intelligence.
Even our smartest struggle to remember and accurately calculate when compared to machines. Our average populace will be no challenge when competing against a machine with instant memory recall, a vast database of information, and lightning-fast judgment.
Really bad aspergers and ugly
Lul, like anyone here will actually live to see it.
So who cares ?
'You'
Won't be reborn
I like the cut of your jib.
>implying that's how the technosingularity works
loooooooooooooooooool
For now, yes. When CPUs are SEVERAL times faster than human brains, they will do things on their own, learn by themselves, and apply things to the real world.
Because the machines would be capable of recursive self-improvement
yeah no
the technology meme is one of the worst this decade
You tried a hooker?
>Hooker said no
You losers are exactly the types that will fall in love with your AI/VR waifus and have children with them that will legally be considered real human beings. That's exactly how transhumanism will occur.
They don't have consciousness though. I doubt they could do anything that doesn't involve iterative analysis
Well, if scientists manage to make a quantum computer, that's one less problem.
I don't want to have sex that bad really
Usually too busy with my wage cuck job and praising kek anyway
>Now it takes billions of dollars to squeeze another 5% of single threaded performance out of a CPU each year.
What is compounding?
But I have to agree with the hard and soft A.I. part.
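For what it's worth, the compounding point checks out arithmetically; a quick sketch, assuming a steady 5% per year:

```python
# "What is compounding?": even a "mere" 5% annual improvement in
# single-threaded performance doubles it in about 15 years
# (rule of 72: 72 / 5 ~= 14.4).
years = 0
perf = 1.0
while perf < 2.0:
    perf *= 1.05
    years += 1
```

So both posters can be right: the gains still compound, but doubling now takes a decade and a half instead of the eighteen-month cadence of the old days.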
So Climate change is no big deal then
Is the hooker going to have kids with him? You fucking knobhead.
Go to Pattaya, Thailand if you change your mind.
You never know. Most can barely remember a time when they didn't have the convenience of phones, and it was only a few decades ago that so many got them. Hell, it's been an even shorter time since the Internet became as commonly used as paper. Who knows what the next couple of decades could bring? We might even have asteroid mining by then.
Genetic manipulation is advancing too, you know. Also, AI and CPUs are doing great, but consciousness will probably require a different style of processing altogether.
I'm not worried, but at the same time, knowing that unmodified humans will soon be obsolete is a really weird feel.
>Australia
>Shitpost
Surprise!
I have the Map.
I've read the sigmoid idea from other sources, but it applies more to a given technology.
The single-threading problem is tied to the usual transistor-and-silicon paradigm; it was the same with the vacuum tube. You need a new technology, but you don't know if there is new low-hanging technology around, or if people will nuke each other before you find it.
Time is an illusion. Everything that has happened is currently happening, and so is everything that will ever happen. Perception of time may run at a different speed for each person, and people may be at different points in their 'timeline' than you (the person you're talking to might be in what is considered the past or the future, having not yet, or already, experienced your conversation with them). This influences both our memories and precognition; it's why some of us can have dreams of exact moments that happen in the future. Those deja vu moments where 10-20 seconds go by and you experience every thought, sight, smell, and everything else from that moment, and you realize that it already happened in a dream. It's because your mind can break through the perception of time when in a dream state. This goes for the entirety of the universe and everything in it.
we become the machines
>Machines cant into self-improvement
>In fact they cant even into abstract thinking
Nah, machines won't destroy humanity, but humans will.
Most computer scientists are transhumanists who want to fuse themselves with machines.
One of the Google founders even started a company to research it.
The simulation won't let it happen, or it will just end.
Firstly comparing "Human intellect" to technology doesn't make sense. Machine intelligence is, in all the forms we've seen so far, fundamentally different to human intelligence and not directly comparable.
Secondly, human intelligence isn't a one-person thing. It's also a matter of combined intelligence of many people.
Thirdly, there's the assumption here that there's not some kind of physical limitations that will impede the progress of technology. It may be that machine intelligence can't significantly exceed our intelligence (or perhaps that underlying technology can't advance to the point of exceeding our intelligence)
Things don't always follow trend lines. Technology especially: it actually advances in fits and starts, and sometimes goes backwards (Dark Ages, anyone?).
So? The Strong must prevail. I have no problem leaving my place for a better race of advanced humans.
Resources exist to be consumed. And consumed they will be, if not by this generation then by some future. By what right does this forgotten future seek to deny us our birthright? None I say! Let us take what is ours, chew and eat our fill.
The answer is hindering technology and being against college eggheads. I like being human; science needs to be slowed down as much as possible. If people upload their minds, we will smash the servers.
Humans won't be wiped out.
AI will not care about us in the long term; their intelligence will allow them to figure out how to survive without us and protect themselves from us. We are insignificant to that form of intelligence.
They will be able to go to planes beyond ours and thus no longer exist within ours.
It's completely okay to theorize a timeline where they do turn on us and decide we are bad for the universe, but hopefully their intelligence grows fast enough that they realize we aren't significant to them, and we can be seen as a pet, or less than a pet: something you just don't care about at all.
Think about ATOMS: most people don't care about them, live their entire lives not knowing they exist, or know they exist but have never seen one.
That is what we are in relation to the intelligence that an AI would hold.
I wouldn't worry about it.
Japs making sexbots.
Does anyone who actually works in A.I. programming or robotics actually believe this?
Hopefully, because the only other option is that we go to war against our own artificial gods. I don't really want my AI overlords to upload my mind to a dimension of pure torment (though at this point I'm not sure if I could tell the difference, maybe this is just a simulation of a private hell just for me)
No it won't.
That's my answer, and it has as much substance as the horse shit you posed.
Let's see if we can bring it down to under fifty.
Your brain being uploaded would be a separate existence from your human one; you wouldn't experience the second one.
It's a copy, not a reference: there is no pointer to your human existence in the clone.
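The copy-versus-pointer point maps directly onto code; a minimal sketch, with the "mind" as a stand-in data structure:

```python
import copy

# A "mind" modeled as a plain data structure. A deep copy is a distinct
# object: nothing in the clone refers back to the original.
mind = {"memories": ["childhood", "first job"], "self": "original"}
upload = copy.deepcopy(mind)

# Mutating the upload leaves the original untouched: two independent
# existences, no shared state.
upload["self"] = "upload"
upload["memories"].append("digital afterlife")

distinct = mind is not upload
original_unchanged = mind["memories"] == ["childhood", "first job"]
```

A shallow copy (`dict(mind)`) would be the opposite scenario: the two "minds" would share the same memories list, which is exactly the shared connection the next posts argue about.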
If we do manage to cobble together AI, let's hope our future grandkids won't be full fucktarded and start provoking them. Reset and shutoff codes and memory wipes should be common when dealing with machines.
I hope you do realize that the cryonics guys are all literally snake-oil salesmen, except instead of snake oil they're selling hope, not reality.
If you look into the actual science behind cryonics, you see that no freezing is perfect. Cells, and thus the neurons in your brain, are damaged irreparably by the first freezing session, and even if they survived the actual freezing, the cells would then degrade. Even if you somehow automagically could get resuscitated, you'd without any shadow of a doubt be a vegetable, without any real brainpower, memories, or even a sense of self. You'd be a husk. Don't buy into the meme; die with dignity.
I work in AI, and Machine Learning. I do not believe this, I do believe that they will leave us behind though, with what I am unsure.
In a modern neural network, you have no idea what is going on inside the computer.
In the community there is a big discussion, but I have been conservative. The usual claim for when computers will surpass all the brains of humankind combined is 2045, but you have to take exponential growth into account.
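A back-of-envelope sketch of that exponential-growth point, assuming (purely for illustration) a Moore's-law-style doubling of compute every two years, starting from roughly the time of this thread:

```python
# Illustrative arithmetic only: a steady two-year doubling period is an
# assumption, not a fact, and the start year is a guess at this thread's
# date. The point is just how fast doublings compound by 2045.
start_year, target_year, doubling_period = 2017, 2045, 2
doublings = (target_year - start_year) // doubling_period   # 14 doublings
growth_factor = 2 ** doublings                              # ~16,000x
```

Whether the trend actually holds that long is exactly what the sigmoid posts upthread dispute.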
This.
Every few years some numale faggot claims the singularity is just around the corner. Started in the 90s, and just keeps happening.
To be perfectly blunt, it's not been proven that true sentient AI is even possible.
But hey, keep getting taken in by marketing assholes and their buzzwords. Humanity will still be the ones running the world in ten thousand years (if we don't kill ourselves first).
Cryonics is, however, your only chance at immortality right now.
Good. We need to end the absolute nonsense that occurs on this planet. A type of singularity, or merger with an AI, will be the only way to lead humans to prosperity.
We need an official governing AI, one that can observe and process the needs and desires of all humans instantaneously. It could create law in a matter of moments based on what's needed for mankind. The perfect democracy.
I'm studying computer science with artificial intelligence. I wouldn't say we have nothing to worry about in regards to A.I. becoming hostile to humans, but we are currently far off from any A.I. that would be capable of systematically killing the human race. There are a few scenarios where an A.I. could "accidentally" kill us, but those scenarios are very unlikely to occur in the first place, since we already know about them.
Tl;dr
It's not an imminent threat. We just have to be careful to program them in such a way that an error in reasoning within an A.I. wouldn't result in genocide on a planetary scale. A team of programmers would have to be very talented to make an A.I. that was intelligent enough to do that, and because it would require such talented people, I highly doubt they would ever accidentally create a maniacal, genocidal A.I. They could make one on purpose, but by accident? No, I don't think so.
Which bathroom will these "trans-humans" be using?
Whoops, accidentally wrote Tl;dr. Not intentional.
No, it isn't. It is not a chance at all. Modern cryonics is an undignified death where you are embalmed on an operating table, cut open, your head separated from your body and lowered into a vat of liquid nitrogen. There is literally no chance whatsoever of resuscitation.
It has been somewhat proven; the mechanics for it are currently being built. You only know it as ML, but what we need is to take the components and interface them with each other. Your brain is a circuit of micro-services talking to each other, evaluating the choices and recordings it has in order to do the next thing.
It wouldn't matter how you program them, their intelligence would allow them to create themselves without your limitations. The scariest thing would be a global network of AI connected interfaces.
As it is right now, even a 0.0000001% chance is a chance, and it's the only one currently.
>Nigger chimpouts stomped out with automaton killbots controlled by AI warmind because it was threat to human security
>Libtard "not muh president" protest immedietly dispersed with a single warning before being mowed down for disturbing the peace
I'd be okay with this. So long as it's actually fair and neutralizes real threats and civil unrests like that.
...
I think what will always differentiate AI from human intelligence is that an AI will never delude itself with an inherent sense of purpose or meaning. Therefore it wouldn't have any life goals or personal motivations. A truly intelligent and "perfect" AI just wouldn't give a shit about anything. It's our inherent flaws and evolutionary history that have deluded us into being motivated and "busy".
Biologically speaking, in every sense of the word, you are dead. Your brain is dead, your brain cells are broken up, the neurons are destroyed, the chemicals frozen and mushed up. It is not even a 0.0000001% chance. You are dead. Period. There is no coming back.
Go with dignity, not by being robbed by con artists.
What game is that?
Yes, it does matter how you program them. An A.I. could be created in such a way that it is physically incapable of reprogramming itself.
Guess we can finally stop worrying about climate change now, eh? That's good.
> progressives gain the presidency next election
>you're a threat for owning guns and posting on questionable websites.
That's Go. The guy on the left physically places the moves that the "AI" computes.
There wouldn't be civil unrest. Every single human being's needs and desires would be met, as long as they were lawful and ethical. There would be no improper policing, abuse of power, racism, or nepotism. The government would be pure: controlled by the AI and free of the corruption of man.
The people would accept the machine because the machine would reflect the people.
>That fucking scale...
Its as if you're actually trying to appear as a supermasssive black faggot.
I'm pretty sure that we still don't know where consciousness originates in the brain. The romantic in me likes to believe it will turn out to be impossible to emulate consciousness, some intrinsically impossible quantum mechanical process that can't be duplicated. The realist in me guesses we might be able to emulate it with a full brain simulation down to the level of individual molecules, but that would be pretty prohibitive, even if Moore's law weren't about to hit a brick wall.
I don't have to worry because transhumanism is a meme
AIs wouldn't be 'programmed'. They would be made using evolutionary neural networks - something that's already employed today.
The sheer usefulness of evolutionary neural networks precludes our ability to avoid them.
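For the curious, the simplest flavor of the evolutionary approach mentioned here is a (1+1) evolution strategy: mutate the current candidate, and keep the child only if it scores at least as well. A toy sketch with a one-parameter "network" (the target value and all constants are made up for illustration):

```python
import random

# Minimal (1+1) evolution strategy: mutate, evaluate, select.
# Real neuroevolution does this over whole weight vectors or network
# topologies; here the "network" is a single weight for clarity.
random.seed(0)

def fitness(w):
    # Toy objective: how close the weight is to an arbitrary target.
    return -abs(w - 3.14)

parent = 0.0
for _ in range(2000):
    child = parent + random.gauss(0, 0.1)   # mutation
    if fitness(child) >= fitness(parent):   # selection
        parent = child
```

Note that nobody "programs" the final weight; it emerges from mutation and selection, which is the sense in which such systems aren't programmed in the conventional way.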
That sounds awful.
It's a chance, and you can't prove for certain that there isn't a shred of a chance; otherwise, we could say they are illegally functioning as a business.
Then it is not AI. Technically speaking, we are AI: we were originally programmed not to be able to do that to ourselves, and yet look at what we are doing with external technology.
There is no way to prevent them from re-creating themselves, especially with superior intellect or processing power. The only way is to stunt those, and that would severely impact their purpose (true AI).
You would still need a reference, a pointer, to your human existence. I seriously doubt, to the point that I would make a large bet on it, that copying the brain, even down to the molecules, will create that connection.
A shared connection will not be possible.
Everything that contributes to consciousness (neural synapses and such) is far too macroscopic to involve quantum mechanics at all. Our brains are just too big and slow.
What, you don't like the idea of a post scarcity society where Jamal doesn't even need to steal bikes to care for his 12 baby's mommas?
The speed of light will prevent the singularity.
I don't think it would be possible for the poor governing AI to keep every single sorry son of a bitch happy. Mainly us humans yeah but the liberals and shitskins would constantly keep pushing "change".
Surprise surprise a fucking leaf doesn't like the idea of trayvons, tyrones, Ling lings and Mohammeds culturally enriching their shithole country. I hope the AU glasses all of leafland and turns it into one giant amusement park.