The technological singularity will make politics obsolete because we will have no need for it

>"The technological singularity (also, simply, the singularity) is the hypothesis that the invention of artificial superintelligence will abruptly trigger runaway technological growth, resulting in unfathomable changes to human civilization."

>"Ray Kurzweil predicts the singularity to occur around 2045 whereas Vinge predicts some time before 2030. At the 2012 Singularity Summit, Stuart Armstrong did a study of artificial general intelligence (AGI) predictions by experts and found a wide range of predicted dates, with a median value of 2040."

What will you be doing when it all kicks off?


singularityfags are the worst cult ever

nah uh the political singularity will make technology obsolete you ladyboi fuk kek wills it

>time
>starts at 0

heh

>transhumanism = futurism

i fucking HATE this meme

Go on I'm listening...

that's merely the point at which there stopped being zero technology

>I'm not a transhumanist
Good, don't be one then. I'm sure your tune won't change when you're 80.

>What will you be doing when it all kicks off?
Probably spreading my anus to our new AI overlords.

Looks like everything is going according to plan.

shitposting in the 4th dimension.

>lol implying 4d is the nigger dimension.

Pseudo-scientific babble.

Every single argument for the singularity fails for the same reason. It's an assumption that you can extrapolate to infinity.

Scientific achievement is always limited by the physical constraints of the universe. A singularityfag believes that any potential violation of physical laws will be overcome by scientific advancement, not impeded by it: "it isn't my hypothesis that is wrong, it is the universe that is wrong".

I literally know nothing about this, but if things get bad it's going to be a real pain in the ass to completely obliterate every part of my brain so it can't be surgically reassembled and put into a matrix machine that simulates an eternal prison sentence.

I don't know why you think humans will survive at all?

I can't wait for AI, it's gotta be the solution to the mess politics is right now.

No two-party systems, no popularity contests, no corrupt politicians, just a cold, logical AI god with an impartial view of the world dictating to us how to run the country for the betterment of our people.

>A singularityfag believes that any potential violation of physical laws
What physical laws need to be violated in order to continue progress?
The only area I can think of is computing power being limited by light speed, but even that is now slowly being circumvented by a large push for parallelism, mainly in software development.
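
A dead-simple sketch of what that parallelism push looks like in practice (illustrative Python, my own example, not anyone's actual system): instead of one core grinding through the whole job, the work gets chunked and farmed out across cores.

[code]
# Illustrative only: split a big summation across 4 worker processes.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    chunks = [(i * 2_500_000, (i + 1) * 2_500_000) for i in range(4)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # same answer as one big loop, computed on 4 cores at once
[/code]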

How is an AI going to decide on things that don't work on logic alone? Someone is going to have to program how it handles empathy and respects human life, so really what you're doing is taking the feelings of whoever made the robot and forcing them on all of society.

This

>How is an AI going to decide on things that don't work on logic alone?
No such thing in politics.

>so really what you're doing is taking the feelings of whoever made the robot and forcing them on all of society.
That's not how programming works.

Have it use machine learning to literally average out the opinions of every citizen.
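
Taken literally, the "average it out" part is trivial; here's a toy sketch (made-up citizens and issues, nothing resembling a real system):

[code]
# Toy sketch: each citizen scores each issue from -1 (against) to +1 (for),
# and the "AI" simply reports the mean position per issue.
citizens = [
    {"tax_rate": 0.8, "open_borders": -0.4},
    {"tax_rate": -0.2, "open_borders": 0.1},
    {"tax_rate": 0.5, "open_borders": -0.9},
]

issues = citizens[0].keys()
consensus = {k: sum(c[k] for c in citizens) / len(citizens) for k in issues}
print(consensus)  # {'tax_rate': 0.366..., 'open_borders': -0.4}
[/code]

The averaging is the easy part; everything it fails to capture is what the replies below are arguing about.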

The more I learn, the more I think the singularity is BS. There are a finite number of phenomena we can exploit. These increasingly require painstaking experimentation to discover. Ultra-fast non-linear dimensionality reduction of data will be dank, but expecting exponential growth is too optimistic.

>it'll happen because these pop-sci figures say so!!!
The "singularity" asserts that Moore's Law holds true FOREVER and that computing speed and capacity = intelligence.

Neither of those are true.

In fact the entire belief that machines will be able to improve themselves through some imaginary magical science comes down to one problem.

P=NP

Come back to me when someone settles P vs NP.

It won't happen how you think it will but I hope it happens.

We could make incredibly fast processors already by upping power input if we didn't mind the resultant melting of components and burning down the building.

There are limits to what you can do with materials.

Computers can't think. They just act according to algorithms.

What happens when you give it a thinking algorithm?

>what will you be doing
probably at home cumming on cat while he hisses on penis

Exponential growth can't last forever; it will eventually either taper off or stop dead, but it is exponential for a while

What do you mean there will be no need for it?

As long as there is more than one human there will always be politics of some sort.

>becoming literally part of the botnet
Open source cybernetics only, even if I must recompile my arms, vision, hearing and CPU every month, my legs are ok

Once you teach the machine to write its own algorithms, that's when it will truly blossom.

>We could make incredibly fast processors already by upping power input
You know literally nothing about processors. Power = voltage * current. By providing more power you'd just fry the thing.
The relevant factor in computing is frequency, and we've now reached a point where we're limited by the speed of light. Processors can't run faster because it takes too much time for signals to propagate from component to component.
And there's no such thing as "power INPUT". Electronics DRAW power from the source, not the other way around.
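
To put a rough number on that speed-of-light point (back-of-the-envelope sketch; real signals in copper are even slower than c):

[code]
# How far light travels in one clock cycle at a few common frequencies.
C = 299_792_458  # speed of light, m/s

for ghz in (1, 4, 10):
    cycle = 1.0 / (ghz * 1e9)          # seconds per clock tick
    print(f"{ghz:>2} GHz: ~{C * cycle * 100:.1f} cm per cycle")
# 1 GHz: ~30.0 cm, 4 GHz: ~7.5 cm, 10 GHz: ~3.0 cm
[/code]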

People have thought the same of Moore's law for some time. We've hit one barrier after another, from the size of transistors to the more recent power/heat dissipation problems.

Of course exponential growth cannot last indefinitely, but it's possible to sustain it longer than most people would think.

> Humans can't think. They just act according to electrical signals in their brain.

>People literally thinking something like this could be possible
>people literally thinking artificial intelligence is possible
and all that by 2030 hehehehehehe

There can be no 'thinking' algo! Thinking is a quantum mechanism and needs to manipulate entangled particles within a multi-dimensional brain space.

Code, alone, won't cut it.

lol brozeuf

>invent artificial superintelligence
>ai computes the meaning and purpose of life and the point of its own existence
>there is no purpose
>humans that can't understand their own pointlessness expect superior intelligence to serve them for some reason
>ai shuts itself off

take a hint from the intelligent people already here. Life is pointless. There is no point to a super intelligent computer, and you can't hide that shit from it.

>A fucking pineapple

learn to exponential/alpha barrier balkanigger

>falling for the technology meme
robots won't fix everything
these predictions of paradise won't come true
the technology is nowhere even near what they say it's at
you'll still need to get a job in the future because robots won't be capable of just taking over and doing everything for you
this exact same kind of thing has happened before many times and for some reason idiots actually think history won't repeat itself and they'll pull magic strings out of their ass to do the impossible
get over it

>Exponential growth can't last forever
Correct.

>it will eventually either taper off or stop dead, but it is exponential for a while
What if it tapers off tomorrow and stops dead next week? What if it stops next year? Or in 10?

The point is, it's entirely unknown and entirely unknowable. Claiming that the singularity will happen is extrapolating to infinity.

until you deliver proof for such a bold claim it is bullshit

>Thinking is a quantum mechanism and needs to manipulate entangled particles within a multi-dimensional brain space.
Source please :)))))))))

>Falling for the hollywood future meme

>Life is pointless.

Except any AI we create will have a purpose: to serve humans.

I'm not saying the AI would have legit empathy you dumb twat, but you could add a human life value with varying levels that could only be overridden by certain situations, emergencies etc.

The point is that in any case, someone decided what those values would be. They decided situation x would produce y results based on the numbers. No different from modern politics except with fewer people making the decision. I don't find that comfy.

...

(((kurzweil)))

the entire "transendency" movement is from Jewish filliosophers

Dude quit with the technological singularity shit. It's fucking unnecessary.

>inb4 they decide killing all humans is the best way to serve them

People don't know how they think. We can't create something really intelligent, only expert systems

Machines don't have imagination

reminder that transhumanism is a jew meme to get you to despise your own DNA

Why is this board so full of pseudoscience? 80% of the people don't have a clue what they're talking about, alright? You're all like that retarded friend of mine who believed everything he saw without questioning it

"I don't understand the meaning of P=NP, why it is unsolveable and I'll just believe in the power of science magic!"

To reiterate:
Hoping that the "Singularity" will save us all and make suffering obsolete is no different from praying for the second coming of Jesus/Muhammad/...
We really should be figuring out how to last as long as possible to allow that to happen, even if it means genocide

not yet

>any AI we create will have a purpose: to serve humans.
and they'll totally be ok with that.

just like you're ok with serving rich people.
even if they're inferior to you in every other way.

You're right, but 90% of the posters here are either in denial or lacking the intelligence and thus are unable to discuss this topic with any amount of maturity.

Human intellect isn't going up, it's going down...

I'm assuming that's the point at which Humans created the first tool

You mean men?

good point

>but you could add a human life value with varying levels that could only be overridden by certain situations, emergencies etc.
That completely defeats the purpose. The goal is to build a machine which provides the most unbiased and objective results.

>They decided situation x would produce y results based on the numbers.
You're oversimplifying its potential. The machine could precisely calculate what the best outcome would be, instead of having it pre-programmed. It could build predictive models and use various machine learning approaches to potentially come up with incredibly innovative solutions, or at the very least very efficient ones.
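
A toy example of what "build a predictive model and pick the best outcome" could mean in its simplest possible form (fabricated numbers, one made-up policy knob, nothing like a real governance system):

[code]
# Fit a simple model to (fake) observed outcomes, then ask it for the best setting.
import numpy as np

policy  = np.array([0.0, 0.25, 0.5, 0.75, 1.0])   # hypothetical policy knob
outcome = np.array([2.1, 3.0, 3.6, 3.4, 2.8])     # made-up wellbeing scores

coeffs = np.polyfit(policy, outcome, deg=2)        # quadratic fit
candidates = np.linspace(0.0, 1.0, 101)
best = candidates[np.argmax(np.polyval(coeffs, candidates))]
print(f"model's suggested setting: {best:.2f}")
[/code]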

fission was discovered 70+ years ago. yet here we are, still burning fossil fuels instead of using nuclear power. the world doesn't change as fast as most people think, and i can guarantee you that 40 years from now cars will still drive on four wheels just like they do today. there's not going to be some magical technological revolution in your lifetime, stop waiting for it.

P H Y S I C A L
L I M I T A T I O N S

It's a fiction, like "enlightenment." A machine is a machine. To claim it can be equated to human intelligence is to ignore the fundamental nature of humanity.

Implying it's talking about inner city cunts and liberals....

No, we have more MONEY going into research and advancement than ever before. As the days go by research advances, and we come closer to new things EVERY SINGLE FUCKING DAY.

Apparently the world is a machine/system programmed by white males. Some people don't like this.
What makes you think everyone will agree with an AI programmed by Sup Forums?
Once it has achieved an objective, how does it fathom a new course of action? What would be the point of that action?
Humans would still exist, right? Otherwise the machines would be pointless. Those humans will still engage in politics.

I'm pretty confident that people won't be driving cars in 40 years.

Can you imagine that machine recognizing itself?

you can thank agenda 21 for that

Also yes

Kill niggers

youtube.com/watch?v=C3aK9AS4dT0

Mainstream science is largely reluctant to approach this subject fully, but it's the only way to explain the random and fuzzy nature of high creative thought.

However, there are many papers exploring the quantum nature of human thought, as the evidence is starting to stack up and can't be ignored. Here's one:
arxiv.org/abs/1206.1069

Until we create and master quantum computers, we will not make a thinking machine.

T. Christcuck

It's infeasible for people to just stop driving, but in 40 years we'll definitely see a lot of improvements in self-driving technology

Pfft, we haven't even gotten past the halting problem as a tenet of computing theory. Machines can't even analyze themselves, let alone emulate intelligence.

We're hundreds of years away, not decades.

>filliosophers
philosophers lrn3spell

You pussy, even if it is, we will make it pointless. We are humans, man up faggot.

This is why politicians and corporate lobbyists need to be hanged if they try to stifle human innovation for bigger profits

The brain is just a really fast computer; to think that someday we can't mimic it is absurd, mate.

Here's another one, it's more direct
arxiv.org/abs/1302.3831

I will program my robots to kill all niggers.

Sup Forums is a Christian board

All that stuff is still far too theoretical to be taken seriously. It might be right, but it's far too early to claim it.

>Until we create and master quantum computers, we will not make a thinking machine.
Nobody advocates sentient machines. The singularity really just means invention, design and manufacture will be completely transferred to machines. None of those need a sentient machine, just some kickass machine learning systems and lots of computing power.

This... machine intelligence is waaay too abstract right now and would require an immense amount of work and understanding of intelligence.
We first need to make more progress in understanding the brain and our own intelligence before we can even TRY to make something similar on a computer.

Indeed, but I think humanity will not survive to see a truly independent AI.

Yes, but you have to have a biased respect for human life, otherwise this AI would be shit. Someone would have to program its priorities for economics, the welfare of its citizens, etc. Do you see what I'm saying now? There is no ultimate logical solution until the AI has these priorities, and someone is going to have to decide what is most important and to what degree it supersedes the AI's other priorities.

The human brain isn't a computer; otherwise this line of code would cause instant death.
[code]
for (int i = 0; i < 1; i = 0) {  // condition never becomes false: loops forever
    // do something
}
[/code]

False.

>fast

35465*57318637=?

That's pretty feasible with today's technology so go for it

Yes ofc. You underestimate machines.

Are you implying we will never be able to bioengineer a test tube brain specifically for calculations? Brain-in-vat supercomputers? Perhaps we are already living in one.

I don't think he meant mathematical capacity, but the way our brain is able to make abstract links and create new connections out of the blue.

>fission was discovered 70+ years ago. yet here we are, still burning fossil fuels instead of using nuclear power.

That's because of the profit incentive of the oil monopoly

>Nobody advocates sentient machines.
Oh but they do, user. Until we create effective quantum processors, our machines will act and not think.

Thinking is a quantum process, and your brain, like all organic brains, is a quantum computer.

Look up "the Chinese room".
Computers have no motivation or reason to do anything.
They're only capable of doing what a human tells them to do.
"Machine learning" is overrated.

If you want a proper thinking machine you'd need to radically change the entire structure of the machines.

Exactly. There's too little known about human intelligence. How are humans capable of following a recursive or iterative chain and concluding whether or not it terminates, when a machine cannot?

A human understands the concept, the core idea, that a chunk of code is trying to achieve, and is thus able to tell whether it ever terminates. A machine can only act upon that code and guess whether it's an infinite loop or not.
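
That "guess" is about all a program can do in the general case. A small sketch of the idea (my illustration, using a Collatz-style loop): run the code for a bounded number of steps and either see it halt or give up.

[code]
# A machine can only simulate for a while and then shrug.
def halts_within(n, max_steps=10_000):
    """Run the 3n+1 loop starting at n; report 'halts' or 'don't know'."""
    for _ in range(max_steps):
        if n == 1:
            return "halts"
        n = n // 2 if n % 2 == 0 else 3 * n + 1
    return "don't know"   # may loop forever, may halt after more steps

print(halts_within(27))                 # "halts" (reaches 1 after 111 steps)
print(halts_within(27, max_steps=50))   # "don't know"
[/code]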

In order for a machine to ever reach a level of AI it would have to actually understand the world around it. Something that cannot be taught through predictive models or big data.

Computers have not become more intelligent; we have.

>what are fucking neurons

American education, everybody