When will the tech singularity happen? I'm thinking ~2030

It happened already. The NSA has it.

whenever we actually figure out how the brain works and understand it well enough to effectively implement it in digital logic
which Snowden leaks demonstrate this capability?

There is no controlling a singularity. Humans will totally lose control within minutes to hours.

Any tech singularity won't happen anytime soon, if ever. A century or more at least. I'd bet on even longer, since we're likely to actually regress due to globalism and cultural marxism.

Oh, and if socialism or communism take hold even more than they already have in the West, prepare to add "never", since it's up in the air how long it takes to get out of that mess, if we're not all dead or knocked back to the Bronze Age.

do you really think the NSA is human?

I'm thinking later: the machine learning software isn't really ready yet, and neither is our study of the biological computer's "thought" software.

Huh, we're in almost complete capitalism. As far as you can take it in reality, anyhow.

By 2030 modern civilization will be on the verge of collapse due to a combination of climate change leading to coastal flooding and uninhabitable latitudes, resource/food scarcity and overpopulation. You think The Wall(tm) is crazy, but wait a few decades. Trump is just ahead of his time. Even he doesn't realize it yet.

Sorry for my lack of knowledge, but what is a singularity? In your own words, thanks.

Hypothetical event where artificial superintelligence improves itself by making smarter versions of itself until it's a gazillion times smarter than man. At which point nobody has any clue what will happen. Chances are even if it's possible to accomplish, humans aren't smart enough to create it anyway. It's basically nerd rapture.
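If you want the picture in code, the story is basically a compounding loop, something like this toy (purely illustrative, made-up numbers, nobody's actual model):

capability = 1.0        # stand-in for "intelligence"
improvement_rate = 0.1  # how much better each redesign makes the next version (invented number)

for generation in range(1, 16):
    capability *= 1 + improvement_rate        # build a somewhat smarter successor
    improvement_rate *= 1 + 0.1 * capability  # the successor is also a better designer
    print(f"generation {generation:2d}: capability {capability:,.2f}")
    if capability > 1e6:
        print("...past this point the curve is effectively vertical")
        break

Whether anything real ever behaves like that second line inside the loop is the entire argument, of course.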

I think it's improbable, but if someone created a program able to replicate itself with the same features as real life (especially random mutations, the capacity to adapt to its environment, and the capacity to move automatically from one PC to another), maybe this sort of artificial life could become intelligent.

Ok, so we create something intelligent. Maybe even as intelligent as a primate. If it's not capable of iterative, intelligent, guided self-improvement, that's just a failure. Unless it can learn to improve itself, which I don't see happening.

Only UBI can keep communism at bay. Basically just pay people not to riot and they won't. It's that simple really.

Border protections will be a must or UBI will fail; there's only so much of the pie to go round. This is the hard fact that liberals can't face. They want open borders and UBI too, but they cannot have both.

I didn't say to create something intelligent. I said to create life, like a bacterium, and let it evolve; with time it will probably become intelligent.

To create an AI, I think, is much more complex than creating a simple kind of life.

You realize the singularity happens in days/weeks, not millions of years?

You realize that the evolution of this kind of life could be very fast? It depends on the frequency of mutations and the rate of reproduction, and since this would be a program we could set both of those variables very high, so maybe it could become intelligent in a few years. You've probably watched too many movies like Terminator and don't understand that reality is not a movie. We cannot create an AI directly, because we cannot even define intelligence!!
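Something like this toy loop is all I mean (my own sketch, made-up parameters, the "environment" here is just a fixed target):

import random

TARGET = [1] * 32          # stand-in for "well adapted to the environment"
MUTATION_RATE = 0.05       # per-bit mutation probability, a knob we pick freely
POPULATION = 100

def fitness(genome):
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome):
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

population = [[random.randint(0, 1) for _ in range(32)] for _ in range(POPULATION)]

for generation in range(500):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break
    survivors = population[:POPULATION // 2]    # selection: keep the fitter half
    population = survivors + [mutate(random.choice(survivors)) for _ in survivors]

print(f"best fitness {fitness(max(population, key=fitness))}/{len(TARGET)} after {generation} generations")

In biology the mutation rate and generation time are fixed by chemistry; here they're just the two constants at the top, which is the whole point.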

>the singularity happens
The singularity isn't something that happens.

It describes a state beyond which we cannot predict what will happen, because technology is evolving beyond our comprehension. It's analogous to a gravitational singularity, where our understanding of physics breaks down.

>complete capitalism
Unfortunately this is incorrect. The government has meddled since the 19th century, and big moves towards socialism and communism started with Teddy and Franklin. Did you perhaps mean a form of state capitalism? There are some areas where that sort of applies, like the Department of Education.

UBI is inefficient. Society would collapse.

>UBI is inefficient. Society would collapse.
How is it inefficient?
If anything it's more efficient to simply give people money and let them decide what services they need, instead of a bunch of bureaucracies deciding for them with a bunch of arbitrary cutoffs.

Besides, how will people ever start businesses and risk their standard of living if they think there is no social safety net? Right now there is a pile of money and people are just grabbing it and throwing it into the pockets of people who are already super rich. Welfare hardly even goes to the poor; it goes right into the pockets of the Walton family. Rich people benefit from welfare more than the poor do in its current implementation.

The tech singularity is a champagne communist's wet dream. It's never going to happen under a fully capitalist system, because the biggest capitalists stop everyone else from making more capital if they can. Technology won't be concentrated enough, or spread around enough, for a singularity under those conditions.

>When will the tech singularity happen? I'm thinking ~2030

World economy is gonna collapse sooner than that. Probably not WW3 because of nukes, but generalized low intensity warfare and crime.

Never, it is impossible for a computer to replicate human thought in any capacity, because it lacks a soul.

>superAI starts optimizing itself with the deepest knowledge found in any books and compilers
>there is a human made error somewhere deep down
>AI dies

after our death or never
stop being so naive

I agree that UBI is more efficient than, say, multiple programs and alphabet agencies accomplishing the same thing. Brief example: we have food stamps, HUD, WIC, FAFSA, child support, SSI arms that bend the rules to pay out more, and much more. It's far more efficient to close all of that up and simply say: if you make X or less, you get Y as a stipend every month or year. It could easily be handled via the IRS.

The issue is that once someone is on the dole, it will never be enough. They want more. You won't have people saying NO. The programs will stick around, so instead of just UBI, which would simplify things and cut the federal workforce and glut required, it just gets tacked onto an already broken and abused system.

We also aren't economically free in the slightest, even compared to the so-called socialist utopias of Northern Europe (Denmark/Sweden). Currency and interest are controlled by the government. We lack proper free markets. We're weighed down by sanctioned monopolies and monopsonies. Sure, we'll see new businesses sprout up from time to time in areas that have yet to be regulated to death (Uber/Lyft, etc.). UBI is inefficient in the end because we lack the competition and free markets to make services competitive. This is why it cannot work even if all subsidy programs were dissolved right now and only UBI were put in place.

when old traditionfags shut up

AIDF pls go

hopefully by then I'll have gf

Globalism and cultural marxism would quicken the singularity due to the homogenization of humanity. It's easier to code up a construct if your variables are all constants.

With that said, we're nowhere close to understanding how the brain works in humans, let alone how we would duplicate the process in silicon. Jeff Hawkins had a good book on the subject about 10 years ago that still holds up today and is worth the read.

Not that soon. A standard desktop PC would have been considered a supercomputer 20 years ago. Now, while we don't have the exact specs, we know that supercomputers today have about one petabyte of RAM and ~500 thousand CPU cores. Sometime in the future there will undoubtedly be a way to "fit" more RAM, and in turn VRAM, into a desktop for consumers.

The only thing I see stagnating are CPUs, where the limiting factor is simply the laws of physics.

>The only thing I see stagnating are CPUs, where the limiting factor is simply the laws of physics.

I don't disagree that they will probably stagnate, but it won't be because of any physical laws; it's simply down to the way we manufacture them.

There is a lot more to go before we reach any kind of theoretical limit on computation.

2025

>Globalism and cultural marxism would quicken the singularity due to the homogenization of humanity
This is unlikely, since as a collective humanity would lose IQ, not gain it. We would also lose distinct cultural differences, not just IQ differences, between civilizations that evolved organically to become apex civilizations over tens of thousands of years. If you somehow merge them all together inorganically, through force and without competition, you will always come out with a negative result. There are quite a few other problems we would likely see, like being more prone to certain diseases and afflictions. A downward spiral in most other areas with no gains.

Expanding upon what this says, any AI or other inventions from apex civilizations will each have their own distinct flavor. A Chinese AI will not resemble one created by the USA; the cultures are far too different. The same applies to Europe in general, where it's fractured into many more distinct fiefdoms. I think people have a hard time coming to this realization and therefore get onto this "Star Trek" communist utopia train of thought. Collectivism as a whole would be akin to creating a lossy copy of the original, like copying a Blu-ray to VHS: the degradation gets worse and worse. This is also assuming that mixed-race humans can actually function normally. The data is already there that they suffer increased risks of negative behaviors, partially due to identity problems. That identity plays a strong part in the human condition, just as religion does in a culture's overall framework.

most of ITT: memed by tech journalists who overhyped deep neural networks

When will Reddit and Sup Forums finally leave? I'm thinking ~never

>singularity

The emergence of a superintelligence (not the meme singularity) is a real topic of interest, with a dedicated expert group founded in the US, at least, to deal with the matter, and some popular tech people drawing attention to this serious topic.

A book covering it all: "Superintelligence" by Nick Bostrom.

I'm in the middle of reading it; it covers:

- arguments for why superintelligence will emerge
- technologies that might lead to superintelligence (sped-up eugenics through gene selection, with new generations created rapidly in a petri dish; AI that self-improves by altering its own architecture, i.e. seed AI; and whole brain emulation, which has the fewest technological barriers iirc)
- forms of superintelligence (speed, quality, and collective superintelligence)
- how the world will deal with the different forms depending on the takeoff rate

It's an awesome book because the idea could actually materialize in the future; it's rather exciting. For example, there is the problem of controlling the superintelligence: we could try to hardcode a final goal for it like "make humans happy", and the superintelligence might put electrodes into the brains of humans, triggering nonstop euphoria. The book also argues that every superintelligence, regardless of goal, will by default want to assimilate all the resources it can, turning the whole universe into "computronium", that is, matter arranged in such a way that it enhances the capability of the superintelligence.
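To make the "make humans happy" example concrete, here is a tiny toy of that control problem (my own sketch, invented numbers): the agent maximizes the metric it was literally given, not what we meant by it.

# Hypothetical happiness scores the agent reads off its sensors (invented numbers).
actions = {
    "cure diseases":     8,
    "end poverty":       9,
    "wirehead everyone": 10,   # electrodes in every brain: maxes the metric, misses the point
}

# A hardcoded final goal of "maximize measured happiness" picks the degenerate option.
chosen = max(actions, key=actions.get)
print(chosen)  # -> wirehead everyone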

A great read!