Who's also becoming increasingly technophobic here...

Who's also becoming increasingly technophobic here? I'm getting more and more convinced that technological "progress" isn't always a good thing and that it's starting to cause more harm than good in our society

Luddites ought to be used as human batteries

I'm with you on that.

Techno-utopianists are fucking morons.

There's literally nothing to be afraid of, human.

Transhumanist billionaires and AI tech are giving me the creeps

Anytime someone posts about this on the Internet, I am submerged in a deep sense of disbelief.

Even just all this new VR crap makes me feel quite uneasy. When I saw that pic of Zuckerberg walking among all those people wearing VR headsets, I kind of shivered
I've actually been thinking for a long while about limiting my internet use to just the most essential stuff

What's so wrong/scary about technology, user?

>human batteries

Quite honestly, this is the dumbest fucking thing in the history of science fiction.

replicators will exist within cyberspace in 5 years' time. there are AI systems already that can, after a set observational period, replicate the online posting style of the observed subject and create matching content
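The "observe then replicate a posting style" idea described above can be sketched, very crudely, as a word-level Markov chain built from a user's posts. Everything here is a hypothetical toy, including the sample corpus; real style-imitation systems are far more sophisticated.

```python
import random
from collections import defaultdict

def build_chain(posts):
    """Map each word to the words observed to follow it across all posts."""
    chain = defaultdict(list)
    for post in posts:
        words = post.split()
        for a, b in zip(words, words[1:]):
            chain[a].append(b)
    return chain

def imitate(chain, start, length=10, seed=0):
    """Generate text that statistically resembles the observed posts."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

# Hypothetical "observation period": a handful of posts by one user.
posts = [
    "technology is ruining everything honestly",
    "technology is not progress",
    "progress is not always good",
]
chain = build_chain(posts)
print(imitate(chain, "technology"))
```

The output isn't the user's actual words, just word sequences with the same local statistics, which is the basic trick behind imitation.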

>that flag
what the fuck would you know about technology

VR porn though

science will be the death of man

it will come as a man-made, biologically engineered virus

This is why I dumb myself down

I want no part of it

It's starting to be seen as an "ultimate goal" instead of just a tool. This is also why people are starting to call arts and humanities useless and shit on everyone who isn't in some boring STEM field

I'm not that user. Brazil is a tech consumer. People here live in a shithole, but have the best cellphone they can get, gaymer computers, drones and shit. And it's really expensive, of course, but Brazilians are terrible with finances

>biological engineered virus
It's not the '90s anymore.

>my current year

fuck off kaccapie

go worship your pinko Lenin in his glass tomb on Red Square, you fucking subhuman retard

science, if we consider just its original Latin meaning (i.e. pure knowledge), is always a good thing in my opinion; it's its possible practical use (technology) that could well be humanity's epitaph

well I'm not much into philosophy and sheeit, but people much smarter than me have serious problems with AI and make a good case against it.

even if you just consider the cost of "dumb" AI and automation in the workplace (without basic income/more gibmedats), it's not good.

this is why mankind was doomed from the start

by playing god we opened Pandora's box long ago

man is too corrupt to create anything capable of preserving life

death and killing are all we are good at

I'm happy with AI. I gave up after 4 years studying architecture & urbanism, but I know how to use the software and do basic programming, and soon, in less than 15 years (probably longer in Brazil), most of what architects do now will be replaced. I still like the construction field. So I will be back, without college (I hate the academic atmosphere), doing freelance jobs, and soon I'll have my own company with a 3D printer if everything goes as I predict.

I've found it's usually just moronic low-class right-wingers who call for the end of the humanities and arts. In fact, it's usually wealthy STEM people who are funding creative causes.

any super intelligent and powerful being will surely just kill everything, including itself, and put this shitshow to rest

good riddance

Society is becoming more and more Orwellian. The only (((advancements))) being made nowadays either enable the government to spy on you or corporations to sell you more shit.

The only good future for people is to be governed by a group of hyper-intelligent AIs

What does "progress" mean in the first place? Just that the oldest professions acquire an electronic interface, that’s all. Progress doesn’t alter the nature of the fundamental processes.

> 2000 rubles/hour
man that is super cheap
what is wrong with those girls?
also how/where can i get one?

I've never heard of a country or a place replacing the humanities and arts with science. But that's really stupid if they are doing it; the two have different purposes in society, and one can't replace the other.

But there is something else: at least in my country we've experienced, multiple times, the removal (or attempted removal) of the humanities from most public schools, to make the population ignorant by taking away history and philosophy, since the most common and successful type of government in Brazil is the "Bread and Circuses" type.

So it's important to see whether it's technology replacing the humanities or the government trying to control the population.

I'm not exactly sure what "problems" you meant that AI and the humanities have with each other, but I will assume it's about human rights for robots.

Right now, there isn't a single robot intelligent enough to be considered similar to a human. Most of them can solve problems way faster than humans can, but humans are more complex than just problem-solvers: machines can't find problems they weren't programmed to find, nor are they capable of identifying themselves as individuals. Machines are far away from this.

Also, robots will get cheaper with time, and sometimes it's cheaper to buy a robot that can work non-stop than to hire an employee.

>talking shit about your blood brother
You're not a true Pole.

Not a stupid phobia by any means desu; we're probably going to see more technological progress in our lifetimes than in the last 1000 years combined. It's honestly fucking scary how AI/robotics and biological engineering will change the world.

David Icke says it's all a plot to control our minds by shapeshifting, frequency-vibrating Archon Reptile things
Captcha: Stop. Coincidence?

No, I'm more talking "Terminator" than "I, Robot", but that's big picture AI stuff. Not as far off as you think either imo.

Specialised dumb AI will take skilled jobs within the next 20 years; I'm talking white collar (like your architect example), not just drivers. It already has in supermarkets etc., not even touching on the admin jobs that have been replaced by software. This automation has a serious human/social cost under our current financial system.

Tech has the potential to enslave us or set us free, but AI is its own thing with its own ideas.

Real AI will never be "its own thing with its own ideas".

All code is written by man, and so at best we will get imitations of intelligence modeled by its creators and bearing the intrinsic flaws of its creators. There will always be more than one flavor of AI, and they will use different methods of achieving "intelligence", and it is too subjective of an idea to ever say without a doubt "we have replicated human intelligence".

>Real AI will never be "its own thing with its own ideas".
Surely by definition that's exactly what it is?
True AI, that is.

Pls stop you are giving me a frighten

Just imagine the movie said "wetware computers" instead, that was the original plot before the studios thought normies wouldn't understand. It also explains why people can change the matrix around them, it's generated in their collective brains.

Sorry, bad word choice.

The AI that humans realize in the distant future, practical or otherwise, will not be free of inherited human flaws, and those flaws prevent it from being intelligence independent of human influence, which is what "true AI" would be.

It's like the concept of "now". You may think you can blink both eyes at the same time, but as you increase the precision of the measurement, the chance that they close at EXACTLY the same moment in time is zero, even if you tried it over and over for a millennium.

We will get close to approximating intelligence, no doubt. But it will not be a discovery; it will be like a new form of art, constantly changing to meet the subjective interpretations of humans' understanding of something.

>This automation has a serious human/social cost under our current financial system.

Agreed. I'm not exactly sure how it will end if everything gets automated, whether it will become a utopia or a dystopia, but I'm pretty sure the process towards the final product is going to be a dystopia: most people will be replaced by machines that work for free and do it better, but if people don't work, they don't get any wages.
However, this is also a problem for people trying to sell products, because for someone to buy their products, they need money. The moment no one has to work anymore, money will (probably) become obsolete, because it's not used anymore: employers don't need money because all their workers are machines, and employees don't have any money to spend. The most popular idea is that in the end it will be a utopia where everyone can get everything for free, but there is also the dystopian idea that automation will deplete all the resources on the planet. However, the depletion of resources is most likely related to humans themselves and how we spend them, not automation. Humanity was never a "conscious consumer".

>No, I'm more talking "Terminator" than "I, Robot", but that's big picture AI stuff.

Well, as far as I know, the most advanced military AI belongs to a UK independent combat drone known as "Taranis": all you need to do is tell the drone the target and it will do the job, but it can't choose targets on its own. Also, I don't know how efficient this drone actually is. Pic related.

Do you have any idea what kind of job will stay relevant in the future? I'm looking for a course right now and everything I check out has almost no offers and tons of demand

Computer programmer or farmer.
That's literally it.

You'd still need at least mechanics and engineers, though: people to maintain equipment and fix bugs and glitches.

The last jobs to be taken by machines will be the ones that make em. Automation Engineering is one of them.

But the number of people working is much smaller compared to before the machines, and the machines can have other machines fixing them.

AI will write itself. Once it surpasses us in whatever its competencies are, it's very hard to control. The reason I don't think it will take as long as you think for true AI is basically Moore's Law.
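For what it's worth, the Moore's Law appeal above is just exponential doubling. A toy calculation, assuming an 18-month doubling period (one common reading of the law, not something the post specifies), shows how fast it compounds:

```python
def moores_law_factor(years, doubling_months=18):
    """Growth factor after `years`, assuming capacity doubles every `doubling_months`."""
    return 2 ** (years * 12 / doubling_months)

# After 20 years at an 18-month doubling period, capacity grows by
# roughly four orders of magnitude (about 2**13.3).
print(round(moores_law_factor(20)))
```

Whether compute growth translates into AI capability at anything like that rate is, of course, exactly what's being argued about in this thread.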

Things like that are really straightforward though. I'm not really talking about the ethics of dilemmas like that drone choosing its own target, or how a self-driving car behaves in a crash, but literally the implications of creating something we cannot control, in some way we potentially cannot foresee. I don't know man, I'm a complete layman with this stuff, but it does open all sorts of spiritual/philosophical questions.

Less so as IoT tech takes hold. Diagnosis can be done remotely, if we're talking about industrial sensors etc.

I wouldn't worry too much about trying to predict the future, user; just learn an industry that's not literally driving or something.

Doing something like that is really hard for two reasons.
The first: why would we even make an AI that we cannot control? For what purpose? The second is simply that it's really hard to do. However, if a machine becomes "self-aware", I'm pretty sure there would be problems with religion and philosophy mostly, because of their beliefs, the main one being "You're playing god", since there's the idea that making an AI we cannot control is technically making life. I'm not sure how philosophy would react.

If AI writes itself using instructions we gave it, then it really isn't writing itself.

The most sophisticated algorithms learn by permutations, but humans dictate what parameters are important, what success means, and the sets from which they will learn.

Quite the contrary for me.
The more I learn about code, the more excited I become. Computers are life. Machines are life. A default logical framework combined with the continuous pursuit of information is everything.

>YWN live in The Culture.

Just end it for me, Subetai.