Google Twitter Algorithms and Outrage

This is NOT a Sup Forums topic (though I'm completely aware that this could soon skate onto the thin grey area of slippery icy slopes, or something). This is about the first and last sentence in the selected quote from an article (by Phoebe Weston) concerning technology.

"Google has been slammed for 'fixing' its racist image recognition algorithm by simply removing the word 'gorilla' from its auto-tag tool...
The software outraged many users back in 2015 after it tagged images of a computer programmer and his friend as primates...
Now, nearly three years later, it has been revealed the company has 'fixed' the issue by blocking identification of gorillas, chimpanzees and monkeys...
Twitter users have criticised the company for not working to develop a diverse model for an algorithm and instead just banning identification of gorillas and black people."

>racist image recognition algorithm
>develop a diverse model for an algorithm

How can I tell if my written code has the proper diversity?
Can I send my racist algorithms to sensitivity training?
What do?

damn I feel bad for them, the entire internet can search up gorillas and their faces come up lmao

>How can I tell if my written code has the proper diversity?
>Can I send my racist algorithms to sensitivity training?
>What do?
It's 0's and 1's lad. It's in God's hands now

You penalize the undesirable misclassifications more heavily in the loss function. It's a type I / type II error tradeoff.
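
The tradeoff above can be sketched as an asymmetric loss. This is a toy illustration; the function name and weights are made up, not any particular library's API:

```python
import numpy as np

def weighted_log_loss(y_true, p_pred, fn_weight=5.0, fp_weight=1.0):
    """Binary cross-entropy with asymmetric penalties.

    fn_weight > fp_weight makes false negatives (missing a positive)
    cost more than false positives, shifting where the model draws
    its decision threshold.
    """
    p_pred = np.clip(p_pred, 1e-12, 1 - 1e-12)
    losses = -(fn_weight * y_true * np.log(p_pred)
               + fp_weight * (1 - y_true) * np.log(1 - p_pred))
    return losses.mean()

# The same-sized mistake hurts more on the heavily weighted error type:
miss_positive = weighted_log_loss(np.array([1.0]), np.array([0.1]))
false_alarm   = weighted_log_loss(np.array([0.0]), np.array([0.9]))
# miss_positive > false_alarm, because fn_weight=5.0 > fp_weight=1.0
```

Crank up the weight on the error type you find unacceptable and the trained model will make fewer of them, at the cost of more of the other kind.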

>The average penis is 9-10 inches long and around 9 inches around
What?

>Twitter users have criticised the company for not working to develop a diverse model for an algorithm and instead just banning identification of gorillas and black people."
Google probably tried to do this and realized that it's literally impossible because black people actually look like apes. It's literally 100% impossible to have an algorithm identify monkeys and apes and not also identify blacks.

It's true. Mine is just a little below average, but comparing with my friends that seems to be where they're most at.

Stats are for US

are u winterfags serious?? this is OLD news, and the algo was right tho

Reminder, Google changes its algorithms.

>t.dicklet

>Google has admitted its image labelling technology is nowhere near perfect but instead of fixing it the company has simply banned the term 'gorilla'.

Discovering how they've "fixed" the problem is new, yes? Has Google ever talked about "racist code" or "anti-diverse programming structures" etc. as a result of the criticism from three years ago? I remember there was a bit of a to-do about "master/slave" computer naming conventions.

>1st on 4th row

>Google's image classification is nothing more than using a library like TensorFlow and turning the machines loose to figure things out for themselves
>through countless trials and errors, the machines have decided that some black people look very similar to gorillas
>other ethnicities snicker in the corner because an objective, truly unbiased and impossible-to-be-racist source has discovered what humans knew all along but weren't allowed to say because of politics

What will happen when AI gets further involved, say in our government, and it declares black males to be the most violent, threatening group in society (based on crime stats like murders per capita)? Twitter will have a meltdown

>does this means google wants black people to be white?

> tfw plebbitors push for "muh unbiased AI overlord"
> After poring over all the data and weighing the probabilities, AI instantiates the fourth reich
:D

based google algorithm

Maybe we should ask, if an AI makes a verdict on anything that people don't like, can the code be called racist? Can the code be condemned for being un-diverse? What if an AI said, "White Men can't be discriminated against or suffer reverse-racism or reverse-gender-bias because they are the class with privilege." Is the code racist? Is the algorithm flawed?

Maybe it can be. Are the programmers then responsible? Maybe the leader of the project? Everyone involved? Should AI even be given any authoritative voice to weigh in on social issues?

Can an AI or its code be un-diverse? Where in there can we draw the line?

>does this means google wants black people to be white?
Ooh, so close.

soygoy redditor

>Google image search american scientist
>All black people
Oy vey

...

Separate issue. This is just confirmation bias

>release all videos
>95% of crime recorded is done by black males
>confirmation bias

So in this case, it's not even computer code that is racist, it's the raw data (surveillance footage). And even further, this raw data has the power to make people racist against their will.

Forget about which race, or which class, which gender, blah blah. It's gone way too far when people start claiming that raw data must be censored from the public view.

I don't have any solution in mind, except maybe public humiliation in the very social media where these people keep espousing these ideas, but that seems to keep fueling the fire. I dunno. Thanks for discussing.

(on a side note, i am getting sick and tired of clicking 11 cars, being told to try again, click 9 cars, being told "check new images", failing, trying again. Maybe Google should have pictures of Gorillas in their captcha to help solve their autotag recognition problem, eh?)

I actually tried this a few weeks ago thinking I was being memed... turns out that it's legit. They even blacked up Alexander Graham Bell's pic a bit if I recall correctly.

You don't know what this term means.

I think it was France (maybe Germany) that did something similar to this; they won't release crime stats by race because, well... "it'll make people racist".

Serious though, how do you explain this?

It's just a pure coincidence, user.

Sweden stopped recording ethnicity data because African and Middle Eastern people were hugely overrepresented in crime.

If the facts don't agree with you, hide the facts!

Basically you need to train your algorithms on more than pictures of white people.
That being said, in a high-contrast picture like that, and given gorillas' genetic relationship to humans, it's probably pretty damn hard to differentiate between gorillas and those black people using a standard machine learning algorithm, and the people "outraged" over this completely lack understanding of the technology at work here.
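
The "train on more than one kind of picture" point amounts to auditing your label distribution before training. A toy sketch, where the function name and the 5% threshold are invented for illustration:

```python
from collections import Counter

def audit_label_balance(labels, min_share=0.05):
    """Return the classes making up less than min_share of a training set.

    `labels` is any iterable of class names. The threshold is an
    arbitrary illustration, not a recognized standard; what counts as
    "too underrepresented" depends on the task.
    """
    counts = Counter(labels)
    total = sum(counts.values())
    return {cls: n / total for cls, n in counts.items()
            if n / total < min_share}

# Toy dataset where one class is badly underrepresented:
labels = ["a"] * 95 + ["b"] * 5
print(audit_label_balance(labels, min_share=0.10))  # {'b': 0.05}
```

Classes the audit flags are the ones the model will tend to misclassify, which is exactly the failure mode the article describes.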

It gets a little better if you put "white couple" in quotes so it looks for exact phrase matches.
Basically, if you just type white couple without quotes, it searches for the keywords "white" AND "couple", which also pops up images tagged "white and black couple" or some such.
Here's something you can do for comparison: Google image search just "couple" and you'll get pretty much all white results.
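
The quoted-vs-unquoted distinction above can be shown with a toy matcher. This is a deliberate simplification of AND-of-keywords versus exact-phrase matching, not how Google actually indexes or ranks anything:

```python
def keyword_match(doc, query):
    """Unquoted query: every query word must appear somewhere in doc."""
    words = doc.lower().split()
    return all(w in words for w in query.lower().split())

def phrase_match(doc, query):
    """Quoted query: the words must appear adjacent and in order."""
    return query.lower() in doc.lower()

tag = "white and black couple"
keyword_match(tag, "white couple")  # True  -- both words are present
phrase_match(tag, "white couple")   # False -- not a contiguous phrase
```

So an image tagged "white and black couple" satisfies the unquoted query but not the quoted one, which is why quoting changes the results.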

>Kek

UNDERRATED

The graph shows preference it is what women want.

What it really shows is that women have shit-tier spatial recognition. Anecdotally, my ex was convinced that I'm 8" and I can assure you I'm barely 6.5" down there. Half the time she complained that after a little bit (like 10-15 minutes) it was painful because of my size. Half these chicks thinking they want some pornstar-type dick would be in for quite a surprise, I think, should they end up up close and personal with it.

It might be so that chocolate people are not in general more criminal, but only those who flee to civilized countries. After all, if you're smart and have a set of useful skills you can get by almost everywhere. If you don't you go somewhere nice and leech off conditions there by crime.

Other possibility would be that immigrants are not more criminal but only caught more often due to being less intelligent

>It might be so that chocolate people are not in general more criminal
Why then do their own countries have higher rates of crime than other parts of the world?

WHAT DOES THIS MEAN!?

Probably because they are wrecked shitholes, which is at least in part fault of European colonialism.

The worse living conditions are the higher the crime rates.

There is no doubt that people from less civilized countries are more criminal; all I'm trying to say is that correlation isn't the same as causation. In other words, those countries are uneducated and crappy to live in for the most part, which is the cause of the high crime rates, not skin color in itself

This.

Use legacy captcha.

It's all the same here man, "white couple" and white couple, both of them gives the same nignogs as result.

>make a post on imgur & reddit
>spam the key word over and over in the title of both
>get lots of upvotes & views
>it gets onto the top google result every time

hmmmmmmmmmm

You're an idiot

>This is NOT a Sup Forums topic (though I'm completely aware that this could soon skate onto the thin grey area of slippery icy slopes, or something).

You're not fooling anyone. Go back to Sup Forums.

A few countries (Australia is one) have tried to use AI or some sort of blind process to select potential job candidates based on skill factor while eliminating gender bias.

95% of selected candidates were males so the procedure was scrapped.

It'll die in around two months, and after that it won't even be a choice anymore

Well, Is he the coolest gorilla in the jungle?

>how to tell
Proving that a model won't make a given class of error is as hard as developing a method for excluding that error.
Google isn't failing here for lack of trying. It's just that their algorithm was insufficient, and they can't 'lead' it in any way to check whether it's right.

Ultimately the only method of handling this is to refuse to predict on the cases where people are sensitive, until you're confident enough. Which is what Google did.
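
That "abstain until confident" policy can be sketched in a few lines. The label set and threshold here are hypothetical illustrations; per the article, Google's actual fix reportedly removed the sensitive labels outright rather than gating them on confidence:

```python
import numpy as np

# Hypothetical sensitive classes and confidence bar, for illustration only.
SENSITIVE = {"gorilla", "chimpanzee", "monkey"}

def tag_image(probs, labels, threshold=0.99):
    """Return the top predicted label, abstaining (None) on sensitive
    classes unless the model is extremely confident."""
    top = int(np.argmax(probs))
    label, p = labels[top], probs[top]
    if label in SENSITIVE and p < threshold:
        return None  # abstain rather than risk an offensive mistag
    return label

labels = ["cat", "dog", "gorilla"]
tag_image(np.array([0.10, 0.20, 0.70]), labels)  # None: sensitive, not confident
tag_image(np.array([0.05, 0.90, 0.05]), labels)  # 'dog': ordinary class
```

The cost of the policy is recall: genuinely correct "gorilla" tags below the bar are silently dropped, which is the tradeoff the thread is arguing about.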

I'd like to think that people are smart enough to understand that what an image recognition algorithm 'sees' is not reflective of human perception, which is what you really care about.
No human would ever actually mistake these people for gorillas (not even politically incorrect jokesters).

This is just based on popularity.
'White couple' is extraordinarily mundane. How often do articles describe Caucasian couples specifically as such? Meanwhile people still (somehow) talk about mixed-race relationships as if that weren't the hot topic of the 1950s.
You can look at the articles those images are linked to. I'm sure you'll find that most of them contrast white couples with other kinds of couples in some form, with the mixed couples pictured for some reason.
That may be entirely a political bias of the article authors, but you asked for images relevant to the phrase, not for the engine to run image recognition and spot Caucasians in a relationship.

Search for just "couple" and basically every result will be white.