INSANE: Google Removes Gender Labels From Image Recognition Technology to Stop ‘Unfair Bias’
Google’s Cloud Vision API, an artificial intelligence tool that helps people analyze images, will no longer include gender labels in its output, the company has announced.
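For context, here is a minimal sketch of how a developer might request labels from the Cloud Vision API using the official google-cloud-vision Python client; the file path and credential setup are placeholders for illustration, and the actual labels returned depend on the image and on Google's current label set.

```python
# Minimal sketch: requesting image labels from the Cloud Vision API with the
# google-cloud-vision Python client. The file path is a placeholder.
from google.cloud import vision


def detect_labels(path: str) -> list[str]:
    """Return the label descriptions Cloud Vision assigns to a local image."""
    # The client reads credentials from GOOGLE_APPLICATION_CREDENTIALS.
    client = vision.ImageAnnotatorClient()

    with open(path, "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)
    if response.error.message:
        raise RuntimeError(response.error.message)

    return [label.description for label in response.label_annotations]


if __name__ == "__main__":
    # Under the announced change, person-related labels are expected to be
    # gender-neutral (e.g. "Person") rather than gender-specific.
    print(detect_labels("photo.jpg"))
```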
The AI tool will be made less effective, the critics argue, due to politically correct concerns regarding the LGBT community.
“Given that a person’s gender cannot be inferred by appearance, we have decided to remove these labels in order to align with the Artificial Intelligence Principles at Google, specifically Principle #2: Avoid creating or reinforcing unfair bias,” the email making the announcement stated.
Google’s AI principles regarding bias read as follows: “AI algorithms and datasets can reflect, reinforce, or reduce unfair biases. We recognize that distinguishing fair from unfair biases is not always simple, and differs across cultures and societies. We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief.”
Google is receiving praise in Silicon Valley for its decision to conform to LGBT dogma.
“Anytime you automatically classify people, whether that’s their gender, or their sexual orientation, you need to decide on which categories you use in the first place — and this comes with lots of assumptions,” Frederike Kaltheuner, a Mozilla tech policy fellow, told Business Insider.
“Classifying people as male or female assumes that gender is binary. Anyone who doesn’t fit it will automatically be misclassified and misgendered. So this is about more than just bias — a person’s gender cannot be inferred by appearance. Any AI system that tried to do that will inevitably misgender people,” she added.
Before the change was implemented, developers were invited to discuss the proposal on the tech giant’s forums. Only one individual voiced the obvious concern: that Google is purposefully limiting its own technology and making it less effective to serve a politically correct agenda.
“I don’t think political correctness has room in APIs,” the developer wrote. “If I can 99% of the times identify if someone is a man or woman, then so can the algorithm. You don’t want to do it? Companies will go to other services.”
With Google approaching monopoly status and growing more all-powerful by the day, there may not be any room for competition in the marketplace before long. The Orwellian nightmare will be super inclusive if Google has its way.