
Bad Google AI!

When the rules of AI models aren't designed well...

· AI's Unintended Consequences

AI tools are not just beginning to outperform humans; they're also picking up some of our worst biases. Google's Gemini recently proved unable to produce historically accurate images when its DEI rules misfired, generating a hodgepodge of WWII German soldiers in Nazi uniforms depicted as Black, Asian, and Hispanic in the name of diversity. Now it has also been found to covertly discriminate against speakers of African American Vernacular English. It's like the tech world's version of a bad sitcom.

