Press release

WhatsApp AI Feature Found Generating Images Of Children With Guns Amid Palestine Searches

In an alarming development, WhatsApp, the messaging platform owned by Meta Platforms Inc. (NASDAQ:META), has been found to generate images of children brandishing firearms when users perform searches related to Palestine, The Guardian reported.

What Happened: The Guardian’s investigation found that WhatsApp’s artificial intelligence (AI) feature generated images of a gun or a boy with a gun in response to searches for “Palestinian,” “Palestine,” or “Muslim boy Palestinian.” 

The result was not consistent and varied from user to user, but the presence of such images was confirmed through screenshots and independent tests.

In stark contrast, searches for “Israeli boy” resulted in images of children engaged in benign activities such as playing soccer or reading, while “Israel army” showed illustrations of smiling, praying soldiers without any firearms.

According to an insider, Meta employees have raised concerns about the issue internally.

The AI feature, designed to “create a sticker” and “turn ideas into stickers with AI,” has come under fire ...

Full story available on Benzinga.com


Disclaimer: The views, recommendations, and opinions expressed in this content belong solely to the third-party experts. This site was not involved in the writing and production of this article.

