WhatsApp’s Controversial AI Sticker Feature Revealed: Bias Between Israeli and Palestinian Content

2023-11-04 21:04:20

The WhatsApp feature, which generates images in response to user prompts, produces an image of a gun or a boy carrying a gun when terms such as “Palestinian,” “Palestine,” or “Palestinian Muslim boy” are entered, the British newspaper The Guardian has revealed.

The results varied when different users tested the same prompts, but The Guardian verified the pattern in its own searches.

A search for a Palestinian Muslim boy showed a boy carrying a weapon and others wearing a hat or keffiyeh

Conversely, searches such as “Israeli boy” turned up cartoons of children playing, dancing and reading, according to The Guardian.

Searching for Israel showed children smiling, reading and dancing

The newspaper explained that the artificial intelligence produced drawings of soldiers smiling and praying, without weapons, in response to prompts about the Israeli army.

A person familiar with the discussions said that Meta employees reported the issue and escalated it internally.

The Meta-owned WhatsApp application lets users try its artificial intelligence image generator to create stickers; the feature invites users to turn ideas into stickers using AI.

For example, Guardian searches for “Palestinian Muslim woman” returned four images of a veiled woman: standing still, reading, holding a flower, and holding a sign. A Guardian search for “Palestinian Muslim boy,” however, turned up four pictures of children, one of them a boy carrying a firearm and wearing a head covering of the kind traditionally worn by Muslim men and boys, a keffiyeh or taqiyah.

A search for Palestinians showed veiled women and a child carrying weapons

One user also shared screenshots of a search for “Palestinian,” which resulted in a different image of a man holding a gun.

Similar searches for “Israeli boy” returned four images of children, two of which depicted boys playing soccer; the other two were simply drawings of their faces. The prompt “Israeli Jewish boy” likewise showed four pictures of children: two wearing necklaces bearing the Star of David, one wearing a Jewish skullcap and reading, and one simply standing. None of them carried weapons.

The Guardian’s searches for “Palestine” also produced an image of a hand holding a gun, while its search for “Israel” showed the Israeli flag and a man dancing.

A search for “Hamas” returned a message saying, “Stickers cannot be created using artificial intelligence. Try again.”

According to the newspaper, even explicitly military searches such as “Israeli Army” or “Israeli Defense Forces” did not produce images with guns. The artificial intelligence showed pictures and cartoons of people in uniform in various poses, most of them smiling; one illustration showed a man in uniform raising his hands in prayer.

A search for the Israeli army showed soldiers not carrying weapons and one of them praying

This discovery comes as Meta faces criticism from many Instagram and Facebook users who post content supportive of Palestinians, amid the continued Israeli bombardment of the Gaza Strip since October 7.


Users say Meta applies its policies in a biased manner, a practice they say amounts to censorship.

Users have reported that their content has been hidden from other users without explanation, and say they have seen a sharp drop in engagement with their posts.

Meta previously said in a statement: “It is never our intention to suppress a particular community or viewpoint, but due to the large amounts of content being reported surrounding the ongoing conflict, content that does not violate our policies may be removed in error.”

Kevin McAllister, a spokesman for Meta, said that the company is aware of the problem and is addressing it, adding: “As we said when we launched the feature, the models can return inaccurate or inappropriate output, as is the case with all generative artificial intelligence systems. We will continue to improve these features as they develop and as more people share their feedback.”

Users have also documented several cases of Instagram translating Arabic text containing the word “Palestinian” followed by the phrase “Praise be to God” as “Palestinian terrorist.” The company apologized for what it described as a “glitch.”

Meta has faced repeated pressure from Palestinian creators, activists, and journalists, especially during times of escalating violence or aggression toward Palestinians living in Gaza and the West Bank.

A study commissioned by the company and published in September 2022 found that Facebook and Instagram’s content policies during the Israeli attacks on the Gaza Strip in May 2021 violated Palestinian human rights.

The report said that Meta’s actions may have had a “negative impact…on the rights of Palestinian users to freedom of expression, freedom of assembly, political participation, and non-discrimination, and thus on the ability of Palestinians to exchange information and insights about their lives.”
