Facebook Whistleblower Data Scientist Frances Haugen on India Content

Despite knowing that "RSS users, groups and pages incite fear and anti-Muslim sentiment", social media company Facebook could not take action on or flag this content "due to lack of Hindi and Bengali classifiers", according to a whistleblower complaint filed with the U.S. securities regulator.

The allegation that Facebook's language capabilities are "inadequate" and lead to "global misinformation and racial violence" is among several complaints filed by whistleblower Frances Haugen, a former Facebook employee, with the Securities and Exchange Commission (SEC) against the company's practices.

Citing an internal Facebook document titled "Adversarial Harmful Networks: India Case Study", the complaint to the US SEC, filed by the non-profit legal organisation Whistleblower Aid, noted dehumanising posts about Muslims on the platform and quoted the document as saying that the "lack of Hindi and Bengali classifiers" meant most of this content was never flagged or acted upon, and that Facebook had yet to put forward a recommendation to designate the group (RSS), given political sensitivities.

Classifiers refer to Facebook's hate-speech detection algorithms. According to Facebook, it added hate-speech classifiers in Hindi from the beginning of 2020 and introduced Bengali classifiers later that year. Classifiers for violence and incitement in Hindi and Bengali first came online in early 2021.

Eight documents containing several of Haugen's complaints were published by the American news network CBS News. Haugen revealed her identity for the first time on Monday in an interview with the network.

In response to a detailed questionnaire sent by The Indian Express, a Facebook spokesperson said: "We prohibit hate speech and content that incites violence. Over the years, we have made significant investments in technology to detect hate speech, even before people report it to us. We use this technology to detect violating content in Hindi and Bengali, along with more than 40 languages worldwide."

From May 15, 2021 to August 31, 2021, the company said, it had "proactively removed" 8.77 lakh pieces of hate speech content in India, and had more than tripled the number of people working on safety and security issues, including more than 15,000 dedicated content reviewers. "As a result, we have reduced the prevalence of hate speech globally (that is, the amount of such content people actually see) on Facebook by almost 50 per cent in the last three quarters. In addition, we have a team of content reviewers covering 20 Indian languages. We continue to make progress in enforcement and are committed to updating our policies as hate speech evolves online," the spokesperson added.

Not only was Facebook aware of the nature of the content being posted on its platform; through another study, it had also assessed the impact of posts shared by politicians. The internal document, titled "The Consequences of Politician Shared Misinformation", included examples of "high-risk misinformation" shared by politicians in India, among them an out-of-context video whose "social impact" was to stoke anti-Pakistan and anti-Muslim sentiment.

An India-specific example of how Facebook's own algorithms recommend content and "groups" to users comes from a survey the company conducted in West Bengal, in which top sampled content was found to be "fake/inauthentic". The user with the highest view port views (VPVs), or impressions, assessed to be inauthentic had accrued more than 30 million in L28. L28 is Facebook's term for the bucket of users active in a particular month.

Another complaint alleges that Facebook has no controls over "single user multiple accounts", or SUMAs (duplicate users), and cites internal documents outlining the use of SUMAs in global political discourse. The complaint cites an internal presentation as saying that a BJP official in India used SUMAs to promote pro-Hindi content.

Questions sent to RSS and BJP were not answered.

The complaints also red-flag how "deep reshares" can lead to misinformation and violence. Reshare depth is defined as the number of hops from the original Facebook post in a reshare chain.

India ranks at the top of Facebook's country policy priorities. For January-March 2020, India, along with Brazil and the United States, was placed among the "Tier 0" countries, the complaint states; "Tier 1" includes Germany, Indonesia, Iran, Israel and Italy.

The internal document, titled "Civic Summit Q1 2020", shows a global budget allocation skewed in favour of the United States for the goal of reducing and measuring misinformation on Facebook. It said 87 per cent of the budget for this work was allocated to the United States and the remaining 13 per cent to the rest of the world (India, France and Italy). "This is despite the fact that only 10 per cent of 'daily active users' are in the US and Canada…" the complaint added.

India is one of Facebook's largest markets, with over 410 million Facebook users, 530 million WhatsApp users and 210 million Instagram users.

On Tuesday, Haugen appeared before a U.S. Senate committee, where she testified about the lack of oversight at a company with "intimidating influence over many people".

In a Facebook post following the Senate hearing, CEO Mark Zuckerberg said: "The argument that we deliberately push content that makes people angry for profit is deeply illogical. I don't know any tech company that sets out to build products that make people angry or depressed. The moral, business and product incentives all point in the opposite direction."
