Facebook Denies Targeting Emotionally Insecure And Vulnerable Kids, But Admits To The Research

A leaked document made some startling allegations, indicating that Facebook was targeting emotionally insecure and vulnerable users. The company denies such claims.

The leaked document reveals that Facebook conducted research on targeting vulnerable users, though the research itself was never published. Even so, the paper sparked plenty of controversy when it leaked last week, and the media was abuzz with reports that Facebook targeted users based on their emotions to push ads.

The paper reportedly detailed how teenage Facebook users frequently post about weight loss, self-image, and other issues that reveal their insecurities. More specifically, Facebook reportedly used complex algorithms to assess the moods of vulnerable users as young as 14 so that ads could be targeted accordingly.

Facebook Denies Targeting Emotionally Vulnerable Users

For those unfamiliar with the matter, The Australian said it saw a 23-page Facebook document marked "Confidential: Internal Only," which detailed how the company could target ads based on insecure users' moods.

The report raised great concern and controversy regarding Facebook's purported invasion of privacy and the ethics of such targeting, sparking serious questions about the company's practices.

Facebook now refutes The Australian's report, at least partly.

Facebook admits that it conducted such research and shared it with advertisers, as The Australian reported, but argues that "the premise of the article is misleading."

"Facebook does not offer tools to target people based on their emotional state," says the company.

Facebook further notes that an Australian researcher conducted the analysis, but the goal was to help marketers get a better idea of how users express themselves on the social network.

"It was never used to target ads and was based on data that was anonymous and aggregated," the company adds.

However, Facebook does acknowledge that it has an established process for reviewing the research it conducts and that this paper did not go through that process. Consequently, the company says it is reviewing the details to correct the oversight.

Monitoring Users Who Feel Stressed, Overwhelmed, Stupid

The Australian notes that Facebook shared the report with marketers working for a number of major banks in Australia. Facebook executives Andy Sinn and David Fernandez reportedly wrote the paper, The Australian further claims.

The document revealed that Facebook could monitor photos and posts from users who might be "anxious," "stressed," "nervous," "defeated," "stupid," "overwhelmed," "useless," "silly," or a "failure."

The research covered only Facebook users in New Zealand and Australia. Facebook typically has strict guidelines in place that require researchers to consider any potential adverse effects on users and whether conducting such research is reasonable in the first place. Since Facebook now acknowledges that this research appears to have gone against some of those policies, it remains to be seen what action the company will take.

This is not the first time Facebook has come under fire for manipulating users' news feeds in the name of research. Back in 2014, news broke that Facebook had intentionally manipulated the news feeds of nearly 700,000 users, showing them certain types of content to see whether it could influence their emotions.
