Jaw-dropping moments in WSJ's bombshell Facebook investigation

There’s a lot to unpack from the Journal’s investigation. But one thing that stands out is simply how bluntly Facebook’s problems are documented, in the kind of plain, observational prose not usually found in internal communications at multinational corporations.

Here are some of the more jaw-dropping moments from the Journal’s series.

The Journal’s report on Instagram’s impact on teens cites a slide deck from Facebook’s own researchers stating that the app harms mental health.

“We make body image issues worse for one in three teen girls,” said one slide from 2019, according to the WSJ.

Another reads: “Teens blame Instagram for increases in the rate of anxiety and depression … This reaction was unprompted and consistent across all groups.”

Those slides are particularly notable because Facebook has typically cited external research, rather than its own researchers’ findings, in arguing that there is little correlation between social media use and depression.

Karina Newton, Instagram’s head of public policy, addressed the WSJ story Tuesday, saying that while Instagram can be a place where users have “negative experiences,” the app also gives a voice to marginalized people and helps friends and family stay connected. Newton said Facebook’s internal research demonstrated the company’s commitment to “understanding complex and difficult issues young people may struggle with, and informs all the work we do to help those experiencing these issues.”

‘We are not actually doing what we say we do publicly’

Facebook CEO Mark Zuckerberg has repeatedly and publicly maintained that Facebook is a neutral platform that puts its billions of users on equal footing. But in another report, on the company’s “whitelisting” practice — a policy that allows politicians, celebrities and other public figures to flout the platform’s rules — the WSJ found a 2019 internal review that called Facebook out for misrepresenting itself in public.

“We are not actually doing what we say we do publicly,” the review said, according to the paper. “Unlike the rest of our community, these people” — those on the whitelist — “can violate our standards without any consequences.”

Facebook spokesman Andy Stone told the Journal that criticism of the practice was fair, but that it “was designed for an important reason: to create an additional step so we can accurately enforce policies on content that could require more understanding.”

‘Misinformation, toxicity and violent content’

In 2018, Zuckerberg said a change to Facebook’s algorithm was meant to improve interactions among friends and family and reduce the amount of professionally produced content in users’ feeds. But according to the documents reviewed by the Journal, staffers warned the change was having the opposite effect: Facebook was becoming an angrier place.

A team of data scientists put it bluntly: “Misinformation, toxicity and violent content are inordinately prevalent among reshares,” they said, according to the Journal’s report.

“Our approach has had unhealthy side effects on important slices of public content, such as politics and news,” the scientists wrote. “This is an increasing liability,” one of them wrote in a later memo cited by the WSJ.

The following year, the problem persisted. One Facebook data scientist, according to the WSJ, wrote in an internal memo in 2019: “While the FB platform offers people the opportunity to connect, share and engage, an unfortunate side effect is that harmful and misinformative content can go viral, often before we can catch it and mitigate its effects.”

Lars Backstrom, a Facebook vice president of engineering, told the Journal in an interview that “like any optimization, there’s going to be some ways that it gets exploited or taken advantage of … That’s why we have an integrity team that is trying to track those down and figure out how to mitigate them as efficiently as possible.”