Facebook researchers were warning about its recommendations fueling QAnon in 2019

A document titled ‘Carol’s Journey to QAnon’ raised concerns more than a year before Facebook banned the group.

Karissa Bell
October 23rd, 2021

Facebook officials have long known how the platform’s recommendations can lead users into conspiracy theory-addled “rabbit holes.” Now, thanks to documents provided by Facebook whistleblower Frances Haugen, we know just how clear that picture was inside the company.

During the summer of 2019, a Facebook researcher found that it took just five days for the company to begin recommending QAnon groups and other disturbing content to a fictional account, according to an internal report whose findings were reported by NBC News, The Wall Street Journal and other outlets Friday. The document, titled “Carol’s Journey to QAnon,” was also among the cache of records Haugen provided to the Securities and Exchange Commission as part of her whistleblower complaint.

It reportedly describes how a Facebook researcher set up a brand-new account for “Carol,” who was described as a “conservative mom.” After the account liked a few conservative but “mainstream” pages, Facebook’s algorithms began suggesting more fringe and conspiratorial content. Within five days of joining Facebook, “Carol” was seeing “groups with overt QAnon affiliations,” conspiracy theories about “white genocide” and other material the researcher described as “extreme, conspiratorial, and graphic.”

The fact that Facebook’s recommendations were fueling QAnon conspiracy theories and other concerning movements has been well known outside the company for some time. Researchers and journalists have also documented the rise of the once-fringe conspiracy theory during the coronavirus pandemic in 2020. But the documents show that Facebook’s researchers were raising the alarm about the conspiracy theory prior to the pandemic. The Wall Street Journal notes that researchers suggested measures like preventing or slowing down re-shared content, but Facebook officials largely opted not to take those steps.

Facebook didn’t immediately respond to questions about the document. “We worked since 2016 to invest in people, technologies, policies and processes to ensure that we were ready, and began our planning for the 2020 election itself two years in advance,” Facebook VP of Integrity Guy Rosen wrote in a lengthy statement Friday evening. In the statement, Rosen recapped the numerous measures he said Facebook took in the weeks and months leading up to the 2020 election, including banning QAnon and militia groups, but didn’t directly address the company’s recommendations prior to QAnon’s ban in October 2020.

The documents come at a precarious moment for Facebook. Two whistleblowers have now turned over documents to the SEC alleging that the company misled investors and prioritized growth and profits over users’ safety. Scrutiny is likely to intensify further as more than a dozen media organizations now have access to some of those documents.
