Republished with permission by Knowhere News
Facebook has built a physical “war room” in its California headquarters in an attempt to avoid repeating some of the mistakes made during the 2016 US presidential election.
Facebook on Wednesday invited reporters on a tour of the large conference room where dozens of employees monitor events from the social network around the clock, briefing visitors on its latest efforts to uncover nefarious uses of the platform during election season. All told, representatives from around 20 teams sit in the war room, representing the roughly 20,000 employees who work on safety and security.
“We know when it comes to an election, every moment counts,” said Samidh Chakrabarti, the firm’s head of civic engagement, who is overseeing the war room. “So if there are late-breaking issues we see on the platform, we need to be able to detect and respond to them in real time, as quickly as possible.”
Chakrabarti said war room workers track data such as the amount of foreign political content and users’ reports of voter suppression. They also monitor other social media sites, such as Twitter and Reddit, while coordinating with other Facebook employees.
The human and computerized efforts to fight bad information are complementary, according to Chakrabarti: “If an anomaly is detected in an automated way, then a data scientist will investigate, will see if there is really a problem.”
During the 2016 US presidential election, Russian government agents and for-profit fake news outlets filled the social network with divisive propaganda around hot-button issues. Now Facebook is hoping to avoid a repeat in the coming US midterms as well as in elections around the world.
Nathaniel Gleicher, head of cybersecurity at Facebook, says the company’s goal is to ensure that the election is fair and that “debate around the election be authentic … The biggest concern is any type of effort to manipulate that.”
Ahead of the Brazilian presidential election this month, the company identified efforts to suppress voter turnout but was able to stop them quickly, thanks in part to having so many teams in a single room.
“Content that was telling people that, due to protests, the election would be delayed a day,” says Chakrabarti. “This was not true, completely false. So we were able to detect that using AI and machine learning. The war room was alerted to it.”
Chakrabarti said Facebook removed those posts within a few hours, before they went viral.