Before the election, social media platforms including Facebook, Twitter, and YouTube vowed to clamp down on election misinformation, including premature declarations of victory by candidates and unsubstantiated fraud charges. They mostly delivered, with only a few hiccups.
On November 5, TikTok confirmed that it had taken down videos spreading election misinformation that were shared by two high-profile Republican-supporting accounts, The Republican Boys and The Republican Hype House. The accounts are popular among young, conservative voters, reaching over a million followers combined, and can reach more users still through shares, hashtags, and algorithmic recommendations.
Twitter flagged posts from President Donald Trump's account claiming that "surprise ballot dumps" were aiding Democratic presidential nominee Joe Biden, labeling them as content that was "disputed and might be misleading about an election or other civic process," Al Jazeera reported.
Social media companies have faced sharp criticism over how they guard against rapidly spreading false information and election-related abuses on their platforms.
Critics of the social platforms argue that their measures fell short of addressing the problems posed by the 2020 US presidential race.
"We're seeing exactly what we expected, which is not enough, especially in the case of Facebook," said Shannon McGregor, an assistant professor of journalism and media at the University of North Carolina, as reported by US News.
Meanwhile, the TikTok videos, which alleged "election fraud," were first spotted by Taylor Lorenz.
Although TikTok had committed to addressing election misinformation on its platform, it had not initially made clear how aggressively it would challenge video content.
In the weeks before Tuesday's vote, social media companies vowed action on posts by candidates attempting to declare early victory.
Twitter applied labels to posts by Republican and Democratic officials in the battleground states of Florida and Wisconsin, cautioning users that the information could be contested or inaccurate.
Facebook and YouTube, for their part, attached authoritative information to election posts.
According to McGregor, "Allowing any false claim to spread can lead more people to accept it once it's there."
Republican Senator Thom Tillis received a label on Twitter for prematurely declaring victory in his North Carolina reelection race.
TikTok said it removed the videos in question for violating its guidelines against misleading information, but declined to comment further on its decision.
A spokesperson for Facebook said the platform would not flag premature claims of state wins, only premature claims about the presidential race's final result.
Twitter also labeled a post by a Democratic official claiming that former Vice President Joe Biden had won Wisconsin.
The slowdown in tallying results had been widely anticipated for months. The COVID-19 pandemic led numerous states to make voting by mail more convenient, and millions of voters chose that option rather than venturing out to cast ballots in person. Mail ballots can take longer to process than those cast at polling locations.