(Photo : Getty Images/Sean Gallup) A young woman with a smartphone walks past a billboard advertisement for YouTube in Berlin, Germany, on September 27, 2019. YouTube has grown into the world's largest platform for sharing video clips.

YouTube has vowed to take down video content alleging that fraud played a role in the results of the 2020 presidential election. The Google-owned video giant declared on Wednesday that it would remove new uploads containing misinformation about widespread electoral abuse, including claims that the election was won because of "software glitches or counting errors."

YouTube to Take Action Against Misinformation

YouTube plans to delete newly uploaded videos that misrepresent the election results, cautioning users that it will take down content making unfounded claims of voter fraud. The declaration comes amid ongoing criticism of the video platform for what many detractors describe as a hands-off attitude toward misleading or false videos, or content that could provoke violence.

President Donald Trump's team alleged election fraud and filed multiple lawsuits across different states. Its requests for recounts were granted in some states and denied in others.

According to YouTube, its main goal during the election season was to connect people with authoritative information while limiting the reach of misinformation and removing harmful content. It added that this work is ongoing.

The policy takes effect today to thwart fake news that could mislead United States citizens. The video platform made the announcement one day after the U.S. reached the "safe harbor" deadline, by which states should have completed election recounts, challenges, or audits.

YouTube stated that enough states had certified their election results. "Given that, we will start removing any piece of content uploaded today (or anytime after) that misleads people by alleging that widespread fraud or errors changed the outcome of the 2020 U.S. Presidential election, in line with our approach towards historical U.S. Presidential elections," PCMag reported.


Online platforms have been pressured to police misinformation regarding the presidential election on their websites.

YouTube was widely regarded as taking a more hands-off approach than Facebook and Twitter, which began labeling content containing false election claims.

YouTube announced that content containing statements "alleging widespread fraud or errors changed the outcome of a historical U.S. presidential election" would be removed, NTD reported.

There remain significant legal challenges, including one in the Supreme Court, that could alter the election's outcome.

"Now that enough states certified their Presidential election results, we'll remove any content published today (or anytime after) that alleges widespread fraud or errors changed the 2020 U.S. Presidential election outcome," YouTube wrote on Twitter.

In the same thread, YouTube also pointed to its existing policies against false and misleading information.

This policing of content on the platform comes more than a month after the 2020 presidential election.

Since the polls closed, President Trump has repeatedly claimed that voting machines across America were manipulated and has declined to concede, alleging that his opponents were stealing the election.

Claims of "fake news" and allegations of voting machine errors, voter fraud, and other irregularities have proliferated.

According to YouTube, "Our Community Guidelines prohibit spam, scams, or other manipulated media, coordinated influence operations, and any content that seeks to incite violence. Since September, we've terminated over 8000 channels and thousands of harmful and misleading elections-related videos for violating our existing policies," reported Slash Gear.
