This story is part of CNET’s coverage of the run-up to voting in November.
As Democrats and Republicans prepare to hold their national conventions starting next week, YouTube on Thursday announced updates to its policies on deceptive videos and other content designed to interfere with the election.
The world’s largest video platform, with more than 2 billion users a month, will ban videos containing information that was obtained through hacking and could meddle with elections or censuses. That would include material like hacked campaign emails with details about a candidate. The update follows the announcement of a similar rule that Google, which owns YouTube, unveiled earlier this month banning ads that contain hacked information. Google will start enforcing that policy Sept. 1.
YouTube also said it will take down videos that encourage people to interfere with voting and other democratic processes. For example, videos telling people to create long lines at polling places in order to stifle the vote won’t be allowed.
The new policies come just ahead of the Democratic National Convention, which starts Monday and is followed by the Republicans’ event later this month. The conventions mark the home stretch of the US presidential election season, which has at times been overshadowed by the global coronavirus pandemic. But as the general election kicks into high gear — former Vice President Joe Biden earlier this week announced Sen. Kamala Harris as his running mate — Silicon Valley companies have been eager to prove they can avoid the pitfalls they encountered in 2016. That election was marred by interference from Russia, which exploited platforms from Google, Facebook and Twitter to try to influence the outcome of the contest.
Earlier this week, several big tech companies, including Google, Facebook, Twitter, Reddit and Microsoft, said they’re working with US government agencies to protect election integrity.
YouTube said it will livestream both conventions, which have been scaled back to mostly virtual events in an effort to curb the spread of the coronavirus. The video platform also said it’s adding new information panels on presidential and federal candidates when people search for them on YouTube. The panels will include the person’s name, party and, if they have one, a link to the candidate’s official video channel.
YouTube has also tried to secure its platform from foreign actors. Last week, the company said it banned almost 2,600 channels linked to China as part of investigations into “coordinated influence operations” on the site. YouTube also took down dozens of channels linked to Russia and Iran that had apparent ties to influence campaigns.
Google on Thursday said it will give people more information on who’s behind the political advertisements that run on Google and YouTube. The company’s political ad transparency report, which Google first started releasing two years ago, will include new ways to sort campaign spending.