San Francisco – In a bid to curb false claims about Covid-19, video-streaming platform YouTube has removed more than 30,000 videos containing misinformation about Covid-19 vaccines over the last six months.

According to a report by Axios, YouTube has taken down more than 800,000 videos containing Covid-19 misinformation since February 2020. Videos are first flagged by either the company’s AI systems or human reviewers and then receive a second level of review.

Under YouTube’s rules, videos violate the vaccine policy if they contradict expert consensus on the vaccines from health authorities or the World Health Organization (WHO), the report said.

Other platforms, including Facebook and Twitter, have also rolled out policies to reduce the spread and reach of such content.

Recently, the micro-blogging platform Twitter introduced a strike system for misleading tweets about Covid-19 vaccination, with escalating penalties for repeat violations.

Since introducing its Covid-19 guidance, Twitter said, it has removed more than 8,400 tweets and challenged 11.5 million accounts worldwide.

One strike carries no account-level action; two strikes lead to a 12-hour account lock; three strikes to another 12-hour lock; four strikes to a seven-day account lock; and five or more strikes result in permanent suspension of the account.

Labels will initially be applied by Twitter team members when they determine that content violates the platform’s policy. (IANS)