In a bid to fight fake news and misinformation, YouTube chief executive Susan Wojcicki has revealed plans to add links from “fact-based” sites such as Wikipedia to conspiracy theory videos on the platform. However, it seems like Wikipedia was not made aware of the move.
This was confirmed by Katherine Maher, executive director of the Wikimedia Foundation, the organisation behind Wikipedia, on Twitter, in response to queries about whether the YouTube initiative would apply only to pages in English. In the tweet, Maher said that this was something YouTube “did independent” of Wikipedia.
A YouTube spokesperson said in an earlier statement to Marketing that the organisation is always exploring new ways to battle misinformation on YouTube.
“At SXSW, we announced plans to show additional information cues, including a text box linking to third-party sources around widely accepted events, such as the moon landing,” the spokesperson said. The statement added that these features will be rolled out in the coming months, but that there was no additional information to share at the present time. The spokesperson also declined to reveal whether there was an official tie-up with Wikipedia on the initiative.
While adding Wikipedia links may not solve the problem of fake news, industry players Marketing spoke to all agreed it was a step in the right direction.
Lee Kai Xin, interactive director at Wild, said the move helps viewers see content from a more balanced perspective before making a judgement, and prevents them from being swayed by persuasive conspiracy theory videos. She added:
However, one should not mistake the content shared on Wikipedia as entirely true, as it is community driven and anyone can edit it to manipulate the truth.
“That being said, the introduction of the feature can encourage viewers to be savvier and recognise other accounts of the truth. It also signals to viewers not to trust a singular source of information in the age of the internet,” Lee explained.
Edmund Lou, head of strategy at Kingdom Digital, also agreed that Wikipedia is not a reliable platform for information as its content is crowd-sourced, with unverified contributors.
“There’s even a page on Wikipedia that questions and discusses at length the ‘Reliability of Wikipedia’,” Lou added.
While it is a good effort on YouTube’s part to add credibility to the content on its platform, the task will be challenging: with more than 1.5 billion monthly active users and 500 hours of video uploaded per minute, the platform’s openness allows anyone to upload any type of content.
“With consumers getting smarter nowadays, they have access to various information sources online and thus will be able to discern the type of news and information that can or should be trusted,” Lou added.
Also weighing in was Kristian Olsen, managing director at Type A, who said platforms such as YouTube have a responsibility to the online community to increase education on what should be considered the truth.
“At the end of the day, people are inclined to believe what they perceive to be the truth. Ultimately, it still falls on the end user to form their own conclusions on specific matters pertaining to fake or controversial news,” Olsen explained. He added:
You can’t peg a meter that validates how reliable any information, from any source, is.
In the case of Wikipedia, it has built a reputation as an otherwise credible source of information online because it is the “norm” for users to refer to it for a summary of events and information, Olsen said.
“While Wikipedia may provide information that is contributed by the online community, users still need to draw their own conclusions. This is one aspect with no guarantee or sway towards decision making,” he added.