By Katelynn Rosa
The higher-ups at YouTube are at it again, updating their platform, but is the latest change a problem for content creators?
In late January, YouTube announced an update to its recommendation algorithm. The system will no longer recommend videos that “come close to violating” its Community Guidelines, specifically videos that could “misinform users in harmful ways”. According to a blog post by the company, it won’t recommend any videos “promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11”.
Since videos of this nature don’t quite violate YouTube’s guidelines, they cannot be removed. By changing the recommendation algorithm, YouTube is instead making them harder to find. Anyone who searches for these videos will still find them on the site, and anyone subscribed to a channel that produces them will still see that channel’s content; YouTube will simply no longer recommend similar videos.
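To make the announced behavior concrete, here is a minimal, purely illustrative sketch in Python. YouTube has not published its implementation; the labels and the surface_video function below are hypothetical stand-ins for whatever classifiers the company actually uses.

```python
# A purely illustrative sketch of the policy described above.
# The "label" values and function name are assumptions, not YouTube's code.

def surface_video(video: dict, context: str) -> str:
    """Return how a video surfaces in a given context under the announced policy."""
    if video["label"] == "violation":
        return "removed"          # outright guideline violations are removed
    if video["label"] == "borderline" and context == "recommendations":
        return "suppressed"       # borderline videos are no longer recommended
    return "shown"                # still visible via search and subscriptions


flat_earth = {"title": "The Earth Is Flat", "label": "borderline"}
print(surface_video(flat_earth, "recommendations"))  # suppressed
print(surface_video(flat_earth, "search"))           # shown
```

The key point the sketch captures is that nothing is deleted: the same video yields different outcomes depending on how the viewer arrives at it.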
“We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users,” said YouTube’s blog post.
Now, YouTube claims this change will affect less than one percent of the content on its site. But with the amount of content uploaded daily, even less than one percent still amounts to a large volume of videos going unnoticed by the average user.
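To put “less than one percent” in perspective, here is a rough back-of-the-envelope calculation. The upload rate is an assumption based on figures widely cited around the time of the announcement, not a number from YouTube’s blog post.

```python
# Back-of-the-envelope scale check, assuming the widely cited figure
# of roughly 500 hours of video uploaded to YouTube per minute (circa 2019).
HOURS_PER_MINUTE = 500                      # assumed upload rate
hours_per_day = HOURS_PER_MINUTE * 60 * 24  # 720,000 hours of video per day
one_percent = hours_per_day * 0.01          # 7,200 hours per day
print(f"1% of daily uploads is about {one_percent:,.0f} hours of video")
```

Even at one percent, that is thousands of hours of video a day quietly dropping out of recommendations.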
Content creators like ReviewTechUSA have given their input on the topic. “They’re [YouTube] not like completely getting rid of them [conspiracy videos], but they’re just really hindering them hardcore,” ReviewTechUSA said in one of his videos. “Whoever makes any of these videos, they’re not [going to] get any new subscribers because YouTube’s not gonna recommend them.”
ReviewTechUSA doesn’t think YouTube did this with malicious intent. “I really think what they’re trying to do is get rid of the misinformation,” he said.
Other creators, like Eli the Computer Guy, see what YouTube is doing as a negative thing. In his video on the topic, he emphasizes the idea that YouTube is “burying” this type of content. Referring to YouTube’s blog post, Eli the Computer Guy said, “This [blog post] does not say ‘in discussion with our content creators’. This does not say ‘in discussion with our users’. This does not say ‘in discussion’. This says ‘we think’…because we [YouTube] know what’s best.”
Eli the Computer Guy continued, speaking specifically about YouTube’s “responsibility” to its content creators and users: “As far as I know…YouTube’s responsibility to me is basically to just follow their rules, to create a set of rules, follow those rules, and both as a creator and a viewer I will be happy. That’s where your responsibility begins and ends.”
Now, not everyone is disappointed in this change to the platform. According to an NBC News article, former Google engineer Guillaume Chaslot “helped to build the artificial intelligence used to curate recommended videos.” He took to Twitter to praise YouTube’s algorithm change.
In a thread, Chaslot explained how he personally knew a man named Brian who went down the rabbit hole of conspiracy theory content and came out “not trust[ing] anyone. He stopped working, seeing friends, and wanting kids.” The AI Chaslot helped design was built to increase the amount of time people spend on the site, and by that measure, viewers like Brian are exactly what the system is looking for.
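As a purely illustrative sketch of the incentive Chaslot describes, consider a recommender that ranks candidates solely by predicted watch time. Everything here, from the scoring field to the sample data, is a hypothetical simplification, not Chaslot’s or YouTube’s actual system.

```python
# Hypothetical simplification of a watch-time-maximizing recommender.
# A system optimized this way has no notion of whether content is true,
# only of how long it is predicted to keep a viewer watching.

candidates = [
    {"title": "10-minute news recap",        "predicted_watch_minutes": 6.0},
    {"title": "3-hour conspiracy deep dive", "predicted_watch_minutes": 95.0},
]

def recommend(videos: list) -> dict:
    """Rank candidates purely by expected watch time and pick the top one."""
    return max(videos, key=lambda v: v["predicted_watch_minutes"])

print(recommend(candidates)["title"])  # the deep dive wins on this metric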
The problem lies in the content these people are watching. If YouTube limited the misinforming content on its site, people would be less likely to fall into the conspiracy theory rabbit hole.
In some ways, YouTube has a bigger issue than just hiding undesirable content on its site. The company says it wants to reduce borderline content that could be harmful, but what counts as “harmful”? Kevin Roose, a New York Times columnist, addressed just that in his article “YouTube Unleashed a Conspiracy Theory Boom. Can It Be Contained?”
“YouTube’s first challenge will be defining which of these videos constitute ‘harmful’ misinformation, and which are innocent entertainment meant for an audience that is largely in on the joke,” Roose wrote.
In his article, Roose mentioned popular YouTuber Shane Dawson, who at the end of January released his next big project: Conspiracy Theories With Shane Dawson. The video featured a series of what Roose describes as “far-fetched hypotheses,” including “that iPhones secretly record their owners’ every utterance; that popular children’s TV shows contain subliminal messages urging children to commit suicide; that the recent string of deadly wildfires in California was set on purpose, either by homeowners looking to collect insurance money or by the military using a type of high-powered laser called a ‘directed energy weapon.’”
Now, Dawson himself prefaced his videos by saying that none of these conspiracies are fact-based, just theories. All the same, his video received more than 30 million views. His most recent video gained more than 20 million views and even sparked a public feud with Chuck E. Cheese’s, forcing the restaurant chain to deny claims that it serves recycled pizza to its customers.
According to Roose, this is a huge problem. “Many young people have absorbed a YouTube-centric worldview, including rejecting mainstream information sources in favor of platform-native creators bearing ‘secret histories’ and faux-authoritative explanations.”
So, young people nowadays accept what the internet shares about these conspiracies as fact while disregarding mainstream information sources that say otherwise. Roose said it is possible for YouTube to combat the plague of conspiracy theories infecting the site. “But doing it will require acknowledging how deep these problems run and realizing that any successful effort may look less like a simple algorithm tweak, and more like deprogramming a generation,” said Roose.
What are your thoughts? Is YouTube’s update to the algorithm helping flush out misinformation, or is it imposing censorship on its content creators? Comment below or tweet me @KRosa_AC301.