It starts with an innocent search of the day’s headlines on YouTube. While things seem normal at first, the site eventually pulls you further and further away from mainstream content and toward fringe content designed, above all, to keep you watching. Before you know it, you have fallen down a rabbit hole full of videos about the earth being flat, when society will come to an end, and why your presidential candidate could be a member of the Illuminati.
“YouTube is something that looks like reality, but it is distorted to make you spend more time online,” former YouTube engineer Guillaume Chaslot says in an interview with The Guardian. “The recommendation algorithm is not optimizing for what is truthful, or balanced, or healthy for democracy.”
“Watch time was the priority,” Chaslot added in the same interview when talking about his time at Google. “Everything else was considered a distraction.”
But what is it that attracts viewers to certain videos over others and makes them want to click that mouse? Furthermore, what elements go into a video to make it so engaging to watch and share with a group of friends? Believe it or not, there is actually a psychology behind the whole process and it’s a very interesting one to dissect and examine.
“Emotion is a big driver of what goes viral,” says Jonah Berger in an interview with The New Republic. “Whether something pulls on our heartstrings, makes us angry, or provokes controversy, the more we care, the more we share.”
An article on Forbes.com dives deeper into this and reveals that curiosity, fear, and arousal are the three biggest emotional factors that encourage people to click on things. This means that titles like “Michael Jackson Exposed as an Illuminati Agent,” “Proof That the Earth Is Flat,” and “Five Ways You Are Playing Monopoly Wrong” are all going to garner more clicks than regular content would.
With all that being said, a lot of it comes down to two things. The first is which videos appear in a person’s “Up Next” or recommended section. This is determined by what the user has liked in the past, what is currently trending, and each video’s like-to-dislike ratio, but also by subscriptions and notification settings. Once the next 20 videos are served up by YouTube, the rest is based on emotions.
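To make the ranking logic described above concrete, here is a minimal sketch of a watch-time-oriented recommender. Every signal name and weight here is an invented assumption; YouTube’s real algorithm is proprietary and far more complex.

```python
# Illustrative sketch of a watch-time-oriented recommendation ranker.
# All signals and weights below are hypothetical, not YouTube's real ones.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # model's guess at how long this user would watch
    like_ratio: float               # likes / (likes + dislikes), between 0.0 and 1.0
    trending_score: float           # how hot the video is right now, 0.0 to 1.0
    from_subscription: bool         # is the user subscribed to this channel?

def score(v: Video) -> float:
    # Watch time dominates the score; other signals only nudge the ranking.
    s = 10.0 * v.predicted_watch_minutes
    s += 2.0 * v.like_ratio
    s += 3.0 * v.trending_score
    if v.from_subscription:
        s += 5.0
    return s

def recommend(candidates: list[Video], n: int = 20) -> list[Video]:
    # Serve up the top-n "Up Next" videos by score.
    return sorted(candidates, key=score, reverse=True)[:n]
```

Under this toy scoring, a sensational video that keeps people watching twice as long will outrank a better-liked but shorter one, which is the dynamic the article describes.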
While curiosity, fear, and arousal are the three mentioned in the Forbes article, there is another emotion that can also drive people to click on a video, and that’s disgust. Believe it or not, this emotion apparently drives users to click on videos like Dr. Pimple Popper’s blackhead-popping videos and other such icky content.
“In the space of these videos, we can still be disgusted, but not so much we have to look away,” says psychologist Alexander Skolnick in an interview with QZ.com. “We can be curious and explore the situation more so we can, in theory, learn from it to protect ourselves in the future.” If we were so disgusted we looked away, he adds, “you’ll miss out on something. Call it evolutionary FOMO.”
Not a lot is known about YouTube’s exact algorithm, but Chaslot claims that its most important aspect is the twenty or so videos that get served up in the recommended, “watch next” section. These videos are meant to entice viewers and keep them on the website longer. The problem is that the information in the videos very often represents left- or right-wing fringe opinions.
For example, during the 2016 election between Donald Trump and Hillary Clinton, information started to leak that Clinton was a member of the occult and took part in blood rituals, which scared voters who would have otherwise voted for her. This example shows how YouTube’s algorithm had an effect on a real-life event, which has to make one wonder if these kinds of practices should be allowed.
While maintaining the idea of free speech is very important in our society, especially if we are going to engage in dialogues about controversial topics, YouTube really needs to look at how its algorithm is affecting the rest of the world. In fact, the United States government would be wise to implement laws that make sure each side of the political spectrum is being represented.
Furthermore, YouTube needs to stop giving so much weight to polarizing videos and find a better way to offer an enjoyable user experience. It’s understandable that the company would want to continue doing what has made the website successful, but one could argue that there are other ways of accomplishing that same end.
In the end, the longer YouTube goes on with a formula like this, the longer people are going to be able to abuse the system and change people’s political leanings. The problem is that there is no way to tell whether the information being presented is true, which makes it hard to have any idea where to fall on crucial issues.
In fact, most of the Hillary Clinton videos being pushed by the algorithm in hopes of more viewers were largely false and focused on sensationalized details and innuendo. While it is not known exactly what long-term effect that had on the 2016 election, it no doubt affected first-time voters and those more likely to lean conservative.
That said, the algorithm does have its beneficiaries, as it seems to favor conservative-leaning content. For example, one of the tests Guillaume Chaslot ran on the algorithm was to see which political candidate it helped out the most. Interestingly enough, it turned out to be much harder to find a negative Donald Trump video than a negative Hillary Clinton video, which is great for conservatives.
That means the conservative political base will have a lot more scare tactics on its side in the war for hearts and minds and might want to focus on YouTube during the 2018 midterm elections and the 2020 presidential election. Not only could it help ensure another presidency for Trump, it also puts more of Hillary Clinton’s dirty laundry in a place where young, impressionable minds will see it.
That’s why there needs to be some kind of governing of YouTube when it comes to politics. Furthermore, YouTube needs to understand that it is a platform in the same way that television is and realize that it has a responsibility to inform viewers correctly. The current algorithm isn’t achieving that, and it could be misleading a lot of people in the process.
Although YouTube’s algorithm seems to work to expose left-wing views and humiliate some of the more out-there candidates, Facebook took a completely different approach in 2018 by changing its algorithm to prioritize friends, family, and groups. This was a way to focus on building friendships and to silence the fake news stories that were making the rounds on the social media website.
One problem with Facebook doing that is it seems to eliminate the possibility of using Facebook to spread political ideas. Whether that is a good thing or not is up to the user, but it’s going to be interesting to see how each political party responds to not being able to just broadcast their message and reach a bunch of people. If nothing else, that’s going to make the election harder.
Another problem with the Facebook algorithm is how it seems to ignore troll pages and profiles. In fact, messages from radical left- or right-wing political groups show up on many users’ news feeds every day, and it’s hard to know which are real and which are only trying to further divide the American public.
It also helps the spread of misinformation, especially since someone could find a false or misleading post on their news feed and then share it with their friends. Before they know it, the false information will have spread to thousands of people and further muddied the waters between what is fake news and what is reality.
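The share-and-reshare dynamic described above compounds quickly. As a rough illustration only, here is a toy branching-process calculation; the audience size and reshare rate are invented numbers, not Facebook data.

```python
# Rough branching-process sketch of how one false post can reach thousands.
# friends_reached and reshare_rate are invented illustrative numbers.
def total_reach(friends_reached: int, reshare_rate: float, generations: int) -> int:
    sharers = 1   # the original poster
    reached = 0   # running total of people who have seen the post
    for _ in range(generations):
        newly_reached = sharers * friends_reached
        reached += newly_reached
        # Only a small fraction of people who see the post reshare it.
        sharers = int(newly_reached * reshare_rate)
    return reached
```

With these made-up numbers, if each sharer reaches 100 friends and just 5% of viewers reshare, the post reaches over 3,000 people in only three rounds of sharing.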
With that being said, Facebook did the right thing by trying to focus on building relationships between friends and family, but they are missing the point of free speech if they are going to hide political posts like they are right now. Of course users have the right to opt out of these political discussions, ads and stories if they want, but not having them altogether is a huge loss for free speech and for citizens everywhere.
If nothing else, every social media platform needs to develop better algorithms for its users. In fact, if users are ever going to take places like YouTube seriously as an entertainment medium, the company needs to make sure that the videos on its website are factual, and remove ones that are not, along with ones that could be used to unfairly paint someone in a bad light.
Only then can YouTube be a safe and secure platform for people to get their information from. As for Facebook, it would greatly benefit from giving users a choice in what they see on their news feeds. In fact, it should not shy away from political topics and should continue to market itself as a platform for talking about politics.
At least then, people can opt in and out of what they want to see rather than be forcibly censored. Going back to YouTube, that platform will have a harder time, especially since its current algorithm is centered on longer watch times, but even it could benefit from changing the way it handles news.
In the end, YouTube is a widely trusted platform and will probably continue to be for quite some time. With that in mind, shouldn’t YouTube be finding a way to review the news content uploaded to its website? Shouldn’t it be protecting its users from fake information that could affect their everyday choices and their political leanings?
The obvious and overwhelmingly easy answer to both questions is yes, it should.
Sources: Forbes article; The Guardian interview.