[Vol.89] The Algorithm That Changes the World to Suit Your Taste
You have probably seen the YouTube comment ‘the algorithm brought me here’. In our spare time we watch videos and buy items recommended by apps, and soon enough an advertisement for the very product we searched for earlier starts playing. Recommendation algorithms are now part of our daily lives. How much do you know about them?
About the ‘algorithm’
An algorithm is a set of procedures, methods, and instructions for solving a problem; the word comes from the name of the mathematician al-Khwarizmi, who worked in the medieval Islamic world. What we encounter most often online today is the ‘recommendation algorithm’: a system that provides customized content based on a user’s usage history or personal information.
Recommendation algorithms work on two main principles: ‘Collaborative Filtering’ (CF) and ‘Content-based Filtering’. Collaborative filtering identifies similar usage patterns within a group and recommends items to people with similar tastes. It can be further divided into user-based CF and item-based CF: user-based CF recommends items used by people whose tastes resemble yours, while item-based CF recommends items that are frequently purchased together with the item you are looking at. For example, user-based CF might recommend music by considering a user’s interests and age group, whereas item-based CF recommends music that is often listened to together with the music you have already enjoyed. The other approach, content-based filtering, recommends items based on an analysis of the content itself, for example recommending music by analyzing its composer, genre, and so on.
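To make the distinction concrete, here is a minimal Python sketch of user-based collaborative filtering and content-based filtering on a toy rating matrix. The users, songs, ratings, and genre features are all invented for illustration; real services use far larger data and far more sophisticated models.

```python
import numpy as np

# Rows = users, columns = songs; 0 means "not rated yet".
ratings = np.array([
    [5, 4, 0, 0],   # user 0: likes songs 0 and 1, has not heard 2 and 3
    [4, 5, 1, 2],   # user 1
    [0, 1, 5, 4],   # user 2
])

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

# --- User-based collaborative filtering ---------------------------------
# Find the user most similar to user 0, then recommend the unheard song
# that this neighbour rated highest.
target = 0
others = [u for u in range(len(ratings)) if u != target]
sims = [cosine(ratings[target], ratings[u]) for u in others]
neighbour = others[int(np.argmax(sims))]
unseen = np.where(ratings[target] == 0)[0]
user_based_pick = unseen[int(np.argmax(ratings[neighbour][unseen]))]
print("user-based CF recommends song", user_based_pick)

# --- Content-based filtering ---------------------------------------------
# Describe each song by (invented) genre features and recommend the unheard
# song closest to the songs user 0 already liked.
song_features = np.array([
    [1.0, 0.0, 0.0],   # song 0: ballad
    [1.0, 0.0, 0.0],   # song 1: ballad
    [0.5, 0.5, 0.0],   # song 2: ballad/rock crossover
    [0.0, 0.0, 1.0],   # song 3: hip-hop
])
liked = ratings[target] >= 4
profile = song_features[liked].mean(axis=0)   # user 0's taste profile
content_pick = unseen[int(np.argmax(
    [cosine(profile, song_features[s]) for s in unseen]))]
print("content-based filtering recommends song", content_pick)
```

Notice that the two approaches can disagree: the collaborative method follows what a similar listener enjoyed, while the content-based method follows the properties of the songs themselves.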
The uses of algorithms
These recommendation algorithms are widely used in many areas of our daily lives.
A representative case is the ‘search engine’. The amount of information on the web is so vast that users could hardly find what they want if it were not sorted for them. Search algorithms rank results according to how well each page matches the user’s needs. Google, the world’s leading search engine company, uses a ranking system made up of several algorithms: after organizing hundreds of billions of web pages into a search index, it returns customized results by weighing factors such as the user’s search terms, location, and preferences together with each page’s relevance and quality.
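As a rough illustration only (Google’s actual ranking system is far more complex and not public), the sketch below scores a handful of made-up pages with a simple weighted combination of relevance, quality, and location match, and sorts them by that score. The pages and weights are assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class Page:
    title: str
    relevance: float   # how well the page matches the search terms (0..1)
    quality: float     # e.g. how often other pages link to it (0..1)
    local: float       # 1.0 if the page matches the user's location, else 0.0

def rank(pages, w_rel=0.6, w_qual=0.3, w_loc=0.1):
    # Combine the factors into a single score and sort best-first.
    score = lambda p: w_rel * p.relevance + w_qual * p.quality + w_loc * p.local
    return sorted(pages, key=score, reverse=True)

pages = [
    Page("Pizza history", relevance=0.4, quality=0.9, local=0.0),
    Page("Pizza places near you", relevance=0.8, quality=0.6, local=1.0),
    Page("Frozen pizza reviews", relevance=0.7, quality=0.5, local=0.0),
]
for p in rank(pages):
    print(p.title)
```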
Another case is the ‘streaming service’. Services such as YouTube, Netflix, and TikTok use algorithms to place content that users are likely to prefer at the top of the screen. TikTok asks new users to select categories they are interested in and shows videos that fit those categories. Afterwards, it selects posts based on which content you liked or disliked and which videos you watched repeatedly or shared with others, while also considering factors such as the accounts you subscribe to, the posts you write, your searches, and your language and country settings. Even if a user has not chosen any categories of interest, the system shows generally popular videos and monitors reactions to select what to show next. Netflix, for its part, notably employs people in a role called ‘Tagger’. Taggers watch Netflix content and classify it into roughly 50,000 categories according to characteristics of genre, plot, and characters, increasing the accuracy of the recommendation algorithm by describing the content in fine detail.
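The feedback loop described above can be sketched very roughly as follows. The interaction weights, categories, and videos are assumptions made for illustration, not TikTok’s or Netflix’s actual formulas: each interaction nudges per-category scores, and the feed is re-sorted accordingly, falling back on popularity when nothing is known yet.

```python
from collections import defaultdict

# Hypothetical interaction weights (assumptions, not published values).
WEIGHTS = {"like": 2.0, "share": 3.0, "rewatch": 1.5, "dislike": -2.0}

class Feed:
    def __init__(self, chosen_categories=None):
        # Cold start: categories chosen at sign-up get a head start;
        # otherwise everything is equal and popular items surface first.
        self.scores = defaultdict(float)
        for c in chosen_categories or []:
            self.scores[c] += 1.0

    def record(self, category, interaction):
        # Update the user's category scores after each interaction.
        self.scores[category] += WEIGHTS[interaction]

    def next_videos(self, catalogue):
        # catalogue: list of (title, category, popularity)
        key = lambda v: (self.scores[v[1]], v[2])
        return sorted(catalogue, key=key, reverse=True)

feed = Feed(chosen_categories=["cooking"])
feed.record("cats", "like")
feed.record("cats", "rewatch")
feed.record("news", "dislike")

catalogue = [
    ("Pasta in 10 minutes", "cooking", 0.7),
    ("Kitten compilation", "cats", 0.9),
    ("Morning headlines", "news", 0.95),
]
for title, *_ in feed.next_videos(catalogue):
    print(title)
```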
Algorithms are also used in the field of ‘education’. After a user solves a question, the algorithm analyzes the score, the time taken, and the answer chosen to gauge the user’s skill and recommend questions on weaker topics. Santa TOEIC, for example, predicts a user’s likely score on the actual test after the user solves six TOEIC questions, and then focuses on the areas of reading, writing, and listening that need more study.
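A minimal sketch of this adaptive idea, with invented questions and scoring rules rather than Santa TOEIC’s actual model, might estimate a per-topic skill level from correctness and response time and point the learner at the weakest topic:

```python
from collections import defaultdict

attempts = [
    # (topic, answered correctly, seconds taken) -- made-up practice data
    ("reading",   True,  40),
    ("reading",   False, 95),
    ("listening", True,  30),
    ("listening", True,  35),
    ("grammar",   False, 80),
    ("grammar",   False, 120),
]

def skill_estimates(attempts, time_limit=60):
    totals = defaultdict(lambda: [0.0, 0])   # topic -> [skill points, count]
    for topic, correct, seconds in attempts:
        # Full credit for a quick correct answer, partial credit for a slow
        # correct answer, none for a wrong one.
        points = (1.0 if seconds <= time_limit else 0.5) if correct else 0.0
        totals[topic][0] += points
        totals[topic][1] += 1
    return {t: s / n for t, (s, n) in totals.items()}

skills = skill_estimates(attempts)
weakest = min(skills, key=skills.get)
print("estimated skills:", skills)
print("recommend more practice on:", weakest)
```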
The advantages of algorithms
So what are the advantages of these recommendation algorithms?
The biggest characteristic and advantage of the recommendation algorithm is that it ‘provides personalized information’. Amid the flood of information, individuals can receive information suited to their own needs and interests. It also saves time: a single search is enough to get information such as directions or shopping results. Along with this convenience, enjoyment is another benefit for individuals: you can fill your spare time with content tailored to your interests and feel a sense of connection with like-minded groups. In most cases, this customized content can also be accessed free of charge.
Recommendation algorithms are also attractive to companies. It is no exaggeration to say that most of the profits of big data companies today come from ‘advertising’ driven by recommendation algorithms. In 2021, 82% of Google’s $256.7 billion in revenue and 97% of Meta’s $117.9 billion in revenue came from advertising, and this advertising is personalized on the basis of big data. Neal Mohan, YouTube’s chief product officer (CPO), said in a 2019 interview with The New York Times that 70% of YouTube users’ viewing time is the result of the recommendation algorithm, and that total viewing time has increased more than 20-fold since the algorithm was introduced. In other words, from a company’s point of view, there is no reason not to use a recommendation algorithm that increases both advertising revenue and platform consumption.
The problems of algorithms
However, this useful recommendation algorithm also comes with problems.
The first is ‘addiction to digital devices’. Because the algorithm keeps serving personalized content, individuals cannot easily pull themselves away from their seductive devices. Professor Kim Byungkyu has likened the YouTube algorithm to a bakery that keeps recommending sweet cakes. The problem is especially serious for young people, since sensational and violent content served without filtering raises concerns about imitation. Frances Haugen, a former Meta employee, testified before the U.S. Congress that the algorithm exposed users to more provocative content, and that social media companies hid this from users in order to increase advertising and earn more profit, even though they knew it was harmful to teenagers.
The second is ‘personal information leakage’. To use such an algorithmic system, personal information such as your name, age, and preferences must be entered. On top of that, every time you use the system, what you search for and what content you view is collected in real time and stored as the company’s big data. Sensitive information such as your movements, income, and health status is no exception.
Finally, there is ‘confirmation bias’: the tendency to focus only on information that matches one’s own values, beliefs, and judgments while ignoring everything else. This bias is often explained through the ‘filter bubble’ and the ‘echo chamber’ effect. The filter bubble is a phenomenon in which an information provider serves customized information to a user, so the user encounters only filtered information. Similarly, an echo chamber is an enclosed space where sound reverberates, so the same sound comes back no matter what is said; by analogy, the echo chamber effect describes an environment in which people encounter only beliefs and opinions that coincide with their own, so their existing views are reinforced and alternative ideas are never considered. An example of this confirmation bias could be seen around the last U.S. presidential election: the protesters who broke into the U.S. Capitol, refusing to accept Trump’s defeat, had come to believe a completely different story through social media. Truth Social, the social media platform created by Trump, aggravated the situation, because such platforms can reinforce stereotypes and prejudices, in other words confirmation bias, which can incite people to hate the society around them.
How should we accept these algorithms?
“If you always ask questions away from the familiarity of algorithms, and remember that algorithms mirror the imperfections of the world, humans can coexist well with algorithms.” This phrase comes from a book by So Yieon. Here are three ways to coexist wisely with algorithms.
[1] First, there are attempts to escape the confirmation bias caused by the filter bubble. ‘Read Across the Aisle’ is an app that measures which news outlets a reader has spent the most time on, based on Pew Research’s political-spectrum ratings. When the reader consumes too much news from outlets of a particular bias, it displays a warning and encourages them to read news from other perspectives. The Wall Street Journal set up a special page called ‘Blue Feed, Red Feed’ before the U.S. presidential election, intended to ease biased perception by showing reports and comments from opposing perspectives side by side on one page. The Korean newspapers Hankyoreh and JoongAng have also tried to present progressive and conservative views on a topic in a balanced way through a section called ‘Into the editorial’.
[2] Second, we need media literacy. Media literacy refers to the ability to read and understand information delivered through the media; more broadly, it is the ability to access media, understand and use information critically, and express and communicate creatively. When taking in information through the media, it is important to grasp the intention behind it and accept it critically, figuring out who created it, from what perspective, and what messages it contains. It also helps to periodically delete data such as the search and viewing histories on the devices you use.
[3] It is also important for us, as pre-service teachers, to provide media literacy education. In Finland, which has topped the European Media Literacy Index released by Bulgaria’s Open Society Institute for five consecutive years, media literacy education begins in childhood. Finnish media education rests on two pillars: the government agency KAVI (the National Audiovisual Institute, Kansallinen Audiovisuaalinen Instituutti) and the NGO FSME (Mediakasvatusseura, the Finnish Society on Media Education). In addition, YLE, the Finnish public broadcaster, offers various media education services for children, teenagers, and teachers. In its news classes, YLE reporters visit schools in person to make news with children and develop their critical eyes, and in its children’s newspaper, children themselves act as producers and participants. In schools, educators’ autonomy is respected, and classes combining media literacy with a variety of subjects are taught.
[4] Along with media literacy, the most important thing is our attitude in daily life. We need to remember what matters within ourselves, and the most basic part of that is an attitude of respect for people whose perspectives differ from our own. In order not to fall into a biased world where only one opinion exists, we need to communicate with others in real life, exchange diverse opinions with people who think differently, and acknowledge our differences. When we can accept this diversity as it is, we gain the power to live in a world that uses algorithms wisely rather than one that depends on them. To live in a more colorful world, let’s use algorithms wisely.