What is an algorithm and why does it matter?
Social media algorithms are intricately designed data-organization systems that maximize user engagement by sorting and delivering content tailored to individual preferences. At their core, these algorithms collect extensive user data and employ machine learning techniques to understand and predict user behavior. They note and analyze hundreds of thousands of data points, including past interactions, likes, shares, content preferences, time spent viewing content, and social connections, to curate a personalized feed for each user. They are designed this way to keep users on the site, giving the site more time to place advertisements on the user’s feed and drive more profit for the platform in question. The fundamental objective of an algorithm is to capture and maintain user attention, expose the user to an optimal amount of advertisements, and use that user data to curate a feed that keeps them engaged for longer.
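The ranking loop described above can be sketched in miniature. This toy Python example is purely illustrative: the field names, topic-affinity weights, and scoring formula are assumptions for the sketch, not any platform’s actual system. It scores candidate posts by predicted engagement and sorts the feed accordingly:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    likes: int
    shares: int

def engagement_score(post: Post, user_topics: dict[str, float]) -> float:
    """Toy engagement prediction: weight a post's raw popularity by the
    user's inferred affinity for its topic (learned from past behavior)."""
    affinity = user_topics.get(post.topic, 0.1)  # low default interest
    popularity = post.likes + 2 * post.shares    # shares weighted higher
    return affinity * popularity

def rank_feed(posts: list[Post], user_topics: dict[str, float]) -> list[Post]:
    """Sort candidate posts by predicted engagement, highest first."""
    return sorted(posts, key=lambda p: engagement_score(p, user_topics),
                  reverse=True)

# A user whose history suggests strong interest in "sports":
user_topics = {"sports": 0.9, "news": 0.3}
posts = [
    Post("a", "news", likes=500, shares=10),
    Post("b", "sports", likes=100, shares=50),
    Post("c", "cooking", likes=900, shares=100),
]
feed = rank_feed(posts, user_topics)
```

Note that the moderately popular sports post outranks the far more popular cooking post purely because of the user’s inferred interests; this personalization, iterated over millions of data points, is what keeps each feed individually compelling.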
Addiction comes in many forms
One key element contributing to the addictiveness of social media is the concept of variable rewards. Algorithms strategically present a mix of content, varying in type and engagement level, to keep users interested in their feed. This unpredictability taps into the psychological principle of operant conditioning, where intermittent reinforcement, such as receiving likes, comments, or discovering new content, reinforces habitual platform use. Every time a user sees an entertaining post or receives a positive notification, the brain releases dopamine, the neurotransmitter most closely associated with addiction and addictive behaviors. The constant stream of notifications and updates, fueled by algorithmic insights and carefully tailored content suggestions, can create a sense of anticipation in users for their next dopamine fix, which encourages users to frequently refresh and scan their feeds to receive the next ‘reward’ on their timeline. The algorithmic and numbers-driven emphasis on user engagement metrics, such as the number of likes, comments, and shares on a post, further intensifies the competitive and social nature of social media platforms, promoting frequent use.
Algorithms know you too well
Furthermore, algorithms continuously adapt to user behavior through real-time machine learning. As users engage with content, algorithms analyze and refine their predictions, ensuring that the content remains compelling and relevant to the user over time. This iterative feedback loop further deepens the platform’s understanding of individual users, creating a specially curated and highly addictive feed that the user can always turn to for a boost of dopamine. This heightened social aspect, coupled with the algorithms’ ability to surface content that resonates deeply with the user, enhances the emotional connection users feel to the platform and their specific feed, which keeps users coming back time after time. Whether it is a new, dopamine-producing post or a status update that receives many likes and shares, every time users open a social media app or website they find seemingly endless new content, further reinforcing regular, and often unhealthy, use.
A fine line to tread
As explained above, social media algorithms are key to user engagement. They are able to provide seemingly endless bouts of personalized content and maintain users’ undivided attention through their ability to understand the user and the user’s content preferences. This pervasive influence extends to children, who are increasingly immersed in digital environments from an early age. Social media algorithms can offer constructive experiences for children by promoting educational content discovery, creativity, and social connectivity that would otherwise be impossible. Some platforms, like YouTube Kids, leverage algorithms to recommend age-appropriate content tailored to a child’s developmental stage. This personalized curation of interest-based content can enhance learning outcomes and produce a beneficial online experience for children. However, while exposure to age-appropriate content may not itself harm child viewers, it can still cause problems related to content addiction.
‘Protected Development’
Children are generally known to be naïve and impressionable, meaning full access to the internet can be harmful to their development, as they may take anything they see at face value. The American Psychological Association has said that, “[d]uring adolescent development, brain regions associated with the desire for attention, feedback, and reinforcement from peers become more sensitive. Meanwhile, the brain regions involved in self-control have not fully matured.” Social media algorithms play a pivotal role in shaping the content children encounter by prioritizing engagement metrics such as likes, comments, and shares. In doing this, social media sites create an almost gamified experience that encourages frequent and prolonged use among children. Children also have a tendency to intensely fixate on certain activities, interests, or characters during their early development, further increasing the chances of becoming addicted to their feed.
Additionally, the addictive nature of social media algorithms poses significant risks to children’s physical and mental well-being. The constant stream of personalized content, notifications, and variable rewards can contribute to excessive screen time, impacting sleep patterns and physical health. Likewise, the competitive nature of engagement metrics may result in a sense of inadequacy or social pressure among young users, leading to issues such as cyberbullying, depression, low self-esteem, and anxiety.
Stop Addictive Feeds Exploitation (SAFE) for Kids
The New York legislature has spotted the anemic state of internet protection for children, identified the rising mental health issues relating to social media among youth, and announced its intention to pass laws that better protect kids online. The Stop Addictive Feeds Exploitation (SAFE) for Kids Act is aimed explicitly at social media companies and their feed-bolstering algorithms. The SAFE for Kids Act is intended to “protect the mental health of children from addictive feeds used by social media platforms, and from disrupted sleep due to night-time use of social media.”
Section 1501 of The Act would essentially prohibit operators of social media sites from providing addictive, algorithm-based feeds to minors without first obtaining parental permission. Instead, the default feed would be a chronologically sorted main timeline, a format popular in the infancy of social media. Section 1502 of The Act would also require social media platforms to obtain parental consent before sending notifications between the hours of 12:00 AM and 6:00 AM, and would create an avenue for opting out of access to the platform during those same hours. The Act would also provide a limit on the overall number of hours a minor can spend on a social media platform. Additionally, the Act would authorize the Office of the Attorney General to bring a legal action to enjoin violations or seek damages/civil penalties of up to $5,000 per violation, and would allow any parent/guardian of a covered minor to sue for damages of up to $5,000 per user per incident, or actual damages, whichever is greater.
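As an illustration only, and certainly not a legal-compliance implementation, the night-time notification window described above reduces to a simple time check. The function name and the treatment of the window boundaries are assumptions for the sketch:

```python
from datetime import time

NIGHT_START = time(0, 0)  # 12:00 AM
NIGHT_END = time(6, 0)    # 6:00 AM

def notification_requires_consent(t: time) -> bool:
    """True if a notification at time t falls within the 12:00 AM to
    6:00 AM window, during which the Act would require prior parental
    consent before a platform may notify a covered minor."""
    return NIGHT_START <= t < NIGHT_END
```

For example, a 3:30 AM notification falls inside the restricted window, while a 7:00 AM notification does not; the real statutory question of how consent is obtained and verified is, of course, far harder than the time check itself.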
A sign of the times
The Act accurately represents the growing concerns of the public in its justification section, where it details many of the above-referenced problems with social media algorithms and the State’s role in curtailing the well-known negative effects they can have on a protected class. The New York legislature has identified the problems that social media addiction can present and has taken necessary steps in an attempt to curtail it.
Social media algorithms will always play an integral role in shaping user experiences. However, their addictive nature should rightfully subject them to scrutiny, especially in their effects on children. While social media algorithms offer personalized content and can produce constructive experiences, their addictive nature poses significant risks, prompting legislative responses like the Stop Addictive Feeds Exploitation (SAFE) for Kids Act. Considering the profound impact of these algorithms on young users’ physical and mental well-being, a critical question arises: How can we effectively balance the benefits of algorithm-driven engagement with the importance of protecting children from potential harm in the ever-evolving digital landscape? The SAFE for Kids Act is a step in the right direction, inspiring critical reflection on the broader responsibility of parents and regulatory bodies to cultivate a digital environment that nurtures healthy online experiences for the next generation.
It’s interesting to see the similarity between our two topics: both revolve around the concept of addiction, with your focus on the algorithm itself versus my discussion of short-form video efficiency. One aspect in particular that raised my curiosity is your discussion of dopamine, which these platforms seek to trigger through their engagement with users.
My question would be: do you find that TikTok is essentially attacking these same vulnerabilities of its users through the persona of its platform as a whole? Specifically, do you think that TikTok’s approach of minimizing the length of content was a way to separate itself and reach a degree of addictiveness that other platforms hadn’t before its development? I admired your in-depth discussion of the human brain, especially how it functions at such a young age. Similarly, I was able to learn the true vulnerability of the youth and how much easier it is for fixations to intensify at a younger age. It’s great to see how you approached the same scenario as me regarding addiction but tailored it toward different negative outcomes: specifically, depression, cyberbullying, and anxiety. My focus revolved around actual physical harm and stress.
My final question pertains to the SAFE for Kids Act. I question whether this act will have any actual impact on hindering algorithms from disrupting the youth, especially if young users find a way to provide parental consent on their own without actually obtaining it, almost like forging an electronic signature.