
Inside Social Media Algorithms: The Essential Guide for Parents and Children

By Sean, on October 30, 2024 - 4 min read

Admit it. Thumb-scrolling through Instagram feeds is super fun. So, we really can’t blame our children for being glued to their smartphones all day, can we?

With endless notifications and infinite reasons to scroll, social media platforms thrive by gamifying the user experience. So, instead of logging off early, we stick around longer.

The real magic lies in social media algorithms: complex sets of calculations that decide which content appears in each user’s feed. For children, this can be entertaining, but it can also expose them to a darker side, including screen addiction, misinformation, body dysmorphia, and even eating disorders.

So, as a parent, it’s important to understand social media algorithms, how they work, and what you can do to keep your child safe. 

What Are Social Media Algorithms?

In computing terms, an algorithm is a set of instructions that tells a computer what to do, like sorting data or identifying objects. Social media algorithms filter, rank, and present content to users. Simply put, they decide what we see and interact with on platforms like Facebook or Instagram.
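
To make that idea concrete, here is a tiny, purely illustrative Python sketch (not any platform’s real code): the “algorithm” is just a fixed set of steps that filters posts by a user’s interests and ranks them by popularity.

```python
# Purely illustrative sketch: an "algorithm" as a fixed set of steps
# that filters and ranks a handful of posts.
posts = [
    {"id": 1, "topic": "baseball", "likes": 120},
    {"id": 2, "topic": "cooking", "likes": 45},
    {"id": 3, "topic": "baseball", "likes": 300},
]

def build_feed(posts, interests):
    relevant = [p for p in posts if p["topic"] in interests]          # filter
    return sorted(relevant, key=lambda p: p["likes"], reverse=True)   # rank

# "Present": the most-liked baseball posts come first.
print(build_feed(posts, interests={"baseball"}))
```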

Image: social network (https://www.pexels.com/photo/close-up-photography-of-yellow-green-red-and-brown-plastic-cones-on-white-lined-surface-163064/)

How Do Social Media Algorithms Work?

Social media algorithms are like little invisible guides that determine what content flows across our feeds. If you’ve ever wondered how your Instagram feed always seems to show you the stuff you want to see, here’s the answer.

Every bit of content is meticulously curated according to your past online behavior, interests, and recent actions. For example, if you’ve searched for and watched a lot of baseball-related content, you’ll continue to see more of it.

Social media companies invest a great deal of money and time in improving their algorithms every day. Why? Because that’s what pulls people back to the platform.

Here are some of the metrics that directly influence social media algorithms (a simplified scoring sketch follows the list).

  • User engagement: Comments, likes, and shares
  • Relevance: Hashtags and keywords that define the content’s context
  • Posting frequency: Consistent content creation boosts visibility
  • Interactions: Which accounts a user follows and how often they click through
  • Profile authority: Follower count, consistency, and engagement, which affect organic reach
  • Location: User demographics, used to surface relevant content to other matching users in the area
  • Content mix: Some platforms favor static images, while others, like Instagram, favor video
  • Virality: Popular content with the most shares
  • Watch time: The average time a user spends on a piece of content, like a YouTube video or an Instagram reel
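
Put together, these signals typically feed into some kind of scoring function. The Python sketch below is a hypothetical illustration only: the signal names, weights, and formula are invented for this example, and real ranking systems are machine-learned and far more complex, but the idea of combining many signals into one score is similar.

```python
# Hypothetical weighted-scoring sketch; weights and field names are invented.
WEIGHTS = {"engagement": 0.4, "relevance": 0.3, "watch_time": 0.2, "recency": 0.1}

def score_post(post, user):
    signals = {
        "engagement": post["likes"] + 2 * post["comments"] + 3 * post["shares"],
        "relevance": len(set(post["hashtags"]) & set(user["interests"])),
        "watch_time": post["avg_watch_seconds"],
        "recency": 1.0 / (1 + post["hours_old"]),
    }
    return sum(WEIGHTS[name] * value for name, value in signals.items())

def rank_feed(posts, user):
    # Highest-scoring posts appear at the top of the feed.
    return sorted(posts, key=lambda p: score_post(p, user), reverse=True)

user = {"interests": ["baseball", "cooking"]}
posts = [
    {"likes": 50, "comments": 4, "shares": 1, "hashtags": ["baseball"],
     "avg_watch_seconds": 30, "hours_old": 2},
    {"likes": 500, "comments": 60, "shares": 20, "hashtags": ["news"],
     "avg_watch_seconds": 12, "hours_old": 48},
]
print(rank_feed(posts, user))
```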

All this data is immensely valuable to social media companies because they sell advertising space to brands, and businesses pay these platforms to run targeted ads. It’s a lot like buying airtime for TV commercials, with one major difference: unlike TV, social media platforms can tailor ads so that only specific audiences see them.

Here’s a simple example of how social media algorithms work. If you follow the keyword “Media” on X (formerly Twitter), the platform will recommend posts containing the word “media” on your “Explore” page.
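
In code terms, that kind of keyword-based recommendation boils down to a simple text match. The snippet below is again a hypothetical sketch, not X’s actual system.

```python
# Hypothetical sketch of keyword-based recommendation (not X's real system).
followed_keywords = {"media"}

posts = [
    "Social media trends in 2024",
    "My weekend hiking trip",
    "A new media literacy course for teens",
]

# Recommend any post whose text contains a followed keyword (case-insensitive).
explore_feed = [
    post for post in posts
    if any(keyword in post.lower() for keyword in followed_keywords)
]
print(explore_feed)  # the two posts mentioning "media"
```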

How Can Social Media Algorithms Play Foul?

Research has shown that social media addiction adversely affects young children’s attention spans and natural learning ability. Algorithmic recommendations tailored to a user’s interests are the perfect recipe for a child who just won’t put the phone down.

And that’s a major concern. Too much social media is a precursor to multiple physical and mental health issues, like sleep disturbance, depression, anxiety, and negative self-image.

Schoolchildren and young teens are among the most frequent social media users. A 2023 study found that 1 in 5 U.S. teens are “constantly” on social media, on platforms like TikTok and YouTube.

Earlier this year, U.S. News reported that New York City and its school system had filed lawsuits against the tech giants behind Facebook, Instagram, TikTok, Snapchat, and YouTube. The lawsuits called the platforms “addictive and dangerous” and held them “responsible for the childhood mental health crisis disrupting learning and draining resources.”

Substance Abuse and Tragic Outcomes Linked to Social Media Algorithms

Beyond mental health, the rising number of adolescent drug overdose deaths has also been attributed to social media. Participant-based research has found a direct relationship between exposure to substance-related social media content and the use of drugs and alcohol.

Reportedly, the participants displayed “significantly higher odds of substance use,” particularly on days they were exposed to social media. Snapchat was one of the most frequently used apps in this case.

Most of these studies are cross-sectional and limited by recall bias. Still, TorHoerman Law notes that cases backed by solid evidence may be entitled to compensation on more than one ground, including therapy and medical expenses, pain and suffering, loss of quality of life, and emotional damage.

Earlier this year, Lawrence Riff, a Superior Court judge in Los Angeles County, California, ruled that the case against Snap Inc., the parent company of Snapchat, could proceed. The families claim that their children suffered tragic outcomes after purchasing fentanyl (an illegal synthetic opioid) through Snapchat.

The allegations include product defect, negligence, and even wrongful death. In effect, the Snapchat lawsuit is a direct challenge to the immunity tech giants enjoy under Section 230.

So, once again, social media algorithms are drawing negative attention. Because algorithms track a user’s behavior, they keep surfacing similar harmful content throughout the day. For young children and teens, this can be overwhelming, sometimes pushing them toward drastic decisions.

What Can You Do as a Parent to Protect Your Children Online?

The way social media algorithms are designed certainly calls for a rethink. But as a parent concerned about your child’s safety online, there’s a lot you can do. Here are some practical tips.

  • Educate your children on how algorithms work, and explain what to do if they encounter anything online that makes them feel bad.
  • Regularly review your children’s social feeds with them. Discuss who they follow and why, and keep an eye on the algorithm’s recommendations.
  • If you find anything troubling, reset your child’s social media feed using an app like Bright Canary, and help them unfollow specific accounts and topics.
  • Use parental controls on phones and tablets for platforms like Instagram, TikTok, and Facebook.
  • Limit screen time (e.g., 30 minutes of Instagram a day, or no social media after 10 PM). Once the limit is reached, the screen locks automatically.

Undeniably, social media has become a powerful and inseparable part of our lives. Beyond professional use, social media platforms greatly influence the way we talk, behave, think, interact, and share opinions. For children, the impact runs even deeper, and that’s why we need to be cautious.

The sooner our children are made aware of social media’s dark side, the better they will become at navigating its pitfalls. Shunning social media altogether is never a good idea. Instead, we should teach our children to strike the right balance. Once they learn to filter out the negativity, they can reap the benefits of social media.

Cover https://unsplash.com/fr/@helloimnik

Sean