OPINION: We must regulate social media algorithms

Two phones, one showing the Instagram interface and one showing the Reddit interface. The phones sit in front of a blue-to-white gradient.
(Natalie Bauer • The Student Life)

I often find myself checking Instagram before going to sleep, scrolling an Explore page that is, for one reason or another, filled almost entirely with food videos. It’s easy to rationalize losing 30 seconds of sleep to gamble on the next video, hoping it will be more entertaining than the last, over and over again.

There are three factors at play here: technology, entertainment and algorithms. It’s a combination of these three things that makes these late-night social media sessions a shared experience among our generation. 

Legislation has already been proposed to address this problem. The SMART Act, introduced in 2019, would ban infinite scrolling, abolish built-in reward systems and introduce stopping points on social media platforms. It’s a start but certainly not a complete solution.

Platforms like TikTok should be allowed to show users content they may like, but not to such an extent that users never want to turn away from their screens. Without legislation explicitly prohibiting algorithms this efficient, corporations have no reason to stop improving them.

Thus, regulations that limit the ability of social media algorithms to present agreeable content to users must be put into place. 

At their most basic level, social media algorithms use a user’s data to automatically curate content that user is expected to appreciate. The result is a positive feedback loop: the longer a user stays on an app, the richer their preference profile becomes, and the richer the profile, the better the algorithm gets at serving content that keeps them online even longer.

TikTok, for example, is able to do this through algorithms that determine what potential and existing users want based on users’ shared posts, watch durations of certain videos and liked content, among other factors. 
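To make the feedback loop concrete, here is a minimal, hypothetical sketch in Python. The signals (watch duration, likes, shares) mirror the ones named above, but the weighting scheme, tag-based profile and ranking function are invented purely for illustration; real platforms use far more elaborate models.

```python
from collections import defaultdict

def update_profile(profile, video_tags, watch_seconds, liked, shared):
    """Strengthen a user's preference profile from engagement signals.
    The weights here are arbitrary, chosen only to illustrate the idea."""
    weight = watch_seconds / 10 + (2 if liked else 0) + (3 if shared else 0)
    for tag in video_tags:
        profile[tag] += weight
    return profile

def rank_feed(profile, candidates):
    """Order candidate videos by how well their tags match the profile."""
    def score(video):
        return sum(profile[tag] for tag in video["tags"])
    return sorted(candidates, key=score, reverse=True)

profile = defaultdict(float)
# Watching one cooking video to the end and liking it...
update_profile(profile, ["cooking"], watch_seconds=30, liked=True, shared=False)
feed = rank_feed(profile, [
    {"id": 1, "tags": ["cooking"]},
    {"id": 2, "tags": ["sports"]},
])
# ...pushes the next cooking video to the top of the feed, which earns
# more watch time, which strengthens the "cooking" weight further.
# That is the loop closing.
```

The point of the sketch is that nothing in the loop ever reduces engagement: every signal only sharpens the profile, and a sharper profile only makes the feed harder to leave.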

Now, greater recommendation technology is not all bad. Amazon can show us products we want to buy without our having to search for them, LinkedIn can surface job opportunities that match our profiles and Google can correctly fill in what we want to search for. Improvements in algorithms for these purposes may actually make our lives more productive.

The difference is that social media platforms like TikTok generally only increase the desire to keep using TikTok. Amazon, LinkedIn and Google, on the other hand, can lead to tangible and productive results: a pragmatic purchase, a job and knowledge, respectively.

Akin to clips seen on “America’s Funniest Home Videos” or “Jackass,” many popular videos on platforms like TikTok offer a little schadenfreude humor and immediate doses of pleasure which, again, are not necessarily terrible in moderate quantities. 

But it is the nature of platforms like TikTok to get users to watch for extended durations. It is the nature of TikTok’s algorithm to present users fresh batches of videos which align with their previous preferences. And it is the nature of our having handheld devices that we’re going to be on them frequently. 

An outright ban on phones or on social media as a whole is unlikely (though such suggestions are not unheard of). The best way legislation can make substantial change, then, is by regulating the algorithms that curate social media content.

Specific limits should be placed on the number of user-targeted posts that appear in a user’s feed. If a user really wanted a specific genre of content or a niche community, they could still find it by searching key terms.

Additionally, social media apps should be required to show users content they are interested in at first, then transition to a more diverse selection of posts over the span of a 30-minute session. This would produce a natural decrease in usage over a reasonable duration.
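One way to picture the proposal above is as a schedule that caps how much of the feed may be algorithmically targeted as a session goes on. The 30-minute window comes from the proposal itself; the linear ramp and the 20 percent floor below are assumptions added only to make the sketch concrete.

```python
def targeted_fraction(minutes_in_session, ramp_minutes=30):
    """Fraction of the feed that may be algorithmically targeted.
    Starts fully personalized and ramps down linearly over the
    session; the 0.8 slope and 20% floor are illustrative choices."""
    progress = min(minutes_in_session / ramp_minutes, 1.0)
    return round(1.0 - 0.8 * progress, 2)

print(targeted_fraction(0))   # fully personalized at session start
print(targeted_fraction(15))  # about half targeted mid-session
print(targeted_fraction(30))  # mostly diverse after 30 minutes
```

Under a schedule like this, the feed would gradually stop reinforcing the user’s existing preferences, which is exactly the mechanism by which usage would taper off naturally.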

Ultimately, it is not social media that is bad, nor algorithms, nor pure entertainment. It is the use of algorithms in entertainment-filled social media that creates a situation in which one would be hard-pressed to find a reason not to place some form of restriction on the three together.

There is often a cacophonous buzz surrounding the future of AI, but far less discussion of its technological precursor: algorithms. As consumers of social media who avail ourselves of algorithms, we can start by being more conscious of these forces that urge us to keep scrolling to yet another post.

We can understand that these platforms are built to keep us stagnant and staring at the next photo, clip or text. It’s easy then to remember that you’re always free to leave.

Brian Lee CM ’24 is from Diamond Bar, California. He enjoys taking walks when it’s not 90 degrees and wearing Christmas sweaters out of season. 
