How social media algorithms are impacting mental health: Need for legal oversight


Social media has seamlessly woven itself into the fabric of our daily lives, allowing us to connect with others and share experiences. Uploading content and viewing the posts of friends and family can enhance our sense of connection. The digital landscape provides enhanced access to information, educational materials, social connections, and valuable opportunities for learning and creativity. Conversely, it can harm overall health by intensifying anxiety, depression, loneliness, and FOMO (fear of missing out), while fostering feelings of inadequacy and low self-esteem, particularly among teenagers and young adults who seek validation through likes and comments.

As of February 2025, 5.56 billion individuals worldwide were internet users, representing 67.9 per cent of the global population. Of this total, 5.24 billion individuals, or 63.9 per cent of the global population, were social media users. Contemporary social media platforms employ sophisticated algorithms to present users with the content most likely to elicit a response, thereby encouraging interaction. These algorithms curate content aligned with each user's preferences and past interactions in order to sustain engagement over extended periods, while the infinite scrolling function ensures a constant supply of new content, fostering a purposeless scrolling habit, particularly among the youth. However, the question remains: what is the optimal approach to managing our relationship with these platforms? A comprehensive analysis is essential to assess the influence of content algorithms, such as those utilised by Instagram and other social media platforms, on emotions and overall well-being. Despite the widespread usage of social media among children and adolescents, comprehensive independent safety studies examining its impact on youth remain absent. There is a consensus within the scientific community that social media can have both positive and negative effects on youth.
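To make the mechanism concrete, the following is a minimal, hypothetical sketch of engagement-driven feed ranking. Every name, signal, and weight here is an illustrative assumption; real platform rankers are proprietary and vastly more complex. The core idea it demonstrates, however, is the one described above: posts predicted to hold attention are surfaced first, with no term in the score for user well-being.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_like_prob: float   # model's estimate the user will react to it
    predicted_dwell_secs: float  # estimated time the user will spend on it

def engagement_score(post: Post) -> float:
    # Rank purely by expected engagement: content that keeps the user
    # reacting and scrolling is surfaced first. Weights are illustrative.
    return 0.6 * post.predicted_like_prob + 0.4 * (post.predicted_dwell_secs / 60)

def build_feed(candidates: list[Post], limit: int = 10) -> list[Post]:
    # The feed is simply the candidates sorted by predicted engagement.
    return sorted(candidates, key=engagement_score, reverse=True)[:limit]

if __name__ == "__main__":
    posts = [
        Post("calm_update", 0.2, 5.0),
        Post("provocative_clip", 0.9, 45.0),
        Post("friend_photo", 0.5, 10.0),
    ]
    for p in build_feed(posts):
        print(p.post_id, round(engagement_score(p), 2))
```

Run on these toy candidates, the most provocative post ranks first: a ranker optimised only for engagement will systematically favour whatever content is hardest to look away from.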

A Harvard study indicates that the dopamine release associated with social media use resembles that triggered by narcotics such as cocaine. The unpredictable nature of likes renders them all the more addictive. Excessive use of social media can significantly impact brain function, resulting in diminished attention spans, memory impairments, and cognitive failures. Mental health specialists are increasingly acknowledging and addressing the influence of smartphones and social media. Contemporary therapies frequently incorporate elements designed to reduce screen time, foster healthy digital practices, and tackle the detrimental thought patterns linked to social media usage. To mitigate these effects, it is essential to set limits on screen time and to remain mindful of the content being viewed. Most cell phones offer features and settings that adults can use to restrict usage and shield themselves from harmful content. It is equally essential to educate children on the proper use of technology and to inform them about the dangers of excessive screen time and social media engagement.

To prevent the dissemination of harmful content on their platforms, social media companies employ several methods to regulate posts, encompassing both human-assisted and automated techniques, including artificial intelligence (AI) tools. Concerns exist, however, that users can circumvent moderation efforts with ease, because moderation technology frequently struggles to identify harmful content and is outpaced by the growing prevalence of AI and bots on social media platforms. These constraints allow harmful material to spread, which can adversely affect the mental well-being of adolescents.
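The fragility of automated moderation is easy to demonstrate. Below is a minimal, hypothetical sketch of keyword-based filtering; the blocklist, normalisation rules, and examples are assumptions made for illustration, and production systems rely on far more sophisticated classifiers, yet they face the same adversarial cat-and-mouse problem.

```python
import re

# Illustrative blocklist; real systems use ML classifiers, not keyword lists.
BLOCKLIST = {"self harm", "eating disorder"}

def normalise(text: str) -> str:
    # Undo common character substitutions ("leetspeak") used to evade filters.
    subs = {"3": "e", "@": "a", "0": "o", "1": "i"}
    return "".join(subs.get(ch, ch) for ch in text.lower())

def is_flagged(text: str) -> bool:
    # Strip everything except letters and spaces, then match blocklist terms.
    cleaned = re.sub(r"[^a-z ]", "", normalise(text))
    return any(term in cleaned for term in BLOCKLIST)

print(is_flagged("tips on self harm"))   # True: exact match
print(is_flagged("tips on s3lf h@rm"))   # True: caught by normalisation
print(is_flagged("tips on ѕelf harm"))   # False: Cyrillic 'ѕ' evades the filter
```

A single lookalike character defeats the filter entirely, which is one reason moderation technology lags behind determined evasion even as platforms layer on more elaborate detection.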

The Digital Personal Data Protection (DPDP) Act, 2023, together with the DPDP Rules, 2025, emphasises the safeguarding of personal data, including children's online interactions. The new regulations do not explicitly address mental health; however, they aim to mitigate associated risks by enhancing online protections. Experts emphasise the necessity of obtaining consent from a parent or guardian, which requires rigorous verification of age and identity; dependence on self-declaration, however, raises concerns over adherence to the established regulations. Certain sectors, such as education and healthcare, are exempt from the parental-consent requirement, which could facilitate the tracking of children and the targeting of advertisements at them. The DPDP provisions primarily govern data processing, tracking, and advertising; they may nevertheless be unable to prevent recommendation engines from displaying harmful content unless such curation is categorised as behavioural monitoring.

One strategy for addressing the adverse impacts of social media on youth, adopted in some Western nations, is the imposition of access restrictions. Australia is leading this initiative with the enactment of the Online Safety Amendment (Social Media Minimum Age) Act 2024, which establishes a mandatory minimum age of 16 for accounts on specified social media platforms. The age restriction, set to take effect in 2025, aims to strengthen existing protections for young users, particularly against harmful content and features such as persistent notifications that disrupt sleep, raise stress levels, and erode attention. Similar efforts to limit young people's access to social media are being examined in the United States and the United Kingdom. In the US, New York has introduced innovative legislative measures designed to minimise children's exposure to social media algorithms. These include the Stop Addictive Feeds Exploitation (SAFE) for Kids Act, which requires social media companies to limit algorithmic feeds for users under 18 unless they obtain parental consent, and which restricts notifications to minors between midnight and 6 am. Moreover, the New York Child Data Protection Act forbids online platforms from collecting, using, sharing, or selling the personal data of individuals under 18 without informed consent or unless necessary for site functionality.
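To illustrate how such a statutory rule might translate into platform logic, here is a minimal sketch of the notification curfew described above: suppress notifications to users under 18 between midnight and 6 am absent parental consent. The function names, fields, and thresholds are assumptions for illustration, not the text of the statute.

```python
from datetime import datetime, time

CURFEW_START = time(0, 0)  # midnight
CURFEW_END = time(6, 0)    # 6 am

def may_notify(age: int, now: datetime, parental_consent: bool = False) -> bool:
    # Adults, and minors with parental consent, are unrestricted.
    if age >= 18 or parental_consent:
        return True
    # Otherwise, block notifications during the curfew window.
    return not (CURFEW_START <= now.time() < CURFEW_END)

print(may_notify(16, datetime(2025, 1, 1, 2, 30)))  # False: within curfew
print(may_notify(16, datetime(2025, 1, 1, 9, 0)))   # True: outside curfew
```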

Nonetheless, swift progress in technology has created a competitive dynamic between regulators and social media firms, as the speed of technological innovations and the complexity of algorithms have outpaced the ability to implement timely safety updates and regulations. Additionally, the ambiguity in current laws (for instance, permitting legal yet potentially detrimental content like targeted ads) and the absence of uniformity in regulations across countries further hinder attempts to reduce adverse mental health effects on youth.

As a result, there is an increasing concern that the largely self-regulatory governance approaches taken by social media platforms, along with the weaknesses of existing laws, are inadequate to address the threats to the mental health of young individuals. Consequently, there are demands for more stringent regulations, legislative changes, and enhanced oversight of the practices of social media companies. Furthermore, it is essential to ensure that these companies are held accountable for user safety, which requires broadening the definition of online harms to encompass concerns related to body image and technology-facilitated violence. Additionally, social media platforms ought to offer users, regulators, and researchers transparent information regarding promotional content in users' feeds and the design of their algorithms.





Disclaimer

Views expressed above are the author’s own.


