TIKTOK IS UNDER INVESTIGATION IN THE UNITED STATES FOR ITS DANGEROUS EFFECTS ON CHILDREN
What are the harmful effects of TikTok on the mental health of children?
A group of eight U.S. states launched an investigation on Wednesday, March 2, into the algorithms and marketing methods of the application, which is hugely popular with young people.
The American authorities want to examine the "techniques used by TikTok to encourage young people" to spend more time on the app, react to content, and interact with creators.
This social network, a subsidiary of the Chinese group ByteDance, is known for its short musical and parody videos, carefully selected by algorithms according to each user's tastes.
In the United States, the application is accessible to children under 13, provided they use a version modified for that age group. The issue has mobilized several states: California, Florida, Kentucky, Massachusetts, Nebraska, New Jersey, Tennessee and Vermont have issued a joint statement.
“Our children are growing up in the age of social media, and many feel the need to engage with these filtered versions of reality they see on their screens,” said California Attorney General Rob Bonta. “We know this has devastating effects on children’s mental health and well-being. But we don’t know what the companies themselves knew and how long they’ve known it.”
TikTok responded immediately to the investigation, promising to provide information about the many safety and privacy mechanisms “in place” for teens. Saying it is “very concerned with building an experience that is conducive to the well-being of [its] community,” the platform even praised the prosecutors’ initiative, appreciating that they are “focused on the safety of young users.”
TikTok is not the first platform to face such scrutiny: several U.S. attorneys general have already investigated Facebook’s parent company, Meta. In particular, they accuse it of promoting Instagram, which belongs to the same group, to younger users while ignoring internal reports about the harm the app can cause. Those documents were revealed in the fall by Frances Haugen, a whistleblower formerly employed by Facebook, now Meta.
Massachusetts Attorney General Maura Healey says that research “shows that Instagram use is associated with increased risks of physical and mental health harm to young people, including depression, eating disorders and even suicide.”
As a result, the attorney general believes that “Meta has failed to protect young people on its platforms and has instead chosen to ignore, and in some cases even reinforce, practices that pose a real threat to physical and mental health, thereby exploiting children for profit.”
Last year, Instagram suspended development of its under-13 version. But as tough as U.S. authorities have been on the big platforms in recent years, they have few quick, concrete remedies, given slow court proceedings and the difficulty of passing new laws.
The leaders of social platforms hold immense economic and political power, and by and large, admonitions and accusations from elected officials and prosecutors have so far had little tangible impact on the companies involved. Analyst Carolina Milanesi of Creative Strategies doubts that “TikTok has much to worry about. They’re going to have to go through the same routine as Meta, which is to detail their security features,” but “that won’t affect usage,” she believes.
Taking the example of Instagram, she recalls that the social network “explained that it did not create the content” because “it is the young people who put it online and watch it.” That is true, but it should not obscure these platforms’ “responsibility” for “content management.” “And this is where everything becomes blurred,” Carolina Milanesi concludes.