How Instagram Is Failing Its Youngest Users
The social media app recently unveiled new age verification measures intended to protect young users. If you registered an account as under 18 and attempt to change your age to 18 or older, Instagram will prompt you to verify that information. There are three ways to prove your new age: a legal ID such as a driver’s license, a video selfie, or vetting from three adults on the app, a process called social vouching.
New rules and protocols accompany Instagram’s updated age verification system. For social vouching, the three adults must respond to a verification request within three days, and no adult can vouch for more than one person at a time. For the video selfie, you must upload a clear video of your face to Instagram. The social media company then sends the footage to Yoti, a company that specializes in facial age estimation. Yoti estimates your age based on the video, and the clip is deleted once the assessment is complete.
Though Instagram’s new age verification rules are meant to keep children safe, the protocol currently applies only to users who try to change their listed age. There are no checks for anyone who previously signed up by falsely claiming to be 18 or older, so while the rules help younger users going forward, they do nothing to catch existing underage accounts. Instagram is attempting to provide a safer platform for its teenage users, but many have criticized the social media company’s broader lack of responsibility.
The recently implemented age verification protocol comes after a year of investigation into Instagram’s effect on teenagers. Attorneys general from several states, including California, Florida, and Massachusetts, began looking into Meta (Instagram’s parent company) and its child-safety practices. The officials believe that Meta knew about Instagram’s harm to children and teenagers, particularly to their mental health. They base their claims on various published reports, including the Wall Street Journal’s release of leaked internal documents. That series revealed that Facebook executives were aware of Instagram’s harm to young women’s self-esteem and body image.
Meta vehemently disputed these publications and the attorneys general’s investigation. After discrediting the release of the leaked documents, the company stated that its internal studies on body image found that some young people felt better about their bodies after using the platform. Its rebuttal, available on Meta’s website, argues that the Wall Street Journal’s reporting drew on data from only 40 teens, and claims those teens reported improvements in 11 out of 12 categories after using Instagram. Meta further argued that the publication took the study out of context and that the study didn’t measure important areas such as teenagers’ interpersonal relationships.
You don’t have to be a researcher or an Instagram expert to know that prolonged social media use can affect a person’s self-worth and body image. Instagram is clearly working toward a safer platform for its teenage users, but its new age verification protocol leaves much to be desired.