To me, Mallory was your typical, everyday all-American little girl.

Mallory Grossman was compassionate and quiet. She liked playing outdoors, making crafts and doing gymnastics. But by the time she was in sixth grade, she was being viciously bullied, both in person and over Instagram and Snapchat.

Now, I told Mallory she was allowed to have social media, and so I allowed her to have an Instagram account. Once the kids moved on to Snapchat, that’s when I put my foot down and said, “You’re not allowed to have Snapchat. I can’t monitor it.”

Mallory’s mom, Dianne, knew the dangers of these platforms and fought to protect Mallory from the worst of it. But her daughter’s bullies were merciless, and in June 2017, Mallory died by suicide. She was 12 years old.

By the time we found out about the online behavior, it was the day before she died. It was too late.

The adolescent suicide rate is on the rise. Seventeen percent of kids say they have been cyberbullied, most commonly on Instagram, Facebook or Snapchat. Many parents and psychologists alike point to the prevalence of smartphones and social media as a potential cause. Psychologist Jean Twenge says the year 2012 was a turning point.

It turns out 2012 is the year when the percentage of Americans who owned a smartphone crossed 50 percent. That’s when the smartphone gained market saturation. Around that same time is also when the majority of teens, and eventually a sizable majority, ended up using social media.

At the same time, national data sets revealed a major uptick in teen depression. More and more teens started to say that they felt left out and lonely. More started to say that they felt like they couldn’t do anything right or that they didn’t enjoy life. Among 10- to 14-year-old girls, the number admitted to the emergency room for self-harm, like cutting, has tripled in just a six-year period. Twenge admits that there’s no way to prove decisively which way the correlation runs.
After all, loneliness and depression could cause young people to spend more time on their phones. But Twenge certainly isn’t alone in thinking smartphones bear some of the blame.

We see now that the number one psychiatric disorder among teens and young adults is anxiety. And it’s being caused, really, I think, by the sort of obsessive nature of how we’re using the devices.

And today, kids have access to this world earlier than ever before.

Well, I got my first phone when I was in sixth grade.

My first smartphone, in sixth grade.

I was in fifth grade. Eleven years old.

I was really excited, actually, because all my friends had a phone and I didn’t, and I kind of felt lonely.

Today, the average age for a kid to get their first smartphone is 10.3 years old, and on average, kids open their first social media account at 11.4. By age 12, 50 percent of kids have some kind of social media presence. But technically, no one under the age of 13 should even be on these platforms.

Popular sites like Instagram, Snapchat, Facebook and TikTok supposedly don’t allow kids under 13 to sign up. That comes from a law passed in 1998 called COPPA, the Children’s Online Privacy Protection Act, which prohibits websites from collecting personal information from children under age 13 without verifiable parental consent.

Now obviously, many kids violate that and start going on those platforms at younger ages, and the tech industry overall has been quite pathetic when it comes to enforcing the age restrictions that COPPA has enshrined into law.

Snapchat, Facebook and TikTok require users to enter their birth dates before signing up. Instagram does not. None have mechanisms in place to ensure that kids aren’t lying about their age, and there’s no clear way to report underage users from within the apps themselves. To report kids under 13 on Instagram and Facebook, you need to fill out a form. For Snapchat and TikTok, you need to email their support teams.
Ultimately, the companies have little incentive to curb underage use. Kicking preteens off their platforms could reduce advertising revenue and make it more difficult to hook kids early. By the time they’re 13, many teens are already used to logging in every day.

I think they probably do want as much data as they can get, but within the boundaries of what they can get away with. I mean, it is a little cynical to say, but obviously they’re going to make more money the more data they have from all of us. So to this day, we cannot really count on the industry to regulate itself.

As long as tech companies can claim ignorance of underage users, they can’t get in trouble.

We know that the users are there and they are underage, but technically, no, the platforms are not breaking the law. The child, interestingly enough, is the one breaking the law, not the platform.

In a statement to CNBC, Instagram said, “When it comes to safety on Instagram, we’re always looking to do more, both to make sure those on Instagram are protected from bullying and also to make sure everyone on our platform is over 13. But the reality is, there’s just currently no way to quickly and reliably verify a user’s real age online.”

I have never seen a good solution to enforce a specific age to join a social media site. I can’t imagine how you would do it.

While it does pose a technical challenge, social media companies already use a mix of algorithms and human reviewers to identify and delete inappropriate content and to verify the accounts of public figures. So some question why they aren’t applying those same technologies to age verification.

It really is the central question: does Facebook owe a duty to verify that these kids are 12 or under? Do they owe a duty to say, yes, when they’re writing that they’re, you know, 15 years old, that they’re actually 15 years old? And that’s not really built into the law.

I think this is really a parental responsibility.
This is where you draw the thin line between the parent and the tech company and say, nope, this one belongs to parents.

So how do parents even know when a child is ready? Rosen says the age limit of 13 is totally arbitrary. It’s even different around the world. In Europe, under the new General Data Protection Regulation, kids under 16 need parental consent to join social media platforms, though enforcement there is equally weak.

People ask us all the time what age they should give their kids a cellphone, and the truth is, the parent has to make the final decision. But for me personally, I always say delay, delay, delay.

Common Sense Media recommends an age limit of 15 for Instagram and Facebook and 16 for Snapchat and TikTok, due to issues like language, sexually suggestive content and aggressive marketing tactics.

But healthy use at an appropriate age can give kids a platform for self-expression and make them feel more connected. Research shows that teens who use no social media are actually a little unhappier than those who use it in moderation. And if you take their word for it, 45 percent of teens say social media has no effect on their happiness, while 31 percent believe it has a mostly positive effect. It’s when smartphone use cuts into time spent interacting with friends in person, sleeping or studying that trouble starts brewing.

The Pew Research Center reports that 45 percent of teens say they’re online almost constantly, and by the time these teens are adults, they’re hooked: 88 percent of 18- to 29-year-olds have at least one social media account.

For parents who want to curb this pattern, there are numerous third-party apps they can use, such as FamilyTime, Norton Family Premier and Net Nanny. With these, parents can monitor and restrict their child’s smartphone use, as well as block certain apps or websites.

You know, I tell my parents, if your kid has a bachelor’s in Snapchat, you’d better have a master’s, or they don’t need to be using it.
But beyond what parents can do on their own, Dianne thinks the social media companies should use their influence to invest in educational initiatives around responsible online behavior. They know that children are misusing their app, and yet they’re worth billions of dollars. What I’d like for them to do is take those billions of dollars and invest it back in to mental health, digital responsibility, like do your part. Some major tech companies are taking steps. With the iOS 12 update, Apple rolled out Screen Time, a feature that lets users know how much time they’re spending on their phone and even allows parents to set limits on their children’s devices from their own phones. For their part, Instagram says they’re using machine learning technology to proactively detect bullying in photos and captions. And last year, they rolled out an offensive comments filter, which automatically hides comments containing attacks or threats. If the big rise in digital media use and social media is at the root of a sudden increase in teen depression, that means we can do something about it. Social media was not designed to replace the human connection. It was designed to improve the human connection, and we as a society have allowed it to replace it. We need to find a way to put the human back in our human connections.