SALT LAKE CITY — Facing increased scrutiny from lawmakers in Utah and around the country, representatives of TikTok's U.S. subsidiary met with local reporters in Salt Lake City on Tuesday morning to outline steps the social media giant has taken that they say will increase safety and privacy for young users on the app.
Several TikTok U.S. Data Security employees were joined by a pair of Utahns who monetize the content they share on the app, both of whom said the platform, known for popularizing short-form videos, is key to their livelihoods.
JT Laybourne, a Farmington resident whose lip-synch videos have helped him build a following of 1.7 million people, said the platform enabled him to support his family while doing something that makes him happy.
"I went from not being able to spend much time with my wife and kids because I worked a lot ... now it's the exact opposite," he said. "I get to spend so much time with them because of TikTok."
The event, titled "TikTok Sparks Community," is one of several similar discussions the company has hosted in at least 16 states, with a stop in Houston scheduled for later this week.
While the app has exploded in popularity with 170 million U.S. users, it has been a growing target for lawsuits around the nation, including from Utah. The Beehive State has filed a pair of lawsuits against TikTok, accusing the platform in June of operating "in part like a virtual strip club" and allowing for the exploitation of young users.
And despite the efforts on TikTok's part, a state representative who helped craft Utah's first-in-the-nation social media regulations says he believes the steps don't go far enough.
"We have worked closely with these social media companies in trying to pinpoint what exactly is causing the harm, and I'm excited to see the social media companies have taken that to heart and are making changes," Rep. Jordan Teuscher, R-South Jordan, told KSL.com on Tuesday. "With that being said, they aren't going far enough. ... I'm really excited that they're moving in a good direction and trying to protect youth. They just need to do more."
TikTok's safety features
TikTok moderates content and enforces safety at several levels, according to Suzy Loftus, the head of trust and safety for TikTok U.S. Data Security. In addition to community guidelines that dictate what kinds of posts are prohibited on the platform, Loftus said the company limits certain types of content from being promoted on users' "For You" page and restricts some content from being viewed by users between the ages of 13 and 17.
"Oftentimes, people will think their experience on TikTok is the same thing a 13-year-old or 15-year-old is experiencing on TikTok," she said, but added that is not necessarily the case.
That experience differs by age, she said. For users between the ages of 13 and 15, direct messaging is not allowed, accounts are private by default, the algorithmically fed For You page is unavailable, screen time is limited to 60 minutes per day and notifications are disabled overnight beginning at 9 p.m.
Users ages 16 or 17 have direct messaging turned off by default, can choose to make their accounts public and are eligible for the For You page. They also have a default screen time limit of one hour, with notifications disabled starting at 10 p.m.
A separate "family pairing" feature allows parents to link accounts with their children to customize those and other privacy features. Many of the safety features are available for adult users but are not enabled by default.
TikTok implemented the family pairing and screen time features in April 2020 and enabled the sunset hours for push notifications for teen users in August 2021, according to a company spokesperson.
The second creator, Adams-Wheatley, whose 4.7 million followers tune in for her makeup tutorials and wedding videos, spoke highly of a feature that allows her to hide bullying and offensive comments on her posts.
"I didn't have to see the trolls all the time trying to make content about me, to make fun of me and bring me down for their own gain," she said.
Asked why the public information tour is happening now, weeks after a partially redacted lawsuit revealed that TikTok employees were aware that some features were harmful to young users and that measures to limit screen time had been largely ineffective, Loftus said the visit has been in the works for a long time.
"We're working to become the most trusted platform, and so trust-building means showing up and also being accountable for ... what we're doing and just making sure that we are on the ground in places where people love TikTok and that we're sharing more about what our approach is and also listening and learning at the same time," she said.
Lawmaker wants more done
While Teuscher is pleased to see some of the steps that TikTok and other platforms have taken, he said the efforts don't address what he sees as the primary drivers that make social media addictive to children: endless scrolling feeds, push notifications and autoplay videos.
These "engagement-driven designs" are what he and state Sens. Mike McKell, R-Spanish Fork, and Kirk Cullimore, R-Draper, have tried to tackle through a pair of bills meant to rein in social media companies. Although one of the bills was temporarily blocked after a tech industry group challenged its legality, Teuscher said its companion — which aims to make it easier to sue social platforms for alleged harms to teens — is still in effect.
Asked whether TikTok would limit users' ability to endlessly scroll through its feeds, Loftus said it's an "ongoing discussion" but pointed to the screen time limit as a solution.
"I think we've really looked at safety by design in the way that we handle these features, to give people the most opportunity to customize their experience and to deal with issues like that," she said.
Teuscher accused social media companies of acting like "Dr. Jekyll and Mr. Hyde" when working with the Utah government, saying they "seem to be in good faith, wanting to work with the state in trying to do whatever they can to protect minors, while then on the other side, they use their industry groups to come after the state and sue us and try to undermine the laws that we're putting in place — that we worked hand-in-hand with them to put in place."
"I would like to believe that they're working in good faith," he said. "I just think that the incentives are still out of whack, and there's too much money on the table for them to really do what they need to do to protect kids. And until the state is able to help balance that out and provide the right incentives to social media companies, they're just not going to have enough incentive to do what's right."