ELLY GRIMM
• Leader & Times
The need for protection of children while accessing social media sites and other parts of the Internet has long been a hot topic and soon, some changes could be coming regarding just that.
A jury found last Wednesday that social media company Meta and video streaming service YouTube harmed a young user with design features that were addictive and led to her mental health distress, according to a recap of the case from the New York Times. The finding reinforces a legal theory of how social media sites or apps can cause personal injury and is likely to factor into similar cases expected to go to trial this year.
Alexandra Ryabova, COO of the social media app Wizz, said the ruling is a positive step forward.
“Anyone working in the social media field was watching this unfold every day, and when the verdict came out, it was incredibly interesting because for the first time, we heard everything the child experts had been saying for years confirmed,” Ryabova said. “I think this is very important, and it’s a day people should mark as a day when child safety took a step up on social media. It means a lot for the safety of social media users. This is something that will start changes in the coming weeks and months, and it’s also going to impact the companies a lot, because the companies are also going to be under big scrutiny, and it’s good those changes will be happening.”
Even though the verdict is still rather new, Ryabova said there are many changes she would like to see happen soon.
“I would like companies to allocate better teams and resources to child safety. I think it’s absolutely normal for companies to want their products, their tools, their apps to be captivating, but these apps should not be addictive, and I think it’s extremely important safety is built into platforms from the ground up, inside the overall architecture,” Ryabova said. “I do not think there is one tool or one thing that can ensure safety on its own. I would love to see multiple layers of security implemented in every app that has minor users. I’m looking forward to more responsibility, and I would want companies to think of safety from day one and take accountability and responsibility for what they’re doing toward their users. I also think implementing age verification mechanisms is a big factor that is going to play a big role in the near future.”
Ryabova said Wizz has already had multiple safety precautions in place for many years.
“Wizz was created in 2019, when the people who created it were looking into the loneliness pandemic that was happening in the market, and they were thinking about how they could tackle this loneliness pandemic within this young generation,” Ryabova said. “They said ‘If we are going to work with the younger generation, we need to implement safety mechanisms and age verification mechanisms from the beginning.’ In 2019 we started working with Yoti, which is a tool that scans your face and is able to say approximately what age you are. Wizz is available to users 16 years old and older. We require an ID, and then when you enter the app, we pre-moderate all the content, which means when User A sends User B a message, that message is passed to a third party we work with. We also use a third party for image moderation, meaning every image that is being sent on the platform is sent to them, and that service determines whether an image is harmful or fine. All of the pre-moderation happens for text, video, audio, and any other type of content. When you’re dealing with minors, that’s an age when you’re trying to figure out who you are and interact with others your age. We want to make sure we stay authentic, there are no bots, and there are no harmful algorithms, and that takes multiple layers of help and security. Those layers are quite simple, but they’ve helped build Wizz into something safe we can give to that younger audience.”
When it comes to the safety of minors online, Ryabova said no corners should be cut.
“We live in a world which is tough, and I wish every young adult and every adult would have to take a course on how to use social media that emphasizes keeping certain information private and how to stay safe,” Ryabova said. “But we do not have those classes, and if we do not have parents to talk to us about those subjects, we do not know how to interact on social media. I strongly believe it’s also the responsibility of platforms to build a safe space where people can interact. I think this is the first step toward bringing to light the responsibility platforms carry towards their users’ well-being.”
Ryabova added there are multiple reasons why it took so many years for this issue to come under such a bright spotlight.
“This is actually the first generation that is growing up with a phone in their hand, or a tablet of some type,” Ryabova said. “My younger sister, she’s a total iPad kid, she loves games and all of those other features. There is also definitely the question of what resources companies are willing to allocate towards safety. I think a lot of companies neglect that – however, with Wizz, we put 50 percent of all of our operational costs towards trust and safety, because we know we have minors on the platform. I also think it took several people who came out and who were willing to speak about this for this trial to go that far.”
While the recent ruling is a good step, Ryabova said there is still plenty of work to be done not only by the companies involved but also by parents.
“It’s not a one-size-fits-all situation for all companies. There are different teams who know how to manage one thing differently from other teams. There are different resources and different amounts of money these companies will be willing to put in on this subject,” Ryabova said. “Every company is going to have to find how they want to verify users’ age and how they protect their users differently. I’m very excited to see what’s going to happen in the industry in the coming months. I feel terrible for all the people coming out about their addictions and saying they had such negative experiences. It’s terrible they had to come to this point, especially at such a young age. On the other side, I am proud of them for speaking up, and I’m happy this subject has been brought to the table, so that’s a brilliant step forward. Parents will also have to have those conversations with their children about how to use social media and the importance of being vigilant with those types of interactions so they stay safe online.”