Smartphones and social media have often been called our era’s cigarettes: an addictive, destructive habit. Children are particularly susceptible to the corruption permeating the digital world, whether they’re soaking in misinformation or battling a barrage of bullying and abuse.

But while the internet can be toxic, millions of young people have only ever known a world in which they interact with fellow humans through MySpace, Facebook, Instagram, YouTube, Twitter, Snapchat, WhatsApp, and countless other social platforms. Parents who try to keep their offspring from using such services face an uphill battle from the outset, so the onus has increasingly fallen on technology companies to make the internet a more pleasant place.

Most of the companies behind these platforms have sought to address a growing tech backlash by introducing features that claim to fix at least some of the problems. Instagram, for example, now uses AI to warn users before they post offensive text in hopes that they’ll reconsider. Elsewhere, Alphabet offshoot Jigsaw is working with publishers on technology that enables users to filter out abusive comments.

Deep-pocketed tech giants have the resources and AI expertise to at least attempt to address the problem, but smaller companies … not so much. This is where Swiss startup Privately is hoping to carve out a niche — by giving developers and device makers the tools to add well-being and safety features that protect children in the digital age.

Private eye

Founded in Lausanne, Switzerland, in 2014, Privately makes software that integrates with any app, enabling it to detect safety markers in children’s online communications and provide guidance around cyberbullying, privacy, and more.

A few years back, Privately launched a consumer-focused mobile app called Oyoty as a demonstration of its technology. In its original guise, Oyoty linked to users’ social accounts (Facebook, Instagram, and Twitter), analyzing posts for “problems” and intervening when necessary. Its automated bot might recommend that a child not share sensitive information, such as a phone number, or alert them if they’re showing too much skin in a photo.

Above: Oyoty in action

Oyoty has evolved in the intervening years, and it’s still being developed in a handful of European markets. Privately told VentureBeat that the company is currently in “advanced discussions” with a major device manufacturer to provide a customized version of Oyoty on its hardware in several countries.

To achieve true scale, however, Privately is pushing into the B2B sphere to make its underlying technology available to anyone and everyone. Its Online Wellbeing and Safety (OWAS) tech has been available since September, and the BBC was among the first third-party organizations to tap the technology when it launched the Own It mobile app and keyboard in the U.K.

The general idea behind Own It is that children install the app and make it the default keyboard on their device (presumably at their parents’ insistence). The kids then receive warnings, prompts, and real-time advice whenever the app detects something untoward.

For example, if someone types the words “I’m feeling suicidal” into Google Search, a little message on the keyboard will pop up telling them to talk to someone who can help, along with a free support number to call.

Above: BBC Own It: Feeling suicidal?
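
To give a rough sense of what such a check might look like under the hood, here is a minimal sketch of an on-device phrase check in Python. The phrase list, support message, and function name are placeholders, not Privately’s actual implementation.

```python
# A hypothetical sketch of the kind of on-device check a safety keyboard might
# run as text is typed. The phrase list, support message, and function name
# are placeholders; Privately has not published its implementation.

import re
from typing import Optional

SELF_HARM_PATTERNS = [
    r"\bfeel(?:ing)?\s+suicidal\b",
    r"\bwant\s+to\s+(?:hurt|kill)\s+myself\b",
]

SUPPORT_MESSAGE = (
    "It sounds like you're going through a hard time. "
    "You can talk to someone who can help on the free support line."
)

def check_typed_text(text: str) -> Optional[str]:
    """Return a supportive prompt if the text matches a self-harm pattern."""
    lowered = text.lower()
    for pattern in SELF_HARM_PATTERNS:
        if re.search(pattern, lowered):
            return SUPPORT_MESSAGE
    return None  # nothing detected; the keyboard behaves normally

print(check_typed_text("I'm feeling suicidal"))  # -> the support message
```

A check like this can run entirely on the device, so the typed text never has to leave the keyboard process.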

Elsewhere, the BBC’s Own It keyboard can detect when a user writes something mean and ask them to think again before posting.

Above: Companies can use Privately to integrate anti-abuse technology into their apps
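
A heavily simplified version of that “think again” nudge might look something like the sketch below. Production systems rely on trained language models rather than word lists; the vocabulary and threshold here are purely illustrative.

```python
# A hypothetical sketch of a "think again" nudge driven by a crude toxicity
# score. Real systems use trained language models; the word list and threshold
# below are illustrative only.

from typing import Optional

TOXIC_WORDS = {"stupid", "loser", "ugly", "idiot"}

def think_again_prompt(message: str, threshold: int = 1) -> Optional[str]:
    """Count flagged words and nudge the sender to reconsider before posting."""
    words = {word.strip(".,!?").lower() for word in message.split()}
    if len(words & TOXIC_WORDS) >= threshold:
        return "This might come across as hurtful. Want to rethink it before you send?"
    return None  # nothing flagged; the message goes through untouched

print(think_again_prompt("You're such a loser!"))  # -> the reconsideration prompt
```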

Similarly, if someone tries to share personal information such as an email address or telephone number, the app will prompt the user to reconsider.

Above: Privately warns users about sharing too much information
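
Again as a rough, hypothetical illustration, a personal-information check of this kind could be sketched with simple pattern matching. The regular expressions below are deliberate simplifications, not Privately’s detection logic.

```python
# A hypothetical sketch of a personal-information check. The regular
# expressions are deliberate simplifications, not Privately's detection logic.

import re
from typing import Optional

PII_PATTERNS = {
    "an email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "a phone number": re.compile(r"\b(?:\+?\d[\s-]?){7,15}\b"),
}

def pii_warning(text: str) -> Optional[str]:
    """Return a gentle prompt if the text appears to contain personal details."""
    for label, pattern in PII_PATTERNS.items():
        if pattern.search(text):
            return f"It looks like you're about to share {label}. Are you sure?"
    return None  # no personal details spotted

print(pii_warning("Text me on 07700 900123"))  # -> warns about a phone number
```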

The BBC Own It app helps demonstrate some of the ways Privately’s technology can be used by developers. And Privately is planning to broaden its horizons to include other forms of digital communication — including voice.

“Voice is a top priority, as it is slowly replacing text in many environments,” Privately CEO and cofounder Deepak Tewari told VentureBeat. “We will have at least some features around voice in 2020.”

The Oyoty app also offers a glimpse into how Privately’s underlying AI can be applied to visual forms of communication, including its ability to detect “too much skin” in a photo.

Privately said it has trained its systems to “understand a number of modalities” to determine what may or may not be inappropriate. It takes into account the context of a photo — such as whether it was taken indoors or outdoors and whether it’s an individual or a family snap — to “interpret whether an image might be provocative,” Tewari said.

Above: Oyoty demonstrates how Privately’s technology can be used to prevent children from over-sharing personal information.
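
The context-weighing Tewari describes could, in very rough terms, be imagined as combining several signals into a single sharing decision. The classifier outputs and weights in the sketch below are assumptions; Privately hasn’t disclosed how its image analysis actually scores context.

```python
# A hypothetical sketch of combining several image signals into one decision.
# The upstream outputs (skin exposure, indoor/outdoor, number of people) are
# assumed; Privately has not disclosed how its image analysis weighs context.

from dataclasses import dataclass

@dataclass
class ImageSignals:
    skin_exposure: float  # 0.0 (none) to 1.0 (high), from an assumed vision model
    is_outdoors: bool     # assumed scene-classifier output
    people_count: int     # assumed person-detector output

def flag_photo(signals: ImageSignals, threshold: float = 0.6) -> bool:
    """Weigh context cues rather than relying on skin exposure alone."""
    score = signals.skin_exposure
    # An outdoor group shot (e.g. at the beach) is treated as less sensitive
    # than an indoor photo of one person with the same exposure level.
    if signals.is_outdoors:
        score -= 0.2
    if signals.people_count > 1:
        score -= 0.1
    return score > threshold

print(flag_photo(ImageSignals(skin_exposure=0.7, is_outdoors=False, people_count=1)))  # True
```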

Privately is quick to stress that its technology is deployed on the device itself, which not only helps deliver speedy, real-time guidance but also boosts its privacy credentials, given that data isn’t transferred to a remote server. With privacy-focused regulations such as GDPR taking hold, this could be a big selling point for app developers.

“The privacy-preserving AI element of the Own It application that was developed through this collaboration [with the BBC] is an industry first,” Tewari added.

Privately has so far been supported by a Swiss angel investor, in addition to a grant that helped fund its R&D base in Switzerland. The company is gearing up to raise funding from U.K. investors in early 2020 and has opened an office in London to that end.

In terms of its business model, Privately said it garners revenue chiefly by licensing its technology to companies, which can pick out the features they wish to use. For an additional fee, Privately will also customize the technology to meet specific client requirements.

It’s still early days for Privately, but the startup is already working on various proofs of concept with gaming companies, as well as charities, antivirus vendors, and telcos. Earlier this year, Privately was invited to become a member of the Fair Play Alliance, a working group of gaming companies dedicated to improving gamers’ experiences. Tencent also invited Privately to participate in the development of new protection standards for minors.

“There has been very strong interest overall,” Tewari said.

Well-being

At its core, Privately is tapping into a growing digital health trend. Back in July, Instagram introduced a new automated feature that warns users before they post abusive or bullying comments beneath photos or videos, and this feature was recently expanded to captions.

But reducing toxicity is only part of the picture — Privately is also focusing on well-being at the individual level.

“While toxicity and hate speech is a big problem ailing the internet, we see our focus more broadly on the subject of digital well-being,” Tewari continued. “To that end, we will develop technologies that understand the digital environment as well as the user better and provide personalized assistance to users to have a net positive relationship with technology.”

This is an area the big mobile platform providers are investing in heavily, with both Google and Apple launching various tools to improve people’s relationships with their digital devices. And it will likely become an increasingly important focus for tech companies across the spectrum. Privately is betting it can improve the quality of time spent online by providing an extra layer of protection for children, with AI filling in the gaps for parents who can’t monitor their offspring 24/7.

