Gemma Bloemen 3 October 2023

Welcome, Unitary AI

We all generate content on the internet every day, whether on social media, a forum, a marketplace, or a dating app. But not all of this content is welcome.

According to a recent report from the Alan Turing Institute, two thirds (66%) of adults said they had witnessed harmful content online, while among participants aged 18-34 the figure was 86%. The youngest age bracket reported the highest exposure to harm, with 41% of 18-24 year olds indicating that they had encountered harmful content many times.

As more and more companies host user-generated content and report harmful material, the UK and EU have introduced regulation aimed at exerting greater control over the content that platforms showcase. However, the volume of content uploaded to the internet every day (estimated at over 80 years' worth in video alone) is not something that regulation alone will solve.

This problem is large and growing rapidly, especially as artificial intelligence generates ever more content. More and more companies are managing and hosting user-generated content online, and moderating these videos, communities, and messages is key to ensuring user and brand safety. This is why we are so excited to announce the latest member of the Creandum family: Unitary AI.

About Unitary AI

When we first met Sasha Haco and James Thewlis in 2021, they were at the beginning of their journey. Over the past two years we have been amazed by the thoughtfulness of their approach to building a complex business in the AI-regulation space, the rapid growth of their commercial business, and the deep technical knowledge they both bring. This was the right time for us to partner with them and help make the internet a safer place.

Unitary is developing AI technology that understands video the same way that humans can. By simultaneously analyzing multiple signals, its machine learning technology is capable of understanding the content of videos and images, as well as the context in which they appear. 

Unitary has built a multi-modal AI model which can accurately label any type of content, focusing especially on video and images. Unitary is already working with some of the world’s leading brands, ensuring that their platforms are kept safe.

The team

Sasha and James met on Entrepreneur First, and they bring deep technical expertise to the problem they are solving. Sasha holds a PhD in Physics from Cambridge University, where she worked with Stephen Hawking on the black hole information paradox; James holds a Computer Vision PhD from Oxford and experienced the content moderation problem first hand during stints at Reddit and Facebook. They lead a lean team of 40 talented individuals passionate about building the world's best safety and moderation AI software.

Unitary has also made some impressive additions to its board: Carolyn Everson, formerly a senior executive at Meta and a board member at Walt Disney and Coca-Cola; and Ian Hogarth, co-founder of Plural, who now chairs the UK Government's Frontier AI Taskforce.

About the round

The Creandum funds led Unitary AI's $15 million Series A round, with participation from Paladin Capital and existing investor Plural. Unitary will use this investment to bolster its research and development programmes, scale its team, and forge deep partnerships with the world's leading social platforms and brand safety organizations.

If you are as excited as we are about Unitary's mission and growth to date, they are hiring for commercial and technical roles - check out their job board and reach out to us should you have any questions.

Sasha Haco and James Thewlis
