Deepdub raises $20M for A.I.-powered dubbing that uses actors’ original voices

Netflix’s Korean drama “Squid Game” was one of the most-watched dubbed series of all time, proving the massive potential for foreign-language programming to become a hit in overseas markets. Now, a startup called Deepdub is capitalizing on the growing demand for localized content by automating parts of the dubbing process using A.I. technology. With its end-to-end platform, Deepdub can cut the time it takes to complete a dubbing project, allowing content owners and studios to get results in weeks instead of months.

What’s more, it does this by using just a few minutes of the actors’ voices — so the dubbed version sounds more like the original.

The Tel Aviv startup has now closed on $20 million in Series A funding for its efforts, led by New York-based investment firm Insight Partners.

Existing investors Booster Ventures and Stardom Ventures also participated in the round, alongside new investor Swift VC. Deepdub was additionally backed by several angels, including Emiliano Calemzuk, former President of Fox Television Studios; Kevin Reilly, former CCO of HBO Max; Danny Grander, co-founder of Snyk; Roi Tiger, VP of Engineering at Meta; plus Gideon Marks and Daniel Chadash.

The company was founded in 2019 by two brothers, Ofir and Nir Krakowski, whose backgrounds included machine learning and A.I. expertise.

The older brother, Ofir, “basically founded the machine learning division of the Israeli Air Force,” explains his younger brother Oz Krakowski, who’s also Deepdub’s CRO, having joined the startup at a later stage. (Ofir held positions in the IAF’s Ofek unit including head of data science and integration, chief architect and CTO of the A.I. branch, and A.I. research and innovation manager.)

The team’s youngest brother, Nir, meanwhile, has some 25 years of technology R&D expertise, including in cyber security roles, and had previously co-founded the Y Combinator-backed web gateway Metapacket.

Entrepreneurial in nature, the brothers had been looking for a new business where they could leverage the knowledge they acquired over the years in a way that would bring the most value to consumers, says Oz. They landed on what became Deepdub after having conversations with several people in the industry.

With Deepdub, the aim is to bridge the language barriers and cultural gaps of entertainment experiences using advanced A.I. technologies and an end-to-end platform for content creators, content owners, and distributors. That means Deepdub isn’t just involved in the actual dubbing itself — it supports all the other aspects of a dubbing project, including translation, adaptation, and the final mix. In other words, it’s not just an A.I. platform; it’s a full business that includes human experts at every step along the way to oversee the work and make corrections as needed.

But Deepdub’s use of A.I. and machine learning is what makes it a unique solution in this space.

Where a traditional dubbing process may take 15 to 20 weeks to convert a two-hour movie into another language, Deepdub can wrap the same project in about four weeks. To accomplish this, Deepdub first takes two to three minutes of the original actors’ voice data and uses that to create a model that translates the characteristics of the original voices into the target language. And Deepdub’s A.I. voices can “scream, shout, and do all those things that are very complicated for A.I. voices in general,” Oz notes.

“We basically cracked something that has not been done so far,” Oz adds. “You and I will not be able to tell this is a machine. This will entirely sound like a human voice.”
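Deepdub hasn’t disclosed how its models work, but the general pattern the company describes — cloning a voice from a short reference clip, then synthesizing translated lines in that voice — can be sketched with open-source tooling. The example below is purely an illustrative stand-in using Coqui’s XTTS v2 model; the model name, file names, and sample line are assumptions made for the sketch, not anything Deepdub has revealed about its own stack.

```python
# Illustrative only: a generic voice-cloning TTS pipeline, not Deepdub's system.
# Requires the open-source Coqui TTS package (pip install TTS).
from TTS.api import TTS

# Load a multilingual, zero-shot voice-cloning model.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A short clip of the original actor serves as the voice reference,
# loosely analogous to the "2 to 3 minutes" of voice data described above.
tts.tts_to_file(
    text="Hola, ¿quieres jugar un juego conmigo?",  # translated line (hypothetical)
    speaker_wav="original_actor_reference.wav",     # reference audio (hypothetical path)
    language="es",                                  # target dub language
    file_path="dubbed_line_es.wav",
)
```

In a production pipeline, translation, adaptation, emotional performance, and mixing all sit around a core synthesis step like this one — which is where the human experts Deepdub keeps in the loop come in.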

The details as to how this process is being accomplished are the startup’s secret sauce — in other words, they’re not saying, beyond noting they’ve jumped ahead of the published academic research on the matter. The proof, Deepdub claims, is in the output, the investor backing, and the studio relationships it’s gathered.

For example, Deepdub recently entered into a multi-series partnership with streaming service Topic.com to dub its catalog of foreign TV shows into English. Deepdub also became the first company to dub an entire feature-length film into Latin American Spanish using A.I. voices (“Every Time I Die”). And now, Deepdub says it’s working with both small and large Hollywood studios on projects, but isn’t able to say which ones due to non-disclosure agreements.

There is, however, much debate over whether viewers should enjoy foreign language films and shows in their original language with subtitles, or the dubbed version. Netflix’s “Squid Game,” for example, may have seen a lot of dubbed streams, but there was controversy around how the dubbed version lacked accuracy when compared with the original Korean dialogue. Even “Squid Game’s” creator recommended that viewers watch the subtitled version instead.

One of the issues is that dubbed versions try to match the language to the movement of the actors’ lips so as not to detract from the viewing experience. But there’s an art to this — and it can be complicated to get right. Some dubs have to be stretched out or cut shorter, using different words and phrases, so that the dubbed speech lines up with the actors’ mouth movements, and this can slightly change the meaning of what was said.
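One small, concrete piece of that timing problem can be shown with a simple audio time-stretch: once a line has been re-voiced, it still has to fit the duration of the original delivery. The sketch below uses the open-source librosa and soundfile libraries, with hypothetical file names; as the paragraph above notes, real dubbing workflows also rework the words themselves, not just the audio.

```python
# Illustrative sketch: fitting a dubbed line into the original line's duration
# with a uniform time-stretch. File names are hypothetical.
import librosa
import soundfile as sf

orig, sr = librosa.load("original_line.wav", sr=None)  # the actor's original delivery
dub, _ = librosa.load("dubbed_line_es.wav", sr=sr)     # the re-voiced, translated line

orig_dur = librosa.get_duration(y=orig, sr=sr)
dub_dur = librosa.get_duration(y=dub, sr=sr)

# rate > 1 shortens the dub, rate < 1 lengthens it,
# so the result matches the length of the on-screen mouth movement.
rate = dub_dur / orig_dur
fitted = librosa.effects.time_stretch(dub, rate=rate)

sf.write("dubbed_line_es_fitted.wav", fitted, sr)
```

A uniform stretch like this is the bluntest possible tool — push the rate too far from 1.0 and the line starts to sound unnatural, which is exactly why translators rework the phrasing instead.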

Oz, of course, argues that the dubbed version is better than reading subtitles.

“Some people are not as fluent in reading,” he points out. “And reading subtitles makes you look at the bottom of the screen…with subtitles, you find yourself sometimes rewinding just to watch what really happened because you missed it,” he says.

In addition, the demand for dubbed content is growing as the streaming industry becomes more competitive. Being able to more easily convert titles into other languages can help expand a platform’s offerings without requiring direct investment in the production of new, original content or in the acquisition or licensing of other studios’ titles. It can provide more value from an existing catalog by allowing titles to reach global audiences.

This trend is on the rise, too. Recently, Netflix COO and Chief Product Officer, Greg Peters, noted the streamer had dubbed some 5 million run-time minutes of content in 2021 and had subtitled 7 million. “At that scale, we’re learning […] how to make that localization more compelling to our members,” he said.

“We are accelerating to a world where A.I. is now augmenting humanity’s creative potential,” said George Mathew, Managing Partner at Insight Partners, who’s joining Deepdub’s Board of Directors with this round. “As the media industry continues to globalize, we see Deepdub’s AI/NLP-based dubbing platform as essential in scaling great content to audiences everywhere. We believe Deepdub represents the next great leap forward in global content distribution, engagement and consumption,” he added.

The startup said it will use the funds to double its current team of 30 full-time employees, most of whom are based in Tel Aviv. It’s hiring in sales and marketing to help increase brand awareness and global market reach, as well as researchers and engineers to improve its A.I. engine and further develop its platform.