DeepL deploys new Nvidia chips to translate whole internet in 18 days
In a major leap for AI translation, DeepL announced Wednesday that it's rolling out seriously powerful Nvidia hardware that will let it translate the entire internet in just 18 days. That's pretty mind-blowing when you consider the same job used to take 194 days. Talk about a speed boost!
The German startup, now valued at a cool $2 billion, has developed its own AI models for language translation that go head-to-head with Google Translate. But what's really interesting here is how Nvidia's expanding beyond just supplying chips to tech giants like Microsoft and Amazon.
DeepL is using what's called a DGX SuperPOD system, which is fancy tech speak for a ridiculously powerful computer setup. Each rack contains 36 GB200 Grace Blackwell Superchips, some of Nvidia's newest toys on the market. These chips aren't just nice-to-haves; they're absolutely essential for training and running the massive neural machine translation models that DeepL has built.
"The idea is, of course, to provide a lot more computational power to our research scientists to build even more advanced models," Stefan Mesken, chief scientist at DeepL, told CNBC in an interview.
So what's the point of all this processing muscle? DeepL is looking to beef up its translation accuracy and enhance products like Clarify, which launched earlier this year. Clarify is pretty clever: it asks users questions to make sure the context is right before producing a translation. Anyone who's used translation services knows context is everything!
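To make the Clarify idea concrete, here's a minimal toy sketch of an ask-before-translating flow. This is purely illustrative: the `AMBIGUOUS` word list, the `clarify_then_translate` function, and the `ask` callback are all hypothetical and have nothing to do with DeepL's actual API.

```python
# Toy sketch of a Clarify-style flow (hypothetical, not DeepL's API):
# before translating an ambiguous word, ask the user which sense they mean.

AMBIGUOUS = {
    "bank": ["a financial institution", "the side of a river"],
}

def clarify_then_translate(text: str, ask) -> str:
    """Ask a clarifying question for each ambiguous term, then hand off."""
    for word, senses in AMBIGUOUS.items():
        if word in text.lower():
            choice = ask(f"Does '{word}' mean {senses[0]} or {senses[1]}?")
            # Annotate the text with the chosen sense.
            text = text.replace(word, f"{word} ({senses[choice]})")
    # A real system would now send the disambiguated text to the MT model;
    # here we just return it.
    return text

# The caller answers with index 0 ("a financial institution").
result = clarify_then_translate("Meet me at the bank", lambda q: 0)
print(result)  # Meet me at the bank (a financial institution)
```

The design point is simply that disambiguation happens before the translation model runs, so the model never has to guess at context.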
Mesken explained that these kinds of features simply weren't possible before. "It just wasn't technically feasible until recently with the advancements that we've made in our next-gen efforts. This has now become possible. So those are the kinds of advances that we continue to hunt for," he said.
The batch processing capabilities this hardware enables are honestly game-changing for language AI. With translation speed ramped up by more than 10x, DeepL can process massive amounts of text in record time.
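The "more than 10x" figure checks out against the numbers cited earlier. A quick back-of-the-envelope calculation:

```python
# Sanity check on the speedup: 194 days before, 18 days now.
old_days = 194
new_days = 18

speedup = old_days / new_days
print(f"{speedup:.1f}x faster")  # 10.8x faster, consistent with "more than 10x"
```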
What does this mean for the average user? Better, faster, more accurate translations through DeepL's service and API access. Its deep learning models can now be trained on vastly more data, which typically leads to better results.
But the implications go beyond just DeepL. This shows how specialized AI hardware is enabling smaller companies to compete with tech giants in the AI space. And it's a pretty clear signal that Nvidia wants to get its chips into more hands than just the usual suspects.
Will other translation services follow suit? Can Google Translate keep up? That remains to be seen, but one thing's for sure - the race for translation supremacy just got a whole lot more interesting.