Baidu isn’t too well-known on this side of the world, which isn’t much of a surprise for a company that is, at its core, a Chinese search engine. However, much like Google, Baidu has worked on or created some impressive technologies of its own; because it’s a Chinese company, though, most of us in the US and Canada simply don’t hear much about that work.
Baidu’s efforts in deep learning and artificial intelligence certainly haven’t gone under NVIDIA’s radar, though. I can’t recall the last time NVIDIA held one of its GPU Technology Conferences without namedropping Baidu a handful of times, or even inviting one of Baidu’s top researchers up to share the stage with CEO Jensen Huang. Given all that, a strong partnership almost seemed inevitable.
And it may well have been, as NVIDIA and Baidu today announce a new collaboration to accelerate the impact of AI in cloud data centers, autonomous vehicles, and, of course, the home.
At Baidu’s own AI developer conference in Beijing, company COO Qi Lu announced that Baidu will adopt NVIDIA’s upcoming Volta GPUs for use in Baidu Cloud, adopt NVIDIA’s DRIVE PX platform for its autonomous driving efforts, bring AI to Chinese consumers by way of a conversational system added to NVIDIA’s SHIELD, and, last but not least, optimize its open source PaddlePaddle deep learning framework for Volta.
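To give a rough sense of what “optimizing PaddlePaddle for Volta” touches, the framework exposes a Python API much like other deep learning libraries, and the GPU-specific work happens underneath it. Here’s a minimal, hypothetical training sketch written against the current PaddlePaddle 2.x interface (the API at the time of this announcement looked different), using toy data that is purely illustrative:

```python
import paddle
import paddle.nn as nn
import paddle.nn.functional as F

# Run on the GPU when the Paddle build supports CUDA; otherwise fall back to CPU.
paddle.set_device('gpu' if paddle.device.is_compiled_with_cuda() else 'cpu')

# Toy model, optimizer, and random data; stand-ins for a real cloud workload.
model = nn.Linear(4, 1)
opt = paddle.optimizer.SGD(learning_rate=0.01, parameters=model.parameters())
x = paddle.randn([8, 4])
y = paddle.randn([8, 1])

for step in range(5):
    pred = model(x)
    loss = F.mse_loss(pred, y)
    loss.backward()   # gradients are computed on whichever device was selected
    opt.step()
    opt.clear_grad()
```

Nothing Volta-specific appears at this level, which is rather the point: the framework-level optimization Baidu is describing would let a script like this run faster without the user changing it.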
Given that there is just a single Volta-based card on the market, it should come as no surprise that Baidu is eyeing the Tesla V100 (seen above), the most powerful GPU NVIDIA offers today, equipped with HBM2 memory. In addition, the company will also implement Tesla P4 accelerators; the two GPUs will ultimately split the workloads, with the V100 handling AI training and the P4 handling inference.
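To make that training/inference split a little more concrete: in a framework like PaddlePaddle, a model trained on big GPUs is typically exported and then served separately for inference, the lighter, latency-sensitive side of the workload that the P4 targets. The sketch below is a hypothetical example using PaddlePaddle 2.x’s paddle.jit export/load path with made-up file names, not anything Baidu has published:

```python
import paddle
import paddle.nn as nn
from paddle.static import InputSpec

# Stand-in for a model that has already been trained (say, on V100-class GPUs).
model = nn.Linear(4, 1)
model.eval()

# Export a deployable copy of the model (the './linear_infer' path is made up).
paddle.jit.save(model, './linear_infer',
                input_spec=[InputSpec(shape=[None, 4], dtype='float32')])

# An inference service reloads the exported model and predicts with gradients
# disabled, the kind of lighter-weight work aimed at Tesla P4 accelerators.
served = paddle.jit.load('./linear_infer')
with paddle.no_grad():
    print(served(paddle.randn([1, 4])))
```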
Overall, these four separate announcements encapsulate much of NVIDIA’s current portfolio, including the DRIVE PX system, its datacenter products, and even the lowly NVIDIA SHIELD – something I didn’t expect. It’s been said before that in time, AI will be everywhere, and this set of announcements helps prove it.