"I would like to share how the infusion of generative AI into robotics is speeding up the ability to bring robots from proof-of-concept to real-world deployment."
This is what NVIDIA Vice President of Embedded and Edge Computing Deepu Talla said as he shared the chipmaker's latest innovations at CES 2024.
Talla said that NVIDIA's platform for building AI-powered robots, NVIDIA Isaac, relies on two computers: an AI factory and an edge runtime.
The AI factory is where NVIDIA creates simulations and trains new artificial intelligence models; as Talla put it, the AI factory is essentially where the AI model is created. It leverages NVIDIA's data center compute infrastructure along with the NVIDIA AI and NVIDIA Omniverse platforms.
The second computer, the edge, is the runtime for AI-powered robots. Talla explained that it can take the form of an on-premises server performing tasks such as defect inspection on a high-speed semiconductor manufacturing line.
It can also be an autonomous machine processor, such as the NVIDIA Jetson, powering an industrial arm or autonomous mobile robot (AMR) equipped with multiple sensors.
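To make the two-computer split concrete, here is a minimal, hypothetical sketch of the edge side: a loop that scores camera frames for defects, as an edge runtime on a production line might. The `detect_defects` scorer is a stand-in placeholder, not NVIDIA's API; on real hardware the call would be an optimized inference (e.g. a model trained in the AI factory and deployed to a Jetson).

```python
import numpy as np

def detect_defects(frame: np.ndarray, threshold: float = 0.8) -> bool:
    """Placeholder for a trained vision model: flags a frame as
    defective when its mean brightness exceeds a threshold.
    On an edge device this would be a real model inference."""
    score = float(frame.mean())  # stand-in for a model's defect score
    return score > threshold

def inspect_stream(frames):
    """Run the detector over a stream of frames, as an edge runtime
    would on a manufacturing line, and collect flagged indices."""
    return [i for i, f in enumerate(frames) if detect_defects(f)]

# Simulated camera frames: mostly "good" (low score), one "defect".
rng = np.random.default_rng(0)
frames = [rng.random((8, 8)) * 0.5 for _ in range(4)]  # means near 0.25
frames.insert(2, np.ones((8, 8)))                      # mean 1.0 -> defect
flagged = inspect_stream(frames)
print(flagged)  # [2]
```

The point of the sketch is the division of labor: training happens in the data center (the AI factory), while the edge computer only runs the finished model against live sensor data.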
Using Generative AI for Transformative Robotics
Talla said that generative AI has already shown how it can improve productivity, as seen in AI-powered chatbots and copilots.
He said the same can happen in robotics: applying generative AI to robots and other machines will accelerate the development and deployment of smart robots.
He promised that this would soon be possible, thanks to NVIDIA's Omniverse platform and Isaac Sim. Talla said these advanced AI products can turbocharge the runtime capabilities of smart robots, allow humans to interact with machines using natural language, and more.
If you want to learn more about how NVIDIA is transforming smart robotics with its latest AI innovations, watch the YouTube video below.