Coco Robotics Partners With UCLA On AI Lab

Coco Robotics Taps UCLA Professor To Lead New Physical AI Research Lab

Coco Robotics has tapped UCLA professor Bolei Zhou to lead a new physical AI research lab aimed at advancing robot learning and automation. The Los Angeles-based startup, best known for its last-mile delivery bots, plans to draw on more than five years of real-world operating data to push physical artificial intelligence forward.

Image Credits: Coco Robotics

The company announced that UCLA Professor Bolei Zhou will lead the new initiative while also joining Coco Robotics as its Chief AI Scientist. Zhou's appointment deepens the startup's commitment to cutting-edge research in robotics and real-world AI systems.

A New Era For Physical AI

When Coco Robotics launched in 2020, it relied on teleoperators to guide its delivery bots through complex city environments. According to co-founder and CEO Zach Rash, the ultimate vision has always been to make these robots fully autonomous—reducing costs and improving delivery efficiency.

After accumulating millions of miles of navigation data, the company now sees an opportunity to push physical AI to new heights.

“We’ve collected data in some of the most complicated urban settings possible,” Rash explained. “That data is crucial for training useful and reliable AI systems. We finally have enough scale to accelerate research in physical AI.”

Why Bolei Zhou Is The Perfect Fit

Appointing Professor Zhou to lead this initiative was an easy decision for Coco Robotics. Zhou’s expertise spans computer vision, reinforcement learning, and robotic navigation—all essential pillars for physical AI advancement.

“Zhou is one of the world’s top researchers in robot navigation and learning,” Rash said. “He’s already helping us recruit some of the best minds in the field to join Coco Robotics and accelerate innovation.”

Both Rash and co-founder Brad Squicciarini are UCLA alumni who have maintained close ties with the university. Their collaboration with Zhou predates this announcement—Coco even donated one of its robots to UCLA’s research lab to support ongoing experiments.

Inside Coco Robotics’ Physical AI Lab

The newly established physical AI research lab will operate independently from Coco Robotics' ongoing collaboration with OpenAI. Under that partnership, OpenAI provides access to its models while Coco Robotics contributes real-world robot data to fuel AI development.

This dedicated lab will focus on enhancing the robots' ability to interpret environments, make real-time decisions, and adapt autonomously.

For now, Coco Robotics intends to keep its research findings proprietary, using them to refine its own automation systems rather than licensing data to other companies.

Data-Driven Automation For Smarter Robots

With a trove of data collected from real-world conditions, Coco Robotics aims to improve the local models powering its delivery bots. These models determine how robots perceive obstacles, respond to changing weather, and navigate unpredictable human environments.

The collaboration with UCLA’s Zhou is expected to bridge academic research and commercial innovation, setting new benchmarks for physical AI applications. By investing in AI-driven autonomy, Coco Robotics is positioning itself as a leader in sustainable, efficient, and intelligent last-mile delivery.

Coco Robotics’ Vision For The Future

Coco Robotics believes that physical AI will be the cornerstone of next-generation robotics—where machines learn from experience rather than instruction. The company’s investment in research underscores a broader trend: merging academic insight with commercial scalability to build smarter, safer, and more adaptive machines.

By tapping a UCLA professor to lead a new physical AI research lab, Coco Robotics signals not just a new chapter for the company but also a notable moment for robotics innovation. Combining academic expertise with vast real-world data, the startup is pushing the limits of what autonomous delivery technology can achieve.
