Veo Grabs $12M, Led by GV & Lux, to Help Industrial Robots “See”

True artificial intelligence is likely decades away, if it ever comes. In the meantime, companies are making robots “smarter” in less dramatic ways that nevertheless could prove quite useful.

Veo Robotics is one of the players trying to boost the intelligence of industrial robots by enhancing their perception and responsiveness to their surroundings—thereby making it easier and safer for them to work alongside humans.

“Our goal is to have humans be at home in the world of machines,” says Veo co-founder and CEO Patrick Sobalvarro.

Today, Veo’s vision got a $12 million jolt from investors. The Series A funding round was led by Lux Capital and GV, the venture capital arm of Google’s parent company, Alphabet. Some of the money came from an earlier Veo backer, Next47, a venture firm created by manufacturing giant Siemens. Lux’s Bilal Zuberi and GV’s Andy Wheeler will join Veo’s board.

Cambridge, MA-based Veo has now raised $13 million in total venture capital, Sobalvarro says.

While automation has led to manufacturing job losses in recent decades, Veo is one of the voices arguing that robots will not make factory workers obsolete, and that they can actually enhance the productivity of humans.

“What we’re seeing is the value of human labor in manufacturing is going up tremendously, so that manufacturers can be more responsive to market needs,” Sobalvarro says. “If you have a 100 percent automated [manufacturing] line, you’re not going to be able to adapt.”

What he means is that companies are producing a wider variety of goods these days, and they’re often custom orders. That means factory lines are constantly being reconfigured—something that requires humans, Sobalvarro says.

Veo’s plan is for robots and humans to handle the tasks that best suit their strengths. Machines will lift heavy parts and perform repetitive tasks, while humans will focus on work that requires creativity and judgment, as well as tasks that would be difficult and costly to automate, Sobalvarro says.

Veo certainly isn’t the only company trying to make industrial robots more sophisticated and able to work closely with humans. Universal Robots and Rethink Robotics—where Sobalvarro served as president a few years ago—are two others working in this area.

To ensure their robots don’t hurt the people working alongside them, some companies have limited the force their robots can exert and the speed at which they move, Veo says. Meanwhile, most large, powerful, and fast-moving industrial robots can’t adjust their movements when people get close, so they are enclosed in cages for safety, and must be shut off before humans can enter the workstation.

Veo’s team thinks they have figured out a solution that doesn’t require those tradeoffs. The company aims to free powerful and speedy robots from their cages by equipping them with 3D sensors and software—effectively, “eyes” and “brains,” Sobalvarro says—that enable the machines to safely operate with humans in close proximity.

“In order to make big robots collaborative, they need to be intelligent and perceptive,” he says. He’s talking about robots that can lift anywhere from 100 pounds to more than 4,400 pounds. “If they were to bump into you, it would be too late already,” he adds.

Here’s a possible scenario for using Veo’s system: A robot equipped with its technology could lift up a refrigerator door and hold it in place while a human tightens the screws to attach it to the refrigerator.

I visited Veo’s office a few months ago and saw one of its prototypes in action. As the robotic arm moved around, I reached out to touch it. When my hand got close, the robot immediately stopped. As I pulled back, it slowly started moving again and went about its business.

Veo’s system includes a set of infrared 3D cameras placed at various spots in a workstation. On the back end, the company’s software fuses the data from those cameras to reconstruct the scene and constantly track the movement of objects near the robot. Veo says the software can identify humans and the parts the robot is working with, and the system can be configured to follow rules that, say, allow the robot to touch the part, but not humans. The robot’s response time is around one tenth of a second, and it will generally stop when someone comes within 1.5 feet of it, Sobalvarro says.
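To make the idea concrete, here is a toy sketch (not Veo’s actual software, whose internals aren’t public) of the kind of rule described above: the fused 3D scene yields labeled objects, and the robot stops whenever a human-labeled object enters a stop zone of roughly 1.5 feet, while parts are allowed arbitrarily close. All names and the point-based representation here are illustrative assumptions.

```python
import math

# Roughly the 1.5-foot stop threshold mentioned in the article, in meters.
STOP_DISTANCE_M = 0.46

def distance(a, b):
    """Euclidean distance between two 3D points given as (x, y, z) tuples."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def should_stop(robot_position, tracked_objects):
    """Return True if any human-labeled object is inside the stop zone.

    tracked_objects: list of (label, (x, y, z)) pairs from the fused scene.
    Objects labeled "part" may come arbitrarily close; only objects
    labeled "human" trigger a stop.
    """
    return any(
        label == "human" and distance(robot_position, pos) < STOP_DISTANCE_M
        for label, pos in tracked_objects
    )

# Example: a part in the gripper is fine; a person 0.3 m away triggers a stop.
scene = [("part", (0.1, 0.0, 0.0)), ("human", (0.3, 0.0, 0.0))]
print(should_stop((0.0, 0.0, 0.0), scene))  # True: human inside 0.46 m
```

A real system would of course run this check continuously against fused sensor data and decelerate the arm safely rather than simply halting, but the label-plus-distance rule captures the basic behavior I saw in the demo.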

Now, Veo must prove to customers that its system works as well as advertised. Sobalvarro says the company will begin testing its technology with pilot customers early next year. He wouldn’t name the initial partners, but says they include a large automaker, a household appliance manufacturer, a consumer packaged goods company, and a firm that operates distribution centers.

“We’ll be doing pilot systems together with them in controlled environments that are set up to replicate their production” lines, Sobalvarro says. Veo’s goal is to begin selling its product more widely in 2019, he adds.

Sobalvarro started Veo in early 2016, while he was an entrepreneur-in-residence at Siemens Venture Capital. Veo was formed as an independent company, and Siemens doesn’t have any licensing rights to its technology, Sobalvarro says. The relationship between Veo and Next47, the venture firm formed by Siemens, is a traditional venture capital equity deal, he adds.

Sobalvarro’s co-founders are vice president of engineering Clara Vu, a former senior software engineer at iRobot and a co-founder of Harvest Automation; and Scott Denenberg, Veo’s senior director of hardware and a former senior director of electrical engineering at Jentek Sensors. (The trio is pictured above, with one of the robots.)

The fresh capital will go toward product development and expanding Veo’s team. The company currently has around 10 employees, and that number will grow to about 25 over the next six months, mostly through engineering hires, Sobalvarro says.

“We’re really looking forward to building the workforce, the company, and [taking] the product to the next level,” Sobalvarro says.
