“Engineering is For Helping People”: Xconomist of the Week Yoky Matsuoka
We had an amazing lineup of speakers at our first-ever forum on the future of robotics at SRI International in Menlo Park back in May. But I was especially excited to have the opportunity to do an on-stage interview with Yoky Matsuoka, whose pioneering studies of “neurobotics” have brought us closer to a future where amputees will be able to use brain signals to control agile, realistic prosthetic limbs.
As a professor of computer science and engineering at the University of Washington—and at Carnegie Mellon University before that—Matsuoka gained fame for her work on “anatomically correct” robot hands. To an almost fanatical extent, Matsuoka’s robot hands mimicked the joints, tendons, and other details of human hands. Only a limb capable of reproducing the full range of human motions, Matsuoka reasoned, could properly interpret the complex neural signals coming from the brain.
Matsuoka’s attack on the problem was innately interdisciplinary, mixing computer science, biomedical engineering, neuroscience, and, of course, robotics. That attracted the attention of the MacArthur Foundation, which awarded her a “genius grant” fellowship in 2007, and won her spots as one of Popular Science magazine’s “Brilliant Ten,” one of Barbie’s “Top Women to Watch in 2010,” and one of Seattle Magazine’s “Power 25.”
But Matsuoka stayed busy outside the lab too, founding YokyWorks, a non-profit foundation that works to get girls interested in science and engineering by putting them to work on building custom assistive devices for people with disabilities. In our interview, which I recorded, Matsuoka (who is one of our Xconomists) said she went into robotics because she wanted to help people. She said that the goal of YokyWorks is to show middle-school-age girls that they, too, can help people by becoming engineers.
These days Matsuoka is vice president of technology at Nest, the Palo Alto startup building iPhone-like thermostats aimed at changing the way people interact with the environmental systems in their homes. That may seem like an odd career shift, but Matsuoka said in our interview that she sees the Nest thermostat as another kind of robot—one that’s so beautifully designed that it could finally pave the way for many other kinds of ‘bots to enter people’s homes.
Here’s an edited writeup of our conversation.
Xconomy: When you were younger you played tennis quite intensively. In fact you made it to the qualifying rounds at Wimbledon. But I understand you suffered injuries on occasion.
Yoky Matsuoka: On occasion is an understatement.
X: So, do you feel being an athlete prepared you to think about the mechanics of human motion? What’s the thread between that experience, and the work that you did at Carnegie Mellon and the University of Washington on simulating and building anatomically accurate human prosthetic limbs?
YM: When I was playing tennis in college, my selfish motivation was to build a tennis buddy that could play tennis with me. I didn’t really think about how I could help other people with injuries. But that certainly shaped the form of my education in terms of what I ended up doing for my PhD. I went into robotics because I wanted to build a tennis buddy for myself. But then I ended up going off the deep end. We had to understand the neuroscience, or else I could not keep going and build myself a tennis buddy.
By the time I got there and really started thinking about how to build those systems, those crazy five or ten hours a day and hallucinating about which muscles are being activated did come in quite handy. I ended up really trying to come up with a computational model of how the human brain learns different motions. A great analogy would be, when you learn how to play tennis, if the ball bounces in the same spot maybe you can learn how to hit that ball well. But what if the next ball that comes at you bounces very differently? Somehow we come up with a way to improvise it.
Robots are not as good at that. That’s what intrigued me. The computational mechanisms that grow up in the brain [to handle that], those are what I ended up really focusing on.
So, did it help? Yes. But where the injuries came in, was really understanding the applications of neuroscience. I didn’t realize how many people have neurological injuries that are preventing them from being able to move. And I thought “Wait a second, we are sitting here knowing so many robotic technologies. We can actually utilize this to help them.” That was the turning point.
X: You’ve talked about the scene in The Empire Strikes Back where Luke has a new bionic hand, and there’s that brief moment where you can see the levers in the hand clicking back and forth like tendons. To make something like that work you obviously have to have communication with the brain. So I’m curious—you’re someone who thinks deeply about both halves of the problem, meaning how do you build an anatomically correct robot hand, but also how do you drive it from actual neural impulses. Which of those problems is harder? Or is it the kind of thing where you have to solve them both at once?
YM: We’ve solved them both already, right? [Laughter] If you are thinking about the Star Wars scene, where the robot is doing this and it’s super fast and dexterous, that’s a long way from now, in both an anatomically correct mechanical system as well as the brain signals. But we’re making critical headway on both of those things.
Comparing is kind of hard. What are we limited by in terms of the mechanical systems? We are limited by making it light. Batteries. Making it extremely hard to break. Something that can float in the ocean and get beaten up and still be okay. Connectors. Wires. Wow, all of those problems actually get in our way. And it’s kind of interesting that the same kinds of problems are getting in the way of brain science as well. Why aren’t we getting tons of data that computer scientists can play with in terms of brain signals? Because there is nothing small enough to go in the brain that you can wear inside your skull and walk around collecting data. If we had something like that, we would make amazing advances.
So the anatomically correct hands have a dual purpose. They’re not really aiming to be the next set of hands that’s going to hit the market. Really it’s about pushing the envelope of knowledge. We don’t know which 10 things in our hands should be in the current robotic hand to make it move just the way we do. Is it the shape of the bone? Is it the way tendons are routed? Is it the way the skin lies? What is it?
When you don’t know, you can either mimic one at a time, or you can mimic all of them and then remove them one at a time, and that’s kind of been the procedure with the anatomically correct hand. So we even try to copy the form of the bone, and we say “Is this bump important? Is this groove important? Wow, there’s this asymmetry and we wanted to get rid of it but it turns out to be extremely important.” Those are the kinds of procedures we have gone through on the mechanical side. It’s extremely fascinating.
X: A lot of the prosthetic limbs that amputees wear today are still fairly primitive. I’m curious what you think will be the path between the prosthetic limbs of today and something that is more responsive and agile. Which companies are experimenting in this area? How do you get this stuff to market?
YM: Prosthetics, while it’s so hot in one way, is also so cold because it’s hard to get funding. It’s a relatively small market, unless we start chopping off all of your arms and making you into subjects, which apparently I am not allowed to do yet. [Laughter.] So it’s actually pretty hard to do, to sort of move the field.
One of the companies that we worked with that is doing amazing things mechanically is a company called Touch Bionics. They actually put in five motors, one for each finger. The hand is pretty light, it’s pretty cheap. Insurance will reimburse $20K for the hardware and maybe another $20K for connecting it, so it’s 100 percent covered by insurance in some cases. So it’s getting there. It’s not so bad. And people are walking around with this. Thousands of people now have this device.
So why isn’t this common knowledge? Why isn’t everybody wearing it? One, it has five motors, but they all move together because we can’t get the control signals for them separately. How are we going to know from one or two signals that you want to do this, or you want to do that [twisting her arm]? Or make different grasping shapes? That’s turned out to be really hard. So from about two signals they get an open signal and a close signal. So is the limitation mechanical? Not so much.
This is why I think that with more resources, Silicon Valley and venture capital could really jump in and say, “You know what, this might not be the money making machine of the century, but boy it’s for a good cause, and let’s take some of the software tools we’ve got and bring them into the field.” I bet we could go a long way.
X: I bet we could too. Moving to the opposite end of the funding spectrum, I wanted to ask you about YokyWorks. You set that up a few years ago. It’s a non-profit, and you focus on commercializing technologies that could help people with Parkinson’s and other disabilities lead more fulfilling lives. There’s a big educational component to it—getting young entrepreneurs interested in robotics. So I’m curious, what are the career paths open these days to young people who have an interest in robotics and might be trying to decide, “Should I go to a startup, should I go to a university, should I start my own company, should I go to Google and do it on my 20 percent time”?
YM: To say a little more about YokyWorks and its motivation: My work has always focused very heavily on pushing the scientific envelope. Even though I was in the field of robotics, it was because I wanted to help people. So that urge led me to create YokyWorks, which was really about taking requests from individuals. For example, someone might have sent me an e-mail saying, “My son has cerebral palsy, and his fingers are not moving in a way that he can type on a computer, so he is getting Ds in class even though he is really smart. Is there anything you can do about that?” Those are the kinds of requests that I was getting just by being on the Web and being a professor. I thought, you know what, yeah, if I had 20 percent time or worked at Google or could take one day a week, I could probably do this on my own. So that’s why I started YokyWorks. I started to work with my students (that was when I was at CMU) and said, “Hey, let’s take this problem and see if we can build something.”
And then that really started to combine with another motive of really recruiting more women, more girls, into the field of robotics and training them in science and engineering. I had gone through it. I was a girl, growing up. [Laughter.] I hid the fact that I liked science and engineering. When I ask the girls now, “Well, what do you want to do when you grow up?” many of them say, “You know, I just want to help people.” And it seems like engineering is not about that, because engineering is this hard, clunky, gear-y thing. They say, “If I become a nurse, if I do business, I’ll end up being able to help people.”
Well, it doesn’t have to be that way at all. I think that’s where YokyWorks comes in as a tool, to actually recruit those girls. To say “Well, you actually might want to start brushing up on your math and science.” And to show them. We partnered up with a junior high school in Seattle, and sixth- and seventh-grade girls came in, spent time in the lab and at YokyWorks, and got exposed to the fact that engineering is for helping people. And these girls are sticking with science and engineering, and that is really great.
X: You were getting a lot of attention for your work in neurobotics, but now you’re working at Nest, which is building the next generation of home monitoring technology for energy. Why was that a logical transition for you?
YM: When you think about how much energy you use every day at your house, it turns out about 50 percent of the energy is spent heating and cooling your house. You think about turning off the lights, but you don’t worry too much about what would happen if we made the house two degrees colder. That would save a lot, lot more than turning off the lights. This is such a big opportunity—to really start being smart about it.
[At this point Matsuoka showed a Nest promo video.] Now, people in this room full of roboticists are probably looking at that and going, “Yeah, simple technology, I could program that tomorrow.” And they could be right. But the starting point is interesting. I teamed up with the people who basically built the first-generation iPod and iPhone. And the combination of some of the robotic technology and what we know, combined with that beautiful package and the human connection that they can get, really finally gets this into people’s homes.
And this may be a very different path for robotics to get into people’s homes. But it is a way, and it’s actually a unique and different way that has to be tried. So I feel that now we are building this brand new kind of robot that hasn’t been built, coming from the consumer product side, not the research side of robotics, but still using the same technology and the same people. Maybe we can meet halfway somewhere to achieve the same goal.
X: So in a way, you’re saying a home is sort of a robot—it has all kinds of sensors and actuators—it just has a terrible interface. One way of looking at it could be that you are building a much cooler interface for the robot that is your house.
YM: Yeah. It doesn’t have to have eyes and a mouth. This really is the entry point. If you look back at the thermostat you have in your home today you will realize it’s not the prettiest thing, unless you have Nest. Robotics has always been thought of as something that’s sexy, but there wouldn’t be a place for it in a normal home. This is a nice way to get it in there.