Earlier this year, a swarm of buzzing, glowing little robots took the Internet by storm. The video in which they appeared has amassed more than six million views on YouTube and looks like something straight out of a sci-fi movie: Twenty flying helicopter-robots hover in the air, glide together to form a perfect square, then zip through a small opening in pairs.
Posted by numerous media outlets — including SmartPlanet — the video came from a research team at the University of Pennsylvania’s General Robotics, Automation, Sensing and Perception (GRASP) Lab. One of the robots’ co-developers, Vijay Kumar, directed the lab from 1998 to 2005 and now serves as the deputy dean for education in Penn’s Mechanical Engineering and Applied Mechanics Department. He recently spoke with us about the viral video, his goals for the now-famous flying robot swarm and the status of GRASP’s other projects, including bacteria-powered micro-robots.
GRASP’s flying robot squadron made quite a splash on the web recently. Can you tell me more about developing them?
We build lots of flying robots. They’re all small helicopters. The ones you see on the Internet have four rotors — they’re called quadrotors. They’re autonomous in the sense that they’re able to determine how to regulate the different propeller speeds to fly where they need to fly, and usually they rely on a higher level of [programmed] intelligence that tells them where to go. When following planned paths, [a robot’s] challenge is doing what the planner tells it to do while at the same time keeping track of what its neighbors are doing. Unlike humans, our robots are always altruistic.
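The "regulating the different propeller speeds" that Kumar describes can be sketched with the standard quadrotor mixing model: each rotor contributes thrust proportional to the square of its spin rate, and differences between opposite rotors produce roll, pitch and yaw torques. The constants and layout below are illustrative assumptions for a generic plus-configuration quadrotor, not the GRASP vehicles' actual parameters.

```python
# Hypothetical constants for illustration (not from the interview):
K_F = 6.1e-6   # thrust coefficient, N per (rad/s)^2
K_M = 1.5e-7   # drag-torque coefficient, N*m per (rad/s)^2
L   = 0.17     # arm length from center to each rotor, m

def rotor_forces(w1, w2, w3, w4):
    """Map four rotor speeds (rad/s) of a plus-configuration quadrotor
    to total thrust and body torques (roll, pitch, yaw).

    Rotors 1 and 3 sit on the pitch axis and spin one way; rotors 2
    and 4 sit on the roll axis and spin the other way, so their drag
    torques cancel when all four speeds are equal (a steady hover).
    """
    f = [K_F * w * w for w in (w1, w2, w3, w4)]   # per-rotor thrusts
    thrust = sum(f)
    roll   = L * (f[3] - f[1])                    # speed up rotor 4, slow rotor 2
    pitch  = L * (f[2] - f[0])                    # speed up rotor 3, slow rotor 1
    yaw    = (K_M / K_F) * (f[0] - f[1] + f[2] - f[3])
    return thrust, roll, pitch, yaw
```

With all four rotors at the same speed, the three torques are zero and only a net upward thrust remains; a controller flies the vehicle by commanding small speed differences around that hover point.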
How long in the making were these quadrotors?
We’re familiar with the technology, but we didn’t start making them ourselves until the last six months or so. We were using off-the-shelf products [before that] and retrofitting them. You can buy quadrotors right from Brookstone. They’re little remote-control toys for a kid. The problem with those is that they’re not computer controlled and they don’t have sensors, so we were using off-the-shelf platforms and customizing them to our needs. Then we got to a point where we wanted more and more performance, more and more capabilities in terms of flight, so we decided to start building our own.
So the ones in the video are ones that you built yourselves?
What kind of applications do you envision for these robots?
In our business, you develop technology. It would be very grandiose of me to say, ‘This is how it’s going to pan out and this is who’s going to get impacted’ and so on. There are all kinds of things at stake. How technology evolves is very hard to predict. But I can tell you in the lab what our interests are and what we hope the technology would be used for and useful for.
First, for search and rescue, first response, law enforcement, emergency response personnel. They often have to go into really dangerous environments to respond to emergencies. We’d like to have robots do that. For example, if there’s a gunshot in the city of Philadelphia, I should be able to get robots to the scene right away. If there’s something catastrophic that happens in a school building and you don’t know if there’s a shooter or not, you don’t want to send a human in to find out what the problem is and put that person in harm’s way. You want robots to be at the scene immediately.
It’s not that they’re going to rescue people; it’s just that you have eyes and ears on the scene immediately and the information is broadcast to everyone. Everyone has complete situational awareness of what’s going on so that decisions can be made smartly and people can react to threats in a timely fashion.
It sounds like in any of those scenarios the robots would need some sort of camera capability.
Yeah. Obviously, the robots have to have sensors. In some of our experiments, we’ve sort of ignored that piece of it because we focused more on the control, the autonomy piece. In other projects we’re focusing just on the sensing piece.
Why is it more useful in those situations to have a swarm going in rather than a single robot?
First, when you scale things down, the amount of stuff you can carry goes down dramatically. So if you want to carry 10 pounds’ worth of sensing equipment, you need a robot on the order of a hundred pounds. On the other hand, if you want to have lots and lots of little guys, which is what you really need to navigate a complicated three-dimensional environment, maybe each robot only carries a tenth of a pound or two tenths of a pound. That’s not a big deal if you can distribute the payload across a hundred robots simultaneously. The other advantage is that you can quickly explore a three-dimensional building.
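The scaling argument reduces to simple arithmetic. The numbers below are illustrative, taken loosely from the rough figures Kumar mentions (a 10-pound payload, roughly a tenth of a pound per small quadrotor):

```python
# Illustrative numbers only, echoing the interview's rough figures:
# one large robot vs. a swarm splitting the same total payload.
total_payload_lb = 10.0   # e.g. ten pounds of equipment
per_robot_lb = 0.1        # each small quadrotor carries ~0.1 lb

# How many small robots does it take to share the load?
robots_needed = round(total_payload_lb / per_robot_lb)
print(robots_needed)  # 100 small robots carry what one large robot would
```

The same 10 pounds either demands one large, expensive vehicle or gets split across a hundred small ones, and only the small ones can thread through doorways and stairwells in parallel.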
Are there any other applications you envision beyond the first-responder scenarios?
The other thing I’d like to do is think about how to teach using these platforms. Right now, these are one-of-a-kind prototypes that we make, but we’d like to make this technology accessible to high-school kids so they can experiment with it and do their own thing. To really operate these robots you need to understand physics and mathematics, and that’s the cornerstone of all science and technology. You can teach science and technology using other kinds of applications, but this is a pretty exciting technology. If students understand that this technology is within their grasp and all they need to learn is a little physics and mathematics, I think that will keep a lot more students engaged in science.
How soon might we see these robots out in the world, either in search-and-rescue situations or in classrooms?
I don’t know about the search-and-rescue piece. There are a lot of things that have to be done to make this a hardened platform that can be used by firefighters, for instance. We are talking to people who are interested in commercializing it. We hope that some of these efforts will come to fruition, but it’s hard for me to predict how the market forces will speed up the development process.
On the educational front, we’re actively looking for funding from different sources to develop these inexpensive platforms and modules for high school students and push that forward.
What sort of reactions did you hear from people who saw the quadrotor videos?
Everybody’s excited. I think there’s a small faction of people who are worried about the applications of the technology. They think the military is funding all this research to make nuclear weapons and so on. But I think everybody’s sort of excited, and quite honestly we never anticipated this amount of public interest in these kinds of research projects. Sure, there are 5,000 or so people who work in this area and follow our work, but people like you who pick up the phone and call — that has been quite amazing and in some cases overwhelming.
How do you respond to the people who think it’s scary?
I don’t understand why people are freaked out by it. What we do has nothing to do with military applications. In fact, our lab has students from different countries. Everything we do, we publish. If we had time, we would invite anyone who wants to come look at our technology. This is research and so people interested in science and technology are always welcome in our lab.
Tell me about some of GRASP’s other big projects over the years.
I was only marginally involved in this, but my colleague [Associate Professor] Dan Lee worked on developing an autonomous car. We have a Toyota Prius which looks very much like the Google car, but we made it long before Google ever got into this business. This car can drive in urban environments. It can recognize four-way stop signs, see what other drivers want to do and react to that. It can merge onto highways. It can obviously keep speed. It can do a lot of things that you’d expect of humans and has human-like intelligence. There was a competition in which this car came in fourth. It drove for about 60 miles and, since it’s a Prius, it used less than a gallon of gas. And it was all autonomous — there was no human controlling it.
We also think about how to get robots to play soccer autonomously. We’ve had a whole slew of soccer-playing robot teams, and the latest ones look like humans. In these soccer competitions you have to figure out where you are, where the goalpost is, where your team members are, where the opponents are. You have to figure out when to pass to your team members. And all this happens autonomously — you’re not allowed to control the robots.
Does your team of robots play against other labs?
Yes, every year we have a soccer competition. It’s our World Cup soccer.
What’s your record?
We’ve done very well in competitions. Unlike the World Cup, it is not just one prize you win. There are multiple prizes. Last year, we lost in the semifinals, but we won the penalty challenge. If you come to our lab, you’ll see all the cups that the robots have won.
What else is the GRASP Lab working on now?
We’re building robots at the micro scale. These are really neat. They’re actually powered by bacteria. When you scale things down, it’s very hard to find motors that can drive robots. Our strategy has been to look at bacteria and how bacteria move and then get bacteria to actually drive our robots for us. We literally stick the bacteria onto small robots and then we manipulate the way they think and the way they react to the environment so that they carry the robot where we want the robot to go.
What can the micro-robots do?
Right now, they can sense chemicals in the environment and they can transport small payloads. We’re looking at ways for them to take small beads in which we encapsulate drug molecules and carry those drugs from one cell to another, delivering the beads to individual cells.
Looking at the robotics field in general right now, how do you feel? Are we making the progress we should be?
It is very exciting, of course, but we are so far behind where we should be. I think the problems are very hard. What you see in the popular press and so on are examples of machines that are operated by humans remotely. You look at [Unmanned Aerial Vehicles], you look at these robots in Afghanistan — these are not really robots in the sense that there’s a human who’s aware of exactly what’s going on and makes the decisions. UAVs don’t kill people; it is people who kill people. There’s someone at the trigger of that UAV remotely. The UAVs are not autonomous.
I think there’s a long way to go before we can truly build robots that are autonomous and can sense the environment and react to it and do the right thing. That’s one of the things we’re working on. We should be farther along. There are lots of challenges to solve and I think we need to engage the smartest and brightest people in the world to work on these problems.