Missy Cummings is a professor in the Department of Mechanical Engineering and director of the Humans and Autonomy Lab at Duke. As one of the first female Navy fighter pilots, Cummings has an intimate knowledge of aviation and the military, which she continues to incorporate in her research into human-technology interactions, as well as in her work on drone policy. The Chronicle’s Ian Jaffe sat down with her to discuss her experience and current drone policy issues.
The Chronicle: As one of the first female Navy fighter pilots, did you feel that you were doing something groundbreaking?
Missy Cummings: Oh yeah, it was clear. It was hard, because none of the guys would speak to me. I think it is hard to realize now, in 2015, but we are talking about 20 years ago, and guys were very bitter that women were forced on them by Congress. It took an act of law to make that happen, so they were very resentful. And [the pushback] is one of the reasons I left. You know, I loved flying, and I loved the fighter-bomber mission, but it was just not a very fulfilling social environment.
TC: How would you describe your current work? Is there a unifying theme in your work?
MC: The unifying theme is that I am looking at the intersection of human interaction with technology. It’s very interdisciplinary, because I do some psychology, some brain science, but I also do more traditional human factors engineering, building drones, which is mechanical engineering, and also electrical and computer engineering. I spend a significant amount of time looking at how to advance automation: when you take a complex system and put more autonomy into that system, how do you need to design it to work with—or in the presence of—humans? I am a human advocate in a very technical system.
TC: What would you say are the most important social or ethical implications of technology development, particularly with human-technology interactions?
MC: It really depends on the technology, but perhaps the biggest one we see when we are talking about advanced automation is privacy, with drones for example. Back in 2007-2008, I predicted that there was going to be a huge eruption in the United States over privacy. People back then didn’t believe me because they didn’t understand that drones were coming. And sure enough, as soon as they became widespread, the privacy debate erupted. I also think another big one is autonomous weapons. As we start to make more and more robotic weapons, should we let them fire on their own? How reliable are they? This is where my military background really comes in as well, because I have been there and seen what mistakes can be made. I’ve seen human mistakes, I’ve seen machine mistakes.
TC: How do you envision the future of drones in light of today’s debates?
MC: So, there is no question that drones are always safer in terms of flying—they do better at the act of flying than humans do. We are still working some of the kinks out of the system and the infrastructure that supports it, but there is no question in my mind that in the future it will be the Jetsons. You will have your own flying car, but the car will do all the driving and the flying for you. In 10 to 20 years, most freight will be flown by drones, and the military will be flying drones.
TC: How accurate do you think the American public’s views of drones are?
MC: I think the public’s view is skewed by what they see in the media... there is a lot of paranoia and fear that is actually pushed on them by the media. It is interesting that even with the media scare tactics—and by that I mean worrying about the privacy issues—drones have actually made an impressive entry into the commercial marketplace. And because the business case is so strong (for example, a police force can have a $10,000 drone that does the work of a $500,000 helicopter), the budget just cannot deny those numbers. So in the end, having drones in your local law enforcement—and even at the state and federal level—really is good for us because it costs taxpayers significantly less. And it gives us a layer of safety that we would not have otherwise. One of the most dangerous areas of aviation is first response: police helicopters, traffic helicopters, trying to get out in bad weather. All of that can be turned over to drones, which are much safer. We will see deaths from those accidents go way down.
TC: What do you think should be done domestically in terms of drone policies, particularly those of the Federal Aviation Administration?
MC: It’s interesting with the FAA, because starting in about 2003-2004, researchers like myself started petitioning the FAA to take this issue seriously and to start putting regulations in place to address the fact that drones were coming. The FAA basically told us that we were wrong, that this was never going to be a developing technology. Now, ten years later, they have been caught with their pants down, and we are still in a place where the FAA is refusing to understand the scope of the growth potential for this field. But the good news is that as soon as Google and Amazon jumped in, the FAA started taking it seriously. So now the rules are changing, and they are changing very quickly.