“I’m not advocating any of these scenarios,” Patrick Lin prefaces before rapidly painting a world in which robots guard our criminals and monitor prisoners’ vital signs while they are tortured, cyborg insects and robots the size of hummingbirds spy on civilians, and enhanced soldiers wage war for days, immune to hunger and sleep deprivation.
“Speaking hypothetically” is a sort of mantra for the technology ethicist, a way of reminding his listener—and possibly himself—that he’s not responsible if and when these visions come to pass. His primary objective is simply to make sure that if they do, policy makers are prepared for them with reasonable laws governing these new technological capabilities.
- FILE PHOTO
- THE MAN WHO SPEAKS IN HYPOTHETICALS: Patrick Lin has dedicated his career to posing ethical questions about emerging technologies.
“Again, I’m not advocating any of these scenarios.”
Lin repeats his disclaimer before stating his case for robots being used in interrogation scenarios: “Humans could take [torture] too far because of human emotions: hatred, anger, vengeance, overblown sense of nationalism, whatever. A robot would be immune to these things. A robot would be immune to adrenaline or hunger. Or fatigue.”
He’d be surprised if members of the intelligence community hadn’t already considered it. After all, one of his long-ago predictions came to pass when South Korea debuted robotic prison guards in November.
On Nov. 9, at the behest of In-Q-Tel (the CIA’s venture-capital arm), Lin did a briefing via Skype with an estimated 50 members of the intelligence community. All the big hitters—CIA, NSA, FBI, DHS, DoD, etc.—were there. And they wanted to hear Lin’s take on the ethical issues that might arise from the intelligence community’s use of robots.
It was hardly Lin’s first encounter with one of the key communities he hopes to influence with his work. This summer alone he participated in a briefing hosted jointly by DARPA and the National Academies of Science and another on the ethics of informational warfare at a UNESCO (United Nations Educational, Scientific, and Cultural Organization) workshop in London. He’s one of dozens of San Luis Obispo residents helping to shape the future of robotics, an industry that’s no longer the stronghold of geeks in basements. Robots are already changing the face of warfare and, as likely as not, cleaning your living room floor to boot.
For the record, Lin’s not a futurist. He’s more like Nancy Drew, if she had a PhD and worked as the director of the Ethics+Emerging Sciences Group at Cal Poly. He scours websites—mainly those belonging to the Department of Defense, the CIA, and DARPA (Defense Advanced Research Projects Agency), an appendage of the DoD—to familiarize himself with projects currently under development. Based on this information, and by networking with industry professionals and military personnel, he develops a forecast of sorts for how the technology will be used. It’s not a prediction so much as an educated guess, and though he doesn’t like to work too far out, he knows that being surprised by new technology is a lot like being caught unawares in a foxhole with your pants down: It’s embarrassing at best. Deadly at worst.
Take the government’s use of unmanned aerial vehicles, or drones. In the last two years, increased use of drones to stage attacks on militants in Afghanistan and Pakistan has generated criticism: The short-term benefits—fewer casualties among American soldiers—aren’t worth the long-term damage to America’s standing and perception abroad, specifically in the Middle East. It’s impossible to predict with any certainty what the consequences might be, but these aren’t the questions Lin’s asking during his briefings. These are the issues the government already knows about.
“What they’re trying to do is look ahead to make sure they’re not caught off guard by any other issues that might crop up,” explained Lin, who hopes that the questions he raises at least give key decision makers pause. “They might just be interested in avoiding PR disasters. That’s fine. As long as the goal is to make sure these drones are used ethically, I like to think that I’ve succeeded, in part.”
He believes the people he meets at his myriad conventions and briefings are genuinely interested in the subject matter, but faces the same doubts as anyone whose function is to advocate caution: that his presence at these briefings might simply be a way to pay lip service to ethics, so everyone else can give the appearance of caring. After all, asking scientists to rein in their work and military personnel to ignore a new weapon is a lot like asking a kid not to play with a new toy.
When Lin first got the call from the CIA, he considered declining. But he ultimately decided that, whether the intelligence community was really listening or not, any conversation about ethics was better than none at all.
“There was some sense of obligation that I—I don’t want to say ‘use my powers for good,’ but that we should help to guide practical results, policy,” he explained. “This isn’t just academic. There are real-world ties and implications to a lot of the work that goes on at Cal Poly.”
And the government does place Lin at an advantage. He cites the United States as one of the most transparent societies in the world, in part because research universities like Cal Poly do much of the research for military projects. In 2011 alone, the university received more than $4.6 million from the Department of Defense for various research projects, according to a university report. If a project is going really well and the military is considering making it operational, the program may “go dark,” meaning the public line of information will cease. But there are still tell-tale signs, references to the earlier work on university websites and press releases. Lin may not always know all the details, but it’s the rare project that leaves no trace of its existence within public view.
In terms of his own research, Lin prefers to operate in broad, unclassified daylight. He hasn’t yet been asked to do a classified briefing but would be inclined to turn it down if he were, on the grounds that higher-level security clearance obstructs an academic’s ability to share his work. He knows other academics who have turned down similar opportunities out of the same concern. It’s much more difficult to find practical applications for your work if you can’t tell anybody about it.
In fact, Lin’s more recent projects trend in the exact opposite direction. Along with two other scientists—including roboticist George Bekey, who lives in Arroyo Grande—Lin put together a book titled Robot Ethics: The Ethical and Social Implications of Robotics. It was released in mid-December 2011, and, almost unexpectedly, its target audience isn’t necessarily the scientists and academics with whom the trio of editors typically converses. It’s intended for the everyday person, the citizen-scientist whose grasp on the minute details of the science behind robots is more elementary, but who nonetheless has an interest and stake in what the future holds.
- IMAGE COURTESY OF PATRICK LIN
- THE FATHER OF MODERN ROBOTICS: After a career in robotics at USC, George Bekey lives in Arroyo Grande and is dedicating himself to robotics ethics.
“A lot of academics, they’re not good at publicizing what they do,” Lin admitted. “A lot of them could care less. But I think it’s important to show that what they do has an impact beyond the classroom. Because, technology ethics, ultimately our goal is to make an impact on society, on the broader world, not just to crank out papers that 10 people are going to read, but to do what we can to leave the world in a little better shape than we found it.”
Lin didn’t develop this perspective by treading a research university’s hallowed halls. Having worked for several years in the tech industry, including public relations, Lin learned that keeping the public in the loop has its advantages. After all, these technologies are, ultimately, being developed for the public.
Bekey is Lin’s partner in developing geek literature and asking uncomfortable questions. He shares Lin’s concerns about where technology may take us, as well as his disdain for academia’s status quo. Bekey, whom Lin called “the father of modern robotics,” has been working with the machines for 30 years, mostly out of USC’s Computer Science Department, where he’s a professor emeritus. Five years ago, he published a 500-page book called Autonomous Robots. In the true spirit of emerging technologies, it’s already obsolete.
Bekey started working with robots in the early ’80s, back when most of them were limited to mechanical arms used to assemble automobiles. He wrote a grant proposal to the National Science Foundation, and the next thing he knew his lab resounded with the pitter-patter of little robot feet. By the ’90s, his lab was building mobile robots. Bekey was particularly interested in the robots’ ability to learn to follow instructions. For example, he cited a toy car his team rebuilt.
“The car’s instructions when we turned it loose on the floor were keep moving, don’t stop, but don’t hit anything.”
They let the car loose, and it instantly struck the nearest object in its path. But the intervals between collisions grew steadily longer.
“Fifteen minutes of running around, it was no longer hitting anything,” Bekey marveled proudly.
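The learning behavior Bekey describes can be sketched in a few lines of code. This is a purely illustrative toy (not Bekey’s or iRobot’s actual software): each time a move in some direction ends in a collision, that direction is penalized and chosen less often, so crashes taper off over time, much like the toy car’s.

```python
import random

random.seed(0)  # make the illustration reproducible

DIRECTIONS = ["forward", "left", "right", "back"]

def learn_to_avoid(collides, steps=200, penalty=1.0):
    """Pick directions by weight; lower a direction's weight after each crash."""
    weights = {d: 10.0 for d in DIRECTIONS}
    collisions = 0
    for _ in range(steps):
        # Weighted random choice: well-behaved directions dominate over time.
        direction = random.choices(DIRECTIONS, [weights[d] for d in DIRECTIONS])[0]
        if collides(direction):
            collisions += 1
            weights[direction] = max(0.1, weights[direction] - penalty)
    return weights, collisions

# Hypothetical world in which every move "forward" hits a wall: after a few
# hundred steps, "forward" is almost never chosen anymore.
weights, crashes = learn_to_avoid(lambda d: d == "forward")
```

After the run, the weight on the colliding direction has collapsed while the safe directions keep their original weight, which is the whole trick: the robot never "understands" walls, it just stops choosing moves that ended badly.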
But as robots are being tasked with increasingly complicated instructions, and interactions with the public increase, there are bound to be some cartoonish moments. Bekey cites the use of robots to care for the elderly, specifically a robot that might be tasked with reminding its elderly charge to take his or her medication.
“What does the robot do if the patient refuses?” he asked with the same tone someone might use before unveiling a great punchline. “Does it force the pills down his throat? Does it lock up and freeze? I think we’re going to be confronting these issues more and more.”
As ludicrous as a robot shaking down Grandpa might be, Bekey’s been in robotics almost as long as the field has been around, and the fact that he’s now directing his expertise toward questioning how his life’s work will be used is undeniably significant. Bekey firmly believes that robots can be developed and programmed to follow society’s rules. Otherwise, he wouldn’t bother pointing out ethical hurdles along the path. It’s just a question of whether current and future generations of roboticists are willing to follow his lead. And when it comes to this question, Bekey becomes a bit of a pessimist.
“What concerns me,” he explained, “is that, almost without exceptions, robot designers never think about the consequences.”
Engineering schools rarely incorporate an ethics class to remind future roboticists that their actions have repercussions. Moral guidance is usually left to the family unit, Bekey pointed out, and as an educator, he believes those conversations need to happen in a classroom setting as well. It may be more convenient for scientists to focus exclusively on the task of creating without regard for the potential consequences. But it’s been 193 years since Frankenstein was published, and the tale of Mary Shelley’s scientist, obsessed to the point of obstinate blindness to the realities of his task, still rings true.
The problem of scientists not considering the ethical implications of their work wouldn’t be as problematic if the public were picking up the scientists’ slack, keeping them honest. But if most of your opinions about new technologies were formed during a single viewing of Terminator 2, you might not be well prepared to leap into the conversation.
And academics don’t always make it easy for people. The simple task of defining the term “robot” is one that stumps most engineers and scientists, according to Lin.
“Not even roboticists agree on what a robot is,” Lin said. “Definitions are notoriously slippery, especially once you involve a philosopher and they press you on the definition. We all have some idea of what a robot is. The trick is to really nail them down.”
When pressed to define a robot, iRobot engineer Jon Souliere compiled a few qualities or functions, including, “basically you turn it on and it will do its job without you interacting with it other than cleaning it out here and there,” “something that has to be able to do functions or operations on its own,” and “I see robots as having autonomy within them.”
Eventually, he conceded, “It’s a very hard thing to define because I find myself trying to figure out what an actual definition is.”
Not Bekey, of course, who promptly states, “A robot is a machine that senses, thinks, and acts.”
But Lin, who articulates his views on technology quickly and expertly, has a hard time explaining why a drone is not a robot. And if the experts who work with these technologies day in and day out can’t even agree on a single definition, well, you start to understand the public’s dilemma.
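Bekey’s compact definition maps onto what roboticists call the sense-think-act loop. The sketch below is a minimal, hypothetical illustration of that loop; the names and thresholds (`sense`, the 0.5-meter cutoff, the wheel speeds) are invented for the example, not drawn from any real robot’s software.

```python
def sense(distance_reading):
    """Sense: turn a raw sensor value into a world model (obstacle or not)."""
    return {"obstacle_ahead": distance_reading < 0.5}  # meters, assumed

def think(world):
    """Think: decide on an action from the current world model."""
    return "turn" if world["obstacle_ahead"] else "drive"

def act(decision):
    """Act: map the decision to a motor command (left, right wheel speeds)."""
    return {"turn": (0.2, -0.2), "drive": (0.5, 0.5)}[decision]

def control_step(distance_reading):
    """One pass through the loop: sensor reading in, motor command out."""
    return act(think(sense(distance_reading)))

left, right = control_step(0.3)  # obstacle close: wheels counter-rotate
```

By this yardstick, a remotely piloted drone is a borderline case—it senses and acts, but a human does much of the thinking—which is exactly where the definitional argument starts.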
Bekey’s belief that ethics ought to be a consideration in every engineering and science program by no means implies that there’s a correct answer to the thousands of questions that emerge with new technologies. Take the military’s use of drones, for example. On the one hand, you have human soldiers who mostly know society’s rules but are placed in intensely stressful and terrifying circumstances, and so often violate these rules. The fog of war—shoot everything that moves—is a byproduct of this fear, hunger, fatigue, anger, grief. A robot’s not going to be subject to these human weaknesses. But a robot can’t necessarily distinguish combatants from non-combatants, either.
Currently, the drones that rain Hellfire missiles on parts of the world most Americans have never seen are operated by Air Force pilots stationed in bunkers in the Nevada desert. Drones don’t decide when and where to drop missiles. But according to Bekey, it’s just a matter of time.
“I can tell you that the military are increasingly looking for ways of enabling these vehicles to make their own decisions about firing,” he said. “And it won’t be 10 years before they get that kind of autonomy.”
Besides impacting the country’s reputation overseas, these shiny new toys have a habit of finding their way into the civilian sector once they’ve fulfilled their original purpose. In early 2011, both Gawker and The Washington Post reported that the FAA would have regulations in place allowing law enforcement to use drones for surveillance by 2013.
On Dec. 10, 2011, the Los Angeles Times reported that North Dakota police made their first arrest with the assistance of a surveillance drone, and they had flown at least two dozen surveillance missions since June.
In all likelihood, these surveillance drones won’t be equipped with missiles, but it’s evidence that the distinction between war and civilian life may not be as clear as we like to think.
And it isn’t just a question of military technologies crossing over into civilian life; as often as not, the technology dictates the war.
“If we didn’t have robots, I can guarantee we wouldn’t be sending ground troops sneaking across the border of Pakistan to take out suspected terrorists,” Lin insisted. “We just wouldn’t be doing it.”
If Bekey and Lin sound like Luddites, they’re actually far from it. You’d be hard pressed to find people who enjoy discussing technology more than these two. Bekey happily cohabitates with a robot—a Roomba that cleans his floors—and cheerfully predicts a future in which we all live with robots. And, when Lin’s not sleuthing through the Department of Defense’s website, one of his primary responsibilities is networking with industry professionals, meaning tech geeks.
They’ve heard the anti-technology arguments, the accusations about “playing god,” dark muttering about scientific hubris. Then again, scientific hubris gave us antibiotics, doubled our lifespan, helped us realize that our environment is in a state of crisis, and will play an essential role in resolving it. But despite their tendencies toward caution, Bekey and Lin just don’t happen to be on board the anti-technology train.
“Some people are just really tied to tradition and the status quo, that the way we are now ought to be the way we continue to be, that there’s something noble about the human animal right now and they can’t see how it could possibly be any better,” Lin said, framing the argument. “Other people see technology as an expression of human creativity and the path to our potential, that we become more human with the help of technology.”
Most people would probably be surprised to learn that the path to our potential meanders through a town perhaps best known for an alley filled with chewed gum. Lin and others have found their way here, and they’re doing their part to drive technology forward by blazing a safe, dialogue-filled path. There’s a reasonable chance that the robots that eventually make their way down this avenue will be produced in San Luis Obispo, too—or at the very least be prototyped here.
iRobot opened a branch in San Luis Obispo several years back. If the name doesn’t ring any bells, iRobot is the company that created the Roomba, a floor-cleaning device that accounts for about half of the world’s home-use robots. There are more than 6 million of these devices scurrying across floors, docking and recharging when they weary from their exertions.
But the San Luis Obispo branch has had very little to do with the company’s home division robots. Mostly, it’s been busy prototyping military and industrial robots, the ones that wind up in Afghanistan detecting IEDs and wheeling fearlessly into caves to provide soldiers with an early glimpse of what’s inside. There are currently 4,000 such robots in use by the U.S. military, according to iRobot Director of Communications Matthew Lloyd. They go by names like iRobot 510 PackBot and the iRobot SUGV (Small Unmanned Ground Vehicle). Two of the company’s most recent robots—the 110 FirstLook and 710 Warrior—were prototyped in San Luis Obispo.
iRobot engineer Souliere cites two major factors behind the company’s decision to create a San Luis Obispo branch: Cal Poly and the weather.
The university functions as a talent pool. Souliere estimates that about 80 percent of its local employees come from Cal Poly. And the weather comes in handy when engineers want to test a new model during the winter months, which isn’t really an option at iRobot’s East Coast offices. The Oceano dunes—an environment much like the places these robots could ultimately end up—are a mere 45 minutes away, accessible any time of the year.
The robots developed by iRobot’s industrial and government division weigh between 5 and 347 pounds and travel at speeds of up to 8 miles per hour. They move on treads, which is pretty typical, and pair WALL-E’s charm with the bravado of a military tank. Perhaps most importantly, according to both iRobot and the Department of Defense, they’re saving the lives of American soldiers.
“The statistic is that 51 percent of all injuries in conflict happen as a result of first contact with the enemy, and our job is to make that first contact via robot so that our soldiers don’t get hurt,” Lloyd explained.
It started more than a decade ago with the PackBot. The name was a reflection of one of the robot’s key functional design elements: It could fit into a backpack.
“They realized afterward that the platform was a little too large to really have on somebody’s back on a long hike,” Souliere said. “It kept the name PackBot, and we made smaller versions that fit much better for that purpose.”
Souliere started working for iRobot right out of college, starting at the East Coast branch before relocating to San Luis Obispo. He helped review designs for the FirstLook, a compact throwable robot that’s expected to go on the market in the first quarter of 2012. In iRobot’s promotional videos, soldiers throw the FirstLook into a deteriorated building. The toaster-sized robot suddenly springs into action, pushes itself upright, and begins investigating the space, tumbling down stairs and repeatedly righting itself.
Its operators, or drivers, manipulate a video game controller—a not-so-subtle reminder that the people the military employs are yesteryear’s video gamers. The parents who yelled at their kids to turn off their games may have been wrong; those skills come in handy now that high-tech engineering companies are sinking their talent and money into products that cater to exactly that background.
But it’s the 710 Warrior that’s nearest and dearest to Souliere’s heart. The program was in the works for several years—unlike the FirstLook, which was designed in about six months—first as a government contract that was eventually dissolved, then as an iRobot project.
“The robot’s been developed under the idea of developing a newer heavy-lift capability robot, so it’s actually a larger robot, the largest robot we’re looking to productize right now,” Souliere explained. “It’s capable of lifting very large items up in the 120-pound range.”
And it was designed in our own backyard.
iRobot’s Warrior is only now available for purchase, but the robots are already at work on the other side of the world. At the request of Kansai Electric Power Company, Inc., iRobot sent several 710 Warriors and two PackBots to Japan to help clean up the Fukushima 1 Nuclear Power Plant following the March 2011 disaster. Besides the great PR, the move suggests the Warrior is more than its name implies. It will undoubtedly find its way to the front lines of whatever wars the country happens to be fighting, but it shares something with its Roomba cousin: It can also help clean up humanity’s messes.
Rather than sending a human decked out in a radiation suit into the area, “they’re doing it with a robot,” Souliere said proudly.
The San Luis Obispo office is still abuzz with activity as engineers put the finishing touches on the Warrior and FirstLook. Souliere’s working on developing accessories for the Warrior, customizations that may appeal to potential customers. For the most part, the team is somewhat isolated from the people who wind up using the robots, tossing them into buildings and caves. But they receive letters expressing appreciation.
And then there’s Scooby Doo.
You might call Scooby Doo a fallen warrior. He now resides in the lobby of iRobot headquarters in Massachusetts. How he got there is an interesting story.
“Scooby Doo dismantled 22 explosive devices,” Lloyd explained, “and the last one, there was a soldier who was a little bit scared and triggered an explosive and blew the robot up. They carried the robot back to the base as if it was a wounded soldier, saying, ‘This robot has gotta be fixed.’ They didn’t want a new robot. They wanted that robot.”
Scooby Doo was beyond repair. Presumably, the bereaved soldiers got a new one. But their display of grief highlights the unusual place robots have come to occupy in our minds, not quite human but hardly just an ordinary tool or piece of equipment.
Of course, a company that can successfully produce a military robot receives more than a soldier’s gratitude. iRobot’s revenue for 2011 was more than $450 million, even as the company laid off 55 employees across three branches, including San Luis Obispo, in anticipation of decreased Department of Defense spending in 2012, according to the Boston Herald and a number of business blogs. While the company anticipates cutbacks in the industrial and government division, the home robot division is stronger than ever. If Bekey’s prediction of a robot in every home actually comes to pass, iRobot wants those robots to be its own.
When asked to comment on the robots of the future, Souliere offered a cautious response befitting an engineer at a company that’s made a name for itself producing practical robots. The robots of the future will advance as much as technology allows, traveling as far as battery power enables them to travel. Lloyd was more venturesome, predicting multiple robots in every home, as well as a more autonomous PackBot.
But when asked whether the PackBots’ evolution will parallel that of the drone, Lloyd’s caution returned.
“We’re a company that right now is not developing any weaponized robots,” he said. “There isn’t a demand for them by the military. If the demand increases over time, that will be something we consider. … We are a company that doesn’t believe that a robot should be left to make life and death decisions on its own. People have to stay involved, regardless of the situation.”
Managing Editor Ashley Schwellenbach can be reached at email@example.com.