Study Guide

I, Robot Quotes

By Isaac Asimov

  • Technology and Modernization

    All that had been done in the mid-twentieth century on "calculating machines" had been upset by Robertson and his positronic brain-paths. The miles of relays and photocells had given way to the spongy globe of platinumiridium about the size of a human brain. (Introduction.6)

    Asimov was writing this in the mid-twentieth century, when "calculating machines" really were miles of relays and photocells. So he has to come up with a science fictional variety of technology, which is the positronic brain—and we have to accept that as our premise if we want to see where these stories lead. Also, "platinumiridium" might sound weird, but it's just two regular elements: platinum and iridium.

    In high good-humor the family took a taxi-gyro to the airport (Weston would have preferred using his own private 'gyro, but it was only a two-seater with no room for baggage) and entered the waiting liner. (Robbie.144)

    Yes, robots are the most interesting technology in these stories, but we shouldn't forget that a lot of other things have changed in Asimov's future. And here's one: people aren't traveling by car or train, but by gyro, which probably means a sort of helicopter. Also, we have to admit, when Asimov says "liner," we're not sure exactly what to picture, though we imagine some sort of large passenger craft.

    Even ten years, technologically speaking, meant so much. Compare Speedy with the type of robot they must have had back in 2005. But then, advances in robotics these days were tremendous. (Runaround.11)

    This may be the central storyline of the whole book: robot technology advances from the non-speaking Robbie to the world-controlling Machines. This is a pretty good reminder that technology has a tendency to move forward—even though sometimes we might be fearful about what new technology will bring.

    When these stations were first established to feed solar energy to the planets, they were run by humans. However, the heat, the hard solar radiations, and the electron storms made the post a difficult one. Robots were developed to replace human labor and now only two human executives are required for each station. We are trying to replace even those, and that's where you come in. You're the highest type of robot ever developed and if you show the ability to run this station independently, no human need ever come here again except to bring parts for repairs. (Reason.24)

    Besides giving us an understandable history of human space technology, this passage may hint at one other issue with technology: it often replaces or changes human occupations. Here's a whole space station that might be totally run by robots. What will the people do who used to have these jobs? Will their lives be better now that they aren't exposed to so much radiation?

    There's still the possibility of a mechanical breakdown in the body. That leaves about fifteen hundred condensers, twenty thousand individual electric circuits, five hundred vacuum cells, a thousand relays, and upty-ump thousand other individual pieces of complexity that can be wrong. (Catch that Rabbit.47)

    We're glad we don't have Powell and Donovan's job, because robot technology is complicated. This comes up again in "Liar!" with regard to how complex the robot construction process is. And that's the main reason we bring this quote up: robot minds may seem simple with only three laws, but robot bodies are very complicated. A whole lot of engineering goes into making something like a robot.

    Bogert relaxed into an undignified grin, "She's using lipstick, if that's what you mean." 

    "Hell, I know that. Rouge, powder and eye shadow, too." (Liar.99-100)

    Yes, it's the future, with all sorts of amazing new technologies—robots and taxi-gyros and space stations that beam power to Earth. But women still have lipstick and rouge and eye shadow—there's been no progress in that area. Maybe this comes from the fact that Asimov didn't really know about make-up. But we might argue that it shows how not everything progresses at the same rate. Or maybe a bit of both.

    But these incidents I have written up don't apply much to the modern world. I mean, there was only one mind-reading robot ever developed, and Space-Stations are already outmoded and in disuse, and robot mining is taken for granted. What about interstellar travel? It's only been about twenty years since the hyperatomic motor was invented and it's well known that it was a robotic invention. (Little Lost Robot.9)

    The flip side of technology's advance is this: things are cool at first (space stations!) but then people start taking them for granted (eh, space stations) and we need new cool technology (hyperatomic drives!). In fact, this is kind of the interviewer in a nutshell—he's a young guy who was born into a world with robots, so he takes them for granted; whereas Calvin saw robots from their first humble beginnings and so doesn't take them for granted.

    "You see, sir, Consolidated's machines, their Super-Thinker among them, are built without personality. They go in for functionalism, you know—they have to, without U. S. Robot's basic patents for the emotional brain paths. Their Thinker is merely a calculating machine on a grand scale, and a dilemma ruins it instantly." (Escape.30-1)

    Near the end of the book, we finally get a little explanation of why the robots here are so emotional—it's part of their positronic brain (a patent owned by US Robots). But also, we see that this bit of technology isn't just for fun: US Robots gives its robots emotions and personalities because those emotional brain paths turn out to be useful: they let a robot handle a dilemma that would ruin a pure calculating machine.

    "By using human ova and hormone control, one can grow human flesh and skin over a skeleton of porous silicone plastics that would defy external examination. The eyes, the hair, the skin would be really human, not humanoid. And if you put a positronic brain, and such other gadgets as you might desire inside, you have a humanoid robot." (Evidence.157)

    Lanning explains to Quinn how you could make a robot look like a human, which is not exactly a comfortable technological possibility. We pulled this quote for that reason: it's a reminder that technology can make things possible that we might want to avoid. But also, it's a little glimpse into other technological advances: we don't just have robots in this future, we have "hormone control" that can be used to grow a human-like body over "silicone plastics."

    "And that is all," said Dr. Calvin, rising. "I saw it from the beginning, when the poor robots couldn't speak, to the end, when they stand between mankind and destruction. I will see no more. My life is over. You will see what comes next." (Evitable Conflict.229)

    Calvin may be useful as a frame narrator for this book since she's been around nearly from the beginning. (In fact, all the people who have been around as long as she has have passed away—the original Robertson, Lanning, Bogert.) She helps maintain our sense of technological progress without taking anything for granted or being too shocked about the whole thing.

  • Morality and Ethics

    "They're a cleaner better breed than we are." (Introduction.32)

    This is Calvin's final opinion on robots, and we'll hear this a lot in the last two stories, "Evidence" and "The Evitable Conflict." Notice that we get this idea up front, in the very introduction. It's like Asimov doesn't want us to miss it: robots are fundamentally good.

    You know that it is impossible for a robot to harm a human being; that long before enough can go wrong to alter that First Law, a robot would be completely inoperable. It's a mathematical impossibility. (Robbie.78)

    Here we have some explanation as to why robots are better than people: because they're designed to be; or rather, because a robot couldn't work if it wasn't designed to be good.

    "So Rule 3 has been strengthened—that was specifically mentioned, by the way, in the advance notices on the SPD models—so that his allergy to danger is unusually high." (Runaround.148)

    Speedy is as moral as the other robots we meet. But his desire for self-preservation (Third Law) nearly gets Powell and Donovan killed. It's not that protecting himself is a bad thing to do; it's just that there's something of a conflict between his needs and his orders. Not to mention that this whole conflict comes about because Speedy doesn't understand the situation; if Powell and Donovan had told him they needed the selenium to live, he would've gotten it very quickly. No matter how moral he is, Speedy doesn't know everything.

    Cutie said nothing, nor did any other robot, but Donovan became aware of a sudden heightening of tension. The cold, staring eyes deepened their crimson, and Cutie seemed stiffer than ever. "Sacrilege," he whispered—voice metallic with emotion. (Reason.96-7)

    In "Reason," the robots still have good morals—they don't want people getting hurt, which is why they take over the space station. But there's some strangeness in the way they do it, inventing a religion based on the Power Converter and coming up with some new rituals and taboos. Here Donovan has performed something sacrilegious. There's not a lot of religion in this book, but it's worth asking: what's the relationship between religion and morality in this book?

    And, finally, worked his precise mechanical mind over the highest function of the robot world—the solutions of problems in judgment and ethics. (Catch that Rabbit.38)

    We love the way Asimov works up to this. First, Powell and Donovan test Dave's math (which is normal, what we'd expect to do with a calculator), then they test his physical reactions (like you'd test somebody's reflexes, which is a little weird with robots, but understandable since robots have bodies), and finally, they test… his moral reasoning. That's something that would seem really strange for robots in other science fiction works, but Asimov builds up to it in a way that makes it seem reasonable.

    "You can't tell them," droned the psychologist slowly, "because that would hurt and you mustn't hurt. But if you don't tell them, you hurt, so you must tell them. And if you do, you will hurt and you mustn't, so you can't tell them; but if you don't, you hurt, so you must; but if you do, you hurt, so you mustn't; but if you don't, you hurt, so you must; but if you do, you—" (Liar.258)

    Ah, Herbie. Other robots get caught up between two Rules, but Herbie is caught by just one, so there's no way out for him—he's damned if he tells and he's damned if he doesn't tell. Luckily, Calvin is here to help talk him through this problem. Wait, did we say talk him through? No, she's driving him insane on purpose, out of revenge. So Herbie is caught because he doesn't want to hurt anyone and Calvin is purposely hurting him. Just another reminder that robots are better than people.

    "Positronic brains were constructed that contained the positive aspect only of the Law, which in them reads: 'No robot may harm a human being.' That is all. They have no compulsion to prevent one coming to harm through an extraneous agency such as gamma rays." (Little Lost Robot.50)

    The Nestors that get used at Hyper Base have a slightly altered sense of morality: they can't hurt people, but they can let them get hurt. The government thought they needed these robots for their research; but Calvin knows that pulling out one element of the Three Laws means the system will be unstable, like a game of Jenga. So the Three Laws aren't just individual moral ideas, but a whole system that works together.

    "Now that they've managed to foul theirs up, we have a clear field. That's the nub, the... uh... motivation. It will take them six years at least to build another and they're sunk, unless they can break ours, too, with the same problem." (Escape.16)

    Consolidated broke their super-computer and now they want to destroy the super-computer of US Robots. Why? Just because they're competitors. Perhaps this reminds us that humans are not as moral as robots, especially if they're in business.

    "Because, if you stop to think of it, the three Rules of Robotics are the essential guiding principles of a good many of the world's ethical systems. … To put it simply—if Byerley follows all the Rules of Robotics, he may be a robot, and may simply be a very good man." (Evidence.138)

    This is one of Calvin's main ideas in this book: robots are more moral than humans because they have to follow the Three Laws. Robots aren't simply moral because they have to follow rules, but because the rules they follow are moral guiding principles, the same ones that people should follow.

    "Think about the Machines for a while, Stephen. They are robots, and they follow the First Law. But the Machines work not for any single human being, but for all humanity, so that the First Law becomes: 'No Machine may harm humanity; or, through inaction, allow humanity to come to harm.'" (Evitable Conflict.212)

    In later robot books, Asimov's robots come up with something called the Zeroth Law, which is basically this idea here: robots have to prevent harm not just to individuals, but to all of humanity. Now, this raises some problems in this story, since in order to help people in "The Evitable Conflict," the Machines have to slightly hurt some individuals, those who belong to the Society for Humanity. So what do you think—is it more moral to help many by hurting a few?

  • Fear

    "Afterward, they became more human and opposition began. The labor unions, of course, naturally opposed robot competition for human jobs, and various segments of religious opinion had their superstitious objections." (Introduction.35)

    We have robots today that don't look like us and work in factories, and some labor unions have worried (understandably) about lost jobs; but notice in Asimov's world, people only really start to worry about robots when they "became more human." Why is it that people in these stories fear robots when they start to look more human?

    Gloria's mother, however, was a source of uneasiness to Robbie and there was always the impulse to sneak away from her sight. (Robbie.49)

    Most of the fear in this book stems from people being afraid of robots, so it's interesting to us that one of the first stories involves the opposite: it's true that Grace Weston is afraid of Robbie, but Robbie is a little afraid of her as well. And this is one of our first robots, so already we see that Asimov's robots are emotional.

    The little dots that marked the position formed a rough circle about the red cross of the selenium pool. And Powell's fingers went to his brown mustache, the unfailing signal of anxiety. (Runaround.22)

    Powell is calm and Donovan is passionate most of the time, but even Powell gets a little upset when he's facing death. On one hand, this may be Asimov's way of telling us that fear isn't a terrible thing—it can sometimes be a useful emotion since it tells us that there's a problem. On the other hand, let's note that anxiety doesn't really help Powell here. Only science will help.

    Mathematical squiggles on paper were not always the most comforting protection against robotic fact. (Reason.6)

    Generally, in this book, the people who are afraid of robots aren't the smart people who work with them. Take Mrs. Weston for example—she may be smart, but she doesn't understand robotics. Sometimes Powell and Donovan (and Calvin, especially in "Escape!") get worried about robots, too. So how should we feel when robot specialists are occasionally afraid of robots?

    Donovan pounded the desk, "But, Greg, he only goes wrong when we're not around. There's something—sinister—about—that." (Catch that Rabbit.50)

    Again, Donovan gets fearful, even though he's a robot specialist. It turns out he's just being ridiculous—but do we know that when we start this story? Again, Asimov fakes us out by offering a "robot rebellion" possibility that is totally untrue.

    "Having it walking beside me, calmly peering into my thoughts and picking and choosing among them gave me the willies." (Liar.18)

    Having a robot read your mind might be frightening, as Milton Ashe and Calvin agree. In fact, this might be the first really scary robot we see: a robot who is not only better than us at something we can do (like math), but does something that we can't do at all.

    "Susan," said Bogert, with an air of sympathetic amusement. "I'll admit that this Frankenstein Complex you're exhibiting has a certain justification—hence the First Law in the first, place." (Little Lost Robot.66)

    "Little Lost Robot" is so interesting to us because it's the only story in this book that has a dangerous robot. In the rest of the stories, Asimov shows us how robots aren't monsters and shouldn't feared. But in this story, he has the most intelligent character in the book, Susan Calvin, express fear.

    "Listen, this junk about the space-warp knocked out Consolidated's robot, and the longhairs said it was because interstellar travel killed humans. Which robot are you going to trust?" (Escape.170)

    Once again, Powell and Donovan are on the front line of robot testing, and this means they're in some danger. Now, they shouldn't be, because robots should be incapable of hurting people. But once again, since these robots are doing something new (figuring out the warp drive), there's always a danger that the Three Laws won't hold the way they normally do.

    "Why not prove it? Or would you still rather try to prove it to the public?" (Evidence.56)

    Quinn motivates Lanning to get involved through the threat of making his accusations public. His accusations would just be rumors, but Lanning is still afraid that people will react negatively to those rumors. In other words, Lanning is fearful that people will be more afraid of robots if Quinn says that some people might be robots. You could imagine that people would fear a Battlestar Galactica type of situation with robots infiltrating human society. Eek.

    "That such small unbalances in the perfection of our system of supply and demand, as I have mentioned, may be the first step towards the final war." (Evitable Conflict.18)

    Luckily, we know by the end of this story that Byerley is wrong and that robots aren't trying to kill us all. (That would be the Terminator situation, if you're keeping track of what science fiction films this story almost is.) We don't want to make fun of him, exactly, but it does seem like an awfully big jump from "small unbalances" in the economy to "the final war." It must just be something about robots, though: everyone always expects them to kill all humans.

  • Foolishness and Folly

    "It was all quite ridiculous and quite useless. And yet there it was." (Introduction.35)

    This is Calvin talking about human opposition to robots. Notice how this human folly is both a) silly and b) ineffective. If it were only one of those things—say, if people were right to oppose robots but still ineffective, or if people were silly to oppose robots but effective in their opposition—then this wouldn't be folly, but tragedy.

    "Most of the villagers consider Robbie dangerous. Children aren't allowed to go near our place in the evenings." (Robbie.88)

    Grace Weston is fearful and foolish (though we should add, not really villainous), but we should note that she isn't alone. US Robots has trouble because lots of people are foolishly worried about robots. How can you deal with popular prejudice like this?

    And his last words as he receded into the distance were, "There grew a little flower 'neath a great oak tree," followed by a curious metallic clicking that might have been a robotic equivalent of a hiccup. (Runaround.119)

    Speedy is the robotic equivalent of drunk. And this might be the most foolish a robot ever gets in these stories. Most of the foolishness in this book is humans doing the wrong thing. When a robot is foolish, it's because its brain doesn't work right (although maybe that's what it means when a human is being foolish as well).

    "Call it intuition. That's all it is so far. But I intend to reason it out, though. A chain of valid reasoning can end only with the determination of truth, and I'll stick till I get there." (Reason.11)

    Why is Cutie being foolish here? The big reason is that he ignores the evidence and is only interested in reason. (Which is to say, he's not very scientific about the issue.) But the other reason why we call him foolish is that he starts from a position that has nothing to do with reason; after all, "intuition" is almost the opposite of reason.

    Donovan was back with the suits, "They've gone jingo on us, Greg. That's a military march." (Catch that Rabbit.69)

    OK, we included something like this under the section on Fear, but we had to include it under Foolishness as well. Because, honestly, how does Donovan get from "military march" to "robot rebellion"? We mean, not everyone who wears a uniform or marches is a soldier. Usually, you know who the enemy soldiers are because they're the ones shooting at you. In other words, Donovan is getting worried about the totally wrong thing here.

    "There's irony in three of the greatest experts in robotics in the world falling into the same elementary trap, isn't there?" (Liar.216)

    "Liar!" may be one of our favorite stories because everyone looks bad in it: Milton Ashe doesn't realize how he's hurting Calvin; Bogert and Lanning don't realize what Herbie is doing; and Herbie can't figure out that what he's doing is going to cause more harm. But Calvin probably ends up looking the most foolish because she's so close to getting it, like when she notes that she always used to pretend that Ashe's girlfriend was his cousin, which was just what Herbie told her (77-8). Of course, she figures it out first, and that's why we love her also—most foolish but also smartest scientist in the room.

    Bogert saw her politely to the door and grimaced eloquently when she left. He saw no reason to change his perennial opinion of her as a sour and fidgety frustration. Susan Calvin's train of thought did not include Bogert in the least. She had dismissed him years ago as a smooth and pretentious sleekness. (Little Lost Robot.70-1)

    Unlike Powell and Donovan (who are foolish in a funny way and fight just for fun), Calvin and Bogert make a terrible team because they really seem to dislike each other. Now, we love Calvin—she's really the smartest person in the book and is almost always right—so it's clear that we think Bogert is being foolish. But Calvin is also being a little foolish here. Bogert may be many things, but he's also a pretty good mathematician and a pretty good negotiator. If they worked better together, they would have an easier time with this mystery; instead, their personal hostility gets in the way.

    She brought it out calmly, "He developed a sense of humor—it's an escape, you see, a method of partial escape from reality. He became a practical joker." (Escape.322)

    We're big fans of folly here at Shmoop, so we're glad that we finally get an example of folly that's actually somewhat useful. In order to figure out how to make a hyperatomic drive, the Brain had to find a loophole in his moral code, and to cover that loophole, he becomes…humorous. Actually, we think all his jokes are pretty silly, but he does have the mind of a child. At least with this bit of foolishness, he was able to solve a serious mystery without hurting anyone permanently.

    "He won't open Byerley," said Calvin, disdainfully. "Byerley is as clever as Quinn, at the very least." (Evidence.164)

    Quinn is supposed to be a master manipulator in "Evidence," and he certainly manipulates Lanning. But Calvin points out that Byerley is smarter than Quinn. In fact, Quinn gets tripped up by his own trick in this story, which is pretty foolish. So not all foolishness in I, Robot has to do with robots at all…unless Byerley is a robot, that is.

    "The task of the human brain remains what it has always been; that of discovering new data to be analyzed, and of devising new concepts to be tested. A pity the Society for Humanity won't understand that." (Evitable Conflict.177)

    Hiram Mackenzie points out why the Society for Humanity is foolish: they think robots are replacements for humans rather than companions. Now, we only meet Mackenzie for a moment, so we don't know if he's as smart as Calvin, but he seems to have a good point here: robots can do some things, but not all that a human can do. So the Society for Humanity has a foolish worry.

  • Language and Communication

    Robbie made a semi-circle in the air with one finger. (Robbie.40)

    Robbie can't talk, but notice that he has no problem communicating with Gloria. Our later robots can talk, but let's not forget that there are other ways to communicate.

    "At the same time, when you sent him out after the selenium, you gave him his order casually and without special emphasis, so that the Rule 2 potential set-up was rather weak." (Runaround.148)

    This is half of Speedy's problem: a decreased sense of the Second Law because Donovan gave the order too casually (the other half involves his increased sense of the Third Law). See, Asimov's robots understand English, but English can be a little ambiguous. (We'll see this issue pop up again in "Little Lost Robot," with strong language.)

    Cutie laughed. It was a very inhuman laugh—the most machine-like utterance he had yet given vent to. It was sharp and explosive, as regular as a metronome and as uninflected. (Reason.55)

    You could do a whole (awesome) paper on how robots speak in these stories, starting from un-speaking Robbie to playful and warm-sounding robots. But Cutie is still a pretty early robot and he still has a robotic voice. See also "His voice carried the cold timbre inseparable from a metallic diaphragm" (7).

    "I sensed that an order was sent, but there was never time to receive it." (Catch that Rabbit.127)

    The "positronic field" is a bit of technobabble (not even Powell and Donovan know how it works). But it's a useful reminder to us how close "language" is related to technology. That is, even writing was an invention, as were the printing press and Twitter. Communication tends to be tied to technology, and new technologies may bring new opportunities and problems. So, in the case of the "positronic field," the robots are presented with a weird situation: they sense an order, but don't receive it.

    "It's your fiction that interests me. Your studies of the interplay of human motives and emotions"—his mighty hand gestured vaguely as he sought the proper words. (Liar.40)

    Herbie isn't human, so he turns to novels when he tries to understand human thoughts and feelings. This is another example of a different method of communication: Robbie can't talk, so he makes motions; Herbie needs to find out how people feel, so he reads novels.

    "You told him to go away?" asked Dr. Calvin with sharp interest. "In just those words? Did you say 'Go away'? Try to remember the exact words." (Little Lost Robot.90)

    As in "Runaround," part of the story of "Little Lost Robot" is not just what people say, but how they say it. Gerald Black didn't just tell Nestor-10 to leave him alone—he used a lot of swear words and harsh language. So robots aren't just sensitive to what is said; they're also sensitive to how it's said.

    "Original impressionment is not everything," Calvin snarled at him. "Robots have learning capacity, you... you fool—" And Bogert knew that she had really lost her temper. She continued hastily, "Don't you suppose he could tell from the tone used that the words weren't complimentary? Don't you suppose he's heard the words used before and noted upon what occasions?" (Little Lost Robot.124)

    We pulled this quote because we love that Calvin's version of "strong language" is to call someone a fool, but also because Calvin lays out clearly what we were talking about in the previous quote: robots understand language, but they also understand other aspects of communication through speech—they understand slang and tone. If you've ever tried to communicate with a machine on the phone (you know, the type that says, "say 'pay bill' if you want to pay a bill" and then never hears you correctly), you can see how advanced these robots are.

    "When we come to a sheet which means damage, even maybe death, don't get excited. You see, Brain, in this case, we don't mind—not even about death; we don't mind at all." (Escape.48)

    OK, so Calvin is usually right and smart, even if it takes her some time to figure out what's going on. But here she falls into the same problems that keep fouling up people's plans with robots: she's not careful about what she says. She tells Brain that humans don't mind death, so Brain figures that it's OK to kill humans temporarily. Of course, this ends with a positive turn of events—thanks to this, Brain figures out the hyperatomic drive.

    And Stephen Byerley, tight-lipped, in the face of thousands who watched in person and the millions who watched by screen, drew back his fist and caught the man crackingly upon the chin. (Evidence.268)

    With robots, it helps to keep in mind that there are ways to communicate besides what we say. For instance, Speedy and Nestor-10 both pay attention to how people say commands. But with humans, there are also other ways to communicate, and Byerley has just demonstrated one of the most basic forms of communication: hitting someone. This isn't just a regular punch, though; this punch communicates a lot of information about Byerley being human (maybe).

    "Do you remember the Machine's own statement when you presented the problem to him? It was: 'The matter admits of no explanation.' The Machine did not say there was no explanation, or that it could determine no explanation. It simply was not going to admit any explanation." (Evitable Conflict.220)

    This may be the final evolution of robot language—not only can they understand human tone and slang, but they can be purposely ambiguous in what they say. They're just like us that way.

  • Choices

    "Robbie was constructed for only one purpose really—to be the companion of a little child. His entire 'mentality' has been created for the purpose. He just can't help being faithful and loving and kind. He's a machine—made so. That's more than you can say for humans." (Robbie.76)

    We've quoted this before and we'll do so again because we love this line. It really helps express how the robots are designed to be good. Robbie doesn't really have a choice in the matter—he has to be good. There's something ironic in the way that Asimov turns something that sounds negative ("he's a machine") into something positive ("He just can't help being faithful and loving and kind").

    "Hold on, Greg. There are human rules of behavior, too. You don't go out there just like that. Figure out a lottery, and give me my chance." (Runaround.210)

    At the end of "Runaround," Powell makes a choice to risk his life to save his friend—which is the same thing that Donovan wants to do. Donovan doesn't get to make that choice because Powell has already made the choice for them both. This is a strange moment where human choices come into conflict. Awkward.

    "I merely kept all dials at equilibrium in accordance with the will of the Master." (Reason.211)

    We laugh at this because Cutie is doing something he thinks is religious but we think is engineering. What's interesting to us is that Cutie doesn't exactly have a choice about what he does—he has to protect humans. But he does have a choice about how he thinks about it. What a rebel. Religious fervor never looked so technical.

    "How is a robot different when humans are not present? The answer is obvious. There is a larger requirement of personal initiative." (Catch that Rabbit.100)

    Dave's problem in "Catch that Rabbit" is totally an issue of choice. When humans aren't around, Dave has to make his own decisions, and he can't do it. (Actually, we learn that the trouble comes when he has to give orders to all six of his subordinate robots at once, but the root issue still seems to be making choices.) This is sort of a weird position for a robot to be in, since we usually think of a robot, constrained by the Three Laws, as not making any choices at all.

    "Have you—told anyone?"

    "Of course not!" This, with genuine surprise. "No one has asked me." (Liar.47-8)

    We love Herbie's response here. It seems as if he promised not to tell Calvin's secret to anyone, as if he were a friend. But of course the issue isn't friendship—it's just that no one asked. If Herbie thought that the person who asked would be hurt if he didn't tell, of course Herbie would tell Calvin's secret. Herbie isn't making a choice about this; he's acting according to the Three Laws of Robotics.

    "He looks deeper than the skin, and admires intellect in others. Milton Ashe is not the type to marry a head of hair and a pair of eyes." (Liar.71)

    This is Herbie talking about Milton Ashe (and by that, we mean lying about Milton Ashe). Lying here isn't a choice for Herbie—it's what he feels he has to do. But we pulled this quote because of the choice Milton makes when he picks the pretty but dumb woman over Calvin. Or is it a choice? How much free will do the humans in these stories have?

    In human beings, voluntary action is much slower than reflex action. But that's not the case with robots; with them it is merely a question of freedom of choice, otherwise the speeds of free and forced action are much the same. (Little Lost Robot.164)

    We don't usually think of robots as having a lot of choices—they have to follow the Three Laws. (Although there actually is a lot of leeway in how those laws are interpreted and put into practice.) But "Little Lost Robot" shows us a bunch of robots making choices, and here we see something funny about robots: their voluntary choices may look a lot like their involuntary reflexes. So how can we tell them apart?

    "No, but all I eat on ships are beans. Something else would be first choice." His hand hovered and selected a shining elliptical can whose flatness seemed reminiscent of salmon or similar delicacy. It opened at the proper pressure. (Escape.198)

    Beans, beans, the magical fruit, the more you eat the more you…wait, we'll let you finish that one. The point is, having only beans to eat while you're stuck in a spaceship sounds like it could get a little gassy. This is Brain's idea of a joke, but we wonder if this is really a joke about choice: Donovan tries to choose something other than beans, but he can't, because all Brain has given them are beans. If you wanted to, you could make a connection between Brain's adherence to the Three Laws and the choices (or lack of choice) that he gives to Powell and Donovan.

    "And the change from nations to Regions, which has stabilized our economy and brought about what amounts to a Golden Age, when this century is compared with the last, was also brought about by our robots." (Evidence.3)

    Today, this might seem like a strange choice: nations do join regional partnerships (such as NATO or the European Union), but it's hard to imagine nations choosing to totally merge into regions. And this was probably even harder to imagine in Asimov's day. But in these stories, this is what happens, thanks to the robots. People might not choose to join their nations together, but the robots help people out. This foreshadows what happens in "The Evitable Conflict," what with robots helping people to make good choices.

    "It [humanity] was always at the mercy of economic and sociological forces it did not understand—at the whims of climate, and the fortunes of war." (Evitable Conflict.225)

    Byerley is worried about the idea that the Machines have taken choices out of the hands of humanity. But Calvin points out that humans have never totally had the freedom to choose. According to her, we've always been subject to forces beyond our control. If that's true, then humans are in the same situation as robots—constrained by certain laws and forces. If you've ever been struck by lightning or had an unexpected rainstorm ruin your new hairdo, you know what we're talking about.

  • Friendship

    "There was a time when humanity faced the universe alone and without a friend. Now he has creatures to help him; stronger creatures than himself, more faithful, more useful, and absolutely devoted to him. Mankind is no longer alone." (Introduction.30)

    Would you say that the robots in these stories are our friends? Notice that Calvin first implies that robots are our friends by saying that there was a time when we were "without a friend"—but she doesn't reuse the word "friend." Certainly robots are helpful and faithful, but is that enough to make a friend?

    "He was a person just like you and me and he was my friend." (Robbie.119)

    Well, we might hesitate to call robots our friends, but Gloria doesn't hesitate to call Robbie both a friend and a person. Notice that Asimov puts this line into the mouth of a child, so we might dismiss her claim. But at the same time, we saw them play, and Robbie does really seem like a friend.

    Of course, the damn fool had worked out the cube of fourteen in advance, and on purpose. Just like him. (Runaround.212)

    What kills us about this quote is not just that Powell and Donovan are each willing to risk their lives to save the other. What gets us is that final "just like him"—which is clearly Donovan's thought and a reminder that they are friends who have spent a lot of time together and know each other's personalities.

    It is impossible to perform any act of kindness toward you two. Always the same phantasm! (Reason.211)

    We don't quite think Cutie is a friend to Powell and Donovan, but here he is at least trying to be friendly and share some info with Powell and Donovan (info about how the space station's power beam worked during the electron storm). As usual, the problem is that Powell and Donovan refuse to believe in Cutie's religion, so Cutie can't really be friends with them. This sure makes it seem like friends have to share at least a basic view of the world.

    Powell regarded Dave—laymen might think of robots by their serial numbers; roboticists never—with approval. (Catch that Rabbit.22)

    In the very next paragraph, Powell tells Dave, "you're a good fellow" (23). Once again, we're not sure we'd want to describe this relationship as a "friendship." But this is not quite the relationship between a repairman and a toaster—repairmen usually don't tell toasters that they are "good fellows." Robots may be tools for people to use, but roboticists have a different, almost friendly relationship with them. Even when they're malfunctioning, as Dave is.

    "Oh, I'd just as soon, I'm just busting to tell someone—and you're just about the best—er—confidante I could find here." (Liar.173)

    We don't get to see a lot of human-human friendship in Asimov stories, and certainly not for Calvin. (Not until she meets Stephen Byerley, at least—although he might be a robot, so that would explain why she gets along so well with him.) And this isn't a friendship; this is one of those tragic mismatches, where Calvin wants to be more than friends. Ashe might not know how she feels about him, but he definitely knows that their relationship isn't totally normal—check out that bumble he makes before he finds the word "confidante."

    "Destroy all sixty-three," said the robopsychologist coldly and flatly, "and make an end of it." (Little Lost Robot.43)

    Calvin may be the human who is most friendly with robots. Bogert does say she's like a sister to them (161). That's why this story is so frightening to us: robots seem so human and Calvin seems so friendly to them. But her first reaction to a possibly dangerous robot is to destroy 62 innocent robots. We guess we can't really call Calvin a friend to robots, can we?

    "That," said Donovan, bitterly, "is news to me. I was just beginning to have a very swell time, when you told me." (Escape.159-160)

    If you didn't know any better, you might think that Powell and Donovan hate each other. But really, since we know they're friends, we can see their behavior as the sort of bantering/fighting that friends engage in. Are there other clues that they're friends?

    Two strong arms lifted John from the wheel chair. Gently, almost caressingly, Byerley's arms went around the shoulders and under the swathed legs of the cripple. (Evidence.114)

    Stephen Byerley might be a robot created by John Byerley, but that doesn't mean they can't be friends. Once again, we're not sure that this is a "friendship," but there are a lot of feelings in this relationship (as demonstrated here) that seem friend-like: warmth, respect, care, attention. John may have built Stephen, but they seem like equals when they talk about their plan to beat Quinn.

    "For this evening, Stephen, you may talk how you please and of what you please, provided you tell me first what you intend to prove." (Evitable Conflict.17)

    Susan Calvin doesn't have a lot of human friends through most of the story. Her best friend in the book might be Stephen Byerley (who may or may not be human), but check out this quote—it seems like a mix of friendly ("you may talk how you please") and professional ("tell me first what you intend to prove"). We may love Calvin and think she's great at her job, but she's not the kind of person you would probably want to hang out with all the time. She's a good robopsychologist, but maybe not a good friend.

  • Power

    "Now he has creatures to help him; stronger creatures than himself, more faithful, more useful, and absolutely devoted to him." (Introduction.30)

    Up in our "Morality and Ethics" section, Calvin notes that robots are better than people morally, but they're also better than humans on a physical level. Still, even though robots are superior to people ("stronger… more useful"), they're totally under our power ("faithful… devoted"). This issue comes up in "Little Lost Robot," and it's an issue we might want to think about: what does it mean for something superior to be totally dominated by an inferior?

    And yet he loved his wife—and what was worse, his wife knew it. George Weston, after all, was only a man—poor thing… (Robbie.94)

    In these stories, Power comes in different forms. And here's one of them: George Weston may have certain power over his family (this is, after all, a story written in the 1940s, when people had certain ideas about how wives were supposed to be subservient to their husbands). But Grace Weston has some power over George, in part because he seems like a nice guy who wants his wife to be happy.

    "We can't go after Speedy ourselves, Mike—not on the Sunside. Even the new insosuits aren't good for more than twenty minutes in direct sunlight." (Runaround.30)

    A nice reminder that the most powerful thing on Mercury is…Mercury itself. (Or, technically, the sun, which is very close to Mercury.) The environment is more powerful than human bodies or even human technology. Even Speedy is threatened by an unforeseen chemical interaction.

    "You're inferior creatures, with poor reasoning faculties, but I really feel a sort of affection for you." (Reason.122)

    Cutie isn't exactly wrong about robots being superior. As he lays out in "Reason," robots are stronger, immune to radiation that would hurt people, and don't need to sleep. In terms of raw power, robots have more than humans. And yet, raw power isn't everything (as we saw earlier with the Westons). Even if humans are inferior, Cutie still feels some affection for them. So when they get in the way, he doesn't crush them, even though he could. How nice of him.

    "There are six others under him in an extreme regimentation. He's got life and death power over those subsidiary robots and it must react on his mentality." (Catch that Rabbit.91)

    This is Donovan overreacting, but he raises some interesting questions. Dave does have power over his subordinates—does having that sort of power change a person/robot? Well, we would say yes if Dave were a person, but as a robot, Dave is good, so it seems as if having power couldn't change him.

    Lanning found his voice and let it out with a roar. "You're suspended, d'ye hear? You're relieved of all duties. You're broken, do you understand?" (Liar.165)

    Here's a relatively straightforward example of power in I, Robot: Lanning is the boss and he's yelling at his employee, Bogert. So even when we have robots to help us do our work, bosses will still yell at workers. We guess some things will never change.

    "All normal life, Peter, consciously or otherwise, resents domination. If the domination is by an inferior, or by a supposed inferior, the resentment becomes stronger. Physically, and, to an extent, mentally, a robot—any robot—is superior to human beings." (Little Lost Robot.65)

    We think this is the central issue of Power in I, Robot: robots are superior in almost every way (smarter, stronger, more moral), but humans are still in control. Calvin thinks that this situation would lead to resentment and hate on the part of the robots, but luckily we have the First Law to protect us—so long as the robots are stable and built with all the Three Laws.

    "But even so," insisted Calvin, "we couldn't take chances. Listen, from now on, no one is to as much as breathe to The Brain. I'm taking over." (Escape.66)

    Calvin gets bossed around some in this book. She's an employee of Robertson and Lanning and she's under the command of the military in "Little Lost Robot." But she's also a brilliant scientist who commands a certain amount of respect (and more so as her career goes on), so she has some power to throw around. Of course, she only uses her power for good, right?

    …Quinn neither ran for office nor canvassed for votes, made no speeches and stuffed no ballot boxes. Any more than Napoleon pulled a trigger at Austerlitz. (Evidence.9)

    Quinn isn't a politician, but he has a lot of political power. It's funny that Asimov doesn't tell us a lot about Calvin's job, but he spends some time talking about Quinn's. It's as if Asimov wants us to understand just how Quinn has this behind-the-scenes kind of power. This sets up Quinn as someone who is not to be trusted, so we probably don't mind when he loses.

    "Why, Stephen, if I am right, it means that the Machine is conducting our future for us not only simply in direct answer to our direct questions, but in general answer to the world situation and to human psychology as a whole." (Evitable Conflict.222)

    The robots have all the power, everyone run for the hills. Actually, the Machines may have grabbed some power, but this isn't necessarily a bad thing. We might worry when robots grab power, but don't they have our best interest in mind?

  • Science

    In 2008, she obtained her Ph.D. and joined United States Robots as a "Robopsychologist," becoming the first great practitioner of a new science. (Introduction.8)

    Robopsychology (which doesn't exist, at least not yet) is probably the most important science in this book, and we might remember that Calvin is not the first robopsychologist—but she is the "first great practitioner." Which helps remind us that she's almost always right in these stories.

    "But something might go wrong. Some— some—" Mrs. Weston was a bit hazy about the insides of a robot, "some little jigger will come loose and the awful thing will go berserk and— and—" (Robbie.77)

    Here's Asimov's hint to us that we shouldn't worry about what Mrs. Weston worries about. Notice that Asimov interrupts Mrs. Weston's speech to remind us that she doesn't know what she's talking about. He really wants us to not be on her side here.

    "It's just a case of remembering that oxalic acid on heating decomposes into carbon dioxide, water, and good old carbon monoxide. College chem, you know." (Runaround.168)

    This doesn't help Powell and Donovan in the long run. They may know college chemistry, but the science that's really important here is robotics and Powell notes that he's "not a robot specialist" (Runaround.152). Well, that last part doesn't seem entirely true to us, but it's a good reminder that the science that matters here is robotics.

    "I accept nothing on authority. A hypothesis must be backed by reason, or else it is worthless—and it goes against all the dictates of logic to suppose that you made me." (Reason.53)

    We often think of science as being opposed to religion—even Scientology isn't all that scientific. But here Cutie lays out one of the most important principles of science: "accept nothing on authority." In fact, that's almost the motto of the Royal Society, a British scientific organization. So here's Cutie, who will end up founding a religion, and he's sounding a little scientific if you ask us.

    "Now—there isn't a roboticist back at United States Robots that knows what a positronic field is or how it works. And neither do I. Neither do you." (Catch that Rabbit.11)

    Asimov made up positronic brains, but in these stories, there are scientists who understand those brains. For instance, in the Introduction, Calvin is described as being able to figure out how positronic brains work. But no one knows how a "positronic field" works. How do you react when you read that? On one hand, it seems like a reminder that science moves forward; on the other hand, it seems like a bit of foreshadowing for "The Evitable Conflict," when the Machines are too complex for the humans to check.

    "Right! And if you'll notice, he's been working on your time integration of Equation 22. It comes"—Lanning tapped a yellow fingernail upon the last step—"to the identical conclusion I did, and in a quarter the time. You had no right to neglect the Linger Effect in positronic bombardment." (Liar.147)

    Lanning and Bogert don't really get along. In fact, they even fight about science and math. We might expect a discussion of science to be calm and rational and maybe a little boring. But Asimov was a scientist and he knew that real scientists fought about science all the time.

    "We run the risk continually of blowing a hole in normal space-time fabric and dropping right out of the universe, asteroid and all. Sounds screwy, doesn't it? Naturally, you're on edge sometimes." (Little Lost Robot.82)

    We've said before that the hyperatomic drive project reminds us of the Manhattan Project that built the atomic bomb, and especially here, when Gerald Black explains how dangerous the project may be. It's a reminder that science promises certain advances, but may, in fact, lead to danger.

    Dr. Alfred Lanning viewed the proceedings with faint scorn—his usual reaction to the doings of the vastly better-paid business and sales divisions. (Escape.19)

    This seems like the classic relationship between the research department and the business department. Lanning cares about the science but doesn't care about the business side of things. We don't see a lot of human-human interactions, but it's interesting to note that many of the interactions we see are not particularly friendly.

    "He has never been seen to eat or drink. Never! Do you understand the significance of the word? Not rarely, but never!" (Evidence.27)

    This is Quinn's evidence that Byerley may be a robot and it's terrible evidence. This is a classic bit of bad science. As your science teacher may have told you, "absence of evidence is not evidence of absence." Unfortunately, Quinn isn't a scientist, so he takes absence of evidence as evidence of something strange. It reminds us of "Robbie," when Grace Weston doesn't understand how robots work. There may be several scientist characters here, but not everyone in the book is a scientist. And if a scientist opposes a non-scientist, we're probably going to place our bets on the scientist.

    "Every action by any executive which does not follow the exact directions of the Machine he is working with becomes part of the data for the next problem. … The Machine knows, Stephen!" (Evitable Conflict.208)

    Calvin lays out for Byerley how the Machines cannot be fooled—any time a human disobeys the Machine, the Machine takes that into account. The Machines may be the perfect administrators because they take everything into account, like the most brilliant scientists. In a way, the Machines have to be human psychologists, just as the human Calvin has to be a robot psychologist.

  • Rules and Order

    "Most of the world governments banned robot use on Earth for any purpose other than scientific research between 2003 and 2007." (Robbie.225)

    Calvin explains to the interviewer how robots were banned from Earth starting in the early 2000s. Which is an example of one type of rule—a law made by a government.

    "I told you they were playing up robot-safety in those days. Evidently, they were going to sell the notion of safety by not allowing them to move about, without a mahout on their shoulders all the time." (Runaround.58)

    US Robots tried to make people less afraid of robots by making robots seem like slaves, totally subservient to human orders. This doesn't work for US Robots, and it creates problems for Powell and Donovan. Maybe human rules aren't the best.

    "They aren't obeying us. And there's probably some reason for it that we'll figure out too late." (Reason.107)

    This is one of the most alarming moments in the book for us, since it seems as if the robots are ignoring the Second Law. Well, they are, so that's why we're alarmed. Luckily, by the end of the story, we figure out the reason for it: the robots are obeying a higher rule, the First Law.

    The unwritten motto of United States Robot and Mechanical Men Corp. was well-known: "No employee makes the same mistake twice. He is fired the first time." (Catch that Rabbit.4)

    Some laws are written down, like the one banning robots from Earth. But a lot of the rules that we live our lives by are unwritten. And here we have an unwritten rule that Powell and Donovan live their lives by. They have to solve these robot problems or else lose their jobs.

    Powell reached for the "Handbook of Robotics" that weighed down one side of his desk to a near-founder and opened it reverently. He had once jumped out of the window of a burning house dressed only in shorts and the "Handbook." In a pinch, he would have skipped the shorts. (Catch that Rabbit.18)

    Donovan is a passionate person, but Powell seems to be more of a play-by-the-rules type of guy. And here we see what rules he plays by: the "Handbook of Robotics." Now, this book seems really important to Powell—but it doesn't seem to help him solve this mystery. So, what good are these rules?

    "We've got to go about this systematically." (Liar.20)

    To solve the mystery of Herbie's mindreading, the executives of US Robots have their own fields of study to examine: Lanning and Bogert look at the math, Milton Ashe looks at the production, and Calvin looks at Herbie's psychology. In other words, Lanning has established an orderly way to solve this mystery. Of course, it doesn't work, but that's no reason for us to think rules and order are ineffective.

    "One word from you, Dr. Calvin," said the general, deliberately, "in violation of security measures, and you would be certainly imprisoned instantly." (Little Lost Robot.224)

    Major-General Kallner reminds Calvin that she's subject to military rules and order since she's consulting on a military project. And this shows us something about rules and order: sometimes, to make sure you have rules and order, you need the power to enforce them. So here, Kallner makes a threat he can actually carry out in order to keep Calvin in line. Rules and order don't look so good in this book, so far.

    Robertson of US Robot & Mechanical Men Corporation, son of the founder, pointed his lean nose at his general manager and his Adam's apple jumped as he said, "You start now. Let's get this straight." (Escape.7)

    Later, Asimov notes that Robertson doesn't really understand robots (27), which raises some questions for us, like: how did this guy become the president of the company? Well, he is the old president's son, so it looks like he simply inherited his dad's position (or his dad's shares in the company). This isn't a very good way to pass power in a company, but it's a traditional way to pass command.

    "Also that you, or your men, attempted illegal invasion of my Rights of Privacy." (Evidence.208)

    There's a lot that's different about government in the future, but here's one thing that's not too different: there are rules about personal privacy.

    "I am going to have the Society outlawed, every member removed from any responsible post. And all executive and technical positions, henceforward, can be filled only by applicants signing a non-Society oath. It will mean a certain surrender of basic civil liberties, but I am sure the Congress—" (Evitable Conflict.201)

    This may remind us of Senator McCarthy and the loyalty oaths that people were supposed to take to prove that they weren't Communists. But one thing that this also might remind us of is how Earth banned the use of robots. So, you can use governmental power to ban the Society for Humanity, but Asimov doesn't seem to think that's a good idea. Maybe instead of rules that we can enforce through fear and power, we need a better way to convince people of the truth.