8th Grade Literature: Robots on the Brain

This week we began our reading of Isaac Asimov’s I, Robot, a famous and influential collection of short stories oriented around the theme of human interaction with robots in the near future. Asimov wrote these stories in the 1940s and ’50s, but he was remarkably prescient about some of the issues and concerns those in the future (us) would have about advances in this kind of technology. The stories progress from robots that interact with humans but cannot talk to robots that make active predictions, invent religions, and learn to lie, posing a host of problems for their human inventors. I don’t see Asimov as warning his readers so much as informing them that the advance of robots, whether ultimately for good or ill, is inevitable.

Asimov has robots developed with three laws:

  • A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
  • A robot must obey the orders of a human being, unless that order conflicts with the first law.
  • A robot must protect its own existence, so long as doing so does not conflict with the other two laws.

The laws are hierarchically structured, so that Law 1 takes precedence over Law 2, and so forth.

The three laws look solid. Their simplicity is their strength. But the stories deal with the implications of these three laws as the technology develops. One of the strengths of Asimov’s work is that what looks simple on its face becomes complex as we interact with the new technology.

Some time ago a friend who worked as a computer programmer said something to the effect of, “Computers will do exactly what you tell them to do. When we have a problem with a computer, we likely either a) do not understand what we told the computer to do, or b) told the computer something different from what we thought we told it.” This holds true in the stories we have read so far.
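The hierarchy of the Three Laws can be made concrete with a small sketch. This is purely illustrative Python of my own (the stories contain no code, and every name below is invented): each candidate action is checked against the laws in order of precedence, so a Law 1 concern always outweighs the lower laws.

```python
# Illustrative sketch only: each possible action is scored by whether
# it violates each law, checked in order of precedence.
# Law 1 (no harm to humans) outranks Law 2 (obey orders),
# which outranks Law 3 (self-preservation).

def violations(action):
    """Return a tuple of law violations, highest-priority law first."""
    return (
        action["harms_human"],     # Law 1
        action["disobeys_order"],  # Law 2
        action["endangers_self"],  # Law 3
    )

def choose_action(candidates):
    """Pick the action whose violations are least severe.

    Python compares tuples element by element, so a Law 1 violation
    always outweighs any number of lower-law violations.
    """
    return min(candidates, key=violations)

actions = [
    {"name": "obey and harm", "harms_human": True,
     "disobeys_order": False, "endangers_self": False},
    {"name": "refuse order", "harms_human": False,
     "disobeys_order": True, "endangers_self": False},
]

best = choose_action(actions)
print(best["name"])  # prints "refuse order": the robot disobeys rather than harm
```

Even in this toy version, the hard questions sit outside the code: the program does exactly what it was told, but what counts as “harm,” and who decides? That ambiguity is exactly what drives the stories.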

The first story involves a robot that a family buys to serve as a companion for a young child. Obviously, they want the robot to be safe and to look out for the safety of the child. The robot would have to be accommodating to get along with the child. But this in turn means that the robot would enjoy what the child enjoys in the way only a robot could, which means . . . all the time. What child wouldn’t want a companion that essentially plays with you and accommodates you whenever you want?

The child, naturally, would bond with the robot and forget about other children. The mother in the story sounds exactly like parents today. The concern is the same; only the particulars have changed. Every mother who worries about her child’s attachment to phones, computers, or video games (why can’t they play with other children instead of machines?) sounds just like the mother in the opening story, “Robbie.”

But . . . if the child is happy, if the robot protects her from disaster (which he does), and if robots are the way of the future and simply part of how kids grow up these days, then their presence becomes inevitable. In this way the mother in the story comes across as something of the “bad guy,” and such is the subversive nature of Asimov’s first story. Asimov wants us, I think, to be precise about the nature of our objection to robots.

  • Is it that we dislike change? But change in any society is inevitable.
  • Is it that we dislike the speed of change? The change may be uncomfortably fast, but if others are doing it, won’t we have to adapt to keep up? Civilizations that fall behind often get absorbed by other civilizations.
  • Is it that we dislike this particular form of technology? Okay, but how would a robot differ qualitatively from other technology we already use? A dishwasher, for example, is a robot that does not move or talk, though it does communicate with us. Our phones cannot move but can talk back to us on some level.

My impression is that with these stories, Asimov wants to force us to come to a clear understanding of what our views of life, technology, and “progress” actually are. We can’t dislike something just because it is new, or just because it is shocking or unnerving. Cars, for example, were an extremely disruptive technology when first introduced, but are now just part of society. But I am also guessing that Asimov would not simply agree that any new thing must therefore be adopted. The hard question remains: where to draw the line, and why?

In the first story, “Robbie” (the robot) becomes more humanlike the longer he interacts with the child. For example, he learns to have favorite stories. But for humans to interact with robots, they have to learn to think according to the Three Laws, which means thinking as robots think. In time, some kind of overlap between robot and human “psychology” and behavior becomes inevitable, another unintended consequence of technology.

In all the stories, Asimov sets up the narrative so that the robots cannot really be blamed. They follow instructions. The problem is that we cannot anticipate all the ways in which they might follow those instructions, and how that will change society and humanity all at once.

In one story this means that robots learn to lie in ways similar to humans. As robot technology advances, robots interact more socially with humans. When we interact with those we know, we do not always tell each other the unvarnished truth. We might tell a friend that an outfit looks good even if we don’t think so, to take just one example. After all, we don’t want to “harm” our friend by telling them what we really think. As the stories progress and our interaction with robots grows more complex, the robots’ application of Law 1 (no harm to humans) expands to cover this kind of emotional harm. This, in turn, means that robots start to tell people what they think they want to hear, which leads to great confusion.

8th Grade Literature: Giving and Taking Away

For our next unit, we will examine short stories and literature that deal with the question of technology and its impact on humanity. In thinking about “impact,” we will think about how technology changes society, but more importantly, about how technology changes how we conceive of the meaning of our humanity. I know that this will be a challenging unit, but I hope that the students will enjoy it.

We are used to thinking of technology as neutral. Something is invented, such as a hammer, and the hammer is neither good nor bad. Rather, we can do a good thing with the hammer (build a house) or a bad thing (hit someone on the head with it). But we, the human beings, remain independent from the hammer. We give meaning, form, and function to the hammer. The communication, or interaction, is, in this view, all a one-way street.

There are elements of truth to this idea, but it is an incomplete view of our interaction with the tools we create, whether those tools be a hammer, a dishwasher, or a computer. As we interact with the hammer, there is a sense in which the hammer is interacting with us and changing us thereby.

This happens even with our simplest tools, such as a hammer or shovel. We can set aside the psychological impact for a moment and focus just on the physical changes we undergo when wielding these tools. Someone who spent their days hammering and shoveling would experience a change in their body, as certain muscles would grow where before they were possibly weak. The hammer and shovel would change our body; this much is obvious. The fact that we have the slogan, “If all you have is a hammer, every problem looks like a nail,” indicates that we sense something psychological also happens between us and the hammer in our interactions, even if we do not directly perceive it.

There is a Chinese anecdote which runs as follows:

As Tzu-Gung traveled through the region, he saw an old man working in his vegetable garden. He had dug an irrigation ditch. The man would descend into his well, fetch a vessel of water in his hands, and pour it out into the ditch. Then he would repeat the process as often as necessary. While his efforts were significant, the results seemed meager in comparison.

Tzu-Gung said, “There is a way whereby you can irrigate a hundred ditches in one day with little effort. Would you like to hear it?” [He then proceeded to explain a pulley system with a larger bucket and grooves running out to the ditches.]

Then anger rose up in the man’s face. “I have heard my teacher say that whoever uses a machine does all his work like a machine. He who does his work like a machine grows a heart like a machine, and he who carries the heart of a machine loses his simplicity. He who has lost his simplicity becomes unsure in the strivings of his soul, and so we lose all honest sense. It is not that I do not know of such things: I am ashamed to use them.”

Very few of us would be willing to go all in with the Old Man in this story. But it is important that we understand the trade-offs involved in our use of technology. What technology gives is usually quite obvious and useful. What it takes away is just as much a part of the story, though it is less obvious.

I would summarize the relationship of technology to humanity this way:

Every increase in power creates an increase in vulnerability.

For example, a match creates fire much more quickly than sparks from two pieces of flint, or rubbing two sticks together. A match gives us power over the element of fire. However, having matches means we have lost the skill of creating fire in the traditional ways. If our box of matches gets wet, we are incapable of making fire. We now must devote extra energy to keeping the matches dry; people from previous eras had no such concern.

Or imagine a person who wants to travel from New York to Los Angeles.

  • Walking would take the longest amount of time, but the physical act of walking risks only a twisted ankle.
  • Running would take less time, but increases the possible injury to a broken ankle or leg.
  • Riding a bike would take even less time, but a crash on a bike could badly injure much of the body.
  • Driving a car would reduce the trip from weeks to days, but if we make a mistake driving, or something big goes wrong with the car, we could be badly injured or killed.
  • A plane would make the trip in hours instead of days, but even a mild mechanical problem with the plane would mean almost certain death.

We can also think of how much power comes from our harnessing of electricity. Among other things, electricity allows us to be vastly more productive than civilizations of earlier eras. We can make many more things much more quickly. But if the electrical grid went dark, what we could produce would drop to near zero. We have become completely dependent on electricity for most things that sustain our civilization. Our electrical grid is perhaps our greatest vulnerability.

The ancients were well aware of this trade-off. Plato includes an anecdote in his “Phaedrus” dialogue, one that may have come from Egypt, involving the invention of writing.

 At the Egyptian city of Naucratis, there was a famous old god, whose name was Theuth; the bird which is called the Ibis is sacred to him, and he was the inventor of many arts, such as arithmetic and calculation and geometry and astronomy and draughts and dice, but his great discovery was the use of letters. Now in those days the god Thamus was the king of the whole country of Egypt; and he dwelt in that great city of Upper Egypt which the Hellenes call Egyptian Thebes, and the god himself is called by them Ammon. To him came Theuth and showed his inventions, desiring that the other Egyptians might be allowed to have the benefit of them; he enumerated them, and Thamus enquired about their several uses, and praised some of them and censured others, as he approved or disapproved of them. It would take a long time to repeat all that Thamus said to Theuth in praise or blame of the various arts. But when they came to letters, “This,” said Theuth, “will make the Egyptians wiser and give them better memories; it is a specific both for the memory and for the wit.”

Thamus replied: “O most ingenious Theuth, the parent or inventor of an art is not always the best judge of the utility or inutility of his own inventions to the users of them. And in this instance, you who are the father of letters, from a paternal love of your own children have been led to attribute to them a quality which they cannot have; for this discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.”

Some may argue that there is nothing inevitable in this trade-off. Hypothetically, we did not have to abandon riding horses in order to drive cars, or abandon working with flint as we adopted matches. Hypothetically, we could have maintained modes of production that did not need electricity in tandem with the development of the power grid. Possibly this is true, but I cannot recall an instance where this actually happened in history. In general, it seems we have to accept the trade-off, all or nothing, for good or ill. Technology seems to “require” this of us. Often the tail wags the dog with technology (think of how much of our society has been oriented around the car), and this seems to be the rule and not the exception, at least since the Industrial Revolution.

The costs of technological advances are usually hidden, which clouds our discernment about whether to adopt them. We see what a technology gives, and not what it takes away. This is the main theme of our introductory story, “The Monkey’s Paw.”