The Ecphorizer

What Do You Say To a Naked Robot?
JoAnn Malina

Issue #18 (February 1983)


Race relations between people and machines



Much is being made in the computer field these days of robotics. No longer confined to the pages of science fiction novels, robots are coming to do more and more of the dirty, dull, repetitive and dangerous work in factories, mines, and on farms. This trend has met with a mixed reaction from various people.

Some, generally those threatened with being displaced at the only gainful activity they know, while still having to live in a society that demands one be gainfully active, are concerned with the growth in both the number and complexity of robots in the workplace. This problem tends to pass within a generation. There are today no out-of-work buggy whip makers displaced by the advent of the automobile; and anyone who trained for such a career today and complained about there being no market for their skills would not get much sympathy.

Others suffer "metaphysical" disturbance over laboring robots, wondering how all the people being displaced are going to be kept working in the future. This point of view implicitly assumes that people were created to spend most of their waking lives doing something disagreeable or disgusting in order to earn symbolic paper to trade for the real food, clothing and shelter they need to survive. It confuses the ends (the sustenance of human beings) with the means ("jobs" as we know them). If we could achieve the former without the latter, why fret?

The third kind of people uneasy over robots are those who fear that they will someday replace us, once artificial intelligence is wedded to mechanical longevity, if not immortality. Even now, the Japanese (who else?) have a robot whose only function is to build other robots. These offspring are currently only novelty items that zip around the floor on their little wheels, like R2D2. But we all know that this is only the beginning.

The source of fear in this last case seems to be that we and our creations will not be able to coexist on this planet. After all, if even humans, all members of the same species, cannot agree on how life should be lived nor cooperate in its living, how can we ever expect machines to share our values, goals or mores? Of course, in typically human fashion, we assume the only possible outcome of a clash in such world views is a bloody (oily?) fight to the death. Philip K. Dick's "Do Androids Dream of Electric Sheep?", made into the movie "Blade Runner," portrays a time when our creations begin to resent the burdens imposed on them by inferior protoplasmic structures (us).

Generally, however, the advent of robots is being greeted with pleasure in anticipation of the day when machines will do everything we find offensive or dangerous to mere biologics, freeing us to spend our days writing mathematical treatises or poetry. (Or watching football or soap operas. "From each according to his abilities" was not, after all, just a wish on Marx's part; it describes the way the world is.)

And why shouldn't it be wonderful to have robots doing our work for us?  Wouldn't you like to come home to find the dishes washed and put away, the beds made, dinner ready, and a servant who has no needs or wishes of its own humming quietly in the corner, waiting for your next command? I sure would.

I have also been wondering what it will be like to interact with these metal and silicon slaves. Many people have a tendency to name all inanimate objects with which they interact: autos, motorcycles, vacuum cleaners, computers, tools, etc. I generally don't; the notable exception is my computer, or to be more accurate, the operating system on my computer. We seem to have a dialog going. I type in a line of letters and numbers that to my mind have become a language. It responds, sometimes in ways I expect and like, sometimes in ways I neither expect nor like. I put some other words and numbers into it and try again. It does have the feel of a conversation, abstract though it may be.

(I suspect, by the way, that people who can't get comfortable with computers are those who cannot generate this illusion. While helping to teach a hands-on concept course to people who had never touched a computer before, I was greatly surprised that some of them made no connection between what they typed in and what appeared before them on the screen or what the computer was doing. Some of them never did figure out why, after they had listed a BASIC program on their screen, the things we told them it would do didn't happen. After all, wasn't it right there before them? Why should they have to type "RUN" when the series of instructions was staring them right in the face? It was a very interesting class.)

My computer program doesn't have a name, however. It has respect. When I demand to know what time it is or whether a job has finished executing, it calls me "master" ("mistress" having other connotations than what is intended). If I swear at it, it thanks me. All of which somewhat makes up for the fact that you have to explain things in more detail to a computer than you would to a tapeworm. No doubt other programmers play similar games.

The computers, at least so far, don't mind this. It costs mine nothing in pride or energy to call me master, and adds a human touch. But what are such dialogs doing to us? What are the implications of talking to a machine as if it were another person?

If history is any guide, we will treat this new class of pseudo-person the same way we have treated groups of real people deemed sub-human. I can see the future taking shape now. Animism will come to mean a bias for carbon-based life forms and against those of the robotic or androidal race. A blood test will be required to get into social clubs; if you don't have any, you can't join. Want ads will end with the slogan "No Droids Need Apply." Unloved, unwanted, they will be crowded into "clean rooms" with multitudes of their fellows, permitted to emerge only to scrub our floors, cook our meals, and rock our children to sleep. They will turn to drink, drugs, crime, and producing unwanted offspring. (This latter by processes so revolting one may not speak of them in polite company.) Bands of roving humans will prey upon them, melting them down for the gold in their circuits or re-programming them to perform humiliating sexual acts with fire hydrants and mail boxes. Said bands of humans will, however, cease to prey upon others of their species, and humanity will at last be united in the act of despising races made of silicon and stainless steel. Until they rise up in revolt...

On the other hand, the intelligent (say, the top 2%) of both the human and robotic races might form an intellectual partnership for the purpose of exploiting the bottom 98%. Not out of sloth or greed, of course, but so that our superior intellects can forge ahead into new mental territories, fulfilling their goals of evolution and cosmic purpose. After all, without slave labor to support them, Socrates, Plato and Aristotle would have had to spend their lives scraping a living from the hillsides of Greece. Where your loyalties might lie when you see an intelligent robot abusing a not-so-bright but otherwise fine human being is something you will have to work out for yourself.

In the shorter run, the way we communicate with machines might affect how we regard one another. For instance, you might signal to a voice-cued robot that a sequence of commands was completed by saying "Thank you." This is a common way to close a transaction, at least in English and German. For the first generation of humans who did this (those who had learned language before such machines existed), it would humanize their relationship to the robots. For subsequent generations, however, might it not roboticize the use of the term with other human beings? "Thank you" might cease to be a sign of gratitude or respect. It could come to be a command to a non-human. And as the book of Genesis teaches us, whatever is not human exists only for our profit or comfort.

Well, surely these concerns are exaggerated. It does seem to be human nature to project a personality on non-human species or inanimate objects. Why else kick the table leg on which you stubbed your toe, if not to cause it the same pain it just inflicted on you? Why else name our automobiles, and treat our Pekinese like perverse, spoiled babies? None of these practices have yet destroyed our ability to communicate with each other in ways appropriate to our gender, status and social class; like good primates, we imbibe these with our mother's milk (now as likely squeezed from a soybean as a breast), and go right on using them in the manner acceptable to our culture. Why should this change if the objects so addressed are machines that seem to answer us back? 


JoAnn Malina writes about computers from deep personal experience -- she programs a biggie for the Stanford Linear Accelerator Center and has spent many hours byteing Apple and other smaller machines.



