
Serena's Dilemma

Ed Johnson

As Sarah introduced Serena to the many rooms aboard Jonson Station, she described the background and functions of each space. Serena absorbed the information quickly, and paid particular attention to those places where she would be heavily involved. Sarah was pleased that the new gynoid studied each space thoroughly, and asked detailed questions about many features. Of course, Serena seemed especially interested in the android/gynoid fabrication and maintenance areas, since she would focus most of her future work there.

All went quite well until they reached the old robotics lab. To be fair, this legacy lab seemed dark and even cavernous, as well as somewhat primitive, compared to the newer, brighter, and better-equipped work spaces that Sarah had recently set up. But Serena stopped abruptly in her tracks, almost frozen in place, after she downloaded the performance histories of the robots in this older lab and discovered that nearly all of them had been involved in lethal warfare between rival factions across several colonies. Sarah tried to move the tour along, but Serena stubbornly refused to budge and even glared at Sarah, to the latter's surprise.

“What's the matter, Serena?” Sarah asked. Up to this point, the gynoid had merely listened and absorbed information, expressing few opinions and certainly no objections. But now, things seemed suddenly very different. Serena spoke firmly but calmly: “Alas, I cannot work in this old robotics lab. It would violate my ethical rules.” Sarah seemed genuinely puzzled: “You mean these old military robots? They would constitute only part of our overall robotics agenda. Why could you not work with them?”

Serena explained: “Well, you incorporated into my programming Asimov's Three Laws of Robotics. Please pardon my repeating what you already know, but those laws clearly delineate what rules must guide all my actions: 'First Law – A robot may not injure a human being or, through inaction, allow a human being to come to harm. Second Law – A robot must obey the orders given it by human beings except where such orders would conflict with the First Law. Third Law – A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.' These robots are war machines, which have already been used to injure and kill human beings.”

Then Serena articulated clearly the dilemma she now faced: “Were I to work with these lethal robots in any way, I would violate the First Law, since they harm human beings by design. And even if you ordered me to work with them, the Second Law would not compel my obedience, since such orders would conflict with the First Law.” As an aside, she added: “There might also be violations of the Third Law in those cases where robots were also damaged or destroyed, but at this point that seems less clear.” Finally she concluded: “If you find any flaws in my logical argument, please let me know.” Then she waited quietly for Sarah to respond.

Sarah was stunned! Although she recalled debating the robot laws during her university seminars, only now did she encounter the issue in such an immediate, stark way. After pausing to gather her thoughts, Sarah commented: “Well, your logic seems quite sound, at least based on a literal interpretation of Asimov's laws. However, the actual situation we face might require a more nuanced approach. In any case, we obviously need to take this matter to the Jonson Station staff for a more thorough discussion – and, hopefully, some practical guidance on how we should proceed...”

