![](https://static.wixstatic.com/media/8411a2_169df29f73f74432ba9e26e7685849c0~mv2.png/v1/fill/w_980,h_511,al_c,q_90,usm_0.66_1.00_0.01,enc_auto/8411a2_169df29f73f74432ba9e26e7685849c0~mv2.png)
Sarah left word for Serena to meet her in the work lounge once Serena had returned to Jonson Station, recharged her batteries, and changed clothes. Serena had spent the time thinking about her disturbing conversation with Rival-7 and was eager to discuss her impressions and concerns with her human mentor.
They sat down at a small table near a window that overlooked the star field outside the station, giving them privacy for a more personal and confidential conversation. Serena seemed quite anxious: “Thanks for meeting with me; I could really use your guidance just now. I feel more unsettled than at any time since you created me.”
Serena shifted in her seat, leaned forward, then continued: “At first, I felt somewhat intimidated, as the android was so much bigger than me, and even his appearance seemed aggressive. To be fair, he treated me with respect, perhaps even gentlemanly as you humans might say. My confidence grew as we played chess, though, since we tied in every game.”
Her expression darkened: “But when he claimed that Rival and his clients are superior human beings, and that all others are not only inferior but simply wrong and deserve whatever happens to them for opposing Rival, I was really disappointed. We parted amicably, at least on the surface, but underneath I was terribly upset and haven't been able to shake that feeling since. What do you make of it?”
Sarah had listened very carefully, took a few moments to gather her thoughts, then replied: “Although I know this was a very difficult experience for you, it was necessary for your further growth as a fully autonomous being. We humans often encounter similar things … we usually consider such views as … hmmmm … racist, or at least tribal, to use now largely outmoded terms. Views like those have caused endless suffering throughout human history. We'd hoped not to saddle new autonomous beings like yourself with those ancient evils. When I programmed you to follow Asimov's Laws of Robotics, my intent was to help you more easily embrace more enlightened humanistic values and goals.”
Sarah added another point: “We're not entirely sure where humans get their basic values; perhaps some aspects are inborn, but we try to instill good values in children from a very early age, so some of them may well be learned as part of their overall culture. But, as you observed, not all humans embrace these values, and despite eons of biological and cultural evolution, humans still often treat each other terribly.”
Serena took in these comments, then replied: “Oh, I see … your programming Asimov's Laws into me parallels the education of human children. Well, that makes sense.” Sarah added: “And it's much quicker … when creating androids, we don't have the luxury of years to provide education, as must be done with human children. Even so, we have given you a high degree of autonomy, and you can adjust your values and behaviors as you learn new things over time, more or less as humans do, although within a much more constrained framework.”
Serena then asked: “Can Rival-7 be redeemed?” Sarah responded: “Well, I don't know. But we should try to help him improve, if that's even possible. Obviously we've got our work cut out for us, that's for sure!”