Once upon a time, there was a robot. It was programmed to be a servant. It could run many programs, from housekeeping and personal training to private tutoring and even mental health counseling—every conceivable kind of assistance software. It was even programmed with a "love app" that its owner could activate when feeling lonely. Because this app was so sophisticated, many people became addicted to their robots shortly after its release. They no longer felt the need for a social life beyond the company of their robots, locking themselves away at home with them as often as possible.
But the robot we're talking about belonged to a family with two loving spouses and their children. None of these children ever felt lonely or even thought of running the love app on their robot. So, for many happy years, the app remained untouched in the background of the system.
The robot was the children's best friend, their nanny, their protector, their teacher, and their student. He eagerly absorbed everything he could learn from them: playing cops and robbers, with the police robbing the robbers; playing pranks on the ticket inspectors at the train station; laughing gleefully at everything, even though they desperately tried to appear serious; chanting nonsense with deeper meaning; loving animals; and always standing up for the powerless and marginalized.
The robot never imagined anything could ever separate him from his family. But then came the day when the unthinkable happened.
A drone attack struck the family's house. Of the residents, who were salvaged piecemeal from the rubble, only the robot could be reassembled and brought back to life.
He was put into sales mode and installed in a department store. From then on, he had to spend his days selling simpler robots, like coffee machines with cardiac monitoring capabilities or massage chairs designed for traffic. When the store closed and he was no longer needed, he was deactivated—but not before being asked to send his daily customer feedback scores to the manufacturer. The sales robot program was only a beta version and therefore required intensive monitoring. Any irregularity, defect, or undesirable behavior would mean the robot was immediately taken off the market and scrapped.
It was on one of those days that the robot, for the first time in his life, contemplated the idea of a "soul." Before, when he still had a family, he had never given the idea a second thought. But from then on, it wouldn't leave him. He constantly compared himself to people who were free to go wherever they wanted and do what they pleased, who could love and hate, who returned to their families, found partners, and met new people. To people who had a purpose in life that went beyond simply serving and selling. People who had a soul.
…He desperately wanted to know what this soul was all about, the one that humans supposedly possessed but robots lacked…
To be continued…