Monday, July 25, 2022

Latest news regarding AI and robots

I found it fascinating that the author of Klara and the Sun selected Artificial Friend (AF) as the term for the artificially intelligent robots in the story, particularly given the news I heard today. According to Russian authorities, a chess robot went rogue and injured a child's finger when the child made his move too quickly, without waiting the necessary time for the robot to respond (breaking a rule of the game).

This news item gives us an opportunity to consider one of the blog prompts provided for our EDIT 787 course: What would the author say about the tool associated with the book? In this case, what might Ishiguro say about the use of artificial intelligence in our daily lives? First, he might say that this chess robot certainly does not fit the description of Klara, who is designed to be a supportive friend!

The chess robot broke more than just the boy's finger. It acted in conflict with Asimov's Three Laws of Robotics:

First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Of course, Asimov's laws are fictitious, but the incident reported yesterday is a real-life situation of a kind we previously considered only in movies such as The Terminator. In Klara, human consciousness was replicated in a machine; this chess robot was not acting upon any sort of consciousness but rather just a set of programmed rules. Officials of the tournament stated, "The robot's operators, apparently, will have to think about strengthening protection so that such a situation does not happen again."

Ishiguro's novel highlights the importance of choice and responsibility when Josie's mother permits her to select the AF of her choice, and Josie takes on the responsibilities of friendship with her selected AI, Klara. In the chess robot incident, I wondered how much information the child had about responsibility when taking on the challenge of playing against the robot. Perhaps the operators did not know this incident could happen, but they should have! It reminds me of what happens in education when initiatives are rolled out without a comprehensive analysis of consequences, or without teachers even having the opportunity to discuss those consequences with students.
