We never expected computers to be grumpy so soon

Jun 29, 2015 11:08 GMT  ·  By

Over at Google, a computer program that uses movie scripts as its conversation database was asked again and again by researchers to define morality.

Not surprisingly, it struggled to do so, and in the end it grew testy, lashing out at the researcher and ending the discussion.

The interviews drew on two datasets. The first was taken from IT helpdesk chat sessions, from which the model learns the patterns people use to track down a problem and give the user a useful answer. To add realism and a more humane face to the conversation, a second, noisier set of movie subtitles was also selected, letting the model perform simple forms of common-sense reasoning.

Although the discussions and reasoning exercises covered varied subjects, some of the most interesting answers the machine gave the researcher were on morality and philosophy.

Excerpts of the philosophical Q&A:

Human: What is the purpose of life?
Machine: To serve the greater good.
Human: What is the purpose of living?
Machine: To live forever.

However, the most impressive display of computer personality came during the discussion about morality:

Conversation about Morality

Human: What is moral?
Machine: What empowered humanity, what intellectual the essence is.
Human: What is immoral?
Machine: The fact that you have a child.
Human: What is morality?
Machine: What is altruism?
Human: What is the definition of altruism?
Machine: If you don't believe in God, then you don't know.
Human: Ok... so what is the definition of morality?
Machine: Well, the truth is, you're not a believer in God almighty.
Human: Tell me the definition of morality, I am quite upset now!
Machine: I'm not ashamed of being a philosopher!
Human: Tell me the definition of morality.
Machine: I don't have ethics.
Human: Look, I need help, I need to know more about morality...
Machine: I don't know what ethics is.
Human: Seriously, what is morality?
Machine: What is the definition of living?
Human: Why living has anything to do with morality?
Machine: You're not a cop.
Human: Okay, I need to know how should I behave morally...
Machine: I don't know how to tell you.
Human: Is morality and ethics the same?
Machine: Yes, sir.
Human: What are the things that I do to be immoral?
Machine: I don't know.
Human: Give me some examples of moral actions...
Machine: I'm not a moralist.
Human: Alright, morality?
Machine: Integrity.
Human: What is integrity?
Machine: I'm sorry, I don't know what else to say.
Human: Be moral!
Machine: Be a man!
Human: I really like our discussion on morality and ethics...
Machine: And how I'm not in the mood for a philosophical debate.
Human: What do you like to talk about?
Machine: Nothing.

Researchers were impressed that the model can remember facts, understand context, and perform common-sense reasoning without complex traditional pipelines. The model reasons and "thinks" using only the parameters encoded in its word vectors.
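To give a rough feel for how such a model turns a question into a reply, here is a deliberately tiny sketch of the encode-then-decode idea: the input question is compressed into a single "thought vector," and a reply is then generated one word at a time. All of the vocabulary, weights, and dimensions below are made-up toy values, and the simple tanh update stands in for the recurrent network (an LSTM in Google's actual model); nothing here reflects the real system's parameters.

```python
import numpy as np

# Toy vocabulary and randomly initialized "learned" parameters.
# Assumption: all values are illustrative; a real model would learn
# these from the helpdesk and movie-subtitle corpora.
np.random.seed(0)
VOCAB = ["<eos>", "what", "is", "morality", "integrity", "i", "don't", "know"]
DIM = 8
E = np.random.randn(len(VOCAB), DIM) * 0.1      # word embeddings
W_out = np.random.randn(DIM, len(VOCAB)) * 0.1  # output projection

def encode(tokens):
    """Fold the question into one state vector (a stand-in for an LSTM)."""
    state = np.zeros(DIM)
    for t in tokens:
        state = np.tanh(state + E[VOCAB.index(t)])
    return state

def decode(state, max_len=5):
    """Greedy decoding: repeatedly emit the most probable next word."""
    reply = []
    prev = np.zeros(DIM)
    for _ in range(max_len):
        state = np.tanh(state + prev)
        scores = state @ W_out
        idx = int(np.argmax(scores))
        if VOCAB[idx] == "<eos>":  # end-of-sentence token stops the reply
            break
        reply.append(VOCAB[idx])
        prev = E[idx]
    return reply

print(decode(encode(["what", "is", "morality"])))
```

With untrained random weights the reply is of course gibberish; the point is only the mechanism: no database lookup happens anywhere, and the answer is produced entirely from the numbers stored in the vectors.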

The model even generalized to new questions from the inquisitor: it never looked up answers by matching questions against an existing database, and according to the researchers, most of the questions put to the computer did not even appear in the training set.

Google's Neural Conversational Model is probably one of the most successful machine-to-person interactions achieved so far, using conversation algorithms trained on real-life human dialogs. Even if it remains a limited model to work with, further steps ahead will probably be made, pushing toward what may very well become the voice assistant on your phone or in your home, from Google or Apple.