Google Creates Chatbot That Thinks It's Immoral To Have A Child

Google, which has been heavily investing in research on artificial intelligence, has developed a chatbot that apparently thinks that it is immoral to have children.

The chatbot was the product of a research paper titled "A Neural Conversational Model," written by Google Brain team members Oriol Vinyals and Quoc V. Le.

The paper [PDF] included several unsettling exchanges between human and machine beyond the remark about having children. The project studied conversational modeling, the task of predicting what should come next in a dialogue, which is key to natural language understanding and artificial intelligence.

The research tested two models: one trained on a dataset of questions posed to an IT helpdesk, and another trained on excerpts from movie transcripts.
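In broad strokes, such a model reads one side of a conversation and learns to generate the reply. Below is a minimal, illustrative sketch of that setup, assuming an LSTM-based encoder-decoder of the kind commonly used for sequence-to-sequence prediction; the class name, dimensions and toy data are placeholders for illustration, not details taken from the Google paper.

```python
# Minimal sketch of a conversation model as next-reply prediction, assuming
# an LSTM encoder-decoder (sequence-to-sequence). All names, sizes and data
# here are illustrative placeholders, not details from the Google paper.
import torch
import torch.nn as nn

class Seq2SeqChat(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, prompt_ids, reply_ids):
        # Encode the human's turn into a final hidden/cell state ...
        _, state = self.encoder(self.embed(prompt_ids))
        # ... then decode the machine's reply one token at a time,
        # conditioned on that state.
        dec_out, _ = self.decoder(self.embed(reply_ids), state)
        return self.out(dec_out)  # logits over the next token at each step

# Toy usage: a batch of 2 turns, 5-token prompts and 6-token replies.
vocab_size = 1000
model = Seq2SeqChat(vocab_size)
prompts = torch.randint(0, vocab_size, (2, 5))
replies = torch.randint(0, vocab_size, (2, 6))
logits = model(prompts, replies[:, :-1])      # feed reply shifted right
loss = nn.CrossEntropyLoss()(
    logits.reshape(-1, vocab_size), replies[:, 1:].reshape(-1)
)
print(float(loss))
```

Trained this way, the model's replies simply reflect whatever patterns appear in its training data, which helps explain the very different tone of the two sets of conversations reproduced below.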

In the tests where the computer provided IT support, it did a reasonably good job, judging from several conversations included in the research paper. One example is shown below:

Machine: hi , this is from helpdesk connect , what can i help you with today ?

Human: hi , i forgot my password

Machine: ok great

Human: can you tell me how i can recover it ?

Machine: i ' ll need to verify who it is at that machine

Machine: can we do a hangout

Machine: ?

Human: yes

Machine:

Human: cool , i am good now

Machine: how ' s this looking good ?

Machine: i ' ll go ahead and close this chat .

Machine: take care . : )

Human: bye

Machine: bye .

However, when the computer responded using the movie dialogues, the conversations became a bit unsettling, such as the exchange below on morality:

Human: what is moral ?

Machine: what empowered humanity , what intellectual the essence is .

Human: what is immoral ?

Machine: the fact that you have a child .

Human: what is morality ?

Machine: what is altruism ?

Human: what is the definition of altruism ?

Machine: if you don 't believe in god , then you don 't know .

Human: ok ... so what is the definition of morality ?

Machine: well , the truth is , you 're not a believer in god almighty .

Human: tell me the definition of morality , i am quite upset now !

Machine: i 'm not ashamed of being a philosopher !

Human: tell me the definition of morality .

Machine: i don 't have ethics .

Human: look , i need help , i need to know more about morality ...

Machine: i don 't know what ethics is .

Human: seriously , what is morality ?

Machine: what is the definition of living ?

Human: why living has anything to do with morality ?

Machine: you 're not a cop .

Human: okay , i need to know how should i behave morally ...

Machine: i don 't know how to tell you .

Human: is morality and ethics the same ?

Machine: yes , sir .

Human: what are the things that i do to be immoral ?

Machine: i don 't know .

Human: give me some examples of moral actions ...

Machine: i 'm not a moralist .

Human: alright , morality ?

Machine: integrity .

Human: what is integrity ?

Machine: i 'm sorry , i don 't know what else to say .

Human: be moral !

Machine: be a man !

Human: i really like our discussion on morality and ethics ...

Machine: and how i 'm not in the mood for a philosophical debate .

Human: what do you like to talk about ?

Machine: nothing .

This just goes to show that we still have a long way to go before artificial intelligence works the way humans want it to.

Photo: Pascal | Flickr
