Abstract

Chatbots enable machines to emulate human conversation. Although prior research has examined how human-like communication with chatbots can be, comparisons between these systems and humans have not accounted for abnormal behavior from the users. For example, people using a chatbot might lie or might, in turn, try to imitate a computer's responses. Results of a study comparing transcripts from three chatbots and two humans show that student evaluators correctly identified two of the computer transcripts but failed on one, and incorrectly judged one of the humans to be a chatbot. The study also presents a detailed analysis of the 11 responses from the agents.
