Chatterbot

A chatterbot (also known as a talkbot, chatbot, bot, chatterbox, or artificial conversational entity) is a computer program that conducts a conversation via auditory or textual methods. Such programs are often designed to convincingly simulate how a human would behave as a conversational partner, thereby passing the Turing test. Chatterbots are typically used in dialog systems for various practical purposes, including customer service or information acquisition. Some chatterbots use sophisticated natural language processing systems, but many simpler systems scan for keywords within the input and then pull a reply with the most matching keywords, or the most similar wording pattern, from a database.
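
A minimal sketch of this keyword-matching approach is shown below (in Python). The reply database, scoring rule and fallback message are illustrative placeholders rather than part of any particular system.

```python
import re

# Minimal sketch of keyword-based reply selection as described above.
# The reply "database", scoring rule and fallback message are illustrative
# placeholders, not taken from any particular system.
REPLY_DATABASE = {
    ("refund", "return", "money"): "You can request a refund from your order history page.",
    ("hours", "open", "close"): "We are open from 9am to 5pm, Monday to Friday.",
    ("hello", "hi", "hey"): "Hello! How can I help you today?",
}

def pick_reply(user_input: str) -> str:
    """Return the stored reply whose keywords overlap most with the input."""
    words = set(re.findall(r"[a-z']+", user_input.lower()))
    best_reply, best_score = "Sorry, I don't understand.", 0
    for keywords, reply in REPLY_DATABASE.items():
        score = len(words & set(keywords))  # number of matching keywords
        if score > best_score:
            best_reply, best_score = reply, score
    return best_reply

print(pick_reply("Hi, what are your opening hours?"))  # -> the opening-hours reply
```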

The term "ChatterBot" was originally coined by Michael Mauldin (creator of the first Verbot, Julia) in 1994 to describe these conversational programs.[1]

There are two main types of chatterbot: one functions based on a set of rules, while the other, more advanced kind uses artificial intelligence. Rule-based chatterbots tend to be limited in functionality and are only as smart as they are programmed to be. Chatterbots that use artificial intelligence, on the other hand, understand language rather than just predefined commands, and grow smarter as they learn from the conversations they have with people.

Background

In 1950, Alan Turing's famous article "Computing Machinery and Intelligence" was published,[2] which proposed what is now called the Turing test as a criterion of intelligence. This criterion depends on the ability of a computer program to impersonate a human in a real-time written conversation with a human judge, sufficiently well that the judge is unable to distinguish reliably, on the basis of the conversational content alone, between the program and a real human. The notoriety of Turing's proposed test stimulated great interest in Joseph Weizenbaum's program ELIZA, published in 1966, which seemed to be able to fool users into believing that they were conversing with a real human. However, Weizenbaum himself did not claim that ELIZA was genuinely intelligent, and the introduction to his paper presented it more as a debunking exercise:

[In] artificial intelligence ... machines are made to behave in wondrous ways, often sufficient to dazzle even the most experienced observer. But once a particular program is unmasked, once its inner workings are explained ... its magic crumbles away; it stands revealed as a mere collection of procedures ... The observer says to himself "I could have written that". With that thought he moves the program in question from the shelf marked "intelligent", to that reserved for curios ... The object of this paper is to cause just such a re-evaluation of the program about to be "explained". Few programs ever needed it more.[3]

ELIZA's key method of operation (copied by chatbot designers ever since) involves the recognition of cue words or phrases in the input, and the output of corresponding pre-prepared or pre-programmed responses that can move the conversation forward in an apparently meaningful way (e.g. by responding to any input that contains the word 'MOTHER' with 'TELL ME MORE ABOUT YOUR FAMILY').[4] Thus an illusion of understanding is generated, even though the processing involved has been merely superficial. ELIZA showed that such an illusion is surprisingly easy to generate, because human judges are so ready to give the benefit of the doubt when conversational responses are capable of being interpreted as "intelligent".
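
The sketch below illustrates this ELIZA-style technique: cue words or patterns are checked in order and a pre-prepared response is returned, optionally reusing words captured from the input. Apart from the 'MOTHER' rule quoted above, the rules and wording are illustrative rather than taken from Weizenbaum's actual script.

```python
import re

# ELIZA-style cue-word rules: scan the input for a cue and return the
# corresponding pre-prepared response. The MOTHER rule follows the example
# quoted above; the remaining rules and wording are illustrative only.
RULES = [
    (r"\bmother\b", "TELL ME MORE ABOUT YOUR FAMILY"),
    (r"\bi am (.+)", "HOW LONG HAVE YOU BEEN {0}?"),
    (r"\bbecause\b", "IS THAT THE REAL REASON?"),
]
DEFAULT_RESPONSE = "PLEASE GO ON"

def respond(user_input: str) -> str:
    text = user_input.lower().rstrip(".!?")
    for pattern, template in RULES:
        match = re.search(pattern, text)
        if match:
            # Reuse words captured from the input, if the rule has a group.
            return template.format(*[g.upper() for g in match.groups()])
    return DEFAULT_RESPONSE

print(respond("I am upset about my mother."))  # -> TELL ME MORE ABOUT YOUR FAMILY
print(respond("I am upset."))                  # -> HOW LONG HAVE YOU BEEN UPSET?
```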

Interface designers have come to appreciate that humans' readiness to interpret computer output as genuinely conversational—even when it is actually based on rather simple pattern-matching—can be exploited for useful purposes. Most people prefer to engage with programs that are human-like, and this gives chatbot-style techniques a potentially useful role in interactive systems that need to elicit information from users, as long as that information is relatively straightforward and falls into predictable categories. Thus, for example, online help systems can usefully employ chatbot techniques to identify the area of help that users require, potentially providing a "friendlier" interface than a more formal search or menu system. This sort of usage holds the prospect of moving chatbot technology from Weizenbaum's "shelf ... reserved for curios" to that marked "genuinely useful computational methods".

Development

The classic early chatterbots are ELIZA (1966) and PARRY (1972).[5][6][7][8] More recent notable programs include A.L.I.C.E., Jabberwacky and D.U.D.E (Agence Nationale de la Recherche and CNRS 2006). While ELIZA and PARRY were used exclusively to simulate typed conversation, many chatterbots now include functional features such as games and web-searching abilities. In 1984, a book called The Policeman's Beard is Half Constructed was published, allegedly written by the chatbot Racter (though the program as released would not have been capable of doing so).[9]

One pertinent field of AI research is natural language processing. Usually, weak AI fields employ specialized software or programming languages created specifically for the narrow function required. For example, A.L.I.C.E. utilises a markup language called AIML, which is specific to its function as a conversational agent and has since been adopted by various other developers of so-called Alicebots. Nevertheless, A.L.I.C.E. is still based purely on pattern-matching techniques without any reasoning capabilities, the same technique ELIZA used back in 1966. This is not strong AI, which would require sapience and logical reasoning abilities.
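
In AIML, a <category> pairs a <pattern> with a <template> response. The sketch below loads two such categories and answers input by simple wildcard matching; the categories shown are illustrative rather than drawn from the actual A.L.I.C.E. knowledge base, and a real AIML interpreter supports much more (for example recursion via <srai>, topics and context).

```python
import re
import xml.etree.ElementTree as ET

# Two AIML categories in the <pattern>/<template> form used by A.L.I.C.E.
# The patterns and templates are illustrative, not from the real knowledge base.
AIML = """
<aiml>
  <category><pattern>HELLO</pattern><template>Hi there!</template></category>
  <category><pattern>WHAT IS *</pattern><template>I do not know what that is.</template></category>
</aiml>
"""

def load_categories(aiml_text):
    root = ET.fromstring(aiml_text)
    return [(c.find("pattern").text, c.find("template").text)
            for c in root.iter("category")]

def reply(user_input, categories):
    """Return the first template whose pattern matches; '*' stands for any words."""
    text = user_input.upper().strip(" ?!.")
    for pattern, template in categories:
        regex = "^" + re.escape(pattern).replace(r"\*", ".+") + "$"
        if re.match(regex, text):
            return template
    return None

categories = load_categories(AIML)
print(reply("What is AIML?", categories))  # -> I do not know what that is.
```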

Jabberwacky learns new responses and context based on real-time user interactions, rather than being driven from a static database. Some more recent chatterbots also combine real-time learning with evolutionary algorithms that optimise their ability to communicate based on each conversation held, with one notable example being Kyle, winner of the 2009 Leodis AI Award.[10] Still, there is currently no general-purpose conversational artificial intelligence, and some software developers focus on the practical aspect: information retrieval.
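
Jabberwacky's actual learning mechanism is proprietary and considerably more elaborate, but the general idea of learning replies from live conversation rather than a static database can be sketched roughly as follows: each user utterance is stored as a possible future reply to whatever preceded it.

```python
import random
from collections import defaultdict

# Greatly simplified sketch of learning replies from live conversation rather
# than a static database: every user utterance is remembered as a possible
# future reply to whatever the bot said just before it.
class LearningBot:
    def __init__(self):
        self.replies = defaultdict(list)  # utterance -> replies observed after it
        self.last_bot_line = None

    def respond(self, user_input: str) -> str:
        # Learn: remember the user's input as a reply to the bot's last line.
        if self.last_bot_line is not None:
            self.replies[self.last_bot_line].append(user_input)
        # Respond: reuse something a person once said after this same input,
        # falling back to a stock prompt when nothing has been learned yet.
        options = self.replies.get(user_input)
        self.last_bot_line = random.choice(options) if options else "Tell me more."
        return self.last_bot_line

bot = LearningBot()
print(bot.respond("Hello"))         # nothing learned yet -> "Tell me more."
print(bot.respond("I like music"))  # "I like music" is now stored as a reply to "Tell me more."
```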

Chatterbot competitions focus on the Turing test or more specific goals. Two such annual contests are the Loebner Prize and The Chatterbox Challenge.[11]

Usage in dialog systems

Example of an automated online assistant providing customer service on a web page.

Chatterbots are often integrated into the dialog systems of automated online assistants, for example, giving them the ability to make small talk or engage in casual conversation unrelated to the scope of their primary expert systems.
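
A rough sketch of this integration pattern, with stand-in stubs for both components, might look as follows: the assistant first tries its primary expert system and falls back to a small-talk chatterbot for anything outside its scope.

```python
# Sketch of the integration pattern described above: route each utterance to
# the primary expert system when it is in scope, and fall back to a small-talk
# chatterbot otherwise. Both components are hypothetical stubs.

SMALL_TALK = {
    "how are you": "I'm doing well, thanks for asking!",
    "thank you": "You're welcome.",
}

def expert_system(user_input: str):
    """Stub for the domain expert system; returns None when out of scope."""
    if "order status" in user_input.lower():
        return "Your order is on its way and should arrive within three days."
    return None

def small_talk_bot(user_input: str) -> str:
    text = user_input.lower().strip(" ?!.")
    return SMALL_TALK.get(text, "I'm happy to chat, but I'm best at questions about orders.")

def automated_assistant(user_input: str) -> str:
    answer = expert_system(user_input)                        # in-scope questions first
    return answer if answer else small_talk_bot(user_input)   # small-talk fallback

print(automated_assistant("What's my order status?"))  # answered by the expert system
print(automated_assistant("How are you?"))             # answered by the chatterbot
```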

Chatterbots are now widely used as part of messaging platforms such as Facebook Messenger and Snapchat, both for entertainment and for B2C marketing. Companies like Pizza Hut, Disney, Yamato's Line and Whole Foods have launched their own chatterbots to increase customer engagement, promote their products and services, and give their customers a more convenient way to order from them.[12]

Other companies are exploring ways to use chatterbots internally, for example for customer support, human resources, or even Internet of Things (IoT) projects. Overstock, for one, has reportedly launched a chatbot named Mila to automate certain simple yet time-consuming processes involved in requesting sick leave.[13] SAP partnered with Kore Inc, a US-based chatbot platform vendor, to build enterprise-oriented chatterbots for certain SAP products such as SAP Hana Cloud Platform, SAP Cloud for Customer (C4C), SAP SuccessFactors and Concur.[14] Other large companies such as Lloyds Banking Group, Royal Bank of Scotland, Renault and Citroën now use automated online assistants instead of call centres with humans to provide a first point of contact.

Incorporation in other devices

A chatterbot may be deployed in a smartphone app. One popular category of smartphone app that relies on a chatterbot is the dating sim or romancebot category. The 36 You Games app "Boyfriend Maker" and WET Productions Inc.'s "My Virtual Boyfriend" are popular examples. According to 36 You Games' Japanese language website, as of 13 November 2012, Boyfriend Maker (later rebranded as "Boyfriend Plus" for iOS users) was ranked the number one free iPhone app in Japan and had been among the top ten overall apps in Singapore, Hong Kong and Taiwan.[15] Such apps allow a user to carry on a textual interchange with a simulated chat partner, much as one might chat with a human partner on a date, or through instant messaging or other forms of online chat. The concept is very similar to chatting with a robot in an internet chatroom or on an internet forum. Users can chat about various topics, from school homework to song lyrics, or engage in cybersex-style chats.[16][17]

Toys

Chatterbots have also been incorporated into devices not primarily meant for computing, such as toys.[18]

Hello Barbie is an Internet-connected version of the doll that uses a chatterbot provided by the company ToyTalk,[19] which previously used the chatterbot for a range of smartphone-based characters for children.[20] These characters' behaviors are constrained by a set of rules that in effect emulate a particular character and produce a storyline.[21]

IBM's Watson computer has been used as the basis for chatterbot-based toys by companies such as CogniToys,[18] intended to interact with children for educational purposes.[22]

Malicious use

Malicious chatterbots are frequently used to fill chat rooms with spam and advertising, or to entice people into revealing personal information, such as bank account numbers. They are commonly found on Yahoo! Messenger, Windows Live Messenger, AOL Instant Messenger and other instant messaging protocols. There has also been a published report of a chatterbot used in a fake personal ad on a dating service's website.[23]

Citations

  1. Mauldin 1994
  2. Turing 1950
  3. Weizenbaum 1966, p. 36
  4. Weizenbaum 1966, pp. 44–5
  5. Güzeldere & Franchi 1995
  6. Computer History Museum 2006
  7. Sondheim 1997
  8. Network Working Group 1973. Transcript of a session between Parry and Eliza. (This is not the dialogue from the ICCC, which took place October 24–26, 1972; this session is from September 18, 1972.)
  9. www.everything.com 13 November 1999
  10. Chatroboter simulieren Menschen (in German)
  12. "Better believe the bot boom is blowing up big for B2B, B2C businesses". VentureBeat.
  13. Greenfield, Rebecca. "Chatbots Are Your Newest, Dumbest Co-Workers". Bloomberg.
  14. Pieper, Till. "Chatbots Meet Enterprise Software".
  15. "Boyfriend Maker scores critical hits everywhere, especially in Japan!". 36 You Games. 14 November 2012. Retrieved 26 December 2013.
  16. Hawgood, Alex (24 December 2013). "'Interactive' Gets a New Meaning: Sex Toys and Cybersex Are Enhanced by New Technology". New York Times. Retrieved 26 December 2013.
  17. Romano, Aja (28 November 2012). "The "Boyfriend Maker" app is as horrifying as you'd expect". The Daily Dot. Retrieved 27 December 2013.
  18. Amy (23 February 2015). "Conversational Toys – The Latest Trend in Speech Technology". Virtual Agent Chat. Retrieved 11 August 2016.
  19. Nagy, Evie. "Using ToyTalk Technology, New Hello Barbie Will Have Real Conversations with Kids". Fast Company. Retrieved 18 March 2015.
  20. Oren Jacob, the co-founder and CEO of ToyTalk, interviewed on the TV show Triangulation on the TWiT.tv network.
  21. http://www.google.com/patents/US20140032471
  22. Takahashi, Dean. "Elemental's smart connected toy taps IBM's Watson supercomputer for its brains". VentureBeat. Retrieved 15 May 2015.
  23. "From Russia With Love" (PDF). Retrieved 9 December 2007. Psychologist and Scientific American: Mind contributing editor Robert Epstein reports how he was initially fooled by a chatterbot posing as an attractive girl in a personal ad he answered on a dating website. In the ad, the girl portrayed herself as being in Southern California and then soon revealed, in poor English, that she was actually in Russia. He became suspicious after a couple of months of email exchanges, sent her an email test of gibberish, and she still replied in general terms. The dating website is not named. Scientific American: Mind, October–November 2007, pages 16–17, "From Russia With Love: How I got fooled (and somewhat humiliated) by a computer". Also available online.
