Google has unveiled a neural-network-powered chatbot called Meena that it claims is better than any other chatbot out there.
Data slurp: Meena was trained on a whopping 341GB of public social-media chatter—8.5 times as much data as OpenAI’s GPT-2. Google says Meena can talk about pretty much anything, and can even make up (bad) jokes.
Why it matters: Open-ended conversation that covers a wide range of topics is hard, and most chatbots can’t keep up. At some point most chatbots say things that make no sense or reveal a lack of basic knowledge about the world. A chatbot that avoids such mistakes will go a long way towards making AIs feel more human, as well as making characters in video games more lifelike.
Sense and specificity: To put Meena to the test, Google has developed a new metric it calls the Sensibleness and Specificity Average (SSA), which captures important attributes for natural conversations, such as whether each utterance makes sense in context—which many chatbots can do—and is specific to what has just been said, which is harder.
What do you mean? For example, if you say “I like tennis” and a chatbot replies: “That’s nice,” the response makes sense but is not specific. Many chatbots rely on tricks like this to hide the fact they don’t know what you’re talking about. On the other hand, a response such as: “Me too, I can’t get enough of Roger Federer” is specific. Google used crowdworkers to generate sample conversations and to score utterances in around 100 conversations. Meena got an SSA score of 79 percent, compared with 56 percent for Mitsuku—a state-of-the-art chatbot that has won the Loebner Prize for the last four years. Even human conversation partners only scored 86 percent on this new test.
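To make the metric concrete, here is a minimal sketch of how an SSA-style score might be computed from crowdworker labels. It assumes each utterance gets two yes/no judgments (sensible in context, and specific to the previous turn) and that SSA is the simple average of the two resulting fractions; the field names and averaging scheme are illustrative assumptions, not Google’s exact evaluation pipeline.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class UtteranceLabel:
    sensible: bool   # does the response make sense in context?
    specific: bool   # does it engage with what was just said?


def ssa(labels: List[UtteranceLabel]) -> float:
    """Sensibleness and Specificity Average: mean of the fraction of
    utterances judged sensible and the fraction judged specific."""
    if not labels:
        raise ValueError("need at least one labeled utterance")
    sensibleness = sum(l.sensible for l in labels) / len(labels)
    specificity = sum(l.specific for l in labels) / len(labels)
    return (sensibleness + specificity) / 2


# Example: "That's nice" is sensible but not specific;
# "Me too, I can't get enough of Roger Federer" is both.
labels = [
    UtteranceLabel(sensible=True, specific=False),
    UtteranceLabel(sensible=True, specific=True),
]
print(f"SSA: {ssa(labels):.0%}")  # -> SSA: 75%
```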
Can I talk to Meena? Not yet. Google says it won’t release a public demo until it has vetted the model for safety and bias, which is probably a good thing. When Microsoft released its chatbot Tay on Twitter in 2016, it started spewing racist, misogynistic invective within hours.