Some of our clients have over 5,000 intents, which cover quite a lot of the business. Still, it is important to let the chatbot be humble and connect the user to a human when the request is too difficult to help with.
Perhaps you could expand on the use case for mimicking a human personality, rather than presenting an interface that's obviously a piece of software that uses human-like language to the best of its ability?
Heck, that's basically most of the value the bot frameworks I've tested provide (speech to text, some UI creation and a bit of domain framework being the rest).
And it's not a small amount of value: we currently have an SMS-based menu system, and people complain that autocorrect changes what they wanted to type, that it doesn't recognize typos, etc. Only programmers and power users are comfortable with the rigidity of a command line.
I'd say that a well-designed chatbot IS better than a command-line interface because it combines the power of the command line (composing commands) with not having to learn and master the commands, plus leniency toward typos, missing parameters, and the like.
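That leniency toward typos can be sketched with simple fuzzy matching. A minimal sketch using Python's standard-library `difflib`; the intent names and cutoff are invented for illustration, and real bots map many phrasings per intent:

```python
from difflib import get_close_matches
from typing import Optional

# Hypothetical intent vocabulary for a support bot.
INTENTS = ["check balance", "transfer money", "reset password", "talk to human"]

def match_intent(user_text: str, cutoff: float = 0.6) -> Optional[str]:
    """Return the closest known intent, tolerating typos, or None."""
    matches = get_close_matches(user_text.lower(), INTENTS, n=1, cutoff=cutoff)
    return matches[0] if matches else None

print(match_intent("transfr mony"))   # typo still resolves to an intent
print(match_intent("qwertyuiop"))     # nothing close: hand off to a human
```

When nothing clears the cutoff, the graceful move is exactly what the comments above argue for: route to a person instead of guessing.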
I'm not a fan of chatbots (even though I work with them), but I do think there's a valid use case when the public needs infrequent interactions with a system (e.g., a help desk, customer support, or occasional purchases), and users definitely always need a way to reach a real person.
Messaging has potential as a channel, but "bots" are a misstep. Faking a personality makes it even worse. Pretending to understand the user is just utterly transparent.
Messaging is good at collecting data, prompting response, reminding people and triggering actions like calls or clicks. None of the strengths are bot-based.
There's just a massive failure of leadership in tech companies (well, in general), so rather than deal with real challenges, they smother themselves in carbon-monoxide statistics -- things that don't set off alarms, but destroy the business through misoptimization.
For a business, a "chatbot" or any feature like it needs to do one thing: solve the user's problem(s). If the user wants to do X within the app or learn about Y, the chatbot needs to help the user with that efficiently and better than a human can for the feature to be successful. The "chatbot having a personality" comes second to "solve the user's problem."
If the users are completely happy with whatever chatbot they are using, then sure, adding some "personality" might be a good idea and increase engagement slightly. But a poorly-performing chatbot that can't help the user but has a personality isn't going to help the business at all.
Voice is just a medium; what I really want is more powerful abstractions. Do more with fewer interactions.
I don't want to ask Alexa what movies are playing nearby; that's only as good as the 1985 Moviefone service. I want to know whether there are movies worth watching, and have Alexa check movies/times/ratings/weather/traffic/my calendar and tell me "Yes - X is certified Fresh and <friend> recommended it on Twitter recently. It's playing at Y theater for $Z, you can walk there in time for the 8:00PM show. Would you like me to buy a ticket?"... I don't want to buy an HDMI cable from Amazon, I want a sample of physical/digital retailers with prices/ETAs/ratings...
Voice, GUI, heck even CLI is fine.
And when it comes to smart homes, I expect learning and proactivity. If I follow GSW, don't make me ask if they're playing, then ask to turn on the TV, then change the channels, then adjust the volume... learn that I care, ask if I care now, and get the game on ASAP if the answer is "yes".
No, I don't want to forcibly converse with a brick wall that clearly doesn't understand my question and is chipperly excited about that fact, fuck you.
So I went from loving my latest Windows 10 PC (the second in a row that had been awesome, even!) to swearing off MS for years (again) in about 30 minutes of completely terrible bot-based customer support.
I sincerely hope that strategy works for them, and they're gaining income somewhere to offset what they've lost from me.
I don't think it will.
In fact, two of the worst chatbot design flaws, in my opinion, are: 1) making the user figure out whether they are talking to a real person or a chatbot, and 2) prompting the user to ask you anything (e.g., "Hi, how can I help you?") rather than guiding them on the scope.
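Both flaws have a cheap fix: identify as a bot up front and open with its actual scope instead of an unbounded prompt. A minimal sketch; the capability list here is invented for illustration:

```python
# Hypothetical capability registry for a support bot.
CAPABILITIES = {
    "billing": "questions about your invoice or payment",
    "shipping": "where your order is",
    "returns": "starting a return or exchange",
}

def greeting() -> str:
    """Scope-setting opener instead of an open-ended 'ask me anything'."""
    options = "\n".join(f"  - {desc}" for desc in CAPABILITIES.values())
    return ("I'm the support bot (not a person). I can help with:\n"
            f"{options}\n"
            "For anything else, type 'agent' to reach a human.")

print(greeting())
```

The opener answers both questions before the user has to ask: what am I talking to, and what can it actually do.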
"M-x doctor" aside, try getting on #emacs and talking to fsbot. I swear I have seen some of the eeriest exchanges between human and robot in that channel – it's not an AI, of course, but man it fakes it well. As someone in the channel once said: "Someone's cheating on their Turing test..."
Agreed 100%, but IMO this is not a function of "personality" but rather a function of deeply understanding user intents. A bot cannot be purposeful if its own designers don't know its purpose from a user-centric perspective.
(Disclaimer: I work on Chatbase, a service for analyzing and optimizing bots)
The problem is you implemented the chatbot in the first place so you wouldn't have to deal with customers directly.
I wonder how one could justify the bots...
Tech is so blessedly, unfailingly logical and we stupid humans have to sully that to make some of us feel happier, and not more effective :(