Chatbot Data from a Product Lens
As chatbots continue to be a new and pioneering space, Instabot is constantly researching, testing, and understanding the many nuances of how bots capture and interpret data.
I had the chance to sit down with Instabot’s Chief Product Officer, Aaron Weymouth, to talk about how much the perception of chatbot data has evolved, and gain some insights on how to successfully execute chatbots as well as interpret user data.
How does Instabot currently collect and think about analytics collected by bots?
Traditionally, the most important metric in most sales SaaS (Software as a Service) platforms is conversions. We agree on the value of a conversion, but it really starts with engagement. Whenever a user comes to a client’s site, we consider the initial action that launches the Instabot an incredibly valuable engagement. From there, it’s an exciting challenge to see how we can optimize that engagement further. We measure that success rate based on the interactions the user takes to reach a final goal.
We track each user response, the abandonment of a bot conversation, the number of times a user is clicking through a bot, and more. The idea is to tell a story about what each engagement is like, so that the client using Instabot to collect information isn’t left guessing what the end user was doing; there should be no ambiguity.
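To make the idea concrete, here is a minimal sketch of what per-conversation event tracking along those lines could look like. The names (`BotEvent`, `Conversation`, the event types) are hypothetical illustrations, not Instabot's actual schema.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical event record -- not Instabot's actual schema.
@dataclass
class BotEvent:
    user_id: str
    event_type: str   # e.g. "prompt_shown", "response", "abandon"
    prompt_id: str

@dataclass
class Conversation:
    user_id: str
    events: List[BotEvent] = field(default_factory=list)

    def log(self, event_type: str, prompt_id: str) -> None:
        self.events.append(BotEvent(self.user_id, event_type, prompt_id))

    def responses(self) -> int:
        # How many prompts the user actually answered.
        return sum(1 for e in self.events if e.event_type == "response")

    def abandoned(self) -> bool:
        # Whether the user dropped the conversation at some prompt.
        return any(e.event_type == "abandon" for e in self.events)

convo = Conversation("user-42")
convo.log("prompt_shown", "intro")
convo.log("response", "intro")
convo.log("abandon", "email_ask")
print(convo.responses(), convo.abandoned())  # 1 True
```

With events recorded at this granularity, the "story" of an engagement — how far a user got and where they stopped — can be reconstructed without guesswork.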
How has the way Instabot approaches and displays analytics changed since its first launch?
We’ve learned a lot about how to structure and format our product since we first started building chatbots. Our initial thought process was that bots were very linear and that the structures were based on “question, binary response, question, binary response” and so on, similar to an online form. With this in mind, our initial mindset was similar to a traditional survey metric, and we would consider: “Did the user ‘complete’ the questions in the bot?”
We quickly realized that chatbot conversations, like any normal conversation, can take any number of twists and turns, and tend to branch out pretty quickly. So we began to consider things in a non-linear manner, using a branch method. With that in mind, it was clear that our initial metric of tallying how many users “completed” conversations didn’t mean much. People cared about how many users interacted with their bot, and whether or not those users engaged with the bot long enough to provide useful information.
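A branching conversation like the one described above can be pictured as a small graph, where each prompt maps possible user responses to the next prompt. This is an illustrative sketch only; the node names and `walk` helper are made up for the example.

```python
# Hypothetical branching conversation structure (illustrative only):
# each node maps possible user responses to the next node.
flow = {
    "intro":        {"yes": "ask_email", "no": "ask_topic"},
    "ask_topic":    {"pricing": "pricing_info", "support": "support_info"},
    "ask_email":    {},        # terminal nodes have no outgoing branches
    "pricing_info": {},
    "support_info": {},
}

def walk(flow, start, responses):
    """Follow a user's responses through the branch structure,
    returning the path of prompts they actually visited."""
    node, path = start, [start]
    for r in responses:
        node = flow[node].get(r)
        if node is None:   # user gave a response with no matching branch
            break
        path.append(node)
    return path

print(walk(flow, "intro", ["no", "pricing"]))
# ['intro', 'ask_topic', 'pricing_info']
```

Under this model, "completing the conversation" is ill-defined — two users can take entirely different but equally successful paths — which is why a per-path view of engagement is more informative than a completion tally.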
Did the user give their email? Did they sign up for a service? Did they ask a certain question? That may happen in the first 3-4 prompts, and a user may not need to complete the entire conversation. Additionally, just because someone didn’t complete every branch of the conversation or reply to each prompt doesn’t mean the conversation wasn’t a success.
I think the biggest change we’ve made involves shifting focus from closed conversations to engagements. We pay attention to how you really “converse” with a user and how successfully bots learn information about users during a conversation.
Were there specific challenges throughout the process of understanding and implementing bot analytics? If so, how did you and your team address those challenges?
Yes, when our perspective shifted from getting someone to complete a conversation to getting someone to better engage, we had to think about a few things.
1. First, from a visual and user experience perspective, we had to ask ourselves: what makes a chat compelling?
Why would users engage with an Instabot over a pop-up form or something that simply asks for your email? So we were, and are, continually looking for ways to entice users to talk to a bot and make them feel more comfortable doing it. The goal was to make Instabot conversational and playful without becoming Clippy from Microsoft Word.
2. Second, we started really focusing on conversation content and structure.
When and how you ask questions makes a big difference with a chatbot. You can have the most visually beautiful chat experience in the world, but if the content feels static or directionless, people will drop off very quickly. As we moved past those metrics and looked more toward engagement, we began to see patterns in which types of prompts spur conversation and encourage users to answer more questions. When prompting a user with a chatbot, saying “Hi, I’m a bot, give me your information” doesn’t cut it. We’ve tested this and saw an immediate drop-off in user engagement. When bots converse in a more personal (and personalized) way, that’s when we see higher engagement and a level of trust is built up, which allows a chatbot to receive more information from a user.
The questions we tackle every day are:
- How do we continue to evolve the way Instabot can execute thoughtful conversations?
- How do we continue to improve upon the fun, engaging, and natural experience?
- How do we evolve bots as users get smarter at understanding chatbots?
- And finally, how do we deliver these smarter, sleeker chatbots in a product that the average marketer or product manager can learn and deploy in a short amount of time?
What do you think are important pieces of data that a chatbot should be able to collect?
One thing we’ve learned is that every chatbot is different, as is every use case we come across. Early on we tried to bucket our users into templates that focused on very specific cases like lead generation, sign up, form entry, etc. But we realized that no one size fits all, and every customer’s business is unique, so trying to define one single metric that everyone found important was impossible.
So we started with the one thing everyone needs: engagement.
1. Can you increase the amount of engagement that happens with a bot?
That comes down to that first prompt, that first intro, that first value-add you pitch, which lets users trust and know that the information and interaction add value for them.
2. Then, the real metric we focus on for each customer is: what is your end goal?
Are you trying to increase demo bookings? Collect more emails for a sales campaign? Do you want more users to sign up for your blog? Do you want users to learn more about your product? With all that in mind, what we’ve moved towards is something we call “Bot Goals”: clients define what metric they want to collect, and then flag the points in a conversation at which that metric (goal) happens. In doing so, we keep the conversation organic and let clients use bots the way they really want, rather than forcing them into a template or metric that might not work for them. We’re aiming to let each of our clients dictate what they need and help them build the exact bot that accomplishes it.
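As a rough sketch of the idea, a goal can be any conversation node the client flags, and conversion is simply the share of engaged sessions that passed through a flagged node. The node names and session data below are hypothetical; the real "Bot Goals" feature's internals are not public.

```python
# Hypothetical "Bot Goals" tally (illustrative only).
# A goal is any conversation node the client has flagged,
# e.g. the point where an email is captured or a demo is booked.
goal_nodes = {"email_captured", "demo_booked"}

# Each session is the list of nodes a user actually visited.
sessions = [
    ["intro", "ask_email", "email_captured"],
    ["intro", "ask_topic"],                               # engaged, no goal reached
    ["intro", "ask_email", "email_captured", "demo_booked"],
]

engaged = len(sessions)
# A session converts if it touched at least one flagged goal node.
converted = sum(1 for s in sessions if goal_nodes & set(s))
print(f"goal conversion: {converted}/{engaged}")  # goal conversion: 2/3
```

Because goals are attached to nodes rather than to a fixed script, the same measurement works no matter how the conversation branches.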
Given the recent need to comply with GDPR, how did Instabot's product team respond?
I’m sure everyone is getting sick of the GDPR-driven prompts of the past few weeks, with every site re-publishing its cookie policies.
What’s interesting is that a lot of those restrictions and requirements came out of the need to protect consumers from large companies who didn’t have their end users' best interest in mind. That could have meant companies were selling user data to third parties without consent, or just not paying attention to where data was being sent. I think these policies are fantastic to prevent malicious use of data.
That being said, at Instabot we’ve always had the user's best interest in mind. We didn’t build the product with the intent of collecting all user data to sell it to third-party sources. We built this product to give businesses a better way to communicate with their users, and to allow users to more quickly and accurately get the information they need from the services they use.
So, from that perspective, after making sure we notified users and complied with all GDPR requirements, nothing really changed for us. We’ve always had user privacy and their best interest in mind, and that hasn’t changed. We don’t re-market or resell this data.
What are the plans to improve upon the way Instabot currently collects chatbot data?
We plan to continue to measure and analyze the foundational building blocks of what makes a conversation successful. The next thing for us is really about not just getting someone in the door and talking, but measuring both their level of engagement and how successful the conversation prompts are at converting goals.
In the future, you’ll see new dashboards including advanced drop off metrics with a stronger focus on providing our clients with the data to determine how to build better, more successful responses. I think we’re going to see some incredibly exciting data on the way that is going to really change the way the market thinks about consumer chatbots.
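One simple form a drop-off metric like the one described above could take is counting, per prompt, how many sessions ended there. This is a minimal sketch with made-up session data, not a description of the planned dashboards.

```python
from collections import Counter

# Hypothetical per-prompt drop-off calculation (illustrative only).
# Each session is the ordered list of prompts a user saw;
# the last entry is where the conversation ended.
sessions = [
    ["intro", "ask_topic", "ask_email"],
    ["intro", "ask_topic"],
    ["intro"],
]

total = len(sessions)
last_prompt = Counter(s[-1] for s in sessions)

for prompt, n in last_prompt.most_common():
    print(f"{prompt}: {n / total:.0%} of sessions ended here")
```

Surfacing which prompts sessions end at points a bot builder directly to the responses that need rewriting.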