Ramona chatbot


As I was designing and building out my remote maternal monitoring app, it became apparent through user testing that there needed to be a way to search through the various conditions more quickly than the visual interface. Further inquiry and research pointed to a chatbot solution.

Please note that this project was sold and all assets are now property of the buyer. Therefore, this case study will focus solely on the process of designing and testing a chatbot solution.


Overview

Using Dialogflow, Google’s conversational AI tool, I created and tested a chatbot interface to offer a conversational solution for my users.

What I did

Prototyping, testing, voice and tone

MY ROLE

Conversational designer

DURATION

Six months



The Inspiration

To be clear, I didn’t set out to make a chatbot for the sake of making a chatbot. At the time, I thought chatbots were awkward and frustrating, and I assumed most people felt the same way. I was wrong. While conducting my user interviews, I found that most women were receptive to a conversational interaction and felt it would come in handy in times of need.

So with this in mind, I dug a little deeper. I found that conversational user interface technology had improved tremendously over the past few years and that HIPAA compliance issues could be overcome. In fact, Google’s AI division was working on its own conversational programs for healthcare. I also realized that a conversational user interface (CUI) might benefit those who cannot use or understand the visual interface I had designed, making the app even more accessible.

Word bubble with the phrase "Hello? Is it me you're looking for" inside.

Why a conversational solution?

1. The users asked for it. 2. It can aid in accessibility. 3. Tech advancements make it less awkward.


STEP 1: VOICE AND TONE

A BBC photo of the characters from Call the Midwife.

Why voice and tone now?

Voice and tone set the stage for how your app will interact with your users. Do this at the beginning!

Photo from BBC

The first step of the CUI project was to establish the voice and tone. I felt it needed to be clear, direct, and authoritative, given my target population’s tendency to “sit it out” rather than seek medical attention. I thought of the nurses who treated me in the hospital and other women in healthcare, both real and fictional, as I put together the voice and tone for the app. The characters from “Call the Midwife” also served as great inspiration.

Voice and tone tell me what the CUI is and is NOT. I made sure to write down that the CUI is polite, direct, and not too wordy. The CUI refers to anatomy by its proper names. The CUI does not use innuendo and isn’t wishy-washy in its responses.


Step 2: Mind mapping

Why a Mind Map?

A mind map serves to show the flow of conversation and/or thoughts. While mind mapping, I learned that responses needed to be short, direct, and clear, as some symptoms and issues are life-threatening.

After the voice and tone were selected, I developed a mind map to chart the flow of the CUI. Essentially, I took the wireframes from the visual UI and transferred them into a conversational UI chart. Here I found that the most elegant responses were short and direct (fitting the voice and tone I had established). I also discovered the importance of the app confirming responses by repeating what the user said. For example, if the user uttered “chest pain”, I needed the CUI to respond with “If you are experiencing chest pain after the birth of a baby, you need to dial 911 or go directly to your nearest emergency room”.
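The “confirm by repeating” pattern can be pictured as a simple lookup that echoes the recognized symptom back in its response. This is only an illustrative sketch; the symptom names and the fallback wording here are hypothetical, not the shipped content.

```python
# Illustrative sketch of the "confirm by repeating" pattern from the mind map.
# Symptom names and response wording are hypothetical examples.
RESPONSES = {
    "chest pain": (
        "If you are experiencing chest pain after the birth of a baby, "
        "you need to dial 911 or go directly to your nearest emergency room."
    ),
    "heavy bleeding": (
        "If you are experiencing heavy bleeding, contact your provider right away."
    ),
}

def respond(utterance: str) -> str:
    """Echo the recognized symptom back so the user can confirm it was understood."""
    symptom = utterance.strip().lower()
    if symptom in RESPONSES:
        return RESPONSES[symptom]
    return "I'm sorry, I didn't catch that. Could you describe your symptom again?"

print(respond("chest pain"))
```

Repeating the symptom in the response doubles as an implicit confirmation step, which matters when a mismatch could send a user down the wrong triage path.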






STEP 3: Scripting and testing

Illustration of a chat on mobile.

Why Wizard of Oz testing?

Wizard of Oz testing is an effective and inexpensive way to test a concept with real users.

At this point, I wanted to see whether people felt comfortable interacting with the CUI, which was still just a prototype of a chatbot. After all, it would be a tremendous waste of time to build a chatbot no one wanted to use. Together with a couple of my classmates, I wrote scripts, tested various scenarios through Wizard of Oz testing, and recorded any additional feedback the participants offered.

In sum, I drew a few interesting conclusions from this initial study. First, some people will type in a litany of questions all at once! I wasn’t sure what to do with that information, but it was noted. Second, the majority of respondents indicated that they would welcome use of a chatbot. This verified what I had discovered in those initial interviews. Let’s move forward!


STEP 4: Switching gears and finding our MVP

In January of 2019, I won my first award for my remote maternal monitoring app, and the pace and direction began to change. I had to make sure that every decision was carefully thought out, as the app was under the scrutiny of a number of healthcare professionals and experts. My initial data showed that the chatbot held promise, but developing the full, complete iteration of the chatbot didn’t make sense yet: I needed to find my MVP. So while I continued to gather data on the full app, I kept in mind that the first stages of actual development would center on the postpartum app and, with it, a chatbot designed to help women during the postpartum period.



Why Dialogflow?

Dialogflow is a free tool from Google that allows you to create a chatbot that works without you being present. You can collect data and see responses, too! This is a perfect way to test a more robust version of a chatbot.

Photo from coursity.com

STEP 5: Learn Dialogflow

It was also around this time that I learned to use Dialogflow, Google’s conversational AI software. It is an amazing tool in which you define intents and entities and generate responses from them. For this part of the project, I wanted to take the logic flow I had already worked out and turn it into a conversational flow, so I simply took the intents and entities I had established and fed them into Dialogflow. I cannot tell you how excited I became watching the chatbot answer my inquiries. It was definitely a nerd moment.
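Conceptually, an intent groups the things a user might be trying to say, and an entity groups the interchangeable terms they might use to say it. The toy sketch below illustrates that relationship only; it is not how Dialogflow actually matches (Dialogflow uses machine learning, not substring checks), and all the names and phrasings are invented.

```python
# Toy illustration of the intent/entity concept. Dialogflow's real matching is
# ML-based; the entity values, synonyms, and intent names here are hypothetical.
ENTITIES = {
    "symptom": {
        "chest pain": ["chest pain", "pain in my chest", "chest hurts"],
        "heavy bleeding": ["heavy bleeding", "bleeding a lot", "soaking a pad"],
    }
}

INTENTS = {
    "report_symptom": {
        "requires_entity": "symptom",
        "response": (
            "If you are experiencing {symptom} after the birth of a baby, "
            "seek medical care right away."
        ),
    }
}

def detect_intent(utterance: str):
    """Match an entity synonym in the utterance, then fill the intent's response."""
    text = utterance.lower()
    for intent_name, intent in INTENTS.items():
        for value, synonyms in ENTITIES[intent["requires_entity"]].items():
            if any(s in text for s in synonyms):
                # Canonical entity value is echoed back, confirming what was heard.
                return intent_name, intent["response"].format(symptom=value)
    return None, "Sorry, I didn't understand. Can you rephrase?"

print(detect_intent("My chest hurts so much"))
```

Note how the synonyms resolve to one canonical entity value, which is what gets echoed back in the response; that is the property the later language-testing step exercises.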



STEP 6: ENSURE THE RIGHT LANGUAGE

It was at this point that I conducted my next user test. I used Typeform to create a survey that I sent to 100 individuals across various social media platforms. Because this survey was an unmoderated user test, I was able to get results within just a few days for very little money. Compensation was offered via a random drawing (this makes things easier for me and more affordable in the long run!). In the survey, I presented an illustration I had made of a woman struggling with a certain symptom and asked respondents how they would enter that particular symptom into a chatbot. The purpose of this user test was to ensure that I was entering the correct language into my chatbot and that the chatbot would understand how women normally discuss their anatomy. It was also a way to check that my illustrations made sense.

Urban Dictionary was also used during this process.

The data from users appeared in a spreadsheet, and I analyzed it. From there, the chatbot was “fed” more intents and entities to ensure that it was robust enough for the next stage of testing: actual user testing of the bot. I also found that my seizure drawing was confusing to many. Drawing a seizure is hard.
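The “feeding” step can be pictured as merging the survey phrasings into each entity’s synonym list. The column names and example rows below are invented for illustration; the real spreadsheet export looked different.

```python
import csv
import io

# Hypothetical export of the survey spreadsheet: one row per respondent answer,
# pairing the symptom shown in the illustration with the phrasing typed in.
survey_csv = io.StringIO(
    "symptom,phrasing\n"
    "heavy bleeding,soaking through a pad every hour\n"
    "heavy bleeding,bleeding way too much\n"
    "seizure,shaking all over\n"
)

# Start from the entity values the bot already knows, each with itself as a synonym.
entities = {"heavy bleeding": {"heavy bleeding"}, "seizure": {"seizure"}}

# Merge each collected phrasing into the matching entity's synonym set.
for row in csv.DictReader(survey_csv):
    entities.setdefault(row["symptom"], set()).add(row["phrasing"].lower())

print(entities["heavy bleeding"])
```

Deduplication comes for free from the set, so re-running the merge after each survey round only ever grows the vocabulary.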


Why test entities and intents?

You need to make sure you’ve captured real language, and this needs to be done before you test your chatbot. You might have missed something in the initial development process, so you “feed” the bot with the additional terms and phrases you learn during this testing phase.


Where am I now?

At this point in the project I was approached by a buyer and sold the IP of the chatbot with the sale of the remote maternal monitoring app. Nifty!

But what if I hadn’t sold the bot, what would have been the next step?

More testing.

With Dialogflow, I can see outcomes of user entries and thus I could analyze a few things:

  1. The accuracy of the bot’s responses.

  2. The robustness of the bot’s understanding of intents.

I could also ask users about their overall satisfaction with the bot to further justify development. These are all things I would want to know. For methodology, I would use a split website with a video of a given symptom on one side (little to no language used) and the chatbot on the other. Users would be instructed to tell the chatbot what they felt was going on in the video. The chatbot was designed so that new responses could be entered at any point without resetting the app.
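Given a log pairing each user entry with the intent the bot matched and the intent a reviewer says it should have matched, the two measures above reduce to simple fractions. The log format and intent names below are assumptions for illustration, not Dialogflow’s actual export format.

```python
# Hypothetical review log: (user entry, intent the bot chose, intent a reviewer
# says it should have chosen). None means the bot fell back without matching.
log = [
    ("my chest hurts", "report_chest_pain", "report_chest_pain"),
    ("bleeding a lot", "report_bleeding", "report_bleeding"),
    ("I feel shaky all over", None, "report_seizure"),  # missed match
]

# Accuracy: how often the bot's chosen intent agreed with the reviewer.
accuracy = sum(1 for _, bot, truth in log if bot == truth) / len(log)

# Coverage (robustness): how often the bot matched any intent at all.
coverage = sum(1 for _, bot, _ in log if bot is not None) / len(log)

print(f"accuracy={accuracy:.2f} coverage={coverage:.2f}")
```

Tracking the two numbers separately matters: low coverage says the bot needs more entity synonyms, while low accuracy with high coverage says the intents themselves are colliding.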