Types of design tools for voice user interfaces and how to use them | by Jesús Martín | Nov, 2021

Read article in Spanish: Tipos de herramientas de diseño de interfaces de voz y cómo utilizarlas

If you want to learn how to design a voice interaction for Alexa, Google Assistant, or any other conversational interface, this article goes through the different tools you can use in your projects and how to work with each one of them.

However, if you are looking for specific software, you can read my personal recommendations in this article, where I talk about Design and prototyping tools for voice interfaces.

Nowadays we can find mainly three types of conversation design tools: scripts, tables, and flowcharts. How do we design with each of them in a voice-only interface?


How do we design conversations with scripts?

Also known as Sample Dialogs, scripts are roughly the voice equivalent of wireframes in GUI design. Scripts are the design artifact we should use from the beginning of every new voice project.

The most interesting advantage of working with dialogs is that they keep us close to words and conversations. Working at that level, it is easier to stay in a conversational mindset and avoid overly complex structural thinking.

Dialogs help us outline interactions rapidly, allowing us to validate how they sound and how they flow. We can use dialogs crafted in any writing software (or on paper!) to create a quick test and iterate on the design.

What are the steps to design a voice interface with scripts?

  1. Set the context in which the conversation takes place. For instance, the first one you design for might be: a new user interacting with a voice-only smart speaker.
  2. Write the conversation in turns between the assistant and its users, from the initial invocation to the goodbye message. Start with the “Happy Path”, considering your users as perfect collaborators that always interact as expected.
  3. Review that users’ answers are realistic given the prompts the assistant is using. Stepping into their shoes is a great exercise to start understanding whether our language is helping route users towards a successful interaction. Otherwise, we might be creating confusing messages that lead to unexpected answers.
  4. Review every single prompt and ask yourself the following questions:
    What is each turn on the interaction for? Is there any turn that we could remove without a negative impact on the interaction?
    What is each sentence on the interaction for? Is there any sentence that we can remove without making the language incomprehensible?
    How do the sentences read at the word level? Are we using any unnecessary verbiage or any word that could make the message confusing?
  5. Read the script out loud and listen to how it sounds. Is there anything you can do to make it easier to understand or to sound better?
  6. Read the part of the assistant from the script to someone else. Write down their answers and reactions. Were your prompts properly understood? Were the answers aligned with what you expected?
  7. If the interaction is going to change depending on users’ answers, work also on the conversation that will be triggered in those scenarios, and test them again as in step 6.
  8. After making improvements to your script, you will be done with your first happy path. Congratulations! Now, go back to step 1, define a context that sets up a different possible situation, and create the script for it (e.g.: how would the experience be on an on-the-go device?).
    This is an iterative process where we need to cover every relevant scenario with an impact on the experience.
  9. With all the expected scripts completed, is there any unhappy path that you are missing? Are there any considerations that could have an impact on the interaction? Some common cases need to be defined (e.g.: stop, repeat, etc.), while other situations may pop up during your different tests. Once identified, work on the scripts for them.

Example of conversation design with a Script

U: Assistant open MyVoiceUI
A: Hello! This is MyVoiceUI. I can tell you what’s on TV now and what will be tonight. What do you want to hear?
U: What’s on now?
A: Here you have it: <TV programs now>
A: Is there anything else I can do for you?
U: No, thank you.
A: Great! Have a good rest of the day!
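The steps above can also be supported with lightweight tooling. As a sketch of my own (not from the article), a script can be captured as a list of turns so that simple checks, like the word-level review in step 4, can be run automatically; the word threshold below is an arbitrary assumption:

```python
# Hypothetical sketch: capture the sample dialog as data so quick checks
# can be run while reviewing prompts. Speaker codes: "U" user, "A" assistant.
script = [
    ("U", "Assistant open MyVoiceUI"),
    ("A", "Hello! This is MyVoiceUI. I can tell you what's on TV now "
          "and what will be tonight. What do you want to hear?"),
    ("U", "What's on now?"),
    ("A", "Here you have it: <TV programs now>"),
    ("A", "Is there anything else I can do for you?"),
    ("U", "No, thank you."),
    ("A", "Great! Have a good rest of the day!"),
]

def long_prompts(turns, max_words=25):
    """Flag assistant prompts longer than max_words: a crude proxy for the
    word-level review in step 4 (the threshold is an assumption)."""
    return [utterance for speaker, utterance in turns
            if speaker == "A" and len(utterance.split()) > max_words]
```

Lowering `max_words` makes the check stricter; the point is only that a script stored as data can be linted, read out loud programmatically, or diffed between iterations.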

How do we design voice interfaces with tables and spreadsheets?

A very popular alternative to scripts is tables. To work with them you can use any software that supports tables, such as Excel or Airtable, as I outlined in this blog post.

Tables help us focus on words and conversation while adding a helpful columns-and-rows structure. Having dedicated columns to define the context or to add notes and comments is definitely the best part of it. The clarity this structure provides is also very useful when sharing design deliverables with the dev team or with other designers.

Tables and spreadsheets are a great tool for interactions that do not require too many turns (three turns or fewer, as a rule of thumb) and where the conversational context is very relevant.

What are the steps to design a voice interface like Alexa or Google Assistant with tables?

  1. Create a table with at least the following columns: Context, User, Assistant, and Comments. You can obviously adapt this basic structure to your needs, but in my experience those four columns have always been needed.
  2. Add a row for each turn between users and the assistant.
  3. Document the first turn of the conversation, specifying the situation and anything that defines the conversational context. Once the context is defined, add the user utterance and the assistant answer.
  4. If you are dealing with a multi-turn experience, add a new row for each new turn. The situation might also need to be specified if it helps clarify the assistant answer. As we saw with the previous design artifact (Scripts), we need to start defining the Happy Path.
  5. Once we are done with our happy path, we can create as many new tables as we need to document the different deviations from that first ideal conversation. Use the “Context” column to clearly state the difference between each new table and the others.
  6. As we did with Scripts, review that the words you are using are the right ones, following steps 4, 5, and 6 of the Scripts process.
  7. Design the conversation for different contexts as we saw in step 8 for Scripts.
  8. Prepare the interaction for classic deviations as in step 9 for designing with Scripts.

Example of conversation design with a Table
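As a hypothetical sketch (mine, not the article's actual deliverable), here is the four-column table from step 1, reusing the MyVoiceUI happy path, serialized so it can be shared with the dev team; the context and comment wording are my own assumptions:

```python
import csv
import io

COLUMNS = ["Context", "User", "Assistant", "Comments"]

# Two turns of the MyVoiceUI happy path as rows of the design table.
rows = [
    {"Context": "New user, voice-only smart speaker",
     "User": "Assistant open MyVoiceUI",
     "Assistant": "Hello! This is MyVoiceUI. I can tell you what's on TV "
                  "now and what will be tonight. What do you want to hear?",
     "Comments": "Welcome prompt lists the two things the skill can do."},
    {"Context": "Same session, second turn",
     "User": "What's on now?",
     "Assistant": "Here you have it: <TV programs now>",
     "Comments": "Follow with a prompt asking if the user needs anything else."},
]

def table_to_csv(rows):
    """Serialize the design table to CSV so it can be shared or versioned."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=COLUMNS)
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue()
```

Each deviation from the happy path would become another list of rows, with its own “Context” values stating how it differs, as described in step 5.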

How to design conversations with flowcharts

Flowcharts are a classic design tool from graphical user interfaces (GUI), but they are also a very popular artifact in conversational design. There are plenty of tools you can use. Some are classic flowchart tools like Xmind, OmniGraffle, or draw.io, while others are specific to conversational design, like Voiceflow or fabble.io.

The greatest advantage of flowchart tools is that they let us represent, in less space and in a more visual way, the different alternatives and paths the conversation can follow.

What are the steps to design a conversational interface with flowcharts?

  1. Create the first box for your welcome message. The welcome message is somewhat similar to a website’s home page: it is the first message your users will get, and they will move from it to the rest of the interaction.
  2. Create a second level, normally below the welcome message or to its right. In that second level, create as many boxes as possible answers you expect to the welcome message. Each box can hold different utterances that share the same customer intent. Use arrows to link the original box to the new ones you create in this second level. On the connection, you can add a tag specifying the customer intent in the format you prefer (description, intent ID…). Some tools like Voiceflow don’t require a specific box but instead include these user utterances as part of the intent training specified in the connector.
  3. For each of the different customer intents, create a new level with the assistant prompt in a new set of boxes.
  4. For each of these new prompts, define once more, as we did in step 2, the possible answers the users can give to the assistant prompts.
  5. Keep creating levels, boxes, and connectors for the different paths users can follow. In some cases those connectors can link boxes from nonadjacent levels, making the artifact a bit hard to read. Make good use of space and utilize clear descriptions and tags to improve clarity.
  6. As we did with Scripts, review that the words you are using are the right ones, following steps 4, 5, and 6 of the Scripts process.
  7. Design the conversation for different contexts as we saw in step 8 for Scripts.
  8. Your diagram should showcase not only the happy path but every possible path users can follow. This makes the design documentation easier to visualize than scripts or tables. If the canvas gets too crowded, you can always split the diagram into different features and represent them on separate canvases.

Example of conversation design with a Flowchart
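The boxes-and-connectors structure from the steps above maps naturally onto a graph. Below is a minimal sketch of the MyVoiceUI flow as data (node and intent names are my own assumptions, not the article's), with a helper that verifies no prompt box is orphaned:

```python
# Hypothetical sketch of the MyVoiceUI flow as a graph: each key is an
# assistant prompt (box), each edge is a customer intent (connector).
flow = {
    "Welcome": {
        "WhatsOnNowIntent": "ListNowPrograms",
        "WhatsOnTonightIntent": "ListTonightPrograms",
        "StopIntent": "Goodbye",
    },
    "ListNowPrograms": {"YesIntent": "Welcome", "NoIntent": "Goodbye"},
    "ListTonightPrograms": {"YesIntent": "Welcome", "NoIntent": "Goodbye"},
    "Goodbye": {},
}

def reachable(graph, start="Welcome"):
    """Walk the graph from the welcome box and return every box users can
    actually reach; any box missing from the result is orphaned."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph.get(node, {}).values())
    return seen
```

Edges back to "Welcome" illustrate the connectors between nonadjacent levels mentioned in step 5, which are easy to express as data even when they make the visual diagram harder to read.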

As we have seen, there are mainly three types of design tools: scripts, tables, and flowcharts. The choice among them depends on the interaction we are designing (e.g.: single-turn vs. multi-turn), but it is also tied to the dynamics of the team we work with. The other designers you work with, and especially the dev team, have a lot to say about how the design needs to be documented.

In my professional experience, a combination of the different types of tools is the best approach in most cases. I particularly recommend starting new designs with scripts and, depending on the interaction and its complexity, moving on to tables or flowcharts.
