In the rest of this chapter, we will be implementing an AI-based voice recognition and response system in the robot and creating our own custom voice interface. We will be using Mycroft, an open source, voice-activated digital assistant that is adept at understanding speech and can easily be extended with new functions and custom interfaces.
The process we will use for voice interaction with the robot follows this script:
- Wake word ("Hey, Albert")
- Pause while the robot makes a beep sound to show it is listening
- Command or query from the human ("Move forward one step")
- Robot responds verbally ("Moving forward six inches")
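To preview how this script maps onto Mycroft, here is a minimal sketch of a skill that handles the "move forward" command. The skill name, the `move.forward.intent` file, and the `send_drive_command()` helper are assumptions made for illustration; wake-word detection, the listening beep, and converting the spoken command to text are all handled by Mycroft itself before the skill's handler runs.

```python
from mycroft import MycroftSkill, intent_file_handler


def send_drive_command(distance_inches):
    """Hypothetical stand-in for the robot's own drive interface
    (for example, a serial command or ROS publisher)."""
    pass


class AlbertMoveSkill(MycroftSkill):
    """Sketch of a skill covering the 'move forward one step' exchange."""

    @intent_file_handler('move.forward.intent')
    def handle_move_forward(self, message):
        # By the time this handler runs, Mycroft has already heard the wake
        # word, played its listening beep, and transcribed the command.
        send_drive_command(distance_inches=6)

        # speak() hands the reply text to Mycroft's text-to-speech engine,
        # producing the verbal response in the script above.
        self.speak('Moving forward six inches')


def create_skill():
    # Standard entry point Mycroft calls when loading a skill
    return AlbertMoveSkill()
```

In a Mycroft skill, the intent file (here `move.forward.intent`) lists sample phrasings such as "move forward one step" that Mycroft matches against the transcribed command before invoking the handler.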
There are two forms of speech conversion involved in this process that greatly simplify matters for the robot. First, the ...