This is the final method: we have now taken the picture using the camera and sent that image to Azure for analysis. Our final task is to display the results to the user. Before we look at the code, we should consider exactly what we are trying to achieve here.
Our specific application is designed to indicate someone's emotional state at the time the picture was taken. When we pass the image to Azure, it will not return a statement saying "this person is happy"; instead, it will return a list of detected faces and, for each face, a set of possible emotions, each accompanied by a confidence score indicating how strongly the service believes that face displays that emotion. For example, have a quick look in the mirror: how do you look? Are you happy, angry, ...
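To make the shape of that response concrete, here is a small sketch in Python. The response layout and field names (`faceId`, `emotion`) follow the general pattern of the Face API's emotion attributes, but the exact structure and the sample values here are illustrative assumptions, not the service's literal output; the point is simply that each face carries a score per emotion, and we pick the highest-scoring one:

```python
# Hypothetical, simplified response: a list of detected faces, each with
# per-emotion confidence scores (values are made up for illustration).
faces = [
    {
        "faceId": "example-face-id",
        "emotion": {
            "happiness": 0.92,
            "neutral": 0.05,
            "sadness": 0.02,
            "anger": 0.01,
        },
    },
]

def dominant_emotion(face):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(face["emotion"].items(), key=lambda kv: kv[1])

for face in faces:
    name, score = dominant_emotion(face)
    print(f"Detected emotion: {name} ({score:.2f})")
```

This is the essence of what our display code must do: rather than showing the raw scores, it summarizes each face by its most likely emotion.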