Misty wave back
Misty can find you and react to your movements!
In this skill, Misty uses two onboard AI capabilities:
Object Detection
Human Pose Estimation
Object Detection is used to make Misty look at the closest person. For this specific skill, I only wanted Misty to find humans, so I look specifically for the first human object and ignore the rest.
Human pose estimation is used to detect the waving arm gesture. The event provides 17 keypoints per message (nose, eyes, ears, shoulders, elbows, wrists, hips, knees, and ankles). Using these keypoints, logic can be built to detect specific gestures.
In this case, the logic will be:
Elbow is lower than Shoulder && Shoulder is lower than Wrist.
This project will only work in the Misty Desktop Environment because we will modify the events file, exactly as we did for the QR code detector.
Open the folder containing the Python-SDK that you use for the desktop environment. The folder should look like this one.

Open the MistyPy folder, open the file Events.py in Visual Studio Code, and modify the Events class to add the PoseEstimation event.

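A hypothetical sketch of what that modification might look like; the new attribute simply mirrors the naming pattern of the entries already in the Events class:

```python
# Sketch of the edit to the Events class in MistyPy/Events.py.
# Only the PoseEstimation line is new; the existing entries stay as-is.
class Events:
    # ...existing events, for example:
    ObjectDetection = "ObjectDetection"
    # New entry that lets you subscribe to pose-estimation messages:
    PoseEstimation = "PoseEstimation"
```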
Now you're ready to use Misty's Human Pose Estimation capabilities in Python.
Constants
Since Misty will have to track your face, you will need some constants describing the maximum range of movement of Misty's head.
To get those constants you can use the same code as the one in the Constants section of the Misty follow human project.
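For illustration, the constants might look something like the sketch below. The numeric values here are placeholders, not measurements: read the real limits from your own robot as described in the Misty follow human project.

```python
# Illustrative constants only -- replace with the values read from your robot.
MISTY_IP = "192.168.1.100"  # placeholder IP address of your Misty

# Assumed head range limits, in degrees (negative pitch = looking up,
# positive yaw = looking left; verify against your own robot).
HEAD_PITCH_UP = -40
HEAD_PITCH_DOWN = 26
HEAD_YAW_LEFT = 81
HEAD_YAW_RIGHT = -81
```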
Python code
In this code, each group of functions or variables is explained where it is used.
As always, the first steps are importing the libraries, initializing the robot, and defining the constants.
Right after that, the same pair of functions used in Misty follow human gets Misty's current head position.
Then you can find the logic behind the Human Pose Estimation event. These are the 17 keypoints that Misty can recognize:
NOSE(0)
LEFT_EYE(1)
RIGHT_EYE(2)
LEFT_EAR(3)
RIGHT_EAR(4)
LEFT_SHOULDER(5)
RIGHT_SHOULDER(6)
LEFT_ELBOW(7)
RIGHT_ELBOW(8)
LEFT_WRIST(9)
RIGHT_WRIST(10)
LEFT_HIP(11)
RIGHT_HIP(12)
LEFT_KNEE(13)
RIGHT_KNEE(14)
LEFT_ANKLE(15)
RIGHT_ANKLE(16)
Example of data received for each keypoint:
bodyPart: 0, confidence: 0.3205725, imageX: 191, imageY: 253, pitch: 0.003858468, yaw: -0.126723886
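Under the stated rule (elbow below the shoulder, wrist above the shoulder, with a smaller imageY meaning higher up in the frame), the gesture check can be sketched as a pure function over the keypoint list. The indices follow the list above; the confidence threshold and field names are assumptions:

```python
# Keypoint indices, taken from the list above.
LEFT_SHOULDER, RIGHT_SHOULDER = 5, 6
LEFT_ELBOW, RIGHT_ELBOW = 7, 8
LEFT_WRIST, RIGHT_WRIST = 9, 10

MIN_CONFIDENCE = 0.3  # assumed threshold; tune for your lighting

def is_waving(keypoints):
    """Return True if either arm matches the wave pose.

    keypoints: list of dicts with 'bodyPart', 'confidence' and 'imageY'
    fields (as in the example message above). A smaller imageY means
    the keypoint is higher in the camera frame.
    """
    by_part = {kp["bodyPart"]: kp for kp in keypoints
               if kp["confidence"] >= MIN_CONFIDENCE}
    for s, e, w in ((LEFT_SHOULDER, LEFT_ELBOW, LEFT_WRIST),
                    (RIGHT_SHOULDER, RIGHT_ELBOW, RIGHT_WRIST)):
        if s in by_part and e in by_part and w in by_part:
            # Elbow below shoulder AND wrist above shoulder.
            if by_part[e]["imageY"] > by_part[s]["imageY"] > by_part[w]["imageY"]:
                return True
    return False
```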
After the gesture is recognized, it's time to animate Misty; you can modify the animation in the wave_back function.
In the person detection function, the first part picks out the closest person when multiple people are in front of Misty, while the second part adjusts Misty's head position.
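A common way to approximate "closest person" from a 2D camera is to pick the detection with the largest bounding box. A minimal sketch of that idea, assuming the bounding-box field names of Misty's ObjectDetection messages (verify them against your own event payloads):

```python
def closest_person(detections):
    """Pick the 'person' detection with the largest bounding-box area.

    detections: list of dicts with a 'description' field and assumed
    'imageLocationLeft/Right/Top/Bottom' pixel coordinates. A larger
    box is treated as a closer person. Returns None if nobody is seen.
    Misty's head can then be steered toward the center of that box.
    """
    people = [d for d in detections if d.get("description") == "person"]
    if not people:
        return None

    def area(d):
        return ((d["imageLocationRight"] - d["imageLocationLeft"]) *
                (d["imageLocationBottom"] - d["imageLocationTop"]))

    return max(people, key=area)
```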
The last lines of code start the whole program and keep it alive.
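That startup-and-keep-alive step might be sketched like this; the IP address is a placeholder, person_detection and pose_estimation stand for your own callback functions, and the register_event keyword names are assumptions based on the MistyPy SDK:

```python
def main():
    """Hypothetical startup: requires a reachable Misty and the
    modified Events.py from earlier in this project."""
    from mistyPy.Robot import Robot
    from mistyPy.Events import Events

    misty = Robot("<MISTY_IP>")  # placeholder: your robot's IP address

    # Subscribe the callbacks defined elsewhere in the script.
    misty.register_event(event_type=Events.ObjectDetection,
                         event_name="person_detection",
                         callback_function=person_detection)
    misty.register_event(event_type=Events.PoseEstimation,
                         event_name="pose_estimation",
                         callback_function=pose_estimation)

    # Block here so the event callbacks keep firing.
    misty.keep_alive()
```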