Yorick Becomes a Mimic: Wireless 3-Axis Skull Control via Motion Capture

Introduction

The sensor cap and the skull it wirelessly controls.

A few years back I presented my Yorick Alexa project at a local Hack & Tell meeting. Someone there asked if I could hook up a microphone to provide the voice, rather than Alexa. That's a pretty trivial change (much easier than getting Alexa interfaced in the first place), but it gave me the idea of capturing a person's head motions for the skull in real time, rather than using pre-canned, repeated routines. This Mimic project was the result.

This project uses two Raspberry Pi’s, communicating over WiFi using XMLRPC, along with an Inertial Measurement Unit (IMU) sensor and a Maestro Servo Controller to have a 3-axis skull wirelessly mimic the head movements of the operator, who is wearing a baseball cap with the sensor Pi and IMU unit mounted on the brim. (A shoutout to Greg G on Haunt Forum for the ball cap mount idea).

Side note: My first job out of college was as a systems engineer on the Shuttle Mission Simulator, including the simulation code for the rate gyro assemblies and the on-board accelerometers. There, I first heard of this weird thing called a “quaternion.” Now, over 40 years later, here I am using them in a hobby project!

Overview

Sensor unit mounted on a baseball cap

The project has two units: a sensor unit and a controller unit. The sensor unit consists of a Raspberry Pi Zero W and an Adafruit BNO055 9-Degrees-of-Freedom IMU board. The controller unit consists of another Pi Zero W and a Pololu Maestro Servo Controller. The two communicate with each other using XMLRPC running over WiFi, with the sensor unit acting as the server and the controller unit as the client.

I mounted the sensor Pi and IMU board on the brim of a baseball cap, so it will track my head movements.

The code for the project is open source, of course, and posted on GitHub.

How it Works

The IMU board has a triaxial accelerometer, a triaxial gyroscope, and a triaxial magnetometer. The magnetometer provides orientation relative to magnetic north and is not used in the IMU-only mode used in this project, since only orientation relative to the starting position (which should be looking straight forward and level) is needed; the accelerometer and gyroscope provide that. The secret sauce in this board is that it includes a high-speed ARM Cortex-M0 based processor that takes in the raw data from all the sensors, fuses it, and outputs the calculated orientation in real time.

The orientation is provided as Euler angles (think roll, pitch, and yaw rotations about the board's x, y, and z axes), as the x, y, and z components of the gravitational force vector, or as a quaternion (which defines the direction of a single rotation axis and the amount of rotation about that axis). Unfortunately, the chip has some flaws in calculating Euler angles, so it's safer to use the quaternion output. For this project, the aviation convention is used for angles (x axis looking forward, y axis pointing right, and z axis down). Note, however, that some of the servos move in the opposite direction, so you'll see some sign changes in the function that converts the angles to actual servo commands.

To move the skull, though, we want Euler angles, since they map directly to the skull's tilt, nod, and turn servos. Fortunately, the math to convert the quaternion into the proper Euler angles is easy to find.
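Here's a minimal sketch of that conversion using the standard formulas, with angles returned in degrees for the aviation convention described above. This is illustrative math, not the project's exact function:

import math

def quaternion_to_euler(w, x, y, z):
    # Roll: rotation about the x (forward) axis
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Pitch: rotation about the y (right) axis, clamped to avoid
    # math domain errors from small numerical overshoots
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
    # Yaw: rotation about the z (down) axis
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return tuple(math.degrees(a) for a in (roll, pitch, yaw))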

In this project, the sensor Pi is set up as an XMLRPC server. When it receives the appropriate request, it queries the board for the current orientation, provided as a quaternion, and wirelessly sends it to the controller Pi that made the request.

The controller Pi makes this request 50 times per second. Once it has the quaternion, it converts it to the three Euler angles, and then converts each angle into the appropriate servo command to pass on to the maestro.py module that communicates with the Maestro servo controller. In addition, there is a subroutine that sends commands to move the eyes servo in a pre-determined, but relatively random, fashion.
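In outline, the controller loop looks something like the sketch below. The sensor hostname, port, and the angle_to_target() mapping are illustrative assumptions rather than the project's exact code; Controller() and setTarget() come from the maestro.py module:

import time
import xmlrpc.client
import maestro  # the maestro.py module that talks to the Maestro

proxy = xmlrpc.client.ServerProxy("http://sensor-pi.local:8000")  # address is an assumption
servo = maestro.Controller()  # opens the Maestro's USB serial link

def angle_to_target(degrees, center=6000, scale=40):
    # Hypothetical mapping: Maestro targets are in quarter-microseconds
    # (6000 = 1.5 ms center); scale and sign must be tuned per servo
    return int(center + degrees * scale)

while True:
    w, x, y, z = proxy.read_sensor()                    # query the sensor Pi
    roll, pitch, yaw = quaternion_to_euler(w, x, y, z)  # sketch above
    servo.setTarget(0, angle_to_target(roll))           # tilt
    servo.setTarget(1, angle_to_target(pitch))          # nod
    servo.setTarget(2, angle_to_target(yaw))            # turn
    time.sleep(1 / 50)                                  # ~50 requests per second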

This video [insert] shows the system in action. The goal is not perfectly synchronized motion between the operator and the skull, as the servos will always introduce significant delay. Rather, the intent is to generate spontaneous and realistic movements by capturing the motion of the operator, who normally will be out of sight.

Software Prerequisites

The sensor software uses CircuitPython, so Adafruit's Blinka library must be installed to support CircuitPython on a Pi. In addition, the adafruit_bno055 library is needed to interface with the board. Be sure to use the CircuitPython version rather than the earlier one.

Hardware

Closeup of the BNO055 sensor and Pi Zero sensor unit

This project requires two Raspberry Pi’s with WiFi (I used Pi Zero W’s), an Adafruit BNO055 IMU board, and a Pololu Maestro Servo Controller to control the motions of a 3-axis skull. You also need some jumper wires and a USB OTG cable. The OTG cable is used to connect the controller Pi to the Maestro Servo Controller.

Sensor

The sensor unit captures the operator's head movements. The sensor software is a single program, imu_rpc_server.py. It is configured to use a UART interface to the BNO055 board. You can use an I2C interface instead by changing a few lines at the top of the program. However, Raspberry Pi's have a hardware issue with I2C clock stretching, and the sensor board uses clock stretching. This gave me extremely unreliable results when I was developing the software, and I put the entire project aside for months until I came back and learned about the hardware issue. If you want to use I2C anyway, there is a workaround: slow the bus speed down so that clock stretching will rarely (hopefully never) be needed. You can read more about the issue here: https://www.mcgurrin.info/robots/723/.

Normally, when the Pi kernel boots up, it puts a login terminal on the serial port. You'll need to turn this off if using the UART interface. To do so, run the raspi-config tool, go to Interface Options, then Serial Port, and disable the login shell over the serial connection. Then reboot the Pi. If you later need to re-enable it, just follow the same procedure. To wire up the sensor board and the Pi using the UART interface:

  • Connect BNO055 Vin to Raspberry Pi 3.3V power
  • Connect BNO055 GND to Raspberry Pi ground
  • Connect BNO055 SDA (now UART TX) to Raspberry Pi RXD pin
  • Connect BNO055 SCL (now UART RX) to Raspberry Pi TXD pin
  • Connect BNO055 PS1 to BNO055 Vin / Raspberry Pi 3.3V power

You can test that everything is working by running the simpletestuart.py program. It will print out the temperature of the board, the individual sensor readings, and the integrated Euler angle and quaternion values, as well as the calibration status and the axis map, updating every 5 seconds. Don't worry if the magnetometer readings are None: this project uses relative orientation, so the software turns off the magnetometer. For more information on the board and its various outputs and settings, see the datasheet.
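If you'd rather start from scratch, a bare-bones version of such a test looks something like this, assuming the UART wiring above (a sketch, not simpletestuart.py itself):

import time
import serial
import adafruit_bno055

uart = serial.Serial("/dev/serial0", 115200, timeout=1)
sensor = adafruit_bno055.BNO055_UART(uart)

while True:
    print("Temperature:", sensor.temperature)
    print("Euler:", sensor.euler)
    print("Quaternion:", sensor.quaternion)
    print("Calibration:", sensor.calibration_status)
    time.sleep(5)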

The imu_rpc_server.py program is similar to the test program, but it only reads the quaternion values (the Euler angles are unreliable on this sensor). It also starts an XMLRPC server that responds to read_sensor requests by returning the quaternion for the current orientation. Once started, the program runs continuously until forcibly closed.
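The server side of that pattern can be as small as this sketch (the port number is an arbitrary assumption; the function name matches the read_sensor request described above):

import serial
import adafruit_bno055
from xmlrpc.server import SimpleXMLRPCServer

uart = serial.Serial("/dev/serial0", 115200, timeout=1)
sensor = adafruit_bno055.BNO055_UART(uart)

def read_sensor():
    # Current orientation as a (w, x, y, z) quaternion tuple
    return sensor.quaternion

server = SimpleXMLRPCServer(("0.0.0.0", 8000))  # port is an assumption
server.register_function(read_sensor, "read_sensor")
server.serve_forever()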

Controller

The controller software runs on a different Pi and consists of two programs: main_controller.py and maestro.py. main_controller.py initializes the servo controller and sets limits on the speed and acceleration of the servos. I found this cuts down on servo noise, provides smoother motion, and keeps the head from whipping around when turning.

The controller queries the sensor Pi via XMLRPC 50 times per second to get the position as a quaternion. It converts the quaternion into Euler angles, converts those angles into servo commands, and sends the commands to the Maestro board. It also generates a sequence of eye movements and sends those to the Maestro as well.

ManServoTest can be used as you set things up to make sure your controller Pi is talking to the Maestro correctly.

Use

Set up the Pi’s to connect to your local WiFi network and install CircuitPython on the sensor Pi (it’s not needed for the controller). Then install the appropriate modules on each one. Connect the sensor Pi to the BNO055 and connect the controller Pi to the Maestro Servo Controller using a USB OTG cable. And, of course, connect the appropriate pins on the servo controller to the servos controlling your skull. Roll or tilt is channel 0, pitch or nod is channel 1, yaw or pan is channel 2, and random eye movements are sent out on channel 3.

Once everything is hooked up, start imu_rpc_server.py on the sensor Pi first, so that the XMLRPC server is running. Then launch main_controller.py on the controller Pi and you're off and running.

Opportunities for Expansion

It would be very straightforward to take the sensor software and use it to capture and record head motions to a file. Then a different program could read back the file and send out the previously captured motion commands.
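As a sketch of the idea, recording could be as simple as polling the same read_sensor interface and writing timestamped quaternions to a CSV file (the hostname and duration are placeholders):

import csv
import time
import xmlrpc.client

proxy = xmlrpc.client.ServerProxy("http://sensor-pi.local:8000")  # placeholder address
with open("motion.csv", "w", newline="") as f:
    writer = csv.writer(f)
    start = time.monotonic()
    while time.monotonic() - start < 30.0:   # record for 30 seconds
        w, x, y, z = proxy.read_sensor()
        writer.writerow([round(time.monotonic() - start, 3), w, x, y, z])
        time.sleep(1 / 50)                   # match the 50 Hz command rate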

A somewhat more complex undertaking would be to capture the motions as described above, but then translate them into the scripting language used on the Maestro servo controller, so that they could be played back without a Pi or other computer. Sending out 50 commands per second, however, would quickly fill the Maestro's limited memory. Instead, one would want to pre-process the motion file so that only key commands are kept (analogous to keyframe animation) and translate only those key commands into the Maestro's scripting language.

Raspberry Pi’s, I2C Clock Stretching, and a Note on UART Interfaces with CircuitPython

My IMU project in development, where problems with I2C data corruption popped up.

I ran into a problem with interfacing Adafruit’s BNO055 breakout board with a Raspberry Pi Zero W, and this led me down a rabbit hole or three regarding the Pi’s implementation of I2C and using CircuitPython on a Pi.

There are pieces of this information spread across the internet, but I figured I’d try to summarize it in one place in the hope it will help others.

I2C Clock Stretching Problems with Raspberry Pi’s

Many sensors and other devices use I2C to communicate with microcontrollers or microcomputers such as an Arduino or Raspberry Pi. Like many, but not all, sensor devices, the BNO055 uses "clock stretching." Clock stretching occurs when the clock line (SCL) is held low after a byte is received (or sent), indicating that the device is not yet ready to process more data. The primary that is communicating with it may not finish transmitting the current bit; it must wait until the clock line actually goes high. Unfortunately, the chip that handles I2C communications on Raspberry Pi boards has a flaw and does not handle clock stretching properly. In my case, this was causing erroneous readouts. Because the flaw is in the hardware, it can't be fixed via a software update, but there are workarounds. As of April 2021, some reports say this has been fixed on Raspberry Pi 4 boards, while other anecdotal reports say the problem persists even on these newest boards. If you have any experience with I2C clock stretching on Pi 4's, leave a comment on what you've found.

While not needed for the workarounds, you can read a lot more detail about the bug at Raspberry Pi I2C clock-stretching bug.

Workarounds

There are three basic ways to work around this issue: 1) lower the clock speed on the I2C bus so that the device won't need to clock stretch, 2) use software I2C rather than the built-in hardware with the flaw, or 3) if supported by your device, use another type of interface, such as SPI or UART.

Lowering the Clock Speed

First, of course, you want to be sure that you’ve enabled I2C communications on your Pi. You can read how to do so here: Enable I2C Interface on the Raspberry Pi.

Next, you want to go into the config file and slow down the clock rate so that it's less likely that clock stretching will be used. A typical default speed on a Pi is 100000 (100 kHz), and it's suggested in multiple locations on the net to slow it down to 10000 (10 kHz). You can do that by adding

dtparam=i2c_arm_baudrate=10000

to your config.txt file. If you need help on how to do that, see this: I2C Clock Stretching. You’ll need to reboot for the change to take effect. If this doesn’t work, you can try slowing it down even further.

Software I2C

Raspberry Pi's operating system includes support for software I2C, which can be enabled with a single line in the config.txt file. Just add

dtoverlay=i2c-gpio,bus=3

This will create an I2C bus called /dev/i2c-3. SDA will be on GPIO23 and SCL will be on GPIO24 which are pins 16 and 18 on the GPIO header respectively. For more information, see Configuring Software I2C on the Raspberry Pi.
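For a quick sanity check of the new bus from Python, something like this should work, assuming the smbus2 library and a BNO055 at its default address of 0x28 (reading register 0x00 returns the chip ID, 0xA0):

from smbus2 import SMBus

with SMBus(3) as bus:                         # the /dev/i2c-3 bus created above
    chip_id = bus.read_byte_data(0x28, 0x00)  # BNO055 CHIP_ID register
    print(hex(chip_id))                       # expect 0xa0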

Other Interfaces

Of course, if the device you are interfacing to supports other communications links, such as SPI or a UART interface, you may want to consider using them rather than the I2C interface.

UART Interfaces with CircuitPython and the Raspberry Pi

CircuitPython has become a very popular choice for writing code for microcontrollers, and it can also be used on Linux boards, such as Raspberry Pi's, to take advantage of the large and growing set of device libraries written for CircuitPython. Several of the libraries provide UART interfaces for boards supporting that interface. However, the instantiation of the interface differs between Linux boards (including Raspberry Pi's) and other boards without an operating system. For example, you may see code like this:

# Use these lines for I2C
# i2c = busio.I2C(board.SCL, board.SDA)
# sensor = adafruit_bno055.BNO055_I2C(i2c)

# Use these lines for UART, except on Linux devices
uart = busio.UART(board.TX, board.RX)
sensor = adafruit_bno055.BNO055_UART(uart)

This works fine for non-Linux boards, but on Linux the CircuitPython busio library is not used for UART interfaces. Instead, you need to import the Python serial library and, assuming you have wired directly to the UART pins, replace the uart = line with something like:

uart = serial.Serial("/dev/ttyS0", baudrate=9600, timeout=1000)

You can also use an external serial-to-USB converter to link up your peripheral device and then use the USB port on your Pi. You can find more details at Adafruit's CircuitPython on Linux and Raspberry Pi UART / Serial page.

Conclusion

If you’re planning a Pi project involving I2C, or have had problems with one in the past, I hope that this collection of information helps you sort things out and implement the best option for your project.

Please share any experiences you’ve had with I2C and Raspberry Pi’s in the comments.

A Sliding Puzzle for the PyBadge and PyBadge LC

A plastic sliding 15 puzzle using numbers

The sliding puzzle has a long history. The first one, a 4×4, 15-tile puzzle, dates to the 1870s. A plastic version is shown to the right. The puzzles may have letters forming words or have pictures instead of, or in addition to, numbers. Basically, the tiles are scrambled and then must be slid, one at a time, into the open space until the puzzle is returned to its original arrangement.

Many such puzzles have been made, in various sizes. In addition to physical puzzles, video versions can be found on computers and the web.

This is a sliding puzzle game written in CircuitPython for the Adafruit PyBadge and PyBadge LC. It uses pictures for the puzzle, with numbers superimposed to make the puzzles easier to solve. It is configurable so that different images can be used, and it supports both 3×3 (8-tile) and 4×4 (15-tile) puzzles. The tiled-graphics approach used in Adafruit's displayio library is ideally suited for a sliding puzzle game.
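To illustrate why, here's a minimal sketch of the tile approach on a recent CircuitPython build; the file path and tile sizes match the 4×4 puzzle described later, but this is illustrative, not the game's exact code:

import board
import displayio

display = board.DISPLAY
sheet = displayio.OnDiskBitmap("/santa/tiles4.bmp")   # 160x128 sprite sheet
grid = displayio.TileGrid(
    sheet,
    pixel_shader=sheet.pixel_shader,
    width=4, height=4,              # 4x4 grid of cells
    tile_width=40, tile_height=32,  # 160/4 x 128/4 pixel tiles
)
group = displayio.Group()
group.append(grid)
display.root_group = group  # use display.show(group) on older CircuitPython

# Sliding a tile is just reassigning tile indices in the grid,
# e.g. swapping the blank cell's index with a neighbor's:
grid[3, 3], grid[2, 3] = grid[2, 3], grid[3, 3]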

How it Works

Complete image for Santa puzzle

PyBadge showing scrambled puzzle

PyBadge showing solved puzzle

After initial setup and bookkeeping, the program runs an infinite while loop. It roughly follows a state machine pattern, with the states being "intro", "setup", "play", and "solved".

“intro” displays the puzzle image and then asks the player to select either a 3×3 or 4×4 puzzle. Once a selection is made, the state transitions to “setup.” In this state, the puzzle is scrambled and the scrambled puzzle is displayed. Then the state transitions to “play.” In “play”, the program monitors the up, down, left, and right buttons and moves the tiles accordingly. After each move, the puzzle is checked to see if it is in the solved position. If so, the state transitions to “solved”. Once in the “solved” state, the program displays a “You Won” message, then the full image. It then waits indefinitely until the player either presses START to go back to the “intro” state and play again or turns the badge off. The software is open source and published on GitHub.
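In skeleton form, the loop looks something like this sketch; the function names are illustrative stand-ins for the game's actual routines, not code from the repository:

state = "intro"
while True:
    if state == "intro":
        show_full_image()
        size = wait_for_size_selection()  # A button -> 3x3, B button -> 4x4
        state = "setup"
    elif state == "setup":
        puzzle = scramble(size)
        display_puzzle(puzzle)
        state = "play"
    elif state == "play":
        move_tiles_from_buttons(puzzle)   # up/down/left/right buttons
        if is_solved(puzzle):
            state = "solved"
    elif state == "solved":
        show_win_message_and_full_image()
        wait_for_start_button()
        state = "intro"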

A Note on Solvability

If one scrambles the puzzle by starting with the solution and then making random allowed moves, the puzzle can always be solved. However, with this approach it takes many, many such moves to really randomize the puzzle. If, instead, one simply places each tile at random, it turns out that only half of the possible arrangements can be slid back to the solution. Borrowing from what others have done, my code chooses a totally random arrangement, then checks whether it is solvable (see the "solvable" function in the code, sketched below). If not, it randomizes the puzzle again, repeating until a solvable arrangement is found and used. The rules for solvability are:

  1. If the grid width is odd (e.g., 3×3), then the number of inversions in a solvable situation is even.
  2. If the grid width is even (e.g., 4×4), and the blank is on an even row counting from the bottom (second-last, fourth-last, etc.), then the number of inversions in a solvable situation is odd.
  3. If the grid width is even, and the blank is on an odd row counting from the bottom (last, third-last, fifth-last, etc.), then the number of inversions in a solvable situation is even.

The plastic slider puzzle shown at the top of this post is not solvable. The blank is on an odd row from the bottom (the first), and there is just one inversion, an odd number.
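In code, the check can look like this sketch, with the tiles listed left to right, top to bottom, and 0 standing in for the blank (a hypothetical version, not the project's exact function):

def solvable(tiles, width):
    # Count inversions, ignoring the blank (0)
    order = [t for t in tiles if t != 0]
    inversions = sum(
        1
        for i in range(len(order))
        for j in range(i + 1, len(order))
        if order[i] > order[j]
    )
    if width % 2 == 1:                     # odd width: rule 1
        return inversions % 2 == 0
    # Even width: the blank's row counting from the bottom (1-based)
    blank_row_from_bottom = len(tiles) // width - tiles.index(0) // width
    if blank_row_from_bottom % 2 == 0:     # even row from bottom: rule 2
        return inversions % 2 == 1
    return inversions % 2 == 0             # odd row from bottom: rule 3

# Example: the solved 4x4 arrangement has zero inversions and the blank
# on the last row (odd from the bottom), so it is (trivially) solvable:
print(solvable(list(range(1, 16)) + [0], 4))  # True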

How to Play the Game

To play, simply load the software onto the PyBadge and turn it on. The display will first show the complete puzzle image, and then will ask you to press the “A” button for a 3×3 (8 tile) puzzle or the “B” button for a 4×4 (15 tile) puzzle. One slot is always empty so that tiles can be moved. Once you have made your selection, you’ll see the puzzle image with the pieces, in the solved state, and then the puzzle will be scrambled for play. The 4×4 puzzle is significantly harder than the 3×3 puzzle and will take more moves to solve, but both are fairly easy with practice.

Use the four directional buttons to slide one tile at a time. The goal is to get the tiles arranged in numerical order, left to right and top to bottom, with the empty spot at the bottom right. Once you have done this, you win! After the complete image is displayed upon winning, you can press the START button to play again. Sometimes you need to press the button a couple of times.

Stuck? There are a number of heuristics for humans to use to solve these puzzles (as well as heuristic algorithms for computers). The approach that works for me is documented here.

Changing the Puzzle Images and Creating Your Own

The parameters.py file stores several parameters, including the name of the folder where the puzzle images are stored. To change, for example, from the Santa to the witch puzzle, simply edit the line puzzle_graphics_folder = "santa" to read puzzle_graphics_folder = "witch", as sketched below the images. I have provided three sets of images for the puzzle: Santa, a witch, and a Valentine's Day floral image:

Flowers and "Happy Valentine's Day" text.

 

Santa, sleigh, and reindeer

 

Flying witch in front of moon, with bat.
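So, after switching to the witch images, the relevant line in parameters.py would read as follows (a sketch showing just this one parameter; the file's other settings are omitted):

puzzle_graphics_folder = "witch"  # folder holding full.bmp, tiles3.bmp, and tiles4.bmp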

To make your own puzzles, you need to create 3 bmp images:

  • The full image, to be saved as “full.bmp” in a new folder
  • A tile image for the 3×3 puzzle, to be saved as “tiles3.bmp” in the same folder
  • A tile image for the 4×4 puzzle, to be saved as “tiles4.bmp” in the same folder

These images must be exactly the right size for the program to work. The full image and the 4×4 tile image must be 160 pixels wide by 128 pixels high. The 3×3 tile image must be 159 pixels wide by 126 pixels high.

Start from the full image. To make the 4×4 tile image, black out the pixels in the lower right of the image (x coordinates 121–160, y coordinates 96–128). You can also put numbers on what will become each tile to make solving the puzzle easier. To do this, I use an image editing program to add a layer with a set of grid lines creating a 4×4 grid. Then I black out the lower-right tile and put numbers in the upper right of each tile. Finally, I delete the grid layer and save the image as a bmp file. Follow the same process for the 3×3 tile image, but first re-scale the total image to 159 × 126 and use a 3×3 rather than a 4×4 grid. Once you have saved the three files in a new folder, change the puzzle_graphics_folder line in parameters.py to point to the new folder name.