Hardware Sensor Fusion Solutions

Kris Winer edited this page Jul 13, 2015

One of the great innovations in motion control in the last few years is the embedding of a processor along with 6 and 9 DoF motion sensors in a single package to provide sensor fusion solutions where the resulting quaternions are read directly by a microcontroller via I2C or SPI. I call this hardware sensor fusion, but in reality the algorithms are executed in software on the embedded processor rather than on the host microcontroller. The advantages are many: the sensor fusion code doesn't take up limited flash memory on the host, the sensor fusion workload is off-loaded from the host to the embedded processor, which is optimized for the task, and highly efficient and sophisticated filtering and autocalibration of the sensors can be accomplished automagically without the user having to become an expert in sensor fusion theory. I will study three of these hardware sensor fusion solutions here: the BNO055 from Bosch, the MAX21100 from Maxim Integrated, and the EM7180 from EM Microelectronics. Although Invensense was among the first to produce a hardware sensor fusion solution with its MPU6050, I am ignoring the 9 DoF MPU9250 here since its embedded DMP can only provide a 6 DoF solution; a 9 DoF solution still requires host processing. Likewise, I just became aware of Fairchild's FIS1100 IMU with embedded motion co-processor, but it looks like this also requires software running on the host to perform the full 9 DoF sensor fusion.

The BNO055 combines Bosch's flagship 9 DoF motion sensor, the BMX055 (itself an agglomeration of the BMA055 accelerometer, BMI055 gyro, and BMM055 magnetometer, all also packaged as separate sensors), with an ARM Cortex M0 processor. The board I am using is this one. The sensor fusion relies on an extended Kalman filter for the fusion proper, plus low- and high-pass filtering, autocalibration, temperature compensation, etc., to fuse the three sensor data streams into a quaternion representation of absolute orientation, as well as providing linear acceleration, gravity, and Euler angle outputs directly readable via I2C or UART from internal registers.
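
To give a sense of how little host-side work hardware fusion requires, here is a minimal Arduino-style sketch that puts the BNO055 into its NDOF fusion mode and reads the quaternion output once per second over I2C. It is an illustrative sketch only, assuming the default I2C address of 0x28 and register addresses and scaling from the BNO055 datasheet, not the exact code used for the tests below:

```cpp
#include <Wire.h>

// BNO055 constants taken from the datasheet (illustrative, not the test code)
#define BNO055_ADDRESS    0x28  // default I2C address (0x29 if the COM3 pin is high)
#define BNO055_OPR_MODE   0x3D  // operation mode register
#define BNO055_QUA_DATA_W 0x20  // first of eight quaternion data registers (W LSB)
#define BNO055_MODE_NDOF  0x0C  // 9 DoF fusion mode

void writeByte(uint8_t reg, uint8_t value) {
  Wire.beginTransmission(BNO055_ADDRESS);
  Wire.write(reg);
  Wire.write(value);
  Wire.endTransmission();
}

void readBytes(uint8_t reg, uint8_t count, uint8_t *dest) {
  Wire.beginTransmission(BNO055_ADDRESS);
  Wire.write(reg);
  Wire.endTransmission(false);                       // repeated start
  Wire.requestFrom((uint8_t)BNO055_ADDRESS, count);
  for (uint8_t i = 0; i < count && Wire.available(); i++) dest[i] = Wire.read();
}

void setup() {
  Serial.begin(38400);
  Wire.begin();
  delay(700);                                        // allow the BNO055 to boot
  writeByte(BNO055_OPR_MODE, BNO055_MODE_NDOF);      // start the on-chip 9 DoF fusion
  delay(20);                                         // mode switch takes a few milliseconds
}

void loop() {
  uint8_t raw[8];
  readBytes(BNO055_QUA_DATA_W, 8, raw);
  // Registers are 16-bit signed, LSB first, in the order W, X, Y, Z;
  // one unit quaternion corresponds to 2^14 = 16384 LSB.
  float q[4];
  for (int i = 0; i < 4; i++) {
    q[i] = (int16_t)(((uint16_t)raw[2*i + 1] << 8) | raw[2*i]) / 16384.0f;
  }
  Serial.print("q = ");
  Serial.print(q[0], 4); Serial.print(", ");
  Serial.print(q[1], 4); Serial.print(", ");
  Serial.print(q[2], 4); Serial.print(", ");
  Serial.println(q[3], 4);
  delay(1000);                                       // 1 Hz, matching the logging rate below
}
```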

The MAX21100 is an accelerometer/gyro combination with an embedded motion merging engine that performs the same sensor fusion tasks as the BNO055, but in this case the magnetometer data must come from an external magnetometer. On the board I am using for the following tests, I pair the MAX21100 with ST Microelectronics' fine LIS3MDL 16-bit magnetometer.

The EM7180 is not itself a motion sensor; rather, it is a motion sensor hub that takes data from a variety of external motion sensors and performs 9 DoF sensor fusion using an extended Kalman filter, autocalibration, low- and high-pass filtering, and magnetic anomaly detection, all at very low power consumption. For the following tests I am using a board with the BMX055 as the sensor data input to the EM7180. This allows a certain amount of cross comparison, since for both the BNO055 and the EM7180 I am using the same BMX055 motion sensor for input data to the respective sensor fusion engines.

I am just beginning detailed comparative studies of these hardware motion sensor solutions, so I will begin with the simplest of tests. I will start the sensors in hardware fusion mode, each controlled by a Teensy 3.1 microcontroller and mounted on a breadboard, and capture the Yaw (Heading), Roll, and Pitch via Serial output to my laptop. The procedure is to orient the breadboard parallel to the edge of my desk (which should be 45 degrees from true North), then every 120 seconds rotate the breadboard by ninety degrees. The first series of test data is shown below. I'll start with the MAX21100, and since the plot is rather busy, let's take some time to understand what is happening.

[Plot: MAX21100 heading (degrees) versus time (seconds), hardware versus software sensor fusion]

Since the breadboard remains relatively flat on my desk during this experiment, the Roll and Pitch are not very interesting. Besides, an accelerometer/gyro combination is sufficient to get accurate roll and pitch; the hardest part of sensor fusion is getting an accurate Yaw, or heading. Here I am plotting the heading in degrees versus time in seconds. The light blue is the heading derived from the quaternions produced by the MAX21100 motion merging engine (hardware sensor fusion), while the dark blue is the heading derived from the quaternions produced by the open-source Madgwick MARG algorithm (software sensor fusion) using the MAX21100's reported scaled sensor data. Both hardware and software solutions use the same underlying scaled sensor data; they differ only in the particulars of the fusion filter and where the filtering takes place: hardware sensor fusion on the processor inside the MAX21100, software sensor fusion on the Freescale Kinetis ARM Cortex M4 of the Teensy 3.1.
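
For reference, the heading plotted here is obtained by converting the quaternion to Tait-Bryan angles. A minimal sketch of that conversion, using the usual aerospace (z-y-x) convention with q0 as the scalar part, is below; the exact axis signs depend on how the sensor is mounted on the board, so treat this as illustrative rather than as the exact expressions in the test sketches:

```cpp
#include <math.h>

// Convert a unit quaternion q = (q0, q1, q2, q3), q0 scalar, to yaw (heading),
// pitch, and roll in degrees using the aerospace z-y-x (Tait-Bryan) convention.
void quaternionToEuler(const float q[4], float &yaw, float &pitch, float &roll) {
  const float radToDeg = 57.29577951f;
  yaw   = atan2f(2.0f * (q[1]*q[2] + q[0]*q[3]),
                 q[0]*q[0] + q[1]*q[1] - q[2]*q[2] - q[3]*q[3]) * radToDeg;
  pitch = asinf (2.0f * (q[0]*q[2] - q[1]*q[3])) * radToDeg;
  roll  = atan2f(2.0f * (q[0]*q[1] + q[2]*q[3]),
                 q[0]*q[0] - q[1]*q[1] - q[2]*q[2] + q[3]*q[3]) * radToDeg;
}
```

A fixed difference in reference frame between two fusion solutions shifts the reported heading by a constant offset but leaves heading changes untouched, which is why the comparison below is made in terms of ninety degree changes.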

It takes several tens of seconds for the sensors to initialize and for basic bias calibration to complete. Once the heading started streaming to the serial monitor at 1 Hz, I moved the board to the edge of the table. There might be a few seconds of settling while the algorithms are autocalibrating, but I am not sure; there is definitely some initial motion. The heading quickly settles down to ~143 degrees for the hardware solution and -40 degrees for the software solution. There is a definite iterative roll-up in the software solution, which approaches the stable heading over several seconds, whereas the hardware solution responds much more rapidly. Notice that the hardware and software solutions are almost exactly 180 degrees apart; this results from a difference in the orientation reference frame. The Madgwick frame is chosen such that the heading is zero when the x-axis of the MAX21100 accelerometer is aligned with true North. The MAX21100 hardware solution apparently uses a different convention. I have asked, but was told this information could not be provided. It should be straightforward to figure out what the convention is, but I haven't been able to yet! That won't stop us from learning about the quality of the sensor fusion solution, though.

At the 120 second mark, I rotated the breadboard (sensor) by ninety degrees. The software solution takes a few seconds but does indeed settle in at ~51 degrees, while the hardware solution transitions almost immediately to -131 degrees. This is excellent performance: the total change is 91 degrees for the software solution and 88 degrees for the hardware solution. At 240 seconds I made another ninety degree rotation; the Madgwick filter shows a heading of ~139 degrees and the MAX21100 motion merging engine shows -43 degrees. Again, this is an 88 degree change for the software solution versus 88 degrees for the hardware solution. I repeated this pattern until the 800 second mark, where I picked up the breadboard, waved it vigorously in all directions for twenty seconds, then set it back down in the same orientation it started from. We can see that both headings returned to their previous values within the one or two degree margin I am able to maintain by positioning the board manually.
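
Because the reported heading wraps around at +/-180 degrees, the heading changes quoted above are differences folded back into the +/-180 degree range. A minimal helper for that bookkeeping (the function name is mine, not something from the test sketches) might look like this:

```cpp
#include <math.h>

// Smallest signed angle, in degrees, from heading 'from' to heading 'to',
// folded into (-180, 180] so that a ninety degree turn reads as ~90 degrees
// even when the raw heading wraps across the +/-180 degree boundary.
float headingDelta(float from, float to) {
  float d = fmodf(to - from, 360.0f);   // raw difference, may lie outside +/-180
  if (d >  180.0f) d -= 360.0f;
  if (d <= -180.0f) d += 360.0f;
  return d;
}

// Examples with the headings quoted above:
//   headingDelta( -40.0f,  51.0f)  ->  91 degrees (software solution, first turn)
//   headingDelta(-131.0f, -43.0f)  ->  88 degrees (hardware solution, second turn)
```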

Here is a plot of the estimated heading change as a function of time for the same experiment.

[Plot: MAX21100 heading change (degrees) versus time (seconds)]

Both the Madgwick sensor fusion filter and the MAX21100 motion merging engine are doing an excellent job here. The ideal heading change after each of the six ninety degree turns would be 90 degrees, of course! And after the last jostling, we should expect the difference to be zero if the heading returns to where it was before the breadboard was picked up and waved about. You can see clearly how well each of these sensor fusion solutions does. The standard deviation of the difference between averaged headings for each turn (after a few seconds of initial settling) is 2.2 degrees for Madgwick and 4.3 degrees for the MAX21100 motion merging engine. That is, the change in heading when the sensor is rotated by ninety degrees is 90.2 +/- 2.2 degrees for the software fusion filter and 89.0 +/- 4.3 degrees for the hardware fusion filter, both using the same scaled MAX21100+LIS3MDL data source. That the simple Madgwick filter does as well as (or better than) the dedicated and (presumably) optimized MAX21100 sensor fusion solution is somewhat surprising. Can we do better by optimizing the filter parameters, data sample rates, and low-pass filtering? Perhaps. We could almost certainly do better with a test fixture not subject to the inaccuracies of manual placement of the breadboard platform as done here. Given the (somewhat) sloppy experiment, basic bias calibration, and no real attempt to optimize sensor performance, these results are remarkably accurate. Can the other two hardware sensor fusion solutions match this accuracy?
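
For the record, the figures above are just the sample mean and standard deviation over the per-turn heading changes; a small helper along these lines (names are mine, not from the test code) computes them:

```cpp
#include <cmath>
#include <vector>

// Sample mean and standard deviation (n - 1 denominator) of a set of
// per-turn heading changes, the statistics quoted above.
void turnStatistics(const std::vector<float> &turns, float &mean, float &stdev) {
  mean = 0.0f;
  for (float t : turns) mean += t;
  mean /= turns.size();
  float var = 0.0f;
  for (float t : turns) var += (t - mean) * (t - mean);
  var /= (turns.size() - 1);
  stdev = std::sqrt(var);
}
```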

More to come...