Showing posts with label acceleration sensor. Show all posts

Tuesday, June 19, 2012

Universal orientation, gravity and motion acceleration sensor

We have seen a number of fusion possibilities for 2 sensors.


The time has come to put all of these together and take a step toward the Holy Grail of sensor processing. We want a sensor that measures gravity, motion acceleration and orientation reliably, no matter what the circumstances. If we achieve this, we have a good basis for the much-researched indoor navigation problem: the linear acceleration can be used for step counting, while the orientation gives the vector of movement. Errors will accumulate, but with occasional reference points like WiFi beacons, reasonably good results can be achieved.


So what prevents us from realizing this vision? We have already obtained a pretty reliable motion acceleration estimate from the accelerometer-gyroscope fusion. What we don't have is the orientation. Using the compass-accelerometer fusion we can eliminate the tilt problem. If we throw in the gravity calculation from the accelerometer-gyroscope fusion and use the gravity estimate instead of the raw accelerometer data for the compass tilt compensation, then we have also compensated for the motion acceleration. The only thing left is the effect of external magnetic fields, which can fool the compass and can be pretty nasty indoors in the presence of household appliances, large transformers or other electric machinery. That we will compensate with a compass-gyroscope fusion.


The idea of compensating for the external magnetic field using the gyroscope is very similar to the compensation of the motion acceleration. If we assume that no external magnetic field is present (besides the Earth's magnetic field), we just use the compass data (and then post-compensate against tilt with the gravity vector, which requires accelerometer-gyroscope fusion). If we suspect that an external magnetic field may be present, we use the gyroscope rotation data to rotate a previous, trusted compass vector (one that supposedly measured only the Earth's magnetic field) and use that as the compass estimate. The difference between this estimate and the measured compass value is the estimate of the external magnetic field.
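A minimal sketch of this idea in plain Python (the function names are my own, and the sign convention assumes the gyroscope reports body-frame angular rates, so a world-fixed vector expressed in the rotating device frame evolves as dv/dt = -ω × v; one small-angle Euler step per sample):

```python
def cross(a, b):
    # standard 3D cross product
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def rotate_by_gyro(v, omega, dt):
    # A world-fixed vector seen from the rotating device frame changes as
    # dv/dt = -omega x v; this is one small-angle Euler step of that equation.
    c = cross(omega, v)
    return [v[i] - c[i] * dt for i in range(3)]

def external_field(measured_mag, trusted_mag, omega, dt):
    # Rotate the last trusted compass vector by the gyro rotation, then
    # subtract: what remains is the external field estimate.
    predicted = rotate_by_gyro(trusted_mag, omega, dt)
    return [measured_mag[i] - predicted[i] for i in range(3)]
```

In a real implementation the trusted vector has to be re-anchored whenever the compass reading looks undisturbed again, otherwise gyroscope drift accumulates in the prediction.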


The trick is detecting whether an external magnetic field may be present. In this prototype the same trick was used as with the accelerometer: if the length of the compass vector is too long or too short compared to the reference measurement, then there is an external magnetic field to compensate.
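The magnitude check can be sketched as follows (plain Python; the 15% tolerance is my own illustrative value, not the one used in the example program):

```python
import math

def length(v):
    return math.sqrt(sum(c * c for c in v))

def field_disturbed(mag, ref_len, tolerance=0.15):
    # Compare the magnitude of the current compass vector with the
    # reference magnitude captured during stationary calibration; a
    # mismatch beyond the tolerance suggests an external field.
    return abs(length(mag) - ref_len) > tolerance * ref_len
```

Note that a disturbance which rotates the field without changing its magnitude slips through this check, which is one reason the prototype can get lost.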

Here is the summary of the sensor fusions used in this prototype:


  • Accelerometer-gyroscope fusion provides gravity and motion acceleration.
  • Compass-gravity fusion provides compass tilt compensation. The gravity is already a derived measurement coming from the accelerometer-gyroscope fusion.
  • Compass-gyroscope fusion provides orientation and external magnetic field.
Click here to download the example program.

The example program starts with a complex stationary and dynamic calibration process. When stationary, it measures the reference gravity and magnetic field values. In this stage it is very important that the device is not subject to motion acceleration or an external magnetic field (beside the Earth's). In the dynamic calibration phase we calculate the bias of the compass. We do this by instructing the user to rotate the device while we measure compass vectors in many different positions. These are supposed to be points on a sphere. Once we have collected the measurement points, we calculate the center of the sphere by solving a set of linear equations. The Jama package was incorporated into the example program for the matrix manipulations.
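The sphere-fitting step can be sketched without any matrix library: subtracting the equation |p - c|^2 = r^2 written for the first point from the same equation for the others cancels r^2 and |c|^2, leaving a linear system in the center c. A plain-Python illustration (the example program itself does this with Jama):

```python
def sphere_center(points):
    # |p - c|^2 = r^2 for every measured point p; subtracting the first
    # point's equation yields 2 (p - p0) . c = |p|^2 - |p0|^2, i.e. A c = b.
    p0 = points[0]
    rows, rhs = [], []
    for p in points[1:]:
        rows.append([2.0 * (p[j] - p0[j]) for j in range(3)])
        rhs.append(sum(p[j] ** 2 - p0[j] ** 2 for j in range(3)))
    # Least squares via the normal equations: (A^T A) c = A^T b.
    m = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    y = [sum(rows[k][i] * rhs[k] for k in range(len(rows))) for i in range(3)]
    # 3x3 Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        y[col], y[piv] = y[piv], y[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            for c2 in range(col, 3):
                m[r][c2] -= f * m[col][c2]
            y[r] -= f * y[col]
    # back substitution
    c = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        c[i] = (y[i] - sum(m[i][j] * c[j] for j in range(i + 1, 3))) / m[i][i]
    return c
```

The recovered center is the compass bias that has to be subtracted from every subsequent compass measurement.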

Once the calibration is finished, the algorithm outlined above is executed. The measurement data is visualized in a 2D coordinate system - the z coordinate is not drawn. This is just my laziness: I did not take the trouble to implement a 3D visualization engine. It is important to note, however, that the motion acceleration (yellow) and the external magnetic field (cyan) vectors really are 3D, they are just not drawn as such. The white vector is the final orientation, compensated against tilt, motion acceleration and external magnetic field.




The prototype does have limitations. While it survives short-term disturbances from external magnetic fields pretty nicely (try it with a small magnet), in the longer term (e.g. after 5-10 minutes) it gets properly lost. For example, when riding the Budapest underground, strong and varying magnetic fields generated by the trains' traction motors are present during the entire journey. If the compensation algorithm picks up a wrong reference vector, it may stay lost for the rest of the journey, until it is able to pick up the Earth's undisturbed magnetic field again.

Saturday, October 22, 2011

Workaround for minimizing sensor sampling battery cost

In my Droidcon 2011 presentation I tried to highlight the battery cost of the continuous sensor sampling that is necessary for detecting motion patterns. While the general case still requires some sort of improvement over the current Android sensor architecture, e.g. the use of the "wake on motion" feature of the acceleration sensors, it is possible to decrease the battery consumption if the motion to be detected is longer than 5-10 seconds. This is still not suitable for recognising very short events like falls, taps or shakes, but it can be suitable for "waking up" a step counter when the motion starts. Some steps would be missed, but this may be acceptable if the battery life is increased.

Click here to download the example program.

The essence of the workaround is to use the AlarmManager to generate periodic wakeup events. When a wakeup event is received, sensor sampling is started and some samples are collected. Then we have to figure out, using some heuristics, whether there is any movement. This implementation calculates the average of the acceleration sample absolute values and looks for values deviating from that average. If movement is detected, the sampling continues. If not, the program gives up the wake lock and goes back to sleep.
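The heuristic described above amounts to something like this (plain Python sketch; the 0.5 m/s^2 threshold is my own illustrative value, the real one lives in the example program):

```python
def movement_detected(magnitudes, threshold=0.5):
    # magnitudes: absolute values of the acceleration samples collected
    # after the wakeup. Movement is flagged when any sample deviates from
    # the window average by more than the threshold (in m/s^2).
    avg = sum(magnitudes) / len(magnitudes)
    return any(abs(s - avg) > threshold for s in magnitudes)
```

If this returns False, the service releases the wake lock and waits for the next AlarmManager wakeup.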

I have made some measurements using my Nexus One. I switched the phone into airplane mode and ran the program during the night, for 7-8 hours. The battery percentage consumed per hour can be seen below as a function of the wakeup period (check out the WAKEUP_PERIOD constant in AlarmSleep.java).

  • 5 sec - 1.14%/hour
  • 10 sec - 1.10%/hour
  • 20 sec - 0.56%/hour
  • 30 sec - 0.27%/hour

The battery consumption with the last timeout value - 30 sec - is very close to the standby consumption of 0.25%/hour. If you can tolerate that long a "dead period", then you can bring the battery consumption in the "no motion" state very close to the phone's normal standby consumption. For a general "tap" or "shake" detector, however, this is not an adequate solution. I have received encouraging mails suggesting that a proper solution relying on the sensor's low-power mode may be deployed soon.

Monday, October 10, 2011

Battery cost of sensor sampling

While at Droidcon UK 2011, I was asked to elaborate on my claims about the battery cost of sensor sampling in a blog post. The claims can be found in my conference presentation, but we thought it would help to describe them in more detail.

Continuous accelerometer sampling introduces a significant battery cost on Android devices. For example, if you want to write an application that samples the sensor in the background and figures out whether somebody double-tapped the body of the phone (not the active touch screen, but anywhere on the phone's body), then the CPU of the phone can never sleep. You need to grab a partial wake lock to ensure continuous sampling, otherwise the processing of the samples stops when the device goes to sleep - typically some minutes after the keyguard activates. If you hold a partial wake lock, however, you have to reckon with 1.5-4% battery consumption per hour (depending on the sampling speed). That does not look like a lot, but multiply it by 24 hours and you can see that you cannot sample the accelerometer continuously without spoiling the phone's usability.
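To put the quoted 1.5-4% per hour into perspective, the back-of-envelope daily cost:

```python
# Daily battery cost of holding a partial wake lock for sampling,
# using the 1.5-4 %/hour range quoted above.
for rate_per_hour in (1.5, 4.0):
    daily = rate_per_hour * 24
    print(f"{rate_per_hour} %/hour -> {daily:.0f} % of the battery per day")
# prints:
# 1.5 %/hour -> 36 % of the battery per day
# 4.0 %/hour -> 96 % of the battery per day
```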

Microsoft proposes a low-power co-processor for these background processing jobs of low computational complexity (the accelerometer is typically sampled at around 10-30 samples per second - you don't need a supercomputer for that kind of processing). While this approach definitely solves the battery problem, it introduces an additional programming model (those low-power microcontrollers don't have full-blown programming environments) and it is very likely that application programmers would not be able to insert their own pieces of code to run on such a microcontroller.

My proposal is to exploit the low-power features of the accelerometer sensors widely used in Android devices. For example, the very popular Bosch Sensortec BMA150 accelerometer, found in a variety of HTC devices (and probably others), has a wake-up-on-motion mode.

Its data sheet describes this mode as follows.

In general BMA150 is attributed to low power applications and can contribute to the system power management.

  • Current consumption 200μA operational
  • Current consumption 1μA sleep mode
  • Wake-up time 1ms
  • Start-up time 3ms
  • Data ready indicator to reduce unnecessary interface communication
  • Wake-up mode to trigger a system wake-up (interrupt output to master when motion is detected)
  • Low current consumption in wake-up mode


The BMA150 provides the possibility to wake up a system master when specific acceleration values are detected. Therefore the BMA150 stays in an ultra low power mode and periodically evaluates the acceleration data with respect to interrupt criteria defined by the user. An interrupt output can be generated and trigger the system master. The wake-up mode is used for ultra-low power applications where inertial factors can be an indicator to change the activity mode of the system.

This would allow the main CPU to go into sleep mode and be woken up by the sensor only when there is movement. So if the device is lying on the table, there would be basically no power consumption due to sensor sampling. This would enable production-quality implementations of a range of applications, for example the Activity Level Estimator being researched at the University of Geneva.

The attractive property of this approach is that even though implementing it in Android devices and in the framework is not trivial, it is not very complicated either. The hardware is already in the devices; at most the sensor's interrupt pin has to be wired up to the main processor. SensorManager needs to be extended with some functions that allow applications to activate the wake-up-on-motion feature. The application model would remain consistent with the current Android application model, with no need to fiddle with low-level microcontroller code.

Now there just needs to be a device manufacturer that carries this through.

Sunday, May 29, 2011

Workshop paper

Last year I blogged about my experiences with acceleration signal processing (here, here and here), then the topic disappeared from this blog. Disappeared from the blog, but not from my life, because the research continued with some partners from the University of Geneva. The paper (free registration is needed for access) will be presented at the 1st International Workshop on Frontiers in Activity Recognition using Pervasive Sensing, a workshop of Pervasive 2011. The prototype was created on Android; you can access it here. Be warned: this is just a research prototype and is not guaranteed to work on anything other than the Nexus One, and even on that phone it has some issues. If you happen to be at Pervasive 2011, I am happy to explain its operation in person.

Update: the workshop presentation has been given; the slide set is also available at the same link as the paper.

Sunday, June 6, 2010

Shake-walk-run

Well, anybody can draw up frightening equations like I did in the acceleration signal analysis report in my previous post and claim that they work, just because of some diagrams made by a mathematical analysis program. You'd better not believe this sort of bullshit before you experience it yourself. To help you along that path, I implemented a simple application that lets you try out how these methods work in real life.

Click here to download the example program.

If you open this simple application, you can enable the sampling with a checkbox. Once the sampling is switched on, the service behind the app constantly samples the acceleration sensor and tries to detect motion patterns. It is able to distinguish 3 patterns now: shake, walk and run. Not only does it distinguish these patterns, it also tries to extract shake, step and run counts. Unfortunately I tested it only on myself, so the rule base probably needs some fine-tuning, but it works for me. The walk detector uses the w3 wavelet (read the report if you are curious what the w3 wavelet is) and is therefore a bit slow; it takes about 2-3 seconds before it detects walking and starts counting the steps, but that delay is consistent - if you stop walking, it continues counting the steps it has not counted yet.

The moral of this application is that the tough part in detecting motion patterns is the rule base behind the signal processing algorithms. It is fine that wavelets separate frequency bands and that signal power calculation produces "gate" signals which enable/disable counting for certain motion patterns. But where are the signal power limits for e.g. shaking, and how do you synchronize the "gate" signals with the signals to be counted? This example program is a good exercise, as you can observe how it uses delayers and peak counters synchronized with signal power calculators to achieve the desired functionality.
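A toy version of such a power-gated peak counter, to make the idea concrete (plain Python; the window length and both limits are illustrative values, not the ones from the example program):

```python
import math

def signal_power(window):
    # mean of squares over the recent samples
    return sum(x * x for x in window) / len(window)

def gated_peak_count(samples, window_len, power_limit, peak_min):
    # Counts local maxima, but only while the recent signal power is
    # above power_limit - i.e. while the "gate" signal is open.
    count = 0
    for i in range(1, len(samples) - 1):
        start = max(0, i - window_len)
        if signal_power(samples[start:i + 1]) < power_limit:
            continue  # gate closed: ignore any peaks here
        if (samples[i] > peak_min
                and samples[i] > samples[i - 1]
                and samples[i] >= samples[i + 1]):
            count += 1
    return count

# three cycles of a sine followed by silence: three peaks should be counted,
# and nothing in the quiet tail, because the gate closes there
samples = [math.sin(2 * math.pi * i / 20) for i in range(60)] + [0.0] * 20
```

Tuning power_limit, peak_min and the window length per motion pattern is exactly the rule-base work the paragraph above talks about.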

Some notes on the implementation. The machinery behind the background service that sends events to an activity is described in this and this blog post. Also, observe the debug flags in SamplingService: if you set one of these, the service spits out signal data that you can analyse with Sage later. I did not include the keep-awake hack in this implementation because it was not necessary on my Nexus One with the leaked Android 2.2. I put a public domain license on the download page because somebody was interested in the license of the example programs.

I have to think a bit, where I want to go from here. The most attractive target for me is to formalize all these results into a context framework for Android applications but I may fiddle a bit more on signal processing.

Friday, May 14, 2010

Analysing acceleration sensor data with wavelets

Before we get back to Android programming, we need some theoretical background on signal analysis. The document is somewhat heavy on math. To quote Dr. Kaufman's Fortran Coloring Book: if you don't like it, skip it. But if your teacher likes it, you failed.

Click here to read the report.

If you are too impatient to read, here is the essence.

There is no single perfect algorithm for analysing acceleration signals. The analysis framework should provide a toolbox of different algorithms, some working in the time domain, some in the frequency domain. The decision engine that classifies the movements may use a number of algorithms, with a characteristic set for each movement type.

The medical research community has concluded that the wavelet transformation is the best-suited algorithm for frequency-domain analysis of acceleration signals. The report presents concrete examples of how the wavelet transformation can be used to classify three common movements: walking, running and shaking. In addition, the wavelet transformation provides data series that can be used to extract other interesting information, e.g. step count.
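The report works with specific wavelets (the w3, w5 naming used here is internal to it). As a minimal illustration of the underlying idea, one level of the simplest wavelet transform - the Haar transform - splits a signal into a half-rate approximation (the low frequency band) and a detail series (the high frequency band):

```python
def haar_step(signal):
    # One level of the Haar wavelet transform: pairwise averages keep the
    # low-frequency content, pairwise half-differences the high-frequency.
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return approx, detail
```

Applying the step repeatedly to the approximation yields coarser and coarser frequency bands; different movements show up as signal energy concentrated in different bands.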

For those who would like to repeat my experiments, I uploaded the prototype. First you need Sage (I used version 4.3.3). Download and unpack the prototype package and enter the proto directory. Then launch Sage (with the "sage" command) and issue the following commands:

import accel
accel.movements(5)

now you will be able to look at the different waveforms, e.g.

list_plot(accel.shake_w5)

Sage is scriptable in Python. If you know Python, you will find everything familiar; if not - bad luck, you won't get far with Sage.