Tuesday, June 19, 2012

Universal orientation, gravity and motion acceleration sensor

We have seen a number of fusion possibilities for two sensors.


Time has come to put all of these together and to take a step toward the Holy Grail of sensor processing. We want a sensor that measures gravity, motion acceleration and orientation reliably, no matter what the circumstances are. If we achieve this, we have a good basis for the much-researched indoor navigation problem, because the linear acceleration may be used for step counting while the orientation can be used to calculate the vector of movement. Errors will accumulate, but if we have occasional reference points like WiFi beacons, reasonably good results can be achieved.
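To make the dead-reckoning idea concrete, here is a minimal sketch. The class name, the fixed step length and the callback structure are my illustrative assumptions, not part of the example program: each detected step advances the estimated position along the current fused heading, and a reference point resets the accumulated error.

// Minimal dead-reckoning sketch. The step events would come from the
// motion acceleration signal, the heading from the fused orientation.
public class DeadReckoning {
    private double x, y;                            // estimated position, meters
    private static final double STEP_LENGTH = 0.7;  // assumed average step, meters

    // heading: fused orientation in radians, measured clockwise from north
    public void onStep(double heading) {
        x += STEP_LENGTH * Math.sin(heading);
        y += STEP_LENGTH * Math.cos(heading);
    }

    // called when an external reference point (e.g. a WiFi beacon) is available
    public void reset(double refX, double refY) {
        x = refX;
        y = refY;
    }
}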


So what prevents us from realizing this vision? We already obtained a pretty reliable motion acceleration estimation from the accelerometer-gyroscope fusion. The orientation is the thing we don't have. Using the compass-accelerometer fusion we can eliminate the tilt problem. If we throw in the gravity calculation from the accelerometer-gyroscope fusion and use the gravity estimation instead of the raw accelerometer data for the compass tilt compensation, then we have also compensated for the motion acceleration. The only thing we don't handle is the effect of external magnetic fields, which may fool the compass and can be pretty nasty indoors in the presence of household appliances, large transformers or other electric machinery. That we will compensate with a compass-gyroscope fusion.


The idea of compensating for the external magnetic field using the gyroscope is very similar to the compensation of the motion acceleration. If we assume that no external magnetic field is present (beside the Earth's magnetic field), we just use the compass data (and then post-compensate against tilt with the gravity vector, which needs the accelerometer-gyroscope fusion). If we suspect that an external magnetic field may be present, we use the gyroscope rotation data to rotate a previous, trusted compass vector (one that is supposed to measure only the Earth's magnetic field) and use that as the compass estimation, as the sketch below illustrates. The difference between this compass estimation and the measured compass value is the estimation of the external magnetic field.
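Here is a minimal sketch of that propagation step, using Rodrigues' rotation formula; the names and the simple one-step integration are my illustration, not necessarily how the example program does it. Note the minus sign: rotating a device-frame vector into the new device frame means applying the inverse of the device's own rotation.

// Rotate a trusted compass vector v (device frame) by the gyroscope
// reading omega (rad/s, device frame) over a dt-second interval.
public static float[] propagate(float[] v, float[] omega, float dt) {
    float wx = omega[0], wy = omega[1], wz = omega[2];
    float norm = (float) Math.sqrt(wx * wx + wy * wy + wz * wz);
    if (norm < 1e-9f) return v.clone();        // no measurable rotation
    float angle = -norm * dt;                  // inverse of the device rotation
    float kx = wx / norm, ky = wy / norm, kz = wz / norm;  // rotation axis
    float c = (float) Math.cos(angle), s = (float) Math.sin(angle);
    float cx = ky * v[2] - kz * v[1];          // k x v
    float cy = kz * v[0] - kx * v[2];
    float cz = kx * v[1] - ky * v[0];
    float d = kx * v[0] + ky * v[1] + kz * v[2];  // k . v
    return new float[] {                       // Rodrigues' formula
        v[0] * c + cx * s + kx * d * (1 - c),
        v[1] * c + cy * s + ky * d * (1 - c),
        v[2] * c + cz * s + kz * d * (1 - c)
    };
}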


The trick is detecting whether an external magnetic field may be present. In this prototype the same trick was used as with the accelerometer: if the magnitude of the compass vector is too large or too small compared to the reference measurement, then we have an external magnetic field to compensate.
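In code, the test can be as simple as the following sketch; the 20% tolerance is an illustrative value, not the threshold the example program actually uses.

// Flag a suspected external field when the measured compass vector is
// noticeably longer or shorter than the stationary reference measurement.
private static final float TOLERANCE = 0.2f;   // illustrative threshold

public static boolean externalFieldSuspected(float[] mag, float refLength) {
    float len = (float) Math.sqrt(
            mag[0] * mag[0] + mag[1] * mag[1] + mag[2] * mag[2]);
    return Math.abs(len - refLength) > TOLERANCE * refLength;
}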

Here is the summary of the sensor fusions used in this prototype:


  • Accelerometer-gyroscope fusion provides gravity and motion acceleration.
  • Compass-gravity fusion provides compass tilt compensation. The gravity is already a derived measurement coming from the accelerometer-gyroscope fusion.
  • Compass-gyroscope fusion provides orientation and external magnetic field.
Click here to download the example program.

The example program starts with a complex stationary and dynamic calibration process. When stationary, it measures the reference gravity and magnetic field values. In this stage it is very important that the device is not subject to motion acceleration or to an external magnetic field (beside the Earth's magnetic field). In the dynamic calibration phase we calculate the bias of the compass. We do this by instructing the user to rotate the device and measuring compass vectors in a number of different positions. These measurements are supposed to be points on a sphere. Once we have collected the measurement points, we calculate the center of the sphere by solving a set of linear equations, as sketched below. The Jama package was incorporated into the example program for the matrix manipulations.
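For the curious, the sphere fit reduces to ordinary linear least squares. Expanding (x-a)^2 + (y-b)^2 + (z-c)^2 = r^2 gives one linear equation per sample in the unknowns a, b, c and d = r^2 - a^2 - b^2 - c^2. The sketch below shows the idea with Jama; the method and variable names are mine, not the example program's.

import Jama.Matrix;

// Fit the sphere center (compass bias) to the calibration samples.
// Each row of 'samples' is one measured compass vector {x, y, z};
// at least 4 non-coplanar samples are needed.
public static double[] fitSphereCenter(double[][] samples) {
    int n = samples.length;
    double[][] a = new double[n][4];
    double[][] b = new double[n][1];
    for (int i = 0; i < n; i++) {
        double x = samples[i][0], y = samples[i][1], z = samples[i][2];
        a[i][0] = 2 * x;
        a[i][1] = 2 * y;
        a[i][2] = 2 * z;
        a[i][3] = 1;
        b[i][0] = x * x + y * y + z * z;
    }
    // solve() returns the least-squares solution of the overdetermined system
    Matrix sol = new Matrix(a).solve(new Matrix(b));
    return new double[] { sol.get(0, 0), sol.get(1, 0), sol.get(2, 0) };
}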

Once the calibration is finished, the algorithm outlined above is executed. The measurement data is visualized in a 2D coordinate system - the z coordinate is not drawn. This is just my laziness: I did not make the effort to implement a 3D visualization engine. It is important to note, however, that the motion acceleration (yellow) and the external magnetic field (cyan) vectors are really 3D, they are just not drawn as such. The white vector is the final orientation, compensated against tilt, motion acceleration and external magnetic field.




The prototype does have limitations. While it survives short-term disturbances from external magnetic fields pretty nicely (try it with a small magnet), in the longer term (e.g. after 5-10 minutes) it gets properly lost. For example, when riding the Budapest underground, there are strong and varying magnetic fields generated by the trains' traction motors during the entire journey. If the compensation algorithm picks up a wrong reference vector, it may stay lost for the rest of the journey, until it is able to pick up the Earth's undisturbed magnetic field.

19 comments:

Anonymous said...

Hi there!

I went through the whole tutorial series - really great job! Especially for a beginner like me: it really let me understand how sensor fusion can work and how to use it.

Just wanted to let you know that seeing code for this last level of data fusion would be like sensor programming heaven ;)

Thank you anyway!

today said...

Great tutorial series! One of the best out there!

I have one small question. You are combining data from three different sensors. Android does not synchronize the data from them, so they have different timestamps. They also have quite different frequencies (the accelerometer is probably the fastest one). Is this not a problem when fusing sensor data? Isn't there a need for some kind of synchronization (like via interpolation or something)?

I am totally new in the Android "sensors world" so maybe this question is stupid - but I was just wondering how you deal with this asynchronous data...

Gabor Paller said...

today: generally speaking it could be a problem. But look at the realities: we are talking about human movements sampled at about 10-50 Hz (depending on the selected sampling frequency). The gyroscope sampling rate can be as high as 600 Hz in some models. Compared to the timescale of the measured signals, the differences among the sensor data timestamps can be neglected.
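If you did want to align the streams, a simple linear interpolation of the slower sensor's samples to the faster sensor's timestamps would do. A minimal sketch, purely as illustration - the prototype does not do this:

// Linearly interpolate a slower sensor's value to timestamp t, given the
// two samples (t0, v0) and (t1, v1) that bracket it.
public static float interpolate(long t, long t0, float v0, long t1, float v1) {
    if (t1 == t0) return v0;
    float alpha = (float) (t - t0) / (t1 - t0);
    return v0 + alpha * (v1 - v0);
}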

Gabor Paller said...

Anonymous: "Just wanted to let you know that seeing code for this last one level of data fusion would be like sensor programming heaven;)"

What do you mean? The example program is always available for download.

Anonymous said...

What I meant was (I thought the code was still in development) the fact that - at least here - when I download the .zip file, the archive seems to be empty...

Gabor Paller said...

Anonymous, that's a Windows Explorer thing. I just got a report about another ZIP archive I published. I use 7-Zip on Windows, which opens the archive without problems. Also, there's no problem on Linux or Mac.

The archive was created with Eclipse's Export Archive File feature on Linux; I can't imagine what ZIP wizardry could be at play here. Anyway, I just tried it on Windows with 7-Zip and it works.

Anonymous said...

Sorry I bothered you - it did not come to my mind that something like that could be going on. Wizardry indeed! ;)

alexdonnini said...

Hello,
If I am not mistaken, in a previous example application you used wavelet transforms to detect patterns of motion (walk, run, shake) on an Android phone.

It looks like with GCAFusion, you have an application which will detect motion more accurately using input from multiple sensors.

Could you use GCAFusion to detect motion patterns? Would you use the same approach (wavelet transforms) as in the accel application?

I would appreciate it if you could give me some pointers on how to do that to help me get started.

Thanks,

Alex Donnini

alexdonnini said...

Hello,
I think I solved the problem I asked about in the message I posted on January 27.

My next problem is how to make sensor data available to worker threads in my main application. I tried a number of ways. None of them work.

To produce and manage worker threads (I have quite a few), I use the completion service, future, and callable facilities.

When I start my worker thread execution, the sensor sampling service seems to stop running.

Do you have any ideas as to how to resolve my problem?

Thanks,

Alex Donnini

Gabor Paller said...

Alex, could you send me your code? gaborpaller at gmail.com

alexdonnini said...

Hi Gabor,
Thanks for the offer to take a look at my code. I think I resolved the problem by having the sensor sampling service run in its own process. Now I have to make sure the various functions in my application (wifi scan processing, sensor data gathering, and location analysis) don't get in each other's way (using synchronization).

At present, my code is not very readable and is pretty big. I am a little reluctant to send it to you.

At some point, I would love to get your take on what I am doing, especially as it relates to movement detection (especially walking movement).

I'll stay in touch. I hope you won't mind.

Thanks,

Alex

alexdonnini said...

Hello Gabor,

I meant to ask you but forgot until now. Now that Android 4.4 includes a step detector and counter, do you think using it is preferable to using a module like the one you developed? For a number of reasons, I would prefer using the one you developed but...

Thanks,

Alex Donnini

Gabor Paller said...

Alex, if there's access to any context variable that is implemented in an energy-efficient way (i.e. it is not the main processor that calculates it) then it is preferable to use that method and not something that is implemented on the main processor. Those multi-core processors with GHz clock frequency consume a lot of power. Having a dedicated chip (e.g. microcontroller) saves a lot of battery power.

My dream would be a low-power microcontroller that is also programmable e.g. in RenderScript. Then you could offload your custom signal processing into a device that can execute it in energy-efficient way. But that's just a dream. :-)

alexdonnini said...

Hi Gabor,
You may have seen the Google Tango phone announcement. I wonder how much of the sensor data processing in Tango phones is offloaded to dedicated hardware (the approach you suggest).

I understand the benefit of the approach you suggest. However, I would not want to lose control over the processing of sensor data, and the algorithms used to process that data.

At this point, I want to continue to pursue the software based approach you have used in your software.

Two questions I am grappling with are:
1) The reliability of the step tracking function and how to use your GCAFusion module to track walking.

2) The relationship between RSSI variability and the number of steps/distance travelled. There are many (and I mean many) papers written about RSSI signal variability with distance and the inherent unreliability of RSSI readings. However, I think that a) sample size does matter, and b) RSSI is not as variable as one might think when distance changes as it does when a user walks.

Both of these questions are relevant when trying to track user location in real time, both indoors and outdoors, as my software does.

alexdonnini said...

https://dspace.cc.tut.fi/dpub/bitstream/handle/123456789/21071/ganesan.pdf?sequence=3

alexdonnini said...

https://dspace.cc.tut.fi/dpub/bitstream/handle/123456789/21071/ganesan.pdf?sequence=3

alexdonnini said...

Hi Gabor,

You might find this interesting

https://dspace.cc.tut.fi/dpub/bitstream/handle/123456789/21071/ganesan.pdf?sequence=3

alexdonnini said...

sorry about the misfiring on the postings

Gabor Paller said...

Alex, wrt. your comments about the main CPU vs. dedicated hardware: of course, when you experiment with an algorithm, it is advisable to implement it on the most flexible platform, which in the case of Android is the generic Android Java application model. Just don't be surprised when that implementation strategy is hard to turn into a consumer product. It is not by chance that in an average smartphone no aspect of the cellular network communication is implemented by the main processor; the whole functionality is offloaded to a dedicated communication chip. I don't know anything about the Tango, but I expect that most of the signal processing is offloaded to dedicated chips or maybe to the GPU.

If you want to do step counting, I wrote a paper about it. I am happy to help if you have any problem with that prototype (the prototype was tested on just one platform).