Friday, March 12, 2010

Sensors

Ever since I heard that Android devices come with a wide array of sensors, I have been excited about the possibilities. I am a firm believer in the ubiquitous computing vision and all the consequences it brings, including sensors that can be accessed wirelessly. Some parts of the vision (e.g. self-powering microsensors embedded into wall paint) are still futuristic, but mobile phones can easily be equipped with such sensors. Google has a strategy for sensor-equipped mobile devices in which the sensor values are processed by powerful data centers. As I did not have an Android device before, I could not play with those sensors. Not anymore! (Again, many thanks to those gentle souls who managed to get this device to me.)

Sensors are an integral part of the Android user experience. The acceleration sensor makes sure that the screen layout changes to landscape when you turn the device (open an application that supports landscape layout, e.g. the calendar, keep the device in portrait mode, move the device swiftly sideways to the right, and if you do it quickly enough, you can force the display to change to landscape mode). The proximity sensor blanks the screen and switches on the keylock when the user makes a phone call and puts the device to his or her ear, so that the touchscreen is not activated accidentally. In some devices, a temperature sensor monitors the temperature of the battery. The beauty of the Android programming model is that one can use all these sensors in one's own application.

Click here to download the example program.

A very similar application called SensorDump can be found on Android Market. Our example program is inferior to SensorDump in many respects but has a crucial feature: it can log sensor values into a CSV file that can be analysed later on (use the menu to switch the capture feature on and off). Update: from version 0.2.0 of SensorDump, sensor data logging into a CSV file is available there too. This does not matter much with, say, the proximity sensor, which provides binary data, but I don't believe one can understand at a glance what goes on with, for example, the accelerometer during a complex movement just by looking at the constantly changing numbers on the device screen.
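
If you are curious how such a capture can work, here is a minimal sketch of a listener that appends every sensor event to a CSV file. This is an illustration with names of my own choosing, not the example program's actual code, and error handling is reduced to a comment.

import java.io.FileWriter;
import java.io.IOException;

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Illustration only: writes one CSV row per sensor event.
public class CsvCaptureListener implements SensorEventListener {
    private final FileWriter writer;

    public CsvCaptureListener(FileWriter writer) {
        this.writer = writer;
    }

    public void onSensorChanged(SensorEvent event) {
        StringBuilder row = new StringBuilder();
        row.append(event.timestamp);               // event time in nanoseconds
        row.append(',').append(event.sensor.getName());
        for (float value : event.values) {
            row.append(',').append(value);
        }
        row.append('\n');
        try {
            writer.write(row.toString());
        } catch (IOException e) {
            // A real application should report the error to the user.
        }
    }

    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
}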

I can see the following sensors on my Nexus One:

BMA150 - 3-axis accelerometer
AK8973 - 3-axis magnetic field sensor
CM3602 - light sensor

Some sensors are exposed as more than one logical sensor; for example, the AK8973 is also presented as an orientation sensor and the CM3602 as the proximity sensor. This is just software, however: these duplicate logical sensors use the same sensor chip but present the sensor data in different formats.
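
You can list the logical sensors your own device exposes with a few lines of code; a minimal sketch (the activity name is mine):

import java.util.List;

import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;

public class SensorListActivity extends Activity {
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        SensorManager manager =
                (SensorManager) getSystemService(SENSOR_SERVICE);
        // Enumerate every logical sensor the platform exposes.
        for (Sensor sensor : manager.getSensorList(Sensor.TYPE_ALL)) {
            Log.d("Sensors", sensor.getName()
                    + " (type " + sensor.getType()
                    + ", vendor " + sensor.getVendor() + ")");
        }
    }
}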

Let's start with the most popular sensor, the accelerometer. It measures the device's acceleration along three axes. A logical but somewhat unintuitive property of this sensor is that its zero point is free fall; otherwise the Earth's gravitational acceleration is always present. If the device is not subject to any other acceleration (i.e. it is stationary or moves at constant speed), the sensor measures the gravitational acceleration, which points toward the center of the Earth. This is commonly used to measure the roll and pitch of the device; try the excellent Labyrinth Lite game on Android Market if you want a demonstration.
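
For a taste of how roll and pitch can be derived from the gravity vector, here is a minimal sketch. It assumes the device is not subject to any other acceleration; the class name and the sign conventions are mine.

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.util.Log;

public class TiltListener implements SensorEventListener {
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER)
            return;
        float ax = event.values[0];   // along the device's X axis, m/s^2
        float ay = event.values[1];   // along the device's Y axis, m/s^2
        float az = event.values[2];   // along the device's Z axis, m/s^2
        // When the device lies flat, ax = ay = 0 and az is about 9.81;
        // tilting moves gravity into the X or Y component.
        double roll = Math.atan2(ax, az);
        double pitch = Math.atan2(ay, az);
        Log.d("Tilt", "roll=" + roll + " pitch=" + pitch + " (radians)");
    }

    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
}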

The graph below shows sensor data in two scenarios (note that all the data series can be found in the download bundle under the /measurements directory). The red dots show the values of the accelerometer when the device was turned from a horizontal position onto its side, right edge pointing toward the Earth. The green dots show the sensor values when the device was tilted toward its front edge so that at the end the upper edge pointed toward the Earth.



This is all beautiful, but don't forget that the acceleration sensor ultimately measures acceleration. If the device is subject to any acceleration other than gravity (remember the experiment with the portrait-landscape mode at the beginning of the post), that acceleration is added to the gravitational acceleration and distorts the sensor's data (provided that what you want to measure is the roll and pitch of the device). The following graph shows the accelerometer values when the device was lying on the table and I flicked it. The device accelerated on the surface of the table, and the smaller blue dot shows the value the accelerometer measured when this happened: as if the device had been tilted to the right.
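
One common way to reduce this distortion, if what you are after is tilt, is to low-pass filter the accelerometer stream so that the slowly changing gravity component is separated from short accelerations such as the flick. A sketch follows; the example program does not do this, and the ALPHA constant is a value I picked for illustration.

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

public class GravityFilterListener implements SensorEventListener {
    private static final float ALPHA = 0.8f; // filter weight, 0..1
    private final float[] gravity = new float[3];
    private final float[] linear = new float[3];

    public void onSensorChanged(SensorEvent event) {
        for (int i = 0; i < 3; i++) {
            // The gravity estimate follows only slow changes.
            gravity[i] = ALPHA * gravity[i]
                    + (1 - ALPHA) * event.values[i];
            // The remainder approximates the acceleration caused by
            // the user's movement (e.g. the flick on the table).
            linear[i] = event.values[i] - gravity[i];
        }
    }

    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
}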


The second sensor is the magnetic field sensor, the compass. As far as I know, this sensor is not used for anything by the base Android applications, but it is all the more popular with all sorts of compass applications. The magnetic sensor measures the Earth's magnetic field vector, represented in the device's coordinate system. In 3D, this vector points toward the magnetic north pole and down into the crust of the Earth. The following graph shows the scenario when the device was lying on the table and was rotated in a full circle on the table's surface.



Even though the magnetic sensor is not subject to unwanted accelerations the way the accelerometer is, it is subject to the influence of metal objects. The following graph shows the values of the magnetic sensor when the device was lying on the table and, after a while, I put a small pair of scissors on top of it. You can see that there are two clusters of sensor values: one with the scissors, one without.
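
If you want to turn the raw magnetic vector into a compass heading, as those compass applications do, the SensorManager class can combine it with the accelerometer's gravity vector. A sketch; the two float[3] parameters are assumed to hold the latest event.values of the respective sensors.

import android.hardware.SensorManager;
import android.util.Log;

public class HeadingCalculator {
    public static void logHeading(float[] accelValues,
                                  float[] magneticValues) {
        float[] rotation = new float[9];
        float[] orientation = new float[3];
        if (SensorManager.getRotationMatrix(rotation, null,
                accelValues, magneticValues)) {
            SensorManager.getOrientation(rotation, orientation);
            // orientation[0] is the azimuth in radians,
            // 0 meaning magnetic north.
            Log.d("Heading", "azimuth: "
                    + Math.toDegrees(orientation[0]) + " degrees");
        }
    }
}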


The third sensor is the light sensor, which doubles as a proximity detector. The light sensor is self-explanatory, but the proximity detector deserves some explanation. The proximity detector is really a light sensor with binary output: if blocked, it emits 0.0, otherwise it emits 1.0. The photo below demonstrates the location of the sensor and how to block it.
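
Listening to the proximity detector is no different from the other sensors. A minimal sketch; the activity name is mine, and a real application should also unregister the listener in onPause().

import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;

public class ProximityDemoActivity extends Activity {
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        SensorManager manager =
                (SensorManager) getSystemService(SENSOR_SERVICE);
        Sensor proximity =
                manager.getDefaultSensor(Sensor.TYPE_PROXIMITY);
        manager.registerListener(new SensorEventListener() {
            public void onSensorChanged(SensorEvent event) {
                // On this device: 0.0 when blocked, 1.0 otherwise.
                boolean blocked = event.values[0] == 0.0f;
                Log.d("Proximity", blocked ? "blocked" : "free");
            }

            public void onAccuracyChanged(Sensor sensor, int accuracy) {
            }
        }, proximity, SensorManager.SENSOR_DELAY_NORMAL);
    }
}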

44 comments:

Ed Burnette said...

Nice article! More like this, please.

Kevin McDonagh said...

Ah, you listen for the light sensor being blocked. That's how to detect proximity. Obviously, thanks!

Gabor Paller said...

Kevin, the proximity detector is provided by the CM3602 chip, which is essentially a light detector.

Aaron said...

Thanks for the example code. I have a small question maybe somebody can help me with...

When I try to build the example code against Android 1.5 (so I can try it on my Samsung Galaxy), I get an error "R cannot be resolved" and I notice that the file R.java no longer gets built in the /gen directory. Do you have any idea what is causing this problem? Is there some part of the example I need to comment out so it will build correctly?

Thanks!

Aaron Lieber

Gabor Paller said...

Aaron, the project was created for Android 2.0.1. Why? Because I was plain lazy.

I propose that you backport the project to Android 1.5. This should not be hard; I am not aware that the code uses any facility that is not available in 1.5. I do propose, however, that you don't fiddle with the existing build scripts but create a new project for Android 1.5, copy the Java source files and the resource files, and update the AndroidManifest.xml.

Zephyris said...

Nice stuff! This is (almost) exactly what I was looking for... Is there any chance you could make a little update? It would be really handy to be able to record data from the different sensors simultaneously, particularly the accelerometer and the compass...

vinay said...

Hi all,

At present I am working on the HAL part of sensors in the Android SDK. We are using the 3-axis BMA150 accelerometer sensor to get acceleration values with respect to the X, Y, Z axes. I want to know whether this sensor gives its output directly in SI units by using some calibration technique or not, and I noticed that in the sensor.c file they mention

720.0 LSG = 1G (9.8 m/s2). What is the relation between LSG and acceleration due to gravity? What is meant by LSG?

Why are they multiplying the accelerometer X, Y, Z output values by 9.8/720.0f? Please help on this part.

Thanks Vinay

Gabor Paller said...

Vinay, here is the datasheet.
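
For illustration, the scaling you quote simply converts raw counts to SI units: if 720 counts correspond to 1 g (LSG presumably means the number of least significant bits per g), then multiplying by 9.8/720.0f yields m/s2, which is what the Android sensor API reports. A sketch, names are mine:

public class Bma150Scale {
    private static final float COUNTS_PER_G = 720.0f; // the "LSG" value
    private static final float GRAVITY_SI = 9.8f;     // m/s^2

    // Convert a raw accelerometer reading to m/s^2.
    public static float countsToSi(int rawCounts) {
        return rawCounts * (GRAVITY_SI / COUNTS_PER_G);
    }
}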

nature lover said...

Are the CM3602 and the other sensor drivers standard Linux drivers, without any changes specific to Android? If not, is there a document that describes the changes needed to the drivers and the hooks needed to interface them to the Android sensor interface? Thanks for your help.

Ugo said...

The magnetic sensors might be used for something else: http://www.youtube.com/watch?v=HrYgZIul4O0

Moustafa said...

Very nice article,

Thanks

Boris said...

Thanks for this nice article!

Do you know the maximum sampling rate that we can get from the accelerometer?
According to the Android 2.2 compatibility document, devices should deliver accelerometer events at 50 Hz or faster, but I could only get around 5 Hz.

Thanks

Gabor Paller said...

Boris, there is an improved version of the Sensors program that actually measures sampling frequency.

I never got past 30 Hz on my Nexus One.

One could go much higher with these sensors; there seems to be some kind of limitation in the software.
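
For reference, this is roughly how such a measurement can be done: register the listener with SensorManager.SENSOR_DELAY_FASTEST and count the events over fixed time windows. A sketch of the idea, not the actual code of the improved program:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.util.Log;

public class RateMeasuringListener implements SensorEventListener {
    private long windowStart = 0;
    private int eventCount = 0;

    public void onSensorChanged(SensorEvent event) {
        long now = System.currentTimeMillis();
        if (windowStart == 0) {
            windowStart = now;
        }
        eventCount++;
        if (now - windowStart >= 1000) {
            // Events per second over the last window.
            float hz = (eventCount * 1000.0f) / (now - windowStart);
            Log.d("Sensors", "sampling rate: " + hz + " Hz");
            windowStart = now;
            eventCount = 0;
        }
    }

    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
}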

Lars Vogel said...

Thanks for the summary on sensors. This was helpful.

Kumar Shwetabh said...

Very nice coverage of sensors, especially for the Nexus One :). Thanks!

Anonymous said...

Can the software report a sensor that is not physically installed on the phone?

Gabor Paller said...

Anonymous, I admit I don't understand your question. What is not installed physically on the phone, the sensor or the software?

Anonymous said...

I mean the sensor.

I have an H3000 (a Chinese Android phone with 2 SIMs) and 9 out of 10 sensor test programs that I installed on the phone show me three sensors: CM3623 (Capella) proximity/light, AMI304 (Aichi Steel) orientation/magnetic and MT6516 (The Android Open Source Project) accelerometer, but only the accelerometer works, shows data and works with other installed software; the others do not.

I wonder if the software can report the sensors even though they are not physically installed on the phone. Is this possible?

Anonymous said...

Some H3000 specifications:

- OS: Android 2.2
- CPU: MediaTek MTK6516 416 MHz
- 256MB ROM + 256MB RAM

Gabor Paller said...

Anonymous, everything is possible. :-) The sensors are exposed by their drivers. If they took the software image from a device that has those sensors but failed to provide those sensors in hardware, then this could produce exactly the symptoms that you experience.

Anonymous said...

What do you mean with "image"?

Jairo Gutierrez (ex-Anonymous) said...

When you say image, do you mean the ROM?

Jairo Gutierrez (ex-Anonymous) said...

Exploring the directories of the phone, in "/system/lib/hw/" I found:

gralloc.default.so
lights.default.so
sensors.mt6516.so

The MT6516 is the only sensor that works...

Gabor Paller said...

Gutierrez, I don't know this particular type of phone but it seems to me that it should have all the usual sensors. My feeling is that your phone is broken.

Mouimoui said...

Great article.
I used some of your ideas to implement an abnormal magnetic field notification in Compastic!, a little compass app I wrote.
The code is actually open source on Google Code at code.google.com/p/compastic and I have a little blog on it, plus on using maps and locations, at compastic.blogger.com

jumpjack said...

I can't understand where the log is stored!

Gabor Paller said...

jumpjack, the log is stored at /sdcard/capture.csv

Anonymous said...

You can read more about sensors here:
http://adaywithandroid.blogspot.com

divdav said...

Hi, thanks for the article and app.

How easy would it be to log sensor data only when the headphone media button is pressed?

Would only logging data when the button is pressed, say for example every minute or so, use a lot less power than logging all the time?

Gabor Paller said...

divdav, look at this post

divdav said...

Basically what I want to do is use my phone attached to my DSLR camera to record compass/pitch/roll/yaw data for every photo I take. Ideally I would be able to go out for a few hours to take photos and not worry about the phone battery life. The power-saving post you linked to is interesting but probably wouldn't work for my purposes, as the camera is likely to be moving around all the time between taking photos.

My plan is to use a timer/relay circuit connected to the camera's hot shoe flash connector to effectively press the media button whenever I take a photo, letting the phone application know to record the data at that moment.

Do you think this is possible? Do you think the delay between activating the media button and logging the compass data would be more than a second?

I'm new to this and trying to get an idea if it's all possible, so your blog and advice are really appreciated!

Gabor Paller said...

Divdav, I have no idea; I have never worked with the camera. All I know is that you don't have to sample the compass: you need just one measurement when the picture is taken.
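
Something like the sketch below could do it: register a listener only when the shot is taken and unregister after the first event. Names are mine, and 'manager' is assumed to be the SensorManager system service.

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class OneShotCompass {
    public static void readOnce(final SensorManager manager) {
        Sensor magnetic =
                manager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
        manager.registerListener(new SensorEventListener() {
            public void onSensorChanged(SensorEvent event) {
                // Store event.values for the photo here, then stop
                // listening so the sensor can power down.
                manager.unregisterListener(this);
            }

            public void onAccuracyChanged(Sensor sensor, int accuracy) {
            }
        }, magnetic, SensorManager.SENSOR_DELAY_UI);
    }
}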

Shailly Panchal said...

Hi, great article. :) I have taken up a very complicated project and I have very little time to complete it. Is it possible to interface Android (Galaxy Y) with an ultrasonic sensor (PING)? Please help me out. I don't know embedded programming, so are there any boards which are compatible with Android? The IOIO Android board is one, but it's not compatible with the Galaxy Y, and for PropBridge I'm not sure where to buy it.

Gabor Paller said...

Shailly, the only "easy" way I can see to communicate with an external sensor is Bluetooth. Does your sensor support Bluetooth?

Alok said...

How can I interface the gravity sensor of Android to an XBee?

blazivic said...

THANK YOU!!

blazivic said...

one question though:

What is the interval between the samples saved in capture.csv?

nicholas said...

What is the nominal range of the proximity sensor? Can it be increased?

-S said...

Sue:

Awesome article!
May I know if there is such a thing as algorithms for sensors?
I want to make use of sensors to detect nearby Wi-Fi.

shashank said...

Good morning Sir,
I need a hint for an Android phone sensing the actions of a human or a car... when the car is going fast, it should be able to note the speed of the car... It should sense everything, e.g. if anything is placed near the phone, it should recognise that... Please help me out.

Marzieh said...

Hello,
Thanks for the information.
I'm looking for an app on Android to log the heading/direction data of the smartphone.
I found two which are GPS-based, but I'm looking for one that works indoors. I appreciate any comments.

Regards
Marzieh

Gabor Paller said...

Marzieh, I didn't develop an app but I did develop a prototype. Check it out here.

Lima said...

Hi guys, I need some help with my project. I want to build an Android app that detects bumps on the road using the smartphone's accelerometer, which measures the intensity of vibration when the vehicle hits a bump; the smartphone's GPS is then used to locate the bump. Well, my problem is: how can we measure the intensity of vibration using the accelerometer?
Thank you for any help you can give me.

Gabor Paller said...

Lima, is this presentation a good starting point?

Sensor fusion between car and smartphone

Start with slide 34.