tag:blogger.com,1999:blog-82144019124805033662024-02-19T12:12:06.074+01:00My life with Android :-)Gabor Pallerhttp://www.blogger.com/profile/14307475522972458932noreply@blogger.comBlogger134125tag:blogger.com,1999:blog-8214401912480503366.post-78921210142468459382017-12-23T19:46:00.001+01:002017-12-23T19:46:55.636+01:00Data from the light bulb<div dir="ltr" style="text-align: left;" trbidi="on">
I once read a blog post by <span class="st">Oona Räisänen that
really inspired me. In her project <a href="http://www.windytan.com/2015/10/pea-whistle-steganography.html">
she encoded a digital signal into a referee's whistle</a>. I
asked myself whether there is any other household object that emits
radiation and could be used as a carrier for a low-speed digital
signal. I looked around and there was the light bulb.<br />
<br />
Using light as a signal transmitter is not at all a novelty. Besides
all the optoelectronics stuff, Apple recently stirred some waves
with their Li-Fi announcements, which promise ultra-fast
communication by means of a light bulb. I had something else in
mind. I did not care about speed; ad-hoc transmission
of small data chunks carrying location IDs or sensor measurements
was much more interesting to me. The use case looks like this: you
walk with your smartphone into a room lit by an
innocent-looking light bulb and presto, the smartphone picks up
info from the light bulb without any special arrangement, e.g. you
don't have to point your device anywhere. Speed and data size are
of secondary concern in this use case, which is quite common
in the world of the Internet of Things (IoT). Also, the light
from the light bulb must look completely ordinary to a human
observer, e.g. no blinking. This post is a tale of that
adventure.<br />
<br />
As I had prior experience with <a href="http://mylifewithandroid.blogspot.com/2016/01/data-transfer-to-android-device-over.html">reading
infrared signals from ordinary remote controls and sending them
to the smartphone</a>, I started with the method IR remote
controls use to communicate with their receivers. Very briefly, in
case you have not met this system before: the emitter sends out
periods of "0" (no light) and "1" (IR light modulated at a certain
frequency, typically 38 kHz). It is the <b>length</b> of these
periods that carries the information. The common approach is that
after a series of sync bursts, a digital 0 is encoded as a no light-light sequence of specified periods. A digital 1 is
encoded similarly, except that the no light-light periods are
different. You can observe this operation if you watch the LED of an ordinary remote control through a smartphone
camera. As the smartphone camera sensor "sees" in infrared (even
though manufacturers try to filter out this behaviour), you
will see flashes of infrared light when you push a button on the
remote.<br />
<br />
Compared to the IR remote, we have an additional requirement: the
human observer is not allowed to notice that the light bulb is
doing something weird. For this purpose, I changed the modulation
scheme. "No light" does not mean that the light bulb is switched
off (that would result in a very annoying blinking sensation that
humans are very sensitive to), only the modulation frequency is
changed. In our system, the "light" periods are modulated with 38
kHz so that the popular IR remote decoder chips can be used, and
the "no light" periods with 48 kHz. For the steep band pass input
filters of those IR receiver chips, 48 kHz is essentially "no
light". Considering the very noisy environment in the visible
light domain, I also changed a popular IR modulation scheme: now
the 0 bit is 564 microseconds of 48 kHz signal followed by 564
microseconds of 38 kHz signal, and the 1 bit is 1692 microseconds
of 48 kHz signal followed by 564 microseconds of 38 kHz signal.
The whole payload is 20 bits long, allowing the transmission of
a 16-bit value and a 4-bit data type selector. The data type
selector lets the emitter send multiple types of data
sequentially. In our demo, these are: station ID (for location),
temperature and humidity (obtained from a DHT-22 sensor on the
emitter board).<br />
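The 20-bit payload layout can be sketched in C-style code: the 4-bit type selector sits above the 16-bit value. The specific type code values below are illustrative, not taken from the actual sketch:

```cpp
#include <cstdint>

// 20-bit payload: 4-bit data type selector in the top bits,
// 16-bit value in the lower bits.
const uint8_t TYPE_STATION_ID  = 0;  // illustrative type codes,
const uint8_t TYPE_TEMPERATURE = 1;  // not taken from the sketch
const uint8_t TYPE_HUMIDITY    = 2;

uint32_t encodePayload(uint8_t type, uint16_t value) {
    return ((uint32_t)(type & 0x0F) << 16) | value;
}

uint8_t payloadType(uint32_t payload)   { return (payload >> 16) & 0x0F; }
uint16_t payloadValue(uint32_t payload) { return payload & 0xFFFF; }
```

The receiving side can split any 20-bit value back into its two fields with the same shifts and masks.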
<br />
The system therefore consists of 3 elements: the emitter circuit
that drives the light source, the adapter that receives these
light signals and adapts them to the smartphone and finally the
smartphone that acts upon those light signals. <br />
<br />
</span>
<br />
<ul>
<li><span class="st">In the prototype I am about to present, the
emitter is based on an Atmel Atmega328P microcontroller, in
the form of an Arduino Pro Mini board. </span></li>
<li><span class="st">The smartphone does not have a light receiver
and the Android application model cannot do real-time
processing anyway, so there is an adapter in between that on
one side receives and decodes the light signals, on the other
side interfaces with the smartphone by means of Bluetooth Low
Energy (BLE). This element is implemented with two
microcontrollers, an Atmega328P that does the real-time light
signal processing and an nRF51822 SoC that deals with the BLE
interface. As the nRF51822 is a quite capable ARM Cortex-M0
microcontroller, it is an interesting question why the light
signal processing had to be offloaded to another
microcontroller. The reason is the <a href="http://mylifewithandroid.blogspot.com/2015/12/improved-hardware-for-infrared-to.html">bad
experiences I had regarding the real-time behaviour of the
nRF51822</a> when its Bluetooth stack is operating.</span></li>
<li> The third element is a smartphone that connects to the
receiver by means of BLE and displays whatever the adapter
receives. This part is implemented in Android.</li>
</ul>
Let's start with the emitter. Below is the schematic (click to enlarge).<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj6XRpDc7MYfzA5BEqPhSPAZ1Ub-atqO01YRHvc5L6biX_8r2RpOitZ19A4Xc4OJ9MIhUDp1219DthDsM2o9BTbUd2EYjdaA0Agzy7YyBBq-dx2894J6Hw0xj69mRQmZdYuSXfOhvESEyKl/s1600/light_sender.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="698" data-original-width="1100" height="406" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj6XRpDc7MYfzA5BEqPhSPAZ1Ub-atqO01YRHvc5L6biX_8r2RpOitZ19A4Xc4OJ9MIhUDp1219DthDsM2o9BTbUd2EYjdaA0Agzy7YyBBq-dx2894J6Hw0xj69mRQmZdYuSXfOhvESEyKl/s640/light_sender.png" width="640" /></a></div>
<br />
<br />
<br />
<br />
And this is how it looks with the lighting LED beside the
microcontroller card.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgQ5_FHjq1cxrJZ1DmssWXDaePmpS0ODoasN8D3yOH3dcTKi8PclHElF1DqM_C2V5e2MhMh2nTONDFSXpstftlA00FOv9ayZHR5FY33xkaNBFiMOVgeW2aS0payBPyPlVp8IDFR-TgeFJgP/s1600/light_sender.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="458" data-original-width="800" height="228" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgQ5_FHjq1cxrJZ1DmssWXDaePmpS0ODoasN8D3yOH3dcTKi8PclHElF1DqM_C2V5e2MhMh2nTONDFSXpstftlA00FOv9ayZHR5FY33xkaNBFiMOVgeW2aS0payBPyPlVp8IDFR-TgeFJgP/s400/light_sender.jpg" width="400" /></a></div>
<br />
<br />
<br />
<br />
<a href="http://pallergabor.uw.hu/androidblog/light/light.zip">The source code can be found in this archive</a>, in the
light_sender/sketch subdirectory. You have to adapt the
ARDUINO_DIR variable and probably the ISP_PROG and the ISP_PORT
variables according to your programming tool. The AC input voltage
may need to be adapted to the lighting LED you choose; an
effective value of 38 VAC worked well for me for a wide range of
lighting LEDs. The input voltage source also powers the
microcontroller and this is a really sensitive area: I burnt a Pro
Mini by screwing up something here. The problem is the large
voltage drop between the LED supply voltage and the 3.3V that
supplies the microcontroller. Before you insert the Pro Mini into
its socket, make sure that VCC is 3.3V by setting TM1. Also, the
Q2 FET has to be chosen carefully; the IRL540N type has a
drain-to-source breakdown voltage of 100V, which is more than
enough for this application.<br />
<br />
<br />
The station ID (used for indoor location application) is
hardcoded in light_sender.ino (STATION_ID). Optimally, every light
bulb should have a different station ID.<br />
<br />
<br />
Once the emitter is powered, it emits a steady 48 kHz signal,
which is "no light" in our encoding scheme. Every second the
emitter sends an encoded 20-bit value, cycling through the
station ID, temperature and humidity; the last two values are
obtained from the DHT-22 sensor (U1). To the human observer, the
light bulb is simply lit.<br />
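The emitter's bit-to-burst translation can be sketched as follows. MSB-first bit order and the omission of the sync preamble are my simplifications; the real emitter code is in light_sender.ino:

```cpp
#include <cstdint>
#include <utility>
#include <vector>

// One (carrier frequency in kHz, duration in microseconds) step.
typedef std::pair<uint16_t, uint16_t> Burst;

// Timings from the post: a 0 bit is 564 us of 48 kHz ("no light")
// followed by 564 us of 38 kHz ("light"); a 1 bit is 1692 us of
// 48 kHz followed by 564 us of 38 kHz.
std::vector<Burst> encodeBits(uint32_t payload, int nbits) {
    std::vector<Burst> out;
    for (int i = nbits - 1; i >= 0; --i) {           // MSB first (assumed)
        bool bit = (payload >> i) & 1;
        out.push_back(Burst(48, bit ? 1692 : 564));  // "no light" period
        out.push_back(Burst(38, 564));               // "light" period
    }
    return out;
}
```

On the real hardware each step simply means driving the LED with the given carrier frequency for the given duration.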
<br />
<br />
Now on to the adapter. Here is the schematic (click to enlarge).<br />
<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgSdzqfZew4GMWigV6Ey1-zaXD5kOp3uVuM-rmIdYXdD0cEEBYrqtkkarL5ibsvSFCbBTFjunpttpD6iUg7UvQoHA7wKw8RTCTRcKEemO79M-2vpA2bUkMfDXWdkbUd7_4V-3ysip7b8HnO/s1600/light_receiver.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="836" data-original-width="1416" height="376" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgSdzqfZew4GMWigV6Ey1-zaXD5kOp3uVuM-rmIdYXdD0cEEBYrqtkkarL5ibsvSFCbBTFjunpttpD6iUg7UvQoHA7wKw8RTCTRcKEemO79M-2vpA2bUkMfDXWdkbUd7_4V-3ysip7b8HnO/s640/light_receiver.png" width="640" /></a></div>
<br />
And here is how it looks.<br />
<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhJTmopzt7MdCJzY7HjfdHRK_RHfyhTNrmbPMK9GxrqjhJIh9XyKnErDAOe48DZniQKdoEUir6sFRTDb1YcYk8YkIWL9JlT6Mnvt2PGKv6_G938Ega4FyjK1Yim8Wtvb0FPegrB4WJEitd3/s1600/light_receiver.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="645" data-original-width="800" height="322" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhJTmopzt7MdCJzY7HjfdHRK_RHfyhTNrmbPMK9GxrqjhJIh9XyKnErDAOe48DZniQKdoEUir6sFRTDb1YcYk8YkIWL9JlT6Mnvt2PGKv6_G938Ega4FyjK1Yim8Wtvb0FPegrB4WJEitd3/s400/light_receiver.jpg" width="400" /></a></div>
<br />
<br />
The light receiver caused the most trouble for me. For starters,
the photodiode required experimentation. I played with 5 different
diodes and eventually settled on the Osram SFH203, which worked well
for me. If you cannot obtain this type, be prepared to
experiment as well. The next source of trouble was the IR
receiver. Most IR receivers are integrated with the IR photodiode
and these devices are made insensitive to visible light.
Eventually I found the VSOP58438, which has 2 problems: first, it is
obsolete and therefore hard to get; second, it is
a 2mmx2mm square. Eventually my colleague helped me out and built
a breakout board, which you can see in the foreground with the Osram photodiode connected to it. <br />
<br />
<br />
The rest is simpler. The Atmega328P (also provided in the form of
an Arduino Pro Mini board) runs a program that was originally
designed as an IR receiver. It can be found <a href="http://pallergabor.uw.hu/androidblog/light/light.zip">in this archive file</a>
(light_receiver/sketch/light_receiver.ino). The receiver library
is Chris Young's IRLib with the timings modified for our
modulation scheme. When the Atmega328P receives a value, it sends
the value out on its serial output, which you can observe on the
SER_LIGHT_CODE pin.<br />
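The decoding side has to classify measured period lengths back into bits. A tolerance-based classifier along these lines is what the modified IRLib timings amount to; the ±25% tolerance here is my assumption, not a value taken from the actual sources:

```cpp
#include <cstdint>

// Classify a measured "no light" (48 kHz) period into a bit,
// with +/-25% tolerance around the nominal 564/1692 us timings.
int classifyPeriod(uint16_t micros) {
    const uint32_t ZERO_US = 564, ONE_US = 1692;
    if (micros > ZERO_US * 3 / 4 && micros < ZERO_US * 5 / 4) return 0;
    if (micros > ONE_US * 3 / 4 && micros < ONE_US * 5 / 4) return 1;
    return -1;  // out of tolerance: reject the frame
}
```

A generous tolerance matters here because the visible-light channel is much noisier than a typical IR remote link.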
<br />
The data goes into the nRF51822 SoC that I
used in the form of a breakout board (<a href="http://mylifewithandroid.blogspot.com/2016/02/thermometer-application-with-nrf51822.html">here
is an earlier post </a>that describes the board and the
development environment). Also, <a href="http://mylifewithandroid.blogspot.com/2016/12/ble-enabled-christmas-light.html">check
out this post</a> for instructions, how to compile and upload
the project to the board. <a href="http://pallergabor.uw.hu/androidblog/light/blelight.zip">Here is the archive that contains the code for the nRF51822 SoC.</a> The application in the nRF51822 SoC
seems long but it is mostly boilerplate code; in reality its
operation is very simple. Whenever it gets a value from the serial
port, it writes that value into a BLE characteristic that has
notification set. This means that whoever is subscribed to that
notification will get the event immediately, without polling the
characteristic.<br />
<br />
<br />
And finally, the reason why this topic is at all on an
Android-themed blog: the detected values are consumed by an
Android application. <a href="http://pallergabor.uw.hu/androidblog/light/lightdetect_ble.zip">Click here to download the source code</a> and
read this blog post on <a href="http://mylifewithandroid.blogspot.com/2014/12/ever-since-i-created-gas-sensor-demo.html">how
to convert the sources into an Android Studio project</a>. The
Android application is very similar to the <a href="http://mylifewithandroid.blogspot.com/2016/12/ble-enabled-christmas-light.html">previous
ones</a>: it scans for BLE endpoints with a unique UUID
(274b15b0-b9cd-4e5e-94c4-1248b42b82f8 in our case) and when it
finds one, connects to the endpoint using connection-oriented BLE.
Then it subscribes to the light data characteristic
(274b0100-b9cd-4e5e-94c4-1248b42b82f8) and when new data comes
from that characteristic, it evaluates the 4-bit data type and
displays the lower 16 bits according to the data type.<br />
<br />
<br />
<br />
<br />
Below is a small video showing how the thing works.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe width="320" height="266" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/AunkBUQRzhg/0.jpg" src="https://www.youtube.com/embed/AunkBUQRzhg?feature=player_embedded" frameborder="0" allowfullscreen></iframe></div>
<br />
<br />
<br />
Observe the
normal light sources and the small spot of the smart light source that
emits the data. The data is picked up from a distance of 3-4 meters.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgZ6V-cXpDOUzCQnAJzQLErIhPT8XULlqZWusCn3DUkAfRCwVcQWaKlFnGmH8cpLIfHwDEtHFmet7LVDa9Bv_eU5Xy2HccBgzbqP_rT8S82RFnGZIN-pN_f0HL6c9LplfefcfDPbh-are0j/s1600/light_source.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="750" data-original-width="557" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgZ6V-cXpDOUzCQnAJzQLErIhPT8XULlqZWusCn3DUkAfRCwVcQWaKlFnGmH8cpLIfHwDEtHFmet7LVDa9Bv_eU5Xy2HccBgzbqP_rT8S82RFnGZIN-pN_f0HL6c9LplfefcfDPbh-are0j/s400/light_source.jpg" width="296" /></a></div>
</div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com0tag:blogger.com,1999:blog-8214401912480503366.post-81740021671657185172017-06-14T01:23:00.000+02:002017-06-14T01:25:00.937+02:00Android weather station with a solar-powered BLE sensor<div dir="ltr" style="text-align: left;" trbidi="on">
The ultimate test of the low energy consumption is a sensor that can
survive on its own, without maintenance. My Android weather station supported by BLE weather sensors has
been functioning for <a href="http://mylifewithandroid.blogspot.com/2016/03/android-phone-as-weather-station.html">more
than a year</a>, but this year has not passed without adventures
on the battery front. First the station was powered by 2 AA NiMh
batteries - that gave 2 weeks of lifetime. Then came the motorcycle battery, which
took much longer to expire, but eventually the battery itself failed. Now the 2
sensors run on a discarded laptop battery which may no longer be able to power
a laptop but nicely powers the two sensors with their combined 5 mA
consumption.<br />
<br />
5 mA, however, is a lot, so when I found this solar-powered lamp at Jysk, I
immediately realized that I had to turn the lamp into a solar-powered
version of this weather sensor. Why another weather sensor? Because I
wanted to concentrate on the solar-powering aspects and wanted to reuse
as much as possible <a href="http://mylifewithandroid.blogspot.com/2016/10/android-phone-as-weather-station-with.html">from the old sensor</a>. This prototype may serve as a template, however, for different kinds of sensors too.<br />
<br />
Let's first look at the solar lamp that I used as a base.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh7wWfvqvsnrrcQ7F7R4BkjRSZzhnE-aB4QLIqpLMPzk5hlYb-y_wGnqtuypiXvH_syptdPXwmskagpxjfDuXrI3irxJ-_hkiGS5nZz2KsXP15pgQrdq5ClWxC2Y3DCNpVOcDGanaP83XKg/s1600/bme280solar_charging.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="649" data-original-width="480" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh7wWfvqvsnrrcQ7F7R4BkjRSZzhnE-aB4QLIqpLMPzk5hlYb-y_wGnqtuypiXvH_syptdPXwmskagpxjfDuXrI3irxJ-_hkiGS5nZz2KsXP15pgQrdq5ClWxC2Y3DCNpVOcDGanaP83XKg/s400/bme280solar_charging.jpg" width="295" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<br />
<i>Solar lamp already containing the weather sensor. The two red LEDs indicate that the solar cell is charging the battery.</i><br />
<br />
This is quite a cheap device with a solar cell on top and a circuit
built around the XD5252F LED driver that takes care of everything, from
charging the small NiMh battery (if there's sunlight) to
switching on the LED (if there's darkness). Unfortunately the circuit is
so specialized for solar LED lamps that I could not reuse much of it
except for the solar panel and the LED itself. The solar panel is not
very high-powered; it is a 2V, 20 mA cell. So it became clear
immediately that the Android client app has to work more (consuming more
energy) to obtain sensor data, while the sensor has to sleep more to
conserve its own battery, which charges only very slowly from the
low-powered solar cell. Also, surviving the night (or longer periods
without sufficiently strong sunlight) requires a quite beefy battery in
the sensor if we want it to transmit BLE messages to the Android
application frequently enough.<br />
<br />
<a href="http://pallergabor.uw.hu/androidblog/blebme280solar/blesensgw_solar.zip">Click here to download the sources of the Android application</a>. Read this post to figure out, <a href="http://mylifewithandroid.blogspot.com/2016/03/android-phone-as-weather-station.html">how to create an Android Studio project from the downloaded sources</a>.<br />
<br />
The previous Android app has therefore been changed so that instead of
15 seconds of scanning, it now scans for 70 seconds. The sensor sleeps for
60 seconds, then transmits the measurements for 5 seconds. This results
in a quite low daily energy consumption of about 4 mAh, which even the low-powered
solar cell can refill if sunny periods occur from time to time. To make sure
that the sensor survives long periods without enough sunlight, a 2700 mAh
Li-Ion battery was installed (of the 14500 type, with the AA form
factor). As in the previous version, the measurement data is transmitted
in the BLE advertisement packets. I wanted to transmit a battery
indicator in this case too, so I dropped one byte from the 8-byte long
station ID (so it is now 7 bytes long) and in that byte the
supply voltage of the microcontroller is now transmitted. It is generally
3.3V; if it drops below that, then the battery is really not charging.
This additional measurement data required that the sensor's UUID be
changed; that's how the Android app recognizes this new parameter and
displays it in a graph.<br />
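As a sanity check of the duty cycle, here is a rough budget calculation. With the roughly 100 microampere inactive-period consumption mentioned at the end of this post and an assumed ~1 mA average current during the 5-second advertising burst (my estimate, not a measured value), the numbers land close to the 4 mAh/day figure:

```cpp
// Average daily consumption in mAh for a sleep/transmit duty cycle.
// The 1 mA advertising current below is an assumed figure, not a
// measurement from the actual sensor.
double dailyConsumptionMah(double sleepMa, double txMa,
                           double sleepSec, double txSec) {
    double avgMa = (sleepMa * sleepSec + txMa * txSec) / (sleepSec + txSec);
    return avgMa * 24.0;  // average mA over 24 hours = mAh per day
}
```

For sleepMa = 0.1, txMa = 1.0, sleepSec = 60 and txSec = 5 this gives about 4.06 mAh per day.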
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjsvugSfFgWKYllH7vyLaZwqm7LAlA3C6VbBveoSfAlBczmIy4tlTLayG9fhyphenhyphenmG0Fw1idT7skXrW_tXGxEUPlKHo3fJg0nMD5QtJQzbGPeM-nDxzKL7Bf-d-JQ-vOneoOSQGhMidyAAVCqL/s1600/bme280solar_batteryscreen.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="853" data-original-width="480" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjsvugSfFgWKYllH7vyLaZwqm7LAlA3C6VbBveoSfAlBczmIy4tlTLayG9fhyphenhyphenmG0Fw1idT7skXrW_tXGxEUPlKHo3fJg0nMD5QtJQzbGPeM-nDxzKL7Bf-d-JQ-vOneoOSQGhMidyAAVCqL/s400/bme280solar_batteryscreen.png" width="225" /></a><br />
<br />
<i>Battery indicator in the measurement screen of the new sensor</i><br />
<br />
The schematics of the sensor can be seen below (click to enlarge).<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi7JGeqX6364ZlXYmXSynzNEpiR2YLaPHRBhb2VGN2DftQ5P6FNuxvKqIpLSMi_nd9DziP4XiXItfegi6969tuFmCbiSdD2A2tzaUCYw87X1VE7bHXxuFiGlhQGS4RJ8YiWwRaZsDs5j7oq/s1600/bmp280_solar.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1110" data-original-width="1001" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi7JGeqX6364ZlXYmXSynzNEpiR2YLaPHRBhb2VGN2DftQ5P6FNuxvKqIpLSMi_nd9DziP4XiXItfegi6969tuFmCbiSdD2A2tzaUCYw87X1VE7bHXxuFiGlhQGS4RJ8YiWwRaZsDs5j7oq/s400/bmp280_solar.png" width="360" /></a></div>
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgajhRzkifyJzi6p1Y8Ohhmmvxh-89jX3d7fFjbRA_yJ_JQtR3TDT0a_oa06E-XfnCRohio8UDVnOD2sCO56SIl6rj4aIB5fI8QamFgdFT3XHLWZGqV_bLBCw-2R7m4Y8_GjtcmoCtLe_w1/s1600/bme280solar_open.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="649" data-original-width="480" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgajhRzkifyJzi6p1Y8Ohhmmvxh-89jX3d7fFjbRA_yJ_JQtR3TDT0a_oa06E-XfnCRohio8UDVnOD2sCO56SIl6rj4aIB5fI8QamFgdFT3XHLWZGqV_bLBCw-2R7m4Y8_GjtcmoCtLe_w1/s320/bme280solar_open.jpg" width="236" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgtf4pv7JGKlVXueC1OJH2ltIXPglPDz01COrjUPTNJjNsHR9yn474T0gg-oawFTogNCljm-nIGJ1C2Hb5_TEKRzwSBRTu05Ir44TzugqnV1BH-UUY1DP-7hb6RF1feohfZopqhOta3XnYa/s1600/bme280solar_upper.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="650" data-original-width="481" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgtf4pv7JGKlVXueC1OJH2ltIXPglPDz01COrjUPTNJjNsHR9yn474T0gg-oawFTogNCljm-nIGJ1C2Hb5_TEKRzwSBRTu05Ir44TzugqnV1BH-UUY1DP-7hb6RF1feohfZopqhOta3XnYa/s320/bme280solar_upper.jpg" width="236" /></a></div>
<br />
<br />
<i>Sensor circuit installed into the solar lamp case </i><br />
<br />
Nothing much changed from the previous version, except for the solar
cell-battery charger power chain. I wanted to save myself the pain of
designing a Li-Ion charger so I used building blocks already available:
<a href="https://www.aliexpress.com/item/Free-Shipping-5PCS-0-9V-5V-to-5V-600MA-USB-Output-charger-step-up-Power-Module/32267524946.html">this DC-DC converter to produce 5V</a> from the solar cell's varying output
voltage and <a href="https://www.aliexpress.com/item/Smart-Electronics-5V-Micro-USB-1A-18650-Lithium-Battery-Charging-Board-With-Protection-Charger-Module-for/32615930404.html">this battery charger circuit</a> to take care of the Li-Ion
battery. The result is less than optimal efficiency (almost 50% of the
solar cell's energy is lost during the different up-down conversions),
but at least it is easy to reproduce. And if you like the sensor, you
can always design a much better charging circuit. :-)<br />
<br />
<a href="http://pallergabor.uw.hu/androidblog/blebme280solar/adv_bme280_solar.zip">Click here to download the nRF51822 sources</a>. Read this blog post <a href="http://mylifewithandroid.blogspot.com/2016/02/thermometer-application-with-nrf51822.html">for compilation instructions</a>.<br />
<br />
The nRF51822 microcontroller application has not changed a lot either.
The most serious modification is the way the delays are implemented: the
sleeping periods between two measurements are now implemented in a very
low-power way, which results in a consumption of about 100 microamperes
in the inactive periods.<br />
<br />
And one thing more! <a href="http://littlechineserobot.blogspot.com/">Check out my low-cost robot project</a>! <br />
</div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com0tag:blogger.com,1999:blog-8214401912480503366.post-50642824586395227182017-01-02T21:55:00.001+01:002017-01-02T21:55:28.718+01:00Adding more power to the BLE-enabled Christmas light<div dir="ltr" style="text-align: left;" trbidi="on">
The truth is that the low-voltage LED strip I used <a href="http://mylifewithandroid.blogspot.com/2016/12/ble-enabled-christmas-light.html">in
the previous post</a> was a backup solution. Originally I bought a
230V-operated Christmas light with two independent LED strips but
adapting that beast to Bluetooth Low Energy turned out to be a bit
more problematic than I expected. I had to learn a bit about power
electronics first. <br />
<br />
The LED light I used as a base in this post is a standard-issue Chinese-made
device. Below you can see how it looks, with its original controller
already stripped of its plastic protective housing.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjPaL6ejbc_3XKoFokUMtTyzHatVbchtQOyMTX24e9gj0r-UKd3abuIcXHch6Sa-6Rsu81tYz__T_So_Zm3brfxW_wvAL6yL-vDWn691ptVZvxnlPZh6sFlA0uR2yJptZfjj3GO3FCUPByZ/s1600/ledctr2_orig.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjPaL6ejbc_3XKoFokUMtTyzHatVbchtQOyMTX24e9gj0r-UKd3abuIcXHch6Sa-6Rsu81tYz__T_So_Zm3brfxW_wvAL6yL-vDWn691ptVZvxnlPZh6sFlA0uR2yJptZfjj3GO3FCUPByZ/s320/ledctr2_orig.jpg" width="236" /></a></div>
<br />
<br />
The circuit is <a href="http://www.seekic.com/circuit_diagram/LED_and_Light_Circuit/Four_way_flashing_light_string_circuit__8_.html">very
similar to this one</a>, except that mine had only two LED strips
instead of 4. In my version the controller chip had the HN-803 marking
and the strip-controlling thyristors are of type PCR 406. The modes
the original controller supported were all zero-crossing ones, so I
retained this operation. <br />
<br />
Very briefly about the zero-crossing vs. phase-angle mode of
controlling <a href="https://en.wikipedia.org/wiki/Thyristor">thyristors
</a>or <a href="https://en.wikipedia.org/wiki/TRIAC">triacs</a>. A
good introduction <a href="http://www.oztekcorp.com/blog/bid/45104/Controlling-Power-with-SCR-Phase-Angle-vs-Zero-Crossing-Mode">can
be found here</a>. The thyristor is fed with a current that has
frequent zero-crossings. This is necessary because once the
thyristor is switched on, the simplest way to turn it off is to
remove the current on the load. That is why <a href="https://en.wikipedia.org/wiki/Diode_bridge">the
Graetz bridge</a> converting the 230V alternating current into direct
current does not have the usual filtering capacitors. This
guarantees that the current feeding the LED strips/thyristors has
zero-crossings at a 100 Hz frequency. After a zero-crossing the
thyristor can be switched on again by a mere mA-range current applied to
its gate electrode. The phase difference between the zero-crossing
and the moment the gate current is applied determines whether we get
dimming or not. The thyristor then remains switched on until the
next zero-crossing. As the frequency of these zero-crossings is just
100 Hz, the pulse-width modulation we used in the previous post for
dimming cannot be applied; the human eye would notice the flickering
at such a low PWM frequency. So the simple circuit I am going to
present here can only be used to flash the LED strips, but not to
dim them. Implementing phase angle-based dimming would not be
too hard with the features of our microcontroller, but I did not want
to get into that in this post.<br />
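A consequence of switching only at zero-crossings is that the LED strips can change state only on a 10 ms grid (50 Hz mains, full-wave rectified to 100 Hz zero-crossings). A small helper illustrating the quantization:

```cpp
// Round a desired on/off period to the nearest multiple of the
// 10 ms half-cycle of 50 Hz mains: the thyristor can only switch
// off at a zero-crossing, so finer timing is not achievable.
int quantizeToHalfCycles(int milliseconds) {
    const int halfCycleMs = 10;  // 1 / (2 * 50 Hz)
    return (milliseconds + halfCycleMs / 2) / halfCycleMs * halfCycleMs;
}
```

This is why flashing patterns slower than 100 Hz work fine, while smooth PWM dimming on this grid would flicker visibly.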
<br />
<b><i>Warning</i>: parts of the circuits described in this post use
high-voltage 230 V current. Do not try to build them if you do not
have experience with high-voltage electronics, because you risk
electrocution!</b><br />
<br />
Our exercise looks very simple. We need to remove the HN-803
controller circuit, replace it with our nRF51822 BLE SoC and use 2
of the output pins of the SoC to turn on the thyristors. Once the
SoC drives an output pin low, the thyristor will switch off at
the next zero-crossing, which allows us to flash the LED strips at
frequencies lower than 100 Hz. Unfortunately nothing is simple if
high-voltage current is involved, because this simple circuit would
connect the ground of the microcontroller board to a wire carrying
high-voltage current (HVGND on the schematic), risking electric shock if someone touches the
microcontroller board or ground-connected metal parts (like
connectors) while the circuit is in operation. So I built an
optocoupler-based isolator, pictured below (click to enlarge).<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgG-B9M4V2v8LCgAtS8wkw6DFV4QtRH_9Tp9IySzn1nW1bT6SnWNY4b1_8LBJneKHzCSvJct2dEXT_GrquSC8rUijGk5Wvnmn7TYWP_4Kz1cIpvNrFAlH5F4E3yZCLmQS4M0UzVTCb4CPzR/s1600/ledctr2_isolator.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="146" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgG-B9M4V2v8LCgAtS8wkw6DFV4QtRH_9Tp9IySzn1nW1bT6SnWNY4b1_8LBJneKHzCSvJct2dEXT_GrquSC8rUijGk5Wvnmn7TYWP_4Kz1cIpvNrFAlH5F4E3yZCLmQS4M0UzVTCb4CPzR/s400/ledctr2_isolator.png" width="400" /></a></div>
<br />
<br />
<br />
The isolator ensures that the microcontroller side has nothing to do
with high-voltage current, so no special precautions need to be taken
when handling the MCU board. The isolator itself, along with the
remaining parts of the original controller circuit (D1-D4, T1/T2 and,
of course, LED1 and LED2 representing the two LED strips),
is placed in a separate enclosure box. R3 dissipates around 1W, so
make sure that the resistor in question can withstand this power; I used
a 2W resistor.<br />
<br />
Driving the low-voltage side of the optocoupler still requires about
5 mA, so I introduced additional FETs on the SoC side to provide this
current. The updated circuit is shown below (click to enlarge). This circuit can
control one low-voltage LED strip (with dimming) and two
high-voltage strips (without dimming) at the same time.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEga7AfosquFrfQV1yvDG_9IEAmcaJMSLef3lQxRmJmzlnP6R3cZcbU1hIF38XsvBXM7ggnZXwG6qNcodboKGrJCH-BZmPEdRJuNyFnmOtNgXVLbRzoDPrAzXn1ENZ5sMLyXOszOUH0NtIJu/s1600/ledctr2.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="322" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEga7AfosquFrfQV1yvDG_9IEAmcaJMSLef3lQxRmJmzlnP6R3cZcbU1hIF38XsvBXM7ggnZXwG6qNcodboKGrJCH-BZmPEdRJuNyFnmOtNgXVLbRzoDPrAzXn1ENZ5sMLyXOszOUH0NtIJu/s400/ledctr2.png" width="400" /></a></div>
<br />
<br />
In my implementation the isolator and the MCU boards are located in
two enclosure boxes, which allows modular deployment - if there are
no high-voltage strips then the isolator box is not needed. The
connections between the MCU and the isolator boxes are 4 mA current
loops, which are quite resistant to noise. So the connecting cable
could be much longer than in the image below.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhBFhcQAqpuCr1252tZb0nd54Hd7aGhIuikd7Wfnh7wQCOMcHYQc1lPWcFM11KCQyrz-wLtnUxs4IQ75mtmUjp9URW16lPfKdNZ2pZJe6wDAFC3Eq9a3sVkMQ7x1R9-LZp9Wd9v0wyNu-k2/s1600/ledctr2_box.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="222" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhBFhcQAqpuCr1252tZb0nd54Hd7aGhIuikd7Wfnh7wQCOMcHYQc1lPWcFM11KCQyrz-wLtnUxs4IQ75mtmUjp9URW16lPfKdNZ2pZJe6wDAFC3Eq9a3sVkMQ7x1R9-LZp9Wd9v0wyNu-k2/s320/ledctr2_box.jpg" width="320" /></a></div>
<br />
<br />
Now on to the software.<br />
<br />
<a href="http://pallergabor.uw.hu/androidblog/ledctr2/ledctr.zip">Click here to download the nRF51822 code</a>.<br />
<br />
<a href="http://pallergabor.uw.hu/androidblog/ledctr2/ledctr_android.zip">Click here to download the Android code</a>.<br />
<br />
Compilation instructions can be found <a href="http://mylifewithandroid.blogspot.com/2016/12/ble-enabled-christmas-light.html">in
the previous post</a>. One warning: if you built the previous
version and uploaded it into the SoC, make sure that you mass-erase the
chip (the mass-erase.sh script provided in the download bundle does
this) and upload everything again (soft device and updated application),
because the BLE service description has changed and the nRF51822 SDK
writes data about the service characteristics into flash.<br />
<br />
As before, I suggest that you test the BLE device with a BLE debug tool <a href="https://play.google.com/store/apps/details?id=no.nordicsemi.android.mcp">like
nRF Connect for Mobile</a> before you start experimenting with the Android
application. You will see that the high-voltage
LED strips are controlled by a new BLE characteristic (the old one
controlling the low-voltage strip is still available, unchanged).<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgquiVP5YaveA-BETKLpWYOMl0_FuM-GyW9_RdLronU-38iIszfGhGu_QxRPrVuwNTwbHRgbbOuvn_r43q55FTVqHfc4FjKGSPqU82hja3is9NZFv4MwlKxPXDxMdERR9raRyA7QDNNqM4f/s1600/ledctr2_sc.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgquiVP5YaveA-BETKLpWYOMl0_FuM-GyW9_RdLronU-38iIszfGhGu_QxRPrVuwNTwbHRgbbOuvn_r43q55FTVqHfc4FjKGSPqU82hja3is9NZFv4MwlKxPXDxMdERR9raRyA7QDNNqM4f/s320/ledctr2_sc.png" width="180" /></a></div>
<br />
<br />
The byte array written into this characteristic is a blob that
describes the light effect. The blob consists of two identically
structured sections, each of them 9 bytes long. A section starts with
1 byte for the repetition counter, followed by four 2-byte
subsections; each subsection contains 1 byte for the time duration
(in 0.1 sec units) and 1 byte for the bit mask of the LED strips
(bit 0 is set if LED strip #1 is to be on, bit 1 is set if LED strip
#2 is to be on).<br />
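As a sketch, one 9-byte section could be assembled like this (a hypothetical Java helper - the class and method names are mine, not taken from the downloadable code; only the field order follows the description above):

```java
public class HvBlobSection {
    /**
     * Builds one 9-byte section: a repetition counter followed by four
     * (duration, bit mask) pairs. Durations are in 0.1 s units; bit 0 of
     * the mask switches LED strip #1, bit 1 switches LED strip #2.
     */
    public static byte[] section(int repeat, int[][] steps) {
        if (steps.length != 4)
            throw new IllegalArgumentException("exactly 4 steps expected");
        byte[] out = new byte[9];
        out[0] = (byte) repeat;
        for (int i = 0; i < 4; i++) {
            out[1 + 2 * i] = (byte) steps[i][0]; // duration in 0.1 s units
            out[2 + 2 * i] = (byte) steps[i][1]; // LED strip bit mask
        }
        return out;
    }
}
```

For example, section(10, new int[][]{{5, 1}, {5, 2}, {5, 1}, {5, 2}}) would alternate the two strips every 0.5 seconds, ten times; two such sections concatenated give the 18-byte value written to the characteristic.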
<br />
Regarding the Android application, there are not too many surprises.
I used the now-deprecated TabActivity because I did not feel like
playing around with fragments for this simple prototype. The screen
has separate tabs for the low- and the high-voltage strips, like
this:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjsnHbHZVRAK8tzLA7RU4gFhBkNSJye5TkxRWnPsPIZLa5NAbirjtStEUXp1WUHQnULMyFuaByS_3ohI6Gde3iWqlnCS_JKZQUw5RgBqZ-BcU7EYZ8Ern3VmERooFSoTLJuqea-t0GxOsYz/s1600/ledctr2_sc2.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjsnHbHZVRAK8tzLA7RU4gFhBkNSJye5TkxRWnPsPIZLa5NAbirjtStEUXp1WUHQnULMyFuaByS_3ohI6Gde3iWqlnCS_JKZQUw5RgBqZ-BcU7EYZ8Ern3VmERooFSoTLJuqea-t0GxOsYz/s320/ledctr2_sc2.jpg" width="180" /></a></div>
<br />
<br />
The disconnection deficiency described in the previous post is still
there. Make sure that you disconnect from the device after each
manipulation (by pressing the Back button): neither the
device nor the Android application implements a disconnection timeout,
so if you stay connected, nobody else will be able to connect to the
LED strip controller. Otherwise, have fun with these BLE-enabled
Christmas lights!</div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com0tag:blogger.com,1999:blog-8214401912480503366.post-52666835954007419802016-12-19T16:07:00.003+01:002016-12-24T17:26:57.039+01:00BLE-enabled Christmas light<div dir="ltr" style="text-align: left;" trbidi="on">
The idea came from the light-themed <a href="https://www.meetup.com/Budapest-Makers-Meetup/events/235900631/?_af=event&_af_eid=235900631&https=on">Budapest
Makers' Meetup</a> and from the cheap Christmas LED strip that my
wife bought for about 1 euro. On top of that, my other hobby project had not
gone as smoothly as expected but had produced connection-oriented
nRF51822 code - and bang, the idea was born: let's couple the LED
strip with the Bluetooth Low Energy System-on-Chip (SoC), add an
Android application and see what comes out of it. This post is
the tale of that adventure.<br />
<br />
First, about the LED strip. This is a battery-operated device with
two states: off or on. It has surprisingly low power consumption
considering its 10 LEDs.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhsdwKyQvsct3gspCnuB3xZA4yy5REpQC0vVJFfgKnqQVXUxxtAvCjZ9vtHImpbBx3O0jepqjztXMOS6BrUhcYIHXR31CHQO02K6FI9Wz3X_nsfGnleBSLfKb4R9fFJ2dPnryDz3ZIWvFd1/s1600/ledstrip.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhsdwKyQvsct3gspCnuB3xZA4yy5REpQC0vVJFfgKnqQVXUxxtAvCjZ9vtHImpbBx3O0jepqjztXMOS6BrUhcYIHXR31CHQO02K6FI9Wz3X_nsfGnleBSLfKb4R9fFJ2dPnryDz3ZIWvFd1/s320/ledstrip.jpg" width="172" /></a><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhsdwKyQvsct3gspCnuB3xZA4yy5REpQC0vVJFfgKnqQVXUxxtAvCjZ9vtHImpbBx3O0jepqjztXMOS6BrUhcYIHXR31CHQO02K6FI9Wz3X_nsfGnleBSLfKb4R9fFJ2dPnryDz3ZIWvFd1/s1600/ledstrip.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"></a></div>
<br />
<br />
I removed the battery case and put it away
- it will serve me well in other projects. Then I hooked up the LED
strip with the nRF51822 as shown in the schematics below (click to enlarge).<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgqbwNqEpTD8pMkrLplpdgnZzl9eSeu8MqYLJT_aQvn-TOu80eWohcHMLp0hyFGrDg5uvy9y1_Ja4IG5eCH2b9NcdoPmzetcv12L6Gw5WMrrLMZ0-7pqycJMp5fetH2pRtOT63Fe-puZ3dp/s1600/ledctr.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgqbwNqEpTD8pMkrLplpdgnZzl9eSeu8MqYLJT_aQvn-TOu80eWohcHMLp0hyFGrDg5uvy9y1_Ja4IG5eCH2b9NcdoPmzetcv12L6Gw5WMrrLMZ0-7pqycJMp5fetH2pRtOT63Fe-puZ3dp/s400/ledctr.png" width="400" /></a></div>
<br />
<br />
<br />
<br />
As described in <a href="http://mylifewithandroid.blogspot.hu/2016/02/thermometer-application-with-nrf51822.html">some
previous blog posts</a>, I use a breakout board containing little
more than a sole nRF51822, and the Bus Pirate programmer to upload
the application into the chip. The components on the breakout board
(quartz, etc.) are not shown in the schematics. If you use the same
type of breakout board that I do, make sure that you connect both
GNDs together, otherwise instability may result.<br />
<br />
Other than that, the circuit is very simple. The LED strip is driven
by P0.22 of the nRF51822. Even though the strip consumes current only
in the mA range, I played it safe and inserted a 2N7000 FET between the MCU and
the LEDs. The 1 Ohm resistor was already part of the circuit in its
original form, so I thought it was a good idea to keep it. One can
also observe the 3.3V regulator circuit that transforms
POWER_INPUT (5V in my case) to the 3.3V consumed by the circuit. Any
regulator circuit will do; I just note that without the capacitor C1
I experienced instability when the Bluetooth radio was
in operation. The whole circuit sits nicely in a plastic electronic
enclosure box.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhtHCAtbzzOmSYhW7sotwnfZYVcCY6dRAS5GEJCGiKic2-AXj9inMaJQ9aEdo_Zx1PWRS6Vj-Zuwx5OuJmJtd46qxVbnnrncoN_JzamkN6sEWkuJPcmcePb9TCEzYeThJENp2yUkSi9ApLV/s1600/ledctr.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="307" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhtHCAtbzzOmSYhW7sotwnfZYVcCY6dRAS5GEJCGiKic2-AXj9inMaJQ9aEdo_Zx1PWRS6Vj-Zuwx5OuJmJtd46qxVbnnrncoN_JzamkN6sEWkuJPcmcePb9TCEzYeThJENp2yUkSi9ApLV/s320/ledctr.jpg" width="320" /></a></div>
<br />
<br />
<br />
Now on to the software. <br />
<br />
<a href="http://pallergabor.uw.hu/androidblog/ledctr/ledctr.zip">Click here to download the nRF51822 sources.</a><br />
<br />
<a href="http://pallergabor.uw.hu/androidblog/ledctr/ledctr_android.zip">Click here to download the Android sources.</a><br />
<br />
The programmer software that drives the Bus Pirate tool is the <a href="https://github.com/floe/programmer">same forked version</a>
that <a href="http://mylifewithandroid.blogspot.hu/2016/10/android-phone-as-weather-station-with.html">we
used before</a>. The project assumes version 12.1.0 of the
<a href="https://developer.nordicsemi.com/"> nRF5 SDK</a>. I suggest
that you stick to this version too, because upgrading the project to
another version may involve a lot of work (as I experienced
previously). Convert the soft device with the convert_s130.sh
script, upload it into the device with upload_softdevice.sh,
compile the code with the "make" command, then upload the application
with upload.sh. Along the way you need to adapt the SDK path
and the device files in the scripts/Makefile to the directory
layout of your system. After a power cycle, you can observe the
device emitting a large number of debug messages on the debug
serial port. Also, LED1 starts flashing, indicating BLE advertising
activity.<br />
<br />
Before installing the Android part, let's check that the device
works correctly. Download <a href="https://play.google.com/store/apps/details?id=no.nordicsemi.android.mcp">nRF
Connect from the Google Play store</a>, start scanning for BLE
devices, look for "ledctr" (the default name given to our device),
connect and open the custom service with the 128-bit UUID of
274b15a4-b9cd-4e5e-94c4-1248b42b82f8 that it advertises. You should
see something like this:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEirWnncamQ-psC4bnLO2H9kt9jg4USyRi6bfOC1owD8F7rYAxO8dBJPwb_lFAowAOF2uEG8FER_kZ5pnGDUvsKzXre-o_hYKKTgheYd0qpslET01U4xvhDkvX7Qwana8bTql4uwiY65EAAu/s1600/lednrfconnect.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEirWnncamQ-psC4bnLO2H9kt9jg4USyRi6bfOC1owD8F7rYAxO8dBJPwb_lFAowAOF2uEG8FER_kZ5pnGDUvsKzXre-o_hYKKTgheYd0qpslET01U4xvhDkvX7Qwana8bTql4uwiY65EAAu/s400/lednrfconnect.jpg" width="225" /></a></div>
<br />
<br />
Write the following byte array into the characteristic with the UUID
of 274b0000-b9cd-4e5e-94c4-1248b42b82f8.<br />
<br />
<span style="font-family: "courier";">036000000000000360</span><br />
<br />
This means 3 seconds of ramp-up to 0x60 intensity (96%), no flashing, then 3
seconds of ramp-down from 0x60 intensity. If you see this light effect,
the device is ready. Don't forget to disconnect: the application can
handle only one active connection and no timeout mechanism is
implemented.<br />
<br />
The BLE device implements the following light effect. First there is
the ramp-up phase, when the light intensity increases from 0 to a
maximum; the ramp-up time and the maximum intensity can be set by
means of BLE. Then comes the flashing phase, when two states with
different intensities alternate; the repetition
counter and the length and intensity of both states can be set.
Finally there is the ramp-down phase, when the intensity goes down from
a maximum to zero; here again the ramp-down time and the initial
intensity can be set. All these phases are optional: if any of the
time values is zero, that phase is skipped, and if the repetition counter
is zero, the flashing phase is skipped entirely. <br />
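To make the phases concrete, here is a hypothetical encoder for the 9-byte effect blob (my own sketch, not code from the download bundle; it assumes the field order ramp-up time/intensity, flashing repetition counter plus the two states' time/intensity, ramp-down time/intensity, which reproduces the example byte array above):

```java
public class LightEffectBlob {
    /** Encodes the effect parameters as a hex string, one byte per field. */
    public static String encode(int rampUpTime, int rampUpIntensity,
                                int repetitions,
                                int state1Time, int state1Intensity,
                                int state2Time, int state2Intensity,
                                int rampDownTime, int rampDownIntensity) {
        int[] fields = { rampUpTime, rampUpIntensity, repetitions,
                         state1Time, state1Intensity,
                         state2Time, state2Intensity,
                         rampDownTime, rampDownIntensity };
        StringBuilder sb = new StringBuilder();
        for (int f : fields)
            sb.append(String.format("%02X", f & 0xFF)); // two hex digits per byte
        return sb.toString();
    }
}
```

With these assumptions, encode(3, 0x60, 0, 0, 0, 0, 0, 3, 0x60) yields "036000000000000360" - the test pattern shown earlier (3 seconds up to 0x60, no flashing, 3 seconds down).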
<br />
The LED strip is capable of only on and off states, hence the
intensity effect is implemented by means of pulse-width modulation
(PWM). The BLE application operates a 100 msec timer that updates
the timeouts, intensity changes and state transitions. Look for the
light_update_timer_handler function in main.c if you want to modify
that functionality.<br />
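The ramp phases then reduce to a linear interpolation that the timer handler can evaluate on every tick. A sketch of that idea (my reconstruction for illustration, not the actual main.c code):

```java
public class RampLogic {
    /**
     * Intensity after 'tick' timer ticks of a ramp-up phase lasting
     * 'rampTicks' ticks and ending at intensity 'max'.
     */
    public static int rampUpIntensity(int tick, int rampTicks, int max) {
        if (rampTicks <= 0) return max;      // zero ramp time: phase skipped
        if (tick >= rampTicks) return max;   // ramp finished
        return max * tick / rampTicks;       // linear interpolation
    }
}
```

The ramp-down phase is the mirror image: the same interpolation run backwards from the initial intensity to zero.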
<br />
In order to create the Android project, <a href="http://mylifewithandroid.blogspot.hu/2014/12/ever-since-i-created-gas-sensor-demo.html">follow
the instructions in this blog post</a>. Essentially you have to
create an empty Android Studio project then replace the source tree
under app/src with the content of the ZIP file that you downloaded
previously.<br />
<br />
The Android application has two major parts. First, it scans for BLE
devices that advertise the unique UUID that I allocated to the
application and lists those devices in a List. If the user taps
one of the devices, the application connects to the LED
controller service with that UUID,
retrieves the current state of the light effect and displays it by
means of some SeekBars. The user can manipulate these parameters
and update them on the device by pushing the "Set values" button;
the device then starts to perform the light effect. The user can
disconnect by pushing the "back" button.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjY3Ctrw8zVDkoRGvQuDYBlMSPW20AITWPPJdAqHQFka5wMHzgOeS6tEvdrC2mfECi39xSlYWyxn7Awy16a8ij269WLska6dVrSZEA5RH4g-FSZw3JbtrtrHhOMizGA-TRtXYrG1k5F1ZKj/s1600/ledctr_android.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjY3Ctrw8zVDkoRGvQuDYBlMSPW20AITWPPJdAqHQFka5wMHzgOeS6tEvdrC2mfECi39xSlYWyxn7Awy16a8ij269WLska6dVrSZEA5RH4g-FSZw3JbtrtrHhOMizGA-TRtXYrG1k5F1ZKj/s400/ledctr_android.jpg" width="225" /></a></div>
<br />
<br />
One major deficiency of the application is that a disconnection
timeout was not implemented, neither in the device nor in the
Android application. This means that the user has to take care to
disconnect from the device after the light effect has been modified.
If that does not happen, the device stays in "connected" state
forever, which means that it will be impossible to
connect to it again without resetting the device. I leave this
exercise to the interested reader. :-)<br />
<br />
The other problematic part is the lack of security. Anyone who knows
how to connect to the device (either by knowing the format of the
light effect blob or simply by having the Android application
installed) can manipulate the light effect. Personally, I don't think
it is a major issue because security can always be added later.
Besides, what could possibly go wrong if a passerby can change
the light effect in my window?<br />
<br /></div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com0tag:blogger.com,1999:blog-8214401912480503366.post-40966636019105410522016-10-18T09:27:00.000+02:002016-10-18T12:49:40.259+02:00Android phone as weather station with improved sensor<div dir="ltr" style="text-align: left;" trbidi="on">
The previous two posts introduced a <a href="http://mylifewithandroid.blogspot.com/2016/02/thermometer-application-with-nrf51822.html">BLE-enabled
weather sensor</a> (temperature and humidity) and the <a href="http://mylifewithandroid.blogspot.com/2016/03/android-phone-as-weather-station.html">Android
application that extracts data from this sensor</a>. I hinted that
I intend to proceed with a more sophisticated sensor, Bosch
Sensortech's BME280 but other projects diverted my attention
(shameless self-promotion: <a href="https://www.researchgate.net/publication/308633125_Power_consumption_considerations_of_an_agricultural_camera_sensor_with_image_processing_capability">read
this paper about microcontrollers, image processing and low-power
wide area networks</a> if you are curious, what kind of projects
took my time). But I never forgot my BME280
temperature/humidity/pressure sensors sitting in my drawer and once
I had a bit of time, I resurrected the project.<br />
<br />
The idea is the same as with the DHT-22. The nRF51822 combined ARM
Cortex-M0 microcontroller/Bluetooth Low Energy radio unit makes
the sensor data available over Bluetooth Low Energy (BLE).
The smartphone reads this data and displays it to the user. Later
(not in this post) I intend to upload the data into some web service
for analysis. We have already done this with the DHT-22, now we
extend the sensor range with the BME280.<br />
<br />
<a href="http://pallergabor.uw.hu/androidblog/blebme280/blesensgw.zip">Click
here to download the Android application</a>. Click here to <a href="http://pallergabor.uw.hu/androidblog/blebme280/adv_bme280.zip">download
the nRF51822 projects</a> that are running on the sensor hardware.<br />
<br />
Let's start with the sensor. Below is the schematic for the
hardware (click to enlarge). <br />
<br />
<br />
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjOltLFkxZ41yK8yStH0avN_tXMCU5bZpUQGpGd-9vDij4nvQW2AJGGvUOU8JWeLQbUpepdUPNqfb95GHYiAVFSWU1djvxZwFYjKGpsqsnu5lAPvj7OKPoiDCvw5TW4tzJMiA-tF_-AeIcF/s1600/blesensgw_bme280.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjOltLFkxZ41yK8yStH0avN_tXMCU5bZpUQGpGd-9vDij4nvQW2AJGGvUOU8JWeLQbUpepdUPNqfb95GHYiAVFSWU1djvxZwFYjKGpsqsnu5lAPvj7OKPoiDCvw5TW4tzJMiA-tF_-AeIcF/s320/blesensgw_bme280.png" width="255" /></a><br />
<br />
<br />
The difference between this circuit and the previous one is that the DHT-22 was
connected to a simple GPIO pin, while the BME280 uses the more
sophisticated I2C bus. As the BME280 is quite miniature, <a href="https://www.aliexpress.com/item/J34-Free-Shipping-Digital-Temperature-Humidity-Barometric-Pressure-Sensor-Module-Breakout-BME280/32499331827.html">I used a breakout board for the sensor too</a>.
LED1 and the serial debug port are optional (but quite useful). Debug
messages are emitted on the serial port; you need a 3.3V-to-RS232
converter on the TxD pin if you want to observe them.<br />
<br />
<a href="http://mylifewithandroid.blogspot.com/2016/02/thermometer-application-with-nrf51822.html">As
described previously,</a> the circuit is realized with a low-cost nRF51822 breakout board and is programmed with the Bus Pirate programmer, <a href="http://floe.butterbrot.org/matrix/hacking/nrf/">adapted to the nRF51822 by Florian Echtler</a>. The only thing I changed in this setup is that I moved the projects to the latest version of the SDK,<a href="http://www.nordicsemi.com/eng/nordic/download_resource/54280/49/75216472"> which is 12.1.0</a>.
Also, the soft device (the program module implementing the BLE stack)
was bumped from S110 to S130. These decisions caused quite a headache
because there are significant differences between the old SDK and
12.1.0. Therefore, in the nRF51822 project file I share
not only the sensor project (called adv_bme280) but also two simpler ones
(blinky_new and blinky_new_s130) as additions to the instructions on
Florian's page. As a recap: the soft device needs to be flashed into the
device before any BLE application, and the starting address of
the BLE application depends on the size of the soft device. This has
changed between S110 and S130, hence the updated projects. In both
blinky_new_s130 and adv_bme280 you will find the convert_s130.sh and
upload_softdevice.sh scripts that convert the S130 soft device that
comes with the Nordic SDK into binary format and flash it.<br />
<br />
Once you have uploaded the S130 soft device, compile the project in
adv_bme280 and upload it into the nRF51822. The sensor node works the
same way as the DHT-22 version. The MCU in the nRF51822 acquires
measurements from the BME280 by means of the I2C bus (called Two-Wire
Interface, TWI, in the nRF51822 documentation) once every second.
This includes temperature, humidity and pressure. The raw measurement
values are then compensated using the read-only calibration data, also
stored in the BME280, that the MCU reads in the initialization phase. The
BME280_compensate_T, BME280_compensate_P and bme280_compensate_H
functions come from the BME280 user's manual. The result is the
compensated temperature, humidity and pressure values that the MCU puts
into the BLE advertisement data. The advertisement data also contains
the nRF51822's unique ID that is used to identify the sensor. The sensor
has no name, as the measurement+ID data is now too long to leave room
for a name; BLE readers now recognize the sensor purely by its unique
UUID.<br />
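For reference, the temperature compensation is pure integer arithmetic. Below is the 32-bit variant transcribed to Java from the BME280_compensate_T reference code in Bosch's documentation; it returns the temperature in units of 0.01 degrees Celsius. The tFine intermediate would feed the pressure and humidity compensation in a full driver.

```java
public class Bme280Compensation {
    /**
     * Integer temperature compensation from the BME280 documentation:
     * takes the raw ADC reading and the dig_T1..dig_T3 calibration words
     * read from the chip, returns temperature in 0.01 degC
     * (2508 means 25.08 degC).
     */
    public static int compensateT(int adcT, int digT1, int digT2, int digT3) {
        int var1 = (((adcT >> 3) - (digT1 << 1)) * digT2) >> 11;
        int var2 = (((((adcT >> 4) - digT1) * ((adcT >> 4) - digT1)) >> 12)
                    * digT3) >> 14;
        int tFine = var1 + var2; // also needed by P and H compensation
        return (tFine * 5 + 128) >> 8;
    }
}
```

With the example calibration words from Bosch's documentation (dig_T1 = 27504, dig_T2 = 26435, dig_T3 = -1000) and a raw reading of 519888, this yields 2508, i.e. 25.08 degrees Celsius.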
<br />
Now on to the Android part. The architecture of the application is pretty much the same as described <a href="http://mylifewithandroid.blogspot.com/2016/03/android-phone-as-weather-station.html">in the previous post.</a>
Creating an Android Studio project works the same way as described
there: create an empty Android Studio project, overwrite the
app/src/main subtree with the content of the archive that you downloaded
from this post and update the app/build.gradle file with the GraphView
dependency. <br />
<br />
The BME280 functionality was inserted in three places. First, BME280 data
has its own data provider (BME280SensorDataProvider.java); this new
provider is able to handle pressure data that DHT-22 measurements don't
have. Second, BLESensorGWService properly recognizes BME280 sensor nodes
beside the DHT-22 sensor nodes (so both are handled), parses BME280
advertisements and puts the measurement data into the BME280 data
provider. Third, MainActivity knows about the BME280 data provider, uses
its data to create the sensor list and invokes
BME280GraphMeasurementActivity if the sensor in question is a BME280
sensor. This new visualization activity has a pressure graph too.<br />
<br />
This is how the sensor list looks with a DHT-22 and a BME280 sensor (the DHT-22 does not provide pressure data, the BME280 does).<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgwMr3uuzb5xk3dHjd8nZCISFVoTIA61tT2oRBR5IbUaigUOMslFNF9Zyo95vZpLxdCepkAEOxqzDzdh3opq4HHjUO1ZER_TT37pc3a_1GNIM1AZzq_MrzdiC7dmueX6aXfPM2GyR1wISZV/s1600/blesensgw1.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="214" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgwMr3uuzb5xk3dHjd8nZCISFVoTIA61tT2oRBR5IbUaigUOMslFNF9Zyo95vZpLxdCepkAEOxqzDzdh3opq4HHjUO1ZER_TT37pc3a_1GNIM1AZzq_MrzdiC7dmueX6aXfPM2GyR1wISZV/s320/blesensgw1.png" width="320" /></a></div>
<br />
<br />
And this is how the pressure graph looks in the BME280 visualization activity.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjABHtB9zrV9e1JqtqstHPJUhgrsHRIMuGyzfTI1ETkmrcR46Fy2TS2ELeGp2SILU1ZFeT-Elnd20soFZRH7fT72FtHdw6GD12r12nXN4AUnHf7ONxF5vXVeJVvC4qypcvk2i3zHWjPXaLz/s1600/blesensgw2.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjABHtB9zrV9e1JqtqstHPJUhgrsHRIMuGyzfTI1ETkmrcR46Fy2TS2ELeGp2SILU1ZFeT-Elnd20soFZRH7fT72FtHdw6GD12r12nXN4AUnHf7ONxF5vXVeJVvC4qypcvk2i3zHWjPXaLz/s320/blesensgw2.png" width="180" /></a></div>
<br />
<br />
And finally, some words about deployment. I got a question about how the
power supply works. After more than half a year of operation, I ended up
with a discarded 12V motorcycle battery as the power source. This battery
used to have 6Ah capacity; now it has about half of that, which is not
enough to start a motorcycle but is quite enough to yield 1-2 mA per
sensor node for a long time. Also, this battery is designed to withstand
quite severe weather conditions. I can only recommend discarded but still
functional motorcycle/car batteries as a power source if the place
available for the sensor permits it.<br />
<br />
Here is how the BME280 sensor looks in its protective plastic box. The small panel in the foreground is a<a href="https://www.aliexpress.com/item/10pcs-New-Mini-Converter-Adjustable-DC-DC-Step-down-Power-Supply-Module-replace-LM2596/32391539009.html"> cheap DC-DC converter</a> (not shown in the schematics) that steps the 12V down to 3.3V.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjEJUfWuVPrXiGTWZKnDpvxPNTn71Ipy-rOaeuzywlnCPdDSXGM5bKJKRk83zfnNtCWzJWRRNvNVHzHfA_dI62nAlWGHktZKy7Qx5EedgBMEjcpOaKEH5qn9LIwmT0pAe9eNxIVlx8RboVI/s1600/bme280_1.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjEJUfWuVPrXiGTWZKnDpvxPNTn71Ipy-rOaeuzywlnCPdDSXGM5bKJKRk83zfnNtCWzJWRRNvNVHzHfA_dI62nAlWGHktZKy7Qx5EedgBMEjcpOaKEH5qn9LIwmT0pAe9eNxIVlx8RboVI/s320/bme280_1.jpg" width="320" /></a></div>
<br />
<br />
And here is how the sensor nodes sit in an outdoor box with the battery.
The lid of the BME280 node is removed for demonstration; the other box contains the DHT-22 sensor.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9_O2FiWs0LsVcEGKPPDPX61zTjw9Q-VLN42i4PN2laALwgil6HlOIabkufDyzL5w2Lc_n5ODaA0H2GWOu0mxWpF8ESxi5c6Hwddfc2xkBHdPfz751IRu_L16r3FD3uhIr2V_qGUevhQ7o/s1600/bme280_2.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><br /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9_O2FiWs0LsVcEGKPPDPX61zTjw9Q-VLN42i4PN2laALwgil6HlOIabkufDyzL5w2Lc_n5ODaA0H2GWOu0mxWpF8ESxi5c6Hwddfc2xkBHdPfz751IRu_L16r3FD3uhIr2V_qGUevhQ7o/s1600/bme280_2.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="178" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9_O2FiWs0LsVcEGKPPDPX61zTjw9Q-VLN42i4PN2laALwgil6HlOIabkufDyzL5w2Lc_n5ODaA0H2GWOu0mxWpF8ESxi5c6Hwddfc2xkBHdPfz751IRu_L16r3FD3uhIr2V_qGUevhQ7o/s320/bme280_2.jpg" width="320" /></a><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEifejCSBiok8vhfqwbwdIldGgSYIOnVgBOBKcyG0MBfYl9quqzAurFtYcWPHLwywWl4KpeCrHJDbKAz1QRJnW2BMYb_PYLj_rjZP_esqXt7JWnPanlzYQrEjZ8f-gGx72ctXQp7eCd99xCK/s1600/bme280_1.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"></a></div>
<br />
<br />
Now the part that is really missing is the data upload/data analysis functionality.
</div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com0tag:blogger.com,1999:blog-8214401912480503366.post-28864636056453232482016-03-02T20:52:00.000+01:002016-03-02T20:57:24.000+01:00Android phone as weather station<div dir="ltr" style="text-align: left;" trbidi="on">
The <a href="http://mylifewithandroid.blogspot.com/2016/02/thermometer-application-with-nrf51822.html">previous
post was about a low-cost Bluetooth Low Energy sensor</a> (really,
one sensor unit that includes the BLE-enabled microcontroller costs
less than 15 USD, and that's just a single prototype; economies of
scale come on top of that) and its accompanying Android app that
allows reading the sensor manually. That's not bad, but reading data
manually is somewhat inconvenient. If you want to know what the
temperature and humidity were at dawn, you have to be awake at that
early hour. Personally, I prefer to sleep then, so I decided to
automate the whole process.<br />
<br />
<a href="http://pallergabor.uw.hu/androidblog/bledht22/blesensgw.zip">Click here to download the sources of the Android application.</a> The
content of the archive is the app/src/main subtree of an Android
Studio project. In addition to extracting the sources into the
app/src/main subtree, update app/build.gradle like this:<br />
<br />
<span style="font-family: "courier new" , "courier" , monospace;">dependencies {<br />
compile fileTree(dir: 'libs', include:
['*.jar'])<br />
testCompile 'junit:junit:4.12'<br />
<b><i>compile 'com.jjoe64:graphview:4.0.1'</i></b><i><br />
</i>}</span><br />
<br />
The project depends on <a href="http://www.android-graphview.org/">Jonas
Gehring's GraphView project</a>, hence this new dependency.<br />
<br />
So what can we expect from this new app? With the app that
came with the sensor <a href="http://mylifewithandroid.blogspot.com/2016/02/thermometer-application-with-nrf51822.html">in
the previous post</a>, you started a manual scan and, if the sensor
was in range, you got the humidity/temperature data. The new app
scans and stores data in the background. Once started, it sets
up a periodic timer (the default period is 1 hour but can be changed in
the settings menu) and when the timer fires, it makes a scan. If it
finds a BLE node whose advertisement fits our criteria (i.e. it
advertises services with the UUID I allocated), it extracts the
measurement data from the advertisement message and stores it in a
database on the device. This variant does not yet upload the data to
a server; that may come later. However, it can visualize the
measurements on simple graphs, hence the dependency on GraphView.
Like this:<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiNCulm82dxHx9oha8WkvOWRpxAbIteFz2r7EFTubIy9PGYv4jrPicsgrAYBKDMO1iBmBoSX6ylwbCsKAqL7HaLMkZ7bSU92GeWgmEE_amHh6JXoMwNsfJae4Q4CVuy3_TvLlADhRJYhyphenhyphenLF/s1600/blesensgw_graph.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiNCulm82dxHx9oha8WkvOWRpxAbIteFz2r7EFTubIy9PGYv4jrPicsgrAYBKDMO1iBmBoSX6ylwbCsKAqL7HaLMkZ7bSU92GeWgmEE_amHh6JXoMwNsfJae4Q4CVuy3_TvLlADhRJYhyphenhyphenLF/s400/blesensgw_graph.png" width="223" /></a></div>
<br />
<br />
<br />
<br />
Let's see the interesting bits of this app.<br />
<br />
First and foremost, an interesting feature of this application is
that the BLE layer is used in such a way that reading the sensor
imposes no extra cost on the sensor. As the measurement data is
embedded into the advertisement packets that the device broadcasts
anyway, it does not matter whether 1 or 1000 phones read and store the data.
So this sort of sensor network can grow into an entire ecosystem -
the more phone users install and use the app, the more precisely the
measured quantity will be available once the phones upload their
catch to the server. <br />
<br />
If you observe how the data is stored
(DHT22SensorDataProvider.java), you can recognize an important
shortcut that I made: the database structure depends on the sensor
being used. This provider relies on the fact that the DHT-22 (the
actual measurement device) delivers temperature and humidity data in
the same reading. A different sensor (like the Bosch BME280 sensors
sitting in my drawer waiting for their turn) will require a new
provider and also a modification of the visualization part. So
there's significant development potential in making the app more
flexible when it comes to adding a new sensor type.<br />
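The shortcut and the more flexible alternative can be contrasted with a small schema sketch. The table and column names here are illustrative, not the actual DHT22SensorDataProvider schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Sensor-specific table, mirroring the shortcut taken for the DHT-22:
# temperature and humidity are hard-wired as columns.
con.execute("CREATE TABLE dht22 (sensor_id TEXT, ts INTEGER, temp REAL, hum REAL)")
# A more flexible alternative: one row per (quantity, value) pair, so a
# new sensor type (a BME280 adding pressure, say) needs no schema change.
con.execute("CREATE TABLE readings (sensor_id TEXT, ts INTEGER, quantity TEXT, value REAL)")
con.executemany("INSERT INTO readings VALUES (?, ?, ?, ?)", [
    ("node-1", 1000, "temperature", 21.5),
    ("node-1", 1000, "humidity", 48.0),
    ("node-2", 1000, "pressure", 1013.2),   # different sensor, same schema
])
rows = con.execute(
    "SELECT quantity, value FROM readings "
    "WHERE sensor_id = 'node-1' ORDER BY quantity").fetchall()
print(rows)  # [('humidity', 48.0), ('temperature', 21.5)]
```

The price of the generic layout is that the visualization code has to discover at runtime which quantities a sensor provides.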
<br />
The actual sampling of the sensor happens in BLESensorGWService,
using the AlarmManager to trigger the scan. Waking the device up
when it was sleeping is not a simple business. Observe in the list
below that even though there's always an hourly reading, there's
significant variation in when the reading happens.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjcom6fauPwGMRBhulgl7ljwHCFAzpYWZTcJiH6FcAHtBf2LO-mvoaJPO69ZdeRqXv9hSLVaCLw1bLUd3mDSnqv8RtUPxsyTrY9o7iBHYuHzLSU6_cCBrheh3DYLODHmCfGA3gq-ixfgNCF/s1600/blesensgw_list.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjcom6fauPwGMRBhulgl7ljwHCFAzpYWZTcJiH6FcAHtBf2LO-mvoaJPO69ZdeRqXv9hSLVaCLw1bLUd3mDSnqv8RtUPxsyTrY9o7iBHYuHzLSU6_cCBrheh3DYLODHmCfGA3gq-ixfgNCF/s400/blesensgw_list.png" width="223" /></a></div>
<br />
<br />
In case of
our weather readings this was not a problem, but some sensors may
produce more quickly varying data. A large number of devices reading
and uploading would smooth out these reading-time variations.<br />
<br />
GraphMeasurementActivity is the activity that depends on Jonas
Gehring's GraphView. The graphs are very simple so if you have
another favourite graph view component, just replace it there.<br />
<br />
So we are at the point where we have added sensors to our Android
device using Bluetooth Low Energy and created an application that
samples them, producing nice weather-related data series. The next
step will be the integration of cloud-based data analysis. I am
still deciding which one to go for.<br />
<br />
And finally, the picture of the sensor, in its "weather-resistant"
box.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiSzshIK8U26V8lYI_h93rDml0he97Wb7ByCDLx__UHzV0TBkfvsGhTeB4DShZ8yNA7CpA75eNEnlSawed9-QUiTkn6WYACxrEYtEcYVSRDlZWAeSKNsuSM4qCgXomqZIxk242pCAdxjjmq/s1600/blesensgw_box.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiSzshIK8U26V8lYI_h93rDml0he97Wb7ByCDLx__UHzV0TBkfvsGhTeB4DShZ8yNA7CpA75eNEnlSawed9-QUiTkn6WYACxrEYtEcYVSRDlZWAeSKNsuSM4qCgXomqZIxk242pCAdxjjmq/s320/blesensgw_box.jpg" width="240" /></a></div>
<br />
<br />
<br /></div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com5tag:blogger.com,1999:blog-8214401912480503366.post-80536854338198021722016-02-02T18:37:00.001+01:002016-02-02T23:02:51.608+01:00Thermometer application with nRF51822 and Android<div dir="ltr" style="text-align: left;" trbidi="on">
I built quite a few prototypes on this blog with <a href="http://mylifewithandroid.blogspot.hu/search/label/RFDuino">RFDuino</a>,
based on Nordic Semiconductor's nRF51822, and I can still recommend
that small development board to people who want to get initiated
quickly and painlessly into the world of Bluetooth Low Energy
programming. The limitations of RFDuino became apparent quite soon,
however, and it was time to dig deeper. On the other hand, I wanted
to stay with the excellent nRF51822, so I looked for a breakout
board - as simple as possible. <br />
<br />
This is how I stumbled into <a href="http://floe.butterbrot.org/matrix/hacking/nrf/">Florian Echtler's
page about low-cost BLE development environment</a> based on the
nRF51822 and the <a href="http://dangerousprototypes.com/docs/Bus_Pirate">Bus Pirate</a>
programming tool. So I quickly ordered a no-name variant of<a href="http://www.waveshare.com/Core51822.htm"> Waveshare's
Core51822</a> breakout board, a Bus Pirate tool and a <a href="http://www.aliexpress.com/item/Free-Shipping-5pcs-lot-DHT22-AM2302-DIP-4-Digital-Temperature-And-Humidity-Sensor-100-New-Original/32246609443.html">bunch of DHT-22 sensors</a>
(because I wanted to measure something in the environment). Note also
that the breakout board has a connector with 2 mm pin spacing, not
the usual 0.1 inch pitch. It helps to have a prototyping board
with both 2 mm and 0.1 inch pitch <a href="http://www.aliexpress.com/item/10-pcs-lot-Prototype-PCB-for-UNO-R3-Shield-Board-DIY-Combo-2mm-2-54mm/32303172177.html">like this one</a>, which cannot be found in every store.<br />
<br />
Generally speaking, following the instructions on Florian's page was
easy enough. I ran into two issues. First, I had no success with the
SWD programming software he refers to, but <a href="https://github.com/floe/programmer">Florian's fork</a>
(which is based on an earlier version of the programming software)
worked well for me. Second, I experienced instability when the two
GND pins of the breakout were not both connected.<br />
<br />
First about the hardware. The schematic below shows only the parts that are connected to the pins of the breakout board; the <a href="http://www.waveshare.com/w/upload/5/57/Core51822-Schematic.pdf">schematic of the breakout board </a>itself is not included.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj29aiqjHLHFcDi1uMYEd6-29n-NXq0JQEvZAO2LjPd4U5rTUydkZkTNbhmbSqeKWi0DNdZMudy0Pj8BfhnJVdQoVh9yoRiKVEzcJTtm-_lgyK8as1dtn7mgx0eCtxxa8yIsrz5Ik6Tw7X1/s1600/nrf51822_dht22.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj29aiqjHLHFcDi1uMYEd6-29n-NXq0JQEvZAO2LjPd4U5rTUydkZkTNbhmbSqeKWi0DNdZMudy0Pj8BfhnJVdQoVh9yoRiKVEzcJTtm-_lgyK8as1dtn7mgx0eCtxxa8yIsrz5Ik6Tw7X1/s320/nrf51822_dht22.png" width="279" /></a></div>
<br />
<br />
Highlights:<br />
<br />
<br />
<ul>
<li>DHT-22 is connected to P0.17 which is both input and output depending on the communication phase.</li>
<li>The LED on P0.21 provides feedback about BLE activity. This
convention comes from the PCA10028 dev board that we told the Nordic
tool chain we have. You can omit this LED if you want to save some
energy.</li>
<li>SV1 header is a TTL serial port where the example program emits
some debug messages. You can omit this header if you are extremely
confident. I use a level converter <a href="http://www.lctech-inc.com/Hardware/Detail.aspx?id=67f4ee1a-7412-4d12-87f6-7d00d52566ca">like this</a> to connect this port to a standard RS232C port. The UART operates on P0.18 (RxD) and P0.20 (TxD).<br />
</li>
<li>SV2 header goes to the Bus Pirate. Check out Florian's document
about the connection. Make sure that this cable is as short as possible.<br />
</li>
</ul>
Here is how the board looks in all its glory, with the Bus Pirate and
the RS232C level converter boards in the background. These are of
course not needed for deployment; the board runs standalone once
testing is successful.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg-4ZmrN-6zrFzTl73HGD9NfBKMRpnRQaXQ059hmKiXxtBVyWJSF06Gy0To1y-h_vIJdHlzYm4xSQojtvEUabMcTGh8IXakRdFP5uwFXCB9pY66rQNCdOXU_hqvRki7JJW_H0pqT-1nqxfs/s1600/bledht22.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg-4ZmrN-6zrFzTl73HGD9NfBKMRpnRQaXQ059hmKiXxtBVyWJSF06Gy0To1y-h_vIJdHlzYm4xSQojtvEUabMcTGh8IXakRdFP5uwFXCB9pY66rQNCdOXU_hqvRki7JJW_H0pqT-1nqxfs/s320/bledht22.jpg" width="307" /></a></div>
<br />
<br />
<br />
<br />
Click here (<a href="http://pallergabor.uw.hu/androidblog/bledht22/adv_dht22.zip">adv_dht22.zip</a> (nRF51822), <a href="http://pallergabor.uw.hu/androidblog/bledht22/bledht22.zip">bledht22.zip</a> (Android)) to download the example programs related to this blog post.<br />
<br />
Let's start with the code that goes into the nRF51822, which can be
found in adv_dht22.zip. The assumption is that you have completed
Florian's instructions, including the upload of the S110 soft device.
Then unzip adv_dht22.zip and make the following modifications:<br />
<br />
<ul>
<li>Edit Makefile and make sure that the RF51_SDK_PATH variable points to the location where you installed the Nordic SDK.</li>
<li>Edit upload.sh and make sure that the paths point to the location
where you installed Florian's version of the SWD programmer. Also, make
sure that the USB device is correct (/dev/ttyUSB0 by default).</li>
</ul>
Now you can say "make" and then "upload.sh". If all goes well, the code
is installed in the nRF51822 and you get debug messages on the serial
port. At this moment, the nRF51822 is already advertising the
measurements it obtained from the DHT-22 sensor. You can check the
content of the advertisements with <a href="http://mylifewithandroid.blogspot.com/2014/12/ever-since-i-created-gas-sensor-demo.html">this test tool</a>.<br />
<br />
The code looks quite frightening compared to the super-simple RFDuino equivalent but most of it is just template code. My highlights:<br />
<ul>
<li>Check out in advertising_init() how the advertisement packet is
set up. We transfer the measurements in a service data GAP field and
I took the liberty of allocating a 16-bit UUID for it (quite far from
the standard service UUIDs).</li>
<li>Check out the timers_init(), timers_start() and
sampler_timer_handler() functions to see how the periodic reading of
the sensor and the update of the advertisement packet are
accomplished.</li>
<li>DHT-22 sensor handling is done in dht22.c. This sensor has a somewhat peculiar 1-wire interface. Read <a href="http://www.micropik.com/PDF/dht11.pdf">this document</a> and the code will be easy to understand.<br />
</li>
</ul>
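The DHT-22 frame handled by dht22.c is easy to decode once the 40 bit values have been captured: 16 bits of humidity and 16 bits of temperature, both scaled by 10, followed by an 8-bit checksum. A sketch of the decoding step (the timing-critical capture itself stays in the firmware):

```python
def decode_dht22(bits):
    """Decode a captured 40-bit DHT-22 frame into (temperature C, humidity %)."""
    assert len(bits) == 40
    b = [int("".join(map(str, bits[i:i + 8])), 2) for i in range(0, 40, 8)]
    # Last byte is the checksum: low 8 bits of the sum of the first four bytes.
    assert (b[0] + b[1] + b[2] + b[3]) & 0xFF == b[4], "checksum mismatch"
    humidity = ((b[0] << 8) | b[1]) / 10.0
    raw_temp = (b[2] << 8) | b[3]
    temperature = (raw_temp & 0x7FFF) / 10.0
    if raw_temp & 0x8000:          # top bit flags a negative temperature
        temperature = -temperature
    return temperature, humidity

# 65.2 % humidity (0x028C) and 21.3 C (0x00D5), checksum 0x63:
frame = [int(x) for byte in (0x02, 0x8C, 0x00, 0xD5, 0x63)
         for x in format(byte, "08b")]
print(decode_dht22(frame))  # (21.3, 65.2)
```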
<br />
Regarding the Android code: this is just the app/src part of the
source tree of an Android Studio project. I adopted this rather
primitive export method because this super-advanced IDE still does
not have the code export option that its obsolete predecessor, the
Eclipse IDE, used to have. Check out the onLeScan method in
MainActivity.java to see how the BLE GAP
parser <a href="http://mylifewithandroid.blogspot.com/2014/12/ever-since-i-created-gas-sensor-demo.html">introduced in this blog post</a> is used to take apart the advertisement message and filter BLE nodes that advertise DHT-22 measurements.<br />
<br />
The outcome looks like this:<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhW37tv9NU6H_01XbZqBhRxTGVzvPz1pW1cSVP-KxTIfepdfzXO0qsK8vP0Gimr4IA8l9BkZ7-qpmA78LZv8_ilPFnPo_SAHyRYOmKu4339Q8on-O1RH_q83t_TQTRg1Xxmc-XTyNCpvfaz/s1600/bledht22_screenshot.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="145" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhW37tv9NU6H_01XbZqBhRxTGVzvPz1pW1cSVP-KxTIfepdfzXO0qsK8vP0Gimr4IA8l9BkZ7-qpmA78LZv8_ilPFnPo_SAHyRYOmKu4339Q8on-O1RH_q83t_TQTRg1Xxmc-XTyNCpvfaz/s320/bledht22_screenshot.png" width="320" /></a></div>
<br />
<br />
Note that each sensor is identified by a 64-bit unique ID (a facility
of the nRF51822). Now this data just needs to be uploaded into some
sort of service and then the big data analysis can start ;-). More
about that later.</div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com11tag:blogger.com,1999:blog-8214401912480503366.post-29880102835174586082016-01-07T00:29:00.000+01:002016-01-07T00:29:36.588+01:00Data transfer to Android device over infrared link<div dir="ltr" style="text-align: left;" trbidi="on">
The three previous parts (<a href="http://mylifewithandroid.blogspot.com/2015/12/controlling-android-device-with.html">here</a>, <a href="http://mylifewithandroid.blogspot.com/2015/12/improved-hardware-for-infrared-to.html">here</a> and <a href="http://mylifewithandroid.blogspot.com/2015/12/improved-hardware-for-infrared-to.html">here</a>)
of this series introduced the infrared-to-Android gateway. Even though
those prototypes captured the signals of an ordinary IR remote, I
already hinted that I was aiming for something more exciting. In this
part, we will replace the IR remote with our own IR transmitter. Once
we have our own IR transmitter, we will be able to transfer our own
data over IR light. This data link is not reliable enough to transfer
large amounts of data, but in a lot of cases that's not required.
E.g. to transfer the data of a temperature/humidity sensor, 32 bits
are more than enough.<br />
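As an illustration of how far 32 bits go, here is one possible packing of a temperature/humidity reading into a single code. The layout (two 16-bit values, each scaled by 10) is a hypothetical example, not a format used by the actual firmware:

```python
def pack_reading(temp_c, hum_pct):
    """Pack temperature and humidity (both scaled by 10) into one 32-bit code."""
    t = int(round(temp_c * 10)) & 0xFFFF   # two's complement keeps negatives
    h = int(round(hum_pct * 10)) & 0xFFFF
    return (t << 16) | h

def unpack_reading(code):
    t = (code >> 16) & 0xFFFF
    if t & 0x8000:                         # restore the sign
        t -= 0x10000
    return t / 10.0, (code & 0xFFFF) / 10.0

print(hex(pack_reading(-5.5, 60.0)))             # 0xffc90258
print(unpack_reading(pack_reading(-5.5, 60.0)))  # (-5.5, 60.0)
```

With 0.1-degree and 0.1-percent resolution, a whole reading fits into one IR code.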
<br />
So what can we expect from IR-based data transfer compared to the
more popular, radio-based transfer? There are advantages and
disadvantages.<br />
<br />
<ul>
<li>An advantage of the IR solution is that it is extremely cheap and also extremely power-efficient. <br />
</li>
<li>An advantage of radio is that the IR solution always requires
line of sight, while radio waves can - to some extent - traverse
walls, etc.</li>
<li>Another advantage of radio is that the IR transmitter has to be
aimed at the receiver; if for some reason the receiver and the
transmitter move with respect to each other, they lose contact very
easily.</li>
</ul>
There's also the question of range. Ordinary TV remotes work up to
3-4 meters of distance, which is nice for an inexpensive consumer
device but is not enough even for indoor sensor network use cases. In
this part I try to figure out what the distance limit may be.<br />
<br />
For starters, this question is not precisely defined. With
sophisticated optics, high-powered transmitters and careful aiming,
IR data transfer can be accomplished over significant distances. But
we said that we are looking for cheap hardware, so we can't rely on
sophisticated devices. We need some sort of optics, but it should be
simple and cheap. That's why I went to a second-hand toy shop and,
bingo, I found the IR transmitter of Thinkway Toys' <a href="https://www.youtube.com/watch?v=DITJQW2nxGc">Lazer Stunt Chaser</a>.
The small toy car has long been lost, but the handgun-like IR
transmitter somehow made its way to Hungary. This cheap plastic toy
is a marvelous device. It promises 12 meters of effective range and
even though the mounting of the IR light source and the plastic
optics are made of cheap materials, it is surprisingly efficient. It
even has a normal (red) LED emitting its light through the
transparent housing of the infrared LED, which produces a visible red
circle about 10 cm in diameter that makes aiming the IR transmitter
much easier.<br />
<br />
So how far can it transmit our IR codes? In order to find out, I took
apart Thinkway's IR transmitter and replaced the circuitry driving
the IR LED with my own. The new emitter circuit is based on an
Arduino Pro Mini 3.3V/8MHz and the schematic looks like this:<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhMR7pJK7NL8280Tmt3ArltowB_drFxGr8xY4YvJeO1lokefOx1NsDxerJFHoxasUxD8i7vR53QnoOQAkqXbxK2A0vWNj2IhNqTUHYXnwKl0ANCajHDB6ISEudgKhcZEJeVmGXUhp8yhKdb/s1600/irout.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhMR7pJK7NL8280Tmt3ArltowB_drFxGr8xY4YvJeO1lokefOx1NsDxerJFHoxasUxD8i7vR53QnoOQAkqXbxK2A0vWNj2IhNqTUHYXnwKl0ANCajHDB6ISEudgKhcZEJeVmGXUhp8yhKdb/s640/irout.png" width="640" /></a></div>
<br />
<br />
<br />
<br />
<br />
The software works on any Atmel ATmega328p-based board, e.g. the
Arduino Uno. If your MCU uses a power source with a higher voltage
than 3.3V, adapt R2 accordingly. E.g. for a 5V Arduino Uno board, you
need about 160 Ohm. <br />
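The R2 value follows from Ohm's law over the LED series resistor: R = (Vcc - Vf) / I. A quick sanity check, assuming an IR LED forward voltage of about 1.4 V and roughly 22 mA of drive current (illustrative figures, not taken from the schematic):

```python
def series_resistor(vcc, v_forward, i_led):
    """LED series resistor: drop the excess voltage at the chosen current."""
    return (vcc - v_forward) / i_led

print(round(series_resistor(5.0, 1.4, 0.022)))  # 164 -- close to the ~160 Ohm quoted
print(round(series_resistor(3.3, 1.4, 0.022)))  # 86 -- the 3.3V equivalent
```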
<br />
So this is how I hacked my circuit into the IR transmitter.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgey9tplJiUI0VHiLhWHQP2etLGwZ5tJ3fvaOuTAhfhcYQIu6x5m5Kv2AFTrXRN9GcDUfSAVm7mifu_qMKtl2BTd91FvLTYOfHs2iMsJqxANo1x0zcfShbKC3mltwct4wFD5JPG8sMvkVzd/s1600/irout1.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="480" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgey9tplJiUI0VHiLhWHQP2etLGwZ5tJ3fvaOuTAhfhcYQIu6x5m5Kv2AFTrXRN9GcDUfSAVm7mifu_qMKtl2BTd91FvLTYOfHs2iMsJqxANo1x0zcfShbKC3mltwct4wFD5JPG8sMvkVzd/s640/irout1.jpg" width="640" /></a></div>
<br />
<br />
<br />
<br />
Note the two LEDs: the IR LED in the tube-like mounting and the
ordinary red LED behind it that is used for aiming. Also note how the
IR LED is connected to the ATmega328p's PD3 pin - IRLib, the software
used to construct the transmitter, selects the OC2B PWM output, so
the primary Timer2 PWM pin (PB3) would not work.<br />
<br />
<a href="http://pallergabor.uw.hu/androidblog/irblegw/irout.zip">Click here to download the source code of the IR transmitter.</a><br /><br />
Open sketch/Makefile and update ARDUINO_DIR according to your
installation. Also, update ISP_PROG according to the programming tool
you use to deploy the code. I used <a href="https://learn.adafruit.com/usbtinyisp">USBtinyISP</a>,
and the Makefile is set accordingly. If your board has a USB
programming port (like the Arduino Uno has), then setting ISP_PROG is
unnecessary.<br />
<br />
In order to deploy the code into the ATmega328p, say:<br />
<br />
make ispload<br />
<br />
if you use a programming tool or simply<br />
<br />
make upload<br />
<br />
if you don't need a tool.<br />
<br />
About the code: the application emits a 32-bit NEC IR code every 4
seconds, with an increasing code value. The application is based on
the same <a href="http://tech.cyborg5.com/irlib/">IRLib</a> library
as the receiver was; this time the only modification necessary was
disabling all the receiver routines, as this device does not receive
any data. And that's my response to all the people worrying about the
security of the Internet of Things: if the Thing is physically unable
to receive any data, then it cannot be hacked, period.<br />
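For reference, a 32-bit NEC code like the ones IRLib emits boils down to a fixed sequence of mark/space durations on the IR carrier. The sketch below generates them using nominal NEC timings (a simplified model of what the library does, not its actual implementation):

```python
def nec_frame(code):
    """Mark/space durations in microseconds for one 32-bit NEC frame."""
    times = [9000, 4500]              # leader mark + leader space
    for i in range(32):               # NEC transmits the LSB first
        times.append(560)             # constant bit mark
        times.append(1690 if (code >> i) & 1 else 560)  # space encodes the bit
    times.append(560)                 # trailing stop mark
    return times

frame = nec_frame(0x00FF00FF)
print(len(frame))  # 67 entries: leader pair, 32 mark/space pairs, stop mark
```

A "1" bit takes about 2.25 ms and a "0" about 1.12 ms, which is why the 500 µs - 2 ms measuring range keeps coming up in these posts.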
<br />
Even though this application is not particularly energy-efficient,
care was taken to implement the sleeping part with the lowest
possible energy consumption. Hence the delay logic does not use the
Arduino delay() function but puts the MCU into deep sleep and uses
the watchdog timer to launch a new iteration. Read <a href="http://www.gammon.com.au/power">Nick Gammon's excellent analysis</a> of ATmega328p power saving techniques for further explanation.<br />
<br />
So what are my experiences? Using the infrared-to-Android gateway I
presented previously, which is based on a basic IR receiver circuit
(TSOP1738) without any optics, and the IR transmitter hacked into the
Thinkway toy, which uses cheap plastic optics, I was able to transmit
codes over a distance of about 10 meters (and then receive the codes
on the Android device over BLE). Of course, not every transmission
was successful. My experience is that a code needs to be repeated
about 5 times if you want to be nearly 100% sure that it arrives
(100% can never be achieved). One important take-away is that the
targeting LED in the Thinkway toy was not there by chance: in order
to aim at the IR receiver, you really need the aid of that visible
red circle. But that's about it: 2 LEDs, a cheap MCU and some optics,
and you can connect sensors from an entire room to one relatively
expensive BLE unit. Also, these IR-connected sensors need very little
energy. So it is worth weighing the advantages and disadvantages.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiBWuMCdIDbcpisW8pOzZmLoaptmrube_yKqe44cnt6V1unwRBGDhhEm27AT7haYGlGxL2inhxCxApv8hBfe5P_b6S4uLxqwEXojrARL_0446PleFE9tFU888FRlvCGKghnXABnASX8f7i0/s1600/irout2.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="416" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiBWuMCdIDbcpisW8pOzZmLoaptmrube_yKqe44cnt6V1unwRBGDhhEm27AT7haYGlGxL2inhxCxApv8hBfe5P_b6S4uLxqwEXojrARL_0446PleFE9tFU888FRlvCGKghnXABnASX8f7i0/s640/irout2.jpg" width="640" /></a></div>
</div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com2tag:blogger.com,1999:blog-8214401912480503366.post-14801017151824743852015-12-30T02:06:00.000+01:002015-12-30T02:06:55.365+01:00Infrared-to-Android gateway implementation with interrupts<div dir="ltr" style="text-align: left;" trbidi="on">
In the previous parts of this series the <a href="http://mylifewithandroid.blogspot.com/2015/12/controlling-android-device-with.html">infrared-to-Android gateway</a> based on the RFDuino hardware and an <a href="http://mylifewithandroid.blogspot.com/2015/12/improved-hardware-for-infrared-to.html">improved version of the hardware</a>
were presented. The improved hardware offered quite reliable IR code
recognition even when the BLE connection was in operation. The
trouble with that implementation was the polling nature of the code:
even though the IR reader support hardware is capable of raising an
interrupt when a new time measurement is available, the code did not
handle that interrupt; instead, it polled the interrupt signal.<br />
<br />
<a href="http://pallergabor.uw.hu/androidblog/irblegw/rfduino_irblegw3.zip">Click here for the updated gateway code</a>. <a href="http://pallergabor.uw.hu/androidblog/irblegw/irblegw.zip">Click here for the (unchanged) Android application</a> serving the gateway.<br />
<br />
The reason I did not implement proper interrupt handling was the I2C
library (called "Wire") coming with the RFDuino. Even though the
nRF51822 (on which the RFDuino is based) is able to handle its I2C
interface (called TWI, two-wire interface) by means of interrupts,
this was not implemented in the "Wire" library. When the MCP23008
GPIO port extender raised an interrupt, the MCU was expected to read
the MCP23008's capture register by means of an I2C transaction. As
the "Wire" library was polling-based, this transaction held back the
GPIO-handling interrupt for too long, freezing the system.<br />
<br />
The solution was transforming the "Wire" library into an
interrupt-based implementation. As my goal was limited functionality
(reading one register of an I2C peripheral), I did not do it
properly. I moved the entire "Wire" library into the application
project (see it under the "lib" directory), renamed it "Wire2" and
introduced a couple of new methods, most importantly sendReceiveInt
(in lib/Wire2.cpp). This method initiates the write transaction of a
data array followed by the read transaction of another data array
over I2C, all handled by TWI interrupts. This means that
sendReceiveInt returns immediately and the actual data transfer
happens in the background. This new method is invoked in the GPIO
interrupt handler (GPIOTE_Interrupt in sketch/irblegw3.ino), but this
time the GPIO interrupt handler completes very quickly as its only
job is to initiate the TWI transaction. When the TWI transaction is
finished, the TWI interrupt handler invokes the onReceive callback
that ends in the application code (twi_complete in
sketch/irblegw3.ino).<br />
<br />
The most important outcome of this - quite significant - change is
that the MCU does not spend its time spinning in the GPIO port
reading loop. Instead, it waits for interrupts in ultra-low-power
mode (sketch/irblegw3.ino, IRrecvRFDuinoPCI::GetResults method,
RFduino_ULPDelay invocation), which is important if the
infrared-to-Android gateway is powered by a battery. As you may have
guessed, my goal is not to fiddle with IR remote controls: I intend
to build a short-range network comprising infrared, BLE and cellular
links, and the infrared-to-BLE gateway was just one step. </div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com2tag:blogger.com,1999:blog-8214401912480503366.post-51198731093646184612015-12-16T18:08:00.001+01:002015-12-16T18:27:09.402+01:00Improved hardware for the infrared-to-Android gateway<div dir="ltr" style="text-align: left;" trbidi="on">
<a href="http://mylifewithandroid.blogspot.com/2015/12/controlling-android-device-with.html">In the previous post I presented the results of my experiments</a> with the
nRF51822-based RFDuino as an infrared-to-Bluetooth Low Energy
gateway, accompanied by an Android client app. The outcome of that
experiment was that the nRF51822 BLE soft stack and a purely
software-based IR receiver are not a good match: the BLE soft stack
"steals" enough cycles from the Cortex-M0 CPU that IR reading becomes
very unreliable no matter which implementation alternative we go for
(3 different alternatives were attempted). I promised improved
hardware that overcomes this limitation, and this post is about that
improved hardware.<br />
<br />
The essence of our problems with the BLE soft stack was that with the
very tight timings the IR receiver requires, the main CPU is no
longer suitable for measuring time periods. Typical IR timing is in
the 500 microsec - 2 millisec range; this is the time period we
should be able to measure reliably. With the BLE soft stack in
operation, delays are introduced into the time measurement code by
the background interrupt routine serving the soft stack, and time
measurements of this precision will be wildly off. I considered two
options.<br />
<br />
<ul>
<li>The most obvious option is to drop the integrated MCU-BLE radio
combo that the nRF51822 is and go for a separate MCU plus BLE modem.
For example, an Arduino Pro Mini with a BLE121LR modem would have
been a perfect fit, as both of these modules are in my drawer. While
this hardware would definitely have been more hassle-free than the
nRF51822, setting it up would have required two different programming
tools (I have both, but that's not necessarily true for the general
blog reader out there) and I am still uneasy about soldering the
BLE121LR - those pads are smaller than my capabilities.</li>
<li>Extending the RFDuino with dedicated hardware for time period
measurement sounded more attractive to me, as it was less obvious.
The functionality we expect is that the MCU does no time measurement
at all. The external hardware must be capable of measuring the time
periods between the edges of the TSOP1738 output signal and
delivering these measurements to the MCU. The measuring range is
about 500 microsec - 2 msec. Larger time periods are still measured
by the MCU, but in that case the disturbances caused by the soft
stack are not that relevant.<br />
</li>
</ul>
A further complication is that while the nRF51822 has 31
general-purpose pins, the RFDuino makes only 7 of them available, and
2 of those are reserved for USB communication. This requires that the
circuit be interfaced with the RFDuino using the lowest possible
number of wires, and I2C is the best option (2 wires). The nRF51822
has I2C support (called TWI, two-wire interface), so this works. I
was not able to find a single-chip solution that measures and
captures time periods in this range with an I2C interface, but the
circuit is not that complicated.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhLlGofA0NAzm9hLD1qzJO14-uBGMQ39TIkZ703Ai3HRswlRm8vljhbfG8A4U4IvldZQzaqirMDeb-xw_Vc967FWWNNW2MaPT5y5GeOtkc8uCKsNvHD2zqkz-rhqo7yMH3LSFSN8ADKsWif/s1600/irblegw2.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="387" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhLlGofA0NAzm9hLD1qzJO14-uBGMQ39TIkZ703Ai3HRswlRm8vljhbfG8A4U4IvldZQzaqirMDeb-xw_Vc967FWWNNW2MaPT5y5GeOtkc8uCKsNvHD2zqkz-rhqo7yMH3LSFSN8ADKsWif/s640/irblegw2.png" width="640" /></a></div>
<br />
<br />
A 74HC4060 is
set up as an oscillator and counter. The frequency of the oscillator
is about 350 kHz, yielding about 43 microsec of time resolution per
counter step and making it convenient to measure between about 43
microsec and 5.5 msec with 7-bit resolution. An MCP23008
GPIO extender with I2C interface provides the conversion to the I2C
two-wire connection and also has a capture logic. This means that
whenever the output level of the IR receiver changes, the MCP23008
stores the current value of the counter in its capture register and
raises an interrupt. This way the MCU is not doing any time
measurement, and the time measurement hardware is able to survive
about 1 msec autonomously without service from the MCU.<br />
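These numbers can be checked quickly. Assuming the counter bits fed to the MCP23008 start at the 74HC4060's divide-by-16 tap (an assumption on my part; the exact tap is on the schematic), the step size and the 7-bit full scale work out to roughly the figures quoted:

```python
f_osc = 350e3                 # oscillator frequency, approximately 350 kHz
divider = 16                  # assumed first tap (74HC4060 outputs start at Q4)
step = divider / f_osc        # seconds per counter step
full_scale = step * 2 ** 7    # 7 counter bits reach the MCP23008

print(round(step * 1e6, 1))        # 45.7 -> "about 43 microsec" per step
print(round(full_scale * 1e3, 2))  # 5.85 -> "about 5.5 msec" full scale
```

This resolution comfortably covers the 500 µs - 2 ms range of a typical IR bit.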
<br />
<a href="http://pallergabor.uw.hu/androidblog/irblegw/irblegw2.sch">Click here to download the schematic in Eagle SCH format</a>.<br />
<br />
<a href="http://pallergabor.uw.hu/androidblog/irblegw/rfduino_irblegw2.zip">Click here to download the updated RFDuino gateway sources.</a><br />
<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhIrFq6bar9anfvySwkVdNlg9ja4-pSLfSFHpFq0Vj9QxIBRh2uOtTAJ3hy-PFYt6ZZBR1XRi7pjBcCHhYiryS4RcBNgp9-Ryq-w3COFDotFhDWE3sx7-H_I7fCeEqvvc9VPA07uPTfWHiC/s1600/irblegw2.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhIrFq6bar9anfvySwkVdNlg9ja4-pSLfSFHpFq0Vj9QxIBRh2uOtTAJ3hy-PFYt6ZZBR1XRi7pjBcCHhYiryS4RcBNgp9-Ryq-w3COFDotFhDWE3sx7-H_I7fCeEqvvc9VPA07uPTfWHiC/s320/irblegw2.jpg" width="320" /></a></div>
<br />
As with the previous version, edit the Makefile in the irblegw2/sketch
directory and update the RFDUINO_BASE_DIR and the AVRDUDE_COM_OPTS
variables according to the layout of the file system and the USB port
mapping of the RFDuino USB shield. Then you can simply say "make upload"
and the gateway is installed into the RFDuino. The <a href="http://pallergabor.uw.hu/androidblog/irblegw/irblegw.zip">Android client code</a>
and its usage is unchanged, check it out <a href="http://mylifewithandroid.blogspot.com/2015/04/infared-imaging-with-android-devices.html">in the previous post</a>.<br />
<br />
The gateway code represents a step in the right direction. The I2C
support library ("Wire") coming with the RFDuino gets stuck when
invoked from an interrupt handler, so this time the MCP23008
interrupt signal is handled by means of polling. The result is that
when BLE is active, occasionally there are still lost IR codes, even
though the quality of the recognition has improved considerably. I
intend to make the I2C access interrupt-driven, but that requires
going deeper into the nRF51822 than I was able to accomplish in this
iteration.</div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com0tag:blogger.com,1999:blog-8214401912480503366.post-82512009392388996162015-12-04T15:33:00.002+01:002015-12-30T02:15:40.211+01:00Controlling Android device with an infrared remote<div dir="ltr" style="text-align: left;" trbidi="on">
I came across several posts about microcontrollers decoding the
signals of infrared remote controls and started to think about how
an IR remote could be integrated with an Android smartphone. This
post is about a simple use case in which I control the media volume
of an Android phone with an IR remote.<br />
<br />
A long time ago, in a distant galaxy, an IR transmitter-receiver was
a standard feature of almost any mobile phone. Technological progress
has eliminated that feature, so we are now forced to build some
hardware. We certainly need an infrared sensor to capture the
remote's infrared signal. But how can we connect it to the phone?
Last year's experiments prompted me to choose Bluetooth Low Energy
(BLE). The microcontroller platform was determined by <a href="http://mylifewithandroid.blogspot.com/search/label/RFDuino">my experiences
with RFDuino</a>, and I happened to have an <a href="http://www.rfduino.com/">RFDuino set</a> in my drawer.<br />
<br />
The idea is the following. An infrared receiver is connected to a
microcontroller that captures and interprets the infrared signals.
If a phone is connected over BLE, the key codes received
from the IR remote are forwarded to the phone. The phone does
whatever it wants with the key codes; in my example the volume up,
down and mute buttons are handled and used to adjust the
volume of the music played by the media player.<br />
<br />
The infrared implementation is based on the excellent <a href="https://github.com/cyborg5/IRLib/">IRLib</a>
code. IRLib
assumes that the IR receiver is connected to one pin of the
microcontroller so I built the circuit below. A TSOP1738 IR receiver
is directly connected to a GPIO pin of the RFDuino which is expected to
do all the decoding. The SV1 header is only for monitoring debug messages;
it is not essential for the operation of the gateway. If you intend to use
it, connect an RS232C level converter similar to <a href="http://www.lctech-inc.com/Hardware/Detail.aspx?id=67f4ee1a-7412-4d12-87f6-7d00d52566ca">this one</a>.<br />
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEidwE2iKveCEqPvTUH4K1DZpQ7NS7ODthMJ0NyUAYT18awIVff-mA78hqg7cwc86FE78aD4iLYzFQyLsicfQOGLnuJuH3smBEuLvOMHK7vmh1aWevfjBciSyxDKwU3a-93t877r2KYwVQ6e/s1600/irsch.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="216" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEidwE2iKveCEqPvTUH4K1DZpQ7NS7ODthMJ0NyUAYT18awIVff-mA78hqg7cwc86FE78aD4iLYzFQyLsicfQOGLnuJuH3smBEuLvOMHK7vmh1aWevfjBciSyxDKwU3a-93t877r2KYwVQ6e/s320/irsch.png" width="320" /></a><br />
<br />
Then came the complications. "Arduino code" is used with great
liberty on the internet, as if Arduino meant a single platform. But
in reality, Arduino is a very thin layer on top of the underlying
microcontroller. Most of the code, tools and articles out there are
based on Atmel's extremely popular AVR family of MCUs. These are
8-bit CPUs equipped with a host of peripherals. The challenger in
this domain is not Intel but ARM's Cortex family. RFDuino is built on
Nordic Semiconductor's nRF51822 System-on-Chip (SoC), which is a
Cortex-M0 core with an integrated 2.4 GHz radio. BLE is supported by
means of a software stack. ARM Cortex is not as well supported as the
AVRs by the Arduino community, and this miniproject is a cautionary
tale about that fact.<br />
<br />
<a href="http://pallergabor.uw.hu/androidblog/irblegw/rfduino_irblegw.zip">Click here to download the RFDuino project</a>. In order to compile it,
you need to install the Arduino IDE with the RFDuino support<a href="https://github.com/RFduino/RFduino/blob/master/README.md">
as described here</a>.<br />
<br />
For starters, the <a href="https://github.com/sudar/Arduino-Makefile">Arduino Makefile</a>
I used with such great success previously does not support
Cortex-based systems, only AVRs. A short explanation: I don't like IDEs,
particularly not in a project where I would like to see exactly what
goes into the compile/link process. The Arduino IDE is nice for beginners
but is a very limiting environment for more ambitious projects. After
several days of heavy modifications, I adapted this makefile to the ARM
Cortex tool chain of the RFDuino. Go into the irblegw/sketch directory,
open the Makefile and look for the following line:<br />
<br />
<span style="font-family: "courier new" , "courier" , monospace;">RFDUINO_BASE_DIR = /home/paller/.arduino15/packages/RFduino<br />
</span><br />
Adapt this according to the layout of your file system. Then type "make"
and the entire code should compile. "make upload" uploads the compiled
code into the RFDuino, provided that the port (AVRDUDE_COM_OPTS =
/dev/ttyUSB0 in the Makefile) is also correct.<br />
<br />
Then came further complications. The IRLib code is also AVR-specific.
The differences between Cortex-M0 and AVR are mainly related to
interrupt handling, on-chip counters and GPIO options. Polluting the
original code with ARM-specific fragments seemed too confusing, so instead
I disabled the AVR-specific parts when the code is compiled for the ARM
architecture and implemented the ARM-specific functionality in
Cortex-specific subclasses (in sketch/irblegw.ino). There are three
implementations (as in the base library): IRrecvRFDuino (which uses a
50-microsecond periodic interrupt to sample the IR receiver input),
IRrecvRFDuinoLoop (which is purely polling-based with no interrupt
support) and IRrecvRFDuinoPCI (which sets up an interrupt to detect when
GPIO02 changes state). This line determines which one is used:<br />
<br />
IRrecvRFDuinoPCI My_Receiver(RECV_PIN);<br />
<br />
And now the biggest surprise. If BLE is not active, all three
implementations work. But if BLE is active, the software stack running
behind the scenes on the RFDuino steals enough cycles that IR reading
becomes completely unreliable. The best result is provided by
IRrecvRFDuinoPCI (which is activated in the download version), but even
that version, after a significant loosening of the matching rules, drops
about one IR key out of four. Well, folks, that's what I could achieve with
this hardware and that's the second important take-away: SoCs with
integrated communication stacks (this time BLE, but can be WiFi, GSM,
whatever) are notoriously tricky if the sensor processing logic is
time-critical.<br />
<br />
<a href="http://pallergabor.uw.hu/androidblog/irblegw/irblegw.zip">Click here to download the Android code.</a> To decrease download size, I packed only the files under the app/src/main directory of the Android Studio project.<br />
<br />
The Android implementation is quite self-explanatory and is heavily related to <a href="http://mylifewithandroid.blogspot.hu/2014/11/motor-boat-control-with-bluetooth-low.html">this earlier RFDuino example program</a>.
This time I wanted to make sure that once the gateway is connected
through BLE, the application can be sent to the background, so I
implemented the BLE connection logic in a service, which significantly
complicated the code. But after all this wizardry with Android services,
the important part is here in IRGWService.java:<br />
<br />
<span style="font-family: "courier new" , "courier" , monospace;">if( v == IR_KEY_VOLUME_PLUS ) {<br />
audioManager.adjustStreamVolume( <br />
AudioManager.STREAM_MUSIC, <br />
AudioManager.ADJUST_RAISE, <br />
AudioManager.FLAG_SHOW_UI);</span><br />
...<br />
<br />
This is the fragment that maps IR remote keys to actions on the phone.
The RFDuino-based gateway does not do any key mapping so it is the
Android application that needs to know the meaning of the IR key codes.
The following comes from the Philips remote I grabbed for these
experiments.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjOZhjl7IMjk6mRp53sV5N76H318UwenVVaj_NoBrLdN2hodgfhB24nvtAXMKAUleijyS2AJj2DaieaiaQAOW33adr6fa6r9vhRfyavhnZCB0MQnig7g4MjHDIGYY2ZnE_fMM2mHMieIvbi/s1600/irgw.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjOZhjl7IMjk6mRp53sV5N76H318UwenVVaj_NoBrLdN2hodgfhB24nvtAXMKAUleijyS2AJj2DaieaiaQAOW33adr6fa6r9vhRfyavhnZCB0MQnig7g4MjHDIGYY2ZnE_fMM2mHMieIvbi/s320/irgw.jpg" width="240" /></a></div>
<br />
<br />
private static final long IR_KEY_VOLUME_PLUS = 0x20df12edL;<br />
<br />
It is highly likely that you will have to change this value according to
your remote's key code map. Just press the desired button and check the
code.<br />
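To illustrate the mapping idea, here is a minimal Python sketch of the dispatch logic (the real code is Java in IRGWService.java). Only the volume-up code comes from the post; the other two constants are hypothetical placeholders you would replace with codes read from your own remote:

```python
# Sketch of the gateway's key-dispatch idea (illustration only, not the
# project's Java code). Only IR_KEY_VOLUME_PLUS is taken from the post;
# the other codes are hypothetical - read them off your own remote.

IR_KEY_VOLUME_PLUS = 0x20DF12ED   # Philips remote, from the post
IR_KEY_VOLUME_MINUS = 0x20DF0000  # hypothetical placeholder
IR_KEY_MUTE = 0x20DF0001          # hypothetical placeholder

def map_ir_code(code):
    """Translate a raw 32-bit IR code into a symbolic volume action."""
    actions = {
        IR_KEY_VOLUME_PLUS: "ADJUST_RAISE",
        IR_KEY_VOLUME_MINUS: "ADJUST_LOWER",
        IR_KEY_MUTE: "ADJUST_TOGGLE_MUTE",
    }
    return actions.get(code)  # None for unmapped keys

print(map_ir_code(0x20DF12ED))  # → ADJUST_RAISE
```

Unknown codes simply map to no action, which is also what the Android service does with buttons it does not handle.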
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjBbops7xo0c-x8RssKP9PZbDQY32cE4OrbVo7W-XPs0IOVjvdp-HGmlG6_oO_sQLu1YHaANq-g4Q4I-t7F9OEcwnkaMDXOA7DHSoOrIYTIHFYOztH8ZnhxV5WKXY6_kqzDS3PgGQkUPkxR/s1600/irvolume.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjBbops7xo0c-x8RssKP9PZbDQY32cE4OrbVo7W-XPs0IOVjvdp-HGmlG6_oO_sQLu1YHaANq-g4Q4I-t7F9OEcwnkaMDXOA7DHSoOrIYTIHFYOztH8ZnhxV5WKXY6_kqzDS3PgGQkUPkxR/s320/irvolume.png" width="180" /></a></div>
<br />
<br />
This project was not a complete success as IR reading is not as reliable
as it should be. But it is indeed fun to control the Android phone with
an ordinary IR remote. I plan to improve the hardware a bit to make key
recognition better so stay tuned (if you care about IR remotes).<br />
<br />
<b>Update:</b> check out the follow-up blog posts (<a href="http://mylifewithandroid.blogspot.com/2015/12/improved-hardware-for-infrared-to.html">this</a> and <a href="http://mylifewithandroid.blogspot.com/2015/12/infrared-to-android-gateway.html">this</a>).<br />
<br /></div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com0tag:blogger.com,1999:blog-8214401912480503366.post-88383757897238238052015-04-22T17:01:00.004+02:002016-01-03T01:56:21.259+01:00Infrared imaging with Android devices<div dir="ltr" style="text-align: left;" trbidi="on">
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
One of the most obvious sensors of an Android device is the camera. An
ordinary smartphone camera can capture a lot of interesting
information but has its limitations too. Most obviously, its viewing
angle depends on the position of the device (so it is neither fixed nor
easy to measure) and its bandwidth is (mostly) restricted to
visible light. It is therefore an exciting idea to connect special
cameras to Android devices.<br />
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhDkP5GgbONO8NMZVUNOel0tu7XLS1oi61K-EBtR6K2PmUCs6rFtJfn_8jjMsWBeJEHVOye7judqubJay0OPoAPcx5iuJcnT7iA7vGv9Yl6jFeDNr0GD8ApEG1lb7jZ6bhNiaOrjxDjsspf/s1600/ircamera.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhDkP5GgbONO8NMZVUNOel0tu7XLS1oi61K-EBtR6K2PmUCs6rFtJfn_8jjMsWBeJEHVOye7judqubJay0OPoAPcx5iuJcnT7iA7vGv9Yl6jFeDNr0GD8ApEG1lb7jZ6bhNiaOrjxDjsspf/s1600/ircamera.jpg" height="320" width="261" /></a><br />
<br />
<br />
In this post, I will present an integration of <a href="http://www.flir.com/cores/content/?id=66257">FLIR Lepton
Long-wavelength Infrared Camera</a> to an Android application over
Bluetooth Low Energy connection. Long-wavelength IR (LWIR) cameras
are not new; previously, however, they were priced in the thousands
of dollars (if not higher). The Lepton is still pricey (currently
about 300 USD) but its price is low enough that mere mortals can
play with it. FLIR sells a smartphone integration product (called
FLIR One) but it is currently only available for the iPhone and locks
the camera to one device. Our prototype allows any device with a BLE
connection to access this very special camera.<br />
<br />
The prototype system presented here needs a relatively long list of
external hardware components, and preparing these components is not
trivial either. The list is the following:<br />
<br />
<br />
<ul>
<li>An Android phone with Bluetooth 4.0 capability. I used Nexus 5
for these experiments.</li>
<li>An FLIR Lepton module. My recommendation is the <a href="https://www.sparkfun.com/products/13233">FLIR Dev Kit
from Sparkfun</a> that has the camera module mounted on a
breakout panel that is much easier to handle than the original
FLIR socket.</li>
<li>A BeagleBone Black card with an SD Card >4GB.</li>
<li>A <a href="https://www.bluegiga.com/en-US/products/bled112-bluetooth-smart-dongle/">BLED112
BLE dongle</a> from Silicon Labs (formerly Bluegiga).</li>
</ul>
The software for the prototype can be downloaded in two packages.<br />
<br />
<ul>
<li><a href="http://pallergabor.uw.hu/androidblog/ircamera/bt_ircamera.zip">This package</a> contains the stuff for the embedded computer.</li>
<li><a href="http://pallergabor.uw.hu/androidblog/ircamera/IRCamera.zip">This package</a> is the Android application that connects to it.<br />
</li>
</ul>
Once you have all these, prepare the ingredients.<br />
<br />
<u>1. Hook up the FLIR camera with the BeagleBone Black</u><br />
<br />
Fortunately the BBB's SPI interface is fully compatible with
the Lepton's, so the "hardware" just needs a couple of wires. Make
the following connections (P9 refers to the BBB's P9 extension
port).<br />
<br />
<br />
<table border="1" cellpadding="2" cellspacing="2" style="width: 100%px;">
<tbody>
<tr>
<td valign="top">FLIR</td>
<td valign="top">BBB</td>
</tr>
<tr>
<td valign="top">CS</td>
<td valign="top">P9/28 (SPI1_CS0)</td>
</tr>
<tr>
<td valign="top">MOSI</td>
<td valign="top">P9/30 (SPI1_D1)</td>
</tr>
<tr>
<td valign="top">MISO</td>
<td valign="top">P9/29 (SPI1_D0)</td>
</tr>
<tr>
<td valign="top">CLK</td>
<td valign="top">P9/31 (SPI1_SCLK)</td>
</tr>
<tr>
<td valign="top">GND</td>
<td valign="top">P9/1 (GND)</td>
</tr>
<tr>
<td valign="top">VIN</td>
<td valign="top">P9/4 (DC, 3.3V)</td>
</tr>
</tbody>
</table>
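The data that comes over these wires follows FLIR's VoSPI protocol. As a rough, hedged illustration of what the BBB application has to do with the bytes it clocks in (this is a Python sketch, not the project's actual C code): each VoSPI packet is 164 bytes, a 2-byte ID, a 2-byte CRC, then 80 big-endian 16-bit pixel words, and packets whose ID nibble is 0xF are discard packets, per the Lepton datasheet.

```python
# Illustrative sketch of parsing one Lepton VoSPI packet.
# Layout per the Lepton datasheet: 164 bytes = 2-byte ID + 2-byte CRC
# + 80 big-endian 16-bit pixel words; an ID with 0xF in its second
# nibble marks a discard packet. Not the blog project's actual code.
import struct

PACKET_LEN = 164
PIXELS_PER_LINE = 80

def parse_vospi_packet(packet):
    """Return (line_number, pixels) or None for a discard packet."""
    assert len(packet) == PACKET_LEN
    pkt_id = struct.unpack(">H", packet[0:2])[0]
    if (pkt_id & 0x0F00) == 0x0F00:
        return None                       # discard packet, skip it
    line = pkt_id & 0x0FFF                # video line number 0..59
    pixels = struct.unpack(">80H", packet[4:PACKET_LEN])
    return line, list(pixels)

# Exercise with a synthetic packet: line 3, all pixels 0x1234
pkt = struct.pack(">HH", 3, 0) + struct.pack(">80H", *([0x1234] * 80))
line, px = parse_vospi_packet(pkt)
print(line, px[0], len(px))  # → 3 4660 80
```

Collecting 60 such lines yields the 80x60 raw frame that the server later converts to PNG.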
<br />
<br />
<u>2. Prepare the BBB environment</u><br />
<br />
I use Snappy Ubuntu. Grab the SD card and download the image <a href="https://developer.ubuntu.com/en/snappy/start/">as documented here.</a>
Before flashing the SD card, we have to update the device tree in the
image so that the SPI port is correctly enabled. Unpack bt_ircamera.zip
that you have just downloaded and go to the dt subdirectory. There you
will find the device tree file that I used for this project. Besides the SPI1
port, it also enables some serial ports. These are not necessary for
this project but may come in handy.<br />
<br />
Compile the device tree:<br />
<br />
dtc -O dtb -o am335x-boneblack.dtb am335x-boneblack.dts<br />
<br />
The output is the binary device tree (am335x-boneblack.dtb) that needs
to be put into the kernel image file. Let's suppose that the downloaded
image file is ubuntu-15.04-snappy-armhf-bbb.img and you have an empty directory at /mnt/img.
Then do the following:<br />
<br />
fdisk -l ubuntu-15.04-snappy-armhf-bbb.img<br />
<br />
Look for the first partition and note the partition image name and the offset:<br />
<br />
ubuntu-15.04-snappy-armhf-bbb.img1 * 8192 139263 65536 c W95 FAT32 (LBA)<br />
...<br />
<br />
Note that the actual partition image name may differ depending on the
Snappy image you downloaded. Calculate the byte offset as 8192*512=4194304.
Now mount the partition:<br />
<br />
mount -o loop,offset=4194304 ubuntu-15.04-snappy-armhf-bbb.img /mnt/img<br />
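The offset arithmetic above can be sanity-checked with a few lines of Python: fdisk reports the partition start in 512-byte sectors, and mount's offset= option wants bytes.

```python
# fdisk -l reports the partition start in 512-byte sectors; mount's
# offset= option expects bytes, so multiply by the sector size.
SECTOR_SIZE = 512
start_sector = 8192            # "Start" column from fdisk -l

offset = start_sector * SECTOR_SIZE
print(offset)                  # → 4194304, the value for -o loop,offset=
```

If your image lays out the first partition at a different start sector, redo this multiplication with your own fdisk output.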
<br />
Then copy the dtb into the image, unmount and write the image to SD card
(on my computer the SD card interface is /dev/sdc, check before you
issue the dd command!):<br />
<br />
cp am335x-boneblack.dtb /mnt/img/a/dtbs<br />
umount /mnt/img<br />
dd if=ubuntu-15.04-snappy-armhf-bbb.img of=/dev/sdc bs=32M<br />
<br />
Now you have an SD card that you can insert into the BBB and boot from
it. Once you have reached the Ubuntu prompt and logged in (ubuntu/ubuntu),
there's one more thing: the Snappy prototype application depends on the
libpng package, which is not part of the default Snappy image. But before
installing it, check whether the SPI device was enabled correctly:<br />
root@localhost:~# ls /dev/spidev1.0 <br />
/dev/spidev1.0 <br />
<br />
Now about the png library. Download the armhf image from this location:<br />
<br />
wget http://ports.ubuntu.com/pool/main/libp/libpng/libpng12-0_1.2.50-1ubuntu2_armhf.deb<br />
<br />
<br />
Copy it to the BBB (update your card's IP address according to your network policies):<br />
scp libpng12-0_1.2.50-1ubuntu2_armhf.deb ubuntu@192.168.1.115:~<br />
<br />
Then go to the BBB console and install the deb package:<br />
sudo mount -o remount,rw /<br />
sudo dpkg -i libpng12-0_1.2.50-1ubuntu2_armhf.deb<br />
sudo mount -o remount,ro /<br />
<br />
<u>3. Prepare the BLE dongle</u><br />
<br />
The BLED112 stores the GATT tree in its firmware, hence in order to
provide the GATT services that connect the BBB with the Android device, a
new firmware needs to be generated and installed in the dongle. The
config files are located in the config subdirectory in the
bt_ircamera.zip archive. <a href="http://mylifewithandroid.blogspot.hu/2014/12/integrating-android-smartphone.html">Follow the steps in this post </a>to generate and install the new firmware. Once you are done, you can simply plug the dongle into the USB port of the BBB.<br />
<u><br />
</u><u>4. Install the prototype applications</u><br />
<br />
The prototype system has two parts. The application running on the BBB
acts as BLE server, fetches images from the FLIR camera and transmits
them over BLE. The Android application acts as BLE client, fetches
images from the BLE server and displays them. The BBB part is located in
bt_ircamera.zip and the Android part is in IRCamera.zip. The latter is
just the source part of the Android Studio project tree - I omitted all
the garbage that Android Studio generates into the project folders. For
the BBB installation, <a href="http://mylifewithandroid.blogspot.hu/2015/02/bled112-on-beaglebone.html">follow the instructions in this blog post</a>. Launch the BBB application like this as root:<br />
/apps/ircamera.sideload/1.0.0/bin/ircamera /dev/ttyACM0 <br />
and you are ready to go. On the Android side, select the BLE node with
the name "test", connect, click the "Take picture" button, wait for the
image to download and there you are. Note that the images are saved on
the SD card, which means that they also appear in the stock "Photos"
application.<br />
<br />
Now at last we can get to the technical issues with this prototype. One
interesting aspect is that there is no standard BLE service that
provides the functionality our system needs - triggering an image capture
and fetching the image. That's not a problem: we defined our own BLE service.
It is easiest to follow this service in irc_gattBLED112.xml
(bt_ircamera.zip, config subdirectory).<br />
<br />
The service has a custom UUID, generated randomly:<br />
<br />
<service uuid="274b15a3-b9cd-4e5e-94c4-1248b42b82f8" advertise="true"><br />
<br />
Also, its 3 GATT characteristics are in the non-standard UUID domain:<br />
<br />
<characteristic uuid="00000000-b9cd-4e5e-94c4-1248b42b82f8" id="irc_len"><br />
...<br />
<characteristic uuid="00000001-b9cd-4e5e-94c4-1248b42b82f8" id="irc_offs"><br />
...<br />
<characteristic uuid="00000002-b9cd-4e5e-94c4-1248b42b82f8" id="irc_pic"><br />
<br />
The interaction goes like the following. The BLE client connects and
reads the irc_len characteristic. This characteristic is tagged as
"user" on the BLE server side meaning that the BLE application must
generate the value on the fly, when the attribute is read. In our case,
reading this attribute fetches an image from the FLIR camera, converts
it into PNG format and stores it in the apps' data folder, returning
only the PNG file size. The Android application now can fetch the image
piece by piece. First the Android application writes the irc_offs
characteristic to inform the BLE server, what is the starting location
of the fragment it wants to fetch. Then it reads the irc_pic
characteristic, which returns a maximum of 20 bytes of image data. This
makes the image download very slow (it takes about 10-20 seconds to download
a typical 5-6 Kbyte image to the Android application) but the
restriction comes from the BLE protocol layer. Maybe the old RFCOMM from
Bluetooth Classic would actually have been a better option for this
application.<br />
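The interaction described above can be condensed into a short sketch. This is a hedged Python illustration of the client-side loop, not the Android code; the three callables stand in for the real GATT read/write operations on irc_len, irc_offs and irc_pic.

```python
# Sketch of the chunked download loop described above: read irc_len once,
# then repeatedly write irc_offs and read up to 20 bytes from irc_pic
# until the whole PNG has arrived. The three callables are hypothetical
# stand-ins for the real GATT operations.
CHUNK = 20  # maximum ATT payload per read on the original BLE 4.0 stack

def fetch_image(read_len, write_offs, read_pic):
    total = read_len()              # triggers a capture, returns PNG size
    image = bytearray()
    while len(image) < total:
        write_offs(len(image))      # tell the server where to continue
        image += read_pic()         # at most CHUNK bytes per read
    return bytes(image)

# Toy in-memory "server" to exercise the loop:
png = bytes(range(256)) * 20        # 5120-byte fake image
state = {"offs": 0}
img = fetch_image(lambda: len(png),
                  lambda o: state.update(offs=o),
                  lambda: png[state["offs"]:state["offs"] + CHUNK])
print(len(img) == len(png))         # → True
```

The 20-byte reads are exactly why a 5-6 Kbyte image needs hundreds of round trips, which explains the 10-20 second download time.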
<br />
<b>Update: </b>I updated the client/server application to make the download faster (it is still quite slow). In order to speed up, I removed the explicit setting of the file offset (so the irc_offs characteristic is not used anymore). This made the download faster but there's still room for improvement.<br />
<br />
Other than the issue with fragment size, both applications are pretty
straightforward. Maybe the colors of the IR image are worth discussing a
bit. The FLIR camera returns a matrix of 80x60 pixels; each pixel has a
depth of 12 bits. Grayscale presentation is the most obvious option, but
most displays have only 256 gray levels. In order to make the IR shades
more visible, I used fake coloring. The algorithm is very simple: after
the image is fetched, the maximum and the minimum IR intensities are
calculated and the range between the two is mapped onto a rainbow
gradient of 400 colors.</div>
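The min/max stretching just described can be sketched in a few lines of Python (an illustration of the idea, not the app's actual Java code; the rainbow palette itself is left abstract here):

```python
# Minimal sketch of the fake-coloring step: stretch the observed 12-bit
# intensity range onto indices into a 400-entry rainbow palette.
N_COLORS = 400

def color_index(value, lo, hi):
    """Map one IR intensity to a palette index via min/max stretching."""
    if hi == lo:
        return 0                     # flat image: avoid division by zero
    return (value - lo) * (N_COLORS - 1) // (hi - lo)

def colorize(pixels):
    lo, hi = min(pixels), max(pixels)
    return [color_index(p, lo, hi) for p in pixels]

print(colorize([100, 100, 612, 1124]))  # → [0, 0, 199, 399]
```

Because the stretch is recomputed per frame, even a scene with a narrow temperature range fills the whole palette, which is what makes the shades visible.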
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com1tag:blogger.com,1999:blog-8214401912480503366.post-43700846516131025202015-02-06T19:46:00.001+01:002015-04-30T13:45:30.809+02:00BLED112 on BeagleBone<div dir="ltr" style="text-align: left;" trbidi="on">
<a href="http://mylifewithandroid.blogspot.com/2014/12/integrating-android-smartphone.html">In the previous post I demonstrated</a> how a Bluetooth Low Energy dongle can be used to connect a PC and an Android device. While this sort of project is appealing, connecting PCs and smartphones is not such an interesting use case. It is much more interesting, however, to move the PC-side program directly to an embedded device, and that's what I will demonstrate in this post.<br />
<br />
The Android application used in this post did not change, <a href="http://pallergabor.uw.hu/androidblog/bled112/CTS.zip">you can download it here</a>. The BLE server application was updated according to the embedded platform's requirement, <a href="http://pallergabor.uw.hu/androidblog/beagle_cts_example.zip">you can download the new version here</a>.<br />
<br />
There are two baskets of embedded platforms out there. One of them is optimized for low power consumption. These platforms are too limited to run a full-scale operating system, so their software environment is often proprietary. Arduino (of which we have seen the <a href="http://mylifewithandroid.blogspot.com/2014/11/motor-boat-control-with-bluetooth-low.html">RFDuino variant)</a> is one of them, but there are many more; e.g. Bluegiga modules also have a proprietary application model. We can typically expect power consumption in the 1-10 mA range, with some platforms offering even lower standby consumption.<br />
<br />
The other basket contains scaled-down computers that are able to run stripped-down versions of a real operating system. Their power consumption is in the 100-500 mA range and they often sport hundreds of megabytes of RAM and gigabytes of flash memory. They are of course not comparable to low-power platforms when it comes to power consumption, but their much higher performance (which can be relevant for computation-intensive tasks) and compatibility with mainstream operating systems make them very attractive for certain tasks. The card I chose is the <a href="http://beagleboard.org/BLACK">BeagleBone Black</a> and my main motivation was that Ubuntu chose this card as a reference platform for its <a href="https://developer.ubuntu.com/en/snappy/">Ubuntu Core variant</a>.<br />
<br />
The point I try to make in this post is how easy it is to port an application developed for desktop PC to these embedded computers. Therefore let's just port the <a href="http://mylifewithandroid.blogspot.com/2014/12/integrating-android-smartphone.html">BLE server part of the CTS example demo</a> to BeagleBone Black.<br />
<br />
There are a handful of operating systems available for this card. I chose Snappy Ubuntu - well, because my own desktop is Ubuntu. Grab an SD card and <a href="http://www.ubuntu.com/things#try-beaglebone">prepare a Snappy Ubuntu boot media according to this description</a>. It worked for me out of the box. <a href="https://www.youtube.com/watch?v=9V2RGrMA7ag">You can also start with this video</a> - it is really that easy. Once you hooked up the card with your PC, let's prepare the development environment.<br />
<br />
First fetch the ARM cross-compiler with this command (assuming you are on Ubuntu or Debian):<br />
<br />
sudo apt-get install gcc-arm-linux-gnueabihf<br />
<br />
Then install snappy developer tools <a href="https://developer.ubuntu.com/en/snappy/#snap-developers">according to this guide</a>.<br />
<br />
Then unpack the BLE server application into a directory and set up these environment variables.<br />
<br />
export CROSS_COMPILE=arm-linux-gnueabihf-; export ARCH=arm<br />
<br />
Enter the beagle_conn_example directory that you unpacked from the ZIP package and execute:<br />
<br />
make<br />
<br />
This should re-generate cts_1.0.0_all.snap which is already present in the ZIP archive in case you run into problems with building the app. The snap is the new package format for snappy. Then you can install this package on the card.<br />
<br />
snappy-remote --url=ssh://192.168.1.123 install ./cts_1.0.0_all.snap<br />
<br />
You have to update the IP address according to what your card obtained on your network. The upload tool will prompt you for username/password, it is ubuntu/ubuntu by default.<br />
<div>
<br /></div>
Update the GATT tree in the BLED112 firmware <a href="http://mylifewithandroid.blogspot.com/2014/12/integrating-android-smartphone.html">as described in the previous post.</a> Plug the BLED112 dongle into the BeagleBone's USB port. Then open a command prompt on the BeagleBone either using the serial debug interface or by connecting to the instance with ssh, and execute the following command:<br />
<br />
sudo /apps/cts.sideload/1.0.0/bin/cts /dev/ttyACM0<br />
<br />
The familiar console messages appear and you can connect with the Android app as depicted in the image below.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjWM8mwHJ82Q_Hp5sX48hp-zJA_yQzAVXMqU4_A0ugDXNA1xIFnEWifLRPifusCou8_6KHWwqK60omWJ68ROQnT16KmTI5cXhr8QWyNRtizApYD_xmwdZXTfU47-asAU_wvDt0WbUI_W9tb/s1600/bled112_beaglebone.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjWM8mwHJ82Q_Hp5sX48hp-zJA_yQzAVXMqU4_A0ugDXNA1xIFnEWifLRPifusCou8_6KHWwqK60omWJ68ROQnT16KmTI5cXhr8QWyNRtizApYD_xmwdZXTfU47-asAU_wvDt0WbUI_W9tb/s1600/bled112_beaglebone.jpg" height="400" width="223" /></a></div>
<br />
One thing you can notice here is that Snappy's shiny new package system is not ready yet. In order for this package to access the /dev/ttyACM0 device (to which the BLED112 is mapped without problem), it has to run as root. This is something that the Snappy team is yet to figure out. The experience, however, is smooth enough that application development can be started now.<br />
<br />
<br />
<br /></div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com0tag:blogger.com,1999:blog-8214401912480503366.post-67215718796388028132014-12-30T23:42:00.001+01:002014-12-31T00:13:11.811+01:00Integrating an Android smartphone application with the BLED112 module<div dir="ltr" style="text-align: left;" trbidi="on">
My conclusion with the <a href="http://mylifewithandroid.blogspot.com/2014/11/motor-boat-control-with-bluetooth-low.html" style="text-align: left;">RFDuino adventures</a><span style="text-align: left;">
was that RFDuino is a perfect platform to start familiarizing with the
Bluetooth Low Energy (BLE) technology. BLE programming was made so
simple with RFDuino that it provides quick success. Simplification comes
with limitations, however, and eventually time has come for me to step
further toward a more flexible BLE platform. Bluegiga's BLE121LR long
range module seems to have outstanding range but first I tried a piece
of hardware that is equivalent from the API point of view with the
BLE121LR but is easier to start with and that is </span><a href="https://www.bluegiga.com/en-US/products/bluetooth-4.0-modules/bled112-bluetooth-smart-dongle/" style="text-align: left;">Bluegiga's BLED112 USB dongle</a><span style="text-align: left;">.</span>
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhgrmYHbmMGnIOBgS014nCBiLaQSHS9RjSt160-BRS4I4TZKsH3u_QYDfM_XPIFnzRaftzcBmFs0fexy0jD8VcdUdpjBgYeCGN7pKnVFyK_KFHGFURJhxqF6Ez36j-oc9RMd3U2APg8HfWH/s1600/bled112.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhgrmYHbmMGnIOBgS014nCBiLaQSHS9RjSt160-BRS4I4TZKsH3u_QYDfM_XPIFnzRaftzcBmFs0fexy0jD8VcdUdpjBgYeCGN7pKnVFyK_KFHGFURJhxqF6Ez36j-oc9RMd3U2APg8HfWH/s1600/bled112.jpg" height="314" width="320" /></a><br />
<br />
The BLED112 implements the same API (called BGAPI, check out <a href="https://www.bluegiga.com/en-US/products/bluetooth-4.0-modules/bluegiga-bluetooth-smart-software/documentation/">Bluetooth Smart Software API reference</a>)
that other Bluegiga BLE modules do but there's no need to buy the
pricey DKBLE development board or build any hardware. It plugs neatly
into the USB port and is functional without any additional piece of
hardware. From the serious BLE application development perspective it
has drawbacks too. Firstly, its USB interface draws a
constant 5 mA current, so this solution is not very "low energy".
The second disadvantage is that its single USB interface is shared between
the BGAPI API and the programming interface, so installing scripts into
the BLED112 is a risky enterprise. If the script running on the BLED112
occupies the USB port, there's no way to update it and the module is
essentially bricked. Hence in this exercise we will keep the BLE
application logic on the PC hosting the module and talk to the module
with BGAPI. This is a setup very similar to when the application logic is
running on a microcontroller or an embedded PC.<br />
<br />
Click here to download the <a href="http://pallergabor.uw.hu/androidblog/bled112/CTS.zip">Android client</a> and the <a href="http://pallergabor.uw.hu/androidblog/bled112/cts_example.zip">PC server</a> example programs.<br />
<br />
In this exercise, we will implement the <a href="https://developer.bluetooth.org/gatt/services/Pages/ServiceViewer.aspx?u=org.bluetooth.service.current_time.xml">Current Time Service</a>
(CTS) and access this service from an Android application. CTS is a
standard Bluetooth service. The PC application will fetch the current
time from its clock and will update the characteristic exposed by the
BLED112. The Android application will detect the advertised CTS service,
connect to it, retrieve the time and display it to the user. The
Android application will also subscribe to time changes demonstrating
the notification feature of BLE GATT.<br />
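The Current Time characteristic that the PC side must keep updated has a fixed 10-byte layout in the Bluetooth CTS specification: year as a little-endian uint16, then month, day, hours, minutes, seconds, day-of-week (1 = Monday .. 7 = Sunday, 0 = unknown), fractions of a second in 1/256 units, and an adjust-reason bitfield. A hedged Python sketch of packing such a value (an illustration of the wire format, not the example program's C code):

```python
# Sketch of packing the 10-byte Current Time characteristic value
# (Bluetooth CTS spec layout). Illustration only - the blog's server
# side is written in C.
import struct

def pack_current_time(year, month, day, hh, mm, ss,
                      day_of_week=0, fractions256=0, adjust_reason=0):
    # <H = little-endian uint16 year, then eight single bytes
    return struct.pack("<HBBBBBBBB", year, month, day, hh, mm, ss,
                       day_of_week, fractions256, adjust_reason)

payload = pack_current_time(2014, 12, 30, 23, 42, 0, day_of_week=2)
print(len(payload))  # → 10
```

When the clock ticks, the server writes a fresh 10-byte value into the characteristic, and subscribed clients receive it as a notification.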
<br />
Let's start with the PC part. Unpack the cts_example.zip file; in its root
directory you will find the set of C files belonging to the PC
application. I developed and tested the application on Ubuntu 14.10, so if
you use a similar system, chances are good that you can just type
"make" and it will compile. Preparing the BLED112 dongle is more
complicated, however, and this is the result of the quite cumbersome
Bluegiga tool chain. Any change to the GATT services (<a href="http://mylifewithandroid.blogspot.hu/2014/11/connect-your-android-to-real-world-with.html">read this presentation</a> if you don't know what GATT is) requires a firmware update
of the BLED112. This sounds scary but it is not too complicated if you
have a Windows system. The Bluegiga SDK supports only Windows, and there is
one element of the tool chain, the firmware downloader, that does not run
on emulated Windows either - you need the real thing. So the steps are
the following:<br />
<br />
<ul>
<li>Grab a Windows machine, download the <a href="https://www.bluegiga.com/en-US/products/bluetooth-4.0-modules/bluegiga-bluetooth-smart-software/documentation/">Bluegiga SDK</a> and install it.</li>
<li>Get the content of the config subdirectory in cts_example.zip and
copy it somewhere on the Windows file system. Then generate the new
firmware with the <bluegigasdk_install_location>\bin\bgbuild.exe
cts_gattBLED112_project.bgproj command. The output is the
cts_BLED112.hex file, which is the new firmware. We could have placed
application logic into the firmware with a script but, as I said, that is a
bit risky with the BLED112, so this time the new firmware contains only
the GATT database for the CTS service.</li>
<li>Launch the BLE GUI application, select the BLED112 port and try to
connect by clicking the "Attach" button. If all goes well, you will see a
green "Connected" message. Then select the Commands/DFU menu item, select
the HEX file we have just generated and click the "Boot into DFU mode"
button. One peculiarity of the BLED112 is that in DFU mode it becomes
logically another USB device, so the main window will display a red
"Disconnected" message. Then click "Upload". If the upload counter
reaches 100% and you see the "Finished" message, the firmware update is
done.</li>
</ul>
At this point we are finished with Windows and can start the serious
business. Plug the dongle into the Ubuntu machine and check which port
it was assigned.<br />
<br />
dmesg | tail<br />
...<br />
usb 3-2: Product: Low Energy Dongle<br />
usb 3-2: Manufacturer: Bluegiga<br />
usb 3-2: SerialNumber: 1<br />
cdc_acm 3-2:1.0: ttyACM0: USB ACM device<br />
<br />
<br />
So this time the dongle is mapped to /dev/ttyACM0. Launch the BLE server with the following command:<br />
<br />
<br />
./conn_example /dev/ttyACM0<br />
<br />
<br />
The server is ready; let's see the Android client. Import the Android
project in CTS.zip into Android Studio. (I am still baffled that this
shiny new IDE does not have a project export command, so I had to zip
part of the project's directory tree manually.) Once you
launch the Android application, you will see a screen like this:<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh4acBZLVjMyS9wR4ZuziqGrMkt3Tx9l0bKXQ20Fmqqt3YnbWfbTx9INtR6XTz6c60LELJlAeaHDbaV8vyCUi26rvsmgbgVVvp8FPHbGWtPjWMN56VpPdeWxHyI-QGVT9RBHbvIrZ6kC6nL/s1600/cts_devicelist.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh4acBZLVjMyS9wR4ZuziqGrMkt3Tx9l0bKXQ20Fmqqt3YnbWfbTx9INtR6XTz6c60LELJlAeaHDbaV8vyCUi26rvsmgbgVVvp8FPHbGWtPjWMN56VpPdeWxHyI-QGVT9RBHbvIrZ6kC6nL/s1600/cts_devicelist.png" height="265" width="320" /></a></div>
<br />
<br />
The device named "test" is our device and the CTS service is
identified by the UUID of 0x1805. The other entry is just another BLE
device that I threw in for demonstration. Click on the device name and
you get the time emitted by the BLE dongle:<br />
<br />
<br />
The current time is also updated every second, demonstrating that the client successfully subscribed to the changes.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiqKxg2xRIdNhf2S3qOrU7W40ejPC3Q9MryrF48U7N-9qKdLIQEUKQtc5wBGFGATPL3qrzkNbMtsyP4qwiKSU1pabMffDqP29AiE7dO4QwV28mC91C9b6hdfptNRwR-hvyVqVTSuCDTC0Kf/s1600/cts_timescreen.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiqKxg2xRIdNhf2S3qOrU7W40ejPC3Q9MryrF48U7N-9qKdLIQEUKQtc5wBGFGATPL3qrzkNbMtsyP4qwiKSU1pabMffDqP29AiE7dO4QwV28mC91C9b6hdfptNRwR-hvyVqVTSuCDTC0Kf/s1600/cts_timescreen.png" height="320" width="180" /></a></div>
<br />
<br />
<br />
On the server side, it is important to note that the BGAPI protocol
is defined in terms of byte arrays sent and received over the serial
port (which is mapped to USB in the case of the BLED112). The BGAPI support
library from Bluegiga that I used in this demo is just a wrapper
over this interface, so it can be replaced with an optimized
implementation if the library is too heavy for the application platform
(e.g. a microcontroller) or is not available in the desired language (e.g. Python). On the Android client side, it is interesting to note how the BLE advertisement parser library <a href="http://mylifewithandroid.blogspot.com/2014/12/ever-since-i-created-gas-sensor-demo.html">I presented in this post</a> is used to figure out whether the device advertises the CTS service we are interested in.
</div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com6tag:blogger.com,1999:blog-8214401912480503366.post-37009839757297969602014-12-19T17:38:00.003+01:002016-05-09T11:54:47.196+02:00Parsing BLE advertisement packets<div dir="ltr" style="text-align: left;" trbidi="on">
Ever since I created the Gas Sensor demo (<a href="http://mylifewithandroid.blogspot.com/2014/09/gas-sensor-prototype-explained.html">post here</a>, <a href="https://www.youtube.com/watch?v=iYWIzbJK81U">video here</a>, <a href="http://www.slideshare.net/paller/realworldconnect">presentation here</a>),
I had a feeling of unfinished business. That demo sent the sensor
data in BLE advertisement packets, so the client never connected to the
sensor but received its data in a broadcast-like fashion. The
implementation looked like this:<br />
<br />
public void onLeScan(final BluetoothDevice device, int rssi, byte[] scanRecord) {<br />
String deviceName = device.getName();<br />
...<br />
int addDataOffs = deviceName.length() + 16;<br />
int siteid = ((int)scanRecord[addDataOffs]) & 0xFF;<br />
int ad1 = ((int)scanRecord[addDataOffs+1]) & 0xFF;<br />
<br />
This was a quick & dirty solution left over from my earliest
prototypes. It assumes that the structure of the BLE
advertisement packet is fixed, so the sensor data can always be found at
fixed offsets within the advertisement packet. This does not have to be
the case: the <a href="https://www.bluetooth.org/en-us/specification/adopted-specifications">Bluetooth 4.0 Core Specification</a>,
Part C, Appendix C (or the Core Specification Supplement in case of the 4.2
version) describes how the fields of the advertisement packet are
structured. It just so happens that with the given version of the RFDuino BLE
module, the Manufacturer Specific Data field, where RFDuino puts the user
data of the advertisement packet, can always be found at a specific
location.<br />
<br />
The proper way is, of course, to parse this data format according to the
referenced appendix of the specification, and in this post I will show you
how I implemented it.<br />
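The format itself is simple: the advertisement payload is a sequence of structures, each consisting of a length byte, a type byte and length-1 data bytes, where type 0xFF marks Manufacturer Specific Data whose first two bytes are the little-endian company identifier. As a rough illustration of that walk, here is a simplified Java sketch; it is not the actual AdParser source from the download, and the names are mine:

```java
import java.util.ArrayList;
import java.util.List;

public class AdWalk {
    // One AD structure: a type byte and its payload.
    public static class AdElement {
        public final int type;
        public final byte[] data;
        AdElement(int type, byte[] data) { this.type = type; this.data = data; }
    }

    // Walks the advertisement payload: each structure is
    // [length][type][length-1 data bytes]; a zero length terminates.
    public static List<AdElement> parse(byte[] scanRecord) {
        List<AdElement> out = new ArrayList<>();
        int i = 0;
        while (i < scanRecord.length) {
            int len = scanRecord[i] & 0xFF;   // counts the type byte plus the data
            if (len == 0 || i + 1 + len > scanRecord.length) break;
            int type = scanRecord[i + 1] & 0xFF;
            byte[] data = new byte[len - 1];
            System.arraycopy(scanRecord, i + 2, data, 0, len - 1);
            out.add(new AdElement(type, data));
            i += 1 + len;
        }
        return out;
    }

    // Extracts the 16-bit company identifier (little-endian) from a
    // Manufacturer Specific Data element (AD type 0xFF).
    public static int manufacturer(AdElement e) {
        return (e.data[0] & 0xFF) | ((e.data[1] & 0xFF) << 8);
    }
}
```

With this walk in place, the sensor payload is located by AD type instead of by a hard-coded offset.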
<br />
Here are the three example programs mentioned in this post: <a href="http://pallergabor.uw.hu/androidblog/bleparse/blescan.zip">blescan.zip</a>,
<a href="http://pallergabor.uw.hu/androidblog/bleparse/gassensordemo.zip">gassensordemo.zip</a>, <a href="http://pallergabor.uw.hu/androidblog/bleparse/gas_adv.ino">gas_adv.ino</a><br />
<br />
Let's see first the BLEScan project.<br />
<i><br /></i>
<i><b>Update:</b> the project has been updated to support more of the 4.2 elements. It has also been converted into an Android Studio project but the download material contains only the app/src part of the tree.</i><br />
<br />
<b>Update: </b><i>I was asked by e-mail, how to import the project (in blescan.zip) into Android Studio. Here is a simple process.</i><br />
<ul style="text-align: left;">
<li><i>Create a new project in Android Studio under any name. Make sure that your project supports at least API level 18. Choose the "create no Activity" option.</i></li>
<li><i>Once your project is created, go and find it on the disk. On my Ubuntu system, the project files go under ~/StudioProjects/<ProjectName> where <ProjectName> is the name you gave to your project. We will call this directory <ProjectDir>.</i></li>
<li><i>Go into <ProjectDir>/app/src and delete everything there. Copy blescan.zip into <ProjectDir>/app/src and unzip it. It will create a single directory called "main" and the sources below.</i></li>
<li><i>In Android Studio, do File/Synchronize. After that is completed, you can open your project files, build APK, etc.</i></li>
</ul>
<br />
The parser code is under the
hu.uw.pallergabor.ble.adparser package. You just pass the
scanRecord array to AdParser's parseAdData method like this:<br />
<br />
ArrayList<AdElement> ads = AdParser.parseAdData(scanRecord);<br />
<br />
and then you get an array of objects, each describing an element in the
scan record. These objects can also produce printable representation
like this:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgFCzAAaJ1DMoj6D9BFc-yK3wdkoNsIzVL4lrr01S8Y1WSFUtClGNc2o00924NDReqgKojC0B7H_XwEGJP7Ly_RbbzFvplE_G0xTBo2lwLAOexNwqndISYuCqFZPySIM9U48wrHc8sq71cH/s1600/blescan.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="169" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgFCzAAaJ1DMoj6D9BFc-yK3wdkoNsIzVL4lrr01S8Y1WSFUtClGNc2o00924NDReqgKojC0B7H_XwEGJP7Ly_RbbzFvplE_G0xTBo2lwLAOexNwqndISYuCqFZPySIM9U48wrHc8sq71cH/s1600/blescan.png" width="320" /></a></div>
<br />
<br />
Now let's see how the revised GasSensorDemo project properly parses the
gas sensor measurement out of the scan record. First we parse
the scan packet fields:<br />
<br />
ArrayList<AdElement> ads = AdParser.parseAdData(scanRecord);<br />
<br />
Then we look for a TypeManufacturerData element, which corresponds to a
Manufacturer Specific Data field in BLE. We also check that the
manufacturer field of the Manufacturer Specific Data is 0x0000, because
RFDuino always creates such a Manufacturer Specific Data field
if the application programmer specifies additional
advertisement data.<br />
<br />
AdElement e = ads.get(i);<br />
if( e instanceof TypeManufacturerData ) {<br />
TypeManufacturerData em = (TypeManufacturerData)e;<br />
if( em.getManufacturer() == 0x0000) {<br />
<br />
<br />
It would be tempting to use a custom manufacturer field or, better, a
Service Data field. But then we run into another limitation of RFDuino:
with its default firmware it can only create advertisement packets like
the one in the previous example. This is not necessarily bad, because it
lets the programmer achieve quick success, but later on we will need
more flexibility and that will require another BLE module.</div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com4tag:blogger.com,1999:blog-8214401912480503366.post-51412740389468088272014-11-24T07:22:00.001+01:002014-11-24T07:22:33.330+01:00Connect your Android to the real world with Bluetooth Low Energy<div dir="ltr" style="text-align: left;" trbidi="on">
This is my presentation at Londroid IoT meeting, 2014 nov. 19.<br />
<br />
<br /></div>
<iframe src="//www.slideshare.net/slideshow/embed_code/41845205" width="425" height="355" frameborder="0" marginwidth="0" marginheight="0" scrolling="no" style="border:1px solid #CCC; border-width:1px; margin-bottom:5px; max-width: 100%;" allowfullscreen> </iframe> <div style="margin-bottom:5px"> <strong> <a href="//www.slideshare.net/paller/realworldconnect" title="Connect your Android to the real world with Bluetooth Low Energy" target="_blank">Connect your Android to the real world with Bluetooth Low Energy</a> </strong> from <strong><a href="//www.slideshare.net/paller" target="_blank">Gabor Paller</a></strong> </div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com1tag:blogger.com,1999:blog-8214401912480503366.post-17713569010668682722014-11-21T10:27:00.002+01:002015-11-13T07:17:30.406+01:00Motor boat control with Bluetooth Low Energy<div dir="ltr" style="text-align: left;" trbidi="on">
<a href="http://mylifewithandroid.blogspot.com/2014/09/gas-sensor-prototype-explained.html">My previous post about Bluetooth Low Energy applications with RFDuino and Android presented a connectionless gas sensor.</a> That prototype was based solely on BLE advertisements, no connection was built between the scanner device (Android phone or tablet) and the sensor. While this connection-less operation is advantageous for sensors that just broadcast their measurement data, more complex scenarios that e.g. require authentication or build a communication session cannot be implemented in this model. The prototype I am going to present in this post demonstrates connection-oriented operation between RFDuino and an Android application.<br />
<br />
Watch this video to see what the application is about.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.youtube.com/embed/IWID35cOLZg?feature=player_embedded' frameborder='0'></iframe></div>
<br />
<br />
The story behind this motor boat project is that I bought this RC-controlled model boat while I worked in the UK. But when we moved back to Hungary, I lost the RC controller. So the boat had been unused for years until I realized how great it would be to use an Android device as a controller. Hence I quickly integrated the RFDuino with the motor boat's original control circuitry and wrote the necessary software. As you can see in the video, it has quite respectable range even though I did not dare go into the October water of Lake Balaton where the second part of the video was shot (water temperature: some 10 degrees centigrade).<br />
<br />
First about the "hardware". I did not have the circuit schema of the original RC controller in the boat so I had to experiment a bit. By following the motors' cables I quickly found two three-legged stocky elements that looked like switching transistors (although the labels on them were no longer readable after all those years in service). I removed one end of the resistors that I thought connected the bases of these transistors to the rest of the RC control circuit and tried out how much current was needed to switch on the motors. To my pleasant surprise, 1 mA was enough, so I rather believe that these are actually not transistors but power switching ICs. Anyway, RFDuino outputs can provide 1 mA of switching current, so I just connected the other ends of those removed resistors to two spare RFDuino I/O ports. Lo and behold, it worked: if RFDuino raises either of these pins to 1, the respective motor starts. One minor additional problem was the power supply of the RFDuino. The motor boat employs a 7.2 V battery and RFDuino needs 3.3 V, so I added an LM1117-3.3V power regulator circuit between the battery and the RFDuino and the "hardware" was ready.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiQCGE08tiCOMNd0UyoinC_HikHKrKbe1oCUvfpmcHW3fnih2r1dtDnuOKzZJx35QwRJ9iVuOE9kQKBXgZd6BvgFHrnRwRe4DpkyzI5ezxETVEtrPZhzjEk4ai7eoxn2KQgMjtlRYW-cvDB/s1600/motorboat_int.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiQCGE08tiCOMNd0UyoinC_HikHKrKbe1oCUvfpmcHW3fnih2r1dtDnuOKzZJx35QwRJ9iVuOE9kQKBXgZd6BvgFHrnRwRe4DpkyzI5ezxETVEtrPZhzjEk4ai7eoxn2KQgMjtlRYW-cvDB/s1600/motorboat_int.jpg" width="209" /></a></div>
<br />
<br />
Do you know about BLE concepts like services and characteristics? If not, <a href="http://www.slideshare.net/paller/realworldconnect">please read this presentation for a quick introduction.</a> In short: BLE services (also called GATT profiles) are composed of characteristics, which are key-value pairs decorated with meta-information that the BLE specification calls descriptors. RFDuino with its default firmware cannot implement any standard GATT profile, only its own custom GATT profile. This is a major disadvantage in product-level development but makes RFDuino code super-easy, because the programmer does not have to deal with BLE details. In the RFDuino custom service, a "read" and a "write" characteristic are defined. Whatever the client (in our case the Android application) writes into the "write" characteristic appears to the RFDuino code as an incoming-data callback. If the RFDuino code calls the RFduinoBLE.send(v) method, the data appears in the "read" characteristic. The Android client can register a callback for data manipulations of the "read" characteristic, so it receives a callback whenever the RFDuino code invokes RFduinoBLE.send(). There are additional callbacks for service connections and disconnections.<br />
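One practical detail for the Android side: the RFDuino custom service and its characteristics are commonly documented with the 16-bit values 0x2220 (service), 0x2221 ("read"), 0x2222 ("write") and 0x2223 ("disconnect") - treat these as an assumption and check your RFDuino documentation. Such 16-bit identifiers are expanded into the full 128-bit UUIDs that the Android API expects by embedding them into the Bluetooth Base UUID, which a few lines of Java can do:

```java
import java.util.UUID;

public class Uuid16 {
    // The Bluetooth Base UUID, 0000xxxx-0000-1000-8000-00805F9B34FB;
    // the 16-bit identifier is embedded at bits 96-111 of the MSB half.
    private static final long BASE_MSB = 0x0000000000001000L;
    private static final long BASE_LSB = 0x800000805F9B34FBL;

    public static UUID from16(int shortUuid) {
        return new UUID(BASE_MSB | ((long) shortUuid << 32), BASE_LSB);
    }
}
```

The same expansion applies to standard services too, e.g. the 0x1805 CTS UUID from the first post.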
<br />
You can download the <a href="http://pallergabor.uw.hu/androidblog/motorboat.zip">Android project </a>and the <a href="http://pallergabor.uw.hu/androidblog/motor_boat.ino">RFDuino source</a> here.<br />
<br />
First about the RFDuino code. Beside the familiar setup() and loop() functions, you will see three functions characteristic of RFDuino: RFduinoBLE_onConnect() is called when a client connects to the RFDuino BLE service, RFduinoBLE_onDisconnect() on disconnection, and RFduinoBLE_onReceive() when there is incoming data. Only one complication in the code requires explanation: this is a powerful motor boat and it can go out of BLE radio range very quickly. In early versions the boat became uncontrollable in this case, simply continuing with the last command received. That was not a nice feature, so I implemented a heartbeat message that is sent every 5 seconds. The loop() method starts by sending the heartbeat to the client and goes to sleep. It wakes up after 5 seconds and if it finds that the client did not send back the heartbeat, it stops the motors. Otherwise the client just sends a bit mask of which motors to stop or start, and the RFDuino responds with the same mask, informing the client that the motors were indeed started or stopped. This means that the arrows on the user interface showing which motors are running represent the actual state of the boat, which is advantageous if something goes wrong.<br />
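The failsafe logic can be modeled like this - a hypothetical Java sketch of the behavior described above; the bit assignments and names are mine and the actual encoding in motor_boat.ino may differ:

```java
public class BoatProtocol {
    // Hypothetical bit assignments for the motor command mask.
    public static final int MOTOR_LEFT  = 0x01;
    public static final int MOTOR_RIGHT = 0x02;

    private long lastHeartbeatMillis;
    private int motorMask;

    // The client echoes the heartbeat; remember when we last heard it.
    public void onHeartbeatEcho(long nowMillis) { lastHeartbeatMillis = nowMillis; }

    // The client sends a bit mask of which motors should run.
    public void onCommand(int mask) { motorMask = mask & (MOTOR_LEFT | MOTOR_RIGHT); }

    // Called after each 5-second sleep: if no echo arrived within the
    // period, stop the motors so the boat cannot run away out of range.
    // Returns the mask that is echoed back to the client.
    public int tick(long nowMillis) {
        if (nowMillis - lastHeartbeatMillis > 5000) motorMask = 0;
        return motorMask;
    }
}
```

The key design choice is that the boat, not the phone, enforces the timeout, so radio loss always fails toward "motors off".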
<br />
Then about the Android side. The code starts by discovering the device. Once the device is discovered, we connect to it with the connectGatt() method, then discover its services with the gatt.discoverServices() method. Once the service discovery callback arrives, we retrieve the RFDuino custom service (getService()) and obtain the characteristic handles (getCharacteristic()). We write the "read" characteristic's client configuration descriptor to enable notifications from the RFDuino server to the Android client, so that we get a callback when the RFDuino side sends something to us.<br />
<br />
Disconnection is worth detailing because there's an RFDuino speciality here. Normally, one can just disconnect from the service with the disconnect() method invocation. RFDuino however is left in a limbo state in this case: the BLE session is disconnected but the RFDuino application does not receive a callback and cannot accept a new connection request. The "disconnect" characteristic has to be written to (the value does not matter) for the RFDuino server to properly disconnect.<br />
<div>
<br /></div>
</div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com0tag:blogger.com,1999:blog-8214401912480503366.post-73241178664372006602014-10-03T15:53:00.000+02:002014-10-07T21:21:28.319+02:00Award for our Android Gas Sensor <div dir="ltr" style="text-align: left;" trbidi="on">
I just got a mail that <a href="http://mylifewithandroid.blogspot.hu/2014/09/android-gas-sensor-application-with.html">our gas sensor entry </a>(a gas sensor with Bluetooth Low Energy connectivity and the associated Android application) has just won 3rd place on the <a href="http://www.semiconductorstore.com/blog/2014/We-Know-RFduino-Video-Contest-Winners/864">We Know RFDuino contest</a>. Thanks to everyone who viewed our video and thus helped us to compete successfully! Meanwhile the <a href="http://mylifewithandroid.blogspot.hu/2014/09/gas-sensor-prototype-explained.html">source code of the prototype</a> was made open source so you may want to check out that too!</div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com0tag:blogger.com,1999:blog-8214401912480503366.post-81800618253378363622014-09-29T21:22:00.002+02:002014-09-29T21:22:52.123+02:00Gas sensor prototype explained<div dir="ltr" style="text-align: left;" trbidi="on">
The "<a href="http://www.semiconductorstore.com/pages/Promo_Landing/2014/RFduino_Contest_2014_Summer.asp">We know RFDuino</a>" contest has not ended yet but its end is sufficiently close so that I can explain <a href="http://mylifewithandroid.blogspot.hu/2014/09/android-gas-sensor-application-with.html">our prototype application.</a> Our entry is a Bluetooth Low Energy-connected gas sensor and it is presented in the video below. Make sure that you watch it, you help us win the competition.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.youtube.com/embed/iYWIzbJK81U?feature=player_embedded' frameborder='0'></iframe></div>
<br />
<br />
The prototype demonstrates a unique capability of Bluetooth Low Energy device advertisement messages: user data can be embedded into these broadcasts. This comes in handy if you just want to send out some measurement data to whoever cares to listen, without creating a session between the BLE client and server. This broadcast-type data transfer can support an unlimited number of clients with very low energy consumption on the sensor side.<br />
<br />
<a href="http://pallergabor.uw.hu/androidblog/gassensor_android.zip">Click here to download the Android client application project.</a><br />
<br />
<a href="http://pallergabor.uw.hu/androidblog/gas_adv.ino">Click here to download the RFDuino source code.</a><br />
<br />
The prototype works as follows. The microcontroller presented in the video measures the Lower Explosion Limit (LEL) and sends this value to the RFDuino microcontroller over a super-simple serial protocol. A message of this protocol looks like this:<br />
<br />
0xA5 <seq_no> <LEL%><br />
<br />
where seq_no is an increasing sequence number and LEL% is the measured Lower Explosion Limit value. The microcontroller code is not shared here but you get the idea. The RFDuino code receives the LEL% value over the serial port it creates on GPIO pins 3 and 4, builds a custom data structure for BLE advertisements consisting of the site ID and the LEL% value, then starts advertising. This is performed cyclically, so the LEL% value in the sensor's BLE advertisement is updated every second.<br />
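To make the framing concrete, here is a small Java sketch of encoding and checking such a 3-byte message; the helper class is mine, for illustration, and the real receiver lives in the RFDuino sketch:

```java
public class GasMessage {
    public static final int HEADER = 0xA5;

    // Builds the 3-byte frame: header, sequence number, LEL percentage.
    public static byte[] encode(int seqNo, int lelPercent) {
        return new byte[] { (byte) HEADER, (byte) seqNo, (byte) lelPercent };
    }

    // Returns the LEL% value, or -1 if the frame is malformed.
    public static int decodeLel(byte[] frame) {
        if (frame.length != 3 || (frame[0] & 0xFF) != HEADER) return -1;
        return frame[2] & 0xFF;
    }
}
```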
<br />
Now let's see what happens on the Android side. This is a non-trivial application with multiple activities, but the Real Thing (TM) happens in MapScreenActivity, in the onLeScan method. This method is called every time the Android device's BLE stack discovers a device. Here we check whether the device's name is "g" (this is how we identify our sensor) and retrieve the LEL% data from the advertisement packet. We also use the Received Signal Strength Indicator (RSSI) value for proximity indication. Bluetooth device discovery is restarted every 2 seconds so that we can retrieve the latest LEL% value. The rest is just Plain Old Android Programming.<br />
<br />
The identification of the sensor and the encoding of the sensor data are obviously very naive, but that is not really the point. You can make them as complex as you like; e.g. you can protect the sensor data with a hash and place that hash into the advertisement too, so that the receiver can make sure that it gets data from an authorized sensor and not a fake one. The important thing is that the entire framework is flexible enough for relatively complex functionality to be implemented, and RFDuino really simplifies sensor programming a lot.<br />
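One way to realize the hash protection mentioned above is a keyed hash (HMAC) truncated to fit the small advertisement payload. A hedged Java sketch of the idea - the key handling and tag length are illustrative choices of mine, not part of the prototype:

```java
import java.util.Arrays;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class SensorAuth {
    // Computes a truncated HMAC-SHA256 tag over the advertised payload so a
    // receiver sharing the key can reject data from unauthorized sensors.
    public static byte[] tag(byte[] key, byte[] payload, int tagLen) {
        try {
            Mac mac = Mac.getInstance("HmacSHA256");
            mac.init(new SecretKeySpec(key, "HmacSHA256"));
            return Arrays.copyOf(mac.doFinal(payload), tagLen);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    // The receiver recomputes the tag and compares it to the advertised one.
    public static boolean verify(byte[] key, byte[] payload, byte[] tag) {
        return Arrays.equals(tag, tag(key, payload, tag.length));
    }
}
```

Note that a truncated tag in a broadcast only raises the bar (replay is still possible); it does not make the channel fully secure.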
<br />
If you enjoyed the example application, make sure you watch the video (many times if possible :-)) and if you happen to be in London on 2014 November 19, you might as well come to the <a href="http://www.meetup.com/android/events/208993142/">Londroid meetup</a> where I present this and another BLE project (a connection-oriented one, called MotorBoat).</div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com0tag:blogger.com,1999:blog-8214401912480503366.post-37653323858512035742014-09-06T12:08:00.003+02:002014-09-06T20:25:48.692+02:00Camera shot on charger connection<div dir="ltr" style="text-align: left;" trbidi="on">
Somebody came to me with an idea whether a cheap Android phone can be turned into an automatic camera. Some external sensor would send a signal to the phone and the phone would take a picture automatically. We started to discuss the possible connection of the external sensor and an interesting idea came up: the charger connection.<br />
<br />
Android delivers an event whenever the charging power is connected or disconnected: can it be used to send a binary signal to an application in a very simple way, without fiddling with Bluetooth or USB?<br />
<br />
<a href="http://pallergabor.uw.hu/androidblog/shotonpower.zip">Click here to download the example application.</a><br />
<br />
You have to start the application once. Then whenever you connect the charger, it takes a picture. When the application is in the foreground, a preview is shown but as long as the application is active (not destroyed) it works from the background too.<br />
<br />
Here are the experiences:<br />
<br />
<ul style="text-align: left;">
<li>On my high-end device the application reacted quickly to charger connection, the reaction time from connecting the charger to the camera shot was less than a second. But when the application was tested on the very low-end Android target device, the picture was much less rosy: the delay increased to 3-4 seconds, effectively making the solution unusable.</li>
<li>In order for this application to work, it has to be started at least once manually. This pretty much kills all unattended use cases.</li>
<li>The shutter sound is almost impossible to remove. <i><b>Update:</b> on certain devices (Nexus 4 and Nexus 7 confirmed) there is no shutter sound in silent mode.</i></li>
</ul>
<div>
The takeaway for us was to reject the idea. But I share the example program anyway, maybe it can be useful for somebody.</div>
<div>
<br /></div>
<div>
One last thing. <b><i><a href="https://www.youtube.com/watch?v=iYWIzbJK81U">View our video about our Bluetooth Low Energy sensor application prototype</a> and help us win the <a href="http://www.semiconductorstore.com/pages/Promo_Landing/2014/RFduino_Contest_2014_Summer.asp">"We Know RFDuino" contest</a>!</i></b></div>
<div>
<br /></div>
</div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com2tag:blogger.com,1999:blog-8214401912480503366.post-27506106503640898312014-09-02T22:33:00.001+02:002014-09-02T22:33:22.243+02:00Android gas sensor application with Bluetooth Low Energy/RFDuino <div dir="ltr" style="text-align: left;" trbidi="on">
I have always had a fascination with sensors linked up with mobile devices, so a competition seemed like a good opportunity to try out the latest fashionable technology in the area, Bluetooth Low Energy. SemiconductorStore.com announced the <a href="http://www.semiconductorstore.com/pages/Promo_Landing/2014/RFduino_Contest_2014_Summer.asp">"We know RFDuino" competition</a> for applications of the RFDuino module. RFDuino is an Arduino module with Bluetooth Low Energy (BLE) support. It is ideal as an interface between a sensor and a BLE-enabled mobile device like the Nexus 7.<br />
<br />
Eventually I will publish the entire source code of this prototype application on this blog. But as this is a contest, I will wait until the contest ends (Sept. 30). Till then, watch the (very amateurish) video we have prepared about our sensor and the Android application. The entry with the most views wins the contest so if you like the concept, share the video with others! Thanks in advance. :-)<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.youtube.com/embed/iYWIzbJK81U?feature=player_embedded' frameborder='0'></iframe></div>
<br /></div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com0tag:blogger.com,1999:blog-8214401912480503366.post-91784464952435419012014-03-11T21:42:00.001+01:002015-11-13T07:20:37.578+01:00Beyond RenderScript - parallelism with NEON<div dir="ltr" style="text-align: left;" trbidi="on">
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
My last post about <a href="http://mylifewithandroid.blogspot.com/2014/02/renderscript-in-android-parallel-version.html">the parallel implementation</a> of the <a href="http://mylifewithandroid.blogspot.com/2014/01/renderscript-in-android-benchmark.html">Dynamic Time Warping (DTW)</a>
algorithm was a disappointment. The RenderScript runtime executed the
parallel implementation significantly slower than the single-core
implementation (also implemented with RenderScript). It turned out that
parallelizing the processing of 10000-50000 element vectors on multiple
cores was not worth the cost of multi-thread processing and all the
overhead that comes with it (threads, semaphores, etc.). Each core must
be allocated a significantly larger workload, but our DTW algorithm
cannot generate such a large, independent workload because the rows of
the DTW matrix depend on each other. So in order to exploit RenderScript
multi-core support, it is best to have an algorithm whose output
depends only on the input and not on some intermediate result, because
this type of algorithm can easily be sliced up across multiple cores.<br />
<br />
It would have been such a waste to discard our quite complicated
parallel processing DTW algorithm so I turned to other means of parallel
execution. Multi-core is one option but the ARM processors in popular
Android devices have another parallel execution engine, internal to the
core, the NEON execution engine. One NEON instruction is able to process
4 32-bit integers in parallel (see picture below). Can we speed up DTW
fourfold with this option?<br />
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhZeFOSJBTIb0zpNloSZ2mUAEtFqzZDrX82SoPcsD_fMvJSbZvcCiDpb5gaf3PJf8adB6YEXijBHcmq1n5qq2bTmCutgLrZM7mWrcOa4ws-4V8__Vbv5FAFL8Z1nivP-5VTGOohJwY2E29m/s1600/neon.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="248" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhZeFOSJBTIb0zpNloSZ2mUAEtFqzZDrX82SoPcsD_fMvJSbZvcCiDpb5gaf3PJf8adB6YEXijBHcmq1n5qq2bTmCutgLrZM7mWrcOa4ws-4V8__Vbv5FAFL8Z1nivP-5VTGOohJwY2E29m/s1600/neon.jpg" width="400" /></a><br />
<br />
NEON is actually quite an old technology; even the Nexus One was equipped
with it. It is therefore much more widely deployed than multi-core CPUs.
While ordinary applications can take advantage of multi-core CPUs (e.g.
two processes can execute in parallel on two cores), NEON programs are
difficult to write. Although some compilers claim the ability to
generate NEON code and template libraries are available, the experience
is that the potential performance benefits cannot be exploited without
hand-coding in assembly, and that's not for the faint-hearted.<br />
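To give a flavour of the data-parallel model without dropping all the way to assembly, here is a sketch using GCC/Clang vector extensions; on ARM the compiler lowers the vector addition to a single NEON instruction operating on four 32-bit lanes. This portable sketch is only an illustration, not the code in jni/cpucore.c.

```c
#include <stdint.h>
#include <string.h>

// A 128-bit vector type holding four 32-bit integers; one vector
// addition processes all four lanes in parallel (VADD.I32 on NEON).
typedef int32_t i32x4 __attribute__((vector_size(16)));

void add4(const int32_t *a, const int32_t *b, int32_t *out, int n) {
    int i = 0;
    for (; i + 4 <= n; i += 4) {        // vectorized main loop
        i32x4 va, vb;
        memcpy(&va, a + i, sizeof va);  // memcpy avoids alignment traps
        memcpy(&vb, b + i, sizeof vb);
        i32x4 vo = va + vb;             // four additions at once
        memcpy(out + i, &vo, sizeof vo);
    }
    for (; i < n; i++)                  // scalar tail
        out[i] = a[i] + b[i];
}
```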
<br />
<a href="http://pallergabor.uw.hu/androidblog/samplerecognizer_neon.zip">The example program can be downloaded from here.</a><br />
<br />
The relevant functions are in jni/cpucore.c. There are three
implementations, processNativeSlow, processNative and processNativeNEON,
each progressively more optimized than the previous one. The
processNativeSlow and processNative functions are in C; in
processNativeNEON the most time-critical loop (the "tight loop") is entirely
implemented in mixed ARM/NEON assembly. This tight loop produces 4
result elements in parallel, so we expect a huge performance gain over the
single-core RenderScript implementation (dtw.rs).<br />
<br />
The actual experience is completely different. While the NEON implementation is
significantly faster on small data sets, one second of voice is 8000
samples, so data sizes grow quickly. On 10-second data sets (80000
samples, a 6.4 billion element DTW matrix) the simple nested-loop C99
implementation and the complex, hard-to-understand NEON implementation
produce about the same execution time.<br />
<br />
How is this possible? Let's take the example of 10-second reference and
evaluation samples. Each is 80000 elements long, so 80000*80000 = 6.4 billion
matrix cells have to be calculated. Calculating one cell generates 20
bytes of memory traffic: reading 2 input samples (2 bytes each), reading
3 neighbor cells (4 bytes each) and storing the result (4 bytes). <a href="https://play.google.com/store/apps/details?id=com.a1dev.sdbench">A1 SD Bench</a>
measures 800 Mbyte/sec copying performance on my Galaxy Nexus (and
similar values on the two cheap Android tablets that the family has). A
copy means 2 accesses per byte (one read and one write); for
simplicity, let's assume that reads and writes take about the same time,
which gives roughly 1600 Mbyte/sec of raw memory traffic. According to
this very rough calculation, the memory accesses alone take about 80 sec
(128 Gbytes of traffic at 1.6 Gbyte/sec). The real execution time is about
120 sec; the difference can be explained by the simplifications. The cache
does not really help because of the large data size. The performance is
determined by RAM speed, and the simplest single-core implementation
already hits that bottleneck. All the wizardry with parallelism is
futile.<br />
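The back-of-the-envelope arithmetic can be written out as a small C helper (the function name and parameters are mine, purely for illustration; the 800 Mbyte/sec figure is the measured copy bandwidth, and a copy both reads and writes each byte):

```c
#include <assert.h>

// Rough estimate of the time spent on memory traffic alone.
// copy_bw is the measured copy bandwidth in bytes/sec; a copy both
// reads and writes, so the raw one-way bandwidth is twice that.
double estimated_memory_seconds(double samples, double bytes_per_cell,
                                double copy_bw) {
    double cells = samples * samples;   // size of the DTW matrix
    return cells * bytes_per_cell / (2.0 * copy_bw);
}
```

With samples = 80000, bytes_per_cell = 20 and copy_bw = 800e6 this yields 80 seconds, matching the estimate above.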
<br />
Obviously the case was not helped by the choice of DTW as the benchmark
algorithm, which intentionally does not fit into the class of
algorithms normally used to demonstrate the benefits of parallel
processing. Grayscale conversion would be a better fit (one read, one write
and 3 multiplications per pixel, i.e. much more computation per byte of
memory traffic). But this means that you actually have
to be rather lucky with your algorithms for these parallel options to
speed up your code significantly. Even then, it is worth looking at the
parallel options inside the core before going multi-core. And you
definitely should not forget the auxiliary costs of parallel
computation, e.g. distributing/gathering the data to/from the parallel
processing units, or whether other hardware (e.g. memory) is able to keep
pace with the CPU.<br />
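For contrast, a grayscale kernel looks like this in C; each output pixel depends only on one input pixel, so the loop can be sliced across cores or SIMD lanes at will (a minimal sketch; the 77/150/29 weights are a common fixed-point approximation of the ITU-R 601 luma coefficients):

```c
#include <stdint.h>

// Each output pixel depends only on the corresponding input pixel,
// so this loop can be split across cores or lanes in any way.
// Fixed-point luma: (77*R + 150*G + 29*B) / 256, weights sum to 256.
void grayscale(const uint8_t *rgb, uint8_t *gray, int npixels) {
    for (int i = 0; i < npixels; i++) {
        const uint8_t *p = rgb + 3 * i;
        gray[i] = (uint8_t)((77 * p[0] + 150 * p[1] + 29 * p[2]) >> 8);
    }
}
```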
<br />
One wild idea at the end: could the RenderScript computation model be used
to generate NEON code? With some limitations, the answer is probably
yes.</div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com19tag:blogger.com,1999:blog-8214401912480503366.post-64620285871774451592014-02-07T21:35:00.000+01:002015-11-13T09:21:06.291+01:00RenderScript in Android - the parallel version<div dir="ltr" style="text-align: left;" trbidi="on">
<a href="http://mylifewithandroid.blogspot.hu/2014/01/renderscript-in-android-anatomy-of.html">In the previous post I promised to revisit the parallel case. </a>The big
promise of RenderScript is to exploit parallelism among the different CPUs,
GPUs and DSPs in the device at no additional cost. Once the algorithm is
properly transformed into a parallel version, the RenderScript runtime
grabs whatever computing devices are available and schedules the subtasks
automatically.<br />
<br />
The problem with <a href="http://mylifewithandroid.blogspot.hu/2014/01/renderscript-in-android-benchmark.html">DTW</a> is that it is not trivial to parallelize. Each
cell at (x,y) depends on the cells at (x-1,y), (x-1,y-1) and (x,y-1).
By traversing the matrix horizontally or vertically, at most two rows
(one horizontal and one vertical) can be evaluated in parallel.<br />
<br />
Michael Leahy <a href="http://membres-liglab.imag.fr/termier/ParallelDMWorkshop/WorkshopNotes_PDM11.pdf#page=17">recommended a paper</a>
that solves this problem. The algorithm traverses the matrix
diagonally: each diagonal row depends on the two previous diagonal rows,
but cells within one diagonal row don't depend on each other. One diagonal
row can then be fed to RenderScript to iterate over. The picture
below illustrates the concept.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiqrKVsOUbistS1TSWMz3tJ2mOTLMMgOWycYJuPw7x4EhfuEL2kfNTDIVcIbhd2d23R5Vz7nzYF4NAjuxLNMVQKYRkxsfTm-ro8aP5zbKts5jf87c169iuQZrrdVB1-WTU3_RJlXz5XFXCa/s1600/dtw.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiqrKVsOUbistS1TSWMz3tJ2mOTLMMgOWycYJuPw7x4EhfuEL2kfNTDIVcIbhd2d23R5Vz7nzYF4NAjuxLNMVQKYRkxsfTm-ro8aP5zbKts5jf87c169iuQZrrdVB1-WTU3_RJlXz5XFXCa/s1600/dtw.jpg" width="221" /></a></div>
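The diagonal traversal can be sketched in plain C as follows (a simplified sketch of the paper's idea, not the code in dtwparallel2.rs): diagonal d holds exactly the cells with x + y == d, and the inner loop over a diagonal carries no dependency, so it is the part that could be handed to forEach().

```c
#include <stdint.h>

// Illustrative local distance between two samples.
static int32_t cost(int16_t a, int16_t b) {
    int32_t d = (int32_t)a - (int32_t)b;
    return d < 0 ? -d : d;
}

// Anti-diagonal DTW: diagonal d holds the cells with x + y == d.
// Cells within one diagonal depend only on the two previous
// diagonals, so the inner x-loop is free of dependencies.
void dtw_diagonal(const int16_t *s1, int len1,
                  const int16_t *s2, int len2, int32_t *m) {
    for (int d = 0; d <= len1 + len2 - 2; d++) {
        int xmin = d < len2 ? 0 : d - len2 + 1;
        int xmax = d < len1 ? d : len1 - 1;
        for (int x = xmin; x <= xmax; x++) {   // parallelizable loop
            int y = d - x;
            int32_t best;
            if (x == 0 && y == 0) best = 0;
            else {
                best = INT32_MAX;
                if (x > 0 && m[y * len1 + x - 1] < best)
                    best = m[y * len1 + x - 1];
                if (y > 0 && m[(y - 1) * len1 + x] < best)
                    best = m[(y - 1) * len1 + x];
                if (x > 0 && y > 0 && m[(y - 1) * len1 + x - 1] < best)
                    best = m[(y - 1) * len1 + x - 1];
            }
            m[y * len1 + x] = cost(s1[x], s2[y]) + best;
        }
    }
}
```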
<br />
<br />
The example program <a href="http://pallergabor.uw.hu/androidblog/samplerecognizer_optimized.zip">can be downloaded from here</a>.<br />
<br />
You will notice that there are two parallel implementations.
findReferenceSignalC99Parallel() is the "proper" implementation that
closely follows the RenderScript tutorial. Here the diagonal rows are
iterated in Java and only the parallel kernel is implemented in
RenderScript. This version, even though it is functional, is not
invoked by default because it delivers completely unacceptable
performance on my 2-core Galaxy Nexus. By looking closely at the
execution times, I concluded that even though RenderScript runtime
invocations (copying into Allocations and invoking forEach) are
normally fast, sometimes very innocent-looking invocations (like copying
5 integers into an Allocation) can take about a second. This completely
ruined this implementation's performance.<br />
<br />
The other parallel implementation, which is actually invoked and whose
performance is compared to the 1-core RenderScript implementation (the
fastest one so far), is findReferenceSignalC99ParallelRSOnly(). This version is
implemented entirely in RenderScript. Unfortunately its performance is
2-2.5 times <i><b>slower</b></i> than the 1-core implementation. How can that be?<br />
<br />
First, if you compare dtw.rs and dtwparallel2.rs, you will notice that
the parallel implementation is considerably more complex. Indexing out
those varying-length diagonal rows takes a bit of fiddling, while the
1-core implementation can take advantage of fast pointer arithmetic
to move from cell to cell sequentially. So the parallel implementation
starts with a handicap, and that handicap is not compensated for by the 2 cores of the Galaxy Nexus.<br />
<br />
OK, the Galaxy Nexus is the stone age, but what happens on a 4-core processor
like the one in a Nexus 4? The runtime does launch with 4 cores, but then the
Adreno GPU driver kicks in and the result is that the parallel
implementation is about 3 times slower than the serial one. What happens
in the driver, I don't know; as far as I can see, its source code is
not available.<br />
<br />
Jason Sams recommended disabling the GPU driver with<br />
<span style="font-family: &quot;courier new&quot; , &quot;courier&quot; , monospace; font-size: x-small;">adb shell setprop debug.rs.default-CPU-driver 1</span><br />
but I decided to stop my adventures here. The conclusion I drew for
myself is that RenderScript in its present form is not ready for
parallel programming. Clang/LLVM is a very promising compilation
technology, but the parallel runtime suffers from a number of problems.
IMHO, there should be a way to programmatically control how
workloads are allocated to CPUs/GPUs. Until then, if you want to harness
the power of your multicore processor, code the parallel runtime
yourself, using RenderScript for the serial code if you wish.</div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com4tag:blogger.com,1999:blog-8214401912480503366.post-47413771237900930412014-01-30T22:05:00.000+01:002015-11-13T09:25:53.554+01:00RenderScript in Android - anatomy of the benchmark program<div dir="ltr" style="text-align: left;" trbidi="on">
<a href="http://mylifewithandroid.blogspot.com/2014/01/renderscript-in-android-java.html">In the previous post I presented our RenderScript benchmark</a> and
demonstrated that the RenderScript implementation of the same algorithm
can be 2-3 times faster than Java. How can a "script" be so fast? In
order to understand this speed difference, let's see how the
RenderScript fragment is executed.<br />
<br />
<a href="http://pallergabor.uw.hu/androidblog/samplerecognizer_optimized.zip">The example program is available here</a>.<br />
<br />
First, let's see what the script looks like. The source can be found in
dtw.rs, in the same directory as the other Java sources (just one file in
this case).<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgx0IeuEfher8aWhhq2gkp1V_aYTM2qxGhgA4zrcDQwkbY-UY2R8uz8kXGbMs0IViQMlPLzwNAuI-zV_UqGsxXwhYRdtYksW7SOYTtXJp_f_Q4IEq6_h8OLm7vzzKKIeViINUMi5ybYFQly/s1600/dtwrs.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgx0IeuEfher8aWhhq2gkp1V_aYTM2qxGhgA4zrcDQwkbY-UY2R8uz8kXGbMs0IViQMlPLzwNAuI-zV_UqGsxXwhYRdtYksW7SOYTtXJp_f_Q4IEq6_h8OLm7vzzKKIeViINUMi5ybYFQly/s1600/dtwrs.jpg" /></a></div>
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
It looks like an innocent C function, but there are some
specialties. Global variables like these:<br />
<br />
<span style="font-family: "courier new" , "courier" , monospace; font-size: x-small;">
int32_t s2len = 0;<br />
int32_t *d0;</span><br />
<br />
can be used to pass data to the script. The toolchain generates a Java
wrapper for each .rs file; ours is called ScriptC_dtw.java. In order to
set s2len, for example, one calls the set_s2len method of the
ScriptC_dtw class. The d0 global variable is a pointer type; setting
this variable requires an Allocation Java object. Open MainActivity.java
and look up the findReferenceSignalC99() method. There you will find:<br />
<br />
<span style="font-family: "courier new" , "courier" , monospace; font-size: x-small;"> Allocation signal1Allocation = Allocation.createSized(<br />
rsC,<br />
Element.I16(rsC),<br />
refSignal.length);<br />
signal1Allocation.copyFrom(refSignal);<br />
script.bind_signal1(signal1Allocation);</span><br />
<br />
Here we created an allocation that holds 16-bit integers, copied the
input signal into it and bound the allocation so that the allocation's
data is available to the script. When the script is invoked, the data
area of this allocation is simply available to the script as:<br />
<br />
<span style="font-family: "courier new" , "courier" , monospace; font-size: x-small;">
int16_t *signal1;</span><br />
<br />
This sort of parameter passing is one-way for simple values but two-way
for Allocations. Whatever your script writes into e.g. s2len,
you won't be able to read back in the Java layer after the script finishes
executing. In contrast, Allocations provide two-way data transfer;
that's why the result value is passed back to Java in an Allocation.<br />
<br />
In MainActivity.java:<br />
<br />
<span style="font-family: "courier new" , "courier" , monospace; font-size: x-small;"> Allocation rAllocation = Allocation.createSized(<br />
rsC,<br />
Element.I32(rsC),<br />
1);<br />
script.bind_r(rAllocation);</span><br />
<br />
In dtw.rs:<br />
<br />
<span style="font-family: "courier new" , "courier" , monospace; font-size: x-small;"> *r = d1[s1len-1];</span><br />
<br />
And again in MainActivity.java after the script finished executing:<br />
<br />
<span style="font-family: "courier new" , "courier" , monospace; font-size: x-small;"> int result[] = new int[1];<br />
rAllocation.copyTo(result);<br />
...<br />
int maxc = result[0];</span><br />
<br />
The execution of the script seems simple enough, but there is more to it than meets the eye.<br />
<br />
First, the execution context and the script instance are created. <br />
<br />
<span style="font-family: "courier new" , "courier" , monospace; font-size: x-small;"> RenderScript rsC = RenderScript.create( this);<br />
ScriptC_dtw script = new ScriptC_dtw( <br />
rsC, <br />
getResources(), <br />
R.raw.dtw);</span><br />
<br />
ScriptC_dtw is the wrapper that the RenderScript toolchain
generated. But what is R.raw.dtw? Let's see how our "script" was turned
into executable code. If you unzip the APK file, you find some
interesting artifacts. Under the res/raw directory you find dtw.bc;
this is the LLVM bytecode that dtw.rs was compiled to. In addition,
under the lib directory you will find .so files for the ARM, MIPS and
Intel platforms. If you disassemble librs.dtw.so, you will find a highly
optimized binary compiled from our script, which is really a piece of
valid C code.<br />
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhx8yZsnqbsZz0Bnmkvmh60nlV9eGcgOmB-8Mm5atkE2gmhLeRxf8UF_-hVWL75tlLHK0a74CFQHdKAgEsVTnGJ_J0blsvRNbt25r2vEgSVNTVwzEkX96AbLozERUo2w1P7Y76CVyycmvEa/s1600/samplerecognizer_apk.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhx8yZsnqbsZz0Bnmkvmh60nlV9eGcgOmB-8Mm5atkE2gmhLeRxf8UF_-hVWL75tlLHK0a74CFQHdKAgEsVTnGJ_J0blsvRNbt25r2vEgSVNTVwzEkX96AbLozERUo2w1P7Y76CVyycmvEa/s1600/samplerecognizer_apk.jpg" width="640" /></a><br />
<br />
The RenderScript name generates confusion. It evokes a
proprietary scripting language when in fact RenderScript is Clang's C99 front-end
compiler with a set of libraries that are ported to a large number of
processors. Optimized C code is fast; what is so surprising about that? When
our "script" is executed on an ARM processor, the RenderScript runtime
just has to load the precompiled ARM code and execute it. If it turns
out that there is no precompiled native code for the target processor
(e.g. it is a GPU), then the LLVM back-end compiler swings into action and
generates code for that processor at installation time. Both compilation
steps (from C to LLVM bytecode and from LLVM bytecode to native) are
subject to optimization, so the resulting native code is very fast. No
wonder that RenderScript beats the Dalvik VM so easily and with
such a large margin.<br />
<br />
After all the global variables have been initialized, the script can be invoked.<br />
<br />
<span style="font-family: "courier new" , "courier" , monospace; font-size: x-small;"> script.invoke_dtw();<br /> rsC.finish();</span><br />
<br />
Note the finish() invocation here. The invoke_dtw() method is
asynchronous, meaning that when it returns, the execution of the "script"
has not finished; in fact, it may not even have started. The finish() method
on the RenderScript instance blocks until all script invocations on that context finish. Script invocations in the same context are executed sequentially.<br />
<br />
But what happens when more than one context is created? Allocations and
script executions in different contexts are independent. If you have enough
cores/processors, script invocations in those contexts will execute in
parallel. Be aware, however, that if you create more contexts than the
number of processing units you have, those contexts will compete for
the same processing units by means of context switching, and these
context switches will eventually decrease your performance. If your
algorithm requires an element scan more complicated than the
sequence that forEach() supports, you can always create a dummy
allocation with as many elements as there are independent work items in
your algorithm and launch forEach() on that dummy allocation. Your
kernel can then access the elements of the real data set in any order it wishes.<br />
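The dummy-allocation trick can be sketched like this. RenderScript kernels are C99, so the sketch below is plain C with illustrative names (matrix, root and the placeholder work are mine, not taken from the example program):

```c
#include <stdint.h>
#include <stddef.h>

// Illustrative stand-in for an .rs kernel: the runtime would call
// root() once per element of the dummy allocation, passing the
// element index in x. The kernel ignores the dummy element itself
// and uses x to address the real data set in any order it wishes.
int32_t *matrix;   // real data set, bound from Java via a bind_ call

void root(const uint8_t *dummy_element, uint32_t x) {
    (void)dummy_element;            // only the index matters
    matrix[x] = (int32_t)(x * x);   // placeholder work on item x
}
```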
<br />
How does RenderScript compare to established technologies like the Android
SDK or NDK? For Google, the equation is simple: RenderScript is mainly
for GPUs, hence its name. I tried to present the case here that for an
average Android programmer, RenderScript provides a much more productive
way to offload computation-intensive code fragments to highly optimized
native code than the NDK. RenderScript is integrated with the Android SDK,
compilation is super-fast, wrappers are generated automatically, JNI
issues are non-existent, and coding parallel execution is simpler than with
either the SDK or the NDK. Faster execution also means lower
battery consumption, as <a href="http://www.slideshare.net/paller/advantages-and-limitations-of-phonegap-for-sensor-processing">this presentation demonstrated in a different context</a>.
And who knows, one day a device with a multicore CPU, GPU or DSP may come
along that speeds up your application even further, at no cost. As
RenderScript has LLVM at its heart, the possibility is there.</div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com0tag:blogger.com,1999:blog-8214401912480503366.post-36267383578099987452014-01-27T21:24:00.000+01:002015-11-13T09:26:34.043+01:00RenderScript in Android - Java optimization in the benchmark program<div dir="ltr" style="text-align: left;" trbidi="on">
<div style="background-color: white; color: #666666; font-family: 'Lucida Sans', 'Lucida Grande', 'Lucida Sans Unicode', Verdana, Geneva, sans-serif; font-size: 12px; line-height: 18px; margin-bottom: 1em; margin-right: 10px; margin-top: 1em; padding: 0px;">
<a class="ext" href="https://www.blogger.com/comment.g?blogID=8214401912480503366&postID=3300567118479430729" style="color: #27638c; margin: 0px; padding: 0px; text-decoration: none;" target="_blank">I got into a discussion with Michael Leahy</a> about the benchmark program posted <a href="http://www.sfonge.com/forum/topic/renderscript-android-benchmark-program" style="color: #27638c; margin: 0px; padding: 0px; text-decoration: none;">at the end of the previous post</a>. Michael claimed that the Java implementation of the benchmark algorithm in my test program was suboptimal, and he contributed an optimized version that he claimed could deliver 70% better performance. </div>
<div style="background-color: white; color: #666666; font-family: 'Lucida Sans', 'Lucida Grande', 'Lucida Sans Unicode', Verdana, Geneva, sans-serif; font-size: 12px; line-height: 18px; margin-bottom: 1em; margin-right: 10px; margin-top: 1em; padding: 0px;">
Therefore I decided to re-evaluate the benchmark results, but with a twist: not only did I add Michael's optimized Java implementation, I also optimized the RenderScript implementation. The results are the following.</div>
<ul style="background-color: white; color: #666666; font-family: 'Lucida Sans', 'Lucida Grande', 'Lucida Sans Unicode', Verdana, Geneva, sans-serif; font-size: 12px; line-height: 18px; margin: 0px 0px 1.5em 2em; padding: 0px;">
<li style="margin: 0px; padding: 0px;">Michael's implementation did improve the execution time of the Java implementation by about 2.6 times.</li>
<li style="margin: 0px; padding: 0px;">RenderScript implementation is still about 2.3 times faster than Michael's optimized Java implementation.</li>
</ul>
<div style="background-color: white; color: #666666; font-family: 'Lucida Sans', 'Lucida Grande', 'Lucida Sans Unicode', Verdana, Geneva, sans-serif; font-size: 12px; line-height: 18px; margin-bottom: 1em; margin-right: 10px; margin-top: 1em; padding: 0px;">
<a href="http://pallergabor.uw.hu/androidblog/samplerecognizer_optimized.zip">The new version is available here.</a> I will continue explaining this example program in the next post.</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgHMO6-YIKoH0K9WEiiipL2FtTQZ1mWHLih2Al0VoUL2xy2RscbrR35mXxEspsr92EbPSzOHyoRnNwVCOj4_uYVRZMGK0Qq6icndoJ9OR0TDduPLvcjFTGRBbQsDR5jm9A1i2ztg9HPEKFG/s1600/samplerecognizer2.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgHMO6-YIKoH0K9WEiiipL2FtTQZ1mWHLih2Al0VoUL2xy2RscbrR35mXxEspsr92EbPSzOHyoRnNwVCOj4_uYVRZMGK0Qq6icndoJ9OR0TDduPLvcjFTGRBbQsDR5jm9A1i2ztg9HPEKFG/s1600/samplerecognizer2.png" width="180" /></a></div>
<div style="background-color: white; color: #666666; font-family: 'Lucida Sans', 'Lucida Grande', 'Lucida Sans Unicode', Verdana, Geneva, sans-serif; font-size: 12px; line-height: 18px; margin-bottom: 1em; margin-right: 10px; margin-top: 1em; padding: 0px;">
<br /></div>
</div>
Gabor Pallerhttp://www.blogger.com/profile/02390936870056951146noreply@blogger.com13