Tutorial tunes into Android sensors
Jul 9, 2009 — by Eric Brown — from the LinuxDevices Archive

IBM DeveloperWorks has published a tutorial on exploiting the capabilities of Android's sensors. Frank Ableson's intermediate-level “Tapping into Android's Sensors” offers tips on using the sensor subsystem, and supplies sample code for monitoring orientation and accelerometer sensors, as well as recording audio snippets.
Android's sensors can be used for tasks as diverse as building a baby monitor, unlocking a door with voice activation, or even building a basic seismograph, writes Ableson. The Android SDK (software development kit) is exceptional in giving developers access to underlying device hardware in a way rarely available on mobile platforms, he adds.
"Though the Android Java environment still sits between you and the metal," writes Ableson, "the Android development team brings much of the hardware's capability to the surface."
The tutorial starts off with an overview of hardware-oriented features exposed in the Android SDK, including:
- android.hardware.Camera — A class that interacts with the camera to snap a photo, acquire images for a preview screen, and modify parameters
- android.hardware.SensorManager — A class that permits access to the sensors available within Android
- android.hardware.SensorListener — An interface implemented by a class intended to receive updates to sensor values as they change in real time
- android.media.MediaRecorder — A class used to record media samples
- android.media.FaceDetector — A class that permits basic recognition of a person's face as contained in a bitmap
- android.os — Power management, file watcher, handler, and message classes
- java.util.Date, java.util.Timer, java.util.TimerTask — Classes for date and time stamping, timers, and scheduling
Ableson then explores the SensorManager class, which is the main staging area for sensor development work. The tutorial summarizes SensorManager's constants, starting with the sensor-type constants, which identify the orientation, accelerometer, light, magnetic field, proximity, temperature, and other sensors. He also looks at SensorManager's sampling rate and accuracy constants.
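For readers unfamiliar with the 1.x-era API the tutorial targets, a minimal sketch along the following lines (an illustration, not one of Ableson's listings) shows how the SensorManager is obtained and how its sensor-type and sampling-rate constants are typically used; the class and constant names are standard Android, while the surrounding activity is assumed for the example.

import android.app.Activity;
import android.content.Context;
import android.hardware.SensorManager;
import android.util.Log;

public class SensorConstantsDemo extends Activity {
    void inspectSensors() {
        // The SensorManager is obtained from the system, not constructed directly.
        SensorManager mgr =
                (SensorManager) getSystemService(Context.SENSOR_SERVICE);

        // Sensor-type constants are bit flags naming individual sensors;
        // getSensors() reports which of them the device actually provides.
        int available = mgr.getSensors();
        boolean hasAccelerometer =
                (available & SensorManager.SENSOR_ACCELEROMETER) != 0;
        boolean hasOrientation =
                (available & SensorManager.SENSOR_ORIENTATION) != 0;
        Log.d("Sensors", "accelerometer=" + hasAccelerometer
                + " orientation=" + hasOrientation);

        // Sampling-rate constants trade latency against battery use:
        // SENSOR_DELAY_FASTEST, _GAME, _UI, and _NORMAL.
    }
}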
SensorManager interacts with the SensorListener interface via two main methods, writes Ableson. The onSensorChanged method is invoked whenever a sensor value changes, while onAccuracyChanged is invoked when a sensor's reported accuracy changes. Ableson also explains how an application "registers" to listen for activity related to one or more sensors using SensorManager's registerListener method.
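The registration flow described above might look roughly like the sketch below, which implements the legacy SensorListener interface and registers for accelerometer updates; it is an illustrative reconstruction rather than the tutorial's own code.

import android.app.Activity;
import android.content.Context;
import android.hardware.SensorListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;

public class AccelWatcher extends Activity implements SensorListener {
    private SensorManager sensorManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // "Register" for accelerometer updates at a moderate sampling rate.
        sensorManager.registerListener(this,
                SensorManager.SENSOR_ACCELEROMETER,
                SensorManager.SENSOR_DELAY_UI);
    }

    @Override
    protected void onPause() {
        super.onPause();
        // Unregister when the activity leaves the foreground to save battery.
        sensorManager.unregisterListener(this);
    }

    // Called each time the accelerometer reports new x, y, z values.
    public void onSensorChanged(int sensor, float[] values) {
        if (sensor == SensorManager.SENSOR_ACCELEROMETER) {
            Log.d("AccelWatcher", "x=" + values[0]
                    + " y=" + values[1] + " z=" + values[2]);
        }
    }

    // Called when the reported accuracy of a sensor changes.
    public void onAccuracyChanged(int sensor, int accuracy) {
        Log.d("AccelWatcher", "sensor " + sensor + " accuracy now " + accuracy);
    }
}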
The tutorial then proceeds with in-depth explanations of two supplied code samples. The first application monitors changes in the accelerometer readings and reports them to the screen of an Android device. The second uses the MediaRecorder class to record an audio snippet.
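As a rough idea of what the audio sample involves, a minimal MediaRecorder sketch might look like the following; the output path is an illustrative assumption rather than Ableson's value, and recording also requires the RECORD_AUDIO permission in the application manifest.

import android.media.MediaRecorder;
import java.io.IOException;

public class AudioSnippet {
    private MediaRecorder recorder;

    public void startRecording(String outputPath) throws IOException {
        recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        recorder.setOutputFile(outputPath);   // e.g. "/sdcard/snippet.3gp"
        recorder.prepare();
        recorder.start();
    }

    public void stopRecording() {
        recorder.stop();
        recorder.release();   // free the underlying recorder resources
        recorder = null;
    }
}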
With Android's "array of input or stimulus options, coupled with capable computational and networking functions," writes Ableson, the mobile Linux/Java stack "becomes an attractive platform for building real-world systems."
Availability
Frank Ableson's intermediate-level tutorial, "Tapping into Android's Sensors," is freely available from IBM DeveloperWorks, including downloadable code samples.
This article was originally published on LinuxDevices.com and has been donated to the open source community by QuinStreet Inc. Please visit LinuxToday.com for up-to-date news and articles about Linux and open source.