Chapter 16. Sensors, NFC, Speech, Gestures, and Accessibility
Thanks to advances in technology, both the environment and the user can interact with a device in a variety of ways, from sensors that detect when the device's orientation changes within its environment, to touch-screen adaptations that let complex gestures trigger events on the device. Android provides APIs that let developers access these sensors and let users interact with devices in a variety of ways. In this chapter, we will explore some of these APIs—sensors, NFC (Near Field Communication), the Gesture libraries, and accessibility.
Sensors
The modern smartphone provides more than just the ability to send and receive communication in various forms. The addition of hardware sensors that report information about the environment the phone is in has made the phone more powerful and useful for the user as well as the developer. Starting with Android 1.5 (API level 3), a standard set of sensors is available. The physical sensors include, but are not limited to, accelerometers that measure acceleration along various axes, gyroscopes that measure rotational change around those axes, magnetic field sensors that sense the strength of magnetic fields along a set of axes, a light sensor that measures the amount of ambient light, a proximity sensor that measures external objects’ proximity to the device, temperature sensors that measure ambient temperature, and pressure sensors that act ...
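To make the sensor framework concrete, the sketch below shows the typical pattern for reading one of these sensors—here the accelerometer—via Android's `SensorManager`. This is a minimal illustration, not a complete Activity from the book: the class name `AccelerometerDemo` is invented for the example, and a real app would also unregister the listener in `onPause()` to avoid draining the battery.

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;

// Hypothetical Activity illustrating the standard sensor-listener pattern.
public class AccelerometerDemo extends Activity implements SensorEventListener {

    private SensorManager sensorManager;
    private Sensor accelerometer;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Obtain the system sensor service and a handle to the accelerometer.
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Start receiving updates; SENSOR_DELAY_NORMAL is the slowest,
        // most battery-friendly rate.
        sensorManager.registerListener(this, accelerometer,
                SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    protected void onPause() {
        super.onPause();
        // Always unregister when not visible, or the sensor keeps running.
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // values[0..2] are acceleration (m/s^2) along the x, y, and z axes.
        Log.d("AccelerometerDemo", "x=" + event.values[0]
                + " y=" + event.values[1] + " z=" + event.values[2]);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Called when the sensor's reported accuracy changes; often ignored.
    }
}
```

The same register/unregister pattern applies to the other sensor types listed above; only the `Sensor.TYPE_*` constant and the interpretation of `event.values` change.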