I’ve covered a fair amount of ground in the last few chapters, and you should now have a solid grasp of the basics of handling the sensor data produced by the hardware.
Predictably, in a book about sensors, I’ve focused on the parts of the SDK that are most helpful for using the basic sensor hardware in your own applications. Even there I’ve left a lot out in an attempt to simplify things and get you started quickly, especially when it comes to audio. A more in-depth look at the iPhone SDK is available in Programming iOS 4, by Matt Neuburg (O’Reilly).
The iPhone is one of the most popular devices for geolocation: people use it for everything from getting driving directions to finding a nearby restaurant. As a developer, you can get in on the geolocation game by using the Core Location framework, one of the most powerful and interesting frameworks in the iPhone SDK. It abstracts away the details of determining the user’s location, doing all the heavy lifting for you behind the scenes. From there you can use the MapKit framework to embed maps directly into your views, and then annotate those maps. I dive deeply into both of these topics in the upcoming title Geolocation in iOS, by Alasdair Allan (O’Reilly).
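To give you a flavor of what Core Location looks like in practice, here is a minimal sketch: you create a CLLocationManager, assign it a delegate, and the framework hands you location fixes via delegate callbacks. The LocationViewController class name here is my own invention, and a real application would also need to handle the case where the user denies location access.

```objc
#import <UIKit/UIKit.h>
#import <CoreLocation/CoreLocation.h>

// A hypothetical view controller that adopts the
// CLLocationManagerDelegate protocol to receive position updates.
@interface LocationViewController : UIViewController <CLLocationManagerDelegate>
@property (nonatomic, retain) CLLocationManager *locationManager;
@end

@implementation LocationViewController
@synthesize locationManager;

- (void)viewDidLoad {
    [super viewDidLoad];

    // Create the manager, set ourselves as its delegate, and
    // ask for roughly hundred-meter accuracy before starting updates.
    self.locationManager = [[[CLLocationManager alloc] init] autorelease];
    self.locationManager.delegate = self;
    self.locationManager.desiredAccuracy = kCLLocationAccuracyHundredMeters;
    [self.locationManager startUpdatingLocation];
}

// Called by the framework each time a new location fix arrives.
- (void)locationManager:(CLLocationManager *)manager
    didUpdateToLocation:(CLLocation *)newLocation
           fromLocation:(CLLocation *)oldLocation {
    NSLog(@"Latitude %+.6f, longitude %+.6f",
          newLocation.coordinate.latitude,
          newLocation.coordinate.longitude);
}

// Called if the framework cannot determine a location.
- (void)locationManager:(CLLocationManager *)manager
       didFailWithError:(NSError *)error {
    NSLog(@"Location update failed: %@", error);
}
@end
```

Once you have a CLLocation in hand, its coordinate can be handed straight to a MapKit MKMapView to center a map on the user’s position; that workflow is covered in depth in the book mentioned above.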