Chapter 15. Working with the Real World
Desktops, laptops, iPhones, and iPads are all physical devices that exist in the real world, whether on your desk, on your lap, or in your hand. For a long time, your apps were largely confined to the computer itself, and could do little with the outside world beyond instructing a printer to print a document.
Starting with iOS and with OS X 10.6, however, things began to change: your code can now learn where the user is located, how the device is moving and being held, and how far the computer is from nearby landmarks.
In this chapter, you'll learn how your programs can interact with the outside world. Specifically, you'll learn how to use Core Location to determine where your computer or device is on the planet, how to use MapKit to display and annotate maps, how to use Core Motion to learn how the user is holding the device, how to use the printing services available on OS X and iOS to work with printers, how to connect game controllers to your apps, and how to make sure your apps don't excessively drain the user's battery.
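As a taste of what's to come, here is a minimal sketch of the first of these: asking Core Location for the device's position. The LocationFinder class name is our own invention for illustration; only the CLLocationManager API and its delegate methods come from the framework, and the snippet uses current Swift syntax. Note that on iOS you must also add an NSLocationWhenInUseUsageDescription entry to your app's Info.plist, or the system will never deliver location updates.

import CoreLocation

// A minimal sketch: ask Core Location for the device's position.
// "LocationFinder" is an illustrative name, not part of the framework.
class LocationFinder: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    func start() {
        // On iOS, you must ask the user's permission before
        // location updates will be delivered.
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    // Called whenever the system has one or more new location fixes;
    // the most recent fix is the last element of the array.
    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        if let location = locations.last {
            print("Latitude: \(location.coordinate.latitude), " +
                  "longitude: \(location.coordinate.longitude)")
        }
    }

    // Called if the system cannot determine a location (for example,
    // because the user denied permission).
    func locationManager(_ manager: CLLocationManager,
                         didFailWithError error: Error) {
        print("Failed to get a location: \(error)")
    }
}

We'll walk through each of these pieces, including the permission model and the delegate callbacks, in the sections that follow.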
Note
Most of the technologies discussed in this chapter work on both OS X and iOS. Some have identical APIs on both platforms (Core Location, MapKit, and Game Controllers), some have different APIs on the two platforms (print services), and some are available only on iOS (Core Motion) or only on OS X (App Nap). We'll let you know which technology is available where.
Working with Location
Almost every ...