The most striking, and most obvious, difference between the iPhone and the iPad is screen size. The original iPhone screen has a resolution of 480×320 pixels at 163 pixels per inch. The Retina displays of the iPhone 4 and 4th-generation iPod touch have a resolution of 960×640 pixels at 326 pixels per inch, while both generations of the iPad screen have a resolution of 1024×768 pixels at 132 pixels per inch. This difference is the single most fundamental thing affecting the way you design your user interface on the two platforms. Attempting to treat the iPad as simply an oversized iPod touch or iPhone will lead to badly designed applications. The metaphors you use on the two platforms will, by necessity, be different.
The increased screen size of the device means that you can develop desktop-sized applications, not just phone-sized applications, for the iPad platform. Doing so, however, requires rethinking the user interface to suit multi-touch. What works for the iPhone or the desktop won’t automatically work on an iPad. For example, Apple totally redesigned the user interface of the iWork suite when it moved it to the iPad. If you’re intending to port a Mac OS X desktop application to the iPad, you should do something similar.
Note
Interestingly, there is now an option for iOS developers to port their iPhone and iPad projects directly to Mac OS X. The Chameleon Project (http://chameleonproject.org) is a drop-in replacement for UIKit that runs on Mac OS X, allowing iOS applications to run on the desktop with little modification, and in some cases none.
Due to its size and function, the iPad is immediately associated in our minds with other, more familiar objects like a legal pad or a book. Holding the device triggers powerful associations with these items, and we’re mentally willing to accept the iPad as a successor to them. This is simply not true for the iPhone; the device is physically too small.
However, this book is not about how to design your user interface or manage your user experience. For the most part, the examples I present in this book are simple view-based applications that could equally well be written for the iPhone and iPod touch or for the iPad. The user interface is only there to illustrate how to use the underlying hardware. This book is about how to use the collection of sensors in these mobile devices.
The slider button on the side of the iPad can, optionally, be used to lock the device’s orientation. When locked, the screen stays in its current orientation, for example portrait, even when you turn the device sideways. However, despite the presence of the rotation lock (and unlike on the iPhone, where many applications support only portrait mode), an iPad application is expected to support all orientations equally.
Note
Apple has this to say about iPad applications: “An application’s interface should support all landscape and portrait orientations. This behavior differs slightly from the iPhone, where running in both portrait and landscape modes is not required.”
To implement basic support for all interface orientations, you should implement the shouldAutorotateToInterfaceOrientation: method in all of your application’s view controllers, returning YES for all orientations. Additionally, you should configure the autoresizing mask property of your views inside Interface Builder so that they correctly respond to layout changes (i.e., rotation of the device).
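For example, a minimal implementation in one of your view controllers might look like the following sketch (RootViewController is a hypothetical name for one of your application’s view controller classes):

```objc
// RootViewController.m (hypothetical view controller class)
@implementation RootViewController

// Return YES for every orientation, as Apple expects of iPad applications.
- (BOOL)shouldAutorotateToInterfaceOrientation:
    (UIInterfaceOrientation)interfaceOrientation
{
    return YES;
}

@end
```

If your views’ autoresizing masks are set correctly in Interface Builder, this single method is often all that basic orientation support requires.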
If you want to go beyond basic support for alternative orientations, there is more work involved. First, for custom views where the placement of subviews is critical to the UI and they need to be precisely located, you should override the layoutSubviews method to add your custom layout code. However, you should override this method only if the autoresizing behavior of the subviews is not what you want.
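As a sketch of what such an override might look like, the following hypothetical UIView subclass (ChartView, with a legendLabel subview; both names are assumptions for illustration) pins a subview to a corner regardless of orientation:

```objc
// ChartView.m (hypothetical custom view with a legendLabel subview)
@implementation ChartView

// Called whenever the view's bounds change, e.g. on rotation.
- (void)layoutSubviews
{
    [super layoutSubviews];

    // Pin the legend to the bottom-right corner with a 10-point margin,
    // whatever the current orientation.
    CGRect bounds = self.bounds;
    CGSize legendSize = self.legendLabel.frame.size;
    self.legendLabel.frame = CGRectMake(
        bounds.size.width  - legendSize.width  - 10.0,
        bounds.size.height - legendSize.height - 10.0,
        legendSize.width, legendSize.height);
}

@end
```

A fixed autoresizing mask could approximate this particular layout; layoutSubviews becomes necessary when the placement logic is more involved than the mask flags can express.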
When an orientation event occurs, the UIWindow class works with the front-most UIViewController to adjust the current view. Therefore, if you need to perform tasks before, during, or after device rotation, you should use the relevant UIViewController rotation notification methods. Specifically, the view controller’s willRotateToInterfaceOrientation:duration:, willAnimateRotationToInterfaceOrientation:duration:, and didRotateFromInterfaceOrientation: methods are called at the relevant points during rotation, allowing you to perform tasks tied to the orientation change in progress. For instance, you might use these callbacks to add or remove specific views and reload the data in those views.
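The three callbacks might be used together as in the following sketch; the sidebarView property is a hypothetical subview shown only in landscape:

```objc
// Sketch of the three rotation callbacks in a view controller.

// Called before the rotation animation begins.
- (void)willRotateToInterfaceOrientation:
    (UIInterfaceOrientation)toInterfaceOrientation
                                duration:(NSTimeInterval)duration
{
    // Pause any expensive work, e.g. stop in-progress drawing.
}

// Called from inside the rotation animation block, so any frame
// changes made here are animated along with the rotation.
- (void)willAnimateRotationToInterfaceOrientation:
    (UIInterfaceOrientation)interfaceOrientation
                                         duration:(NSTimeInterval)duration
{
    // Hide the (hypothetical) sidebar when rotating to portrait.
    self.sidebarView.hidden =
        UIInterfaceOrientationIsPortrait(interfaceOrientation);
}

// Called after the rotation animation completes.
- (void)didRotateFromInterfaceOrientation:
    (UIInterfaceOrientation)fromInterfaceOrientation
{
    // Reload data for the new layout if needed.
}
```

Work that should animate with the rotation belongs in willAnimateRotationToInterfaceOrientation:duration:; setup and teardown that shouldn’t animate belongs in the other two callbacks.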