Chapter 4. Sound
Sound is a frequently overlooked part of games. Even in big-name titles, sound design and programming are sometimes left until late in the game development process. This is especially true on mobile devices—the user might be playing the game in a crowded, noisy environment and might not even hear the sounds and music you’ve put into it, so why bother putting in much effort?
However, sound is an incredibly important part of games. When a game sounds great, and makes noises in response to the visible parts of the game, the player gets drawn into the world that the game is creating.
In this chapter, you’ll learn how to use iOS’s built-in support for playing both sound effects and music. You’ll also learn how to take advantage of the speech synthesis features built into iOS.
Sound good?
Playing Sound with AVAudioPlayer
Solution
The simplest way to play a sound file is using AVAudioPlayer, which is a class available in the AVFoundation framework. To use this feature, you first need to import the AVFoundation module in each file that uses the AVFoundation classes:

import AVFoundation
You create an AVAudioPlayer by providing it with the location of the file you want it to play. This should generally be done ahead of time, before the sound needs to be played, to avoid playback delays. In this example, audioPlayer is an optional AVAudioPlayer instance variable:
let soundFileURL = NSBundle.mainBundle().URLForResource("TestSound",
    withExtension: "wav")

var error: NSError? = nil

audioPlayer = AVAudioPlayer(contentsOfURL: soundFileURL, error: &error)

if (error != nil) {
    println("Failed to load the sound: \(error)")
}

audioPlayer?.prepareToPlay()
To begin playback, you use the play method:

audioPlayer?.play()
To make playback loop, you change the audio player’s numberOfLoops property. To make an AVAudioPlayer play one time and then stop:

audioPlayer?.numberOfLoops = 0

To make an AVAudioPlayer play twice and then stop:

audioPlayer?.numberOfLoops = 1

To make an AVAudioPlayer play forever, until manually stopped:

audioPlayer?.numberOfLoops = -1
By default, an AVAudioPlayer will play its sound one time only. After it’s finished playing, a second call to play will rewind it and play it again. By changing the numberOfLoops property, you can make an AVAudioPlayer play its file a single time, a fixed number of times, or continuously until it’s sent a pause or stop message.
To stop playback, you use the pause or stop method. The pause method just stops playback, and lets you resume from where you left off later; the stop method stops playback and also undoes the preloading done by prepareToPlay:

// To pause:
audioPlayer?.pause()

// To stop:
audioPlayer?.stop()
To rewind an audio player, you change the currentTime property. This property stores how far playback has progressed, measured in seconds. If you set it to zero, playback will jump back to the start:

audioPlayer?.currentTime = 0

You can also set this property to other values to jump to a specific point in the audio.
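For example, here’s a minimal sketch that skips ten seconds ahead, clamped to the length of the sound (the ten-second value is just an illustration):

// Jump ten seconds forward, but never past the end of the sound.
// audioPlayer is the optional AVAudioPlayer instance variable from this recipe.
if let player = audioPlayer {
    player.currentTime = min(player.currentTime + 10.0, player.duration)
}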
Discussion
If you use an AVAudioPlayer, you need to keep a strong reference to it (using an instance variable) to avoid it being released from memory. If that happens, the sound will stop.
If you have multiple sounds that you want to play at the same time, you need to keep references to each (or use an array to contain them all). This can get cumbersome, so it’s often better to use a dedicated sound engine instead of managing each player yourself.
Setting up an AVAudioPlayer takes a little bit of preparation. You need to either know the location of a file that contains the audio you want the player to play, or have an NSData object that contains the audio data.
AVAudioPlayer supports a number of popular audio formats. The specific formats vary slightly from device to device; the iPhone 5 supports the following formats:
- AAC (8 to 320 Kbps)
- Protected AAC (from the iTunes Store)
- HE-AAC
- MP3 (8 to 320 Kbps)
- MP3 VBR
- Audible (formats 2, 3, 4, Audible Enhanced Audio, AAX, and AAX+)
- Apple Lossless
- AIFF
- WAV
You shouldn’t generally have problems with file compatibility across devices, but it’s usually best to go with AAC, MP3, AIFF, or WAV.
In this example, it’s assumed that there’s a file called TestSound.wav in the project. You’ll want to use a different name for your game, of course.
Use NSBundle’s URLForResource(_:withExtension:) method to get the location of a resource on disk:
let soundFileURL = NSBundle.mainBundle().URLForResource("TestSound",
    withExtension: "wav")
This returns an NSURL object that contains the location of the file, which you can give to your AVAudioPlayer to tell it where to find the sound file.
The initializer for AVAudioPlayer is AVAudioPlayer(contentsOfURL:error:). The first parameter is an NSURL that indicates where to find a sound file, and the second is a pointer to an NSError reference that allows the method to return an error object if something goes wrong.
Let’s take a closer look at how this works. First, you create an NSError variable and set it to nil:

var error: NSError? = nil

Then you call AVAudioPlayer(contentsOfURL:error:) and provide the NSURL and the NSError variable, prefixed with an ampersand (&):
audioPlayer = AVAudioPlayer(contentsOfURL: soundFileURL, error: &error)
When this method returns, audioPlayer is either a ready-to-use AVAudioPlayer object, or nil. If it’s nil, the NSError variable will have changed to be a reference to an NSError object, which you can use to find out what went wrong:
if (error != nil) {
    println("Failed to load the sound: \(error)")
}
Finally, the AVAudioPlayer can be told to preload the audio file before playback. If you don’t do this, it’s no big deal: when you tell it to play, it loads the file and then begins playing back. However, for large files, this can lead to a short pause before audio actually starts playing, so it’s often best to preload the sound as soon as you can. Note, however, that if you have many large sounds, preloading everything can lead to all of your available memory being consumed, so use this feature with care.
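For example, a game might preload its short, frequently used effects at launch and leave longer music to load on demand. Here’s a minimal sketch of that idea; the file names are hypothetical, and preloadedPlayers is an illustrative variable, not part of the recipe:

// Preload a handful of short effects so they play without delay later.
let effectNames = ["Jump", "Coin", "Explosion"]   // hypothetical files in the bundle
var preloadedPlayers: [AVAudioPlayer] = []

for name in effectNames {
    if let url = NSBundle.mainBundle().URLForResource(name, withExtension: "wav") {
        var loadError: NSError? = nil
        if let player = AVAudioPlayer(contentsOfURL: url, error: &loadError) {
            player.prepareToPlay()
            preloadedPlayers.append(player)
        }
    }
}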
Recording Sound with AVAudioRecorder
Solution
AVAudioRecorder is your friend here. Like its sibling AVAudioPlayer (see Playing Sound with AVAudioPlayer), AVAudioRecorder lives in the AVFoundation framework, so you’ll need to import that module in any files where you want to use it. You can then create an AVAudioRecorder as follows:
// destinationURL is the location where we want to store our recording
var error: NSError?

audioRecorder = AVAudioRecorder(URL: destinationURL, settings: nil, error: &error)

if (error != nil) {
    println("Couldn't create a recorder: \(error)")
}

audioRecorder?.prepareToRecord()
To begin recording, use the record method:

audioRecorder?.record()

To stop recording, use the stop method:

audioRecorder?.stop()
When recording has ended, the file pointed at by the URL you used to create the AVAudioRecorder contains a sound file, which you can play using AVAudioPlayer or any other audio system.
Discussion
Like an AVAudioPlayer, an AVAudioRecorder needs to have at least one strong reference made to it in order to keep it in memory.
To record audio, you first need the location of the file where the recorded audio will end up. The AVAudioRecorder will create the file if it doesn’t already exist; if it does, the recorder will erase the file and overwrite it. So, if you want to avoid losing recorded audio, either never record to the same place twice, or move the recorded audio somewhere else when you’re done recording.
The recorded audio file needs to be stored in a location where your game is allowed to put files. A good place to use is your game’s Documents directory; any files placed in this folder will be backed up when the user’s device is synced.
To get the location of your game’s Documents folder, you can use the NSFileManager class:
let documentsURL = NSFileManager.defaultManager().URLsForDirectory(
    NSSearchPathDirectory.DocumentDirectory,
    inDomains: NSSearchPathDomainMask.UserDomainMask).last as! NSURL
Once you have the location of the directory, you can create a URL relative to it. Remember, the URL doesn’t have to point to a real file yet; one will be created when recording begins:
return documentsURL.URLByAppendingPathComponent("RecordedSound.wav")
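Because the recorder overwrites whatever is at its URL, you might move a finished recording to a unique name before recording again. Here’s a minimal sketch, assuming destinationURL and documentsURL from earlier in this recipe are in scope; the archive name format is just an illustration:

// Move the finished recording to a unique file name inside Documents
// so the next recording doesn't overwrite it.
let archivedURL = documentsURL.URLByAppendingPathComponent(
    "Recording-\(NSDate().timeIntervalSince1970).wav")

var moveError: NSError? = nil
if !NSFileManager.defaultManager().moveItemAtURL(destinationURL,
    toURL: archivedURL, error: &moveError) {
    println("Couldn't move the recording: \(moveError)")
}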
Working with Multiple Audio Players
Solution
Create a manager object that manages a collection of AVAudioPlayers. When you want to play a sound, you ask this object to give you an AVAudioPlayer. The manager object will try to give you an AVAudioPlayer that’s not currently doing anything, but if it can’t find one, it will create one.
To create your manager object, create a file called AVAudioPlayerPool.swift with the following contents:
import AVFoundation

// An array of all players stored in the pool; not accessible
// outside this file
private var players: [AVAudioPlayer] = []

class AVAudioPlayerPool: NSObject {

    // Given the URL of a sound file, either create or reuse an audio player
    class func playerWithURL(url: NSURL) -> AVAudioPlayer? {

        // Try to find a player that can be reused and is not playing
        let availablePlayers = players.filter { (player) -> Bool in
            return player.playing == false && player.url == url
        }

        // If we found one, return it
        if let playerToUse = availablePlayers.first {
            println("Reusing player for \(url.lastPathComponent)")
            return playerToUse
        }

        // Didn't find one? Create a new one
        var error: NSError? = nil
        if let newPlayer = AVAudioPlayer(contentsOfURL: url, error: &error) {
            println("Creating new player for url \(url.lastPathComponent)")
            players.append(newPlayer)
            return newPlayer
        } else {
            // We might not be able to create one, so log and return nil
            println("Couldn't load \(url.lastPathComponent): \(error)")
            return nil
        }
    }
}
You can then use it as follows:
if let url = NSBundle.mainBundle().URLForResource("TestSound", withExtension: "wav") {
    let player = AVAudioPlayerPool.playerWithURL(url)
    player?.play()
}
Discussion
AVAudioPlayers are allowed to be played multiple times, but aren’t allowed to change the file that they’re playing. If you want to reuse a single player, you have to use the same file; if you want to use a different file, you’ll need a new player.
This means that the AVAudioPlayerPool object shown in this recipe needs to know which file you want to play.
Our AVAudioPlayerPool object does the following things:
- It keeps a list of AVAudioPlayer objects in an array.
- When a player is requested, it checks to see if it has an available player with the right URL; if it does, it returns that.
- If there’s no AVAudioPlayer that it can use (either because all of the suitable AVAudioPlayers are playing, or because there’s no AVAudioPlayer with the right URL), it creates one, prepares it with the URL provided, and adds it to the list of AVAudioPlayers. This means that when this new AVAudioPlayer is done playing, it can be reused.
Cross-Fading Between Tracks
Solution
This method slowly fades an AVAudioPlayer from a starting volume to an end volume, over a set duration:
func fadePlayer(player: AVAudioPlayer, fromVolume startVolume: Float,
    toVolume endVolume: Float, overTime time: Float) {

    // Update the volume every 1/100 of a second
    let fadeSteps: Int = Int(time * 100)

    // Work out how much time each step will take
    let timePerStep: Float = 1 / 100.0

    player.volume = startVolume

    // Schedule a number of volume changes
    for step in 0...fadeSteps {

        let delayInSeconds: Float = Float(step) * timePerStep

        let popTime = dispatch_time(DISPATCH_TIME_NOW,
            Int64(delayInSeconds * Float(NSEC_PER_SEC)))

        dispatch_after(popTime, dispatch_get_main_queue()) {

            let fraction = (Float(step) / Float(fadeSteps))

            player.volume = startVolume +
                (endVolume - startVolume) * fraction
        }
    }
}
To use this method to fade in an AVAudioPlayer, use a startVolume of 0.0 and an endVolume of 1.0:

fadePlayer(audioPlayer!, fromVolume: 0.0, toVolume: 1.0, overTime: 1.0)
To fade out, use a startVolume of 1.0 and an endVolume of 0.0:

fadePlayer(audioPlayer!, fromVolume: 1.0, toVolume: 0.0, overTime: 1.0)

To make the fade take longer, increase the overTime parameter.
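To cross-fade between two tracks, you can fade the outgoing player down while fading the incoming player up over the same duration. Here’s a minimal sketch, assuming musicPlayerA and musicPlayerB are two already-prepared optional AVAudioPlayer instance variables (both names are illustrative):

// Fade the old track out and the new track in over two seconds.
if let outgoing = musicPlayerA, incoming = musicPlayerB {
    incoming.volume = 0.0
    incoming.play()

    fadePlayer(outgoing, fromVolume: 1.0, toVolume: 0.0, overTime: 2.0)
    fadePlayer(incoming, fromVolume: 0.0, toVolume: 1.0, overTime: 2.0)
}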
Discussion
When you want the volume of an AVAudioPlayer to slowly fade out, what you really want is for the volume to change very slightly but very often. In this recipe, we’ve created a method that uses Grand Central Dispatch to schedule the repeated, gradual adjustment of the volume of a player over time.
To determine how many individual volume changes are needed, the first step is to decide how many times per second the volume should change. In this example, we’ve chosen 100 times per second—that is, the volume will be changed 100 times for every second the fade should last:
// Update the volume every 1/100 of a second
let fadeSteps: Int = Int(time * 100)

// Work out how much time each step will take
let timePerStep: Float = 1 / 100.0
Note
Feel free to experiment with this number. Bigger numbers will lead to smoother fades, whereas smaller numbers will be more efficient but might sound worse.
The next step is to ensure that the player’s current volume is set to be the start volume:

player.volume = startVolume
We then repeatedly schedule volume changes. We’re actually scheduling these changes all at once; however, each change is scheduled to take place slightly after the previous one.
To know exactly when a change should take place, all we need to know is how many steps into the fade we are, and how long the total fade should take. From there, we can calculate how far in the future a specific step should take place:
for step in 0...fadeSteps {

    let delayInSeconds: Float = Float(step) * timePerStep
Once this duration is known, we can get Grand Central Dispatch to schedule it:
    let popTime = dispatch_time(DISPATCH_TIME_NOW,
        Int64(delayInSeconds * Float(NSEC_PER_SEC)))

    dispatch_after(popTime, dispatch_get_main_queue()) {
The next few lines of code are executed when the step is ready to happen. At this point, we need to know exactly what the volume of the audio player should be:
        let fraction = (Float(step) / Float(fadeSteps))

        player.volume = startVolume +
            (endVolume - startVolume) * fraction
When the code runs, the for loop creates and schedules multiple blocks that set the volume, with each block reducing the volume a little. The end result is that the user hears a gradual lessening in volume: in other words, a fade out!
Synthesizing Speech
Solution
First, import the AVFoundation module in your file (see Playing Sound with AVAudioPlayer).
Then, create an instance of AVSpeechSynthesizer:
var speechSynthesizer = AVSpeechSynthesizer()
When you have text you want to speak, create an AVSpeechUtterance:
let utterance = AVSpeechUtterance(string: self.textToSpeakField.text)
You then give the utterance to your AVSpeechSynthesizer:
self.speechSynthesizer.speakUtterance(utterance)
Discussion
The voices you use with AVSpeechSynthesizer are the same ones heard in the Siri personal assistant that’s built into all devices released since the iPhone 4S, and in the VoiceOver accessibility feature.
You can send more than one AVSpeechUtterance to an AVSpeechSynthesizer at the same time. If you call speakUtterance while the synthesizer is already speaking, it will wait until the current utterance has finished before moving on to the next.
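You can also adjust how an individual utterance is spoken before you hand it over. Here’s a minimal sketch; the specific rate, pitch, and volume values are just illustrations, and the text is hypothetical:

let utterance = AVSpeechUtterance(string: "Level complete!")

// Speak a little more slowly than the maximum rate, slightly higher pitched, at full volume.
utterance.rate = AVSpeechUtteranceMinimumSpeechRate +
    (AVSpeechUtteranceMaximumSpeechRate - AVSpeechUtteranceMinimumSpeechRate) * 0.25
utterance.pitchMultiplier = 1.2
utterance.volume = 1.0

self.speechSynthesizer.speakUtterance(utterance)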
Warning
Don’t call speakUtterance with the same AVSpeechUtterance twice; you’ll cause an exception, and your app will crash.
Once you start speaking, you can instruct the AVSpeechSynthesizer to pause speaking, either immediately or at the next word:

// Pause speaking immediately
self.speechSynthesizer.pauseSpeakingAtBoundary(AVSpeechBoundary.Immediate)

// Pause speaking after the current word
self.speechSynthesizer.pauseSpeakingAtBoundary(AVSpeechBoundary.Word)
Once you’ve paused speaking, you can resume it at any time:
self.speechSynthesizer.continueSpeaking()
If you’re done with speaking, you can clear the AVSpeechSynthesizer of the current and pending AVSpeechUtterances by calling stopSpeakingAtBoundary. This method works in the same way as pauseSpeakingAtBoundary, but once you call it, anything the synthesizer was about to say is forgotten.
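For example (a minimal sketch; the boundary argument works the same way as when pausing):

// Stop immediately and discard anything that was queued up to be spoken
self.speechSynthesizer.stopSpeakingAtBoundary(AVSpeechBoundary.Immediate)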
Getting Information About What the Music App Is Playing
Solution
To do this, you’ll need to add the Media Player framework to your code by importing the MediaPlayer module in your file.
First, get an MPMusicPlayerController from the system, which contains information about the built-in music library. Next, get the currently playing MPMediaItem, which represents a piece of media that the Music app is currently playing. Finally, call valueForProperty to get specific information about that media item:
let musicPlayer = MPMusicPlayerController.systemMusicPlayer()

let currentTrack: MPMediaItem? = musicPlayer.nowPlayingItem

let title = currentTrack?.valueForProperty(MPMediaItemPropertyTitle)
    as? String ?? "None"
let artist = currentTrack?.valueForProperty(MPMediaItemPropertyArtist)
    as? String ?? "None"
let album = currentTrack?.valueForProperty(MPMediaItemPropertyAlbumTitle)
    as? String ?? "None"

self.titleLabel.text = title
self.artistLabel.text = artist
self.albumLabel.text = album
Once you’ve got this information, you can do whatever you like with it, including displaying it in a label, showing it in-game, and more.
Discussion
An MPMusicPlayerController represents the music playback system that’s built into every iOS device. Using this object, you can get information about the currently playing track, set the currently playing queue of music, and control the playback (such as by pausing and skipping backward and forward in the queue).
There are actually two MPMusicPlayerControllers available to your app. The first is the system music player, which represents the state of the built-in Music application. The system music player is shared across all applications, so they all have control over the same thing.
The second music player controller that’s available is the application music player. The application music player is functionally identical to the system music player, with a single difference: each application has its own application music player. This means that they each have their own playlist.
Only one piece of media can be playing at a single time. If an application starts using its own application music player, the system music player will pause and let the application take over. If you’re using an app that’s playing music out of the application music player, and you then exit that app, the music will stop.
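Getting hold of either player looks much the same; only the class method differs. A minimal sketch:

// The shared player, whose state is visible to every app (including Music)
let systemPlayer = MPMusicPlayerController.systemMusicPlayer()

// A player whose queue and playback state belong to this app alone
let applicationPlayer = MPMusicPlayerController.applicationMusicPlayer()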
To get information about the currently playing track, you use the nowPlayingItem property of the MPMusicPlayerController. This property returns an MPMediaItem, which is an object that represents a piece of media. Media means music, videos, audiobooks, podcasts, and more (not just music!).
To get information about an MPMediaItem, you use the valueForProperty method. This method takes one of several possible property names (a short example of reading one appears after the list). Here are some examples:
- MPMediaItemPropertyAlbumTitle: The name of the album.
- MPMediaItemPropertyArtist: The name of the artist.
- MPMediaItemPropertyAlbumArtist: The name of the album’s main artist (for albums with multiple artists).
- MPMediaItemPropertyGenre: The genre of the music.
- MPMediaItemPropertyComposer: The composer of the music.
- MPMediaItemPropertyPlaybackDuration: The length of the music, in seconds.
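For example, here’s a minimal sketch that reads the playback duration of the current track, using the currentTrack optional from earlier in this recipe:

// The duration comes back wrapped in an object; bridge it to a Double.
let duration = currentTrack?.valueForProperty(MPMediaItemPropertyPlaybackDuration)
    as? Double ?? 0.0
println("This track runs for \(duration) seconds")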
Warning
The media library is only available on iOS devices—it’s not available on the iOS Simulator. If you try to use these features on the simulator, it just plain won’t work.
Detecting When the Currently Playing Track Changes
Solution
Use NSNotificationCenter to subscribe to the MPMusicPlayerControllerNowPlayingItemDidChangeNotification notification. First, create a property of type AnyObject? to store a reference to the observer object:
var trackChangedObserver: AnyObject?
When you want to begin tracking when the now playing item changes, ask the notification center to begin observing the notification:
trackChangedObserver = NSNotificationCenter.defaultCenter()
    .addObserverForName(MPMusicPlayerControllerNowPlayingItemDidChangeNotification,
        object: nil,
        queue: NSOperationQueue.mainQueue()) { (notification) -> Void in

    self.updateTrackInformation()
}
Next, get a reference to the MPMusicPlayerController that you want to get notifications for, and call beginGeneratingPlaybackNotifications on it:
let musicPlayer = MPMusicPlayerController.systemMusicPlayer()

musicPlayer.beginGeneratingPlaybackNotifications()
When you begin observing notifications using addObserverForName, you’re given a reference to an object: the observer object. You need to keep a reference to this object around, because when you want to tell the notification system to stop notifying you (and you must do this, or else you’ll get bugs and crashes), you pass the object back to the NSNotificationCenter and call the removeObserver method. A common place to do this is in the view controller’s deinit method:
deinit {
    NSNotificationCenter.defaultCenter().removeObserver(trackChangedObserver!)
}
Discussion
Notifications regarding the current item won’t be sent unless beginGeneratingPlaybackNotifications is called. If you stop being interested in the currently playing item, call endGeneratingPlaybackNotifications.
Note that you might not receive these notifications if your application is in the background. It’s generally a good idea to manually update your interface whenever your game comes back from the background, instead of just relying on the notifications to arrive.
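One way to do that is to also observe the app becoming active again and refresh your labels then. Here’s a minimal sketch, assuming an updateTrackInformation method like the one used earlier; the becameActiveObserver property name is illustrative:

// A property to hold the observer, so it can be removed in deinit, too.
var becameActiveObserver: AnyObject?

becameActiveObserver = NSNotificationCenter.defaultCenter()
    .addObserverForName(UIApplicationDidBecomeActiveNotification,
        object: nil,
        queue: NSOperationQueue.mainQueue()) { (notification) -> Void in

    // Re-read the now playing item in case it changed while we were in the background
    self.updateTrackInformation()
}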
Controlling Music Playback
Solution
Use the MPMusicPlayerController to control the state of the music player:
let musicPlayer = MPMusicPlayerController.systemMusicPlayer()

musicPlayer?.play()
musicPlayer?.pause()

musicPlayer?.skipToBeginning()
musicPlayer?.skipToNextItem()
musicPlayer?.skipToPreviousItem()

musicPlayer?.beginSeekingForward()
musicPlayer?.beginSeekingBackward()

musicPlayer?.stop()
Discussion
Don’t forget that if you’re using the shared system music player controller, any changes you make to the playback state apply to all applications. This means that the playback state of your application might get changed by other applications—usually the Music application, but possibly by other apps.
You can query the current state of the music player by asking it for the playbackState, which is one of the following values (a short example of checking it appears after the list):
- MPMusicPlaybackStateStopped: The music player isn’t playing anything.
- MPMusicPlaybackStatePlaying: The music player is currently playing.
- MPMusicPlaybackStatePaused: The music player is playing, but is paused.
- MPMusicPlaybackStateInterrupted: The music player is playing, but has been interrupted (e.g., by a phone call).
- MPMusicPlaybackStateSeekingForward: The music player is fast-forwarding.
- MPMusicPlaybackStateSeekingBackward: The music player is fast-reversing.
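In Swift, these appear as cases of the MPMusicPlaybackState enumeration. Here’s a minimal sketch that reacts to the current state, using the musicPlayer constant from earlier in this recipe:

switch musicPlayer.playbackState {
case .Playing:
    println("Music is already playing; consider ducking your own soundtrack")
case .Paused, .Stopped:
    println("Nothing is playing; it's safe to start your own background music")
default:
    println("The player is seeking or has been interrupted")
}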
You can get notified about changes in the playback state by registering for the MPMusicPlayerControllerPlaybackStateDidChangeNotification notification, in the same way MPMusicPlayerControllerNowPlayingItemDidChangeNotification allows you to get notified about changes in the currently playing item.
Allowing the User to Select Music
Solution
You can display an MPMediaPickerController to let the user select music.
First, make your view controller conform to the MPMediaPickerControllerDelegate protocol:
class ViewController: UIViewController, MPMediaPickerControllerDelegate {
Next, add the following code at the point where you want to display the media picker:
let picker = MPMediaPickerController(mediaTypes: MPMediaType.AnyAudio)

picker.allowsPickingMultipleItems = true
picker.showsCloudItems = true
picker.delegate = self

self.presentViewController(picker, animated: false, completion: nil)
Then, add the following two methods to your view controller:
func mediaPicker(mediaPicker: MPMediaPickerController!,
    didPickMediaItems mediaItemCollection: MPMediaItemCollection!) {

    for item in mediaItemCollection.items {
        let itemName = item.valueForProperty(MPMediaItemPropertyTitle) as? String
        println("Picked item: \(itemName)")
    }

    let musicPlayer = MPMusicPlayerController.systemMusicPlayer()

    musicPlayer.setQueueWithItemCollection(mediaItemCollection)
    musicPlayer.play()

    self.dismissViewControllerAnimated(false, completion: nil)
}

func mediaPickerDidCancel(mediaPicker: MPMediaPickerController!) {
    self.dismissViewControllerAnimated(false, completion: nil)
}
Discussion
An MPMediaPickerController uses the exact same user interface as the one you see in the built-in Music application. This means that your player doesn’t have to waste time learning how to navigate a different interface.
When you create an MPMediaPickerController, you can choose what kinds of media you want the user to pick. In this recipe, we’ve gone with MPMediaType.AnyAudio, which, as the name suggests, means the user can pick any audio: music, audiobooks, podcasts, and so on. Other options include:
- MPMediaType.Music
- MPMediaType.Podcast
- MPMediaType.AudioBook
- MPMediaType.AudioITunesU
- MPMediaType.Movie
- MPMediaType.TVShow
- MPMediaType.VideoPodcast
- MPMediaType.MusicVideo
- MPMediaType.VideoITunesU
- MPMediaType.HomeVideo
- MPMediaType.AnyVideo
- MPMediaType.Any
In addition to setting what kind of content you want the user to pick, you can also set whether you want the user to be able to pick multiple items or just one:
picker.allowsPickingMultipleItems = true
Finally, you can decide whether you want to present media that the user has purchased from iTunes, but isn’t currently downloaded onto the device. Apple refers to this feature as “iTunes in the Cloud,” and you can turn it on or off through the showsCloudItems property:
picker.showsCloudItems = true
When the user finishes picking media, the delegate of the MPMediaPickerController receives the mediaPicker(_:didPickMediaItems:) message. The media items that were chosen are contained in an MPMediaItemCollection object, which is basically an array of MPMediaItems.
In addition to getting information about the media items that were selected, you can also give the MPMediaItemCollection directly to an MPMusicPlayerController, and tell it to start playing:
let musicPlayer = MPMusicPlayerController.systemMusicPlayer()

musicPlayer.setQueueWithItemCollection(mediaItemCollection)
musicPlayer.play()
Once you’re done getting content out of the media picker, you need to dismiss it, by using the dismissViewControllerAnimated(_:completion:) method. This also applies if the user taps the Cancel button in the media picker: in this case, your delegate receives the mediaPickerDidCancel message, and your application should dismiss the view controller in the same way.
Cooperating with Other Applications’ Audio
Solution
You can find out if another application is currently playing audio by using the AVAudioSession class:
let session = AVAudioSession.sharedInstance()

if (session.otherAudioPlaying) {
    // Another application is playing audio. Don't play any sound that might
    // conflict with music, such as your own background music.
} else {
    // No other app is playing audio - crank the tunes!
}
Discussion
The AVAudioSession class lets you control how audio is currently being handled on the device, and gives you considerable flexibility in terms of how the device should handle things like the ringer switch (the switch on the side of the device) and what happens when the user locks the screen.
By default, if you begin playing back audio using AVAudioPlayer and another application (such as the built-in Music app) is playing audio, the other application will stop all sound, and the audio played by your game will be the only thing audible.
However, you might want the user to be able to listen to her own music while playing your game—the background music might not be a very important part of your game, for example.
To change the default behavior of muting other applications, you need to set the audio session’s category. For example, to indicate to the system that your application should not cause other apps to mute their audio, you need to set the audio session’s category to AVAudioSessionCategoryAmbient:
var error: NSError? = nil

AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient,
    error: &error)

if (error != nil) {
    println("Problem setting audio session: \(error)")
}
There are several categories of audio session available. The most important to games are the following (a short example of choosing between them appears after the list):
- AVAudioSessionCategoryAmbient: Audio isn’t the most important part of your game, and other apps should be able to play audio alongside yours. When the ringer switch is set to mute, your audio is silenced, and when the screen locks, your audio stops.
- AVAudioSessionCategorySoloAmbient: Audio is reasonably important to your game. If other apps are playing audio, they’ll stop. However, the audio session will continue to respect the ringer switch and the screen locking.
- AVAudioSessionCategoryPlayback: Audio is very important to your game. Other apps are silenced, and your app ignores the ringer switch and the screen locking.
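For example, here’s a minimal sketch that picks a category based on whether another app is already playing audio, using only the calls shown earlier in this recipe:

// If the user is already listening to something, mix in quietly; otherwise,
// take over audio output for the game's own soundtrack.
let session = AVAudioSession.sharedInstance()
let category = session.otherAudioPlaying ?
    AVAudioSessionCategoryAmbient : AVAudioSessionCategorySoloAmbient

var error: NSError? = nil
session.setCategory(category, error: &error)

if (error != nil) {
    println("Problem setting audio session: \(error)")
}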
Note
When using AVAudioSessionCategoryPlayback, your app will still be stopped when the screen locks. To make it keep running, you need to mark your app as one that plays audio in the background. To do this, follow these steps:
- Open your project’s information page by clicking the project at the top of the Project Navigator.
- Go to the Capabilities tab.
- Turn on “Background Modes,” and then turn on “Audio and AirPlay.”
Your app will now play audio in the background, as long as the audio session’s category is set to AVAudioSessionCategoryPlayback.
Determining How to Best Use Sound in Your Game Design
Solution
It’s really hard to make an iOS game that relies on sound. For one, you can’t count on the user wearing headphones, and sounds in games (and everything else, really) don’t sound their best coming from the tiny speakers found in iOS devices.
Many games “get around” this by prompting users to put on their headphones as the game launches, or suggesting that they are “best experienced via headphones” in the sound and music options menu, if it has one. We think this is a suboptimal solution.
The best iOS games understand and acknowledge the environment in which the games are likely to be played: typically a busy, distraction-filled environment, where your beautiful audio might not be appreciated due to background noise or the fact that the user has the volume turned all the way down.
The solution is to make sure your game works with, or without, sound. Don’t count on the user hearing anything at all, in fact.