Chapter 4. Sound

Sound is a frequently overlooked part of games. Even in big-name titles, sound design and programming are sometimes left until late in the game development process. This is especially true on mobile devices—the user might be playing the game in a crowded, noisy environment and might not even hear the sounds and music you’ve put into it, so why bother putting in much effort?

However, sound is an incredibly important part of games. When a game sounds great, and makes noises in response to the visible parts of the game, the player gets drawn into the world that the game’s creating.

In this chapter, you’ll learn how to use iOS’s built-in support for playing both sound effects and music. You’ll also learn how to take advantage of the speech synthesis features built into iOS.

Sound good?1

4.1 Playing Sound with AVAudioPlayer

Problem

You want to play back an audio file, as simply as possible and with a minimum of work.

Solution

The simplest way to play a sound file is using AVAudioPlayer, which is a class available in the AVFoundation framework. To use this feature, you first need to import the AVFoundation module in each file that uses the AVFoundation classes:

import AVFoundation

You create an AVAudioPlayer by providing it with the location of the file you want it to play. This should generally be done ahead of time, before the sound needs to be played, to avoid playback delays. To get the location of the file, you use the Bundle class’s url(forResource:withExtension:) method, which allows you to access the location of any resource that’s been added to your app’s target in Xcode (for example, by dragging and dropping it into the Project navigator).

In this example, audioPlayer is an optional AVAudioPlayer instance variable:

guard let soundFileURL = Bundle.main.url(forResource: "TestSound",
                                         withExtension: "wav") else {
    print("URL not found")
    return
}

do {
    audioPlayer = try AVAudioPlayer(contentsOf: soundFileURL)
} catch let error {
    print("Failed to load the sound: \(error)")
}

To begin playback, you use the play method:

audioPlayer?.play()

To make playback loop, you change the audio player’s numberOfLoops property. To make an AVAudioPlayer play one time and then stop:

audioPlayer?.numberOfLoops = 0

To make an AVAudioPlayer play twice and then stop:

audioPlayer?.numberOfLoops = 1

To make an AVAudioPlayer play forever, until manually stopped:

audioPlayer?.numberOfLoops = -1

By default, an AVAudioPlayer will play its sound one time only. After it’s finished playing, a second call to play will rewind it and play it again. By changing the numberOfLoops property, you can make an AVAudioPlayer play its file a single time, a fixed number of times, or continuously until it’s sent a pause or stop message.
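
Because numberOfLoops counts repeats rather than total plays, off-by-one mistakes are easy to make. A small helper can make the intent explicit (the helper name is ours, not part of AVFoundation):

```swift
// Convert a desired total play count into a value for AVAudioPlayer's
// numberOfLoops property: 0 means "play once", 1 means "play twice",
// and any negative value means "loop until stopped".
func loopCount(forTotalPlays plays: Int) -> Int {
    // Zero or negative requests are treated as "loop forever"
    return plays <= 0 ? -1 : plays - 1
}
```

For example, audioPlayer?.numberOfLoops = loopCount(forTotalPlays: 2) plays the sound twice and then stops.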

To stop playback, you use the pause or stop method (the pause method just stops playback, and lets you resume from where you left off later; the stop method stops playback completely, and unloads the sound from memory):

// To pause:
audioPlayer?.pause()

// To stop:
audioPlayer?.stop()

To rewind an audio player, you change the currentTime property. This property stores how far playback has progressed, measured in seconds. If you set it to zero, playback will jump back to the start:

audioPlayer?.currentTime = 0

You can also set this property to other values to jump to a specific point in the audio.

Discussion

If you use an AVAudioPlayer, you need to keep a strong reference to it (using an instance variable) to avoid it being released from memory. If that happens, the sound will stop.

If you have multiple sounds that you want to play at the same time, you need to keep references to each (or use an array to contain them all). This can get cumbersome, so it’s often better to use a dedicated sound engine instead of managing each player yourself.

Creating an AVAudioPlayer takes a little bit of preparation. You need to either know the location of a file that contains the audio you want the player to play, or have a Data object that contains the audio data.

AVAudioPlayer supports a number of popular audio formats. The specific formats vary slightly from device to device; the iPhone X supports the following formats:

  • AAC-LC

  • HE-AAC

  • HE-AAC v2

  • Protected AAC

  • MP3

  • Linear PCM (.wav)

  • Apple Lossless

  • FLAC

  • Dolby Digital (AC-3)

  • Dolby Digital Plus (E-AC-3)

  • Audible (formats 2, 3, 4, Audible Enhanced Audio, AAX, and AAX+)

You shouldn’t generally have problems with file compatibility across devices, but it’s usually best to go with AAC, MP3, or WAV.

In this example, it’s assumed that there’s a file called TestSound.wav in the project. You’ll want to use a different name for your game, of course.

Use the Bundle’s url(forResource:withExtension:) method to get the location of a resource on disk:

guard let soundFileURL = Bundle.main.url(forResource: "TestSound",
                                         withExtension: "wav") else {
    print("URL not found")
    return
}

This returns a URL object that contains the location of the file, which you can give to your AVAudioPlayer to tell it where to find the sound file.

Finally, the AVAudioPlayer can be told to preload the audio file before playback, by calling its prepareToPlay() method. If you don’t do this, it’s no big deal—when you tell it to play, it loads the file and then begins playing back. However, for large files, this can lead to a short pause before audio actually starts playing, so it’s often best to preload the sound as soon as you can. Note, however, that if you have many large sounds, preloading everything can lead to all of your available memory being consumed, so use this feature with care.

4.2 Recording Sound with AVAudioRecorder

Problem

You want to record sound made by the player, using the built-in microphone.

Solution

AVAudioRecorder is your friend here. Like its sibling AVAudioPlayer (see Recipe 4.1), AVAudioRecorder lives in the AVFoundation framework, so you’ll need to import that module in any files where you want to use it. You can then create an AVAudioRecorder as follows:

// destinationURL is the location where we want to store our recording

do {
    audioRecorder = try AVAudioRecorder(url: destinationURL, settings: [:])
} catch let error {
    print("Couldn't create a recorder: \(error)")
}

To begin recording, use the record method:

audioRecorder?.record()

To stop recording, use the stop method:

audioRecorder?.stop()

When recording has ended, the file pointed at by the URL you used to create the AVAudioRecorder contains a sound file, which you can play using AVAudioPlayer or any other audio system.

Discussion

Like an AVAudioPlayer, an AVAudioRecorder needs to have at least one strong reference made to it in order to keep it in memory.

To record audio, you first need to have the location of the file where the recorded audio will end up. The AVAudioRecorder will create the file if it doesn’t already exist; if it does, the recorder will erase the file and overwrite it. So, if you want to avoid losing recorded audio, either never record to the same place twice, or move the recorded audio somewhere else when you’re done recording.

The recorded audio file needs to be stored in a location where your game is allowed to put files. A good place to use is your game’s Documents directory; any files placed in this folder will be backed up when the user’s device is synced.

To get the location of your game’s Documents folder, you can use the FileManager class:

let documentsURL = FileManager.default
    .urls(for: FileManager.SearchPathDirectory.documentDirectory,
          in: FileManager.SearchPathDomainMask.userDomainMask)
    .first!

Once you have the location of the directory, you can create a URL relative to it. Remember, the URL doesn’t have to point to a real file yet; one will be created when recording begins:

return documentsURL.appendingPathComponent("RecordedSound.wav")
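
Because AVAudioRecorder overwrites any existing file at its URL, one way to avoid losing earlier recordings is to generate a unique filename each time. This sketch uses a timestamp in the filename; that naming scheme is our own convention, not an API requirement:

```swift
import Foundation

// Build a unique destination URL for a new recording inside the given
// directory, so that earlier recordings aren't overwritten.
func makeRecordingURL(in directory: URL) -> URL {
    let timestamp = Int(Date().timeIntervalSince1970)
    return directory.appendingPathComponent("Recording-\(timestamp).wav")
}
```

You would then pass the result to the AVAudioRecorder initializer instead of a fixed "RecordedSound.wav" URL.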

4.3 Working with Multiple Audio Players

Problem

You want to use multiple audio players, but reuse players when possible.

Solution

Create a manager object that manages a collection of AVAudioPlayers. When you want to play a sound, you ask this object to give you an AVAudioPlayer. The manager object will try to give you an AVAudioPlayer that’s not currently doing anything, but if it can’t find one, it will create one.

To create your manager object, create a file called AVAudioPlayerPool.swift with the following contents:

import AVFoundation

// An array of all players stored in the pool; not accessible
// outside this file
private var players : [AVAudioPlayer] = []

class AVAudioPlayerPool: NSObject {

    // Given the URL of a sound file, either create or reuse an audio player
    class func player(url : URL) -> AVAudioPlayer? {

        // Try and find a player that can be reused and is not playing
        let availablePlayers = players.filter { (player) -> Bool in
            return player.isPlaying == false && player.url == url
        }

        // If we found one, return it
        if let playerToUse = availablePlayers.first {
            print("Reusing player for \(url.lastPathComponent)")
            return playerToUse
        }

        // Didn't find one? Create a new one, and add it to the pool
        do {
            let newPlayer = try AVAudioPlayer(contentsOf: url)
            players.append(newPlayer)
            return newPlayer
        } catch let error {
            print("Couldn't load \(url.lastPathComponent): \(error)")
            return nil
        }
    }
}



You can then use it as follows:

if let url = Bundle.main.url(forResource: "TestSound",
                             withExtension: "wav") {
    let player = AVAudioPlayerPool.player(url: url)
    player?.play()
}

Discussion

AVAudioPlayers are allowed to be played multiple times, but aren’t allowed to change the file that they’re playing. If you want to reuse a single player, you have to use the same file; if you want to use a different file, you’ll need a new player.

This means that the AVAudioPlayerPool object shown in this recipe needs to know which file you want to play.

Our AVAudioPlayerPool object does the following things:

  1. It keeps a list of AVAudioPlayer objects in an array.

  2. When a player is requested, it checks to see if it has an available player with the right URL; if it does, it returns that.

  3. If there’s no AVAudioPlayer that it can use—either because all of the suitable AVAudioPlayers are playing, or because there’s no AVAudioPlayer with the right URL—it creates one, prepares it with the URL provided, and adds it to the list of AVAudioPlayers. This means that when this new AVAudioPlayer is done playing, it can be reused.
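
The reuse pattern itself doesn’t depend on AVFoundation. Stripped to its essentials, the steps above can be sketched generically (the names here are ours, for illustration only):

```swift
// A generic take on the pooling pattern from this recipe: hand back an
// existing item when the predicate says one is free, otherwise create a
// new one and remember it for later reuse.
class Pool<Item> {
    private var items: [Item] = []
    private let isAvailable: (Item) -> Bool
    private let create: () -> Item

    init(isAvailable: @escaping (Item) -> Bool, create: @escaping () -> Item) {
        self.isAvailable = isAvailable
        self.create = create
    }

    func acquire() -> Item {
        // Reuse the first item the predicate says is free...
        if let existing = items.first(where: isAvailable) {
            return existing
        }
        // ...otherwise create a new one and keep it in the pool
        let newItem = create()
        items.append(newItem)
        return newItem
    }

    var count: Int { return items.count }
}
```

In AVAudioPlayerPool, the "is available" predicate is "not playing, and loaded with the right URL".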

4.4 Cross-Fading Between Tracks

Problem

You want to blend multiple sounds by smoothly fading one out and another in.

Solution

This method slowly fades an AVAudioPlayer from a starting volume to an end volume, over a set duration:

func fade(player: AVAudioPlayer,
          fromVolume startVolume : Float,
          toVolume endVolume : Float,
          overTime time : TimeInterval) {

    let stepsPerSecond = 100

    // Update the volume every 1/100 of a second
    let fadeSteps = Int(time * TimeInterval(stepsPerSecond))
    // Work out how much time each step will take
    let timePerStep = TimeInterval(1.0 / Double(stepsPerSecond))

    player.volume = startVolume

    // Schedule a number of volume changes
    for step in 0...fadeSteps {

        let delayInSeconds : TimeInterval = TimeInterval(step) * timePerStep

        let deadline = DispatchTime.now() + delayInSeconds

        DispatchQueue.main.asyncAfter(deadline: deadline, execute: {
            let fraction = (Float(step) / Float(fadeSteps))

            player.volume = startVolume + (endVolume - startVolume) * fraction
        })
    }
}

To use this method to fade in an AVAudioPlayer, use a startVolume of 0.0 and an endVolume of 1.0:

fade(player: audioPlayer!, fromVolume: 0.0, toVolume: 1.0, overTime: 1.0)

To fade out, use a startVolume of 1.0 and an endVolume of 0.0:

fade(player: audioPlayer!, fromVolume: 1.0, toVolume: 0.0, overTime: 1.0)

To make the fade take longer, increase the overTime parameter.

Discussion

When you want the volume of an AVAudioPlayer to slowly fade out, what you really want is for the volume to change very slightly but very often. In this recipe, we’ve created a method that uses Grand Central Dispatch to schedule the repeated, gradual adjustment of the volume of a player over time.

To determine how many individual volume changes are needed, the first step is to decide how many times per second the volume should change. In this example, we’ve chosen 100 times per second—that is, the volume will be changed 100 times for every second the fade should last:

let stepsPerSecond = 100

// Update the volume every 1/100 of a second
let fadeSteps = Int(time * TimeInterval(stepsPerSecond))
// Work out how much time each step will take
let timePerStep = TimeInterval(1.0 / Double(stepsPerSecond))

Feel free to experiment with this number. Bigger numbers will lead to smoother fades, whereas smaller numbers will be more efficient but might sound worse.
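
The per-step arithmetic can be checked in isolation. This pure function (our own, for illustration) reproduces the volume that each scheduled step will set:

```swift
// Compute the sequence of volumes for a linear fade with the given number
// of steps; step 0 is the start volume and the final step is the end volume.
func fadeVolumes(from startVolume: Float, to endVolume: Float,
                 steps fadeSteps: Int) -> [Float] {
    return (0...fadeSteps).map { step in
        let fraction = Float(step) / Float(fadeSteps)
        return startVolume + (endVolume - startVolume) * fraction
    }
}
```

With 100 steps per second and a one-second fade, this yields 101 evenly spaced volume values, which is exactly what the loop in the fade method schedules.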

The next step is to ensure that the player’s current volume is set to be the start volume:

player.volume = startVolume

We then repeatedly schedule volume changes. We’re actually scheduling these changes all at once; however, each change is scheduled to take place slightly after the previous one.

To know exactly when a change should take place, all we need to know is how many steps into the fade we are, and how long the total fade should take. From there, we can calculate how far in the future a specific step should take place:

// Schedule a number of volume changes
for step in 0...fadeSteps {

    let delayInSeconds : TimeInterval = TimeInterval(step) * timePerStep

Once this duration is known, we can get Grand Central Dispatch to schedule it:

let deadline = DispatchTime.now() + delayInSeconds

DispatchQueue.main.asyncAfter(deadline: deadline, execute: {

The next few lines of code are executed when the step is ready to happen. At this point, we need to know exactly what the volume of the audio player should be:

let fraction = (Float(step) / Float(fadeSteps))

player.volume = startVolume + (endVolume - startVolume) * fraction

When the code runs, the for loop creates and schedules multiple blocks that set the volume, with each block reducing the volume a little. The end result is that the user hears a gradual lessening in volume—in other words, a fade out!

4.5 Synthesizing Speech

Problem

You want to make your app speak.

Solution

First, import the AVFoundation module in your file (see Recipe 4.1).

Then, create an instance of AVSpeechSynthesizer:

var speechSynthesizer = AVSpeechSynthesizer()

When you have text you want to speak, create an AVSpeechUtterance:

let utterance = AVSpeechUtterance(string: textToSpeak)

You then give the utterance to your AVSpeechSynthesizer:

speechSynthesizer.speak(utterance)

Discussion

The voices you use with AVSpeechSynthesizer are the same ones seen in the Siri personal assistant that’s built into all devices released since the iPhone 4S, and in the VoiceOver accessibility feature.

You can send more than one AVSpeechUtterance to an AVSpeechSynthesizer at the same time. If you call speak while the synthesizer is already speaking, it will wait until the current utterance has finished before moving on to the next.


Don’t call speak with the same AVSpeechUtterance twice—you’ll cause an exception, and your app will crash.

Once you start speaking, you can instruct the AVSpeechSynthesizer to pause speaking, either immediately or at the next word:

// Pause speaking immediately
self.speechSynthesizer.pauseSpeaking(at: AVSpeechBoundary.immediate)
// Pause speaking after the current word
self.speechSynthesizer.pauseSpeaking(at: AVSpeechBoundary.word)

Once you’ve paused speaking, you can resume it at any time:

self.speechSynthesizer.continueSpeaking()

If you’re done with speaking, you can clear the AVSpeechSynthesizer of the current and pending AVSpeechUtterances by calling stopSpeaking(at:). This method works in the same way as pauseSpeaking(at:), but once you call it, anything the synthesizer was about to say is forgotten.

4.6 Getting Information About What the Music App Is Playing

Problem

You want to find out information about whatever song the Music application is playing.

Solution

To do this, you’ll need to add the Media Player framework to your code by importing the MediaPlayer module in your file.

First, get an MPMusicPlayerController from the system, which contains information about the built-in music library. Next, get the currently playing MPMediaItem, which represents a piece of media that the Music app is currently playing. Finally, call value(forProperty:) to get specific information about that media item:

let musicPlayer = MPMusicPlayerController.systemMusicPlayer

let currentTrack : MPMediaItem? = musicPlayer.nowPlayingItem
let title = currentTrack?.value(forProperty: MPMediaItemPropertyTitle)
    as? String ?? "None"
let artist = currentTrack?.value(forProperty: MPMediaItemPropertyArtist)
    as? String ?? "None"
let album = currentTrack?.value(forProperty: MPMediaItemPropertyAlbumTitle)
    as? String ?? "None"

self.titleLabel.text = title
self.artistLabel.text = artist
self.albumLabel.text = album

Once you’ve got this information, you can do whatever you like with it, including displaying it in a label, showing it in-game, and more.

Finally, because the contents of the user’s media library is private information, you need to justify why you need permission to access it. When you request the systemMusicPlayer, iOS will use the text you provide to ask the user for permission to access the library.

To do this, go to the Project navigator, and select the project at the top. In the Targets list that appears, select the app target. Click the Info tab, and add a new entry for “Privacy - Media Library Usage Description”. In the Value column for this new entry, write the justification for why your app needs access to the user’s media library.

Discussion

An MPMusicPlayerController represents the music playback system that’s built into every iOS device. Using this object, you can get information about the currently playing track, set the currently playing queue of music, and control the playback (such as by pausing and skipping backward and forward in the queue).

There are actually two MPMusicPlayerControllers available to your app. The first is the system music player, which represents the state of the built-in Music application. The system music player is shared across all applications, so they all have control over the same thing.

The second music player controller that’s available is the application music player. The application music player is functionally identical to the system music player, with a single difference: each application has its own application music player. This means that they each have their own playlist.

Only one piece of media can be playing at a single time. If an application starts using its own application music player, the system music player will pause and let the application take over. If you’re using an app that’s playing music out of the application music player, and you then exit that app, the music will stop.

To get information about the currently playing track, you use the nowPlayingItem property of the MPMusicPlayerController. This property returns an MPMediaItem, which is an object that represents a piece of media. Media means music, videos, audiobooks, podcasts, and more—not just music!

To get information about an MPMediaItem, you use the value(forProperty:) method. This method takes one of several possible property names. Here are some examples:

MPMediaItemPropertyAlbumTitle
The name of the album.

MPMediaItemPropertyArtist
The name of the artist.

MPMediaItemPropertyAlbumArtist
The name of the album’s main artist (for albums with multiple artists).

MPMediaItemPropertyGenre
The genre of the music.

MPMediaItemPropertyComposer
The composer of the music.

MPMediaItemPropertyPlaybackDuration
The length of the music, in seconds.
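
Since the playback duration is reported in seconds, a small formatting helper (ours, not part of MediaPlayer) is handy for showing it in an interface:

```swift
import Foundation

// Format a duration in seconds as "m:ss" for display, e.g. in a
// now-playing label.
func formattedDuration(_ seconds: Double) -> String {
    let totalSeconds = Int(seconds.rounded())
    let minutes = totalSeconds / 60
    let remainder = totalSeconds % 60
    return String(format: "%d:%02d", minutes, remainder)
}
```

You could feed it the value of MPMediaItemPropertyPlaybackDuration, cast to a Double.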


The media library is only available on iOS devices—it’s not available on the iOS Simulator. If you try to use these features on the simulator, it just plain won’t work.

4.7 Detecting When the Currently Playing Track Changes

Problem

You want to detect when the currently playing media item changes.

Solution

Use NotificationCenter to subscribe to the MPMusicPlayerControllerNowPlayingItemDidChange notification. First, create a property of type AnyObject? to store a reference to the observer object:

var trackChangedObserver : AnyObject?

When you want to begin tracking when the now playing item changes, ask the NotificationCenter to begin observing the notification:

trackChangedObserver = NotificationCenter.default
    .addObserver(forName: .MPMusicPlayerControllerNowPlayingItemDidChange,
                 object: nil, queue: OperationQueue.main) { (notification) -> Void in
        // The now-playing item changed; update your interface here
}

Next, get a reference to the MPMusicPlayerController that you want to get notifications for, and call beginGeneratingPlaybackNotifications on it:

let musicPlayer = MPMusicPlayerController.systemMusicPlayer

musicPlayer.beginGeneratingPlaybackNotifications()

Discussion

When you begin observing notifications using addObserver(forName:object:queue:using:), you’re given a reference to an object: the observer object. You need to keep a reference to this object around, because when you want to tell the notification system to stop notifying you (and you must do this, or else you’ll get bugs and crashes), you pass the object back to the NotificationCenter by calling its removeObserver method. A common place to do this is in the view controller’s deinit method:

deinit {
    if let observer = trackChangedObserver {
        NotificationCenter.default.removeObserver(observer)
    }
}

Notifications regarding the current item won’t be sent unless beginGeneratingPlaybackNotifications is called. If you stop being interested in the currently playing item, call endGeneratingPlaybackNotifications.

Note that you might not receive these notifications if your application is in the background. It’s generally a good idea to manually update your interface whenever your game comes back from the background, instead of just relying on the notifications to arrive.

4.8 Controlling Music Playback

Problem

You want to control the track that the Music application is playing.

Solution

Use the MPMusicPlayerController to control the state of the music player:

let musicPlayer = MPMusicPlayerController.systemMusicPlayer

// Start or resume playback
musicPlayer.play()

// Pause playback
musicPlayer.pause()

// Skip forward or backward in the queue
musicPlayer.skipToNextItem()
musicPlayer.skipToPreviousItem()

Discussion

Don’t forget that if you’re using the shared system music player controller, any changes you make to the playback state apply to all applications. This means that the playback state of your application might get changed by other applications—usually the Music application, but possibly by other apps.

You can query the current state of the music player by asking it for the playbackState, which is one of the following values:

MPMusicPlaybackState.stopped
The music player isn’t playing anything.

MPMusicPlaybackState.playing
The music player is currently playing.

MPMusicPlaybackState.paused
The music player is playing, but is paused.

MPMusicPlaybackState.interrupted
The music player is playing, but has been interrupted (e.g., by a phone call).

MPMusicPlaybackState.seekingForward
The music player is fast-forwarding.

MPMusicPlaybackState.seekingBackward
The music player is fast-reversing.

You can get notified about changes in the playback state by registering for the MPMusicPlayerControllerPlaybackStateDidChange notification, in the same way MPMusicPlayerControllerNowPlayingItemDidChange allows you to get notified about changes in the currently playing item.

4.9 Allowing the User to Select Music

Problem

You want to allow the user to choose some music to play.

Solution

You can display an MPMediaPickerController to let the user select music.

First, make your view controller conform to the MPMediaPickerControllerDelegate:

class ViewController: UIViewController, MPMediaPickerControllerDelegate {

Next, add the following code at the point where you want to display the media picker:

let picker = MPMediaPickerController(mediaTypes:MPMediaType.anyAudio)

picker.allowsPickingMultipleItems = true
picker.showsCloudItems = true

picker.delegate = self

self.present(picker, animated:false, completion:nil)

Then, add the following two methods to your view controller:

func mediaPicker(_ mediaPicker: MPMediaPickerController,
    didPickMediaItems mediaItemCollection: MPMediaItemCollection) {

    for item in mediaItemCollection.items {
        if let itemName = item.value(forProperty: MPMediaItemPropertyTitle)
            as? String {
            print("Picked item: \(itemName)")
        }
    }

    let musicPlayer = MPMusicPlayerController.systemMusicPlayer

    musicPlayer.setQueue(with: mediaItemCollection)
    musicPlayer.play()

    self.dismiss(animated: false, completion: nil)
}

func mediaPickerDidCancel(_ mediaPicker: MPMediaPickerController) {
    self.dismiss(animated: false, completion: nil)
}

Discussion

An MPMediaPickerController uses the exact same user interface as the one you see in the built-in Music application. This means that your player doesn’t have to waste time learning how to navigate a different interface.

When you create an MPMediaPickerController, you can choose what kinds of media you want the user to pick. In this recipe, we’ve gone with MPMediaType.anyAudio, which, as the name suggests, means the user can pick any audio: music, audiobooks, podcasts, and so on. Other options include:


  • MPMediaType.podcast

  • MPMediaType.audioBook

  • MPMediaType.audioITunesU


  • MPMediaType.tvShow

  • MPMediaType.videoPodcast

  • MPMediaType.musicVideo

  • MPMediaType.videoITunesU

  • MPMediaType.homeVideo

  • MPMediaType.anyVideo

  • MPMediaType.anyAudio

  • MPMediaType.any

In addition to setting what kind of content you want the user to pick, you can also set whether you want the user to be able to pick multiple items or just one:

picker.allowsPickingMultipleItems = true

Finally, you can decide whether you want to present media that the user has purchased from iTunes, but isn’t currently downloaded onto the device. Apple refers to this feature as “iTunes in the Cloud,” and you can turn it on or off through the showsCloudItems property. Apple Music content is also displayed, if the user is a subscriber.

picker.showsCloudItems = true

When the user finishes picking media, the delegate of the MPMediaPickerController receives the mediaPicker(_, didPickMediaItems:) message. The media items that were chosen are contained in an MPMediaItemCollection object, which is basically an array of MPMediaItems.

In addition to getting information about the media items that were selected, you can also give the MPMediaItemCollection directly to an MPMusicPlayerController, and tell it to start playing:

let musicPlayer = MPMusicPlayerController.systemMusicPlayer

musicPlayer.setQueue(with: mediaItemCollection)
musicPlayer.play()

Once you’re done getting content out of the media picker, you need to dismiss it, by using the dismiss(animated:completion:) method. This also applies if the user taps the Cancel button in the media picker: in this case, your delegate receives the mediaPickerDidCancel message, and your application should dismiss the view controller in the same way.

4.10 Cooperating with Other Applications’ Audio

Problem

You want to play background music only when the user isn’t already listening to something.

Solution

You can find out if another application is currently playing audio by using the AVAudioSession class:

let session = AVAudioSession.sharedInstance()

if session.isOtherAudioPlaying {
    // Another application is playing audio. Don't play any sound that might
    // conflict with music, such as your own background music.
} else {
    // No other app is playing audio - crank the tunes!
}

Discussion

The AVAudioSession class lets you control how audio is currently being handled on the device, and gives you considerable flexibility in terms of how the device should handle things like the ringer switch (the switch on the side of the device) and what happens when the user locks the screen.

By default, if you begin playing back audio using AVAudioPlayer and another application (such as the built-in Music app) is playing audio, the other application will stop all sound, and the audio played by your game will be the only thing audible.

However, you might want the user to be able to listen to her own music while playing your game—the background music might not be a very important part of your game, for example.

To change the default behavior of muting other applications, you need to set the audio session’s category. For example, to indicate to the system that your application should not cause other apps to mute their audio, you need to set the audio session’s category to AVAudioSession.Category.ambient:

do {
    try AVAudioSession.sharedInstance()
        .setCategory(AVAudioSession.Category.ambient)
} catch {
    print("Problem setting audio session: \(error)")
}

There are several categories of audio session available. The most important to games are the following:

AVAudioSession.Category.soloAmbient
Audio is reasonably important to your game. If other apps are playing audio, they’ll stop. However, the audio session will continue to respect the ringer switch and the screen locking. This is the default session category.

AVAudioSession.Category.ambient
Audio isn’t the most important part of your game, and other apps should be able to play audio alongside yours. When the ringer switch is set to mute, your audio is silenced, and when the screen locks, your audio stops.

AVAudioSession.Category.playback
Audio is very important to your game. Other apps are silenced, and your app ignores the ringer switch and the screen locking.


When using AVAudioSession.Category.playback, your app will still be stopped when the screen locks. To make it keep running, you need to mark your app as one that plays audio in the background. To do this, follow these steps:

  1. Open your project’s information page by clicking the project at the top of the Project Navigator.

  2. Select the application’s target from the Targets list.

  3. Go to the Capabilities tab.

  4. Turn on “Background Modes,” and then turn on “Audio and AirPlay.”

Your app will now play audio in the background, as long as the audio session’s category is set to AVAudioSession.Category.playback.

4.11 Determining How to Best Use Sound in Your Game Design

Problem

You want to make optimal use of sound and music in your game design.

Solution

It’s really hard to make an iOS game that relies on sound. For one, you can’t count on the user wearing headphones, and sounds in games (and everything else, really) don’t sound their best coming from the tiny speakers found in iOS devices.

Many games “get around” this by prompting users to put on their headphones as the game launches, or suggesting that they are “best experienced via headphones” in the sound and music options menu, if it has one. We think this is a suboptimal solution.

The best iOS games understand and acknowledge the environment in which the games are likely to be played: typically a busy, distraction-filled environment, where your beautiful audio might not be appreciated due to background noise or the fact that the user has the volume turned all the way down.

The solution is to make sure your game works with, or without, sound. Don’t count on the user hearing anything at all, in fact.


Unless you’re building a game that is based around music or sound, you should make it completely playable without sound. Your users will thank you for it, even if they never actually thank you for it!

1 We apologize for this pun and have fired Jon, who wrote it.

Get iOS Swift Game Development Cookbook, 3rd Edition now with the O’Reilly learning platform.
