Chapter 4. Advanced Media Element Scripting

Most use of the HTML5 media elements focuses on their primary purpose, which is to play media resources. We might use JavaScript to craft new controls or improve the accessibility of the content, and eventually we'll see what we can do with media controllers and multiple tracks—but we'll rarely go beyond these core capabilities.

However, some folks have looked beyond the basic boxes of the video and audio element, and have demonstrated the use of these elements with other technologies, including SVG (Scalable Vector Graphics) and the canvas element. In addition, a couple of the browser companies have expanded the capability of the audio element so that it can generate sound as well as play it.

In this chapter, I’ll introduce you to some of these advanced experimentation efforts with the HTML5 media elements, and provide some background so that you can give these effects a try on your own.

Note

Most of the material in this chapter has very limited support among browsers. The only browser capable of working with all the examples (at this time) is Firefox. I’ll note browser support with each. Many of the examples are also very CPU-intensive. Use cautiously.

Media Elements and Canvas

HTML5 not only gave us the media elements, it also formalized the canvas element. The canvas element was introduced by Apple years ago, and provided a way for us to draw into an area directly in the web page. Most browsers supported the element and associated API. However, the element was standardized in HTML5 and now all of our target browsers support it.

You can combine the HTML5 media elements with canvas to create some amazing effects. Mozilla demonstrated one such effect by showing how the canvas element, combined with HTML5 video, could be used to replace the plain greenscreen background of a video with the Firefox logo. Doctor HTML5 demonstrated how to create a grayscale effect using HTML5 video and canvas. Sebastian Deutsch combined HTML5 audio with canvas and Twitter for a musical visual treat.

Note

Mozilla’s work with canvas and HTML5 video can be found at https://developer.mozilla.org/En/manipulating_video_using_canvas. You can see the HTML5 Doctor’s work at http://html5doctor.com/video-canvas-magic/. Deutsch’s beautiful work can be found at http://9elements.com/io/?p=153. A follow-up effort that generalized the canvas/audio combination, but without Twitter, can be found at http://nooshu.com/three-js-and-the-audio-data-api-visualisation.

I’m going to emulate the good HTML5 Doctor’s effort by creating a video filter. First, though, I’m going to go through the steps to just get the video playing in a canvas element.

Playing a Video in a Canvas Element

To play the video in a canvas, we’ll need to add both the video element and the canvas element to the web page:

   <video id="videoobj" controls width="480" height="270">
      <source src="videofile.mp4" type="video/mp4" />
      <source src="videofile.webm" type="video/webm" />
   </video>
   <canvas id="canvasobj" width="480" height="270"></canvas>

In this example, both the canvas and video elements are the same size, specified in the width and height attribute for both.

When we work with a canvas, we’re really working with two different objects: the element and the context. The canvas element supports more than one context, but the one we’re most familiar with is the two-dimensional (2D) context, created as follows:

var canvasObj = document.getElementById("canvasobj");
var ctx = canvasObj.getContext("2d");

To draw the video onto the canvas, we’re going to use the 2D Canvas API method drawImage. This method takes whatever is given in the first parameter, and draws it into the canvas element. There are three versions of the drawImage method:

drawImage(image, dx, dy); // draw image starting at canvas position dx,dy
drawImage(image, dx, dy, dw, dh); // draw image starting at dx,dy w/dimensions dw,dh

// draw from image given image dimensions to canvas given canvas dimensions
drawImage(image, sx, sy, sw, sh, dx, dy, dw, dh);

The image shown in the methods can be one of three different types of objects:

  • An HTMLImageElement object instance (an img element)

  • An HTMLCanvasElement object instance (another canvas element)

  • An HTMLVideoElement object instance (an HTML5 video element)

We’re only interested in the last of these: passing an HTML5 video element instance as the first parameter.

Once we have access to the canvas element’s context and a reference to the video object, all we need to do to draw a single frame of a video object into the context (beginning at the left/top of the canvas element) is add the following JavaScript:

var videoObj = document.getElementById("videoobj");
ctx.drawImage(videoObj, 0,0);

The drawImage method isn’t a live method, which means we have to call it every time we want to draw a video frame into the canvas. My first inclination is to use the video object’s timeupdate event handler function to invoke the drawing action. After all, the custom control application in Chapter  had success using the timeupdate event handler to manage the playback indicator.

Incorporating the new event handler results in the application shown in Example 4-1.

Example 4-1. A first cut at drawing video data to a canvas element
 <!DOCTYPE html>
<head>
<title>Grayscale a Video</title>
   <meta charset="utf-8" />
   <script>
      window.onload=function() {
         document.getElementById("videoobj").
                 addEventListener("timeupdate", drawVideo, false);
      }
      function drawVideo() {
        var videoObj = document.getElementById("videoobj");
        var canvasObj = document.getElementById("canvasobj");
        var ctx = canvasObj.getContext("2d");
        ctx.drawImage(videoObj,0,0);
      }
   </script>
</head>
<body>
   <video id="videoobj" controls width="480" height="270">
      <source src="videofile.mp4" type="video/mp4" />
      <source src="videofile.webm" type="video/webm" />
   </video>
   <canvas id="canvasobj" width="480" height="270"></canvas>
</body>

The application works in IE10, Firefox, Opera, and WebKit Nightly, but not in Chrome (not even Canary) or Safari 5. However, in the browsers where the application worked, the drawing action in the canvas element was choppy and lagged noticeably behind the video, as shown in Opera in Figure 4-1.

One of the problems with timeupdate is that it doesn’t fire quickly enough. Searching through online documentation, I found that Firefox fires the timeupdate event roughly every 250 milliseconds, and I assume the other browsers do something similar. At 30 frames per second, that’s seven or eight video frames flipping past between each redraw!

Though timeupdate was sufficient for our playback indicator, it isn’t sufficient for video playback. A better approach, and the one taken in the HTML5 Doctor article, is to use the JavaScript setTimeout function with an interval of 20 milliseconds, more than ten times as often as timeupdate fires.

The script block in Example 4-1 is modified to now use setTimeout, as shown in Example 4-2. Since we’re no longer using an event that’s playback-related, we also have to know when to stop drawing. The code tests to ensure that the video isn’t paused or finished, first, before doing the draw.

Figure 4-1. Example 4-1 in Opera with a noticeable lag in canvas playback
Example 4-2. Updating the video/canvas application to improve performance
   <script>
      window.onload=function() {
         document.getElementById("videoobj").
                 addEventListener("play", drawVideo, false);
      }
      function drawVideo() {
        var videoObj = document.getElementById("videoobj");

        // if not playing, quit
        if (videoObj.paused || videoObj.ended) return false;

        // draw video on canvas
        var canvasObj = document.getElementById("canvasobj");
        var ctx = canvasObj.getContext("2d");
        ctx.drawImage(videoObj,0,0,480,270);
        setTimeout(drawVideo,20);
      }
   </script>

When we run the application with Firefox, Opera, IE10, and Webkit Nightly, we get a nice, smooth integration between the video play and the canvas drawing actions. Now we’re ready to add in the filter functionality.

Creating a Video Visual Filter using the Canvas Element

To modify the presentation of the video data in the canvas element, we actually need to create a second canvas element to act as a scratch, or temporary working, object: we draw the original video data into it, apply the filter, and then copy the filtered data from the scratch canvas into the displayed canvas. Yes, it seems like a lot of work, but it’s necessary. We can modify canvas data, but we can’t directly modify video data. And we don’t want to perform all of our manipulation in the display canvas element itself.

Warning

Security protocols currently prohibit accessing the canvas data if the image used (regardless of whether it’s video or a still image) is accessed from a domain other than the one serving the web page with the canvas application.
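If there’s any chance the video will be served from a different domain than the page, it’s worth guarding the pixel access, because the browser throws a security exception the moment you call getImageData on a tainted canvas. A minimal sketch, assuming the same variable names used in Example 4-3:

        // getImageData throws a security exception if the canvas has been
        // tainted by cross-origin media; fall back to an unfiltered frame
        var pData;
        try {
           pData = ctx2.getImageData(0, 0, 480, 270);
        } catch (e) {
           ctx.drawImage(videoObj, 0, 0, 480, 270);
           setTimeout(drawVideo, 20);
           return;
        }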

The filter function we’re using is one that’s also been used with still images and the canvas element. It’s basically a desaturation of the color to create a grayscale effect. I modified a filter for still images I found at the HTML5 Rocks web site. It’s very similar to the same filter used in the HTML5 Doctor article:

      function grayScale(pixels) {
         var d = pixels.data;
         for (var i=0; i<d.length; i+=4) {
            var r = d[i];
            var g = d[i+1];
            var b = d[i+2];
            var v = 0.2126*r + 0.7152*g + 0.0722*b;
            d[i] = d[i+1] = d[i+2] = v;
          }
        return pixels;
      }

The canvas pixel data is sent to the filter, which does the grayscale conversion and then returns the data. Example 4-3 shows the script block incorporating the new filter.

Example 4-3. Using a scratch canvas to create a grayscale of the video
   <script>

      // grayscale filter
      function grayScale(pixels) {
         var d = pixels.data;
         for (var i=0; i<d.length; i+=4) {
            var r = d[i];
            var g = d[i+1];
            var b = d[i+2];
            var v = 0.2126*r + 0.7152*g + 0.0722*b;
            d[i] = d[i+1] = d[i+2] = v;
          }
        return pixels;
      }

      // event listeners
      window.onload=function() {
         document.getElementById("videoobj").
                 addEventListener("play", drawVideo, false);
      }

      // draw the video
      function drawVideo() {
        var videoObj = document.getElementById("videoobj");

        // if not playing, quit
        if (videoObj.paused || videoObj.ended) return false;

        // create scratch canvas
        var canvasObj = document.getElementById("canvasobj");
        var bc = document.createElement("canvas");
        bc.width=480;
        bc.height=270;

        // get contexts for scratch and display canvases
        var ctx = canvasObj.getContext("2d");
        var ctx2 = bc.getContext("2d");

        // draw video on scratch and get its data
        ctx2.drawImage(videoObj, 0, 0, 480, 270);
        var pData = ctx2.getImageData(0,0,480,270);

        // grayscale it and set to display canvas
        pData = grayScale(pData);
        ctx.putImageData(pData,0,0);

        setTimeout(drawVideo,20);
     }
   </script>

Figure 4-2 shows the grayscale filter in action in Webkit Nightly. The grayscale effect also works with IE10, Firefox, and Opera.

Figure 4-2. Video and video in grayscale using canvas

The HTML5 Rocks site has some other interesting filters to apply against the video element (if you have the CPU for the task). One in particular, though, struck me as quite useful and a good justification for such a filter: being able to increase or decrease the brightness of the video.

I created a global variable, brightness, used to track the current video’s brightness setting. I also added two new buttons to the web page: one to decrease the brightness level of the video, and one to increase it.

Next, I added a new filter function, adjBrightness, that takes pixel data and an adjustment indicator to increase or decrease the brightness of the video:

      function adjBrightness(pixels, adjustment) {
         var d = pixels.data;
         for (var i=0; i<d.length; i+=4) {
            d[i] += adjustment;
            d[i+1] += adjustment;
            d[i+2] += adjustment;
         }
         return pixels;
      }

I also added event listeners to the button elements’ click events. In the click event handler functions, the code increases or decreases the brightness variable accordingly. Example 4-4 contains the complete web page.

Example 4-4. Video playback in canvas with brightness controls
<!DOCTYPE html>
<head>
<title>Adjust Video Brightness</title>
   <meta charset="utf-8" />
   <style>
      #backcanvas { display: none; }
   </style>
   <script>

      var brightness = 0;

      // adjust brightness
      function adjBrightness(pixels, adjustment) {
         var d = pixels.data;
         for (var i=0; i<d.length; i+=4) {
            d[i] += adjustment;
            d[i+1] += adjustment;
            d[i+2] += adjustment;
         }
         return pixels;
      }
      window.onload=function() {
         document.getElementById("videoobj").
                 addEventListener("play", drawVideo, false);

         // brighten video
         document.getElementById("increase").
                 addEventListener("click",function() {
                    brightness+=5;
                    },false);

         // darken video
         document.getElementById("decrease").
                 addEventListener("click",function() {
                    brightness-=5;
                    },false);
      }
      function drawVideo() {
        var videoObj = document.getElementById("videoobj");

        // if not playing, quit
        if (videoObj.paused || videoObj.ended) return false;

        // access draw canvas, create scratch canvas
        var canvasObj = document.getElementById("canvasobj");
        var bc = document.createElement("canvas");
        bc.width=480;
        bc.height=270;

        // get context for both canvas objects
        var ctx = canvasObj.getContext("2d");
        var ctx2 = bc.getContext("2d");

        // draw video on scratch canvas obj, get data
        ctx2.drawImage(videoObj, 0, 0);
        var pData = ctx2.getImageData(0,0,480,270);

        // adjust brightness and set to display canvas
        pData = adjBrightness(pData, brightness);
        ctx.putImageData(pData,0,0);

        setTimeout(drawVideo,20);
      }

   </script>
</head>
<body>
   <video id="videoobj" controls width="480" height="270">
      <source src="videofile.mp4" type="video/mp4" />
      <source src="videofile.webm" type="video/webm" />
   </video>
   <canvas id="canvasobj" width="480" height="270"></canvas>
<div>
<button id="increase">Increase brightness</button>
<button id="decrease">Decrease brightness</button>
</body>

The use of the brightness filter works with Firefox, WebKit Nightly, and IE10. You can adjust the filter in Opera, but you get truly bizarre results if you do.
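I suspect at least part of the problem is that adjBrightness can push channel values below 0 or above 255, and not every browser treats out-of-range pixel data the same way. If you want to rule that out, a clamped variation of the filter (my own guess at a workaround, not a tested Opera fix) looks like this:

      function adjBrightness(pixels, adjustment) {
         var d = pixels.data;
         for (var i=0; i<d.length; i+=4) {
            // clamp each adjusted channel to the 0-255 range
            d[i]   = Math.min(255, Math.max(0, d[i] + adjustment));
            d[i+1] = Math.min(255, Math.max(0, d[i+1] + adjustment));
            d[i+2] = Math.min(255, Math.max(0, d[i+2] + adjustment));
         }
         return pixels;
      }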

To incorporate this functionality as part of your custom control, hide the video and display the canvas element instead. Figure 4-3 shows the brightness filter in action. The top picture is from the video, and the bottom picture is the brightness-adjusted canvas, which I actually think is an improvement. Also notice in the image that the canvas still lags slightly behind the video, even with our improved event handling.

Figure 4-3. Demonstrating the new brightness control implemented using the canvas element
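Hiding the video, as suggested above, is just a matter of style; a minimal sketch (some browsers are happier with the video positioned off-screen than with display: none, so test both approaches):

   <style>
      /* the canvas is the visible "player"; the hidden video still drives it */
      #canvasobj { display: block; }
      #videoobj  { position: absolute; left: -9999px; }
   </style>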

Note

Access the different canvas image filters at HTML5 Rocks at http://www.html5rocks.com/en/tutorials/canvas/imagefilters/.

Media Elements and SVG

In March 2010, Paul Irish wrote about spending some time with folks proficient with SVG, and provided a web page demonstrating how to use SVG and HTML5. At Webjam8, Andreas Bovens demonstrated the use of HTML5 media elements with SVG and SMIL. Mike Thomas showed how to use SVG masks with video to beautiful effect. Mr. Doob demonstrated how to combine SVG and HTML5 audio to create a 3D Waveform. Erik Dahlström created a terrific music video based, in part, on the use of SVG patterns.

Combining HTML5 media elements and SVG is combining the best of the old and new from the W3C. The HTML5 media elements are new, but already demonstrate robustness and extensibility. SVG has been around for over a decade and, with the advent of HTML5, is now supported in HTML as well as XHTML.

There are two ways you can combine the HTML5 media elements with SVG: using the SVG capabilities with media elements within an HTML document, or embedding an HTML5 media element within an SVG document.

Note

Mike Thomas’s demonstration combining HTML5 video with SVG masks can be found at http://atomicrobotdesign.com/blog/htmlcss/svg-masks-html5-video-and-firefox-4/. Andreas Bovens’ presentation can be seen at http://my.opera.com/ODIN/blog/2008/10/20/my-webjam-8-presentation-on-builder-au-cool-things-with-html5-svg-and-smil, though the examples no longer work. Paul Irish’s demo page and video can be found at http://paulirish.com/2010/svg-filters-on-html5-video/. Mr. Doob’s Waveform can be launched at http://www.chromeexperiments.com/detail/3d-waveform/. Erik Dahlström’s work can be seen at the SVGWow page, http://svg-wow.org/blog/2011/03/28/rotozoom-video/.

Adding an HTML5 Video to an SVG Document

Most demonstrations of using SVG and HTML5 video together are based on SVG and the video element embedded into an HTML or XHTML document. However, Chris Double created a nice video/SVG demonstration page where multiple videos are stacked in a web page, and can be resized and rotated about using SVG functionality.

The real issue with using video in SVG is that SVG is famous for being able to scale without degrading. Video, on the other hand, does not scale. If you increase the display size of the video, all you’ll get is a blurry, pixelated picture.

Still, if you’re familiar with SVG, it’s fun to give HTML5 video and SVG a try.

Note

Find out about Chris Double’s efforts with HTML5 video and SVG at http://www.bluishcoder.co.nz/2007/08/svg-video-demo.html. Access the video/SVG demo at http://double.co.nz/video_test/video.svg. Note that the demo only works in Firefox and Opera.

An SVG document is XML, which means we have to make sure our attribute values are quoted, our tags are closed, and values are assigned to boolean attributes. HTML within an SVG document is added using a specialized element, foreignObject. When you add an HTML block to an SVG document, you’re telling the user agent not to process the contents as SVG, but as HTML. Typically the block is added as a body element, complete with the HTML namespace.

Why would you include video within an SVG document? One reason is to make use of the more sophisticated graphics available with SVG. For instance, you could provide a more interesting frame for the video element. Example 4-5 shows an SVG document with an embedded video, nested within a rectangle with a pink background and a green dashed border.

Example 4-5. HTML5 video embedded in an SVG document
<?xml version="1.0"?>
<svg xmlns="http://www.w3.org/2000/svg"
width="510px" height="300px" viewBox="0 0 510 300">

<rect x="10" y="10" width="490" height="280"
        stroke="green" stroke-width="10"  stroke-linejoin="round"
        stroke-dasharray="5,3,2" fill="pink" />

<g id="videoobj">
  <foreignObject viewBox="0 0 480 270" width="480px"
     height="270px" x="15" y="15">
     <body xmlns="http://www.w3.org/1999/xhtml"
           style="margin: 0; padding: 0">
        <video controls="controls" width="480" height="270">
           <source src="videofile.mp4" type="video/mp4" />
           <source src="videofile.ogg" type="video/ogg" />
        </video>
     </body>
   </foreignObject>
</g>
</svg>

Notice in the code how I had to assign a value to the controls boolean attribute. If I didn’t, I’d get the dreaded orange screen of death for Firefox that appears when you send bad XML to the browser.

Figure 4-4 shows my masterpiece in WebKit Nightly. The application also works in Firefox, Chrome, and Safari, but not in IE9/IE10 or Opera.

Well, OK, the visual impact of my work of art is nothing to write home about. But another advantage to SVG is being able to find openly available SVG documents throughout the Web, and being able to modify these documents for your own use. Figure 4-5 shows a slightly better graphic to use for the HTML video element.

Figure 4-4. HTML5 video playing, framed by SVG
Figure 4-5. HTML5 video playing within a TV frame in a found SVG file

The advantage of SVG is that all of the graphics are individually accessible markup elements. Having the primitives makes it simple to find what you need and modify it to match the result you want. Best of all, the graphics can be easily scaled to whatever dimensions you need.
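For instance, if the found TV graphic was drawn on a 600 by 400 coordinate system (made-up numbers, purely for illustration), rendering it at half size is just a matter of changing the outer width and height while keeping the viewBox; nothing inside the document needs to change:

<svg xmlns="http://www.w3.org/2000/svg"
     width="300px" height="200px" viewBox="0 0 600 400">
   <!-- child elements keep their original 600x400 coordinates,
        but the whole graphic renders at half size -->
   <rect x="10" y="10" width="580" height="380" fill="pink"
         stroke="green" stroke-width="10" />
</svg>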

Note

I found the TV SVG graphic at http://www.openclipart.org/, the best place to find public domain SVG you can use in your projects.

Embedding HTML in SVG can be fun, if a little challenging. Most uses of SVG with HTML5 video, though, are to borrow many of the SVG capabilities for use with an HTML5 video within an HTML5 document, as discussed next.

Note

If you have the same size image and video, you could also just use an absolutely positioned PNG element and get much the same effect, as demonstrated by Chris Mills at http://people.opera.com/cmills/video_mask/.

Applying SVG Filters to Video Files within HTML

An interesting use of SVG and HTML5 video together is to apply SVG filters or masks or other graphical devices to the video element.

SVG filters are highly sophisticated effects that can be used individually or combined into more complex presentations. These filters are declarative effects that can be applied to SVG elements, but can also be applied to any HTML, including HTML media elements.

To use an SVG filter with an HTML5 video element within an HTML5 document, we’ll have to add an SVG block to the same HTML file that has the video:

<!DOCTYPE html>
<head>
   <title>SVG filter effects with video</title>
   <meta charset="utf-8" />
</head>
<body>
   <video controls poster="poster.jpg">
      <source src="videofile.mp4" type="video/mp4" />
      <source src="videofile.webm" type="video/webm" />
   </video>
   <svg id="image" version="1.1">
   </svg>

</body>

Filters are created within a defs (definitions) element, and given unique identifiers. To apply the filter to the video, the filter identifier is used within a CSS style setting:

<style>
  video { filter:url(#filtername); }
</style>

The CSS filter property takes one value, the url function, passing in the filter’s identifier.

To demonstrate, we’ll create one of the simpler filter effects: a filter that takes a color video and converts it to black and white. This particular effect is created using the feColorMatrix filter, which applies a matrix transformation to the RGBA color values of each pixel in the original element, in this case a video. The matrix structure is as follows:

| R' |     | a00 a01 a02 a03 a04 |   | R |
| G' |     | a10 a11 a12 a13 a14 |   | G |
| B' |  =  | a20 a21 a22 a23 a24 | * | B |
| A' |     | a30 a31 a32 a33 a34 |   | A |
| 1  |     |  0   0   0   0   1  |   | 1 |

The actual matrix passed in the filter’s values attribute for the grayscale effect is:

0.3333 0.3333 0.3333 0 0
0.3333 0.3333 0.3333 0 0
0.3333 0.3333 0.3333 0 0
0      0      0      1 0
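In other words, each of the red, green, and blue output channels becomes the same equally weighted average of the three input channels, while the fourth row leaves the alpha channel untouched. For a pixel with (R, G, B) = (90, 150, 60), for example:

R' = G' = B' = 0.3333*90 + 0.3333*150 + 0.3333*60 ≈ 100
A' = A

so the pixel is replaced with a medium gray.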

Example 4-6 has a complete page with HTML video and applied SVG filter. Pay particular attention to the fact that no JavaScript is used to create the effect.

Example 4-6. Applying an SVG filter to an HTML5 video within an HTML document
<!DOCTYPE html>
<head>
<title>Video</title>
   <meta charset="utf-8" />
   <style>
      video { filter:url(#blackandwhite); border: 2px solid red;}
   </style>
</head>
<body>
   <video controls poster="poster.jpg" autoplay>
      <source src="videofile.mp4" type="video/mp4" />
      <source src="videofile.webm" type="video/webm" />
   </video>
   <svg id="image" version="1.1">
      <defs>
         <filter id="blackandwhite">
            <feColorMatrix values="0.3333 0.3333 0.3333 0 0
                             0.3333 0.3333 0.3333 0 0
                             0.3333 0.3333 0.3333 0 0
                             0      0      0      1 0"/>
         </filter>
      </defs>
   </svg>
</body>

Figure 4-6 shows the video playing in Firefox…the only browser that supports the effect.

Figure 4-6. Applying an SVG filter to an HTML5 video

Mozilla created the concept of applying SVG filters (and masks and other graphical devices) to HTML elements within an HTML document. No other browser has taken up the idea, though I hope they eventually do so. As you can see, the code is quite simple, and doesn’t require any JavaScript.

As much as I like this ability, though, there are performance issues with applying SVG effects against HTML elements in this manner. I created an offscreen illumination effect for one example, and thought my laptop’s fan would fly out of the case—and my laptop is usually sufficient to meet the needs of even the most tortuously complex web page effect.

Note

For more on applying SVG filters to HTML documents, see the support page for the effect at Mozilla, at https://developer.mozilla.org/En/Applying_SVG_effects_to_HTML_content.

The Audio Data APIs

It’s easy to focus on the HTML5 video element and forget that the specification also includes a wonderful new facility to add audio to the web page. The audio element is just as easy to work with as the video element, and the HTMLMediaElement and other API functionality works equally well with the audio element.

What isn’t part of the HTML5 specification, or any other specification, is functionality that is relevant only to audio: the Audio Data APIs. These are a set of APIs being worked on by Mozilla, Google, and WebKit (though not all on the same API) that provide not only audio playback, but also audio generation.

Matt Hobbs had fun with the Audio Data API. He extrapolated Mr. Doob’s efforts discussed earlier in the chapter and combined the canvas visual effects with the Audio Data API. Greg Jopa combined Mozilla’s Audio Data API with the VexFlow music notation rendering API to create an extraordinary web page-based online musical instrument.

Note

Matt Hobbs’s work combining canvas and the Audio Data API can be seen at http://nooshu.com/three-js-and-the-audio-data-api-visualisation. The work by Greg Jopa combining the Audio Data API and VexFlow can be explored at http://www.gregjopa.com/2010/12/html5-guitar-tab-player-with-firefox-audio-data-api/.

The Mozilla Audio Data API provides a way to get an array of values representing the audio sample data. Though audio-oriented, the functionality can evidently be applied to both the audio and video elements, though you’ll have access only to the audio data.

To access the data, add an event listener for the MozAudioAvailable event to the media element. In the event handler function, the event object provides access to the audio data through the frameBuffer property. Mozilla provides an example that actually takes this data and prints it out to the page. I copied portions of it in Example 4-7.

Example 4-7. Printing out the audio data to the web page
<!DOCTYPE html>
<html>
  <head>
    <title>JavaScript Visualization Example</title>
  </head>
  <body>
    <audio id="audio-element"
           src="sharecropper.ogg"
           controls>
    </audio>
    <pre id="raw">hello</pre>
    <script>
      function audioAvailable(event) {
        var frameBuffer = event.frameBuffer;
        var t = event.time;

        var text = "Samples at: " + t + "\n";
        text += frameBuffer[0] + "  " + frameBuffer[1];
        raw.innerHTML = text;
      }

      var raw = document.getElementById('raw');
      var audio = document.getElementById('audio-element');
      audio.addEventListener('MozAudioAvailable', audioAvailable, false);

    </script>
  </body>
</html>

It’s important to understand that audio can be represented textually, because the Audio Data API can be used to actually generate sound, as well as play sound. Another example that Mozilla provides, shown in Example 4-8, produces a tone—though in fact, a rather annoying tone.

Example 4-8. Generating a tone using the Audio Data API
<!doctype html>
<html>
  <head>
     <title>Generating audio in real time</title>   
     <script type="text/javascript">
     function playTone() {
       var output = new Audio();
       output.mozSetup(1, 44100);
       var samples = new Float32Array(22050);
       var len = samples.length;

       for (var i = 0; i < len; i++) {
         samples[i] = Math.sin( i / 20 );
       }
       output.mozWriteAudio(samples);
     }
   </script>
 </head>
 <body>
   <p>This demo plays a short tone when you click the button below.</p>
   <button onclick="playTone();">Play</button>
 </body>
 </html>

Though the result is simple, consider the capability this API provides: the ability to not only play audio files without having to use a plug-in, but also to generate audio. Game developers in particular should be interested in this capability.
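The formula in Example 4-8 works out to a sine wave of roughly 350 Hz (44,100 samples per second divided by 2π × 20); fold the sample rate into the formula and you can ask for any pitch you want. A minimal variation on playTone, using the same mozSetup and mozWriteAudio calls:

     function playNote(frequency) {
       var sampleRate = 44100;
       var output = new Audio();
       output.mozSetup(1, sampleRate);

       // half a second of samples at the requested pitch
       var samples = new Float32Array(sampleRate / 2);
       for (var i = 0; i < samples.length; i++) {
         samples[i] = Math.sin(2 * Math.PI * frequency * i / sampleRate);
       }
       output.mozWriteAudio(samples);
     }

     playNote(440);  // concert A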

As I mentioned earlier, there is no specification work underway for the Audio Data API, most likely because the work on the concept is still very new. Apple and Google are also exploring an audio data API via work in WebKit, called the Web Audio API. Google did propose the formation of a W3C Audio Group, but none has been created at this time.
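For comparison, the Web Audio API is graph-based rather than sample-based: you create source nodes and connect them to a destination instead of writing raw samples yourself. A rough sketch of a comparable tone (illustrative only; the constructor was still prefixed as webkitAudioContext at the time, and the API remains in flux):

    var ctx = new AudioContext();
    var osc = ctx.createOscillator();
    osc.frequency.value = 440;        // pitch in Hz
    osc.connect(ctx.destination);
    osc.start();
    osc.stop(ctx.currentTime + 0.5);  // half a second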

Do check out the APIs and the many excellent examples, but remain cautious about using the APIs in your production efforts.

Note

The Audio Data API MozillaWiki entry can be found at https://wiki.mozilla.org/Audio_Data_API. Mozilla has also provided an introduction to the API at https://developer.mozilla.org/en/Introducing_the_Audio_API_Extension. The BBC has also provided an excellent tutorial/introduction to the Audio Data API at http://www.bbc.co.uk/blogs/researchanddevelopment/2010/11/mozilla-audio-data-api.shtml. Access more information on the Web Audio API at http://chromium.googlecode.com/svn/trunk/samples/audio/index.html.
