In response to one of my articles on MythTV setup, a reader wrote that building a home theater PC and using it as a PVR recalls the spirit of experimentation that bubbled up in the PC world in the 1980s. Whether or not the analogy is apt, it's certainly true that building a MythTV system is not for the faint of heart. In the course of my installation, I ran into four major problems. Two were performance problems, one looked as if it were a performance problem, and one was a display problem. I'll tackle all four in this article.
The thorniest problem I faced was an apparent performance problem. My efforts to figure it out lasted several months, with a few blind alleys. KQED, San Francisco's biggest PBS station, uses the flexibility of digital TV broadcasting to show five channels. Channel 9-1 is high-definition prime-time programming, while 9-2 through 9-5 show standard-definition programs.
One of the many options for the MythTV capture subsystem is the format in which capture programs are saved. Digital TV programs are transmitted over the air as a transport stream, and the transport stream is a container for individual program streams. By default, MythTV will save the transport stream. Conversion to program streams can save disk space, but the conversion process is not always perfect.
On my initial MythTV setup, I left the capture settings at the default value, so I was saving transport streams. When I first tuned in to KQED, there was no audio on any of its programs, though every other channel I receive had audio. For troubleshooting purposes, I changed the save format from transport stream to program stream. All KQED programs then had some audio, but it was choppy in almost every case. Instead of the smooth sound I expected, the audio sounded as if it were playing back through a fan. Several times per second, the audio buffer would overrun; each time, the sound would abruptly cut out and the picture would hesitate slightly. Buffer overruns usually indicate that the buffer is not being drained fast enough, so I focused my troubleshooting and configuration on improving audio performance.
Operating on the theory that it was a performance problem, I experimented extensively, as described in my post to the mythtv-users mailing list. I tried disabling CPU speed throttling to ensure that the core processor was operating at maximum speed. For good measure, I attempted to transcode the audio track into a much lower bit rate audio stream. ALSA allows programs to directly access hardware for demanding applications, so I tried allowing MythTV to write directly to the hardware. I experimented extensively with the PCI latency timers to ensure that the audio chipset was not being crowded out of bus access by the demands of the video card. I even modified the source code slightly to watch audio buffers being set up and allocated, to see if buffers for KQED were being allocated any differently from other channels.
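For readers who want to repeat the PCI latency experiment, the timers can be inspected and changed from user space. This is a sketch, not a recommendation of particular values; the bus address 01:00.0 below is a placeholder for your audio chipset's address, which plain lspci will show.

```shell
# Show every PCI device together with its latency timer (hardware-dependent
# output; the "latency" figure appears on each device's Flags line).
lspci -v | grep -iE '^[0-9a-f]|latency'

# Raise the latency timer of one device so it can hold the bus longer.
# 01:00.0 is a placeholder address; substitute the real one from lspci.
setpci -s 01:00.0 latency_timer=0x40
```

Changes made with setpci do not survive a reboot, so any value that helps has to be reapplied from a startup script.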
As my frustration reached its peak, I found that the problem was in the conversion from the transport stream to the program stream. Saving the program as a transport stream preserves the audio intact. To fix the silent playback problem, change audio tracks with the plus key: even though the debug output identifies only one audio track within the stream, it must still be selected manually. (I added the audio track switching command to my remote control so I could apply the fix from the couch.)
At this point, I suspect that KQED transmits the audio for its programming in a slightly different manner from other channels, and the problem may be specific to the DVB drivers that run the capture cards. (Several other guides that describe setting up the Video4Linux drivers do not mention this bug.) When I export video programs for later use on a laptop, some encoders also miss the audio track.
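One quick way to check whether a recording actually contains an audio track is to probe the file directly. This is a sketch assuming ffmpeg is installed; recording.mpg stands in for one of MythTV's saved files.

```shell
# Print the streams ffmpeg detects in a recording. ffmpeg writes the
# stream summary to stderr, so redirect it before filtering. The absence
# of an "Audio:" stream line suggests the track was lost or undetected.
ffmpeg -i recording.mpg 2>&1 | grep 'Stream #'
```

If the audio stream is listed here but silent in MythTV, the track-selection workaround above is the more likely fix than re-encoding.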
MythTV runs some auxiliary background processes that should run at very low priority. Commercial flagging and transcoding are both much lower-priority tasks than recording or playing back programs, so both jobs run at very high nice levels. The implicit assumption is that a niced process is not important enough to justify increasing the CPU speed. But I like commercial flagging because a single button on the remote control skips ahead to the next identified breakpoint, and I would rather have my transcoded video sooner. In both cases, I want the CPU to speed up for a niced process.
On the first Fedora-based installation, the CPU speed was controlled by the cpuspeed daemon. By design, it did not change the processor speed in response to demand from niced processes. To have cpuspeed increase the CPU clock speed for niced processes, I applied a patch to the speed monitoring daemon.
Gentoo does not use a user-space CPU speed controller. Instead, it uses the CPUfreq driver in the kernel, which offers multiple governors for controlling the CPU speed. I use the on-demand governor, which is activated by writing the word "ondemand" to the file that selects the governor (/sys/devices/system/cpu/cpu0/cpufreq/scaling_governor). Naturally, I modified my system start-up scripts so that the CPU speed governor is set up as part of the system boot. Whether the governor raises the clock for niced processes is controlled by the file /sys/devices/system/cpu/cpu0/cpufreq/ondemand/ignore_nice_load: it should contain "0" so that demand from niced processes is counted rather than ignored.
The Athlon64 3200+ processor can run at 1.0 GHz, 1.8 GHz, or 2.0 GHz. It idles at 1.0 GHz, and is able to record programs at the lowest speed. Speed is only increased when playing back video or marking commercials in recorded programs.
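You can watch the clock change through the same sysfs interface; cpufreq reports speeds in kHz, so a small formatting helper makes the values readable. khz_to_ghz is my own name for illustration, not part of any standard tool.

```shell
# khz_to_ghz: format a cpufreq speed (reported in kHz, as found in
# scaling_cur_freq) as a human-readable GHz figure.
khz_to_ghz() {
    awk -v k="$1" 'BEGIN { printf "%.1f GHz", k / 1000000 }'
}

# On a live system:
#   khz_to_ghz "$(cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq)"
khz_to_ghz 1800000    # prints "1.8 GHz"
```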
In addition to the audio glitches on playback on KQED, I observed other sporadic audio glitches that looked like performance problems. During playback, the disk activity light is often solid. If the system is attempting to flag commercials in the background in addition to video playback, the disk load can be so high that performance problems result.
The 2.6 kernel has three different disk schedulers that can be loaded as modules. I experimented with all three to see which worked best under a heavy disk load. As described in the kernel documentation, the deadline scheduler, though among the simplest, tends to work best in heavy I/O situations.
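The kernel reports which scheduler is active by bracketing its name in /sys/block/sda/queue/scheduler, for example "noop [deadline] cfq". A small helper to pull out the active name makes it easy to check from a script; active_sched is my own name, not a standard utility.

```shell
# active_sched: extract the bracketed (active) scheduler name from a
# line in the format the kernel uses, e.g. "noop [deadline] cfq".
active_sched() {
    echo "$1" | sed -n 's/.*\[\([^]]*\)\].*/\1/p'
}

# On a live system:
#   active_sched "$(cat /sys/block/sda/queue/scheduler)"
active_sched 'noop [deadline] cfq'    # prints "deadline"
```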
For testing, I compiled each of the schedulers as a module so that I could experiment without kernel rebuilds. To use a scheduler, two things must be done: the code must be resident in the kernel, and the scheduler must be selected. I use a simple startup script to activate the deadline scheduler that carries out both steps:
#!/bin/sh
# load and activate the deadline I/O scheduler
modprobe deadline_iosched
echo deadline > /sys/block/sda/queue/scheduler
The main reason I fiddled with the I/O scheduler was a different performance problem. My initial Fedora-based MythTV system was able to play back high-definition streams without difficulty, but my Gentoo-based system occasionally struggled with the load. By building MythTV with and without OpenGL video timing support, I figured out that the Bob deinterlacing method is extremely taxing when used with both TwinView and OpenGL video timing. Either disabling TwinView or using a real-time clock for video synchronization will eliminate the performance problem. I chose to disable TwinView because I had used it only for troubleshooting. With most of the configuration problems shaken out, I no longer needed the monitor at the side for debugging.
Although it is no longer necessary to use the deadline scheduler, using an I/O scheduler does help to reduce the average disk wait time by synchronizing requests so that they are presented in an intelligent order to the disk. (I believe it also makes disk access slightly quieter because the seek operations tend to be sequential, but the Samsung SpinPoint disks that I use are so quiet it is hard to tell if that is just my imagination.)
Some channels may show black and white "fuzz" at the top of the screen, as shown in figure 1. The noise is part of the analog television signal called the Vertical Blanking Interval (VBI). During the blanking interval, the electron beam retraces from the lower right of the screen back to the upper left. No picture data is transmitted during the VBI; its lines may instead carry ancillary data such as closed captions, so rendering them on screen produces meaningless static.
TV signals are usually overscanned before display, which has the effect of cropping the image. To prevent the display of the static, increase the vertical overscan until it disappears. On my system, the static vanished once the overscan reached 2 percent.
Coming up in the next installments: getting the picture right on playback, the setup of the remote controls, and transporting video in a convenient form. You don't want to miss any of it . . .
Return to digitalmedia.oreilly.com