1 Background

Sampling Signals

To use digital signal processing techniques, you must first convert an analog signal into its digital representation. In practice, this is implemented by using an Analog-to-Digital (A/D) converter. Consider an analog signal x(t) that is sampled every Δt seconds. The time interval Δt is known as the sampling interval or sampling period. Its reciprocal, 1/Δt, is known as the sampling frequency, with units of samples/second. Each of the discrete values of x(t) at t = 0, Δt, 2Δt, 3Δt, and so forth, is known as a sample. Thus, x(0), x(Δt), x(2Δt),…, are all samples. The signal x(t) can thus be represented by the discrete set of samples

{x(0), x(Δt), x(2Δt), x(3Δt),…, x(kΔt),…}
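As a minimal sketch of this idea, the snippet below evaluates an assumed continuous-time signal x(t) = sin(2πf₀t) at the instants t = 0, Δt, 2Δt, … to produce the discrete set of samples. The signal frequency f₀ and sampling frequency fs are illustrative choices, not values from the text.

```python
import math

f0 = 5.0       # assumed signal frequency, in Hz (for illustration only)
fs = 100.0     # assumed sampling frequency, in samples/second
dt = 1.0 / fs  # sampling interval (seconds), the reciprocal of fs

def x(t):
    """Continuous-time signal x(t), evaluated at time t."""
    return math.sin(2 * math.pi * f0 * t)

# The discrete set of samples {x(0), x(dt), x(2*dt), ..., x(k*dt), ...},
# truncated here to the first 20 samples.
samples = [x(k * dt) for k in range(20)]
print(samples[:3])
```

Note that the list `samples` is exactly the digital representation an A/D converter would deliver for this signal: the continuous waveform is reduced to one number per sampling instant kΔt.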

Figure 1-1 shows an analog signal and its ...
