9. Noise

A noise signal is statistical in the sense that its future value cannot be predicted. It is not possible to describe the signal a priori by a unique function y(t), although such a function can be introduced to describe a specific time segment after it has occurred. Suppose such a time segment has been recorded. It is then possible to calculate certain characteristic attributes, which tend to definite values as the length of the segment increases. One such quantity is the time average
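which, with T taken here to denote the length of the recorded segment, can be written in the standard form

\[
\langle y \rangle = \lim_{T\to\infty} \frac{1}{T}\int_0^{T} y(t)\, dt .
\]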

The presence of a finite time average would represent a predictable DC component, which is essentially contradictory, so one aspect of noise is that it has a zero average. Consider the noise signal associated with the common practice of repeatedly measuring some static quantity such as the velocity of light. The noise signal would represent the departure from the mean occurring in any one measurement, or the fluctuation about the mean. The average fluctuation about the mean is zero by definition, so the customary way of quantifying the fluctuations is the average of the square of the fluctuations, or variance. This would be given by
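an expression of the standard form (y(t) here being understood as the fluctuation about the mean, so that ⟨y⟩ = 0)

\[
\langle y^2 \rangle = \lim_{T\to\infty} \frac{1}{T}\int_0^{T} y^2(t)\, dt .
\]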

A very important generalization of the variance is a quantity called the autocorrelation function, which in fact represents the most complete description of a noise signal that is possible. The variance can be looked upon as the average of the product of the signal at a given time with the value of the signal at the same time. The generalization results when the second factor in the product is instead the value of the signal at a different time. In detail, the autocorrelation function R(τ) of a given noise signal is given by
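the standard time-averaged product (τ denoting the lag and T again the record length)

\[
R(\tau) = \langle\, y(t)\, y(t+\tau)\,\rangle = \lim_{T\to\infty} \frac{1}{T}\int_0^{T} y(t)\, y(t+\tau)\, dt ,
\]

so that R(0) = ⟨y²⟩.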

A closely related quantity is the power spectrum. The autocorrelation function and the power spectrum are essentially a Fourier transform pair. Consider the transform
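from a spectral density G(ω) to the autocorrelation, taken here in the conventional form

\[
R(\tau) = \int_{-\infty}^{\infty} G(\omega)\, e^{\,i\omega\tau}\, d\omega , \qquad G(\omega) = \frac{1}{2\pi}\int_{-\infty}^{\infty} R(\tau)\, e^{-i\omega\tau}\, d\tau ,
\]

the placement of the 1/2π factor being a choice of convention assumed here.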

It should be noticed that, from its definition, the autocorrelation function is symmetric with respect to the sign of its argument, referred to as the lag. This imposes a condition on the transform, since in general the transform contains an anti-symmetric imaginary component, which must therefore vanish. Imposing this condition leads to the simplification
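\[
R(\tau) = \int_{0}^{\infty} S(\omega)\, \cos(\omega\tau)\, d\omega ,
\]

in which the negative frequencies of the convention above have been folded onto ω ≥ 0, and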

where the power spectrum S(ω) = 2G(ω). Consideration is then restricted to ω ≥ 0 and τ ≥ 0. The term power spectrum can be understood from the following considerations. Eqn (4) with τ = 0 can be rearranged to read
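\[
\langle y^2 \rangle = R(0) = \int_{0}^{\infty} S(f)\, df , \qquad d\langle y^2 \rangle = S(f)\, df ,
\]

where the spectrum is here expressed per unit frequency f rather than per unit angular frequency; this is purely a choice of convention, absorbing a factor of 2π into S, and is the form used in what follows.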

Whether the signal represents a voltage or a current, the power developed is proportional to the square of the signal, so that the relation in Eqn (5) expresses the manner in which the power is distributed over frequency, with d⟨y²⟩ being the element of power in the frequency band extending from f to f + df.


The inverse relationship becomes
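\[
S(f) = 4\int_{0}^{\infty} R(\tau)\, \cos(2\pi f \tau)\, d\tau ,
\]

where the factor of 4 follows from the symmetry of R(τ) together with the per-unit-frequency convention adopted above.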


9.1 Thermal Noise

The properties of thermal, or Johnson, noise may be derived from very general thermodynamic considerations. The following microscopic derivation is less rigorous but more insightful. Consider an electron with velocity v(t) at some instant. The electron will continue to have this velocity until it suffers a collision, at which point it acquires an entirely new and unpredictable velocity. Thus v(t) is a random signal. The average motion due to the collisions is governed by a resistive force F = -av, so the equation of motion in the absence of any external force is
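\[
m\frac{dv}{dt} = -a v ,
\]

with m the electron mass (anticipating the notation used below) and a the constant in the resistive force.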

with solution
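\[
v(t) = v(0)\, e^{-a t / m} \equiv v(0)\, e^{-t/t_c} ,
\]

where it is natural to identify t_c = m/a with the collision time introduced below.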

Then the autocorrelation function is
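of the standard exponential form

\[
R_v(\tau) = \langle v(t)\, v(t+\tau)\rangle = \langle v^2\rangle\, e^{-|\tau|/t_c} ,
\]

the subscript v indicating the autocorrelation of the velocity.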

This form of the result is now transformed to relate the microscopic parameters to more useful macroscopic ones. In thermal equilibrium m⟨v²⟩ = kT, where m is the electron mass, k is Boltzmann's constant and T is the absolute temperature. The current produced by a single electron moving in a resistor of length l is ev/l, so the autocorrelation for the current can be written
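for one electron as (e²/l²)⟨v²⟩ e^{-|τ|/t_c}; assuming the N conduction electrons in the resistor fluctuate independently (N is introduced here for the number of carriers and reappears below), the contributions add to give

\[
R_i(\tau) = N\,\frac{e^2}{l^2}\,\langle v^2\rangle\, e^{-|\tau|/t_c} = \frac{N e^2 kT}{m l^2}\, e^{-|\tau|/t_c} .
\]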

The remaining microscopic quantity is t_c, the collision time, and this is related to the resistance of the material through which the electron moves. An applied electric field induces a steady current corresponding to a non-statistical velocity, referred to as the drift velocity v_d. This is obtained from the equation
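of motion with the field term included,

\[
m\frac{dv_d}{dt} = eE - a v_d ,
\]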

where E is the applied electric field. The steady state solution is
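\[
v_d = \frac{eE}{a} = \frac{e\, t_c}{m}\, E ,
\]

obtained by setting dv_d/dt = 0 and using t_c = m/a.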

The steady state current is thus
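\[
I = \frac{N e v_d}{l} = \left(\frac{N e^2 t_c}{m l}\right) E = \left(\frac{N e^2 t_c}{m l^2}\right) V ,
\]

I denoting the mean current carried by the N electrons,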

where the last equality follows from E = V/l, with V the applied voltage across the resistor. The bracketed quantity is obviously the inverse of the resistance R.

From Eqn (9.13) this gives
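an expression for the collision time in terms of macroscopic quantities,

\[
\frac{N e^2 t_c}{m l^2} = \frac{1}{R} , \qquad \text{i.e.} \qquad t_c = \frac{m l^2}{N e^2 R} .
\]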

From Eqns (9.8) and (9.17)
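the microscopic factors cancel, leaving the current autocorrelation and, on transforming with the per-unit-frequency convention adopted earlier, the corresponding spectrum:

\[
R_i(\tau) = \frac{kT}{R\, t_c}\, e^{-|\tau|/t_c} , \qquad S_i(f) = \frac{4kT}{R}\left[\frac{1}{1 + (\omega t_c)^2}\right] .
\]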

For extremely short collision times, say 10⁻¹² s, the bracketed term is unity for all reasonable frequencies. This result is often expressed as a noise power per unit bandwidth of 4kT, since the power is R⟨i²⟩. An obvious but nevertheless important fact is that the noise power depends directly upon the temperature. In measurements requiring extremely low noise contributions, such as radio astronomy or nuclear spectroscopy, it is standard practice to cool the input stage of the amplifiers used.
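To give a feel for the magnitudes involved, the equivalent open-circuit voltage form of the same result is a mean square voltage per unit bandwidth of 4kTR, so that for the illustrative values R = 1 kΩ, T = 300 K and a bandwidth Δf = 1 MHz (chosen here purely as an example),

\[
v_{\mathrm{rms}} = \sqrt{4kTR\,\Delta f} \approx \sqrt{4 \times 1.38\times 10^{-23} \times 300 \times 10^{3} \times 10^{6}} \approx 4\ \mu\mathrm{V} .
\]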

9.2 Shot Noise

A quite different natural source of fluctuations arises from the fact that charge is fundamentally quantized in units of the electronic charge e. It is rather amusing to consider that, from this viewpoint, there is in electrical measurements no such thing as an analogue signal, since all voltages, across capacitors for example, must be generated from an integral number of charges, so the charge can always be written Q = Ne, where N is an integer. Since N is typically of the order of 10¹², the quantization is not noticed in common practice. At very low signal levels this is no longer the case, and the quantization appears as a noise fluctuation.

The measurement of current is an implicit count-rate measurement, determining the number of electrons per unit time detected by the ammeter. Assume the collection time of the instrument is t_m. Then the current is
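\[
I = \frac{N e}{t_m} ,
\]

where N is the number of electrons collected in the time t_m and I denotes the measured (mean) current.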

Now the quantity N follows the counting, or Poisson, distribution and so has a variance equal to its mean, N. This result is often expressed by saying that the error in a count is the square root of the count. Making use of this fact, it follows that
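\[
\langle i^2 \rangle = \frac{e^2 \langle (\Delta N)^2 \rangle}{t_m^{2}} = \frac{e^{2} N}{t_m^{2}} = \frac{e I}{t_m} ,
\]

with ΔN the fluctuation of the count about its mean.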

Here i represents the current fluctuation about the mean. Data taken within one measurement interval are correlated; data taken in a new interval are not. Recalling that the autocorrelation function must be symmetrical about the origin, and taking the correlation to span the measurement period, then
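a simple model consistent with these requirements is the triangular form

\[
R_i(\tau) = \frac{eI}{t_m}\left(1 - \frac{|\tau|}{t_m}\right) \quad \text{for } |\tau| \le t_m , \qquad R_i(\tau) = 0 \quad \text{for } |\tau| > t_m ,
\]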

giving the shot noise result
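per unit bandwidth, in its familiar form

\[
S_i(f) \approx 2 e I
\]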

for ω t_m << 1, i.e. for frequencies small compared with 1/t_m.

This type of noise differs considerably from Johnson noise: it requires the existence of a current flow, it is independent of temperature, and the noise power depends on the resistance through which the current flows.


9.3 Summary

Random signals are described in the time domain by their autocorrelation function. The value at the time origin is the mean square fluctuation of the signal. This value is also the area under the power spectrum, which describes the distribution of the mean square fluctuation over frequency. For the examples above, and in general, the autocorrelation is non-zero only for a very short time, essentially a correlation time, usually < 10⁻¹² s. As a consequence the power spectrum is spread over a very wide range of frequencies and is approximated as a "white spectrum" with constant value independent of frequency. This is equivalent to approximating the autocorrelation function by a delta function.

Noise signals may be classified by their power spectra, the most common variations being 1/fⁿ with n = 0, 1 or 2. The 1/f variation has been observed in a wide variety of phenomena, from river level fluctuations to frequency fluctuations in atomic clocks, and is the subject of much study and interest. The major point, however, is that the noise power is widely distributed in frequency. This characteristic often determines the manner in which deterministic signals are processed.