1. Information capacity of time-continuous channels
- Author
R. Johnson and R. Huang
- Subjects
Discrete mathematics, Stochastic resonance, Noise spectral density, Spectral density, White noise, Library and Information Sciences, Computer Science Applications, Channel capacity, Shannon–Hartley theorem, Additive white Gaussian noise, Gaussian noise, Statistics, Information Theory, Information Systems, Mathematics
- Abstract
The maximum average mutual information between the observation of the channel output $y(t)$ over the time interval $[T_3, T_4]$ and the signal (input) $s(t)$ over the interval $[T_1, T_2]$ is taken as the definition of channel capacity for the time-continuous case. When the channel introduces additive independent Gaussian noise of known correlation function, the capacity is evaluated subject to the constraint that the signal process have a given correlation function. For this evaluation a new joint expansion of the processes $y(t)$ and $s(t)$ is introduced which has the property that all coefficients in the expansion are uncorrelated. The expansion is thus a generalization of the Karhunen–Loève expansion, to which it reduces when the noise is white and the time intervals coincide. The channel capacity is shown to be directly related to results in the theory of optimum filtering over a finite time interval. Closed-form results for the capacity of several channels are given, as well as some limiting expressions and bounds. For the case of white noise of spectral density $N_0$, the capacity is always bounded by $\bar{E}/N_0$, where $\bar{E}$ is the average signal energy.
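The quoted white-noise bound can be seen from the Karhunen–Loève coordinates. The sketch below is not taken from the paper itself; it assumes the usual convention that $N_0$ is the one-sided noise spectral density (two-sided $N_0/2$), so the channel splits into independent scalar Gaussian channels with signal energies $\lambda_i$ summing to $\bar{E}$, and the elementary inequality $\ln(1+x) \le x$ gives the bound in nats:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Sketch of the bound C <= \bar{E}/N_0 (in nats), assuming one-sided
% noise density N_0 (two-sided N_0/2). In Karhunen-Loeve coordinates
% the channel decomposes into independent scalar Gaussian channels
% with signal energies \lambda_i, \sum_i \lambda_i = \bar{E}.
\begin{align*}
  I &= \sum_i \tfrac{1}{2}\ln\!\left(1 + \frac{\lambda_i}{N_0/2}\right)
     && \text{(per-coordinate Gaussian mutual information)} \\
    &\le \sum_i \frac{\lambda_i}{N_0}
     && \text{(since } \ln(1+x) \le x \text{)} \\
    &= \frac{\bar{E}}{N_0}.
\end{align*}
\end{document}
```

Since $\ln(1+x) \approx x$ only for small $x$, the bound is approached when the signal energy is spread thinly over many coordinates, which is consistent with the abstract stating it as a bound rather than an achieved value.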
- Published
1962