The physical layer is the lowest layer in almost all reference models of
computer networks.
2.1. The Theoretical Basis for Data Communication
Information is transmitted on wires by varying some physical property such as voltage or current. Let f(t) be a function of time representing the value of this voltage or current; this function models the behavior of the signal and allows it to be analyzed mathematically.
2.1.1. Fourier Analysis
Any reasonably behaved periodic function, g(t), with period T can be expressed as the Fourier series

g(t) = c/2 + Σ_{n=1}^{∞} a_n sin(2πnft) + Σ_{n=1}^{∞} b_n cos(2πnft)

where f = 1/T is the fundamental frequency and a_n and b_n are the sine and cosine amplitudes of the nth harmonic. The values of c, a_n, and b_n can be computed from the following equations:

a_n = (2/T) ∫_0^T g(t) sin(2πnft) dt
b_n = (2/T) ∫_0^T g(t) cos(2πnft) dt
c = (2/T) ∫_0^T g(t) dt
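As a quick numerical sketch of how these integrals can be evaluated in practice (the function name fourier_coefficients, the sample count, and the test waveform are illustrative choices, not part of the original text):

```python
import numpy as np

def fourier_coefficients(g, T, n_max, samples=8000):
    """Approximate c, a_n, b_n (n = 1..n_max) of a T-periodic function g(t)
    by evaluating the integrals above with a simple Riemann sum."""
    t = np.linspace(0.0, T, samples, endpoint=False)
    dt = T / samples
    f = 1.0 / T                                    # fundamental frequency
    c = (2.0 / T) * np.sum(g(t)) * dt
    a = np.array([(2.0 / T) * np.sum(g(t) * np.sin(2 * np.pi * n * f * t)) * dt
                  for n in range(1, n_max + 1)])
    b = np.array([(2.0 / T) * np.sum(g(t) * np.cos(2 * np.pi * n * f * t)) * dt
                  for n in range(1, n_max + 1)])
    return c, a, b

# Example: a square wave that is 1 on the first half of each period, 0 on the second.
c, a, b = fourier_coefficients(lambda t: (t % 1.0 < 0.5).astype(float), T=1.0, n_max=5)
print(c)   # about 1.0
print(a)   # about [2/pi, 0, 2/(3*pi), 0, 2/(5*pi)]
print(b)   # about [0, 0, 0, 0, 0]
```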
2.1.2. Bandwidth-Limited Signals
Consider an example: the transmission of the ASCII character b, encoded in an 8-bit byte as 01100010. The voltage output of the transmitting computer is shown in Fig. 2-1(a). The Fourier analysis of this signal yields the coefficients

a_n = (1/πn) [cos(πn/4) − cos(3πn/4) + cos(6πn/4) − cos(7πn/4)]
b_n = (1/πn) [sin(3πn/4) − sin(πn/4) + sin(7πn/4) − sin(6πn/4)]
c = 3/4
The values a_n² + b_n² are of interest because they are proportional to the energy transmitted at the corresponding frequency; their square roots are the root-mean-square Fourier amplitudes shown in Fig. 2-1(a).
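As a small check on these coefficients (a sketch, not from the original text; the range of harmonics printed is an arbitrary choice), the following code evaluates the closed-form a_n, b_n, and c for 01100010 and prints sqrt(a_n² + b_n²), the quantity plotted in Fig. 2-1(a):

```python
import numpy as np

# Closed-form Fourier coefficients of the bit pattern 01100010: the signal is 1
# during [T/8, 3T/8) and [6T/8, 7T/8), and 0 elsewhere.
def a_n(n):
    return (1.0 / (np.pi * n)) * (np.cos(np.pi * n / 4) - np.cos(3 * np.pi * n / 4)
                                  + np.cos(6 * np.pi * n / 4) - np.cos(7 * np.pi * n / 4))

def b_n(n):
    return (1.0 / (np.pi * n)) * (np.sin(3 * np.pi * n / 4) - np.sin(np.pi * n / 4)
                                  + np.sin(7 * np.pi * n / 4) - np.sin(6 * np.pi * n / 4))

c = 3.0 / 4.0   # the signal is high during 3 of the 8 bit intervals

# Root-mean-square amplitude of each harmonic; its square is proportional
# to the energy transmitted at that frequency.
for n in range(1, 9):
    rms = np.sqrt(a_n(n) ** 2 + b_n(n) ** 2)
    print(f"harmonic {n}: rms amplitude = {rms:.3f}")
```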
No transmission facility can transmit a signal without losing some power in the process. All transmission facilities diminish different Fourier components by different amounts, thus introducing distortion. If all Fourier components were equally diminished, the resulting signal would be reduced in amplitude but not distorted, i.e., it would have the same nice squared-off shape as in Fig. 2-1(a). Usually, the amplitudes are transmitted undiminished from 0 up to some frequency fc (measured in cycles/sec, or Hertz (Hz)), with all frequencies above this cutoff frequency strongly attenuated (either as a consequence of a physical property of the transmission medium or because a filter has been intentionally introduced).
Fig. 2-1(b)-(e) show how the signal of Fig. 2-1(a) would look if the bandwidth were so low that only the lowest frequencies (i.e., the first few harmonics) were transmitted.
Fig. 2-1. (a) A binary signal and its root-mean-square Fourier amplitudes. (b)-(e) Successive approximations to the original signal.
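To make the effect of limited bandwidth concrete, here is a sketch (the names and sample counts are my own choices, not from the original text) that rebuilds the 01100010 waveform from only its first few harmonics, in the spirit of Fig. 2-1(b)-(e):

```python
import numpy as np

T = 1.0                            # character time (arbitrary units)
f = 1.0 / T                        # fundamental frequency
bits = [0, 1, 1, 0, 0, 0, 1, 0]    # ASCII 'b'

def g(t):
    """Original binary waveform: bit k occupies [k*T/8, (k+1)*T/8)."""
    k = np.floor((t % T) / (T / 8)).astype(int)
    return np.array(bits)[k].astype(float)

def coeff(n, trig):
    """Numerically integrate (2/T) * g(t) * trig(2*pi*n*f*t) over one period."""
    t = np.linspace(0.0, T, 8000, endpoint=False)
    return (2.0 / T) * np.mean(g(t) * trig(2 * np.pi * n * f * t)) * T

def reconstruct(t, n_harmonics):
    """Partial Fourier sum: c/2 plus the first n_harmonics sine/cosine terms."""
    y = 0.5 * coeff(0, np.cos)     # coeff(0, cos) = c, since cos(0) = 1
    for n in range(1, n_harmonics + 1):
        y = y + coeff(n, np.sin) * np.sin(2 * np.pi * n * f * t) \
              + coeff(n, np.cos) * np.cos(2 * np.pi * n * f * t)
    return y

t = np.linspace(0.0, T, 400, endpoint=False)
for n_harmonics in (1, 2, 4, 8):   # roughly the cases of Fig. 2-1(b)-(e)
    approx = reconstruct(t, n_harmonics)
    rms_err = np.sqrt(np.mean((approx - g(t)) ** 2))
    print(f"{n_harmonics} harmonic(s): rms error = {rms_err:.3f}")
```

The rms error shrinks as more harmonics are included, which is the numerical counterpart of the successive approximations in the figure.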
The time T required to transmit a character depends on both the encoding method and the signaling speed, i.e., the number of times per second that the signal changes its value (e.g., its voltage).
The number of signal changes per second is measured in baud. A b-baud line does not necessarily transmit b bits/sec, since each signal change may convey several bits. If the voltage levels 0, 1, 2, ..., 7 were used, each signal value could convey 3 bits, so the bit rate would be three times the baud rate. In our example, only 0s and 1s are used as signal levels, so the bit rate is equal to the baud rate.
If the bit rate is b bits/sec, the time to send an 8-bit character is 8/b sec, so characters are sent at a rate of b/8 per second and the first (fundamental) harmonic of the signal is at b/8 Hz. If the channel has a cutoff frequency of f Hz, the number of the highest harmonic passed through the channel is f/(b/8).
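A minimal sketch of these relations (the function names and sample numbers are illustrative, not from the original text):

```python
import math

def bit_rate(baud, levels):
    """Bits/sec carried by a line signalling at `baud` changes/sec with `levels` signal levels."""
    return baud * math.log2(levels)

def highest_harmonic_passed(bit_rate_bps, cutoff_hz, bits_per_char=8):
    """Number of the highest harmonic of an 8-bit character signal below the cutoff."""
    fundamental = bit_rate_bps / bits_per_char    # first harmonic, in Hz
    return int(cutoff_hz / fundamental)           # harmonics sit at n * fundamental

print(bit_rate(2400, 8))                    # 3 bits per change -> 7200.0 bits/sec
print(highest_harmonic_passed(9600, 3000))  # only 2 harmonics get through a 3000-Hz line
```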
Example: An ordinary telephone line (often called a voice-grade line) has an artificially introduced cutoff frequency near 3000 Hz. The number of the highest harmonic passed through such a line is shown in Fig. 2-2 for several data rates.
Fig. 2-2. Relation between data rate and harmonics.
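Since the figure itself is not reproduced here, the following sketch regenerates the kind of table Fig. 2-2 contains for a 3000-Hz voice-grade line; the particular data rates listed are assumptions, not values taken from the original figure.

```python
CUTOFF_HZ = 3000       # cutoff of a voice-grade line
BITS_PER_CHAR = 8      # one character per 8-bit byte

print(f"{'bps':>8} {'T (msec)':>10} {'1st harmonic (Hz)':>18} {'# harmonics sent':>17}")
for bps in (300, 600, 1200, 2400, 4800, 9600, 19200, 38400):
    char_time_ms = 1000.0 * BITS_PER_CHAR / bps   # time to send one character
    first_harmonic = bps / BITS_PER_CHAR          # fundamental frequency in Hz
    harmonics_sent = int(CUTOFF_HZ / first_harmonic)
    print(f"{bps:>8} {char_time_ms:>10.2f} {first_harmonic:>18.1f} {harmonics_sent:>17}")
```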
Sophisticated coding schemes that use several voltage levels do exist and
can achieve higher data rates than 38.4 kbps.
2.1.3. The Maximum Data Rate of a Channel
In 1924, H. Nyquist derived an equation expressing the maximum data rate of a finite-bandwidth noiseless channel.
Nyquist proved that if an arbitrary signal has been run through a low-pass filter of bandwidth H, the filtered signal can be completely reconstructed by making only 2H (exact) samples per second. Sampling the line faster than 2H times per second is pointless because the higher-frequency components that such sampling could recover have already been filtered out. If the signal consists of V discrete levels, Nyquist's theorem states:
maximum data rate = 2H log2 V bits/sec
For example, a noiseless 3-kHz channel cannot transmit a binary (i.e., two-level) signal at a rate exceeding 6000 bps.
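A one-line check of the Nyquist limit (the helper name is illustrative):

```python
import math

def nyquist_max_bps(bandwidth_hz, levels):
    """Nyquist limit for a noiseless channel: 2 * H * log2(V) bits/sec."""
    return 2 * bandwidth_hz * math.log2(levels)

print(nyquist_max_bps(3000, 2))   # 6000.0  -> binary signal over a 3-kHz channel
print(nyquist_max_bps(3000, 4))   # 12000.0 -> four levels double the rate
```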
If noise is present, the situation deteriorates. If we denote the signal power by S and the noise power by N, the signal-to-noise ratio is S/N; it is usually quoted as the quantity 10 log10 S/N, given in decibels (dB).
In 1948, Claude Shannon extended Nyquist's work as follows: the maximum data rate of a noisy channel whose bandwidth is H Hz, and whose signal-to-noise ratio is S/N, is given by
maximum number of bits/sec = H log2 (1 + S/N)
For example, a channel of 3000-Hz bandwidth and a signal-to-thermal-noise ratio of 30 dB (S/N = 1000) can never transmit much more than 30,000 bps, no matter how many or how few signal levels are used. Shannon's result was derived using information-theory arguments and applies to any channel subject to Gaussian (thermal) noise.
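And a corresponding check of the Shannon limit (again a sketch; the helper name is my own):

```python
import math

def shannon_max_bps(bandwidth_hz, snr_db):
    """Shannon limit: H * log2(1 + S/N), with the signal-to-noise ratio given in dB."""
    snr = 10 ** (snr_db / 10)      # convert dB back to a power ratio
    return bandwidth_hz * math.log2(1 + snr)

print(shannon_max_bps(3000, 30))   # about 29,900 bps for a voice-grade line
```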