**The Baum-Welch Algorithm**

The Baum-Welch algorithm was developed by Prof. Lloyd Welch prior to his joining the CSI faculty. Prof. Welch developed this algorithm in collaboration with his colleague, Dr. Leonard Baum, while both were at the Institute for Defense Analyses (IDA). The Baum-Welch algorithm optimizes the parameters of a finite-state machine (FSM) model using empirical observations. It is a two-step process: step one computes a posteriori probabilities under the current model, and step two re-estimates the model parameters from those probabilities. The Baum-Welch algorithm has become an important tool in many fields, speech recognition in particular. It has also served as motivation for other algorithms. One example is the so-called Expectation-Maximization (EM) algorithm, of which the Baum-Welch algorithm is a specific case. Another is the forward-backward algorithm (FBA, or BCJR algorithm), which is step one of the Baum-Welch algorithm. The FBA has gained recent notoriety as a key component of the turbo-decoding algorithm.
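The two steps can be sketched for a discrete-observation hidden Markov model, a common FSM instance. This is a minimal illustration, not the original formulation; the function name and the per-step scaling are choices made here for numerical stability:

```python
import numpy as np

def baum_welch_step(A, B, pi, obs):
    """One Baum-Welch iteration for a discrete-observation HMM (a sketch).
    A: (S,S) transition probabilities, B: (S,V) emission probabilities,
    pi: (S,) initial distribution, obs: list of observation indices.
    Returns re-estimated (A, B, pi)."""
    S, T = A.shape[0], len(obs)

    # Step 1: forward-backward pass (the FBA/BCJR step) to compute
    # a posteriori probabilities, with per-step scaling for stability.
    alpha = np.zeros((T, S)); beta = np.zeros((T, S)); c = np.zeros(T)
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum(); alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum(); alpha[t] /= c[t]
    beta[T - 1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]

    gamma = alpha * beta                      # P(state_t = i | observations)
    gamma /= gamma.sum(axis=1, keepdims=True)
    xi = np.zeros((T - 1, S, S))              # P(state_t=i, state_{t+1}=j | obs)
    for t in range(T - 1):
        xi[t] = alpha[t][:, None] * A * (B[:, obs[t + 1]] * beta[t + 1])[None, :]
        xi[t] /= xi[t].sum()

    # Step 2: re-estimate the model parameters from the posteriors.
    new_pi = gamma[0]
    new_A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    new_B = np.zeros_like(B)
    for v in range(B.shape[1]):
        new_B[:, v] = gamma[np.array(obs) == v].sum(axis=0)
    new_B /= gamma.sum(axis=0)[:, None]
    return new_A, new_B, new_pi
```

Each iteration is guaranteed not to decrease the likelihood of the observations, so repeating the step converges to a local maximum of the model likelihood.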

Prof. Golomb, while at the Glenn L. Martin Company in the early 1950s, was one of the pioneers in the study of pseudorandom sequences of bits and their application to radar and communications. After joining USC's faculty in 1963, Professor Golomb published the first book devoted exclusively to pseudorandom sequences, the classic Shift Register Sequences. Pseudorandom sequences, also known as pseudonoise sequences or shift-register sequences, have played increasingly important roles as digital communication technology has evolved. Today, shift register generation of pseudorandom sequences is:

- the underlying technique for CDMA (Code Division Multiple Access) communication, emerging as the preferred standard (IS-95) for digital cellular communication;
- the principal method of key generation for stream ciphers, used extensively in cryptography;
- the basis for direct-sequence spread-spectrum communications, used by the military for secure guidance of missiles;
- the main technique for generating long-duration signals for CW radar, to obtain radar echoes from very remote objects; and
- a built-in test capability designed into many types of high-density integrated circuits.
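As a toy illustration of the construction studied in Shift Register Sequences, the sketch below assumes a 4-stage Fibonacci LFSR with feedback polynomial x^4 + x^3 + 1, which is primitive and therefore yields a maximal-length sequence of period 2^4 - 1 = 15; the function name and tap convention are choices made here:

```python
def lfsr_sequence(taps, seed, n):
    """Generate n bits from a Fibonacci LFSR (a sketch).
    taps: 1-indexed stage numbers whose XOR forms the feedback bit.
    seed: nonzero initial register contents as a list of bits.
    taps=[4, 3] realizes x^4 + x^3 + 1, a primitive polynomial, so the
    4-stage register cycles through all 15 nonzero states."""
    state = list(seed)
    out = []
    for _ in range(n):
        out.append(state[-1])              # output the last stage
        fb = 0
        for t in taps:
            fb ^= state[t - 1]             # XOR the tapped stages
        state = [fb] + state[:-1]          # shift, feeding back the new bit
    return out
```

One full period of a maximal-length sequence exhibits Golomb's balance property: it contains 2^(m-1) ones and 2^(m-1) - 1 zeros (here, eight ones and seven zeros).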

In 1960, prior to joining the USC faculty, Prof. Irving Reed invented an elegant and powerful algebraic error correction code. Prof. Reed conducted this work with his colleague, Dr. Gus Solomon, while both were at MIT Lincoln Labs. For years these Reed-Solomon (RS) codes were admired for their mathematical beauty but considered too complex for practical use. In the 1970s and 1980s, RS codes, used as outer codes with convolutional inner codes, formed the most powerful error correction method available and were used in numerous deep-space exploration modems. In the 1990s, RS codes became ubiquitous as the error correction method on compact discs.

Prof. Bob Scholtz is recognized as the primary academic figure in the UWB community, having written the earliest papers, obtained the first federal research grants, and organized the first workshops in the mid-1990s. Today, UWB systems are nearing standardization as part of the IEEE 802.15.3 group. Prof. Scholtz's first Ph.D. student working on UWB systems, Moe Win, is currently a faculty member at MIT. To learn more about Prof. Scholtz's UWB work, visit the Ultrawideband Radio Lab.

The Viterbi algorithm was developed by Dr. Andrew Viterbi in 1968 while he was a faculty member at UCLA, just a few years after receiving his Ph.D. in EE at USC. The Viterbi algorithm (VA) is the optimal decoder for a convolutional code and also solves several related problems in communications (e.g., data detection in intersymbol interference channels). The discovery and popularization of the VA coincided with the realization of large-scale integrated circuitry. This was fortuitous because the VA maps nicely onto digital hardware architectures. Dr. Viterbi and his associates recognized these capabilities and founded Linkabit, and later Qualcomm, essentially launching the "telecom valley" in North County San Diego.
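A minimal sketch of the VA in its convolutional-decoding role, assuming the standard rate-1/2, constraint-length-3 code with generators (7,5) in octal and hard-decision Hamming branch metrics; the function names are illustrative:

```python
def conv_encode(bits):
    """Rate-1/2, K=3 convolutional encoder with generators (7,5) octal."""
    s1 = s2 = 0                            # the two most recent input bits
    out = []
    for b in bits:
        out += [b ^ s1 ^ s2, b ^ s2]       # g1 = 111, g2 = 101
        s1, s2 = b, s1
    return out

def viterbi_decode(rx, n_bits):
    """Hard-decision Viterbi decoding: keep, per state, the surviving path
    whose re-encoded output is closest in Hamming distance to rx."""
    INF = float('inf')
    metric = [0, INF, INF, INF]            # encoder starts in the zero state
    paths = [[], [], [], []]
    for t in range(n_bits):
        r = rx[2 * t: 2 * t + 2]
        new_metric = [INF] * 4
        new_paths = [None] * 4
        for s in range(4):                 # s encodes 2*x(t-1) + x(t-2)
            if metric[s] == INF:
                continue
            x1, x2 = s >> 1, s & 1
            for b in (0, 1):               # hypothesized input bit
                expect = [b ^ x1 ^ x2, b ^ x2]
                m = metric[s] + (expect[0] != r[0]) + (expect[1] != r[1])
                ns = 2 * b + x1            # next trellis state
                if m < new_metric[ns]:     # survivor selection
                    new_metric[ns] = m
                    new_paths[ns] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    best = min(range(4), key=lambda s: metric[s])
    return paths[best]
```

With a clean received sequence the zero-distance path is the transmitted one, and the code's free distance of 5 lets the decoder correct an isolated channel bit error.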

Code-Division Multiple-Access (CDMA) is the preferred method of allowing multiple users to share the same cellular communication channel in third-generation (3G) cellular communication systems. In a CDMA system, the signals of different users are kept separate by assigning each user a distinct scrambling code. A family of efficient scrambling codes, introduced in a 1996 paper authored by P. V. Kumar, T. Helleseth, A. R. Calderbank and A. R. Hammons, Jr., is in use as the short scrambling code of the 3G W-CDMA standard. While traditional scrambling codes employ a binary alphabet, the scrambling codes of Kumar et al. have a quaternary alphabet that is well suited to the quaternary nature of the signal modulation employed in third-generation cellular communication. This family of scrambling codes also has large cardinality, enabling scrambling codes to be assigned to a large number of potential users.
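The code-separation idea can be illustrated with a toy two-user example using short orthogonal (Walsh) codes rather than the quaternary family described above; the names and code lengths here are illustrative only:

```python
def despread(received, code):
    """Correlate the composite received signal against one user's code.
    With orthogonal codes, the other user's contribution cancels and the
    sign of the correlation recovers this user's data bit (+1 or -1)."""
    corr = sum(r * c for r, c in zip(received, code))
    return 1 if corr > 0 else -1

# Toy example: two users share the channel via orthogonal Walsh codes.
walsh1 = [1, 1, 1, 1]
walsh2 = [1, -1, 1, -1]
bit1, bit2 = -1, 1
received = [bit1 * a + bit2 * b for a, b in zip(walsh1, walsh2)]
```

Despreading `received` with `walsh1` recovers `bit1`, and with `walsh2` recovers `bit2`, because the cross-correlation between the two codes is zero.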

Prof. Kumar, working with his Ph.D. student Roger Hammons, Jr., discovered that certain well-known nonlinear binary error-correction codes could be viewed as linear codes over a 4-ary signal alphabet. Kumar and Hammons published this result in 1994 in a joint paper with researchers at AT&T Bell Laboratories and the Centre National de la Recherche Scientifique (CNRS) in France who were obtaining similar results. The paper received the IEEE Transactions on Information Theory Prize Paper Award. The work generated considerable interest and became the catalyst for many research efforts in the field. Part of the interest was due to long-standing questions in the field that the research answered in an elegant manner. Specifically, it was well known that the nonlinear binary Kerdock and Preparata codes satisfied certain code-duality relations otherwise guaranteed only for linear codes. By showing that these nonlinear binary codes can be viewed as projections of linear Z4 codes, the paper explained these properties logically.
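The key construction behind this result is the Gray map from Z4 symbols to bit pairs (0→00, 1→01, 2→11, 3→10), which is an isometry from the Lee metric on Z4 words to the Hamming metric on binary words. A small sketch, with illustrative function names:

```python
import itertools

# Gray map: adjacent Z4 symbols differ in exactly one output bit.
GRAY = {0: (0, 0), 1: (0, 1), 2: (1, 1), 3: (1, 0)}

def gray_image(word):
    """Map a Z4 word to its binary Gray image (the length doubles)."""
    return [b for q in word for b in GRAY[q]]

def lee_distance(x, y):
    """Lee distance on Z4^n: each coordinate contributes min(d, 4 - d)."""
    return sum(min((a - b) % 4, (b - a) % 4) for a, b in zip(x, y))
```

Because the map preserves distances, a linear Z4 code with good Lee-distance properties projects to a (generally nonlinear) binary code with the same Hamming-distance properties, which is how the Kerdock and Preparata codes arise.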

The architectural design of the Ku-band Space Shuttle radar was developed by Prof. Charles Weber for the NASA Manned Spacecraft Center. The short-range and docking radar was designed to track its targets from a range as short as 30 meters, where station-keeping and rendezvous take place, out to distances of 300 nautical miles. Target parameters measured by the radar during the tracking phase include range and range rate, and angle and angle rate (elevation and azimuth). The system is a coherent pulse-Doppler radar with a peak power of 60 watts. The radar has enjoyed an excellent performance record throughout the Shuttle Program.

The Viterbi Algorithm is known to provide optimal data decisions for modulation, coding, and propagation channels that are modeled as finite state machines (FSMs). In many modern communications systems, however, the parameters that define the transition probabilities are unknown or time varying. Per-Survivor Processing (PSP) is a term coined by Andreas Polydoros in 1990 while he was a professor at CSI. Prof. Polydoros and Visiting Prof. Riccardo Raheli, of the U. of Pisa, Italy, developed PSP as a general tool for enabling a Viterbi-like algorithm to acquire or track changing channel conditions. Prof. Keith Chugg performed some of the basic analysis and theoretical justification of PSP and later, along with then-CSI Ph.D. student Achilleas Anastasopoulos, generalized the notion to adaptive soft-in/soft-out decoding and data detection algorithms. This related advance enabled adaptive iterative detection (AID) and decoding algorithms that effectively estimate and track unknown or time-varying channel parameters, such as carrier phase dynamics, in turbo-like codes. PSP and AID have been implemented in a number of modems designed for mobile and satellite communication systems.

Accurate synchronization plays a cardinal role in the efficient utilization of any spread-spectrum system. Typically, the process of synchronization between the spreading (incoming) pseudonoise code and the local despreading (receiver) code is performed in two steps: first, code acquisition, then tracking via one of the available code-tracking loops. Professors Andreas Polydoros (CSI alumnus and former CSI faculty member, now with the University of Athens, Greece) and Charles Weber developed a unified approach to serial-search spread-spectrum code acquisition, which consolidated several earlier efforts into one general theory. The theory is formulated in a general manner that allows significant freedom in the receiver modeling. The method has been widely used and adopted by most textbooks in the field. Various spread-spectrum synchronization systems are special cases of this unified approach.

Constructing orthogonal functions on a finite interval of the real line is a well-known technique in electrical engineering. If the real coordinate is time, these functions must be synchronous (occupy the same time interval) to demonstrate orthogonality when used as inputs on the arms of a correlator. Functions that are not synchronous yet retain nearly orthogonal properties are the modulation building blocks for spread-spectrum multiple-access systems. Prof. Lloyd Welch, in his classic paper "Lower Bounds on the Maximum Cross Correlation of Signals," produced a lower bound on asynchronous correlation that is a function only of the dimension of the signal space and the number of signals desired in the signal set. The bound is often nearly achievable by good signal-set design, and has become "the Welch bound," the standard by which spread-spectrum signal-set designs are judged. For more on this, see the original paper in the IEEE Trans. on Information Theory, pp. 397-399, May 1974. For more on the Welch bound and its use in spread-spectrum signal design, see Prof. Scholtz's Chapter 5 in The Spread Spectrum Communications Handbook by M. K. Simon et al.
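For N unit-energy signals in a d-dimensional signal space, the first Welch bound states that the maximum cross-correlation magnitude c_max satisfies c_max >= sqrt((N - d) / (d (N - 1))). A small numeric sketch, with illustrative function names:

```python
import numpy as np

def welch_bound(n_signals, dim):
    """Welch's lower bound on the maximum cross-correlation magnitude
    of n_signals unit-energy signals in a dim-dimensional space."""
    N, d = n_signals, dim
    return ((N - d) / (d * (N - 1))) ** 0.5

def max_cross_correlation(S):
    """Largest |inner product| between distinct unit-norm rows of S."""
    G = S @ S.conj().T                 # Gram matrix of the signal set
    np.fill_diagonal(G, 0.0)           # ignore each signal's self-correlation
    return float(np.abs(G).max())
```

Any signal set, however cleverly designed, must have its worst-case cross-correlation at or above this bound; signal-set designs are judged by how closely they approach it.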