Packet Speech on the ARPANET: A History of Early LPC Speech and Its Accidental Impact on the Internet Protocol

Robert M. Gray
Information Systems Laboratory, Department of Electrical Engineering, Stanford, CA 94305
Research Triangle Park, 11 June 2007

The author's work in speech was partially supported by the National Science Foundation. Thanks to J. D. Markel, A. H. "Steen" Gray, Jr., John Burg, Charlie Davis, Mike McCammon, Danny Cohen, Steve Casner, Richard Wiggins, Vishu Viswanathan, Jim Murphy, Cliff Weinstein, Joseph P. Campbell, Randy Cole, Rich Dean, Vint Cerf, and Bob Kahn.

http://ee.stanford.edu/~gray/lpcip.html
http://ee.stanford.edu/~gray/dl.html

Origins of this talk

Special Workshop in Maui (SWIM), 12 January 2004.

Part I: Linear Prediction & Speech

Observe a data sequence $\{X_0, X_1, \ldots, X_{m-1}\}$; guess $X_m$.

• Optimal one-step prediction: What is the optimal predictor of the form $\tilde X_m = p(X_0, \ldots, X_{m-1})$?
• Optimal one-step linear prediction: What is the optimal linear predictor of the form $\tilde X_m = -\sum_{l=1}^{m} a_l X_{m-l}$?
• Modeling/density estimation: What is the probability density function (pdf) that "best" models $X_m$?
• Spectrum estimation: What is the "best" estimate of the power spectral density or covariance of the underlying random process?

The Application: Speech Coding

How can linear prediction be applied to produce low bit rate speech of sufficient quality for speech understanding and speaker recognition? E.g., reproduce the waveform (waveform coding) or use the model to synthesize (voice coding).

A wide literature exists on all of these topics in a speech context, and the topics are intimately related. See, e.g., J. Makhoul's classic survey [34] and J. D. Markel and A. H. Gray Jr.'s classic book [39]. The problems are ill posed unless one defines terms like "optimal" and assumes some structure.

Optimal Prediction

Random vector $X^m = (X_0, X_1, \ldots, X_{m-1})^t$. Correlations $r_{i,j} = E[X_i X_j]$, $R_n = \{r_{i,j};\ i, j = 0, 1, \ldots, n-1\}$.

What is the best $\tilde X_m = p(X^m)$ yielding the minimum $E[(X_m - \tilde X_m)^2]$?

Answer: $\tilde X_m = E[X_m \mid X^m]$, with MMSE $= \alpha_m = \sigma^2_{X_m \mid X^m}$.

If $X^{m+1}$ is Gaussian, then $E[X_m \mid X^m] = (a_m, \ldots, a_2, a_1)\, X^m$, where $(a_m, \ldots, a_2, a_1) = (r_{m,0}, r_{m,1}, \ldots, r_{m,m-1})\, R_m^{-1}$ and $\alpha_m = |R_{m+1}| / |R_m|$.

⇒ Form and performance are determined entirely by $R_{m+1}$.
⇒ The optimal predictor is linear.
⇒ The optimal linear predictor equals the optimal predictor.

Optimal Linear Prediction

$\tilde X_m = -\sum_{l=1}^{m} a_l X_{m-l}$ ⇒ MSE $= E[\epsilon_m^2] = a^t R_{m+1} a$, where $a \triangleq (a_0 = 1, a_1, \ldots, a_m)^t$, whether or not the process is Gaussian.

$\operatorname{argmin}_{a:\, a_0 = 1}\, a^t R_{m+1} a$ ⇒ the optimal $a$ for linear prediction (LP) is the same as the optimum for the Gaussian case, with $a$ and $\alpha_m$ as before.

Moral: the Gaussian assumption provides shortcut proofs in non-Gaussian problems; no calculus is needed, and global optimality follows.

Efficient inversion to find $a$: a Cholesky decomposition yields the covariance method; if $R_{m+1}$ is Toeplitz, the Levinson-Durbin algorithm yields the autocorrelation method.

Other derivations: calculus or the orthogonality principle gives the normal equations (Wiener-Hopf, Yule-Walker): $m$ linear equations in $m$ unknowns.

What if $R_{m+1}$ is unknown, but a long sequence of actual data $X_0, X_1, \ldots, X_{n-1}$ is observed? One can estimate

$\hat r_k = \frac{1}{n-m} \sum_{l=m}^{n-1} X_l X_{l-|k|}, \qquad \hat R_{m+1} = \{\hat r_{i-j};\ i, j = 0, 1, \ldots, m\}$

$r_{i,j} = \frac{1}{n-m} \sum_{l=m}^{n-1} X_{l-i} X_{l-j}, \qquad R_{m+1} = \{r_{i,j};\ i, j = 0, 1, \ldots, m\}$

and "plug in." $\hat R_{m+1}$ is Toeplitz; $R_{m+1}$ is not. As $n \to \infty$, $R_{m+1} \approx \hat R_{m+1}$.
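To make the plug-in autocorrelation method concrete, here is a minimal Python sketch under my own conventions (the function names and the synthetic AR(2) test signal are illustrative, not from the talk, and the biased $1/n$ normalization is a common variant of the $1/(n-m)$ sums above): it estimates $\hat r_0, \ldots, \hat r_m$ from the data and solves the Toeplitz normal equations with the Levinson-Durbin recursion.

```python
import numpy as np

def autocorrelation(x, m):
    """Biased sample autocorrelations r_0 .. r_m of the sequence x."""
    n = len(x)
    return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(m + 1)])

def levinson_durbin(r):
    """Solve the order-m Toeplitz normal equations built from r[0..m].

    Returns (a, alpha): the LP vector a = (1, a_1, ..., a_m) with the
    sign convention X~_n = -sum_l a_l X_{n-l}, and the prediction
    error power alpha_m."""
    m = len(r) - 1
    a = np.zeros(m + 1)
    a[0] = 1.0
    alpha = r[0]
    for i in range(1, m + 1):
        # reflection coefficient for the step from order i-1 to order i
        k = -(r[i] + np.dot(a[1:i], r[i - 1:0:-1])) / alpha
        a[1:i] = a[1:i] + k * a[i - 1:0:-1]   # update a_1 .. a_{i-1}
        a[i] = k
        alpha *= 1.0 - k * k                  # error power shrinks
    return a, alpha

# usage: fit a 10th-order model to a synthetic AR(2) sequence
rng = np.random.default_rng(0)
x = np.zeros(4000)
for t in range(2, len(x)):
    x[t] = 1.3 * x[t - 1] - 0.6 * x[t - 2] + rng.standard_normal()
a, alpha = levinson_durbin(autocorrelation(x, 10))
print(a[1], a[2])   # should be near -1.3 and 0.6
```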
Processes and Filters

For $n = m, m+1, \ldots$, find the linear least squares estimate $\tilde X_n = -\sum_{l=1}^{m} a_l X_{n-l}$. The previous formulation gives the optimal $a$ and the MMSE $\alpha_m$.

This is an LTI filter with input $X_n$ and response $a_k$: the prediction error filter or inverse filter, $A(f) = \sum_{n=0}^{m} a_n e^{-i 2\pi n f}$, with prediction error (the residual, or excitation)

$\epsilon_n = \sum_{k=0}^{m} a_k X_{n-k} = X_n + \sum_{l=1}^{m} a_l X_{n-l}.$

[Diagram: $X_n \to A(f) \to \epsilon_n \to 1/A(f) \to X_n$.]

In the limit $m \to \infty$, the orthogonality principle implies that the prediction error becomes white. So choose $A$ to make the prediction error as white as possible: LP again.

And there are more formulations with the same solution:

• Maximum likelihood. Assume $\{X_n\}$ is a Gaussian autoregressive process. Given the observations, what is the maximum likelihood estimate of the parameters describing the Gaussian distribution?
• Maximum entropy. Suppose an estimate $\hat R_{m+1}$ of the correlations of the process $\{X_n\}$ up to lag $m$. What $m$th-order Markov random process maximizes the Shannon differential entropy rate? (A variational problem; no Gaussian assumption.)
• Minimum distortion. A minimum distortion fit of spectra: Itakura-Saito / Kullback-Leibler / minimum discrimination information.
• Correlation matching. Given a set of $m$ autocorrelation values, what is the best estimate of the remaining coefficients?

Linear Predictive Coding (LPC)

[Diagram: the true residual $\epsilon_n$ driving $1/A$ reproduces the true speech $X_n$; white noise $W_n$ driving the same filter produces the synthesized speech $\tilde X_n$.]

⇒ LPC model: an $m$th-order autoregressive model with $A$ solving the LP problem. This picture is simplistic: no voicing or pitch estimation details. Switch the excitation between white noise (unvoiced sounds) and a pulse train (voiced sounds):

[Diagram: white noise or a pulse train, chosen by a switch, drives $\sqrt{\alpha_m}/A(f)$ to produce synthetic speech.]

LPC: estimate the autocorrelation or covariance of the observed data and find the LP model $(\alpha_m, A)$. Coding occurs when the final model is selected from a discrete set, e.g., by quantizing separate parameters or the parameter vector. Synthesis is local at the decoder: a classic vocoder rather than a waveform coder.
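As a concrete reading of this vocoder structure, here is a minimal per-frame synthesis sketch, again under my own conventions (the frame length, model order, pitch period, and voiced flag are illustrative placeholders; a real coder estimates voicing and pitch from the data, and the talk does not specify these details):

```python
import numpy as np
from scipy.signal import lfilter

def lpc_synthesize_frame(frame, order=10, voiced=False, pitch_period=80,
                         rng=np.random.default_rng()):
    n = len(frame)
    # biased sample autocorrelations r_0 .. r_order
    r = np.array([np.dot(frame[:n - k], frame[k:]) / n
                  for k in range(order + 1)])
    # normal equations R a = -r, solved directly (the Levinson-Durbin
    # recursion in the earlier sketch solves the same Toeplitz system
    # more efficiently)
    R = np.array([[r[abs(i - j)] for j in range(order)]
                  for i in range(order)])
    coeffs = np.linalg.solve(R, -r[1:])
    a = np.concatenate([[1.0], coeffs])      # A(f) coefficients, a_0 = 1
    alpha = r[0] + np.dot(coeffs, r[1:])     # prediction error power
    if voiced:
        excitation = np.zeros(n)
        excitation[::pitch_period] = 1.0     # unit pulse train
        excitation /= np.sqrt(np.mean(excitation ** 2))
    else:
        excitation = rng.standard_normal(n)  # unit-power white noise
    # all-pole synthesis filter 1/A(f), scaled to the residual power
    return lfilter([np.sqrt(max(alpha, 0.0))], a, excitation)
```

For example, `lpc_synthesize_frame(x[:240], voiced=True)` resynthesizes a 30 ms frame of 8 kHz speech from a pulse-train excitation, which is exactly the switch in the diagram above.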
Part II: History

1966. At UCSB, Glen Culler introduces the On-Line System (OLS, or Culler-Fried system), which allows real-time signal processing at individual student terminals, e.g., DFTs of real sampled speech. Culler is renowned for building fast and effective computer systems.

1966. In December, Saito and Itakura at NTT [4] describe an approach to automatic phoneme discrimination and develop the maximum likelihood and minimum distortion approaches to speech coding: LP parameters are extracted using the autocorrelation method and transmitted to the decoder with voicing information, and the decoder synthesizes from noise or a pulse train driving an autoregressive filter. See also the 1968 and 1969 papers [10, 11]. [Figure from [4].]

1967. In October, John Burg presents the maximum entropy approach [8] and wins the best presentation award at the meeting of the Society of Exploration Geophysicists. The focus is on prediction error properties; the formulation is variational, not parametric. [Photo: Ed Jaynes and John Burg.]

1967. In November, B. S. Atal and M. R. Schroeder [5] use LP coefficients to form a prediction residual, which is also coded: adaptive predictive coding (APC), a residual-excited LPC with no explicit modeling. Elaborated in 1968 [6, 7] using the covariance method. [Figure from [5].]

1968. John Markel drops the French course required for the PhD program at Arizona State and moves to UCSB (Fortran is accepted there). He joins the Speech Communications Research Lab (SCRL), reads Flanagan's book, and sets a goal to someday write and publish a book in the same series with the same publisher. He begins working with A. H. Gray Jr. and Hisashi Wakita on implementations of Itakura's approach.

1968. John Burg [9] presents "A new analysis technique for time series data" at a NATO Advanced Study Institute: the Burg algorithm. It finds reflection coefficients from the original data using a forward-backward algorithm (a minimal sketch of the recursion appears at the end of this section) and was later dubbed the "covariance lattice" approach in speech [44, 53].

Glen Culler contributes to the Interface Message Processor (IMP) specification (with Shapiro, Kleinrock, and Roberts): the "node" of the ARPANET. BBN gets a contract from ARPA to build and deploy four in January 1969 [63]. Culler cofounds Culler-Harrison Inc. (CHI), which builds early array processors; these are adopted and commercialized by FPS and will replace the SPS-41 array processors.

1969. Itakura and Saito [11] introduce the partial correlation (PARCOR) variation on the autocorrelation method, which finds partial correlation coefficients. It is similar to the Burg algorithm, but based on classical statistical ideas and of lower complexity.

In May, Glen Culler proposes an online speech processing system aimed at real-time speech encoding, based on a signal decomposition that would now be called a Gabor wavelet analysis [12].

In November, B. S. Atal presents an LPC speech coder at the Annual Meeting of the Acoustical Society of America [13]. The abstract is published in 1970; the full paper, with Hanauer, appears in 1971 [15] and uses the covariance method.

Thanks to Culler, UCSB becomes the third node (IMP) on the ARPANET, joining #1 UCLA and #2 SRI and followed by #4 the University of Utah. No two computers were the same (Sigma-7, SDS-940, IBM 360, and DEC PDP-10). [Drawing by Jon Postel of ISI.]

1971. Real-time LPC using the Cholesky/covariance method at Philco-Ford in Pennsylvania; the LONGBRAKE II Final Report appears in 1974 [32]. 16-bit fixed-point LPC on the PFSP signal processing computer. Four were sold (to the Navy and NSA), each weighing 250 lbs. [39]

1972. Bob Kahn (ARPA), with Jim Forgie (Lincoln Laboratory) and Dave Walden (BBN), initiates the first efforts toward packet speech on the net. They simulated pieces of 64 kb/s PCM speech packets on the ARPANET to understand how packet speech might eventually fit into the net, and concluded that a major change in packet handling and serious compression would be needed.

Danny Cohen is working at Harvard on real-time visual flight simulation. Bob Kahn suggests to Danny that similar ideas would work for real-time speech communication over the developing ARPANET and describes his project at the USC Information Sciences Institute (ISI) in Marina del Rey.

1973. Danny moves to ISI and works with Steve Casner, Randy Cole, and others, and with SCRL, on real-time operating systems. Kahn forms the Network Secure Communications (NSC) group, later called Network Speech Compression, and the Network Skiing Club because of a preference for winter meetings in Alta. Every node on the ARPANET had different equipment and software.
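Finally, the Burg recursion promised under 1968 above: a minimal, textbook-style Python sketch under my own conventions, not a transcription of Burg's presentation. Each stage chooses the reflection coefficient that minimizes the summed forward and backward prediction error energies computed directly from the data, with no explicit correlation matrix, and then pushes both error sequences through one lattice stage.

```python
import numpy as np

def burg(x, m):
    """Reflection coefficients k_1..k_m and LP polynomial a via a
    Burg-style forward-backward recursion (no correlation matrix)."""
    f = np.asarray(x, dtype=float).copy()   # forward prediction errors
    b = f.copy()                            # backward prediction errors
    a = np.array([1.0])
    ks = []
    for i in range(1, m + 1):
        f_i = f[i:]          # forward errors at times i .. n-1
        b_i = b[i - 1:-1]    # backward errors at times i-1 .. n-2
        # reflection coefficient minimizing forward + backward energy
        k = -2.0 * np.dot(f_i, b_i) / (np.dot(f_i, f_i) + np.dot(b_i, b_i))
        ks.append(k)
        # one lattice stage updates both error sequences
        f_new = f_i + k * b_i
        b_new = b_i + k * f_i
        f[i:] = f_new
        b[i:] = b_new
        # Levinson-style update of the polynomial coefficients
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([[0.0], a[::-1]])
    return np.array(ks), a
```

On long stationary records the returned `a` is close to the `levinson_durbin` result from the earlier sketch; the appeal of the method is its behavior on short records, where no windowing or correlation estimation is required.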
