The Bell Labs Shannon Conference on the Future of the Information Age

Total Pages: 16

File Type: PDF, Size: 1020 KB

The Bell Labs Shannon Conference on the Future of the Information Age

A two-day conference to celebrate Claude Shannon's centennial and explore the continued impact of information theory on our society.

Date: April 28-29, 2016
Location: Murray Hill, New Jersey
Attendees: 250 leaders, visionaries and researchers from industry, academia and Bell Labs

Celebrate the dawn of a new digital era

We are at the dawn of a new digital era, where everyone and everything will be digitally connected and controllable, allowing an unprecedented level of automation and consequently the ability to 'create time'. The extent of disruption to human existence will be such that this will likely be viewed as the 6th technological revolution of the modern era. This revolution will profoundly change the global techno-economic fabric, massively shifting towards a new global-local paradigm, where any digital 'good' is accessible, anywhere. Furthermore, with the emergence of ever more sophisticated 3D printing techniques, physical goods would be recreated locally, on demand, at any time, in a new digital + local-physical economy. The magnitude of change will be comparable, in the departure from the past and impact on the future, to the agricultural revolution 12 millennia ago and the industrial revolution 2 centuries ago.

This revolution is rooted in the dawn of the Information Age, characterized by the shift from industrial activity to an information-based economy, based on the emergence of digital computing and digital transmission techniques, which were in turn based largely on the information theoretic concepts defined by Claude Elwood Shannon.

Bell Labs is celebrating the centennial of Shannon's birthday — April 30th, 1916 — by hosting a unique 2-day conference focusing on a discussion of the future digital information economy and the impact of information theory on society today and in the digital future. This special event celebrates Shannon's life and influence and commemorates his profound impact with the creation of new Bell Labs Shannon Visionary Awards presented at the event.

We begin our 2-day conference with a gathering of the leaders and visionaries in the Information Society, with provocative and informed discussion of our digital past, present, and future, in the classic Bell Labs tradition. There will be a student competition, highlights of Shannon and his life, and a technical symposium on current research.

In his groundbreaking paper, "A Mathematical Theory of Communication" (Bell System Technical Journal, July and October 1948), Claude Shannon laid the formal foundation of a quantitative science of digital information. This theory grew, in part, out of the field of statistical mechanics, where the notion of entropy had been introduced as a measure of disorder in large configurations of atoms and molecules in random physical systems. Shannon's seminal work showed how entropy, appropriately redefined and applied, provided the mathematical means to measure the information content of signals and systems. Immediate areas of application were the quantification of the capacity limits of communication in noisy channels, and shortly afterwards the limits of data compression and reliable storage. Shannon's formal foundations were broadened subsequently to many other application areas, including cryptography, natural language processing, statistical inference, neurobiology and more recently to evolution, ecology, quantum computing, linguistics, pattern recognition and progressively into many areas of what we now call data analytics.
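As a brief aside, the quantities this description refers to have standard textbook forms (these formulas are standard information theory, not reproduced from the conference brochure itself): Shannon's entropy of a discrete source and the capacity of a noisy channel can be written as

```latex
% Standard definitions (not taken from the brochure): entropy of a discrete
% source X with distribution p(x), and capacity of a channel p(y|x) as the
% maximum mutual information over input distributions.
\[
  H(X) = -\sum_{x} p(x)\,\log_2 p(x) \quad \text{[bits per symbol]}
\]
\[
  C = \max_{p(x)} I(X;Y), \qquad
  I(X;Y) = \sum_{x,y} p(x,y)\,\log_2 \frac{p(x,y)}{p(x)\,p(y)}
\]
```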
Day One - April 28

Morning Session
Keynote presentations by 5 global luminaries and visionaries who will be recognized with Bell Labs Shannon Visionary Awards. We are joined by:
• Eric Schmidt, Executive Chairman of Alphabet Inc.
• Irwin Jacobs, Co-founder of Qualcomm
• Bob Metcalfe, Co-inventor of Ethernet and formulator of Metcalfe's Law
• Henry Markram, Director of the Human Brain and Blue Brain projects
• Amber Case, Cyborg Anthropologist

Afternoon Session
We are remembering Shannon in a series of talks and fireside chats. Our invited guests include:
• Elwyn Berlekamp
• Robert Gallager
• Leonard Kleinrock
• Sergio Verdu
These distinguished guests will entertain us with personal recollections of Shannon, as well as an exhibit of previously unreleased Shannon memorabilia.

Research Demonstrations and Student Competition
Bell Labs researchers will showcase their latest innovations based on new applications of information theory to multiple domains, from social graphs to wireless beamforming and novel future optical transmission systems. Invited students will present their research in Information Theory tools and applications in a competitive forum.

Evening
The day will culminate with the premiere performance of the Human Digital Orchestra — a new Bell Labs 'Experiment in Art and Technology (EAT)', together with Stevens Institute of Technology, followed by a gala dinner.

Day Two - April 29
We are hosting a series of technical presentations in a technical symposium outlining the latest research built upon Shannon's work in both traditional communications as well as in areas such as bioinformatics, economic systems and social networks. Our notable invitees include:
• Emmanuel Abbe
• Gerhard Kramer
• Vince Poor
• Muriel Medard
• Shlomo Shamai
• Amin Shokrollahi
• Christopher Sims
• Emre Telatar
• David Tse
• Michelle Effros
• Olgica Milenkovic
• Andrea Goldsmith
• Rudiger Urbanke

To register for the Conference: www.etouches.com/shannoncentennial

The Bell Labs Shannon Conference on the Future of the Information Age, April 28-29, 2016, Murray Hill, New Jersey.
Recommended publications
  • Digital Communication Systems 2.2 Optimal Source Coding
    Digital Communication Systems EES 452, Asst. Prof. Dr. Prapun Suksompong, [email protected]. 2. Source Coding; 2.2 Optimal Source Coding: Huffman Coding: Origin, Recipe, MATLAB Implementation.
    Examples of Prefix Codes: Nonsingular Fixed-Length Code, Shannon–Fano code, Huffman Code.
    Prof. Robert Fano (1917-2016), Shannon Award (1976).
    Shannon–Fano Code: Proposed in Shannon's "A Mathematical Theory of Communication" in 1948. The method was attributed to Fano, who later published it as a technical report: Fano, R.M. (1949). "The transmission of information". Technical Report No. 65. Cambridge (Mass.), USA: Research Laboratory of Electronics at MIT. Should not be confused with Shannon coding, the coding method used to prove Shannon's noiseless coding theorem, or with Shannon–Fano–Elias coding (also known as Elias coding), the precursor to arithmetic coding.
    Claude E. Shannon Award recipients: Claude E. Shannon (1972), David S. Slepian (1974), Robert M. Fano (1976), Peter Elias (1977), Mark S. Pinsker (1978), Jacob Wolfowitz (1979), W. Wesley Peterson (1981), Irving S. Reed (1982), Robert G. Gallager (1983), Solomon W. Golomb (1985), William L. Root (1986), Elwyn R. Berlekamp (1993), Aaron D. Wyner (1994), G. David Forney, Jr. (1995), Imre Csiszár (1996), Jacob Ziv (1997), Neil J. A. Sloane (1998), Tadao Kasami (1999), Thomas Kailath (2000), Jack Keil Wolf (2001), Toby Berger (2002), Lloyd R. Welch (2003), Sergio Verdu (2007), Robert M. Gray (2008), Jorma Rissanen (2009), Te Sun Han (2010), Shlomo Shamai (Shitz) (2011), Abbas El Gamal (2012), Katalin Marton (2013), János Körner (2014), Arthur Robert Calderbank (2015), Alexander S. Holevo (2016), David Tse (2017), James L.
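The slides excerpted above cover the Huffman "recipe"; a minimal sketch of that construction is shown below in Python rather than the MATLAB mentioned in the slides. The four-symbol source and its probabilities are illustrative assumptions, not taken from the course material.

```python
import heapq
from typing import Dict

def huffman_code(probs: Dict[str, float]) -> Dict[str, str]:
    """Build a binary prefix code by repeatedly merging the two least probable subtrees."""
    if len(probs) == 1:
        # Degenerate case: a single symbol still needs a 1-bit codeword.
        return {sym: "0" for sym in probs}
    # Each heap entry: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, code1 = heapq.heappop(heap)   # least probable subtree
        p2, _, code2 = heapq.heappop(heap)   # second least probable subtree
        # Prepend one bit to every codeword in each merged subtree.
        merged = {s: "0" + c for s, c in code1.items()}
        merged.update({s: "1" + c for s, c in code2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

if __name__ == "__main__":
    source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}  # assumed example source
    code = huffman_code(source)
    avg_len = sum(p * len(code[s]) for s, p in source.items())
    print(code)     # one optimal code: {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
    print(avg_len)  # 1.75 bits/symbol, equal to the entropy of this dyadic source
```

For this dyadic example the average codeword length equals the source entropy, which is the optimality benchmark the Shannon–Fano discussion in the slides leads up to.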
  • Principles of Communications ECS 332
    Principles of Communications ECS 332, Asst. Prof. Dr. Prapun Suksompong, [email protected]. 1. Intro to Communication Systems. Office Hours: Check Google Calendar on the course website. Dr. Prapun's Office: 6th floor of Sirindhralai building, BKD.
    Remark 1: If the downloaded file crashed your device/browser, try another one posted on the course website.
    Remark 2: There are also three more sections from the Appendices of the lecture notes.
    Shannon's insight: "The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point." Shannon, Claude. A Mathematical Theory of Communication. (1948)
    Shannon: Father of the Info. Age. Documentary co-produced by the Jacobs School, UCSD-TV, and the California Institute for Telecommunications and Information Technology. [http://www.uctv.tv/shows/Claude-Shannon-Father-of-the-Information-Age-6090] [http://www.youtube.com/watch?v=z2Whj_nL-x8]
    C. E. Shannon (1916-2001): Hello. I'm Claude Shannon, a mathematician here at the Bell Telephone Laboratories. He didn't create the compact disc, the fax machine, digital wireless telephones or MP3 files, but in 1948 Claude Shannon paved the way for all of them with the basic theory underlying digital communications and storage; he called it information theory. https://www.youtube.com/watch?v=47ag2sXRDeU
    One of the most influential minds of the 20th century, yet when he died on February 24, 2001, Shannon was virtually unknown to the public at large. C.
  • IEEE Information Theory Society Newsletter
    IEEE Information Theory Society Newsletter, Vol. 53, No. 4, December 2003. Editor: Lance C. Pérez. ISSN 1059-2362.
    The Shannon Lecture: Hidden Markov Models and the Baum-Welch Algorithm, by Lloyd R. Welch.
    Content of This Talk: The lectures of previous Shannon Lecturers fall into several categories such as introducing new areas of research, resuscitating areas of research, surveying areas identified with the lecturer, or reminiscing on the career of the lecturer. In this talk I decided to restrict the subject to the Baum-Welch "algorithm" and some of the ideas that led to its development.
    I am sure that most of you are familiar with Markov chains and Markov processes. They are natural models for various communication channels in which channel conditions change with time. In many cases it is not the state sequence of the model which is observed but the effects of the process on a signal. That is, the states are not observable but some functions, possibly random, of the states are observed.
    ... what the 'running variable' is. Of particular use will be the concept of conditional probability and recursive factorization. The recursive factorization idea says that the joint probability of a collection of events can be expressed as a product of conditional probabilities, where each is the probability of an event conditioned on all previous events. For example, let A, B, and C be three events. Then Pr(A ∩ B ∩ C) = Pr(A) Pr(B | A) Pr(C | A ∩ B). Using the bracket notation, we can display the recursive factorization of the joint probability distribution of a sequence of discrete random variables:
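The excerpt is cut off at this point; in standard notation (Welch's own "bracket notation" is not reproduced here), the factorization it is about to display is the chain rule for discrete random variables:

```latex
% Chain rule (recursive factorization) for a sequence of discrete random
% variables X_1, ..., X_n; standard form, supplied here because the
% newsletter excerpt above is truncated right before the displayed equation.
\[
  \Pr(X_1 = x_1, \ldots, X_n = x_n)
    = \prod_{k=1}^{n} \Pr\!\left(X_k = x_k \mid X_1 = x_1, \ldots, X_{k-1} = x_{k-1}\right)
\]
```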
  • Information Theory and Statistics: a Tutorial
    Foundations and Trends™ in Communications and Information Theory, Volume 1, Issue 4, 2004. Editorial Board. Editor-in-Chief: Sergio Verdú, Department of Electrical Engineering, Princeton University, Princeton, New Jersey 08544, USA, [email protected]. Editors: Venkat Anantharam (Berkeley), Amos Lapidoth (ETH Zurich), Ezio Biglieri (Torino), Bob McEliece (Caltech), Giuseppe Caire (Eurecom), Neri Merhav (Technion), Roger Cheng (Hong Kong), David Neuhoff (Michigan), K.C. Chen (Taipei), Alon Orlitsky (San Diego), Daniel Costello (Notre Dame), Vincent Poor (Princeton), Thomas Cover (Stanford), Kannan Ramchandran (Berkeley), Anthony Ephremides (Maryland), Bixio Rimoldi (EPFL), Andrea Goldsmith (Stanford), Shlomo Shamai (Technion), Dave Forney (MIT), Amin Shokrollahi (EPFL), Georgios Giannakis (Minnesota), Gadiel Seroussi (HP-Palo Alto), Joachim Hagenauer (Munich), Wojciech Szpankowski (Purdue), Te Sun Han (Tokyo), Vahid Tarokh (Harvard), Babak Hassibi (Caltech), David Tse (Berkeley), Michael Honig (Northwestern), Ruediger Urbanke (EPFL), Johannes Huber (Erlangen), Steve Wicker (Georgia Tech), Hideki Imai (Tokyo), Raymond Yeung (Hong Kong), Rodney Kennedy (Canberra), Bin Yu (Berkeley), Sanjeev Kulkarni (Princeton).
    Editorial Scope: Foundations and Trends™ in Communications and Information Theory will publish survey and tutorial articles in the following topics: • Coded modulation • Multiuser detection • Coding theory and practice • Multiuser information theory • Communication complexity • Optical communication channels • Communication system design • Pattern recognition and learning • Cryptology
  • IEEE Information Theory Society Newsletter
    IEEE Information Theory Society Newsletter, Vol. 63, No. 3, September 2013. Editor: Tara Javidi. ISSN 1059-2362. Editorial committee: Ioannis Kontoyiannis, Giuseppe Caire, Meir Feder, Tracey Ho, Joerg Kliewer, Anand Sarwate, Andy Singer, and Sergio Verdú.
    Annual Awards Announced
    The main annual awards of the IEEE Information Theory Society were announced at the 2013 ISIT in Istanbul this summer. (Photos: János Körner, Daniel Costello)
    • The 2014 Claude E. Shannon Award goes to János Körner. He will give the Shannon Lecture at the 2014 ISIT in Hawaii.
    • The 2013 Claude E. Shannon Award was given to Katalin Marton in Istanbul. Katalin presented her Shannon Lecture on the Wednesday of the Symposium. If you wish to see her slides again or were unable to attend, a copy of the slides has been posted on our Society website.
    • The 2013 Aaron D. Wyner Distinguished Service Award goes to Daniel J. Costello.
    • The 2013 IT Society Paper Award was given to Shrinivas Kudekar, Tom Richardson, and Rüdiger Urbanke for their paper "Threshold Saturation via Spatial Coupling: ..."
    • The 2013 IEEE Jack Keil Wolf ISIT Student Paper Awards were selected and announced at the banquet of the Istanbul Symposium. The winners were the following: 1) Mohammad H. Yassaee, for the paper "A Technique for Deriving One-Shot Achievability Results in Network Information Theory", co-authored with Mohammad R. Aref and Amin A. Gohari; 2) Mansoor I. Yousefi, for the paper "Integrable Communication Channels and the Nonlinear Fourier Transform", co-authored with Frank R. Kschischang.
    • Several members of our community became IEEE Fellows or received IEEE Medals, please see our website for more information: www.itsoc.org/honors
  • Network Information Theory
    Network Information Theory. This comprehensive treatment of network information theory and its applications provides the first unified coverage of both classical and recent results. With an approach that balances the introduction of new models and new coding techniques, readers are guided through Shannon's point-to-point information theory, single-hop networks, multihop networks, and extensions to distributed computing, secrecy, wireless communication, and networking. Elementary mathematical tools and techniques are used throughout, requiring only basic knowledge of probability, whilst unified proofs of coding theorems are based on a few simple lemmas, making the text accessible to newcomers. Key topics covered include successive cancellation and superposition coding, MIMO wireless communication, network coding, and cooperative relaying. Also covered are feedback and interactive communication, capacity approximations and scaling laws, and asynchronous and random access channels. This book is ideal for use in the classroom, for self-study, and as a reference for researchers and engineers in industry and academia. Abbas El Gamal is the Hitachi America Chaired Professor in the School of Engineering and the Director of the Information Systems Laboratory in the Department of Electrical Engineering at Stanford University. In the field of network information theory, he is best known for his seminal contributions to the relay, broadcast, and interference channels; multiple description coding; coding for noisy networks; and energy-efficient packet scheduling and throughput–delay tradeoffs in wireless networks. He is a Fellow of IEEE and the winner of the 2012 Claude E. Shannon Award, the highest honor in the field of information theory. Young-Han Kim is an Assistant Professor in the Department of Electrical and Computer Engineering at the University of California, San Diego.
  • IEEE Information Theory Society
    IEEE INFORMATION THEORY SOCIETY
    The Information Theory Society is an organization, within the framework of the IEEE, of members with principal professional interests in information theory. All members of the IEEE are eligible for membership in the Society and will receive this TRANSACTIONS upon payment of the annual Society membership fee of $30.00. For information on joining, write to the IEEE at the address below. Member copies of Transactions/Journals are for personal use only.
    BOARD OF GOVERNORS
    President: GIUSEPPE CAIRE, EE Dept., Univ. of Southern California, Los Angeles, CA 90089 USA
    Secretary: NATASHA DEVROYE, Dept. ECE, Univ. of Illinois, Chicago, IL 60607 USA
    Treasurer: NIHAR JINDAL, Dept. Elec. Eng. Comp. Sci., Univ. of Minnesota, Minneapolis, MN 55455 USA
    Transactions Editor: HELMUT BÖLCSKEI, Dept. IT & EE, ETH Zurich, Zurich, Switzerland
    Conference Committee Chair: BRUCE HAJEK, Dept. EECS and Coordinated Science Lab., Univ. of Illinois, Urbana, IL 61801 USA
    First Vice President: MURIEL MÉDARD. Second Vice President: GERHARD KRAMER. Junior Past President: FRANK R. KSCHISCHANG. Senior Past President: ANDREA J. GOLDSMITH.
    HELMUT BÖLCSKEI (2011), ALEX GRANT (2011), MURIEL MÉDARD (2012), EMINA SOLJANIN (2011), MARTIN BOSSERT (2012), ROLF JOHANNESSON (2012), PRAKASH NARAYAN (2012), DAVID TSE (2013), MAX H. M. COSTA (2012), GERHARD KRAMER (2011), L. PING (2012), ALEXANDER VARDY (2013), MICHELLE EFFROS (2013), J. NICHOLAS LANEMAN (2013), AMIN SHOKROLLAHI (2013), SERGIO VERDÚ (2011), ABBAS EL GAMAL (2011), HANS-ANDREA LOELIGER (2012), PAUL SIEGEL (2011), EMANUELE VITERBO (2013)
    IEEE TRANSACTIONS ON INFORMATION THEORY. HELMUT BÖLCSKEI, Editor-in-Chief. Executive Editorial Board: G. DAVID FORNEY, JR., SHLOMO SHAMAI (SHITZ), ALEXANDER VARDY, SERGIO VERDÚ. Publications Editors: CYRIL MÉASSON, PREDRAG SPASOJEVIĆ. ERDAL ARIKAN, ELZA ERKIP, NAVIN KASHYAP, DANIEL P.
  • Andrew J. and Erna Viterbi Family Archives, 1905-2007 (Collection 0335)
    http://oac.cdlib.org/findaid/ark:/13030/kt7199r7h1 Online items available. Finding Aid for the Andrew J. and Erna Viterbi Family Archives, 1905-2007 (Collection 0335). A Guide to the Collection. Finding aid prepared by Michael Hooks, Viterbi Family Archivist, The Andrew and Erna Viterbi School of Engineering, University of Southern California (USC). First Edition. USC Libraries Special Collections, Doheny Memorial Library 206, 3550 Trousdale Parkway, Los Angeles, California, 90089-0189, 213-740-5900, [email protected]. 2008 University Archives of the University of Southern California.
    Title: Andrew J. and Erna Viterbi Family Archives
    Date (inclusive): 1905-2007
    Collection number: 0335
    Creator: Viterbi, Erna Finci
    Creator: Viterbi, Andrew J.
    Physical Description: 20.0 linear feet (47 document cases, 1 small box, 1 oversize box); 35,000 digital objects
    Location: University Archives row A
    Contributing Institution: USC Libraries Special Collections, Doheny Memorial Library 206, 3550 Trousdale Parkway, Los Angeles, California, 90089-0189
    Language of Material: English. The bulk of the materials are written in English, however other languages are represented as well. These additional languages include Chinese, French, German, Hebrew, Italian, and Japanese.
    Conditions Governing Access note: There are materials within the archives that are marked confidential or proprietary, or that contain information that is obviously confidential. Examples of the latter include letters of references and recommendations for employment, promotions, and awards; nominations for awards and honors; resumes of colleagues of Dr. Viterbi; and grade reports of students in Dr. Viterbi's classes at the University of California, Los Angeles, and the University of California, San Diego.
  • IEEE Information Theory Society Newsletter
    IEEE Information Theory Society Newsletter, Vol. 66, No. 3, September 2016. Editor: Michael Langberg. ISSN 1059-2362. Editorial committee: Frank Kschischang, Giuseppe Caire, Meir Feder, Tracey Ho, Joerg Kliewer, Anand Sarwate, Andy Singer, Sergio Verdú, and Parham Noorzad.
    President's Column, Alon Orlitsky
    ISIT in Barcelona was true joy. Like a Gaudi masterpiece, it was artfully planned and flawlessly executed. Five captivating plenaries covered the gamut from communication and coding to graphs and satisfiability, while Alexander Holevo's Shannon Lecture reviewed the fascinating development of quantum channels from his early theoretical contributions to today's near-practical innovations. The remaining 620 talks were presented in 9 parallel sessions and, extrapolating from those I attended, were all superb. Two records were broken: with 888 participants this was the largest ISIT outside the US, and at 540 Euro registration, the most economical in a decade.
    ...sion for our community along with extensive experience as board member, associate editor, and a first-class researcher. Be-earlied congratulations, Madam President.
    Other decisions and announcements concerned the society's major awards, including three Jack Wolf student paper awards and the James Massey Young Scholars Award that went to Andrea Montanari. Surprisingly, the IT Paper Award was given to two papers, "The Capacity Region of the Two-Receiver Gaussian Vector Broadcast Channel With Private and Common Messages" by Yanlin Geng and Chandra Nair, and "Fundamental Limits of Caching" by Mohammad Maddah-Ali and Urs Niesen. While Paper Awards were given to two related papers recently, not since 1972 were papers on different topics jointly awarded.
    Apropos records, and with the recent Rio Games, isn't ISIT like the Olympics, except even better? As with the ultimate sports event, we convene to show our best results, further our ...
  • BoG Candidate Bios
    From: "Steven W. McLaughlin" <[email protected]>
    Subject: BoG Candidate Bios
    Date: June 11, 2007 13:49:36 GMT+02:00
    To: Bixio Rimoldi <bixio.rimoldi@epfl.ch>
    Cc: "Steven W. McLaughlin" <[email protected]>, Urbashi Mitra <[email protected]>, Giuseppe Caire <[email protected]>, Andrea Goldsmith <[email protected]>, Rob Calderbank <[email protected]>, Muriel Medard <[email protected]>, Dan Costello <[email protected]>, Sergio Servetto <[email protected]>, "John D. Anderson" <[email protected]>, Dave Forney <[email protected]>, Alex Grant <[email protected]>, Venu Veeravalli <[email protected]>, Hans-Andrea Loeliger <[email protected]>, Ryuji Kohno <[email protected]>, Richard Cox <[email protected]>, Shlomo Shamai <[email protected]>, Ezio Biglieri <[email protected]>, Frank Kschischang <[email protected]>, Vince Poor <[email protected]>, Elza Erkip <[email protected]>, David Tse <[email protected]>, Alon Orlitsky <[email protected]>, Adriaan van Wijngaarden <[email protected]>, João Barros <[email protected]>, Imai Hideki <[email protected]>, Tony Ephremides <[email protected]>, Tor Helleseth <[email protected]>, [email protected], Vinck Han <[email protected]>, Marc Fossorier <[email protected]>, Prakash Narayan <[email protected]>, Anant Sahai <[email protected]>, Laneman Nicholas J <[email protected]>, Ken Zeger <[email protected]>, Daniela Tuninetti <[email protected]>, Ralf Koetter <[email protected]>, Dave Neuhoff <[email protected]>
    Dear BoG members
    Below are the bios of those who have agreed to be candidates for the Board of Governors election this Fall.
  • IEEE-Level Awards
    IEEE-LEVEL AWARDS
    The IEEE currently bestows a Medal of Honor, fifteen Medals, thirty-three Technical Field Awards, two IEEE Service Awards, two Corporate Recognitions, two Prize Paper Awards, Honorary Memberships, one Scholarship, one Fellowship, and a Staff Award. The awards and their past recipients are listed below. Citations are available via the "Award Recipients with Citations" links within the information below. Nomination information for each award can be found by visiting the IEEE Awards Web page www.ieee.org/awards or by clicking on the award names below. Links are also available via the Recipient/Citation documents.
    MEDAL OF HONOR
    Award Recipients with Citations (PDF, 26 KB)
    The IEEE Medal of Honor is the highest IEEE award. The Medal was established in 1917 and is awarded for an exceptional contribution or an extraordinary career in the IEEE fields of interest. The candidate need not be a member of the IEEE. The IEEE Medal of Honor is sponsored by the IEEE Foundation.
    E. H. Armstrong 1917; E. F. W. Alexanderson 1919; Guglielmo Marconi 1920; R. A. Fessenden 1921; Lee de Forest 1922; John Stone-Stone 1923; M.
    Ernst A. Guillemin 1961; Edward V. Appleton 1962; John H. Hammond, Jr. 1963; George C. Southworth 1963; Harold A. Wheeler 1964; Claude E. Shannon 1966; Charles H. Townes 1967; Gordon K. Teal 1968; Edward L. Ginzton 1969; Dennis Gabor 1970; John Bardeen 1971; Jay W. Forrester 1972; Rudolf Kompfner 1973; Rudolf E. Kalman 1974; John R. Pierce 1975; H. Earle Vaughan 1977; Robert N. Noyce 1978; Richard Bellman 1979; William Shockley 1980; Sidney Darlington 1981; John Wilder Tukey 1982
  • IEEE Information Theory Society Newsletter
    IEEE Information Theory Society Newsletter, Vol. 53, No. 1, March 2003. Editor: Lance C. Pérez. ISSN 1059-2362.
    LIVING INFORMATION THEORY
    The 2002 Shannon Lecture, by Toby Berger, School of Electrical and Computer Engineering, Cornell University, Ithaca, NY 14853
    1 Meanings of the Title
    The title, "Living Information Theory," is a triple entendre. First and foremost, it pertains to the information theory of living systems. Second, it symbolizes the fact that our research community has been living information theory for more than five decades, enthralled with the beauty of the subject and intrigued by its many areas of application and potential application. Lastly, it is intended to connote that information theory is decidedly alive, despite sporadic protestations to the contrary. Moreover, there is a thread that ties together all three of these meanings for me. That thread is my strong belief that one way in which information theorists, both new and seasoned, can assure that their subject will remain vitally alive deep into the future is to embrace enthusiastically its applications to the life sciences.
    "Information theory has ... perhaps ballooned to an importance beyond its actual accomplishments. Our fellow scientists in many different fields, attracted by the fanfare and by the new avenues opened to scientific analysis, are using these ideas in ... biology, psychology, linguistics, fundamental physics, economics, the theory of the organization, ... Although this wave of popularity is certainly pleasant and exciting for those of us working in the field, it carries at the same time an element of danger. While we feel that information theory is indeed a valuable tool in providing fundamental insights into the nature of communication problems ..."