Information Theory, Machine Learning and Statistics

Wright State University CORE Scholar
Computer Science & Engineering Syllabi, College of Engineering & Computer Science
Fall 2007

Repository Citation: Wang, S. (2007). CS 790: Information Theory, Machine Learning and Statistics. https://corescholar.libraries.wright.edu/cecs_syllabi/231

CS 790: INFORMATION THEORY, MACHINE LEARNING AND STATISTICS
FALL 2007

Information theory deals with encoding data in order to transmit it correctly and effectively. Statistics and machine learning deal with estimating models of data and predicting future observations. Is there any relationship between the two? It turns out, perhaps not surprisingly, that the most compact encoding of the data is given by the probabilistic model that describes it best. In other words, there is a fundamental link between information and probability (a short numerical illustration appears at the end of this syllabus). This course starts with the basic notions of information theory and explores its relationship to machine learning and statistics. The course will have a strong theoretical component, but will also focus on applications and computing. The topics to be covered are:

• Entropy, mutual information, Kullback-Leibler and Bregman divergences;
• Source-channel models; boosting and optimal betting strategy;
• Maximum likelihood and Bayesian inference;
• Channel capacity, rate distortion and the information bottleneck method;
• Maximum entropy principle, information geometry and alternating algorithms;
• Large deviations, coding theory and approximate inference in graphical models.

LECTURES
Time: Tuesday/Thursday 8:00 pm - 9:15 pm
Location: Russ Center 155

INSTRUCTOR
Shaojun Wang
428 Russ Engineering Center Building
shaojun.wang(at)wright.edu
(937) 775-5140
Office hours: Tuesday/Thursday 2:00 pm - 3:30 pm

TEXTBOOKS
Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, 2nd Edition. Wiley-Interscience, 2006.
I. Csiszar and P. Shields, Information Theory and Statistics: A Tutorial. Foundations and Trends in Communications and Information Theory, 1(4):417-528, 2004.

COURSE GRADES AND WORKLOAD
Three homeworks: 60%
Projects: 40% (presentation: 20%; report: 20%)

PREREQUISITE
A rudimentary knowledge of probability and statistics, for example familiarity with the material in the standard textbook A First Course in Probability by Sheldon Ross.

Course website: http://www.cs.wright.edu/~swang/cs790inf/
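To make the link between coding and probability concrete, here is a minimal numerical sketch in Python. It is not part of the original course materials, and the two distributions p and q are invented for illustration. Coding with the true source distribution p achieves the entropy H(p), while coding with a mismatched model q costs extra bits equal to the Kullback-Leibler divergence D(p||q), the first topic on the list above.

    import math

    # True source distribution p and a mismatched coding model q over four symbols.
    p = [0.5, 0.25, 0.125, 0.125]
    q = [0.25, 0.25, 0.25, 0.25]

    # Entropy H(p): the shortest achievable expected code length, in bits per symbol.
    entropy = -sum(pi * math.log2(pi) for pi in p)

    # Cross-entropy H(p, q): the expected code length when codewords are built from q.
    cross_entropy = -sum(pi * math.log2(qi) for pi, qi in zip(p, q))

    # Kullback-Leibler divergence D(p||q): the penalty, in extra bits per symbol,
    # for coding with the wrong model; it is always nonnegative.
    kl = sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q))

    print(entropy, cross_entropy, kl)  # 1.75 2.0 0.25, so H(p, q) = H(p) + D(p||q)

The 0.25-bit gap is exactly the divergence between model and source: the model that describes the data best yields the most compact code, which is the fundamental link this course develops.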
