6.435 - Theory of Learning and System Identification - Spring 2007

Individual project requirements (doubled for groups of two):

  • 40-minute in-class presentation, covering all or most of the submitted content.
  • Complete, self-contained copy of the presentation (PPT or TeX source with a PDF copy; see Sample 1 and Sample 2).
  • Summary or extended abstract of no more than 3 pages (TeX source with a PDF copy, using the scribing template).

Deadlines and times:

  • All deliverables are to be submitted via email by the end of Monday, May 14.
  • Presentations will be held during class times on May 15 and 17, and on up to two additional dates (TBD).
  • Allocated times will be posted and a notification will be sent out by email.

Suggested topics:

The following is a list of suggested topics for your project. We encourage you to discuss these with us, and we will provide you with further references if needed.

  1. Model quality evaluation for ARX models (Ljung, and references therein, including the MDL and AIC methods); a short sketch follows this list.
  2. Support vector machines for graphical models (Link).
  3. Sample path analysis and PAC learning (Kulkarni).
  4. Information geometry of exponential families of graphical models (Wainwright and Jordan).
  5. Relaxation techniques for inference on graphs: semidefinite methods (Wainwright).
  6. Recursive estimation methods and convergence of stochastic differential equations.
  7. Estimation of generalized point processes (Brown).
  8. Nonlinear regression.
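To make topic 1 concrete, here is a minimal Python sketch (our illustration, not course material) of least-squares ARX fitting with AIC-based order selection; the names fit_arx and select_order_aic and the fixed search range are our assumptions.

    import numpy as np

    def fit_arx(y, u, na, nb):
        # Least-squares fit of an ARX(na, nb) model:
        # y[t] = sum_i a_i y[t-i] + sum_j b_j u[t-j] + e[t]
        n = max(na, nb)
        Phi = np.array([np.concatenate([y[t - na:t][::-1], u[t - nb:t][::-1]])
                        for t in range(n, len(y))])
        Y = y[n:]
        theta, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
        sigma2 = np.mean((Y - Phi @ theta) ** 2)  # residual variance
        return theta, sigma2

    def select_order_aic(y, u, max_order):
        # AIC = N log(sigma2) + 2k, with k = na + nb free parameters;
        # MDL/BIC would replace the 2k penalty by k log(N).
        best = None
        for na in range(1, max_order + 1):
            for nb in range(1, max_order + 1):
                _, sigma2 = fit_arx(y, u, na, nb)
                N = len(y) - max(na, nb)
                aic = N * np.log(sigma2) + 2 * (na + nb)
                if best is None or aic < best[0]:
                    best = (aic, na, nb)
        return best[1], best[2]

On data simulated from a known ARX system, this selection rule typically recovers the true orders once the record is long enough; comparing the 2k penalty with the k log N penalty of MDL is exactly the kind of comparison Ljung's treatment supports.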
More suggested topics, with references:
  1. Active learning. Kulkarni S.R., Mitter S.K., Tsitsiklis J.N., "Active Learning Using Arbitrary Binary Valued Queries", Machine Learning, Vol. 11, No. 1, pp. 23-35, April 1993.
  2. Counterexample to the metric entropy conjecture, when the space of probability measures is restricted. Dudley R.M., Kulkarni S.R., Richardson T.J., Zeitouni O., "A Metric Entropy Bound is Not Sufficient for Learnability", IEEE Transactions on Information Theory, Vol. 40, No. 3, pp. 883-885, May, 1994.
  3. Decoding as free energy minimization. Yedidia J.S., Freeman W.T., and Weiss Y., "Bethe free energy, Kikuchi approximations and belief propagation algorithms" (longer version of their NIPS 2000 paper).
  4. Neural networks and the bias/variance dilemma. Geman S., Bienenstock E., and Doursat R., "Neural networks and the bias/variance dilemma", Neural Computation, Vol. 4, No. 1, pp. 1-58, January 1992.
  5. Maximum likelihood estimation for ARMA/ARMAX. Hannan E.J. and Deistler M. The Statistical Theory of Linear Systems. John Wiley, New York, 1988.
  6. Statistical procedures. Efron B. The Jackknife, the Bootstrap, and Other Resampling Plans. Society for Industrial and Applied Mathematics, Philadelphia, 1982.
  7. Extension of VC theory to exchangeable random sequences.
  8. Learning a curve by counting the number of intersections with random lines. Kulkarni S.R., Mitter S.K., Tsitsiklis J.N., and Zeitouni O., "PAC Learning With Generalized Samples and an Application to Stochastic Geometry", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 15, No. 9, pp. 933-942, September 1993.
  9. Valiant's theory of the learnable. Valiant L.G., "A theory of the learnable", Communications of the ACM, Vol. 27, No. 11, pp. 1134-1142, November 1984.
  10. Hidden Markov models and speech recognition. Rabiner L.R. and Juang B.H. Fundamentals of Speech Recognition. Prentice-Hall, Englewood Cliffs, New Jersey, 1993. Rabiner L.R., "A tutorial on hidden Markov models and selected applications in speech recognition", Proceedings of the IEEE, Vol. 77, No. 2, pp. 257-286, February 1989.
  11. Original paper on the EM algorithm; a minimal illustrative sketch follows this list. Dempster A., Laird N., and Rubin D., "Maximum likelihood from incomplete data via the EM algorithm", Journal of the Royal Statistical Society, Series B, Vol. 39, No. 1, pp. 1-38, 1977.
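As an entry point to topic 11, the following is a minimal, illustrative EM iteration for a one-dimensional Gaussian mixture; the function em_gmm_1d, the fixed iteration count, and the random initialization are our simplifying assumptions, not part of the original paper.

    import numpy as np

    def em_gmm_1d(x, K=2, iters=100, seed=0):
        # EM for a 1-D Gaussian mixture, in the spirit of
        # Dempster, Laird, and Rubin (1977).
        rng = np.random.default_rng(seed)
        mu = rng.choice(x, K, replace=False)   # component means
        var = np.full(K, np.var(x))            # component variances
        pi = np.full(K, 1.0 / K)               # mixing weights
        for _ in range(iters):
            # E-step: responsibility r[n, k] is proportional to
            # pi_k * N(x_n; mu_k, var_k), normalized over k
            logp = (np.log(pi) - 0.5 * np.log(2 * np.pi * var)
                    - 0.5 * (x[:, None] - mu) ** 2 / var)
            r = np.exp(logp - logp.max(axis=1, keepdims=True))
            r /= r.sum(axis=1, keepdims=True)
            # M-step: weighted maximum-likelihood re-estimates
            Nk = r.sum(axis=0)
            mu = (r * x[:, None]).sum(axis=0) / Nk
            var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk
            pi = Nk / len(x)
        return pi, mu, var

Each iteration provably does not decrease the observed-data likelihood, which is the central monotonicity result of the paper; running the sketch on a sample drawn from two well-separated Gaussians recovers the means and mixing weights.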