Computational Models of Intelligence (Spring 2012)
 
  • Interdisciplinary Program in Cognitive Science, Seoul National University
  • Instructor: Dr. Joon Shik Kim
    • Tel: 010-2838-0324
    • E-mail: jskim.ozmagi@gmail.com
    • Homepage: http://bi.snu.ac.kr/~jskim
    • Consultation Time/Place: 10:00-12:00 (Mon., Wed., Fri.) / Room 302-314-1
  • Classroom: Bldg. 14, Room# 207-1
  • Time: Thursday 13:00-16:00


  • Purpose of Course
    • Alan Turing laid the groundwork for artificial intelligence, introducing the Turing machine and, in his 1950 paper, the question of whether machines can think. Today, computational intelligence is applied in many areas through machine learning algorithms. In this class we study a variety of machine learning algorithms. Students will have the opportunity to write a research report on their own data, building on the class discussions. Students from the humanities, social sciences, and the arts are welcome, as well as those with science and engineering backgrounds.

  • Evaluation
    • Attendance (10%)
    • Assignments (30%)
    • Midterm (30%)
    • Final (30%)

  • References
    1. AM Turing, Computing machinery and intelligence, Mind, 59(236): 433-460, 1950.
    2. RM Neal, Markov chain sampling methods for Dirichlet process mixture models, Journal of Computational and Graphical Statistics, 9(2): 249-265, 2000.
    3. G Welch, G Bishop, An introduction to the Kalman filter, TR 95-041, Department of Computer Science, University of North Carolina at Chapel Hill, 2006.
    4. RS Sutton, Integrated modeling and control based on reinforcement learning and dynamic programming, Advances in Neural Information Processing Systems 3 (proceedings of the 1990 conference).
    5. GD Honey, CHY Fu, J Kim, MJ Brammer, TJ Croudace, J Suckling, EM Pich, SCR Williams, ET Bullmore, Effects of verbal working memory load on corticocortical connectivity modeled by path analysis of functional magnetic resonance imaging data, NeuroImage, 17: 573-582, 2002.
    6. LR Rabiner, A tutorial on hidden Markov models and selected applications in speech recognition, Proceedings of the IEEE, 77(2): 257-286, 1989.
    7. C Cortes, V Vapnik, Support-vector networks, Machine Learning, 20: 273-297, 1995.
    8. K Friston, The free-energy principle: a rough guide to the brain?, Trends in Cognitive Sciences, 13(7): 293-301, 2009.
    9. CKI Williams, CE Rasmussen, Gaussian processes for regression, Advances in Neural Information Processing Systems (NIPS), 1996.
    10. NL Roux, Y Bengio, Representational power of restricted Boltzmann machines and deep belief networks, Neural Computation, 20: 1631-1649, 2008.
    11. JS Kim, JC Kim, J O, BT Zhang, A global minimization algorithm based on a geodesic of a Lagrangian formulation of Newtonian dynamics, Neural Processing Letters, 26: 121-131, 2007.

  • Lecture Plan (slides available for download)
    Week 1: Turing machine and artificial intelligence (ref. [1])
    Week 2: Dirichlet process with Gibbs sampling (ref. [2])
    Week 3: Kalman filter for missile tracking (ref. [3])
    Week 4: Reinforcement learning (ref. [4])
    Week 5: Presentation of reference papers (students)
    Week 6: Structural equation modeling for connectivity analysis (ref. [5])
    Week 7: Hidden Markov model (Viterbi and Baum-Welch algorithms) (ref. [6])
    Week 8: Support vector machine (SVM) (ref. [7])
    Week 9: Free-energy principle (ref. [8])
    Week 10: Gaussian process (GP) (ref. [9])
    Week 11: Midterm presentation of research (students)
    Week 12: Restricted Boltzmann machine (RBM) (ref. [10], presented by Jiseob Kim)
    Week 13: Geodesic of Lagrangian dynamics (ref. [11])
    Week 14: Final presentation of research (students)
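
  As a taste of the kind of algorithm covered in the course (here, the week-3 Kalman filter of ref. [3]), the sketch below estimates a constant value from noisy readings with a minimal one-dimensional filter. It is an illustrative toy, not course material: the state model, noise variances, and the simulated data are all assumptions chosen for the example.

```python
import random

def kalman_1d(measurements, process_var=1e-3, meas_var=0.5):
    """Minimal 1-D Kalman filter: track a (nearly) constant value."""
    x, p = 0.0, 1.0  # initial state estimate and its variance
    estimates = []
    for z in measurements:
        # Predict step: constant-state model, so only the variance grows.
        p += process_var
        # Update step: blend prediction and measurement via the Kalman gain.
        k = p / (p + meas_var)
        x += k * (z - x)
        p *= (1.0 - k)
        estimates.append(x)
    return estimates

# Simulated noisy sensor readings around a true value of 5.0.
random.seed(0)
readings = [5.0 + random.gauss(0, 0.7) for _ in range(200)]
est = kalman_1d(readings)
print(round(est[-1], 2))  # final estimate, close to 5.0
```

The gain k shrinks as the variance p contracts, so early measurements move the estimate strongly while later ones only nudge it; this trade-off between prediction and observation is the core idea the full matrix form of the filter generalizes.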

Last updated: April 2012 by Chung-Yeon Lee