Objectives
Algorithmic Information Theory (AIT) is based on the mathematical notion of complexity, which was invented some 50 years ago to solve issues related to machine learning, randomness and proof theory. It derives from a fundamental intuition:
Complex objects cannot be described by short algorithms. Complexity corresponds to the size of algorithms (and not to their speed; see the caveat below).
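As a rough illustration of this intuition (a sketch, not part of the course material), the size of a compressed description gives a computable stand-in for the size of the shortest program describing an object. Here, Python's standard `zlib` is used as the compressor:

```python
import random
import string
import zlib

def compression_length(s: str) -> int:
    """Length in bytes of the zlib-compressed string: a crude,
    computable upper bound on the object's complexity."""
    return len(zlib.compress(s.encode("utf-8"), 9))

# A regular object has a short description...
regular = "ab" * 500                  # 1000 characters, highly repetitive

# ...while an irregular one does not (pseudo-random letters, seeded
# for reproducibility).
rng = random.Random(0)
irregular = "".join(rng.choice(string.ascii_letters) for _ in range(1000))

print(compression_length(regular))    # small: the pattern compresses well
print(compression_length(irregular))  # large: no short description found
```

Both strings have the same length, yet the regular one compresses to a few dozen bytes while the irregular one barely compresses at all.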
Creating artificial intelligence is one of the greatest challenges in the history of humankind. Programs are said to be "intelligent" because they solve difficult problems, such as playing the game of Go. Unfortunately, artificial intelligence is often perceived as no more than that: a collection of brilliant, innovative methods for solving problems. Most people do not imagine that intelligent behaviour can be universally described in terms of algorithmic information.
There is currently growing interest in complexity and AIT for their role in the theoretical foundations of Artificial Intelligence. Moreover, practical approaches to complexity based on compression techniques or minimum description length offer efficient techniques in machine learning. AIT also plays an important role in mathematics, for instance in setting limits on what a formal theory or an intelligent system can do. More recently, AIT has been shown to be essential for addressing aspects of human intelligence, such as perception, relevance, decision making and emotional intensity.
Content
Topics
Chapter 1.
Complexity measured by code length.
Complexity of integers.
Conditional complexity.

Chapter 2.
Compressibility.
Language recognition through compression.
Huffman codes.
Complexity and frequency.
Zipf's law.
"Google" distance.
Meaning distance.

Chapter 3.
Incomputability of C.
Algorithmic probability.
Algorithmic information.
Randomness.
Gödel's theorem revisited.

Chapter 4.
Induction.
Minimum Description Length (MDL).
Analogy as complexity minimization.
Machine learning and compression.

Chapter 5.
Cognitive complexity.
Simplicity and coincidences.
Subjective probability & subjective information.
Relevance.
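As a minimal illustration of the "Huffman codes" and "Complexity and frequency" topics of Chapter 2 (a sketch, not course material), the code below builds a Huffman code from symbol frequencies: frequent symbols receive short codewords, rare symbols long ones, mirroring the link between frequency and description length.

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code for `text` as a {symbol: bitstring} dict.
    Frequent symbols receive shorter codewords."""
    freq = Counter(text)
    # Heap entries: (weight, unique tiebreak, {symbol: codeword-so-far}).
    heap = [(n, i, {ch: ""}) for i, (ch, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        # Merge the two lightest subtrees, prefixing their codewords
        # with 0 and 1 respectively.
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {ch: "0" + code for ch, code in c1.items()}
        merged.update({ch: "1" + code for ch, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

text = "aaaaaaaabbbbccd"   # 'a' occurs 8 times, 'd' only once
code = huffman_code(text)
print(code)  # 'a' gets a 1-bit codeword, 'd' a 3-bit one
```

The resulting codeword lengths approximate -log2 of each symbol's frequency, which is why compressed length can serve as a practical complexity estimate.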
Slides
Validation
- Answers to questions during the lab sessions are recorded and evaluated.
- Students will have to answer a short quiz on the last day.
- Students will make a small original contribution (typically, as a continuation of a lab-work question) and present it orally.
This microstudy should emphasize the link with Kolmogorov complexity and algorithmic information. You are expected to choose a topic of study and to produce something for this project (typically, a small program). The topic of the project must be related to Kolmogorov complexity.
Very soon:
Before the end of the course, your small report will describe your project and what you found (typically 3 or 4 pages, more if there are images). Don't forget to include:
- Your name, the date and context
- The title of your study
- The problem you addressed
- What you did + what you found
- A bit of theoretical reflection/extension about what you did
- A bibliography (with weblinks)
All contributions that pass will be grouped together into a document made accessible to all.
Short bibliography
- Chaitin, G. J. (2004). On the intelligibility of the universe and the notions of simplicity, complexity and irreducibility. In W. Hogrebe & J. Bromand (Eds.), Grenzen und Grenzüberschreitungen, XIX, 517–534. Berlin: Akademie Verlag.
- Chaitin, G. J. (2005). Meta Math! The quest for Omega. Vintage Books, ed. 2006.
- Chater, N. (1999). The search for simplicity: A fundamental cognitive principle? The Quarterly Journal of Experimental Psychology, 52(A), 273–302.
- Devine, S. D. (2014). Algorithmic information theory: Review for physicists and natural scientists.
- Downey, R. G. & Hirschfeldt, D. R. (2010). Algorithmic randomness and complexity. New York: Springer.
- Grünwald, P. D. (2007). The minimum description length principle. MIT Press.
- Hutter, M. (2005). Universal artificial intelligence: Sequential decisions based on algorithmic probability. Berlin: Springer.
- Li, M. & Vitányi, P. (1993). An introduction to Kolmogorov complexity and its applications (3rd ed.). New York: Springer Verlag, ed. 1997.
- Solomonoff, R. J. (1978). Complexity-based induction systems: Comparisons and convergence theorems. IEEE Transactions on Information Theory, 24(4), 422–432.
- Solomonoff, R. J. (1997). The discovery of algorithmic probability. Journal of Computer and System Sciences, 55(1), 73–88.
- Vitányi, P. & Li, M. (2001). Simplicity, information, Kolmogorov complexity and prediction. In A. Zellner, H. A. Keuzenkampf & M. McAleer (Eds.), Simplicity, inference and modelling: Keeping it sophisticatedly simple, 135–155. Cambridge, UK: Cambridge University Press.
- Zenil, H. (2013). A computable universe: Understanding and exploring nature as computation. World Scientific.
In French: