Michael I. Jordan


Michael Irwin Jordan is an American scientist, professor at the University of California, Berkeley, and a researcher in machine learning, statistics, and artificial intelligence.

In the 1980s, Jordan began developing recurrent neural networks as a cognitive model. In recent years, though, his work has been driven less by a cognitive perspective and more by the framework of traditional statistics.

He popularised Bayesian networks in the machine learning community and is known for highlighting the links between machine learning and statistics. Jordan was also prominent in the formalisation of variational methods for approximate inference and in the popularisation of the expectation-maximization algorithm in machine learning.