
PhD studentship: Statistical Mechanics Analysis of Natural Gradient
Learning

Applications are sought for a three-year PhD position to study natural
gradient learning using the methods of statistical mechanics and
stochastic dynamical systems. The position will be supported by an
EPSRC studentship and based in the Computer Science Department at the
University of Manchester, one of the largest and most successful
computer science departments in the UK. Living expenses will be paid
at current EPSRC rates (19,635 pounds over three years), with
substantial extra funding available for participation in international
conferences and workshops.
Project description:
Natural gradient learning was recently introduced as a principled
algorithm for determining the parameters of a statistical model
online. The algorithm has been applied to feedforward neural
networks, independent component analysis and deconvolution algorithms,
often providing much improved performance over existing methods. The
algorithm uses an underlying Riemannian parameter space to redefine
the direction of steepest descent, and respects certain invariances
which should be observed by any consistent algorithm. Natural gradient
learning is known to provide optimal asymptotic performance under
certain restricted conditions, but a good general understanding of the
non-asymptotic learning performance is not yet available. This is the
regime expected to dominate the learning time, and recent work by the
project supervisor and co-workers [1,2] provides some quantification
of the advantage which can be expected over other algorithms. This
analysis involves a statistical mechanics formalism which allows an
exact solution of the learning dynamics in a feedforward neural
network. The proposed project will build on these initial results in
order to characterize the behaviour of natural gradient learning with
greater generality. The project will also explore other applications
of information geometry to probabilistic modelling.
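As an illustrative sketch only (not part of the project description), the contrast between ordinary and natural gradient updates can be seen in a toy online problem where the Fisher information is known in closed form: estimating the parameter of a Bernoulli model from a stream of observations. All names and values below are hypothetical choices for the sketch.

```python
import random

random.seed(0)
p_true = 0.7
# 5000 Bernoulli(p_true) observations, presented one at a time (online learning)
data = [1 if random.random() < p_true else 0 for _ in range(5000)]

def log_lik_grad(p, x):
    # d/dp log[p^x (1-p)^(1-x)] = (x - p) / (p (1 - p))
    return (x - p) / (p * (1.0 - p))

def fisher(p):
    # Fisher information of the Bernoulli model: 1 / (p (1 - p))
    return 1.0 / (p * (1.0 - p))

def clip(p, lo=1e-3, hi=1.0 - 1e-3):
    # keep the estimate inside the open unit interval
    return min(max(p, lo), hi)

eta = 0.01            # learning rate (arbitrary for this sketch)
p_plain = p_nat = 0.5  # both learners start from the same guess
for x in data:
    # ordinary online gradient ascent on the log-likelihood
    p_plain = clip(p_plain + eta * log_lik_grad(p_plain, x))
    # natural gradient: premultiply by the inverse Fisher information,
    # which here reduces the update to eta * (x - p_nat)
    p_nat = clip(p_nat + eta * log_lik_grad(p_nat, x) / fisher(p_nat))

print(f"plain: {p_plain:.3f}  natural: {p_nat:.3f}  true: {p_true}")
```

Both estimators drift towards p_true, but the natural gradient update is, to leading order in the learning rate, invariant under reparametrization of the model; this is the kind of invariance alluded to above, though the project concerns far richer models than this one-parameter example.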
This project will touch on many interesting mathematical topics
(information theory, differential geometry, statistical mechanics and
stochastic dynamical systems) and application areas (optimization,
neural networks, probabilistic modelling). Prospective candidates
would ideally be interested in a number of these topics. A good first
degree in physics, mathematics or a related subject is required.
Contact: Magnus Rattray ([log in to unmask])
Computer Science Department,
University of Manchester,
Manchester M13 9PL, UK.
Tel +44 161 275 6187.
http://www.cs.man.ac.uk/~magnus/magnus.html
References:
[1] M. Rattray, D. Saad, S. Amari, "Natural Gradient Descent for Online
Learning", Physical Review Letters 81, p. 5461 (1998).
[2] M. Rattray, D. Saad, "Transients and Asymptotics of Natural Gradient
Learning", Proceedings of ICANN 98, edited by L. Niklasson, M. Boden and
T. Ziemke (Springer-Verlag, London), p. 165 (1998).
