FISES Archives

Forum on Statistical Physics

FISES@LISTSERV.REDIRIS.ES

Subject:
From:
Angeles Rodriguez <[log in to view]>
Reply To:
Forum on Statistical Physics <[log in to view]>
Date:
Wed, 13 Jan 1999 13:08:39 +0100
Content-Type:
text/plain
Parts/Attachments:
text/plain (62 lines)
---------------------------------------------------------------------
PhD studentship: Statistical Mechanics Analysis of Natural Gradient
                 Learning
---------------------------------------------------------------------

Applications are sought for a three year PhD position to study natural
gradient learning using the methods of statistical mechanics and
stochastic dynamical systems. The position will be supported by an
EPSRC studentship and based in the computer science department at
Manchester University, which is one of the largest and most successful
computer science departments in the UK. Living expenses will be paid
according to current EPSRC rates (19635 pounds over three years) with
substantial extra funding available for participation at international
conferences and workshops.

Project description:

Natural gradient learning was recently introduced as a principled
algorithm for determining the parameters of a statistical model
on-line. The algorithm has been applied to feed-forward neural
networks, independent component analysis and deconvolution algorithms,
often providing much improved performance over existing methods. The
algorithm uses an underlying Riemannian parameter space to redefine
the direction of steepest descent, and it respects certain
invariances that should be observed by any consistent
algorithm. Natural gradient learning is known to provide optimal
asymptotic performance under certain restricted conditions, but a
good general understanding of the non-asymptotic learning performance
is not yet available. This is the regime expected to dominate the
learning time, and recent work by the project supervisor
and co-workers [1,2] provides some quantification of the advantage
which can be expected over other algorithms. This analysis involves a
statistical mechanics formalism which allows an exact solution to
learning dynamics for learning in a feed-forward neural network. The
proposed project will build on these initial results in order to
characterize the behaviour of natural gradient learning with greater
generality. The project will also explore other applications of
information geometry to probabilistic modelling.
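
As a rough illustration (not part of the announcement itself): natural
gradient descent preconditions the ordinary log-likelihood gradient with
the inverse Fisher information matrix, updating parameters as
theta <- theta + eta * F(theta)^{-1} * grad log p(x|theta). The sketch
below, a hypothetical example assuming on-line estimation of a
Gaussian's mean and standard deviation (where the Fisher matrix is
diagonal and known in closed form), shows the basic update loop:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=0.5, size=2000)  # hypothetical data stream

mu, sigma = 0.0, 1.0   # initial parameter estimates
eta = 0.05             # learning rate

for x in data:
    # Per-sample log-likelihood gradients for N(mu, sigma^2):
    g_mu = (x - mu) / sigma**2
    g_sigma = ((x - mu) ** 2 - sigma**2) / sigma**3
    # Fisher information for (mu, sigma) is diagonal: F = diag(1/sigma^2, 2/sigma^2),
    # so the natural gradient F^{-1} g is available in closed form:
    nat_mu = sigma**2 * g_mu
    nat_sigma = (sigma**2 / 2.0) * g_sigma
    # On-line natural gradient ascent on the log-likelihood:
    mu += eta * nat_mu
    sigma += eta * nat_sigma

print(mu, sigma)  # estimates approach the true values 2.0 and 0.5
```

Note that the natural gradient step for mu reduces to eta * (x - mu),
independent of sigma, which is one instance of the parameterisation
invariance mentioned above.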

This project will touch on many interesting mathematical topics
(information theory, differential geometry, statistical mechanics and
stochastic dynamical systems) and application areas (optimization,
neural networks, probabilistic modelling). Prospective candidates
would ideally be interested in a number of these topics. A good first
degree in physics, mathematics or a related subject is required.

Contact:  Magnus Rattray ([log in to view])
          Computer Science Department,
          University of Manchester,
          Manchester M13 9PL, UK.
          Tel +44 161 275 6187.
          http://www.cs.man.ac.uk/~magnus/magnus.html

References:

[1] M Rattray, D Saad, S Amari, "Natural Gradient Descent for On-line
Learning", Physical Review Letters 81, p5461 (1998).

[2] M Rattray, D Saad, "Transients and Asymptotics of Natural Gradient
Learning", Proceedings of ICANN 98, edited by L Niklasson, M Boden and
T Ziemke (Springer-Verlag, London), p165 (1998).
