Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

Studying the rate of convergence of gradient optimisation algorithms via the theory of optimal experimental design

Haycroft, Rebecca Jane 2008. Studying the rate of convergence of gradient optimisation algorithms via the theory of optimal experimental design. PhD Thesis, Cardiff University.

PDF - Accepted Post-Print Version
Download (23MB)

Abstract

The most common class of methods for solving quadratic optimisation problems is the class of gradient algorithms, the most famous of which is the Steepest Descent algorithm. The development of a particular gradient algorithm, the Barzilai-Borwein algorithm, has sparked a great deal of research in the area in recent years, and many algorithms now exist with faster rates of convergence than that of the Steepest Descent algorithm. The tools available for analysing and comparing the asymptotic rates of convergence of gradient algorithms are, however, limited, and so it is somewhat unclear from the literature which algorithms possess the faster rates of convergence. In this thesis, methodology is developed to enable better analysis of the asymptotic rates of convergence of gradient algorithms applied to quadratic optimisation problems. This methodology stems from a link with the theory of optimal experimental design. It is established that gradient algorithms can be related to algorithms for constructing optimal experimental designs for linear regression models. Furthermore, the asymptotic rates of convergence of these gradient algorithms can be expressed through the asymptotic behaviour of multiplicative algorithms for constructing optimal experimental designs. This connection to optimal experimental design has also been used to guide the creation of several new gradient algorithms that would not otherwise have been intuitively thought of. The asymptotic rates of convergence of these algorithms are studied extensively, and insight is given into how some gradient algorithms are able to converge faster than others. It is demonstrated that the worst rates are obtained when the corresponding multiplicative procedure for updating the designs converges to the optimal design. Simulations reveal that the asymptotic rates of convergence of some of these new algorithms compare favourably with those of existing gradient-type algorithms such as the Barzilai-Borwein algorithm.

Item Type: Thesis (PhD)
Status: Unpublished
Schools: Mathematics
Subjects: Q Science > QA Mathematics
ISBN: 9781303212994
Date of First Compliant Deposit: 30 March 2016
Last Modified: 10 Jan 2018 02:28
URI: http://orca-mwe.cf.ac.uk/id/eprint/54699
