Discrete Optimisation and Big Data

With the ever-increasing amount of data now available, one faces optimisation problems that are too large to be solved by traditional methods. There is therefore a need for “scalable” algorithms, i.e., algorithms whose time and memory requirements grow only slowly with the size of the instance. This project is concerned with the development, analysis and testing of scalable algorithms for computing lower and/or upper bounds for discrete optimisation problems. Such algorithms could include ones based on the “primal-dual” paradigm, as well as “sublinear” algorithms, which work with only a sample of the problem data.
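To make the intended style of algorithm concrete, here is a minimal Python sketch in the primal-dual spirit, using minimum vertex cover as an example problem. The choice of problem and the random instance are illustrative assumptions, not part of the project specification. A single pass over the edges builds a maximal matching: the matching size is a lower bound on the optimum, and taking both endpoints of every matched edge gives a feasible cover, hence an upper bound at most twice the optimum. The pass runs in time linear in the number of edges, so it scales to large instances.

```python
import random

def vertex_cover_bounds(edges):
    """Greedy maximal matching over an edge stream.

    Every vertex cover must hit each matched edge with a distinct
    vertex, so the matching size is a lower bound on the minimum
    vertex cover; since the matching is maximal, the set of matched
    endpoints is itself a feasible cover, giving an upper bound at
    most twice the optimum.
    """
    matched = set()   # matched endpoints (the candidate cover)
    matching = 0      # number of matched edges (the lower bound)
    for u, v in edges:
        if u not in matched and v not in matched:
            matched.update((u, v))
            matching += 1
    return matching, len(matched)  # (lower bound, upper bound)

if __name__ == "__main__":
    # Random instance purely for illustration.
    n, m = 1000, 5000
    edges = [(random.randrange(n), random.randrange(n)) for _ in range(m)]
    edges = [(u, v) for u, v in edges if u != v]  # drop self-loops
    lb, ub = vertex_cover_bounds(edges)
    print(f"lower bound = {lb}, upper bound = {ub}")
```

The algorithm touches each edge once and stores only the matched vertices, which is the kind of time and memory profile the project has in mind; a sublinear variant would apply the same idea to a random sample of the edges.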

Applicants for this research project should have good qualifications in mathematics or a related discipline, along with experience in computer programming. Experience with LaTeX would also be desirable, but it is not essential, as training can be given.

For further information, contact Trivikram Dokka or Adam Letchford.
