Image shows an aerial view of Hurricane Matthew. Improvements to optimisation algorithms enable weather forecasters to make better weather predictions. Credit: Pixabay

Numerous science and engineering applications require finding the lowest or highest value of a mathematical model, and this is usually obtained computationally by running an optimisation algorithm. With big data, however, the models become more complex, and arriving at an optimal or close-to-optimal solution becomes far more time-consuming and computationally expensive. The computational efficiency of the algorithm is therefore of paramount importance in such settings.
Prof Vincent TAN from the Department of Mathematics, NUS, and Prof William HASKELL from the Department of Industrial Systems Engineering and Management, NUS, together with their student Renbo ZHAO, proposed several practical strategies that improve the computational speed of a well-known optimisation algorithm, the stochastic L-BFGS (limited-memory Broyden-Fletcher-Goldfarb-Shanno) algorithm, by several orders of magnitude. They also proved theoretically that their proposed algorithms converge much faster than existing alternatives.
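For readers who want a concrete sense of the machinery involved, the sketch below shows the classic (deterministic) L-BFGS two-loop recursion that underlies these methods; the stochastic variants studied in the paper replace full gradients with sampled ones. This is a minimal illustration, not the paper's algorithm, and the function and variable names are our own.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Classic L-BFGS two-loop recursion.

    Approximates H^{-1} @ grad using the m most recent curvature
    pairs (s_k = x_{k+1} - x_k, y_k = grad_{k+1} - grad_k).
    """
    if not s_list:
        return grad.copy()  # no curvature info yet: steepest descent
    q = grad.copy()
    rhos = [1.0 / y.dot(s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        alpha = rho * s.dot(q)
        q -= alpha * y
        alphas.append(alpha)
    # Scale by an initial Hessian approximation H0 = gamma * I
    s, y = s_list[-1], y_list[-1]
    q *= s.dot(y) / y.dot(y)
    # Second loop: oldest pair to newest
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos),
                                  reversed(alphas)):
        beta = rho * y.dot(q)
        q += (alpha - beta) * s
    return q  # the search direction is -q
```

The "limited-memory" in L-BFGS refers to keeping only the last m curvature pairs, so the recursion runs in O(m d) time and memory rather than the O(d^2) cost of storing a full Hessian approximation.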
Several aspects of existing stochastic L-BFGS algorithms can be improved for higher computational efficiency. For example, traditional implementations use uniform sampling at each computational step. Surprisingly, the NUS mathematicians showed that a non-uniform sampling strategy allows the algorithm to converge much faster, as the sketch below illustrates.
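The paper's precise sampling scheme is not reproduced here, but the underlying idea can be illustrated: for an objective that is a sum over data points, draw point i with probability proportional to some measure of its importance (row norms serve as a common proxy), then reweight the sampled gradient so the estimator stays unbiased. The following Python sketch, with hypothetical names and a toy least-squares problem, compares uniform and non-uniform sampling.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy finite-sum objective: f(x) = (1/n) * sum_i 0.5 * (a_i . x - b_i)^2
n, d = 1000, 20
A = rng.normal(size=(n, d))
A[:50] *= 10.0               # a few rows dominate the curvature
b = rng.normal(size=n)

def grad_i(x, i):
    """Gradient of the i-th summand."""
    return (A[i].dot(x) - b[i]) * A[i]

# Non-uniform distribution: p_i proportional to ||a_i||^2, a common
# importance measure for least squares.
p = np.linalg.norm(A, axis=1) ** 2
p /= p.sum()

def stochastic_grad(x, uniform=True):
    if uniform:
        i = rng.integers(n)
        return grad_i(x, i)
    # Importance sampling: divide by n * p_i so the estimate of the
    # full gradient remains unbiased.
    i = rng.choice(n, p=p)
    return grad_i(x, i) / (n * p[i])

# Compare estimator error against the full gradient at a random point;
# the non-uniform scheme typically has lower error when row norms
# vary widely, which is what speeds up the overall optimisation.
x = rng.normal(size=d)
full = A.T @ (A @ x - b) / n
for uniform in (True, False):
    samples = np.array([stochastic_grad(x, uniform) for _ in range(2000)])
    mse = np.mean(np.sum((samples - full) ** 2, axis=1))
    print("uniform" if uniform else "non-uniform", "estimator MSE:", mse)
```

Lower-variance gradient estimates mean each iteration makes more reliable progress, which is one route by which a non-uniform strategy can translate into faster convergence.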
Prof Tan said, "Large-scale optimisation problems lie at the heart of big data analysis. It is important to be able to solve them efficiently. Many practical problems, such as training of neural networks for machine learning, are complex in nature and can benefit from stochastic optimisation algorithms. We believe that the theoretical advancements will have significant impact in enhancing the training phases of a variety of machine learning algorithms."
More information: "Stochastic L-BFGS Revisited: Improved Convergence Rates and Practical Acceleration Strategies", Proceedings of Conference on Uncertainty in Artificial Intelligence (UAI), Sydney, Australia, Aug 2017 (to appear).