NEO

NEuro-Inspired Optimization—A Fractional Time Series Approach

Journal Article (2021)
Author(s)

Sarthak Chatterjee (Rensselaer Polytechnic Institute)

Subhro Das (IBM Research)

Sérgio Pequito (TU Delft - Team Sergio Pequito)

Research Group
Team Sergio Pequito
DOI
https://doi.org/10.3389/fphys.2021.724044
Publication Year
2021
Language
English
Volume number
12
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Solving optimization problems is a recurrent theme across different fields, including large-scale machine learning systems and deep learning. In practical applications, we often encounter objective functions whose Hessian is ill-conditioned, which precludes the use of optimization algorithms that exploit second-order information. In this paper, we propose to circumvent this issue with fractional time series analysis methods that have been used successfully to model neurophysiological processes. In particular, the long memory property of fractional time series, whose trajectories exhibit non-exponential power-law decay, appears to capture behavior associated with the local curvature of the objective function at a given point. Specifically, we propose a NEuro-inspired Optimization (NEO) method that leverages this behavior, in contrast with the short memory characteristics of currently used methods (e.g., gradient descent and the heavy-ball method). We provide evidence of the efficacy of the proposed method on a wide variety of settings implicitly found in practice.
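To make the short-memory vs. long-memory contrast in the abstract concrete, the sketch below compares the heavy-ball method, whose influence of past gradients decays exponentially through a momentum term, with a hypothetical long-memory update whose step aggregates past gradients under power-law weights w_k proportional to (k+1)^(-(1+alpha)). This is only an illustration of the long-memory idea, not the paper's actual NEO algorithm; the step size eta, exponent alpha, and truncation window are illustrative assumptions.

```python
import numpy as np

def heavy_ball(grad, x0, eta=0.1, beta=0.9, iters=500):
    """Short-memory baseline: momentum gives past gradients
    exponentially (geometrically in beta) decaying influence."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(iters):
        v = beta * v - eta * grad(x)
        x = x + v
    return x

def power_law_memory_descent(grad, x0, eta=0.1, alpha=0.6,
                             iters=500, window=64):
    """Illustrative long-memory variant (NOT the paper's NEO method):
    the step is a power-law-weighted sum of past gradients, so old
    gradients fade slowly (non-exponentially), mimicking the long
    memory property of fractional time series from the abstract."""
    x = np.asarray(x0, dtype=float)
    k = np.arange(window)
    w = (k + 1.0) ** (-(1.0 + alpha))  # power-law decaying weights
    w /= w.sum()  # normalize so the effective step stays ~eta
    history = []  # most recent gradient first
    for _ in range(iters):
        history.insert(0, grad(x))
        history = history[:window]  # truncate the memory window
        step = sum(wk * gk for wk, gk in zip(w, history))
        x = x - eta * step
    return x
```

On a well-conditioned quadratic both variants converge; the intended difference is qualitative: the power-law kernel keeps a slowly fading record of the whole gradient trajectory, which is the kind of behavior the abstract associates with local curvature information.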