
RISE: An Incremental Trust-Region Method for Robust Online Sparse Least-Squares Estimation

David M. Rosen, Student Member, IEEE, Michael Kaess, Member, IEEE, and John J. Leonard, Fellow, IEEE

Received Aug. 5, 2013; revised Jan. 9, 2014; accepted Mar. 31, 2014. D.M. Rosen and J.J. Leonard are with the Computer Science and Artificial Intelligence Laboratory of the Massachusetts Institute of Technology, Cambridge, MA 02139, USA. Email: {dmrosen|[email protected]. M. Kaess is with the Robotics Institute at Carnegie Mellon University, Pittsburgh, PA 15213, USA. Email: [email protected]. This work was partially supported by Office of Naval Research (ONR) grants N00014-12-1-0093, N00014-11-1-0688, N00014-06-1-0043 and N00014-10-1-0936, and by Air Force Research Laboratory (AFRL) contract FA8650-11-C-7137.

Abstract—Many point estimation problems in robotics, computer vision and machine learning can be formulated as instances of the general problem of minimizing a sparse nonlinear sum-of-squares objective function. For inference problems of this type, each input datum gives rise to a summand in the objective function, and therefore performing online inference corresponds to solving a sequence of sparse nonlinear least-squares minimization problems in which additional summands are added to the objective function over time. In this paper we present Robust Incremental least-Squares Estimation (RISE), an incrementalized version of the Powell’s Dog-Leg numerical optimization method suitable for use in online sequential sparse least-squares minimization. As a trust-region method, RISE is naturally robust to objective function nonlinearity and numerical ill-conditioning, and is provably globally convergent for a broad class of inferential cost functions (twice-continuously differentiable functions with bounded sublevel sets). Consequently, RISE maintains the speed of current state-of-the-art online sparse least-squares methods while providing superior reliability.

Index Terms—Sparse least-squares minimization, online estimation, SLAM, computer vision, machine learning

Fig. 1. The Powell’s Dog-Leg update step $h_{dl}$ is obtained by interpolating the (possibly approximate) Newton step $h_N$ and the gradient descent step $h_{gd}$ using a trust-region of radius $\Delta$ centered on the current iterate $x$. By adapting $\Delta$ online in response to the observed performance of the Newton steps near $x$, the algorithm is able to combine the rapid end-stage convergence speed of Newton-type methods with the reliability of gradient descent.

I. INTRODUCTION

Many point estimation problems in robotics, computer vision and machine learning can be formulated as instances of the general problem of minimizing a sparse nonlinear sum-of-squares objective function; for example, the archetypal problems of full (smoothing) simultaneous localization and mapping (SLAM) [1] (in robotics), bundle adjustment (BA) [2], [3] (in computer vision), and sparse (kernel) regularized least-squares classification and regression [4], [5] (in machine learning) all belong to this class. For inference problems of this type, each input datum gives rise to a summand in the objective function, and therefore performing online inference (in which the data is collected sequentially and the estimate updated after the incorporation of each new datum) corresponds to solving a sequence of sparse least-squares minimization problems in which additional summands are added to the objective function over time.

In practice, these online inference problems are often solved by computing each estimate in the sequence as the solution of an independent minimization problem using standard sparse least-squares techniques (most commonly Levenberg-Marquardt [6]–[8]). While this approach is general and produces good results, it is computationally expensive, and does not exploit the sequential structure of the underlying inference problem; this limits its utility in real-time online applications, where speed is crucial.

More sophisticated solutions achieve faster computation by directly exploiting the sequentiality of the online inference problem. The canonical example is online gradient descent, which is attractive for its robustness, simplicity, and low memory and per-iteration computational costs, but its first-order rate can lead to painfully slow convergence [8]. Alternatively, Kaess et al. have developed incremental smoothing and mapping (iSAM) [9], [10], which exploits recent algorithmic advances in sparse numerical linear algebra to implement an efficient incrementalized version of the Gauss-Newton method [8] for use in online sparse least-squares minimization. This incremental approach, together with the Gauss-Newton method’s superlinear convergence rate, enables iSAM to achieve computational speeds unmatched by iterated batch techniques. However, the Gauss-Newton method can exhibit poor (even globally divergent) behavior when applied to objective functions with significant nonlinearity [11], which restricts the class of problems to which iSAM can be reliably applied. To date, the development of a fully incremental online sparse least-squares solver that combines the robustness of gradient descent with the superlinear convergence rate of Newton-type methods has remained an outstanding problem.

To that end, in this paper we present Robust Incremental least-Squares Estimation (RISE), an incrementalized version of the Powell’s Dog-Leg numerical optimization algorithm [8], [12] suitable for use in online sequential sparse least-squares minimization. As a trust-region method (Fig. 1), Powell’s Dog-Leg is naturally robust to objective function nonlinearity and numerical ill-conditioning, and enjoys excellent global convergence properties [13]–[15]; furthermore, it is known to perform significantly faster than Levenberg-Marquardt in batch sparse least-squares minimization while obtaining solutions of comparable quality [16]. By exploiting iSAM’s pre-existing functionality to incrementalize the computation of the dog-leg step, RISE maintains the speed of current state-of-the-art online sparse least-squares solvers while providing superior robustness to objective function nonlinearity and numerical ill-conditioning.
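To make the step-selection geometry of Fig. 1 concrete, the following is a minimal sketch of the classical batch dog-leg step, written in Python with NumPy. This is our textbook-style illustration, not the RISE or iSAM implementation; it assumes the Jacobian has full column rank so that the Gauss-Newton step exists, and all function and variable names are ours.

```python
import numpy as np

def dogleg_step(J, r, delta):
    """Classical Powell's Dog-Leg step selection (textbook form, our sketch).

    J: Jacobian of the residual at the current iterate (shape (m, n))
    r: residual vector at the current iterate (shape (m,))
    delta: trust-region radius
    Assumes J has full column rank, so the Gauss-Newton step exists.
    """
    g = J.T @ r                                   # gradient direction of (1/2)||r(x)||^2
    h_gn = np.linalg.solve(J.T @ J, -g)           # (approximate) Newton step h_N
    alpha = (g @ g) / np.linalg.norm(J @ g) ** 2  # optimal step length along -g
    h_gd = -alpha * g                             # gradient descent step h_gd

    if np.linalg.norm(h_gn) <= delta:
        return h_gn                               # Newton step lies inside the region
    if np.linalg.norm(h_gd) >= delta:
        return (delta / np.linalg.norm(h_gd)) * h_gd  # scaled-back gradient step
    # Otherwise, walk along the "dog leg" from h_gd toward h_gn and stop
    # where the path crosses the trust-region boundary ||h|| = delta:
    d = h_gn - h_gd
    a, b, c = d @ d, 2.0 * (h_gd @ d), h_gd @ h_gd - delta**2
    beta = (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    return h_gd + beta * d
```

The surrounding trust-region machinery then grows or shrinks `delta` depending on how well the predicted cost reduction matches the observed one, which is the adaptation described in the caption of Fig. 1.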
The rest of this paper is organized as follows. In the next section we formulate the sequential sparse least-squares minimization problem and discuss its connections to online inference. In Section III we review the class of Newton-type optimization methods, focusing in particular on the Gauss-Newton method and its incrementalization to produce iSAM. Section IV introduces the general class of trust-region methods, paying particular attention to Powell’s Dog-Leg. Here we derive the indefinite Gauss-Newton-Powell’s Dog-Leg (IGN-PDL) algorithm (an extension of Powell’s Dog-Leg with Gauss-Newton steps to the case of indefinite Jacobians), analyze its robustness with respect to objective function nonlinearity and numerical ill-conditioning, and establish sufficient conditions for its global convergence (Theorem 3). We then derive the RISE and RISE2 algorithms in Section V by incrementalizing IGN-PDL with the aid of iSAM. We contextualize RISE with a discussion of related work in Section VI, and evaluate its performance in Section VII on standard 6DOF pose-graph SLAM benchmarks and on a real-world visual mapping task using a calibrated monocular camera. Finally, Section VIII concludes with a summary of this paper’s contributions and a discussion of future research directions.

Fig. 2. The factor graph formulation of the full (smoothing) SLAM problem. Here variable nodes are shown as large circles and factor nodes as small solid circles. The variables consist of robot poses $x$ and landmark positions $l$, and the factors are odometry measurements $u$, a prior $p$, loop closing constraints $c$ and landmark measurements $m$. Each of the factors corresponds to a summand $r_i$ in (1). In the online case, as the robot explores previously unseen areas, new variable nodes (i.e. robot positions and landmarks) and factor nodes (measurements) are added to this graph over time; the corresponding online inference problem is then given by (2).

II. PROBLEM FORMULATION

We are interested in the general problem of obtaining a point estimate $x^* \in \mathbb{R}^n$ of some quantity of interest $X$ as the solution of a sparse nonlinear least-squares problem

\[
\min_{x \in \mathbb{R}^n} S(x), \qquad S(x) = \sum_{i=1}^{m} r_i(x)^2 = \|r(x)\|^2 \tag{1}
\]

for $r \colon \mathbb{R}^n \to \mathbb{R}^m$ with $m \geq n$. Given the ubiquity of these models, robust and computationally efficient methods for solving (1) are of significant practical import.
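To make (1) concrete, here is a toy instance (our illustration, not an example from the paper), written in Python with NumPy: a chain of three scalar poses with one prior and two odometry measurements, mirroring the factor-graph structure of Fig. 2. All measurement values and names are made up for illustration.

```python
import numpy as np

# Toy instance of (1): three scalar poses x = (x0, x1, x2), one prior factor p
# on x0 and two odometry factors u1, u2 (cf. the factor graph of Fig. 2).
p, u1, u2 = 0.0, 1.0, 1.2   # illustrative measurement values

def r(x):
    """Stacked residual vector; each entry is one summand r_i of (1)."""
    return np.array([
        x[0] - p,             # prior factor on pose 0
        (x[1] - x[0]) - u1,   # odometry factor between poses 0 and 1
        (x[2] - x[1]) - u2,   # odometry factor between poses 1 and 2
    ])

def S(x):
    """Objective S(x) = sum_i r_i(x)^2 = ||r(x)||^2."""
    rx = r(x)
    return float(rx @ rx)
```

Here $m = n = 3$ and each $r_i$ depends on at most two states, so the Jacobian of $r$ is sparse; in the online setting formalized next, each new measurement appends a further entry to $r$ (and possibly new states to $x$).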
In the case of online inference, the input data is collected sequentially, and we wish to obtain a revised estimate for $X$ after the incorporation of each datum. Since each input datum gives rise to a summand in (1), online inference corresponds to solving the sequence of sparse least-squares problems

\[
\min_{x_t \in \mathbb{R}^{n_t}} S^{(t)}(x_t), \qquad S^{(t)}(x_t) = \sum_{i=1}^{m_t} r_i(x_i)^2 = \left\| r^{(t)}(x_t) \right\|^2 \tag{2}
\]

for $r^{(t)} \colon \mathbb{R}^{n_t} \to \mathbb{R}^{m_t}$ and $t = 1, 2, \ldots$, where:

1) $m_t, n_t \in \mathbb{N}^+$ are monotonically non-decreasing in $t$,
2) $m_t \geq n_t$ for all $t$,
3) $x_i \in \mathbb{R}^{n_i}$ for all $i$ and $x_i \subseteq x_j$ for all $i \leq j$.

Condition 1 above expresses the fact that the summation in (2) evolves over time only through the addition of new terms. Condition 2 is necessary in order for the minimization problem in (2) to have a unique solution. Condition 3 formalizes the idea that we also allow the vector of states $X$ that we wish to estimate to be augmented online by the addition of new quantities of interest (for example, as in the case of robotic mapping when exploring previously unseen areas, cf. Fig. 2).

Our goal in this paper is to develop a fully incremental algorithm capable of robustly solving online sparse least-squares minimization problems of the form (2) in real-time.

III. REVIEW OF NEWTON-TYPE OPTIMIZATION METHODS AND ISAM

The RISE algorithm that we develop in Section V exploits iSAM’s incremental computation of the Gauss-Newton step in order to solve the sequential sparse least-squares problem (2) efficiently in the online case. In this section, we review the general class of Newton-type optimization methods, their spe-
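For reference, a minimal batch Gauss-Newton iteration for a problem of the form (1) looks as follows. This is our textbook-style sketch, not iSAM's incremental implementation (which, per the discussion above, reuses sparse matrix factorizations across updates rather than re-solving from scratch); it assumes full-rank Jacobians and uses no step-size control, which is precisely the fragility that trust-region methods such as Powell's Dog-Leg address.

```python
import numpy as np

def gauss_newton(r, J, x0, max_iters=50, tol=1e-10):
    """Batch Gauss-Newton for min_x ||r(x)||^2 (textbook form, our sketch).

    r: callable returning the residual vector at x (shape (m,))
    J: callable returning the Jacobian of r at x (shape (m, n))
    Assumes J(x) has full column rank at every iterate. With no step
    control, the iteration can diverge on highly nonlinear objectives;
    that is the failure mode trust-region globalization avoids.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        Jx, rx = J(x), r(x)
        # Gauss-Newton step from the normal equations (J^T J) h = -J^T r:
        h = np.linalg.solve(Jx.T @ Jx, -(Jx.T @ rx))
        x = x + h
        if np.linalg.norm(h) < tol:
            break
    return x
```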