Testing Stability by Quantifier Elimination
J. Symbolic Computation (1997) 24, 161–187

HOON HONG†, RICHARD LISKA‡ AND STANLY STEINBERG§

† RISC Linz, Johannes Kepler University, Linz, Austria
‡ Faculty of Nuclear Sciences and Physical Engineering, Czech Technical University in Prague, Břehová 7, 115 19 Prague 1, Czech Republic
§ Department of Mathematics and Statistics, University of New Mexico, Albuquerque NM 87131-1141, U.S.A.

† E-mail: [email protected]
‡ E-mail: [email protected]
§ E-mail: [email protected]

For initial and initial-boundary value problems described by differential equations, stability requires the solutions to behave well for large times. For linear constant-coefficient problems, Fourier and Laplace transforms are used to convert stability problems to questions about roots of polynomials. Many of these questions can be viewed, in a natural way, as quantifier-elimination problems. The Tarski–Seidenberg theorem shows that quantifier-elimination problems are solvable in a finite number of steps. However, the complexity of the resulting algorithm makes it impractical for even the simplest problems. The newer Quantifier Elimination by Partial Cylindrical Algebraic Decomposition (QEPCAD) algorithm is far more practical, allowing the solution of some non-trivial problems. In this paper, we show how to write all common stability problems as quantifier-elimination problems, and develop a set of computer-algebra tools that allows us to find analytic solutions to simple stability problems in a few seconds, and to solve some interesting problems in from a few minutes to a few hours.

© 1997 Academic Press Limited

1. Introduction

Initial-value problems for systems of ordinary differential equations, and initial-boundary value problems for systems of partial differential equations, are among the most common mathematical structures used to model physical processes. In general, these problems are nonlinear, and the first question asked about such problems is: Is the problem well posed? That is, does the problem possess a unique solution that depends continuously on the data of the problem? If the problem is well posed, then the next most important questions concern the stability of the solutions. If two solutions that start out close together remain close together, then the problem is stable, while if the solutions become arbitrarily close, then the problem is asymptotically stable. Otherwise, the problem is unstable.

The analysis of such stability problems begins by studying a system of equations that are linear and have constant coefficients. A standard approach to solving the linear constant-coefficient problems is to use transforms to reduce the problem to an algebraic form; in particular, if a variable is defined for all real values, the Fourier transform in that variable is used, while if the variable is defined for positive real values, the Laplace transform in that variable is used. The transforms and some algebra reduce the computation of solutions to the problem of finding the roots of a polynomial, called the characteristic polynomial, which is a polynomial in the transformed time variable with coefficients that are polynomials in the parameters of the problem and other transform variables.
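As a simple illustration of this reduction (a standard textbook example, included here for concreteness), consider the one-dimensional heat equation $u_t = \alpha\, u_{xx}$ with parameter $\alpha$. Fourier transforming in the spatial variable $x$ (with real transform variable $\xi$) gives $\hat u_t(\xi, t) = -\alpha\, \xi^2\, \hat u(\xi, t)$, and substituting a solution of the form $\hat u(\xi, t) = e^{\lambda t}\, \hat u(\xi, 0)$ yields the characteristic polynomial

$$\lambda + \alpha\, \xi^{2} = 0.$$

Its coefficients are polynomials in the parameter $\alpha$ and the transform variable $\xi$, and its single root $\lambda = -\alpha\, \xi^{2}$ satisfies $\Re\lambda \le 0$ for every real $\xi$ whenever $\alpha \ge 0$, with $\lambda = 0$ at $\xi = 0$.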
The method of choice for solving an initial-boundary value problem for which analytic solutions cannot be found by the transform methods is to approximate the problem by a discrete problem and then solve the discrete problem numerically. As in the continuum, it is important to know that the discrete problem is well posed, and if it is well posed, whether it is stable or asymptotically stable. In fact, more is required, since the discrete problems will contain discretization parameters: step sizes that go to zero, or numbers of grid points that go to infinity. The analysis of discrete problems follows the same pattern as the continuum problem. In particular, linear constant-coefficient problems are analysed first, using the same transforms as in the continuum problems.

For linear constant-coefficient problems, the notion of stability can be somewhat simplified. A problem is well posed if all solutions are bounded in time by an exponential function, is stable if all solutions are bounded, and is asymptotically stable if all solutions converge to zero. The transform method of analysis is not particularly well suited to symbolic computing. However, the transform method shows that all solutions can be found by calculating all solutions of exponential form. Functions of the appropriate exponential form are called trial solutions. For example, for systems of ordinary differential equations, the trial solutions are a vector times an exponential function $e^{\lambda t}$ of the time variable $t$, where $\lambda$ is the complex transform variable related to the time variable $t$. For this exponential to be a solution, $\lambda$ will have to be a root of the characteristic polynomial, where typically the coefficients of the polynomial contain both parameters of the problem and other transform variables. Note that, when $t \to \infty$, if $\Re\lambda > 0$, the trial solution grows in time, while if $\Re\lambda = 0$, the trial solution is bounded, and if $\Re\lambda < 0$, the trial solution converges to zero. (The notation used in this paper is summarized in Appendix A.)

In the case of discrete problems, in particular for systems of ordinary difference equations, the trial solutions are also a vector times an exponential function of $n$, given by setting $t = n\,\Delta t$ in the previous trial solution. For computational purposes it is better to rewrite this function as a power: $e^{\lambda n \Delta t} = s^n$, where $s = e^{\lambda \Delta t}$. Again, note that when $n \to \infty$, if $|s| > 1$, the trial solution grows in time, while if $|s| = 1$, the trial solution is bounded, and if $|s| < 1$, the trial solution converges to zero.

Because of the way that exponentials grow, and given that the stability conditions must be uniform in the transform variables, one is tempted to say that a problem is well posed if, for all roots $\lambda$ of the characteristic equation, there is a constant $K$ (independent of the other transform variables) such that $\Re\lambda \le K$, that the problem is stable if $\Re\lambda \le 0$, and asymptotically stable if $\Re\lambda < 0$. The first and last of these conditions are correct, but the condition for stability is only correct if the polynomial has no multiple roots with $\Re\lambda = 0$. In the case of multiple roots, the solutions of the problem are exponentials, possibly multiplied by polynomials, and then the analysis becomes more difficult. Therefore, for problems in this paper, we restrict our attention to showing that problems are asymptotically stable or unstable. Similar comments apply to discrete problems.

How are such problems cast as quantifier-elimination problems?
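A minimal computational sketch of the discrete trial-solution step may help fix ideas; the scheme, the symbol q, and the SymPy calls below are illustrative assumptions, not the tools developed in this paper. Forward Euler applied to $u' = -a u$ gives $u^{n+1} = (1 - a\Delta t)\, u^n$, so the trial solution $u^n = s^n$ forces $s = 1 - a\Delta t$, and asymptotic stability is the condition $|s| < 1$:

    import sympy as sp

    # Illustrative example: forward Euler for u' = -a*u reads
    # u^{n+1} = (1 - q) * u^n with q = a*Delta_t, so the trial solution
    # u^n = s^n requires s to equal the amplification factor 1 - q.
    q = sp.symbols('q', real=True)          # q stands for a*Delta_t
    s = 1 - q

    # Asymptotic stability of the difference equation means |s| < 1.
    # Squaring keeps the condition polynomial, the form quantifier
    # elimination works with.
    stability = sp.solve_univariate_inequality(s**2 < 1, q)
    print(stability)                        # expected: (0 < q) & (q < 2)

The output is the familiar step-size restriction $0 < a\Delta t < 2$; note that the condition is a polynomial inequality in the parameters, which is exactly the shape of answer the quantifier-elimination formulation below produces.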
Asymptotic stability of the continuum problem for partial differential equations requires that all the roots of the characteristic polynomial have a negative real part for all transform variables related to space. If the characteristic polynomial is $C(\lambda, \vec{\xi}, \vec{\alpha})$, where $\lambda$ is the complex transform variable related to the time variable, the components of $\vec{\xi}$ are the real transform variables related to the spatial variables, and $\vec{\alpha}$ represents the parameters of the problem, then the asymptotic stability of the problem can be written

$$\forall\, \vec{\xi} \in \mathbb{R} \;\; \forall\, \lambda \in \mathbb{C} \;\; \{\, C(\lambda, \vec{\xi}, \vec{\alpha}) = 0 \;\Rightarrow\; \Re\lambda < 0 \,\}. \qquad (1.1)$$

A necessary condition for stability is obtained by replacing $<$ by $\le$ in the last formula. Well posedness can be stated as a similar quantifier-elimination problem that includes the existence of a global upper bound for the real part of the roots. In this paper we consider only asymptotic stability; however, well posedness can be analysed by similar methods.

Eliminating the quantifiers from the previous logical formula produces a formula equivalent to the original but involving only the parameters $\vec{\alpha}$. This formula gives a complete answer to the stability problem. The Tarski–Seidenberg theorem states that the quantifiers in the previous logical formula can be eliminated in a finite number of steps to produce a logical formula in polynomials in $\vec{\alpha}$. However, all of the algorithms for eliminating quantifiers are extremely complex.

In the literature on differential equations and numerical methods, stability problems are solved using a variety of methods, but the solutions of even relatively modest problems are quite complex (e.g. Strikwerda, 1989, Example 11.4.1, p. 260). For difficult stability problems, it is typical either to prove an estimate of the stability condition (e.g. Wendroff, 1991), or to estimate the stability region using numerical sampling (e.g. Ganzha and Vorozhtsov, 1987).

Fortunately, the quantifier-elimination problems arising in stability theory are neither completely general nor trivial. In the next section, a number of special techniques are described for analysing and sometimes solving these problems. Because of the importance and complexity of stability problems, it is valuable to have a battery of tools for solving them, either approximately or exactly.

The solution procedures described in this paper start by using trial solutions to determine a characteristic polynomial for a problem and then writing the stability problem as a quantifier-elimination problem. If the problem concerns asymptotic stability of an initial-value problem (see equation (1.1)), then a Routh–Hurwitz-type procedure is used to eliminate one complex quantified variable ($\lambda$). For ordinary differential or difference equations, this, along with some simplification, solves the problem. For partial differential or difference equations (for these, a conformal map is applied before Routh–Hurwitz), the remaining quantifiers are eliminated using the quantifier elimination by partial cylindrical algebraic decomposition (QEPCAD) algorithm. For stability problems (equation (1.1) with $<$ replaced by $\le$), the Routh–Hurwitz-type criteria do not apply, so QEPCAD must be applied directly, and this makes such problems much more difficult.
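The Routh–Hurwitz-type elimination of the quantified variable $\lambda$ can be sketched concretely. The following Python/SymPy fragment builds the Hurwitz matrix of a characteristic polynomial and returns its leading principal minors; by the classical Routh–Hurwitz criterion, when the leading coefficient is positive, every root satisfies $\Re\lambda < 0$ exactly when all of these minors are positive. This is a minimal sketch of the general idea, not the implementation developed in this paper; the function name and the damped-oscillator example are illustrative choices.

    import sympy as sp

    def hurwitz_minors(char_poly, lam):
        """Leading principal minors of the Hurwitz matrix of char_poly in lam.

        Routh-Hurwitz criterion: with a positive leading coefficient, every
        root of char_poly has negative real part iff all returned minors are
        positive.  This eliminates the quantified root variable lam, leaving
        polynomial conditions in the remaining parameters.
        """
        a = sp.Poly(char_poly, lam).all_coeffs()      # [a0, a1, ..., an]
        n = len(a) - 1
        coeff = lambda k: a[k] if 0 <= k <= n else sp.Integer(0)
        H = sp.Matrix(n, n, lambda i, j: coeff(2*(j + 1) - (i + 1)))
        return [H[:k, :k].det() for k in range(1, n + 1)]

    # Illustrative use: damped oscillator u'' + 2*g*u' + w**2*u = 0, with
    # characteristic polynomial lam**2 + 2*g*lam + w**2.
    lam, g, w = sp.symbols('lambda g omega', real=True)
    print(hurwitz_minors(lam**2 + 2*g*lam + w**2, lam))
    # expected: [2*g, 2*g*w**2], i.e. asymptotic stability iff g > 0 and w**2 > 0

For discrete problems, where the root condition is $|s| < 1$, a standard choice of conformal map (not necessarily the one used in this paper) is the Möbius transform $s = (1 + \lambda)/(1 - \lambda)$, which carries $\Re\lambda < 0$ onto $|s| < 1$; after clearing denominators, the same Hurwitz-type conditions apply, and the remaining transform variables and parameters are left for QEPCAD.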