

Solving Polynomial Equations in Smoothed Polynomial Time

Peter Bürgisser, Technische Universität Berlin

joint work with Felipe Cucker (CityU Hong Kong)

Workshop on Polynomial Solving, Simons Institute for the Theory of Computing, October 16, 2014

Overview

Context and Motivation

- Solving polynomial equations is a fundamental mathematical problem, studied for several hundred years.

- The problem is NP-complete over the field $\mathbb{F}_2$ (equivalent to SAT).

- Traditionally, the problem is studied over $\mathbb{C}$. There, it is NP-complete in the model of Blum-Shub-Smale.

- Methods of symbolic computation (Gröbner bases etc.) solve polynomial equations, but their running time is exponential; they are also slow in practice.

- Numerical methods provide less information on the solutions, but perform much better in practice.

- Theoretical explanation?

Smale’s 17th Problem

- The 17th of Steve Smale's problems for the 21st century asks: Can a zero of $n$ complex polynomial equations in $n$ unknowns be found approximately, on the average, in polynomial time with a uniform algorithm?

- The problem has its origins in the series of papers "Complexity of Bézout's Theorem I-V" by Shub and Smale (1993-1996).

- Beltrán and Pardo (2008) answered Smale's 17th problem affirmatively, when allowing randomized algorithms.

Our Contributions

Near solution to Smale's 17th problem
We design a deterministic numerical algorithm for Smale's 17th problem with expected running time $N^{O(\log\log N)}$, where $N$ denotes the input size.

For systems of bounded degree the expected running time is polynomial, e.g., $O(N^2)$ for quadratic systems.

Smoothed analysis is a blend of average-case and worst-case analysis. It was proposed by Spielman and Teng (2001) and successfully applied to the simplex algorithm.

Smoothed polynomial time
We perform a smoothed analysis of the randomized algorithm of Beltrán and Pardo, proving that its smoothed expected running time is polynomial.

Newton iteration, condition, and homotopy continuation

Setting

- For a degree vector $d = (d_1, \dots, d_n)$ define the input space
$$\mathcal{H}_d := \{\, f = (f_1, \dots, f_n) \mid f_i \in \mathbb{C}[X_0, \dots, X_n] \text{ homogeneous of degree } d_i \,\}.$$
Input size $N := \dim_{\mathbb{C}} \mathcal{H}_d$.

- Output space is complex projective space $\mathbb{P}^n$: look for a zero $\zeta \in \mathbb{P}^n$ with $f(\zeta) = 0$.

- Metric $d$ on $\mathbb{P}^n$ (angle).

- Fix the unitarily invariant Hermitian inner product $\langle\,\cdot\,,\cdot\,\rangle$ on $\mathcal{H}_d$ (Weyl). This defines a norm $\|f\| := \langle f, f\rangle^{1/2}$ and an (angular) distance $d$ on the projective space $\mathbb{P}(\mathcal{H}_d)$, respectively on the sphere $\mathbb{S}(\mathcal{H}_d)$.

- Solution variety (a smooth manifold):
$$V := \{\, (f, \zeta) \mid f(\zeta) = 0 \,\} \subseteq \mathcal{H}_d \times \mathbb{P}^n.$$
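As a concrete illustration of the input size $N$ defined above, here is a minimal sketch (Python, used purely for illustration; the function name is ours, not from the talk): each $f_i$ of degree $d_i$ has $\binom{n+d_i}{n}$ monomial coefficients in the $n+1$ homogeneous variables.

```python
from math import comb

def input_size(n, degrees):
    """N = dim_C H_d: each f_i of degree d_i has C(n + d_i, n) monomials
    in the n + 1 homogeneous variables X_0, ..., X_n."""
    return sum(comb(n + d, n) for d in degrees)

# e.g. three quadratic equations in three unknowns (homogenised to 4 variables):
print(input_size(3, [2, 2, 2]))   # 3 * C(5, 3) = 30
```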


Condition

- Let $f(\zeta) = 0$. How much does $\zeta$ change when we perturb $f$ a little?

- This can be quantified by the condition number of $(f, \zeta)$:

$$\mu(f, \zeta) := \|f\| \cdot \|M^{\dagger}\|,$$
where ($\|\zeta\| = 1$, and $M^{\dagger}$ stands for the pseudo-inverse)
$$M := \operatorname{diag}\big(\sqrt{d_1}, \dots, \sqrt{d_n}\big)^{-1}\, Df(\zeta) \in \mathbb{C}^{n \times (n+1)}.$$

- $\mu$ is well defined on $\mathbb{P}(\mathcal{H}_d) \times \mathbb{P}^n$: $\mu(tf, \zeta) = \mu(f, \zeta)$ for $t \in \mathbb{C}^{*}$.
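The condition number just defined can be evaluated numerically. Below is a minimal sketch, assuming each $f_i$ is represented as a dict mapping exponent tuples $(\alpha_0, \dots, \alpha_n)$ to coefficients; the helper names (`multinomial`, `weyl_norm_sq`, `grad`, `mu`) are ours, not part of the talk.

```python
import numpy as np
from math import factorial

def multinomial(d, alpha):
    """Multinomial coefficient d! / (alpha_0! ... alpha_n!)."""
    out = factorial(d)
    for a in alpha:
        out //= factorial(a)
    return out

def weyl_norm_sq(poly, d):
    """Squared Weyl norm of a degree-d homogeneous polynomial
    given as {exponent tuple: coefficient}."""
    return sum(abs(c) ** 2 / multinomial(d, a) for a, c in poly.items())

def grad(poly, x):
    """Gradient of such a polynomial at the point x (length n + 1)."""
    g = np.zeros(len(x), dtype=complex)
    for a, c in poly.items():
        for j, aj in enumerate(a):
            if aj > 0:
                b = list(a); b[j] -= 1
                g[j] += c * aj * np.prod(x ** np.array(b))
    return g

def mu(system, degrees, zeta):
    """Condition number mu(f, zeta) = ||f|| * ||M^dagger|| with
    M = diag(sqrt(d_i))^{-1} Df(zeta) and the zero normalised to ||zeta|| = 1."""
    zeta = np.asarray(zeta, dtype=complex)
    zeta = zeta / np.linalg.norm(zeta)
    f_norm = np.sqrt(sum(weyl_norm_sq(p, d) for p, d in zip(system, degrees)))
    Df = np.array([grad(p, zeta) for p in system])
    M = np.diag(1.0 / np.sqrt(np.asarray(degrees, dtype=float))) @ Df
    # spectral norm of the pseudo-inverse = 1 / smallest singular value
    return f_norm / np.linalg.svd(M, compute_uv=False)[-1]
```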

Newton iteration and approximate zeros

- Projective Newton iteration
$$x_{k+1} = N_f(x_k)$$
with Newton operator $N_f \colon \mathbb{P}^n \to \mathbb{P}^n$ and starting point $x_0$.

- Gamma Theorem (Smale): Put $D := \max_i d_i$. If
$$d(x_0, \zeta) \le \frac{0.3}{D^{3/2}\, \mu(f, \zeta)},$$
then we have immediate convergence of $x_{k+1} = N_f(x_k)$ with quadratic speed:
$$d(x_k, \zeta) \le \Big(\frac{1}{2}\Big)^{2^k - 1} d(x_0, \zeta).$$
Call $x_0$ an approximate zero of $f$.
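A sketch of one projective Newton step, reusing `grad` from the condition-number sketch above. It uses the Moore-Penrose variant (pseudo-inverse of the full Jacobian followed by renormalisation), which is one common way to realise the operator $N_f$; the slides do not fix a particular implementation.

```python
def eval_poly(poly, x):
    """Evaluate a polynomial given as {exponent tuple: coefficient} at x."""
    return sum(c * np.prod(x ** np.array(a)) for a, c in poly.items())

def newton_step(system, x):
    """One projective Newton step (Moore-Penrose variant):
    x <- x - Df(x)^+ f(x), followed by renormalisation to the sphere."""
    x = np.asarray(x, dtype=complex)
    fx = np.array([eval_poly(p, x) for p in system])
    Df = np.array([grad(p, x) for p in system])   # grad() from the mu sketch
    y = x - np.linalg.pinv(Df) @ fx
    return y / np.linalg.norm(y)
```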

From local to global search: homotopy continuation

- Given a start system
$$(g, \zeta) \in V := \{\, (f, \zeta) \mid f(\zeta) = 0 \,\} \subseteq \mathcal{H}_d \times \mathbb{P}^n$$
in the solution manifold $V$.

- Connect the input $f$ to $g$ by the segment $[g, f] = \{\, q_t \mid t \in [0, 1] \,\} \subseteq \mathcal{H}_d$.

- If none of the $q_t$ has a multiple zero, there exists a unique lifting of $t \mapsto q_t$ to a solution path in $V$,
$$[0, 1] \to V, \qquad t \mapsto (q_t, \zeta_t),$$
such that $(q_0, \zeta_0) = (g, \zeta)$.

Adaptive linear homotopy

- Adaptive Linear Homotopy ALH: follow the solution path numerically. Put $t_0 = 0$, $q_0 := g$, $z_0 := \zeta$.

Compute $t_{i+1}$, $q_{i+1}$, $z_{i+1}$ adaptively from $t_i$, $q_i := q_{t_i}$, $z_i$ by Newton's method:
$$d(q_{i+1}, q_i) = \frac{7.5 \cdot 10^{-3}}{D^{3/2}\, \mu(q_i, z_i)^2}, \qquad z_{i+1} = N_{q_{i+1}}(z_i).$$

- Let $K(f, g, \zeta)$ denote the number $k$ of Newton continuation steps needed to follow the homotopy.

- Shub-Smale & Shub (2007): $z_i$ is an approximate zero of $q_{t_i}$ (with associated zero $\zeta_{t_i}$) and
$$K(f, g, \zeta) \le 217\, D^{3/2} \int_0^1 \mu(q_t, \zeta_t)^2\, \|\dot q_t\|\, \mathrm{d}t.$$
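Putting the pieces together, here is a sketch of ALH in the same dict representation as the earlier sketches. The step-size rule is the one from this slide; converting that prescribed distance between consecutive systems into an increment of the homotopy parameter via the coefficient-space length of the segment is a simplification on our part (the actual algorithm measures angular distance between systems).

```python
def segment(g, f, tau):
    """q_tau = (1 - tau) g + tau f, coefficient-wise on the dict representation."""
    out = []
    for gi, fi in zip(g, f):
        keys = set(gi) | set(fi)
        out.append({a: (1 - tau) * gi.get(a, 0) + tau * fi.get(a, 0) for a in keys})
    return out

def alh(f, g, zeta, degrees, max_steps=100_000):
    """Adaptive linear homotopy (sketch): advance along q_tau = (1 - tau) g + tau f
    with step size 7.5e-3 / (D^{3/2} mu(q_i, z_i)^2), one Newton step per node."""
    D = max(degrees)
    seg_len = np.sqrt(sum(abs(fi.get(a, 0) - gi.get(a, 0)) ** 2
                          for fi, gi in zip(f, g) for a in set(fi) | set(gi)))
    tau = 0.0
    z = np.asarray(zeta, dtype=complex)
    z = z / np.linalg.norm(z)
    for k in range(max_steps):
        q = segment(g, f, tau)
        if tau >= 1.0:
            return z, k                        # z should be an approximate zero of f
        step = 7.5e-3 / (D ** 1.5 * mu(q, degrees, z) ** 2)
        # crude conversion of the prescribed distance into a parameter increment
        tau = min(1.0, tau + step / max(seg_len, 1e-12))
        z = newton_step(segment(g, f, tau), z)
    raise RuntimeError("continuation step budget exhausted")
```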

Randomization

- How to choose the start system?

- Almost all $(g, \zeta) \in V$ are "good": $\mu(g, \zeta) = N^{O(1)}$ (Shub-Smale).

- Unknown how to efficiently construct such $(g, \zeta)$: "problem to find hay in a haystack."

- We may choose $g \in \mathbb{S}(\mathcal{H}_d)$ uniformly at random.

- Alternatively, we may choose $g$ according to the standard Gaussian distribution on $\mathcal{H}_d$; it has the density
$$\rho(g) = (2\pi)^{-N} \exp\big(-\tfrac{1}{2}\|g\|^2\big).$$
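A sketch of sampling such a Gaussian $g$. It uses the fact that the rescaled monomials $\sqrt{\binom{d}{\alpha}}\, X^{\alpha}$ are orthonormal for the Weyl inner product, so independent standard normal real and imaginary parts in that basis match the density above; `multinomial` is the helper from the condition-number sketch, and the function names are ours.

```python
from itertools import combinations_with_replacement

def exponents(n, d):
    """All exponent tuples alpha with alpha_0 + ... + alpha_n = d."""
    for combo in combinations_with_replacement(range(n + 1), d):
        a = [0] * (n + 1)
        for j in combo:
            a[j] += 1
        yield tuple(a)

def gaussian_system(n, degrees, rng=None):
    """Standard Gaussian on H_d w.r.t. the Weyl inner product: in the
    orthonormal Weyl basis each coordinate has i.i.d. N(0, 1) real and
    imaginary parts; in the monomial basis the coefficient of X^alpha then
    has real/imaginary parts with standard deviation sqrt(multinomial(d; alpha))."""
    rng = rng or np.random.default_rng()
    system = []
    for d in degrees:
        coeffs = {}
        for a in exponents(n, d):
            scale = np.sqrt(multinomial(d, a))
            coeffs[a] = scale * (rng.standard_normal() + 1j * rng.standard_normal())
        system.append(coeffs)
    return system
```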

A Las Vegas algorithm

- Standard distribution on the solution variety $V$:
  - choose $g \in \mathcal{H}_d$ from the standard Gaussian,
  - choose one of the $d_1 \cdots d_n$ many zeros $\zeta$ of $g$ uniformly at random.

- Efficient sampling of $(g, \zeta) \in V$ is possible (Beltrán & Pardo 2008).

- Las Vegas algorithm LV: on input $f$, draw $(g, \zeta) \in V$ at random, run ALH on $(f, g, \zeta)$.

- LV has expected "running time" $K(f) := \mathbb{E}_{g,\zeta}\, K(f, g, \zeta)$.

Average of LV (Beltrán and Pardo)
$$\mathbb{E}_f\, K(f) = O\big(D^{3/2} N n\big) \qquad \text{for standard Gaussian } f \in \mathcal{H}_d.$$
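In code, LV is just the composition of a start-pair sampler and ALH. The routine `bp_sample` below is a hypothetical placeholder for the Beltrán-Pardo sampling procedure, which is not spelled out on these slides.

```python
def lv(f, degrees, n, bp_sample):
    """Las Vegas algorithm LV (sketch): draw (g, zeta) from the standard
    distribution on V and run ALH towards f.  `bp_sample` is a hypothetical
    stand-in for Beltran-Pardo sampling: it should return a Gaussian g
    together with a uniformly chosen zero zeta of g."""
    g, zeta = bp_sample(n, degrees)
    return alh(f, g, zeta, degrees)
```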

Results

Smoothed expected polynomial time

Smoothed analysis: Fix $\bar f \in \mathcal{H}_d$ and $\sigma > 0$. The isotropic Gaussian on $\mathcal{H}_d$ with mean $\bar f$ and variance $\sigma^2$ has the density
$$\rho(f) = \frac{1}{(2\pi\sigma^2)^{N}} \exp\Big(-\frac{1}{2\sigma^2}\,\|f - \bar f\|^2\Big).$$
We write $f \sim N(\bar f, \sigma^2 I)$.
Technical issue: we truncate this Gaussian by requiring $\|f - \bar f\| \le \sigma\sqrt{2N}$, obtaining the distribution $N_T(\bar f, \sigma^2 I)$.
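Sampling from this truncated Gaussian can be sketched by rejection, working on a coordinate vector of $f$ in an orthonormal (Weyl) basis; since the squared norm of a standard complex Gaussian in $\mathbb{C}^N$ concentrates around $2N$, roughly half of the draws are accepted. The function name and flat-vector representation are assumptions for illustration.

```python
def truncated_gaussian(f_bar_flat, sigma, rng=None):
    """Sample from N_T(f_bar, sigma^2 I): draw f = f_bar + sigma * noise with
    standard complex Gaussian noise and reject until ||f - f_bar|| <= sigma * sqrt(2N).
    Operates on a flat coordinate vector of f_bar in an orthonormal basis."""
    rng = rng or np.random.default_rng()
    f_bar = np.asarray(f_bar_flat, dtype=complex)
    N = f_bar.size
    while True:
        noise = rng.standard_normal(N) + 1j * rng.standard_normal(N)
        if np.linalg.norm(noise) <= np.sqrt(2 * N):
            return f_bar + sigma * noise
```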

Smoothed analysis of LV
$$\sup_{\|\bar f\| = 1} \mathbb{E}_{f \sim N_T(\bar f, \sigma^2 I)}\, K(f) = O\Big(\frac{D^{3/2} N n}{\sigma}\Big).$$

Near solution to Smale’s 17th problem

The deterministic algorithm below computes an approximate zero of $f \in \mathcal{H}_d$ with an expected number of arithmetic operations $N^{O(\log\log N)}$, for standard Gaussian input $f \in \mathcal{H}_d$.

- (I) $D \le n$: Run ALH with the start system $(g, \zeta)$, where
$$g_i = X_i^{d_i} - X_0^{d_i}, \qquad \zeta = (1, \dots, 1), \qquad \mu(g, \zeta)^2 \le 2(n+1)^{D}.$$

- (II) $D \ge n$: Use a known method from computer algebra (Renegar), taking roughly $D^n$ steps.

1 " D If D n ,forfixed">0, then n and hence the running time is polynomially bounded in N.SimilarlyforD n1+". Solving Polynomial Equations in Smoothed Polynomial Time Results

On the proof

- Reduce to a smoothed analysis of the mean condition number, defined as

$$\mu_2(q) := \Big(\frac{1}{d_1 \cdots d_n} \sum_{\zeta\,:\, q(\zeta) = 0} \mu(q, \zeta)^2\Big)^{1/2} \qquad \text{for } q \in \mathcal{H}_d.$$

- Main auxiliary result:

$$\sup_{\|\bar q\| = 1} \mathbb{E}_{q \sim N(\bar q, \sigma^2 I)}\Big(\frac{\mu_2(q)^2}{\|q\|^2}\Big) = O\Big(\frac{n}{\sigma^2}\Big).$$

- The proof is involved and proceeds by the analysis of certain probability distributions on fiber bundles (coarea formula etc.).

- This way, the proof essentially reduces to a smoothed analysis of a condition number.