
Geometric Analysis of Constrained Curves for Image Understanding

Anuj Srivastava (Department of Statistics, [email protected]), Washington Mio (Department of Mathematics, [email protected]), Eric Klassen (Department of Mathematics, [email protected]), Xiuwen Liu (Department of Computer Science, [email protected])
Florida State University, Tallahassee, FL 32306, USA

Abstract

We offer an overview of a novel, comprehensive approach to the computational differential geometry of curves, with applications to shape analysis and discovery/recognition of objects in images. The main idea is to specify a space of curves with constraints suited to the application, and exploit the differential geometry of this space to solve optimization and inference problems. Applications of this approach include the statistical analysis of planar shapes, curve interpolations with elasticae, and the Bayesian discovery of contours of objects in noisy images.

1 Introduction

An important goal in image understanding is to detect, track and label objects of interest present in observed images. Imaged objects can be characterized in many ways: according to their colors, textures, shapes, movements, and locations. The past decade has seen significant advances in the modeling and analysis of pixel values or textures to characterize objects in images, albeit with limited success. On the other hand, planar curves that represent contours of objects have been studied independently for a long time. An emerging opinion in the vision community is that global features such as shapes of contours should also be taken into account for the successful detection and recognition of objects. A common approach to analyzing curves in images is to treat them as level sets of functions, and algorithms involving such active contours are governed usually by partial differential equations (PDEs) driven by appropriate data terms and smoothness penalties (see for example [12]). Regularized curve evolutions and region-based active contours offer alternatives in similar frameworks. This remarkable body of work contains various studies of curve evolution, each with relative strengths and drawbacks. Recent work of Charpiat et al. [1] provides another viewpoint on shape analysis using a different formulation.

In this paper, we present a novel framework for the algorithmic study of curves, their variations and statistics. In this approach, a fundamental element is a space of constrained curves, called SCC henceforth, with constraints appropriate to the application of interest. (As an example, we may be interested in analyzing closed curves and this provides a constraint.) We exploit the geometry of SCCs using elements such as tangents, normals, geodesics and gradient flows, to solve optimization and statistical inference problems for a variety of cost functions and probability densities. This framework differs from those employed in previous works on "geometry-driven flows" [10] in the sense that here both the geometry of the curves and the geometry of SCCs are utilized. The dynamics of active contours is described by vector fields on SCCs, thus reducing the evolution of curves to the study of ordinary differential equations (ODEs) on SCCs. It is important to emphasize that a SCC is usually a non-linear, infinite-dimensional manifold, and its elements are the individual curves of interest. Several interesting applications can be addressed in this formulation:

1. Efficient deformations between any two curves are generated by geodesic paths connecting the elements they represent in the SCC. Geodesic lengths also provide a natural metric for clustering curves (automated learning) according to their shapes.

2. The discovery of planar shapes in given noisy images can be treated as the problem of finding maximum a-posteriori points in SCCs.

3. Given a set of curves (or shapes), one can define the concepts of mean and covariance using geodesic paths, and thus develop statistical frameworks for studying shapes. Furthermore, one can define probabilities on a SCC to perform curve (or shape) classification via hypothesis testing.

4. Tracking time-varying curves or dynamic shapes can be studied as a problem of Bayesian nonlinear filtering on SCCs.
5. Completions of partially occluded contours using elasticae (or local elasticae) can be accomplished via gradient-based optimizations on a SCC.

Many of these problems have been studied in the past with elegant solutions presented in the literature (examples include [11, 13, 8, 2, 6]), but using significantly different ideas. We demonstrate the strength of the proposed framework by addressing the above-mentioned applications in a comprehensive and unified manner. This framework relates closely to the ideas presented in [15].

Given past achievements in PDE-based approaches to curve evolution, what is the need for newer frameworks? The study of the structure of SCCs provides new insights and solutions to problems involving dynamic contours and problems in quantitative shape analysis. Once the constraints are built into the definitions of SCCs, the resulting solutions automatically satisfy these constraints. It also complements existing methods of image processing and analysis well by providing new computational efficiencies. The main strength of this approach is its exploitation of the differential geometry of SCCs. For instance, a geodesic or gradient flow X_t of an energy function E (on a SCC) can be generated as the solution of an ODE of the type

    dX_t/dt = Π(∇E(X_t)) ,    (1)

where Π denotes an appropriate projection onto the tangent bundle of that SCC. This is in contrast with the nonlinear PDE-based curve evolutions of past works. The geometry of SCCs also enables us to derive statistical elements: probability measures, means and covariances on SCCs; these quantities have rarely been treated in previous studies. In shape extraction, the main focus in past works has been on solving PDEs driven by image features under smoothness constraints, and not on the statistical analysis of shapes of curves. The use of geodesic paths, or piecewise geodesic paths, has also seen only limited use in the past.
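To make the role of Eqn. 1 concrete, the following sketch (our own illustration, not the authors' implementation) integrates a projected gradient flow with explicit Euler steps; the energy gradient `grad_E`, the tangent projection `to_tangent`, and the retraction `to_manifold` are hypothetical placeholders for whatever the chosen SCC supplies, and the toy example uses a finite-dimensional sphere rather than a curve space.

```python
import numpy as np

def flow_on_scc(x0, grad_E, to_tangent, to_manifold, step=0.01, n_steps=500):
    """Euler integration of dX/dt = -Pi(grad E(X)) on a space of constrained curves.

    to_tangent(x, v)  projects v onto the tangent space at x (the map Pi of Eqn. 1);
    to_manifold(x)    re-imposes the constraints after each Euler step (cf. Section 3.1).
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_steps):
        v = to_tangent(x, grad_E(x))      # tangential component of the gradient
        x = to_manifold(x - step * v)     # Euler step, then project back onto the SCC
    return x

# Toy check on a finite-dimensional "SCC": the unit sphere in R^3 with E(x) = x[2].
if __name__ == "__main__":
    grad_E = lambda x: np.array([0.0, 0.0, 1.0])
    to_tangent = lambda x, v: v - np.dot(v, x) * x      # x is kept at unit norm
    to_manifold = lambda x: x / np.linalg.norm(x)
    x = flow_on_scc(np.array([0.6, 0.0, 0.8]), grad_E, to_tangent, to_manifold)
    print(x)   # approaches the minimizer (0, 0, -1) of the height function
```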
We should also point out the main limitations of the proposed framework. One drawback is that curve evolutions cannot handle certain changes in topology, which is one of the key features of level-set methods; a SCC is purposely set up not to allow curves to branch into several components. Secondly, this idea does not extend easily to the analysis of surfaces in R^3. Despite these limitations, the proposed methodology provides powerful tools for the analysis of planar curves, as demonstrated by the examples presented later. Moreover, even in applications where branching appears to be essential, the proposed methods may be applicable with additional developments.

This paper is laid out as follows: Section 2 studies geometric representations of constrained curves with two examples of SCCs. A brief geometric analysis of these spaces is presented in Section 3. Section 4 provides an example of statistical analysis on a SCC, while Section 5 presents two applications of this framework: (i) completion of partially occluded objects in images, and (ii) discovery of shapes in noisy images using a Bayesian framework.

2 Geometric Representations of Constrained Curves

In this section we provide two examples, C_1 and C_2, of SCCs on which optimization and inference problems will be solved later in the paper. In this paper we restrict mostly to curves in R^2, although curves in R^3 can be handled similarly. Let α : R → R^2 denote the coordinate function of a curve parameterized by arc length, i.e., satisfying ‖α̇(s)‖ = 1 for every s. A direction function θ(s) is a function satisfying α̇(s) = e^{jθ(s)}, where j = √−1. θ captures the angle made by the velocity vector with the x-axis, and is defined up to the addition of integer multiples of 2π. The curvature function κ(s) = θ̇(s) can also be used to represent a curve.

The choice of representation of curves will depend on the specific application. Furthermore, different constraints imposed on curves lead to different SCCs. Two examples suitable for the applications considered in this paper are discussed next.

1. A Space of Closed Curves with Fixed Length: Consider the problem of studying shapes of contours or silhouettes of imaged objects as closed, planar curves in R^2, parameterized by arc length. Since shapes are invariant to rigid motions (rotations and translations) and uniform scaling, a shape representation should be insensitive to these transformations. Scaling can be resolved by fixing the length of α to be 2π, and translations by representing curves via their direction functions. Thus, we consider the space L^2 of all square-integrable functions θ : [0, 2π] → R, with the usual inner product ⟨f, g⟩ = ∫_0^{2π} f(s)g(s) ds. To account for rotations and ambiguities in the choice of θ, we restrict direction functions to those having a fixed average, say, π. For α to be closed, it must satisfy the closure condition ∫_0^{2π} e^{jθ(s)} ds = 0. Thus, we represent curves by direction functions satisfying the average-π and closure conditions; we call this space of direction functions D_1. Summarizing, D_1 is the subspace of L^2 consisting of all (direction) functions satisfying the constraints

    (1/2π) ∫_0^{2π} θ(s) ds = π ;  ∫_0^{2π} cos(θ(s)) ds = 0 ;  ∫_0^{2π} sin(θ(s)) ds = 0 .    (2)

(A numerical check of these constraints appears in the sketch following these two examples.) It is still possible to have multiple elements of D_1 representing the same shape. This variability is due to the choice of the reference point (s = 0) along the curve. For x ∈ S^1 and θ ∈ D_1, define (x · θ) as a curve whose initial point (s = 0) is changed by a distance of x along the curve. We term this a re-parametrization of the curve. To remove the variability due to this re-parametrization group, define the quotient space C_1 ≡ D_1/S^1 as the space of continuous, planar shapes.

2. Space of Curves with Specified Boundary Conditions: As another example, we consider curves α : I → R^2 satisfying given boundary conditions to first order, where I = [0, 1]. Given points p_0, p_1 ∈ R^2 with ‖p_1 − p_0‖ < 1 and angles θ_0, θ_1 ∈ R, we are interested in curves α that admit angle functions θ satisfying α(i) = p_i and θ(i) = θ_i, for i = 0, 1. Curves with α(0) = p_0 are determined by their direction functions via the expression α(s) = p_0 + ∫_0^s e^{jθ(u)} du. For these curves, the conditions above can be rephrased as

    θ(0) = θ_0 ,  θ(1) = θ_1 ,  and  ∫_0^1 e^{jθ(s)} ds = d ,

where d = p_1 − p_0 is the total displacement of α. This last condition ensures that the end point of the curve α is p_1. We consider the vector space H^1 of all absolutely continuous functions θ : I → R with square-integrable derivative, equipped with the inner product ⟨f, g⟩ = f(0)g(0) + ∫_0^1 ḟ(s)ġ(s) ds. Here, we use the space H^1 instead of L^2 because we wish to be able to control the values of θ at the end points. The space C_2 consists of all functions in H^1 satisfying the three conditions above.
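As promised above, here is a small numerical check (ours, assuming a uniform discretization of [0, 2π] and plain Riemann-sum quadrature) that the direction function of a unit circle satisfies the constraints of Eqn. 2, while an open arc violates the closure conditions.

```python
import numpy as np

def d1_constraints(theta, ds):
    """Return the three constraint values of Eqn. 2 for a sampled direction function."""
    mean = np.sum(theta * ds) / (2.0 * np.pi)          # should equal pi
    close_x = np.sum(np.cos(theta) * ds)               # should equal 0
    close_y = np.sum(np.sin(theta) * ds)               # should equal 0
    return mean, close_x, close_y

if __name__ == "__main__":
    n = 512
    s = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    ds = 2.0 * np.pi / n
    theta_circle = s                           # unit circle traversed once: theta(s) = s
    print(d1_constraints(theta_circle, ds))    # approximately (pi, 0, 0): an element of D_1
    theta_arc = 0.3 * s                        # an open arc: the closure integrals are nonzero
    print(d1_constraints(theta_arc, ds))
```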
3 Geometries of SCCs

The main idea in the proposed framework is to use the geometric structure of SCCs to solve optimization and statistical inference problems on these spaces. This approach often leads to simple formulations of these problems and to more efficient vision algorithms. Thus, we must study issues related to the geometry and topology of those SCCs. In this paper we restrict to the tangent and normal bundles, and geodesic flows on these spaces.

3.1 Tangents and Normals to SCCs

There are two main reasons for studying the tangential and normal structures: (i) to compute the gradient of the restriction of a functional on L^2 (H^1, resp.) to a SCC, we can first compute the gradient on the linear space L^2 (H^1, resp.) in which the SCC is contained (usually, a simpler task), and then subtract the normal components to obtain the component that is tangent to the SCC; (ii) we wish to employ iterative numerical methods in the simulation of geodesic and gradient flows; at each step in the iteration, we first flow in the linear space L^2 (H^1, resp.) using standard methods, and then project the new point back onto the SCC using our knowledge of the normal structure, as discussed in Section 3.2. The tangent spaces on these two SCCs are described next.

1. Case 1: For technical reasons, it is convenient to reduce optimization and inference problems on C_1 to problems on the space D_1, so we study the latter. It is difficult to specify the tangent spaces to D_1 directly, because they are infinite-dimensional. When working with finitely many constraints, as is the case here, it is easier to describe the space of normals to D_1 in L^2 instead. It can be shown that a vector f ∈ L^2 is tangent to D_1 at θ if and only if f is orthogonal to the subspace spanned by {1, sin θ, cos θ}. Hence, these three functions span the normal space to D_1 at θ. Implicitly, the tangent space is given as:

    T_θ(D_1) = {f ∈ L^2 | f ⊥ span{1, cos θ, sin θ}} .

Thus, the projection Π in Eqn. 1 can be specified by subtracting from a function (in L^2) its projection onto the space spanned by these three elements. (A discrete version of this projection is sketched after these two cases.)

2. Case 2: Similar to Case 1, one can specify the four-dimensional space of normals to C_2 inside H^1 as the space spanned by {1, s, ε_1, ε_2}, where ε_1, ε_2 : I → R are characterized by ε̈_1 = cos θ, ε_1(0) = ε̇_1(0) = 0, and ε̈_2 = sin θ, ε_2(0) = ε̇_2(0) = 0 [7]. Thus,

    T_θ(C_2) = {f ∈ L^2 | f ⊥ span{1, s, ε_1, ε_2}} .
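The projection in Case 1 is straightforward to realize numerically. The sketch below (a minimal discrete version of our own, with L^2 inner products approximated by sums over uniform samples) orthonormalizes {1, cos θ, sin θ} and removes their span from a perturbation f; Case 2 is handled analogously with {1, s, ε_1, ε_2}.

```python
import numpy as np

def project_to_tangent_D1(theta, f, ds):
    """Project f onto T_theta(D_1) by removing components along span{1, cos theta, sin theta}."""
    basis = [np.ones_like(theta), np.cos(theta), np.sin(theta)]
    ortho = []
    for b in basis:                                   # Gram-Schmidt on the normal directions
        for q in ortho:
            b = b - np.sum(b * q) * ds * q
        b = b / np.sqrt(np.sum(b * b) * ds)
        ortho.append(b)
    g = f.copy()
    for q in ortho:
        g = g - np.sum(g * q) * ds * q                # subtract the normal component
    return g

if __name__ == "__main__":
    n = 256
    s = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    ds = 2.0 * np.pi / n
    theta = s                                         # direction function of a unit circle
    f = np.random.default_rng(0).normal(size=n)       # an arbitrary perturbation in L^2
    g = project_to_tangent_D1(theta, f, ds)
    # g should now be (numerically) orthogonal to 1, cos(theta) and sin(theta).
    print([np.sum(g * b) * ds for b in (np.ones(n), np.cos(theta), np.sin(theta))])
```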
3.2 Geodesics Connecting Closed Curves

We first describe the computation of geodesics (or, one-parameter flows) in D_1 with prescribed initial conditions. The intricate geometry of D_1 disallows explicit analytic expressions. Therefore, we adopt an iterative strategy, where in each step we first flow infinitesimally in the prescribed tangent direction in the space L^2, and then project the end point of the path to D_1. Next, we parallel transport the velocity vector to the new point by projecting the previous velocity orthogonally onto the tangent space of D_1 at the new point. Again, this is done by subtracting normal components. The simplest implementation is to use Euler's method in L^2, i.e., to move in each step along short straight line segments in L^2 in the prescribed direction, and then project the path back onto D_1. Details of this numerical construction of geodesics are provided in [5].

A one-parameter flow can be specified by an initial condition θ ∈ D_1 and a direction f ∈ T_θ(D_1), the space of all tangent directions at θ. We will denote the corresponding flow by Ψ(θ, t, f), where t is the time parameter. The technique just described allows us to compute Ψ numerically.

Next, we focus on the problem of finding a geodesic path between any two given shapes θ_1, θ_2 ∈ D_1. The only remaining issue is to find the appropriate direction f ∈ T_{θ_1}(D_1) such that a geodesic from θ_1 in that direction passes through θ_2 at time t = 1. In other words, the problem is to solve for an f ∈ T_{θ_1}(D_1) such that Ψ(θ_1, 0, f) = θ_1 and Ψ(θ_1, 1, f) = θ_2. One can treat the search for this direction as an optimization problem over the tangent space T_{θ_1}(D_1). The cost to be minimized is given by the functional H[f] = ‖Ψ(θ_1, 1, f) − θ_2‖^2, and we are looking for that f ∈ T_{θ_1}(D_1) for which: (i) H[f] is zero, and (ii) ‖f‖ is minimum among all such tangents. Since the space T_{θ_1}(D_1) is infinite-dimensional, this optimization is not straightforward. However, since f ∈ L^2, it has a Fourier decomposition, and we can solve the optimization problem over a finite number of Fourier coefficients. For any two shapes θ_1, θ_2 ∈ D_1, we have used a shooting method to find the optimal f [5]. The basic idea is to choose an initial direction f specified by its Fourier coefficients and then use a gradient search to minimize H as a function of the Fourier coefficients.

Finally, to find the shortest path between two shapes in C_1, we compute the shortest geodesic connecting representatives of the given shapes in D_1. This is a simple numerical problem, because C_1 is the quotient of D_1 by the 1-dimensional re-parametrization group S^1.

Shown in Figure 1 are two examples of geodesic paths in C_1 connecting given shapes. Drawn in between are shapes corresponding to equally spaced points along the geodesic paths. Similar ideas can be used to find geodesics on the space C_2.

4 Statistical Models on SCCs

Using the case of C_1, we illustrate statistical modeling of the constrained curves. Algorithms for finding geodesic paths on SCCs allow us to compute means and covariances in these spaces. We adopt a notion of mean known as the intrinsic mean or the Karcher mean ([4]) that is quite natural in our geometric framework. Let d(·, ·) be the shortest-path metric on C_1. To calculate the Karcher mean of shapes {θ_1, ..., θ_n} in C_1, define a function V : C_1 → R by V(θ) = Σ_{i=1}^n d(θ, θ_i)^2. Then, define the Karcher mean of the given shapes to be any point µ ∈ C_1 for which V(µ) is a local minimum. In the case of Euclidean spaces this definition agrees with the usual definition µ = (1/n) Σ_{i=1}^n p_i. Since C_1 is complete, the intrinsic mean as defined above always exists. However, there may be collections of shapes for which µ is not unique.

We now review an iterative algorithm given in [6] for finding a Karcher mean of given shapes.

Algorithm 1: Set k = 0. Choose some time increment ε ≤ 1/n. Choose a point µ_0 ∈ C_1 as an initial guess of the mean. (For example, one could just take µ_0 = θ_1.)

1. For each i = 1, ..., n choose the tangent vector f_i ∈ T_{µ_k}(C_1) which is tangent to the shortest geodesic from µ_k to θ_i, and whose norm is equal to the length of this shortest geodesic. The vector g = Σ_{i=1}^n f_i is equal to (−2) times the gradient at µ_k of the function V : C_1 → R which we defined above.

2. Flow for time ε along the geodesic which starts at µ_k and has velocity vector g. Call the point where you end up µ_{k+1}, i.e., µ_{k+1} = Ψ(µ_k, ε, g).

3. If not converged, set k = k + 1, and go to Step 1.

A similar algorithm and convergence results for a (finite-dimensional) landmark-based representation of shapes are described in [6].
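Algorithm 1 is compact once geodesics can be computed. The sketch below is our own rendering in which `log_map(mu, theta)` (the initial velocity f_i of the shortest geodesic from µ to θ_i) and `exp_map(mu, v, t)` (the flow Ψ) are assumed to be supplied by the numerical geodesic construction of Section 3.2; these function names are ours, not the paper's, and the Euclidean stand-ins at the bottom only sanity-check the iteration.

```python
import numpy as np

def karcher_mean(shapes, log_map, exp_map, eps, max_iter=100, tol=1e-6):
    """Iterative Karcher mean (Algorithm 1).

    shapes  : list of points on the shape space
    log_map : log_map(mu, theta) -> tangent vector at mu toward theta,
              with norm equal to the geodesic distance d(mu, theta)
    exp_map : exp_map(mu, v, t)  -> point reached by flowing from mu along the
              geodesic with initial velocity v for time t (the flow Psi)
    eps     : time increment of Algorithm 1 (eps <= 1/n)
    """
    mu = np.asarray(shapes[0], dtype=float).copy()    # initial guess mu_0 = theta_1
    for _ in range(max_iter):
        g = sum(log_map(mu, th) for th in shapes)     # (-2) x gradient of V at mu
        if np.linalg.norm(g) < tol:                   # converged
            break
        mu = exp_map(mu, g, eps)                      # mu_{k+1} = Psi(mu_k, eps, g)
    return mu

# Sanity check in the Euclidean case, where the Karcher mean is the arithmetic mean.
if __name__ == "__main__":
    pts = [np.array([0.0, 0.0]), np.array([2.0, 0.0]), np.array([1.0, 3.0])]
    log_map = lambda mu, th: th - mu
    exp_map = lambda mu, v, t: mu + t * v
    print(karcher_mean(pts, log_map, exp_map, eps=0.1))   # approximately (1.0, 1.0)
```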
For the mean µ, let T_µ(C_1) ⊂ L^2 be the space of all tangents to the shape space at µ. Let a tangent element f ∈ T_µ(C_1) be represented by its Fourier expansion:

    f(s) ≈ Σ_{k=1}^m (a_k cos(ks) + b_k sin(ks))

for a large positive integer m. Using the identification f ≈ a = {a_k, b_k} ∈ R^{2m−1}, one can define a probability distribution on the tangent vectors in an approximate fashion. We will model a as multivariate normal with mean 0 and covariance K ∈ R^{(2m−1)×(2m−1)}. We will term σ_0^2 = trace(K) the dispersion of a shape model. Estimation of K from observed shapes follows the standard procedure. One can easily sample tangent vectors from this multivariate normal, and use the geodesic calculation to generate sample shapes. Shown in Figure 2 is an illustration of this idea. The left nine panels show the actual observed shapes from which the mean and covariance are calculated. The mean is shown in the middle image and the nine random shapes generated under the Gaussian model are shown in the right panels.
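Sampling from this shape model reduces to drawing Fourier coefficients and shooting one geodesic. The sketch below is our own illustration: `exp_map` again stands in for the geodesic flow Ψ of Section 3.2 (here replaced by a trivial placeholder), and for simplicity we use 2m coefficients rather than the truncation used in the paper.

```python
import numpy as np

def sample_shape(mu, K, exp_map, s, rng):
    """Draw one random shape: a ~ N(0, K) on Fourier coefficients, then shoot a geodesic."""
    dim = K.shape[0]
    m = dim // 2
    a = rng.multivariate_normal(np.zeros(dim), K)
    # Build the tangent function f(s) = sum_k a_k cos(ks) + b_k sin(ks).
    f = np.zeros_like(s)
    for k in range(1, m + 1):
        f += a[2 * (k - 1)] * np.cos(k * s) + a[2 * k - 1] * np.sin(k * s)
    return exp_map(mu, f, 1.0)            # Psi(mu, 1, f)

if __name__ == "__main__":
    n, m = 256, 5
    s = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    mu = s                                          # placeholder mean shape (unit circle)
    K = np.diag(0.05 / np.arange(1, 2 * m + 1))     # toy covariance, decaying with frequency
    exp_map = lambda mu, f, t: mu + t * f           # placeholder for the true geodesic flow
    rng = np.random.default_rng(1)
    samples = [sample_shape(mu, K, exp_map, s, rng) for _ in range(9)]
    print(len(samples), samples[0].shape)
```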

5 Applications

In this section, we describe some applications that involve the solution of optimization or inference problems on SCCs with geometric tools.
Figure 1: Examples of evolving one shape into another via a geodesic path. The leftmost shape is θ_1, the rightmost is θ_2, and the intermediate shapes are equispaced points along the geodesic.

1.2 1 1 1

0.8 0.8 0.8 1 0.6 0.6 0.6 0.8 0.8 0.4 0.8 0.4 0.4 0.6 0.6 0.2 0.6

0.2 0.2 0.4 0.4 0 0.4

0 0 0.2 0.2 −0.2 0.2

−0.2 0 0 −0.2 −0.4 0

−0.2 −0.4 −0.6 1 −0.2 −0.2 −0.4 −0.6 −0.4 −0.2 0 0.2 0.4 0.6 0.8 1 1.2 1.4 −0.2 0 0.2 0.4 0.6 0.8 1 1.2 1.4 −1 −0.5 0 0.5 1 0.8 −0.4 −0.4 −0.4

−0.8 −0.6 −0.4 −0.2 0 0.2 0.4 0.6 0.8 1 1.2 −1 −0.5 0 0.5 1 −0.8 −0.6 −0.4 −0.2 0 0.2 0.4 0.6 0.8 1 1.2 1 0.6 1 1 0.4 1 0.8 0.8 0.8 0.8 0.8 0.2 0.8 0.6 0.6 0.6 0.6 0.6 0 0.6 0.4 0.4 0.4 0.4 0.4 −0.2 0.4 0.2 0.2 0.2 0.2 0.2 −0.4 0.2 0 0 0 −1 −0.5 0 0.5 1 0 0 0 −0.2 −0.2 −0.2 −0.2 −0.2 −0.2 −0.4 −0.4 −0.4 −0.4 −0.4 −0.6 −0.4 −0.6 −0.6 −0.8 −0.6 −0.4 −0.2 0 0.2 0.4 0.6 0.8 1 1.2 −1 −0.5 0 0.5 1 −0.8 −0.6 −0.4 −0.2 0 0.2 0.4 0.6 0.8 1 1.2 −0.8 −1 −0.8 −0.6 −0.4 −0.2 0 0.2 0.4 0.6 0.8 1 −1 −0.5 0 0.5 1 −1 −0.5 0 0.5 1 1 1

0.8 0.8 0.8 1 1 1 0.6 0.6 0.6 0.8 0.8 0.4 0.4 0.6 0.4 0.6 0.2 0.4 0.2 0.2 0.5 0.4

0 0.2 0.2 0 0

0 −0.2 0 −0.2 −0.2

−0.2 0 −0.2 −0.4 −0.4 −0.4 −1 −0.5 0 0.5 1 −1 −0.5 0 0.5 1 −0.8 −0.6 −0.4 −0.2 0 0.2 0.4 0.6 0.8 1 1.2 −0.4 −0.4

−0.6 −0.6

−0.5 −0.8 −0.8

−0.8 −0.6 −0.4 −0.2 0 0.2 0.4 0.6 0.8 1 1.2 −1 −0.5 0 0.5 1 −1 −0.5 0 0.5 1

Figure 2: Learning shape models: for the nine observed shark shapes shown on the left, the middle panel shows the mean shape, and the right panels show nine random samples from the Gaussian model.

5.1 Completion of Curves Using Constrained Elastic Curves

In the problem of recognizing objects in given images, the extraction and use of edges present in the images play an important role. If the objects of interest are partially obscured by other objects, an important task is to interpolate between the observed edges to complete contours. The geometry near the end points of the observed edges provides the boundary points p_0, p_1 and the boundary angles θ_0, θ_1. We complete the missing edge using an elastica satisfying these boundary conditions. This problem has been studied by several other researchers as well; see for example [13, 8]. For simplicity, we only consider the case of interpolations with elasticae with a given fixed length, normalized to be 1.

1. Elasticae for Curve Completion: In our notation, the elastic energy E of θ : I → R can be expressed as

    E(θ) = (1/2) ∫_0^1 θ̇(s)^2 ds .    (3)

We are interested in finding the critical points of E restricted to C_2 using a gradient search method. Curves represented by these critical points are known as elasticae. Let θ^*(s) = θ(s) − θ_0, and let f : I → R be the function obtained by projecting θ^* onto the tangent space of C_2 at θ. Then ∇_{C_2}E = f, i.e., f is the gradient of E : C_2 → R at θ. The flow lines of the negative gradient field −∇_{C_2}E on C_2 approach elasticae asymptotically. Flows of this type that seek to minimize the elastic energy efficiently are known as curve straightening flows. (A small discrete illustration of this energy and the C_2 boundary conditions follows the discussion of Figure 3 below.)

Shown in the top panels of Figure 3 are some examples of elasticae in R^2 and R^3 for given boundary conditions p_0, p_1, θ_0, and θ_1 (depicted via arrows). The bottom row shows an example of using a variant known as scale-invariant elasticae for completing missing edges of a partially occluded object. The object in the left panel is obscured artificially, and the boundaries of the visible parts are used to find interpolating elasticae, shown as white lines in the right panel.
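To fix ideas, the following self-contained sketch (ours, on a uniform discretization of I = [0, 1]) evaluates the elastic energy of Eqn. 3 and the three boundary conditions defining C_2 for a candidate direction function; the gradient search described above would iteratively reduce this energy while the projection of Section 3.1 keeps the residuals at zero.

```python
import numpy as np

def elastic_energy(theta, ds):
    """Discrete version of Eqn. 3: E(theta) = (1/2) * integral of theta_dot(s)^2 ds."""
    theta_dot = np.gradient(theta, ds)
    return 0.5 * np.sum(theta_dot ** 2) * ds

def boundary_residuals(theta, ds, p0, p1, theta0, theta1):
    """How far a candidate direction function is from meeting the C_2 conditions."""
    d = np.array(p1) - np.array(p0)                    # required total displacement
    disp = np.array([np.sum(np.cos(theta)) * ds,       # integral of e^{j theta(s)} ds
                     np.sum(np.sin(theta)) * ds])
    return theta[0] - theta0, theta[-1] - theta1, disp - d

if __name__ == "__main__":
    n = 200
    s = np.linspace(0.0, 1.0, n)
    ds = s[1] - s[0]
    # A candidate completion: direction turning linearly from theta0 to theta1.
    theta0, theta1 = np.pi / 4, -np.pi / 4
    theta = theta0 + (theta1 - theta0) * s
    print("elastic energy:", elastic_energy(theta, ds))
    print("residuals:", boundary_residuals(theta, ds, (0.0, 0.0), (0.9, 0.0), theta0, theta1))
```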


Figure 3: Upper row: solid lines show the elastica between given points and directions shown by arrows. Lower two rows: objects in left images are obscured and the end points of the visible edges are used to find elastica, shown in the right panels.

2. Using Local Harmonics to Constrain Curves: So far, the task of completing curves has been based solely on first-order boundary data. It seems logical to use more information from the visible portions. Our idea is to consider a subspace V of H^1 associated with the dominant lower harmonics of the direction function of the visible portions of the edge, and restrict the search for completions to the space C_2^V = V ∩ C_2. The energy now consists of two terms: E_1 is the same as the elastic energy defined in item 1, while E_2 gives a measurement of the similarity between the predicted and observed curves:

    E_2(θ) = ∫_0^l |θ(s) − θ^*(s)|^2 ds ,

where l and θ^* are the length and the direction function of the visible curve, respectively. We have derived analytical expressions for the gradients of these two energies on C_2^V, and have utilized them to search for the completion with least total energy. The edge completion can be complemented with a study of textures: statistics of the texture of the visible regions can be used to predict the pixel values in the augmented region [14].

Shown in Figure 4 are simple examples computed using these ideas. The left panels show images of partially obscured objects and our goal is to predict the missing pieces of these objects. We solve this prediction problem in two steps using: (i) curve completion, and (ii) texture growth. For curve completion, we extract the boundaries of the visible portions, extract dominant harmonics, and use these harmonics in the prediction of closed contours. The middle panels show the visible portions in marked lines and the corresponding optimal completions in plain lines. Texture growth is used to produce the final results displayed in the right panels of each row.

Figure 4: Left panels: given images of obscured objects. Middle panels: edges extracted from visible parts (marked lines) are used to find optimal curve completions (solid lines). Right panels: texture statistics from the visible portions are used to grow pixels in new regions.

5.2 Bayesian Discovery of Objects in Images

An important application of these shape analysis tools is the discovery of partially occluded objects in noisy images. Given some prior knowledge of their shapes, how can we incorporate it in the search for the objects? We define a prior on an appropriate curve space, and use a Bayesian framework to infer new shapes. The curve space C used here is a simple variation of C_1 that takes position, orientation, and scale into consideration. Elements γ ∈ C can (up to reparametrizations) be represented as pairs γ = (x, θ), where x is a finite-dimensional variable encoding initial position, initial velocity and scale, and θ ∈ C_1. We start by deriving a posterior density on C.

1. Image Likelihood: The likelihood function can be described as follows. Let D ⊂ R^2 be the image domain and I : D → R^+ be an image. A (simple) closed curve γ in D divides the image domain into a region D_i(γ) inside the curve and a region D_o(γ) outside. Let P_i be a probability model for the pixel values inside the curve, and P_o be a model for pixels outside. For example, for a noisy two-phase image, one can choose P_i and P_o to be Gaussian distributions with different means. For a given image I, the likelihood that γ is present in it is proportional to e^{−H(γ,I)/σ_1^2}, where H is given by:

    H(γ, I) = − log ( ∫_{D_i(γ)} P_i(I(x)) dx + ∫_{D_o(γ)} P_o(I(x)) dx ) .    (4)

This image model can be modified with energies such as the magnitude square of the image gradient [9], or information-theoretic entropy based terms [3]. (A discrete evaluation of this likelihood energy is sketched below.)
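A discrete version of the likelihood energy in Eqn. 4 is easy to evaluate once the curve is rasterized into an inside/outside mask. The sketch below is our own illustration; the synthetic disc image, the noise level, and the Gaussian parameters are arbitrary choices made only to compare a well-placed contour with a misplaced one.

```python
import numpy as np

def gaussian_pdf(x, mean, sigma):
    return np.exp(-0.5 * ((x - mean) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def image_energy(image, inside_mask, mean_in, mean_out, sigma=0.1):
    """Discrete version of Eqn. 4 for a two-phase Gaussian pixel model.
    inside_mask is the indicator of D_i(gamma); its complement is D_o(gamma)."""
    p_in = gaussian_pdf(image[inside_mask], mean_in, sigma).sum()
    p_out = gaussian_pdf(image[~inside_mask], mean_out, sigma).sum()
    return -np.log(p_in + p_out)

if __name__ == "__main__":
    # Synthetic two-phase image: a bright disc (value ~1) on a dark background (~0).
    n = 64
    yy, xx = np.mgrid[0:n, 0:n]
    disc = (xx - n / 2) ** 2 + (yy - n / 2) ** 2 < (n / 4) ** 2
    rng = np.random.default_rng(0)
    image = np.where(disc, 1.0, 0.0) + 0.1 * rng.normal(size=(n, n))

    good = image_energy(image, disc, mean_in=1.0, mean_out=0.0)
    shifted = np.roll(disc, n // 4, axis=1)        # a contour placed off the object
    bad = image_energy(image, shifted, mean_in=1.0, mean_out=0.0)
    print(good, bad)   # the well-placed contour has the lower energy (higher likelihood)
```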

2. Prior Density: We choose a "Gaussian" probability density as a prior. Let θ_0 represent a mean shape, σ_0^2 be the shape dispersion, and θ(γ) ∈ C_1 be the shape associated with γ. Then, define the prior density by

    µ(γ) ∝ e^{−d(θ(γ),θ_0)^2/σ_0^2} e^{−P(x)/σ_2^2} ,

where d(·, ·) is the geodesic distance on C_1 discussed earlier and P is a prior energy on the space of nuisance variables.

The posterior density is given by:

    µ(γ | I) = (1/Z) e^{−d(θ(γ),θ_0)^2/σ_0^2 − P(x)/σ_2^2 − H(γ,I)/σ_1^2} .

We have used a gradient approach to find the MAP estimate of γ for a given image. For an initial condition γ_0 = (x_0, θ_0), let Ψ(γ_0, t, w, f) be the geodesic flow from γ_0 with initial velocity (w, f). Here, w is the component tangential to the "nuisance" variable x and f ∈ T_{θ_0}(C_1). Using the fact that d(θ, θ_0) = ‖f‖, we rewrite the posterior energy as:

    E[w, f] = (1/σ_1^2) H(Ψ(γ_0, 1, w, f), I) + (1/σ_0^2) ‖f‖^2 + (1/σ_2^2) P(x(w)) .

Now, using a Fourier decomposition of f, we use a gradient process to minimize the posterior energy.

Shown in Figure 5 is an illustration of this Bayesian shape extraction. The top left panel shows the true shape embedded in the observed images, and the top right shows the prior mean shape associated with θ_0. In the lower two rows, the left panels show the observed images and the remaining panels show MAP estimates of θ under an increasing influence of the prior (going from left to right). Successful discovery of the hidden shape despite partial obscuration (in the last row) emphasizes the need for, and the power of, a Bayesian approach in such problems.
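Once f is truncated to a few Fourier coefficients, the MAP search is a finite-dimensional optimization. The sketch below (ours) shows the structure of the posterior energy E[w, f] and a plain finite-difference gradient descent over (w, f); the geodesic flow Ψ, the image term H, and the nuisance prior P are replaced by hypothetical toy stand-ins.

```python
import numpy as np

def posterior_energy(w, a, data_term, prior_x, flow, sigma1, sigma0, sigma2):
    """Discrete posterior energy E[w, f], with f given by its Fourier coefficients a."""
    gamma = flow(w, a)                        # placeholder for Psi(gamma_0, 1, w, f)
    return (data_term(gamma) / sigma1 ** 2
            + np.sum(a ** 2) / sigma0 ** 2    # ||f||^2 via Fourier coefficients (up to a constant)
            + prior_x(w) / sigma2 ** 2)

def map_estimate(w0, a0, energy, step=0.05, n_iter=200, h=1e-4):
    """Gradient descent on (w, a) with finite-difference gradients."""
    x = np.concatenate([w0, a0]).astype(float)
    nw = len(w0)
    def E(z):
        return energy(z[:nw], z[nw:])
    for _ in range(n_iter):
        g = np.array([(E(x + h * e) - E(x - h * e)) / (2 * h) for e in np.eye(len(x))])
        x = x - step * g
    return x[:nw], x[nw:]

if __name__ == "__main__":
    # Toy stand-ins: a quadratic data term pulling toward coefficients (1, 0.5).
    flow = lambda w, a: a
    data_term = lambda gamma: np.sum((gamma - np.array([1.0, 0.5])) ** 2)
    prior_x = lambda w: np.sum(w ** 2)
    energy = lambda w, a: posterior_energy(w, a, data_term, prior_x, flow, 1.0, 1.0, 1.0)
    w_hat, a_hat = map_estimate(np.zeros(1), np.zeros(2), energy)
    print(w_hat, a_hat)   # coefficients move toward the data term, shrunk by the prior
```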

6 Conclusions

We have presented an overview of an ambitious framework to solve optimization and inference problems on spaces of constrained curves (SCCs). The main idea is to exploit the differential geometry of these Riemannian manifolds to obtain simpler solutions as compared to those obtained with PDE-based methods. Using two examples of SCCs, we have presented some applications of this framework in image understanding. In particular, these ideas lead to a novel statistical theory of shapes of planar objects with powerful tools for shape analysis.

Figure 5: Top panels: left shows the true shape present in the images and right shows the prior mean. Middle and bottom panels: left shows the noisy data and the next three show MAP shape estimates (superimposed) with increasing influence of the prior from left to right.

References

[1] G. Charpiat, O. Faugeras, and R. Keriven. Approximations of shape metrics and application to shape warping and empirical shape statistics. Institut National de Recherche en Informatique et en Automatique, No. 4820, May 2003.

[2] N. Duta, M. Sonka, and A. K. Jain. Learning shape models from examples using automatic shape clustering and Procrustes analysis. Lecture Notes in Computer Science, 1999.

[3] A. Ben Hamza and H. Krim. Entropic image registration and segmentation. In Proceedings of EMMCVPR, 2003, to appear.

[4] H. Karcher. Riemann center of mass and mollifier smoothing. Communications on Pure and Applied Mathematics, 30:509–541, 1977.

[5] E. Klassen, A. Srivastava, and W. Mio. Analysis of planar shapes using geodesic paths on shape spaces. IEEE Transactions on Pattern Analysis and Machine Intelligence, in review, 2003.

[6] H. Le. Locating Fréchet means with application to shape spaces. Advances in Applied Probability, 33(2):324–338, 2001.

[7] W. Mio, A. Srivastava, and E. Klassen. Interpolation by elastica in Euclidean spaces. Quarterly of Applied Mathematics, to appear, 2003.

[8] D. Mumford. Elastica and computer vision, pages 491–506. Springer, New York, 1994.

[9] D. Mumford and J. Shah. Optimal approximations by piecewise smooth functions and associated variational problems, 1989.

[10] B. Romeny, editor. Geometry Driven Diffusions in Computer Vision. Kluwer, 1994.

[11] T. B. Sebastian, P. N. Klein, and B. B. Kimia. On aligning curves. IEEE Transactions on Pattern Analysis and Machine Intelligence, 25(1):116–125, 2003.

[12] J. Sethian. Level Set Methods: Evolving Interfaces in Geometry, Fluid Mechanics, Computer Vision, and Material Science. Cambridge University Press, 1996.

[13] E. Sharon, A. Brandt, and R. Basri. Completion energies and scale. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(10):1117–1131, 2000.

[14] Y. N. Wu, S. C. Zhu, and X. Liu. Equivalence of Julesz ensembles and FRAME models. International Journal of Computer Vision, 38(3):247–265, 2000.

[15] L. Younes. Optimal matching between shapes via elastic deformations. Journal of Image and Vision Computing, 17(5/6):381–389, 1999.