Isospectral Transformations: A New Approach to Analyzing Multidimensional Systems and Networks


Leonid Bunimovich

Outline
1. Isospectral Graph Reductions: Networks and Graphs; Graph Reductions
2. Applications of Isospectral Transformations: Eigenvalue Approximations; Dynamical Network Stability; Improved Escape Estimates in Open Systems
3. Summary

Networks

What is a network? Basic definition: a network is a collection of elements that interact in some way.

Examples:
- Technological networks: the World Wide Web (http://www3.nd.edu/~networks/Image%20Gallery/gallery.htm)
- Social networks: a high school friendship network (http://www-personal.umich.edu/~mejn/networks/school.gif)
- Biological networks: a neural network (http://www.livescience.com/40855-brain-connections-no-neuron-is-an-island.html)

Reducing Network Complexity

Question: To what extent can a network be simplified (reduced) while maintaining one or more of its basic characteristics? What does it mean to simplify a network? What network characteristic should we preserve? To answer these questions we first need a way of representing a network.

A Network as a Graph

A network can be represented by a graph G = (V, E, ω) with vertices V and edges E, where
(i) V represents the network elements;
(ii) E represents the interactions between network elements; and
(iii) ω : E → W gives the edge weights.
Example: the E. coli metabolic network (http://www.kavrakilab.org/bioinformatics/metapath).

Weighted Adjacency Matrix

Definition. If G = (V, E, ω), where V = {v_1, ..., v_n} and e_ij denotes the edge from v_i to v_j, the weighted adjacency matrix M = M(G) of G is given by

    M_ij = ω(e_ij) if e_ij ∈ E, and M_ij = 0 otherwise.

Question: Is it possible to reduce the number of vertices in a graph while maintaining the eigenvalues, including multiplicities, of its weighted adjacency matrix?
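As a concrete illustration of the definitions above, the following minimal Python sketch (not from the talk; the three-element network and its weights are hypothetical) stores a network as a weighted directed graph G = (V, E, ω) and assembles its weighted adjacency matrix M(G), whose eigenvalues are the quantity the reductions below aim to preserve.

    # Minimal sketch (illustrative only): a hypothetical 3-element network stored as
    # a weighted directed graph G = (V, E, omega), and its weighted adjacency matrix
    # M with M[i, j] = omega(e_ij) if e_ij is an edge and 0 otherwise.
    import numpy as np

    V = ["v1", "v2", "v3"]                 # network elements
    omega = {                              # omega : E -> W (here, constant weights)
        ("v1", "v2"): 1.0,
        ("v2", "v3"): 2.0,
        ("v3", "v1"): 1.0,
        ("v2", "v2"): 0.5,                 # a loop is allowed
    }

    index = {v: i for i, v in enumerate(V)}
    M = np.zeros((len(V), len(V)))
    for (u, w), weight in omega.items():
        M[index[u], index[w]] = weight

    print(M)
    print("eigenvalues:", np.linalg.eigvals(M))   # the spectrum the reductions preserve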
The Class of Edge Weights

Definition. For a graph G we let the edge weights W be rational functions of the form ω(λ) = p(λ)/q(λ), where
(i) p(λ) and q(λ) have no common factors; and
(ii) deg(p) ≤ deg(q).

Figure: an example graph G whose edge weights are λ - 2, 1, λ, 1, 2, 1, and λ.

Spectrum and Inverse Spectrum

Definition. If G = (V, E, ω), then det(M - λI) = p(λ)/q(λ) ∈ W. We call

    σ(G) = {λ ∈ C : p(λ) = 0}       the spectrum, and
    σ⁻¹(G) = {λ ∈ C : q(λ) = 0}     the inverse spectrum of G.

For the graph G above,

    det(M - λI) = (λ - 2λ² + λ⁴ - λ⁵)/λ²,

so σ(G) = {(-1)^(1/3), -(-1)^(2/3), (-1 ± √5)/2} and σ⁻¹(G) = {0, 0}.

Structural Sets

Definition. For G = (V, E, ω), a nonempty vertex set S ⊆ V is a structural set of G if each nontrivial cycle of G contains a vertex of S. We let st(G) denote the set of all structural sets of G.

Figure: a graph G in which S = {v_1, v_2} is a structural set.
Figure: a graph G on vertices v_1, ..., v_6 in which T = {v_1, v_3, v_4, v_5} is not a structural set.

Branches

Definition. For G = (V, E, ω) with S = {v_1, ..., v_m} ∈ st(G), let B_ij(G, S) be the set of paths or cycles from v_i to v_j having no interior vertices in S. Furthermore, let B_S(G) = ∪_{1 ≤ i,j ≤ m} B_ij(G, S) denote the branches of G with respect to S.

Figure: the branches of G with respect to S = {v_1, v_2}, each colored red, brown, green, or blue.

Branch Products

Definition. Let β ∈ B_S(G). If β = v_1, ..., v_m with m > 2 and ω_ij = ω(e_ij), then

    P_ω(β) = ( ∏_{i=1}^{m-1} ω_{i,i+1} ) / ( ∏_{i=2}^{m-1} (λ - ω_ii) )

is the branch product of β. If m = 2, then P_ω(β) = ω_12.

Figure: for the path β = v_1, v_2, v_3 with a loop of weight ω_22 at v_2, the branch product is P_ω(β) = ω_12 ω_23 / (λ - ω_22).
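The branch product can be computed mechanically from this definition. The following symbolic sketch (illustrative only, not from the talk; the helper branch_product and the weight symbols ω_12, ω_22, ω_23 are assumptions made for the example) reproduces the two-edge case shown in the figure above.

    # Minimal sketch (illustrative only): the branch product P_omega(beta) of a path
    # beta = v_1, ..., v_m, computed symbolically with sympy.
    import sympy as sp

    lam = sp.symbols("lambda")

    def branch_product(path, weight):
        # numerator: product of consecutive edge weights along the path;
        # denominator: product of (lambda - loop weight) over the interior vertices
        m = len(path)
        if m == 2:                      # a single edge: P_omega(beta) = omega_12
            return weight(path[0], path[1])
        num = sp.Mul(*[weight(path[i], path[i + 1]) for i in range(m - 1)])
        den = sp.Mul(*[lam - weight(path[i], path[i]) for i in range(1, m - 1)])
        return num / den

    # The figure's example: beta = v1, v2, v3 with a loop of weight omega_22 at v2.
    w12, w22, w23 = sp.symbols("omega_12 omega_22 omega_23")
    weights = {("v1", "v2"): w12, ("v2", "v2"): w22, ("v2", "v3"): w23}

    print(branch_product(["v1", "v2", "v3"], lambda u, v: weights.get((u, v), 0)))
    # -> omega_12*omega_23/(lambda - omega_22), matching P_omega(beta) above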
Isospectral Graph Reduction

Definition. For G = (V, E, ω) with structural set S = {v_1, ..., v_m}, let R_S(G) = (S, E, μ), where e_ij ∈ E if B_ij(G, S) ≠ ∅ and

    μ(e_ij) = Σ_{β ∈ B_ij(G, S)} P_ω(β),    1 ≤ i, j ≤ m.

We call R_S(G) the isospectral reduction of G over S.

Example: Isospectral Reduction

Figure: a graph G with all edge weights equal to 1 and its reduction R_S(G) over S = {v_1, v_2}, whose edge weights are rational functions of λ such as 1/(λ - 1) and 1/(λ - 1) + 1/λ.

Spectrum of a Reduced Graph

Question: What is the relation between the eigenvalues of G and the eigenvalues of R_S(G)?

Proposition (Bunimovich, Webb). For G = (V, E, ω) and S ∈ st(G),

    det[M(R_S(G)) - λI] = det[M(G) - λI] / det[M(G|S̄) - λI],

where S̄ is the complement of S in V.

Theorem (Bunimovich, Webb). For G = (V, E, ω) and S ∈ st(G),

    σ(R_S(G)) = [σ(G) ∪ σ⁻¹(G|S̄)] - [σ(G|S̄) ∪ σ⁻¹(G)].

Corollary (Bunimovich, Webb). If G = (V, E, ω) has complex-valued weights and S ∈ st(G), then

    σ(R_S(G)) = σ(G) - σ(G|S̄).

Example: Isospectral Reduction

For the graph G of the previous example with S = {v_1, v_2}: σ(G) = {2, -1, 1, 1, 0, 0} and σ(G|S̄) = {1, 1, 0, 0}, so

    σ(R_S(G)) = σ(G) - σ(G|S̄) = {2, -1}.

Sequential Reductions

Since R_S(G) is a graph with edge weights in W, it is possible to reduce it further, i.e. to sequentially reduce G. Question: To what extent is the structure of a graph preserved under different sequences of reductions?
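To make the proposition concrete, the following sketch (an illustrative three-vertex example, not one from the slides) checks the determinant identity symbolically. For scalar edge weights, summing the branch products through S̄ amounts to the block formula M_SS + M_SS̄ (λI - M_S̄S̄)⁻¹ M_S̄S; this matrix form is not stated on the slides but is a convenient way to compute R_S(G).

    # Minimal symbolic check (illustrative example) of
    #   det[M(R_S(G)) - lambda*I] = det[M(G) - lambda*I] / det[M(G|Sbar) - lambda*I].
    import sympy as sp

    lam = sp.symbols("lambda")

    # Hypothetical 3-vertex graph; S = {v1, v2} is structural since v3 has no loop,
    # so every nontrivial cycle passes through v1 or v2.
    M = sp.Matrix([[0, 1, 1],
                   [1, 0, 1],
                   [1, 0, 0]])
    S, Sbar = [0, 1], [2]

    A = M.extract(S, S)                  # block of M on S
    B = M.extract(S, Sbar)
    C = M.extract(Sbar, S)
    D = M.extract(Sbar, Sbar)            # M(G|Sbar)

    # Reduction over S: the entries of R are rational functions of lambda (weights in W).
    R = A + B * (lam * sp.eye(len(Sbar)) - D).inv() * C

    lhs = sp.simplify((R - lam * sp.eye(len(S))).det())
    rhs = sp.simplify((M - lam * sp.eye(3)).det() / (D - lam * sp.eye(len(Sbar))).det())
    print(sp.simplify(lhs - rhs) == 0)                 # True: the identity holds
    print(sp.roots(sp.numer(sp.together(lhs)), lam))   # sigma(R_S(G)); equals sigma(G) here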