Min- and Max-Relative Entropies
Nilanjana Datta, University of Cambridge, U.K.

Focus: an important quantity which arises in Quantum Mechanics & Quantum Information Theory -- the Quantum Entropy (or von Neumann Entropy) and its parent quantity, the Quantum Relative Entropy -- and on two new quantum relative entropies: the Min- and Max-Relative Entropies.

Quantum Entropy or von Neumann Entropy

In 1927 von Neumann introduced the notion of a mixed state, represented by a density matrix:

    ρ ≥ 0;  Tr ρ = 1;    ρ = Σ_{i=1}^{k} p_i |ψ_i⟩⟨ψ_i|;    ρ ↔ {p_i, |ψ_i⟩}

and defined its entropy:

    S(ρ) := −Tr(ρ log ρ)

Why? To extend the classical theory of Statistical Mechanics, developed by Gibbs, Boltzmann et al., to the quantum domain -- not for the development of Quantum Information Theory. In fact, this was well before Shannon laid the foundations of Classical Information Theory (1948).

    S(ρ) := −Tr(ρ log ρ);    ρ = Σ_{i=1}^{k} p_i |ψ_i⟩⟨ψ_i|;    ⟨ψ_i|ψ_j⟩ ≠ δ_ij

S(ρ) = 0 if and only if ρ is a pure state, ρ = |Ψ⟩⟨Ψ|. Hence S(ρ) is a measure of the "mixedness" of the state ρ.

Relation with Statistical Mechanics

The finite-volume Gibbs state, given by the density matrix

    ρ_Λ^Gibbs = e^{−βH_Λ} / Tr e^{−βH_Λ},

maximizes the von Neumann entropy for a given expected value of the energy; i.e., the functional

    S(ρ_Λ) − β Tr(ρ_Λ H_Λ)

is maximized if and only if ρ_Λ = ρ_Λ^Gibbs, and

    log Z_Λ = max_{ρ_Λ} [ S(ρ_Λ) − β Tr(ρ_Λ H_Λ) ].

In 1948 Shannon defined the entropy of a random variable. Let X ∼ p(x), x ∈ J, with J a finite alphabet. The Shannon entropy of X is

    H(X) := −Σ_{x∈J} p(x) log p(x);    H(X) ≡ H({p(x)}).

Supposedly von Neumann asked Shannon to call this quantity entropy, saying: "You should call it 'entropy' for two reasons; first, the function is already in use in thermodynamics; second, and most importantly, most people don't know what entropy really is, and if you use 'entropy' in an argument, you will win every time."

Relation between Shannon entropy & thermodynamic entropy

    S = k log Ω,    Ω = total # of microstates.

Suppose the r-th microstate occurs with probability p_r.
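The variational characterization of the Gibbs state can be spot-checked numerically. A minimal sketch (the 3-level Hamiltonian, the value of β, and the helper names are illustrative choices, not from the lecture):

```python
import numpy as np

def entropy(rho):
    """von Neumann entropy S(rho) = -Tr(rho log rho), natural log."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]                       # convention: 0 log 0 = 0
    return float(-np.sum(w * np.log(w)))

def gibbs_state(H, beta):
    """rho_Gibbs = exp(-beta H) / Tr exp(-beta H), via eigendecomposition."""
    w, V = np.linalg.eigh(H)
    boltz = np.exp(-beta * w)
    return (V * (boltz / boltz.sum())) @ V.conj().T

H = np.diag([0.0, 1.0, 2.0])               # toy 3-level Hamiltonian (illustrative)
beta = 1.0
free_energy = lambda rho: entropy(rho) - beta * np.trace(rho @ H).real

rho_gibbs = gibbs_state(H, beta)
rho_other = np.diag([0.5, 0.3, 0.2])       # some other state, for comparison

# The Gibbs state attains the maximum of the functional, which equals log Z:
log_Z = np.log(np.sum(np.exp(-beta * np.linalg.eigvalsh(H))))
assert np.isclose(free_energy(rho_gibbs), log_Z)
assert free_energy(rho_other) < free_energy(rho_gibbs)
```

The identity free_energy(ρ_Gibbs) = log Z follows from log p_i = −βE_i − log Z, exactly as in the slide above.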
Consider ν replicas of the system. Then, on average, ν_r ≈ ν p_r replicas are in the r-th state. The total number of microstates of the compound system is

    Ω_ν = ν! / (ν_1! ν_2! ⋯ ν_r! ⋯),

so, by Stirling's approximation, the thermodynamic entropy of the compound system of ν replicas is

    S_ν = k log Ω_ν ≈ −kν Σ_r p_r log p_r,

and hence, per replica,

    S = S_ν / ν = −k Σ_r p_r log p_r = k H({p_r}),

the Shannon entropy (up to the constant k).

Relation between Shannon entropy & von Neumann entropy

    S(ρ) := −Tr(ρ log ρ);    ρ = Σ_{i=1}^{k} p_i |ψ_i⟩⟨ψ_i|;    ρ ↔ {p_i, |ψ_i⟩};    ⟨ψ_i|ψ_j⟩ ≠ δ_ij.

BUT ρ = ρ†, so it has a spectral decomposition

    ρ = Σ_{i=1}^{n} λ_i Π_i,    with eigenvalues {λ_i}.

Since ρ ≥ 0 and Tr ρ = 1, we have λ_i ≥ 0 and Σ_{i=1}^{n} λ_i = 1, so {λ_i} is a probability distribution, and

    S(ρ) = −Σ_{i=1}^{n} λ_i log λ_i = H({λ_i}).

Classical Information Theory [Shannon 1948] is the theory of information-processing tasks, e.g. the storage & transmission of information. Quantum Information Theory is a study of how such tasks can be accomplished using quantum-mechanical systems. It is not just a generalization of Classical Information Theory to the quantum realm: the underlying quantum mechanics gives rise to novel features which have no classical analogue!

In Quantum Information Theory, information is carried by the physical states of quantum-mechanical systems: e.g. polarization states of photons, spin states of electrons, etc.

Fundamental units. Classical Information Theory: a bit, which takes values 0 and 1. Quantum Information Theory: a qubit, the state of a two-level quantum-mechanical system.

Physical representation of a qubit: any two-level system, e.g. the spin states of an electron or the polarization states of a photon; or a multi-level system which has 2 states (E_0, E_1) that can be effectively decoupled from the rest.

Operational significance of the Shannon entropy: it is the optimal rate of data compression for a classical i.i.d. (memoryless) information source, i.e. one whose successive emitted signals are independent of each other. Such a source is modelled by a sequence of i.i.d. random variables.
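The spectral relation S(ρ) = H({λ_i}) above is easy to verify numerically. A small sketch (the mixture of the non-orthogonal states |0⟩ and |+⟩ is an illustrative choice; base-2 logarithms are used):

```python
import numpy as np

def shannon(p):
    """H({p}) = -sum_i p_i log2 p_i (base 2, i.e. measured in bits)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 1e-12]                       # convention: 0 log 0 = 0
    return float(-np.sum(p * np.log2(p)))

def von_neumann(rho):
    """S(rho) = -Tr(rho log2 rho) = Shannon entropy of the spectrum of rho."""
    return shannon(np.linalg.eigvalsh(rho))

ket0 = np.array([1.0, 0.0])
ketp = np.array([1.0, 1.0]) / np.sqrt(2.0)           # |+>, not orthogonal to |0>
rho = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(ketp, ketp)

assert np.isclose(von_neumann(np.outer(ket0, ket0)), 0.0)  # pure state: S = 0
# For a mixture of NON-orthogonal states, S(rho) = H({eigenvalues}),
# which is strictly less than H of the mixing weights {0.5, 0.5}:
assert von_neumann(rho) < shannon([0.5, 0.5])
```

This illustrates the slide's point that the eigenvalues of ρ, not the mixing weights {p_i}, form the probability distribution entering S(ρ).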
Let U_1, U_2, …, U_n be i.i.d. random variables with U_i ∼ p(u), u ∈ J; the signals emitted by the source form a sequence (u_1, u_2, …, u_n). The Shannon entropy of the source is

    H(U) := −Σ_{u∈J} p(u) log p(u).

Operational significance of the Shannon entropy

(Q) What is the optimal rate of data compression for such a source, i.e. the minimum number of bits needed to store the signals emitted per use of the source (for reliable data compression)?

The optimal rate is evaluated in the asymptotic limit n → ∞, where n is the number of uses of the source, under the requirement that p_error^(n) → 0 as n → ∞.

(A) The optimal rate of data compression = H(U), the Shannon entropy of the source.

Operational significance of the von Neumann entropy

It is the optimal rate of data compression for a memoryless (i.i.d.) quantum information source. A quantum information source emits signals (pure states) |ψ_1⟩, |ψ_2⟩, …, |ψ_k⟩ ∈ H (a Hilbert space) with probabilities p_1, p_2, …, p_k. The source is then characterized by {ρ, H}, with density matrix

    ρ = Σ_{i=1}^{k} p_i |ψ_i⟩⟨ψ_i|;    ⟨ψ_i|ψ_j⟩ ≠ δ_ij.

To evaluate the data compression limit, consider a sequence {ρ_n, H_n}. If the quantum information source is memoryless (i.i.d.), then

    ρ_n = ρ^{⊗n},    H_n = H^{⊗n},    ρ ∈ H.

The optimal rate of data compression = S(ρ). NOTE: this is evaluated in the asymptotic limit n → ∞ (n = number of uses of the source), under the requirement that p_err^(n) → 0 as n → ∞.

E.g., a memoryless quantum information source emitting qubits is characterized by {ρ_n, H_n} with ρ_n = ρ^{⊗n}, H_n = H^{⊗n}, or simply by {ρ, H}. Consider n successive uses of the source: n qubits are emitted and stored in m_n qubits (data compression), so the rate of data compression is m_n / n. The optimal rate of data compression is

    R_∞ := lim_{n→∞} m_n / n = S(ρ),

under the requirement that p_error^(n) → 0 as n → ∞.

Quantum Relative Entropy

A fundamental quantity in Quantum Mechanics & Quantum Information Theory is the quantum relative entropy of ρ w.r.t. σ, where ρ ≥ 0, Tr ρ = 1, σ ≥ 0:

    S(ρ||σ) := Tr ρ log ρ − Tr ρ log σ,

well-defined if supp ρ ⊆ supp σ. It acts as a parent quantity for the von Neumann entropy:

    S(ρ) := −Tr ρ log ρ = −S(ρ||I)    (σ = I).

It also acts as a parent quantity for other entropies.
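The definition of S(ρ||σ) and its parent-quantity relation can be sketched in a few lines (the matrix-logarithm helper and the example states are assumptions for illustration; base-2 logarithms are used):

```python
import numpy as np

def logm_psd(A, eps=1e-12):
    """Matrix log2 of a positive semidefinite matrix, via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.log2(np.clip(w, eps, None))) @ V.conj().T

def rel_entropy(rho, sigma):
    """S(rho||sigma) = Tr(rho log rho) - Tr(rho log sigma);
    assumes supp(rho) is contained in supp(sigma)."""
    return float(np.trace(rho @ (logm_psd(rho) - logm_psd(sigma))).real)

rho = np.diag([0.75, 0.25])
sigma = np.diag([0.5, 0.5])

assert rel_entropy(rho, sigma) >= 0.0                 # Klein's inequality
assert np.isclose(rel_entropy(rho, rho), 0.0)         # zero iff rho == sigma
# Parent-quantity relation: S(rho) = -S(rho || I)
S_rho = -float(np.trace(rho @ logm_psd(rho)).real)
assert np.isclose(S_rho, -rel_entropy(rho, np.eye(2)))
```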
For example, for a bipartite state ρ_AB:

Conditional entropy:  S(A|B) := S(ρ_AB) − S(ρ_B) = −S(ρ_AB || I_A ⊗ ρ_B),  where ρ_B = Tr_A ρ_AB.

Mutual information:  I(A:B) := S(ρ_A) + S(ρ_B) − S(ρ_AB) = S(ρ_AB || ρ_A ⊗ ρ_B).

Some properties of S(ρ||σ)

Klein's inequality: S(ρ||σ) ≥ 0, with equality if and only if ρ = σ (so it behaves like a "distance").

Joint convexity:  S( Σ_k p_k ρ_k || Σ_k p_k σ_k ) ≤ Σ_k p_k S(ρ_k || σ_k).

Monotonicity under completely positive trace-preserving (CPTP) maps Λ:  S(Λ(ρ) || Λ(σ)) ≤ S(ρ||σ).

What is a CPTP map? For an isolated quantum system, time evolution is unitary -- the dynamics are governed by the Schroedinger equation. In Quantum Information Theory, however, one deals with open systems: there are unavoidable interactions between the system (A) and its environment (B), and the time evolution of A is not unitary in general. The most general description of the time evolution of an open system is given by a completely positive trace-preserving (CPTP) map Λ : ρ → σ, a superoperator on density operators describing discrete state changes resulting from any allowed physical process.

Properties satisfied by a superoperator Λ : ρ → σ:

Linearity:  Λ( Σ_k p_k ρ_k ) = Σ_k p_k Λ(ρ_k).

Positivity:  σ = Λ(ρ) ≥ 0.

Trace preservation (TP):  Tr σ = Tr Λ(ρ) = Tr ρ = 1.

Complete positivity (CP):  (Λ_A ⊗ id_B)(ρ_AB) ≥ 0 for any state ρ_AB on H_A ⊗ H_B.

Kraus representation:  Λ(ρ) = Σ_k A_k ρ A_k†,  with  Σ_k A_k† A_k = I.

Theorem (monotonicity of the quantum relative entropy under a CPTP map Λ):

    S(Λ(ρ) || Λ(σ)) ≤ S(ρ||σ).    ……… (1)

This is very powerful: many properties of other entropic quantities can be proved using (1).
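The Kraus form and the monotonicity (1) can be exercised on a concrete channel. A sketch, using the qubit depolarizing channel as an (assumed) example of a CPTP map; the states ρ, σ are illustrative choices:

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def depolarizing_kraus(p):
    """Kraus operators of the qubit depolarizing channel."""
    return [np.sqrt(1 - 3 * p / 4) * I2,
            np.sqrt(p / 4) * X, np.sqrt(p / 4) * Y, np.sqrt(p / 4) * Z]

def apply_channel(kraus, rho):
    """Lambda(rho) = sum_k A_k rho A_k^dagger."""
    return sum(A @ rho @ A.conj().T for A in kraus)

def rel_entropy(rho, sigma, eps=1e-12):
    def logm(A):
        w, V = np.linalg.eigh(A)
        return (V * np.log2(np.clip(w, eps, None))) @ V.conj().T
    return float(np.trace(rho @ (logm(rho) - logm(sigma))).real)

kraus = depolarizing_kraus(0.3)
# Completeness / trace preservation: sum_k A_k^dagger A_k = I
assert np.allclose(sum(A.conj().T @ A for A in kraus), I2)

rho = np.diag([0.9, 0.1]).astype(complex)
sigma = np.diag([0.4, 0.6]).astype(complex)
# Monotonicity (1): the relative entropy never increases under the channel.
assert rel_entropy(apply_channel(kraus, rho),
                   apply_channel(kraus, sigma)) <= rel_entropy(rho, sigma)
```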
For example, the strong subadditivity of the von Neumann entropy (conjectured by Lanford and Robinson; proved by Lieb & Ruskai in 1973): for a tripartite state ρ_ABC,

    S(ρ_ABC) + S(ρ_B) ≤ S(ρ_AB) + S(ρ_BC).

Outline of the rest

- Define 2 new relative entropy quantities; discuss their properties and operational significance; give a motivation for defining them.
- Define 2 entanglement monotones (I will explain what that means); discuss their operational significance.

Two new relative entropies

Definition 1: The max-relative entropy of a density matrix ρ and a positive operator σ is

    S_max(ρ||σ) := log min{ λ : ρ ≤ λσ }    (i.e., the smallest λ for which λσ − ρ ≥ 0).

Definition 2: The min-relative entropy of a state ρ and a positive operator σ is

    S_min(ρ||σ) := −log Tr(π_ρ σ),

where π_ρ denotes the projector onto the support of ρ (supp ρ).

Remark: The min-relative entropy is expressible in terms of the quantum relative Rényi entropy of order α (α ≠ 1),

    S_α(ρ||σ) := (1/(α−1)) log Tr( ρ^α σ^{1−α} ),

as follows: for supp ρ ⊆ supp σ,

    S_min(ρ||σ) = lim_{α→0+} S_α(ρ||σ).

Min- and Max-Relative Entropies

Like S(ρ||σ), we have, for ∗ = min, max:

    S_∗(ρ||σ) ≥ 0;

    S_∗(Λ(ρ) || Λ(σ)) ≤ S_∗(ρ||σ)  for any CPTP map Λ;

    S_∗(UρU† || UσU†) = S_∗(ρ||σ)  for any unitary operator U.

Most interestingly,

    S_min(ρ||σ) ≤ S(ρ||σ) ≤ S_max(ρ||σ).

The min-relative entropy is jointly convex in its arguments: for two mixtures of states ρ = Σ_{i=1}^{n} p_i ρ_i and σ = Σ_{i=1}^{n} p_i σ_i,

    S_min(ρ||σ) ≤ Σ_{i=1}^{n} p_i S_min(ρ_i || σ_i),

as for S(ρ||σ). The max-relative entropy is quasiconvex:

    S_max(ρ||σ) ≤ max_{1≤i≤n} S_max(ρ_i || σ_i).

Min- and max-entropies

    H_max(ρ) := −S_min(ρ||I) = log rank(ρ);    H_min(ρ) := −S_max(ρ||I) = −log ||ρ||_∞,

analogous to S(ρ) = −S(ρ||I). For a bipartite state ρ_AB:

    H_min(A|B)_ρ := −S_max(ρ_AB || I_A ⊗ ρ_B),  etc.,

analogous to S(A|B) = −S(ρ_AB || I_A ⊗ ρ_B); and

    H_min(A:B)_ρ := S_min(ρ_AB || ρ_A ⊗ ρ_B),  etc.,
analogous to the mutual information I(A:B) = S(ρ_AB || ρ_A ⊗ ρ_B).

These min- and max-entropies satisfy:

(1) The strong subadditivity property: just as S(ρ_ABC) + S(ρ_B) ≤ S(ρ_AB) + S(ρ_BC), we have

    H_min(ρ_ABC) + H_min(ρ_B) ≤ H_min(ρ_AB) + H_min(ρ_BC).

(2) The Araki-Lieb inequality: just as S(ρ_AB) ≥ S(ρ_A) − S(ρ_B), we have

    H_min(ρ_AB) ≥ H_min(ρ_A) − H_min(ρ_B).

(Q) What is the operational significance of the min- and max-relative entropies? In Quantum Information Theory, initially one evaluated optimal rates of info-processing tasks: e.g. …
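The quantities defined above are straightforward to compute for small examples. A numerical sketch (full-rank σ is assumed, so that S_max can be computed as log of the largest eigenvalue of σ^{−1/2} ρ σ^{−1/2}; all states and helper names are illustrative choices, not from the slides):

```python
import numpy as np

def eig_fn(A, f):
    """Apply a scalar function to a Hermitian matrix through its spectrum."""
    w, V = np.linalg.eigh(A)
    return (V * f(w)) @ V.conj().T

def s_max(rho, sigma):
    """S_max(rho||sigma) = log2 min{lam : rho <= lam * sigma}
    = log2 of the largest eigenvalue of sigma^{-1/2} rho sigma^{-1/2}
    (sigma assumed invertible)."""
    s = eig_fn(sigma, lambda w: w ** -0.5)
    return float(np.log2(np.linalg.eigvalsh(s @ rho @ s).max()))

def s_min(rho, sigma, eps=1e-10):
    """S_min(rho||sigma) = -log2 Tr(pi_rho sigma)."""
    w, V = np.linalg.eigh(rho)
    Vs = V[:, w > eps]
    pi_rho = Vs @ Vs.conj().T               # projector onto supp(rho)
    return float(-np.log2(np.trace(pi_rho @ sigma).real))

def rel_entropy(rho, sigma, eps=1e-12):
    logm = lambda A: eig_fn(A, lambda w: np.log2(np.clip(w, eps, None)))
    return float(np.trace(rho @ (logm(rho) - logm(sigma))).real)

rho = np.diag([0.8, 0.2])
sigma = np.diag([0.3, 0.7])
# The ordering S_min <= S <= S_max:
assert s_min(rho, sigma) <= rel_entropy(rho, sigma) <= s_max(rho, sigma)

def h_min(rho):
    """H_min(rho) = -S_max(rho||I) = -log2 ||rho||_inf."""
    return float(-np.log2(np.linalg.eigvalsh(rho).max()))

def partial_trace(rho_ab, keep):
    """Trace out one qubit of a two-qubit state; keep = 'A' or 'B'."""
    r = rho_ab.reshape(2, 2, 2, 2)          # indices: a, b, a', b'
    return np.einsum('abcb->ac', r) if keep == 'A' else np.einsum('abad->bd', r)

# Araki-Lieb-type inequality for H_min, on a (classical) two-qubit state:
rho_ab = np.diag([0.4, 0.3, 0.2, 0.1])      # basis |00>, |01>, |10>, |11>
rho_a, rho_b = partial_trace(rho_ab, 'A'), partial_trace(rho_ab, 'B')
assert h_min(rho_ab) >= h_min(rho_a) - h_min(rho_b)
```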
