Exercise 1: Advanced SDPs, Matrices and Vectors

All of these are very important facts that we will use repeatedly, and they should be internalized.

• Inner and outer products. Let $u = (u_1, \dots, u_n)$ and $v = (v_1, \dots, v_n)$ be vectors in $\mathbb{R}^n$. Then $u^T v = \langle u, v\rangle = \sum_{i=1}^n u_i v_i$ is the inner product of $u$ and $v$.

The outer product $uv^T$ is an $n \times n$ rank-1 matrix $B$ with entries $B_{ij} = u_i v_j$. The matrix $B$ is a very useful operator. Suppose $v$ is a unit vector. Then $B$ sends $v$ to $u$, i.e. $Bv = uv^T v = u$, but $Bw = 0$ for all $w \in v^\perp$.

• Matrix product. For any two matrices $A \in \mathbb{R}^{m \times n}$ and $B \in \mathbb{R}^{n \times p}$, the standard matrix product $C = AB$ is the $m \times p$ matrix with entries $c_{ij} = \sum_{k=1}^n a_{ik} b_{kj}$. Here are two very useful ways to view this.

Inner products: Let $r_i$ be the $i$-th row of $A$, or equivalently the $i$-th column of $A^T$, the transpose of $A$. Let $b_j$ denote the $j$-th column of $B$. Then $c_{ij} = r_i^T b_j$ is the dot product of the $i$-th column of $A^T$ and the $j$-th column of $B$.

Sums of outer products: $C$ can also be expressed as a sum of outer products of columns of $A$ and rows of $B$.

Exercise: Show that $C = \sum_{k=1}^n a_k b_k^T$, where $a_k$ is the $k$-th column of $A$ and $b_k^T$ is the $k$-th row of $B$ (equivalently, $b_k$ is the $k$-th column of $B^T$).

• Trace inner product of matrices. For any $n \times n$ matrix $A$, the trace is defined as the sum of the diagonal entries, $\mathrm{Tr}(A) = \sum_i a_{ii}$. For any two $m \times n$ matrices $A$ and $B$ one can define the Frobenius or trace inner product $\langle A, B\rangle = \sum_{ij} a_{ij} b_{ij}$. This is also denoted $A \bullet B$.

Exercise: Show that $\langle A, B\rangle = \mathrm{Tr}(A^T B) = \mathrm{Tr}(B A^T)$.

• Bilinear forms. For any $m \times n$ matrix $A$ and vectors $u \in \mathbb{R}^m$, $v \in \mathbb{R}^n$, the product $u^T A v = \langle u, Av\rangle = \sum_{i=1}^m \sum_{j=1}^n a_{ij} u_i v_j$.

Exercise: Show that $u^T A v = \langle uv^T, A\rangle$. This relates the bilinear form to the matrix inner product.

• PSD matrices. Let $A$ be a symmetric $n \times n$ matrix with entries $a_{ij}$. Recall that we defined $A$ to be PSD if there exist vectors $v_i$ for $i = 1, \dots, n$ (in a space of arbitrary dimension) such that $a_{ij} = v_i \cdot v_j$ for each $1 \le i, j \le n$.
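The identities above are easy to check numerically. The following is a small sketch using numpy (random test matrices are made up for illustration) that verifies the sum-of-outer-products view of matrix multiplication, the trace inner product identity, and the bilinear-form identity:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, p = 3, 4, 2
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, p))

# Matrix product as a sum of outer products: C = sum_k a_k b_k^T,
# where a_k is the k-th column of A and b_k^T is the k-th row of B.
C = sum(np.outer(A[:, k], B[k, :]) for k in range(n))
assert np.allclose(C, A @ B)

# Trace inner product: <A, M> = sum_ij a_ij m_ij = Tr(A^T M).
M = rng.standard_normal((m, n))
assert np.isclose(np.sum(A * M), np.trace(A.T @ M))

# Bilinear form as a matrix inner product: u^T A v = <u v^T, A>.
u = rng.standard_normal(m)
v = rng.standard_normal(n)
assert np.isclose(u @ A @ v, np.sum(np.outer(u, v) * A))
print("all identities verified")
```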
The following properties are all equivalent ways of characterizing PSD matrices:

1. $a_{ij} = v_i \cdot v_j$ for each $1 \le i, j \le n$. $A$ is called the Gram matrix of the vectors $v_i$. So if $V$ is the matrix obtained by stacking these vectors with $i$-th column $v_i$, then $A = V^T V$.
2. $A$ is symmetric and $x^T A x \ge 0$ for all $x \in \mathbb{R}^n$.
3. $A$ is symmetric and has all eigenvalues non-negative.
4. $A = \sum_{i=1}^n \lambda_i u_i u_i^T$, where $\lambda_i \ge 0$ and the $u_i$ form an orthonormal set of vectors. The $\lambda_i$ are the eigenvalues of $A$ and the $u_i$ are the corresponding eigenvectors.

Exercise: Show that (1) $\Rightarrow$ (2) and (2) $\Rightarrow$ (3).

Solution:

(1) $\Rightarrow$ (2): Note that (1) implies that $A = V^T V$, where $V = (v_1, \dots, v_n)$. Then for any $x \in \mathbb{R}^n$, we have $x^T A x = x^T V^T V x = \|Vx\|^2 \ge 0$, where $\|Vx\|^2$ is the squared Euclidean norm.

(2) $\Rightarrow$ (3): Since $A$ is symmetric, $A$ is diagonalizable and has only real eigenvalues. Now assume that $A$ has a negative eigenvalue $\lambda < 0$ with corresponding eigenvector $v \ne 0$ such that $Av = \lambda v$. But then $v^T A v = \lambda \|v\|^2 < 0$, contradicting (2).

(3) $\Rightarrow$ (4) follows from the well-known spectral theorem (which we do not prove here): any symmetric matrix $B$ has real eigenvalues and orthogonal eigenvectors. That is, $B = \sum_i \beta_i u_i u_i^T$, where $\beta_i \in \mathbb{R}$ and the $u_i$ form an orthonormal set of vectors.

Exercise: Show that (4) $\Rightarrow$ (1) using the two views of matrix products discussed above.

Solution: Let $U = (\sqrt{\lambda_1}\, u_1, \dots, \sqrt{\lambda_n}\, u_n)$ (i.e. with the indicated columns). Then
$$A = \sum_{i=1}^n \lambda_i u_i u_i^T = \sum_{i=1}^n (\sqrt{\lambda_i}\, u_i)(\sqrt{\lambda_i}\, u_i)^T = U U^T.$$
Now letting $(v_1, \dots, v_n) = U^T$ denote the columns of $U^T$, we directly get that $A_{ij} = (UU^T)_{ij} = v_i \cdot v_j$.

Exercise: If $A$ and $B$ are $n \times n$ PSD matrices, show that $A + B$ is also PSD.

Hint: It is easiest to use definition (2) of PSD above.

Solution: Note that $x^T A x \ge 0$ and $x^T B x \ge 0$ for all $x \in \mathbb{R}^n$ clearly implies that $x^T (A + B) x = x^T A x + x^T B x \ge 0$ for all $x \in \mathbb{R}^n$. Thus, by (2), $A + B$ is also PSD.

Exercise: Show the above using (1) instead of (2).
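The four characterizations can be sanity-checked numerically. Here is a small sketch (with a made-up Gram factor $V$) verifying that a Gram matrix satisfies properties (2)-(4):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
# (1) Gram-matrix view: A = V^T V is PSD for any V.
V = rng.standard_normal((3, n))          # columns play the role of v_1, ..., v_n
A = V.T @ V

# (2) x^T A x = ||Vx||^2 >= 0 for any x.
x = rng.standard_normal(n)
assert np.isclose(x @ A @ x, np.linalg.norm(V @ x) ** 2)

# (3) All eigenvalues are non-negative (up to numerical round-off).
lam, U = np.linalg.eigh(A)               # A is symmetric, so use eigh
assert lam.min() > -1e-10

# (4) Spectral decomposition: A = sum_i lambda_i u_i u_i^T.
A_rebuilt = sum(lam[i] * np.outer(U[:, i], U[:, i]) for i in range(n))
assert np.allclose(A, A_rebuilt)
print("PSD characterizations agree")
```

Note that `np.linalg.eigh` is used rather than `np.linalg.eig` since $A$ is symmetric, which guarantees real eigenvalues and orthonormal eigenvectors.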
In particular, if $a_{ij} = v_i \cdot v_j$ and $b_{ij} = w_i \cdot w_j$, can you construct vectors $y_i$ from these $v_i$ and $w_i$ such that $a_{ij} + b_{ij} = y_i \cdot y_j$?

Solution: Define $z_1, \dots, z_n$ by the relation $z_i = (v_i^T, w_i^T)^T$ (the concatenation of $v_i$ and $w_i$) for $i \in [n]$. Then clearly $\langle z_i, z_j\rangle = \langle v_i, v_j\rangle + \langle w_i, w_j\rangle = a_{ij} + b_{ij}$.

• Tensors. Let $v \in \mathbb{R}^n$. We define the two-fold tensor $v^{\otimes 2}$ as the $n \times n$ matrix with $(i,j)$-th entry $v_i v_j$. This is the same as $vv^T$, but it is useful to view $v^{\otimes 2}$ as an $n^2$-dimensional vector. Similarly, if $v \in \mathbb{R}^n$ and $w \in \mathbb{R}^m$, then $v \otimes w = vw^T$ is viewed as an $nm$-dimensional vector.

Exercise: Show that if $v, w \in \mathbb{R}^n$ and $x, y \in \mathbb{R}^m$, then $\langle v \otimes x, w \otimes y\rangle = \langle v, w\rangle \langle x, y\rangle$. One can remember this rule as: the dot product of tensors is the product of their vector dot products.

Solution:
$$\langle v \otimes x, w \otimes y\rangle = \sum_{i \in [n],\, j \in [m]} (v \otimes x)_{ij} (w \otimes y)_{ij} = \sum_{i \in [n],\, j \in [m]} v_i x_j w_i y_j = \Big(\sum_{i \in [n]} v_i w_i\Big)\Big(\sum_{j \in [m]} x_j y_j\Big) = \langle v, w\rangle \langle x, y\rangle.$$

Similarly, one can generalize this to higher-order tensors. For now we just discuss the $k$-fold tensor of a vector with itself. If $v \in \mathbb{R}^n$, then $v^{\otimes k}$ is the $n^k$-dimensional vector with $(i_1, \dots, i_k)$ entry equal to the product $v_{i_1} v_{i_2} \cdots v_{i_k}$.

Exercise: Show (by just expanding things out) that if $v, w \in \mathbb{R}^n$, then $\langle v^{\otimes k}, w^{\otimes k}\rangle = (\langle v, w\rangle)^k$.

Solution:
$$\langle v^{\otimes k}, w^{\otimes k}\rangle = \sum_{i_1, \dots, i_k \in [n]} v^{\otimes k}_{i_1 \cdots i_k} w^{\otimes k}_{i_1 \cdots i_k} = \sum_{i_1, \dots, i_k \in [n]} (v_{i_1} \cdots v_{i_k})(w_{i_1} \cdots w_{i_k}) = \sum_{i_1, \dots, i_k \in [n]} (v_{i_1} w_{i_1}) \cdots (v_{i_k} w_{i_k}) = \Big(\sum_{i_1 \in [n]} v_{i_1} w_{i_1}\Big) \cdots \Big(\sum_{i_k \in [n]} v_{i_k} w_{i_k}\Big) = \langle v, w\rangle^k,$$
as needed.

Exercise: Let $p(x)$ be a univariate polynomial with non-negative coefficients. Let $A$ be an $n \times n$ PSD matrix with entries $a_{ij}$, and let $p(A)$ denote the matrix whose $(i,j)$-entry is $p(a_{ij})$. Show that $p(A)$ is also PSD.

Hint: Use that $a_{ij} = \langle v_i, v_j\rangle$ for each $i, j$, and construct suitable vectors $v_i'$ and $v_j'$ such that $p(a_{ij}) = v_i' \cdot v_j'$. Use the property $\langle v^{\otimes k}, w^{\otimes k}\rangle = (\langle v, w\rangle)^k$ of dot products of tensors stated above.

Solution: Since $A$ is PSD, we can write $a_{ij} = \langle v_i, v_j\rangle$ for vectors $v_1, \dots, v_n$.
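The tensor dot-product rule is easy to confirm numerically: flattening $v \otimes w = vw^T$ into an $nm$-dimensional vector is exactly what `np.kron` computes for 1-D inputs. A small sketch (with made-up random vectors):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 3, 4
v, w = rng.standard_normal(n), rng.standard_normal(n)
x, y = rng.standard_normal(m), rng.standard_normal(m)

# <v (x) x, w (x) y> = <v, w> <x, y>; np.kron flattens the outer product.
assert np.isclose(np.kron(v, x) @ np.kron(w, y), (v @ w) * (x @ y))

# k-fold tensor of a vector with itself: <v^(x)k, w^(x)k> = <v, w>^k.
def tensor_power(v, k):
    t = v
    for _ in range(k - 1):
        t = np.kron(t, v)
    return t

k = 3
assert np.isclose(tensor_power(v, k) @ tensor_power(w, k), (v @ w) ** k)
print("tensor identities verified")
```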
Let $p(x) = c_0 + c_1 x + \cdots + c_k x^k$, where $c_0, c_1, \dots, c_k \ge 0$. Now define the vectors $z_1, \dots, z_n$ by
$$z_i = (\sqrt{c_0},\ \sqrt{c_1}\, v_i,\ \sqrt{c_2}\, v_i^{\otimes 2},\ \dots,\ \sqrt{c_k}\, v_i^{\otimes k}) \quad \text{for } i \in [n].$$
Then by the previous exercises, we see that
$$\langle z_i, z_j\rangle = c_0 + c_1 \langle v_i, v_j\rangle + c_2 \langle v_i^{\otimes 2}, v_j^{\otimes 2}\rangle + \cdots + c_k \langle v_i^{\otimes k}, v_j^{\otimes k}\rangle = c_0 + c_1 \langle v_i, v_j\rangle + c_2 \langle v_i, v_j\rangle^2 + \cdots + c_k \langle v_i, v_j\rangle^k = c_0 + c_1 a_{ij} + c_2 a_{ij}^2 + \cdots + c_k a_{ij}^k = p(a_{ij}).$$

• If the Goemans-Williamson SDP relaxation for MAXCUT on a graph $G$ has value $(1 - \epsilon)|E|$, where $|E|$ is the number of edges in $G$, show that the hyperplane rounding algorithm achieves a value of $(1 - O(\sqrt{\epsilon}))|E|$.

Solution: Let $v_1, \dots, v_n \in \mathbb{R}^n$, $\|v_i\| = 1$, denote the optimal solution to the SDP for $G$. Recall that the SDP value is
$$\mathrm{SDP} := \sum_{(i,j) \in E} \frac{1}{2}(1 - \langle v_i, v_j\rangle),$$
and that the expected value achieved by Goemans-Williamson rounding is
$$\sum_{(i,j) \in E} \theta_{ij}/\pi,$$
where $\cos(\theta_{ij}) = \langle v_i, v_j\rangle$ for all $(i,j) \in E$. By assumption, the value of the SDP is at least $(1 - \epsilon)|E|$.

To begin the analysis, we first remove all the edges for which the angle $\theta_{ij}$ is less than $\pi/2$, and show that the SDP value of the remaining edges is still at least $(1 - 2\epsilon)|E|$ (this will allow us to apply a useful concavity argument). Namely, let $E' = \{(i,j) \in E : \langle v_i, v_j\rangle \le 0\}$, and let $\alpha = |E'|/|E|$.

We first show that $\alpha \ge 1 - 2\epsilon$. To see this, note that
$$(1 - \epsilon)|E| \le \sum_{(i,j) \in E \setminus E'} \frac{1}{2}(1 - \langle v_i, v_j\rangle) + \sum_{(i,j) \in E'} \frac{1}{2}(1 - \langle v_i, v_j\rangle) \le (|E| - |E'|)/2 + |E'| = (1 - \alpha)|E|/2 + \alpha|E|,$$
where the middle bound uses that $\langle v_i, v_j\rangle > 0$ on $E \setminus E'$ and $\langle v_i, v_j\rangle \ge -1$ on $E'$; the lower bound $\alpha \ge 1 - 2\epsilon$ now follows by rearranging.

Using the above, we can lower bound the value of the SDP restricted to the edges of $E'$ as follows:
$$\sum_{(i,j) \in E'} \frac{1}{2}(1 - \langle v_i, v_j\rangle) \ge (1 - \epsilon)|E| - \sum_{(i,j) \in E \setminus E'} \frac{1}{2}(1 - \langle v_i, v_j\rangle) \ge (1 - \epsilon)|E| - \frac{1}{2}(|E| - |E'|) \ge (1 - 2\epsilon)|E|.$$
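The rounding step itself is very short. Below is a minimal sketch of hyperplane rounding, assuming the SDP has already been solved and we are handed unit vectors $v_1, \dots, v_n$; the toy 4-cycle graph and the random vectors here are made up for illustration, since a real implementation would obtain the $v_i$ from an SDP solver:

```python
import numpy as np

rng = np.random.default_rng(3)
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]          # hypothetical toy graph
V = rng.standard_normal((4, 3))
V /= np.linalg.norm(V, axis=1, keepdims=True)     # rows are unit vectors v_i

def round_hyperplane(V, rng):
    """Cut vertices by the side of a uniformly random hyperplane through 0."""
    r = rng.standard_normal(V.shape[1])           # random Gaussian normal vector
    return V @ r >= 0                             # boolean side per vertex

side = round_hyperplane(V, rng)
# An edge (i, j) is cut when its endpoints land on opposite sides; this
# happens with probability theta_ij / pi, where cos(theta_ij) = <v_i, v_j>.
cut_value = sum(side[i] != side[j] for i, j in edges)
print("cut value:", cut_value)
```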
