Characterization and properties of $(R,S_\sigma)$-commutative matrices

William F. Trench 

Trinity University, San Antonio, Texas 78212-7200, USA. Mailing address: 659 Hopkinton Road, Hopkinton, NH 03229, USA

NOTICE: this is the author's version of a work that was accepted for publication in Linear Algebra and Its Applications. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. Please cite this article in press as: W.F. Trench, Characterization and properties of $(R,S_\sigma)$-commutative matrices, Linear Algebra Appl. (2012), doi:10.1016/j.laa.2012.02.003

Abstract

Let $R=P\,\operatorname{diag}(\gamma_0 I_{m_0},\gamma_1 I_{m_1},\dots,\gamma_{k-1}I_{m_{k-1}})P^{-1}\in\mathbb{C}^{m\times m}$ and $S_\sigma=Q\,\operatorname{diag}(\gamma_{\sigma(0)} I_{n_0},\gamma_{\sigma(1)} I_{n_1},\dots,\gamma_{\sigma(k-1)}I_{n_{k-1}})Q^{-1}\in\mathbb{C}^{n\times n}$, where $m_0+m_1+\cdots+m_{k-1}=m$, $n_0+n_1+\cdots+n_{k-1}=n$, $\gamma_0$, $\gamma_1$, ..., $\gamma_{k-1}$ are distinct complex numbers, and $\sigma:\mathbb{Z}_k\to\mathbb{Z}_k=\{0,1,\dots,k-1\}$. We say that $A\in\mathbb{C}^{m\times n}$ is $(R,S_\sigma)$-commutative if $RA=AS_\sigma$. We characterize the class of $(R,S_\sigma)$-commutative matrices and extend results obtained previously for the case where $\gamma_\ell=e^{2\pi i\ell/k}$ and $\sigma(\ell)=\alpha\ell+\mu\pmod k$, $0\le\ell\le k-1$, with $\alpha$, $\mu\in\mathbb{Z}_k$. Our results are independent of $\gamma_0$, $\gamma_1$, ..., $\gamma_{k-1}$, so long as they are distinct; i.e.,

if $RA=AS_\sigma$ for some choice of $\gamma_0$, $\gamma_1$, ..., $\gamma_{k-1}$ (all distinct), then $RA=AS_\sigma$ for arbitrary $\gamma_0$, $\gamma_1$, ..., $\gamma_{k-1}$.

MSC: 15A09; 15A15; 15A18; 15A99

Keywords: Commute; Eigenvalue problem; Least squares problem; Moore–Penrose inverse; $(R,S_\sigma)$-commutative; Singular value decomposition

e-mail: [email protected]


1 Introduction

A matrix $A=[a_{rs}]_{r,s=0}^{n-1}\in\mathbb{C}^{n\times n}$ is said to be centrosymmetric if

$$a_{n-r-1,n-s-1}=a_{rs},\quad 0\le r,s\le n-1,$$

or centro-skewsymmetric if

$$a_{n-r-1,n-s-1}=-a_{rs},\quad 0\le r,s\le n-1.$$

The study of such matrices is facilitated by the observation that $A$ is centrosymmetric (centro-skewsymmetric) if and only if $JA=AJ$ ($JA=-AJ$), where $J$ is the flip matrix, with ones on the secondary diagonal and zeros elsewhere. Several authors [2, 3, 4, 5, 8, 10, 13, 25] used this observation to show that centrosymmetric and centro-skewsymmetric matrices can be written as $A=PCP^{-1}$, where $P$ diagonalizes $J$ and $C$ has a useful block structure. We will discuss this further in Example 3. Following this idea, other authors [6, 11, 12, 14, 24] considered matrices satisfying $RA=AR$ or $RA=-AR$, where $R$ is a nontrivial involution; i.e., $R=R^{-1}\ne\pm I$. We continued this line of investigation in [15, 16, 17, 19], and extended it in [18, 20], defining $A\in\mathbb{C}^{m\times n}$ to be $(R,S)$-symmetric ($(R,S)$-skew symmetric) if $RA=AS$ ($RA=-AS$), where $R\in\mathbb{C}^{m\times m}$ and $S\in\mathbb{C}^{n\times n}$ are nontrivial involutions. We showed that a matrix $A$ with either of these properties can be written as $A=PCQ^{-1}$, where $P$ and $Q$ diagonalize $R$ and $S$ respectively and $C$ has a useful block form. Chen [7] and Fasino [9] studied matrices $A\in\mathbb{C}^{n\times n}$ such that $RAR^{-1}=\zeta A$, where $R$ is a matrix that satisfies $R^k=I$ for some $k\le n$ and $\zeta=e^{2\pi i/k}$. In [21] we studied matrices $A\in\mathbb{C}^{m\times n}$ such that $RA=AS$, where

$$R=P\,\operatorname{diag}\bigl(I_{m_0},\zeta I_{m_1},\dots,\zeta^{k-1}I_{m_{k-1}}\bigr)P^{-1}, \tag{1}$$
$$S=Q\,\operatorname{diag}\bigl(I_{n_0},\zeta I_{n_1},\dots,\zeta^{k-1}I_{n_{k-1}}\bigr)Q^{-1}, \tag{2}$$
$$m_0+m_1+\cdots+m_{k-1}=m,\qquad n_0+n_1+\cdots+n_{k-1}=n. \tag{3}$$

Finally, motivated by a problem concerning unilevel block circulants [22], in [23] we considered matrices $A\in\mathbb{C}^{m\times n}$ such that $RA=\zeta^{\mu}AS^{\alpha}$, with $\alpha$, $\mu\in\mathbb{Z}_k=\{0,1,\dots,k-1\}$. We called such matrices $(R,S,\alpha,\mu)$-symmetric, and showed that $A$ has this property if and only if

$$A=\sum_{\ell=0}^{k-1}P_{\alpha\ell+\mu\,(\mathrm{mod}\,k)}F_\ell\widehat{Q}_\ell \quad\text{with}\quad F_\ell\in\mathbb{C}^{m_{\alpha\ell+\mu\,(\mathrm{mod}\,k)}\times n_\ell},\quad 0\le\ell\le k-1, \tag{4}$$

which has useful computational and theoretical applications. ($P_0$, ..., $P_{k-1}$ and $\widehat{Q}_0$, ..., $\widehat{Q}_{k-1}$ are defined in Section 2, specifically, (7)–(10).) The class of $(R,S,\alpha,\mu)$-symmetric matrices includes, for example, centrosymmetric, skew-centrosymmetric, $R$-symmetric, $R$-skew symmetric, $(R,S)$-symmetric, and $(R,S)$-skew symmetric matrices, and block circulants $[A_{s-\alpha r}]_{r,s=0}^{k-1}$.

Having said this, we now propose that all the papers in our bibliography, including our own, are based on an unnecessarily restrictive assumption; namely, that the spectra of the matrices $R$ and $S$ that are used to define the symmetries consist of a set (usually the complete set) of $k$-th roots of unity for some $k\ge2$. In this paper we point out that this assumption is irrelevant and present an alternative approach that eliminates this requirement and exposes a wider class of generalized symmetries if $k>2$. We extend our results in [21] and [23] to this larger class of matrices.

2 Preliminary considerations

Throughout the rest of this paper,
$$R=P\,\operatorname{diag}(\gamma_0 I_{m_0},\gamma_1 I_{m_1},\dots,\gamma_{k-1}I_{m_{k-1}})P^{-1}\in\mathbb{C}^{m\times m} \tag{5}$$
and
$$S=Q\,\operatorname{diag}(\gamma_0 I_{n_0},\gamma_1 I_{n_1},\dots,\gamma_{k-1}I_{n_{k-1}})Q^{-1}\in\mathbb{C}^{n\times n}, \tag{6}$$
where $\gamma_0$, $\gamma_1$, ..., $\gamma_{k-1}$ are distinct complex numbers, except when there is an explicit statement to the contrary. We define
$$R_\sigma=P\,\operatorname{diag}\bigl(\gamma_{\sigma(0)} I_{m_0},\gamma_{\sigma(1)} I_{m_1},\dots,\gamma_{\sigma(k-1)}I_{m_{k-1}}\bigr)P^{-1}$$
and
$$S_\sigma=Q\,\operatorname{diag}\bigl(\gamma_{\sigma(0)} I_{n_0},\gamma_{\sigma(1)} I_{n_1},\dots,\gamma_{\sigma(k-1)}I_{n_{k-1}}\bigr)Q^{-1},$$
where $\sigma:\mathbb{Z}_k\to\mathbb{Z}_k$. We can partition
$$P=\begin{bmatrix}P_0&P_1&\cdots&P_{k-1}\end{bmatrix},\qquad Q=\begin{bmatrix}Q_0&Q_1&\cdots&Q_{k-1}\end{bmatrix}, \tag{7}$$
$$P^{-1}=\begin{bmatrix}\widehat{P}_0\\\widehat{P}_1\\\vdots\\\widehat{P}_{k-1}\end{bmatrix} \quad\text{and}\quad Q^{-1}=\begin{bmatrix}\widehat{Q}_0\\\widehat{Q}_1\\\vdots\\\widehat{Q}_{k-1}\end{bmatrix}, \tag{8}$$
where
$$P_r\in\mathbb{C}^{m\times m_r},\quad \widehat{P}_r\in\mathbb{C}^{m_r\times m},\quad \widehat{P}_rP_s=\delta_{rs}I_{m_r},\quad 0\le r,s\le k-1, \tag{9}$$
$$Q_r\in\mathbb{C}^{n\times n_r},\quad \widehat{Q}_r\in\mathbb{C}^{n_r\times n},\quad\text{and}\quad \widehat{Q}_rQ_s=\delta_{rs}I_{n_r},\quad 0\le r,s\le k-1. \tag{10}$$
We can now write
$$R=\sum_{\ell=0}^{k-1}\gamma_\ell P_\ell\widehat{P}_\ell,\qquad R_\sigma=\sum_{\ell=0}^{k-1}\gamma_{\sigma(\ell)} P_\ell\widehat{P}_\ell, \tag{11}$$
$$S=\sum_{\ell=0}^{k-1}\gamma_\ell Q_\ell\widehat{Q}_\ell,\quad\text{and}\quad S_\sigma=\sum_{\ell=0}^{k-1}\gamma_{\sigma(\ell)} Q_\ell\widehat{Q}_\ell. \tag{12}$$

Definition 1 In general, if $U\in\mathbb{C}^{m\times m}$, $V\in\mathbb{C}^{n\times n}$, and $A\in\mathbb{C}^{m\times n}$, we say that $A$ is $(U,V)$-commutative if $UA=AV$. In particular, we say that $A\in\mathbb{C}^{m\times n}$ is $(R,S_\sigma)$-commutative if $RA=AS_\sigma$. If $\sigma$ is the identity (i.e., $RA=AS$), we say that $A$ is $(R,S)$-commutative. If $A$, $R\in\mathbb{C}^{n\times n}$ and $RA=AR$, we say, as usual, that $A$ commutes with $R$.

3 Necessary and sufficient conditions for $(R,S_\sigma)$-commutativity

Theorem 1 $A\in\mathbb{C}^{m\times n}$ is $(R,S_\sigma)$-commutative if and only if
$$A=P\bigl[C_{rs}\bigr]_{r,s=0}^{k-1}Q^{-1},\quad\text{where}\quad C_{rs}\in\mathbb{C}^{m_r\times n_s}, \tag{13}$$
and
$$C_{rs}=0 \quad\text{if}\quad r\ne\sigma(s),\quad 0\le r,s\le k-1. \tag{14}$$

PROOF. Any $A\in\mathbb{C}^{m\times n}$ can be written as in (13) with $C=P^{-1}AQ$ partitioned as indicated. If
$$D=\operatorname{diag}\bigl(\gamma_0I_{m_0},\gamma_1I_{m_1},\dots,\gamma_{k-1}I_{m_{k-1}}\bigr)$$
and
$$D_\sigma=\operatorname{diag}\bigl(\gamma_{\sigma(0)}I_{n_0},\gamma_{\sigma(1)}I_{n_1},\dots,\gamma_{\sigma(k-1)}I_{n_{k-1}}\bigr),$$
then
$$RA=(PDP^{-1})(PCQ^{-1})=PDCQ^{-1}=P\bigl[\gamma_rC_{rs}\bigr]_{r,s=0}^{k-1}Q^{-1}$$
and
$$AS_\sigma=(PCQ^{-1})(QD_\sigma Q^{-1})=PCD_\sigma Q^{-1}=P\bigl[\gamma_{\sigma(s)}C_{rs}\bigr]_{r,s=0}^{k-1}Q^{-1}.$$
Therefore $RA=AS_\sigma$ if and only if $(\gamma_r-\gamma_{\sigma(s)})C_{rs}=0$, $0\le r,s\le k-1$, which is equivalent to (14), since $\gamma_0$, $\gamma_1$, ..., $\gamma_{k-1}$ are distinct.

The following theorem is a convenient reformulation of Theorem 1.

Theorem 2 $A\in\mathbb{C}^{m\times n}$ is $(R,S_\sigma)$-commutative if and only if
$$A=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_\ell\widehat{Q}_\ell \quad\text{with}\quad F_\ell\in\mathbb{C}^{m_{\sigma(\ell)}\times n_\ell},\quad 0\le\ell\le k-1, \tag{15}$$
in which case
$$F_\ell=\widehat{P}_{\sigma(\ell)}AQ_\ell,\quad 0\le\ell\le k-1, \tag{16}$$
and
$$RA=AS_\sigma=\sum_{\ell=0}^{k-1}\gamma_{\sigma(\ell)}P_{\sigma(\ell)}F_\ell\widehat{Q}_\ell \tag{17}$$
for arbitrary $\gamma_0$, $\gamma_1$, ..., $\gamma_{k-1}$.

PROOF. From (13), an arbitrary $A\in\mathbb{C}^{m\times n}$ can be written as
$$A=\sum_{s=0}^{k-1}\sum_{r=0}^{k-1}P_rC_{rs}\widehat{Q}_s. \tag{18}$$
From Theorem 1, $A$ is $(R,S_\sigma)$-commutative if and only if $C_{rs}=0$ if $r\ne\sigma(s)$, in which case (18) reduces to (15) with $F_\ell=C_{\sigma(\ell),\ell}\in\mathbb{C}^{m_{\sigma(\ell)}\times n_\ell}$. From (10) and (15), $AQ_\ell=P_{\sigma(\ell)}F_\ell$, $0\le\ell\le k-1$, so (9) with $r=\sigma(\ell)$ implies (16). Eqns. (9)–(12) and (15) imply (17).
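The decomposition (15)–(17) is easy to check numerically. The following NumPy sketch, which is purely illustrative and not part of the original development (the block sizes, the map $\sigma$, the matrices $P$, $Q$, the blocks $F_\ell$, and the values $\gamma_\ell$ are arbitrary hypothetical choices), builds $R$, $S_\sigma$, and a matrix $A$ of the form (15), and then verifies $RA=AS_\sigma$ and the recovery formula (16).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: block sizes, a map sigma on Z_k, and distinct (otherwise arbitrary) gammas.
k = 3
m_sizes, n_sizes = [2, 3, 2], [1, 2, 2]            # m = 7, n = 5
sigma = [1, 2, 0]
gamma = np.array([1.5, -2.0, 3.0 + 1.0j])          # any distinct values work
m, n = sum(m_sizes), sum(n_sizes)

# Random invertible P and Q (unitarity is not needed for Theorem 2).
P = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
Q = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Pinv, Qinv = np.linalg.inv(P), np.linalg.inv(Q)

# Column blocks P_l, Q_l of P, Q and row blocks \hat{P}_l, \hat{Q}_l of their inverses.
moff, noff = np.cumsum([0] + m_sizes), np.cumsum([0] + n_sizes)
Pb   = [P[:, moff[l]:moff[l+1]] for l in range(k)]
Qb   = [Q[:, noff[l]:noff[l+1]] for l in range(k)]
Phat = [Pinv[moff[l]:moff[l+1], :] for l in range(k)]
Qhat = [Qinv[noff[l]:noff[l+1], :] for l in range(k)]

# R = P diag(gamma_l I_{m_l}) P^{-1} and S_sigma = Q diag(gamma_{sigma(l)} I_{n_l}) Q^{-1}.
R    = P @ np.diag(np.repeat(gamma, m_sizes)) @ Pinv
Ssig = Q @ np.diag(np.repeat(gamma[sigma], n_sizes)) @ Qinv

# A of the form (15), with arbitrary blocks F_l of size m_{sigma(l)} x n_l.
F = [rng.standard_normal((m_sizes[sigma[l]], n_sizes[l])) for l in range(k)]
A = sum(Pb[sigma[l]] @ F[l] @ Qhat[l] for l in range(k))

print(np.allclose(R @ A, A @ Ssig))                                           # RA = AS_sigma
print(all(np.allclose(Phat[sigma[l]] @ A @ Qb[l], F[l]) for l in range(k)))   # recovery (16)
```

Nothing in this construction requires $\sigma$ to be a permutation or the $\gamma_\ell$ to be roots of unity; only their distinctness matters, and only for the converse direction of Theorem 1.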

Example 1 If $\sigma$ is the permutation
$$\sigma=\begin{pmatrix}0&1&2&3&4&5\\1&3&4&0&2&5\end{pmatrix}=(0,1,3)(2,4)(5),$$
then (15) becomes
$$A=P_1F_0\widehat{Q}_0+P_3F_1\widehat{Q}_1+P_4F_2\widehat{Q}_2+P_0F_3\widehat{Q}_3+P_2F_4\widehat{Q}_4+P_5F_5\widehat{Q}_5,$$
with
$$F_0\in\mathbb{C}^{m_1\times n_0},\quad F_1\in\mathbb{C}^{m_3\times n_1},\quad F_2\in\mathbb{C}^{m_4\times n_2},\quad F_3\in\mathbb{C}^{m_0\times n_3},\quad F_4\in\mathbb{C}^{m_2\times n_4},\quad F_5\in\mathbb{C}^{m_5\times n_5},$$
and
$$RA=AS_\sigma=\gamma_1P_1F_0\widehat{Q}_0+\gamma_3P_3F_1\widehat{Q}_1+\gamma_4P_4F_2\widehat{Q}_2+\gamma_0P_0F_3\widehat{Q}_3+\gamma_2P_2F_4\widehat{Q}_4+\gamma_5P_5F_5\widehat{Q}_5$$
for arbitrary $\gamma_0$, ..., $\gamma_5$.

Example 2 If
$$\sigma=\begin{pmatrix}0&1&2&3&4&5\\2&1&0&1&2&0\end{pmatrix}$$
(which is not a permutation), then (15) becomes

$$A=P_2F_0\widehat{Q}_0+P_1F_1\widehat{Q}_1+P_0F_2\widehat{Q}_2+P_1F_3\widehat{Q}_3+P_2F_4\widehat{Q}_4+P_0F_5\widehat{Q}_5,$$
with
$$F_0\in\mathbb{C}^{m_2\times n_0},\quad F_1\in\mathbb{C}^{m_1\times n_1},\quad F_2\in\mathbb{C}^{m_0\times n_2},\quad F_3\in\mathbb{C}^{m_1\times n_3},\quad F_4\in\mathbb{C}^{m_2\times n_4},\quad F_5\in\mathbb{C}^{m_0\times n_5},$$
and
$$RA=AS_\sigma=\gamma_2P_2F_0\widehat{Q}_0+\gamma_1P_1F_1\widehat{Q}_1+\gamma_0P_0F_2\widehat{Q}_2+\gamma_1P_1F_3\widehat{Q}_3+\gamma_2P_2F_4\widehat{Q}_4+\gamma_0P_0F_5\widehat{Q}_5$$
for arbitrary $\gamma_0$, ..., $\gamma_5$.

Example 3 All results obtained by assuming that $R$ and $S$ are involutions (and therefore have eigenvalues $1$ and $-1$) can just as well be obtained by assuming only that $R$ and $S$ have the same two distinct eigenvalues, with possibly different multiplicities. The original idea in this area of research has its origins in the observation that $A$ is centrosymmetric (skew-centrosymmetric) if and only if $AJ=JA$ ($AJ=-JA$). Since $J^2=I$, these conditions can just as well be written as $JAJ=A$ ($JAJ=-A$); however, this and the invertibility of $J$ are irrelevant. To illustrate this, suppose $n=2r$, in which case
$$J=\begin{bmatrix}P_0&P_1\end{bmatrix}\begin{bmatrix}I_r&0\\0&-I_r\end{bmatrix}\begin{bmatrix}P_0^T\\P_1^T\end{bmatrix}$$
(i.e., $\widehat{P}_0=P_0^T$ and $\widehat{P}_1=P_1^T$), where
$$P_0=\frac{1}{\sqrt2}\begin{bmatrix}I_r\\J_r\end{bmatrix} \quad\text{and}\quad P_1=\frac{1}{\sqrt2}\begin{bmatrix}I_r\\-J_r\end{bmatrix}.$$
Starting from this, it can be shown that $AJ=JA$ (or, equivalently, $A$ is centrosymmetric) if and only if
$$A=\begin{bmatrix}P_0&P_1\end{bmatrix}\begin{bmatrix}B_0&0\\0&B_1\end{bmatrix}\begin{bmatrix}P_0^T\\P_1^T\end{bmatrix}=P_0B_0P_0^T+P_1B_1P_1^T \tag{19}$$
with $B_0$, $B_1\in\mathbb{C}^{r\times r}$. However, Theorem 2 implies that $A$ has the form (19) if and only if $RA=AR$ for some $R$ of the form
$$R=\begin{bmatrix}P_0&P_1\end{bmatrix}\begin{bmatrix}\gamma_0I_r&0\\0&\gamma_1I_r\end{bmatrix}\begin{bmatrix}P_0^T\\P_1^T\end{bmatrix}$$
with $\gamma_0\ne\gamma_1$, in which case
$$RA=AR=\gamma_0P_0B_0P_0^T+\gamma_1P_1B_1P_1^T$$
for arbitrary $\gamma_0$ and $\gamma_1$.

According to the classical theorem, $AJ=-JA$ (or, equivalently, $A$ is skew-centrosymmetric) if and only if
$$A=\begin{bmatrix}P_0&P_1\end{bmatrix}\begin{bmatrix}0&C_1\\C_0&0\end{bmatrix}\begin{bmatrix}P_0^T\\P_1^T\end{bmatrix}=P_1C_0P_0^T+P_0C_1P_1^T \tag{20}$$
with $C_0$, $C_1\in\mathbb{C}^{r\times r}$. Now let $\sigma(0)=1$ and $\sigma(1)=0$, so
$$R_\sigma=\begin{bmatrix}P_0&P_1\end{bmatrix}\begin{bmatrix}\gamma_1I_r&0\\0&\gamma_0I_r\end{bmatrix}\begin{bmatrix}P_0^T\\P_1^T\end{bmatrix}.$$
Theorem 2 implies that $A$ has the form (20) if and only if $RA=AR_\sigma$ for some $\gamma_0$ and $\gamma_1$ with $\gamma_0\ne\gamma_1$, in which case
$$RA=AR_\sigma=\gamma_1P_1C_0P_0^T+\gamma_0P_0C_1P_1^T$$
for all $\gamma_0$ and $\gamma_1$.
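A numerical sanity check of this example is straightforward. The sketch below is illustrative only; $r$, $B_0$, $B_1$, $C_0$, $C_1$, and the eigenvalues $\gamma_0$, $\gamma_1$ are arbitrary hypothetical choices. It confirms that a matrix of the form (19) commutes with the flip $J$ and with any $R$ built from $P_0$, $P_1$ and two distinct eigenvalues, while a matrix of the form (20) anticommutes with $J$ and satisfies $RA=AR_\sigma$.

```python
import numpy as np

rng = np.random.default_rng(4)
r = 3
n = 2 * r
Jr = np.fliplr(np.eye(r))                  # r x r flip
J  = np.fliplr(np.eye(n))                  # the flip ("exchange") matrix of order n

P0 = np.vstack([np.eye(r),  Jr]) / np.sqrt(2)
P1 = np.vstack([np.eye(r), -Jr]) / np.sqrt(2)

# Centrosymmetric A as in (19).
B0, B1 = rng.standard_normal((r, r)), rng.standard_normal((r, r))
A = P0 @ B0 @ P0.T + P1 @ B1 @ P1.T
print(np.allclose(J @ A, A @ J))           # A is centrosymmetric

# Any R with the same eigenvector blocks and two distinct eigenvalues works equally well.
g0, g1 = 2.7, -0.3                         # arbitrary distinct eigenvalues
R = g0 * P0 @ P0.T + g1 * P1 @ P1.T
print(np.allclose(R @ A, A @ R))           # RA = AR

# Skew-centrosymmetric A as in (20); now RA = AR_sigma, where sigma swaps the two blocks.
C0, C1 = rng.standard_normal((r, r)), rng.standard_normal((r, r))
A_skew = P1 @ C0 @ P0.T + P0 @ C1 @ P1.T
Rsig = g1 * P0 @ P0.T + g0 * P1 @ P1.T
print(np.allclose(J @ A_skew, -A_skew @ J))        # A_skew is skew-centrosymmetric
print(np.allclose(R @ A_skew, A_skew @ Rsig))      # RA = AR_sigma
```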

Example 4 Let $R=[\delta_{r,s-1\,(\mathrm{mod}\,k)}]_{r,s=0}^{k-1}$, which is the 1-circulant with first row
$$\begin{bmatrix}0&1&0&\cdots&0\end{bmatrix}.$$
By the Ablow–Brenner theorem [1], $C\in\mathbb{C}^{k\times k}$ is an $\alpha$-circulant $C=[c_{s-\alpha r\,(\mathrm{mod}\,k)}]_{r,s=0}^{k-1}$ if and only if $RC=CR^{\alpha}$. Since
$$R=P\,\operatorname{diag}(1,\zeta,\zeta^2,\dots,\zeta^{k-1})P^{*},$$
where
$$P=\begin{bmatrix}p_0&p_1&\cdots&p_{k-1}\end{bmatrix} \quad\text{with}\quad p_\ell=\frac{1}{\sqrt k}\begin{bmatrix}1\\\zeta^{\ell}\\\zeta^{2\ell}\\\vdots\\\zeta^{(k-1)\ell}\end{bmatrix},\quad 0\le\ell\le k-1,$$
and
$$R^{\alpha}=P\,\operatorname{diag}(1,\zeta^{\alpha},\zeta^{2\alpha},\dots,\zeta^{(k-1)\alpha})P^{*},$$
the Ablow–Brenner theorem can be interpreted to mean that $C$ is $(R,R_\sigma)$-commutative with $\sigma(\ell)=\alpha\ell\pmod k$, $0\le\ell\le k-1$. Therefore Theorem 2 implies that
$$C=\sum_{\ell=0}^{k-1}p_{\alpha\ell\,(\mathrm{mod}\,k)}\,f_\ell\,p_\ell^{*},$$
where $f_0$, $f_1$, ..., $f_{k-1}$ are scalars. As a matter of fact, if
$$R=P\,\operatorname{diag}(\gamma_0,\gamma_1,\dots,\gamma_{k-1})P^{*}$$
with arbitrary $\gamma_0$, $\gamma_1$, ..., $\gamma_{k-1}$, then
$$RC=CR_\sigma=\sum_{\ell=0}^{k-1}\gamma_{\alpha\ell\,(\mathrm{mod}\,k)}\,p_{\alpha\ell\,(\mathrm{mod}\,k)}\,f_\ell\,p_\ell^{*}.$$
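The Ablow–Brenner relation and the resulting rank-one expansion of an $\alpha$-circulant can be verified directly. The following NumPy sketch (illustrative only; $k$, $\alpha$, and the first row $c$ are arbitrary hypothetical choices) does so; note that $\alpha$ need not be relatively prime to $k$, i.e., $\sigma$ need not be a permutation.

```python
import numpy as np

k, alpha = 6, 2                                    # hypothetical size and alpha (not coprime here)
c = np.arange(1.0, k + 1)                          # arbitrary first row c_0, ..., c_{k-1}
C = np.array([[c[(s - alpha * r) % k] for s in range(k)] for r in range(k)])

# R = [delta_{r, s-1 (mod k)}]: the 1-circulant with first row (0, 1, 0, ..., 0).
R = np.zeros((k, k))
R[np.arange(k), (np.arange(k) + 1) % k] = 1.0
print(np.allclose(R @ C, C @ np.linalg.matrix_power(R, alpha)))      # Ablow-Brenner relation

# Fourier basis p_l and the spectral form of R.
zeta = np.exp(2j * np.pi / k)
P = np.array([[zeta ** (r * l) for l in range(k)] for r in range(k)]) / np.sqrt(k)
print(np.allclose(R, P @ np.diag(zeta ** np.arange(k)) @ P.conj().T))

# Theorem 2: C = sum_l p_{alpha*l mod k} f_l p_l^*, with scalars f_l = p_{alpha*l mod k}^* C p_l.
f = [P[:, (alpha * l) % k].conj() @ C @ P[:, l] for l in range(k)]
C_rebuilt = sum(f[l] * np.outer(P[:, (alpha * l) % k], P[:, l].conj()) for l in range(k))
print(np.allclose(C, C_rebuilt))
```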

Example 5 Let $R$ and $S$ be as in (1) and (2) and let $\sigma(\ell)=\alpha\ell+\mu\pmod k$, so
$$S_\sigma=Q\,\operatorname{diag}\bigl(\zeta^{\mu}I_{n_0},\zeta^{\alpha+\mu}I_{n_1},\dots,\zeta^{(k-1)\alpha+\mu}I_{n_{k-1}}\bigr)Q^{-1}.$$
Then the $(R,S,\alpha,\mu)$-symmetric matrix $A$ in (4) is $(R,S_\sigma)$-commutative. More generally, if $R$ and $S$ are as in (5) and (6) and $\sigma(\ell)=\alpha\ell+\mu\pmod k$, then
$$RA=AS_\sigma=\sum_{\ell=0}^{k-1}\gamma_{\alpha\ell+\mu\,(\mathrm{mod}\,k)}P_{\alpha\ell+\mu\,(\mathrm{mod}\,k)}F_\ell\widehat{Q}_\ell$$
for arbitrary $\gamma_0$, $\gamma_1$, ..., $\gamma_{k-1}$.

Renaming the variables in Theorem 2 yields the following theorem.

Theorem 3 If $\tau:\mathbb{Z}_k\to\mathbb{Z}_k$, then $B\in\mathbb{C}^{n\times m}$ is $(S,R_\tau)$-commutative if and only if
$$B=\sum_{\ell=0}^{k-1}Q_{\tau(\ell)}G_\ell\widehat{P}_\ell \quad\text{with}\quad G_\ell\in\mathbb{C}^{n_{\tau(\ell)}\times m_\ell},\quad 0\le\ell\le k-1, \tag{21}$$
in which case
$$G_\ell=\widehat{Q}_{\tau(\ell)}BP_\ell,\quad 0\le\ell\le k-1,$$
and
$$SB=BR_\tau=\sum_{\ell=0}^{k-1}\gamma_{\tau(\ell)}Q_{\tau(\ell)}G_\ell\widehat{P}_\ell$$
for arbitrary $\gamma_0$, $\gamma_1$, ..., $\gamma_{k-1}$.

4 General Results

Remark 1 If $\sigma$ or $\tau$ is a permutation of $\mathbb{Z}_k$, we can replace $\ell$ by $\sigma(\ell)$ or $\ell$ by $\tau(\ell)$ in a summation $\sum_{\ell=0}^{k-1}$, as in the proof of the following theorem, where "$\circ$" denotes composition; i.e., $\sigma\circ\tau(\ell)=\sigma(\tau(\ell))$ and $\tau\circ\sigma(\ell)=\tau(\sigma(\ell))$. Also,
$$\widehat{P}_{\sigma(r)}P_{\sigma(s)}=\delta_{rs}I_{m_{\sigma(r)}} \quad\text{and}\quad \widehat{Q}_{\tau(r)}Q_{\tau(s)}=\delta_{rs}I_{n_{\tau(r)}},\quad 0\le r,s\le k-1, \tag{22}$$
if and only if $\sigma$ and $\tau$ are permutations. We will use this frequently without specifically invoking it.

Theorem 4 Suppose $A\in\mathbb{C}^{m\times n}$ is $(R,S_\sigma)$-commutative and $B\in\mathbb{C}^{n\times m}$ is $(S,R_\tau)$-commutative. Then (a) $AB$ is $(R,R_{\sigma\circ\tau})$-commutative if $\tau$ is a permutation, and (b) $BA$ is $(S,S_{\tau\circ\sigma})$-commutative if $\sigma$ is a permutation.

PROOF. From Theorems 2 and 3, our assumptions imply that $A$ is as in (15) and $B$ is as in (21). If $\tau$ is a permutation then replacing $\ell$ by $\tau(\ell)$ in (15) yields
$$A=\sum_{\ell=0}^{k-1}P_{\sigma(\tau(\ell))}F_{\tau(\ell)}\widehat{Q}_{\tau(\ell)}.$$
From this, (21), and (22),
$$AB=\sum_{\ell=0}^{k-1}P_{\sigma(\tau(\ell))}F_{\tau(\ell)}G_\ell\widehat{P}_\ell,$$
so (9) and (11) imply that
$$R(AB)=(AB)R_{\sigma\circ\tau}=\sum_{\ell=0}^{k-1}\gamma_{\sigma(\tau(\ell))}P_{\sigma(\tau(\ell))}F_{\tau(\ell)}G_\ell\widehat{P}_\ell,$$
which proves (a). If $\sigma$ is a permutation, replacing $\ell$ by $\sigma(\ell)$ in (21) yields
$$B=\sum_{\ell=0}^{k-1}Q_{\tau(\sigma(\ell))}G_{\sigma(\ell)}\widehat{P}_{\sigma(\ell)}.$$
From this, (15), and (22),
$$BA=\sum_{\ell=0}^{k-1}Q_{\tau(\sigma(\ell))}G_{\sigma(\ell)}F_\ell\widehat{Q}_\ell,$$
so (10) and (12) imply that
$$S(BA)=(BA)S_{\tau\circ\sigma}=\sum_{\ell=0}^{k-1}\gamma_{\tau(\sigma(\ell))}Q_{\tau(\sigma(\ell))}G_{\sigma(\ell)}F_\ell\widehat{Q}_\ell,$$
which proves (b).
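As an illustration of part (a), the following sketch (illustrative only; all sizes, maps, eigenvalues, and data below are hypothetical choices) builds $A$ and $B$ of the forms (15) and (21) with $\tau$ a permutation and confirms numerically that $R(AB)=(AB)R_{\sigma\circ\tau}$.

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical data for Theorem 4(a): A is (R, S_sigma)-commutative, B is (S, R_tau)-commutative.
k = 3
m_sizes, n_sizes = [2, 3, 2], [1, 2, 2]
sigma, tau = [0, 2, 1], [2, 0, 1]                  # tau is a permutation of Z_k
gamma = np.array([1.0, -1.0, 4.0])                 # distinct, otherwise arbitrary
m, n = sum(m_sizes), sum(n_sizes)

P, Q = rng.standard_normal((m, m)), rng.standard_normal((n, n))
Pinv, Qinv = np.linalg.inv(P), np.linalg.inv(Q)
moff, noff = np.cumsum([0] + m_sizes), np.cumsum([0] + n_sizes)
Pb   = [P[:, moff[l]:moff[l+1]] for l in range(k)]
Qb   = [Q[:, noff[l]:noff[l+1]] for l in range(k)]
Phat = [Pinv[moff[l]:moff[l+1], :] for l in range(k)]
Qhat = [Qinv[noff[l]:noff[l+1], :] for l in range(k)]

# A as in (15) and B as in (21), with random blocks.
A = sum(Pb[sigma[l]] @ rng.standard_normal((m_sizes[sigma[l]], n_sizes[l])) @ Qhat[l] for l in range(k))
B = sum(Qb[tau[l]]   @ rng.standard_normal((n_sizes[tau[l]],   m_sizes[l])) @ Phat[l] for l in range(k))

R         = P @ np.diag(np.repeat(gamma, m_sizes)) @ Pinv
sig_o_tau = [sigma[tau[l]] for l in range(k)]      # (sigma o tau)(l) = sigma(tau(l))
R_sigotau = P @ np.diag(np.repeat(gamma[sig_o_tau], m_sizes)) @ Pinv

print(np.allclose(R @ (A @ B), (A @ B) @ R_sigotau))   # AB is (R, R_{sigma o tau})-commutative
```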

Corollary 1 If $\sigma$ is a permutation, $A\in\mathbb{C}^{m\times n}$ is $(R,S_\sigma)$-commutative, and $B\in\mathbb{C}^{n\times m}$ is $(S,R_{\sigma^{-1}})$-commutative, then $AB$ commutes with $R$ and $BA$ commutes with $S$.

Theorem 5 Suppose $j\ge1$ and $A_j\in\mathbb{C}^{m\times m}$ is $(R,R_{\sigma_j})$-commutative, where $\sigma_j$ is a permutation if $j>1$. Then $A_1A_2\cdots A_j$ is $(R,R_{\sigma_1\circ\sigma_2\circ\cdots\circ\sigma_j})$-commutative; specifically, if
$$A_j=\sum_{\ell=0}^{k-1}P_{\sigma_j(\ell)}F^{(j)}_\ell\widehat{P}_\ell, \tag{23}$$
then
$$A_1A_2=\sum_{\ell=0}^{k-1}P_{\sigma_1\circ\sigma_2(\ell)}F^{(1)}_{\sigma_2(\ell)}F^{(2)}_\ell\widehat{P}_\ell,$$
$$A_1A_2A_3=\sum_{\ell=0}^{k-1}P_{\sigma_1\circ\sigma_2\circ\sigma_3(\ell)}F^{(1)}_{\sigma_2\circ\sigma_3(\ell)}F^{(2)}_{\sigma_3(\ell)}F^{(3)}_\ell\widehat{P}_\ell,$$
and, in general,
$$A_1A_2\cdots A_j=\sum_{\ell=0}^{k-1}P_{\sigma_1\circ\sigma_2\circ\cdots\circ\sigma_j(\ell)}F^{(1)}_{\sigma_2\circ\cdots\circ\sigma_j(\ell)}F^{(2)}_{\sigma_3\circ\cdots\circ\sigma_j(\ell)}\cdots F^{(j-1)}_{\sigma_j(\ell)}F^{(j)}_\ell\widehat{P}_\ell.$$

PROOF. To minimize complicated notation, suppose
$$B_j=\sum_{\ell=0}^{k-1}P_{\sigma_1\circ\sigma_2\circ\cdots\circ\sigma_j(\ell)}G^{(j)}_\ell\widehat{P}_\ell$$
for some $j\ge1$. Since $\sigma_{j+1}$ is a permutation, we can replace $\ell$ by $\sigma_{j+1}(\ell)$ to obtain
$$B_j=\sum_{\ell=0}^{k-1}P_{\sigma_1\circ\sigma_2\circ\cdots\circ\sigma_j\circ\sigma_{j+1}(\ell)}G^{(j)}_{\sigma_{j+1}(\ell)}\widehat{P}_{\sigma_{j+1}(\ell)}.$$
Therefore, from (23) with $j$ replaced by $j+1$,
$$B_jA_{j+1}=\left(\sum_{\ell=0}^{k-1}P_{\sigma_1\circ\cdots\circ\sigma_{j+1}(\ell)}G^{(j)}_{\sigma_{j+1}(\ell)}\widehat{P}_{\sigma_{j+1}(\ell)}\right)\left(\sum_{\ell=0}^{k-1}P_{\sigma_{j+1}(\ell)}F^{(j+1)}_\ell\widehat{P}_\ell\right)=\sum_{\ell=0}^{k-1}P_{\sigma_1\circ\cdots\circ\sigma_{j+1}(\ell)}G^{(j+1)}_\ell\widehat{P}_\ell,$$
with $G^{(j+1)}_\ell=G^{(j)}_{\sigma_{j+1}(\ell)}F^{(j+1)}_\ell$. This provides the basis for a straightforward induction proof of the assertion.

Corollary 2 If $\sigma$ is a permutation, $A\in\mathbb{C}^{m\times m}$ is $(R,R_\sigma)$-commutative, and $j$ is a positive integer, then $A^j$ is $(R,R_{\sigma^j})$-commutative; explicitly,
$$A^j=\sum_{\ell=0}^{k-1}P_{\sigma^j(\ell)}F_{\sigma^{j-1}(\ell)}\cdots F_{\sigma(\ell)}F_\ell\widehat{P}_\ell \tag{24}$$
and
$$RA^j=A^jR_{\sigma^j}=\sum_{\ell=0}^{k-1}\gamma_{\sigma^j(\ell)}P_{\sigma^j(\ell)}F_{\sigma^{j-1}(\ell)}\cdots F_{\sigma(\ell)}F_\ell\widehat{P}_\ell$$
for arbitrary $\gamma_0$, $\gamma_1$, ..., $\gamma_{k-1}$.
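Corollary 2 is also easy to check numerically. In the sketch below (illustrative only; all sizes, the permutation $\sigma$, and the data are hypothetical choices), $A$ is built in the form (15) with $Q=P$ and the expansion (24) is compared with $A^j$ for $j=3$.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical (R, R_sigma)-commutative A with sigma a permutation; verify (24) for j = 3.
k = 3
m_sizes = [2, 3, 2]
sigma = [1, 2, 0]
m = sum(m_sizes)

P = rng.standard_normal((m, m))
Pinv = np.linalg.inv(P)
off = np.cumsum([0] + m_sizes)
Pb   = [P[:, off[l]:off[l+1]] for l in range(k)]
Phat = [Pinv[off[l]:off[l+1], :] for l in range(k)]

F = [rng.standard_normal((m_sizes[sigma[l]], m_sizes[l])) for l in range(k)]
A = sum(Pb[sigma[l]] @ F[l] @ Phat[l] for l in range(k))

def sig_pow(l, j):                                 # sigma^j(l)
    for _ in range(j):
        l = sigma[l]
    return l

j = 3
# A^j = sum_l P_{sigma^j(l)} F_{sigma^{j-1}(l)} ... F_{sigma(l)} F_l \hat{P}_l, as in (24).
A_j = sum(Pb[sig_pow(l, j)]
          @ np.linalg.multi_dot([F[sig_pow(l, j - 1 - i)] for i in range(j)])
          @ Phat[l]
          for l in range(k))
print(np.allclose(A_j, np.linalg.matrix_power(A, j)))    # True
```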

5 Generalized Inverses and Singular Value Decompositions

If $A$ is an arbitrary complex matrix then $A^{-}$ is a reflexive inverse of $A$ if $AA^{-}A=A$ and $A^{-}AA^{-}=A^{-}$. The Moore–Penrose inverse $A^{\dagger}$ of $A$ is the unique matrix that satisfies the Penrose conditions
$$(AA^{\dagger})^{*}=AA^{\dagger},\quad (A^{\dagger}A)^{*}=A^{\dagger}A,\quad AA^{\dagger}A=A,\quad\text{and}\quad A^{\dagger}AA^{\dagger}=A^{\dagger}.$$

Theorem 6 Suppose $\sigma$ is a permutation and $A\in\mathbb{C}^{m\times n}$ is $(R,S_\sigma)$-commutative, so
$$A=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_\ell\widehat{Q}_\ell, \tag{25}$$
by Theorem 2. Let $F_0^{-}$, $F_1^{-}$, ..., $F_{k-1}^{-}$ be reflexive inverses of $F_0$, $F_1$, ..., $F_{k-1}$, and define
$$B=\sum_{\ell=0}^{k-1}Q_\ell F_\ell^{-}\widehat{P}_{\sigma(\ell)}. \tag{26}$$
Then $B$ is a reflexive inverse of $A$. Moreover, if $P$ and $Q$ are unitary, then
$$A^{\dagger}=\sum_{\ell=0}^{k-1}Q_\ell F_\ell^{\dagger}P_{\sigma(\ell)}^{*}. \tag{27}$$

PROOF. From (9), (10), (22), (25), and (26),
$$AB=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_\ell F_\ell^{-}\widehat{P}_{\sigma(\ell)},\qquad BA=\sum_{\ell=0}^{k-1}Q_\ell F_\ell^{-}F_\ell\widehat{Q}_\ell, \tag{28}$$
$$ABA=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_\ell F_\ell^{-}F_\ell\widehat{Q}_\ell=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_\ell\widehat{Q}_\ell=A, \tag{29}$$
and
$$BAB=\sum_{\ell=0}^{k-1}Q_\ell F_\ell^{-}F_\ell F_\ell^{-}\widehat{P}_{\sigma(\ell)}=\sum_{\ell=0}^{k-1}Q_\ell F_\ell^{-}\widehat{P}_{\sigma(\ell)}=B. \tag{30}$$
The last two equations show that $B$ is a reflexive inverse of $A$. If $P$ and $Q$ are unitary and we redefine
$$B=\sum_{\ell=0}^{k-1}Q_\ell F_\ell^{\dagger}P_{\sigma(\ell)}^{*},$$
then (28)–(30) become
$$AB=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_\ell F_\ell^{\dagger}P_{\sigma(\ell)}^{*},\qquad BA=\sum_{\ell=0}^{k-1}Q_\ell F_\ell^{\dagger}F_\ell Q_\ell^{*}, \tag{31}$$
$$ABA=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_\ell F_\ell^{\dagger}F_\ell Q_\ell^{*}=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_\ell Q_\ell^{*}=A,$$
and
$$BAB=\sum_{\ell=0}^{k-1}Q_\ell F_\ell^{\dagger}F_\ell F_\ell^{\dagger}P_{\sigma(\ell)}^{*}=\sum_{\ell=0}^{k-1}Q_\ell F_\ell^{\dagger}P_{\sigma(\ell)}^{*}=B.$$
Moreover, from (31),
$$(AB)^{*}=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}(F_\ell F_\ell^{\dagger})^{*}P_{\sigma(\ell)}^{*}=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_\ell F_\ell^{\dagger}P_{\sigma(\ell)}^{*}=AB$$
and
$$(BA)^{*}=\sum_{\ell=0}^{k-1}Q_\ell(F_\ell^{\dagger}F_\ell)^{*}Q_\ell^{*}=\sum_{\ell=0}^{k-1}Q_\ell F_\ell^{\dagger}F_\ell Q_\ell^{*}=BA.$$
Therefore $B=A^{\dagger}$, which implies (27).
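The blockwise formula (27) is convenient computationally, since it replaces one $m\times n$ pseudoinverse by $k$ smaller ones. The following sketch (illustrative only; unitary $P$ and $Q$ generated at random, with hypothetical sizes, permutation, and blocks) compares (27) with a pseudoinverse computed directly.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup for (27): unitary P and Q, a permutation sigma, and rectangular blocks F_l.
k = 3
m_sizes, n_sizes = [2, 3, 2], [1, 2, 2]
sigma = [2, 0, 1]
m, n = sum(m_sizes), sum(n_sizes)

def random_unitary(p):
    Z = rng.standard_normal((p, p)) + 1j * rng.standard_normal((p, p))
    U, _ = np.linalg.qr(Z)
    return U

P, Q = random_unitary(m), random_unitary(n)
moff, noff = np.cumsum([0] + m_sizes), np.cumsum([0] + n_sizes)
Pb = [P[:, moff[l]:moff[l+1]] for l in range(k)]
Qb = [Q[:, noff[l]:noff[l+1]] for l in range(k)]

# A = sum_l P_{sigma(l)} F_l Q_l^*  (eq. (25); here \hat{Q}_l = Q_l^* because Q is unitary).
F = [rng.standard_normal((m_sizes[sigma[l]], n_sizes[l])) for l in range(k)]
A = sum(Pb[sigma[l]] @ F[l] @ Qb[l].conj().T for l in range(k))

# Eq. (27): A^dagger = sum_l Q_l F_l^dagger P_{sigma(l)}^*.
A_pinv = sum(Qb[l] @ np.linalg.pinv(F[l]) @ Pb[sigma[l]].conj().T for l in range(k))
print(np.allclose(A_pinv, np.linalg.pinv(A)))            # True
```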

Corollary 3 If $\sigma$ is a permutation, $P$ and $Q$ are unitary, and $A\in\mathbb{C}^{m\times n}$ is $(R,S_\sigma)$-commutative, then $A^{\dagger}$ is $(S,R_{\sigma^{-1}})$-commutative.

PROOF. From (9)–(11), (22), and (27),
$$SA^{\dagger}=A^{\dagger}R_{\sigma^{-1}}=\sum_{\ell=0}^{k-1}\gamma_\ell Q_\ell F_\ell^{\dagger}P_{\sigma(\ell)}^{*}$$
for arbitrary $\gamma_0$, $\gamma_1$, ..., $\gamma_{k-1}$.

Remark 2 It is well known, and straightforward to verify, that if $G\in\mathbb{C}^{p\times q}$ and $\operatorname{rank}G=q$, then $G^{\dagger}=(G^{*}G)^{-1}G^{*}$. Hence, (27) implies the following corollary.

Corollary 4 In addition to the assumptions of Theorem 6, suppose that $\operatorname{rank}(F_\ell)=n_\ell$, $0\le\ell\le k-1$ (or, equivalently, $\operatorname{rank}(A)=n$). Then
$$A^{\dagger}=\sum_{\ell=0}^{k-1}Q_\ell(F_\ell^{*}F_\ell)^{-1}F_\ell^{*}P_{\sigma(\ell)}^{*}.$$

Theorem 7 Suppose $\sigma$ is a permutation, $P$ and $Q$ are unitary, and $A$ is $(R,S_\sigma)$-commutative and therefore of the form
$$A=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_\ell Q_\ell^{*},$$
by Theorem 2. Let
$$F_\ell=\Omega_\ell\Sigma_\ell\Phi_\ell^{*},\quad 0\le\ell\le k-1,$$
with
$$\Omega_\ell\in\mathbb{C}^{m_{\sigma(\ell)}\times m_{\sigma(\ell)}},\quad \Sigma_\ell\in\mathbb{C}^{m_{\sigma(\ell)}\times n_\ell},\quad\text{and}\quad \Phi_\ell\in\mathbb{C}^{n_\ell\times n_\ell},\quad 0\le\ell\le k-1,$$
be singular value decompositions of $F_\ell$, $0\le\ell\le k-1$. Let
$$\Omega=\begin{bmatrix}P_{\sigma(0)}\Omega_0&P_{\sigma(1)}\Omega_1&\cdots&P_{\sigma(k-1)}\Omega_{k-1}\end{bmatrix}$$
and
$$\Phi=\begin{bmatrix}Q_0\Phi_0&Q_1\Phi_1&\cdots&Q_{k-1}\Phi_{k-1}\end{bmatrix}.$$
Then
$$A=\Omega\,\operatorname{diag}(\Sigma_0,\Sigma_1,\dots,\Sigma_{k-1})\,\Phi^{*}$$
is a singular value decomposition of $A$, except that the singular values are not necessarily arranged in decreasing order. Thus, for $0\le\ell\le k-1$, each singular value of $F_\ell$ is a singular value of $A$ with an associated left singular vector in the column space of $P_{\sigma(\ell)}$ and a right singular vector in the column space of $Q_\ell$.
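The following sketch assembles a singular value decomposition of $A$ from the blockwise decompositions exactly as described in Theorem 7; it is illustrative only, with hypothetical sizes and data, and with `numpy.linalg.svd` supplying the factors $\Omega_\ell$, $\Sigma_\ell$, $\Phi_\ell$. The reconstruction and the unitarity of $\Omega$ are checked numerically.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setup with unitary P, Q and a permutation sigma, as in Theorem 7.
k = 3
m_sizes, n_sizes = [2, 2, 3], [3, 2, 2]
sigma = [1, 2, 0]
m, n = sum(m_sizes), sum(n_sizes)

def random_unitary(p):
    Z = rng.standard_normal((p, p)) + 1j * rng.standard_normal((p, p))
    U, _ = np.linalg.qr(Z)
    return U

P, Q = random_unitary(m), random_unitary(n)
moff, noff = np.cumsum([0] + m_sizes), np.cumsum([0] + n_sizes)
Pb = [P[:, moff[l]:moff[l+1]] for l in range(k)]
Qb = [Q[:, noff[l]:noff[l+1]] for l in range(k)]

F = [rng.standard_normal((m_sizes[sigma[l]], n_sizes[l])) for l in range(k)]
A = sum(Pb[sigma[l]] @ F[l] @ Qb[l].conj().T for l in range(k))

# Blockwise SVDs F_l = Omega_l Sigma_l Phi_l^* (full matrices, so Sigma_l is rectangular).
Omega_blocks, Sigma_blocks, Phi_blocks = [], [], []
for l in range(k):
    U, s, Vh = np.linalg.svd(F[l], full_matrices=True)
    S = np.zeros(F[l].shape)
    S[:len(s), :len(s)] = np.diag(s)
    Omega_blocks.append(U); Sigma_blocks.append(S); Phi_blocks.append(Vh.conj().T)

# Assemble Omega, Phi, and the block-diagonal middle factor as in Theorem 7.
Omega = np.hstack([Pb[sigma[l]] @ Omega_blocks[l] for l in range(k)])
Phi   = np.hstack([Qb[l] @ Phi_blocks[l] for l in range(k)])
Middle = np.zeros((m, n))
r0 = c0 = 0
for S in Sigma_blocks:
    Middle[r0:r0 + S.shape[0], c0:c0 + S.shape[1]] = S
    r0 += S.shape[0]; c0 += S.shape[1]

print(np.allclose(A, Omega @ Middle @ Phi.conj().T))     # SVD of A (singular values unsorted)
print(np.allclose(Omega.conj().T @ Omega, np.eye(m)))    # Omega is unitary
```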

We invoke the first equality in (22) repeatedly in the proof of the following theorem.

Theorem 8 Suppose $\sigma$ is a permutation, $P$ is unitary, and $A\in\mathbb{C}^{m\times m}$ is $(R,R_\sigma)$-commutative, so
$$A=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_\ell P_\ell^{*},$$
by Theorem 2. Then:

(i) $A$ is Hermitian if and only if $F_{\sigma(\ell)}^{*}P_{\sigma^2(\ell)}^{*}=F_\ell P_\ell^{*}$, $0\le\ell\le k-1$.

(ii) $A$ is normal if and only if $F_{\sigma(\ell)}^{*}F_{\sigma(\ell)}=F_\ell F_\ell^{*}$, $0\le\ell\le k-1$.

(iii) $A$ is EP (i.e., $AA^{\dagger}=A^{\dagger}A$) if and only if $F_{\sigma(\ell)}^{\dagger}F_{\sigma(\ell)}=F_\ell F_\ell^{\dagger}$, $0\le\ell\le k-1$.

PROOF. Since $P$ is unitary, Theorems 2 and 6 imply that
$$A=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_\ell P_\ell^{*},\qquad A^{*}=\sum_{\ell=0}^{k-1}P_\ell F_\ell^{*}P_{\sigma(\ell)}^{*},\qquad\text{and}\qquad A^{\dagger}=\sum_{\ell=0}^{k-1}P_\ell F_\ell^{\dagger}P_{\sigma(\ell)}^{*}. \tag{32}$$
Replacing $\ell$ by $\sigma(\ell)$ in the second sum in (32) yields
$$A^{*}=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_{\sigma(\ell)}^{*}P_{\sigma^2(\ell)}^{*},$$
and comparing this with the first sum in (32) yields (i). From (32),
$$AA^{*}=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_\ell F_\ell^{*}P_{\sigma(\ell)}^{*} \tag{33}$$
and
$$A^{*}A=\sum_{\ell=0}^{k-1}P_\ell F_\ell^{*}F_\ell P_\ell^{*}=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_{\sigma(\ell)}^{*}F_{\sigma(\ell)}P_{\sigma(\ell)}^{*}.$$
Comparing the second sum here with (33) yields (ii). From (32),
$$AA^{\dagger}=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_\ell F_\ell^{\dagger}P_{\sigma(\ell)}^{*} \tag{34}$$
and
$$A^{\dagger}A=\sum_{\ell=0}^{k-1}P_\ell F_\ell^{\dagger}F_\ell P_\ell^{*}=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_{\sigma(\ell)}^{\dagger}F_{\sigma(\ell)}P_{\sigma(\ell)}^{*}.$$
Comparing the second sum here with (34) yields (iii).

6 Solving $Az=w$ and the least-squares problem

Throughout this section $\sigma$ is a permutation and $A\in\mathbb{C}^{m\times n}$ is $(R,S_\sigma)$-commutative, and can therefore be written as in (15).

If $z\in\mathbb{C}^{n}$ and $w\in\mathbb{C}^{m}$, we write
$$z=Qu=\sum_{\ell=0}^{k-1}Q_\ell u_\ell \quad\text{and}\quad w=Pv=\sum_{\ell=0}^{k-1}P_\ell v_\ell, \tag{35}$$
with $u_\ell\in\mathbb{C}^{n_\ell}$ and $v_\ell\in\mathbb{C}^{m_\ell}$, $0\le\ell\le k-1$.

Theorem 9 If (35) holds then

$$\text{(a)}\ \ Az=w \qquad\text{if and only if}\qquad \text{(b)}\ \ F_\ell u_\ell=v_{\sigma(\ell)},\quad 0\le\ell\le k-1. \tag{36}$$

PROOF. From (10), (15), and (35),
$$Az-w=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_\ell u_\ell-\sum_{\ell=0}^{k-1}P_\ell v_\ell=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_\ell u_\ell-\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}v_{\sigma(\ell)}=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}\bigl(F_\ell u_\ell-v_{\sigma(\ell)}\bigr), \tag{37}$$
so (36)(b) implies (36)(a). From (22) and (37),
$$F_\ell u_\ell-v_{\sigma(\ell)}=\widehat{P}_{\sigma(\ell)}(Az-w),\quad 0\le\ell\le k-1,$$
so (36)(a) implies (36)(b).

Since $F_\ell\in\mathbb{C}^{m_{\sigma(\ell)}\times n_\ell}$, $0\le\ell\le k-1$, (36) implies the following theorem.

Theorem 10 $A$ is invertible if and only if $m_{\sigma(\ell)}=n_\ell$ and $F_\ell$ is invertible, $0\le\ell\le k-1$ (which, from (3), implies that $m=n$). In this case,
$$A^{-1}=\sum_{\ell=0}^{k-1}Q_\ell F_\ell^{-1}\widehat{P}_{\sigma(\ell)} \tag{38}$$
and the solution of $Az=w$ is
$$z=\sum_{\ell=0}^{k-1}Q_\ell F_\ell^{-1}v_{\sigma(\ell)}.$$
Moreover, $A^{-1}$ is $(S,R_{\sigma^{-1}})$-commutative; specifically,
$$SA^{-1}=A^{-1}R_{\sigma^{-1}}=\sum_{\ell=0}^{k-1}\gamma_\ell Q_\ell F_\ell^{-1}\widehat{P}_{\sigma(\ell)}$$
for arbitrary $\gamma_0$, $\gamma_1$, ..., $\gamma_{k-1}$.
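Theorems 9 and 10 reduce an $n\times n$ linear system $Az=w$ to $k$ small systems $F_\ell u_\ell=v_{\sigma(\ell)}$. A minimal numerical illustration follows (hypothetical sizes and data, with $m_{\sigma(\ell)}=n_\ell$ enforced so that $A$ is invertible).

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical square case of Theorem 10: sigma is a permutation and m_{sigma(l)} = n_l,
# so A is invertible whenever every block F_l is.
k = 3
n_sizes = [2, 3, 2]
sigma = [1, 2, 0]
m_sizes = [0] * k
for l in range(k):
    m_sizes[sigma[l]] = n_sizes[l]                 # forces m_{sigma(l)} = n_l
m, n = sum(m_sizes), sum(n_sizes)                  # here m = n

P, Q = rng.standard_normal((m, m)), rng.standard_normal((n, n))
Pinv, Qinv = np.linalg.inv(P), np.linalg.inv(Q)
moff, noff = np.cumsum([0] + m_sizes), np.cumsum([0] + n_sizes)
Pb   = [P[:, moff[l]:moff[l+1]] for l in range(k)]
Qb   = [Q[:, noff[l]:noff[l+1]] for l in range(k)]
Phat = [Pinv[moff[l]:moff[l+1], :] for l in range(k)]
Qhat = [Qinv[noff[l]:noff[l+1], :] for l in range(k)]

F = [rng.standard_normal((n_sizes[l], n_sizes[l])) for l in range(k)]   # generically invertible
A = sum(Pb[sigma[l]] @ F[l] @ Qhat[l] for l in range(k))

# Eq. (38): A^{-1} = sum_l Q_l F_l^{-1} \hat{P}_{sigma(l)}.
A_inv = sum(Qb[l] @ np.linalg.inv(F[l]) @ Phat[sigma[l]] for l in range(k))
print(np.allclose(A_inv, np.linalg.inv(A)))                  # True

# Theorem 9: Az = w reduces to the k small systems F_l u_l = v_{sigma(l)}.
w = rng.standard_normal(m)
v = [Phat[l] @ w for l in range(k)]                          # w = sum_l P_l v_l
z = sum(Qb[l] @ np.linalg.solve(F[l], v[sigma[l]]) for l in range(k))
print(np.allclose(A @ z, w))                                 # True
```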

If $m=n$ and $R=S$ (so $A$ is $(R,R_\sigma)$-commutative), then (38) becomes
$$A^{-1}=\sum_{\ell=0}^{k-1}P_\ell F_\ell^{-1}\widehat{P}_{\sigma(\ell)}.$$
In this case,
$$A^{-j}=\sum_{\ell=0}^{k-1}P_\ell F_\ell^{-1}F_{\sigma(\ell)}^{-1}\cdots F_{\sigma^{j-1}(\ell)}^{-1}\widehat{P}_{\sigma^{j}(\ell)},$$
which can be verified by simply multiplying the right hand side by $A^j$ as written in (24).

Before turning to the least squares problem for $A$, we review some elementary facts about the least squares problem for a matrix $G\in\mathbb{C}^{p\times q}$ and a given $u\in\mathbb{C}^{p}$; i.e., find $v\in\mathbb{C}^{q}$ such that
$$\|Gv-u\|=\min_{y\in\mathbb{C}^{q}}\|Gy-u\|,$$
where $\|\cdot\|$ is the 2-norm. An arbitrary $v\in\mathbb{C}^{q}$ can be written as
$$v=G^{\dagger}u+(v-G^{\dagger}u),$$
so
$$\|Gv-u\|^{2}=\|(GG^{\dagger}-I_p)u\|^{2}+\|G(v-G^{\dagger}u)\|^{2},$$
since
$$[G(v-G^{\dagger}u)]^{*}(GG^{\dagger}-I_p)u=[GG^{\dagger}G(v-G^{\dagger}u)]^{*}(GG^{\dagger}-I_p)u=[G(v-G^{\dagger}u)]^{*}GG^{\dagger}(GG^{\dagger}-I_p)u$$
and
$$GG^{\dagger}(GG^{\dagger}-I_p)=GG^{\dagger}GG^{\dagger}-GG^{\dagger}=0.$$
Hence,
$$\min_{y\in\mathbb{C}^{q}}\|Gy-u\|=\|(GG^{\dagger}-I_p)u\|,$$
and this minimum is attained with a given $v$ if and only if $v=G^{\dagger}u+h$ where $Gh=0$. In this case, $\|v\|^{2}=\|G^{\dagger}u\|^{2}+\|h\|^{2}$, since
$$h^{*}G^{\dagger}u=h^{*}G^{\dagger}GG^{\dagger}u=(G^{\dagger}Gh)^{*}G^{\dagger}u=0;$$
so $v_0=G^{\dagger}u$ is the unique least squares solution with minimal norm, and is therefore called the optimal solution. From Remark 2, $v_0=(G^{*}G)^{-1}G^{*}u$ if $\operatorname{rank}(G)=q$.

If $P$ is unitary then (37) implies that
$$\|Az-w\|^{2}=\sum_{\ell=0}^{k-1}\|F_\ell u_\ell-v_{\sigma(\ell)}\|^{2},$$
so the least squares problem for $A$ and a given $w$ reduces to $k$ independent least squares problems for $F_\ell\in\mathbb{C}^{m_{\sigma(\ell)}\times n_\ell}$ and a given $v_{\sigma(\ell)}\in\mathbb{C}^{m_{\sigma(\ell)}}$, $0\le\ell\le k-1$. Therefore,
$$\|Az-w\|=\min_{y\in\mathbb{C}^{n}}\|Ay-w\|$$
if and only if
$$z=\sum_{\ell=0}^{k-1}Q_\ell\bigl(F_\ell^{\dagger}v_{\sigma(\ell)}+h_\ell\bigr),$$
where $F_\ell h_\ell=0$, $0\le\ell\le k-1$. If $Q$ is also unitary, then
$$\|z\|^{2}=\sum_{\ell=0}^{k-1}\|F_\ell^{\dagger}v_{\sigma(\ell)}+h_\ell\|^{2}=\sum_{\ell=0}^{k-1}\|F_\ell^{\dagger}v_{\sigma(\ell)}\|^{2}+\sum_{\ell=0}^{k-1}\|h_\ell\|^{2},$$
so the unique optimal (least norm) solution of the least squares problem is
$$z=\sum_{\ell=0}^{k-1}Q_\ell F_\ell^{\dagger}v_{\sigma(\ell)},$$
which can be written as
$$z=\sum_{\ell=0}^{k-1}Q_\ell(F_\ell^{*}F_\ell)^{-1}F_\ell^{*}v_{\sigma(\ell)} \quad\text{if}\quad \operatorname{rank}(F_\ell)=n_\ell,\quad 0\le\ell\le k-1,$$
or, equivalently, if $\operatorname{rank}(A)=n$.
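The reduction of the least squares problem to $k$ independent small problems can be illustrated in the same way. In the sketch below (illustrative only; hypothetical overdetermined sizes and data, with unitary $P$ and $Q$), the optimal solution assembled from the blocks agrees with $A^{\dagger}w$.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical overdetermined setup (m > n) with unitary P, Q and a permutation sigma.
k = 3
m_sizes, n_sizes = [4, 3, 3], [2, 2, 1]
sigma = [1, 2, 0]
m, n = sum(m_sizes), sum(n_sizes)

def random_unitary(p):
    Z = rng.standard_normal((p, p)) + 1j * rng.standard_normal((p, p))
    U, _ = np.linalg.qr(Z)
    return U

P, Q = random_unitary(m), random_unitary(n)
moff, noff = np.cumsum([0] + m_sizes), np.cumsum([0] + n_sizes)
Pb = [P[:, moff[l]:moff[l+1]] for l in range(k)]
Qb = [Q[:, noff[l]:noff[l+1]] for l in range(k)]

F = [rng.standard_normal((m_sizes[sigma[l]], n_sizes[l])) for l in range(k)]
A = sum(Pb[sigma[l]] @ F[l] @ Qb[l].conj().T for l in range(k))

w = rng.standard_normal(m) + 1j * rng.standard_normal(m)
v = [Pb[l].conj().T @ w for l in range(k)]               # w = sum_l P_l v_l

# Optimal least-squares solution assembled from the k small problems.
z = sum(Qb[l] @ np.linalg.pinv(F[l]) @ v[sigma[l]] for l in range(k))
print(np.allclose(z, np.linalg.pinv(A) @ w))             # agrees with the optimal solution A^dagger w
```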

7 The eigenvalue problem

Throughout this section $A\in\mathbb{C}^{m\times m}$ is $(R,R_\sigma)$-commutative, and can therefore be written as
$$A=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_\ell\widehat{P}_\ell \quad\text{where}\quad F_\ell\in\mathbb{C}^{m_{\sigma(\ell)}\times m_\ell},\quad 0\le\ell\le k-1, \tag{39}$$
and $\sigma$ is a permutation. An arbitrary $z\in\mathbb{C}^{m}$ can be written as
$$z=\sum_{\ell=0}^{k-1}P_\ell u_\ell \quad\text{with}\quad u_\ell\in\mathbb{C}^{m_\ell},\quad 0\le\ell\le k-1.$$
Therefore (9) and (39) imply that
$$Az-\lambda z=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}F_\ell u_\ell-\lambda\sum_{\ell=0}^{k-1}P_\ell u_\ell=\sum_{\ell=0}^{k-1}P_{\sigma(\ell)}\bigl(F_\ell u_\ell-\lambda u_{\sigma(\ell)}\bigr); \tag{40}$$
hence, $Az=\lambda z$ if and only if

$$F_\ell u_\ell=\lambda u_{\sigma(\ell)},\quad 0\le\ell\le k-1.$$

We first consider the case where $\sigma$ is the identity. The next three theorems are essentially restatements of results from [21], recast so as to be consistent with the viewpoint that we have taken in this paper. Let $\mathcal{C}_\ell$ denote the column space of $P_\ell$ and let $\mathcal{C}=\bigcup_{\ell=0}^{k-1}\mathcal{C}_\ell$.

Theorem 11 If $A$ commutes with $R$ then $\lambda$ is an eigenvalue of $A$ if and only if $\lambda$ is an eigenvalue of one or more of the matrices $F_0$, $F_1$, ..., $F_{k-1}$. Assuming this to be true, let
$$S_A(\lambda)=\bigl\{\ell\in\{0,1,\dots,k-1\}\bigm|\lambda\text{ is an eigenvalue of }F_\ell\bigr\}.$$
If $\ell\in S_A(\lambda)$ and $\{u_\ell^{(1)},u_\ell^{(2)},\dots,u_\ell^{(d_\ell)}\}$ is a basis for the set $\bigl\{u_\ell\in\mathbb{C}^{m_\ell}\bigm|F_\ell u_\ell=\lambda u_\ell\bigr\}$, then $P_\ell u_\ell^{(1)}$, $P_\ell u_\ell^{(2)}$, ..., $P_\ell u_\ell^{(d_\ell)}$ are linearly independent $\lambda$-eigenvectors of $A$. Moreover,
$$\bigcup_{\ell\in S_A(\lambda)}\bigl\{P_\ell u_\ell^{(1)},P_\ell u_\ell^{(2)},\dots,P_\ell u_\ell^{(d_\ell)}\bigr\}$$
is a basis for the $\lambda$-eigenspace of $A$. Finally, $A$ is diagonalizable if and only if $F_0$, $F_1$, ..., $F_{k-1}$ are all diagonalizable. In this case, $A$ has $m_\ell$ linearly independent eigenvectors in $\mathcal{C}_\ell$, $0\le\ell\le k-1$.

It seems useful to consider the case where $A$ is diagonalizable more explicitly.

Theorem 12 Suppose that $A$ commutes with $R$ and
$$F_\ell=\Omega_\ell D_\ell\Omega_\ell^{-1}$$
is a spectral decomposition of $F_\ell$, $0\le\ell\le k-1$. Let
$$\Omega=\begin{bmatrix}P_0\Omega_0&P_1\Omega_1&\cdots&P_{k-1}\Omega_{k-1}\end{bmatrix}.$$
Then
$$A=\Omega\left(\bigoplus_{\ell=0}^{k-1}D_\ell\right)\Omega^{-1} \quad\text{with}\quad \Omega^{-1}=\begin{bmatrix}\Omega_0^{-1}\widehat{P}_0\\\Omega_1^{-1}\widehat{P}_1\\\vdots\\\Omega_{k-1}^{-1}\widehat{P}_{k-1}\end{bmatrix}$$
is a spectral decomposition of $A$.

Remark 3 It is well known that commuting diagonalizable matrices are simultaneously diagonalizable. Theorem 12 makes this explicit, since $\Omega^{-1}R\Omega$ and $\Omega^{-1}A\Omega$ are both diagonal.
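In the diagonalizable case the assembly described in Theorems 11 and 12 is immediate to implement. The sketch below (illustrative only; $\sigma$ is the identity and all sizes, eigenvalues $\gamma_\ell$, and data are hypothetical) collects the eigenvalues of the $F_\ell$ and the vectors $P_\ell$ times eigenvectors of $F_\ell$, and checks that the columns of $\Omega$ are common eigenvectors of $R$ and $A$.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical A commuting with R (sigma = identity): eigenpairs of A assembled blockwise.
k = 3
m_sizes = [2, 3, 2]
m = sum(m_sizes)
gamma = np.array([1.0, 2.0, 5.0])                  # distinct, otherwise arbitrary

P = rng.standard_normal((m, m))
Pinv = np.linalg.inv(P)
off = np.cumsum([0] + m_sizes)
Pb   = [P[:, off[l]:off[l+1]] for l in range(k)]
Phat = [Pinv[off[l]:off[l+1], :] for l in range(k)]

R = P @ np.diag(np.repeat(gamma, m_sizes)) @ Pinv
F = [rng.standard_normal((m_sizes[l], m_sizes[l])) for l in range(k)]
A = sum(Pb[l] @ F[l] @ Phat[l] for l in range(k))
print(np.allclose(R @ A, A @ R))                   # A commutes with R

# Spectral decompositions F_l = Omega_l D_l Omega_l^{-1} (random F_l are generically diagonalizable).
eigs  = [np.linalg.eig(F[l]) for l in range(k)]
D     = np.concatenate([vals for vals, _ in eigs])            # the eigenvalues of A (Theorem 11)
Omega = np.hstack([Pb[l] @ vecs for _, vecs in eigs])         # columns: eigenvectors of A

print(np.allclose(A @ Omega, Omega @ np.diag(D)))             # A Omega = Omega diag(D)
print(np.allclose(R @ Omega, Omega @ np.diag(np.repeat(gamma, m_sizes))))  # common eigenvectors
```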

The original version of the following theorem, which dealt with centrosymmetric matrices, is due to Andrew [2, Theorem 6]. The proof is practically identical to Andrew's original proof.

Theorem 13 (i) If $A$ commutes with $R$ and $\lambda$ is an eigenvalue of $A$, then the $\lambda$-eigenspace of $A$ has a basis in $\mathcal{C}$. (ii) If $A$ has $m$ linearly independent eigenvectors in $\mathcal{C}$, then $A$ commutes with $R$.

PROOF. (i) See Theorem 11. (ii) If $z\in\mathcal{C}$ then $Rz=\gamma_\ell z$ for some $\ell\in\mathbb{Z}_k$. If $Az=\lambda z$ and $Rz=\gamma_\ell z$, then
$$RAz=\lambda Rz=\lambda\gamma_\ell z \quad\text{and}\quad ARz=\gamma_\ell Az=\lambda\gamma_\ell z;$$
hence, $RAz=ARz$. Now suppose that $A$ has $m$ linearly independent eigenvectors $\{z_1,z_2,\dots,z_m\}$ in $\mathcal{C}$. Then we can write an arbitrary $z\in\mathbb{C}^{m}$ as $z=\sum_{i=1}^{m}a_iz_i$. Since $RAz_i=ARz_i$, $1\le i\le m$, it follows that $RAz=ARz$. Therefore $AR=RA$.

For the remainder of this section we assume that $A$ is $(R,R_\sigma)$-commutative and $\sigma$ is a permutation other than the identity. The following theorem shows that finding the null space of $A$ reduces to finding the null spaces of $F_0$, $F_1$, ..., $F_{k-1}$.

Theorem 14 If $A$ is $(R,R_\sigma)$-commutative and $\sigma$ is a permutation then $Az=0$ if and only if $z=\sum_{\ell=0}^{k-1}P_\ell u_\ell$, where
$$F_\ell u_\ell=0,\quad 0\le\ell\le k-1; \tag{41}$$
hence, the null space of $A$ is independent of $\sigma$ (so long as $\sigma$ is a permutation).

PROOF. Clearly, (41) implies that $Az=0$ without any assumption on $\sigma$. For the converse, note from (22) and (40) that if $\sigma$ is a permutation then $\widehat{P}_{\sigma(\ell)}Az=F_\ell u_\ell$, $0\le\ell\le k-1$, so $Az=0$ implies (41).

Henceforth we assume that $\lambda\ne0$. In this case, suppose that $\sigma$ has $p$ orbits $O_0$, ..., $O_{p-1}$. If $p=1$, then $\sigma$ is a $k$-cycle and $\mathbb{Z}_k=\bigl\{\sigma^{j}(0)\bigm|0\le j\le k-1\bigr\}$. In any case, if $\ell_r\in O_r$, $0\le r\le p-1$, then $\mathbb{Z}_k=O_0\cup\cdots\cup O_{p-1}$, where
$$O_r=\bigl\{\sigma^{j}(\ell_r)\bigm|0\le j\le k_r-1\bigr\},\quad 0\le r\le p-1,$$
and $k_0+\cdots+k_{p-1}=k$. It is important to note that
$$\sigma^{k_r}(\ell_r)=\ell_r,\quad 0\le r\le p-1, \tag{42}$$
and $k_0$, $k_1$, ..., $k_{p-1}$ are respectively the smallest positive integers for which these equalities hold. In Example 1, $p=3$, $O_0=\{0,1,3\}$, $O_1=\{2,4\}$, $O_2=\{5\}$, so $k_0=3$, $k_1=2$, $k_2=1$, $\mathbb{Z}_6=O_0\cup O_1\cup O_2$, and we may choose $\ell_0=0$, $\ell_1=2$, and $\ell_2=5$.

To solve the eigenvalue problem, we rearrange the terms in $z=\sum_{\ell=0}^{k-1}P_\ell u_\ell$ as
$$z=\sum_{r=0}^{p-1}z_r \quad\text{with}\quad z_r=\sum_{j=0}^{k_r-1}P_{\sigma^{j}(\ell_r)}u_{\sigma^{j}(\ell_r)},\quad 0\le r\le p-1, \tag{43}$$
and rearrange the terms in (39) as
$$A=\sum_{r=0}^{p-1}A_r \quad\text{with}\quad A_r=\sum_{j=0}^{k_r-1}P_{\sigma^{j+1}(\ell_r)}F_{\sigma^{j}(\ell_r)}\widehat{P}_{\sigma^{j}(\ell_r)},\quad 0\le r\le p-1. \tag{44}$$
Since (9) implies that $A_rA_s=0$ if $r\ne s$, we can replace (44) by
$$A=A_0\oplus A_1\oplus\cdots\oplus A_{p-1};$$
hence, $Az=\lambda z$ if and only if
$$A_rz_r=\lambda z_r,\quad 0\le r\le p-1.$$
Therefore, the eigenvalue problem for $A$ reduces to $p$ independent eigenvalue problems for $A_0$, $A_1$, ..., $A_{p-1}$. From (43) and (44), $A_rz_r=\lambda z_r$ if and only if
$$\sum_{j=0}^{k_r-1}P_{\sigma^{j+1}(\ell_r)}F_{\sigma^{j}(\ell_r)}u_{\sigma^{j}(\ell_r)}=\lambda\sum_{j=0}^{k_r-1}P_{\sigma^{j}(\ell_r)}u_{\sigma^{j}(\ell_r)}=\lambda\sum_{j=0}^{k_r-1}P_{\sigma^{j+1}(\ell_r)}u_{\sigma^{j+1}(\ell_r)},$$
which is equivalent to
$$F_{\sigma^{j}(\ell_r)}u_{\sigma^{j}(\ell_r)}=\lambda u_{\sigma^{j+1}(\ell_r)},\quad 0\le j\le k_r-1. \tag{45}$$

If $k_r=1$ then $\sigma(\ell_r)=\ell_r$ and (45) becomes $F_{\ell_r}u_{\ell_r}=\lambda u_{\ell_r}$; hence, if $(\lambda,u_{\ell_r})$ is an eigenpair of $F_{\ell_r}$ then $z_r=P_{\ell_r}u_{\ell_r}$ is a $\lambda$-eigenvector of $A$.

If $k_r>1$ then (42) and (45) imply that
$$G_ru_{\ell_r}=\lambda^{k_r}u_{\ell_r},\quad\text{where}\quad G_r=F_{\sigma^{k_r-1}(\ell_r)}\cdots F_{\sigma(\ell_r)}F_{\ell_r}\in\mathbb{C}^{m_{\ell_r}\times m_{\ell_r}}.$$
Therefore, if $\mu$ is a nonzero eigenvalue of $G_r$ and $\omega=e^{2\pi i/k_r}$, then $\mu^{1/k_r}$, $\omega\mu^{1/k_r}$, ..., $\omega^{k_r-1}\mu^{1/k_r}$ are distinct eigenvalues of $A_r$ (and therefore of $A$). If $\lambda$ is any one of these eigenvalues, then the corresponding eigenvector $z_r$ of $A_r$ (and therefore of $A$) is given by (43), where $u_{\sigma^{j}(\ell_r)}$, $1\le j\le k_r-1$, can be computed recursively from (45) as
$$u_{\sigma^{j}(\ell_r)}=\lambda^{-1}F_{\sigma^{j-1}(\ell_r)}u_{\sigma^{j-1}(\ell_r)},\quad 1\le j\le k_r-1.$$
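The orbit-by-orbit recipe above is straightforward to carry out numerically. The sketch below (illustrative only) uses the permutation of Example 1 with arbitrary hypothetical block sizes and data, forms $G_0$ for the orbit $\{0,1,3\}$, takes one of its (generically nonzero) eigenvalues $\mu$, and builds a $\lambda$-eigenvector of $A$ with $\lambda=\mu^{1/3}$ via the recursion (45).

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical illustration with the permutation of Example 1: sigma = (0,1,3)(2,4)(5).
sigma = [1, 3, 4, 0, 2, 5]
k = 6
m_sizes = [2, 3, 2, 2, 1, 3]                       # arbitrary block sizes
m = sum(m_sizes)

P = rng.standard_normal((m, m))
Pinv = np.linalg.inv(P)
off = np.cumsum([0] + m_sizes)
Pb   = [P[:, off[l]:off[l+1]] for l in range(k)]
Phat = [Pinv[off[l]:off[l+1], :] for l in range(k)]

F = [rng.standard_normal((m_sizes[sigma[l]], m_sizes[l])) for l in range(k)]
A = sum(Pb[sigma[l]] @ F[l] @ Phat[l] for l in range(k))

# Orbit O_0 = {0, 1, 3}: representative l_0 = 0, length k_0 = 3, G_0 = F_{sigma^2(0)} F_{sigma(0)} F_0.
orbit = [0, 1, 3]                                  # sigma^j(0), j = 0, 1, 2
k0 = len(orbit)
G0 = F[orbit[2]] @ F[orbit[1]] @ F[orbit[0]]

vals, vecs = np.linalg.eig(G0)
mu = complex(vals[0])                              # a (generically nonzero) eigenvalue of G_0
u0 = vecs[:, 0]                                    # corresponding eigenvector u_{l_0}
lam = mu ** (1.0 / k0)                             # one k_0-th root of mu; the others give omega^j * lam

u = [u0]
for j in range(1, k0):                             # recursion (45): u_{sigma^j} = F_{sigma^{j-1}} u_{sigma^{j-1}} / lam
    u.append(F[orbit[j - 1]] @ u[j - 1] / lam)

z = sum(Pb[orbit[j]] @ u[j] for j in range(k0))
print(np.allclose(A @ z, lam * z))                 # z is a lam-eigenvector of A
```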

References

[1] C. M. Ablow, J. L. Brenner, Roots and canonical forms for circulant matrices, Trans. Amer. Math. Soc. 107 (1963) 360–376.

[2] A. L. Andrew, Eigenvectors of certain matrices, Linear Algebra Appl. 7 (1973) 151–162.

[3] A. L. Andrew, Solution of equations involving centrosymmetric matrices, Technometrics 15 (1973) 405–407.

[4] A. L. Andrew, Centrosymmetric matrices, SIAM Review 40 (1998) 697–698.

[5] A. Cantoni, P. Butler, Eigenvalues and eigenvectors of symmetric centrosymmetric matrices, Linear Algebra Appl. 13 (1976) 275–288.

[6] H.-C. Chen, A. Sameh, A matrix decomposition method for orthotropic elasticity problems, SIAM J. Matrix Anal. Appl. 10 (1989) 39–64.

[7] H.-C. Chen, Circulative matrices of degree $\theta$, SIAM J. Matrix Anal. Appl. 13 (1992) 1172–1188.

[8] A. R. Collar, On centrosymmetric and centroskew matrices, Quart. J. Mech. Appl. Math. 15 (1962) 265–281.

[9] D. Fasino, Circulative properties revisited: Algebraic properties of a generalization of cyclic matrices, Italian J. Pure Appl. Math. 4 (1998) 33–43.

[10] I. J. Good, The inverse of a centrosymmetric matrix, Technometrics 12 (1970) 925–928.

[11] G. L. Li, Z. H. Feng, Mirrorsymmetric matrices, their basic properties, and an application on odd/even decomposition of symmetric multiconductor transmission lines, SIAM J. Matrix Anal. Appl. 24 (2002) 78–90.

[12] I. S. Pressman, Matrices with multiple symmetry properties: applications of centrohermitian and perhermitian matrices, Linear Algebra Appl. 284 (1998) 239–258.

[13] W. C. Pye, T. L. Boullion, T. A. Atchison, The pseudoinverse of a centrosymmetric matrix, Linear Algebra Appl. 6 (1973) 201–204.

[14] D. Tao, M. Yasuda, A spectral characterization of generalized real symmetric centrosymmetric and generalized real symmetric skew-centrosymmetric matrices, SIAM J. Matrix Anal. Appl. 23 (2002) 885–895.

[15] W. F. Trench, Characterization and properties of matrices with generalized symmetry or skew symmetry, Linear Algebra Appl. 377 (2004) 207–218.

[16] W. F. Trench, Inverse eigenproblems and associated approximation problems for matrices with generalized symmetry or skew symmetry, Linear Algebra Appl. 380 (2004) 199–211.

[17] W. F. Trench, Hermitian, Hermitian R-symmetric, and Hermitian R-skew symmetric Procrustes problems, Linear Algebra Appl. 387 (2004) 83–98.

[18] W. F. Trench, Minimization problems for (R,S)-symmetric and (R,S)-skew symmetric matrices, Linear Algebra Appl. 389 (2004) 23–31.

[19] W. F. Trench, Hermitian, Hermitian R-symmetric, and Hermitian R-skew symmetric Procrustes problems, Linear Algebra Appl. 387 (2004) 83–98.

[20] W. F. Trench, Characterization and properties of (R,S)-symmetric, (R,S)-skew symmetric, and (R,S)-conjugate matrices, SIAM J. Matrix Anal. Appl. 26 (2005) 748–757.

[21] W. F. Trench, Characterization and properties of matrices with k-involutory symmetries, Linear Algebra Appl. 429 (2008) 2278–2290.

[22] W. F. Trench, Properties of unilevel block circulants, Linear Algebra Appl. 430 (2009) 2012–2025.

[23] W. F. Trench, Characterization and properties of matrices with k-involutory symmetries II, Linear Algebra Appl. 432 (2010) 2782–2797.

[24] M. Yasuda, A spectral characterization of Hermitian centrosymmetric and Hermitian skew-centrosymmetric K-matrices, SIAM J. Matrix Anal. Appl. 25 (2003) 601–605.

[25] J. R. Weaver, Centrosymmetric (cross-symmetric) matrices, their basic properties, eigenvalues, and eigenvectors, Amer. Math. Monthly 92 (1985) 711–717.