On Perfect Hypercomplex Algebra
Daizhan Cheng, Zhengping Ji

This work is supported partly by the National Natural Science Foundation of China (NSFC) under Grants 62073315, 61074114, and 61273013. Key Laboratory of Systems and Control, Academy of Mathematics and Systems Sciences, Chinese Academy of Sciences, Beijing 100190, P. R. China (e-mail: [email protected], [email protected]).

Abstract—The set of associative and commutative hypercomplex numbers, called the perfect hypercomplex algebra (PHA), is investigated. Necessary and sufficient conditions for an algebra to be a PHA via the semi-tensor product (STP) of matrices are reviewed. The zero set is defined for non-invertible hypercomplex numbers in a given PHA, and a characteristic function is proposed for calculating the zero set. Then PHAs of different dimensions are considered. First, 2-dimensional PHAs are considered as examples to calculate their zero sets, etc. Second, all the 3-dimensional PHAs are obtained and the corresponding zero sets are investigated. Third, 4-dimensional and even higher-dimensional PHAs are also considered. Finally, matrices over a pre-assigned PHA, called perfect hypercomplex matrices (PHMs), are considered. Their properties are also investigated.

Index Terms—Perfect hypercomplex algebra (PHA), perfect hypercomplex matrix (PHM), zero set, semi-tensor product (STP) of matrices.

I. INTRODUCTION

Hypercomplex numbers (HNs) are generalizations of the complex numbers (C). A class of HNs with pre-assigned addition and product forms a special vector space over R, called a hypercomplex algebra (HA). HAs have various applications, including signal and image processing [10], dealing with differential operators [2], designing neural networks [9], etc.

It was proved by Weierstrass that the only finite field extension of the real numbers (R) is the complex numbers (C) [8]. An HA can be considered as an extension of the real numbers to finite-dimensional algebras. We call such an extension a finite algebra extension of the real numbers.

In this paper we consider only a particular kind of finite algebra extensions of R, namely those which are commutative and associative. Hence, throughout this paper the following is assumed:

Assumption 1: H is the set of PHAs, that is, the set of finite-dimensional algebras over R which are commutative and associative.

In addition to the complex numbers, the hyperbolic numbers, the dual numbers, and the Tessarine quaternions are also PHAs.

The STP of matrices is a generalization of the conventional matrix product. It is a powerful tool to deal with multi-linear mappings. In [5], the STP was used to investigate finite algebra extensions of R (in fact, the extensions of any F with Char(F) = 0 have been discussed there). In [6], the STP was used to investigate general Boolean-type algebras. A key issue in these approaches is to define a matrix, called the product matrix, of a certain algebra. Using the STP, associativity, commutativity, and some other properties of a finite algebra extension can be verified via its product matrix.

In this paper, this STP approach is used to investigate PHAs and PHMs. For PHAs, first the formulas for verifying whether an HA is associative and commutative are reviewed. Then the zero set is defined as the set of non-invertible numbers, and a characteristic function is proposed to calculate (or characterize) the zero set. Next, the PHAs of dimensions 2, 3, and 4 are constructed separately; even higher-dimensional cases are also discussed. Their zero sets, which are of measure zero, are calculated. Analytic functions and some other properties of PHAs are then discussed.

As for PHMs, their invertibility and some further properties are investigated. The Lie group and Lie algebra over a pre-assigned PHA, say A, namely the general linear group GL(n, A) and the general linear algebra gl(n, A), are also defined and investigated. Certain properties are revealed.

The rest of this paper is organized as follows:

Before ending this section, we give a list of notations:

1) H: the set of PHAs, as finite algebra extensions over R.
2) A_{m×n}: the set of m × n matrices with all entries in the algebra A.
3) ⋉: the STP of matrices.
4) Col(A) (Row(A)): the set of columns (rows) of A; Col_i(A) (Row_i(A)): the i-th column (row) of A.
5) δ_k^i: the i-th column of the identity matrix I_k.

II. PRELIMINARIES

A. Semi-tensor Product of Matrices

Since the STP is a fundamental tool in this approach, this section gives a brief survey of the STP; we refer to [4] for more details. The STP is a generalization of the conventional matrix product, defined as follows:

Definition 2.1: Let A ∈ R_{m×n} and B ∈ R_{p×q}, and let t = lcm(n, p) be the least common multiple of n and p. Then the STP of A and B, denoted by A ⋉ B, is defined by

  A ⋉ B := (A ⊗ I_{t/n})(B ⊗ I_{t/p}),   (1)

where ⊗ is the Kronecker product.

It is easy to see that the STP is a generalization of the conventional matrix product: when n = p, the STP degenerates to the conventional matrix product, i.e., A ⋉ B = AB. Because of this, in most cases the symbol ⋉ is omitted.
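As a quick illustration of Definition 2.1, the following minimal Python/NumPy sketch implements the STP; the function name stp and the use of NumPy are our own choices for illustration and are not part of the paper.

```python
import numpy as np

def stp(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """Semi-tensor product A <| B of two 2-D arrays (Definition 2.1)."""
    n, p = A.shape[1], B.shape[0]
    t = np.lcm(n, p)  # least common multiple of n and p
    # A <| B := (A (x) I_{t/n}) (B (x) I_{t/p})
    return np.kron(A, np.eye(t // n)) @ np.kron(B, np.eye(t // p))

# When n = p, the STP degenerates to the conventional matrix product.
A = np.arange(6.0).reshape(2, 3)
B = np.arange(12.0).reshape(3, 4)
assert np.allclose(stp(A, B), A @ B)
```

Column vectors should be passed as k × 1 arrays; this convention is used in the sketches below.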
One of the most important advantages of the STP is that it keeps most of the important properties of the conventional matrix product, including associativity, distributivity, etc. In the following we introduce some additional properties of the STP which will be used in the sequel.

Define the swap matrix W_{[m,n]} ∈ M_{mn×mn} as follows:

  W_{[m,n]} := [I_n ⊗ δ_m^1, I_n ⊗ δ_m^2, ⋯, I_n ⊗ δ_m^m].   (2)

Proposition 2.2: Let x ∈ R^m and y ∈ R^n be two column vectors. Then

  W_{[m,n]} x ⋉ y = y ⋉ x.   (3)

The following proposition "swaps" a vector with a matrix:

Proposition 2.3: Let x ∈ R^t be a column vector, and let A be an arbitrary matrix. Then

  x ⋉ A = (I_t ⊗ A) ⋉ x.   (4)

Throughout this paper the default matrix product is assumed to be the STP, and the symbol ⋉ is omitted if there is no possible confusion.

B. Matrix Expression of an Algebra

We are only interested in algebras over R.

Definition 2.4: [7]
(i) An algebra over R is a pair, denoted by A = (V, ∗), where V is a real vector space and ∗ : V × V → V satisfies

  (ax + by) ∗ z = a x ∗ z + b y ∗ z,
  x ∗ (ay + bz) = a x ∗ y + b x ∗ z,   x, y, z ∈ V, a, b ∈ R.   (5)

(ii) An algebra A = (V, ∗) is said to be commutative if

  x ∗ y = y ∗ x,   x, y ∈ V.   (6)

(iii) An algebra A = (V, ∗) is said to be associative if

  (x ∗ y) ∗ z = x ∗ (y ∗ z),   x, y, z ∈ V.   (7)

Definition 2.5: Let A = (V, ∗) be a k-dimensional vector space with e = {i_1, i_2, ⋯, i_k} as a basis. Denote

  i_i ∗ i_j = Σ_{s=1}^{k} c^s_{i,j} i_s,   i, j = 1, 2, ⋯, k.   (8)

Then the product matrix of A is defined as

  P_A := [ c^1_{1,1}  c^1_{1,2}  ⋯  c^1_{1,k}  ⋯  c^1_{k,k} ]
         [ c^2_{1,1}  c^2_{1,2}  ⋯  c^2_{1,k}  ⋯  c^2_{k,k} ]
         [     ⋮          ⋮            ⋮             ⋮      ]
         [ c^k_{1,1}  c^k_{1,2}  ⋯  c^k_{1,k}  ⋯  c^k_{k,k} ].   (9)

Assume x = Σ_{j=1}^{k} x_j i_j is expressed in column vector form as x = (x_1, x_2, ⋯, x_k)^T; similarly, y = (y_1, y_2, ⋯, y_k)^T. Then we have the following.

Theorem 2.6: In vector form, the product of two hypercomplex numbers x, y ∈ A is computable via the following formula:

  x ∗ y = P_A x y.   (10)

Using formula (10) and the properties of the STP yields the following results, which are fundamental for our further investigation.

Theorem 2.7:
(i) A is commutative if and only if

  P_A (I_{k^2} − W_{[k,k]}) = 0.   (11)

(ii) A is associative if and only if

  P_A^2 = P_A (I_k ⊗ P_A).   (12)
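As a hedged illustration of Theorems 2.6 and 2.7, the sketch below encodes the product matrix of C with basis {1, i} (it reappears in Example 3.5 below) and checks conditions (10), (11) and (12) numerically. The helper swap_matrix and the numerical test are our own additions; the stp helper is the one sketched after Definition 2.1.

```python
import numpy as np

def stp(A, B):  # the STP sketch given after Definition 2.1
    t = np.lcm(A.shape[1], B.shape[0])
    return np.kron(A, np.eye(t // A.shape[1])) @ np.kron(B, np.eye(t // B.shape[0]))

def swap_matrix(m, n):
    """Swap matrix W_[m,n] of (2): it satisfies W_[m,n] (x <| y) = y <| x."""
    W = np.zeros((m * n, m * n))
    for i in range(m):
        for j in range(n):
            W[j * m + i, i * n + j] = 1.0
    return W

# Product matrix of C with basis {1, i}: its columns are the coordinates of
# 1*1, 1*i, i*1 and i*i = -1, in the column ordering of (9) (cf. Example 3.5 below).
P_C = np.array([[1.0, 0.0, 0.0, -1.0],
                [0.0, 1.0, 1.0,  0.0]])
k = 2

# (11): commutativity, P_A (I_{k^2} - W_[k,k]) = 0
print(np.allclose(P_C @ (np.eye(k * k) - swap_matrix(k, k)), 0))    # True

# (12): associativity, P_A^2 = P_A (I_k (x) P_A), the square taken in the STP sense
print(np.allclose(stp(P_C, P_C), P_C @ np.kron(np.eye(k), P_C)))    # True

# (10): x * y = P_A x y, e.g. (1 + 2i)(3 + 4i) = -5 + 10i
x = np.array([[1.0], [2.0]])
y = np.array([[3.0], [4.0]])
print(stp(stp(P_C, x), y).ravel())                                  # [-5. 10.]
```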
III. HYPERCOMPLEX NUMBERS

A. Perfect Hypercomplex Algebra on R

Definition 3.1: [11] A number p is called a hypercomplex number if it can be expressed in the form

  p = p_0 + p_1 i_1 + ⋯ + p_n i_n,   (13)

where p_i ∈ R, i = 0, 1, ⋯, n, and i_i, i = 1, 2, ⋯, n, are called hyperimaginary units.

Remark 3.2: A hypercomplex number may belong to different algebras, depending on their product structure matrices. A hypercomplex algebra, denoted by A, is an algebra over R with basis e = {i_0 := 1, i_1, ⋯, i_n}.

Proposition 3.3: Assume

  A = {p_0 + p_1 i_1 + ⋯ + p_n i_n | p_0, p_1, ⋯, p_n ∈ R}.

Then its product matrix

  P_A := [M_0, M_1, ⋯, M_n],

where M_i ∈ R_{(n+1)×(n+1)}, i = 0, 1, ⋯, n, satisfies the following conditions.
(i)

  M_0 = I_{n+1}   (14)

is an identity matrix.
(ii)

  Col_1(M_j) = δ_{n+1}^{j+1},   j = 1, 2, ⋯, n.   (15)

Definition 3.4: A hypercomplex algebra A is called a PHA, denoted by A ∈ H, if it is commutative and associative.

Example 3.5: Consider C. It is easy to calculate that its product structure matrix is

  P_C = [ 1  0  0  −1 ]
        [ 0  1  1   0 ].

The function ξ(x_0, x_1, ⋯, x_n), defined by (17), is called the characteristic function of A.

Example 3.8: Consider C = R(i). Calculating the right-hand side of (17) for P_C, we have

  ξ(x_1, x_2) = x_1^2 + x_2^2.

Hence ξ(x_1, x_2) = 0 if and only if x_1 = x_2 = 0. It follows that P_C is jointly non-singular.

Summarizing the above arguments, we have the following result.

Proposition 3.9: Let A be a finite-dimensional algebra over R. Then A is a field if and only if
(i) A is commutative, that is, (11) holds;
(ii) A is associative, that is, (12) holds;
(iii) P_A is jointly non-singular.
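The defining formula (17) of the characteristic function is not reproduced in this excerpt. As a hedged illustration only, the sketch below uses the determinant of the multiplication matrix P_A ⋉ x (which, by (10), represents the linear map y ↦ x ∗ y) as a stand-in invertibility test, and recovers ξ(x_1, x_2) = x_1^2 + x_2^2 for C, matching Example 3.8; the symbolic setup and variable names are our own.

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)

# Product matrix of C (Example 3.5), split into blocks P_C = [M_0, M_1].
M0 = sp.eye(2)
M1 = sp.Matrix([[0, -1],
                [1,  0]])

# By (10), x * y = P_C x y, and P_C <| x = x1*M0 + x2*M1 is the matrix of
# the linear map y |-> x * y for x = x1 + x2*i.
M_x = x1 * M0 + x2 * M1

# Stand-in for the characteristic function: the determinant of M_x;
# x is non-invertible exactly when it vanishes (our assumption, cf. (17)).
print(sp.expand(M_x.det()))   # x1**2 + x2**2, as in Example 3.8
```

Under this reading, the zero set of C reduces to {0}, consistent with Example 3.8 and with C being a field (Proposition 3.9).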