Deep Learning-Based Side Channel Attack on HMAC SM3

Xin Jin1, Yong Xiao1, Shiqi Li2*, Suying Wang2

1 CSG Electric Power Research Institute, Guangzhou (China)
2 Open Security Research, Inc., Shenzhen (China)

* Corresponding author. E-mail address: [email protected]

Received 1 July 2020 | Accepted 2 November 2020 | Published 16 November 2020
DOI: 10.9781/ijimai.2020.11.007

Abstract

SM3 is a Chinese hash standard. HMAC SM3 uses a secret key to process the input text and gives the HMAC of the input text as output. If the key is recovered, adversaries can easily forge a valid HMAC. Different methods can be chosen to recover the secret key, such as traditional side channel analysis or template attack-based side channel analysis. Deep learning has recently been introduced as a new alternative for performing side channel analysis. In this paper, we try to recover the secret key with deep learning-based side channel analysis. We train the network recursively for different parameters using the same dataset and attack the target dataset with the trained networks to recover the different parameters. The experimental results show that the secret key can be recovered with deep learning-based side channel analysis. This work demonstrates the interest of this new method and shows that the attack can be performed in practice.

Keywords

CNN, Neural Network, HMAC, Side Channel Analysis, HMAC-SM3.

I. Introduction

Side channel analysis is a powerful technique that helps an adversary recover sensitive information without damaging the target. The target device leaks information (e.g. power consumption [1], electromagnetic emanation [2], temperature [3], acoustics [4], etc.) that is related to sensitive data during calculation [5]. An adversary can make use of this leakage to recover the sensitive data. In order to decrease the leakage, several countermeasures such as masking and hiding are used in cryptographic implementations. However, even with countermeasures, adversaries can come up with more powerful methods to recover the sensitive information.

Following the current trend in the side channel analysis research area, recent works have demonstrated that deep learning algorithms are very efficient for conducting security evaluations of embedded systems and have many advantages compared to other methods.

Nowadays, machine learning has become a popular topic in many areas. It is usually divided into three classes: supervised learning, unsupervised learning and semi-supervised learning. In most situations, supervised learning is used. Many kinds of structures are used in machine learning, such as Support Vector Machines (SVM) and Random Forests. Deep learning is a kind of machine learning that extracts features through several non-linear layers. Deep learning became popular after AlexNet [6] was proposed in 2012. Since then, more and more complex network structures have been proposed, such as VGGNet [7], GoogLeNet [8] and ResNet [9]. These networks work well in many areas, e.g. image recognition, face recognition and so on.

In recent years, deep learning techniques have been applied in the side channel analysis research area [10], [11]. Compared to the traditional side channel method, deep learning-based side channel analysis performs better, especially when the implementation is protected by masking or affected by jitter [12]. Without alignment or pre-processing, a neural network can still recover sensitive information, which is much more convenient than the traditional side channel method. Much research has been done in recent years. In 2013, Martinasek et al. attacked an AES implementation running on a PIC16F84A with a network of only one hidden layer [13]. Another work [14] compares different kinds of machine learning methods on the DPA contest V2 dataset. A further study [15] proposed a CNN-based side channel analysis and claims that this method is robust to trace misalignment; different network structures are applied to the dataset, resulting in a CNN-best structure for that dataset. Picek et al. compare different methods for the class imbalance situation in DL-SCA [16]. Finally, a correlation-based loss function is proposed in [17].

In this paper, we use a deep learning-based side channel analysis technique to analyze an HMAC SM3 implementation. The structure of this paper is as follows. In Section II, we introduce the SM3 algorithm, HMAC SM3, the basic idea of CNNs, as well as the attack path; this section helps readers gain a basic understanding of the algorithm and the attack method. The attacks on real traces are demonstrated in Section III, where the target, the structure of the network and the attack are illustrated. In the end, the conclusion and future work are presented in Section IV.

II. Background

A. SM3 Algorithm

SM3 is the Chinese hash standard [18]. The structure of the algorithm is shown in Fig. 1. The input data is padded such that it can be split into N blocks of 512 bits. Each block is treated in the same procedure: the former block calculates a new IV for the latter block through the function f(), and the output of block N is the hash result of the algorithm.

Fig. 1. Structure of SM3 algorithm: the padded data (0 bit ≤ Data < 2^64 bits) is split into 512-bit blocks #1 … #N, and f() chains the 8*32-bit states IV0 → IV1 → … → IVN-1 → hash.

The structure of the function f() is shown in Fig. 2. The function T() converts the 512-bit input into 64 pairs of 32-bit words. Each pair (W_j, W_j*) is used during round R_j, and the result of each round is used as input of the next round. When the 64th round is completed, a final transformation is applied by adding the input of the first round and the output of the last round together as the output of the function f().

Fig. 2. Structure of the f function: the word pairs (W_0, W_0*) … (W_63, W_63*) feed the rounds R_0 … R_63, and the last state is XORed with the 8*32-bit input state.

In order to explain the detail of each loop, we define the loop by the function IV_i = f(IV_{i-1}, Block_i). The first constant IV_0 is:

IV_{0,0} = 0x7380166F (1)
IV_{0,1} = 0x4914B2B9 (2)
IV_{0,2} = 0x172442D7 (3)
IV_{0,3} = 0xDA8A0600 (4)
IV_{0,4} = 0xA96F30BC (5)
IV_{0,5} = 0x163138AA (6)
IV_{0,6} = 0xE38DEE4D (7)
IV_{0,7} = 0xB0FB0E4E (8)

The detail of each loop is as follows. First, initialize the eight 32-bit local variables named a to h:

a_0 = IV_{i-1,0} (9)
b_0 = IV_{i-1,1} (10)
c_0 = IV_{i-1,2} (11)
d_0 = IV_{i-1,3} (12)
e_0 = IV_{i-1,4} (13)
f_0 = IV_{i-1,5} (14)
g_0 = IV_{i-1,6} (15)
h_0 = IV_{i-1,7} (16)

For each round R_j, j ∈ [0, 63], we compute:

SS1_j = ((a_j <<< 12) + e_j + (T_j <<< j)) <<< 7 (17)
SS2_j = SS1_j ⊕ (a_j <<< 12) (18)
TT1_j = FF_j(a_j, b_j, c_j) + d_j + SS2_j + W_j* (19)
TT2_j = GG_j(e_j, f_j, g_j) + h_j + SS1_j + W_j (20)
a_{j+1} = TT1_j (21)
b_{j+1} = a_j (22)
c_{j+1} = b_j <<< 9 (23)
d_{j+1} = c_j (24)
e_{j+1} = P_0(TT2_j) (25)
f_{j+1} = e_j (26)
g_{j+1} = f_j <<< 19 (27)
h_{j+1} = g_j (28)

where all additions are done modulo 2^32 and X <<< n means a left rotation of X by n bits. The constant T_j is:

T_j = 0x79CC4519 for 0 ≤ j ≤ 15, T_j = 0x7A879D8A for 16 ≤ j ≤ 63 (29)

The function FF_j is:

FF_j(X, Y, Z) = X ⊕ Y ⊕ Z for 0 ≤ j ≤ 15, FF_j(X, Y, Z) = (X ∧ Y) ∨ (X ∧ Z) ∨ (Y ∧ Z) for 16 ≤ j ≤ 63 (30)

The function GG_j is:

GG_j(X, Y, Z) = X ⊕ Y ⊕ Z for 0 ≤ j ≤ 15, GG_j(X, Y, Z) = (X ∧ Y) ∨ (¬X ∧ Z) for 16 ≤ j ≤ 63 (31)

The function P_k is:

P_0(X) = X ⊕ (X <<< 9) ⊕ (X <<< 17), P_1(X) = X ⊕ (X <<< 15) ⊕ (X <<< 23) (32)

The input plaintext of each block, PlainBlock, is split into 32-bit words PlainBlock = {PB_0, PB_1, …, PB_15}. Then the parameter W_j is computed as:

W_j = PB_j for 0 ≤ j ≤ 15, W_j = P_1(W_{j-16} ⊕ W_{j-9} ⊕ (W_{j-3} <<< 15)) ⊕ (W_{j-13} <<< 7) ⊕ W_{j-6} for 16 ≤ j ≤ 67 (33)

And the parameter W_j* is computed as:

W_j* = W_j ⊕ W_{j+4} (34)

The function f() is finished by a 32-bit-wise XOR with the initial state:

IV_{i,0} = IV_{i-1,0} ⊕ a_64 (35)
IV_{i,1} = IV_{i-1,1} ⊕ b_64 (36)
IV_{i,2} = IV_{i-1,2} ⊕ c_64 (37)
IV_{i,3} = IV_{i-1,3} ⊕ d_64 (38)
IV_{i,4} = IV_{i-1,4} ⊕ e_64 (39)
IV_{i,5} = IV_{i-1,5} ⊕ f_64 (40)
IV_{i,6} = IV_{i-1,6} ⊕ g_64 (41)
IV_{i,7} = IV_{i-1,7} ⊕ h_64 (42)
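To make the round description concrete, the following minimal Python sketch implements the compression function f() directly from Eq. (1)-(42). The helper names (rotl, expand, sm3_compress) are our own choices for illustration and do not refer to the implementation analyzed in this paper.

```python
# Minimal sketch of the SM3 compression function f(), following Eq. (1)-(42).
# Helper names (rotl, expand, sm3_compress) are illustrative only.

MASK32 = 0xFFFFFFFF

IV0 = [0x7380166F, 0x4914B2B9, 0x172442D7, 0xDA8A0600,
       0xA96F30BC, 0x163138AA, 0xE38DEE4D, 0xB0FB0E4E]          # Eq. (1)-(8)

def rotl(x, n):
    """32-bit left rotation, the <<< operator."""
    n %= 32
    return ((x << n) | (x >> (32 - n))) & MASK32

def P0(x):  # Eq. (32)
    return x ^ rotl(x, 9) ^ rotl(x, 17)

def P1(x):  # Eq. (32)
    return x ^ rotl(x, 15) ^ rotl(x, 23)

def FF(j, x, y, z):  # Eq. (30)
    return x ^ y ^ z if j <= 15 else (x & y) | (x & z) | (y & z)

def GG(j, x, y, z):  # Eq. (31)
    return x ^ y ^ z if j <= 15 else ((x & y) | (~x & z)) & MASK32

def T(j):  # Eq. (29)
    return 0x79CC4519 if j <= 15 else 0x7A879D8A

def expand(block):
    """Message expansion: 16 words PB_0..PB_15 -> (W, W*), Eq. (33)-(34)."""
    W = list(block)
    for j in range(16, 68):
        W.append(P1(W[j - 16] ^ W[j - 9] ^ rotl(W[j - 3], 15))
                 ^ rotl(W[j - 13], 7) ^ W[j - 6])
    W_star = [W[j] ^ W[j + 4] for j in range(64)]
    return W, W_star

def sm3_compress(iv, block):
    """One call of f(): iv is 8 words, block is 16 words; returns the new IV."""
    W, W_star = expand(block)
    a, b, c, d, e, f, g, h = iv                                     # Eq. (9)-(16)
    for j in range(64):
        ss1 = rotl((rotl(a, 12) + e + rotl(T(j), j)) & MASK32, 7)   # Eq. (17)
        ss2 = ss1 ^ rotl(a, 12)                                     # Eq. (18)
        tt1 = (FF(j, a, b, c) + d + ss2 + W_star[j]) & MASK32       # Eq. (19)
        tt2 = (GG(j, e, f, g) + h + ss1 + W[j]) & MASK32            # Eq. (20)
        a, b, c, d = tt1, a, rotl(b, 9), c                          # Eq. (21)-(24)
        e, f, g, h = P0(tt2), e, rotl(f, 19), g                     # Eq. (25)-(28)
    return [x ^ y for x, y in zip(iv, [a, b, c, d, e, f, g, h])]    # Eq. (35)-(42)
```

A full SM3 hash pads the input, splits it into 512-bit blocks and folds sm3_compress over the blocks starting from IV0, as in Fig. 1.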
B. SM3 Based HMAC

HMAC stands for keyed-Hash Message Authentication Code and is a NIST standard, which can be found in [19]. Fig. 3 presents the process of HMAC SM3.

Fig. 3. The process of HMAC SM3: in a precomputation step, the padded key K is XORed with ipad to form the 512-bit block K1, which is compressed from IV_0 to obtain IV_1 = H_i; the padded data blocks are then processed from this state by further calls of f().
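To relate Fig. 3 to the keyed-hash construction of [19], the following minimal sketch shows generic HMAC instantiated with SM3 (block size 512 bits, ipad = 0x36, opad = 0x5C). The function sm3_hash, a complete SM3 hash consisting of padding plus the iterated f() of the previous subsection, is assumed rather than implemented here, and all names are illustrative.

```python
# Minimal sketch of HMAC-SM3 following the generic HMAC construction of [19].
# sm3_hash(data: bytes) -> bytes is assumed to be a full SM3 implementation;
# where the local OpenSSL build provides SM3, hashlib can supply it, e.g.
#   sm3_hash = lambda d: hashlib.new("sm3", d).digest()

BLOCK_SIZE = 64   # SM3 block size: 512 bits
IPAD, OPAD = 0x36, 0x5C

def hmac_sm3(key: bytes, message: bytes, sm3_hash) -> bytes:
    # A key longer than one block is hashed first; shorter keys are zero-padded.
    if len(key) > BLOCK_SIZE:
        key = sm3_hash(key)
    key = key.ljust(BLOCK_SIZE, b"\x00")

    k_ipad = bytes(b ^ IPAD for b in key)   # K1 in Fig. 3 (first inner block)
    k_opad = bytes(b ^ OPAD for b in key)   # first outer block

    inner = sm3_hash(k_ipad + message)      # inner hash over K1 || data
    return sm3_hash(k_opad + inner)         # outer hash produces the HMAC tag
```

Because k_ipad and k_opad depend only on the key, the chaining value obtained after compressing K1 (IV_1 = H_i in Fig. 3) can be precomputed once and reused for every message, which is the precomputation step shown in the figure.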
By computing the correlation between the mid-values and the side channel signals, we can figure out the correct guess. The template attack (TA) is another kind of passive attack. It has two stages: first, template building; second, template matching. Deep learning based SCA is similar to TA. We will discuss TA and deep learning based SCA in the following section.

Fig. 4. Side Channel Leakage: correlation over time for the right key compared with wrong keys.

D. Deep Learning Based Side Channel Analysis
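The network used in this work is described in Section III. Purely as an illustration of the profiled, supervised approach referred to above, the following sketch shows a small one-dimensional CNN classifier of the kind commonly used in deep learning-based SCA: it is trained on profiling traces labeled with a targeted intermediate value and then applied to the attack traces. The trace length, layer sizes and number of classes below are arbitrary assumptions for illustration, not the parameters of the authors' network or of the attacked implementation.

```python
# Illustrative profiled DL-SCA classifier (not the network of this paper).
# Assumed inputs: profiling traces X of shape (n_traces, TRACE_LEN, 1) and
# labels y giving the value of one targeted intermediate (e.g. one byte).
import numpy as np
from tensorflow.keras import layers, models

TRACE_LEN = 2000      # assumed number of samples per trace
NUM_CLASSES = 256     # assumed label space of the targeted intermediate value

def build_cnn():
    model = models.Sequential([
        layers.Input(shape=(TRACE_LEN, 1)),
        layers.Conv1D(16, 11, activation="relu", padding="same"),
        layers.AveragePooling1D(2),
        layers.Conv1D(32, 11, activation="relu", padding="same"),
        layers.AveragePooling1D(2),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Profiling phase: train on traces whose intermediate values are known.
#   model = build_cnn()
#   model.fit(X_profiling, y_profiling, epochs=50, batch_size=128, validation_split=0.1)
# Attack phase: when the targeted value is fixed across the attack traces,
# sum the log-probabilities per candidate and take the best-ranked one.
#   log_probs = np.log(model.predict(X_attack) + 1e-40).sum(axis=0)
#   best_candidate = int(np.argmax(log_probs))
```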
