Journal of Cryptologic Research
Journal of Cryptologic Research, 2018, 5 (6). Published: 07 December 2018

Z Cipher Scheme

Journal of Cryptologic Research. 2018, 5 (6): 579-590. doi: 10.13868/j.cnki.jcr.000267

This paper presents the Z cipher scheme, designed to satisfy the security requirements of mobile terminals, including identity authentication and key exchange. The design of Z is dynamic in that the algorithm structure depends on the user's key, unlike known ciphers, which have a static structure shared by all users and differ only in the key. Different users therefore have different encryption logic, which manifests not only in different cipher parameters but also in different logic layers, operations, components, and data processing. This dynamic feature makes the Z cipher secure against static attacks; we call this characteristic of Z `different user, different algorithm'. Some new security concepts are proposed, such as instance security, system security, and distance security, which extend the security measures of traditional block ciphers. Users can update their keys, and hence the cipher structures, periodically or occasionally. Furthermore, the encryption algorithms in the Z cipher scheme can be provided as executable code, so the scheme is easy to use and manage in software.

On Cycle Structure of Non-singular Shift Registers with 2 Circles

Journal of Cryptologic Research. 2018, 5 (6): 591-598. doi: 10.13868/j.cnki.jcr.000268

The non-singular feedback shift register (NFSR) is a widely studied class of registers used in communications and cryptographic algorithms. The cycle structure is a common way of describing the state diagram of an NFSR: how many cycles does the NFSR generate, and what is the length of each? The cycle number distribution problem is the problem of counting NFSRs with a given number of cycles. In the 1980s, scholars solved this problem for linear NFSRs and for certain special nonlinear cycle structures. So far, only the number of NFSRs with one cycle can be determined, namely the number of $M$-sequences; there are few results for the other cases. This study investigates the counting problem for NFSRs with 2 cycles, which can be converted into counting assignment points in the $M$-sequence state cycle structure. Accordingly, two counting constraints on the cycle structure of NFSRs with 2 cycles are proposed. Based on the number of assorted assignment points and aliquot cycles, a new structural property of $M$-sequences is presented, and a class of cycle structures of NFSRs with 2 cycles is constructed from $M$-sequences. The relation between the cycle number of NFSRs and the number of minor terms is presented, as well as its relationship with the weight distribution of the $M$-sequence feedback function.
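As a toy illustration of the cycle-structure question, the state diagram of a small non-singular FSR can be enumerated directly: the state map of a non-singular FSR is a permutation of $\{0,1\}^n$, so every state lies on exactly one cycle. The helper name `cycle_structure` below is illustrative, not from the paper.

```python
from itertools import product

def cycle_structure(n, g):
    """Cycle lengths of the non-singular FSR whose state map is
    (x0, ..., x_{n-1}) -> (x1, ..., x_{n-1}, x0 XOR g(x1, ..., x_{n-1}))."""
    def step(state):
        return state[1:] + (state[0] ^ g(state[1:]),)
    seen, lengths = set(), []
    for start in product((0, 1), repeat=n):
        if start in seen:
            continue
        cur, length = start, 0
        while cur not in seen:   # the state map is a permutation, so the
            seen.add(cur)        # orbit of an unvisited state is a new cycle
            cur = step(cur)
            length += 1
        lengths.append(length)
    return sorted(lengths)

# The pure cycling register (g = 0) on 3 bits splits the 8 states
# into 4 cycles: {000}, {111}, and two cycles of length 3.
print(cycle_structure(3, lambda t: 0))   # [1, 1, 3, 3]
```

The cycle lengths always partition the $2^n$ states, which is the counting constraint any cycle-number result must respect.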

Selectively Public-verifiable Outsourced Computation Scheme Based on Garbled Circuit

Journal of Cryptologic Research. 2018, 5 (6): 599-611. doi: 10.13868/j.cnki.jcr.000269

As data analysis and processing become increasingly dependent on computing capacity, users prefer to outsource complex computing tasks to more powerful servers. This paper studies outsourced computation in a dual-server, multi-user setting, which allows users with weak computing power to outsource the computation of a function $f$ on their collective input $(x_1, \cdots, x_n)$ to two non-colluding servers $S_1$ and $S_2$. $S_2$ is responsible for computing and returning the result, while $S_1$ uses the public key of a specified user to perform the encryption that provides a proof of correctness for the final result. Compared with previous schemes, the proposed scheme satisfies selective public verifiability: only specified users can use their own private keys to verify the correctness of the returned results, so sensitive data can be effectively kept secret from unauthorized users. The scheme also improves computational efficiency and avoids unnecessary interaction among servers and among users. In addition, it meets the defined privacy and reliability requirements even when one server behaves maliciously. The proposed scheme has practical significance for avoiding damage caused by information leakage in cloud environments.

Research on Plaintext Recovery Algorithm of RC4

Journal of Cryptologic Research. 2018, 5 (6): 612-622. doi: 10.13868/j.cnki.jcr.000270

As biases in its output key stream continue to be exposed, the RC4 algorithm faces serious security challenges. In 2013, AlFardan et al. proposed a plaintext recovery attack using single-byte and double-byte biases: given $13\cdot2^{30}$ ciphertexts encrypted under different keys, the first 256 bytes of the plaintext can be recovered with probability 1. In the same year, Ohigashi et al. proposed a guess-and-determine attack to recover plaintexts encrypted by RC4: given $2^{35}$ ciphertexts encrypted under different keys, any byte of a plaintext can be recovered with probability close to 1; however, when fewer than $2^{35}$ ciphertexts are available, the success probability decreases rapidly. This study proposes a more effective guess-and-determine attack, using the $t$ value in place of the traditional probability and fully exploiting the known biases to modify the guess phase of Ohigashi's algorithm. Given $2^{34}$ ciphertexts encrypted under different keys, any byte of a plaintext can be recovered by the proposed method with probability close to 100\%, and given $2^{33}$ ciphertexts, any byte can be recovered with probability above 98\%.
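The broadcast setting behind such attacks can be illustrated with the classic Mantin-Shamir single-byte bias ($\Pr[Z_2=0]\approx 2/256$), rather than the paper's guess-and-determine algorithm. The sketch below (helper names are illustrative) recovers the second plaintext byte by majority vote over ciphertexts of the same plaintext under many random keys:

```python
import random
from collections import Counter

def rc4_keystream(key, n):
    """Standard RC4: KSA, then n bytes of PRGA output."""
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    i = j = 0
    out = []
    for _ in range(n):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return out

def recover_second_byte(cipher_bytes):
    """Z_2 is biased toward 0, so the most frequent ciphertext byte at
    position 2 equals the plaintext byte with high probability."""
    return Counter(cipher_bytes).most_common(1)[0][0]

random.seed(1)
plaintext_byte = 0x41
samples = [plaintext_byte ^ rc4_keystream(
               [random.randrange(256) for _ in range(16)], 2)[1]
           for _ in range(15000)]
print(recover_second_byte(samples))  # recovers 0x41 with high probability
```

With 15000 samples the biased value stands several standard deviations above every other candidate, which mirrors how the real attacks trade ciphertext volume against success probability.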

Niederreiter Public Key Cryptosystem Based on Polar Codes

Journal of Cryptologic Research. 2018, 5 (6): 623-630. doi: 10.13868/j.cnki.jcr.000271

Quantum computers have made rapid progress in recent years, which is bound to impact traditional cryptography. Public-key cryptosystems based on error-correcting codes are considered a class of cryptosystems that can resist quantum attacks. This paper first reviews the research background of public-key cryptosystems based on various error-correcting codes, then introduces the SC fast decoding algorithm of polar codes in detail. Using the SC decoding algorithm, a Niederreiter public-key cryptosystem based on polar codes is designed. The security of the cryptosystem rests on the difficulty of large matrix decomposition and on the NP-completeness of decoding general linear block codes, both well-known hard problems. The designed cryptosystem is less complex and more efficient than Niederreiter public-key cryptosystems based on other error-correcting codes, and resists many kinds of attacks, such as equation-solving attacks, chosen-plaintext attacks, and chosen-ciphertext attacks. Its work factor is up to $2^{82}$, and it can be used in scenarios where the amount of data to be transmitted is small and the required security level is high.
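The core Niederreiter flow, ciphertext = syndrome of a low-weight error vector and decryption = syndrome decoding, can be sketched with a toy Hamming(7,4) code in place of a polar code. The scrambling matrices $S$ and $P$ of the real scheme are omitted, and all names are illustrative:

```python
# Parity-check matrix of the Hamming(7,4) code: column c (1-based)
# is the binary expansion of c, so a weight-1 error is located
# directly by its syndrome.
H = [[(c >> r) & 1 for c in range(1, 8)] for r in range(3)]

def syndrome(e):
    return [sum(h * x for h, x in zip(row, e)) % 2 for row in H]

def encrypt(e):
    """Niederreiter encryption: the plaintext is a low-weight error
    vector e (here weight <= 1); the ciphertext is its syndrome."""
    return syndrome(e)

def decrypt(s):
    """Syndrome decoding: the syndrome is the binary expansion of the
    (1-based) error position."""
    pos = s[0] + 2 * s[1] + 4 * s[2]
    e = [0] * 7
    if pos:
        e[pos - 1] = 1
    return e

e = [0, 0, 0, 0, 1, 0, 0]          # weight-1 plaintext
print(decrypt(encrypt(e)) == e)    # True
```

In the polar-code scheme the same roles are played by the scrambled parity-check matrix (public key) and the SC decoder (private decryption).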

Analysis of Impossible Differential Characteristics for SPECK Families of Block Ciphers

Journal of Cryptologic Research. 2018, 5 (6): 631-640. doi: 10.13868/j.cnki.jcr.000272

SPECK is a family of lightweight block ciphers published by the National Security Agency (NSA) of the USA in 2013. Xu Hong et al. found new 6-round impossible differential characteristics of SPECK 32/64 and SPECK 48/96 by analyzing the differential diffusion property of addition, and presented a 10-round impossible differential cryptanalysis of these two ciphers, which is the best impossible differential attack so far. This paper further analyzes the longest impossible differential characteristics of the SPECK family derivable from the differential diffusion property of addition. Based on the property given by Xu Hong et al., the differential diffusion of SPECK 32 in both encryption and decryption is analyzed. It is shown that the maximum number of rounds of impossible differential characteristics for SPECK 32 under this property is 6, and all 6-round impossible differential characteristics are given. Moreover, the result extends to SPECK $2n~(2n=48,64,96,128)$: by the same method, the maximum number of rounds of impossible differential characteristics for SPECK $2n$ under the differential diffusion property of addition is also 6, and all 6-round impossible differential characteristics are given.
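For reference, the SPECK 32/64 round function analyzed above consists of only a modular addition, two rotations, and XORs. A minimal sketch, checked against the designers' published test vector:

```python
MASK, ALPHA, BETA, ROUNDS = 0xFFFF, 7, 2, 22

def rol(v, r): return ((v << r) | (v >> (16 - r))) & MASK
def ror(v, r): return ((v >> r) | (v << (16 - r))) & MASK

def speck32_round(x, y, k):
    """One SPECK 32 round; x is the left word, y the right word."""
    x = ((ror(x, ALPHA) + y) & MASK) ^ k
    y = rol(y, BETA) ^ x
    return x, y

def expand_key(k0, l0, l1, l2):
    """SPECK 32/64 key schedule; it reuses the round-function structure."""
    ks, l = [k0], [l0, l1, l2]
    for i in range(ROUNDS - 1):
        l.append(((ks[i] + ror(l[i], ALPHA)) & MASK) ^ i)
        ks.append(rol(ks[i], BETA) ^ l[i + 3])
    return ks

def encrypt(x, y, round_keys):
    for k in round_keys:
        x, y = speck32_round(x, y, k)
    return x, y

# Designers' test vector: key 1918 1110 0908 0100, plaintext 6574 694c.
rk = expand_key(0x0100, 0x0908, 0x1110, 0x1918)
print([hex(w) for w in encrypt(0x6574, 0x694C, rk)])  # ['0xa868', '0x42f2']
```

The addition in `speck32_round` is the only nonlinear operation, which is why its differential diffusion property drives the impossible-differential analysis.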

A New Threshold Implementation of the S-box in SM4

Journal of Cryptologic Research. 2018, 5 (6): 641-650. doi: 10.13868/j.cnki.jcr.000273

Since its introduction, side-channel attack has posed a great threat to the security of cryptographic algorithms. As one of the typical side-channel techniques, the DPA attack has become the most popular and widely used method in the field because of its effectiveness and simple implementation. As a block cipher standard in China, the SM4 algorithm has attracted wide attention, and its security has become a research hotspot; shortly after its publication, SM4 was broken by researchers using a DPA attack, so the security of SM4 implementations faces severe challenges. This study proposes a new threshold implementation scheme for SM4 to resist second-order DPA attacks. In this scheme, the input of the S-box is transformed into a composite field via a normal basis, and a new S-box is then constructed using threshold implementation theory. By dividing the input into three shares, the new S-box resists second-order DPA attacks, and the implementation area is reduced by introducing a ring mask structure and a decomposition method for inversion. Analysis shows that the constructed S-box can effectively resist second-order DPA attacks. Experimental results show that, compared with conventional masking schemes based on composite fields, the area of the proposed scheme is reduced by 6\% and the required number of random masks remains low.
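The non-completeness idea behind threshold implementations, where each output share is computed without touching one of the input shares, can be shown on a single AND gate. This is the classic first-order three-share TI of Nikova et al., not the paper's ring-masked S-box:

```python
from itertools import product

def ti_and(x1, x2, x3, y1, y2, y3):
    """Three-share TI of z = x & y: share z_i never uses x_i or y_i
    (non-completeness), yet the shares XOR to the unshared AND."""
    z1 = (x2 & y2) ^ (x2 & y3) ^ (x3 & y2)
    z2 = (x3 & y3) ^ (x3 & y1) ^ (x1 & y3)
    z3 = (x1 & y1) ^ (x1 & y2) ^ (x2 & y1)
    return z1, z2, z3

# Correctness over all 64 sharings:
# z1 ^ z2 ^ z3 == (x1 ^ x2 ^ x3) & (y1 ^ y2 ^ y3).
ok = all(
    (lambda z: z[0] ^ z[1] ^ z[2] == (x1 ^ x2 ^ x3) & (y1 ^ y2 ^ y3))(
        ti_and(x1, x2, x3, y1, y2, y3))
    for x1, x2, x3, y1, y2, y3 in product((0, 1), repeat=6))
print(ok)  # True
```

Because no single share function sees all shares of any input, a first-order probe on one share leaks nothing about the unshared value; higher-order security, as targeted by the paper, needs additional structure such as the ring masks described above.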

Cryptanalysis of Two White-box SM4 Implementations

Journal of Cryptologic Research. 2018, 5 (6): 651-671. doi: 10.13868/j.cnki.jcr.000274

Traditional cryptographic algorithms are designed to be secure without considering the risks of the platforms they run on. In 2002, Chow et al. introduced the white-box context, in which attackers are assumed to have full control over the execution of a cryptographic algorithm: they can observe its internal states and even modify these values. This white-box model is very practical for evaluating cryptographic software used in untrusted environments, where a legitimate user may be a potential attacker. Traditional cryptographic algorithms are no longer secure in such a context, and protecting them is strongly required in practice, e.g., for digital rights protection and mobile device security. Using obfuscating encodings and lookup tables, Chow et al. constructed white-box implementations of AES and DES. Similarly, Xiao et al. constructed a white-box implementation of SM4, and Bai et al. proposed another white-box SM4 using more complex internal encodings/decodings and random numbers. This paper analyzes these two white-box implementations of SM4. It first points out a flaw in Lin's analysis of Xiao's white-box SM4: in counting encoding matrices and affine constants, the real value should be 61 200$\cdot2^{32}$ instead of the single value in their analysis. An improvement of Lin's method is then given: by adjusting the order in which affine constants are recovered, the improved method greatly reduces the attack complexity. For example, in recovering the affine constants of the external encodings of the lookup tables, the new method needs no more than $2^{10}$ table lookups, by first searching for equivalent keys and then determining the affine constants; this is far less than the $2^{46}$ in Lin's analysis. This study also presents a third-party analysis of Bai's white-box SM4, and points out that the size of the space of keys and external encodings is 61 200$\cdot 2^{128}$.
Analysis shows that both white-box SM4 implementations rely on the security of the affine constants in their external encodings; the linear transformation parts contribute very little to security, and merely complicating the internal encodings/decodings does not help. Furthermore, splitting affine matrices or constants increases diversity only at the expense of implementation difficulty and cannot effectively improve security. All these findings will be useful for analyzing and designing white-box ciphers.
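The table-encoding principle both designs build on (Chow-style $T = g \circ S \circ f^{-1}$, so the bare S-box never appears in memory) can be illustrated on a 4-bit toy; all names here are illustrative:

```python
import random

random.seed(0)
SBOX = list(range(16))
random.shuffle(SBOX)                       # toy 4-bit S-box

f = list(range(16)); random.shuffle(f)     # secret input encoding
g = list(range(16)); random.shuffle(g)     # secret output encoding
f_inv = [0] * 16
for x, y in enumerate(f):
    f_inv[y] = x

# Published white-box table: the composition g . SBOX . f^{-1}.
# An attacker sees only T, never SBOX, f, or g.
T = [g[SBOX[f_inv[u]]] for u in range(16)]

# A party holding the encodings evaluates SBOX through T:
g_inv = [0] * 16
for x, y in enumerate(g):
    g_inv[y] = x
print(all(g_inv[T[f[x]]] == SBOX[x] for x in range(16)))  # True
```

The analyses above work precisely because the encodings of real white-box SM4 designs are affine: once the affine constants of the external encodings are recovered, the remaining linear parts offer little resistance.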

Discovery and Analysis of High Frequency Words in Password Sets

Journal of Cryptologic Research. 2018, 5 (6): 671-679. doi: 10.13868/j.cnki.jcr.000275

Vocabulary is an important feature of a password set, but its quantitative analysis has not received sufficient attention. This paper first presents a method to discover high-frequency words (hot words) based on effective frequency, which effectively eliminates the interference of short words and improves the accuracy of discovery. In our hot-word discovery software, a compressed trie is used to improve performance. Using this method, hot words are collected from two password sets, CSDN and RockYou. Analysis of the top 100 words in these sets reveals different vocabulary characteristics between passwords of users from mainland China and those from Western countries, and correlation analysis with existing corpora further confirms the significant difference between the two password sets. This paper also presents a method to quantitatively measure the vocabulary difference between two password sets using cosine similarity; analysis of multiple password sets shows that this method can effectively distinguish password sets from different regions. The quantitative characterization of vocabulary in password sets is of great significance for constructing targeted corpora and improving the efficiency of password guessing.
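The cosine-similarity measure mentioned above is straightforward over hot-word frequency vectors; a minimal sketch with hypothetical word counts:

```python
import math
from collections import Counter

def cosine_similarity(freq_a, freq_b):
    """Cosine of the angle between two word-frequency vectors,
    indexed by the union of their vocabularies."""
    dot = sum(freq_a[w] * freq_b.get(w, 0) for w in freq_a)
    na = math.sqrt(sum(v * v for v in freq_a.values()))
    nb = math.sqrt(sum(v * v for v in freq_b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical hot-word counts from two password sets:
set_a = Counter({"love": 120, "dragon": 80, "hello": 40})
set_b = Counter({"love": 100, "monkey": 90, "hello": 30})
print(round(cosine_similarity(set_a, set_a), 3))  # 1.0
```

A similarity near 1 indicates two sets drawing on the same vocabulary, while regional password sets with different hot words score markedly lower.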

Construction and Application of 4 × 4 Formalized MDS Matrices

Journal of Cryptologic Research. 2018, 5 (6): 680-694. doi: 10.13868/j.cnki.jcr.000276

The $4\times4$ MDS matrices over $\textrm{GL}(4, F_{2})$ are widely used in the design of diffusion layers of block ciphers because they provide the largest possible number of differential branches and can thus resist differential and linear cryptanalysis. Since lightweight MDS matrices with few XOR operations are cheaper to implement in circuits, such matrices have important applications, and the need for them grows as cryptographic algorithms are increasingly deployed in lightweight settings such as the Internet of Things. Building on the construction of lightweight $4\times4$ MDS matrices, this paper introduces the concept of a formalized MDS matrix: some of the entries of the matrix are known, their positions are fixed, and the MDS criteria are satisfied. Under this definition, $4\times4$ formalized MDS matrices with only three entries are constructed, of the form $L_{i}[I, A, A^{d}]$, where $I$ is the $4\times4$ identity matrix, $i$ denotes the number of identity matrices in the formalized MDS matrix, $o(A)=15$, and $d = 2, 3, \cdots, 14$. Based on finite field theory, by traversing all $4\times4$ matrices of period 15 over $F_{2}$, a total of 18 432 lightweight MDS matrices with 13 XORs and 17 280 with 12 XORs are obtained. From these results, a general construction of formalized MDS matrices is given by increasing the number of entries and transforming the algebraic relations among them. Using this algorithm, a fast implementation of the lightest $4\times4$ MDS matrices, with 10 XORs over $\textrm{GL}(4, F_{2})$, is conducted, and the algebraic structures of these matrices are studied. Compared with a general exhaustive search, the presented method greatly reduces the search space and has higher accuracy and construction efficiency.
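The MDS criterion (differential branch number $n+1=5$ for a $4\times4$ matrix) can be checked by exhaustive search. This sketch works over the field $GF(2^4)$ with a Cauchy matrix, which is provably MDS, rather than over $\textrm{GL}(4,F_2)$ entries as in the paper:

```python
def gf_mul(a, b):
    """Multiplication in GF(2^4) modulo x^4 + x + 1."""
    r = 0
    for _ in range(4):
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x10:
            a ^= 0x13
        b >>= 1
    return r

MUL = [[gf_mul(a, b) for b in range(16)] for a in range(16)]

def gf_inv(a):
    return next(x for x in range(1, 16) if MUL[a][x] == 1)

# Cauchy construction M[i][j] = 1/(x_i + y_j): all square submatrices
# are nonsingular, so M is MDS by construction.
xs, ys = [0, 1, 2, 3], [4, 5, 6, 7]
M = [[gf_inv(x ^ y) for y in ys] for x in xs]

def branch_number(M):
    """min over nonzero inputs v of wt(v) + wt(Mv), nibble-wise weight."""
    best = 5  # upper bound n + 1
    for v in range(1, 16 ** 4):
        vec = [(v >> (4 * k)) & 0xF for k in range(4)]
        out = [0, 0, 0, 0]
        for i in range(4):
            row = M[i]
            for j in range(4):
                out[i] ^= MUL[row[j]][vec[j]]
        best = min(best, sum(x != 0 for x in vec) + sum(x != 0 for x in out))
    return best

print(branch_number(M))  # 5, i.e. MDS
```

The formalized-MDS search in the paper prunes this kind of exhaustive check by fixing entry positions and algebraic relations among the entries in advance.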

Authenticated Key Exchange from Supersingular Isogeny

Journal of Cryptologic Research. 2018, 5 (6): 695-704. doi: 10.13868/j.cnki.jcr.000277

This paper studies the supersingular isogeny cryptosystem, a promising post-quantum cryptosystem, and proposes a provably secure two-pass authenticated key exchange protocol over supersingular isogenies, inspired by the twin Diffie-Hellman problem. The proposed protocol is an MQV-style protocol and hence inherits the sound properties of the (H)MQV protocols. Its security is based on the hardness of computing isogenies between supersingular elliptic curves. This paper gives heuristic arguments about its security properties and formally proves its security in the authenticated-links adversarial model of Canetti-Krawczyk; to support the proof, a twin version of the supersingular isogeny assumption is proposed. Compared with lattice-based schemes, the proposed protocol has smaller keys but higher computational cost.

Copyright © Journal of Cryptologic Research