In cryptography, key size or key length refers to the number of bits in a key used by a cryptographic algorithm (such as a cipher).
Key length defines the upper bound on an algorithm's security (i.e. a logarithmic measure of the fastest known attack against the algorithm), because the security of all algorithms can be violated by brute-force attacks. Ideally, the lower bound on an algorithm's security is by design equal to the key length (that is, the algorithm's design does not detract from the degree of security inherent in the key length).
Most symmetric-key algorithms are designed to have security equal to their key length. However, after design, a new attack might be discovered. For instance, Triple DES was designed to have a 168-bit key, but an attack of complexity 2^112 is now known (i.e. Triple DES now only has 112 bits of security, and of the 168 bits in the key the attack has rendered 56 'ineffective' towards security). Nevertheless, as long as the security (understood as "the amount of effort it would take to gain access") is sufficient for a particular application, then it does not matter if key length and security coincide. This is important for asymmetric-key algorithms, because no such algorithm is known to satisfy this property; elliptic curve cryptography comes the closest with an effective security of roughly half its key length.
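To make the bookkeeping concrete, a minimal Python sketch (the figures are the ones quoted above; the script itself is purely illustrative, not part of any standard):

```python
import math

# Effective security is the base-2 logarithm of the cost of the fastest
# known attack. Triple DES: a 168-bit key, but an attack of complexity 2^112.
key_bits = 168
attack_cost = 2 ** 112

security_bits = math.log2(attack_cost)        # 112.0
ineffective_bits = key_bits - security_bits   # 56.0

print(f"{security_bits:.0f} bits of security; "
      f"{ineffective_bits:.0f} of {key_bits} key bits are 'ineffective'")
```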
Keys are used to control the operation of a cipher so that only the correct key can convert encrypted text (ciphertext) to plaintext. All commonly used ciphers are based on publicly known algorithms or are open source, so it is only the difficulty of obtaining the key that determines the security of the system, provided that there is no analytic attack (i.e. a "structural weakness" in the algorithms or protocols used), and assuming that the key is not otherwise available (such as via theft, extortion, or compromise of computer systems). The widely accepted notion that the security of the system should depend on the key alone has been explicitly formulated by Auguste Kerckhoffs (in the 1880s) and Claude Shannon (in the 1940s); the statements are known as Kerckhoffs' principle and Shannon's Maxim respectively.
A key should, therefore, be large enough that a brute-force attack (possible against any encryption algorithm) is infeasible, i.e. would take too long or require too much memory to execute. Shannon's work on information theory showed that to achieve so-called 'perfect secrecy', the key length must be at least as large as the message and only used once (this algorithm is called the one-time pad). In light of this, and the practical difficulty of managing such long keys, modern cryptographic practice has discarded the notion of perfect secrecy as a requirement for encryption, and instead focuses on computational security, under which the computational requirements of breaking an encrypted text must be infeasible for an attacker.
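As an illustration, a minimal one-time pad sketch in Python (illustrative only; real use demands a truly random key, at least as long as the message, never reused):

```python
import secrets

def otp_apply(data: bytes, key: bytes) -> bytes:
    # Perfect secrecy requires a uniformly random key at least as long
    # as the message, used exactly once. XOR is its own inverse, so the
    # same function both encrypts and decrypts.
    assert len(key) >= len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))  # fresh random key, same length
ciphertext = otp_apply(message, key)
assert otp_apply(ciphertext, key) == message
```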
Encryption systems are often grouped into families. Common families include symmetric systems (e.g. AES) and asymmetric systems (e.g. RSA and Elliptic-curve cryptography [ECC]). They may be grouped according to the central algorithm used (e.g. ECC and Feistel ciphers). Because each of these has a different level of cryptographic complexity, it is usual to have different key sizes for the same level of security, depending upon the algorithm used. For example, the security available with a 1024-bit key using asymmetric RSA is considered approximately equal in security to an 80-bit key in a symmetric algorithm.
The actual degree of security achieved over time varies, as more computational power and more powerful mathematical analytic methods become available. For this reason, cryptologists tend to watch for indicators that an algorithm or key length shows signs of potential vulnerability, so that they can move to longer key sizes or more difficult algorithms. For example, in 2007 a 1039-bit integer was factored with the special number field sieve using 400 computers over 11 months.[1] The factored number was of a special form; the special number field sieve cannot be used on RSA keys. The computation is roughly equivalent to breaking a 700-bit RSA key. However, this might be an advance warning that 1024-bit RSA keys used in secure online commerce should be deprecated, since they may become breakable in the foreseeable future. Cryptography professor Arjen Lenstra observed that "Last time, it took nine years for us to generalize from a special to a nonspecial, hard-to-factor number" and, when asked whether 1024-bit RSA keys are dead, said: "The answer to that question is an unqualified yes."[2]
The 2015 Logjam attack revealed additional dangers in using Diffie-Hellman key exchange when only one or a few common 1024-bit or smaller prime moduli are in use. This practice, somewhat common at the time, allows large amounts of communications to be compromised at the expense of attacking a small number of primes.[3] [4]
See main article: Brute-force attack. Even if a symmetric cipher is currently unbreakable by exploiting structural weaknesses in its algorithm, it may be possible to run through the entire space of keys in what is known as a brute-force attack. Because longer symmetric keys require exponentially more work to search by brute force, a sufficiently long symmetric key makes this line of attack impractical.
With a key of length n bits, there are 2^n possible keys. This number grows very rapidly as n increases. The large number of operations (2^128) required to try all possible 128-bit keys is widely considered out of reach for conventional digital computing techniques for the foreseeable future.[5] However, a quantum computer capable of running Grover's algorithm would be able to search the possible keys more efficiently: a suitably sized quantum computer would reduce a 128-bit key to 64 bits of security, roughly a DES equivalent. This is one of the reasons why AES supports a 256-bit key length.
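To see why 2^128 is considered out of reach, a rough Python calculation; the tested-keys-per-second rate is an arbitrary assumption chosen only for illustration:

```python
keyspace = 2 ** 128                 # possible 128-bit keys
rate = 10 ** 12                     # assumed keys tested per second (hypothetical)
seconds_per_year = 3600 * 24 * 365

years = keyspace / (rate * seconds_per_year)
print(f"{years:.2e} years")         # ~1.08e+19 years: far beyond feasibility
```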
IBM's Lucifer cipher was selected in 1974 as the base for what would become the Data Encryption Standard. Lucifer's key length was reduced from 128 bits to 56 bits, which the NSA and the National Bureau of Standards (NIST's predecessor) argued was sufficient for non-governmental protection at the time. The NSA has major computing resources and a large budget; some cryptographers, including Whitfield Diffie and Martin Hellman, complained that this made the cipher so weak that NSA computers would be able to break a DES key in a day through brute-force parallel computing. The NSA disputed this, claiming that brute-forcing DES would take them "something like 91 years".[6]
However, by the late 1990s, it became clear that DES could be cracked in a few days' time-frame with custom-built hardware such as could be purchased by a large corporation or government.[7] [8] The book Cracking DES (O'Reilly and Associates) recounts the successful 1998 effort to break 56-bit DES by a brute-force attack mounted by a cyber civil rights group with limited resources; see EFF DES cracker. Even before that demonstration, 56 bits was considered insufficient length for symmetric algorithm keys for general use. Because of this, DES was replaced in most security applications by Triple DES, which has 112 bits of security when using 168-bit (triple-key) keys.
The Advanced Encryption Standard published in 2001 uses key sizes of 128, 192 or 256 bits. Many observers consider 128 bits sufficient for symmetric algorithms of AES's quality until quantum computers become available. However, as of 2015, the U.S. National Security Agency has issued guidance that it plans to switch to quantum computing resistant algorithms and now requires 256-bit AES keys for data classified up to Top Secret.
In 2003, the U.S. National Institute of Standards and Technology (NIST) proposed phasing out 80-bit keys by 2015. By 2005, 80-bit keys were allowed only until 2010.[9]
Since 2015, NIST guidance says that "the use of keys that provide less than 112 bits of security strength for key agreement is now disallowed." NIST-approved symmetric encryption algorithms include three-key Triple DES and AES. Approvals for two-key Triple DES and Skipjack were withdrawn in 2015; the NSA's Skipjack algorithm used in its Fortezza program employs 80-bit keys.[10]
The effectiveness of public key cryptosystems depends on the intractability (computational and theoretical) of certain mathematical problems such as integer factorization. These problems are time-consuming to solve, but usually faster than trying all possible keys by brute force. Thus, asymmetric keys must be longer for equivalent resistance to attack than symmetric algorithm keys. The most common methods are assumed to be weak against sufficiently powerful quantum computers in the future.
Since 2015, NIST recommends a minimum of 2048-bit keys for RSA,[11] an update to the widely accepted recommendation of a 1024-bit minimum since at least 2002.[12]
1024-bit RSA keys are equivalent in strength to 80-bit symmetric keys, 2048-bit RSA keys to 112-bit symmetric keys, 3072-bit RSA keys to 128-bit symmetric keys, and 15360-bit RSA keys to 256-bit symmetric keys.[13] In 2003, RSA Security claimed that 1024-bit keys were likely to become crackable sometime between 2006 and 2010, while 2048-bit keys would be sufficient until 2030.[14] As of 2020, the largest RSA key publicly known to be cracked is RSA-250 with 829 bits.[15]
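These comparable strengths (per NIST SP 800-57; the ECC figures are the low end of NIST's ranges) can be expressed as a simple lookup table, sketched here in Python:

```python
# Comparable strengths per NIST SP 800-57 Part 1: symmetric security level
# (bits) -> minimum RSA/DH modulus size and ECC key size, both in bits.
COMPARABLE_STRENGTHS = {
    80:  {"rsa_dh": 1024,  "ecc": 160},
    112: {"rsa_dh": 2048,  "ecc": 224},
    128: {"rsa_dh": 3072,  "ecc": 256},
    192: {"rsa_dh": 7680,  "ecc": 384},
    256: {"rsa_dh": 15360, "ecc": 512},
}

def rsa_size_for(symmetric_bits: int) -> int:
    return COMPARABLE_STRENGTHS[symmetric_bits]["rsa_dh"]

print(rsa_size_for(128))  # 3072
```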
The Finite Field Diffie-Hellman algorithm has roughly the same key strength as RSA for the same key sizes. The work factor for breaking Diffie-Hellman is based on the discrete logarithm problem, which is related to the integer factorization problem on which RSA's strength is based. Thus, a 2048-bit Diffie-Hellman key has about the same strength as a 2048-bit RSA key.
Elliptic-curve cryptography (ECC) is an alternative set of asymmetric algorithms that is equivalently secure with shorter keys, requiring only approximately twice the bits of the equivalent symmetric algorithm. A 256-bit Elliptic-curve Diffie–Hellman (ECDH) key has approximately the same safety factor as a 128-bit AES key.[11] A message encrypted with an elliptic key algorithm using a 109-bit key was broken in 2004.[16]
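As a sketch of this equivalence in practice, an ECDH key agreement on the 256-bit curve P-256 (targeting the 128-bit security level), using the third-party pyca/cryptography package; the package is assumed installed, and the parameter choices here are illustrative:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates a key pair on a 256-bit curve (~128-bit security).
alice_priv = ec.generate_private_key(ec.SECP256R1())
bob_priv = ec.generate_private_key(ec.SECP256R1())

# ECDH: each side combines its private key with the other's public key.
shared_alice = alice_priv.exchange(ec.ECDH(), bob_priv.public_key())
shared_bob = bob_priv.exchange(ec.ECDH(), alice_priv.public_key())
assert shared_alice == shared_bob

# Derive a 256-bit AES key from the shared secret (overall strength is
# still capped at ~128 bits by the curve).
aes_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"illustrative-handshake").derive(shared_alice)
```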
The NSA previously recommended 256-bit ECC for protecting classified information up to the SECRET level, and 384-bit for TOP SECRET.[17] In 2015, it announced plans to transition to quantum-resistant algorithms by 2024, and until then recommends 384-bit keys for all classified information.[18]
The two best known quantum computing attacks are based on Shor's algorithm and Grover's algorithm. Of the two, Shor's offers the greater risk to current security systems.
Derivatives of Shor's algorithm are widely conjectured to be effective against all mainstream public-key algorithms including RSA, Diffie-Hellman and elliptic curve cryptography. According to Professor Gilles Brassard, an expert in quantum computing: "The time needed to factor an RSA integer is the same order as the time needed to use that same integer as modulus for a single RSA encryption. In other words, it takes no more time to break RSA on a quantum computer (up to a multiplicative constant) than to use it legitimately on a classical computer." The general consensus is that these public-key algorithms are insecure at any key size if sufficiently large quantum computers capable of running Shor's algorithm become available. The implication of this attack is that all data encrypted using current standards-based security systems, such as the ubiquitous SSL used to protect e-commerce and Internet banking and SSH used to protect access to sensitive computing systems, is at risk. Encrypted data protected using public-key algorithms can be archived and may be broken at a later time; this is commonly known as retroactive/retrospective decryption or "harvest now, decrypt later".
Mainstream symmetric ciphers (such as AES or Twofish) and collision resistant hash functions (such as SHA) are widely conjectured to offer greater security against known quantum computing attacks. They are widely thought most vulnerable to Grover's algorithm. Bennett, Bernstein, Brassard, and Vazirani proved in 1996 that a brute-force key search on a quantum computer cannot be faster than roughly 2^(n/2) invocations of the underlying cryptographic algorithm, compared with roughly 2^n in the classical case.[19] Thus in the presence of large quantum computers an n-bit key can provide at least n/2 bits of security. Quantum brute force is easily defeated by doubling the key length, which has little extra computational cost in ordinary use. This implies that at least a 256-bit symmetric key is required to achieve a 128-bit security rating against a quantum computer. As mentioned above, the NSA announced in 2015 that it plans to transition to quantum-resistant algorithms.
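The arithmetic behind this "quantum discount" is straightforward; an illustrative sketch:

```python
def security_bits(key_bits: int, quantum: bool) -> float:
    # Classical brute force needs ~2^n trials; Grover's algorithm needs
    # ~2^(n/2), so a quantum adversary sees only half the bits of security.
    return key_bits / 2 if quantum else key_bits

assert security_bits(128, quantum=True) == 64    # roughly DES-equivalent
assert security_bits(256, quantum=True) == 128   # doubling the key restores it
```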
In a 2016 Quantum Computing FAQ, the NSA affirmed:
In a 2022 press release, the NSA announced:
Since September 2022, the NSA has been transitioning from the Commercial National Security Algorithm Suite (now referred to as CNSA 1.0), originally launched in January 2016, to the Commercial National Security Algorithm Suite 2.0 (CNSA 2.0), both summarized below:[20]
CNSA 2.0
Algorithm | Function | Parameters
---|---|---
Advanced Encryption Standard (AES) | Symmetric block cipher for information protection | 256-bit keys
CRYSTALS-Kyber | Asymmetric algorithm for key establishment | Level V
CRYSTALS-Dilithium | Asymmetric algorithm for digital signatures | Level V
Secure Hash Algorithm (SHA) | Algorithm for computing a condensed representation of information | SHA-384 or SHA-512
Leighton-Micali Signature (LMS) | Asymmetric algorithm for digitally signing firmware and software | All parameters approved; SHA-256/192 recommended
Xtended Merkle Signature Scheme (XMSS) | Asymmetric algorithm for digitally signing firmware and software | All parameters approved
CNSA 1.0
Algorithm | Function | Parameters
---|---|---
Advanced Encryption Standard (AES) | Symmetric block cipher for information protection | 256-bit keys
Elliptic Curve Diffie-Hellman (ECDH) Key Exchange | Asymmetric algorithm for key establishment | Curve P-384
Elliptic Curve Digital Signature Algorithm (ECDSA) | Asymmetric algorithm for digital signatures | Curve P-384
Secure Hash Algorithm (SHA) | Algorithm for computing a condensed representation of information | SHA-384
Diffie-Hellman (DH) Key Exchange | Asymmetric algorithm for key establishment | Minimum 3072-bit modulus
Rivest-Shamir-Adleman (RSA) | Asymmetric algorithm for key establishment | Minimum 3072-bit modulus
Rivest-Shamir-Adleman (RSA) | Asymmetric algorithm for digital signatures | Minimum 3072-bit modulus