Encryption and Cryptography: Modern Technologies for Encrypting Electronic Data

An overview of the encryption algorithms in widespread use makes it possible not only to select the algorithm suited to your task, but also to estimate the cost of its implementation and the capabilities and demands that await the user.

Encryption is a method of protecting information

From time immemorial there has been no asset more valuable than information. The twentieth century is the century of informatics and informatization. Technology makes it possible to transmit and store ever larger amounts of information. This benefit has a downside: information is becoming more and more vulnerable, for various reasons:

    • growing volumes of stored and transmitted data;
    • an expanding circle of users with access to computer resources, programs and data;
    • increasingly complex operating modes of computing systems.

    The problem of protecting information from unauthorized access during transmission and storage is therefore becoming ever more important. The essence of this problem is the constant struggle of information security specialists with their "opponents".

    Characteristics of Composite Encryption Algorithms

    Information protection is a set of measures, methods and means that ensure:

    • prevention of unauthorized access to computer resources, programs and data;
    • checking the integrity of information;
    • exclusion of unauthorized use of programs (protection of programs from copying).

    The obvious trend towards the transition to digital methods of transmission and storage of information allows the use of unified methods and algorithms for the protection of discrete (text, fax, telex) and continuous (speech) information.

    A proven method of protecting information from tampering is encryption (cryptography). Encryption is the process of converting open data (plaintext) into encrypted data (ciphertext), or encrypted data back into open data, according to certain rules with the use of keys. In the English literature the terms are enciphering / deciphering.

    With the help of cryptographic methods, it is possible:

    • encryption of information;
    • implementation of an electronic signature;
    • distribution of encryption keys;
    • protection against accidental or deliberate modification of information.

    Certain requirements are imposed on encryption algorithms:

    • high level of data protection against decryption and possible modification;
    • information security should be based only on the secrecy of the key and should not depend on whether the algorithm is known (Kerckhoffs's rule);
    • a small change in the original text or key should lead to a significant change in the ciphertext ("collapse" effect);
    • the range of key values must exclude the possibility of decrypting the data by exhaustive search over the keys;
    • economical implementation of the algorithm with sufficient performance;
    • the cost of decrypting the data without knowing the key must exceed the cost of the data.

    Legends of deep antiquity ...

    Boris Obolikshto

    Cryptology is an ancient science, and this is usually emphasized with the story of Julius Caesar (100 - 44 BC), whose correspondence with Cicero (106 - 43 BC) and other "subscribers" in Ancient Rome was encrypted. The Caesar cipher, otherwise known as a cyclic substitution cipher, consists in replacing each letter of a message with the alphabet letter a fixed number of positions away from it. The alphabet is considered cyclic, that is, A follows Z. Caesar replaced each letter with the letter three positions further along the alphabet.
    Today it is customary in cryptology to operate not with letters but with the numbers corresponding to them. Thus, in the Latin alphabet we can use the numbers from 0 (corresponding to A) to 25 (Z). Denoting the number of the original character by x and that of the encoded one by y, we can write down the rule for applying a substitution cipher:

    y = x + z (mod N), (1)

    where z is the secret key, N is the number of characters in the alphabet, and addition modulo N is an operation similar to ordinary addition, with the only difference that if ordinary summation gives a result greater than or equal to N, the remainder of its division by N is taken.

    The Caesar cipher in the accepted notation corresponds to the secret key value z = 3 (Caesar Augustus used z = 4). Such ciphers are broken extremely simply even without knowing the key: it is enough to know only the encryption algorithm, and the key can be found by simple exhaustive search (a so-called brute-force attack). Cryptology consists of two parts: cryptography, which studies methods of encrypting and/or authenticating messages, and cryptanalysis, which considers ways of breaking ciphers and forging cryptograms. The weakness of the first ciphers for many centuries created an atmosphere of secrecy around the work of the cryptographer and slowed the development of cryptology as a science.
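    Rule (1) is easy to try out. A minimal Python sketch (our own illustration, not part of the original article) encrypts with Caesar's key z = 3 and then recovers the text by simply trying all keys:

```python
# Toy substitution cipher y = x + z (mod N) over the Latin alphabet (N = 26).
N = 26

def encrypt(text, z):
    return "".join(chr((ord(c) - ord('A') + z) % N + ord('A')) for c in text)

def decrypt(text, z):
    return encrypt(text, -z)

ciphertext = encrypt("VENIVIDIVICI", 3)   # Caesar's key z = 3
print(ciphertext)                          # YHQLYLGLYLFL

# Brute force: only N - 1 keys to try, so the cipher falls immediately.
for z in range(1, N):
    print(z, decrypt(ciphertext, z))
```

    The loop prints every candidate plaintext; the meaningful one is spotted at a glance, which is exactly why knowing only the algorithm suffices to break such a cipher.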

    For more than two thousand years, so-called "pre-scientific" cryptography semi-intuitively "groped" its way to quite a few interesting solutions. The simplest is to perform the substitution in non-alphabetical order. It is also a good idea to rearrange the characters of the message (permutation ciphers).

    The first systematic work on cryptography is considered to be that of the great architect Leon Battista Alberti (1404 - 1472). The period up to the middle of the 17th century is already full of works on cryptography and cryptanalysis. The intrigues surrounding ciphers in the Europe of that time are surprisingly interesting. Alas, limited by the magazine's space, we will single out only one name known from school - François Viète (1540 - 1603), who at the court of King Henry IV of France was so successful at cryptanalysis (which did not yet bear this proud name) that the Spanish king Philip II complained to the Pope about the French use of black magic. But everything passed without bloodshed - at the Pope's court there already served advisers from the Argenti family, whom today we would call cryptanalysts.

    It can be said that for centuries the deciphering of cryptograms has been aided by frequency analysis of the appearance of individual symbols and their combinations. The probabilities of individual letters appearing in a text vary greatly (in Russian, for example, the letter "o" appears 45 times more often than the letter "f"). This, on the one hand, serves as a basis both for recovering keys and for analyzing encryption algorithms, and on the other hand is the reason for the significant redundancy (in the information sense) of natural-language text. No simple substitution can hide the frequency with which a symbol appears: in a Russian text the symbols corresponding to the letters "o", "e", "a", "i", "t", "n" stick out like an awl from a sack. But information theory and the measure of redundancy had not yet been created, and to fight the cryptographer's enemy - frequency analysis - RANDOMIZATION was proposed. Its author, Karl Friedrich Gauss (1777 - 1855), mistakenly believed that he had created an unbreakable cipher.
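    The frequency analysis just described takes only a few lines of code: count the symbols of a ciphertext, and under any simple substitution the most frequent ones betray the most frequent letters of the language. A sketch (the sample text is our own):

```python
from collections import Counter

def letter_frequencies(text):
    """Relative frequency of each letter, ignoring non-letters."""
    letters = [c for c in text.upper() if c.isalpha()]
    counts = Counter(letters)
    total = len(letters)
    return {c: n / total for c, n in counts.most_common()}

# Any simple substitution preserves these frequencies, which is
# exactly what gives the cryptanalyst a foothold.
sample = "ATTACK AT DAWN AND HOLD THE BRIDGE"
for letter, freq in list(letter_frequencies(sample).items())[:3]:
    print(letter, round(freq, 2))
```

    Run on a ciphertext instead of a plaintext, the same histogram is compared against the known letter frequencies of the language to guess the substitution.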

    The next notable figure in the history of cryptology whom we should not miss is the Dutchman Auguste Kerckhoffs (1835 - 1903). We owe to him the remarkable "Kerckhoffs rule": the strength of a cipher must be determined ONLY by the secrecy of the key. Considering the time when this rule was formulated, it can be recognized as the greatest discovery (the systematic theory was still more than half a century away!). This rule assumes that the encryption ALGORITHM IS NOT SECRET, and hence an open discussion of its advantages and disadvantages can be held. Thus, this rule moved work on cryptology into the category of OPEN scientific work, admitting discussion, publication, and so on.

    20th century - from intuition to science

    The last name we will mention in pre-scientific cryptology is that of AT&T engineer G. S. Vernam. In 1926 he proposed a truly unbreakable cipher. The idea of the cipher is to choose a new value of z in equation (1) for each successive character. In other words, the secret key is used only once. If such a key is chosen at random, then, as Shannon rigorously proved 23 years later, the cipher is unbreakable. This cipher is the theoretical basis of the so-called "one-time pads", which came into widespread use during the Second World War. A one-time pad contains many single-use keys that are selected in turn when encrypting messages. Vernam's proposal, however, does not solve the problem of secret communication: instead of a way to transmit a secret message, one now has to find a way to transmit a secret key EQUAL TO IT IN LENGTH, that is, containing as many characters as there are in the plaintext.
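    In modern terms Vernam's idea reduces to XORing each unit of plaintext with a fresh random key unit. A minimal sketch (byte-oriented rather than character-oriented, for simplicity):

```python
import secrets

def otp_encrypt(plaintext: bytes):
    """Return (ciphertext, key); the key is as long as the message
    and must never be reused -- that is the whole point."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes):
    return bytes(c ^ k for c, k in zip(ciphertext, key))

msg = b"MEET AT NOON"
ct, key = otp_encrypt(msg)
assert otp_decrypt(ct, key) == msg
```

    Note that the key here is exactly as long as the message, which is precisely the key-distribution burden the text goes on to discuss.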

    In 1949, Claude Shannon's article "Communication Theory of Secrecy Systems" laid the foundation of scientific cryptology. Shannon showed that for some "random cipher" the number of ciphertext characters after which a cryptanalyst with unlimited resources can recover the key (and break the cipher) is

    H(Z) / (r log N), (2)

    where H(Z) is the entropy of the key, r is the redundancy of the plaintext, and N is the size of the alphabet.

    From the efficiency with which archivers compress text files, we know well how great the redundancy of plain text is - after all, removing redundancy is their job (and they remove only its most easily eliminated part). With a plaintext redundancy on the order of 0.75 and a 56-bit key (as DES assumes), 11 characters of ciphertext are sufficient to recover the key given unlimited cryptanalyst resources.


    Strictly speaking, relation (2) has not been proved for an arbitrary cipher, but it is true for certain special cases. A remarkable conclusion follows from (2): the cryptanalyst's work can be impeded not only by improving the cryptosystem but also by reducing the redundancy of the plaintext. Moreover, if the redundancy of the plaintext is reduced to zero, then even a short key yields a cipher that a cryptanalyst cannot break.
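    Relation (2) is easy to evaluate numerically. With a 56-bit key, redundancy about 0.75 and a 128-symbol alphabet it gives roughly the 11 ciphertext characters mentioned above (a back-of-the-envelope check, not a proof):

```python
import math

def unicity_distance(key_entropy_bits, redundancy, alphabet_size):
    """Ciphertext characters needed, per Shannon: H(Z) / (r * log2 N)."""
    return key_entropy_bits / (redundancy * math.log2(alphabet_size))

# 56-bit key (DES), plaintext redundancy ~0.75, 128-symbol alphabet:
print(round(unicity_distance(56, 0.75, 128)))  # about 11 characters
```

    The formula also makes the conclusion above tangible: as the redundancy r tends to zero, the number of characters needed grows without bound.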

    Before encryption, information should therefore be subjected to statistical coding (compression, archiving). This reduces the volume of information and its redundancy and raises the entropy (the average amount of information per symbol). Since a compressed text contains no repeated letters or words, decryption (cryptanalysis) is made more difficult.

    Classification of encryption algorithms

    1. Symmetric (secret-key, single-key).
    1.1. Stream ciphers (encryption of a data stream):

    • with a one-time or infinite key (infinite-key cipher);
    • with a finite key (Vernam system);
    • based on a pseudo-random number generator (PRNG).

    1.2. Block ciphers (block-by-block data encryption):
    1.2.1. Permutation ciphers (transpositions, P-boxes);
    1.2.2. Substitution ciphers (substitutions, S-boxes):

    • monoalphabetic (Caesar cipher);
    • polyalphabetic (Vigenère cipher, Jefferson cylinder, Wheatstone disk, Enigma);

    1.2.3. Composite ciphers (Table 1):

    • Lucifer (IBM, USA);
    • DES (Data Encryption Standard, USA);
    • FEAL-1 (Fast Enciphering Algorithm, Japan);
    • IDEA/IPES (International Data Encryption Algorithm / Improved Proposed Encryption Standard, Ascom-Tech AG, Switzerland);
    • B-Crypt (British Telecom, UK);
    • GOST 28147-89 (USSR);
    • Skipjack (USA).

    2. Asymmetric (public-key):

    • Diffie-Hellman (DH);
    • Rivest-Shamir-Adleman (RSA);
    • ElGamal.

    In addition, there is a division of encryption algorithms into actual ciphers and codes. Ciphers work with individual bits, letters, symbols. Codes operate with linguistic elements (syllables, words, phrases).

    Symmetric encryption algorithms

    Symmetric encryption algorithms (or cryptography with secret keys) are based on the fact that the sender and recipient of information use the same key. This key must be kept secret and transmitted in a way that prevents it from being intercepted.

    Information exchange is carried out in three stages:

    • the sender transmits the key to the recipient (in a network with several subscribers, each pair of subscribers must have its own key, different from the keys of the other pairs);
    • the sender encrypts the message with the key and forwards it to the recipient;
    • the recipient receives the message and decrypts it with the same key.

    Using a unique key for each day and each communication session further increases the security of the system.

    Stream ciphers

    In stream ciphers, that is, when encrypting a data stream, each bit of the original information is encrypted independently of the others by means of a gamma.

    Gamma superposition (gamming) is the imposition of a cipher gamma (a random or pseudo-random sequence of ones and zeros) on open data according to a specific rule. The rule commonly used is "exclusive OR", also called modulo-2 addition and implemented in assembly programs with the XOR instruction. For decryption, the same gamma is superimposed on the encrypted data.

    With a single use of a random gamma of the same size as the encrypted data, breaking the cipher is impossible (so-called cryptosystems with a one-time or infinite key). Here "infinite" means that the gamma never repeats.

    In some stream ciphers the key is shorter than the message. Thus, in the Vernam telegraph system a paper tape loop containing the gamma is used. Of course, the strength of such a cipher is not ideal.

    Clearly, exchanging keys the size of the information to be encrypted is not always appropriate. Therefore, a gamma obtained from a pseudo-random number generator (PRNG) is used more often. In this case the key is the seed (initial value, initializing value, IV) used to start the PRNG. Each PRNG has a period after which the generated sequence repeats. Obviously, the period of the pseudo-random gamma must exceed the length of the information being encrypted.

    A PRNG is considered cryptographically sound if observation of fragments of its output does not allow the missing parts or the whole sequence to be recovered when the algorithm is known but the initial value is not.
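    A gamma generated from a short key can be sketched as follows. Python's general-purpose `random` generator stands in for the PRNG purely for illustration: it is NOT cryptographically sound in the sense just described.

```python
import random

def prng_gamma_crypt(data: bytes, key: int) -> bytes:
    """Encrypt/decrypt by XOR with a PRNG-generated gamma.
    The same call inverts itself, since (x ^ g) ^ g == x."""
    rng = random.Random(key)          # key = seed (initializing value)
    gamma = bytes(rng.randrange(256) for _ in range(len(data)))
    return bytes(d ^ g for d, g in zip(data, gamma))

ct = prng_gamma_crypt(b"stream cipher demo", key=20147)
assert prng_gamma_crypt(ct, key=20147) == b"stream cipher demo"
```

    Only the short seed has to be shared; both sides regenerate the same gamma from it, which is exactly the trade-off described in the text.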

    When using a PRNG, several options are possible:

    • Bit-by-bit encryption of the data stream. The digital key is used as the initial value of the PRNG, and the output bit stream is added modulo 2 to the original information. Such systems lack the error-propagation property.
    • Bit-by-bit encryption of the data stream with feedback from the ciphertext. This system is similar to the previous one, except that the ciphertext is fed back as a parameter into the PRNG. Error propagation is characteristic; its range depends on the structure of the PRNG.
    • Bit-by-bit encryption of the data stream with feedback from the source text. The PRNG is driven by the original information. Unlimited error propagation is characteristic.
    • Bit-by-bit encryption of the data stream with feedback from both the ciphertext and the source text.

    Block ciphers

    With block encryption, information is divided into blocks of fixed length and encrypted block by block. Block ciphers are of two main types:

    • permutation ciphers (transpositions, P-boxes);
    • substitution ciphers (substitutions, S-boxes).

    Permutation ciphers rearrange the elements of the open data (bits, letters, symbols) into a new order. There are ciphers of horizontal, vertical and double permutation, lattices, labyrinths, slogans, etc.

    Substitution ciphers replace elements of the open data with other elements according to a specific rule. One distinguishes ciphers of simple, complex and pair substitution, alphanumeric encryption and column substitution ciphers. Substitution ciphers fall into two groups:

    • monoalphabetic (Caesar cipher);
    • polyalphabetic (Vigenère cipher, Jefferson cylinder, Wheatstone disk, Enigma).

    In monoalphabetic substitution ciphers, a letter of the original text is always replaced by the same predetermined letter. For example, in the Caesar cipher a letter is replaced by the letter a certain number of positions away from it in the Latin alphabet. Obviously, such a cipher is very easy to break: it is enough to count how often each letter occurs in the ciphertext and compare the result with the letter frequencies known for each language.

    In polyalphabetic substitutions, to replace a certain character of the original message in each case of its occurrence, different characters from a certain set are sequentially used. It is clear that this set is not infinite, after a certain number of characters it must be used again. This is the weakness of purely polyalphabetic ciphers.

    Modern cryptographic systems, as a rule, use both methods of encryption (substitution and permutation). Such a cipher is called a product (composite) cipher. It is more secure than a cipher using only substitutions or only permutations.

    Block encryption can be done in two ways:

    • Without feedback. Several bits (a block) of the source text are encrypted simultaneously, and each bit of the source text affects every bit of the ciphertext. However, the blocks do not influence one another, so two identical blocks of source text produce identical ciphertext. Such algorithms can therefore safely be used only to encrypt a random sequence of bits (for example, keys). Examples are DES in ECB mode and GOST 28147-89 in simple substitution mode.
    • With feedback. Typically the feedback is organized as follows: the previous ciphertext block is added modulo 2 to the current block before encryption. An initialization value is used as the first block in the feedback chain. An error in one bit affects two blocks - the erroneous one and the next. An example is DES in CBC mode.

    A PRNG can also be used for block encryption:

    1. Block-by-block encryption of the data stream. The encryption of successive blocks (substitutions and permutations) depends on a key-controlled PRNG.
    2. Block-by-block encryption of the data stream with feedback. The PRNG is controlled by the ciphertext, by the source text, or by both.
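    The contrast between the two ways is easy to demonstrate with a toy "block cipher" (a mere XOR with the key, standing in for a real cipher): without feedback, equal plaintext blocks yield equal ciphertext blocks, while CBC-style chaining hides the repetition.

```python
BLOCK = 8

def toy_block_encrypt(block: bytes, key: bytes) -> bytes:
    # Stand-in for a real block cipher: XOR with the key.
    return bytes(b ^ k for b, k in zip(block, key))

def ecb_encrypt(data: bytes, key: bytes):
    blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
    return [toy_block_encrypt(b, key) for b in blocks]

def cbc_encrypt(data: bytes, key: bytes, iv: bytes):
    blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
    out, prev = [], iv
    for b in blocks:
        # Chain: XOR with the previous ciphertext block, then encrypt.
        mixed = bytes(x ^ y for x, y in zip(b, prev))
        prev = toy_block_encrypt(mixed, key)
        out.append(prev)
    return out

data = b"SAMEBLOKSAMEBLOK"            # two identical 8-byte blocks
key, iv = b"K" * BLOCK, b"\x01" * BLOCK
ecb = ecb_encrypt(data, key)
cbc = cbc_encrypt(data, key, iv)
print(ecb[0] == ecb[1])               # True  - ECB leaks the repetition
print(cbc[0] == cbc[1])               # False - CBC hides it
```

    This repetition leak is precisely why encryption without feedback is recommended only for random data such as keys.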

    The US federal standard DES (Data Encryption Standard) is very widespread, on which the international standard ISO 8372-87 is based. DES has been endorsed by the American National Standards Institute (ANSI) and recommended for use by the American Bankers Association (ABA). DES provides 4 modes of operation:

    • ECB (Electronic Codebook) electronic codebook;
    • CBC (Cipher Block Chaining) block chain;
    • CFB (Cipher Feedback) ciphertext feedback;
    • OFB (Output Feedback) output feedback.

    GOST 28147-89 is the Russian standard for data encryption. The standard includes three algorithms for encrypting (decrypting) data - simple substitution mode, gamma mode and gamma-with-feedback mode - as well as a mode for generating an imitation insert (MAC).

    The imitation insert makes it possible to detect accidental or deliberate modification of the encrypted information. It can be generated either before encryption (or after decryption) of the entire message, or simultaneously with encryption (decryption) block by block. A block of information is encrypted with the first sixteen rounds of the simple substitution mode, then added modulo 2 to the second block, the sum is again encrypted with the first sixteen rounds, and so on.
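    The chaining just described - encrypt a block, add the next block modulo 2, encrypt again - can be sketched as follows. The `toy_rounds` function is our own placeholder, not the real GOST round function:

```python
BLOCK = 8  # bytes; GOST 28147-89 uses 64-bit (8-byte) blocks

def toy_rounds(block: bytes, key: bytes) -> bytes:
    """Placeholder for 'the first sixteen rounds of simple substitution'."""
    return bytes((b + k) % 256 for b, k in zip(block, key))

def imitation_insert(data: bytes, key: bytes) -> bytes:
    blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
    state = bytes(BLOCK)
    for b in blocks:
        b = b.ljust(BLOCK, b"\x00")
        # Add the block modulo 2 to the running state, then encrypt.
        state = toy_rounds(bytes(x ^ y for x, y in zip(state, b)), key)
    return state

key = b"01234567"
tag = imitation_insert(b"some protected message", key)
# Modifying the message changes the tag, exposing the tampering:
assert tag != imitation_insert(b"some protected messagf", key)
```

    The recipient recomputes the insert with the shared key and compares it with the one received; any mismatch signals that the message was altered.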

    The encryption algorithms of GOST 28147-89 possess the advantages of the other symmetric-system algorithms and surpass them in capability. Thus, GOST 28147-89 (256-bit key, 32 rounds of encryption), compared with algorithms such as DES (56-bit key, 16 rounds) and FEAL-1 (64-bit key, 4 rounds), has higher cryptographic strength thanks to its longer key and larger number of rounds.

    It should be noted that, unlike DES, in GOST 28147-89, the substitution block can be arbitrarily changed, that is, it is an additional 512-bit key.

    Gamma algorithms GOST 28147-89 (256-bit key, 512-bit substitution block, 64-bit initialization vector) are superior in cryptographic strength to the B-Crypt algorithm (56-bit key, 64-bit initialization vector).

    The advantages of GOST 28147-89 also include protection against the imposition of false data (generation of an imitation insert) and the use of the same encryption round in all four GOST modes.

    Block algorithms can also be used to generate a gamma. In this case the gamma is generated block by block and added block by block modulo 2 to the source text. Examples are B-Crypt, DES in CFB and OFB modes, and GOST 28147-89 in its gamma and gamma-with-feedback modes.

    Asymmetric encryption algorithms

    In asymmetric encryption algorithms (or public key cryptography), one key (public) is used to encrypt information, and another (secret) key is used to decrypt information. These keys are different and cannot be derived from one another.

    The information exchange scheme is as follows:

    • the recipient computes the public and secret keys, keeps the secret key secret and makes the public key available (informs the sender, a group of network users, publishes it);
    • the sender, using the recipient's public key, encrypts the message, which is forwarded to the recipient;
    • the recipient receives the message and decrypts it using their secret key.

    RSA

    Protected by US patent No. 4405829. Developed in 1977 at the Massachusetts Institute of Technology (USA) and named after the first letters of its authors' surnames (Rivest, Shamir, Adleman). Its cryptographic strength is based on the computational complexity of factoring a large number into prime factors.
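    The principle is easy to show with deliberately tiny textbook numbers (real RSA moduli are hundreds of digits long):

```python
# Toy RSA with tiny primes -- for illustration only.
p, q = 61, 53
n = p * q                  # 3233, the public modulus
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # secret exponent: d*e = 1 (mod phi)

m = 65                               # message, as a number < n
c = pow(m, e, n)                     # encryption with the PUBLIC key
assert pow(c, d, n) == m             # decryption with the SECRET key

# Security rests on the difficulty of recovering p and q from n.
```

    Anyone can encrypt with (e, n); only the holder of d can decrypt, and deriving d from (e, n) requires factoring n.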

    ElGamal

    Developed in 1985 and named after its author, Taher ElGamal. It is used in the US Digital Signature Standard (DSS). Its cryptographic strength is based on the computational complexity of taking discrete logarithms in finite fields.

    Comparison of symmetric and asymmetric encryption algorithms

    In asymmetric systems, long keys (512 bits or more) must be used. A long key will dramatically increase the encryption time. In addition, key generation is very time consuming. But you can distribute keys through unprotected channels.

    Symmetric algorithms use shorter keys, which means that encryption is faster. But in such systems, key distribution is difficult.

    Therefore, when designing a protected system, both symmetric and asymmetric algorithms are often used together. Since a public-key system also allows keys for symmetric systems to be distributed, the two kinds of algorithm can be combined in a system for transmitting protected information: the asymmetric one is used to distribute the keys, the symmetric one to actually encrypt the transmitted information.

    Information exchange can be carried out as follows:

    • the recipient computes the public and secret keys, keeps the secret key secret and makes the public key available;
    • the sender, using the recipient's public key, encrypts a session key, which is sent to the recipient over an unsecured channel;
    • the recipient receives the session key and decrypts it using their secret key;
    • the sender encrypts the message with the session key and sends it to the recipient;
    • the recipient receives the message and decrypts it.

    It should be noted that government and military communication systems use only symmetric algorithms, since there is no rigorous mathematical justification of the security of public-key systems (though the opposite has not been proved either).
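    The combined exchange described above can be sketched with toy components: textbook RSA numbers for the key transport, and a simple XOR cipher of our own standing in for the symmetric algorithm.

```python
import secrets

# Toy hybrid scheme: RSA-style key transport + XOR "symmetric" cipher.
# All numbers are illustrative; a real system would use a vetted library.
p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))          # recipient's secret key

def xor_crypt(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Sender: pick a session key, send it encrypted with the public key (e, n).
session_key = secrets.randbelow(n - 2) + 2
wrapped = pow(session_key, e, n)

# Recipient: unwrap the session key with the secret key d.
unwrapped = pow(wrapped, d, n)
assert unwrapped == session_key

# Both sides now share the session key for fast symmetric encryption.
key_bytes = session_key.to_bytes(2, "big")
ct = xor_crypt(b"the actual message", key_bytes)
assert xor_crypt(ct, key_bytes) == b"the actual message"
```

    The slow asymmetric operation is performed once, on the short session key; the bulk of the data then travels under the fast symmetric cipher.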

    Information authentication. Digital signature

    When transferring information, the following must be provided together or separately:

    • Confidentiality - an attacker should not be able to find out the content of the transmitted message.
    • Authenticity, which includes two concepts:
    1. integrity - the message must be protected from accidental or deliberate changes;
    2. identification of the sender (verification of authorship) - the recipient must be able to verify who sent the message.

    Encryption can provide confidentiality and, in some systems, integrity.

    The integrity of a message is checked by computing a check function of the message - a number of small length. This check function should, with high probability, change even under small changes to the message (deletion, insertion, rearrangement or reordering of information). The check function goes by different names and is calculated in different ways:

    • message authentication code (MAC);
    • Quadratic Congruential Manipulation Detection Code (QCMDC);
    • Manipulation Detection Code (MDC);
    • Message Digest Algorithm (MD5);
    • checksum;
    • Block Check Character (BCC);
    • Cyclic Redundancy Check (CRC);
    • hash function;
    • imitation insert in GOST 28147-89;
    • algorithm with truncation to n bits (n-bit Algorithm with Truncation).

    Any encryption algorithm can be used when computing the check function. It is also possible to encrypt the checksum itself.

    The digital signature is widely used: a digital supplement to the transmitted information that guarantees its integrity and allows its authorship to be verified. Digital signature schemes based on symmetric encryption algorithms are known, but with public-key systems the digital signature is more convenient.

    To use the RSA algorithm, the message is first compressed by a hash function (MD5 - Message Digest Algorithm - yields a 128-bit hash H). The message signature S is computed as follows:

    S = H^d mod n

    The signature is sent along with the message.

    Verification consists in computing the hash function of the received message (H') and comparing it with

    H = S^e mod n,

    where H is the hash of the message,
    S is its signature,
    d is the secret key,
    e is the public key.
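    The sign/verify pair can be sketched with tiny textbook RSA parameters and a toy hash of our own standing in for MD5:

```python
p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))

def toy_hash(message: bytes) -> int:
    """Stand-in for MD5, reduced mod n -- for illustration only."""
    h = 0
    for byte in message:
        h = (h * 257 + byte) % n
    return h

def sign(message: bytes) -> int:
    return pow(toy_hash(message), d, n)       # S = H^d mod n

def verify(message: bytes, s: int) -> bool:
    return pow(s, e, n) == toy_hash(message)  # check H' == S^e mod n

sig = sign(b"payment order No 7")
assert verify(b"payment order No 7", sig)
# A substituted message fails: toy_hash(b"A") = 65, toy_hash(b"B") = 66.
assert not verify(b"B", sign(b"A"))
```

    Only the holder of d can produce a valid S, while anyone with (e, n) can check it, which is what makes the signature verifiable by third parties.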

    The following standards are devoted to authentication:

    • authentication - ISO 8730-90, ISO/IEC 9594-90 and ITU X.509;
    • integrity - GOST 28147-89, ISO 8731-90;
    • digital signature - ISO 7498, GOST R 34.10-94 (Russia), DSS (Digital Signature Standard, USA).

    ISO is the International Organization for Standardization; ITU is the International Telecommunication Union.

    Implementation of encryption algorithms

    Encryption algorithms are implemented in software or in hardware. There is a great variety of purely software implementations of various algorithms. Thanks to their low cost (some are completely free), the ever-increasing speed of PC processors, ease of use and reliability, they are very competitive. A well-known example is the Diskreet program from the Norton Utilities package, which implements DES.

    We cannot fail to mention the PGP package (Pretty Good Privacy, version 2.1, by Philip Zimmermann), which comprehensively solves practically all the problems of protecting transmitted information. It applies data compression before encryption, powerful key management, symmetric (IDEA) and asymmetric (RSA) encryption algorithms, computation of a check function for the digital signature, and reliable key generation.

    The publications of the Monitor magazine with detailed descriptions of various algorithms and corresponding listings enable everyone who wishes to write their own program (or use a ready-made listing).

    Hardware implementation of the algorithms is possible with specialized chips (crystals are produced for the DH, RSA, DES, Skipjack and GOST 28147-89 algorithms) or with general-purpose components (thanks to their low cost and high speed, digital signal processors - DSPs - are promising).

    Among Russian developments, the "Krypton" boards (from the Ankad firm) and the "Grim" board (methodology and algorithms from the LAN-Crypto firm, technical development by the ELiPS Scientific and Production Center) deserve mention.

    "Krypton" boards are single-board devices using cryptoprocessors (specialized 32-bit microcomputers, also called "Blooming"). The Blooming processors implement the GOST 28147-89 algorithms in hardware; they consist of a computing unit and RAM for storing keys. The cryptoprocessor has three areas for storing keys, which makes it possible to build multi-level key systems.

    For greater encryption reliability, two cryptoprocessors work simultaneously, and a 64-bit data block is considered correctly encrypted only if the outputs of both Blooming units match. The encryption speed is 250 KB/s.

    In addition to the two Blooming processors, the board contains:

    • a controller for interfacing with the computer bus (all boards except "Krypton-EC" are designed for the ISA bus);
    • the board's BIOS, intended for interfacing with the computer, self-testing the device and entering keys into the cryptoprocessors;
    • a hardware random number generator for producing encryption keys, built on noise diodes.

    The following types of "Krypton" boards are produced:

    • "Krypton-EC" is intended for PC series EC 1841-1845;
    • "Krypton-3";
    • "Krypton-4" (the overall dimensions are reduced due to the movement of a number of discrete elements into the base crystals, the exchange speed is increased due to the internal buffer by 8 bytes);
    • "Krypton-IK" is additionally equipped with a smart-card (IC card) controller.

    In the "Krypton-EC", "Krypton-3" and "Krypton-4" devices, keys are stored as a file on a floppy disk. In "Krypton-IK" the keys are held on a smart card, which makes counterfeiting and copying difficult.

    The "Grim" board uses Analog Devices ADSP-2105 and ADSP-2101 digital signal processors, giving encryption rates of 125 and 210 KB/s respectively. The board contains RAM and ROM with programs for the initial test, access-rights checking, and key loading and generation. Keys are stored on a non-standard formatted diskette. The board implements the GOST 28147-89 and digital signature algorithms.

    To protect information transmitted over communication channels, channel encryption devices are used, manufactured as an interface card or a stand-alone module. The encryption speed of the various models ranges from 9600 bit/s to 35 Mbit/s.

    In conclusion, we note that encrypting information is not a panacea. It should be considered only as one of the methods of information protection and must be applied in combination with legislative, organizational and other measures.

    Public Key Cryptology

    Boris Obolikshto

    It would seem that the impetus given by Shannon should have produced an avalanche of results in scientific cryptology. That did not happen. Only the rapid development of telecommunications and of remote access to computers, combined with the imperfection of the existing secret-key cryptosystems, brought to life the next and perhaps most interesting stage of cryptology, which dates from the article "New Directions in Cryptography" by Whitfield Diffie and Martin E. Hellman, published in November 1976. W. Diffie himself dates the results published in November 1976 to May of that year; thus, from May to November we have an occasion to celebrate the TWENTIETH ANNIVERSARY of public-key cryptology.

    One problem that remained unsolved in traditional cryptography is the distribution of secret keys. The idea of transmitting a "secret" key over an open channel seems insane at first glance, but if one gives up perfect secrecy and settles for practical strength, a way of exchanging keys can be devised.

    The exponential key exchange was the first of the widespread methods. Its essence is as follows:

    • Alice and Bob (using pretty Alice and Bob as the parties, rather than the abstract "A" and "B", has become a tradition in this area of cryptology) choose random numbers Xa and Xb, respectively.
    • Alice sends Bob Ya = a^Xa (mod q), and Bob sends Alice Yb = a^Xb (mod q).

    Here a is a so-called primitive element of the finite Galois field GF(q); its remarkable property, for our purposes, is that its powers yield all nonzero elements of the field. The secret key is the value

    K = a^(Xa·Xb) (mod q),

    which Alice obtains by raising the number received from Bob to the power Xa, known only to her, and Bob by raising the number received from Alice to the power Xb, known only to him. The cryptanalyst is forced to compute the logarithm of at least one of the transmitted numbers.

    The strength of exponential key exchange rests on the so-called one-wayness of the exponentiation function: computing Ya from Xa, for q about 1000 bits long, takes about 2000 multiplications of 1000-bit numbers, while the inverse operation would require about 10^30 operations. One-way functions, with this asymmetry between the computational complexity of the forward and inverse problems, play a leading role in public-key cryptography.
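    The exchange described above can be sketched in a few lines of Python. The prime q and the element a below are tiny toy values chosen here purely for illustration; real systems use numbers hundreds of digits long.

```python
# Toy exponential key exchange (Diffie-Hellman style).
import random

q = 2579          # a small prime, illustration only
a = 2             # assumed generator for this sketch

xa = random.randrange(1, q - 1)   # Alice's secret exponent
xb = random.randrange(1, q - 1)   # Bob's secret exponent

ya = pow(a, xa, q)  # Alice sends Ya = a^Xa mod q over the open channel
yb = pow(a, xb, q)  # Bob sends  Yb = a^Xb mod q over the open channel

k_alice = pow(yb, xa, q)  # Alice computes (Yb)^Xa = a^(Xa*Xb) mod q
k_bob   = pow(ya, xb, q)  # Bob computes   (Ya)^Xb = a^(Xa*Xb) mod q

assert k_alice == k_bob   # both sides now share the same secret key
```

The eavesdropper sees only q, a, Ya, and Yb; recovering the shared key means taking a discrete logarithm, which is what makes the scheme practically strong at realistic sizes.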

    Even more interesting is the one-way function with a secret trapdoor ("loophole"). The idea is to build a function that can be inverted only by someone who knows a certain "loophole", the secret key. The function's parameters then serve as a public key, which Alice can send to Bob over an insecure channel; Bob uses the public key to perform encryption (computing the forward function) and sends the result back to Alice over the same channel; Alice, knowing the "loophole", easily computes the inverse function, while the cryptanalyst, not knowing the secret key, is doomed to solve a much harder problem.

    In 1976, R.C. Merkle managed to construct such a function from the knapsack packing problem. The problem itself is one-way: knowing which subset of weights was packed into the knapsack, it is easy to compute the total weight; but knowing the weight, it is not easy to determine the subset. In this case a one-dimensional version of the problem was used: a vector of weights and the sums of the components of its subvectors. Building in a "loophole" produced the so-called Merkle-Hellman knapsack system. The first public-key cryptosystem was up and running, and Merkle offered $100 to anyone who could break it.
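    A minimal sketch of the knapsack idea with a built-in loophole. The weights, modulus, and multiplier below are made-up toy values, not the original Merkle-Hellman parameters: the secret weights are superincreasing (each exceeds the sum of all before it), and a modular multiplication disguises them into a public, hard-looking knapsack.

```python
# Secret, "easy" knapsack: superincreasing weights.
w = [2, 7, 11, 21, 42, 89, 180, 354]
m = 881                                 # secret modulus, m > sum(w)
t = 588                                 # secret multiplier, gcd(t, m) == 1
b = [(t * wi) % m for wi in w]          # public, "hard" knapsack

def encrypt(bits):
    # Public operation: total weight of the chosen public weights.
    return sum(bi for bi, x in zip(b, bits) if x)

def decrypt(c):
    # The "loophole": undo the modular disguise, then solve the
    # easy superincreasing knapsack greedily from the largest weight.
    c = (c * pow(t, -1, m)) % m
    bits = [0] * len(w)
    for i in reversed(range(len(w))):
        if w[i] <= c:
            bits[i] = 1
            c -= w[i]
    return bits

msg = [1, 0, 1, 1, 0, 0, 1, 0]
assert decrypt(encrypt(msg)) == msg
```

Without knowing t and m, the attacker faces a general subset-sum instance; with them, decryption is a simple greedy pass, which is exactly the trapdoor asymmetry the text describes.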

    The award went to A. Shamir six years later, after he published in March 1982 a report on breaking the single-iteration Merkle-Hellman knapsack system. At the Crypto '82 conference, L. Adleman demonstrated the break on an Apple II computer. Here lies one of the great dangers for public-key cryptography: there is no rigorous proof of the one-wayness of the algorithms used, so no one is guaranteed against the discovery of a decryption method that does not require solving the inverse problem, whose high complexity is what allows us to hope for the practical strength of the cipher. It is fortunate when a system is broken by a scientist of world renown (in 1982 A. Shamir was already known as one of the authors of the RSA system).

    To finish the knapsack-system drama: Merkle made another wager, this time for $1,000, with anyone who could break the improved multi-iteration system. And that sum had to be paid out too. It went to E. Brickell, who in the summer of 1984 broke a system with forty iterations, processing one hundred messages in an hour of Cray-1 time.

    Much more successful today is the fate of the RSA system, named after the initials of its authors: R. Rivest, A. Shamir, and the already familiar L. Adleman. Incidentally, Alice and Bob owe their birth to the first systematic exposition of the RSA algorithm. With their "help", the authors described in 1977 a system based on the one-way properties of the factorization function (multiplying primes is easy, but factoring the product is not).
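    The RSA idea can be shown with textbook-sized numbers. This is a toy sketch: the primes here are tiny, whereas real moduli are hundreds of digits long, and real implementations add padding and many other safeguards.

```python
# Toy RSA with tiny primes.
p, q = 61, 53
n = p * q                 # public modulus, 3233
phi = (p - 1) * (q - 1)   # 3120
e = 17                    # public exponent, coprime with phi
d = pow(e, -1, phi)       # secret exponent: d * e = 1 (mod phi)

m = 65                    # the message, encoded as a number < n
c = pow(m, e, n)          # anyone can encrypt with the public key (e, n)
assert pow(c, d, n) == m  # only the holder of d can decrypt
```

Recovering d from (e, n) requires factoring n into p and q, which for toy numbers is instant but for 1024-bit and larger moduli is what the system's practical strength rests on.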

    The development of public-key cryptology allowed cryptographic systems to find widespread commercial application rather quickly. But heavy use of cryptography does not come without mishaps. From time to time we learn of trouble in one security system or another. The most recent widely publicized incident was the breaking of the Kerberos system. Developed in the mid-80s, this system is quite popular around the world, and its breaking caused considerable concern among users.

    In the case of Kerberos, the trouble lay not in the encryption algorithm but in the way random numbers were obtained, that is, in the way the algorithm was implemented. When news came last October of a random-number flaw in Netscape software, discovered by students at the University of California at Berkeley, Stephen Lodin found a similar weakness in Kerberos. Together with Brian Dole, he managed to find a hole in the Kerberos system. The characters in this story are not amateurs. The Purdue University graduates collaborated with the Computer Operations, Audit, and Security Technology (COAST) laboratory, a computer security lab run by Prof. Spafford, who is also the founder of PCERT (Purdue Computer Emergency Response Team), the university's "rapid response" team for computer emergencies. PCERT, in turn, is a member of the similar international organization FIRST (Forum of Incident Response and Security Teams). As you can see, the mine was found by sappers, which gives hope that users of cryptosystems will not be left defenseless even when flaws are discovered.

    Characteristic is the content of the first statement to the press (dated February 16, 1996), made on behalf of the discoverers by Prof. Spafford. Along with information about the unreliability of the password system and the possibility of cracking it within five minutes, it announces that further dissemination of technical details will be withheld until the developers issue fixes preventing unauthorized access.

    Our own systems have not been spared mistakes either. Fortunately, there are professionals in our area able to find and point out weaknesses in a protection system in good time. Less than a month ago, specialists of the Kiev firm Fintronic LLC, P.V. Leskov and V.V. Tatyanin, demonstrated the shortcomings of a popular banking security system: recovering plaintext from ciphertext took under 6 minutes, and an undetected violation of document integrity (bypassing the authentication system) took under 5 minutes. And here we, the reader, will also have to wait while the developers make the necessary changes. Then we can tell you more about how and what was done.


    Data encryption is extremely important to protect privacy. In this article, I will introduce the different types and methods of encryption that are used to protect data today.

    Did you know?
    Back in the days of the Roman Empire, encryption was used by Julius Caesar to make letters and messages unreadable to the enemy. It played an important role as a military tactic, especially during wars.

    As the possibilities of the Internet continue to grow, more and more of our business is conducted online. The most important examples are Internet banking, online payments, e-mail, and the exchange of private and official messages, all of which involve confidential data and information. If this data falls into the wrong hands, it can harm not only an individual user but the entire online business system.

    To prevent this from happening, some network security measures have been taken to protect the transmission of personal data. Chief among these are the data encryption and decryption processes known as cryptography. There are three main encryption methods used in most systems today: hashing, symmetric, and asymmetric encryption. In the next lines, I will discuss each of these encryption types in more detail.

    Encryption types

    Symmetric encryption

    With symmetric encryption, normal readable data, known as plaintext, is scrambled (encrypted) so that it becomes unreadable. This scrambling is done with a key. Once the data is encrypted, it can be safely transferred to the receiver, where it is decoded using the same key that was used for encryption.

    Thus, it is clear that the key is the most important part of symmetric encryption. It must be hidden from outsiders, since anyone who has access to it can decrypt the private data. This is why this type of encryption is also known as "secret key" encryption.

    In modern systems, a key is usually a string of data that is obtained from a strong password, or from a completely random source. It is fed into symmetric encryption software, which uses it to encrypt the input. Data scrambling is achieved using symmetric encryption algorithms such as Data Encryption Standard (DES), Advanced Encryption Standard (AES), or International Data Encryption Algorithm (IDEA).
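    As a sketch of the pattern just described (password to key, key to scrambled bytes, same key to reverse), here is a toy scrambler built only from Python's standard hashlib. It is not DES, AES, or IDEA; it only illustrates the shape of secret-key encryption, and the salt and iteration count are arbitrary illustration values.

```python
import hashlib

def derive_key(password: str, salt: bytes) -> bytes:
    # Turn a strong password into a fixed-length key.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def keystream(key: bytes, length: int) -> bytes:
    # Stretch the key into a pseudo-random byte stream (toy construction).
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def crypt(data: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = derive_key("correct horse battery staple", b"salt1234")
ciphertext = crypt(b"plain text", key)
assert ciphertext != b"plain text"
assert crypt(ciphertext, key) == b"plain text"   # same key restores the data
```

The essential point survives even in this toy: whoever holds the key can reverse the scrambling, and whoever does not cannot.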

    Restrictions

    The weakest link in this type of encryption is the security of the key, both in terms of storage and transmission of the authenticated user. If a hacker is able to get hold of this key, he can easily decrypt the encrypted data, destroying the whole point of encryption.

    Another drawback is due to the fact that the software that processes the data cannot handle encrypted data. Therefore, to be able to use this software, the data must first be decoded. If the software itself is compromised, then an attacker can easily obtain the data.

    Asymmetric encryption

    Asymmetric encryption works similarly to symmetric encryption in that a key is used to encrypt the transmitted messages. However, instead of the same key, a completely different key is used to decrypt the message.

    The key used for encryption is available to anyone and everyone on the network. As such, it is known as the "public" key. On the other hand, the key used for decryption is kept secret and intended to be used privately by the user himself. Hence, it is known as the "private" key. Asymmetric encryption is also known as public key encryption.

    Since, with this method, the secret key required to decrypt the message does not have to be transmitted every time, and it is usually known only to the user (receiver), the likelihood that a hacker will be able to decrypt the message is much lower.

    Diffie-Hellman and RSA are examples of algorithms that use public key encryption.

    Restrictions

    Many hackers use man-in-the-middle as a form of attack to bypass this type of encryption. In asymmetric encryption, you are given a public key that is used to securely exchange data with another person or service. However, hackers use trickery networks to trick you into communicating with them while making you believe you are on a safe line.

    To better understand this type of hacking, consider the two interacting parties Sasha and Natasha, and the hacker Sergey with the intent to intercept their conversation. First, Sasha sends a message over the network intended for Natasha, asking for her public key. Sergei intercepts this message and receives the public key associated with it, and uses it to encrypt and transmit a false message, Natasha, containing his public key instead of Sasha.

    Natasha, thinking that this message came from Sasha, now encrypts it using Sergey's public key and sends it back. This message was again intercepted by Sergey, decrypted, changed (if desired), encrypted again using the public key that Sasha had originally sent, and sent back to Sasha.

    Thus, when Sasha receives this message, he is made to believe that it came from Natasha, and continues to be unaware of foul play.

    Hashing

    The hashing technique uses an algorithm known as a hash function to generate a special string from the given data, known as a hash. This hash has the following properties:

    • The same data always produces the same hash.
    • It is infeasible to recover the original data from the hash alone.
    • It is impractical to try different combinations of input in the hope of generating the same hash.

    Thus, the main difference between hashing and the other two forms of data encryption is that once the data is encrypted (hashed), it cannot be received back in its original form (decrypted). This fact ensures that even if a hacker gets his hands on the hash, it will be useless to him, since he will not be able to decrypt the contents of the message.

    Message Digest 5 (MD5) and the Secure Hash Algorithm (SHA) are two widely used hashing algorithms.
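    The properties listed above are easy to observe with Python's standard hashlib:

```python
import hashlib

# Property 1: the same data always produces the same hash.
h1 = hashlib.sha256(b"hello").hexdigest()
assert h1 == hashlib.sha256(b"hello").hexdigest()

# A one-character change produces a completely different digest
# (the avalanche effect), while the digest length stays fixed.
h2 = hashlib.sha256(b"hello!").hexdigest()
assert h1 != h2
assert len(h1) == len(h2) == 64   # 256 bits = 64 hex characters
```

There is no decrypt step at all: the only way "back" from a digest is to guess inputs and re-hash them, which is exactly why hashing differs from the two encryption methods above.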

    Restrictions

    As mentioned earlier, it is nearly impossible to decrypt data from a given hash. However, this is only true if strong hashing is implemented. In the case of a weak implementation of the hashing technique, using a sufficient amount of resources and brute force attacks, a persistent hacker can find data that matches the hash.

    Combination of encryption methods

    As discussed above, each of these three encryption methods suffers from some disadvantages. However, when a combination of these methods is used, they form a reliable and highly effective encryption system.

    Most often, the secret-key and public-key techniques are combined and used together: the symmetric (secret-key) method allows fast encryption and decryption, while the public-key method offers a safer and more convenient way to transfer the secret key. This combination of techniques is known as the digital envelope. The PGP e-mail encryption program is based on the digital envelope technique.
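    A digital envelope can be sketched by combining toy pieces: a random session key encrypts the message with a fast symmetric scrambler, and a small RSA key pair seals only the session key. All parameters here are illustration-sized and the symmetric part is a toy XOR construction; this is not how PGP is actually implemented.

```python
import hashlib, secrets

# Recipient's toy RSA key pair.
p, q, e = 61, 53, 17
n, phi = p * q, (p - 1) * (q - 1)
d = pow(e, -1, phi)                       # recipient's secret key

def xor_crypt(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher: XOR with a hash-derived keystream.
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

# Sender: pick a fresh session key, seal it with the public key (e, n).
session_key = secrets.randbelow(n - 2) + 2
ciphertext = xor_crypt(b"meet at noon", session_key.to_bytes(2, "big"))
sealed_key = pow(session_key, e, n)       # the "envelope"

# Recipient: open the envelope with d, then decrypt the message.
recovered = pow(sealed_key, d, n)
assert recovered == session_key
assert xor_crypt(ciphertext, recovered.to_bytes(2, "big")) == b"meet at noon"
```

Only the sealed session key and the ciphertext travel over the channel; the slow asymmetric operation is applied to a few bytes, the fast symmetric one to the bulk of the data.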

    Hashing is used as a means of verifying passwords. If the system stores a hash of the password instead of the password itself, it is more secure: even if a hacker obtains the hash, he cannot read it. During verification, the system hashes the incoming password and checks whether the result matches what is stored. Thus the actual password is visible only for the brief moments when it is being changed or verified, which significantly reduces the likelihood of it falling into the wrong hands.
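    A sketch of this scheme using a salted, deliberately slow hash from Python's standard library (the salt size and iteration count below are illustrative choices):

```python
import hashlib, hmac, os

def store(password: str):
    # Hash the password with a random salt; store the pair, never the password.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    # Re-hash the candidate and compare in constant time.
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = store("s3cret")
assert verify("s3cret", salt, digest)
assert not verify("wrong", salt, digest)
```

The salt ensures that two users with the same password get different digests, and the iteration count makes each brute-force guess expensive.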

    Hashing is also used to authenticate data with a secret key. The hash is generated using the data and this key. Therefore, only the data and hash are visible, and the key itself is not transmitted. This way, if changes are made to either the data or the hash, they will be easily detected.
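    This keyed-hash authentication is what the standard HMAC construction provides; a short sketch:

```python
import hashlib, hmac

key = b"shared secret"                     # known to sender and receiver only
data = b"transfer 100 to account 42"
tag = hmac.new(key, data, hashlib.sha256).hexdigest()

# Only data + tag travel over the channel; the key itself is never sent.
# Any change to the data produces a different tag.
tampered = hmac.new(key, b"transfer 900 to account 42",
                    hashlib.sha256).hexdigest()
assert tag != tampered

# The receiver recomputes the tag with the shared key and compares.
assert hmac.compare_digest(tag, hmac.new(key, data, hashlib.sha256).hexdigest())
```

Without the key, an attacker can neither forge a valid tag for altered data nor verify guesses, which is how tampering is detected.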

    In conclusion, these techniques can be used to encode data efficiently into an unreadable format that keeps it secure. Most modern systems use a combination of these encryption methods, together with strong implementations of the algorithms, to improve security. Besides security, such systems provide many additional benefits, such as verifying a user's identity and ensuring that the received data has not been tampered with.

    Due to the fact that the main function of our software is data encryption, we are often asked questions regarding certain aspects of cryptography. We decided to collect the most frequently asked questions in one document and tried to give the most detailed answers to them, but, at the same time, not overloaded with unnecessary information.

    1. What is cryptography?

    Cryptography is a theoretical scientific discipline, a branch of mathematics, that studies transformations of information intended to protect it from the deliberate actions of an adversary.

    2. What is an encryption algorithm?

    An encryption algorithm is a set of logical rules that determine the process of converting information from an open state to an encrypted state (encryption) and, conversely, from an encrypted state to an open state (decryption).

    Encryption algorithms appear as a result of theoretical research, both by individual scientists and scientific teams.

    3. How is data protected with encryption?

    The basic principle of protecting data with encryption is data encryption. Encrypted data looks like "information garbage" to outsiders - a meaningless set of characters. Thus, if information in encrypted form gets to an attacker, he simply will not be able to use it.

    4. What is the strongest encryption algorithm?

    In principle, any encryption algorithm proposed by any renowned cryptographer is considered strong until proven otherwise.

    As a rule, all newly emerging encryption algorithms are published for general information, and are comprehensively studied in specialized cryptographic research centers. The results of such studies are also published for public information.

    5. What is an encryption key?

    An encryption key is a random, pseudo-random, or specially formed sequence of bits, which is a variable parameter of the encryption algorithm.

    In other words, if you encrypt the same information with the same algorithm, but with different keys, the results will also be different.

    An encryption key has one essential characteristic - length, which is usually measured in bits.

    6. What are the encryption algorithms?

    Encryption algorithms are divided into two large classes: symmetric and asymmetric.

    Symmetric encryption algorithms use the same key to encrypt information and to decrypt it. In this case, the encryption key must be secret.

    Symmetric encryption algorithms, as a rule, are easy to implement and do not require a lot of computational resources for their work. However, the inconvenience of such algorithms manifests itself in cases when, for example, two users need to exchange keys. In this case, users need to either meet directly with each other, or have some kind of reliable, interception-protected channel for sending the key, which is not always possible.

    Examples of symmetric encryption algorithms - DES, RC4, RC5, AES, CAST.

    Asymmetric encryption algorithms use two keys - one for encryption and the other for decryption. In this case, they talk about a pair of keys. One key from a pair can be public (available to everyone), the other secret.

    Asymmetric encryption algorithms are more difficult to implement and more demanding on computational resources than symmetric ones; however, the problem of key exchange between two users is easier to solve.

    Each user can create his own pair of keys and send the public key to his subscriber. This key can only encrypt data; to decrypt, you need a secret key, which is stored only by its owner. Thus, obtaining a public key by an attacker will not give him anything, since it is impossible for him to decrypt the encrypted data.

    Examples of asymmetric encryption algorithms - RSA, El-Gamal.

    7. How are encryption algorithms cracked?

    Cryptographic science has a subdiscipline, cryptanalysis, which studies ways of breaking encryption algorithms, that is, of recovering plaintext from ciphertext without the encryption key.

    There are many different ways and methods of cryptanalysis, most of which are too complex and voluminous to reproduce here.

    The only method worth mentioning here is exhaustive key search, also called brute force. Its essence is to try every possible value of the encryption key until the required key is found.
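    A toy demonstration of brute force on a deliberately tiny 16-bit key. The cipher here is a simple repeating XOR chosen for illustration, and the attacker is assumed to know a sample of plaintext to test against; with real 128-bit keys the same loop would be hopeless.

```python
def xor16(data: bytes, key: int) -> bytes:
    # Toy cipher: XOR the data with the 2-byte key, repeated.
    ks = key.to_bytes(2, "big") * (len(data) // 2 + 1)
    return bytes(a ^ b for a, b in zip(data, ks))

secret_key = 0xBEEF
ciphertext = xor16(b"attack at dawn", secret_key)

# The attack: try all 2**16 keys until the decryption matches.
found = next(k for k in range(2**16)
             if xor16(ciphertext, k) == b"attack at dawn")
assert found == secret_key
```

Sixteen bits fall in a fraction of a second on any machine; each extra bit doubles the work, which is the whole argument behind the key lengths discussed next.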

    8. How long should the encryption key be?

    Today, a key length of 128 bits (16 bytes) is considered sufficient for symmetric encryption algorithms. A complete search of all possible 128-bit keys within one year (a brute-force attack) would require 4.2x10^22 processors, each performing 256 million encryption operations per second. That many processors would cost 3.5x10^24 US dollars (according to Bruce Schneier, Applied Cryptography).

    There is an international project distributed.net, the purpose of which is to unite Internet users to create a virtual distributed supercomputer that searches for encryption keys. The latest 64-bit key cracking project was completed within 1,757 days, with over 300,000 users participating, and the computing power of all the computers in the project was equivalent to almost 50,000 AMD Athlon XP processors clocked at 2 GHz.

    It should be borne in mind that lengthening the encryption key by one bit doubles the number of key values and, consequently, the search time. That is, based on the figures above, in 1757 x 2 days one could crack not a 128-bit key, as it might seem at first glance, but only a 65-bit one.
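    The doubling argument is easy to check with a little arithmetic, starting from the distributed.net figure of 1757 days for a 64-bit key:

```python
days_64 = 1757                      # distributed.net, 64-bit key
assert days_64 * 2 == 3514          # twice the time buys only one more bit

ratio_128 = 2 ** (128 - 64)         # 128-bit key space vs 64-bit key space
years_128 = days_64 * ratio_128 / 365

assert ratio_128 == 18446744073709551616   # about 1.8 * 10^19 times longer
assert years_128 > 10**19                  # far beyond any practical attack
```

So the jump from 64 to 128 bits is not a factor of two but a factor of 2^64, which is why 128-bit symmetric keys are considered out of brute-force reach.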

    9. I've heard about 1024 and even 2048 bit encryption keys, and you say that 128 bits is enough. What does it mean?

    That's right, encryption keys 512, 1024 and 2048 bits, and sometimes longer, are used in asymmetric encryption algorithms. They use principles that are completely different from symmetric algorithms, so the scale of encryption keys is also different.

    10. Can the secret services read my encrypted data?

    The answer to this question is the most closely guarded secret of the intelligence services of any state. From a theoretical point of view it is impossible to read data encrypted with a well-known algorithm and a key of sufficient length (see the previous questions), but who knows what is hidden behind the veil of state secrets? It may well turn out that the government commands some alien technology with which any cipher can be broken 🙂

    The only thing that can be asserted with certainty is that no state or special service will reveal this secret, and even if it is possible to somehow decrypt the data, it will never show it in any way.

    A historical example can be used to illustrate this statement. During the Second World War, British Prime Minister Winston Churchill, as a result of interception and decryption of German messages, became aware of the forthcoming bombing of the city of Coventry. Despite this, he did not take any measures to prevent the enemy from learning that British intelligence could decipher their messages. As a result, on the night of November 14-15, 1940, Coventry was destroyed by German aircraft, a large number of civilians were killed. Thus, for Churchill, the cost of disclosing information that he could decipher German messages turned out to be higher than the cost of several thousand human lives.

    Obviously, for modern politicians, the price of such information is even higher, therefore, we will not learn anything about the capabilities of modern special services, either explicitly or indirectly. So even if the answer to this question is in the affirmative, this possibility will most likely not manifest itself in any way.

    Source: SecurIT


    Usually, new encryption algorithms are published for public inspection and studied in specialized scientific centers. The results of such studies are also published for public information.

    Symmetric algorithms
    Encryption algorithms are divided into two large classes: symmetric (AES, GOST, Blowfish, CAST, DES) and asymmetric (RSA, El-Gamal). Symmetric encryption algorithms use the same key to encrypt information and to decrypt it, while asymmetric algorithms use two keys - one to encrypt and the other to decrypt.

    If the encrypted information needs to be transferred elsewhere, the decryption key must be transferred with it. The weak point here is the transmission channel: if it is insecure or eavesdropped on, the decryption key can fall into an attacker's hands. Systems based on asymmetric algorithms are free of this drawback, since each member of such a system has a pair of keys: a public key and a secret key.

    Encryption key
    This is a random sequence of bits, or one specially derived from a password, that serves as a variable parameter of the encryption algorithm.
    If you encrypt the same data with the same algorithm, but with different keys, the results will also be different.

    Usually, in Encryption Programs (WinRAR, Rohos, etc.), the key is created from the password that the user specifies.

    The encryption key comes in different lengths, usually measured in bits. With an increase in the key length, the theoretical strength of the cipher increases. In practice, this is not always true.

    In cryptography, the encryption mechanism itself is assumed to be unclassified: an attacker may have the complete source code of the encryption algorithm as well as the ciphertext (Kerckhoffs' principle). A further assumption sometimes made is that the attacker knows part of the unencrypted (plain) text.

    The strength of the encryption algorithm.
    An encryption algorithm is considered strong until proven otherwise. Thus, if an encryption algorithm has been published, has existed for more than 5 years, and no serious vulnerabilities have been found for it, we can assume that its strength is suitable for the tasks of protecting classified information.

    Theoretical and practical strength.
    In 1949 C.E. Shannon published the article "Communication Theory of Secrecy Systems". Shannon considered the strength of cryptographic systems as both practical and theoretical. The conclusion on theoretical strength remains pessimistic: the key must be as long as the plaintext.
    Therefore, Shannon also considered the issue of the practical strength of cryptographic systems. Is the system reliable if an attacker has limited time and computing resources to analyze intercepted messages?

    Usually vulnerabilities are found in programs that encrypt data using some algorithm. In this case, programmers make a mistake in the logic of the program or in the cryptographic protocol, thanks to which, having studied how the program works (at a low level), you can eventually get access to secret information.

    Cracking the encryption algorithm
    A cryptosystem is considered compromised if an attacker can compute the secret key, or can construct a transformation algorithm equivalent to the original cryptoalgorithm that is executable in realistic time.

    Cryptology has a subdiscipline, cryptanalysis, which studies ways of breaking or forging encrypted messages. There are many methods of cryptanalysis. The most popular is direct enumeration of all possible values of the encryption key until the required key is found (the so-called brute-force method).

    In practice, this means that an attacker must:

    • Have at his disposal the cryptosystem (i.e. the program) and examples of encrypted messages.
    • Understand the cryptographic protocol, in other words, how the program encrypts data.
    • Develop and implement a key-enumeration algorithm for this cryptosystem.

    How can you tell if a key is correct or not?
    It all depends on the specific program and the implementation of the encryption protocol. Usually, if decryption yields "garbage", the key is wrong; if the text is more or less meaningful (which can be checked), the key is correct.

    Encryption algorithms
    AES (Rijndael). Currently the US federal encryption standard.


    Approved by the Department of Commerce as the standard on December 4, 2001; the decision took effect upon publication in the Federal Register (06.12.01). Only the cipher variant with a block size of 128 bits was adopted as the standard.

    GOST 28147-89. The Russian Federation's standard for data encryption and integrity (imitation) protection. Initially the algorithm carried a security classification ("top secret" or "of special importance", it is not known for certain which); the classification was gradually lowered, and by the time the algorithm was officially put through the USSR State Standard in 1989 it had been removed, leaving the algorithm marked "for official use only" (which, formally, is not a classification level). In 1989 it became the official standard of the USSR and later, after the collapse of the USSR, a federal standard of the Russian Federation.

    Blowfish. Its complex key-setup scheme significantly complicates a brute-force attack on the algorithm, but makes it unsuitable for systems in which the key changes frequently and only small amounts of data are encrypted under each key.

    The algorithm is best suited for systems in which large amounts of data are encrypted on the same key.

    DES. The US federal encryption standard from 1977 to 2001. Adopted as the federal standard in 1977, it lost that status in December 2001 with the introduction of the new standard.

    CAST. In a sense, an analogue of DES.

    www.codenet.ru/progr/alg/enc
    Encryption algorithms, Review, information, comparison.

    http://www.enlight.ru/crypto
    Materials on asymmetric encryption, digital signatures and other "modern" cryptographic systems.

    Alexander Velikanov,
    Olga Cheban,
    Tesline-Service SRL.

    Former Abu Dhabi banker Mohammad Gheit bin Mahah Al Mazrui has developed a cipher that, he claims, cannot be broken. The cipher, called the "Abu Dhabi Code", is built on a set of symbols invented by Al Mazrui himself: in this code each letter is replaced by a specially invented symbol, and these symbols belong to none of the world's known languages.


    It took the developer a year and a half to work on the cipher, which Al Mazrui calls "completely new".

    According to the enthusiast, anyone can create their own code, and the complexity of the cipher is determined by the length of its key. It is believed that, in principle, given the desire, certain skills, and the appropriate software, almost any cipher, even the most complex, can be broken.

    However, Al Mazrui assures that his creation cannot be hacked and is the most reliable cipher today. “It is almost impossible to decipher a document encoded with the Abu Dhabi Code,” said Al Mazrui.

    To prove his point, the banker has challenged prominent cryptographers, codebreakers, and hackers, urging them to try to break his cipher.

    3. Kryptos is a sculpture that the American sculptor James Sanborn installed on the territory of the CIA headquarters in Langley, Virginia, in 1990. The encrypted message inscribed on it still cannot be deciphered.

    4. The code printed on a Chinese gold bar. Seven gold bars were allegedly issued to a General Wang in Shanghai in 1933. They are marked with pictures, Chinese writing, and some kind of encrypted messages, partly in Latin letters. They may contain certificates of authenticity for the metal, issued by a US bank.


    5. The Beale ciphers: three encrypted messages believed to reveal the location of a treasure of two wagonloads of gold, silver and precious stones, buried in the 1820s near Lynchburg, in Bedford County, Virginia, by a party of gold prospectors led by Thomas Jefferson Beale. The as-yet-unfound treasure would be worth about $30 million in today's money. The mystery of the cryptograms has not been solved; in particular, whether the treasure really exists remains in dispute. One of the messages has been decoded: it describes the treasure itself and gives general indications of its location. The remaining undeciphered letters may contain the exact burial spot and a list of the treasure's owners.

    6. The Voynich manuscript, often called the most mysterious book in the world. The manuscript uses a unique alphabet, runs to about 250 pages, and is illustrated with drawings of unknown flowers, naked nymphs and astrological symbols. It first surfaced at the end of the 16th century, when the Holy Roman Emperor Rudolph II bought it in Prague from an unknown merchant for 600 ducats (about 3.5 kg of gold, more than 50 thousand dollars today). From Rudolph II the book passed to nobles and scientists, and at the end of the 17th century it disappeared. The manuscript reappeared around 1912, when it was bought by the American bookseller Wilfrid Voynich. After his death the manuscript was donated to Yale University. The British scholar Gordon Rugg believes the book is a clever hoax. The text has features not characteristic of any language; on the other hand, some features, such as word lengths and the way letters and syllables are combined, resemble those of real languages. "Many people think that this is all too complicated to be a hoax; it would take years for some insane alchemist to build such a system," says Rugg. However, Rugg shows that this complexity could easily have been achieved with a cipher device invented around 1550 called the Cardan grille. In this symbol table, words are created by moving a card with holes cut in it; the spaces left in the table yield words of different lengths. By superimposing such grilles on the manuscript's syllable table, Rugg created a language with many, if not all, of the features of the manuscript's language. According to him, it would have taken three months to create the entire book.

    7. The Dorabella cipher, composed in 1897 by the British composer Sir Edward William Elgar. He sent an encrypted letter to Wolverhampton, to his friend Dora Penny, the 22-year-old daughter of Alfred Penny, rector of St. Peter's Cathedral. This cipher remains unsolved.

    8. Until recently the list also included the Chaocipher, which could not be broken during its creator's lifetime. The cipher was invented by John F. Byrne in 1918, and for nearly 40 years he tried unsuccessfully to interest the US authorities in it. The inventor offered a monetary reward to anyone who could break his cipher, but in the end no one claimed it.

    But in May 2010, Byrne's family members donated all of his remaining documents to the National Cryptologic Museum in Maryland, which led to the disclosure of the algorithm.

    9. The D'Agapeyeff cipher. In 1939, the Russian-born British cartographer Alexander D'Agapeyeff published a book on the basics of cryptography, Codes and Ciphers, in the first edition of which he presented a cipher of his own invention. The cipher was not included in subsequent editions, and D'Agapeyeff later admitted that he had forgotten how to decipher it. It is suspected that the failures of everyone who has tried to break it stem from errors the author made while encrypting the text.

    Nowadays, however, there is hope that the cipher can be broken using modern methods, for example a genetic algorithm.

    10. Taman Shud. On December 1, 1948, the dead body of a man dressed in a sweater and coat was found on the Australian coast at Somerton, near Adelaide, despite the hot day typical of the Australian climate. No documents were found on him, and attempts to match his dental records and fingerprints against data on living people led nowhere. A pathological examination revealed an unnatural congestion of blood that filled, in particular, his abdominal cavity, as well as enlarged internal organs, but no foreign substances were found in his body. A suitcase that may have belonged to the dead man was found at the railway station. It contained trousers with a secret pocket, in which lay a scrap of paper torn from a book, bearing the printed words "Tamam Shud". The investigation established that the scrap had been torn from a very rare copy of the Rubaiyat of the great Persian poet Omar Khayyam. The book itself was found on the back seat of a car that had been left unlocked. On the book's back cover were five lines scrawled in capital letters, whose meaning has never been understood. To this day the story remains one of Australia's greatest mysteries.

    In our computer age, mankind increasingly declines to store information in handwritten or printed form, preferring electronic documents. And where once papers or parchments were simply stolen, now it is electronic information that gets hacked. Encryption algorithms themselves have been known since time immemorial: many civilizations preferred to encrypt their unique knowledge so that only the initiated could obtain it. But let us see how all this plays out in today's world.

    What is a data encryption system?

    First, we need to decide what cryptographic systems are in general. Roughly speaking, a cryptographic system is a special algorithm for recording information that is understandable only to a certain circle of people.

    In this sense, to an outsider everything he sees should seem (and in principle does seem) a meaningless jumble of symbols. Only someone who knows the rules of their arrangement can read such a sequence. The simplest example of an encryption algorithm is writing words, say, backwards. This is, of course, the most primitive scheme imaginable: anyone who knows the recording rule can restore the original text without difficulty.
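    The "words written backwards" scheme just mentioned can be sketched in a few lines. This is a toy illustration only; as the text says, anyone who knows the rule recovers the original instantly.

    ```python
    # Toy "cipher": reverse the text. Knowing the rule makes
    # decryption trivial, which is exactly the article's point.
    def encrypt(plaintext: str) -> str:
        return plaintext[::-1]

    def decrypt(ciphertext: str) -> str:
        return ciphertext[::-1]

    secret = encrypt("attack at dawn")
    print(secret)           # "nwad ta kcatta"
    print(decrypt(secret))  # "attack at dawn"
    ```
    
    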

    Why is this needed?

    Why all this was invented probably needs no explanation. Consider how much knowledge left over from ancient civilizations remains encrypted today. Whether the ancients did not want us to find out, or whether it was all done so that people could use the knowledge only upon reaching the required level of development, one can only guess.

    If we talk about today's world, however, information security is becoming one of the biggest problems. Judge for yourself: how many documents sit in archives that the governments of some countries would not want made public, how many secret developments, how many new technologies. And all of this, by and large, is the primary target of so-called hackers in the classical sense of the term.

    Only one phrase comes to mind, one that has become a classic statement of Nathan Rothschild's principles: "Who owns the information owns the world." That is why information must be protected from prying eyes, so that no one can use it for their own selfish ends.

    Cryptography: a starting point

    Now, before examining the structure that any encryption algorithm has, let us dip briefly into the history of cryptography, into those distant times when this science was just emerging.

    It is believed that the art of hiding data began to develop actively several millennia ago. Primacy is attributed to the ancient Sumerians, King Solomon and the Egyptian priests. Only much later did runic signs and symbols appear. But here is what is interesting: sometimes the algorithm for encrypting texts (and in those days texts are what was encrypted) was such that one and the same character could stand not only for a letter, but also for a whole word, concept or even sentence. Because of this, deciphering such texts becomes absolutely impossible even with modern cryptographic systems that allow the original form of a text to be restored. In modern terms, these were quite advanced, as we would now say, symmetric encryption algorithms. Let us dwell on them separately.

    The modern world: types of encryption algorithms

    As for the protection of confidential data in the modern world, it is worth pausing on the times when computers were unknown to mankind. Quite apart from how much paper was used up by alchemists or the Knights Templar in trying to hide the true texts of the knowledge they possessed, it is worth remembering that since the emergence of communications the problem has only worsened.

    And here, perhaps, the most famous device is the German cipher machine of the Second World War called "Enigma", from the Greek for "riddle". Again, this is an example of a symmetric encryption algorithm: the encrypting and decrypting sides both know the key originally used to hide the data.

    Today such cryptosystems are used everywhere. The most striking example is the AES256 encryption algorithm, an international standard, which in computer terminology permits the use of a 256-bit key. In general, modern encryption algorithms are quite diverse and can be roughly divided into two large classes: symmetric and asymmetric. Depending on the field of application, both are widely used today, and the choice of algorithm depends directly on the task and on the method of recovering the information in its original form. But what is the difference between them?

    Symmetric and asymmetric encryption algorithms: what is the difference

    Now let us see what the cardinal difference between such systems is, and on what principles their practical application rests. As is already clear, the names of the algorithm classes refer to symmetry and asymmetry in how the keys are used; what exactly this means we shall now find out.

    The symmetric DES encryption algorithm, adopted back in 1977, assumes a single key that is known to both interested parties. Knowing such a key, it is easy to apply it in practice and turn the same meaningless set of characters back into something readable.

    What about asymmetric encryption algorithms? Here two keys are used: one to encrypt the original information and another to decrypt it, and they need not coincide or reside simultaneously on the encrypting and decrypting sides; one key is enough for each. This greatly reduces the risk of both keys falling into the wrong hands. However, in today's circumstances, stealing a key is not the main obstacle for many intruders; the real problem is finding exactly the key (roughly speaking, the password) that will decrypt the data. And there can be so many variants that even the most modern computer would grind through them for several decades. As has been stated, none of the world's available computer systems can brute-force access to the key, and will not be able to over the next decades.

    The most famous and commonly used encryption algorithms

    But back to the computer world. What do the main encryption algorithms offer today, designed to protect information at the present stage of development of computer and mobile technology?

    In most countries the de facto standard is the AES cryptographic system based on a 128-bit key. In parallel with it, an algorithm is sometimes used which, though it belongs to the public-key class, is nevertheless one of the most reliable; this has been affirmed by leading experts, since such a system is judged not only by the strength of data encryption but also by how well it maintains the integrity of the information. As for the early developments, to which the DES encryption algorithm belongs, it is hopelessly outdated, and attempts to replace it began in 1997. The competition launched then gave rise to the new Advanced Encryption Standard (first with a 128-bit key, then with a 256-bit key).

    RSA encryption

    Now let us dwell on RSA technology, which belongs to the asymmetric encryption systems. Suppose one subscriber sends another some information encrypted with this algorithm.

    For encryption, two sufficiently large prime numbers X and Y are taken, after which their product Z = X * Y, called the modulus, is calculated. Next, a number A is chosen satisfying 1 < A < (X - 1) * (Y - 1) and having no common divisors with the product (X - 1) * (Y - 1). Then a number B is computed such that (A * B - 1) is divisible by (X - 1) * (Y - 1). In this example, A is the public exponent, B is the secret exponent, (Z; A) is the public key, and (Z; B) is the secret key.

    What happens during transmission? The sender creates the ciphertext F by raising the initial message M to the power A modulo Z: F = M^A mod Z. The recipient then computes the converse: M = F^B mod Z. Roughly speaking, all these operations reduce to modular exponentiation. The scheme for creating a digital signature works on the same principle, though the equations are somewhat more involved; so as not to burden the reader with algebra, that material is omitted here.
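    The arithmetic above can be walked through with deliberately tiny, textbook-sized primes (real keys use primes hundreds of digits long; these values are purely illustrative):

    ```python
    # Toy RSA walk-through with textbook-small primes.
    X, Y = 61, 53              # the two primes
    Z = X * Y                  # modulus: 3233
    phi = (X - 1) * (Y - 1)    # 3120
    A = 17                     # public exponent, coprime with phi
    B = pow(A, -1, phi)        # secret exponent: (A * B - 1) is divisible by phi

    M = 65                     # the message as a number, M < Z
    F = pow(M, A, Z)           # encryption: F = M^A mod Z
    recovered = pow(F, B, Z)   # decryption: M = F^B mod Z
    assert recovered == M
    print(Z, B, F)             # 3233 2753 2790
    ```

    The three-argument `pow` performs modular exponentiation efficiently, and `pow(A, -1, phi)` (Python 3.8+) computes the modular inverse that the text describes as "B such that (A * B - 1) is divisible by (X - 1) * (Y - 1)".
    
    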

    As for hacking, the RSA encryption algorithm poses the attacker an almost unsolvable task: computing the secret key B. In theory this can be done by factoring the modulus Z into the original primes X and Y, but for sufficiently large numbers no practical means of doing so exists today, so the task becomes not merely difficult but effectively infeasible.

    DES encryption

    Before us is another algorithm, quite effective in its day, with a 64-bit block and a 64-bit key of which only 56 bits are significant. For a long time it served as the standard in the USA, even for the defense industry.

    The essence of its symmetric encryption is that the key material is used as a schedule of 48-bit values: the algorithm runs 16 rounds, each operating with its own 48-bit round key selected from the main key. But all the rounds are similar in principle, so these days computing the required key is not difficult. For example, one of the most powerful computers in the USA, costing over a million dollars, "breaks" the encryption in about three and a half hours; for lower-grade machines, even searching the key at its maximum takes no more than 20 hours.
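    A quick back-of-the-envelope calculation shows why a 56-bit key fell to brute force. The search rate below (one billion keys per second) is a hypothetical figure chosen only to make the order of magnitude concrete:

    ```python
    # Order-of-magnitude sketch: the full DES keyspace is only 2^56 keys.
    keyspace = 2 ** 56
    # Hypothetical hardware testing one billion keys per second.
    rate = 10 ** 9
    seconds = keyspace / rate
    print(f"{keyspace} keys, ~{seconds / 86400:.0f} days at 1e9 keys/s")
    ```

    Even at that modest rate the whole keyspace is exhausted in under three years, and dedicated cracking hardware does far better; a 128- or 256-bit keyspace, by contrast, is astronomically beyond any such enumeration.
    
    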

    AES encryption

    Finally we come to the most widespread system, believed until recently to be invulnerable: the AES encryption algorithm. Today it is offered in three modifications: AES128, AES192 and AES256. The first is used mainly to secure mobile devices, while the others are applied at higher levels. AES was officially introduced as a standard in 2002, and its support was immediately announced by Intel Corporation, the processor-chip manufacturer.

    Its essence, unlike that of other symmetric systems, comes down to computations based on a polynomial representation of the codes and operations on two-dimensional arrays. According to the United States government, cracking a 128-bit key would take even the most modern decoder about 149 trillion years. Let us beg to differ with such a competent source: over the past hundred years computer technology has made an unprecedented leap, so one should not be complacent, especially since today, as it turns out, there are encryption systems tougher than those the United States declared completely resistant to hacking.
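    The "two-dimensional array" mentioned above can be made concrete: AES arranges each 16-byte block as a 4x4 "state" of bytes, filled column by column, and every round transforms this grid. A minimal sketch of that layout:

    ```python
    # AES treats a 16-byte block as a 4x4 state array, filled column-major:
    # byte 0 goes to row 0 / col 0, byte 1 to row 1 / col 0, and so on.
    block = bytes(range(16))
    state = [[block[r + 4 * c] for c in range(4)] for r in range(4)]
    for row in state:
        print(row)
    # first printed row: [0, 4, 8, 12]
    ```

    The round transformations (SubBytes, ShiftRows, MixColumns, AddRoundKey) all operate on this grid; the sketch shows only the data layout, not the rounds themselves.
    
    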

    Virus and decryption problems

    Of course, we are talking about viruses. Recently quite specific ransomware viruses have appeared that encrypt the entire contents of the hard drive and logical partitions on an infected computer, after which the victim receives a letter announcing that all files are encrypted and that only the specified source can decrypt them after payment of a tidy sum.

    Most importantly, the letter states that the AES1024 system was used for the encryption, that is, a key four times longer than the existing AES256, so the number of variants to search through when looking for the corresponding decryptor grows beyond all bounds.

    And if we start from the US government's statement about the time needed to crack a 128-bit key, what of the time it would take to find the solution for a 1024-bit key and its variants? This is where the United States miscalculated: their computer cryptography was thought to be perfect. Alas, there turned out to be specialists (apparently in the post-Soviet space) who surpassed the "unshakable" American postulates in every respect.

    With all this, even the leading developers of anti-virus software, including Kaspersky Lab, the specialists behind Doctor Web, the ESET corporation and many other world leaders, simply shrug: there are, they say, no means of decrypting such an algorithm, while keeping silent about the fact that there is not enough time either. Of course, when you contact support you are invited to send the encrypted file and, if available, preferably its original, in the form it had before the encryption began. Alas, even such comparative analysis has so far yielded no tangible results.

    The world we don't know

    But what can one say, when we chase the future while unable to decipher the past. If you look at the world of the last millennium, you will notice that the Roman emperor Gaius Julius Caesar himself used symmetric encryption algorithms in some of his messages. And looking at Leonardo da Vinci, one grows positively uneasy at the mere realization that in the field of cryptography this man, whose life is veiled in mystery, was centuries ahead of his time.

    Many are still haunted by the so-called "Gioconda smile", in which there is something so alluring that a modern person cannot comprehend it. Incidentally, certain symbols have relatively recently been found in the painting (in the eye, on the dress, and so on) that clearly suggest it contains information encrypted by the great genius, information which, alas, we are today unable to extract. And we have not even mentioned the various large-scale constructions that could have overturned the physics of that time.

    Of course, some minds incline exclusively to the view that in most cases the so-called "golden ratio" was used; yet even it gives no key to the vast repository of knowledge that is believed to be either beyond our understanding or lost forever. Apparently, cryptographers still have an incredible amount of work ahead of them to understand that modern encryption algorithms sometimes cannot compare with the developments of ancient civilizations. Moreover, while today there are generally accepted principles of information protection, those used in antiquity, unfortunately, remain completely inaccessible and incomprehensible to us.

    One more thing. There is an unspoken opinion that most ancient texts cannot be translated simply because the keys to deciphering them are carefully guarded by secret societies such as the Masons and the Illuminati; even the Templars left their mark here. And what can be said, when the Vatican library is still completely inaccessible? Is that not where the main keys to understanding antiquity are kept? Many experts incline to this version, believing the Vatican deliberately withholds this information from the public. True or not, no one yet knows. But one thing can be said for certain: the ancient systems of cryptography were in no way inferior (and perhaps even superior) to those used in the modern computer world.

    Instead of an afterword

    Finally, it should be said that far from all aspects of current cryptographic systems and the techniques they use have been considered here. The fact is that in most cases one would have to give complex mathematical formulas and present calculations that would make most users' heads spin. One need only look at the example describing the RSA algorithm to realize that everything else would look far more complicated.

    The main thing here is to understand and grasp, so to speak, the essence of the matter. As for modern systems that offer to store confidential information so that it is accessible to a limited circle of users, the choice is small: despite the existence of many cryptographic systems, the RSA and DES algorithms clearly lose out to the advantages of AES. It is no surprise that most modern applications, developed for quite different operating systems, use AES (depending, of course, on the application area and device). But the "unauthorized" evolution of this cryptosystem, to put it mildly, shocked many, above all its creators. In general, though, given what is available today, it will not be hard for users to grasp what cryptographic data encryption systems are, why they are needed and how they work.

    Usually, new encryption algorithms are published for public inspection and studied in specialized scientific centers. The results of such studies are also published for public information.

    Symmetric algorithms
    Encryption algorithms are divided into two large classes: symmetric (AES, GOST, Blowfish, CAST, DES) and asymmetric (RSA, ElGamal). Symmetric encryption algorithms use the same key to encrypt and to decrypt information, while asymmetric algorithms use two keys, one to encrypt and the other to decrypt.

    If the encrypted information needs to be transferred elsewhere, the decryption key must be transferred too. The weak point here is the data transmission channel: if it is not secure, or is being listened to, the decryption key can fall into an attacker's hands. Systems based on asymmetric algorithms are free of this drawback, since each participant in such a system holds a pair of keys: a public key and a secret key.

    Encryption key
    This is a random sequence of bits, or one specially derived from a password, that serves as a variable parameter of the encryption algorithm.
    If you encrypt the same data with the same algorithm, but with different keys, the results will also be different.
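    This property is easy to demonstrate with a minimal sketch. The XOR scheme below is a toy, not a real encryption algorithm; it serves only to show that the same data under the same algorithm but different keys yields different results, and that the same key reverses the transformation:

    ```python
    # Toy XOR "cipher": repeats the key over the data.
    # Same algorithm + same data + different keys -> different ciphertexts.
    def xor_cipher(data: bytes, key: bytes) -> bytes:
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    plaintext = b"the same data"
    c1 = xor_cipher(plaintext, b"key-one")
    c2 = xor_cipher(plaintext, b"key-two")
    assert c1 != c2                                   # different keys, different results
    assert xor_cipher(c1, b"key-one") == plaintext    # symmetric: same key decrypts
    ```
    
    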

    Usually, in Encryption Programs (WinRAR, Rohos, etc.), the key is created from the password that the user specifies.
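    One common way a program can derive a key from a user's password is a key-derivation function such as PBKDF2, available in Python's standard library. This is a sketch of the general technique, not the specific scheme used by WinRAR or Rohos; the salt and iteration count here are illustrative values (a fixed salt is used only to keep the example deterministic):

    ```python
    # Deriving a 256-bit encryption key from a password with PBKDF2.
    import hashlib

    password = "correct horse battery staple"
    salt = b"\x00" * 16        # in practice: os.urandom(16), stored alongside the data
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    print(len(key) * 8, "bit key")   # 256 bit key
    ```

    The many hash iterations deliberately slow down each password guess, which raises the cost of the brute-force attacks discussed later.
    
    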

    Encryption keys come in different lengths, usually measured in bits. As the key length increases, the theoretical strength of the cipher increases; in practice this is not always true.

    In cryptography the encryption mechanism is considered an unclassified quantity: an attacker may have the complete source code of the encryption algorithm as well as the ciphertext (Kerckhoffs's principle). A further common assumption is that the attacker may know part of the unencrypted (plain) text.

    The strength of an encryption algorithm.
    An encryption algorithm is considered strong until proven otherwise. Thus, if an encryption algorithm has been published, has existed for more than five years, and no serious vulnerabilities have been found in it, we may assume its strength suits the tasks of protecting classified information.

    Theoretical and practical strength.
    In 1949 C.E. Shannon published the article "Communication Theory of Secrecy Systems". Shannon considered the strength of cryptographic systems as both practical and theoretical. His conclusion on theoretical strength remains pessimistic: the key must be as long as the plaintext.
    Shannon therefore also examined the practical strength of cryptographic systems: is a system reliable if the attacker has limited time and computing resources for analyzing intercepted messages?

    Usually vulnerabilities are found in the programs that encrypt data with a given algorithm: programmers make a mistake in the program's logic or in the cryptographic protocol, thanks to which, by studying how the program works (at a low level), one can eventually gain access to the secret information.

    Cracking the encryption algorithm
    A cryptosystem is considered compromised if an attacker can compute the secret key, or can construct a transformation algorithm equivalent to the original cryptoalgorithm, and can do so in realistic time.

    In cryptology there is a subdiscipline, cryptanalysis, which studies breaking or forging encrypted messages. There are many methods of cryptanalysis; the most popular is direct enumeration of all possible values of the encryption key (the so-called brute-force method). The essence of this method is to try every possible key value until the required key is found.

    In practice, this means that an attacker must:

    • Have at his disposal the cryptosystem (i.e. the program) and examples of encrypted messages.
    • Understand the cryptographic protocol, in other words how the program encrypts data.
    • Develop and implement an algorithm for enumerating keys for this cryptosystem.
    How can you tell whether a key is correct or not?
    It all depends on the specific program and the implementation of the encryption protocol. Usually, if decryption yields "garbage", the key is wrong; if the text is more or less meaningful (and this can be checked), the key is correct.
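    The brute-force idea just described can be shown end to end on a toy Caesar cipher: try every key and keep the one whose output "looks meaningful". The meaningfulness test here is a deliberately crude stand-in (checking for a common English word):

    ```python
    # Brute-forcing a toy Caesar cipher: enumerate all 26 keys and pick
    # the candidate whose decryption looks like meaningful text.
    def caesar(text: str, shift: int) -> str:
        return "".join(
            chr((ord(c) - 97 + shift) % 26 + 97) if c.islower() else c
            for c in text
        )

    ciphertext = caesar("meet me at the bridge", 5)
    for key in range(26):
        candidate = caesar(ciphertext, -key)
        if " the " in f" {candidate} ":    # crude "is it meaningful?" test
            print(key, candidate)          # 5 meet me at the bridge
            break
    ```

    Real cryptanalysis uses statistical measures (letter frequencies, dictionary checks, known-plaintext fragments) instead of a single word, but the structure of the search is the same.
    
    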

    Encryption algorithms
    AES (Rijndael). Currently the US federal encryption standard. Approved by the Department of Commerce as a standard on December 4, 2001; the decision came into force upon publication in the Federal Register (06.12.01). Only the cipher variant with a 128-bit block size was adopted as the standard.

    GOST 28147-89. The Russian Federation's standard for data encryption and integrity protection (imitation protection). Initially the algorithm carried a secrecy classification (whether "of special importance" or "top secret" is not known for certain); the classification was then gradually lowered, and by the time the algorithm was officially put through the USSR State Standard in 1989 it had been removed, leaving the algorithm marked "for official use only" (which, formally, is not a secrecy classification at all). In 1989 it became the official standard of the USSR and later, after the collapse of the USSR, a federal standard of the Russian Federation.

    Blowfish. Its complex key-setup scheme significantly complicates a brute-force attack on the algorithm, but makes it unsuitable for systems where the key changes frequently and only small amounts of data are encrypted under each key. The algorithm is best suited to systems in which large amounts of data are encrypted under the same key.

    DES. US federal encryption standard from 1977 to 2001. Adopted as the US federal standard in 1977; in December 2001 it lost that status with the introduction of the new standard.

    CAST. In a sense, an analogue of DES.
