I've read in the help file that "7z supports strong AES-256 encryption".
Also, 7-Zip is an open-source project, and I want to know: are there any public analyses (made by well-known cryptographers) guaranteeing that this is a good implementation of the algorithm?
Best regards, andrey.
There are 2 things:
1) The AES code itself. I use Gladman's AES code, and I use the same code for WinZip's AES and RAR's AES. Since all of these work, I suppose the AES code is correct.
2) I use SHA-256 to hash the password into a key for AES.
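As a rough sketch of the idea in point 2 (not 7-Zip's exact construction, which iterates the hash many times; see the source code for details), hashing the password with SHA-256 yields a 256-bit value usable directly as an AES-256 key. The UTF-16-LE encoding below is an assumption for illustration:

```python
import hashlib

def derive_key(password: str) -> bytes:
    """Toy sketch: one SHA-256 pass over the password gives 256 bits.
    The encoding choice and lack of iteration are simplifications;
    7-Zip's real scheme iterates the hash with a counter."""
    return hashlib.sha256(password.encode("utf-16-le")).digest()

key = derive_key("correct horse battery staple")
print(len(key))  # 32 bytes = 256 bits
```

A single hash pass like this is fast for an attacker too, which is exactly why real key-derivation schemes iterate it (see the discussion of PBKDF2 below in the thread).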
Since 7-Zip is open source, such an analysis should be quite straightforward to do. Some time ago I asked on sci.crypt whether such an analysis had been done, but I was not able to find a conclusive answer.
However, the important factors in evaluating a correct implementation of known security best practices are:
1) algorithm(s) implementation and mode of use:
generally speaking, Gladman's AES code is THE reference implementation for C and C-like languages. Block ciphers like AES are strongly influenced by the mode of use when it comes to meeting specific security requirements: ECB mode should no longer be used, and while there are highly reputed modes like CTR for the strictest privacy requirements, in some cases it is better to use other modes, such as EAX (CTR+OMAC), to meet an authentication requirement along with the privacy ones. With EAX, for example, the file cannot be read by unauthenticated users and also cannot be modified without the password, since the authentication is cryptographically strong and bound to the strength of the password through the OMAC pass.
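To make the structure of CTR mode concrete, here is a toy sketch that generates a keystream by "encrypting" nonce-plus-counter blocks and XORing them with the data. SHA-256 stands in for the AES block cipher purely for illustration; a real implementation must use AES from a vetted crypto library:

```python
import hashlib

def ctr_keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy CTR keystream: hash key||nonce||counter for each block.
    SHA-256 here is a stand-in for the AES block encryption."""
    stream = b""
    counter = 0
    while len(stream) < length:
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        stream += block
        counter += 1
    return stream[:length]

def ctr_encrypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # XOR with the keystream; the same operation also decrypts.
    ks = ctr_keystream(key, nonce, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

key, nonce = b"k" * 32, b"unique-nonce"
ct = ctr_encrypt(key, nonce, b"attack at dawn")
assert ctr_encrypt(key, nonce, ct) == b"attack at dawn"
```

Note that this sketch provides no authentication at all; that is exactly the gap a combined mode like EAX (CTR plus an OMAC tag) closes.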
2) key derivation
Key derivation is an important requirement for making the cryptosystem resistant to dictionary and brute-force attacks. It imposes a *fixed* time penalty each time the cracker tries a new password, which is unnoticeable to the legitimate user trying a single password but becomes a decisive issue even for good dictionary attacks that need to test a few billion phrases/variants.
To make this phase deliberately slow, and not short-circuitable by jumping to a plausible solution, strong hash functions are iterated thousands of times to derive a key from the user-provided password and a known, randomly generated salt value, which prevents the cracker from taking advantage of precomputed password/key pairs.
The current best-regarded standard is PBKDF2 (Password-Based Key Derivation Function 2), which can use different hashes; SHA-256, SHA-512 and Whirlpool are presently among the hashes known to be strongest.
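PBKDF2 with HMAC-SHA-256 is available directly in the Python standard library; the parameter values below are illustrative, not a recommendation tied to any particular product:

```python
import hashlib
import os

password = b"hunter2"
salt = os.urandom(16)   # random salt, stored in clear next to the ciphertext
iterations = 200_000    # illustrative; tune so one derivation takes ~0.1 s

# The iteration count multiplies the attacker's per-guess cost by the
# same factor it costs the legitimate user once.
key = hashlib.pbkdf2_hmac("sha256", password, salt, iterations, dklen=32)
print(len(key))  # 32 bytes = 256 bits
```

The derivation is deterministic given the same password, salt, and iteration count, which is what lets the legitimate user reproduce the key at decryption time.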
3) randomness collection and derivation:
it's important to collect non-deterministic events (such as the user's mouse movements, keys pressed, timing of actions, and network card, CPU, clock, memory and disk events) to seed a deterministic algorithm (usually a strong hash or an encryption algorithm) that generates random values, which can then be used to seed key derivation and, in some encryption modes, the encryption itself.
If the events are not entropy-rich (e.g. seeding the algorithm with the old DOS clock, which could take only a few million distinct values), the randomness sampling is less valuable: in that example the attacker can expect only a few million possible salt values and, depending on his resources, that may make precomputation attacks feasible, regardless of the strength of the deterministic function used to process those events. With enough entropy fed into a robust hash, which can generate 2^256 or even 2^512 possible unique salts, such attacks would not be feasible for any foreseeable attacker.
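In practice, modern operating systems already pool exactly these kinds of hardware and timing events into a CSPRNG, so application code can simply draw from it instead of, say, the clock; a brief sketch of the contrast:

```python
import os
import random
import secrets

# The OS mixes non-deterministic hardware/timing events into a CSPRNG;
# applications should draw their randomness from it.
salt = os.urandom(16)            # 128-bit random salt for key derivation
nonce = secrets.token_bytes(12)  # unique nonce/IV for stream-like modes

# By contrast, a clock-seeded generator has only a few million possible
# states, so an attacker can simply enumerate every seed:
weak = random.Random(1234567)    # entire seed space is enumerable
print(len(salt), len(nonce))  # 16 12
```

The `random` module is shown only to illustrate the weak case; it must never be used for salts, nonces, or keys.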
I thought random data was for asymmetric encryption (public/private keys), and not for symmetric encryption like AES.
Not claiming I'm right. It's just nice to get my info corrected.
Random data is useful in many ways in cryptography.
For example, it's used for salting (the salt being properly called an IV in this case) symmetric secret-key encryption when stream ciphers, or block ciphers used in stream-like modes (e.g. CTR, OFB), are employed.
The reason is that the same key in a stream cipher, or in a block cipher in CTR mode, gives rise to the same encryption stream. So if users tend to reuse the same password (they do), the attacker can simply subtract, bit by bit, two messages of about the same size that were probably encrypted with the same password, thereby removing the encryption stream; the attacker then only has to apply simple statistical methods to distinguish the two messages.
If a unique IV is used (generating it randomly guarantees that collisions are very improbable), then users can reuse passwords without having to pick a different one for every encrypted file or message. In CTR mode, for example, the nonce with a counter appended is encrypted and then XORed with the plaintext, so the same password gives a different encryption stream whenever the nonce changes; the nonce doesn't need to be secret, since the password is the secret element in generating the encryption stream.
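The keystream-reuse problem described above can be demonstrated in a few lines of plain Python. With a fixed byte string standing in for a CTR keystream (i.e. same key and same nonce), XORing the two ciphertexts cancels the keystream and hands the attacker the XOR of the two plaintexts:

```python
def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Stand-in for a CTR keystream produced from one key and one nonce.
keystream = bytes(range(32))

p1 = b"pay alice 100 dollars tomorrow!!"
p2 = b"pay mallory one million dollars!"
c1, c2 = xor(p1, keystream), xor(p2, keystream)

# Reusing the keystream cancels it: the attacker learns p1 XOR p2,
# which statistical methods can then pull apart.
assert xor(c1, c2) == xor(p1, p2)
```

A fresh nonce per message gives each message its own keystream, so this cancellation never happens.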
Moreover, a random salt comes in handy in a similar way for key derivation functions. With an unsalted derivation function an attacker can precompute and store keys for the most common passwords (e.g. a few gigabytes can store keys derived from several million passwords, choosing the most common words in the most widely spoken languages), and then start testing decryption without paying the KDF's time penalty for each tested key. For each bit of salt the attacker instead needs to store twice as many password/key pairs, so even a fairly small salt, e.g. 64 or 128 bits, quickly makes storing password/key pairs practically unfeasible.
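A quick sketch of why the salt defeats precomputation: without a salt, one password always maps to one key, which a table can store in advance; with random salts, the same password yields a different key per instance:

```python
import hashlib
import os

password = b"password123"

# Unsalted: every use of "password123" yields this one precomputable key.
unsalted = hashlib.pbkdf2_hmac("sha256", password, b"", 100_000)

# Salted: each instance derives a different key, so a precomputed
# password->key table is useless against it.
k1 = hashlib.pbkdf2_hmac("sha256", password, os.urandom(16), 100_000)
k2 = hashlib.pbkdf2_hmac("sha256", password, os.urandom(16), 100_000)
assert k1 != k2
```

The salts themselves can be stored in clear alongside the ciphertext, as noted below.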
Also in this case the salt doesn't need to be kept secret.
I hope I've been sufficiently clear in this quick explanation.
To verify the AES implementation, test it against the official Test Vectors.
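Running the AES vectors themselves requires an AES library (FIPS-197 Appendix C publishes them), but the same check-against-published-vectors approach can be shown with the hash used on the key-derivation side, using the NIST test vector for SHA-256("abc"):

```python
import hashlib

# NIST FIPS 180 test vector: SHA-256 of the ASCII string "abc".
expected = "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"

digest = hashlib.sha256(b"abc").hexdigest()
assert digest == expected  # implementation matches the published vector
```

Passing the official vectors shows the primitive is implemented correctly, though, as discussed above, it says nothing about whether the mode of use and key derivation around it are sound.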
Number 2 makes me think of something mentioned in Applied Cryptography about using the same key in chained encryption methods. Say, using 112 bits of a 128-bit key for triple-DES, then using it again in AES. In theory this is at best only as good as the strongest algorithm, and quite possibly worse than the weakest one (56-bit DES in this example). Not that it seems likely that patterns of bits get 'stuck', but weren't these hashes not designed to be fed back into like that? By the way, using triple-DES with two of the keys set the same is exactly equivalent to DES with the unique key. At least it's not equivalent to 40-bit key strength or worse! :evilgrin: