org.jasypt.encryption.pbe.NumberUtils checks the amount of memory available before allocating a new byte array to handle the extra padding for BigInteger or BigDecimal PBE decryption.
    private static int maxSafeSizeInBytes() {
        // In order to avoid Java heap size exceptions due to
        // malformations or manipulation of encrypted data, we will
        // only consider "safe" the allocation of numbers with a size
        // in bytes less than or equal to half the available free memory.
        //
        // Available free memory is computed as the current free memory
        // (within the amount of memory currently allocated by the JVM)
        // plus all the memory that the JVM will still be allowed to
        // allocate in the future (up to maxMemory).
        final long max = Runtime.getRuntime().maxMemory();
        final long free = Runtime.getRuntime().freeMemory();
        final long total = Runtime.getRuntime().totalMemory();
        return (int)((free + (max - total)) / 2);
    }
    final int expectedSize =
        NumberUtils.intFromByteArray(encryptedMessageExpectedSizeBytes);
    if (expectedSize < 0 || expectedSize > maxSafeSizeInBytes()) {
        throw new EncryptionOperationNotPossibleException();
    }
It does this in maxSafeSizeInBytes, which unfortunately casts the result of a long calculation to an int. Most of the time this is fine, but on a modern 64-bit server with more than 20 GB of RAM, the JVM under Windows will by default allocate a 5 GB heap. The available memory is then greater than Integer.MAX_VALUE, so the calculation overflows the int cast and returns a negative integer, which causes the code to throw an exception unnecessarily.
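A minimal sketch of the overflow and one possible fix (the method and class names below are illustrative, and the clamping approach is my assumption of a fix, not necessarily what was shipped): extracting the heap figures as parameters makes the narrowing cast easy to demonstrate, and clamping the long result to Integer.MAX_VALUE before casting keeps it non-negative.

```java
public class MaxSafeSizeFix {

    // Reproduces the buggy narrowing cast: with a large heap the long
    // value exceeds Integer.MAX_VALUE and the (int) cast wraps around
    // to a negative number.
    static int buggyMaxSafeSize(long max, long free, long total) {
        return (int) ((free + (max - total)) / 2);
    }

    // Sketch of a fix: clamp to Integer.MAX_VALUE before narrowing,
    // so the result is always a valid non-negative int.
    static int fixedMaxSafeSize(long max, long free, long total) {
        final long available = free + (max - total);
        return (int) Math.min(available / 2, Integer.MAX_VALUE);
    }

    public static void main(String[] args) {
        // Example figures: 5 GB max heap, 1 GB currently allocated,
        // 0.5 GB of that currently free.
        final long gb = 1024L * 1024L * 1024L;
        final long max = 5 * gb, total = 1 * gb, free = gb / 2;

        System.out.println(buggyMaxSafeSize(max, free, total)); // negative
        System.out.println(fixedMaxSafeSize(max, free, total)); // 2147483647
    }
}
```

With these figures the available memory is 4.5 GB, so half of it (about 2.25 GB) does not fit in an int, and the buggy version returns a negative value that trips the expectedSize check above.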
Greetings.
This is a very serious issue (it blocks us from using Jasypt entirely) with a relatively simple fix. Any hope for a fixed release soon?
Still open? o_O
We are working on this and we expect to have a new version soon, hopefully as soon as May.
Any news? This bug is serious.
Fixed in jasypt v1.9.3, released yesterday.