Interoperability between two AES algorithms

Posted by lpfavreau on Stack Overflow on 2010-03-27.

Hello,

I'm new to cryptography and I'm building some test applications to try to understand the basics of it. I'm not trying to build the algorithms from scratch; I'm trying to make two different AES-256 implementations talk to each other.

I've got a database that was populated using this JavaScript implementation, with the results stored in Base64. Now I'm trying to get an Objective-C method to decrypt its content, but I'm a little lost as to where the implementations differ. I can encrypt/decrypt in JavaScript and I can encrypt/decrypt in Cocoa, but I cannot get a string encrypted in JavaScript to decrypt in Cocoa, or vice versa.

I'm guessing it's related to the initialization vector, the nonce, the counter mode of operation, or all of these, which, quite frankly, don't mean much to me at the moment.
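
If the initialization vector is indeed part of the problem, my rough understanding is that I'd have to drop ECB mode and pass the IV explicitly, along the lines of the sketch below. This is only a guess: ivBytes is a placeholder (the real IV would have to be whatever the JavaScript side used), and the other variables are the same ones as in my method further down.

// Rough sketch only: CBC mode (no kCCOptionECBMode) with an explicit IV.
// The all-zero ivBytes value is purely for illustration; it would have to
// match the 16 bytes used by the JavaScript implementation.
unsigned char ivBytes[kCCBlockSizeAES128] = {0};
CCCryptorStatus status =
    CCCrypt(kCCDecrypt,
        kCCAlgorithmAES128,
        kCCOptionPKCS7Padding, // CBC is the default when ECB isn't requested
        keyPtr, kCCKeySizeAES256, // same zero-padded key as in my method
        ivBytes, // explicit initialization vector
        [input bytes], [input length],
        buffer, bufferSize,
        &numBytesCrypted);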

Here's what I'm using in Objective-C, adapted mainly from this and this:

#import <CommonCrypto/CommonCryptor.h> // for CCCrypt and the kCC* constants

@implementation NSString (Crypto)

- (NSString *)encryptAES256:(NSString *)key {
    NSData *input = [self dataUsingEncoding: NSUTF8StringEncoding]; 
    NSData *output = [NSString cryptoAES256:input key:key doEncrypt:TRUE];  
    return [Base64 encode:output];
}

- (NSString *)decryptAES256:(NSString *)key {
    NSData *input = [Base64 decode:self];
    NSData *output = [NSString cryptoAES256:input key:key doEncrypt:FALSE];
    return [[[NSString alloc] initWithData:output encoding:NSUTF8StringEncoding] autorelease];
}

+ (NSData *)cryptoAES256:(NSData *)input key:(NSString *)key doEncrypt:(BOOL)doEncrypt {
    // 'key' should be 32 bytes for AES256, will be null-padded otherwise
    char keyPtr[kCCKeySizeAES256 + 1]; // room for terminator (unused)
    bzero(keyPtr, sizeof(keyPtr)); // fill with zeroes (for padding)    
    // fetch key data
    [key getCString:keyPtr maxLength:sizeof(keyPtr) encoding:NSUTF8StringEncoding]; 
    NSUInteger dataLength = [input length]; 
    // See the doc: For block ciphers, the output size will always be less than or
    // equal to the input size plus the size of one block.
    // That's why we need to add the size of one block here
    size_t bufferSize = dataLength + kCCBlockSizeAES128;
    void* buffer = malloc(bufferSize);  
    size_t numBytesCrypted = 0;
    CCCryptorStatus cryptStatus =
        CCCrypt(doEncrypt ? kCCEncrypt : kCCDecrypt,
            kCCAlgorithmAES128, // the AES block size is always 128 bits; the key size below selects AES-256
            kCCOptionECBMode | kCCOptionPKCS7Padding, // ECB mode: no IV is used
            keyPtr, kCCKeySizeAES256,
            NULL, // initialization vector (ignored in ECB mode)
            [input bytes], dataLength, // input
            buffer, bufferSize, // output
            &numBytesCrypted
        );  
    if (cryptStatus == kCCSuccess) {
        // the returned NSData takes ownership of the buffer and will free it on deallocation
        return [NSData dataWithBytesNoCopy:buffer length:numBytesCrypted];
    }   
    free(buffer); // on failure, the buffer isn't handed off, so release it here
    return nil;
}

@end

Of course, the input is Base64 decoded beforehand.
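
In case it's relevant, this is roughly how I call the category; the key and plaintext here are just sample values.

// Sample usage of the category above. The key string is only an example and
// gets zero-padded to 32 bytes inside cryptoAES256:key:doEncrypt:.
NSString *secret = @"my secret passphrase";
NSString *encrypted = [@"some plain text" encryptAES256:secret];
NSString *decrypted = [encrypted decryptAES256:secret];
NSLog(@"encrypted: %@ / decrypted: %@", encrypted, decrypted);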

I've noticed that encrypting the same content with the same key in JavaScript gives a different encrypted string every time, whereas the Objective-C implementation always produces the same encrypted string. Having read the answers to this post, I believe I'm right that it has something to do with the initialization vector, but I'd need your help to pinpoint what's going on exactly.
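
In case it helps pinpoint the problem: if the JavaScript side really does use CTR mode with a random nonce prepended to the ciphertext (which I haven't confirmed), I imagine the Cocoa side would need something along these lines. CCCryptorCreateWithMode is what I found in CommonCryptor.h for selecting a mode explicitly, though I'm not sure it's available in my SDK; the 8-byte-nonce layout is purely an assumption on my part, and encryptedString just stands in for one of the Base64 strings from the database.

// Pure guesswork: assumes the Base64-decoded data starts with an 8-byte nonce
// followed by the CTR-mode ciphertext, and that keyPtr is the same zero-padded
// 32-byte key as in cryptoAES256:key:doEncrypt: above.
NSData *raw = [Base64 decode:encryptedString];
unsigned char counterBlock[kCCBlockSizeAES128] = {0};
memcpy(counterBlock, [raw bytes], 8); // assumed 8-byte nonce at the front

CCCryptorRef cryptor = NULL;
CCCryptorStatus status =
    CCCryptorCreateWithMode(kCCDecrypt,
                            kCCModeCTR,
                            kCCAlgorithmAES128,
                            ccNoPadding, // CTR is a stream mode, no padding
                            counterBlock, // initial counter block acts as the IV
                            keyPtr, kCCKeySizeAES256,
                            NULL, 0, // no tweak
                            0, // default number of rounds
                            kCCModeOptionCTR_BE, // big-endian counter
                            &cryptor);
if (status == kCCSuccess) {
    const char *cipherBytes = (const char *)[raw bytes] + 8; // skip the nonce
    size_t cipherLength = [raw length] - 8;
    size_t outLength = CCCryptorGetOutputLength(cryptor, cipherLength, true);
    NSMutableData *plain = [NSMutableData dataWithLength:outLength];
    size_t moved = 0;
    CCCryptorUpdate(cryptor, cipherBytes, cipherLength,
                    [plain mutableBytes], outLength, &moved);
    [plain setLength:moved];
    CCCryptorRelease(cryptor);
    // 'plain' should now hold the UTF-8 plaintext, if my assumptions are right
}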

Thank you!
