C# - AES encryption between .NET WinRT and iOS Obj-C isn't the same


I'm having a hard time getting the iOS AES encryption/decryption to match the one I have on WinRT.

I can't change the implementation on the WinRT side, since it's already used in a published app.

Here are two samples using a zeroed key and a zeroed IV; the outputs are different.

C# sample code:

using (MemoryStream saveDataMemoryStreamCrypto = new MemoryStream())
{
    var saveDataKeyProvider = Windows.Security.Cryptography.Core.SymmetricKeyAlgorithmProvider
        .OpenAlgorithm(Windows.Security.Cryptography.Core.SymmetricAlgorithmNames.AesCbcPkcs7);
    var saveDataKeyBuffer = Windows.Security.Cryptography.CryptographicBuffer.CreateFromByteArray(new byte[32]);
    var saveDataKey = saveDataKeyProvider.CreateSymmetricKey(saveDataKeyBuffer);
    var saveDataSaltBuffer = Windows.Security.Cryptography.CryptographicBuffer.CreateFromByteArray(new byte[32]);
    var saveDataDataBuffer = Windows.Security.Cryptography.CryptographicBuffer.ConvertStringToBinary(
        "abcdefgh", Windows.Security.Cryptography.BinaryStringEncoding.Utf16BE);
    var saveDataOutBuffer = Windows.Security.Cryptography.Core.CryptographicEngine.Encrypt(
        saveDataKey, saveDataDataBuffer, saveDataSaltBuffer);
    var saveDataOutBytes = saveDataOutBuffer.ToArray();
}

C# bytes output:

80 87 109 195 133 40 205 81 117 91 17 132 229 3 119 251 205 8 246 64 13 57 210 142 11 153 121 39 122 196 63 10
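For reference, the same parameters can be cross-checked outside WinRT with plain .NET (System.Security.Cryptography). This is a minimal sketch, not part of the original post, assuming the WinRT call above amounts to AES-256 (32-byte key) in CBC mode with PKCS7 padding, and that the "salt" buffer passed to Encrypt acts as the IV (since it's all zeros, a 16-byte zero IV should be equivalent in effect). If those assumptions hold, it should print the same 32 bytes:

using System;
using System.Security.Cryptography;
using System.Text;

class WinRtAesCrossCheck
{
    static void Main()
    {
        // "abcdefgh" in UTF-16BE is 16 bytes (2 bytes per character), not 8.
        byte[] plaintext = Encoding.BigEndianUnicode.GetBytes("abcdefgh");

        using (Aes aes = Aes.Create())
        {
            aes.Mode = CipherMode.CBC;       // AesCbcPkcs7: CBC mode...
            aes.Padding = PaddingMode.PKCS7; // ...with PKCS7 padding
            aes.Key = new byte[32];          // 32-byte zero key selects AES-256
            aes.IV = new byte[16];           // zero IV, one AES block wide

            using (ICryptoTransform encryptor = aes.CreateEncryptor())
            {
                byte[] ciphertext = encryptor.TransformFinalBlock(plaintext, 0, plaintext.Length);
                Console.WriteLine(string.Join(" ", ciphertext)); // decimal, like the dumps in this post
            }
        }
    }
}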

Obj-C sample code:

Byte keyPtr[32];
bzero(keyPtr, sizeof(keyPtr));
Byte ivPtr[32];
bzero(ivPtr, sizeof(ivPtr));

NSString *text = @"abcdefgh";
NSUInteger dataLength;
void *buffer = malloc([text length]);
[text getBytes:buffer
     maxLength:[text length]
    usedLength:&dataLength
      encoding:NSUTF16BigEndianStringEncoding
       options:0
         range:NSMakeRange(0, dataLength)
remainingRange:nil];

size_t bufferSize = dataLength * kCCBlockSizeAES128;
void *bufferOut = malloc(bufferSize);

size_t numBytesEncrypted = 0;

CCCryptorStatus cryptStatus = CCCrypt(kCCEncrypt, kCCAlgorithmAES128, kCCOptionPKCS7Padding,
                                      keyPtr, kCCKeySizeAES256,
                                      ivPtr,
                                      buffer, dataLength,
                                      bufferOut, bufferSize,
                                      &numBytesEncrypted);

Obj-C bytes output:

23 144 186 234 149 182 123 79 155 234 250 54 52 38 151 87 179 62 176 1 203 115 59 1 35 54 176 1 44 213 120 1

Does anyone have an idea of what I'm doing wrong?

Thanks, Greg

Here is the working code; the main difference is that the data length accounts for UTF-16 being 2 bytes per character:

u_int8_t keyPtr[32];
bzero(keyPtr, sizeof(keyPtr));
u_int8_t ivPtr[32];
bzero(ivPtr, sizeof(ivPtr));

NSString *text = @"abcdefgh";
NSUInteger dataLength = [text length] * 2; // allow for UTF-16: 2 bytes per character
void *buffer = malloc(dataLength);
[text getBytes:buffer
     maxLength:dataLength
    usedLength:nil
      encoding:NSUTF16BigEndianStringEncoding
       options:0
         range:NSMakeRange(0, [text length]) // the range is in characters, not bytes
remainingRange:nil];

size_t bufferSize = dataLength * kCCBlockSizeAES128; // generously oversized; dataLength + one block would suffice
u_int8_t *bufferOut = malloc(bufferSize);

size_t numBytesEncrypted = 0;

CCCryptorStatus cryptStatus = CCCrypt(kCCEncrypt, kCCAlgorithmAES128, kCCOptionPKCS7Padding,
                                      keyPtr, kCCKeySizeAES256,
                                      ivPtr,
                                      buffer, dataLength,
                                      bufferOut, bufferSize,
                                      &numBytesEncrypted);

printf("encoded text in decimal: ");
for (int i = 0; i < numBytesEncrypted; i++) {
    printf("%d ", bufferOut[i]);
}
printf("\n");

printf output:

encoded text in decimal: 80 87 109 195 133 40 205 81 117 91 17 132 229 3 119 251 205 8 246 64 13 57 210 142 11 153 121 39 122 196 63 10

Yes, it's awful code; I made the minimum changes necessary. I guess we're entering a new era of data dumps in decimal; hex is dead.
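As a final cross-check, the ciphertext posted above can be decrypted from plain .NET to confirm both sides now agree. This is a minimal sketch, not part of the original answer; assuming AES-256-CBC with PKCS7 padding, a zeroed 32-byte key, and a zeroed 16-byte IV, it should print abcdefgh:

using System;
using System.Security.Cryptography;
using System.Text;

class RoundTripCheck
{
    static void Main()
    {
        // The 32 ciphertext bytes both platforms now produce, copied from the dumps above.
        byte[] ciphertext =
        {
            80, 87, 109, 195, 133, 40, 205, 81, 117, 91, 17, 132, 229, 3, 119, 251,
            205, 8, 246, 64, 13, 57, 210, 142, 11, 153, 121, 39, 122, 196, 63, 10
        };

        using (Aes aes = Aes.Create())
        {
            aes.Mode = CipherMode.CBC;
            aes.Padding = PaddingMode.PKCS7;
            aes.Key = new byte[32]; // zeroed AES-256 key, as in both samples
            aes.IV = new byte[16];  // zeroed IV

            using (ICryptoTransform decryptor = aes.CreateDecryptor())
            {
                byte[] plaintext = decryptor.TransformFinalBlock(ciphertext, 0, ciphertext.Length);
                Console.WriteLine(Encoding.BigEndianUnicode.GetString(plaintext)); // expected: abcdefgh
            }
        }
    }
}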

