Efficient way to checksum files in Perl without excessive memory usage


In my program I need to checksum many files. The checksum calculation happens within a File::Find find() call:

use File::Find;
use Digest::MD5 qw(md5_base64);

find(sub {
    my $file = $File::Find::name;
    return if ! length($file);
    open my $fh, '<', $file or return;
    my $chksum = md5_base64(<$fh>);   # reads the entire file into memory
    close $fh;
}, "/home/nijin");

The above code works, but if there is a large file, for example 6 GB, under /home/nijin, it loads all 6 GB into RAM, and the process holds that 6 GB continuously until it completes. Note that this is a backup process and it takes more than 12 hours to finish, so I lose 6 GB of memory for the whole run. In the worst case the process hangs due to the large memory usage. One option I have tried is File::Map; the code is pasted below.

use File::Find;
use File::Map qw(map_file);
use Digest::MD5 qw(md5_base64);

find(sub {
    my $file = $File::Find::name;
    return if ! length($file);
    map_file my $map, $file, '<';     # memory-map the file instead of reading it
    my $chksum = md5_base64($map);
}, "/home/nijin");

This code gives a segmentation fault while running. I have also tried Sys::Mmap, with the same issue as the first approach. Is there any other option to try?

There's no reason to read the whole file into memory at once.

You can explicitly process it in 64 KB chunks with the following:

use Digest::MD5;

my $chksum = do {
    open my $fh, '<:raw', $file or die "Can't open $file: $!";
    my $md5 = Digest::MD5->new;
    local $/ = \65536;                # read 64 KB at a time
    while (<$fh>) {
        $md5->add($_);
    }
    $md5->hexdigest;
};  # do whatever you need with $chksum here
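For completeness, here is a minimal sketch of how this chunked loop could be dropped back into the original find() call. The path and the skip logic are carried over from the question; b64digest is used to match the base64 output of md5_base64, and the print line is just a placeholder for whatever you do with the result:

use strict;
use warnings;
use File::Find;
use Digest::MD5;

find(sub {
    my $file = $File::Find::name;
    return unless -f $file;               # skip directories and special files
    open my $fh, '<:raw', $file or do { warn "Can't open $file: $!"; return };
    my $md5 = Digest::MD5->new;
    local $/ = \65536;                    # read in 64 KB records
    $md5->add($_) while <$fh>;
    close $fh;
    my $chksum = $md5->b64digest;         # same encoding as md5_base64
    print "$chksum  $file\n";
}, "/home/nijin");

Memory use stays bounded at roughly one chunk regardless of file size.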

You can also pass the filehandle directly, although that gives you no control over how it processes it:

use Digest::MD5;

my $chksum = do {
    open my $fh, '<:raw', $file or die "Can't open $file: $!";
    Digest::MD5->new->addfile($fh)->hexdigest;
};
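Note that addfile streams from the handle rather than loading the whole file, so it is also safe for multi-gigabyte files; the trade-off against the explicit loop is simply that you no longer choose the chunk size yourself.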
