amazon s3 - How do I get Zlib to uncompress from S3 stream in Ruby?


Ruby's Zlib::GzipReader is supposed to be created by passing it an IO-like object (it must have a read method that behaves the same as IO#read).
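
For example, a plain File works fine, because it responds to read (the local path here is just for illustration):

require 'zlib'

# Any IO-like object that responds to read will do, e.g. a File:
file = File.open('some_local_file.gz', 'rb')
gz = Zlib::GzipReader.new(file)
puts gz.read
gz.close  # also closes the underlying file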

My problem is that I can't get an IO-like object out of the AWS::S3 lib. As far as I know, the only way of having a stream is by passing a block to S3Object#stream.

I tried:

Zlib::GzipReader.new(AWS::S3::S3Object.stream('file', 'bucket'))
# which gives me the error:
# undefined method `read' for #<AWS::S3::S3Object::Value:0x000000017cbe78>

Does anyone know how I can achieve this?

A simple solution is to write the downloaded data to a StringIO and then read it back out:

require 'stringio'
require 'zlib'

io = StringIO.new
io.write AWS::S3::S3Object.value('file', 'bucket')
io.rewind

gz = Zlib::GzipReader.new(io)
data = gz.read
gz.close

# do something with data ...
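
If you are on Ruby 2.4 or newer and the object comfortably fits in memory, the same buffered approach can be shortened with Zlib.gunzip; a minimal sketch, still using the S3Object.value call from above:

require 'zlib'

compressed = AWS::S3::S3Object.value('file', 'bucket')
data = Zlib.gunzip(compressed)  # inflates the whole string in one go

# do something with data ...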

A more elaborate way is to start inflating the gzipped data while the stream is still downloading, which can be achieved with IO.pipe. Something along the lines of this:

reader, writer = IO.pipe

fork do
  reader.close
  AWS::S3::S3Object.stream('file', 'bucket') do |chunk|
    writer.write chunk
  end
end

writer.close

gz = Zlib::GzipReader.new(reader)
while line = gz.gets
  # do something with line ...
end

gz.close
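
Two housekeeping details are worth noting here: it is the parent closing its copy of writer that lets GzipReader see end-of-file, and in a long-running parent you will also want to reap the forked child. A sketch with that added (only Process.wait and the pid variable are new; the AWS::S3 calls are the same as above):

reader, writer = IO.pipe

# Keep the child's pid so it can be reaped once reading is finished.
pid = fork do
  reader.close
  AWS::S3::S3Object.stream('file', 'bucket') do |chunk|
    writer.write chunk
  end
end

writer.close  # the parent's copy must be closed, or GzipReader never sees EOF

gz = Zlib::GzipReader.new(reader)
while line = gz.gets
  # do something with line ...
end

gz.close
Process.wait(pid)  # avoid leaving a zombie child behind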

You can also use a thread instead of fork:

reader, writer = IO.pipe

thread = Thread.new do
  AWS::S3::S3Object.stream('file', 'bucket') do |chunk|
    writer.write chunk
  end
  writer.close
end

gz = Zlib::GzipReader.new(reader)
while line = gz.gets
  # do something with line
end

gz.close
thread.join
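
One caveat with the threaded version is error handling: if the download raises inside the block, the exception only surfaces at thread.join, usually after GzipReader has already choked on a truncated stream. A slightly more defensive sketch (the begin/ensure and Thread#abort_on_exception are the only additions, both standard Ruby):

reader, writer = IO.pipe

thread = Thread.new do
  begin
    AWS::S3::S3Object.stream('file', 'bucket') do |chunk|
      writer.write chunk
    end
  ensure
    writer.close  # always close, so the reader sees EOF even after an error
  end
end
thread.abort_on_exception = true  # surface download errors right away

gz = Zlib::GzipReader.new(reader)
while line = gz.gets
  # do something with line
end

gz.close
thread.join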
