javascript - Replacing fs.readFile with fs.createReadStream in Node.js


I have code that reads an image from a directory and sends it to index.html.

I am trying to replace fs.readFile with fs.createReadStream, but I have no idea how to implement it and cannot find an example.

Here is what I've got (index.js):

var app = require('express')();
var http = require('http').Server(app);
var io = require('socket.io')(http);
var fs = require('fs');

http.listen(3000, function () {
    console.log('listening on *:3000');
});

app.get('/', function (req, res) {
    res.sendFile(__dirname + '/public/views/index.html');
});

io.on('connection', function (socket) {
    fs.readFile(__dirname + '/public/images/image.png', function (err, buf) {
        socket.emit('image', { image: true, buffer: buf.toString('base64') });
    });
});

index.html

<!doctype html>
<html>
<body>

<canvas id="canvas" width="200" height="100">
    Your browser does not support the HTML5 canvas tag.
</canvas>

<script src="https://cdn.socket.io/socket.io-1.2.0.js"></script>

<script>
    var socket = io();
    var ctx = document.getElementById('canvas').getContext('2d');
    socket.on("image", function (info) {
        if (info.image) {
            var img = new Image();
            img.src = 'data:image/jpeg;base64,' + info.buffer;
            ctx.drawImage(img, 0, 0);
        }
    });
</script>
</body>
</html>

The approach below uses only core modules. It reads chunks from the stream.Readable instance returned by fs.createReadStream() and returns the collected chunks as a Buffer. This isn't a great approach if you aren't going to stream the chunks back somewhere: you end up holding the whole file in a Buffer in memory, so it is only a solution for reasonably sized files.

io.on('connection', function (socket) {
  fileToBuffer(__dirname + '/public/images/image.png', (err, imageBuffer) => {
    if (err) {
      socket.emit('error', err);
    } else {
      socket.emit('image', { image: true, buffer: imageBuffer.toString('base64') });
    }
  });
});

const fileToBuffer = (filename, cb) => {
    let readStream = fs.createReadStream(filename);
    let chunks = [];

    // Handle errors while reading: the file could not be read
    readStream.on('error', err => {
        return cb(err);
    });

    // Listen for incoming data chunks and collect them
    readStream.on('data', chunk => {
        chunks.push(chunk);
    });

    // The file is done being read: concatenate the chunks into a single Buffer
    readStream.on('close', () => {
        return cb(null, Buffer.concat(chunks));
    });
};
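For reference, a minimal sketch of calling that helper on its own, assuming the fileToBuffer function from the snippet above is in scope and the path is just an example:

// Hypothetical standalone usage of the fileToBuffer helper defined above
fileToBuffer(__dirname + '/public/images/image.png', (err, buf) => {
    if (err) return console.error('Could not read file:', err);
    // buf is a Buffer holding the whole file in memory
    console.log('Read ' + buf.length + ' bytes');
});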

HTTP response stream example

It's a better idea to stream the data over HTTP, since streaming is built into the protocol and you never need to load all of the data into memory at once: you can pipe() the file stream directly into the response.

This is a basic example without any bells and whistles, just to demonstrate how to pipe() a stream.Readable into an http.ServerResponse. The example uses Express, but it works exactly the same way with the http or https Node.js core APIs.

const express = require('express');
const fs = require('fs');
const server = express();

const port = process.env.PORT || 1337;

server.get('/image', (req, res) => {
    let readStream = fs.createReadStream(__dirname + '/public/images/image.png');

    // When the stream is done being read, end the response
    readStream.on('close', () => {
        res.end();
    });

    // Stream chunks to the response
    readStream.pipe(res);
});

server.listen(port, () => {
    console.log(`Listening on ${port}`);
});
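One thing this bare-bones example leaves out is error handling on the read stream: if the file cannot be opened, the stream emits an 'error' event and, unhandled, the request is left hanging. A minimal sketch of the same (example) route with that case handled might look like this:

server.get('/image', (req, res) => {
    let readStream = fs.createReadStream(__dirname + '/public/images/image.png');

    // If the file cannot be read (e.g. it does not exist), reply with a 404
    // instead of leaving the request open; nothing has been sent yet at this point
    readStream.on('error', err => {
        console.error(err);
        res.sendStatus(404);
    });

    // When the stream is done being read, end the response
    readStream.on('close', () => {
        res.end();
    });

    // Stream chunks to the response
    readStream.pipe(res);
});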

