
node (socket) live audio stream / broadcast


You'll want to look for packages that work on Streams and from there it's just about piping your streams to output as necessary. Using Express or just the built-in HTTP you can accomplish this quite easily. Here's an example built around osx-audio which provides a PCM stream, lame which can encode a stream to mp3, and Express:

var Webcast = function(options) {
  var lame = require('lame');
  var audio = require('osx-audio');

  // create the Encoder instance
  var encoder = new lame.Encoder({
    // input
    channels: 2,        // 2 channels (left and right)
    bitDepth: 16,       // 16-bit samples
    sampleRate: 44100,  // 44,100 Hz sample rate

    // output
    bitRate: options.bitrate,
    outSampleRate: options.samplerate,
    mode: (options.mono ? lame.MONO : lame.STEREO) // STEREO (default), JOINTSTEREO, DUALCHANNEL or MONO
  });

  var input = new audio.Input();
  input.pipe(encoder);

  // set up an express app
  var express = require('express');
  var app = express();

  app.get('/stream.mp3', function (req, res) {
    res.set({
      'Content-Type': 'audio/mpeg',
      'Transfer-Encoding': 'chunked'
    });
    encoder.pipe(res);
  });

  var server = app.listen(options.port);
};

module.exports = Webcast;

How you get your input stream might be the most interesting part, but that will depend on your implementation. The popular request package is built around Streams as well though, so it might just be an HTTP request away!


In the web browser you have the HTML5 video element and the audio element, both of which take a source. Each browser supports different codecs natively, so you'll want to watch out for that if you're trying to stream mp3.

You don't need socket.io, you only need HTTP. Your app is reading a file, music.ogg, and for each chunk it reads, it will send it through the http server. It will be one single HTTP request that's kept open until the file is transferred.

Here's how your HTML will look:

<audio src="http://example.com/music.ogg"></audio>

And your Node.js code will be something like this (untested):

var http = require('http');
var fs = require('fs');

http.createServer(function (request, response) {
  response.writeHead(200, { 'Content-Type': 'audio/ogg' });
  var inputStream = fs.createReadStream('/path/to/music_file.ogg');
  inputStream.pipe(response);
}).listen(8080);

I'm only using the ReadableStream.pipe method on the inputStream and the http and fs modules for the above code. If you want to transcode the audio file (for example, from mp3 to ogg) you'll want to find a module that does that and pipe the data from the file into the transcoder then into response:

// using some magical transcoder
inputStream.pipe(transcoder).pipe(response);

pipe will call end on the response whenever the source stream finishes, so the HTTP request will be finished as soon as the file is done being read (and transcoded).


You can do this with Node and WebRTC. There are some ready-to-use tools like SimpleWebRTC or EasyRTC. From what I've tested, video is still troublesome, but audio works great.