
HTTP live streaming server on iPhone


I posted this on the Apple developer forum, where we are carrying on a lively (excuse the pun) discussion. This was in answer to someone who brought up a similar notion.

I think, and correct me if I am wrong (give us an example if you disagree), that creating an MPEG-TS from the raw frames you get from AVCaptureVideoDataOutput is not an easy task unless you encode them with x264 or something similar. Let's assume for a minute that you could easily get MPEG-TS files: then it would be a simple matter of listing them in an m3u8 playlist, launching a little web server and serving them. As far as I know, and there are many, many apps that do it, using localhost tunnels from the device is not a rejection issue. So maybe you could somehow generate HLS on the device, but I question the performance you would get.
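Just to make the "simple matter" part concrete, here is a rough sketch of building such a playlist, assuming you already had fixed-length .ts segments sitting on disk (the function name and segment names are mine, purely hypothetical):

```swift
import Foundation

// A minimal sketch: an HLS playlist is just a small text file that lists the
// MPEG-TS segments, which a little web server can then serve alongside them.
// No #EXT-X-ENDLIST tag is written, since a live stream keeps growing.
func makePlaylist(segmentNames: [String], targetDuration: Int) -> String {
    var lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        "#EXT-X-TARGETDURATION:\(targetDuration)",
        "#EXT-X-MEDIA-SEQUENCE:0"
    ]
    for name in segmentNames {
        lines.append("#EXTINF:\(targetDuration).0,")   // duration of this segment
        lines.append(name)
    }
    return lines.joined(separator: "\n") + "\n"
}

// e.g. makePlaylist(segmentNames: ["seg0.ts", "seg1.ts", "seg2.ts"], targetDuration: 10)
```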

So on to technique number 2. Still using AVCaptureVideoDataOutput, you capture the frames, wrap them in some neat little protocol, JSON or perhaps something more esoteric like bencode, open a socket and send them to your server. Ahh... good luck, you had better have a nice robust network, because sending uncompressed frames even over wifi is going to require serious bandwidth.
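To put numbers on that, here is roughly what the capture callback looks like and how big one uncompressed frame is. This assumes the output is configured for 32BGRA; the arithmetic and the (omitted) send step are my own illustration, not anything from the original discussion:

```swift
import AVFoundation

final class FrameSender: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    // Called on the queue passed to setSampleBufferDelegate(_:queue:)
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let height      = CVPixelBufferGetHeight(pixelBuffer)
        let frameBytes  = bytesPerRow * height      // size of ONE uncompressed frame

        // At 720p BGRA that is about 1280 * 4 * 720, roughly 3.7 MB per frame,
        // which works out to about 110 MB/s at 30 fps before any JSON/bencode wrapping.
        // Actually shipping frameBytes over a socket is left out here.
        _ = frameBytes
    }
}
```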

So on to technique number 3.

You write a new movie using AVAssetWriter and read back from the temp file using standard C functions. This is fine, but what you have is raw H.264; the mp4 is not complete, so it has no moov atom, and now comes the fun part: regenerating that header. Good luck.
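You can see the problem for yourself with a quick sketch like the one below (the helper name is hypothetical): walking the top-level atoms of the in-progress file turns up ftyp/wide/mdat, but no moov until the writer has actually finished.

```swift
import Foundation

// A minimal sketch: each top-level QuickTime/MP4 box starts with a 4-byte
// big-endian size followed by a 4-byte type. While AVAssetWriter is still
// writing you typically only find 'ftyp'/'wide'/'mdat'; the 'moov' atom with
// the sample tables is not there yet.
func listTopLevelBoxes(at url: URL) throws -> [String] {
    let file = try FileHandle(forReadingFrom: url)
    defer { try? file.close() }
    var boxes: [String] = []
    while true {
        let header = file.readData(ofLength: 8)
        guard header.count == 8 else { break }
        let size = header.prefix(4).reduce(0) { ($0 << 8) | Int($1) }
        let type = String(decoding: header.suffix(4), as: UTF8.self)
        boxes.append(type)
        guard size >= 8 else { break }   // sizes 0 and 1 (to-EOF / 64-bit) not handled in this sketch
        file.seek(toFileOffset: file.offsetInFile + UInt64(size - 8))
    }
    return boxes
}
```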

So on to technique 4, which seems to actually have some merit.

We create not one but two AVAssetWriters and manage them with a GCD dispatch_queue. Since an AVAssetWriter can only be used for a single writing pass, we start the first one on a timer; after a pre-determined period, say 10 seconds, we start the second while tearing the first one down. Now we have a series of .mov files with complete moov atoms, each containing compressed H.264 video. Now we can send these to the server and assemble them into one complete video stream. Alternatively, we could use a simple streamer that takes the mov files, wraps them in the RTMP protocol using librtmp, and sends them to a media server.
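A rough sketch of that double-writer rotation might look something like this. The 10-second figure comes from above, but the 720p settings, queue label and upload step are my own placeholders, and actually feeding the writers (startSession / appending sample buffers from the capture callback) is omitted for brevity:

```swift
import AVFoundation

// One writer records while the previous one is finished and shipped;
// everything is serialized on a single GCD queue.
final class SegmentedRecorder {
    private let queue = DispatchQueue(label: "segment.writer")
    private var activeWriter: AVAssetWriter?
    private var activeInput: AVAssetWriterInput?

    private func makeWriter() throws -> (AVAssetWriter, AVAssetWriterInput) {
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent(UUID().uuidString + ".mov")
        let writer = try AVAssetWriter(outputURL: url, fileType: .mov)
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720
        ])
        input.expectsMediaDataInRealTime = true
        writer.add(input)
        _ = writer.startWriting()
        return (writer, input)
    }

    // Call this every ~10 seconds from a timer: spin up a fresh writer first,
    // then finish the old one so its .mov gets a complete moov atom.
    func rotate() {
        queue.async {
            let previousWriter = self.activeWriter
            let previousInput  = self.activeInput
            if let (writer, input) = try? self.makeWriter() {
                self.activeWriter = writer
                self.activeInput  = input
            }
            previousInput?.markAsFinished()
            previousWriter?.finishWriting {
                if let url = previousWriter?.outputURL {
                    self.uploadSegment(at: url)   // hypothetical: POST to a server, or wrap with librtmp
                }
            }
        }
    }

    private func uploadSegment(at url: URL) { /* ship the finished segment */ }
}
```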

Could we just send each individual mov file to another Apple device, thus getting device-to-device communication? That question has been misinterpreted many, many times. Locating another iPhone on the same subnet over wifi is pretty easy and could be done. Locating another device over TCP on a cellular connection is almost magical; if it can be done at all, it is only possible on cell networks that use addressable IPs, and not all common carriers do.

Say you could; then you have an additional issue, because none of the AVFoundation video players will be able to handle the transition between that many separate movie files. You would have to write your own streaming player, probably based on ffmpeg decoding. (That does work rather well.)