Stream Video from the Raspberry Pi Camera to Web Browsers, Even on iOS and Android
I've been excited about the Raspberry Pi Camera Module since it was announced last year, so I went and ordered one from Element14 as soon as it came on sale.
I have a few ideas for cool things to build with this camera and I will be blogging about them as I get to develop them. Today, I will show you how to transform the Raspberry Pi into a webcam server. You will be able to watch the video stream from the camera on any device that has a web browser. And yes, this includes the iPad/iPhone and Android devices!
The official streaming method
The introductory article about the camera module in the Raspberry Pi blog shows a method to stream video from the Raspberry Pi to another computer. This method essentially works as follows:
- On the Pi, the raspivid utility is used to encode H.264 video from the camera.
- The video stream is piped to the nc utility, which pushes it out to the network address where the video player is.
- On the player computer, nc receives the stream and pipes it into a video player.
This is an efficient method of streaming video from the Pi to another computer, but it has a few problems:
- The Raspberry Pi needs to know the address of the computer that is playing the video
- The playing computer needs to have an advanced player that can play a raw H.264 video stream. No mobile device that I know of can do this, for example.
- Since this system relies on a direct connection between the Pi and the player, it is impossible to have the player computer connect to and disconnect from the stream at will; the connection needs to be on at all times.
- What if there are two or three concurrent players? Things get awfully complicated for the Pi.
This ad hoc solution that the Raspberry Pi Camera team proposes isn't that useful to me, so I went to search for better options.
I think an important requirement for a streaming camera is that you can view it with ease. To me, this means that the stream should be playable from a web browser. Having to run a custom player is a complication, and puts it out of reach of most mobile devices.
There are a few modern streaming protocols for web browsers out there. For example, HLS is Apple's choice, so it has great support on iDevices but not much elsewhere. Another one, called Fragmented MP4 is supported by Adobe and Microsoft, but requires browser plugins from these companies on the player computer, so Windows and Mac computers can do it, but Linux and mobile cannot. HTML5 video is also based on the MP4 format but support is not that great.
Besides, all the streaming protocols listed above need a streaming server that prepares the video for streaming by segmenting and packaging it, and while there are several open source utilities that can do this for a static video stream, I haven't found any that can do it on a live stream. Note: I have been corrected on this statement; more recent releases of ffmpeg than the binary available for Raspbian can generate an HLS live stream.
So what other options are there?
Motion JPEG to the rescue
I then investigated how IP webcams do it, and a lot of them use an older streaming protocol called Motion JPEG or MJPEG.
What is Motion JPEG? Pretty simple, it's just a stream of individual JPEG pictures, one after another. I was surprised to find that most modern browsers can play MJPEG streams natively.
The downside of MJPEG streams is that they are not as efficient as H.264, which greatly improves quality and reduces size by encoding only the differences from one frame to the next. With MJPEG, each frame is encoded as an entire JPEG picture. For my needs this isn't a concern, though.
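To make the format concrete, here is a minimal sketch (my own illustration in Python, not code from any of the tools discussed here) of the framing an MJPEG-over-HTTP server uses: the response is of type multipart/x-mixed-replace, and every part of the multipart body is one complete JPEG image. The boundary string is an arbitrary name chosen for this example.

```python
# Sketch of MJPEG-over-HTTP framing: the server sends a single
# multipart/x-mixed-replace response, and each part is one full JPEG.
BOUNDARY = b"frameboundary"  # arbitrary marker separating frames

def mjpeg_response_headers():
    """HTTP response headers telling the browser to expect an endless
    stream of parts, each one replacing the previously shown image."""
    return (b"HTTP/1.1 200 OK\r\n"
            b"Content-Type: multipart/x-mixed-replace; "
            b"boundary=" + BOUNDARY + b"\r\n\r\n")

def mjpeg_part(jpeg_bytes):
    """Wrap one JPEG frame as a single part of the stream."""
    return (b"--" + BOUNDARY + b"\r\n"
            b"Content-Type: image/jpeg\r\n"
            b"Content-Length: " + str(len(jpeg_bytes)).encode() + b"\r\n\r\n"
            + jpeg_bytes + b"\r\n")

# A real server would loop forever: read the newest JPEG from disk,
# send mjpeg_part(data), repeat.
```

A browser that understands multipart/x-mixed-replace (most do) swaps in the new image every time a part arrives, which is what produces the illusion of video.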
Continuing with my research I stumbled upon MJPG-streamer, a small open source MJPEG streaming server written in C that I was easily able to compile for the Raspberry Pi.
The following sections describe how I've used this tool to create a very flexible, play anywhere, streaming server for my Raspberry Pi camera.
UPDATE: This section is outdated. Please use the instructions on my updated guide to build and install MJPG-Streamer.
Unfortunately there isn't a package for MJPEG-streamer that can be installed with apt-get, so it needs to be compiled from source.
MJPEG-streamer is hosted at sourceforge.net, so head over to the project's download page to get the source tarball.
To compile this application I used the following commands:
$ sudo apt-get install libjpeg8-dev
$ sudo apt-get install imagemagick
$ tar xvzf mjpg-streamer-r63.tar.gz
$ cd mjpg-streamer-r63
$ make
This tool requires libjpeg and the convert tool from ImageMagick, so I had to install those as well.
The makefile does not include an install target, so if you want to have this utility properly installed you will need to copy the mjpg_streamer binary and its plugins (output_*.so) to a directory that is in the path, like /usr/local/bin. It is also possible to run the tool directly from the build directory.
Setting up the JPEG source stream
The streaming server needs a sequence of JPEG files to stream, and for this we are going to use the raspistill utility that is part of Raspbian. For those concerned about performance, keep in mind that the JPEG encoder used by raspistill runs in the GPU, so the load required to generate the JPEGs is pretty small.
To setup a constant stream of JPEG images the command is as follows:
$ mkdir /tmp/stream
$ raspistill -w 640 -h 480 -q 5 -o /tmp/stream/pic.jpg -tl 100 -t 9999999 -th 0:0:0 &
Let's go over the arguments to raspistill one by one:
- -w sets the image width. For an HD stream use 1920 here.
- -h sets the image height. For an HD stream use 1080 here.
- -q sets the JPEG quality level, from 0 to 100. I use a pretty low quality; better quality generates bigger pictures, which reduces the frame rate.
- -o sets the output filename for the JPEG pictures. I'm sending them to a temp directory. The same file will be rewritten with updated pictures.
- -tl sets the timelapse interval, in milliseconds. With a value of 100 you get 10 frames per second.
- -t sets the time the program will run. I put a large number here, which amounts to a little under three hours of run time.
- -th sets the thumbnail picture options. Since I want the pictures to be as small as possible, I disabled the thumbnails by setting everything to zero.
- & puts the application to run in the background.
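The arithmetic behind the -tl and -t values is easy to double-check with a couple of helper functions (plain Python, just for illustration):

```python
# Sanity-check the numbers implied by raspistill's -tl (timelapse
# interval, in ms) and -t (total run time, in ms) arguments.
def fps_from_interval(tl_ms):
    """Frames per second implied by a timelapse interval in milliseconds."""
    return 1000.0 / tl_ms

def run_time_hours(t_ms):
    """Total run time in hours implied by the -t value in milliseconds."""
    return t_ms / 1000.0 / 3600.0

print(fps_from_interval(100))             # -> 10.0 frames per second
print(round(run_time_hours(9999999), 2))  # -> 2.78 hours
```

Note that these are the requested rates; as a commenter points out below, raspistill may not keep up with a 100ms interval, so the achieved frame rate can be lower.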
Starting the streaming server
Okay, so now we have a background task that is writing JPEGs from the camera at a rate of ten per second. All that is left is to start the streaming server. Assuming you are running it from the build directory the command is as follows:
$ LD_LIBRARY_PATH=./ ./mjpg_streamer -i "input_file.so -f /tmp/stream -n pic.jpg" -o "output_http.so -w ./www"
Let's break this command down to understand it:
- LD_LIBRARY_PATH sets the path for dynamic link libraries to the current directory. This is so that the application can find the plugins, which are in the same directory.
- -i sets the input plugin. We are using a plugin called input_file.so. This plugin watches a directory, and any time it detects that a JPEG file was written to it, it streams that file. The folder and file to watch are given as the -f and -n arguments.
- -o sets the output plugin. We are using the HTTP streaming plugin, which starts a web server that we can connect to to watch the video. The root directory of the web server is given as the -w argument. We will use the default web pages that come with the application for now; these can be changed and customized as necessary.
Watching the stream
Now everything is ready. Go to any device that has a web browser and connect to http://IP-address:8080, where IP-address is the IP address or hostname of your Raspberry Pi (8080 is the default port used by the HTTP output plugin).
I tested playback of the stream from an iPad, an Android smartphone and a variety of web browsers on Windows and OS X, and I was able to play the stream in all of them.
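Browsers aren't the only possible consumers, by the way. A script can also pull individual frames out of the stream. The sketch below is my own illustration, not part of mjpg-streamer: it splits raw stream bytes into complete JPEG frames using the standard JPEG start-of-image (FFD8) and end-of-image (FFD9) markers.

```python
# Extract complete JPEG frames from raw MJPEG stream bytes by scanning
# for the JPEG start-of-image (FFD8) and end-of-image (FFD9) markers.
# Returns the complete frames plus any trailing partial frame data.
SOI = b"\xff\xd8"  # start of image
EOI = b"\xff\xd9"  # end of image

def extract_frames(buffer):
    frames = []
    while True:
        start = buffer.find(SOI)
        if start == -1:
            return frames, b""
        end = buffer.find(EOI, start + 2)
        if end == -1:
            # Incomplete frame: keep the remainder for the next read.
            return frames, buffer[start:]
        frames.append(buffer[start:end + 2])
        buffer = buffer[end + 2:]
```

With mjpg-streamer's default web pages the raw stream is served at the /?action=stream URL, so a small script could open that URL with urllib and feed each chunk it reads into extract_frames.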
I hope you find this method useful. Let me know in the comments below if you have a different method of streaming.
#1 Besnik Brahimi said 2013-05-26T22:12:18Z
Thanks for sharing your research results. I've also spent much time finding a good solution. MJPG isn't an option for me because it is inefficient and streaming has a lag of 5+ seconds. I need some kind of real time streaming... I'm actually using nc but with the "-u" (=UDP) parameter (1080p with no lag!)...
The option that would be best is to use the stream directly in an HTML5 video tag :-)
#2 Miguel Grinberg said 2013-05-26T22:40:21Z
@Besnik: If I keep the resolution small (640x480) and a slowish frame rate (10fps or less) I get a negligible lag of no more than a second with the MJPEG setup I described in this article. For HD I can keep the lag well below 5s if I reduce the frame rate to 2-3 fps. The bad news I have to give you is that all of the advanced streaming protocols (HLS, MP4, DASH) stream in segments of 2-10 seconds, and the latency for live streams is at least two segments, so you'll be looking at a 4 second lag in the best case. I will write a second part to this article once I figure out how to implement these advanced protocols.
#3 Peter kula said 2013-05-27T20:53:43Z
How did you make it? I keep on getting error.
gcc -O2 -DLINUX -D_GNU_SOURCE -Wall -c -o mjpg_streamer.o mjpg_streamer.c
mjpg_streamer.c:27:28: fatal error: linux/videodev.h: No such file or directory
make: *** [mjpg_streamer.o] Error 1
I tried installing libv4l-dev, but that only gave me videodev2.h, which is not a compatible header. :( I am using Wheezy, upgraded. Not sure what I am missing?
#4 browni said 2013-05-27T20:54:23Z
I tried your procedure but failed
It seems that the transfer to /var/www is not working. Can you help me?
Is it possible to stream over the web?
#5 Miguel Grinberg said 2013-05-27T21:42:38Z
@Peter: did you update your Raspbian to latest? Run the following:
$ sudo apt-get update
$ sudo apt-get upgrade
Then try again.
#6 Miguel Grinberg said 2013-05-27T21:44:15Z
@browni: the /var/www folder isn't used for this. The mjpg-streamer application includes its own web server, which in my example above is configured to serve from the mjpg-streamer/www folder. Can you be more specific about the failure? What error(s) do you get?
#7 rich said 2013-05-29T02:52:43Z
@Peter_kula try installing from svn as described here:
#8 BenScar said 2013-05-31T23:17:44Z
I couldn't get the mjpg-streamer working from the stated location, looking at the thread mentioned by @rich I found the version at:
That one worked a treat, now have it running to multiple devices:
Thanks for all the info Miguel et all.
#9 Basti said 2013-06-01T15:35:26Z
Thanks for that tutorial. It works even for a Raspberry novice like me.
And thanks to BenScar; I downloaded the mjpg-streamer tar from his source, because I had the same error Peter had.
@Miguel: Does your video really have 10fps? When I use
raspistill -w 640 -h 480 -q 5 -o /tmp/stream/pic.jpg -tl 100 -t 30000 -th 0:0:0 &
my video has less than 10fps. And it's running more than 30s (which should be the time with -t 30000). I guess it just takes its time to get those 300 pictures (-tl 100 -t30000) and it cannot achieve this within 10fps.
#10 Johann said 2013-06-01T21:04:55Z
Thank you for your tutorial. I followed successfully each step but... No image displayed in the stream page, just a blank one.
Everything is up to date.
Did I miss something?
#11 Miguel Grinberg said 2013-06-01T21:07:26Z
#12 Jacob said 2013-06-04T19:03:20Z
Any ideas of ways to approach accessing the stream from another network other than the one that the camera is on? I want to find a way, without opening up certain ports on the router to access the stream from my phone on another network, most likely the cellular network. I realize it is probably difficult if not impossible, but I figured maybe someone could help me come up with a solution or at least a starting point.
#13 Miguel Grinberg said 2013-06-05T04:11:08Z
@Jacob: if you don't want to enable direct access to your Pi you can have the Pi push the jpegs to a server outside of your network, maybe a cheap VPS. Then you run mjpg-streamer on the VPS. The frame rates will decrease, but your Pi will not be reachable from the outside.
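That push setup could be sketched as follows (the upload URL and the receiver on the VPS are hypothetical, not something described in this article): a loop on the Pi POSTs the latest pic.jpg to the remote machine, where a small receiver writes each body into the directory mjpg-streamer watches.

```python
# Sketch of pushing the newest JPEG to a remote server over HTTP.
# The URL and polling interval are made-up examples; a matching
# receiver on the VPS would save each request body as pic.jpg in
# the directory that mjpg-streamer's input_file.so plugin watches.
import urllib.request

def build_upload_request(url, jpeg_bytes):
    """Build a POST request carrying one JPEG frame in its body."""
    return urllib.request.Request(
        url,
        data=jpeg_bytes,
        headers={"Content-Type": "image/jpeg"},
        method="POST",
    )

# On the Pi, something like:
#   while True:
#       with open("/tmp/stream/pic.jpg", "rb") as f:
#           urllib.request.urlopen(build_upload_request(UPLOAD_URL, f.read()))
#       time.sleep(0.5)
```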
#14 szantaii said 2013-06-05T16:44:30Z
You've mentioned this in your post: " I have been corrected on this statement, more recent releases of ffmpeg than the binary available for Raspbian can generate an HLS live stream."
Do you have any resources how to achieve this? (I mean HLS live stream using ffmpeg.)
#15 Miguel Grinberg said 2013-06-05T16:57:44Z
@szantaii: the Arch Linux distro comes with a much more recent ffmpeg binary. I have followed more or less the instructions in this page to achieve HLS streaming:
#16 Peter said 2013-06-12T11:57:49Z
I had the same thing; I found that raspistill had tripped and wasn't sending pictures to pic.jpg. A hard reboot of the Pi and restarting everything got it working for me.
#17 Donald Palmer said 2013-06-24T01:11:32Z
gcc -O2 -DLINUX -D_GNU_SOURCE -Wall -c -o mjpg_streamer.o mjpg_streamer.c
mjpg_streamer.c:27:28: fatal error: linux/videodev.h: No such file or directory
compilation terminated.
make: *** [mjpg_streamer.o] Error 1
Tried the update and upgrade .. still get the same error. Any ideas?
#18 Miguel Grinberg said 2013-06-24T06:02:58Z
@Donald: by now I can't really remember for sure how I obtained the source for mjpg_streamer, but after a web search I've found that the official download from sourceforge.net is really old. Please try downloading a snapshot of the latest code using this link instead: http://sourceforge.net/p/mjpg-streamer/code/HEAD/tarball. Let me know if this helps.
#19 Tatsuya said 2013-07-04T01:04:01Z
Thank you for the instruction. It works like a charm.
Some guy suggests that writing jpeg images frequently on an SD card will wear out the card quickly. So it's better to write the images on tmpfs rather than the physical flash memory.
#20 Tatsuya said 2013-07-04T08:39:47Z
The smaller the tmpfs partition size for MJPEG, the better the streaming quality. 4-8 MB is sufficient.
Though the /run partition is already mounted on tmpfs, never use it for MJPEG.
#21 Jaf said 2013-07-08T11:31:25Z
Recent Safari on iPad or iPhone 5 can access the live stream, but it does not refresh automatically. Any hint how to solve this issue?
Safari on iPhone 4S has no problem. Safari on MacBook Pro, Chrome on Windows 7 and Chrome on Android all have no problem...
#22 Miguel Grinberg said 2013-07-09T04:57:46Z
#23 Steve said 2013-07-22T21:20:18Z
Brilliant tutorial. Using the Pi camera and your streaming technique as eyes for an NXT-python Robot. Works like a charm. Thanks for a very clear to follow outline.
#24 Jo said 2013-07-24T14:28:37Z
When I try to install the streamer and run the make command I get the following error:
make: [v4l2uvc.lo] Error 1
make: Leaving directory '/home/mjpg-streamer/mjpg-streamer/plugins/
make: [input_uvc.so] Error 2
I don't know why, because I followed your tutorial step by step...
#25 Miguel Grinberg said 2013-07-24T16:23:19Z
@Jo: the "uvc" plugin for mjpg_streamer is not used. Check if you have a "mjpg_streamer" binary. If you do, you are good to go!