Stream Video from the Raspberry Pi Camera to Web Browsers, Even on iOS and Android

I've been excited about the Raspberry Pi Camera Module since it was announced last year, so I went and ordered one from Element14 as soon as it came on sale.

I have a few ideas for cool things to build with this camera and I will be blogging about them as I get to develop them. Today, I will show you how to transform the Raspberry Pi into a webcam server. You will be able to watch the video stream from the camera on any device that has a web browser. And yes, this includes the iPad/iPhone and Android devices!

The official streaming method

The introductory article about the camera module in the Raspberry Pi blog shows a method to stream video from the Raspberry Pi to another computer. This method essentially works as follows:

  • On the Pi, the raspivid utility is used to encode H.264 video from the camera.
  • The video stream is piped to the nc utility, which pushes it out to the network address where the video player is.
  • On the player computer, nc receives the stream and pipes it into mplayer to play.
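
For reference, that pipeline boils down to a pair of commands along these lines (a sketch; <player-ip> and the port are placeholders, and the option values may differ from the ones in the original article):

```shell
# On the player computer: listen for the stream and pipe it into mplayer
$ nc -l -p 5001 | mplayer -fps 30 -cache 1024 -

# On the Raspberry Pi: capture H.264 and push it to the player's address
$ raspivid -t 0 -w 1280 -h 720 -o - | nc <player-ip> 5001
```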

This is an efficient method of streaming video from the Pi to another computer, but it has a few problems:

  • The Raspberry Pi needs to know the address of the computer that is playing the video.
  • The playing computer needs an advanced player that can play a raw H.264 video stream. No mobile device that I know of can do this, for example.
  • Since this system relies on a direct connection between the Pi and the player, the player computer cannot connect to and disconnect from the stream at will; the connection needs to be on at all times.
  • What if there are two or three concurrent players? Things get awfully complicated for the Pi.

This ad hoc solution that the Raspberry Pi Camera team proposes isn't that useful to me, so I went to search for better options.

Streaming protocols

I think an important requirement for a streaming camera is that you can view it with ease. To me, this means that the stream should be playable from a web browser. Having to run a custom player is a complication, and puts it out of reach of most mobile devices.

There are a few modern streaming protocols for web browsers out there. For example, HLS is Apple's choice, so it has great support on iDevices but not much anywhere else. Another one, called Fragmented MP4, is supported by Adobe and Microsoft, but it requires browser plugins from these companies, so Windows and Mac computers can play it while Linux and mobile devices cannot. HTML5 video is also based on the MP4 format, but browser support is not that great.

Besides, all the streaming protocols listed above need a streaming server that prepares the video for streaming by segmenting and packaging it. While there are several open source utilities that can do this for a static video file, I haven't found any that can do it on a live stream. Note: I have been corrected on this statement; releases of ffmpeg more recent than the binary available for Raspbian can generate an HLS live stream.
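
For completeness, a live HLS setup with one of those newer ffmpeg builds would look roughly like this (a sketch under stated assumptions: the option names come from ffmpeg's HLS muxer, while the pipe from raspivid and the output path are illustrative, not commands I have run):

```shell
# Segment a live H.264 feed from the camera into an HLS playlist.
# ffmpeg writes stream.m3u8 plus numbered .ts segment files next to it.
$ raspivid -t 0 -w 1280 -h 720 -o - | \
    ffmpeg -i - -codec copy -f hls \
           -hls_time 4 -hls_list_size 5 \
           /var/www/stream.m3u8
```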

So what other options are there?

Motion JPEG to the rescue

I then investigated how IP webcams do it, and a lot of them use an older streaming protocol called Motion JPEG or MJPEG.

What is Motion JPEG? Pretty simple, it's just a stream of individual JPEG pictures, one after another. I was surprised to find that most modern browsers can play MJPEG streams natively.

The downside of MJPEG streams is that they are not as efficient as H.264, which greatly improves quality and reduces size by encoding only the differences from one frame to the next. With MJPEG, each frame is encoded as an entire JPEG picture. For my needs this isn't a concern, though.
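
To make the format concrete, here is a minimal sketch (in Python, my own illustration rather than anything from the tools discussed below) of how an MJPEG stream is framed over HTTP: the server sends a single multipart/x-mixed-replace response, and each part of it is one complete JPEG picture.

```python
# Sketch of MJPEG over HTTP. Each frame is a complete JPEG, and frames are
# separated by a multipart boundary. The boundary name is illustrative.
BOUNDARY = b"frameboundary"

def mjpeg_headers():
    """HTTP response headers announcing a multipart MJPEG stream."""
    return (b"HTTP/1.0 200 OK\r\n"
            b"Content-Type: multipart/x-mixed-replace; "
            b"boundary=" + BOUNDARY + b"\r\n\r\n")

def mjpeg_part(jpeg_bytes):
    """Wrap one JPEG image as a single part of the stream."""
    return (b"--" + BOUNDARY + b"\r\n"
            b"Content-Type: image/jpeg\r\n"
            b"Content-Length: " + str(len(jpeg_bytes)).encode() + b"\r\n"
            b"\r\n" + jpeg_bytes + b"\r\n")
```

The browser keeps the connection open and simply replaces the displayed image every time a new part arrives, which is why no plugin is needed.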

Continuing with my research I stumbled upon MJPG-streamer, a small open source MJPEG streaming server written in C that I was easily able to compile for the Raspberry Pi.

The following sections describe how I've used this tool to create a very flexible, play-anywhere streaming server for my Raspberry Pi camera.

Installing MJPG-streamer

UPDATE: This section is outdated. Please use the instructions in my updated guide to build and install MJPG-Streamer.

Unfortunately there isn't a package for MJPG-streamer that can be installed with apt-get, so it needs to be compiled from source.

MJPG-streamer is hosted at sourceforge.net, so head over to the project's download page to get the source tarball.

To compile this application I used the following commands:

$ sudo apt-get install libjpeg8-dev
$ sudo apt-get install imagemagick
$ tar xvzf mjpg-streamer-r63.tar.gz
$ cd mjpg-streamer-r63
$ make

This tool requires libjpeg and the convert tool from ImageMagick, so I had to install those as well.

The makefile does not include an install target. If you want to have this utility properly installed, copy the mjpg_streamer binary and its plugins (input_*.so and output_*.so) to a directory that is in the path, such as /usr/local/bin. It is also possible to run the tool directly from the build directory.
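
For example, an install from the build directory could look like this (the destination is just the suggestion above; the makefile itself provides nothing for this):

```shell
# Copy the binary and its plugins to a directory on the PATH
$ sudo cp mjpg_streamer input_*.so output_*.so /usr/local/bin
```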

Setting up the JPEG source stream

The streaming server needs a sequence of JPEG files to stream, and for this we are going to use the raspistill utility that is part of Raspbian. For those concerned about performance, keep in mind that the JPEG encoder used by raspistill runs on the GPU, so the load required to generate JPEGs is pretty small.

To set up a constant stream of JPEG images, the commands are as follows:

$ mkdir /tmp/stream
$ raspistill -w 640 -h 480 -q 5 -o /tmp/stream/pic.jpg -tl 100 -t 9999999 -th 0:0:0 &

Let's go over the arguments to raspistill one by one:

  • -w sets the image width. For an HD stream use 1920 here.
  • -h sets the image height. For an HD stream use 1080 here.
  • -q sets the JPEG quality level, from 0 to 100. I use a pretty low quality, since better quality generates bigger pictures, which reduces the frame rate.
  • -o sets the output filename for the JPEG pictures. I'm sending them to a temp directory. The same file will be rewritten with updated pictures.
  • -tl sets the timelapse interval, in milliseconds. With a value of 100 you get 10 frames per second.
  • -t sets the time the program will run, in milliseconds. I put a large number here; 9999999 milliseconds amounts to almost three hours of run time.
  • -th sets the thumbnail picture options. Since I want the pictures to be as small as possible I disabled the thumbnails by setting everything to zero.
  • & puts the application to run in the background.

Starting the streaming server

Okay, so now we have a background task that is writing JPEGs from the camera at a rate of ten per second. All that is left is to start the streaming server. Assuming you are running it from the build directory, the command is as follows:

$ LD_LIBRARY_PATH=./ ./mjpg_streamer -i "input_file.so -f /tmp/stream -n pic.jpg" -o "output_http.so -w ./www"

Let's break this command down to understand it:

  • LD_LIBRARY_PATH sets the path for dynamic link libraries to the current directory. This is so that the application can find the plugins, which are in the same directory.
  • -i sets the input plugin. We are using a plugin called input_file.so. This plugin watches a directory, and any time it detects that a JPEG file was written there, it streams that file. The directory and file to watch are given with the -f and -n arguments.
  • -o sets the output plugin. We are using the HTTP streaming plugin, which starts a web server that we can connect to in order to watch the video. The root directory of the web server is given with the -w argument. We will use the default web pages that come with the application for now; these can be changed and customized as necessary.

Watching the stream

Now everything is ready. Go to any device that has a web browser and connect to the following address:

http://<IP-address>:8080

Where <IP-address> is the IP address or hostname of your Raspberry Pi.

The default website served by the streaming server provides access to several players for watching the stream. I've found that the "Stream" option worked on most devices I tried. For the few on which "Stream" didn't show video, the "Javascript" option played the video just fine.

I tested playback of the stream from an iPad, an Android smartphone and a variety of web browsers on Windows and OS X, and I was able to play the stream in all of them.
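
Browsers aren't the only possible clients, by the way. If I remember correctly, the HTTP plugin also serves the raw stream at a URL of the form http://<IP-address>:8080/?action=stream, so any code that understands the format can consume it. The sketch below (my own, not part of MJPG-streamer) pulls individual JPEG frames out of a buffer of MJPEG data by scanning for the JPEG start (FF D8) and end (FF D9) markers:

```python
def extract_jpeg_frames(data):
    """Return a list of complete JPEG images found in an MJPEG byte buffer.

    JPEG files start with the bytes FF D8 and end with FF D9; an MJPEG
    stream is just those images back to back, interleaved with multipart
    headers that this scan simply skips over. (A real parser would use
    the multipart boundaries; marker scanning is a simplification.)
    """
    frames = []
    start = data.find(b"\xff\xd8")
    while start != -1:
        end = data.find(b"\xff\xd9", start + 2)
        if end == -1:
            break  # incomplete frame at the end of the buffer
        frames.append(data[start:end + 2])
        start = data.find(b"\xff\xd8", end + 2)
    return frames
```

Feed it whatever you read from the stream socket and you get back ready-to-save JPEG files.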

I hope you find this method useful. Let me know in the comments below if you have a different method of streaming.

Miguel


141 comments
  • #126 JayP said

    My son doesn't use his Kano anymore and I'd like to use it to stream his rabbit cage for him. I bought an 8MP Raspicam, but can't seem to get it to work on the Kano OS. Any suggestions?

  • #127 Miguel Grinberg said

    @JayP: Sorry, I'm not familiar with Kano OS myself, no idea how similar or different it is to Raspbian.

  • #128 Anonymous said

    I want to develop an Android app and receive the Raspberry Pi camera stream in the app. Can you please tell me how to do it?

  • #129 Miguel Grinberg said

    @Anonymous: On the server side you don't have to change anything. On the Android side, just use a WebView pointed at the server URL.

  • #130 Debopam Parua said

    Hello Miguel Grinberg,

    I have been working on an IoT project using a Raspberry Pi. The Pi reads a few sensors through GPIO and uses a Raspicam. I have set up a web interface that shows sensor data on the internet once you log in to a dashboard. However, I am having a problem using the camera. I intend to use the camera for two different purposes: getting a still snap on demand, and showing the live video feed. I solved the still snap portion using FTP, though it takes quite a bit of time to call it real-time. But I have been totally lost about the live streaming part. Like you, I used the MJPEG streamer, and also mmalmotion. These worked well while I was using the service from my home network and the dashboard on my localhost, but the streaming URLs became invalid as soon as I hosted the dashboard on a cloud space. Port forwarding was suggested to me, but my ISP does not give me access to my public IP, and hence port forwarding cannot be done. Do you have any suggestions for streaming the video online? It would really be a lot of help if you could give me a solution.

    Thanks in advance.

  • #131 Miguel Grinberg said

    @Debopam: port forwarding is done inside your home network; it is a function of your internet router and does not require your ISP to even know you are doing it. If you have a public IP address, then you should be able to forward any port on the router to your computer, and that should make it accessible from the outside.

  • #132 Yvan said

    I did the "How to build and run MJPG-Streamer on the Raspberry Pi" tutorial. It works, thank you. Now, I assume this is Linux programming, since it runs on the Pi itself. A Pi also has GPIOs, and I currently only know how to trigger Python code with them. How would one activate the camera/stream with a button? I have one Pi Zero with the camera module which streams, and another Pi (the very first one) with a 3.2" screen; they are connected by a LAN cable and networked. A button is supposed to activate the Zero's camera/stream, as well as play an mp3 on the display Pi, deactivate the screensaver, and launch the browser on the preset <IP-address>:8080.
    Pressing the button again (it is connected to both the Pi Zero's and the display Pi's GPIO) must deactivate the Pi Zero's stream and revert the actions on the display Pi.
    O:) I'm making a closed-system intercom for falling and getting up, but my nose starts to hurt... the communication between Linux and Python confuses me. Can you advise?

  • #133 Miguel Grinberg said

    @Yvan: not sure exactly how you trigger your Python code from a button, but you can start commands from Python using the subprocess module: https://docs.python.org/3/library/subprocess.html. You can use that from Python to run raspistill and mjpeg_streamer.
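
    To make that suggestion concrete, a minimal sketch (the helper names are mine; substitute the real raspistill and mjpg_streamer command lines for the placeholder):

```python
import subprocess

def run_background(cmd):
    """Start a command without blocking; returns a Popen handle."""
    return subprocess.Popen(cmd)

def stop(proc):
    """Terminate a previously started command and wait for it to exit."""
    proc.terminate()
    proc.wait()

# A button-press handler could then do something like:
#   proc = run_background(["raspistill", "-w", "640", "-h", "480", ...])
# and on the second press:
#   stop(proc)
```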

  • #134 Nexia said

    Hi,
    I want to ask is it possible to create a function of a raspberry pi with camera module to send image to a website you made?

  • #135 Miguel Grinberg said

    @Nexia: Yes, that is possible, but you will need your website to have a way to receive files. This can be done in many different ways, for example HTTP file uploads, FTP, etc.

  • #136 Charles said

    I am still very new to Raspberry Pi, but is it possible to stream without using a web browser?
    Like streaming through a local network, like flashshare for Android?

  • #137 Miguel Grinberg said

    @Charles: you don't need a browser. Any HTTP client that supports Motion JPEG can display the video stream.

  • #138 Anuj said

    Hi Miguel. Your library works like magic. Thanks for the creation!
    I have a suggestion: please EDIT this page and provide a link to the updated tutorial (from 2018), or take down this page.
    Why? It's from 2012 and, like many posts from its time, it is highly indexed by search engines, so it is one of the first results when I search for Raspberry Pi webserver camera. But it's now outdated because Raspbian has undergone major changes. It's very frustrating for a newbie.
    Thanks... Peace!

  • #139 Miguel Grinberg said

    @Anuj: nothing in this tutorial is outdated, as far as I know. The newer articles I wrote on video streaming are not a replacement, they are alternatives.

  • #140 SAnjtih said

    tar (child): mjpg-streamer-r63.tar.gz: Cannot open: No such file or directory
    tar (child): Error is not recoverable: exiting now
    tar: Child returned status 2
    tar: Error is not recoverable: exiting now

    How do I solve this?

  • #141 Miguel Grinberg said

    @SAnjtih: did you download the tar file? And if you did, maybe you have it with a different filename?
