DIY Remote Camera using Raspberry Pi, Pi Camera, and Socket.IO

Identifying a camera socket

Now, our server has two different sockets connecting to it. The problem is that the server cannot tell which socket is from the client and which is from the camera. Without knowing which socket belongs to the remote camera, we won’t be able to request a picture from the camera.

So, let’s have the camera emit an event soon after establishing a connection. When we receive that event on the server, we know that the socket that received the event is the socket belonging to the remote camera. Let’s store that socket in a variable called camera.

In the connect event listener of our camera app, add the following line to emit an event called camera.

socket.emit("camera", "online");

In the server app, create a variable called camera outside the socket connection event listener. The connection listener runs once for every new socket, and if we declared this variable inside the listener, every connection would get its own copy of the camera variable. Since we need to access the camera socket from inside the client socket’s event listeners, we need to make sure that this variable lives in a scope shared by all connections.

var camera;

io.on("connection", socket => {
    console.log("Client connected!");

    socket.on("camera", mes => {
        console.log("Camera Online!");
        camera = socket;
    });
});

Then, create an event listener that will listen to the camera event. In the event listener, save the socket that receives the event in the camera variable. Now, we know which one is our camera socket.

Client-Server-Camera communication flow

Let’s modify the take event that we created to send and receive messages between the server and the client. Now, when the client emits a take message, the server should emit an event called takeP to the remote camera. The remote camera should listen to that event and send a response to the server, which should forward that to the client.

socket.on("take", (mes, fn) => {
    console.log("Picture request ", mes);
    if (camera) {
        console.log("Requesting Camera");
        camera.emit("takeP", "snap", response => {
            console.log("Picture Received");
            fn(response);
        });
    }
});

First, we are checking if the camera variable is not null. If it has a value, then that means a camera is online. Then, emit an event on the camera socket and pass a callback function which will be called by the remote camera on receiving a message on that event. Call our client’s callback function inside the callback function we pass to our remote camera.

Next, let’s create an event listener in our camera app.

socket.on("takeP", (mes, fn) => {
    console.log("Received picture request");
    fn("This is your camera speaking");
});

When a message is received on the takeP event, we are calling the callback function of the server and passing a message that reads “This is your camera speaking”. Remember that our server’s callback function will call our client’s callback function and pass the same argument we pass into our server’s callback function.

Now, run all three apps and click the button on our webpage again. 

You should be able to see the message that our camera sent in the browser console. Let’s try to understand the flow once again. Our client emits a take event to the server with a function that should be called by the server. The server, on receiving the take event, emits a takeP event to the camera with a callback function to be called by the camera. The camera receives the takeP event, calls the server’s callback function, and passes a message into it. We call the client’s callback function inside the server’s callback function and pass along the message we received from the camera.

I hope the above diagram helps you understand the sequence of callbacks better.

Sending pictures

We can now emit an event to the server from the client, have the server emit another event to the camera, get a response from the camera, and have the server forward that response to the client. We can use the same flow to request and receive a picture, can’t we? All we need to do is have the camera pass a picture into the callback function instead of a text message.

Get a picture and place it in the root directory of your camera app. To load the image in the Node app, we need the fs module. Since that module ships with Node, importing it is enough.

var fs = require('fs');

To send the image as an argument, we need to encode it in the base64 format. To that end, load the image using the fs module’s readFileSync method, which already returns a Buffer, and use the Buffer’s toString method to convert it to a base64 string as shown below. (There is no need to wrap it in new Buffer(), which is deprecated.)

var data = fs.readFileSync("./photo.jpg").toString("base64");

Pass the data variable as the argument instead of our previous message. Let’s try running our apps again.

In the browser console, you should see a long string of seemingly random characters. Don’t fret, this is our encoded image! We can easily decode it into an image in HTML.

In our client webpage, create an img tag with the id image.

<img id="image" src=""/>

Now, inside the callback function we passed to the emit call, add the following line of code. It displays the image by using the base64 string as a data URI for the img tag’s src attribute.

document.getElementById("image").src = "data:image/jpeg;base64," + data;

Your emit call should now look something like this (assuming the callback’s parameter is named data):

socket.emit("take", "snap", data => {
    console.log(data);
    document.getElementById("image").src = "data:image/jpeg;base64," + data;
});

Save the file and refresh the browser tab. You should now be able to see your picture on the webpage.
