Projects

Virtual Virtual Reality

Pan / Tilt Control with a VR-style Experience

Oliver Higgins

Issue 11, May 2018


Need to view something remotely in real-time, but one angle isn’t enough? Try out this Virtual Reality Pan and Tilt camera, to create a truly immersive experience.

BUILD TIME: 1 hour
DIFFICULTY RATING: intermediate

The idea of virtual reality has been around in a practical form since the 90s, but impractical and expensive equipment stopped it from ever taking off in the mass market. However, as we enter the new age of technology, we have seen the emergence of low-cost virtual reality headsets for use with mobile phones. The apps that are used with these provide an immersive experience, be it a virtual rollercoaster or playback from one of the new breed of 360-degree action cameras. This got us thinking: what if we could take advantage of this hardware and create a “live” stream that would react to the movement of the virtual reality hardware, and move a camera in real-time? This could be achieved through many different means, all of which present different problems we need to overcome.

The finished camera module with 3D printed Raspberry Pi mount.

THE BROAD OVERVIEW

To make this work correctly, our system requires several blocks. Our aim is to use a mobile handset to send its gyroscope position data to a camera mounted on servos, which then move as our phone moves. The camera then relays its video stream back to the handset to show what we want to see.

PROGRAM FLOW

Handset Gyro

All modern phones are different, and cross-operating-system compatibility is often non-existent. However, with the advent of HTML5 in the wake of the browser wars, we can use these standards to retrieve our device’s orientation via Safari or Chrome.

The gyro data is quite complex, so we will not dive into the depths of the equations. As an overview though, we have two main ways we can work with the data: quaternions and Euler angles. Both methods have their advantages, but it depends on what it is that you want to do. In our case, Euler angles deliver the “point in space” data we need. In terms of our project, this gives us the handset’s position in relative space, expressed as a set of coordinates. This means we just need to translate that data into positions for the servos being driven by the Raspberry Pi (RPi).
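As a rough illustration, the page loaded in the handset’s browser can listen for the deviceorientation event and forward the Euler angles to the Pi. This is only a sketch; socket here is assumed to be an already-open websocket back to the server.

window.addEventListener('deviceorientation', function (event) {
  // alpha, beta and gamma are the Euler angles, in degrees
  var euler = { alpha: event.alpha, beta: event.beta, gamma: event.gamma };
  socket.send(JSON.stringify(euler)); // forward the reading to the Raspberry Pi
});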


Web Server

Our web server that reads this data is a Raspberry Pi. The magic itself happens using Node.js and JavaScript. Here we take the data from the handset and convert it into pulse width modulated (PWM) signals for the servos.
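As a hedged sketch of that flow, a websocket server built with the ws package could receive each reading and hand it straight to the servo code. The project’s own server.js is structured differently, but the idea is the same; servos here stands in for the object exported by servo-controls.js.

const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', function (socket) {
  socket.on('message', function (data) {
    // each message is a JSON string of Euler angles from the handset
    servos.onRotation(JSON.parse(data));
  });
});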

Control Pan/Tilt

We then use the RPi’s general purpose input/output (GPIO) ports to control two SG90 servos. This allows us to control the basic pan (left and right) and tilt (up and down).
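In code, driving those two servos looks something like the snippet below, using the pigpio package (the pin numbers match this project; the pulse widths are just an example):

const Gpio = require('pigpio').Gpio;

const pan  = new Gpio(9,  { mode: Gpio.OUTPUT });
const tilt = new Gpio(10, { mode: Gpio.OUTPUT });

// servoWrite() expects a pulse width in microseconds
pan.servoWrite(1500);  // roughly centre
tilt.servoWrite(1500);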

Send Image Back

The RPi has the camera attached. When the web connection is received, the stream is started and sent back via websockets to the web browser on the phone.
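One common way to do this, and roughly what start_feed() takes care of for us, is to spawn the camera process on the Pi and push each chunk of video to every connected websocket. The raspivid options below are assumptions rather than the project’s exact settings:

const { spawn } = require('child_process');

function startFeed(wss) {
  // stream raw H.264 from the Pi camera to stdout
  const camera = spawn('raspivid', ['-t', '0', '-w', '640', '-h', '480', '-fps', '25', '-o', '-']);
  camera.stdout.on('data', function (chunk) {
    wss.clients.forEach(function (client) {
      if (client.readyState === 1) client.send(chunk); // 1 = OPEN
    });
  });
}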


HOW IT WORKS

Servos

“Servo” is short for “servo motor” or “servo mechanism”. It comprises a two-wire DC motor, a gear train, a potentiometer, an integrated circuit, and an output shaft. Three wires exit the motor casing: one for power, one for ground, and one control input line. This makes controlling things much easier, as all we need to do is feed the servo a pulse width modulation (PWM) signal, and the motor will turn to the position we have designated. In this project, we are using two SG90 servo motors. These are very low cost and, as such, make projects like this much more cost-effective. One drawback, however, is that they have plastic gears and a small motor. For the most part this is okay, given that a servo draws current based on the load it is subjected to, but if you mechanically overload them it is easy to burn out the motors or strip the plastic gears. The MG90 is a metal-geared version in a similar package; it has a higher torque rating and would appear to be more responsive than the SG90. The problem is the cost: at the time of print, an MG90 averages around USD6.75 compared to USD2.75 for the SG90, which more than doubles the servo cost on a project like this. If we were to build a permanent version, where the pan and tilt mechanism is built out of aluminium rather than plastic, then the MG90 would be a much better investment.
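To make the PWM relationship concrete, here is a small sketch that maps an angle to a pulse width using the 500 to 2250 microsecond range quoted later in this article (your servos may differ slightly):

function angleToPulseWidth(degrees) {
  const min = 500;   // pulse width at 0 degrees (microseconds)
  const max = 2250;  // pulse width at 180 degrees
  return Math.round(min + (degrees / 180) * (max - min));
}

// angleToPulseWidth(90) returns 1375, close to the servo's centre position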

Node.js

Node.js is a server-side implementation of JavaScript, so the same language can be used on both the handset and the server, and it is well suited to fast, asynchronous programming. It is an open source server framework that runs on all the major operating systems.

Node.js uses a single-threaded, non-blocking, asynchronous programming model, which makes for a much more efficient system. Node.js can generate dynamic page content. It can create, open, read, write, delete, and close files on the server. It can also collect form data, as well as add, delete and modify data in your database. Before it can do any of this, it must first be started on the server. In a system like this, the process sends a file request to the server’s file system and then gets ready to handle the next request. When the file system has opened and read the file, the server returns the content to the client.

Node.js eliminates the waiting and simply continues with the next request.
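A tiny example of that pattern, assuming we are serving a file called index.html:

const fs = require('fs');

fs.readFile('index.html', function (err, data) {
  // this callback runs later, once the file system has finished reading
  if (err) throw err;
  console.log('Read ' + data.length + ' bytes');
});

console.log('Node.js is already free to handle the next request here');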

BUILDING THE CIRCUIT

We actually have two parts of the build to look at:

  • The Raspberry Pi controller
  • The pan and tilt mechanism

Parts Required:              Jaycar     Altronics
1 x Raspberry Pi 3           XC3630     Z6302B
1 x Raspberry Pi Camera      XC9020     Z6302C
1 x 30cm Pi Camera Cable     -          -
2 x SG90 Servo               YM2758     Z6392
Glue                         -          -
3D Printed Parts             -          -

You will also require some M3 screws, and basic tools.

The Circuit

Firstly, the circuit itself is quite simple, containing just two SG90 servo motors and the Raspberry Pi Camera, which connects via its own cable.

It's worthwhile connecting all the electronics to the Raspberry Pi before mounting them into the 3D prints, especially if you're unsure of your connections. We'll connect the servos directly to the GPIO pins; you'll need some female-to-male jumper wires to make the appropriate connections.

These servos have three wires: 5V, GND, and the PWM input. Generally you'll find red, brown, and orange wires; red is 5V, brown is GND, and orange is the PWM input. If your servos don't follow these colours, check the supplier documentation.

Servo 1 will become our PAN servo. Connect it to 5V and GND, and the orange wire to GPIO 9, as shown in the diagram.

Servo 2 will become our TILT servo. Connect it to 5V and GND, and the orange wire to GPIO 10, as shown.

It's worth testing these before mounting the servos into the 3D printed parts, but not essential.


The Pan And Tilt Mechanism

This comprises several different parts. To keep it compact and easy to print, we need to use glue to make it solid. You can press-fit the parts together up to a point, but we would recommend using some type of adhesive. In addition to the two servos, you will also need one cross-control horn and one single arm horn. These are used to screw the servos down to the frame, and should be included with the servos when you purchase them.

HOW TO BUILD THE PAN AND TILT

To make this easier, we have labelled the parts as follows:

A: Base Plate
B: Servo Plate
C: Left Upright
D: Right Upright
E: Tilt Plate
F: Camera Mount

Start with the bottom plate “A” and check it for fit with the servo horns. You can use any horn that suits, but the goal is to get it as flat as possible; we have used the standard cross horn that is included with the servo. There are some small guides and holes if you wish to screw it down, but you will need very short screws to do so. Also, before glueing it down, make sure it sits flat and check that the control horn screw can still fit through from underneath.

Next, take the round piece “B” and the two uprights, “C” and “D”. Make sure they both fit their slots before glueing them in place. Part “E” is the tilt plate, which the camera mounts to. We have created two sets of holes on this: there are guide holes that match the Raspberry Pi camera module directly, and otherwise we have included part “F”, which is a press-on mount for the camera. This is then either screwed or bolted to part “E” in whatever orientation you require.

With this in place, we need to add the servos. Take the first servo and turn it upside down. Take part “B” and push the servo through the large rectangular hole, making sure the servo’s output shaft is now in the centre of the part. Take the second servo and align it to the two uprights.

With this complete, you can take part “A” and push it onto the servo.


Place the screw through the bottom of the part and tighten. We then take the camera mount and place the single arm servo horn on the side with the larger hole. Place this horn onto the second servo, and push a screw through the other side. We found that it was easier to make this hole slightly larger so as to allow movement for the screws you are using.

OPTIONAL ADDITIONS

After a little testing, we found that there are two upgrades that can potentially make this mount better. It all depends on your usage and preferences, so we've detailed them separately.

G: Enlarged Base Plate

We quickly realised that somewhere to mount the Raspberry Pi 3 Model B would be useful. Especially if you're using the standard camera cable, there aren't many places to put it where the cable will still reach. There are four posts that suit the Raspberry Pi and accept M3 screws. The rest of the mount is unchanged from the standard square base plate. Print G instead of A to use it.

H: Camera Retainer

Our 3D printed mount is essentially a friction fit, with the four thin posts holding the camera in place. After a little movement, these can work loose, and we quickly found ourselves with tape holding the camera onto the mount. We're comfortable with tape as a fix, but a retaining plate seemed like a better idea. This plate mounts across the camera, without obstructing the camera's view, using two screws from behind the tilt plate. It is best fitted before you mount the tilt plate onto the servos.

THE SETUP

Setting Up Your Raspberry Pi

Setting up the RPi is reasonably straightforward, but you will need to be familiar with running commands via the terminal.

Software

To set up the server side on our Raspberry Pi you will need to open the command line or terminal and type:

sudo apt-get update

Then, to make sure that we have Node.js set up correctly:

curl -sL https://deb.nodesource.com/setup_9.x | 
sudo -E bash -   
sudo apt-get install nodejs

If you do not already have pigpio installed, please use the following:

sudo apt-get install pigpio

Once this is complete, download the pi-vr-cam.zip file and extract it to your home directory. Change to this directory:

cd pi-vr-cam

Next, we need to install the npm packages. Make sure you are in the project directory:

npm install

Finally, we can start our server:

npm start

Your screen will display something like:

> picam@1.0.0 start /home/pi/pi-vr-cam 
> sudo node index.js
App is running at http://[IP]:8080 
Press CTRL-C to stop

Powering Up

At this stage, you can power up the unit and fire up your code. If you have connected the servos, they will go to a mid-position for adjustment. Once powered on, they will move to the centre of their rotation (90 degrees). From the command line, stop the program with CTRL-C; the servos will stop but stay in place. Undo the screws and adjust the camera mount to a mid-point position; about 45 degrees is correct. With this done, retighten the screws and power your unit back up. Repeat as required.


THE CODE

The code that runs and performs all the functions is located in /lib/server.js and /lib/servo-controls.js.

There are a few key things we need to know in order to make some changes to the system and understand what is happening.

In this first section of servo-controls.js, we need to create and define the two motor objects. These are used to drive the servos via the RPi’s GPIO ports. We start by defining a motor variable for each, creating a new Gpio object tied to the pin it is connected to.

const motorTilt = new Gpio(10, {mode: Gpio.OUTPUT});
const motorPan  = new Gpio(9, {mode: Gpio.OUTPUT});

For this project, a pulse width of 500 will take the servo to the 0-degree position, while 2250 will take it to 180 degrees.

Now we set both a min and max pulse width for both servos based on what our 3D printed casing can support.

this.tilt = {      
  min : 1800, // tilt down limit      
  max : 2300 // tilt up limit
};
this.pan = {
  min : 500, // pan left limit      
  max : 1600 // pan right limit
};

The middle pulse position is calculated from the min and max values.
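For example, taking the midpoint of the limits above gives something like this (a sketch; the exact line in servo-controls.js may differ):

const tiltCenter = Math.round((this.tilt.min + this.tilt.max) / 2); // 2050
const panCenter  = Math.round((this.pan.min + this.pan.max) / 2);   // 1050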

// Start and position servos 
motorTilt.servoWrite(tiltCenter);   
motorPan.servoWrite(panCenter);

Here we have the active code that is run when a new client is connected. On the bottom line is the piece of code required to start the video stream. If you are debugging or experimenting with the code, we would recommend commenting it out, so you can see the display output to the console.

new_client(socket) {
  var self = this;
  console.log('New WebSocket connection');
  // start stream on new connection
  self.start_feed();
}

This is where the gyroscope and velocity data pulled from the handset arrives via the websocket connection. You can see that we can use either the quaternion or the Euler format.

socket.on("message", function(data){ 
  // quaternion object 
{ x: value, y: value, z: value, w: value }; 
  // euler object 
{ alpha: value, beta: value, gamma: value };       
  servos.onRotation(JSON.parse(data));

Our final section shows how we activate the servos themselves. In the tilt case, we calculate the beta percentage and the tilt value based on our min and max values, then adjust the motor.

tiltCamera(beta) {
  let tiltPercentage = beta / 180;
  let tilt = (this.tilt.range * tiltPercentage ) + this.tilt.min;    
  tilt = Math.ceil(tilt); // round up
  // do not allow tilt value to go outside of servo limits
  if (tilt < this.tilt.min) tilt = this.tilt.min;
  if (tilt > this.tilt.max) tilt = this.tilt.max;
    
  motorTilt.servoWrite(tilt);
}

WHERE TO FROM HERE?

One question that came up while designing this, was “why didn’t we use Bluetooth?” There are two reasons for this. Firstly, it becomes operating-system-dependent and we would need to create a “fat” client app for both Android and iOS (anybody still using Blackberry? I’m pretty sure there is one of you out there!). Secondly, we would be confined to being in range of the device itself.

By using “cloud” technology, the unit can be located anywhere in the world while we can have the headset anywhere else in the world. All of a sudden, the project becomes so much more interesting!

This unit presents a fascinating proof of concept in creating an immersive virtual reality. In fact, these concepts are translatable into more than just VR; through Node.js we can easily send commands and data to and from our microcontrollers, and we can also reuse the code provided, to get gyroscopic data from mobile handsets using browser technology.