An Attempt At Bullet Time

My (first) attempt at creating the bullet time effect


Like many people, I’ve always had a fascination with Bullet Time (also known as Time Slice) ever since I first saw its use in The Matrix.

If you're not already familiar with Bullet Time, check out this video. It is essentially a visual effect that allows a scene to be captured in extreme slow motion whilst the camera moves around a subject.

The Matrix films certainly popularised the effect, but the technique involved is in fact much older, dating back to 19th century experiments by Eadweard Muybridge, who famously analysed the motion of a galloping horse.

Muybridge's Horse In Motion

How is it done?

The traditional way of achieving the effect is to set up a line of cameras, trigger them at the same time or one by one in rapid succession, and then combine the resulting images into a moving sequence. The rig constructed for the original Matrix film used over a hundred computer-controlled cameras. The huge cost and setup time for such a project means that use of the effect is, not surprisingly, quite uncommon.

Part of the rig used for The Matrix

There are certainly other methods that can be used, and many people have come up with creative ways of replicating the effect.

When I found out that my university had 30 identical DSLR cameras, all with matching lenses, I knew that I could not miss such an opportunity, and I began to look into the technical components needed to build my own bullet time rig.

Triggering

I had used the gPhoto library in a previous project to trigger a single camera over USB and I was hoping that I might be able to use the same method to trigger multiple cameras. I managed to get two cameras to trigger at exactly the same time, but scaling up to more than about five cameras caused large and unpredictable delays, so it was clear that USB was a no-go (for triggering at least).
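For reference, triggering a single camera through libgphoto2 (the C library behind gPhoto) looks roughly like the sketch below. The exact calls I used at the time may have differed, and error handling is trimmed, but this is the general idea of the USB approach that did not scale.

```cpp
#include <gphoto2/gphoto2.h>
#include <cstdio>

int main() {
    GPContext *context = gp_context_new();
    Camera *camera = nullptr;
    gp_camera_new(&camera);

    // Connects to the first camera that libgphoto2 finds on USB.
    if (gp_camera_init(camera, context) < GP_OK) {
        std::fprintf(stderr, "No camera detected\n");
        return 1;
    }

    // Fire the shutter. With one or two cameras this is quick enough, but
    // repeated across many cameras the delays became large and unpredictable.
    CameraFilePath path;
    gp_camera_capture(camera, GP_CAPTURE_IMAGE, &path, context);
    std::printf("Captured %s/%s\n", path.folder, path.name);

    gp_camera_exit(camera, context);
    gp_camera_unref(camera);
    return 0;
}
```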

The more accurate way of triggering a DSLR is to use the remote shutter port. On the Canon EOS 700Ds that I was using, this connection takes the form of a 2.5mm jack plug, and it can be used to activate both the shutter release and the autofocus.

2.5mm Jack plug

Connecting either the autofocus or shutter release connections to ground causes the camera to trigger the respective function, with very little delay.

Using a passive system to trigger multiple cameras at the same time is quite simple. You can just wire all of the cameras in parallel to a single switch, as shown below. The diodes prevent the current from flowing in the wrong direction.

Passive circuit for triggering cameras

I wanted, however, to be able to trigger the cameras sequentially, which meant that each camera had to be controlled independently. This can be done by connecting each camera to an opto-isolator, which is essentially a switch that closes when a voltage is applied. I used a chain of shift registers to control the opto-isolators, and an Arduino to control the shift registers (check out this Arduino tutorial on shift registers).

Block diagram for an active trigger system
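To give an idea of how the Arduino side works, here is a rough sketch that shifts a single "on" bit down the register chain to fire the cameras one after another. The pin assignments, timings, board count and 74HC595-style registers are illustrative assumptions rather than the actual Optoboard firmware.

```cpp
const int LATCH_PIN  = 8;    // latch (storage register clock)
const int CLOCK_PIN  = 12;   // shift register clock
const int DATA_PIN   = 11;   // serial data into the first board
const int NUM_BOARDS = 4;    // each board drives 8 opto-isolators
const int PULSE_MS   = 10;   // how long each shutter line is held closed
const int GAP_MS     = 40;   // delay between consecutive cameras

// Shift one byte per board out to the whole daisy chain, then latch.
void writeChain(const uint8_t pattern[]) {
  digitalWrite(LATCH_PIN, LOW);
  for (int b = NUM_BOARDS - 1; b >= 0; b--) {   // furthest board first
    shiftOut(DATA_PIN, CLOCK_PIN, MSBFIRST, pattern[b]);
  }
  digitalWrite(LATCH_PIN, HIGH);                // outputs update here
}

// Walk a single "on" bit down the chain to fire the cameras one by one.
void triggerSequentially() {
  uint8_t pattern[NUM_BOARDS];
  for (int cam = 0; cam < NUM_BOARDS * 8; cam++) {
    for (int b = 0; b < NUM_BOARDS; b++) pattern[b] = 0;
    pattern[cam / 8] = 1 << (cam % 8);          // close this camera's shutter line
    writeChain(pattern);
    delay(PULSE_MS);
    for (int b = 0; b < NUM_BOARDS; b++) pattern[b] = 0;
    writeChain(pattern);                        // release it again
    delay(GAP_MS);
  }
}

void setup() {
  pinMode(LATCH_PIN, OUTPUT);
  pinMode(CLOCK_PIN, OUTPUT);
  pinMode(DATA_PIN, OUTPUT);
}

void loop() {
  // In practice the sequence was started by a command from the host
  // software over serial rather than running freely like this.
  triggerSequentially();
  while (true) {}                               // run once
}
```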

With the number of cameras that I planned on using, there were far too many components and connections to build the circuit on a breadboard (or even a few breadboards!). I designed a circuit board using Eagle, and had ten of them manufactured by Itead Studio.

Optoboard Schematic

I was completely new to this process, but didn’t encounter any issues and I was surprised at how cheap the manufacturing was. Of course, I still had to solder the components to the boards, plus all of the jack connectors to some lengths of speaker wire (thanks to my friend Aron for the help!). Each board can control eight cameras, and the boards can be daisy-chained to add more channels.

Daisy chaining the boards
The assembled Optoboard

Software

The triggering mechanism was sorted, but I still wanted to be able to control camera settings remotely, as well as download images from the cameras automatically.

I began by trying to get this functionality working using gPhoto, but I found the library and its documentation quite difficult to understand.

The other option was to use Canon’s own library - EDSDK. I was a bit put off at first, as Canon requires you to apply for a developer account before you can access the framework and documentation. The application form asks you to explain who you are and why you want a developer account, which I did, and it got approved a couple of days later.

Canon’s documentation is fairly good, and I was able to get some basic functionality working quite quickly. EDSDK is written in C, and I was writing my application in Objective-C, so I found myself writing a lot of wrapper code around the library's API. This code eventually evolved into EOSFramework - an open source project suitable for any Mac developer who wants to interface with Canon cameras.
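To give a flavour of what EOSFramework wraps, here is a minimal sketch that talks to EDSDK's C API directly, listing the connected cameras and firing each one over USB. It is based on the standard EDSDK calls; real code also needs error checking and the SDK's event handling, which are omitted here.

```cpp
#include "EDSDK.h"
#include <cstdio>

int main() {
    EdsInitializeSDK();

    EdsCameraListRef list = nullptr;
    EdsUInt32 count = 0;
    EdsGetCameraList(&list);
    EdsGetChildCount(list, &count);
    std::printf("%u camera(s) connected\n", (unsigned)count);

    for (EdsUInt32 i = 0; i < count; i++) {
        EdsCameraRef camera = nullptr;
        EdsGetChildAtIndex(list, i, (EdsBaseRef *)&camera);
        EdsOpenSession(camera);

        // Take a picture over USB - too slow for synchronised triggering,
        // but fine for previews and settings work.
        EdsSendCommand(camera, kEdsCameraCommand_TakePicture, 0);

        EdsCloseSession(camera);
        EdsRelease(camera);
    }

    EdsRelease(list);
    EdsTerminateSDK();
    return 0;
}
```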

Software Architecture

The final software that I created could display information for all of the connected cameras, control important settings (exposure, white balance, image quality, etc.) and duplicate these settings across all of the cameras. It could also communicate with the Arduino to control the trigger system and, once triggered, automatically download the resulting images, naming them appropriately based on their position.

EOS Multi Camera Software
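The settings duplication essentially boils down to reading a property from one "master" camera and writing the same value to all of the others. The sketch below shows the idea using EDSDK's C API directly (my actual code went through EOSFramework's Objective-C wrappers); sessions are assumed to be open already and error handling is omitted.

```cpp
#include "EDSDK.h"
#include <vector>

// Copy a single numeric property from the master camera to the others.
void copyProperty(EdsCameraRef master,
                  const std::vector<EdsCameraRef> &others,
                  EdsPropertyID prop) {
    EdsUInt32 value = 0;
    EdsGetPropertyData(master, prop, 0, sizeof(value), &value);
    for (EdsCameraRef cam : others) {
        EdsSetPropertyData(cam, prop, 0, sizeof(value), &value);
    }
}

// Duplicate the settings that matter for matching images across the rig.
void duplicateSettings(EdsCameraRef master,
                       const std::vector<EdsCameraRef> &others) {
    copyProperty(master, others, kEdsPropID_Tv);           // shutter speed
    copyProperty(master, others, kEdsPropID_Av);           // aperture
    copyProperty(master, others, kEdsPropID_ISOSpeed);     // ISO
    copyProperty(master, others, kEdsPropID_WhiteBalance); // white balance
    copyProperty(master, others, kEdsPropID_ImageQuality); // image quality
}
```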

Planks

Many of the bullet time rigs that I’d seen online used a custom steel structure, with lots of finely adjusted tripod heads to support the cameras. Unfortunately, this was a little bit beyond my budget, so my solution was to use some planks of wood, and screw the cameras directly onto them. Each plank would support five cameras, and would be held up by two tripod stands.

I planned to lay out the planks along a curve, and used a bit of geometry to work out where exactly to drill the holes.

Layout of the planks

I wanted to leave my options open, so I actually ended up drilling holes for three possible paths: a 122° arc with a radius of 3.4m, a 150° arc with a radius of 4.5m, and finally a straight path. I decided on a fixed spacing of 25cm between adjacent cameras.
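The geometry itself is simple: a fixed spacing s along an arc of radius r corresponds to an angular step of s/r radians between cameras. The sketch below works out each camera's position and aim angle for the tighter arc; the numbers are illustrative, since the measurements were transferred onto the planks by hand.

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const double PI      = 3.14159265358979;
    const double radius  = 3.4;    // metres (the tighter of the two arcs)
    const double spacing = 0.25;   // metres between cameras, measured along the arc
    const int    cameras = 30;

    // A fixed spacing along an arc is a fixed angular step: theta = s / r.
    const double step = spacing / radius;
    std::printf("Arc covered: %.1f degrees\n",
                step * (cameras - 1) * 180.0 / PI);

    for (int i = 0; i < cameras; i++) {
        // Centre the arc on zero so the rig is symmetric about the subject.
        double theta = (i - (cameras - 1) / 2.0) * step;
        double x = radius * std::sin(theta);   // position across the room
        double y = radius * std::cos(theta);   // distance out from the subject
        std::printf("Camera %2d: x = %6.3f m, y = %6.3f m, aim = %6.1f deg\n",
                    i + 1, x, y, theta * 180.0 / PI);
    }
    return 0;
}
```

Thirty cameras at 25cm spacing on a 3.4m radius cover just over 122°, which matches the first arc above.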

Building The Rig

The time came to actually construct the rig. I was only able to get the cameras for a couple of days (apparently they thought it was unfair for one student to take all of the photography cameras!). This meant that it was the first time I had actually tested all thirty cameras with the systems I had developed, and unfortunately, as you will find out, I encountered a few unforeseen problems.

30 700Ds
Several Hours Later...

Everything took much longer than I anticipated - just setting up and aligning the wooden planks took almost a whole day. It turned out that the boards were a bit warped, and only being able to pan the cameras left and right was not enough to align them properly. Fortunately, since the cameras were capturing at a resolution of 5.5K, I was able to stabilise the footage later whilst still retaining a good output resolution.

Camera Alignment
More Camera Alignment

With the triggering system, I had assumed that, since the cameras were operating in manual mode, I would not need to trigger the auto-focus. It turns out that triggering the auto-focus seems to prepare the cameras for taking a picture, even when in manual mode.

I discovered that if I did not prepare the cameras in this way first, the shutter release circuits had to be kept closed for anywhere between 50 and 80 ms in order to trigger the camera. Preparing the cameras using the auto-focus reduced this delay to less than 5 ms.

The system I had designed was not capable of triggering the auto-focus - I had not even bothered to solder the connection on the jack plugs! This meant I had to trigger the auto-focus over USB, which was slow and unreliable.
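In hindsight, each camera's firing sequence should have been handled entirely over the remote shutter port, something like the sketch below: close the auto-focus line to prepare the camera, then a short pulse on the shutter line is enough. The pin numbers and exact delays here are assumptions based on the behaviour described above.

```cpp
// Illustrative Arduino-style firing sequence for a single camera.
const int AF_PIN      = 2;   // drives the opto-isolator on the auto-focus line
const int SHUTTER_PIN = 3;   // drives the opto-isolator on the shutter line

void setup() {
  pinMode(AF_PIN, OUTPUT);
  pinMode(SHUTTER_PIN, OUTPUT);
}

void fireCamera() {
  digitalWrite(AF_PIN, HIGH);       // "half-press" prepares the camera
  delay(100);                       // give it a moment to wake up (assumed value)
  digitalWrite(SHUTTER_PIN, HIGH);
  delay(5);                         // ~5 ms is enough once the camera is prepared
  digitalWrite(SHUTTER_PIN, LOW);
  digitalWrite(AF_PIN, LOW);
}

void loop() {
  fireCamera();
  delay(5000);                      // fire every few seconds for testing
}
```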

Trigger System

Speaking of USB, I had bought the cheapest ten-port USB hubs that I could find - about £3 each! I figured that, since the cameras don’t actually draw any power from USB, they would be OK. Not surprisingly, they didn’t work when daisy-chained, and my laptop only had two USB ports (I needed three, plus one for the Arduino).

USB Hub

I had previously managed to control twenty cameras from my five-year-old laptop, so I assumed that my brand-new desktop PC with six USB3 ports would have no problem managing all thirty cameras.

Nope.

It could barely handle eight cameras before the keyboard and mouse would stop functioning. I ended up having to use two laptops, each handling half of the cameras. This meant that I didn’t get to properly preview the sequences that were being captured, and I didn’t realise at the time that there was a lot of motion blur in the images, caused by the shutter speed not being set fast enough.

Mission Control

With all of the problems, there wasn’t actually much time left to film anything! I managed to capture a few sequences - I had two cameras on either end of the rig capturing video as well.

The Completed Setup
Action!

You can see one of the results below, but generally, I didn’t find the footage to be very usable, mainly due to the amount of motion blur. Despite this, I still had a lot of fun working on this project, and I gained a lot of knowledge along the way.

Advice

If you are mad enough to attempt such a project, here are some lessons that I learned during my experience:

Don’t rely on USB for anything time sensitive. Use the remote shutter port for triggering both the shutter release and auto-focus.

Don’t connect all of the cameras via USB to one computer. I would recommend connecting a maximum of 16 cameras per computer, and then networking the computers together so that they can be managed from a single point. Some cameras can now be controlled over Ethernet or Wi-Fi, which is a more appropriate solution for dealing with long distances and addressing multiple devices.

Build a proper support structure. Using a piece of curved scaffold would be a lot quicker and easier to set up - and it won’t warp! Ideally, each camera needs its own tripod head to allow for adjustment - otherwise, expect to have to do a lot of stabilisation in post. Make sure you capture a sequence with nothing but a target or object at the focal point. Then, if you do need to do some stabilisation, you can use that reference to stabilise any other sequences, as long as the cameras remain in the same position.

Power the cameras using AC. I was relying on batteries, and found myself having to change them continually, which inevitably meant that the cameras got moved slightly and therefore had to be refocussed. Using an external power source is a much better solution (but a more expensive one).

Thanks to the team!