We Didn’t Start the Fire(Wire)

I have written about this before: at my current job, my boss and I are rewriting our robotics control software in Swift. There is an excellent blog post here that explains why we are doing this.

This is the camera setup we have on our robotics systems. Cameras help with dispenser positioning and we support both video and image capture for our users.

We have open sourced a few components after implementing them in this rewrite. This blog post details the most recent one: a wrapper class that allows us to connect to an external camera conforming to the IIDC standard. The project can be found here.

Cameras are an important feature in our robotics systems. Our users rely on them to help position their dispensers and to capture media. Videos and images of the dispensing process have been used in scientific papers and documentation, so continuing to support this functionality is vitally important.

What is the Goal?

Back when the code was initially written in 2007, AV Foundation and GPUImage did not exist. There was no easy way to hook up an external camera to an application. Additionally, the standard for rapid data transfer at the time was FireWire.

The lack of easy solutions meant that our code was overly complex. There were much simpler ways to connect to a camera and run the video through a filter that we simply couldn't adopt, because our code touched too many other things. We set out to simplify the code in our rewrite.

One major goal of this project was to make it easier to add additional cameras while still supporting the legacy cameras out in the field.

Since this company has been around for over a decade, we have legacy hardware out in the field that we still need to support. Currently we have three different kinds of cameras in the field associated with our robotics systems: Unibrain, Point Grey Flea2, and Point Grey BlackFly. At some point in the next year or so we will need to support a fourth camera, because our current camera, the BlackFly, has been discontinued.

What is IIDC 1394?

IEEE 1394 is a serial bus standard for high-speed, real-time data transfer. USB is another serial bus standard that became more widely adopted, in part because IEEE 1394, aka FireWire, was developed by Apple and carried licensing costs.

Our first camera type, the Unibrain camera

Even though FireWire ports are no longer available on Macs being sold today, there are still many cameras that conform to the IIDC standard. Our current Point Grey BlackFly cameras have a USB 3 plug, but they still speak the same protocol.

IIDC is the FireWire data format for live video. In order to interface with an IIDC-compliant camera, we have to conform to that standard.

There is a library for interfacing with IEEE 1394 cameras: libdc1394. We have integrated that library into our project and adapted it so that we can communicate with our cameras. This library's functionality is what we wrap in our GPUImageIIDCCamera class.
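
To give a sense of what we are wrapping, here is a minimal sketch of the libdc1394 handshake for connecting to a camera. It is simplified from what GPUImageIIDCCamera actually does, and the function name and error handling are my own:

    // A minimal sketch of connecting to the first available IIDC camera
    // with libdc1394. Camera selection and error handling are simplified.
    #import <Foundation/Foundation.h>
    #include <dc1394/dc1394.h>

    dc1394camera_t *connectToFirstCamera(dc1394_t **outContext)
    {
        dc1394_t *context = dc1394_new();
        if (context == NULL) return NULL;

        dc1394camera_list_t *cameraList = NULL;
        if (dc1394_camera_enumerate(context, &cameraList) != DC1394_SUCCESS)
        {
            dc1394_free(context);
            return NULL;
        }
        if (cameraList->num == 0)
        {
            dc1394_camera_free_list(cameraList);
            dc1394_free(context);
            return NULL;
        }

        // Each camera is identified by a 64-bit GUID burned into the hardware.
        dc1394camera_t *camera = dc1394_camera_new(context, cameraList->ids[0].guid);
        dc1394_camera_free_list(cameraList);

        *outContext = context;
        return camera;
    }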

We did not integrate the GPUImageIIDCCamera class into the primary GPUImage framework. The libdc1394 library is LGPL-licensed, which is less permissive than GPUImage's BSD-style license, so for legal reasons the class could not be merged into GPUImage proper and must remain a separate entity.

Objective-C? Why Not Swift?

Rewriting a legacy piece of software that integrates with hardware is something of a challenge. Since Objective-C is a superset of C, a lot of low-level C programming was easily integrated into the previous iteration of the control software; that same code now presents some challenges when we attempt to implement it in Swift.

One such challenge was figuring out how to interact with our hardware. Before attempting to connect to and control our camera, we had to determine how to talk to our microcontroller. We were able to do this within the current constraints of Swift, but there is one feature of the C language that Swift does not yet support: mutable function pointers.
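
For illustration, here is the general shape of the pattern that kept us in Objective-C. Everything here is hypothetical; these names are not our actual microcontroller API, just the kind of C function pointer plumbing involved:

    // A hypothetical sketch of the C callback pattern our hardware code
    // relies on. The names are invented for illustration.
    #import <Foundation/Foundation.h>

    typedef void (*DataReceivedCallback)(const uint8_t *bytes, size_t length);

    // A mutable function pointer: the active handler can be swapped at runtime.
    static DataReceivedCallback currentCallback = NULL;

    static void handleIncomingBytes(const uint8_t *bytes, size_t length)
    {
        // Hand the raw bytes over to Objective-C.
        NSData *data = [NSData dataWithBytes:bytes length:length];
        NSLog(@"Received %lu bytes from the microcontroller", (unsigned long)data.length);
    }

    static void registerCallback(DataReceivedCallback callback)
    {
        currentCallback = callback;
    }

    // Elsewhere in the code:
    //     registerCallback(handleIncomingBytes);
    // At the time, a Swift function could not be handed across a boundary
    // like this, which is why the wrapper had to be written in Objective-C.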

Since this was an integral part of our process, it was necessary to write this class in Objective-C. This, for the record, is the first time in our six-month process that we encountered a problem we could not solve in Swift. It didn't prevent us from implementing the feature; it simply meant we had to finagle a few things to fully integrate the Objective-C class into our control software.

What Do We Need the Code to Do?

There are several things we needed this class to accomplish (a rough interface sketch follows the list):

  • Connect to the camera
  • Capture frames
  • Set up the proper video format for the camera type
  • Remap the YUV colorspace to RGB colorspace
  • Get and set camera settings for things like brightness and saturation
  • Handle camera disconnection
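
Here is a rough sketch of what the public interface of such a wrapper might look like. The class and method names are illustrative, not the actual GPUImageIIDCCamera API:

    // An illustrative sketch of a wrapper interface covering the
    // responsibilities listed above. Names are hypothetical.
    #import <Foundation/Foundation.h>
    #include <dc1394/dc1394.h>

    @interface IIDCCameraSketch : NSObject

    // Camera settings mirrored onto the physical hardware via accessors.
    @property (nonatomic, assign) NSInteger brightness;
    @property (nonatomic, assign) NSInteger saturation;

    // Connection lifecycle.
    - (BOOL)connectToCameraWithError:(NSError **)error;  // Connect to the camera
    - (void)startCapturingFrames;                        // Begin pulling frames
    - (void)handleCameraDisconnection;                   // Clean up if unplugged

    // Video format configuration for the camera type (Format 7 or fixed mode).
    - (BOOL)configureVideoMode:(dc1394video_mode_t)mode;

    @end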

Challenges

One of my personal challenges was simply understanding the code. Since much of our functionality would be done differently in the new code, I couldn't just port it over from the old version of the software. It was important to get a sense of how to wrap the IIDC functionality in such a way that it would be easy to add new cameras to our process. It was also important to figure out what lifting would be done by GPUImage and what would be done by the IIDC camera class.

Our current camera, the Point Grey BlackFly

Additionally, Brad did some extra work on our version of libdc1394, and his changes had not been documented. That meant I couldn't rely on what little general documentation there was for the code.

Initially we thought that we would not need any OpenGL to process the video frames. We later determined that a shader would be necessary for finding the frame size. This was beyond my present OpenGL experience, so Brad wrote the necessary shader.
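
As an aside, the YUV-to-RGB remap from our requirements list boils down to standard per-pixel arithmetic, whether it runs on the CPU or in a shader. Here is a minimal CPU-side sketch of the math (GPUImage performs the equivalent work on the GPU):

    // Minimal CPU-side sketch of the standard YUV422 (UYVY) to RGB remap.
    // This is just the arithmetic written out for clarity; in practice the
    // per-pixel conversion happens in a fragment shader.
    #include <stdint.h>

    static uint8_t clampToByte(float value)
    {
        return (uint8_t)(value < 0.0f ? 0.0f : (value > 255.0f ? 255.0f : value));
    }

    // Converts one 4-byte UYVY macropixel into two 3-byte RGB pixels.
    void convertUYVYMacropixelToRGB(const uint8_t uyvy[4], uint8_t rgb[6])
    {
        float u = uyvy[0] - 128.0f; // Shared chroma for both pixels
        float v = uyvy[2] - 128.0f;

        for (int i = 0; i < 2; i++)
        {
            float y = uyvy[1 + 2 * i]; // Y0, then Y1
            rgb[3 * i + 0] = clampToByte(y + 1.402f * v);
            rgb[3 * i + 1] = clampToByte(y - 0.344f * u - 0.714f * v);
            rgb[3 * i + 2] = clampToByte(y + 1.772f * u);
        }
    }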

We also had to deal with different video modes. There are about thirty video modes we have access to, but all of them boil down to one of two types: Format 7 or everything else.

Format 7 allows you to set the frame size and the colorspace. All of the other video modes specify those things in their mode name.

Point Grey Flea2 camera mounted on our Desktop system

Not all cameras support Format 7. Our first camera, the Unibrain, does not. So we needed to make sure we could connect to and use both Format 7 and non-Format 7 cameras.
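
In libdc1394 terms, the two paths look roughly like this sketch. The particular mode, frame size, and color coding here are illustrative choices, not our shipping configuration:

    // Simplified sketch of configuring Format 7 versus a fixed video mode
    // with libdc1394. Our real code negotiates the mode per camera model.
    #import <Foundation/Foundation.h>
    #include <dc1394/dc1394.h>

    dc1394error_t configureVideoMode(dc1394camera_t *camera, BOOL supportsFormat7)
    {
        if (supportsFormat7)
        {
            // Format 7: we choose the frame size and color coding ourselves.
            dc1394_video_set_mode(camera, DC1394_VIDEO_MODE_FORMAT7_0);
            return dc1394_format7_set_roi(camera,
                                          DC1394_VIDEO_MODE_FORMAT7_0,
                                          DC1394_COLOR_CODING_YUV422,
                                          DC1394_USE_MAX_AVAIL, // packet size
                                          0, 0,                 // top-left corner
                                          640, 480);            // width, height
        }
        else
        {
            // Fixed mode: frame size and colorspace are baked into the mode name.
            return dc1394_video_set_mode(camera, DC1394_VIDEO_MODE_640x480_YUV422);
        }
    }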

We also had to deal with the fact that we were talking to a piece of hardware. The video mode settings, along with brightness, saturation, and others, are all set on the physical piece of hardware. We can communicate with the hardware using C functions, but the point of wrapping this class is to avoid having to touch the messy underlying C library.

Each camera property we can set has overridden getters and setters. We override them to make sure the camera and the application agree about what each expects the settings to be. When you drop this class into another application, it works like any other class for the programmer, with all the nasty bits tucked away in accessor methods.
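
A simplified sketch of that accessor pattern, with a hypothetical class name and just the brightness feature (the real class covers many more settings and adds error handling):

    // Sketch of overridden accessors that keep the application and the
    // physical camera in sync. Names are illustrative.
    #import <Foundation/Foundation.h>
    #include <dc1394/dc1394.h>

    @interface CameraSettingsSketch : NSObject
    @property (nonatomic, assign) NSInteger brightness;
    @end

    @implementation CameraSettingsSketch
    {
        dc1394camera_t *_camera;  // Handle to the physical camera, assumed connected
        NSInteger _brightness;    // Last value we asked the hardware to use
    }

    - (void)setBrightness:(NSInteger)brightness
    {
        // Push the new value down to the hardware first...
        dc1394_feature_set_value(_camera, DC1394_FEATURE_BRIGHTNESS, (uint32_t)brightness);
        // ...then record what we asked for.
        _brightness = brightness;
    }

    - (NSInteger)brightness
    {
        // Read back from the hardware so the app never drifts out of sync.
        uint32_t currentValue = 0;
        dc1394_feature_get_value(_camera, DC1394_FEATURE_BRIGHTNESS, &currentValue);
        return (NSInteger)currentValue;
    }
    @end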

Final Thoughts

When I worked on figuring out libxml2 at the beginning of the year I thought that was the hardest thing I would work on. That was just a warm-up for this project.

This was a huge challenge for me personally. I think trying to figure this out has been the hardest thing I have done in my career so far. On top of that, not having worked with Cocoa since 2014 made getting back into the swing of Cocoa development a bit of a challenge.

I hope that as I progress in my career, it becomes easier for me to pivot between low-level and high-level development. I wish I could have done this entire thing by myself, but I understand that we have deadlines to meet. I am proud of how much I was able to do here and of the growth I have experienced as a programmer by pushing myself to work on something this difficult.