Lexical or Preprocessor Issue

So, today was the day I decided to bite the bullet and start working on my Metal demo for CocoaConf Columbus and 360|iDev.

Since a large focus of my talk is on GPUImage, I am hoping to put together a lightweight Metal version of GPUImage that processes an image using a series of filters. I want to write between three and five filters that stack easily on one another and each have a GPUImage counterpart, so I can test how fast Metal processes images compared to GPUImage.

I went to look at what sample code is available from Apple for Metal. To my delight, I saw that there was an image processing base project. It includes a single, hardcoded filter that converts an image to black and white. I should be able to go into this project, add my filters, and add some UI elements that let me apply the filter shaders I write.

Today I opened the sample code. Immediately, there was an error.

“Lexical or Preprocessor Issue: QuartzCore/CAMetalLayer.h not found.”

This is why we can’t have nice things!!

Huh. That is inconvenient.

Did some digging. Refrained from asking this question on Stack Overflow because the last time I asked a question about the betas I got a snide person telling me to go somewhere else. Headed to the Dev Forums and found this thread.

Apparently, for the time being, there is no support for Metal in the simulator. There should be support for Metal if you have an A7 device like the iPhone 5S (which I have) that is running the iOS 8 beta.
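
From what I can tell, the sane defensive move is to check for Metal at runtime instead of assuming it is there. A minimal sketch in Swift, assuming the current API names rather than anything out of Apple’s sample project:

```swift
import Metal

// Minimal sketch: MTLCreateSystemDefaultDevice() returns nil where Metal
// is unavailable, e.g. in the simulator or on hardware without an A7-class GPU.
if let device = MTLCreateSystemDefaultDevice() {
    print("Metal device found: \(device.name); use the Metal path")
} else {
    print("No Metal device; fall back to the OpenGL ES / GPUImage path")
}
```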

I have not yet updated my phone to the beta. I know we are getting close to the point where it will be released, so it isn’t a huge thing to update to the beta, but I feel like I have no guarantee that things will work properly on my phone even after I do.

I must say that this latest wrinkle is not doing anything to sell me on Metal.

Metal only works on iOS devices with an A7 chip, and now, on top of that, it won’t even work in the simulator. I usually use the simulator in my talks to demonstrate what I am doing, but now I have to get everything running on my device. I think I can use AirPlay to show what the screen looks like, but that is one more step in my process that can go wrong.

The other thing I am noticing in the sample applications is that most of the class implementation files end in “.mm”, which means they are explicitly telling the compiler that there is going to be C++ code in them.

I have not worked with Swift as much as I should have, but I am wondering if this is going to be a problem when trying to write an app in Swift. I know that theoretically Swift is supposed to behave like Objective-C in that you can include C and C++ code, but I have not tried to write straight C code in a Swift class yet. Can you write C code in a Swift class, or is the support just that I can import a C class into a Swift-based project? How is this going to work with Metal?
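
From what I understand so far (and this is just me sketching, not gospel): you can’t write C syntax inside a .swift file itself, but C functions whose headers are visible to Swift, either through a system module or a bridging header, can be called like ordinary Swift functions. Something along these lines:

```swift
import Darwin  // the C standard library comes in as a Swift module

// C functions import as plain Swift calls once their headers are visible.
let angle: Float = 0.5
let result = cosf(angle)  // cosf() from <math.h>
print("cosf(\(angle)) = \(result)")

// A hypothetical C function declared in a bridging header, say
//   float apply_brightness(float value, float amount);
// would be callable the same way:
//   let brightened = apply_brightness(0.5, 0.2)
```

Whether any of that extends cleanly to the C++ hiding in those .mm files is exactly the part I still need to figure out.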

At least with OpenGL ES you have the GLKit framework, which should work with Swift. I am interested to know more about this, but sadly I don’t believe I will be able to explore these issues before I give my talk in Columbus.

I am also trying to figure out just how much C++ I need to know to fully work with Metal. I thought I would need about as much C++ as you need C to work with GLSL, but after seeing how many of the classes are implemented in C++, I am slightly worried that I am going to be out of my depth for a while.
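
For what it is worth, the shader side of Metal does not look too scary on its own. Here is a rough sketch of a black-and-white compute kernel in the C++-based Metal shading language; this is my own guess at the shape of such a filter, not code from Apple’s sample:

```metal
#include <metal_stdlib>
using namespace metal;

// Sketch of a luminance (black-and-white) compute kernel.
kernel void grayscale(texture2d<float, access::read>  inTexture  [[texture(0)]],
                      texture2d<float, access::write> outTexture [[texture(1)]],
                      uint2 gid [[thread_position_in_grid]])
{
    float4 color = inTexture.read(gid);
    // Rec. 709 luma weights
    float luminance = dot(color.rgb, float3(0.2126, 0.7152, 0.0722));
    outTexture.write(float4(float3(luminance), color.a), gid);
}
```

If that is representative, the shading language itself reads more like C with a few templated types than like heavy C++; it is the host-side classes in the sample code that I still need to size up.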

These are things I am going to have to take into consideration and disclose during my talk. I know most of these issues will resolve themselves in the next few years; it is just slightly frustrating to sit on the sidelines trying to figure out how to make it work here and now.

Fortune favors the brave.