Category Archives: Metal

Land of Misfit Frameworks

Right now I am in the process of writing a book on Metal. Metal came out the same year as Swift. One thing that has caused a lot of frustration among people like myself who are interested in Metal is the fact that there is not a lot of material out there on the framework. It’s gotten much better. Ray Wenderlich has released a video series on Metal. Apple dedicated a whopping five videos to Metal in 2016. If you’ve been checking the docs between when Metal came out and now, they have increased dramatically in both quantity and quality. Metal has come a long way over the last year.

There are frameworks that Apple released in the last few years that make Metal dramatically more useful. It integrates well with SceneKit, which makes it far easier to work with 3D graphics. In 2015, Apple introduced the Model I/O framework to make it easier to import 3D models from programs like Blender and Maya. Before that, you had to parse all these files by hand like a savage.

I had a sample project I wanted to do today that I didn’t realize I needed to import a model for. Most of the things that I need to do with Metal are made infinitely easier by Model I/O and SceneKit. But I don’t see people clamoring for more books on SceneKit or tutorials on Model I/O. There is almost no information out there about it outside of the Apple documentation. There is one piece of sample code and only one video from 2015.

I am not picking on Apple for not having more documentation. They have a lot on their plate and they’re doing the best they can. I am simply confused as to why I don’t see more people complaining about the lack of information about Model I/O. I don’t know if the lack of adoption of Model I/O is holding back the adoption of Metal or if it goes the other way.

I feel like there is a lack of understanding about what is out there and what the best way is to get things done efficiently.

I know that SceneKit never got the amount of attention that it may have deserved. Everyone I talk to who might use SceneKit doesn’t want to because it’s not available on Android. I feel like most people I interact with use Unity because it was the best thing available five years ago and it’s just the thing they’re most comfortable with.

Apple has a lot of misfit frameworks. GameplayKit. Model I/O. SceneKit. These all work together and support one another. I want to see more stuff out in the world about these frameworks.

I am hoping that I can incorporate learning these frameworks into my exploration of Metal. I think that knowing these tools and using them in my projects will make my experience with Metal far richer and more productive than it would have been otherwise. I know this runs the risk of being a rabbit hole I fall down that prevents me from focusing and getting projects done. However, I think if I don’t raise my awareness of these frameworks, I will be missing the forest for the trees. I will wind up wasting a bunch of time reinventing the wheel or building things that don’t take advantage of what these frameworks are capable of.

I am trying to remember that these are large topics. They take time to learn and adopt. Rome wasn’t built in a day. It’s important to keep that in mind and not get frustrated when I realize just how big these topics are.

Special Announcement: The Metal Programming Guide

Hello all. For the last few weeks I have been hinting that I have a new project that I had not yet announced. I wanted to make this announcement at MacTech, which I did yesterday, so I am now publicly announcing this.

I am working on a new book: The Metal Programming Guide. If you are familiar with the OpenGL series of books like the OpenGL Red Book and the OpenGL Blue Book, this is going to be The Red Book for Metal. The publisher for this project will be Pearson/Addison Wesley, so this is not a self-publishing effort on my part. I will have technical editors and people breathing down my neck if I miss deadlines!

Back in 2014, when Metal was announced, it excited me tremendously. I did a series of conference talks on it throughout that year, but a little thing called Swift kind of got in the way of my learning path for Metal.

This year has been something of a disaster for me on a personal level and I didn’t get to accomplish what I had hoped to this year. Being given an opportunity to finally get to master Metal and write a book about it has given me hope for the future. I have always felt like Metal was something I was meant to do, but I wasn’t really ready to tackle it until now.

Right now I am trying to balance doing some short term contract work with attempting to work on the book. I hope to not run out of money before I finish the book, but I also want to avoid doing full time work that would prevent me from making progress on the book. We are very early in the process right now, but I am giving it everything I have to try and get this done in a reasonable amount of time.

I am hoping to be able to make weekly posts on here and my podcast about various aspects of Metal programming that I am not able to include in the book due to lack of time/space/things being out of scope.

Even though I have been a co-author on a number of programming books already, it has always been my dream to do an immersive book on a topic such as Metal. I am excited to learn this amazing framework and share that knowledge with everyone else.

I’m announcing it here on this blog because I have absolute confidence that this project will be successful. I’ve been waiting to do this for a long time. I do not plan to drop the ball on this. Bring it on!

Primitive Drawing and Assembly in Metal

One of the main reasons I got interested in iOS was because I wanted to learn graphics and audio programming.

I got really interested in OpenGL, but after a year of trying to learn it I wasn’t making any progress.

Anyone who has tried to learn OpenGL has gone through the same frustrating experience I have. You find a tutorial on Ray Wenderlich, you write a bunch of code, and at the end you have a spinning multicolored cube. That’s really cool!!

But then you realize that you have absolutely no idea how to do anything else.

For a really long time, I thought that I had to enter all of my vertices by hand because every tutorial I saw had you write your vertices by hand. My brain was scrambled by the idea of trying to create a large, complex 3D model by hand coding the vertices. It’s hard enough coding things with autocomplete; what if you miss a value?! Do you have to keep building it, keeping track of the vertices, and trying to figure out which one needs to be moved when there are a thousand of them?!

Eventually I was told that you import a pattern file into your application, but until the release of Model I/O in iOS 9, you had to write your own parser from scratch to import a file from Blender or Maya.

So I faced an incredible amount of frustration trying to figure out how you get vertices into an application and how they work together to create something. I kept being told that those spinning-cube projects introduce you to all the things you need to know to make OpenGL work, but none of it felt intuitive.

I am seeing people go through similar frustration while trying to learn Metal. I am hoping to do a series of blog posts over the next few months about aspects of Metal and 3D graphics programming that I don’t feel I have seen explained very well in other places. This stuff is complicated and difficult to explain, so no judgement on anyone who produces technical materials on Metal or OpenGL.

Star Project

I decided to do a project that was slightly more complicated than a triangle or a cube but not as complicated as a 3D pug model. I want to figure out how to explain this stuff in a way that the reader can extrapolate and scale the complexity while still understanding how the fundamental concepts work.

I decided to try and draw a two-dimensional star. It’s slightly more complicated than a triangle but it’s still simple enough for a person to sit down and conceptualize.

Yes, I screwed up the vertex labels at the bottom.

I really hoped that I could find a simple CAD program to generate a pattern file for my 2D star, but after multiple frustrating conversations with various people, I decided to bite the bullet and just plot out the vertices by hand.

The Metal rendering space is slightly different than what one expects coming from something like Core Graphics. I am used to the idea that the phone has a normalized coordinate space where the height is one unit and the width is one unit.

Metal still utilizes a normalized coordinate space, but it’s two units high and two units wide, so the center of the screen is coordinate (0, 0, 0). The upper left corner of the screen is at coordinate (-1, 1, 0) and the lower right corner is (1, -1, 0). It’s not complicated to understand, but it’s slightly counterintuitive for someone coming from the idea that everything is a value between 0 and 1.
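If it helps to see the mapping, here is a tiny sketch (a helper of my own, not anything from Metal itself) converting a point from the familiar 0-to-1 space into Metal’s clip space:

func clipSpacePoint(unitX: Float, unitY: Float) -> (x: Float, y: Float) {
    // Unit space has (0, 0) at the top left and (1, 1) at the bottom right;
    // clip space has (-1, 1) at the top left and (1, -1) at the bottom right.
    return (unitX * 2.0 - 1.0, 1.0 - unitY * 2.0)
}

clipSpacePoint(unitX: 0.5, unitY: 0.5)   // (0.0, 0.0) — the center of the screen
clipSpacePoint(unitX: 0.0, unitY: 0.0)   // (-1.0, 1.0) — the upper left corner
clipSpacePoint(unitX: 1.0, unitY: 1.0)   // (1.0, -1.0) — the lower right corner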

I created my coordinate space using graph paper. I made each square on the paper worth 0.2 units and made the space ten squares by ten squares.

I understand that because the phone screen is not perfectly square that the star is not going to look like this when it’s finally rendered. One of the things I want to do later is figure out how to constrain the drawing area to be square so that the star renders properly, but that’s a task for a later time.

Writing out vertices by hand like a savage.

In Metal and OpenGL, shapes are composed of triangles. All the big scary 3D models that make up a Pixar movie are composed of meshes of triangles. Everything can be broken down into triangles.

So let’s think about a star. It’s obvious that the points are composed of triangles, but what about the middle? The middle is a pentagon. This pentagon can be composed of five triangles by drawing out from the center to each of the vertices between the points.

So if you think about how to describe the star to the renderer, you are going to describe ten triangles using eleven vertices. There are five vertices at the points, the five between the points, and finally one in the middle.

Metal Primitives

When you package your vertices and pass them to the vertex buffer, you need to describe what type of shape the draw call is assembling from them. I know I just went off on how everything can be broken down into triangles, but there are a few flavors of shapes you can draw with Metal.

Metal has an enum of primitive types, MTLPrimitiveType. There are five different primitives available to you (a quick sketch of picking one at draw time follows the list):

  • Point: Rasterizes a point at each vertex. You have to define a point size in the vertex shader.
  • Line: Rasterizes a line between each pair of vertices. These lines are separate and not connected. If there is an odd number of vertices, the last vertex is ignored.
  • Line Strip: Rasterizes a line between a bunch of adjacent vertices, resulting in a connected line.
  • Triangle: For every separate set of three vertices, rasterize a triangle. If the number of vertices is not a multiple of three, either one or two vertices is ignored.
  • Triangle Strip: Rasterizes a triangle for every group of three adjacent vertices; after the first two vertices, each new vertex forms a triangle with the previous two.
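The primitive type is chosen when the draw call is encoded, not when the vertex data is built. As a quick sketch, assuming a renderEncoder and the thirty star vertices described below already exist, the same buffer can be interpreted differently just by changing the type:

// Draw the 30 vertices as 10 separate triangles.
renderEncoder.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: 30)

// The same 30 vertices interpreted as 15 disconnected line segments instead.
renderEncoder.drawPrimitives(type: .line, vertexStart: 0, vertexCount: 30)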

So the easiest way to describe the star to the vertex shader is to hand enter ten sets of three vertices, where each set describes one triangle.

let vertexData:[Float] =
    [
        // Internal Triangles
        0.0, 0.0, 0.0, 1.0,
        -0.2, 0.2, 0.0, 1.0,
        0.2, 0.2, 0.0, 1.0,
        
        0.0, 0.0, 0.0, 1.0,
        0.2, 0.2, 0.0, 1.0,
        0.3, 0.0, 0.0, 1.0,
        
        0.0, 0.0, 0.0, 1.0,
        0.3, 0.0, 0.0, 1.0,
        0.0, -0.2, 0.0, 1.0,
        
        0.0, 0.0, 0.0, 1.0,
        0.0, -0.2, 0.0, 1.0,
        -0.3, 0.0, 0.0, 1.0,
        
        0.0, 0.0, 0.0, 1.0,
        -0.3, 0.0, 0.0, 1.0,
        -0.2, 0.2, 0.0, 1.0,
        
        // External Triangles
        0.0, 0.6, 0.0, 1.0,
        -0.2, 0.2, 0.0, 1.0,
        0.2, 0.2, 0.0, 1.0,
        
        0.6, 0.2, 0.0, 1.0,
        0.2, 0.2, 0.0, 1.0,
        0.3, 0.0, 0.0, 1.0,
        
        0.6, -0.4, 0.0, 1.0,
        0.0, -0.2, 0.0, 1.0,
        0.3, 0.0, 0.0, 1.0,
        
        -0.6, -0.4, 0.0, 1.0,
        0.0, -0.2, 0.0, 1.0,
        -0.3, 0.0, 0.0, 1.0,
        
        -0.6, 0.2, 0.0, 1.0,
        -0.2, 0.2, 0.0, 1.0,
        -0.3, 0.0, 0.0, 1.0
]
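If you would rather not plot all of this on graph paper, here is a sketch of my own (not part of the original project) that generates the same kind of vertex list with a little trigonometry; the radii and the function name are arbitrary choices:

import Foundation

// Generates a five-pointed star as ten triangles (30 vertices, four Floats each).
// Eleven unique positions are used: the center, five inner corners, and five points.
func starVertexData(outerRadius: Float = 0.6, innerRadius: Float = 0.25) -> [Float] {
    var outer = [(Float, Float)]()
    var inner = [(Float, Float)]()
    for i in 0..<5 {
        let outerAngle = Float.pi / 2 + Float(i) * 2 * Float.pi / 5   // points start at the top
        let innerAngle = outerAngle + Float.pi / 5                    // inner corners sit between the points
        outer.append((outerRadius * cos(outerAngle), outerRadius * sin(outerAngle)))
        inner.append((innerRadius * cos(innerAngle), innerRadius * sin(innerAngle)))
    }
    var data = [Float]()
    func push(_ p: (Float, Float)) { data += [p.0, p.1, 0.0, 1.0] }   // x, y, z, w
    for i in 0..<5 {
        // Internal triangle: the center plus two adjacent inner corners.
        push((0.0, 0.0)); push(inner[i]); push(inner[(i + 1) % 5])
        // External triangle: one point plus the two inner corners that flank it.
        push(outer[i]); push(inner[(i + 4) % 5]); push(inner[i])
    }
    return data
}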

I want my star to be kind of flashy. I would like to set a pseudo-radial gradient on the star where the tips are red, but the middle is white. So I need to set up another array of floats describing the color data as it correlates to the positional data.

let vertexColorData:[Float] =
    [
        // Internal Triangles
        1.0, 1.0, 1.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        
        1.0, 1.0, 1.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        
        1.0, 1.0, 1.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        
        1.0, 1.0, 1.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        
        1.0, 1.0, 1.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        
        // External Triangles
        1.0, 0.0, 0.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        
        1.0, 0.0, 0.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        
        1.0, 0.0, 0.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        
        1.0, 0.0, 0.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        
        1.0, 0.0, 0.0, 1.0,
        1.0, 1.0, 1.0, 1.0,
        1.0, 1.0, 1.0, 1.0
]

Over in the view controller, you need to create a buffer to hold the position data and one for the color data for your vertices:

var vertexBuffer: MTLBuffer! = nil
var vertexColorBuffer: MTLBuffer! = nil

Now you need to connect those buffers to the arrays of vertex data. A buffer doesn’t know how much data it needs to store, so you have to calculate the size of each vertex array in bytes so the buffer knows how much space to allocate.

// Four bytes per Float, multiplied by the number of Floats in the array.
let dataSize = vertexData.count * MemoryLayout<Float>.size
vertexBuffer = device.makeBuffer(bytes: vertexData, length: dataSize, options: [])
vertexBuffer.label = "vertices"

vertexColorBuffer = device.makeBuffer(bytes: vertexColorData, length: dataSize, options: [])
vertexColorBuffer.label = "colors"

Since the vertex position and color arrays are the same size, you can reuse the data size variable for both buffers.

At the end of the process, this data is scheduled by the render encoder to be sent to the vertex shader and be processed by the GPU.

renderEncoder.setVertexBuffer(vertexBuffer, offset: 0, at: 0)
renderEncoder.setVertexBuffer(vertexColorBuffer, offset: 0, at: 1)
// Each vertex is four Floats (x, y, z, w), so the vertex count is a quarter of
// the array's length: 30 vertices for the star's ten triangles.
renderEncoder.drawPrimitives(type: .triangle,
                             vertexStart: 0,
                             vertexCount: vertexData.count / 4)

Build and run the app on a phone and you get a nice star! Huzzah!

Wrap Up

I created a GitHub repo for this project here. Honestly, most of this project is just the stock template with the vertex data changed.

Looking this over, it seems kind of inefficient. The middle set of triangles feels like it could be a triangle strip. I would like to add a nice outline to the star to make it look neat. I would also like the star not to be stretched by the screen’s aspect ratio.

I am planning to update this project periodically to make it more efficient and customizable.

At the very least, I hope this adds some understanding to how Metal breaks down larger shapes into triangles and how it’s able to go through and build the shapes back up again.

Streaming WWDC 2016

I have never had the privilege of attending WWDC. Most years (including this one) I never bothered to apply to the lottery because I couldn’t afford to go. The one year I could afford to go, I didn’t win a ticket and I decided I would rather have the money as a buffer than go out to WWDC. This was the correct decision.

I attend a lot of conferences. I speak at a lot of conferences. Unfortunately, I have had some difficulty actually attending sessions at conferences. I have panic attacks when I am trapped in a room full of people and I can’t get up and walk around. This was one reason I was never super disappointed about going to WWDC the last few years. The idea of being stuck in a room for a whole week makes me feel like curling in a ball and crying. I go to conferences to network and drink with my friends. Now I am at the point where it’s just networking since I gave up drinking.

One thing I had forgotten about was discovering new things by attending sessions I hadn’t thought to go to. When I went to my first CocoaConf, I encountered a lot of interesting things because I wanted to watch Jonathan Penn and Josh Smith present.

When Swift was introduced two years ago, most of the conference sessions revolved around talking about Swift. I like Swift, it’s a neat language, but I am sick of talking about it. I am tired of hearing people talk about side effects and protocols and immutable state. I miss the first few years I was an iOS developer when people talked about frameworks and weird little nooks and crannies of the Cocoa architecture.

Taken together, this has created something of a perfect storm where I got burned out on iOS development. I got sick of talking to people about it because it always boiled down to Swift and arguing about code purity and a bunch of other bullshit.

I saw the Keynote this year and had absolutely no enthusiasm for anything. I was irritated and cranky and didn’t want to deal with anything. But I noticed that this year Apple decided to stream most of the sessions live. The sessions have always been available online later, and last year they started streaming select sessions. I watched the Swift ones because it was for my job and Swift was still new and exciting. But I rarely watch the sessions afterward, because when I do, I sit there and pause every few minutes to try and process the vast amount of information being presented. There is a massive backlog of sessions I think would be nice to watch but never get around to watching. I did not think I would do anything this year.

I was wrong.

Streaming the sessions live has completely changed my life this week.

I work from home and so I just kind of threw the live stream on while I worked on stuff. I have it on in the background. I can’t pause the live stream, so I am not poring over every second of each video minutely. I am getting an overview of what they are talking about so I can go and research things later. I also have a team of people on various Slack channels who are watching it with me that I can chat with about the things we find new and exciting.

There were five whole sessions on Metal this year. The last two years I only got through the first Metal video because I felt like I didn’t understand it well enough to move on to the next video. This year, since they were just on, I could passively leave it on and get through all the videos. If this was a normal year, I would not have encountered the thing that has excited me the most this year, which is doing neural networks in Metal. That was introduced in “What’s New in Metal Part 2,” which was the fourth Metal video streamed. I did not need all the context from the first three videos to get excited about the new stuff in Metal.

I got to watch all the videos about GameplayKit, Photos, SpriteKit, etc… All of these technologies that I have been interested in but in a passive way were all just there for me to listen in on. I got introduced to so many things I didn’t know about in obscure frameworks that don’t get a lot of love because most people need to pay the bills and so they don’t do sessions on SceneKit.

This is what it was like at the beginning when I started going to conferences. I would discover so many new things that I would go home excited to get working on something. I haven’t felt this way for the last two years.

I worked for Brad Larson for a year. He told me that the reason he got into making Molecules and got into OpenGL and doing GPUImage was because he had a free period at WWDC and just decided, on a whim, to watch a session on OpenGL. It’s crazy to me about how things you do on a whim or by chance can completely change your life. By not being exposed to these sessions over the last few years, I have been cutting myself off from these chance encounters to find something truly special that I can learn and make my own.

It has been a great gift to get to participate with WWDC from home. Being able to get up and walk around during a session and cuddle with Delia while listening to people give their talks has helped me tremendously. I can talk to people on Slack from all over the world about the sessions as they happen so we can all be excited together. I know that people get something out of being there and getting to talk to the engineers, but for someone with mental health issues that prevent them from being able to be comfortable with massively large amounts of people, this has been a godsend.

I am planning in the future to go back and watch all the videos from previous years that I never watched because they took too long. I can have them on in the background while I work on other things. I can pick out the parts that interest me and look into them further.

For the first time in a really long time, I am excited about iOS development. Thank you Apple for giving that back to me.

Getting Metal Up and Running: Metal Template

Note: The code from this tutorial is based on this tutorial from Ray Wenderlich. That one was written in 2014 and was not updated for Swift 2.0, so there are a few changes I have made to my template to bring it up to date. I am also writing my own explanations about what each part of this process does. If you just want something that works, you can download the template. If you want something easier to skim, I suggest looking through the Ray Wenderlich tutorial.

Metal was announced at WWDC 2014. It was the most exciting announcement of WWDC for approximately five minutes until Swift was announced.

I was doing a high level Swift talk in 2014 when I didn’t know the framework very well yet and I just wanted to give people an idea about why Metal was important. People were understandably unhappy that I didn’t show them how to code Metal. I am rectifying that mistake.

As was the case when Metal was first announced, you can’t test Metal code in the simulator. You have to build on a device with an A7 chip or later. The most primitive iOS device you can use for this is the iPhone 5S.

The goal of this project is to create a template that can be used as a base to just get Metal up and running. This doesn’t do anything other than render a color to the screen. I will take this template and add vertex buffers and geometry in a later post to explain how that process works. I didn’t include those in this project because I didn’t want to include anything that would need to be deleted by the programmer before it could be used.

Let’s get started!

Create Project and Import Frameworks

If you want to walk through the process of building the template rather than just downloading it from GitHub, you can follow along with the directions.

Open Xcode and create a Single View Application. You can name it anything you want. If you are using this as a basis for a project, go ahead and name it whatever it will eventually be. Choose Swift as the language. I made mine Universal, but if you know it will only be on an iPhone, go ahead and just choose iPhone.

There are a few frameworks that you will need to import before you can get Metal up and running.

Add these import statements at the top of your ViewController:

import Metal
import QuartzCore

Metal is obvious: you need to import the framework to do anything with Metal. QuartzCore is a little less obvious. We’ll get to that soon.

Most of the work we are doing will be in the ViewController class. Unless otherwise specified (as in the Shader code), add all code to the ViewController.

MTLDevice

The first thing you need to set up for a Metal project is the MTLDevice. The MTLDevice is the software representation of the GPU. I go over the properties of MTLDevice in a previous blog post. We don’t need access to everything that MTLDevice does for this simple template.

At the top of the ViewController class, add the following property:

// Properties
var device: MTLDevice! = nil

You will be seeing and using the device property a lot in this project. The MTLDevice is the manager of everything going on with your Metal code. You will be instantiating this (and all other properties) in the viewDidLoad() method.

There is only one safe way to initialize your device property:

device = MTLCreateSystemDefaultDevice()

At this point in time, every Metal-capable device only has one GPU. This function returns a reference to that GPU.

CAMetalLayer

Note: You might see an error at some point in this code block that says CAMetalLayer not found. This drove me crazy for a really long time. I had downloaded Apple’s template that utilizes MetalKit, which does not have an explicit instance of CAMetalLayer because it is implemented behind the scenes. I thought it was something that had been phased out.

This is an ambiguous compiler error. At some point Xcode had switched the build destination from a physical device to the simulator. Rather than saying the code won’t build on the simulator, it says the CAMetalLayer was not found.

If you ever get weird, ambiguous compiler errors while coding Metal that say something doesn’t exist, check the build target!

Remember back at the beginning of the blog post where I told you to import QuartzCore? You are importing that for one purpose: To get access to CAMetalLayer.

In iOS, everything you see on your screen is backed by a CALayer. Every view, every button, every cell is backed by a CALayer. CALayer is like the canvas you use to render things to the screen. If you want to know more about CALayer there are a few good tutorials on it here and here.

Since Metal is a different beast than a normal UIView, you need to create a special kind of CALayer: CAMetalLayer. This layer doesn’t live in the Metal framework, it lives in the QuartzCore framework, along with the special layer for rendering different flavors of OpenGL ES.

Create a CAMetalLayer property under your MTLDevice property:

var metalLayer: CAMetalLayer! = nil

Initializing the CAMetalLayer is a little more complicated than initializing the MTLDevice. There are four settable properties on the CAMetalLayer:

  • Device
  • Pixel Format
  • Framebuffer Only
  • Drawable Size

Device is self-explanatory. It is simply the MTLDevice we created in our last step.

Pixel format is your chosen MTLPixelFormat. There are over a hundred cases in the MTLPixelFormat enum, but there are only three options available on the CAMetalLayer: MTLPixelFormatBGRA8Unorm, MTLPixelFormatBGRA8Unorm_sRGB, and MTLPixelFormatRGBA16Float. The default pixel format is MTLPixelFormatBGRA8Unorm, so I am just going to leave it there unless I have some reason to change it.

Framebuffer only is an optimization option. There are two kinds of MTLResource types in Metal: MTLTexture and MTLBuffer. Textures are less efficient when they have to support sampling and pixel read/write operations. If you only ever use the layer’s textures as render targets, setting framebuffer only to true tells Metal it never has to worry about those operations.

Drawable size specifies how large your texture is. Since we are not currently using a texture, we don’t need to set this.

Add the following code to your viewDidLoad() method:

// Set the CAMetalLayer
metalLayer = CAMetalLayer()
metalLayer.device = device
metalLayer.pixelFormat = .BGRA8Unorm
metalLayer.framebufferOnly = true
metalLayer.frame = view.layer.frame
view.layer.addSublayer(metalLayer)

You’re initializing the metalLayer, then setting the properties on it that are relevant to this project. We’re leaving the pixel format to the default and since we don’t have any textures, we’re setting the optimization to true.

As with all layers, we’re setting the frame and adding it as a sublayer. Huzzah!

Command Queue

We will need an object that will organize the commands that we need to execute. In Metal, that object is the MTLCommandQueue. The MTLCommandQueue is our Space Weaver that keeps all of our command threads straight and running properly.

Add this property to the top of your View Controller class:

var commandQueue: MTLCommandQueue! = nil

This creates a command queue that is available to all of our methods within our View Controller. It will be used in a couple of different places and we want it to remain consistent and not go out of scope.

Up next, we need to set the command queue. Add this line of code into your viewDidLoad() function at the bottom:

// Set the Command Queue
commandQueue = device.newCommandQueue()

We’re almost done implementing all the code we need in our viewDidLoad() method. We just need to set up the functionality to actually draw to the screen. For that, we need a display link.

DisplayLink

We now have our CAMetalLayer sublayer in place and we can draw to the screen. But how do we know when to trigger the screen to redraw?

It needs to redraw any time the screen refreshes. Since this is a common task in iOS, there is a built in class available to do this: CADisplayLink.

CADisplayLink exists to sync your drawing to the refresh rate of the display.

Add a new property to your ViewController class:

var timer: CADisplayLink! = nil

In order to set up your display link, you need to tell it what the target is and what code needs to be run every time the link is triggered. Since we are creating this for the view controller, the target is self. We just need the program to render, so I called this selector renderloop:

// Set the Timer
timer = CADisplayLink(target: self, selector: Selector("renderloop"))
timer.addToRunLoop(NSRunLoop.mainRunLoop(), forMode: NSDefaultRunLoopMode)

After you initialize the display link, you need to register it with a run loop. We are just registering it with the main run loop on the default mode to get access to coordinate the redrawing of the screen.

You have specified renderloop as your selector, but it hasn’t been created yet. Go ahead and do that now at the bottom of the class:

func render() {
  // TODO
}
 
func renderloop() {
  autoreleasepool {
    self.render()
  }
}

So, we didn’t just create the renderloop method, we created another one as well. The renderloop method just calls a render method encapsulated within an autoreleasepool. We’ll be setting up that render method next.

Render Pipeline

Rendering is the process where the program takes all of the information it has about the scene and compiles it together to determine the color of each pixel on the screen.

This is a rich and immersive topic that I would like to explore more fully in time, but for now I am trying to make sure I include the most basic information necessary to understand the code needed to do the minimum here.

If you would like a better explanation of what rendering is and how it works, there is a great explanation of it and the math involved in Pixar in a Box on Khan Academy.

Now that we have our display link in place, we would like to set up the code necessary to take a bunch of numbers and small shader programs and turn them into something cool we see on the screen.

To do that, we need to set up a rendering pipeline. A rendering pipeline takes all the inputs you have (vertices, shaders, optimizations, etc…) and coordinates them to make sure that each vertex and fragment is produced properly on the screen.

This will require us to put a number of pieces in place. I will go over each one and explain its role in the process.

Render Pass Descriptor

We are going to go into that empty render method and start to fill it out. The first thing we are going to create in that method is the render pass descriptor. The render pass descriptor is a collection of color, depth, and stencil information for your renderer. In this simple template we are not concerned with the depth or the stencil properties, so we are focusing on the color properties.

Begin filling out your render() method with the following code:

func render() {
    let renderPassDescriptor = MTLRenderPassDescriptor()
    let drawable = metalLayer.nextDrawable()
    renderPassDescriptor.colorAttachments[0].texture = drawable!.texture
    renderPassDescriptor.colorAttachments[0].loadAction = .Clear
    renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColor(red: 1.0, green: 0.0, blue: 1.0, alpha: 1.0)

First off, you’re creating a MTLRenderPassDescriptor(). Before you can set up its color attachments, you need to grab something to hook up to the render pass descriptor: a drawable.

The render pass descriptor needs a source for it to render. That source is maintained by the instance of the CAMetalLayer. The CAMetalLayer has a method on it called nextDrawable(). Internally, CAMetalLayer maintains a cache of textures for displaying content. This method grabs the next one in the queue available for use.

Our render pass descriptor needs to know a few things about its color attachments: texture, load action, and clear color. Our texture is whichever texture is up next in the CAMetalLayer’s pool of drawable textures.

We have three options for our load action: Don’t Care, Load, and Clear. Don’t Care allows the pixel to take on any value at the start of the rendering pass. Load maintains the previous texture. Clear specifies that a value is written to every pixel. We want to use Clear here because we want to overwrite the color that currently exists.

Lastly, we’re setting a clear color. I have arbitrarily set it to a nice magenta color. When you finally pull the switch on this application, you’ll know this succeeds when your screen turns a lovely shade of pink.

Create the Command Buffer

Everything that we do deals with buffers. You need to send commands to a buffer in order to get them to be executed and to see something happen on your screen.

In order to send your commands to a buffer, you need to create one. Let’s do that now.

Add the following code to the bottom of your render() method:

// Create Command Buffer
let commandBuffer = commandQueue.commandBuffer()

The MTLCommandBuffer, like the MTLDevice, is a protocol. It can’t be subclassed and there is only one way to instantiate a buffer object: calling the commandBuffer() method on an instance of MTLCommandQueue.

Earlier in this code we made an MTLCommandQueue object that was available to the entire view controller. This is why. We needed to be able to access the same command queue instance we created when the view loaded.

Render Pipeline State

We’re going to briefly jump back to our viewDidLoad() method to add one last piece that we need to set up our rendering pipeline: the render pipeline state.

Create a new property at the top of the class along with the others:

var metalPipeline: MTLRenderPipelineState! = nil

Our pipeline state exists to interface with our shaders. Don’t worry, we will get to those.

In order to access our shaders, we are going to need some code:

// Set the Rendering Pipeline
let defaultLibrary = device.newDefaultLibrary()
let fragmentProgram = defaultLibrary?.newFunctionWithName("basic_fragment")
let vertexProgram = defaultLibrary?.newFunctionWithName("basic_vertex")

let pipelineStateDescriptor = MTLRenderPipelineDescriptor()
pipelineStateDescriptor.vertexFunction = vertexProgram
pipelineStateDescriptor.fragmentFunction = fragmentProgram
pipelineStateDescriptor.colorAttachments[0].pixelFormat = .BGRA8Unorm
        
do {
    try metalPipeline = device.newRenderPipelineStateWithDescriptor(pipelineStateDescriptor)
} catch let error {
    print("Failed to create pipeline state, error \(error)")
}

The first thing you will notice is that we are creating a default library.

In Metal, shaders are instances of MTLFunction objects. A Library is a collection of MTLFunction objects.

The purpose of your MTLRenderPipelineDescriptor is to look at your collection of shaders and tell the renderer which fragment and vertex shader your project is going to use.

We’re creating a new default library. After that, we are creating instances of MTLFunction objects by reaching into the default library and looking for them by name.

After we have our MTLFunction objects, we need to tell the pipeline state descriptor which vertex and fragment shaders we want to use.

Since creating a Metal Pipeline is a failable operation, we need to wrap it in a do-try-catch block.

At this point you might be wondering where those vertex and fragment shader functions came from. Good question, you’re about to generate them.

Metal Shaders

Note: I am creating both a vertex and a fragment shader even though this template has no vertex buffer. I am not sure if you need a vertex shader if there is nothing for it to shade, but it builds and this is something you would be adding later when you did have a vertex buffer. It might not be strictly necessary, but I am leaving it here anyway.

In GLSL, you need both a vertex and a fragment shader because they are two halves of the same program. If the Metal Shading Language follows the same paradigm as GLSL, then this is also the case and both a vertex and a fragment shader are necessary for every program you build.

This section is going to be disappointing. I haven’t had a chance to figure this out very well yet, so I apologize for the lack of detail about how I came up with these parts. This will be rectified later.

This is the one part of your code that is not going into the ViewController class. You get to create a new file now!

Create a new file and choose the “Metal File” template. Name it “Shaders.metal”.

The Metal Shading Language is based on C++. I am not as familiar with it as I would like, so sadly this section is going to be less in depth than I would like. I promise there will be many blog posts about this in the next few months.

You are going to create two different shaders: vertex and fragment. At this point you might be wondering how these get into your default library. The way the default library works is that it automatically includes any function that is included in a .metal file. You could create hundreds of vertex and fragment shaders in various files and all of them would be accessible from the default library. I believe that you can create multiple libraries, but that is just one of the things I am going to explore more fully in the future.
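As a hedged sketch of that idea, you can also compile an extra library at runtime from a source string. This uses the current Swift naming for the API, which differs slightly from the Swift 2-era calls in this post, the shader name is made up for illustration, and it assumes the device from earlier is in scope:

let shaderSource =
    "fragment half4 red_fragment() { return half4(1.0, 0.0, 0.0, 1.0); }"

do {
    // makeLibrary(source:options:) compiles Metal Shading Language at runtime,
    // producing a library separate from the default one built from .metal files.
    let runtimeLibrary = try device.makeLibrary(source: shaderSource, options: nil)
    let redFragment = runtimeLibrary.makeFunction(name: "red_fragment")
    print(redFragment?.name ?? "function not found")
} catch {
    print("Failed to compile runtime library: \(error)")
}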

This is the code for the vertex shader:

vertex float4 basic_vertex(const device packed_float3* vertex_array [[ buffer(0) ]],
                           unsigned int vid [[ vertex_id ]]) {
    return float4(vertex_array[vid], 1.0);
}

This is the code for the fragment shader:

fragment half4 basic_fragment(){
    return half4(1.0);
}

This code comes directly from the Ray Wenderlich tutorial. I don’t know how it was generated. I don’t want to speculate about how this works because I don’t want to have something out there that I am completely wrong about, so just know this works and I will explain why in a later blog post.

Render Command Encoder

Now that we have a buffer, we need to add some commands to it.

We’ve been creating a bunch of different things that haven’t been applied anywhere yet. We’re ready to put those things to work.

Add this code to the bottom of the render() method:

// Create Render Command Encoder
let renderEncoder = commandBuffer.renderCommandEncoderWithDescriptor(renderPassDescriptor)
renderEncoder.setRenderPipelineState(metalPipeline)
renderEncoder.endEncoding()

The command encoder needs to know about the render pass and the pipeline state. We already created both of these things and went into some depth about what they do. We’re simply adding these specifications to the render encoder. After we add those to the encoder, we need to let the render encoder know we’re done by ending our encoding.

Commit the Command Buffer

Almost done! Just two more lines of code:

// Commit the Command Buffer
commandBuffer.presentDrawable(drawable!)
commandBuffer.commit()

We are taking that top drawable texture from the CAMetalLayer and queuing it up in the command buffer to be processed by the render pipeline. Finally, we are committing the command buffer.

Bring it all Together

I have placed this code in a GitHub repository.

There are a lot of aspects of this project that I am looking forward to exploring in more depth over the next few months.

The purpose of this blog post was to try and flesh out some of the code a little bit. Most of the blog posts I have seen involving this kind of template don’t really get into the nuts and bolts about what each of these things does and why you need them in your project.

It would not be useful to delve into the nitty gritty of everything in the framework. This post was long enough as it is!

I hope that this was a little more helpful if you’re like me and you wonder what things do and why you need them instead of just being satisfied with having something that works.

This post took over a month for me to write. It’s possible there are aspects of this post that are disjointed or that I forgot to include. I would appreciate being made aware of any issues by being contacted on Twitter. I would also appreciate knowing if there are any specific topics in Metal that people are interested in hearing about.

Thanks for bearing with me through this incredibly long post!

The Space Weaver is busy rendering fragments and vertices.

Getting Metal Up and Running: MTLDevice

I was really hoping last week to get a post out about creating a default Metal template, but I have been sick for a few weeks. Last weekend I sat in my chair staring at the TV because that was all I could deal with.

I am also recovering from a migraine that I have had for the past four days.

I was hoping to be able to go through my Metal template and remove all of the bells and whistles that were added by the Apple engineers to make the template do something and not just be a blank slate. I don’t feel comfortable enough to go through this and delete things yet, so I would like to talk about various Metal objects and components necessary to create a baseline Metal project and what purpose each of these serve.

I am posting this project on GitHub. I will be modifying it in the next few weeks to make this into a functional template anyone can grab if they want to start a Metal project but don’t want to deal with deleting boilerplate code. It’s not at that point yet, but hopefully before the end of the year!

The first, and most important, property I want to talk about is the MTLDevice. The documentation for MTLDevice is here.

Why Does Our Code Need an MTLDevice?

The MTLDevice is the software representation of the GPU. It is the way that you are able to interface with the hardware.

You might be wondering why this is something you would need. The iPhone is a cohesive unit that just kind of works when you program it. Why do you need to initialize a variable to represent part of your phone?

Whenever you create software that is going to interact with external hardware, you need to have a way to interface with it in your code.

When I was working at SonoPlot, we were writing control software for robots. We had classes for the robotics system and for the camera we were using.

The camera class had to have functions for every action we needed it to perform. It also had a property for each thing that we needed to either get or set. One example is resolution. Some of our cameras had variable resolution and others were fixed. We had to have a property that would either report the resolution or, on the cameras where it could be changed, set it.

Even though we don’t necessarily think of the iPhone in terms of “external” hardware, it is. There are different parts of the iPhone that you can interface through the Cocoa frameworks. Speaking of cameras, the iPhone has a camera.

The AVCaptureDevice in AVFoundation is the interfacing class that Apple provides to allow you to talk to the camera. It fulfills the same function that our Camera and Robotics classes had in the robotics software.

Just as we needed to create a camera and robotics class to talk to our hardware, you also need to create an object that talks directly to your GPU.

One of the promises of the Metal framework is that you, as the developer, would have far more control over allocating and deploying resources in your code. This is reflected in the methods associated with the MTLDevice protocol.

Protocol, Not Object

The first thing to point out is that MTLDevice is not a class, it’s a protocol. At this point I am not entirely certain why it was designed this way.

The code to create the device is as follows:

let device: MTLDevice = MTLCreateSystemDefaultDevice()!

If you hold Option and hover over the device variable, it indicates that it is an MTLDevice object. In order to be an object, it must be an instance of a class. My best guess is that this is an instance of NSObject that conforms to the MTLDevice protocol.
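One quick way to peek at the concrete class behind the protocol (just a sketch; the name that prints is a private Apple class and varies by device and OS version):

import Metal

let device = MTLCreateSystemDefaultDevice()!
print(type(of: device))   // prints the private class that conforms to MTLDevice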

I am fascinated as to why this was implemented in this manner. I plan to look into this further, but this is one of those things that having an understanding of why it was done this way doesn’t necessarily help me get things done, so I am going to try and leave it alone for now.

Functionality

The MTLDevice handles the following areas of functionality for you:

Identifying Properties

There are many more Apple devices and chip types that support Metal programming now than there were when it was initially announced. Back when it came out in iOS 8, we just had the iPhone 5S and one of the iPad models with an A7 chip.

Now, we have a lot of devices with a lot of different chip sets that all support Metal programming, including the Mac.

We’re in a similar situation with these chips that I was in with supporting the legacy robotics software at SonoPlot. We had three different types of cameras that all had different properties. You can’t assume that all the GPUs in each device are the same.

So, just as we did with the robotics software, there are several properties that are retrievable from the GPU.

The properties you can access on each GPU are listed below (there is a small sketch of querying them right after the list):

  • Maximum number of threads per thread group
  • Device Name
  • Whether this GPU supports a specific feature set
  • Whether this GPU supports a specific texture sample count

Looking over this list of properties, it looks like the chips have gained more and better capabilities as they have progressed. The A7 chip in my iPhone 5S is not as powerful as the A9X chip in my iPad Pro.

If you want to target less powerful devices, it might be useful to get familiar with the idiosyncrasies of the chips in these devices.

Creating Metal Shader Libraries

As a protege of Brad Larson, I am very interested in exploring shaders in Metal.

One of the big, stupid questions I have with Metal is how the shaders are set up. Every project I have seen (including the template) has had one shader file named “Shaders.metal.” Does that mean that all shaders go into one file? Are these just projects that use one shader? Is this basically set up the same way that OpenGL ES 2.0 is set up?

The documentation says that all of the “.metal” files are compiled into a single default library, so that tells me that I can have more than one shader file in a project.
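As a tiny sketch of that (using the newer Swift naming for these calls), you can list every function the default library picked up from all of the project’s .metal files:

import Metal

if let device = MTLCreateSystemDefaultDevice(),
   let library = device.makeDefaultLibrary() {
    // Every vertex, fragment, and kernel function from every .metal file shows up here.
    print(library.functionNames)
}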

I am looking forward to exploring the shader libraries in the future, but right now just making a note that this is a responsibility of the MTLDevice.

Creating Command Queues

It is interesting to see that Metal has built-in command queues.

I know that Brad utilizes Grand Central Dispatch in GPUImage to make it work as efficiently as possible. OpenGL ES doesn’t have a built-in command queue structure to the best of my knowledge. I know that OpenGL ES can only exist on one thread at any given time and that one of the promises of Metal was multithreading. If you’re going to multithread something you need some way of keeping your threads straight.

Looking forward to exploring the Metal Command Queues further.

Creating Resources

The resources you are creating with the Metal device are buffers, textures, and sampler states.

Buffers should be familiar to anyone who works with graphics or audio. To the best of my memory, in OpenGL ES you have three buffers: One for the frame that is currently being displayed, one for the next frame to be displayed, and one in the wings waiting to be deployed when the top buffer gets popped off the stack.

Textures are also a familiar concept. In GPUImage the photo or video you are filtering is your texture, so this is fairly straightforward.

I have never heard of a sampler state, so I am interested to find out what this does.

Not a lot new or exciting here if you have any familiarity with OpenGL ES.
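To make those three concrete, here is a sketch of creating one of each. It uses the current Swift naming, which differs slightly from the Swift 2-era calls elsewhere in these posts, and the sizes and filter choices are arbitrary:

import Metal

let device = MTLCreateSystemDefaultDevice()!

// A buffer: a block of untyped memory the GPU can read from and write to.
let buffer = device.makeBuffer(length: 1024, options: [])

// A texture: formatted image data, described up front by a descriptor.
let textureDescriptor = MTLTextureDescriptor.texture2DDescriptor(
    pixelFormat: .bgra8Unorm, width: 256, height: 256, mipmapped: false)
let texture = device.makeTexture(descriptor: textureDescriptor)

// A sampler state: describes how a shader reads (samples) from a texture.
let samplerDescriptor = MTLSamplerDescriptor()
samplerDescriptor.minFilter = .linear
samplerDescriptor.magFilter = .linear
let sampler = device.makeSamplerState(descriptor: samplerDescriptor)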

Creating Command Objects to Render Graphics

This is where you set up your pipeline to render your graphics. In OpenGL ES 1.0 this was a fixed function pipeline without shaders. In OpenGL ES 2.0, the programmable pipeline was introduced along with the ability to use shaders. Shaders made it possible to generate effects like shadows and ambient occlusion in graphics on an iOS device. Since we’re not taking a step backwards, there are commands here to set up your programmable pipeline.

Creating Command Objects to Perform Computational Tasks

One of the things that really excited me about Metal was the ability to do general purpose GPU programming (GPGPU) in iOS. I had hoped in iOS 8 to hear that OpenCL would be made available on iOS, so I was rather pleased to hear that this functionality was made available.

Jeff Biggus has been speaking about OpenCL for a few years and using the GPU for something other than just graphics processing. It is one of those things on my giant list of things I am interested in but haven’t had a chance to explore yet. This excites me and I am looking forward to writing a Metal project that utilizes this extra functionality.
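Here is a minimal sketch of that compute path, written as my own example with the current Swift naming; the kernel name, buffer size, and threadgroup size are all arbitrary choices for illustration:

import Metal

// Assumes a .metal file in the project contains this kernel (hypothetical name):
//     kernel void fill_with_ones(device float *data [[ buffer(0) ]],
//                                uint id [[ thread_position_in_grid ]]) {
//         data[id] = 1.0;
//     }
let device = MTLCreateSystemDefaultDevice()!
let library = device.makeDefaultLibrary()!
let kernel = library.makeFunction(name: "fill_with_ones")!
let pipeline = try! device.makeComputePipelineState(function: kernel)

let count = 1024
let resultBuffer = device.makeBuffer(length: count * MemoryLayout<Float>.size, options: [])!

let queue = device.makeCommandQueue()!
let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(resultBuffer, offset: 0, index: 0)
// One thread per float: 16 threadgroups of 64 threads each covers all 1024 values.
encoder.dispatchThreadgroups(MTLSize(width: count / 64, height: 1, depth: 1),
                             threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()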

Thoughts

I am really excited about all of the functionality I see exposed to me in this protocol.

I know I basically just went through the documentation and didn’t necessarily tell you anything you couldn’t look up on your own, but I do think it helps to have some context about WHY this stuff is in here, not just that it exists.

I will be getting into the command queues and buffers in more depth in my next post because those are absolutely necessary for a minimum viable application. It’s helpful to know that these things are created and controlled by this master object.

I hope that this was interesting and useful. I know in most of the documentation I am reading it just mentions you need to create a MTLDevice without exploring what its role is in the application. It’s a really vital part of the application and I hope that you have a better understanding and appreciation of what it does within a Metal program.

Getting Metal Up and Running: Part One

My current side project is working on trying to write at least one graphics post each week. I noticed that people tend to be asked to speak about things they write about. I also noticed that the vast majority of my posts for the last few months have been about depression and cooking and cleaning my house. I would prefer not to be a lifestyle blogger, so I am going to make a better effort to write more about what I am interested in, namely graphics.

Over the next few months I would like to write about frameworks (like Metal and Scene Kit), 3D mathematics, and shader applications. I am going to try to write something about what I am doing. I might not be able to get a whole sample project up and running and it might take a few weeks to get something done. But I plan to try to write about what I have learned and to chart my progress through my various explorations.

Getting Started with Metal

For a while there was a really good OpenGL ES template available in Xcode. Or so I have been told. Now if you try to make an OpenGL template, first off it’s difficult to even find. Secondly, the template is full of a lot of garbage. It’s similar to the Sprite Kit template including the rocket ship and other various assets. You can’t just open a template that renders a solid color on the screen.

My goal initially is to get a good Metal Template without the garbage that I can post on GitHub and make available to people who just want to get up and running.

Metal By Example

I recently bought Warren Moore’s Metal by Example book. Warren was kind enough to complete this book before abandoning us to go and work for Apple on the Metal frameworks. I didn’t realize he was going back until he did, but he’s been kind enough to answer questions about the book and Metal.

My plan was to go through the first chapter of the book and set up a Metal project, but I ran into some issues.

The book is written in Objective-C. I do not want to be one of those people who can’t or won’t read a book in Objective-C, but it does make things a little difficult when you are not used to it. I have difficulty switching my brain from one language to another, so this was one difficulty for me personally.

I also could not get the code to build. The compiler could not find the CAMetalLayer. I realized I was supposed to import Metal, but I had forgotten how to do this. Either this is already done for you in Swift, or I have spent so much time on my robotics project with Brad Larson, where we wrote most of our own code, that I simply didn’t remember how to link anything, and the directions were not included. I found the directions on Warren’s companion web site. So if you are going through the book, I highly recommend looking at the site because it has content that is not in the book.

(I don’t want this to come off as a complaint against Warren. I greatly appreciate his efforts with the book and as an author myself, I can totally see me just wanting to get the damn thing done and have it be gone. I am not saying this is what he did, but if this was my book I totally would have done that. I still highly recommend his book and hope I don’t hurt his feelings.)

Even with this additional help, I could not get my code to build. It had trouble finding the QuartzCore/CAMetalLayer.h file. I imported Quartz and Core Animation, but no dice.

At this point I was frustrated and thought about giving up, but I decided to try and load Apple’s base Metal template.

Apple’s Built-In Templates

The Metal template is hidden in the Game templates options. If you navigate through the game templates, there are four types of templates here: Sprite Kit, Scene Kit, OpenGL ES, and Metal. Just because these are in the Game templates doesn’t mean you can only make games with these!

(Screenshots: the game template chooser, the Metal template, and the project options.)

So I navigated through and got the Metal base project in Swift. Yay! The project had a compiler warning! WTF?!

I was on my second glass of wine and was massively annoyed. Nothing I was doing would even build.

I looked up the compiler warning and realized that Metal still does not build in the iOS simulator. When Metal first came out it did not work in the simulator in Xcode 6. I had forgotten that and assumed it would be fixed, but apparently not. If you are working with Metal, bear that in mind if you see sample code that comes preloaded with a compiler warning.

I needed to build on my phone, so I plugged it into the computer. It built, but then it would not run. I got a warning on both the phone and Xcode about there being a permissions/privacy issue. My phone was not set up to run my code because it was an unknown developer.

So that began the great search through the Settings on the phone to grant my developer account permission to load code on my phone.

Here are a series of screen shots of where I found the ability to do this in the Settings. I blocked out the number after my email address because I am not sure if it is something I shouldn’t make public or not. I am sure someone will frantically Tweet me about some proprietary information being in these screenshots that I should not share. BTW: I never check the email address on here, I just use it for my developer account, so please do not spam me there, I am plenty available on Twitter.

[Screenshots: the Settings screens for trusting the developer account on the phone.]

I don’t remember running code on a device being this difficult. It’s possible I have never tried running my own app on my phone with my own developer account. I had an educational account in school and then most of my other apps were for other companies. I think it might have something to do with TestFlight, maybe?? I would be interested to hear if things got more difficult in Xcode 7.

Finally, after all of this, I got the code to build and run on the phone! Success!!

[Screenshot: the Metal template running on the phone.]

The base template is the usual triangle with a different color at each vertex floating around in space.

At this point I could have just deleted that stuff and had the base template that was my goal for this week, but I don’t want to do that yet. I would like to look over this code as-is to figure out how the vertices are read into the program and how the rotation is applied.
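I haven’t picked the template apart yet, so this is not its actual code, but my mental model of how vertex data usually gets into a Metal program is something like the sketch below (using current Swift API names; the layout is an assumption on my part, not necessarily what the template does): you pack positions and colors into a flat array of floats and copy it into an MTLBuffer for the vertex shader to read.

    import Metal

    // One triangle: x, y, z, w position followed by r, g, b, a color for each vertex.
    let vertexData: [Float] = [
         0.0,  0.5, 0.0, 1.0,   1.0, 0.0, 0.0, 1.0,   // top vertex, red
        -0.5, -0.5, 0.0, 1.0,   0.0, 1.0, 0.0, 1.0,   // bottom left, green
         0.5, -0.5, 0.0, 1.0,   0.0, 0.0, 1.0, 1.0    // bottom right, blue
    ]

    // Force-unwrapping for brevity in this sketch.
    let device = MTLCreateSystemDefaultDevice()!
    let vertexBuffer = device.makeBuffer(
        bytes: vertexData,
        length: vertexData.count * MemoryLayout<Float>.stride,
        options: []
    )

The rotation, as far as I can tell, is the same idea: a model-view-projection matrix gets written into a second (uniforms) buffer each frame, and the vertex shader multiplies every position by it.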

I want to figure out how to import a 3D model from a program like Maya or Blender. One of the big things that freaked me out about 3D graphics programming was the idea that I would have to construct my shapes by hand in code rather than importing an XML file representation of a 3D object.

Probably my goal for next week will be to go over how this generated base template works before removing the floating triangle and uploading the project to GitHub. I would like to use this template as the starting point for all of my future projects. I would also like to figure out what I forgot to import and connect in Warren’s project for the CAMetalLayer object, because it is bothering me.

Didn’t get as much done this weekend as I wanted to. I talked in an earlier post about having a rough week and just getting this written was a bit of a struggle. Hoping future weeks will be more productive. But just doing something is better than not doing anything, I guess.

Final Countdown to CocoaConf Columbus 2014

After months of prep work and a roller coaster of changes, I am in the final day before heading off to the first of my three August conferences.

I have encountered more issues with Metal than I was hoping to find. This is the first time I have had a paid developer account during the beta period. Prior to now I was so busy just trying to establish a foundation that I somewhat ignored the new stuff coming out. This is the first time I have been in on the ground floor of not one but two groundbreaking new technologies.

I had to move more of my GPU programming talk over to OpenGL ES than I was planning to. I don’t think that is a bad thing per se. The most important thing I wanted to do was answer a very specific question about one aspect of OpenGL programming. The fact that Apple came along and changed everything about that made my talk both easier and harder. A lot of time was spent explaining why Metal is necessary, and that fit the parameters I wanted to address.

I will be giving this talk again in December. There will be a golden master of Xcode 6 at that point in time. I hope that it will be stable enough at that point that I can speak more about how to do things in Metal specifically rather than just ambiguously saying “This is how this would work if it were working, but it isn’t.”

I am giving my talks later today at Bendyworks. Bendy has been very kind to let me come and practice my talks there. I have found the feedback I get from them to be invaluable. I have also found that I am far less nervous once I have performed the talk at least once in front of real people and not just my pugs.

Speaking of my pugs, I am not going to see them for a week and I am very sad about that. I am going to miss my little buddies. Such is life.

I still have not packed. I need to pack sometime today. I also have to go to our Swift user group meeting to make arrangements with the people I am carpooling to Ohio with.

So I have a half dozen tasks to do today. Just need to take them one at a time to avoid feeling overwhelmed.

This is really stupid, but I keep forgetting that I do these talks because I love traveling to the conference and meeting new people. It’s hard to remember that this is going to be an amazing and awesome experience because I am putting a lot of pressure on myself to do a good job with my talks. I need to make sure I take some time to chill out and not worry so much about what I am doing.

Don’t panic.

Looking forward to seeing all my peeps at CocoaConf Columbus and That Conference in Wisconsin Dells!!

Code of Conduct

So today there has been some controversy on Twitter about 360|iDev’s Code of Conduct.

I honestly do not understand what the controversy is here. They are clearly stating that they want a diverse conference where everyone will be respected.

Here is my perspective on things.

I am a female programmer. I got into programming later in life. I originally studied video and audio production. I was young and foolish and thought I could succeed purely through sheer force of will. I would say I was about 30 when I really began to learn enough programming to make a go of it. At the last job I had I was not only the only woman, I was also the oldest person by a decade. I also have some health issues that make it impossible to work more than 40 hours a week.

Between being a woman, being older, not having a dozen years of experience, and having health issues, I feel very vulnerable in a community that fetishizes boys barely old enough to drink who have been coding for fun since they were ten.

I have been extraordinarily privileged to be given the opportunity to speak at conferences on something that isn’t feminism. 360|iDev will be the fifth conference I have spoken at this year. I am speaking about Apple’s new 3D graphics programming framework.

People like Jim Remsik, Dave Klein, and John Wilker are all throwing me a lifeline to give me the chance to establish myself as a professional and maybe have a career.

I worry sometimes that I only get these opportunities because of “male guilt”. I am very concerned with being seen as someone who only gets to speak because they want more women. I have worked tirelessly to try to prove my cred by tackling difficult programming topics that frighten most people away. I am worried sick about not having a great talk to present at 360|iDev because my reach exceeds my grasp.

Even if I make an idiot of myself, at least I was given the chance. That’s all I ask for. I am being given a time and a place, and what I do with it is up to me. A lot of people won’t even give you that much, and I am eternally grateful for the chance to do what I can and make what I can of it. I am also grateful for the chance to meet the other people who are speaking and attending. In a world where connections are everything, the connections I have made at my conferences have been absolutely invaluable.

I have absolutely no idea what in the Code of Conduct created this fuss. I do know that I submitted a talk to a conference I could not afford to attend and that the organizers are not only giving me a chance to speak about a topic that is near and dear to my heart, they are also financially making it possible for me to go.

Talk is cheap. If you care so much about getting more women in technology, hire more women. Be willing to train them when they don’t have 5-10 years of experience. Mentor someone. Do something that actually costs you time or money. 360|iDev did. Don’t just go on Twitter and be a douche.

Lexical or Preprocessor Issue

So, today was the day I decided to bite the bullet and start working on my Metal demo for CocoaConf Columbus and 360|iDev.

Since a large focus of my talk is on GPUImage, I am hoping to put together a lightweight Metal version of GPUImage that processes an image using a series of filters. I want to write between three and five filters that can easily be stacked on one another and that each have a GPUImage counterpart, in order to test how fast Metal processes images compared to GPUImage.
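To give a rough idea of the shape I have in mind (none of this is GPUImage’s or Apple’s API; every name here is one I made up), a filter could be anything that knows how to encode its work into a command buffer, and a stack of them could be run by ping-ponging between two scratch textures:

    import Metal

    // Hypothetical protocol: a filter reads one texture and writes another.
    protocol MetalImageFilter {
        func encode(into commandBuffer: MTLCommandBuffer,
                    from source: MTLTexture,
                    to destination: MTLTexture)
    }

    // Run a stack of filters, alternating between two scratch textures so the
    // output of each filter becomes the input of the next. The original input
    // texture is never written to.
    func apply(_ filters: [MetalImageFilter],
               to input: MTLTexture,
               scratchA: MTLTexture,
               scratchB: MTLTexture,
               commandBuffer: MTLCommandBuffer) -> MTLTexture {
        var source = input
        var targets = (scratchA, scratchB)
        for filter in filters {
            filter.encode(into: commandBuffer, from: source, to: targets.0)
            source = targets.0                   // this filter's output feeds the next one
            targets = (targets.1, targets.0)     // alternate write targets
        }
        return source                            // the last filter's output
    }

Each concrete filter would hold its own pipeline state and shader; whether a stack like this comes anywhere close to GPUImage’s speed is exactly what I want to measure.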

I went to look at what sample code is available from Apple for Metal. To my delight, I saw that there was an image processing base project. It includes one hardcoded filter that changes an image to black and white. I should be able to go into this project, add my own filters, and add some UI elements for applying the filter shaders I write.

Today I opened the sample code. Immediately, there was an error.

“Lexical or Preprocessor Issue: QuartzCore/CAMetalLayer.h not found.”

This is why we can’t have nice things!!

Huh. That is inconvenient.

Did some digging. Refrained from asking this question on Stack Overflow because the last time I asked a question about the betas I got a snide person telling me to go somewhere else. Headed to the Dev Forums and found this thread.

Apparently, for the time being, there is no support for Metal in the simulator. There should be support for Metal if you have an A7 device like the iPhone 5S (which I have) that is running the iOS 8 beta.

I have not yet updated my phone to the beta. I know we are getting close to release, so updating isn’t a huge thing; I just feel like I have no guarantee that things will work properly on the phone even after I do.

I must say that this latest wrinkle is not doing anything to sell me on Metal.

Metal only works on iOS devices with A7 chips and, on top of that, won’t even work in the simulator. I usually use the simulator in my talks to demonstrate what I am doing, but now I have to run everything on my device. I think I can use AirPlay to show the screen, but that is one more step in my process that can go wrong.

The other thing I am noticing in the sample applications is that most of the class implementation files end in “.mm”, which explicitly tells the compiler that they contain Objective-C++, that is, C++ mixed in with the Objective-C.

I have not worked with Swift as much as I should have, but I am wondering if this is going to be a problem with trying to write an app in Swift. I know that theoretically Swift is supposed to behave like Objective-C in that you can include C and C++ code, but I have not tried to write straight C code in a Swift class yet. Can you write C code in a Swift class, or is the support just that I can import a C class into a Swift-based project? How is this going to work with Metal?

At least with OpenGL ES you have the GLKit framework, which should work with Swift. I am interested to know more about this, but sadly I don’t believe I will be able to explore these issues before I give my talk in Columbus.

I am also trying to figure out just how much C++ I need to know to fully work with Metal. I thought that I needed to know about the same amount of C++ as you need to know of C to work with GLSL, but after seeing the number of classes that are implementing C++, I am slightly worried that I am going to be out of my depth for a while.

These are things I am going to have to take into consideration and disclose during my talk. I know most of these issues will resolve themselves over the next few years; it is just slightly frustrating to sit on the sidelines trying to figure out how to make it work here and now.

Fortune favors the brave.