Marble mazes, edge detection and an iPhone 3G
It started with a simple question. It was summer 2009, during the wild west days of the App Store, when fart apps were cool and an android was still just that pale guy from Star Trek. My housemate pointed at my iPhone 3G and asked, "Why don't we make some money out of that? Let's write an app of our own." Soon a page of scrap paper was covered in ideas; we crossed out every variation on the fart app and whittled away until we had found our calling. Remember those marble run toys? We were going to turn my iPhone into a virtual version of one. Draw a marble run on paper, photograph it, and let our AR engine work its magic. Now drop a marble onto the photograph of the course and watch it roll down the ramps and off the jumps you just drew, as if they were real objects.
The next week I was on holiday in Egypt, and I started work on the physics engine. Ben's image processing library would transform the photograph into a boolean bitmap, where a cell was occupied if a line was drawn at the corresponding point in the photo. The hardware limitations of early iPhones meant that this collision bitmap had to be a much coarser grid than the original photo. The question was how to hide this coarseness by "smoothing" out the squares, so that the ball would run along the drawn lines exactly as the user intended.
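A rough sketch of that downsampling step (the function and parameter names are mine, and the real implementation would have worked on packed bitmaps in C rather than Python lists):

```python
def build_collision_grid(edge_bitmap, cell_size):
    """Downsample a full-resolution boolean edge bitmap into a coarse
    collision grid: a cell is occupied if any pixel inside it is an
    edge pixel."""
    height = len(edge_bitmap)
    width = len(edge_bitmap[0])
    rows = (height + cell_size - 1) // cell_size
    cols = (width + cell_size - 1) // cell_size
    grid = [[False] * cols for _ in range(rows)]
    for y in range(height):
        for x in range(width):
            if edge_bitmap[y][x]:
                grid[y // cell_size][x // cell_size] = True
    return grid
```

Marking a cell occupied if *any* pixel in it is set errs on the side of solid walls: a coarse grid can fatten a drawn line, but it will never let the ball slip through one.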
A physicist by training, I started by transforming each occupied cell into a repulsive potential and integrating the equations of motion. Numerical integration of such a system is a nontrivial problem. I was unable to strike a balance between a method that conserved energy in collisions well enough to fool the eye, ticked fast enough to stop the ball tunnelling through lines, and still fit within the limited capabilities of the iPhone 3G. Throughout all this work I had never once had an issue with the simple vector bounce code I had implemented to make the ball bounce off the walls and stay on the screen. I realised that the problem would become very simple if the drawing were in a vectorised format.
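That simple vector bounce amounts to reflecting the velocity about the wall's normal. A minimal sketch, with a restitution factor thrown in so the ball loses a little energy per bounce (the exact treatment in the app is my guess):

```python
def bounce(velocity, normal, restitution=0.9):
    """Reflect a 2-D velocity off a surface with unit normal `normal`,
    scaling the normal component by a restitution factor:
    v' = v - (1 + e)(v.n)n."""
    vx, vy = velocity
    nx, ny = normal
    dot = vx * nx + vy * ny
    if dot >= 0:  # already moving away from the wall; leave it alone
        return velocity
    scale = (1.0 + restitution) * dot
    return (vx - scale * nx, vy - scale * ny)
```

With restitution 1 this is an exact mirror reflection, which is why energy conservation was never a worry here: no integration is involved, just one vector operation at the moment of impact.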
The original problem of bouncing a ball off a bitmapped edge had morphed into the problem of vectorising that edge map so the simpler vector bounce code could take over. That is a hard problem in general, but if you know that your ball is near an edge (a simple occupancy check of the surrounding bitmap squares will tell you that) then it is straightforward to find a local straight-line approximation to the drawing. To do this I returned to my repulsive potentials, placing one on each occupied cell, and assumed that the gradient of the resulting scalar field pointed along the normal of the drawn line at that point. The assumption turned out to be reasonable, and it was not long before the ball was rolling smoothly down a coarsely bitmapped ramp.
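The normal estimate can be sketched as: sum a repulsive potential over the occupied cells near the ball and normalise the gradient of the result. Using 1/r as the potential is my assumption, as are the names; the post doesn't specify the form:

```python
import math

def estimate_normal(ball_pos, occupied_cells):
    """Estimate the collision normal near the ball by summing a
    repulsive 1/r potential over nearby occupied cells; the gradient
    of the summed field points away from the drawn line, so its
    direction approximates the surface normal."""
    bx, by = ball_pos
    gx = gy = 0.0
    for cx, cy in occupied_cells:
        dx, dy = bx - cx, by - cy
        r2 = dx * dx + dy * dy
        if r2 == 0:
            continue
        # The gradient of 1/r with respect to the ball position is
        # (dx, dy) / r^3, pointing from the cell toward the ball.
        r3 = r2 * math.sqrt(r2)
        gx += dx / r3
        gy += dy / r3
    length = math.hypot(gx, gy)
    if length == 0:
        return None
    return (gx / length, gy / length)
```

For a ball sitting above a row of occupied cells, the sideways contributions cancel by symmetry and the estimate comes out pointing straight up, which is exactly the normal the vector bounce code needs.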
Meanwhile, back at home, Ben was grappling with how to implement Canny's edge detection algorithm efficiently. This all had to happen on the CPU, because the programmable graphics pipeline would only reach the iPhone ecosystem the following year with the 3GS. If we earned our money at any point in this development process, it was the night he spent tweaking the edge detector to distinguish between the lines that come printed on lined paper and those drawn by the user. It took a few more evenings to polish the UI, but in no time Snap 'n Tap was in Apple's review queue.
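For context, the stage of Canny that invites this kind of tweaking is the double threshold with hysteresis; raising the thresholds is one plausible way to drop the faint printed rules while keeping darker pen strokes, though the post doesn't say exactly what Ben did. A sketch:

```python
def hysteresis_threshold(mag, low, high):
    """The double-threshold stage of Canny: pixels above `high` are
    strong edges; pixels between `low` and `high` survive only if
    connected (8-neighbourhood) to a strong edge. Tuning these two
    thresholds against the low-contrast printed rules of lined paper
    is a guess at the kind of tweaking described above."""
    rows, cols = len(mag), len(mag[0])
    keep = [[mag[y][x] >= high for x in range(cols)] for y in range(rows)]
    stack = [(y, x) for y in range(rows) for x in range(cols) if keep[y][x]]
    while stack:  # grow strong edges into connected weak pixels
        y, x = stack.pop()
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if (0 <= ny < rows and 0 <= nx < cols
                        and not keep[ny][nx] and mag[ny][nx] >= low):
                    keep[ny][nx] = True
                    stack.append((ny, nx))
    return keep
```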
The perfect conclusion to this story would be "we sold a million copies in a week, and by the end of the month we were slinging caviar at each other during a food fight on my private yacht". That didn't happen, but we did make money. Certainly enough to recoup the cost of the iPhone, not enough for a helicopter, but more than enough to teach me that I was capable of making money. When I finished my PhD a year later I didn't thumb through the classifieds to find a job; I headed to Canada to start a mining tech company. Although it was successful, spending my days down a mine and my evenings at -30 °C ruined the experience. I am now back in Europe, building a software company with another old friend from PhD days. I'm proud to say I've never had a job, and thanks to Snap 'n Tap's lessons I don't think I ever will.
My iPhone 3G died quite soon after Snap 'n Tap's release, so there was only ever one update. It seems wrong to let Snap 'n Tap decay after the life-changing lesson it taught me, so as soon as Apple get me my iPhone 5, it is update time. I will add support for Retina displays, the new 4" screen, and maybe even the camera-equipped iPads. Meanwhile, try it out for yourself in the App Store.
Posted on 30 October 2012
Based on a work at http://slidetocode.com/blog
Slide to code blog is licensed under a Creative Commons Attribution 3.0 Unported License