Graphic CV

Now that I’m primarily looking for work with fast-paced tech start-ups and other cutting-edge companies, I decided to modernise my CV accordingly.

Mark K Cowan’s interactive graphic CV is available here.

A non-interactive (non-clickable) PDF version is available here.

Presentation trumps performance for this project, so jQuery, jQuery UI, jQuery Mobile, async.js, and underscore.js are all used. My own scripts and stylesheets are minified and embedded in the HTML document (by a Perl script and a makefile) to reduce the number of server requests required.
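
For the curious, the embedding step is conceptually tiny. Below is a rough Python sketch of the same idea (the real build uses a Perl script driven by a makefile; the file names and substitution patterns here are purely illustrative):

    # Rough sketch of inlining minified assets into the HTML document,
    # so the page needs fewer server requests.  File names are illustrative.
    import re

    def inline_assets(template_path, out_path):
        html = open(template_path).read()
        # Replace external stylesheet references with inline <style> blocks
        html = re.sub(
            r'<link[^>]*href="([^"]+\.min\.css)"[^>]*/?>',
            lambda m: '<style>' + open(m.group(1)).read() + '</style>',
            html)
        # Replace external script references with inline <script> blocks
        html = re.sub(
            r'<script[^>]*src="([^"]+\.min\.js)"[^>]*></script>',
            lambda m: '<script>' + open(m.group(1)).read() + '</script>',
            html)
        open(out_path, 'w').write(html)

    inline_assets('cv.template.html', 'cv.html')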

Smartie

As a quick way to get into computer vision, discrete optimisation, and robotics, I drew up a Raspberry Pi project. Despite that motivation, some elements of this project may also be useful to the Pi’s originally intended audience: children in classrooms.

Smarties (image: http://en.wikipedia.org/wiki/File:Smarties-UK-Candies.jpg)

At its simplest, the objective is to build a little robot which can move around on a flat floor, picking up Smarties that it “likes” while avoiding ones that it “dislikes”.  It detects Smarties with the Raspberry Pi camera module, and moves around on two wheels controlled from a PiFace extension board.

Vision

The computer vision part is complicated, but its individual building blocks are not.  A simple perspective transform will allow the distance to each Smartie to be estimated.  These distances and directions will then be used by the optimiser to plot a near-optimal route.  The optimiser can of course be omitted from any “educational” version of this project and replaced with a simple “nearest Smartie first” decision.  The Smarties themselves will be identified by finding regions of high saturation in the transformed images (working in HSV colour space), then applying a union-find algorithm, such as the very simple Hoshen-Kopelman algorithm, to pick out coloured blobs.  This requires that the floor and nearby walls are solid, unsaturated colours (black, white, or grey).  A simple heuristic based on blob size and shape can then identify Smarties and their corresponding colours.
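
To make the blob-labelling step concrete, here is a minimal sketch of the Hoshen-Kopelman-style union-find pass over a saturation-thresholded image.  It assumes OpenCV and NumPy, and the threshold value and function names are mine, not final code:

    # Minimal sketch: threshold saturation in HSV, then label connected
    # high-saturation regions with a Hoshen-Kopelman-style union-find.
    import cv2
    import numpy as np

    def find_blobs(bgr_image, sat_threshold=100):
        hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
        mask = hsv[:, :, 1] > sat_threshold     # high-saturation pixels
        h, w = mask.shape
        labels = np.zeros((h, w), dtype=np.int32)
        parent = [0]                            # union-find forest; 0 = background

        def find(x):                            # root of x, with path halving
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        def union(a, b):                        # merge two clusters
            ra, rb = find(a), find(b)
            if ra != rb:
                parent[max(ra, rb)] = min(ra, rb)

        for y in range(h):
            for x in range(w):
                if not mask[y, x]:
                    continue
                left = labels[y, x - 1] if x > 0 else 0
                up = labels[y - 1, x] if y > 0 else 0
                if left and up:                 # joins two existing clusters
                    union(left, up)
                    labels[y, x] = find(left)
                elif left or up:                # extends one cluster
                    labels[y, x] = left or up
                else:                           # starts a new cluster
                    parent.append(len(parent))
                    labels[y, x] = len(parent) - 1
        # Second pass: collapse every label to its cluster root
        for y in range(h):
            for x in range(w):
                if labels[y, x]:
                    labels[y, x] = find(labels[y, x])
        return labels

Blob sizes and bounding boxes then fall out of a single pass over the label array, ready for the size/shape heuristic.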

Motion

The robotics part is considerably simpler: an H-bridge composed of MOSFETs provides independent directional control over each of the two wheels, while the bridge is driven by PWM signals from the Pi, allowing independent speed control over each wheel.  Each motor requires two of the PiFace’s open-collector outputs, leaving four outputs remaining.  Two of these can be used to control solenoids which move the “mouth” of the robot, allowing it to scoop up Smarties.  Alternatively, one PiFace output plus a delay line (e.g. a simple RC integrator + AND-gate) could be used to conserve PiFace outputs somewhat.  A schematic for the motor controller is shown below:

Image taken on the Raspberry Pi camera, illuminated using the flash on my THL W8S smartphone

I chose the STP80PF55 and STP36NF06L MOSFETs, not for cost or any amazing technical qualities, but simply because they were the only MOSFETs CPC had in stock that came close to fitting my requirements.  They can also handle dozens of amperes (whereas my 3V motors only draw 2-3A), so the inductive discharge from the motors is probably too low to harm the transistors.  Even so, I’ve added a 22µF tantalum capacitor in parallel with the motor to short out some of the discharge.  While any capacitance from 100nF upwards should suffice, it is imperative that the capacitor is NOT polarised, since we run the motor in either direction.  For the pull-up resistors on the gates, any 5-50kΩ resistor should suffice, unless your MOSFETs have an insanely high gate capacitance.  I used 10kΩ 0.1W resistors here.  After connecting the two gate pairs to open-collector outputs #2 and #3 (zero-based numbering) on the PiFace, I ran my pwm-bidi.py script to test the controller.  Read the source to see which keyboard keys interact with it.  It implements PWM speed control, in addition to switching the MOSFETs appropriately to reverse the motor direction.
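
For a flavour of how pwm-bidi.py works, here is a minimal sketch of the same technique (software PWM plus direction switching), assuming the pifacedigitalio library; the duty cycles and helper names are illustrative, so read the real script for the details:

    # Minimal sketch of bidirectional software PWM on two PiFace outputs.
    # Assumes the pifacedigitalio library; pins #2/#3 as described above.
    import time
    import pifacedigitalio

    FWD_PIN, REV_PIN = 2, 3    # one open-collector output per gate pair
    PERIOD = 0.02              # 50Hz PWM period

    pifacedigitalio.init()

    def drive(duty, forward=True, cycles=250):
        """Run the motor at the given duty cycle (0..1) for `cycles` periods."""
        active, idle = (FWD_PIN, REV_PIN) if forward else (REV_PIN, FWD_PIN)
        pifacedigitalio.digital_write(idle, 0)        # never enable both gate
        for _ in range(cycles):                       # pairs at once, as that
            pifacedigitalio.digital_write(active, 1)  # would short the bridge
            time.sleep(PERIOD * duty)
            pifacedigitalio.digital_write(active, 0)
            time.sleep(PERIOD * (1 - duty))

    drive(0.6, forward=True)   # ~60% speed forwards
    drive(0.4, forward=False)  # ~40% speed in reverse
    pifacedigitalio.deinit()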

Power plant

The Pi, the camera, the PiFace and the motors together require more power than my USB hub can supply, and more current than the Pi’s current limiter will permit.  As a temporary fix, I took an old ATX power supply from a dead PC and used its “standby power” line, which supplies a very stable 5V at up to ~2A.  If your PSU follows the standard, the 5V standby wire should be purple and ground should be black.  Disconnect the Pi’s USB power and connect the 5V/GND lines from the PSU to the 5V/0V terminals on the PiFace.  This will also power the Pi and the camera, provided you haven’t fiddled with the PiFace’s jumpers.

ATX power hack for RasPi

It is amusing that a Pi running at full load with a camera and motor attached still uses less power than a desktop PC that’s turned off!

Gearing and wheels

To get the motor power down to the ground, gearing is usually necessary.  I intend to have a pair of planetary gearsets laser-cut or 3D-printed (e.g. see here).  The sun gear connects to a motor, the carrier holding the planet gears is fixed to the robot base, and the annular gear itself is the wheel.
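
As a sanity check on the reduction (tooth counts here are hypothetical, not from any real gearset): with the carrier fixed, the planets act as idlers, so the gearset behaves like a simple two-gear train,

    ω_wheel / ω_motor = −Z_sun / Z_annulus

so, for example, a 12-tooth sun inside a 48-tooth annulus gives a 4:1 reduction, with the wheel turning opposite to the motor.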