Quick Arduino LED Clock

In two weeks I’ll be giving a workshop on Controlling LEDs with Arduino, where I hope to present ways in which the Arduino can be used to control lots of LEDs through charlieplexing, IC controllers (such as the MAX7219 used for the clock face) and addressable LEDs such as the WS2812 (a strip of 24 such LEDs illuminates the seconds and hour position from the back of this clock).
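
If you’re curious what the charlieplexing part of the workshop looks like in practice, here’s a minimal sketch of the idea (not the clock’s code, and the pin numbers are just my own choice): three pins can address six LEDs by driving one pin HIGH, one pin LOW and leaving the rest as high-impedance inputs.

```cpp
// Minimal charlieplexing example: 3 pins -> up to 3*(3-1) = 6 LEDs.
// Pin numbers are arbitrary; each LED (with its resistor) sits between a unique ordered pair of pins.
const uint8_t pins[3] = {2, 3, 4};

// Light a single LED by driving its anode pin HIGH, its cathode pin LOW,
// and releasing every other pin (set as INPUT, i.e. high impedance).
void lightLed(uint8_t anodePin, uint8_t cathodePin) {
  for (uint8_t i = 0; i < 3; i++) {
    pinMode(pins[i], INPUT);
  }
  pinMode(anodePin, OUTPUT);
  pinMode(cathodePin, OUTPUT);
  digitalWrite(anodePin, HIGH);
  digitalWrite(cathodePin, LOW);
}

void setup() {}

void loop() {
  // Step through all six LED positions (every ordered pair of pins).
  for (uint8_t a = 0; a < 3; a++) {
    for (uint8_t c = 0; c < 3; c++) {
      if (a == c) continue;
      lightLed(pins[a], pins[c]);
      delay(200);
    }
  }
}
```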

The clock was thrown together with what I had kicking around. The body comes from an old IKEA clock (I’d bought it for the movement, which went into a CNC-cut LP clock), and the face is made up of four 8×8 LED matrices controlled by MAX7219s. Behind that, a strip of NeoPixels / WS2812 LEDs provides the seconds (red band) and hour position (blue pixel).

The code could do with some work. It is based on the TimeSerial example provided with the Time library, and as such it needs to connect to a PC to set the time. I’ve ordered a Real Time Clock (RTC) module to remove that dependency. I might implement some sort of minute hand on the RGB LEDs, perhaps by lighting a few in another colour (and blending the colours, as I have done when the second hand reaches the hour hand). The clock could be programmed to display a rainbow or Larson scanner effect when it strikes the hour. I might also add a Light Dependent Resistor (LDR) to adjust the brightness of the LEDs to match ambient light levels.
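
For reference, here’s a rough sketch of how the NeoPixel ring behind the face could be driven. It assumes a 24-pixel WS2812 strip on pin 6, the Adafruit NeoPixel library, and that the time has already been set over serial; the pin, pixel count and colour values are assumptions rather than the clock’s actual ones.

```cpp
#include <Adafruit_NeoPixel.h>
#include <TimeLib.h>   // Time library (Time.h in older releases)

const uint8_t RING_PIN   = 6;    // assumed data pin for the WS2812 strip
const uint8_t NUM_PIXELS = 24;
Adafruit_NeoPixel ring(NUM_PIXELS, RING_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  ring.begin();
  ring.show();                   // start with all pixels off
}

void loop() {
  ring.clear();

  // Map 60 seconds and 12 hours onto the 24-pixel ring.
  uint8_t secPixel  = map(second(), 0, 60, 0, NUM_PIXELS);
  uint8_t hourPixel = map(hour() % 12, 0, 12, 0, NUM_PIXELS);

  // Red band sweeping round with the seconds.
  for (uint8_t i = 0; i <= secPixel; i++) {
    ring.setPixelColor(i, 64, 0, 0);
  }

  // Blue pixel for the hour, blended to magenta once the red band reaches it.
  if (hourPixel <= secPixel) {
    ring.setPixelColor(hourPixel, 64, 0, 64);
  } else {
    ring.setPixelColor(hourPixel, 0, 0, 64);
  }

  ring.show();
  delay(100);
}
```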

Other than that, about all I expect to do with this is finish off the face with something to replace the scrap of cardboard currently holding the LED matrices in place.

HDMIPi Stand

HDMIPiStand1

I backed the Kickstarter for HDMIPi, a 9″ LCD designed for use with the Raspberry Pi. It didn’t take long to assemble thanks to a good video walk-through, and although I had some teething troubles with a dry joint on the micro-USB power socket (fixed by reflowing the solder with hot air from a desoldering station), my interest in the Raspberry Pi has been reinvigorated. The first thing I needed to do with the HDMIPi was to build a stand for it. I had elected for the black acrylic styling, and since I happen to have some offcuts of 3mm black acrylic I was able to laser cut a matching stand at MAKLab.

A Good Day’s Polargraphing

I first learned of polargraphing from either OllyR on Letsmakerobots.com or from seeing Sandy at the Edinburgh Mini Maker Faire in 2013. When I recently realised that I had the necessary parts (minus a few of the cheaper components) to follow in the footsteps of Sandy’s Polargraph.co.uk build, I downloaded the files, 3D printed some sprockets (on MAKLab’s Ultimaker 2) and laser cut motor mounts (my own design). Today I finished it all off by making a crude pen gondola (clear CD, cardboard tube and screws/nuts/washers).
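
For anyone new to polargraphs, the geometry behind them is pleasingly simple: with the two motors a known distance apart, any target position of the gondola maps to two cord lengths by Pythagoras, and the steppers just pay out or reel in cord to match. A quick illustrative function (not Sandy’s code; the machine width is an arbitrary example value):

```cpp
#include <math.h>

const double MACHINE_WIDTH_MM = 700.0;   // assumed distance between the two sprockets

// x, y are measured in mm from the left sprocket, with y increasing downwards.
// The two cord lengths fall straight out of Pythagoras.
void cordLengths(double x, double y, double &leftCord, double &rightCord) {
  leftCord  = sqrt(x * x + y * y);
  rightCord = sqrt((MACHINE_WIDTH_MM - x) * (MACHINE_WIDTH_MM - x) + y * y);
}
```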

Gondola and Super Goat

The first drawing was of a vector produced by one of the MAKLab volunteers (I didn’t have anything prepared and was busy setting up the polargraph, so it fell to the volunteers to produce a vector drawing).

MAKLab M on Polargraph

With the machine proven to be working, the next logical step was to draw the MAKLab logo. In this picture you can see the two stepper motors mounted to the drawing board (acrylic mounts clamp to the board), with the Arduino and motor driver shield zip-tied to the top. My laptop is on the right running the Processing sketch, turning the vector drawings into gcode and sending them to the Arduino.

Spirograph on Polargraph

I downloaded SVG Spirograph (it was the first link in my Google search) and promptly produced a spirograph (ideal for this setup, since I don’t have a servo installed to lift the pen off the page). It was mesmerising to watch and drew some attention. I think it would be cool to add a “Spirograph of the Day” mode which automatically generates a new spirograph and draws it each day (the same spirograph drawn all day, whenever the paper is replaced and a button on the laptop is pressed).
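
As a rough idea of how a daily generator might work, here’s a sketch that samples the classic hypotrochoid (Spirograph) equations into a list of x/y points which could then be handed to the plotter. The radii and pen offset below are arbitrary example values; the real SVG Spirograph tool is rather more polished.

```cpp
#include <math.h>
#include <stdio.h>

int main() {
  const double R = 100.0;   // radius of the fixed ring
  const double r = 37.0;    // radius of the rolling gear
  const double d = 55.0;    // pen offset from the rolling gear's centre

  // Sample the hypotrochoid; a daily generator would pick R, r and d at
  // random each morning and loop until the curve closes on itself.
  for (double t = 0.0; t < 40.0 * 2.0 * M_PI; t += 0.01) {
    double x = (R - r) * cos(t) + d * cos(((R - r) / r) * t);
    double y = (R - r) * sin(t) - d * sin(((R - r) / r) * t);
    printf("%.2f %.2f\n", x, y);
  }
  return 0;
}
```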

Time lapse of the plotter in action. Unfortunately I forgot to take a photo of the finished spirograph.

EDIT: Added a servo to lift the pen off the page and got a photo including the finished spirograph from the previous Saturday.

Spirograph and Pen Lift Servo

Produced another spirograph. This one looked better part-way through but looks messy now; the time lapse at the end of this post shows it turn from good to great to messy.

Saturday Spirograph

We wanted to try drawing a picture, so I selected a photo from my Flickr, vectorised it in Inkscape and loaded it into the Polargraph software. The result was great.

Polargraph Penguin

Time lapse of this Saturday’s drawings:

Minishift from Arachnid Labs

While at this year’s Maker Faire UK I picked up a Minishift kit from Arachnid Labs. I had previously bought their circuit pattern cards, which I make use of when teaching the Intro to Arduino workshop. I soldered up the kit on the evening I got back from Newcastle, following the instructional videos (there are a few non-intuitive steps, such as the orientation of the LED array and remembering to put the screws in before the LED array). I hit an obstacle when I tried to install and use Arachnid Labs’ Python example, and not having much experience with Python I’m clutching at straws trying to get it working.

Windows 7 pip screenshot

Falling at the first hurdle: on Windows 7 I tried using the pip found in the Scripts folder of my Python 2.7.3 installation to install minishift-python, but it spat back ImportError: No module named resource.

In hunting for a solution I found that I hadn’t added C:\Python27\ to the Windows Environment Variable %PATH% and while I was correcting that I added %PYTHONPATH%.

The Python documentation (https://docs.python.org/2/library/resource.html) states that the resource module is specific to UNIX platforms. Having got this far, and having seen the same error in a few Google results, I decided that perhaps minishift-python isn’t Windows compatible.

Moving to Ubuntu on my old laptop, I followed the Arachnid Labs instructions but couldn’t get the daemon running, nor could I use the Python example program to write text directly to the display.

ImportError: No module named hid

Ubuntu Screenshot

I had a go at installing Cython in order to install hidapi (I can’t find the link that inspired me to try that route), but my abilities with Linux are rather limited and I hit some roadblocks I couldn’t get past.

Running python -m minishift.minishiftd -d 32 appeared to pass (no feedback to the contrary), but when I used curl -G http://localhost:8000/set --data-urlencode "text=Test" it responded with curl: (7) couldn't connect to host.

I was able to test that the Minishift itself works by connecting it to an Arduino (without the USB to SPI adapter) and running the test code provided by Gregory Fenton on his blog, labby.co.uk. With this success, I went on to reuse portions of Arduino code I’d used with a MAX7219 LED array to scroll some text (only to find that Gregory had gone on to do something similar and post it on his blog).
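
For anyone wanting to try the same, below is a rough outline of the scrolling code, adapted from the approach I used with the MAX7219 array. It assumes the Minishift behaves as a shift-register chain where each byte clocked in lights one 8-LED column; the pin numbers, the latch behaviour and the tiny two-letter font are all assumptions for illustration rather than details taken from Gregory’s code.

```cpp
const uint8_t DATA_PIN  = 11;
const uint8_t CLOCK_PIN = 13;
const uint8_t LATCH_PIN = 10;
const uint8_t WIDTH     = 8;      // columns on a single Minishift module

// A small bitmap to scroll: each byte is one column, least significant bit at the top.
const uint8_t message[] = {
  0x7E, 0x09, 0x09, 0x7E, 0x00,   // 'A'
  0x7F, 0x49, 0x49, 0x36, 0x00    // 'B'
};
const uint8_t MSG_LEN = sizeof(message);

void setup() {
  pinMode(DATA_PIN, OUTPUT);
  pinMode(CLOCK_PIN, OUTPUT);
  pinMode(LATCH_PIN, OUTPUT);
}

void loop() {
  // Slide an 8-column window across the message buffer.
  for (uint8_t offset = 0; offset < MSG_LEN; offset++) {
    digitalWrite(LATCH_PIN, LOW);
    for (uint8_t col = 0; col < WIDTH; col++) {
      shiftOut(DATA_PIN, CLOCK_PIN, MSBFIRST, message[(offset + col) % MSG_LEN]);
    }
    digitalWrite(LATCH_PIN, HIGH);   // latch the new frame onto the LEDs
    delay(150);
  }
}
```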

MAKLab Moves Shop

New MAKLab Studio

On Monday 7th March 2014 MAKLab opened the doors of their new premises in Charing Cross. They started off in Gallery 1 of The Lighthouse and, after nearly 2 years in that space, decided it was time to move. With a sterling effort from staff and volunteers the move was completed in only 2 weeks. This included ripping up floor tiles and laying a reclaimed gym floor, painting the studio and moving in the equipment. There are a few jobs left, putting finishing touches here and there, but as our first Saturday approaches I think they’ve settled in nicely.

The shop is located in the Charing Cross Mansions at the foot of Sauchiehall Street, overlooking the busy junction where the M8 motorway runs by and major road arteries stretch off into the West End, up to Maryhill and down to the Kingston Bridge.

While I’ve been down often to help out, I’ve yet to get some photos of the work that has been done. A fellow MAKLab volunteer has a photoset available on Flickr if you’re keen to take a sneak peek, but I highly recommend you pay us a visit.

Halloween Costume – MAD Scientist


This Halloween, and the month leading up to it, I put together a MAD Scientist costume. Most of the preparation was in the assembly of the 16×8 LED array, which was free-formed to sit inside and take the shape of a brain-shaped jelly mould. It was a chance encounter with the jelly mould that sparked this costume off: the semi-transparent vacuum-formed mould looked ideal, and it was a good fit for wearing on my head like a hat.
I already had some MAX7219s, so I knew I was going to use these to drive the LED arrays, and the left-over LEDs from swapping the colours on the LED Message Board were prime candidates for the array. I foolhardily embarked upon a bi-colour array, alternating red and blue without giving any thought to the different forward voltages required by the two types of LED. I kind of got away with it and was able to make a few animations based around the colour difference (e.g. police lights), but if I were to do it again I’d probably opt for just a single colour.
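
For the curious, here’s a simplified sketch of the police-lights animation using the LedControl library, assuming two MAX7219s (one per 8×8 half of the array), the usual DIN/CLK/CS wiring on pins 12/11/10, and that the red and blue LEDs sit on alternating columns; the real code is messier and has a handful of other animations.

```cpp
#include <LedControl.h>

// DIN, CLK, CS (LOAD), number of MAX7219 devices in the chain.
LedControl lc = LedControl(12, 11, 10, 2);

void setup() {
  for (int dev = 0; dev < 2; dev++) {
    lc.shutdown(dev, false);   // wake the chip from power-save mode
    lc.setIntensity(dev, 8);   // mid brightness (0-15, whole array only)
    lc.clearDisplay(dev);
  }
}

void loop() {
  // Flash the red columns (even) then the blue columns (odd) on both halves.
  for (int phase = 0; phase < 2; phase++) {
    for (int dev = 0; dev < 2; dev++) {
      lc.clearDisplay(dev);
      for (int col = phase; col < 8; col += 2) {
        lc.setColumn(dev, col, 0xFF);   // light a full 8-LED column
      }
    }
    delay(250);
  }
}
```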

LED arrays rest in the brain jelly mould with an Arduino (Freeduino kit) driving the array.

With the LED array nearing completion, my attention turned to the aesthetics of the brain. I didn’t want the LEDs to be obvious, so I needed to add an opaque layer and decided to papier-mâché some tissue paper inside the plastic. Finding white and pink tissue paper (the pink was too vibrant, so the white went on first), I quite easily obtained a thin layer of coloured paper between the plastic and the LEDs. The LEDs are arranged on some single-core wire which was bent to the shape of the mould before the whole lot (including the MAX boards) was hot-glued in place.
A lab coat was bought on eBay for less than £10 (I wish I had spent a little more time on this, as I went for a lab coat with striped edges instead of a plain one). I took the lab coat with me to MAKLab and had a go at digital embroidery, with an emblem and some text on the breast of the coat. This didn’t turn out brilliantly as we only had some cheap thread to hand, but I felt it gave a weathered look. Had I not left it so late, I might have given the coat a proper weathering (a couple of coffee stains, dragging it around the studio and maybe a wash or two to loosen it up, some elbow patches, etc.).

M.A.D. Labs Inc – Dr E Gore

To accessorise, I found an old pair of safety goggles which had seen better days and a dust mask, and applied speckles of ink to both by spraying them using a Sharpie pen and a makeshift air nozzle (a straw cut and shaped to a 2mm hole). Blowing through the straw with the Sharpie nib in front of the nozzle, I got a few splashes and splats of colour (black with some red accents).

I took my old animatronic hand off the servo board and restrung it (the fishing lines had all snapped anyway). This was placed on a carriage in the breast pocket of the lab coat, and the fishing lines ran down the inside of the jacket to the lower pocket.

Gloved animatronic hand in pocket

A hole in the other pocket permitted a cable to be run from the pocket up the jacket to the collar where it connected to a socket at the back of the head to supply power and signal from an Arduino in the pocket to the brain LED array.
I bought a Halloween costume wig (£12) and modified it to accept the brain where there used to be a bald patch. This is held on by some hot glue between the plastic of the mould and the fabric of the wig. I finished it off with some electrical tape (intending to match the red/white/blue of the lab coat but only going as far as red and white).
The assembled costume cost me less than £30 (lab coat £10, wig £12, jelly mould £1 or so, tissue paper £2 with lots to spare, the rest was found or repurposed) and was great fun to put together and even more fun to wear.

Mad Scientist and Dalek

The circuit diagram and code are available for anyone who is interested (though neither is anything special); leave me a comment and if there’s any interest I’ll post them here.

Thoughts for upgrades or alternatives:

  • RGB LEDs arranged beneath the jelly mould would allow for more advanced animation and expression.
  • PWM control of the brightness of individual LEDs would also be good (the MAX chip allows for 16 brightness levels, but they are set for the entire array, not individual LEDs).
  • Makeup: a dirty/sooty look for the face, perhaps with panda eyes where the goggles provide protection.
  • Weather the brain jelly mould (it’s a bit too clean and shiny).
  • Better blend the brain-face boundary; I wanted a metal strip but ran out of time and just taped up the edge of the mould. Perhaps add some wires or bolt/screw heads.
  • Replace the safety goggles with something DIY, perhaps steampunk-ish.
  • Similarly, the dust mask wasn’t ideal; replace it with a custom ‘respirator’ continuing the bio-hazard symbology on the lab coat.
  • Replace the red/white/blue strips on the lab coat with black/yellow 45° stripes (I don’t know if or where these would be available).
  • More accessories, e.g. test tubes, black rubber gauntlets (I had some disposable gloves left over from spray painting), a robot pet, etc.
  • This could be a couples costume if the other half dressed as a Frankenstein/monster/lab rat.

LED Twitter Display for MAKLab

[1st September 2013] With the help of Glasgow Open Source Hardware Group (GOSHG) members, I’ve replaced all the LEDs on MAKLab‘s old/defunct 80×7 LED Display Board (an RC8200 from “Real Color Displays Inc”). Removing the old processor (a Z80) and superfluous ICs (8-to-1 multiplexer, buffers, NANDs, ANDs, J-K flip-flops and such like), I’ve got full control of the LEDs with just 7 pins (4 to control the rows, 2 to clock data into the columns and one to strobe the data to the LEDs). Up until now I’ve been using an Arduino Mega ADK and Ethernet Shield to handle the processing (today I added some code to make use of the microSD slot, so that messages can be read from the SD card and displayed if an Ethernet connection is not made).

I’m now considering options for condensing the hardware controlling the display. I like the Ethernet shield with the microSD slot, but having it on top of an Arduino Mega takes up a lot of space. I’m considering something like the Arduino Uno Ethernet, though I’d like more code space (I’ll try optimising/condensing to fit on an Uno, since I certainly don’t need the pin count or other features of the Mega).
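
The SD fallback itself is only a few lines. Below is a stripped-down sketch of the idea: try to bring the Ethernet shield up over DHCP and, if that fails, read messages from a file on the shield’s microSD slot instead. The MAC address and file name are placeholders, and the real sketch obviously also does all the row scanning and message display.

```cpp
#include <SPI.h>
#include <Ethernet.h>
#include <SD.h>

byte mac[] = {0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED};   // placeholder MAC address
bool online = false;

void setup() {
  Serial.begin(9600);

  // Ethernet.begin() returns 0 when DHCP fails (e.g. no cable plugged in).
  online = (Ethernet.begin(mac) != 0);

  if (!online) {
    // Pin 4 is the chip-select for the microSD slot on the Ethernet shield.
    if (SD.begin(4)) {
      File f = SD.open("messages.txt");   // placeholder file name
      while (f && f.available()) {
        Serial.write(f.read());           // stand-in for pushing text to the display
      }
      if (f) {
        f.close();
      }
    }
  }
}

void loop() {
  // Display row scanning and network polling would happen here.
}
```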

Having recently discovered Vine and rediscovered Twitter, I’ve been Vine-ing and tweeting about the board.

Other than the Vine-ing and occasional Tweets I’ve not been documenting my work on this project, instead just forging ahead over the past few weeks, so I thought I’d add a blog post and add to it from time to time. This post is a work in progress, published with the intention that I actually do something with it (way too many draft blog posts that won’t see the light of day).

Android Controlled Tank

At work someone mentioned using a robot to distribute the Friday afternoon sweeties, and I accepted the challenge. Already having a tank remotely controlled with a PS3 controller, I decided it would need a camera to see where it was going. Having recently upgraded my phone, I was able to use the old one with IP Webcam to share its camera feed over WiFi.

Tank01

Then I stumbled upon a brief article on Let’s Make Robots about a LEGO robot with a WiFi camera controlled by a custom Android app. I followed the instructions on the website linked in the article and had a proof of concept up and running with the free version of the app. Upon buying the Pro version of the app I was able to drive the tank around via Bluetooth and simultaneously see the camera footage from the old phone over WiFi.

Tank02

My configuration takes a Heng Long 1:32 scale Bulldog tank base (chassis, motors, gearbox and treads), driven by a Dagu 4-channel motor controller (over the top in this instance, but a common item now found in three of my robots) and controlled by an Arduino Mega ADK (also over the top, but I was using it for the USB host aspect when the tank was operated by a PS3 controller via Bluetooth dongle). The Bluetooth dongle has been replaced with a Bluetooth module for a simple serial link.

The Bluetooth module I have is only identified by the address written on the back (http://shop34694757.taobao.com/) and the label BT_Board v1.1. Thankfully I didn’t need to do much to make it work: I just wired up the power (VCC, GND) and plugged the Tx and Rx of the Bluetooth module into the Rx and Tx of the Arduino (crossed over so Tx talks to Rx and vice versa).

With a servo added and the Arduino code updated, I can now tilt the camera using some buttons in the BTBotControl interface.
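
The control loop on the Arduino is nothing fancy. Here’s a cut-down sketch of it: single characters arrive over the Bluetooth module’s serial link and are mapped to motor and servo actions. The command letters, pin numbers and the one-direction-pin-plus-one-PWM-pin wiring assumed for the motor controller are all illustrative rather than the exact values BTBotControl and my tank use.

```cpp
#include <Servo.h>

const uint8_t LEFT_DIR = 4,  LEFT_PWM  = 5;   // assumed motor driver wiring
const uint8_t RIGHT_DIR = 7, RIGHT_PWM = 6;
const uint8_t SERVO_PIN = 9;                  // camera tilt servo

Servo tiltServo;
int tiltAngle = 90;

// Positive speeds drive forwards, negative backwards (magnitude 0-255).
void drive(int leftSpeed, int rightSpeed) {
  digitalWrite(LEFT_DIR,  leftSpeed  >= 0 ? HIGH : LOW);
  digitalWrite(RIGHT_DIR, rightSpeed >= 0 ? HIGH : LOW);
  analogWrite(LEFT_PWM,  abs(leftSpeed));
  analogWrite(RIGHT_PWM, abs(rightSpeed));
}

void setup() {
  Serial.begin(9600);              // Bluetooth module sits on the hardware UART
  pinMode(LEFT_DIR, OUTPUT);
  pinMode(RIGHT_DIR, OUTPUT);
  tiltServo.attach(SERVO_PIN);
  tiltServo.write(tiltAngle);
}

void loop() {
  if (!Serial.available()) return;

  switch (Serial.read()) {
    case 'F': drive( 200,  200); break;   // forward
    case 'B': drive(-200, -200); break;   // reverse
    case 'L': drive(-150,  150); break;   // spin left
    case 'R': drive( 150, -150); break;   // spin right
    case 'S': drive(0, 0);       break;   // stop
    case 'U': tiltAngle = min(tiltAngle + 5, 180); tiltServo.write(tiltAngle); break;
    case 'D': tiltAngle = max(tiltAngle - 5, 0);   tiltServo.write(tiltAngle); break;
  }
}
```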

A video has been uploaded to YouTube.

Where should I go from here?

  • Overhaul Arduino sketch to allow for joystick control of camera angle (like a pan and tilt mode where horizontal joystick moves will rotate the robot but vertical moves only raise or lower the camera).
  • Perhaps mount an Airsoft BB gun (I have the original one from the tank turret).
  • Write my own custom Android app? Try piping control of the robot through the on-board Android phone, doing away with the mix of Bluetooth and WiFi. While I’m at it, make more use of the phone; it has GPS, an accelerometer, etc.
  • Reversing view using phone’s second camera? Phone orientation means the rear-view camera is hidden in the holder.

Edinburgh Mini Maker Faire

Edinburgh’s Summerhall was home to a Mini Maker Faire on the 7th of April 2013 and I travelled through with MAKLab to help out and to see what was on show.

We set up the Repair Cafe in the overspill of Summerhall’s cafe, taking over a picnic bench and the surrounding area. Although there was ample signage around Summerhall, the footfall was quite light, especially considering the 2,500 visitors that came through the door. That said, some folks came prepared with items to repair.


We took a look at a waffle iron and, though we found the fault, we couldn’t safely repair it.


This collection of stained glass pieces was dropped off, and we all worked together to reassemble the mobile as it had once been. We were hampered by the fact we didn’t have the self-adhesive copper which had been used to assemble the original, but we made do with the original copper strips where available and added some sheet tin to reinforce the structure.

MAKLab were set up in the courtyard area with a laser cutter producing little dinosaur kits. They had also brought along two large dinosaurs, Derek the Raptor and Terry the Pteranodon. These were cut out of 18mm plywood on MAKLab’s big CNC router (I helped with Derek and produced a time lapse which you can see here). The two dinosaurs were painted by the visitors, and by the end of the day they were looking well camouflaged.

As the Repair Cafe was quiet, we each took some time to tour the Mini Maker Faire. My first destination was the Robot Room, where I saw some awesome projects. The DR-1 by the Eve Robotics Team was an interesting little Arduino tank. At another table an off-the-shelf USB robot arm had been upgraded with position feedback by attaching a USB webcam and pointing it at a matrix of black spots on a white screen. I also got the chance to see an OpenROV, which featured on Kickstarter last year, among the other underwater ROVs built and displayed by Martin Evans.

Up the stairs from the main entrance there was a lot to see; I especially liked the Polargraph, a series of drawing machines (photo below). There were 3D printers churning out musical instruments, a pair of model houses with a variety of energy-saving materials and lots of wonderfully crafted items. Through in the next room there were electronic kits, vintage LEDs and large quantities of Arduinos on sale. The PolyFloss Factory were showing off their candyfloss machine adapted to produce a wool-like material from recycled plastic; members of the public could make a ruler using the material (packing it into a form which was heated up and subsequently cooled by the PolyFloss folk). Last but not least, there was a Tesla coil in a lift cage.

Unfortunately I only took a whistle-stop tour of the exhibits upstairs and I didn’t even get a proper look outside to see what else was out there. There are a few more photos in my Flickr photoset which I haven’t included on this page.

I didn’t want to let the lovely vinyl poster go to waste so I re-purposed it as a covering for my briefcase while we were packing up.