it’s on. much gazing.
PSUs installed. belt installed. motor and driver installed. ribbon hung and wired. controller electronics installed. LED bars mounted on and hooked up.
power on. ssh in over wifi. run test programs. no explosions.
so, with hours rather than days to go, finally get to sit in front of the machine and start to think of it as art. although this is not finessing: the motors are still not behaving, not all eight led strips are working, i’m still working with test sequences. soldering iron and ssh connections in hand, time to pull the rabbit out of the hat.
the upright bars should have been in place on my arrival; that was the pre-requisite for me being able to build the installation. as things were, the now partially pre-assembled units were only going up two days later, with only two days to go. this wasn’t good, but it did at least force some pleasant down-time where i got to do some things i’d wanted to do for ages, but that motor issues had kept me from. so here is the oled finally looking nice, and not impairing the machine running at 60fps. drawing any text using python’s imaging system was way too slow, but with a little inspection of the adafruit driver library, i saw i could swap out the image buffer and flush that to the screen, which would be near-instantaneous. so with a little pre-rendering, the oled could now update while the controller waits for the next frame’s start time. controller code, now at ~9th revision:
pre-render before starting the animation sequence, capturing each rendered buffer instead of sending the buffer:
def render_buffer_cache(prefix):
    for i in range(10):
        draw.rectangle((0, 0, oled.width, oled.height), outline=0, fill=0)
        draw.text((0, 0), prefix + str(i), font=font, fill=255)
        if host_upsidedown:
            oled.image(image.rotate(180))
        else:
            oled.image(image)
        buffer_cache[i] = list(oled._buffer)
for each frame of animation, point the buffer at the next pre-rendered one and send the contents to the display:
oled._buffer = buffer_cache[frame_index % 10]
oled.display()
arduino loops timed to the microsecond for smooth motion weren’t the only performance-critical bit of code required. the animation sequence was playing far too slowly – nowhere near the 60fps the animations were authored for – and now we could see it moving, it was clear it needed to be that fast and smooth to play with your perception as you looked on. updating the oled display and addressing the led pixels was taking too long. i knew you could drive this kind of oled crazy-fast as i’d written my own library from scratch for the mixer hardware, and i knew there was a lower-level way of driving the led strips. so: abandoned updating the oled for the time being, and advanced the controller python code to construct a bytearray per strip rather than set each led individually… and voila! naked-eye illusion. code. tho’, somewhat by definition, the video doesn’t quite capture that (tho’ watch the leds as the bar accelerates up…).
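the bytearray-per-strip idea, sketched in python. this is not the project’s actual code – the function and names here are illustrative – but it follows the dotstar (APA102) wire format: a zero start frame, then one four-byte frame per led, then an end frame, so each animation frame becomes one contiguous write to the SPI bus instead of per-pixel calls:

```python
def build_strip_buffer(pixels, brightness=31):
    """pixels: list of (r, g, b) tuples; returns the raw bytes for one strip.

    illustrative sketch of batching a dotstar strip into a single buffer,
    per the APA102 framing: start frame, per-led frames, end frame.
    """
    buf = bytearray(4)                   # start frame: four 0x00 bytes
    header = 0xE0 | (brightness & 0x1F)  # 0b111 marker + 5-bit global brightness
    for r, g, b in pixels:
        buf += bytes((header, b, g, r))  # dotstar byte order is blue, green, red
    buf += b"\xff" * 4                   # end frame clocks out the last pixels
    return buf
```

the whole buffer then goes out in one SPI transaction per strip per frame (e.g. via spidev), which is where the speed-up over setting each led individually comes from.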
what the above skips over is that if an arduino is now going to sit by the stepper and drive it, the main raspberry pi controller now needs to control the arduino, not the stepper drive’s pulse pins. and so, perhaps inevitably, the custom board i made to extend the pi gets a hack to disconnect two direct control lines and rewire them to the pi’s hardware serial pins. at least the pcb design was good, it’s just that the spec changed!
to get this working, the arduino just interpreted the last byte received as a 0-255 value and scaled this up to the step range, around 20,000. that was pretty coarse and certainly less than the precision of the float value supplied in the animation sequence, and so while the machine was in transit i developed a protocol to pack an integer that could represent that kind of number into a series of bytes, keeping things as efficient as possible.
in python, on the controller: from a single byte, coarse and with no messaging capability –
to three bytes each with two bit command type and six bit payload –
# command 01 = sequence run vpos start
vpos_steps = int(meta * drive_max_steps)
arduinoserial.write(chr(0x40 + ((vpos_steps >> 12) & 0x3f))
                    + chr((vpos_steps >> 6) & 0x3f)
                    + chr(vpos_steps & 0x3f))
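a round-trip sketch of that framing, as i read it: three bytes carry an 18-bit step position, with the command in the top two bits of the first byte and plain 6-bit payloads in the continuation bytes. the decode function mirrors what the arduino-side parser would need to do; names here are illustrative:

```python
CMD_RUN = 0x40  # command 01 in the top two bits of the first byte

def encode(vpos_steps):
    """Pack an 18-bit step position into three 2-bit-command/6-bit-payload bytes."""
    return bytes((
        CMD_RUN + ((vpos_steps >> 12) & 0x3F),
        (vpos_steps >> 6) & 0x3F,
        vpos_steps & 0x3F,
    ))

def decode(msg):
    """Recover (command, step position) from a three-byte message."""
    command = msg[0] >> 6
    value = ((msg[0] & 0x3F) << 12) | (msg[1] << 6) | msg[2]
    return command, value
```

18 bits gives a range of 0-262,143, comfortably covering the ~20,000-step travel at full float-derived precision.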
oh, did i mention there’s two of everything? the model is actually two floor-ceiling bars each with the rails of slide bar, carriage running on them, and led bar cantilevered from the carriage. being way over schedule and needing to ship the model to the gallery in the us, here is the excellent david meckin helping to wire up led bar #2.
in the midst of those debug wires was a logic analyser that hooks up to a desktop app. best thing ever – thanks again to ninja friend Arron Smith. using the logic probe to examine the control signals, i stripped down the control script to create a constant turn and the result is not good. stepper pulses, top: looks even. stepper pulses, middle: hell no. that’s just a little later in the same sample.
no wonder the motor doesn’t have smooth motion, with the shuddering action making horrible noises and vibrations through the structure. realisation one is that the script can’t generate pulses anywhere near the frequency needed to drive the motor at a decent speed, and realisation two is that the timing accuracy is just not there either. running this machine from a python script on a raspberry pi with its stock linux installation was an experiment, and this is where “let’s try the future” fell down. simply, it’s not a realtime environment. however, a bare-bones sketch on an arduino, flipping a digital out between delaymicroseconds() of 5 and up, did work beautifully. somehow a middle-ground had to be reached.
the first stage of that was to simulate the 60fps commands of the driving animation from a stand-alone program. a sin calculation takes ~120µs on an arduino, considerably longer than the minimum pulse time of around 5-10µs. the beautiful step signal shown bottom is the output of calculating sin(time) every 1/60s, and for every loop in-between, calculating a linear interpolation between the last two sin points, comparing that with the current position, and issuing a step and/or direction change if needed.
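the real loop is arduino C, but the strategy can be simulated in python: a new target arrives each 1/60s frame (here, sin(t)), and between samples the loop linearly interpolates and issues a single step whenever the interpolated position crosses the next step boundary. names and the substep count are illustrative:

```python
import math

STEPS_RANGE = 20000  # assumed full travel in steps, per the step range above

def steps_for_frame(prev_target, next_target, substeps=100):
    """Yield +1/-1 step pulses while interpolating between two frame targets.

    prev_target / next_target are normalised positions (e.g. sin samples);
    in the real sketch each yield would set the DIR pin and pulse STEP.
    """
    position = round(prev_target * STEPS_RANGE)
    for i in range(1, substeps + 1):
        interp = prev_target + (next_target - prev_target) * i / substeps
        target_steps = round(interp * STEPS_RANGE)
        while position != target_steps:
            direction = 1 if target_steps > position else -1
            position += direction
            yield direction

# over one 1/60s frame moving from sin(0) to sin(1/60):
pulses = list(steps_for_frame(math.sin(0.0), math.sin(1 / 60)))
```

the point of the interpolation is that the expensive sin only runs at 60hz, while the cheap inner loop spreads the resulting steps evenly across the frame.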
later, in the down-time while shipping and the gallery install, i’d spent time working through options to drive the motor better. the machine would run smoothly – purr, even – if i drove it with something known-smooth like a sine wave, but whenever i switched from this a problem appeared. the intention had always been to add some kind of inertial smoothing to the control, so that whatever input the controller took would translate into something that was mechanically viable for the machine. accelerating the led bar puts torque on the linear bearing, and the weight of the whole assembly will simply stall a stepper motor pulsed from still to a constant turn. the answer should be to employ a physical model in the code, which with a little research the accelstepper library effectively does. you can command a step position, and it will take the motor there within a defined envelope of speed and acceleration. every stepper project should use this… except it turns out the maths is too computationally expensive to achieve anything close to stepping fast enough for this project on an 8mhz arduino.
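the inertial-smoothing idea can be sketched in a few lines of python: whatever position is commanded, the model only moves toward it within a speed and acceleration envelope, much as accelstepper does (minus the deceleration ramp a real implementation needs to stop exactly on target). the constants here are illustrative, not the project’s tuning:

```python
MAX_SPEED = 4000.0   # steps/s, illustrative envelope limit
MAX_ACCEL = 8000.0   # steps/s^2, illustrative envelope limit

def advance(position, velocity, target, dt):
    """One control tick: accelerate toward target, clamped to the envelope."""
    direction = 1.0 if target > position else -1.0
    velocity += direction * MAX_ACCEL * dt
    velocity = max(-MAX_SPEED, min(MAX_SPEED, velocity))
    position += velocity * dt
    return position, velocity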
problem was, any other strategy i tried in my own stepper driving code proved insufficient – i had high hopes for a simple median filter. this may have been partly due to the other side of driving the stepper smoothly: the microsecond consistency of pulse timings (as discovered originally, trying to drive direct from the pi). i suspect the serial read of commanded step position from the pi was interfering with this and/or disrupting the exact regularity between received commands that the linear interpolation relied upon.
in hindsight, two things stand out. one is that the raspberry pi is incredibly fast and powerful compared to the arduino, so running such a library fast enough would present no issue, if it ran a realtime operating system akin to the arduino. this required more linux-fu than i had headspace for at the time, but is surely the correct way to go. the other is a little more embarrassing, in that research since has shown the stepper driver – the hardware box that takes the logic-level step pulses and makes it so – has an optional smoothing filter that is configurable by software. i never got to this, as mentally the driver was sold as not needing software configuration, the very point being it was pre-tuned for this series of motors; and physically, the interface cable was bespoke and i didn’t buy it. just perhaps breaking that mental mould to make up that cable could have saved me nights’ worth of grief.
to cut back to sitting in front of the machine with the opening in hours not days, anything that wasn’t that sinusoidal self-driving arduino sketch was causing horrible, horrible sounds (something like a resonant frequency in the half-coupling and splined shaft that slotted into the pulley), and so running sinusoidal was what it was going to be.
and to underline just how crazy the whole motor episode was, simply adding a debug line to send the current position of the motor to the serial console when the belt mid-point marker passes the sensor caused mechanical vibrations that made you want to turn the whole thing off. that’s how sensitive the pulse timings are. throughout, it has been the worst case of the observer changing the observed.
of course, things are never straightforward and the particular nightmare of this project starts: motion control. every single aspect has problems. start the analysis, isolate issues…
LED baton mounted on the vertical slide mounted on the floor-ceiling bar, power supplies installed and wired up, first pass software to run the LED strips… let there be light!
to this point, everything has been to plan and as per the researched data sheets. the motors prove to be nothing of the sort: mechanically, mating them to the pulley end-unit is fraught with issues; likewise their control.
the particular motor chosen has a shaft that matches the diameter of the bore in the pulley; this i had checked. what i had overlooked is that, unique to this particular motor in the range, the shaft length is much less than the others – so short that engaging it into the pulley is an issue. this requires making the motor mating plate as thin as possible, and some horrible workarounds: offset grub screws, taps and holes where they shouldn’t be, and access holes just to be able to tighten the result up. eventually it was done, but it was a bruising night.
having got a mechanical fit of motor to frame and pulley, then onto control. a quick implementation in the python script running the controller, and the vertical position is being read from the text file and translated into stepper pulses to sync the mechanical position. code
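a minimal sketch of that sync logic, under assumed names: each animation frame supplies a normalised vertical position (0.0-1.0) from the text file, which is scaled to the belt’s step range and compared against the current motor position to get the pulses owed this frame:

```python
DRIVE_MAX_STEPS = 20000   # assumed full travel of the carriage in steps
current_steps = 0

def pulses_for(vpos):
    """Return (direction, count) of step pulses needed to reach vpos (0.0-1.0)."""
    global current_steps
    target = int(vpos * DRIVE_MAX_STEPS)
    delta = target - current_steps
    current_steps = target           # assume the pulses are issued and arrive
    return (1 if delta >= 0 else -1, abs(delta))
```

this open-loop counting is exactly what makes the pulse timing so critical: the code’s idea of position and the motor’s actual position only agree if every pulse lands cleanly.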
so… it’s alive! it works….
…but actually it doesn’t. the control isn’t giving smooth operation. it hurts just thinking about the jittery movement and how the cantilevered led bar amplifies that.
carriage blocks machined, pcb assembled, parts mounted and LED bars have LED strips fitted and on their way to being wired. damn fiddly, not without its trials and certainly time consuming, but looking damn sharp.
fitting the ‘dotstar’ LED strips in particular is both wonderful and frustrating. such a nice fit when fitted, but all sorts of trouble getting them in and out, an action liberally aided by silicone lubricant. lots of silicone lubricant. but many gotchas: the rubber sheath stretches as you pull the assembly through, so woe betide you if you need to adjust afterwards or pull it out again (which, this being both prototype and first assembly, you’ll end up doing a few times); and the dotstars are not continuous strips – you either need to de-solder the appropriate joint of the 60/m low-density strips, which are sold as 5m rolls but are actually factory-soldered 0.5m sections, or completely remove the copious silicone and whatnot from the 1m sections of 144/m high-density strips and solder them together. just doing that on the bench could be unreliable, and when sheathed and pulled into the groove… yeah.
this is a daughter-board for the raspberry pis that will run each arm of tekton one three. in theory, it was trivial: bridging the pi’s header pins to connectors for the specific cable runs that needed to be made, via 3.3v-5v driver chips. in reality, it was a painstaking hand layout: as well as the data lines, this was the power distribution board for the whole lot of LEDs. double-digit amps – most of the ribbon cable between moving carriage and fixed upright is simply power.
laurel pardue is an accomplished violinist with an augmented violin. her work is all in the aim of making music using a beloved instrument with repertoire and musicians with lifetimes already invested in it. the grunt of her work, though, is engineering: designing sensors and processing to capture the dynamics of a violin being played – as if you could ask the violin itself. that sets an interesting dichotomy in demonstrating her work.
now an analogue instrument with a real-time data feed is attractive to me. having long explored realtime video, i’ve been wanting to work with the directness of lighting for some time. delays and something-one-removed so often seem inevitable for live video work, and here was a chance to get immediate data and direct rendering. so we hatched a plan, a simple but atmospheric visualisation of her playing, around her playing. four parallel beams of light behind, one for each string, and one moving beam for the bow. the trick is for the light’s attenuation and movement to be the music, so eliminating latency is the key: the sound and light are perceived as one.
and here we are, prepping on stage. at this iteration, i’ve got a mixer controller taking in OSC and driving DMX LED lighting. it all would be fine, but experimental hardware being experimental hardware, laurel’s system is needing some emergency attention…
for a few fractions of a millimetre here and there, this is pretty much a whole new pcb design. it also meant no magnetically isolated ethernet jacks (the mag in magjack should really be magic instead), which puts you on a path of doing that isolation in your own circuitry, and that… that can put you on the path of feature creep.
yep, feature creep it has, and with that more delay. but for a very good end: that ethernet jack can now do more than ethernet…
package was waiting for me, opened the box: two assembled pcbs, all looking as they should.
to my utter amazement, the electronics all work out: once the pinouts were updated in the mBed firmware, it’s talking RS232 to the video processor without drama or debugging. to my double amazement, after a quick primer on SPI and deciphering the oled’s datasheet, a test program incants the screen into life, and some time after that the random pixels of an uninitialised buffer have turned into a pattern of my coding. some image editing and two processing sketches later, i have the byte sequences to display a full-screen image and typeset my choice of pixel font. the test program fleshed out and re-rolled as a library, and we have the above sight. given the delays the screen has caused since 2009, it’s so satisfying having this work as per the data sheet, as per design, out of the box; and not even relying on other people’s voodoo: my library, from scratch.
of course, things are never quite that simple. the electronics check out, but the physical fit requires some rejigging – note to self: order the enclosure in advance next time, no matter how well dimensioned its spec drawing may seem. a millimetre here, and a millimetre there has now forced a complete re-layout of the PCB.
aaaand: the ultimate irony? the physical fit issues weren’t just components hitting case internals. what you can’t see in the photo are 40 jumper wires coming out of the mBed’s socket on the PCB leading to a displaced mBed sat in a prototyping block. for all of the correct design and manufacture, these assemblies are compromised by a square peg (mBed pins) not fitting into a round hole (a quirk of the socket strip i spec’d). tssssch!
it’s been two and a half years since the magic week of going from idea to a working dvi mixer package and the ensuing dreams of getting it out there for everybody. problem is, that’s still in dream territory: where’s the manufacturable hardware or the website buy-button?
by june 2010 there was a v1 and things were looking good. it had taken far longer than seemed necessary… but that turned out to be just the tip of the iceberg. this was a year ago, and this slightly less. so what has been happening? long story short, the software had been waiting for hardware that was perpetually “almost” there. if you’re time-constrained, don’t work with someone else who is also time-constrained; doubly so if you’re entirely relying on them. and if you don’t have the skills to make yourself the master of your own destiny, get on with getting them.
getting on with getting them, for the past two months i’ve been working away, and what you see above is an extract from my first PCB design, a from-scratch reworking of what a d-fuser PCB needs to be. i’ve just sent it out to be manufactured and, all-importantly, assembled (data sheets and making up eagle library parts may no longer scare me, but soldering 0.5mm pitch FFCs does). if it doesn’t work it’s because of the design, and that i can work with – and now i have some guru backup.
all of which means it’s as if it’s may 2010 again: we’re in “pcbs, parts, plans” territory, complete with corresponding announcement: i will be presenting the work-in-progress at dorkbot london #782. the difference is, this time i know the whole widget, it’s entirely down to me, and i’ve even secured a little start-up funding to expedite this prototyping.
including the gift of the tv-one header for any other projects out there, and doing some reverse engineering to get EDID upload functionality that plug'n'play would need. props to vade for the insane undertaking of reworking the original tv-one header to have every rs232 command and resolution under the sun in there, not to mention bootstrapping the QC plug-in itself.
if you’re in town, come down!