
Milling circuit boards

For the past 12 or so years, the AT86RF23x 802.15.4 radios have been my go-to for low-power digital communication. They work pretty well, and I have a good software stack and protocols built up around them (which my friends decided should be called “Bri-Fi.”)

They’re sort of expensive, though—the bare chips are a few dollars each and modules were at least $20-30 last I looked. On a fully custom sensor board they’re not that bad, but for random side projects where I just want two things to talk to each other wirelessly, the cost of the chips and doing an RF layout are kind of annoying.

I’ve been seeing a lot of Nordic’s nRF24L01+ radios in the maker community. It seems there’s a pretty good Arduino library and the modules are available super cheap. I think I got five complete modules for about what I’d pay for one of the RF233 chips.

Anyway, I’m playing around with these modules and put together a couple of quick PCBs to try them out. I’ve been getting pretty good results milling boards at home using my little CNC router, so I thought I’d snap a few photos and write a “quick” blog post. This board is a little USB-to-RF bridge based around the ATmega32U2. If this works, it’s going to be the computer side of a custom user input device.

Not an ideal layout for an RF board, but some compromises are generally required to make things work in a single layer.

I’ve also been working on switching from Altium to KiCad (which, in addition to being free, runs natively on my Linux desktop without needing a virtual machine), so simple boards like this are a good way to get used to the new workflow.

I always love seeing boards in 3D before I make them. I’m very meticulous about drawing all of my own footprints and accurate 3D models of all of my components—it takes a lot of time but the results are satisfying.

For many years, I used the “fab modules” from the How to Make (almost) Anything class to convert my layouts to G-code for milling. I had a script that would take a PDF exported from Altium, separate out the layers, rasterize them, and pipe them through the command-line fab modules to produce a set of toolpaths. But the fab modules have changed a lot since then (they’re now primarily browser-based, which is great if you want a GUI but makes automation a bit trickier) and the Altium PDF export thing was always a hack.

Since I’m switching to new layout software on a new computer (which doesn’t have my ancient copy of the fab modules) and a new CNC router, I decided to try something new. I’ve been using a program called pcb2gcode, which has been working very well. It takes gerber files exported from KiCad and generates toolpaths. The whole process is quite seamless.
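For reference, a minimal sketch of how the conversion can be scripted (wrapped in Python just to keep it repeatable; the file names, depths, and feed rates are placeholders, and option names can vary a bit between pcb2gcode versions, so check pcb2gcode --help for yours):

    # Sketch of a pcb2gcode invocation; file names, depths, and feeds are
    # placeholders, and option names may differ between versions.
    import subprocess

    subprocess.run([
        "pcb2gcode", "--metric",
        "--front", "bridge-F_Cu.gbr",          # copper layer exported from KiCad
        "--outline", "bridge-Edge_Cuts.gbr",   # board outline
        "--drill", "bridge.drl",               # Excellon drill file
        "--zwork", "-0.045",                   # engraving depth (mm)
        "--zsafe", "2", "--zchange", "25",
        "--mill-feed", "120", "--mill-speed", "10000",
        "--zcut", "-1.8", "--cut-infeed", "0.6",
        "--cut-feed", "100", "--cut-speed", "10000",
        "--zdrill", "-1.8", "--drill-feed", "80", "--drill-speed", "10000",
        "--output-dir", "gcode",
    ], check=True)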

The CNC router is probably a subject for a whole series of posts that I may or may not ever get around to writing. It started as a pile of parts from one of the ubiquitous cheap “3018” CNC kits which are quite lousy as sold. In the process of trying to improve it, I basically ended up building an entirely new machine.

The router, which at this point is a very odd mix of cheap parts from the original kit and nicer parts.

Another deviation from the fab class process is the tooling. The class uses 1/64″ endmills from Carbide Depot, which are about $17 each. I’d generally get about 10 small boards out of a well-cared-for one before it started getting dull, which isn’t a terrible cost-per-board, but they’re also so small that they have a tendency to break if even slightly mistreated. They’re also a bit on the big side for some components—1/64″ is about 16 mils or 0.4mm. That just barely works for TSSOP ICs with 0.65mm pin spacing if one makes the pads so narrow that they’re no wider than the traces. For the class it’s not a huge issue since people tend to work with 1206 passives and SOICs or 0.8mm QFPs, but it certainly limits density and makes it harder to work with some more interesting parts. 0.01″ endmills can also be used, but these break if you look at them funny.

30° v-bit engraving tool for traces, 0.8mm 2-flute endmill for holes and routing the board outline.

Instead, I’ve switched to using a 30° v-bit engraving tool to mill in between the traces. These are basically single-flute D-bit cutters: half of the shank is removed (leaving a D-shape) and sharpened to a point, relieved on one edge so there’s only one cutting surface. The benefit is that there’s the maximum possible amount of material supporting the cutting edge, so it’s a lot less fragile than a complex multi-flute endmill geometry while still coming to a very sharp point. The tradeoff is that the width of the cut depends on the depth of the cut—the board either needs to be perfectly flat in the machine, or you at least need to compensate for variations in height in the toolpath. I solved this problem with auto-leveling. I connect wires to the blank PCB and the tool with alligator clips, and then the machine probes downward until the tool touches the board, completing a circuit. By repeating this process on a grid over the surface of the board, the machine can automatically map out the height of the board and compensate in the toolpath.
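To give a sense of what the compensation step actually does, here’s a simplified sketch (not the code the cncjs extension really uses): given the grid of probed heights, it interpolates the local surface error at each XY position and adds it to the commanded Z, so the engraving depth stays constant relative to the copper.

    # Simplified auto-leveling compensation: bilinearly interpolate a probed
    # height map and add the local surface error to each commanded Z.
    # (Illustration only; the cncjs autolevel extension does this for real.)
    import bisect

    def surface_error(x, y, xs, ys, z_grid):
        # xs, ys: sorted probe coordinates; z_grid[j][i] = probed Z at (xs[i], ys[j])
        i = max(1, min(bisect.bisect_right(xs, x), len(xs) - 1))
        j = max(1, min(bisect.bisect_right(ys, y), len(ys) - 1))
        tx = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
        ty = (y - ys[j - 1]) / (ys[j] - ys[j - 1])
        z00, z10 = z_grid[j - 1][i - 1], z_grid[j - 1][i]
        z01, z11 = z_grid[j][i - 1], z_grid[j][i]
        return (z00 + tx * (z10 - z00)) * (1 - ty) + (z01 + tx * (z11 - z01)) * ty

    def compensate(x, y, z, xs, ys, z_grid):
        return z + surface_error(x, y, xs, ys, z_grid)

In practice, long cutting moves also have to be split into shorter segments so the interpolation can follow the surface between probe points.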

In my current configuration I’m milling at a depth of 45μm, which is giving me a cut about 140μm wide (about 6 mils). In theory that should be good enough to make footprints for 0.5mm QFN parts, but I haven’t tried pushing the process that far yet.
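The depth-to-width relationship is just the cone geometry of the tool plus whatever small flat the tip actually has (these cheap bits are rarely a perfect point, and runout adds a little more). As a rough sanity check, with the tip flat as an assumed number rather than a measured one:

    # Cut width for a V-bit: the cone contributes 2*d*tan(angle/2) on top of
    # the tip flat.  The 0.1mm tip flat here is an assumption, not a measurement.
    from math import tan, radians

    def vbit_cut_width_um(depth_um, included_angle_deg=30, tip_flat_um=100):
        return tip_flat_um + 2 * depth_um * tan(radians(included_angle_deg / 2))

    print(vbit_cut_width_um(45))   # ~124 um; the taper alone would only give ~24 um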

I haven’t milled enough boards to determine how long these tools last yet (though I’ve dulled a couple by crashing them into the board when adjusting the Z-height). I think I’m on the fourth board for my current tool and it’s still cutting nice and sharp. And they’re only $14 for a pack of ten, so they are significantly more economical than the 1/64″ endmills.

To drill any holes and cut out the finished board, I’m using a 0.8mm 2-flute endmill, which is the same as the fab class process. These cheap ones I’m using don’t last very long before they get dull and leave a bit of a burr on the edge, but they work.

cncjs executing the toolpath for the board.

The control board in the router is the original from the 3018 kit, and it runs GRBL, which is a pretty basic CNC controller designed to run on Arduino-based hardware. It’s not fancy but it gets the job done. As an interface, I’m using the excellent cncjs, which I have running on the Raspberry Pi clipped to the back of the machine. This combo is pretty great—I just plug the machine in, it connects to my Wi-Fi, and I can open up a web browser to upload a G-code file and control the machine. I’m using this extension for cncjs which implements the autoleveling. After uploading the G-code for the board, I run a macro that invokes the extension, which probes on a grid and then modifies the G-code in place to account for any variations in board height.
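Under the hood, the probing is just GRBL’s G38.2 (probe toward workpiece) command repeated over a grid. A stripped-down sketch of the idea, with the serial port, feed rate, and grid spacing made up and the response handling simplified:

    # Stripped-down grid probing with GRBL's G38.2 command over a serial port.
    # Port, feed rate, grid spacing, and board size are placeholders; it assumes
    # GRBL's startup banner has been consumed and X/Y/Z are already zeroed at
    # the board origin with the tool just above the copper.
    import re
    import serial  # pyserial

    grbl = serial.Serial("/dev/ttyUSB0", 115200)   # blocking reads

    def send(line):
        grbl.write((line + "\n").encode())
        out = []
        while True:
            resp = grbl.readline().decode().strip()
            out.append(resp)
            if resp.startswith(("ok", "error")):
                return "\n".join(out)

    heights = []
    for y in range(0, 41, 10):                 # 10mm grid over a 40x60mm blank
        for x in range(0, 61, 10):
            send("G0 Z2")                      # retract
            send(f"G0 X{x} Y{y}")
            reply = send("G38.2 Z-2 F20")      # probe down until tool touches copper
            m = re.search(r"PRB:([-\d.]+),([-\d.]+),([-\d.]+)", reply)
            if m:
                heights.append((x, y, float(m.group(3))))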

Milling out the traces.

In the end, I’m quite pleased with the results. It’s taken a lot of tuning to get everything dialed in, not to mention the long process of building the machine, chasing rigidity and reducing backlash… but it now seems to be working pretty well. This board took about 10 minutes to mill the traces and another 5 or so to cut the holes and the board outline. This definitely isn’t a process I’d use for really complex boards or anything I need multiple copies of, especially when PCB fabrication services are so readily available and cheap now, but sometimes it’s nice to be able to go from an idea to a completed board within a single afternoon.

The finished board.
And fully assembled. Not the greatest soldering job—I only have a basic iron at home, so it was hard to avoid getting some solder on the ground pour, which makes it look a little ugly, but it works fine. Some of the components I ordered got lost in the mail, so I’m also using a few 0603s scavenged from junk boards where I intended to put 0805s.

Now I just need to write some code to see how well I like these radio modules for my application.

Lightning Damage

Last week, lightning struck at Tidmarsh, the former cranberry bog where I’m conducting experiments for my Ph.D. research on sensor networks.  We’ve had some nearby lightning strikes before that have caused some minor equipment  damage (it’s one of the perils of working in this environment) but nothing quite so major as this.  We have some protection against lightning—perhaps not as much as we should have, but it can be quite challenging when we have many cables extending out over a large area.

I was actually there on site when it happened, making myself a cup of tea in the guesthouse.  I was looking the wrong way to see the actual bolt, but saw a flash and heard the thunder instantaneously, so that must have been it.

We lost pretty much the entire audio installation at the former impoundment.  Closer examination of the damaged equipment tells the story of the path the current took.

I don’t think anything was hit directly, or the damage would have been even more extensive.  Most likely, the lightning hit the ground very close to where one of our cameras is out in the marsh.  The surge then traveled about 150 meters through a CAT6 ethernet line, into port 4 on the switch in the south box:

The metal at the bottom of the photo is the metal housing on the ethernet jack; the black box with the pins at the top is the magnetics.  There’s some definite charring and delamination of the PCB.  From there, current probably followed the Power over Ethernet (PoE) path, through the center taps on the transformers.  On the other side of the board, several ICs are pretty well obliterated, with sections of plastic completely blown away:

This is probably a switching regulator for the PoE power.  On the reverse side of the board is the likely exit point, where current found its way to the grounded case.  These unpopulated pads show some charring:

… but the real story is told on the case itself, where there are definitely signs of arcing and pitting underneath where those pads were on the PCB:

High voltage was obviously present on the main digital power supply rail on the PCB, as several other chips on the board that don’t have anything to do with PoE are pretty clearly toast:

Those little holes blasted into the chips are evidence of the magic smoke getting liberated.  The bypass capacitor to the left of the second chip also looks pretty unhappy about the voltage it’s seen.

In the other damaged equipment, the story isn’t quite so clear.  The south audio input box (S16) has a couple damaged op amps/switches on two of its preamp channels:

Some of the input resistors have clearly seen more power than they’re rated for, too:

Since the most obvious damage here is around the microphone inputs, and this box isn’t connected at all to the ethernet switch, it’s likely that this was a second path current took into the system, from the strike location into a microphone or two and through the cables into the preamps.

The north box also took some damage.  The mixer/audio interface (X32) lost functionality in one of its input boards (not yet photographed) but remains functional as a mixer (still powers on, screen works, channels 1-8 still work, and still communicates on the network).

The ethernet switch at the north box miraculously still works as a switch and continues to move packets on both the fiber and copper ports.  All of its PoE functionality is dead, though.  Since no copper ethernet cables run from that switch into the field where the strike was, it’s likely that voltage came from the ethernet cable that goes between the two boxes, damaging the directly connected PoE circuitry but sparing the galvanically isolated ethernet functions.  The north switch is still operating in the field so I haven’t disassembled it yet to inspect it for damage.  However, I have a pretty good idea of where current found its path to ground here, as the Intel NUC computer that was attached to it is now completely dead.

A close inspection of the NUC’s ethernet section reveals some termination resistors that have clearly seen power beyond their rating:

And finally, near the ethernet magnetics, a PCB trace that’s completely vaporized:

So most likely, lightning struck near the marsh camera, traveled up the cable to the south box, completely destroyed the ethernet switch there, finding a path to ground by going through the PoE circuitry and arcing over an 8mm gap to the case.  It also propagated down the ethernet cable that runs to the switch in the north box, where it found a path to ground through the PoE circuitry to the NUC computer, vaporizing some traces in the process.  Independently, high voltage was picked up through microphone cables, damaging several preamp channels.

As we rebuild, it seems like it will be a good idea to make sure there’s surge protection on every ethernet connection, not just the ones that seem most at risk (like we had before).  And it might be time to design some microphone surge protectors—something that doesn’t really exist on the market.

Lenovo Bluetooth Keyboard Repairs

For one of my home desktop setups, I have very particular keyboard requirements.  Since I put the keyboard in my lap (there’s no desk/table; the monitor is suspended on a cantilevered arm), the pointing device needs to be integrated into the keyboard itself.  I’ve become less of a fan of trackpads over the years, especially the terrible ones that are integrated into cheap wireless keyboard combos like the ubiquitous Logitech K400.  Furthermore, as it’s a Linux machine and I’m very accustomed to the X11 clipboard, which uses the middle mouse button to paste, I want a physical middle button.  I’ve only found one wireless keyboard that meets those requirements, and it’s Lenovo’s Bluetooth keyboard with a TrackPoint:

Bluetooth keyboard

I like the keyboards and TrackPoints on my ThinkPads, so it’s nice to have the same setup.  Unfortunately, the wireless keyboard is a bit of a regression from the ones on my ThinkPads: it lacks the row above the function keys, the function keys have tiny markings with big icons for their secondary functions (which I don’t care about), and the build quality is overall not as good as older ThinkPads.  I also don’t like Bluetooth (pairing is complicated, and it doesn’t work in the bootloader/BIOS).  But it works.  Well, it did, at least until my TrackPoint stopped working one day.  The keyboard continued to work, but the TrackPoint started drifting incessantly to the upper left no matter how it was deflected, and then ceased to work altogether (including the buttons) a couple of days later.

Searching for others with the same issue led me to this surprisingly recent and relevant GitHub issue.  I suspected it might be an issue with the keyboard’s firmware: that it had gotten into some state where it had disabled the TrackPoint (most ThinkPads with the Windows drivers had a hotkey to disable the TrackPoint; did the wireless keyboard perhaps have a similar but undocumented function?).  Attempts to reset the keyboard (disconnecting the battery, holding different keys on powerup) didn’t bring it back, though.  I did discover a few things that would cause it to lose its Bluetooth pairing, suggesting a reset of some sort (holding the Esc key when connecting the battery seems to do this), but the TrackPoint still didn’t come back.

Removing the circuit board and inspecting the traces on both sides led to more insight about how the keyboard works.  In all ThinkPads I’ve seen (up through my T420s, at least) the TrackPoint controller is on a small PCB physically attached to the TrackPoint itself on the back of the keyboard.  The TrackPoint is a couple of strain gages; the controller reads the strain gages, does some dynamic calibration, and translates the result into cursor movement, which is communicated to the computer over PS/2.  The wireless keyboard is missing the controller on the TrackPoint itself.  Instead, a 4-conductor flex cable goes back to the main board.

pcb_2

The main board has two microcontrollers on it.  Most of the functionality appears to be handled by a BCM20730 SoC, which is intended for implementing Bluetooth HID devices.  This chip sits on a module mounted to the mainboard with castellated vias along the edge.  It has a built-in trace antenna, though for whatever reason the keyboard designers have chosen to use an external antenna on the mainboard instead.  The UART pins on the module are wired to test points on the bottom of the board with their functions labeled; these points are accessible without disassembling the keyboard by removing the label from the bottom.  This is probably how the keyboard’s firmware is initially programmed.

The signals from the keyboard matrix connector all route to this module, with the exception of the mouse buttons, so the module seems to be handling the keyboard functionality directly.  The second chip, on the mainboard itself, appears to connect to the module through I²C (judging by the labels on the test points on the bottom of the board) though I didn’t actually probe the signals to confirm this.  This IC, which is labeled 502A6 HF372 7AV1 in my unit, isn’t something I’ve been able to identify.  I suspect it might be an ASIC based on the lack of any programming interface.  Otherwise, it would have to be preprogrammed before being installed on the board.

This second IC appears to be the TrackPoint controller, and on my keyboard it appeared to be poorly soldered: I was unable to visually confirm good connections between the chip and most of its pads.  Removing the chip with hot air didn’t change my keyboard’s behavior—the keyboard continued to pair and work perfectly, while the TrackPoint was dead.  This confirmed that the IC isn’t involved in the keyboard functionality and must just be the TrackPoint controller.

Inspecting the bottom side of the chip under a microscope (I unfortunately did not take pictures) made me extremely doubtful that some of the pads had ever had solder on them; they appeared dull and oxidized.  I cleaned off the PCB and the chip with solder wick, and re-soldered the chip using leaded solder.  When I connected everything again, I once again had a working TrackPoint.  I can’t say I’m impressed with the quality control for an $80 keyboard.

Repair Guide

To help others who are having the same or similar problems, I’ve put together some instructions for performing the fix.  This isn’t really an exhaustive guide and requires some expertise, but hopefully will be useful to someone.

There are only three screws in the entire assembly, and they hold the TrackPoint onto the bottom of the keyboard.  Everything else is just press-fit or stuck together.  The top frame just pops off.  I started in the bottom left corner, and didn’t need any tools other than my fingers to start removing it.  (If yours is tight, a plastic spudger might help.)

Opening the case

Continue to work around the outside until the frame is freed from the bottom all the way around.

Starting to open the case

Once the frame is off, peel the keyboard up from the base.  It’s just stuck down using double-sided tape.  Make sure to pry underneath the metal backing and not the plastic bezel.  Be aware that there are two cables attached to the bottom of the keyboard: one for the keys and buttons, and another for the TrackPoint.  The keyboard cable is fairly large and robust and has some extra length, but the TrackPoint cable is thin and fragile.  I suggest releasing this from its connector as soon as you can to reduce the risk of damaging it:

trackpoint_cable

The adhesive can be reluctant to let go; be patient, apply gentle, continuous force as it comes free, and be careful not to tear the cables or bend the keyboard.

keyboard_inside

With the keyboard disconnected and removed, this is what’s left inside the case.  There’s the battery, the main board, a small PCB with the LED and switch, and the NFC pairing board along the front edge.  Make sure to disconnect the battery connector before doing any work on the PCB so you don’t short anything.

pcb_2

The chip we’re interested in is U2, at the upper left corner of the main board right next to the TrackPoint connector.  At this point, there are two options.  The first is to remove the chip with hot air, clean the chip and PCB, apply the proper amount of solder paste, and reflow.  However, if you don’t have all of the equipment for that, repairs might be possible with an iron with a relatively fine tip and a good flux.  Apply the flux (I recommend this stuff, it is expensive but fantastic) so that it coats both the pads on the board and the side of the chip.  Put a small blob of solder on your iron tip, and run it along the side of the chip.  The idea is that you want the blob to touch both the pads and the side of the chip, but the tip itself shouldn’t physically drag across either (the pads on the board are very easily scraped off, especially the several that don’t have traces connected to them).  If you’re doing this correctly, the right amount of solder will be left on the pad and the chip without leaving bridges between adjacent pads.  If you do end up with bridges, add more flux and try again.  Note that the tip doesn’t need to be particularly fine, as long as it’s not so big that you can’t avoid the passive components nearby.

Given my initial assessment of the board, I am doubtful that simply heating the chip without additional flux and solder would fix the problem (lead-free solder really does not flow nicely).  However, if you have a hot air gun it might be worth trying before attempting the above.  I would still add some flux, and possibly apply some gentle pressure to the top of the chip while heating, though be careful not to slide it around on the board.

A properly soldered chip should look something like this:

chip_solder

Note how you can see nice fillets that go between the pad on the board and the exposed copper on the side of the chip.  There might be connections underneath even if you can’t see these fillets on the side of the chip, but seeing them gives you a nice confirmation that you have a good mechanical and electrical connection.

If things are working again, the reassembly procedure is basically the reverse of disassembly.  Make sure that the cables that go between the boards are routed through their proper channels, reconnect the keyboard matrix and trackpoint cables, and stick the keyboard back down (there are a few features in the plastic that help you align it).  Stick it down lightly and then apply pressure from the middle outward—this will help ensure that it stays centered and even.

I’d be interested in hearing if this helped fix your broken TrackPoint, or if you have any suggestions to improve this article.

ADAT modification for the Layla converters

Background and Motivation

Since this project involves a bunch of digital audio stuff that some of my readers might not be familiar with, I’ll start by describing my motivation for the project and some of the background information about the protocols and hardware involved.  If you’re already familiar with this stuff and just want to see the hack, jump to the next section.

I have a somewhat unusual audio setup at home.  I use DAW (digital audio workstation) software on my desktop computer as a digital mixer for all of the sound coming from it.  Using JACK on Linux, I route the output of each program to a different mixer channel, so in addition to having different volume settings for each program, I can apply effects as well (such as equalization, or a little bit of compression when watching a movie late at night so the loud parts aren’t quite so loud).  I can then route the audio between multiple outputs, primarily my studio monitors and my headphone amplifier.

The sound card I use is an RME Digi9652.  These are older PCI cards, which are now inexpensively available second-hand since newer computers have mostly PCI-e slots instead.  But, the card still works on my motherboard, has great Linux support, and provides 26 inputs and 26 outputs with very low latency.  Like many multichannel audio cards, all of the I/O is digital.  The 9652 has three pairs of ADAT Lightpipe ports and one pair of coaxial S/PDIF connectors.  In order to get analog audio in and out, it requires the use of external converters connected to the ADAT ports.

ADAT Lightpipe is a protocol developed by Alesis in the early 90s for their digital multitrack tape recorders (the Alesis Digital Audio Tape, or ADAT).  It uses the same plastic fiber optic cables and connectors as consumer TOSLINK connections, but carries 8 channels of audio rather than a stereo pair.  The ADAT tape deck is now pretty much obsolete, as most digital audio recording is now done with computers or hard-disk-based recorders, but the optical interface is still around, often just called “ADAT” now.  The relatively low complexity of its implementation and the availability of inexpensive LED transceivers have established it as the de facto standard for connecting low-channel-count digital audio devices together.

Previously in my home setup, I wasn’t using any of the inputs, and ran 4 channels of output—two to my headphone amplifier and two to my monitors.  I was using another board that I built (which I might write about later) to take one of the ADAT outputs and split it into multiple 2-channel S/PDIF outputs, one of which went to a Benchmark DAC1 driving my headphone amplifier and the other to my AMB γ2 driving the monitors.

I wanted, however, to be able to include some of my other audio devices in the same setup as well.  One of these devices is my Netflix player; I run Linux on my desktop which Netflix doesn’t support, so it’s easier just to use hardware where it’s supported.  Basically, I wanted a few analog inputs that I could plug other devices like this into in order to get their audio into my mixer, so it could be processed and routed like the audio coming from software running on my computer.

The Hack

The off-the-shelf solution to my problem would have been to just buy an 8-channel A/D and D/A converter box with ADAT I/O.  These tend to start somewhere around $500 for a cheap one, though, and I didn’t want to spend that much.  They also tend to hold more of their value second-hand, as ADAT is still widely used, so it’s harder to find cheap used ones.  But, there was something interesting in my junk pile that ended up being the solution.

layla

This thing is the converter box for an audio interface that Echo made in the late 90s, called the Layla.  There were two other audio interfaces in the Event series, named Darla and Gina, with fewer channels—the Layla was the top of the series, with 10 analog outputs and 8 analog inputs.  It connected to a PCI card in the computer via a proprietary 25-pin umbilical cable.  I came across this interface box being thrown away, with the PCI card long gone.  By itself, it’s practically worthless, as the 25-pin interface is not at all standard.  I originally grabbed it intending to salvage some of the parts out of it and maybe re-use the case (empty rackmount cases are expensive!)

But my current need for audio I/O had me looking at it again.  I decided to replace the DB-25 connector with an ADAT interface, which would let me connect it to my RME card.  I’ll jump to the end result and show it working before I go into detail on how it was done:

back_plugged

That black plate in the middle is the only indication that it’s been modified.  It covers up the spot where the original DB-25 connector was, and replaces it with a pair of optical connectors and a mini USB port for configuration.  On the left side are the analog inputs and outputs; these are all functional except for outputs 9 and 10 (as ADAT only carries 8 channels).  The wordclock I/O is also functional, though the S/PDIF and MIDI are not.

To accomplish this, I essentially re-used the whole analog section and the converters themselves, while re-wiring and replacing most of the digital electronics.

Inside

interior

Inside the Layla box (ignore the modifications for now), the analog inputs are the section on the left, and the outputs are immediately to the right of the inputs.  Both directions use Crystal Semiconductor (now Cirrus Logic) converter chips, with two channels of audio handled by each chip.  On the input side, there’s an MC33079 op amp acting as a balanced line receiver, a CS3310 digital volume control IC, and another op amp buffering the inputs to the CS5335 analog-to-digital converters.  The CS3310s are controlled over a serial (SPI) bus, and allow the input gain to be set anywhere from -95.5dB up to +31.5dB, in 0.5dB increments.
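If I’m reading the CS3310 coding right, each channel’s gain is a single byte where 0 is mute, 0xC0 is unity, and every count is 0.5dB (which lines up with the -95.5 to +31.5dB range above), so turning a dB setting into the byte to clock out over SPI is just:

    # Gain byte for one CS3310 channel, assuming 0 = mute, 0xC0 = 0dB, and
    # 0.5dB per count (so 0x01 = -95.5dB and 0xFF = +31.5dB).  Worth checking
    # against the datasheet before trusting it.
    def cs3310_code(gain_db=None):
        if gain_db is None:
            return 0x00                        # mute
        code = round(gain_db * 2) + 0xC0
        if not 0x01 <= code <= 0xFF:
            raise ValueError("gain must be between -95.5 and +31.5 dB")
        return code

    # Each CS3310 handles two channels, so two of these bytes get clocked in
    # over the shared SPI bus while that chip's chip-select line is asserted.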

On the outputs, the CS4327 converters feed into more MC33079 op amps that act as the balanced line drivers.  Each pair of channels also has a quad CMOS switch, with two switches allocated to each channel.  One disconnects the converters from the line drivers, acting as a hardware mute.  The other switches an extra resistor into the feedback loop of the op amps, dropping the output levels from +4dBu to -10dBV.

To the right of the outputs is the power supply, which creates the ±15V analog voltage supply rails for the op amps, and the +5V analog/digital supply for the converters and digital volume controls.  The big red, black, and yellow wires are actually how it came from the factory—originally, Echo intended to power the interface box from the PCI card (you can see where the 25-pin connector used to be below the power supply, by the way) but they ended up adding the dedicated power supply stuck on to the right side of the board after being unsatisfied with the performance of the device when it was running on noisy computer power at the end of a long cable that also carried high-speed digital signals. [1]

To the right of the power supply, there used to be an FPGA that coordinated all of the converters and sent the data back over the 25-pin cable.  In the picture above, I’ve removed the original FPGA completely.

The modification consists of two main parts: the ADAT interface itself, and a microcontroller to manage the digital controls like the input and output gains.

ADAT Interface

adat_board

ADAT Lightpipe was covered by a few patents that Alesis owned, which meant paying license fees to include it as an interface.  As a result, there aren’t a lot of off-the-shelf chips that handle ADAT.  (This is now changing—best I can tell (though I am not a lawyer) the patents have recently expired, and implementations of the ADAT protocol are now showing up as IP blocks that can run on FPGAs and some microcontrollers.)  For this project, I’ve used the Wavefront chipset.  Wavefront Semiconductor manufactured a couple of Alesis’s custom chips, including the AL1401AG and AL1402G ADAT interface chips.  These are presumably the chips that were inside the actual ADAT decks (though I’ve never taken one apart to confirm that).

The AL1402G takes an ADAT bitstream and decodes it into 4 channels of serial digital audio, with each data line carrying interleaved data for two audio channels.  It also recovers a system clock (256*Fs, where Fs is the sample rate), a bit clock (64*Fs) that indicates when the individual bits of audio data should be latched, and a word clock (Fs).  The AL1401AG goes the other direction, taking the serial audio signals and word clock and producing an ADAT bitstream.

digital_audio

To add the ADAT chipset, I started by gluing a 20-pin header to an open space on the board.  I glue the headers down first with superglue, and add some 2-part epoxy later to really make sure that they stay put.  This provides a point from which signals on the board can be wired into an additional PCB that can be easily disconnected for servicing.

The signals are then wired up using 30-gauge wire-wrapping wire.  I’ve roughly color-coded data lines as yellow, clocks as white, power as red, and ground as blue.  The various signals were located by referencing the datasheets for the converter chips and looking for continuity between the pins on the chip and easier places to solder to.  The A/D converters have resistors on most of their signals, which have nice big pads to which to solder wires.  The D/As didn’t have the resistors on the data lines, so I’ve scraped away a bit of the soldermask on the traces and soldered the wires to the exposed copper.  The clock lines are shared between all of the converters on the board, so the clock signals just connect in one location and then run through the traces on the PCB to everything else.  I did cut off the part of the traces where they used to run to the FPGA to keep them from acting as big antennas.

There are a couple of different formats for serial digital audio—they all use a system clock, bit clock, and word clock (sometimes also called the LR clock, because its value toggles between the left and right channels), and all generally transmit 32 bits of data for each channel.  Most audio converters are 16, 20, or 24 bits, however, so there’s variation in which of those 32 bits carry the actual data.  The two most common formats are left-justified (where, for a 24-bit converter, the first 24 bits are used and the last 8 are ignored) and I²S, which is like left-justified except that the first bit of the channel comes one bit clock cycle after the word clock changes, rather than at the same time.  Right-justified formats are sometimes used as well.  To account for all of these formats, most chips with a serial digital audio interface have a couple of pins that can be set high or low to select which format to use.  This is true of both the converters in the Layla and the Wavefront chips.

Unfortunately, the Layla PCB is hardwired to configure the converters for I²S, which is a format the Wavefront chips don’t support.  This required lifting the configuration pins off of their pads on the PCB and running short wires to connect them to either power or ground.  With the D/A converters, it was only necessary to change one of the configuration pins, which you can see as the short blue wires next to the chips.  On the A/D side, both configuration pins needed to change, so there’s both a red and a blue wire.  Wiring-wise, this was the most tedious part of the modification.
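Just to make the difference between the two formats concrete, here’s where a 24-bit sample ends up within its 32-bit channel slot in each case (an illustration of the bit positions only, not anything pulled from the actual hardware):

    # Bit positions of a 24-bit sample within its 32-bit channel slot, counting
    # bit 31 as the first bit clock period after the word clock edge.
    def left_justified(sample):
        # MSB lines up with the word clock edge; the last 8 bits go unused.
        return (sample & 0xFFFFFF) << 8        # occupies bits 31..8

    def i2s(sample):
        # Same data, but the MSB arrives one bit clock after the edge.
        return (sample & 0xFFFFFF) << 7        # occupies bits 30..7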

The actual board that carries the ADAT chipset is made from a single-sided PCB, with the spaces between traces milled out with a 1/64″ endmill.  A 20-pin through-hole connector on the back side of the board mates with the 20-pin header epoxied to the original board.  The ADAT transceivers connect via the 6-pin header on the top side of the board, which will be discussed later in this article.

Next to the ADAT board, you can also see an additional 7805 regulator that I’ve added to the main PCB.  The existing 5-volt supply is already pretty heavily loaded by the converters themselves, and since it’s also used by the analog side of the converters, I didn’t want to put too much more digital circuitry on it.  The additional 7805 is a separate supply that runs the circuitry I’ve added.  The tab is soldered to a pad for an electrolytic capacitor that was unpopulated on the original board.

The Controller

control

The original Layla came with software that allowed the user to adjust the input and output gains.  This was mediated by the FPGA, which I removed.  To replace that functionality, I added another 20-pin header and a small PCB that drives the control signals.  I used an ATmega32U2 microcontroller, which has just enough I/O pins for the task.  The 32U2 also has a USB interface, which provides a way to change the settings from a computer.

All of the digital control lines conveniently come out to resistor packs near where the FPGA used to be.  There are 4 chip select lines for the 4 digital volume ICs, as well as shared clock and data lines for their SPI bus.  Then there are 9 digital lines that go to the CMOS switches on the output—8 switch the individual outputs between +4dBu and -10dBV levels, and the ninth is the mute signal, which applies to all channels at once.  The controller board connects to these lines via the epoxied header, and the USB port on the back panel connects through another 6-pin header on the top side of the board.

The USB port, when connected, shows up as a vendor-specific USB device, for which I’ve written a small command-line tool in Python that allows changing the gains.  The actual gain settings are stored in the microcontroller’s EEPROM, so they are retained across power cycles.
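The host side is nothing fancy; a stripped-down sketch of the idea using pyusb looks something like this (the vendor/product IDs, request number, and value encoding here are placeholders rather than the actual protocol my firmware speaks):

    # Sketch of the host-side tool using pyusb.  The VID/PID, request number,
    # and wValue encoding are placeholders, not the real protocol.
    import usb.core

    SET_GAIN = 0x01                            # hypothetical vendor request

    dev = usb.core.find(idVendor=0x1234, idProduct=0x5678)
    if dev is None:
        raise SystemExit("Layla controller not found")

    def set_gain(channel, gain_db):
        # 0x40 = vendor-specific request, host-to-device, recipient = device
        wValue = round(gain_db * 2) & 0xFFFF   # 0.5dB steps
        dev.ctrl_transfer(0x40, SET_GAIN, wValue, channel)

    set_gain(0, -12.0)                         # e.g. pad input 1 down by 12dB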

The error signal from the ADAT receiver is also wired to the microcontroller, which drives the mute line when no ADAT signal or an invalid ADAT signal is present.  This prevents horrible noises from going to the outputs when the ADAT input is unplugged.  The D/A converters don’t have any kind of built-in muting feature, and have a tendency to produce very loud squealing noises if the clock inputs are invalid.  The automatic muting keeps that noise from going to my speakers.

The Rear Panel

back

The actual connectors are mounted on the back of the case on an aluminum plate.  The optical transmitter and receiver were salvaged from a lightning-damaged AudioBox 1818VSL (long story.)  There’s also an LED on the back panel that lights up when a valid ADAT bitstream is present on the input, and the USB port for controlling the gains.

The panel is waterjet-cut from 1/16″ aluminum and somewhat hastily spraypainted matte black.  A small PCB with the I/O connectors screws on with a pair of angle brackets.

io_solder

The optical transmitter and receiver are through-hole parts, so they end up on the other side of the single-sided copper board with the LED and brackets:

io_component

And then everything gets connected together with 6-pin ribbon cables, through a hole milled into the back of the box where the original DB-25 connector used to be:

plate_removed

Clocking

I wanted to be able to still use the wordclock input and output on the back of the box—wordclock is always useful to have in larger audio systems.  The clock recovered from the ADAT bitstream is also sometimes more jittery than desirable.  The ADAT receiver chip supports a wordclock input as well.

The clock configuration isn’t as cleanly done in this hack at this point, mainly because I didn’t have a digital mux in my parts bin that would have enabled the microcontroller to select the clock source.  Instead, the ADAT receiver can be switched between clock modes with the little slide switch on the side of the ADAT board, and the buffered signal from the BNC jack is enabled with this jumper on a header (also glued to the original PCB):

clock

So, currently, changing the clock source requires opening up the case.  I might at some point get around to adding that mux so it’s software selectable.

The wordclock output connector is always active, and will output whatever clock source the board is currently using.

Conclusion

In the end, it works great!  It ended up being a bit more work than I was expecting (especially discovering that the config pins on the converters were going to need to be lifted from the board to change the data format), but it’s always nice to recycle something useless into something functional.  The entire hack took a couple of days, starting with probing for the signals, adding the headers and soldering all of the jumper wires, making the PCBs, debugging, testing, and finally writing the software for the microcontroller (which I’m still tweaking a little bit).

The converters are old—they are only 20 bits, whereas most newer pro audio gear is 24-bit, but 20 bits is fine for most purposes.  The analog section is actually pretty well done, and I’m pleased with how it sounds.  I haven’t tried it for any serious recording work (yet), but the outputs sound very satisfying on my monitors, which frees up my gamma 2 DAC for use elsewhere.

Lighting Control Boards

I designed these boards to be integrated into 12VDC track lighting fixtures with MR16 LED lamps in the Media Lab atrium.  They are based on the Atmel XMega A4 series (originally designed for the ATxmega32A4, which is what’s in the atrium lighting installation, but forward-compatible with the A4U series chips; most of my current uses for this board use the ATxmega128A4U) and the AT86RF231 radio (though the RF230 and newer variants like the RF233 should also be usable).

mr16board

The radio is pinned out to the SPI interface on port C, and 3 LEDs along the edge of the board are pinned out to port D.  Dimming the MR16 lamps is accomplished by PWM control of a low-side N-channel MOSFET, connected to a PWM output compare unit on port E.  All of the other GPIO pins are broken out to 100-mil headers on the board, which includes all of ports A, B, and E, and part of ports C and D.  22 total GPIO pins are broken out.

The left half of the board contains the power supply circuitry and the power MOSFET.  The bridge rectifier on the power input allows for the supply to be connected in either polarity, or to AC power.  A linear regulator cheaply (but inefficiently) drops the supply voltage to the 3.3V needed by the logic on the board.  If the power supply circuitry and MOSFET are not needed, the entire left half of the board may be cut off and 3.3V fed directly to the 100-mil headers.
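For a rough sense of the inefficiency (assuming something like 30mA of draw for the micro and radio, which is a guess rather than a measurement of this board):

    # Back-of-the-envelope linear regulator dissipation, ignoring the drop
    # across the bridge rectifier.  The 30mA load is an assumed figure.
    v_in, v_out, i_load = 12.0, 3.3, 0.030
    p_regulator = (v_in - v_out) * i_load      # ~0.26W burned as heat
    efficiency = v_out / v_in                  # ~28%, regardless of load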

The RF antenna can be attached via an SMA connector, but in low-cost applications a 25mm wire (quarter-wave antenna) works very well.

The removability of the power supply circuitry and the extensive I/O breakout make this board useful as a general-purpose Xmega wireless development board, and I’ve indeed reused it in several other wireless projects.

I have adapted my basic extension of Atmel’s 802.15.4 MAC/transceiver toolkit (which is now quite dated and contains unnecessary workarounds for AT86RF230 revision A silicon errata) to this board.  Most of my recent development, however, uses the Atmel Lightweight Mesh protocol, to which I’ve added HAL support for the A4 series chips and board support for this board.  (Hg repository: http://simonetti.media.mit.edu/hg/lwm).

MR16 Board Schematics are available.