My adventures with a Raspberry Pi and Arduino programming

AJ Arduino pin usage

I need to ensure I have enough oomph in my hardware. For this I need to know that I have both enough pins for all my hardware, and enough power for my application.

Pins in use on my Arduino Pro Mini

  • 0 Serial RX – SD card
  • 1 Serial TX – SD card
  • 2 INT0 – Software serial radio RX
  • 3 INT1 PWM – Ascent/descent
  • 4 T0 – Software serial radio TX
  • 5 T1 PWM – Forward / aft
  • 6 AIN0 PWM – Turn port/starboard
  • 7 AIN1 – free
  • 8 CLK0 – free
  • 9 OC1A PWM – free
  • 10 SPI SS PWM – free
  • 11 SPI MOSI PWM – free
  • 12 SPI MISO – free
  • 13 SPI SCK LED – free
  • 14 A0 ADC0 – free
  • 15 A1 ADC1 – free
  • 16 A2 ADC2 – free
  • 17 A3 ADC3 – free
  • 18 A4 ADC4 I2C SDA – I2C rangefinders x 6, 9 DoF sensor (accelerometer, gyro, magnetometer)
  • 19 A5 ADC5 I2C SCL – I2C rangefinders x 6, 9 DoF sensor (accelerometer, gyro, magnetometer)
  • A6 ADC6 Analog only – charging current
  • A7 ADC7 Analog only – low voltage

Power usage

Item | Min | Typ | Max
Sharp 10cm-80cm sensors (6) | ? | 30 mA × 6 | 40 mA
SparkFun I2C rangefinders (6) | 0.3 mA | 1.7 mA × 6 | ?
9 DoF sensor – accelerometer | | 0.4 mA (10Hz) |
9 DoF sensor – gyro | | 6.5 mA |
9 DoF sensor – magnetometer | | 0.1 mA |
Arduino (high power mode) | 0.5 mA | 4.13 mA | 4.13 mA
Current monitor | | 0 (on Arduino) |
Low voltage monitor | ? | TBD |
SD card reader/writer | 2 mA | 5 mA | 6 mA
Radio tx/rx (BLE Nano) | 4.7 mA | 8.0 mA | 11.8 mA
Copter circuitry itself (max) | ? | 3258 mA (3.3A) | ?
Totals | | Electronics: 214.33 mA; Copter: 3258 mA; Total: 3473 mA |

The above default power usage stats with a modified 2000mAh/3.7V battery give a fully operating flight time of at least 34.5 minutes.

We can probably halve the electronics’ power usage, but this would only get us to 35.6 minutes flight time – so what’s the point? We will have to test the actual flying at low speed to see how long the copter will fly itself around. I’m guessing the actual flight time will end up nearer an hour with the bigger 2000mAh battery I’ll be using.
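For the curious, the arithmetic behind those two figures is just battery capacity divided by current draw. Here it is as a throwaway snippet using the totals from the table above:

 // Rough endurance estimate from the power table above.
 const float batteryCapacity_mAh = 2000.0;  // modified 2000mAh/3.7V pack
 const float electronics_mA      = 214.33;  // electronics total (typ)
 const float copter_mA           = 3258.0;  // copter circuitry (max)

 float minutesOfFlight(float load_mA) {
   return (batteryCapacity_mAh / load_mA) * 60.0;  // mAh / mA = hours, x 60 = minutes
 }

 // minutesOfFlight(copter_mA + electronics_mA)        -> the ~34.5 minute figure above
 // minutesOfFlight(copter_mA + electronics_mA / 2.0)  -> the ~35.6 minute figure above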

Eventually, if the bot ends up performing tasks without the rotors on (observation from a static point perhaps?), then looking at power savings in the electronics begins to make sense. As a bare minimum I'll shut the sensors off when the bot is stationary (landed).

Why an Airbot?

I love little helicopters and quadcopters. Never owned a Quad though, so thought I’d get a small one. It’s great to fly aircraft, but even better to programme them and watch them fly themselves! I wanted to create something that responded to basic stimuli and would work indoors – perfect for experimenting with aviation.

I love the idea of those little bugs you find in gadget stores – they jiggle around, follow the edges of a cage, and keep walking around. They're simple but interesting as an academic exercise. I wanted to do something a little like this, but with a quadcopter… I love aircraft!

Where to start?

I figured I'd start with a very small nano quadcopter. This also means I can run everything at Arduino system voltage (around 3.3V), making the electronics easy. I've just bought a Hubsan X3 mini (7cm diameter) quadcopter. This can be used indoors or outdoors. I don't want to re-invent the (very complex) wheel of flight controls – so I intend to retrofit this quad. I'll let the onboard electronics worry about stability – all my code is going to do is say hover, up, down, forward, back, turn – just like the old Logo robots from school!

Controlling the aircraft

This does imply a few things though. Firstly I need to interject an Arduino into the lines from the control receiver to the flight control unit – so the Arduino is saying forward/backward, not me. Shouldn't be too hard (!)
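As a very rough sketch of what that interception might look like, and assuming for now that the receiver-to-flight-controller lines carry standard 1000-2000µs RC-style pulses (something I haven't verified on the Hubsan board yet), the Arduino would read the incoming channel and write its own pulse out on the pins I've reserved above (3 ascent/descent, 5 forward/aft, 6 turn). The input pin and the neutral values here are placeholders:

 #include <Servo.h>

 // Channel output pins, matching my pin list (3 = ascent/descent, 5 = forward/aft, 6 = turn).
 const int ASCENT_PIN   = 3;
 const int FORWARD_PIN  = 5;
 const int TURN_PIN     = 6;
 const int RX_ASCENT_IN = 7;   // hypothetical line tapped from the control receiver

 Servo ascentOut, forwardOut, turnOut;

 void setup() {
   pinMode(RX_ASCENT_IN, INPUT);
   ascentOut.attach(ASCENT_PIN);
   forwardOut.attach(FORWARD_PIN);
   turnOut.attach(TURN_PIN);
 }

 void loop() {
   // Read the pilot's pulse on one channel (1000-2000us, if it really is standard RC PWM)...
   unsigned long pilotPulse = pulseIn(RX_ASCENT_IN, HIGH, 25000);

   // ...and for now just pass it straight through. Later, the autonomy code
   // decides the value written here instead of the human.
   ascentOut.writeMicroseconds(pilotPulse > 0 ? pilotPulse : 1500);
   forwardOut.writeMicroseconds(1500);   // neutral forward/aft
   turnOut.writeMicroseconds(1500);      // neutral turn
 }

Of course the receiver and flight controller may not expose anything like RC pulses at all, so step one is putting a scope on those lines before writing any more of this.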

After this I need to get basic controls working. I need the Arduino to 'know' it has taken off, how high it is, and how to land again. The same goes for any movement. Thus the Arduino needs to know its acceleration along the x, y and z axes. This is something the flight controller may be doing anyway – if I can intercept these sensor readings then great, else I'll have to add some more sensors for this.

Position

To know my height I need to know my acceleration over a period of time along the z axis. The same goes for x and y position. This means a basic inertial navigation system (INS) needs to be developed. This will help me find my landing pad again, and let me 'learn' how to navigate around rooms. It basically provides a way of remembering the walls in a room too.
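The maths is simple in principle – integrate acceleration into velocity, then velocity into position – though accelerometer drift means it will need constant correction in practice. A minimal single-axis sketch, where readZAcceleration() is a stand-in for whatever the 9 DoF sensor's library actually provides:

 // Crude single-axis dead reckoning: integrate acceleration twice.
 float velocityZ = 0.0;          // m/s
 float positionZ = 0.0;          // metres above the take-off point
 unsigned long lastSampleMs = 0;

 float readZAcceleration() {
   // TODO: replace with the real 9 DoF accelerometer read (gravity already subtracted).
   return 0.0;
 }

 void updateInsZ() {
   unsigned long now = millis();
   float dt = (now - lastSampleMs) / 1000.0;   // seconds since the last sample
   lastSampleMs = now;

   float accelZ = readZAcceleration();         // m/s^2
   velocityZ += accelZ * dt;                   // first integration: velocity
   positionZ += velocityZ * dt;                // second integration: height
 }

Repeating the same for x and y gives the crude position estimate the room-learning code needs.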

Eventually I’ll want to add some IR distance sensors for fine input, and to sense obstacles, but initially knowing how far above the landing zone I am will suffice.

Learning about the environment

Flies respond simply to external stimuli. This is why they keep head-butting windows. I want something simple, but not that simple. I could create a 1cm grid of an entire house, but this would be expensive in storage terms. My bot is 7cm in diameter. My likely proximity sensors have a 10cm minimum distance. If I assume a 10cm x 10cm x 10cm cube around my airbot then I can safely map a grid at 15cm resolution.

I can provide a 10cm safety zone around the aircraft – e.g. 10cm below, 7cm for the aircraft, and 10cm above. This is 27cm, so a point (a wall, say) found to be within 30cm means 'I can't fit there', whereas a gap means 'I can fit there'. So approximating a room into 15cm increments provides basic navigation. My front room where I'm typing now is approx 4m x 5m x 2.2m – which means approx 27 x 34 x 15 data points, so 13770 data points, each of 1 bit, so about 1.68 KB of data. Not too bad at all. I could even simplify this eventually by storing planes for walls with four lines for boundaries, dropping the storage required significantly.
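Here's roughly how that bit-per-cell grid might be packed and indexed (the dimensions are the front-room figures above). One caveat: 1.7 KB is a tight squeeze next to the Pro Mini's 2 KB of SRAM, so in practice the full grid may have to live on the SD card or be held a slice at a time:

 // 15cm grid of my front room: 27 x 34 x 15 cells, one bit each.
 const int GRID_X = 27;
 const int GRID_Y = 34;
 const int GRID_Z = 15;
 uint8_t grid[(GRID_X * GRID_Y * GRID_Z + 7) / 8];   // 13770 bits -> 1722 bytes

 int cellIndex(int x, int y, int z) {
   return (z * GRID_Y + y) * GRID_X + x;             // flatten 3D -> 1D
 }

 void markOccupied(int x, int y, int z) {
   int i = cellIndex(x, y, z);
   grid[i / 8] |= (1 << (i % 8));                    // set the cell's bit
 }

 bool isOccupied(int x, int y, int z) {
   int i = cellIndex(x, y, z);
   return grid[i / 8] & (1 << (i % 8));
 }

markOccupied() and isOccupied() then give the flight planner a cheap 'will I fit there?' test for any cell.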

Autonomous flight

Being able to fly itself also implies it must manage its own health. For airbots this means battery usage. I'll use a SparkFun solar buddy and a panel to charge the unit. I'll also use the trick mentioned on its tutorial page to measure the current flow to the battery (and thus whether it is not yet full), and if I can, a 'low voltage' warning too (nearly empty).

I can use this information as ‘fear’ stimuli – in this case to find a location to sit still that provides a high charge rate – i.e. somewhere to sit in the sun. Just like my Labrador does (although for different reasons!)

Other ‘health’ and ‘fear’ indicators will also be useful. Houses are full of things that move. Thus not only will my room learning code need to figure out what is fixed versus what is moveable (accomplished by ‘number of observations’ of ‘wall here’ at a particular location), but it will also need a short term ‘collision avoidance’ mechanism.

The Science Museum has some simple toys that do this. They hover and are moved by you bringing your hands near them, and they steer themselves clear. I’ll do something similar. Useful in particular for doorways, labradors, and crap sitting in my office.

Behaviour and flight planning

As you can tell from the above, the bot will need several types of behaviour. Here's a typical flight (a rough code sketch of the state machinery follows the list):-

  1. AJ is stationary in the ‘idle’ mode on a desk.
  2. AJ decides “I’m bored” and chooses to ‘take off’ (transition from idle to flying states)
  3. Once this manoeuvre is complete, AJ randomly chooses to ‘explore’ (transition from flying to exploring states)
    1. The explore state’s ‘planning’ step is invoked. This chooses a direction to go in that does not interfere with known ‘local’ hazards (like known walls, desks, etc.)
    2. This pushes a 'turn', then a 'forward', and an 'ascent/descent rate' action on to the activity list (in this order)
    3. This action also then pushes the ‘navigate near to location’ action on to the list (at the end)
    4. When nearby, the final action ‘hover’ is invoked.
    5. The action list is now empty, so explore’s transition chooser determines what to do next.
  4. How 'adventurous' AJ has been told to be determines what percentage chance AJ has of continuing to explore (i.e. transitioning to the same state). Say it's 80%, and for our purposes he chooses to explore again
    1. We invoke ‘planning’, which in turn adds a turn, forward, ascent/descent rate, navigate near to, and hover command on to the action plan list
    2. Whilst in the navigate to action a sensor determines that the target is 40cm away, but a wall is 30cm away.
    3. This basic instinct causes an 'avoid' flight action to be executed, and a new 'explore new boundary' phase to be entered into. (This may or may not 'throw away' the original explore command – probably easier if it does)
      1. This implies that an ‘explore space’ state with an ‘avoid’ instinct stimuli means we transition to an ‘explore new boundary’ state
  5. Exploring a boundary. Here we need to determine what line the boundary follows – at the moment we only have a point on it observed.
    1. We first 'observe' the boundary to determine if it is static. If not, either the 'avoid' stimulus will fire again (no problem), or the object will move out of the way, in which case we transition back to the explore state originally found.
    2. If, however, we're stationary in the hover and the obstacle doesn't move, we 'peek' left and right, using the direction calculator and INS readings to determine the likely line of the 'wall' (or other obstacle). We then plot a flight plan along this obstacle (turn, forward (slowly?), and no ascent/descent)
    3. As we move along the wall we record that it is still there. If we find an obstacle in the direction of flight, we log a new obstacle and follow that boundary again.
    4. If we don’t then after a while we’ll be in an extended line far away from ‘known space’. The ‘center of gravity’ of the bots nearest known positions should increase our ‘fear’. If this becomes high, then we should increase or decrease altitude and move back along our obstacle (hover, turn, ascend, hover, forward)
    5. If we encounter somewhere we’ve been before, again we’ll change height and move back along the obstacle. This time we’ll get a little further because there are more points in the extension, and thus our fear allows us to move further. Thus over time we’ll incrementally explore our environment.
      1. This ‘fear of the unknown space’ should be a basic instinct – if it hits a certain amount we should ‘return to base’ – e.g. if I’ve accidentally ended up outside – we should retrace our steps to the ‘last known safe location’ Failing this, navigate home directly.
  6. Eventually our ‘low battery’/hungry stimuli will fire, causing us to find a known nice charging point (a place we’ve had high recharge at before) or return to home (last resort)
  7. A ‘tiredness’ stimuli may fire after charging even, and cause the bot to return home. Much like my labrador does after eating, drinking after a long walk – he goes and lays down.

Long distance flight planning

Rooms are weird shapes. As are cities and the ground. This is why aviation has developed flight lanes. An aircraft moves to a flight lane to go beyond a local distance. By remembering 'navigation friendly' intersecting boxes we can provide a safe and finite way to plan routes that take into account complex urban and in-house environments. This will effectively map the centre of a room and its doorways. A simple cuboid we're allowed to fly into.

This means ‘take me home, I’m scared!’ can allow me to fly back to me last safe position, then to my last safe navigation. From here I can use navigations to plan a route to the safe navigation nearest my home position (and without a wall in between!).

This is happily a simple directed graph problem to solve. I count the minimum distance between the centre points of navigations that intersect. So when entering room A through doorway 1 and evaluating the routes via doorway 2, I count the 'cost' of the manoeuvre by calculating the distance between the centre point of Room A's intersection with doorway 1 and Room A's intersection with doorway 2. Easy planar mathematics using Pythagoras.

I can even weight route choices by the quality of a navigation – its volume. A higher volume is more like a highway. These should be preferred routes. It also means a likely outcome is a 'shortcut' – going out the kitchen window, up the side of the building, and in through the office window in my house rather than flying up the stairs!
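A quick sketch of how I picture the 'navigation' boxes and that Pythagorean edge cost. The structures and names here are a first guess, and the route search itself is then a standard shortest-path walk over these costs:

 // A 'navigation': an axis-aligned box we're allowed to fly inside.
 struct NavBox {
   float minX, minY, minZ;
   float maxX, maxY, maxZ;
 };

 struct Point { float x, y, z; };

 // Centre of the overlap between two intersecting navigations
 // (e.g. the doorway shared by a room box and a hallway box).
 Point intersectionCentre(const NavBox &a, const NavBox &b) {
   Point p;
   p.x = (max(a.minX, b.minX) + min(a.maxX, b.maxX)) / 2.0;
   p.y = (max(a.minY, b.minY) + min(a.maxY, b.maxY)) / 2.0;
   p.z = (max(a.minZ, b.minZ) + min(a.maxZ, b.maxZ)) / 2.0;
   return p;
 }

 // Cost of crossing a room between doorway 1 and doorway 2: straight-line distance.
 float edgeCost(const Point &door1, const Point &door2) {
   float dx = door2.x - door1.x;
   float dy = door2.y - door1.y;
   float dz = door2.z - door1.z;
   return sqrt(dx * dx + dy * dy + dz * dz);   // Pythagoras
 }

Running Dijkstra (or even a plain breadth-first search weighted by these costs) over the doorway graph then gives the route home.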

Memory

As we’re flying about we should try and remember things we find along the way. This should include areas where temporary obstacles aren’t found that often (like people, dogs, etc.). Knowing this provides the ability to find ‘rest spots’ or even ‘hide holes’. Useful in an urban environment.

Remembering temperature is also useful. A cold temperature means poor battery life, whereas a warm temperature means overheating. Each of these things can have its own memory, or feed in to a general ‘fear factor’ of particular locations. Fearing the dog’s room for example is quite useful.

Basics complete

Once the above is done we will have an autonomous super-fly that is capable of seeing to its own dietary and survival needs, and mastering its environment. This is a minimum viable product really, to which additional features and behaviours can be added.

Things needed

Clearly proximity, temperature, current, low voltage sensors are all needed. As is a state machine with simple static transitions mapped out. Shouldn’t be too difficult to build a super fly from a quadcopter!

What is not needed for this simple world is artificial intelligence – there is no ‘learning’ in the AI sense here – only ‘mapping’ an environment. Intelligence has an edge because it can be put in to new environments and use past experiences to help with decisions. This is not what I’m proposing. I’m proposing learning one fixed environment provided by the bot’s creator. Hence no neural nets or anything else. There’s nothing stopping you adding those instead of a state machine though.

Future extensions

To turn a fly into a useful 'pet' or 'companion', a further level of behaviours (not necessarily intelligence) is needed: the ability to recognise people and communicate with them – LEDs and a simple sound generator, and the ability to listen for its own name and commands, much like a dog does.

Some commands are quite easy and desirable:-

  • Bedtime – sends to sleep when you’re working on a project / conference call
  • Go to ‘x’ – a named navigation space? remember 10 ‘words’ heard when in a particular navigation space perhaps?
  • Follow me – watch me (person speaking) and follow on
  • Watch me – record video for posterity, doesn’t necessarily imply follow me
  • Stay – hold position, but not necessarily attitude (so stay and watch me can be combined)
  • Find Person X – navigate through navigations to within voice print of a particular person. Perhaps combined with last known location of that person’s voice, or recent locations you’ve heard them
  • Come – come near the speaker
  • AJ – pre-command authentication – only responds to top 5 people it’s been near
  • Go with X – adds person X to ‘owner’ list temporarily (like going with an auntie for a day)
  • Lay down – land at current position
  • move left/right/forward/back – fine positioning before landing.
  • AJ you’re at ‘environment x’ in ‘navigation y’ – when first activated, so able to come with you to new places, e.g. climbing crags.

I purchased a SparkFun Thumb Joystick and have just gotten around to making it work. Works a treat. Gives two analog outputs – one vertical, one horizontal – and a digital output for the stick being depressed like a button.

I’m now creating a library to manage a Menu system using the joystick. The menu system will support 16×2 monochrome parallel LCD screens, like the one I just got working.

The idea being you define different menu levels with options, and child menu items. Selection can be enabled or disabled. Selecting a parent menu item shows the children, moving left and right to highlight an item, depressing the button to select.

This will be a generic menu system that will have inbuilt support for the thumb joystick, but will also allow integration with other input systems.
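For the record, here's the shape the library is taking: a small tree of menu items plus a thin layer that turns the joystick's analog readings into left/right/select events. The pin numbers and thresholds below are placeholders from my breadboard, not anything final:

 // One node in the menu tree.
 struct MenuItem {
   const char *label;
   MenuItem   *children;     // NULL for a leaf
   int         childCount;
   bool        enabled;      // selection can be enabled or disabled
 };

 // Placeholder wiring for the SparkFun thumb joystick.
 const int JOY_X_PIN      = A0;   // horizontal analog output
 const int JOY_Y_PIN      = A1;   // vertical analog output
 const int JOY_BUTTON_PIN = 7;    // stick pressed down = select

 void setupJoystick() {
   pinMode(JOY_BUTTON_PIN, INPUT_PULLUP);   // button pulls the line low when pressed
 }

 int readHorizontal() {
   int v = analogRead(JOY_X_PIN);           // 0..1023, roughly 512 at rest
   if (v < 300) return -1;                  // pushed left
   if (v > 700) return  1;                  // pushed right
   return 0;                                // centred
 }

 bool selectPressed() {
   return digitalRead(JOY_BUTTON_PIN) == LOW;
 }

readHorizontal()'s twin for the vertical axis plus selectPressed() is all the menu navigation loop really needs.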

I’m going to use this on the instructor’s receiver for my D of E group radio tracker project. You’ll be able to monitor multiple teams and use the menu system to show their positions, navigate to them, or view recent status information like location, last report time, distance, speed and so on.

Once I’ve got this working I’ll post a video of an example. Wish me luck!

Finally got around to getting my display working. I had thought I had a broken display, but it turns out that although Arduino analog pins can be configured as digital pins, that won't work with standard parallel-mode LCD displays. Plugging all 6 pins into digital pins works fine.
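For anyone wiring up the same display, the standard LiquidCrystal library drives it directly once all six control and data lines are on digital pins. Something like the sketch below works; the six pin numbers are just the usual example wiring, so swap in your own:

 #include <LiquidCrystal.h>

 // RS, EN, D4, D5, D6, D7 - all on ordinary digital pins.
 LiquidCrystal lcd(12, 11, 5, 4, 3, 2);

 void setup() {
   lcd.begin(16, 2);            // 16x2 character display
   lcd.print("Hello, 16x2!");
 }

 void loop() {
 }

If text appears, the six-digital-pin wiring is good.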

Power usage stats (LCD + Arduino pro mini 3.3V):-

  • LCD backlight, no text – 3.69 mA
  • LCD backlight, 16×2 text – 3.70 mA
  • No LCD backlight, 16×2 text – 0.71 mA

So no prizes for guessing what mode I'll have it in most of the time! I'll have it off (no displayed text, no backlight).

Eye SPI Arduino…

I’ve been toying with how to handle multiple peripherals that require serial. Up until now I’ve used the Software Serial library on Arduino, and for my SD card logger I’ve resorted to using the hardware serial.

But is there a better way…

Maybe…

Let’s have a look at the comms options:-

Part | Hard Serial | Soft Serial | SPI | I2C
ArduLogger V3 | Y | NO | YES [0] | YES [0]
GPS – GP635T | Y | Y | NO | NO
GPS – UBlox 6M module | Y | Y | NO | NO [1]
3DR Radio | Y | Y | NO | NO
LCD – Hitachi 44780 | Y | Y | NO [2] | NO

[0] ArduLog software only, not the SparkFun OpenLog software. Oh and you need to modify the code yourself to add support!
[1] The compass on the UBlox does have I2C SCL and SDA lines, but not the GPS
[2] It is possible to use a register to drive the LCD. May be possible to cleverly link this to an SPI interface

Hmmm… So maybe not then… Although of course there’s nothing stopping me using interface circuitry to make all the above work, it would probably add to the number of IO lines used, not reduce them!

Speaking of IO lines used, I’m currently using these:-

  • ArduLogger – 2 (hardware serial)
  • GP635T – 3 (2x soft serial, 1x power mosfet on/off)
  • 3DR Radio – 3 (2x soft serial, 1x power mosfet on/off)
  • LCD – 6
  • TOTAL: 14 (the Arduino Pro Mini V3 I use has 20 pins that can be used for digital IO, since A6 and A7 are analog only – 4 of these are optionally SPI, and another two optionally I2C)

I’ll keep a watching brief on using SPI though – could potentially be useful, and allow me to ‘off board’ a lot of serial comms in future.

[UPDATE 26 MAY 2015]

You can buy ICs that act as an SPI to multiple-UART convertor, each UART with a 64 byte FIFO queue. The chip for two UARTs (up to 4MBit/s!) is the SC16IS752IBS, documented at http://www.nxp.com/documents/leaflet/75015676.pdf. This product also has some GPIO ports, so it could theoretically externalise all my IO (including the LCD) if I really needed to.

I’ve got a nice simple working circuit for my ArduLogger V3 board. Re-jigged it just now to minimise the number of wires. Image below for your edification:-

Ardu Logger V3 and Arduino Pro Mini

Remember too, when using a solderable breadboard (from SparkFun) of the same size, the logging SD card board will rotate 180 degrees and sit below the breadboard rather than sticking out. A nice compact logging circuit.

I’ve got some test software that simulates logging GPX (XML) format GPS based track information to the SD card every couple of seconds. Sample SD card append code is available on my GitHub page.
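The GitHub code is the definitive version, but the guts of it amount to printing GPX elements down the hardware serial line for the logger to record – something along these lines (the function name and the example coordinates are made up for illustration):

 // Write one GPX track point to whatever file the logger has open.
 void logTrackPoint(float lat, float lon, const char *isoTime) {
   Serial.print("<trkpt lat=\"");
   Serial.print(lat, 6);              // 6 decimal places
   Serial.print("\" lon=\"");
   Serial.print(lon, 6);
   Serial.print("\"><time>");
   Serial.print(isoTime);
   Serial.println("</time></trkpt>");
 }

 // e.g. every couple of seconds:
 //   logTrackPoint(51.501476, -0.140634, "2015-05-26T12:00:00Z");

Any GPX-aware tool should then open the resulting file directly.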

Connect the FTDI to the pins (on the right of the breadboard, above) to your computer, and you’re away!

BE AWARE: When programming the board, remove the ArduLogger breakout from the circuit, else it interferes with serial communication to the Arduino Pro Mini. Once reprogrammed, disconnect the FTDI cable, add the ArduLogger back in, then reconnect the FTDI (to power the circuit). Alternatively, power the breadboard directly.

Note on wires above:-

  • Blue wire links ground on the logger to the Arduino
  • yellow, orange, and grey wires take a circuitous route to link 5V with VCC on the Arduino
  • The position of the logger board next to the Arduino automatically lines up and connects TX and RX pins (and RST)

The sharp-eyed among you may have noticed the Arduino is a 3.3V 8MHz variant, whereas the power pin on the logger says 5V. You can run 3.3V through here quite happily.

Note that as previously mentioned, the ArduLogger board and the open logger software only works with Hardware Serial – software serial WILL NOT work. You have been warned.

So I’ve been struggling with wires everywhere in my projects. I’ve determined now that I suck so bad at soldering tonnes of plain wires that I need a better solution!

What I’ve done is solder the wire to female (yes, female) breakaway headers.

Idea being that male 90 degree bend headers are easiest to get, so I should be able to put these on the underside of my solderable breadboards. If I use a couple of pins in rather than the outermost ones I can even hide part of the female headers, so not taking up any extra space in my projects.

This also means that if one component is faulty I can easily replace it. I can also re-use components across projects, or plug them straight into a breadboard for prototyping.

This arrangement is shown in the image below. It shows a basic LCD circuit. The top rail (j 1-6) holds the IO pins going to the Arduino. The bottom rail shows two sets of 6 headers (only one shown plugged in for visibility).

LCD attached to breadboard via female headers

I’ve kept the same numbers on the breadboard as on the LCD (left to right pins on top of the LCD). Thus it’s easier to remember. This also allowed me to get my resistor in there too.

The blue square on the left is a small breadboard potentiometer (variable resistor). Note I’ve put the LCD backlight pin (a 16) to ground, as I’m not using the backlight.

The circuit shown was taken from a PighiXXX ABC diagram.

This now gives me a low cost and re-usable way to plug various components in and transfer them between projects. I can do the same to SMD components, like a female USB A socket, like below:-

Female USB A with headers

Pretty cool… Not bad for a few minutes work!

I’ve been playing around with a great SD card logging unit from Hobby Tronics. This is a slightly improved version (and cheaper!) of the SparkFun OpenLog unit.

Software Serial has failed me on this piece though. You absolutely must use the standard hardware serial (Serial.println() ) functions in order to not lose data. Even with large multi-second delays and very little information being transferred, the Arduino Software Serial library failed to keep up.

This is unusual, as I’ve found the Software Serial library very reliable for use with GPS and 3DR radios’ serial interfaces.

I’ve successfully used the setup() routine to specify the file I want to save to. This required a bit of hackery. To send a Ctrl+Z character from the Arduino (required for putting the OpenLog into command mode) you need to do the following in code:-

 Serial.print((char)0x1A); // Ctrl+Z - sent three times to drop OpenLog into command mode
 Serial.print((char)0x1A);
 Serial.print((char)0x1A);
 delay(1000);                      // give OpenLog time to switch modes
 Serial.println("append " + file); // tell OpenLog to append to a named file (created if it doesn't exist)
 delay(1000);
 logGpx(file, team, now);          // my customised logging setup function

Those delay() calls probably aren’t strictly required, but they don’t slow down my app either.

Note that char 0x1A is hexadecimal for ASCII character 26 (A is 10 in hex, and the 1 in the next column means 1×16, so 16 + 10 = 26). ASCII character 26 is the same as Ctrl+Z in a terminal (e.g. CoolTerm on Windows), which is the character required for entering and leaving command mode in OpenLog.

I’m using the XML based GPX format to write updates. In future I’m going to hook this up to a GPS, so it makes sense to use this format rather than NMEA as it can easily be read by online and desktop tools. You could use KML instead (Google Earth format).

I wonder how much current this little device draws?…

Power usage of just the memory card (doesn’t include the Arduino itself):-

  • During command mode, up to 12mA
  • During normal standby operation, 1.57 mA
  • During writing data, anywhere from 6.5 to 11.5 mA, averaging approx 7.8mA

Just a quick list for my own needs…

Things already done:-

  • Proven transmitter sends valid GPS tracking data to receiver on computer
  • Built prototype transmitter and tested in the field
  • Integrated GPS, 3DR radio, Li-po charger, Li-po battery, Arduino solderable breadboard in to an enclosure
  • Tested power saving mode (7 – 15 days charge possible!)
  • Created very basic symmetric cryptography routines

Things to do next:-

  • Get route logging software working, preferably with GPX file output (OpenLogger on HobbyTronics ArduLogger platform with micro SD card)
  • Range test air to ground module outside on maximum power settings, continuous transmit
  • Investigate 3DR circuit to see if it’s possible to make my own PCB cheaper (and integrate the whole device on one half-eurocard PCB)
  • Get LCD display working on protoboard
  • Get Joystick working for basic menu items on same protoboard
  • Test multiple transmitters with one receiver (including testing using ‘air’ unit for receiver, and receiving multiple signals on same receiver)
    • If can’t use air module, use usb to serial circuit for ground module (might be fun to try anyway…)
  • Test 3DR radio LBT (Listen Before Transmit) mode
  • Test encrypted messaging
  • Develop own binary format for messaging (C language struct? – see the sketch below)
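On that last point, what I have in mind is a packed C struct shared by transmitter and receiver. The fields below are a first guess rather than a settled format:

 // First guess at a compact over-the-air report (packed so both ends agree on layout).
 struct __attribute__((packed)) TeamReport {
   uint8_t  teamId;       // which team's transmitter
   uint32_t unixTime;     // time of the fix
   int32_t  latE7;        // latitude  x 1e7
   int32_t  lonE7;        // longitude x 1e7
   int16_t  altitudeM;    // metres
   uint8_t  batteryPct;   // transmitter battery remaining
   uint16_t crc;          // simple integrity check
 };                       // 18 bytes per report

Using lat/lon scaled by 1e7 as integers keeps the report small and avoids sending floats over the radio.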

Recent purchases…

I’ve made a couple of recent purchases after considering the full scope of my project.

Firstly I’ve bought another SparkFun 16×2 LCD module, and this time I won’t ruin it with a soldering iron by accident!!!

This LCD will be used in the D of E supervisor’s receiver module. I’ve also developed a user interface and set of menus so you can navigate through and track multiple teams, find your own position, edit settings, and even navigate to a selected team (distance, bearing).

In order to drive this though I had to use some sort of interface. Several small buttons seemed a bit fiddly to add to the box, so I’ve opted for a PlayStation-controller-style joystick! I can mount this on the project box next to the LCD. If you push this joystick down it also acts as a selection button, so I have left/right and up/down navigation, plus a selection button. Just like a standard modern GPS unit (but at a fraction of the cost).

Interestingly, SparkFun have stopped selling the black project box and now sell a clear one instead. That’s pretty awesome from a show-and-tell perspective! I can now show off the project and see the power LEDs through the case. No need to drill LED holes that might let water in.

I’ve also decided to buy a couple of micro SD interface cards – an ArduLogger device from a local supplier, but with the SparkFun OpenLogger software installed. This software is a bit more flexible, allowing you to name multiple files and either replace their content or append new content. Perfect for a receiver tracking multiple teams – you can have a GPX file for each day for each team. Great! I’ll also fit one on the transmitter so I can check the teams’ actual route later if they go out of signal line of sight. Not that I don’t trust them…

I also decided against bluetooth for a couple of reasons. Firstly, more complexity, space, and power usage for a very limited ‘download’ mode at the end of a walk. Also because I have a whopping 433MHz module already with a high baud rate! May as well re-use that to request and force an upload of an entire set of logs. They’re only a few KB for a day, so won’t take long at all to transmit.

Having two transceivers also brings the tantalising prospect of sending and receiving messages. A future ‘posh’ version of the transmitter may have a bigger battery, an LCD screen, and another joystick – so the team can send progress reports and receive information from their supervisors. E.g. ‘get off the mountain – crazy weather coming!’

I’ve also found a cheap supplier in Singapore for my Arduino Mega boards. More on that in another post. They’re approx GBP 1.80 each! Great if you want to make a lot of modules.

For my next trick I’ll use a Dremel to cut holes in my project case so I can mount the components. More to follow!…