My adventures with a Raspberry Pi and Arduino programming

Archive for the ‘Design’ Category

AJ Arduino pin usage

I need to ensure I have enough oomph in my hardware. For this I need to know that I have both enough pins for all my hardware and enough power for my application. (A sketch capturing the pin map follows the list below.)

Pins in use on my Arduino Pro Mini

  • 0 Serial RX – SD card
  • 1 Serial TX – SD card
  • 2 INT0 – Software serial radio RX
  • 3 INT1 PWM – Ascent/descent
  • 4 T0 – Software serial radio TX
  • 5 T1 PWM – Forward / aft
  • 6 AIN0 PWM – Turn port/starboard
  • 7 AIN1 – free
  • 8 CLK0 – free
  • 9 OC1A PWM – free
  • 10 SPI SS PWM – free
  • 11 SPI MOSI PWM – free
  • 12 SPI MISO – free
  • 13 SPI SCK LED – free
  • 14 A0 ADC0 – free
  • 15 A1 ADC1 – free
  • 16 A2 ADC2 – free
  • 17 A3 ADC3 – free
  • 18 A4 ADC4 I2C SDA – I2C rangefinders x 6, 9 DoF sensor (accelerometer, gyro, magnetometer)
  • 19 A5 ADC5 I2C SCL – I2C rangefinders x 6, 9 DoF sensor (accelerometer, gyro, magnetometer)
  • A6 ADC6 Analog only – charging current
  • A7 ADC7 Analog only – low voltage
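
Captured as a sketch header, the map above looks something like the following. This is just an illustrative snippet to keep the assignments in one place – the pin names and baud rates are placeholders, not the final firmware.

```
// Illustrative pin map for the Arduino Pro Mini (ATmega328) -- names are
// placeholders, not the final firmware.
#include <SoftwareSerial.h>
#include <Wire.h>

const uint8_t PIN_RADIO_RX     = 2;   // INT0 - software serial radio RX
const uint8_t PIN_ASCENT_PWM   = 3;   // PWM  - ascent/descent channel
const uint8_t PIN_RADIO_TX     = 4;   //        software serial radio TX
const uint8_t PIN_FWD_AFT_PWM  = 5;   // PWM  - forward/aft channel
const uint8_t PIN_TURN_PWM     = 6;   // PWM  - turn port/starboard channel
const uint8_t PIN_CHARGE_SENSE = A6;  // analogue only - charging current
const uint8_t PIN_LOW_VOLT     = A7;  // analogue only - low voltage

SoftwareSerial radio(PIN_RADIO_RX, PIN_RADIO_TX);  // radio link

void setup() {
  Serial.begin(9600);       // hardware serial: SD card logger on pins 0/1
  radio.begin(9600);        // software serial: radio
  Wire.begin();             // I2C: rangefinders + 9 DoF sensor on A4/A5
  pinMode(PIN_ASCENT_PWM, OUTPUT);
  pinMode(PIN_FWD_AFT_PWM, OUTPUT);
  pinMode(PIN_TURN_PWM, OUTPUT);
}

void loop() {}
```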

Power usage

Item | Min | Typ | Max
Sharp 10cm–80cm IR sensors (x6) | ? | 30 mA x 6 (180 mA) | 40 mA
SparkFun I2C rangefinders (x6) | 0.3 mA | 1.7 mA x 6 (10.2 mA) | ?
9 DoF sensor – accelerometer | – | 0.4 mA (10 Hz) | –
9 DoF sensor – gyro | – | 6.5 mA | –
9 DoF sensor – magnetometer | – | 0.1 mA | –
Arduino (hi power mode) | 0.5 mA | 4.13 mA | 4.13 mA
Current monitor | – | 0 (read on the Arduino) | –
Low voltage monitor | ? | TBD | ?
SD card reader/writer | 2 mA | 5 mA | 6 mA
Radio TX/RX (BLE Nano) | 4.7 mA | 8.0 mA | 11.8 mA
Copter circuitry itself (max) | ? | 3258 mA (3.3 A) | ?
Totals | – | Electronics: 214.33 mA; copter: 3258 mA; total: ~3473 mA | –

The default power usage stats above, with a modified 2000mAh/3.7V battery, give a fully operating flight time of around 34.5 minutes.

We can probably halve the electronics’ power usage, but this would only get us to 35.6 minutes flight time – so what’s the point? We will have to test the actual flying at low speed to see how long the copter will fly itself around. I’m guessing the actual flight time will end up nearer an hour with the bigger 2000mAh battery I’ll be using.
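
For reference, the endurance figures above are just capacity divided by draw:

```
// Endurance estimate: capacity (mAh) divided by draw (mA), in minutes.
float flightMinutes(float capacity_mAh, float draw_mA) {
  return capacity_mAh / draw_mA * 60.0;
}
// flightMinutes(2000, 3473)       -> ~34.5 minutes (everything on)
// flightMinutes(2000, 3473 - 107) -> ~35.6 minutes (electronics halved)
```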

Eventually, if the bot ends up performing tasks without the rotors on (observation from a static point perhaps?), then looking at power savings in the electronics begins to make sense. As a bare minimum I'll shut the sensors off when the bot is stationary (landed).

AJ Airbot Project Introduction

Why an Airbot?

I love little helicopters and quadcopters. Never owned a Quad though, so thought I’d get a small one. It’s great to fly aircraft, but even better to programme them and watch them fly themselves! I wanted to create something that responded to basic stimuli and would work indoors – perfect for experimenting with aviation.

I love the idea of those little bugs you find in gadget stores – they jiggle around, follow the edges of a cage, and keep walking around. They're simple but interesting as an academic exercise. I wanted to do something a little like this, but with a quadcopter… I love aircraft!

Where to start?

I figured I'd start with a very small nano quadcopter. This also means I can run everything at Arduino system voltage (around 3.3V), making the electronics easy. I've just bought a Hubsan X3 mini (7cm diameter) quadcopter. This can be used indoors or outdoors. I don't want to re-invent the (very complex) wheel of flight control – so I intend to retrofit this quad. I'll let the onboard electronics worry about stability – all my code is going to do is say hover, up, down, forward, back, turn – just like the old Logo robots from school!

Controlling the aircraft

This does imply a few things though. Firstly I need to interpose an Arduino into the lines from the control receiver to the flight control unit – so the Arduino is saying forward/backward, not me. Shouldn't be too hard (!)

After this I need to get basic controls working. I need the Arduino to 'know' it has taken off, how high it is, and how to land again. The same goes for any movement. Thus the Arduino needs to know its acceleration along the x, y and z axes. This is something the flight controller may be doing anyway – if I can intercept these sensor readings then great, else I'll have to add some more sensors for this.

Position

To know my height I need to know my acceleration along the z axis over a period of time. The same goes for x and y position. This means a basic inertial navigation system (INS) needs to be developed. This will help me find my landing pad again, and 'learn' how to navigate around rooms. It basically provides a way of remembering walls in a room too.

Eventually I’ll want to add some IR distance sensors for fine input, and to sense obstacles, but initially knowing how far above the landing zone I am will suffice.

Learning about the environment

Flies respond simply to external stimuli. This is why they keep head-butting windows. I want something simple, but not that simple. I could create a 1cm grid of an entire house, but this would be expensive in storage terms. My bot is 7cm in diameter. My likely proximity sensors have a 10cm minimum distance. If I assume a 10cm x 10cm x 10cm cube around my airbot then I can safely map a grid at 15cm resolution.

I can provide a 10cm safety zone around the aircraft – e.g. 10cm below, 7cm for the aircraft, and 10cm above. This is 27cm, so a point (say, a wall) found to be within 30cm means 'I can't fit there', whereas a gap means 'I can fit there'. So approximating a room in 15cm increments provides basic navigation. My front room where I'm typing now is approx 4m x 5m x 2.2m – which means approx 27 x 34 x 15 data points, so 13770 data points, each of 1 bit, so 1.68 KB of data. Not too bad at all. I could even simplify this eventually by storing planes for walls with four lines for boundaries, dropping the storage required significantly.
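
As a sketch of that grid (the dimensions are my front room's; note that ~1.7 KB nearly fills the Pro Mini's 2 KB of SRAM, so in practice this would probably live on the SD card or be simplified to wall planes as above):

```
// Bit-packed occupancy grid at 15 cm resolution: 27 x 34 x 15 cells = 13770
// bits, roughly 1.7 KB. Dimensions are illustrative (my front room).
const uint8_t NX = 27, NY = 34, NZ = 15;
uint8_t grid[(NX * NY * NZ + 7) / 8];   // 1722 bytes, zero-initialised

uint16_t cellIndex(uint8_t x, uint8_t y, uint8_t z) {
  return (uint16_t)((z * NY + y) * NX + x);   // linear index into the bit array
}

void markOccupied(uint8_t x, uint8_t y, uint8_t z) {
  uint16_t i = cellIndex(x, y, z);
  grid[i >> 3] |= (1 << (i & 7));
}

bool isOccupied(uint8_t x, uint8_t y, uint8_t z) {
  uint16_t i = cellIndex(x, y, z);
  return grid[i >> 3] & (1 << (i & 7));
}
```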

Autonomous flight

Being able to fly itself also implies it must manage its own health. For airbots this means battery usage. I'll use a SparkFun solar buddy and a panel to charge the unit. I'll also use the trick mentioned on its tutorial page to measure the current flow to the battery (and thus whether it is not yet full), and, if I can, a 'low current' warning too (nearly empty).
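
Something like this on the two analogue-only pins would do – the scale factors are pure placeholders, as they depend on the sense resistor and divider actually fitted with the charger, so they'd need calibrating:

```
// Rough battery monitoring on the two analogue-only pins (A6/A7).
// Scale factors are placeholders: calibrate before trusting the numbers.
const float ADC_REF = 3.3;           // Pro Mini running at 3.3 V

float chargeCurrent_mA() {
  float v = analogRead(A6) * (ADC_REF / 1023.0);  // volts across sense resistor
  return v / 1.0 * 1000.0;                        // assumes a 1 ohm sense resistor
}

float batteryVolts() {
  float v = analogRead(A7) * (ADC_REF / 1023.0);
  return v * 2.0;                                 // assumes a 2:1 divider
}
```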

I can use this information as ‘fear’ stimuli – in this case to find a location to sit still that provides a high charge rate – i.e. somewhere to sit in the sun. Just like my Labrador does (although for different reasons!)

Other ‘health’ and ‘fear’ indicators will also be useful. Houses are full of things that move. Thus not only will my room learning code need to figure out what is fixed versus what is moveable (accomplished by ‘number of observations’ of ‘wall here’ at a particular location), but it will also need a short term ‘collision avoidance’ mechanism.

The Science Museum has some simple toys that do this. They hover and are moved by you bringing your hands near them, and they steer themselves clear. I’ll do something similar. Useful in particular for doorways, labradors, and crap sitting in my office.

Behaviour and flight planning

As you can tell from the above, the bot will need several types of behaviour. Here's a typical flight (a minimal state-machine sketch follows the list):-

  1. AJ is stationary in the ‘idle’ mode on a desk.
  2. AJ decides “I’m bored” and chooses to ‘take off’ (transition from idle to flying states)
  3. Once this manoeuvre is complete, AJ randomly chooses to ‘explore’ (transition from flying to exploring states)
    1. The explore state’s ‘planning’ step is invoked. This chooses a direction to go in that does not interfere with known ‘local’ hazards (like known walls, desks, etc.)
    2. This pushes a ‘turn’ then a ‘forward’ and a ‘ascent/descent rate’ action on to the activity list (in this order)
    3. This action also then pushes the ‘navigate near to location’ action on to the list (at the end)
    4. When nearby, the final action ‘hover’ is invoked.
    5. The action list is now empty, so explore’s transition chooser determines what to do next.
  4. How 'adventurous' AJ has been told to be determines what percentage chance AJ has of continuing to explore (i.e. transitioning to the same state). Say it's 80%, and for our purposes he chooses to explore again
    1. We invoke ‘planning’, which in turn adds a turn, forward, ascent/descent rate, navigate near to, and hover command on to the action plan list
    2. Whilst in the 'navigate to' action a sensor determines that the target is 40cm away, but a wall is 30cm away.
    3. This basic instinct causes an 'avoid' flight action to be executed, and a new 'explore new boundary' phase to be entered into. (This may or may not 'throw away' the original explore command – probably easier if it does)
      1. This implies that an 'explore space' state with an 'avoid' instinct stimulus means we transition to an 'explore new boundary' state
  5. Exploring a boundary. Here we need to determine what line the boundary follows – at the moment we have only observed a single point on it.
    1. We first 'observe' the boundary to determine if it is static. If not, either the 'avoid' stimulus will fire again (no problem), or the object will move out of the way, in which case we transition back to the explore state originally found.
    2. If stationary in the hover, however, and the obstacle doesn't move, we 'peek' left and right, using the direction calculator and INS readings to determine the likely line of the 'wall' (or other obstacle). We then plot a flight plan along this obstacle (turn, forward (slowly?), and no ascent/descent)
    3. As we move along the wall we record that it is still there. If we find an obstacle in the direction of flight, we log a new obstacle and follow that boundary again.
    4. If we don’t then after a while we’ll be in an extended line far away from ‘known space’. The ‘center of gravity’ of the bots nearest known positions should increase our ‘fear’. If this becomes high, then we should increase or decrease altitude and move back along our obstacle (hover, turn, ascend, hover, forward)
    5. If we encounter somewhere we’ve been before, again we’ll change height and move back along the obstacle. This time we’ll get a little further because there are more points in the extension, and thus our fear allows us to move further. Thus over time we’ll incrementally explore our environment.
      1. This ‘fear of the unknown space’ should be a basic instinct – if it hits a certain amount we should ‘return to base’ – e.g. if I’ve accidentally ended up outside – we should retrace our steps to the ‘last known safe location’ Failing this, navigate home directly.
  6. Eventually our ‘low battery’/hungry stimuli will fire, causing us to find a known nice charging point (a place we’ve had high recharge at before) or return to home (last resort)
  7. A ‘tiredness’ stimuli may fire after charging even, and cause the bot to return home. Much like my labrador does after eating, drinking after a long walk – he goes and lays down.

Long distance flight planning

Rooms are weird shapes. As are cities and the ground. This is why aviation has developed flight lanes. An aircraft moves to a flight lane to go beyond a local distance. By remembering 'navigation friendly' intersecting boxes we can provide a safe and finite way to plan routes that take into account complex urban and in-house environments. This will effectively map the centre of a room and its doorways. A simple cuboid we're allowed to fly into.

This means ‘take me home, I’m scared!’ can allow me to fly back to me last safe position, then to my last safe navigation. From here I can use navigations to plan a route to the safe navigation nearest my home position (and without a wall in between!).

This happily is a simple directed graph problem to solve. I count the minimum distance between centre points of navigations that intersect. So when entering room A through doorway 1 and evaluating the routes via doorway 2, I count the 'cost' of the manoeuvre as the distance between the centre point of room A's intersection with doorway 1 and the centre point of its intersection with doorway 2. Easy planar mathematics using Pythagoras.
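
The cost step itself is just this (the structures are assumed for illustration):

```
// Cost of crossing a room = straight-line distance between the centre points
// of its two doorway intersections. Planar, so only x and y matter here.
#include <math.h>

struct Point2 { float x, y; };   // centre of a navigation intersection, metres

float manoeuvreCost(const Point2& door1, const Point2& door2) {
  float dx = door2.x - door1.x;
  float dy = door2.y - door1.y;
  return sqrt(dx * dx + dy * dy);  // Pythagoras
}
```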

I can even weight route choices by the quality of a navigation – its volume. A higher volume is more like a highway. These should be preferred routes. It also means a likely outcome is a 'shortcut' – going out of the kitchen window, up the side of the building, and into the office window in my house rather than flying up the stairs!

Memory

As we’re flying about we should try and remember things we find along the way. This should include areas where temporary obstacles aren’t found that often (like people, dogs, etc.). Knowing this provides the ability to find ‘rest spots’ or even ‘hide holes’. Useful in an urban environment.

Remembering temperature is also useful. A cold temperature means poor battery life, whereas a warm temperature means overheating. Each of these things can have its own memory, or feed in to a general ‘fear factor’ of particular locations. Fearing the dog’s room for example is quite useful.

Basics complete

Once the above is done we will have an autonomous super-fly that is capable of seeing to its own dietary and survival needs, and mastering its environment. This is a minimum viable product really, to which additional features and behaviours can be added.

Things needed

Clearly proximity, temperature, current, low voltage sensors are all needed. As is a state machine with simple static transitions mapped out. Shouldn’t be too difficult to build a super fly from a quadcopter!

What is not needed for this simple world is artificial intelligence – there is no 'learning' in the AI sense here, only 'mapping' an environment. Intelligence has an edge because it can be put into new environments and use past experiences to help with decisions. This is not what I'm proposing. I'm proposing learning one fixed environment provided by the bot's creator. Hence no neural nets or anything else. There's nothing stopping you adding those instead of a state machine though.

Future extensions

To turn a fly into a useful 'pet' or 'companion', a further level of behaviours (not necessarily intelligence) is needed: the ability to recognise people and communicate with them – LEDs and a simple sound generator, and the ability to listen for its own name and commands, much like a dog does.

Some commands are quite easy and desirable:-

  • Bedtime – sends to sleep when you’re working on a project / conference call
  • Go to ‘x’ – a named navigation space? remember 10 ‘words’ heard when in a particular navigation space perhaps?
  • Follow me – watch me (person speaking) and follow on
  • Watch me – record video for posterity, doesn’t necessarily imply follow me
  • Stay – hold position, but not necessarily attitude (so stay and watch me can be combined)
  • Find Person X – navigate through navigations to within voice print of a particular person. Perhaps combined with last known location of that person’s voice, or recent locations you’ve heard them
  • Come – come near the speaker
  • AJ – pre-command authentication – only responds to top 5 people it’s been near
  • Go with X – adds person X to ‘owner’ list temporarily (like going with an auntie for a day)
  • Lay down – land at current position
  • move left/right/forward/back – fine positioning before landing.
  • AJ you’re at ‘environment x’ in ‘navigation y’ – when first activated, so able to come with you to new places, e.g. climbing crags.

Eye SPI Arduino…

I’ve been toying with how to handle multiple peripherals that require serial. Up until now I’ve used the SoftwareSerial library on the Arduino, and have resorted to using the hardware serial for my SD card logger.

But is there a better way…

Maybe…

Let’s have a look at the comms options:-

Part | Hard serial | Soft serial | SPI | I2C
ArduLogger V3 | Yes | No | Yes [0] | Yes [0]
GPS – GP635T | Yes | Yes | No | No
GPS – UBlox 6M module | Yes | Yes | No | No [1]
3DR Radio | Yes | Yes | No | No
LCD – Hitachi 44780 | Yes | Yes | No [2] | No

[0] ArduLog software only, not the SparkFun OpenLog software. Oh and you need to modify the code yourself to add support!
[1] The compass on the UBlox does have I2C SCL and SDA lines, but not the GPS
[2] It is possible to use a register to drive the LCD. May be possible to cleverly link this to an SPI interface

Hmmm… So maybe not then… Although of course there’s nothing stopping me using interface circuitry to make all the above work, it would probably add to the number of IO lines used, not reduce them!

Speaking of IO lines used, I’m currently using these:-

  • ArduLogger – 2 (hardware serial)
  • GP635T – 3 (2x soft serial, 1x power mosfet on/off)
  • 3DR Radio – 3 (2x soft serial, 1x power mosfet on/off)
  • LCD – 6
  • TOTAL: 14 (the Arduino Pro Mini V3 I use has 22 IO pins, two of which are analogue-only – 4 are optionally SPI, and another two optionally I2C)

I’ll keep a watching brief on using SPI though – could potentially be useful, and allow me to ‘off board’ a lot of serial comms in future.
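
For the record, the current split looks roughly like this in a sketch – pin numbers are illustrative rather than my actual wiring:

```
// Current comms split: ArduLogger on hardware serial, GPS and 3DR radio each
// on SoftwareSerial, with a MOSFET gate pin to power each one down.
#include <SoftwareSerial.h>

SoftwareSerial gps(4, 5);      // GP-635T: RX, TX (illustrative pins)
SoftwareSerial radio(7, 8);    // 3DR radio: RX, TX (illustrative pins)
const uint8_t GPS_POWER   = 6; // MOSFET gate for the GPS supply
const uint8_t RADIO_POWER = 9; // MOSFET gate for the radio supply

void setup() {
  Serial.begin(9600);          // hardware serial -> ArduLogger V3
  gps.begin(9600);
  radio.begin(57600);
  pinMode(GPS_POWER, OUTPUT);
  pinMode(RADIO_POWER, OUTPUT);
  digitalWrite(GPS_POWER, HIGH);     // power the GPS up
  digitalWrite(RADIO_POWER, HIGH);   // power the radio up
}

void loop() {
  // Only one SoftwareSerial port can listen at a time on the ATmega328,
  // so switch between them explicitly.
  gps.listen();
  while (gps.available()) Serial.write(gps.read());   // log raw NMEA to SD
}
```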

[UPDATE 26 MAY 2015]

You can buy ICs that act as an SPI to multiple-UART converter, each UART with a 64 byte FIFO queue. The chip for two output UARTs (up to 4 Mbit/s!) is the SC16IS752IBS, documented at http://www.nxp.com/documents/leaflet/75015676.pdf. This product also has some GPIO ports, so it could theoretically externalise all my IO (including the LCD) if I really needed to.

LCD breadboard and cunning headers…

So I’ve been struggling with wires everywhere in my projects. I’ve determined now that I suck so bad at soldering tonnes of plain wires that I need a better solution!

What I’ve done is solder the wires to female (yes, female) breakaway headers.

The idea being that male 90-degree bent headers are easiest to get, so I should be able to put these on the underside of my solderable breadboards. If I use pins a couple of positions in rather than the outermost ones, I can even hide part of the female headers, so they don’t take up any extra space in my projects.

This also means that if one component is faulty I can easily replace it. I can also use components across projects, or plug them straight into a breadboard for prototyping.

This arrangement is shown in the image below. It shows a basic LCD circuit. The top rail (j1–j6) holds the IO pins going to the Arduino. The bottom rail shows two sets of 6 headers (only one shown plugged in, for visibility).

LCD attached to breadboard via female headers

I’ve kept the same numbers on the breadboard as on the LCD (left to right pins on top of the LCD). Thus it’s easier to remember. This also allowed me to get my resistor in there too.

The blue square on the left is a small breadboard potentiometer (variable resistor). Note I’ve put the LCD backlight pin (a 16) to ground, as I’m not using the backlight.

The circuit shown was taken from a PighiXXX ABC diagram.
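
Driving it is then the standard LiquidCrystal recipe in 4-bit mode – the six pin numbers below are whatever you wired RS, E and D4–D7 to, so adjust them to match your own breadboard:

```
// Standard HD44780 setup with the Arduino LiquidCrystal library, 4-bit mode.
#include <LiquidCrystal.h>

LiquidCrystal lcd(12, 11, 5, 4, 3, 2);   // RS, E, D4, D5, D6, D7 (adjust to your wiring)

void setup() {
  lcd.begin(16, 2);              // 16 columns, 2 rows
  lcd.print("hello, world!");
}

void loop() {
  lcd.setCursor(0, 1);           // second row
  lcd.print(millis() / 1000);    // seconds since reset
}
```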

This now gives me a low cost and re-usable way to plug various components in and transfer them between projects. I can do the same with SMD components, like the female USB A socket below:-

Female USB A with headers

Pretty cool… Not bad for a few minutes work!

Recent purchases…

I’ve made a couple of recent purchases after considering the full scope of my project.

Firstly I’ve bought another SparkFun 16×2 LCD module, and this time I won’t ruin it with a soldering iron by accident!!!

This LCD will be used in the D of E supervisor’s receiver module. I’ve also developed a user interface and set of menus so you can navigate through and track multiple teams, find your own position, edit settings, and even navigate to a selected team (distance and bearing).

In order to drive this though I had to use some sort of interface. Several small buttons seemed a bit fiddly to add to the box, so I’ve opted for a PlayStation-controller-style joystick! I can mount this on the project box next to the LCD. If you push this joystick down it also acts as a selection button, so I have left/right and up/down navigation plus a selection button. Just like a standard modern GPS unit (but at a fraction of the cost).
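
Reading the joystick is straightforward – two analogue axes plus a push-to-select switch pulled to ground. The pins and thresholds below are illustrative:

```
// Reading the thumb joystick: two analogue axes plus a push-to-select switch.
const uint8_t JOY_X   = A0;
const uint8_t JOY_Y   = A1;
const uint8_t JOY_SEL = 7;     // switch pulls to ground when pressed

void setup() {
  Serial.begin(9600);
  pinMode(JOY_SEL, INPUT_PULLUP);
}

void loop() {
  int x = analogRead(JOY_X);            // 0-1023, roughly 512 at rest
  int y = analogRead(JOY_Y);
  // Turn the analogue reading into left/right/up/down menu moves.
  if (x < 300) Serial.println("left");
  else if (x > 700) Serial.println("right");
  if (y < 300) Serial.println("up");
  else if (y > 700) Serial.println("down");
  if (digitalRead(JOY_SEL) == LOW) Serial.println("select");
  delay(150);                            // crude debounce / repeat rate
}
```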

Interestingly, SparkFun have stopped selling the black project box, and now sell a clear one instead. That’s pretty awesome from a show-and-tell perspective! I can now show off the project and see the power LEDs through the case. No need to drill LED holes that might let water in.

I’ve also decided to buy a couple of micro SD interface cards – an ArduLogger device from a local supplier, but with the SparkFun OpenLog software installed. This software is a bit more flexible, allowing you to name multiple files and either replace their content or append new content. Perfect for a receiver tracking multiple teams – you can have a GPX file for each day for each team. Great! I’ll also fit one on the transmitter so I can check the teams’ actual route later if they go out of signal line of sight. Not that I don’t trust them…

I also decided against Bluetooth for a couple of reasons. Firstly, it adds more complexity, space, and power usage for a very limited ‘download’ mode at the end of a walk. Also because I already have a whopping 433MHz module with a high baud rate! May as well re-use that to request and force an upload of an entire set of logs. They’re only a few KB for a day, so won’t take long at all to transmit.

Having two transceivers also brings the tantalising prospect of sending and receiving messages. A future ‘posh’ version of the transmitter may have a bigger battery, an LCD screen, and another joystick – so the team can send progress reports and receive information from their supervisors. E.g. ‘get off the mountain – crazy weather coming!’

I’ve also found a cheap supplier in Singapore for my Arduino Mega boards. More on that in another post. They’re approx GBP 1.80 each! Great if you want to make a lot of modules.

For my next trick I’ll use a Dremel to cut holes in my project case so I can mount the components. More to follow!…


D of E Tracker project box component layout…

Here’s some radio project pics for you…

gpstxrx-part-assembled

The first one, above, shows the partly soldered project. You can see the battery plug (top right of red solderable breadboard) which will go to my LiPo recharger circuit. You also see the 3DR radio airside module (left) and the mini GP-635T GPS unit (centre).

The aerial you see is the one that comes with the project. This is a WiFi aerial so will be getting replaced. Just waiting on a 433MHz compatible antenna to arrive.

gpstxrx-projectbox

Here you see the project box layout. At the bottom of the picture is the bottom half of the project box. The GPS unit sits in the top part so its antenna has an unobstructed view of the sky.

You also see the battery in the top half. This will be plugged in to the LiPo charging circuit (red, bottom picture, right hand side), which will also have a power cable going to the solderable mini breadboard with the Arduino mounted (bottom left).

Again the radio (this time the ground-side USB module) is mounted at the bottom. I’m investigating whether a short adapter circuit can be used to make use of the ground-side module. Ideally I want a UART module like the airside one, but suppliers of the full pair of units give you one of each (for about GBP 30).

The embedded SoC radio circuit though costs only GBP 2.62 from Mouser!!! So it may work out cheaper to take this, and the open source 3DR radio schematics (zip), and build my own UART on a PCB with this chip mounted.

Design rethinking

Something I’ve noticed is that the extra vertical pins on the Arduino (A4-A7) only just fit in the project box. This is without the non-permanent header block from the first image. I’ll remove the 4 extra analog pins as I’m no longer using them. This will mean I don’t have to solder the Arduino entirely to the breadboard – leaving it removable for use in other projects. (You can see the height of the Arduino in the top image because I use a header block to mount it.)

I may also remove the barrel jack from the LiPo charger to remove the possibility of someone plugging in both this and the USB cable!

I also need to check if the LiPo circuit has a voltage regulator. If so I’ll wire the output directly to the VCC pin rather than the RAW pin of the Arduino. The Arduino’s onboard voltage regulator is well known for being power hungry, so bypassing it will help overall power consumption.

I’ve toyed with the idea of modifying the LiPo charger to be a 1–2 Amp charger, but most USB sockets don’t supply above 500mA, so I think it’s easier to just make people wait 4 hours to recharge the battery and keep the power lower.

In the future I’d like to add a way to download an activity log. I may develop a Bluetooth app for this – but for now I’m happy with a live view of data.

ROS woes…

I’ve had a lot of issues using the Robot Operating System…
