
I.9.4) IR Guidance

Introduction

The quadcopters (Micro Air Vehicles, or MAVs) that I own have an organic ability to perform an automated landing, either on command or programmed as a waypoint. Whilst the speed of descent is quite carefully controlled, the accuracy (i.e., the difference in position between the actual touchdown point and the desired touchdown point) can be (ahem) ‘variable’.
 

Terms:

  • Localisation - determination of the MAV position.
  • Touchdown point - the location at which the MAV comes to rest following an automated landing.
 

Abbreviations

 Abbr.  Intended meaning
 CCTV   Closed Circuit Television
 DC     Direct Current
 IR     Infra Red
 I2C    Inter-Integrated Circuit
 LED    Light Emitting Diode
 MAV    Micro Air Vehicle
 Obex   Object Exchange
 OTT    Over The Top
 PC     Personal Computer
 PST    Parallax Serial Terminal
 PWM    Pulse Width Modulation
 R/C    Radio Control
 RMS    Root Mean Square
 RTL    Return To Launch
 Rx     Receiver
 SCL    I2C clock line
 SDA    I2C data line
 SMD    Surface Mount Device
 Vdd    Positive supply voltage
 Vss    Negative supply voltage

Current performance

A quick set of measurements was taken on the beach at low tide on a still day: take off, fly around a bit and select RTL. Being a bit of a furtive activity (thanks to the Daily Mail's demonisation of "drones") and carried out in semi-darkness, the measurements were not 100% accurate but, as the saying goes, "good enough for government work".


The RMS value of the touchdown error is 1.4 m.
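For the record, that figure is just the usual root-mean-square of the radial miss distances $d_i$ (in metres) over the $N$ landings:

$d_{RMS} = \sqrt{\frac{1}{N}\sum_{i=1}^{N} d_i^2} \approx 1.4$ m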

Aim

This little challenge is to improve the accuracy of automated landings performed by a quadcopter; it is a ‘given’ that it will be based on IR tracking. All this funky vision-based tracking stuff is all very well but a bit OTT.
 

"Desirements"

  • Platform agnostic
    • Not dependent upon the functionality of any one specific set of flight control hardware, software or sensors.
  • Minimal effect on performance of quadcopter
    • Low weight
    • Low power demand
    • Small space envelope
    • No changes to autopilot / flight controller
  • Portable
    • Low weight
    • Low power demand
    • Small space envelope
    • Able to use available power sources
  • No impact on telemetry
    • No changes to autopilot / flight controller
    • No new messages
    • Air-to-ground messaging unaffected
    • Compatible with existing messages
  • Achievable
    • Appreciable effect on accuracy
    • No novel technologies 
    • No difficult mathematics [key requirement]
  • Usable
    • In bright sunlight
    • In light to medium winds
    • Gust-free conditions
    • Effective from 30 m to ground level
  • Testable
    • Ability to record/replay information is very desirable.
 

Approach

The two main tenets of the approach taken are that:
  • It will be based on the Parallax Propeller – this selection is influenced by my familiarity with the Propeller, its development environment and an expectation that high-speed deterministic parallel processing will be necessary. A solution embedded into a Propeller-based flight controller or using a ‘micro’ board will add little or no weight;
  • It will use infra-red (IR) tracking technology – such techniques are well developed and understood.
 
The interpretation of these two constraints and the ‘desirements’ above suggests a federated approach: a passive element at ground level that identifies the desired touchdown point, and an active element on the MAV that detects the ground element and influences the trajectory of the MAV based upon the perceived difference in position.
 
The camera of a Nintendo™ Wii-mote is able to identify and report the positions of up to four ‘blobs’ that are sources of IR light within the frustum of the camera sensor. This information is available at a rate of several hundred Hertz via an I2C interface. Within the Parallax Obex can be found objects for making use of Wii-mote data. These were created by user Graham Stabler and his application of the data is shown in this video - clever stuff:

[Embedded video]

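As a flavour of what the camera hands back: in the commonly documented 'extended' report format each blob occupies three bytes - X low byte, Y low byte, then a byte packing the top two bits of each coordinate plus a size nibble. The sketch below unpacks one such blob; the buffer layout and method name are mine, for illustration only - the Obex objects do the real work.

  PUB DecodeBlob(ptr) : coords | b0, b1, b2, x, y
    ' Unpack one three-byte "extended mode" blob report into 10-bit X/Y.
    b0 := byte[ptr][0]                           ' X bits 7..0
    b1 := byte[ptr][1]                           ' Y bits 7..0
    b2 := byte[ptr][2]                           ' %YYXXSSSS - top bits plus size
    x := b0 | (((b2 >> 4) & %11) << 8)           ' add X bits 9..8
    y := b1 | (((b2 >> 6) & %11) << 8)           ' add Y bits 9..8
    coords := (y << 16) | x                      ' X in the low word, Y in the high word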
The sections below outline the justification of each element of the design.

The general plan is for the Prop to sit between the Rx and the flight controller (three channels). In normal conditions the signals are passed straight through with minimal delay; when the RTL mode is detected the signals are modulated by commands (pitch, yaw, roll) calculated from the offset from the touchdown point. The sensor detects the transmitter at the touchdown point and the offset it reports is used to calculate the commands to apply. A sketch of this pass-through/override logic is given below.

The Prop reads the heading and flight mode from the flight controller.

The architecture of the Prop makes it easy for this to happen in parallel - without interrupts.

A VGA connection is only used during development.
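A minimal sketch of that pass-through/override logic, in Spin. Everything here is a placeholder for illustration: ReadChannel and WriteChannel stand in for the pulse-measuring and pulse-generating code described under the COG sections below, MODE_RTL and GAIN are made-up values, and the (512, 384) sensor centre assumes the oft-quoted 1024 x 768 report range.

  CON
    MODE_RTL = 1                        ' placeholder value for the RTL flight mode
    GAIN     = 8                        ' pixels of offset per microsecond of command

  VAR
    long centroidX, centroidY           ' written by the sensing cog (hub variables)
    long flightMode                     ' read back from the flight controller

  PUB RelayLoop | pitch, roll, yaw
    repeat
      pitch := ReadChannel(0)           ' incoming Rx pulse widths, in microseconds
      roll  := ReadChannel(1)
      yaw   := ReadChannel(2)
      if flightMode == MODE_RTL
        ' Nudge pitch/roll towards the target; (512, 384) is the sensor centre
        ' if the report range really is 1024 x 768 (an assumption).
        pitch += (centroidY - 384) / GAIN
        roll  += (centroidX - 512) / GAIN
      WriteChannel(0, pitch)            ' re-emit the (possibly modified) pulses
      WriteChannel(1, roll)
      WriteChannel(2, yaw)

The division by GAIN keeps the correction gentle; tuning it (and probably adding a deadband) is expected to be an experimental exercise.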



Air Element – Sensing

Whilst the camera from the Wii-mote seems ideal, it is very, very small and I can hardly see the connectors, let alone solder them into place. For the purposes of investigation and development a similar (if not identical) “heat-detecting” sensor was procured from France. Whilst it doesn’t need the clock or reset signal required by the Wii-mote, it remains compatible with the Obex code without modification. A simple four-wire connection (Vss, Vdd, SDA and SCL) is all that is required. The spec gives a resolution of XXXX by XXXX and viewing angles of 23° x 32°.
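Bringing the sensor up follows the widely published Wii-mote camera sequence of register writes. The magic numbers below are the commonly quoted sensitivity/mode values; the I2C object name and its method names are placeholders for whichever Obex driver is to hand.

  CON
    CAM_ADDR = $B0                      ' 8-bit write address usually quoted for the camera

  OBJ
    i2c : "i2c_driver"                  ' placeholder - substitute any Obex I2C object

  PUB InitCamera
    ' Commonly published bring-up: wake, sensitivity values, report mode.
    WriteReg($30, $01)
    WriteReg($30, $08)
    WriteReg($06, $90)
    WriteReg($08, $C0)
    WriteReg($1A, $40)
    WriteReg($33, $33)

  PRI WriteReg(r, v)
    ' Two-byte register write; adapt to the chosen driver's actual methods.
    i2c.start
    i2c.write(CAM_ADDR)
    i2c.write(r)
    i2c.write(v)
    i2c.stop
    waitcnt(clkfreq / 10 + cnt)         ' generous pause between writes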

Ground Element – Transmitter

Whilst the generation of multiple IR ‘blobs’ in a fixed configuration could – with some complicated mathematics – improve the accuracy of localisation, I didn't fancy attempting that and, in any case, having four blobs available all of the time is unlikely.
 
The IR transmitter on the ground needs to be sufficiently powerful that it is detectable in daylight from the nominal altitude of 30 m. In addition to the ‘directly above’ scenario, the transmitted IR needs to be detectable when the MAV is offset from the transmitter and when the MAV is not perfectly level. Another design decision results from the way in which the sensor detects and reports ‘blobs’, as described below.
 
IR sources tend to be LED-based, with individual LEDs emitting in cones having internal angles between 5° and 30°. Clustered IR LEDs tend to provide illumination at night for scenes under surveillance by CCTV cameras.
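The cone angle matters because of simple geometry: at height $h$, a source emitting into a cone of internal angle $\theta$ illuminates a circle of diameter

$D = 2h\tan(\theta/2)$

so at the nominal 30 m a 5° LED covers a circle roughly 2.6 m across while a 30° LED covers roughly 16 m. The wider the cone, the larger the volume within which the MAV can see the target, at the cost of intensity.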

Illuminators with many LEDs (e.g., 48 in the example below) were detected across a wider area, but as fewer blobs; this meant less precise localisation.


 
The image below is of a high-power CCTV illuminator based on four IR SMD LEDs. It has had the glass removed and the light sensor (centre) covered so that the LEDs are on in daylight. The SMD LEDs have had their lenses removed, and the top-left and lower-right LEDs have been fitted with new lenses with a 60° angle.


IR Projector


Original Lens

New 60° Lens

The updated lenses give a more divergent beam than the originals. However, these are high-power LEDs and, though the light might not be visible, it is not A Good Thing to look at them directly. For close-range testing a lower-power test target was created, though it is still not a good idea to stare at it.

Test Target

For the purposes of initial testing and development, four 10 mm IR LEDs were fitted to a blank Protoboard (the drive arithmetic is sketched below the photo):


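There is nothing clever in the drive circuit: each LED just needs a current-limiting resistor. As a worked example (the figures are assumptions - check the datasheet): with a 5 V supply, a forward voltage of about 1.5 V and a target current of 100 mA,

$R = (V_{dd} - V_f) / I_f = (5 - 1.5) / 0.1 = 35\ \Omega$

and the nearest preferred value (33 Ω or 39 Ω) is close enough for a test target.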

Computing Environment

  • Development is on a Parallax Propeller Protoboard, powered by a 12 V DC source and with a USB connection to a PC; a Demoboard was used initially.
  • Software development is via the Propeller Tool version 1.3.2 installed on a Windows 10 PC.
  • The Parallax Serial Terminal (PST) is used for debugging and general observation of the machinations within the Prop.
  • The serial link between the PST and the Protoboard operates at 115,200 baud (see the sketch after this list).
  • A VGA connection from the Protoboard connects to a monitor.
  • For graphics, a Demoboard was set up to provide a composite video signal, viewable on a small TFT display of the type usually used for reversing cameras in vehicles and available at low cost.
 
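Bringing up the debug link is just a matter of starting the standard Parallax Serial Terminal object at the same baud rate; a trivial example:

  OBJ
    pst : "Parallax Serial Terminal"    ' standard object supplied with the Propeller Tool

  PUB Main
    pst.Start(115_200)                  ' must match the baud rate set in the PST window
    pst.Str(String("IR guidance alive", 13))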

Development Progress

 The story so far...


 COG 1 - Sensing and Processing

Demoboard...

The spin code “Mode 5_demo” from the Obex was used for some initial testing, modified to use the Graphics.spin object. The graphical output shows the detected positions of the first four blobs and the centroid of their positions:

The crappy music is to drown out the sound of children squabbling - sorry.
The average (centroid) position is available to other cogs as a 'global' (hub) variable.
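A sketch of the centroid calculation (the blob arrays and the method name are mine, for illustration):

  VAR
    long blobX[4], blobY[4]             ' filled in by the sensor-reading code
    long centroidX, centroidY           ' hub variables, visible to the other cogs

  PUB UpdateCentroid(n) | i, sx, sy
    ' Average the first n detected blob positions into the shared hub variables.
    if n > 0
      sx := 0
      sy := 0
      repeat i from 0 to n - 1
        sx += blobX[i]
        sy += blobY[i]
      centroidX := sx / n
      centroidY := sy / n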

Protoboard:

Having run out of accessible pins on the Demoboard the gubbins were transferred to a Protoboard set up as the bird's nest shown below.

The R/C Rx originated in a quadcopter and the Protoboard reads the aileron and throttle channels.

The servo is a small version originally used for a tilt/pan mechanism.


COG 2 - Controlling a servo:

The spin code "Dual Servo Driver" was established to run in Cog 2. It will be expanded to control more servos at a later date, probably by using the Servo32 or Radio Override code.

The servo responds across its full range as the centroid of the detected blobs moves across the field of view of the sensor:

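The mapping from detected position to servo command is a straight linear scale. A sketch, assuming a 0..1023 X range and the usual 1,000..2,000 µs pulse (SetServo is a placeholder for the Dual Servo Driver's equivalent call):

  CON
    SERVO_PIN = 16                      ' placeholder pin assignment

  VAR
    long centroidX                      ' written by the sensing cog

  PUB TrackWithServo | pulse
    repeat
      ' Linear map: centroid X of 0..1023 becomes a pulse of 1_000..2_000 us.
      pulse := 1_000 + (centroidX * 1_000) / 1_023
      SetServo(SERVO_PIN, pulse)        ' placeholder for the driver's set method
      waitcnt(clkfreq / 50 + cnt)       ' 50 Hz, the usual servo frame rate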

COG 3 - Receive R/C data


The spin code "ServoInput.spin" was established to run in Cog 3. It reads the PWM-coded signals from the R/C Rx.
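Measuring a servo-style PWM input in Spin is a matter of timing the high period against the system counter; a minimal single-channel sketch (ServoInput.spin does this for several channels at once):

  PUB ReadPulse(pin) : width | t0
    ' Time one high-going pulse on pin; result in microseconds (typically 1_000..2_000).
    waitpeq(0, |<pin, 0)                ' make sure the line is low first
    waitpeq(|<pin, |<pin, 0)            ' wait for the rising edge
    t0 := cnt
    waitpeq(0, |<pin, 0)                ' wait for the falling edge
    width := (cnt - t0) / (clkfreq / 1_000_000)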



MAVLINK (WIP)


Stereo Vision

Adding a second sensor gave an even better response; that is to say, the region under scrutiny was larger and some of the jitter in the servo was reduced as a result of the increase in the number of blobs being detected.
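One simple way to combine the two sensors' outputs - not necessarily how the final code will do it - is a blob-count-weighted average of the two centroids (all names here are assumptions):

  VAR
    long cX1, cY1, blobs1               ' centroid and blob count from sensor 1
    long cX2, cY2, blobs2               ' the same for sensor 2
    long centroidX, centroidY           ' combined result shared with the other cogs

  PUB CombineSensors
    ' Weight each sensor's centroid by the number of blobs it saw.
    if blobs1 + blobs2 > 0
      centroidX := (cX1 * blobs1 + cX2 * blobs2) / (blobs1 + blobs2)
      centroidY := (cY1 * blobs1 + cY2 * blobs2) / (blobs1 + blobs2)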


To be continued...