WO2008061307A1 - A method of determining characteristics of a remote surface with application to the landing of an aerial vehicle - Google Patents


Info

Publication number
WO2008061307A1
WO2008061307A1 · PCT/AU2007/001793 · AU2007001793W
Authority
WO
WIPO (PCT)
Prior art keywords
landing
landing area
mirror
target
attitude
Prior art date
Application number
PCT/AU2007/001793
Other languages
French (fr)
Inventor
Matthew Garratt
Sebastien Eckersley-Maslin
Original Assignee
Newsouth Innovations Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2006906551A external-priority patent/AU2006906551A0/en
Application filed by Newsouth Innovations Pty Ltd filed Critical Newsouth Innovations Pty Ltd
Publication of WO2008061307A1 publication Critical patent/WO2008061307A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87Combinations of systems using electromagnetic waves other than radio waves
    • G01S17/875Combinations of systems using electromagnetic waves other than radio waves for determining attitude
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • G01S5/163Determination of attitude
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/04Control of altitude or depth
    • G05D1/06Rate of change of altitude or depth
    • G05D1/0607Rate of change of altitude or depth specially adapted for aircraft
    • G05D1/0653Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
    • G05D1/0676Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing
    • G05D1/0684Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing on a moving platform, e.g. aircraft carrier

Definitions

  • UAVs unmanned aerial vehicles
  • UAVs include both fixed wing vehicles such as unmanned aeroplanes and rotary wing vehicles such as unmanned helicopters.
  • the landing task is usually controlled by input from the pilot to the aircraft controls based on a visual assessment of the landing area with the aid of aircraft instrumentation such as compass and altimeter.
  • these vehicles are typically controlled by a combination of commands received from a ground control facility and by commands generated in control systems provided on board the unmanned vehicle.
  • There is a need to improve the ability of manned and unmanned vehicles to land in a landing area, and this is particularly the case where the landing area itself may be on a slope or subject to movement, such as when landing on a ship or other floating platform.
  • the present invention provides a method of determining the attitude of a remote surface including the steps of: directing a beam towards at least three separate points on the surface, the points defining a shape; receiving light reflected from the surface from the at least three points; and determining the attitude of the surface based on the received light.
  • the shape may be substantially oval or circular.
  • the beam may be a beam of light.
  • the beam may be a laser beam.
  • the beam may be directed by moving a mirror.
  • the mirror may rotate.
  • the attitude of the surface may be determined based on the time taken between the light being directed towards a point and receiving light reflected back from that point.
  • the method may be carried out on a moving aerial vehicle and the surface may be a landing area for the vehicle.
  • the attitude of the surface relative to the moving aerial vehicle may be varying.
  • the future attitude of the surface relative to the moving vehicle can be predicted using the method.
  • the present invention provides a system for determining characteristics of a remote surface including: directing means for directing a beam towards at least three separate points on the surface, the points defining a shape; and receiving means for receiving reflections of the beam from the at least three points.
  • the directing means may include a moveable mirror.
  • the mirror may rotate about an axis and the axis of rotation of the mirror intersects the plane of the reflective surface of the mirror.
  • the axis of rotation may intersect the plane of the reflective surface at an angle of typically between 70 and 90 degrees.
  • the present invention provides a method of landing an aerial vehicle including the steps of: directing a beam towards a landing area; receiving reflections of the beam which are reflected back from the landing area; and determining characteristics of the landing area based on the time taken to receive reflections of the beam. The characteristics determined may include either the distance to the landing area or the attitude of the landing area.
  • the method may further include the step of providing a vision sensor on the vehicle which is arranged to detect a target which is provided in a pre-determined location in relation to the landing area.
  • In a fourth aspect the present invention provides a system for landing an aerial vehicle including: directing means for directing a beam in the direction of a landing area; and receiving means for receiving reflections of the beam back from the landing area.
  • the system may further include a vision sensor arranged to detect a target which is provided in a pre-determined location in relation to the landing area.
  • Embodiments of the invention provide the operator with a measure of the slope of the ground and the distance to the ground directly under the helicopter at all times during hover and the landing phase.
  • Embodiments of the invention provide an accurate measure of the distance to the ground which can be used in a feedback loop to control the height of the vehicle automatically.
  • the motion of the ship can be estimated based on the output of the sensor and this estimate can then be integrated into the guidance system of the flight vehicle.
  • Figure 1 is a schematic view of a rotary winged UAV fitted with a time of flight measuring instrument
  • Figure 2 is a detailed view of a time of flight measuring instrument
  • Figure 3 is an alternative embodiment of a time of flight measuring instrument
  • Figure 4 is a schematic illustration of the systems used to control the UAV of figure 1
  • Figure 5 illustrates a vehicle body axes system used in embodiments of the invention
  • Figure 6 illustrates the geometry of an embodiment of the invention
  • Figure 7 illustrates how the vision sensor tracking algorithm determines bright regions from the live pixel stream using adjacency rules.
  • a UAV in the form of an unmanned helicopter 10 is shown hovering above a landing area 14 being the deck of a ship at sea.
  • the UAV is fitted with a means for directing a laser beam towards the landing area and means for receiving the reflected beam in the form of transceiver 12.
  • transceiver 12 includes a laser rangefinder 16 such as the AccuRange 4000 produced by Acuity, Inc.
  • the AccuRange 4000 laser rangefinder is an optical long distance measuring tool. It operates by means of an infrared laser diode that emits light of 780nm wavelength at a power of 2OmW. Employing time-of-flight measuring principles, this particular rangefinder can accurately gauge distances up to about 20 metres.
  • Transceiver 12 further includes a mirror 18 which is rotatably mounted to a housing 20.
  • Mirror 18 is rotated by a stepper motor 22 which includes an encoder.
  • the encoder provides a value between zero and 4096 that corresponds to the angle of rotation (0° to 360°).
  • the encoder value is output with the respective range measurement so that the measurements can be related to the helicopter's body axis coordinate system.
  • Stepper motor 22 has an axis of rotation A which is offset from the axis of the beam emitted by rangefinder 16 by 45 degrees.
  • the face of mirror 18 is offset to the axis of rotation by 10 degrees.
  • the base of the conical surface 24 describes an oval shape on the landing area when the helicopter is hovering directly over a substantially flat landing area.
  • rangefinder 16 directs a laser beam towards mirror 32 which is offset by 45 degrees to the axis of the laser beam.
  • the beam is reflected towards mirror 34 which is offset to the axis of the beam by about 42.5 degrees.
  • Both of mirrors 32 and 34 are mounted in a carrier 33 which in turn is mounted to the output shaft of stepper motor 22.
  • the carrier is rotated by the stepper motor and a counterbalance 35 is provided to minimise vibration forces.
  • the diagram schematically illustrates a computing device 40 for a helicopter.
  • the device 40 includes a PC104 flight computer 42 in combination with a high speed interface card 43 (HSIF).
  • the HSIF card enables a maximum sampling rate of 50,000 samples per second.
  • the samples come over the bus in an 8 byte format that includes a 19 bit range value and 1 byte values for signal strength, ambient light and sensor internal temperature as well as status and general input bits.
  • the transceiver 12 continuously takes readings to measure the distance to points in three dimensions based on reflections of the emitted laser beam received by the transceiver and based on the position of the mirror stepper motor.
  • a distance measurement and encoder output are taken from the laser apparatus.
  • the range measurement is corrected for known errors due to changes in temperature, ambient light and reflectivity.
  • the range measurement is scaled in to an appropriate set of units.
  • the encoder measurement is converted into an azimuth angle measured from a reference point such as the nose of the vehicle. For example, given an encoder with 4096 discrete positions per revolution, the azimuth angle (ψ) would be calculated from the encoder output (E) using equation (1).
  • the range (R) and azimuth angle (ψ) is then converted into a three-dimensional position relative to the aircraft axes system, taking into account the mirror geometry.
  • the aircraft body axes are a right-handed axes system fixed at the sensor position and rotating with the vehicle as shown in figure 5.
  • the vehicle illustrated is a fixed wing UAV, but the axes system equally applies to rotary wing UAVs.
  • The x-body axis is aligned with the length of the aircraft so that the positive x direction points forwards from the nose.
  • the y-axis passes out to the right parallel to the straight line joining the wing tips.
  • the z-axis points vertically down when the aircraft is flying level.
  • the mirror shaft has a tilt of ⁇ , and the mirror face is offset from a plane normal to the axis of rotation by ⁇ m degrees.
  • the unit vector normal to the face of the mirror is given by equations (2-3).
  • Equation (4) provides the resulting coordinates for each scan point given the range and the components of the unit vector normal to the mirror.
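Equations (2–4) are not reproduced in this text, but the conversion they describe can be sketched with the standard mirror-reflection formula. The function names, the sign conventions and the assumption that the laser leaves the rangefinder along the +x body axis are illustrative only:

```python
import numpy as np

def rotate(v, k, angle):
    """Rodrigues rotation of vector v about the unit axis k by angle radians."""
    k = k / np.linalg.norm(k)
    return (v * np.cos(angle)
            + np.cross(k, v) * np.sin(angle)
            + k * np.dot(k, v) * (1.0 - np.cos(angle)))

def scan_point(rng, psi, theta=np.radians(45), theta_m=np.radians(10)):
    """Scan point in aircraft body axes (origin at the mirror centre),
    assuming the laser leaves the rangefinder along +x and the mirror shaft
    axis lies in the x-z plane, tilted theta below the x axis (z down)."""
    a = np.array([np.cos(theta), 0.0, -np.sin(theta)])   # shaft axis of rotation
    n0 = rotate(a, np.array([0.0, 1.0, 0.0]), theta_m)   # mirror face normal at psi = 0
    n = rotate(n0, a, psi)                               # normal spins with the shaft
    d = np.array([1.0, 0.0, 0.0])                        # outgoing laser direction
    d_ref = d - 2.0 * np.dot(d, n) * n                   # specular reflection off the mirror
    return rng * d_ref                                   # scale by the measured range
```

With the 45° shaft and 10° face offset of the embodiment, the beam at ψ = 0 leaves about 20° (twice the face offset) from straight down, and as ψ varies the deflection angle itself varies, which is why the beam traces an oval rather than a circle on a flat landing area.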
  • the scan point coordinates are shown in aircraft body axes with the origin being defined as the centre of the mirror.
  • each 3D point can be adjusted for the attitude (pitch, roll, yaw) of the flight vehicle as measured by the vehicle's attitude reference system.
  • the shift in position of each point due to the velocity of the vehicle can be corrected if the vehicle's velocity is known from another sensor such as a global positioning sensor.
  • the points are in global coordinates defined relative to the earth-based axes system.
  • Each 3D point is stored into a buffer in the processing unit memory.
  • the buffer of 3D points is passed to a software subroutine which calculates the plane of best fit to the stored points.
  • a plane in 3D space is described by equation 5.
  • One way of determining the coefficients describing the plane is to use a least- squares method.
  • the objective of the least-squares minimisation is to find the value of the plane coefficient vector such that the sum of the squares of the error residuals (R) in equation 6 is minimised.
  • Equation (8) is a solution to the least-squares problem.
  • equation (10) defines the orientation of the plane.
  • the results of the above calculations give an indication of the distance to a landing area, and also an indication of the slope or attitude of the landing area. This information can be used to assist in landing either a manned or unmanned vehicle.
  • the surface estimation algorithm can be further refined to disregard points that do not lie on the landing area. For example, when the vehicle is above a ship's deck, some of the scanned points may fall on the sea rather than on the deck. This can be achieved using a simple iterative process as follows. First the distance between every scanned point and the first estimate of the plane is calculated. Any points that are outside of a certain tolerance are ignored and the plane equation recalculated. This process can be continued until all the points marked valid are within a set tolerance of the deck surface.
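The plane fit and the iterative rejection of off-deck points can be sketched as follows. This uses a z = ax + by + c parameterisation rather than the patent's (unreproduced) equations 5–10, which is valid for a non-vertical plane as seen from above; the function names and tolerance are illustrative:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane z = a*x + b*y + c through an Nx3 point cloud.
    Returns the coefficients (a, b, c)."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return coeffs

def fit_plane_robust(points, tol=0.2, max_iter=10):
    """Iteratively refit, discarding points farther than tol from the
    current plane estimate (e.g. laser returns from the sea around a deck)."""
    pts = np.asarray(points, dtype=float)
    keep = np.ones(len(pts), dtype=bool)
    for _ in range(max_iter):
        a, b, c = fit_plane(pts[keep])
        # perpendicular distance from each point to the plane z = a*x + b*y + c
        resid = np.abs(pts[:, 2] - (a * pts[:, 0] + b * pts[:, 1] + c))
        resid /= np.sqrt(a * a + b * b + 1.0)
        new_keep = resid < tol
        if np.array_equal(new_keep, keep):   # converged: valid set unchanged
            break
        keep = new_keep
    return (a, b, c), keep
```

The slope (attitude) of the landing area then follows from the fitted normal (-a, -b, 1), and the distance under the vehicle from evaluating the plane at x = y = 0.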
  • the laser rangefinder transceiver gives an indication of the orientation and distance of a remote surface, such as a landing area, but it does not indicate the relative location of the vehicle to a point on the surface, in directions parallel to the surface.
  • a visual tracking sensor is fixed to the vehicle to locate a target provided at the landing area.
  • the visual tracking sensor provides a measurement of the relative azimuth angle and elevation angle between the tracking sensor and the target. The equation for a line in space which passes through the tracking sensor and the plane of the landing surface can then be determined.
  • the intersection of this line and the plane of the landing surface measured using the laser rangefinder then provides the relative x,y and z position of the helicopter relative to the beacon which can be used in a feedback loop to control the relative position of the helicopter to the landing point.
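The line–plane intersection can be sketched as below. The angle conventions (body axes x forward, y right, z down; elevation measured down from the x-y plane) are assumptions, as the patent does not define them:

```python
import numpy as np

def relative_position(az, el, plane):
    """Intersect the sensor-to-beacon line of sight (azimuth az, elevation el,
    in radians) with the landing plane a*x + b*y + c*z = d fitted by the
    rangefinder.  Returns the beacon position relative to the sensor, which
    sits at the origin of the body axes."""
    a, b, c, d = plane
    # unit line-of-sight vector in body axes
    los = np.array([np.cos(el) * np.cos(az),
                    np.cos(el) * np.sin(az),
                    np.sin(el)])
    normal = np.array([a, b, c])
    denom = np.dot(normal, los)
    if abs(denom) < 1e-9:
        raise ValueError("line of sight parallel to landing plane")
    t = d / denom          # distance along the line of sight to the plane
    return t * los
```

The returned x, y and z offsets are what the feedback loop would drive to the desired hover position over the landing point.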
  • Other sensors such as inertial sensors may be used to smooth the relative x,y,z position estimates from the rangefinder and tracking sensor combination.
  • a single light source, or beacon is used as the target which is centred in the field of view where practicable.
  • a single beacon is all that is required to fix the position of the helicopter with respect to the centre of the deck.
  • Since the yaw angle of the helicopter is not determined using the combination of a point target and laser scanner, the yaw loop may be controlled using a PD feedback scheme based on heading angle, and all that is required is a system to tell the helicopter what heading to steer to match the course of the ship. This only requires a slow update as ships take a long time to change course when underway.
  • a bright LED having a relatively narrow output frequency band is used for the beacon. Rejection of unwanted specular reflections and other lights is possible by choosing a narrow band filter matched to the spectral output of the beacon.
  • the LED output frequency band is ideally chosen where atmospheric absorption is minimal.
  • An alternative proposal is to make use of a colour camera or image sensor, controlled so that pixels with a significant blue or green intensity are discounted. Since astronomical light sources such as the sun (and their reflections) have a broad spectral presence, analysing the target principally in the red portion of the spectrum can be an enhanced option in embodiments of the invention.
  • An example implementation uses a CMOS image sensor, with all of the necessary image processing and coordinate determination located within a single FPGA (Field Programmable Gate Array).
  • the FPGA interfaces to the flight control system and delivers the reliable coordinates of the beacon within the image field to the flight computer.
  • the optics used to image onto the sensor define the positioning accuracy and initial capture range of the beacon.
  • the CS mount allows for other lenses to achieve a desired FOV (Field of View) and f-number.
  • the beam pattern may be narrowed, increasing the output aperture, in which case more of the sensor will be illuminated by the beacon.
  • the image sensor may be controlled in exposure and frame rate.
  • the pixel intensity data is delivered to the adjacent FPGA, upon which the first of two pipelined algorithms is run. This algorithm monitors the live pixel stream to locate adjacent pixels of brightness above a threshold, and determines whether to consider these as contenders for the beacon.
  • the algorithm has found four regions (1, 2, 3 and 4) that have sufficient brightness and satisfy the requirement for adjoining pixels to be considered part of the tracked light. For regions 5 and 6, the connection between the pixels is not good enough to create a single region. As the determination of the area of the region must be made as pixels are rastered from the sensor, the left-most column and run length within that row are remembered for each of a finite number of regions, as is the first encountered row (top). As rows are analysed the boundaries of each region are moved outwards to the farthest contiguous breadth and height, so that the left, top, right and bottom encompass each region contender in the frame. From this the rectangular area is obtained, and the horizontal and vertical centres are easily determined.
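The adjacency rule can be illustrated offline as follows. The FPGA of the embodiment performs this in a single streaming pass with a one-line FIFO; this sketch instead uses a simple flood fill, which yields the same bounding boxes and centres for each bright region:

```python
from collections import deque

def bright_regions(frame, threshold):
    """Group 4-adjacent pixels brighter than threshold into regions and
    return (left, top, right, bottom, centre) for each region, in raster
    scan order."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    regions = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] > threshold and not seen[r][c]:
                # flood fill one contiguous bright region
                q, cells = deque([(r, c)]), []
                seen[r][c] = True
                while q:
                    y, x = q.popleft()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            q.append((ny, nx))
                ys = [y for y, _ in cells]
                xs = [x for _, x in cells]
                left, top, right, bottom = min(xs), min(ys), max(xs), max(ys)
                centre = ((left + right) / 2.0, (top + bottom) / 2.0)
                regions.append((left, top, right, bottom, centre))
    return regions
```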
  • a single line depth FIFO (First In First Out) is all that is required for the vertical comparison.
  • the threshold is updated at the end of each frame based on a fixed percentage of the peak pixel value within that frame which is remembered during the streaming, and is applied during the following frame on the assumption there will be little interframe change.
  • the computed centres of the regions are analysed, with the centres and areas stored.
  • areas that are not the target, such as specular reflections from the sea surface, can be rejected at this stage.
  • the most likely location for the beacon based on area and intensity is chosen for each frame.
  • a calibration look-up table within the FPGA is used to convert the pixel location from the tracking algorithm into an azimuth and elevation angle measured from the tracking sensor to the beacon. These angles are then forwarded to the flight computer. Should no reliable beacon be determined from this stage, no coordinate is sent in that frame time.
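The shape of that final conversion can be sketched with a pinhole-camera approximation standing in for the calibration look-up table; the parameter names (optical centre cx, cy and focal length in pixels) are assumed, not taken from the patent:

```python
import math

def pixel_to_angles(px, py, cx, cy, focal_px):
    """Convert a beacon pixel location to (azimuth, elevation) in radians
    from the tracking sensor, using a pinhole camera model as a stand-in
    for the FPGA's calibration look-up table."""
    az = math.atan2(px - cx, focal_px)   # horizontal offset -> azimuth
    el = math.atan2(py - cy, focal_px)   # vertical offset -> elevation
    return az, el
```

A real calibration table would additionally absorb lens distortion, which this model ignores.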
  • The term "shape" is intended to define an area or implied area in two dimensions and does not encompass a straight line. Whilst the preferred embodiment has been described with reference to landing an unmanned helicopter, the invention also has application for manned aerial vehicles where the pilot may be provided with useful landing guidance information.
  • Whilst the preferred embodiment has been described with reference to landing a vehicle on the ground, the invention also has other applications, such as docking two vehicles. These vehicles could be space vehicles.
  • One vehicle may have a flat reference surface and target.
  • the flat reference surface may be a collection of reflectors defining an imaginary reference surface.
  • the invention may allow automated or assisted docking of a ship or a blimp, where the flat surface and target are at the dock, or guiding fixed-wing aircraft to the threshold of a landing strip. Any reference to prior art contained herein is not to be taken as an admission that the information is common general knowledge, unless otherwise indicated.
  • the beam may not be directed by a movable mirror, but instead by a movable prism, by a movable optical fibre, by moving the laser itself, or by a combination of these.
  • a single beam may be split into multiple beams by a hologram or optical fibre bundle, and each of the multiple beams may be pointed in a static direction relative to the vehicle.
  • the multiple beams may originate from multiple sources such as lasers.

Abstract

A method of determining the attitude of a remote surface (14) is disclosed. The method includes the step of directing a beam (24) towards at least three separate points on the surface (14), the points defining a shape. The method also includes the step of receiving light reflected from the surface (14) from the at least three points. The attitude of the surface (14) is determined based on the received light. The method has particular application in the landing of an aerial vehicle.

Description

A method of determining characteristics of a remote surface with application to the landing of an aerial vehicle
Technical Field
This invention relates to a method of determining characteristics of a remote surface and has particular application in the field of landing aerial vehicles including unmanned aerial vehicles (UAVs). UAVs include both fixed wing vehicles such as unmanned aeroplanes and rotary wing vehicles such as unmanned helicopters.
Background to the Invention
When landing an aerial vehicle such as a helicopter it is important that the pilot or automated flight control system for the aerial vehicle has access to accurate velocity and position information relative to the surface of the landing area. Further, an accurate estimate of the landing area orientation and any motion of the landing area (as on a ship's flight deck) must be available so that a safe decision on if and when to land can be made. In the case of a manned vehicle, the landing task is usually controlled by input from the pilot to the aircraft controls based on a visual assessment of the landing area with the aid of aircraft instrumentation such as compass and altimeter. In the case of UAVs, these vehicles are typically controlled by a combination of commands received from a ground control facility and by commands generated in control systems provided on board the unmanned vehicle. There is a need to improve the ability of manned and unmanned vehicles to land in a landing area and this is particularly the case where the landing area itself may be on a slope or subject to movement such as in the case of landing on a ship or other floating platform.
Summary of the Invention
In a first aspect the present invention provides a method of determining the attitude of a remote surface including the steps of: directing a beam towards at least three separate points on the surface, the points defining a shape; receiving light reflected from the surface from the at least three points; and determining the attitude of the surface based on the received light.
The shape may be substantially oval or circular.
The beam may be a beam of light.
The beam may be a laser beam. The beam may be directed by moving a mirror. The mirror may rotate.
The attitude of the surface may be determined based on the time taken between the light being directed towards a point and receiving light reflected back from that point.
The method may be carried out on a moving aerial vehicle and the surface may be a landing area for the vehicle.
The attitude of the surface relative to the moving aerial vehicle may be varying. The future attitude of the surface relative to the moving vehicle can be predicted using the method.
In a second aspect the present invention provides a system for determining characteristics of a remote surface including: directing means for directing a beam towards at least three separate points on the surface, the points defining a shape; and receiving means for receiving reflections of the beam from the at least three points. The directing means may include a moveable mirror.
The mirror may rotate about an axis and the axis of rotation of the mirror intersects the plane of the reflective surface of the mirror.
The axis of rotation may intersect the plane of the reflective surface at an angle of typically between 70 and 90 degrees. In a third aspect the present invention provides a method of landing an aerial vehicle including the steps of: directing a beam towards a landing area; receiving reflections of the beam which are reflected back from the landing area; and determining characteristics of the landing area based on the time taken to receive reflections of the beam. The characteristics determined may include either the distance to the landing area or the attitude of the landing area.
The method may further include the step of providing a vision sensor on the vehicle which is arranged to detect a target which is provided in a pre-determined location in relation to the landing area. In a fourth aspect the present invention provides a system for landing an aerial vehicle including: directing means for directing a beam in the direction of a landing area; and receiving means for receiving reflections of the beam back from the landing area. The system may further include a vision sensor arranged to detect a target which is provided in a pre-determined location in relation to the landing area.
For manned and unmanned operations of vertical landing, it is critical that the slope and condition of the ground being landed on is known so that a decision can be made on whether the landing is safe. Helicopters have a maximum slope on which they are permitted to land. If this slope is exceeded, the helicopter can topple or slide or the control limits of the rotor may be exceeded. Embodiments of the invention provide the operator with a measure of the slope of the ground and the distance to the ground directly under the helicopter at all times during hover and the landing phase.
Embodiments of the invention provide an accurate measure of the distance to the ground which can be used in a feedback loop to control the height of the vehicle automatically.
For operations on a moving deck, such as on a ship, the motion of the ship can be estimated based on the output of the sensor and this estimate can then be integrated into the guidance system of the flight vehicle.
Brief Description of the Drawings
An embodiment of the present invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic view of a rotary winged UAV fitted with a time of flight measuring instrument;
Figure 2 is a detailed view of a time of flight measuring instrument; Figure 3 is an alternative embodiment of a time of flight measuring instrument; Figure 4 is a schematic illustration of the systems used to control the UAV of figure 1; Figure 5 illustrates a vehicle body axes system used in embodiments of the invention;
Figure 6 illustrates the geometry of an embodiment of the invention; and Figure 7 illustrates how the vision sensor tracking algorithm determines bright regions from the live pixel stream using adjacency rules.
Detailed Description of the Preferred Embodiment
Referring to Figure 1, a UAV in the form of an unmanned helicopter 10 is shown hovering above a landing area 14 being the deck of a ship at sea. The UAV is fitted with a means for directing a laser beam towards the landing area and means for receiving the reflected beam in the form of transceiver 12.
Referring to figure 2, transceiver 12 includes a laser rangefinder 16 such as the AccuRange 4000 produced by Acuity, Inc. The AccuRange 4000 laser rangefinder is an optical long distance measuring tool. It operates by means of an infrared laser diode that emits light of 780 nm wavelength at a power of 20 mW. Employing time-of-flight measuring principles, this particular rangefinder can accurately gauge distances up to about 20 metres.
Transceiver 12 further includes a mirror 18 which is rotatably mounted to a housing 20. Mirror 18 is rotated by a stepper motor 22 which includes an encoder. The encoder provides a value between zero and 4096 that corresponds to the angle of rotation (0° to 360°). The encoder value is output with the respective range measurement so that the measurements can be related to the helicopter's body axis coordinate system. In use the mirror rotates at a fixed speed of about 1500 rpm. Stepper motor 22 has an axis of rotation A which is offset from the axis of the beam emitted by rangefinder 16 by 45 degrees. The face of mirror 18 is offset to the axis of rotation by 10 degrees. Thus, as mirror 18 rotates the laser beam is directed to trace out a conical surface 24. The base of the conical surface 24 describes an oval shape on the landing area when the helicopter is hovering directly over a substantially flat landing area.
Referring to figure 3, an alternative arrangement of transceiver is shown and like reference numerals indicate identical components as seen in figure 2. In this arrangement, rangefinder 16 directs a laser beam towards mirror 32 which is offset by 45 degrees to the axis of the laser beam. The beam is reflected towards mirror 34 which is offset to the axis of the beam by about 42.5 degrees. Both of mirrors 32 and 34 are mounted in a carrier 33 which in turn is mounted to the output shaft of stepper motor 22. The carrier is rotated by the stepper motor and a counterbalance 35 is provided to minimise vibration forces.
Referring to figure 4, the diagram schematically illustrates a computing device 40 for a helicopter. The device 40 includes a PC104 flight computer 42 in combination with a high speed interface card 43 (HSIF). The HSIF card enables a maximum sampling rate of 50,000 samples per second. The samples come over the bus in an 8 byte format that includes a 19 bit range value and 1 byte values for signal strength, ambient light and sensor internal temperature as well as status and general input bits.
For convenience the geometry of the system is shown in figure 6 and the system will now be explained with reference to this geometry.
Detailed Description of how the system calculates attributes of landing area
In operation, the transceiver 12 continuously takes readings to measure the distance to points in three dimensions based on reflections of the emitted laser beam received by the transceiver and based on the position of the mirror stepper motor. By taking a number of samples at different rotational positions of the mirror it is possible to build up a three-dimensional (3D) cloud of points. Typically, 100 points are measured for each revolution of the mirror. From this, the distance to the landing area and the slope of the landing area can be calculated.
For each scanned sample, a distance measurement and encoder output are taken from the laser apparatus. The range measurement is corrected for known errors due to changes in temperature, ambient light and reflectivity. The range measurement is scaled into an appropriate set of units. The encoder measurement is converted into an azimuth angle measured from a reference point such as the nose of the vehicle. For example, given an encoder with 4096 discrete positions per revolution, the azimuth angle (ψ) in radians would be calculated from the encoder output (E) using equation (1):

ψ = 2π E / 4096 (1)
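As a minimal sketch of the encoder-to-azimuth conversion of equation (1) (the function and constant names are illustrative, not from the patent):

```python
# Convert a 4096-count encoder reading to an azimuth angle, per equation (1).
import math

ENCODER_COUNTS = 4096  # discrete encoder positions per revolution

def encoder_to_azimuth(encoder_value: int) -> float:
    """Return the azimuth angle psi in radians for encoder output E."""
    return 2.0 * math.pi * (encoder_value % ENCODER_COUNTS) / ENCODER_COUNTS

angle = encoder_to_azimuth(1024)  # a quarter revolution: pi/2 radians
```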
The range (R) and azimuth angle (ψ) are then converted into a three-dimensional position relative to the aircraft axes system, taking into account the mirror geometry. The aircraft body axes are a right-handed axes system fixed at the sensor position and rotating with the vehicle as shown in figure 5. The vehicle illustrated is a fixed wing UAV, but the axes system equally applies to rotary wing UAVs. The x body axis is aligned with the length of the aircraft so that the positive x direction points forwards from the nose. The y-axis passes out to the right parallel to the straight line joining the wing tips. The z-axis points vertically down when the aircraft is flying level. Referring to figure 6, the mirror shaft has a tilt of θs, and the mirror face is offset from a plane normal to the axis of rotation by θm degrees. Assuming that the laser beam departs the rangefinder parallel to the aircraft x body axis and pointing forwards, then the unit vector n̂ normal to the face of the mirror is given by equations (2-3):

n̂ = cos(θm) â + sin(θm) (cos(ψ) û + sin(ψ) v̂) (2)

where â is the unit vector along the shaft axis of rotation and û, v̂ are orthogonal unit vectors spanning the plane normal to â. For a shaft tilted by θs in the x-z plane:

â = (−sin θs, 0, cos θs), û = (cos θs, 0, sin θs), v̂ = (0, 1, 0) (3)
From the unit vector n̂, the coordinates (x, y, z) of each scan point in the aircraft body axes system can be determined by a reflection transformation applied to the incident laser beam striking the mirror. Equation (4) provides the resulting coordinates for each scan point given the range and the components of the unit vector n̂ normal to the mirror; with d̂ = (1, 0, 0) the unit vector of the incident beam:

(x, y, z) = R (d̂ − 2 (d̂ · n̂) n̂) (4)

The scan point coordinates are shown in aircraft body axes with the origin being defined as the centre of the mirror.
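The reflection transformation can be sketched as follows; the parameterisation of the shaft-axis and mirror-face vectors here is an assumed reading of the figure 6 geometry, not taken verbatim from the patent:

```python
import numpy as np

def mirror_normal(theta_s: float, theta_m: float, psi: float) -> np.ndarray:
    """Unit normal of the mirror face: the shaft is tilted theta_s from the
    z body axis in the x-z plane, the face is offset theta_m from the plane
    normal to the shaft, and psi is the encoder azimuth about the shaft."""
    a = np.array([-np.sin(theta_s), 0.0, np.cos(theta_s)])  # shaft axis
    u = np.array([np.cos(theta_s), 0.0, np.sin(theta_s)])   # perpendicular, in x-z
    v = np.array([0.0, 1.0, 0.0])                           # completes the basis
    return np.cos(theta_m) * a + np.sin(theta_m) * (np.cos(psi) * u + np.sin(psi) * v)

def scan_point(rng: float, n_hat: np.ndarray) -> np.ndarray:
    """Scan-point coordinates in body axes: Householder reflection of the
    incident beam d = (1, 0, 0), scaled by the measured range."""
    d = np.array([1.0, 0.0, 0.0])
    r = d - 2.0 * np.dot(d, n_hat) * n_hat  # reflected unit direction
    return rng * r

# A flat 45-degree mirror (theta_m = 0) turns the forward beam straight down:
p = scan_point(10.0, mirror_normal(np.radians(45.0), 0.0, 0.0))
```

With a small face offset theta_m (10 degrees in the embodiment), sweeping psi through a full revolution makes the reflected beam trace the conical surface described earlier.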
If desired, each 3D point can be adjusted for the attitude (pitch, roll, yaw) of the flight vehicle as measured by the vehicle's attitude reference system. The shift in position of each point due to the velocity of the vehicle can be corrected if the vehicle's velocity is known from another sensor such as a global positioning sensor. After these transformations, the points are in global coordinates defined relative to the earth based axes system. Each 3D point is stored into a buffer in the processing unit memory.
After a complete scan, the buffer of 3D points is passed to a software subroutine which calculates the plane of best fit to the stored points. A plane in 3D space is described by equation 5; by determining the coefficients K1, K2 and K3 the plane of the surface is then defined:

z = K1 x + K2 y + K3 (5)
One way of determining the coefficients describing the plane is to use a least-squares method. The objective of the least-squares minimisation is to find the value of the plane coefficient vector K = (K1, K2, K3)ᵀ such that the sum of the squares of the error residuals (R) in equation 6 is minimised:

R = Σi (K1 xi + K2 yi + K3 − zi)² (6)
To implement this, the coordinates of the scan points are arranged in matrix form as per equation 7:

A K = z, with A = [x1 y1 1; x2 y2 1; …; xn yn 1] and z = (z1, z2, …, zn)ᵀ (7)
Equation (8) is a solution to the least squares problem:

K = (Aᵀ A)⁻¹ Aᵀ z (8)
Once the equation of the plane is found from equation (8) or by some other means, the instantaneous height H of the vehicle above the surface can be found using equation (9). With the body axes origin at the sensor (x = y = 0):

H = K3 (9)
Likewise the roll and pitch of the surface can be found from the equation of the plane. If we define pitch angle (θ) as the inclination of the plane from the y-axis and roll (φ) as the inclination of the plane from the x-axis, then equation (10) defines the orientation of the plane:

θ = arctan(K1), φ = arctan(K2) (10)
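Assuming the plane is parameterised as z = K1·x + K2·y + K3 in body axes, the least-squares fit and the height/attitude extraction can be sketched with NumPy (function names are illustrative):

```python
import numpy as np

def fit_plane(points: np.ndarray) -> np.ndarray:
    """Least-squares plane z = K1*x + K2*y + K3 through an (N, 3) buffer of
    scan points, solving the overdetermined system A K = z."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    K, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return K  # (K1, K2, K3)

def surface_state(K):
    """Height above the surface and its pitch/roll from the coefficients."""
    K1, K2, K3 = K
    height = K3            # plane intercept directly below the sensor (z down)
    pitch = np.arctan(K1)  # inclination of the plane about the y-axis
    roll = np.arctan(K2)   # inclination of the plane about the x-axis
    return height, pitch, roll
```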
Thus, the results of the above calculations give an indication of the distance to a landing area, and also an indication of the slope or attitude of the landing area. This information can be used to assist in landing either a manned or unmanned vehicle.
The surface estimation algorithm can be further refined to disregard points that do not lie on the landing area. For example, when the vehicle is above a ship's deck, some of the scanned points may fall on the sea rather than on the deck. This can be achieved using a simple iterative process as follows. First, the distance between every scanned point and the first estimate of the plane is calculated. Any points that are outside of a certain tolerance are ignored and the plane equation recalculated. This process can be continued until all the points marked valid are within the tolerance of the deck surface.
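The iterative rejection step might be sketched as follows (a hypothetical illustration, again assuming the plane form z = K1·x + K2·y + K3; the tolerance value is arbitrary):

```python
import numpy as np

def robust_fit(points, tol=1.0, max_iter=10):
    """Fit a plane, then repeatedly discard points lying farther than `tol`
    from the current estimate (e.g. sea returns around a ship's deck) and
    refit, until every remaining point is within tolerance."""
    pts = np.asarray(points, dtype=float)
    for _ in range(max_iter):
        A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
        K, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
        # perpendicular distance of every point from the fitted plane
        dist = (np.abs(K[0] * pts[:, 0] + K[1] * pts[:, 1] + K[2] - pts[:, 2])
                / np.sqrt(K[0] ** 2 + K[1] ** 2 + 1.0))
        keep = dist <= tol
        if keep.all():
            break
        pts = pts[keep]
    return K, pts
```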
Visual Tracking Sensor
Use of the laser rangefinder transceiver gives an indication of the orientation and distance of a remote surface, such as a landing area, but it does not indicate the relative location of the vehicle to a point on the surface, in directions parallel to the surface. To provide this information a visual tracking sensor is fixed to the vehicle to locate a target provided at the landing area. The visual tracking sensor provides a measurement of the relative azimuth angle and elevation angle between the tracking sensor and the target. The equation for a line in space which passes through the tracking sensor and the plane of the landing surface can then be determined. Using simple Cartesian geometry, the intersection of this line and the plane of the landing surface measured using the laser rangefinder then provides the x, y and z position of the helicopter relative to the beacon, which can be used in a feedback loop to control the relative position of the helicopter to the landing point. Other sensors such as inertial sensors may be used to smooth the relative x, y, z position estimates from the combined rangefinder and tracking sensor.

Detailed Description of Preferred Embodiment of Tracking Sensor
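The line-plane intersection described above can be sketched as follows (the angle conventions and names are assumptions for illustration; the plane is the fitted z = K1·x + K2·y + K3 in body axes):

```python
import numpy as np

def beacon_position(azimuth: float, elevation: float, K) -> np.ndarray:
    """Intersect the sensor-to-beacon line of sight with the fitted landing
    plane z = K1*x + K2*y + K3 (body axes, z down). Elevation is measured
    downwards from the horizontal x-y plane."""
    d = np.array([np.cos(elevation) * np.cos(azimuth),
                  np.cos(elevation) * np.sin(azimuth),
                  np.sin(elevation)])  # unit line-of-sight vector
    K1, K2, K3 = K
    # solve d_z*t = K1*d_x*t + K2*d_y*t + K3 for the range t along the line
    t = K3 / (d[2] - K1 * d[0] - K2 * d[1])
    return t * d  # beacon position relative to the vehicle
```

For a level deck 10 m below, a beacon seen straight down (elevation pi/2) resolves to (0, 0, 10) in body axes.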
A single light source, or beacon, is used as the target which is centred in the field of view where practicable. In conjunction with the laser sensor, a single beacon is all that is required to fix the position of the helicopter with respect to the centre of the deck. Whilst the yaw angle of the helicopter is not determined using the combination of a point target and laser scanner, the yaw loop may be controlled using a PD feedback scheme based on heading angle, and all that is required is a system to tell the helicopter what heading to steer to match the course of the ship. This only requires a slow update as ships take a long time to change course when underway.
A bright LED having a relatively narrow output frequency band is used for the beacon. Rejection of unwanted specular reflections and other lights is possible by choosing a narrow band filter matched to the spectral output of the beacon. The LED output frequency band is ideally chosen where atmospheric absorption is minimal. An alternative proposal is to make use of a colour camera or image sensor, controlled so that pixels with a significant blue or green intensity are discounted. Since astronomical light sources such as the sun (and their reflections) have a broad spectral presence, analysing the target principally in the red portion of the spectrum can be advantageous in embodiments of the invention.
An example image sensor is achieved through the use of a CMOS image sensor, with all of the necessary image processing and coordinate determination located within a single FPGA (Field Programmable Gate Array). The FPGA interfaces to the flight control system and delivers the reliable coordinates of the beacon within the image field to the flight computer.
The optics used to image onto the sensor define the positioning accuracy and initial capture range of the beacon. We have used a machine vision CS mount lens with 6mm focal length and red narrowband optical filters to improve rejection of the specular reflections expected in the water environment. The CS mount allows for other lenses to achieve a desired FOV (Field of View) and f-number. The beam pattern may be narrowed, increasing the output aperture, in which case more of the sensor will be illuminated by the beacon. To enhance performance, the image sensor may be controlled in exposure and frame rate. The pixel intensity data is delivered to the adjacent FPGA, upon which the first of two pipelined algorithms is run. This algorithm monitors the live pixel stream to locate adjacent pixels of brightness above a threshold, and determines whether to consider these as contenders for the beacon. As an example, in Figure 7 the algorithm has found four regions (1, 2, 3 and 4) that have sufficient brightness and satisfy the adjoining-pixel requirement to be considered part of the tracked light. For regions 5 and 6 the connection between the pixels is not good enough to create a single region. As the determination of the area of the region must be made as pixels are rastered from the sensor, the left-most column and run length within that row are remembered for each of a finite number of regions, as is the first encountered row (top). As rows are analysed the boundaries of each region are moved outwards to the farthest contiguous breadth and height, hence the left, top, right and bottom encompass each region contender in the frame. From this the rectangular area is obtained, and the horizontal and vertical centres are easily determined.
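A simplified offline sketch of this region finder follows; the embodiment runs in a single raster pass on the FPGA with a one-line FIFO, whereas this flood-fill version computes the same bounding boxes and centres and is easier to follow:

```python
from collections import deque

def find_regions(frame, threshold):
    """frame: 2D list of pixel intensities. Returns one dict per bright
    4-connected region with its bounding box, area and centre."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    regions = []
    for r0 in range(rows):
        for c0 in range(cols):
            if frame[r0][c0] < threshold or seen[r0][c0]:
                continue
            left = right = c0
            top = bottom = r0
            area = 0
            queue = deque([(r0, c0)])
            seen[r0][c0] = True
            while queue:  # grow the region over adjoining bright pixels
                r, c = queue.popleft()
                area += 1
                left, right = min(left, c), max(right, c)
                top, bottom = min(top, r), max(bottom, r)
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and not seen[nr][nc] and frame[nr][nc] >= threshold):
                        seen[nr][nc] = True
                        queue.append((nr, nc))
            regions.append({"left": left, "right": right, "top": top,
                            "bottom": bottom, "area": area,
                            "centre": ((left + right) / 2, (top + bottom) / 2)})
    return regions
```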
A single line depth FIFO (First In First Out) is all that is required for the vertical comparison. The threshold is updated at the end of each frame, based on a fixed percentage of the peak pixel value within that frame, which is remembered during the streaming and applied during the following frame on the assumption that there will be little interframe change.
The computed centres of the regions are analysed, with the centres and areas stored. With an appropriately tuned spatial and/or temporal filter, areas that are not the target (such as specular reflections from the sea surface) can be rejected, whilst tolerating smaller motions due to atmospherically induced scintillation effects and motion caused in the image field by aircraft and landing platform alike. The most likely location for the beacon based on area and intensity is chosen for each frame. A calibration look-up table within the FPGA is used to convert the pixel location from the tracking algorithm into an azimuth and elevation angle measured from the tracking sensor to the beacon. These angles are then forwarded to the flight computer. Should no reliable beacon be determined at this stage, no coordinate is sent in that frame time.
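The embodiment uses a calibration look-up table for the pixel-to-angle conversion; as a hypothetical stand-in, an ideal pinhole model gives the flavour of the mapping (it ignores the lens distortion a real calibration table would absorb):

```python
import math

def pixel_to_angles(px, py, cx, cy, focal_px):
    """Convert a tracked pixel location to azimuth/elevation angles from the
    tracking sensor, for an ideal pinhole camera with optical centre (cx, cy)
    and focal length focal_px expressed in pixels."""
    azimuth = math.atan2(px - cx, focal_px)
    elevation = math.atan2(py - cy, focal_px)
    return azimuth, elevation
```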
Now that embodiments of the invention have been described it will be appreciated that some embodiments have some of the following advantages: it is possible to identify the orientation, location and distance of a remote surface with respect to an aerial vehicle even when the remote surface is not fixed in space; it is known when the circle of laser points falls off the edge of a ship's deck, for example, or when the laser point is reflected badly or is aimed down a hole in the deck of the ship; the algorithms can predict attitude, target location and distance in the future; the LED beacon helps guide the aerial vehicle down to a particular spot on the surface; and the future position of the remote surface and target can be predicted, which is useful in situations such as when the landing surface is the moving deck of a ship.
In this document the term "shape" is intended to define an area or implied area in two dimensions and does not encompass a straight line. Whilst the preferred embodiment has been described with reference to landing an unmanned helicopter, the invention also has application for manned aerial vehicles where the pilot may be provided with useful landing guidance information.
Whilst the preferred embodiment has been described with reference to landing a vehicle on the ground, the invention also has other applications, such as docking two vehicles. These vehicles could be space vehicles. One vehicle may have a flat reference surface and target. The flat reference surface may be a collection of reflectors defining an imaginary reference surface. Alternatively, the invention may allow automated or assisted docking of a ship or a blimp where the flat surface and target is at the dock, or guiding fixed-wing aircraft to the threshold of a landing strip. Any reference to prior art contained herein is not to be taken as an admission that the information is common general knowledge, unless otherwise indicated.
Finally, it is to be appreciated that various alterations or additions may be made to the parts previously described without departing from the spirit or ambit of the present invention. For example, the beam may not be directed by a movable mirror, but instead a movable prism, or by a movable optical fibre, moving the laser itself, or a combination of these. Alternatively, a single beam may be split into multiple beams by a hologram or optical fibre bundle, and each of the multiple beams may be pointed in a static direction relative to the vehicle. The multiple beams may originate from multiple sources such as lasers.

Claims

1. A method of determining the attitude of a remote surface including the steps of: directing a beam towards at least three separate points on the surface, the points defining a shape; receiving light reflected from the surface from the at least three points; and determining the attitude of the surface based on the received light.
2. A method according to claim 1 wherein the shape is substantially circular or oval-shaped.
3. A method according to either of claim 1 or claim 2 wherein the beam is a beam of light.
4. A method according to any preceding claim wherein the beam is a laser beam.
5. A method according to any preceding claim wherein the beam is directed by moving a mirror.
6. A method according to claim 5 wherein the mirror rotates.
7. A method according to claim 1 wherein the attitude of the surface is determined based on the time taken between the light being directed towards a point and receiving light reflected back from that point.
8. A method according to claim 6 wherein a rotational angle of the mirror is known.
9. A method according to claim 8 wherein the known rotational angle is used to determine the attitude of the surface.
10. A method according to any preceding claim further comprising the step of fitting an imaginary plane to a measured position of each of the at least three points.
11. A method according to any one of the preceding claims wherein the method is carried out on a moving aerial vehicle and the surface is a landing area for the vehicle.
12. A method according to claim 11 wherein the attitude of the surface relative to the moving aerial vehicle is varying.
13. A method according to claim 12 wherein the future attitude and/or distance of the surface relative to the moving vehicle is predicted.
14. A system for determining characteristics of a remote surface including: directing means for directing a beam towards at least three separate points on the surface, the points defining a shape; and receiving means for receiving reflections of the beam from the at least three points.
15. A system according to claim 14 wherein the directing means includes a moveable mirror.
16. A system according to claim 15 wherein the mirror rotates.
17. A system according to claim 16 wherein the mirror rotates about an axis which is not aligned with the incident beam and the axis of rotation of the mirror intersects the plane of the reflective surface of the mirror.
18. A system according to any one of claims 14 to 17 wherein the beam is a beam of light.
19. A system according to claim 18 wherein the beam is a laser beam.
20. A system according to any of the claims 14 to 19 further including a laser rangefinder.
21. A system according to claim 20 wherein the laser rangefinder includes a pulsed laser source for generating the beam.
22. A system according to either of claims 16 and 17 wherein the mirror is rotated by a stepper motor.
23. A system according to claim 22 wherein the stepper motor has an associated encoder.
24. A method of landing an aerial vehicle including the steps of: directing a beam towards a landing area; receiving reflections of the beam which are reflected back from the landing area; and determining characteristics of the landing area based on the time taken to receive reflections of the beam.
25. A method according to claim 24 wherein the characteristics determined include either the distance to the landing area or the attitude of the landing area.
26. A method according to either claim 24 or claim 25 and further including the step of providing a vision sensor on the vehicle which is arranged to detect a target which is provided in a pre-determined location in relation to the landing area.
27. A method according to claim 26 wherein the step of detecting the target includes the step of detecting a characteristic frequency of light emitted by the target.
28. A method according to claim 27 wherein the step of detecting the target includes the step of first passing the characteristic frequency of light through a narrow band filter.
29. A method according to any one of claims 26 to 28 wherein the step of detecting the target includes the step of forming an image of the target.
30. A method as defined in claim 29 further including the step of electronically processing the image of the target to determine the target's position relative to the vehicle.
31. A method as defined by any one of the claims 26 to 30 wherein the step of detecting the target includes temporal and/or spatial filtering of a signal from the vision sensor.
32. A system for landing an aerial vehicle including: directing means for directing a beam in the direction of a landing area; and receiving means for receiving reflections of the beam back from the landing area.
33. A system according to claim 32 and further including a system as claimed in any one of claims 14 to 23.
34. A system according to either of claims 32 and 33 further including a vision sensor arranged to detect a target which is provided in a pre-determined location in relation to the landing area.
35. A system according to any one of claims 32 to 34 further including an inertial sensor.
36. A system according to any one of claims 32 to 35 further including an image sensor.
37. A system according to any one of claims 32 to 36 further including a Field Programmable Gate Array.
38. A system according to claim 36 wherein the image sensor is provided with control means to discount the value of pixels having a significant intensity at a frequency not emitted by the target.
39. A system including a flight system to control the landing of an aircraft using information derived from a system or method of any one of the preceding claims.
PCT/AU2007/001793 2006-11-23 2007-11-23 A method of determining characteristics of a remote surface with application to the landing of an aerial vehicle WO2008061307A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
AU2006906551 2006-11-23
AU2006906551A AU2006906551A0 (en) 2006-11-23 A method of determining characteristics of a remote surface
AU2007900907 2007-02-22
AU2007900907A AU2007900907A0 (en) 2007-02-22 A method of determining characteristics of a remote surface

Publications (1)

Publication Number Publication Date
WO2008061307A1 true WO2008061307A1 (en) 2008-05-29

Family

ID=39429310

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2007/001793 WO2008061307A1 (en) 2006-11-23 2007-11-23 A method of determining characteristics of a remote surface with application to the landing of an aerial vehicle

Country Status (1)

Country Link
WO (1) WO2008061307A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010071502A1 (en) * 2008-12-15 2010-06-24 Saab Ab Measuring of a landing platform of a ship
WO2010071505A1 (en) * 2008-12-15 2010-06-24 Saab Ab Method and system for facilitating autonomous landing of aerial vehicles on a surface
WO2010144014A1 (en) * 2009-06-12 2010-12-16 Saab Ab Centering above a predetermined area of a landing platform
CN102224078A (en) * 2008-10-13 2011-10-19 Dcns公司 System for guiding a drone during the approach phase to a platform, in particular a naval platform, with a view to landing same
US8855816B2 (en) 2011-06-10 2014-10-07 Seiko Epson Corporation Piezoelectric actuator, robot hand, and robot
US9947232B2 (en) 2015-12-08 2018-04-17 Honeywell International Inc. Methods and apparatus for identifying terrain suitable for aircraft landing
US10611495B2 (en) 2016-04-08 2020-04-07 Sikorsky Aircraft Corporation Sea state estimation

Citations (3)

Publication number Priority date Publication date Assignee Title
US4925303A (en) * 1988-12-27 1990-05-15 Pavo Pusic Aircraft piloting aid laser landing system
US6327520B1 (en) * 1999-08-31 2001-12-04 Intelligent Machine Concepts, L.L.C. Planar normality sensor
GB2374743A (en) * 2001-04-04 2002-10-23 Instro Prec Ltd Surface profile measurement


Non-Patent Citations (5)

Title
ECKERSLEY-MASLIN: "Autonomous Landing of a UAV at sea", NAVY ENGINEERING BULLETIN, March 2006 (2006-03-01), Retrieved from the Internet <URL:http://www.navy.gov.au/publications/engineering/march2006/uav.html> *
JOHNSON ET AL.: "Lidar-based Hazard Avoidance for Safe Landing on Mars", J. GUID. CONTROL. DYN., vol. 25, 2002, pages 1091 - 1099 *
LIEBE ET AL.: "Laser Radar for Spacecraft Guidance Applications", IEEE AEROSPACE CONFERENCE 2003, vol. 6, March 2003 (2003-03-01), pages 6_2647 - 6_2662, XP010660622 *
SARIPLALLI ET AL.: "Landing on a Moving Target using an Autonomous Helicopter", 2003, Retrieved from the Internet <URL:http://www-robotics.usc.edu/~rsik/papers/icra2003.pdf> *
SHUANG ET AL.: "Autonomous optical navigation for loading on asteroids", AIRCRAFT ENGINEERING AND AEROSPACE TECHNOLOGY, vol. 77, no. 4, 2005, pages 317 - 323, XP001234357, DOI: doi:10.1108/00022660510606402 *

Cited By (15)

Publication number Priority date Publication date Assignee Title
CN102224078A (en) * 2008-10-13 2011-10-19 Dcns公司 System for guiding a drone during the approach phase to a platform, in particular a naval platform, with a view to landing same
US8538133B2 (en) * 2008-10-13 2013-09-17 Dcns System for guiding a drone during the approach phase to a platform, in particular a naval platform, with a view to landing same
US20120076397A1 (en) * 2008-10-13 2012-03-29 Dcns System for guiding a drone during the approach phase to a platform, in particular a naval platform, with a view to landing same
US8457813B2 (en) 2008-12-15 2013-06-04 Saab Ab Measuring of a landing platform of a ship
EP2366130A1 (en) * 2008-12-15 2011-09-21 Saab AB Measuring of a landing platform of a ship
WO2010071502A1 (en) * 2008-12-15 2010-06-24 Saab Ab Measuring of a landing platform of a ship
WO2010071505A1 (en) * 2008-12-15 2010-06-24 Saab Ab Method and system for facilitating autonomous landing of aerial vehicles on a surface
US8554395B2 (en) 2008-12-15 2013-10-08 Saab Ab Method and system for facilitating autonomous landing of aerial vehicles on a surface
EP2366130A4 (en) * 2008-12-15 2014-11-12 Saab Ab Measuring of a landing platform of a ship
WO2010144014A1 (en) * 2009-06-12 2010-12-16 Saab Ab Centering above a predetermined area of a landing platform
US9158306B2 (en) 2009-06-12 2015-10-13 Saab Ab Centering above a predetermined area of a landing platform
EP2440982A4 (en) * 2009-06-12 2016-08-10 Saab Ab Centering above a predetermined area of a landing platform
US8855816B2 (en) 2011-06-10 2014-10-07 Seiko Epson Corporation Piezoelectric actuator, robot hand, and robot
US9947232B2 (en) 2015-12-08 2018-04-17 Honeywell International Inc. Methods and apparatus for identifying terrain suitable for aircraft landing
US10611495B2 (en) 2016-04-08 2020-04-07 Sikorsky Aircraft Corporation Sea state estimation

Similar Documents

Publication Publication Date Title
JP6843773B2 (en) Environmental scanning and unmanned aerial vehicle tracking
EP2366131B1 (en) Method and system for facilitating autonomous landing of aerial vehicles on a surface
US20200150217A1 (en) Laser Speckle System and Method for an Aircraft
CN109556577B (en) Positioning system for aerial non-destructive inspection
US9482524B2 (en) Measuring system for determining 3D coordinates of an object surface
JP3345113B2 (en) Target object recognition method and target identification method
US10969493B2 (en) Data processing device, data processing method, and data processing program
KR101553998B1 (en) System and method for controlling an unmanned air vehicle
US8457813B2 (en) Measuring of a landing platform of a ship
US9758239B2 (en) System and method for controlling an unmanned air vehicle
WO2008061307A1 (en) A method of determining characteristics of a remote surface with application to the landing of an aerial vehicle
JP2017075863A (en) Aerial type inspection device and inspection method
EP1975646A2 (en) Lader-based motion estimation for navigation
GB2243741A (en) Passive object ranging and sizing
JP7011908B2 (en) Optical information processing equipment, optical information processing method and optical information processing program
CN109573088A (en) A kind of Shipborne UAV photoelectricity guidance carrier landing system and warship method
CN110865353A (en) System and method for reducing DVE impact on LIDAR returns
Garratt et al. Visual tracking and lidar relative positioning for automated launch and recovery of an unmanned rotorcraft from ships at sea
US20220099442A1 (en) Surveying System
CN112578398B (en) Double-focal-plane detection and identification system and detection and identification method
KR20160118558A (en) Lidar system
JP2746487B2 (en) Aircraft position measurement method for vertical take-off and landing aircraft
EP4047396A1 (en) Structured light navigation aid
US20220260721A1 (en) Structured light navigation aid
JP2022028894A (en) Optical information processing apparatus, optical information processing method, and program for optical information processing

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07815595

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07815595

Country of ref document: EP

Kind code of ref document: A1