WO2020064969A1 - Aerial imaging device and system - Google Patents

Aerial imaging device and system

Info

Publication number
WO2020064969A1
WO2020064969A1 (PCT/EP2019/076114)
Authority
WO
WIPO (PCT)
Prior art keywords
imaging device
flight
flight path
imaging
component
Prior art date
Application number
PCT/EP2019/076114
Other languages
French (fr)
Inventor
Kelvin Hamilton
Conrad RIDER
Original Assignee
Flare Bright Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Flare Bright Ltd filed Critical Flare Bright Ltd
Publication of WO2020064969A1 publication Critical patent/WO2020064969A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C31/00 Aircraft intended to be sustained without power plant; Powered hang-glider-type aircraft; Microlight-type aircraft
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0094 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C29/00 Aircraft capable of landing or taking-off vertically, e.g. vertical take-off and landing [VTOL] aircraft
    • B64C29/02 Aircraft capable of landing or taking-off vertically, e.g. vertical take-off and landing [VTOL] aircraft having its flight directional axis vertical when grounded
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C31/00 Aircraft intended to be sustained without power plant; Powered hang-glider-type aircraft; Microlight-type aircraft
    • B64C31/02 Gliders, e.g. sailplanes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U70/00 Launching, take-off or landing arrangements
    • B64U70/30 Launching, take-off or landing arrangements for capturing UAVs in flight by ground or sea-based arresting gear, e.g. by a cable or a net
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U70/00 Launching, take-off or landing arrangements
    • B64U70/70 Launching or landing using catapults, tracks or rails
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/105 Simultaneous control of position or course in three dimensions specially adapted for aircraft specially adapted for unpowered flight, e.g. glider, parachuting, forced landing
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • B64U2101/30 UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B64U2101/31 UAVs specially adapted for particular uses or applications for imaging, photography or videography for surveillance
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00 UAVs characterised by their flight controls
    • B64U2201/10 UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U30/00 Means for producing lift; Empennages; Arrangements thereof
    • B64U30/10 Wings

Definitions

  • This invention relates to the field of imaging. More particularly, it relates to an imaging device for use in aerial imaging and/or surveillance, to a system incorporating the device and to a method of aerial imaging.
  • Aerial imaging finds application in many civil and military applications, including surveying (e.g. of buildings, cityscapes, crops and landscapes), search and rescue, surveillance and reconnaissance.
  • Aerial imaging and surveillance systems typically comprise sophisticated hardware, often drones with GPS or geolocation capability and image capture devices.
  • the operation of the drones requires skill from the user in order to obtain useful images, often a line of sight to the drone, and potentially also control of the cameras for imaging.
  • operation of a drone may require a licence.
  • a conventional drone requires flying with a joystick, with considerable skill and line of sight.
  • The 'black hornet' developed by Prox Dynamics AS and available from FLIR Systems Inc (https://www.flir.co.uk/globalassets/imported-assets/document/black-hornet-prs-spec-sheet.pdf) is a micro surveillance remote control helicopter, which is highly controllable, agile and transmits imagery back to the operator. It is characterised by its capability and its small size and light weight. Its features are described in numerous patent documents, including WO-A-2013/139509 and WO-A-2012/130790.
  • the portable munitions launcher enables the vehicle to rapidly reach flight speed after which it may be flown by ground-based remote control to an imaging target location.
  • the system comprises a fuselage and collapsible wings which deploy after launch. Still and video images from the target imaging area are transmitted via a data link system. It may then be flown back to the operator and recovered.
  • US9738383 describes a remotely or autonomously controlled unmanned aerial vehicle which may be flown to a target area, deploy a parachute to enable the vehicle to loiter (in the manner of a paraglider) and, in some embodiments, jettison the parachute and return to base.
  • US8263919 concerns a surveillance vehicle comprising a vessel and parasail, in which the vessel is launched via a mortar tube using on-board launch propulsion toward an area of imaging interest, at which point a parasail is deployed (relying on GPS location, period since launch or altitude detection) and a camera captures image data, which is transmitted back to the operator via communications satellite. It may optionally comprise an on-board propulsion device for extended flight.
  • the vehicle in US8263919 is contemplated as a single use vehicle and optionally has a self-destruct capability.
  • US-A-2004/0196367 describes an apparatus for surveillance over a zone having a projectile launcher, a projectile having a camera, image transmitter circuit (with corresponding ground-based receiver) and a deployable parachute to stabilise the projectile as it falls.
  • US9234728 similarly describes a reconnaissance capsule launched from an artillery launching platform and carrying a camera for capturing and transmitting live video images back to the user during parachute stabilised descent.
  • Both US-A-2004/0196367 and US9234728 are effectively single use items, although provision may be made to recover and re-use components and provide image capture from directly above an image target area during descent.
  • WO02/073129 describes an imaging device that can be launched from the ground and is configured, by way of a body member with angled fins, to rotate about its roll axis along a parabolic flight path to capture panoramic images.
  • Images are captured from a lens disposed in the side body of the device
  • the image data can optionally be tagged with orientation data from on-board sensors. Image data can be transmitted to a user on the ground. However, such devices do not return to base and are essentially single use unless components can be recovered.
  • the present inventor has devised a device and system to facilitate cost effective capture of images at a target location.
  • an imaging device for capturing aerial images of an imaging target, the imaging device being capable of and configured for flight, wherein the imaging device preferably has no on-board propulsion and is preferably autonomous-in-flight.
  • an imaging device for capturing aerial images of an imaging target, the imaging device being capable of and/or configured for a flight path defining a generally arcuate or flattened trajectory component and having no on-board propulsion.
  • an imaging device for capturing aerial images of an imaging target, the imaging device capable of a loop flight path defining a generally arcuate or flattened trajectory at apogee and having no on-board propulsion.
  • an imaging device that is autonomous in-flight, which device comprises a body capable of flight when subject to sufficient launch energy, an in-flight orientation means disposed in relation to the body, an imaging component, one or more positional, orientational, motion or environmental sensors to provide sensor data, a controller for controlling the in-flight orientation means and the imaging component in response to pre-defined criteria characterized by time periods and/or sensor data or data derived therefrom and a power source for providing power to the flight orientation means, one or more sensors, imaging component and controller.
  • an imaging device for capturing aerial images of an imaging target, the imaging device being capable of and configured for flight, wherein the imaging device is configured to navigate toward a homing target (e.g. definable by a pattern and preferably by an on-board machine vision module).
  • a system for capturing aerial images of an imaging target comprising an imaging device as defined above, a launcher or adaptor for a launcher for launching the imaging device and an external CPU or external processor external to the imaging device configured for data communication with a controller and/or a data storage means on the imaging device.
  • a launcher or adapter for a launcher for an imaging device as defined above.
  • a controller or controller software configured to operate an imaging device as defined above, the controller or controller software having multiple flight operational phases comprising an ascent phase, an image capture phase and a homing phase.
  • a catcher for a deployed image capture device as defined above, the catcher having a homing target recognizable by a computer vision module on board the image capture device on at least one surface of the catcher.
  • a method of capturing an image of an imaging target comprises providing an imaging device as defined above, providing a launcher for the imaging device, at a launch point orientating the imaging device for launch at a suitable pitch, preferably in a heading generally reciprocal to the heading of the imaging target from the launch point, causing the launcher to launch the device, providing a homing target identifiable by a machine vision module on board the device and positioning and orientating the homing target such that it can be identified by the machine vision module.
  • the system of the invention enables image capture by a robust, autonomous-in-flight imaging device with back-to-base capability, which is very efficient in on-board energy use and easy to use, without extensive training.
  • Figure 1 is a rear perspective view of an embodiment of a device of the invention
  • Figure 2 is a sectional plan view of an embodiment of a device of the invention.
  • Figure 3 is a rear perspective view of a further embodiment of a device of the invention.
  • Figures 4A and 4B are side and perspective transparent views of a main body or fuselage of a device of Figure 3;
  • Figure 5 is an illustration of a flight path of an embodiment of a device of the invention.
  • Figure 6 is an illustration of a flight path of a second embodiment of the device of the invention
  • Figure 7 is an illustration of a flight path of a third embodiment of the invention
  • FIGS 8a to 8f are illustrations of alternative flight paths of the device of the invention according to several embodiments.
  • Figure 9 is a perspective view illustration of one embodiment of a catcher of the invention.
  • the imaging device of the invention is for aerial imaging and, in particular for capturing aerial images of an imaging target. Preferred embodiments described herein relate, where the context allows, to any of the aspects of invention defined above.
  • the device has a body that is capable of flight (e.g. upon launching from a launcher, typically a ground-based launcher).
  • the imaging device is capable of and/or configured for a flight path, preferably defining a generally arcuate or flattened trajectory and having no on-board propulsion.
  • the imaging device is preferably capable of a loop flight path, preferably an inside loop flight path.
  • the imaging device may be configured to be launched from a ground launcher.
  • a flight path for the device may in part be determined by the pitch and heading at launch. Such pitch and heading at launch may be selected to achieve a desired flight path and orientation for imaging.
  • the imaging device is autonomous in-flight.
  • By autonomous-in-flight it is meant the imaging device is configured to adjust, re-orientate or adapt its own course or flight path for at least a portion of the flight (without control from an operator).
  • the imaging device is capable of adjusting, re-orientating or adapting its flight path during flight, preferably dynamically adapting its flight path.
  • Such reorientation or adaptation of its flight path may be, for example, to allow the device to follow a pre-defined flight path or in response to data, such as data generated from sensors (or image capture sensors), preferably on-board, concerning environmental factors or location, orientation, acceleration and velocity data or image data.
  • the imaging device has a homing capability.
  • the device is configured to be capable of recognizing a homing target, typically a pre-defined homing target.
  • the imaging device is configured to recognize a homing target via sensors, such as via a machine vision module, which preferably makes use of the same imaging components (preferably cameras) used for image capture of a target image.
  • the imaging device is configured to dynamically adapt its flight path toward a homing target (and, for example, to navigate to the homing target) in response to data from recognition sensors or a machine vision module, which data may typically relate to the relative location of the homing target (e.g. relative to the device).
  • By homing target it is meant any target that is identifiable and recognizable and that has a characterizing feature, signal or pattern.
  • the homing target may be active (e.g. a variable or transmitting signal or pattern) or passive (a static signal or pattern).
  • the homing target is a visual pattern that is recognizable by a machine vision module or system on board the device.
  • the device comprises a controller for controlling one or more functions or components on the device, such as the orientation means, one or more sensors, an imaging component and/or a machine vision module.
  • the controller comprises a processor or CPU and software or a computer program defining criteria and processes for the processor.
  • the device comprises one or more sensors, such as positional sensors, orientational sensors, motion sensors or environmental sensors.
  • the device comprises inertial sensors.
  • the sensors generate sensor data, which is preferably provided to a processor or flight computer. Such data may be used by the controller to make decisions as to flight path etc. For example, should data from inertial sensors indicate that the device in flight has deviated from a pre-defined flight path plan (pre-defined course), the controller may calculate an adjustment necessary to put the device back on course and then control the orientation means (e.g. control surfaces, such as elevons) to make that adjustment.
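
The following is a minimal sketch, in Python, of the kind of course-correction step described in the preceding bullet: the planned and sensed attitude are compared and a proportional correction is derived for the orientation means. The function name, gains and the simple proportional law are illustrative assumptions, not the patent's control law.

```python
# Hedged sketch of the course-correction decision: compare sensor data against the
# pre-defined flight plan and derive an attitude correction for the orientation means.
# Gains, field names and the proportional law are assumptions for illustration.
def correction_commands(planned_pitch_deg: float, planned_heading_deg: float,
                        sensed_pitch_deg: float, sensed_heading_deg: float,
                        k_pitch: float = 0.8, k_roll: float = 0.4) -> tuple[float, float]:
    """Return (pitch_cmd_deg, roll_cmd_deg) nudging the device back on course."""
    pitch_cmd = k_pitch * (planned_pitch_deg - sensed_pitch_deg)
    heading_err = (planned_heading_deg - sensed_heading_deg + 180.0) % 360.0 - 180.0
    roll_cmd = k_roll * heading_err      # bank toward the planned heading
    return pitch_cmd, roll_cmd

# Flying 5 degrees nose-low and 10 degrees left of the planned heading yields a
# gentle nose-up, bank-right correction: (4.0, 4.0).
print(correction_commands(20.0, 90.0, 15.0, 80.0))
```
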
  • an imaging device such as is defined above, comprises a controller configured for multiple flight operational phases comprising an ascent phase, an image capture phase and a homing phase.
  • Each flight operational phase may be characterized by a different state or condition of control of the controller.
  • on-board propulsion means a means by which the body may be propelled in extended flight under its own power.
  • By no on-board propulsive directional means it is meant there is no propulsive booster or jet of any kind, even one which may be insufficient to maintain the body in flight but could facilitate orientation.
  • the invention comprises in other aspects, as defined above, a system for capturing aerial images of an imaging target, a launcher or adapter for a launcher for an imaging device, a controller or controller software configured to operate an imaging device, a catcher for a deployed image capture device and a method of capturing an image of an imaging target.
  • the imaging device may be defined, in the alternative (to an imaging device), as a projectile (e.g. absent an image capture and storage means), which may optionally comprise sensor data capture means (including image data - for example for use in machine vision-based navigation) or may be configured to follow pre-defined in-flight adjustments for delivery to a target location.
  • the imaging device of the invention is for capturing aerial images of an imaging target, is capable of and/or configured for a flight preferably having a path defining a generally arcuate or flattened trajectory component and has no on-board propulsion.
  • the imaging device is configured to capture aerial images of an imaging target, e.g. a target location.
  • the imaging device is configured to capture images at a pre-defined heading and at one or more pitch angles from an altitude that may optionally be pre-defined or desired.
  • the imaging device is configured to capture images along a range of pitch angles at a pre-defined heading, which can be referred to as the image target heading.
  • the range of pitch angles may be selected so as to scan a particular heading.
  • the range of pitch angles may optionally be over a range of from 10° to 180°, for example, such as from 30° to 120° and may be 60° to 105°. In one preferred embodiment, the range of pitch angles may be over a 90° range, e.g. from generally horizontal through to a vertical downward pitch.
  • the device is configured to capture images from a generally horizontal orientation (or within 5° or 10° thereof) being at or close to a levelised component of flight path, preferably at or close to apogee in a looped flight path, so as to capture an image toward the horizon at the image target heading (or within 5° or 10° thereof) and preferably to capture images through a range of pitch angles at the image target heading (or within 5° or 10° thereof) through to a vertical downward pitch (or within 5° or 10° thereof).
  • the image target heading is typically a reciprocal heading to the launch heading.
  • the imaging device is capable of and/or configured for a flight path defining a generally arcuate component.
  • the imaging device is capable of and/or configured for a flight path defining a flattened trajectory component, optionally both a generally flattened trajectory and a generally arcuate component.
  • By flattened trajectory component it is meant a component of the flight path following a flat course or a straight course within 30° of horizontal or a curved course having a radius of curvature of 100 m or greater.
  • By generally arcuate component it is meant a component of the flight path which is curved.
  • the generally arcuate component may have a variable or relatively constant rate of rotation or relatively constant radius.
  • a generally arcuate component having a variable rate of rotation or variable radius preferably has an increasing or decreasing radius, so as to define a spiraled curve.
  • the generally arcuate component has a relatively constant rate of rotation or relatively constant radius.
  • By relatively constant radius it is meant within 20% of the mean radius for that component of the flight path, preferably within 15% of the mean, more preferably within 10% of the mean and still more preferably within 5% of the mean.
  • the radius does not vary by more than 5% in the component of the flight path having a relatively constant radius and preferably by no more than 2%.
  • An arcuate component may optionally comprise an arc of a circle which has a constant radius.
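
As an illustration of the "relatively constant radius" criterion defined above, the following sketch checks whether sampled turn radii stay within a chosen fraction of their mean; the function and tolerance handling are assumptions for illustration only.

```python
# Sketch of the "relatively constant radius" test: sampled turn radii along a
# flight-path component must stay within a given fraction of their mean.
def is_relatively_constant_radius(radii_m: list[float], tolerance: float = 0.20) -> bool:
    """True if every sampled radius lies within `tolerance` of the mean radius."""
    if not radii_m:
        return False
    mean_r = sum(radii_m) / len(radii_m)
    return all(abs(r - mean_r) <= tolerance * mean_r for r in radii_m)

# Radii sampled between 7.0 and 8.2 m vary by under 10% of the mean, so this
# component qualifies under the 20%, 15% and 10% tests.
print(is_relatively_constant_radius([7.0, 7.5, 8.2, 7.8], tolerance=0.10))  # True
```
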
  • the device is capable of and/or configured to have a flight path having a generally levelised component, which generally levelised component is a component of the flight path that is horizontal or within 30° of horizontal, preferably within 15°, more preferably within 10° and more preferably within 5° of horizontal.
  • the generally levelised component of the flight path according to the preferred embodiment of the present invention is at a portion of the flight path of the imaging device from which the device descends (e.g. back to the user or to a homing target after image capture).
  • the levelised component may be part of a curved or arcuate flight path and may be at apogee.
  • the levelised component is part of a flattened trajectory or generally arcuate trajectory component or immediately precedes it.
  • the levelised component of the flight path is after apogee. In a preferred embodiment, the levelised component of the flight path is at apogee.
  • the flight path may follow a path after launch (e.g. pitched or vertical launch) in which an ascent phase continues until the rate of ascent reaches zero, at which apogee a horizontal vector of velocity may be zero or up to say 2 m/s, at which point the device begins to descend from its maximum height (at apogee) and pitch into a levelised component prior to curving about a generally arcuate flight path through descent.
  • the levelised component is after apogee and the horizontal vector of velocity at the levelised component depends upon the energy gained during descent from apogee to the altitude of the levelised component.
  • the flight path may follow a path after launch (e.g. pitched) in which an ascent phase continues until the rate of ascent reaches a pre-determined threshold greater than zero, at which point the device pitches toward a levelised component of flight path which is achieved at apogee, at which apogee a horizontal vector of velocity may be at least 1 m/s, preferably at least 2 m/s, preferably at least 2.5 m/s and more preferably at least 4 m/s, at which point the device begins to descend from its maximum height (at apogee) through a generally arcuate flight path.
  • the levelised component is at apogee and the horizontal vector of velocity at the levelised component depends upon the point before apogee at which the device pitches to a levelised flight path.
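
A rough, drag-free energy balance clarifies why the horizontal speed at the levelised component depends on how far below apogee it sits; the following sketch and its example figures are illustrative assumptions, not values from the specification.

```python
import math

# Rough energy-balance estimate (drag ignored) of the speed available at a levelised
# component sitting `drop_m` below apogee, as discussed above. Figures are illustrative.
def speed_at_levelised_component(v_apogee_ms: float, drop_m: float, g: float = 9.81) -> float:
    """Airspeed after trading `drop_m` of altitude for kinetic energy."""
    return math.sqrt(v_apogee_ms ** 2 + 2.0 * g * drop_m)

# A device crossing apogee at 1 m/s and descending 2 m before levelling out reaches
# roughly 6.3 m/s, inside the 1-20 m/s band preferred in the text.
print(round(speed_at_levelised_component(1.0, 2.0), 1))  # ~6.3
```
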
  • the levelised component of the flight path is at apogee, by which it is meant at the highest point in the flight path.
  • the levelised component may be within 20% of the highest point.
  • the flattened trajectory component or generally arcuate component encompasses or is immediately preceded by the levelised component of the flight path and preferably also apogee.
  • the flight path in descent which typically includes an image capture phase of the flight and a homing phase of the flight, may comprise a series of straight course angled flight path components but preferably comprises a curved flight path and more preferably comprises a generally arcuate component.
  • the generally arcuate flight path component is preferably within 30° of the levelised component (and preferably apogee), more preferably within 20°, still more preferably within 15°, e.g. within 10°, still more preferably within 5°.
  • the generally arcuate component encompasses or immediately follows the levelised component of the flight path (and preferably apogee).
  • the imaging device is preferably configured to fly in a loop flight path, preferably an inverse loop and preferably defining a generally arcuate or flattened trajectory at apogee (which is the point of a levelised component of flight).
  • Apogee is the highest point in the flight of the imaging device and the highest point in the loop flight path.
  • the device is configured to follow a generally arcuate path extending about at least 30°, preferably at least 45°, more preferably at least 60°, still more preferably at least 90° and preferably up to 200°.
  • the generally arcuate component comprises a rotation in the range of from 30° to 60° or from 60° to 90° or slightly further (e.g. 105°).
  • the device is configured to dynamically adapt and/or determine the flight path in flight.
  • the flight path of the device is defined by a pre-defined rate of rotation in a generally arcuate component or a pre-defined radius or range of radii of the generally arcuate component and a pre-defined horizontal vector of velocity at a generally levelised component, which preferably immediately precedes or is encompassed within the generally arcuate component and which is preferably at apogee.
  • the device is configured to define a desired flight path portion in terms of radius of arc and horizontal vector of velocity at the generally levelised component (preferably at apogee), which flight path is dynamically adapted in altitude during flight in dependence upon the rate of ascent and/or in order to achieve the pre-defined radius of arc and horizontal vector of velocity.
  • the pre-defined horizontal vector of velocity is selected to be up to 20 m/s, more preferably up to 15 m/s. It is believed that selecting a greater horizontal vector of velocity will result in insufficient time about the course of an arc to obtain sufficiently clear images and may also limit the altitude from which the images may be captured.
  • the pre-defined horizontal vector of velocity is selected to be at least 1 m/s, preferably at least 2 m/s, more preferably at least 2.5 m/s, optionally at least 4 m/s, such as at least 5 m/s and optionally at least 7.5 or 10 m/s. It is believed that, by selecting a lesser horizontal vector of velocity, the imaging device will not be sufficiently stable and may not be capable of completing a desired generally arcuate component of the flight path at a desired radius.
  • the desired radius of curvature that is pre-determined for an image capture flight with the device in a preferred embodiment may be any suitable radius depending upon the altitude of the flight, the launch energy, the size of the device, the number or nature of images to be captured among other factors.
  • the pre-defined radius of curvature for the generally arcuate component of the flight path is at least 1 m, preferably at least 2 m, more preferably at least 4 m, still more preferably at least 5 m, e.g. at least 7 m, preferably up to 50 m, more preferably up to 25 m, still more preferably up to 20 m and more preferably up to 15 m.
  • a range of 2 to 10 m is desirable in one embodiment. In another embodiment, a range of 5 to 15 m is preferable.
  • the imaging device is configured to follow a fixed flight path for at least a portion of the flight path, which flight path is fixed in altitude, heading and course, the fixing of the fixed flight path being completed at or before the device reaches a levelised component or apogee.
  • the fixed flight path portion is between apogee and vertical descent and preferably comprises a generally arcuate component and extends for about at least 30°, preferably at least 60°, and optionally from 75 to 90°.
  • the fixed flight path is fixed before the device reaches apogee and may comprise in excess of 90°, e.g. up to 105° or up to 120°.
  • the fixed flight path according to this embodiment is typically fixed according to a predefined radius of curvature or rate of rotation and the desired and achieved horizontal vector of velocity at a levelised component of flight path (e.g. at apogee).
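
The quantities that fix such a flight path are linked by simple circular-motion relations: at airspeed v on an arc of radius r the rate of rotation is v/r. The sketch below uses that relation to estimate the time available over an imaging arc; the numbers are illustrative.

```python
import math

# Relate airspeed, arc radius and rate of rotation to the time spent sweeping a
# given arc angle at constant speed. Values below are illustrative assumptions.
def arc_dwell_time(speed_ms: float, radius_m: float, arc_deg: float) -> float:
    """Seconds spent traversing `arc_deg` of a circular arc at constant speed."""
    rotation_rate = speed_ms / radius_m              # rad/s
    return math.radians(arc_deg) / rotation_rate

# 10 m/s around a 10 m radius gives ~1.6 s over a 90 degree imaging arc; halving the
# speed doubles the time available for image capture.
print(round(arc_dwell_time(10.0, 10.0, 90.0), 2))  # ~1.57
```
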
  • the device is configured to pitch into a generally arcuate flight path or flattened trajectory before the rate of ascent of the device falls below a pre-defined threshold.
  • the pre-defined threshold of rate of ascent is within a pre-defined range.
  • the imaging device comprises a controller for controlling the trajectory toward the dynamically adapted flight path.
  • the controller preferably comprises a CPU comprising a computer program comprising an algorithm for dynamically adapting a desired flight path for the device during flight.
  • the imaging device is configured to recognize a pre-defined homing target and re-orientate itself in-flight to facilitate a flight path toward the homing target.
  • the imaging device comprises a body capable of flight, e.g. when subject to sufficient launch energy.
  • the imaging device comprises an in-flight orientation means, e.g. disposed in relation to the body.
  • the imaging device comprises an imaging component or image capture means.
  • the imaging device comprises one or more sensors, such as positional, orientational, motion or environmental sensors to provide sensor data.
  • the imaging device comprises a controller for controlling one or more controllable components on the device, such as one or more of the in flight orientation means and the imaging component, typically in response to pre defined criteria characterized by time periods and/or sensor data or data derived therefrom.
  • the imaging device comprises a power source for providing power to one or more other components on the device, such as the flight orientation means, one or more sensors, an imaging component and a controller.
  • An imaging device in a further aspect of the invention comprises a body capable of flight when subject to sufficient launch energy, an in-flight orientation means disposed in relation to the body, an imaging component, one or more positional, orientational, motion or environmental sensors to provide sensor data, a controller for controlling the in-flight orientation means and the imaging component in response to pre-defined criteria characterized by time periods and/or sensor data or data derived therefrom and a power source for providing power to the flight orientation means, one or more sensors, imaging component and controller.
  • the body preferably comprises a main body and a fixed wing arrangement.
  • the main body and the fixed wings are preferably structurally distinguishable.
  • the body preferably comprises wings disposed equally either side of the main body.
  • the body preferably comprises a longitudinal axis along its main body.
  • Preferably the body is symmetrical about the first vertical plane.
  • a second horizontal plane may be defined relative to the body and comprising the longitudinal axis and generally extending in the direction of the wings.
  • the body may be symmetrical about this second horizontal plane, but is preferably asymmetrical.
  • the body has a clearly distinguished top surface and bottom surface which are defined to give the body lift.
  • the main body may be defined by a fore portion and an aft portion being those portions nearer the fore or aft of the main body along its longitudinal axis.
  • the wings are centred relative to the main body more toward the aft of the main body than the fore, preferably centred within the aft-most third of the main body.
  • the profile of the wings has an aerodynamic shape, preferably to provide a desired amount of lift and to enable fine control, such as a profile corresponding to or adapted from a NACA airfoil.
  • the main body preferably has an aerodynamic profile.
  • the lateral dimension (being generally in or parallel to the plane of the wings, i.e. parallel to the second horizontal plane) is of generally consistent lateral extent (width), typically varying in width by no more than 25% of the maximum width of the main body, more preferably no more than 15% and still more preferably no more than 10% over at least 75%, more preferably at least 80% and still more preferably at least 90% of its length (e.g. the entire body other than the extreme fore, i.e. nose, and aft, i.e. tail, portions).
  • the nose, at the fore of the main body, may have a curved or squared plan shape, but is preferably curved.
  • the aft preferably comprises an extended tail portion which extends longitudinally out from the aft of the main body.
  • This may be curved and/or tapered and optionally be a portion of the maximum width of the main body, e.g. from 30 to 70% of the width of the main body at the mid-point of the tail.
  • the main body has a vertical axis, being the axis perpendicular to the longitudinal axis and the lateral axis.
  • the vertical extent (or depth) of the main body may vary along the length of the main body.
  • the main body has a greater depth, on average, toward the fore than toward the aft.
  • the portion of the main body having the greatest depth is in the fore half.
  • the depth decreases gradually from the position of maximum depth toward the aft.
  • the in-flight orientation means are disposed in relation to the fixed wing arrangement.
  • the in-flight orientation means preferably comprise control surfaces, which are preferably elevons.
  • the control surfaces (e.g. elevons) are operated under the control of the controller, e.g. in response to sensor data.
  • Servo motors may use driving rods to operate the control surfaces or elevons, based upon instructions received from the controller, but preferably the control surfaces or elevons are attached directly to the servo motors to reduce mechanical complexity and improve robustness of the device.
  • the imaging device is autonomous in flight. Furthermore, it is preferred that the device may not be controlled through the course of its flight remotely. Indeed, it is preferred that the device is not capable of being controlled externally or remotely through its entire flight after launch.
  • while the device may be configured with radio communications capability, it preferably does not rely on radio frequency communications for control or location (e.g. GPS). More preferably, in order to minimise cost and weight, the device is absent a radio communications capability. By not relying on radio communications for location or navigation, use of the device is not hampered by RF jamming.
  • the imaging device is not configured for two-way communication and is thus preferably not capable of two-way communication and absent two-way communication means, thereby reducing the risk of data being intercepted and furthermore reducing the weight and cost of the device.
  • the imaging component, one or more sensors, controller, in-flight orientation means and power source may each be located at any suitable position on or in the body.
  • at least the power source and the CPU are located in a main body and preferably the power source is located toward the fore of the main body.
  • the in-flight orientation means is located on the wing arrangement and preferably toward a tail edge of the wing arrangement, e.g. in the form of control surfaces such as elevons.
  • the imaging component or components may be located on the main body (e.g. in the nose) or on the wings (e.g. in a front edge or disposed above or beneath the wings).
  • the sensors may be disposed at any suitable location, but typically in the main body.
  • a connector is provided in the tail portion of the main body, for connecting the CPU with an external CPU prior to and after launch.
  • the connector is preferably for use in transferring data and power.
  • the connector is capable of auto-detachment, whereby at launch, a cable or other corresponding external connection means automatically detaches (e.g. by way of a pull-fit, snap-fit or magnetic mounting) without interrupting or destabilizing the launch process.
  • data and/or power are transferred by inductive coupling.
  • At least one, and preferably at least two (more preferably two), imaging components are provided, which are preferably image sensors such as CCD or CMOS sensors.
  • the imaging components are preferably for capturing image data in the visible spectrum but may be used to capture infra-red image data and/or UV image data.
  • the image data captured by the imaging components is preferably stored in a data storage component, typically part of an on-board computer.
  • image data is tagged with sensor data and time data so that the location and/or direction of the image the data represents can be identified during a post-processing step.
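
A minimal sketch of such a tagged-image record is shown below; the field names and types are assumptions chosen for illustration rather than the patent's data format.

```python
# Pair each captured frame with contemporaneous time and sensor data so its location
# and direction can be recovered in post-processing. Field names are assumptions.
from dataclasses import dataclass

@dataclass
class TaggedImage:
    timestamp_ms: int          # on-board clock time of capture
    pitch_deg: float           # device pitch at capture
    heading_deg: float         # device heading at capture
    altitude_m: float          # altimeter reading at capture
    jpeg_bytes: bytes          # raw encoded frame from the CCD/CMOS sensor

def tag_frame(clock_ms: int, sensors: dict, frame: bytes) -> TaggedImage:
    """Bundle a frame with the sensor snapshot taken at the same instant."""
    return TaggedImage(clock_ms, sensors["pitch_deg"], sensors["heading_deg"],
                       sensors["altitude_m"], frame)
```
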
  • two image capture devices are disposed in the nose of the main body of the device.
  • Any suitable sensors may be used.
  • an inertial sensor is provided, such as a MEMS inertial sensor e.g. in the main body.
  • an altimeter may be provided.
  • a clock is provided in association with the CPU.
  • the power source may be any suitable power source and is configured to supply power to the controller (and CPU), the image capture components, a servo motor for controlling the elevons, and the one or more sensors.
  • the power source may be a battery, typically a rechargeable battery, a fuel cell (e.g. using methane, natural gas or hydrogen) or a supercapacitor.
  • the power source is a supercapacitor, which has the benefit of rapid charging, low weight and greater robustness.
  • the controller is configured to activate the imaging component only during an image capture phase and a homing phase of a flight, preferably by way of a state machine arrangement.
  • the invention comprises an imaging device for capturing aerial images of an imaging target, the imaging device being capable of and configured for flight, wherein the imaging device is configured to navigate toward a homing target.
  • the invention comprises an imaging device for capturing aerial images of an imaging target, the imaging device being capable of and configured for flight, wherein the imaging device comprises an on-board machine vision module.
  • the imaging device comprises a machine vision module.
  • the machine vision module comprises the imaging component and the controller (typically the same controller) which is preferably provided with machine vision software and preferably configured with software to recognize a pre-defined image signal or pattern (as a homing target) in an image or series of images captured by the imaging component.
  • the controller is configured to identify the pre-defined image signal or pattern (the homing target) and track the homing target in an image frame and use the positioning data of the homing target within the image frame as a navigational aid in order to navigate the imaging device toward the homing target.
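
One way the tracked position of the homing target within the image frame could be turned into steering cues is sketched below; the detector itself is abstracted away, and the mapping, gains and frame size are assumptions.

```python
# Map the target centroid (cx, cy) in pixels to yaw and pitch commands. A target left
# of frame centre yields a negative yaw command (steer left); a target above centre
# yields a nose-up pitch command. Gains are assumed values.
def steering_from_target(cx: float, cy: float, frame_w: int, frame_h: int,
                         k_yaw: float = 30.0, k_pitch: float = 20.0) -> tuple[float, float]:
    """Return (yaw_cmd_deg, pitch_cmd_deg) from the target position in the frame."""
    x_off = (cx - frame_w / 2) / (frame_w / 2)    # -1 .. +1 across the frame
    y_off = (frame_h / 2 - cy) / (frame_h / 2)    # +1 at top of frame
    return k_yaw * x_off, k_pitch * y_off

# A target detected at (80, 60) in a 160x120 frame sits at the centre: no correction.
print(steering_from_target(80, 60, 160, 120))  # (0.0, 0.0)
```
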
  • the image signal or pattern may be a static or an active signal.
  • An active signal or pattern may comprise, for example, one or more lights that are identifiable by a machine vision image capture component and which are configured to emit light for pre-defined periods or frequencies. Such lights may be provided by an LED or LED arrangement.
  • a passive signal or pattern may comprise a distinctive shape or arrangement of shapes, etc, on a surface, which is readily recognizable by a machine vision module.
  • the imaging device has a wingspan of up to 50 cm, preferably up to 30 cm, e.g. from 20 to 30 cm. More preferably, the imaging device has a wingspan of up to 20 cm, more preferably up to 18 cm, e.g. from 5 to 15 cm, preferably 10 to 12 cm.
  • the imaging device has a total weight of up to 500 g, preferably up to 350 g. In one embodiment, the total weight of the imaging device is from 200 to 350 g. In an alternative preferred embodiment, the imaging device has a total weight of up to 200 g, preferably up to 150 g, more preferably up to 125 g and preferably in the range 50 to 120 g, e.g. 60 to 110 g, more preferably from 70 to 110 g.
  • the imaging device has a wingspan of 5 to 15 cm and a weight of from 50 to 125 g.
  • a device of this embodiment can be used to capture aerial images, which are stable and resolvable, from 20 to 200 m, preferably 30 to 150 m, still more preferably up to 125 m, still more preferably 50 to 100 m.
  • a system of an aspect of the invention is for capturing aerial images of an imaging target and comprises an imaging device as defined above, a launcher or adaptor for a launcher for launching the imaging device and an external CPU or external processor external to the imaging device configured for data communication with a controller and/or a data storage means on the imaging device.
  • the external CPU or processor is a CPU in a computer, tablet or smartphone connected to the launcher or connected to the imaging device via the launcher or built into the launcher for connection to the imaging device prior to launch.
  • the external CPU may be provided with software to calculate, for a given desired imaging target and a given launcher, a desired horizontal vector of velocity and radius of curvature of a generally arcuate flight path to achieve a desired altitude.
  • the external CPU may communicate such flight data to the on board controller for use in autonomous dynamic flight control.
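
A hedged sketch of the kind of pre-launch calculation attributed to the external CPU follows: a ballistic (drag-free) apogee estimate from launcher speed and pitch, and a choice of target speed and arc radius inside the ranges preferred above. The launcher figures, gains and clamping bounds are assumptions.

```python
import math

# Illustrative pre-launch planning: estimate the achievable apogee (drag ignored),
# then choose a target horizontal speed and arc radius within assumed bands.
def plan_flight(launch_speed_ms: float, launch_pitch_deg: float,
                g: float = 9.81) -> dict:
    v_up = launch_speed_ms * math.sin(math.radians(launch_pitch_deg))
    apogee_m = v_up ** 2 / (2.0 * g)                            # ballistic estimate
    target_speed = min(max(0.1 * launch_speed_ms, 2.5), 15.0)   # 2.5-15 m/s band
    radius = min(max(apogee_m / 5.0, 2.0), 15.0)                # 2-15 m band
    return {"apogee_m": round(apogee_m, 1),
            "horizontal_speed_ms": round(target_speed, 1),
            "arc_radius_m": round(radius, 1)}

# A 40 m/s launch at 80 degrees pitch gives an apogee estimate of about 79 m.
print(plan_flight(40.0, 80.0))
```
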
  • the external CPU may be connected with the device to extract the image data.
  • the external CPU is configured with image processing data to enable the images to be viewed on a screen associated therewith and the features captured in the image data to be located in dependence on contemporaneous sensor data.
  • a catcher may be used for catching a deployed image capture device as defined above, which catcher may have a homing target (comprising an active or passive pattern or visible signal) recognizable by a computer vision module on board the image capture device on at least one surface of the catcher and preferably a catching means associated therewith.
  • a catching means may comprise an energy absorbing deformable member such as an inflated member (e.g. forming a deformable airbag or the like), a resiliently deformable foam member, or a net or canvas member disposed about a rim (of any suitable shape and configuration) or may simply comprise a glove member with an extended member (e.g. up to 1 m in diameter, or up to 50 cm in diameter).
  • the imaging device preferably comprises a controller configured for multiple operational phases, which preferably comprise an ascent phase, an image capture phase and a homing phase.
  • software is provided to the controller (comprising a processor) to enable the controller to operate the imaging device according to the multiple flight operational phases.
  • the controller may preferably be configured as a state machine in which phases of multiple operational phases may be associated with a state and change from one state to the next may be governed by an input trigger, such as clock, compass or other sensor data.
  • the state machine may comprise an ascent state, an image capture state and a homing state.
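
A minimal sketch of such a state machine arrangement is given below; the trigger thresholds and field names are assumptions for illustration, not the patent's values.

```python
# Each flight operational phase is a state; transitions are driven by sensor-derived
# triggers. Threshold values here are assumed for illustration only.
from enum import Enum, auto

class Phase(Enum):
    ASCENT = auto()
    IMAGE_CAPTURE = auto()
    HOMING = auto()

def next_phase(phase: Phase, ascent_rate_ms: float, pitch_deg: float) -> Phase:
    """Advance the flight phase when an assumed trigger condition is met."""
    if phase is Phase.ASCENT and ascent_rate_ms <= 2.0:
        return Phase.IMAGE_CAPTURE           # pitch into the levelised component
    if phase is Phase.IMAGE_CAPTURE and pitch_deg <= -80.0:
        return Phase.HOMING                  # near-vertical descent: start homing
    return phase
```
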
  • the controller is configured with a dynamic approach state, which may optionally be within the ascent phase.
  • the controller is configured to operate the orientation means in response to real time sensor data in order to navigate the imaging device toward a levelised component of the flight at a pre-defined horizontal vector of velocity, the altitude of which may be adapted according to software associated with the controller in order to achieve that depending on the sensor data.
  • An image capture phase or state may be entered when a pre-defined altitude, orientation or speed is reached, such as when a levelised component of the flight is achieved, at which point the device is configured to activate the image capture devices and associated software.
  • the controller during the image capture phase is preferably configured to follow a fixed generally arcuate flight path to enable image capture of an image target and/or along a range of pitch from near horizontal or horizontal to near vertical or vertical.
  • the controller may enter the homing phase or state.
  • the device is configured to run machine vision software to process image data from the imaging components to identify a pre-defined homing target and then to track the pre-defined homing target within the image data, which data may then be used by the controller to navigate the device toward the homing target.
  • the dynamic approach may form part of the ascent phase (e.g. entered once the sensors detect a pre-defined threshold of ascent rate) or may occur after the device reaches apogee (e.g. sensors detect zero ascent), during which dynamic approach phase the controller is configured to adapt the flight path dynamically and pitch toward a levelised component of the flight path in dependence upon a target horizontal vector of velocity at the levelised component of the flight path.
  • the ascent phase preferably also comprises a sub-phase or sub-state being the initial ascent or launch phase during which location data for the device is generated not by sensor data (due to the unreliability of sensors at rapid acceleration) but by calculation from initial launch angle, predicted launch speed and a time period (e.g. 200 ms).
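
The launch-phase dead reckoning described above can be sketched as simple projectile kinematics seeded from the launch angle, predicted launch speed and elapsed time; drag is neglected and the numbers are illustrative.

```python
import math

# While inertial sensors are saturated by launch acceleration, position is projected
# from the launch angle, predicted launch speed and elapsed time (drag neglected).
def position_after_launch(launch_speed_ms: float, launch_pitch_deg: float,
                          elapsed_s: float, g: float = 9.81) -> tuple[float, float]:
    """Return (horizontal_m, vertical_m) displacement from the launch point."""
    theta = math.radians(launch_pitch_deg)
    x = launch_speed_ms * math.cos(theta) * elapsed_s
    z = launch_speed_ms * math.sin(theta) * elapsed_s - 0.5 * g * elapsed_s ** 2
    return x, z

# 200 ms after a 40 m/s launch at 80 degrees the device is roughly 1.4 m downrange
# and 7.7 m up; sensor-based navigation then takes over from this seeded estimate.
print(tuple(round(v, 1) for v in position_after_launch(40.0, 80.0, 0.2)))
```
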
  • pre-launch includes a 'ready' state in which the launcher and device are prepared (navigational data having been uploaded and the power source charged) and the device is orientated at an angle which will enable the desired flight path to be achieved.
  • the imaging device 1 comprises a main body 3 having a fixed wing arrangement comprising a wing 9 disposed either side of the main body 3.
  • the wings have a profile adapted from a NACA profile.
  • the wings 9 are disposed toward the rear of the main body 3.
  • the main body 3 has a broad profile and a square nose 5, although a more rounded nose may optionally be provided to improve robustness for recovery.
  • the rearward projecting tail 7 provides stability and a suitable location for a power and data connection for the device.
  • An elevon 11 is disposed along rearmost edge 12 of each wing to provide orientation control.
  • the elevons 11 are driven by servo motors 15 under the control of flight computer 13.
  • One or more cameras 17 are provided, disposed in the nose 5 and/or on one or both wings, e.g. at a foremost edge 19 and are controlled by flight computer 13.
  • Image data captured by the cameras 17 is stored in data storage provided in or in association with flight computer 13.
  • Power to the flight computer 13 and other components such as servo motors 15 (and thus elevons 11) and cameras 17 is provided by supercapacitor 21.
  • In Figure 3, an alternative embodiment of the imaging device 1 is illustrated which has a smaller wingspan.
  • Rearward projecting tail 7 has disposed therein a power and data connection 39 for the device 1.
  • a camera 17 is disposed in a recess 41 in the nose 5.
  • the main body 3 of device 1 is illustrated in Figures 4A and 4B in which the outer casing 43 is illustrated transparently to show the contents.
  • Disposed within the main body 3, in the fore portion of the device 1 toward the nose 5, are the supercapacitors 21 for providing power to other components.
  • a pair of recesses 41 in the nose 5 allow image access for cameras 17 disposed within the recesses 41.
  • Also disposed within the main body 3 is the controller 13 or flight computer.
  • Servo motors 15 are arranged in association with the wings (not shown) and power and data connection 39 is disposed in the tail 7.
  • Figure 5 illustrates the phases of a flight path 23 of a device 1 of the invention when used in this system according to one embodiment.
  • the initial ascent phase 25 during which the device 1 is rapidly accelerating.
  • the components in the device are maintained in as steady a state as possible.
  • The dynamic ascent phase 27 follows, during which the flight path is navigated by operating elevons 11 in response to sensor data representative of location, motion and orientation.
  • the object of the navigation during the dynamic ascent phase 27 is to enable maximum altitude to be achieved whilst navigating a flattened (and extended) trajectory at apogee 33 at sufficient airspeed to facilitate stability for image capture.
  • During the image capture phase 29, which takes place at apogee, the flight path follows a relatively steady and relatively flat trajectory during which images can be captured. Flight path adjustments by movement of the elevons 11 are kept to a minimum in order to maintain stability.
  • the device 1 may be manoeuvred to ensure that image capture of the desired image target is achieved. As the device falls out of apogee and begins its descent, it enters the homing phase 31.
  • the device 1 alters its course to be directed to what is calculated to be the vicinity of its launch position.
  • the user should now have provided a visible and identifiable 'homing target' 35, being a distinctive and machine vision identifiable pattern.
  • the machine vision module comprising the cameras 17 and a machine vision algorithm on the flight computer 13 identify and locate the position of the homing target 35 relative to the device 1. Feedback from the machine vision module as to the relative position of the homing target 35 feeds into the navigation function of the flight computer which adjusts the trajectory accordingly.
  • Two further phases are not part of the flight path but of the operation of the device: the launch phase, comprising the pre-launch and the launch event, and the catch phase, subject to a successful homing phase 31.
  • Figure 6 illustrates a flight path 23 according to an alternative embodiment, wherein the flight path 23 follows a loop that incorporates a generally arcuate component at apogee 33.
  • Figure 7 illustrates an alternative flight path 23 wherein the flight path follows a vertical or near vertical path to apogee 33 and then pitches into a levelised component 45 on descent before entering generally arcuate flight path component 47 through the imaging phase 29 before continuing into the homing phase 31.
  • Figure 8a illustrates a variable radius inside loop flight path 23 in which, after launch, the device 1 follows a fixed angled ascent as part of the dynamic ascent phase 27, maintaining course until a predetermined criterion is reached (e.g. ascent rate or height or likely speed at apogee 33 based upon intended path), whereupon the device 1 pitches into apogee 33 at a controlled increasing pitch rate (i.e. spiralling in) and after apogee 33 spirals out at decreasing pitch rate for the course of the imaging phase 29 until the device 1 is near vertical, at which point it enters the homing phase 31 toward homing target 35.
  • Figure 8b illustrates a fixed radius inside loop flight path 23.
  • the flight path follows a dynamic ascent phase 27 which at a pre-defined condition (e.g. ascent rate) pitches into an arcuate, fixed radius path to apogee 33 and through the imaging phase 29 until near vertical at which point it enters the homing phase 31.
  • Figure 8c illustrates an outside loop flight path 23 in which the device 1 follows an outside loop.
  • Figure 8d illustrates an alternative flight path 23 having a dynamic ascent phase 27 during which the device 1 follows a pre-determined course until a condition is met (e.g. ascent rate threshold), at which point it pitches, e.g. in an arcuate path, toward apogee 33 and beyond, until directed toward an imaging target, whereupon it follows a steady course (e.g. at an angle to the horizontal) toward the imaging target for a pre-defined period, the imaging phase 29, and then pitches toward the homing target 35, entering the homing phase 31.
  • Figure 8e illustrates an alternative flight path 23 whereupon the device 1 follows a pre-determined course until a condition is met (e.g. ascent rate threshold) at which point it pitches, e.g. in an arcuate path, toward apogee 33.
  • At apogee 33 it follows an arcuate path having a tight radius of curvature 49, but retains a horizontal vector of velocity through apogee 33 of, for example, at least 1 m/s, preferably at least 2 m/s (i.e. the device 1 does not stall at apogee 33) in order to maintain control and stability.
  • the device pitches into a flatter course 53 which follows a steady course directed toward an imaging target, the imaging phase 29, following which the device 1 pitches into the homing phase.
  • Figure 8f illustrates an alternative flight path 23 comprising banked turns.
  • the ascent phase 27 follows a steady course at a heading that is typically not reciprocal to the imaging target, although the device 1 rolls, and then at a pre-determined altitude (or ascent rate) levels off and banks into a banked turn about an arcuate course (as illustrated in the plan view), during which the imaging phase 29 takes place, before rolling into the homing phase 31.
  • the catch phase is illustrated in Figure 9, which shows an embodiment in which the catcher 37 is composed of an airbag, which may be inflatable on demand, having a homing target 35 disposed on at least one surface that can be made visible to the machine vision module of the device 1.
  • the imaging device 1 will be disposed in a cradle (not shown) of a launcher (not shown).
  • a power and data connector automatically links with a corresponding power and data connector in the tail 7 of the device 1.
  • the power and data connector in the launcher links to an associated interface (e.g. of a tablet or other computer).
  • the supercapacitor(s) 21 on board the device 1 are charged through the power and data connector (in about 20 seconds). During this pre-launch portion of the launch phases, the device may be aimed but not launched until the supercapacitors (21) are suitably charged and the device 1 is aimed at a suitable pitch angle within defined limits.
  • the sensors (not shown) in the device 1 detect the direction (ideally 180° from the direction of the target image) and pitch of the device 1.
  • Sensors used on board include inertial sensors which include gyroscopes, magnetometers and accelerometers. These allow the position, orientation and speed of the device to be determined.
  • the flight computer 13 relies on the calculated position from the initial ascent phase adapted with sensor data to provide location, position and velocity information at any one time.
  • the flight computer 13 calculates the adjustment to the elevons that is necessary to manoeuvre the device 1 the required amount to bring it back toward the intended flight path.
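
A hedged sketch of the elevon adjustment step follows: a pitch command and a roll command are mixed onto the two surfaces and clamped to an assumed servo travel. The mixing sign convention, gains and limits are assumptions, not the patent's implementation.

```python
# Mix a pitch command and a roll command onto the two elevons: both surfaces move
# together for pitch and differentially for roll, clamped to an assumed servo travel.
def mix_elevons(pitch_cmd_deg: float, roll_cmd_deg: float,
                max_deflection_deg: float = 25.0) -> tuple[float, float]:
    """Return (left_deg, right_deg) servo deflections for the elevons."""
    clamp = lambda d: max(-max_deflection_deg, min(max_deflection_deg, d))
    left = clamp(pitch_cmd_deg + roll_cmd_deg)
    right = clamp(pitch_cmd_deg - roll_cmd_deg)
    return left, right

# A 10 degree nose-up demand with a 5 degree roll demand deflects the surfaces
# asymmetrically: (15.0, 5.0).
print(mix_elevons(10.0, 5.0))
```
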
  • the degree of adjustment necessary will vary depending upon the part of the flight path the device 1 is in. Adjustment will be minimised in the dynamic ascent phase 27 in order to preserve energy for the homing phase 31.
  • the device 1 will pitch into the image capture phase 29 as the rate of ascent as determined by the sensors approaches a pre-defined threshold.
  • the transition from ascent into apogee is configured to take place earlier than it would naturally on the dynamic ascent flight path.
  • the device 1 does not reach quite the height it would naturally on the dynamic ascent flight path.
  • the trajectory is cut short so as to facilitate an extended steady flattened apogee trajectory during which images of an image target area may be captured.
  • the image capture phase begins, starting with activating the cameras so as to capture images toward to image capture target at several angles as the device 1 moves through apogee (during which the device 1 is upside down relative to the launch orientation) and dips into homing phase 31.
  • Images captured are stored within on-board data storage.
  • the flight at apogee will last a pre- defined period and/or until a predetermined threshold velocity is reached.
  • the cameras remain active during the homing phase 31 and become part of a machine vision module along with a machine vision software algorithm which is caused to operate as the device 1 enters the homing phase 31.
  • Images captured are tagged with data such as the position and orientation of device 1 for later use.
  • machine vision is used to identify a homing target 35 and then is used as feedback to the flight computer to facilitate navigational adjustment, thus leading the device 1 back to the homing target for catching in the catcher 37.
  • the captured device can be recovered, and connected via a power and data connector to a tablet or computer to download image data and data that is precursor to location and orientation.

Abstract

A system for capturing aerial images of an imaging target includes an imaging device that is capable of and configured for flight, has no on-board propulsion and is autonomous-in-flight, a launcher or launch adaptor for launching the imaging device, and an external processor configured for data communication with the device. The system enables image capture by a robust, autonomous-in-flight imaging device with back-to-base capability, which is very efficient in on-board energy use and easy to use without extensive training.

Description

Aerial Imaging Device and System
FIELD OF THE INVENTION
This invention relates to the field of imaging. More particularly, it relates to an imaging device for use in aerial imaging and/or surveillance, to a system incorporating the device and to a method of aerial imaging.
BACKGROUND OF THE INVENTION
Aerial imaging finds application in many civil and military applications, including surveying (e.g. of buildings, cityscapes, crops and landscapes), search and rescue, surveillance and reconnaissance.
Aerial imaging and surveillance systems typically comprise sophisticated hardware, often drones with GPS or geolocation capability and image capture devices. The operation of the drones requires skill from the user, in order to obtain useful images, often line of sight to operate and potentially also control of the cameras for imaging. As well as skill and experience, in certain areas operation of a drone may require a licence. Typically, a conventional drone requires flying with a joystick, with considerable skill and line of sight.
In seeking to provide smaller, lower cost and/or more easily operable aerial imaging systems, several alternative solutions have been proposed.
The ‘black hornet’ developed by Prox Dynamics AS and available from FLIR Systems Inc (https://www.flir.co.uk/globalassets/imported-assets/document/black-hornet-prs-spec-sheet.pdf) is a micro surveillance remote control helicopter, which is highly controllable, agile and transmits imagery back to the operator. It is characterised by its capability and its small size and light weight. Its features are described in numerous patent documents, including WO-A-2013/139509 and WO-A-2012/130790. However, such micro surveillance devices are highly complex, require skill and training to operate, require control and signals to be communicated from the operator unit/joystick to and from the drone, a real-time video feed to the operator and are relatively high in cost.
US6119976 describes a shoulder-launched unmanned reconnaissance system for aerial surveillance of remote targets. The portable munitions launcher enables the vehicle to rapidly reach flight speed after which it may be flown by ground-based remote control to an imaging target location. The system comprises a fuselage and collapsible wings which deploy after launch. Still and video images from the target imaging area are transmitted via a data link system. It may then be flown back to the operator and recovered.
US9738383 describes a remotely or autonomously controlled unmanned aerial vehicle which may be flown to a target area, deploy a parachute to enable the vehicle to loiter (in the manner of a paraglider) and in some embodiments jettison the parachute and return to base.
US8263919 concerns a surveillance vehicle comprising a vessel and parasail in which the vessel is launched via a mortar tube, using on-board launch propulsion, toward an area of imaging interest, at which point a parasail is deployed (relying on GPS location, period since launch or altitude detection) and a camera captures image data which is transmitted back to the operator via communications satellite. It may optionally comprise an on-board propulsion device for extended flight. The vehicle in US8263919 is contemplated as a single use vehicle and optionally has a self-destruct capability.
US-A-2004/0196367 describes an apparatus for surveillance over a zone having a projectile launcher, a projectile having a camera, image transmitter circuit (with corresponding ground-based receiver) and a deployable parachute to stabilise the projectile as it falls. US9234728 similarly describes a reconnaissance capsule launched from an artillery launching platform and carrying a camera for capturing and transmitting live video images back to the user during parachute stabilised descent. Both US-A-2004/0196367 and US9234728 are effectively single use items, although provision may be made to recover and re-use components and provide image capture from directly above an image target area during descent.
WO02/073129 describes an imaging device that can be launched from the ground and is configured, by way of a body member with angled fins, to rotate about its roll axis along a parabolic flight path to capture panoramic images. Images are captured by a lens disposed in the side of the body of the device, perpendicular to the flight path. The image data can optionally be tagged with orientation data from on-board sensors. Image data can be transmitted to a user on the ground. However, such devices do not return to base and are essentially single use unless components can be recovered.
There is a need therefore for an imaging system that is capable of capturing aerial images of a desired or target location.
The present inventor has devised a device and system to facilitate cost effective capture of images at a target location.
PROBLEM TO BE SOLVED BY THE INVENTION
There is a need for an imaging device and system that is capable of capturing aerial images of a desired or target location in a cost-effective manner.
It is an object of this invention to provide an imaging system which is usable without extensive training, is robust and re-usable and is low cost.
It is an object of this invention to provide an imaging device and system which is low cost, portable, usable without extensive training, robust and re-usable.
SUMMARY OF THE INVENTION
In accordance with a first aspect of the invention, there is provided an imaging device for capturing aerial images of an imaging target, the imaging device being capable of and configured for flight, wherein the imaging device preferably has no on-board propulsion and is preferably autonomous-in-flight.
In a second aspect of the invention, there is provided an imaging device for capturing aerial images of an imaging target, the imaging device being capable of and/or configured for a flight path defining a generally arcuate or flattened trajectory component and having no on-board propulsion.
In a third aspect of the invention, there is provided an imaging device for capturing aerial images of an imaging target, the imaging device capable of a loop flight path defining a generally arcuate or flattened trajectory at apogee and having no on-board propulsion.
In a fourth aspect of the invention, there is provided an imaging device that is autonomous in-flight, which device comprises a body capable of flight when subject to sufficient launch energy, an in-flight orientation means disposed in relation to the body, an imaging component, one or more positional, orientational, motion or environmental sensors to provide sensor data, a controller for controlling the in-flight orientation means and the imaging component in response to pre-defined criteria characterized by time periods and/or sensor data or data derived therefrom and a power source for providing power to the flight orientation means, one or more sensors, imaging component and controller.
In a fifth aspect of the invention, there is provided an imaging device for capturing aerial images of an imaging target, the imaging device being capable of and configured for flight, wherein the imaging device is configured to navigate toward a homing target (e.g. definable by a pattern and preferably by an on-board machine vision module).
In a sixth aspect of the invention, there is provided a system for capturing aerial images of an imaging target, the system comprising an imaging device as defined above, a launcher or adaptor for a launcher for launching the imaging device and an external CPU or external processor external to the imaging device configured for data communication with a controller and/or a data storage means on the imaging device.
In a seventh aspect of the invention, there is provided a launcher or adapter for a launcher for an imaging device as defined above.
In an eighth aspect of the invention, there is provided a controller or controller software configured to operate an imaging device as defined above, the controller or controller software having multiple flight operational phases comprising an ascent phase, an image capture phase and a homing phase.
In a ninth aspect of the invention, there is provided a catcher for a deployed image capture device as defined above, the catcher having a homing target recognizable by a computer vision module on board the image capture device on at least one surface of the catcher.
In a tenth aspect of the invention, there is provided a method of capturing an image of an imaging target, the method comprising providing an imaging device as defined above, providing a launcher for the imaging device, at a launch point orientating the imaging device for launch at a suitable pitch, preferably in a heading generally reciprocal to the heading of the imaging target from the launch point, causing the launcher to launch the device, providing a homing target identifiable by a machine vision module on board the device and positioning and orientating the homing target such that it can be identified by the machine vision module.
ADVANTAGES OF THE INVENTION
The system of the invention enables image capture by a robust, autonomous-in-flight imaging device with back-to-base capability, which is very efficient in on-board energy use and easy to use, without extensive training.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a rear perspective view of an embodiment of a device of the invention;
Figure 2 is a sectional plan view of an embodiment of a device of the invention;
Figure 3 is a rear perspective view of a further embodiment of a device of the invention;
Figures 4A and 4B are side and perspective transparent views of a main body or fuselage of a device of Figure 3;
Figure 5 is an illustration of a flight path of an embodiment of a device of the invention;
Figure 6 is an illustration of a flight path of a second embodiment of the device of the invention;
Figure 7 is an illustration of a flight path of a third embodiment of the invention;
Figures 8a to 8f are illustrations of alternative flight paths of the device of the invention according to several embodiments; and
Figure 9 is a perspective view illustration of one embodiment of a catcher of the invention.
DETAILED DESCRIPTION OF THE INVENTION
The imaging device of the invention is for aerial imaging and, in particular for capturing aerial images of an imaging target. Preferred embodiments described herein relate, where the context allows, to any of the aspects of invention defined above. The device has a body that is capable of flight (e.g. upon launching from a launcher, typically a ground-based launcher). The imaging device is capable of and/or configured for a flight path, preferably defining a generally arcuate or flattened trajectory and having no on-board propulsion. The imaging device is preferably capable of a loop flight path, preferably an inside loop flight path.
Preferably, the imaging device may be configured to be launched from a ground launcher. Optionally, a flight path for the device may in part be determined by the pitch and heading at launch. Such pitch and heading at launch may be selected to achieve a desired flight path and orientation for imaging.
Preferably, the imaging device is autonomous in-flight. By autonomous-in-flight, it is meant the imaging device is configured to adjust, re-orientate or adapt its own course or flight path for at least a portion of the flight (without control from an operator). Preferably, the imaging device is capable of adjusting, re-orientating or adapting its flight path during flight, preferably dynamically adapting its flight path. Such reorientation or adaptation of its flight path may be, for example, to allow the device to follow a pre-defined flight path or in response to data, such as data generated from sensors (or image capture sensors), preferably on-board, concerning environmental factors or location, orientation, acceleration and velocity data or image data. Preferably, the imaging device has a homing capability. In one embodiment, the device is configured to be capable of recognizing a homing target, typically a pre-defined homing target. In one embodiment, the imaging device is configured to recognize a homing target via sensors, such as via a machine vision module, which preferably makes use of the same imaging components (preferably cameras) used for image capture of a target image. Preferably, the imaging device is configured to dynamically adapt its flight path toward a homing target (and, for example, to navigate to the homing target) in response to data from recognition sensors or a machine vision module, which data may typically relate to the relative location of the homing target (e.g. relative to the device).
By homing target it is meant any target that is identifiable and recognizable and having a characterizing recognizable feature, signal or pattern. The homing target may be active (e.g. a variable or transmitting signal or pattern) or passive (a static signal or pattern). Preferably, the homing target is a visual pattern that is recognizable by a machine vision module or system on board the device.
Preferably the device comprises a controller for controlling one or more functions or components on the device, such as the orientation means, one or more sensors, an imaging component and/or a machine vision module. Preferably the controller comprises a processor or CPU and software or a computer program defining criteria and processes for the processor.
Preferably, the device comprises one or more sensors, such as positional sensors, orientational sensors, motion sensors or environmental sensors. Preferably the device comprises inertial sensors. Preferably, the sensors generate sensor data, which is preferably provided to a processor or flight computer. Such data may be used by the controller to make decisions as to flight path etc. For example, should data from inertial sensors indicate that the device in flight has deviated from a pre-defined flight path plan (pre-defined course), the controller may calculate an adjustment necessary to put the device back on course and then control the orientation means (e.g. control surfaces, such as elevons) to make that adjustment. Preferably, an imaging device, such as is defined above, comprises a controller configured for multiple flight operational phases comprising an ascent phase, an image capture phase and a homing phase. Each flight operational phase may be characterized by a different state or condition of control of the controller.
It is a preferred feature of the device of the present invention in relation to all aspects that there is no significant on-board propulsion means.
By on-board propulsion means it is meant a means by which the body may be propelled in extended flight under its own power. Preferably there is no on-board propulsive directional means, by which it is meant there is no propulsive booster or jet of any kind, even one which would be insufficient to maintain the body in flight but could facilitate orientation.
The invention comprises in other aspects, as defined above, a system for capturing aerial images of an imaging target, a launcher or adapter for a launcher for an imaging device, a controller or controller software configured to operate an imaging device, a catcher for a deployed image capture device and a method of capturing an image of an imaging target.
Where the context allows, the imaging device may be defined, in the alterative (to an imaging device), as a projectile (e.g. absent an image capture and storage means), which may optionally comprise sensor data capture means (including image data - for example for use in machine vision-based navigation) or may be configured to follow pre-defined in-flight adjustments for delivery to target location.
The imaging device of the invention is for capturing aerial images of an imaging target, is capable of and/or configured for a flight preferably having a path defining a generally arcuate or flattened trajectory component and has no on board propulsion.
Preferably, the imaging device is configured to capture aerial images of an imaging target, e.g. a target location. In one embodiment, the imaging device is configured to capture images at a pre-defined heading and at one or more pitch angles from an altitude that may optionally be pre-defined or desired. Preferably, the imaging device is configured to capture images along a range of pitch angles at a pre-defined heading, which can be referred to as the image target heading. The range of pitch angles may be selected so as to scan a particular heading. The range of pitch angles may optionally be over a range of from 10° to 180°, for example, such as from 30° to 120° and may be 60° to 105°. In one preferred embodiment, the range of pitch angles may be over a 90° range, e.g. from a horizontal pitch or within 5° to 10° thereof through to a vertical (downward) pitch or within 5° to 10° thereof. It is preferred that the device is configured to capture images from a generally horizontal orientation (or within 5° or 10° thereof) being at or close to a levelised component of flight path, preferably at or close to apogee in a looped flight path, so as to capture an image toward the horizon at the image target heading (or within 5° or 10° thereof) and preferably to capture images through a range of pitch angles at the image target heading (or within 5° or 10° thereof) through to a vertical downward pitch (or within 5° or 10° thereof). The image target heading is typically a reciprocal heading to the launch heading.
Preferably, the imaging device is capable of and/or configured for a flight path defining a generally arcuate component. Preferably the imaging device is capable of and/or configured for a flight path defining a flattened trajectory component, optionally both a generally flattened trajectory and a generally arcuate component.
By flattened trajectory component, it is meant a component of the flight path following a flat course or a straight course within 30° of horizontal or a curved course having a radius of curvature of 100 m or greater.
By generally arcuate component, it is meant a component of the flight path which is curved. The generally arcuate component may have a variable or relatively constant rate of rotation or relatively constant radius. A generally arcuate component having a variable rate of rotation or variable radius preferably has an increasing or decreasing radius, so as to define a spiraled curve. Preferably, the generally arcuate component has a relatively constant rate of rotation or relatively constant radius. By relatively constant radius, it is meant within 20% of the mean radius for that component of the flight path, preferably within 15% of the mean, more preferably within 10% of the mean and still more preferably within 5% of the mean. Preferably, the radius does not vary by more than 5% in the component of the flight path having a relatively constant radius and preferably by no more than 2%. An arcuate component may optionally comprise an arc of a circle which has a constant radius.
Preferably, the device is capable of and/or configured to have a flight path having a generally levelised component, which generally levelised component is a component of the flight path that is horizontal or within 30° of horizontal, preferably within 15°, more preferably within 10° and more preferably within 5° of horizontal. The generally levelised component of the flight path according to the preferred embodiment of the present invention is at a portion of the flight path of the imaging device from which the device descends (e.g. back to the user or to a homing target). The levelised component may be part of a curved or arcuate flight path and may be at apogee.
Preferably, the levelised component is part of a flattened trajectory or generally arcuate trajectory component or immediately precedes it.
In one embodiment, the levelised component of the flight path is after apogee. In a preferred embodiment, the levelised component of the flight path is at apogee.
In one embodiment, in which the levelised component of the flight path is after apogee, the flight path may follow a path after launch (e.g. pitched or vertical launch) in which an ascent phase continues until the rate of ascent reaches zero, at which apogee a horizontal vector of velocity may be zero or up to say 2 m/s, at which point the device begins to descend from its maximum height (at apogee) and pitch into a levelised component prior to curving about a generally arcuate flight path through descent. Thus, in this embodiment, the levelised component is after apogee and the horizontal vector of velocity at the levelised component depends upon the energy gained during descent from apogee to the altitude of the levelised component.
In a preferred embodiment, in which the levelised component of the flight path is at apogee, the flight path may follow a path after launch (e.g. pitched) in which an ascent phase continues until the rate of ascent reaches a pre-determined threshold greater than zero, at which point the device pitches toward a levelised component of flight path which is achieved at apogee, at which apogee a horizontal vector of velocity may be at least 1 m/s, preferably at least 2 m/s, preferably at least 2.5 m/s and more preferably at least 4 m/s, at which point the device begins to descend from its maximum height (at apogee) through a generally arcuate flight path. Thus, in this embodiment, the levelised component is at apogee and the horizontal vector of velocity at the levelised component depends upon the point before apogee at which the device pitches to a levelised flight path.
Preferably, the levelised component of the flight path is at apogee, by which it is meant at the highest point in the flight path. In the alternative, the levelised component may be within 20% of the highest point.
In one embodiment, the flattened trajectory component or generally arcuate component encompasses or is immediately preceded by the levelised component of the flight path and preferably also apogee.
The flight path in descent, which typically includes an image capture phase of the flight and a homing phase of the flight, may comprise a series of straight course angled flight path components but preferably comprises a curved flight path and more preferably comprises a generally arcuate component. The generally arcuate flight path component is preferably within 30° of the levelised component (and preferably apogee), more preferably within 20°, still more preferably within 15°, e.g. within 10°, still more preferably within 5°. Preferably the generally arcuate component encompasses or immediately follows the levelised component of the flight path (and preferably apogee).
As mentioned above, the imaging device is preferably configured to fly in a loop flight path, preferably an inverse loop and preferably defining a generally arcuate or flattened trajectory at apogee (which is the point of a levelised component of flight). Apogee, as used herein, is the highest point in the flight of the imaging device and the highest point in the loop flight path.
In one embodiment, the device is configured to follow a generally arcuate path extending about at least 30°, preferably at least 45°, more preferably at least 60°, still more preferably at least 90° and preferably up to 200°. In one embodiment, the generally arcuate component comprises a rotation in the range of from 30° to 60° or from 60° to 90° or slightly further (e.g. 105°).
Preferably, the device is configured to dynamically adapt and/or determine the flight path in flight.
In a preferred embodiment, the flight path of the device is defined by a pre-defined rate of rotation in a generally arcuate component or a pre-defined radius or range of radii of the generally arcuate component and a pre-defined horizontal vector of velocity at a generally levelised component, which preferably immediately precedes or is encompassed within the generally arcuate component and which is preferably at apogee.
Preferably, the device is configured to define a desired flight path portion in terms of radius of arc and horizontal vector of velocity at the generally levelised component (preferably at apogee), which flight path is dynamically adapted in altitude during flight in dependence upon the rate of ascent and/or in order to achieve the pre-defined radius of arc and horizontal vector of velocity.
Preferably, the pre-defined horizontal vector of velocity is selected to be up to 20 m/s, more preferably up to 15 m/s. It is believed that selecting a greater horizontal vector of velocity will result in insufficient time about the course of an arc to obtain sufficiently clear images and may also limit the altitude from which the images may be captured.
Preferably, the pre-defined horizontal vector of velocity is selected to be at least 1 m/s, preferably at least 2 m/s, more preferably at least 2.5 m/s, optionally at least 4 m/s, such as at least 5 m/s and optionally at least 7.5 or 10 m/s. It is believed that if a lesser horizontal vector of velocity is selected, the imaging device will not be sufficiently stable and may not be capable of completing a desired generally arcuate component of the flight path at a desired radius.
The desired radius of curvature that is pre-determined for an image capture flight with the device in a preferred embodiment, may be any suitable radius depending upon the altitude of the flight, the launch energy, the size of the device, the number or nature of images to be captured among other factors.
However, preferably, the pre-defined radius of curvature for the generally arcuate component of the flight path is at least 1 m, preferably at least 2 m, more preferably at least 4 m, still more preferably at least 5 m, e.g. at least 7 m, preferably up to 50 m, more preferably up to 25 m, still more preferably up to 20 m and more preferably up to 15 m. A range of 2 to 10 m is desirable in one embodiment. In another embodiment, a range of 5 to 15 m is preferable.
Preferably, the imaging device is configured to follow a fixed flight path for at least a portion of the flight path, which flight path is fixed in altitude, heading and course, the fixing of the fixed flight path being completed at or before the device reaches a levelised component or apogee. Preferably, the fixed flight path portion is between apogee and vertical descent and preferably comprises a generally arcuate component and extends through at least about 30°, preferably at least 60°, and optionally from 75 to 90°. In one embodiment, the fixed flight path is fixed before the device reaches apogee and may comprise in excess of 90°, e.g. up to 105° or up to 120°. The fixed flight path according to this embodiment is typically fixed according to a predefined radius of curvature or rate of rotation and the desired and achieved horizontal vector of velocity at a levelised component of flight path (e.g. at apogee).
Preferably, the device is configured to pitch into a generally arcuate flight path or flattened trajectory before the rate of ascent of the device falls below a pre-defined threshold. Preferably, the pre-defined threshold of rate of ascent is within a pre-defined range.
Preferably, the imaging device comprises a controller for controlling the trajectory toward the dynamically adapted flight path. The controller preferably comprises a CPU comprising a computer program comprising an algorithm for dynamically adapting a desired flight path for the device during flight.
Preferably, the imaging device is configured to recognize a pre-defined homing target and re-orientate itself in-flight to facilitate a flight path toward the homing target.
Preferably, the imaging device comprises a body capable of flight, e.g. when subject to sufficient launch energy. Preferably, the imaging device comprises an in-flight orientation means, e.g. disposed in relation to the body.
Preferably, the imaging device comprises an imaging component or image capture means.
Preferably, the imaging device comprises one or more sensors, such as positional, orientational, motion or environmental sensors to provide sensor data.
Preferably, the imaging device comprises a controller for controlling one or more controllable components on the device, such as one or more of the in-flight orientation means and the imaging component, typically in response to pre-defined criteria characterized by time periods and/or sensor data or data derived therefrom.
Preferably, the imaging device comprises a power source for providing power to one or more other components on the device, such as the flight orientation means, one or more sensors, an imaging component and a controller.
An imaging device in a further aspect of the invention, and as a preferred embodiment of the first and second aspects of the invention, comprises a body capable of flight when subject to sufficient launch energy, an in-flight orientation means disposed in relation to the body, an imaging component, one or more positional, orientational, motion or environmental sensors to provide sensor data, a controller for controlling the in-flight orientation means and the imaging component in response to pre-defined criteria characterized by time periods and/or sensor data or data derived therefrom and a power source for providing power to the flight orientation means, one or more sensors, imaging component and controller.
The body preferably comprises a main body and a fixed wing arrangement. The main body and the fixed wings are preferably structurally distinguishable. The body preferably comprises wings disposed equally either side of the main body. The body preferably comprises a longitudinal axis along its main body. There is preferably definable a first vertical plane relative to the body, which comprises the longitudinal axis extending generally perpendicular to the orientation of the wings. Preferably the body is symmetrical about the first vertical plane. A second horizontal plane may be defined relative to the body and comprising the longitudinal axis and generally extending in the direction of the wings. The body may be symmetrical about this second horizontal plane, but is preferably asymmetrical. Preferably, the body has a clearly distinguished top surface and bottom surface which are defined to give the body lift. The main body may be defined by a fore portion and an aft portion being those portions nearer the fore or aft of the main body along its longitudinal axis. In one embodiment, the wings are centred relative to the main body more toward the aft of the main body than the fore, preferably centred within the aft-most third of the main body.
Preferably, the profile of the wings has an aerodynamic shape, preferably to provide a desired amount of lift and to enable fine control, such as a profile corresponding to or adapted from a NACA airfoil.
The main body preferably has an aerodynamic profile. Preferably the lateral dimension (being generally in or parallel to the plane of the wings, i.e. parallel to the second horizontal plane) is of generally consistent lateral extent (width), typically varying in width by no more than 25% of the maximum width of the main body, more preferably no more than 15% and still more preferably no more than 10% over at least 75%, more preferably at least 80% and still more preferably at least 90% of its length (e.g. the entire body other than the extreme fore, i.e. nose, and aft, i.e. tail, portions). The nose, at the fore of the main body, may have a curved or squared plan shape, but preferably curved. The aft preferably comprises an extended tail portion which extends longitudinally out from the aft of the main body. This may be curved and/or tapered and may optionally be a fraction of the maximum width of the main body, e.g. from 30 to 70% of the width of the main body at the mid-point of the tail. Preferably, the main body has a vertical axis, being the axis perpendicular to the longitudinal axis and the lateral axis. The vertical extent (or depth) of the main body may vary along the length of the main body. Preferably the main body has a greater depth, on average, toward the fore than toward the aft. Preferably the portion of the main body having the greatest depth is in the fore half. Preferably, the depth decreases gradually from the position of maximum depth toward the aft.
Preferably, the in-flight orientation means are disposed in relation to the fixed wing arrangement. The in-flight orientation means preferably comprise control surfaces, which are preferably elevons. The control surfaces (e.g. elevons) are preferably operable by the controller (e.g. in response to sensor data). Servo motors may use driving rods to operate the control surfaces or elevons, based upon instructions received from the controller, but preferably the control surfaces or elevons are attached directly to the servo motors to reduce mechanical complexity and improve robustness of the device.
In a preferred embodiment, the imaging device is autonomous in flight. Furthermore, it is preferred that the device may not be controlled through the course of its flight remotely. Indeed, it is preferred that the device is not capable of being controlled externally or remotely through its entire flight after launch.
Whilst the device may be configured with radio communications capability, it preferably does not rely on radio frequency communications for control or location (e.g. GPS). More preferably, in order to minimise cost and weight, the device is absent a radio communications capability. By not relying on radio communications for location or navigation, use of the device is not hampered by RF jamming.
Preferably, the imaging device is not configured for two-way communication and is thus preferably not capable of two-way communication and absent two-way communication means, thereby reducing the risk of data being intercepted and furthermore reducing the weight and cost of the device.
The imaging component, one or more sensors, controller (preferably comprising a CPU), in-flight orientation means and power source may each be located at any suitable position on or in the body. Preferably, at least the power source and the CPU are located in a main body and preferably the power source is located toward the fore of the main body. Preferably, the in-flight orientation means is located on the wing arrangement and preferably toward a tail edge of the wing arrangement, e.g. in the form of control surfaces such as elevons. The imaging component or components may be located on the main body (e.g. in the nose) or on the wings (e.g. in a front edge or disposed above or beneath the wings). The sensors may be disposed at any suitable location, but typically in the main body. Preferably, a connector is provided in the tail portion of the main body, for connecting the CPU with an external CPU prior to and after launch. The connector is preferably for use in transferring data and power. Preferably, the connector is capable of auto-detachment, whereby at launch, a cable or other corresponding external connection means automatically detaches (e.g. by way of a pull-fit, snap-fit or magnetic mounting) without interrupting or destabilizing the launch process. Optionally, data and/or power are transferred by inductive coupling.
Preferably, there is provided at least one and preferably at least two (more preferably two) imaging components which are preferably image sensors such as CCD or CMOS sensors. The imaging components are preferably for capturing image data in the visible spectrum but may be used to capture infra-red image data and/or UV image data. The image data captured by the imaging components is preferably stored in a data storage component, typically part of an on-board computer. Preferably image data is tagged with sensor data and time data so that the location and/or direction of the image the data represents can be identified during a post-processing step. Optionally, two image capture devices are disposed in the nose of the main body of the device.
Any suitable sensors may be used. These may include an
accelerometer, a magnetometer, a gyroscope or a compass. Preferably, an inertial sensor is provided, such as a MEMS inertial sensor e.g. in the main body.
Optionally an altimeter may be provided. Preferably a clock is provided in association with the CPU.
The power source may be any suitable power source and is configured to supply power to the controller (and CPU), the image capture components, a servo motor for controlling the elevons, and the one or more sensors. The power source may be a battery, typically a rechargeable battery, a fuel cell (e.g. using methane, natural gas or hydrogen) or a supercapacitor.
Preferably, the power source is a supercapacitor, which has the benefit of rapid charging, low weight and greater robustness.
By configuring the imaging device only to activate components when they are required, power can be saved and the size, weight and cost of the power source required may be reduced. Accordingly, in a preferred embodiment, the controller is configured to activate the imaging component only during an image capture phase and a homing phase of a flight, preferably by way of a state machine arrangement.
In a further aspect of the invention and in a preferred embodiment of the above aspects, the invention comprises an imaging device for capturing aerial images of an imaging target, the imaging device being capable of and configured for flight, wherein the imaging device is configured to navigate toward a homing target. In a further aspect of the invention and in a preferred embodiment of the above aspects, the invention comprises an imaging device for capturing aerial images of an imaging target, the imaging device being capable of and configured for flight, wherein the imaging device comprises an on-board machine vision module.
Preferably, the imaging device comprises a machine vision module. Preferably, the machine vision module comprises the imaging component and the controller (typically the same controller) which is preferably provided with machine vision software and preferably configured with software to recognize a pre-defined image signal or pattern (as a homing target) in an image or series of images captured by the imaging component. Preferably, the controller is configured to identify the pre-defined image signal or pattern (the homing target) and track the homing target in an image frame and use the positioning data of the homing target within the image frame as a navigational aid in order to navigate the imaging device toward the homing target. The image signal or pattern may be a static or an active signal. An active signal or pattern may comprise, for example, one or more lights that are identifiable by a machine vision image capture component and which are configured to emit light for pre-defined periods or frequencies. Such lights may be provided by an LED or LED arrangement. A passive signal or pattern may comprise a distinctive shape or arrangement of shapes, etc, on a surface, which is readily recognizable by a machine vision module.
In one embodiment, the imaging device has a wingspan of up to 50 cm, preferably up to 30 cm, e.g. from 20 to 30 cm. More preferably, the imaging device has a wingspan of up to 20 cm, more preferably up to 18 cm, e.g. from 5 to 15 cm, preferably 10 to 12 cm.
In one embodiment, the imaging device has a total weight of up to 500 g, preferably up to 350 g. In one embodiment, the total weight of the imaging device is from 200 to 350 g. In an alternative preferred embodiment, the imaging device has a total weight of up to 200 g, preferably up to 150 g, more preferably up to 125 g and preferably in the range 50 to 120 g, e.g. 60 to 110 g, more preferably from 70 to 110 g.
In a particularly preferred embodiment, the imaging device has a wingspan of 5 to 15 cm and a weight of from 50 to 125 g. Such a small device can achieve a greater height and capture better images for a given launch energy. In one embodiment, a device of this embodiment can be used to capture aerial images, which are stable and resolvable, from 20 to 200 m, preferably 30 to 150 m, still more preferably up to 125 m, still more preferably 50 to 100 m. For lower heights, below 30 m, it is preferred to use a device of a wingspan greater than 15 cm, more preferably greater than 20 cm, in order to provide good control (and good image stability) at those altitudes.
A system of an aspect of the invention is for capturing aerial images of an imaging target and comprises an imaging device as defined above, a launcher or adaptor for a launcher for launching the imaging device and an external CPU or external processor external to the imaging device configured for data communication with a controller and/or a data storage means on the imaging device.
Preferably, the external CPU or processor is a CPU in a computer, tablet or smartphone connected to the launcher or connected to the imaging device via the launcher or built into the launcher for connection to the imaging device prior to launch.
The external CPU may be provided with software to calculate, for a given desired imaging target and a given launcher, a desired horizontal vector of velocity and radius of curvature of a generally arcuate flight path to achieve a desired altitude. The external CPU may communicate such flight data to the on-board controller for use in autonomous dynamic flight control. After collection of the device, the external CPU may be connected with the device to extract the image data. Preferably, the external CPU is configured with image processing data to enable the images to be viewed on a screen associated therewith and the features captured in the image data to be located in dependence on contemporaneous sensor data.
A catcher may be used for catching a deployed image capture device as defined above, which catcher may have a homing target (comprising an active or passive pattern or visible signal) recognizable by a computer vision module on board the image capture device on at least one surface of the catcher and preferably a catching means associated therewith. A catching means may comprise an energy absorbing deformable member such as an inflated member (e.g. forming a deformable airbag or the like), a resiliently deformable foam member, or a net or canvas member disposed about a rim (of any suitable shape and configuration) or may simply comprise a glove member with an extended member (e.g. up to 1 m in diameter, or up to 50 cm in diameter).
The imaging device preferably comprises a controller configured for multiple operational phases, which preferably comprise an ascent phase, an image capture phase and a homing phase. Preferably, software is provided to the controller (comprising a processor) to enable the controller to operate the imaging device according to the multiple flight operational phases.
The controller may preferably be configured as a state machine in which each of the multiple operational phases may be associated with a state and the change from one state to the next may be governed by an input trigger, such as clock, compass or other sensor data. Thus the state machine may comprise an ascent state, an image capture state and a homing state.
Preferably, the controller is configured with a dynamic approach state, which may optionally be within the ascent phase. In the dynamic approach state, the controller is configured to operate the orientation means in response to real-time sensor data in order to navigate the imaging device toward a levelised component of the flight at a pre-defined horizontal vector of velocity, the altitude of which may be adapted according to software associated with the controller in order to achieve this in dependence on the sensor data. An image capture phase or state may be entered when a pre-defined altitude, orientation or speed is reached, such as when a levelised component of the flight is achieved at which point the device is configured to activate the image capture devices and associated software. The controller during the image capture phase is preferably configured to follow a fixed generally arcuate flight path to enable image capture of an image target and/or along a range of pitch from near horizontal or horizontal to near vertical or vertical. When sensor data determines the device to be vertical (or at another predetermined angle in descent or at a pre-determined velocity in descent), the controller may enter the homing phase or state. In the homing phase, the device is configured to run machine vision software to process image data from the imaging components to identify a pre-defined homing target and then to track the homing target within the image data, which data may then be used by the controller to navigate the device toward the homing target.
Depending upon the general flight path type, the dynamic approach may form part of the ascent phase (e.g. entered once the sensors detect a pre-defined threshold of ascent rate) or may occur after the device reaches apogee (e.g. sensors detect zero ascent), during which dynamic approach phase the controller is configured to adapt the flight path dynamically and pitch toward a levelised component of the flight path in dependence upon a target horizontal vector of velocity at the levelised component of the flight path. The ascent phase preferably also comprises a sub-phase or sub-state being the initial ascent or launch phase during which location data for the device is generated not by sensor data (due to the unreliability of sensors at rapid acceleration) but by calculation from initial launch angle, predicted launch speed and a time period (e.g. 200 ms).
In a preferred embodiment, there are provided further phases or states, pre-launch, including a ‘ready’ state in which the launcher and device are prepared (navigational data having been uploaded and power source charged up) and it is orientated at an angle which will enable the desired flight path to be achieved.
The invention will now be described in more detail, without limitation, with reference to the accompanying Figures.
In Figure 1, the imaging device 1 comprises a main body 3 having a fixed wing arrangement comprising a wing 9 disposed either side of the main body 3. The wings have a profile adapted from a NACA profile. The wings 9 are disposed toward the rear of the main body 3. The main body 3 has a broad profile and a square nose 5, although a more rounded nose may optionally be provided to improve robustness for recovery. The rearward projecting tail 7 provides stability and a suitable location for locating a power and data connection for the device. An elevon 11 is disposed along the rearmost edge 12 of each wing to provide orientation control.
As shown in Figure 2, the elevons 11 are driven by servo motors 15 under the control of flight computer 13. One or more cameras 17 are provided, disposed in the nose 5 and/or on one or both wings, e.g. at a foremost edge 19 and are controlled by flight computer 13. Image data captured by the cameras 17 is stored in data storage provided in or in association with flight computer 13. Power to the flight computer 13 and other components such as servo motors 15 (and thus elevons 11) and cameras 17 is provided by supercapacitor 21.
In Figure 3, an alternative embodiment of the imaging device 1 is illustrated which has a smaller wingspan. Rearward projecting tail 7 has disposed therein a power and data connection 39 for the device 1. A camera 17 is disposed in a recess 41 in the nose 5. The main body 3 of device 1 is illustrated in Figures 4A and 4B in which the outer casing 43 is illustrated transparently to show the contents. Disposed within the main body 3, in the fore portion of the device 1 toward the nose 5, are supercapacitors 21 for providing power to other components. A pair of recesses 41 in the nose 5 allow image access for cameras 17 disposed within the recesses 41. In a mid-portion of the device 1, controller 13 (or flight computer) is disposed. Servo motors 15 are arranged in association with the wings (not shown) and power and data connection 39 is disposed in the tail 7.
Figure 5 illustrates the phases of a flight path 23 of a device 1 of the invention when used in this system according to one embodiment. After launch is the initial ascent phase 25 during which the device 1 is rapidly accelerating. During the initial ascent phase 25, the components in the device are maintained in as steady a state as possible. Following this is the dynamic ascent phase 27. During the dynamic ascent phase 27, sensors are activated and the desired flight path is maintained by operating elevons 11 in response to sensor data representative of location, motion and orientation. The object of the navigation during the dynamic ascent phase 27 is to enable maximum altitude to be achieved whilst navigating a flattened (and extended) trajectory at apogee 33 at sufficient airspeed to facilitate stability for image capture. This is achieved by causing the device 1 to turn into apogee (the image capture phase 29) before the rate of ascent of the device 1 falls below a pre-defined level for the situation in view of relative location of image target, angle of images desired, length of image capture phase 29 and optionally weather conditions. During the image capture phase 29, which takes place at apogee, the flight path follows a relatively steady and relatively flat trajectory during which images can be captured. Flight path adjustments by movement of the elevons 11 are kept to a minimum in order to maintain stability. However, the device 1 may be manoeuvred to ensure that image capture of the desired image target is achieved. As the device falls out of apogee and begins its descent, it enters the homing phase 31. During the homing phase, the device 1 alters its course to be directed to what is calculated to be the vicinity of its launch position. The user should now have provided a visible and identifiable ‘homing target’ 35 being a distinctive and machine vision identifiable pattern. The machine vision module comprising the cameras 17 and a machine vision algorithm on the flight computer 13 identify and locate the position of the homing target 35 relative to the device 1. Feedback from the machine vision module as to the relative position of the homing target 35 feeds into the navigation function of the flight computer which adjusts the trajectory accordingly.
Two further phases, not illustrated in Figure 5, are not part of the flight path but of the operation of the device: the launch phase, comprising the pre-launch and the launch event, and the catch phase, subject to a successful homing phase 31.
Figure 6 illustrates a flight path 23 according to an alternative embodiment, wherein the flight path 23 follows a loop that incorporates a generally arcuate component at apogee 33.
Figure 7 illustrates an alternative flight path 23 wherein the flight path follows a vertical or near vertical path to apogee 33 and then pitches into a levelised component 45 on descent before entering generally arcuate flight path component 47 through the imaging phase 29 before continuing into the homing phase 31.
Figure 8a illustrates a variable radius inside loop flight path 23 in which, after launch, the device 1 follows a fixed angled ascent as part of the dynamic ascent phase 27, maintaining course until a predetermined criterion is reached (e.g. ascent rate or height or likely speed at apogee 33 based upon intended path), whereupon the device 1 pitches into apogee 33 at a controlled increasing pitch rate (i.e. spiralling in) and after apogee 33 spirals out at decreasing pitch rate for the course of the imaging phase 29 until the device 1 is near vertical, at which point it enters the homing phase 31 toward homing target 35.
Figure 8b illustrates a fixed radius inside loop flight path 23. According to this embodiment, the flight path follows a dynamic ascent phase 27 which at a pre-defined condition (e.g. ascent rate) pitches into an arcuate, fixed radius path to apogee 33 and through the imaging phase 29 until near vertical at which point it enters the homing phase 31.
Figure 8c illustrates an outside loop flight path 23 in which the device 1 follows an outside loop.
Figure 8d illustrates an alternative flight path 23 having a dynamic ascent phase 27 during which the device 1 follows a pre-determined course until a condition is met (e.g. ascent rate threshold) at which point it pitches, e.g. in an arcuate path, toward apogee 33 and beyond, until directed toward an imaging target, whereupon it follows a steady course (e.g. at an angle to the horizontal) toward the imaging target for a pre-defined period, the imaging phase 29, and then pitches toward the homing target 35, entering the homing phase 31.
Figure 8e illustrates an alternative flight path 23 in which the device 1 follows a pre-determined course until a condition is met (e.g. ascent rate threshold) at which point it pitches, e.g. in an arcuate path, toward apogee 33. About apogee 33, it follows an arcuate path having a tight radius of curvature 49, but retains a horizontal vector of velocity through apogee 33 of, for example, at least 1 m/s, preferably at least 2 m/s (i.e. the device 1 does not stall at apogee 33) in order to maintain control and stability. In a preliminary descent phase 51 from apogee 33, the device then pitches into a flatter course 53, following a steady course directed toward an imaging target (the imaging phase 29), following which the device 1 pitches into the homing phase 31.
Figure 8f illustrates an alternative flight path 23 comprising banked turns. According to this embodiment, the ascent phase 27 follows a steady course at a heading that is typically not reciprocal to the imaging target; the device 1 rolls and then, at a pre-determined altitude (or ascent rate), levels off and banks into a banked turn about an arcuate course (as illustrated in the plan view), during which the imaging phase 29 takes place, before rolling into the homing phase 31.
The catch phase is illustrated in Figure 9, which shows an embodiment in which the catcher 37 is composed of an airbag, which may be inflatable on demand, having a homing target 35 disposed on at least one surface that can be made visible to the machine vision module of the device 1. In use, according to one embodiment having a flight path shown in Figure 5, the imaging device 1 will be disposed in a cradle (not shown) of a launcher (not shown). In the launcher, a power and data connector automatically links with a corresponding power and data connector in the tail 7 of the device 1. The power and data connector in the launcher links to an associated interface (e.g. of a tablet or other computer). The supercapacitor(s) 21 on board the device 1 are charged through the power and data connector (in about 20 seconds). During this pre-launch portion of the launch phase, the device may be aimed but not launched until the supercapacitors 21 are suitably charged and the device 1 is aimed at a suitable pitch angle within defined limits.
Prior to launch, the sensors (not shown) in the device 1 detect the direction (ideally 180° from the direction of the imaging target) and pitch of the device 1.
The sensors used on board include inertial sensors, namely gyroscopes, magnetometers and accelerometers, which allow the position, orientation and speed of the device 1 to be determined.
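As an illustration of how such inertial data might be used, the following reduced, two-dimensional dead-reckoning sketch integrates a body-axis acceleration and a pitch rate to track orientation, velocity and position; a real implementation would be three-dimensional with bias and calibration handling, and all names here are assumptions:

    # Reduced 2-D dead-reckoning sketch (longitudinal plane only, illustrative).
    from dataclasses import dataclass
    import math

    @dataclass
    class NavState:
        x: float = 0.0        # horizontal position, m
        z: float = 0.0        # altitude, m
        vx: float = 0.0       # horizontal velocity, m/s
        vz: float = 0.0       # vertical velocity, m/s
        pitch: float = 0.0    # radians

    def integrate(state: NavState, accel_body: float, pitch_rate: float,
                  dt: float, g: float = 9.81) -> NavState:
        """Propagate the 2-D state one time step from body-axis acceleration
        (along the fuselage) and pitch rate."""
        pitch = state.pitch + pitch_rate * dt
        ax = accel_body * math.cos(pitch)
        az = accel_body * math.sin(pitch) - g
        vx = state.vx + ax * dt
        vz = state.vz + az * dt
        return NavState(x=state.x + vx * dt, z=state.z + vz * dt,
                        vx=vx, vz=vz, pitch=pitch)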
At launch and during the initial ascent phase 25, the degree of acceleration is too great for the sensors to function, so for the first few tens of milliseconds the sensors are not relied upon; instead, the position after this initial period is determined from an estimated velocity and the pitch and direction of launch. In the dynamic ascent phase, the flight computer 13 relies on the position calculated from the initial ascent phase, adapted with sensor data, to provide location, position and velocity information at any one time. To the extent that the actual position and direction of travel deviate from the intended flight path (a pre-defined flight path calculated, for example, to achieve a loop path in the most efficient way whilst meeting pre-defined criteria, such as length of the image capture phase and angle of image capture) as determined from the sensor data, the flight computer 13 calculates the adjustment to the elevons necessary to manoeuvre the device 1 the required amount to bring it back toward the intended flight path. The degree of adjustment necessary will vary depending upon the part of the flight path the device 1 is in. Adjustment is minimised in the dynamic ascent phase 27 in order to preserve energy for the homing phase 31.

The device 1 pitches into the image capture phase 29 as the rate of ascent, as determined by the sensors, approaches a pre-defined threshold. Essentially, the transition from ascent into apogee is configured to take place earlier than it would naturally on the dynamic ascent flight path. As such, the device 1 does not reach quite the height it would naturally reach on the dynamic ascent flight path; the trajectory is cut short so as to facilitate an extended, steady, flattened apogee trajectory during which images of an image target area may be captured. By pitching into apogee early, sufficient velocity remains to allow a short flattened-trajectory flight to occur.

As the device 1 pitches into apogee, the image capture phase begins, starting with activating the cameras so as to capture images toward the image capture target at several angles as the device 1 moves through apogee (during which the device 1 is upside down relative to the launch orientation) and dips into the homing phase 31. Images captured are stored within on-board data storage. The flight at apogee lasts a pre-defined period and/or until a predetermined threshold velocity is reached. The cameras remain active during the homing phase 31 and become part of a machine vision module, along with a machine vision software algorithm, which is caused to operate as the device 1 enters the homing phase 31. Images captured are tagged with data such as the position and orientation of the device 1 for later use.
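The corrective behaviour described above can be illustrated with a simple, hedged sketch in which deviations from the intended flight path are converted into bounded elevon adjustments, with lower gains during the dynamic ascent phase 27 to conserve energy; the gains, limits and function signature are assumptions for illustration, not the patent's control law:

    # Illustrative elevon correction toward the intended flight path.
    def elevon_correction(cross_track_error_m: float,
                          pitch_error_deg: float,
                          phase: str,
                          max_deflection_deg: float = 15.0) -> tuple:
        """Return (left_elevon_deg, right_elevon_deg) corrections."""
        gain = 0.2 if phase == "DYNAMIC_ASCENT" else 0.6   # conserve energy on ascent
        pitch_cmd = gain * pitch_error_deg                 # symmetric deflection
        roll_cmd = gain * 0.5 * cross_track_error_m        # differential deflection
        clamp = lambda v: max(-max_deflection_deg, min(max_deflection_deg, v))
        return clamp(pitch_cmd + roll_cmd), clamp(pitch_cmd - roll_cmd)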
In the homing phase 31, machine vision is used to identify the homing target 35 and is then used as feedback to the flight computer 13 to facilitate navigational adjustment, thus leading the device 1 back to the homing target 35 for catching in the catcher 37.
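One simple way to turn a machine-vision detection of the homing target 35 into feedback for the flight computer is sketched below; the detection step itself is left abstract and the field-of-view values are illustrative assumptions:

    # Illustrative conversion of a detected target pixel into steering errors.
    def homing_correction(target_px: tuple, image_size: tuple,
                          fov_deg: tuple = (60.0, 45.0)) -> tuple:
        """Convert a detected target pixel position into (yaw_error_deg,
        pitch_error_deg) relative to the camera boresight."""
        (u, v), (w, h) = target_px, image_size
        yaw_error = (u - w / 2) / w * fov_deg[0]
        pitch_error = (v - h / 2) / h * fov_deg[1]
        return yaw_error, pitch_error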
The caught device 1 can be recovered and connected via the power and data connector to a tablet or computer to download the image data together with data from which location and orientation can be derived.
By minimising the power consumption over the course of the entire flight, it is possible to use a small power source which, if sufficiently small, can be a supercapacitor, which has the advantages of being very efficient at charging and holding charge and of being very robust. Since only a small supercapacitor is needed, the weight of the device is reduced, and the required protection is reduced, further reducing the weight. Thus, there is a significant advantage in the energy management of the device of the present invention.
The invention has been described with reference to a preferred embodiment. However, it will be appreciated that variations and modifications can be effected by a person of ordinary skill in the art without departing from the scope of the invention.

CLAIMS:
1. An imaging device for capturing aerial images of an imaging target, the imaging device being capable of and configured for flight, wherein the imaging device has no on-board propulsion and is autonomous-in-flight.
2. An imaging device as claimed in claim 1, wherein the imaging device is capable of and/or configured for a flight path defining a generally arcuate or flattened trajectory component.
3. An imaging device as claimed in claim 1 or claim 2, wherein the imaging device is capable of and/or configured for a flight path defining a generally arcuate component.
4. An imaging device as claimed in any one of claims 1 to 3, wherein the imaging device is capable of and/or configured for a flight path defining a flattened trajectory component.
5. An imaging device as claimed in any one of claims 1 to 4 that has a flight path having a generally levelised component that is horizontal or within 30° of horizontal, preferably within 15°, more preferably within 10° and more preferably within 5° of horizontal.
6. An imaging device as claimed in claim 5, wherein the levelised component is part of the flattened trajectory or generally arcuate trajectory component or immediately precedes it.
7. An imaging device as claimed in claim 4 or claim 5, wherein the levelised component is at apogee.
8. An imaging device as claimed in any one of the preceding claims, wherein the flattened trajectory component or generally arcuate component encompasses or is immediately preceded by apogee.
9. An imaging device as claimed in any one of the preceding claims, wherein the imaging device is capable of a loop flight path, preferably an inside loop.
10. An imaging device as claimed in any one of the preceding claims, which is configured to navigate toward a homing target.
11. An imaging device as claimed in claim 10, wherein navigation toward a homing target is facilitated by an on-board machine vision module.
12. An imaging device as claimed in any one of the preceding claims, which is configured to be launched from a ground-based launcher, to capture aerial images of a target location, to recognize a pre-defined homing target and re-orientate itself in-flight to facilitate a flight path toward the homing target.
13. An imaging device as claimed in any one of the preceding claims, which is configured to capture images along one or more pitch angle or along a range of pitch angles (e.g. horizontal through to vertically downwards) at a pre-defined heading.
14. An imaging device as claimed in any one of the preceding claims, wherein the device is configured to dynamically adapt and/or determine the flight path in flight.
15. An imaging device as claimed in claim 14, wherein the device is configured to dynamically calculate and follow a flight path defining a generally arcuate path, the generally arcuate path extending about at least 30°, preferably at least 45°, more preferably at least 60°, still more preferably at least 90° and preferably up to 200°.
16. An imaging device as claimed in claim 15, wherein the generally arcuate path encompasses or follows apogee.
17. An imaging device as claimed in any one of the preceding claims, having a flight path that is defined by a pre-defined rate of rotation in a generally arcuate path or a pre-defined radius or range of radii of the generally arcuate path and a pre-defined horizontal vector of velocity at a levelised component of the flight path, which is preferably at apogee.
18. An imaging device as claimed in any one of the preceding claims, wherein the device is configured to define a desired flight path portion in terms of radius of arc and horizontal vector of velocity at a levelised component of flight (e.g. at apogee), which flight path is dynamically adapted in altitude during flight in dependence upon the rate of ascent.
19. An imaging device as claimed in claim 17 or claim 18, wherein the pre-defined horizontal vector of velocity is selected to be up to 15 m/s.
20. An imaging device as claimed in any one of claims 17 to 19, wherein the pre-defined horizontal vector of velocity is selected to be at least 1 m/s.
21. An imaging device as claimed in any one of the preceding claims, wherein the device is configured to follow a fixed flight path for at least a portion of its flight path between a levelised component of flight (e.g. at apogee) and vertical descent, the fixed flight path comprising at least 30°, preferably at least 60° and optionally from 75 to 90°.
22. An imaging device as claimed in claim 21, wherein the fixed flight path is determined and fixed during flight before the device reaches the levelised component of flight (e.g. at apogee), in dependence on the characteristics of pre-defined rate of rotation and horizontal vector of velocity at the levelised component of the flight.
23. An imaging device as claimed in any one of the preceding claims, wherein the device is configured to pitch into a generally arcuate flight path or flattened trajectory before the rate of ascent of the device reaches or falls below a pre-defined threshold.
24. An imaging device as claimed in claim 23, wherein the pre-defined threshold of rate of ascent is within a pre-defined range.
25. An imaging device as claimed in any one of the preceding claims, wherein the device comprises a controller for controlling the trajectory toward a dynamically adapted flight path.
26. An imaging device as claimed in claim 25, wherein the controller comprises a CPU, which CPU comprises a computer program comprising an algorithm for dynamically adapting a desired flight path for the device during flight.
27. An imaging device as claimed in any one of the preceding claims, which device comprises a body capable of flight when subject to sufficient launch energy, an in-flight orientation means disposed in relation to the body, an imaging component, one or more positional, orientational, motion or environmental sensors to provide sensor data, a controller for controlling the in-flight orientation means and the imaging component in response to pre-defined criteria characterized by time periods and/or sensor data or data derived therefrom and a power source for providing power to the flight orientation means, one or more sensors, imaging component and controller.
28. An imaging device as claimed in claim 27, wherein the body comprises a main body and a fixed wing arrangement and wherein the in-flight orientation means are disposed in relation to the fixed wing arrangement.
29. An imaging device as claimed in claim 28, wherein the in-flight orientation means comprises control surfaces, e.g. elevons, operable by the controller.
30. An imaging device as claimed in any one of the preceding claims, which is characterized by the absence of reliance on radio communications and/or capability of radio communications.
31. An imaging device as claimed in any one of the preceding claims, which is configured to operate after launch in the absence of two-way communication.
32. An imaging device as claimed in any one of the preceding claims, wherein the device comprises an orientation means for orientating the device according to a desired flight path (e.g. a dynamically determined flight path or according to desired interim and final destinations), a controller for controlling the orientation means and a machine vision module for identifying a homing target and generating data relating to the location of the homing target, wherein the controller controls the orientation means in dependence on data generated by the machine vision module so as to navigate the device toward the homing target.
33. An imaging device as defined in any one of claims 1 to 32, which comprises a controller configured for multiple flight operational phases comprising an ascent phase, an image capture phase and a homing phase.
34. A system for capturing aerial images of an imaging target, the system comprising an imaging device as defined in any one of claims 1 to 33, a launcher or adaptor for a launcher for launching the imaging device and an external CPU or external processor external to the imaging device configured for data communication with a controller and/or a data storage means on the imaging device.
35. A system as claimed in claim 34, wherein the external CPU or processor is a CPU in a computer, tablet or smartphone connected to the launcher or connected to the imaging device via the launcher or built into the launcher for connection to the imaging device prior to launch.
36. A launcher or adapter for a launcher for an imaging device as defined in any one of claims 1 to 33.
37. A controller or controller software configured to operate an imaging device as defined in any one of claims 1 to 33, the controller or controller software having multiple flight operational phases comprising an ascent phase, an image capture phase and a homing phase.
38. A catcher for a deployed image capture device as defined in any one of claims 1 to 33, the catcher having a homing target recognizable by a computer vision module on board the image capture device on at least one surface of the catcher.
39. A method of capturing an image of an imaging target, the method comprising: providing an imaging device according to any one of claims 1 to 33; providing a launcher for the imaging device; at a launch point, orientating the imaging device for launch at a suitable pitch, in a heading generally reciprocal to the heading of the imaging target from the launch point; causing the launcher to launch the device; providing a homing target identifiable by a machine vision module on board the device; and positioning and orientating the homing target such that it can be identified by the machine vision module.