GB2596512A - Improvements in and relating to drone control - Google Patents

Improvements in and relating to drone control

Info

Publication number
GB2596512A
GB2596512A (Application GB2007822.6A)
Authority
GB
United Kingdom
Prior art keywords
signal
animal
drone
processor
kinematics
Prior art date
Legal status
Pending
Application number
GB2007822.6A
Other versions
GB202007822D0 (en)
Inventor
Palego Cristiano
Shearwood Jake
Aldabashi Nawaf
Cross Paul
Haynes Williams Samuel
Current Assignee
Bangor University
Original Assignee
Bangor University
Priority date
Filing date
Publication date
Application filed by Bangor University
Priority to GB2007822.6A
Publication of GB202007822D0
Publication of GB2596512A
Legal status: Pending

Classifications

    • G01S5/0284 Position-fixing by co-ordinating two or more direction or position line determinations, using radio waves; relative positioning
    • G01S5/16 Position-fixing by co-ordinating two or more direction or position line determinations, using electromagnetic waves other than radio waves
    • G01S5/18 Position-fixing by co-ordinating two or more direction or position line determinations, using ultrasonic, sonic, or infrasonic waves
    • G01S11/06 Systems for determining distance or velocity not using reflection or reradiation, using radio waves, using intensity measurements
    • G01S11/12 Systems for determining distance or velocity not using reflection or reradiation, using electromagnetic waves other than radio waves
    • G01S11/14 Systems for determining distance or velocity not using reflection or reradiation, using ultrasonic, sonic, or infrasonic waves
    • G01S2205/09 Position-fixing specially adapted for specific applications, for tracking people
    • G01S3/20 Direction-finding using radio waves, by amplitude comparison of signals derived by sampling an antenna system having periodically-varied orientation of directivity characteristic
    • G01S3/783 Direction-finding using electromagnetic waves other than radio waves, by amplitude comparison of signals derived from static detectors or detector systems
    • G01S3/8032 Direction-finding using ultrasonic, sonic or infrasonic waves, by amplitude comparison of signals derived sequentially from receiving transducers having differently-oriented directivity characteristics
    • G01S5/0278 Position-fixing using radio waves, involving statistical or probabilistic considerations
    • G01S5/0294 Position-fixing using radio waves; trajectory determination or predictive filtering, e.g. target tracking or Kalman filtering
    • G01S5/12 Position-fixing using radio waves, by co-ordinating position lines of different shape, e.g. hyperbolic, circular, elliptical or radial

Abstract

A system 100 for controlling the operation of a drone 110 comprises: a drone 110; a receiver unit 120 mounted on the drone 110, the receiver unit 120 configured to receive a signal from an animal; a processor 130 operable to process the received signal to determine the kinematics of the animal and provide an output based on the kinematics, such as position or velocity of travel; and a controller 140 operable to control operation of the drone 110 based on the output from the processor 130. The signal may be a radio signal from a transmitter attached to the animal, a reflected signal from a reflective object on the animal and/or a sound signal from the animal. The processor is preferably operable to determine angle of arrival and received signal strength to determine the kinematics.

Description

IMPROVEMENTS IN AND RELATING TO DRONE CONTROL
[0001] The present disclosure relates to a system for controlling the operation of a drone, and a method thereof.
BACKGROUND
[0002] Telemetry is the process of obtaining information remotely. Animal telemetry has become increasingly important, as researchers and businesses look to monitor and study animal behaviour, function and the effect of the surrounding environment.
[0003] Tagging animals with instrumentation tags can allow useful data to be collected. In some cases, a tagged animal must be caught to obtain the information from the tag. In other cases, the tag may wirelessly transmit the data to a remote data collection point.
[0004] Where wireless data transmission is performed, there are numerous difficulties relating to data collection. It is preferable for instrumentation tags to be small and lightweight, so as to minimise the impact on the animal. However, to achieve suitable range of data transmission, a suitably powerful, and thus heavy, transmitter is required.
[0005] An alternative approach is to maintain a suitable distance between the transmitter and receiver to ensure that data may be continuously detected. However, difficulties arise in tracking fast-moving and/or flying animals, for example insects, such as bees, wasps and hornets. Furthermore, whilst it is desirable to monitor the animal without affecting their natural behaviour, current approaches are not suitable or present additional problems.
[0006] It is an object of the present invention to provide an improved system for controlling the operation of a drone and/or method thereof and/or address one or more of the problems discussed above, or discussed elsewhere, or to at least provide an alternative system and/or method.
SUMMARY OF THE INVENTION
[0007] According to the present invention there is provided a system and method as set forth in the appended claims. Other features of the invention will be apparent from the dependent claims, and the description which follows.
[0008] According to a first aspect of the present invention there is provided a system for controlling the operation of a drone, the system comprising: a drone; a receiver unit mounted on the drone, the receiver unit configured to receive a signal from an animal; a processor operable to process the received signal to determine the kinematics of the animal and provide an output based on the kinematics; and a controller operable to control operation of the drone based on the output from the processor.
[0009] In this way, animal telemetry is facilitated, in particular the tracking of fast-moving and/or flying animals. Providing a controller operable to control operation of the drone based on the processor output, which is in turn based on the kinematics of the animal, ensures that signal loss is mitigated. Drone control is thus improved. Such a construction ensures effective monitoring and data collection.
[0010] The system may be an autonomous system. That is, it may not require any human pilot intervention in order to control drone operation.
[0011] In one example, the signal comprises one or more of: a radio frequency signal from a transmitter attached to the animal; an optical signal from an optical element attached to the animal; a reflected signal from a reflective element attached to the animal; and/or a sound signal from the animal.
[0012] That is, the receiver may be a radio receiver, optical signal receiver, acoustic receiver, or any receiver suitable to receive the signal. The system may comprise a transmitter operable to transmit a signal. The transmitter may comprise a piezoelectric energy harvester and an antenna. A construction incorporating a radio transmitter is advantageous because sensitive detection equipment may be provided to allow the signal to be received at range. Radio frequency transmission is also possible using low power electronics.
[0013] In one example, the processor is operable to process the received signal to determine kinematics that are characteristic of the animal.
[0014] The kinematics of an animal can depend on the task which the animal is performing. Advantageously, determining characteristic kinematics allows the system to learn the characteristic manner in which the animal moves. This information is useful in characterising animal behaviour. This information is also useful in predicting motion of the animal, which facilitates improved control of the drone.
[0015] In one example, the processor output is based on the characteristic kinematics.
[0016] As the controller is operable to control the drone based on the output from the processor, the output being based on the characteristic kinematics provides improved drone control. That is, the drone may follow the signal more effectively as the processor may provide output based on predictive movements of the animal.
[0017] In one example, the processor is configured to use machine learning techniques to determine the characteristic kinematics.
[0018] Advantageously, machine learning techniques, including neural networks, may be trained to learn characteristic movement patterns. The processor can thus make better informed predictions of animal behaviour, including characteristic kinematics, in order to provide output for use in controlling the drone.
[0019] In one example, the processor is operable to process the received signal to determine the angle of arrival and/or power of the signal, thereby to determine the kinematics of the animal.
[0020] Advantageously, determining the angle of arrival and/or power allows the processor to determine the location, speed and/or velocity of the signal, and thus the animal, in a simple yet robust manner. As the receiver is mounted on the drone, the processed signal information is relative to the drone position, ultimately simplifying drone control. The processor may monitor and process the received signal. In this way, kinematics may be updated in real-time.
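By way of a non-limiting sketch only, the following Python snippet illustrates how drone-relative kinematics could be derived from successive angle-of-arrival and received-power fixes. It assumes a simple free-space/log-distance path-loss model and a 2D geometry; the function names, carrier frequency and model parameters are illustrative assumptions and not details taken from this disclosure.

```python
import math

def range_from_rssi(p_rx_dbm, p_tx_dbm=0.0, freq_hz=2.45e9, path_loss_exp=2.0):
    """Estimate range (m) from received power using an assumed log-distance model."""
    c = 3e8
    # Free-space path loss at a 1 m reference distance
    fspl_1m = 20.0 * math.log10(4.0 * math.pi * 1.0 * freq_hz / c)
    loss_db = p_tx_dbm - p_rx_dbm
    return 10.0 ** ((loss_db - fspl_1m) / (10.0 * path_loss_exp))

def kinematics_from_fixes(fixes, dt):
    """Derive drone-relative positions and velocities from (angle_deg, p_rx_dbm) fixes.

    `fixes` is a time-ordered list of (angle of arrival in degrees, received power in dBm);
    `dt` is the interval between fixes in seconds.
    """
    positions = []
    for angle_deg, p_rx in fixes:
        r = range_from_rssi(p_rx)
        theta = math.radians(angle_deg)
        positions.append((r * math.cos(theta), r * math.sin(theta)))
    velocities = [
        ((x2 - x1) / dt, (y2 - y1) / dt)
        for (x1, y1), (x2, y2) in zip(positions, positions[1:])
    ]
    return positions, velocities

# Example: two fixes taken 0.5 s apart
pos, vel = kinematics_from_fixes([(10.0, -62.0), (14.0, -60.5)], dt=0.5)
print(pos, vel)
```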
[0021] In one example, the processor and controller are mounted on the drone.
[0022] A self-contained system is thus provided. Advantageously, data transmission to a remote location for processing is not necessary. This ensures that the drone can rapidly respond to movement of the animal.
[0023] In one example, the receiver unit comprises an electronically steered array. An electronically steered array, for example a phased array antenna, advantageously does not require moving parts to scan.
[0024] In one example, the controller is operable to control the drone's 3D position, speed, acceleration, pitch, yaw and/or roll. Thus, the drone may be controlled to follow the signal effectively.
[0025] In one example, in the event of a loss of the signal from the animal, the processor is operable to provide an output to the controller based on previously determined kinematics of the animal and the controller is operable to control operation based on the output, to attempt to relocate the signal.
[0026] In this way, the drone may autonomously relocate the signal. The previously determined kinematics may be characteristic kinematics. Use of characteristic kinematics allows the system to make informed prediction as to the present or future kinematics, including location and trajectory.
[0027] According to a second aspect of the present invention there is provided a method of controlling operation of a drone, the method comprising the steps of: receiving, at the drone, a signal from an animal; processing the signal from the animal to determine the kinematics of the animal and providing an output based on the kinematics; and controlling operation of the drone based on the output from the processor.
BRIEF DESCRIPTION OF THE DRAWINGS
[0028] For a better understanding of the invention, and to show how embodiments of the same may be carried into effect, reference will now be made, by way of example only, to the accompanying diagrammatic drawings in which:
Fig. 1 shows a schematic of a system according to an embodiment of the present invention;
Fig. 2 shows an animal having an attached transmitter comprised in a system according to an embodiment of the present invention;
Fig. 3 shows operating principles of a system according to an embodiment of the present invention;
Fig. 4 shows operating principles, for determining location of an animal, of a system according to an embodiment of the present invention;
Fig. 5 shows operating principles, for determining characteristic kinematics of an animal, of a system according to an embodiment of the present invention;
Fig. 6 shows a flowchart of operating principles of a system according to an embodiment of the present invention; and
Fig. 7 shows a method of controlling operation of a drone according to an embodiment of the present invention.
DETAILED DESCRIPTION
[0029] Referring to Figure 1, a system 100 for controlling the operation of a drone is shown.
The system 100 comprises a drone 110, a receiver unit 120, a processor 130 and a controller 140.
[0030] In the exemplary embodiment described herein, the receiver unit 120, processor 130 and controller 140 are all mounted on the drone. Nevertheless, the skilled person will appreciate from the present disclosure that only the receiver unit 120 need be mounted on the drone, and interaction between the receiver unit 120, processor 130 and controller 140 may take place remotely.
[0031] For the avoidance of doubt, a drone may be otherwise known as an unmanned aerial vehicle (UAV). A drone is an aircraft without a human pilot on board. The terms drone and unmanned aerial vehicle may be used interchangeably.
[0032] The receiver unit 120 is configured to receive a signal from an animal. In the exemplary embodiment described herein, the animal is an insect, in particular a flying insect, and in particular a bee, wasp or hornet. Nevertheless, the skilled person will appreciate from the present disclosure that the system 100 is capable of receiving a signal from any animal, the operating principles of the system being similar or identical to those described in relation to this exemplary embodiment.
[0033] The processor 130 is operable to process the signal received by the receiver unit 120. The processor 130 processes the signal to determine the kinematics of the animal and to provide an output based on the kinematics.
[0034] As will be described in greater detail herein, determining kinematics of the animal relates to determining its motion. In one exemplary embodiment, the processor 130 is configured to determine position, speed, velocity and/or acceleration of the animal. That is, the processor 130 is configured to determine vectors and trajectories describing the kinematics of the animal.
[0035] The controller 140 is operable to control operation of the drone 110 based on the output from the processor 130. That is, in one exemplary embodiment, the processor 130 provides the output directly to the controller 140. In another exemplary embodiment, intermediate signal processing of the processor output is performed before passing a signal to the controller 140. In one exemplary embodiment, intermediate signal processing comprises wireless transfer of data (including position and environmental metrics) from the processor 130 to a mobile application, which enables more extended processing in relation to previous readings and facilitates improved drone control. The mobile application allows for dynamic control of the drone, thus facilitating a lighter on-board controller. The controller 140 is operable to control the drone 110 in respect of: 3D position, speed, acceleration, pitch, yaw and/or roll.
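As a hedged illustration only, one way such a control step might be realised is a proportional follower that keeps the drone at a stand-off distance from the reported animal position. All names, gains and the stand-off value below are illustrative assumptions, not details of this disclosure.

```python
from dataclasses import dataclass

@dataclass
class ProcessorOutput:
    # Animal position relative to the drone, metres (x: forward, y: right, z: up)
    rel_x: float
    rel_y: float
    rel_z: float

def velocity_command(out: ProcessorOutput, standoff_m=3.0, gain=0.8, v_max=5.0):
    """Proportional velocity command keeping the drone a fixed stand-off from the animal."""
    dist = (out.rel_x**2 + out.rel_y**2 + out.rel_z**2) ** 0.5
    if dist < 1e-6:
        return (0.0, 0.0, 0.0)
    # Move along the line of sight until the stand-off distance is reached
    error = dist - standoff_m
    speed = max(-v_max, min(v_max, gain * error))
    return (speed * out.rel_x / dist,
            speed * out.rel_y / dist,
            speed * out.rel_z / dist)

print(velocity_command(ProcessorOutput(6.0, 2.0, 1.0)))
```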
[0036] Referring to Figure 2, a signal emitting element arranged to emit a signal from an insect is shown. In the exemplary embodiment illustrated, the signal emitting element is a radio frequency transmitter 200 attached to an insect 210. The transmitter 200 comprises a piezoelectric energy-harvesting member 212 which harvests the mechanical vibration of the insect 210 to power the transmitter electronics. The transmitter comprises an antenna 214 which, when powered, radiates a radio frequency signal. In one exemplary embodiment, the transmitter mass is less than 30mg.
[0037] In another exemplary embodiment, the signal emitting element is an optical element (not shown) attached to the insect 210 and configured to emit an optical signal. For the avoidance of doubt, an optical signal includes IR and UV wavelengths, as well as visible wavelengths. In another exemplary embodiment, the signal emitting element is a reflective element (not shown) attached to the insect 210 and configured to reflect or scatter an incident signal toward the receiver 120. The incident signal may be provided by an emitter mounted on the drone 100, or may be any other incident signal, for example sunlight. In another exemplary embodiment, the signal emitting element may be an emitter mounted on the drone 100, and a reflective element attached to the insect, the reflective element configured to reflect or scatter the signal from the emitter back toward the receiver 120. Reflective elements facilitate a reduced transmitter, or tag, weight. In another exemplary embodiment, the signal emitting element may be any part of the insect 210 emitting a sound signal. Acoustic signal detection mitigates the need for a transmitter, or tag, altogether.
[0038] In each exemplary embodiment, the receiver unit 120 is configured to receive the signal. That is, in the exemplary embodiment illustrated, the receiver unit 120 is configured to receive radio signals. As shown in Figure 1, the receiver unit 120 comprises a high sensitivity radio receiver 122 and a phased array antenna 124 (that is, an electronically scanned array).
The phased array antenna 124 is capable of scanning to detect variations in signal strength (or power), otherwise known as the "received signal strength indicator" (RSSI). In this way, the angle of arrival of the signal, that is, the direction from which the signal is received, may be determined by associating the angle of arrival of the strongest signal with the direction of a lobe steered by the phased array antenna.
[0039] Referring to Figure 3, operating principles of the system are shown.
[0040] As shown in Figure 3.1, the phased array antenna 124 performs electronic beam scanning to detect variations in signal strength. Data from the signal strength scan is passed to the processor 130.
[0041] As shown in Figure 3.2, the processor 130 constructs a curve of the received signal strength indicator (RSSI) as a function of antenna scan angle. This can be performed in real-time. The position of the insect can be determined from analysis of the RSSI plot. In this example, the position of the insect may be determined from the plot minimum, indicated at 310. In the present case, the plot minimum indicates the signal direction because the received signal power and the receiver unit 120 output voltage are inversely proportional. That is, the greater the received signal strength, the lower the voltage on the RSSI plot. A calibration curve may subsequently be used to determine the received signal power from the voltage output; however, for the purposes of the present disclosure, this is not necessary.
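A minimal sketch of this step is given below, assuming the receiver reports one voltage sample per steered beam angle; the function name and the synthetic scan data are illustrative assumptions only.

```python
import numpy as np

def angle_of_arrival(scan_angles_deg, rssi_voltages):
    """Return the beam angle whose RSSI voltage is lowest.

    Because receiver output voltage and received power are inversely related,
    the minimum of the voltage curve marks the direction of the strongest signal.
    """
    v = np.asarray(rssi_voltages, dtype=float)
    return float(np.asarray(scan_angles_deg)[np.argmin(v)])

# Example scan: 19 beams from -90 to +90 degrees, strongest return near +20 degrees
angles = np.linspace(-90, 90, 19)
voltages = 1.0 - 0.6 * np.exp(-((angles - 20.0) / 15.0) ** 2)
print(angle_of_arrival(angles, voltages))  # -> 20.0
```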
[0042] Figure 3.3 provides a schematic overview of system operation. At Figure 3.3.1, the signal is received by the receiver unit 120. At Figure 3.3.2, the received signal is filtered and analysed by the processor 130 to construct the RSSI plot. The insect position determined from the RSSI plot is processed by the processor 130 using machine learning algorithms. As will be described in greater detail herein, the algorithms output information to allow the drone to better follow the signal, and thus the insect. The output of the processor 130 is passed to the controller 140 to control operation of the drone 110, for example, in order to update the position, velocity or acceleration of the drone 110.
[0043] As shown in Figure 3.4, the drone is controlled to follow the flight path of the insect, and update its position based on the position of the insect. An autonomous insect tracking system 100 is thus provided. It will be appreciated that the position, speed and/or acceleration of the drone 110 need not necessarily be updated if the signal is within a predetermined distance, speed and/or acceleration range. That is, the drone 110 may be controlled to remain over a point of interest (for example, hover), where this is deemed suitable to detect and monitor the signal by the processor 130.
[0044] Referring to Figure 4, operating principles of the processor 130 for determining insect position are shown.
[0045] At Figure 4.1, the received signal from the insect is filtered to remove erroneous values. This may be performed by threshold filtering to remove outliers.
[0046] At Figure 4.2, the values are mapped in 2D according to the antenna scan pattern. This allows a heat map to be produced. The heat map is interpreted as a rectangular plot reporting the relative strength of a relevant monitored parameter in a region of interest.
[0047] At Figure 4.3, the maximum value of the heat map is chosen as the target point. The maximum value of the heat map corresponds to the element of the map for which the monitored parameter (e.g. received power) is the highest (or the readout voltage is lowest, for the reasons described in relation to RSSI above). The target point is fed to the machine learning algorithms and is used to provide an output which is fed to the controller 140.
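The three stages of Figure 4 (outlier filtering, 2D mapping, target selection) could be sketched as follows. This is an assumed, simplified rendering: the threshold-filtering rule, the grid size and the synthetic readings are illustrative and not taken from this disclosure.

```python
import numpy as np

def target_from_scan(readout_voltages, threshold=3.0):
    """Filter a 2D antenna-scan readout and pick the target cell.

    `readout_voltages` is a 2D array mapped to the antenna scan pattern; readings more
    than `threshold` standard deviations from the median are discarded as erroneous,
    and the cell with the lowest remaining voltage (strongest signal) is returned.
    """
    v = np.asarray(readout_voltages, dtype=float)
    med, std = np.median(v), np.std(v)
    filtered = np.where(np.abs(v - med) > threshold * std, np.nan, v)
    # Lowest voltage corresponds to highest received power (see the RSSI discussion above)
    idx = np.unravel_index(np.nanargmin(filtered), filtered.shape)
    return tuple(int(i) for i in idx)

scan = np.random.default_rng(0).uniform(0.8, 1.0, size=(8, 8))
scan[2, 5] = 0.4          # strong return
scan[7, 0] = -10.0        # erroneous reading to be filtered out
print(target_from_scan(scan))  # -> (2, 5)
```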
[0048] The kinematics of an animal can depend on the task which the animal is performing.
For example, an exploring animal will move erratically, looking to find new resources, e.g. nectar from flowers in the case of bees. On the other hand, an animal tracking back and forth between a known food source and a base (for example a nest or hive) to forage or harvest the resource will exhibit stable motion, often following a direct or previously travelled path, typically without large deviation from said path. That is, the animal kinematics will differ depending on whether the animal is in an "exploration mode" or a "foraging mode". These "characteristic kinematics" are particularly noticeable in the flight patterns of flying insects, in particular bees, wasps and hornets.
[0049] Referring to Figure 5, an overview of operating principles of the processor 130 for determining characteristic kinematics are shown.
[0050] As shown in Figure 5.1, a first stage of the algorithm involves building a predictive path for the drone to follow. As introduced above, machine learning techniques are used in processing to allow the drone to better follow the signal, and thus the insect. By monitoring the received signal, the machine learning techniques allow the system 100 to learn insect behavioural patterns. Using these behavioural patterns, predictions may be made as to future locations of the signal and kinematics of the insect. By incorporating this information into the drone control functionality, the system 100 mitigates loss of the signal, and ensures that the drone 110 remains at an appropriate distance from the insect 210.
[0051] Making predictions of the kinematics of the insect 210 comprises using a plurality of position data points, determined as described above. The position data points are used as input for a neural network, along with additional metadata, as illustrated in the sketch following this list. This metadata includes:
a. Digressiveness - a measurement of the efficiency of an insect travel path. For example, foraging flights are more efficiently structured than exploration flights;
b. Deviation from a normalised bearing - a measurement of in-the-moment changes to insect trajectory. These are more pronounced in exploration flights, as the insect explores an area for new resources;
c. Speed - both current and average; and/or
d. Distance from a base (nest or hive).
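A hedged sketch of how such a feature vector might be assembled from a track of position fixes is given below. The concrete definitions used here (path length over net displacement for digressiveness, mean heading deviation for bearing deviation, and so on) are simple stand-ins chosen for illustration, not this disclosure's exact formulations.

```python
import math

def features_from_track(track, dt, base=(0.0, 0.0)):
    """Build a neural-network feature vector from a time-ordered list of (x, y) positions."""
    segs = list(zip(track, track[1:]))
    step_lengths = [math.dist(a, b) for a, b in segs]
    path_length = sum(step_lengths)
    net_displacement = math.dist(track[0], track[-1])

    # a. Digressiveness: path length relative to straight-line displacement (1.0 = perfectly direct)
    digressiveness = path_length / net_displacement if net_displacement > 0 else float("inf")

    # b. Deviation from a normalised bearing: mean absolute deviation of segment headings
    headings = [math.atan2(by - ay, bx - ax) for (ax, ay), (bx, by) in segs]
    mean_heading = math.atan2(sum(math.sin(h) for h in headings),
                              sum(math.cos(h) for h in headings))
    bearing_dev = sum(abs(math.atan2(math.sin(h - mean_heading),
                                     math.cos(h - mean_heading))) for h in headings) / len(headings)

    # c. Speed: current and average
    current_speed = step_lengths[-1] / dt
    average_speed = path_length / (dt * len(step_lengths))

    # d. Distance from a base (nest or hive)
    distance_from_base = math.dist(track[-1], base)

    return [digressiveness, bearing_dev, current_speed, average_speed, distance_from_base]

track = [(0, 0), (1.0, 0.2), (2.1, 0.1), (3.0, 0.4), (4.2, 0.3)]
print(features_from_track(track, dt=0.5))
```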
[0052] The neural network output can be used to adjust the drone tracking functionality. In one example, the neural network may determine an insect to be in a foraging mode, by recognising typically straighter, or more direct, flight paths and/or the insect making fewer deviations from a flight path. Where this is the case, the drone may be controlled to fly faster and turn more slowly for flights to existing resource locations. In another example, the neural network may determine an insect to be in an exploration mode, by recognising erratic flight paths. Where this is the case, the drone may be controlled to fly more slowly and turn faster, to be more responsive to changes in insect direction. Additionally, when sudden changes in insect position are detected, these may be weighted to minimise the influence of possible false readings on the control of the drone.
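One way to act on the network's mode estimate is to switch between gain profiles, as sketched below. The two-mode classification, the specific gain values and the down-weighting rule for sudden position jumps are illustrative assumptions only.

```python
def tracking_gains(mode, position_jump_m=0.0):
    """Return (max_speed m/s, max_yaw_rate deg/s, measurement_weight) for a predicted mode.

    Foraging flights are direct, so the drone can fly faster and turn more slowly;
    exploration flights are erratic, so it flies slower but turns faster.
    Sudden position jumps are down-weighted as possible false readings.
    """
    if mode == "foraging":
        max_speed, max_yaw_rate = 8.0, 45.0
    elif mode == "exploration":
        max_speed, max_yaw_rate = 3.0, 120.0
    else:
        raise ValueError(f"unknown mode: {mode}")
    # Weight sudden changes less: beyond roughly a 5 m jump the reading is largely ignored
    measurement_weight = 1.0 / (1.0 + (position_jump_m / 5.0) ** 2)
    return max_speed, max_yaw_rate, measurement_weight

print(tracking_gains("foraging", position_jump_m=1.0))
print(tracking_gains("exploration", position_jump_m=12.0))
```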
[0053] In each example described above, use of information provided by the neural network allows the drone to track the insect more effectively by determining characteristic kinematics of the insect.
[0054] As shown in Figure 5.2, a second stage of the algorithm involves looking for time periods in which the insect position remains constant. This may occur where an insect remains in one location whilst feeding. This is an indicator of an insect in a "foraging mode". This information can also be fed into the neural network to characterise the foraging mode.
[0055] As shown in Figure 5.3, a third stage of the algorithm involves predicting a relationship between a target location and insect movement when leaving the hive. In one example, target locations are food sources that are known to the insect. The food sources may be mapped, and the locations of known food sources supplied to the processor 130. It has been found that, in certain cases, an insect will leave the hive in a pre-determined direction when heading to a known food source. The direction in which the insect leaves the hive (for example, a bearing) relative to the known food source can be fed into the neural network to characterise the foraging mode.
[0056] In the event of a loss of the signal from the insect, the processor 130 is operable to provide an output to the controller 140 based on previously determined kinematics of the insect. The controller 140 is operable to control operation of the drone 110 based on the output from the processor 130, to attempt to relocate the signal. For example, in the event of a loss of signal, the processor 130 may produce a predicted insect flight path and provide an output to the controller 140 to control the drone 110 to follow the predicted flight path. The predicted flight path is based on previously determined kinematics, including recently recorded direction, speed and/or acceleration of the insect. In another example, the processor may lose the signal from an insect that is determined to be in a foraging mode. That is, the previously determined kinematics include characteristic kinematics. Where this is the case, the processor 130 may use the previously determined kinematics alongside a prediction of the food source the insect is heading toward to produce a predicted flight path. The processor 130 provides output to the controller 140 to control the drone to follow the predicted flight path, to attempt to relocate the signal en route. The prediction of the food source may be based on a map of surrounding food sources, the last recorded kinematics of the insect and/or the direction in which the insect left the hive.
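A minimal sketch of such a signal-loss behaviour is a dead-reckoned search path blended toward a likely food source. The blending rule, time horizon and coordinates below are illustrative assumptions, not the patent's method.

```python
def predicted_search_path(last_pos, last_vel, horizon_s=10.0, step_s=1.0, target=None):
    """Dead-reckon a search path after signal loss.

    Extrapolates from the last recorded position and velocity; if a likely food-source
    `target` is supplied (e.g. from a map of known sources), the heading is blended
    toward it so the drone sweeps the route the insect is expected to take.
    """
    x, y = last_pos
    vx, vy = last_vel
    path = []
    t = step_s
    while t <= horizon_s:
        px, py = x + vx * t, y + vy * t
        if target is not None:
            # Blend increasingly toward the predicted food source over the horizon
            alpha = t / horizon_s
            px = (1 - alpha) * px + alpha * target[0]
            py = (1 - alpha) * py + alpha * target[1]
        path.append((px, py))
        t += step_s
    return path

waypoints = predicted_search_path((12.0, 4.0), (1.5, 0.2), target=(40.0, 10.0))
print(waypoints[:3])
```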
[0057] One application of the system 100 is found in predicting locations of interest, such as an insect arriving at a patch of flowers. The flower patch can be classified via imagery and AI analysis by using a neural network trained on publicly available image sets of flower species. A map of local features can be built based on visits to these locations and the duration of each visit. This can be achieved using methods similar to those used for characteristic kinematic determination. Given an understanding of the insect bearing, operational mode, and additional data determined by the drone, the neural network will be able to predict which target, from a list of candidates, the insect is flying towards. This map can be used to categorise and evaluate hive diet, hive feeding patterns, and other fields of interest. The map can also be used to reacquire a target insect in the event of a temporary loss of signal, as described above.
[0058] Figure 6 provides an overview of system function according to an exemplary embodiment of the system 100. Each stage will be described below.
6.1: START: Insect tracking begins.
6.2: The processor 130 reads the voltage from the phased array antenna 124 of the receiver 120. This allows the RSSI curve to be constructed.
6.3: The processor 130 interprets the angle of arrival of the signal, and the distance to the signal, from the RSSI curve.
6.4: Stages 6.2 to 6.3 are repeated, if necessary or desired.
6.5: The directional data obtained by the processor is readied for further processing. This may involve processing the data in a smartphone app. That is, the processor 130 may be comprised in a smartphone.
6.6: The directional data is stored in memory.
6.7: Current directional data and directional data stored in memory are both used by the processor 130 as input for the neural network to predict future directional data and insect kinematics.
6.8: Stages 6.2 to 6.7 are repeated, if necessary or desired.
6.9: Current directional data and predicted future directional data are both used to produce an output to be fed to the controller 140.
6.10: Output from the processor 130 is fed to the controller 140.
6.11: The controller 140 controls the drone to move accordingly. A user of the system 100 can be updated graphically via a GUI as to the progress of the drone 110.
6.12: Stages 6.2 to 6.11 are repeated, if necessary or desired.
6.13: Insect tracking terminates.
6.14: A map of insect trajectory may be produced, if necessary or desired.
6.15: Memory data from the processor 130 is shared with an external computer, via wireless data transfer.
6.16: An interactive map is created from the memory data using flight metrics, including position and environmental metrics.
6.17: END
[0059] Referring to Figure 7, a method of controlling operation of a drone is shown. In Step 7.1, a signal from an animal is received at the drone. In Step 7.2, the signal from the animal is processed to determine the kinematics of the animal, and provide an output based on the kinematics. In Step 7.3, operation of the drone is controlled based on the output from the processor.
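Tying these steps together, a skeletal receive-process-control loop might look like the following. Every class and method here is a stand-in for the components described above and is an assumption for illustration, not this disclosure's implementation.

```python
import random

class Receiver:
    def scan(self):
        # Placeholder for a phased-array RSSI sweep (see Figure 3): one voltage per beam
        return [random.uniform(0.4, 1.0) for _ in range(19)]

class Processor:
    def __init__(self):
        self.steps = 0
    def update(self, readings):
        self.steps += 1
        # Step 7.2: the lowest voltage marks the signal direction (beams span -90..+90 degrees)
        best = min(range(len(readings)), key=readings.__getitem__)
        return {"bearing_deg": -90 + best * 10}
    def tracking_finished(self):
        return self.steps >= 5

class Controller:
    def apply(self, output):
        # Step 7.3: steer the drone toward the reported bearing
        print(f"steering toward {output['bearing_deg']} deg")

def track_animal(receiver, processor, controller, max_iterations=1000):
    """Skeleton of the receive -> process -> control loop of Figure 7."""
    for _ in range(max_iterations):
        readings = receiver.scan()            # Step 7.1: receive the signal at the drone
        output = processor.update(readings)   # Step 7.2: determine kinematics / provide output
        controller.apply(output)              # Step 7.3: control the drone based on the output
        if processor.tracking_finished():
            break

track_animal(Receiver(), Processor(), Controller())
```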
[0060] Although a few preferred embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that various changes and modifications might be made without departing from the scope of the invention, as defined in the appended claims.
[0061] The preceding description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the various embodiments described herein can be made without departing from the scope and spirit of the disclosure. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
[0062] The terms and words used in the preceding description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the disclosure is provided for illustration purpose only and not for the purpose of limiting the disclosure as defined by the appended claims and their equivalents.
[0063] It is to be understood that the singular forms "a", "an", and "the" include plural referents unless the context clearly dictates otherwise. The terms "front", "rear", "side", "upper", "lower", "over", "under", "inner", "outer" and like terms are used to refer to the apparatus and its components in the orientation in which it is illustrated, which is the orientation in which it is intended to be used, but should not be taken as otherwise limiting. Like reference numerals are used to denote like features throughout the figures, which are not to scale.
[0064] At least some of the example embodiments described herein may be constructed, partially or wholly, using dedicated special-purpose hardware. Terms such as 'component', 'module' or 'unit' used herein may include, but are not limited to, a hardware device, such as circuitry in the form of discrete or integrated components, a Field Programmable Gate Array (FPGA) or Application Specific Integrated Circuit (ASIC), which performs certain tasks or provides the associated functionality. In some embodiments, the described elements may be configured to reside on a tangible, persistent, addressable storage medium and may be configured to execute on one or more processors. These functional elements may in some embodiments include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. Although the example embodiments have been described with reference to the components, modules and units discussed herein, such functional elements may be combined into fewer elements or separated into additional elements. Various combinations of optional features have been described herein, and it will be appreciated that described features may be combined in any suitable combination. In particular, the features of any one example embodiment may be combined with features of any other embodiment, as appropriate, except where such combinations are mutually exclusive. Throughout this specification, the term "comprising" or "comprises" means including the component(s) specified but not to the exclusion of the presence of others.
[0065] Attention is directed to all papers and documents which are filed concurrently with or previous to this specification in connection with this application and which are open to public inspection with this specification, and the contents of all such papers and documents are incorporated herein by reference.
[0066] All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive.
[0067] Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
[0068] The invention is not restricted to the details of the foregoing embodiment(s). The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed.

Claims (11)

  1. A system for controlling the operation of a drone, the system comprising: a. a drone; b. a receiver unit mounted on the drone, the receiver unit configured to receive a signal from an animal; c. a processor operable to process the received signal to determine the kinematics of the animal and provide an output based on the kinematics; and d. a controller operable to control operation of the drone based on the output from the processor.
  2. The system according to claim 1 wherein the signal comprises one or more of: a radio frequency signal from a transmitter attached to the animal; an optical signal from an optical element attached to the animal; a reflected signal from a reflective element attached to the animal; and/or a sound signal from the animal.
  3. The system according to any previous claim wherein the processor is operable to process the received signal to determine kinematics that are characteristic of the animal.
  4. The system according to claim 3 wherein the processor output is based on the characteristic kinematics.
  5. The system according to either of claims 3 or 4 wherein the processor is configured to use machine learning techniques to determine the characteristic kinematics.
  6. The system according to any previous claim wherein the processor is operable to process the received signal to determine the angle of arrival and/or power of the signal, thereby to determine the kinematics of the animal.
  7. The system according to any previous claim wherein the processor and controller are mounted on the drone.
  8. The system according to any previous claim wherein the receiver unit comprises an electronically steered array.
  9. The system according to any previous claim wherein the controller is operable to control the drone: 3D position, speed, acceleration, pitch, yaw and/or roll.
  10. The system according to any previous claim wherein, in the event of a loss of the signal from the animal, the processor is operable to provide an output to the controller based on previously determined kinematics of the animal and the controller is operable to control operation based on the output, to attempt to relocate the signal.
  11. A method of controlling operation of a drone, the method comprising the steps of: a. receiving, at the drone, a signal from an animal; b. processing the signal from the animal to determine the kinematics of the animal and providing an output based on the kinematics; and c. controlling operation of the drone based on the output from the processor.
GB2007822.6A 2020-05-26 2020-05-26 Improvements in and relating to drone control Pending GB2596512A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2007822.6A GB2596512A (en) 2020-05-26 2020-05-26 Improvements in and relating to drone control


Publications (2)

Publication Number Publication Date
GB202007822D0 GB202007822D0 (en) 2020-07-08
GB2596512A true GB2596512A (en) 2022-01-05

Family

ID=71406219

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2007822.6A Pending GB2596512A (en) 2020-05-26 2020-05-26 Improvements in and relating to drone control

Country Status (1)

Country Link
GB (1) GB2596512A (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015139091A1 (en) * 2014-03-19 2015-09-24 Reactive Electronics System for detecting target animals in a protected area
US20160304198A1 (en) * 2014-12-03 2016-10-20 Google Inc. Systems and methods for reliable relative navigation and autonomous following between unmanned aerial vehicle and a target object
US9979463B1 (en) * 2016-04-16 2018-05-22 Arizona Board Of Regents Acting For And On Behalf Of Northern Arizona University UAV wildlife monitoring system and related methods
WO2018084717A2 (en) * 2016-11-02 2018-05-11 Birdview As Unmanned aerial vehicle
WO2019168047A1 (en) * 2018-02-28 2019-09-06 株式会社ナイルワークス Drone, drone control method, and drone control program
US20200154695A1 (en) * 2018-11-16 2020-05-21 BirdBrAin Inc. Methods and systems for automatically relocating a pest deterrent system

Also Published As

Publication number Publication date
GB202007822D0 (en) 2020-07-08
