GB2548166A - Determining whether an object has entered a certain space - Google Patents

Determining whether an object has entered a certain space

Info

Publication number
GB2548166A
GB2548166A (application GB1606293.7A / GB201606293A)
Authority
GB
United Kingdom
Prior art keywords
space
immobiliser
measurer
measuring
entered
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1606293.7A
Other versions
GB2548166B (en)
GB2548166A8 (en)
Inventor
David Down Christopher
Rockliffe Armstrong Neil
Edward Cross James
James Wilkinson Alexander
Sebastian Wilkinson Roland
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Neil Rockcliffe Armstrong
Openworks Eng Ltd
Original Assignee
Neil Rockcliffe Armstrong
Openworks Eng Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Neil Rockcliffe Armstrong, Openworks Eng Ltd
Publication of GB2548166A
Publication of GB2548166A8
Application granted
Publication of GB2548166B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/0073 Surveillance aids
    • G08G 5/0082 Surveillance aids for monitoring traffic from a ground station
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 WEAPONS
    • F41H ARMOUR; ARMOURED TURRETS; ARMOURED OR ARMED VEHICLES; MEANS OF ATTACK OR DEFENCE, e.g. CAMOUFLAGE, IN GENERAL
    • F41H 13/00 Means of attack or defence not otherwise provided for
    • F41H 13/0006 Ballistically deployed systems for restraining persons or animals, e.g. ballistically deployed nets
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/0017 Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
    • G08G 5/0026 Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located on the ground
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/0047 Navigation or guidance aids for a single aircraft
    • G08G 5/006 Navigation or guidance aids for a single aircraft in accordance with predefined flight zones, e.g. to avoid prohibited zones
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/0047 Navigation or guidance aids for a single aircraft
    • G08G 5/0069 Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88 Radar or analogous systems specially adapted for specific applications
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Aiming, Guidance, Guns With A Light Source, Armor, Camouflage, And Targets (AREA)

Abstract

An apparatus for determining whether a first object 203, which may be a drone or UAV, has entered a space 207, possibly a restricted airspace. The apparatus includes an object position measurer (101, fig. 1) and a processor (111, fig. 1) which determines whether the first object has entered the space based on its position and information defining the space; if the object has encroached on the space, a signal is transmitted to an immobiliser. The object position measurer may include a range finder (123, fig. 1) and a direction sensor (125, fig. 1). The apparatus may be included in a system which also includes the immobiliser, comprising a second object, a projectile for carrying the second object and a launcher for launching the projectile towards the first object. The projectile deploys the second object, which may be a net, in the vicinity of the first object for capturing, immobilising or disabling it.

Description

DETERMINING WHETHER AN OBJECT HAS ENTERED A CERTAIN SPACE
FIELD OF THE INVENTION
The present invention relates to a technique for determining and recording whether an object has entered a certain space. For example, certain exemplary embodiments provide an apparatus and method for determining whether an aerial vehicle has entered a restricted airspace, and for recording evidence of unauthorised entry of the aerial vehicle into the restricted airspace.
BACKGROUND OF THE INVENTION
In the last few years, the commercial availability of cheap, small Unmanned Aerial Vehicles (UAVs), for example drones and quadcopters, has increased greatly, and this has resulted in an increasing number of instances of unauthorised or illegal intrusion of UAVs into restricted airspaces. Occurrences of unauthorised or illegal use of UAVs include use of UAVs near airports, sports stadia, prisons, crowded areas, buildings, structures and other installations. Such use is undesirable on grounds including security, legality and nuisance.
Various regulations have been introduced restricting or preventing use of aerial vehicles in certain spaces. For example, United Kingdom (UK) Civil Aviation Authority (CAA) Article 167 specifies that a person in charge of a small unmanned surveillance aircraft must not fly the aircraft (a) over or within 150 metres of any congested area; (b) over or within 150 metres of an organised open-air assembly of more than 1,000 persons; (c) within 50 metres of any vessel, vehicle or structure which is not under the control of the person in charge of the aircraft; or (d) within 50 metres of any person.
In order to enforce such regulations it is necessary to determine whether or not a UAV has actually entered a restricted space, and if so, to collect evidence of such intrusion so that the person in control of the UAV may be prosecuted or reprimanded. Currently, such a determination may be made by human eye judgement, and evidence may be provided in the form of an eye witness account. However, the human eye is relatively poor at judging distance and an eye witness account may not provide sufficient evidence to secure a prosecution.
Accordingly, what is desired is a technique for determining and recording whether an object (e.g. a UAV) has entered a certain space (e.g. a restricted airspace) with high accuracy, for example to allow law enforcement agencies to effectively prosecute users of UAVs that intrude into restricted airspaces without authorisation.
The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present invention.
SUMMARY OF THE INVENTION
It is an aim of certain embodiments of the present invention to address, solve, mitigate or obviate, at least partly, at least one of the problems and/or disadvantages associated with the related art, for example at least one of the problems and/or disadvantages mentioned herein. Certain embodiments of the present invention aim to provide at least one advantage over the related art, for example at least one of the advantages mentioned herein.
The present invention is defined by the independent claims. A non-exhaustive set of advantageous features that may be used in various exemplary embodiments of the present invention are defined in the dependent claims.
In accordance with an aspect of the present invention, there is provided an apparatus for determining whether a first object has entered a space, the apparatus comprising: an object position measurer for measuring the position of the first object; and a processor for determining whether the first object has entered the space based on the position of the first object and information defining the space.
In certain exemplary embodiments, the object position measurer comprises: a range finder for measuring the distance to an object in a direct line of sight; and a direction sensor for measuring the direction of an object.
In certain exemplary embodiments, the apparatus is configured for transmitting a signal to an immobiliser in response to determining that the first object has entered the space.
In certain exemplary embodiments, the apparatus is configured for transmitting position information indicating the measured position of the first object to an immobiliser.
In accordance with another aspect of the present invention, there is provided a system comprising: a measurer comprising an apparatus according to any of the above aspects; and an immobiliser for deploying a second object for capturing, immobilising or disabling the first object, wherein the immobiliser comprises the second object, a projectile for carrying the second object therein, and a launcher for launching the projectile towards the first object, wherein the projectile is configured for deploying the second object in the vicinity of the first object for capturing, immobilising or disabling the first object, wherein the measurer is configured to transmit a signal to the immobiliser in response to determining that the first object has entered the space, and wherein the immobiliser is configured to launch the projectile in response to receiving the signal.
In accordance with another aspect of the present invention, there is provided a system comprising: an immobiliser for deploying a second object for capturing, immobilising or disabling a first object, wherein the immobiliser comprises the second object, a projectile for carrying the second object therein, and a launcher for launching the projectile towards the first object, wherein the projectile is configured for deploying the second object in the vicinity of the first object for capturing, immobilising or disabling the first object; and a measurer for measuring the position of the first object, and transmitting position information indicating the measured position of the first object to the immobiliser, wherein the immobiliser further comprises an aiming system for aiming a barrel of the launcher based on the position information received from the measurer.
In certain exemplary embodiments, the system comprises two or more measurers for measuring the position of the first object and for transmitting respective position information indicating the measured position of the first object to the immobiliser, and the immobiliser is configured for aiming the barrel of the launcher based on position information received from two or more of the measurers.
In certain exemplary embodiments, the system comprises two or more immobilisers, and the system further comprises a controller for receiving position information from one or more measurers, and for transmitting a signal for activating a selected immobiliser.
In accordance with another aspect of the present invention, there is provided a method for determining whether a first object has entered a space, the method comprising: measuring the position of the first object; and determining whether the first object has entered the space based on the position of the first object and information defining the space.
In certain exemplary embodiments, the method further comprises transmitting a signal to an immobiliser in response to determining that the first object has entered the space.
In certain exemplary embodiments, the method further comprises transmitting position information indicating the measured position of the first object to an immobiliser.
In certain exemplary embodiments, the first object comprises an aerial vehicle (e.g. a UAV) and the space comprises an airspace (e.g. a restricted airspace).
In accordance with another aspect of the present invention, there is provided a computer program comprising instructions arranged, when executed, to implement a method, device, apparatus and/or system in accordance with any aspect, embodiment, example or claim disclosed herein. In accordance with another aspect of the present invention, there is provided a machine-readable storage storing such a program.
Other aspects, advantages, and salient features of the present invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, disclose exemplary embodiments of the present invention.
BRIEF DESCRIPTION OF THE FIGURES
Figure 1 illustrates an apparatus according to an exemplary embodiment of the present invention;
Figures 2a and 2b illustrate exemplary uses of the apparatus of Figure 1 in an exemplary environment; and
Figure 3 illustrates an exemplary system comprising a measurer for determining whether a target object has entered a target space, and an immobiliser for capturing, immobilising or disabling the target object.
DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS
The following description of exemplary embodiments of the present invention, with reference to the accompanying drawings, is provided to assist in a comprehensive understanding of the present invention, as defined by the claims. The description includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope of the present invention, as defined by the claims.
The terms and words used in this specification are not limited to the bibliographical meanings, but, are merely used to enable a clear and consistent understanding of the present invention.
The same or similar components may be designated by the same or similar reference numerals, although they may be illustrated in different drawings.
Detailed descriptions of elements, features, components, structures, constructions, functions, operations, processes, characteristics, properties, integers and steps known in the art may be omitted for clarity and conciseness, and to avoid obscuring the subject matter of the present invention.
Throughout this specification, the words “comprises”, “includes”, “contains” and “has”, and variations of these words, for example “comprise” and “comprising”, mean “including but not limited to”, and are not intended to (and do not) exclude other elements, features, components, structures, constructions, functions, operations, processes, characteristics, properties, integers, steps and/or groups thereof.
Throughout this specification, the singular forms “a”, “an” and “the” include plural referents unless the context dictates otherwise. For example, reference to “an object” includes reference to one or more of such objects.
By the term “substantially” it is meant that the recited characteristic, parameter or value need not be achieved exactly, but that deviations or variations, including for example, tolerances, measurement errors, measurement accuracy limitations and other factors known to those of skill in the art, may occur in amounts that do not preclude the effect the characteristic, parameter or value was intended to provide.
Throughout this specification, language in the general form of “X for Y” (where Y is some action, process, function, activity, operation or step and X is some means for carrying out that action, process, function, activity, operation or step) encompasses means X adapted, configured or arranged specifically, but not exclusively, to do Y.
Elements, features, components, structures, constructions, functions, operations, processes, characteristics, properties, integers, steps and/or groups thereof described herein in conjunction with a particular aspect, embodiment, example or claim are to be understood to be applicable to any other aspect, embodiment, example or claim disclosed herein unless incompatible therewith.
It will be appreciated that embodiments of the present invention can be realized in the form of hardware or a combination of hardware and software. Any such software may be stored in any suitable form of volatile or non-volatile storage device or medium, for example a ROM, RAM, memory chip, integrated circuit, or an optically or magnetically readable medium (e.g. CD, DVD, magnetic disk or magnetic tape). It will also be appreciated that storage devices and media are embodiments of machine-readable storage that are suitable for storing a program or programs comprising instructions that, when executed, implement embodiments of the present invention.
Figure 1 illustrates an apparatus 100 according to an exemplary embodiment of the present invention. Figures 2a and 2b illustrate exemplary uses of the apparatus of Figure 1 in exemplary environments. In this example, the apparatus is described in relation to determining whether an aerial vehicle (e.g. a drone or UAV), referred to below as a target object, has entered a certain airspace, referred to below as a target airspace. However, the skilled person will appreciate that the present invention is not limited to this example. For example, the present invention may be applied in relation to objects other than vehicles and/or may be applied in relation to objects (e.g. vehicles or people) other than aerial objects.
The apparatus 100 may be fixed or portable. In the exemplary embodiment described below, the apparatus 100 is provided in the form of a hand-held device capable of being carried by a person. In certain embodiments, the apparatus 100 may be incorporated into an apparatus for capturing, immobilising or disabling the target object, for example as described in UK Patent Application 1601228.8. However, the skilled person will appreciate that the present invention is not limited to these configurations.
The target airspace may be defined in any suitable way. For example, the target airspace may be defined in terms of a certain predefined or fixed space (e.g. an airspace located above a defined geographical region such as a stadium or an airspace surrounding a building or other fixed structure). For example, a target airspace may be defined in terms of the space occupied by a vertical column (which may be a certain height or of unlimited height) above a defined geographical region on the surface of the earth. Alternatively or additionally, the target space may be defined based on the position(s) of one or more objects, for example in terms of a locus of points that are less than a certain threshold distance away from one or more objects (referred to below as protected objects), which may be fixed (e.g. buildings) or mobile (e.g. people or vehicles). In this case, the distance may be the distance in three dimensions, or alternatively the distance considering only those positional components in a certain plane (e.g. a horizontal plane).
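The two ways of defining a target space described above, a vertical column above a ground region and a locus of points within a threshold distance of protected objects, can be sketched as simple membership tests. This is an illustrative sketch only, not the patented implementation; the function names, the rectangular footprint, and the coordinate convention (z as height) are assumptions.

```python
import math

def in_column(pos, footprint, max_height=None):
    """True if pos = (x, y, z) lies in the vertical column above a
    rectangular ground region ((xmin, ymin), (xmax, ymax)); the column
    may be of a certain height or, when max_height is None, unlimited."""
    (xmin, ymin), (xmax, ymax) = footprint
    x, y, z = pos
    inside_footprint = xmin <= x <= xmax and ymin <= y <= ymax
    below_ceiling = max_height is None or z <= max_height
    return inside_footprint and below_ceiling

def near_protected(pos, protected, threshold, horizontal_only=False):
    """True if pos is within threshold of any protected object; the
    distance is taken in three dimensions, or optionally considering
    only the components in the horizontal plane."""
    x, y, z = pos
    for px, py, pz in protected:
        if horizontal_only:
            d = math.hypot(x - px, y - py)
        else:
            d = math.sqrt((x - px)**2 + (y - py)**2 + (z - pz)**2)
        if d < threshold:
            return True
    return False
```

Either predicate (or a combination of the two) could serve as the "information defining the space" against which a measured position is compared.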
In the embodiment of Figure 1, the apparatus 100 comprises an object position measurer 101, a self-position measurer 103, an input unit 105, an output unit 107, a communication unit 109, a processor 111, and a memory 113. The skilled person will appreciate that, in certain alternative embodiments, one or more of these components may be omitted, or replaced with components performing equivalent functions.
The object position measurer 101 is configured for measuring the position of one or more remote objects, including the target object and possibly one or more protected objects. The object position measurer 101 may be configured to directly measure the absolute position of an object (e.g. specified in terms of a latitude, longitude and altitude), or alternatively to measure the relative position of the object (e.g. the position of the object relative to the position of the apparatus 100). As described further below, a measured relative position of an object may be combined with a measured absolute position of the apparatus 100 to compute an absolute position of the object.
The object position measurer 101 may be configured to measure the position of an object using any suitable technique. For example, as will be described in greater detail below, in the embodiment illustrated in Figure 1, the object position measurer 101 comprises a number of sensors for measuring the distance from the apparatus 100 to the object in a direct line of sight, and the direction of the object relative to the apparatus 100, whereby the distance and direction measurements together provide a measurement of the position of the object relative to the apparatus 100. However, the skilled person will appreciate that the object position measurer 101 is not limited to this specific example.
For example, the position of an object may be measured based on one or more alternative or additional techniques, for example based on audio detection, visual detection, thermal detection, radar detection and/or radio-frequency detection. In certain embodiments, the object position measurer 101 may comprise two or more sensors distributed at different locations to enable the position of an object to be determined based on triangulation, trilateration, multilateration or similar techniques.
The self-position measurer 103 is configured for measuring the position of the apparatus 100. For example, the self-position measurer 103 may take measurements using (i) a global navigation satellite system, such as the Global Positioning System (GPS) or Galileo, and/or (ii) a local positioning system, such as a system in which signals are received from a set of (e.g. three or more) beacons having known positions and a position is computed based on triangulation, trilateration, multilateration or similar techniques. In embodiments where the apparatus 100 has a known (e.g. fixed) position, the self-position measurer 103 may be omitted.
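As one illustration of the beacon-based positioning mentioned above, a position can be recovered from distances to three non-collinear beacons by trilateration. The following is a hypothetical two-dimensional sketch; the function name and the linearisation approach are assumptions, not details taken from this application.

```python
def trilaterate_2d(beacons, distances):
    """Estimate (x, y) from measured distances to three beacons with
    known positions. Subtracting the circle equations
    (x - xi)^2 + (y - yi)^2 = ri^2 pairwise removes the quadratic
    terms, leaving a 2x2 linear system solved here by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = distances
    # 2*(xi - x1)*x + 2*(yi - y1)*y = r1^2 - ri^2 + xi^2 - x1^2 + yi^2 - y1^2
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # zero if the beacons are collinear
    return ((b1 * a22 - a12 * b2) / det, (a11 * b2 - a21 * b1) / det)
```

With noisy measurements or more than three beacons, a least-squares solve would replace the exact 2x2 solution; the same idea extends to three dimensions with four beacons.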
The input unit 105 is configured for receiving user input and controls. For example, the input unit may comprise one or more buttons, switches and/or triggers, a keyboard and/or keypad, and/or a touch screen.
The output unit 107 is configured for outputting information, indications and/or alerts to a user, for example in visual, audible and/or tactile form. For example, the output unit may comprise one or more of a screen, one or more lights, a speaker, and a vibration unit.
The communication unit 109 is configured for receiving information from an external source (e.g. a server or other remote entity). For example, the communication unit 109 may be configured to receive information specifying or defining one or more predefined restricted airspaces from a database. The communication unit 109 may alternatively or additionally be configured for transmitting information to an external source (e.g. a server or other remote entity). For example, the communication unit 109 may be configured to transmit information and data collected during operation of the apparatus 100, for example position measurements and any associated information, such as the time of the measurements and the identity of the user of the apparatus. The skilled person will appreciate that, in alternative embodiments, the communication unit 109 may be omitted if communication between the apparatus 100 and a remote entity is not required.
The processor 111 is configured for controlling the overall operation of the apparatus 100, as described in greater detail below.
The memory is configured for storing information and/or data obtained during operation of the apparatus 100.
The object position measurer 101 will now be described in more detail. The object position measurer 101 illustrated in Figure 1 comprises a sight 121, a range finder 123 and a direction sensor 125.
The sight 121 is configured for allowing the user to visually acquire an object whose position it is desired to measure. For example, the sight 121 may comprise a conventional telescopic gun sight.
The range finder 123 is configured for measuring (e.g. one-time, continuously or periodically) the distance to the object in the direct line of sight when the object has been visually acquired, and for providing (e.g. one-time, continuously or periodically) the measured distance(s) to the processor 111. For example, the range finder 123 may comprise a conventional laser range finder.
The direction sensor 125 is configured for measuring (e.g. one-time, continuously or periodically) the direction of the object when the object has been visually acquired, and for providing (e.g. one-time, continuously or periodically) the measured direction to the processor 111. For example, the direction sensor 125 may comprise one or more (e.g. three) accelerometers, one or more (e.g. three) gyroscopes, and/or a magnetometer. The direction sensor 125 may be configured to measure the zenith (or polar) angle of the object (i.e. the elevation angle between an imaginary horizontal plane and an imaginary line connecting the apparatus 100 and the object). The direction sensor 125 may also be configured to measure the azimuthal angle of the target object with respect to a fixed reference (e.g. the magnetic pole).
The measured distance to the object may be expressed in terms of a radial distance, and the measured direction of the object may be expressed in terms of a zenith angle and an azimuthal angle. Accordingly, the measured distance and measured direction together provide spherical coordinates of the object relative to the apparatus 100. The processor 111 may be configured to convert the spherical coordinates into another coordinate system, for example x, y and z Cartesian coordinates. The skilled person will appreciate that any suitable coordinate system may be used, and that conversion to another coordinate system may not be required in certain embodiments.
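The conversion from the spherical coordinates described above into Cartesian coordinates can be sketched as follows. The axis convention (x east, y north, z up), the use of elevation above the horizontal, azimuth measured clockwise from north, and the function name are illustrative assumptions.

```python
import math

def spherical_to_cartesian(r, elevation_deg, azimuth_deg):
    """Convert a range-finder distance r, an elevation angle above the
    horizontal plane, and an azimuth (clockwise from north) into
    Cartesian coordinates relative to the apparatus:
    x east, y north, z up."""
    el = math.radians(elevation_deg)
    az = math.radians(azimuth_deg)
    horizontal = r * math.cos(el)   # projection onto the ground plane
    return (horizontal * math.sin(az),   # x: east
            horizontal * math.cos(az),   # y: north
            r * math.sin(el))            # z: up
```

For example, an object 100 m away at 30 degrees elevation due east lies about 86.6 m east and 50 m up relative to the apparatus.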
When the user has visually acquired an object through the sight 121, the user may trigger a measurement of the position of the object using the input unit 105. For example, in response to the user actuating a “measurement” button or switch provided in the input unit 105, the processor 111 samples the coordinates obtained from the data received from the object position measurer 101, thereby obtaining coordinates of the object at the time the user triggered the measurement. In certain embodiments, the processor 111 may also obtain the position of the apparatus 100 using the self-position measurer 103 at the time the user triggered the measurement.
In certain embodiments, the user may input information through the input unit 105 to indicate a type of the object (e.g. a “target object” or “protected object”). The processor 111 may control the memory 113 to store relevant information and/or data obtained during operation of the apparatus 100, for example the coordinates of an object, the coordinates of the apparatus 100, and any associated information, such as the time of the measurement, the type of the object and the identity of the user.
In certain embodiments, when the user has visually acquired an object through the sight 121, but before the user has triggered a measurement of the position of the object, the user may initiate an “acquisition phase”, for example by actuating an “acquisition” button provided on the input unit 105. During the acquisition phase, the processor 111 may collect a number of samples of coordinates obtained through the object position measurer 101, process and analyse the samples, and determine whether an object has been validly acquired based on the processing and analysis. When the processor 111 has determined that an object has been validly acquired, the processor 111 may control the output unit 107 to provide the user with an indication (e.g. audible, visual and/or tactile indication) of valid acquisition. Upon receiving the indication, the user may trigger a measurement of the position of the acquired object. Accordingly, more reliable measurements may be obtained. The skilled person will appreciate that the acquisition phase may be omitted in alternative embodiments.
The above operations may be repeated any desired number of times to measure or remeasure the positions of one or more objects.
In the case that the positions of two or more objects are measured, the processor 111 may be configured to calculate the distance between two of the objects. For example, if the position of a first object A is denoted by vector P_A and the position of a second object B is denoted by vector P_B, then the distance between objects A and B is given by |P_B − P_A|. In this example, the distance between objects A and B is defined as the length of vector AB = P_B − P_A. However, in alternative embodiments, the distance between objects A and B may be defined as the length of a vector formed by the projection of vector AB onto a certain plane. For example, the projection of vector AB onto a plane defined by normal N is given by AB_proj = N × (AB × N) / |N|², and the distance between objects A and B may be defined as the length of vector AB_proj, i.e. |AB_proj|. The certain plane may be, for example, the horizontal plane.
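The two distance definitions above, the full three-dimensional distance and the length of AB projected onto a plane of normal N, might be sketched as below. The function names are illustrative, and for simplicity the sketch assumes N is a unit vector (so the division by |N|² is omitted).

```python
import math

def distance(pa, pb):
    """Three-dimensional distance |P_B - P_A| between two objects."""
    return math.dist(pa, pb)

def projected_distance(pa, pb, n=(0.0, 0.0, 1.0)):
    """Length of AB projected onto the plane of unit normal n, using
    AB_proj = n x (AB x n); with the default n this is the distance
    considering only the components in the horizontal plane."""
    ab = tuple(b - a for a, b in zip(pa, pb))

    def cross(u, v):
        return (u[1] * v[2] - u[2] * v[1],
                u[2] * v[0] - u[0] * v[2],
                u[0] * v[1] - u[1] * v[0])

    proj = cross(n, cross(ab, n))
    return math.sqrt(sum(c * c for c in proj))
```

For instance, two objects separated by (3, 4, 12) are 13 units apart in three dimensions but only 5 units apart in the horizontal plane.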
The processor 111 may be configured to control the output unit 107 to display the calculated distance between the objects (e.g. in numerical or graphical form). The processor 111 may be configured to control the output unit 107 to output an indication (e.g. visual, audible or tactile indication) if the distance between objects A and B is less than a certain threshold. For example, the threshold may be predefined or specified by the user via the input unit 105.
The processor 111 may also be configured to calculate the absolute position of an object A based on the relative position of the object A with respect to the apparatus 100 and the absolute position of the apparatus 100. For example, if the relative position of the object A with respect to the apparatus 100 is given by the vector ΔP_A and the absolute position of the apparatus 100 is denoted by vector P_app, then the absolute position of the object A is given by P_app + ΔP_A. The processor 111 may be further configured to compare the position of object A with one or more predetermined spaces S1, S2, ... to determine whether or not object A is located within any of the spaces.
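The composition P_app + ΔP_A and the subsequent check of the resulting position against the predetermined spaces might be sketched as below. Representing each space as a name paired with a membership predicate is an assumption made for illustration, not a format specified by this application.

```python
def absolute_position(p_app, delta_p):
    """P_A = P_app + delta_P_A: absolute position of an object from the
    measured apparatus position and the measured relative position."""
    return tuple(a + d for a, d in zip(p_app, delta_p))

def spaces_containing(pos, spaces):
    """Return the names of the predetermined spaces S1, S2, ... that
    contain pos; each space is modelled as (name, contains_fn), where
    contains_fn maps a position to True or False."""
    return [name for name, contains in spaces if contains(pos)]
```

The membership predicates here would typically be built from the stored information defining each space, e.g. a column footprint and height, or a threshold distance from a protected object.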
Information specifying or defining the spaces may be obtained, for example, from data stored in the memory 113 and/or data retrieved from a remote source (e.g. a database stored in a remote server) through the communication unit 109. Information specifying or defining spaces may be added, deleted and/or modified in the memory 113 and/or the remote source, for example by the user or another party, to define new spaces, or delete or modify existing spaces. Such information may specify or define a space in any suitable format, form or representation.
The processor 111 may be configured to control the output unit 107 to display a graphical representation (e.g. three-dimensional map) illustrating the spatial relation between one or more of the spaces and the position of the object. The processor 111 may be configured to control the output unit 107 to output an indication (e.g. visual, audible or tactile indication) if the object A is located within one or more of the spaces.
As mentioned above, the processor 111 may be configured to control the memory 113 to store various pieces of information or data obtained during operation of the apparatus 100, for example the position of an object, the type of the object, the position of the apparatus 100, the distance between two objects, an identification of a space in which an object is located, the time at which a measurement was made and/or the identity of the user. A number of exemplary scenarios in which the apparatus 100 of Figure 1 may be used will now be described. The skilled person will appreciate that the present invention is not limited to these specific examples.
In a first scenario (Figure 2a), a user 201 of the apparatus 100 (e.g. a police officer) has noticed that an aerial vehicle 203 (e.g. a drone) is flying in the vicinity of a sports stadium 205. The flying of drones within the restricted airspace 207 above the stadium 205 (e.g. a target airspace) is prohibited, and the police officer 201 wishes to know whether the drone 203 has actually entered the restricted airspace 207, and if so, the police officer 201 wishes to collect evidence of the unauthorised use of the drone 203 to facilitate prosecution of the person 209 controlling the drone 203.
The police officer 201 uses the input unit 105 to indicate to the processor 111 that a measurement with respect to a target object 203 will commence (e.g. by pressing a certain button provided in the input unit 105). The police officer 201 then visually acquires the drone 203 through the sight 121 and presses the “acquisition” button. When the apparatus 100 has verified that an object 203 has been validly acquired, the processor 111 controls the output unit 107 to provide a suitable indication to the police officer 201, for example in the form of lighting an LED, sounding a buzzer, or creating a mild vibration. In response, the police officer 201 presses a “measurement” button to trigger a measurement of the position of the drone 203 relative to the apparatus 100.
When a measurement has been triggered, the processor 111 obtains spherical coordinates of the relative position of the drone 203 through the object position measurer 101 and may convert the spherical coordinates into Cartesian coordinates, if required. The processor 111 also obtains the current absolute position of the apparatus 100 through the self-position measurer 103. The processor 111 then calculates the absolute position of the drone, P_drone, based on the relative position of the drone and the absolute position of the apparatus 100.
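The conversion from the spherical coordinates reported by the object position measurer to Cartesian coordinates may, under the usual range/azimuth/elevation convention (an assumption here; the disclosure does not fix a convention), look like:

```python
import math

def spherical_to_cartesian(r, azimuth, elevation):
    """Convert a range-finder/direction-sensor reading (range r, with
    azimuth and elevation in radians) to Cartesian coordinates."""
    x = r * math.cos(elevation) * math.cos(azimuth)
    y = r * math.cos(elevation) * math.sin(azimuth)
    z = r * math.sin(elevation)
    return (x, y, z)
```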
The processor 111 then compares the calculated position P_drone with data stored in the memory defining one or more restricted airspaces, including the airspace 207 corresponding to the stadium 205. If the processor 111 determines that the calculated position of the drone 203 is located within a restricted airspace, for example the airspace 207 corresponding to the stadium 205, the processor 111 controls the output unit 107 to provide a suitable alert to the police officer 201, for example in the form of flashing an LED, sounding an alarm, or creating a high intensity vibration. The processor 111 may also control the output unit 107 to display a graphical interface allowing the police officer 201 to more clearly understand the spatial relationship between the drone 203 and the restricted airspace 207. For example, the interface may comprise a map (e.g. in the form of a three-dimensional map) showing the stadium 205, the airspace 207 corresponding to the stadium 205 (e.g. as a shaded region on the map), and the position of the drone 203 relative to the airspace 207 (e.g. as a dot, cross or icon superimposed over the map at the relevant position). In certain embodiments, the interface may also display the coordinates of the drone 203 in numerical form (e.g. in terms of a latitude, longitude and altitude).
The police officer 201 may choose to record evidence by pressing a “record” button provided in the input unit 105. In response, the processor 111 stores relevant information or data in the memory 113, for example one or more of: the position of the drone 203, information identifying the stadium 205 (e.g. the name or unique ID of the stadium 205), the time of the measurement, and information identifying the police officer 201 (e.g. the police officer’s 201 badge number). The police officer 201 may then choose to transmit the collected evidence to a police database through the communication unit 109 by pressing a “send” button provided on the input unit 105.
In a second scenario (Figure 2b), a user 201 of the apparatus (e.g. a police officer) has noticed that an aerial vehicle 203 (e.g. a drone) is flying in the vicinity of a crowded area 211. Regulations prohibit the flying of drones 203 within 150m from a crowded area 211, and the police officer 201 wishes to know whether or not the drone 203 is in breach of the regulations, and if so, the police officer 201 wishes to collect evidence of the breach.
The police officer 201 applies the procedure described above in relation to the first scenario to trigger a measurement of the position of the drone 203 (which is specified by the police officer 201 as a target object) relative to the apparatus 100, resulting in the processor 111 obtaining coordinates (e.g. Cartesian coordinates) of the position of the drone 203, P_drone.
The police officer 201 then applies a similar procedure to trigger a measurement of the position of the crowd 211 (which is specified by the police officer 201 as a protected object, rather than a target object) relative to the apparatus 100. The position of the crowd 211 may be represented, for example, by the position of a person 213 in the crowd 211. This procedure results in the processor 111 obtaining Cartesian coordinates of the position of the crowd 211, P_crowd. The position of the crowd 211 may define a restricted airspace 215 (e.g. a target airspace).
The processor 111 then calculates the distance between the drone 203 and the crowd 211, denoted by ΔP = |P_crowd − P_drone|, and compares this distance with a threshold T (e.g. 150m in this example). Alternatively, as mentioned above, the distance ΔP may be calculated considering only those positional components in a certain plane (e.g. a horizontal plane). For example, the distance may be calculated ignoring vertical positional components. If the processor 111 determines that the distance is less than the threshold (i.e. if ΔP < T), the processor 111 controls the output unit 107 to provide a suitable alert to the police officer 201, for example in the form of a flashing LED, alarm sound or high intensity vibration. The processor 111 may also control the output unit 107 to display a graphical interface allowing the police officer 201 to more clearly understand the spatial relationship between the drone 203 and the crowd 211 in a similar manner as described above in relation to the first scenario. In certain embodiments, the interface may also display the distance between the drone 203 and the crowd 211, for example in numerical or graphical form.
In a similar manner as described above in relation to the first scenario, the police officer 201 may choose to record evidence by pressing the “record” button, and may choose to transmit the collected evidence to a police database through the communication unit 109 by pressing the “send” button.
In various alternative scenarios, the protected objects may be different from those described in the first and second scenarios above. For example, in an alternative to the second scenario, the protected object may comprise a building rather than a crowd. In various alternative scenarios, there may be two or more target objects, two or more protected objects and/or two or more target spaces. In such cases, multiple measurements, distance calculations and/or comparisons of positions with spaces may be performed to build up a complete picture of the scenario, to determine whether or not any of the target objects have entered any of the target spaces.
In certain embodiments, in cases where the target object is moving, the apparatus 100 may be further configured to track the target object as it moves. In this case, the user may visually acquire the target object in the sight 121 and track the target object as it moves. During this tracking, the processor 111 may obtain a sequence of coordinates of the target object at discrete time points, which define a parameterized trajectory of the target object.
The processor 111 may be configured to control the output unit 107 to alert the user when the trajectory of the target object enters a target airspace. The processor 111 may be configured to control the memory 113 to record the trajectory of the target object. The processor may be configured to control the output unit 107 to display the trajectory of the target object on a user interface (e.g. a three-dimensional map) to illustrate the spatial relationship between the trajectory of the target object and the target space and/or one or more other objects.
In certain embodiments, the processor 111 may input the tracked trajectory of the target object into a suitable motion model to predict the future trajectory of the target object based on the previous trajectory. In this case, the processor 111 may be configured to control the output unit 107 to alert the user when the trajectory of the target object is predicted to enter a target airspace, for example with a certain degree of certainty. The processor 111 may be configured to control the output unit 107 to indicate to the user the estimated amount of time before the target object is predicted to enter the target airspace.
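One simple motion model of the kind mentioned is constant-velocity linear extrapolation from the most recent samples. The sketch below is a hypothetical illustration (the function name, the `inside` predicate and the fixed lookahead are assumptions); it predicts the first sampled future time at which the extrapolated trajectory enters the space:

```python
def predict_entry_time(positions, times, inside, lookahead=100):
    """Constant-velocity extrapolation from the last two samples.
    Returns the first sampled future time at which the predicted
    position satisfies `inside`, or None within the lookahead."""
    p0, p1 = positions[-2:]
    t0, t1 = times[-2:]
    dt = t1 - t0
    v = [(b - a) / dt for a, b in zip(p0, p1)]  # estimated velocity
    for k in range(1, lookahead + 1):
        p = [c + vi * k * dt for c, vi in zip(p1, v)]
        if inside(p):
            return t1 + k * dt
    return None
```

The difference between the predicted entry time and the current time gives the estimated amount of time before the target object is predicted to enter the airspace, as described above; a more elaborate motion model (e.g. a Kalman filter) could be substituted without changing the interface.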
The processor 111 may be further configured to obtain various statistics related to the trajectory of the target object. For example, if the trajectory enters a target airspace, the statistics may comprise the total amount of time the target object spends inside the target airspace and/or how deep inside the target airspace the target object enters. If the trajectory of the target object does not enter the target airspace, the statistics may comprise the distance of closest approach to the target airspace and/or the average distance of the target object from the target airspace. These statistics may be stored in the memory 113 and/or sent to a remote location through the communication unit 109.
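The trajectory statistics described (time spent inside a space, and distance of closest approach when outside it) can be accumulated from the sampled positions. The sketch below assumes, for illustration, that the caller supplies an `inside` predicate and a `distance_to_space` function for the space representation in use:

```python
def trajectory_statistics(positions, times, inside, distance_to_space):
    """Approximate time spent inside the space and distance of
    closest approach, from a sampled trajectory."""
    time_inside = 0.0
    closest = float("inf")
    for i, p in enumerate(positions):
        if inside(p):
            # attribute the interval to the sample that starts it
            if i + 1 < len(times):
                time_inside += times[i + 1] - times[i]
        else:
            closest = min(closest, distance_to_space(p))
    return time_inside, closest
```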
In certain embodiments, a system (referred to below as an immobiliser) may be provided for deploying an object (e.g. a net) for capturing, immobilising or disabling a target object (e.g. an aerial drone). For example, the immobiliser may comprise the object, a projectile and a launcher. The projectile is configured for transporting the object to the vicinity of the target object. The launcher is configured for launching the projectile. The projectile is further configured for deploying the object in the vicinity of the target object. Following deployment, the object is configured for capturing, immobilising or disabling the target object.
Figure 3 illustrates an exemplary system 300 comprising an apparatus 301 (referred to below as a measurer) for determining whether a target object 303 has entered a target space 305, and an immobiliser 307 for capturing, immobilising or disabling the target object 303. The measurer 301 may comprise, for example, an apparatus 100 as described above. The immobiliser 307 may comprise, for example, a system as described in UK Patent Application 1601228.8.
In the system 300 illustrated in Figure 3, the measurer 301 and the immobiliser 307 are located at remote positions relative to each other. However, in other embodiments, the measurer 301 and the immobiliser 307 may be located in the vicinity of each other. In the system 300 illustrated in Figure 3, the immobiliser 307 and the measurer 301 are provided as separate units. However, in other embodiments, the measurer 301 and the immobiliser 307 may be combined into a single unit. The measurer 301 and/or the immobiliser 307 may be fixed or portable.
The immobiliser 307 may be configured to deploy an object (e.g. a net) for capturing, immobilising or disabling the target object 303 (e.g. an aerial drone) when the measurer 301 has determined that the target object 303 is located inside the target airspace 305. For example, when the measurer 301 has determined that the target object 303 is located inside the target airspace 305, for example in a manner described above, the measurer 301 transmits a signal to the immobiliser 307 (either directly, or indirectly via one or more intermediate nodes, for example including a central controller). In response to receiving the signal, the immobiliser 307 may be configured to automatically launch a projectile to deploy an object to capture, immobilise or disable the target object 303. Alternatively, in response to receiving the signal, the immobiliser 307 may be configured to indicate to a user of the immobiliser 307 that the target object 303 has entered the target airspace 305, prompting the user to operate the immobiliser 307 to capture, immobilise or disable the target object 303.
The immobiliser 307 may comprise an aiming mechanism for achieving or assisting correct aim of the launcher (e.g. aim of a barrel of the launcher) when launching the projectile. For example, in the case that the immobiliser 307 is manually operated, the aiming mechanism may comprise a sight for allowing a user to visually acquire the target object 303, a range finder for measuring the distance to the acquired target object 303 in the direct line of sight, and a direction sensor for measuring the direction of the acquired target object 303. The immobiliser 307 may also comprise a processor for computing a direction in which the launcher should be orientated to achieve successful capture of the target object 303 based on the measured distance and direction, and an actuator for orientating the launcher to the computed direction.
For example, the processor may compute a direction such that when the projectile is launched in that direction with a known muzzle velocity, the resulting trajectory of the projectile includes an optimum net deployment position. An optimum net deployment position is a position in the vicinity of the target object 303 such that if the net were to be deployed from the projectile in that position the net would intercept the target object 303. When computing the direction, the processor may take into account various forces that may act on the projectile, for example gravity and aerodynamic (e.g. drag or lift) forces.
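Considering gravity only and ignoring aerodynamic forces (a simplification of the computation described above), the elevation angle that makes a drag-free projectile with muzzle speed v pass through a point at horizontal distance d and height h follows from the standard ballistic quadratic. This illustrative sketch returns the flatter (faster-arriving) of the two solutions:

```python
import math

def launch_elevation(d, h, v, g=9.81):
    """Elevation angle (radians) so a drag-free projectile with
    muzzle speed v passes through a point at horizontal distance d
    and height h, or None if the point is out of range."""
    disc = v**4 - g * (g * d**2 + 2 * h * v**2)
    if disc < 0:
        return None  # unreachable at this muzzle speed
    # flatter of the two solutions of the ballistic quadratic
    return math.atan2(v**2 - math.sqrt(disc), g * d)
```

A fuller implementation of the kind described would also account for drag and lift, for which no closed form exists and a numerical trajectory integration would typically be used.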
In the case that the immobiliser 307 is automatically operated, the immobiliser 307 may comprise any suitable means for automatically acquiring and tracking the target object 303, for example using a camera and image processing. In this case, the user-operated sight may be omitted in certain embodiments. In the case that the immobiliser 307 is automatically operated, the immobiliser 307 may comprise any suitable means for automatically determining the line of sight distance and the direction of the target object 303 relative to the immobiliser 307.
In embodiments in which the immobiliser 307 is automatic or semi-automatic, it may be more difficult for the immobiliser 307 to correctly aim the launcher compared to embodiments in which the immobiliser 307 is manually operated. For example, it may be relatively difficult for an automated immobiliser 307 to accurately acquire and track a target object 303 without assistance from a user.
In certain embodiments, the measurer 301 may be used to assist the immobiliser 307 in correctly aiming the launcher. For example, the measurer 301 may be used to measure the location of the target object 303 in a manner described above. The measurer 301 may then transmit information indicating the measured location of the target object 303 to the immobiliser 307 (either directly, or indirectly via one or more intermediate nodes, for example including a central controller). In certain embodiments, if the measurer 301 is used to assist the immobiliser 307 in correctly aiming the launcher, the measurer 301 may not be required to determine (or have the capability of determining) whether the target object 303 has entered the target airspace 305. The immobiliser 307 determines its own location, for example using a position measurer as described above (e.g. using a global navigation satellite system or a local positioning system) provided in the immobiliser 307. Alternatively, if the immobiliser 307 is fixed, the position of the immobiliser 307 may be known in advance. The immobiliser 307 then computes the position of the target object 303 relative to the immobiliser 307 using the position of the immobiliser 307 determined by the immobiliser 307 and the position of the target object 303 indicated by the information received from the measurer 301.
The immobiliser 307 may then compute a direction in which the launcher should be orientated to enable successful capture of the target object 303 based on the computed relative position. For example, the immobiliser 307 may determine the line of sight distance and direction of the target object 303 using the computed relative position, and then apply these values in computing a direction in which the launcher should be orientated to enable successful capture of the target object.
In certain embodiments, two or more measurers 301 may be used to simultaneously measure the position of a single target object 303. In this case, each measurer 301 may transmit respective position information to the immobiliser 307. The immobiliser 307 may then compute a direction in which the launcher should be orientated to enable successful capture of the target object 303 based on a combination of the information received from two or more of the measurers 301.
In certain embodiments, the immobiliser 307 may be configured to compute a direction in which the launcher should be orientated based on a combination of its own measurements (e.g. the line of sight distance and direction of the target object 303 measured by the immobiliser 307) and measurements based on information received from one or more measurers 301.
In the preceding examples, the measurements provided by the measurers 301 and/or the measurements made by the immobiliser 307 may be assigned weights according to one or more criteria. For example, the criteria may be based on the reliability of the respective measurements where measurements having a higher reliability are assigned greater weights.
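One common way to combine weighted position estimates of this kind is a weighted average, with weights reflecting each measurement's reliability. A minimal illustrative sketch (the input format is an assumption):

```python
def fuse_positions(measurements):
    """Weighted average of 3-D position estimates.
    `measurements` is a list of (position, weight) pairs, where
    weights might e.g. be inversely proportional to each
    measurement's variance."""
    total_w = sum(w for _, w in measurements)
    return tuple(sum(p[i] * w for p, w in measurements) / total_w
                 for i in range(3))
```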
In one exemplary scenario, one or more users (e.g. police officers) each carry a respective measurer (e.g. handheld measurer). When a police officer notices or suspects that a target object (e.g. an aerial vehicle) has entered a target airspace (e.g. a restricted airspace), the police officer uses their own measurer to measure the position of the target object and transmit position information to an immobiliser (e.g. a fixed-position and automated immobiliser). In certain embodiments, the measurer may also be configured to confirm to the police officer and/or the immobiliser whether or not the target object is actually located in the target airspace. Other police officers who also notice the same aerial vehicle may perform the same procedure.
Accordingly, the immobiliser receives position information from one or more of the measurers, allowing the immobiliser to compute a direction in which a launcher of the immobiliser should be orientated to enable successful capture, immobilisation or disabling of the target object. The immobiliser is configured to operate (e.g. automatically or in response to control of an operator) so as to capture, immobilise or disable the target object. The immobiliser may be configured to operate if it is determined that the target object has actually entered the target airspace.
In the embodiments described above, one immobiliser is provided. However, in other embodiments, two or more immobilisers may be provided at different locations. In this case, a central controller may be provided that receives the position information from the one or more measurers. The controller then aggregates the various pieces of position information and selects one or more immobilisers to use to capture, immobilise or disable the target object. For example, the controller may select the immobiliser that provides the best chance of successful capture (e.g. based on one or more factors such as the distance of each immobiliser from the target object, or the trajectory of the target object relative to each immobiliser). The controller transmits a signal to the selected immobiliser to control the immobiliser to launch a projectile to capture, immobilise or disable the target object.
In certain embodiments, the controller may rank the immobilisers in order of likelihood of successful capture of the target object (e.g. based on one or more factors as described above). For example, the highest ranked immobiliser may be selected by default by the controller, and in the event the highest ranked immobiliser is not available (e.g. if it is not loaded with a projectile or is otherwise non-operational), the next highest ranked immobiliser may be selected by the controller. Alternatively, the controller may display a list of the immobilisers in order of rank to an operator, who may select one of the immobilisers from the list. In certain embodiments, if a selected immobiliser is unsuccessful in capturing the target object, another immobiliser (e.g. the next highest ranked immobiliser) may be selected to try.
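The ranking-with-fallback selection described above can be sketched with a simple distance-based heuristic (nearer assumed more likely to succeed). The dictionary keys used here are illustrative assumptions, not part of the disclosure:

```python
import math

def select_immobiliser(immobilisers, target):
    """Rank candidate immobilisers by distance to the target position
    and return the id of the nearest available one, falling back to
    the next ranked immobiliser if the nearest is non-operational."""
    ranked = sorted(immobilisers,
                    key=lambda im: math.dist(im["position"], target))
    for im in ranked:
        if im["available"]:
            return im["id"]
    return None  # no operational immobiliser
```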

Claims (45)

1. An apparatus for determining whether a first object has entered a space, the apparatus comprising: an object position measurer for measuring the position of the first object; and a processor for determining whether the first object has entered the space based on the position of the first object and information defining the space.
2. An apparatus according to claim 1, wherein the space is a predefined space, and the information defining the space comprises pre-stored information, and wherein the processor is configured for determining whether the first object has entered the space by determining whether the position of the first object is located inside the predefined space.
3. An apparatus according to claim 1 or 2, wherein the object position measurer is configured for measuring the position of the first object relative to the apparatus, and wherein the processor is configured for calculating the position of the first object based on the position of the apparatus and the position of the first object relative to the apparatus.
4. An apparatus according to claim 3, wherein the apparatus further comprises a selfposition measurer for measuring the position of the apparatus.
5. An apparatus according to any preceding claim, wherein the object position measurer is further configured for measuring the position of a second object, and wherein the information defining the space is obtained based on the position of the second object.
6. An apparatus according to claim 5, wherein the processor is configured for determining whether the first object has entered the space based on determining whether the position of the first object is less than a threshold distance from the position of the second object.
7. An apparatus according to any preceding claim, wherein the object position measurer comprises: a range finder for measuring the distance to an object in a direct line of sight; and a direction sensor for measuring the direction of an object.
8. An apparatus according to any preceding claim, wherein the processor is configured for storing the position of the first object.
9. An apparatus according to any preceding claim, wherein the processor is configured for storing one or more of: the information defining the space; a time of each measurement; a type of each object; and an identity of a user.
10. An apparatus according to any preceding claim, wherein the apparatus comprises an output unit for outputting an indication to the user when the first object is determined to have entered the space.
11. An apparatus according to any preceding claim, wherein the apparatus comprises an output unit for providing a visual indication of the spatial relationship between the first object and the space.
12. An apparatus according to any preceding claim, wherein the apparatus is configured for tracking the trajectory of the first object.
13. An apparatus according to any preceding claim, wherein the processor is configured for determining that the first object has been validly acquired before the object position measurer measures the position of the first object.
14. An apparatus according to any preceding claim, wherein the apparatus further comprises an apparatus for capturing, immobilising or disabling the first object.
15. An apparatus according to any preceding claim, wherein the first object comprises an aerial vehicle, and the space comprises an airspace.
16. An apparatus according to any preceding claim, wherein the apparatus is configured for transmitting a signal to an immobiliser in response to determining that the first object has entered the space.
17. An apparatus according to any preceding claim, wherein the apparatus is configured for transmitting position information indicating the measured position of the first object to an immobiliser.
18. A method for determining whether a first object has entered a space, the method comprising: measuring the position of the first object; and determining whether the first object has entered the space based on the position of the first object and information defining the space.
19. A method according to claim 18, wherein the space is a predefined space, and the information defining the space comprises pre-stored information, and wherein determining whether the first object has entered the space comprises determining whether the position of the first object is located inside the predefined space.
20. A method according to claim 18 or 19, wherein measuring the position of the first object comprises: measuring the position of the first object relative to the apparatus, and calculating the position of the first object based on the position of the apparatus and the position of the first object relative to the apparatus.
21. A method according to claim 20, further comprising measuring the position of the apparatus.
22. A method according to any of claims 18 to 21, wherein the method further comprises: measuring the position of a second object, and obtaining the information defining the space based on the position of the second object.
23. A method according to claim 22, wherein determining whether the first object has entered the space comprises determining whether the position of the first object is less than a threshold distance from the position of the second object.
24. A method according to any of claims 18 to 23, wherein measuring the position of the first or second object comprises: measuring the distance to the first or second object in a direct line of sight; and measuring the direction of the first or second object.
25. A method according to any of claims 18 to 24, further comprising storing the position of the first object.
26. A method according to any of claims 18 to 25, further comprising storing one or more of: the information defining the space; a time of each measurement; a type of each object; and an identity of a user.
27. A method according to any of claims 18 to 26, further comprising outputting an indication to the user when the first object is determined to have entered the space.
28. A method according to any of claims 18 to 27, further comprising providing a visual indication of the spatial relationship between the first object and the space.
29. A method according to any of claims 18 to 28, further comprising tracking the trajectory of the first object.
30. A method according to any of claims 18 to 29, further comprising determining that the first object has been validly acquired before measuring the position of the first object.
31. A method according to any of claims 18 to 30, further comprising capturing, immobilising or disabling the first object.
32. A method according to any of claims 18 to 31, wherein the first object comprises an aerial vehicle, and the space comprises an airspace.
33. A method according to any of claims 18 to 32, further comprising transmitting a signal to an immobiliser in response to determining that the first object has entered the space.
34. A method according to any of claims 18 to 33, further comprising transmitting position information indicating the measured position of the first object to an immobiliser.
35. A system comprising: a measurer comprising an apparatus according to any of claims 1 to 17; and an immobiliser for deploying a second object for capturing, immobilising or disabling the first object, wherein the immobiliser comprises the second object, a projectile for carrying the second object therein, and a launcher for launching the projectile towards the first object, wherein the projectile is configured for deploying the second object in the vicinity of the first object for capturing, immobilising or disabling the first object, wherein the measurer is configured to transmit a signal to the immobiliser in response to determining that the first object has entered the space, and wherein the immobiliser is configured to launch the projectile in response to receiving the signal.
36. A system comprising: an immobiliser for deploying a second object for capturing, immobilising or disabling a first object, wherein the immobiliser comprises the second object, a projectile for carrying the second object therein, and a launcher for launching the projectile towards the first object, wherein the projectile is configured for deploying the second object in the vicinity of the first object for capturing, immobilising or disabling the first object; and a measurer for measuring the position of the first object, and transmitting position information indicating the measured position of the first object to the immobiliser, wherein the immobiliser further comprises an aiming system for aiming a barrel of the launcher based on the position information received from the measurer.
37. A system according to claim 36, wherein the system comprises two or more measurers for measuring the position of the first object and for transmitting respective position information indicating the measured position of the first object to the immobiliser, and wherein the immobiliser is configured for aiming the barrel of the launcher based on position information received from two or more of the measurers.
38. A system according to claim 37, wherein the position information received from respective measurers is assigned weights according to one or more criteria.
39. A system according to claim 38, wherein the criteria comprise a reliability of the measured position.
40. A system according to any of claims 35 to 39, wherein the system comprises two or more immobilisers, and wherein the system further comprises a controller for receiving position information from one or more measurers, and for transmitting a signal for activating a selected immobiliser.
41. A system according to claim 40, wherein the controller is configured to select an immobiliser based on the received position information.
42. A system according to claim 40 or 41, wherein the controller is configured to receive a selection of an immobiliser from an operator.
43. A system according to claim 40, 41 or 42, wherein the controller is configured to rank the immobilisers in order of the likelihood of successfully capturing, immobilising or disabling the target object.
44. A system according to any of claims 35 to 43, wherein the measurer comprises an apparatus according to any of claims 1 to 15.
45. A system according to any of claims 35 to 44, wherein the measurer and the immobiliser are provided as separate units.
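The weighting of position reports by reliability (claims 37 to 39) and the controller's ranking of immobilisers (claims 40 to 43) could be sketched as follows. This is an illustrative sketch only, not an implementation disclosed in the patent: all names are hypothetical, the reliability weights are assumed to be scalar confidences, and distance to the fused target position is used as a simple stand-in for "likelihood of successful capture".

```python
# Hypothetical sketch of claims 37-43: fuse position reports from several
# measurers using reliability weights, then rank immobilisers by proximity
# to the fused position (a stand-in for likelihood of successful capture).
# None of these names or heuristics come from the patent itself.

import math
from dataclasses import dataclass


@dataclass
class PositionReport:
    """One measurer's estimate of the first object's position."""
    x: float
    y: float
    z: float
    reliability: float  # assumed weighting criterion, e.g. confidence in [0, 1]


def fuse_reports(reports):
    """Weighted average of the measured positions (claims 38-39):
    each report contributes in proportion to its reliability."""
    total = sum(r.reliability for r in reports)
    if total == 0:
        raise ValueError("no report carries any weight")
    return tuple(
        sum(getattr(r, axis) * r.reliability for r in reports) / total
        for axis in ("x", "y", "z")
    )


def rank_immobilisers(target, immobilisers):
    """Order (name, position) pairs by distance to the fused target
    position, nearest first (an assumed proxy for claim 43's ranking)."""
    return sorted(immobilisers, key=lambda item: math.dist(item[1], target))
```

For example, fusing two reports with reliabilities 0.9 and 0.3 yields an estimate biased toward the more reliable measurer, and the controller could then activate the top-ranked immobiliser. Note that `math.dist` requires Python 3.8 or later.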
GB1606293.7A 2016-03-07 2016-04-12 Determining whether an object has entered a certain space Active GB2548166B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1603913.3A GB2548107B (en) 2016-03-07 2016-03-07 Determining whether an object has entered a certain space

Publications (3)

Publication Number Publication Date
GB2548166A true GB2548166A (en) 2017-09-13
GB2548166A8 GB2548166A8 (en) 2017-11-22
GB2548166B GB2548166B (en) 2022-04-13

Family

ID=55859117

Family Applications (2)

Application Number Title Priority Date Filing Date
GB1603913.3A Active GB2548107B (en) 2016-03-07 2016-03-07 Determining whether an object has entered a certain space
GB1606293.7A Active GB2548166B (en) 2016-03-07 2016-04-12 Determining whether an object has entered a certain space


Country Status (1)

Country Link
GB (2) GB2548107B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018069911A1 (en) * 2016-10-11 2018-04-19 Dr. Frucht Systems Ltd Method and system for detecting and positioning an intruder using a laser detection and ranging device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5583311A (en) * 1994-03-18 1996-12-10 Daimler-Benz Aerospace Ag Intercept device for flying objects
US7190304B1 (en) * 2003-12-12 2007-03-13 Bae Systems Information And Electronic Systems Integration Inc. System for interception and defeat of rocket propelled grenades and method of use
US7202809B1 (en) * 2004-05-10 2007-04-10 Bae Systems Land & Armaments L.P. Fast acting active protection system
US20100181424A1 (en) * 2009-01-19 2010-07-22 Honeywell International Inc. Catch and snare system for an unmanned aerial vehicle
US8205537B1 (en) * 2008-08-11 2012-06-26 Raytheon Company Interceptor projectile with net and tether
GB2487664A (en) * 2011-01-28 2012-08-01 Boeing Co Expanding countermeasure and launcher system

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003027934A1 (en) * 2001-09-26 2003-04-03 Hodge Philip T Method and apparatus for controlling the use of airspace and assessment of use fees and penalties
WO2009139937A2 (en) * 2008-02-15 2009-11-19 Kutta Technologies, Inc. Unmanned aerial system position reporting system and related methods
US8157169B2 (en) * 2009-11-02 2012-04-17 Raytheon Company Projectile targeting system
TW201235808A (en) * 2011-02-23 2012-09-01 Hon Hai Prec Ind Co Ltd System and method for controlling UAV to flight in predefined area
CN102496312B (en) * 2011-12-22 2014-10-15 北京东进航空科技股份有限公司 Warning method and device for invasion of aerial target in restricted airspace
CN104950907B (en) * 2015-06-26 2018-02-02 巴州极飞农业航空科技有限公司 The monitoring method of unmanned plane, apparatus and system
CN205336543U (en) * 2015-12-18 2016-06-22 苏州贝多环保技术有限公司 Unmanned aerial vehicle


Also Published As

Publication number Publication date
GB2548107B (en) 2022-04-13
GB2548107A (en) 2017-09-13
GB2548166B (en) 2022-04-13
GB201603913D0 (en) 2016-04-20
GB2548166A8 (en) 2017-11-22
