GB2548107A - Determining whether an object has entered a certain space

Determining whether an object has entered a certain space

Info

Publication number
GB2548107A
Authority
GB
United Kingdom
Legal status
Granted
Application number
GB1603913.3A
Other versions
GB201603913D0 (en)
GB2548107B (en)
Inventor
Christopher David Down
Neil Rockcliffe Armstrong
James Edward Cross
Alexander James Wilkinson
Roland Sebastian Wilkinson
Current Assignee
Openworks Eng Ltd
Original Assignee
Openworks Eng Ltd
Priority date
Filing date
Publication date
Application filed by Openworks Eng Ltd
Priority to GB1603913.3A (published as GB2548107B)
Priority to GB1606293.7A (published as GB2548166B)
Publication of GB201603913D0
Publication of GB2548107A
Application granted
Publication of GB2548107B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
      • G01 - MEASURING; TESTING
        • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
          • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
            • G01S13/88 - Radar or analogous systems specially adapted for specific applications
          • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
            • G01S17/88 - Lidar systems specially adapted for specific applications
      • G06 - COMPUTING; CALCULATING OR COUNTING
        • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
          • G06V20/00 - Scenes; Scene-specific elements
            • G06V20/50 - Context or environment of the image
              • G06V20/52 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects
      • G08 - SIGNALLING
        • G08G - TRAFFIC CONTROL SYSTEMS
          • G08G5/00 - Traffic control systems for aircraft, e.g. air-traffic control [ATC]
            • G08G5/0017 - Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
              • G08G5/0026 - Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located on the ground
            • G08G5/0047 - Navigation or guidance aids for a single aircraft
              • G08G5/006 - Navigation or guidance aids for a single aircraft in accordance with predefined flight zones, e.g. to avoid prohibited zones
              • G08G5/0069 - Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
            • G08G5/0073 - Surveillance aids
              • G08G5/0082 - Surveillance aids for monitoring traffic from a ground station
    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
      • F41 - WEAPONS
        • F41H - ARMOUR; ARMOURED TURRETS; ARMOURED OR ARMED VEHICLES; MEANS OF ATTACK OR DEFENCE, e.g. CAMOUFLAGE, IN GENERAL
          • F41H13/00 - Means of attack or defence not otherwise provided for
            • F41H13/0006 - Ballistically deployed systems for restraining persons or animals, e.g. ballistically deployed nets

Abstract

An apparatus 100 for determining whether a first object 203 has entered a space 207 comprises an object position measurer for measuring the position of the first object, and a processor (111, fig. 1) for determining whether the first object has entered the space based on the position of the first object and information defining the space, which may be a restricted airspace. The position measurer may comprise a range finder (123, fig. 1) for measuring the distance to an object in a direct line of sight, and a direction sensor (125, fig. 1) for measuring the direction of an object. Information regarding the type of space (e.g. a stadium 205), the time of measurement, the type of object (such as a drone or UAV) and the identity of the user may be stored. The apparatus may provide a visual indication of the spatial relationship between the object and the space, and may track the trajectory of the object. The apparatus may be used by law enforcement officers 201 to monitor restricted airspace.

Description

DETERMINING WHETHER AN OBJECT HAS ENTERED A CERTAIN SPACE
FIELD OF THE INVENTION
The present invention relates to a technique for determining and recording whether an object has entered a certain space. For example, certain exemplary embodiments provide an apparatus and method for determining whether an aerial vehicle has entered a restricted airspace, and for recording evidence of unauthorised entry of the aerial vehicle into the restricted airspace.
BACKGROUND OF THE INVENTION
In the last few years, the commercial availability of cheap, small Unmanned Aerial Vehicles (UAVs), for example drones and quadcopters, has increased greatly, and this has resulted in an increasing number of instances of unauthorised or illegal intrusion of UAVs into restricted airspaces. Occurrences of unauthorised or illegal use of UAVs include use of UAVs near airports, sports stadia, prisons, crowded areas, buildings, structures and other installations. Such use is undesirable on grounds including security, legality and nuisance.
Various regulations have been introduced restricting or preventing use of aerial vehicles in certain spaces. For example, United Kingdom (UK) Civil Aviation Authority (CAA) Article 167 specifies that a person in charge of a small unmanned surveillance aircraft must not fly the aircraft (a) over or within 150 metres of any congested area; (b) over or within 150 metres of an organised open-air assembly of more than 1,000 persons; (c) within 50 metres of any vessel, vehicle or structure which is not under the control of the person in charge of the aircraft; or (d) within 50 metres of any person.
In order to enforce such regulations it is necessary to determine whether or not a UAV has actually entered a restricted space, and if so, to collect evidence of such intrusion so that the person in control of the UAV may be prosecuted or reprimanded. Currently, such a determination may be made by human eye judgement, and evidence may be provided in the form of an eye witness account. However, the human eye is relatively poor at judging distance and an eye witness account may not provide sufficient evidence to secure a prosecution.
Accordingly, what is desired is a technique for determining and recording whether an object (e.g. a UAV) has entered a certain space (e.g. a restricted airspace) with high accuracy, for example to allow law enforcement agencies to effectively prosecute users of UAVs that intrude into restricted airspaces without authorisation.
The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present invention.
SUMMARY OF THE INVENTION
It is an aim of certain embodiments of the present invention to address, solve, mitigate or obviate, at least partly, at least one of the problems and/or disadvantages associated with the related art, for example at least one of the problems and/or disadvantages mentioned herein. Certain embodiments of the present invention aim to provide at least one advantage over the related art, for example at least one of the advantages mentioned herein.
The present invention is defined by the independent claims. A non-exhaustive set of advantageous features that may be used in various exemplary embodiments of the present invention are defined in the dependent claims.
In accordance with an aspect of the present invention, there is provided an apparatus for determining whether a first object has entered a space, the apparatus comprising: an object position measurer for measuring the position of the first object; and a processor for determining whether the first object has entered the space based on the position of the first object and information defining the space.
In certain exemplary embodiments, the object position measurer comprises: a range finder for measuring the distance to an object in a direct line of sight; and a direction sensor for measuring the direction of an object.
In accordance with another aspect of the present invention, there is provided a method for determining whether a first object has entered a space, the method comprising: measuring the position of the first object; and determining whether the first object has entered the space based on the position of the first object and information defining the space.
In certain exemplary embodiments, the first object comprises an aerial vehicle (e.g. a UAV) and the space comprises an airspace (e.g. a restricted airspace).
In accordance with another aspect of the present invention, there is provided a computer program comprising instructions arranged, when executed, to implement a method, device, apparatus and/or system in accordance with any aspect, embodiment, example or claim disclosed herein. In accordance with another aspect of the present invention, there is provided a machine-readable storage storing such a program.
Other aspects, advantages, and salient features of the present invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the present invention.
BRIEF DESCRIPTION OF THE FIGURES
Figure 1 illustrates an apparatus according to an exemplary embodiment of the present invention; and
Figures 2a and 2b illustrate exemplary uses of the apparatus of Figure 1 in an exemplary environment.
DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS
The following description of exemplary embodiments of the present invention, with reference to the accompanying drawings, is provided to assist in a comprehensive understanding of the present invention, as defined by the claims. The description includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope of the present invention, as defined by the claims.
The terms and words used in this specification are not limited to the bibliographical meanings, but, are merely used to enable a clear and consistent understanding of the present invention.
The same or similar components may be designated by the same or similar reference numerals, although they may be illustrated in different drawings.
Detailed descriptions of elements, features, components, structures, constructions, functions, operations, processes, characteristics, properties, integers and steps known in the art may be omitted for clarity and conciseness, and to avoid obscuring the subject matter of the present invention.
Throughout this specification, the words “comprises”, “includes”, “contains” and “has”, and variations of these words, for example “comprise” and “comprising”, mean “including but not limited to”, and are not intended to (and do not) exclude other elements, features, components, structures, constructions, functions, operations, processes, characteristics, properties, integers, steps and/or groups thereof.
Throughout this specification, the singular forms “a”, “an” and “the” include plural referents unless the context dictates otherwise. For example, reference to “an object” includes reference to one or more of such objects.
By the term “substantially” it is meant that the recited characteristic, parameter or value need not be achieved exactly, but that deviations or variations, including for example, tolerances, measurement errors, measurement accuracy limitations and other factors known to those of skill in the art, may occur in amounts that do not preclude the effect the characteristic, parameter or value was intended to provide.
Throughout this specification, language in the general form of “X for Y” (where Y is some action, process, function, activity, operation or step and X is some means for carrying out that action, process, function, activity, operation or step) encompasses means X adapted, configured or arranged specifically, but not exclusively, to do Y.
Elements, features, components, structures, constructions, functions, operations, processes, characteristics, properties, integers, steps and/or groups thereof described herein in conjunction with a particular aspect, embodiment, example or claim are to be understood to be applicable to any other aspect, embodiment, example or claim disclosed herein unless incompatible therewith.
It will be appreciated that embodiments of the present invention can be realized in the form of hardware or a combination of hardware and software. Any such software may be stored in any suitable form of volatile or non-volatile storage device or medium, for example a ROM, RAM, memory chip, integrated circuit, or an optically or magnetically readable medium (e.g. CD, DVD, magnetic disk or magnetic tape). It will also be appreciated that storage devices and media are embodiments of machine-readable storage that are suitable for storing a program or programs comprising instructions that, when executed, implement embodiments of the present invention.
Figure 1 illustrates an apparatus 100 according to an exemplary embodiment of the present invention. Figures 2a and 2b illustrate exemplary uses of the apparatus of Figure 1 in exemplary environments. In this example, the apparatus is described in relation to determining whether an aerial vehicle (e.g. a drone or UAV), referred to below as a target object, has entered a certain airspace, referred to below as a target airspace. However, the skilled person will appreciate that the present invention is not limited to this example. For example, the present invention may be applied in relation to objects other than vehicles and/or may be applied in relation to objects (e.g. vehicles or people) other than aerial objects.
The apparatus 100 may be fixed or portable. In the exemplary embodiment described below, the apparatus 100 is provided in the form of a hand-held device capable of being carried by a person. In certain embodiments, the apparatus 100 may be incorporated into an apparatus for capturing, immobilising or disabling the target object, for example as described in UK Patent Application 1601228.8. However, the skilled person will appreciate that the present invention is not limited to these configurations.
The target airspace may be defined in any suitable way. For example, the target airspace may be defined in terms of a certain predefined or fixed space (e.g. an airspace located above a defined geographical region such as a stadium or an airspace surrounding a building or other fixed structure). For example, a target airspace may be defined in terms of the space occupied by a vertical column (which may be a certain height or of unlimited height) above a defined geographical region on the surface of the earth. Alternatively or additionally, the target space may be defined based on the position(s) of one or more objects, for example in terms of a locus of points that are less than a certain threshold distance away from one or more objects (referred to below as protected objects), which may be fixed (e.g. buildings) or mobile (e.g. people or vehicles). In this case, the distance may be the distance in three dimensions, or alternatively the distance considering only those positional components in a certain plane (e.g. a horizontal plane).
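By way of illustration only, the two forms of space definition described above (a vertical column above a geographical region, and a locus of points within a threshold distance of a protected object) might be represented as follows. This is a minimal sketch under assumed conventions (a local Cartesian frame with coordinates in metres); the class and field names are invented for illustration and are not part of the disclosure.

```python
# Purely illustrative sketch: two ways of defining a target space, as
# described above. All names are invented; positions are metres in a local
# Cartesian frame (x, y horizontal; z altitude above ground).
from dataclasses import dataclass
import math

@dataclass
class ColumnSpace:
    """Vertical column above a circular geographical region (e.g. a stadium)."""
    centre_x: float
    centre_y: float
    radius: float                  # horizontal extent of the region
    max_height: float = math.inf   # column may be of limited or unlimited height

    def contains(self, p):
        x, y, z = p
        in_region = math.hypot(x - self.centre_x, y - self.centre_y) <= self.radius
        return in_region and 0.0 <= z <= self.max_height

@dataclass
class ProximitySpace:
    """Locus of points within a threshold distance of a protected object."""
    protected: tuple               # (x, y, z) of the protected object
    threshold: float               # e.g. 150.0 for a 150 m exclusion zone
    horizontal_only: bool = False  # optionally ignore the vertical component

    def contains(self, p):
        dx = p[0] - self.protected[0]
        dy = p[1] - self.protected[1]
        dz = 0.0 if self.horizontal_only else p[2] - self.protected[2]
        return math.sqrt(dx * dx + dy * dy + dz * dz) < self.threshold
```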
In the embodiment of Figure 1, the apparatus 100 comprises an object position measurer 101, a self-position measurer 103, an input unit 105, an output unit 107, a communication unit 109, a processor 111, and a memory 113. The skilled person will appreciate that, in certain alternative embodiments, one or more of these components may be omitted, or replaced with components performing equivalent functions.
The object position measurer 101 is configured for measuring the position of one or more remote objects, including the target object and possibly one or more protected objects. The object position measurer 101 may be configured to directly measure the absolute position of an object (e.g. specified in terms of a latitude, longitude and altitude), or alternatively to measure the relative position of the object (e.g. the position of the object relative to the position of the apparatus 100). As described further below, a measured relative position of an object may be combined with a measured absolute position of the apparatus 100 to compute an absolute position of the object.
The object position measurer 101 may be configured to measure the position of an object using any suitable technique. For example, as will be described in greater detail below, in the embodiment illustrated in Figure 1, the object position measurer 101 comprises a number of sensors for measuring the distance from the apparatus 100 to the object in a direct line of sight, and the direction of the object relative to the apparatus 100, whereby the distance and direction measurements together provide a measurement of the position of the object relative to the apparatus 100. However, the skilled person will appreciate that the object position measurer 101 is not limited to this specific example.
For example, the position of an object may be measured based on one or more alternative or additional techniques, for example based on audio detection, visual detection, thermal detection, radar detection and/or radio-frequency detection. In certain embodiments, the object position measurer 101 may comprise two or more sensors distributed at different locations to enable the position of an object to be determined based on triangulation, trilateration, multilateration or similar techniques.
The self-position measurer 103 is configured for measuring the position of the apparatus 100. For example, the self-position measurer 103 may take measurements using (i) a global navigation satellite system, such as the Global Positioning System (GPS) or Galileo, and/or (ii) a local positioning system, such as a system in which signals are received from a set of (e.g. three or more) beacons having known positions and a position is computed based on triangulation, trilateration, multilateration or a similar technique. In embodiments where the apparatus 100 has a known (e.g. fixed) position, the self-position measurer 103 may be omitted.
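As a purely illustrative sketch of the local-positioning alternative, a position can be recovered from ranges to beacons at known positions by linearising the range equations and solving the resulting system in the least-squares sense. The function and example values below are assumptions for illustration, not part of the disclosure.

```python
# Illustrative trilateration: recover a 3D position from measured ranges to
# beacons at known positions, by subtracting the first range equation from
# the others to obtain a linear system, then solving it by least squares.
import numpy as np

def trilaterate(beacons, ranges):
    """beacons: (n, 3) known positions; ranges: (n,) measured distances.
    A unique 3D fix needs n >= 4 non-coplanar beacons."""
    b = np.asarray(beacons, dtype=float)
    d = np.asarray(ranges, dtype=float)
    # For i >= 1:  2 (b_i - b_0) . x = |b_i|^2 - |b_0|^2 - (d_i^2 - d_0^2)
    A = 2.0 * (b[1:] - b[0])
    rhs = np.sum(b[1:] ** 2, axis=1) - np.sum(b[0] ** 2) - (d[1:] ** 2 - d[0] ** 2)
    x, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return x

# Example: four beacons with exact ranges to the point (10, 20, 5).
beacons = [(0, 0, 0), (100, 0, 0), (0, 100, 0), (0, 0, 50)]
p = np.array([10.0, 20.0, 5.0])
ranges = [np.linalg.norm(p - np.array(bc)) for bc in beacons]
print(trilaterate(beacons, ranges))  # ~ [10. 20.  5.]
```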
The input unit 105 is configured for receiving user input and controls. For example, the input unit may comprise one or more buttons, switches and/or triggers, a keyboard and/or keypad, and/or a touch screen.
The output unit 107 is configured for outputting information, indications and/or alerts to a user, for example in visual, audible and/or tactile form. For example, the output unit may comprise one or more of a screen, one or more lights, a speaker, and a vibration unit.
The communication unit 109 is configured for receiving information from an external source (e.g. a server or other remote entity). For example, the communication unit 109 may be configured to receive information specifying or defining one or more predefined restricted airspaces from a database. The communication unit 109 may alternatively or additionally be configured for transmitting information to an external source (e.g. a server or other remote entity). For example, the communication unit 109 may be configured to transmit information and data collected during operation of the apparatus 100, for example position measurements and any associated information, such as the time of the measurements and the identity of the user of the apparatus. The skilled person will appreciate that, in alternative embodiments, the communication unit 109 may be omitted if communication between the apparatus 100 and a remote entity is not required.
The processor 111 is configured for controlling the overall operation of the apparatus 100, as described in greater detail below.
The memory 113 is configured for storing information and/or data obtained during operation of the apparatus 100.
The object position measurer 101 will now be described in more detail. The object position measurer 101 illustrated in Figure 1 comprises a sight 121, a range finder 123 and a direction sensor 125.
The sight 121 is configured for allowing the user to visually acquire an object whose position it is desired to measure. For example, the sight 121 may comprise a conventional telescopic gun sight.
The range finder 123 is configured for measuring (e.g. one-time, continuously or periodically) the distance to the object in the direct line of sight when the object has been visually acquired, and for providing (e.g. one-time, continuously or periodically) the measured distance(s) to the processor 111. For example, the range finder 123 may comprise a conventional laser range finder.
The direction sensor 125 is configured for measuring (e.g. one-time, continuously or periodically) the direction of the object when the object has been visually acquired, and for providing (e.g. one-time, continuously or periodically) the measured direction to the processor 111. For example, the direction sensor 125 may comprise one or more (e.g. three) accelerometers, one or more (e.g. three) gyroscopes, and/or a magnetometer. The direction sensor 125 may be configured to measure the zenith (or polar) angle of the object (equivalently, its complement: the elevation angle between an imaginary horizontal plane and an imaginary line connecting the apparatus 100 and the object). The direction sensor 125 may also be configured to measure the azimuthal angle of the target object with respect to a fixed reference (e.g. the magnetic pole).
The measured distance to the object may be expressed in terms of a radial distance, and the measured direction of the object may be expressed in terms of a zenith angle and an azimuthal angle. Accordingly, the measured distance and measured direction together provide spherical coordinates of the object relative to the apparatus 100. The processor 111 may be configured to convert the spherical coordinates into another coordinate system, for example x, y and z Cartesian coordinates. The skilled person will appreciate that any suitable coordinate system may be used, and that conversion to another coordinate system may not be required in certain embodiments.
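For illustration, with a measured range, an elevation angle above the horizontal and an azimuthal angle measured from north, the conversion to local Cartesian (east, north, up) coordinates may be sketched as below; the function name and frame conventions are assumptions for illustration, and other conventions work equally well.

```python
# Illustrative conversion from the measured range and direction to local
# Cartesian coordinates: x east, y north, z up, azimuth clockwise from north.
import math

def to_cartesian(distance, elevation, azimuth):
    horiz = distance * math.cos(elevation)  # projection onto the horizontal plane
    x = horiz * math.sin(azimuth)           # east component
    y = horiz * math.cos(azimuth)           # north component
    z = distance * math.sin(elevation)      # height above the apparatus
    return (x, y, z)

# Example: a target 200 m away, 30 degrees above the horizon, due north.
print(to_cartesian(200.0, math.radians(30.0), 0.0))  # ~ (0.0, 173.2, 100.0)
```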
When the user has visually acquired an object through the sight 121, the user may trigger a measurement of the position of the object using the input unit 105. For example, in response to the user actuating a “measurement” button or switch provided in the input unit 105, the processor 111 samples the coordinates obtained from the data received from the object position measurer 101, thereby obtaining coordinates of the object at the time the user triggered the measurement. In certain embodiments, the processor 111 may also obtain the position of the apparatus 100 using the self-position measurer 103 at the time the user triggered the measurement.
In certain embodiments, the user may input information through the input unit 105 to indicate a type of the object (e.g. a “target object” or “protected object”). The processor 111 may control the memory 113 to store relevant information and/or data obtained during operation of the apparatus 100, for example the coordinates of an object, the coordinates of the apparatus 100, and any associated information, such as the time of the measurement, the type of the object and the identity of the user.
In certain embodiments, when the user has visually acquired an object through the sight 121, but before the user has triggered a measurement of the position of the object, the user may initiate an “acquisition phase”, for example by actuating an “acquisition” button provided on the input unit 105. During the acquisition phase, the processor 111 may collect a number of samples of coordinates obtained through the object position measurer 101, process and analyse the samples, and determine whether an object has been validly acquired based on the processing and analysis. When the processor 111 has determined that an object has been validly acquired, the processor 111 may control the output unit 107 to provide the user with an indication (e.g. audible, visual and/or tactile indication) of valid acquisition. Upon receiving the indication, the user may trigger a measurement of the position of the acquired object. Accordingly, more reliable measurements may be obtained. The skilled person will appreciate that the acquisition phase may be omitted in alternative embodiments.
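One illustrative way to decide that an object has been validly acquired is to require that a short burst of samples is tightly clustered, indicating stable ranging on a single object. This heuristic and its parameter values are invented for illustration; the patent does not prescribe a particular test.

```python
# Illustrative acquisition-phase check (invented heuristic): accept the
# acquisition only if the burst of sampled positions is tightly clustered.
import numpy as np

def validly_acquired(samples, max_spread=1.0, min_samples=5):
    """samples: (n, 3) Cartesian coordinates collected during the acquisition
    phase; max_spread: largest acceptable RMS deviation from the mean, in metres."""
    s = np.asarray(samples, dtype=float)
    if len(s) < min_samples:
        return False
    rms = np.sqrt(np.mean(np.sum((s - s.mean(axis=0)) ** 2, axis=1)))
    return rms <= max_spread
```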
The above operations may be repeated any desired number of times to measure or remeasure the positions of one or more objects.
In the case that the positions of two or more objects are measured, the processor 111 may be configured to calculate the distance between two of the objects. For example, if the position of a first object A is denoted by vector P_A and the position of a second object B is denoted by vector P_B, then the distance between objects A and B is given by |P_B - P_A|. The processor 111 may be configured to control the output unit 107 to display the calculated distance between the objects (e.g. in numerical or graphical form). The processor 111 may be configured to control the output unit 107 to output an indication (e.g. a visual, audible or tactile indication) if the distance between objects A and B is less than a certain threshold. For example, the threshold may be predefined or specified by the user via the input unit 105.
The processor 111 may also be configured to calculate the absolute position of an object A based on the relative position of the object A with respect to the apparatus 100 and the absolute position of the apparatus 100. For example, if the relative position of the object A with respect to the apparatus 100 is given by the vector ΔP_A and the absolute position of the apparatus 100 is denoted by vector P_app, then the absolute position of the object A is given by P_app + ΔP_A. The processor 111 may be further configured to compare the position of object A with one or more predetermined spaces S1, S2, ... to determine whether or not object A is located within any of the spaces.
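Combining the two calculations, a minimal sketch (reusing the illustrative space classes above; all names invented) might read:

```python
# Illustrative: combine the apparatus's measured absolute position with the
# object's measured relative position, then test the result against each
# stored space (e.g. ColumnSpace/ProximitySpace instances from the sketch above).
import numpy as np

def absolute_position(p_app, delta_p):
    """P_A = P_app + deltaP_A, both expressed in the same Cartesian frame."""
    return np.asarray(p_app, dtype=float) + np.asarray(delta_p, dtype=float)

def spaces_containing(p, spaces):
    """Return every stored space in which position p is located."""
    return [s for s in spaces if s.contains(tuple(p))]
```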
Information specifying or defining the spaces may be obtained, for example, from data stored in the memory 113 and/or data retrieved from a remote source (e.g. a database stored in a remote server) through the communication unit 109. Information specifying or defining spaces may be added, deleted and/or modified in the memory 113 and/or the remote source, for example by the user or another party, to define new spaces, or delete or modify existing spaces. Such information may specify or define a space in any suitable format, form or representation.
The processor 111 may be configured to control the output unit 107 to display a graphical representation (e.g. three-dimensional map) illustrating the spatial relation between one or more of the spaces and the position of the object. The processor 111 may be configured to control the output unit 107 to output an indication (e.g. visual, audible or tactile indication) if the object A is located within one or more of the spaces.
As mentioned above, the processor 111 may be configured to control the memory 113 to store various pieces of information or data obtained during operation of the apparatus 100, for example the position of an object, the type of the object, the position of the apparatus 100, the distance between two objects, an identification of a space in which an object is located, the time at which a measurement was made and/or the identity of the user.
A number of exemplary scenarios in which the apparatus 100 of Figure 1 may be used will now be described. The skilled person will appreciate that the present invention is not limited to these specific examples.
In a first scenario (Figure 2a), a user 201 of the apparatus 100 (e.g. a police officer) has noticed that an aerial vehicle 203 (e.g. a drone) is flying in the vicinity of a sports stadium 205. The flying of drones within the restricted airspace 207 above the stadium 205 (e.g. a target airspace) is prohibited, and the police officer 201 wishes to know whether the drone 203 has actually entered the restricted airspace 207, and if so, the police officer 201 wishes to collect evidence of the unauthorised drone 203 use to facilitate prosecution of the person 209 controlling the drone 203.
The police officer 201 uses the input unit 105 to indicate to the processor 111 that a measurement with respect to a target object 203 will commence (e.g. by pressing a certain button provided in the input unit 105). The police officer 201 then visually acquires the drone 203 through the sight 121 and presses the “acquisition” button. When the apparatus 100 has verified that an object 203 has been validly acquired, the processor 111 controls the output unit 107 to provide a suitable indication to the police officer 201, for example in the form of lighting an LED, sounding a buzzer, or creating a mild vibration. In response, the police officer 201 presses a “measurement” button to trigger a measurement of the position of the drone 203 relative to the apparatus 100.
When a measurement has been triggered, the processor 111 obtains spherical coordinates of the relative position of the drone 203 through the object position measurer 101 and may convert the spherical coordinates into Cartesian coordinates, if required. The processor 111 also obtains the current absolute position of the apparatus 100 through the self-position measurer 103. The processor 111 then calculates the absolute position of the drone Pdrone based on the relative position of the drone and the absolute position of the apparatus 100.
The processor 111 then compares the calculated position Pdrone with data stored in the memory defining one or more restricted airspaces, including the airspace 207 corresponding to the stadium 205. If the processor 111 determines that the calculated position of the drone 203 is located within a restricted airspace, for example the airspace 207 corresponding to the stadium 205, the processor 111 controls the output unit 107 to provide a suitable alert to the police officer 201, for example in the form of flashing an LED, sounding an alarm, or creating a high intensity vibration. The processor 111 may also control the output unit 107 to display a graphical interface allowing the police officer 201 to more clearly understand the spatial relationship between the drone 203 and the restricted airspace 207. For example, the interface may comprise a map (e.g. in the form of a three-dimensional map) showing the stadium 205, the airspace 207 corresponding to the stadium 205 (e.g. as a shaded region on the map), and the position of the drone 203 relative to the airspace 207 (e.g. as a dot, cross or icon superimposed over the map at the relevant position). In certain embodiments, the interface may also display the coordinates of the drone 203 in numerical form (e.g. in terms of a latitude, longitude and altitude).
The police officer 201 may choose to record evidence by pressing a “record” button provided in the input unit 105. In response, the processor 111 stores relevant information or data in the memory 113, for example one or more of: the position of the drone 203, information identifying the stadium 205 (e.g. the name or unique ID of the stadium 205), the time of the measurement, and information identifying the police officer 201 (e.g. the badge number of the police officer 201). The police officer 201 may then choose to transmit the collected evidence to a police database through the communication unit 109 by pressing a “send” button provided on the input unit 105.
In a second scenario (Figure 2b), a user 201 of the apparatus (e.g. a police officer) has noticed that an aerial vehicle 203 (e.g. a drone) is flying in the vicinity of a crowded area 211. Regulations prohibit the flying of drones 203 within 150m from a crowded area 211, and the police officer 201 wishes to know whether or not the drone 203 is in breach of the regulations, and if so, the police officer 201 wishes to collect evidence of the breach.
The police officer 201 applies the procedure described above in relation to the first scenario to trigger a measurement of the position of the drone 203 (which is specified by the police officer 201 as a target object) relative to the apparatus 100, resulting in the processor 111 obtaining coordinates (e.g. Cartesian coordinates) of the position of the drone 203, Pdrone.
The police officer 201 then applies a similar procedure to trigger a measurement of the position of the crowd 211 (which is specified by the police officer 201 as a protected object, rather than a target object) relative to the apparatus 100. The position of the crowd 211 may be represented, for example, by the position of a person 213 in the crowd 211. This procedure results in the processor 111 obtaining Cartesian coordinates of the position of the crowd 211, Pcrowd. The position of the crowd 211 may define a restricted airspace 215 (e.g. a target airspace).
The processor 111 then calculates the distance between the drone 203 and the crowd 211, denoted by ΔP = |Pcrowd - Pdrone|, and compares this distance with a threshold T (e.g. 150m in this example). As mentioned above, the distance ΔP may alternatively be calculated considering only those positional components in a certain plane (e.g. a horizontal plane). For example, the distance may be calculated ignoring vertical positional components. If the processor 111 determines that the distance is less than the threshold (i.e. if ΔP < T), the processor 111 controls the output unit 107 to provide a suitable alert to the police officer 201, for example in the form of a flashing LED, alarm sound or high intensity vibration. The processor 111 may also control the output unit 107 to display a graphical interface allowing the police officer 201 to more clearly understand the spatial relationship between the drone 203 and the crowd 211 in a similar manner as described above in relation to the first scenario. In certain embodiments, the interface may also display the distance between the drone 203 and the crowd 211, for example in numerical or graphical form.
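The comparison in this scenario reduces to a few lines. The sketch below, with names and example values invented for illustration, includes the optional horizontal-plane variant mentioned above.

```python
# Illustrative check for the second scenario: compare |P_crowd - P_drone|
# against the threshold T (150 m here), optionally ignoring the vertical
# positional component as described above.
import numpy as np

def breaches_threshold(p_drone, p_crowd, threshold=150.0, horizontal_only=False):
    diff = np.asarray(p_crowd, dtype=float) - np.asarray(p_drone, dtype=float)
    if horizontal_only:
        diff[2] = 0.0  # drop the vertical positional component
    return np.linalg.norm(diff) < threshold

print(breaches_threshold((10, 0, 200), (100, 0, 0)))                        # False: ~219 m
print(breaches_threshold((10, 0, 200), (100, 0, 0), horizontal_only=True))  # True: 90 m
```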
In a similar manner as described above in relation to the first scenario, the police officer 201 may choose to record evidence by pressing the “record” button, and may choose to transmit the collected evidence to a police database through the communication unit 109 by pressing the “send” button.
In various alternative scenarios, the protected objects may be different from those described in the first and second scenarios above. For example, in an alternative to the second scenario, the protected object may comprise a building rather than a crowd. In various alternative scenarios, there may be two or more target objects, two or more protected objects and/or two or more target spaces. In such cases, multiple measurements, distance calculations and/or comparisons of positions with spaces may be performed to build up a complete picture of the scenario, to determine whether or not any of the target objects have entered any of the target spaces.
In certain embodiments, in cases where the target object is moving, the apparatus 100 may be further configured to track the target object as it moves. In this case, the user may visually acquire the target object in the sight 121 and track the target object as it moves. During this tracking, the processor 111 may obtain a sequence of coordinates of the target object at discrete time points, which define a parameterized trajectory of the target object.
The processor 111 may be configured to control the output unit 107 to alert the user when the trajectory of the target object enters a target airspace. The processor 111 may be configured to control the memory 113 to record the trajectory of the target object. The processor may be configured to control the output unit 107 to display the trajectory of the target object on a user interface (e.g. a three-dimensional map) to illustrate the spatial relationship between the trajectory of the target object and the target space and/or one or more other objects.
In certain embodiments, the processor 111 may input the tracked trajectory of the target object into a suitable motion model to predict the future trajectory of the target object based on the previous trajectory. In this case, the processor 111 may be configured to control the output unit 107 to alert the user when the trajectory of the target object is predicted to enter a target airspace, for example with a certain degree of certainty. The processor 111 may be configured to control the output unit 107 to indicate to the user the estimated amount of time before the target object is predicted to enter the target airspace.
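One simple motion model consistent with this description is a constant-velocity least-squares fit to the recent track, stepped forward until the predicted position first enters the space. The sketch below is illustrative only and is not the patent's model; names and parameters are invented.

```python
# Illustrative constant-velocity prediction: fit p(t) ~ p0 + v*t to the
# sampled trajectory, then step the prediction forward and report the
# estimated time until the predicted path enters the given space.
import numpy as np

def predict_entry_time(times, positions, space, horizon=30.0, step=0.1):
    """times: (n,) sample times; positions: (n, 3) sampled coordinates;
    space: any object with a contains((x, y, z)) method, as sketched earlier.
    Returns estimated seconds until entry, or None if no entry within horizon."""
    t = np.asarray(times, dtype=float)
    p = np.asarray(positions, dtype=float)
    A = np.column_stack([np.ones_like(t), t])
    coeffs, *_ = np.linalg.lstsq(A, p, rcond=None)  # rows: p0, v (per axis)
    p0, v = coeffs[0], coeffs[1]
    for dt in np.arange(0.0, horizon, step):
        future = p0 + v * (t[-1] + dt)
        if space.contains(tuple(future)):
            return dt
    return None
```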
The processor 111 may be further configured to obtain various statistics related to the trajectory of the target object. For example, if the trajectory enters a target airspace, the statistics may comprise the total amount of time the target object spends inside the target airspace and/or how deep inside the target airspace the target object enters. If the trajectory of the target object does not enter the target airspace, the statistics may comprise the distance of closest approach to the target airspace and/or the average distance of the target object from the target airspace. These statistics may be stored in the memory 113 and/or sent to a remote location through the communication unit 109.
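The statistics above might be computed over the sampled trajectory as sketched below. The sketch assumes each space object exposes, in addition to contains(), a signed boundary_distance() that is negative inside the space and positive outside (for the ProximitySpace sketch this would be |p - protected| - threshold); that helper, like all names here, is an invented assumption.

```python
# Illustrative trajectory statistics: total time inside the space, greatest
# depth of intrusion, closest approach and mean distance when outside.
import numpy as np

def trajectory_statistics(times, positions, space):
    t = np.asarray(times, dtype=float)
    p = np.asarray(positions, dtype=float)
    inside = np.array([space.contains(tuple(q)) for q in p])
    dists = np.array([space.boundary_distance(tuple(q)) for q in p])
    dt = np.diff(t, append=t[-1])  # crude per-sample durations
    return {
        "time_inside": float(np.sum(dt[inside])),
        "max_depth": float(-dists.min()) if inside.any() else 0.0,
        "closest_approach": float(dists.min()) if not inside.any() else 0.0,
        "mean_distance_outside": float(dists[~inside].mean()) if (~inside).any() else 0.0,
    }
```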

Claims (30)

1. An apparatus for determining whether a first object has entered a space, the apparatus comprising: an object position measurer for measuring the position of the first object; and a processor for determining whether the first object has entered the space based on the position of the first object and information defining the space.
2. An apparatus according to claim 1, wherein the space is a predefined space, and the information defining the space comprises pre-stored information, and wherein the processor is configured for determining whether the first object has entered the space by determining whether the position of the first object is located inside the predefined space.
3. An apparatus according to claim 1 or 2, wherein the object position measurer is configured for measuring the position of the first object relative to the apparatus, and wherein the processor is configured for calculating the position of the first object based on the position of the apparatus and the position of the first object relative to the apparatus.
4. An apparatus according to claim 3, wherein the apparatus further comprises a self-position measurer for measuring the position of the apparatus.
5. An apparatus according to any preceding claim, wherein the object position measurer is further configured for measuring the position of a second object, and wherein the information defining the space is obtained based on the position of the second object.
6. An apparatus according to claim 5, wherein the processor is configured for determining whether the first object has entered the space based on determining whether the position of the first object is less than a threshold distance from the position of the second object.
7. An apparatus according to any preceding claim, wherein the object position measurer comprises: a range finder for measuring the distance to an object in a direct line of sight; and a direction sensor for measuring the direction of an object.
8. An apparatus according to any preceding claim, wherein the processor is configured for storing the position of the first object.
9. An apparatus according to any preceding claim, wherein the processor is configured for storing one or more of: the information defining the space; a time of each measurement; a type of each object; and an identity of a user.
10. An apparatus according to any preceding claim, wherein the apparatus comprises an output unit for outputting an indication to the user when the first object is determined to have entered the space.
11. An apparatus according to any preceding claim, wherein the apparatus comprises an output unit for providing a visual indication of the spatial relationship between the first object and the space.
12. An apparatus according to any preceding claim, wherein the apparatus is configured for tracking the trajectory of the first object.
13. An apparatus according to any preceding claim, wherein the processor is configured for determining that the first object has been validly acquired before the object position measurer measures the position of the first object.
14. An apparatus according to any preceding claim, wherein the apparatus further comprises an apparatus for capturing, immobilising or disabling the first object.
15. An apparatus according to any preceding claim, wherein the first object comprises an aerial vehicle, and the space comprises an airspace.
16. A method for determining whether a first object has entered a space, the method comprising: measuring the position of the first object; and determining whether the first object has entered the space based on the position of the first object and information defining the space.
17. A method according to claim 16, wherein the space is a predefined space, and the information defining the space comprises pre-stored information, and wherein determining whether the first object has entered the space comprises determining whether the position of the first object is located inside the predefined space.
18. A method according to claim 16 or 17, wherein measuring the position of the first object comprises: measuring the position of the first object relative to the apparatus, and calculating the position of the first object based on the position of the apparatus and the position of the first object relative to the apparatus.
19. A method according to claim 18, further comprising measuring the position of the apparatus.
20. A method according to any of claims 16 to 19, wherein the method further comprises: measuring the position of a second object, and obtaining the information defining the space based on the position of the second object.
21. A method according to claim 20, wherein determining whether the first object has entered the space comprises determining whether the position of the first object is less than a threshold distance from the position of the second object.
22. A method according to any of claims 16 to 21, wherein measuring the position of the first or second object comprises: measuring the distance to the first or second object in a direct line of sight; and measuring the direction of the first or second object.
23. A method according to any of claims 16 to 22, further comprising storing the position of the first object.
24. A method according to any of claims 16 to 23, further comprising storing one or more of: the information defining the space; a time of each measurement; a type of each object; and an identity of a user.
25. A method according to any of claims 16 to 24, further comprising outputting an indication to the user when the first object is determined to have entered the space.
26. A method according to any of claims 16 to 25, further comprising providing a visual indication of the spatial relationship between the first object and the space.
27. A method according to any of claims 16 to 26, further comprising tracking the trajectory of the first object.
28. A method according to any of claims 16 to 27, further comprising determining that the first object has been validly acquired before measuring the position of the first object.
29. A method according to any of claims 16 to 28, further comprising capturing, immobilising or disabling the first object.
30. A method according to any of claims 16 to 29, wherein the first object comprises an aerial vehicle, and the space comprises an airspace.
GB1603913.3A 2016-03-07 2016-03-07 Determining whether an object has entered a certain space Active GB2548107B (en)

Priority Applications (2)

Application Number | Priority Date | Filing Date | Title
GB1603913.3A (GB2548107B) | 2016-03-07 | 2016-03-07 | Determining whether an object has entered a certain space
GB1606293.7A (GB2548166B) | 2016-03-07 | 2016-04-12 | Determining whether an object has entered a certain space

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
GB1603913.3A (GB2548107B) | 2016-03-07 | 2016-03-07 | Determining whether an object has entered a certain space

Publications (3)

Publication Number Publication Date
GB201603913D0 GB201603913D0 (en) 2016-04-20
GB2548107A 2017-09-13
GB2548107B GB2548107B (en) 2022-04-13

Family

ID=55859117

Family Applications (2)

Application Number | Title | Priority Date | Filing Date
GB1603913.3A (GB2548107B, Active) | Determining whether an object has entered a certain space | 2016-03-07 | 2016-03-07
GB1606293.7A (GB2548166B, Active) | Determining whether an object has entered a certain space | 2016-03-07 | 2016-04-12

Country Status (1)

Country Link
GB (2) GB2548107B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110583014B (en) * 2016-10-11 2021-04-20 深圳市前海腾际创新科技有限公司 Method and system for detecting and locating intruders using laser detection and ranging device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4409424C1 (en) * 1994-03-18 1995-08-10 Daimler Benz Aerospace Ag Catchment device for flying objects
US7190304B1 (en) * 2003-12-12 2007-03-13 Bae Systems Information And Electronic Systems Integration Inc. System for interception and defeat of rocket propelled grenades and method of use
US7202809B1 (en) * 2004-05-10 2007-04-10 Bae Systems Land & Armaments L.P. Fast acting active protection system
US8205537B1 (en) * 2008-08-11 2012-06-26 Raytheon Company Interceptor projectile with net and tether
US8375837B2 (en) * 2009-01-19 2013-02-19 Honeywell International Inc. Catch and snare system for an unmanned aerial vehicle
US8157169B2 (en) * 2009-11-02 2012-04-17 Raytheon Company Projectile targeting system
US8596178B2 (en) * 2011-01-28 2013-12-03 The Boeing Company Expanding countermeasure and launcher system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003027934A1 (en) * 2001-09-26 2003-04-03 Hodge Philip T Method and apparatus for controlling the use of airspace and assessment of use fees and penalties
WO2009139937A2 (en) * 2008-02-15 2009-11-19 Kutta Technologies, Inc. Unmanned aerial system position reporting system and related methods
US20120215382A1 (en) * 2011-02-23 2012-08-23 Hon Hai Precision Industry Co., Ltd. System and method for controlling unmanned aerial vehicle in flight space
CN102496312A (en) * 2011-12-22 2012-06-13 北京东进记录科技有限公司 Warning method and device for invasion of aerial target in restricted airspace
CN104950907A (en) * 2015-06-26 2015-09-30 广州快飞计算机科技有限公司 Method, device and system for monitoring unmanned aerial vehicle
CN205336543U (en) * 2015-12-18 2016-06-22 苏州贝多环保技术有限公司 Unmanned aerial vehicle

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Laser Technology Inc., 2014, "TruPulse Series". Available from http://web.archive.org/web/20140701211124/http://www.lasertech.com/TruPulse-Laser-Rangefinder.aspx, accessed 21 July 2016. *

Also Published As

Publication number Publication date
GB201603913D0 (en) 2016-04-20
GB2548166B (en) 2022-04-13
GB2548107B (en) 2022-04-13
GB2548166A (en) 2017-09-13
GB2548166A8 (en) 2017-11-22
