WO2019092706A1 - Transmission detection using line of sight

Transmission detection using line of sight

Info

Publication number
WO2019092706A1
WO2019092706A1 (PCT/IL2018/051191)
Authority
WO
WIPO (PCT)
Prior art keywords
airborne vehicle
geo
sight
transmitter
positions
Prior art date
Application number
PCT/IL2018/051191
Other languages
French (fr)
Inventor
Tomer Nahum
Efrat AHARON
Assaf KOUSSEWITZKY
Original Assignee
Elbit Systems Ltd.
Priority date
Filing date
Publication date
Priority claimed from IL255512A external-priority patent/IL255512B/en
Application filed by Elbit Systems Ltd. filed Critical Elbit Systems Ltd.
Priority to US16/762,436 priority Critical patent/US10989792B2/en
Priority to EP18875544.1A priority patent/EP3707524B1/en
Publication of WO2019092706A1 publication Critical patent/WO2019092706A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S1/00 Beacons or beacon systems transmitting signals having a characteristic or characteristics capable of being detected by non-directional receivers and defining directions, positions, or position lines fixed relatively to the beacon transmitters; Receivers co-operating therewith
    • G01S1/02 Beacons or beacon systems transmitting signals having a characteristic or characteristics capable of being detected by non-directional receivers and defining directions, positions, or position lines fixed relatively to the beacon transmitters; Receivers co-operating therewith using radio waves
    • G01S1/08 Systems for determining direction or position line
    • G01S1/20 Systems for determining direction or position line using a comparison of transit time of synchronised signals transmitted from non-directional antennas or antenna systems spaced apart, i.e. path-difference systems
    • G01S1/24 Systems for determining direction or position line using a comparison of transit time of synchronised signals transmitted from non-directional antennas or antenna systems spaced apart, i.e. path-difference systems, the synchronised signals being pulses or equivalent modulations on carrier waves and the transit times being compared by measuring the difference in arrival time of a significant part of the modulations, e.g. LORAN systems
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/04 Position of source determined by a plurality of spaced direction-finders
    • G01S5/16 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • G01S5/18 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
    • G01S5/20 Position of source determined by a plurality of spaced direction-finders
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/021 Auxiliary means for detecting or identifying radar signals or the like, e.g. radar jamming signals
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4804 Auxiliary means for detecting or identifying lidar signals or the like, e.g. laser illuminators
    • G01S7/52 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52001 Auxiliary means for detecting or identifying sonar signals or the like, e.g. sonar jamming signals

Definitions

  • the disclosed technique relates to transmission detection, in general, and to systems and methods for transmission detection using line of sight measurements, in particular.
  • Detection systems for mobile objects typically employ one or more sensors to detect a transmission directed towards the vehicle, such as a fired projectile, or a transmission by a tracking device, and the like.
  • Such sensors may comprise receivers for detecting visual signals, acoustic signals, RADAR signals, and the like.
  • On detecting one or more transmissions directed at the vehicle, these systems might issue a warning to the driver of the vehicle.
  • Additionally, such systems might apply topographical data to compute for the vehicle an alternative route that is outside the transmitting range of the detected transmission.
  • US Patent 9,163,949B2 to Andersson entitled “Dynamic Route Planner”
  • US Patent 9,255,808 B2 to Andersson entitled “Route Planning System and Method for Minimizing Exposure to Threats”
  • Based on the firing history, the system may issue a warning and redirect the vehicle to the alternative route.
  • a method for automatically determining at least one potential transmission region comprising: building multiple layers of access respective of a geographical region and a transmission, wherein each layer of access of the multiple layers of access is associated with a different geo-position of at least one airborne vehicle; computing an intersection of the multiple layers of access; and determining, from the intersection, the at least one potential transmission region respective of the transmission and each of the different geo-positions of the at least one airborne vehicle, wherein building each one of the multiple layers of access comprises: sensing state data of the at least one airborne vehicle comprising at least the geo-position associated with the one layer of access, sensing an indication for a direct line of sight between the transmission and the at least one airborne vehicle, while the at least one airborne vehicle is positioned at the geo-position associated with the one layer of access, obtaining topographical data for the geographical region, and responsive to sensing the indication for the direct line of sight, combining i) the indication for the direct line of sight, ii) the state data of the at least one airborne vehicle comprising at least the geo-position associated with the one layer of access, and iii) the topographical data for the geographical region to compute a subset of the geographical region that includes at least one potential point of the geographical region defining a potential line of sight between the transmission and the at least one airborne vehicle positioned at the geo-position associated with the one layer of access, the one layer of access comprising the subset of the geographical region.
  • the state data further includes a parameter selected from the group consisting of: a time stamp associated with the sensing of the indication for the direct line of sight; a heading measurement for the at least one airborne vehicle; an altitude measurement for the at least one airborne vehicle; a pose measurement for the at least one airborne vehicle; a velocity measurement for the at least one airborne vehicle; and an acceleration measurement for the at least one airborne vehicle.
  • the transmission is selected from the group consisting of: a threat signal; and a distress signal.
  • the at least one airborne vehicle is selected from the group consisting of: a fixed wing aircraft; a helicopter; a drone; and a surveillance balloon.
  • the transmission is selected from the group consisting of: an optical signal; a radio signal; a LIDAR signal; a RADAR signal, an acoustic signal, an ultrasound signal, a thermal signal; and any of a distance, amplitude, velocity, and acceleration associated with a projectile.
  • an automatic transmission detection system for use with an airborne vehicle, comprising: at least one processor configured to: build multiple layers of access respective of a geographical region and a transmission, wherein each of the multiple layers of access is associated with a different geo-position of at least one airborne vehicle; compute an intersection of the multiple layers of access; and determine, from the intersection, at least one potential transmission region respective of each of the different geo-positions of the at least one airborne vehicle, wherein building each one of the multiple layers of access comprises: responsive to a sensing of an indication for a direct line of sight between the transmission and the at least one airborne vehicle while the at least one airborne vehicle is positioned at the geo-position associated with the one layer of access, combining i) the indication for the direct line of sight, ii) state data of the at least one airborne vehicle comprising at least the geo-position associated with the one layer of access, and iii) topographical data for the geographical region, to compute a subset of the geographical region that includes at least one potential point of the geographical region defining a potential line of sight between the transmission and the at least one airborne vehicle positioned at the geo-position associated with the one layer of access, the one layer of access comprising the subset of the geographical region; a memory unit configured to store the topographical data for the geographical region; at least one first sensor configured to sense the state data comprising at least each of the associated geo-positions of the at least one airborne vehicle; and at least one second sensor configured to sense each of the indications for direct line of sight between the transmission and the at least one airborne vehicle, while the at least one airborne vehicle is positioned at each of the geo-positions associated with each respective layer of access.
  • the system further comprises the at least one airborne vehicle, the at least one airborne vehicle provided with the at least one processor, the memory unit, the at least one first sensor, and the at least one second sensor.
  • the at least one first sensor is selected from the group consisting of: a compass, a GPS unit, a 3D accelerometer, a gyroscope, and a camera.
  • the at least one second sensor is selected from the group consisting of: a long range radio antenna, a LIDAR detector, a RADAR antenna, a thermal detector, an acoustic detector, an ultrasound detector, and a camera.
  • In some embodiments, the system further comprises a user interface configured to display the at least one potential transmission region respective of each of the different geo-positions of the at least one airborne vehicle.
  • Figures 1A-1G are schematic illustrations of a system for detecting a potential transmitting region respective of at least one airborne vehicle moving over a geographical region, constructed and operative in accordance with an embodiment of the disclosed technique;
  • Figure 2 is a schematic illustration of the system of Figures 1A-1G, constructed and operative in accordance with another embodiment of the disclosed technique;
  • Figures 3A-3G are schematic illustrations of another system for detecting a potential transmitting region respective of at least one airborne vehicle moving over a geographical region, constructed and operative in accordance with another embodiment of the disclosed technique;
  • Figures 4A-4B, taken together, are a schematic illustration of a method for determining a potential transmission region, constructed and operative in accordance with an embodiment of the disclosed technique.
  • the disclosed technique overcomes the disadvantages of the prior art by providing a system and method for detecting a potential transmitting region respective of an airborne vehicle moving over a geographical region.
  • the airborne vehicle is provided with one or more sensors that sense any combination of radio, optical, acoustic, thermal, LIDAR, and RADAR transmission signals that are transmitted by a transmitter positioned somewhere in the geographical region.
  • the transmission signals may be communicated from a transmitter to the airborne vehicle for tracking purposes, attack purposes, alert or distress purposes, and the like.
  • each airborne vehicle is provided with one or more sensors that sense the geo-position of the vehicle at a given point in time, where geo-position is understood to be the real-world geolocation of the airborne vehicle respective of the elevation, longitude and latitude geo-coordinates.
  • a processor may be provided with the airborne vehicle to determine a potential transmission region for the transmitter by combining multiple layers of access respective of the different geo-positions of the airborne vehicle.
  • The processor builds each layer of access by combining a topographical map of the geographical region with the respective geo-position of the airborne vehicle and the line of sight indications detected at the respective geo-position.
  • the potential transmission region is further reduced to provide a more accurate assessment for the position of the transmitter. This may be useful for assessing the location of a threat, such as an enemy tracking system, or projectile. Alternatively, determining the position of the transmitter may be useful for assessing the location of an ally, such as when the ally transmits a distress signal.
  • Line of sight is defined as the unobstructed straight line between a transmitter and a sensor. Therefore, when the sensor senses a transmission by the transmitter, this is a binary indication for direct line of sight, as defined in this application, validating the assumption that an unobstructed straight line exists between the transmitter and the sensor.
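  • The terrain test behind such a binary indication can be sketched in code. The following is a minimal illustration only, assuming the topographical data is a regular elevation grid and that positions are expressed in grid coordinates; the function and parameter names are illustrative assumptions, not part of the disclosed system:

```python
import numpy as np

def has_line_of_sight(dtm, ground_point, vehicle_point):
    """True if the straight segment from a candidate ground point to the
    airborne vehicle clears every terrain cell it passes over.

    dtm: 2D array of terrain elevations on a regular grid (a simple DTM).
    ground_point, vehicle_point: (row, col, elevation) in grid coordinates.
    """
    r0, c0, z0 = ground_point
    r1, c1, z1 = vehicle_point
    # Sample the segment densely enough to visit every grid cell it crosses.
    steps = int(2 * max(abs(r1 - r0), abs(c1 - c0))) + 2
    for t in np.linspace(0.0, 1.0, steps):
        r, c = r0 + t * (r1 - r0), c0 + t * (c1 - c0)
        z = z0 + t * (z1 - z0)                      # elevation of the sight line here
        if z < dtm[int(round(r)), int(round(c))]:   # terrain blocks the sight line
            return False
    return True
```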
  • Figures 1A-1G are schematic illustrations of a system 100 for detecting a potential transmitting region respective of a transmitter 106 and at least one airborne vehicle 102 moving over a geographical region 104, constructed and operative in accordance with an embodiment of the disclosed technique.
  • Geographical region 104 includes one or more peaks and valleys, such as peaks 108A, 108B, and 108C.
  • Figure 1A illustrates system 100 at a time T1 with airborne vehicle 102 positioned at a geo-position P(T1), and Figure 1B illustrates a layer of access, Layer(P(T1)), corresponding to geo-position P(T1) of Figure 1A.
  • Figure 1C illustrates system 100 at a time T2, after airborne vehicle 102 has moved to another geo-position P(T2), and
  • Figure 1D illustrates a layer of access, Layer(P(T2)), corresponding to geo-position P(T2) of Figure 1C.
  • System 100 includes at least one computer 110.
  • Computer 110 is provided with airborne vehicle 102.
  • Alternatively, computer 110 is provided separately from airborne vehicle 102, and is in communication with airborne vehicle 102 to receive one or more indications of a sensed transmission and sensed geo-positioning data respective of airborne vehicle 102.
  • A transmitter 106 is positioned at one of multiple potential transmitting regions within geographical region 104, such as on top of one of peaks 108A, 108B and 108C, each of which constitutes a potential transmitting region.
  • Transmitter 106 may be any of a LIDAR, RADAR, infrared (IR), radio, optical, acoustic, ultrasound (US), thermal or electric transmitter.
  • The transmission may be associated with a tracking system or a guided projectile system of an opponent.
  • Alternatively, the transmission may be a distress signal from an ally.
  • Computer 110 includes hardware componentry and software for acquiring the respective geo-position of airborne vehicle 102 from a geo-positioning sensor (not shown) provided with airborne vehicle 102 as the respective geo-position of airborne vehicle 102 changes with time, i.e. from time T1 to time T2. Additionally, computer 110 includes hardware componentry and software for acquiring indications of detections by a transmission sensor (not shown) provided with airborne vehicle 102 as the respective geo-position of airborne vehicle 102 changes with time. The transmission sensor detects one or more transmissions transmitted by transmitter 106 in the direction of airborne vehicle 102. Computer 110 is operative to apply the indications of the detected transmissions to determine one or more indications for line of sight respective of transmitter 106 and the current geo-position of airborne vehicle 102.
  • Computer 110 additionally includes hardware componentry and software for storing topographical data respective of geographical region 104, as well as software instructions that are executable by computer 110. These and other components of computer 110, the transmission sensor and the geo-positioning sensor will be described in greater detail herein below in conjunction with Figure 2. As airborne vehicle 102 moves and the respective geo-position of airborne vehicle 102 changes over time, computer 110 builds multiple layers of access respective of geographical region 104 and transmissions by transmitter 106. Each of the multiple layers of access is associated with a different geo-position of airborne vehicle 102. A description of the layers of access respective of each geo-location of airborne vehicle 102 now follows:
  • Airborne vehicle 102 is at geo-position P(T1). While airborne vehicle 102 is positioned at geo-position P(T1), the transmission sensor (not shown) provided with airborne vehicle 102 senses one or more transmissions transmitted by transmitter 106 external to airborne vehicle 102. Computer 110 obtains indications, such as a binary indication, of the sensed transmissions at time T1.
  • The transmission may be any combination of: a LIDAR, RADAR, IR, radio, optical, acoustic, ultrasound, thermal or electric signal.
  • Airborne vehicle 102 additionally is provided with at least one geo-positioning sensor (not shown) for sensing state data corresponding to the physical state and geo-position of airborne vehicle 102.
  • The geo-positioning sensor acquires the physical state data, including at least geo-position P(T1), i.e. the geo-position of airborne vehicle 102 at time T1, and may additionally acquire parameters such as the pose, velocity, acceleration, heading, and timestamp T1 respective of the time of sensing.
  • Computer 110 acquires the state data from the geo-positioning sensor and determines from the state data and the detected transmission one or more indications for direct line of sight between transmitter 106 and airborne vehicle 102 positioned at geo-position P(T1), such as indications for direct lines of sight 112A(T1) respective of peak 108A, 112B(T1) respective of peak 108B, and 112C(T1) respective of peak 108C.
  • Responsive to sensing the indications for direct line of sight 112A(T1), 112B(T1), and 112C(T1), computer 110 combines i) the state data, including at least P(T1), with ii) the indications for direct line of sight 112A(T1), 112B(T1), and 112C(T1), and iii) the topographical data for geographical region 104 to compute a first subset of geographical region 104.
  • The first subset of geographical region 104 is indicated in Figure 1A by the union of sub-regions 108A(T1), 108B(T1), and 108C(T1) respective of peaks 108A, 108B, and 108C.
  • Each of sub-regions 108A(T1), 108B(T1) and 108C(T1) includes at least one potential point within geographical region 104 that defines a potential line of sight between the transmission by transmitter 106 and airborne vehicle 102 while positioned at P(T1).
  • Each potential point is a potential location for transmitter 106 respective of geo-position P(T1): potential point 114A(T1) is determined respective of sub-region 108A(T1), potential point 114B(T1) is determined respective of sub-region 108B(T1), and potential point 114C(T1) is determined respective of sub-region 108C(T1).
  • Each potential line of sight describes an unobstructed, observational perspective from each potential point to airborne vehicle 102. It is to be noted that Figure 1A indicates three potential points respective of three sub-regions for illustrative purposes only. However, each sub-region may include multiple potential points.
  • Computer 110 builds a layer of access Layer(P(T1)) associated with geo-position P(T1) that includes at least sub-regions 108A(T1), 108B(T1) and 108C(T1).
  • A description of Layer(P(T1)) is given below with respect to Figure 1B.
  • Reference is now made to Figure 1B, which is associated with Figure 1A and shows a schematic illustration of layer of access Layer(P(T1)) built by computer 110 and overlaid on a digital terrain map (DTM) 204 of geographical region 104.
  • DTM 204 may be provided in advance as a database and stored at a storage device of computer 110.
  • Alternatively, computer 110 may build DTM 204 in real time based on data acquired via a sensor, such as any of the components provided with transmission sensor 242 described hereinbelow with respect to Figure 2, i.e. via a camera, RADAR antenna, LIDAR detector, radio receiver (antenna), acoustic detector, sonar (US) detector, thermal detector, and the like.
  • Layer(P(T1)) is built respective of geo-position P(T1) of airborne vehicle 102 at time T1.
  • Layer of access Layer(P(T1)) includes the union of three representative sub-regions 208A(T1), 208B(T1) and 208C(T1) of DTM 204 of geographical region 104, corresponding to respective sub-regions 108A(T1), 108B(T1) and 108C(T1) of peaks 108A, 108B, and 108C.
  • Peaks 208A, 208B, and 208C represent peaks 108A, 108B, and 108C, and are obtained from the topographical data of geographical region 104 stored in a memory unit of computer 110.
  • Each of representative sub-regions 208A(T1), 208B(T1) and 208C(T1) includes at least one potential point representing a potential line of sight between the transmission by transmitter 106 and airborne vehicle 102 while positioned at geo-position P(T1), and resulting from sub-regions 108A(T1), 108B(T1) and 108C(T1) of peaks 108A, 108B, and 108C being within direct line of sight of airborne vehicle 102 at geo-position P(T1).
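  • In grid terms, a layer of access such as Layer(P(T1)) can be thought of as a boolean mask over the DTM that is true wherever a ground-level transmitter would have a direct line of sight to the vehicle. The following is a minimal sketch only, reusing the illustrative has_line_of_sight() helper shown earlier and assuming a regular elevation grid addressed by grid coordinates; the names and signature are assumptions, not the disclosed implementation:

```python
import numpy as np

def layer_of_access(dtm, vehicle_rc, vehicle_alt):
    """Boolean mask over the DTM: True at cells from which a transmitter at
    terrain height would have a direct line of sight to the vehicle located
    above grid cell `vehicle_rc` at altitude `vehicle_alt`."""
    rows, cols = dtm.shape
    layer = np.zeros((rows, cols), dtype=bool)
    vr, vc = vehicle_rc
    for r in range(rows):
        for c in range(cols):
            ground = (r, c, dtm[r, c])          # candidate transmitter location
            layer[r, c] = has_line_of_sight(dtm, ground, (vr, vc, vehicle_alt))
    return layer
```

  • A brute-force scan of every cell, as above, is quadratic in the grid size; production viewshed algorithms are considerably faster, but the semantics of the resulting mask are the same.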
  • Airborne vehicle 102 then moves to a second geo-position, P(T2). While airborne vehicle 102 is positioned at geo-position P(T2), the transmission sensor of airborne vehicle 102 senses one or more transmissions transmitted by transmitter 106.
  • The transmission may be any combination of: a LIDAR, RADAR, IR, radio, optical, acoustic, ultrasound, thermal or electric signal.
  • Computer 110 obtains indications, such as a binary indication, of the sensed transmissions at time T2.
  • The geo-positioning sensor provided with airborne vehicle 102 senses state data that includes at least geo-position P(T2), i.e. the geo-position of airborne vehicle 102 at time T2.
  • Computer 110 acquires the state data from the geo-positioning sensor and determines from the detected transmission one or more indications for direct line of sight between a transmission transmitted by transmitter 106 and airborne vehicle 102 positioned at P(T2). While airborne vehicle 102 is positioned at P(T2), peaks 108A and 108B are within its line of sight; however, peak 108C is outside its line of sight. Accordingly, computer 110 senses indications for direct line of sight 112A(T2) and 112B(T2), respective of peaks 108A and 108B.
  • Responsive to sensing the indications for direct line of sight 112A(T2) and 112B(T2), computer 110 combines i) the state data, including geo-position P(T2), with ii) the indications for direct line of sight 112A(T2) and 112B(T2), and iii) the topographical data for geographical region 104 to compute a second subset of geographical region 104.
  • The second subset of geographical region 104 is indicated in Figure 1C by the union of sub-regions 108A(T2) and 108B(T2), respective of peaks 108A and 108B.
  • Each of sub-regions 108A(T2) and 108B(T2) includes at least one potential point within geographical region 104 that defines a potential line of sight between the transmission by transmitter 106 and airborne vehicle 102 while positioned at P(T2).
  • Each potential point is a potential location for transmitter 106 respective of geo-position P(T2): potential point 114A(T2) is determined respective of sub-region 108A(T2), and potential point 114B(T2) is determined respective of sub-region 108B(T2).
  • Computer 110 builds a layer of access Layer(P(T2)) associated with geo-position P(T2) that includes at least sub-regions 108A(T2) and 108B(T2).
  • Reference is now made to Figure 1D, which is associated with Figure 1C and shows a schematic illustration of layer of access Layer(P(T2)), as built by computer 110.
  • Layer of access Layer(P(T2)) includes the union of only two representative sub-regions 208A(T2) and 208B(T2) of DTM 204 of geographical region 104, corresponding to respective sub-regions 108A(T2) and 108B(T2) of Figure 1C.
  • Each of representative sub-regions 208A(T2) and 208B(T2) includes at least one potential point representing a potential line of sight between the transmission by transmitter 106 and airborne vehicle 102 while positioned at geo-position P(T2).
  • Computer 110 creates a superimposition 224 by overlaying layer of access Layer(P(T1)), associated with geo-position P(T1), with layer of access Layer(P(T2)), associated with geo-position P(T2).
  • Superimposition 224 includes representative sub-regions 208A(T1), 208B(T1), and 208C(T1) overlaid on representative sub-regions 208A(T2) and 208B(T2).
  • Superimposition 224 includes representative overlap regions 224A and 224B, corresponding to the non-zero overlap, or intersection, between representative sub-regions 208A(T1) and 208B(T1) and representative sub-regions 208A(T2) and 208B(T2), i.e. those points that are common to both representative sub-regions 208A(T1) and 208B(T1) and representative sub-regions 208A(T2) and 208B(T2).
  • Computer 110 determines intersection 226 by extracting representative overlap regions 224A and 224B from superimposition 224 of Figure 1E and overlaying overlap regions 224A and 224B on DTM 204.
  • Computer 110 determines from representative overlap regions 224A and 224B of intersection 226 at least one potential transmission region respective of each of the different geo-positions, i.e. P(T1) and P(T2), of airborne vehicle 102. In this manner, computer 110 narrows down the potential transmission region from the entire geographic region 104 to sub-regions within geographic region 104 corresponding to representative overlap regions 224A and 224B of intersection 226.
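  • In mask terms, the narrowing step reduces to an elementwise AND of the layer-of-access masks: a cell survives only if it had a line of sight to the vehicle at every geo-position where the transmission was sensed. A minimal sketch, continuing the same illustrative assumptions as the earlier snippets (the names and the example coordinates are hypothetical):

```python
def intersect_layers(layers):
    """Elementwise AND of layer-of-access masks; the True cells that remain
    form the potential transmission region."""
    result = layers[0].copy()
    for layer in layers[1:]:
        result &= layer
    return result

# Hypothetical usage with two vehicle geo-positions, e.g. P(T1) and P(T2):
# layer_t1 = layer_of_access(dtm, (40, 10), vehicle_alt=950.0)
# layer_t2 = layer_of_access(dtm, (55, 30), vehicle_alt=980.0)
# candidates = intersect_layers([layer_t1, layer_t2])
```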
  • Additional indications, such as binary indications for lines of sight respective of additional geo-positions, may be acquired as airborne vehicle 102 continues to move over geographical region 104.
  • Computer 110 may build additional layers of access associated with the additional geo-positions.
  • Alternatively, additional layers of access may be built by additional computers associated with additional airborne vehicles (not shown).
  • Computer 110 of airborne vehicle 102 uses representative overlap regions 224A and 224B of respective Layer(P(T1)) and Layer(P(T2)) to determine each of regions 124A and 124B on respective peaks 108A and 108B as a potential transmission region for transmitter 106.
  • Potential transmission regions 124A and 124B are defined respective of the geo-position of airborne vehicle 102 at time T1, i.e. P(T1), indicated by an "X" 126, and the geo-position of airborne vehicle 102 at time T2, i.e. P(T2), indicated by an "X" 128.
  • Computer 110 of airborne vehicle 102 displays indications for regions 124A and 124B on a user interface provided with computer 110.
  • In one embodiment, the user interface is an augmented reality display, and regions 124A and 124B are displayed as features overlaid on a real-time rendition of geographic region 104.
  • Figure 2 is a schematic illustration of a system 200 including a computer, generally referenced 210, a transmission sensor 242, and a geo-positioning sensor 254, constructed and operative in accordance with another embodiment of the disclosed technique.
  • Computer 210, transmission sensor 242, and geo-positioning sensor 254 are provided by way of example, and are representative, respectively, of computer 110, the transmission sensor, and the geo-positioning sensor described above with respect to Figures 1A-1G.
  • Computer 210 includes at least one processor 230.
  • Processor 230 may be any of a CPU, GPU, APU, digital signal processor (DSP), and the like.
  • Computer 210 additionally includes a memory unit 264 and a user interface 266. Memory unit 264 and user interface 266 are electrically coupled to processor 230.
  • Transmission sensor 242 includes one or more of: a medium range RF transceiver 234 (e.g., WIFI), a long range transceiver 238, such as in the FM or AM range, and a cellular communication transceiver 240 (e.g., GSM, LTE, WIMAX).
  • transmission sensor 242 is operative to send and receive radio frequency (RF) signals relating to data and executable instructions for executing one or more of the procedures described herein.
  • Transmission sensor 242 may additionally include one or more of: a LIDAR detector 244 operative to detect a Light Detection and Ranging (LIDAR) signal, a RADAR antenna 246 operative to detect a RADAR signal, a thermal detector 248 operative to detect a thermal signal, an acoustic detector 250A operative to detect acoustic signals, an ultrasound (US) detector 250B operative to detect ultrasound signals, and one or more cameras 252 configured to detect spectra within any of the visible, infrared, or ultraviolet range. Camera 252 may be any of a monocular camera, a stereoscopic camera, a scanning camera, and combinations thereof.
  • Geo-positioning sensor 254 is operative to perform real-time spatial tracking and geo-positioning of any combination of absolute and relative translational and rotational motion of airborne vehicle 102.
  • Geo-positioning sensor 254 includes at least a 3D accelerometer unit 256, a gyroscope 258, and a compass 260, that are collectively operative to perform 9-axis inertial measurements respective of airborne vehicle 102.
  • Geo-positioning sensor 254 additionally includes a GPS receiver 262 operative to receive global geo- positioning measurements respective of airborne vehicle 102.
  • Memory unit 264 is operative to store one or more software instructions executable by processor 230. Memory unit 264 is additionally operative to store topographical data respective of a geographical region, such as region 104. Memory unit 264 is additionally operative to store any results from executing the software instructions, such as layers of access Layer(P(T1)) 220 and Layer(P(T2)) 222, superimposition 224, and intersection 226 described above with respect to Figures 1B and 1D-1F.
  • User interface 266 is operative to display one or more graphical features, such as Layer(P(T1)) and Layer(P(T2)), superimposition 224, and intersection 226 respective of a representation 204 of geographical region 104.
  • User interface (UI) 266 includes at least a display for displaying any of layers of access Layer(P(T1)) and Layer(P(T2)), superimposition 224, and intersection 226. UI 266 may be any of a heads-up display, augmented reality display, virtual reality display, flat screen display, and the like.
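  • One simple, non-AR way to render such results over a representation of the geographical region is to overlay the candidate mask on the DTM, for example with matplotlib. This is an illustrative sketch only, continuing the hypothetical mask representation from the earlier snippets:

```python
import numpy as np
import matplotlib.pyplot as plt

def show_potential_region(dtm, candidates):
    """Render the DTM and highlight the cells of the potential transmission region."""
    plt.imshow(dtm, cmap="terrain")                         # terrain as the base layer
    overlay = np.ma.masked_where(~candidates, candidates)   # hide cells outside the region
    plt.imshow(overlay, cmap="autumn", alpha=0.6)           # highlight the surviving cells
    plt.title("Potential transmission region")
    plt.show()
```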
  • Transmission sensor 242 and the components included therein described above are coupled to processor 230 via respective converters, i.e. analog-to-digital converters (ADCs) and digital-to-analog converters (DACs) (not shown).
  • Alternatively, transmission sensor 242 and the components included therein are coupled to computer 210 wirelessly, such as via electromagnetic or optical communications means.
  • Geo-positioning sensor 254 and the components included therein described above are communicatively coupled to processor 230, such as via wired or wireless, e.g. electromagnetic or optical, communications means.
  • Memory unit 264 and UI 266 are electrically coupled to processor 230.
  • Processor 230 determines any of the absolute and relative geo-position of airborne vehicle 102 according to any combination of measurements acquired by geo-positioning sensor 254 and transmission sensor 242. For example, processor 230 may apply one or more GPS measurements acquired by GPS 262 with one or more measurements acquired by any of compass 260, 3D accelerometer 256, and gyroscope 258, and one or more transmissions received by transmission sensor 242, such as images acquired by camera 252, to determine a relative or absolute geo-positioning of airborne vehicle 102 respective of geographical region 104 and transmitter 106. Processor 230 may additionally apply data received via an operating system configured with computer 210 to determine a relative or absolute geo-positioning of airborne vehicle 102 respective of geographical region 104 and transmitter 106.
  • Processor 230 receives translational and rotational motion information from 3D accelerometer 256 and gyroscope 258, and changes in absolute orientation from compass 260. Additionally, processor 230 periodically receives positioning information from GPS 262, and one or more images from camera 252. Processor 230 applies the information received from compass 260, 3D accelerometer 256, gyroscope 258, and GPS 262, and optionally camera 252, to determine and update the geo-position of airborne vehicle 102.
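  • The disclosed technique does not specify a particular fusion algorithm for combining these measurements. Purely as an illustrative sketch, inertial dead reckoning can be blended with a periodic GPS fix as follows (a toy complementary blend with hypothetical names, not a full inertial navigation or Kalman filter solution):

```python
import numpy as np

def update_geo_position(pos, vel, accel, dt, gps_fix=None, gps_weight=0.2):
    """Toy dead-reckoning update: integrate acceleration and velocity, then pull
    the position estimate toward a GPS fix whenever one is available.

    pos, vel, accel: length-3 arrays in a local level frame; dt in seconds.
    """
    vel = vel + np.asarray(accel) * dt        # integrate accelerometer output
    pos = pos + vel * dt                      # integrate velocity to update position
    if gps_fix is not None:                   # periodic absolute correction from GPS
        pos = (1.0 - gps_weight) * pos + gps_weight * np.asarray(gps_fix)
    return pos, vel
```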
  • Gyroscope 258 measures rotational velocity of airborne vehicle 102.
  • 3D accelerometer 256 measures the acceleration of airborne vehicle 102.
  • Processor 230 acquires the data measured by geo-positioning sensor 254 to compute the elevation and the roll of the mechanical frame of airborne vehicle 102 with respect to the vertical direction.
  • Compass 260 measures the Earth's magnetic field vector affecting airborne vehicle 102. Additionally, one or more signals received via transceiver 232 may be used to correct for drift, noise and the like.
  • Processor 230 periodically receives indications of any of: a LIDAR signal detected via LIDAR detector 244; a RADAR signal detected via RADAR antenna 246; a thermal signal detected via thermal detector 248; an acoustic signal detected via acoustic detector 250A; an ultrasound signal detected via US detector 250B; and one or more images acquired via camera 252, to detect an indication for potential line of sight respective of a transmission by transmitter 106.
  • Figures 3A-3G are schematic illustrations of a system 300 for detecting a potential transmitting region respective of at least one airborne vehicle 302 moving over a geographical region 304, constructed and operative in accordance with another embodiment of the disclosed technique.
  • System 300 is substantially similar to system 100 described above with respect to Figures 1A-1G.
  • System 300 includes at least one airborne vehicle 302 provided with a computer 310.
  • Computer 310 may be represented by computer 210 of Figure 2.
  • System 300 additionally includes a transmission sensor (not shown) and a geo-positioning sensor (not shown) provided with airborne vehicle 302, such as corresponding to transmission sensor 242, and geo-positioning sensor 254 of Figure 2.
  • Airborne vehicle 302 is positioned at a geo-position P302(T1). While airborne vehicle 302 is positioned at geo-position P302(T1), the transmission sensor of airborne vehicle 302 senses one or more transmissions transmitted by a transmitter (not shown). Computer 310 acquires an indication, such as a binary indication, of the sensed transmission and determines from the one or more transmissions sensed at time T1 one or more indications for direct line of sight between the transmitter and airborne vehicle 302. The geo-positioning sensor of airborne vehicle 302 senses state data that includes at least geo-position P302(T1) of airborne vehicle 302 at time T1.
  • Computer 310 acquires the state data, and applies a DTM of geographical region 304 to build a layer of access 320(T1), using the techniques described above with respect to Figures 1A-1G.
  • Layer of access 320(T1) includes the union of sub-regions 308A(T1), 308B(T1), 308C(T1), and 308D(T1).
  • Each of sub-regions 308A(T1), 308B(T1), 308C(T1), and 308D(T1) includes at least one potential point within geographical region 304 that defines a potential line of sight between the transmission and airborne vehicle 302 while positioned at P302(T1).
  • Airborne vehicle 302 has moved and is now positioned at a geo-position P302(T2). While airborne vehicle 302 is positioned at geo-position P302(T2), the transmission sensor of airborne vehicle 302 senses one or more transmissions transmitted by the transmitter.
  • Computer 310 acquires an indication, such as a binary indication, of the sensed transmission and determines from the one or more transmissions sensed at time T2 one or more indications for direct line of sight between the transmitter and airborne vehicle 302.
  • The geo-positioning sensor of airborne vehicle 302 senses state data that includes at least geo-position P302(T2) of airborne vehicle 302 at time T2.
  • Computer 310 acquires the state data, and applies a DTM of geographical region 304 to build a layer of access 320(T2), using the techniques described above with respect to Figures 1A-1G.
  • Layer of access 320(T2) includes the union of sub-regions 308C(T2) and 308D(T2).
  • Each of sub-regions 308C(T2) and 308D(T2) includes at least one potential point within geographical region 304 that defines a potential line of sight between the transmission and airborne vehicle 302 while positioned at P302(T2).
  • Computer 310 creates a superimposition 324 by overlaying layer of access 320(T1) of Figure 3A, associated with geo-position P302(T1), with layer of access 320(T2) of Figure 3B, associated with geo-position P302(T2).
  • Superimposition 324 includes sub-regions 308A(T1), 308B(T1), and 308C(T1) overlaid on sub-regions 308C(T2) and 308D(T2).
  • Superimposition 324 includes overlap regions 324A, 324B, 324C, 324D, and 324E overlaid on DTM 304, corresponding to the intersection, i.e. those points that are common to both sub-regions 308A(T1), 308B(T1), and 308C(T1) and sub-regions 308C(T2) and 308D(T2).
  • Airborne vehicle 302 has moved again and is now positioned at a geo-position P302(T3). While airborne vehicle 302 is positioned at geo-position P302(T3), the transmission sensor of airborne vehicle 302 senses one or more transmissions transmitted by the transmitter.
  • Computer 310 acquires an indication, such as a binary indication, of the sensed transmission and determines from the one or more transmissions sensed at time T3 one or more indications for direct line of sight between the transmitter and airborne vehicle 302.
  • The geo-positioning sensor of airborne vehicle 302 senses state data that includes at least geo-position P302(T3) of airborne vehicle 302 at time T3.
  • Computer 310 acquires the state data, and applies a DTM of geographical region 304 to build a layer of access 320(T3), using the techniques described above with respect to Figures 1A-1G.
  • Layer of access 320(T3) includes the union of sub-regions 308C(T3), 308D(T3), and 308E(T3).
  • Each of sub-regions 308C(T3), 308D(T3), and 308E(T3) includes at least one potential point within geographical region 304 that defines a potential line of sight between the transmission and airborne vehicle 302 while positioned at P302(T3).
  • Computer 310 creates a superimposition 326 by overlaying layer of access 320(T3) of Figure 3D, associated with geo-position P302(T3), with overlap regions 324A, 324B, 324C, 324D, and 324E, determined above with respect to Figure 3C.
  • Superimposition 326 includes sub-regions 308C(T3), 308D(T3), and 308E(T3) overlaid on overlap regions 324A, 324B, 324C, 324D, and 324E.
  • Superimposition 326 includes overlap regions 326A, 326B, 326C, and 326D, overlaid on DTM 304, and corresponding to the intersection of sub-regions 308C(T3), 308D(T3), and 308E(T3) with overlap regions 324A, 324B, 324C, 324D, and 324E, i.e. those points that are common to both.
  • Computer 310 extracts overlap regions 326A, 326B, 326C, and 326D from superimposition 326, and overlays these regions on DTM 304.
  • The potential region of transmission at time T3 is considerably smaller than at time T1.
  • Computer 310 defines a general transmission region 328, including overlap regions 326A, 326B, 326C, and 326D.
  • Computer 310 may display any of region 328 and overlap regions 326A, 326B, 326C, and 326D on a user interface, such as UI 266 of Figure 2.
  • In one embodiment, computer 310 displays region 328 and overlap regions 326A, 326B, 326C, and 326D on an augmented reality display of the geographical region represented by DTM 304.
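  • The sequence illustrated in Figures 3A-3G amounts to folding each new layer of access into the running candidate region as the vehicle moves. A compact sketch of that loop, under the same illustrative assumptions as the earlier snippets (regular DTM grid, hypothetical helper names):

```python
def track_transmitter(dtm, detections):
    """Iteratively intersect a new layer of access into the candidate region
    each time the transmission is sensed at a new vehicle geo-position.

    detections: iterable of ((row, col), altitude) vehicle states at which the
    transmission was sensed, e.g. at times T1, T2 and T3.
    """
    candidates = None
    for vehicle_rc, vehicle_alt in detections:
        layer = layer_of_access(dtm, vehicle_rc, vehicle_alt)
        candidates = layer if candidates is None else (candidates & layer)
    return candidates   # boolean mask of the remaining potential transmission region

# Hypothetical usage over three sensed geo-positions:
# region = track_transmitter(dtm, [((40, 10), 950.0), ((55, 30), 980.0), ((70, 52), 1000.0)])
```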
  • Figures 4A-4B are a schematic illustration of a method for determining a potential transmission region, constructed and operative in accordance with an embodiment of the disclosed technique.
  • Figure 4A illustrates a general method to determine a potential transmission region
  • Figure 4B illustrates a more detailed implementation for procedure 400 of Figure 4A.
  • each layer of access of the multiple layers of access is associated with a different geo-position of at least one airborne vehicle.
  • the transmission is any of: a threat signal; and a distress signal.
  • the at least one airborne vehicle may be any of: a fixed wing aircraft; a helicopter; a drone; and a surveillance balloon.
  • the transmission may be any combination of: an optical signal; a radio signal; a LIDAR signal; a RADAR signal; an acoustic signal, an ultrasound signal, a thermal signal, and any of a distance, amplitude and velocity associated with a projectile.
  • Computer 110 builds multiple layers of access respective of geographical region 104 and a transmission by transmitter 106.
  • Each of the multiple layers of access is associated with a different geo-position of airborne vehicle 102.
  • Layer(P(T1)) 220 of Figure 1B is associated with geo-position P(T1) of airborne vehicle 102 at time T1, as shown in Figure 1A.
  • Layer(P(T2)) 222 of Figure 1D is associated with geo-position P(T2) of airborne vehicle 102 at time T2, as shown in Figure 1C.
  • In procedure 402, an intersection of the multiple layers of access is computed.
  • Computer 110 creates superimposition 224, shown in Figure 1E, by overlaying layer of access Layer(P(T1)), associated with geo-position P(T1) and shown in Figure 1B, with layer of access Layer(P(T2)), associated with geo-position P(T2) and shown in Figure 1D.
  • Representative sub-regions 208A(T1), 208B(T1), and 208C(T1) defining Layer(P(T1)) of Figure 1B are overlaid on representative sub-regions 208A(T2) and 208B(T2) defining Layer(P(T2)) of Figure 1D.
  • Computer 110 computes overlap regions 224A and 224B by determining those points that are common to both respective representative sub-regions 208A(T1), 208B(T1), and 208C(T1) and representative sub-regions 208A(T2) and 208B(T2), such as by performing a logical AND operation.
  • Computer 110 computes intersection 226 of Figure 1F by extracting overlap regions 224A and 224B from superimposition 224 of Figure 1E, and stores superimposition 224.
  • Computer 110 may display any of representative layers of access Layer(P(T1)) and Layer(P(T2)), superimposition 224, and intersection 226 on UI 266 of computer 210.
  • At least one potential transmission region is determined from the intersection.
  • Computer 110 maps overlap regions 224A and 224B of Figure 1E onto geographic region 104, as corresponding regions 124A and 124B on respective peaks 108A and 108B.
  • Computer 110 determines that at least one of regions 124A and 124B is a potential transmission region for transmitter 106, respective of the geo-position of airborne vehicle 102 at time T1, indicated by an "X" 126, and the geo-position of airborne vehicle 102 at time T2, indicated by an "X" 128.
  • Computer 110 displays indications for regions 124A and 124B on UI 266.
  • In one embodiment, UI 266 is an augmented reality display, and regions 124A and 124B are displayed as features overlaid on a real-time rendition of geographic region 104.
  • state data of the at least one airborne vehicle comprising at least the geo-position associated with the one layer of access is sensed.
  • the state data further includes any combination of: a time stamp indicating the time of sensing of the indication for the direct line of sight; a heading measurement for the at least one airborne vehicle; an altitude measurement for the at least one airborne vehicle; a pose measurement for the at least one airborne vehicle; a velocity measurement for the at least one airborne vehicle; and an acceleration measurement for the at least one airborne vehicle.
  • the sensed indication is a binary indication for the existence of a direct line of sight.
  • Processor 230 applies measurements acquired by any of 3D accelerometer 256, gyroscope 258, compass 260, GPS 262, and camera 252 to determine at least the geo-positions of airborne vehicle 102 at time T1, shown in Figure 1A, and at time T2, shown in Figure 1C.
  • Processor 230 may additionally use any of the above measurements, and any additional measurements acquired via one or more components of transceiver 232, to determine the heading, pose, velocity, and acceleration of airborne vehicle 102.
  • Transmission sensor 242 senses an indication for direct line of sight between a transmission by transmitter 106 and airborne vehicle 102.
  • For example, long range RF transceiver 238 may sense a radio signal transmitted by transmitter 106.
  • LIDAR detector 244 may sense a LIDAR signal transmitted by transmitter 106.
  • RADAR antenna 246 may sense a RADAR signal transmitted by transmitter 106.
  • thermal detector 248 may sense a thermal signal transmitted by transmitter 106.
  • acoustic detector 250A may detect an acoustic signal transmitted by transmitter 106.
  • US detector 250B may detect a US signal transmitted by transmitter 106.
  • camera 252 may detect an optical signal transmitted by transmitter 106.
  • Processor 230 applies at least one of the above detected signals to determine binary indications for direct line of sight 112A(T1), 112B(T1), and 112C(T1) between transmitter 106 and airborne vehicle 102 while airborne vehicle 102 is positioned at geo-position P(T1), as shown in Figure 1A.
  • Similarly, processor 230 applies at least one of the above detected signals to determine binary indications for direct line of sight 112A(T2) and 112B(T2) between transmitter 106 and airborne vehicle 102 while airborne vehicle 102 is positioned at geo-position P(T2), as shown in Figure 1C.
  • topographical data for the geographical region is obtained.
  • processor 230 obtains topographical data for geographical region 104 from memory 264.
  • In procedure 426, responsive to sensing the indication for the direct lines of sight while the airborne vehicle is positioned at the geo-position associated with the one layer of access, i) the binary indication for the direct lines of sight is combined with ii) state data including at least the associated geo-position of the at least one airborne vehicle and iii) the topographical data for the geographical region, to compute a subset of the geographical region.
  • the subset of the geographic region includes at least one potential point of the geographical region defining a potential line of sight between the transmission and the airborne vehicle positioned at the associated geo-position.
  • Each associated layer of access described above includes the subset of the geographical region.
  • Transmission sensor 242 senses respective indications for direct lines of sight 112A(T1), 112B(T1), and 112C(T1). Responsive to the sensing by transmission sensor 242, processor 230 obtains state data including at least geo-position P(T1) from geo-positioning unit 254. Processor 230 additionally obtains topographical data for geographical region 104 from memory 264.
  • Processor 230 combines i) the indications for direct lines of sight 112A(T1), 112B(T1), and 112C(T1), ii) the state data including at least geo-position P(T1), and iii) the topographical data to compute the subset comprising the union of sub-regions 108A(T1), 108B(T1), and 108C(T1) of geographical region 104.
  • The subset of geographic region 104, comprising the union of sub-regions 108A(T1), 108B(T1), and 108C(T1), includes at least one potential point of geographical region 104 defining a potential line of sight between the transmission by transmitter 106 and airborne vehicle 102 positioned at the associated geo-position P(T1).
  • Layer of access Layer(P(T1)) 220 includes three representative sub-regions 208A(T1), 208B(T1) and 208C(T1) of representation 204 of geographical region 104, corresponding to respective sub-regions 108A(T1), 108B(T1) and 108C(T1) of peaks 108A, 108B, and 108C, shown in Figure 1A.
  • Transmission sensor 242 similarly senses respective indications for direct lines of sight 112A(T2) and 112B(T2). Responsive to the sensing by transmission sensor 242, processor 230 obtains state data including at least geo-position P(T2) from geo-positioning unit 254. Processor 230 additionally obtains topographical data for geographical region 104 from memory 264.
  • Processor 230 combines i) the indications for direct lines of sight 112A(T2) and 112B(T2), ii) the state data including at least geo-position P(T2), and iii) the topographical data to compute a subset comprising the union of sub-regions 108A(T2) and 108B(T2) of geographical region 104.
  • The subset of geographic region 104, comprising the union of sub-regions 108A(T2) and 108B(T2), includes at least one potential point of geographical region 104 defining a potential line of sight between the transmission by transmitter 106 and airborne vehicle 102 positioned at the associated geo-position P(T2).
  • Layer of access Layer(P(T2)) 222 includes two representative sub-regions 208A(T2) and 208B(T2) of representation 204 of geographical region 104, corresponding to respective sub-regions 108A(T2) and 108B(T2) of peaks 108A and 108B, shown in Figure 1C.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Traffic Control Systems (AREA)

Abstract

A method for automatically determining a potential transmission region, comprising: acquiring multiple binary indications for a direct line of sight between a transmitter and an airborne vehicle respective of multiple geo-positions of the airborne vehicle; acquiring each of the multiple geo-positions respective of each of the multiple binary indications for the direct line of sight; for each one of the acquired geo-positions, determining a layer of access respective of topographical data for the geographical region and the acquired geo-position, as a subset of the geographical region that includes at least one potential point defining a potential line of sight between the transmitter and the airborne vehicle, thereby determining multiple layers of access; determining an intersection of the multiple layers of access; and determining, from the intersection, the potential transmission region respective of the transmitter and the geo-positions of the airborne vehicle.

Description

TRANSMISSION DETECTION USING LINE OF SIGHT
FIELD OF THE DISCLOSED TECHNIQUE
The disclosed technique relates to transmission detection, in general, and to systems and methods for transmission detection using line of sight measurements, in particular.
BACKGROUND OF THE DISCLOSED TECHNIQUE
Detection systems for mobile objects, such as armored vehicles and the like, typically employ one or more sensors to detect a transmission directed towards the vehicle, such as a fired projectile, or a transmission by a tracking device, and the like. Such sensors may comprise receivers for detecting visual signals, acoustic signals, RADAR signals, and the like. On detecting one or more transmissions directed at the vehicle, these systems might issue a warning to the driver of the vehicle. Additionally, such systems might apply topographical data to compute for the vehicle an alternative route that is outside the transmitting range of the detected transmission.
US Patent 9,163,949B2 to Andersson, entitled "Dynamic Route Planner", and US Patent 9,255,808 B2 to Andersson, entitled "Route Planning System and Method for Minimizing Exposure to Threats", are directed to systems and methods for determining one or more alternative routes for a vehicle, responsive to detecting a shot fired at the vehicle. Based on the firing history, the system may issue a warning, and redirect the vehicle to the alternative route.
SUMMARY OF THE PRESENT DISCLOSED TECHNIQUE
It is an object of the disclosed technique to provide a novel method and system for automatically determining at least one potential transmission region.
In accordance with the disclosed technique, there is thus provided a method for automatically determining at least one potential transmission region, comprising: building multiple layers of access respective of a geographical region and a transmission, wherein each layer of access of the multiple layers of access is associated with a different geo-position of at least one airborne vehicle; computing an intersection of the multiple layers of access; and determining, from the intersection, the at least one potential transmission region respective of the transmission and each of the different geo-positions of the at least one airborne vehicle, wherein building each one of the multiple layers of access comprises: sensing state data of the at least one airborne vehicle comprising at least the geo-position associated with the one layer of access, sensing an indication for a direct line of sight between the transmission and the at least one airborne vehicle, while the at least one airborne vehicle is positioned at the geo-position associated with the one layer of access, obtaining topographical data for the geographical region, and responsive to sensing the indication for the direct line of sight, combining i) the indication for the direct line of sight, ii) the state data of the at least one airborne vehicle comprising at least the geo-position associated with the one layer of access, and iii) the topographical data for the geographical region to compute a subset of the geographical region that includes at least one potential point of the geographical region defining a potential line of sight between the transmission and the at least one airborne vehicle positioned at the geo-position associated with the one layer of access, the one layer of access comprising the subset of the geographical region.
In some embodiments, the state data further includes a parameter selected from the group consisting of: a time stamp associated with the sensing of the indication for the direct line of sight; a heading measurement for the at least one airborne vehicle; an altitude measurement for the at least one airborne vehicle; a pose measurement for the at least one airborne vehicle; a velocity measurement for the at least one airborne vehicle; and an acceleration measurement for the at least one airborne vehicle.
In some embodiments, the transmission is selected from the group consisting of: a threat signal; and a distress signal.
In some embodiments, the at least one airborne vehicle is selected from the group consisting of: a fixed wing aircraft; a helicopter; a drone; and a surveillance balloon.
In some embodiments, the transmission is selected from the group consisting of: an optical signal; a radio signal; a LIDAR signal; a RADAR signal, an acoustic signal, an ultrasound signal, a thermal signal; and any of a distance, amplitude, velocity, and acceleration associated with a projectile.
There is further provided, in accordance with an embodiment, an automatic transmission detection system for use with an airborne vehicle, comprising: at least one processor configured to: build multiple layers of access respective of a geographical region and a transmission, wherein each of the multiple layers of access is associated with a different geo-position of at least one airborne vehicle; compute an intersection of the multiple layers of access; and determine, from the intersection, at least one potential transmission region respective of each of the different geo-positions of the at least one airborne vehicle, wherein building each one of the multiple layers of access comprises: responsive to a sensing of an indication for a direct line of sight between the transmission and the at least one airborne vehicle while the at least one airborne vehicle is positioned at the geo-position associated with the one layer of access, combining i) the indication for the direct line of sight, ii) state data of the at least one airborne vehicle comprising at least the geo-position associated with the one layer of access, and iii) topographical data for the geographical region, to compute a subset of the geographical region that includes at least one potential point of the geographical region defining a potential line of sight between the transmission and the at least one airborne vehicle positioned at the geo-position associated with the one layer of access, the one layer of access comprising the subset of the geographical region; a memory unit configured to store the topographical data for the geographical region; at least one first sensor configured to sense the state data comprising at least each of the associated geo-positions of the at least one airborne vehicle; and at least one second sensor configured to sense each of the indications for direct line of sight between the transmission and the at least one airborne vehicle, while the at least one airborne vehicle is positioned at each of the geo-positions associated with each the respective layer of access.
In some embodiments, the system further comprises the at least one airborne vehicle, the at least one airborne vehicle provided with the at least one processor, the memory unit, the at least one first sensor, and the at least one second sensor.
In some embodiments, the at least one first sensor is selected from the group consisting of: a compass, a GPS unit, a 3D accelerometer, a gyroscope, and a camera.
In some embodiments, the at least one second sensor is selected from the group consisting of: a long range radio antenna, a LIDAR detector, a RADAR antenna, a thermal detector, an acoustic detector, an ultrasound detector, and a camera.
In some embodiments, the system further comprises a user interface configured to display the at least one potential transmission region respective of each of the different geo-positions of at least one airborne vehicle.
BRIEF DESCRIPTION OF THE DRAWINGS
The disclosed technique will be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings in which:
Figures 1A-1G are schematic illustrations of a system for detecting a potential transmitting region respective of at least one airborne vehicle moving over a geographical region, constructed and operative in accordance with an embodiment of the disclosed techniques;
Figure 2 is a schematic illustration of the system of Figures 1A-1G, constructed and operative in accordance with another embodiment of the disclosed techniques;
Figures 3A-3G are schematic illustrations of another system for detecting a potential transmitting region respective of at least one airborne vehicle moving over a geographical region, constructed and operative in accordance with an embodiment of the disclosed techniques;
Figures 4A-4B, taken together, are a schematic illustration of a method for determining a potential transmission region, constructed and operative in accordance with an embodiment of the disclosed technique.
DETAILED DESCRIPTION OF THE EMBODIMENTS
The disclosed technique overcomes the disadvantages of the prior art by providing a system and method for detecting a potential transmitting region respective of an airborne vehicle moving over a geographical region. The airborne vehicle is provided with one or more sensors that sense any combination of radio, optical, acoustic, thermal, LIDAR, and RADAR transmission signals that are transmitted by a transmitter positioned somewhere in the geographical region. The transmission signals may be communicated from a transmitter to the airborne vehicle for tracking purposes, attack purposes, alert or distress purposes, and the like. Additionally, each airborne vehicle is provided with one or more sensors that sense the geo-position of the vehicle at a given point in time, where geo-position is understood to be the real-world geolocation of the airborne vehicle respective of the elevation, longitude and latitude geo-coordinates. As the geo-position of the airborne vehicle changes over time, the sensors operative to detect the transmissions external to the airborne vehicle detect different indications of line of sight respective of the transmission signals. A processor may be provided with the airborne vehicle to determine a potential transmission region for the transmitter by combining multiple layers of access respective of the different geo-positions of the airborne vehicle. The processor builds each layer of access by combining a topographical map of the geographical region with the respective geo-position of the airborne vehicle and the line of sight indications detected at the respective geo-position. As the processor combines additional layers of access, the potential transmission region is further reduced to provide a more accurate assessment of the position of the transmitter. This may be useful for assessing the location of a threat, such as an enemy tracking system or a projectile. Alternatively, determining the position of the transmitter may be useful for assessing the location of an ally, such as when the ally transmits a distress signal.
Line of sight (LOS), as referred to herein, is defined as the unobstructed straight line between a transmitter and a sensor. Therefore, when the sensor senses a transmission by the transmitter, this is a binary indication for direct line of sight, as defined in this application, validating the assumption that an unobstructed straight line exists between the transmitter and the sensor. Reference is now made to Figures 1A-1G which are schematic illustrations of a system 100 for detecting a potential transmitting region respective of a transmitter 106 and at least one airborne vehicle 102 moving over a geographical region 104, constructed and operative in accordance with an embodiment of the disclosed techniques. Geographical region 104 includes one or more peaks and valleys, such as peaks 108A, 108B, and 108C. Figure 1A illustrates system 100 at a time T1 with airborne vehicle 102 positioned at a geo-position P(T1), and Figure 1B illustrates a layer of access, Layer(P(T1)), corresponding to geo-position P(T1) of Figure 1A. Figure 1C illustrates system 100 at a time T2, after airborne vehicle 102 has moved to another geo-position P(T2), and Figure 1D illustrates a layer of access, Layer(P(T2)), corresponding to geo-position P(T2) of Figure 1C.
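By way of illustration only, and not as text from the disclosure, a binary line-of-sight test of this kind can be sketched in a few lines of Python, assuming the topographical data is a gridded digital terrain model (DTM) held as a 2D elevation array; the function name, grid indexing and sample count are invented for the example.

```python
import numpy as np

def has_line_of_sight(dtm, ground_cell, air_pos, samples=256):
    """Return True when the straight line from a candidate ground cell to the
    airborne vehicle clears the terrain stored in dtm (a 2D elevation grid).

    ground_cell: (row, col) of the candidate transmitter cell.
    air_pos:     (row, col, altitude) of the airborne vehicle.
    """
    r0, c0 = ground_cell
    z0 = dtm[r0, c0]                      # transmitter assumed at terrain height
    r1, c1, z1 = air_pos
    for t in np.linspace(0.0, 1.0, samples)[1:-1]:
        r = r0 + t * (r1 - r0)
        c = c0 + t * (c1 - c0)
        z = z0 + t * (z1 - z0)            # height of the sight line at this sample
        if dtm[int(round(r)), int(round(c))] > z:
            return False                  # terrain blocks the line of sight
    return True
```

The sample count trades accuracy for speed; in practice it would be chosen from the DTM cell size and the distance between the two points.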
System 100 includes at least one computer 110. In one embodiment, computer 110 is provided with airborne vehicle 102. In another embodiment (not shown), computer 110 is provided separately from airborne vehicle 102, and is in communication with airborne vehicle 102 to receive one or more indications of a sensed transmission and sensed geo-positioning data respective of airborne vehicle 102. Although the system and methods described hereinbelow will relate to one airborne vehicle 102, the invention is applicable to multiple airborne vehicles 102 as well. A transmitter 106 is positioned at one of multiple potential transmitting regions within geographical region 104, such as on top of one of peaks 108A, 108B and 108C, each of which constitutes a potential transmitting region. Transmitter 106 may be any of a LIDAR, RADAR, infrared (IR), radio, optical, acoustic, ultrasound (US), thermal or electric transmitter. In one implementation, the transmission is associated with a tracking system or a guided projectile system of an opponent. Alternatively, in another implementation, the transmission is a distress signal from an ally.
Computer 110 includes hardware componentry and software for acquiring the respective geo-position of airborne vehicle 102 from a geo-positioning sensor (not shown) provided with airborne vehicle 102 as the respective geo-position of airborne vehicle 102 changes with time, i.e. from time T1 to time T2. Additionally, computer 110 includes hardware componentry and software for acquiring indications of detections by a transmission sensor (not shown) provided with airborne vehicle 102 as the respective geo-position of airborne vehicle 102 changes with time. The transmission sensor detects one or more transmissions transmitted by transmitter 106 in the direction of airborne vehicle 102. Computer 110 is operative to apply the indications of the detected transmissions to determine one or more indications for line of sight respective of transmitter 106 and the current geo-position of airborne vehicle 102. Computer 110 additionally includes hardware componentry and software for storing topographical data respective of geographical region 104, as well as software instructions that are executable by computer 110. These and other components of computer 110, the transmission sensor and the geo-positioning sensor will be described in greater detail hereinbelow in conjunction with Figure 2. As airborne vehicle 102 moves and the respective geo-position of airborne vehicle 102 changes over time, computer 110 builds multiple layers of access respective of geographical region 104 and transmissions by transmitter 106. Each of the multiple layers of access is associated with a different geo-position of airborne vehicle 102. A description of the layers of access respective of each geo-location of airborne vehicle 102 now follows:
With reference to Figure 1A, at time T1, airborne vehicle 102 is at geo-position P(T1). While airborne vehicle 102 is positioned at geo-position P(T1), the transmission sensor (not shown) provided with airborne vehicle 102 senses one or more transmissions transmitted by transmitter 106 external to airborne vehicle 102. Computer 110 obtains indications, such as a binary indication of the sensed transmissions at time T1. The transmission may be any combination of: a LIDAR, RADAR, IR, radio, optical, acoustic, ultrasound, thermal or electric signal. Airborne vehicle 102 additionally is provided with at least one geo-positioning sensor (not shown) for sensing state data corresponding to the physical state and geo-position of airborne vehicle 102. The geo-positioning sensor acquires the physical state data, including at least geo-position P(T1), i.e. the geo-position of airborne vehicle 102 at time T1, and may additionally include parameters such as the pose, velocity, acceleration, heading and timestamp T1 respective of the time of sensing. Computer 110 acquires the state data from the geo-positioning sensor and determines, from the state data and the detected transmission, one or more indications for direct line of sight between transmitter 106 and airborne vehicle 102 positioned at geo-position P(T1), such as indications for direct lines of sight 112A(T1) respective of peak 108A, 112B(T1) respective of peak 108B, and 112C(T1) respective of peak 108C. Responsive to sensing the indications for direct line of sight 112A(T1), 112B(T1), and 112C(T1), computer 110 combines i) the state data, including at least P(T1), with ii) indications for direct line of sight 112A(T1), 112B(T1), and 112C(T1) and iii) the topographical data for geographical region 104 to compute a first subset of geographical region 104.
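As a rough picture of the state data record described above, the following minimal sketch is offered; the field names, types and units are assumptions made purely for illustration and are not definitions from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class StateData:
    """Illustrative container for the state data sensed at one geo-position."""
    timestamp: float           # time of sensing, e.g. T1
    latitude: float            # geo-position components
    longitude: float
    elevation: float
    heading: float = 0.0       # optional parameters named in the text
    velocity: float = 0.0
    acceleration: float = 0.0
```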
The first subset of geographical region 104 is indicated in Figure 1A by the union of sub-regions 108A(T1), 108B(T1), and 108C(T1) respective of peaks 108A, 108B, and 108C. Each of sub-regions 108A(T1), 108B(T1) and 108C(T1) includes at least one potential point within geographical region 104 that defines a potential line of sight between the transmission by transmitter 106 and airborne vehicle 102 while positioned at P(T1). Each potential point is a potential location for transmitter 106, respective of geo-position P(T1), i.e. potential point 114A(T1) is determined respective of sub-region 108A(T1), potential point 114B(T1) is determined respective of sub-region 108B(T1), and potential point 114C(T1) is determined respective of sub-region 108C(T1). Each potential line of sight describes an unobstructed, observational perspective from each potential point to airborne vehicle 102. It is to be noted that Figure 1A indicates three potential points respective of three sub-regions for illustrative purposes only. However, each sub-region may include multiple potential points. Computer 110 builds a layer of access Layer(P(T1)) associated with geo-position P(T1) that includes at least sub-regions 108A(T1), 108B(T1) and 108C(T1). A more detailed description of Layer(P(T1)) is given below with respect to Figure 1B.
Referring to Figure 1B, which is associated with Figure 1A, a schematic illustration of layer of access Layer(P(T1)) built by computer 110 and overlaid on a digital terrain map (DTM) 204 of geographical region 104 is shown. DTM 204 may be provided in advance as a database and stored at a storage device of computer 110. Alternatively, computer 110 may build DTM 204 in real time based on data acquired via a sensor, such as any of the components provided with transmission sensor 242 described hereinbelow with respect to Figure 2, i.e. via a camera, RADAR antenna, LIDAR detector, radio receiver (i.e. antenna), acoustic detector, sonar (US) detector, thermal detector, and the like. Layer(P(T1)) is built respective of geo-position P(T1) of airborne vehicle 102 at time T1. Layer of access Layer(P(T1)) includes the union of three representative sub-regions 208A(T1), 208B(T1) and 208C(T1) of DTM 204 of geographical region 104, corresponding to respective sub-regions 108A(T1), 108B(T1) and 108C(T1) of peaks 108A, 108B, and 108C. Peaks 208A, 208B, and 208C represent peaks 108A, 108B, and 108C, and are obtained from the topographical data of geographical region 104 stored in a memory unit of computer 110. Each of representative sub-regions 208A(T1), 208B(T1) and 208C(T1) includes at least one potential point representing a potential line of sight between the transmission by transmitter 106 and airborne vehicle 102 while positioned at geo-position P(T1), and resulting from sub-regions 108A(T1), 108B(T1) and 108C(T1) of peaks 108A, 108B, and 108C being within direct line of sight of airborne vehicle 102 at geo-position P(T1).
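Continuing the illustrative Python sketch above, a layer of access can be pictured as a boolean mask over the DTM grid, and would only be computed for a geo-position at which the binary line-of-sight indication was actually sensed; the helper below reuses the hypothetical has_line_of_sight function and is a sketch, not an implementation taken from the disclosure.

```python
def layer_of_access(dtm, air_pos):
    """Boolean mask over the DTM grid: True at every ground cell that has a
    potential line of sight to the airborne vehicle at
    air_pos = (row, col, altitude)."""
    rows, cols = dtm.shape
    layer = np.zeros((rows, cols), dtype=bool)
    for r in range(rows):
        for c in range(cols):
            layer[r, c] = has_line_of_sight(dtm, (r, c), air_pos)
    return layer
```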
With reference to Figure 1C, at time T2, airborne vehicle 102 moves to a second geo-position, P(T2). While airborne vehicle 102 is positioned at geo-position P(T2), the transmission sensor of airborne vehicle 102 senses one or more transmissions transmitted by transmitter 106. The transmission may be any combination of: a LIDAR, RADAR, IR, radio, optical, acoustic, ultrasound, thermal or electric signal. Computer 110 obtains indications, such as a binary indication of the sensed transmissions at time T2. The geo-positioning sensor provided with airborne vehicle 102 senses state data that includes at least geo-position P(T2) of airborne vehicle 102, i.e. the geo-position of airborne vehicle 102 at time T2, and may additionally include the pose, velocity, acceleration, heading and timestamp T2 respective of the time of sensing. Computer 110 acquires the state data from the geo-positioning sensor and determines from the detected transmission one or more indications for direct line of sight between a transmission transmitted by transmitter 106 and airborne vehicle 102 positioned at P(T2). While positioned at P(T2), peaks 108A and 108B are within the line of sight from airborne vehicle 102, however peak 108C is outside the line of sight from airborne vehicle 102. Accordingly, computer 110 senses indications for direct line of sight 112A(T2) and 112B(T2), respective of peaks 108A and 108B. Responsive to sensing the indications for direct line of sight 112A(T2) and 112B(T2), computer 110 combines i) the state data, including geo-position P(T2), with ii) indications for direct line of sight 112A(T2) and 112B(T2), and iii) the topographical data for geographical region 104 to compute a second subset of geographical region 104.
The second subset of geographical region 104 is indicated in Figure 1C by the union of sub-regions 108A(T2) and 108B(T2), respective of peaks 108A and 108B. Each of sub-regions 108A(T2) and 108B(T2) includes at least one potential point within geographical region 104 that defines a potential line of sight between the transmission by transmitter 106 and airborne vehicle 102 while positioned at P(T2). Each potential point is a potential location for transmitter 106, respective of geo-position P(T2), i.e. potential point 114A(T2) is determined respective of sub-region 108A(T2), and potential point 114B(T2) is determined respective of sub-region 108B(T2). As with Figure 1A, each sub-region may include multiple potential points. Computer 110 builds a layer of access Layer(P(T2)) associated with geo-position P(T2) that includes at least sub-regions 108A(T2) and 108B(T2).
Referring to Figure 1D, which is associated with Figure 1C, a schematic illustration of layer of access Layer(P(T2)), as built by computer 110 respective of geo-position P(T2) of airborne vehicle 102 at time T2, is shown. Layer of access Layer(P(T2)) includes the union of only two representative sub-regions 208A(T2) and 208B(T2) of DTM 204 of geographical region 104, corresponding to respective sub-regions 108A(T2) and 108B(T2) of Figure 1C. Each of representative sub-regions 208A(T2) and 208B(T2) includes at least one potential point representing a potential line of sight between the transmission by transmitter 106 and airborne vehicle 102 while positioned at geo-position P(T2), and resulting from only sub-regions 108A(T2) and 108B(T2) of peaks 108A and 108B being within direct line of sight of airborne vehicle 102 at geo-position P(T2).
As shown in Figure 1E, computer 110 creates a superimposition 224 by overlaying layer of access Layer(P(T1)), associated with geo-position P(T1), with layer of access Layer(P(T2)), associated with geo-position P(T2). Superimposition 224 includes representative sub-regions 208A(T1), 208B(T1), and 208C(T1) overlaid on representative sub-regions 208A(T2) and 208B(T2). As a result of the overlaying, superimposition 224 includes representative overlap regions 224A and 224B, corresponding to the non-zero overlap, or intersection, between representative sub-regions 208A(T1) and 208B(T1) and representative sub-regions 208A(T2) and 208B(T2), i.e. those points that are common to both representative sub-regions 208A(T1) and 208B(T1) and representative sub-regions 208A(T2) and 208B(T2). Since peak 208C was not within line of sight of airborne vehicle 102 at geo-position P(T2), the overlap of sub-region 208C(T1) of Layer(P(T1)) 220 with Layer(P(T2)) 222 is the null set.
As shown in Figure 1F, computer 110 determines intersection 226 by extracting representative overlap regions 224A and 224B from superimposition 224 of Figure 1E and overlaying overlap regions 224A and 224B on DTM 204. Computer 110 determines from representative overlap regions 224A and 224B of intersection 226 at least one potential transmission region respective of each of the different geo-positions, i.e. P(T1) and P(T2), of airborne vehicle 102. In this manner, computer 110 narrows down the potential transmission region from the entire geographic region 104 to sub-regions within geographic region 104 corresponding to representative overlap regions 224A and 224B of intersection 226.
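The superimposition and intersection step can likewise be pictured, again only as a sketch over the boolean masks assumed above, as a running logical AND of the layers of access:

```python
def potential_transmission_region(layers):
    """Intersect the layers of access built at different geo-positions; cells
    that remain True are the remaining potential transmitter locations."""
    region = layers[0].copy()
    for layer in layers[1:]:
        region &= layer        # overlap of the superimposed layers (logical AND)
    return region
```

Because each additional layer can only switch cells off, every further geo-position either leaves the potential transmission region unchanged or shrinks it, which is exactly the narrowing behaviour described above.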
As airborne vehicle 102 moves to additional geo-positions over time, additional indications, such as binary indications for lines of sight respective of the additional geo-positions, are sensed. Computer 110 may build additional layers of access associated with the additional geo-positions. Similarly, additional layers of access may be built by additional computers associated with additional airborne vehicles (not shown).
Referring to Figure 1G, system 100 is illustrated corresponding to Figure 1F. Computer 110 of airborne vehicle 102 uses representative overlap regions 224A and 224B of respective Layer(P(T1)) and Layer(P(T2)) to determine each of regions 124A and 124B on respective peaks 108A and 108B as a potential transmission region for transmitter 106. Potential transmission regions 124A and 124B are defined respective of the geo-position of airborne vehicle 102 at time T1, i.e. P(T1), indicated by an "X" 126, and the geo-position of airborne vehicle 102 at time T2, i.e. P(T2), indicated by an "X" 128. Optionally, computer 110 of airborne vehicle 102 displays indications for regions 124A and 124B on a user interface provided with computer 110. In one embodiment, the user interface is an augmented reality display, and regions 124A and 124B are displayed as features overlaid on a real-time rendition of geographic region 104. A description of a system and method for determining regions 124A and 124B on respective peaks 108A and 108B now follows.
Reference is now made to Figure 2, which, while still referencing Figures 1A-1G, is a schematic illustration of a system 200 including a computer, generally referenced 210, a transmission sensor 242, and a geo-positioning sensor 254, constructed and operative in accordance with another embodiment of the disclosed techniques. Computer 210, transmission sensor 242, and geo-positioning sensor 254 are provided by way of example, and are representative, respectively, of computer 110, the transmission sensor, and the geo-positioning sensor described above with respect to Figures 1A-1G.
Computer 210 includes at least one processor 230. Processor 230 may be any of a CPU, GPU, APU, digital signal processor (DSP), and the like. Computer 210 additionally includes a memory 264 and a user interface 266. Memory 264 and user interface 266 are electrically coupled to processor 230.
Transmission sensor 242 includes one or more of: a medium range RF transceiver 234 (e.g., WIFI), a long range transceiver 238, such as in the FM or AM range, and a cellular communication transceiver 240 (e.g., GSM, LTE, WIMAX). In one implementation, transmission sensor 242 is operative to send and receive radio frequency (RF) signals relating to data and executable instructions for executing one or more of the procedures described herein. Transmission sensor 242 may additionally include one or more of: a LIDAR detector 244 operative to detect a Light Detection and Ranging (LIDAR) signal, a RADAR antenna 246 operative to detect a RADAR signal, a thermal detector 248 operative to detect a thermal signal, an acoustic detector 250A operative to detect acoustic signals, an ultrasound (US) detector 250B operative to detect ultrasound signals, and one or more cameras 252 configured to detect spectra within any of the visible, infrared, or ultra-violet range. Camera 252 may be any of a monocular camera, a stereoscopic camera, a scanning camera, and combinations thereof.
Geo-positioning sensor 254 is operative to perform real-time spatial tracking and geo-positioning of any combination of absolute and relative translational and rotational motion of airborne vehicle 102. Geo-positioning sensor 254 includes at least a 3D accelerometer unit 256, a gyroscope 258, and a compass 260, which are collectively operative to perform 9-axis inertial measurements respective of airborne vehicle 102. Geo-positioning sensor 254 additionally includes a GPS receiver 262 operative to receive global geo-positioning measurements respective of airborne vehicle 102.
Memory unit 264 is operative to store one or more software instructions executable by processor 230. Memory unit 264 is additionally operative to store topographical data respective of a geographical region, such as region 104. Memory unit 264 is additionally operative to store any results from executing the software instructions, such as layers of access Layer(P(T1)) 220 and Layer(P(T2)), superimposition 224 and intersection 226 described above with respect to Figures 1B and 1D-1F. User interface 266 is operative to display one or more graphical features, such as Layer(P(T1)) and Layer(P(T2)), superimposition 224 and intersection 226 respective of a representation 204 of geographical region 104. User interface (UI) 266 includes at least a display for displaying any of layers of access Layer(P(T1)) and Layer(P(T2)), superimposition 224 and intersection 226. UI 266 may be any of a heads-up display, augmented reality display, virtual reality display, flat screen display, and the like.
In one embodiment, transmission sensor 242 and the components included therein described above are coupled to processor 230 via respective converters, i.e. analog to digital converters (ADCs) and digital to analog converters (DACs) (not shown). Alternatively, in another embodiment, transmission sensor 242 and the components included therein are coupled to computer 210 wirelessly, such as via electromagnetic or optical communications means. Geo-positioning sensor 254 and the components included therein described above are communicatively coupled to processor 230, such as via wired or wireless, e.g. electromagnetic or optical, communications means. Memory unit 264 and UI 266 are electrically coupled to processor 230.
Processor 230 determines any of the absolute and relative geo-position of airborne vehicle 102 according to any combination of measurements acquired by geo-positioning unit 254 and transmission sensor 242. For example, processor 230 may apply one or more GPS measurements acquired by GPS 262 with one or more measurements acquired by any of compass 260, 3D accelerometer 256, and gyroscope 258, and one or more transmissions received by transmission sensor 242, such as images acquired by camera 252, to determine a relative or absolute geo-positioning of airborne vehicle 102 respective of geographical region 104 and transmitter 106. Processor 230 may additionally apply data received via an operating system configured with computer 210 to determine a relative or absolute geo-positioning of airborne vehicle 102 respective of geographical region 104 and transmitter 106.
In general, processor 230 receives translational and rotational motion information from 3D accelerometer 256 and gyroscope 258, and changes in absolute orientation from compass 260. Additionally, processor 230 periodically receives positioning information from GPS 262, and one or more images from camera 252. Processor 230 applies the information received from compass 260, 3D accelerometers 256, gyroscope 258, and GPS 262, and optionally camera 252 to determine and update the geo-position of airborne vehicle 102. Gyroscope 258 measures rotational velocity of airborne vehicle 102. 3D accelerometer 256 measures the acceleration of airborne vehicle 102. Processor 230 acquires the data measured by geo-positioning sensor 254 to compute the elevation and the roll of the mechanical frame of airborne vehicle 102 with respect to the vertical direction. Compass 260 measures the Earth's magnetic field vector affecting airborne vehicle 102. Additionally, one or more signals received via transceiver 232 may be used to correct for drift, noise and the like.
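The disclosure does not spell out a particular fusion algorithm for these measurements; purely as a simplified illustration of blending dead-reckoned motion with periodic GPS fixes, one might write something like the following, where the function name, the blending weight and the local 3-vector convention are assumptions.

```python
import numpy as np

def update_position(prev_pos, velocity, dt, gps_fix=None, gps_weight=0.9):
    """Dead-reckon the geo-position from the last estimate and the measured
    velocity, then blend with a GPS fix when one is available.
    prev_pos, velocity and gps_fix are 3-vectors, e.g. local east/north/up in metres."""
    predicted = np.asarray(prev_pos, dtype=float) + np.asarray(velocity, dtype=float) * dt
    if gps_fix is None:
        return predicted                  # no fix this cycle, keep the prediction
    return gps_weight * np.asarray(gps_fix, dtype=float) + (1.0 - gps_weight) * predicted
```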
Processor 230 periodically receives indications of any of: a LIDAR signal detected via LIDAR detector 244; a RADAR signal detected via RADAR antenna 246; a thermal signal detected via thermal detector 248; an acoustic signal detected via acoustic detector 250A; an ultrasound signal detected via US detector 250B; and one or more images acquired via camera 252, to detect an indication for potential line-of-sight respective of a transmission by transmitter 106.
Reference is now made to Figures 3A-3G which, taken together, are a schematic illustration of a system 300 for detecting a potential transmitting region respective of at least one airborne vehicle 302 moving over a geographical region 304, constructed and operative in accordance with another embodiment of the disclosed techniques. System 300 is substantially similar to system 100 described above with respect to Figures 1A-1G. System 300 includes at least one airborne vehicle 302 provided with a computer 310. Computer 310 may be represented by computer 210 of Figure 2. System 300 additionally includes a transmission sensor (not shown) and a geo-positioning sensor (not shown) provided with airborne vehicle 302, such as corresponding to transmission sensor 242 and geo-positioning sensor 254 of Figure 2.
With reference to Figure 3A, at a time T1, airborne vehicle 302 is positioned at a geo-position P302(T1). While airborne vehicle 302 is positioned at geo-position P302(T1), the transmission sensor of airborne vehicle 302 senses one or more transmissions transmitted by a transmitter (not shown). Computer 310 acquires an indication, such as a binary indication of the sensed transmission, and determines from the one or more transmissions sensed at time T1 one or more indications for direct line of sight between the transmitter and airborne vehicle 302. The geo-positioning sensor of airborne vehicle 302 senses state data that includes at least geo-position P302(T1) of airborne vehicle 302 at time T1. Computer 310 acquires the state data, and applies a DTM of geographical region 304 to build a layer of access 320(T1), using the techniques described above with respect to Figures 1A-1G. Layer of access 320(T1) includes the union of sub-regions 308A(T1), 308B(T1), 308C(T1), and 308D(T1). Each of sub-regions 308A(T1), 308B(T1), 308C(T1), and 308D(T1) includes at least one potential point within geographical region 304 that defines a potential line of sight between the transmission and airborne vehicle 302 while positioned at P302(T1).
With reference to Figure 3B, at a time T2, airborne vehicle 302 has moved and is now positioned at a geo-position P302(T2). While airborne vehicle 302 is positioned at geo-position P302(T2), the transmission sensor of airborne vehicle 302 senses one or more transmissions transmitted by the transmitter. Computer 310 acquires an indication, such as a binary indication of the sensed transmission, and determines from the one or more transmissions sensed at time T2 one or more indications for direct line of sight between the transmitter and airborne vehicle 302. The geo-positioning sensor of airborne vehicle 302 senses state data that includes at least geo-position P302(T2) of airborne vehicle 302 at time T2. Computer 310 acquires the state data, and applies a DTM of geographical region 304 to build a layer of access 320(T2), using the techniques described above with respect to Figures 1A-1G. Layer of access 320(T2) includes the union of sub-regions 308C(T2) and 308D(T2). Each of sub-regions 308C(T2) and 308D(T2) includes at least one potential point within geographical region 304 that defines a potential line of sight between the transmission and airborne vehicle 302 while positioned at P302(T2).
With reference to Figure 3C, computer 310 creates a superimposition 324 by overlaying layer of access 320(T1) of Figure 3A, associated with geo-position P302(T1), with layer of access 320(T2) of Figure 3B, associated with geo-position P302(T2). Superimposition 324 includes sub-regions 308A(T1), 308B(T1), and 308C(T1) overlaid on sub-regions 308C(T2) and 308D(T2). As a result of the overlaying, superimposition 324 includes overlap regions 324A, 324B, 324C, 324D, and 324E overlaid on DTM 304, and corresponding to the intersection, i.e. those points that are common to both sub-regions 308A(T1), 308B(T1), and 308C(T1) and sub-regions 308C(T2) and 308D(T2).
With reference to Figure 3D, at a time T3, airborne vehicle 302 has moved and is now positioned at a geo-position P302(T3). While airborne vehicle 302 is positioned at geo-position P302(T3), the transmission sensor of airborne vehicle 302 senses one or more transmissions transmitted by the transmitter. Computer 310 acquires an indication, such as a binary indication of the sensed transmission, and determines from the one or more transmissions sensed at time T3 one or more indications for direct line of sight between the transmitter and airborne vehicle 302. The geo-positioning sensor of airborne vehicle 302 senses state data that includes at least geo-position P302(T3) of airborne vehicle 302 at time T3. Computer 310 acquires the state data, and applies a DTM of geographical region 304 to build a layer of access 320(T3), using the techniques described above with respect to Figures 1A-1G. Layer of access 320(T3) includes the union of sub-regions 308C(T3), 308D(T3), and 308E(T3). Each of sub-regions 308C(T3), 308D(T3), and 308E(T3) includes at least one potential point within geographical region 304 that defines a potential line of sight between the transmission and airborne vehicle 302 while positioned at P302(T3).
With reference to Figure 3E, computer 310 creates a superimposition 326 by overlaying layer of access 320(T3) of Figure 3D, associated with geo-position P302(T3), with overlap regions 324A, 324B, 324C, 324D, and 324E, determined above with respect to Figure 3C. Superimposition 326 includes sub-regions 308C(T3), 308D(T3), and 308E(T3) overlaid on overlap regions 324A, 324B, 324C, 324D, and 324E. As a result of the overlaying, superimposition 326 includes overlap regions 326A, 326B, 326C, and 326D, overlaid on DTM 304, and corresponding to the intersection, i.e. those points that are common to each of sub-regions 308A(T1), 308B(T1), 308C(T1), and 308D(T1) associated with P302(T1), sub-regions 308C(T2) and 308D(T2) associated with P302(T2), and sub-regions 308C(T3), 308D(T3), and 308E(T3) associated with P302(T3).
With reference to Figure 3F, computer 310 extracts overlap regions 326A, 326B, 326C, and 326D from superimposition 326, and overlays these regions on DTM 304. As a result of the techniques described above, the potential region of transmission at time T3 is considerably smaller than at time T1.
With reference to Figure 3G, computer 310 defines a general transmission region 328, including overlap regions 326A, 326B, 326C, and 326D. Computer 310 may display any of region 328 and overlap regions 326A, 326B, 326C, and 326D on a user interface, such as UI 266 of Figure 2. In one embodiment, computer 310 displays region 328 and overlap regions 326A, 326B, 326C, and 326D on an augmented reality display of the geographical region represented by DTM 304.
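One possible way, not stated in the disclosure, to turn a surviving intersection mask into a small number of discrete display regions (such as overlap regions 326A, 326B, 326C, and 326D) is connected-component labelling; the use of scipy here is an illustrative choice made for the sketch.

```python
from scipy import ndimage

def overlap_regions(region_mask):
    """Split an intersection mask into connected overlap regions and return the
    (row, col) centre of each, e.g. for plotting on a map or an augmented
    reality overlay."""
    labels, count = ndimage.label(region_mask)              # connected components
    return ndimage.center_of_mass(region_mask, labels, list(range(1, count + 1)))
```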
Reference is now made to Figures 4A-4B which, taken together, are a schematic illustration of a method for determining a potential transmission region, constructed and operative in accordance with an embodiment of the disclosed technique. Figure 4A illustrates a general method to determine a potential transmission region, and Figure 4B illustrates a more detailed implementation of procedure 400 of Figure 4A.
With reference to Figure 4A, in procedure 400, multiple layers of access respective of a geographical region and a transmission are built, where each layer of access of the multiple layers of access is associated with a different geo-position of at least one airborne vehicle. In some embodiments, the transmission is any of: a threat signal; and a distress signal. The at least one airborne vehicle may be any of: a fixed wing aircraft; a helicopter; a drone; and a surveillance balloon. The transmission may be any combination of: an optical signal; a radio signal; a LIDAR signal; a RADAR signal; an acoustic signal; an ultrasound signal; a thermal signal; and any of a distance, amplitude and velocity associated with a projectile. With reference to the system of Figures 1A-1G, computer 110 builds multiple layers of access respective of geographical region 104 and a transmission by transmitter 106. Each of the multiple layers of access is associated with a different geo-position of airborne vehicle 102. For example, Layer(P(T1)) 220 of Figure 1B is associated with geo-position P(T1) of airborne vehicle 102 at time T1, as shown in Figure 1A, whereas Layer(P(T2)) 222 of Figure 1D is associated with geo-position P(T2) of airborne vehicle 102 at time T2, as shown in Figure 1C.
In procedure 402, an intersection of the multiple layers of access is computed. With reference to the system of Figures 1A-1G and the system of Figure 2, computer 110 creates superimposition 224, shown in Figure 1E, by overlaying layer of access Layer(P(T1)), associated with geo-position P(T1) and shown in Figure 1B, with layer of access Layer(P(T2)), associated with geo-position P(T2) and shown in Figure 1D. Thus, representative sub-regions 208A(T1), 208B(T1), and 208C(T1) defining Layer(P(T1)) of Figure 1B are overlaid on representative sub-regions 208A(T2) and 208B(T2) defining Layer(P(T2)) of Figure 1D. Computer 110 computes overlap regions 224A and 224B by determining those points that are common to both respective representative sub-regions 208A(T1), 208B(T1), and 208C(T1) and representative sub-regions 208A(T2) and 208B(T2), such as by performing a logical AND operation. Computer 110 computes intersection 226 of Figure 1F by extracting overlap regions 224A and 224B from superimposition 224 of Figure 1E, and stores superimposition 224. Computer 110 may display any of representative layers of access Layer(P(T1)) and Layer(P(T2)), superimposition 224, and intersection 226 on UI 266 of computer 210.
In procedure 404, at least one potential transmission region, respective of the transmission and each of the different geo-positions of the at least one airborne vehicle, is determined from the intersection. With reference to the system of Figure 1G, computer 110 maps overlap regions 224A and 224B of Figure 1E onto geographic region 104, as corresponding regions 124A and 124B on respective peaks 108A and 108B. Computer 110 determines that at least one of regions 124A and 124B is a potential transmission region for transmitter 106, respective of the geo-position of airborne vehicle 102 at time T1, indicated by an "X" 126, and the geo-position of airborne vehicle 102 at time T2, indicated by an "X" 128. Optionally, computer 110 displays indications for regions 124A and 124B on UI 266. In one embodiment, UI 266 is an augmented reality display, and regions 124A and 124B are displayed as features overlaid on a real-time rendition of geographic region 104.
With reference to Figure 4B, a more detailed implementation for building each one of the layers of access of procedure 400 of Figure 4A is now described.
In procedure 420, state data of the at least one airborne vehicle, comprising at least the geo-position associated with the one layer of access, is sensed. In some embodiments, the state data further includes any combination of: a time stamp indicating the time of sensing of the indication for the direct line of sight; a heading measurement for the at least one airborne vehicle; an altitude measurement for the at least one airborne vehicle; a pose measurement for the at least one airborne vehicle; a velocity measurement for the at least one airborne vehicle; and an acceleration measurement for the at least one airborne vehicle. In some embodiments, the sensed indication is a binary indication for the existence of a direct line of sight. With reference to the system of Figures 1A-1C and the system of Figure 2, processor 230 applies measurements acquired by any of 3D accelerometer 256, gyroscope 258, compass 260, GPS 262, and camera 252 to determine at least the geo-positions of airborne vehicle 102 at time T1, shown in Figure 1A, and at time T2, shown in Figure 1C. Processor 230 may additionally use any of the above measurements, and any additional measurements acquired via one or more components of transceiver 232, to determine the heading, pose, velocity, and acceleration of airborne vehicle 102.
In procedure 422, an indication, such as a binary indication, for a direct line of sight between the transmission and the at least one airborne vehicle is sensed, while the at least one airborne vehicle is positioned at the geo-position associated with the one layer of access. With reference to the system of Figures 1A-1C and the system of Figure 2, transmission sensor 242 senses an indication for direct line of sight between a transmission by transmitter 106 and airborne vehicle 102. For example, long range RF transceiver 238 may sense a radio signal transmitted by transmitter 106. Additionally or alternatively, LIDAR detector 244 may sense a LIDAR signal transmitted by transmitter 106. Additionally or alternatively, RADAR antenna 246 may sense a RADAR signal transmitted by transmitter 106. Additionally or alternatively, thermal detector 248 may sense a thermal signal transmitted by transmitter 106. Additionally or alternatively, acoustic detector 250A may detect an acoustic signal transmitted by transmitter 106. Additionally or alternatively, US detector 250B may detect a US signal transmitted by transmitter 106. Additionally or alternatively, camera 252 may detect an optical signal transmitted by transmitter 106. Processor 230 applies at least one of the above detected signals to determine binary indications for direct line of sight 112A(T1), 112B(T1), and 112C(T1) between transmitter 106 and airborne vehicle 102 while airborne vehicle 102 is positioned at geo-position P(T1), as shown in Figure 1A. Similarly, processor 230 applies at least one of the above detected signals to determine binary indications for direct line of sight 112A(T2) and 112B(T2) between transmitter 106 and airborne vehicle 102 while airborne vehicle 102 is positioned at geo-position P(T2), as shown in Figure 1C.
In procedure 424, topographical data for the geographical region is obtained. With reference to the system of Figures 1A-1B and Figure 2, processor 230 obtains topographical data for geographical region 104 from memory 264.

In procedure 426, responsive to sensing the indication for the direct lines of sight while the airborne vehicle is positioned at the geo-position associated with the one layer of access, i) the binary indication for the direct lines of sight is combined with ii) state data including at least the associated geo-position of the at least one airborne vehicle and iii) the topographical data for the geographical region, to compute a subset of the geographical region. The subset of the geographic region includes at least one potential point of the geographical region defining a potential line of sight between the transmission and the airborne vehicle positioned at the associated geo-position. Each associated layer of access described above includes the subset of the geographical region.
With reference to the system of Figures 1A, 1B and 2, while airborne vehicle 102 is positioned at geo-position P(T1), transmission sensor 242 senses respective indications for direct lines of sight 112A(T1), 112B(T1), and 112C(T1). Responsive to the sensing by transmission sensor 242, processor 230 obtains state data including at least geo-position P(T1) from geo-positioning unit 254. Processor 230 additionally obtains topographical data for geographical region 104 from memory 264. Processor 230 combines i) the indications for direct lines of sight 112A(T1), 112B(T1), and 112C(T1), ii) the state data including at least geo-position P(T1), and iii) the topographical data to compute the subset comprising the union of sub-regions 108A(T1), 108B(T1), and 108C(T1) of geographical region 104. The subset of geographic region 104, comprising the union of sub-regions 108A(T1), 108B(T1), and 108C(T1), includes at least one potential point of geographical region 104 defining a potential line of sight between the transmission by transmitter 106 and airborne vehicle 102 positioned at the associated geo-position P(T1). As shown in Figure 1B, layer of access Layer(P(T1)) 220 includes three representative sub-regions 208A(T1), 208B(T1) and 208C(T1) of representation 204 of geographical region 104, corresponding to respective sub-regions 108A(T1), 108B(T1) and 108C(T1) of peaks 108A, 108B, and 108C, shown in Figure 1A.
With reference to the system of Figures 1C, 1D and 2, while airborne vehicle 102 is positioned at geo-position P(T2), transmission sensor 242 senses respective indications for direct lines of sight 112A(T2) and 112B(T2). Responsive to the sensing by transmission sensor 242, processor 230 obtains state data including at least geo-position P(T2) from geo-positioning unit 254. Processor 230 additionally obtains topographical data for geographical region 104 from memory 264. Processor 230 combines i) the indications for direct lines of sight 112A(T2) and 112B(T2), ii) the state data including at least geo-position P(T2), and iii) the topographical data to compute a subset comprising the union of sub-regions 108A(T2) and 108B(T2) of geographical region 104. The subset of geographic region 104, comprising the union of sub-regions 108A(T2) and 108B(T2), includes at least one potential point of geographical region 104 defining a potential line of sight between the transmission by transmitter 106 and airborne vehicle 102 positioned at the associated geo-position P(T2). As shown in Figure 1D, layer of access Layer(P(T2)) includes two representative sub-regions 208A(T2) and 208B(T2) of representation 204 of geographical region 104, corresponding to respective sub-regions 108A(T2) and 108B(T2) of peaks 108A and 108B, shown in Figure 1C.
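Tying the illustrative sketches above together on synthetic data, where the terrain, flight track, altitude and grid size are all invented for the example and reuse the hypothetical helpers defined earlier:

```python
rng = np.random.default_rng(0)
dtm = rng.uniform(0.0, 500.0, size=(64, 64))                    # synthetic terrain heights
track = [(10, 10, 1500.0), (10, 40, 1500.0), (40, 40, 1500.0)]  # assumed P(T1), P(T2), P(T3)

# One layer of access per geo-position at which the transmission was sensed,
# then their intersection as the remaining potential transmission region.
layers = [layer_of_access(dtm, p) for p in track]
region = potential_transmission_region(layers)
print("candidate transmitter cells remaining:", int(region.sum()))
```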
It will be appreciated by persons skilled in the art that the disclosed technique is not limited to what has been particularly shown and described hereinabove. Rather the scope of the disclosed technique is defined only by the claims, which follow.

Claims

1. A method for automatically determining at least one potential transmission region, comprising:
acquiring multiple binary indications for a direct line of sight between a transmitter and at least one airborne vehicle respective of multiple geo-positions of said at least one airborne vehicle;
acquiring each of said multiple geo-positions of said at least one airborne vehicle respective of each of said multiple binary indications for said direct line of sight;
for each one of said acquired geo-positions of said at least one airborne vehicle for said binary indication for said direct line of sight,
determining a layer of access respective of topographical data for said geographical region and said one of said multiple acquired geo-positions, as a subset of said geographical region that includes at least one potential point of said geographical region defining a potential line of sight between said transmitter and said at least one airborne vehicle,
thereby determining multiple layers of access;
determining an intersection of said multiple layers of access; and determining, from said intersection, said at least one potential transmission region respective of said transmitter and each of said different geo-positions of said at least one airborne vehicle.
2. The method of claim 1 , wherein acquiring said geo-position further comprises acquiring data including a parameter selected from the group consisting of: a time stamp associated with said sensing of said binary indication for said direct line of sight; a heading measurement for said at least one airborne vehicle; an altitude measurement for said at least one airborne vehicle; a pose measurement for said at least one airborne vehicle; a velocity measurement for said at least one airborne vehicle; and an acceleration measurement for said at least one airborne vehicle.
3. The method of claim 1 , wherein said transmitter is configured to transmit a signal selected from the group consisting of: a threat signal; and a distress signal.
4. The method of claim 1 , wherein said at least one airborne vehicle is selected from the group consisting of: a fixed wing aircraft; a helicopter; a drone; and a surveillance balloon.
5. The method of claim 1 , wherein said transmitter is configured to transmit a signal selected from the group consisting of: an optical signal; a radio signal; a lidar signal; a radar signal, an acoustic signal, an ultrasound signal, a thermal signal; and any of a distance, amplitude, velocity, and acceleration associated with a projectile.
6. A system for automatically determining at least one potential transmission region, comprising:
a memory unit configured to store topographical data for a geographical region;
at least one first sensor configured to sense a signal from a transmitter at multiple geo-positions of at least one airborne vehicle;
at least one second sensor configured to sense said multiple geo-positions of said at least one airborne vehicle; and
at least one processor configured to:
acquire multiple binary indications for a direct line of sight between said transmitter and said at least one airborne vehicle respective of said sensed signal at said multiple geo-positions,
acquire said multiple geo-positions of said at least one airborne vehicle respective of said multiple binary indications for said direct line of sight,
for each one of said multiple acquired geo-positions of said at least one airborne vehicle respective of said binary indication for said direct line of sight,
determine a layer of access respective of said topographical data for said geographical region and said one of said multiple acquired geo-positions, as a subset of said geographical region that includes at least one potential point of said geographical region defining a potential line of sight between said transmission and said at least one airborne vehicle,
thereby determining multiple layers of access,
determine an intersection of said multiple layers of access, and determine, from said intersection, said at least one potential transmission region respective of said transmitter and each of said multiple geo-positions of said at least one airborne vehicle.
7. The system of claim 6, further comprising said at least one airborne vehicle, said at least one airborne vehicle provided with said at least one processor, said memory unit, said at least one first sensor, and said at least one second sensor.
8. The system of claim 7, wherein said at least one second sensor is selected from the group consisting of: a compass, a GPS unit, a 3D accelerometer, a gyroscope, and a camera.
9. The system of claim 7, wherein said at least one first sensor is selected from the group consisting of: a long range radio antenna, a lidar detector, a radar antenna, a thermal detector, an acoustic detector, an ultrasound detector, and a camera.
10. The system of claim 6, further comprising a user interface configured to display said at least one potential transmission region respective of each of said different geo-positions of at least one airborne vehicle.
PCT/IL2018/051191 2017-11-07 2018-11-07 Transmission detection using line of sight WO2019092706A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US16/762,436 US10989792B2 (en) 2017-11-07 2018-11-07 Transmission detection using line of sight
EP18875544.1A EP3707524B1 (en) 2017-11-07 2018-11-07 Transmission detection using line of sight

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
IL255512 2017-11-07
IL255512A IL255512B (en) 2017-11-07 2017-11-07 Transmission detection using line of sight
US201862755572P 2018-11-05 2018-11-05
US62/755,572 2018-11-05

Publications (1)

Publication Number Publication Date
WO2019092706A1 true WO2019092706A1 (en) 2019-05-16

Family

ID=66438322

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2018/051191 WO2019092706A1 (en) 2017-11-07 2018-11-07 Transmission detection using line of sight

Country Status (1)

Country Link
WO (1) WO2019092706A1 (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6255992B1 (en) * 2000-04-13 2001-07-03 The United States Of America As Represented By The Secretary Of The Air Force Self-calibrating large baseline interferometer for very precise emitter location using time difference of arrival and time difference of arrival rate
US20120293371A1 (en) * 2011-05-19 2012-11-22 Itt Manufacturing Enterprises, Inc. System and Method for Geolocation of Multiple Unknown Radio Frequency Signal Sources
US20150241545A1 (en) * 2014-02-25 2015-08-27 Lockheed Martin Corporation Single Platform Doppler Geolocation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3707524A4 *


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18875544

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018875544

Country of ref document: EP

Effective date: 20200608