IL299575A - Vehicle tracking using acoustic measurements - Google Patents
- Publication number
- IL299575A
- Authority
- IL
- Israel
- Prior art keywords
- acoustic
- vehicle
- environment
- sensors
- sensor
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/18—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
Description
VEHICLE TRACKING USING ACOUSTIC MEASUREMENTS

TECHNOLOGICAL FIELD
The present disclosure, in some embodiments thereof, relates to vehicle tracking using acoustic measurements and, more particularly, but not exclusively, to the use of acoustic tracking where access to other tracking modality/ies is limited.
BACKGROUND ART
Background art, where each and every listed art is incorporated by reference into this document in its entirety, includes: R. Kapoor, A. Gardi and R. Sabatini, "Acoustic Positioning and Navigation System for GNSS Denied/Challenged Environments," 2020 IEEE/ION Position, Location and Navigation Symposium (PLANS), 2020, pp. 1280-1285, doi: 10.1109/PLANS46316.2020.9110156.
US Patent No. US8164484 discloses "A method and apparatus for identifying running vehicles in an area to be monitored using acoustic signature recognition. The apparatus includes an input sensor for capturing an acoustic waveform produced by a vehicle source, and a processing system. The waveform is digitized and divided into frames. Each frame is filtered into a plurality of gammatone filtered signals. At least one spectral feature vector is computed for each frame. The vectors are integrated across a plurality of frames to create a spectro-temporal representation of the vehicle waveform. In a training mode, values from the spectro-temporal representation are used as inputs to a Nonlinear Hebbian learning function to extract acoustic signatures and synaptic weights. In an active mode, the synaptic weights and acoustic signatures are used as patterns in a supervised associative network to identify whether a vehicle is present in the area to be monitored. In response to a vehicle being present, the class of vehicle is identified. Results may be provided to a central computer."
US Patent Application Publication No. 
US2020180669 discloses "A method for locating a railway vehicle includes: measuring an acoustic signature of a railway vehicle passing over a railway switch for switching between a first railway track and a second railway track diverging from the first railway track, using a distributed acoustic sensing apparatus including an optical fiber sensor placed along the railway switch; analyzing the measured acoustic signature, using an electronic processing unit, by identifying, in the measured acoustic signature, vibration patterns representative of track geometry and/or track curvature; and determining which of the first railway track or the second railway track is occupied by the railway vehicle, using the electronic processing unit, based on the identified vibration patterns."
Additional background art includes US Patent No. US10198946, US Patent Application Publication No. US2021063526, and European Patent No. EP2793043.
Acknowledgement of the above references herein is not to be inferred as meaning that these are in any way relevant to the patentability of the presently disclosed subject matter.
GENERAL DESCRIPTION
Following is a non-exclusive list of some exemplary embodiments of the disclosure. The present disclosure also includes embodiments which include fewer than all the features in an example and embodiments using features from multiple examples, even if not listed below.
Example 1. A method of vehicle tracking comprising: receiving a spatial relationship between a plurality of acoustic sensors dispersed within an environment; receiving information regarding said environment; receiving a plurality of acoustic measurement signals, an acoustic measurement signal from each of said plurality of acoustic sensors; identifying a vehicular acoustic signature of a vehicle within said environment, in at least one of said plurality of acoustic measurement signals; determining a location of said vehicle with respect to said plurality of acoustic sensors, using said plurality of acoustic measurement signals, said acoustic signature, said spatial relationship between said plurality of acoustic sensors, and said information regarding said environment.
Example 2. The method according to Example 1, wherein said receiving information regarding said environment comprises receiving information comprising one or more of: topographical information regarding said environment; and effect on passage of sound through said environment; wherein said determining comprises determining said location of said vehicle within said environment using said information regarding said environment.
Example 3. The method according to any one of Examples 1-2, wherein said identifying comprises identifying a vehicular acoustic signature of a vehicle within said environment, in at least two of said plurality of acoustic measurement signals.
Example 4. The method according to any one of Examples 1-3, wherein a medium through which said acoustic sensors measure is air.
Example 5. 
The method according to any one of Examples 1-3, wherein said determining comprises identifying one or more features of said acoustic signature and using said identified features to determine said location of said vehicle.
Example 6. The method according to Example 5, wherein said determining comprises comparing identified features of said acoustic signature in at least two of said plurality of acoustic measurement signals.
Example 7. The method according to any one of Examples 5-6, wherein said one or more features includes a reoccurring feature of said acoustic signal.
Example 8. The method according to any one of Examples 1-7, wherein said information regarding said environment includes topology of one or more acoustically reflective landscape element.
Example 9. The method according to Example 8, wherein said identifying comprises identifying one or more reflection of said acoustic signature within said plurality of sensor measurement signals; wherein said determining said location comprises using said one or more reflection of said acoustic signature.
Example 10. The method according to any one of Examples 1-9, wherein said receiving comprises receiving information regarding vehicular acoustic signatures; wherein said determining comprises determining one or more of: a vehicle type; a vehicle state; a specific vehicle; using said vehicular acoustic signature and said information regarding vehicular acoustic signatures.
Example 11. The method according to Example 10, comprising generating control signals for said vehicle based on one or more of said vehicle type, said vehicle state, and said specific vehicle.
Example 12. The method according to any one of Examples 1-11, comprising generating control signals for control of said vehicle, using said location.
Example 13. The method according to Example 12, wherein said generating comprises using one or more of a received desired destination and a desired vehicle trajectory.
Example 14. The method according to any one of Examples 1-13, comprising transmitting said location to said vehicle.
Example 15. The method according to any one of Examples 1-14, comprising identifying a Doppler shift in said acoustic signature of at least one of said sensor measurement signals.
Example 16. The method according to Example 15, wherein said determining comprises using said identified Doppler shift and a position of a sensor corresponding to the acoustic signature in which said Doppler shift was identified.
Example 17. A tracking system comprising: a plurality of acoustic sensors dispersed within an environment of a vehicle to be tracked; a processor configured to: receive a spatial relationship between said plurality of acoustic sensors and a location of said plurality of acoustic sensors within said environment; receive a plurality of acoustic measurement signals, an acoustic measurement signal from each of said plurality of acoustic sensors; identify a vehicular acoustic signature of a vehicle within said environment in at least two of said plurality of acoustic measurement signals; determine a location of said vehicle within said environment by determining a position of said vehicle with respect to said plurality of acoustic sensors, using said acoustic signature and said positions of said acoustic sensors within said environment.
Example 18. 
The system according to Example 17, wherein at least two of said plurality of acoustic sensors are spaced away from each other.
Example 19. The system according to any one of Examples 17-18, wherein said plurality of acoustic sensors are dispersed within said environment where an acoustic sensor is positioned within each portion of a plurality of portions of the environment.
Example 20. The system according to any one of Examples 17-19, wherein said plurality of acoustic sensors are fixedly attached to one or more structure within said environment.
Example 21. The system according to any one of Examples 17-20, comprising a transmitter configured to transmit said location to said vehicle.
Example 22. The system according to any one of Examples 17-21, wherein said processor is configured to generate control signals for control of said vehicle, using said location and a received one or more of destination and trajectory.
Example 23. The system according to Example 22, comprising a transmitter configured to transmit said control signals to said vehicle.
Unless otherwise defined, all technical and/or scientific terms used within this document have the meaning commonly understood by one of ordinary skill in the art/s to which the present disclosure pertains. Methods and/or materials similar or equivalent to those described herein can be used in the practice and/or testing of embodiments of the present disclosure, and exemplary methods and/or materials are described below. Regarding exemplary embodiments described below, the materials, methods, and examples are illustrative and are not intended to be necessarily limiting.
Some embodiments of the present disclosure are embodied as a system, method, or computer program product. For example, some embodiments of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, microcode, etc.) 
or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" and/or "system."
Implementation of the method and/or system of some embodiments of the present disclosure can involve performing and/or completing selected tasks manually, automatically, or a combination thereof. According to actual instrumentation and/or equipment of some embodiments of the method and/or system of the present disclosure, several selected tasks could be implemented by hardware, by software, by firmware, and/or by a combination thereof, e.g., using an operating system.
For example, hardware for performing selected tasks according to some embodiments of the present disclosure could be implemented as a chip or a circuit. As software, selected tasks according to some embodiments of the present disclosure could be implemented as a plurality of software instructions being executed by a computational device, e.g., using any suitable operating system.
In some embodiments, one or more tasks according to some exemplary embodiments of the method and/or system as described herein are performed by a data processor, such as a computing platform for executing a plurality of instructions. Optionally, the data processor includes a volatile memory for storing instructions and/or data and/or non-volatile storage, e.g., for storing instructions and/or data. Optionally, a network connection is provided as well. User interface/s, e.g., display/s and/or user input device/s, are optionally provided.
Some embodiments of the present disclosure may be described below with reference to flowchart illustrations and/or block diagrams, for example, illustrating exemplary methods and/or apparatus (systems) and/or computer program products according to embodiments of the present disclosure. 
It will be understood that each step of the flowchart illustrations and/or block of the block diagrams, and/or combinations of steps in the flowchart illustrations and/or blocks in the block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart steps and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium (e.g., in a memory, local and/or hosted at the cloud) that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium can be used to produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be run by one or more computational devices to cause a series of operational steps to be performed, e.g., on the computational device, other programmable apparatus and/or other devices, to produce a computer implemented process such that the instructions which execute provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
Some of the methods described herein are generally designed only for use by a computer, and may not be feasible and/or practical for performing purely manually, by a human expert. 
A human expert who wanted to manually perform similar tasks might be expected to use different methods, e.g., making use of expert knowledge and/or the pattern recognition capabilities of the human brain, in a manner potentially more efficient than manually going through the steps of the methods described herein.
BRIEF DESCRIPTION OF THE DRAWINGS
In order to better understand the subject matter that is disclosed herein and to exemplify how it may be carried out in practice, embodiments will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:
FIG. 1 is a simplified schematic of a vehicle tracking system, according to some embodiments of the disclosure;
FIG. 2 is a method of vehicle tracking, according to some embodiments of the disclosure;
FIG. 3 is a simplified schematic of a vehicle tracking system, according to some embodiments of the disclosure;
FIG. 4 is a method of vehicular navigation, according to some embodiments of the disclosure;
FIG. 5 is a simplified schematic plot of acoustic sensor measurement data, according to some embodiments of the disclosure;
FIG. 6 is a method of vehicle tracking, according to some embodiments of the disclosure;
FIG. 7 is a simplified schematic plot of acoustic sensor measurement data, according to some embodiments of the disclosure;
FIG. 8 is a method of vehicle tracking, according to some embodiments of the disclosure;
FIG. 9 is a simplified schematic of a vehicle tracking system, according to some embodiments of the disclosure;
FIG. 10 is a simplified schematic illustrating a tracking system in a landscape, according to some embodiments of the invention;
FIG. 11 is a simplified schematic illustrating a tracking system in a landscape, according to some embodiments of the disclosure;
FIG. 12 is a method of vehicle tracking, according to some embodiments of the disclosure;
FIGs. 13A-B are simplified schematics illustrating acoustics of a landscape, according to some embodiments of the disclosure;
FIGs. 14A-B are a method of vehicle tracking, according to some embodiments of the disclosure;
FIGs. 15A-B are plots of amplitude with time, for acoustic measurement signals, according to some embodiments of the disclosure;
FIGs. 16A-B are plots of amplitude with time, for acoustic measurement signals, according to some embodiments of the disclosure;
FIGs. 17A-B are plots of amplitude with time, for acoustic measurement signals, according to some embodiments of the disclosure;
FIGs. 18A-B are plots of frequency magnitude with time, for acoustic measurement signals, according to some embodiments of the disclosure;
FIGs. 19A-B are plots of frequency magnitude with time, for acoustic measurement signals, according to some embodiments of the disclosure;
FIGs. 20A-B are plots of frequency magnitude with time, for acoustic measurement signals, according to some embodiments of the disclosure;
FIG. 21 illustrates plots of acoustic sensor signals, according to some embodiments of the disclosure; and
FIG. 22 illustrates plots of acoustic sensor signals, with time, for a repeated acoustic signal.
In some embodiments, although non-limiting, in different figures like numerals are used to refer to like elements, for example, element 104 in FIG. 1 corresponding to element 304 in FIG. 3.

DETAILED DESCRIPTION OF EMBODIMENTS
The present disclosure, in some embodiments thereof, relates to vehicle tracking using acoustic measurements and, more particularly, but not exclusively, to the use of acoustic tracking where access to other tracking modality/ies is limited.

Overview
A broad aspect of some embodiments of the disclosure relates to passive acoustic tracking (e.g. of vehicles) using acoustic measurements from a plurality of acoustic sensors with a known spatial relationship to each other and within an environment to be navigated. Where, in some embodiments, the acoustic sensors have known positions within a landscape to be navigated by vehicle/s being tracked and/or have known positions with respect to structure/s (e.g. buildings, fences) within the landscape.
Where passive acoustic tracking is defined as tracking of an object (e.g. vehicle) using measurements of noise generated by the object (e.g. 
by operation of the vehicle), for example, without a dedicated transmission in the acoustic spectrum for the purpose of tracking.
A potential benefit of passive acoustic vehicle tracking is the lack of a broadcast signal from the vehicle, potentially increasing confidentiality of the vehicle, e.g. as opposed to active tracking method/s.
Within this document, reference is generally with respect to tracking of vehicles, but it should be understood that the system/s and/or method/s disclosed here, in some embodiments, are used in tracking of other objects and/or locating other noise generating object/s and/or events. For example, the location of a position from which a gunshot emanates. For example, a location of a human or animal making noise. For example, the location of a speaker producing noise.
In some embodiments, one or more of engine noise, noise of moving parts of the vehicle with respect to each other and/or the environment in which the vehicle is moving (also, e.g. for land and/or air vehicles, herein termed "landscape"), and/or resonance, contribute to an individual vehicle acoustic signature, which is identified in acoustic sensor measurement/s.
In some embodiments, an acoustic signature includes a pattern over time in amplitude and/or frequency, e.g. amplitude for each frequency (or frequency range of a plurality of ranges). In some embodiments, the pattern includes a portion which repeats.
In some embodiments, an acoustic signature includes a plurality of patterns, each of which optionally repeats over time, where different patterns, in some embodiments, repeat at different frequencies.
In some embodiments, a vehicular acoustic signature includes such a pattern and/or plurality of patterns which change with speed and/or state (stopped, moving) of the vehicle.
In some embodiments, the acoustic tracking is used as a supplement to another tracking modality, for example GNSS. 
Where, in some embodiments, acoustic tracking is used where access to the other modality is limited and/or interrupted (e.g. satellite concealment and/or blockage of GNSS signals to the vehicle).
In some embodiments, acoustic tracking is used in order to avoid and/or detect spoofing (e.g. GPS spoofing). For example, where a rogue tracking (e.g. GPS) signal for the vehicle is transmitted, this signal resulting in false position data being transmitted to the vehicle. In some embodiments, the tracking (e.g. GPS) position data is verified using acoustic tracking. Exemplary actions upon failing to verify (e.g. upon identifying possible spoofing) include one or more of correcting tracking data sent to user/s and/or sending an alert to user/s.
In some embodiments, the acoustic tracking is used along with other sensor measurements, for example, inertial sensor measurements providing information regarding movements of the vehicle, for example, proximity sensor measurements, for example, optical sensor measurements (e.g. image/s).
A potential advantage of the use of acoustic sensors is their low cost, potentially providing the ability to track vehicle/s over a large area, e.g. at low cost. Where, in some embodiments, for tracking, the system requires an acoustic sensor for each region of space in which a vehicle is to be tracked, where, in some embodiments, a size of the region of space is associated with the vehicle type and/or sensor sensitivity. Where, for example, in some embodiments, for road vehicle (e.g. car, motorcycle) tracking, a system includes an acoustic sensor every 10-100m, or 10-50m, or lower, or higher, or intermediate ranges or distances. Where, in some embodiments, other vehicle types (e.g. ground vehicles with a louder acoustic signal, e.g. tank, and/or air vehicles, e.g. drone) are tracked using a system with an acoustic sensor every 50-200m, or 50-100m, or lower, or higher, or intermediate ranges, or distances.
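The spoofing-detection idea described above can be sketched as a simple cross-check between a reported GNSS fix and an independently computed acoustic position estimate. This is an illustrative sketch only; the function name, coordinate convention, and discrepancy threshold are assumptions, not part of the disclosure:

```python
import math

def verify_gnss_fix(gnss_xy, acoustic_xy, max_discrepancy_m=25.0):
    """Cross-check a GNSS position against an independent acoustic fix.

    Returns (is_consistent, discrepancy_m). A real system would weight
    the threshold by the expected accuracy of both modalities.
    """
    dx = gnss_xy[0] - acoustic_xy[0]
    dy = gnss_xy[1] - acoustic_xy[1]
    discrepancy = math.hypot(dx, dy)
    return discrepancy <= max_discrepancy_m, discrepancy

# A GNSS fix close to the acoustic estimate is accepted; a fix far from
# it suggests possible spoofing and could trigger a correction or alert.
ok, d = verify_gnss_fix((100.0, 200.0), (103.0, 204.0))
suspect, d2 = verify_gnss_fix((500.0, 200.0), (103.0, 204.0))
```

On failure, the system could, as the text notes, correct the tracking data sent to users and/or raise an alert.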
An aspect of some embodiments of the disclosure relates to tracking a vehicle by identifying an acoustic signature of the vehicle in sensor signals of the plurality of sensors.
In some embodiments, identification of the acoustic signature of a vehicle, in one or more sensor signals, is used to identify presence of the vehicle within a geographical region of the sensors. In some embodiments, identification is of a presence of a particular vehicle type and/or of a specific vehicle.
In some embodiments, identified feature/s of the identified acoustic signature, in one or more sensor signals, are used in identifying presence of the vehicle, and/or tracking vehicle position and/or movement.
Where, in some embodiments, feature/s include one or more of: identified time of arrival (TOA) of the acoustic signature, differential time of arrival (DTOA) of the acoustic signature, angle of arrival (AOA) of the acoustic signature, magnitude of the acoustic signature, frequency signature of the acoustic signature, pattern of the acoustic signature over time, and recurring feature/s in the time and/or frequency domain of the acoustic signature.
In some embodiments, a vehicular acoustic signature (e.g. including sounds of a running motor) has a continuous noise signature with reoccurring changes to the signature (e.g. frequency, power). In some embodiments, one or more recurring (e.g. periodic) features of the acoustic signature are used to track the vehicle. For example, TOA of a recurring feature at one or more acoustic sensors. Where, in some embodiments, comparison of the sensed recurring feature (e.g. TOA of the feature) at different sensors (e.g. different TOAs at different sensors) is used to track the vehicle.
In some embodiments, the vehicle is a land vehicle (e.g. car, bus, tank, robotic land vehicle). In some embodiments, the vehicle is an air vehicle (e.g. drone, aircraft, helicopter). In some embodiments, the vehicle is capable of travel in more than one modality, e.g. both air and land. 
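As an illustration of how DTOA features such as those listed above can be turned into a position estimate, the following sketch compares measured arrival-time differences against those predicted for candidate source locations, using a brute-force grid search. The geometry, sensor layout, grid resolution, and function names are illustrative assumptions, not the disclosed method:

```python
import itertools
import math

SPEED_OF_SOUND = 343.0  # m/s in air, approximate

def tdoa_grid_localize(sensors, toas, grid, ref=0):
    """Locate a source from times of arrival (TOA) at known sensor positions.

    Forms differential times of arrival (DTOA) relative to sensor `ref`
    and picks the grid point whose predicted DTOAs best match the
    measured ones (least-squares residual).
    """
    def dtoa_residual(pt):
        dists = [math.dist(pt, s) for s in sensors]
        err = 0.0
        for i in range(len(sensors)):
            predicted = (dists[i] - dists[ref]) / SPEED_OF_SOUND
            measured = toas[i] - toas[ref]
            err += (predicted - measured) ** 2
        return err
    return min(grid, key=dtoa_residual)

# Four sensors at the corners of a 100 m square; simulated noiseless TOAs
# for a source at (30, 60), then recovered by the grid search.
sensors = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0), (100.0, 100.0)]
true_pos = (30.0, 60.0)
toas = [math.dist(true_pos, s) / SPEED_OF_SOUND for s in sensors]
grid = list(itertools.product(range(0, 101, 5), repeat=2))
est = tdoa_grid_localize(sensors, toas, grid)
```

In practice the same residual could be minimized with a gradient-based solver rather than a grid, and the TOAs would come from matching a recurring signature feature across sensors.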
In some embodiments, the vehicle is an aquatic vehicle (e.g. boat, submarine).
Where, in some embodiments, feature/s of the identified acoustic signature in one or more of the sensor signals are used to determine location and/or movement of the vehicle.
In some embodiments, one or more features of the acoustic signature identified in a plurality of sensor measurement signals are used.
In some embodiments, one or more features identified in a single sensor measurement (e.g. over a time period) are used. For example, where the vehicle is tracked using other measured features, in some embodiments, location and/or movement of the vehicle is corrected and/or compensated, based on an additional feature identified, e.g. in a single sensor measurement.
For example, in an exemplary embodiment, feature/s (e.g. TOA, DTOA, magnitude) of the acoustic signature, identified from measurements at a plurality of sensors, are used to track the vehicle. Where, in some embodiments, timing of the Doppler shift as the vehicle passes an individual sensor is used to correct (e.g. accumulated errors) and/or update a position of the vehicle with respect to the individual sensor.
An aspect of some embodiments of the invention relates to use of vehicle tracking data, determined from acoustic sensor measurements, for navigation of the vehicle being tracked. For example, where, in some embodiments, tracking information is displayed to one or more users (e.g. a user of the vehicle and/or user/s remote to the vehicle and sensor array). For example, in some embodiments, tracking information is used to generate control signals to control movements of the vehicle, e.g. where position feedback (e.g. with respect to positions of acoustic sensors) is used. Where, in some embodiments, control signals are generated using a desired trajectory and/or destination which is received (e.g. pre-planned) and/or generated during use of the vehicle (e.g. 
based on user inputs).
In some embodiments, passive acoustic tracking is used alone. Alternatively, in some embodiments, passive acoustic tracking is used in combination with one or more other positioning modalities (e.g. GNSS). Where a potential benefit of acoustic tracking is the ability to track a vehicle where other tracking modality/ies are limited and/or lacking (e.g. the vehicle is in a GNSS-lacking environment).
In some embodiments, passive acoustic tracking is used to estimate vehicle location at times where other modalities are limited. In some embodiments, passive acoustic tracking is used to increase accuracy of tracking of other modalities, e.g. with respect to known environmental feature/s, e.g. a structure to which the acoustic sensors are attached.
In some embodiments, a vehicle type and/or a specific vehicle is identified from the identified vehicular acoustic signature. Where, in some embodiments, generated control signals are tailored according to the vehicle type and/or feature/s of the specific vehicle.
Where, in some embodiments, sensor data and/or tracking information and/or control signal/s are directed to the specific vehicle, e.g. only.
An aspect of some embodiments of the invention relates to receiving information regarding an environment in which the sensor array is positioned and/or which the vehicle is traversing, and using the information regarding the environment, along with one or more acoustic sensor signals, to track the vehicle. In some embodiments, the information includes information regarding landscape topographical feature/s, e.g. hills and/or valleys, e.g. feature/s which are acoustically reflecting (e.g. walls of buildings). In some embodiments, the information includes information regarding acoustic property/ies of landscape feature/s and/or structures, e.g. acoustic reflectivity. In some embodiments, using the environment information, reflections and/or other changes to the vehicular acoustic signature are identified and either removed (e.g. as noise) and/or used in determining position and/or movement of the vehicle.
For example, in some embodiments, known positions of obstacles and/or topological features in a landscape are used to determine an acoustic path to sensor/s, the path/s e.g. being used to determine the location of the vehicle.
In some embodiments, a model of the reactive acoustic signature of a landscape is acquired. For example, where noise is generated at different parts of the landscape and measured using a plurality of sensors in known positions within the landscape. Where, in some embodiments, modeling is via machine learning, where, in some embodiments, a machine learning model is trained using acoustic measurements, the positions of the sensors which acquired the measurements, and information regarding position and/or trajectory and/or other features of the object generating the acoustic signal (e.g. type of vehicle). 
In some embodiments, the trained machine learning model then provides the location of a source of an acoustic signature, based on the plurality of acoustic measurement signals.
Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth in the following description and/or illustrated in the drawings and/or the Examples. The invention is capable of other embodiments or of being practiced or carried out in various ways.
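One way to picture the "train on measured noise at known positions, then predict source position" workflow above is a fingerprinting model: feature vectors from the sensor array, recorded while a source moves through known positions, are memorised and later matched by nearest neighbour. This toy stand-in (class name, inverse-distance amplitude model, and calibration grid are all assumptions) is far simpler than the machine-learning models the text contemplates, but it follows the same fit/predict shape:

```python
import math

class AcousticFingerprintModel:
    """Toy stand-in for a learned localization model: memorises
    (sensor-feature-vector -> source position) pairs from a calibration
    run and predicts by nearest neighbour in feature space."""

    def __init__(self):
        self._samples = []

    def fit(self, feature_vectors, positions):
        self._samples = list(zip(feature_vectors, positions))

    def predict(self, features):
        best = min(self._samples, key=lambda s: math.dist(s[0], features))
        return best[1]

# Calibration run: received amplitude at 3 sensors, using a crude
# inverse-distance attenuation model as a stand-in for real measurements.
sensors = [(0.0, 0.0), (50.0, 0.0), (0.0, 50.0)]

def features_at(pos):
    return [1.0 / (1.0 + math.dist(pos, s)) for s in sensors]

train_positions = [(x, y) for x in range(0, 51, 10) for y in range(0, 51, 10)]
model = AcousticFingerprintModel()
model.fit([features_at(p) for p in train_positions], train_positions)

est = model.predict(features_at((20.0, 30.0)))
```

A trained regressor or neural network would replace the nearest-neighbour lookup, but the interface (calibrate on labelled measurements, then map new measurements to a position) is the same.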
FIG. 1 is a simplified schematic of a vehicle tracking system 100, according to some embodiments of the disclosure.
In some embodiments, FIG. 1 illustrates a bird's eye (also herein termed "aerial") view of system 100.
In some embodiments, system 100 includes a plurality of acoustic sensors 108, 110, 112, 114, 116, 118. In some embodiments, a plurality of acoustic sensors in a geographical location (e.g. attached to a same structure, e.g. within an area of 10km2, e.g. dispersed along a border) are termed a "set" or "array" of sensors. Alternatively or additionally, in some embodiments, a plurality of acoustic sensors are defined as a set of sensors by connection of the sensors to a same processor 106. Where, in some embodiments, acoustic sensors 108, 110, 112, 114, 116, 118 are dispersed geographically in different places, having, for example, a distance d between sensors of at least 1m, or 1-200m, or 10-200m, or 10-100m, or lower, or higher, or intermediate ranges or distances between adjacent sensors.
In some embodiments, sensors are spaced out, e.g. as illustrated in FIG. 1, in a horizontal direction. Additionally or alternatively, in some embodiments, sensors are placed at different heights. For example, adjacent sensors being at different geographical locations (horizontally) and at different heights. For example, two or more sensors being at a generally same geographical location (horizontally, e.g. to within 10m, or 1m, or within 50cm, or within 10cm) but at different heights. A potential benefit of sensors in locations having a distance in a vertical direction between the locations is the ability, in some embodiments, to determine azimuth and/or elevation using the sensor signals.
In some embodiments, one or more of the acoustic sensors includes a plurality of sensors, each acoustically sensing in a different direction.
In some embodiments, acoustic sensors 108-118 are disposed in fixed positions. 
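The azimuth/elevation idea above (a vertically separated sensor pair yielding elevation, a horizontally separated pair yielding azimuth) can be sketched with the standard far-field relation between inter-sensor delay and angle of arrival. The constants and function name are illustrative assumptions:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air, approximate

def angle_of_arrival(delay_s, spacing_m, c=SPEED_OF_SOUND):
    """Far-field angle of arrival (radians, measured from the baseline
    joining the two sensors) for a plane wave arriving at two sensors
    `spacing_m` apart with measured inter-sensor delay `delay_s`.

    Uses cos(theta) = c * delay / spacing; a vertical pair gives
    elevation, a horizontal pair gives azimuth.
    """
    cos_theta = max(-1.0, min(1.0, c * delay_s / spacing_m))
    return math.acos(cos_theta)

# A wave travelling along the baseline (delay = spacing / c) arrives at
# 0 rad; zero delay corresponds to a broadside arrival at pi/2.
along = angle_of_arrival(2.0 / SPEED_OF_SOUND, 2.0)
broadside = angle_of_arrival(0.0, 2.0)
```

The delay itself would typically be estimated by cross-correlating the two sensor signals around an identified signature feature.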
In some embodiments, sensors 108-118 are positioned with a known spatial relationship to each other and/or positioned at (and/or at a known distance to) a structure 120 having a known topology. Exemplary structures 120 include a fence and/or wall, e.g. a boundary fence. In some embodiments, vehicle 104 follows a trajectory traveling along a structure, e.g. along a fence, e.g. a security fence. In some embodiments, the vehicle is travelling in an offroad environment and/or a semi-offroad environment.
For example, in some embodiments, sensors 108, 110, 112, 114, 116, 118 are attached to a structure 120. Where exemplary structures include existing (e.g. pre-dating a start to installation of system 100) structure/s, e.g. fencing and/or wall/s of existing building/s. In some embodiments, structure 120 includes a division (e.g. security division) between entities, e.g. includes a border fence.

In some embodiments, structure 120 includes dedicated system portion/s, e.g. one or more stand/s for one or more sensor/s. For example, where, in some embodiments, structure 120 includes at least one portion formed from an existing infrastructure element and at least one portion which is a dedicated system support portion. For example, where system support/s are used to bridge region/s of space between infrastructural elements.

In some embodiments, feature/s of structure 120 are known, for example, position (e.g. with respect to sensors and/or other landscape features, e.g. as described regarding FIG. 9) and/or geometry of the structure. In some embodiments, feature/s of the structure are measured, e.g. during calibration and/or installation of the system.

In some embodiments, acoustic sensors 108, 110, 112, 114, 116, 118 sense acoustic signal/s emitted by a vehicle 104. In some embodiments, acoustic sensors 108, 110, 112, 114, 116, 118 are data connected to a processor 106, for example, passing measurement data (e.g. wirelessly and/or via wired connection). Optionally, in some embodiments, processor 106 is (e.g. wirelessly) connected to a processor 122 of vehicle 104.

Optionally, in some embodiments, system 100 is a local closed system (which does not include control center 176) and/or which does not rely on communication links to remote element/s. Where, for example, processing is performed only at processor 106 and/or processor 122. Where, in some embodiments, data (e.g. data regarding acoustic signature/s and/or their interpretation) is stored locally, e.g. 
at a memory local to processor 106 (not illustrated) and/or at a vehicle memory (not illustrated). Where, for example, in some embodiments, one or more of sensor signals, determined location information, and control signals (e.g. for one or more actuators of vehicle 104) are communicated from processor 106 to processor 122. In some embodiments, vehicle 104 includes one or more user interfaces 126, e.g. for display of location information to a user of the vehicle and/or for receipt of navigation inputs from the user.

Optionally, in some embodiments, system 100 includes an external processor 170 and/or an external user interface 172. Where, in some embodiments, processor 170 is hosted by the cloud. Where, in some embodiments, system 100 includes a control center 176 which hosts processor 170 and/or user interface 172. In some embodiments, one or both of processors 106, 122 communicates with external processor 170 and/or with external user interface 172.

In some embodiments, processing of acoustic sensor signals (e.g. as described in method/s of this document, e.g. one or more of FIG. 2, FIG. 4, FIG. 6, FIG. 7, and FIG. 12) is performed entirely at processor 106. Which processor 106, in some embodiments, is local to the sensors 108-118 and/or structure 120. Alternatively, in some embodiments, at least a portion of processing of acoustic sensor signals is performed at vehicle processor 122 and/or at another (e.g. a remote) processor 170. For example, in some embodiments, at least a portion of tracking calculations and/or generating of control signals are performed at processor 170. Where, for example, user/s input desired navigation feature/s and/or monitor vehicle progress through user interface 172. 
In some embodiments, information regarding vehicle 104 position and/or movement is displayed to user/s at control center 176 and/or the user/s at the control center 176 are involved with decisions regarding control of vehicle 104. In some embodiments, a central processor, e.g. control center processor 170, receives tracking information regarding more than one vehicle and/or more than one set of acoustic sensors 108-118.

Optionally, in some embodiments, system 100 includes one or more non-acoustic sensor 178. Where, in some embodiments, non-acoustic sensor 178 is hosted by vehicle 104. In some embodiments, sensor 178 includes an inertial sensor (e.g. accelerometer). In some embodiments, sensor 178 includes a proximity sensor, e.g. to provide additional data regarding position of vehicle 104, e.g. with respect to structure 120 (and/or other structure/s, e.g. landscape element 962 FIG. 9).

For example, optionally, where, in some embodiments, one or more of sensors 108, 110, 112, 114, 116, 118 includes one or more additional sensor. For example, a temperature sensor, e.g. where temperature measurements are used to determine and/or adjust a speed of sound used in tracking methods. For example, a proximity sensor, e.g. a proximity sensor providing additional data regarding distance between vehicle 104 and the sensor. Where, in some embodiments, one or both of vehicle 104 and structure 120 include a proximity sensor. Optionally, in some embodiments, vehicle 104 includes a transceiver 174 which, in some embodiments, sends and/or receives data for positioning and/or navigation using one or more additional modality, for example, GNSS. However, in some embodiments, vehicle 104 includes no such capabilities and/or is able to navigate solely using sensor data and/or location data and/or control instruction/s received from processor 106.
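As a minimal sketch of the temperature adjustment mentioned above (an illustration, not the disclosure's implementation), the widely used linear approximation for the speed of sound in air can convert a measured time of arrival into a range:

```python
def speed_of_sound(temp_c: float) -> float:
    """Approximate speed of sound in dry air (m/s) at temp_c degrees Celsius,
    using the common linear approximation c = 331.3 + 0.606 * T."""
    return 331.3 + 0.606 * temp_c

def range_from_toa(toa_s: float, temp_c: float = 20.0) -> float:
    """Distance (m) implied by a one-way time of arrival, with the
    temperature-adjusted speed of sound."""
    return toa_s * speed_of_sound(temp_c)
```

A swing from 0 to 30 degrees Celsius changes the speed of sound by roughly 5%, which is why range estimates from TOA benefit from the temperature correction.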
FIG. 2 is a method of vehicle tracking, according to some embodiments of the disclosure.

At 200, in some embodiments, a spatial relationship between sensors of a plurality of acoustic sensors is received. Optionally, in some embodiments, a spatial relationship between one or more of the sensors and one or more other element is received. For example, with respect to structure/s (e.g. buildings, e.g. fences) and/or landscape topology.

At 202, in some embodiments, acoustic measurement signals including measurement of a vehicular acoustic signature are received from the sensors, e.g. each of the sensors. Where, in some embodiments, acoustic measurements from one or more of the plurality of sensors are received continuously. In some embodiments, from one or more of the plurality of sensors, acoustic measurements are received periodically.

In some embodiments, one or more of the acoustic measurement signals includes wideband acoustic measurements. Where, in some embodiments, an upper bound of a range of wavelengths of the acoustic measurement signal is at least double, or 2-5000 times, a lower bound of the range of wavelengths. Where, in some embodiments, the acoustic measurement signal/s include frequencies of 10-45kHz, or 20-45kHz, or lower or higher or intermediate ranges or frequencies.

Where the vehicular acoustic signature includes, for example, sound generated by movement of portion/s of the vehicle with respect to other portion/s. For example, including sound generated by the vehicle engine and/or other mechanical part/s and/or including resonance, e.g. associated with the structure of the vehicle. Where, additionally, or alternatively, the vehicular acoustic signature includes, for example, sound/s made by movement of portion/s of the vehicle with respect to the environment in which the vehicle is located. For example, sound of movement of vehicle wheels and/or tracks against the ground (e.g. when the vehicle is a ground vehicle). 
For example, the sound of propeller movements against fluid, e.g. for vehicles propelled (at least partially) by propeller/s. For example, for a water vehicle, sound of movement of the vehicle surface through the water.

At 204, in some embodiments, vehicle acoustic signature/s are identified in one or more of the acoustic measurement signals. For example, according to one or more feature of US Patent No. US81644 and/or US Patent Application Publication No. US20210225182 and/or US Patent Application Publication No. US20220011786, which are each herein incorporated by reference into this document in their entirety.

For example, by acquiring a plurality of acoustic measurements of a plurality of vehicles and using the data to train a machine learning model which identifies a vehicle based on measurement of sound generated by the vehicle. In some embodiments, audio signals are converted into spectrograms, the spectrograms being used to train the machine learning model.

In some embodiments, acoustic measurements are acquired from a plurality of positions in a landscape, e.g. to train the model to provide a position of a sensed vehicle, based on predictable acoustic effect of the landscape on the acoustic signature of the vehicle. In some embodiments, a plurality of acoustic measurements, e.g. from sensors sensing sound from different directions and/or positions (e.g. on a known structure), are together used to identify a position of a sound source, e.g. a vehicle or other sound generating object. For example, position with respect to a known landscape feature, in some embodiments, is determined from a plurality of acoustic signals. For example, if the sound source is within a tunnel, direction of sound as emanating from different ends of the tunnel is used to determine that the sound source is within the tunnel, e.g. 
as opposed to located externally to the tunnel at one or the other end of the tunnel.

In some embodiments, presence of a vehicle in a geographic region of the sensor array is determined by identifying the vehicle acoustic signature in one or more of the sensor signals. In some embodiments, one or more feature of the acoustic signature is identified in one or more of the sensor measurement signals. Exemplary acoustic signal features include one or more of:
• time of arrival (TOA) of the acoustic signature
• differential time of arrival (DTOA) of the acoustic signature
• angle of arrival
• power (e.g. time average of magnitude of a sensed acoustic signature) of one or more portion (time and/or frequency portion) of the acoustic signature
• pattern of one or more portion of the acoustic signature over time, e.g. for one or more range of frequencies
• recurring (e.g. periodic) feature/s in time, e.g. for one or more range of frequencies
• Doppler shift of the acoustic signal

In some embodiments, feature/s identified in more than one sensor signal are compared and/or are used together. For example, in some embodiments, power of sensed acoustic signal from at least three sensors is used to locate the vehicle, e.g. using triangulation. For example, in some embodiments, identified Doppler shift in the vehicular acoustic signature for one or more sensor signal is used to determine a speed of movement of the vehicle with respect to the sensor/s.

In some embodiments, an acoustic signature of more than one vehicle is identified, for example, where a plurality of vehicles are tracked using a same set of sensors. Where, in some embodiments (e.g. where acoustic data is wideband), acoustic signatures of different vehicles are identified by non-overlapping frequency portions of the acoustic signatures of the different vehicles. 
For example, where a first vehicle has an acoustic signature including a band of high frequencies and a second vehicle has an acoustic signature including a different band of high frequencies, the two vehicle acoustic signatures are identified despite having other wavelengths where the acoustic signatures overlap.

At 206, in some embodiments, presence and/or position and/or movement of a vehicle (and/or more than one vehicle) is determined using the identified acoustic signatures.

In some embodiments, determining includes identifying a single sensor or a subset of sensors of the plurality of sensors to use in tracking of the vehicle. For example, where, in some embodiments, an initial screening of all sensor signals is performed, with selection of the subset of sensors based on results of the initial screening. Where, in some embodiments, the initial screening identifies those sensor signals which have high enough magnitude for an acoustic signature to be identified in the signal.
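The initial screening described above can be sketched as a simple peak-magnitude threshold over the sensor signals; the threshold criterion is an illustrative assumption, as the disclosure does not specify one:

```python
def screen_sensors(signals, threshold):
    """Return indices of sensor signals whose peak magnitude reaches the
    threshold; only these are passed on for signature identification."""
    return [i for i, sig in enumerate(signals)
            if max(abs(s) for s in sig) >= threshold]
```

For example, `screen_sensors([[0.1, -0.2], [0.9, 0.0], [0.0, 0.5]], 0.5)` keeps only the second and third sensors, discarding the one whose signal never rises above the threshold.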
In some embodiments, a vehicle type and/or a specific vehicle and/or a state of a vehicle (e.g. whether the engine is on, whether the vehicle is moving or not, e.g. one of a range of speeds, e.g. what gear a vehicle is being driven in) is determined from the identified acoustic signature. For example, using identified feature/s of the vehicular acoustic signature in one or more sensor measurement signal and/or using comparisons between sensor signals of one or more feature.

In some embodiments, determining of the position and/or movement of the vehicle occurs, at least partially, at a processor external to the vehicle. For example, at a processor local to the acoustic sensors (e.g. processor 106, FIG. 1) and/or at a remote processor, e.g. a control center processor (e.g. processor 170, FIG. 1). In some embodiments, determining of the position and/or movement of the vehicle occurs (e.g. at least partially) at the vehicle (e.g. processor 122 FIG. 1). Where, for example, a processor hosted by the vehicle receives acoustic sensor measurements and/or partially processed acoustic sensor measurements.

Optionally, in some embodiments, vehicle tracking information (e.g. the determined location and/or movement of the vehicle) is displayed at one or more user interface, e.g. at a vehicular user interface (e.g. user interface 126 FIG. 1) and/or at a control center user interface (e.g. user interface 172 FIG. 1).
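The conversion of audio into spectrograms for model training, mentioned at 204, can be sketched as a short-time FFT; the frame length, hop size, and Hann window here are illustrative assumptions rather than parameters given by the disclosure:

```python
import numpy as np

def spectrogram(signal: np.ndarray, frame_len: int = 256, hop: int = 128) -> np.ndarray:
    """Magnitude spectrogram via a short-time FFT with a Hann window.
    Returns an array of shape (n_frames, frame_len // 2 + 1)."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i * hop:i * hop + frame_len] * window
                       for i in range(n_frames)])
    return np.abs(np.fft.rfft(frames, axis=1))

# a 1 kHz tone sampled at 8 kHz concentrates energy near bin 32
# (bin width = 8000 / 256 = 31.25 Hz)
fs = 8000
t = np.arange(fs) / fs
spec = spectrogram(np.sin(2 * np.pi * 1000 * t))
peak_bin = int(spec.mean(axis=0).argmax())
```

Stacks of such time-frequency images are what a classifier would be trained on to recognize vehicle signatures.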
FIG. 3 is a simplified schematic of a vehicle tracking system 300, according to some embodiments of the disclosure. In some embodiments, system 300 includes one or more feature as illustrated in and/or described regarding system 100 FIG. 1. For example, a plurality of acoustic sensors 308, 310, 312, 314, 316, 318, 328, 330, 332. Where, in some embodiments, one or more of the plurality of sensors (e.g. all of the sensors) are attached (e.g. fixedly attached) to a structure 320. Which structure 320, in some embodiments, includes one or more feature as illustrated in and/or described regarding structure 120 FIG. 1.

In some embodiments, rings 302 illustrate travel with time of an acoustic signal generated, at a particular time, from vehicle 304.
In some embodiments, the acoustic signal 302 includes noise generated by vehicle 304 and/or associated with movement of vehicle 304 e.g. including one or more feature described regarding vehicular acoustic signatures in step 202 FIG. 2.
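The power-based triangulation mentioned earlier (using sensed power of the acoustic signal from at least three sensors) could be realized in many ways; the sketch below assumes free-field 1/r² power decay and an unknown source power, and searches a grid of candidate positions for the best match — an illustration, not the disclosure's method:

```python
import numpy as np

def locate_by_power(sensors, powers, grid=101, extent=100.0):
    """Brute-force search for the source position whose modelled 1/r^2
    power pattern (normalized, since source power is unknown) best
    matches the normalized measured powers."""
    powers = np.asarray(powers, dtype=float)
    xs = np.linspace(0.0, extent, grid)
    best, best_err = None, np.inf
    for x in xs:
        for y in xs:
            r2 = np.array([(x - sx) ** 2 + (y - sy) ** 2 + 1e-9
                           for sx, sy in sensors])
            model = 1.0 / r2
            err = np.sum((model / model.sum() - powers / powers.sum()) ** 2)
            if err < best_err:
                best, best_err = (x, y), err
    return best

# three sensors at known positions; synthetic powers from a source at (40, 30)
sensors = [(0.0, 0.0), (100.0, 0.0), (50.0, 100.0)]
powers = [1.0 / ((40.0 - sx) ** 2 + (30.0 - sy) ** 2) for sx, sy in sensors]
estimate = locate_by_power(sensors, powers)
```

Normalizing both the model and the measurements removes the unknown source power, which is why at least three sensors are needed to pin down a 2D position.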
FIG. 4 is a method of vehicular navigation, according to some embodiments of the disclosure.

At 400, in some embodiments, a spatial relationship between sensors of a plurality of acoustic sensors is received, e.g. according to one or more feature of step 200 FIG. 2.

At 402, in some embodiments, acoustic measurement signals including measurement of a vehicular acoustic signature are received from the sensors, e.g. each of the sensors. For example, according to one or more feature of step 202 FIG. 2.

At 404, in some embodiments, data regarding acoustic signatures of different vehicles is received. For example, different vehicle types and/or specific vehicles.

At 406, in some embodiments, vehicle acoustic signature/s are identified in one or more of the acoustic measurement signals. For example, according to one or more feature of step 204 FIG. 2.

At 408, in some embodiments, the vehicle type and/or specific vehicle and/or a state of a vehicle is identified from the acoustic measurement signals. Where, in some embodiments, a state of a vehicle is determined by identifying deviation from and/or addition to the acoustic signature of the vehicle. For example, where an additional sound (e.g. different amplitude in a range of frequencies, e.g. a signal in a frequency range not normally present in the acoustic signature), in some embodiments, is used to determine a change in state. For example, in an exemplary embodiment, a periodic sound additional to the expected acoustic signature is used to identify a foreign object in moving portion/s of the vehicle.

At 410, in some embodiments, the position and/or movement of the vehicle are determined from the identified acoustic signals, e.g. according to one or more feature of step 206 FIG. 2 and using information regarding the vehicle type and/or specific vehicle.

At 412, in some embodiments, control signal/s for movement of the vehicle are determined, using the determined position and/or movement of the vehicle. 
Additionally, and optionally, in some embodiments, the control signals are determined using data associated with the identified vehicle type and/or specific vehicle. For example, where control signals are generated using knowledge of the vehicle's capabilities.
In some embodiments, generation of control signal/s is based on a received destination and/or route, which is used (e.g. along with the determined position and/or movement and optionally the vehicle characteristics) to navigate the vehicle to the destination and/or along the route.

At 414, optionally, in some embodiments, control signals are transmitted to the vehicle. For example, where at least a portion of processing for determining the position and/or movement of the vehicle and/or processing to determine control signal/s is performed externally to the vehicle. For example, by a processor local to the acoustic sensors (e.g. processor 106 FIG. 1) and/or by a processor located remotely, e.g. a control center processor (e.g. processor 170 FIG. 1).
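The state-change detection at 408 — a deviation from, or addition to, the expected acoustic signature — can be sketched as comparing band energies against a stored reference; the band count, L1 distance metric, and threshold are illustrative assumptions:

```python
import numpy as np

def band_energies(signal: np.ndarray, n_bands: int = 8) -> np.ndarray:
    """Normalized energy in n_bands equal-width frequency bands."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    energies = np.array([band.sum() for band in np.array_split(spectrum, n_bands)])
    return energies / energies.sum()

def state_deviation(signal: np.ndarray, reference: np.ndarray,
                    threshold: float = 0.1) -> bool:
    """Flag a state change when the band-energy pattern deviates from the
    stored reference signature by more than threshold (L1 distance)."""
    return float(np.abs(band_energies(signal) - reference).sum()) > threshold

# a 200 Hz "engine" tone stands in for the expected signature; an added
# 3 kHz component stands in for a sound not normally present
fs = 8000
t = np.arange(fs) / fs
normal = np.sin(2 * np.pi * 200 * t)
ref = band_energies(normal)
anomalous = normal + 0.5 * np.sin(2 * np.pi * 3000 * t)
```

Energy appearing in a band that is normally quiet shifts the normalized pattern away from the reference, which is exactly the kind of "signal in a frequency range not normally present" described above.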
FIG. 5 is a simplified schematic plot of acoustic sensor measurement data 534, 546, according to some embodiments of the disclosure.

FIG. 5 illustrates acoustic measurement signals from a first sensor 534 and a second sensor 546 respectively. Where, in some embodiments, plots are of acoustic measurement magnitude with time. In some embodiments, signal 546 is of a sensor more distant from a vehicle than signal 534. An exemplary illustration is provided by referring back to FIG. 3, where sensor 316 (e.g., corresponding to signal 534) is closer to vehicle 304 than sensor 330 (e.g., corresponding to signal 546).

Where FIG. 5, in some embodiments, illustrates a reoccurring feature of acoustic signals 534, 546. Where, in some embodiments, a reoccurring (e.g. periodic) feature includes a magnitude peak, e.g. peaks 540, 545, 544 for signal 534 and peaks 550, 552, 554 for signal 546. In some embodiments, a reoccurring feature of a vehicular acoustic signature includes identifiable changes to magnitude and/or frequency.

In some embodiments, an acoustic signature is identified in a measurement signal of a first sensor located closer to the vehicle earlier (e.g. TOA and/or DTOA is earlier) than the acoustic signature is identified in a measurement signal of a second sensor located further away from the vehicle. Where TOA, in some embodiments, is determined as a change in magnitude, for example, as compared to noise levels (e.g. at 536, 548). Where, for example, FIG. 5 illustrates the change in magnitude of the acoustic measurement signal 536, 548 as happening later in time for the more distant sensor providing second signal 546.
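The differential time of arrival visible in FIG. 5 can be extracted in several ways; a common one, sketched below as an assumption rather than the disclosure's stated method, is cross-correlating the two sensor signals and taking the lag of the correlation peak:

```python
import numpy as np

def dtoa_samples(sig_a: np.ndarray, sig_b: np.ndarray) -> int:
    """Lag (in samples) of sig_b relative to sig_a from the peak of their
    full cross-correlation; positive means sig_b arrived later."""
    corr = np.correlate(sig_b, sig_a, mode="full")
    return int(corr.argmax()) - (len(sig_a) - 1)

# synthetic example: the same noise burst reaches the far sensor 20 samples later
rng = np.random.default_rng(0)
burst = rng.standard_normal(128)
near = np.concatenate([np.zeros(10), burst, np.zeros(50)])
far = np.concatenate([np.zeros(30), burst, np.zeros(30)])
lag = dtoa_samples(near, far)  # DTOA in seconds would be lag / sample_rate
```

Dividing the lag by the sample rate gives the DTOA in seconds, which multiplied by the speed of sound gives the path-length difference between the two sensors.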
FIG. 6 is a method of vehicle tracking, according to some embodiments of the disclosure.

At 600, in some embodiments, a spatial relationship between sensors of a plurality of acoustic sensors is received. Where, in some embodiments, step 600 includes one or more feature of step 200 FIG. 2.

At 602, in some embodiments, acoustic measurement signals including measurement of a vehicular acoustic signature are received from the sensors, e.g. each of the sensors. Where, in some embodiments, step 602 includes one or more feature of step 202 FIG. 2.

At 604, in some embodiments, vehicle acoustic signature/s are identified in one or more of the acoustic measurement signals. Where, in some embodiments, step 604 includes one or more feature of step 204 FIG. 2.

At 606, in some embodiments, one or more recurring (e.g. periodic) feature of the vehicular acoustic signature is identified in a sensor signal. In some embodiments, one or more recurring feature of the vehicular acoustic signature is identified in a plurality of sensor signals. In some embodiments, a recurring feature includes a change and/or pattern in the sensor signal magnitude, e.g. for one or more frequency range.

At 608, in some embodiments, position and/or movement of the vehicle is determined from the identified acoustic signature in sensor signal/s. Where, in some embodiments, step 608 includes one or more feature of step 206 FIG. 2. Where, in some embodiments, identified instances of reoccurring features of the acoustic signature in one or more sensor signal are used to determine position and/or movement of the vehicle.
FIG. 7 is a simplified schematic plot of acoustic sensor measurement data 758, according to some embodiments of the disclosure.

FIG. 7 illustrates, in some embodiments, frequency of a measured acoustic signal and/or of an acoustic signature of a vehicle (e.g. identified from the measured acoustic signal) with time. FIG. 7 illustrates, in some embodiments, a situation where the vehicle is moving past the sensor, where, in some embodiments, point 760 is where the signal has no Doppler shift, e.g. the vehicle is next to the sensor. In some embodiments, FIG. 7 illustrates a simplified vehicular acoustic signal.
In some embodiments, identification of such a cross-over between red and blue Doppler shifts is used to determine vehicle position at a particular time. For example, used to correct other methods of location determination, e.g. for accumulated error/s. In some embodiments, the vehicle is tracked using cross-overs between Doppler shifts at multiple sensors, e.g. in different locations (e.g. two or more of sensors 108, 110, 112, 114, 116, 118 FIG. 1).
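The frequencies observed on either side of such a cross-over also determine the vehicle's speed. A sketch, assuming a stationary sensor, a straight pass-by, and the standard Doppler relations f_approach = f0·c/(c − v) and f_recede = f0·c/(c + v):

```python
def pass_by_speed(f_approach: float, f_recede: float, c: float = 343.0) -> float:
    """Vehicle speed (m/s) from the Doppler-shifted tone observed before and
    after a pass-by: f_approach = f0*c/(c - v), f_recede = f0*c/(c + v)."""
    return c * (f_approach - f_recede) / (f_approach + f_recede)

def source_frequency(f_approach: float, f_recede: float) -> float:
    """Un-shifted source frequency f0 (the value observed at the cross-over),
    the harmonic mean of the approach and recede frequencies."""
    return 2.0 * f_approach * f_recede / (f_approach + f_recede)
```

Because the source frequency f0 cancels out of the speed formula, this works without knowing the vehicle's emitted tone in advance; f0 itself is recovered as the harmonic mean, matching the frequency at the no-shift point 760.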
FIG. 8 is a method of vehicle tracking, according to some embodiments of the disclosure.

At 800, in some embodiments, a spatial relationship between sensors of a plurality of acoustic sensors is received. Where, in some embodiments, step 800 includes one or more feature of step 200 FIG. 2.

At 802, in some embodiments, acoustic measurement signals including measurement of a vehicular acoustic signature are received from the sensors, e.g. each of the sensors. Where, in some embodiments, step 802 includes one or more feature of step 202 FIG. 2.

At 804, in some embodiments, vehicle acoustic signature/s are identified in one or more of the acoustic measurement signals. Where, in some embodiments, step 804 includes one or more feature of step 204 FIG. 2.

At 806, in some embodiments, a Doppler shift in frequency of the vehicular acoustic signature is identified in one or more sensor signal, e.g. refer to FIG. 7 and/or description regarding FIG. 7.

At 808, in some embodiments, position and/or movement of a vehicle (and/or more than one vehicle) is determined using the identified acoustic signatures and, optionally, identified Doppler shift/s in sensor signal/s. Where, in some embodiments, step 808 includes one or more feature of step 206 FIG. 2.

At 810, in some embodiments, the determined position is corrected using, for example, time/s of identified Doppler shift/s and position/s of the specific sensor/s in which the respective Doppler shift/s have been identified.

In some embodiments, identifying of a Doppler shift is periodic, for example, at times that the vehicle moves past a sensor whilst moving at a sufficient speed for the Doppler shift to be identified in the sensor signal. Where periodic location information from Doppler shift is used in determining (e.g. at step 808) position and/or movement of the vehicle, and/or where, periodically, the position determined using identified Doppler shift is used to correct the determined position.
FIG. 9 is a simplified schematic of a vehicle tracking system 900, according to some embodiments of the disclosure.

In some embodiments, system 900 includes one or more feature as illustrated in and/or described regarding system 100 FIG. 1 and/or system 300 FIG. 3. For example, a plurality of sensors 108, 110, 112, 114, 116 attached to a structure 120 and a vehicle 104.

In some embodiments, FIG. 9 illustrates exemplary acoustic signal paths where an acoustically reflective landscape element 962 is present in a geographical area of the sensors 108, 110, 112, 114, 116 and/or vehicle 104. Where arrows 964, 966, 960-968 illustrate travel of sound originating at vehicle 104 to sensors 114, 112. Where, for example, sensor 114 receives the sound only through a direct path 964 to sensor 114. Where sensor 112 receives sound both directly along path 966 and along a path 960-968 via a reflection at landscape element 962.

In some embodiments, interference effects, e.g. between reflected portions of the vehicular signature, are used to track the vehicle. For example, where known landscape features generate the same interference effects for the same vehicle conditions (e.g. conditions including vehicle position and/or vehicle state).
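The direct and reflected paths of FIG. 9 can be sketched with the image-source method, here under the simplifying assumption that the reflective landscape element is a flat reflector along the line y = wall_y (an illustration, not the disclosure's model):

```python
import math

def direct_and_reflected_delay(src, sensor, wall_y, c=343.0):
    """Delays (s) of the direct path and of a single reflection off a flat
    reflector at y = wall_y, found by mirroring the source across it."""
    direct = math.dist(src, sensor) / c
    image = (src[0], 2 * wall_y - src[1])  # mirror image of the source
    reflected = math.dist(image, sensor) / c
    return direct, reflected

# vehicle at the origin, sensor 30 m away, reflector 20 m to the side
d, r = direct_and_reflected_delay(src=(0.0, 0.0), sensor=(30.0, 0.0), wall_y=20.0)
```

The extra delay r − d is what an echo-removal step would target, and the interference pattern between the two arrivals depends on this path-length difference, which is why it can serve as a position cue when the reflector geometry is known.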
FIG. 10 is a simplified schematic illustrating a tracking system 1000 in a landscape 1088, according to some embodiments of the invention.

In some embodiments, system 1000 includes a plurality of acoustic sensors 1008, 1010, 1012, 1014, distributed along a structure 1020. In some embodiments, system 1000 includes one or more acoustic sensors 1016, 1018 positioned at other point/s in the landscape. For example, one or more sensor 1016 on another structure 1080. For example, one or more sensor 1018 located on a ground surface 1086.

In some embodiments, sensors 1008, 1010, 1012, 1014 and/or sensor 1016 and/or sensor 1018 provide acoustic measurement signals to a processor, e.g. a single processor (not illustrated). Where processing is performed by the processor, e.g. according to method/s described within this document, e.g. with respect to processor 122 FIG. 1.

In some embodiments, the landscape 1088 includes different types of surface, for example, road surface/s 1084, 1088. For example, built structures 1080, 1062. For example, ground surface/s having, in some embodiments, different acoustic features 1082, 1086. For example, if ground surface 1086 is a non-built surface, e.g. a grass and/or garden surface, it will have different acoustic characteristics than, for example, road surfaces 1084, 1088.

In some embodiments, topography of landscape 1088, e.g. including protrusion/s and/or passageways (e.g. between structures 1080, 1062), affects acoustic signals from vehicles 1004, 1005, e.g. as received by the acoustic sensors.

In some embodiments, one or more acoustic sensor 1090 is located in a known, but not fixed, position. For example, in some embodiments, a moving vehicle 1004 having a known position (e.g. via GNSS) hosts one or more sensor. Where, in some embodiments, vehicle 1004 is a land vehicle, or an air vehicle (e.g. drone).
FIG. 11 is a simplified schematic illustrating a tracking system 1100 including a plurality of acoustic sensors 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, in a landscape 1188, according to some embodiments of the disclosure. Where, in some embodiments, system 1100 includes one or more features of systems as described elsewhere in this document.

In some embodiments, FIG. 11 illustrates an aerial view of landscape 1188, where illustrated are positions of a plurality of acoustic sensors 1, 2, 3, 4, 5, 6, 7, 8, 9, 10. In some embodiments, FIG. 11 is an aerial photograph of landscape 1188 where positions of acoustic sensors of the system are illustrated.

In some embodiments, landscape 1188 includes one or more of buildings 1162, roads 1184, and open spaces 1186 (which, in some cases, include trees 1182). Where numerals indicating feature/s of landscape 1188 should be understood to be illustrative and non-limiting.
FIG. 12 is a method of vehicle tracking, according to some embodiments of the invention.

At 1200, in some embodiments, data regarding a landscape is received. Where, in some embodiments, the landscape data includes data regarding topology of the landscape and/or structures within the landscape.
In some embodiments, the landscape data includes acoustic characteristic/s of the landscape. For example, material characteristics as related to, e.g., acoustic reflectivity. In some embodiments, landscape data includes modelling of the effect of portion/s of the landscape on a vehicular acoustic signal.

In some embodiments, landscape data includes elevation and/or structural mapping of the landscape. For example, natural feature topology (e.g. hill and/or valley topography). For example, size and/or shape of structure/s, e.g. buildings, e.g. including acoustically transferring features, e.g. tunnels. In some embodiments, landscape data includes material types of portion/s of the landscape, e.g. building surface, road surface, natural surface, e.g. with respect to their acoustic properties.

In some embodiments, landscape data is used to generate a model which produces expected changes to acoustic signatures, based on the landscape data. In some embodiments, the model is specific to a particular vehicle. In some embodiments, the model is general, being able to identify changes to an acoustic signature, based on passive acoustic feature/s of the landscape (e.g. associated with topography and/or materials making up the landscape).

In some embodiments, the model is built by acquiring landscape data. For example, by acoustically measuring the landscape. 
In some embodiments, acoustic signals are generated at different portions of the landscape, measurements of these signals at the plurality of sensors being used to generate a model which provides position information for a received acoustic signal. In some embodiments, the acoustic signals are those of a vehicle; for example, a known vehicle moved around the landscape according to a known route is used, in some embodiments, to provide data regarding the effect of the landscape on the vehicle acoustic signature. In some embodiments, these measurements are performed a plurality of times, for example, using different vehicles and/or different vehicle movements (e.g. speed), e.g. where the model provides information for different vehicle types.

At 1201, in some embodiments, a spatial relationship between a plurality of acoustic sensors is received. For example, including one or more feature of step 200 FIG. 2. Where, in some embodiments, information regarding spatial relationship between the sensor/s and one or more feature of a landscape in which said plurality of acoustic sensors are located is received. For example, position and/or topography of one or more landscape feature and/or the spatial relationship of the feature/s with respect to the acoustic sensor/s.

Where landscape feature/s include, in some embodiments, building structure/s, e.g. position and/or shape and/or dimensions of the building/s. Where landscape feature/s include, in some embodiments, topography of physical features in the landscape of the sensors. For example, position and/or dimension of hills and/or valleys and/or other undulation in a landscape (e.g. boulders and/or other geographical features). Where vehicle tracking is performed within a structure, landscape feature/s include, for example, walls and/or windows of the structure, and/or elements (e.g. 
furniture, machinery) housed by the structure. In some embodiments, information including material characteristic/s of the feature/s, e.g. those material characteristics affecting audio reflection from the feature/s (e.g. acoustic reflectivity), is also received.

At 1202, in some embodiments, sensor measurement signals are received from the plurality of sensors. For example, including one or more feature of step 202 FIG. 2.

At 1204, in some embodiments, a vehicular acoustic signature is identified in the plurality of acoustic sensor measurements. For example, including one or more feature of step 204 FIG. 2. In some embodiments, sensed reflection/s of the vehicular acoustic signature at landscape feature/s is identified. Where, in some embodiments, reflections are identified as such using the identified directly incident acoustic signature signals received (e.g. a same sensor receives a second acoustic signature of a same vehicle, but at a later time) and/or by feature/s of the reflection acoustic signature (e.g. lower magnitude, different AOA).

At 1206, optionally, in some embodiments, received acoustic sensor signals are corrected, for example, using identified reflections of the vehicle acoustic signature. In some embodiments, landscape information and/or identification of reflections of the vehicle acoustic signal are used to remove noise and/or correct the signal. Where, in some embodiments, correction removes echo noise of reflection/s of the acoustic signature at landscape features from sensor signal/s. In some embodiments, for example, where an obstacle prevents receipt of the direct vehicular acoustic signature, one or more feature of a reflected vehicular acoustic signature is corrected, e.g. to adjust the received acoustic signature features to those expected if the obstacle were not in position. 
For example, where the reflection delays arrival of the acoustic signature, one or more of TOA, DOA, AOA are corrected, using known spatial features of the obstacle and/or the position of the obstacle with respect to the sensor/s.

At 1208, in some embodiments, a vehicle position is determined using the identified vehicular acoustic signature, its reflection/s, and received information regarding landscape feature/s. In some embodiments, reflection/s of the vehicular acoustic signature are removed from the acoustic measurement signal/s, for example, based on the information regarding landscape feature/s received. Where the noise-reduced acoustic measurement signal/s are then used to determine position and/or movement of the vehicle. In some embodiments, identified vehicular acoustic signature reflection/s in one or more acoustic measurement signal, for example, along with information regarding landscape feature/s, are used in tracking of the position of the vehicle.
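The position determination of 1208 can, in some embodiments, be realized by multilateration over arrival times at the dispersed sensors. The following is a minimal illustrative sketch only: the function and parameter names are hypothetical, and a coarse grid search over TDOA residuals stands in for whatever solver an actual implementation uses.

```python
import numpy as np

def tdoa_locate(sensor_xy, toas, c=343.0, grid=200, span=100.0):
    """Estimate a 2-D source position from times of arrival at fixed sensors.

    sensor_xy: (N, 2) sensor positions in metres.
    toas:      (N,) measured times of arrival in seconds.
    Minimizes squared error between predicted and measured arrival-time
    differences (relative to sensor 0) over a coarse search grid.
    """
    sensor_xy = np.asarray(sensor_xy, float)
    toas = np.asarray(toas, float)
    xs = np.linspace(-span, span, grid)
    ys = np.linspace(-span, span, grid)
    gx, gy = np.meshgrid(xs, ys)
    # Distance from every grid point to every sensor.
    d = np.sqrt((gx[..., None] - sensor_xy[:, 0]) ** 2 +
                (gy[..., None] - sensor_xy[:, 1]) ** 2)
    # Predicted and measured time differences of arrival w.r.t. sensor 0.
    pred = (d - d[..., :1]) / c
    meas = toas - toas[0]
    cost = ((pred - meas) ** 2).sum(axis=-1)
    i, j = np.unravel_index(np.argmin(cost), cost.shape)
    return xs[j], ys[i]
```

With exact arrival times from four sensors at the corners of a 100 m square, the estimate lands within one grid cell of the true source position; resolution and robustness in practice depend on the solver and on the echo corrections described above.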
FIGs. 13A-B are simplified schematics illustrating acoustics of a landscape 1388, according to some embodiments of the disclosure. Where, in some embodiments, FIGs. 13A-B illustrate an aerial view of landscape 1388. Where illustrated on the landscape are buildings 1362. In some embodiments, FIGs. 13A-B illustrate a feature of an acoustic signal (e.g. time of arrival, intensity). Where the acoustic signal is emitted from positions 1300, 1302, respectively. Where lighter shades illustrate earlier time of arrival and/or higher intensity than darker shades. FIGs. 13A-B illustrate that, for different positions 1300, 1302 of an acoustic signal, features of the landscape (buildings 1362) affect the acoustic signal.
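The landscape-dependent signal features shown in FIGs. 13A-B suggest one way the model described regarding FIG. 12 may be realized: a fingerprint table mapping per-sensor feature vectors, recorded while a known source moves through the landscape, to source positions, queried by nearest neighbour. A minimal sketch, in which the class name, the flat feature-vector representation, and the nearest-neighbour rule are all illustrative assumptions rather than the disclosure's method:

```python
import numpy as np

class AcousticFingerprintModel:
    """Nearest-neighbour lookup from per-sensor signal features to position."""

    def __init__(self):
        self._features = []   # one feature vector per calibration point
        self._positions = []  # matching (x, y) source positions

    def add_calibration(self, feature_vec, position):
        """Record features measured while a known source is at `position`."""
        self._features.append(np.asarray(feature_vec, float))
        self._positions.append(tuple(position))

    def locate(self, feature_vec):
        """Return the calibration position whose features best match."""
        q = np.asarray(feature_vec, float)
        dists = [np.linalg.norm(q - f) for f in self._features]
        return self._positions[int(np.argmin(dists))]
```

Calibration entries would correspond to the known-route vehicle measurements described regarding FIG. 12; richer models (per vehicle type, speed, terrain) could key the table accordingly.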
FIGs. 14A-B are a method of vehicle tracking, according to some embodiments of the disclosure. Referring now to FIG. 14A:

At 1400, in some embodiments, an acoustic tracking system is installed. Where, in some embodiments, the system includes one or more feature as described regarding and/or illustrated for one or more of system 100 FIG. 1, system 300 FIG. 3, and system 900 FIG. 9. In some embodiments, installation includes fixedly positioning a plurality of acoustic sensors, e.g. by attaching them to an existing environmental structure. In some embodiments, installation includes performing calibration step/s. Where, in some embodiments, calibration is saved to a system memory and/or is accessible by the system. For example, where the position of one or more of the sensors is registered with respect to landscape data (e.g. GNSS data). For example, where spatial relationships between sensor/s (e.g. distance between sensors) are measured. For example, where spatial relationships between sensor/s and environmental structure/s and/or features are measured.

In some embodiments, installation of a system includes acquiring data regarding the environment into which the system is installed. For example, where registration to other tracking and/or location modalities is performed for one or more structure feature and/or landscape feature. For example, additionally or alternatively, where measurements as to structural environmental feature/s and/or landscape feature/s are acquired. For example, in some embodiments, the measurements are stored accessibly to the system and/or used in providing data to the system.

At 1402, in some embodiments, a spatial relationship between a plurality of acoustic sensors is received. For example, according to one or more feature as described regarding step 200 FIG. 2. For example, where the spatial relationship is determined from data acquired during calibration/s e.g.
as described in step 1400.

At 1404, in some embodiments, data regarding infrastructure and/or topography of an environment of an acoustic tracking system is received.

At 1406, optionally, in some embodiments, position of sensors of the acoustic tracking system with respect to the infrastructure and/or topography data is received. In some embodiments, one or more of steps 1402-1406 is performed together. Where, in some embodiments, position of the sensors on a map of the environment is received. For example, where data at 1404, 1406 includes GNSS data, sensor position, in some embodiments, is registered to the GNSS data.
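Where sensor positions are registered to GNSS data (steps 1404-1406), the inter-sensor spatial relationship of step 1402 can, in some embodiments, be derived from the coordinates themselves. A sketch using the haversine great-circle formula (function names and the example coordinates are hypothetical; the formula is a standard geodesy approximation, adequate at the sub-kilometre scales of a sensor deployment):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points (degrees)."""
    r = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def sensor_distances(positions):
    """Pairwise distances for a dict of sensor_id -> (lat, lon)."""
    ids = sorted(positions)
    return {(a, b): haversine_m(*positions[a], *positions[b])
            for i, a in enumerate(ids) for b in ids[i + 1:]}
```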
At 1407, optionally, in some embodiments, data regarding vehicle acoustic signatures, e.g. those of different vehicles, is received, e.g. according to one or more feature of step 402 FIG. 4. In some embodiments, the vehicle acoustic signature data is acquired, for example, by collecting acoustic measurements of one or more vehicles. Where, in some embodiments, acoustic measurements are acquired of a vehicle moving in different states (e.g. different speeds) and/or over different terrains (e.g. over surfaces with different acoustic signatures, e.g. a dirt path, paved road).

At 1408, in some embodiments, measurement signals from a plurality of acoustic sensors are received. For example, according to one or more feature as illustrated in and/or described regarding step 202 FIG. 2.

At 1410, optionally, in some embodiments, measurement signals are received from one or more non-acoustic sensor (e.g. sensor 178 FIG. 1). For example, a temperature sensor, e.g. for determining a speed of sound in an environment of the acoustic sensors. For example, inertial measurements, e.g. as provided by one or more inertial sensor attached to the vehicle to be tracked. For example, one or more proximity sensor.

At 1412, optionally, in some embodiments, reflection/s of the acoustic signature at environmental feature/s are identified. For example, according to one or more feature of step 1204 FIG. 12.

At 1414, optionally, in some embodiments, acoustic measurement signals are corrected. For example, using received data regarding environment infrastructure and/or topography, e.g. to remove echo/s. For example, using one or more feature as illustrated in and/or described regarding step 1206 FIG. 12.

Referring now to FIG. 14B, which in some embodiments is a continuation of the method of FIG. 14A.
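Regarding the temperature sensor of step 1410: in some embodiments the measured air temperature is converted to a local speed of sound, for example via the common linear approximation for dry air. A sketch; the constants come from that general approximation, not from the disclosure:

```python
def speed_of_sound_air(temp_celsius):
    """Approximate speed of sound in dry air, in m/s.

    Linear approximation c ~= 331.3 + 0.606 * T, reasonable near
    ordinary ambient temperatures.
    """
    return 331.3 + 0.606 * temp_celsius
```

The resulting value would feed the TOA/TDOA computations in place of a fixed nominal speed, reducing position error across seasonal or daily temperature swings.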
At 1416, in some embodiments, an acoustic signature of a vehicle is identified in one or more of the measurement signals. For example, according to one or more feature as illustrated in and/or described regarding step 204 FIG. 2. Optionally, the acoustic signature is identified using data received at step 1407. In some embodiments, one or more feature is identified in the acoustic signature in one or more of the sensor signals, e.g. according to one or more feature of step 606 FIG. 6. Optionally, reflection/s of the acoustic signature, e.g. at environmental feature/s, are identified, e.g. according to one or more feature of step 1204 FIG. 12.

At 1418, optionally, in some embodiments, a vehicle type and/or state and/or a specific vehicle is identified, e.g. according to one or more feature of step 408 FIG. 4.

At 1420, optionally, in some embodiments, a Doppler shift in one or more acoustic signature is identified. For example, according to one or more feature of step 806 FIG. 8.

At 1422, in some embodiments, position and/or movement of the vehicle is determined, using the identified acoustic signatures (according to step 206 FIG. 2) and/or feature/s of identified acoustic signatures (e.g. including, optionally, comparison of features between sensor signals). Optionally, position and/or movement is also determined using one or more of:
• an identified vehicle state, type, or specific vehicle, e.g. according to step 408 FIG. 4;
• identified occurrences of a recurring feature of the acoustic signature in sensor signal/s, e.g. according to step 604 FIG. 6;
• an identified Doppler shift in one or more acoustic signature:
o for example, where a position is corrected using a position identified using the Doppler shift at a specific sensor and a known position of the specific sensor;
o e.g. according to step 808 FIG. 8;
• identified sensed reflections of the acoustic signature, e.g. according to step 1208 FIG. 12.

At 1424, in some embodiments, the determined location and/or movement of the vehicle is transmitted and/or saved. For example, to the vehicle itself, e.g. to display to a user of the vehicle, e.g. where the vehicle self-navigates based on location and/or measurement feedback received. For example, to a control center. Where saving is performed locally (e.g. at the vehicle and/or local to the sensors) and/or remotely (e.g. at the cloud and/or at a remote control center).

At 1426, optionally, in some embodiments, control signal/s for control of the vehicle are generated, based on the determined position and/or movement. Where, in some embodiments, the control signal/s include control signals for one or more actuator of the vehicle, e.g. to control movement and/or position of the vehicle. In some embodiments, a desired trajectory and/or destination is also used, e.g. along with the determined position and/or movement, in generating the control signal/s. For example, as described regarding FIG. 1, in some embodiments, generation occurs locally (e.g. at a processor local to the system and/or at the vehicle itself) and optionally (additionally or alternatively) generation occurs at least partially remotely (e.g. at a control center).

At 1428, optionally, in some embodiments, control signals are transmitted to the vehicle, e.g. where the control signals are at least partially generated away from the vehicle.
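The Doppler shift of steps 1420-1422 relates observed and emitted frequencies to the radial speed of the source: for a stationary sensor, f_obs = f_src * c / (c - v_r), with v_r positive when the source approaches. A minimal sketch inverting this relation (the function name and default speed of sound are illustrative assumptions):

```python
def radial_speed_from_doppler(f_observed, f_source, c=343.0):
    """Radial speed (m/s) of a moving source toward a stationary sensor.

    Positive means approaching. Inverts f_obs = f_src * c / (c - v_r).
    """
    return c * (1.0 - f_source / f_observed)
```

Combined with the known position of the sensor at which the shift was identified, such a radial-speed estimate is one way the position correction described for step 1422 could be informed.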
FIGs. 15A-B are plots of amplitude with time, for acoustic measurement signals, according to some embodiments of the disclosure. FIGs. 16A-B are plots of amplitude with time, for acoustic measurement signals, according to some embodiments of the disclosure. FIGs. 17A-B are plots of amplitude with time, for acoustic measurement signals, according to some embodiments of the disclosure.

In some embodiments, FIGs. 15A-B, FIGs. 16A-B, and FIGs. 17A-B each illustrate a measurement signal produced by a sensor when a same type of acoustic signal is generated at different positions in a landscape. For example, in some embodiments, the figures illustrate experimental data from a sensor when an acoustic signal (e.g. a gun-shot), referring to FIG. 11, is produced at each of positions 8 (FIGs. 15A-B), 6 (FIGs. 16A-B), and 7 (FIGs. 17A-B). Where B of each graph set illustrates a second measurement for a second acoustic signal produced at the same position (e.g. a second gun-shot).
Visible in FIGs. 15A-17B is the similarity of the measurement signal for a repeated sound at a same measurement point, and the difference in the measurement signal for the same type of acoustic signal at different positions within the landscape. This illustrates the ability to use such measurement signals to identify a position of an object making a sound (e.g. gun, vehicle) using an acoustic signal from one or more acoustic sensor in the landscape, e.g. where the effect of the landscape on an acoustic path from the object to the measurement sensor is known.
FIGs. 18A-B are plots of frequency magnitude with time, for acoustic measurement signals, according to some embodiments of the disclosure. FIGs. 19A-B are plots of frequency magnitude with time, for acoustic measurement signals, according to some embodiments of the disclosure. FIGs. 20A-B are plots of frequency magnitude with time, for acoustic measurement signals, according to some embodiments of the disclosure.

Where, in some embodiments, light colors illustrate high intensity for the frequency of the y-axis; the darker the shade, the lower the intensity. In some embodiments, FIGs. 18A-B, FIGs. 19A-B, and FIGs. 20A-B each illustrate a measurement signal produced by a sensor when a same type of acoustic signal is generated at different positions in a landscape. For example, in some embodiments, the figures illustrate experimental data from a sensor when an acoustic signal (e.g. a gun-shot), referring to FIG. 11, is produced at each of positions 8 (FIGs. 18A-B), 6 (FIGs. 19A-B), and 7 (FIGs. 20A-B). Where B of each graph set illustrates a second measurement for a second acoustic signal produced at the same position (e.g. a second gun-shot).

Visible in FIGs. 18A-20B is the similarity of the measurement signal for a repeated sound at a same measurement point, and the difference in the measurement signal for the same type of acoustic signal at different positions within the landscape. This illustrates the ability to use such measurement signals to identify a position of an object making a sound (e.g. gun, vehicle) using an acoustic signal from one or more acoustic sensor in the landscape, e.g. where the effect of the landscape on an acoustic path from the object to the measurement sensor is known.
FIG. 21 illustrates plots of acoustic sensor signals, with time, for a repeated acoustic signal. Where, in some embodiments, columns 1, 2, 3, 4 each illustrate measurement of an acoustic measurement signal over time. Where individual plots illustrate a sensor signal of amplitude with time. Where a same type of acoustic signal is generated at a same position in a landscape and measured at a same position. The similarity of traces at T=0-0.2, T=0.2-0.4, T=0.4-0.6, and T=0.6-0.8 illustrates a consistent effect of the landscape on the measured acoustic signal.
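The trace consistency described for FIG. 21 can be quantified, for example, by peak normalised cross-correlation between repeated measurements. A minimal sketch; this particular similarity measure is an illustrative assumption, not necessarily the disclosure's method:

```python
import numpy as np

def max_norm_xcorr(a, b):
    """Peak normalised cross-correlation between two 1-D traces, in [-1, 1].

    Value near 1 indicates the traces match (up to shift and scale), as
    for repeated sounds measured through the same landscape path.
    """
    a = np.asarray(a, float)
    b = np.asarray(b, float)
    a = (a - np.mean(a)) / (np.std(a) * len(a))
    b = (b - np.mean(b)) / np.std(b)
    return float(np.max(np.correlate(a, b, mode="full")))
```

A high score between a live measurement and a stored trace for a known source position is one plausible way to exploit the repeatability shown in FIGs. 15A-21.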
FIG. 22 illustrates plots of acoustic sensor signals, according to some embodiments of the disclosure. In some embodiments, FIG. 22 illustrates measurement signals of an acoustic sensor array where each column indicates a position from which a same acoustic signal is produced. In some embodiments, referring to FIG. 11, FIG. 22 illustrates measurements of a sensor array located at position 10. Where the acoustic signal is produced repetitively (e.g. numbers at the top of the columns in FIG. 22 indicating positions of the source of the acoustic signal), individual plots illustrating measurement signals for the repetitions. In some embodiments, the individual plots illustrate an angle of the highest amplitude trace (e.g. where the array provides a plurality of traces), with time. Visible in the graphs is the repetitive (e.g. identifiable patterns) nature of the same acoustic signal at different positions in a landscape with respect to the sensing position.

General

As used within this document, the term "about" refers to ±20%.

The terms "comprises", "comprising", "includes", "including", "having" and their conjugates mean "including but not limited to". The term "consisting of" means "including and limited to". As used herein, singular forms, for example, "a", "an" and "the" include plural references unless the context clearly dictates otherwise.

Within this application, various quantifications and/or expressions may include use of ranges. Range format should not be construed as an inflexible limitation on the scope of the present disclosure. Accordingly, descriptions including ranges should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range.
For example, description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within the stated range and/or subrange, for example, 1, 2, 3, 4, 5, and 6. Whenever a numerical range is indicated within this document, it is meant to include any cited numeral (fractional or integral) within the indicated range.

It is appreciated that certain features which are (e.g., for clarity) described in the context of separate embodiments may also be provided in combination in a single embodiment. Where various features of the present disclosure, which are (e.g., for brevity) described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination, or may be suitable for use with any other described embodiment. Features described in the context of various embodiments are not to be considered essential features of those embodiments unless the embodiment is inoperative without those elements.

Although the present disclosure has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications, and variations will be apparent to those skilled in the art. Accordingly, this application intends to embrace all such alternatives, modifications, and variations that fall within the spirit and broad scope of the appended claims.

All references (e.g., publications, patents, patent applications) mentioned in this specification are herein incorporated in their entirety by reference into the specification, e.g., as if each individual publication, patent, or patent application was individually indicated to be incorporated herein by reference. Citation or identification of any reference in this application should not be construed as an admission that such reference is available as prior art to the present disclosure.
In addition, any priority document(s) and/or documents related to this application (e.g., co-filed) are hereby incorporated herein by reference in its/their entirety.Where section headings are used in this document, they should not be interpreted as necessarily limiting.
Claims (23)
1. A method of vehicle tracking comprising:
receiving a spatial relationship between a plurality of acoustic sensors dispersed within an environment;
receiving information regarding said environment;
receiving a plurality of acoustic measurement signals, an acoustic measurement signal from each of said plurality of acoustic sensors;
identifying a vehicular acoustic signature of a vehicle within said environment, in at least one of said plurality of acoustic measurement signals;
determining a location of said vehicle with respect to said plurality of acoustic sensors, using said plurality of acoustic measurement signals, said acoustic signature, said spatial relationship between said plurality of acoustic sensors, and said information regarding said environment.
2. The method according to claim 1, wherein said receiving information regarding said environment comprises receiving information comprising one or more of:
topographical information regarding said environment; and
effect on passage of sound through said environment;
wherein said determining comprises determining said location of said vehicle within said environment using said information regarding said environment.
3. The method according to any one of claims 1-2, wherein said identifying comprises identifying a vehicular acoustic signature of a vehicle within said environment, in at least two of said plurality of acoustic measurement signals.
4. The method according to any one of claims 1-3, wherein a medium through which said acoustic sensors measure is air.
5. The method according to any one of claims 1-3, wherein said determining comprises identifying one or more features of said acoustic signature and using said identified features to determine said location of said vehicle.
6. The method according to claim 5, wherein said determining comprises comparing identified features of said acoustic signature in at least two of said plurality of acoustic measurement signals.
7. The method according to any one of claims 5-6, wherein said one or more features includes a recurring feature of said acoustic signature.
8. The method according to any one of claims 1-7, wherein said information regarding said environment includes topology of one or more acoustically reflective landscape element.
9. The method according to claim 8, wherein said identifying comprises identifying one or more reflection of said acoustic signature within said plurality of sensor measurement signals;
wherein said determining said location comprises using said one or more reflection of said acoustic signature.
10. The method according to any one of claims 1-9, wherein said receiving comprises receiving information regarding vehicular acoustic signatures;
wherein said determining comprises determining one or more of:
a vehicle type;
a vehicle state;
a specific vehicle;
using said vehicular acoustic signature and said information regarding vehicular acoustic signatures.
11. The method according to claim 10, comprising generating control signals for said vehicle based on one or more of said vehicle type, said vehicle state, and said specific vehicle.
12. The method according to any one of claims 1-11, comprising generating control signals for control of said vehicle, using said location.
13. The method according to claim 12, wherein said generating comprises using one or more of a received desired destination and a desired vehicle trajectory.
14. The method according to any one of claims 1-13, comprising transmitting said location to said vehicle.
15. The method according to any one of claims 1-14, comprising identifying a Doppler shift in said acoustic signature of at least one of said sensor measurement signals.
16. The method according to claim 15, wherein said determining comprises using said identified Doppler shift and a position of a sensor corresponding to the acoustic signature in which said Doppler shift was identified.
17. A tracking system comprising:
a plurality of acoustic sensors dispersed within an environment of a vehicle to be tracked;
a processor configured to:
receive a spatial relationship between said plurality of acoustic sensors, and a location of said plurality of acoustic sensors within said environment;
receive a plurality of acoustic measurement signals, an acoustic measurement signal from each of said plurality of acoustic sensors;
identify a vehicular acoustic signature of a vehicle within said environment in at least two of said plurality of acoustic measurement signals;
determine a location of said vehicle within said environment by determining a position of said vehicle with respect to said plurality of acoustic sensors, using said acoustic signature and said positions of said acoustic sensors within said environment.
18. The system according to claim 17, wherein at least two of said plurality of acoustic sensors are spaced away from each other.
19. The system according to any one of claims 17-18, wherein said plurality of acoustic sensors are dispersed within said environment where an acoustic sensor is positioned within each portion of a plurality of portions of the environment.
20. The system according to any one of claims 17-19, wherein said plurality of acoustic sensors are fixedly attached to one or more structure within said environment.
21. The system according to any one of claims 17-20, comprising a transmitter configured to transmit said location to said vehicle.
22. The system according to any one of claims 17-21, wherein said processor is configured to generate control signals for control of said vehicle, using said location and a received one or more of destination and trajectory.
23. The system according to claim 22, comprising a transmitter configured to transmit said control signals to said vehicle.

For the Applicants,
REINHOLD COHN AND PARTNERS
By:
Priority Applications (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| IL299575A IL299575A (en) | 2022-12-28 | 2022-12-28 | Vehicle tracking using acoustic measurements |
| EP23911146.1A EP4643148A1 (en) | 2022-12-28 | 2023-12-25 | Vehicle tracking using acoustic measurement |
| PCT/IL2023/051309 WO2024142048A1 (en) | 2022-12-28 | 2023-12-25 | Vehicle tracking using acoustic measurement |
| AU2023416120A AU2023416120A1 (en) | 2022-12-28 | 2023-12-25 | Vehicle tracking using acoustic measurement |
| KR1020257024932A KR20250124379A (en) | 2022-12-28 | 2023-12-25 | Vehicle tracking using acoustic measurements |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| IL299575A IL299575A (en) | 2022-12-28 | 2022-12-28 | Vehicle tracking using acoustic measurements |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| IL299575A true IL299575A (en) | 2024-12-01 |
Family
ID=91716739
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| IL299575A IL299575A (en) | 2022-12-28 | 2022-12-28 | Vehicle tracking using acoustic measurements |
Country Status (5)
| Country | Link |
|---|---|
| EP (1) | EP4643148A1 (en) |
| KR (1) | KR20250124379A (en) |
| AU (1) | AU2023416120A1 (en) |
| IL (1) | IL299575A (en) |
| WO (1) | WO2024142048A1 (en) |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7872948B2 (en) * | 2008-04-14 | 2011-01-18 | The Boeing Company | Acoustic wide area air surveillance system |
| US9250315B2 (en) * | 2009-03-04 | 2016-02-02 | Toyota Motor Engineering & Manufacturing North America, Inc. | Collision avoidance system and method |
| US11408988B2 (en) * | 2018-09-24 | 2022-08-09 | Howden Alphair Ventilating Systems Inc. | System and method for acoustic vehicle location tracking |
2022
- 2022-12-28: IL IL299575A patent/IL299575A/en unknown

2023
- 2023-12-25: WO PCT/IL2023/051309 patent/WO2024142048A1/en not_active Ceased
- 2023-12-25: KR KR1020257024932A patent/KR20250124379A/en active Pending
- 2023-12-25: EP EP23911146.1A patent/EP4643148A1/en active Pending
- 2023-12-25: AU AU2023416120A patent/AU2023416120A1/en active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| KR20250124379A (en) | 2025-08-19 |
| WO2024142048A1 (en) | 2024-07-04 |
| AU2023416120A1 (en) | 2025-07-03 |
| EP4643148A1 (en) | 2025-11-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7506829B2 (en) | Speed Estimation and Object Tracking for Autonomous Vehicle Applications | |
| CN104793202B (en) | The object emerging system of more radar imagery sensors | |
| US9459238B1 (en) | Methods and apparatus for using acoustic inspection of containers to image objects | |
| EP3919238B1 (en) | Mobile robot and control method therefor | |
| KR20130056586A (en) | Method and apparatus for building map by using collective intelligent robots | |
| JP2017067756A (en) | Object detection apparatus and object detection method | |
| JP2019015598A (en) | Measurement device and method for measurement | |
| EP3761136B1 (en) | Control device, mobile body, and program | |
| CN109917788A (en) | A kind of control method and device of Robot wall walking | |
| CN113227832B (en) | Determine the orientation of an object by means of radar or by using electromagnetic interrogating radiation | |
| US7489255B2 (en) | Self-position identification apparatus and self-position identification method | |
| CN105912026A (en) | Flying robot obstacle avoiding device and flying robot obstacle avoiding method | |
| JP6920342B2 (en) | Devices and methods for determining object kinematics for movable objects | |
| RU2621463C2 (en) | Method of controlling towable linear acoustic antenna and navigation control unit | |
| AU2023416120A1 (en) | Vehicle tracking using acoustic measurement | |
| JP2024527619A (en) | System and method for sensing the environment surrounding a vehicle - Patents.com | |
| RU2019103392A (en) | OBTAINING SEISMIC DATA AT ULTRA LARGE RANGE FOR FULL-WAVE INVERSION DURING GROUND SEISMIC DATA COLLECTION | |
| JP2013185856A (en) | Methods for measuring position and wind speed utilizing doppler effect | |
| CN118038715B (en) | Aircraft monitoring method and device based on vibration sensor | |
| JP7294323B2 (en) | Moving body management device, moving body management system, moving body management method, and computer program | |
| CN119472680A (en) | Environmental perception method and system for autonomous driving vehicle | |
| CN109738899B (en) | Low-altitude aircraft detection method and system based on stochastic resonance detection array | |
| KR20180060233A (en) | Personal Radar & Personal Radar System, Animal-Driving Device and Animal-Driving System using the same | |
| Okaya et al. | A study on indoor positioning of an open-source drone using AR markers | |
| KR102068688B1 (en) | location recognition system of multi-robot |