US20220228866A1 - System and method for providing localization using inertial sensors - Google Patents
- Publication number
- US20220228866A1 (application Ser. No. 17/337,632)
- Authority
- US
- United States
- Prior art keywords
- runtime
- angular velocities
- accelerations
- defined area
- training
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G01C21/165 — Dead reckoning by integrating acceleration or speed (inertial navigation) combined with non-inertial navigation instruments
- G01C21/005 — Navigation with correlation of navigation data from several sources, e.g. map or contour matching
- G01C21/10 — Navigation by using measurements of speed or acceleration
- G01C21/20 — Instruments for performing navigational calculations
- G01S19/41 — Differential correction, e.g. DGPS [differential GPS]
- G01S19/43 — Determining position using carrier phase measurements, e.g. kinematic positioning; using long or short baseline interferometry
- G01S19/46 — Determining position by combining satellite measurements with a supplementary radio-wave measurement
- G01S19/47 — Determining position by combining satellite measurements with a supplementary inertial measurement, e.g. tightly coupled inertial
- G01S19/49 — Determining position by combining or switching between satellite and inertial position solutions, e.g. loosely coupled
- G06N3/04 — Neural network architecture, e.g. interconnection topology
- G06N3/08 — Neural network learning methods
- G06N3/084 — Backpropagation, e.g. using gradient descent
Definitions
- the present invention relates generally to localization technology and, more specifically, to localization based on inertial sensors.
- GNSS: global navigation satellite system, e.g., GPS (Global Positioning System), GLONASS, Galileo and BeiDou
- autonomous vehicle companies integrate localization and mapping sensors and algorithms to achieve high-accuracy localization solutions for driver safety.
- GNSS cannot be used for indoor navigation, localization or positioning applications.
- Indoor navigation, localization or positioning applications may include, for example, navigating robots or vehicles in storage warehouses, e.g., to monitor and deliver equipment efficiently, or navigating in an indoor parking lot.
- Today, indoor localization is typically performed by applying sensor fusion schemes, where data acquired by many types of sensors is integrated to provide an estimation of the location of the vehicle.
- GNSS may not provide adequate accuracy for some outdoor localization applications as well.
- localization systems for autonomous vehicles may require higher accuracy than is provided by GNSS.
- localization systems for autonomous vehicles may also use sensor fusion to achieve high-accuracy localization solutions.
- the sensors may include a camera, LIDAR, inertial sensors, and others. Unfortunately, these sensors may be expensive, and the quality of the data they provide may depend on various physical conditions, such as day and night, light and dark, urban canyons, and indoor environments. Hence, there is currently no single high-accuracy localization and mapping solution for vehicles that works both indoors and outdoors.
- a computer-based system and method for providing localization may include, during a training phase: obtaining a training dataset of accelerations, angular velocities, and known locations over time of vehicles moving in a defined area; and training a machine learning model to provide location estimation in the defined area based on the accelerations and angular velocities, using the training dataset; and, during a runtime phase: obtaining runtime accelerations and angular velocities over time of a vehicle moving in the defined area; and using the trained model to obtain the current location of the vehicle based on the runtime accelerations and angular velocities.
- the accelerations and angular velocities of the training set, as well as the runtime accelerations and angular velocities, may be measured using at least one inertial measurement unit (IMU).
- the IMU may include at least one three-dimensional accelerometer and at least one three-dimensional gyroscope.
- the machine learning model may be a neural network.
- Some embodiments of the invention may include, during the training phase: extracting features from the accelerations and angular velocities of the training dataset and adding the features to the training dataset; and during the runtime phase: extracting runtime features from the runtime accelerations and angular velocities; and using the trained model to obtain the current location of the vehicle based on the runtime acceleration, the runtime angular velocities and the runtime features.
- the features may be selected from velocity and horizontal slope.
- the known locations may be obtained from at least one of: a global navigation satellite system (GNSS) receiver and a real-time kinematic (RTK) positioning system.
- the defined area may include a route.
- Some embodiments of the invention may include dividing mapping of the defined area into segments, and according to some embodiments, the location may be provided as a segment in which the vehicle is located.
- Some embodiments of the invention may include performing anomaly detection to find changes in the defined area.
- Some embodiments of the invention may include obtaining readings from at least one sensor selected from a GNSS receiver, a Lidar sensor and a radio frequency (RF) sensor, and using the readings to enhance the accuracy of the current location provided by the trained ML model.
- FIG. 1 presents a system for providing localization of a vehicle, according to some embodiments of the invention.
- FIG. 2 presents the defined area or route divided into segments, according to some embodiments of the invention.
- FIG. 3 shows a flowchart of a method, according to some embodiments of the present invention.
- FIG. 4 presents an example of accelerations in the x, y and z directions and angular velocities in the x, y and z directions, according to some embodiments of the invention.
- FIG. 5 presents a deep learning neural network (DL NN) model during training phase, according to some embodiments of the invention.
- FIG. 6 presents the trained DL NN model, according to some embodiments of the invention.
- FIG. 7 shows a high-level block diagram of an exemplary computing device according to some embodiments of the present invention.
- the terms “plurality” and “a plurality” as used herein may include, for example, “multiple” or “two or more”.
- the term set when used herein may include one or more items.
- the method embodiments described herein are not constrained to a particular order or sequence. Additionally, some of the described method embodiments or elements thereof can occur or be performed simultaneously, at the same point in time, or concurrently.
- sensors may include cameras, inertial sensors, Lidar, and RF sensors. These sensors typically suffer from considerable disadvantages and are unable to provide the high accuracy needed in various scenarios, such as indoor localization, where no GNSS reception is available, and outdoor localization for autonomous vehicles, where the accuracy provided by GNSS is still not high enough to allow safe driving.
- Other applications that may require high-accuracy navigation include navigating around a parking lot and navigating a tractor or other agricultural vehicle in a field, where the tractor needs to cover an entire field efficiently, e.g., without leaving any part of the field uncovered and with minimal repetitions.
- An inertial measurement unit (IMU) may be or may include an electronic device configured to measure the specific force, angular velocity, magnetic field and/or orientation of a vehicle, typically using one or more accelerometers, e.g., three-dimensional accelerometers, one or more gyroscopes, e.g., three-dimensional gyroscopes, and optionally magnetometers.
- IMUs may be used in strapdown inertial navigation system (SINS), where the IMU sensor is physically attached to the body of the vehicle and measurements are integrated into motion equations. Moving along surfaces, roads, and other terrains results in a dynamic change of the IMU readings.
- the sensor readings contain intrinsic knowledge regarding the changes in location, which may be used to calculate the current location of the vehicle.
- current IMUs used in SINS typically suffer from biases and drift over time, making SINS problematic for high-accuracy localization and positioning solutions when used alone, without any accurate measurement update.
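As a rough numerical illustration of this drift problem (not taken from the patent; the bias magnitude, sampling rate and duration are assumptions), a small constant accelerometer bias, integrated twice, grows quadratically into a large position error:

```python
import numpy as np

FS = 100.0                 # assumed IMU sampling rate, Hz
DT = 1.0 / FS
BIAS = 0.01                # assumed constant accelerometer bias, m/s^2
n = int(60 * FS)           # one minute of samples

vel_err = np.cumsum(np.full(n, BIAS)) * DT   # first integration: velocity error
pos_err = np.cumsum(vel_err) * DT            # second integration: position error

# After one minute, a 0.01 m/s^2 bias has produced roughly
# 0.5 * 0.01 * 60^2 = 18 m of position error.
```

This is why a standalone SINS needs periodic measurement updates, or, as in the embodiments below, a learned mapping from inertial readings to location.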
- Some embodiments of the invention aim to solve the high-accuracy localization and positioning problem in real-time for vehicles using inertial sensors by using machine learning (ML) models.
- readings of the inertial sensors may be provided as input to a deep learning (DL) neural network (NN) model.
- signals from IMUs may be provided to an ML model that may provide the position or location of the vehicle.
- the signals provided by IMUs may include information indicative of accelerations, angular velocities, and time as raw data. Additionally, features may be calculated based on raw data, such as velocity and horizontal slope, etc.
- Some embodiments of the invention may include training a machine learning model to provide location estimation using a training dataset of accelerations, angular velocities, and known locations over time of vehicles moving in a defined area.
- In this manner, a functional mapping from inertial sensor readings to locations may be established.
- raw data information measured by IMUs of vehicles may be provided to the trained ML model, and the trained model may provide location or position estimation of the moving vehicle in real time.
- the ML model may be or may include a NN model, and more specifically, a DL NN.
- a NN may include neurons and nodes organized into layers, with links transferring outputs from one neuron to another. Aspects of a NN may be weighted, e.g., links may have weights, and training may involve adjusting weights. Aspects of a NN may include transfer functions, also referred to as nonlinear activation functions, e.g., the output of a node may be calculated using a transfer function.
- a NN may be executed and represented as formulas or relationships among nodes or neurons, such that the neurons, nodes or links are “virtual”, represented by software and formulas, where training or executing a NN is performed by for example a dedicated or conventional computer.
- a DL NN model may include many neurons and layers, e.g., convolutional layers, with non-linear activation functions such as Softmax, rectified linear unit (ReLU), etc.
- the training dataset may be generated in a tagging procedure for a given area or environment. For example, a designer may map the entire area (e.g., a route, a parking lot, tunnels, urban canyons and other areas where GNSS reception is poor etc.) where a batch of raw data may be tagged or labeled with the correct location. This process may be improved with user-recorded raw data information and shared on a cloud.
- some embodiments of the invention may improve the technology of positioning and localization by providing high-accuracy localization solutions based on IMU signals and DL ML schemes in real-time.
- some embodiments of the invention may provide a database of terrain information for various areas such as routes, parking lots, tunnels and urban canyons and keep updating the database. Some embodiments may find anomalies in the received signals, adjust the DL NN model online, and provide notifications regarding terrain anomalies in a vehicle's network for safe driving, where pits and other dangerous road modifications will be shared among all users in a defined zone.
- FIG. 1 depicts a system 100 for providing localization of a vehicle, according to some embodiments of the invention.
- system 100 may include a vehicle 110 , equipped with one or more sensor units 112 that may measure and provide data including at least one of specific force, angular velocity and/or orientation of the vehicle, typically using at least one accelerometer, e.g., a three-dimensional accelerometer, and/or gyroscope, e.g., a three-dimensional gyroscope.
- sensor unit 112 may be or may include an IMU or an SINS, e.g., may be physically attached to the body of vehicle 110 .
- Sensor unit 112 may further include a processor and a communication module for initial processing and transmitting of data measured by sensor unit 112 to navigation server 130 .
- vehicle 110 may be a vehicle moving along a road, way, path or route 120 .
- system 100 may include a vehicle moving in any defined area, such as a parking lot, a tunnel, a field, an urban canyon (e.g., areas in cities where reception of signals from GNSS is poor) or another confined area, or an indoor vehicle, e.g., a robot, moving in a confined indoor area.
- Sensor unit 112 may provide the data measured by sensor unit 112 to a navigation server 130 directly or through networks 140 .
- Networks 140 may include any type of network or combination of networks available for supporting communication between sensor unit 112 and navigation server 130 .
- Networks 140 may include, for example, a wired, wireless, fiber optic, cellular or any other type of connection, a local area network (LAN), a wide area network (WAN), the Internet and intranet networks, etc.
- Each of navigation server 130 and sensor unit 112 may be or may include a computing device, such as computing device 700 depicted in FIG. 7 .
- One or more databases 150 may be or may include a storage device, such as storage device 730 .
- navigation server 130 and database 150 may be implemented in a remote location, e.g., in a ‘cloud’ computing system.
- navigation server 130 may store in database 150 data obtained from sensor unit 112 and other data such as ML model parameters, mapping of terrain and/or route 120 , computational results, and any other data required for providing localization or positioning data according to some embodiments of the invention.
- navigation server 130 may be configured to obtain, during a training phase, a training dataset of accelerations, angular velocities, and known locations over time of vehicles 110 moving in a defined area or route 120 , and to train an ML model, e.g., a DL NN model, to provide location estimation in the defined area or route 120 based on the accelerations and angular velocities using the training dataset.
- navigation server 130 may be configured to obtain, during a training phase, a training dataset of accelerations, angular velocities over time as measured by sensor unit 112 .
- the data measured by sensor unit 112 may be tagged or labeled with the known locations.
- navigation server 130 may obtain the known locations from at least one of a GNSS receiver and a real-time kinematic (RTK) positioning system. Other methods may be used to obtain the location data.
- navigation server 130 may be further configured to, during a runtime phase, obtain runtime accelerations and angular velocities over time of a vehicle 110 moving in the defined area or route 120 and use the trained model to obtain current location of vehicle 110 based on the runtime acceleration and angular velocities.
- navigation server 130 may be further configured to, during the training phase, extract features from the accelerations and angular velocities of the training dataset and add the features to the training dataset.
- the features may include velocity, horizontal slope and/or other features.
- Navigation server 130 may be further configured to, during the runtime phase, extract the same type of features from the runtime accelerations and angular velocities, and use the trained model to obtain the current location of vehicle 110 based on the runtime accelerations, the runtime angular velocities and the runtime features.
- navigation server 130 may have mapping of the defined area or route 120 .
- navigation server 130 may divide the mapping of the defined area or route 120 into segments and may provide or express the location of vehicle 110 as a segment in which vehicle 110 is located.
- a defined area or route 120 divided into segments is presented, according to some embodiments of the invention.
- the dashed squares/rectangles represent segments 200
- the location of vehicle 110 is between segments 210 and 220 .
- Other methods may be used to provide or express the location of vehicle 110 , e.g., using coordinates.
- FIG. 3 shows a flowchart of a method according to some embodiments of the present invention. The operations of FIG. 3 may be performed by the systems described in FIGS. 1 and 7 , but other systems may be used.
- a training dataset of accelerations, angular velocities, and known locations over time of vehicles moving in a defined area may be obtained.
- the accelerations and angular velocities may be measured by a sensor unit 112 including, for example, an IMU, and the known locations may be obtained from a GNSS receiver and/or a RTK positioning system. Other positioning systems may be used.
- An example of raw data measured by sensor unit 112 is presented in FIG. 4 , which depicts accelerations in the x, y and z directions (labeled a_x, a_y and a_z, respectively) and angular velocities in the x, y and z directions (labeled w_x, w_y and w_z, respectively), measured by a sensor unit 112 that is attached to a vehicle 110 moving along defined area or route 120 .
- Defined area or route 120 may include uniform or non-uniform terrain, where pits, speed bumps and other artifacts may be present.
- sensor unit 112 may measure signals that may represent the movement of vehicle 110 , as sensor unit 112 may be physically attached or integrated with the body of vehicle 110 , and may record the raw data with respective time.
- location or position information may be obtained or generated manually or using any applicable indoor positioning methods including Wi-Fi positioning, capturing images of vehicle 110 over time and extracting the location or position of vehicle 110 from the images, etc.
- the position or location data may be provided in any applicable manner including spatial coordinates, segments, etc.
- for example, in a parking lot, the parking-spot number may be used as the location indication or label.
- the defined area or route 120 may be divided into segments, as demonstrated in FIG. 2 , and location or position data may be provided as the segment in which vehicle 110 is present.
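Expressing a location as a grid segment, as described above, can be sketched as a simple index lookup; the segment size, grid origin and column count here are illustrative assumptions, not values from the patent:

```python
SEGMENT_SIZE_M = 5.0   # assumed side length of one square segment, meters
GRID_COLS = 100        # assumed number of segment columns in the mapping

def segment_of(x_m, y_m, origin=(0.0, 0.0)):
    """Map local (x, y) coordinates in meters to a single segment index."""
    col = int((x_m - origin[0]) // SEGMENT_SIZE_M)
    row = int((y_m - origin[1]) // SEGMENT_SIZE_M)
    return row * GRID_COLS + col

# A vehicle at (12.3 m, 7.8 m) lies in column 2, row 1 -> segment 102.
print(segment_of(12.3, 7.8))   # -> 102
```

Labeling the training data with segment indices like this turns localization into a classification problem, which matches the softmax output layer described later.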
- features from the accelerations and angular velocities of the training dataset may be extracted and added to the training dataset.
- the features may include, for example, the estimated velocity and horizontal slope.
- the estimated velocity and horizontal slope may be extracted, calculated or obtained by applying classical approaches, such as integration of the accelerometer readings (e.g., the measured accelerations) and integration of the gyroscope readings (e.g., the measured angular velocities, yielding the orientation).
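A minimal sketch of this classical feature extraction, assuming a 100 Hz IMU and simple cumulative-sum integration; the constant signals are synthetic placeholders, not real recordings:

```python
import numpy as np

FS = 100.0                   # assumed IMU sampling rate, Hz
DT = 1.0 / FS

t = np.arange(0.0, 2.0, DT)  # two seconds of samples
a_x = np.full_like(t, 0.5)   # synthetic forward acceleration, m/s^2
w_y = np.full_like(t, 0.01)  # synthetic pitch rate, rad/s

v_x = np.cumsum(a_x) * DT    # velocity feature: integral of acceleration
slope = np.cumsum(w_y) * DT  # horizontal-slope feature: integral of pitch rate

# After 2 s: v_x ~= 0.5 * 2 = 1.0 m/s, slope ~= 0.01 * 2 = 0.02 rad.
```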
- the training dataset may include a plurality of recordings made by the same or different vehicles 110 moving again and again in the defined area or route 120 .
- an ML model e.g., a DL NN or other model may be trained to provide location estimation in the defined area based on the accelerations and angular velocities (and/or extracted features) using the training dataset.
- the training dataset may be provided to the ML model and used in the training phase to adjust model parameters (e.g., weights) of the ML model.
- the model parameters may be adjusted through a backpropagation training scheme, in which the parameters of the ML model are tuned or adjusted repeatedly until a loss function is minimized.
- a generalization of the solution may be achieved by using nonlinear activation functions, such as sigmoid, ReLU and Softmax, and a large number of neurons and of convolutional and recurrent layers.
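The training loop described above (ReLU and softmax layers, parameters adjusted by backpropagation until a cross-entropy loss drops) can be sketched in plain NumPy. The data, labels and network sizes are synthetic stand-ins, not the patent's model: six-dimensional inputs play the role of IMU readings, and the binary labels play the role of location segments.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 6))                    # stand-in IMU feature vectors
y = (X[:, 0] + X[:, 1] > 0).astype(int)          # stand-in segment labels (toy)

W1 = rng.normal(scale=0.5, size=(6, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 2)); b2 = np.zeros(2)

def forward(X):
    h = np.maximum(0.0, X @ W1 + b1)             # ReLU hidden layer
    z = h @ W2 + b2
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return h, e / e.sum(axis=1, keepdims=True)   # softmax class probabilities

lr = 0.3
for _ in range(800):                             # backpropagation loop
    h, p = forward(X)
    gz = (p - np.eye(2)[y]) / len(X)             # d(cross-entropy)/d(logits)
    gh = (gz @ W2.T) * (h > 0)                   # backprop through ReLU
    W2 -= lr * h.T @ gz; b2 -= lr * gz.sum(0)
    W1 -= lr * X.T @ gh; b1 -= lr * gh.sum(0)

accuracy = (forward(X)[1].argmax(1) == y).mean()
```

On this separable toy problem the loop drives training accuracy close to 1; a real model would be validated on held-out recordings of the defined area.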
- a trained ML model may be obtained or generated.
- runtime accelerations and angular velocities over time of vehicle 110 moving in defined area or route 120 may be obtained.
- the accelerations and angular velocities may be measured by sensor unit 112 including, for example an IMU that is physically attached to the body of vehicle 110 .
- runtime features may be extracted or calculated from the runtime accelerations and angular velocities, similarly to operation 320 .
- the trained model may be used to obtain the current location of vehicle 110 based on the runtime accelerations and angular velocities and/or features. For example, the dataset of accelerations and angular velocities as measured by sensor unit 112 , as well as the extracted features, may be provided or fed into the trained ML model, and the trained ML model may provide an estimation of the current location of vehicle 110 in real time.
- the trained model may be used together with other sensors such as a camera, a GNSS receiver, a Lidar sensor, radio frequency (RF) sensor, etc., to enhance the accuracy of the location provided by the trained ML model using sensor fusion frameworks.
- Sensor fusion may be used in the field of accurate navigation and mapping solutions to combine or integrate data acquired by many types of sensors to provide an estimation of the location of the vehicle that is more accurate than each sensor taken alone.
- the sensor's data may be obtained in different sampling rates and may contain various types of information, such as vision, inertial information, position, etc.
- the location data provided by the trained model may be used as a sensory input alongside the data from the other sensors. By that, the accuracy of the navigation and mapping solution may be improved.
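One simple fusion rule consistent with this idea (an illustrative assumption, not the patent's specific framework) is inverse-variance weighting of two independent position estimates, here the ML model's output and a GNSS fix with invented variances:

```python
import numpy as np

def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance weighted fusion of two independent 2-D estimates."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * np.asarray(est_a) + w_b * np.asarray(est_b)) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)          # fused estimate is more certain
    return fused, fused_var

ml_pos   = (10.0, 4.0)   # trained-model estimate, variance 1.0 m^2 (assumed)
gnss_pos = (10.6, 4.6)   # GNSS estimate, variance 3.0 m^2 (assumed)
pos, var = fuse(ml_pos, 1.0, gnss_pos, 3.0)
```

The fused variance (0.75 m^2 here) is smaller than either input's, which is the basic reason sensor fusion improves accuracy; a full framework such as a Kalman filter generalizes this weighting over time.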
- measurements taken from a plurality of vehicles 110 that pass in defined area or route 120 may be stored, for example, in database 150 , as indicated in operation 370 .
- anomaly detection techniques may be used to find changes in defined area or route 120 in real time, and the map of defined area or route 120 may be updated.
- the anomaly detection techniques may include ML algorithms, including unsupervised ML classifiers that may classify new data as similar to or different from the training dataset, e.g., K-means, expectation-maximization, one-class support vector machine (SVM), and others.
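A hedged sketch of such an unsupervised check, using a few hand-rolled K-means iterations and a distance threshold; the threshold, cluster count and data are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(1)
train = rng.normal(size=(500, 3))   # nominal IMU feature vectors (synthetic)

k = 4
centers = train[rng.choice(len(train), size=k, replace=False)]
for _ in range(20):                 # Lloyd's K-means iterations
    d = np.linalg.norm(train[:, None, :] - centers[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    centers = np.array([train[labels == j].mean(axis=0)
                        if np.any(labels == j) else centers[j]
                        for j in range(k)])

def is_anomaly(sample, threshold=4.0):
    """Flag a reading that lies far from every learned cluster centre."""
    return bool(np.linalg.norm(sample - centers, axis=1).min() > threshold)

normal_reading = np.array([0.1, -0.2, 0.3])   # near the training cloud
pothole_spike = np.array([12.0, 9.0, -11.0])  # large, unseen excursion
```

A flagged reading could then trigger both a map update and a notification to nearby vehicles, as described below.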
- a notification regarding a detected change in defined area or route 120 may be provided to drivers moving in defined area or route 120 or directly to vehicle 110 .
- This technique may allow notifying vehicles 110 of risks in defined area or route 120 , and may also be used for tuning the unsupervised ML model (e.g., the unsupervised ML model used for anomaly detection) to achieve higher accuracy by using the acquired data to train or retrain the model.
- DL NN model 500 may include at least three layers of nodes, e.g., a convolutional layer 510 and fully connected layers 520 and 530 .
- convolutional layer 510 may implement sigmoid function
- fully connected layer 520 may implement ReLU function
- fully connected layer 530 may implement a SoftMax function.
- a loss function 540 may obtain the predictions 550 of DL NN model 500 , and the location or position label 560 may adjust or tune parameters of DL NN model 500 through a backpropagation training scheme.
- FIG. 6 presents the trained DL NN model 600 , including trained convolutional layer 610 and fully connected layers 620 and 630 .
- the model presented in FIG. 6 is an example only. Other models may be used.
- Trained DL NN model 600 may obtain runtime sensor readings of vehicle 110 and may provide a prediction of the current location or position of vehicle 110 .
- Computing device 700 may include a processor 705 that may be, for example, a central processing unit (CPU) processor or any other suitable multi-purpose or specific processors or controllers, a chip or any suitable computing or computational device, an operating system 715 , a memory 720 , executable code 725 , a storage system 730 , input devices 735 and output devices 740 .
- Processor 705 (or one or more controllers or processors, possibly across multiple units or devices) may be configured to carry out methods described herein, and/or to execute or act as the various modules, units, etc., for example when executing code 725 .
- More than one computing device 700 may be included in, and one or more computing devices 700 may be, or act as the components of, a system according to embodiments of the invention.
- Various components, computers, and modules of FIG. 1 may be or include devices such as computing device 700 , and one or more devices such as computing device 700 may carry out functions such as those described in FIG. 3 .
- navigation server may be implemented on or executed by a computing device 700 .
- Operating system 715 may be or may include any code segment (e.g., one similar to executable code 725 ) designed and/or configured to perform tasks involving coordination, scheduling, arbitration, controlling or otherwise managing operation of computing device 700 , for example, scheduling execution of software programs or enabling software programs or other modules or units to communicate.
- Memory 720 may be or may include, for example, a Random Access Memory (RAM), a read only memory (ROM), a Dynamic RAM (DRAM), a Synchronous DRAM (SD-RAM), a double data rate (DDR) memory chip, a Flash memory, a volatile memory, a non-volatile memory, a cache memory, a buffer, a short term memory unit, a long term memory unit, or other suitable memory or storage units.
- Memory 720 may be or may include a plurality of, possibly different memory units.
- Memory 720 may be a computer or processor non-transitory readable medium, or a computer non-transitory storage medium, e.g., a RAM.
- Executable code 725 may be any executable code, e.g., an application, a program, a process, task or script. Executable code 725 may be executed by processor 705 possibly under control of operating system 715 . For example, executable code 725 may configure processor 705 to perform clustering of interactions, to handle or record interactions or calls, and perform other methods as described herein. Although, for the sake of clarity, a single item of executable code 725 is shown in FIG. 7 , a system according to some embodiments of the invention may include a plurality of executable code segments similar to executable code 725 that may be loaded into memory 720 and cause processor 705 to carry out methods described herein.
- Storage system 730 may be or may include, for example, a hard disk drive, a CD-Recordable (CD-R) drive, a Blu-ray disk (BD), a universal serial bus (USB) device or other suitable removable and/or fixed storage unit.
- Data such as the training dataset of accelerations, angular velocities, and known locations over time, the extracted features, ML model parameters (e.g., weights) and equations, runtime datasets of measured accelerations, angular velocities and extracted features as well as other data required for performing embodiments of the invention, may be stored in storage system 730 and may be loaded from storage system 730 into memory 720 where it may be processed by processor 705 .
- Some of the components shown in FIG. 7 may be omitted.
- memory 720 may be a non-volatile memory having the storage capacity of storage system 730 . Accordingly, although shown as a separate component, storage system 730 may be embedded or included in memory 720 .
- Input devices 735 may be or may include a mouse, a keyboard, a microphone, a touch screen or pad or any suitable input device. Any suitable number of input devices may be operatively connected to computing device 700 as shown by block 735 .
- Output devices 740 may include one or more displays or monitors, speakers and/or any other suitable output devices. Any suitable number of output devices may be operatively connected to computing device 700 as shown by block 740 .
- Any applicable input/output (I/O) devices may be connected to computing device 700 as shown by blocks 735 and 740 .
- For example, a wired or wireless network interface card (NIC), a universal serial bus (USB) device or an external drive may be included in input devices 735 and/or output devices 740 .
- device 700 may include or may be, for example, a personal computer, a desktop computer, a laptop computer, a workstation, a server computer, a network device, a smartphone or any other suitable computing device.
- a system as described herein may include one or more devices such as computing device 700 .
- a computer processor performing functions may mean one computer processor performing the functions or multiple computer processors or modules performing the functions; for example, a process as described herein may be performed by one or more processors, possibly in different locations.
- each of the verbs, “comprise”, “include” and “have”, and conjugates thereof are used to indicate that the object or objects of the verb are not necessarily a complete listing of components, elements or parts of the subject or subjects of the verb.
- adjectives such as “substantially” and “about” modifying a condition or relationship characteristic of a feature or features of an embodiment of the disclosure, are understood to mean that the condition or characteristic is defined to within tolerances that are acceptable for operation of an embodiment as described.
- the word “or” is considered to be the inclusive “or” rather than the exclusive or, and indicates at least one of, or any combination of items it conjoins.
Abstract
A system and method for providing localization, including, during a training phase: obtaining a training dataset of accelerations, angular velocities, and known locations over time of vehicles moving in a defined area; and training a machine learning model to provide location estimation in the defined area based on the accelerations and angular velocities using the training dataset; and during runtime phase: obtaining runtime accelerations and angular velocities over time of a vehicle moving in the defined area; and using the trained model to obtain current location of the vehicle based on the runtime acceleration and angular velocities.
Description
- This application claims the benefit of U.S. Provisional Patent Application No. 63/138,153, filed Jan. 15, 2021 and entitled: “DEEP LEARNING-BASED REAL-TIME TERRAIN IDENTIFICATION FOR LOCALIZATION AND MAPPING USING INERTIAL SENSORS”, which is hereby incorporated by reference in its entirety.
- The present invention relates generally to localization technology and, more specifically, to localization based on inertial sensors.
- The need for high-accuracy localization, positioning and mapping solutions in real-time exists in many domains and applications. Current outdoor localization technology typically utilizes a satellite navigation device, also referred to as global navigation satellite system (GNSS) including for example, global positioning system (GPS), GLONASS, Galileo, Beidou and other satellite navigation systems. Drivers use GNSS systems routinely for localization and navigation. In addition, autonomous vehicle companies integrate localization and mapping sensors and algorithms to achieve high-accuracy localization solutions for driver safety.
- However, GNSS cannot be used for indoor navigation, localization or positioning applications. Indoor navigation, localization or positioning applications may include, for example, navigating robots or vehicles in storage warehouses that are used to monitor and provide equipment efficiently, or navigating in an indoor parking lot. Today, indoor localization is typically performed by applying sensor fusion schemes, where data acquired by many types of sensors is integrated to provide an estimation of the location of the vehicle.
- In addition, GNSS may not provide adequate accuracy for some outdoor localization applications as well. For example, localization systems for autonomous vehicles may require higher accuracy than is provided by GNSS. Thus, localization systems for autonomous vehicles may also use sensor fusion to achieve high-accuracy localization solutions. The sensors may include a camera, LIDAR, inertial sensors, and others. Unfortunately, these sensors may be expensive, and the quality of the data they provide may depend on various physical conditions, such as day and night, light and dark, urban canyon, and indoor environments. Hence, there is no high-accuracy localization and mapping solution for vehicles, both indoor and outdoor.
- A computer-based system and method for providing localization may include: during a training phase: obtaining a training dataset of accelerations, angular velocities, and known locations over time of vehicles moving in a defined area; and training a machine learning model to provide location estimation in the defined area based on the accelerations and angular velocities using the training dataset; during runtime phase: obtaining runtime accelerations and angular velocities over time of a vehicle moving in the defined area; and using the trained model to obtain current location of the vehicle based on the runtime acceleration and angular velocities.
- According to some embodiments of the invention, the accelerations and angular velocities of the training set and the runtime accelerations and angular velocities may be measured using at least one inertial measurement unit (IMU).
- According to some embodiments of the invention, the IMU may include at least one three-dimensional accelerometer and at least one three-dimensional gyroscope.
- According to some embodiments of the invention, the machine learning model may be a neural network.
- Some embodiments of the invention may include, during the training phase: extracting features from the accelerations and angular velocities of the training dataset and adding the features to the training dataset; and during the runtime phase: extracting runtime features from the runtime accelerations and angular velocities; and using the trained model to obtain the current location of the vehicle based on the runtime acceleration, the runtime angular velocities and the runtime features.
- According to some embodiments of the invention, the features may be selected from velocity and horizontal slope.
- According to some embodiments of the invention, during the training phase, the known locations may be obtained from at least one of the list including: a global navigation satellite system (GNSS) receiver and a real-time kinematic (RTK) positioning system.
- According to some embodiments of the invention, the defined area may include a route.
- Some embodiments of the invention may include dividing mapping of the defined area into segments, and according to some embodiments, the location may be provided as a segment in which the vehicle is located.
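The segment scheme above can be sketched with a toy grid mapping. The segment size, grid width, and function name below are illustrative assumptions, not details from the disclosure:

```python
# Hypothetical grid scheme: split the defined area into fixed-size square
# segments and map an (x, y) position in meters to a segment index that
# can serve as the location label. seg_size and cols are assumed values.

def segment_label(x, y, seg_size=5.0, cols=10):
    """Return the index of the segment containing position (x, y)."""
    col = int(x // seg_size)   # column of the 5 m x 5 m cell
    row = int(y // seg_size)   # row of the cell
    return row * cols + col

label = segment_label(12.0, 7.5)   # row 1, column 2 of a 10-column grid
```

In this sketch the model would predict the integer segment index rather than continuous coordinates, turning localization into a classification problem.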
- Some embodiments of the invention may include performing anomaly detection to find changes in defined area.
- Some embodiments of the invention may include obtaining readings from at least one sensor selected from a GNSS receiver, a Lidar sensor and a radio frequency (RF) sensor; and using the readings to enhance an accuracy of the current location provided by the trained ML model.
- Non-limiting examples of embodiments of the disclosure are described below with reference to figures listed below. The subject matter regarded as the invention is particularly pointed out and distinctly claimed in the concluding portion of the specification. The invention, however, both as to organization and method of operation, together with objects, features and advantages thereof, may best be understood by reference to the following detailed description when read with the accompanied drawings.
-
FIG. 1 presents a system for providing localization of a vehicle, according to some embodiments of the invention. -
FIG. 2 presents the defined area or route divided into segments, according to some embodiments of the invention. -
FIG. 3 shows a flowchart of a method, according to some embodiments of the present invention. -
FIG. 4 presents an example of accelerations in the x, y and z directions and angular velocities in the x, y and z directions, according to some embodiments of the invention. -
FIG. 5 presents a deep learning neural network (DL NN) model during a training phase, according to some embodiments of the invention. -
FIG. 6 presents the trained DL NN model, according to some embodiments of the invention. -
FIG. 7 shows a high-level block diagram of an exemplary computing device according to some embodiments of the present invention. - It will be appreciated that, for simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn accurately or to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity, or several physical components may be included in one functional block or element. Reference numerals may be repeated among the figures to indicate corresponding or analogous elements.
- In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those skilled in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, and components, modules, units and/or circuits have not been described in detail so as not to obscure the invention. For the sake of clarity, discussion of same or similar features or elements may not be repeated.
- Although embodiments of the invention are not limited in this regard, discussions utilizing terms such as, for example, “processing,” “computing,” “calculating,” “determining,” “establishing”, “analyzing”, “checking”, or the like, may refer to operation(s) and/or process(es) of a computer, a computing platform, a computing system, or other electronic computing device, that manipulates and/or transforms data represented as physical (e.g., electronic) quantities within the computer's registers and/or memories into other data similarly represented as physical quantities within the computer's registers and/or memories or other information non-transitory storage medium that may store instructions to perform operations and/or processes. Although embodiments of the invention are not limited in this regard, the terms “plurality” and “a plurality” as used herein may include, for example, “multiple” or “two or more”. The term set when used herein may include one or more items. Unless explicitly stated, the method embodiments described herein are not constrained to a particular order or sequence. Additionally, some of the described method embodiments or elements thereof can occur or be performed simultaneously, at the same point in time, or concurrently.
- Today, vehicles, cars, robots, and other moving ground platforms, commonly referred to herein as vehicles, may use many sensors in a sensor-fusion framework to obtain a localization solution in real-time. The sensors used may include cameras, inertial sensors, Lidar, and RF sensors. These sensors typically suffer from considerable disadvantages and are unable to provide the high accuracy needed for various scenarios, such as indoor localization where no GNSS reception is available and outdoor localization for autonomous vehicles where the accuracy provided by GNSS is still not high enough to allow safe driving. Other applications that may require high-accuracy navigation include navigating around a parking lot and navigating a tractor or other agricultural vehicles in a field, where the tractor needs to cover the entire field efficiently, e.g., without leaving any part of the field uncovered and with minimal repetitions.
- An inertial measurement unit (IMU) may be or may include an electronic device configured to measure the specific force, angular velocity, magnetic field and the orientation of a vehicle, typically using one or more accelerometers (e.g., three-dimensional accelerometers), gyroscopes (e.g., three-dimensional gyroscopes), and optionally magnetometers. In some implementations, IMUs may be used in a strapdown inertial navigation system (SINS), where the IMU sensor is physically attached to the body of the vehicle and measurements are integrated into motion equations. Moving along surfaces, roads, and other terrains results in a dynamic change of the IMU readings. As such, the sensor readings contain intrinsic knowledge regarding the changes in location, which may be used to calculate the current location of the vehicle. However, current IMUs used in SINS typically suffer from biases and drift over time, making SINS problematic for high-accuracy localization and positioning solutions when used alone, without any accurate measurement update.
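The drift problem can be illustrated numerically. The sampling rate and bias value below are assumptions chosen for illustration, showing how a small constant accelerometer bias, integrated twice, grows into meters of position error within seconds:

```python
# Dead-reckoning sketch: integrate acceleration twice to get 1-D position.
# A constant bias produces position error of roughly 0.5 * bias * t^2.

def integrate_position(accels, dt):
    """Double-integrate acceleration samples into a 1-D position."""
    velocity = 0.0
    position = 0.0
    for a in accels:
        velocity += a * dt          # first integration: velocity
        position += velocity * dt   # second integration: position
    return position

dt = 0.01                     # assumed 100 Hz IMU sampling
true_accels = [0.0] * 1000    # vehicle actually at rest for 10 s
bias = 0.05                   # assumed accelerometer bias, m/s^2

p_true = integrate_position(true_accels, dt)
drift = integrate_position([a + bias for a in true_accels], dt) - p_true
# drift is about 0.5 * 0.05 * 10^2 = 2.5 m after only 10 s
```

The quadratic growth of the error is what makes pure strapdown integration unusable on its own, and is the motivation for the learned mapping described next.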
- Some embodiments of the invention aim to solve the high-accuracy localization and positioning problem in real-time for vehicles using inertial sensors by using machine learning (ML) models. For example, according to some embodiments of the invention, readings of the inertial sensors may be provided as input to a deep learning (DL) neural network (NN) model.
- According to some embodiments of the invention, signals from IMUs may be provided to an ML model that may provide the position or location of the vehicle. The signals provided by IMUs may include information indicative of accelerations, angular velocities, and time as raw data. Additionally, features may be calculated based on raw data, such as velocity and horizontal slope, etc.
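The derived features mentioned above can be sketched as simple integrations of the raw channels. The channel names and sampling step are assumptions for illustration; a real implementation would also need gravity compensation and frame rotation:

```python
import math

# Sketch: estimate velocity by integrating forward acceleration, and the
# horizontal slope (pitch) by integrating the pitch-rate gyro channel.

def extract_features(forward_accels, pitch_rates, dt):
    """Return a (velocity m/s, slope in degrees) pair per sample."""
    velocity = 0.0
    pitch = 0.0   # radians
    features = []
    for a, w in zip(forward_accels, pitch_rates):
        velocity += a * dt
        pitch += w * dt
        features.append((velocity, math.degrees(pitch)))
    return features

# 1 s of constant acceleration and a gentle climb, sampled at 100 Hz
feats = extract_features([1.0] * 100, [0.01] * 100, dt=0.01)
```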
- Some embodiments of the invention may include training a machine learning model to provide location estimation using a training dataset of accelerations, angular velocities, and known locations over time of vehicles moving in a defined area. By providing a large training dataset to the model and performing optimization techniques (e.g., training and testing using, for example, cross-validation via k-folds), a functional mapping may be established. Once completed, raw data information measured by IMUs of vehicles may be provided to the trained ML model, and the trained model may provide location or position estimation of the moving vehicle in real time.
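The k-fold cross-validation mentioned above can be sketched as an index-splitting helper. The function is an illustrative assumption; libraries such as scikit-learn provide equivalents:

```python
# Sketch of k-fold cross-validation: each of the k folds serves once as
# the held-out test set while the remaining folds form the training set.

def k_fold_indices(n_samples, k):
    """Yield (train_indices, test_indices) pairs for k-fold CV."""
    fold_size = n_samples // k
    indices = list(range(n_samples))
    for i in range(k):
        test = indices[i * fold_size:(i + 1) * fold_size]
        train = indices[:i * fold_size] + indices[(i + 1) * fold_size:]
        yield train, test

folds = list(k_fold_indices(10, 5))   # 5 folds, 2 test samples each
```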
- According to some embodiments of the invention, the ML model may be or may include a NN model, and more specifically, a DL NN. A NN may include neurons and nodes organized into layers, with links between neurons transferring output between neurons. Aspects of a NN may be weighted, e.g., links may have weights, and training may involve adjusting weights. Aspects of a NN may include transfer functions, also referred to as nonlinear activation functions, e.g., an output of a node may be calculated using a transfer function. A NN may be executed and represented as formulas or relationships among nodes or neurons, such that the neurons, nodes or links are "virtual", represented by software and formulas, where training or executing a NN is performed by, for example, a dedicated or conventional computer. A DL NN model may include many neurons and layers with non-linear activation functions such as convolutional, Softmax, rectified linear unit (ReLU), etc.
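The activation functions named above are easy to write out explicitly. This sketch is for illustration only; DL frameworks supply optimized, vectorized versions:

```python
import math

def sigmoid(x):
    """Squash x into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    """Rectified linear unit: pass positives, zero out negatives."""
    return max(0.0, x)

def softmax(logits):
    """Turn a list of scores into probabilities that sum to 1."""
    m = max(logits)                      # subtract max for stability
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])   # largest logit gets largest probability
```

A softmax output layer fits the segment-classification view of localization: each probability can be read as the model's confidence that the vehicle is in the corresponding segment.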
- The training dataset may be generated in a tagging procedure for a given area or environment. For example, a designer may map the entire area (e.g., a route, a parking lot, tunnels, urban canyons and other areas where GNSS reception is poor etc.) where a batch of raw data may be tagged or labeled with the correct location. This process may be improved with user-recorded raw data information and shared on a cloud.
- Thus, some embodiments of the invention may improve the technology of positioning and localization by providing high-accuracy localization solutions based on IMU signals and DL ML schemes in real-time. In addition, some embodiments of the invention may provide a database of terrain information for various areas such as routes, parking lots, tunnels and urban canyons and keep updating the database. Some embodiments may find anomalies in the received signals, adjust the DL NN model online, and provide notifications regarding terrain anomalies in a vehicle's network for safe driving, where pits and other dangerous road modifications will be shared among all users in a defined zone.
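A minimal stand-in for the anomaly detection idea can look as follows. It replaces the unsupervised classifiers discussed later (K-means, one-class SVM, etc.) with a simple mean/standard-deviation gate, and all readings are invented for illustration:

```python
import statistics

def fit_gate(training_values):
    """Summarize normal behavior from training measurements."""
    return statistics.mean(training_values), statistics.stdev(training_values)

def is_anomaly(value, mean, std, k=3.0):
    """Flag readings more than k standard deviations from the mean."""
    return abs(value - mean) > k * std

# vertical-axis accelerometer readings recorded on an unchanged road
train_az = [9.79, 9.81, 9.80, 9.82, 9.78]
mean, std = fit_gate(train_az)
normal = is_anomaly(9.80, mean, std)    # reading similar to training data
changed = is_anomaly(12.5, mean, std)   # spike such as a new pothole
```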
-
FIG. 1 depicts a system 100 for providing localization of a vehicle, according to some embodiments of the invention. According to one embodiment of the invention, system 100 may include a vehicle 110, equipped with one or more sensor units 112 that may measure and provide data including at least one of specific force, angular velocity and/or the orientation of a vehicle, typically using at least one accelerometer, three-dimensional accelerometer, gyroscope and/or three-dimensional gyroscope. For example, sensor unit 112 may be or may include an IMU or an SINS, e.g., may be physically attached to the body of vehicle 110. Sensor unit 112 may further include a processor and a communication module for initial processing and transmitting of data measured by sensor unit 112 to navigation server 130. In the example provided in FIG. 1, vehicle 110 may be a vehicle moving along a road, way, path or route 120. This example is not limiting, and system 100 may include a vehicle moving in any defined area, such as a parking lot, a tunnel, a field, an urban canyon (e.g., areas in cities where reception of signals from GNSS is poor), or in another confined area, or an indoor vehicle, a robot or any other indoor vehicle moving in a confined indoor area. Sensor unit 112 may provide the data measured by sensor unit 112 to a navigation server 130 directly or through networks 140. -
Networks 140 may include any type of network or combination of networks available for supporting communication between sensor unit 112 and navigation server 130. Networks 140 may include, for example, a wired, wireless, fiber optic, cellular or any other type of connection, a local area network (LAN), a wide area network (WAN), the Internet and intranet networks, etc. Each of navigation server 130 and sensor unit 112 may be or may include a computing device, such as computing device 700 depicted in FIG. 7. One or more databases 150 may be or may include a storage device, such as storage device 730. In some embodiments, navigation server 130 and database 150 may be implemented in a remote location, e.g., in a ‘cloud’ computing system. - According to some embodiments of the invention,
navigation server 130 may store in database 150 data obtained from sensor unit 112 and other data such as ML model parameters, mapping of terrain and/or route 120, computational results, and any other data required for providing localization or positioning data according to some embodiments of the invention. According to some embodiments of the invention, navigation server 130 may be configured to obtain, during a training phase, a training dataset of accelerations, angular velocities, and known locations over time of vehicles 110 moving in a defined area or route 120, and to train an ML model, e.g., a DL NN model, to provide location estimation in the defined area or route 120 based on the accelerations and angular velocities using the training dataset. For example, navigation server 130 may be configured to obtain, during a training phase, a training dataset of accelerations and angular velocities over time as measured by sensor unit 112. For generating the training dataset, the data measured by sensor unit 112 may be tagged or labeled with the known locations. According to some embodiments, during the training phase, navigation server 130 may obtain the known locations from at least one of a GNSS receiver and a real-time kinematic (RTK) positioning system. Other methods may be used to obtain the location data. - According to some embodiments of the invention,
navigation server 130 may be further configured to, during a runtime phase, obtain runtime accelerations and angular velocities over time of a vehicle 110 moving in the defined area or route 120 and use the trained model to obtain the current location of vehicle 110 based on the runtime accelerations and angular velocities. - According to some embodiments of the invention,
navigation server 130 may be further configured to, during the training phase, extract features from the accelerations and angular velocities of the training dataset and add the features to the training dataset. For example, the features may include velocity, horizontal slope and/or other features. Navigation server 130 may be further configured to, during the runtime phase, extract the same type of features from the runtime accelerations and angular velocities, and use the trained model to obtain the current location of vehicle 110 based on the runtime accelerations, the runtime angular velocities and the runtime features. - According to some embodiments of the invention,
navigation server 130 may have mapping of the defined area or route 120. In some embodiments, navigation server 130 may divide the mapping of the defined area or route 120 into segments and may provide or express the location of vehicle 110 as a segment in which vehicle 110 is located. Referring to FIG. 2, a defined area or route 120 divided into segments is presented, according to some embodiments of the invention. In FIG. 2, the dashed squares/rectangles represent segments 200, and the location of vehicle 110 falls between segments. Other manners may be used to provide or express the location of vehicle 110, e.g., using coordinates. -
FIG. 3 shows a flowchart of a method according to some embodiments of the present invention. The operations of FIG. 3 may be performed by the systems described in FIGS. 1 and 7, but other systems may be used. - In
operation 310, a training dataset of accelerations, angular velocities, and known locations over time of vehicles moving in a defined area may be obtained. For example, the accelerations and angular velocities may be measured by a sensor unit 112 including, for example, an IMU, and the known locations may be obtained from a GNSS receiver and/or an RTK positioning system. Other positioning systems may be used. An example of raw data measured by sensor unit 112 is presented in FIG. 4, which depicts an example of accelerations in the x, y and z directions (labeled ax, ay and az, respectively) and angular velocities in the x, y and z directions (labeled wx, wy and wz, respectively), measured by a sensor unit 112 that is attached to a vehicle 110 moving along defined area or route 120. - Defined area or
route 120 may include a uniform or non-uniform terrain, where pits, speed bumps, and other artifacts may be present. During the motion of vehicle 110, sensor unit 112 may measure signals that represent the movement of vehicle 110, as sensor unit 112 may be physically attached to or integrated with the body of vehicle 110, and may record the raw data with the respective time. In an indoor environment, location or position information may be obtained or generated manually or using any applicable indoor positioning method, including Wi-Fi positioning, capturing images of vehicle 110 over time and extracting the location or position of vehicle 110 from the images, etc. The position or location data may be provided in any applicable manner, including spatial coordinates, segments, etc. For example, if the defined area 120 is a parking lot, the parking number may be used as the location indication or label. In some embodiments, the defined area or route 120 may be divided into segments, as demonstrated in FIG. 2, and location or position data may be provided as the segment in which vehicle 110 is present. - In
operation 320, features from the accelerations and angular velocities of the training dataset may be extracted and added to the training dataset. The features may include, for example, the estimated velocity and horizontal slope. The estimated velocity and horizontal slope may be extracted, calculated or obtained by applying classical approaches, such as integration of the accelerometer readings (e.g., the measured acceleration) and of the gyroscope readings (e.g., the measured orientation). The training dataset may include a plurality of recordings made by the same or different vehicles 110 moving again and again in the defined area or route 120. - In
operation 330, an ML model, e.g., a DL NN or another model, may be trained to provide location estimation in the defined area based on the accelerations and angular velocities (and/or extracted features) using the training dataset. For example, the training dataset may be provided to the ML model and used in the training phase to adjust model parameters (e.g., weights) of the ML model. The model parameters may be adjusted, for example, through a backpropagation training scheme, in which the parameters of the ML model are tuned or adjusted repeatedly until a loss function is minimized. A generalization of the solution may be achieved by using a nonlinear activation function, such as Sigmoid, ReLU, and Softmax, and a large number of neurons, convolutional, and recurrent layers. Eventually, during the training phase (e.g., operations 310-330), a trained ML model may be obtained or generated. - In
operation 340, runtime accelerations and angular velocities over time of vehicle 110 moving in defined area or route 120 may be obtained. As in the training phase, the accelerations and angular velocities may be measured by sensor unit 112 including, for example, an IMU that is physically attached to the body of vehicle 110. In operation 350, runtime features may be extracted or calculated from the runtime accelerations and angular velocities, similarly to operation 320. In operation 360, the trained model may be used to obtain the current location of vehicle 110 based on the runtime accelerations and angular velocities and/or features. For example, the dataset of accelerations and angular velocities as measured by sensor unit 112, as well as the extracted features, may be provided or fed into the trained ML model, and the trained ML model may provide an estimation of the current location of vehicle 110 in real time. - In some embodiments, the trained model may be used together with other sensors such as a camera, a GNSS receiver, a Lidar sensor, a radio frequency (RF) sensor, etc., to enhance the accuracy of the location provided by the trained ML model using sensor fusion frameworks. Sensor fusion may be used in the field of accurate navigation and mapping solutions to combine or integrate data acquired by many types of sensors to provide an estimation of the location of the vehicle that is more accurate than each sensor taken alone. In sensor fusion schemes, the sensors' data may be obtained at different sampling rates and may contain various types of information, such as vision, inertial information, position, etc. According to one embodiment, the location data provided by the trained model may be used as a sensory input alongside other data from the sensors. By that, the navigation and mapping solution accuracy may be improved.
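A one-dimensional, single-update sketch of the fusion idea is shown below. It is a stand-in for full sensor-fusion frameworks (e.g., Kalman-filter based), and the estimates and variances are invented for illustration:

```python
# Inverse-variance weighting: combine two position estimates so the more
# certain source (smaller variance) dominates the fused result.

def fuse(est_a, var_a, est_b, var_b):
    """Return the fused estimate and its (smaller) variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

# ML-model estimate (10.0 m, var 1.0) fused with a GNSS fix (12.0 m, var 4.0)
position, variance = fuse(10.0, 1.0, 12.0, 4.0)
```

Note that the fused variance is smaller than either input variance, which is the formal sense in which the combined estimate is "more accurate than each sensor taken alone."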
- According to some embodiments, measurements taken from a plurality of vehicles 110 that pass in defined area or route 120 may be stored, for example, in database 150, as indicated in operation 370. In operation 380, anomaly detection techniques may be used to find changes in defined area or route 120 in real time, and the map of defined area or route 120 may be updated. The anomaly detection techniques may include ML algorithms, including unsupervised ML classifiers that may classify new data as similar to or different from the training dataset, e.g., K-means, expectation-maximization, one-class support vector machine (SVM), and others. In operation 390, a notification regarding a detected change in defined area or route 120 may be provided to drivers moving in defined area or route 120 or directly to vehicle 110. This technique may allow notifying vehicles 110 of risks in defined area or route 120, and may also be used for tuning the unsupervised ML model (e.g., the unsupervised ML model used for anomaly detection) to achieve higher accuracy by using the acquired data to train or retrain the unsupervised ML model. - Reference is now made to FIG. 5, which presents a DL NN model 500 during the training phase, according to some embodiments of the invention. The model presented in FIG. 5 is an example only; other models may be used. DL NN model 500 may include at least three layers of nodes, e.g., a convolutional layer 510 and fully connected layers 520 and 530. Convolutional layer 510 may implement a sigmoid function, fully connected layer 520 may implement a ReLU function, and fully connected layer 530 may implement a softmax function. A loss function 540 may obtain the predictions 550 of DL NN model 500 and the location or position label 560, and may adjust or tune parameters of DL NN model 500 through a backpropagation training scheme.
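The layering just described for FIG. 5 — a convolutional layer with a sigmoid activation (510), a fully connected ReLU layer (520), and a fully connected softmax layer (530) scored by a cross-entropy loss against a position label — can be sketched as a single forward pass. All shapes here (window length, filter count, number of route segments) are illustrative assumptions, not values given in the text.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x): return 1.0 / (1.0 + np.exp(-x))
def relu(x):    return np.maximum(0.0, x)
def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical shapes: a 50-sample window of 6 IMU channels, 8 conv
# filters of width 5, 16 hidden units, 10 route segments.
window = rng.standard_normal((50, 6))
conv_k = rng.standard_normal((8, 5, 6)) * 0.1    # kernels of layer 510
W1     = rng.standard_normal((8 * 46, 16)) * 0.1 # weights of layer 520
W2     = rng.standard_normal((16, 10)) * 0.1     # weights of layer 530

# Convolutional layer 510: valid 1-D convolution over time, then sigmoid.
conv_out = sigmoid(np.array([
    [(window[t:t + 5] * k).sum() for t in range(46)] for k in conv_k
]))
hidden = relu(conv_out.ravel() @ W1)   # fully connected layer 520
pred   = softmax(hidden @ W2)          # fully connected layer 530 → predictions 550

# Loss 540: cross-entropy against the known position label 560; during
# training its gradient would be propagated back through all three layers.
label = 3
loss = -np.log(pred[label])
print(pred.shape, float(loss) > 0)  # → (10,) True
```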
FIG. 6 presents the trained DL NN model 600, including trained convolutional layer 610 and trained fully connected layers 620 and 630. The model presented in FIG. 6 is an example only; other models may be used. Trained DL NN model 600 may obtain runtime sensor readings of vehicle 110 and may provide a prediction of the current location or position of vehicle 110. - Reference is made to
FIG. 7, showing a high-level block diagram of an exemplary computing device according to some embodiments of the present invention. Computing device 700 may include a processor 705 that may be, for example, a central processing unit (CPU) or any other suitable multi-purpose or specific processor or controller, a chip or any suitable computing or computational device, an operating system 715, a memory 720, executable code 725, a storage system 730, input devices 735 and output devices 740. Processor 705 (or one or more controllers or processors, possibly across multiple units or devices) may be configured to carry out methods described herein, and/or to execute or act as the various modules, units, etc., for example when executing code 725. More than one computing device 700 may be included in, and one or more computing devices 700 may be, or act as the components of, a system according to embodiments of the invention. Various components, computers, and modules of FIG. 1 may be or include devices such as computing device 700, and one or more devices such as computing device 700 may carry out functions such as those described in FIG. 3. For example, a navigation server may be implemented on or executed by a computing device 700. -
Operating system 715 may be or may include any code segment (e.g., one similar to executable code 725) designed and/or configured to perform tasks involving coordination, scheduling, arbitration, controlling or otherwise managing operation of computing device 700, for example, scheduling execution of software programs or enabling software programs or other modules or units to communicate. -
Memory 720 may be or may include, for example, a Random Access Memory (RAM), a read only memory (ROM), a Dynamic RAM (DRAM), a Synchronous DRAM (SD-RAM), a double data rate (DDR) memory chip, a Flash memory, a volatile memory, a non-volatile memory, a cache memory, a buffer, a short term memory unit, a long term memory unit, or other suitable memory or storage units. Memory 720 may be or may include a plurality of possibly different memory units. Memory 720 may be a computer or processor non-transitory readable medium, or a computer non-transitory storage medium, e.g., a RAM. -
Executable code 725 may be any executable code, e.g., an application, a program, a process, task or script. Executable code 725 may be executed by processor 705, possibly under control of operating system 715. For example, executable code 725 may configure processor 705 to train ML models, estimate vehicle locations, and perform other methods as described herein. Although, for the sake of clarity, a single item of executable code 725 is shown in FIG. 7, a system according to some embodiments of the invention may include a plurality of executable code segments similar to executable code 725 that may be loaded into memory 720 and cause processor 705 to carry out methods described herein. -
Storage system 730 may be or may include, for example, a hard disk drive, a CD-Recordable (CD-R) drive, a Blu-ray disk (BD), a universal serial bus (USB) device or other suitable removable and/or fixed storage unit. Data such as the training dataset of accelerations, angular velocities, and known locations over time, the extracted features, ML model parameters (e.g., weights) and equations, and runtime datasets of measured accelerations, angular velocities and extracted features, as well as other data required for performing embodiments of the invention, may be stored in storage system 730 and may be loaded from storage system 730 into memory 720, where it may be processed by processor 705. Some of the components shown in FIG. 7 may be omitted. For example, memory 720 may be a non-volatile memory having the storage capacity of storage system 730. Accordingly, although shown as a separate component, storage system 730 may be embedded or included in memory 720.
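As a small sketch of this store-then-load flow, the datasets and model parameters could be persisted to storage system 730 and later reloaded into working memory. The file name and array layout below are invented for illustration; the text does not prescribe a storage format.

```python
import os
import tempfile

import numpy as np

# Hypothetical data to persist: a training dataset of IMU measurements
# and the learned model weights.
accels  = np.zeros((100, 3))
gyros   = np.zeros((100, 3))
weights = np.ones((12, 3))

# Write everything to a single archive on "storage system 730".
path = os.path.join(tempfile.mkdtemp(), "localization.npz")
np.savez(path, accelerations=accels, angular_velocities=gyros, weights=weights)

# Later, load the archive back into memory for processing.
loaded = np.load(path)
print(sorted(loaded.files))  # → ['accelerations', 'angular_velocities', 'weights']
```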
Input devices 735 may be or may include a mouse, a keyboard, a microphone, a touch screen or pad or any suitable input device. Any suitable number of input devices may be operatively connected to computing device 700 as shown by block 735. Output devices 740 may include one or more displays or monitors, speakers and/or any other suitable output devices. Any suitable number of output devices may be operatively connected to computing device 700 as shown by block 740. Any applicable input/output (I/O) devices may be connected to computing device 700 as shown by blocks 735 and 740, e.g., as input devices 735 and/or output devices 740. - In some embodiments,
device 700 may include or may be, for example, a personal computer, a desktop computer, a laptop computer, a workstation, a server computer, a network device, a smartphone or any other suitable computing device. A system as described herein may include one or more devices such as computing device 700. - When discussed herein, "a" computer processor performing functions may mean one computer processor performing the functions or multiple computer processors or modules performing the functions; for example, a process as described herein may be performed by one or more processors, possibly in different locations.
- In the description and claims of the present application, each of the verbs "comprise", "include" and "have", and conjugates thereof, is used to indicate that the object or objects of the verb are not necessarily a complete listing of components, elements or parts of the subject or subjects of the verb. Unless otherwise stated, adjectives such as "substantially" and "about" modifying a condition or relationship characteristic of a feature or features of an embodiment of the disclosure are understood to mean that the condition or characteristic is defined to within tolerances that are acceptable for operation of an embodiment as described. In addition, the word "or" is considered to be the inclusive "or" rather than the exclusive "or", and indicates at least one of, or any combination of, the items it conjoins.
- Descriptions of embodiments of the invention in the present application are provided by way of example and are not intended to limit the scope of the invention. The described embodiments comprise different features, not all of which are required in all embodiments. Embodiments comprising different combinations of features noted in the described embodiments will occur to a person having ordinary skill in the art. Some elements described with respect to one embodiment may be combined with features or elements described with respect to other embodiments. The scope of the invention is limited only by the claims.
- While certain features of the invention have been illustrated and described herein, many modifications, substitutions, changes, and equivalents may occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.
Claims (20)
1. A method for providing localization, the method comprising, using a processor:
during a training phase:
obtaining a training dataset of accelerations, angular velocities, and known locations over time of vehicles moving in a defined area; and
training a machine learning model to provide location estimation in the defined area based on the accelerations and angular velocities using the training dataset; and
during a runtime phase:
obtaining runtime accelerations and angular velocities over time of a vehicle moving in the defined area; and
using the trained model to obtain current location of the vehicle based on the runtime acceleration and angular velocities.
2. The method of claim 1, wherein the accelerations, angular velocities of the training set and the runtime acceleration and angular velocities are measured using at least one inertial measurement unit (IMU).
3. The method of claim 2, wherein the IMU comprises at least one three-dimensional accelerometer and at least one three-dimensional gyroscope.
4. The method of claim 1, wherein the machine learning model is a neural network.
5. The method of claim 1, comprising:
during the training phase:
extracting features from the accelerations and angular velocities of the training dataset and adding the features to the training dataset; and
during the runtime phase:
extracting runtime features from the runtime accelerations and angular velocities; and
using the trained model to obtain the current location of the vehicle based on the runtime acceleration, the runtime angular velocities and the runtime features.
6. The method of claim 5, wherein the features are selected from the list consisting of velocity and horizontal slope.
7. The method of claim 1, wherein during the training phase, the known locations are obtained from at least one of the list consisting of: a global navigation satellite system (GNSS) receiver and a real-time kinematic (RTK) positioning system.
8. The method of claim 1, wherein the defined area comprises a route.
9. The method of claim 1, comprising dividing mapping of the defined area into segments, wherein the location is provided as a segment in which the vehicle is located.
10. The method of claim 1, comprising performing anomaly detection to find changes in the defined area.
11. The method of claim 1, comprising:
obtaining readings from at least one sensor selected from the list consisting of: a camera, a GNSS receiver, a Lidar sensor and radio frequency (RF) sensor; and
using the readings to enhance an accuracy of the current location provided by the trained ML model.
12. A system for providing localization, the system comprising:
a memory; and
a processor configured to:
during a training phase:
obtain a training dataset of accelerations, angular velocities, and known locations over time of vehicles moving in a defined area; and
train a machine learning model to provide location estimation in the defined area based on the accelerations and angular velocities using the training dataset; and
during a runtime phase:
obtain runtime accelerations and angular velocities over time of a vehicle moving in the defined area; and
use the trained model to obtain current location of the vehicle based on the runtime acceleration and angular velocities.
13. The system of claim 12, comprising at least one inertial measurement unit (IMU) attached to each of the vehicles and configured to measure the accelerations, angular velocities of the training set and the runtime acceleration and angular velocities.
14. The system of claim 13, wherein the IMU comprises at least one three-dimensional accelerometer and at least one three-dimensional gyroscope.
15. The system of claim 12, wherein the machine learning model is a neural network.
16. The system of claim 12, wherein the processor is configured to:
during the training phase:
extract features from the accelerations and angular velocities of the training dataset and add the features to the training dataset; and
during the runtime phase:
extract runtime features from the runtime accelerations and angular velocities; and
use the trained model to obtain the current location of the vehicle based on the runtime acceleration, the runtime angular velocities and the runtime features.
17. The system of claim 16, wherein the features are selected from the list consisting of velocity and horizontal slope.
18. The system of claim 12, wherein the defined area comprises a defined route.
19. The system of claim 12, wherein the processor is configured to divide mapping of the defined area into segments, wherein the processor is configured to provide the location as a segment in which the vehicle is located.
20. The system of claim 12, wherein the processor is configured to perform anomaly detection to find changes in the defined area.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/337,632 US20220228866A1 (en) | 2021-01-15 | 2021-06-03 | System and method for providing localization using inertial sensors |
US17/484,346 US20220228867A1 (en) | 2021-01-15 | 2021-09-24 | System and method for providing localization using inertial sensors |
PCT/IL2022/050064 WO2022153313A1 (en) | 2021-01-15 | 2022-01-16 | System and method for providing localization using inertial sensors |
US17/979,126 US11725945B2 (en) | 2021-01-15 | 2022-11-02 | System and method for providing localization using inertial sensors |
US18/334,451 US11859978B2 (en) | 2021-01-15 | 2023-06-14 | System and method for estimating a velocity of a vehicle using inertial sensors |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202163138153P | 2021-01-15 | 2021-01-15 | |
US17/337,632 US20220228866A1 (en) | 2021-01-15 | 2021-06-03 | System and method for providing localization using inertial sensors |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/484,346 Continuation-In-Part US20220228867A1 (en) | 2021-01-15 | 2021-09-24 | System and method for providing localization using inertial sensors |
Related Child Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/484,346 Continuation US20220228867A1 (en) | 2021-01-15 | 2021-09-24 | System and method for providing localization using inertial sensors |
US17/484,346 Continuation-In-Part US20220228867A1 (en) | 2021-01-15 | 2021-09-24 | System and method for providing localization using inertial sensors |
PCT/IL2022/050064 Continuation-In-Part WO2022153313A1 (en) | 2021-01-15 | 2022-01-16 | System and method for providing localization using inertial sensors |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220228866A1 true US20220228866A1 (en) | 2022-07-21 |
Family
ID=82406207
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/337,632 Abandoned US20220228866A1 (en) | 2021-01-15 | 2021-06-03 | System and method for providing localization using inertial sensors |
US17/484,346 Abandoned US20220228867A1 (en) | 2021-01-15 | 2021-09-24 | System and method for providing localization using inertial sensors |
US17/979,126 Active US11725945B2 (en) | 2021-01-15 | 2022-11-02 | System and method for providing localization using inertial sensors |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/484,346 Abandoned US20220228867A1 (en) | 2021-01-15 | 2021-09-24 | System and method for providing localization using inertial sensors |
US17/979,126 Active US11725945B2 (en) | 2021-01-15 | 2022-11-02 | System and method for providing localization using inertial sensors |
Country Status (2)
Country | Link |
---|---|
US (3) | US20220228866A1 (en) |
WO (1) | WO2022153313A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115615428B (en) * | 2022-10-17 | 2024-02-02 | 中国电信股份有限公司 | Positioning method, device, equipment and readable medium of inertial measurement unit of terminal |
CN117948986B (en) * | 2024-03-27 | 2024-06-21 | 华航导控(天津)科技有限公司 | Polar region factor graph construction method and polar region integrated navigation method |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2733032C (en) * | 2011-02-28 | 2015-12-29 | Trusted Positioning Inc. | Method and apparatus for improved navigation of a moving platform |
CN107148553A (en) * | 2014-08-01 | 2017-09-08 | 迈克尔·科伦贝格 | Method and system for improving Inertial Measurement Unit sensor signal |
US20190204092A1 (en) * | 2017-12-01 | 2019-07-04 | DeepMap Inc. | High definition map based localization optimization |
CN110873888B (en) * | 2018-09-04 | 2022-05-06 | 腾讯大地通途(北京)科技有限公司 | Positioning method, positioning device, positioning apparatus, and computer storage medium |
EP3899974A1 (en) * | 2018-12-21 | 2021-10-27 | The Procter & Gamble Company | Apparatus and method for operating a personal grooming appliance or household cleaning appliance |
US10997461B2 (en) * | 2019-02-01 | 2021-05-04 | Tesla, Inc. | Generating ground truth for machine learning from time series elements |
US11205112B2 (en) * | 2019-04-01 | 2021-12-21 | Honeywell International Inc. | Deep neural network-based inertial measurement unit (IMU) sensor compensation method |
US11820397B2 (en) * | 2019-11-16 | 2023-11-21 | Uatc, Llc | Localization with diverse dataset for autonomous vehicles |
-
2021
- 2021-06-03 US US17/337,632 patent/US20220228866A1/en not_active Abandoned
- 2021-09-24 US US17/484,346 patent/US20220228867A1/en not_active Abandoned
-
2022
- 2022-01-16 WO PCT/IL2022/050064 patent/WO2022153313A1/en active Application Filing
- 2022-11-02 US US17/979,126 patent/US11725945B2/en active Active
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2626733A (en) * | 2023-01-30 | 2024-08-07 | Arwain Systems Ltd | Localisation system |
WO2024161101A1 (en) * | 2023-01-30 | 2024-08-08 | Arwain Systems Ltd | Localisation system |
Also Published As
Publication number | Publication date |
---|---|
US11725945B2 (en) | 2023-08-15 |
US20230055498A1 (en) | 2023-02-23 |
US20220228867A1 (en) | 2022-07-21 |
WO2022153313A1 (en) | 2022-07-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11725945B2 (en) | System and method for providing localization using inertial sensors | |
US10579058B2 (en) | Apparatus and method for generating training data to train neural network determining information associated with road included in image | |
CN112639502B (en) | Robot pose estimation | |
Kim et al. | Deep Learning‐Based GNSS Network‐Based Real‐Time Kinematic Improvement for Autonomous Ground Vehicle Navigation | |
CN110686686B (en) | System and method for map matching | |
US20170103342A1 (en) | Machine learning based determination of accurate motion parameters of a vehicle | |
CN106017454A (en) | Pedestrian navigation device and method based on novel multi-sensor fusion technology | |
Gao et al. | Glow in the dark: Smartphone inertial odometry for vehicle tracking in GPS blocked environments | |
CN113406682A (en) | Positioning method, positioning device, electronic equipment and storage medium | |
US11805390B2 (en) | Method, apparatus, and computer program product for determining sensor orientation | |
Moussa et al. | Steering angle assisted vehicular navigation using portable devices in GNSS-denied environments | |
US20190249997A1 (en) | Effective search of travelling data corresponding to measured positions of a vehicle | |
CN114127810A (en) | Vehicle autonomous level function | |
Abdelrahman et al. | Crowdsensing-based personalized dynamic route planning for smart vehicles | |
Fu et al. | An improved integrated navigation method based on RINS, GNSS and kinematics for port heavy-duty AGV | |
Du | A micro-electro-mechanical-system-based inertial system with rotating accelerometers and gyroscopes for land vehicle navigation | |
US20240236934A1 (en) | Positioning precision estimation method and apparatus, electronic device, and storage medium | |
Tong et al. | Vehicle inertial tracking via mobile crowdsensing: Experience and enhancement | |
Jurišica et al. | High precision gnss guidance for field mobile robots | |
Huang et al. | Improved intelligent vehicle self-localization with integration of sparse visual map and high-speed pavement visual odometry | |
Islam et al. | Real-time vehicle trajectory estimation based on lane change detection using smartphone sensors | |
Zhao et al. | An ISVD and SFFSD-based vehicle ego-positioning method and its application on indoor parking guidance | |
MAILKA et al. | An efficient end-to-end EKF-SLAM architecture based on LiDAR, GNSS, and IMU data sensor fusion for autonomous ground vehicles | |
Liu et al. | A robust cascaded strategy of in-motion alignment for inertial navigation systems | |
Praschl et al. | Enabling outdoor MR capabilities for head mounted displays: a case study |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STCB | Information on status: application discontinuation |
Free format text: EXPRESSLY ABANDONED -- DURING EXAMINATION |