WO2018225067A1 - Fusion and calibration of sensor signals in a moving vehicle - Google Patents


Info

Publication number
WO2018225067A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
vehicle
sensors
data
signal data
Application number
PCT/IL2018/050615
Other languages
French (fr)
Inventor
Lev Yitzhak Lavy
Eliahu Brosh
Bruno FERNANDEZ-RUIZ
Eran Shir
Original Assignee
Nexar Ltd.
Application filed by Nexar Ltd.
Publication of WO2018225067A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/21 Server components or server architectures
    • H04N 21/222 Secondary servers, e.g. proxy server, cable television head-end
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/025 Services making use of location information using location based information parameters
    • H04W 4/30 Services specially adapted for particular environments, situations or purposes
    • H04W 4/38 Services specially adapted for particular environments, situations or purposes for collecting sensor information
    • H04W 4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W 4/44 Services specially adapted for particular environments, situations or purposes for vehicles, for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/09 Arrangements for giving variable traffic instructions
    • G08G 1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages

Definitions

  • FIG. 3 is a simplified diagram of vehicle 100 using smartphone 120 positioned on a cradle for configuring a common domain using sensor processor 150 of smartphone 120 as a data bridge.
  • FIG. 3 shows OBD 111 of vehicle 100, and camera 112 external to smartphone 120.
  • Smartphone 120 is shown with two sensors; namely, GPS 131 and accelerometer 132.
  • Although OBD 111 and camera 112 are external to smartphone 120, their data is nevertheless processed by sensor processor 150, which acts as a data bridge for these sensors.
  • latency tracking is performed by one or more of the following algorithms.
  • the FFT algorithm is used for larger signals.
  • the algorithms may incorporate data about the sensor, vehicle motion equations, or other prior knowledge. Alternatively, the algorithms may be used without any prior knowledge.
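  • For illustration, the following is a minimal sketch of FFT-based latency tracking between two such signals; the function name, the common resampling grid and the use of scipy are assumptions for the example, not taken from the patent:

```python
import numpy as np
from scipy import signal

def estimate_latency(ref, laggy, dt):
    # Locate the peak of the cross-correlation between a reference signal and
    # a delayed one; scipy evaluates it via FFT, which suits larger signals.
    ref = np.asarray(ref, float) - np.mean(ref)
    laggy = np.asarray(laggy, float) - np.mean(laggy)
    corr = signal.correlate(laggy, ref, mode="full", method="fft")
    lags = signal.correlation_lags(len(laggy), len(ref), mode="full")
    return lags[np.argmax(corr)] * dt  # positive result: `laggy` trails `ref`

# Example use: OBD speed vs. GPS-derived speed, resampled to a common 10 Hz grid.
# latency_sec = estimate_latency(obd_speed, gps_speed, dt=0.1)
```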
  • error estimator 154 matches signals and estimates errors using an error model that is linear over short time ranges. For each sensor 110 / 130, a calibration and error estimation is derived using domain matching and, if necessary, back-propagating error to the original domain.
  • FIG. 4 is a simplified flowchart of a method 1000 for sensor validator 156 to determine whether a sensor is operable, in accordance with an embodiment of the present invention.
  • a specific sensor 110 / 130 in vehicle 100 is tested, to determine whether or not its signal data aligns with the motion of vehicle 100. If decision 1010 is affirmative, i.e., the signal data aligns with the motion of the vehicle, then at decision 1020, a determination is made whether or not the signal data of the specific signal being tested aligns with the signal data of others of sensors 110 / 130.
  • decision 1020 is based on the signal data of the sensor being tested aligning with the signal data of a majority of the other sensors 110 / 130, since it may be that a few of the other sensors 110 / 130 are inoperable. If either decision 1010 or decision 1020 is negative, then at operation 1030 it is concluded that the sensor 110 / 130 being tested is inoperable. Otherwise, if both decision 1010 and decision 1020 are affirmative, then at operation 1040 it is concluded that the sensor 110 / 130 being tested is operable.
  • a sensor may be inoperable because it has failed.
  • a failed sensor 110 / 130 is ignored until it is back in operation. If a reset for the failed sensor 110 / 130 is available, e.g., by stopping and restarting the sensor, then the failed sensor 110 / 130 is reset.
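  • A minimal sketch of the majority-alignment check of FIG. 4 follows. The normalized-correlation alignment test and its threshold are illustrative assumptions; the patent does not specify the alignment measure:

```python
import numpy as np

def validate_sensor(candidate, vehicle_motion, others, threshold=0.6):
    # Majority-alignment test following FIG. 4: the sensor is deemed operable
    # only if its signal aligns with the vehicle motion (decision 1010) AND
    # with a majority of the other sensors' signals (decision 1020).
    def aligned(a, b):
        a = np.asarray(a, float) - np.mean(a)
        b = np.asarray(b, float) - np.mean(b)
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return denom > 0 and float(np.dot(a, b)) / denom > threshold

    if not aligned(candidate, vehicle_motion):
        return False                      # operation 1030: inoperable
    votes = sum(aligned(candidate, o) for o in others)
    return votes > len(others) / 2        # operation 1040 if a majority aligns
```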
  • a sensor may be inoperable because of human interaction.
  • If a driver moves smartphone 120 from its cradle, then the motion of smartphone 120 during this time is a combination of driving motion and human interaction.
  • Such an anomaly is detected by the method of FIG. 4 when the smartphone motion is compared with the motion of vehicle 100, with OBD 111 speed data, or with external camera 112 ego motion.
  • calibrator 158 performs low-level pre-processing prior to sensor data being provided to high-level algorithms, applications, and an operating system. Calibration may be performed using deep learning with unsupervised methods, by defining outputs such as location, and using a loss function to train the network to achieve best results by augmentation of errors on inputs.
  • One of the services that server 200 provides is mapping. Mapping enables better localization on the client, whereby enhanced location services are provided in a manner such that location is treated as a sensor instead of a high-level output. Each sensor is labeled by model and type, and its data is uploaded to the server, in which model-specific calibration and, as appropriate, training of the network to calibrate, are performed. As such, when new users connect to the sensor, an initial offline calibration is provided.
  • Any device that is connectable and delivers state data, such as a vehicle Wi-Fi hotspot, may be treated as a signal to be calibrated.
  • Multiple sensors with the same measurement may be aggregated to output a single location for top level consumers, such as applications and operating systems.
  • the multiple GPS units may be a wearable watch with GPS, a phone GPS and a vehicle GPS.
  • Calibrator 158 not only processes individual sensors, but also generates a unified output in the vehicle context, such as the vehicle coordinate system, or the vehicle time.
  • a sensor is calibrated over time, and only the final calibration with sensor model and additional data is uploaded, instead of raw sensor data.
  • Calibrator 158 calibrates inertial measurement unit (IMU) sensors 110 / 130, such as accelerometer 132.
  • Calibrator 158 transforms local sensor coordinate systems to a common vehicle coordinate system, as described below with reference to FIG. 5.
  • Calibrator 158 also estimates locations and orientations of sensors 110, as described below with reference to FIG. 6.
  • the sensors may fetch from server 200 initial per-sensor model calibrations.
  • the server aggregates measurements to optimize per-sensor model calibration.
  • the sensors periodically send back enhanced on-the-fly calibration so that the server per-model calibration uses accurate statistics.
  • a sensor of type/model X may have a constant calibration error.
  • when a new sensor of the same type/model connects, the new sensor uses the initial model calibration. As such, the new sensor immediately provides accurate calibrated results, avoiding the need to wait for the online calibration process to converge.
  • the initial per-sensor model calibrations may be based on local calibrations performed by the individual sensors and uploaded to the server.
  • FIG. 5 is a simplified drawing of coordinate systems relative to vehicle 100 and relative to smartphone 120, in accordance with an embodiment of the present invention.
  • FIG. 5 shows that relative to vehicle 100, an x-axis passes through the vehicle from the passenger side to the driver side, a y-axis points along the direction of motion of vehicle 100, and a z-axis points upwards to the roof of vehicle 100.
  • the y-axis is referred to as the "vehicle heading vector".
  • the z-axis is referred to as the "vehicle roof vector".
  • Linear acceleration of vehicle 100 is measured along the vehicle heading vector
  • rotation of vehicle 100 is measured around the vehicle roof vector as axis of rotation.
  • FIG. 5 also shows that relative to smartphone 120, an x-axis passes through smartphone 120 from front to back, a y-axis points to the right of the phone display, and a z-axis points above the display.
  • Both the vehicle and the smartphone xyz coordinate systems are moving right-handed coordinate systems.
  • calibrator 158 transforms the sensors' local coordinate systems to the vehicle coordinate system. Calibrator 158 transforms each local sensor coordinate system in accordance with
  • p_cal = T · (p − o) ≈ R · S · (p − o) (EQ. 1)
  • where p_cal and p represent respective calibrated and uncalibrated position vectors,
  • T is a 3x3 sensor-to-vehicle calibration matrix including skew,
  • o is an offset,
  • R is a 3D device-to-vehicle rotation matrix, and
  • S is a 3D scale matrix.
  • Representation of the matrix T as the product of a rotation matrix, R, and a scale matrix, S, in EQ. 1 is based on the assumption that the sensor is factory-calibrated to an orthogonal local sensor coordinate system. Otherwise, the matrix T is instead factored using a singular value decomposition (SVD) of matrix T.
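  • A short sketch of applying EQ. 1, and of the SVD factoring mentioned above, follows; the polar-decomposition reading of the SVD is an assumption about implementation detail (signs and axis ordering may need care):

```python
import numpy as np

def calibrate(p, R, S, o):
    # EQ. 1: p_cal = T (p - o) ~= R S (p - o), with R a device-to-vehicle
    # rotation matrix, S a 3D scale matrix and o the sensor offset.
    return R @ S @ (np.asarray(p, float) - np.asarray(o, float))

def factor_T(T):
    # Polar decomposition via SVD: T = U S V^T = (U V^T)(V S V^T), giving the
    # closest rotation R and a symmetric scale/skew factor with T = R @ S.
    U, sigma, Vt = np.linalg.svd(T)
    R = U @ Vt
    S = Vt.T @ np.diag(sigma) @ Vt
    return R, S
```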
  • the sensor-to-vehicle rotation matrix R is determined based on an assumption that during rotation most of the energy of vehicle 100 is relative to the roof axis (z-axis in FIG. 5) of vehicle 100, and when not rotating, most of the energy is relative to the heading vector (y-axis in FIG. 5), which is accelerating and decelerating.
  • the rotation matrix R can be validated based on an assumption that the gravity vector of vehicle 100, when the accelerometer is at rest, should be near the vehicle floor.
  • full route sections of vehicle 100 are analyzed by dead reckoning using a Kalman filter and/or using gradient descent regression.
  • FIG. 6 is a simplified flowchart of a method 1100 used by calibrator 158 for estimating locations and orientations of sensors 110 and 130, in accordance with an embodiment of the present invention.
  • sensor output signal data is collected.
  • at operation 1120, car front wheels are aligned with a road contour, which may have road defects. Operation 1120 is optionally performed only when vehicle 100 is moving slowly, since complexities due to wheel suspension may arise at high speeds.
  • distance of vehicle rotation axis from sensor 110 / 130 is estimated using induced force and angular rate, based on a simple rigid body model. Speed bumps are particularly helpful in performing operation 1130.
  • If sensor 110 / 130 includes a GPS, then at operation 1140 the height of sensor 110 / 130 is compared to a map, within an accuracy of a few cm. If sensor 110 / 130 is vision-based, then at operation 1150 camera properties including inter alia tilt and height are estimated with reference to an object of known size. If sensor 110 / 130 is a camera, then at operation 1160 the side (front/left/right/back) of vehicle 100 on which sensor 110 / 130 is located is determined using ego motion and optical flows. At operation 1170, camera data is compared to registered features of other cameras and IMU sensors, to perform camera-to-camera calibration.
  • FIG. 7 is a simplified flowchart of a method 1200, performed by sensor processor 150, for sensor fusion for an IMU sensor 130 of smartphone 120, where smartphone 120 is positioned in a cradle in vehicle 100 in an arbitrary orientation, in accordance with an embodiment of the present invention.
  • sensor processor 150 calibrates a gyroscope 133 based on sensor signal data from accelerometer 132, gyroscope 133 and a magnetometer 134.
  • Sensor processor 150 initially estimates the heading vector of vehicle 100 to be the smartphone 120 z-axis of FIG. 5, since this is often the case.
  • Sensor processor 150 initially estimates the roof vector (z-axis of FIG. 5) for vehicle 100 to be the gravity vector of the smartphone 120 local coordinate system.
  • sensor processor 150 derives orientation of smartphone 120, based on the assumption that large rotations are around the vehicle roof axis when the gravity vector does not change substantially.
  • sensor processor 150 derives linear acceleration of smartphone 120, based on the assumption that acceleration is in the direction of the vehicle 100 heading vector (y-axis of FIG. 5) when vehicle 100 is not rotating. Negative acceleration is mirrored. It is noted that the vehicle 100 heading vector is derived from linear acceleration, with or without use of a GPS. The vehicle 100 roof vector (z-axis of FIG. 5) is computed using gyroscope 133, with or without use of a GPS. It is noted that GPS-based acceleration data has a significant amount of noise, and has variable latency. Angular rate of rotation of vehicle 100 is computed from the course that the vehicle is driving.
  • Sensor processor 150 matches linear acceleration and angular rate of rotation with GPS-derived values, so that they slowly converge. Earth gravity, the vehicle heading vector and the vehicle roof vector are tracked to obtain the vehicle's 3D orientation and acceleration vis-a-vis gyroscope 133.
  • sensor processor 150 derives the device-to-vehicle rotation matrix R.
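  • A hedged sketch of constructing R from the two tracked vectors follows; the Gram-Schmidt construction and the axis conventions (per FIG. 5) are assumptions about implementation details the patent leaves open:

```python
import numpy as np

def device_to_vehicle_rotation(gravity_dev, heading_dev):
    # Build R from two vectors tracked in the device frame: gravity (opposite
    # the vehicle roof axis) and the forward-acceleration (heading) direction.
    z = -np.asarray(gravity_dev, float)        # roof axis opposes gravity
    z /= np.linalg.norm(z)
    y = np.asarray(heading_dev, float)
    y = y - np.dot(y, z) * z                   # remove any roof-axis component
    y /= np.linalg.norm(y)
    x = np.cross(y, z)                         # completes a right-handed frame
    return np.vstack([x, y, z])                # rows: vehicle axes in device frame

# Usage: p_vehicle = R @ p_device for any vector measured in the device frame.
```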
  • Calibrator 158 also calibrates camera 112 to synchronize time, to determine latency, and to transform camera data to the vehicle coordinate system. Ego motion algorithms are used to estimate latencies of other vehicle sensors, and, if not known, to estimate the location of camera 112 in vehicle 100.
  • FIG. 8 is a simplified flowchart of a method performed by sensor processor 150 for determining measurement error, in accordance with an embodiment of the present invention.
  • at operation 1310, uncalibrated data from accelerometer 132, gyroscope 133 and magnetometer 134 is low-pass filtered with time synchronization.
  • at operation 1320, smartphone 120 orientation and linear acceleration are derived.
  • at operation 1330, a measurement error is calculated as a sum of deviation norms. The deviations are biased according to a gyroscope random walk and temperature changes.
  • the bias and bias error are updated using uncalibrated gyroscope data and the measurement error calculated at operation 1330.
  • the method avoids over-simplified low-pass filtering, since it introduces large errors during long rotational motion, which is common in countryside driving.
  • FIG. 9 is a simplified flowchart of a method 1400 performed by sensor processor 150 for fusing sensor data to derive linear acceleration and the gravity vector relative to a moving vehicle coordinate system, in accordance with an embodiment of the present invention.
  • sensor processor 150 applies a low-pass filter to accelerometer 132 signal data.
  • sensor processor 150 applies rotation matrices to rotate accelerometer and gyroscope signal data to the vehicle 100 coordinate system.
  • sensor processor 150 updates the gravity vector using the gyroscope signal data, and updates the error estimate based on the gyroscope error estimate.
  • sensor processor 150 uses the current accelerometer measurement to estimate the gravity vector, and estimates the error by distance from a 1 G-force gravity vector and by deviation from the low-pass data due to jerking motion.
  • sensor processor 150 updates each component of the gravity vector in the vehicle coordinate system, and normalizes the resulting gravity vector.
  • sensor processor 150 corrects the acceleration components using speed derived from OBD 111 and GPS 131, and by the relation between speed and linear acceleration (linear acceleration being the time derivative of speed).
  • sensor processor 150 back-propagates the correction in the linear acceleration to correct the gravity vector.
  • sensor processor 150 outputs the linear acceleration and gravity vector relative to the vehicle coordinate system, and outputs an estimate of the error.
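  • A minimal sketch of one fusion step in the spirit of method 1400 follows, assuming inputs already rotated to the vehicle frame as described above; the blending weight and its 1 g-deviation penalty are illustrative assumptions, not the patent's formulas:

```python
import numpy as np

G = 9.81  # 1 g in m/s^2

def fuse_step(accel, gyro, gravity, dt, alpha=0.02):
    # One step: propagate the gravity estimate with the gyroscope, blend in
    # the accelerometer (down-weighted when its magnitude is far from 1 g,
    # i.e. when linear acceleration dominates), renormalize to 1 g, and
    # split off the linear acceleration.
    gravity = gravity - np.cross(gyro * dt, gravity)   # gyroscope propagation
    weight = alpha * np.exp(-abs(np.linalg.norm(accel) - G) / G)
    gravity = (1.0 - weight) * gravity + weight * accel
    gravity *= G / np.linalg.norm(gravity)             # keep magnitude at 1 g
    return gravity, accel - gravity                    # (gravity, linear acc.)
```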
  • the vehicle 100 motion may be tracked separately per spatial coordinate, instead of as full 3D motion. It is noted that, in distinction from cars, such an assumption is not valid for drones. For small time steps, a one-dimensional Kalman filter may be used, with errors and updates computed relative to the vehicle coordinate system separately for each spatial dimension, as in the sketch below.
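  • A per-axis scalar Kalman filter along these lines might look as follows; the random-walk process model and the noise values are illustrative assumptions:

```python
class Kalman1D:
    # Scalar Kalman filter, one instance per vehicle spatial axis.
    def __init__(self, q, r, x0=0.0, p0=1.0):
        self.q, self.r = q, r      # process / measurement noise variances
        self.x, self.p = x0, p0    # state estimate and its variance

    def step(self, z):
        self.p += self.q                   # predict (random-walk process model)
        k = self.p / (self.p + self.r)     # Kalman gain
        self.x += k * (z - self.x)         # correct with measurement z
        self.p *= 1.0 - k
        return self.x

# One filter per axis of the vehicle coordinate system, e.g.:
# fx, fy, fz = Kalman1D(1e-3, 0.25), Kalman1D(1e-3, 0.25), Kalman1D(1e-3, 0.25)
```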
  • the linear acceleration of the vehicle is used to generate a driver score, to detect emergencies, to reconstruct accidents, and for dead reckoning.
  • Device-to-car rotation is used for camera calibration when the camera is on the same device as the IMU sensors.
  • FIG. 10 is a simplified flowchart of a method 1500 performed by sensor processor 150 for online fusing and calibrating sensor signal data in a moving vehicle, in accordance with an embodiment of the present invention.
  • sensor processor 150 receives signal data from sensors 110.
  • sensor processor 150 receives sensor data from server 200.
  • synchronizer 152 evaluates latencies of the received signal data as described hereinabove.
  • synchronizer 152 synchronizes the signal data as described hereinabove.
  • error estimator 154 estimates signal data errors as described hereinabove.
  • sensor validator 156 validates the signal data as described hereinabove.
  • calibrator 158 calibrates the signal data as described hereinabove.
  • sensor processor 150 provides feedback to a driver of vehicle 100. If vehicle 100 is an autonomous vehicle, then sensor processor 150 provides feedback to an autonomous vehicle controller.
  • FIG. 11 is a simplified flowchart of a method 1600 performed by vehicle data processor 250 for offline fusing and calibrating data received from moving vehicles 100, in accordance with an embodiment of the present invention.
  • vehicle data processor 250 receives data from vehicles 100.
  • vehicle data processor 250 evaluates latencies of the received vehicle data.
  • vehicle data processor 250 synchronizes the vehicle data.
  • vehicle data processor 250 estimates vehicle data errors.
  • vehicle data processor 250 derives sensor-related and driver-related information, and stores the derived information in sensor database 210 and driver database 220, respectively.
  • vehicle data processor transmits the derived sensor-related information to vehicles 100.
  • rules are provided for triggering smartphone 120 to log sensor data on server 200 for analysis. Such triggering enables collision analysis and system failure analysis. Conventional systems either do not store signal data around critical events, or store it only locally. Triggering enables preservation of an abundance of critical data for careful analysis.
  • the rules for triggering may be provided by server 200, device 120 or any of sensors 110/130.
  • embodiments of the present invention distribute sensor data fusion and calibration among multiple compute nodes in vehicle 100.
  • FIG. 12 is a simplified block diagram of a multi-processor system for fusing and calibrating sensor signal data in a moving vehicle, in accordance with an embodiment of the present invention.
  • FIG. 12 shows multiple compute nodes 160 in vehicle 100.
  • Compute nodes 160 may include, for example, smartphone 120, an infotainment system, a smart watch and other Internet of Things (IoT) and wearable devices within vehicle 100, which collaborate to fuse and calibrate signal data from sensors 110 and 130.
  • the system of FIG. 12 allocates computing tasks in an optimal way based on the available compute capabilities of the compute nodes 160.
  • One of the compute nodes is designated as the primary node.
  • the primary node is a device with high-availability and significant computing power, such as smartphone 120.
  • Data transfer between nodes is minimized by preprocessing parts of computations at the nodes.
  • Server 200 performs some of the extensive calibration and fine-tuning computations, in order to relieve the local compute nodes 160 of this burden.
  • FIG. 13 is a simplified flowchart of a multi-processor method for initiating distributed fusing and calibrating of sensor signal data in a moving vehicle, in accordance with an embodiment of the present invention.
  • FIG. 13 shows multiple compute nodes 160 in vehicle 100. Some of the compute nodes 160, such as the smartphones, have embedded sensors. Some of the compute nodes, such as the smartphones and the car infotainment system, are coupled with sensors. Thus one or more of the smartphones are coupled with the OBD, and the infotainment system is coupled with a camera. The compute nodes 160 cooperate to fuse and calibrate the sensors in vehicle 100.
  • compute nodes 160 initiate their negotiation with one another and disclose their computing capability and Internet access. Sensing information is collected across all compute nodes 160.
  • compute nodes 160 select one of the compute nodes to be the primary node, and set Internet access roles. For the purpose of redundancy, another one of the compute nodes is selected to be the secondary node.
  • compute nodes 160 set computing and data flow roles per device in vehicle 100.
  • compute nodes 160 synchronize their internal clocks.
  • compute nodes 160 initiate fusion and calibration of the sensor data.
  • FIG. 14 is a simplified flowchart of a multi-processor method for allocating sensor data fusing and calibrating tasks among compute nodes in a moving vehicle, in accordance with an embodiment of the present invention.
  • FIG. 14 shows that calculating sensor latency is allocated to compute nodes #1 and #4, calculating sensor accuracy and calibrating sensor data is allocated to compute node #2, and validating sensors and calculating sensor accuracy is allocated to compute node #3.
  • these various tasks are all performed by components of sensor processor 150.
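  • The patent does not spell out the allocation algorithm; a hypothetical greedy heuristic over per-node capacity scores (combining available computing power and bandwidth) could look like this:

```python
def allocate_tasks(capacity, tasks):
    # Greedy allocation: place the costliest tasks first, each on the node
    # with the most remaining headroom (capacity score minus assigned load).
    load = {node: 0.0 for node in capacity}
    plan = {}
    for task, cost in sorted(tasks.items(), key=lambda kv: -kv[1]):
        best = max(capacity, key=lambda n: capacity[n] - load[n])
        plan[task] = best
        load[best] += cost
    return plan

# Illustrative capacity scores and task costs (invented for the example):
# allocate_tasks({"smartphone": 8.0, "infotainment": 4.0, "smartwatch": 1.0},
#                {"latency": 2.0, "accuracy": 3.0, "validation": 1.0, "calibration": 4.0})
```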
  • behavior of a sensor is automatically classified by monitoring detailed relationships used for sensor fusion on the device side and on the server side.
  • An exemplary embodiment is a Kalman filter where the linear relationships between process and measurement variables are learned on-the-fly.
  • the Kalman filter is based on a process model of the form
  • x_k = F·x_{k−1} + B·u_k + w_k,
  • a loss function may be used to self-converge and detect the parameters.
  • Use of a loss function is feasible for devices such as compute nodes 160, since the number of variables is on the order of 1,000 or less.
  • deep learning algorithms use as many as 1,000,000 variables.
  • the number of variables used in embodiments of the present invention is given by c·N², where N is the number of inputs and hidden variables, and c is determined from the set of levels in the Kalman filter that are trained.
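  • As an order-of-magnitude check, with assumed illustrative numbers (N = 18 and c = 3 are not values given in the patent):

```latex
cN^2 = 3 \times 18^2 = 972 \approx 10^3
```

  • which stays within the roughly 1,000-variable budget noted above for on-device self-convergence.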
  • the set of levels is 3; namely, (i) process model, (ii) noise covariance matrix, and (iii) state transitions.
  • a 4th level, (iv) input control, is used.
  • FIG. 15 is a graph of magnetic field data over time, showing correlation between acceleration and magnetic field, illustrating a self- converging Kalman filter in accordance with an embodiment of the present invention.
  • the data in FIG. 15 was generated using an iPhone in a Tesla Model S P100D electric car.
  • a conventional Kalman filter fails to include a weighting for the relation between magnetic field and vehicle acceleration, but instead includes a weighting between magnetic field and orientation. This is an incorrect relationship and, as such, the conventional Kalman filter produces large errors in this use case.
  • the self-converging Kalman filter in accordance with the present invention slowly adjusts the weights using a loss function based on GPS short range dead reckoning. If the electric car has an OBD, then the car model is registered on server 200, and the pre-designated model from the server is used for the next driver with this type of car, without having to repeat the learning phase.


Abstract

A sensor processing device located within a vehicle that is being driven, the processing device communicating with plural sensors, each sensor generating signal data, the processing device including a transceiver transmitting data derived by the processing device from the sensor signal data, to one or more remote servers, and receiving sensor-related information from the one or more remote servers, a synchronizer evaluating latencies of the sensors, an error estimator estimating accuracies of the sensor signal data, a sensor validator determining if one or more of the sensors are failed, and a calibrator transforming the sensor signal data to a common vehicle reference system.

Description

FUSION AND CALIBRATION OF SENSOR SIGNALS IN A MOVING VEHICLE
FIELD OF THE INVENTION
[0001] This application claims the benefit of priority from U.S. Provisional Application No. 62/516,459, filed on June 7, 2017. The content of the above document is incorporated by reference in its entirety as if fully set forth herein.
[0002] The present invention relates to using a data bridge to fuse and calibrate data from multiple sensors within a moving vehicle.
BACKGROUND OF THE INVENTION
[0003] Today there are many devices with built-in sensors that measure physical quantities such as motion and orientation, and environmental conditions. Some sensors provide imagery and videos. Sensors generate real-time data, and are used by artificial intelligence (AI) applications to derive useful assistance. Vehicle driving is an important case where integration of sensor data from diverse sources is an essential factor. Measurement-enabled sensors are available in various platforms including mobile phones, smart watches, smart glasses, dashcams and car sensors. These sensors exhibit different accessibility, connectivity and reliability. For maximal assistance, all available sensor data should be collected and fused.
[0004] However, standardization of data sources for sensors is not enforced. There is no proper unified gateway for data collection, and no system to obtain, align, synchronize and calibrate data sources, assess data validity, and analyze it for an AI-aware driving experience.
[0005] The Robot Operating System (ROS) is a framework for standardizing communication and configuration sharing between devices, when low level access is provided. However, there is currently no standard for high-level integration of multiple devices.
SUMMARY
[0006] Embodiments of the present invention take advantage of the fact that a driver's smartphone typically has at least twice the computing power and sensing capability of the vehicle that he drives. There is a sort of "Moore's Law" at play here; namely, that a new smartphone typically has twice as much computing power as, say, a three-year-old car. And, of course, it is much less expensive to upgrade a smartphone than it is to upgrade a car.
[0007] As such, embodiments of the present invention leverage a smartphone as a data bridge to fuse and calibrate sensor data in a vehicle, and to provide unified accessibility, connectivity and data handling. Embodiments of the present invention integrate sensor data with minimal prior knowledge about the sensor properties, including inter alia sensor latency, sensor error and sensor orientation.
[0008] There is thus provided in accordance with an embodiment of the present invention a sensor processing device located within a vehicle that is being driven, the processing device communicating with plural sensors, each sensor generating signal data, the processing device including a transceiver transmitting data derived by the processing device from the sensor signal data, to one or more remote servers, and receiving sensor-related information from the one or more remote servers, a synchronizer evaluating latencies of the sensors, an error estimator estimating accuracies of the sensor signal data, a sensor validator determining if one or more of the sensors are failed, and a calibrator transforming the sensor signal data to a common vehicle reference system.
[0009] There is additionally provided in accordance with an embodiment of the present invention a vehicle network data processor that receives time series data from transmitters in one or more vehicles that are being driven, the time series data based on plural sensors located in the one or more vehicles, the vehicle data processor deriving, from the received time series data, sensor-related information and driver-related information, the vehicle data processor including one or more cellular or Wi-Fi transceivers receiving time series data from the vehicles, a synchronizer evaluating latencies of the time series data, an error estimator for estimating accuracies of the time series data, and one or more database managers storing the sensor-related information and the driver-related information derived by the processor in one or more databases, wherein the one or more cellular or Wi-Fi transceivers transmit the sensor-related information in the databases to the vehicles.
[0010] There is further provided in accordance with an embodiment of the present invention a non-transitory computer readable medium storing instructions, which, when executed by a processing device located in a vehicle that is being driven, cause the processing device to process signal data from sensors in the vehicle, including causing the processing device to receive signal data from the sensors, receive sensor-related information from one or more remote servers, evaluate latencies of the sensors so as to synchronize the sensor signal data, estimate accuracies of the sensor signal data, determine if one or more of the sensors are failed, and transform the sensor signal data to a common vehicle reference system.
[0011] There is yet further provided in accordance with an embodiment of the present invention a non-transitory computer readable medium storing instructions, which, when executed by a vehicle network data processor, cause the data processor to receive time series data from transmitters in one or more vehicles that are being driven, the time series data being based on plural sensors located in the one or more vehicles, and to derive sensor-related information and driver-related information from the received time series data, comprising causing the data processor to: receive time series data from the vehicles, evaluate latencies of the sensors, estimate accuracies of the time series data, store the sensor-related information and the driver-related information derived by the processor in one or more databases, and transmit the sensor-related information in the databases to the vehicles.
[0012] There is moreover provided in accordance with an embodiment of the present invention a vehicle sensor system, including a plurality of computer processing units within a vehicle that is being driven, the vehicle including plural sensors that generate signal data, the plurality of computer processing units jointly comprising circuitry for sensor calibration and fusion, the circuitry including one or more local area data connections receiving signal data from the sensors, one or more transceivers transmitting data derived from the sensor signal data to one or more remote servers, and receiving sensor-related information from the one or more remote servers, a synchronizer evaluating latencies of the sensors, an error estimator for estimating accuracies of the received sensor signal data, a sensor validator for determining if one or more of the sensors are failed, and a calibrator for transforming the received sensor signal data to a common vehicle reference system.
[0013] There is additionally provided in accordance with an embodiment of the present invention a method for a plurality of computer processing units within a vehicle that is being driven, the vehicle including plural sensors that generate signal data, to jointly perform sensor data processing, the method including dynamically allocating among the computer processing units the following real-time tasks, the allocation being based on currently available bandwidth and computing power of each computer processing unit: evaluate latencies of the sensors so as to synchronize the sensor signal data, estimate accuracies of the sensor signal data, determine if one or more of the sensors are failed, and transform the sensor signal data to a common vehicle reference system.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] The present invention will be more fully understood and appreciated from the following detailed description, taken in conjunction with the drawings in which:
[0015] FIG. 1 is a simplified block diagram of a device for fusing and calibrating sensor signal data in a moving vehicle, in accordance with an embodiment of the present invention;
[0016] FIG. 2 is a simplified block diagram of a sensor processor for fusing and calibrating data received from a moving vehicle, in accordance with an embodiment of the present invention;
[0017] FIG. 3 is a simplified diagram of vehicle using a smartphone positioned on a cradle for configuring a common domain, using the smartphone as a data bridge, in accordance with an embodiment of the present invention;
[0018] FIG. 4 is a simplified flowchart of a method for determining whether a sensor is operable, in accordance with an embodiment of the present invention;
[0019] FIG. 5 is a simplified drawing of coordinate systems relative to a vehicle and relative to a smartphone, in accordance with an embodiment of the present invention;
[0020] FIG. 6 is a simplified flowchart of a method used for estimating locations and orientations of sensors, in accordance with an embodiment of the present invention;
[0021] FIG. 7 is a simplified flowchart of a method for sensor fusion for an inertial measurement unit sensor that is positioned in a cradle in a vehicle in an arbitrary orientation, in accordance with an embodiment of the present invention;
[0022] FIG. 8 is a simplified flowchart of a method for determining measurement error, in accordance with an embodiment of the present invention;
[0023] FIG. 9 is a simplified flowchart of a method for fusing sensor data to derive linear acceleration and the gravity vector relative to a moving vehicle coordinate system, in accordance with an embodiment of the present invention;
[0024] FIG. 10 is a simplified flowchart of a method for online fusing and calibrating sensor signal data in a moving vehicle, in accordance with an embodiment of the present invention;
[0025] FIG. 11 is a simplified flowchart of a method for offline fusing and calibrating data received from moving vehicles, in accordance with an embodiment of the present invention;
[0026] FIG. 12 is a simplified block diagram of a multi-processor system for fusing and calibrating sensor signal data in a moving vehicle, in accordance with an embodiment of the present invention;
[0027] FIG. 13 is a simplified flowchart of a multi-processor method for initiating distributed fusing and calibrating of sensor signal data in a moving vehicle;
[0028] FIG. 14 is a simplified flowchart of a multi-processor method for allocating sensor data fusing and calibrating tasks among compute nodes in a moving vehicle, in accordance with an embodiment of the present invention; and
[0029] FIG. 15 is a graph of magnetic field data over time, showing correlation between acceleration and magnetic field, illustrating a self-converging Kalman filter in accordance with an embodiment of the present invention.
[0030] For reference to the figures, TABLE I provides an index of elements and their numerals. Similarly numbered elements represent elements of the same type, but they need not be identical elements.
[TABLE I: index of elements and their numerals (image not reproduced)]
[0031] Elements numbered in the 1000s are operations of flow charts.
DETAILED DESCRIPTION
[0032] Embodiments of the present invention leverage computing power of smartphones to fuse and calibrate signals generated by sensors in moving vehicles.
[0033] Reference is made to FIG. 1, which is a simplified block diagram of a device for fusing and calibrating sensor signal data in a moving vehicle, in accordance with an embodiment of the present invention. A vehicle 100 is moving, and a device, namely, a smartphone 120 is located inside vehicle 100. Vehicle 100 may be a land vehicle such as a car or a motorcycle, a water vehicle such as a boat or a ship, or an air vehicle such as an airplane or a drone. Vehicle 100 includes a plurality of sensors 110. Vehicle sensors 110 may include inter alia on-board diagnostics (OBD) 111 and a camera 112. Smartphone 120 includes a plurality of sensors 130. Device sensors 130 may include any types of sensors within smartphones, including a global positioning system (GPS) 131, an accelerometer 132, a gyroscope 133, a magnetometer 134, a camera 135, and other sensors such as a microphone.
[0034] Each sensor generates a signal z(t) for an actual physical quantity/vector x(t) such as a vehicle position, velocity or acceleration vector. Each sensor signal is sampled as a time series z(t1), z(t2), ... of signal data. Each sensor sample has a latency and an error; i.e.,
z(t) = x(t − Δt) + e(t)
where Δt is a latency, and where the error term e(t) includes a component due to inaccuracy of the sensor, and a component due to noise. Thus GPS 131 generates a time series of vehicle 100 longitude-latitude positions. Accelerometer 132 generates a time series of vehicle 100 accelerations. Gyroscope 133 generates a time series of angular velocities. Magnetometer 134 generates a time series of orientations. Cameras 112 and 135 generate time series of images near the front/back/right/left of vehicle 100. OBD 111 generates a time series of system measurements for vehicle 100, including inter alia fuel systems, emission systems, transmission systems, speed control systems and idle control systems.
[0035] Some sensors 110 / 130 have higher sampling rates than others. Accelerometer 132 and gyroscope 133 have high sampling rates and provide high-frequency data. OBD 111 and GPS 131 have lower sampling rates and provide low frequency data.
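For illustration, the following is a minimal sketch of this sampling model; the ground-truth signal, the latencies and the noise levels below are invented for the example, not taken from the patent:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def sample_sensor(x, t, latency, noise_std, bias=0.0):
    # One sample z(t) = x(t - latency) + e(t), where e(t) combines a fixed
    # sensor inaccuracy (bias) and random noise.
    return x(t - latency) + bias + rng.normal(0.0, noise_std)

# Toy ground truth: vehicle speed in m/s.
x = lambda t: 20.0 + 5.0 * np.sin(0.1 * t)

# A high-rate, noisy sensor vs. a low-rate, high-latency one.
t_fast = np.arange(0.0, 10.0, 0.01)   # e.g., IMU-like, 100 Hz
t_slow = np.arange(0.0, 10.0, 1.0)    # e.g., GPS-like, 1 Hz
z_fast = np.array([sample_sensor(x, t, latency=0.01, noise_std=0.5) for t in t_fast])
z_slow = np.array([sample_sensor(x, t, latency=0.8, noise_std=0.1) for t in t_slow])
```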
[0036] Smartphone 120 includes a sensor processor 150 that fuses and calibrates signals from vehicle sensors 110 and device sensors 130 in order to accurately estimate the physical quantities x(t), as explained in detail hereinbelow. Sensor processor 150 acts as a data bridge for vehicle sensors 110 and device sensors 130. One of the technical challenges in fusing sensors 110 / 130 is that the various sensors 110 / 130 have different sampling rates, different latencies and different accuracies.
[0037] Based on its derived estimators x̂(t), sensor processor 150 generates feedback to a driver of vehicle 100. Alternatively or additionally, sensor processor 150 generates a driver score based on the driver's driving skill. Alternatively or additionally, sensor processor 150 may provide positioning information for vehicle 100. Alternatively, vehicle 100 may be an autonomous driverless vehicle, and sensor processor 150 may provide driving instructions to vehicle 100.
[0038] In some embodiments of the present invention, sensor processor 150 is a component of smartphone 120. In other embodiments of the present invention, sensor processor 150 is external to smartphone 120.
[0039] Derivation of the estimators x̂(t) by sensor processor 150 may be performed using a Kalman filter, or other such filters that attempt to minimize a statistical error of x̂(t) − x(t) at each time step t_1, t_2, ..., using a recursive process equation for x̂(t) as an update step, and a measurement equation for z(t) as a corrective step.
[0040] A server 200, remote from vehicle 100, is in communication with vehicle 100. Server 200 accesses two databases; namely, a sensor database 210 and a driver database 220. Sensor database 210 and driver database 220 may be local to server 200, or remote from but accessible by server 200. Server 200 includes a vehicle data processor 250, operation of which is explained in detail hereinbelow.

[0041] Vehicle data processor 250 may be a node within a vehicle-to-vehicle network. Alternatively or additionally, vehicle data processor 250 may be part of one or more advanced driver assistance systems (ADAS). Alternatively or additionally, vehicle data processor 250 may be used for autonomous driving training. Alternatively or additionally, vehicle data processor 250 may be used for map generation. Alternatively or additionally, vehicle data processor 250 may be used to provide fleet driver scores.
[0042] Reference is made to FIG. 2, which is a simplified block diagram of sensor processor 150, which fuses and calibrates data received from vehicle 100, in accordance with an embodiment of the present invention. Sensor processor 150 includes four primary components; namely, a synchronizer 152 for evaluating latencies of sensors 110 / 130, an error estimator 154 for estimating accuracies of the signals generated by sensors 110 / 130, a sensor validator 156 for determining if one or more of sensors 110 / 130 are inoperable, and a calibrator 158 for transforming the sensor signal data to a common vehicle reference system.
[0043] Sensors 110 / 130 may have different operating characteristics, including inter alia different frequencies of measurement, different data qualities, different stabilities, different latencies, and different levels of detail exposed by the sensors. In order to fuse the different sensor signals, sensor processor 150 processes sensor signals as time series data, i.e., on the basis that each sensor output is a measurement that has a timestamp. The measurement has an error, the timestamp has an error, and the timestamp has a latency, all of which need to be computed in real time in order to fuse sensors 110 / 130 and accurately analyze their data. Sensor processor 150 uses redundancies in sensor measurements to optimize its analysis, and to compensate for failure of a specific sensor 110 / 130. TABLE II below provides exemplary measurement errors and latencies for different sensors 110 / 130. Most sensors 110 / 130 do not provide information about their measurement errors; an exception is a GPS sensor, which provides a spatial error in location but does not provide a latency. Latency is a significant factor for vehicle-to-vehicle networks and for autonomous control based on simultaneous localization and mapping (SLAM). At a speed of 30 m/sec (108 km/h), a GPS spatial location error may be on the order of 5 m, whereas latency may cause an error of 50 m.
TABLE II — Exemplary measurement errors and latencies for sensors 110 / 130
[0044] Server 200 assists sensor processor 150 in calibrating the sensor measurements by providing calibration data and aggregate data from other vehicles.
Synchronizer 152
[0045] To track latencies between sensors, a common domain is used. Reference is made to FIG. 3, which is a simplified diagram of vehicle 100 using smartphone 120 positioned in a cradle for configuring a common domain, using sensor processor 150 of smartphone 120 as a data bridge. FIG. 3 shows OBD 111 of vehicle 100, and camera 112 external to smartphone 120. Smartphone 120 is shown with two sensors; namely, GPS 131 and accelerometer 132.
[0046] It is noted that although OBD 111 and camera 112 are external to smartphone 120, their data is nevertheless processed by sensor processor 150, which acts as a data bridge for the sensors.
[0047] Sensor processor 150 independently derives the speed of vehicle 100 from each of GPS 131, accelerometer 132, OBD 111 and camera 112. Specifically, sensor processor 150 derives the speed in four different ways; namely, (i) by numerically differentiating location data z(t) obtained from GPS 131, v(t) = dz/dt; (ii) by numerically integrating acceleration data z(t) obtained from accelerometer 132, v(t) = ∫ z(t) dt; (iii) from OBD 111 data z(t), which measures speed directly from vehicle sensors; and (iv) by estimating ego-motion from camera 112. Each of these derivations exhibits different properties, as summarized below in TABLE III. As such, four different signals v(t) for the same vehicle velocity are generated.
TABLE III — Properties of the four speed derivations
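By way of non-limiting illustration, derivations (i) and (ii) may be sketched in Python as follows; the names are illustrative assumptions, and OBD speed (iii) and camera ego-motion speed (iv) would come from their respective data streams.

    import numpy as np

    def speed_from_gps(positions, times):
        # v(t) = dz/dt: numerically differentiate GPS positions
        # (here one-dimensional distance along the road, in meters).
        return np.diff(positions) / np.diff(times)

    def speed_from_accelerometer(accels, times, v0=0.0):
        # v(t) = v0 + ∫ z(t) dt: numerically integrate forward-axis
        # acceleration samples.
        dt = np.diff(times)
        return v0 + np.cumsum(accels[:-1] * dt)

    t = np.arange(0.0, 5.0, 0.1)
    pos = 3.0 * t + 0.5 * 1.2 * t**2      # position under constant 1.2 m/s^2
    acc = np.full_like(t, 1.2)            # accelerometer signal
    v_gps = speed_from_gps(pos, t)
    v_imu = speed_from_accelerometer(acc, t, v0=3.0)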
[0048] Once the signals are in the same domain, latency tracking is performed by one or more of the following algorithms.
• Fast-Fourier Transform (FFT) and phase matching;
• cross correlation; and
• Kalman filtering.
The FFT algorithm is used for larger signals. The algorithms may incorporate data about the sensor, vehicle motion equations, or other prior knowledge. Alternatively, the algorithms may be used without any prior knowledge.
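By way of non-limiting illustration, cross-correlation latency tracking may be sketched in Python as follows, assuming both signals have already been brought into the same domain (e.g., speed) and resampled to a common sample spacing dt; the names are illustrative.

    import numpy as np

    def estimate_latency(sig_a, sig_b, dt):
        # Estimate how much sig_b lags sig_a, in seconds, from the peak of
        # their cross-correlation. A positive result means sig_b lags sig_a.
        a = sig_a - np.mean(sig_a)
        b = sig_b - np.mean(sig_b)
        corr = np.correlate(a, b, mode="full")
        lag_samples = (len(b) - 1) - np.argmax(corr)
        return lag_samples * dt

    # Example: a GPS-derived speed trace delayed 0.5 s relative to OBD speed.
    t = np.arange(0.0, 60.0, 0.1)
    v_obd = np.sin(0.2 * t)
    v_gps = np.roll(v_obd, 5)                        # 5 samples x 0.1 s lag
    print(estimate_latency(v_obd, v_gps, dt=0.1))    # prints approximately 0.5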
Error Estimator 154
[0049] After latencies are derived by synchronizer 152, error estimator 154 matches signals and estimates errors using an error model that is linear over short time ranges. For each sensor 110 / 130, a calibration and error estimation is derived using domain matching and, if necessary, back-propagating error to the original domain.
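By way of non-limiting illustration, fitting a linear error model over a short time window may be sketched as follows; the names are illustrative assumptions.

    import numpy as np

    def fit_linear_error(reference, measured, times):
        # Model the error e = measured - reference as e(t) ≈ a + b*t over a
        # short window, and return the offset a and drift rate b.
        e = np.asarray(measured) - np.asarray(reference)
        b, a = np.polyfit(times, e, deg=1)
        return a, b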
Sensor Validator 156
[0050] Reference is made to FIG. 4, which is a simplified flowchart of a method 1000 for sensor validator 156 to determine whether a sensor is operable, in accordance with an embodiment of the present invention. At decision 1010, a specific sensor 110 / 130 in vehicle 100 is tested, to determine whether or not its signal data aligns with the motion of vehicle 100. If decision 1010 is affirmative, i.e., the signal data aligns with the motion of the vehicle, then at decision 1020, a determination is made whether or not the signal data of the specific signal being tested aligns with the signal data of others of sensors 110 / 130. In one embodiment of the present invention, decision 1020 is based on the signal data of the sensor being tested aligning with the signal data of a majority of the other sensors 110 / 130, since it may be that a few of the other sensors 110 / 130 are inoperable. If either decision 1010 or decision 1020 is negative, then at operation 1030 it is concluded that the sensor 110 / 130 being tested is inoperable. Otherwise, if both decision 1010 and decision 1020 are affirmative, then at operation 1040 it is concluded that the sensor 110 / 130 being tested is operable.
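By way of non-limiting illustration, the decision logic of FIG. 4 may be sketched as follows; the boolean inputs are assumed to be produced by upstream alignment tests.

    def validate_sensor(aligns_with_vehicle_motion, alignments_with_other_sensors):
        # Decision 1010: does the signal data align with the vehicle's motion?
        if not aligns_with_vehicle_motion:
            return False                     # operation 1030: inoperable
        # Decision 1020: majority vote against the other sensors, since a few
        # of the other sensors may themselves be inoperable.
        votes = sum(alignments_with_other_sensors)
        if votes <= len(alignments_with_other_sensors) / 2:
            return False                     # operation 1030: inoperable
        return True                          # operation 1040: operable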
[0051] A sensor may be inoperable because it has failed. A failed sensor 110 / 130 is ignored until it is back in operation. If a reset for the failed sensor 110 / 130 is available, e.g., by stopping and restarting the sensor, then the failed sensor 110 / 130 is reset.
[0052] A sensor may be inoperable because of human interaction. E.g., if a driver moves smartphone 120 from its cradle, then the motion of smartphone 120 during this time is a combination of driving motion and human interaction. Such an anomaly is detected by the method of FIG. 4 when the smartphone motion is compared with the motion of vehicle 100, with OBD 111 speed data, or with external camera 112 ego motion.
Calibrator 158
[0053] In accordance with an embodiment of the present invention, calibrator 158 performs low-level pre-processing prior to sensor data being provided to high-level algorithms, applications, and an operating system. Calibration may be performed using deep learning with unsupervised methods, by defining outputs such as location, and using a loss function to train the network to achieve best results by augmentation of errors on inputs.
[0054] One of the services that server 200 provides is mapping. Mapping enables better localization on the client, whereby enhanced location services are provided in a manner such that location is treated as a sensor instead of a high-level output. Each sensor is labeled by model and type, and its data is uploaded to the server in which model-specific calibration and, as appropriate, training of the network to calibrate, are performed. As such, when new users connect to the sensor, an initial offline calibration is provided.
[0055] Any device that is connectable and delivers state data, such as a vehicle Wi-Fi hotspot, may be treated as a signal to be calibrated.
[0056] Multiple sensors with the same measurement, such as multiple GPS units, may be aggregated to output a single location for top level consumers, such as applications and operating systems. E.g., the multiple GPS units may be a wearable watch with GPS, a phone GPS and a vehicle GPS.
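By way of non-limiting illustration, such aggregation may weight each unit's fix by its reported accuracy; the sketch below assumes simultaneous fixes close enough together that latitude/longitude curvature effects are negligible.

    import numpy as np

    def fuse_locations(locations, accuracies):
        # locations: (n, 2) array of (lat, lon) fixes taken at the same time.
        # accuracies: per-unit reported spatial error in meters, used as
        # inverse-variance weights.
        w = 1.0 / np.square(np.asarray(accuracies, dtype=float))
        return np.average(np.asarray(locations, dtype=float), axis=0, weights=w)

    # Example: a phone GPS (5 m accuracy) fused with a watch GPS (12 m accuracy).
    fused = fuse_locations([[32.080, 34.780], [32.081, 34.779]], [5.0, 12.0])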
[0057] Calibrator 158 not only processes individual sensors, but also generates a unified output in the vehicle context, such as the vehicle coordinate system, or the vehicle time.
[0058] In some embodiments of the present invention, a sensor is calibrated over time, and only the final calibration with sensor model and additional data is uploaded, instead of raw sensor data.
[0059] First notice of loss (FNOL) services, which run in real-time on smartphone 120, and collision reconstruction are improved by calibration, whereby vehicle and driver damage is evaluated using a combination of all sensors. Each such case is uploaded to server 200 for further adjustment and analysis of parameters and calibration. This is significant because collisions are non-linear, and many systems are not calibrated for a non-linear working area. A high G-force, for example, may cause faulty gyro readings.
[0060] With proper calibration of multiple inputs from many sensors, distracted driving is readily recognized when the inputs are well-synchronized. Distracted driving detection is thus another output, alongside driver score and localization.
[0061] Some of sensors 110 / 130, such as accelerometer 132, are inertial measurement unit (IMU) sensors relative to a local coordinate system. Calibrator 158 transforms local sensor coordinate systems to a common vehicle coordinate system, as described below with reference to FIG. 5. Calibrator 158 also estimates locations and orientations of sensors 110, as described below with reference to FIG. 6.
[0062] In accordance with embodiments of the present invention, the sensors may fetch from server 200 initial per-sensor model calibrations. The server aggregates measurements to optimize per-sensor model calibration. Additionally, the sensors periodically send back enhanced on-the-fly calibration, so that the server per-model calibration uses accurate statistics. For example, a sensor of type/model X may have a constant calibration error. When a new sensor of type/model X is introduced into the system, the new sensor uses the initial model calibration. As such, the new sensor immediately provides accurate calibrated results, avoiding the need to wait for the online calibration process to converge.
[0063] The initial per-sensor model calibrations may be based on local calibrations performed by the individual sensors and uploaded to the server.
[0064] Reference is made to FIG. 5, which is a simplified drawing of coordinate systems relative to vehicle 100 and relative to smartphone 120, in accordance with an embodiment of the present invention. FIG. 5 shows that relative to vehicle 100, an x-axis passes through the vehicle from the passenger side to the driver side, a y-axis points along the direction of motion of vehicle 100, and a z-axis points upwards to the roof of vehicle 100. The y-axis is referred to as the "vehicle heading vector". The z-axis is referred to as the "vehicle roof vector". Linear acceleration of vehicle 100 is measured along the vehicle heading vector, and rotation of vehicle 100 is measured around the vehicle roof vector as axis of rotation.
[0065] FIG. 5 also shows that relative to smartphone 120, an x-axis passes through smartphone 120 from front to back, a y-axis points to the right of the phone display, and a z-axis points above the display. Both the vehicle and the smartphone xyz coordinate systems are moving right-handed coordinate systems.
[0066] In order to fuse IMU sensor data, camera 112 data and other sensor data, calibrator 158 transforms the sensors' local coordinate systems to the vehicle coordinate system. Calibrator 158 transforms each local sensor coordinate system in accordance with

p_cal = T (p − o) ≈ R S (p − o),    (EQ. 1)

where p_cal and p represent respective calibrated and uncalibrated position vectors, T is a 3×3 sensor-to-vehicle calibration matrix including skew, o is an offset, R is a 3D device-to-vehicle rotation matrix, and S is a 3D scale matrix. Representation of the matrix T as the product of a rotation matrix, R, and a scale matrix, S, in EQ. 1 is based on the assumption that the sensor is factory-calibrated to an orthogonal local sensor coordinate system. Otherwise, the matrix T is instead factored using a singular value decomposition (SVD) of matrix T. It is noted that EQ. 1 assumes a linear relation between measured and calibrated vectors. In practice, however, there may be nonlinear effects, especially during large accelerations.
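By way of non-limiting illustration, applying EQ. 1 may be sketched as follows; the example rotation and scale matrices are illustrative assumptions.

    import numpy as np

    def calibrate(p, o, R, S):
        # EQ. 1: p_cal = T (p - o) ≈ R S (p - o), with T factored as a
        # rotation R times a per-axis scale S.
        return R @ S @ (np.asarray(p, dtype=float) - np.asarray(o, dtype=float))

    # Example: a sensor mounted with a 90-degree yaw relative to the vehicle,
    # with a slight per-axis scale error and a small x offset.
    R = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
    S = np.diag([1.02, 0.98, 1.00])
    p_cal = calibrate([0.1, 9.8, 0.0], o=[0.05, 0.0, 0.0], R=R, S=S)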
[0067] For IMU sensors without a GPS, such as accelerometers and gyroscopes, the sensor-to-vehicle rotation matrix R is determined based on an assumption that during rotation most of the energy of vehicle 100 is relative to the roof axis (z-axis in FIG. 5) of vehicle 100, and when not rotating, most of the energy is relative to the heading vector (y-axis in FIG. 5), which is accelerating and decelerating. The rotation matrix R can be validated based on an assumption that the gravity vector of vehicle 100, when the accelerometer is at rest, should be near the vehicle floor.

[0068] For IMU sensors with a GPS, full route sections of vehicle 100 are analyzed by dead reckoning, using a Kalman filter and/or gradient descent regression.
[0069] Reference is made to FIG. 6, which is a simplified flowchart of a method 1100 used by calibrator 158 for estimating locations and orientations of sensors 110 and 130, in accordance with an embodiment of the present invention. At operation 1110 sensor output signal data is collected. At operation 1120 car front wheels are aligned with a road contour, which may have road defects. Operation 1120 is optionally performed only when vehicle 100 is moving slowly, since complexities due to wheel suspension may arise at high speeds. At operation 1130, distance of vehicle rotation axis from sensor 110 / 130 is estimated using induced force and angular rate, based on a simple rigid body model. Speed bumps are particularly helpful in performing operation 1130.
[0070] If sensor 110 / 130 includes a GPS, then at operation 1140 the height of sensor 110 / 130 is compared to a map, within accuracy of a few cm. If sensor 110 / 130 is vision-based, then at operation 1150 camera properties including inter alia tilt and height are estimated with reference to an object of known size. If sensor 110 / 130 is a camera, then at operation 1160 the side (front/left/right/back) of vehicle 100 on which sensor 110 / 130 is located is determined using ego motion and optical flows. At operation 1170, camera data is compared to registered features of other cameras and IMU sensors, to perform camera-to-camera calibration.
[0071] Reference is made to FIG. 7, which is a simplified flowchart of a method 1200, performed by sensor processor 150, for sensor fusion for an IMU sensor 130 of smartphone 120, where smartphone 120 is positioned in a cradle in vehicle 100 in an arbitrary orientation, in accordance with an embodiment of the present invention. At operation 1210 sensor processor 150 calibrates gyroscope 133 based on sensor signal data from accelerometer 132, gyroscope 133 and magnetometer 134. Sensor processor 150 initially estimates the heading vector of vehicle 100 to be the smartphone 120 z-axis of FIG. 5, since this is often the case. Sensor processor 150 initially estimates the roof vector (z-axis of FIG. 5) for vehicle 100 to be the gravity vector of the smartphone 120 local coordinate system.

[0072] At operation 1220 sensor processor 150 derives orientation of smartphone 120, based on the assumption that large rotations are around the vehicle roof axis when the gravity vector does not change substantially.
[0073] At operation 1230 sensor processor 150 derives linear acceleration of smartphone 120, based on the assumption that acceleration is in the direction of the vehicle 100 heading vector (y-axis of FIG. 5) when vehicle 100 is not rotating. Negative acceleration is mirrored. It is noted that the vehicle 100 heading vector is derived from linear acceleration, with or without use of a GPS. The vehicle 100 roof vector (z-axis of FIG. 5) is computed using gyroscope 133, with or without use of a GPS. It is noted that GPS-based acceleration data has a significant amount of noise, and has variable latency. Angular rate of rotation of vehicle 100 is computed from the course that the vehicle is driving. Sensor processor 150 matches linear acceleration and angular rate of rotation with GPS-derived values, so that they slowly converge. Earth gravity, the vehicle heading vector and the vehicle roof vector are tracked to obtain the vehicle's 3D orientation and acceleration vis-a-vis gyroscope 133.
[0074] At operation 1240 sensor processor 150 derives the device-to-vehicle rotation matrix R.
[0075] Calibrator 158 also calibrates camera 112 to synchronize time, to determine latency, and to transform camera data to the vehicle coordinate system. Ego motion algorithms are used to estimate latencies of other vehicle sensors, and, if not known, to estimate the location of camera 112 in vehicle 100.
[0076] Reference is made to FIG. 8, which is a simplified flowchart of a method performed by sensor processor 150 for determining measurement error, in accordance with an embodiment of the present invention. At operation 1310 uncalibrated data from accelerometer 132, gyroscope 133 and magnetometer 134 is low-pass filtered with time synchronization. At operation 1320 smartphone 120 orientation and linear acceleration are derived. At operation 1330 a measurement error is calculated as a sum of deviation norms. The deviations are biased according to a gyroscope random walk and temperature changes. At operation 1340 the bias and bias error are updated using uncalibrated gyroscope data and the measurement error calculated at operation 1330.

[0077] The method of FIG. 8 is optimized to avoid using a full gyroscope model, which requires excessive computing, by using derived vectors such that offset calibration is not necessary for the magnetometer and accelerometer. The method avoids over-simplified low-pass filtering, since such filtering introduces large errors with long rotational motion, which is common for country-side driving.
[0078] Reference is made to FIG. 9, which is a simplified flowchart of a method 1400 performed by sensor processor 150 for fusing sensor data to derive linear acceleration and the gravity vector relative to a moving vehicle coordinate system, in accordance with an embodiment of the present invention. At operation 1410 sensor processor 150 applies a low-pass filter to accelerometer 132 signal data. At operation 1420 sensor processor 150 applies rotation matrices to rotate accelerometer and gyroscope signal data to the vehicle 100 coordinate system. At operation 1430 sensor processor 150 updates the gravity vector using the gyroscope signal data, and updates the error estimate based on the gyroscope error estimate.
[0079] At operation 1440 sensor processor 150 uses the current accelerometer measurement to estimate the gravity vector, and estimates the error by distance from a 1 G-force gravity vector and by deviation from the low-pass data due to jerking motion. At operation 1450 sensor processor 150 updates each component of the gravity vector in the vehicle coordinate system, and normalizes the resulting gravity vector. At operation 1460 sensor processor 150 corrects the acceleration components using speed derived from OBD 111 and GPS 131, and by the relation
side force = (speed) * (rotation around roof axis).
[0080] At operation 1470 sensor processor 150 back-propagates the correction in the linear acceleration to correct the gravity vector.
[0081] At operation 1480 sensor processor 150 outputs the linear acceleration and gravity vector relative to the vehicle coordinate system, and outputs an estimate of the error.
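By way of non-limiting illustration, a greatly simplified single step of the FIG. 9 loop may be sketched as follows, in the vehicle frame (x: side, y: heading, z: roof); the complementary-filter gain alpha and all names are illustrative assumptions.

    import numpy as np

    def fuse_step(accel_veh, yaw_rate, speed, gravity, alpha=0.02):
        # accel_veh: accelerometer sample already rotated to the vehicle frame.
        accel_veh = np.asarray(accel_veh, dtype=float)
        # Operations 1440 / 1450: nudge the gravity estimate toward the
        # measurement, then re-normalize it to 1 g.
        gravity = (1.0 - alpha) * np.asarray(gravity, dtype=float) + alpha * accel_veh
        gravity = 9.81 * gravity / np.linalg.norm(gravity)
        # Linear acceleration is what remains after removing gravity.
        linear = accel_veh - gravity
        # Operation 1460: the side component should match
        # side force = (speed) * (rotation around roof axis).
        linear[0] = speed * yaw_rate
        return linear, gravity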
[0082] In accordance with an embodiment of the present invention, to reduce computational complexity it may be assumed that vehicle 100 motion can be tracked separately per spatial coordinate, instead of as full 3D motion. It is noted that, in distinction from cars, such an assumption is not valid for drones. For small time steps, a one-dimensional Kalman filter may be used, with errors and updates computed relative to the vehicle coordinate system separately for each spatial dimension.
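By way of non-limiting illustration, such a per-axis scalar Kalman filter may be sketched as follows; the noise variances q and r are illustrative assumptions.

    class Kalman1D:
        # Scalar Kalman filter, run independently per vehicle axis.
        def __init__(self, x0, p0, q, r):
            self.x, self.p = x0, p0   # state estimate and its variance
            self.q, self.r = q, r     # process and measurement noise variances

        def predict(self, u=0.0):
            # Process (update) step, e.g. u = acceleration integrated over dt.
            self.x += u
            self.p += self.q

        def update(self, z):
            # Measurement (corrective) step.
            k = self.p / (self.p + self.r)      # Kalman gain
            self.x += k * (z - self.x)
            self.p *= (1.0 - k)
            return self.x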
[0083] After computing the acceleration, if OBD 111 and GPS 131 are available, the speed and course are tracked, and latency is estimated in a 1 – 5 minute window in accordance with motion intensity. If the latency is acceptable for real-time application, then the front force is derived from the speed, and the side force is derived from course change and/or (speed) * (gyroscope angular rate around the roof axis). Speed updates are propagated back to correct the gravity vector, thereby improving accuracy and reducing the error estimate.
[0084] In an embodiment of the present invention, the linear acceleration of the vehicle is used to generate a driver score, to detect emergencies, to reconstruct accidents, and for dead reckoning. Device-to-car rotation is used for camera calibration when the camera is on the same device as the IMU sensors.
[0085] Reference is made to FIG. 10, which is a simplified flowchart of a method 1500 performed by sensor processor 150 for online fusing and calibrating sensor signal data in a moving vehicle, in accordance with an embodiment of the present invention. At operation 1510 sensor processor 150 receives signal data from sensors 110. At operation 1520 sensor processor 150 receives sensor data from server 200. At operation 1530 synchronizer 152 evaluates latencies of the received signal data as described hereinabove. At operation 1540 synchronizer 152 synchronizes the signal data as described hereinabove.
[0086] At operation 1550 error estimator 154 estimates signal data errors as described hereinabove. At operation 1560 sensor validator 156 validates the signal data as described hereinabove. At operation 1570 calibrator 158 calibrates the signal data as described hereinabove. At operation 1580 sensor processor 150 provides feedback to a driver of vehicle 100. If vehicle 100 is an autonomous vehicle, then sensor processor 150 provides feedback to an autonomous vehicle controller.

[0087] Reference is made to FIG. 11, which is a simplified flowchart of a method 1600 performed by vehicle data processor 250 for offline fusing and calibrating data received from moving vehicles 100, in accordance with an embodiment of the present invention. At operation 1610 vehicle data processor 250 receives data from vehicles 100. At operation 1620 vehicle data processor 250 evaluates latencies of the received vehicle data. At operation 1630 vehicle data processor 250 synchronizes the vehicle data.
[0088] At operation 1640 vehicle data processor 250 estimates vehicle data errors. At operation 1650 vehicle data processor 250 derives sensor-related and driver-related information, and stores the derived information in sensor database 210 and driver database 220, respectively. At operation 1660 vehicle data processor transmits the derived sensor-related information to vehicles 100.
[0089] In accordance with embodiments of the present invention, rules are provided for triggering smartphone 120 to log sensor data on server 200 for analysis. Such triggering enables collision analysis and system failure analysis. Conventional systems either do not store signal data around critical events, or else store it only locally. Triggering enables preservation of an abundance of critical data for careful analysis. The rules for triggering may be provided by server 200, device 120 or any of sensors 110 / 130.
Distributed Computing Architecture
[0090] More generally than the embodiment shown in FIG. 1, embodiments of the present invention distribute sensor data fusion and calibration among multiple compute nodes in vehicle 100.
[0091] Reference is made to FIG. 12, which is a simplified block diagram of a multi-processor system for fusing and calibrating sensor signal data in a moving vehicle, in accordance with an embodiment of the present invention. FIG. 12 shows multiple compute nodes 160 in vehicle 100. Compute nodes 160 may include, for example, smartphone 120, an infotainment system, a smart watch and other Internet of Things (IoT) and wearable devices within vehicle 100, which collaborate to fuse and calibrate signal data from sensors 110 and 130. Once compute nodes 160 are enabled, the system of FIG. 12 allocates computing tasks in an optimal way based on the available compute capabilities of the compute nodes 160. One of the compute nodes is designated as the primary node. Preferably the primary node is a device with high-availability and significant computing power, such as smartphone 120. Data transfer between nodes is minimized by preprocessing parts of computations at the nodes. For driving scenarios, it is important that the multiprocessor system of FIG. 12 continues to function if one or more of the sensors or computing nodes fails. To ensure such continuity, redundancies are used.
[0092] Server 200 performs some of the extensive calibration and fine-tuning computations, in order to relieve the local compute nodes 160 of this burden.
[0093] Reference is made to FIG. 13, which is a simplified flowchart of a multi-processor method for initiating distributed fusing and calibrating of sensor signal data in a moving vehicle, in accordance with an embodiment of the present invention. FIG. 13 shows multiple compute nodes 160 in vehicle 100. Some of the compute nodes 160, such as the smartphones, have embedded sensors. Some of the compute nodes, such as the smartphones and the car infotainment system, are coupled with sensors. Thus one or more of the smartphones are coupled with the OBD, and the infotainment system is coupled with a camera. The compute nodes 160 cooperate to fuse and calibrate the sensors in vehicle 100.
[0094] At operation 1710 compute nodes 160 initiate their negotiation with one another, and disclose their computing capability and Internet access. Sensing information is collected across all compute nodes 160. At operation 1720 compute nodes 160 select one of the compute nodes to be the primary node, and set Internet access roles. For the purpose of redundancy, another one of the compute nodes is selected to be the secondary node. At operation 1730 compute nodes 160 set computing and data flow roles per device in vehicle 100. At operation 1740 compute nodes 160 synchronize their internal clocks. At operation 1750 compute nodes 160 initiate fusion and calibration of the sensor data.
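By way of non-limiting illustration, the primary/secondary selection of operation 1720 may be sketched as follows; the capability fields are illustrative assumptions.

    def elect_nodes(nodes):
        # nodes: list of dicts of capabilities disclosed during negotiation,
        # e.g. {"name": "smartphone", "compute": 8.0, "internet": True}.
        # Prefer nodes with Internet access, then higher compute power.
        ranked = sorted(nodes, key=lambda n: (n["internet"], n["compute"]),
                        reverse=True)
        primary = ranked[0]
        secondary = ranked[1] if len(ranked) > 1 else None
        return primary, secondary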
[0095] Reference is made to FIG. 14, which is a simplified flowchart of a multi-processor method for allocating sensor data fusing and calibrating tasks among compute nodes in a moving vehicle, in accordance with an embodiment of the present invention. FIG. 14 shows that calculating sensor latency is allocated to compute nodes #1 and #4, calculating sensor accuracy and calibrating sensor data is allocated to compute node #2, and validating sensors and calculating sensor accuracy is allocated to compute node #3. In distinction, in the system shown in FIG. 1 these various tasks are all performed by components of sensor processor 150.
Machine Learning
[0096] When one or more compute nodes 160 of FIG. 12 have significant compute power, machine learning is introduced into the sensor data calibration and fusion algorithms. When complete process models for the vehicle motion are known a priori, synchronization, calibration and fusion are performed using Kalman filters, particle filters, and other such filters. Filtering may be based on maps stored on server computers for dead reckoning, and on pre-calibrations from factory defaults. When complete process models are not known a priori, synchronization, calibration and fusion are more complex and less predictable.
[0097] In an embodiment of the present invention, the behavior of a sensor is automatically classified by monitoring detailed relationships used for sensor fusion on the device side and on the server side.
[0098] An exemplary embodiment is a Kalman filter where the linear relationships between process and measurement variables are learned on-the-fly. Specifically, the Kalman filter is based on a process model of the form

x_k = F x_{k−1} + B u_k + w_k,

and a measurement model of the form

z_k = H x_k + v_k,
where F, B and H are known matrices, and w and v are white noise with known covariances. When these parameters are not known, a loss function may be used to self-converge and detect the parameters. Use of a loss function is feasible for devices such as compute nodes 160, since the number of variables is on the order of 1,000 or less. In contrast, deep learning algorithms use as many as 1,000,000 variables. Specifically, the number of variables used in embodiments of the present invention is given by cN^2, where N is the number of inputs and hidden variables, and c is determined from the set of levels in the Kalman filter that are trained. Typically, the set of levels is 3; namely, (i) process model, (ii) noise covariance matrix, and (iii) state transitions. In some cases, a 4th level, (iv) input control, is used.
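By way of non-limiting illustration, learning scalar process/measurement weights by minimizing a squared-error loss may be sketched as follows; the learning rate, epoch count, and scalar simplification are illustrative assumptions.

    import numpy as np

    def learn_process_params(xs, us, zs, lr=1e-3, epochs=200):
        # xs, us, zs: equal-length 1-D arrays of states x_k, controls u_k,
        # and measurements z_k. Learns scalar F, B, H by gradient descent
        # on the squared residuals of the process and measurement models.
        F, B, H = 1.0, 0.0, 1.0
        for _ in range(epochs):
            # Process-model residuals: x_k - (F x_{k-1} + B u_k)
            err = xs[1:] - (F * xs[:-1] + B * us[1:])
            F += lr * np.mean(err * xs[:-1])
            B += lr * np.mean(err * us[1:])
            # Measurement-model residuals: z_k - H x_k
            merr = zs - H * xs
            H += lr * np.mean(merr * xs)
        return F, B, H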
[0099] The following use case illustrates model parameters being learned on-the-fly. A driver is driving an electric car, and places his smartphone in a cradle in a specific orientation. Whenever the driver presses the accelerator pedal, the magnetic field jumps, in a linear relation to the vehicle acceleration. Reference is made to FIG. 15, which is a graph of magnetic field data over time, showing correlation between acceleration and magnetic field, illustrating a self-converging Kalman filter in accordance with an embodiment of the present invention. The data in FIG. 15 was generated using an iPhone in a Tesla Model S P100D electric car.
[00100] A conventional Kalman filter fails to include a weighting for the relation between magnetic field and vehicle acceleration, but instead includes a weighting between magnetic field and orientation. This is an incorrect relationship, and as such, the conventional Kalman filter produces large errors in this use case.
[00101] The self-converging Kalman filter in accordance with the present invention slowly adjusts the weights using a loss function based on GPS short-range dead reckoning. If the electric car has an OBD, then the car model is registered on server 200, and the pre-designated model from the server is used for the next driver with this type of car, without having to repeat the learning phase.
[00102] It will thus be appreciated by those skilled in the art that the subject invention enables sensor data fusion and calibration for sensors in a moving vehicle that have diverse behavior and characteristics, using a smartphone as a data bridge, or using a plurality of devices in the vehicle as distributed compute nodes.

[00103] In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made to the specific exemplary embodiments without departing from the broader spirit and scope of the invention. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.


CLAIMS

What is claimed is:
1. A sensor processing device located within a vehicle that is being driven, the processing device communicating with plural sensors, each sensor generating signal data, the processing device comprising:
a transceiver transmitting data derived by the processing device from the sensor signal data, to one or more servers, and receiving sensor-related information from the one or more servers;
a synchronizer evaluating latencies of the sensors;
an error estimator estimating accuracies of the sensor signal data;
a sensor validator determining if one or more of the sensors have failed; and
a calibrator transforming the sensor signal data to a common vehicle reference system.
2. The sensor processing device of claim 1 wherein the one or more servers comprise one or more devices within the vehicle or within the processing device.
3. The sensor processing device of claim 1 wherein the sensors are members of the group consisting of an accelerometer, a barometer, a beacon, a gyroscope, a magnetometer, a camera, Lidar, radar, ultrasonic radar, a microphone, a global positioning system, and on-board diagnostic sensors.
4. The sensor processing device of claim 1 wherein the processing device derives driver-related information from the transformed signal data, the driver-related information comprising vehicle position and orientation, autonomous vehicle feedback, driver feedback, or driver scores, and exposes the driver-related information to other devices.
5. The sensor processing device of claim 1 wherein the processing device exposes the synchronized and calibrated signal data through an application programming interface (API) or through a software development kit (SDK).
6. The sensor processing device of claim 1 wherein the processing device responds to a trigger event by logging sensor data on the one or more servers for analysis, wherein rules for trigger events are provided by the processing device or by the one or more servers.
7. The sensor processing device of claim 6 wherein the one or more servers conducts collision analysis or system failure analysis based on the logged sensor data.
8. A vehicle network data processor that receives times series data from transmitters in one or more vehicles that are being driven, the time series data based on plural sensors located in the one or more vehicles, the vehicle data processor deriving, from the received time series data, sensor- related information and driver-related information, the vehicle data processor comprising:
one or more cellular or Wi-Fi transceivers receiving time series data from the vehicles;
a synchronizer evaluating latencies of the time series data;
an error estimator for estimating accuracies of the time series data; and
one or more database managers storing the sensor-related information and the driver-related information derived by the processor in one or more databases,
wherein said one or more cellular or Wi-Fi transceivers transmit the sensor-related information in the databases to the vehicles.
9. The vehicle network data processor of claim 8 wherein the sensors are members of the group consisting of an accelerometer, a barometer, a gyroscope, a magnetometer, a camera, a microphone, a global positioning system, on-board diagnostic sensors, and a temperature sensor.
10. The vehicle network data processor of claim 8 wherein the driver-related information derived by said processor comprises vehicle-to-vehicle network information, advanced driver assistance system information, autonomous driving training information, map information and fleet driver scores.
11. The vehicle network data processor of claim 8 wherein the sensor-related information stored by said one or more database managers includes initial sensor calibration models obtained from at least some of the plural sensors, and wherein other sensors access the initial sensor calibration models from said one or more database managers for use as their calibration models.
12. The vehicle network data processor of claim 11 wherein the initial calibration models are obtained from local calibrations performed by the individual sensors and uploaded to said one or more database managers.
13. A non-transitory computer readable medium storing instructions, which, when executed by a processing device located in a vehicle that is being driven, cause the processing device to process signal data from sensors in the vehicle, comprising causing the processing device to:
receive signal data from the sensors;
receive sensor-related information from one or more remote servers;
evaluate latencies of the sensors so as to synchronize the sensor signal data;
estimate accuracies of the sensor signal data;
determine if one or more of the sensors have failed; and
transform the sensor signal data to a common vehicle reference system.
14. The computer readable medium of claim 13 wherein the processing device evaluates latencies of the sensor signal data based on domain matching, whereby the same physical quantity is derived from the sensor signal data in more than one way.
15. The computer readable medium of claim 13 wherein the processing device transforms the sensor signal data to the vehicle reference system by use of a rotation matrix that transforms an orthogonal set of device axes to an orthogonal set of vehicle axes, the device axes comprising two perpendicular axes in a plane of the device and a third axis normal to the plane, and the vehicle axes comprising a roof axis, a forward axis and a side axis.
16. The computer readable medium of claim 13 wherein the sensor-related information comprises calibration data for the sensors in the vehicle.
17. A non-transitory computer readable medium storing instructions, which, when executed by a vehicle network data processor, cause the data processor to receive time series data from transmitters in one or more vehicles that are being driven, the time series data being based on plural sensors located in the one or more vehicles, and to derive sensor-related information and driver- related information from the received time series data, comprising causing the data processor to:
receive time series data from the vehicles;
evaluate latencies of the sensors;
estimate accuracies of the time series data;
store the sensor-related information and the driver-related information derived by the processor in one or more databases; and
transmit the sensor-related information in the databases to the vehicles.
18. A vehicle sensor system, comprising a plurality of computer processing units within a vehicle that is being driven, the vehicle including plural sensors that generate signal data, the plurality of computer processing units jointly comprising circuitry for sensor calibration and fusion, the circuitry comprising:
one or more local area data connections receiving signal data from the sensors;
one or more transceivers transmitting data derived from the sensor signal data to one or more remote servers, and receiving sensor-related information from the one or more remote servers;
a synchronizer evaluating latencies of the sensors;
an error estimator for estimating accuracies of the received sensor signal data;
a sensor validator for determining if one or more of the sensors have failed; and
a calibrator for transforming the received sensor signal data to a common vehicle reference system.
19. The vehicle sensor system of claim 18 wherein said computer processing units are members of the group consisting of smartphones, Internet of things (IoT) devices, wearable devices, and a vehicle system.
20. The vehicle sensor system of claim 18 wherein the sensors are members of the group consisting of an accelerometer, a barometer, a gyroscope, a magnetometer, a camera, a microphone, a global positioning system, on-board diagnostic sensors and temperature sensors.
21. The vehicle sensor system of claim 18 wherein the plurality of computer processing units jointly derive driver-related information, the driver-related information comprising autonomous vehicle feedback, driver feedback, one or more driver scores, or vehicle orientation and positioning information.
22. A method for a plurality of computer processing units within a vehicle that is being driven, the vehicle comprising plural sensors that generate signal data, to jointly perform sensor data processing, the method comprising dynamically allocating among the computer processing units the following real-time tasks, the allocation being based on currently available bandwidth and computing power of each computer processing unit:
evaluate latencies of the sensors so as to synchronize the sensor signal data;
estimate accuracies of the sensor signal data;
determine if one or more of the sensors have failed; and
transform the sensor signal data to a common vehicle reference system.
23. The method of claim 22 wherein said dynamically allocating comprises dynamically selecting one of the computer processing units to be a master over the other computer processing units.
24. The method of claim 22 wherein system data is shared among the processing units, and when one of the processing units is removed, one or more others of the processing units perform the removed processing unit's allocated tasks.
25. The method of claim 22 wherein the task to evaluate latencies is performed by domain matching whereby the same physical quantity is derived from the sensor signal data in more than one way.
26. The method of claim 22 wherein the task to transform the sensor signal data to the vehicle reference system is performed using a rotation matrix that transforms an orthogonal set of sensor axes to an orthogonal set of vehicle axes, the sensor axes comprising two perpendicular axes in a plane of the sensor and a third axis normal to the plane, and the vehicle axes comprising a roof axis, a forward axis and a side axis.
PCT/IL2018/050615 2017-06-07 2018-06-06 Fusion and calibration of sensor signals in a moving vehicle WO2018225067A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762516459P 2017-06-07 2017-06-07
US62/516,459 2017-06-07

Publications (1)

Publication Number Publication Date
WO2018225067A1 true WO2018225067A1 (en) 2018-12-13

Family

ID=64566201

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2018/050615 WO2018225067A1 (en) 2017-06-07 2018-06-06 Fusion and calibration of sensor signals in a moving vehicle

Country Status (1)

Country Link
WO (1) WO2018225067A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110782693A (en) * 2019-08-12 2020-02-11 腾讯科技(深圳)有限公司 Positioning method, device and equipment
US20210407119A1 (en) * 2020-06-30 2021-12-30 Lyft, Inc. Validating a Map Reconstruction Using Sensor Data Constraints
DE102020118620A1 (en) 2020-07-15 2022-01-20 Bayerische Motoren Werke Aktiengesellschaft positioning for a vehicle
CN114216483A (en) * 2021-12-14 2022-03-22 北京云迹科技股份有限公司 Robot detection method and device
US11609558B2 (en) 2019-10-29 2023-03-21 Allstate Insurance Company Processing system for dynamic event verification and sensor selection
US11765067B1 (en) 2019-12-28 2023-09-19 Waymo Llc Methods and apparatus for monitoring a sensor validator

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030073406A1 (en) * 2001-10-17 2003-04-17 Benjamin Mitchell A. Multi-sensor fusion
US7102496B1 (en) * 2002-07-30 2006-09-05 Yazaki North America, Inc. Multi-sensor integration for a vehicle
WO2016178613A1 (en) * 2015-05-05 2016-11-10 Scania Cv Ab Device and method for managing communication for a vehicle

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030073406A1 (en) * 2001-10-17 2003-04-17 Benjamin Mitchell A. Multi-sensor fusion
US7102496B1 (en) * 2002-07-30 2006-09-05 Yazaki North America, Inc. Multi-sensor integration for a vehicle
WO2016178613A1 (en) * 2015-05-05 2016-11-10 Scania Cv Ab Device and method for managing communication for a vehicle

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110782693A (en) * 2019-08-12 2020-02-11 腾讯科技(深圳)有限公司 Positioning method, device and equipment
US11609558B2 (en) 2019-10-29 2023-03-21 Allstate Insurance Company Processing system for dynamic event verification and sensor selection
US11765067B1 (en) 2019-12-28 2023-09-19 Waymo Llc Methods and apparatus for monitoring a sensor validator
US20210407119A1 (en) * 2020-06-30 2021-12-30 Lyft, Inc. Validating a Map Reconstruction Using Sensor Data Constraints
US11978224B2 (en) * 2020-06-30 2024-05-07 Lyft, Inc. Validating a map reconstruction using sensor data constraints
DE102020118620A1 (en) 2020-07-15 2022-01-20 Bayerische Motoren Werke Aktiengesellschaft positioning for a vehicle
CN114216483A (en) * 2021-12-14 2022-03-22 北京云迹科技股份有限公司 Robot detection method and device

Similar Documents

Publication Publication Date Title
WO2018225067A1 (en) Fusion and calibration of sensor signals in a moving vehicle
CN108227735B (en) Method, computer readable medium and system for self-stabilization based on visual flight
WO2020253260A1 (en) Time synchronization processing method, electronic apparatus, and storage medium
CN111694351A (en) Method and system for executing a composite behavior strategy for an autonomous vehicle
US11366236B2 (en) Signals of opportunity aided inertial navigation
US20200156639A1 (en) Efficient Optimal Control With Dynamic Model For Autonomous Vehicle
US11875519B2 (en) Method and system for positioning using optical sensor and motion sensors
JP2020530569A (en) Vehicle sensor calibration and positioning
CN110207714A (en) A kind of method, onboard system and the vehicle of determining vehicle pose
US11946746B2 (en) Method for satellite-based detection of a vehicle location by means of a motion and location sensor
CN117685953A (en) UWB and vision fusion positioning method and system for multi-unmanned aerial vehicle co-positioning
CN111654593A (en) Motion sickness reduction for vehicle mounted displays
JP2022537361A (en) Relative position tracking using motion sensors with drift correction
CN106886037A (en) Suitable for the POS data method for correcting error of weak GNSS signal condition
Deilamsalehy et al. Fuzzy adaptive extended Kalman filter for robot 3D pose estimation
CN113075713B (en) Vehicle relative pose measurement method, system, equipment and storage medium
CN115930959A (en) Vision initialization method and device and hovercar
CN112595330B (en) Vehicle positioning method and device, electronic equipment and computer readable medium
CN115476881A (en) Vehicle trajectory tracking control method, device, equipment and medium
EP3120164A1 (en) Producing data describing target measurements
CN115556769A (en) Obstacle state quantity determination method and device, electronic device and medium
CN104792336B (en) A kind of state of flight measurement method and device
CN105874352B (en) The method and apparatus of the dislocation between equipment and ship are determined using radius of turn
Ma et al. Development of a vision-based guidance law for tracking a moving target
CN115583243B (en) Method for determining lane line information, vehicle control method, device and equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18812665

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18812665

Country of ref document: EP

Kind code of ref document: A1