Detailed Description
SUMMARY
Automotive technology is steadily improving autonomy and safety. One way to achieve this goal is to equip the vehicle with a number of different types of sensors, including optical cameras, radar systems, and LiDAR systems. The sensors collect low-level data that is processed by a sensor tracker to generate low-level tracks representing different aspects of the vehicle surroundings. Most of the raw data collected, which is relevant to autonomy and safety, represents different objects in the environment. These objects may be dynamic or static and may include pedestrians, animals, other vehicles, vegetation, buildings, road signs, and many other elements that may be present in the environment. Often, the environment is cluttered with a plurality of objects of various different types. An example of a cluttered environment may be a busy downtown street in a large city. The environment may include a number of pedestrians, vehicles, or other objects.
One of the main challenges for automotive systems with different sensors is to correlate or disassociate the tracks gathered from the different sensor modalities. For example, the automotive system attempts to determine whether a camera-derived track matches a radar-derived track.
Currently, available track-correlation processes focus mainly on the current states and attributes of the tracks to make the correlation decision. The historical trajectory of a track is typically not taken into account. Storing and processing historical information can be expensive in terms of storage and computational complexity. However, historical information may be helpful when an accurate correlation decision between two tracks is to be determined from a large number of different tracks originating from one or more different sensors, especially in cluttered environments.
As an example, at a particular point in time, there may be a radar track and a vision track that are close to each other, causing them to appear to match well. However, if the historical trajectories of the two different tracks are analyzed, it may be found that the two tracks come from different locations and thus refer to different objects rather than a common object. In this case, matching the two different tracks may not be a good decision.
This document describes methods and systems for history-based identification of incompatible tracks. The historical trajectory of a track facilitates accurately determining whether tracks originating from different sensors identify the same object or different objects. However, recording historical data for multiple tracks may consume a large amount of storage or other computing resources, and the associated computations may become complex. The methods and systems described herein enable a sensor fusion system for an automobile or other vehicle to take historical data into account when correlating and pairing tracks without requiring extensive storage and without tying up other computing resources.
Example Environment
FIG. 1 illustrates an example environment 100 in which a system 102 is configured to perform history-based identification of incompatible tracks in accordance with the techniques of this disclosure. In the depicted environment 100, the system 102 is an automobile. Sometimes referred to as a vehicle 102, the system 102 may represent any type of device or machine, including manned systems and unmanned systems, which may be used for various purposes. Some non-exhaustive and non-limiting examples of the vehicle 102 include a motorcycle, bus, tractor, semi-trailer, watercraft, aircraft, or other equipment or machine.
The vehicle 102 may be equipped with an object fusion system 104. The object fusion system 104 may include one or more sensor interfaces 106 and a track matching module 108, the sensor interfaces 106 including a camera interface 106-1 and a radar interface 106-2. With the object fusion system 104, the vehicle 102 has an instrument field of view (FOV) 110 that may encompass one or more vehicles 112, including a vehicle 112-1 and a vehicle 112-2. The object fusion system 104 may capture the FOV 110 from any exterior surface of the vehicle 102. Positioning the camera and radar components interfacing with the object fusion system 104 in a particular manner may give the object fusion system 104 a particular FOV. For example, positioning one or more of the cameras with the radar may ensure that the FOV of the object fusion system 104 includes an area above, near, or on a road that the vehicle 102 may be traveling on. At least a portion of the object fusion system 104 may be integrated into a side view mirror, bumper, roof, or any other portion of the vehicle 102.
Although not explicitly shown in FIG. 1, the track matching module 108 executes on a processor or other hardware. During execution, the track matching module 108 may track objects based on sensor data obtained at the camera interface 106-1 and the radar interface 106-2. The camera interface 106-1 receives camera data from one or more cameras of the vehicle 102, and the radar interface 106-2 receives radar data from at least one radar of the vehicle 102. The track matching module 108 of the object fusion system 104 accesses the camera interface 106-1 and the radar interface 106-2 to acquire camera data and radar data, respectively. As will be appreciated from other portions of this description, the object fusion system 104 may include additional sensor interfaces 106 (e.g., a LiDAR interface) beyond those shown in FIG. 1.
The track matching module 108 configures the object fusion system 104 to associate or disassociate different types of tracks obtained from the sensor interfaces 106. The track matching module 108 identifies a plurality of tracks from first sensor data (e.g., obtained from the camera interface 106-1) and determines a plurality of tracks from second sensor data (e.g., obtained from the radar interface 106-2). When pairing or associating two tracks, the track matching module 108 evaluates a track from the first sensor data against a track from the second sensor data to determine whether the pair of tracks (e.g., an object pairing) represents a good candidate association. For example, if the camera interface 106-1 receives a first track related to the vehicle 112-1 and the radar interface 106-2 receives a second track related to the vehicle 112-2, the track matching module 108 pairs the first track and the second track, using the techniques described herein along with any other applicable techniques, to determine whether the first track and the second track match so as to represent a common vehicle 112.
To correlate or disassociate the different types of tracks collected through the sensor interfaces 106, the track matching module 108 performs a track matching process that pairs tracks present in the camera data with tracks present in the radar data. When pairing large sets of multiple tracks, the track matching module 108 may generate a feasibility matrix and an incompatibility matrix. The object fusion system 104 assigns an identifier to each track of an object candidate detected using the camera, and the object fusion system 104 assigns an identifier to each track (e.g., detection) obtained using the radar.
One example of a feasibility matrix is an object-track-pair matrix. Each element value in the feasibility matrix indicates a probability that the two different object tracks of an object-track pair are associated with the same object. The feasibility matrix may be created by one or more of several different methods and is the primary means for object-track fusion.
In this example, the incompatibility matrix supplements the feasibility matrix by tracking historical data. The object fusion system may adjust the probabilities indicated in the feasibility matrix based on the historical data.
The incompatibility matrix is created in two dimensions, although it is contemplated that the incompatibility matrix can comprise more than two dimensions when additional sensors are present. The first dimension defines the columns, each column being assigned a camera identifier from the camera data. The second dimension defines the rows, each row being assigned a radar identifier from the radar data, and so on. For each camera-radar pairing represented in the incompatibility matrix, a parameter (e.g., a binary value) is assigned based on a difference in state between the track derived from the camera data and the track derived from the radar data; the difference is compared to a threshold parameter (e.g., a state threshold) for that state and evaluated over a threshold number of consecutive frames. The parameter or binary value indicates whether the track from the camera data and the track from the radar data represent different objects.
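The two-dimensional layout described above can be sketched as follows; the Python rendering, the identifier sets, and the dict-of-dicts structure are illustrative assumptions rather than part of the disclosure:

```python
# Illustrative sketch (not the disclosed implementation): columns are keyed by
# camera track identifiers, rows by radar track identifiers, and each cell
# holds a binary incompatibility flag (0 = compatible so far, 1 =
# historically incompatible).

def build_incompatibility_matrix(camera_ids, radar_ids):
    """Return an all-zero (all-compatible) matrix as a dict of dicts."""
    return {rad: {cam: 0 for cam in camera_ids} for rad in radar_ids}

matrix = build_incompatibility_matrix(["A", "B", "C", "D"], [1, 2, 3, 4])
matrix[3]["A"] = 1  # camera track A and radar track 3 deemed incompatible
```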
Example architecture
FIG. 2 illustrates an example automotive system 200 configured to perform history-based identification of incompatible tracks in accordance with the techniques of this disclosure. The automotive system 200 may be integrated into the vehicle 102 shown in FIG. 1 and is described in that context. For example, the automotive system 200 includes a controller 202 and an object fusion system 104-1, the object fusion system 104-1 being an example of the object fusion system 104. The object fusion system 104-1 and the controller 202 communicate over a link 204. The link 204 may be a wired link or a wireless link, and in some cases the link 204 includes a communication bus. The controller 202 performs operations based on information received over the link 204, such as an indication of compatibility output from the object fusion system 104-1 when objects in the FOV are identified by processing and merging tracks based at least in part on the incompatibility matrix.
The controller 202 includes a processor 206 and a computer-readable storage medium (CRM) 208 (e.g., memory, long-term storage, short-term storage), the CRM 208 storing instructions for an automotive module 210. In addition to the radar interface 106-2, the object fusion system 104-1 includes the camera interface 106-1. Any number of other sensor interfaces 106 may be used, including a LiDAR interface or another sensor interface 106-n. The object fusion system 104-1 can include processing hardware including a processor 212 (e.g., a hardware processor, a processing unit) and a computer-readable storage medium (CRM) 214, the CRM 214 storing instructions associated with a track matching module 108-1. The track matching module 108-1 is an example of the track matching module 108 and includes an incompatibility matrix 216, stored as a bit array 218, and a feasibility matrix 220.
The processors 206 and 212 may be two separate processing units or a single processing unit (e.g., a microprocessor), or a pair of systems-on-chip or a single system-on-chip of a computing device, controller, or control unit. The processors 206 and 212 execute computer-executable instructions stored within the CRMs 208 and 214. As an example, the processor 206 may execute the automotive module 210 to perform a driving function (e.g., an autonomous lane-change maneuver, a semi-autonomous lane-keeping feature) or another operation of the automotive system 200. Similarly, the processor 212 may execute the track matching module 108-1 to infer objects in the FOV based on sensor data obtained from the plurality of sensor interfaces 106 of the system 102. The automotive module 210, when executed at the processor 206, may receive an indication of one or more objects detected by the track matching module 108-1 in response to the track matching module 108-1 combining and analyzing sensor data generated at each of the sensor interfaces 106.
In general, the automotive system 200 executes the automotive module 210 to perform automotive functions, which may include using outputs from the object fusion system 104-1. For example, the automotive module 210 may provide automatic cruise control and monitor the object fusion system 104-1 for an output indicating the presence of an object in or near the FOV 110, e.g., to reduce speed and prevent a rear-end collision with the vehicle 112. In such an example, the track matching module 108-1 provides the sensor data or a derivative thereof (e.g., the incompatibility matrix) as an output to the automotive module 210. The automotive module 210 may provide an alert or perform a particular maneuver when the data obtained from the track matching module 108-1 indicates that one or more objects are crossing in front of the vehicle 102.
FIG. 3 illustrates an example scenario 300 in accordance with the techniques of this disclosure, in which tracks 302 and 304 are received by a vehicle 306 configured to perform history-based identification of incompatible tracks. In FIG. 3, the track 302 may be a first track identified by one sensor interface 308 (e.g., the camera interface 106-1 in FIG. 1) on the vehicle 306, and the track 304 may be a second track identified by a different sensor interface 310 (e.g., the radar interface 106-2 in FIG. 1) on the vehicle 306. Different states may be extracted from the track data; for the track 302, these include distance 312 (r-312), azimuth 314 (θ-314), longitudinal velocity 320 (vlong-320), and lateral velocity 324 (vlat-324), among other conceivable states. For the track 304, distance 316 (r-316), azimuth 318 (θ-318), longitudinal velocity 322 (vlong-322), and lateral velocity 326 (vlat-326) may be extracted. Other track states not shown in FIG. 3 may also be used for history-based identification of incompatible tracks. To identify whether the tracks 302 and 304 are compatible or incompatible, the difference between the corresponding states of the two tracks within a common time frame is calculated. For example, the difference between r-312 and r-316 is calculated, and the difference between θ-314 and θ-318 is calculated. Likewise, the difference between vlong-320 and vlong-322 is calculated, and the difference between vlat-324 and vlat-326 is calculated. Each difference is compared to a threshold for that particular state. If the difference is greater than the threshold for that particular state, the track pair may be incompatible.
The incompatibility may be true even if only one state difference is greater than its corresponding state threshold while the other differences are less than their corresponding state thresholds. For example, the incompatibility of the track pair (302, 304) may be calculated as:
incompatibility = |r-312 − r-316| > distance_threshold, or
|θ-314 − θ-318| > azimuth_threshold, or
|vlong-320 − vlong-322| > longitudinal_velocity_threshold, or
|vlat-324 − vlat-326| > lateral_velocity_threshold.
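The disjunction above may be sketched in code as follows; the TrackState record, its field names, and the threshold values are illustrative assumptions, not from the disclosure:

```python
# Illustrative sketch of the per-frame incompatibility test. The state record
# and threshold values are assumptions; only the OR-of-differences logic
# follows the formula above.
from dataclasses import dataclass

@dataclass
class TrackState:
    r: float        # distance to the tracked object (e.g., meters)
    theta: float    # azimuth angle (e.g., radians)
    v_long: float   # longitudinal velocity
    v_lat: float    # lateral velocity

# Assumed per-state thresholds (tuning values, purely illustrative).
THRESHOLDS = {"r": 2.0, "theta": 0.05, "v_long": 1.5, "v_lat": 1.0}

def incompatible(a: TrackState, b: TrackState) -> bool:
    """True if ANY single state difference exceeds its state threshold."""
    return (abs(a.r - b.r) > THRESHOLDS["r"]
            or abs(a.theta - b.theta) > THRESHOLDS["theta"]
            or abs(a.v_long - b.v_long) > THRESHOLDS["v_long"]
            or abs(a.v_lat - b.v_lat) > THRESHOLDS["v_lat"])
```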
If an incompatibility between a track pair is calculated within a frame, a frame counter is activated so that the incompatibility is evaluated over a threshold number of frames before an incompatibility decision is made. Historically (e.g., over the threshold number of frames), the track pair may be considered to track different objects if the threshold number of frames is reached and the incompatibility of the track pair still exists (e.g., at least one analyzed state difference exceeds its state threshold in each frame for the duration of the threshold number of frames). In this example, incompatibility of a track pair may be assigned a binary value of 1, and compatibility of a track pair may be assigned a binary value of 0. In other aspects, the assignment of binary values may be reversed, with 1 corresponding to compatible and 0 corresponding to incompatible. If multiple track pairs are historically tracked in this manner, the incompatibility data may be arranged as an incompatibility matrix.
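One possible rendering of the frame-counter logic is sketched below. It is a simplification under stated assumptions: the frame threshold value is arbitrary, and the flag is cleared on the first compatible frame, whereas the disclosure also contemplates requiring a threshold number of feasible frames before clearing.

```python
# Simplified sketch (assumptions noted in the lead-in): a pair is declared
# historically incompatible only after the per-frame incompatibility test
# holds for FRAME_THRESHOLD consecutive frames.
FRAME_THRESHOLD = 5  # consecutive frames required (assumed tuning value)

class PairHistory:
    def __init__(self):
        self.count = 0             # consecutive incompatible frames so far
        self.incompatible_bit = 0  # 1 = historically incompatible

    def update(self, incompatible_this_frame: bool) -> int:
        """Feed one frame's result; return the current history bit."""
        if incompatible_this_frame:
            self.count += 1
            if self.count >= FRAME_THRESHOLD:
                self.incompatible_bit = 1
        else:
            self.count = 0          # streak broken; restart the counter
            self.incompatible_bit = 0
        return self.incompatible_bit
```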
FIG. 4 illustrates an example incompatibility matrix 400 used in history-based identification of incompatible tracks in accordance with the techniques of this disclosure. The columns in the incompatibility matrix 400 represent tracks from a first sensor interface (e.g., the track 302), and the rows in the incompatibility matrix 400 represent tracks from a different sensor interface (e.g., the track 304). In this example, if the track 302 corresponds to column A in the incompatibility matrix 400, and if the track 304 corresponds to row 3 in the incompatibility matrix 400, then the track pair (302, 304) has an incompatibility assignment of 1 (e.g., element A-3 in the incompatibility matrix 400). If compatibility becomes feasible within a threshold number of frames, the incompatibility assignment becomes 0. FIG. 4 further illustrates that the incompatibility matrix can be arranged and stored as a bit array 402. In this example, sixteen track pairs are identified and stored as a sixteen-element (e.g., sixteen-bit) array. In other examples, a more complex encoding scheme may be used (e.g., one using fewer bits than one per track pair in the array). The array is shown such that the first bit in the array corresponds to track pair A-1, the second bit corresponds to A-2, and so on. However, the array can be arranged differently, such as row by row (e.g., A-1, B-1, …, C-4, D-4), in reverse (e.g., D-4, D-3, …, A-2, A-1), or in any other arrangement of bits representing the elements of the incompatibility matrix 400.
By storing the incompatibility matrix 400 as a bit array, with each bit representing the incompatibility of a different track pair, a large amount of information about the historical relationships of different tracks can be stored efficiently and inexpensively using a small amount of computing resources.
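One possible packing of a four-by-four incompatibility matrix into a sixteen-bit array, in the column-major order described for FIG. 4 (A-1 in the first bit, A-2 in the second, and so on), might look like the following sketch; the helper names and identifier sets are assumptions:

```python
# Illustrative packing of the incompatibility matrix into one integer used as
# a bit array. Layout assumption: column-major, A-1 -> bit 0, A-2 -> bit 1, ...
CAMERA_IDS = ["A", "B", "C", "D"]  # columns: camera track identifiers
RADAR_IDS = [1, 2, 3, 4]           # rows: radar track identifiers

def bit_index(cam, rad):
    """Column-major bit position of the (camera, radar) pair flag."""
    return CAMERA_IDS.index(cam) * len(RADAR_IDS) + RADAR_IDS.index(rad)

def pack(matrix):
    """Pack {radar_id: {camera_id: 0/1}} into a single 16-bit integer."""
    bits = 0
    for rad in RADAR_IDS:
        for cam in CAMERA_IDS:
            if matrix[rad][cam]:
                bits |= 1 << bit_index(cam, rad)
    return bits

def get_bit(bits, cam, rad):
    """Read one pair flag back out of the packed bit array."""
    return (bits >> bit_index(cam, rad)) & 1

matrix = {rad: {cam: 0 for cam in CAMERA_IDS} for rad in RADAR_IDS}
matrix[3]["A"] = 1  # element A-3 set, as in the example above
bits = pack(matrix)
```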
Example scenarios
FIG. 5-1 illustrates an example implementation 500-1 of history-based identification of incompatible tracks in accordance with the techniques of this disclosure. In implementation 500-1, a vehicle 502 is equipped to perform history-based identification of incompatible tracks. At least one object 504-1 is in the FOV of at least two sensors mounted on the vehicle 502. One sensor interface of the vehicle 502 receives the track 504-2 and, within the same frame, a second sensor interface of the vehicle 502 receives the track 504-3. The difference between at least one track state (e.g., azimuth angle) of the track 504-2 and the corresponding state of the track 504-3 is greater than the state-threshold error set for that track state. In this case, the difference in the track states indicates incompatibility of the two tracks 504-2 and 504-3, even though the other track states indicate compatibility. If the difference persists for the consecutive number of frames set by the frame counter, the two tracks 504-2 and 504-3 may be considered historically incompatible. However, if the difference in the track states of the two tracks 504-2 and 504-3 falls within a feasibility threshold range within some frame, i.e., the state error is below the feasibility threshold set for determining the feasibility of the two tracks representing the same object at any given frame index, another frame counter begins counting the frames for which the two tracks remain feasible. Once feasibility is maintained for a threshold number of frames, the tracks become compatible with each other, and the tracks 504-2 and 504-3 may be considered likely to be tracking the same object 504-1. The feasible track pair may then be further considered along with other data (e.g., object class) to further determine whether the track pair (504-2, 504-3) represents the object 504-1.
In implementation 500-1, the state whose difference exceeds the compatibility state threshold is the azimuth; however, any one or more of the track-state differences between the tracks 504-2 and 504-3 may be greater than the compatibility state threshold for that particular state, indicating incompatibility. Once all state differences remain below the feasibility threshold (which is distinct from the compatibility state threshold) for a threshold number of frames, the tracks 504-2 and 504-3 may be deemed to represent a common object 504-1 based on the historical relationship of the tracks 504-2 and 504-3.
FIG. 5-2 illustrates another example implementation 500-2 of history-based identification of incompatible tracks in accordance with the techniques of this disclosure. In implementation 500-2, the state differences for the tracks 504-2 and 504-3 are all less than the corresponding state thresholds for consistent tracking at frame T-1. However, at frame T-2, which may or may not immediately follow frame T-1, one or more of the state differences of the tracks 504-2 and 504-3 are determined to be greater than the corresponding state thresholds. In some aspects, a counter may begin determining whether the track pair (504-2, 504-3) remains incompatible over a threshold number of frames before a decision is made that the track 504-2 may be tracking a different object than the track 504-3. In other aspects, at frame T-2, the track pair (504-2, 504-3) may be immediately determined to be incompatible and not decided to be a compatible pair until the state differences remain less than the feasibility threshold for a threshold number of frames.
Example method
FIG. 6-1 illustrates an example method 600 of history-based identification of incompatible tracks in accordance with the techniques of this disclosure. The method 600 is illustrated as a set of operations (or acts) that are performed in, but not necessarily limited to, the order or combination of operations shown or described. In addition, any of the operations may be repeated, combined, or reorganized to provide further methods. In portions of the following discussion, reference may be made to the preceding figures in describing some non-limiting examples of the method 600.
At 602, a first track of a first object proximate a vehicle (e.g., the vehicle 102) is determined by an object fusion system of the vehicle from first sensor data acquired by a first sensor. The first track may be associated with at least a portion of a stationary object or a portion of a moving object, such as the vehicle 112.
At 604, a second track of a second object proximate the vehicle is determined by the object fusion system of the vehicle from second sensor data acquired by a second sensor. In some examples, the first sensor may include one or more optical cameras or near-infrared cameras, and the second sensor may include one or more radar sensors, LiDAR sensors, or ultrasonic sensors.
At 606, a feasibility matrix and an incompatibility matrix are maintained by the object fusion system. Each element value in the feasibility matrix indicates a probability that the two object tracks of an object-track pair are associated with the same object at a given time/frame index. The incompatibility matrix indicates whether the first track and the second track are historically associated with different objects. The track matching module 108-1 pairs the first track with the second track, and the track pair is represented as an element of the incompatibility matrix. Data from the first track is compared to data from the second track, and the value of the element is set based on the result of the comparison. Based on data derived from the first track and the second track, the value of the element representing the track pair indicates whether the track pair is historically incompatible or compatible. In some aspects, the value may be a binary value indicating whether the track pair may be incompatible. The values of the track pairs are updated periodically. The object fusion system can adjust the probabilities indicated in the feasibility matrix based on the element values of the corresponding track pairs in the incompatibility matrix.
At 608, it is determined whether the incompatibility matrix indicates that the first track and the second track are historically associated with different objects. The track matching module 108-1 compares the first track and the second track over consecutive frames for as long as the track pairing exists. That is, as long as the first track and the second track exist, the track pair may be updated in every frame in which data arrives, based on a comparison of the track data. If the first track and the second track are associated with the same object for a threshold number of consecutive frames, the binary value of the element representing the object pair is changed.
At 610, in response to determining that the incompatibility matrix indicates that the first track and the second track are associated with different objects for at least a threshold number of consecutive frames, it is determined that the first track and the second track comprise histories associated with different objects. The binary value of the element representing the track pair may represent a historical relationship between the first track and the second track. For example, if the first track and the second track appear incompatible (e.g., the two tracks differ in position by more than the state threshold for compatible tracks), the binary value of the element representing the track pair may be set to "1". If at some point the track pair becomes a feasible pair, a counter counts the number of frames for which the track pair remains feasible. If the counter reaches a threshold number of consecutive frames, the comparison indicates a probability that the tracks are associated with the same object, a historical relationship between the two tracks is established, and the binary value of the element may change to "0" to indicate the nature of the relationship.
At 612, in response to determining that the first track and the second track include histories associated with different objects, the feasibility matrix is adjusted by reducing the probability that the first object and the second object are the same object at the current time index. Knowing the history of the relationship between the first track and the second track may help the feasibility process accurately make decisions about the association of track pairs with common objects.
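A minimal sketch of this adjustment, assuming the feasibility matrix is held as a mapping from track pairs to probabilities and that an incompatible pair's probability is fully suppressed (the disclosure only requires that it be reduced):

```python
# Hedged sketch of step 612: reduce the feasibility (association) probability
# of any pair the incompatibility history flags. Scaling to zero is an
# assumed choice; any reduction would satisfy the description above.
PENALTY_FACTOR = 0.0  # assumed: fully suppress historically incompatible pairs

def adjust_feasibility(feasibility, incompatibility):
    """feasibility:     {(camera_id, radar_id): probability}
    incompatibility: {(camera_id, radar_id): 0/1 history flag}"""
    return {pair: (p * PENALTY_FACTOR if incompatibility.get(pair, 0) else p)
            for pair, p in feasibility.items()}

adjusted = adjust_feasibility({("A", 1): 0.9, ("A", 3): 0.8}, {("A", 3): 1})
```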
At 614, in response to adjusting the feasibility matrix by reducing the probability that the first object and the second object are the same object, information identifying the first object as separate from the second object is output to an automated system of the vehicle to avoid a collision between the vehicle and the first object or the second object. The binary values of the elements representing the object pairs may be readily transmitted, read, or otherwise communicated to the automotive system. Optionally, the information contained in the incompatibility matrix may be further used by the object fusion system, in addition to the data collected and derived by the one or more first and second sensors, to determine the association of object pairs. The object fusion system may output this information about the object pairing to the automotive system.
Additional examples
In the following section, additional examples of history-based identification of incompatible tracks are provided.
Example 1. A method, comprising: determining, by an object fusion system of a vehicle, a first track of a first object proximate the vehicle from first sensor data acquired by a first sensor; determining, by the object fusion system of the vehicle, a second track of a second object proximate the vehicle from second sensor data acquired by a second sensor; maintaining, by the object fusion system, a feasibility matrix indicating a probability that the first object and the second object are the same object at a given time index and an incompatibility matrix indicating whether the first track and the second track are historically associated with different objects; determining whether the incompatibility matrix indicates that the first track and the second track are associated with different objects; in response to determining that the incompatibility matrix indicates that the first track and the second track are associated with different objects, determining that the first track and the second track comprise histories related to different objects; in response to determining that the first track and the second track comprise histories related to different objects, adjusting the feasibility matrix by reducing the probability that the first object and the second object are the same object; and, responsive to adjusting the feasibility matrix by reducing the probability that the first object and the second object are the same object, outputting, to an automated system of the vehicle, information identifying the first object as separate from the second object to avoid a collision between the vehicle and the first object or the second object.
Example 2. The method of any of the preceding examples, further comprising: in response to determining that the incompatibility matrix indicates, for at least a threshold number of consecutive frames, that the first track and the second track are associated with the same object, determining that the first track and the second track comprise histories related to the same object; in response to determining that the first track and the second track comprise histories related to the same object, adjusting the feasibility matrix by increasing the probability that the first object and the second object are the same object; and, responsive to adjusting the feasibility matrix by increasing the probability that the first object and the second object are the same object, outputting, to the automated system, information identifying the same object to avoid a collision between the vehicle and the same object.
Example 3. The method of any of the preceding examples, wherein determining whether the incompatibility matrix indicates that the first track and the second track are associated with the same object for at least a threshold number of consecutive frames further comprises: incrementing a count associated with consecutive frames, the count being a temporary indication of whether the tracks are of the same object; and setting the threshold number of consecutive frames to a particular value based on a measurable accuracy of the first sensor and a measurable accuracy of the second sensor.
Example 4. The method of any of the preceding examples, further comprising: in response to incrementing the count associated with consecutive frames, resetting the count if the tracks are determined to include histories related to two different objects.
Example 5. The method of any of the preceding examples, further comprising: setting the threshold number of consecutive frames to a particular value further based on an age of one or more of the first track and the second track.
Example 6. The method of any of the preceding examples, wherein maintaining the incompatibility matrix comprises: determining a difference between a vehicle state derived from the first track and a same vehicle state derived from the second track; comparing the difference to a state threshold; and, in response to comparing the difference to the state threshold, assigning a binary value to an element in the incompatibility matrix, the element being associated with the first track and the second track.
Example 7. The method of any of the preceding examples, further comprising: storing the binary values as a corresponding bit array associated with the first track and the second track.
Example 8. The method of any of the preceding examples, wherein the vehicle state comprises one or more of a position, a speed, an azimuth, or a distance.
Example 9. The method of any of the preceding examples, wherein the first sensor comprises a vision sensor, a radar sensor, or a light detection and ranging sensor.
Example 10. The method of any of the preceding examples, wherein the second sensor comprises a vision sensor, a radar sensor, or a light detection and ranging sensor.
Example 11. A system, comprising: one or more processors configured to perform the method of any of the preceding examples.
Example 12. A system comprising means for performing the method of any of the preceding examples.
Example 13. A computer-readable storage medium comprising instructions that, when executed, configure one or more processors to perform the method of any of the preceding examples.
Conclusion
While various embodiments of the present disclosure have been described in the foregoing description and illustrated in the accompanying drawings, it is to be understood that the disclosure is not limited thereto but may be practiced in various ways within the scope of the following claims. From the foregoing description, it will be apparent that various modifications may be made without departing from the scope of the disclosure as defined by the appended claims. Problems associated with incompatible tracks may occur in other systems that identify and process tracks from various sensors. Thus, although described as one way of improving vehicle-based matching techniques, the above-described techniques may be applied to other problems to efficiently and inexpensively match objects based on the historical relationships of multiple tracks.
The use of "or" and grammatically related terms indicates non-exclusive alternatives without limitation, unless the context clearly dictates otherwise. As used herein, a phrase referring to "at least one of" a list of items refers to any combination of those items, including a single member. For example, "at least one of a, b, or c" is intended to encompass: a, b, c, a-b, a-c, b-c, and a-b-c, as well as any combination having a plurality of the same elements (e.g., a-a, a-a-a, a-a-b, a-a-c, a-b-b, a-c-c, b-b, b-b-b, b-b-c, c-c, and c-c-c, or any other ordering of a, b, and c).