CN115146694A - Identification of history-based incompatibility tracking - Google Patents


Info

Publication number
CN115146694A
CN115146694A
Authority
CN
China
Prior art keywords
tracking
track
matrix
vehicle
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210267195.0A
Other languages
Chinese (zh)
Inventor
S·A·伊姆兰
J·K·希夫曼
曹年霞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anbofu Technology Co ltd
Original Assignee
Delphi Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Delphi Technologies Inc
Publication of CN115146694A

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66Radar-tracking systems; Analogous systems
    • G01S13/72Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
    • G01S13/726Multiple target tracking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865Combination of radar systems with lidar systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/66Tracking systems using electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/251Fusion techniques of input or preprocessed data
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/166Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes

Abstract

This document describes methods and systems for identification of history-based incompatibility tracking. A track's history helps determine accurately whether tracks originating from different sensors identify the same object or different objects. However, recording historical data for multiple tracks may consume a large amount of storage or computing resources, and the associated computations may become complex. The methods and systems described herein enable a sensor fusion system of an automobile or other vehicle to take historical data into account when correlating and pairing tracks, without requiring extensive storage or tying up additional computing resources.

Description

Identification of history-based incompatibility tracking
Background
Automobiles are becoming more complex with the addition of sensors for tracking objects near the vehicle. These objects may include other vehicles, pedestrians, animals, and inanimate objects such as trees and road signs. Sensors (e.g., optical cameras, radar, light detection and ranging (LiDAR)) collect low-level data that a sensor tracker processes to generate low-level tracks. Taken together, the sensors may produce a large number of tracks, especially in cluttered environments. As tracks are generated, steps may be taken to pair and associate tracks that originate from different sensors with a common object. Because the tracks admit many possible associations or pairings, correctly matching different tracks to a common object is a difficult task. Improving the accuracy and speed of track association and pairing can improve the safety and reliability of the automobile, especially under autonomous and semi-autonomous control.
Disclosure of Invention
This document describes methods and systems related to the identification of history-based incompatibility tracking. A track's history helps determine accurately whether tracks originating from different sensors identify the same object or different objects. However, keeping a record of historical data for multiple tracks may consume a large amount of storage or other computing resources, and the associated computations may become complex. The systems and methods described herein provide a way to consider historical data when correlating tracks from different sensors while minimizing storage consumption and reducing computational complexity.
In one example, a method includes determining, by an object fusion system of a vehicle and from first sensor data acquired by a first sensor, a first track of a first object proximate to the vehicle. The method further includes determining, by the object fusion system and from second sensor data acquired by a second sensor, a second track of a second object proximate to the vehicle. The method further includes maintaining, by the object fusion system, a feasibility matrix indicating a probability that the first object and the second object are the same object at a given time index, and an incompatibility matrix indicating whether the first track and the second track have historically been associated with the same object. The method further includes determining whether the incompatibility matrix indicates that the first track and the second track are associated with different objects. The method further includes, in response to determining that the incompatibility matrix indicates that the first track and the second track are associated with different objects, determining that the first track and the second track include histories related to different objects. The method further includes, in response to determining that the first and second tracks include histories related to different objects, adjusting the feasibility matrix by reducing the probability that the first object and the second object are the same object. The method further includes, in response to adjusting the feasibility matrix, outputting, to an automated system of the vehicle, information identifying the first object as separate from the second object to avoid a collision between the vehicle and the first object or the second object.
In one example, a system includes a processor configured to perform this and other methods. In another example, a system is described that includes means for performing the method and other methods. In addition to describing a system configured to perform the methods outlined above and other methods set forth herein, this document also describes a computer-readable storage medium comprising instructions that, when executed, configure a processor to perform the methods outlined above and other methods set forth herein.
This summary introduces simplified concepts for the identification of history-based incompatibility tracking. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining the scope of the claimed subject matter. That is, one problem addressed by the described techniques is that using historical data as a tool for correlating tracks originating from different sensors is expensive in terms of storage consumption and computational complexity. Thus, although described primarily in the context of improving the object fusion functionality of an automobile, reducing the storage and computational cost of historical data may also benefit other applications requiring accurate object fusion. Moreover, these concepts may be implemented in the reverse manner; that is, the techniques described herein may also be applied to history-based identification of compatible tracks.
Brief description of the drawings
Details of one or more aspects of the identification of history-based incompatibility tracking are described in this document with reference to the following figures. The same numbers are generally used throughout the drawings to reference like features and components:
FIG. 1 illustrates an example environment in which a system is configured to perform identification of history-based incompatibility tracking in accordance with the techniques of this disclosure.
FIG. 2 illustrates an example automobile system configured for performing identification of history-based incompatibility tracking in accordance with the techniques of this disclosure.
FIG. 3 illustrates an example scenario in which tracking is received by a vehicle configured to perform identification of history-based incompatibility tracking, in accordance with the techniques of this disclosure.
FIG. 4 illustrates an example incompatibility matrix used in performing the identification of history-based incompatibility tracking in accordance with the techniques of this disclosure.
FIG. 5-1 illustrates an example implementation of history-based identification of incompatibility tracking in accordance with the techniques of this disclosure.
FIG. 5-2 illustrates another example implementation of history-based identification of incompatibility tracking in accordance with the techniques of this disclosure; and
FIG. 6 illustrates an example method of history-based identification of incompatibility tracking in accordance with the techniques of this disclosure.
Detailed Description
SUMMARY
Automotive technology continues to improve autonomy and safety. One way to achieve this goal is to equip the vehicle with a number of different types of sensors, including optical cameras, radar systems, and LiDAR systems. The sensors collect low-level data that a sensor tracker processes to generate low-level tracks representing different aspects of the vehicle's surroundings. Most of the raw data relevant to autonomy and safety represents different objects in the environment. These objects may be dynamic or static and may include pedestrians, animals, other vehicles, vegetation, buildings, road signs, and many other elements that may be present in the environment. Often, the environment contains many objects of various types. An example of a cluttered environment is a busy downtown street in a large city, which may include numerous pedestrians, vehicles, and other objects.
One of the main challenges for automotive systems with different sensors is correlating or disassociating the tracks gathered from the different sensor modalities. For example, the automotive system attempts to determine whether a camera-derived track matches a radar-derived track.
Currently, available track correlation processes focus mainly on the present states and attributes of the tracks when making correlation decisions. The historical trajectory of a track is typically not taken into account, because storing and processing historical information can be expensive in terms of storage and computational complexity. However, historical information can be helpful when an accurate correlation decision between two tracks must be made from among a large number of different tracks originating from one or more sensors, especially in cluttered environments.
As an example, at a particular point in time, a radar track and a vision track may be close to each other, causing them to appear to match well. However, analyzing the historical trajectories of the two tracks may reveal that they came from different locations and thus refer to different objects rather than a common object. In this case, matching the two tracks would be a poor decision.
This document describes methods and systems for identification of history-based incompatibility tracking. A track's history helps determine accurately whether tracks originating from different sensors identify the same object or different objects. However, recording historical data for multiple tracks may consume a large amount of storage or other computing resources, and the associated computations may become complex. The methods and systems described herein enable a sensor fusion system of an automobile or other vehicle to take historical data into account when correlating and pairing tracks, without requiring extensive storage and without tying up other computing resources.
Example Environment
FIG. 1 illustrates an example environment 100 in which a system 102 is configured to perform identification of history-based incompatibility tracking in accordance with the techniques of this disclosure. In the depicted environment 100, the system 102 is an automobile. Sometimes referred to as the vehicle 102, the system 102 may represent any type of device or machine, including manned and unmanned systems, that may be used for various purposes. Some non-exhaustive and non-limiting examples of the vehicle 102 include a motorcycle, a bus, a tractor, a semi-trailer truck, a watercraft, an aircraft, or other equipment or machinery.
The vehicle 102 may be equipped with an object fusion system 104, which may include one or more sensor interfaces 106, including a camera interface 106-1 and a radar interface 106-2, and a track matching module 108. With the object fusion system 104, the vehicle 102 has an instrument field of view (FOV) 110 that may encompass one or more vehicles 112, including a vehicle 112-1 and a vehicle 112-2. The object fusion system 104 may capture the FOV 110 from any exterior surface of the vehicle 102. Positioning the camera and radar components that interface with the object fusion system 104 in a particular manner can give the object fusion system 104 a particular FOV. For example, co-locating one or more of the cameras with the radar may ensure that the FOV of the object fusion system 104 includes an area above, near, or on a road on which the vehicle 102 may be traveling. At least a portion of the object fusion system 104 may be integrated into a side-view mirror, bumper, roof, or any other part of the vehicle 102.
Although not explicitly shown in FIG. 1, the track matching module 108 executes on a processor or other hardware. During execution, the track matching module 108 may track objects based on sensor data obtained at the camera interface 106-1 and the radar interface 106-2. The camera interface 106-1 receives camera data from one or more cameras of the vehicle 102, and the radar interface 106-2 receives radar data from at least one radar of the vehicle 102. The track matching module 108 of the object fusion system 104 accesses the camera interface 106-1 and the radar interface 106-2 to acquire the camera data and the radar data, respectively. As will be appreciated from other portions of this description, the object fusion system 104 may include additional sensor interfaces 106 (e.g., LiDAR) beyond those shown in FIG. 1.
The track matching module 108 configures the object fusion system 104 to associate or disassociate different types of tracks obtained from the sensor interfaces 106. The track matching module 108 identifies a plurality of tracks from the first sensor data (e.g., obtained from the camera interface 106-1) and determines a plurality of tracks from the second sensor data (e.g., obtained from the radar interface 106-2). When pairing or associating two tracks, the track matching module 108 compares a track from the first sensor data with a track from the second sensor data to determine whether the pair of tracks (e.g., an object pairing) represents a good candidate association. For example, if the camera interface 106-1 receives a first track related to the vehicle 112-1 and the radar interface 106-2 receives a second track related to the vehicle 112-2, the track matching module 108 evaluates the first and second tracks, using the techniques described herein and possibly other techniques, to determine whether the first track and the second track match and represent a common vehicle 112.
To correlate or disassociate the different types of tracks collected through the sensor interfaces 106, the track matching module 108 performs a track matching process that pairs tracks present in the camera data with tracks present in the radar data. When pairing large sets of multiple tracks, the track matching module 108 may generate a feasibility matrix and an incompatibility matrix. The object fusion system 104 assigns an identifier to each track of an object candidate detected using the camera, and assigns an identifier to each track (e.g., detection) obtained using the radar.
One example of a feasibility matrix is an object-track-pair matrix. Each element of the feasibility matrix indicates the probability that the two object tracks of an object-track pair are associated with the same object. The feasibility matrix may be created by one or more of several different methods and is the primary mechanism for object-track fusion.
In this example, the incompatibility matrix supplements the feasibility matrix with historical tracking data. The object fusion system may adjust the probabilities indicated in the feasibility matrix based on that historical data.
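One way to picture this adjustment is an element-wise suppression of the feasibility probabilities. The following is a minimal sketch under assumed matrix shapes and values, not the claimed implementation; zeroing a probability is only one possible form of reduction:

```python
def adjust_feasibility(feasibility, incompatibility):
    """Suppress the same-object probability for every track pair that the
    incompatibility matrix flags as historically incompatible
    (1 = the pair has a history of tracking different objects, 0 = no such history)."""
    return [
        [p * (1 - flag) for p, flag in zip(p_row, f_row)]
        for p_row, f_row in zip(feasibility, incompatibility)
    ]

# Two radar tracks (rows) paired against two camera tracks (columns);
# all values are hypothetical.
feasibility = [[0.9, 0.2],
               [0.1, 0.8]]
incompatibility = [[0, 1],
                   [0, 0]]  # pair (row 1, column B) has an incompatible history

adjusted = adjust_feasibility(feasibility, incompatibility)
# adjusted[0][1] is suppressed to 0.0; the other probabilities are unchanged.
```

A softer design choice would multiply by a penalty factor between 0 and 1 instead of zeroing, letting later frames recover the pairing more easily.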
The incompatibility matrix is created in two dimensions, although with additional sensors the incompatibility matrix may comprise more than two dimensions. The first dimension represents the total number of columns, each column being assigned a camera identifier from the camera data. The second dimension represents the total number of rows, each row being assigned a radar identifier from the radar data, and so on. For each camera-radar pairing represented in the incompatibility matrix, a parameter (e.g., a binary value) is assigned based on the differences in state between the track derived from the camera data and the track derived from the radar data: each difference is compared with a threshold parameter (e.g., a state threshold) for that state and evaluated over a threshold number of consecutive frames. The parameter, or binary value, indicates whether the track from the camera data and the track from the radar data represent different objects.
Example architecture
FIG. 2 illustrates an example automobile system 200 configured to perform identification of history-based incompatibility tracking in accordance with the techniques of this disclosure. The automobile system 200 may be integrated into the vehicle 102 shown in FIG. 1 and is described in that context. For example, the automobile system 200 includes a controller 202 and an object fusion system 104-1, which is an example of the object fusion system 104. The object fusion system 104-1 and the controller 202 communicate over a link 204. The link 204 may be a wired or wireless link, and in some cases includes a communication bus. The controller 202 performs operations based on information received over the link 204, such as an indication of compatibility output from the object fusion system 104-1 when objects in the FOV are identified by processing and merging tracks based at least in part on the incompatibility matrix.
The controller 202 includes a processor 206 and a computer-readable storage medium (CRM) 208 (e.g., memory, long-term storage, short-term storage), the CRM 208 storing instructions for an automobile module 210. In addition to the radar interface 106-2, the object fusion system 104-1 includes the camera interface 106-1; any number of other sensor interfaces 106 may be used, including a LiDAR interface or another sensor interface 106-n. The object fusion system 104-1 can include processing hardware including a processor 212 (e.g., a hardware processor, a processing unit) and a computer-readable storage medium (CRM) 214 storing instructions associated with a track matching module 108-1. The track matching module 108-1 is an example of the track matching module 108 and includes an incompatibility matrix 216, stored as a bit array 218, and a feasibility matrix 220.
The processors 206 and 212 may be two separate processing units or a single processing unit (e.g., a microprocessor), or a pair of systems-on-chip or a single system-on-chip of a computing device, controller, or control unit. The processors 206 and 212 execute computer-executable instructions stored within the CRMs 208 and 214. As an example, the processor 206 may execute the automobile module 210 to perform a driving function (e.g., an autonomous lane-change maneuver, a semi-autonomous lane-keeping feature) or another operation of the automobile system 200. Similarly, the processor 212 may execute the track matching module 108-1 to infer objects in the FOV based on sensor data obtained from the plurality of sensor interfaces 106 of the system 102. The automobile module 210, when executed at the processor 206, may receive an indication of one or more objects detected by the track matching module 108-1 in response to the track matching module 108-1 combining and analyzing sensor data generated at each of the sensor interfaces 106.
In general, the automobile system 200 executes the automobile module 210 to perform automobile functions, which may include using output from the object fusion system 104-1. For example, the automobile module 210 may provide automatic cruise control and monitor the object fusion system 104-1 for output indicating the presence of an object in or near the FOV 110, e.g., to reduce speed and prevent a rear-end collision with the vehicle 112. In such an example, the track matching module 108-1 provides the sensor data, or a derivative thereof (e.g., the incompatibility matrix), as output to the automobile module 210. The automobile module 210 may provide an alert or perform a specific maneuver when the data obtained from the track matching module 108-1 indicates that one or more objects are crossing in front of the vehicle 102.
FIG. 3 illustrates an example scenario 300, in accordance with the techniques of this disclosure, in which tracks 302 and 304 are received by a vehicle 306 configured to perform identification of history-based incompatibility tracking. In FIG. 3, the track 302 may be a first track identified through one sensor interface 308 (e.g., the camera interface 106-1 in FIG. 1) on the vehicle 306, and the track 304 may be a second track identified through a different sensor interface 310 (e.g., the radar interface 106-2 in FIG. 1) on the vehicle 306. Different states may be extracted from the tracking data, including, for the track 302, a distance 312 (r-312), an azimuth 314 (θ-314), a longitudinal velocity 320 (vlong-320), and a lateral velocity 324 (vlat-324), among other conceivable states. For the track 304, a distance 316 (r-316), an azimuth 318 (θ-318), a longitudinal velocity 322 (vlong-322), and a lateral velocity 326 (vlat-326) may be extracted. Other track states not shown in FIG. 3 may also be used for identification of history-based incompatible tracks. To identify whether the tracks 302 and 304 are compatible or incompatible, the difference between corresponding states of the two tracks within a common time frame is calculated: the difference between r-312 and r-316, the difference between θ-314 and θ-318, the difference between vlong-320 and vlong-322, and the difference between vlat-324 and vlat-326. Each difference is compared with a threshold for that particular state; if the difference is greater than the threshold, the track pair may be incompatible.
The incompatibility may be true even if only one state difference is greater than its corresponding state threshold while the other differences are less than theirs. For example, the incompatibility of the track pair (302, 304) may be calculated as:
incompatibility = |r-312 - r-316| > distance_threshold, or
|θ-314 - θ-318| > azimuth_threshold, or
|vlong-320 - vlong-322| > longitudinal_velocity_threshold, or
|vlat-324 - vlat-326| > lateral_velocity_threshold.
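A per-frame version of this OR-of-thresholds test might be sketched as follows; the state names and all numeric values are hypothetical and are not taken from the disclosure:

```python
def is_incompatible(track_a, track_b, thresholds):
    """Return True if any corresponding state difference between the two
    tracks exceeds its per-state threshold (a logical OR over states)."""
    return any(
        abs(track_a[state] - track_b[state]) > threshold
        for state, threshold in thresholds.items()
    )

# Hypothetical single-frame states for tracks 302 and 304.
track_302 = {"distance": 25.0, "azimuth": 0.10, "vlong": 12.0, "vlat": 0.5}
track_304 = {"distance": 25.4, "azimuth": 0.32, "vlong": 11.8, "vlat": 0.4}
thresholds = {"distance": 2.0, "azimuth": 0.05, "vlong": 1.0, "vlat": 1.0}

# Only the azimuth difference (0.22 rad) exceeds its threshold, yet that is
# enough to flag the pair as incompatible for this frame.
incompatible = is_incompatible(track_302, track_304, thresholds)
```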
If incompatibility between a track pair is calculated within a frame, a frame counter is activated so that the incompatibility is evaluated over a threshold number of frames before an incompatibility decision is made. If the threshold number of frames is reached and the incompatibility of the track pair still exists (e.g., at least one state difference exceeds its state threshold in every frame for that duration), then the track pair may historically (e.g., over the threshold number of frames) be considered to be tracking different objects. In this example, incompatibility of a track pair is assigned a binary value of 1, and compatibility of a track pair is assigned a binary value of 0. In other aspects, the assignment of binary values may be reversed, with 1 corresponding to compatible and 0 corresponding to incompatible. If multiple track pairs are tracked historically in this manner, the incompatibility data may be arranged as an incompatibility matrix.
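The frame-counter gating can be sketched as a small helper; the reset-on-compatible-frame behavior and the threshold of three frames are assumptions made for illustration:

```python
class IncompatibilityCounter:
    """Declare a track pair historically incompatible only after the
    per-frame incompatibility holds for a threshold number of
    consecutive frames."""

    def __init__(self, frame_threshold):
        self.frame_threshold = frame_threshold
        self.count = 0

    def update(self, incompatible_this_frame):
        """Feed one frame's incompatibility flag; return the historical decision."""
        if incompatible_this_frame:
            self.count += 1
        else:
            self.count = 0  # assumption: a compatible frame resets the counter
        return self.count >= self.frame_threshold

counter = IncompatibilityCounter(frame_threshold=3)
decisions = [counter.update(flag)
             for flag in (True, True, False, True, True, True)]
# The pair is declared historically incompatible only once three
# consecutive incompatible frames have been observed.
```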
FIG. 4 illustrates an example incompatibility matrix 400 used to perform the identification of history-based incompatibility tracking in accordance with the techniques of this disclosure. The columns of the incompatibility matrix 400 represent tracks from a first sensor interface (e.g., the track 302), and the rows represent tracks from a different sensor interface (e.g., the track 304). In this example, if the track 302 corresponds to column A of the incompatibility matrix 400, and the track 304 corresponds to row 3, then the track pair (302, 304) has an incompatibility assignment of 1 (e.g., element A-3 in the incompatibility matrix 400). If compatibility becomes possible within a threshold number of frames, the incompatibility assignment becomes 0. FIG. 4 further illustrates that the incompatibility matrix can be arranged and stored as a bit array 402. In this example, sixteen track pairs are identified and stored as a sixteen-element (e.g., sixteen-bit) array. In other examples, a more complex encoding scheme may be used (e.g., using fewer than the total number of bits to represent the track pairs in the array). The array is shown such that the first bit corresponds to track pair A-1, the second bit to A-2, and so on. However, the array can be arranged differently, such as row first (e.g., A-1, B-1, …, C-4, D-4), in reverse order (e.g., D-4, D-3, …, A-2, A-1), or in any other arrangement of bits representing the elements of the incompatibility matrix 400.
By storing the incompatibility matrix 400 as a bit array, with each bit representing the incompatibility of a different tracking pair, a large amount of information about the historical relationships of different tracks can be stored efficiently and inexpensively, using few computing resources.
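A minimal sketch of packing the 4x4 incompatibility matrix of FIG. 4 into a sixteen-bit value, assuming the column-first bit order described above (A-1 in bit 0, A-2 in bit 1, and so on). The function names and the dictionary representation of the matrix are illustrative:

```python
COLS = "ABCD"  # column labels per FIG. 4
N_ROWS = 4     # row labels 1..4 per FIG. 4


def pack(matrix):
    """Pack a sparse matrix {('A', 3): 1, ...} into one 16-bit integer."""
    bits = 0
    for ci, col in enumerate(COLS):
        for row in range(1, N_ROWS + 1):
            idx = ci * N_ROWS + (row - 1)  # A-1 -> bit 0, A-2 -> bit 1, ...
            if matrix.get((col, row), 0):
                bits |= 1 << idx
    return bits


def is_incompatible(bits, col, row):
    """Read one pair's incompatibility bit back out of the packed value."""
    idx = COLS.index(col) * N_ROWS + (row - 1)
    return (bits >> idx) & 1 == 1
```

For example, marking only element A-3 incompatible sets a single bit, so the whole sixteen-pair history fits in one machine word.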
Example scenarios
FIG. 5-1 illustrates an example implementation 500-1 of identification of history-based incompatibility tracking in accordance with the techniques of this disclosure. In implementation 500-1, a vehicle 502 is equipped to perform identification of history-based incompatibility tracking. At least one object 504-1 is in the FOV of at least two sensors mounted on the vehicle 502. One sensor interface of the vehicle 502 receives track 504-2 and, within the same frame, a second sensor interface of the vehicle 502 receives track 504-3. The difference between at least one tracking state (e.g., azimuth angle) of tracks 504-2 and 504-3 is greater than the state threshold set for that tracking state. In this case, the difference in that tracking state indicates incompatibility of the two tracks 504-2 and 504-3, even if the other tracking states indicate compatibility. If the difference persists for a consecutive number of frames set by the frame counter, the two tracks 504-2 and 504-3 may be considered historically incompatible. However, if the state differences of the two tracks 504-2 and 504-3 later fall below a feasibility threshold (the threshold set to determine, at any given frame index, the feasibility of the two tracks representing the same object), another frame counter begins counting the frames for which the two tracks remain feasible. Once feasibility is maintained for a threshold number of frames, the tracks become compatible with each other, and tracks 504-2 and 504-3 may be considered as likely tracking the same object 504-1. The feasible tracking pair may then be further considered along with other data (e.g., object class) to determine whether the tracking pair (504-2, 504-3) represents object 504-1.
In implementation 500-1, the state whose difference exceeds the compatible state threshold is the azimuth; however, any one or more of the tracking state differences between tracks 504-2 and 504-3 may be greater than the compatible state threshold for that particular state, indicating incompatibility. Once all state differences remain below the feasibility threshold (not the compatible state threshold mentioned in [0026]) for a threshold number of frames, tracks 504-2 and 504-3 may represent a common object 504-1 based on the historical relationship of tracks 504-2 and 504-3.
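The two thresholds distinguished above can be illustrated with a hypothetical per-frame classification: a pair is incompatible when any state difference exceeds its compatible state threshold, and feasible only when all state differences fall below the feasibility threshold. The state names and threshold values below are assumptions for illustration:

```python
def classify(track_a, track_b, compat_thresh, feas_thresh):
    """Classify one tracking pair for a single frame.

    track_a, track_b: dicts of state name -> value (e.g., azimuth, range).
    compat_thresh, feas_thresh: dicts of per-state thresholds.
    """
    diffs = {k: abs(track_a[k] - track_b[k]) for k in track_a}
    # Any single state difference over its compatibility threshold
    # is enough to flag the pair incompatible this frame.
    if any(diffs[k] > compat_thresh[k] for k in diffs):
        return "incompatible"
    # Feasibility requires ALL differences under the feasibility threshold.
    if all(diffs[k] < feas_thresh[k] for k in diffs):
        return "feasible"
    return "undetermined"
```

A pair whose differences sit between the two thresholds is neither flagged incompatible nor counted toward the feasibility streak.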
FIG. 5-2 illustrates another example implementation 500-2 of identification of history-based incompatibility tracking in accordance with the techniques of this disclosure. In implementation 500-2, the state differences of tracks 504-2 and 504-3 are all less than the respective state thresholds for compatible tracks at frame T-1. However, at frame T-2, which may or may not immediately follow frame T-1, one or more of the state differences of tracks 504-2 and 504-3 are determined to be greater than the respective state threshold. In some aspects, before deciding that track 504-2 may be tracking a different object than track 504-3, a counter may begin determining whether the tracking pair (504-2, 504-3) remains incompatible over a threshold number of frames. In other aspects, at frame T-2, the tracking pair (504-2, 504-3) may be immediately determined to be incompatible and not decided to be a compatible pair until the state differences remain less than the feasibility threshold for a threshold number of frames.
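The second aspect of implementation 500-2 (immediate incompatibility, delayed restoration of compatibility) might be sketched as follows. The class, the attribute names, the use of a single maximum state error, and the frame threshold are illustrative assumptions:

```python
FEASIBLE_FRAMES = 5  # assumed threshold for restoring compatibility


class PairState:
    """Asymmetric update: flag incompatibility at once, restore slowly."""

    def __init__(self):
        self.incompatible = False
        self.feasible_streak = 0

    def update(self, max_state_error, compat_thresh, feas_thresh):
        if max_state_error > compat_thresh:
            self.incompatible = True       # flagged immediately at this frame
            self.feasible_streak = 0
        elif max_state_error < feas_thresh:
            self.feasible_streak += 1
            if self.feasible_streak >= FEASIBLE_FRAMES:
                self.incompatible = False  # compatibility restored with delay
        else:
            self.feasible_streak = 0       # between thresholds: streak breaks
        return self.incompatible
```

One bad frame suffices to mark the pair incompatible, but only a sustained run of feasible frames clears the flag, which guards the decision against transient sensor noise.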
Example method
FIG. 6-1 illustrates an example method of identification of history-based incompatibility tracking in accordance with the techniques of this disclosure. The method 600 is illustrated as a set of operations (or acts) that are performed in, but not necessarily limited to, the order or combination of operations shown or described. In addition, any of the operations can be repeated, combined, or recombined to provide further methods. In portions of the following discussion, reference may be made to the preceding description of the figures in describing some non-limiting examples of method 600.
At 602, a first tracking of a first object proximate a vehicle (e.g., vehicle 102) is determined by an object fusion system of the vehicle from first sensor data acquired by a first sensor. The first tracking may be associated with at least a portion of a stationary object or a portion of a moving object, such as the vehicle 112.
At 604, a second tracking of a second object proximate to the vehicle is determined by the object fusion system of the vehicle from second sensor data acquired by a second sensor. In some examples, the first sensor may include one or more optical cameras or near-infrared cameras, and the second sensor may include one or more radar sensors, lidar sensors, or ultrasonic sensors.
At 606, a feasibility matrix and an incompatibility matrix are maintained by the object fusion system. Each element value in the feasibility matrix indicates a probability that the different object tracks of a tracking pair are associated with the same object at a given time/frame index. The incompatibility matrix indicates whether the first track and the second track are historically associated with different objects. The track matching module 108-1 pairs the first track with the second track, and the tracking pair is represented as an element of the incompatibility matrix. Data from the first track is compared to data from the second track, and the value of the element is set based on the result of the comparison. Based on data derived from the first track and the second track, the value of the element representing the tracking pair indicates whether the tracking pair is historically incompatible or compatible. In some aspects, the value may be a binary value indicating that the tracking pair is or is not incompatible. The values of the tracking pairs are updated periodically. The object fusion system can adjust the probabilities indicated in the feasibility matrix based on the element values of the corresponding tracking pairs in the incompatibility matrix.
At 608, it is determined whether the incompatibility matrix indicates that the first track and the second track are historically associated with different objects. The track matching module 108-1 compares the first track and the second track over successive frames for as long as the tracking pair exists. That is, as long as the first track and the second track exist, the tracking pair may be updated every frame that data arrives, based on a comparison of the track data. If the first track and the second track are associated with the same object for a threshold number of consecutive frames, the binary value of the element representing the pair is changed.
At 610, in response to determining that the incompatibility matrix indicates that the first track and the second track are associated with different objects for at least a threshold number of consecutive frames, it is determined that the first track and the second track comprise a history associated with different objects. The binary value of the element representing the tracking pair may represent the historical relationship between the first track and the second track. For example, if the first track and the second track appear incompatible (e.g., the two tracks differ in position by more than the state threshold of compatible tracks), the binary value of the element representing the tracking pair may be set to "1". If at some point the tracking pair becomes a feasible pair, a counter counts the number of frames the tracking pair remains feasible. If the counter reaches a threshold number of consecutive frames, the comparison indicates a probability that the tracks are associated with the same object, a historical relationship between the two tracks is established, and the binary value of the element may change to "0" to indicate the nature of the relationship.
At 612, in response to determining that the first track and the second track comprise histories associated with different objects, the feasibility matrix is adjusted by reducing the probability that the first object and the second object are the same object at the current time index. Knowledge of the historical relationship between the first track and the second track may help the feasibility process make accurate decisions about the association of tracking pairs with common objects.
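Step 612 can be illustrated with a hypothetical adjustment routine that scales down the feasibility probability of every historically incompatible pair. The penalty factor is an assumption; the disclosure does not specify by how much the probability is reduced:

```python
def adjust_feasibility(feasibility, incompatibility, penalty=0.1):
    """Reduce P(same object) for each historically incompatible pair.

    feasibility: dict of pair -> probability the two tracks share an object.
    incompatibility: dict of pair -> 1 if historically incompatible, else 0.
    penalty: illustrative scale factor applied to incompatible pairs.
    """
    adjusted = {}
    for pair, prob in feasibility.items():
        if incompatibility.get(pair, 0) == 1:
            adjusted[pair] = prob * penalty  # history says: different objects
        else:
            adjusted[pair] = prob            # no history; leave unchanged
    return adjusted
```

Pairs with no recorded incompatibility keep their feasibility unchanged, so the history only ever vetoes, never promotes, an association.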
At 614, in response to adjusting the feasibility matrix by reducing the probability that the first object and the second object are the same object, information identifying the first object as separate from the second object is output to an automated system of the vehicle to avoid a collision between the vehicle and the first object or the second object. The binary values of the elements representing the pairs may be readily transmitted, read, or otherwise communicated to the automotive system. Optionally, in addition to the data collected and derived by the one or more first and second sensors, the information contained in the incompatibility matrix may be further used by the object fusion system to determine the association of object pairs. The object fusion system may output this information about the object pairing to the automotive system.
Additional examples
In the following section, additional examples of history-based identification of incompatible tracks are provided.
Example 1. A method, comprising: determining, by an object fusion system of a vehicle, a first tracking of a first object proximate to the vehicle from first sensor data acquired by a first sensor; determining, by the object fusion system of the vehicle, a second tracking of a second object proximate to the vehicle based on second sensor data acquired by a second sensor; maintaining, by the object fusion system, a feasibility matrix indicating a probability that the first object and the second object are the same object at a given time index and an incompatibility matrix indicating whether the first tracking and the second tracking are historically associated with different objects; determining whether the incompatibility matrix indicates that the first tracking and the second tracking are associated with different objects; in response to determining that the incompatibility matrix indicates that the first tracking and the second tracking are associated with different objects, determining that the first tracking and the second tracking comprise histories related to different objects; in response to determining that the first tracking and the second tracking comprise histories related to different objects, adjusting the feasibility matrix by reducing the probability that the first object and the second object are the same object; and in response to adjusting the feasibility matrix by reducing the probability that the first object and the second object are the same object, outputting, to an automated system of the vehicle, information identifying the first object as separate from the second object to avoid a collision between the vehicle and the first object or the second object.
Example 2. The method of any of the preceding examples, further comprising: in response to determining that the incompatibility matrix indicates, for at least a threshold number of consecutive frames, that the first tracking and the second tracking are associated with the same object, determining that the first tracking and the second tracking comprise a history related to the same object; in response to determining that the first tracking and the second tracking include the history related to the same object, adjusting the feasibility matrix by increasing the probability that the first object and the second object are the same object; and in response to adjusting the feasibility matrix by increasing the probability that the first object and the second object are the same object, outputting, to the automated system, information identifying the same object to avoid a collision between the vehicle and the same object.
Example 3. The method of any of the preceding examples, wherein determining whether the incompatibility matrix indicates that the first tracking and the second tracking are associated with the same object for at least a threshold number of consecutive frames further comprises: incrementing a count associated with the consecutive frames, the count being a tentative indication of whether the tracks are of the same object; and setting the threshold number of consecutive frames to a particular value based on a measurement accuracy of the first sensor and a measurement accuracy of the second sensor.
Example 4. The method of any of the preceding examples, further comprising: in response to incrementing the count associated with successive frames, the count is reset if the tracking is determined to include histories related to two different objects.
Example 5. The method of any of the preceding examples, further comprising: setting the threshold number of consecutive frames to a particular value further based on an age of one or more of the first tracking and the second tracking.
Example 6. The method of any of the preceding examples, wherein maintaining the incompatibility matrix comprises: determining a difference between a vehicle state derived from the first tracking and a same vehicle state derived from the second tracking; comparing the difference to a state threshold; and in response to comparing the difference to the state threshold, assigning a binary value to an element in the incompatibility matrix, the element being associated with the first tracking and the second tracking.
Example 7. The method of any of the preceding examples, further comprising: storing the binary value in a corresponding bit array associated with the first tracking and the second tracking.
Example 8. The method of any of the preceding examples, wherein the vehicle state comprises one or more of a position, a speed, an azimuth, or a distance.
Example 9. The method of any of the preceding examples, wherein the first sensor comprises a vision sensor, a radar sensor, or a light detection and ranging sensor.
Example 10. The method of any of the preceding examples, wherein the second sensor comprises a vision sensor, a radar sensor, or a light detection and ranging sensor.
Example 11. A system, comprising: one or more processors configured to perform the method of any of the preceding examples.
Example 12. A system comprising means for performing the method of any of the preceding examples.
Example 13. A computer-readable storage medium comprising instructions that, when executed, configure one or more processors to perform the method of any of the preceding examples.
Conclusion
While various embodiments of the present disclosure have been described in the foregoing description and illustrated in the accompanying drawings, it is to be understood that the disclosure is not limited thereto but may be practiced in various ways within the scope of the following claims. From the foregoing description, it will be apparent that various modifications may be made without departing from the scope of the disclosure as defined in the appended claims. Problems associated with incompatible tracks may occur in other systems that identify and process tracks from various sensors. Thus, although described as one way of improving vehicle-based matching techniques, the techniques described above may be applied to other problems to efficiently and inexpensively match objects based on the historical relationships of multiple tracks.
The use of "or" and grammatical related terms means a non-exclusive alternative without limitation, unless the context clearly dictates otherwise. As used herein, a phrase referring to "at least one of" a list of items refers to any combination of those items, including a single member. For example, "at least one of a, b, or c" is intended to encompass: a. b, c, a-b, a-c, b-c, and a-b-c, and any combination having a plurality of the same elements (e.g., a-a-a, a-a-b, a-a-c, a-b-b, a-c-c, b-b-b, b-b-c, c-c, and c-c-c, or any other ordering of a, b, and c).

Claims (20)

1. A method, the method comprising:
determining, by an object fusion system of a vehicle, a first tracking of a first object proximate to the vehicle from first sensor data acquired by a first sensor;
determining, by the object fusion system, a second tracking of a second object proximate the vehicle from second sensor data acquired by a second sensor;
maintaining, by the object fusion system, a feasibility matrix and an incompatibility matrix, the feasibility matrix indicating a probability that the first object and the second object are the same object at a given time index, the incompatibility matrix indicating whether the first tracking and the second tracking are historically associated with different objects;
determining whether the incompatibility matrix indicates that the first tracking and the second tracking are associated with different objects;
in response to determining that the incompatibility matrix indicates that the first track and the second track are associated with different objects, determining that the first track and the second track comprise a history related to different objects;
in response to determining that the first and second tracks comprise histories related to different objects, adjusting the feasibility matrix by reducing the probability that the first and second objects are the same object;
in response to adjusting the feasibility matrix by reducing the probability that the first object and the second object are the same object, outputting, to an automated system of the vehicle, information identifying the first object as being separate from the second object to avoid a collision between the vehicle and the first object or the second object.
2. The method of claim 1, further comprising:
in response to determining that the incompatibility matrix indicates that the first track and the second track are associated with the same object for at least a threshold number of consecutive frames, determining that the first track and the second track comprise a history related to the same object;
in response to determining that the first and second tracks include the history relating to the same object, adjusting the feasibility matrix by increasing the probability that the first and second objects are the same object; and
in response to adjusting the feasibility matrix by increasing the probability that the first object and the second object are the same object, outputting, to the automated system, information identifying the same object to avoid a collision between the vehicle and the same object.
3. The method of claim 2, wherein determining that the incompatibility matrix indicates that the first tracking and the second tracking are associated with the same object for at least a threshold number of consecutive frames further comprises:
incrementing a count associated with the consecutive frames, the count being a tentative indication of whether the tracks are of the same object; and
setting the threshold number of the consecutive frames to a particular value based on a measurement accuracy of the first sensor and a measurement accuracy of the second sensor.
4. The method of claim 3, further comprising:
in response to incrementing a count associated with the successive frames, resetting the count if the tracking is determined to include a history relating to two different objects.
5. The method of claim 3, further comprising:
setting the threshold number of the consecutive frames to a particular value further based on an age of one or more of the first track and the second track.
6. The method of claim 1, wherein maintaining the incompatibility matrix comprises:
determining a difference between a vehicle state derived from the first trace and a same vehicle state derived from the second trace;
comparing the difference to a state threshold; and
assigning a binary value to an element in the incompatibility matrix corresponding to the association of the first trace and the second trace in response to comparing the difference to the state threshold.
7. The method of claim 6, further comprising:
storing the binary value as the associated bit array corresponding to the first track and the second track.
8. The method of claim 6, wherein the vehicle state comprises one or more of a position, a speed, an azimuth, or a distance.
9. The method of claim 1, wherein the first sensor comprises:
a vision sensor;
a radar sensor; or
Light detection and ranging sensors.
10. The method of claim 9, wherein the second sensor comprises:
a vision sensor;
a radar sensor; or
Light detection and ranging sensors.
11. A system, the system comprising:
one or more processors configured to:
determining a first tracking of a first object proximate to a vehicle from first sensor data acquired by a first sensor of the vehicle;
determining a second tracking of a second object proximate the vehicle from second sensor data acquired by a second sensor of the vehicle;
maintaining a feasibility matrix and an incompatibility matrix, the feasibility matrix indicating a probability that the first object and the second object are the same object at a given time index, the incompatibility matrix indicating whether the first tracking and the second tracking are historically associated with different objects;
determining whether the incompatibility matrix indicates that the first track and the second track are associated with different objects;
in response to determining that the incompatibility matrix indicates that the first track and the second track are associated with different objects, determining that the first track and the second track comprise a history related to different objects;
in response to determining that the first and second tracks comprise histories related to different objects, adjusting the feasibility matrix by reducing the probability that the first and second objects are the same object; and
in response to adjusting the feasibility matrix by reducing the probability that the first object and the second object are the same object, outputting, to an automated system of the vehicle, information identifying the first object as being separate from the second object to avoid a collision between the vehicle and the first object or the second object.
12. The system of claim 11, wherein the one or more processors are further configured for:
in response to determining that the incompatibility matrix indicates, within at least a threshold number of consecutive frames, that the first tracking and the second tracking are associated with the same object, determining that the first tracking and the second tracking comprise a history related to the same object;
in response to determining that the first and second tracks comprise the history relating to the same object, adjusting the feasibility matrix by increasing the probability that the first and second objects are the same object; and
in response to adjusting the feasibility matrix by increasing the probability that the first object and the second object are the same object, outputting, to the automated system, information identifying the same object to avoid a collision between the vehicle and the same object.
13. The system of claim 12, wherein the one or more processors are further configured to determine whether the incompatibility matrix indicates that the first track and the second track are associated with the same object during at least a threshold number of consecutive frames by:
incrementing a count associated with the consecutive frames, the count being a tentative indication of whether the tracks are of the same object; and
setting the threshold number of the consecutive frames to a particular value based on a measurement accuracy of the first sensor and a measurement accuracy of the second sensor.
14. The system of claim 12, wherein the one or more processors are further configured for:
in response to incrementing a count associated with the successive frames, resetting the count if the tracking is determined to include a history relating to two different objects.
15. The system of claim 12, wherein the one or more processors are further configured for:
setting the threshold number of consecutive frames to a particular value based on an age of one or more of the first track and the second track.
16. The system of claim 11, wherein the one or more processors are further configured to maintain the incompatibility matrix by:
determining a difference between a vehicle state derived from the first trace and a same vehicle state derived from the second trace;
comparing the difference to a state threshold; and
assigning a binary value to an element in the incompatibility matrix corresponding to the association of the first trace and the second trace in response to comparing the difference to the state threshold.
17. The system of claim 16, wherein the one or more processors are further configured for:
storing the binary value as the associated bit array corresponding to the first track and the second track.
18. The system of claim 16, wherein the vehicle state comprises one or more of a position, a velocity, an azimuth, or a distance.
19. The system of claim 11, wherein the first sensor and the second sensor comprise one or more of:
a vision sensor;
a radar sensor; or
Light detection and ranging sensors.
20. A computer-readable storage medium comprising instructions that, when executed, configure at least one processor to:
determining, by an object fusion system of a vehicle, a first tracking of a first object proximate to the vehicle from first sensor data acquired by a first sensor;
determining, by the object fusion system, a second tracking of a second object proximate the vehicle from second sensor data acquired by a second sensor;
maintaining, by the object fusion system, a feasibility matrix and an incompatibility matrix, the feasibility matrix indicating a probability that the first object and the second object are the same object at a given time index, the incompatibility matrix indicating whether the first tracking and the second tracking are historically associated with different objects;
determining whether the incompatibility matrix indicates that the first track and the second track are associated with different objects;
in response to determining that the incompatibility matrix indicates that the first track and the second track are associated with different objects, determining that the first track and the second track comprise a history related to different objects;
in response to determining that the first and second tracks comprise histories related to different objects, adjusting the feasibility matrix by reducing the probability that the first and second objects are the same object; and
responsive to adjusting the feasibility matrix by reducing the probability that the first object and the second object are the same object, outputting, to an automated system of the vehicle, information identifying the first object separated from the second object to avoid a collision between the vehicle and the first object or the second object.
CN202210267195.0A 2021-03-18 2022-03-17 Identification of history-based incompatibility tracking Pending CN115146694A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202163162851P 2021-03-18 2021-03-18
US63/162,851 2021-03-18
US17/308,908 US20220300743A1 (en) 2021-03-18 2021-05-05 History-Based Identification of Incompatible Tracks
US17/308,908 2021-05-05

Publications (1)

Publication Number Publication Date
CN115146694A true CN115146694A (en) 2022-10-04

Family

ID=80775371

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210267195.0A Pending CN115146694A (en) 2021-03-18 2022-03-17 Identification of history-based incompatibility tracking

Country Status (3)

Country Link
US (1) US20220300743A1 (en)
EP (1) EP4060381A1 (en)
CN (1) CN115146694A (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8705797B2 (en) * 2012-03-07 2014-04-22 GM Global Technology Operations LLC Enhanced data association of fusion using weighted Bayesian filtering
US10466361B2 (en) * 2017-03-14 2019-11-05 Toyota Research Institute, Inc. Systems and methods for multi-sensor fusion using permutation matrix track association
CN207396739U (en) * 2017-12-22 2018-05-22 西安飞芯电子科技有限公司 A kind of anti-collision early warning system based on mobile lidar
CN109987025B (en) * 2018-01-03 2023-02-21 奥迪股份公司 Vehicle driving assistance system and method for night environment
US10468062B1 (en) * 2018-04-03 2019-11-05 Zoox, Inc. Detecting errors in sensor data
JP7019503B2 (en) * 2018-04-25 2022-02-15 日立Astemo株式会社 Electronic control device, calculation method
CN208915155U (en) * 2018-06-27 2019-05-31 苏州瑞耀三维科技有限公司 A kind of vehicle anti-collision device based on laser ranging

Also Published As

Publication number Publication date
US20220300743A1 (en) 2022-09-22
EP4060381A1 (en) 2022-09-21

Similar Documents

Publication Publication Date Title
EP3104284B1 (en) Automatic labeling and learning of driver yield intention
JP7199545B2 (en) A Multi-view System and Method for Action Policy Selection by Autonomous Agents
CN113228040B (en) System and method for multi-level object travel direction estimation
US11774582B2 (en) Imaging and radar fusion for multiple-object tracking
CN113492851B (en) Vehicle control device, vehicle control method, and computer program for vehicle control
CN112154455B (en) Data processing method, equipment and movable platform
US10776642B2 (en) Sampling training data for in-cabin human detection from raw video
US11631325B2 (en) Methods and systems for traffic light state monitoring and traffic light to lane assignment
US11748593B2 (en) Sensor fusion target prediction device and method for vehicles and vehicle including the device
US11753002B2 (en) Vehicular control system
CN112009468A (en) Multi-hypothesis object tracking for autonomous driving systems
CN114518574A (en) Kurtosis-based pruning for sensor fusion systems
CN115146694A (en) Identification of history-based incompatibility tracking
CN110696828A (en) Forward target selection method and device and vehicle-mounted equipment
CN116022168A (en) Free space verification of ADS perception system perception
CN112739599B (en) Vehicle lane change behavior identification method and device
US20230147100A1 (en) Clustering Track Pairs for Multi-Sensor Track Association
US20230410490A1 (en) Deep Association for Sensor Fusion
US20240078814A1 (en) Method and apparatus for modeling object, storage medium, and vehicle control method
US11798295B2 (en) Model free lane tracking system
US20230227042A1 (en) Method for determining the reliability of objects
CN115704687A (en) Occlusion constraints for resolving tracking from multiple types of sensors
US20230373503A1 (en) Vehicle controller, method and computer program for vehicle control, priority setting device, and vehicle control system
US20230143958A1 (en) System for neural architecture search for monocular depth estimation and method of using
CN117058501A (en) Depth correlation for sensor fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20240401

Address after: Jiraum, Luxembourg, Luxembourg J. 12C Kroll Street

Applicant after: Anbofu Manufacturing Management Services Co.,Ltd.

Country or region after: Luxembourg

Address before: St. Michael, Barbados

Applicant before: Delphi Technologies, Inc.

Country or region before: Barbados

TA01 Transfer of patent application right

Effective date of registration: 20240428

Address after: 2 Pestaroz Street, Schaffhausen, Switzerland

Applicant after: Anbofu Technology Co.,Ltd.

Country or region after: Switzerland

Address before: Jiraum, Luxembourg, Luxembourg J. 12C Kroll Street

Applicant before: Anbofu Manufacturing Management Services Co.,Ltd.

Country or region before: Luxembourg