CN112816975B - Track determining method and device and electronic equipment - Google Patents


Info

Publication number
CN112816975B
CN112816975B (application CN202011622460.XA)
Authority
CN
China
Prior art keywords
track
radar
detected
vision
visual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011622460.XA
Other languages
Chinese (zh)
Other versions
CN112816975A (en)
Inventor
徐宁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ecarx Hubei Tech Co Ltd
Original Assignee
Ecarx Hubei Tech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ecarx Hubei Tech Co Ltd
Priority claimed from CN202011622460.XA
Publication of CN112816975A
Application granted
Publication of CN112816975B
Legal status: Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867: Combination of radar systems with cameras
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00: Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

Embodiments of the invention provide a track determining method, a track determining device, and an electronic device. The method comprises: obtaining radar measurement data detected by a radar every preset radar detection period and vision measurement data detected by a vision sensor every preset vision detection period; determining a radar track corresponding to each first detection target; determining a visual track corresponding to each second detection target; converting the first position coordinates of the first detection targets in the radar track and the second position coordinates of the second detection targets in the visual track into the same coordinate system; performing track point compensation on the radar track and/or the visual track so that the detection times of the track points in the radar track coincide with those in the visual track; and fusing the track points with the same detection time in the radar track and the visual track to obtain a fused track. The method broadens the application scenarios of target detection and improves detection accuracy.

Description

Track determining method and device and electronic equipment
Technical Field
The present invention relates to the field of automobile road detection technologies, and in particular, to a track determining method, a track determining device, and an electronic device.
Background
An ADAS (Advanced Driver Assistance System) uses various sensors installed on an automobile to continuously sense the surrounding environment while the vehicle is running, collect data, identify, detect, and track static and dynamic targets, and combine the results with navigation map data for computation and analysis. In an ADAS, targets are mainly detected and identified in two ways: by a camera vision sensor or by a millimeter-wave radar. Specifically, the position of a target can be detected by either the camera vision sensor or the millimeter-wave radar; the motion track of the automobile relative to the target is then calculated from the detected positions, the position of the target relative to the automobile is determined, and the target is identified from the captured image data.
However, both the camera vision sensor and the millimeter-wave radar have limitations when used alone. The millimeter-wave radar is slow at detecting a target's transverse (lateral) parameters and cannot distinguish elements such as lane lines and traffic signs; the vision sensor has a short detection range and low ranging and speed-measurement accuracy, and is strongly affected by factors such as illumination and weather. Relying on only the camera vision sensor or only the millimeter-wave radar therefore limits the application scenarios of target detection and identification and yields low detection accuracy.
Disclosure of Invention
Embodiments of the invention aim to provide a track determining method, a track determining device, and an electronic device that fuse a radar track detected by a radar sensor with a visual track detected by a vision sensor, thereby broadening the application scenarios of target detection and improving its accuracy.
In order to achieve the above object, an embodiment of the present invention provides a track determining method, including:
the method comprises the steps of obtaining radar measurement data detected by a radar at intervals of a preset radar detection period and vision measurement data detected by a vision sensor at intervals of a preset vision detection period; the radar measurement data comprise first position coordinates, first movement speed and first detected time of each first detected target detected by the radar in a radar coordinate system, wherein the first detected time is the time when the first detected target is detected by the radar; the vision measurement data comprise second position coordinates, second movement speed and second detected time of each second detection target detected by the vision sensor in a vision sensor coordinate system, wherein the second detected time is the time when the second detection target is detected;
Based on the radar measurement data, determining a radar track corresponding to each first detection target, wherein each radar track point in the radar track comprises: a first position coordinate, a first moving speed and a first detected moment of a corresponding first detected target; and determining, based on the vision measurement data, a vision track corresponding to each second detection target, each vision track point in the vision track comprising: a second position coordinate, a second movement speed and a second detected moment of the corresponding second detection target;
converting the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track into the same coordinate system; and performing track point compensation on the radar track and/or the vision track based on the preset radar detection period and the preset vision detection period, the first detected moment of each first detection target in the radar track, the second detected moment of each second detection target in the vision track, the first movement speed of each first detection target in the radar track and the second movement speed of each second detection target in the vision track, so that the detection moment of a track point in the radar track is the same as the detection moment of a track point in the vision track;
And fusing the track points with the same detection time in the radar track and the visual track to obtain a fused track.
Further, the determining, based on the radar measurement data, a radar track corresponding to each first detection target includes:
creating a radar track list for each radar track point in the radar measurement data whose first detected time is closest to the start time of a preset fusion period, so as to obtain one or more radar track lists;
aiming at each remaining radar track point in a preset fusion period, calculating an estimated position coordinate corresponding to the radar track point according to a preset radar detection period and a first position coordinate of the radar track point, which is one preset radar detection period earlier than the radar track point, in the created radar track list for each created radar track list;
calculating a coordinate difference between a first position coordinate of the radar track point and an estimated position coordinate corresponding to the radar track point;
if the coordinate difference is smaller than a preset difference threshold, adding the radar track point into the created radar track list, and adding 1 to the updated weight of the created radar track list; if the radar track point cannot be added into any established radar track list, establishing a new radar track list for the radar track point, and subtracting 1 from the update weight of the new radar track list; wherein, each radar track list comprises: one or more radar track points;
And aiming at each radar track list, if the number of times of increasing the updating weight of the radar track list within the preset time is greater than or equal to the preset number of times, determining the radar track list as the radar track corresponding to the first detection target.
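The radar track-list procedure above (create lists from the earliest points, predict each subsequent point's expected position one detection period ahead, associate by coordinate difference, and adjust an update weight) can be sketched in Python as follows. All names, the detection period, and the difference threshold are illustrative assumptions, not values fixed by the patent:

```python
from dataclasses import dataclass, field

# Illustrative values; the patent leaves the actual period and
# difference threshold unspecified ("not particularly limited").
RADAR_PERIOD = 0.1      # assumed preset radar detection period, seconds
DIFF_THRESHOLD = 1.0    # assumed preset difference threshold, metres

@dataclass
class TrackPoint:
    x: float
    y: float
    vx: float
    vy: float
    t: float            # first detected time

@dataclass
class TrackList:
    points: list = field(default_factory=list)
    weight: int = 0     # update weight of the track list

def associate(points):
    """Group radar track points into per-target radar track lists.

    For each point (in detection-time order), predict where every
    existing list's last point should be one detection period later
    (constant velocity). If the coordinate difference is below the
    threshold, the point joins that list and its update weight is
    incremented; otherwise a new list is created and its weight
    decremented, mirroring the patent's procedure.
    """
    lists = []
    for p in sorted(points, key=lambda q: q.t):
        for tl in lists:
            prev = tl.points[-1]
            px = prev.x + prev.vx * RADAR_PERIOD   # estimated position
            py = prev.y + prev.vy * RADAR_PERIOD
            if ((p.x - px) ** 2 + (p.y - py) ** 2) ** 0.5 < DIFF_THRESHOLD:
                tl.points.append(p)
                tl.weight += 1
                break
        else:
            lists.append(TrackList(points=[p], weight=-1))
    return lists
```

A list would then be promoted to a confirmed radar track once its update weight has been incremented at least the preset number of times within the preset duration; the same scheme applies unchanged to the visual track lists described next.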
Further, the determining, based on the vision measurement data, a vision track corresponding to each second detection target includes:
creating a visual track list for each visual track point in the vision measurement data whose second detected time is closest to the start time of the preset fusion period, so as to obtain one or more visual track lists;
aiming at each visual track point remained in a preset fusion period, calculating an estimated position coordinate corresponding to the visual track point according to a preset visual detection period and a second position coordinate of the visual track point, which is earlier than the visual track point by one preset visual detection period, in the second detected moment in the created visual track list for each created visual track list;
calculating a coordinate difference between the second position coordinate of the visual track point and the estimated position coordinate corresponding to the visual track point;
if the coordinate difference is smaller than a preset difference threshold, adding the visual track point into the created visual track list, and adding 1 to the updated weight of the created visual track list; if the visual track point cannot be added into any created visual track list, creating a new visual track list for the visual track point, and subtracting 1 from the updated weight of the new visual track list; wherein, each visual track list comprises: one or more visual track points;
And aiming at each visual track list, if the number of times of increasing the updating weight of the visual track list within the preset time is greater than or equal to the preset number of times, determining the visual track list as the visual track corresponding to the second detection target.
Further, the converting the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the vision track to the same coordinate system includes:
and converting the second position coordinates of each second detection target in the visual track to new second position coordinates under the radar coordinate system by taking the radar coordinate system as a reference coordinate system.
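Taking the radar coordinate system as the reference frame, converting a second position coordinate amounts to a rigid-body transform given the extrinsic calibration between the two sensors. A minimal sketch, where the rotation R and translation t are hypothetical placeholders for real calibration values:

```python
import numpy as np

# Hypothetical extrinsic calibration between the two sensors: a rotation
# R and a translation t that map vision-sensor coordinates into the
# radar coordinate system. Real values come from vehicle calibration.
R = np.eye(3)                       # assumed: sensors share orientation
t = np.array([0.0, 1.5, -0.2])      # assumed mounting offset, metres

def vision_to_radar(p_vision):
    """Convert a second position coordinate from the vision-sensor
    coordinate system to a new second position coordinate in the radar
    (reference) coordinate system."""
    return R @ np.asarray(p_vision, dtype=float) + t
```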
Further, the performing track point compensation on the radar track and/or the vision track based on the preset radar detection period and the preset vision detection period, the first detected time of each first detected target in the radar track, the second detected time of each second detected target in the vision track, the first movement speed of each first detected target in the radar track, and the second movement speed of each second detected target in the vision track, so that the detection time of the track point in the radar track and the detection time of the track point in the vision track are the same, includes:
If the radar detection period is an integer multiple of the vision detection period, determining a second detected time in the vision track that is different from the first detected times as a compensation time;
for each compensation moment, calculating a first position coordinate corresponding to the compensation moment based on a first position coordinate and a first movement speed of a previous radar track point adjacent to the compensation moment in the radar track and a time difference value between the compensation moment and a first detected moment of the radar track point, and compensating the first position coordinate corresponding to the compensation moment into one radar track point of the radar track.
Further, the performing track point compensation on the radar track and/or the vision track based on the preset radar detection period and the preset vision detection period, the first detected time of each first detected target in the radar track, the second detected time of each second detected target in the vision track, the first movement speed of each first detected target in the radar track, and the second movement speed of each second detected target in the vision track, so that the detection time of the track point in the radar track and the detection time of the track point in the vision track are the same, includes:
If the vision detection period is an integer multiple of the radar detection period, determining a first detected time in the radar track that is different from the second detected times as a compensation time;
for each compensation moment, calculating a second position coordinate corresponding to the compensation moment based on a second position coordinate and a second movement speed of a previous visual track point adjacent to the compensation moment in the visual track and a time difference value between the compensation moment and a second detected moment of the visual track point, and compensating the second position coordinate corresponding to the compensation moment into one visual track point of the visual track.
Further, the performing track point compensation on the radar track and/or the vision track based on the preset radar detection period and the preset vision detection period, the first detected time of each first detected target in the radar track, the second detected time of each second detected target in the vision track, the first movement speed of each first detected target in the radar track, and the second movement speed of each second detected target in the vision track, so that the detection time of the track point in the radar track and the detection time of the track point in the vision track are the same, includes:
If neither the radar detection period is an integer multiple of the vision detection period nor the vision detection period an integer multiple of the radar detection period, determining a second detected moment different from the first detected moment in the vision track as a first compensation moment; and determining a first detected time in the radar track that is different from the second detected time as a second compensation time;
for each first compensation moment, calculating a first position coordinate corresponding to the first compensation moment based on a first position coordinate and a first movement speed of a previous radar track point adjacent to the first compensation moment in the radar track and a time difference value between the first compensation moment and a first detected moment of the radar track point, and compensating the first position coordinate corresponding to the first compensation moment into one radar track point of the radar track;
and calculating a second position coordinate corresponding to each second compensation moment based on a second position coordinate and a second movement speed of a previous visual track point adjacent to the second compensation moment in the visual track and a time difference value between the second compensation moment and a second detected moment of the visual track point, and compensating the second position coordinate corresponding to the second compensation moment into one visual track point of the visual track.
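In every compensation case above, the compensated coordinate is computed the same way: extrapolate from the adjacent earlier track point using its measured velocity and the time difference. A minimal constant-velocity sketch (the dictionary field names are illustrative; the patent only fixes the arithmetic):

```python
def compensate(prev_point, t_comp):
    """Create a compensated track point at time t_comp by extrapolating
    the adjacent earlier track point with its measured velocity:
    new position = old position + velocity * (t_comp - old time)."""
    dt = t_comp - prev_point["t"]
    return {
        "x": prev_point["x"] + prev_point["vx"] * dt,
        "y": prev_point["y"] + prev_point["vy"] * dt,
        "vx": prev_point["vx"],    # velocity carried over unchanged
        "vy": prev_point["vy"],
        "t": t_comp,
    }
```

The same function serves both directions: extrapolating a radar track point to a vision detection time, or a visual track point to a radar detection time.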
Further, in the step of fusing the track points with the same detection time in the radar track and the vision track, after obtaining the fused track, the method further includes:
and identifying each detection target based on the fusion track, radar measurement data and/or vision measurement data.
In order to achieve the above object, an embodiment of the present invention further provides a track determining apparatus, including:
the measuring data acquisition module is used for acquiring radar measuring data detected by the radar at intervals of a preset radar detection period and vision measuring data detected by the vision sensor at intervals of a preset vision detection period; the radar measurement data comprise first position coordinates, first movement speed and first detected time of each first detected target detected by the radar in a radar coordinate system, wherein the first detected time is the time when the first detected target is detected by the radar; the vision measurement data comprise second position coordinates, second movement speed and second detected time of each second detection target detected by the vision sensor in a vision sensor coordinate system, wherein the second detected time is the time when the second detection target is detected;
The track determining module is configured to determine, based on the radar measurement data, a radar track corresponding to each first detection target, where each radar track point in the radar track includes: a first position coordinate, a first moving speed and a first detected moment of a corresponding first detected target; and determining, based on the vision measurement data, a vision track corresponding to each second detection target, each vision track point in the vision track comprising: a second position coordinate, a second movement speed and a second detected moment of the corresponding second detection target;
the data conversion module is used for converting the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track into the same coordinate system; and performing track point compensation on the radar track and/or the vision track based on the preset radar detection period and the preset vision detection period, the first detected moment of each first detection target in the radar track, the second detected moment of each second detection target in the vision track, the first movement speed of each first detection target in the radar track and the second movement speed of each second detection target in the vision track, so that the detection moment of a track point in the radar track is the same as the detection moment of a track point in the vision track;
And the track fusion module is used for fusing the track points with the same detection time in the radar track and the visual track to obtain a fused track.
Further, the track determining module includes:
the track list creation sub-module is used for respectively creating a radar track list aiming at each radar track point with the first detected moment closest to the starting moment of the preset fusion period in the radar measurement data to obtain one or more radar track lists;
the position coordinate estimation sub-module is used for calculating the estimated position coordinate corresponding to each radar track point according to the preset radar detection period and the first position coordinate of the radar track point, which is one preset radar detection period earlier than the radar track point, in the created radar track list for each radar track point remaining in the preset fusion period and for each created radar track list;
the coordinate difference value calculation sub-module is used for calculating the coordinate difference value between the first position coordinate of the radar track point and the estimated position coordinate corresponding to the radar track point; if the coordinate difference is smaller than a preset difference threshold, adding the radar track point into the created radar track list, and adding 1 to the updated weight of the created radar track list; if the radar track point cannot be added into any established radar track list, establishing a new radar track list for the radar track point, and subtracting 1 from the update weight of the new radar track list; wherein, each radar track list comprises: one or more radar track points;
The track determining sub-module is used for determining that each radar track list is the radar track corresponding to the first detection target if the number of times of increasing the updating weight of the radar track list within the preset duration is greater than or equal to the preset number of times.
In order to achieve the above object, an embodiment of the present invention provides an electronic device, including a processor, a communication interface, a memory, and a communication bus, where the processor, the communication interface, and the memory complete communication with each other through the communication bus;
a memory for storing a computer program;
and the processor is used for realizing any track determining method step when executing the program stored in the memory.
The embodiment of the invention has the beneficial effects that:
by adopting the method provided by the embodiment of the invention, the radar measurement data detected by the radar at intervals of a preset radar detection period and the vision measurement data detected by the vision sensor at intervals of a preset vision detection period are obtained; determining a radar track corresponding to each first detection target based on the radar measurement data, and determining a visual track corresponding to each second detection target based on the visual measurement data; converting the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track into the same coordinate system; and performing track point compensation on the radar track and/or the vision track based on a preset radar detection period and a preset vision detection period, a first detected time of each first detection target in the radar track, a second detected time of each second detection target in the vision track, a first movement speed of each first detection target in the radar track and a second movement speed of each second detection target in the vision track, so that the detection time of a track point in the radar track is the same as the detection time of a track point in the vision track; and fusing the track points with the same detection time in the radar track and the visual track to obtain a fused track. The fusion track is obtained by fusing the radar track and the visual track, so that the fusion of detection information of the radar and the visual sensor is realized, the obtained fusion track comprises the information detected by the radar and the information detected by the visual sensor, the application scene of target detection is expanded, and the detection precision is improved.
Of course, it is not necessary for any one product or method of practicing the invention to achieve all of the advantages set forth above at the same time.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flowchart of a track determination method according to an embodiment of the present invention;
FIG. 2 is another flow chart of a track determination method according to an embodiment of the present invention;
FIG. 3 is a flowchart of determining a radar track corresponding to each first detection target according to an embodiment of the present invention;
FIG. 4 is a flow chart of determining a visual track corresponding to each second detected object;
FIG. 5 is a flow chart of compensating for track points of a radar track and/or a visual track;
FIG. 6 is a schematic diagram of radar track point compensation for radar tracks;
FIG. 7 is a schematic diagram of visual track point compensation for a visual track;
FIG. 8 is a schematic diagram of radar track point compensation for radar tracks and visual track point compensation for visual tracks;
FIG. 9 is a schematic structural diagram of a track determining apparatus according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, fig. 1 is a flowchart of a track determining method according to an embodiment of the present invention, which may include the following steps:
step 101, radar measurement data detected by a radar at intervals of a preset radar detection period and vision measurement data detected by a vision sensor at intervals of a preset vision detection period are obtained.
The radar detection period may be set to 1 second, 2 seconds, 3 seconds, or the like, and is not particularly limited; the visual detection period may be set to 1 second, 2 seconds, 3 seconds, or the like, and is not particularly limited.
In the embodiment of the invention, radar measurement data comprise first position coordinates, first movement speed and first detected time of each first detection target detected by a radar in a radar coordinate system, wherein the first detected time is the time when the first detection target is detected by the radar; the vision measurement data comprise second position coordinates, second movement speed and second detected time of each second detection target detected by the vision sensor in a vision sensor coordinate system, wherein the second detected time is the time when the second detection target is detected by the vision sensor.
The coordinate origin of the radar coordinate system may be the barycentric coordinate point of the radar's installation position on the automobile. The coordinate origin of the vision coordinate system may be the barycenter of the vision sensor's photosensitive component, or the position of the contact point between the car body's left front wheel and the ground.
For example, if the radar detects a first detection target A, a first detection target B, and a first detection target C, the radar measurement data includes, as detected by the radar: the first position coordinate (x_A, y_A, z_A), first movement speed (v_xA, v_yA, v_zA), and first detected time t_A of the first detection target A; the first position coordinate (x_B, y_B, z_B), first movement speed (v_xB, v_yB, v_zB), and first detected time t_B of the first detection target B; and the first position coordinate (x_C, y_C, z_C), first movement speed (v_xC, v_yC, v_zC), and first detected time t_C of the first detection target C.
Here the first detected time t_A is the time at which the first detection target A is detected by the radar: if the radar detects the first detection target A at time t_A, that timestamp serves as the first detected time of the first detection target A. Likewise, the first detected time t_B is the time at which the first detection target B is detected by the radar, and the first detected time t_C is the time at which the first detection target C is detected by the radar.
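One plausible way to hold such radar measurement data in code (the structure and field names are an illustrative assumption, not prescribed by the patent):

```python
from typing import NamedTuple, Tuple

class RadarMeasurement(NamedTuple):
    """One first detection target's entry in the radar measurement data
    (field names are illustrative, not from the patent)."""
    position: Tuple[float, float, float]   # (x, y, z) in the radar frame
    velocity: Tuple[float, float, float]   # (vx, vy, vz)
    detected_at: float                     # first detected time

# Hypothetical readings for the three example targets A, B, and C.
radar_data = {
    "A": RadarMeasurement((10.0, 2.0, 0.0), (5.0, 0.0, 0.0), 0.0),
    "B": RadarMeasurement((20.0, -1.0, 0.0), (3.0, 0.5, 0.0), 0.0),
    "C": RadarMeasurement((35.0, 0.0, 0.0), (-2.0, 0.0, 0.0), 0.0),
}
```

The vision measurement data would take the same shape, with the second position coordinates expressed in the vision-sensor coordinate system.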
Step 102, determining a radar track corresponding to each first detection target based on the radar measurement data, wherein each radar track point in the radar track comprises: a first position coordinate, a first moving speed and a first detected moment of a corresponding first detected target; and determining, based on the vision measurement data, a vision track corresponding to each second detection target, each vision track point in the vision track comprising: the second position coordinates, the second movement speed and the second detected moment of the corresponding second detection target.
Step 103, converting the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track into the same coordinate system; and performing track point compensation on the radar track and/or the vision track based on a preset radar detection period and a preset vision detection period, a first detected time of each first detection target in the radar track, a second detected time of each second detection target in the vision track, a first movement speed of each first detection target in the radar track and a second movement speed of each second detection target in the vision track, so that the detection time of a track point in the radar track is the same as the detection time of a track point in the vision track.
And 104, fusing track points with the same detection time in the radar track and the visual track to obtain a fused track.
By adopting the method provided by the embodiment of the invention, the radar measurement data detected by the radar at intervals of a preset radar detection period and the vision measurement data detected by the vision sensor at intervals of a preset vision detection period are obtained; determining a radar track corresponding to each first detection target based on the radar measurement data, and determining a visual track corresponding to each second detection target based on the visual measurement data; converting the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track into the same coordinate system; and performing track point compensation on the radar track and/or the vision track based on a preset radar detection period and a preset vision detection period, a first detected time of each first detection target in the radar track, a second detected time of each second detection target in the vision track, a first movement speed of each first detection target in the radar track and a second movement speed of each second detection target in the vision track, so that the detection time of a track point in the radar track is the same as the detection time of a track point in the vision track; and fusing the track points with the same detection time in the radar track and the visual track to obtain a fused track. The fusion track is obtained by fusing the radar track and the visual track, so that the fusion of detection information of the radar and the visual sensor is realized, the obtained fusion track comprises the information detected by the radar and the information detected by the visual sensor, the application scene of target detection is expanded, and the detection precision is improved.
The method and apparatus provided by the present invention will now be described in detail with particular embodiments thereof, with reference to the accompanying drawings.
Referring to fig. 2, fig. 2 is a flowchart of a track determining method according to an embodiment of the present invention, which may include the following steps:
step 201, obtaining radar measurement data detected by the radar at intervals of a preset radar detection period, and vision measurement data detected by the vision sensor at intervals of a preset vision detection period.
Step 202, determining a radar track corresponding to each first detection target based on the radar measurement data.
Wherein the radar track may include one or more radar track points, each of the radar track points including: the first position coordinates, the first movement speed and the first detected moment of the corresponding first detected target.
Specifically, referring to fig. 3, fig. 3 is a flowchart of determining a radar track corresponding to each first detection target according to an embodiment of the present invention, which may include the following steps:
Step 301, create a radar track list for each radar track point in the radar measurement data whose first detected time is closest to the start time of the preset fusion period, so as to obtain one or more radar track lists.
The preset fusion period may be given a start time and an end time; for example, if the start time is 10:10:00 and the end time is 10:20:00, the preset fusion period is the period from 10:10:00 to 10:20:00, i.e. [10:10:00, 10:20:00].
For example, suppose the preset fusion period is [10:10:00, 10:20:00] and the radar measurement data obtained within it comprises radar track points P1, P2, P3, P4, P5 and P6, where each radar track point Pi (i = 1, ..., 6) has first position coordinates (x_Pi, y_Pi, z_Pi), first movement speed (v_xPi, v_yPi, v_zPi) and first detected time t_Pi.
If t_P1 equals t_P2, and t_P1 is closest to the start time 10:10:00 of the preset fusion period, a radar track list may be created for each of radar track point P1 and radar track point P2, yielding radar track list L1 and radar track list L2, which contain radar track point P1 and radar track point P2 respectively.
Step 302, for each remaining radar track point in the preset fusion period, calculate, for each created radar track list, the estimated position coordinates corresponding to that radar track point from the preset radar detection period and the first position coordinates of the radar track point in the created list whose first detected time is one preset radar detection period earlier.
For example, suppose radar track list 1 and radar track list 2 have already been created. Radar track list 1 contains radar track point A, with first position coordinates (x_A, y_A, z_A), first movement speed (v_xA, v_yA, v_zA) and first detected time t_A; radar track list 2 contains radar track point B, with first position coordinates (x_B, y_B, z_B), first movement speed (v_xB, v_yB, v_zB) and first detected time t_B. Suppose further that the remaining radar track points in the preset fusion period are radar track point C, with first position coordinates (x_C, y_C, z_C), first movement speed (v_xC, v_yC, v_zC) and first detected time t_C, and radar track point D, with first position coordinates (x_D, y_D, z_D), first movement speed (v_xD, v_yD, v_zD) and first detected time t_D.
In this step, let the preset radar detection period be T1. For radar track point C, if the radar track point in the created radar track list 1 whose first detected time is one T1 earlier is radar track point A, the estimated position coordinates corresponding to radar track point C may be calculated for radar track list 1 using the following formula:
(x'_C, y'_C, z'_C) = (x_A + v_xA·T1, y_A + v_yA·T1, z_A + v_zA·T1)
where (x'_C, y'_C, z'_C) are the estimated position coordinates of radar track point C relative to radar track list 1;
similarly, for radar track point C, if the radar track point in the created radar track list 2 whose first detected time is one T1 earlier is radar track point B, the estimated position coordinates corresponding to radar track point C may be calculated for radar track list 2 using the following formula:
(x''_C, y''_C, z''_C) = (x_B + v_xB·T1, y_B + v_yB·T1, z_B + v_zB·T1)
where (x''_C, y''_C, z''_C) are the estimated position coordinates of radar track point C relative to radar track list 2.
In the same way, the estimated position coordinates of the remaining radar track point D relative to radar track list 1 and relative to radar track list 2 within the preset fusion period can be calculated.
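The estimate in step 302 is a constant-velocity extrapolation: the list's point one radar detection period older is advanced by its velocity times the period. A hedged sketch (function name and numeric values are assumptions):

```python
def predict_position(pos, vel, period):
    """Constant-velocity extrapolation: (x + vx*T, y + vy*T, z + vz*T)."""
    return tuple(p + v * period for p, v in zip(pos, vel))

# Radar track point A from radar track list 1, with T1 = 0.1 s (assumed values)
pos_a, vel_a = (10.0, 2.0, 0.0), (5.0, 0.0, 0.0)
estimated_c = predict_position(pos_a, vel_a, 0.1)  # estimate for point C w.r.t. list 1
```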
In step 303, a coordinate difference between the first position coordinate of the radar track point and the estimated position coordinate corresponding to the radar track point is calculated.
For example, for radar track list 1, the coordinate difference between the first position coordinates of radar track point C and the estimated position coordinates of radar track point C relative to radar track list 1 is:
(Δx_C, Δy_C, Δz_C) = |(x'_C, y'_C, z'_C) − (x_C, y_C, z_C)|;
for example, for radar track list 2, the coordinate difference between the first position coordinates of radar track point C and the estimated position coordinates of radar track point C relative to radar track list 2 is:
(Δx'_C, Δy'_C, Δz'_C) = |(x''_C, y''_C, z''_C) − (x_C, y_C, z_C)|.
Step 304, if the coordinate difference is smaller than the preset difference threshold, add the radar track point to the created radar track list and increase the update weight of that list by 1; if the radar track point cannot be added to any created radar track list, create a new radar track list for it and decrease the update weight of the new list by 1.
Wherein, each radar track list comprises: one or more radar track points.
The preset difference threshold may be set according to practical application, and is not specifically limited.
For example, if, for radar track list 1, the coordinate difference (Δx_C, Δy_C, Δz_C) between the first position coordinates of radar track point C and the estimated position coordinates of radar track point C relative to radar track list 1 is smaller than the preset difference threshold, radar track point C is added to the created radar track list 1 and the update weight of radar track list 1 is increased by 1;
if that coordinate difference (Δx_C, Δy_C, Δz_C) is not smaller than the preset difference threshold, while the coordinate difference (Δx'_C, Δy'_C, Δz'_C) between the first position coordinates of radar track point C and the estimated position coordinates of radar track point C relative to radar track list 2 is smaller than the preset difference threshold, radar track point C is added to the created radar track list 2 and the update weight of radar track list 2 is increased by 1;
if radar track point C cannot be added to any created radar track list, a new radar track list 3 is created for radar track point C, and the update weight of the new radar track list 3 is decreased by 1.
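Steps 303 and 304 together amount to a gating-and-association rule: compare each remaining point against every list's estimate, then either extend a matching list (update weight + 1) or open a new list (update weight - 1). A condensed Python sketch under assumed data shapes (each track list is a dict with "points" and "weight"; none of these names come from the patent):

```python
def associate(point, track_lists, estimates, threshold):
    """Attach `point` to the first track list whose estimated position differs
    from it by less than `threshold` on every axis; otherwise open a new list.
    `estimates[i]` is the estimated position of `point` w.r.t. track_lists[i]."""
    for track_list, estimate in zip(track_lists, estimates):
        diff = [abs(e - p) for e, p in zip(estimate, point)]
        if all(d < threshold for d in diff):
            track_list["points"].append(point)
            track_list["weight"] += 1   # matched: update weight + 1
            return track_list
    new_list = {"points": [point], "weight": -1}  # unmatched: new list, weight - 1
    track_lists.append(new_list)
    return new_list

# One created list containing point A; its estimate for the new point C (assumed)
lists = [{"points": [(10.0, 2.0, 0.0)], "weight": 0}]
associate((10.4, 2.1, 0.0), lists, [(10.5, 2.0, 0.0)], threshold=0.5)
```

A point far from every estimate would instead create a new list whose update weight starts at -1, matching the rule of step 304.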
Step 305, for each radar track list, if the number of times of increasing the update weight of the radar track list within the preset duration is greater than or equal to the preset number of times, determining that the radar track list is the radar track corresponding to the first detection target.
Specifically, in this step, a sliding window method criterion may be used to determine a radar track corresponding to each first detected target. The preset sliding window method criterion comprises a preset time length m and a preset frequency n, wherein the preset time length and the preset frequency can be set according to practical application, and the preset frequency n is smaller than the preset time length m.
For example, for the radar track list 1, if the number of times of increasing the update weight of the radar track list 1 within the preset time period is greater than or equal to the preset number of times, determining that the radar track list 1 is the radar track corresponding to the first detection target; if the number of times of increasing the updated weight of the radar track list 1 within the preset duration is greater than 0 and less than the preset number of times, the radar track list 1 can be temporarily reserved; if the number of times of increasing the updated weight of the radar track list 1 is equal to 0 within the preset duration, deleting the radar track list 1;
likewise, for the radar track list 2, if the number of times of increasing the update weight of the radar track list 2 is greater than or equal to the preset number of times within the preset duration, determining that the radar track list 2 is the radar track corresponding to the first detection target; if the number of times of the update weight increase of the radar track list 2 is greater than 0 and less than the preset number of times within the preset duration, the radar track list 2 can be temporarily reserved; if the number of times the updated weight of the radar track list 2 increases within the preset time period is equal to 0, the radar track list 2 may be deleted.
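The retention rule of step 305 can be sketched as a sliding-window check: count how many times a list's update weight increased within the last m seconds, confirm the list as a track once the count reaches n, keep it while the count is positive, and delete it at zero. A hypothetical helper (names and the timestamp bookkeeping are assumptions, not the patent's code):

```python
def classify_track_list(increase_times, now, m, n):
    """Sliding-window criterion: `increase_times` are timestamps at which the
    list's update weight was increased. Within the last `m` seconds, confirm
    the list at >= n increases, keep it at 1..n-1, and delete it at 0."""
    recent = [t for t in increase_times if now - m <= t <= now]
    if len(recent) >= n:
        return "confirmed"  # the list is a radar track
    if recent:
        return "keep"       # temporarily retain the list
    return "delete"         # drop the list
```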
Step 203, based on the vision measurement data, a vision track corresponding to each second detection target is determined.
Wherein the visual track may include one or more visual track points, each of the visual track points including: the second position coordinates, the second movement speed and the second detected moment of the corresponding second detection target.
Specifically, referring to fig. 4, fig. 4 is a flowchart of determining a visual track corresponding to each second detection target according to an embodiment of the present invention, which may include the following steps:
Step 401, create a visual track list for each visual track point in the vision measurement data whose second detected time is closest to the start time of the preset fusion period, so as to obtain one or more visual track lists.
The preset fusion period may be given a start time and an end time; for example, if the start time is 10:10:00 and the end time is 10:20:00, the preset fusion period is the period from 10:10:00 to 10:20:00, i.e. [10:10:00, 10:20:00].
For example, suppose the preset fusion period is [10:10:00, 10:20:00] and the vision measurement data obtained within it comprises visual track points G1, G2, G3 and G4, where each visual track point Gi (i = 1, ..., 4) has second position coordinates (x_Gi, y_Gi, z_Gi), second movement speed (v_xGi, v_yGi, v_zGi) and second detected time t_Gi.
If t_G1 equals t_G2, and t_G1 is closest to the start time 10:10:00 of the preset fusion period, a visual track list may be created for each of visual track point G1 and visual track point G2, yielding visual track list L3 and visual track list L4, which contain visual track point G1 and visual track point G2 respectively.
Step 402, for each remaining visual track point in the preset fusion period, calculate, for each created visual track list, the estimated position coordinates corresponding to that visual track point from the preset visual detection period and the second position coordinates of the visual track point in the created list whose second detected time is one preset visual detection period earlier.
For example, suppose visual track list 3 and visual track list 4 have been created. Visual track list 3 contains visual track point E, with second position coordinates (x_E, y_E, z_E), second movement speed (v_xE, v_yE, v_zE) and second detected time t_E; visual track list 4 contains visual track point F, with second position coordinates (x_F, y_F, z_F), second movement speed (v_xF, v_yF, v_zF) and second detected time t_F. Suppose further that the remaining visual track points in the preset fusion period are visual track point J, with second position coordinates (x_J, y_J, z_J), second movement speed (v_xJ, v_yJ, v_zJ) and second detected time t_J, and visual track point H, with second position coordinates (x_H, y_H, z_H), second movement speed (v_xH, v_yH, v_zH) and second detected time t_H.
In this step, let the preset visual detection period be T2. For visual track point J, if the visual track point in the created visual track list 3 whose second detected time is one T2 earlier is visual track point E, the estimated position coordinates corresponding to visual track point J may be calculated for visual track list 3 using the following formula:
(x'_J, y'_J, z'_J) = (x_E + v_xE·T2, y_E + v_yE·T2, z_E + v_zE·T2)
where (x'_J, y'_J, z'_J) are the estimated position coordinates of visual track point J relative to visual track list 3;
similarly, for visual track point J, if the visual track point in the created visual track list 4 whose second detected time is one T2 earlier is visual track point F, the estimated position coordinates corresponding to visual track point J may be calculated for visual track list 4 using the following formula:
(x''_J, y''_J, z''_J) = (x_F + v_xF·T2, y_F + v_yF·T2, z_F + v_zF·T2)
where (x''_J, y''_J, z''_J) are the estimated position coordinates of visual track point J relative to visual track list 4.
In the same way, the estimated position coordinates of the remaining visual track point H relative to visual track list 3 and relative to visual track list 4 within the preset fusion period can be calculated.
Step 403, calculating a coordinate difference between the second position coordinate of the visual track point and the estimated position coordinate corresponding to the visual track point.
For example, for visual track list 3, the coordinate difference between the second position coordinates of visual track point J and the estimated position coordinates of visual track point J relative to visual track list 3 is:
(Δx_J, Δy_J, Δz_J) = |(x'_J, y'_J, z'_J) − (x_J, y_J, z_J)|;
for example, for visual track list 4, the coordinate difference between the second position coordinates of visual track point J and the estimated position coordinates of visual track point J relative to visual track list 4 is:
(Δx'_J, Δy'_J, Δz'_J) = |(x''_J, y''_J, z''_J) − (x_J, y_J, z_J)|.
Step 404, if the coordinate difference is smaller than the preset difference threshold, add the visual track point to the created visual track list and increase the update weight of that list by 1; if the visual track point cannot be added to any created visual track list, create a new visual track list for it and decrease the update weight of the new list by 1.
Wherein, each visual track list comprises: one or more visual track points.
The preset difference threshold may be set according to practical application, and is not specifically limited.
For example, if, for visual track list 3, the coordinate difference (Δx_J, Δy_J, Δz_J) between the second position coordinates of visual track point J and the estimated position coordinates of visual track point J relative to visual track list 3 is smaller than the preset difference threshold, visual track point J is added to the created visual track list 3 and the update weight of visual track list 3 is increased by 1;
if that coordinate difference (Δx_J, Δy_J, Δz_J) is not smaller than the preset difference threshold, while the coordinate difference (Δx'_J, Δy'_J, Δz'_J) between the second position coordinates of visual track point J and the estimated position coordinates of visual track point J relative to visual track list 4 is smaller than the preset difference threshold, visual track point J is added to the created visual track list 4 and the update weight of visual track list 4 is increased by 1;
if visual track point J cannot be added to any created visual track list, a new visual track list 5 is created for visual track point J, and the update weight of the new visual track list 5 is decreased by 1.
Step 405, for each visual track list, if the number of times of increasing the update weight of the visual track list within the preset duration is greater than or equal to the preset number of times, determining that the visual track list is the visual track corresponding to the second detection target.
Specifically, in this step, a sliding window method criterion may be used to determine a visual track corresponding to each second detection target. The preset sliding window method criterion comprises a preset time length m and a preset frequency n, wherein the preset time length and the preset frequency can be set according to practical application, and the preset frequency n is smaller than the preset time length m.
For example, for the visual track list 3, if the number of times of increasing the update weight of the visual track list 3 within the preset time period is greater than or equal to the preset number of times, determining that the visual track list 3 is the visual track corresponding to the second detection target; if the number of times of increasing the update weight of the visual track list 3 within the preset time period is greater than 0 and less than the preset number of times, the visual track list 3 can be temporarily reserved; if the number of times of increasing the update weight of the visual track list 3 within the preset duration is equal to 0, deleting the visual track list 3;
likewise, for the visual track list 4, if the number of times of increasing the update weight of the visual track list 4 within the preset time period is greater than or equal to the preset number of times, determining that the visual track list 4 is the visual track corresponding to the second detection target; if the number of times of increasing the update weight of the visual track list 4 within the preset time period is greater than 0 and less than the preset number of times, the visual track list 4 can be temporarily reserved; if the number of times the update weight of the visual track list 4 increases within the preset time period is equal to 0, the visual track list 4 may be deleted.
The execution sequence of step 202 and step 203 is not particularly limited.
Step 204, converting the second position coordinates of each second detection target in the visual track to new second position coordinates in the radar coordinate system by taking the radar coordinate system as a reference coordinate system.
In this step, the difference coordinates (Δx, Δy, Δz) of the coordinate origin of the vision coordinate system relative to the coordinate origin of the radar coordinate system may be calculated; adding these difference coordinates to the second position coordinates, in the vision coordinate system, of each second detection target in the visual track yields the new second position coordinates converted into the radar coordinate system.
In the embodiment of the invention, the difference coordinates (Δx', Δy', Δz') of the coordinate origin of the radar coordinate system relative to the coordinate origin of the vision coordinate system may likewise be calculated; adding these difference coordinates to the first position coordinates, in the radar coordinate system, of each first detection target in the radar track yields new first position coordinates converted into the vision coordinate system, thereby realizing the coordinate system conversion.
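Both directions of the conversion in step 204 are pure translations by the difference between the two coordinate origins. A minimal sketch (function name and the offset values are assumptions):

```python
def translate(pos, origin_diff):
    """Convert a point between coordinate systems by adding the difference
    (dx, dy, dz) between the two coordinate origins."""
    return tuple(p + d for p, d in zip(pos, origin_diff))

# Difference of the vision origin relative to the radar origin (assumed values)
delta = (1.5, 0.0, -0.5)
new_second_pos = translate((4.0, 2.0, 0.5), delta)          # vision -> radar
back = translate(new_second_pos, tuple(-d for d in delta))  # radar -> vision
```

Converting the other way simply uses the negated difference, as the round trip above illustrates.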
Step 205, performing track point compensation on the radar track and/or the vision track based on the preset radar detection period and the preset vision detection period, the first detected time of each first detection target in the radar track and the second detected time of each second detection target in the vision track, the first movement speed of each first detection target in the radar track and the second movement speed of each second detection target in the vision track, so that the detection time of the track point in the radar track and the detection time of the track point in the vision track are the same.
Referring to fig. 5, fig. 5 is a flowchart of compensating track points of a radar track and/or a visual track according to an embodiment of the present invention, including:
step 501, it is determined whether the radar detection period is a common multiple of the vision detection period, if so, step 502 is executed, and if not, step 503 is executed.
Step 502, if the radar detection period is a common multiple of the vision detection period, determining a second detected time different from the first detected time in the vision track as a compensation time; for each compensation moment, calculating a first position coordinate corresponding to the compensation moment based on a first position coordinate and a first movement speed of a previous radar track point adjacent to the compensation moment in the radar track and a time difference value between the compensation moment and a first detected moment of the radar track point, and compensating the first position coordinate corresponding to the compensation moment into one radar track point of the radar track.
For example, if the visual detection period is 1 second and the radar detection period is 3 seconds, it may be determined that the radar detection period is a common multiple of the visual detection period.
Referring to fig. 6, the second detected moments corresponding to the visual track points a1 to a10 included in the visual track a in fig. 6 are respectively: 10:10:00, 10:10:01, 10:10:02, 10:10:03, 10:10:04, 10:10:05, 10:10:06, 10:10:07, 10:10:08, and 10:10:09; the first detected moments corresponding to the radar track points b1 to b4 included in the radar track b in fig. 6 are respectively: 10:10:00, 10:10:03, 10:10:06, and 10:10:09.
Determining a second detected moment different from the first detected moment in the visual track a as a compensation moment: that is, it may be determined that the second detected time is 10:10:01, 10:10:02, 10:10:04, 10:10:05, 10:10:07, and 10:10:08 are compensation times.
For the compensation time 10:10:01, the following formula can be adopted to calculate the first position coordinate corresponding to the compensation time:
(x'_b, y'_b, z'_b) = (x_b1 + v_xb1·Δt, y_b1 + v_yb1·Δt, z_b1 + v_zb1·Δt)
where (x'_b, y'_b, z'_b) are the first position coordinates corresponding to the compensation time, i.e. the first position coordinates of the point b' compensated for that time; (x_b1, y_b1, z_b1) are the first position coordinates of the previous radar track point b1 adjacent to the compensation time in radar track b; Δt is the time difference between the compensation time and the first detected time of that previous radar track point; and (v_xb1, v_yb1, v_zb1) is the first movement speed of that previous radar track point.
In this step, the first position coordinate corresponding to each compensation time is compensated to be a radar track point of the radar track, see fig. 6, and a radar track b' after the radar track point is compensated is obtained.
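The compensation of step 502 can be sketched as filling the sparser radar track at the vision-only timestamps, each filled point extrapolated from the adjacent earlier radar point. A hedged Python sketch (the (t, pos, vel) tuple layout and the numbers are assumptions):

```python
import bisect

def compensate(track, compensation_times):
    """`track` is a time-sorted list of (t, pos, vel). For each compensation
    time, extrapolate from the previous adjacent point:
    pos' = pos + vel * (t_comp - t_point), and add the new point."""
    times = [t for t, _, _ in track]
    out = list(track)
    for tc in compensation_times:
        i = bisect.bisect_right(times, tc) - 1  # previous adjacent track point
        t0, pos, vel = track[i]
        dt = tc - t0
        out.append((tc, tuple(p + v * dt for p, v in zip(pos, vel)), vel))
    out.sort(key=lambda entry: entry[0])
    return out

# Radar points at t=0 s and t=3 s moving at 1 m/s along x (assumed values);
# compensate at the vision-only times t=1 s and t=2 s.
radar_track = [(0.0, (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)),
               (3.0, (3.0, 0.0, 0.0), (1.0, 0.0, 0.0))]
radar_track_b = compensate(radar_track, [1.0, 2.0])
```

After compensation the radar track has a point at every vision detection time, matching the compensated track b' of fig. 6.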
Step 503, determining whether the visual detection period is a common multiple of the radar detection period, if so, executing step 504, otherwise, executing step 505.
Step 504, determining a first detected moment different from a second detected moment in the radar track as a compensation moment; for each compensation moment, calculating a second position coordinate corresponding to the compensation moment based on a second position coordinate and a second movement speed of a previous visual track point adjacent to the compensation moment in the visual track and a time difference value between the compensation moment and a second detected moment of the visual track point, and compensating the second position coordinate corresponding to the compensation moment into one visual track point of the visual track.
For example, if the radar detection period is 1 second and the vision detection period is 2 seconds, it may be determined that the vision detection period is a common multiple of the radar detection period.
Referring to fig. 7, the first detected moments corresponding to the radar track points E1 to E5 included in the radar track E in fig. 7 are respectively: 10:10:00, 10:10:01, 10:10:02, 10:10:03, and 10:10:04; the second detected moments corresponding to the visual track points F1 to F3 included in the visual track F in fig. 7 are respectively: 10:10:00, 10:10:02, and 10:10:04.
Determining a first detected moment different from a second detected moment in the radar track E as a compensation moment: i.e. it can be determined that the first detected moment is 10:10:01 and 10:10:03 is the compensation moment.
For the compensation time 10:10:01, the second position coordinate corresponding to the compensation time may be calculated using the following formula:
(x'_f, y'_f, z'_f) = (x_f1 + v_xf1·Δt, y_f1 + v_yf1·Δt, z_f1 + v_zf1·Δt)
where (x'_f, y'_f, z'_f) are the second position coordinates corresponding to the compensation time, i.e. the second position coordinates of the point f' compensated for that time; (x_f1, y_f1, z_f1) are the second position coordinates of the previous visual track point F1 adjacent to the compensation time in visual track F; Δt is the time difference between the compensation time and the second detected time of that previous visual track point; and (v_xf1, v_yf1, v_zf1) is the second movement speed of that previous visual track point.
In this step, the second position coordinate corresponding to each compensation time is compensated to be a visual track point of the visual track, see fig. 7, and a visual track F' after the visual track point is compensated is obtained.
Step 505, determining a second detected moment different from the first detected moment in the visual track as a first compensation moment; for each first compensation moment, calculating a first position coordinate corresponding to the first compensation moment based on a first position coordinate and a first movement speed of a previous radar track point adjacent to the first compensation moment in the radar track and a time difference value between the first compensation moment and a first detected moment of the radar track point, and compensating the first position coordinate corresponding to the first compensation moment into one radar track point of the radar track.
Step 506, determining a first detected time different from a second detected time in the radar track as a second compensation time; and calculating a second position coordinate corresponding to each second compensation moment based on a second position coordinate and a second movement speed of a previous visual track point adjacent to the second compensation moment in the visual track and a time difference value between the second compensation moment and a second detected moment of the visual track point, and compensating the second position coordinate corresponding to the second compensation moment into one visual track point of the visual track.
For example, if the visual detection period is 2 seconds and the radar detection period is 3 seconds, it may be determined that the radar detection period is not a common multiple of the visual detection period, and the visual detection period is not a common multiple of the radar detection period.
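This period test selects which compensation branch applies; a minimal sketch, assuming integer-second detection periods (the helper name and return labels are illustrative, not from the patent):

```python
# Hypothetical helper: decide which track(s) need compensated points by
# testing whether one detection period is an integer multiple of the other.
def compensation_branch(radar_period: int, vision_period: int) -> str:
    if radar_period % vision_period == 0:
        # Radar period is a common multiple of the vision period:
        # radar track points are compensated at the vision detection moments.
        return "compensate radar track"
    if vision_period % radar_period == 0:
        # Vision period is a common multiple of the radar period:
        # vision track points are compensated at the radar detection moments.
        return "compensate vision track"
    # Neither period divides the other: both tracks are compensated.
    return "compensate both tracks"

print(compensation_branch(3, 2))  # periods from the example: both tracks
```

With a 3-second radar period and a 2-second vision period, neither divides the other, so the branch described in steps 505 and 506 applies.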
Referring to fig. 8, the second detected moments corresponding to the visual track points P1 to P6 included in the visual track P in fig. 8 are respectively: 10:10:00, 10:10:02, 10:10:04, 10:10:06, 10:10:08, and 10:10:10; the first detected timings corresponding to the radar track points Q1 to Q4 included in the radar track Q in fig. 8 are respectively: 10:10:00, 10:10:03, 10:10:06, and 10:10:09.
The second detected moments in the visual track P that differ from the first detected moments are determined as first compensation moments: that is, the second detected moments 10:10:02, 10:10:04, 10:10:08, and 10:10:10 are the first compensation moments.
For each first compensation time, the corresponding first position coordinate may be calculated using the following formula:
(x'q, y'q, z'q) = ((xq + vxq × Δt), (yq + vyq × Δt), (zq + vzq × Δt))
wherein (x'q, y'q, z'q) is the first position coordinate corresponding to each first compensation moment, namely the first position coordinate of the compensated point q'; for each first compensation moment, (xq, yq, zq) is the first position coordinate of the previous radar track point adjacent to that first compensation moment in the radar track Q; Δt is the time difference between the first compensation moment and the first detected moment of that previous radar track point; and (vxq, vyq, vzq) is the first movement speed of that previous radar track point.
In this step, the first position coordinate corresponding to each first compensation time is compensated into the radar track as a radar track point; see fig. 8 for the radar track Q' obtained after the radar track points are compensated.
Referring to fig. 8, the first detected moments in the radar track Q that differ from the second detected moments are determined as second compensation moments: that is, the first detected moments 10:10:03 and 10:10:09 are the second compensation moments.
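The two sets of compensation moments are simply the set differences of the two tracks' detection timestamps. A sketch reconstructing the fig. 8 example (the variable names and date are assumptions):

```python
from datetime import datetime

# Detection timestamps of visual track P (2 s period) and radar track Q (3 s period),
# as in the fig. 8 example (the date itself is arbitrary).
def ts(second):
    return datetime(2020, 12, 31, 10, 10, second)

vision_times = [ts(s) for s in (0, 2, 4, 6, 8, 10)]  # P1..P6
radar_times = [ts(s) for s in (0, 3, 6, 9)]          # Q1..Q4

# Second detected moments absent from the radar track -> first compensation moments.
first_comp = sorted(set(vision_times) - set(radar_times))
# First detected moments absent from the visual track -> second compensation moments.
second_comp = sorted(set(radar_times) - set(vision_times))

print([x.second for x in first_comp])   # [2, 4, 8, 10]
print([x.second for x in second_comp])  # [3, 9]
```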
For each second compensation time, the corresponding second position coordinate may be calculated using the following formula:
(x'p, y'p, z'p) = ((xp + vxp × Δt), (yp + vyp × Δt), (zp + vzp × Δt))
wherein (x'p, y'p, z'p) is the second position coordinate corresponding to each second compensation moment, namely the second position coordinate of the compensated point p'; for each second compensation moment, (xp, yp, zp) is the second position coordinate of the previous visual track point adjacent to that second compensation moment in the visual track P; Δt is the time difference between the second compensation moment and the second detected moment of that previous visual track point; and (vxp, vyp, vzp) is the second movement speed of that previous visual track point.
In this step, the second position coordinate corresponding to each second compensation time is compensated into the visual track as a visual track point; see fig. 8 for the visual track P' obtained after the visual track points are compensated.
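Both compensation formulas share the same linear-extrapolation form; a minimal sketch (the function and variable names are assumptions):

```python
# Extrapolate the previous track point linearly over the time gap dt:
# (x + vx*dt, y + vy*dt, z + vz*dt), matching the formulas above.
def extrapolate(prev_pos, prev_vel, dt):
    x, y, z = prev_pos
    vx, vy, vz = prev_vel
    return (x + vx * dt, y + vy * dt, z + vz * dt)

# e.g. compensating a point at 10:10:03 from a track point at 10:10:02
# (dt = 1 s) for a target moving at 2 m/s along x:
print(extrapolate((10.0, 5.0, 0.0), (2.0, 0.0, 0.0), 1.0))  # (12.0, 5.0, 0.0)
```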
Step 206, fusing track points with the same detection time in the radar track and the visual track to obtain a fused track.
In this step, for a radar track point and a visual track point whose first detected time and second detected time are the same, the difference between the first position coordinate of the radar track point and the second position coordinate of the visual track point may be calculated; if the difference is smaller than a preset difference threshold, the radar track point or the visual track point is determined to be a fusion track point. If the ratio of the number of the obtained fusion track points to the number of radar track points in the radar track reaches a preset ratio threshold, and the ratio of the number of the obtained fusion track points to the number of visual track points in the visual track also reaches the preset ratio threshold, the track formed by the determined fusion track points is taken as the fused track.
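The fusion rule just described can be sketched as follows; the dictionary layout, threshold values, and function name are assumptions rather than the patent's actual implementation:

```python
import math

def fuse_tracks(radar, vision, diff_thresh=1.0, ratio_thresh=0.8):
    """radar/vision: {detected moment: (x, y, z)} in a common coordinate system."""
    fused = {}
    for moment in radar.keys() & vision.keys():      # same detection moment
        # Position difference between the radar and visual track points.
        if math.dist(radar[moment], vision[moment]) < diff_thresh:
            fused[moment] = radar[moment]            # either point may be kept
    # Accept the fused track only if enough points matched on both sides.
    if (len(fused) / len(radar) >= ratio_thresh
            and len(fused) / len(vision) >= ratio_thresh):
        return fused
    return None

radar = {0: (0.0, 0.0, 0.0), 3: (3.0, 0.0, 0.0)}
vision = {0: (0.1, 0.0, 0.0), 3: (2.9, 0.1, 0.0)}
print(len(fuse_tracks(radar, vision)))  # 2
```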
Step 207, identifying each detection target based on the fused track, the radar measurement data, and/or the vision measurement data.
In an embodiment of the present invention, the radar measurement data may further include image information of the first detection target, and the vision measurement data may further include image information of the second detection target.
In this step, the detection target may be identified based on the determined fusion track, the image information of the first detection target, and/or the image information of the second detection target.
By adopting the method provided by the embodiment of the present invention, the fused track is obtained by fusing the radar track and the visual track, so that the obtained fused track includes both the information detected by the radar and the information detected by the vision sensor, which expands the application scenarios of target detection and improves the detection accuracy.
Referring to fig. 9, based on the same inventive concept, according to the track determining method provided in the above embodiment of the present invention, correspondingly, another embodiment of the present invention further provides a track determining apparatus, which specifically includes:
the measurement data obtaining module 901 is configured to obtain radar measurement data detected by the radar at intervals of a preset radar detection period, and vision measurement data detected by the vision sensor at intervals of a preset vision detection period; the radar measurement data comprise first position coordinates, first movement speed and first detected time of each first detected target detected by the radar in a radar coordinate system, wherein the first detected time is the time when the first detected target is detected by the radar; the vision measurement data comprise second position coordinates, second movement speed and second detected time of each second detection target detected by the vision sensor in a vision sensor coordinate system, wherein the second detected time is the time when the second detection target is detected;
A track determining module 902, configured to determine, based on radar measurement data, a radar track corresponding to each first detected target, where each radar track point in the radar track includes: a first position coordinate, a first moving speed and a first detected moment of a corresponding first detected target; and determining, based on the vision measurement data, a vision track corresponding to each second detection target, each vision track point in the vision track comprising: a second position coordinate, a second movement speed and a second detected moment of the corresponding second detection target;
the data conversion module 903 is configured to convert the first position coordinate of each first detected target in the radar track and the second position coordinate of each second detected target in the visual track to the same coordinate system; and performing track point compensation on the radar track and/or the vision track based on the preset radar detection period and the preset vision detection period, the first detected moment of each first detection target in the radar track, the second detected moment of each second detection target in the vision track, the first movement speed of each first detection target in the radar track and the second movement speed of each second detection target in the vision track, so that the detection moment of a track point in the radar track is the same as the detection moment of a track point in the vision track;
And the track fusion module 904 is configured to fuse track points with the same detection time in the radar track and the visual track to obtain a fused track.
Therefore, by adopting the device provided by the embodiment of the invention, the radar measurement data detected by the radar at intervals of the preset radar detection period and the vision measurement data detected by the vision sensor at intervals of the preset vision detection period are obtained; determining a radar track corresponding to each first detection target based on the radar measurement data, and determining a visual track corresponding to each second detection target based on the visual measurement data; converting the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track into the same coordinate system; and performing track point compensation on the radar track and/or the vision track based on a preset radar detection period and a preset vision detection period, a first detected time of each first detection target in the radar track, a second detected time of each second detection target in the vision track, a first movement speed of each first detection target in the radar track and a second movement speed of each second detection target in the vision track, so that the detection time of a track point in the radar track is the same as the detection time of a track point in the vision track; and fusing the track points with the same detection time in the radar track and the visual track to obtain a fused track. The fusion track is obtained by fusing the radar track and the visual track, so that the fusion of detection information of the radar and the visual sensor is realized, the obtained fusion track comprises the information detected by the radar and the information detected by the visual sensor, the application scene of target detection is expanded, and the detection precision is improved.
Further, the track determining module 902 includes:
the track list creation sub-module is used for respectively creating a radar track list aiming at each radar track point with the first detected moment closest to the starting moment of the preset fusion period in the radar measurement data to obtain one or more radar track lists;
the position coordinate estimation sub-module is used for calculating the estimated position coordinate corresponding to each radar track point according to the preset radar detection period and the first position coordinate of the radar track point, which is one preset radar detection period earlier than the radar track point, in the created radar track list for each radar track point remaining in the preset fusion period and for each created radar track list;
the coordinate difference value calculation sub-module is used for calculating the coordinate difference value between the first position coordinate of the radar track point and the estimated position coordinate corresponding to the radar track point; if the coordinate difference is smaller than a preset difference threshold, adding the radar track point into the created radar track list, and adding 1 to the updated weight of the created radar track list; if the radar track point cannot be added into any established radar track list, establishing a new radar track list for the radar track point, and subtracting 1 from the update weight of the new radar track list; wherein, each radar track list comprises: one or more radar track points;
The track determining sub-module is used for, for each radar track list, determining the radar track list to be the radar track corresponding to the first detection target if the number of times the update weight of the radar track list is increased within a preset duration is greater than or equal to a preset number of times.
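The list bookkeeping performed by these sub-modules can be sketched as a single association routine; the dictionary fields, the starting weight of a new list, and the gate threshold are assumptions:

```python
# One association step: each track list keeps its points and an update weight.
# A point is (moment, position, velocity); position/velocity are 3-tuples.
def associate(track_lists, point, period, diff_thresh=1.0):
    _, pos, _ = point
    for tl in track_lists:
        # Estimated coordinate from the point one detection period earlier.
        _, prev_pos, prev_vel = tl["points"][-1]
        est = tuple(p + v * period for p, v in zip(prev_pos, prev_vel))
        if max(abs(a - b) for a, b in zip(pos, est)) < diff_thresh:
            tl["points"].append(point)
            tl["weight"] += 1            # coordinate difference small: weight +1
            return tl
    # No existing list accepts the point: create a new list, weight minus 1.
    new_tl = {"points": [point], "weight": -1}
    track_lists.append(new_tl)
    return new_tl

lists = []
associate(lists, (0.0, (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)), period=1.0)
associate(lists, (1.0, (1.0, 0.0, 0.0), (1.0, 0.0, 0.0)), period=1.0)
print(len(lists), lists[0]["weight"])  # prints "1 0"
```

The second point lands exactly on the position predicted from the first point's velocity, so it joins the same list and the list's weight rises from -1 back to 0.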
By adopting the device provided by the embodiment of the present invention, the fused track is obtained by fusing the radar track and the visual track, so that the obtained fused track includes both the information detected by the radar and the information detected by the vision sensor, which expands the application scenarios of target detection and improves the detection accuracy.
The embodiment of the invention also provides an electronic device, as shown in fig. 10, which comprises a processor 1001, a communication interface 1002, a memory 1003 and a communication bus 1004, wherein the processor 1001, the communication interface 1002 and the memory 1003 complete communication with each other through the communication bus 1004,
a memory 1003 for storing a computer program;
the processor 1001 is configured to execute a program stored in the memory 1003, and implement the following steps:
the method comprises the steps of obtaining radar measurement data detected by a radar at intervals of a preset radar detection period and vision measurement data detected by a vision sensor at intervals of a preset vision detection period; the radar measurement data comprise first position coordinates, first movement speed and first detected time of each first detected target detected by the radar in a radar coordinate system, wherein the first detected time is the time when the first detected target is detected by the radar; the vision measurement data comprise second position coordinates, second movement speed and second detected time of each second detection target detected by the vision sensor in a vision sensor coordinate system, wherein the second detected time is the time when the second detection target is detected;
Based on the radar measurement data, determining a radar track corresponding to each first detection target, wherein each radar track point in the radar track comprises: a first position coordinate, a first moving speed and a first detected moment of a corresponding first detected target; and determining, based on the vision measurement data, a vision track corresponding to each second detection target, each vision track point in the vision track comprising: a second position coordinate, a second movement speed and a second detected moment of the corresponding second detection target;
converting the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track into the same coordinate system; and performing track point compensation on the radar track and/or the vision track based on the preset radar detection period and the preset vision detection period, the first detected moment of each first detection target in the radar track, the second detected moment of each second detection target in the vision track, the first movement speed of each first detection target in the radar track and the second movement speed of each second detection target in the vision track, so that the detection moment of a track point in the radar track is the same as the detection moment of a track point in the vision track;
And fusing the track points with the same detection time in the radar track and the visual track to obtain a fused track.
The communication bus mentioned above for the electronic device may be a peripheral component interconnect (Peripheral Component Interconnect, PCI) bus, an extended industry standard architecture (Extended Industry Standard Architecture, EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one bold line is shown in the figure, but this does not mean that there is only one bus or only one type of bus.
The communication interface is used for communication between the electronic device and other devices.
The memory may include a random access memory (Random Access Memory, RAM), or may include a non-volatile memory (Non-Volatile Memory, NVM), such as at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), etc.; but also digital signal processors (Digital Signal Processing, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk (SSD)), etc.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In this specification, each embodiment is described in a related manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for the apparatus, electronic device and storage medium embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and references to the parts of the description of the method embodiments are only needed. The foregoing description is only of the preferred embodiments of the present invention and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention are included in the protection scope of the present invention.

Claims (10)

1. A track determination method, comprising:
the method comprises the steps of obtaining radar measurement data detected by a radar at intervals of a preset radar detection period and vision measurement data detected by a vision sensor at intervals of a preset vision detection period; the radar measurement data comprise first position coordinates, first movement speed and first detected time of each first detected target detected by the radar in a radar coordinate system, wherein the first detected time is the time when the first detected target is detected by the radar; the vision measurement data comprise second position coordinates, second movement speed and second detected time of each second detection target detected by the vision sensor in a vision sensor coordinate system, wherein the second detected time is the time when the second detection target is detected;
based on the radar measurement data, determining a radar track corresponding to each first detection target, wherein each radar track point in the radar track comprises: a first position coordinate, a first moving speed and a first detected moment of a corresponding first detected target; and determining, based on the vision measurement data, a vision track corresponding to each second detection target, each vision track point in the vision track comprising: a second position coordinate, a second movement speed and a second detected moment of the corresponding second detection target;
Converting the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track into the same coordinate system; and performing track point compensation on the radar track and/or the vision track based on the preset radar detection period and the preset vision detection period, the first detected moment of each first detection target in the radar track, the second detected moment of each second detection target in the vision track, the first movement speed of each first detection target in the radar track and the second movement speed of each second detection target in the vision track, so that the detection moment of a track point in the radar track is the same as the detection moment of a track point in the vision track;
and fusing the track points with the same detection time in the radar track and the visual track to obtain a fused track.
2. The method of claim 1, wherein determining a radar track for each first detected target based on radar measurement data comprises:
respectively creating a radar track list aiming at each radar track point in radar measurement data, wherein the first detected moment is closest to the starting moment of a preset fusion period, so as to obtain one or more radar track lists;
Aiming at each remaining radar track point in a preset fusion period, calculating an estimated position coordinate corresponding to the radar track point according to a preset radar detection period and a first position coordinate of the radar track point, which is one preset radar detection period earlier than the radar track point, in the created radar track list for each created radar track list;
calculating a coordinate difference between a first position coordinate of the radar track point and an estimated position coordinate corresponding to the radar track point;
if the coordinate difference is smaller than a preset difference threshold, adding the radar track point into the created radar track list, and adding 1 to the updated weight of the created radar track list; if the radar track point cannot be added into any established radar track list, establishing a new radar track list for the radar track point, and subtracting 1 from the update weight of the new radar track list; wherein, each radar track list comprises: one or more radar track points;
and aiming at each radar track list, if the number of times of increasing the updating weight of the radar track list within the preset time is greater than or equal to the preset number of times, determining the radar track list as the radar track corresponding to the first detection target.
3. The method of claim 1, wherein determining a visual track corresponding to each second probe object based on the vision metric data comprises:
respectively creating a visual track list aiming at each visual track point in the visual measurement data, wherein the second detected moment is closest to the starting moment of the preset fusion period, so as to obtain one or more visual track lists;
aiming at each visual track point remained in a preset fusion period, calculating an estimated position coordinate corresponding to the visual track point according to a preset visual detection period and a second position coordinate of the visual track point, which is earlier than the visual track point by one preset visual detection period, in the second detected moment in the created visual track list for each created visual track list;
calculating a coordinate difference between the second position coordinate of the visual track point and the estimated position coordinate corresponding to the visual track point;
if the coordinate difference is smaller than a preset difference threshold, adding the visual track point into the created visual track list, and adding 1 to the updated weight of the created visual track list; if the visual track point cannot be added into any created visual track list, creating a new visual track list for the visual track point, and subtracting 1 from the updated weight of the new visual track list; wherein, each visual track list comprises: one or more visual track points;
And aiming at each visual track list, if the number of times of increasing the updating weight of the visual track list within the preset time is greater than or equal to the preset number of times, determining the visual track list as the visual track corresponding to the second detection target.
4. The method of claim 1, wherein converting the first position coordinates of each first detected object in the radar track and the second position coordinates of each second detected object in the vision track to the same coordinate system comprises:
and converting the second position coordinates of each second detection target in the visual track to new second position coordinates under the radar coordinate system by taking the radar coordinate system as a reference coordinate system.
5. The method according to claim 1, wherein performing track point compensation on the radar track and/or the vision track based on the preset radar detection period and the preset vision detection period, the first detected time of each first detected target in the radar track and the second detected time of each second detected target in the vision track, the first movement speed of each first detected target in the radar track and the second movement speed of each second detected target in the vision track so that the detection time of the track point in the radar track and the detection time of the track point in the vision track are the same, comprises:
If the radar detection period is a common multiple of the vision detection period, determining a second detected time different from the first detected time in the vision track as a compensation time;
for each compensation moment, calculating a first position coordinate corresponding to the compensation moment based on a first position coordinate and a first movement speed of a previous radar track point adjacent to the compensation moment in the radar track and a time difference value between the compensation moment and a first detected moment of the radar track point, and compensating the first position coordinate corresponding to the compensation moment into one radar track point of the radar track.
6. The method according to claim 1, wherein performing track point compensation on the radar track and/or the vision track based on the preset radar detection period and the preset vision detection period, the first detected time of each first detected target in the radar track and the second detected time of each second detected target in the vision track, the first movement speed of each first detected target in the radar track and the second movement speed of each second detected target in the vision track so that the detection time of the track point in the radar track and the detection time of the track point in the vision track are the same, comprises:
If the vision detection period is a common multiple of the radar detection period, determining a first detected time different from a second detected time in the radar track as a compensation time;
for each compensation moment, calculating a second position coordinate corresponding to the compensation moment based on a second position coordinate and a second movement speed of a previous visual track point adjacent to the compensation moment in the visual track and a time difference value between the compensation moment and a second detected moment of the visual track point, and compensating the second position coordinate corresponding to the compensation moment into one visual track point of the visual track.
7. The method according to claim 1, wherein performing track point compensation on the radar track and/or the vision track based on the preset radar detection period and the preset vision detection period, the first detected time of each first detected target in the radar track and the second detected time of each second detected target in the vision track, the first movement speed of each first detected target in the radar track and the second movement speed of each second detected target in the vision track so that the detection time of the track point in the radar track and the detection time of the track point in the vision track are the same, comprises:
If the radar detection period is not a common multiple of the vision detection period and the vision detection period is not a common multiple of the radar detection period, determining a second detected moment different from the first detected moment in the vision track as a first compensation moment; and determining a first detected time in the radar track that is different from the second detected time as a second compensation time;
for each first compensation moment, calculating a first position coordinate corresponding to the first compensation moment based on a first position coordinate and a first movement speed of a previous radar track point adjacent to the first compensation moment in the radar track and a time difference value between the first compensation moment and a first detected moment of the radar track point, and compensating the first position coordinate corresponding to the first compensation moment into one radar track point of the radar track;
and calculating a second position coordinate corresponding to each second compensation moment based on a second position coordinate and a second movement speed of a previous visual track point adjacent to the second compensation moment in the visual track and a time difference value between the second compensation moment and a second detected moment of the visual track point, and compensating the second position coordinate corresponding to the second compensation moment into one visual track point of the visual track.
8. The method according to claim 1, wherein after the fusing the track points with the same detection time in the radar track and the visual track to obtain a fused track, further comprising:
and identifying each detection target based on the fusion track, radar measurement data and/or vision measurement data.
9. A track determining device, comprising:
the measuring data acquisition module is used for acquiring radar measuring data detected by the radar at intervals of a preset radar detection period and vision measuring data detected by the vision sensor at intervals of a preset vision detection period; the radar measurement data comprise first position coordinates, first movement speed and first detected time of each first detected target detected by the radar in a radar coordinate system, wherein the first detected time is the time when the first detected target is detected by the radar; the vision measurement data comprise second position coordinates, second movement speed and second detected time of each second detection target detected by the vision sensor in a vision sensor coordinate system, wherein the second detected time is the time when the second detection target is detected;
the track determining module is configured to determine, based on the radar measurement data, a radar track corresponding to each first detection target, where each radar track point in the radar track includes: a first position coordinate, a first movement speed and a first detected moment of a corresponding first detection target; and determine, based on the vision measurement data, a visual track corresponding to each second detection target, each visual track point in the visual track comprising: a second position coordinate, a second movement speed and a second detected moment of the corresponding second detection target;
the data conversion module is used for converting the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track into the same coordinate system; and performing track point compensation on the radar track and/or the vision track based on the preset radar detection period and the preset vision detection period, the first detected moment of each first detection target in the radar track, the second detected moment of each second detection target in the vision track, the first movement speed of each first detection target in the radar track and the second movement speed of each second detection target in the vision track, so that the detection moment of a track point in the radar track is the same as the detection moment of a track point in the vision track;
and the track fusion module is used for fusing the track points with the same detection moment in the radar track and the visual track to obtain a fused track.
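The claims leave the point-level fusion rule open; a common choice is a weighted average of the radar and vision positions that share a detection moment. A hedged sketch of that approach — the weights `w_radar`/`w_vision` and the tuple layout are illustrative assumptions, not specified by the patent:

```python
def fuse_tracks(radar, vision, w_radar=0.6, w_vision=0.4, tol=1e-6):
    """Fuse (t, x, y) track points that share the same detection moment.

    radar, vision: lists of (t, x, y) tuples, already expressed in the same
    coordinate system and compensated to common detection moments.
    Returns the fused track as a list of (t, x, y) tuples.
    """
    # Index vision points by quantized timestamp so float moments compare safely.
    vision_by_t = {round(t / tol): (x, y) for t, x, y in vision}
    fused = []
    for t, x, y in radar:
        key = round(t / tol)
        if key in vision_by_t:  # same detection moment present in both tracks
            vx_, vy_ = vision_by_t[key]
            fused.append((t,
                          w_radar * x + w_vision * vx_,
                          w_radar * y + w_vision * vy_))
    return fused

# A radar point and a vision point at the same moment t=0.0:
track = fuse_tracks([(0.0, 10.0, 0.0)], [(0.0, 12.0, 2.0)])
```

In practice the weights would typically reflect per-sensor measurement uncertainty (e.g. via a Kalman-style update) rather than fixed constants; the fixed-weight average above only illustrates the pairing of same-moment track points that the claim describes.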
10. An electronic device, comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other through the communication bus;
a memory for storing a computer program;
the processor being configured to carry out the method steps of any one of claims 1-8 when executing the program stored in the memory.
CN202011622460.XA 2020-12-31 2020-12-31 Track determining method and device and electronic equipment Active CN112816975B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011622460.XA CN112816975B (en) 2020-12-31 2020-12-31 Track determining method and device and electronic equipment


Publications (2)

Publication Number Publication Date
CN112816975A CN112816975A (en) 2021-05-18
CN112816975B true CN112816975B (en) 2024-03-15

Family

ID=75854923

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011622460.XA Active CN112816975B (en) 2020-12-31 2020-12-31 Track determining method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN112816975B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113342057B (en) * 2021-08-05 2021-12-03 特金智能科技(上海)有限公司 Track fusion method and device, unmanned aerial vehicle detection system, equipment and medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109116349A (en) * 2018-07-26 2019-01-01 西南电子技术研究所(中国电子科技集团公司第十研究所) Multi-sensor cooperation tracks combined optimization decision-making technique
CN110929810A (en) * 2019-12-27 2020-03-27 航天南湖电子信息技术股份有限公司 Multi-source data fusion method for low-speed small-target detection system
CN111967498A (en) * 2020-07-20 2020-11-20 重庆大学 Night target detection and tracking method based on millimeter wave radar and vision fusion


Also Published As

Publication number Publication date
CN112816975A (en) 2021-05-18

Similar Documents

Publication Publication Date Title
CN109212521B (en) Target tracking method based on fusion of forward-looking camera and millimeter wave radar
US9127956B2 (en) Technique for lane assignment in a vehicle
JP6714513B2 (en) An in-vehicle device that informs the navigation module of the vehicle of the presence of an object
US9002631B2 (en) Vicinity environment estimation device with blind region prediction, road detection and intervehicle communication
WO2020232648A1 (en) Lane line detection method, electronic device and storage medium
US11935250B2 (en) Method, device and computer-readable storage medium with instructions for processing sensor data
JP7056842B2 (en) State estimator and program
CN112083441B (en) Obstacle detection method and system for depth fusion of laser radar and millimeter wave radar
CN111220998B (en) Multi-target cooperative tracking method based on vehicle-to-vehicle communication
US11748593B2 (en) Sensor fusion target prediction device and method for vehicles and vehicle including the device
CN112084810A (en) Obstacle detection method and device, electronic equipment and storage medium
CN112816975B (en) Track determining method and device and electronic equipment
JP6951728B2 (en) Object tracking device
Diehl et al. Radar-based dynamic occupancy grid mapping and object detection
CN112835030A (en) Data fusion method and device for obstacle target and intelligent automobile
US20200398797A1 (en) Vehicle sensor enhancements
CN113269811A (en) Data fusion method and device and electronic equipment
JP2018055451A (en) Other lane monitoring apparatus
CN110986966B (en) Automatic driving positioning method and system for long-distance tunnel
CN115856872A (en) Vehicle motion track continuous tracking method
Cao et al. A geometry-driven car-following distance estimation algorithm robust to road slopes
Yu et al. Camera-radar data fusion for target detection via Kalman filter and Bayesian estimation
CN112598715A (en) Multi-sensor-based multi-target tracking method, system and computer readable medium
CN111612818A (en) Novel binocular vision multi-target tracking method and system
CN115588184A (en) Method and device for detecting target running device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220322

Address after: 430051 No. b1336, chuanggu startup area, taizihu cultural Digital Creative Industry Park, No. 18, Shenlong Avenue, Wuhan Economic and Technological Development Zone, Hubei Province

Applicant after: Yikatong (Hubei) Technology Co.,Ltd.

Address before: No.c101, chuanggu start up zone, taizihu cultural Digital Industrial Park, No.18 Shenlong Avenue, Wuhan Economic and Technological Development Zone, Hubei Province

Applicant before: HUBEI ECARX TECHNOLOGY Co.,Ltd.

GR01 Patent grant