CN112816975A - Flight path determining method and device and electronic equipment


Info

Publication number
CN112816975A
Authority
CN
China
Prior art keywords
track
radar
visual
detected
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011622460.XA
Other languages
Chinese (zh)
Other versions
CN112816975B (en)
Inventor
徐宁 (Xu Ning)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ecarx Hubei Tech Co Ltd
Original Assignee
Hubei Ecarx Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hubei Ecarx Technology Co Ltd
Priority to CN202011622460.XA
Publication of CN112816975A
Application granted
Publication of CN112816975B
Legal status: Active (granted)


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867: Combination of radar systems with cameras
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00: Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The embodiment of the invention provides a flight path determining method and device and an electronic device. The method comprises: acquiring radar measurement data detected by a radar every preset radar detection period and vision measurement data detected by a vision sensor every preset vision detection period; determining a radar track corresponding to each first detection target; determining a visual track corresponding to each second detection target; converting the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track into the same coordinate system; performing track point compensation on the radar track and/or the visual track so that the detection times of the track points in the radar track are the same as those in the visual track; and fusing track points with the same detection time in the radar track and the visual track to obtain a fused track. By adopting the method, the application scenarios of target detection are broadened and the detection precision is improved.

Description

Flight path determining method and device and electronic equipment
Technical Field
The invention relates to the technical field of automobile road detection, and in particular to a track determining method and device and an electronic device.
Background
ADAS (advanced driver assistance system) is a system that uses various sensors mounted on an automobile to sense the surrounding environment at all times while the automobile is driving, collect data, identify, detect and track static and dynamic targets, and perform calculation and analysis in combination with navigation map data. In ADAS, targets are detected and identified mainly in two ways: by a camera vision sensor or by a millimeter-wave radar. Specifically, the position of a target can be detected by the camera vision sensor or by the millimeter-wave radar; from the detected position, the motion track of the automobile relative to the target is calculated and the position of the target relative to the automobile is determined, while the target is identified from the image data of the detected target.
However, both detection approaches have limitations: the millimeter-wave radar is slow in detecting the transverse parameters of a target and cannot distinguish elements such as lane lines and traffic signs, while the vision sensor has a short detection range and low ranging and speed-measurement precision and is strongly affected by factors such as illumination and weather. Therefore, the application scenarios of detecting and identifying targets only with the camera vision sensor or only with the millimeter-wave radar are limited, and the detection precision is low.
Disclosure of Invention
The embodiment of the invention aims to provide a track determining method and device and an electronic device, so as to broaden the application scenarios of target detection and improve the precision of target detection by fusing a radar track detected by a radar sensor with a visual track detected by a vision sensor.
In order to achieve the above object, an embodiment of the present invention provides a flight path determining method, including:
acquiring radar measurement data detected by a radar every preset radar detection period and vision measurement data detected by a vision sensor every preset vision detection period; the radar measurement data comprises a first position coordinate, a first movement speed and a first detected time, in a radar coordinate system, of each first detection target detected by the radar, wherein the first detected time is the time at which the first detection target is detected by the radar; the vision measurement data comprises a second position coordinate, a second movement speed and a second detected time, in a vision sensor coordinate system, of each second detection target detected by the vision sensor, wherein the second detected time is the time at which the second detection target is detected by the vision sensor;
based on radar measurement data, determining a radar track corresponding to each first detection target, wherein each radar track point in the radar track comprises: the corresponding first position coordinate, the first movement speed and the first detected moment of the first detection target; and determining a visual track corresponding to each second detection target based on the visual measurement data, wherein each visual track point in the visual track comprises: the corresponding second position coordinate, the second movement speed and the second detected moment of the second detection target;
converting the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track into the same coordinate system; performing track point compensation on the radar track and/or the visual track based on the preset radar detection period and the preset visual detection period, the first detected time of each first detection target in the radar track and the second detected time of each second detection target in the visual track, the first motion speed of each first detection target in the radar track and the second motion speed of each second detection target in the visual track, so that the detection time of a track point in the radar track is the same as the detection time of a track point in the visual track;
and fusing track points with the same detection time in the radar track and the visual track to obtain a fused track.
Further, the determining the radar track corresponding to each first detection target based on the radar measurement data includes:
respectively creating a radar track list aiming at each radar track point with the first detected moment in radar measurement data closest to the starting moment of a preset fusion time period to obtain one or more radar track lists;
aiming at each remaining radar track point in a preset fusion time period, aiming at each established radar track list, calculating an estimated position coordinate corresponding to the radar track point according to a preset radar detection period and a first position coordinate of the radar track point, of which the first detected time is one preset radar detection period earlier than the radar track point, in the established radar track list;
calculating a coordinate difference value between the first position coordinate of the radar track point and the estimated position coordinate corresponding to the radar track point;
if the coordinate difference value is smaller than a preset difference value threshold value, adding the radar track point into the established radar track list, and adding 1 to the updating weight of the established radar track list; if the radar track point cannot be added into any established radar track list, establishing a new radar track list for the radar track point, and subtracting 1 from the update weight of the new radar track list; wherein, each radar track list comprises: one or more radar track points;
and for each radar track list, if the times of increasing the updating weight of the radar track list in the preset time length are more than or equal to the preset times, determining that the radar track list is the radar track corresponding to the first detection target.
Further, the determining a visual track corresponding to each second detection target based on the visual measurement data includes:
respectively creating a visual track list aiming at each visual track point with the second detected moment closest to the starting moment of the preset fusion time period in the visual measurement data to obtain one or more visual track lists;
aiming at each residual visual track point in a preset fusion time period, aiming at each created visual track list, and according to a preset visual detection period and a second position coordinate of the visual track point, which is earlier than the visual track point by one preset visual detection period at a second detected moment, in the created visual track list, calculating an estimated position coordinate corresponding to the visual track point;
calculating a coordinate difference value between the second position coordinate of the visual track point and the estimated position coordinate corresponding to the visual track point;
if the coordinate difference value is smaller than a preset difference value threshold value, adding the visual track point into the created visual track list, and adding 1 to the updating weight of the created visual track list; if the visual track point can not be added into any established visual track list, establishing a new visual track list for the visual track point, and subtracting 1 from the update weight of the new visual track list; wherein, each visual track list comprises: one or more visual track points;
and for each visual track list, if the times of increasing the updating weight of the visual track list in the preset time length are more than or equal to the preset times, determining that the visual track list is the visual track corresponding to the second detection target.
Further, the converting the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track into the same coordinate system includes:
and converting the second position coordinate of each second detection target in the visual track to a new second position coordinate under the radar coordinate system by using the radar coordinate system as a reference coordinate system.
Further, based on the preset radar detection period and the preset visual detection period, the first detected time of each first detected target in the radar track and the second detected time of each second detected target in the visual track, the first movement speed of each first detected target in the radar track and the second movement speed of each second detected target in the visual track, track point compensation is performed on the radar track and/or the visual track, so that the detection time of a track point in the radar track is the same as the detection time of a track point in the visual track, including:
if the radar detection period is an integer multiple of the visual detection period, determining a second detected time in the visual track that is different from the first detected times as a compensation time;
and for each compensation moment, calculating a first position coordinate corresponding to the compensation moment based on a first position coordinate and a first movement speed of a previous radar track point adjacent to the compensation moment in the radar track and a time difference value between the compensation moment and a first detected moment of the radar track point, and compensating the first position coordinate corresponding to the compensation moment into one radar track point of the radar track.
Further, based on the preset radar detection period and the preset visual detection period, the first detected time of each first detected target in the radar track and the second detected time of each second detected target in the visual track, the first movement speed of each first detected target in the radar track and the second movement speed of each second detected target in the visual track, track point compensation is performed on the radar track and/or the visual track, so that the detection time of a track point in the radar track is the same as the detection time of a track point in the visual track, including:
if the visual detection period is an integer multiple of the radar detection period, determining a first detected time in the radar track that is different from the second detected times as a compensation time;
and for each compensation moment, calculating a second position coordinate corresponding to the compensation moment based on a second position coordinate and a second motion speed of a previous visual track point adjacent to the compensation moment in the visual track and a time difference value between the compensation moment and a second detected moment of the visual track point, and compensating the second position coordinate corresponding to the compensation moment into one visual track point of the visual track.
Further, based on the preset radar detection period and the preset visual detection period, the first detected time of each first detected target in the radar track and the second detected time of each second detected target in the visual track, the first movement speed of each first detected target in the radar track and the second movement speed of each second detected target in the visual track, track point compensation is performed on the radar track and/or the visual track, so that the detection time of a track point in the radar track is the same as the detection time of a track point in the visual track, including:
if the radar detection period is not an integer multiple of the visual detection period and the visual detection period is not an integer multiple of the radar detection period, determining a second detected time in the visual track that is different from the first detected times as a first compensation time, and determining a first detected time in the radar track that is different from the second detected times as a second compensation time;
for each first compensation moment, calculating a first position coordinate corresponding to the first compensation moment based on a first position coordinate and a first motion speed of a previous radar track point adjacent to the first compensation moment in the radar track and a time difference value between the first compensation moment and a first detected moment of the radar track point, and compensating the first position coordinate corresponding to the first compensation moment into one radar track point of the radar track;
and for each second compensation moment, calculating a second position coordinate corresponding to the second compensation moment based on a second position coordinate and a second motion speed of a previous visual track point adjacent to the second compensation moment in the visual track and a time difference value between the second compensation moment and a second detected moment of the visual track point, and compensating the second position coordinate corresponding to the second compensation moment into one visual track point of the visual track.
Further, after the fusing of track points with the same detection time in the radar track and the visual track to obtain a fused track, the method further includes:
and identifying each detection target based on the fusion flight path, the radar measurement data and/or the vision measurement data.
In order to achieve the above object, an embodiment of the present invention further provides a track determining apparatus, including:
the measurement data acquisition module is used for acquiring radar measurement data detected by the radar every preset radar detection period and vision measurement data detected by the vision sensor every preset vision detection period; the radar measurement data comprises a first position coordinate, a first movement speed and a first detected time, in a radar coordinate system, of each first detection target detected by the radar, wherein the first detected time is the time at which the first detection target is detected by the radar; the vision measurement data comprises a second position coordinate, a second movement speed and a second detected time, in a vision sensor coordinate system, of each second detection target detected by the vision sensor, wherein the second detected time is the time at which the second detection target is detected by the vision sensor;
the track determining module is used for determining a radar track corresponding to each first detection target based on radar measurement data, and each radar track point in the radar track comprises: the corresponding first position coordinate, the first movement speed and the first detected moment of the first detection target; and determining a visual track corresponding to each second detection target based on the visual measurement data, wherein each visual track point in the visual track comprises: the corresponding second position coordinate, the second movement speed and the second detected moment of the second detection target;
the data conversion module is used for converting the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track into the same coordinate system; performing track point compensation on the radar track and/or the visual track based on the preset radar detection period and the preset visual detection period, the first detected time of each first detection target in the radar track and the second detected time of each second detection target in the visual track, the first motion speed of each first detection target in the radar track and the second motion speed of each second detection target in the visual track, so that the detection time of a track point in the radar track is the same as the detection time of a track point in the visual track;
and the track fusion module is used for fusing track points with the same detection time in the radar track and the visual track to obtain a fusion track.
Further, the track determination module includes:
the track list creating submodule is used for respectively creating a radar track list aiming at each radar track point with the first detected moment in the radar measurement data being closest to the starting moment of the preset fusion time interval so as to obtain one or more radar track lists;
the position coordinate estimation submodule is used for calculating an estimated position coordinate corresponding to each radar track point according to a preset radar detection period and a first position coordinate of the radar track point, of which the first detected time is one preset radar detection period earlier than the radar track point, in the established radar track list aiming at each radar track point remaining in a preset fusion time period;
the coordinate difference value calculating submodule is used for calculating a coordinate difference value between the first position coordinate of the radar track point and the estimated position coordinate corresponding to the radar track point; if the coordinate difference value is smaller than a preset difference value threshold value, adding the radar track point into the established radar track list, and adding 1 to the updating weight of the established radar track list; if the radar track point cannot be added into any established radar track list, establishing a new radar track list for the radar track point, and subtracting 1 from the update weight of the new radar track list; wherein, each radar track list comprises: one or more radar track points;
and the track determining submodule is used for determining that each radar track list is a radar track corresponding to the first detection target if the times of increasing the updating weight of the radar track list in the preset time length are more than or equal to the preset times.
In order to achieve the above object, an embodiment of the present invention further provides an electronic device, which includes a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other through the communication bus;
a memory for storing a computer program;
and the processor is used for realizing any one of the steps of the flight path determining method when executing the program stored in the memory.
The embodiment of the invention has the following beneficial effects:
by adopting the method provided by the embodiment of the invention, radar measurement data detected by the radar every preset radar detection period and vision measurement data detected by the vision sensor every preset vision detection period are obtained; a radar track corresponding to each first detection target is determined based on the radar measurement data, and a visual track corresponding to each second detection target is determined based on the visual measurement data; the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track are converted into the same coordinate system; track point compensation is performed on the radar track and/or the visual track based on the preset radar detection period and the preset visual detection period, the first detected time of each first detection target in the radar track and the second detected time of each second detection target in the visual track, and the first movement speed of each first detection target in the radar track and the second movement speed of each second detection target in the visual track, so that the detection times of the track points in the radar track are the same as those in the visual track; and track points with the same detection time in the radar track and the visual track are fused to obtain a fused track. Fusing the radar track and the visual track into the fused track fuses the detection information of the radar and the vision sensor: the fused track contains both the information detected by the radar and the information detected by the vision sensor, so the application scenarios of target detection are broadened and the detection precision is improved.
Of course, not all of the advantages described above need to be achieved at the same time in the practice of any one product or method of the invention.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a flow chart of a method for determining a flight path according to an embodiment of the present invention;
FIG. 2 is another flow chart of a method for determining a flight path according to an embodiment of the present invention;
FIG. 3 is a flowchart of determining a radar track corresponding to each first detection target according to an embodiment of the present invention;
FIG. 4 is a flowchart of determining a visual track corresponding to each second detection target;
FIG. 5 is a flowchart of performing track point compensation on a radar track and/or a visual track;
FIG. 6 is a schematic diagram of radar track point compensation for a radar track;
FIG. 7 is a schematic diagram of visual track point compensation for a visual track;
FIG. 8 is a schematic diagram of radar track point compensation for a radar track and visual track point compensation for a visual track;
FIG. 9 is a schematic structural diagram of a track determination device according to an embodiment of the present invention;
FIG. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a flow chart of a flight path determining method provided by an embodiment of the present invention, and the flow chart may include the following steps:
Step 101, acquiring radar measurement data detected by the radar every preset radar detection period and vision measurement data detected by the vision sensor every preset vision detection period.
The radar detection period may be set to 1 second, 2 seconds, 3 seconds, or the like, and is not particularly limited; the visual detection period may also be set to 1 second, 2 seconds, 3 seconds, or the like, and is not particularly limited.
In the embodiment of the present invention, the radar measurement data includes a first position coordinate, a first movement speed, and a first detected time of each first detected target detected by the radar in a radar coordinate system, where the first detected time is a time when the first detected target is detected by the radar; the vision measurement data includes a second position coordinate, a second movement speed and a second detected time of each second detected object detected by the vision sensor in the vision sensor coordinate system, wherein the second detected time is a time when the second detected object is detected by the vision sensor.
The coordinate origin of the radar coordinate system can be the center-of-gravity coordinate point of the radar's installation position in the automobile. The coordinate origin of the visual coordinate system can be the center-of-gravity coordinate of the photosensitive component of the vision sensor, or the position coordinate of the contact point between the left front wheel of the automobile body and the ground.
For example, if the radar detects a first detection target A, a first detection target B and a first detection target C, the radar measurement data includes, as detected by the radar: the first position coordinate (x_A, y_A, z_A), first movement speed (v_xA, v_yA, v_zA) and first detected time t_A of first detection target A in the radar coordinate system; the first position coordinate (x_B, y_B, z_B), first movement speed (v_xB, v_yB, v_zB) and first detected time t_B of first detection target B in the radar coordinate system; and the first position coordinate (x_C, y_C, z_C), first movement speed (v_xC, v_yC, v_zC) and first detected time t_C of first detection target C in the radar coordinate system.

The first detected time t_A is the time at which first detection target A is detected by the radar; that is, if the radar detects first detection target A at time t_A, the timestamp t_A is taken as the first detected time of target A. Likewise, the first detected time t_B is the time at which first detection target B is detected by the radar, and the first detected time t_C is the time at which first detection target C is detected by the radar.
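To make the measurement-data layout concrete, the following is a minimal sketch of how the per-target records described above might be represented in code. It is an illustration only, not part of the patent; the names TrackPoint, position, velocity and timestamp, and all numeric values, are assumptions.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class TrackPoint:
    """One detection of a target: position, velocity and detection time.

    The same layout serves radar track points (first position coordinate,
    first movement speed, first detected time) and visual track points
    (second position coordinate, second movement speed, second detected time).
    """
    position: Tuple[float, float, float]  # (x, y, z) in the sensor's coordinate system
    velocity: Tuple[float, float, float]  # (vx, vy, vz)
    timestamp: float                      # detected time, in seconds

# Hypothetical radar detections of targets A, B and C (values are made up):
radar_measurements = {
    "A": TrackPoint((12.0, 3.5, 0.0), (1.2, 0.0, 0.0), 10.00),
    "B": TrackPoint((30.4, -1.8, 0.0), (0.0, 0.8, 0.0), 10.00),
    "C": TrackPoint((55.1, 6.2, 0.0), (-0.5, 0.1, 0.0), 10.00),
}
```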
Step 102, determining a radar track corresponding to each first detection target based on radar measurement data, wherein each radar track point in the radar track comprises: the corresponding first position coordinate, the first movement speed and the first detected moment of the first detection target; and determining a visual track corresponding to each second detection target based on the visual measurement data, wherein each visual track point in the visual track comprises: and the corresponding second position coordinate, the corresponding second movement speed and the corresponding second detected time of the second detection target.
Step 103, converting the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track into the same coordinate system; and performing track point compensation on the radar track and/or the visual track based on the preset radar detection period and the preset visual detection period, the first detected time of each first detection target in the radar track and the second detected time of each second detection target in the visual track, and the first movement speed of each first detection target in the radar track and the second movement speed of each second detection target in the visual track, so that the detection times of the track points in the radar track are the same as those in the visual track.
Step 104, fusing track points with the same detection time in the radar track and the visual track to obtain a fused track.
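Steps 101-104 can be summarized in a pseudocode-like sketch. Every function name below (build_tracks, to_common_frame, compensate, fuse) is an assumed placeholder for the corresponding step, not an API defined by the patent; concrete sketches of the individual steps follow in the detailed description.

```python
def determine_fused_track(radar_data, vision_data, radar_period, vision_period):
    # Step 102: group the per-target measurements into tracks
    radar_tracks = build_tracks(radar_data, radar_period)     # one radar track per first detection target
    vision_tracks = build_tracks(vision_data, vision_period)  # one visual track per second detection target

    # Step 103: express both tracks in the same coordinate system (here the
    # radar frame), then compensate track points so that both tracks contain
    # points at the same detection times
    vision_tracks = to_common_frame(vision_tracks)
    radar_tracks, vision_tracks = compensate(radar_tracks, vision_tracks,
                                             radar_period, vision_period)

    # Step 104: fuse track points that share a detection time
    return fuse(radar_tracks, vision_tracks)
```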
By adopting the method provided by the embodiment of the invention, radar measurement data detected by the radar every preset radar detection period and vision measurement data detected by the vision sensor every preset vision detection period are obtained; a radar track corresponding to each first detection target is determined based on the radar measurement data, and a visual track corresponding to each second detection target is determined based on the visual measurement data; the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track are converted into the same coordinate system; track point compensation is performed on the radar track and/or the visual track based on the preset radar detection period and the preset visual detection period, the first detected time of each first detection target in the radar track and the second detected time of each second detection target in the visual track, and the first movement speed of each first detection target in the radar track and the second movement speed of each second detection target in the visual track, so that the detection times of the track points in the radar track are the same as those in the visual track; and track points with the same detection time in the radar track and the visual track are fused to obtain a fused track. Fusing the radar track and the visual track into the fused track fuses the detection information of the radar and the vision sensor: the fused track contains both the information detected by the radar and the information detected by the vision sensor, so the application scenarios of target detection are broadened and the detection precision is improved.
The method and apparatus of the present invention will be described in detail with reference to the accompanying drawings using specific embodiments.
Referring to fig. 2, fig. 2 is a flow chart of a flight path determining method provided by the embodiment of the present invention, and the flow chart may include the following steps:
Step 201, acquiring radar measurement data detected by the radar every preset radar detection period and vision measurement data detected by the vision sensor every preset vision detection period.
Step 202, determining a radar track corresponding to each first detection target based on the radar measurement data.
Wherein the radar track can include one or more radar track points, each radar track point in the radar track including: the corresponding first position coordinate, the first movement speed and the first detected time of the first detection target.
Specifically, referring to fig. 3, fig. 3 is a flowchart for determining a radar track corresponding to each first detection target according to an embodiment of the present invention, which may include the following steps:
step 301, respectively creating a radar track list for each radar track point in the radar measurement data, where the first detected time is closest to the starting time of the preset fusion time period, to obtain one or more radar track lists.
A start time and an end time may be set for the preset fusion time period. For example, with a start time of 10:10:00 and an end time of 10:20:00, the preset fusion time period is the time period [10:10:00, 10:20:00].
For example, suppose the preset fusion time period is [10:10:00, 10:20:00], and the radar measurement data acquired within [10:10:00, 10:20:00] includes radar track points P1 through P6, where each radar track point Pi has a first position coordinate (x_Pi, y_Pi, z_Pi), a first movement speed (v_xPi, v_yPi, v_zPi) and a first detected time t_Pi.

If t_P1 is equal to t_P2, and t_P1 is closest to the start time 10:10:00 of the preset fusion time period, a radar track list is created for radar track point P1 and for radar track point P2 respectively, yielding a radar track list L1 and a radar track list L2, which contain radar track point P1 and radar track point P2 respectively.
Step 302, aiming at each remaining radar track point in the preset fusion time period, aiming at each established radar track list, calculating an estimated position coordinate corresponding to the radar track point according to a preset radar detection period and a first position coordinate of the radar track point, of which the first detected time is one preset radar detection period earlier than the radar track point, in the established radar track list.
For example, suppose a radar track list 1 and a radar track list 2 have been created. Radar track list 1 contains a radar track point A with first position coordinate (x_A, y_A, z_A), first movement speed (v_xA, v_yA, v_zA) and first detected time t_A; radar track list 2 contains a radar track point B with first position coordinate (x_B, y_B, z_B), first movement speed (v_xB, v_yB, v_zB) and first detected time t_B. Suppose the remaining radar track points in the preset fusion time period include a radar track point C with first position coordinate (x_C, y_C, z_C), first movement speed (v_xC, v_yC, v_zC) and first detected time t_C, and a radar track point D with first position coordinate (x_D, y_D, z_D), first movement speed (v_xD, v_yD, v_zD) and first detected time t_D.

In this step, if the preset radar detection period is T1, then for radar track point C: if the radar track point in the created radar track list 1 whose first detected time is one period T1 earlier than that of point C is radar track point A, the estimated position coordinate corresponding to radar track point C can be calculated, for radar track list 1, with the following formula:

(x'_C, y'_C, z'_C) = ((x_A + v_xA·T1), (y_A + v_yA·T1), (z_A + v_zA·T1))

where (x'_C, y'_C, z'_C) is the estimated position coordinate of radar track point C with respect to radar track list 1.

Similarly, for radar track point C, if the radar track point in the created radar track list 2 whose first detected time is one period T1 earlier than that of point C is radar track point B, the estimated position coordinate corresponding to radar track point C can be calculated, for radar track list 2, with the following formula:

(x''_C, y''_C, z''_C) = ((x_B + v_xB·T1), (y_B + v_yB·T1), (z_B + v_zB·T1))

where (x''_C, y''_C, z''_C) is the estimated position coordinate of radar track point C with respect to radar track list 2.

In the same way, the estimated position coordinates of the remaining radar track point D with respect to radar track list 1 and with respect to radar track list 2 within the preset fusion time period can be calculated.
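A minimal sketch of the constant-velocity prediction used above, assuming the TrackPoint layout sketched earlier (function and variable names are illustrative, not from the patent); the same formula applies to visual track points with period T2:

```python
def predict_position(prev_point, period):
    """Estimate the position one detection period after a track's previous
    point, assuming constant velocity over the period:
    (x', y', z') = (x + vx*T, y + vy*T, z + vz*T).
    """
    x, y, z = prev_point.position
    vx, vy, vz = prev_point.velocity
    return (x + vx * period, y + vy * period, z + vz * period)

# e.g. the estimated position of radar track point C with respect to radar
# track list 1, whose latest point (one radar period T1 earlier) is point A:
# est_C_list1 = predict_position(point_A, T1)
```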
Step 303, calculating a coordinate difference between the first position coordinate of the radar track point and the estimated position coordinate corresponding to the radar track point.
For example, for radar track list 1, the coordinate difference between the first position coordinate of radar track point C and the estimated position coordinate of point C with respect to radar track list 1 is:

(Δx_C, Δy_C, Δz_C) = |(x'_C, y'_C, z'_C) - (x_C, y_C, z_C)|;

and for radar track list 2, the coordinate difference between the first position coordinate of radar track point C and the estimated position coordinate of point C with respect to radar track list 2 is:

(Δx'_C, Δy'_C, Δz'_C) = |(x''_C, y''_C, z''_C) - (x_C, y_C, z_C)|.
step 304, if the coordinate difference value is smaller than a preset difference value threshold value, adding the radar track point into the established radar track list, and adding 1 to the updating weight of the established radar track list; and if the radar track point cannot be added into any established radar track list, establishing a new radar track list for the radar track point, and subtracting 1 from the updated weight of the new radar track list.
Wherein, each radar track list comprises: one or more radar track points.
The preset difference threshold may be set according to practical application, and is not specifically limited.
For example, if, for radar track list 1, the coordinate difference (Δx_C, Δy_C, Δz_C) between the first position coordinate of radar track point C and the estimated position coordinate of point C with respect to radar track list 1 is smaller than the preset difference threshold, radar track point C is added to the created radar track list 1, and the update weight of the created radar track list 1 is increased by 1.

If the coordinate difference (Δx_C, Δy_C, Δz_C) with respect to radar track list 1 is not smaller than the preset difference threshold, but the coordinate difference (Δx'_C, Δy'_C, Δz'_C) between the first position coordinate of radar track point C and the estimated position coordinate of point C with respect to radar track list 2 is smaller than the preset difference threshold, radar track point C is added to the created radar track list 2, and the update weight of the created radar track list 2 is increased by 1.

If radar track point C cannot be added to any created radar track list, a new radar track list 3 is created for radar track point C, and the update weight of the new radar track list 3 is decreased by 1.
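Steps 302-304 amount to data association against each existing list with a difference threshold. A sketch under the same assumptions as the earlier snippets (TrackPoint and predict_position as sketched above); the per-axis threshold test is one possible reading of "the coordinate difference value is smaller than a preset difference value threshold":

```python
def associate(point, track_lists, period, threshold):
    """Try to append `point` to an existing track list whose predicted
    position it matches; otherwise open a new list for it.

    Each track list is a dict with 'points' (TrackPoints in time order) and
    'weight' (the update weight of steps 304-305).
    """
    for track in track_lists:
        prev = track["points"][-1]  # the point one detection period earlier
        ex, ey, ez = predict_position(prev, period)
        px, py, pz = point.position
        # coordinate difference between the measured and estimated positions
        if (abs(px - ex) < threshold and
                abs(py - ey) < threshold and
                abs(pz - ez) < threshold):
            track["points"].append(point)
            track["weight"] += 1  # update weight increased by 1
            return
    # the point fits no existing list: new list, update weight decreased by 1
    track_lists.append({"points": [point], "weight": -1})
```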
Step 305, for each radar track list, if the number of times of increasing the update weight of the radar track list within a preset time period is greater than or equal to a preset number of times, determining that the radar track list is a radar track corresponding to the first detection target.
Specifically, in this step, a sliding window criterion may be used to determine the radar track corresponding to each first detection target. The preset sliding window criterion comprises a preset time length m and a preset number of times n; both can be set according to the practical application, and the preset number of times n is smaller than the preset time length m.
For example, for a radar track list 1, if the number of times of increasing the update weight of the radar track list 1 within a preset time is greater than or equal to a preset number of times, determining that the radar track list 1 is a radar track corresponding to a first detection target; if the number of times of increasing the updating weight of the radar track list 1 in the preset time is more than 0 and less than the preset number of times, temporarily retaining the radar track list 1; if the number of times of increasing the updating weight of the radar track list 1 within the preset time length is equal to 0, deleting the radar track list 1;
similarly, for the radar track list 2, if the number of times of increasing the update weight of the radar track list 2 within the preset time is greater than or equal to the preset number of times, determining that the radar track list 2 is the radar track corresponding to the first detection target; if the number of times of increasing the updating weight of the radar track list 2 in the preset time is more than 0 and less than the preset number of times, temporarily retaining the radar track list 2; if the number of times of increasing the update weight of the radar track list 2 within the preset time length is equal to 0, the radar track list 2 can be deleted.
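The sliding-window confirmation of step 305 might be sketched as follows, assuming each track list keeps a per-frame record of whether its update weight was increased, and interpreting the preset time length m as a number of detection frames (both assumptions, not stated by the patent; all names illustrative):

```python
def classify_track_list(weight_increases, m, n):
    """Sliding-window criterion of step 305: within the last m frames (the
    preset time length), count how often the update weight was increased.

    Returns 'confirmed' (the list is a track), 'pending' (keep it for now)
    or 'deleted' (drop the list).
    """
    increases = sum(weight_increases[-m:])
    if increases >= n:
        return "confirmed"
    if increases > 0:
        return "pending"
    return "deleted"
```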
Step 203, determining a visual track corresponding to each second detection target based on the visual measurement data.
Wherein the visual tracks can include one or more visual track points, each visual track point in each of the visual tracks including: and the corresponding second position coordinate, the corresponding second movement speed and the corresponding second detected time of the second detection target.
Specifically, referring to fig. 4, fig. 4 is a flowchart for determining a visual track corresponding to each second detection target according to an embodiment of the present invention, which may include the following steps:
step 401, respectively creating a visual track list for each visual track point in the visual measurement data, where the second detected time is closest to the starting time of the preset fusion time period, to obtain one or more visual track lists.
A start time and an end time may be set for the preset fusion time period. For example, with a start time of 10:10:00 and an end time of 10:20:00, the preset fusion time period is the time period [10:10:00, 10:20:00].
For example, suppose the preset fusion time period is [10:10:00, 10:20:00], and the vision measurement data acquired within [10:10:00, 10:20:00] includes visual track points G1 through G4, where each visual track point Gi has a second position coordinate (x_Gi, y_Gi, z_Gi), a second movement speed (v_xGi, v_yGi, v_zGi) and a second detected time t_Gi.

If t_G1 is equal to t_G2, and t_G1 is closest to the start time 10:10:00 of the preset fusion time period, a visual track list is created for visual track point G1 and for visual track point G2 respectively, yielding a visual track list L3 and a visual track list L4, which contain visual track point G1 and visual track point G2 respectively.
Step 402, aiming at each remaining visual track point in the preset fusion time period, aiming at each created visual track list, and according to a preset visual detection period and a second position coordinate of the visual track point in the created visual track list, wherein the second detected time is earlier than the visual track point by one preset visual detection period, calculating an estimated position coordinate corresponding to the visual track point.
For example, suppose a visual track list 3 and a visual track list 4 have been created. Visual track list 3 contains a visual track point E with second position coordinate (x_E, y_E, z_E), second movement speed (v_xE, v_yE, v_zE) and second detected time t_E; visual track list 4 contains a visual track point F with second position coordinate (x_F, y_F, z_F), second movement speed (v_xF, v_yF, v_zF) and second detected time t_F. Suppose the remaining visual track points in the preset fusion time period include a visual track point J with second position coordinate (x_J, y_J, z_J), second movement speed (v_xJ, v_yJ, v_zJ) and second detected time t_J, and a visual track point H with second position coordinate (x_H, y_H, z_H), second movement speed (v_xH, v_yH, v_zH) and second detected time t_H.

In this step, if the preset visual detection period is T2, then for visual track point J: if the visual track point in the created visual track list 3 whose second detected time is one period T2 earlier than that of point J is visual track point E, the estimated position coordinate corresponding to visual track point J can be calculated, for visual track list 3, with the following formula:

(x'_J, y'_J, z'_J) = ((x_E + v_xE·T2), (y_E + v_yE·T2), (z_E + v_zE·T2))

where (x'_J, y'_J, z'_J) is the estimated position coordinate of visual track point J with respect to visual track list 3.

Similarly, for visual track point J, if the visual track point in the created visual track list 4 whose second detected time is one period T2 earlier than that of point J is visual track point F, the estimated position coordinate corresponding to visual track point J can be calculated, for visual track list 4, with the following formula:

(x''_J, y''_J, z''_J) = ((x_F + v_xF·T2), (y_F + v_yF·T2), (z_F + v_zF·T2))

where (x''_J, y''_J, z''_J) is the estimated position coordinate of visual track point J with respect to visual track list 4.

In the same way, the estimated position coordinates of the remaining visual track point H with respect to visual track list 3 and with respect to visual track list 4 within the preset fusion time period can be calculated.
Step 403, calculating a coordinate difference value between the second position coordinate of the visual track point and the estimated position coordinate corresponding to the visual track point.
For example, for visual track list 3, the coordinate difference between the second position coordinate of visual track point J and the estimated position coordinate of point J with respect to visual track list 3 is:

(Δx_J, Δy_J, Δz_J) = |(x'_J, y'_J, z'_J) - (x_J, y_J, z_J)|;

and for visual track list 4, the coordinate difference between the second position coordinate of visual track point J and the estimated position coordinate of point J with respect to visual track list 4 is:

(Δx'_J, Δy'_J, Δz'_J) = |(x''_J, y''_J, z''_J) - (x_J, y_J, z_J)|.
step 404, if the coordinate difference is smaller than a preset difference threshold, adding the visual track point into the created visual track list, and adding 1 to the update weight of the created visual track list; and if the visual track point cannot be added into any established visual track list, establishing a new visual track list for the visual track point, and subtracting 1 from the update weight of the new visual track list.
Wherein, each visual track list comprises: one or more visual track points.
The preset difference threshold may be set according to practical application, and is not specifically limited.
For example, if, for visual track list 3, the coordinate difference (Δx_J, Δy_J, Δz_J) between the second position coordinate of visual track point J and the estimated position coordinate of point J with respect to visual track list 3 is smaller than the preset difference threshold, visual track point J is added to the created visual track list 3, and the update weight of the created visual track list 3 is increased by 1.

If the coordinate difference (Δx_J, Δy_J, Δz_J) with respect to visual track list 3 is not smaller than the preset difference threshold, but the coordinate difference (Δx'_J, Δy'_J, Δz'_J) between the second position coordinate of visual track point J and the estimated position coordinate of point J with respect to visual track list 4 is smaller than the preset difference threshold, visual track point J is added to the created visual track list 4, and the update weight of the created visual track list 4 is increased by 1.

If visual track point J cannot be added to any created visual track list, a new visual track list 5 is created for visual track point J, and the update weight of the new visual track list 5 is decreased by 1.
Step 405, for each visual track list, if the number of times of increasing the update weight of the visual track list within a preset time period is greater than or equal to a preset number of times, determining that the visual track list is a visual track corresponding to the second detection target.
Specifically, in this step, a sliding window criterion may be used to determine the visual track corresponding to each second detection target. The preset sliding window criterion comprises a preset time length m and a preset number of times n; both can be set according to the practical application, and the preset number of times n is smaller than the preset time length m.
For example, for the visual track list 3, if the number of times of increasing the update weight of the visual track list 3 within a preset time period is greater than or equal to a preset number of times, determining that the visual track list 3 is a visual track corresponding to the second detection target; if the number of times of increasing the updating weight of the visual track list 3 in the preset time is more than 0 and less than the preset number of times, the visual track list 3 can be temporarily reserved; if the number of times of increasing the updating weight of the visual track list 3 in the preset time length is equal to 0, deleting the visual track list 3;
similarly, for the visual track list 4, if the number of times of increasing the update weight of the visual track list 4 within the preset time is greater than or equal to the preset number of times, determining that the visual track list 4 is the visual track corresponding to the second detection target; if the number of times of increasing the update weight of the visual track list 4 within the preset time is more than 0 and less than the preset number of times, the visual track list 4 can be temporarily reserved; if the number of times of increasing the update weight of the visual track list 4 within the preset time period is equal to 0, the visual track list 4 may be deleted.
The execution sequence of step 202 and step 203 is not particularly limited.
Step 204, converting the second position coordinate of each second detection target in the visual track into a new second position coordinate in the radar coordinate system, using the radar coordinate system as the reference coordinate system.
In this step, the difference coordinate (Δx, Δy, Δz) of the coordinate origin of the visual coordinate system with respect to the coordinate origin of the radar coordinate system may be calculated; the difference coordinate (Δx, Δy, Δz) may then be added to the second position coordinate, in the visual coordinate system, of each second detection target in the visual track, obtaining the new second position coordinate converted into the radar coordinate system.

In the embodiment of the present invention, the difference coordinate (Δx', Δy', Δz') of the coordinate origin of the radar coordinate system with respect to the coordinate origin of the visual coordinate system may also be calculated; the difference coordinate (Δx', Δy', Δz') may be added to the first position coordinate, in the radar coordinate system, of each first detection target in the radar track, obtaining the new first position coordinate converted into the visual coordinate system, thereby unifying the coordinate systems.
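Under the pure-translation assumption of this step (the two sensor frames differ only by an origin offset, with no rotation between the sensor mounts), the conversion is a vector addition; a minimal sketch with illustrative names:

```python
def to_radar_frame(vision_position, origin_offset):
    """Convert a visual-frame position into the radar frame by adding the
    difference coordinate (dx, dy, dz) of the visual origin with respect to
    the radar origin. Assumes the two frames differ only by a translation,
    as in the example of step 204.
    """
    x, y, z = vision_position
    dx, dy, dz = origin_offset
    return (x + dx, y + dy, z + dz)
```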
And step 205, performing track point compensation on the radar track and/or the visual track based on a preset radar detection period and a preset visual detection period, a first detected time of each first detection target in the radar track and a second detected time of each second detection target in the visual track, a first motion speed of each first detection target in the radar track and a second motion speed of each second detection target in the visual track, so that the detection time of a track point in the radar track is the same as the detection time of a track point in the visual track.
Referring to fig. 5, fig. 5 is a flow chart of performing track point compensation on a radar track and/or a visual track according to an embodiment of the present invention, where the flow chart includes:
step 501, judging whether the radar detection period is a common multiple of the visual detection period, if so, executing step 502, and if not, executing step 503.
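The period check can be sketched as a simple integer-multiple test; the description uses "common multiple" in the sense of one period being an integer multiple of the other. The sketch below assumes the two periods are expressed as exact integer values (e.g., whole seconds), so the modulo test is safe; the names are illustrative.

```python
def is_multiple(period_a: int, period_b: int) -> bool:
    """True if period_a is an integer multiple of period_b."""
    return period_a % period_b == 0

# Dispatch mirroring steps 501, 503 and 505:
# if is_multiple(radar_T, visual_T):    go to step 502 (compensate the radar track)
# elif is_multiple(visual_T, radar_T):  go to step 504 (compensate the visual track)
# else:                                 go to steps 505-506 (compensate both tracks)
```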
Step 502, if the radar detection period is a common multiple of the visual detection period, determining a second detected time different from the first detected time in the visual track as a compensation time; and for each compensation moment, calculating a first position coordinate corresponding to the compensation moment based on a first position coordinate and a first movement speed of a previous radar track point adjacent to the compensation moment in the radar track and a time difference value between the compensation moment and a first detected moment of the radar track point, and compensating the first position coordinate corresponding to the compensation moment into one radar track point of the radar track.
For example, if the visual detection period is 1 second and the radar detection period is 3 seconds, it can be determined that the radar detection period is a common multiple of the visual detection period.
Referring to fig. 6, the second detected times corresponding to the visual track points a1-a10 included in the visual track a in fig. 6 are: 10:10:00, 10:10:01, 10:10:02, 10:10:03, 10:10:04, 10:10:05, 10:10:06, 10:10:07, 10:10:08, and 10:10:09; the first detected times corresponding to the radar track points b1-b4 included in the radar track b in fig. 6 are: 10:10:00, 10:10:03, 10:10:06, and 10:10:09.
Determining the second detected times different from the first detected times in the visual track a as compensation times: that is, the second detected times 10:10:01, 10:10:02, 10:10:04, 10:10:05, 10:10:07, and 10:10:08 can be determined as the compensation times.
For the compensation time 10:10:01, the first position coordinate corresponding to the compensation time can be calculated by adopting the following formula:
(x'_b, y'_b, z'_b) = (x_b1 + v_xb1 × Δt, y_b1 + v_yb1 × Δt, z_b1 + v_zb1 × Δt)
where (x'_b, y'_b, z'_b) is the first position coordinate corresponding to the compensation time, i.e., the first position coordinate of the compensated point b'; (x_b1, y_b1, z_b1) is the first position coordinate of the previous radar track point b1 adjacent to the compensation time in the radar track b; Δt is the time difference between the compensation time and the first detected time of the previous radar track point adjacent to the compensation time; and (v_xb1, v_yb1, v_zb1) is the first motion velocity of the previous radar track point adjacent to the compensation time.
In this step, the first position coordinate corresponding to each compensation time is compensated to be one radar track point of the radar track, see fig. 6, and a radar track b' after the radar track point is compensated is obtained.
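A minimal sketch of this constant-velocity extrapolation is given below; the same helper applies unchanged to the visual-track case of step 504. The function and variable names are illustrative, and the constant-velocity assumption is the one stated by the formula above.

```python
from typing import Tuple

Vec3 = Tuple[float, float, float]

def compensate_point(prev_pos: Vec3, prev_vel: Vec3,
                     prev_time: float, comp_time: float) -> Vec3:
    """Extrapolate the previous adjacent track point to the compensation time."""
    dt = comp_time - prev_time            # the time difference Δt
    x, y, z = prev_pos
    vx, vy, vz = prev_vel
    return (x + vx * dt, y + vy * dt, z + vz * dt)

# Example: radar point b1 at 10:10:00 extrapolated to the compensation time 10:10:01.
b_prime = compensate_point((5.0, 1.0, 0.0), (2.0, 0.0, 0.0), 0.0, 1.0)
# b_prime == (7.0, 1.0, 0.0)
```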
Step 503, determining whether the visual detection period is a common multiple of the radar detection period, if so, executing step 504, and if not, executing step 505.
Step 504, determining a first detected time different from a second detected time in the radar track as a compensation time; and for each compensation moment, calculating a second position coordinate corresponding to the compensation moment based on a second position coordinate and a second motion speed of a previous visual track point adjacent to the compensation moment in the visual track and a time difference value between the compensation moment and a second detected moment of the visual track point, and compensating the second position coordinate corresponding to the compensation moment into one visual track point of the visual track.
For example, if the radar detection period is 1 second and the visual detection period is 2 seconds, it can be determined that the visual detection period is a common multiple of the radar detection period.
Referring to fig. 7, the first detected times corresponding to the radar track points E1-E5 included in the radar track E in fig. 7 are: 10:10:00, 10:10:01, 10:10:02, 10:10:03, and 10:10:04; the second detected times corresponding to the visual track points F1-F3 included in the visual track F in fig. 7 are: 10:10:00, 10:10:02, and 10:10:04.
Determining the first detected times different from the second detected times in the radar track E as compensation times: that is, the first detected times 10:10:01 and 10:10:03 can be determined as the compensation times.
For the compensation time 10:10:01, the second position coordinate corresponding to the compensation time can be calculated using the following formula:
(x'_f, y'_f, z'_f) = (x_f1 + v_xf1 × Δt, y_f1 + v_yf1 × Δt, z_f1 + v_zf1 × Δt)
where (x'_f, y'_f, z'_f) is the second position coordinate corresponding to the compensation time, i.e., the second position coordinate of the compensated point f'; (x_f1, y_f1, z_f1) is the second position coordinate of the previous visual track point F1 adjacent to the compensation time in the visual track F; Δt is the time difference between the compensation time and the second detected time of the previous visual track point adjacent to the compensation time; and (v_xf1, v_yf1, v_zf1) is the second motion velocity of the previous visual track point adjacent to the compensation time.
In this step, the second position coordinate corresponding to each compensation time is compensated to be one visual track point of the visual track, see fig. 7, and the visual track F' after the visual track point is compensated is obtained.
Step 505, determining a second detected time different from the first detected time in the visual track as a first compensation time; and for each first compensation moment, calculating a first position coordinate corresponding to the first compensation moment based on a first position coordinate and a first movement speed of a previous radar track point adjacent to the first compensation moment in the radar track and a time difference value between the first compensation moment and a first detected moment of the radar track point, and compensating the first position coordinate corresponding to the first compensation moment into one radar track point of the radar track.
Step 506, determining a first detected time different from a second detected time in the radar track as a second compensation time; and for each second compensation moment, calculating a second position coordinate corresponding to the second compensation moment based on a second position coordinate and a second motion speed of a previous visual track point adjacent to the second compensation moment in the visual track and a time difference value between the second compensation moment and a second detected moment of the visual track point, and compensating the second position coordinate corresponding to the second compensation moment into one visual track point of the visual track.
For example, if the vision detection period is 2 seconds and the radar detection period is 3 seconds, it can be determined that the radar detection period is not a common multiple of the vision detection period, and meanwhile, the vision detection period is not a common multiple of the radar detection period.
Referring to fig. 8, the second detected times corresponding to the visual track points P1-P6 included in the visual track P in fig. 8 are: 10:10:00, 10:10:02, 10:10:04, 10:10:06, 10:10:08, and 10:10:10; the first detected times corresponding to the radar track points Q1-Q4 included in the radar track Q in fig. 8 are: 10:10:00, 10:10:03, 10:10:06, and 10:10:09.
Determining the second detected times different from the first detected times in the visual track P as the first compensation times: that is, the second detected times 10:10:02, 10:10:04, 10:10:08, and 10:10:10 can be determined as the first compensation times.
For each first compensation time, the first position coordinate corresponding to that first compensation time can be calculated using the following formula:
(x'_q, y'_q, z'_q) = (x_q + v_xq × Δt, y_q + v_yq × Δt, z_q + v_zq × Δt)
where, for each first compensation time: (x'_q, y'_q, z'_q) is the first position coordinate corresponding to that first compensation time, i.e., the first position coordinate of the compensated point q'; (x_q, y_q, z_q) is the first position coordinate of the previous radar track point adjacent to the first compensation time in the radar track Q; Δt is the time difference between the first compensation time and the first detected time of that previous radar track point; and (v_xq, v_yq, v_zq) is the first motion velocity of that previous radar track point.
In this step, the first position coordinate corresponding to each first compensation time is compensated to be one radar track point of the radar track, see fig. 8, and a radar track Q' after the radar track point is compensated is obtained.
Referring to fig. 8, the first detected times different from the second detected times in the radar track Q are determined as the second compensation times: that is, the first detected times 10:10:03 and 10:10:09 can be determined as the second compensation times.
For each second compensation time, the second position coordinate corresponding to that second compensation time can be calculated using the following formula:
(x'_p, y'_p, z'_p) = (x_p + v_xp × Δt, y_p + v_yp × Δt, z_p + v_zp × Δt)
where, for each second compensation time: (x'_p, y'_p, z'_p) is the second position coordinate corresponding to that second compensation time, i.e., the second position coordinate of the compensated point p'; (x_p, y_p, z_p) is the second position coordinate of the previous visual track point p adjacent to the second compensation time in the visual track P; Δt is the time difference between the second compensation time and the second detected time of that previous visual track point; and (v_xp, v_yp, v_zp) is the second motion velocity of that previous visual track point.
In this step, the second position coordinate corresponding to each second compensation time is compensated to be a visual track point of the visual track, see fig. 8, and the visual track P' after the visual track point is compensated is obtained.
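Steps 505 and 506 can be viewed as one symmetric routine: each track is filled in at every detection time that appears only in the other track. The sketch below is a minimal illustration under two assumptions that are not from the patent text: tracks are represented as dictionaries mapping a timestamp to a (position, velocity) pair, and both tracks start at the same time so that a previous adjacent point always exists.

```python
def extrapolate(pos, vel, dt):
    # constant-velocity extrapolation, as in the formulas above
    return tuple(p + v * dt for p, v in zip(pos, vel))

def fill_missing(track, other_times):
    """Insert an extrapolated point at every time present in the other track but absent here.
    `track` maps a detection time to a (position, velocity) pair."""
    filled = dict(track)
    for t in sorted(set(other_times) - set(track)):
        prev_t = max(pt for pt in filled if pt < t)   # previous adjacent track point
        pos, vel = filled[prev_t]
        filled[t] = (extrapolate(pos, vel, t - prev_t), vel)
    return filled

# Both directions, mirroring steps 505 and 506:
# radar_filled  = fill_missing(radar_track,  visual_track.keys())
# visual_filled = fill_missing(visual_track, radar_track.keys())
```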
And step 206, fusing the track points with the same detection time in the radar track and the visual track to obtain a fused track.
In this step, for each pair consisting of a radar track point and a visual track point whose first detected time and second detected time are the same, the difference between the first position coordinate of the radar track point and the second position coordinate of the visual track point may be calculated; if the difference is smaller than a preset difference threshold, the radar track point or the visual track point may be determined as a fused track point. If the ratio of the number of fused track points to the number of radar track points in the radar track reaches a preset ratio threshold, and the ratio of the number of fused track points to the number of visual track points in the visual track also reaches the preset ratio threshold, the track formed by the fused track points is taken as the fused track.
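A rough sketch of this fusion rule is given below. The threshold names dist_thresh and ratio_thresh are illustrative, the coordinate difference is taken here as the Euclidean distance, and keeping the radar coordinate for a fused point is an assumption; the description allows either point to be used.

```python
import math

def fuse_tracks(radar: dict, visual: dict,
                dist_thresh: float, ratio_thresh: float):
    """radar/visual map a detection time to an (x, y, z) coordinate.
    Returns the fused track, or None if the ratio criterion is not met."""
    fused = {}
    for t in set(radar) & set(visual):         # same detection time
        if math.dist(radar[t], visual[t]) < dist_thresh:
            fused[t] = radar[t]                # either point may serve as the fused point
    if (fused
            and len(fused) / len(radar) >= ratio_thresh
            and len(fused) / len(visual) >= ratio_thresh):
        return dict(sorted(fused.items()))
    return None
```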
And step 207, identifying each detection target based on the fusion flight path, the radar measurement data and/or the vision measurement data.
In the embodiment of the present invention, the radar measurement data may further include: image information of a first detection target; the vision metrology data may also include: image information of the second detection target.
In this step, the detection target may be identified based on the determined fusion track, the image information of the first detection target, and/or the image information of the second detection target.
By adopting the method provided by the embodiment of the invention, the fusion track is obtained by fusing the radar track and the visual track, so that the fusion of the detection information of the radar and the visual sensor is realized, the obtained fusion track comprises the information detected by the radar and the information detected by the visual sensor, the application scene of target detection is expanded, and the detection precision is improved.
Referring to fig. 9, based on the same inventive concept, according to the flight path determining method provided in the above embodiment of the present invention, correspondingly, another embodiment of the present invention further provides a flight path determining apparatus, which specifically includes:
a measurement data obtaining module 901, configured to obtain radar measurement data detected by a radar every other preset radar detection period and vision measurement data detected by a vision sensor every other preset vision detection period; the radar measurement data comprise a first position coordinate, a first movement speed and a first detected time, in a radar coordinate system, of each first detection target detected by the radar, wherein the first detected time is the time when the first detection target is detected by the radar; the vision measurement data comprise a second position coordinate, a second movement speed and a second detected time, in a vision sensor coordinate system, of each second detection target detected by the vision sensor, wherein the second detected time is the time when the second detection target is detected by the vision sensor;
a track determining module 902, configured to determine, based on the radar measurement data, a radar track corresponding to each first detection target, where each radar track point in the radar track includes: the corresponding first position coordinate, the first movement speed and the first detected moment of the first detection target; and determining a visual track corresponding to each second detection target based on the visual measurement data, wherein each visual track point in the visual track comprises: the corresponding second position coordinate, the second movement speed and the second detected moment of the second detection target;
the data conversion module 903 is configured to convert the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track to the same coordinate system; performing track point compensation on the radar track and/or the visual track based on the preset radar detection period and the preset visual detection period, the first detected time of each first detection target in the radar track and the second detected time of each second detection target in the visual track, the first motion speed of each first detection target in the radar track and the second motion speed of each second detection target in the visual track, so that the detection time of a track point in the radar track is the same as the detection time of a track point in the visual track;
and the track fusion module 904 is used for fusing track points with the same detection time in the radar track and the visual track to obtain a fused track.
Therefore, by adopting the device provided by the embodiment of the invention, the radar measurement data detected by the radar every other preset radar detection period and the vision measurement data detected by the vision sensor every other preset vision detection period are obtained; a radar track corresponding to each first detection target is determined based on the radar measurement data, and a visual track corresponding to each second detection target is determined based on the vision measurement data; the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track are converted into the same coordinate system; track point compensation is performed on the radar track and/or the visual track based on the preset radar detection period and the preset visual detection period, the first detected time of each first detection target in the radar track and the second detected time of each second detection target in the visual track, and the first motion speed of each first detection target in the radar track and the second motion speed of each second detection target in the visual track, so that the detection time of a track point in the radar track is the same as the detection time of a track point in the visual track; and the track points with the same detection time in the radar track and the visual track are fused to obtain a fused track. The radar track and the visual track are fused to obtain the fused track, so that the fusion of detection information of the radar and the visual sensor is realized, the obtained fused track comprises information detected by the radar and information detected by the visual sensor, the application scene of target detection is expanded, and the detection precision is improved.
Further, the track determining module 902 includes:
the track list creating submodule is used for respectively creating a radar track list aiming at each radar track point with the first detected moment in the radar measurement data being closest to the starting moment of the preset fusion time interval so as to obtain one or more radar track lists;
the position coordinate estimation submodule is used for calculating an estimated position coordinate corresponding to each radar track point according to a preset radar detection period and a first position coordinate of the radar track point, of which the first detected time is one preset radar detection period earlier than the radar track point, in the established radar track list aiming at each radar track point remaining in a preset fusion time period;
the coordinate difference value calculating submodule is used for calculating a coordinate difference value between the first position coordinate of the radar track point and the estimated position coordinate corresponding to the radar track point; if the coordinate difference value is smaller than a preset difference value threshold value, adding the radar track point into the established radar track list, and adding 1 to the updating weight of the established radar track list; if the radar track point cannot be added into any established radar track list, establishing a new radar track list for the radar track point, and subtracting 1 from the update weight of the new radar track list; wherein, each radar track list comprises: one or more radar track points;
and the track determining submodule is used for determining that each radar track list is a radar track corresponding to the first detection target if the times of increasing the updating weight of the radar track list in the preset time length are more than or equal to the preset times.
By adopting the device provided by the embodiment of the invention, the fusion track is obtained by fusing the radar track and the visual track, so that the fusion of the detection information of the radar and the visual sensor is realized, the obtained fusion track comprises the information detected by the radar and the information detected by the visual sensor, the application scene of target detection is expanded, and the detection precision is improved.
The embodiment of the present invention further provides an electronic device, as shown in fig. 10, which includes a processor 1001, a communication interface 1002, a memory 1003 and a communication bus 1004, wherein the processor 1001, the communication interface 1002 and the memory 1003 communicate with one another through the communication bus 1004,
a memory 1003 for storing a computer program;
the processor 1001 is configured to implement the following steps when executing the program stored in the memory 1003:
acquiring radar measurement data detected by a radar every other preset radar detection period and vision measurement data detected by a vision sensor every other preset vision detection period; the radar measurement data comprise a first position coordinate, a first movement speed and a first detected time, in a radar coordinate system, of each first detection target detected by the radar, wherein the first detected time is the time when the first detection target is detected by the radar; the vision measurement data comprise a second position coordinate, a second movement speed and a second detected time, in a vision sensor coordinate system, of each second detection target detected by the vision sensor, wherein the second detected time is the time when the second detection target is detected by the vision sensor;
based on radar measurement data, determining a radar track corresponding to each first detection target, wherein each radar track point in the radar track comprises: the corresponding first position coordinate, the first movement speed and the first detected moment of the first detection target; and determining a visual track corresponding to each second detection target based on the visual measurement data, wherein each visual track point in the visual track comprises: the corresponding second position coordinate, the second movement speed and the second detected moment of the second detection target;
converting the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track into the same coordinate system; performing track point compensation on the radar track and/or the visual track based on the preset radar detection period and the preset visual detection period, the first detected time of each first detection target in the radar track and the second detected time of each second detection target in the visual track, the first motion speed of each first detection target in the radar track and the second motion speed of each second detection target in the visual track, so that the detection time of a track point in the radar track is the same as the detection time of a track point in the visual track;
and fusing track points with the same detection time in the radar track and the visual track to obtain a fused track.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The Memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The Processor may be a general-purpose Processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; but also Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded and executed on a computer, cause the processes or functions described in accordance with the embodiments of the invention to occur, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, from one website site, computer, server, or data center to another website site, computer, server, or data center via wired (e.g., coaxial cable, fiber optic, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that incorporates one or more of the available media. The usable medium may be a magnetic medium (e.g., floppy Disk, hard Disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the device, the electronic apparatus and the storage medium embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and the relevant points can be referred to the partial description of the method embodiments. The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (10)

1. A method for determining a flight path, comprising:
acquiring radar measurement data detected by a radar every other preset radar detection period and vision measurement data detected by a vision sensor every other preset vision detection period; the radar measurement data comprise a first position coordinate, a first movement speed and a first detected time, in a radar coordinate system, of each first detection target detected by the radar, wherein the first detected time is the time when the first detection target is detected by the radar; the vision measurement data comprise a second position coordinate, a second movement speed and a second detected time, in a vision sensor coordinate system, of each second detection target detected by the vision sensor, wherein the second detected time is the time when the second detection target is detected by the vision sensor;
based on radar measurement data, determining a radar track corresponding to each first detection target, wherein each radar track point in the radar track comprises: the corresponding first position coordinate, the first movement speed and the first detected moment of the first detection target; and determining a visual track corresponding to each second detection target based on the visual measurement data, wherein each visual track point in the visual track comprises: the corresponding second position coordinate, the second movement speed and the second detected moment of the second detection target;
converting the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track into the same coordinate system; performing track point compensation on the radar track and/or the visual track based on the preset radar detection period and the preset visual detection period, the first detected time of each first detection target in the radar track and the second detected time of each second detection target in the visual track, the first motion speed of each first detection target in the radar track and the second motion speed of each second detection target in the visual track, so that the detection time of a track point in the radar track is the same as the detection time of a track point in the visual track;
and fusing track points with the same detection time in the radar track and the visual track to obtain a fused track.
2. The method of claim 1, wherein determining the radar track corresponding to each first probe target based on the radar measurement data comprises:
respectively creating a radar track list aiming at each radar track point with the first detected moment in radar measurement data closest to the starting moment of a preset fusion time period to obtain one or more radar track lists;
aiming at each remaining radar track point in a preset fusion time period, aiming at each established radar track list, calculating an estimated position coordinate corresponding to the radar track point according to a preset radar detection period and a first position coordinate of the radar track point, of which the first detected time is one preset radar detection period earlier than the radar track point, in the established radar track list;
calculating a coordinate difference value between the first position coordinate of the radar track point and the estimated position coordinate corresponding to the radar track point;
if the coordinate difference value is smaller than a preset difference value threshold value, adding the radar track point into the established radar track list, and adding 1 to the updating weight of the established radar track list; if the radar track point cannot be added into any established radar track list, establishing a new radar track list for the radar track point, and subtracting 1 from the update weight of the new radar track list; wherein, each radar track list comprises: one or more radar track points;
and for each radar track list, if the times of increasing the updating weight of the radar track list in the preset time length are more than or equal to the preset times, determining that the radar track list is the radar track corresponding to the first detection target.
3. The method of claim 1, wherein determining the visual trajectory for each second probe target based on the visual metrology data comprises:
respectively creating a visual track list aiming at each visual track point with the second detected moment closest to the starting moment of the preset fusion time period in the visual measurement data to obtain one or more visual track lists;
aiming at each residual visual track point in a preset fusion time period, aiming at each created visual track list, and according to a preset visual detection period and a second position coordinate of the visual track point, which is earlier than the visual track point by one preset visual detection period at a second detected moment, in the created visual track list, calculating an estimated position coordinate corresponding to the visual track point;
calculating a coordinate difference value between the second position coordinate of the visual track point and the estimated position coordinate corresponding to the visual track point;
if the coordinate difference value is smaller than a preset difference value threshold value, adding the visual track point into the created visual track list, and adding 1 to the updating weight of the created visual track list; if the visual track point can not be added into any established visual track list, establishing a new visual track list for the visual track point, and subtracting 1 from the update weight of the new visual track list; wherein, each visual track list comprises: one or more visual track points;
and for each visual track list, if the times of increasing the updating weight of the visual track list in the preset time length are more than or equal to the preset times, determining that the visual track list is the visual track corresponding to the second detection target.
4. The method of claim 1, wherein converting the first position coordinates of each first detected object in the radar track and the second position coordinates of each second detected object in the vision track to the same coordinate system comprises:
and converting the second position coordinate of each second detection target in the visual track to a new second position coordinate under the radar coordinate system by using the radar coordinate system as a reference coordinate system.
5. The method of claim 1, wherein performing track point compensation on the radar track and/or the visual track based on the preset radar detection period and the preset visual detection period, the first detected time of each first detected target in the radar track, and the second detected time of each second detected target in the visual track, the first motion velocity of each first detected target in the radar track, and the second motion velocity of each second detected target in the visual track, such that the detection time of the track point in the radar track and the detection time of the track point in the visual track are the same comprises:
if the radar detection period is a common multiple of the visual detection period, determining a second detected time different from the first detected time in the visual track as a compensation time;
and for each compensation moment, calculating a first position coordinate corresponding to the compensation moment based on a first position coordinate and a first movement speed of a previous radar track point adjacent to the compensation moment in the radar track and a time difference value between the compensation moment and a first detected moment of the radar track point, and compensating the first position coordinate corresponding to the compensation moment into one radar track point of the radar track.
6. The method of claim 1, wherein performing track point compensation on the radar track and/or the visual track based on the preset radar detection period and the preset visual detection period, the first detected time of each first detected target in the radar track, and the second detected time of each second detected target in the visual track, the first motion velocity of each first detected target in the radar track, and the second motion velocity of each second detected target in the visual track, such that the detection time of the track point in the radar track and the detection time of the track point in the visual track are the same comprises:
if the visual detection period is a common multiple of the radar detection period, determining a first detected time different from a second detected time in the radar track as a compensation time;
and for each compensation moment, calculating a second position coordinate corresponding to the compensation moment based on a second position coordinate and a second motion speed of a previous visual track point adjacent to the compensation moment in the visual track and a time difference value between the compensation moment and a second detected moment of the visual track point, and compensating the second position coordinate corresponding to the compensation moment into one visual track point of the visual track.
7. The method of claim 1, wherein performing track point compensation on the radar track and/or the visual track based on the preset radar detection period and the preset visual detection period, the first detected time of each first detected target in the radar track, and the second detected time of each second detected target in the visual track, the first motion velocity of each first detected target in the radar track, and the second motion velocity of each second detected target in the visual track, such that the detection time of the track point in the radar track and the detection time of the track point in the visual track are the same comprises:
if the radar detection period is not a common multiple of the visual detection period and the visual detection period is not a common multiple of the radar detection period, determining a second detected time different from the first detected time in the visual track as a first compensation time; determining a first detected time different from a second detected time in the radar track as a second compensation time;
for each first compensation moment, calculating a first position coordinate corresponding to the first compensation moment based on a first position coordinate and a first motion speed of a previous radar track point adjacent to the first compensation moment in the radar track and a time difference value between the first compensation moment and a first detected moment of the radar track point, and compensating the first position coordinate corresponding to the first compensation moment into one radar track point of the radar track;
and for each second compensation moment, calculating a second position coordinate corresponding to the second compensation moment based on a second position coordinate and a second motion speed of a previous visual track point adjacent to the second compensation moment in the visual track and a time difference value between the second compensation moment and a second detected moment of the visual track point, and compensating the second position coordinate corresponding to the second compensation moment into one visual track point of the visual track.
8. The method according to claim 1, wherein after the merging the track points with the same detection time in the radar track and the visual track to obtain a merged track, the method further comprises:
and identifying each detection target based on the fusion flight path, the radar measurement data and/or the vision measurement data.
9. A track determination device, comprising:
the measurement data acquisition module is used for acquiring radar measurement data detected by the radar every other preset radar detection period and vision measurement data detected by the vision sensor every other preset vision detection period; the radar measurement data comprise a first position coordinate, a first movement speed and a first detected time, in a radar coordinate system, of each first detection target detected by the radar, wherein the first detected time is the time when the first detection target is detected by the radar; the vision measurement data comprise a second position coordinate, a second movement speed and a second detected time, in a vision sensor coordinate system, of each second detection target detected by the vision sensor, wherein the second detected time is the time when the second detection target is detected by the vision sensor;
the track determining module is used for determining a radar track corresponding to each first detection target based on radar measurement data, and each radar track point in the radar track comprises: the corresponding first position coordinate, the first movement speed and the first detected moment of the first detection target; and determining a visual track corresponding to each second detection target based on the visual measurement data, wherein each visual track point in the visual track comprises: the corresponding second position coordinate, the second movement speed and the second detected moment of the second detection target;
the data conversion module is used for converting the first position coordinate of each first detection target in the radar track and the second position coordinate of each second detection target in the visual track into the same coordinate system; performing track point compensation on the radar track and/or the visual track based on the preset radar detection period and the preset visual detection period, the first detected time of each first detection target in the radar track and the second detected time of each second detection target in the visual track, the first motion speed of each first detection target in the radar track and the second motion speed of each second detection target in the visual track, so that the detection time of a track point in the radar track is the same as the detection time of a track point in the visual track;
and the track fusion module is used for fusing track points with the same detection time in the radar track and the visual track to obtain a fusion track.
10. An electronic device, characterized by comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with one another through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any of claims 1 to 8 when executing a program stored in the memory.
CN202011622460.XA 2020-12-31 2020-12-31 Track determining method and device and electronic equipment Active CN112816975B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011622460.XA CN112816975B (en) 2020-12-31 2020-12-31 Track determining method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN112816975A true CN112816975A (en) 2021-05-18
CN112816975B CN112816975B (en) 2024-03-15

Family

ID=75854923

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011622460.XA Active CN112816975B (en) 2020-12-31 2020-12-31 Track determining method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN112816975B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109116349A (en) * 2018-07-26 2019-01-01 西南电子技术研究所(中国电子科技集团公司第十研究所) Multi-sensor cooperation tracks combined optimization decision-making technique
CN110929810A (en) * 2019-12-27 2020-03-27 航天南湖电子信息技术股份有限公司 Multi-source data fusion method for low-speed small-target detection system
CN111967498A (en) * 2020-07-20 2020-11-20 重庆大学 Night target detection and tracking method based on millimeter wave radar and vision fusion

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113342057A (en) * 2021-08-05 2021-09-03 上海特金信息科技有限公司 Track fusion method and device, unmanned aerial vehicle detection system, equipment and medium

Also Published As

Publication number Publication date
CN112816975B (en) 2024-03-15


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
Effective date of registration: 20220322
Address after: 430051 No. b1336, chuanggu startup area, taizihu cultural Digital Creative Industry Park, No. 18, Shenlong Avenue, Wuhan Economic and Technological Development Zone, Hubei Province
Applicant after: Yikatong (Hubei) Technology Co.,Ltd.
Address before: No.c101, chuanggu start up zone, taizihu cultural Digital Industrial Park, No.18 Shenlong Avenue, Wuhan Economic and Technological Development Zone, Hubei Province
Applicant before: HUBEI ECARX TECHNOLOGY Co.,Ltd.
GR01 Patent grant