CN112488150A - Target correction method, device and system based on vehicle-road cooperation and storage medium - Google Patents

Target correction method, device and system based on vehicle-road cooperation and storage medium

Info

Publication number
CN112488150A
CN112488150A CN202011216785.8A
Authority
CN
China
Prior art keywords
dimensional
data
target
lane line
corrected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011216785.8A
Other languages
Chinese (zh)
Other versions
CN112488150B (en)
Inventor
孙皓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Geely Holding Group Co Ltd
Geely Automobile Research Institute Ningbo Co Ltd
Original Assignee
Zhejiang Geely Holding Group Co Ltd
Geely Automobile Research Institute Ningbo Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Geely Holding Group Co Ltd, Geely Automobile Research Institute Ningbo Co Ltd
Priority to CN202011216785.8A
Publication of CN112488150A
Application granted
Publication of CN112488150B
Legal status: Active (granted)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/25 - Fusion techniques
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 - Combination of radar systems with cameras
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 - Road transport of goods or passengers
    • Y02T10/10 - Internal combustion engine [ICE] based vehicles
    • Y02T10/40 - Engine management systems

Abstract

The application provides a target correction method, a device, a system and a storage medium based on vehicle-road cooperation, wherein the method comprises the following steps: acquiring two-dimensional target data and two-dimensional lane line data through a camera configured in a target correction system; correcting the two-dimensional target data and the two-dimensional lane line data to obtain two-dimensional corrected target data and two-dimensional corrected lane line data; determining first target lane data, and adding the first target lane data into the two-dimensional corrected target data to obtain second target data; converting the coordinate system of the second target data and the two-dimensional corrected lane line data to obtain three-dimensional target data and three-dimensional lane line data; correcting the three-dimensional target data and the three-dimensional lane line data to obtain three-dimensional corrected target data and three-dimensional corrected lane line data; and determining second target lane data, and adding the second target lane data into the three-dimensional correction target data to obtain third target data.

Description

Target correction method, device and system based on vehicle-road cooperation and storage medium
Technical Field
The present disclosure relates to the field of automotive technologies, and in particular, to a method, an apparatus, a system, and a storage medium for target correction based on vehicle-road coordination.
Background
Vehicle-road cooperation uses advanced wireless communication, next-generation internet, and related technologies to carry out comprehensive dynamic real-time information interaction between vehicles and between vehicles and the road. On the basis of full-time-domain acquisition and fusion of dynamic traffic information, it performs active vehicle safety control and cooperative road management, fully realizes effective coordination among people, vehicles, and roads, guarantees traffic safety, and improves traffic efficiency, thereby forming a safe, efficient, and environmentally friendly road traffic technology. With the development of driving automation, accurate vehicle positioning is gradually becoming a key technology for implementing driver assistance and automated driving.
In the prior art, sensors arranged at the road end, such as a camera, a millimeter wave radar, and a laser radar, are used to obtain target information for the vehicle: the sensor data are processed by algorithms to obtain target information, and multi-sensor fusion is then performed in a post-fusion manner. The fused target information combines the characteristics of the individual sensors, so the positioning result of the target is more accurate. However, although the post-fusion strategy can correct the position of the target, the target information can jump abruptly during the fusion process, so an error still remains in the positioning result, and this error grows as the detection distance increases.
Disclosure of Invention
The embodiments of the present application provide a method, an apparatus, a system, and a storage medium for target correction based on vehicle-road cooperation, in which lane information of the target is added to the target information used for positioning the target and the detected target information is corrected twice, so that target jumps caused by multi-sensor fusion are prevented and the accuracy of target positioning is improved.
In one aspect, an embodiment of the present application provides a target correction method based on vehicle-road cooperation, including:
acquiring two-dimensional target data and two-dimensional lane line data through a camera configured in a target correction system;
correcting the two-dimensional target data and the two-dimensional lane line data based on the two-dimensional lane line data and a preset two-dimensional coordinate true value of a lane line to obtain two-dimensional corrected target data and two-dimensional corrected lane line data;
determining first target lane data based on the two-dimensional corrected target data and the lane line two-dimensional coordinate true value, and adding the first target lane data into the two-dimensional corrected target data to obtain second target data;
converting a coordinate system of the second target data and the two-dimensional corrected lane line data to obtain three-dimensional target data and three-dimensional lane line data;
correcting the three-dimensional target data and the three-dimensional lane line data based on the three-dimensional lane line data and a preset three-dimensional coordinate true value of the lane line to obtain three-dimensional corrected target data and three-dimensional corrected lane line data;
and determining second target lane data based on the three-dimensional corrected target data and the lane line three-dimensional coordinate true value, and adding the second target lane data into the three-dimensional corrected target data to obtain third target data.
Optionally, the target correction method in the embodiment of the present application further includes:
acquiring fourth target data through a millimeter wave radar configured in the target correction system, and acquiring fifth target data through a laser radar configured in the target correction system;
and fusing the third target data, the fourth target data and the fifth target data to obtain sixth target data.
Optionally, the two-dimensional target data comprises two-dimensional coordinates of a target, and the two-dimensional lane line data comprises two-dimensional coordinates of a lane line;
the three-dimensional object data includes three-dimensional coordinates of an object, and the three-dimensional lane line data includes three-dimensional coordinates of a lane line.
Optionally, the correcting the two-dimensional target data and the two-dimensional lane line data based on the two-dimensional lane line data and a preset two-dimensional coordinate true value of the lane line to obtain two-dimensional corrected target data and two-dimensional corrected lane line data includes:
calculating an error between the two-dimensional coordinate of the lane line and a truth value of the two-dimensional coordinate of the lane line based on a preset equation to obtain an error curve;
and correcting the two-dimensional coordinates of the target and the two-dimensional coordinates of the lane line according to the error curve to obtain the two-dimensional coordinates of the corrected target and the two-dimensional coordinates of the corrected lane line.
Optionally, the determining of first target lane data based on the two-dimensional corrected target data and the truth value of the two-dimensional coordinate of the lane line, and the adding of the first target lane data to the two-dimensional corrected target data to obtain second target data, includes:
judging the position relation between the target and the lane line according to the corrected two-dimensional coordinate of the target and the truth value of the two-dimensional coordinate of the lane line, and determining the data of the lane where the target is located;
and adding the data of the lane where the target is located into the two-dimensional correction target data to obtain second target data.
Optionally, the modifying the three-dimensional target data and the three-dimensional lane line data based on the three-dimensional lane line data and a preset three-dimensional coordinate true value of the lane line to obtain three-dimensional modified target data and three-dimensional modified lane line data includes:
calculating an error between the three-dimensional coordinate of the lane line and a true value of the three-dimensional coordinate of the lane line based on a preset equation to obtain an error curve;
and correcting the three-dimensional coordinates of the target and the three-dimensional coordinates of the lane line according to the error curve to obtain the three-dimensional coordinates of the corrected target and the three-dimensional coordinates of the corrected lane line.
Optionally, the determining of second target lane data based on the three-dimensional corrected target data and the true value of the three-dimensional coordinate of the lane line, and the adding of the second target lane data to the three-dimensional corrected target data to obtain third target data, includes:
judging the position relation between the target and the lane line according to the corrected three-dimensional coordinate of the target and the three-dimensional coordinate truth value of the lane line, and determining the data of the lane where the target is located;
and adding the data of the lane where the target is located into the three-dimensional correction target data to obtain third target data.
In another aspect, an embodiment of the present application provides an object correction apparatus, including:
the first acquisition module is used for acquiring two-dimensional target data and two-dimensional lane line data through a camera configured in the target correction system;
the first correction module is used for correcting the two-dimensional target data and the two-dimensional lane line data based on the two-dimensional lane line data and a preset two-dimensional coordinate true value of the lane line to obtain two-dimensional corrected target data and two-dimensional corrected lane line data;
the first determining module is used for determining first target lane data based on the two-dimensional corrected target data and the truth value of the two-dimensional coordinate of the lane line, and adding the first target lane data into the two-dimensional corrected target data to obtain second target data;
the coordinate conversion module is used for carrying out coordinate system conversion on the second target data and the two-dimensional corrected lane line data to obtain three-dimensional target data and three-dimensional lane line data;
the second correction module is used for correcting the three-dimensional target data and the three-dimensional lane line data based on the three-dimensional lane line data and a preset three-dimensional coordinate true value of the lane line to obtain three-dimensional corrected target data and three-dimensional corrected lane line data;
and the second determining module is used for determining second target lane data based on the three-dimensional corrected target data and the true value of the three-dimensional coordinate of the lane line, and adding the second target lane data into the three-dimensional corrected target data to obtain third target data.
On the other hand, the embodiment of the application provides a target correction system, which comprises a road end sensing device and a storage device;
the road end sensing device comprises a camera, a millimeter wave radar and a laser radar, wherein the camera is used for acquiring two-dimensional target data, the millimeter wave radar is used for acquiring fourth target data, and the laser radar is used for acquiring fifth target data;
the storage device is used for storing a preset two-dimensional coordinate true value and a preset three-dimensional coordinate true value of the lane line.
In another aspect, an embodiment of the present application provides a computer storage medium, where at least one instruction or at least one program is stored in the storage medium, and the at least one instruction or the at least one program is loaded and executed by a processor to implement any one of the above-mentioned target correction methods.
The method, the device, the system and the storage medium for target correction based on vehicle-road cooperation have the following beneficial effects:
the detected target data and lane line data are corrected for the first time in an image data layer through a preset lane line two-dimensional coordinate true value, and lane data where a target is located are added into the target data, so that the information of the target is perfected, and the accurate positioning of the target is facilitated; and performing second correction on the target data and the lane line data corresponding to the real world after coordinate conversion through a preset lane line three-dimensional coordinate true value, and adding the corrected lane data again, so that twice correction is realized, and the accuracy of a target positioning result is further improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description are only some embodiments of the present application, and other drawings can be obtained by those skilled in the art from these drawings without creative effort.
Fig. 1 is a schematic diagram of an application scenario provided in an embodiment of the present application;
fig. 2 is a schematic flowchart of a target correction method provided in an embodiment of the present application;
fig. 3 is a schematic structural diagram of an object correction apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or server that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Referring to fig. 1, fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application. The application scenario includes a road-end sensing device 101 and a vehicle 102. The road-end sensing device includes a camera, a millimeter wave radar, and a laser radar, and is configured to obtain vehicle information and lane line information. Based on the obtained vehicle information and lane line information, two corrections are performed using the target correction method of the embodiments of the present application, and the resulting corrected target information is applied to tracking and positioning of the vehicle, thereby improving the accuracy of vehicle positioning.
Specific embodiments of the target correction method of the present application are described below. Fig. 2 is a flowchart of a target correction method provided by an embodiment of the present application. The present specification provides the method operation steps as in the embodiments or the flowchart, but more or fewer operation steps may be included based on conventional or non-inventive labor. The order of steps recited in the embodiments is merely one of many possible execution orders and does not represent the only order of execution. In practice, a system or server product may execute the steps sequentially or in parallel (for example, in a parallel-processor or multi-threaded environment) according to the embodiments or the methods shown in the figures. Specifically, as shown in fig. 2, the method may include:
s201: acquiring two-dimensional target data and two-dimensional lane line data through a camera configured in a target correction system;
in specific implementation, the two-dimensional target data includes two-dimensional coordinates of the target vehicle, the two-dimensional lane line data includes two-dimensional coordinates of the lane line, and the two-dimensional coordinates of the target vehicle and the two-dimensional coordinates of the lane line are acquired through image data detected by the camera.
As an optional implementation, after step S201, whether the formats of the acquired two-dimensional target data and two-dimensional lane line data are correct is checked, including determining whether the acquired data are empty, whether the image frame number is correct, whether the two-dimensional coordinates in the data fall within the image range, and so on (see the sketch below). If the data are judged to be available, the process proceeds to step S202; if the data are judged to be unavailable, step S201 is repeated.
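A minimal sketch of such a validity check is given below. It assumes, purely for illustration, that one camera frame is represented as a Python dict of pixel coordinates; the field names, image size, and frame layout are not taken from the patent.

    def is_frame_valid(frame, image_width=1920, image_height=1080):
        """Basic sanity checks on one frame of 2D target and lane line data.

        `frame` is assumed (for illustration only) to look like:
            {"frame_id": 17,
             "targets": [(u, v), ...],                # pixel coordinates of detected targets
             "lane_lines": [[(u, v), ...], ...]}      # one point list per detected lane line
        """
        if not frame or not frame.get("targets") or not frame.get("lane_lines"):
            return False  # empty data: re-acquire (repeat step S201)
        if not isinstance(frame.get("frame_id"), int) or frame["frame_id"] < 0:
            return False  # implausible image frame number
        points = list(frame["targets"]) + [p for line in frame["lane_lines"] for p in line]
        # every two-dimensional coordinate must lie inside the image
        return all(0 <= u < image_width and 0 <= v < image_height for u, v in points)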
S202: correcting the two-dimensional target data and the two-dimensional lane line data based on the two-dimensional lane line data and a preset true value of a two-dimensional coordinate of the lane line to obtain two-dimensional corrected target data and two-dimensional corrected lane line data;
in a specific implementation, since the position of the road-end sensing device is not changed and the position of the lane line in the real environment is not changed, a true value of the lane line in the image coordinate system, that is, a true value of the two-dimensional coordinate of the lane line, may be set in advance, and the true value of the two-dimensional coordinate of the lane line may be obtained through calibration of the detected image data.
The step S202 may specifically include: calculating the error between the two-dimensional coordinate of the lane line and the truth value of the two-dimensional coordinate of the lane line based on a preset equation to obtain an error curve;
and correcting the two-dimensional coordinates of the target and the two-dimensional coordinates of the lane line according to the error curve to obtain the two-dimensional coordinates of the corrected target and the two-dimensional coordinates of the corrected lane line.
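The patent does not disclose the preset equation used to compute the error curve. The sketch below assumes, purely for illustration, that the lateral pixel error between the detected lane line and its preset two-dimensional true value is fitted as a low-order polynomial of the image row, and that the fitted curve is then subtracted from both the lane line and the target coordinates; the function names and the quadratic form are assumptions, not the patent's equation.

    import numpy as np

    def fit_error_curve(detected_line, true_line, degree=2):
        """Fit the lateral pixel error (detected u minus true u) as a function of image row v.

        Assumes the detected polyline and the true-value polyline are sampled at
        matching rows; the quadratic least-squares fit stands in for the patent's
        unspecified preset equation.
        """
        detected = np.asarray(detected_line, dtype=float)   # shape (N, 2), columns (u, v)
        true_pts = np.asarray(true_line, dtype=float)
        errors = detected[:, 0] - true_pts[:, 0]            # lateral error at each sampled row
        return np.polynomial.Polynomial.fit(detected[:, 1], errors, deg=degree)

    def correct_points(points, error_curve):
        """Subtract the row-dependent error from each (u, v) point."""
        pts = np.asarray(points, dtype=float).copy()
        pts[:, 0] -= error_curve(pts[:, 1])
        return pts

In this sketch, correct_points would be applied both to the detected lane line points, giving the two-dimensional corrected lane line data, and to the target coordinates, giving the two-dimensional corrected target data.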
S203: determining first target lane data based on the two-dimensional corrected target data and a lane line two-dimensional coordinate true value, and adding the first target lane data into the two-dimensional corrected target data to obtain second target data;
the step S203 may specifically include: judging the position relation of the target and the lane line according to the corrected two-dimensional coordinate of the target and the truth value of the two-dimensional coordinate of the lane line, and determining the data of the lane where the target is located;
and adding the data of the lane where the target is located into the two-dimensional correction target data to obtain second target data.
In a specific implementation, once the corrected two-dimensional coordinates of the target are obtained, they can be compared with the true values of the two-dimensional lane line coordinates to determine the position of the target relative to the lane line. For example, suppose the left side of the detected lane line is a first lane and the right side is a second lane; if comparing the corrected two-dimensional coordinates of the target with the true two-dimensional lane line coordinates confirms that the target vehicle is on the left side of the lane line, the target vehicle can be determined to be in the first lane. The serial number of the lane in which the target vehicle is located can then be added to the target information to increase positioning accuracy and avoid target jumps, as sketched below.
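A minimal sketch of this lane assignment is given below, assuming a lane line that is straight enough to be approximated by its two end points; the function names, the two-lane layout, and the sign convention (which depends on whether image or ground coordinates are used) are illustrative assumptions.

    import numpy as np

    def lane_side(target_xy, lane_line_truth):
        """Return which side of the lane line the corrected target lies on.

        `lane_line_truth` is the preset true-value polyline of the lane line and
        `target_xy` the corrected target coordinates, both in the same frame.
        The sign convention of the cross product depends on that frame's axes.
        """
        line = np.asarray(lane_line_truth, dtype=float)
        p0, p1 = line[0], line[-1]                    # approximate the line by its end points
        direction = p1 - p0
        to_target = np.asarray(target_xy, dtype=float) - p0
        cross = direction[0] * to_target[1] - direction[1] * to_target[0]
        return "left" if cross > 0 else "right"

    def assign_lane(target_xy, lane_line_truth, left_lane_id=1, right_lane_id=2):
        """Map the side of the lane line to a lane number, mirroring the example in the
        description (left side -> first lane, right side -> second lane)."""
        side = lane_side(target_xy, lane_line_truth)
        return left_lane_id if side == "left" else right_lane_id

The returned lane number is what would be added to the two-dimensional corrected target data to form the second target data.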
S204: converting the coordinate system of the second target data and the two-dimensional corrected lane line data to obtain three-dimensional target data and three-dimensional lane line data;
in a particular implementation, the three-dimensional object data includes three-dimensional coordinates of the object and the three-dimensional lane line data includes three-dimensional coordinates of the lane line.
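The patent does not specify how the coordinate-system conversion is performed. For a fixed road-end camera observing a locally planar road, one common choice is a pre-calibrated ground-plane homography; the sketch below makes that assumption and treats targets and lane line points as lying on the road plane (z = 0). The 3x3 homography and the interface are illustrative, not taken from the patent.

    import numpy as np

    def pixels_to_road_plane(points_uv, homography):
        """Map pixel coordinates to road-plane coordinates with a 3x3 homography.

        `homography` is assumed to have been calibrated offline for the fixed
        road-end camera; each output point gets z = 0 (on the road plane).
        """
        pts = np.asarray(points_uv, dtype=float)                  # shape (N, 2)
        homogeneous = np.hstack([pts, np.ones((pts.shape[0], 1))])
        mapped = homogeneous @ np.asarray(homography, dtype=float).T
        xy = mapped[:, :2] / mapped[:, 2:3]                       # perspective divide
        return np.hstack([xy, np.zeros((pts.shape[0], 1))])       # append z = 0

Under this planar-road assumption, applying the mapping to the second target data and the two-dimensional corrected lane line data would yield the three-dimensional target data and three-dimensional lane line data of step S204.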
S205: correcting the three-dimensional target data and the three-dimensional lane line data based on the three-dimensional lane line data and a preset lane line three-dimensional coordinate true value to obtain three-dimensional corrected target data and three-dimensional corrected lane line data;
in a specific implementation, since the position of the road-end sensing device is not changed and the position of the lane line in the real environment is not changed, a true value of the lane line in the three-dimensional coordinate system, that is, the true value of the lane line three-dimensional coordinate, may be set in advance, and the true value of the lane line three-dimensional coordinate may be obtained by point-sampling calibration in the real environment.
The step S205 may specifically include: calculating the error between the three-dimensional coordinate of the lane line and the true value of the three-dimensional coordinate of the lane line based on a preset equation to obtain an error curve;
and correcting the three-dimensional coordinates of the target and the three-dimensional coordinates of the lane line according to the error curve to obtain the corrected three-dimensional coordinates of the target and the corrected three-dimensional coordinates of the lane line.
S206: and determining second target lane data based on the three-dimensional corrected target data and the true value of the three-dimensional coordinate of the lane line, and adding the second target lane data into the three-dimensional corrected target data to obtain third target data.
The step S206 may specifically include: judging the position relation of the target and the lane line according to the corrected three-dimensional coordinate of the target and the three-dimensional coordinate truth value of the lane line, and determining the data of the lane where the target is located;
and adding the data of the lane where the target is located into the three-dimensional correction target data to obtain third target data.
In a specific implementation, once the corrected three-dimensional coordinates of the target are obtained, they can be compared with the true values of the three-dimensional lane line coordinates to determine the position of the target relative to the lane line. For example, if the left side of the detected lane line is a first lane and the right side is a second lane, and comparing the corrected three-dimensional coordinates of the target with the true three-dimensional lane line coordinates confirms that the target vehicle is on the left side of the lane line, the target vehicle can be determined to be in the first lane. The serial number of the lane in which the target vehicle is located can then be added to the target information to obtain three-dimensional target information carrying the data of the lane in which the target is located.
As an optional implementation, the target correction method in the embodiment of the present application further includes:
S207: acquiring fourth target data through a millimeter wave radar configured in the target correction system, and acquiring fifth target data through a laser radar configured in the target correction system;
S208: fusing the third target data, the fourth target data and the fifth target data to obtain sixth target data.
In addition, the weight of each sensor in the multi-sensor fusion can be adaptively adjusted, which reduces target jumps; a sketch of one possible weighting scheme follows.
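The patent does not spell out the fusion rule or how the sensor weights are adapted. The sketch below assumes inverse-variance weighting of the per-sensor position estimates, where updating the variances online (for example from recent residuals) would be one way to realize the adaptive weighting mentioned above; both the weighting rule and the interface are assumptions.

    import numpy as np

    def fuse_targets(estimates, variances):
        """Fuse per-sensor position estimates of one target with inverse-variance weights.

        `estimates` : (x, y, z) positions of the same target from the camera pipeline
                      (third target data), the millimeter wave radar (fourth target data),
                      and the laser radar (fifth target data).
        `variances` : per-sensor position variances; smaller variance means a larger
                      weight, so noisier sensors are automatically down-weighted.
        """
        est = np.asarray(estimates, dtype=float)          # shape (num_sensors, 3)
        weights = 1.0 / np.asarray(variances, dtype=float)
        weights /= weights.sum()
        return weights @ est                              # weighted average -> sixth target data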
An embodiment of the present application further provides a target correction device, and fig. 3 is a schematic structural diagram of the target correction device provided in the embodiment of the present application, and as shown in fig. 3, the device includes:
a first obtaining module 301, configured to obtain two-dimensional target data and two-dimensional lane line data through a camera configured in a target modification system;
the first correction module 302 is configured to correct the two-dimensional target data and the two-dimensional lane line data based on the two-dimensional lane line data and a preset lane line two-dimensional coordinate true value, so as to obtain two-dimensional corrected target data and two-dimensional corrected lane line data;
a first determining module 303, configured to determine first target lane data based on the two-dimensional corrected target data and a true value of the two-dimensional coordinate of the lane line, and add the first target lane data to the two-dimensional corrected target data to obtain second target data;
the coordinate conversion module 304 is configured to perform coordinate system conversion on the second target data and the two-dimensional corrected lane line data to obtain three-dimensional target data and three-dimensional lane line data;
a second correction module 305, configured to correct the three-dimensional target data and the three-dimensional lane line data based on the three-dimensional lane line data and a preset lane line three-dimensional coordinate true value, so as to obtain three-dimensional corrected target data and three-dimensional corrected lane line data;
the second determining module 306 is configured to determine second target lane data based on the three-dimensional corrected target data and the true value of the three-dimensional coordinate of the lane line, and add the second target lane data to the three-dimensional corrected target data to obtain third target data.
The embodiment of the application also provides a target correction system, which comprises a road end sensing device and a storage device;
the road end sensing device comprises a camera, a millimeter wave radar and a laser radar, wherein the camera is used for acquiring two-dimensional target data, the millimeter wave radar is used for acquiring fourth target data, and the laser radar is used for acquiring fifth target data;
the storage device is used for storing a preset two-dimensional coordinate true value and a preset three-dimensional coordinate true value of the lane line.
The embodiment of the present application provides a computer storage medium, in which at least one instruction or at least one program is stored, and the at least one instruction or the at least one program is loaded and executed by a processor to implement the above target correction method.
Optionally, in this embodiment, the storage medium may be located in at least one of a plurality of network servers of a computer network. Optionally, in this embodiment, the storage medium may include, but is not limited to: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, and various other media capable of storing program code.
As can be seen from the embodiments of the target correction method, device, system, and storage medium provided by the present application, the detected target data and lane line data are corrected for the first time at the image-data level using the preset true values of the two-dimensional lane line coordinates, and the data of the lane in which the target is located are added to the target data, so that the target information is made more complete and accurate positioning of the target is facilitated. After coordinate conversion, the target data and lane line data corresponding to the real world are corrected a second time using the preset true values of the three-dimensional lane line coordinates, and the corrected lane data are added again. Two corrections are thus performed, further improving the accuracy of the target positioning result.
It should be noted that: the sequence of the embodiments of the present application is only for description, and does not represent the advantages and disadvantages of the embodiments. And specific embodiments thereof have been described above. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the apparatus embodiment, since it is substantially similar to the method embodiment, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A target correction method based on vehicle-road cooperation is characterized by comprising the following steps:
acquiring two-dimensional target data and two-dimensional lane line data through a camera configured in a target correction system;
correcting the two-dimensional target data and the two-dimensional lane line data based on the two-dimensional lane line data and a preset two-dimensional coordinate true value of a lane line to obtain two-dimensional corrected target data and two-dimensional corrected lane line data;
determining first target lane data based on the two-dimensional corrected target data and the lane line two-dimensional coordinate true value, and adding the first target lane data into the two-dimensional corrected target data to obtain second target data;
converting a coordinate system of the second target data and the two-dimensional corrected lane line data to obtain three-dimensional target data and three-dimensional lane line data;
correcting the three-dimensional target data and the three-dimensional lane line data based on the three-dimensional lane line data and a preset three-dimensional coordinate true value of the lane line to obtain three-dimensional corrected target data and three-dimensional corrected lane line data;
and determining second target lane data based on the three-dimensional corrected target data and the lane line three-dimensional coordinate true value, and adding the second target lane data into the three-dimensional corrected target data to obtain third target data.
2. The method of claim 1, further comprising:
acquiring fourth target data through a millimeter wave radar configured in the target correction system, and acquiring fifth target data through a laser radar configured in the target correction system;
and fusing the third target data, the fourth target data and the fifth target data to obtain sixth target data.
3. The method of claim 1, wherein the two-dimensional target data comprises two-dimensional coordinates of a target, and the two-dimensional lane line data comprises two-dimensional coordinates of a lane line;
the three-dimensional object data includes three-dimensional coordinates of an object, and the three-dimensional lane line data includes three-dimensional coordinates of a lane line.
4. The method of claim 3, wherein the correcting the two-dimensional target data and the two-dimensional lane line data based on the two-dimensional lane line data and a preset two-dimensional coordinate true value of a lane line to obtain two-dimensional corrected target data and two-dimensional corrected lane line data comprises:
calculating an error between the two-dimensional coordinate of the lane line and a truth value of the two-dimensional coordinate of the lane line based on a preset equation to obtain an error curve;
and correcting the two-dimensional coordinates of the target and the two-dimensional coordinates of the lane line according to the error curve to obtain the two-dimensional coordinates of the corrected target and the two-dimensional coordinates of the corrected lane line.
5. The method of claim 4, wherein the determining first target lane data based on the two-dimensional corrected target data and the lane line two-dimensional coordinate true value, and adding the first target lane data to the two-dimensional corrected target data to obtain second target data, comprises:
judging the position relation between the target and the lane line according to the corrected two-dimensional coordinate of the target and the truth value of the two-dimensional coordinate of the lane line, and determining the data of the lane where the target is located;
and adding the data of the lane where the target is located into the two-dimensional correction target data to obtain second target data.
6. The method of claim 3, wherein the correcting the three-dimensional target data and the three-dimensional lane line data based on the three-dimensional lane line data and a preset three-dimensional coordinate true value of a lane line to obtain three-dimensional corrected target data and three-dimensional corrected lane line data comprises:
calculating an error between the three-dimensional coordinate of the lane line and a true value of the three-dimensional coordinate of the lane line based on a preset equation to obtain an error curve;
and correcting the three-dimensional coordinates of the target and the three-dimensional coordinates of the lane line according to the error curve to obtain the three-dimensional coordinates of the corrected target and the three-dimensional coordinates of the corrected lane line.
7. The method of claim 6, wherein the determining second target lane data based on the three-dimensional corrected target data and the lane line three-dimensional coordinate truth value, and adding the second target lane data to the three-dimensional corrected target data to obtain third target data, comprises:
judging the position relation between the target and the lane line according to the corrected three-dimensional coordinate of the target and the three-dimensional coordinate truth value of the lane line, and determining the data of the lane where the target is located;
and adding the data of the lane where the target is located into the three-dimensional correction target data to obtain third target data.
8. An object correction device, characterized by comprising:
the first acquisition module is used for acquiring two-dimensional target data and two-dimensional lane line data through a camera configured in the target correction system;
the first correction module is used for correcting the two-dimensional target data and the two-dimensional lane line data based on the two-dimensional lane line data and a preset two-dimensional coordinate true value of the lane line to obtain two-dimensional corrected target data and two-dimensional corrected lane line data;
the first determining module is used for determining first target lane data based on the two-dimensional corrected target data and the truth value of the two-dimensional coordinate of the lane line, and adding the first target lane data into the two-dimensional corrected target data to obtain second target data;
the coordinate conversion module is used for carrying out coordinate system conversion on the second target data and the two-dimensional corrected lane line data to obtain three-dimensional target data and three-dimensional lane line data;
the second correction module is used for correcting the three-dimensional target data and the three-dimensional lane line data based on the three-dimensional lane line data and a preset three-dimensional coordinate true value of the lane line to obtain three-dimensional corrected target data and three-dimensional corrected lane line data;
and the second determining module is used for determining second target lane data based on the three-dimensional corrected target data and the true value of the three-dimensional coordinate of the lane line, and adding the second target lane data into the three-dimensional corrected target data to obtain third target data.
9. A target correction system is characterized by comprising a road end sensing device and a storage device;
the road end sensing device comprises a camera, a millimeter wave radar and a laser radar, wherein the camera is used for acquiring two-dimensional target data, the millimeter wave radar is used for acquiring fourth target data, and the laser radar is used for acquiring fifth target data;
the storage device is used for storing a preset two-dimensional coordinate true value and a preset three-dimensional coordinate true value of the lane line.
10. A computer storage medium, wherein at least one instruction or at least one program is stored, which is loaded and executed by a processor to implement the object correction method according to any one of claims 1 to 7.
CN202011216785.8A 2020-11-04 2020-11-04 Target correction method, device and system based on vehicle-road cooperation and storage medium Active CN112488150B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011216785.8A CN112488150B (en) 2020-11-04 2020-11-04 Target correction method, device and system based on vehicle-road cooperation and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011216785.8A CN112488150B (en) 2020-11-04 2020-11-04 Target correction method, device and system based on vehicle-road cooperation and storage medium

Publications (2)

Publication Number Publication Date
CN112488150A true CN112488150A (en) 2021-03-12
CN112488150B CN112488150B (en) 2023-04-07

Family

ID=74928329

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011216785.8A Active CN112488150B (en) 2020-11-04 2020-11-04 Target correction method, device and system based on vehicle-road cooperation and storage medium

Country Status (1)

Country Link
CN (1) CN112488150B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130116909A1 (en) * 2010-07-29 2013-05-09 Toyota Jidosha Kabushiki Kaisha Vehicle control system
JP2015161968A (en) * 2014-02-26 2015-09-07 日産自動車株式会社 Traffic lane identification apparatus, traffic lane change support apparatus, traffic lane identification method
CN109085570A (en) * 2018-06-10 2018-12-25 南京理工大学 Automobile detecting following algorithm based on data fusion
CN109946730A (en) * 2019-03-06 2019-06-28 东南大学 Ultra-wideband-based high-reliability fusion positioning method for vehicles under cooperation of vehicle and road
CN111522338A (en) * 2020-04-13 2020-08-11 吉利汽车研究院(宁波)有限公司 Vehicle-road cooperative control method, vehicle-road cooperative system and automatic driving device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
YIRAN DING et al.: "Vehicle Aided Positioning Method Based on Intelligent Identification", 2019 34th Youth Academic Annual Conference of Chinese Association of Automation (YAC) *
WANG Jiangfeng et al.: "Vehicle wireless positioning fusion algorithm based on error weighting in a vehicle-road cooperation environment", Proceedings of the 11th China Intelligent Transportation Annual Conference *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114863026A (en) * 2022-05-18 2022-08-05 禾多科技(北京)有限公司 Three-dimensional lane line information generation method, device, equipment and computer readable medium

Also Published As

Publication number Publication date
CN112488150B (en) 2023-04-07

Similar Documents

Publication Publication Date Title
JP6507201B2 (en) Method, apparatus, and apparatus for generating target detection information
CN111376271B (en) Method and device for controlling welding robot, welding robot and storage medium
JP2008250906A (en) Mobile robot, and self-location correction method and program
CN112166458B (en) Target detection and tracking method, system, equipment and storage medium
CN112731334B (en) Method and device for positioning vehicle by laser
CN112488150B (en) Target correction method, device and system based on vehicle-road cooperation and storage medium
CN112466147B (en) Multi-sensor-based library position detection method and related device
JP2008250905A (en) Mobile robot, self-location correction method and self-location correction program
CN115326051A (en) Positioning method and device based on dynamic scene, robot and medium
CN110865360B (en) Data fusion method and device
CN116304992A (en) Sensor time difference determining method, device, computer equipment and storage medium
CN117213510A (en) Automatic driving tracking travel information processing method, device, equipment and storage medium
CN113325415B (en) Fusion method and system of vehicle radar data and camera data
CN114371475A (en) Method, system, equipment and computer storage medium for optimizing calibration parameters
CN116148820A (en) Laser radar calibration method, computer equipment, readable storage medium and motor vehicle
CN115115704A (en) Method and device for determining automobile pose information
CN113643359A (en) Target object positioning method, device, equipment and storage medium
CN112068547A (en) Robot positioning method and device based on AMCL and robot
CN111982115A (en) Feature point map construction method, device and medium based on inertial navigation system
CN112946612B (en) External parameter calibration method and device, electronic equipment and storage medium
CN116228820B (en) Obstacle detection method and device, electronic equipment and storage medium
CN112633214B (en) Vehicle identification method and device
JP6991489B2 (en) Map evaluation device, map evaluation method and map evaluation program
CN117665796A (en) Target association method, target identification method, corresponding medium and system
CN115511938A (en) Height determining method and device based on monocular camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant