CN111079079A - Data correction method and device, electronic equipment and computer readable storage medium - Google Patents

Data correction method and device, electronic equipment and computer readable storage medium

Info

Publication number
CN111079079A
Authority
CN
China
Prior art keywords
data
unmanned vehicle
roadside
sample
transformation matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911200789.4A
Other languages
Chinese (zh)
Other versions
CN111079079B (en)
Inventor
曹获
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201911200789.4A priority Critical patent/CN111079079B/en
Publication of CN111079079A publication Critical patent/CN111079079A/en
Application granted granted Critical
Publication of CN111079079B publication Critical patent/CN111079079B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 - Complex mathematical operations
    • G06F17/16 - Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/09 - Arrangements for giving variable traffic instructions
    • G08G1/0962 - Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 - Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096708 - Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
    • G08G1/096725 - Systems involving transmission of highway information, e.g. weather, speed limits where the received information generates an automatic action on the vehicle control
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/09 - Arrangements for giving variable traffic instructions
    • G08G1/0962 - Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 - Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096766 - Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
    • G08G1/096791 - Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is another vehicle

Abstract

The application discloses a data correction method, a data correction device, electronic equipment and a computer readable storage medium, and relates to the technical field of automatic driving. The specific implementation scheme is as follows: acquiring roadside sensing data of the target unmanned vehicle; processing the roadside perception data according to a preset transformation matrix to obtain target correction data of the target unmanned vehicle; the preset transformation matrix is obtained by calculating roadside sensing data and truth value data of a plurality of sample unmanned vehicles. According to the embodiment of the application, the accuracy of the roadside sensing data of the unmanned vehicle can be improved.

Description

Data correction method and device, electronic equipment and computer readable storage medium
Technical Field
The application relates to the technical field of computers, in particular to the technical field of automatic driving.
Background
With the development of automatic driving technology and intelligent vehicle networking technology, detecting obstacles and road information around the unmanned vehicle, that is, accurately sensing in real time the position, speed, lane and other information of obstacles from the roadside, has become a prerequisite for safe driving. However, because roadside sensing systems are generally calibrated in a unified manner, the ground equations at different roadside locations are inconsistent, and other such factors, the accuracy of the roadside sensing data of existing unmanned vehicles is low, and the safety and reliability of automatic driving cannot be ensured.
Disclosure of Invention
The embodiment of the application provides a data correction method, a data correction device, electronic equipment and a computer readable storage medium, and aims to solve the problem that the accuracy of roadside sensing data of an existing unmanned vehicle is low.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides a data correction method, including:
acquiring roadside sensing data of the target unmanned vehicle;
processing the roadside perception data according to a preset transformation matrix to obtain target correction data of the target unmanned vehicle;
the preset transformation matrix is obtained by calculating roadside perception data and truth value data of a plurality of sample unmanned vehicles.
Therefore, roadside sensing data of the target unmanned vehicle can be corrected, the technical problem that the roadside sensing data of the existing unmanned vehicle are low in precision is solved, and the precision of the roadside sensing data is improved.
Optionally, before the obtaining of roadside sensing data of the target unmanned vehicle, the method further includes:
obtaining roadside perception data of the plurality of sample unmanned vehicles;
obtaining truth value data of the plurality of sample unmanned vehicles;
and calculating a transformation matrix between the roadside perception data and the truth value data of the plurality of sample unmanned vehicles based on an iterative closest point (ICP) algorithm to obtain the preset transformation matrix.
In this way, the ICP algorithm can be used to match corresponding points in the roadside perception data and the truth value data of the sample unmanned vehicles, so that the roadside perception data approaches the truth value data as closely as possible, and the transformation matrix between the two is obtained.
Optionally, the obtaining of the truth data of the plurality of sample unmanned vehicles includes:
and respectively receiving truth value data sent by each sample unmanned vehicle, wherein the truth value data are obtained by the sample unmanned vehicles through positioning operation.
Thus, since the position information and the like obtained by vehicle positioning is generally of high accuracy, using data obtained by the positioning operation as truth value data can ensure the accuracy of the preset transformation matrix.
Optionally, the roadside sensing data of the sample unmanned vehicle includes: a position coordinate set of the center point and corner points of the sample unmanned vehicle obtained through roadside detection;
the truth value data of the sample unmanned vehicle includes: a position coordinate set of the center point and corner points of the sample unmanned vehicle obtained through the positioning operation.
Therefore, the unmanned vehicle can be accurately calibrated by means of the position coordinate sets of the center point and the corner points, and the accuracy of the preset transformation matrix is improved.
In a second aspect, an embodiment of the present application provides a data correction apparatus, including:
the first acquisition module is used for acquiring roadside perception data of the target unmanned vehicle;
the processing module is used for processing the roadside sensing data according to a preset transformation matrix to obtain target correction data of the target unmanned vehicle;
the preset transformation matrix is obtained by calculating roadside perception data and truth value data of a plurality of sample unmanned vehicles.
Optionally, the apparatus further comprises:
the second acquisition module is used for acquiring roadside perception data of the plurality of sample unmanned vehicles;
the third acquisition module is used for acquiring truth value data of the plurality of sample unmanned vehicles;
and the calculation module is used for calculating a transformation matrix between the roadside perception data and the truth value data of the plurality of sample unmanned vehicles based on an iterative closest point (ICP) algorithm to obtain the preset transformation matrix.
Optionally, the third obtaining module is specifically configured to:
and respectively receiving truth value data sent by each sample unmanned vehicle, wherein the truth value data are obtained by the sample unmanned vehicles through positioning operation.
Optionally, the roadside sensing data of the sample unmanned vehicle includes: a position coordinate set of the center point and corner points of the sample unmanned vehicle obtained through roadside detection;
the truth value data of the sample unmanned vehicle includes: a position coordinate set of the center point and corner points of the sample unmanned vehicle obtained through the positioning operation.
In a third aspect, an embodiment of the present application further provides an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a data correction method as described above.
In a fourth aspect, the present application further provides a non-transitory computer-readable storage medium storing computer instructions, where the computer instructions are configured to cause the computer to execute the data correction method described above.
One embodiment in the above application has the following advantages or benefits: the accuracy of the roadside sensing data of the unmanned vehicle can be improved. Because the technical means of processing the roadside sensing data of the target unmanned vehicle according to the preset transformation matrix to obtain the target correction data is adopted, the technical problem that the roadside sensing data of the existing unmanned vehicle is low in precision is solved, and the technical effect of improving the precision of the roadside sensing data can be achieved.
Other effects of the above-described alternative will be described below with reference to specific embodiments.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
FIG. 1 is a flow chart of a data correction method of an embodiment of the present application;
FIG. 2 is a schematic diagram of an application scenario in an embodiment of the present application;
FIG. 3 is a second exemplary illustration of an application scenario in an embodiment of the present application;
FIG. 4 is a block diagram of a data correction device for implementing a data correction method according to an embodiment of the present application;
fig. 5 is a block diagram of an electronic device for implementing the data correction method according to the embodiment of the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of the embodiments to aid understanding, and these details are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Referring to fig. 1, fig. 1 is a flowchart of a data correction method according to an embodiment of the present application, and as shown in fig. 1, the method includes the following steps:
step 101: and acquiring roadside sensing data of the target unmanned vehicle.
In this embodiment, the target unmanned vehicle may be a vehicle traveling at an intersection, on a road, or the like. The target unmanned vehicle may be one vehicle or a plurality of vehicles. Roadside sensing data of the target unmanned vehicle can be obtained through roadside detection by a roadside sensing system.
Optionally, the executing body of the data correction method of this embodiment may be a roadside computing device (also called a roadside computing unit). The roadside computing device may be a component of the roadside sensing system, or may be a computing device independent of the roadside sensing system; this is not limited in this embodiment.
Step 102: processing the roadside perception data according to a preset transformation matrix to obtain target correction data of the target unmanned vehicle.
In this embodiment, the preset transformation matrix may be obtained by calculation from the roadside sensing data and truth value data of a plurality of sample unmanned vehicles, and is used to subsequently correct (also referred to as rectify or precision-compensate) the roadside sensing data of the target unmanned vehicle, so as to improve the roadside sensing precision. For example, the number of sample unmanned vehicles may be 3, 4, or the like.
In one embodiment, the roadside sensing data of a sample unmanned vehicle can be obtained by roadside detection by the roadside sensing system, and the truth value data of the sample unmanned vehicle can be obtained by the sample unmanned vehicle through its positioning operation. It can be understood that position information and the like obtained by vehicle positioning is generally of high precision and can be used as a truth reference.
In the application scenario of this embodiment, the sample unmanned vehicles and the target unmanned vehicle may drive through the same intersection one after another. That is, after the transformation matrix is calculated using the roadside sensing data and the truth value data of the (sample) unmanned vehicles that drive through the intersection first, the roadside sensing data of the (target) unmanned vehicle that drives through the intersection later is processed according to the transformation matrix, that is, precision compensation is performed, so that the precision of the roadside sensing data of the unmanned vehicle passing through the intersection later is improved.
According to the data correction method, the roadside sensing data of the target unmanned vehicle are processed according to the preset transformation matrix to obtain the target correction data, and the roadside sensing data of the target unmanned vehicle can be corrected, so that the technical problem that the precision of the roadside sensing data of the existing unmanned vehicle is low is solved, and the precision of the roadside sensing data is improved.
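As an illustration of this correction step only, the following Python sketch applies a rigid transformation to detected points; the function name and values are illustrative, and it assumes the preset transformation has already been estimated as a rotation R and translation t that map roadside-detected coordinates into the truth coordinate frame (the estimation itself is described below).

```python
import numpy as np

def correct_roadside_points(points, R, t):
    """Apply a rigid transformation (R, t) to roadside-detected points
    (N x 3) so that they approximate the truth coordinates."""
    points = np.asarray(points, dtype=float)
    return points @ R.T + t  # same as computing R @ p + t for each row p

# Illustrative use with placeholder values: correct the detected center
# point and one corner point of a target unmanned vehicle.
R = np.eye(3)                            # placeholder rotation
t = np.array([0.3, -0.2, 0.0])           # placeholder translation (metres)
detected = np.array([[10.0, 5.0, 0.0],   # detected center point
                     [11.2, 6.1, 0.0]])  # detected corner point
corrected = correct_roadside_points(detected, R, t)
print(corrected)
```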
In this application, since the position of an unmanned vehicle can usually be calibrated by its center point and corner points, the roadside sensing data of the sample unmanned vehicle may include: a position coordinate set of the center point and corner points of the sample unmanned vehicle obtained through roadside detection; and the truth value data of the sample unmanned vehicle may include: a position coordinate set of the center point and corner points of the sample unmanned vehicle obtained through the positioning operation. Therefore, the unmanned vehicle can be accurately calibrated, and the accuracy of the preset transformation matrix is ensured.
Optionally, before step 101, the method may further include:
obtaining roadside sensing data of a plurality of sample unmanned vehicles;
obtaining truth value data of a plurality of sample unmanned vehicles;
and calculating a transformation matrix between the roadside perception data and the true value data of the plurality of sample unmanned vehicles based on an Iterative Closest Point (ICP) algorithm to obtain the preset transformation matrix.
The ICP algorithm is an iterative calculation method mainly used in computer vision for accurately registering depth images; it achieves accurate registration by iteratively minimizing the distance between corresponding points of the source data and the target data. Based on this, when the preset transformation matrix is calculated, the ICP algorithm can be used to match corresponding points in the roadside sensing data and the truth value data of the sample unmanned vehicles, so that the roadside sensing data approaches the truth value data as closely as possible, and the transformation matrix between the two is obtained.
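To make the iteration concrete, the following Python sketch shows a generic ICP-style loop; it is only an illustration under assumed names, where estimate_rigid_transform is the per-iteration rigid fit (an SVD-based version is sketched after formula (1) below) and correspondences are found by brute-force nearest neighbour.

```python
import numpy as np

def icp(source, target, estimate_rigid_transform, n_iters=20):
    """Align source points (N x 3) to target points (M x 3) by repeatedly
    matching nearest neighbours, fitting a rigid transform, and applying it,
    while accumulating the overall rotation and translation."""
    src = np.asarray(source, dtype=float).copy()
    tgt = np.asarray(target, dtype=float)
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(n_iters):
        # brute-force nearest-neighbour correspondences (for clarity only)
        dists = np.linalg.norm(src[:, None, :] - tgt[None, :, :], axis=2)
        matched = tgt[dists.argmin(axis=1)]
        R, t = estimate_rigid_transform(src, matched)  # per-iteration rigid fit
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

In the scenario of this application the center point and corner points of each sample vehicle are already paired with their roadside detections, so the correspondence search can be trivial and the fit reduces to the closed-form SVD solution sketched below.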
In this embodiment, because a sensing module in the roadside sensing system, such as a camera, has a 3D error caused by intrinsic and extrinsic parameters, the ground equation and the like, and this error conforms to a Euclidean transformation, the roadside sensing data and the truth value data of a sample unmanned vehicle can be related by the following model formula:
P2i = R·P1i + t + ei    (1)
wherein P1i denotes the position coordinate set of the center point and corner points of sample unmanned vehicle i obtained through the positioning operation, P2i denotes the position coordinate set of the center point and corner points of sample unmanned vehicle i obtained through roadside detection, R and t denote the rotation and translation of the Euclidean transformation, and ei denotes a labeling error, which can be fitted to a Gaussian distribution.
In one embodiment, after the roadside sensing data (the position coordinate sets of the center point and corner points obtained through roadside detection) and the truth value data (the position coordinate sets of the center point and corner points obtained through the positioning operation) of the plurality of sample unmanned vehicles are obtained, formula (1) can be solved in an ICP manner based on singular value decomposition (SVD): the corresponding points of the roadside sensing data and the truth value data are iterated continuously until they are accurately matched, yielding R and t, that is, the transformation matrix between the roadside sensing data and the truth value data of the sample unmanned vehicles.
It should be noted that this embodiment does not limit the method for calculating the transformation matrix between the roadside sensing data and the truth value data of the sample unmanned vehicles; the ICP solution may be performed by a conventional method, for example the SVD-based method.
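Purely as an illustrative sketch of that conventional SVD-based step (the names and usage below are my own, not the patent's), the rotation R and translation t relating two sets of already-corresponding 3D points can be estimated as follows:

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Estimate R and t such that dst ≈ R @ src + t, given corresponding
    3D points src (N x 3) and dst (N x 3), via the standard SVD solution."""
    src, dst = np.asarray(src, dtype=float), np.asarray(dst, dtype=float)
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)   # centroids
    H = (src - src_c).T @ (dst - dst_c)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:     # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

# Illustrative use: with P1 the truth center/corner points of the sample
# vehicles and P2 their roadside-detected counterparts, fitting
# R, t = estimate_rigid_transform(P2, P1) yields a transformation that maps
# subsequent roadside detections toward the truth frame: P2 @ R.T + t.
```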
Optionally, the process of obtaining the truth value data of the plurality of sample unmanned vehicles may include:
and respectively receiving truth value data sent by each sample unmanned vehicle, wherein the truth value data are obtained by the sample unmanned vehicles through positioning operation.
Thus, since the position information and the like obtained by vehicle positioning is generally of high accuracy, using data obtained by the positioning operation as truth value data can ensure the accuracy of the preset transformation matrix.
It should be noted that the truth value data may be actively sent to the roadside computing device by a sample unmanned vehicle when it passes through a specific location, or may be sent to the roadside computing device after relevant information is received. The relevant information may be request information sent by the roadside computing device for requesting the truth value data, or may be roadside perception information broadcast by the roadside sensing system, where the roadside perception information includes the information of all obstacles (vehicles) perceived by the roadside sensing system; after receiving the obstacle information, the sample unmanned vehicle finds the information belonging to itself based on a host vehicle discovery model, and returns the truth information obtained through the positioning operation together with the matched obstacle information to the roadside computing device. A minimal sketch of such host vehicle matching is given below.
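Purely as an illustration of one plausible host-vehicle discovery rule (the patent does not spell out the model), the sketch below selects the broadcast obstacle whose position is closest to the vehicle's own positioning result; the names and the distance threshold are assumptions.

```python
import numpy as np

def find_self_in_broadcast(own_position, obstacles, max_dist=2.0):
    """Return the broadcast obstacle record most likely describing the host
    vehicle itself, i.e. the one whose position is nearest to the vehicle's
    own positioning result, within an assumed max_dist in metres."""
    own_position = np.asarray(own_position, dtype=float)
    best, best_dist = None, float("inf")
    for obs in obstacles:  # each obs is a dict such as {"id": ..., "position": [x, y, z], ...}
        dist = np.linalg.norm(np.asarray(obs["position"], dtype=float) - own_position)
        if dist < best_dist:
            best, best_dist = obs, dist
    return best if best_dist <= max_dist else None
```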
The following describes the embodiments of the present application in detail with reference to fig. 2 and 3.
In the embodiment of the present application, as shown in fig. 2, the roadside sensing system 1 may include a sensing module 11 and a first V2X (vehicle-to-everything) sending module 12. The sensing module 11 is configured to detect roadside perception information (including at least the position coordinate data of the center point and corner points of each obstacle vehicle) at a corresponding location, such as all obstacle vehicles at an intersection. The first V2X sending module 12 is configured to broadcast the roadside perception information detected by the sensing module 11, for reception by each unmanned vehicle 2 at the intersection.
As shown in fig. 2, unmanned vehicle 2 (which may also be referred to as host vehicle 2) may include a second V2X receiving module 21, a host vehicle discovery module 22, and a positioning module 23. The second V2X receiving module 21 is used to receive the roadside perception information broadcast by the first V2X sending module 12. The host vehicle discovery module 22 is configured to discover the information pertaining to the host vehicle itself from the received obstacle vehicle information. The positioning module 23 is configured to obtain positioning information (to distinguish it from the perception information obtained by roadside detection: this is information computed by the positioning operation, has high precision, may be used as a truth reference and is therefore referred to as truth information, and includes at least the position coordinate data of the center point and corner points of the host vehicle). Understandably, for single-target tracking, because the precision of the roadside perception information is limited, the error would be abnormally large if unmanned vehicle 2 performed discovery using the roadside perception information; because the information computed by the positioning operation is accurate, the error is small when unmanned vehicle 2 performs discovery using that information.
Further, as shown in fig. 3, unmanned vehicle 2 (such as host vehicle A, host vehicle B, and host vehicle C shown in fig. 3, which correspond to sample unmanned vehicles) may also include a second V2X sending module 24. The second V2X sending module 24 may be configured to transmit the positioning information obtained by the positioning module 23, the vehicle's own size, and the roadside perception information matched by the host vehicle discovery module 22 back to the roadside computing device 3 through V2I (vehicle-to-infrastructure) communication, so that the roadside computing device 3 computes a transformation matrix for compensating, that is, correcting, the precision of the roadside perception information.
In one embodiment, the information (also referred to as host vehicle obstacle information) sent by the second V2X sending module 24 may include the following necessary fields (a minimal data-structure sketch follows the list):
1) id: an identification of the host vehicle;
2) position: the location of the host vehicle in a world coordinate system (such as the UTM coordinate system);
3) theta: a course angle of the host vehicle in a world coordinate system;
4) velocity: the velocity of the host vehicle, such as Vx, Vy, and Vz in the world coordinate system;
5) length: the length of the host vehicle;
6) width: the width of the host vehicle;
7) height: the height of the host vehicle;
8) polygon_point: the corner points of the host vehicle contour, for example the coordinates X, Y and Z of each point in the world coordinate system;
9) type: the type of the host vehicle, including: unknown, unknown moving, unknown stationary, non-motorized vehicle, pedestrian, and the like.
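To make the field list above concrete, here is a minimal Python sketch of such a host vehicle obstacle message; the class name, field types and example values are illustrative assumptions, not definitions from the patent or from any V2X standard.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class HostVehicleObstacleInfo:
    """Host vehicle information returned to the roadside computing device;
    field names follow the list above, types are assumed."""
    id: str                               # identification of the host vehicle
    position: Tuple[float, float, float]  # position in a world frame, e.g. UTM
    theta: float                          # heading (course) angle in the world frame
    velocity: Tuple[float, float, float]  # Vx, Vy and Vz in the world frame
    length: float                         # host vehicle length
    width: float                          # host vehicle width
    height: float                         # host vehicle height
    polygon_point: List[Tuple[float, float, float]] = field(default_factory=list)  # contour corner points (X, Y, Z)
    type: str = "unknown"                 # e.g. unknown, non-motorized, pedestrian
```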
As shown in fig. 3, the roadside computing device 3 may include a first V2X receiving module 31 and a precision compensation module 32. The first V2X receiving module 31 is used to receive the information sent by the second V2X sending module 24. The precision compensation module 32 is configured to match the roadside perception information (e.g., the position coordinate sets of the center point and corner points obtained through roadside detection) and the positioning information (e.g., the position coordinate sets of the center point and corner points obtained through the positioning operation) of a plurality of sample unmanned vehicles, such as host vehicle A, host vehicle B, and host vehicle C, to obtain a transformation matrix, correct the roadside perception information of obstacle vehicles subsequently detected by the roadside sensing system 1 by using the transformation matrix, so that the corrected data approaches the true values, and return the corrected data to the roadside sensing system 1. Optionally, the roadside computing device 3 may be a component of the roadside sensing system 1, or may exist independently of the roadside sensing system.
Referring to fig. 4, fig. 4 is a block diagram of a data correction apparatus for implementing the data correction method according to the embodiment of the present application, and as shown in fig. 4, the data correction apparatus 40 includes:
the first obtaining module 41 is configured to obtain roadside sensing data of the target unmanned vehicle;
the processing module 42 is configured to process the roadside sensing data according to a preset transformation matrix to obtain target correction data of the target unmanned vehicle;
the preset transformation matrix is obtained by calculating roadside perception data and truth value data of a plurality of sample unmanned vehicles.
Optionally, the apparatus further comprises:
the second acquisition module is used for acquiring roadside perception data of the plurality of sample unmanned vehicles;
the third acquisition module is used for acquiring truth value data of the plurality of sample unmanned vehicles;
and the calculation module is used for calculating a transformation matrix between the roadside perception data and the truth value data of the plurality of sample unmanned vehicles based on an iterative closest point (ICP) algorithm to obtain the preset transformation matrix.
Optionally, the third obtaining module is specifically configured to:
and respectively receiving truth value data sent by each sample unmanned vehicle, wherein the truth value data are obtained by the sample unmanned vehicles through positioning operation.
Optionally, the roadside sensing data of the sample unmanned vehicle includes: a position coordinate set of the center point and corner points of the sample unmanned vehicle obtained through roadside detection;
the truth value data of the sample unmanned vehicle includes: a position coordinate set of the center point and corner points of the sample unmanned vehicle obtained through the positioning operation.
It can be understood that the data correction apparatus 40 according to the embodiment of the present application can implement each process implemented in the method embodiment shown in fig. 1 and achieve the same beneficial effects, and for avoiding repetition, the details are not repeated here.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
Fig. 5 is a block diagram of an electronic device for implementing the data correction method according to the embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the present application that are described and/or claimed herein.
As shown in fig. 5, the electronic apparatus includes: one or more processors 501, memory 502, and interfaces for connecting the various components, including high-speed interfaces and low-speed interfaces. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to the interface). In other embodiments, multiple processors and/or multiple buses may be used, along with multiple memories, as desired. Also, multiple electronic devices may be connected, with each device providing portions of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). In fig. 5, one processor 501 is taken as an example.
Memory 502 is a non-transitory computer readable storage medium as provided herein. The memory stores instructions executable by at least one processor to cause the at least one processor to perform the data correction method provided by the present application. The non-transitory computer-readable storage medium of the present application stores computer instructions for causing a computer to execute the data correction method provided by the present application.
The memory 502, which is a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules (e.g., the first obtaining module 41 and the processing module 42 shown in fig. 4) corresponding to the data correction method in the embodiment of the present application. The processor 501 executes various functional applications of the server and data processing by running non-transitory software programs, instructions, and modules stored in the memory 502, that is, implements the data correction method in the above method embodiment.
The memory 502 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created by use of the electronic device, and the like. Further, the memory 502 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, memory 502 optionally includes memory located remotely from processor 501, which may be connected to an electronic device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device of the data correction method may further include: an input device 503 and an output device 504. The processor 501, the memory 502, the input device 503 and the output device 504 may be connected by a bus or other means, and fig. 5 illustrates the connection by a bus as an example.
The input device 503 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic apparatus of the data correction method, such as an input device of a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, a joystick, or the like. The output devices 504 may include a display device, auxiliary lighting devices (e.g., LEDs), and haptic feedback devices (e.g., vibrating motors), among others. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application specific ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
According to the technical scheme of the embodiment of the application, the roadside sensing data of the target unmanned vehicle can be corrected, so that the technical problem that the roadside sensing data of the existing unmanned vehicle is low in precision is solved, and the precision of the roadside sensing data is improved.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present application can be achieved, and the present invention is not limited herein.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A method of data correction, comprising:
acquiring roadside sensing data of the target unmanned vehicle;
processing the roadside perception data according to a preset transformation matrix to obtain target correction data of the target unmanned vehicle;
the preset transformation matrix is obtained by calculating roadside perception data and truth value data of a plurality of sample unmanned vehicles.
2. The method of claim 1, wherein prior to obtaining roadside awareness data of the target unmanned vehicle, the method further comprises:
obtaining roadside perception data of the plurality of sample unmanned vehicles;
obtaining truth value data of the plurality of sample unmanned vehicles;
and calculating a transformation matrix between roadside perception data and truth value data of the plurality of sample unmanned vehicles based on an iterative closest point (ICP) algorithm to obtain the preset transformation matrix.
3. The method of claim 2, wherein the obtaining truth data for the plurality of sample unmanned vehicles comprises:
and respectively receiving truth value data sent by each sample unmanned vehicle, wherein the truth value data are obtained by the sample unmanned vehicles through positioning operation.
4. The method according to any one of claims 1 to 3,
the roadside perception data of the sample unmanned vehicle comprises: a position coordinate set of the center point and corner points of the sample unmanned vehicle obtained through roadside detection;
the truth value data of the sample unmanned vehicle comprises: a position coordinate set of the center point and corner points of the sample unmanned vehicle obtained through the positioning operation.
5. A data correction apparatus, comprising:
the first acquisition module is used for acquiring roadside perception data of the target unmanned vehicle;
the processing module is used for processing the roadside sensing data according to a preset transformation matrix to obtain target correction data of the target unmanned vehicle;
the preset transformation matrix is obtained by calculating roadside perception data and truth value data of a plurality of sample unmanned vehicles.
6. The apparatus of claim 5, further comprising:
the second acquisition module is used for acquiring roadside perception data of the plurality of sample unmanned vehicles;
the third acquisition module is used for acquiring truth value data of the plurality of sample unmanned vehicles;
and the calculation module is used for calculating a transformation matrix between the roadside perception data and the truth value data of the plurality of sample unmanned vehicles based on an iterative closest point (ICP) algorithm to obtain the preset transformation matrix.
7. The apparatus of claim 6,
the third obtaining module is specifically configured to:
and respectively receiving truth value data sent by each sample unmanned vehicle, wherein the truth value data are obtained by the sample unmanned vehicles through positioning operation.
8. The apparatus according to any one of claims 5 to 7,
the roadside perception data of the sample unmanned vehicle comprises: a position coordinate set of the center point and corner points of the sample unmanned vehicle obtained through roadside detection;
the truth value data of the sample unmanned vehicle comprises: a position coordinate set of the center point and corner points of the sample unmanned vehicle obtained through the positioning operation.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-4.
10. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-4.
CN201911200789.4A 2019-11-29 2019-11-29 Data correction method, device, electronic equipment and computer readable storage medium Active CN111079079B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911200789.4A CN111079079B (en) 2019-11-29 2019-11-29 Data correction method, device, electronic equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911200789.4A CN111079079B (en) 2019-11-29 2019-11-29 Data correction method, device, electronic equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN111079079A (en) 2020-04-28
CN111079079B CN111079079B (en) 2023-12-26

Family

ID=70312120

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911200789.4A Active CN111079079B (en) 2019-11-29 2019-11-29 Data correction method, device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111079079B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111932883A (en) * 2020-08-13 2020-11-13 上海电科市政工程有限公司 Method for guiding unmanned driving by utilizing broadcast communication of road side equipment
CN112816954A (en) * 2021-02-09 2021-05-18 中国信息通信研究院 Road side perception system evaluation method and system based on truth value
CN114581615A (en) * 2022-05-07 2022-06-03 江苏三棱智慧物联发展股份有限公司 Data processing method, device, equipment and storage medium
CN115265630A (en) * 2022-07-25 2022-11-01 科大国创合肥智能汽车科技有限公司 Method for screening static object identification information of sensor based on FDR
CN115825901A (en) * 2023-02-21 2023-03-21 南京楚航科技有限公司 Vehicle-mounted sensor perception performance evaluation truth value system

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040247030A1 (en) * 2003-06-09 2004-12-09 Andre Wiethoff Method for transcoding an MPEG-2 video stream to a new bitrate
CN105741546A (en) * 2016-03-18 2016-07-06 重庆邮电大学 Intelligent vehicle target tracking system through integration of road side equipment and vehicle sensor and method thereof
CN108415057A (en) * 2018-01-25 2018-08-17 南京理工大学 A kind of relative positioning method that unmanned fleet cooperates with roadside unit
CN207852108U (en) * 2018-03-12 2018-09-11 北京图森未来科技有限公司 A kind of bus or train route cooperative system and its bus or train route cooperate with trackside awareness apparatus
CN108762245A (en) * 2018-03-20 2018-11-06 华为技术有限公司 Data fusion method and relevant device
CN109901586A (en) * 2019-03-27 2019-06-18 厦门金龙旅行车有限公司 A kind of unmanned vehicle tracking control method, device, equipment and storage medium
CN110070139A (en) * 2019-04-28 2019-07-30 吉林大学 Small sample towards automatic Pilot environment sensing is in ring learning system and method
CN110103953A (en) * 2019-04-30 2019-08-09 北京百度网讯科技有限公司 For assisting method, equipment, medium and the system of the Driving control of vehicle
CN110231601A (en) * 2019-07-01 2019-09-13 百度在线网络技术(北京)有限公司 Sensor error compensation method, device, equipment and storage medium
CN110243358A (en) * 2019-04-29 2019-09-17 武汉理工大学 The unmanned vehicle indoor and outdoor localization method and system of multi-source fusion
CN110455554A (en) * 2019-09-03 2019-11-15 酷黑科技(北京)有限公司 A kind of unmanned vehicle test macro and method
CN110501013A (en) * 2019-08-07 2019-11-26 腾讯科技(深圳)有限公司 Position compensation method, apparatus and electronic equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Liu Xibin; Yang Wei; Chen Xiaole; Wang Linwei; Liu Shan: "Moving target trajectory prediction method based on acousto-optic array cooperation", Electronic Devices (电子器件), no. 02 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111932883A (en) * 2020-08-13 2020-11-13 上海电科市政工程有限公司 Method for guiding unmanned driving by utilizing broadcast communication of road side equipment
CN112816954A (en) * 2021-02-09 2021-05-18 中国信息通信研究院 Road side perception system evaluation method and system based on truth value
CN112816954B (en) * 2021-02-09 2024-03-26 中国信息通信研究院 Road side perception system evaluation method and system based on true value
CN114581615A (en) * 2022-05-07 2022-06-03 江苏三棱智慧物联发展股份有限公司 Data processing method, device, equipment and storage medium
CN115265630A (en) * 2022-07-25 2022-11-01 科大国创合肥智能汽车科技有限公司 Method for screening static object identification information of sensor based on FDR
CN115825901A (en) * 2023-02-21 2023-03-21 南京楚航科技有限公司 Vehicle-mounted sensor perception performance evaluation truth value system

Also Published As

Publication number Publication date
CN111079079B (en) 2023-12-26

Similar Documents

Publication Publication Date Title
CN111079079B (en) Data correction method, device, electronic equipment and computer readable storage medium
CN110738183B (en) Road side camera obstacle detection method and device
EP3862723A2 (en) Method and apparatus for detecting map quality
CN112415552A (en) Vehicle position determining method and device and electronic equipment
CN112132829A (en) Vehicle information detection method and device, electronic equipment and storage medium
CN111462029B (en) Visual point cloud and high-precision map fusion method and device and electronic equipment
CN112270669B (en) Human body 3D key point detection method, model training method and related devices
CN111310840B (en) Data fusion processing method, device, equipment and storage medium
CN111753765A (en) Detection method, device and equipment of sensing equipment and storage medium
CN111324945B (en) Sensor scheme determining method, device, equipment and storage medium
CN111523471B (en) Method, device, equipment and storage medium for determining lane where vehicle is located
CN111324115A (en) Obstacle position detection fusion method and device, electronic equipment and storage medium
CN111081033B (en) Method and device for determining orientation angle of vehicle
CN111368760A (en) Obstacle detection method and device, electronic equipment and storage medium
CN112184914A (en) Method and device for determining three-dimensional position of target object and road side equipment
CN111767360A (en) Method and device for marking virtual lane at intersection
CN114572240A (en) Vehicle travel control method, device, vehicle, electronic device, and storage medium
CN111693059B (en) Navigation method, device and equipment for roundabout and storage medium
CN112344855A (en) Obstacle detection method and device, storage medium and drive test equipment
CN112131335A (en) Lane-level map data processing method and device, electronic equipment and storage medium
CN111597287A (en) Map generation method, device and equipment
CN110843771B (en) Obstacle recognition method, obstacle recognition device, electronic device and storage medium
CN111462072A (en) Dot cloud picture quality detection method and device and electronic equipment
CN111275827A (en) Edge-based augmented reality three-dimensional tracking registration method and device and electronic equipment
CN110728721B (en) Method, device and equipment for acquiring external parameters

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant