CN111079079B - Data correction method, device, electronic equipment and computer readable storage medium - Google Patents
- Publication number
- CN111079079B CN111079079B CN201911200789.4A CN201911200789A CN111079079B CN 111079079 B CN111079079 B CN 111079079B CN 201911200789 A CN201911200789 A CN 201911200789A CN 111079079 B CN111079079 B CN 111079079B
- Authority
- CN
- China
- Prior art keywords
- data
- road side
- sample
- unmanned vehicle
- vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/16—Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096725—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information generates an automatic action on the vehicle control
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096766—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
- G08G1/096791—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is another vehicle
Abstract
The application discloses a data correction method, a data correction device, electronic equipment and a computer readable storage medium, relating to the technical field of automatic driving. The specific implementation scheme is as follows: obtaining road side perception data of a target unmanned vehicle; processing the road side perception data according to a preset transformation matrix to obtain target correction data of the target unmanned vehicle; wherein the preset transformation matrix is calculated by using road side perception data and truth value data of a plurality of sample unmanned vehicles. The method and device can improve the accuracy of road side perception data of unmanned vehicles.
Description
Technical Field
The application relates to the field of computer technology, in particular to the technical field of automatic driving.
Background
With the development of automatic driving technology and intelligent internet-of-vehicles technology, detection of obstacles around unmanned vehicles and of road information, namely real-time and accurate road side perception of obstacle position, speed, lane information and the like, has become a prerequisite for safe driving. However, because of factors such as the lack of unified calibration across road side perception systems and inconsistent ground equations at different road side locations, the accuracy of road side perception data for existing unmanned vehicles is low, and the safety and reliability of automatic driving cannot be ensured.
Disclosure of Invention
The embodiment of the application provides a data correction method, a data correction device, electronic equipment and a computer readable storage medium, so as to solve the problem that the accuracy of road side perception data of existing unmanned vehicles is low.
In order to solve the technical problems, the application is realized as follows:
in a first aspect, an embodiment of the present application provides a data correction method, including:
obtaining road side perception data of a target unmanned vehicle;
processing the road side perception data according to a preset transformation matrix to obtain target correction data of the target unmanned vehicle;
the preset transformation matrix is calculated by using road side perception data and truth value data of a plurality of sample unmanned vehicles.
Therefore, the road side perception data of the target unmanned vehicle can be corrected, the technical problem that the accuracy of the road side perception data of the existing unmanned vehicle is low is solved, and the accuracy of the road side perception data is improved.
Optionally, before the obtaining the road side perception data of the target unmanned vehicle, the method further includes:
acquiring road side perception data of the plurality of sample unmanned vehicles;
acquiring true value data of the plurality of sample unmanned vehicles;
and calculating a transformation matrix between the road side perception data and the truth value data of the plurality of sample unmanned vehicles based on an iterative closest point ICP algorithm to obtain the preset transformation matrix.
Therefore, the ICP algorithm can be used to match corresponding points between the road side perception data and the truth value data of the sample unmanned vehicles, so that the road side perception data approaches the truth value data as closely as possible, and a transformation matrix between the two is obtained.
Optionally, the acquiring the truth value data of the plurality of sample unmanned vehicles includes:
and respectively receiving truth value data sent by each sample unmanned vehicle, wherein the truth value data is obtained by the sample unmanned vehicle through a positioning operation.
In this way, since position information obtained by vehicle self-positioning is generally highly accurate, taking the data obtained by the positioning operation as truth value data ensures the accuracy of the preset transformation matrix.
Optionally, the road side perception data of the sample unmanned vehicle includes: the position coordinate sets of the center point and corner points of the sample unmanned vehicle obtained through road side detection;
the truth value data of the sample unmanned vehicle includes: the position coordinate sets of the center point and corner points of the sample unmanned vehicle obtained through the positioning operation.
Therefore, by means of the position coordinate sets of the center point and the corner points, accurate calibration of the unmanned vehicle can be achieved, thereby improving the accuracy of the preset transformation matrix.
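As a minimal sketch of the point sets described above, the following hypothetical helper (the function name and the 2D ground-plane simplification are assumptions, not part of the patent) derives the four corner points of a vehicle box from its center, size, and heading, and stacks them with the center into one matchable point set:

```python
import numpy as np

def vehicle_point_set(center, length, width, theta):
    """Hypothetical helper: derive the four ground-plane corner points
    of a vehicle box from its center, size, and heading angle, and
    stack them with the center into a 5 x 2 point set for matching."""
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s], [s, c]])
    half_extents = np.array([[ length / 2,  width / 2],
                             [ length / 2, -width / 2],
                             [-length / 2, -width / 2],
                             [-length / 2,  width / 2]])
    corners = half_extents @ rot.T + center   # rotate, then shift to center
    return np.vstack([np.asarray(center, float)[None, :], corners])

pts = vehicle_point_set(np.array([10.0, 5.0]), length=4.6, width=1.9, theta=0.0)
```

One such 5-point set per sample vehicle, from road side detection and from self-positioning respectively, would give the matched pairs the transformation matrix is fitted on.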
In a second aspect, an embodiment of the present application provides a data correction device, including:
the first acquisition module is used for acquiring road side perception data of the target unmanned vehicle;
the processing module is used for processing the road side perception data according to a preset transformation matrix to obtain target correction data of the target unmanned vehicle;
the preset transformation matrix is calculated by using road side perception data and truth value data of a plurality of sample unmanned vehicles.
Optionally, the apparatus further includes:
the second acquisition module is used for acquiring road side perception data of the plurality of sample unmanned vehicles;
the third acquisition module is used for acquiring true value data of the plurality of sample unmanned vehicles;
the calculating module is used for calculating a transformation matrix between the road side perception data and the truth value data of the plurality of sample unmanned vehicles based on an ICP algorithm to obtain the preset transformation matrix.
Optionally, the third obtaining module is specifically configured to:
and respectively receiving truth value data sent by each sample unmanned vehicle, wherein the truth value data is obtained by the sample unmanned vehicle through a positioning operation.
Optionally, the road side perception data of the sample unmanned vehicle includes: the position coordinate sets of the center point and corner points of the sample unmanned vehicle obtained through road side detection;
the truth value data of the sample unmanned vehicle includes: the position coordinate sets of the center point and corner points of the sample unmanned vehicle obtained through the positioning operation.
In a third aspect, an embodiment of the present application further provides an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the data correction method as described above.
In a fourth aspect, embodiments of the present application also provide a non-transitory computer-readable storage medium storing computer instructions for causing the computer to perform the data correction method as described above.
One embodiment of the above application has the following advantages: the accuracy of road side perception data of the unmanned vehicle can be improved. Because the technical means of processing the road side perception data of the target unmanned vehicle according to the preset transformation matrix to obtain the target correction data is adopted, the technical problem that the accuracy of road side perception data of existing unmanned vehicles is low is solved, and the accuracy of the road side perception data is improved.
Other effects of the above alternative will be described below in connection with specific embodiments.
Drawings
The drawings are for better understanding of the present solution and do not constitute a limitation of the present application. Wherein:
FIG. 1 is a flow chart of a data correction method of an embodiment of the present application;
FIG. 2 is a schematic diagram of an application scenario in an embodiment of the present application;
FIG. 3 is a second schematic view of an application scenario in an embodiment of the present application;
FIG. 4 is a block diagram of a data correction device for implementing the data correction method of the embodiments of the present application;
fig. 5 is a block diagram of an electronic device for implementing a data correction method of an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present application are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present application to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Referring to fig. 1, fig. 1 is a flowchart of a data correction method according to an embodiment of the present application, as shown in fig. 1, the method includes the following steps:
step 101: and obtaining road side perception data of the target unmanned vehicle.
In this embodiment, the target unmanned vehicle may be a vehicle traveling through an intersection, along a road, or the like. The target unmanned vehicle may be one vehicle or multiple vehicles. The road side perception data of the target unmanned vehicle can be obtained by a road side perception system through road side detection.
Alternatively, the execution subject of the data correction method of the present embodiment may be a roadside computing device (or referred to as a roadside computing unit). The roadside computing device may be a component of the roadside sensing system, or may be a computing device independent of the roadside sensing system, which is not limited in this embodiment.
Step 102: and processing the road side perception data according to a preset transformation matrix to obtain target correction data of the target unmanned vehicle.
In this embodiment, the preset transformation matrix may be calculated by using the road side perception data and the truth value data of multiple sample unmanned vehicles, and is subsequently used to correct (in other words, to perform accuracy compensation on) the road side perception data of the target unmanned vehicle, so as to improve road side perception accuracy. For example, the number of sample unmanned vehicles may be 3, 4, and so on.
In one embodiment, the road side perception data of the sample unmanned vehicle can be detected by a road side perception system through road side detection. The truth value data of the sample unmanned vehicle can be computed by the vehicle itself through a positioning operation. It can be appreciated that position information obtained by vehicle positioning is generally highly accurate and thus can be used as a truth value reference.
In the application scenario of this embodiment, the sample unmanned vehicles and the target unmanned vehicle may drive through the same intersection in sequence. That is, after the transformation matrix is calculated using the road side perception data and the truth value data of the (sample) unmanned vehicles that have driven through the intersection, the road side perception data of the (target) unmanned vehicle that subsequently drives through the intersection is processed, that is, accuracy compensation is performed according to the transformation matrix, so that the accuracy of the road side perception data of subsequent unmanned vehicles is improved.
According to the data correction method, the road side perception data of the target unmanned vehicle is processed according to the preset transformation matrix to obtain the target correction data, so that the road side perception data of the target unmanned vehicle can be corrected. This solves the technical problem that the accuracy of road side perception data of existing unmanned vehicles is low, and improves the accuracy of the road side perception data.
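The correction step itself can be sketched as applying a precomputed rigid transform to the road side perception points. This is a minimal illustration assuming a rotation matrix R and translation vector t (the function name and example values are hypothetical):

```python
import numpy as np

def correct_roadside_points(points, R, t):
    """Apply a precomputed rigid transform (R, t) to an N x 3 array of
    road side perception points, mapping them toward the truth frame."""
    points = np.asarray(points, dtype=float)
    return points @ R.T + t

# With an identity rotation, each point is simply shifted by t.
R = np.eye(3)
t = np.array([1.0, -2.0, 0.5])
raw = np.array([[10.0, 20.0, 0.0],
                [11.0, 21.0, 0.0]])
corrected = correct_roadside_points(raw, R, t)
```

In the patent's setting, (R, t) would come from the preset transformation matrix fitted on the sample vehicles, and `raw` from road side detection of the target vehicle.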
In this embodiment of the application, since the position of the unmanned vehicle may be calibrated by means of its center point and corner points, the road side perception data of the sample unmanned vehicle may include: the position coordinate sets of the center point and corner points of the sample unmanned vehicle obtained through road side detection; and the truth value data of the sample unmanned vehicle may include: the position coordinate sets of the center point and corner points of the sample unmanned vehicle obtained through the positioning operation. Therefore, the unmanned vehicle can be accurately calibrated, and the accuracy of the preset transformation matrix is ensured.
Optionally, before step 101, the method may further include:
obtaining road side perception data of a plurality of sample unmanned vehicles;
acquiring truth value data of a plurality of sample unmanned vehicles;
and calculating a transformation matrix between the road side perception data and the true value data of the plurality of sample unmanned vehicles based on an iterative closest point (Iterative Closest Point, ICP) algorithm to obtain the preset transformation matrix.
The ICP algorithm is an iterative calculation method, mainly used in computer vision for the precise registration of depth images, implemented by iterating to minimize the distance between corresponding points of the source data and the target data. On this basis, when calculating the preset transformation matrix, the ICP algorithm can be used to match corresponding points between the road side perception data and the truth value data of the sample unmanned vehicles, so that the road side perception data approaches the truth value data as closely as possible, and the transformation matrix between the two is obtained.
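The iterate-and-minimize idea can be sketched as a minimal point-to-point ICP loop. This is a generic textbook sketch, not the patent's implementation; the brute-force nearest-neighbour matching and the function name are assumptions:

```python
import numpy as np

def icp(source, target, iters=20):
    """Minimal point-to-point ICP sketch: repeatedly match each source
    point to its nearest target point, solve the rigid update (R, t)
    in closed form via SVD, and apply it, accumulating the total
    transform that moves the source onto the target."""
    src = np.asarray(source, float).copy()
    tgt = np.asarray(target, float)
    d = src.shape[1]
    R_total, t_total = np.eye(d), np.zeros(d)
    for _ in range(iters):
        # nearest-neighbour correspondences (brute force)
        dists = np.linalg.norm(src[:, None, :] - tgt[None, :, :], axis=2)
        matched = tgt[dists.argmin(axis=1)]
        # closed-form SVD solve on the centered, matched sets
        c1, c2 = src.mean(axis=0), matched.mean(axis=0)
        U, _, Vt = np.linalg.svd((src - c1).T @ (matched - c2))
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = c2 - R @ c1
        src = src @ R.T + t               # apply the update
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

In the patent's setting the correspondences (per-vehicle center and corner points) are largely known, so the loop would converge quickly; for fully unknown correspondences more iterations and outlier handling would be needed.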
In this embodiment, since the 3D error of a perception module in the road side perception system, such as a camera, caused by the intrinsic and extrinsic parameters and the ground equation conforms to a Euclidean transformation, the following model formula relates the road side perception data of the sample unmanned vehicle to the truth value data:

P2_i = R * P1_i + t + e_i    (1)

where P1_i represents the position coordinate set of the center point and corner points of sample unmanned vehicle i obtained through the positioning operation, P2_i represents the position coordinate set of the center point and corner points of sample unmanned vehicle i obtained through road side detection, R and t represent the Euclidean transformation (rotation and translation), and e_i represents an annotation error, which may conform to a Gaussian distribution.
In one embodiment, after obtaining the road side perception data (the position coordinate sets of the center point and corner points obtained by road side detection) and the truth value data (the position coordinate sets of the center point and corner points obtained by the positioning operation) of the plurality of sample unmanned vehicles, the ICP problem can be solved using singular value decomposition (SVD) according to formula (1): the corresponding points of the road side perception data and the truth value data are iteratively registered to obtain R and t, thereby obtaining the transformation matrix between the road side perception data and the truth value data of the sample unmanned vehicles.
Note that this embodiment does not limit the method of calculating the transformation matrix between the road side perception data and the truth value data of the sample unmanned vehicles; an existing method, for example the SVD-based ICP solving method mentioned above, may be adopted.
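With the correspondences already matched, the SVD solve for R and t in formula (1) has a well-known closed form (the Kabsch/Umeyama construction); a sketch under that assumption (the function name is hypothetical):

```python
import numpy as np

def solve_rigid_transform(P1, P2):
    """Closed-form SVD (Kabsch) solve for R and t in P2 ≈ R @ P1 + t,
    given two matched N x 3 point sets, as in formula (1)."""
    P1, P2 = np.asarray(P1, float), np.asarray(P2, float)
    c1, c2 = P1.mean(axis=0), P2.mean(axis=0)
    H = (P1 - c1).T @ (P2 - c2)      # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:         # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c2 - R @ c1
    return R, t
```

Given the stacked center-and-corner coordinate sets of the sample vehicles as P1 (truth) and P2 (road side), the returned (R, t) is the transformation matrix the method presets.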
Optionally, the process of acquiring the truth data of the plurality of sample unmanned vehicles may include:
and respectively receiving truth value data sent by each sample unmanned vehicle, wherein the truth value data is obtained by the sample unmanned vehicle through a positioning operation.
In this way, since position information obtained by vehicle self-positioning is generally highly accurate, taking the data obtained by the positioning operation as truth value data ensures the accuracy of the preset transformation matrix.
It should be noted that the truth value data may be actively sent to the road side computing device by a sample unmanned vehicle when it passes through a specific location, or may be sent by the sample unmanned vehicle after it receives related information. The related information may be request information sent by the road side computing device to request truth value data, or a road side perception message broadcast by the road side perception system. The road side perception message includes all obstacle (vehicle) information perceived by the road side perception system, so that after receiving the obstacle information, the sample unmanned vehicle identifies the information belonging to itself based on a host vehicle discovery model, and returns the truth value information obtained through the positioning operation, together with the matched obstacle information, to the road side computing device.
Embodiments of the present application are described in detail below in conjunction with fig. 2 and 3.
In the embodiment of the present application, as shown in fig. 2, the road side sensing system 1 may include a sensing module 11 and a first V2X (vehicle to everything) transmitting module 12. The sensing module 11 is configured to detect road side sensing information (at least including position coordinate data of a center point and a corner point of the obstacle vehicle) of all obstacle vehicles at corresponding positions, such as an intersection. The first V2X transmitting module 12 is configured to broadcast the road side sensing information detected by the sensing module 11, so as to be received by each unmanned vehicle 2 in the intersection.
As shown in fig. 2, the unmanned vehicle 2 (which may also be referred to as the host vehicle 2) may include a second V2X receiving module 21, a host vehicle discovery module 22, and a positioning module 23. The second V2X receiving module 21 is configured to receive the road side perception information broadcast by the first V2X transmitting module 12. The host vehicle discovery module 22 is configured to identify the information pertaining to the vehicle itself from the received obstacle vehicle information. The positioning module 23 is configured to obtain the positioning information of the unmanned vehicle 2 through a positioning operation (as distinguished from the perception information detected at the road side, this information is computed by the positioning operation). The accuracy of this information is high, so it can be used as a truth value reference and is called truth value information; it includes at least the position coordinate data of the vehicle's center point and corner points. It can be understood that, for single-target tracking, because the accuracy of the road side perception information is limited, the error would be abnormally large if the unmanned vehicle 2 used the road side perception information for self-discovery; because the accuracy of the information computed by the positioning operation is high, the error is small when the unmanned vehicle 2 uses that information instead.
Further, as shown in fig. 3, the drone 2 (such as the host vehicle a, host vehicle B, and host vehicle C shown in fig. 3, which correspond to the sample drone) may also include a second V2X sending module 24. The second V2X sending module 24 may be configured to send back, to the road side computing device 3, the positioning information obtained by the positioning module 23, the self-size, and the road side perceived information matched by the host vehicle discovery module 22 through V2I (vehicle to infrastructure), so that the road side computing device 3 calculates a transformation matrix for compensating, i.e. correcting, the accuracy of the road side perceived information.
In one embodiment, the information (also referred to as host vehicle obstacle information) sent by the second V2X sending module 24 may include the following required fields:
1) id: identification of the host vehicle;
2) position: position of the host vehicle in a world coordinate system (e.g., the UTM coordinate system);
3) theta: heading angle of the host vehicle in the world coordinate system;
4) velocity: velocity of the host vehicle, such as Vx, Vy, and Vz in world coordinates;
5) length: length of the host vehicle;
6) width: width of the host vehicle;
7) height: height of the host vehicle;
8) polygon_point: corner points of the host vehicle contour, as point coordinates X, Y, and Z in the world coordinate system;
9) type: type of the host vehicle, including: unknown, unknown moving, unknown stationary, non-motor vehicle, pedestrian, etc.
As shown in fig. 3, the road side computing device 3 may include a first V2X receiving module 31 and an accuracy compensation module 32. The first V2X receiving module 31 is configured to receive the information sent by the second V2X sending module 24. The accuracy compensation module 32 is configured to match the road side perception information (such as the position coordinate sets of center points and corner points obtained by road side perception) and the positioning information (such as the position coordinate sets of center points and corner points obtained by the positioning operation) of a plurality of sample unmanned vehicles, such as host vehicle A, host vehicle B, and host vehicle C, to obtain the transformation matrix; to correct the road side perception information of obstacle vehicles detected by the road side perception system 1 using the transformation matrix, so that the corrected data approximates the truth value; and to return the corrected data to the road side perception system 1. Alternatively, the road side computing device 3 may be an integral part of the road side perception system 1, or may exist independently of it.
Referring to fig. 4, fig. 4 is a block diagram of a data correction device for implementing the data correction method according to the embodiment of the present application, and as shown in fig. 4, the data correction device 40 includes:
a first obtaining module 41, configured to obtain road side perception data of a target unmanned vehicle;
the processing module 42 is configured to process the road side sensing data according to a preset transformation matrix to obtain target correction data of the target unmanned vehicle;
the preset transformation matrix is calculated by using road side perception data and truth value data of a plurality of sample unmanned vehicles.
Optionally, the apparatus further includes:
the second acquisition module is used for acquiring road side perception data of the plurality of sample unmanned vehicles;
the third acquisition module is used for acquiring true value data of the plurality of sample unmanned vehicles;
the calculating module is used for calculating a transformation matrix between the road side perception data and the truth value data of the plurality of sample unmanned vehicles based on an ICP algorithm to obtain the preset transformation matrix.
Optionally, the third obtaining module is specifically configured to:
and respectively receiving truth value data sent by each sample unmanned vehicle, wherein the truth value data is obtained by the sample unmanned vehicle through a positioning operation.
Optionally, the road side perception data of the sample unmanned vehicle includes: the position coordinate sets of the center point and the corner point of the sample unmanned vehicle are obtained through road side detection;
the truth data of the sample unmanned vehicle comprises: and through positioning operation, obtaining a position coordinate set of the center point and the corner point of the sample unmanned vehicle.
It can be appreciated that the data correction device 40 in this embodiment of the present application may implement each process implemented in the method embodiment shown in fig. 1 and achieve the same beneficial effects, and in order to avoid repetition, a detailed description is omitted here.
According to embodiments of the present application, an electronic device and a readable storage medium are also provided.
As shown in fig. 5, a block diagram of an electronic device for implementing the data correction method according to the embodiment of the present application is shown. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the application described and/or claimed herein.
As shown in fig. 5, the electronic device includes: one or more processors 501, a memory 502, and interfaces for connecting the components, including high-speed interfaces and low-speed interfaces. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions executed within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output device, such as a display device coupled to the interface. In other embodiments, multiple processors and/or multiple buses may be used, if desired, along with multiple memories. Also, multiple electronic devices may be connected, each providing a portion of the necessary operations (e.g., as a server array, a set of blade servers, or a multiprocessor system). One processor 501 is illustrated in fig. 5.
Memory 502 is a non-transitory computer readable storage medium provided herein. The memory stores instructions executable by at least one processor to cause the at least one processor to perform the data correction method provided herein. The non-transitory computer readable storage medium of the present application stores computer instructions for causing a computer to perform the data correction method provided by the present application.
The memory 502 is used as a non-transitory computer readable storage medium for storing non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules (e.g., the first acquisition module 41 and the processing module 42 shown in fig. 4) corresponding to the data correction method in the embodiment of the present application. The processor 501 executes various functional applications of the server and data processing by running non-transitory software programs, instructions, and modules stored in the memory 502, that is, implements the data correction method in the above-described method embodiments.
Memory 502 may include a program storage area and a data storage area; the program storage area may store an operating system and at least one application program required for functionality, and the data storage area may store data created through use of the electronic device, and the like. In addition, memory 502 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some embodiments, memory 502 may optionally include memory located remotely from processor 501, which may be connected to the electronic device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device for the data correction method may further include: an input device 503 and an output device 504. The processor 501, memory 502, input device 503, and output device 504 may be connected by a bus or in other manners; connection by a bus is taken as the example in fig. 5.
The input device 503 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device for the data correction method; examples include a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, and a joystick. The output device 504 may include a display device, auxiliary lighting devices (e.g., LEDs), haptic feedback devices (e.g., vibration motors), and the like. The display device may include, but is not limited to, a liquid crystal display (LCD), a light emitting diode (LED) display, and a plasma display. In some implementations, the display device may be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application-specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also referred to as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic disks, optical disks, memory, programmable logic devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local area networks (LANs), wide area networks (WANs), and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
According to the technical solution of the embodiments of the present application, the road side perception data of the target unmanned vehicle can be corrected, which solves the technical problem of low accuracy of the road side perception data of existing unmanned vehicles and improves the accuracy of the road side perception data.
It should be appreciated that the various forms of flow shown above may be used to reorder, add, or delete steps. For example, the steps described in the present application may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solutions disclosed in the present application can be achieved; no limitation is imposed herein.
The above embodiments do not limit the scope of the application. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present application are intended to be included within the scope of the present application.
Claims (6)
1. A data correction method, comprising:
obtaining road side perception data of a target unmanned vehicle;
processing the road side perception data according to a preset transformation matrix to obtain target correction data of the target unmanned vehicle;
wherein the preset transformation matrix is obtained by calculation based on road side perception data and truth data of a plurality of sample unmanned vehicles; and the road side perception data of the target unmanned vehicle is obtained by a road side perception system through road side detection;
before the obtaining of the road side perception data of the target unmanned vehicle, the method further comprises:
acquiring road side perception data of the plurality of sample unmanned vehicles;
acquiring truth data of the plurality of sample unmanned vehicles; and
calculating, based on an iterative closest point (ICP) algorithm, a transformation matrix between the road side perception data and the truth data of the plurality of sample unmanned vehicles to obtain the preset transformation matrix;
wherein the road side perception data of a sample unmanned vehicle comprises: a position coordinate set of the center point and corner points of the sample unmanned vehicle, obtained through road side detection; and
the truth data of a sample unmanned vehicle comprises: a position coordinate set of the center point and corner points of the sample unmanned vehicle, obtained through a positioning operation.
2. The method of claim 1, wherein the acquiring of the truth data of the plurality of sample unmanned vehicles comprises:
receiving, from each sample unmanned vehicle respectively, truth data obtained by that sample unmanned vehicle through a positioning operation.
3. A data correction device, comprising:
the first acquisition module is used for acquiring road side perception data of the target unmanned vehicle;
the processing module is used for processing the road side perception data according to a preset transformation matrix to obtain target correction data of the target unmanned vehicle;
wherein the preset transformation matrix is obtained by calculation based on road side perception data and truth data of a plurality of sample unmanned vehicles; and the road side perception data of the target unmanned vehicle is obtained by a road side perception system through road side detection;
the apparatus further comprises:
the second acquisition module is used for acquiring road side perception data of the plurality of sample unmanned vehicles;
the third acquisition module is used for acquiring truth data of the plurality of sample unmanned vehicles; and
the computing module is used for computing, based on an iterative closest point (ICP) algorithm, a transformation matrix between the road side perception data and the truth data of the plurality of sample unmanned vehicles to obtain the preset transformation matrix;
wherein the road side perception data of a sample unmanned vehicle comprises: a position coordinate set of the center point and corner points of the sample unmanned vehicle, obtained through road side detection; and
the truth data of a sample unmanned vehicle comprises: a position coordinate set of the center point and corner points of the sample unmanned vehicle, obtained through a positioning operation.
4. The apparatus of claim 3, wherein the third acquisition module is specifically configured to:
receive, from each sample unmanned vehicle respectively, truth data obtained by that sample unmanned vehicle through a positioning operation.
5. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-2.
6. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1-2.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911200789.4A CN111079079B (en) | 2019-11-29 | 2019-11-29 | Data correction method, device, electronic equipment and computer readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111079079A CN111079079A (en) | 2020-04-28 |
CN111079079B true CN111079079B (en) | 2023-12-26 |
Family
ID=70312120
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911200789.4A Active CN111079079B (en) | 2019-11-29 | 2019-11-29 | Data correction method, device, electronic equipment and computer readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111079079B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111932883B (en) * | 2020-08-13 | 2022-09-27 | 上海电科市政工程有限公司 | Method for guiding unmanned driving by utilizing broadcast communication of road side equipment |
CN112816954B (en) * | 2021-02-09 | 2024-03-26 | 中国信息通信研究院 | Road side perception system evaluation method and system based on true value |
CN114581615B (en) * | 2022-05-07 | 2022-08-26 | 江苏三棱智慧物联发展股份有限公司 | Data processing method, device, equipment and storage medium |
CN115265630B (en) * | 2022-07-25 | 2023-04-07 | 科大国创合肥智能汽车科技有限公司 | Method for screening static object identification information of sensor based on FDR |
CN115825901B (en) * | 2023-02-21 | 2023-04-28 | 南京楚航科技有限公司 | Vehicle-mounted sensor perception performance evaluation truth value system |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105741546A (en) * | 2016-03-18 | 2016-07-06 | 重庆邮电大学 | Intelligent vehicle target tracking system through integration of road side equipment and vehicle sensor and method thereof |
CN108415057A (en) * | 2018-01-25 | 2018-08-17 | 南京理工大学 | A kind of relative positioning method that unmanned fleet cooperates with roadside unit |
CN207852108U (en) * | 2018-03-12 | 2018-09-11 | 北京图森未来科技有限公司 | A kind of bus or train route cooperative system and its bus or train route cooperate with trackside awareness apparatus |
CN108762245A (en) * | 2018-03-20 | 2018-11-06 | 华为技术有限公司 | Data fusion method and relevant device |
CN109901586A (en) * | 2019-03-27 | 2019-06-18 | 厦门金龙旅行车有限公司 | A kind of unmanned vehicle tracking control method, device, equipment and storage medium |
CN110070139A (en) * | 2019-04-28 | 2019-07-30 | 吉林大学 | Small sample towards automatic Pilot environment sensing is in ring learning system and method |
CN110103953A (en) * | 2019-04-30 | 2019-08-09 | 北京百度网讯科技有限公司 | For assisting method, equipment, medium and the system of the Driving control of vehicle |
CN110231601A (en) * | 2019-07-01 | 2019-09-13 | 百度在线网络技术(北京)有限公司 | Sensor error compensation method, device, equipment and storage medium |
CN110243358A (en) * | 2019-04-29 | 2019-09-17 | 武汉理工大学 | The unmanned vehicle indoor and outdoor localization method and system of multi-source fusion |
CN110455554A (en) * | 2019-09-03 | 2019-11-15 | 酷黑科技(北京)有限公司 | A kind of unmanned vehicle test macro and method |
CN110501013A (en) * | 2019-08-07 | 2019-11-26 | 腾讯科技(深圳)有限公司 | Position compensation method, apparatus and electronic equipment |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040247030A1 (en) * | 2003-06-09 | 2004-12-09 | Andre Wiethoff | Method for transcoding an MPEG-2 video stream to a new bitrate |
- 2019-11-29: Application CN201911200789.4A filed in China (CN); granted as CN111079079B, status Active
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105741546A (en) * | 2016-03-18 | 2016-07-06 | 重庆邮电大学 | Intelligent vehicle target tracking system through integration of road side equipment and vehicle sensor and method thereof |
CN108415057A (en) * | 2018-01-25 | 2018-08-17 | 南京理工大学 | A kind of relative positioning method that unmanned fleet cooperates with roadside unit |
CN207852108U (en) * | 2018-03-12 | 2018-09-11 | 北京图森未来科技有限公司 | A kind of bus or train route cooperative system and its bus or train route cooperate with trackside awareness apparatus |
CN108762245A (en) * | 2018-03-20 | 2018-11-06 | 华为技术有限公司 | Data fusion method and relevant device |
CN109901586A (en) * | 2019-03-27 | 2019-06-18 | 厦门金龙旅行车有限公司 | A kind of unmanned vehicle tracking control method, device, equipment and storage medium |
CN110070139A (en) * | 2019-04-28 | 2019-07-30 | 吉林大学 | Small sample towards automatic Pilot environment sensing is in ring learning system and method |
CN110243358A (en) * | 2019-04-29 | 2019-09-17 | 武汉理工大学 | The unmanned vehicle indoor and outdoor localization method and system of multi-source fusion |
CN110103953A (en) * | 2019-04-30 | 2019-08-09 | 北京百度网讯科技有限公司 | For assisting method, equipment, medium and the system of the Driving control of vehicle |
CN110231601A (en) * | 2019-07-01 | 2019-09-13 | 百度在线网络技术(北京)有限公司 | Sensor error compensation method, device, equipment and storage medium |
CN110501013A (en) * | 2019-08-07 | 2019-11-26 | 腾讯科技(深圳)有限公司 | Position compensation method, apparatus and electronic equipment |
CN110455554A (en) * | 2019-09-03 | 2019-11-15 | 酷黑科技(北京)有限公司 | A kind of unmanned vehicle test macro and method |
Non-Patent Citations (2)
Title |
---|
Liu Xibin; Yang Wei; Chen Xiaole; Wang Linwei; Liu Shan. "Trajectory pre-estimation method for moving targets through acousto-optic array cooperation." Electronic Devices (电子器件), 2019, No. 02, full text. *
Also Published As
Publication number | Publication date |
---|---|
CN111079079A (en) | 2020-04-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111079079B (en) | Data correction method, device, electronic equipment and computer readable storage medium | |
US11615605B2 (en) | Vehicle information detection method, electronic device and storage medium | |
CN112415552B (en) | Vehicle position determining method and device and electronic equipment | |
CN110738183B (en) | Road side camera obstacle detection method and device | |
CN110793544B (en) | Method, device and equipment for calibrating parameters of roadside sensing sensor and storage medium | |
CN110806215B (en) | Vehicle positioning method, device, equipment and storage medium | |
CN111784836B (en) | High-precision map generation method, device, equipment and readable storage medium | |
EP3968266B1 (en) | Obstacle three-dimensional position acquisition method and apparatus for roadside computing device | |
CN111324945B (en) | Sensor scheme determining method, device, equipment and storage medium | |
JP2021101365A (en) | Positioning method, positioning device, and electronic device | |
CN112101209B (en) | Method and apparatus for determining world coordinate point cloud for roadside computing device | |
CN111767853B (en) | Lane line detection method and device | |
CN112270669A (en) | Human body 3D key point detection method, model training method and related device | |
CN112184914B (en) | Method and device for determining three-dimensional position of target object and road side equipment | |
CN111310840B (en) | Data fusion processing method, device, equipment and storage medium | |
CN111523471B (en) | Method, device, equipment and storage medium for determining lane where vehicle is located | |
CN111578839B (en) | Obstacle coordinate processing method and device, electronic equipment and readable storage medium | |
CN113844463B (en) | Vehicle control method and device based on automatic driving system and vehicle | |
KR102694715B1 (en) | Method for detecting obstacle, electronic device, roadside device and cloud control platform | |
JP7194217B2 (en) | Obstacle speed determination method, device, electronic device, storage medium and computer program | |
CN111949816B (en) | Positioning processing method, device, electronic equipment and storage medium | |
CN111462072B (en) | Point cloud picture quality detection method and device and electronic equipment | |
CN112344855A (en) | Obstacle detection method and device, storage medium and drive test equipment | |
CN112184828A (en) | External parameter calibration method and device for laser radar and camera and automatic driving vehicle | |
CN112102417A (en) | Method and device for determining world coordinates and external reference calibration method for vehicle-road cooperative roadside camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |