CN116148820A - Laser radar calibration method, computer equipment, readable storage medium and motor vehicle - Google Patents

Laser radar calibration method, computer equipment, readable storage medium and motor vehicle

Info

Publication number
CN116148820A
CN116148820A (application CN202211715511.2A)
Authority
CN
China
Prior art keywords
detection frame
radar
laser radar
visual
detection
Prior art date
Legal status
Pending
Application number
CN202211715511.2A
Other languages
Chinese (zh)
Inventor
盛军辉
章霖超
谢钱昆
王煜
Current Assignee
Zhejiang Zero Run Technology Co Ltd
Original Assignee
Zhejiang Zero Run Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Zero Run Technology Co Ltd filed Critical Zhejiang Zero Run Technology Co Ltd
Priority to CN202211715511.2A priority Critical patent/CN116148820A/en
Publication of CN116148820A publication Critical patent/CN116148820A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating
    • G01S7/4972Alignment of sensor

Abstract

The invention discloses a laser radar (lidar) calibration method, computer equipment, a readable storage medium and a motor vehicle, relating to the technical field of automatic driving. The method comprises the following steps: the lidar obtains ground-truth data of a measured target; a vision sensor acquires a surround-view image; the ground-truth data of the measured target are projected onto the corresponding surround-view image to form a 3D detection frame; a corresponding radar 2D detection frame is formed based on the 3D detection frame; target detection is performed on the surround-view image to form a corresponding visual 2D detection frame; the intersection over union (IoU) of the radar 2D detection frame and the visual 2D detection frame is calculated; and the radar 2D detection frame is matched with the visual 2D detection frame based on the IoU so as to calibrate the lidar. The calibration method provided by the invention can correct false detections, missed detections and the like in the ground-truth data produced by the lidar algorithm, improving the accuracy of the ground-truth data.

Description

Laser radar calibration method, computer equipment, readable storage medium and motor vehicle
Technical Field
The invention relates to the technical field of automatic driving, and in particular to a laser radar (lidar) calibration method, computer equipment, a readable storage medium and a motor vehicle.
Background
With the continuous progress of vehicle technology, automatic driving has become a hot topic and receives wide attention from major vehicle manufacturers. Environmental perception, as the first link in an automatic driving system, is an important basis for a vehicle's behavior decisions and path planning. Perception algorithms include vision-based and radar-based approaches. Perception algorithms developed on lidar are generally used for 3D target detection and can obtain the position, size, pose and category of surrounding targets such as vehicles and pedestrians in the driving environment. Compared with a pure-vision perception algorithm, a lidar-based perception algorithm can generally keep the error in the position, size and pose of a detected target at the centimeter level, an accuracy far higher than that of a pure-vision algorithm. On the other hand, pure-vision perception offers rich semantic information and can acquire more accurate object category information. At present, automatic-driving development generally relies on a data-acquisition team and a manual-labeling team, and this acquire-then-label process takes a great deal of time to accumulate ground-truth data for the algorithm team's model training. Besides manual labeling, a ground-truth acquisition vehicle must be purchased from a supplier or built in-house and paired with a ground-truth algorithm to obtain the required target information, consuming considerable manpower, material and financial resources.
In addition, ground-truth data of target objects obtained from a lidar perception algorithm inevitably contain false detections and missed detections, and such erroneous data can cause serious errors in other algorithms or functions developed on top of them.
Disclosure of Invention
In order to solve the above problems, the invention provides a lidar calibration method which corrects false detections, missed detections and the like in the ground-truth data obtained by the lidar algorithm and improves the accuracy of the ground-truth data.
In order to achieve the above purpose, the invention adopts the following technical scheme:
a laser radar calibration method comprising the steps of:
the lidar obtains ground-truth data of measured targets around the vehicle in the world coordinate system; a vision sensor acquires surround-view images of the vehicle's surroundings;
projecting the ground-truth data of the measured targets around the vehicle onto the corresponding surround-view image to form a 3D detection frame;
forming a corresponding radar 2D detection frame based on the 3D detection frame;
performing target detection on the surround-view image to form a corresponding visual 2D detection frame;
calculating the intersection over union (IoU) of the radar 2D detection frame and the visual 2D detection frame;
and matching the radar 2D detection frame with the visual 2D detection frame based on the IoU so as to calibrate the lidar.
Optionally, forming the corresponding radar 2D detection frame based on the 3D detection frame comprises the following steps:
acquiring the pixel coordinates of the 8 corner points of the 3D detection frame projected on the corresponding surround-view image, taking the maximum and minimum values of the 8 corner points along the U axis and V axis of the image coordinate system to generate two new corner points, and generating the corresponding radar 2D detection frame from the two new corner points.
Optionally, matching the radar 2D detection frame and the visual 2D detection frame based on the IoU comprises the following steps:
calculating the IoU between every radar 2D detection frame and every visual 2D detection frame;
constructing a cost matrix from the calculated IoU values;
in the cost matrix, taking the assignment that maximizes the sum of the cost values as the optimal matching.
Optionally, handling the matching result between the radar 2D detection frame and the visual 2D detection frame comprises:
if the IoU is greater than 0 and the categories match, retaining the ground-truth data of the corresponding measured target;
if a radar 2D detection frame matches no visual 2D detection frame, removing the ground-truth data of the corresponding measured target;
if a visual 2D detection frame matches no radar 2D detection frame, judging whether the corresponding measured target exists; if it exists, supplementing the ground-truth data of the corresponding measured target, and if not, removing the visual 2D detection frame.
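The three outcome rules above can be sketched as a small filter. The function and callback names (`classes_match` for the category check, `target_exists` for the point-cloud existence check) are hypothetical, not from the patent:

```python
def filter_truth_data(pairs, classes_match, target_exists):
    """Apply the three matching-outcome rules.

    pairs: dict mapping vision-box index -> matched radar-box index,
           or None when the vision box found no radar match.
    Returns (kept, added): radar indices whose ground truth is retained,
    and vision indices whose missed targets should be supplemented.
    """
    kept, added = set(), []
    for v, r in pairs.items():
        if r is not None and classes_match(v, r):
            kept.add(r)            # rule 1: matched and categories agree
        elif r is None and target_exists(v):
            added.append(v)        # rule 3: lidar missed a real target
        # otherwise the vision box is a false positive and is dropped
    # rule 2: radar boxes absent from `kept` are treated as false
    # detections; the caller removes their ground-truth entries
    return kept, added
```

Radar boxes never referenced in `pairs` simply stay out of `kept`, which is how rule 2 (removing false detections) falls out of the sketch.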
Optionally, judging whether the corresponding measured target exists comprises the following steps:
projecting the point cloud data around the vehicle onto the corresponding surround-view image and recording all points falling inside the visual 2D detection frame; then clustering those points in the point-cloud space; if the clustering succeeds and the clustering result meets the height condition of the preset category, the measured target is considered to exist; if the clustering fails or the height of the clustered 3D detection frame does not meet the preset category, the measured target is considered absent.
According to the technical scheme provided by the invention, a vision-based target detection algorithm is introduced to process images acquired by the vehicle-mounted surround-view cameras, which are synchronized in time and space with the lidar, so as to obtain information on the target objects around the vehicle. This enlarges the application range of the lidar calibration method. At the same time, the ground-truth data acquired by the lidar are checked: falsely detected targets are removed and missed targets are supplemented, completing the correction of the ground-truth data. This is equivalent to introducing cameras alongside the lidar hardware, adding a second source for the ground-truth data, improving data-correction efficiency and saving manpower, material and financial resources.
Meanwhile, the invention also provides computer equipment comprising a memory and a processor, wherein the memory stores a computer program and the processor, when executing the computer program, implements the lidar calibration method according to any one of the above.
The present invention also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the lidar calibration method of any of the above.
In addition, the invention also provides a motor vehicle which has an automatic driving function and is equipped with a lidar, a vision sensor and an MCU. When the motor vehicle runs the automatic driving function, the surroundings of the vehicle are recognized by the lidar; when the automatic driving function is calibrated, the lidar is calibrated via the MCU and the vision sensor using the lidar calibration method according to any one of the above.
Alternatively, the motor vehicle has the aforementioned computer device;
or the motor vehicle has the aforementioned computer-readable storage medium, whose computer program, when executed by a processor, implements the lidar calibration method of any of the above.
These features and advantages of the present invention will be disclosed in more detail in the following detailed description and accompanying drawings. The best modes or means of the invention will be described in detail with reference to the drawings, but the invention is not limited to the disclosed technical scheme. In addition, where features, elements and components appear in the following description and drawings in the plural and are labeled with different symbols or numerals for convenience of description, they each denote components of the same or similar construction or function.
Drawings
The invention is further described below with reference to the accompanying drawings:
FIG. 1 is a flow chart of an embodiment of the present invention;
FIG. 2 is a schematic diagram of an intersection ratio between a radar 2D detection frame and a visual 2D detection frame in an embodiment of the present invention;
fig. 3 is a schematic diagram of matching a radar 2D detection frame and a visual 2D detection frame in an embodiment of the present invention.
Detailed Description
The technical solutions of the embodiments of the present invention will be explained and illustrated below with reference to the drawings; however, the following embodiments are only preferred embodiments of the invention, not all of them. Other examples obtained by those skilled in the art from these embodiments without creative effort fall within the protection scope of the present invention.
Reference in the specification to "one embodiment" or "an example" means that a particular feature, structure or characteristic described in connection with the embodiment can be included in at least one embodiment of the present disclosure. The appearances of the phrase "in one embodiment" in various places in the specification do not necessarily all refer to the same embodiment.
Examples:
When the ground-truth data contain falsely detected or missed targets, the detection effect can be improved by refining the lidar perception algorithm, but the inherent shortcoming of the lidar sensor, namely the lack of semantic information, cannot be avoided. In addition, the point-cloud distribution is dense near the sensor and sparse far away, which also leaves the performance of the lidar perception algorithm limited by the hardware. Therefore, as shown in FIG. 1, the present embodiment provides a lidar calibration method comprising the following steps:
The lidar obtains ground-truth data of measured targets around the vehicle in the world coordinate system, and the vision sensor acquires surround-view images of the vehicle's surroundings. Specifically, in this embodiment the surround-view images acquired by the vision sensor are the 6-way surround-view images commonly used in the art, namely the front-left, left-side, rear-left, front-right, right-side and rear-right images.
The ground-truth data of the measured targets around the vehicle are projected onto the corresponding surround-view image to form a 3D detection frame.
A corresponding radar 2D detection frame is formed based on the 3D detection frame. This step comprises the following sub-steps:
since the 3D detection frame is in a cube shape and has 8 corner points, the pixel coordinates of the 8 corner points projected on the corresponding looking-around image of the 3D detection frame, namely [ U ] are obtained 0 ,V 0 ],[U 1 ,V 1 ]...[U 7 ,V 7 ]Taking maximum and minimum values of U axis and V axis of 8 corner points in an image coordinate system, namely [ U ] min ,V min ],[U max ,V max ]Two new corner points are generated, and corresponding radar 2D detection frames are generated by the two newly generated corner points.
Target detection is performed on the surround-view image to form a corresponding visual 2D detection frame. In this step, the visual target detection algorithm may be any mature, high-accuracy target detection algorithm from the prior art and is not limited here.
The IoU of the radar 2D detection frame and the visual 2D detection frame is calculated, as shown in FIG. 2:

IoU = Area(B_radar ∩ B_vision) / Area(B_radar ∪ B_vision)

where B_radar and B_vision denote the radar and visual 2D detection frames respectively.
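With boxes represented as (u_min, v_min, u_max, v_max), the IoU computation can be sketched as:

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned 2D boxes,
    each given as (u_min, v_min, u_max, v_max)."""
    u1 = max(box_a[0], box_b[0])            # intersection rectangle
    v1 = max(box_a[1], box_b[1])
    u2 = min(box_a[2], box_b[2])
    v2 = min(box_a[3], box_b[3])
    inter = max(0, u2 - u1) * max(0, v2 - v1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```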
The radar 2D detection frames are matched with the visual 2D detection frames based on the IoU so as to calibrate the lidar. Specifically, Hungarian matching is applied to the IoU between the radar and visual 2D detection frames: if no augmenting path can be found from a certain vertex, then no matter how the current matching is changed by augmenting paths found from other vertices, no augmenting path will ever be found from that vertex. So the maximum matching starts empty, and augmenting paths are repeatedly sought and used to enlarge it. This process is repeated until no augmenting path can be found. In this embodiment, the matching comprises the following steps, as shown in FIG. 3:
calculating the IoU between all radar 2D detection frames and all visual 2D detection frames;
constructing a cost matrix from the calculated IoU values;
in the cost matrix, the assignment that maximizes the sum of the cost values is the optimal matching. For example, assume the target detection algorithm detects 3 targets, denoted V1, V2 and V3, and the lidar algorithm detects 4 targets, denoted L1, L2, L3 and L4. All IoU values for pairings of the two sets are calculated and a cost matrix is constructed; either the visual detections or the lidar detections may be used as rows, which is not limited here. The optimal matching may take either the minimum or the maximum of the total cost as the matching criterion, as shown in the following table:
(Cost matrix of IoU values between V1–V3 and L1–L4; reproduced as an image in the original.)
the matching result is:
(Matching result table; reproduced as an image in the original.)
in this embodiment, the matching result is: v1 matches L2, V2 matches L3, and V3 matches L4.
Handling the matching result comprises the following cases:
if the IoU is greater than 0 and the categories match, that is, the preset categories are consistent, the ground-truth data of the corresponding measured target are retained;
if a radar 2D detection frame matches no visual 2D detection frame, it is regarded as a false detection, and the ground-truth data of the corresponding measured target are removed;
if a visual 2D detection frame matches no radar 2D detection frame, there are two possibilities: either the target is a false detection of the 2D vision algorithm, or it is a missed detection of the lidar algorithm. Therefore, whether the corresponding measured target exists is judged according to the following steps:
The point cloud data around the vehicle are projected onto the corresponding surround-view image and all points falling inside the visual 2D detection frame are recorded; those points are then clustered in the point-cloud space. The clustering result is represented by a radar 3D detection frame with a center position, length, width and height. If the clustering succeeds and the cluster height meets the height condition of the preset category, the measured target is considered to exist and the ground-truth data of the corresponding measured target are supplemented; if the clustering fails or the height of the clustered 3D detection frame does not meet the preset category, the measured target is considered absent and the visual 2D detection frame is removed.
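A rough sketch of the existence check: a naive single-linkage grouping stands in for the unspecified clustering step, and `eps`, `min_pts` and the height range are assumed parameters, not values from the patent.

```python
def measured_target_exists(points, height_range, eps=0.5, min_pts=5):
    """points: 3D points (x, y, z) that fell inside an unmatched visual
    2D box. Returns True if they cluster and the cluster height lies in
    height_range = (min_h, max_h) for the detected class."""
    if len(points) < min_pts:
        return False
    # naive single-linkage clustering seeded from the first point
    cluster = [points[0]]
    rest = list(points[1:])
    grown = True
    while grown:
        grown = False
        for p in rest[:]:
            if any(sum((a - b) ** 2 for a, b in zip(p, q)) <= eps ** 2
                   for q in cluster):
                cluster.append(p)
                rest.remove(p)
                grown = True
    if len(cluster) < min_pts:
        return False   # clustering failed: too few points cohere
    zs = [p[2] for p in cluster]
    height = max(zs) - min(zs)
    return height_range[0] <= height <= height_range[1]
```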
According to the technical scheme provided by this embodiment, a vision-based target detection algorithm is introduced to process images acquired by the vehicle-mounted surround-view cameras, which are synchronized in time and space with the lidar, so as to obtain information on the target objects around the vehicle. This expands the application range of the lidar calibration method. At the same time, the ground-truth data acquired by the lidar are checked: falsely detected targets are removed and missed targets are supplemented, completing the correction of the ground-truth data. Cameras are thus introduced alongside the lidar hardware, adding a second source for the ground-truth data, improving data-correction efficiency and saving manpower, material and financial resources.
Meanwhile, this embodiment also provides a computer device comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to execute the steps of the lidar calibration method described above.
Those skilled in the art will appreciate that all or part of the processes in the methods of the embodiments described above may be implemented by a computer program instructing the related hardware. The computer program may be stored in a non-volatile computer-readable storage medium and, when executed, performs the method of any of the above embodiments. Any reference to memory, storage, database or other media used in the various embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM) or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM) and Rambus dynamic RAM (RDRAM).
In addition, this embodiment also provides a motor vehicle which has an automatic driving function and is equipped with a lidar, a vision sensor and an MCU. When the motor vehicle runs the automatic driving function, the surroundings of the vehicle are recognized by the lidar; when the automatic driving function is calibrated, the MCU and the vision sensor are used to calibrate the lidar by the foregoing lidar calibration method;
or a motor vehicle having the aforementioned computer device;
or a motor vehicle having the aforementioned computer-readable storage medium, whose computer program, when executed by a processor, implements the foregoing lidar calibration method.
The above is only a specific embodiment of the present invention, but the scope of the invention is not limited thereto; the invention includes, but is not limited to, what is shown in the accompanying drawings and described in the specific embodiment above. Any modification that does not depart from the functional and structural principles of the invention is intended to be included within the scope of the appended claims.

Claims (8)

1. A laser radar calibration method, comprising the steps of:
the laser radar obtains ground-truth data of measured targets around the vehicle in the world coordinate system; a vision sensor acquires surround-view images of the vehicle's surroundings;
projecting the ground-truth data of the measured targets around the vehicle onto the corresponding surround-view image to form a 3D detection frame;
forming a corresponding radar 2D detection frame based on the 3D detection frame;
performing target detection on the surround-view image to form a corresponding visual 2D detection frame;
calculating the intersection over union (IoU) of the radar 2D detection frame and the visual 2D detection frame;
and matching the radar 2D detection frame with the visual 2D detection frame based on the IoU so as to calibrate the laser radar.
2. The laser radar calibration method of claim 1, wherein forming the corresponding radar 2D detection frame based on the 3D detection frame comprises the steps of:
acquiring the pixel coordinates of the 8 corner points of the 3D detection frame projected on the corresponding surround-view image, taking the maximum and minimum values of the 8 corner points along the U axis and V axis of the image coordinate system to generate two new corner points, and generating the corresponding radar 2D detection frame from the two new corner points.
3. The laser radar calibration method of claim 1, wherein matching the radar 2D detection frame and the visual 2D detection frame based on the IoU comprises the steps of:
calculating the IoU between all radar 2D detection frames and all visual 2D detection frames;
constructing a cost matrix from the calculated IoU values;
in the cost matrix, taking the assignment that maximizes the sum of the cost values as the optimal matching.
4. The laser radar calibration method of claim 3, wherein handling the matching result between the radar 2D detection frame and the visual 2D detection frame comprises:
if the IoU is greater than 0 and the categories match, retaining the ground-truth data of the corresponding measured target;
if a radar 2D detection frame matches no visual 2D detection frame, removing the ground-truth data of the corresponding measured target;
if a visual 2D detection frame matches no radar 2D detection frame, judging whether the corresponding measured target exists; if it exists, supplementing the ground-truth data of the corresponding measured target, and if not, removing the visual 2D detection frame.
5. The laser radar calibration method of claim 1, wherein judging whether the corresponding measured target exists comprises the steps of:
projecting the point cloud data around the vehicle onto the corresponding surround-view image and recording all points falling inside the visual 2D detection frame; then clustering those points in the point-cloud space; if the clustering succeeds and the clustering result meets the height condition of the preset category, the measured target is considered to exist; if the clustering fails or the height of the clustered 3D detection frame does not meet the preset category, the measured target is considered absent.
6. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the laser radar calibration method of any of claims 1 to 5.
7. A computer-readable storage medium having stored thereon a computer program, characterized in that the computer program, when executed by a processor, implements the laser radar calibration method of any of claims 1 to 5.
8. A motor vehicle having an automatic driving function and equipped with a laser radar, a vision sensor and an MCU, wherein when the motor vehicle runs the automatic driving function, the surroundings of the motor vehicle are recognized by the laser radar, and when the automatic driving function is calibrated, the laser radar is calibrated by means of the MCU and the vision sensor using the laser radar calibration method of any one of claims 1 to 5;
or the motor vehicle has the computer device of claim 6;
or the motor vehicle has the computer-readable storage medium of claim 7, the computer program of which, when executed by a processor, implements the laser radar calibration method of any of claims 1 to 5.
CN202211715511.2A 2022-12-28 2022-12-28 Laser radar calibration method, computer equipment, readable storage medium and motor vehicle Pending CN116148820A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211715511.2A CN116148820A (en) 2022-12-28 2022-12-28 Laser radar calibration method, computer equipment, readable storage medium and motor vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211715511.2A CN116148820A (en) 2022-12-28 2022-12-28 Laser radar calibration method, computer equipment, readable storage medium and motor vehicle

Publications (1)

Publication Number Publication Date
CN116148820A true CN116148820A (en) 2023-05-23

Family

ID=86351904

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211715511.2A Pending CN116148820A (en) 2022-12-28 2022-12-28 Laser radar calibration method, computer equipment, readable storage medium and motor vehicle

Country Status (1)

Country Link
CN (1) CN116148820A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102643538B1 (en) * 2023-08-25 2024-03-06 주식회사 라이드플럭스 Method, computing device and computer program for performing auto calibration between different sensor data in autonomous driving environment


Similar Documents

Publication Publication Date Title
CN109920011B (en) External parameter calibration method, device and equipment for laser radar and binocular camera
CN110458112B (en) Vehicle detection method and device, computer equipment and readable storage medium
CN113936198B (en) Low-beam laser radar and camera fusion method, storage medium and device
CN110827202A (en) Target detection method, target detection device, computer equipment and storage medium
CN112927309B (en) Vehicle-mounted camera calibration method and device, vehicle-mounted camera and storage medium
CN112466147B (en) Multi-sensor-based library position detection method and related device
CN113792707A (en) Terrain environment detection method and system based on binocular stereo camera and intelligent terminal
CN116148820A (en) Laser radar calibration method, computer equipment, readable storage medium and motor vehicle
CN115797454A (en) Multi-camera fusion sensing method and device under bird's-eye view angle
US20230206500A1 (en) Method and apparatus for calibrating extrinsic parameter of a camera
CN114998856B (en) 3D target detection method, device, equipment and medium for multi-camera image
CN115410167A (en) Target detection and semantic segmentation method, device, equipment and storage medium
CN110962858B (en) Target identification method and device
CN112733672A (en) Monocular camera-based three-dimensional target detection method and device and computer equipment
CN114120149A (en) Oblique photogrammetry building feature point extraction method and device, electronic equipment and medium
CN115376090A (en) High-precision map construction method and device, electronic equipment and storage medium
US20080036855A1 (en) Sensing apparatus and method for vehicles
CN113140002B (en) Road condition detection method and system based on binocular stereo camera and intelligent terminal
CN112529011A (en) Target detection method and related device
CN111753901A (en) Data fusion method, device and system and computer equipment
CN115908551A (en) Vehicle distance measuring method and device, electronic equipment and storage medium
CN113033578B (en) Image calibration method, system, terminal and medium based on multi-scale feature matching
CN115050007A (en) Method and device for identifying tractor and trailer, electronic equipment and storage medium
US20210019902A1 (en) Image processing apparatus
CN115205809B (en) Method and system for detecting roughness of road surface

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination