EP3620823B1 - Method and device for detecting the precision of an internal parameter of a laser radar - Google Patents

Method and device for detecting the precision of an internal parameter of a laser radar

Info

Publication number
EP3620823B1
Authority
EP
European Patent Office
Prior art keywords
distance
point cloud
road
cloud data
plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP19195035.1A
Other languages
German (de)
English (en)
Other versions
EP3620823A1 (fr)
Inventor
Xun Zhou
Yuanfan XIE
Shirui LI
Liang Wang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apollo Intelligent Driving Technology Beijing Co Ltd
Original Assignee
Apollo Intelligent Driving Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apollo Intelligent Driving Technology Beijing Co Ltd
Publication of EP3620823A1
Application granted
Publication of EP3620823B1
Legal status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S 17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48 Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00 of systems according to group G01S 17/00
    • G01S 7/497 Means for monitoring or calibrating
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 Systems determining position data of a target
    • G01S 17/42 Simultaneous measurement of distance and other co-ordinates
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48 Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00 of systems according to group G01S 17/00
    • G01S 7/481 Constructional features, e.g. arrangements of optical elements
    • G01S 7/4814 Constructional features, e.g. arrangements of optical elements of transmitters alone
    • G01S 7/4815 Constructional features, e.g. arrangements of optical elements of transmitters alone using multiple transmitters
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T 2207/30256 Lane; Road marking

Definitions

  • Embodiments of the present disclosure relate to the field of sensor technology, and more particularly to a method and a device for detecting a precision of an internal parameter of a laser radar, a related apparatus and a medium.
  • laser radar may be gradually applied to transportation, map drawing, unmanned vehicles, the military and other fields, due to its high resolution and good concealment.
  • the laser radar may use a large number of light rays to fully describe an environment, which is particularly important for high-precision map drawing, unmanned vehicles, building surveying, and home service robots.
  • Errors caused by the mass production process may result in internal parameters of the laser radar, such as the pose and emission angle of a laser transceiver, not satisfying the precision requirements of an actual application.
  • the imprecise internal parameter may seriously affect high-precision map drawing and automatic driving, causing a poor user experience. It is therefore necessary to provide a method for automatically detecting an internal parameter that does not satisfy a requirement on accuracy.
  • US2015066412A1 relates to a method for calibrating a surroundings sensor.
  • the surroundings sensors detect the surroundings of the vehicle and generate sensor data.
  • the sensor data is sent to an external server.
  • the server generates calibration data by comparing the sensor data and reference data and sends the calibration data to the surrounding sensors.
  • a control signal is generated to control the vehicle.
  • CN105627938A relates to detecting thickness of asphalt pavement based on point clouds.
  • a set of road point cloud data is obtained before paving asphalt and a set of road point cloud data is obtained after paving asphalt respectively. These two sets of road point cloud data are registered in a same coordinate system.
  • the original point cloud data are divided along the road to obtain multiple sets of road point cloud data.
  • These road point cloud data are registered for plane fitting. According to the distance between the plane structures, the thickness of the asphalt road is detected.
  • Embodiments of the present disclosure provide a method for detecting a precision of an internal parameter of a laser radar according to independent claim 1, a device for detecting a precision of an internal parameter of a laser radar according to independent claim 4, and a medium according to independent claim 7. Further aspects of the present application are defined by the dependent claims.
  • the point cloud data is obtained by the laser radar during a traveling process of the autonomous mobile carrier on the flat road.
  • the three-dimensional scene reconstruction is performed according to the point cloud data collected to obtain the point cloud model of the three-dimensional scene.
  • the point cloud model of the three-dimensional scene is divided and point cloud data of the scene other than the point cloud data of the road is filtered to obtain the road.
  • the thickness of the road is determined based on the point cloud data of the road. It is determined whether the internal parameter is precise based on the thickness of the road.
  • FIG. 1 is a flowchart illustrating a method for detecting a precision of an internal parameter of a laser radar according to embodiments of the present disclosure. Embodiments of the present disclosure may be applicable to a case of determining whether the internal parameter of the laser radar is precise.
  • the method may be implemented by a device for detecting a precision of an internal parameter of a laser radar according to embodiments of the present disclosure.
  • the device may be implemented by software and/or hardware.
  • the device may be integrated into an autonomous mobile carrier such as an autonomous vehicle.
  • the method may include the following.
  • point cloud data collected by a laser radar provided on an autonomous mobile carrier traveling on a flat road is obtained.
  • the autonomous mobile carrier may be an unmanned mobile device, such as an autonomous vehicle or the like.
  • the laser radar may be a multi-beam laser radar. Multiple laser emitters are distributed vertically such that multiple scanning lines may be formed by rotation of a motor, in a single scan.
  • the multi-beam laser radar may be usually a three-dimensional (3D) laser radar such that the data obtained may be three-dimensional.
  • the multi-beam laser radar may be installed at a roof or a windshield of the autonomous vehicle.
  • the point cloud data may refer to a set of points in three-dimensional coordinates to characterize an outer surface shape of an object. Geometric position information of a three-dimensional space may be represented by (x, y, z) for a point.
  • the point cloud data may also represent RGB (Red-Green-Blue) color, gray value, depth, and division result of a point.
  • the point cloud data collected by the laser radar may be the point cloud data of a scene of a flat road, including the point cloud data of objects such as trees, lights, and vehicles on the road, and point cloud data of the road.
  • In a case where the internal parameter is imprecise, the flat road may be detected to have a bowl-shaped curved surface, such that the road formed by the point cloud data detected may have the bowl-shaped curved surface. Therefore, the above characteristic may be used to detect the precision of the internal parameter of the laser radar while the autonomous mobile carrier is travelling on the flat road.
  • the laser radar may be mounted on the autonomous mobile carrier.
  • the scene of the flat road may be scanned by the laser radar provided on the autonomous mobile carrier to obtain the point cloud data of the scene of the road.
  • a three-dimensional scene reconstruction is performed based on the point cloud data collected to obtain a point cloud model of a three-dimensional scene.
  • the 3D scene reconstruction may refer to fusing the point cloud data collected to reproduce the scene of the flat road.
  • the point cloud data collected may be fused using an ICP (iterative closest point) algorithm.
  • the point cloud data collected may be fused based on real-time positioning data, such as GPS (global positioning system) data, obtained by the GPS or a BeiDou navigation satellite system (BDS).
  • the point cloud model of the 3D scene may refer to a 3D point cloud image obtained by fusing the point cloud collected.
  • the ICP algorithm may be used to fuse the point cloud data collected to obtain the point cloud model of the 3D scene.
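  • For illustration only, the fusion step described above might be sketched with the open-source Open3D library as follows (assuming Open3D 0.10 or later, so that the o3d.pipelines.registration API is available); the file names, ICP correspondence distance and voxel size are assumptions, not values taken from this disclosure.

```python
# Sketch: fuse consecutive lidar frames into one point cloud model using ICP (Open3D).
import glob
import numpy as np
import open3d as o3d

frame_paths = sorted(glob.glob("frame_*.pcd"))            # hypothetical per-scan files
frames = [o3d.io.read_point_cloud(p) for p in frame_paths]

model = frames[0]
pose = np.eye(4)                                           # pose of the current frame in the model frame

for frame in frames[1:]:
    # Point-to-plane ICP needs normals on the target cloud.
    model.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=1.0, max_nn=30))
    reg = o3d.pipelines.registration.registration_icp(
        frame, model, 0.5, pose,
        o3d.pipelines.registration.TransformationEstimationPointToPlane())
    pose = reg.transformation
    frame.transform(pose)                                  # move the frame into the model frame
    model += frame
    model = model.voxel_down_sample(voxel_size=0.05)       # keep the fused model tractable

o3d.io.write_point_cloud("scene_model.pcd", model)
```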
  • the point cloud model of the 3D scene of the road is shown in FIG. 2 and reflects actual information of the road.
  • the actual information of the road may include objects, such as trees, lights, vehicles on the road, and may also include locations of the objects.
  • the point cloud model of the 3D scene is divided to obtain the road.
  • the road may be formed by the point cloud data of the road, which may also be called as road point cloud in a 3D scene.
  • a point cloud division threshold may be determined according to the point cloud model of the 3D scene and the point cloud data collected.
  • the point cloud model of the 3D scene may be divided based on the point cloud division threshold to remove the point cloud data of objects, such as trees, lights and the vehicles in the scene of the road other than the road to obtain the point cloud data of the road.
  • the road obtained by dividing the point cloud model of the 3D scene illustrated in FIG. 2 may be indicated by the numeral reference 10.
  • the point cloud model of the 3D scene may also be divided to obtain the road by modeling.
  • a Gaussian mixture background model may be used to extract a background (such as the road) directly from the point cloud model of the 3D scene.
  • the point cloud model of the 3D scene may be divided to obtain the road using other algorithms, such as a random sample consensus (RANSAC) algorithm.
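  • As one possible illustration of the RANSAC-based division mentioned above, Open3D's plane segmentation can be used to separate the dominant road plane from the rest of the scene; the distance threshold and iteration count below are assumed values and this is a sketch, not the method defined by this disclosure.

```python
# Sketch: keep only road points by fitting the dominant plane with RANSAC (Open3D).
import open3d as o3d

scene = o3d.io.read_point_cloud("scene_model.pcd")        # hypothetical fused scene model

# segment_plane returns the coefficients (a, b, c, d) of ax + by + cz + d = 0
# together with the indices of the inlier points.
plane_model, inliers = scene.segment_plane(distance_threshold=0.05,   # assumed 5 cm tolerance
                                            ransac_n=3,
                                            num_iterations=1000)
road = scene.select_by_index(inliers)                      # road point cloud
others = scene.select_by_index(inliers, invert=True)       # trees, lights, vehicles, ...
o3d.io.write_point_cloud("road.pcd", road)
```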
  • a thickness of the road is determined based on the point cloud data of the road, and it is determined whether the internal parameter of the laser radar is precise based on the thickness of the road.
  • the thickness of the road refers to a difference between an upper boundary and a lower boundary of the road in the point cloud model of the 3D scene constructed by the point cloud data collected.
  • a road surface, such as a ground plane, may be obtained by fitting the point cloud data of the road.
  • a distance from each point of point cloud data of the road to the ground plane may be calculated.
  • the thickness of the road may be determined based on the distance.
  • the point cloud data may be divided into two sets of point cloud data according to distribution characteristics of the point cloud model of the 3D scene.
  • Two planes may be obtained by fitting the two sets of point cloud data respectively. For each set of point cloud data, the thickness may be determined by a distance from each point of the point cloud data to the plane determined.
  • the thickness of the road may be determined based on the determined thickness of the two planes.
  • the point cloud data of the upper boundary and the point cloud data of the lower boundary of the road may be obtained respectively in the point cloud model of the 3D scene constructed by the point cloud data collected.
  • An upper plane may be determined according to the point cloud data of the upper boundary, and a lower plane may be determined according to the point cloud data of the lower boundary.
  • the thickness of the road may be determined by calculating a distance between the two planes.
  • the thickness of the road may be compared with a preset thickness threshold to determine whether the internal parameter of the laser radar is precise. For example, in a case where the thickness of the road is greater than the preset thickness threshold, it may be determined that the internal parameter of the laser radar is imprecise.
  • the preset thickness threshold may refer to a preset value, which may be corrected based on actual road conditions. The smaller the thickness threshold, the higher the accuracy of determining whether the internal parameter of the lidar is precise. For example, the preset thickness threshold may be 20 cm.
  • In a case where the thickness of the road determined based on the point cloud data of the road is greater than the preset thickness threshold of 20 cm, it may indicate that the point cloud data collected by the laser radar is imprecise, and thus the internal parameter of the laser radar is imprecise.
  • In a case where the thickness of the road is less than or equal to the preset thickness threshold of 20 cm, it may indicate that the point cloud data collected by the laser radar is precise, and thus the internal parameter of the laser radar is precise.
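  • A minimal numpy sketch of this check, assuming the road points are already available as an N x 3 array; the 20 cm threshold follows the example above, while the least-squares plane fit and the use of the min/max spread as the thickness are assumptions.

```python
# Sketch: estimate the road "thickness" as the spread of point-to-plane distances
# and compare it with the preset threshold (20 cm in the example above).
import numpy as np

def fit_plane(points):
    """Least-squares plane through the points; returns a unit normal n and offset d (n.x + d = 0)."""
    centroid = points.mean(axis=0)
    normal = np.linalg.svd(points - centroid)[2][-1]        # direction of smallest variance
    return normal, -normal.dot(centroid)

road_points = np.load("road_points.npy")                    # hypothetical N x 3 road point cloud
normal, d = fit_plane(road_points)
signed_dist = road_points @ normal + d                       # signed distance of each point to the plane

road_thickness = signed_dist.max() - signed_dist.min()       # spread between lowest and highest points
THICKNESS_THRESHOLD = 0.20                                    # metres, the 20 cm example above

if road_thickness > THICKNESS_THRESHOLD:
    print("internal parameter of the laser radar is imprecise")
else:
    print("internal parameter of the laser radar is precise")
```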
  • the point cloud data may be collected by the laser radar during a traveling process of the autonomous mobile carrier on the flat road.
  • the 3D scene reconstruction may be performed based on the point cloud data collected to obtain the point cloud model of the 3D scene.
  • the point cloud model of the 3D scene may be divided and the point cloud data of an object in the scene of the road other than the road may be filtered out, to obtain the road.
  • the thickness of the road may be determined based on the point cloud data constituting the road. It may be determined whether the internal parameter of the laser radar is precise based on the thickness of the road.
  • FIG. 3 is a flowchart illustrating a method for detecting a precision of an internal parameter of a laser radar according to embodiments of the present disclosure.
  • the embodiment illustrated in FIG. 3 may be based on the above embodiment illustrated in FIG. 1 , providing a method for determining the thickness of the road based on point cloud data of the road and determining whether the internal parameter of the laser radar is precise based on the thickness of the road.
  • the method in embodiments may include the following.
  • point cloud data collected by a laser radar provided on an autonomous mobile carrier travelling on a flat road is obtained.
  • a 3D scene reconstruction is performed based on the point cloud data collected to obtain the point cloud model of the 3D scene.
  • the point cloud model of the 3D scene is divided to obtain the road.
  • the point cloud data of the road is divided into near-distance point cloud data and far-distance point cloud data.
  • a density of the point cloud data near the laser radar is greater than the density of the point cloud data far from the laser radar.
  • In a case where the internal parameter is imprecise, as the distance from the laser radar increases, the road gradually rises or bends.
  • As a result, after fusing multiple frames of point cloud data, the thickness of the road far from the laser radar is greater than the thickness of the road near the laser radar.
  • the point cloud data of the road may be divided into the near-distance point cloud data and the far-distance point cloud data based on the division threshold.
  • the near-distance point cloud data may refer to the point cloud data within a distance threshold from the laser radar.
  • the far-distance point cloud data may refer to the point cloud data outside the distance threshold from the laser radar.
  • the division threshold may be a preset value based on a measuring range of the laser radar and may be proportional to the measuring range of the laser radar, which may be corrected according to an actual situation. For example, the division threshold may be 30% to 40% of the measuring range of the laser radar.
  • the road 10 obtained by removing the point cloud data of an object included in the scene of the road other than the road in the point cloud model of the 3D scene illustrated in FIG. 2 may be taken as an example for description.
  • a center indicated by the numeral reference 8 on the road may be taken as a center of a circle and the division threshold may be taken as a radius of the circle, for division.
  • the point cloud data within the circle may be determined as the near-distance point cloud data, while the point cloud data outside the circle may be determined as the far-distance point cloud data.
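  • A short sketch of this division, assuming the fused road points are expressed in a frame centred on the laser radar (or on the point marked 8) and a hypothetical 100 m measuring range; the 35% factor is one value inside the 30%-40% band given above.

```python
# Sketch: split road points into near-distance and far-distance sets by horizontal range.
import numpy as np

road_points = np.load("road_points.npy")        # hypothetical N x 3 array, laser radar at the origin

MEASURING_RANGE = 100.0                          # metres; assumed measuring range of the lidar
DIVISION_THRESHOLD = 0.35 * MEASURING_RANGE      # 30%-40% of the range, as described above

horizontal_range = np.linalg.norm(road_points[:, :2], axis=1)   # distance in the x-y plane
near_points = road_points[horizontal_range <= DIVISION_THRESHOLD]
far_points = road_points[horizontal_range > DIVISION_THRESHOLD]
```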
  • a near-distance plane is obtained by fitting the near-distance point cloud data, to determine a thickness of the near-distance plane.
  • the near-distance plane may be obtained by performing a plane fitting using an existing method of plane fitting.
  • the RANSAC algorithm may be employed to fit the near-distance point cloud data to a preset plane to obtain the near-distance plane.
  • the thickness corresponding to the near-distance plane may be determined based on a distance from each point of the near-distance point cloud data to the near-distance plane.
  • fitting the near-distance point cloud data to obtain the near-distance plane and determining the thickness of the near-distance plane may include the following.
  • the near-distance point cloud data are fitted to obtain the near-distance plane.
  • the RANSAC algorithm may be taken as an example to describe the fitting of the near-distance plane according to the near-distance point cloud data.
  • Parameters a, b, c and d in the preset plane function may be calculated from an original data set (e.g., the near-distance point cloud data), to obtain an initial near-distance plane function.
  • a statistic is performed to count the number of points in the original data set whose distance to a plane determined by the initial near-distance plane function is less than a preset distance threshold.
  • the original data set may be updated using the points satisfying the above condition, i.e., the points satisfying the above condition may be determined to replace the original data set.
  • the method returns to the blocks of calculating the parameters of the preset plane function and counting the points satisfying the above condition; when the number of points satisfying the above condition is greater than or equal to the preset point value, the method ends (the iteration is stopped).
  • a plane determined by the initial near-distance plane function having the number of points satisfying the above condition greater than or equal to the preset point value may be determined as the near-distance plane.
  • the preset distance threshold may refer to a predetermined value. The smaller the predetermined value is, the higher the precision is.
  • the preset distance threshold may be corrected according to an actual situation.
  • the preset point value may refer to a predetermined value and may be corrected according to an actual situation. For example, the preset point value may be 300.
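  • The iteration described above might be sketched as follows; the plane is refitted to the current inlier set until at least the preset number of points (300 in the example above) lies within the preset distance threshold, which is assumed here to be 5 cm, and the least-squares fit and the maximum iteration count are likewise assumptions.

```python
# Sketch: fit a plane ax + by + cz + d = 0 to the near-distance points by repeatedly
# refitting to the inliers, following the description above (not a library implementation).
import numpy as np

def fit_plane(points):
    """Least-squares estimate of (a, b, c, d); the normal (a, b, c) has unit length."""
    centroid = points.mean(axis=0)
    normal = np.linalg.svd(points - centroid)[2][-1]
    return np.append(normal, -normal.dot(centroid))

def iterative_plane_fit(points, dist_threshold=0.05, preset_point_value=300, max_iter=50):
    """Refit the plane to the inlier set until enough points lie within dist_threshold."""
    data = points
    for _ in range(max_iter):
        if len(data) < 3:
            break                                        # not enough points left to fit a plane
        plane = fit_plane(data)
        dist = np.abs(points @ plane[:3] + plane[3])     # distance of every original point
        inliers = points[dist < dist_threshold]
        if len(inliers) >= preset_point_value:
            return fit_plane(inliers)                    # accepted near-distance plane
        data = inliers                                   # replace the data set and iterate
    raise RuntimeError("no plane reached the preset point value")

near_points = np.load("near_points.npy")                  # hypothetical near-distance points
near_plane = iterative_plane_fit(near_points)
```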
  • the thickness of the near-distance plane is determined according to the distance from each point of the near-distance point cloud data to the near-distance plane.
  • the distance from each point of the near-distance point cloud data to the near-distance plane is calculated.
  • the thickness of the near-distance plane may be determined based on at least one of a distance mean, a variance or a mean function.
  • the distance mean may be determined as the thickness of the near-distance plane. It is also possible to determine the maximum distance among the distances as the thickness of the near-distance plane.
  • a far-distance plane is obtained by fitting the far-distance point cloud data, to determine the thickness of the far-distance plane.
  • the method of obtaining the far-distance plane by fitting the far-distance point cloud data to determine the thickness of the far-distance plane is similar to that of obtaining the near-distance plane by fitting the near-distance point cloud data and determining the thickness of the near-distance plane, except that the near-distance plane is determined based on the near-distance point cloud data, while the far-distance plane is determined based on the far-distance point cloud data.
  • the far-distance plane may be obtained by performing the plane fitting with an existing plane fitting method.
  • a RANSAC algorithm may be employed to obtain the far-distance plane by fitting the far-distance point cloud data to a preset plane.
  • the thickness of the far-distance plane may be determined based on a distance from each point of the far-distance cloud point data to the far-distance plane.
  • obtaining the far-distance plane by fitting the far-distance point cloud data and determining the thickness of the far-distance plane may include: obtaining the far-distance plane by fitting the far-distance point cloud data, and determining the thickness of the far-distance plane based on the distance from each point of the far-distance point cloud data to the far-distance plane.
  • a difference between the thickness of the near-distance plane and the thickness of the far-distance plane may be obtained.
  • In a case where an absolute value of the difference is smaller than a threshold, it may indicate that the near-distance plane and the far-distance plane are substantially level with each other. That is, it may be determined that the road is flat and that the internal parameter of the laser radar is precise.
  • In a case where the absolute value of the difference is greater than or equal to the threshold, it may indicate that the far-distance plane is upturned or bent. That is, the road has a bowl-shaped surface and it may be determined that the internal parameter of the laser radar is imprecise.
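  • Putting the two fits together, the comparison described above might look like the following sketch; the mean point-to-plane distance is used as the thickness (one of the options listed above), and the 5 cm difference threshold and file names are assumptions.

```python
# Sketch: compare the thickness of the near-distance plane and the far-distance plane.
import numpy as np

def plane_thickness(points):
    """Fit a least-squares plane and return the mean point-to-plane distance as its 'thickness'."""
    centroid = points.mean(axis=0)
    normal = np.linalg.svd(points - centroid)[2][-1]
    return np.abs((points - centroid) @ normal).mean()

near_points = np.load("near_points.npy")      # hypothetical near-distance road points
far_points = np.load("far_points.npy")        # hypothetical far-distance road points

near_thickness = plane_thickness(near_points)
far_thickness = plane_thickness(far_points)

DIFFERENCE_THRESHOLD = 0.05                    # metres; assumed value, not taken from this disclosure
if abs(near_thickness - far_thickness) < DIFFERENCE_THRESHOLD:
    print("road is flat -> internal parameter is precise")
else:
    print("far-distance plane is upturned or bent -> internal parameter is imprecise")
```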
  • the point cloud data during the traveling process of the autonomous mobile carrier on the flat road may be collected by the laser radar.
  • the 3D scene reconstruction may be performed based on the point cloud data collected to obtain the point cloud model of the 3D scene.
  • the point cloud model of the 3D scene may be divided and the point cloud data of an object in the scene of the road other than the road may be filtered out, to obtain the road.
  • the point cloud data may be divided into near-distance point cloud data and the far-distance point cloud data based on the characteristics of the point cloud data collected by the laser radar.
  • the thickness of the near-distance plane and the thickness of the far-distance plane may be determined according to the near-distance plane obtained by fitting the near-distance point cloud data and the far-distance plane obtained by fitting the far-distance point cloud data. It may be determined whether the internal parameter is precise based on the thickness of the far-distance plane and the thickness of the near-distance plane. A new way for automatically detecting the precision of the internal parameter of the laser radar is provided.
  • FIG. 4 is a flowchart illustrating a method for detecting a precision of an internal parameter of a laser radar according to embodiments of the present disclosure.
  • the embodiment illustrated in FIG. 4 is based on the embodiment illustrated in FIG. 3.
  • a method for determining a thickness of the road based on the point cloud data of the road and determining whether the internal parameter of the laser radar is precise based on the thickness of the road is provided.
  • the method in embodiments may include the following.
  • point cloud data collected by a laser radar provided on an autonomous mobile carrier travelling on a flat road is obtained.
  • a 3D scene reconstruction is performed according to the point cloud data collected to obtain a point cloud model of a 3D scene.
  • the point cloud model of the 3D scene is divided to obtain a road.
  • the point cloud data of the road is fitted to a road plane.
  • the road plane may be a plane of the ground.
  • an existing plane fitting method may be employed to perform plane fitting to obtain the road plane.
  • a RANSAC algorithm may be employed.
  • the manner of determining the road plane by employing the RANSAC algorithm may be the same as the method for determining the near-distance plane in the example illustrated in FIG. 3, except that the near-distance plane is determined based on the near-distance point cloud data, while the road plane is determined based on the point cloud data of the road, including the near-distance point cloud data and the far-distance point cloud data.
  • a distance from each point of the point cloud data to the road plane is determined.
  • the distance from each point of the point cloud data to the road plane is calculated, i.e., the distance between a point and a plane.
  • In a case where a ratio of the number of points at a first distance to the total number of points and a ratio of the number of points at a second distance to the total number of points are preset ratio thresholds, it is determined that the point cloud data at the first distance from the road plane defines a first boundary of the road and the point cloud data at the second distance from the road plane defines a second boundary of the road.
  • the total number of point cloud data may refer to the total number of road point cloud data.
  • the ratio threshold may refer to a preset proportion, which may be corrected according to an actual situation. For example, the ratio threshold may be 1%.
  • a distance from a first preset number of point cloud data to the road plane may be determined as the first distance
  • a distance from a second preset number of point cloud data to the road plane may be determined as the second distance.
  • a ratio of the first preset number to the total number of point cloud data may be a first preset ratio threshold
  • a ratio of the second preset number of point cloud data to the total number of point cloud data may be a second preset ratio threshold.
  • a boundary formed by the point cloud data at the first distance from the road plane may be defined as the first boundary of the road
  • a boundary formed by the point cloud data at the second distance from the road plane may be defined as the second boundary of the road.
  • the first distance and the second distance may be determined by obtaining a distance histogram through fitting the point cloud data to the road plane, taking the distance as an abscissa and taking the number of point cloud data as an ordinate. Two intersection points between a line parallel to the abscissa and the distance histogram are obtained. Abscissa coordinates a and b of the two intersection points may be determined by projecting the two intersection points onto the abscissa.
  • the distance corresponding to the point a may be determined as the first distance
  • the distance corresponding to the point b may be determined as the second distance. It may be determined that the first boundary of the road is formed by the point cloud data corresponding to first distance from the road plane, and the second boundary of the road is formed by the point cloud data corresponding to the second distance from the road plane.
  • a distance between the first boundary and the second boundary is determined as the thickness of the road.
  • In a case where the thickness of the road is less than the thickness threshold, it may be determined that the internal parameter of the laser radar is precise. In a case where the thickness of the road is greater than or equal to the thickness threshold, it may be determined that the internal parameter of the laser radar is imprecise.
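  • A numpy sketch of this boundary-based variant, using the 1% ratio and 20 cm threshold examples above; reading the first and second boundary distances as quantiles of the point-to-plane distance distribution is one possible interpretation of the histogram construction described above, and the ground-plane fit is an assumption.

```python
# Sketch: find the two road boundaries from the distribution of point-to-plane distances
# and take their separation as the road thickness.
import numpy as np

def fit_plane(points):
    centroid = points.mean(axis=0)
    normal = np.linalg.svd(points - centroid)[2][-1]
    return normal, -normal.dot(centroid)

road_points = np.load("road_points.npy")                  # hypothetical N x 3 road point cloud
normal, d = fit_plane(road_points)
signed_dist = road_points @ normal + d                     # signed distance to the fitted road plane

RATIO_THRESHOLD = 0.01                                     # the 1% example above
THICKNESS_THRESHOLD = 0.20                                 # metres; the 20 cm example above

# Boundary distances: the quantiles beyond which the given ratio of points lies.
first_distance = np.quantile(signed_dist, RATIO_THRESHOLD)          # lower boundary of the road
second_distance = np.quantile(signed_dist, 1.0 - RATIO_THRESHOLD)   # upper boundary of the road
road_thickness = second_distance - first_distance

if road_thickness > THICKNESS_THRESHOLD:
    print("internal parameter of the laser radar is imprecise")
else:
    print("internal parameter of the laser radar is precise")
```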
  • the point cloud data during the traveling process of the autonomous mobile carrier on the flat road may be collected by the laser radar.
  • the 3D scene reconstruction may be performed based on the point cloud data collected to obtain the point cloud model of the 3D scene.
  • the point cloud model of the 3D scene may be divided and the point cloud data of an object in the scene of the road other than the road may be filtered out to obtain the road.
  • the road plane may be obtained by fitting the point cloud data of the road.
  • a distance from each point of the point cloud data to the road plane may be determined.
  • the first boundary and the second boundary of the road may be determined based on the distance. It may be determined whether the internal parameter of the laser radar is precise according to the two boundaries. A new way for automatically detecting whether the internal parameter of the laser radar is precise is provided.
  • FIG. 5 is a block diagram illustrating a device for detecting a precision of an internal parameter of a laser radar according to embodiments of the present disclosure.
  • the device may be configured to execute the method for detecting a precision of an internal parameter of a laser radar according to any one of embodiments of the present disclosure, may include corresponding functional modules for executing the method, and may achieve the corresponding beneficial effects.
  • the device may include a point cloud data obtaining module 410, a three-dimensional scene building module 420, a road dividing module 430, a road thickness determining module 440 and an internal parameter precision detecting module 450.
  • the point cloud data obtaining module 410 may be configured to obtain point cloud data collected by the laser radar provided on an autonomous mobile carrier travelling on a flat road.
  • the three-dimensional scene building module 420 may be configured to perform a three-dimensional scene reconstruction based on the point cloud data collected, to obtain a point cloud model of a three-dimensional scene.
  • the road dividing module 430 may be configured to divide the point cloud model of the three-dimensional scene to obtain the road.
  • the road thickness determining module 440 may be configured to determine a thickness of the road based on the point cloud data of the road.
  • the internal parameter precision detecting module 450 may be configured to determine whether the internal parameter of the laser radar is precise based on the thickness of the road.
  • the point cloud data during the traveling process of the autonomous mobile carrier on the flat road may be collected by the laser radar.
  • the 3D scene reconstruction may be performed based on the point cloud data collected to obtain the point cloud model of the 3D scene.
  • the point cloud model of the 3D scene may be divided and the point cloud data of an object in the scene of the road other than the road may be filtered out, to obtain the road.
  • the point cloud data may be divided into near-distance point cloud data and the far-distance point cloud data based on the characteristics of the point cloud data collected by the laser radar.
  • the thickness of the near-distance plane and the thickness of the far-distance plane may be determined according to the near-distance plane obtained by fitting the near-distance point cloud data and the far-distance plane obtained by fitting the far-distance point cloud data. It may be determined whether the internal parameter is precise based on the thickness of the far-distance plane and the thickness of the near-distance plane. A new way for automatically detecting the precision of the internal parameter of the laser radar is provided.
  • the road thickness determining module 440 may be further configured to divide the point cloud data of the road into near-distance point cloud data and far-distance point cloud data; obtain a near-distance plane by fitting the near-distance point cloud data to obtain a thickness of the near-distance plane; and obtain a far-distance plane by fitting the far-distance point cloud data to obtain a thickness of the far-distance plane.
  • the internal parameter precision detecting module 450 may be further configured to determine whether the internal parameter of the laser radar is precise based on the thickness of the near-distance plane and the thickness of the far-distance plane.
  • the road thickness determining module 440 may be further configured to obtain the near-distance plane by fitting the near-distance point cloud data; and determine the thickness of the near-distance plane based on a distance from each point of the near-distance point cloud data to the near-distance plane.
  • the road thickness detecting module 440 may be further configured to obtain a road plane by fitting the point cloud data of the road; determine a distance from each point of the point cloud data to the road plane; in response to determining that a ratio of the number of point cloud data at a first distance to the total number of point cloud data and a ratio of the number of point cloud data at a second distance to the total number of point cloud data are preset ratio thresholds, determine that the point cloud data at the first distance from the road plane defines a first boundary of the road and the point cloud data at the second distance from the road plane defines a second boundary of the road; and determine a distance between the first boundary and the second boundary as the thickness of the road.
  • the internal parameter precision detecting module 450 may be further configured to in response to determining that the thickness of the road is greater than a preset thickness threshold, determine that the internal parameter of the laser radar is imprecise.
  • FIG. 6 is a schematic diagram illustrating an apparatus according to embodiments of the present disclosure.
  • a block diagram of an exemplary apparatus 12 applicable for implementing embodiments of the present disclosure is illustrated in FIG. 6.
  • the apparatus 12 illustrated in FIG. 6 is merely an example, and should not be construed to limit functions and usage scope of embodiments of the present disclosure.
  • the apparatus 12 may be in the form of a general-purpose computing apparatus.
  • the apparatus 12 may include, but not limited to, one or more processors or processing units 16, a system memory 28, and a bus 18 for connecting different system components (including the system memory 28 and the processing unit 16).
  • the bus 18 may represent one or more of several types of bus architectures, including a memory bus or a memory control bus, a peripheral bus, a graphic acceleration port bus, a processor bus, or a local bus using any bus architecture in a variety of bus architectures.
  • these architectures include, but are not limited to, an industry standard architecture (ISA) bus, a micro-channel architecture (MAC) bus, an enhanced ISA bus, a video electronic standard association (VESA) local bus, and a peripheral component interconnect (PCI) bus.
  • the apparatus 12 may include multiple kinds of computer-readable media. These media may be any storage media accessible by the apparatus 12, including transitory or non-transitory storage medium and movable or unmovable storage medium.
  • the memory 28 may include a computer-readable medium in a form of volatile memory, such as a random-access memory (RAM) 30 and/or a high-speed cache memory 32.
  • the apparatus 12 may further include other transitory/non-transitory and movable/unmovable computer system storage media.
  • the storage system 34 may be used to read and write from and to non-removable and non-volatile magnetic media (not illustrated in the figure, commonly referred to as "hard disk drives").
  • a disk driver for reading and writing to and from movable and non-volatile magnetic disks (e.g. floppy disks) may be provided, as well as an optical driver for reading and writing to and from movable and non-volatile optical disks (e.g. a compact disc read-only memory (CD-ROM), a digital video disc read-only memory (DVD-ROM), or other optical media).
  • each driver may be connected to the bus 18 via one or more data interfaces.
  • the system memory 28 may include at least one program product.
  • the program product may have a set of (for example at least one) program modules.
  • the program modules may be configured to perform functions of embodiments of the present disclosure.
  • a program/application 40 having a set of (at least one) program modules 42 may be stored in system memory 28.
  • the program modules 42 may include, but are not limited to, an operating system, one or more application programs, other program modules and program data. Any one or a combination of the above examples may include an implementation in a network environment.
  • the program modules 42 may be generally configured to implement functions and/or methods described in embodiments of the present disclosure.
  • the apparatus 12 may also communicate with one or more external devices 14 (e.g., a keyboard, a pointing device, a display 24, etc.), with one or more devices that enable a user to interact with the apparatus 12, and/or with any device (e.g., a network card, a modem, etc.) that enables the apparatus 12 to communicate with one or more other computing devices.
  • the above communications can be achieved by the input/output (I/O) interface 22.
  • the display 24 may be not a separate physical entity, but may be embedded into a mirror. When nothing is displayed on a display surface of the display 24, the display surface of the display 24 may be visually same to the mirror.
  • the apparatus 12 may be connected to and communicate with one or more networks such as a local area network (LAN), a wide area network (WAN) and/or a public network such as the Internet through a network adapter 20. As illustrated in FIG. 6, the network adapter 20 may communicate with other modules of the apparatus 12 over the bus 18. It should be understood that although not illustrated in the figures, other hardware and/or software modules may be used in combination with the apparatus 12, including, but not limited to, microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, as well as data backup storage systems and the like.
  • the processing unit 16 can perform various functional applications and data processing by running programs stored in the system memory 28, for example, to perform a method for detecting a precision of an internal parameter of a laser radar according to embodiments of the present disclosure.
  • Embodiments of the present disclosure provide a computer readable storage medium, having computer programs stored thereon that when executed by a processor cause the processor to perform the method for detecting a precision of an internal parameter of a laser radar according to embodiments of the present disclosure.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • the computer readable storage medium may be, but not limited to, for example, electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, component or any combination thereof.
  • a specific example of the computer readable storage medium includes (a non-exhaustive list): an electrical connection having one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read only memory (ROM), an erasable programmable read only memory (EPROM or a flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical memory component, a magnetic memory component, or any suitable combination thereof.
  • the computer readable storage medium may be any tangible medium including or storing programs. The programs may be used by or in connection with an instruction execution system, apparatus, or device.
  • the computer readable signal medium may include a data signal propagated in baseband or as a part of carrier and carries computer readable program codes. Such propagated data signal may be in many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination thereof.
  • the computer readable signal medium may also be any computer readable medium other than the computer readable storage medium.
  • the computer readable medium may send, propagate, or transport programs used by or in connection with an instruction execution system, apparatus, or device.
  • the program codes stored on the computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, or any suitable combination thereof.
  • the computer program codes for carrying out operations of embodiments of the present disclosure may be written in one or more programming languages.
  • the programming language includes an object oriented programming language, such as Java, Smalltalk, C++, as well as conventional procedural programming language, such as "C" language or similar programming language.
  • the program codes may be executed entirely on a user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer or an external computer (such as using an Internet service provider to connect over the Internet) through any kind of network, including a local area network (hereafter referred as to LAN) or a wide area network (hereafter referred as to WAN).
  • embodiments of the present disclosure further provide a vehicle.
  • the vehicle includes a vehicle body, the device according to any one of embodiments of the present disclosure arranged on the vehicle body and a laser radar and an imager connected in communication with the device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Traffic Control Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Claims (7)

  1. A computer-implemented method for detecting a precision of an internal parameter of a laser radar, comprising:
    obtaining (110) point cloud data collected by the laser radar provided on an autonomous mobile carrier travelling on a flat road;
    performing (120) a three-dimensional scene reconstruction based on the point cloud data to obtain a point cloud model of a three-dimensional scene;
    dividing (130) the point cloud model of the three-dimensional scene based on a preset point cloud division threshold to remove the point cloud data of an object other than the road, so as to obtain a road point cloud;
    characterized in that the method further comprises:
    determining (140) a thickness of the road by fitting the road point cloud to a road plane and determining whether the internal parameter of the laser radar is precise by comparing the thickness of the road with a preset thickness threshold, comprising:
    dividing (240) the road point cloud into near-distance point cloud data and far-distance point cloud data based on a preset division distance, wherein the near-distance point cloud data is within the preset division distance from the laser radar and the far-distance point cloud data is beyond the preset division distance from the laser radar, the preset division distance being proportional to a measuring range of the laser radar;
    obtaining (250) a near-distance plane by fitting the near-distance point cloud data, calculating a distance from each near-distance point of the near-distance point cloud data to the near-distance plane, and obtaining a distance mean, a variance or a mean function of the distance of each near-distance point to obtain a thickness of the near-distance plane;
    obtaining (260) a far-distance plane by fitting the far-distance point cloud data, calculating a distance from each far-distance point of the far-distance point cloud data to the far-distance plane, and obtaining a distance mean, a variance or a mean function of the distance of each far-distance point to obtain a thickness of the far-distance plane;
    determining a difference between the thickness of the near-distance plane and the thickness of the far-distance plane; and
    determining (270) whether the internal parameter of the laser radar is precise by comparing the difference with a preset threshold.
  2. The method according to claim 1, wherein determining (140) the thickness of the road by fitting the road point cloud to the road plane comprises:
    determining (350) a distance from each point of the road point cloud to the road plane;
    in response to determining that a ratio of the number of points at a first distance to the total number of points and a ratio of a number of point cloud data at a second distance to the total number of point cloud data are preset ratio thresholds, determining (360) that the point cloud data at the first distance from the road plane defines a first boundary of the road and the point cloud data at the second distance from the road plane defines a second boundary; and
    determining (370) a distance between the first boundary and the second boundary as the thickness of the road.
  3. The method according to claim 1 or 2, wherein determining (140; 380) whether the internal parameter of the laser radar is precise by comparing the thickness of the road with the preset thickness threshold comprises:
    in response to determining that the thickness of the road is greater than the preset thickness threshold, determining that the internal parameter of the laser radar is imprecise.
  4. A device for detecting a precision of an internal parameter of a laser radar, comprising:
    a point cloud data obtaining module (410), configured to obtain point cloud data collected by the laser radar provided on an autonomous mobile carrier travelling on a flat road;
    a three-dimensional scene building module (420), configured to perform a three-dimensional scene reconstruction based on the point cloud data to obtain a point cloud model of a three-dimensional scene;
    a road dividing module (430), configured to divide the point cloud model of the three-dimensional scene based on a preset point cloud division threshold to remove the point cloud data of an object other than the road, so as to obtain a road point cloud;
    characterized in that the device further comprises:
    a road thickness determining module (440), configured to determine a thickness of the road by fitting the road point cloud to a road plane; and
    an internal parameter precision detecting module (450), configured to determine whether the internal parameter of the laser radar is precise by comparing the thickness of the road with a preset thickness threshold,
    wherein the road thickness determining module (440) is further configured to divide the point cloud data of the road into near-distance point cloud data and far-distance point cloud data based on a preset division distance, wherein the near-distance point cloud data is within the preset division distance from the laser radar and the far-distance point cloud data is beyond the preset division distance from the laser radar, the preset division distance being proportional to a measuring range of the laser radar;
    obtain a near-distance plane by fitting the near-distance point cloud data, calculate a distance from each near-distance point of the near-distance point cloud data to the near-distance plane, and obtain a distance mean, a variance or a mean function of the distance of each near-distance point to obtain a thickness of the near-distance plane; and
    obtain a far-distance plane by fitting the far-distance point cloud data, calculate a distance from each far-distance point of the far-distance point cloud data to the far-distance plane, and obtain a distance mean, a variance or a mean function of the distance of each far-distance point to obtain a thickness of the far-distance plane; and
    the internal parameter precision detecting module (450) is further configured to determine a difference between the thickness of the near-distance plane and the thickness of the far-distance plane and to determine whether the internal parameter of the laser radar is precise by comparing the difference with a preset threshold.
  5. The device according to claim 4, wherein the road thickness detecting module (440) is further configured to:
    determine a distance from each point of the road point cloud to the road plane;
    in response to determining that a ratio of the number of points at a first distance to the total number of points and a ratio of a number of point cloud data at a second distance to the total number of point cloud data are preset ratio thresholds, determine that the point cloud data at the first distance from the road plane defines a first boundary of the road and the point cloud data at the second distance from the road plane defines a second boundary; and
    determine a distance between the first boundary and the second boundary as the thickness of the road.
  6. The device according to claim 4 or 5, wherein the internal parameter precision detecting module (450) is further configured to:
    in response to determining that the thickness of the road is greater than the preset thickness threshold, determine that the internal parameter of the laser radar is imprecise.
  7. A non-transitory computer readable storage medium having a computer program stored thereon, wherein, when the program is executed by a processor comprised in a device for detecting a precision of an internal parameter of a laser radar according to claim 4, a method for detecting a precision of an internal parameter of a laser radar according to any one of claims 1 to 3 is performed by the processor.
EP19195035.1A 2018-09-06 2019-09-03 Method and device for detecting the precision of an internal parameter of a laser radar Active EP3620823B1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811039514.2A CN109143207B (zh) 2018-09-06 2018-09-06 Method, apparatus, device and medium for verifying the precision of an internal parameter of a laser radar

Publications (2)

Publication Number Publication Date
EP3620823A1 EP3620823A1 (fr) 2020-03-11
EP3620823B1 true EP3620823B1 (fr) 2022-08-24

Family

ID=64827460

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19195035.1A Active EP3620823B1 (fr) 2018-09-06 2019-09-03 Procédé et dispositif permettant de détecter la précision d'un paramètre interne d'un radar laser

Country Status (4)

Country Link
US (1) US11506769B2 (fr)
EP (1) EP3620823B1 (fr)
JP (1) JP7112993B2 (fr)
CN (1) CN109143207B (fr)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110223297A (zh) * 2019-04-16 2019-09-10 广东康云科技有限公司 基于扫描点云数据的分割与识别方法、系统及存储介质
CN110333503B (zh) * 2019-05-29 2023-06-09 菜鸟智能物流控股有限公司 激光雷达的标定方法、装置及电子设备
US11460581B2 (en) * 2019-06-10 2022-10-04 Toyota Research Institute, Inc. Systems and methods for reducing LiDAR points
CN110516564A (zh) * 2019-08-06 2019-11-29 深兰科技(上海)有限公司 路面检测方法和装置
CN112630749B (zh) * 2019-09-24 2023-06-09 北京百度网讯科技有限公司 用于输出提示信息的方法和装置
CN110687549B (zh) * 2019-10-25 2022-02-25 阿波罗智能技术(北京)有限公司 障碍物检测方法和装置
CN110988848B (zh) * 2019-12-23 2022-04-26 潍柴动力股份有限公司 车载激光雷达相对位姿监测方法及设备
DE102020007645A1 (de) * 2020-04-03 2021-10-07 Daimler Ag Verfahren zur Kalibrierung eines Lidarsensors
CN111695489B (zh) * 2020-06-09 2023-08-11 阿波罗智能技术(北京)有限公司 建模路线的验证方法、装置、无人车及存储介质
CN111999720B (zh) * 2020-07-08 2022-08-16 深圳市速腾聚创科技有限公司 激光雷达参数调整方法、激光雷达系统和计算机存储介质
CN112147635B (zh) * 2020-09-25 2024-05-31 北京亮道智能汽车技术有限公司 一种检测系统、方法及装置
CN113176547B (zh) * 2020-10-20 2022-03-22 苏州思卡信息系统有限公司 基于贝塞尔建模的路侧雷达的实时滤除背景的方法
CN112270769B (zh) * 2020-11-11 2023-11-10 北京百度网讯科技有限公司 一种导游方法、装置、电子设备及存储介质
CN112634260A (zh) * 2020-12-31 2021-04-09 上海商汤智能科技有限公司 一种地图评估的方法、装置、电子设备及存储介质
CN113051304B (zh) * 2021-04-02 2022-06-24 中国有色金属长沙勘察设计研究院有限公司 一种雷达监测数据与三维点云融合的计算方法
CN114529727A (zh) * 2022-04-25 2022-05-24 武汉图科智能科技有限公司 一种基于LiDAR和图像融合的街道场景语义分割方法
CN114966793B (zh) * 2022-05-25 2024-01-26 苏州天硕导航科技有限责任公司 三维测量系统、方法及gnss系统

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005055311A (ja) * 2003-08-05 2005-03-03 Yaskawa Electric Corp スキャナ装置のキャリブレーション方法
JP4890894B2 (ja) * 2006-03-13 2012-03-07 オムロンオートモーティブエレクトロニクス株式会社 車載用レーダ装置
JP5088401B2 (ja) * 2010-06-23 2012-12-05 日本電気株式会社 道路構造測定方法および道路面測定装置
DE102011084264A1 (de) * 2011-10-11 2013-04-11 Robert Bosch Gmbh Verfahren und Vorrichtung zum Kalibrieren eines Umfeldsensors
CA2819956C (fr) * 2013-07-02 2022-07-12 Guy Martin Methode de modelisation et d'etalonnage de camera haute precision
JP6357050B2 (ja) * 2014-08-19 2018-07-11 日野自動車株式会社 運転支援システム
CN105404844B (zh) * 2014-09-12 2019-05-31 广州汽车集团股份有限公司 一种基于多线激光雷达的道路边界检测方法
KR101630248B1 (ko) * 2014-11-13 2016-06-14 한국건설기술연구원 레이저 스캐닝과 영상을 이용한 도로의 기준 도로폭 측정방법
JP2016120892A (ja) * 2014-12-25 2016-07-07 富士通株式会社 立体物検出装置、立体物検出方法および立体物検出プログラム
CA3067177A1 (fr) * 2015-02-10 2016-08-18 Mobileye Vision Technologies Ltd. Carte eparse pour la navigation d'un vehicule autonome
CN104820217B (zh) * 2015-04-14 2016-08-03 同济大学 一种多法向平面的多元线阵探测成像激光雷达的检校方法
CN104931943A (zh) * 2015-05-26 2015-09-23 中公高科养护科技股份有限公司 一种路面测厚雷达的计量方法和计量装置
CN105627938A (zh) * 2016-01-07 2016-06-01 厦门大学 一种基于车载激光扫描点云的路面沥青厚度检测方法
CN107945198B (zh) * 2016-10-13 2021-02-23 北京百度网讯科技有限公司 用于标注点云数据的方法和装置
CN107167788B (zh) * 2017-03-21 2020-01-21 深圳市速腾聚创科技有限公司 获取激光雷达校准参数、激光雷达校准的方法及系统
CN107272019B (zh) * 2017-05-09 2020-06-05 深圳市速腾聚创科技有限公司 基于激光雷达扫描的路沿检测方法
CN107179534B (zh) * 2017-06-29 2020-05-01 北京北科天绘科技有限公司 一种激光雷达参数自动标定的方法、装置及激光雷达
CN108254758A (zh) * 2017-12-25 2018-07-06 清华大学苏州汽车研究院(吴江) 基于多线激光雷达和gps的三维道路构建方法

Also Published As

Publication number Publication date
JP2020042024A (ja) 2020-03-19
CN109143207B (zh) 2020-11-10
US20200081105A1 (en) 2020-03-12
US11506769B2 (en) 2022-11-22
JP7112993B2 (ja) 2022-08-04
EP3620823A1 (fr) 2020-03-11
CN109143207A (zh) 2019-01-04

Similar Documents

Publication Publication Date Title
EP3620823B1 (fr) Procédé et dispositif permettant de détecter la précision d'un paramètre interne d'un radar laser
CN109270545B (zh) 一种定位真值校验方法、装置、设备及存储介质
CN110687549B (zh) 障碍物检测方法和装置
CN109405836B (zh) 用于确定无人驾驶汽车的可驾驶导航路径的方法和系统
CN110163930B (zh) 车道线生成方法、装置、设备、系统及可读存储介质
CN109461211B (zh) 基于视觉点云的语义矢量地图构建方法、装置和电子设备
CN109459734B (zh) 一种激光雷达定位效果评估方法、装置、设备及存储介质
CN109410735B (zh) 反射值地图构建方法和装置
CN113657224B (zh) 车路协同中用于确定对象状态的方法、装置、设备
CN109435955B (zh) 一种自动驾驶系统性能评估方法、装置、设备及存储介质
CN109407073B (zh) 反射值地图构建方法和装置
KR20210111180A (ko) 위치 추적 방법, 장치, 컴퓨팅 기기 및 컴퓨터 판독 가능한 저장 매체
EP3624055B1 (fr) Procédé de détection de terre, appareil, dispositif électronique, véhicule et support d'enregistrement
CN108734780B (zh) 用于生成地图的方法、装置和设备
CN111563450B (zh) 数据处理方法、装置、设备及存储介质
CN112258519B (zh) 一种高精度地图制作中道路的让行线自动化提取方法及装置
CN112154303B (zh) 高精度地图定位方法、系统、平台及计算机可读存储介质
EP4198901A1 (fr) Procédé et appareil d'étalonnage de paramètres extrinsèques d'une caméra
WO2024012211A1 (fr) Procédé de perception environnementale à conduite autonome, support et véhicule
CN114820749A (zh) 无人车井下定位方法、系统、设备及介质
CN113706704A (zh) 基于高精地图规划路线的方法、设备以及自动驾驶车辆
CN111833443A (zh) 自主机器应用中的地标位置重建
CN114092660A (zh) 高精地图生成方法、装置及用于生成地图的车辆
CN117671013A (zh) 点云定位方法、智能设备及计算机可读存储介质
CN114631124A (zh) 三维点云分割方法和装置、可移动平台

Legal Events

Code  Title  Details

PUAI  Public reference made under article 153(3) epc to a published international application that has entered the european phase. Free format text: ORIGINAL CODE: 0009012
STAA  Information on the status of an ep patent application or granted ep patent. Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED
AK    Designated contracting states. Kind code of ref document: A1. Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
AX    Request for extension of the european patent. Extension state: BA ME
STAA  Information on the status of an ep patent application or granted ep patent. Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE
17P   Request for examination filed. Effective date: 20200904
RBV   Designated contracting states (corrected). Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
STAA  Information on the status of an ep patent application or granted ep patent. Free format text: STATUS: EXAMINATION IS IN PROGRESS
17Q   First examination report despatched. Effective date: 20210319
RAP1  Party data changed (applicant data changed or rights of an application transferred). Owner name: APOLLO INTELLIGENT DRIVING TECHNOLOGY (BEIJING) CO., LTD.
REG   Reference to a national code. Ref country code: DE; Ref legal event code: R079; Ref document number: 602019018579; Country of ref document: DE; Free format text: PREVIOUS MAIN CLASS: G01S0017930000; Ipc: G01S0007497000
GRAP  Despatch of communication of intention to grant a patent. Free format text: ORIGINAL CODE: EPIDOSNIGR1
STAA  Information on the status of an ep patent application or granted ep patent. Free format text: STATUS: GRANT OF PATENT IS INTENDED
RIC1  Information provided on ipc code assigned before grant. Ipc: G06T 7/80 20170101ALI20220325BHEP; Ipc: G01S 17/931 20200101ALI20220325BHEP; Ipc: G01S 17/89 20200101ALI20220325BHEP; Ipc: G01S 17/42 20060101ALI20220325BHEP; Ipc: G01S 7/481 20060101ALI20220325BHEP; Ipc: G01S 7/497 20060101AFI20220325BHEP
INTG  Intention to grant announced. Effective date: 20220419
GRAS  Grant fee paid. Free format text: ORIGINAL CODE: EPIDOSNIGR3
GRAA  (expected) grant. Free format text: ORIGINAL CODE: 0009210
STAA  Information on the status of an ep patent application or granted ep patent. Free format text: STATUS: THE PATENT HAS BEEN GRANTED
AK    Designated contracting states. Kind code of ref document: B1. Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
REG   Reference to a national code. Ref country code: CH; Ref legal event code: EP
REG   Reference to a national code. Ref country code: IE; Ref legal event code: FG4D
REG   Reference to a national code. Ref country code: AT; Ref legal event code: REF; Ref document number: 1514024; Country of ref document: AT; Kind code of ref document: T; Effective date: 20220915. Ref country code: DE; Ref legal event code: R096; Ref document number: 602019018579; Country of ref document: DE
REG   Reference to a national code. Ref country code: LT; Ref legal event code: MG9D
REG   Reference to a national code. Ref country code: NL; Ref legal event code: MP; Effective date: 20220824
PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]. Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: SE, RS, NL, LV, LT, FI (effective date: 20220824); PT (effective date: 20221226); NO (effective date: 20221124)
REG   Reference to a national code. Ref country code: AT; Ref legal event code: MK05; Ref document number: 1514024; Country of ref document: AT; Kind code of ref document: T; Effective date: 20220824
PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]. Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: PL, HR (effective date: 20220824); IS (effective date: 20221224); GR (effective date: 20221125)
PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]. Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: SM, RO, ES, DK, CZ, AT (effective date: 20220824)
REG   Reference to a national code. Ref country code: CH; Ref legal event code: PL
REG   Reference to a national code. Ref country code: DE; Ref legal event code: R097; Ref document number: 602019018579; Country of ref document: DE. Ref country code: BE; Ref legal event code: MM; Effective date: 20220930
PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]. Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: SK, MC, EE (effective date: 20220824)
PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]. LU: lapse because of non-payment of due fees (effective date: 20220903); AL: lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit (effective date: 20220824)
PLBE  No opposition filed within time limit. Free format text: ORIGINAL CODE: 0009261
STAA  Information on the status of an ep patent application or granted ep patent. Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT
P01   Opt-out of the competence of the unified patent court (upc) registered. Effective date: 20230530
PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]. Lapse because of non-payment of due fees: LI (effective date: 20220930); IE (effective date: 20220903); CH (effective date: 20220930)
26N   No opposition filed. Effective date: 20230525
PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]. Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: SI (effective date: 20220824)
PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]. Lapse because of non-payment of due fees: BE (effective date: 20220930)
PGFP  Annual fee paid to national office [announced via postgrant information from national office to epo]. FR: payment date 20230808, year of fee payment 5; DE: payment date 20230808, year of fee payment 5
PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]. HU: lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit, invalid ab initio (effective date: 20190903)
PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]. CY: lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit (effective date: 20220824)
GBPC  Gb: european patent ceased through non-payment of renewal fee. Effective date: 20230903
PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]. Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: MK, IT (effective date: 20220824)