WO2019037484A1 - Method, apparatus, device and storage medium for calibrating a laser scanning device - Google Patents
Method, apparatus, device and storage medium for calibrating a laser scanning device
- Publication number
- WO2019037484A1 (PCT/CN2018/087251)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- coordinate
- cloud data
- point cloud
- frame
- coordinate system
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C15/00—Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
- G01C15/002—Active optical surveying means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/42—Simultaneous measurement of distance and other co-ordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4808—Evaluating distance, position or velocity data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4817—Constructional features, e.g. arrangements of optical elements relating to scanning
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
- G01S7/4972—Alignment of sensor
Definitions
- the present application relates to the field of driverless technology, and in particular to a method, apparatus, device and storage medium for calibrating a laser scanning device.
- a navigation system in an unmanned vehicle can provide a navigation path for the vehicle to travel along.
- the unmanned vehicle can also scan the surrounding environment in real time through the laser scanning device to obtain a three-dimensional image of the surroundings, so that it can travel according to both the surrounding environment and the navigation path, avoid obstacles in the surrounding environment, and further ensure driving safety.
- the laser scanning device needs to be calibrated before it is used.
- conventionally, the laser scanning device is calibrated by setting up a marker in a calibration field and placing on it a plurality of calibration points with clearly identifiable positions, thereby establishing a calibration field containing a plurality of calibration points.
- a vehicle coordinate system with the unmanned vehicle as the coordinate origin is established in the calibration field, and the coordinates of each calibration point in the vehicle coordinate system are measured manually with traditional surveying methods.
- a laser coordinate system with the laser scanning device as the origin is also established, and the calibration field is scanned by the laser scanning device to obtain one frame of point cloud data; this frame contains the set of surface points of the marker in the calibration field and the coordinates of each point of that set in the laser coordinate system.
- a plurality of calibration points are then manually selected from the set of surface points, yielding the coordinates of each calibration point in the laser coordinate system.
- the SVD (Singular Value Decomposition) algorithm is then used to calculate the offset pose of the laser coordinate system relative to the vehicle coordinate system.
- the offset pose consists of the value of the offset position of the laser coordinate system relative to the vehicle coordinate system and the value of the yaw angle, and is used directly as the value of the laser external parameters of the laser scanning device.
- the yaw angle is the angle between the x-axis of the laser coordinate system (pointing straight ahead of the laser scanning device) and the x-axis of the vehicle coordinate system (pointing straight ahead of the driverless vehicle).
- the laser scanning device is calibrated with the values of these external parameters.
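- for illustration, a minimal sketch of this background step, assuming matched 2D calibration points in both frames (the patent supplies no implementation; function and variable names are hypothetical): the SVD (Kabsch) method recovers the rotation and translation that best map laser-frame points onto vehicle-frame points.

```python
import numpy as np

def svd_align(laser_pts, vehicle_pts):
    """Estimate the rigid transform mapping laser-frame calibration points
    onto their manually measured vehicle-frame counterparts via SVD (Kabsch).
    laser_pts, vehicle_pts: (N, 2) arrays of matched calibration points."""
    ml, mv = laser_pts.mean(axis=0), vehicle_pts.mean(axis=0)   # centroids
    H = (laser_pts - ml).T @ (vehicle_pts - mv)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
    R = Vt.T @ np.diag([1.0, d]) @ U.T          # rotation (yaw)
    t = mv - R @ ml                             # offset position
    return t, np.arctan2(R[1, 0], R[0, 0])      # (offset position, yaw angle)
```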
- the above method requires a calibration field to be established manually, followed by manual measurement or identification to determine the coordinates of each calibration point in the vehicle coordinate system and in the laser coordinate system, which makes this calibration method inefficient.
- a method, apparatus, device, and storage medium for laser scanning device calibration are provided.
- a method of calibrating a laser scanning device comprising:
- a device for calibrating a laser scanning device comprising:
- An acquiring module configured to acquire, from at least two frames of point cloud data obtained by scanning a target area with the laser scanning device, a first coordinate of a feature element in each frame of point cloud data, where the first coordinate is the coordinate of the feature element in the laser coordinate system;
- a first determining module configured to determine, according to map data of the target area, for each frame of point cloud data, a second coordinate of the feature element in the vehicle coordinate system;
- a second determining module configured to determine, for each frame of point cloud data, an offset pose of that frame according to the first coordinate and the second coordinate of the feature element;
- a calculating module configured to calculate the values of the laser external parameters of the laser scanning device according to the offset poses of the at least two frames of point cloud data, so as to calibrate the laser scanning device.
- a computer apparatus comprising a memory and a processor, the memory storing computer readable instructions, the computer readable instructions being executed by the processor such that the processor performs the following steps:
- a non-transitory computer readable storage medium storing computer readable instructions, when executed by one or more processors, causes the one or more processors to perform the following steps:
- FIG. 1 is a schematic diagram of a driving system provided by an embodiment of the present application.
- FIG. 2 is a flow chart of a method for calibrating a laser scanning device according to an embodiment of the present application.
- FIG. 3 is a schematic diagram of a preset scanning route provided by an embodiment of the present application.
- FIG. 4 is a schematic diagram of a first distance provided by an embodiment of the present application.
- FIG. 5 is a schematic diagram of a second distance provided by an embodiment of the present application.
- FIG. 6 is a schematic structural diagram of an apparatus for calibrating a laser scanning device according to an embodiment of the present application.
- FIG. 7 is a schematic structural diagram of a computer device according to an embodiment of the present application.
- the embodiment of the present application discloses a method of calibrating a laser scanning device.
- the laser scanning device can be one installed in any carrier that requires navigation.
- for example, the laser scanning device may be installed in a carrier such as an unmanned vehicle, a drone, or a robot that requires navigation, which is not specifically limited in the embodiments of the present application.
- the embodiment of the present application is described by taking only a laser scanning device installed in a vehicle as an example.
- the driving system includes: a laser scanning device 101 and a navigation system 102.
- Map data is pre-stored in the navigation system 102, and the map data includes at least position coordinates of each feature element in the target area in the map coordinate system.
- the navigation system 102 includes a GPS (Global Positioning System) and an IMU (Inertial Measurement Unit).
- the navigation system 102 can receive satellite signals via GPS and locate the current position coordinates of the vehicle in the map coordinate system in real time.
- the navigation system 102 can determine a navigation path for the vehicle in the map data according to the vehicle's current position coordinates and its destination coordinates, and convert the corresponding path coordinates in the map coordinate system, via the geocentric coordinate system and the station-center coordinate system, into the vehicle coordinate system, so that the vehicle travels along the navigation path expressed in the vehicle coordinate system.
- the accelerometer and the gyroscope are integrated in the IMU.
- the navigation system 102 can also acquire the heading angle and the traveling speed of the vehicle in the vehicle coordinate system in real time through the IMU, thereby monitoring the running state of the vehicle in real time.
- the driving system further includes a laser scanning device 101.
- the vehicle can also scan the surrounding environment in real time through the laser scanning device 101 to obtain multi-frame point cloud data of the surroundings; each frame of point cloud data includes the position coordinates, in the laser coordinate system, of each obstacle in the surrounding environment, and these are converted into the vehicle coordinate system. The vehicle can then travel according to both the navigation path in the vehicle coordinate system and the obstacles in the surrounding environment, further ensuring driving safety.
- the map data may be map data of a to-be-traveled area that is set and stored in advance according to user needs. Further, the map data may be high-precision map data.
- the high-precision map data is a next-generation navigation map with centimeter-level positioning accuracy that includes road auxiliary facility information (such as traffic lights, electronic eyes, and traffic signs) and dynamic traffic information; navigation can be performed more accurately with the high-precision map data.
- the vehicle may be an unmanned vehicle that acquires a navigation path through the navigation system 102 and multi-frame point cloud data of the surrounding environment through the laser scanning device 101, so that the unmanned vehicle can combine the navigation path in the vehicle coordinate system with the obstacles in the surrounding environment and travel safely.
- the map coordinate system is generally WGS84 (World Geodetic System 1984), and the position coordinates of each feature element are the latitude and longitude coordinates and elevation coordinates of the feature element in the WGS84 coordinate system.
- the vehicle coordinate system uses the vehicle as the coordinate origin, with the forward direction of the vehicle as the positive x-axis, the direction horizontally to the left and perpendicular to the x-axis as the positive y-axis, and the vertically upward direction as the positive z-axis.
- the laser coordinate system uses the laser scanning device as the coordinate origin, with the forward direction of the laser scanning device as the positive x-axis, the direction horizontally to the left and perpendicular to the x-axis as the positive y-axis, and the vertically upward direction as the positive z-axis.
- the geocentric coordinate system uses the Earth's centroid as the coordinate origin o, with the eastward direction of the intersection line of the prime meridian plane and the equatorial plane as the positive x-axis and the northward direction of the Earth's rotation axis as the positive z-axis; the y-axis is perpendicular to the xoz plane in the direction determined by the right-hand rule, establishing a spatial rectangular coordinate system.
- the station-center coordinate system uses the observation station as the coordinate origin, with the direction pointing east along the Earth ellipsoid (the east direction) as the positive x-axis, the direction pointing north along the meridian (the north direction) as the positive y-axis, and the outward normal direction of the Earth ellipsoid (the sky direction) as the positive z-axis, establishing a spatial rectangular coordinate system.
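- the conversion chain described above (map coordinates → geocentric coordinates → station-center coordinates) can be sketched as follows; this is a standard WGS84 implementation, not code from the application, and the function names are illustrative.

```python
import numpy as np

A = 6378137.0            # WGS84 semi-major axis (m)
E2 = 6.69437999014e-3    # WGS84 first eccentricity squared

def geodetic_to_ecef(lat, lon, h):
    """Latitude/longitude (rad) and elevation (m) -> geocentric XYZ."""
    n = A / np.sqrt(1.0 - E2 * np.sin(lat) ** 2)   # prime-vertical radius
    return np.array([(n + h) * np.cos(lat) * np.cos(lon),
                     (n + h) * np.cos(lat) * np.sin(lon),
                     (n * (1.0 - E2) + h) * np.sin(lat)])

def ecef_to_enu(p, lat0, lon0, h0):
    """Geocentric point p -> east/north/up coordinates in the station-center
    frame whose origin is the geodetic point (lat0, lon0, h0)."""
    o = geodetic_to_ecef(lat0, lon0, h0)
    sl, cl = np.sin(lat0), np.cos(lat0)
    so, co = np.sin(lon0), np.cos(lon0)
    R = np.array([[-so,       co,      0.0],    # east axis
                  [-sl * co, -sl * so, cl],     # north axis
                  [cl * co,   cl * so, sl]])    # up axis
    return R @ (p - o)
```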
- the laser external parameters of the laser scanning device are the offset position and the yaw angle between the laser coordinate system and the vehicle coordinate system.
- the offset position is the offset distance of the laser coordinate system relative to the vehicle coordinate system in the x-axis and y-axis directions.
- the yaw angle is the angle between the x-axis of the laser coordinate system and the x-axis of the vehicle coordinate system, that is, the angle between the forward direction of the laser scanning device and the forward direction of the vehicle.
- the present application also relates to the heading angle of the vehicle.
- the heading angle refers to the angle between the front of the vehicle and the true north direction.
- the execution body of the method is a terminal, and the terminal may be an in-vehicle terminal or any terminal having a data processing function.
- the method includes:
- the terminal scans the target area based on the preset scan route by using the laser scanning device to obtain at least two frames of point cloud data, where the target area is any area including the feature element.
- the laser scanning device is installed in the vehicle and can be disposed on the front side or the side of the vehicle for scanning the environment around the vehicle.
- the preset scan route may be a travel route designed to scan the target area.
- the step may be: the terminal acquires a preset scan route, and uses the preset scan route as a travel route of the vehicle to control the vehicle to travel along the preset scan route.
- the terminal controls the laser scanning device to scan the target area once every preset time period to obtain a frame of point cloud data of the target area.
- the terminal controls the laser scanning device to perform at least two scans to obtain point cloud data of at least two frames of the target area.
- the per-frame point cloud data includes, but is not limited to, a set of surface points of each obstacle in the target area, and position coordinates of each surface point in the laser coordinate system.
- the preset duration may be set and changed according to the needs of the user, which is not specifically limited in this embodiment of the present application. For example, the preset duration may be 100 milliseconds, 5 seconds, or the like.
- the feature elements include, but are not limited to, fixed road curbs, road guardrails, pole-shaped features, or traffic signs in the target area. Since a feature element is an object with a fixed position in the target area, the feature elements in the target area serve as the basic elements for calibration; by determining the different coordinates of the feature elements in each coordinate system, the laser scanning device can finally be calibrated.
- the target area may be any area including a feature element.
- the terminal may select an open area with fewer pedestrians as the target area.
- in such an area, unnecessary noise data from other vehicles and the like is reduced, which lessens the interference of environmental noise and improves the accuracy of the first coordinates of the feature elements subsequently extracted from the point cloud data.
- the preset scan route may be a scan route determined based on the target area.
- the determined preset scan route is a circular route around the target area.
- during travel, the heading of the vehicle may be any direction, such as east, south, west, or north. The terminal can therefore control the vehicle to travel along the loop route so that point cloud data of the target area is obtained in every traveling direction. Moreover, since the vehicle keeps to one side of the road in observance of traffic rules, the point cloud data collected by the terminal is biased toward the left side or the right side.
- the terminal can control the vehicle to travel back and forth along the loop route, that is, first clockwise and then counterclockwise, so that scanning is performed both while the vehicle keeps to the left side of the road and while it keeps to the right side, which improves the accuracy of determining the values of the laser external parameters from the offset poses of the frames of point cloud data.
- for example, if the target area is area A, the preset scan route may be a loop route around area A; that is, the terminal controls the vehicle to travel clockwise along the loop route from starting point B back to starting point B, and then to travel one lap counterclockwise along the loop route from starting point B.
- For each frame of point cloud data, the terminal extracts the first coordinate of the feature element in the laser coordinate system.
- since each frame of point cloud data includes the set of surface points of each obstacle in the target area and the position coordinates of each surface point in the laser coordinate system, the terminal needs to extract from each frame of point cloud data the first coordinate of the feature element, the first coordinate being the coordinate of the feature element in the laser coordinate system.
- for each frame of point cloud data, the terminal extracts the point set corresponding to each feature element from the point cloud data using a preset extraction algorithm. For each feature element, the set of position coordinates of its point set in the laser coordinate system is used as the first coordinate of that feature element; the first coordinates of the feature elements contained in each frame of point cloud data are thus obtained.
- the preset extraction algorithm may be set and changed according to the user's needs, which is not specifically limited in this embodiment of the present application.
- the preset extraction algorithm may be: a segmentation based extraction algorithm or a detection based extraction algorithm.
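- as one hedged illustration of a segmentation-based extraction (not the specific algorithm of the application), pole-like feature candidates can be found by keeping points in a height band and clustering them in the ground plane; DBSCAN and all parameter values here are assumptions.

```python
import numpy as np
from sklearn.cluster import DBSCAN   # one possible clusterer, assumed here

def extract_pole_candidates(cloud, z_min=0.5, z_max=3.0, eps=0.3, min_pts=10):
    """Rough segmentation-based extraction: keep points in a height band where
    poles and signs live, cluster them in the x-y plane, and return each
    cluster as one candidate feature element (its first coordinates).
    cloud: (N, 3) array of points in the laser coordinate system."""
    band = cloud[(cloud[:, 2] > z_min) & (cloud[:, 2] < z_max)]
    labels = DBSCAN(eps=eps, min_samples=min_pts).fit(band[:, :2]).labels_
    return [band[labels == k] for k in set(labels) if k != -1]   # -1 = noise
```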
- the foregoing steps 201-202 are one specific way for the terminal to obtain, from at least two frames of point cloud data produced by scanning the target area with the laser scanning device, the first coordinate of the feature element in each frame of point cloud data.
- the foregoing specific implementation manner may be replaced by other implementation manners.
- the foregoing implementation obtains point cloud data through real-time scanning; in an actual scenario, at least two frames of point cloud data of the target area may also be obtained from historical data produced by a previous scan, which is not specifically limited in this embodiment of the present application.
- the terminal acquires map data of the target area from the navigation system, where the map data includes latitude and longitude coordinates and elevation coordinates of the feature element in a map coordinate system.
- the navigation system of the vehicle stores the map data of the target area
- the terminal may acquire the map data of the target area from the navigation system according to the area information of the target area.
- the navigation system may also store map data of arbitrary areas other than the target area; the map data used here is the high-precision map data of the target area, which includes at least the position coordinates, in the map coordinate system, of the feature elements in the target area.
- the area information may be an area identifier or a latitude and longitude range of the target area.
- the area identifier can be the name of the area.
- since the terminal needs to obtain the deviation between the vehicle coordinate system and the laser coordinate system, after acquiring the first coordinate of a feature element in the target area, the terminal also needs to acquire the position coordinates of the feature element in the map coordinate system, so that it can subsequently determine the second coordinate of the feature element in the vehicle coordinate system.
- since the terminal can locate the current position coordinates of the vehicle in the map coordinate system through the navigation system, for each frame of point cloud data, when the terminal acquires that frame, it obtains from the map data in the navigation system the position coordinates, in the map coordinate system, of the feature elements contained in the frame, and converts those position coordinates into second coordinates in the vehicle coordinate system.
- the area information may be an area identifier, and the terminal may store the correspondence between area identifiers and map data.
- correspondingly, the step in which the terminal acquires the map data of the target area from the navigation system may be: the terminal acquires the area identifier of the target area, and obtains the map data corresponding to the target area from the correspondence between area identifiers and map data according to that identifier.
- the area information may also be a latitude and longitude range, and the terminal may store the correspondence between latitude-longitude ranges and map data.
- correspondingly, the step in which the terminal acquires the map data of the target area from the navigation system may be: the terminal acquires the latitude and longitude range of the target area, and obtains the map data corresponding to the target area from the correspondence between latitude-longitude ranges and map data according to that range.
- the terminal determines, according to the map data of the target area, the second coordinate of the feature element in the vehicle coordinate system.
- since the vehicle acquires each frame of point cloud data while moving, the vehicle coordinate system with the vehicle as its origin moves accordingly. To determine, for each frame of point cloud data, the second coordinate of the corresponding feature element in the vehicle coordinate system, the terminal, upon acquiring a frame of point cloud data, obtains from the map data the latitude and longitude coordinates and the elevation coordinate, in the map coordinate system, of each feature element contained in that frame.
- the terminal then determines the second coordinate of the feature element in the vehicle coordinate system according to those latitude and longitude coordinates and the elevation coordinate.
- the process by which the terminal determines the second coordinate of the feature element in the vehicle coordinate system from its latitude and longitude coordinates and elevation coordinate in the map coordinate system may be: the terminal first converts the latitude and longitude coordinates and elevation coordinate in the map coordinate system into position coordinates in the geocentric coordinate system with the Earth's centroid as the origin, and then converts those position coordinates into position coordinates in the station-center coordinate system.
- the terminal acquires the heading angle of the vehicle through the IMU in the navigation system and, according to the heading angle, converts the position coordinates of the feature element in the station-center coordinate system into the second coordinate in the vehicle coordinate system.
- the station-center coordinate system and the vehicle coordinate system share the same origin and differ only in the positive directions of the x- and y-axes; the angle between the positive x-axis of the vehicle coordinate system and the positive y-axis of the station-center coordinate system is the heading angle of the vehicle. Therefore, the terminal may first convert the feature element from the map coordinate system, via the geocentric coordinate system, into the station-center coordinate system, and finally obtain the second coordinate of the feature element according to the heading angle of the vehicle.
- in addition, the map data may carry a system deviation: a displacement between the position coordinates of a feature element in the map coordinate system as recorded in the map data and the actual position coordinates of that feature element in the map coordinate system. Therefore, in order to improve the accuracy of determining the second coordinate, the terminal also needs to account for the influence of the system deviation on the second coordinate.
- accordingly, the process by which the terminal converts the position coordinates of the feature element in the station-center coordinate system into the second coordinate in the vehicle coordinate system according to the heading angle may be: the terminal acquires the initial system deviation of the map data and adjusts the position coordinates in the station-center coordinate system according to that initial system deviation.
- the terminal then converts the adjusted position coordinates into the second coordinate in the vehicle coordinate system according to the heading angle.
- the process of adjusting the position coordinates may be expressed as follows: the initial system deviation may be represented by (x′₀, y′₀); that is, the position coordinates of the feature element in the station-center coordinate system are offset by x′₀ unit distances in the positive x-direction and by y′₀ unit distances in the positive y-direction.
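- a sketch of the adjustment and heading-angle conversion just described, under an assumed sign convention for the heading angle (the application does not fix one); the names and the convention are illustrative only.

```python
import numpy as np

def enu_to_vehicle(p_enu, heading, sys_dev=(0.0, 0.0)):
    """Map a station-center (ENU) position into the vehicle frame.
    heading: angle between the vehicle front and true north (rad); measured
             clockwise from north here, which is an assumed convention.
    sys_dev: initial system deviation (x'0, y'0), applied as a shift in the
             ENU plane before rotating, as the description above states."""
    e = p_enu[0] + sys_dev[0]    # adjust by the initial system deviation
    n = p_enu[1] + sys_dev[1]
    x = np.sin(heading) * e + np.cos(heading) * n    # vehicle x: forward
    y = -np.cos(heading) * e + np.sin(heading) * n   # vehicle y: left
    return np.array([x, y, p_enu[2]])
```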
- the foregoing is one specific way for the terminal to determine, based on the map data of the target area, the second coordinate of the feature element in the vehicle coordinate system for each frame of point cloud data.
- this implementation obtains the second coordinate by acquiring the map data of the target area from the navigation system; in actual operation, the terminal may also obtain the map data of the target area from the navigation system in advance, store it locally, and determine the second coordinate based on the locally stored map data, which is not specifically limited in this embodiment of the present application.
- the offset pose of each frame of point cloud data is the offset pose between the laser coordinate system and the vehicle coordinate system at the moment that frame is acquired; because the laser coordinate system (with the laser scanning device as its origin) and the vehicle coordinate system (with the vehicle as its origin) both move with the vehicle, the offset poses of different frames may be the same or different. Therefore, the terminal needs to determine the offset pose of each frame of point cloud data through the following steps 205-207.
- the terminal acquires an initial offset pose between the vehicle coordinate system and the laser coordinate system.
- the offset pose includes a value of an offset position between the vehicle coordinate system and the laser coordinate system and a value of a yaw angle.
- the offset position between the vehicle coordinate system and the laser coordinate system may be represented by the position coordinates of the origin of the laser coordinate system in the vehicle coordinate system, and the yaw angle may be represented by the angle between the x-axis of the laser coordinate system and the x-axis of the vehicle coordinate system.
- the initial offset pose of each frame of point cloud data is determined in step 205, and the offset pose of each frame of point cloud data is then determined through steps 206-207.
- the initial offset pose includes a value of an initial offset position and a value of an initial yaw angle.
- the terminal may measure, acquire, and store in advance the initial offset pose between the vehicle coordinate system and the laser coordinate system, and use it as the initial offset pose of each frame of point cloud data.
- for example, the terminal may measure, with a tool such as a tape measure, the coordinates of the laser scanning device in the vehicle coordinate system and the angle between the x-axis of the laser coordinate system and the x-axis of the vehicle coordinate system, taking the measured coordinates as the value of the initial offset position and the measured angle as the value of the initial yaw angle.
- the terminal determines, according to the initial offset pose and the second coordinate of the feature element, the third coordinate of the feature element, where the third coordinate is the coordinate of the feature element in the laser coordinate system.
- the step may be: for each frame of point cloud data, the terminal shifts the second coordinate of the feature element according to the value of the initial offset position in the initial offset pose of that frame, and then rotates the shifted second coordinate according to the value of the initial yaw angle in the initial offset pose of that frame.
- the terminal uses the position coordinate obtained after the shift and the rotation as the third coordinate of the feature element.
- for example, the value of the initial offset position can be represented by (dx″, dy″) and the initial yaw angle by dyaw″; that is, the terminal shifts the second coordinate of the feature element by dx″ unit distances in the positive x-direction and dy″ unit distances in the positive y-direction, and then rotates the shifted second coordinate counterclockwise by dyaw″ unit angles.
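- the shift-then-rotate operation just described can be sketched as follows (illustrative names; the double-prime quantities are the initial offset pose values):

```python
import numpy as np

def to_third_coordinate(second_xy, dx2, dy2, dyaw2):
    """Apply the initial offset pose (dx'', dy'', dyaw'') to a second
    coordinate (vehicle frame) to obtain the third coordinate (laser frame):
    shift, then rotate the shifted point counterclockwise by dyaw''."""
    shifted = second_xy + np.array([dx2, dy2])
    c, s = np.cos(dyaw2), np.sin(dyaw2)
    return np.array([[c, -s], [s, c]]) @ shifted   # CCW rotation
```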
- the terminal determines, according to the first coordinate and the third coordinate of the feature element, an offset pose of the point cloud data of each frame.
- the terminal first determines, through step 207, the offset pose corresponding to each frame of point cloud data, so that an offset pose reflecting the general law can subsequently be determined from the offset poses of all frames.
- this step can be implemented through the following steps 2071-2072.
- the terminal calculates, according to the first coordinate and the third coordinate of the feature element, a first distance between each first point element and its adjacent second point element, and a second distance between each first point element and its adjacent line element.
- each feature element in each frame of point cloud data is composed of point elements and line elements, where a first point element is a point element among the feature elements corresponding to the first coordinate, a second point element is a point element among the feature elements corresponding to the third coordinate, and a line element is a linear element among the feature elements corresponding to the third coordinate.
- the distance between a first point element and its adjacent elements can be calculated by either of the following methods.
- the first method calculates the first distance between each first point element and a second point element in the feature elements of each frame of point cloud data, as a reference distance for the subsequent matching of the first coordinate and the third coordinate.
- in this step, the terminal calculates the first distance between each first point element and its adjacent second point element according to the position coordinates of the first point element in the laser coordinate system and the position coordinates of that second point element in the laser coordinate system.
- the second point element adjacent to a first point element is, among the plurality of second point elements, the one closest to that first point element.
- as shown in FIG. 4, point C is a first point element and point D is the second point element adjacent to point C; the terminal can calculate the first distance between point C and point D.
- the second method calculates the second distance between each first point element and a line element in the feature elements of each frame of point cloud data, as a reference distance for matching the first coordinate and the third coordinate.
- the second distance between a first point element and its adjacent line element is the normal (perpendicular) distance from the first point element to the line element. In this step, therefore, the terminal calculates the normal distance between each first point element and its adjacent line element according to the position coordinates of the first point element in the laser coordinate system and the position coordinates of that line element in the laser coordinate system, and uses the normal distance as the second distance.
- the line element adjacent to a first point element is, among the plurality of line elements, the one closest to that first point element.
- as shown in FIG. 5, point C is a first point element and line L is the line element adjacent to point C; the terminal can calculate the normal distance between point C and line L to obtain the second distance.
- the terminal thus determines a plurality of first distances from the position coordinates of the plurality of first point elements and of the plurality of second point elements, and a plurality of second distances from the position coordinates of the plurality of first point elements and of the plurality of line elements.
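- a minimal sketch of the two distance computations (assuming a line element is represented by two points, which the application does not specify):

```python
import numpy as np

def first_distance(p, second_points):
    """First distance: from first point element p to the nearest second
    point element. second_points: (M, 2) array. Returns (distance, index)."""
    d = np.linalg.norm(second_points - p, axis=1)
    j = int(np.argmin(d))
    return d[j], j

def second_distance(p, a, b):
    """Second distance: normal (perpendicular) distance from p to the line
    element through points a and b."""
    ab = b - a
    n = np.array([-ab[1], ab[0]]) / np.linalg.norm(ab)   # unit normal
    return abs(np.dot(p - a, n))
```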
- the terminal determines, according to the first distance and the second distance, an offset pose of the point cloud data of each frame.
- the terminal may determine the offset pose of each frame of point cloud data by iteratively matching the first coordinate and the third coordinate of the feature element (a code sketch of this iteration follows the description of steps a-g below).
- the process includes the following steps a-g:
- Step a: For each frame of point cloud data, the terminal selects, according to the first distances and second distances, the first point elements whose first distance is smaller than a first preset threshold together with their corresponding second point elements, and the first point elements whose second distance is smaller than the first preset threshold together with their corresponding line elements.
- the second point element corresponding to the first point element is a second point element adjacent to the first point element when the terminal calculates the first distance.
- the linear element corresponding to the first point element is a linear element adjacent to the first point element when the terminal calculates the second distance.
- Step b: According to the selected first point elements and second point elements, and the selected first point elements and line elements, the terminal determines, based on a mean-square-error expression between the first coordinate and the third coordinate, the offset matrix that minimizes the mean square error, and uses that offset matrix as the intermediate offset matrix between the first coordinate and the third coordinate.
- Step c: The terminal updates the initial offset matrix of the frame of point cloud data according to the intermediate offset matrix between the first coordinate and the third coordinate, and multiplies the updated initial offset matrix by the second coordinate to obtain a fourth coordinate, thus completing the first iteration of matching.
- the step of updating the initial offset matrix of the frame of point cloud data according to the intermediate offset matrix may be: the terminal multiplies the intermediate offset matrix between the first coordinate and the third coordinate by the initial offset matrix of the frame of point cloud data to obtain the updated initial offset matrix.
- step c is effectively the process of converting the second coordinate in the vehicle coordinate system into the laser coordinate system again; the implementation is the same as in step 206 and is not repeated here.
- Step d: The terminal calculates, according to the first coordinate and the fourth coordinate of the feature element, a third distance between each first point element and its adjacent second point element, and a fourth distance between each first point element and its adjacent line element.
- step d is effectively the process of recalculating the first distance and the second distance according to the first coordinate and the fourth coordinate (the coordinates converted into the laser coordinate system again); the implementation is consistent with step 2071 and is not repeated here.
- Step e: Determine the updated initial offset matrix again through the implementation of steps a-c, thus completing the second iteration of matching.
- Step f: Perform multiple iterations of matching through the implementation of steps a-e.
- when the minimum mean square error corresponding to the intermediate offset matrix is smaller than a second preset threshold, the initial offset matrix updated according to that intermediate offset matrix is taken as the offset matrix of the frame of point cloud data.
- otherwise, when the number of iterations reaches a third preset threshold, the initial offset matrix updated in the last iteration of matching is taken as the offset matrix of the frame of point cloud data.
- Step g: The terminal determines the offset pose of the frame of point cloud data according to the offset matrix of that frame.
- the step b may specifically be: according to the selected first point elements and their corresponding second point elements, and the selected first point elements and their corresponding line elements, the terminal determines, through the following formula 1 (a mean-square-error expression), the offset matrix that minimizes the mean square error, and uses it as the intermediate offset matrix between the first coordinate and the third coordinate:
- Formula 1: E(X, Y) = (1/m) · Σᵢ₌₁ᵐ ‖xᵢ − M·yᵢ‖²
- where X is the first coordinate of the feature element, Y is the third coordinate of the feature element, E(X, Y) is the mean square error between the first coordinate and the third coordinate of the feature element, xᵢ is the i-th first point element whose first distance or second distance is not greater than the first preset threshold, yᵢ is the second point element or line element corresponding to the i-th first point element, m is the number of first point elements whose first distance or second distance is not greater than the first preset threshold, and M is the intermediate offset matrix between the first coordinate and the third coordinate.
- the intermediate offset matrix between the first coordinate and the third coordinate may be represented by M.
- the intermediate offset matrix includes a numerical value (dx', dy') of the offset position between the first coordinate and the third coordinate and a numerical value dyaw' of the yaw angle.
- the first preset threshold, the second preset threshold, and the third preset threshold may be set and changed according to the user's needs, which is not specifically limited in this embodiment of the present application.
- the first preset threshold may be 1 meter, 0.5 meter, or the like.
- the second preset threshold may be 0.1, 0.3, or the like.
- the third preset threshold may be 20, 100, or the like.
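- the iterative matching of steps a-g described above amounts to an iterative-closest-point style alignment; the following sketch covers only the point-to-point part (the point-to-line terms and the reflection guard in the SVD step are omitted), with all thresholds and names illustrative.

```python
import numpy as np

def iterative_match(first, third, thresh=1.0, mse_eps=0.1, max_iters=20):
    """Point-to-point sketch of steps a-g. first, third: (N, 2) and (M, 2)
    arrays of first point elements and second point elements (third
    coordinates). Returns the accumulated 3x3 homogeneous offset matrix."""
    M_total = np.eye(3)
    pts = third.copy()
    for _ in range(max_iters):
        # steps a/d: pair each first point with its nearest counterpart
        d = np.linalg.norm(first[:, None, :] - pts[None, :, :], axis=2)
        j = d.argmin(axis=1)
        keep = d[np.arange(len(first)), j] < thresh     # first preset threshold
        x, y = first[keep], pts[j[keep]]
        # step b: offset matrix minimizing the mean square error, via SVD
        mx, my = x.mean(axis=0), y.mean(axis=0)
        U, _, Vt = np.linalg.svd((y - my).T @ (x - mx))
        R = Vt.T @ U.T                                  # reflection guard omitted
        t = mx - R @ my
        Mi = np.eye(3); Mi[:2, :2] = R; Mi[:2, 2] = t   # intermediate matrix
        M_total = Mi @ M_total                          # step c: update
        pts = pts @ R.T + t                             # re-projected coordinates
        mse = np.mean(np.sum((x - (y @ R.T + t)) ** 2, axis=1))
        if mse < mse_eps:                               # step f: second threshold
            break
    return M_total
```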
- the foregoing steps are one specific way for the terminal to determine the offset pose of each frame of point cloud data according to the first coordinate and the second coordinate of the feature element.
- this implementation converts the second coordinate in the vehicle coordinate system into the laser coordinate system and determines the offset pose of each frame of point cloud data according to the first coordinate and the converted third coordinate.
- in actual operation, the terminal may instead convert the first coordinate in the laser coordinate system into the vehicle coordinate system to obtain a converted fourth coordinate, and determine the offset pose of each frame of point cloud data according to the second coordinate and the converted fourth coordinate; this is not specifically limited in this embodiment of the present application.
- the terminal establishes an observation equation between the offset poses of the at least two frames of point cloud data and the offset position, the yaw angle, and the system deviation; for each frame of point cloud data, the terminal acquires the heading angle of the vehicle corresponding to that frame.
- the laser external parameter of the laser scanning device includes an offset position and a yaw angle between the vehicle coordinate system and the laser coordinate system.
- as described in steps 203-204, owing to the system deviation in the map data, the second coordinate of the feature element deviates from the actual coordinate of the feature element in the vehicle coordinate system, and the influence of the system deviation on the second coordinate was considered when determining the offset pose of each frame of point cloud data. Therefore, in this step, the terminal must also account for the influence of the system deviation when establishing the observation equation.
- the terminal establishes the observation equation according to the offset poses of the at least two frames of point cloud data, the offset position, the yaw angle, and the system deviation, where the system deviation is (x₀, y₀), the offset position is (dx, dy), the yaw angle is dyaw, (dx′ᵢ, dy′ᵢ) is the value of the offset position of the i-th frame of point cloud data in the at least two frames, dyaw′ᵢ is the value of the yaw angle of the i-th frame of point cloud data, and yawᵢ is the heading angle of the vehicle corresponding to the i-th frame of point cloud data.
- the system deviation can be decomposed into a projection in the x-axis direction and a projection in the y-axis direction. Since the system deviation is an error in the map data, in actual operation it is converted from the station-center coordinate system into the vehicle coordinate system.
- the station-center coordinate system and the vehicle coordinate system both use the vehicle as the coordinate origin and differ in the positive directions of the x- and y-axes; the angle between the positive y-axis of the station-center coordinate system and the positive x-axis of the vehicle coordinate system is equal to the heading angle of the vehicle.
- the terminal therefore needs to obtain the heading angle of the vehicle corresponding to each frame of point cloud data. The process may be: when the terminal acquires each frame of point cloud data, it obtains, through the IMU in the navigation system, the heading angle of the vehicle corresponding to that frame.
- the terminal calculates the value of the offset position and the value of the yaw angle in the observation equation according to the heading angle and the offset pose of each frame of point cloud data.
- the terminal may substitute the offset poses of the at least two frames of point cloud data into the observation equation, thereby calculating the value of the offset position, the value of the yaw angle, and the value of the system deviation in the observation equation.
- in order to obtain relatively robust values of the laser external parameters, the terminal may acquire the offset poses of n frames of point cloud data (n being a positive integer greater than 2) and the heading angle of the vehicle corresponding to each of those frames, substitute the offset pose of each frame and the corresponding heading angle into the observation equation, and calculate the value of the offset position, the value of the yaw angle, and the value of the system deviation in the observation equation using the least squares method. Because the offset poses of n frames of point cloud data are used, the interference of random noise that may exist in any single frame is reduced, which lowers the error and makes the determined values of the laser external parameters more accurate.
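- the observation equation itself did not survive extraction; the sketch below assumes one plausible structure in which the system deviation is rotated by the heading angle into each frame, and solves the stacked system by least squares. The assumed equations are stated in the docstring and are not taken verbatim from the application.

```python
import numpy as np

def solve_extrinsics(offsets, yaws):
    """Least-squares solve for (dx, dy, dyaw, x0, y0) from per-frame offset
    poses. offsets: (n, 3) array of rows (dx'_i, dy'_i, dyaw'_i); yaws: (n,)
    heading angles. Assumed per-frame model (NOT verbatim from the patent):
        dx'_i   = dx  + sin(yaw_i) * x0 + cos(yaw_i) * y0
        dy'_i   = dy  - cos(yaw_i) * x0 + sin(yaw_i) * y0
        dyaw'_i = dyaw"""
    n = len(yaws)
    A = np.zeros((3 * n, 5))              # unknowns: dx, dy, dyaw, x0, y0
    b = np.asarray(offsets).reshape(-1)
    s, c = np.sin(yaws), np.cos(yaws)
    A[0::3, 0] = 1.0; A[0::3, 3] = s;  A[0::3, 4] = c    # dx'_i rows
    A[1::3, 1] = 1.0; A[1::3, 3] = -c; A[1::3, 4] = s    # dy'_i rows
    A[2::3, 2] = 1.0                                     # dyaw'_i rows
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return dict(zip(["dx", "dy", "dyaw", "x0", "y0"], sol))
```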
- the foregoing is one specific way for the terminal to calculate the values of the laser external parameters of the laser scanning device according to the offset poses of the at least two frames of point cloud data, so as to calibrate the laser scanning device.
- the above specific implementation manner may also be replaced by other implementation manners.
- the specific implementation described above determines the values of the laser external parameters by establishing an observation equation between the offset pose and the offset position, the yaw angle, and the system deviation.
- in actual operation, the terminal may also establish and store the observation equation in advance, or write and store program instructions equivalent to the observation equation in advance; the terminal then either obtains the observation equation directly to determine the values of the laser external parameters, or obtains the program instructions directly and executes them to determine the values of the laser external parameters.
- after the values of the laser external parameters are determined, the laser scanning device in the vehicle is calibrated with them, and the navigation system is calibrated with the determined value of the system deviation, so that the vehicle can combine the point cloud data provided by the calibrated laser scanning device with the map data provided by the calibrated navigation system to improve driving safety.
- in the embodiment of the present application, the terminal acquires, from at least two frames of point cloud data obtained by scanning the target area with the laser scanning device, the first coordinate of the feature element in each frame of point cloud data, the first coordinate being the coordinate of the feature element in the laser coordinate system; and the terminal directly determines, based on the map data of the target area, the second coordinate of the feature element in the vehicle coordinate system for each frame of point cloud data. Subsequent processing is performed on the first coordinate and the second coordinate, omitting the manual establishment of a calibration field and the manual measurement, which improves the efficiency of determining the first and second coordinates and thus the efficiency of calibrating the laser scanning device.
- the terminal then determines the offset pose of each frame of point cloud data according to the first coordinate and the second coordinate of the feature element, and calculates the values of the laser external parameters of the laser scanning device according to the offset poses of the at least two frames of point cloud data, so as to calibrate the laser scanning device in the vehicle. Since the terminal calculates the values of the laser external parameters from multi-frame point cloud data, the interference of random noise in any single frame is reduced, which lowers the error and improves the accuracy of the determined laser external parameters.
- FIG. 6 is a schematic structural diagram of an apparatus for calibrating a laser scanning device according to an embodiment of the present application.
- the apparatus includes: an obtaining module 601, a first determining module 602, a second determining module 603, and a calculating module 604.
- the acquiring module 601 is configured to acquire, from at least two frames of point cloud data obtained by scanning the target area with the laser scanning device, a first coordinate of the feature element in each frame of point cloud data, where the first coordinate is the coordinate of the feature element in the laser coordinate system;
- the first determining module 602 is configured to determine, according to the map data of the target area, for each frame of point cloud data, a second coordinate of the feature element in the vehicle coordinate system;
- the second determining module 603 is configured to determine, for each frame of point cloud data, the offset pose of that frame according to the first coordinate and the second coordinate of the feature element;
- the calculating module 604 is configured to calculate the values of the laser external parameters of the laser scanning device according to the offset poses of the at least two frames of point cloud data, so as to calibrate the laser scanning device.
- the obtaining module 601 includes:
- a scanning unit configured to scan the target area based on a preset scan route by using a laser scanning device, to obtain the at least two frames of point cloud data, where the target area is any area including the feature element;
- an extracting unit configured to extract, for each frame of point cloud data, a first coordinate of the feature element in the laser coordinate system.
- the first determining module 602 includes:
- a first acquiring unit configured to acquire map data of the target area from a navigation system of the vehicle, where the map data includes latitude and longitude coordinates and elevation coordinates of the feature element in a map coordinate system;
- a first determining unit configured to determine, for each frame of point cloud data, the second coordinate of the feature element in the vehicle coordinate system according to the map data of the target area.
- the second determining module 603 includes:
- a second acquiring unit configured to acquire an initial offset pose between the vehicle coordinate system and the laser coordinate system;
- a second determining unit configured to determine, according to the initial offset pose and the second coordinate of the feature element, the third coordinate of the feature element, where the third coordinate is the feature The coordinates of the feature in the laser coordinate system;
- a third determining unit configured to determine an offset pose of the point cloud data of each frame according to the first coordinate and the third coordinate of the feature element.
- The third determining unit includes:
- a calculating subunit configured to calculate, according to the first coordinate and the third coordinate of the feature element, a first distance between each first point element and an adjacent second point element, and a second distance between each first point element and an adjacent line element, where the first point element is a point element among the feature elements corresponding to the first coordinate, the second point element is a point element among the feature elements corresponding to the third coordinate, and the line element is a line element among the feature elements corresponding to the third coordinate;
- a determining subunit configured to determine the offset pose of each frame of point cloud data according to the first distance and the second distance.
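A hedged sketch of the two distance terms computed by the calculating subunit, assuming nearest-neighbour association (the application does not state how point correspondences are chosen) and planar coordinates:

```python
import numpy as np

def point_to_segment_dist(p, a, b):
    """Distance from point p to line segment (a, b); all 2-D arrays."""
    ab, ap = b - a, p - a
    t = np.clip(np.dot(ap, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def frame_residuals(first_pts, third_pts, third_lines):
    """Sketch: first distances (each first point element to its nearest
    projected point element) and second distances (each first point element
    to its nearest projected line element). Nearest-neighbour association
    is an assumption, not stated in the application.

    first_pts:   iterable of (2,) first coordinates
    third_pts:   iterable of (2,) third coordinates of point elements
    third_lines: iterable of (a, b) endpoint pairs of line elements
    """
    res = []
    for p in first_pts:
        d1 = min(np.linalg.norm(p - q) for q in third_pts)
        d2 = min(point_to_segment_dist(p, a, b) for a, b in third_lines)
        res.extend([d1, d2])
    return np.array(res)
```

Minimising the stacked residuals over a small pose perturbation, for example with `scipy.optimize.least_squares`, would then give the offset pose of the frame.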
- The laser external parameter of the laser scanning device includes an offset position and a yaw angle between the vehicle coordinate system and the laser coordinate system.
- The calculating module 604 includes:
- an establishing unit configured to establish an observation equation relating the offset poses of the at least two frames of point cloud data to the offset position, the yaw angle, and a system deviation, where the system deviation is the deviation between the navigation system of the vehicle and the map data;
- a third acquiring unit configured to acquire, for each frame of point cloud data, the heading angle of the vehicle corresponding to that frame;
- a calculating unit configured to calculate the value of the offset position and the value of the yaw angle in the observation equation according to the heading angle and the offset pose of each frame of point cloud data.
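The application states that an observation equation is established but does not give its explicit form. One plausible linear model, offered purely as an assumption, is that each frame's observed planar offset combines the fixed laser offset position with a map/navigation system deviation rotated by that frame's heading angle, while the per-frame yaw offsets scatter around a constant yaw angle. Stacking all frames then yields an overdetermined system solvable by ordinary least squares:

```python
import numpy as np

def solve_extrinsics(offsets, yaws, headings):
    """Least-squares solve of an ASSUMED observation model:
        t_i = p + R(theta_i) @ b,    yaw_i = yaw + noise
    where p is the laser offset position, b the system deviation, and
    theta_i the vehicle heading of frame i. This concrete form is an
    illustrative guess; the application only states that such an
    observation equation is established.

    offsets:  (K, 2) per-frame observed offset positions
    yaws:     (K,)   per-frame observed yaw offsets
    headings: (K,)   per-frame vehicle heading angles (radians)
    """
    k = len(offsets)
    a = np.zeros((2 * k, 4))          # unknowns: [px, py, bx, by]
    y = offsets.reshape(-1)
    for i, th in enumerate(headings):
        c, s = np.cos(th), np.sin(th)
        a[2 * i]     = [1.0, 0.0,  c, -s]
        a[2 * i + 1] = [0.0, 1.0,  s,  c]
    x, *_ = np.linalg.lstsq(a, y, rcond=None)
    p, b = x[:2], x[2:]
    yaw = np.mean(yaws)               # averaging suppresses per-frame noise
    return p, yaw, b
```

Averaging over frames, as done for the yaw here, is one simple way in which the multi-frame data suppresses the per-frame random noise discussed below.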
- The terminal acquires, based on the at least two frames of point cloud data obtained by scanning the target area with the laser scanning device, the first coordinate of the feature element in each frame of point cloud data, where the first coordinate is the coordinate of the feature element in the laser coordinate system; the terminal then directly determines, based on the map data of the target area, the second coordinate of the feature element in each frame of point cloud data in the vehicle coordinate system.
- Subsequent processing is performed on the first coordinate and the second coordinate, omitting the manual establishment of a calibration field and the manual measurement; this improves the efficiency of determining the first coordinate and the second coordinate, and thereby the efficiency of calibrating the laser scanning device.
- The terminal determines the offset pose of each frame of point cloud data according to the first coordinate and the second coordinate of the feature element, and then calculates the value of the laser external parameter of the laser scanning device based on the offset poses of the at least two frames of point cloud data, thereby calibrating the laser scanning device in the vehicle. Because the terminal calculates the value of the laser external parameter from multi-frame point cloud data, the interference of random noise in any single frame of point cloud data is reduced, which lowers the error and improves the accuracy of determining the laser external parameter.
- The apparatus for calibrating a laser scanning device provided by the above embodiment is described using the division into the functional modules above as an example. In practical applications, the functions may be allocated to different functional modules as required; that is, the internal structure of the terminal may be divided into different functional modules to complete all or part of the functions described above.
- The apparatus for calibrating a laser scanning device provided by the above embodiment belongs to the same concept as the method for calibrating a laser scanning device. For the specific implementation process, refer to the method embodiment; details are not described herein again.
- FIG. 7 is a schematic structural diagram of a computer device 700 according to an embodiment of the present application.
- The computer device 700 includes a processor and a memory, and may further include a communication interface, a communication bus, an input/output interface, and a display device. The processor, the memory, the input/output interface, the display device, and the communication interface communicate with one another through the communication bus.
- The memory includes a non-volatile storage medium and an internal memory.
- The non-volatile storage medium of the computer device stores an operating system and may also store computer-readable instructions that, when executed by the processor, cause the processor to implement the method for calibrating a laser scanning device.
- The internal memory may also store computer-readable instructions that, when executed by the processor, cause the processor to perform the method for calibrating a laser scanning device.
- The communication bus is a circuit that connects the described elements and implements transmission among them.
- The processor receives commands from the other elements through the communication bus, decrypts the received commands, and performs calculation or data processing according to the decrypted commands.
- The memory may include program modules, such as a kernel, middleware, an application programming interface (API), and an application.
- A program module may be composed of software, firmware, or hardware, or at least two of them.
- The input/output interface forwards commands or data entered by the user through input/output devices (for example, sensors, keyboards, and touch screens).
- The display device displays various kinds of information to the user.
- The communication interface connects the computer device 700 with other network devices, user devices, and networks.
- The communication interface may be connected to the network in a wired or wireless manner, so as to connect to other external network devices or user devices.
- The wireless communication may include at least one of the following: Wireless Fidelity (WiFi), Bluetooth (BT), Near Field Communication (NFC), and Global Positioning System (GPS).
- The wired communication may include at least one of the following: Universal Serial Bus (USB), High Definition Multimedia Interface (HDMI), Recommended Standard 232 (RS-232), and Plain Old Telephone Service (POTS).
- The network may be a telecommunication network or a communication network.
- The communication network may be a computer network, the Internet, the Internet of Things, or a telephone network.
- The computer device 700 may be connected to the network through the communication interface, and the protocol by which the computer device 700 communicates with other network devices may be supported by at least one of an application, an application programming interface (API), middleware, a kernel, and the communication interface.
- A computer-readable storage medium is provided, storing a computer program, for example, a memory storing computer-readable instructions that, when executed by a processor, implement the method for calibrating a laser scanning device of the above embodiments.
- The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, or an optical data storage device.
- A person skilled in the art may understand that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing related hardware; the program may be stored in a computer-readable storage medium.
- The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Manufacturing & Machinery (AREA)
- Automation & Control Theory (AREA)
- Traffic Control Systems (AREA)
- Optical Radar Systems And Details Thereof (AREA)
- Navigation (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Description
Claims (20)
- A method for calibrating a laser scanning device, applied to a computer device, the method comprising: acquiring, based on at least two frames of point cloud data obtained by scanning a target area with the laser scanning device, a first coordinate of a feature element in each frame of point cloud data, the first coordinate being the coordinate of the feature element in a laser coordinate system; determining, based on map data of the target area, a second coordinate of the feature element in each frame of point cloud data in a vehicle coordinate system; for each frame of point cloud data, determining an offset pose of the frame of point cloud data according to the first coordinate and the second coordinate of the feature element; and calculating a value of a laser external parameter of the laser scanning device according to the offset poses of the at least two frames of point cloud data, to calibrate the laser scanning device.
- The method according to claim 1, wherein acquiring, based on the at least two frames of point cloud data obtained by scanning the target area with the laser scanning device, the first coordinate of the feature element in each frame of point cloud data comprises: scanning the target area along a preset scan route with the laser scanning device to obtain the at least two frames of point cloud data, the target area being any area that includes the feature element; and extracting, for each frame of point cloud data, the first coordinate of the feature element in the laser coordinate system.
- The method according to claim 1, wherein determining, based on the map data of the target area, the second coordinate of the feature element in each frame of point cloud data in the vehicle coordinate system comprises: acquiring the map data of the target area from a navigation system, the map data including latitude and longitude coordinates and elevation coordinates of the feature element in a map coordinate system; and determining, for each frame of point cloud data, the second coordinate of the feature element in the vehicle coordinate system according to the map data of the target area.
- The method according to claim 3, wherein determining, for each frame of point cloud data, the second coordinate of the feature element in the vehicle coordinate system according to the map data of the target area comprises: converting the latitude and longitude coordinates and the elevation coordinates of the feature element in the map coordinate system into position coordinates in a geocentric coordinate system; converting the position coordinates of the feature element in the geocentric coordinate system into position coordinates in a station-center coordinate system; and converting, according to an acquired heading angle of the vehicle, the position coordinates of the feature element in the station-center coordinate system into the second coordinate in the vehicle coordinate system.
- The method according to claim 1, wherein determining, for each frame of point cloud data, the offset pose of the frame of point cloud data according to the first coordinate and the second coordinate of the feature element comprises: acquiring an initial offset pose between the vehicle coordinate system and the laser coordinate system; for each frame of point cloud data, determining a third coordinate of the feature element according to the initial offset pose and the second coordinate of the feature element, the third coordinate being the coordinate of the feature element in the laser coordinate system; and determining the offset pose of each frame of point cloud data according to the first coordinate and the third coordinate of the feature element.
- The method according to claim 5, wherein determining, for each frame of point cloud data, the third coordinate of the feature element according to the initial offset pose and the second coordinate of the feature element comprises: for each frame of point cloud data, offsetting the position of the second coordinate of the feature element according to a value of an initial offset position in the initial offset pose, and offsetting the angle of the position-offset second coordinate according to a value of an initial yaw angle in the initial offset pose; and taking the position coordinate after the position offset and the angle offset as the third coordinate of the feature element.
- The method according to claim 5, wherein determining the offset pose of each frame of point cloud data according to the first coordinate and the third coordinate of the feature element comprises: calculating, according to the first coordinate and the third coordinate of the feature element, a first distance between each first point element and an adjacent second point element, and a second distance between each first point element and an adjacent line element, the first point element being a point element among the feature elements corresponding to the first coordinate, the second point element being a point element among the feature elements corresponding to the third coordinate, and the line element being a line element among the feature elements corresponding to the third coordinate; and determining the offset pose of each frame of point cloud data according to the first distance and the second distance.
- The method according to claim 1, wherein the laser external parameter of the laser scanning device includes an offset position and a yaw angle between the vehicle coordinate system and the laser coordinate system, and calculating the value of the laser external parameter of the laser scanning device according to the offset poses of the at least two frames of point cloud data comprises: establishing an observation equation relating the offset poses of the at least two frames of point cloud data to the offset position, the yaw angle, and a system deviation, the system deviation being a systematic error in the map data; acquiring, for each frame of point cloud data, a heading angle of the vehicle corresponding to the frame of point cloud data; and calculating the value of the offset position and the value of the yaw angle in the observation equation according to the heading angle and the offset pose of each frame of point cloud data.
- A computer device, comprising a memory and a processor, the memory storing computer-readable instructions that, when executed by the processor, cause the processor to perform the following steps: acquiring, based on at least two frames of point cloud data obtained by scanning a target area with a laser scanning device, a first coordinate of a feature element in each frame of point cloud data, the first coordinate being the coordinate of the feature element in a laser coordinate system; determining, based on map data of the target area, a second coordinate of the feature element in each frame of point cloud data in a vehicle coordinate system; for each frame of point cloud data, determining an offset pose of the frame of point cloud data according to the first coordinate and the second coordinate of the feature element; and calculating a value of a laser external parameter of the laser scanning device according to the offset poses of the at least two frames of point cloud data, to calibrate the laser scanning device.
- The computer device according to claim 9, wherein the computer-readable instructions, when executed by the processor, cause the processor, when performing the step of acquiring, based on the at least two frames of point cloud data obtained by scanning the target area with the laser scanning device, the first coordinate of the feature element in each frame of point cloud data, to perform the following steps: scanning the target area along a preset scan route with the laser scanning device to obtain the at least two frames of point cloud data, the target area being any area that includes the feature element; and extracting, for each frame of point cloud data, the first coordinate of the feature element in the laser coordinate system.
- The computer device according to claim 9, wherein the computer-readable instructions, when executed by the processor, cause the processor, when performing the step of determining, based on the map data of the target area, the second coordinate of the feature element in each frame of point cloud data in the vehicle coordinate system, to perform the following steps: acquiring the map data of the target area from a navigation system, the map data including latitude and longitude coordinates and elevation coordinates of the feature element in a map coordinate system; and determining, for each frame of point cloud data, the second coordinate of the feature element in the vehicle coordinate system according to the map data of the target area.
- The computer device according to claim 9, wherein the computer-readable instructions, when executed by the processor, cause the processor, when performing the step of determining, for each frame of point cloud data, the offset pose of the frame of point cloud data according to the first coordinate and the second coordinate of the feature element, to perform the following steps: acquiring an initial offset pose between the vehicle coordinate system and the laser coordinate system; for each frame of point cloud data, determining a third coordinate of the feature element according to the initial offset pose and the second coordinate of the feature element, the third coordinate being the coordinate of the feature element in the laser coordinate system; and determining the offset pose of each frame of point cloud data according to the first coordinate and the third coordinate of the feature element.
- The computer device according to claim 12, wherein the computer-readable instructions, when executed by the processor, cause the processor, when performing the step of determining the offset pose of each frame of point cloud data according to the first coordinate and the third coordinate of the feature element, to perform the following steps: calculating, according to the first coordinate and the third coordinate of the feature element, a first distance between each first point element and an adjacent second point element, and a second distance between each first point element and an adjacent line element, the first point element being a point element among the feature elements corresponding to the first coordinate, the second point element being a point element among the feature elements corresponding to the third coordinate, and the line element being a line element among the feature elements corresponding to the third coordinate; and determining the offset pose of each frame of point cloud data according to the first distance and the second distance.
- The computer device according to claim 11, wherein the laser external parameter of the laser scanning device includes an offset position and a yaw angle between the vehicle coordinate system and the laser coordinate system, and the computer-readable instructions, when executed by the processor, cause the processor, when performing the step of calculating the value of the laser external parameter of the laser scanning device according to the offset poses of the at least two frames of point cloud data, to perform the following steps: establishing an observation equation relating the offset poses of the at least two frames of point cloud data to the offset position, the yaw angle, and a system deviation, the system deviation being a systematic error in the map data; acquiring, for each frame of point cloud data, a heading angle of the vehicle corresponding to the frame of point cloud data; and calculating the value of the offset position and the value of the yaw angle in the observation equation according to the heading angle and the offset pose of each frame of point cloud data.
- A non-volatile computer-readable storage medium storing computer-readable instructions that, when executed by one or more processors, cause the one or more processors to perform the following steps: acquiring, based on at least two frames of point cloud data obtained by scanning a target area with a laser scanning device, a first coordinate of a feature element in each frame of point cloud data, the first coordinate being the coordinate of the feature element in a laser coordinate system; determining, based on map data of the target area, a second coordinate of the feature element in each frame of point cloud data in a vehicle coordinate system; for each frame of point cloud data, determining an offset pose of the frame of point cloud data according to the first coordinate and the second coordinate of the feature element; and calculating a value of a laser external parameter of the laser scanning device according to the offset poses of the at least two frames of point cloud data, to calibrate the laser scanning device.
- The computer-readable storage medium according to claim 15, wherein the computer-readable instructions, when executed by the processor, cause the processor, when performing the step of acquiring, based on the at least two frames of point cloud data obtained by scanning the target area with the laser scanning device, the first coordinate of the feature element in each frame of point cloud data, to perform the following steps: scanning the target area along a preset scan route with the laser scanning device to obtain the at least two frames of point cloud data, the target area being any area that includes the feature element; and extracting, for each frame of point cloud data, the first coordinate of the feature element in the laser coordinate system.
- The computer-readable storage medium according to claim 15, wherein the computer-readable instructions, when executed by the processor, cause the processor, when performing the step of determining, based on the map data of the target area, the second coordinate of the feature element in each frame of point cloud data in the vehicle coordinate system, to perform the following steps: acquiring the map data of the target area from a navigation system, the map data including latitude and longitude coordinates and elevation coordinates of the feature element in a map coordinate system; and determining, for each frame of point cloud data, the second coordinate of the feature element in the vehicle coordinate system according to the map data of the target area.
- The computer-readable storage medium according to claim 15, wherein the computer-readable instructions, when executed by the processor, cause the processor, when performing the step of determining, for each frame of point cloud data, the offset pose of the frame of point cloud data according to the first coordinate and the second coordinate of the feature element, to perform the following steps: acquiring an initial offset pose between the vehicle coordinate system and the laser coordinate system; for each frame of point cloud data, determining a third coordinate of the feature element according to the initial offset pose and the second coordinate of the feature element, the third coordinate being the coordinate of the feature element in the laser coordinate system; and determining the offset pose of each frame of point cloud data according to the first coordinate and the third coordinate of the feature element.
- The computer-readable storage medium according to claim 18, wherein the computer-readable instructions, when executed by the processor, cause the processor, when performing the step of determining the offset pose of each frame of point cloud data according to the first coordinate and the third coordinate of the feature element, to perform the following steps: calculating, according to the first coordinate and the third coordinate of the feature element, a first distance between each first point element and an adjacent second point element, and a second distance between each first point element and an adjacent line element, the first point element being a point element among the feature elements corresponding to the first coordinate, the second point element being a point element among the feature elements corresponding to the third coordinate, and the line element being a line element among the feature elements corresponding to the third coordinate; and determining the offset pose of each frame of point cloud data according to the first distance and the second distance.
- The computer-readable storage medium according to claim 15, wherein the laser external parameter of the laser scanning device includes an offset position and a yaw angle between the vehicle coordinate system and the laser coordinate system, and the computer-readable instructions, when executed by the processor, cause the processor, when performing the step of calculating the value of the laser external parameter of the laser scanning device according to the offset poses of the at least two frames of point cloud data, to perform the following steps: establishing an observation equation relating the offset poses of the at least two frames of point cloud data to the offset position, the yaw angle, and a system deviation, the system deviation being a systematic error in the map data; acquiring, for each frame of point cloud data, a heading angle of the vehicle corresponding to the frame of point cloud data; and calculating the value of the offset position and the value of the yaw angle in the observation equation according to the heading angle and the offset pose of each frame of point cloud data.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020197030956A KR102296723B1 (ko) | 2017-08-23 | 2018-05-17 | 레이저 스캐닝 디바이스 교정 방법, 장치, 디바이스 및 저장 매체 |
EP18847637.8A EP3686557A4 (en) | 2017-08-23 | 2018-05-17 | LASER SCAN DEVICE CALIBRATION METHOD, APPARATUS AND DEVICE, AND STORAGE MEDIA |
JP2020511185A JP6906691B2 (ja) | 2017-08-23 | 2018-05-17 | レーザー走査デバイスの標定方法、装置、デバイス及び記憶媒体 |
US16/383,358 US20190235062A1 (en) | 2017-08-23 | 2019-04-12 | Method, device, and storage medium for laser scanning device calibration |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710731253.X | 2017-08-23 | ||
CN201710731253.XA CN109425365B (zh) | 2017-08-23 | 2017-08-23 | 激光扫描设备标定的方法、装置、设备及存储介质 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/383,358 Continuation US20190235062A1 (en) | 2017-08-23 | 2019-04-12 | Method, device, and storage medium for laser scanning device calibration |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019037484A1 true WO2019037484A1 (zh) | 2019-02-28 |
Family
ID=65439766
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2018/087251 WO2019037484A1 (zh) | 2017-08-23 | 2018-05-17 | 激光扫描设备标定的方法、装置、设备及存储介质 |
Country Status (7)
Country | Link |
---|---|
US (1) | US20190235062A1 (zh) |
EP (1) | EP3686557A4 (zh) |
JP (1) | JP6906691B2 (zh) |
KR (1) | KR102296723B1 (zh) |
CN (1) | CN109425365B (zh) |
MA (1) | MA50182A (zh) |
WO (1) | WO2019037484A1 (zh) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111402328A (zh) * | 2020-03-17 | 2020-07-10 | 北京图森智途科技有限公司 | 一种基于激光里程计的位姿计算方法及装置 |
CN111784836A (zh) * | 2020-06-28 | 2020-10-16 | 北京百度网讯科技有限公司 | 高精地图生成方法、装置、设备及可读存储介质 |
CN111986472A (zh) * | 2019-05-22 | 2020-11-24 | 阿里巴巴集团控股有限公司 | 车辆速度确定方法及车辆 |
CN112100900A (zh) * | 2020-06-30 | 2020-12-18 | 北京控制工程研究所 | 一种空间非合作目标点云初始姿态测量方法 |
CN112164138A (zh) * | 2020-10-30 | 2021-01-01 | 上海商汤临港智能科技有限公司 | 一种点云数据筛选方法及装置 |
CN112596063A (zh) * | 2020-11-27 | 2021-04-02 | 北京迈格威科技有限公司 | 点云描述子构建方法及装置,闭环检测方法及装置 |
CN112639882A (zh) * | 2019-09-12 | 2021-04-09 | 华为技术有限公司 | 定位方法、装置及系统 |
CN112684432A (zh) * | 2019-10-18 | 2021-04-20 | 北京万集科技股份有限公司 | 激光雷达标定方法、装置、设备及存储介质 |
CN113034685A (zh) * | 2021-03-18 | 2021-06-25 | 北京百度网讯科技有限公司 | 激光点云与高精地图的叠加方法、装置及电子设备 |
CN113238202A (zh) * | 2021-06-08 | 2021-08-10 | 上海海洋大学 | 光子激光三维成像系统的坐标系点云计算方法及其应用 |
CN113721227A (zh) * | 2021-08-06 | 2021-11-30 | 上海有个机器人有限公司 | 一种激光器的偏移角度计算方法 |
CN113739774A (zh) * | 2021-09-14 | 2021-12-03 | 煤炭科学研究总院 | 基于移动激光与标靶协作的掘进机位姿纠偏方法 |
CN113984072A (zh) * | 2021-10-28 | 2022-01-28 | 阿波罗智能技术(北京)有限公司 | 车辆定位方法、装置、设备、存储介质及自动驾驶车辆 |
CN114353807A (zh) * | 2022-03-21 | 2022-04-15 | 沈阳吕尚科技有限公司 | 一种机器人的定位方法及定位装置 |
CN114581379A (zh) * | 2022-02-14 | 2022-06-03 | 浙江华睿科技股份有限公司 | 一种密封胶的检测方法及装置 |
Families Citing this family (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109946732B (zh) * | 2019-03-18 | 2020-12-01 | 李子月 | 一种基于多传感器数据融合的无人车定位方法 |
CN110298103A (zh) * | 2019-06-25 | 2019-10-01 | 中国电建集团成都勘测设计研究院有限公司 | 基于无人机机载三维激光扫描仪的高陡危岩体调查方法 |
CN112212871B (zh) * | 2019-07-10 | 2024-07-19 | 浙江未来精灵人工智能科技有限公司 | 一种数据处理方法、装置及机器人 |
CN112241016B (zh) * | 2019-07-19 | 2024-07-19 | 北京初速度科技有限公司 | 一种泊车地图地理坐标的确定方法和装置 |
CN110780325B (zh) * | 2019-08-23 | 2022-07-19 | 腾讯科技(深圳)有限公司 | 运动对象的定位方法及装置、电子设备 |
CN110736456B (zh) * | 2019-08-26 | 2023-05-05 | 广东亿嘉和科技有限公司 | 稀疏环境下基于特征提取的二维激光实时定位方法 |
CN112630751B (zh) * | 2019-10-09 | 2024-06-18 | 中车时代电动汽车股份有限公司 | 一种激光雷达的标定方法 |
CN110794392B (zh) * | 2019-10-15 | 2024-03-19 | 上海创昂智能技术有限公司 | 车辆定位方法、装置、车辆及存储介质 |
CN110837080B (zh) * | 2019-10-28 | 2023-09-05 | 武汉海云空间信息技术有限公司 | 激光雷达移动测量系统的快速标定方法 |
CN110888120B (zh) * | 2019-12-03 | 2023-04-07 | 华南农业大学 | 一种基于组合导航系统矫正激光雷达点云数据运动畸变的方法 |
CN111207762B (zh) * | 2019-12-31 | 2021-12-07 | 深圳一清创新科技有限公司 | 地图生成方法、装置、计算机设备和存储介质 |
CN111508021B (zh) * | 2020-03-24 | 2023-08-18 | 广州视源电子科技股份有限公司 | 一种位姿确定方法、装置、存储介质及电子设备 |
CN116930933A (zh) * | 2020-03-27 | 2023-10-24 | 深圳市速腾聚创科技有限公司 | 激光雷达的姿态校正方法和装置 |
CN111949816B (zh) * | 2020-06-22 | 2023-09-26 | 北京百度网讯科技有限公司 | 定位处理方法、装置、电子设备和存储介质 |
CN113866779A (zh) * | 2020-06-30 | 2021-12-31 | 上海商汤智能科技有限公司 | 点云数据的融合方法、装置、电子设备及存储介质 |
CN112068108A (zh) * | 2020-08-11 | 2020-12-11 | 南京航空航天大学 | 一种基于全站仪的激光雷达外部参数标定方法 |
CN112595325B (zh) * | 2020-12-21 | 2024-08-13 | 武汉汉宁轨道交通技术有限公司 | 初始位置确定方法、装置、电子设备和存储介质 |
CN112578356B (zh) * | 2020-12-25 | 2024-05-17 | 上海商汤临港智能科技有限公司 | 一种外参标定方法、装置、计算机设备及存储介质 |
CN112904317B (zh) * | 2021-01-21 | 2023-08-22 | 湖南阿波罗智行科技有限公司 | 一种多激光雷达与gnss_ins系统标定方法 |
CN112509053B (zh) * | 2021-02-07 | 2021-06-04 | 深圳市智绘科技有限公司 | 机器人位姿的获取方法、装置及电子设备 |
JP2022152629A (ja) * | 2021-03-29 | 2022-10-12 | 株式会社トプコン | 測量システム及び点群データ取得方法及び点群データ取得プログラム |
CN113124777B (zh) * | 2021-04-20 | 2023-02-24 | 辽宁因泰立电子信息有限公司 | 车辆尺寸确定方法、装置、系统及存储介质 |
CN113247769B (zh) * | 2021-04-28 | 2023-06-06 | 三一海洋重工有限公司 | 一种集卡定位方法及其定位系统、岸桥 |
CN113237896B (zh) * | 2021-06-08 | 2024-02-20 | 诚丰家具有限公司 | 一种基于光源扫描的家具板材动态监测系统及方法 |
CN113671527B (zh) * | 2021-07-23 | 2024-08-06 | 国电南瑞科技股份有限公司 | 一种提高配网带电作业机器人的精准作业方法及装置 |
CN113362328B (zh) * | 2021-08-10 | 2021-11-09 | 深圳市信润富联数字科技有限公司 | 点云图生成方法、装置、电子设备和存储介质 |
CN113721255B (zh) * | 2021-08-17 | 2023-09-26 | 北京航空航天大学 | 基于激光雷达与视觉融合的列车站台停车点精准检测方法 |
CN113743483B (zh) * | 2021-08-20 | 2022-10-21 | 浙江省测绘科学技术研究院 | 一种基于空间平面偏移分析模型的道路点云误差场景分析方法 |
CN113884278B (zh) * | 2021-09-16 | 2023-10-27 | 杭州海康机器人股份有限公司 | 一种线激光设备的系统标定方法和装置 |
CN113959397B (zh) * | 2021-10-19 | 2023-10-03 | 广东电网有限责任公司 | 一种电力杆塔姿态监测方法、设备及介质 |
CN114018228B (zh) * | 2021-11-04 | 2024-01-23 | 武汉天测测绘科技有限公司 | 一种移动式轨道交通三维数据获取方法及系统 |
CN114399550B (zh) * | 2022-01-18 | 2024-06-07 | 中冶赛迪信息技术(重庆)有限公司 | 一种基于三维激光扫描的汽车鞍座提取方法及系统 |
CN116246020B (zh) * | 2023-03-07 | 2023-09-08 | 武汉理工大学 | 一种多激光点云技术三维重建系统及方法 |
CN117269939B (zh) * | 2023-10-25 | 2024-03-26 | 北京路凯智行科技有限公司 | 用于传感器的参数标定系统、方法及存储介质 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104019829A (zh) * | 2014-06-09 | 2014-09-03 | 武汉克利福昇科技有限责任公司 | 一种基于pos系统的车载全景相机和线阵激光扫描仪的外参标定方法 |
CN104833372A (zh) * | 2015-04-13 | 2015-08-12 | 武汉海达数云技术有限公司 | 一种车载移动测量系统高清全景相机外参数标定方法 |
CN105180811A (zh) * | 2015-09-21 | 2015-12-23 | 武汉海达数云技术有限公司 | 基于同名特征地物的移动测量系统激光扫描仪标定方法 |
CN105203023A (zh) * | 2015-07-10 | 2015-12-30 | 中国人民解放军信息工程大学 | 一种车载三维激光扫描系统安置参数的一站式标定方法 |
US20160070981A1 (en) * | 2014-09-08 | 2016-03-10 | Kabushiki Kaisha Topcon | Operating device, operating system, operating method, and program therefor |
CN106996795A (zh) * | 2016-01-22 | 2017-08-01 | 腾讯科技(深圳)有限公司 | 一种车载激光外参标定方法和装置 |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3395393B2 (ja) * | 1994-08-05 | 2003-04-14 | 日産自動車株式会社 | 車両周囲表示装置 |
JP5069439B2 (ja) * | 2006-09-21 | 2012-11-07 | パナソニック株式会社 | 自己位置認識システム |
JP2011191239A (ja) * | 2010-03-16 | 2011-09-29 | Mazda Motor Corp | 移動体位置検出装置 |
EP2523017A1 (de) * | 2011-05-13 | 2012-11-14 | Hexagon Technology Center GmbH | Kalibrierverfahren für ein Gerät mit Scanfunktionalität |
GB201116961D0 (en) * | 2011-09-30 | 2011-11-16 | Bae Systems Plc | Fast calibration for lidars |
US9043069B1 (en) * | 2012-11-07 | 2015-05-26 | Google Inc. | Methods and systems for scan matching approaches for vehicle heading estimation |
CN105164549B (zh) * | 2013-03-15 | 2019-07-02 | 优步技术公司 | 用于机器人的多传感立体视觉的方法、系统和设备 |
KR102003339B1 (ko) * | 2013-12-06 | 2019-07-25 | 한국전자통신연구원 | 정밀 위치 설정 장치 및 방법 |
EP3129807B1 (de) * | 2014-04-09 | 2018-06-13 | Continental Teves AG & Co. oHG | Positionskorrektur eines fahrzeugs durch referenzierung zu objekten im umfeld |
DE102014211176A1 (de) * | 2014-06-11 | 2015-12-17 | Continental Teves Ag & Co. Ohg | Verfahren und System zur Korrektur von Messdaten und/oder Navigationsdaten eines Sensorbasissystems |
US20150362587A1 (en) * | 2014-06-17 | 2015-12-17 | Microsoft Corporation | Lidar sensor calibration using surface pattern detection |
US9823059B2 (en) * | 2014-08-06 | 2017-11-21 | Hand Held Products, Inc. | Dimensioning system with guided alignment |
JP6442193B2 (ja) * | 2014-08-26 | 2018-12-19 | 株式会社トプコン | 点群位置データ処理装置、点群位置データ処理システム、点群位置データ処理方法およびプログラム |
CN104180793A (zh) * | 2014-08-27 | 2014-12-03 | 北京建筑大学 | 一种用于数字城市建设的移动空间信息获取装置和方法 |
CN104657464B (zh) * | 2015-02-10 | 2018-07-03 | 腾讯科技(深圳)有限公司 | 一种数据处理方法及装置 |
CN106546260B (zh) * | 2015-09-22 | 2019-08-13 | 腾讯科技(深圳)有限公司 | 一种移动测量数据的纠正方法及系统 |
US9916703B2 (en) * | 2015-11-04 | 2018-03-13 | Zoox, Inc. | Calibration for autonomous vehicle operation |
EP3182065A1 (de) * | 2015-12-14 | 2017-06-21 | Leica Geosystems AG | Handhaltbares entfernungsmessgerät und verfahren zum erfassen relativer positionen |
KR102373926B1 (ko) * | 2016-02-05 | 2022-03-14 | 삼성전자주식회사 | 이동체 및 이동체의 위치 인식 방법 |
US10837773B2 (en) * | 2016-12-30 | 2020-11-17 | DeepMap Inc. | Detection of vertical structures based on LiDAR scanner data for high-definition maps for autonomous vehicles |
CN108732582B (zh) * | 2017-04-20 | 2020-07-10 | 百度在线网络技术(北京)有限公司 | 车辆定位方法和装置 |
- 2017
- 2017-08-23 CN CN201710731253.XA patent/CN109425365B/zh active Active
- 2018
- 2018-05-17 MA MA050182A patent/MA50182A/fr unknown
- 2018-05-17 KR KR1020197030956A patent/KR102296723B1/ko active IP Right Grant
- 2018-05-17 JP JP2020511185A patent/JP6906691B2/ja active Active
- 2018-05-17 EP EP18847637.8A patent/EP3686557A4/en active Pending
- 2018-05-17 WO PCT/CN2018/087251 patent/WO2019037484A1/zh unknown
- 2019
- 2019-04-12 US US16/383,358 patent/US20190235062A1/en not_active Abandoned
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104019829A (zh) * | 2014-06-09 | 2014-09-03 | 武汉克利福昇科技有限责任公司 | 一种基于pos系统的车载全景相机和线阵激光扫描仪的外参标定方法 |
US20160070981A1 (en) * | 2014-09-08 | 2016-03-10 | Kabushiki Kaisha Topcon | Operating device, operating system, operating method, and program therefor |
CN104833372A (zh) * | 2015-04-13 | 2015-08-12 | 武汉海达数云技术有限公司 | 一种车载移动测量系统高清全景相机外参数标定方法 |
CN105203023A (zh) * | 2015-07-10 | 2015-12-30 | 中国人民解放军信息工程大学 | 一种车载三维激光扫描系统安置参数的一站式标定方法 |
CN105180811A (zh) * | 2015-09-21 | 2015-12-23 | 武汉海达数云技术有限公司 | 基于同名特征地物的移动测量系统激光扫描仪标定方法 |
CN106996795A (zh) * | 2016-01-22 | 2017-08-01 | 腾讯科技(深圳)有限公司 | 一种车载激光外参标定方法和装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3686557A4 |
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111986472A (zh) * | 2019-05-22 | 2020-11-24 | 阿里巴巴集团控股有限公司 | 车辆速度确定方法及车辆 |
CN111986472B (zh) * | 2019-05-22 | 2023-04-28 | 阿里巴巴集团控股有限公司 | 车辆速度确定方法及车辆 |
CN112639882A (zh) * | 2019-09-12 | 2021-04-09 | 华为技术有限公司 | 定位方法、装置及系统 |
CN112639882B (zh) * | 2019-09-12 | 2021-12-14 | 华为技术有限公司 | 定位方法、装置及系统 |
CN112684432A (zh) * | 2019-10-18 | 2021-04-20 | 北京万集科技股份有限公司 | 激光雷达标定方法、装置、设备及存储介质 |
CN112684432B (zh) * | 2019-10-18 | 2024-04-16 | 武汉万集光电技术有限公司 | 激光雷达标定方法、装置、设备及存储介质 |
CN111402328A (zh) * | 2020-03-17 | 2020-07-10 | 北京图森智途科技有限公司 | 一种基于激光里程计的位姿计算方法及装置 |
CN111402328B (zh) * | 2020-03-17 | 2023-11-10 | 北京图森智途科技有限公司 | 一种基于激光里程计的位姿计算方法及装置 |
CN111784836B (zh) * | 2020-06-28 | 2024-06-04 | 北京百度网讯科技有限公司 | 高精地图生成方法、装置、设备及可读存储介质 |
CN111784836A (zh) * | 2020-06-28 | 2020-10-16 | 北京百度网讯科技有限公司 | 高精地图生成方法、装置、设备及可读存储介质 |
CN112100900B (zh) * | 2020-06-30 | 2024-03-26 | 北京控制工程研究所 | 一种空间非合作目标点云初始姿态测量方法 |
CN112100900A (zh) * | 2020-06-30 | 2020-12-18 | 北京控制工程研究所 | 一种空间非合作目标点云初始姿态测量方法 |
CN112164138A (zh) * | 2020-10-30 | 2021-01-01 | 上海商汤临港智能科技有限公司 | 一种点云数据筛选方法及装置 |
CN112596063A (zh) * | 2020-11-27 | 2021-04-02 | 北京迈格威科技有限公司 | 点云描述子构建方法及装置,闭环检测方法及装置 |
CN112596063B (zh) * | 2020-11-27 | 2024-04-02 | 北京迈格威科技有限公司 | 点云描述子构建方法及装置,闭环检测方法及装置 |
CN113034685A (zh) * | 2021-03-18 | 2021-06-25 | 北京百度网讯科技有限公司 | 激光点云与高精地图的叠加方法、装置及电子设备 |
CN113034685B (zh) * | 2021-03-18 | 2022-12-06 | 北京百度网讯科技有限公司 | 激光点云与高精地图的叠加方法、装置及电子设备 |
CN113238202A (zh) * | 2021-06-08 | 2021-08-10 | 上海海洋大学 | 光子激光三维成像系统的坐标系点云计算方法及其应用 |
CN113238202B (zh) * | 2021-06-08 | 2023-08-15 | 上海海洋大学 | 光子激光三维成像系统的坐标系点云计算方法及其应用 |
CN113721227A (zh) * | 2021-08-06 | 2021-11-30 | 上海有个机器人有限公司 | 一种激光器的偏移角度计算方法 |
CN113739774A (zh) * | 2021-09-14 | 2021-12-03 | 煤炭科学研究总院 | 基于移动激光与标靶协作的掘进机位姿纠偏方法 |
CN113984072A (zh) * | 2021-10-28 | 2022-01-28 | 阿波罗智能技术(北京)有限公司 | 车辆定位方法、装置、设备、存储介质及自动驾驶车辆 |
CN113984072B (zh) * | 2021-10-28 | 2024-05-17 | 阿波罗智能技术(北京)有限公司 | 车辆定位方法、装置、设备、存储介质及自动驾驶车辆 |
CN114581379B (zh) * | 2022-02-14 | 2024-03-22 | 浙江华睿科技股份有限公司 | 一种密封胶的检测方法及装置 |
CN114581379A (zh) * | 2022-02-14 | 2022-06-03 | 浙江华睿科技股份有限公司 | 一种密封胶的检测方法及装置 |
CN114353807B (zh) * | 2022-03-21 | 2022-08-12 | 沈阳吕尚科技有限公司 | 一种机器人的定位方法及定位装置 |
CN114353807A (zh) * | 2022-03-21 | 2022-04-15 | 沈阳吕尚科技有限公司 | 一种机器人的定位方法及定位装置 |
Also Published As
Publication number | Publication date |
---|---|
CN109425365B (zh) | 2022-03-11 |
CN109425365A (zh) | 2019-03-05 |
KR20190129978A (ko) | 2019-11-20 |
JP2020531831A (ja) | 2020-11-05 |
US20190235062A1 (en) | 2019-08-01 |
JP6906691B2 (ja) | 2021-07-21 |
EP3686557A1 (en) | 2020-07-29 |
EP3686557A4 (en) | 2021-08-04 |
MA50182A (fr) | 2020-07-29 |
KR102296723B1 (ko) | 2021-08-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019037484A1 (zh) | 激光扫描设备标定的方法、装置、设备及存储介质 | |
US11802769B2 (en) | Lane line positioning method and apparatus, and storage medium thereof | |
WO2021139590A1 (zh) | 基于蓝牙与slam的室内定位导航装置及其方法 | |
US9378558B2 (en) | Self-position and self-orientation based on externally received position information, sensor data, and markers | |
KR101444685B1 (ko) | 영상기반 멀티센서 데이터를 이용한 차량의 위치자세 결정 방법 및 장치 | |
WO2020146102A1 (en) | Robust lane association by projecting 2-d image into 3-d world using map information | |
CN110160545B (zh) | 一种激光雷达与gps的增强定位系统及方法 | |
KR20200064542A (ko) | 드론을 이용한 지상기준점 측량장치 및 그 방법 | |
CN113295174B (zh) | 一种车道级定位的方法、相关装置、设备以及存储介质 | |
KR102239562B1 (ko) | 항공 관측 데이터와 지상 관측 데이터 간의 융합 시스템 | |
US20160379365A1 (en) | Camera calibration device, camera calibration method, and camera calibration program | |
CN110031880B (zh) | 基于地理位置定位的高精度增强现实方法及设备 | |
US20210208608A1 (en) | Control method, control apparatus, control terminal for unmanned aerial vehicle | |
CN109541570B (zh) | 毫米波扫描设备标定的方法及设备 | |
JP2022130588A (ja) | 自動運転車両の位置合わせ方法、装置、電子機器及び車両 | |
CN114296097A (zh) | 基于GNSS和LiDAR的SLAM导航方法及系统 | |
CN113900517A (zh) | 线路导航方法和装置、电子设备、计算机可读介质 | |
JP2021143861A (ja) | 情報処理装置、情報処理方法及び情報処理システム | |
KR102136924B1 (ko) | 피테스트 카메라 모듈의 상대 위치 산출 정확도를 평가하기 위한 평가 방법 및 시스템 | |
TW201812338A (zh) | 旋翼飛行器的定位方法 | |
US20230266483A1 (en) | Information processing device, information processing method, and program | |
CN113566847B (zh) | 导航校准方法和装置、电子设备、计算机可读介质 | |
JP7125927B2 (ja) | 情報端末装置、方法及びプログラム | |
CN117953007B (zh) | 一种基于图像匹配的线性运动补偿控制方法 | |
KR101282917B1 (ko) | 모바일 프로젝터를 이용한 길안내 방법 및 길안내 장치 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18847637 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20197030956 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2020511185 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2018847637 Country of ref document: EP Effective date: 20200323 |