WO2022110653A1 - Pose determination method and apparatus, electronic device, and computer-readable storage medium - Google Patents
Pose determination method and apparatus, electronic device, and computer-readable storage medium
- Publication number
- WO2022110653A1 (PCT/CN2021/092487)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- line segment
- pose
- radar
- traveling device
- map
Classifications
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/881—Radar or analogous systems specially adapted for robotics
- G01C21/165—Navigation by integrating acceleration or speed, i.e. inertial navigation, combined with non-inertial navigation instruments
- G01C21/28—Navigation specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
Definitions
- the present disclosure relates to the field of computer technology, and in particular, to a method and apparatus for determining a pose, an electronic device, and a computer-readable storage medium.
- radar-based pose estimation technology has been widely used, for example in the automatic driving or assisted driving of smart cars.
- the position and orientation of a smart car can be estimated by radar, which helps the car plan its next driving route reasonably.
- the embodiments of the present disclosure propose a technical solution for pose determination.
- a pose determination method, including:
- determining, according to the estimated pose of the traveling device in the line segment map, the estimated pose of the radar of the traveling device in the line segment map, where things in the line segment map are represented by line segments;
- matching, according to the estimated pose of the radar, the point cloud collected by the radar with the line segments in the line segment map, and determining the matched pose of the radar in the line segment map according to the matching result;
- determining the target pose of the traveling device in the line segment map according to the matched pose of the radar.
- in some embodiments, determining the estimated pose of the radar of the traveling device in the line segment map according to the estimated pose of the traveling device in the line segment map includes:
- using the estimated poses of the traveling device at a plurality of first moments to obtain, through interpolation processing, the estimated pose of the traveling device at a second moment, the second moment including the time at which the radar collects the point cloud;
- obtaining the estimated pose of the radar at the second moment according to the estimated pose of the traveling device at the second moment.
- in some embodiments, before the matching of the point cloud collected by the radar with the line segments in the line segment map, the method further includes: determining a target line segment in the line segment map that satisfies a first preset condition according to the orientation in the estimated pose of the traveling device;
- and the matching of the point cloud collected by the radar with the line segments in the line segment map includes:
- matching the point cloud collected by the radar with the target line segment.
- in some embodiments, the first preset condition includes:
- the absolute value of the included angle between the line segment and the orientation is less than a first threshold;
- the distance between the line segment and a first line segment is less than a distance threshold, where the first line segment is a line segment whose included angle with the orientation has an absolute value less than a second threshold.
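As an illustrative sketch of the angle condition above (the function names and the convention of folding undirected segment angles into [0, π/2] are assumptions, not taken from the patent), such a segment filter might look like:

```python
import math

def segment_angle(a, b):
    """Direction angle of the segment from point a to point b."""
    return math.atan2(b[1] - a[1], b[0] - a[0])

def angle_diff(a, b):
    """Absolute included angle between two undirected directions,
    folded into [0, pi/2]."""
    d = (a - b) % math.pi
    return min(d, math.pi - d)

def filter_target_segments(segments, orientation, first_threshold):
    """Keep segments whose included angle with the device's orientation
    has an absolute value below the first threshold."""
    return [(a, b) for a, b in segments
            if angle_diff(segment_angle(a, b), orientation) < first_threshold]

# A segment along +x passes for a device facing +x; one along +y does not.
kept = filter_target_segments([((0, 0), (1, 0)), ((0, 0), (0, 1))], 0.0, 0.2)
```

The distance-based second condition could be layered on top by first collecting segments that pass a looser second threshold and then keeping any segment that lies within the distance threshold of one of them.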
- in some embodiments, the matching of the point cloud collected by the radar with the line segments in the line segment map, and the determining of the matched pose of the radar in the line segment map according to the matching result, include:
- determining the coordinates of the point cloud scanned by the radar in the line segment map according to the estimated pose of the radar;
- determining a target transformation matrix according to the coordinates of the point cloud in the line segment map and the line segments in the line segment map, wherein the point cloud at the coordinates, after being translated and rotated by the target transformation matrix, matches the line segments in the line segment map;
- translating and rotating the estimated pose of the radar through the target transformation matrix to obtain the matched pose of the radar in the line segment map.
- in some embodiments, the determining of the target pose of the traveling device in the line segment map according to the matched pose of the radar includes:
- determining the matched pose of the traveling device according to the matched pose of the radar;
- fusing the matched pose of the traveling device with the estimated pose of the traveling device to obtain the target pose of the traveling device in the line segment map.
- in some embodiments, in a case where the pose determination method is periodically executed, before determining the estimated pose of the radar of the traveling device in the line segment map, the method further includes:
- estimating the pose of the traveling device in the line segment map according to the target pose of the traveling device determined in the previous cycle and the sensing data collected by the traveling device in the current cycle, to obtain the estimated pose of the traveling device in the current cycle, where the sensing data is collected based on the motion information of the traveling device.
- in some embodiments, the line segment map is obtained by performing straight line fitting on an initial map, where the initial map includes at least one of an occupancy grid map and a site design drawing.
- the traveling device includes a traveling device based on an embedded platform, the traveling device runs in a sand table, and the line segment map is a map of the sand table.
- an apparatus for determining a pose including:
- the estimated pose determination part is configured to determine the estimated pose of the radar of the traveling device in the line segment map according to the estimated pose of the traveling device in the line segment map, where things in the line segment map are represented by line segments;
- the radar pose determination part is configured to match the point cloud collected by the radar with the line segments in the line segment map according to the estimated pose of the radar, and to determine the matched pose of the radar in the line segment map according to the matching result;
- the target pose determination part is configured to determine the target pose of the traveling device in the line segment map according to the matched pose of the radar.
- in some embodiments, the estimated pose determination part is further configured to use the estimated poses of the traveling device at a plurality of first moments to obtain, through interpolation processing, the estimated pose of the traveling device at a second moment, the second moment including the time at which the radar collects the point cloud; and to obtain the estimated pose of the radar at the second moment according to the estimated pose of the traveling device at the second moment.
- the apparatus further includes:
- a target line segment determination part configured to determine a target line segment in the line segment map that satisfies a first preset condition according to the orientation in the estimated pose of the traveling device
- the radar pose determination part is configured to match the point cloud collected by the radar with the target line segment.
- in some embodiments, the first preset condition includes:
- the absolute value of the included angle between the line segment and the orientation is less than a first threshold;
- the distance between the line segment and a first line segment is less than a distance threshold, where the first line segment is a line segment whose included angle with the orientation has an absolute value less than a second threshold.
- in some embodiments, the radar pose determination part is further configured to determine the coordinates of the point cloud scanned by the radar in the line segment map according to the estimated pose of the radar;
- to determine a target transformation matrix according to the coordinates of the point cloud in the line segment map and the line segments in the line segment map, wherein the point cloud at the coordinates, after being translated and rotated by the target transformation matrix, matches the line segments in the line segment map;
- and to translate and rotate the estimated pose of the radar through the target transformation matrix to obtain the matched pose of the radar in the line segment map.
- in some embodiments, the target pose determination part is further configured to determine the matched pose of the traveling device according to the matched pose of the radar, and to fuse the matched pose of the traveling device with the estimated pose of the traveling device to obtain the target pose of the traveling device in the line segment map.
- the apparatus further includes:
- the current estimated pose determination part is configured to estimate the pose of the traveling device in the line segment map according to the target pose of the traveling device determined in the previous cycle and the sensing data collected by the traveling device in the current cycle, to obtain the estimated pose of the traveling device in the current cycle, where the sensing data is collected based on the motion information of the traveling device.
- in some embodiments, the line segment map is obtained by performing straight line fitting on an initial map, where the initial map includes at least one of an occupancy grid map and a site design drawing.
- the traveling device includes a traveling device based on an embedded platform, the traveling device runs in a sand table, and the line segment map is a map of the sand table.
- an electronic device comprising: a processor; a memory configured to store instructions executable by the processor; wherein the processor is configured to invoke the instructions stored in the memory, to perform the above method.
- a computer-readable storage medium having computer program instructions stored thereon, the computer program instructions implementing the above method when executed by a processor.
- a computer program including computer-readable codes, where when the computer-readable codes run in an electronic device, a processor in the electronic device executes them to implement the above method.
- in the embodiments of the present disclosure, the estimated pose of the radar of the traveling device in the line segment map can be determined according to the estimated pose of the traveling device in the line segment map, and the point cloud collected by the radar is matched with the line segments in the line segment map according to the pre-estimated pose, which narrows the matching area in the line segment map and reduces the computation of the pose determination process.
- in addition, because the line segment map is lightweight, matching the point cloud against its line segments further reduces the computation of the pose determination process and improves the efficiency of determining the pose of the traveling device.
- FIG. 1 shows a schematic diagram of the architecture of a pose determination system according to an embodiment of the present disclosure
- FIG. 2 shows a flowchart of a pose determination method according to an embodiment of the present disclosure
- FIG. 3A shows a schematic diagram of a target line segment in a line segment map according to an embodiment of the present disclosure
- FIG. 3B shows a schematic diagram of a target line segment in another line segment map according to an embodiment of the present disclosure;
- FIG. 4 shows a block diagram of an apparatus for determining a pose according to an embodiment of the present disclosure
- FIG. 5 shows a block diagram of an electronic device according to an embodiment of the present disclosure
- FIG. 6 shows a block diagram of an electronic device according to an embodiment of the present disclosure.
- the pose determination method provided by the embodiments of the present disclosure may be executed by the traveling device itself, or the traveling device may send the collected data to another electronic device, such as a terminal device or a server, for execution.
- the terminal device may be a user equipment (User Equipment, UE), a mobile device, a user terminal, a terminal, a cellular phone, a cordless phone, a personal digital assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, a wearable device, etc.
- FIG. 1 is a schematic structural diagram of an exemplary pose determination system 10 provided by an embodiment of the present disclosure; as shown in FIG. 1, the pose determination system 10 includes a terminal device/server 100 and a traveling device 200 (a traveling device 200-1 is shown by way of example in FIG. 1).
- the traveling device 200-1 can send the collected motion information or sensor data, as well as the collected point cloud and other data, to the terminal device/server 100 during traveling;
- according to the relative relationship between the poses of the traveling device and the radar, as well as data such as the estimated pose of the traveling device in the line segment map, the terminal device/server 100 determines the estimated pose of the radar of the traveling device 200-1 in the line segment map, matches the point cloud collected by the radar with the line segments in the line segment map according to the estimated pose of the radar, determines the matched pose of the radar in the line segment map according to the matching result, finally determines the target pose of the traveling device 200-1 in the line segment map according to the matched pose of the radar, and sends the target pose to the traveling device 200-1.
- by having the traveling device send the collected data to another device such as a terminal device or a server, which then determines the target pose of the traveling device 200-1, the computation load on the traveling device can be reduced.
- FIG. 2 shows a flowchart of a pose determination method according to an embodiment of the present disclosure. As shown in FIG. 2 , the pose determination method includes:
- in step S11, the estimated pose of the radar of the traveling device in the line segment map is determined according to the estimated pose of the traveling device in the line segment map.
- things in the line segment map are represented by line segments; for example, roads and buildings are represented by line segments.
- the line segment map can be implemented in many ways, and it will be described in detail later in conjunction with possible implementations of the present disclosure.
- the traveling device here may be a device whose geographic location can be changed, and the change of the geographic location may be an autonomous change of the traveling device, or a change caused by an external force.
- the traveling device may be a traveling device in the field of unmanned driving or artificial intelligence education.
- the traveling device may be a vehicle (for example, an embedded self-driving car), a ship, an aircraft, or another device, or it may be a robot, such as a sweeping robot, a handling robot, or an educational robot.
- the pose can include a position and an attitude;
- the position can be the coordinates of the device on the map;
- the attitude can be the direction of the device on the map.
- the estimated pose of the traveling device may be a pre-estimated pose, which may be estimated according to the pose determined at a target time of the traveling device.
- the estimation of the pose of the traveling device will be described in detail later in combination with possible implementations of the embodiments of the present disclosure.
- the estimated pose of the radar of the traveling device in the line segment map may be determined from the estimated pose of the traveling device according to the relative relationship between the poses of the traveling device and the radar.
- in the case where the radar base is fixed on the traveling device, this relative relationship is usually fixed; therefore, it can be determined manually in advance, that is, the relative relationship is a preset value.
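Since the radar-to-device relative pose is a fixed preset value, deriving the radar's estimated pose from the device's estimated pose is a pose composition. A minimal 2D sketch (the (x, y, theta) representation and the 0.2 m mounting offset are illustrative assumptions, not values from the patent):

```python
import math

def compose_se2(pose, offset):
    """Compose a 2D pose (x, y, theta) with a fixed relative pose
    'offset' expressed in the frame of the first pose."""
    x, y, th = pose
    ox, oy, oth = offset
    return (x + ox * math.cos(th) - oy * math.sin(th),
            y + ox * math.sin(th) + oy * math.cos(th),
            th + oth)

# Hypothetical extrinsic: radar mounted 0.2 m ahead of the device origin.
radar_in_device = (0.2, 0.0, 0.0)
device_pose = (1.0, 2.0, math.pi / 2)   # device at (1, 2), facing +y
radar_pose = compose_se2(device_pose, radar_in_device)  # -> about (1.0, 2.2, pi/2)
```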
- in step S12, according to the estimated pose of the radar, the point cloud collected by the radar is matched with the line segments in the line segment map, and the matched pose of the radar in the line segment map is determined according to the matching result.
- the point cloud collected by the radar is located in the radar's polar coordinate system, for example a polar coordinate system with the radar as the center point; the point cloud obtained by the radar scanning the surrounding things is located around that center point.
- after the estimated pose of the radar in the line segment map is obtained, the estimated pose of the radar in the world coordinate system is determined; therefore, the point cloud in the polar coordinate system can be mapped into the world coordinate system according to the estimated pose of the radar in the world coordinate system, and the pose of the point cloud scanned by the radar in the world coordinate system is obtained.
- that is, according to the estimated pose of the radar in the line segment map and the pose of the scanned point cloud in the radar's polar coordinate system, the pose of the scanned point cloud in the line segment map is obtained.
- since the estimated pose is a pre-estimated pose, what is determined in this way is the approximate position and direction of the radar-scanned point cloud in the line segment map.
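The polar-to-map mapping described above can be sketched as follows (a planar 2D scan and an (x, y, theta) pose are assumptions made for illustration):

```python
import math

def scan_to_map(ranges, bearings, radar_pose):
    """Project radar returns, given as (range, bearing) pairs in the
    radar's polar coordinate system, into map/world coordinates using
    the radar's estimated pose (x, y, theta) in the map."""
    x, y, th = radar_pose
    return [(x + r * math.cos(th + a), y + r * math.sin(th + a))
            for r, a in zip(ranges, bearings)]

# A return 1 m straight ahead and one 2 m to the left, radar at the origin:
points = scan_to_map([1.0, 2.0], [0.0, math.pi / 2], (0.0, 0.0, 0.0))
```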
- the "matching" mentioned here can be understood as finding, in the line segment map, the line segments corresponding to the things represented by the point cloud.
- during matching, the point cloud can be translated and rotated to find line segments that coincide with it as much as possible; once line segments with the expected degree of coincidence with the point cloud are found, the matching is considered complete.
- in the process of matching the point cloud with the line segments in the line segment map, the point cloud is moved and rotated, and line segments that coincide with the point cloud as much as possible are found in the line segment map. Since the point cloud records the relative positions of the points that the radar scanned on surrounding things, and the line segments in the line segment map also represent the positions of things, the pose of the radar can be derived in reverse once the point cloud and the line segments are matched; the derived pose can be considered the radar's pose when scanning the point cloud.
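The translate-and-match idea can be sketched with a deliberately simplified, translation-only alignment loop (real implementations also estimate rotation, e.g. point-to-line ICP; every name below is illustrative, not from the patent):

```python
import math

def point_to_line(p, a, b):
    """Closest point on segment ab to point p, and the distance to it."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))          # clamp onto the segment
    qx, qy = ax + t * dx, ay + t * dy
    return (qx, qy), math.hypot(px - qx, py - qy)

def align_translation(points, segments, iters=10):
    """Translation-only sketch of point-to-line matching: repeatedly
    shift the cloud by the mean offset to the closest segments."""
    tx = ty = 0.0
    for _ in range(iters):
        sx = sy = 0.0
        for px, py in points:
            q, _ = min((point_to_line((px + tx, py + ty), a, b)
                        for a, b in segments), key=lambda r: r[1])
            sx += q[0] - (px + tx)
            sy += q[1] - (py + ty)
        tx += sx / len(points)
        ty += sy / len(points)
    return tx, ty

# Cloud sampled from a wall at x = 1.0 but predicted at x = 1.3:
cloud = [(1.3, 0.0), (1.3, 0.5), (1.3, 1.0)]
wall = [((1.0, -1.0), (1.0, 2.0))]
tx, ty = align_translation(cloud, wall)  # converges to about (-0.3, 0.0)
```

Each iteration projects every point onto its closest segment and shifts the cloud by the mean residual; the accumulated shift plays the role of the translation part of the target transformation matrix mentioned in the claims.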
- in step S13, the target pose of the traveling device in the line segment map is determined according to the matched pose of the radar.
- after the matched pose of the radar is determined, the pose of the traveling device in the line segment map can be determined according to the relative relationship between the poses of the traveling device and the radar.
- the pose is referred to as the target pose here.
- the target pose may also be determined according to other factors, which will be described in detail later in conjunction with possible implementations of the embodiments of the present disclosure.
- the pose determination method provided by the embodiment of the present disclosure may also be implemented in a variety of ways.
- the pose of the traveling device is determined by performing point-line matching on a line segment map.
- the line segment map is obtained by performing straight line fitting on an initial map, and the initial map includes at least one of an occupancy grid map and a site design map.
- in some embodiments, a simultaneous localization and mapping (SLAM) algorithm can be used to map the traveling site of the traveling device to obtain the occupancy grid map of the site.
- in this way, the line segment map can be obtained by directly fitting straight lines to the initial map, which improves the efficiency of drawing the line segment map.
- alternatively, the line segment map can be determined directly from the site design drawing. Since the site design drawing is often a relatively simple line drawing, it is well suited to the straight line fitting algorithm, the obtained line segment map is more accurate, and there is no need to construct the initial map through other algorithms, which is more convenient.
- the obtained line segment map is a very lightweight map compared to other maps, which can improve the speed of pose estimation.
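Straight line fitting on an initial map can be illustrated with the split step of the classic split-and-merge algorithm, applied to an ordered chain of boundary points (the tolerance and the L-shaped outline below are made-up examples, not data from the patent):

```python
import math

def split_into_segments(points, tol):
    """Recursive split step of split-and-merge line fitting: approximate
    an ordered point chain by straight segments whose maximum
    perpendicular error stays below 'tol'."""
    if len(points) <= 2:
        return [(points[0], points[-1])]
    (x0, y0), (x1, y1) = points[0], points[-1]
    nx, ny = y1 - y0, x0 - x1                  # normal of the chord
    norm = math.hypot(nx, ny) or 1.0
    devs = [abs(nx * (x - x0) + ny * (y - y0)) / norm for x, y in points]
    k = max(range(len(points)), key=lambda i: devs[i])
    if devs[k] <= tol:
        return [(points[0], points[-1])]       # chain is straight enough
    # split at the worst point and fit both halves recursively
    return (split_into_segments(points[:k + 1], tol) +
            split_into_segments(points[k:], tol))

# An L-shaped wall outline becomes two segments:
outline = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]
segments = split_into_segments(outline, 0.1)
```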
- the estimated pose of the traveling device may be estimated based on the pose determined at the target time of the traveling device.
- the pose determination method in the embodiment of the present disclosure is periodically executed.
- the pose at the target moment may be the target pose determined in the previous cycle; in addition, the pose at the target moment may also be determined by other means, for example by a position sensor such as a global positioning system.
- in some embodiments, the method further includes: estimating the pose of the traveling device in the line segment map according to the target pose of the traveling device determined in the previous cycle and the sensing data collected by the traveling device in the current cycle, to obtain the estimated pose of the traveling device in the current cycle, where the sensing data is collected based on the motion information of the traveling device.
- the motion information of the traveling device may include at least one of attitude (yaw) angle, angular velocity, acceleration, wheel speed, etc., and the sensing data may be data collected by sensors such as an inertial measurement unit (IMU) or a wheel speedometer.
- in some embodiments, based on the target pose of the traveling device determined in the previous cycle, the data collected by the IMU in the current cycle and the wheel speed collected by the wheel speedometer in the current cycle can be integrated and fused to obtain the estimated pose of the traveling device in the current cycle; or, based on the target pose of the traveling device determined in the previous cycle, Kalman filter fusion can be performed on the sensing data collected by the sensors to obtain the estimated pose of the traveling device in the current cycle.
- in the integration-and-fusion approach, the angular velocity collected by the IMU is integrated over time to obtain the change in angle of the traveling device, and the wheel speed collected by the wheel speedometer is integrated to obtain the change in displacement. Then, starting from the target pose of the traveling device determined in the previous cycle and adding the obtained changes in angle and displacement, the estimated pose of the traveling device in the current cycle can be obtained.
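The integration described above amounts to planar dead reckoning. A hedged sketch, assuming gyro yaw rate and wheel speed sampled at a fixed interval (the sample rate and values are illustrative):

```python
import math

def dead_reckon(pose, samples, dt):
    """Integrate gyro yaw rate and wheel speed over one cycle to advance
    the previous target pose (x, y, theta)."""
    x, y, th = pose
    for yaw_rate, speed in samples:
        th += yaw_rate * dt                  # angle change from the IMU
        x += speed * math.cos(th) * dt       # displacement change from
        y += speed * math.sin(th) * dt       # the wheel speedometer
    return x, y, th

# Straight drive at 1 m/s for 1 s (10 samples of 0.1 s):
pose = dead_reckon((0.0, 0.0, 0.0), [(0.0, 1.0)] * 10, 0.1)
```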
- in the Kalman filter fusion approach, the Kalman prediction equation is used: based on the target pose of the traveling device determined in the previous cycle, the angles and displacements obtained from different sensing data are fused according to a ratio to obtain the estimated pose of the traveling device in the current cycle.
- the fusion ratio is determined by the Kalman gain (Kg), and Kg is updated through the Kalman update equation according to the prediction result of the Kalman prediction equation, so that subsequent predictions of the Kalman prediction equation are more accurate.
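In scalar form, the role of the Kalman gain Kg in the ratio-based fusion can be sketched as follows (the variance values are illustrative assumptions):

```python
def kalman_fuse(pred, pred_var, meas, meas_var):
    """Scalar Kalman update: blend a predicted value with a measurement
    in proportion to the Kalman gain Kg = P / (P + R)."""
    kg = pred_var / (pred_var + meas_var)
    fused = pred + kg * (meas - pred)        # fusion ratio set by Kg
    fused_var = (1.0 - kg) * pred_var        # updated uncertainty
    return fused, fused_var, kg

# Equal confidence in prediction and measurement -> Kg = 0.5, i.e. the mean:
fused, var, kg = kalman_fuse(1.0, 0.04, 1.2, 0.04)
```

With equal variances the gain is 0.5 and the fused estimate is the midpoint; a noisier measurement (larger meas_var) would pull Kg toward 0, trusting the prediction more.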
- the target pose used in determining the estimated pose in the initial cycle may be set manually, or calculated from other manually set parameters.
- in some embodiments, the estimated pose can be determined at a preset frequency in the current cycle, so that estimated poses at multiple first moments in the current cycle are obtained; for example, the estimated pose can be output at a frequency of 100 times per second.
- in this way, by using the target pose determined in the previous cycle and the sensing data collected by the traveling device in the current cycle to estimate the pose of the traveling device in the line segment map, the pose of the traveling device can be estimated accurately, which facilitates subsequent point-line matching and improves the accuracy of the determined pose.
- determining the estimated pose of the radar of the traveling device in the line segment map according to the estimated pose of the traveling device in the line segment map includes: using the estimated poses of the traveling device at a plurality of first moments, obtaining the estimated pose of the traveling device at a second moment through interpolation processing, where the second moment includes the moment when the radar collects the point cloud; and obtaining the estimated pose of the radar at the second moment according to the estimated pose of the traveling device at the second moment.
- the radar scans the surrounding objects at a certain frequency to obtain the point cloud.
- the moment when the radar collects the point cloud is called the second moment, and there can be multiple second moments. Since the first moments at which the front end determines the estimated pose of the traveling device may differ from the second moment, the estimated poses of the traveling device at multiple first moments can be interpolated to obtain the estimated pose of the traveling device at the second moment.
- when the second moment falls within the range covered by the first moments, the estimated poses at the first moments are interpolated to obtain the estimated pose of the traveling device at the second moment; when the second moment falls outside that range, a plurality of estimated poses at the first moments are extrapolated to obtain the estimated pose of the traveling device at the second moment.
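A minimal sketch of this time alignment, under the assumption of simple linear interpolation between two bracketing first moments (the disclosure does not specify the interpolation scheme, and heading wrap-around is ignored here for brevity):

```python
def interpolate_pose(t, t0, pose0, t1, pose1):
    """Linearly interpolate a 2D pose (x, y, theta) to time t.

    When t lies outside [t0, t1] the same formula extrapolates.
    Note: theta wrap-around at +/-pi is not handled in this sketch.
    """
    a = (t - t0) / (t1 - t0)  # fraction of the way from t0 to t1
    return tuple(p0 + a * (p1 - p0) for p0, p1 in zip(pose0, pose1))
```

For example, a radar timestamp halfway between two first moments yields the midpoint pose, while a timestamp past the last first moment is extrapolated along the same line.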
- the pose obtained by interpolation at the second moment is the estimated pose of the traveling device. Therefore, according to the relative pose relationship between the radar and the traveling device, the estimated pose of the radar at the second moment can be obtained from the interpolated pose.
- the estimated pose at the moment when the radar collects the point cloud is thus obtained accurately, which facilitates subsequent point-line matching and improves the accuracy of the determined pose.
- before matching the point cloud collected by the radar with the line segments in the line segment map, the method further includes: determining, according to the orientation in the estimated pose of the traveling device, a target line segment in the line segment map that satisfies a first preset condition; and matching the point cloud collected by the radar with the line segments in the line segment map includes: matching the point cloud collected by the radar with the target line segment.
- in the process of traveling on a track, the traveling device usually travels along the track, so the orientation of the traveling device is often parallel or nearly parallel to the two sides of the track where it is located.
- the things in the line segment map are also represented by line segments. Therefore, the candidate line segments can be filtered according to the orientation of the traveling device: by setting conditions, the line segments that match the orientation of the traveling device can be selected.
- the first preset condition includes:
- the absolute value of the included angle between the line segment and the facing direction is less than the first threshold
- the distance between the line segment and the first line segment is smaller than a distance threshold, and the first line segment is a line segment whose absolute value of the included angle with the orientation is smaller than a second threshold value.
- the first threshold may be, for example, 30°, so that line segments within ±30° of the orientation of the traveling device are selected. As shown in FIG. 3A, for the line segment map of the circular track, the orientation of the traveling device and the set a of line segments whose absolute included angle with that orientation is less than 30° are both shown in FIG. 3A.
- the second threshold is smaller than the first threshold, and may be, for example, 3° or 5°, so that the first line segment is a line segment that is parallel or approximately parallel to the orientation of the traveling device. After the first line segment is determined, a line segment whose distance from the first line segment is smaller than the distance threshold is selected as a target line segment for subsequent matching.
- the distance threshold may be an empirical value set according to experience, and the value is not limited in this embodiment of the present disclosure.
- as shown in FIG. 3B, for the line segment map of the circular track, the orientation of the traveling device is illustrated, together with the first line segments that are nearly parallel to that orientation and the set b of line segments whose distance to a first line segment is less than the distance threshold.
- the distance between line segments may be the distance between the midpoints of the two line segments, or may be the smallest value among the distances between all points on the two line segments.
- the manner of determining the distance between the line segments is not limited in this embodiment of the present disclosure.
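The screening by orientation and distance described above can be sketched as follows. This is an illustrative implementation under stated assumptions: all names are ours, the midpoint distance is used (one of the options the text allows), and because the disclosure does not state how the two clauses of the first preset condition combine, this sketch takes their union.

```python
import math

def seg_angle(seg):
    (x1, y1), (x2, y2) = seg
    return math.atan2(y2 - y1, x2 - x1)

def angle_diff(a, b):
    # segments are undirected, so fold the difference into [0, pi/2]
    d = abs(a - b) % math.pi
    return min(d, math.pi - d)

def midpoint_dist(s1, s2):
    (ax, ay), (bx, by) = s1
    (cx, cy), (dx, dy) = s2
    return math.hypot((ax + bx) / 2 - (cx + dx) / 2,
                      (ay + by) / 2 - (cy + dy) / 2)

def select_target_segments(segments, heading,
                           first_thresh=math.radians(30),
                           second_thresh=math.radians(5),
                           dist_thresh=2.0):
    # clause 1: segments within +/-first_thresh of the device orientation
    near = [s for s in segments
            if angle_diff(seg_angle(s), heading) < first_thresh]
    # "first line segments": nearly parallel to the orientation
    first = [s for s in segments
             if angle_diff(seg_angle(s), heading) < second_thresh]
    # clause 2: segments closer than dist_thresh to some first line segment
    close = [s for s in segments
             if any(midpoint_dist(s, f) < dist_thresh for f in first)]
    # combination of the two clauses is not specified; take the union here
    return [s for s in segments if s in near or s in close]
```

A nearby segment aligned with the heading is kept, while a distant perpendicular segment is filtered out before matching.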
- matching the point cloud collected by the radar with the line segments in the line segment map, and determining the pose of the radar in the line segment map after matching according to the matching result, includes: determining the coordinates of the point cloud scanned by the radar in the line segment map according to the estimated pose of the radar; determining a target transformation matrix according to the coordinates of the point cloud in the line segment map and the line segments in the line segment map, wherein the point cloud at those coordinates, after being translated and rotated by the target transformation matrix, matches the line segments in the line segment map; and translating and rotating the estimated pose of the radar through the target transformation matrix to obtain the pose of the radar in the line segment map.
- the line segment map can be searched for line segments that overlap, or coincide, with the point cloud as much as possible; in this way, the matching of the point cloud and the line segments is realized.
- the coordinates of the radar-scanned point cloud in the line segment map can be obtained from the estimated pose of the radar in the line segment map (world coordinate system) and the positions of the scanned points in the radar's polar coordinate system.
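This polar-to-world conversion can be sketched in a few lines, assuming a 2D radar pose (x, y, theta) and a scan of (range, bearing) pairs; the function name is ours and details such as sensor offsets are omitted.

```python
import math

def polar_scan_to_map(scan, radar_pose):
    """Convert a radar scan in polar coordinates (range, bearing) into
    line-segment-map (world) coordinates using the radar's estimated
    pose (x, y, theta)."""
    x, y, theta = radar_pose
    pts = []
    for r, phi in scan:
        a = theta + phi  # bearing rotated into the world frame
        pts.append((x + r * math.cos(a), y + r * math.sin(a)))
    return pts
```

For example, a return at range 1 m straight ahead of a radar facing "up" (theta = pi/2) lands one meter above the radar in map coordinates.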
- the point cloud in the line segment map can be translated and rotated through a transformation matrix, and the target transformation matrix that makes the point cloud match the line segments in the line segment map during this translation and rotation is determined.
- the distance between each point in the point cloud and its nearest line segment can be calculated; when the sum of these distances is smallest, the point cloud is considered to match the line segments in the map.
- since the point cloud scanned by the radar represents the relative positions of the things around the radar, once the point cloud is matched with the line segments in the map, the matched result indicates the actual pose at which the radar scanned the point cloud, i.e., the estimated pose after rotation and translation by the target transformation matrix. Therefore, the more accurate pose of the radar is the estimated pose after rotation and translation through the target transformation matrix, and the estimated pose of the radar can be translated and rotated through the target transformation matrix to obtain the pose of the radar in the line segment map.
- the process of matching the point cloud and the line segment in the embodiment of the present disclosure may be implemented by an algorithm, for example, may be implemented based on the Gauss-Newton method, the Levenberg-Marquardt method, or the trust region dogleg method.
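The cost that such solvers minimize can be written down directly: the sum of distances from the transformed points to their nearest map segments. The sketch below only evaluates that cost (the names are ours); in practice an iterative solver such as Gauss-Newton or Levenberg-Marquardt, as named above, would search over (tx, ty, theta) to drive it to a minimum.

```python
import math

def point_segment_dist(p, seg):
    """Distance from point p to a finite segment seg."""
    (px, py), ((x1, y1), (x2, y2)) = p, seg
    dx, dy = x2 - x1, y2 - y1
    l2 = dx * dx + dy * dy
    # parameter of the closest point on the segment, clamped to [0, 1]
    t = 0.0 if l2 == 0 else max(0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / l2))
    return math.hypot(px - (x1 + t * dx), py - (y1 + t * dy))

def match_cost(points, segments, tx, ty, theta):
    """Sum of distances from the transformed points to their nearest
    segment -- the quantity the target transformation minimizes."""
    c, s = math.cos(theta), math.sin(theta)
    total = 0.0
    for x, y in points:
        qx, qy = c * x - s * y + tx, s * x + c * y + ty  # rotate, then translate
        total += min(point_segment_dist((qx, qy), seg) for seg in segments)
    return total
```

A point lying 0.1 m off a map segment contributes exactly 0.1 to the cost under the identity transform, and the cost drops to zero once the transform places every point on a segment.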
- the point cloud collected by the radar is matched with the line segments in the line segment map according to the pre-estimated pose, which reduces the amount of calculation in the pose determination process and improves the efficiency of determining the pose of the traveling device.
- determining the target pose of the traveling device in the line segment map according to the matched pose of the radar includes: determining the matched pose of the traveling device according to the matched pose of the radar; and fusing the matched pose of the traveling device with the estimated pose of the traveling device to obtain the target pose of the traveling device in the line segment map.
- the matched pose of the traveling device can be determined from the matched pose of the radar according to the relative pose relationship between the radar and the traveling device. The matched pose of the traveling device is then fused with its estimated pose, and the fused pose is used as the target pose of the traveling device in the line segment map. For example, the matched pose and the estimated pose of the traveling device can be combined by a weighted average to obtain the fused target pose; or the Kalman filter algorithm can be used to fuse the two: the Kalman prediction equation fuses the matched pose and the estimated pose in a proportion determined by the Kalman gain, and the Kalman gain is updated by the Kalman update equation according to the prediction result of the Kalman prediction equation, so that the prediction results of the updated Kalman prediction equation become more accurate.
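The weighted-average option mentioned above can be sketched as follows; the function name and the choice of weight are illustrative assumptions, not values from the disclosure.

```python
def fuse_poses(matched, estimated, w=0.7):
    """Weighted average of the matched pose and the estimated pose,
    element-wise over (x, y, theta); w is the weight given to the
    matched (radar) pose."""
    return tuple(w * m + (1.0 - w) * e for m, e in zip(matched, estimated))
```

With w = 0.5 the result is the plain midpoint of the two poses; in practice w would be tuned (or replaced by a Kalman gain) to reflect the relative reliability of matching versus dead reckoning.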
- the estimated pose of the traveling device interpolated to the second moment can be fused with the matched pose of the traveling device, or the estimated pose at the first moment can be fused directly with the matched pose of the traveling device.
- the shortcomings of different pose determination methods can compensate for each other, errors caused by various uncertain factors can be reduced, and the accuracy of the obtained target pose is improved.
- the traveling device includes a traveling device based on an embedded platform, for example, a car running in a sandbox and configured for artificial intelligence teaching, an indoor sweeping robot, and the like.
- the computing performance of a traveling device based on an embedded platform is usually low, while such a device has high requirements for determining the pose in real time; the method described here increases the speed of pose determination.
- the traveling device operates in a sand table, and the line segment map is a map of the sand table. Because the sand table environment is relatively monotonous, on many road sections the two consecutive frames scanned by the radar are not sufficiently different, so matching between the point clouds of adjacent frames cannot accurately determine the pose of the traveling device. In the embodiment of the present disclosure, however, matching the radar point cloud with the line segments of the map significantly improves the positioning accuracy.
- the car used for artificial intelligence teaching drives automatically in the sand table, and the line segment map of the sand table can be established in advance from the site design drawing of the sand table; roads, buildings, obstacles, etc. in the sand table are represented by line segments in the line segment map.
- the first estimated pose S1 of the car in the line segment map can be set manually. According to the sensor data collected at each first moment of the initial cycle, combined with the estimated pose S1, the estimated poses S2, S3, …, Sn of the car at each subsequent first moment are determined (n is the number of first moments). By interpolating S1…Sn and transforming according to the relative pose relationship between the radar and the car, the estimated poses P1…Pn of the radar at each second moment are obtained.
- the target line segments that meet the first preset condition are screened from the line segment map, and the coordinates of the radar-scanned point cloud in the line segment map are determined according to the estimated poses P1…Pn of the radar. The coordinates of the point cloud are translated and rotated based on a transformation matrix until a line segment that substantially coincides with the point cloud is found among the target line segments; the transformation matrix at that point is determined to be the target transformation matrix, and the target transformation matrix is used to translate and rotate the estimated poses P1…Pn of the radar to obtain the matched radar poses P'1…
- the above method may be performed by a pose determination module on the traveling device, and may also be implemented by a processor invoking computer-readable instructions stored in a memory.
- the present disclosure also provides a pose determination apparatus, an electronic device, a computer-readable storage medium, and a program, all of which can be used to implement any one of the pose determination methods provided by the present disclosure; for the corresponding technical solutions and descriptions, refer to the corresponding records in the method section.
- FIG. 4 shows a block diagram of a pose determination apparatus according to an embodiment of the present disclosure.
- the apparatus 40 includes:
- the estimated pose determination part 401 is configured to determine the estimated pose of the radar of the traveling device in the line segment map according to the estimated pose of the traveling device in the line segment map, the things in the line segment map being represented by line segments;
- the radar pose determination part 402 is configured to match the point cloud collected by the radar with the line segments in the line segment map according to the estimated pose of the radar, and determine that the radar is in the line segment after matching according to the matching result. the pose in the line segment map;
- the target pose determination part 403 is configured to determine the target pose of the traveling device in the line segment map according to the matched pose of the radar.
- the estimated pose determination part 401 is further configured to use the estimated poses of the traveling device at a plurality of first moments to obtain, through interpolation processing, the estimated pose of the traveling device at a second moment, the second moment including the moment when the radar collects the point cloud; and to obtain the estimated pose of the radar at the second moment according to the estimated pose of the traveling device at the second moment.
- the apparatus further includes:
- a target line segment determination part configured to determine a target line segment in the line segment map that satisfies a first preset condition according to the orientation in the estimated pose of the traveling device
- the radar pose determination part 402 is further configured to match the point cloud collected by the radar with the target line segment.
- the first preset condition includes:
- the absolute value of the included angle between the line segment and the facing direction is less than the first threshold
- the distance between the line segment and the first line segment is less than a distance threshold, and the first line segment is a line segment whose absolute value of the included angle with the orientation is less than a second threshold value.
- the radar pose determination part 402 is further configured to determine the coordinates of the point cloud scanned by the radar in the line segment map according to the estimated pose of the radar;
- according to the coordinates of the point cloud in the line segment map and the line segments in the line segment map, determine a target transformation matrix, so that the point cloud at the coordinates, after being translated and rotated by the target transformation matrix, matches the line segments in the line segment map; and translate and rotate the estimated pose of the radar through the target transformation matrix to obtain the pose of the radar in the line segment map.
- the target pose determination part 403 is further configured to determine the matched pose of the traveling device according to the matched pose of the radar, and to fuse the matched pose of the traveling device with the estimated pose of the traveling device to obtain the target pose of the traveling device in the line segment map.
- the apparatus further includes:
- the current estimated pose determination part is configured to determine the estimated pose of the traveling device in the line segment map in the current cycle according to the target pose of the traveling device determined in the previous cycle and the sensing data collected by the traveling device in the current cycle, the sensing data being collected based on the motion information of the traveling device.
- the line segment map is obtained by performing straight line fitting on an initial map, where the initial map includes at least one of an occupation grid map and a site design drawing.
- the traveling device includes a traveling device based on an embedded platform, the traveling device runs in a sand table, and the line segment map is a map of the sand table.
- the functions or modules included in the apparatus provided in the embodiments of the present disclosure may be configured to execute the methods described in the above method embodiments, and for implementation, reference may be made to the above method embodiments.
- a "part" may be a part of a circuit, a part of a processor, a part of a program or software, etc.; it may also be a unit or a module, or may be non-modular.
- Embodiments of the present disclosure further provide a computer-readable storage medium, on which computer program instructions are stored, and when the computer program instructions are executed by a processor, the foregoing method is implemented.
- Computer-readable storage media can be volatile or non-volatile computer-readable storage media.
- An embodiment of the present disclosure further provides an electronic device, comprising: a processor and a memory configured to store instructions executable by the processor; wherein the processor is configured to invoke the instructions stored in the memory to execute the above method.
- Embodiments of the present disclosure also provide a computer program product including computer-readable code; when the computer-readable code runs on a device, a processor in the device executes instructions for implementing the pose determination method provided by any of the above embodiments.
- Embodiments of the present disclosure further provide another computer program product configured to store computer-readable instructions, which, when executed, cause the computer to perform the operations of the pose determination method provided by any one of the foregoing embodiments.
- An embodiment of the present disclosure further provides a computer program, which, when executed by a processor, implements the pose determination method provided by the embodiment of the present disclosure.
- the electronic device may be provided as a terminal, server or other form of device.
- FIG. 5 shows a block diagram of an electronic device 800 according to an embodiment of the present disclosure.
- electronic device 800 may be a terminal such as a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, fitness device, or personal digital assistant.
- electronic device 800 may include one or more of the following components: processing component 802, memory 804, power supply component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814 , and the communication component 816 .
- the processing component 802 generally controls the overall operation of the electronic device 800, such as operations associated with display, phone calls, data communications, camera operations, and recording operations.
- the processing component 802 can include one or more processors 820 to execute instructions to perform all or some of the steps of the methods described above.
- processing component 802 may include one or more modules that facilitate interaction between processing component 802 and other components.
- processing component 802 may include a multimedia module to facilitate interaction between multimedia component 808 and processing component 802.
- Memory 804 is configured to store various types of data to support operation at electronic device 800 . Examples of such data include instructions for any application or method operating on electronic device 800, contact data, phonebook data, messages, pictures, videos, and the like. Memory 804 may be implemented by any type of volatile or nonvolatile storage device or combination thereof, such as static random access memory (SRAM), electrically erasable programmable read only memory (EEPROM), erasable Programmable Read Only Memory (EPROM), Programmable Read Only Memory (PROM), Read Only Memory (ROM), Magnetic Memory, Flash Memory, Magnetic or Optical Disk.
- Power supply assembly 806 provides power to various components of electronic device 800 .
- Power supply components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power to electronic device 800 .
- Multimedia component 808 includes a screen that provides an output interface between the electronic device 800 and the user.
- the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user.
- the touch panel includes one or more touch sensors to sense touch, swipe, and gestures on the touch panel. The touch sensor may not only sense the boundaries of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe action.
- the multimedia component 808 includes a front-facing camera and/or a rear-facing camera. When the electronic device 800 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each of the front and rear cameras can be a fixed optical lens system or have focal length and optical zoom capability.
- Audio component 810 is configured to output and/or input audio signals.
- audio component 810 includes a microphone (MIC) that is configured to receive external audio signals when electronic device 800 is in operating modes, such as calling mode, recording mode, and voice recognition mode.
- the received audio signal may be further stored in memory 804 or transmitted via communication component 816 .
- audio component 810 also includes a speaker for outputting audio signals.
- the I/O interface 812 provides an interface between the processing component 802 and a peripheral interface module, which may be a keyboard, a click wheel, a button, or the like. These buttons may include, but are not limited to: home button, volume buttons, start button, and lock button.
- Sensor assembly 814 includes one or more sensors for providing status assessment of various aspects of electronic device 800 .
- the sensor assembly 814 can detect the on/off state of the electronic device 800 and the relative positioning of components, such as the display and keypad of the electronic device 800; the sensor assembly 814 can also detect a change in position of the electronic device 800 or a component thereof, the presence or absence of user contact with the electronic device 800, the orientation or acceleration/deceleration of the electronic device 800, and changes in its temperature.
- Sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects in the absence of any physical contact.
- Sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
- the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
- Communication component 816 is configured to facilitate wired or wireless communication between electronic device 800 and other devices.
- Electronic device 800 may access wireless networks based on communication standards, such as WiFi, 2G or 3G, or a combination thereof.
- the communication component 816 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel.
- the communication component 816 also includes a near field communication (NFC) module to facilitate short-range communication.
- the NFC module may be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
- electronic device 800 may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, to perform the above method.
- a non-volatile computer-readable storage medium such as a memory 804 comprising computer program instructions executable by the processor 820 of the electronic device 800 to perform the above method is also provided.
- FIG. 6 shows a block diagram of an electronic device 1900 according to an embodiment of the present disclosure.
- the electronic device 1900 may be provided as a server.
- electronic device 1900 includes processing component 1922, which further includes one or more processors, and a memory resource represented by memory 1932 configured to store instructions executable by processing component 1922, such as an application program.
- An application program stored in memory 1932 may include one or more modules, each corresponding to a set of instructions.
- the processing component 1922 is configured to execute instructions to perform the above-described methods.
- the electronic device 1900 may also include a power supply assembly 1926 configured to perform power management of the electronic device 1900, a wired or wireless network interface 1950 configured to connect the electronic device 1900 to a network, and an input output (I/O) interface 1958 .
- Electronic device 1900 may operate based on an operating system stored in memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™ or the like.
- a non-volatile computer-readable storage medium, such as memory 1932 comprising computer program instructions executable by processing component 1922 of electronic device 1900 to perform the above-described method, is also provided.
- the present disclosure may be a system, method and/or computer program product.
- the computer program product may include a computer-readable storage medium having computer-readable program instructions (computer program) loaded thereon for causing a processor to implement various aspects of the embodiments of the present disclosure.
- a computer-readable storage medium may be a tangible device that can hold and store instructions for use by the instruction execution device, and may be a volatile storage medium or a non-volatile storage medium.
- the computer-readable storage medium may be, for example, but is not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
- A non-exhaustive list of computer readable storage media includes: portable computer disks, hard disks, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM or flash memory), static random access memory (SRAM), portable compact disk read only memory (CD-ROM), digital versatile disks (DVD), memory sticks, floppy disks, mechanically encoded devices such as punched cards or raised structures in grooves with instructions stored thereon, and any suitable combination of the above.
- Computer-readable storage media are not to be construed as transient signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through waveguides or other transmission media (e.g., light pulses through fiber optic cables), or electrical signals transmitted through electrical wires.
- the computer readable program instructions described herein may be downloaded to various computing/processing devices from a computer readable storage medium, or to an external computer or external storage device over a network such as the Internet, a local area network, a wide area network, and/or a wireless network.
- the network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers.
- a network adapter card or network interface in each computing/processing device receives computer-readable program instructions from a network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in each computing/processing device .
- Computer program instructions for carrying out operations of the present disclosure may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk or C++, and conventional procedural programming languages such as the "C" language or similar programming languages.
- the computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
- the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (e.g., through the Internet using an Internet service provider).
- custom electronic circuits, such as programmable logic circuits, field programmable gate arrays (FPGAs), or programmable logic arrays (PLAs), can be personalized by utilizing state information of the computer readable program instructions, and these electronic circuits can execute the computer readable program instructions to implement various aspects of the present disclosure.
- These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in one or more blocks of the flowchart and/or block diagrams.
- These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to operate in a specific manner, such that the computer readable medium on which the instructions are stored comprises an article of manufacture including instructions that implement various aspects of the functions/acts specified in one or more blocks of the flowchart and/or block diagrams.
- Computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus, or other device to produce a computer-implemented process, such that the instructions executing on the computer, other programmable apparatus, or other device implement the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
- each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
- each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by dedicated hardware-based systems that perform the specified functions or actions, or by a combination of dedicated hardware and computer instructions.
- the computer program product can be specifically implemented by hardware, software or a combination thereof.
- the computer program product is embodied as a computer storage medium, and in another optional embodiment, the computer program product is embodied as a software product, such as a software development kit (Software Development Kit, SDK).
- Embodiments of the present disclosure relate to a pose determination method and apparatus, an electronic device, and a computer-readable storage medium.
- The method includes: determining, according to an estimated pose of a traveling device in a line segment map, an estimated pose of the radar of the traveling device in the line segment map, where objects in the line segment map are represented by line segments; matching, according to the estimated pose of the radar, a point cloud collected by the radar against line segments in the line segment map, and determining a matched pose of the radar in the line segment map according to the matching result; and determining a target pose of the traveling device in the line segment map according to the matched pose of the radar.
- The embodiments of the present disclosure can reduce the amount of computation in the pose determination process and improve the efficiency of pose determination for the traveling device.
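The abstract describes matching a radar point cloud against map line segments and recovering the radar pose from the resulting transformation, but does not fix a particular matching algorithm. As a hedged illustration only, the sketch below uses a 2-D point-to-line ICP-style loop; the function name `point_to_line_icp`, the `(x, y, theta)` pose convention, and the SVD-based rigid alignment are assumptions for this sketch, not details taken from the disclosure.

```python
import numpy as np

def point_to_line_icp(points, segments, pose0, iters=20):
    """Refine a 2-D radar pose by matching a scan to map line segments.

    points:   (N, 2) NumPy array of scan points in the radar frame
    segments: list of ((x1, y1), (x2, y2)) line segments in the map frame
    pose0:    estimated radar pose (x, y, theta) in the map frame
    Returns a refined (x, y, theta) pose.
    """
    x, y, th = pose0
    for _ in range(iters):
        c, s = np.cos(th), np.sin(th)
        R = np.array([[c, -s], [s, c]])
        world = points @ R.T + np.array([x, y])    # scan points in the map frame

        # Associate each point with the closest point on the nearest segment.
        targets = np.empty_like(world)
        for i, p in enumerate(world):
            best_q, best_d = None, np.inf
            for a, b in segments:
                a = np.asarray(a, dtype=float)
                b = np.asarray(b, dtype=float)
                ab = b - a
                t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
                q = a + t * ab                     # foot point on the segment
                d = np.linalg.norm(p - q)
                if d < best_d:
                    best_q, best_d = q, d
            targets[i] = best_q

        # Best rigid transform from current points to their targets (Kabsch).
        mu_p, mu_q = world.mean(axis=0), targets.mean(axis=0)
        H = (world - mu_p).T @ (targets - mu_q)
        U, _, Vt = np.linalg.svd(H)
        Rd = Vt.T @ U.T
        if np.linalg.det(Rd) < 0:                  # reject reflections
            Vt[-1] *= -1
            Rd = Vt.T @ U.T
        td = mu_q - Rd @ mu_p

        # Compose the incremental transform onto the pose estimate.
        th += np.arctan2(Rd[1, 0], Rd[0, 0])
        x, y = Rd @ np.array([x, y]) + td
    return x, y, th
```

In the terms of the abstract, the accumulated incremental rigid transform plays the role of the transformation that is applied to the estimated radar pose to obtain the matched pose; a production implementation would typically add outlier rejection and a spatial index over the segments.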
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Robotics (AREA)
- Navigation (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
Claims (21)
- A pose determination method, comprising: determining, according to an estimated pose of a traveling device in a line segment map, an estimated pose of a radar of the traveling device in the line segment map, where objects in the line segment map are represented by line segments; matching, according to the estimated pose of the radar, a point cloud collected by the radar against line segments in the line segment map, and determining, according to the matching result, a matched pose of the radar in the line segment map; and determining, according to the matched pose of the radar, a target pose of the traveling device in the line segment map.
- The method according to claim 1, wherein determining the estimated pose of the radar of the traveling device in the line segment map according to the estimated pose of the traveling device in the line segment map comprises: obtaining an estimated pose of the traveling device at a second time by interpolation using estimated poses of the traveling device at a plurality of first times, the second time including the time at which the radar collects the point cloud; and obtaining the estimated pose of the radar at the second time according to the estimated pose of the traveling device at the second time.
- The method according to claim 1 or 2, wherein before matching the point cloud collected by the radar against the line segments in the line segment map, the method further comprises: determining, according to the heading in the estimated pose of the traveling device, target line segments in the line segment map that satisfy a first preset condition; and matching the point cloud collected by the radar against the line segments in the line segment map comprises: matching the point cloud collected by the radar against the target line segments.
- The method according to claim 3, wherein the first preset condition comprises: the absolute value of the angle between a line segment and the heading is less than a first threshold; or the distance between a line segment and a first line segment is less than a distance threshold, the first line segment being a line segment for which the absolute value of its angle with the heading is less than a second threshold.
- The method according to any one of claims 1 to 4, wherein matching, according to the estimated pose of the radar, the point cloud collected by the radar against the line segments in the line segment map, and determining, according to the matching result, the matched pose of the radar in the line segment map comprises: determining, according to the estimated pose of the radar, coordinates of the point cloud scanned by the radar in the line segment map; determining a target transformation matrix according to the coordinates of the point cloud in the line segment map and the line segments in the line segment map, wherein the point cloud at the coordinates, after being translated and rotated by the target transformation matrix, matches the line segments in the line segment map; and translating and rotating the estimated pose of the radar by the target transformation matrix to obtain the pose of the radar in the line segment map.
- The method according to any one of claims 1 to 5, wherein determining the target pose of the traveling device in the line segment map according to the matched pose of the radar comprises: determining a matched pose of the traveling device according to the matched pose of the radar; and fusing the matched pose of the traveling device with the estimated pose of the traveling device to obtain the target pose of the traveling device in the line segment map.
- The method according to any one of claims 1 to 6, wherein, in a case where the pose determination method is executed periodically, before determining the estimated pose of the radar of the traveling device in the line segment map, the method further comprises: estimating the pose of the traveling device in the line segment map according to the target pose of the traveling device determined in the previous period and sensing data collected by the traveling device in the current period, to obtain the estimated pose of the traveling device in the current period, the sensing data being collected based on motion information of the traveling device.
- The method according to any one of claims 1 to 7, wherein the line segment map is obtained by performing straight-line fitting on an initial map, the initial map comprising at least one of an occupancy grid map and a site design drawing.
- The method according to any one of claims 1 to 8, wherein the traveling device comprises a traveling device based on an embedded platform, the traveling device operates in a sand table, and the line segment map is a map of the sand table.
- A pose determination apparatus, comprising: an estimated pose determination part configured to determine, according to an estimated pose of a traveling device in a line segment map, an estimated pose of a radar of the traveling device in the line segment map, where objects in the line segment map are represented by line segments; a radar pose determination part configured to match, according to the estimated pose of the radar, a point cloud collected by the radar against line segments in the line segment map, and determine, according to the matching result, a matched pose of the radar in the line segment map; and a target pose determination part configured to determine, according to the matched pose of the radar, a target pose of the traveling device in the line segment map.
- The apparatus according to claim 10, wherein the estimated pose determination part is further configured to obtain an estimated pose of the traveling device at a second time by interpolation using estimated poses of the traveling device at a plurality of first times, the second time including the time at which the radar collects the point cloud; and to obtain the estimated pose of the radar at the second time according to the estimated pose of the traveling device at the second time.
- The apparatus according to claim 10 or 11, further comprising: a target line segment determination part configured to determine, according to the heading in the estimated pose of the traveling device, target line segments in the line segment map that satisfy a first preset condition; wherein the radar pose determination part is configured to match the point cloud collected by the radar against the target line segments.
- The apparatus according to claim 12, wherein the first preset condition comprises: the absolute value of the angle between a line segment and the heading is less than a first threshold; or the distance between a line segment and a first line segment is less than a distance threshold, the first line segment being a line segment for which the absolute value of its angle with the heading is less than a second threshold.
- The apparatus according to any one of claims 10 to 13, wherein the radar pose determination part is further configured to determine, according to the estimated pose of the radar, coordinates of the point cloud scanned by the radar in the line segment map; determine a target transformation matrix according to the coordinates of the point cloud in the line segment map and the line segments in the line segment map, wherein the point cloud at the coordinates, after being translated and rotated by the target transformation matrix, matches the line segments in the line segment map; and translate and rotate the estimated pose of the radar by the target transformation matrix to obtain the pose of the radar in the line segment map.
- The apparatus according to any one of claims 10 to 14, wherein the target pose determination part is further configured to determine a matched pose of the traveling device according to the matched pose of the radar, and fuse the matched pose of the traveling device with the estimated pose of the traveling device to obtain the target pose of the traveling device in the line segment map.
- The apparatus according to any one of claims 10 to 15, further comprising: a current estimated pose determination part configured to estimate the pose of the traveling device in the line segment map according to the target pose of the traveling device determined in the previous period and sensing data collected by the traveling device in the current period, to obtain the estimated pose of the traveling device in the current period, the sensing data being collected based on motion information of the traveling device.
- The apparatus according to any one of claims 10 to 16, wherein the line segment map is obtained by performing straight-line fitting on an initial map, the initial map comprising at least one of an occupancy grid map and a site design drawing.
- The apparatus according to any one of claims 10 to 17, wherein the traveling device comprises a traveling device based on an embedded platform, the traveling device operates in a sand table, and the line segment map is a map of the sand table.
- An electronic device, comprising: a processor; and a memory configured to store processor-executable instructions; wherein the processor is configured to invoke the instructions stored in the memory to execute the method according to any one of claims 1 to 9.
- A computer-readable storage medium having computer program instructions stored thereon, wherein the computer program instructions, when executed by a processor, implement the method according to any one of claims 1 to 9.
- A computer program comprising computer-readable code, wherein, in a case where the computer-readable code runs in an electronic device, a processor in the electronic device executes the method according to any one of claims 1 to 9.
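Claim 2 obtains the device pose at the radar scan time by interpolating between estimated poses at surrounding times. The claim does not specify the interpolation scheme; the sketch below assumes simple linear interpolation of position plus shortest-arc interpolation of heading, with a hypothetical `(x, y, theta)` pose convention and function name `interpolate_pose`.

```python
import math

def interpolate_pose(t, t0, pose0, t1, pose1):
    """Estimate the pose (x, y, theta) at time t, with t0 <= t <= t1,
    by interpolating between the poses at times t0 and t1."""
    a = (t - t0) / (t1 - t0)                       # interpolation factor in [0, 1]
    x = pose0[0] + a * (pose1[0] - pose0[0])
    y = pose0[1] + a * (pose1[1] - pose0[1])
    # Interpolate the heading along the shortest arc to handle angle wraparound.
    dth = math.atan2(math.sin(pose1[2] - pose0[2]),
                     math.cos(pose1[2] - pose0[2]))
    th = pose0[2] + a * dth
    return x, y, th
```

The shortest-arc step matters when the heading crosses the ±π boundary between the two first times; naive linear interpolation of the raw angle would swing the wrong way around the circle.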
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011362399.XA CN112433211B (zh) | 2020-11-27 | 2020-11-27 | Pose determination method and apparatus, electronic device, and storage medium |
CN202011362399.X | 2020-11-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022110653A1 true WO2022110653A1 (zh) | 2022-06-02 |
Family
ID=74698665
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/092487 WO2022110653A1 (zh) | 2020-11-27 | 2021-05-08 | Pose determination method and apparatus, electronic device, and computer-readable storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112433211B (zh) |
WO (1) | WO2022110653A1 (zh) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112433211B (zh) * | 2020-11-27 | 2022-11-29 | 浙江商汤科技开发有限公司 | Pose determination method and apparatus, electronic device, and storage medium |
CN113406659A (zh) * | 2021-05-28 | 2021-09-17 | 浙江大学 | Mobile robot position re-identification method based on lidar information |
CN113776533A (zh) * | 2021-07-29 | 2021-12-10 | 北京旷视科技有限公司 | Relocalization method and apparatus for a movable device |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101000507A (zh) * | 2006-09-29 | 2007-07-18 | 浙江大学 | Method for simultaneous localization and mapping of a mobile robot in an unknown environment |
US8369606B2 (en) * | 2010-07-21 | 2013-02-05 | Palo Alto Research Center Incorporated | System and method for aligning maps using polyline matching |
CN103941264A (zh) * | 2014-03-26 | 2014-07-23 | 南京航空航天大学 | Localization method using lidar in an unknown indoor environment |
US20170075355A1 (en) * | 2015-09-16 | 2017-03-16 | Ford Global Technologies, Llc | Vehicle radar perception and localization |
CN109341688A (zh) * | 2018-09-05 | 2019-02-15 | 南京理工大学 | Map-invocation localization algorithm based on construction order |
CN109655805A (zh) * | 2019-01-25 | 2019-04-19 | 南京理工大学 | Lidar localization method based on estimating the overlap length of scanned line segments |
CN110631554A (zh) * | 2018-06-22 | 2019-12-31 | 北京京东尚科信息技术有限公司 | Robot pose determination method and apparatus, robot, and readable storage medium |
CN111325796A (zh) * | 2020-02-28 | 2020-06-23 | 北京百度网讯科技有限公司 | Method and apparatus for determining the pose of a vision device |
CN111522022A (zh) * | 2020-04-20 | 2020-08-11 | 西安电子科技大学 | Dynamic target detection method for a lidar-based robot |
CN112433211A (zh) * | 2020-11-27 | 2021-03-02 | 浙江商汤科技开发有限公司 | Pose determination method and apparatus, electronic device, and storage medium |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
RU2249232C2 (ru) * | 2003-04-15 | 2005-03-27 | ОАО "Корпорация "Фазотрон-НИИР" | Method and device for tracking a radio-contrast object in direction |
WO2013051047A1 (ja) * | 2011-10-03 | 2013-04-11 | 古野電気株式会社 | Display device, display program, and display method |
CN104575075B (zh) * | 2015-01-14 | 2016-09-28 | 合肥革绿信息科技有限公司 | BeiDou-based urban road network vehicle coordinate correction method and device |
US9679216B2 (en) * | 2015-05-07 | 2017-06-13 | The United States Of America As Represented By The Secretary Of The Air Force | Morphological automatic triangle orientation detection |
CN106123890A (zh) * | 2016-06-14 | 2016-11-16 | 中国科学院合肥物质科学研究院 | Robot localization method based on multi-sensor data fusion |
US10192098B2 (en) * | 2016-09-09 | 2019-01-29 | MorphoTrak, LLC | Palm print image matching techniques |
CN107704821B (zh) * | 2017-09-29 | 2020-06-09 | 河北工业大学 | Vehicle pose calculation method for curved roads |
JP6880080B2 (ja) * | 2018-07-02 | 2021-06-02 | ベイジン ディディ インフィニティ テクノロジー アンド ディベロップメント カンパニー リミティッド | Vehicle navigation system using point-cloud-based pose estimation |
CN111316284A (zh) * | 2019-02-13 | 2020-06-19 | 深圳市大疆创新科技有限公司 | Lane line detection method, apparatus, and system, vehicle, and storage medium |
CN109839112B (zh) * | 2019-03-11 | 2023-04-07 | 中南大学 | Underground operation equipment localization method, apparatus, and system, and storage medium |
CN110926485B (zh) * | 2019-11-11 | 2021-10-08 | 华中科技大学 | Mobile robot localization method and system based on straight-line features |
CN111223145A (zh) * | 2020-01-03 | 2020-06-02 | 上海有个机器人有限公司 | Data processing method and system, service apparatus, and storage medium |
CN111590595B (zh) * | 2020-06-30 | 2021-09-28 | 深圳市银星智能科技股份有限公司 | Localization method and apparatus, mobile robot, and storage medium |
CN111929699B (zh) * | 2020-07-21 | 2023-05-09 | 北京建筑大学 | Lidar-inertial odometry and mapping method and system accounting for dynamic obstacles |
CN111983635B (zh) * | 2020-08-17 | 2022-03-29 | 浙江商汤科技开发有限公司 | Pose determination method and apparatus, electronic device, and storage medium |
-
2020
- 2020-11-27 CN CN202011362399.XA patent/CN112433211B/zh active Active
-
2021
- 2021-05-08 WO PCT/CN2021/092487 patent/WO2022110653A1/zh active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN112433211A (zh) | 2021-03-02 |
CN112433211B (zh) | 2022-11-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022110653A1 (zh) | Pose determination method and apparatus, electronic device, and computer-readable storage medium | |
CN108596116B (zh) | Distance measurement method, intelligent control method and apparatus, electronic device, and storage medium | |
CN109870157B (zh) | Method and apparatus for determining vehicle body pose, and mapping method | |
JP7236565B2 (ja) | Position and orientation determination method, apparatus, electronic device, storage medium, and computer program | |
WO2021212964A1 (zh) | Localization method and apparatus, electronic device, and storage medium | |
US9641814B2 (en) | Crowd sourced vision and sensor-surveyed mapping | |
US20180213358A1 (en) | Scene sharing-based navigation assistance method and terminal | |
US20210158560A1 (en) | Method and device for obtaining localization information and storage medium | |
EP3579215A1 (en) | Electronic device for generating map data and operating method therefor | |
TWI767217B (zh) | Coordinate system alignment method and apparatus, electronic device, and computer-readable storage medium | |
WO2019047725A1 (zh) | Flight area planning method and apparatus for an unmanned aerial vehicle, and remote controller | |
KR102277503B1 (ko) | Terminal device, service server, and method for indoor positioning based on object recognition | |
CN113205549A (zh) | Depth estimation method and apparatus, electronic device, and storage medium | |
WO2022110776A1 (zh) | Localization method and apparatus, electronic device, storage medium, computer program product, and computer program | |
CN110865405A (zh) | Fusion localization method and apparatus, mobile device control method, and electronic device | |
CN113094966A (zh) | RF-based virtual motion model for localization using a particle filter | |
WO2020168744A1 (zh) | Vehicle calibration system and method | |
WO2022110785A1 (zh) | Localization method and apparatus, electronic device, storage medium, computer program product, and computer program | |
US20160343156A1 (en) | Information display device and information display program | |
CN109086843B (zh) | Mobile robot navigation method based on two-dimensional codes | |
WO2022237071A1 (zh) | Localization method and apparatus, electronic device, storage medium, and computer program | |
WO2022110777A1 (zh) | Localization method and apparatus, electronic device, storage medium, computer program product, and computer program | |
WO2022110801A1 (zh) | Data processing method and apparatus, electronic device, and storage medium | |
US20240053746A1 (en) | Display system, communications system, display control method, and program | |
CN112162292A (zh) | Point cloud data processing method and apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21896168 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21896168 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 27/11/2024) |