CN112034470A - Cable identification and positioning method based on solid-state area array laser radar - Google Patents
Cable identification and positioning method based on solid-state area array laser radar
- Publication number
- CN112034470A (application CN202010910294.7A)
- Authority
- CN
- China
- Prior art keywords
- point
- points
- line
- robot
- laser radar
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/022—Optical sensing devices using lasers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4802—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
-
- H—ELECTRICITY
- H02—GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
- H02G—INSTALLATION OF ELECTRIC CABLES OR LINES, OR OF COMBINED OPTICAL AND ELECTRIC CABLES OR LINES
- H02G1/00—Methods or apparatus specially adapted for installing, maintaining, repairing or dismantling electric cables or lines
- H02G1/02—Methods or apparatus specially adapted for installing, maintaining, repairing or dismantling electric cables or lines for overhead lines or cables
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Optics & Photonics (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a cable identification and positioning method based on a solid-state area array laser radar, which comprises the following steps: (1) a hemispherical laser radar is slidably mounted in front of the robot, and a low-precision global cable position is obtained through modeling by the hemispherical laser radar; a solid-state area array laser radar is mounted, tilted obliquely upward, on the operating arm of the robot; (2) identifying the branch line; (3) identifying the branch line position form and tail end length; (4) identifying the main line position form. The invention enables the robot to effectively recognize black cables and ensures that the fully automatic workflow can carry out a series of high-precision actions such as wire grabbing, wire stripping and wire hanging.
Description
Technical Field
The invention relates to the field of laser radar detection, in particular to a cable identification and positioning method based on a solid-state area array laser radar.
Background
An aerial live-working robot is a robot capable of carrying out live work on high-altitude distribution network lines, replacing manual work through a remote-control strategy. Compared with the traditional manual live-working mode, it eliminates personal safety risks, multiplies operating efficiency, and keeps people physically isolated from electricity throughout the process, effectively improving the quality and efficiency of live working.
During aerial operation, the live-working robot can complete the wire-grabbing, wire-stripping and wire-hanging actions only if the cable is accurately identified. In the prior art, cable identification is generally realized by global laser modeling positioning or by visual positioning. When the cable is positioned by global laser modeling, the accuracy of the sensor itself is only 3 cm-5 cm, and with system errors taken into account it is difficult to reach 1 cm, so the accuracy required for wire-grabbing, wire-stripping and wire-hanging operations is hard to meet. When the cable is positioned by a visual scheme, sufficient anti-interference capability is difficult to achieve in outdoor conditions such as strong light or night.
Disclosure of Invention
The purpose of the invention is as follows: aiming at the above defects, the invention provides a cable identification and positioning method based on a solid-state area array laser radar, which realizes high-precision identification and extraction of close-range cables in an outdoor environment and enables a robot to automatically complete the wire-grabbing, wire-stripping and wire-hanging actions.
The technical scheme is as follows:
a cable identification and positioning method based on a solid-state area array laser radar comprises the following steps:
(1) a hemispherical laser radar is slidably installed in front of the robot, and a global cable low-precision position is obtained through modeling of the hemispherical laser radar; a solid-state area array laser radar is obliquely and upwards installed on the operation arm of the robot;
(2) identifying branch lines and line grabbing points thereof:
(21) planning the movement of a single-side operation arm of the robot to enable the position of a branch line in the low-precision position of the global cable obtained in the step (1) to enter the visual field range of the solid-state area array laser radar;
(22) carrying out point cloud range filtering, outlier filtering and voxel filtering on point cloud data acquired and provided by the solid-state area array laser radar, and then automatically clustering point clouds;
(23) carrying out automatic seed growth and segmentation cutting on the clustered point cloud, extracting the centroid of each point cloud segment, converting the resulting series of segment point coordinates from the coordinate system of the solid-state area array laser radar to the coordinate system of the robot, identifying the branch line, and determining the branch line grabbing point for line grabbing according to the highest point of the branch line's bend;
(3) identifying branch position morphology and end length:
(31) planning the first operating arm of the robot to grab the branch line grabbing point determined in the step (2); after grabbing, sliding along the branch line towards its tail end, moving the branch line to a position below and parallel to the main line, and moving the second operating arm to a position below the branch line tail end determined in the step (2), so that the branch line tail end lies in the middle of the field of view of the solid-state area array laser radar;
(32) carrying out range filtering, outlier filtering and voxel filtering on point cloud data acquired and provided by the solid-state area array laser radar, and carrying out automatic clustering and segmentation cutting on the point cloud;
(33) performing iterative straight-line fitting on each point cloud segment cut in the step (32) until the number of points participating in the line calculation of each segment no longer changes, and finally extracting the central point of each segment line and the two end points of the whole line segment to form a series of branch line key points representing the branch line position form;
(34) converting a series of key point coordinates from a coordinate system of the solid-state area array laser radar to a coordinate system of the robot, and identifying to obtain a branch position form for threading;
(35) calculating the length from the branch line grabbing point to the branch line tail end according to the line-grabbing point grabbed by the first operating arm and the identified branch line position form, and if this tail end length is not within the threshold range, adjusting the position of the branch line grabbing point until the tail end length is within the threshold range;
(4) recognizing the position form of the main line:
(41) planning the movement of a single-side operation arm of the robot, so that the main line in the low-precision position of the global cable obtained in the step (1) enters the visual field range of the solid-state area array laser radar;
(42) performing range filtering, outlier filtering and voxel filtering on the point cloud according to laser point cloud data acquired by the solid-state area array laser radar, then performing automatic clustering and linear iterative fitting on the point cloud until the number of points participating in linear calculation is not changed, finally extracting a main line and two end points thereof in a certain range on two sides in the middle of a visual field as the position form of the main line, and converting the main line and the two end points into a coordinate system of a robot for wire stripping and wire hanging.
In the steps (2), (3) and (4), the distance from the solid-state area array laser radar to the target cable is 30-70 cm.
In the step (2), the step of automatically clustering the point cloud comprises the following steps:
1) finding a point p10 in space, finding the n points nearest to it with a kdTree, judging the distance from each of the n points to p10, and putting the points p12, p13, p14 … whose distance is less than a threshold r into a class Q;
2) finding a point in the class Q of step 1) and repeating step 1) to obtain an updated class Q; this step is repeated until no new points can be added to class Q;
3) judging whether the number of points in the class Q obtained in step 2) lies between the set minimum and maximum numbers; if not, returning to step 1); if so, the search is completed.
In the step (2), the automatic growing and the segmentation cutting for obtaining the centroid comprises the following steps:
1) generating seeds for the clustered point cloud from one side of the x extreme value, and searching n points nearby by the seeds to serve as a section;
2) removing the points searched in the step 1) in the clustered point cloud;
3) repeating the steps 1) and 2) on the rest point clouds until all the point clouds are cut;
4) carrying out centroid calculation on each segmented point cloud to obtain one point per segment, the whole line segment being represented by the broken line formed by the series of centroids.
In the step (2), the coordinates of the wire identification points are converted as follows:
the coordinate relation of the solid-state area array laser radar relative to the single-side operating arm of the robot is RT_Camera_Flange, the identified cable points are Points_Camera, and the coordinate relation from the single-side operating arm of the robot to the robot coordinate origin is RT_Flange_Robot; the position of the cable target points in the robot coordinate system can then be calculated as Points_Robot = RT_Flange_Robot * RT_Camera_Flange * Points_Camera.
In the step (3), before the step (31), the point cloud is cut off at a boundary obtained by translating the line-grabbing point of the first operating arm 10 cm along the branch line vector direction, so that the point cloud of the grabbing arm does not participate in the calculation.
The iterative fitting of the straight line of step (33) comprises the following steps:
1) performing least square fitting on each segment of point cloud to obtain a linear equation;
2) calculating the distance from each point in the point cloud cluster to the fitted straight line, and eliminating points exceeding a threshold value;
3) using the point cloud obtained after the points exceeding the threshold are removed in the step 2), and repeating the steps 1) and 2) until no point needs to be removed again;
4) the straight-line equation finally obtained from the fitting is the final result.
In the step (3), the length from the branch line grabbing point to the branch line tail end is calculated according to the line-grabbing point grabbed by the first operating arm and the branch line position form obtained in the step (33); if this tail end length is not within the threshold range, the line-grabbing point position of the first operating arm is adjusted and the branch line position form and tail end length are re-identified until the tail end length is within the threshold range; the threshold range of the length from the branch line grabbing point to the branch line tail end is 20-70 cm.
In the step (42), cables within ±12 cm on both sides of the middle of the field of view are extracted as the main line.
The main line direction vector calculated in the step (4) is compared with the main line direction vector obtained from the low-precision global cable position in the step (1); if the difference is more than 10 degrees, the identification is determined to have failed.
Beneficial effects: according to the invention, global laser modeling is carried out with the hemispherical laser radar to obtain the low-precision global cable position, and the solid-state area array laser radar mounted on the robot operating arm is then brought in close to perform the corresponding high-precision cable identification, so that the robot can effectively recognize black cables with an accuracy of ±1 cm, ensuring that the fully automatic workflow can carry out a series of high-precision actions such as wire grabbing, wire stripping and wire hanging.
Drawings
Fig. 1 is a schematic diagram of the installation of the solid-state area array lidar of the present invention.
Detailed Description
The invention is further elucidated with reference to the drawings and the embodiments.
Fig. 1 is a schematic diagram of the installation of the solid-state area array lidar of the present invention. As shown in fig. 1, a hemispherical laser radar is first slidably mounted in front of the robot, and the low-precision global cable position is obtained through the global laser modeling performed by the hemispherical laser radar; the coarse positioning precision is ±5 cm. The solid-state area array laser radar is mounted on an insulating rod at the front end of the robot operating arm, tilted upward by 30 degrees so that the laser does not irradiate the robot body. The two arms of the robot cooperate to place the target cable within the field of view of the solid-state area array laser radar, with the distance from the solid-state area array laser radar to the cable being 30-70 cm (the field of view is 85° × 48°), and different algorithms are adopted for recognizing the branch line grabbing point, the branch line position form, the tail end length and the main line position form, so as to obtain the accurate position of the target cable.
The coordinate transformations for the three actions are performed using the tf tree provided by ROS, transforming the extracted coordinates from the origin of the solid-state area array laser radar to the flange of the robot operating arm (dimensions and angle shown in fig. 1).
Wherein:
1. Identifying the branch line and its line grabbing point
The movement of the single-side operating arm of the robot is planned so that the branch line position obtained from the coarse global laser positioning enters the field of view of the solid-state area array laser radar. From the point cloud data p(x, y, z), p ∈ CameraPoints_robot, acquired and provided by the solid-state area array laser radar, range filtering, outlier filtering and voxel filtering are performed on the point cloud through the interfaces of the Point Cloud Library (PCL), and the point cloud is automatically clustered; automatic seed growth and segmentation cutting are then carried out on the clustered point cloud, the centroid of each point cloud segment is extracted, and finally a coordinate conversion is performed through the tf tree provided by ROS to convert the resulting series of segment point coordinates from the laser coordinate system to the robot coordinate system, identifying the branch line and determining the branch line grabbing point for line grabbing according to the highest point of the branch line's bend.
The range-filtered point cloud simultaneously satisfies p.x ∈ (x_min, x_max), p.y ∈ (y_min, y_max), p.z ∈ (z_min, z_max), where p.x, p.y and p.z denote the three-axis coordinate values of the point p(x, y, z) in the laser coordinate system, and (x_min, x_max), (y_min, y_max), (z_min, z_max) denote the allowed value ranges of those three coordinates. Outlier filtering specifies a number of neighbor points: a point is retained in the point cloud only if it has the specified number of neighbors within a specified radius. Voxel filtering creates a three-dimensional voxel grid over the input point cloud and represents all points inside each voxel by their centroid, thereby down-sampling the dense point cloud.
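As an illustration of the three filters described above, the following minimal Python/numpy sketch applies range filtering, radius-based outlier filtering and voxel down-sampling. It is not the patent's implementation (which uses the PCL interfaces), and the radius, neighbor count and voxel size are assumed values for illustration only.

```python
import numpy as np
from scipy.spatial import cKDTree

def range_filter(points, x_rng, y_rng, z_rng):
    """Keep points whose x, y, z all fall inside the given (min, max) ranges."""
    m = ((points[:, 0] > x_rng[0]) & (points[:, 0] < x_rng[1]) &
         (points[:, 1] > y_rng[0]) & (points[:, 1] < y_rng[1]) &
         (points[:, 2] > z_rng[0]) & (points[:, 2] < z_rng[1]))
    return points[m]

def outlier_filter(points, radius=0.03, min_neighbors=5):
    """Keep a point only if it has at least min_neighbors other points within radius."""
    tree = cKDTree(points)
    counts = np.array([len(tree.query_ball_point(p, radius)) - 1 for p in points])
    return points[counts >= min_neighbors]

def voxel_filter(points, voxel=0.005):
    """Replace all points falling in one voxel by their centroid (down-sampling)."""
    keys = np.floor(points / voxel).astype(np.int64)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    sums = np.zeros((inv.max() + 1, 3))
    cnt = np.zeros(inv.max() + 1)
    np.add.at(sums, inv, points)
    np.add.at(cnt, inv, 1)
    return sums / cnt[:, None]
```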
The automatic clustering step of the point cloud is as follows:
1) finding a point p10 in space, finding the n points nearest to it with a kdTree, judging the distance from each of the n points to p10, and putting the points p12, p13, p14 … whose distance is less than a threshold r into a class Q; the threshold r represents the maximum allowed distance from the point p10 and is set according to the clustering range;
2) finding a point in the class Q of step 1) and repeating step 1) to obtain an updated class Q; this step is repeated until no new points can be added to class Q;
3) judging whether the number of points in the class Q obtained in step 2) lies between the set minimum and maximum numbers; if not, returning to step 1); if so, the search is completed.
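A minimal sketch of the clustering procedure above, using a kd-tree from scipy; the threshold r and the minimum/maximum cluster sizes are illustrative assumptions rather than values taken from the patent.

```python
import numpy as np
from scipy.spatial import cKDTree

def euclidean_cluster(points, r=0.02, min_size=50, max_size=20000):
    """Grow a class Q by repeatedly adding all points within r of any member
    (steps 1-2), then keep only clusters whose size lies between the set
    minimum and maximum numbers (step 3)."""
    tree = cKDTree(points)
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue, cluster = [seed], {seed}
        while queue:                                  # expand Q until no new points can be added
            idx = queue.pop()
            for j in tree.query_ball_point(points[idx], r):
                if j in unvisited:
                    unvisited.remove(j)
                    cluster.add(j)
                    queue.append(j)
        if min_size <= len(cluster) <= max_size:      # size check of step 3)
            clusters.append(points[sorted(cluster)])
    return clusters
```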
The automatic growth, segmentation cutting and centroid extraction comprise the following steps:
1) seeds are generated for the clustered point cloud starting from the side of the x extreme value (the wire in the field of view of the solid-state area array laser radar extends along the x axis, i.e. growth starts from one end of the wire), and the n points nearest to each seed are searched as one segment;
2) removing the points searched in the step 1) in the clustered point cloud;
3) repeating the steps 1) and 2) on the rest point clouds until all the point clouds are cut;
4) carrying out centroid calculation on each segmented point cloud to obtain one point per segment, the whole line segment being represented by the broken line formed by the series of centroids.
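The seed growth, segmentation cutting and centroid extraction above can be sketched as follows; the number of points per segment n is an assumed illustrative value.

```python
import numpy as np

def segment_centroids(cluster, n=30):
    """Walk along the wire from the minimum-x side, cut off the n points nearest
    to the current seed as one segment, and represent each segment by its centroid."""
    pts = cluster.copy()
    centroids = []
    while len(pts) > 0:
        seed = pts[np.argmin(pts[:, 0])]            # seed on the x-extreme side of the wire
        d = np.linalg.norm(pts - seed, axis=1)
        take = np.argsort(d)[:n]                    # n points nearest to the seed = one segment
        centroids.append(pts[take].mean(axis=0))    # centroid of this segment
        pts = np.delete(pts, take, axis=0)          # remove segmented points and repeat
    return np.array(centroids)                      # broken line of centroids along the wire
```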
And (3) converting coordinates of the wire identification points:
the coordinate relation of the solid-state area array laser radar relative to the single-side operating arm of the robot is RT_Camera_Flange, the identified cable points are Points_Camera, and the coordinate relation from the single-side operating arm of the robot to the robot coordinate origin is RT_Flange_Robot; the position of the cable target points in the robot coordinate system can then be calculated as Points_Robot = RT_Flange_Robot * RT_Camera_Flange * Points_Camera.
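A minimal sketch of this coordinate conversion, assuming RT_Flange_Robot and RT_Camera_Flange are given as 4×4 homogeneous transforms and the cable points as an N×3 array.

```python
import numpy as np

def camera_to_robot(points_camera, rt_camera_flange, rt_flange_robot):
    """Apply Points_Robot = RT_Flange_Robot * RT_Camera_Flange * Points_Camera."""
    homo = np.hstack([points_camera, np.ones((len(points_camera), 1))])  # N x 4 homogeneous points
    rt = rt_flange_robot @ rt_camera_flange                              # chained transform to robot frame
    return (rt @ homo.T).T[:, :3]                                        # back to N x 3
```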
2. Identifying the branch line position form and tail end length
The first operating arm of the robot is planned to grab the branch line at the branch line grabbing point determined in step (2); after grabbing, it slides along the branch line towards the tail end and moves the branch line to a position below and parallel to the main line, while the second operating arm is moved to a position below the branch line tail end so that the tail end lies in the middle of the field of view of the solid-state area array laser radar, with the distance from the solid-state area array laser radar to the cable being 30-70 cm (field of view 85° × 48°); the first operating arm provides the line-grabbing point and the branch line vector of the current action. During scanning the grabbing arm can also enter the field of view and generate interfering point cloud; to prevent the grabbing arm's point cloud from participating in the calculation, it is cut off at a boundary obtained by translating the line-grabbing point of the first operating arm a certain distance (10 cm) along the branch line vector direction. Range filtering, outlier filtering and voxel filtering are then performed, the point cloud is automatically clustered and segmented, and iterative straight-line fitting is carried out on each point cloud segment until the number of points participating in the line calculation of each segment no longer changes; finally the central point of each segment line and the two end points of the whole line segment are extracted to form a series of branch line key points representing the branch line position form, and the key point coordinates are converted from the laser coordinate system to the robot coordinate system through the tf transformation, identifying the branch line position form for threading.
The length from the branch line grabbing point to the tail end is calculated according to the line-grabbing point of the robot's first operating arm and the identified branch line position form; if the reserved tail end length at the grabbing point is not within the threshold range, the position of the branch line grabbing point is adjusted to lengthen or shorten the tail end until the tail end length is within the threshold range. The threshold range of the length from the branch line grabbing point to the branch line tail end is 20-70 cm; 35 cm is taken in this embodiment.
The linear iterative fitting comprises the following steps:
1) performing least square fitting on each segment of point cloud to obtain a linear equation;
2) calculating the distance from each point in the point cloud cluster to the fitted straight line, and eliminating points exceeding a threshold value;
3) using the point cloud obtained after the points exceeding the threshold are removed in the step 2), and repeating the steps 1) and 2) until no point needs to be removed again;
4) the straight-line equation finally obtained from the fitting is the final result.
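The iterative straight-line fitting above can be sketched as follows. The least-squares line is obtained here via the principal direction of the segment (a total-least-squares reading of step 1)), and the rejection threshold is an assumed illustrative value.

```python
import numpy as np

def iterative_line_fit(points, dist_thresh=0.01):
    """Fit a 3D line to the segment, drop points farther than dist_thresh from the line,
    and repeat until no point needs to be removed. Returns (point_on_line, direction, inliers)."""
    pts = points.copy()
    while True:
        centroid = pts.mean(axis=0)
        _, _, vt = np.linalg.svd(pts - centroid)      # principal direction = least-squares line direction
        direction = vt[0]
        diff = pts - centroid
        # perpendicular distance of every point to the fitted line
        dist = np.linalg.norm(diff - np.outer(diff @ direction, direction), axis=1)
        keep = dist <= dist_thresh
        if keep.all():                                # stop when nothing is rejected any more
            return centroid, direction, pts
        pts = pts[keep]
```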
The branch line length is calculated from the line-grabbing point of the first operating arm and the computed end point. If the reserved length of the branch line is not within the tail-end reserved length range, the line-grabbing arm is adjusted to lengthen or shorten the tail end, and the branch line tail end is observed again until the accurately calculated length falls within the effective range.
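A minimal sketch of this tail-end length check, assuming the branch line key points are ordered from the grabbing side towards the tail end and using the 20-70 cm threshold stated above.

```python
import numpy as np

def branch_end_length(keypoints, grab_point):
    """Approximate the reserved tail length as the polyline length from the key point
    closest to the grab point out to the branch line end point."""
    start = np.argmin(np.linalg.norm(keypoints - grab_point, axis=1))
    segs = np.diff(keypoints[start:], axis=0)
    return np.linalg.norm(segs, axis=1).sum()

def grab_point_ok(length, lo=0.20, hi=0.70):
    """Check the 20-70 cm threshold (35 cm is the value used in the embodiment)."""
    return lo <= length <= hi
```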
3. Identifying the main line position form
The movement of the single-side operating arm of the robot is planned so that the main line and the predicted main line stripping point obtained from the global laser modeling of the hemispherical laser radar enter the field of view of the solid-state area array laser radar. Range filtering, outlier filtering and voxel filtering are performed on the laser point cloud data acquired by the solid-state area array laser radar, followed by automatic clustering and iterative straight-line fitting of the point cloud until the number of points participating in the line calculation no longer changes; finally the main line and its two end points within ±12 cm on both sides of the middle of the field of view are extracted as the main line position form and converted from the coordinate system of the solid-state area array laser radar to the coordinate system of the robot through the tf transformation, for wire stripping and wire hanging. The precisely calculated main line direction vector is compared with the main line direction vector obtained from the global modeling; if they differ by more than 10 degrees, the identification is considered to have failed and the precise calculation must be repeated.
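A minimal sketch of the direction-vector consistency check above; the absolute value of the dot product is used because the sign of a fitted line direction is arbitrary.

```python
import numpy as np

def direction_mismatch(v_precise, v_global, max_deg=10.0):
    """Return True (re-identification needed) when the angle between the precisely fitted
    main-line direction and the global-modeling direction exceeds the 10-degree limit."""
    cosang = abs(np.dot(v_precise, v_global)) / (np.linalg.norm(v_precise) * np.linalg.norm(v_global))
    ang = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return ang > max_deg
```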
Although the preferred embodiments of the present invention have been described in detail, the present invention is not limited to the details of the foregoing embodiments, and various equivalent changes (such as number, shape, position, etc.) may be made to the technical solution of the present invention within the technical spirit of the present invention, and these equivalent changes are all within the protection scope of the present invention.
Claims (10)
1. A cable identification and positioning method based on a solid-state area array laser radar, characterized by comprising the following steps:
(1) a hemispherical laser radar is slidably installed in front of the robot, and a global cable low-precision position is obtained through modeling of the hemispherical laser radar; a solid-state area array laser radar is obliquely and upwards installed on the operation arm of the robot;
(2) identifying branch lines and line grabbing points thereof:
(21) planning the movement of a single-side operation arm of the robot to enable the position of a branch line in the low-precision position of the global cable obtained in the step (1) to enter the visual field range of the solid-state area array laser radar;
(22) carrying out point cloud range filtering, outlier filtering and voxel filtering on point cloud data acquired and provided by the solid-state area array laser radar, and then automatically clustering point clouds;
(23) carrying out automatic seed growth and segmentation cutting on the clustered point cloud, extracting the centroid of each point cloud segment, converting the resulting series of segment point coordinates from the coordinate system of the solid-state area array laser radar to the coordinate system of the robot, identifying the branch line, and determining the branch line grabbing point for line grabbing according to the highest point of the branch line's bend;
(3) identifying branch position morphology and end length:
(31) planning the first operating arm of the robot to grab the branch line grabbing point determined in the step (2); after grabbing, sliding along the branch line towards its tail end, moving the branch line to a position below and parallel to the main line, and moving the second operating arm to a position below the branch line tail end determined in the step (2), so that the branch line tail end lies in the middle of the field of view of the solid-state area array laser radar;
(32) carrying out range filtering, outlier filtering and voxel filtering on point cloud data acquired and provided by the solid-state area array laser radar, and carrying out automatic clustering and segmentation cutting on the point cloud;
(33) performing linear iterative fitting on each segment of point cloud segmented and cut in the step (32) until the number of points participating in calculation of each segment of point cloud line is not changed, and finally extracting the central point of each segment of line and two end points of the whole line segment to form a series of key points of the branch line to represent the position form of the branch line;
(34) converting a series of key point coordinates from a coordinate system of the solid-state area array laser radar to a coordinate system of the robot, and identifying to obtain a branch position form for threading;
(4) recognizing the position form of the main line:
(41) planning the movement of a single-side operation arm of the robot, so that the main line in the low-precision position of the global cable obtained in the step (1) enters the visual field range of the solid-state area array laser radar;
(42) performing range filtering, outlier filtering and voxel filtering on the point cloud according to laser point cloud data acquired by the solid-state area array laser radar, then performing automatic clustering and linear iterative fitting on the point cloud until the number of points participating in linear calculation is not changed, finally extracting a main line and two end points thereof in a certain range on two sides in the middle of a visual field as the position form of the main line, and converting the main line and the two end points into a coordinate system of a robot for wire stripping and wire hanging.
2. The cable identification and location method according to claim 1, wherein: in the steps (2), (3) and (4), the distance from the solid-state area array laser radar to the target cable is 30-70 cm.
3. The cable identification and location method according to claim 1, wherein: in the step (2), the step of automatically clustering the point cloud comprises the following steps:
1) finding a point p in space, finding the n points nearest to p with a kdTree, judging the distance from each of the n points to p, and placing the points whose distance is less than a threshold r in a class Q;
2) finding a point in the class Q of step 1) and repeating step 1) to obtain an updated class Q; this step is repeated until no new points can be added to class Q;
3) judging whether the number of points in the class Q obtained in step 2) lies between the set minimum and maximum numbers; if not, returning to step 1); if so, the search is completed.
4. The cable identification and location method according to claim 1, wherein: in the step (2), the automatic growing and the segmentation cutting for obtaining the centroid comprises the following steps:
1) generating seeds for the clustered point cloud from one side of the x extreme value, and searching n points nearby by the seeds to serve as a section;
2) removing the points searched in the step 1) in the clustered point cloud;
3) repeating the steps 1) and 2) on the rest point clouds until all the point clouds are cut;
4) carrying out centroid calculation on each segmented point cloud to obtain one point per segment, the whole line segment being represented by the broken line formed by the series of centroids.
5. The cable identification and location method according to claim 1, wherein: in the step (2), the coordinates of the wire identification points are converted as follows:
the coordinate relation of the solid-state area array laser radar relative to the single-side operating arm of the robot is RT_Camera_Flange, the identified cable points are Points_Camera, and the coordinate relation from the single-side operating arm of the robot to the robot coordinate origin is RT_Flange_Robot; the position of the cable target points in the robot coordinate system can then be calculated as Points_Robot = RT_Flange_Robot * RT_Camera_Flange * Points_Camera.
7. The cable identification and location method according to claim 1, wherein: in the step (3), before the step (31), the point cloud is cut off at a boundary obtained by translating the line-grabbing point of the first operating arm 10 cm along the branch line vector direction, so that the point cloud of the grabbing arm does not participate in the calculation.
7. The cable identification and location method according to claim 1, wherein: the iterative fitting of the straight line of step (33) comprises the following steps:
1) performing least square fitting on each segment of point cloud to obtain a linear equation;
2) calculating the distance from each point in the point cloud cluster to the fitted straight line, and eliminating points exceeding a threshold value;
3) using the point cloud obtained after the points exceeding the threshold are removed in the step 2), and repeating the steps 1) and 2) until no point needs to be removed again;
4) the straight-line equation finally obtained from the fitting is the final result.
8. The cable identification and location method according to claim 1, wherein: in the step (3), the length from the branch line grabbing point to the branch line tail end is calculated according to the line-grabbing point grabbed by the first operating arm and the branch line position form obtained in the step (33); if this tail end length is not within the threshold range, the line-grabbing point position of the first operating arm is adjusted and the branch line position form and tail end length are re-identified until the tail end length is within the threshold range; the threshold range of the length from the branch line grabbing point to the branch line tail end is 20-70 cm.
9. The cable identification and location method according to claim 1, wherein: in the step (42), cables within ±12 cm on both sides of the middle of the field of view are extracted as the main line.
10. The cable identification and location method according to claim 1, wherein: the main line direction vector calculated in the step (4) is compared with the main line direction vector obtained from the low-precision global cable position in the step (1); if the difference is more than 10 degrees, the identification is determined to have failed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010910294.7A CN112034470B (en) | 2020-09-02 | 2020-09-02 | Cable identification and positioning method based on solid-state area array laser radar |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010910294.7A CN112034470B (en) | 2020-09-02 | 2020-09-02 | Cable identification and positioning method based on solid-state area array laser radar |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112034470A true CN112034470A (en) | 2020-12-04 |
CN112034470B CN112034470B (en) | 2022-10-18 |
Family
ID=73591215
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010910294.7A Active CN112034470B (en) | 2020-09-02 | 2020-09-02 | Cable identification and positioning method based on solid-state area array laser radar |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112034470B (en) |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103473734A (en) * | 2013-09-16 | 2013-12-25 | 南京大学 | Power line extracting and fitting method based on in-vehicle LiDAR data |
CN106529416A (en) * | 2016-10-18 | 2017-03-22 | 国网山东省电力公司电力科学研究院 | Electric-power line detection method and system based on millimeter wave radar decision tree classification |
CN108985143A (en) * | 2018-05-10 | 2018-12-11 | 国电南瑞科技股份有限公司 | A method of based on unmanned plane image recognition iron towers of overhead power transmission lines structure |
CN109446640A (en) * | 2018-10-25 | 2019-03-08 | 国网河南省电力公司濮阳供电公司 | A kind of transmission line of electricity power line modeling extracting method based on laser point cloud |
CN111542828A (en) * | 2018-11-21 | 2020-08-14 | 深圳市大疆创新科技有限公司 | Line recognition method, line recognition device, line recognition system, and computer storage medium |
CN111104861A (en) * | 2019-11-20 | 2020-05-05 | 广州极飞科技有限公司 | Method and apparatus for determining position of electric wire and storage medium |
CN111508020A (en) * | 2020-03-23 | 2020-08-07 | 北京国电富通科技发展有限责任公司 | Cable three-dimensional position calculation method and device fusing image and laser radar |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114509044A (en) * | 2022-01-26 | 2022-05-17 | 成都唐源电气股份有限公司 | System and method for continuously measuring geometrical parameters of contact net |
CN114782529A (en) * | 2022-03-25 | 2022-07-22 | 国网湖北省电力有限公司电力科学研究院 | High-precision positioning method and system for line grabbing point of live working robot and storage medium |
CN116787466A (en) * | 2023-08-21 | 2023-09-22 | 福建大观电子科技有限公司 | Insulation sleeve-based drainage wire identification method, storage medium and robot |
CN116787466B (en) * | 2023-08-21 | 2023-11-07 | 福建大观电子科技有限公司 | Insulation sleeve-based drainage wire identification method, storage medium and robot |
Also Published As
Publication number | Publication date |
---|---|
CN112034470B (en) | 2022-10-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112034470B (en) | Cable identification and positioning method based on solid-state area array laser radar | |
CN106970375B (en) | Method for automatically extracting building information from airborne laser radar point cloud | |
CN112033414A (en) | Unmanned aerial vehicle inspection route generation method, device, equipment and medium | |
AU2020104291A4 (en) | Singletree segmentation method based on chord angle discriminant clustering for layered LIDAR point cloud | |
CN110340891A (en) | Mechanical arm positioning grasping system and method based on cloud template matching technique | |
CN101672915B (en) | High spatial resolution remote sensing image crown outline delineation system and method | |
CN112034481B (en) | Automatic cable identification method based on reflective sticker and laser radar | |
CN112883878A (en) | Automatic point cloud classification method under transformer substation scene based on three-dimensional grid | |
CN107679458B (en) | Method for extracting road marking lines in road color laser point cloud based on K-Means | |
CN110060256B (en) | Pole and tower extraction method based on airborne LiDAR point cloud | |
CN112414403B (en) | Robot positioning and attitude determining method, equipment and storage medium | |
CN113358129B (en) | Obstacle avoidance shortest path planning method based on Voronoi diagram | |
CN115562348A (en) | Unmanned aerial vehicle image technology method based on transformer substation | |
CN112578405A (en) | Method and system for removing ground based on laser radar point cloud data | |
CN113838059B (en) | Element level-based digital orthographic image generation method | |
CN115272815A (en) | Cable tunnel environment abnormity identification method based on image | |
CN117422826A (en) | Method and system for constructing digital twin body of power grid equipment | |
CN113689504A (en) | Point cloud accurate positioning method and device based on describable shape and storage medium | |
CN112558091A (en) | Real-time detection method and device for spatial distance of power transmission line to tree and terminal equipment | |
KR20050078670A (en) | Method for auto-detecting edges of building by using lidar data | |
CN116071530B (en) | Building roof voxelized segmentation method based on airborne laser point cloud | |
CN112008730B (en) | Sun position identification and avoidance method based on solid-state area array laser radar | |
Mayura et al. | Building detection from LIDAR point cloud data | |
CN113433532B (en) | Laser radar attitude calibration method and device based on particle swarm algorithm | |
Jian et al. | Position guidance method of live-working robot based on lidar point-cloud |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |