CN112548654A - Cylindrical material butt joint method and device based on 3D camera - Google Patents
- Publication number
- CN112548654A CN112548654A CN202110217670.9A CN202110217670A CN112548654A CN 112548654 A CN112548654 A CN 112548654A CN 202110217670 A CN202110217670 A CN 202110217670A CN 112548654 A CN112548654 A CN 112548654A
- Authority
- CN
- China
- Prior art keywords
- camera
- point cloud
- cylindrical material
- cylinder
- mobile robot
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B23—MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
- B23Q—DETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
- B23Q7/00—Arrangements for handling work specially combined with or arranged in, or specially adapted for use in connection with, machine tools, e.g. for conveying, loading, positioning, discharging, sorting
- B23Q7/04—Arrangements for handling work specially combined with or arranged in, or specially adapted for use in connection with, machine tools, e.g. for conveying, loading, positioning, discharging, sorting by means of grippers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1669—Programme controls characterised by programming, planning systems for manipulators characterised by special application, e.g. multi-arm co-operation, assembly, grasping
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The application discloses a 3D camera-based cylindrical material docking method and device. The method comprises the following steps: collecting three-dimensional point cloud data of a cross section of a cylindrical material through two 3D cameras, wherein the two 3D cameras are respectively arranged on two sides of the cylindrical material and their fields of view cover the entire cross section; converting the three-dimensional point cloud data into the coordinate system of the autonomous mobile robot according to the transformation between the 3D camera coordinate system and the autonomous mobile robot coordinate system; preprocessing the converted three-dimensional point cloud data; segmenting the preprocessed data to obtain the point cloud of the central area of the cylindrical material and fitting it to obtain a geometric cylinder; obtaining the circle center position and the axis position of the geometric cylinder; and sending the circle center position and the axis position to the autonomous mobile robot, whose motion control completes the docking process.
Description
Technical Field
The application relates to the field of automation and intelligent manufacturing, and in particular to a 3D camera-based cylindrical material docking method and device.
Background
As China's demographic dividend gradually disappears, labor cost has become a heavy burden for manufacturing enterprises. Many current production lines have a low degree of automation and intelligence: handling, loading and unloading, and alignment of heavy materials are still completed manually, and the associated labor cost rises year by year. Manual operation also suffers from fatigue, error-proneness, and low efficiency. In manufacturing, the need to replace manual work with automated, intelligent robots is therefore increasingly urgent.
In industries such as printing, papermaking, new energy, and automobile manufacturing, reel-type raw materials are common. They are bulky and heavy, weighing up to a ton or more, and are essentially cylindrical or near-cylindrical. During feeding, such a material must be lifted manually or by forklift and accurately placed onto a mechanical clamp on a designated support table. Because the machine demands high placement precision, workers often spend considerable effort and a long time completing the docking operation. At present no automatic alignment scheme for cylindrical materials is used in the industry, resulting in low efficiency, slow operation, and safety risks. Under these circumstances, the demand for automation and intelligence is even higher.
Disclosure of Invention
The embodiments of the application aim to provide a 3D camera-based cylindrical material docking method and device, so as to solve the problems of low efficiency, low speed and high risk in the related art.
According to a first aspect of the embodiments of the present application, there is provided a 3D camera-based cylindrical material docking method, where the cylindrical material is carried on an autonomous mobile robot, the method comprising:
an acquisition step of collecting three-dimensional point cloud data of a cross section of the cylindrical material through two 3D cameras, wherein the two 3D cameras are respectively arranged on two sides of the cylindrical material, the fields of view of the two 3D cameras cover the entire cross section, and the cross section comprises the material and its circular hole;
a coordinate conversion step of converting the three-dimensional point cloud data into the coordinate system of the autonomous mobile robot according to the transformation between the 3D camera coordinate system and the autonomous mobile robot coordinate system;
a preprocessing step of preprocessing the converted three-dimensional point cloud data;
a segmentation and fitting step of segmenting the preprocessed three-dimensional point cloud data to obtain the point cloud of the central area of the cylindrical material, and fitting the point cloud to obtain a geometric cylinder;
a calculating step of obtaining the circle center position and the axis position of the geometric cylinder from the fitted cylinder;
and a control step of sending the circle center position and the axis position to the autonomous mobile robot and completing the docking process through the motion control of the autonomous mobile robot.
Further, the two 3D cameras are respectively arranged on two sides of the cylindrical material and their fields of view cover the entire cross section of the cylindrical material, which includes:
installing the 3D cameras below and outside the two sides of the cylindrical material, with each lens facing the outer edge of the bottom of the cylindrical material, so that each 3D camera can capture data of the center and the bottom arc surface of the cylindrical material.
Further, a 3D camera refers to a sensor that can acquire scene depth information in addition to a two-dimensional image.
Further, when a 3D camera photographs the cylindrical material, it can simultaneously acquire cross-section information and bottom information of the cylindrical material, wherein the bottom information assists positioning by roughly determining the vertical position of the center of the cylindrical material, and the cross-section information is used for precise positioning, yielding accurate 3D position information of the center of the cylindrical material.
Further, the transformation relation between the 3D camera coordinate system and the autonomous mobile robot coordinate system is obtained by calibrating the 3D camera and the autonomous mobile robot.
Further, preprocessing the three-dimensional point cloud data comprises the following steps:
denoising the three-dimensional point cloud data with a filtering algorithm;
after denoising, downsampling the three-dimensional point cloud data to remove part of the noise while keeping the point cloud uniformly distributed;
after downsampling, retaining the useful point cloud data at the bottom of the cylinder and removing remaining interference points using normal vector information.
Further, the filtering algorithm includes bilateral filtering and median filtering for removing noise points.
Further, obtaining the circle center position and the axis position of the geometric cylinder from the geometric cylinder includes:
obtaining the circle center positions at the edges of the two ends of the cylindrical material from the geometric cylinder;
computing one candidate axis from the left point cloud of the central area of the cylindrical material, a second candidate axis from the right point cloud of the central area, and a third candidate axis by connecting the circle center positions of the two ends; among these three candidate axes, taking the average of the two axes with the smallest angular deviation as the circle center position and the axis position.
Further, the method also includes:
an iteration control step of executing the acquisition step through the control step again according to the current result of the motion control of the autonomous mobile robot; if the positional deviation of the circle center position and the axis position is smaller than a set threshold, the docking has succeeded and the motion control is stopped; otherwise, the acquisition step through the control step are repeated until the docking succeeds.
According to a second aspect of the embodiments of the present application, there is provided a 3D camera-based cylindrical material docking apparatus, the cylindrical material being carried on an autonomous mobile robot, the apparatus comprising:
an acquisition module for collecting three-dimensional point cloud data of a cross section of the cylindrical material through two 3D cameras, wherein the two 3D cameras are respectively arranged on two sides of the cylindrical material, the fields of view of the two 3D cameras cover the entire cross section, and the cross section comprises the material and its circular hole;
the coordinate conversion module is used for converting the three-dimensional point cloud data into the coordinate system of the autonomous mobile robot according to the conversion relation between the coordinate system of the 3D camera and the coordinate system of the autonomous mobile robot;
the preprocessing module is used for preprocessing the three-dimensional point cloud data after conversion;
the segmentation fitting module is used for segmenting the preprocessed three-dimensional point cloud data to obtain a point cloud of a central area of the cylindrical material, and fitting the point cloud to obtain a geometric cylinder;
the calculation module is used for obtaining the circle center position and the axis position of the geometric cylinder according to the geometric cylinder;
and the control module is used for sending the circle center position and the axis position to the autonomous mobile robot, and realizing the butt joint process through the motion control of the autonomous mobile robot.
The technical scheme provided by the embodiment of the application can have the following beneficial effects:
according to the embodiment, the 3D cameras arranged on the two sides of the autonomous mobile robot are used for automatically identifying the positions of the materials and the butt joint shaft, and the autonomous mobile robot is guided to complete motion control, so that the automation of the whole process of material feeding and alignment is achieved, the operation efficiency is improved, and the manpower participation is reduced.
In order to improve the production efficiency, reduce the labor participation and improve the industrial automation degree, the method can quickly and efficiently extract the accurate positions of the center and the edge of the cylinder. Meanwhile, through the configuration of the double 3D cameras, the accurate position and posture of the whole cylindrical material can be obtained, and accurate information is provided for automatic alignment.
According to the method, the 3D camera is used as a sensor and fixed on two sides of an Autonomous Mobile Robot (AMR), and materials are lifted at the top of the AMR, so that the 3D camera can observe a circular hole area in the material roll. Two 3D cameras (the cameras generally adopt TOF or structured light principle) are placed below the two ends of the material, and the two ends of the cylinder are shot upwards at the same time to obtain 3D point cloud data.
Due to the existence of noise factors such as light materials and the like, the method firstly obtains clean cylindrical point cloud in a denoising mode, and then processes the point cloud. The process from coarse positioning to fine positioning of three-dimensional geometry is carried out through an algorithm, the position and the posture of the whole cylinder are obtained, and the highest positioning precision is within 0.5 mm. Then, the invention sends the relevant position and posture information to an Autonomous Mobile Robot (AMR) to guide the robot to finish position adjustment and finish the process of aligning and feeding.
The method can effectively replace manual heavy and long butt joint work. And 3D sensor data and an intelligent algorithm are combined, so that the efficiency is higher and the safety is higher.
The method of the invention combines the autonomous mobile robot, realizes the functions of lifting and accurate butt joint of the robot, and ensures that the robot can not only move autonomously, but also realize the function of 'hands' and finish relatively fine work.
The method of the invention can be compatible with materials with different sizes and positions to a certain extent, really realizes the automation and intellectualization of the operation on the production line and completes the process of robot changing.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
Fig. 1 is a flowchart illustrating a 3D camera-based cylinder material docking method according to an exemplary embodiment.
Fig. 2 is a schematic diagram illustrating an autonomous mobile robot carrying a cylindrical material and the 3D camera installation, according to an exemplary embodiment.
Fig. 3 is a diagram illustrating the material location where a 3D camera acquires three-dimensional point cloud data, together with the fitted result, according to an exemplary embodiment.
Fig. 4 is a block diagram illustrating a 3D camera-based cylinder material docking apparatus according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
Fig. 1 is a flowchart illustrating a 3D camera-based cylinder material docking method according to an exemplary embodiment. Referring to fig. 1, an embodiment of the present invention provides a 3D camera-based method for docking a cylindrical material carried on an Autonomous Mobile Robot (AMR), including:
an acquisition step S101 of collecting three-dimensional point cloud data of a cross section of the cylindrical material through two 3D cameras, wherein the two 3D cameras are respectively arranged on two sides of the cylindrical material, the fields of view of the two 3D cameras cover the entire cross section, and the cross section comprises the material and its circular hole;
a coordinate conversion step S102 of converting the three-dimensional point cloud data into the coordinate system of the autonomous mobile robot according to the transformation between the 3D camera coordinate system and the autonomous mobile robot coordinate system;
a preprocessing step S103 of preprocessing the converted three-dimensional point cloud data;
a segmentation and fitting step S104 of segmenting the preprocessed three-dimensional point cloud data to obtain the point cloud of the central area of the cylindrical material, and fitting the point cloud to obtain a geometric cylinder;
a calculating step S105 of obtaining the circle center position and the axis position of the geometric cylinder from the fitted cylinder;
and a control step S106 of sending the circle center position and the axis position to the autonomous mobile robot and completing the docking process through the motion control of the autonomous mobile robot.
Through the above steps, the autonomous mobile robot (AMR) forms a control loop from the position deviation calculated via the 3D cameras, which guides the robot's motion, replaces manual docking and feeding, and achieves higher docking precision. The 3D cameras serve as the robot's "eyes", quickly feeding back the position adjustment needed at the current pose; the AMR acts as the "arms", lifting the material to complete the docking operation. The whole process resembles manual docking, but its precision and efficiency greatly exceed manual work, putting automation and intelligence into practice.
In the above acquisition step S101, the two 3D cameras are respectively arranged on two sides of the cylindrical material and their fields of view cover the entire cross section of the cylindrical material, as follows:
As shown in fig. 2, a first 3D camera 2 and a second 3D camera 3 are installed below and outside the two sides of a cylindrical material 1, which is carried on a base 4 of an autonomous mobile robot (AMR). The lenses of the first 3D camera 2 and the second 3D camera 3 respectively face the outer edge of the bottom of the cylindrical material 1, so that both cameras can capture data of the center and the bottom arc surface of the material. The cameras must be mounted stably so that shaking during AMR motion is minimized, which is critical for positioning the material.
In this embodiment, a 3D camera is a sensor that can acquire scene depth information in addition to a two-dimensional image. When a 3D camera photographs the cylindrical material, it simultaneously acquires cross-section information and bottom information: the bottom information assists positioning by roughly determining the vertical position of the center of the cylindrical material, and the cross-section information is used for precise positioning, yielding accurate 3D position information of the center. Note that each camera needs to be kept as horizontal and as stable as possible during the robot's motion.
In the coordinate conversion step S102, the transformation between the 3D camera coordinate system and the autonomous mobile robot coordinate system may be obtained by calibrating the 3D cameras against the autonomous mobile robot. Calibration determines the relative offset between each camera and the AMR; its result is typically a transfer matrix comprising the translational and rotational components between camera and robot, which converts the material's position information from the camera coordinate system into the robot coordinate system. The calibration procedure can follow the camera extrinsic calibration method in OpenCV and is not repeated here. The two cameras are calibrated separately to obtain transfer matrices H1 and H2, which respectively represent each 3D camera's position and orientation relative to the robot center. Detection results can then be uniformly converted into the robot coordinate system, and the material docking process is completed by controlling the robot's motion.
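As an illustrative sketch (not disclosed in the patent itself), applying a calibrated 4x4 transfer matrix such as H1 to a point measured in the camera frame can be written as follows. The matrix values here are hypothetical placeholders, not actual calibration results:

```python
def transform_point(H, p):
    """Apply a 4x4 homogeneous transfer matrix H to a 3D point p,
    converting it from the camera frame into the robot (AMR) frame."""
    x, y, z = p
    v = (x, y, z, 1.0)
    # Only the first three rows are needed for the transformed point.
    return tuple(sum(a * b for a, b in zip(row, v)) for row in H[:3])

# Hypothetical H1: identity rotation, translation of (0.5, -0.2, 1.0) metres.
H1 = [
    [1.0, 0.0, 0.0, 0.5],
    [0.0, 1.0, 0.0, -0.2],
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 0.0, 0.0, 1.0],
]

robot_point = transform_point(H1, (1.0, 2.0, 3.0))  # approx. (1.5, 1.8, 4.0)
```

In practice H1 and H2 would come from the extrinsic calibration, and each camera's detections would be mapped through its own matrix so that both point clouds share the robot frame.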
In the preprocessing step S103, preprocessing the three-dimensional point cloud data includes:
(1) denoising the three-dimensional point cloud data with a filtering algorithm, where the filtering algorithm includes bilateral filtering and median filtering for removing noise points;
(2) after denoising, downsampling the three-dimensional point cloud data to remove part of the noise while keeping the point cloud uniformly distributed;
(3) after downsampling, retaining the useful point cloud data at the bottom of the cylinder and removing remaining interference points using normal vector information.
Preprocessing the three-dimensional point cloud data completes the 3D denoising process. By reducing noise, this step produces more accurate 3D data of the material and provides the input for the subsequent precise processing.
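The downsampling stage of the preprocessing above can be illustrated with a minimal sketch, under the assumption that a simple voxel-grid average is acceptable (the patent does not specify which downsampling algorithm is used):

```python
import math
from collections import defaultdict

def voxel_downsample(points, voxel=0.005):
    """Replace all points falling into the same cubic voxel (default
    5 mm) by their centroid, keeping the cloud uniformly distributed."""
    buckets = defaultdict(list)
    for p in points:
        key = tuple(int(math.floor(c / voxel)) for c in p)
        buckets[key].append(p)
    centroids = []
    for pts in buckets.values():
        n = len(pts)
        centroids.append(tuple(sum(c) / n for c in zip(*pts)))
    return centroids

# Two nearby points collapse into one centroid; the distant point survives.
cloud = [(0.0, 0.0, 0.0), (0.002, 0.001, 0.0), (0.1, 0.1, 0.1)]
reduced = voxel_downsample(cloud)  # 2 points remain
```

Averaging within each voxel both thins the cloud and suppresses some residual noise, matching the twin goals stated in step (2).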
In the segmentation and fitting step S104, the preprocessed three-dimensional point cloud data is segmented to obtain the point cloud of the central area of the cylindrical material, which is then fitted to obtain a geometric cylinder, as follows:
First, the point cloud data is segmented. Since the radius of the material is known in advance, the segmentation results are preliminarily filtered. After filtering, one or several point cloud clusters that may correspond to the hollow central area of the material are obtained. If this step yields no result, a detection failure is reported and the positioning result is treated as abnormal.
Second, a cylindrical region is located according to geometric features such as size and normal vectors, and an optimized fit is performed. Because the main axis direction and the approximate position of the cylinder are known, a geometric cylinder model is fitted by least squares. The cylinder model can be represented by a straight line and a radius, where the straight line is the axis of the cylinder and the radius is the radius of the cylinder. The fitted radius is compared with the radius known in advance; a large difference indicates a fitting error. Because all normal vectors on the cylinder surface point toward the center of the cross section, the circle center position can be obtained simply by selecting the central area at the bottom of the cylinder, where the 3D camera's point cloud is most reliable, and computing along the normal vectors.
Finally, the circle centers of the two ends of the cylinder are obtained in the same way. Obtaining precise information at both ends through 3D processing is fast and accurate, and is not easily disturbed by external factors such as lighting.
Compared with physical positioning, this 3D vision scheme offers higher efficiency, accuracy and success rate. The process yields the fitted center position of the material. The depth data is acquired by a multi-line scanning 3D camera, so the single-position accuracy for the material is better than 0.5 mm, as shown in fig. 3.
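The patent does not disclose its exact least-squares formulation. As one hedged sketch, the circle center and radius of a cross-section slice can be recovered with an algebraic (Kåsa) least-squares circle fit, shown here in 2D for points projected onto the cross-section plane:

```python
import math

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with
    partial pivoting (sufficient for this well-conditioned fit)."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    x = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        x[i] = (M[i][3] - sum(M[i][j] * x[j] for j in range(i + 1, 3))) / M[i][i]
    return x

def fit_circle(points):
    """Kasa least-squares circle fit: find a, b, c minimising the
    algebraic error of x^2 + y^2 + a*x + b*y + c = 0."""
    Sxx = Sxy = Syy = Sx = Sy = Sxz = Syz = Sz = 0.0
    n = float(len(points))
    for x, y in points:
        z = -(x * x + y * y)  # right-hand side for this point
        Sxx += x * x; Sxy += x * y; Syy += y * y
        Sx += x; Sy += y
        Sxz += x * z; Syz += y * z; Sz += z
    # Normal equations of the linear least-squares problem.
    a, b, c = solve3([[Sxx, Sxy, Sx], [Sxy, Syy, Sy], [Sx, Sy, n]],
                     [Sxz, Syz, Sz])
    cx, cy = -a / 2.0, -b / 2.0
    r = math.sqrt(cx * cx + cy * cy - c)
    return (cx, cy), r
```

The fitted radius can then be compared against the known material radius to reject bad fits, as the step above describes.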
In the calculating step S105, obtaining the circle center position and the axis position of the geometric cylinder includes:
(1) obtaining the circle center positions at the edges of the two ends of the cylindrical material from the geometric cylinder; since the calculation has errors, the axis directions obtained from the two ends may not coincide;
(2) computing one candidate axis from the left point cloud of the central area of the cylindrical material, a second candidate axis from the right point cloud of the central area, and a third candidate axis by connecting the circle center positions of the two ends; among these three candidate axes, taking the average of the two axes with the smallest angular deviation as the circle center position and the axis position. The angular deviation between two axes is obtained from the dot product of their unit vectors.
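The axis-selection rule of step S105 can be sketched as follows (a minimal illustration; the candidate axis values here are made up):

```python
import math

def angle_between(u, v):
    """Angle in radians between two 3D direction vectors,
    via the dot product of their unit vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    c = max(-1.0, min(1.0, dot / (nu * nv)))  # clamp for acos safety
    return math.acos(c)

def pick_axis(axes):
    """From three candidate axes, average the pair with the smallest
    angular deviation and return the result as a unit vector."""
    pairs = [(0, 1), (0, 2), (1, 2)]
    i, j = min(pairs, key=lambda p: angle_between(axes[p[0]], axes[p[1]]))
    avg = tuple((a + b) / 2.0 for a, b in zip(axes[i], axes[j]))
    n = math.sqrt(sum(c * c for c in avg))
    return tuple(c / n for c in avg)

# Two candidates nearly agree; the outlier is discarded by the pairing rule.
candidates = [(1.0, 0.0, 0.0), (0.999, 0.01, 0.0), (0.9, 0.3, 0.1)]
axis = pick_axis(candidates)  # close to (1, 0, 0)
```

Averaging the two most consistent candidates suppresses the per-end fitting error the step mentions.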
The embodiment of the present application provides a 3D camera-based cylindrical material docking method that further includes:
an iteration control step S107 of executing the acquisition step through the control step again according to the current result of the motion control of the autonomous mobile robot; if the positional deviation of the circle center position and the axis position is smaller than a set threshold, the docking has succeeded and the motion control is stopped; otherwise, the acquisition step through the control step are repeated until the docking succeeds.
Because the coordinate system has been converted into the center coordinate system of the robot (AMR), the calculated angular deviation only needs to be input into the robot control system to guide the robot's motion. Steps S101 to S106 are iterated several times; when the deviations on both sides are smaller than a threshold d, the motion stops and the docking process is complete.
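The iterate-until-threshold control described above might look like the following toy sketch. `measure` and `move` are hypothetical callbacks standing in for the vision pipeline (S101-S105) and the AMR motion command (S106), and the 0.5 mm default threshold mirrors the accuracy figure quoted earlier:

```python
def docking_loop(measure, move, threshold=0.0005, max_iters=10):
    """Iterate the measure -> move cycle until both side deviations
    fall below the threshold (metres), mirroring steps S101-S107."""
    for _ in range(max_iters):
        dev_left, dev_right = measure()
        if abs(dev_left) < threshold and abs(dev_right) < threshold:
            return True  # docking succeeded
        move(dev_left, dev_right)
    return False  # gave up after max_iters

# Toy stand-ins: each commanded move cancels 80% of the deviation.
state = {"l": 0.01, "r": -0.008}

def measure():
    return state["l"], state["r"]

def move(dl, dr):
    state["l"] -= 0.8 * dl
    state["r"] -= 0.8 * dr

docked = docking_loop(measure, move)  # converges to True here
```

A real system would replace the stand-ins with the camera pipeline and the AMR drive-wheel controller; the loop structure itself is unchanged.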
At this point, the positions of the center points on both sides of the cylinder have been obtained, and the orientation of the cylinder is obtained by connecting and optimizing them. The deviation of the current cylinder from the target position is calculated through the pre-calibrated transfer matrix H between the camera coordinate system and the robot coordinate system. By adjusting the robot's drive wheels, the robot is guided to bring the cylinder to the alignment target point.
Compared with manual docking, this 3D camera and 3D vision-based docking method is fully automatic: the whole process is completed without manual intervention, which greatly improves efficiency, reduces labor cost, and automates docking and feeding.
Corresponding to the embodiment of the cylinder material docking method based on the 3D camera, the application also provides an embodiment of a cylinder material docking device based on the 3D camera.
Fig. 4 is a block diagram illustrating a 3D camera-based cylinder material docking apparatus according to an exemplary embodiment. Referring to fig. 4, the apparatus may include:
the acquisition module 21 is used for acquiring three-dimensional point cloud data of a section of a cylindrical material through two 3D cameras, wherein the two 3D cameras are respectively arranged on two sides of the cylindrical material, the fields of vision of the two 3D cameras cover the whole section of the cylindrical material, and the section comprises the material and a circular hole part;
the coordinate conversion module 22 is used for converting the three-dimensional point cloud data into the coordinate system of the autonomous mobile robot according to the conversion relation between the coordinate system of the 3D camera and the coordinate system of the autonomous mobile robot;
the preprocessing module 23 is configured to preprocess the three-dimensional point cloud data after conversion;
the segmentation fitting module 24 is configured to perform segmentation processing on the preprocessed three-dimensional point cloud data to obtain a point cloud of a central area of the cylindrical material, and fit the point cloud to obtain a geometric cylinder;
the calculation module 25 is used for obtaining the circle center position and the axis position of the geometric cylinder according to the geometric cylinder;
and the control module 26 is used for sending the circle center position and the axis position to the autonomous mobile robot, and controlling the motion of the autonomous mobile robot to realize the docking process.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement it without inventive effort.
Correspondingly, the present application also provides an electronic device, comprising: one or more processors; and a memory for storing one or more programs; wherein, when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the 3D camera-based cylindrical material docking method described above.
Accordingly, the present application also provides a computer-readable storage medium on which computer instructions are stored, wherein the instructions, when executed by a processor, implement the 3D camera-based cylindrical material docking method described above.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice in the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.
Claims (10)
1. A 3D camera-based cylindrical material docking method, wherein the cylindrical material is carried on an autonomous mobile robot, characterized in that the method comprises the following steps:
an acquisition step of collecting three-dimensional point cloud data of a section of the cylindrical material through two 3D cameras, wherein the two 3D cameras are respectively arranged on two sides of the cylindrical material, the fields of view of the two 3D cameras cover the whole section of the cylindrical material, and the section comprises the material and a circular hole part;
a coordinate conversion step, which is used for converting the three-dimensional point cloud data into the coordinate system of the autonomous mobile robot according to the conversion relation between the coordinate system of the 3D camera and the coordinate system of the autonomous mobile robot;
a preprocessing step, which is used for preprocessing the three-dimensional point cloud data after conversion;
a segmentation fitting step, namely segmenting the preprocessed three-dimensional point cloud data to obtain a point cloud of a central area of the cylindrical material, and fitting the point cloud to obtain a geometric cylinder;
a calculating step, which is used for obtaining the circle center position and the axis position of the geometric cylinder according to the geometric cylinder;
and a control step, namely sending the circle center position and the axis position to the autonomous mobile robot, and controlling the docking process through the motion of the autonomous mobile robot.
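The six claimed steps can be sketched end-to-end in code. Everything below is a hypothetical illustration, not the patented implementation: the function names, the identity calibration, and the reduction of the cylinder fit to a centroid of a synthetic circular section are all assumptions made for the example.

```python
import math

# Hypothetical sketch of the claimed pipeline. The "point cloud" is a
# list of (x, y, z) tuples; the segmentation/fitting step is reduced to
# a centroid estimate of a synthetic circular section for illustration.

def transform_to_robot_frame(points, rotation, translation):
    """Coordinate conversion step: apply a calibrated rigid transform."""
    out = []
    for x, y, z in points:
        rx = sum(rotation[0][i] * p for i, p in enumerate((x, y, z))) + translation[0]
        ry = sum(rotation[1][i] * p for i, p in enumerate((x, y, z))) + translation[1]
        rz = sum(rotation[2][i] * p for i, p in enumerate((x, y, z))) + translation[2]
        out.append((rx, ry, rz))
    return out

def fit_center(points):
    """Fitting step, reduced to a centroid for illustration."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

# Synthetic section: 36 points on a circle of radius 0.5 around (2, 0, 1)
cloud = [(2 + 0.5 * math.cos(t), 0.5 * math.sin(t), 1.0)
         for t in (i * 2 * math.pi / 36 for i in range(36))]

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]       # assumed calibration
robot_cloud = transform_to_robot_frame(cloud, identity, (0.0, 0.0, 0.0))
center = fit_center(robot_cloud)                    # ≈ (2.0, 0.0, 1.0)
```

A real system would replace the centroid with a robust cylinder fit and send `center` to the robot controller in the control step.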
2. The 3D camera-based cylindrical material docking method according to claim 1, wherein arranging the two 3D cameras respectively on two sides of the cylindrical material, with the fields of view of the two 3D cameras covering the whole section of the cylindrical material, comprises:
the 3D cameras are installed at the lower outer sides of the two sides of the cylindrical material, with the lens of each 3D camera facing the outer edge of the bottom of the cylindrical material, so that the 3D cameras can acquire data of the center and the bottom arc surface of the cylindrical material.
3. The 3D camera-based cylindrical material docking method according to claim 1, wherein the 3D camera is a sensor capable of acquiring depth information of a scene in addition to two-dimensional images.
4. The 3D camera-based cylindrical material docking method according to claim 1, wherein, when the 3D camera photographs the cylindrical material, it simultaneously obtains section information and bottom information of the cylindrical material, wherein the bottom information is used for auxiliary positioning, roughly determining the vertical position of the center of the cylindrical material, and the section information is used for accurate positioning, yielding precise 3D position information of the center of the cylindrical material.
5. The cylindrical material docking method based on the 3D camera as claimed in claim 1, wherein the transformation relationship between the 3D camera coordinate system and the autonomous mobile robot coordinate system is obtained by calibrating the 3D camera and the autonomous mobile robot.
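Applying the calibrated transformation can be illustrated with a 4x4 homogeneous matrix. The matrix values below are invented for the example, standing in for a real camera-to-robot calibration result (e.g. from a hand-eye calibration procedure).

```python
# Hedged sketch: mapping a camera-frame point into the robot frame with
# a 4x4 homogeneous transform obtained from calibration.

def apply_homogeneous(T, point):
    """Apply the top three rows of a 4x4 rigid transform to a 3D point."""
    x, y, z = point
    col = (x, y, z, 1.0)
    return tuple(sum(T[r][c] * col[c] for c in range(4)) for r in range(3))

# Assumed calibration result: 90-degree rotation about Z plus a translation
T_cam_to_robot = [
    [0.0, -1.0, 0.0, 0.10],
    [1.0,  0.0, 0.0, 0.20],
    [0.0,  0.0, 1.0, 0.05],
    [0.0,  0.0, 0.0, 1.00],
]

p_robot = apply_homogeneous(T_cam_to_robot, (1.0, 0.0, 0.0))
# the camera-frame point (1, 0, 0) lands at (0.10, 1.20, 0.05) in the robot frame
```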
6. The 3D camera-based cylindrical material docking method according to claim 1, wherein the preprocessing of the three-dimensional point cloud data comprises:
denoising the three-dimensional point cloud data through a filtering algorithm;
after denoising, down-sampling the three-dimensional point cloud data to remove part of the noise while keeping the point cloud uniformly distributed;
and after down-sampling, retaining the useful point cloud data at the bottom of the cylinder and removing interference points again by using normal vector information.
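The down-sampling step can be illustrated with a voxel grid: points are binned into cubic cells and each cell is replaced by the centroid of its points, thinning the cloud while keeping it evenly distributed. The function name and voxel size are assumptions for the example, not part of the claim.

```python
from collections import defaultdict

def voxel_downsample(points, voxel=0.05):
    """Replace each occupied voxel by the centroid of its points."""
    bins = defaultdict(list)
    for p in points:
        key = tuple(int(c // voxel) for c in p)   # voxel index per axis
        bins[key].append(p)
    return [tuple(sum(q[i] for q in pts) / len(pts) for i in range(3))
            for pts in bins.values()]

# 100 points spaced 1 cm apart along x collapse into 5 cm voxels
dense = [(i * 0.01, 0.0, 0.0) for i in range(100)]
sparse = voxel_downsample(dense, voxel=0.05)
```

The subsequent normal-vector check would then discard points whose estimated normals disagree with the expected cylinder-bottom orientation.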
7. The 3D camera-based cylindrical material docking method according to claim 6, wherein the filtering algorithm comprises bilateral filtering and median filtering for filtering out noise.
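The median-filtering idea can be sketched on a one-dimensional strip of depth readings: each sample is replaced by the median of its neighborhood, which suppresses isolated outlier spikes without blurring edges the way a mean filter would. The window size is an assumed parameter.

```python
from statistics import median

def median_filter(depths, window=3):
    """Replace each depth sample by the median of its local window."""
    half = window // 2
    out = []
    for i in range(len(depths)):
        lo, hi = max(0, i - half), min(len(depths), i + half + 1)
        out.append(median(depths[lo:hi]))
    return out

noisy = [1.0, 1.0, 9.0, 1.0, 1.0]   # a single outlier spike at index 2
clean = median_filter(noisy)         # the spike is removed
```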
8. The 3D camera-based cylindrical material docking method according to claim 1, wherein obtaining the circle center position and the axis position of the geometric cylinder according to the geometric cylinder comprises:
obtaining the circle center positions of the edges at the two ends of the cylindrical material according to the geometric cylinder;
calculating a first axis from the left point cloud of the central area of the cylindrical material; calculating a second axis from the right point cloud of the central area of the cylindrical material; obtaining a third axis by connecting the circle center positions of the two ends; and taking, among the first axis, the second axis and the third axis, the average of the two axes with the smallest mutual angular deviation as the circle center position and the axis position.
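One reading of this claim in code (an interpretation for illustration, not the patented implementation): three axis estimates — from the left point cloud, from the right point cloud, and from the line joining the two end centers — are compared pairwise, and the two that agree most closely in direction are averaged.

```python
import math

def angle(u, v):
    """Angle in radians between two 3D direction vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

def fuse_axes(a1, a2, a3):
    """Average the pair of axes with the smallest angular deviation."""
    pairs = [(a1, a2), (a1, a3), (a2, a3)]
    u, v = min(pairs, key=lambda p: angle(*p))
    return tuple((x + y) / 2 for x, y in zip(u, v))

ax_left  = (1.0, 0.00, 0.0)   # estimate from the left point cloud
ax_right = (1.0, 0.01, 0.0)   # estimate from the right point cloud
ax_line  = (1.0, 0.20, 0.0)   # estimate from the line joining the centers
fused = fuse_axes(ax_left, ax_right, ax_line)
# left and right agree most closely, so they are averaged
```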
9. The 3D camera-based cylindrical material docking method according to claim 1, further comprising:
an iteration control step of evaluating the current result of the motion control of the autonomous mobile robot after executing the acquisition step through the control step: if the position difference between the circle center position and the axis position is smaller than a set threshold, the docking is successful and the motion control is stopped; otherwise, the acquisition step through the control step are repeated until the docking succeeds.
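The iteration can be sketched as a measure-move-repeat loop that stops once the remaining offset falls below the threshold. The proportional "move" model and all names below are assumptions; a real system would re-run the full acquisition-through-control pipeline each cycle.

```python
def dock(initial_error, threshold=0.001, gain=0.5, max_iters=50):
    """Iterate motion corrections until the offset is below the threshold."""
    error = initial_error
    iters = 0
    while abs(error) >= threshold and iters < max_iters:
        error -= gain * error   # each control cycle removes part of the offset
        iters += 1
    return abs(error) < threshold, iters

ok, steps = dock(0.1)   # a 10 cm offset converges in a handful of cycles
```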
10. A 3D camera-based cylindrical material docking device, wherein the cylindrical material is carried on an autonomous mobile robot, characterized in that the device comprises:
the acquisition module is used for acquiring three-dimensional point cloud data of a section of a cylindrical material through two 3D cameras, wherein the two 3D cameras are respectively arranged on two sides of the cylindrical material, the fields of view of the two 3D cameras cover the whole section of the cylindrical material, and the section comprises the material and a circular hole part;
the coordinate conversion module is used for converting the three-dimensional point cloud data into the coordinate system of the autonomous mobile robot according to the conversion relation between the coordinate system of the 3D camera and the coordinate system of the autonomous mobile robot;
the preprocessing module is used for preprocessing the three-dimensional point cloud data after conversion;
the segmentation fitting module is used for segmenting the preprocessed three-dimensional point cloud data to obtain a point cloud of a central area of the cylindrical material, and fitting the point cloud to obtain a geometric cylinder;
the calculation module is used for obtaining the circle center position and the axis position of the geometric cylinder according to the geometric cylinder;
and the control module is used for sending the circle center position and the axis position to the autonomous mobile robot, and realizing the butt joint process through the motion control of the autonomous mobile robot.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110217670.9A CN112548654A (en) | 2021-02-26 | 2021-02-26 | Cylindrical material butt joint method and device based on 3D camera |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110217670.9A CN112548654A (en) | 2021-02-26 | 2021-02-26 | Cylindrical material butt joint method and device based on 3D camera |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112548654A true CN112548654A (en) | 2021-03-26 |
Family
ID=75036038
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110217670.9A Pending CN112548654A (en) | 2021-02-26 | 2021-02-26 | Cylindrical material butt joint method and device based on 3D camera |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112548654A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116853515A (en) * | 2023-06-12 | 2023-10-10 | 成都飞机工业(集团)有限责任公司 | Autonomous butt joint method of numerical control locator ball socket and ball head based on 3D camera |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB638102A (en) * | 1947-10-03 | 1950-05-31 | Stuart Williamson | Optical projection apparatus |
CN103616016A (en) * | 2013-11-29 | 2014-03-05 | 大连理工大学 | Visual position-pose measurement method based on point-line combination characteristics |
CN105627917A (en) * | 2014-11-05 | 2016-06-01 | 北京航天计量测试技术研究所 | Large-scale structural component assembly joining measurement method based on visual principle |
CN106624709A (en) * | 2016-12-29 | 2017-05-10 | 南京天祥智能设备科技有限公司 | Assembly system and method based on binocular vision |
CN107170045A (en) * | 2017-03-21 | 2017-09-15 | 国网湖北省电力公司检修公司 | The method being modeled based on cloud data to substation transformer |
CN108694713A (en) * | 2018-04-19 | 2018-10-23 | 北京控制工程研究所 | A kind of the ring segment identification of satellite-rocket docking ring part and measurement method based on stereoscopic vision |
CN111689432A (en) * | 2020-06-08 | 2020-09-22 | 杭州蓝芯科技有限公司 | Automatic feeding system and method for soft package printing equipment |
CN111709131A (en) * | 2020-06-05 | 2020-09-25 | 中国铁道科学研究院集团有限公司基础设施检测研究所 | Tunnel axis determining method and device |
CN112318107A (en) * | 2020-10-23 | 2021-02-05 | 西北工业大学 | Large-scale part hole shaft automatic assembly centering measurement method based on depth camera |
CN112356019A (en) * | 2020-08-06 | 2021-02-12 | 武汉科技大学 | Method and device for analyzing body of target object grabbed by dexterous hand |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110539109B (en) | Robot automatic welding system and method based on single-binocular vision | |
US6379215B1 (en) | Eyeglass lens processing system | |
CN110293559B (en) | Installation method for automatically identifying, positioning and aligning | |
US20150197009A1 (en) | Method for picking up an article using a robot arm and associated system | |
CN110919134A (en) | Tube plate positioning welding method | |
WO2021050646A1 (en) | Robot-mounted moving device, system, and machine tool | |
JP2021035708A (en) | Production system | |
CN111906788B (en) | Bathroom intelligent polishing system based on machine vision and polishing method thereof | |
CN112548654A (en) | Cylindrical material butt joint method and device based on 3D camera | |
CN107804708A (en) | A kind of pivot localization method of placement equipment feeding rotary shaft | |
CN112365502B (en) | Calibration method based on visual image defect detection | |
CN112381827A (en) | Rapid high-precision defect detection method based on visual image | |
CN113500593B (en) | Method for grabbing designated part of shaft workpiece for feeding | |
Ge et al. | Robot welding seam online grinding system based on laser vision guidance | |
CN112873164A (en) | Automatic material handling robot | |
CN115384052A (en) | Intelligent laminating machine automatic control system | |
CN115164752A (en) | Self-adaptive measurement equipment and method for gap and step difference of butt joint of large part | |
CN111127406B (en) | Back plate machining position adjusting method, terminal, system and storage medium | |
CN106826399B (en) | Intelligent deburring method for hub | |
JP7057841B2 (en) | Robot control system and robot control method | |
CN111179255B (en) | Feature recognition method in automatic preparation process of membrane water-cooled wall | |
CN113715935A (en) | Automatic assembling system and automatic assembling method for automobile windshield | |
CN111598945A (en) | Three-dimensional positioning method for automobile engine crankshaft bush cover | |
CN114782533B (en) | Cable drum shaft pose determining method based on monocular vision | |
CN114131660A (en) | Visual guidance imaging device based on teleoperation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20210326 |