CN115284297B - Workpiece positioning method, robot, and robot working method - Google Patents

Workpiece positioning method, robot, and robot working method

Info

Publication number
CN115284297B
Authority
CN
China
Prior art keywords
workpiece
robot
camera
virtual
camera device
Prior art date
Legal status
Active
Application number
CN202211063956.7A
Other languages
Chinese (zh)
Other versions
CN115284297A (en)
Inventor
植美浃
韦卓光
廖伟东
翟军
李俊渊
Current Assignee
Cimc Container Group Co ltd
Shenzhen Qianhai Ruiji Technology Co ltd
China International Marine Containers Group Co Ltd
Original Assignee
Cimc Container Group Co ltd
Shenzhen Qianhai Ruiji Technology Co ltd
China International Marine Containers Group Co Ltd
Priority date
Filing date
Publication date
Application filed by Cimc Container Group Co ltd, Shenzhen Qianhai Ruiji Technology Co ltd, and China International Marine Containers Group Co Ltd
Priority to CN202211063956.7A
Publication of CN115284297A
Application granted
Publication of CN115284297B
Legal status: Active

Links

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1628 Programme controls characterised by the control loop
    • B25J9/163 Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/021 Optical sensing devices
    • B25J19/023 Optical sensing devices including video camera means

Abstract

The application discloses a workpiece positioning method, computer equipment, a computer readable storage medium, a robot and a robot operation method.

Description

Workpiece positioning method, robot, and robot working method
Technical Field
The present application relates to the field of robots, and in particular, to a workpiece positioning method, a computer device, a computer readable storage medium, a robot, and a robot operation method.
Background
In the field of robot operation, the workpiece position is commonly obtained by line laser scanning or by manual teaching and programming, both of which are time-consuming and inefficient. Moreover, the workpiece point cloud data acquired by line laser scanning contains many noise points, so the accuracy is low and the quality and efficiency of robot operation suffer; manual teaching and programming introduces uncontrollable human factors and is prone to accidents.
With the rapid development of robotics and three-dimensional vision technology and the upgrading of intelligent industrial manufacturing, robot vision is increasingly applied in scenarios such as industrial production and the service industry. Guiding robot operation through a vision system is an important means of realizing intelligent robot operation: operation guided by three-dimensional vision is more efficient and locates the workpiece more accurately.
However, in the prior art, the workpiece point cloud data acquired by the camera device is matched directly against the virtual point cloud data corresponding to the workpiece model. The workpiece model includes thickness data, whereas the camera device can only acquire point cloud data within its line of sight, so direct matching cannot guarantee that the matched virtual points actually correspond to the acquired workpiece points, and the matching error is large. As shown in fig. 1, mark a represents the workpiece point cloud contour and mark b represents the virtual point cloud contour obtained directly from the workpiece model; a and b do not lie on the same plane but are separated by a certain distance. That is, the virtual point cloud data does not exactly correspond to the workpiece point cloud data, which results in inaccurate matching.
Summary of the application
In order to solve the problem of inaccurate workpiece positioning, the application provides a workpiece positioning method, computer equipment, a computer readable storage medium, a robot and a robot operation method.
According to one aspect of the embodiments of the application, a workpiece positioning method is disclosed for a scenario in which a robot works on a workpiece, the robot being provided with a camera device. The workpiece positioning method comprises the following steps:
acquiring an actual point cloud of the workpiece under a base coordinate system of the robot based on the workpiece image acquired by the camera device;
acquiring a corresponding workpiece model of the workpiece under the base coordinate system;
constructing a virtual camera and a virtual scene, placing the workpiece model in the virtual scene, and ensuring that the workpiece model lies within the field of view of the virtual camera;
capturing the workpiece model with the virtual camera to obtain a virtual point cloud of the workpiece model under the base coordinate system;
and matching the actual point cloud with the virtual point cloud to obtain the position of the workpiece.
In an exemplary embodiment, the camera device is disposed at an end of the robot; the acquiring the actual point cloud of the workpiece under the base coordinate system of the robot based on the workpiece image acquired by the camera device comprises the following steps:
acquiring a workpiece image acquired by the camera device and a pose matrix of the robot when the camera device acquires the workpiece image;
according to the pose matrix and the hand-eye matrix of the robot, converting coordinates of a point cloud in the workpiece image under a camera coordinate system into the base coordinate system, and obtaining the actual point cloud; the hand-eye matrix represents a conversion relation of a camera coordinate system relative to a tool coordinate system of the robot.
In an exemplary embodiment, the constructing a virtual camera includes:
determining the position of the virtual camera in the virtual scene according to the pose matrix and the hand-eye matrix;
configuring parameters of the virtual camera according to the parameters of the camera device to construct the virtual camera; the parameters include the field of view and the pixel resolution.
In an exemplary embodiment, the configuring the parameters of the virtual camera according to the parameters of the camera device includes:
configuring the field of view of the virtual camera to be consistent with the field of view of the camera device;
configuring the pixel resolution of the virtual camera to be consistent with the pixel resolution of the camera device.
In an exemplary embodiment, the determining the position of the virtual camera in the virtual scene according to the pose matrix and the hand-eye matrix includes:
determining a position of the camera device relative to the robot according to the pose matrix and the hand-eye matrix;
and setting the position of the virtual camera in the virtual scene according to the position of the camera device relative to the robot so that the position of the virtual camera corresponds to the position of the camera device.
In an exemplary embodiment, the acquiring a workpiece model of the workpiece in the base coordinate system includes:
acquiring image data of a plurality of angles of the workpiece;
integrating the image data of the plurality of angles into a three-dimensional model by utilizing three-dimensional modeling software;
and converting the three-dimensional model into the base coordinate system to obtain the workpiece model.
In an exemplary embodiment, said matching said actual point cloud and said virtual point cloud to obtain a position of said workpiece comprises:
and registering the actual point cloud and the virtual point cloud by adopting an iterative closest point algorithm to obtain the position of the workpiece.
According to an aspect of an embodiment of the present application, there is disclosed a robot including:
a robot body having a plurality of axes of motion;
a camera device disposed at the robot body; and
a controller connected to the robot body and the camera device, for controlling the robot body and the camera device and performing the steps of the workpiece positioning method described above.
According to an aspect of an embodiment of the present application, a robot working method is disclosed, the robot being provided with an end tool and a camera device. The working method includes:
controlling the camera device to acquire a workpiece image;
positioning the workpiece by the workpiece positioning method described above to obtain the position of the workpiece;
and controlling the end tool to move to the position of the workpiece for operation.
In one exemplary embodiment, the end tool is a welding gun and the camera device is a three-dimensional camera.
According to an aspect of an embodiment of the present application, a computer apparatus is disclosed for a scenario in which a robot is working on a workpiece, the robot being provided with a camera device. The computer device includes:
the first acquisition module is used for acquiring an actual point cloud of the workpiece under a base coordinate system of the robot based on the workpiece image acquired by the camera device;
the first construction module is used for acquiring a corresponding workpiece model of the workpiece under the base coordinate system;
the second construction module is used for constructing a virtual camera and a virtual scene, placing the workpiece model in the virtual scene, and ensuring that the workpiece model lies within the field of view of the virtual camera;
the second acquisition module is used for capturing the workpiece model with the virtual camera and obtaining a virtual point cloud of the workpiece model under the base coordinate system;
and the point cloud matching module is used for matching the actual point cloud with the virtual point cloud to obtain the position of the workpiece.
According to an aspect of an embodiment of the present application, there is disclosed a computer apparatus including:
one or more processors;
and a memory for storing one or more programs that, when executed by the one or more processors, cause the computer device to implement the aforementioned workpiece positioning method.
According to an aspect of an embodiment of the present application, a computer-readable storage medium storing computer-readable instructions that, when executed by a processor of a computer, cause the computer to perform the aforementioned workpiece positioning method is disclosed.
The technical scheme provided by the embodiment of the application at least comprises the following beneficial effects:
according to the technical scheme provided by the application, the virtual camera and the workpiece model are constructed, the workpiece model in the virtual scene is shot by the virtual camera, the virtual point cloud of the workpiece model under the base coordinate system is obtained, the virtual point cloud is matched with the actual point cloud obtained by the workpiece image acquired by the camera device, the position of the workpiece is determined based on the matching result, the inaccuracy in matching caused by the way of directly matching the actual point cloud with the workpiece model is avoided, the accuracy of real-time positioning of the workpiece is improved, and the quality and efficiency of robot operation are improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 is an interface diagram of a prior art matching of a workpiece point cloud to a workpiece model.
Fig. 2 is a diagram of a robot structure shown in an exemplary embodiment.
FIG. 3 is a flowchart illustrating a method of workpiece positioning, according to an example embodiment.
Fig. 4 is a detailed flowchart of step S101 in the corresponding embodiment of fig. 3.
Fig. 5 is a detailed flowchart of step S102 in the corresponding embodiment of fig. 3.
Fig. 6 is a partial detailed flowchart of step S103 in the corresponding embodiment of fig. 3.
Fig. 7 is a detailed flowchart of step S1031 in the corresponding embodiment of fig. 6.
FIG. 8 is a schematic diagram of a virtual camera and workpiece model according to an example embodiment.
FIG. 9 is an interface diagram of matching an actual point cloud with a virtual point cloud in accordance with an example embodiment.
Fig. 10 is a flowchart illustrating a robotic work method according to an exemplary embodiment.
FIG. 11 is a block diagram illustrating a computer device for implementing an embodiment of the present application, according to an exemplary embodiment.
FIG. 12 is a block diagram illustrating the architecture of a computer system for implementing an embodiment of the present application, according to an exemplary embodiment.
The reference numerals are explained as follows:
100. a robot; 101. a robot body; 102. a camera device; 103. an end tool; 200. a computer device; 201. a first acquisition module; 202. a first construction module; 203. a second construction module; 204. a second acquisition module; 205. a point cloud matching module; 300. a computer system; 301. a CPU; 302. a ROM; 303. a storage section; 304. a RAM; 305. a bus; 306. an I/O interface; 307. an input section; 308. an output section; 309. a communication section; 310. a driver; 311. a removable medium; 401. a virtual camera; 402. a workpiece model.
Detailed Description
While this application is susceptible of embodiment in different forms, specific embodiments are shown in the drawings and described in detail herein, with the understanding that the present disclosure is to be considered an exemplification of the principles of the application and is not intended to limit the application to what is illustrated.
Furthermore, references to the terms "comprising," "including," "having," and any variations thereof in the description of the present application are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or modules is not limited to those steps or modules, but may include other steps or modules not listed or inherent to such process, method, article, or apparatus.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more features.
In the description of the present application, unless otherwise indicated, the meaning of "a plurality" means two or more.
It should be noted that, in the embodiments of the present application, words such as "exemplary" or "illustratively" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of the word "exemplary" or "for example" or "such as" is intended to present the relevant concepts in a concrete manner.
Exemplary embodiments will be described in detail below. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the application; rather, they are merely examples of apparatuses and methods consistent with aspects of the application as detailed herein.
Fig. 2 shows the structure of a robot according to an exemplary embodiment. As shown in fig. 2, the robot 100 includes a robot body 101, a camera device 102, and an end tool 103; the camera device 102 is provided on the robot body 101, and the end tool 103 is provided at the end of the robot body 101. The robot body 101 drives the camera device 102 and the end tool 103 to move, so that the camera device 102 can capture an image of the workpiece at close range; the position of the workpiece is obtained by analyzing the workpiece image, and the end tool 103 is then driven to the position of the workpiece to perform the operation.
It can be appreciated that the robot 100 further includes a controller connected to the robot body 101, the camera device 102, and the end tool 103. The controller controls the movement of the robot body 101, controls the camera device 102 to capture images of the workpiece, analyzes the workpiece images to obtain the position of the workpiece, and then controls the robot body 101 to drive the end tool 103 to the position of the workpiece to perform the operation. It should be understood that the controller may be built into the robot body 101 or provided outside it.
The end tool 103 may be a welding gun, the workpiece is an object to be welded, and the robot 100 is a welding robot. The end tool 103 may be a glue gun, the workpiece is an object to be glued, and the robot 100 is a glue robot. The end tool 103 may also be a knife, the workpiece is an object to be cut, and the robot 100 is a cutting robot. Of course, the end tool 103 may be another tool that may be provided at the end of the robot 100 and driven by the robot 100 to perform work, and is not limited to the welding gun, the glue gun, the cutter, and the like.
The embodiment of the application provides a workpiece positioning method, computer equipment, a computer readable storage medium, a robot and a robot operation method, which can accurately position a workpiece in real time, thereby improving the quality and efficiency of robot operation.
The workpiece positioning method, the computer device, the computer readable storage medium, the robot and the robot operation method provided by the embodiment of the application are specifically described by the following embodiments, and the workpiece positioning method in the embodiment of the application is described first.
The embodiments of the application first provide a workpiece positioning method, used in a scenario where a robot three-dimensionally positions a workpiece in order to work on it; a camera device is arranged on the robot. The workpiece positioning method comprises the following steps:
acquiring an actual point cloud of a workpiece under a base coordinate system of a robot based on a workpiece image acquired by a camera device;
obtaining a corresponding workpiece model of a workpiece under a base coordinate system;
constructing a virtual camera and a virtual scene, placing the workpiece model in the virtual scene, and ensuring that the workpiece model lies within the field of view of the virtual camera;
capturing the workpiece model with the virtual camera to obtain a virtual point cloud of the workpiece model under the base coordinate system;
and matching the actual point cloud with the virtual point cloud to obtain the position of the workpiece.
According to the technical solution provided by the application, a virtual camera and a workpiece model are constructed, the workpiece model in the virtual scene is captured by the virtual camera, and a virtual point cloud of the workpiece model under the base coordinate system is obtained. The virtual point cloud is matched with the actual point cloud obtained from the workpiece image acquired by the camera device, and the position of the workpiece is determined from the matching result. This avoids the inaccuracy caused by matching the actual point cloud directly against the workpiece model, is robust to noise in the actual point cloud of the workpiece, and improves the accuracy of real-time workpiece positioning, thereby improving the quality and efficiency of robot operation. At the same time, working by robot teaching is avoided, which reduces the workload of workers and further improves the quality and efficiency of robot operation.
Embodiments of the present application are further elaborated below in conjunction with the drawings in the examples of the specification.
Referring to fig. 3, the workpiece positioning method according to an exemplary embodiment of the present application includes the following steps S101 to S105.
S101, acquiring an actual point cloud of a workpiece under a base coordinate system of a robot based on a workpiece image acquired by a camera device.
In one exemplary embodiment, the camera device is mounted at the end of the robot, as shown in fig. 2. In this exemplary embodiment, as shown in fig. 4, step S101 includes the following steps S1011 to S1012.
S1011, acquiring a workpiece image acquired by the camera device and a pose matrix of the robot when the camera device acquires the workpiece image.
S1012, converting the coordinates of the point cloud in the workpiece image from the camera coordinate system into the base coordinate system according to the pose matrix and the hand-eye matrix of the robot, to obtain the actual point cloud.
The hand-eye matrix represents the conversion relation of the camera coordinate system relative to the tool coordinate system of the robot; specifically, it comprises the translation and rotation from the robot end to the end-mounted camera device. Obtaining the hand-eye matrix is prior art and is not described again here.
The pose matrix represents the conversion relation of the tool coordinate system of the robot relative to the base coordinate system of the robot. The tool coordinate system defines the center position and the pose of the end tool.
In detail, in step S1012, the coordinates of the point cloud in the workpiece image are converted from the camera coordinate system into the base coordinate system based on the mapping relation pcd1 = Transform(pcd0, toolPos × handEye), yielding the actual point cloud.
Here pcd1 represents the converted actual point cloud in the robot base coordinate system, pcd0 represents the actual point cloud in the camera coordinate system before conversion, Transform represents the conversion function, toolPos represents the current pose matrix of the robot (a 4×4 matrix), and handEye represents the positional relationship matrix from the robot's end TCP to the origin of the camera coordinate system, i.e., the hand-eye matrix.
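By way of illustration only (the application discloses no code; the function name, the numpy library, and the array layout are illustrative assumptions), the mapping pcd1 = Transform(pcd0, toolPos × handEye) may be sketched in Python as follows:

```python
import numpy as np

def to_base_frame(pcd0: np.ndarray, tool_pos: np.ndarray, hand_eye: np.ndarray) -> np.ndarray:
    """Sketch of pcd1 = Transform(pcd0, toolPos x handEye).

    pcd0:     (N, 3) points in the camera coordinate system.
    tool_pos: 4x4 pose matrix of the tool frame relative to the robot base frame.
    hand_eye: 4x4 transform of the camera frame relative to the tool frame.
    Returns (N, 3) points in the base coordinate system.
    """
    T = tool_pos @ hand_eye                                 # camera frame expressed in the base frame
    pts_h = np.hstack([pcd0, np.ones((pcd0.shape[0], 1))])  # homogeneous coordinates
    return (pts_h @ T.T)[:, :3]                             # apply T to every point, drop the 1s
```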
S102, acquiring a corresponding workpiece model of the workpiece under the base coordinate system.
In one exemplary embodiment, as shown in fig. 5, step S102 includes the following steps S1021 to S1023.
S1021, image data of a plurality of angles of the workpiece is acquired.
S1022, integrating the image data of the multiple angles into a three-dimensional model by utilizing three-dimensional modeling software.
It will be appreciated that a three-dimensional digital model is a model of a product created using three-dimensional modeling software such as UG or CATIA. A three-dimensional model is a polygonal representation of an object, typically displayed with a computer or other video device.
S1023, converting the three-dimensional digital model from the world coordinate system to the base coordinate system to obtain the workpiece model.
The workpiece model may be, for example, a file in STL format.
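By way of illustration only (the Open3D library and the file name are illustrative assumptions, not part of the disclosure), loading such an STL model and expressing it in the base coordinate system might look like:

```python
import numpy as np
import open3d as o3d

# Load the three-dimensional model (STL format) produced by the modeling software.
mesh = o3d.io.read_triangle_mesh("workpiece.stl")  # hypothetical file name

# world_to_base is assumed known from the cell calibration; 4x4 matrix.
world_to_base = np.eye(4)                          # placeholder value
mesh.transform(world_to_base)                      # workpiece model in the base frame
```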
S103, constructing a virtual camera and a virtual scene, placing the workpiece model in the virtual scene, and ensuring that the workpiece model lies within the field of view of the virtual camera.
In detail, the virtual camera may be constructed using software such as Blender.
In one exemplary embodiment, as shown in fig. 6, in step S103, a virtual camera is constructed, including the following steps S1031 to S1032.
S1031, determining the position of the virtual camera in the virtual scene according to the pose matrix and the hand-eye matrix of the robot.
In one exemplary embodiment, as shown in fig. 7, step S1031 includes the following steps S10311 to S10312.
S10311, determining the position of the camera device relative to the robot according to the pose matrix and the hand-eye matrix.
In detail, in step S10311, the position of the camera device relative to the robot is determined based on the mapping relation camPos = toolPos × handEye.
Here camPos represents the position of the camera device relative to the robot, toolPos represents the pose matrix of the robot when the camera device acquired the workpiece image, and handEye represents the hand-eye matrix.
S10312, setting a position of the virtual camera in the virtual scene according to a position of the camera device with respect to the robot, so that the position of the virtual camera corresponds to the position of the camera device.
S1032, configuring parameters of the virtual camera according to the parameters of the camera device, and constructing the virtual camera.
In detail, the aforementioned parameters include the field of view and the pixel resolution. In step S1032, the field of view of the virtual camera is configured to be consistent with the field of view of the camera device, and the pixel resolution of the virtual camera is configured to be consistent with that of the camera device.
By setting the position of the virtual camera in the virtual scene to correspond to the position of the camera device on the robot, and configuring the parameters of the virtual camera to be consistent with those of the camera device, the virtual point cloud captured by the virtual camera corresponds as closely as possible to the actual point cloud captured by the camera device, which improves the accuracy of workpiece positioning.
In one exemplary embodiment, the field of view camView of the camera device is 0.87 and its resolution is x=1280, y=1024; accordingly, the field of view camView of the virtual camera is 0.87 and the resolution of the virtual camera is x=1280, y=1024.
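A minimal sketch of such a virtual camera in Blender's Python API (bpy) is shown below; the function name is illustrative, and depending on conventions an additional rotation between the camera device's optical frame and Blender's camera frame (which looks along -Z) may be required:

```python
import bpy
from mathutils import Matrix

def build_virtual_camera(cam_pose, fov_rad=0.87, res_x=1280, res_y=1024):
    """Create a virtual camera mirroring the real camera device.

    cam_pose: 4x4 nested sequence, camPos = toolPos x handEye in the base frame.
    """
    cam_data = bpy.data.cameras.new("virtual_camera")
    cam_data.angle = fov_rad                      # field of view, consistent with the device
    cam_obj = bpy.data.objects.new("virtual_camera", cam_data)
    bpy.context.scene.collection.objects.link(cam_obj)
    cam_obj.matrix_world = Matrix(cam_pose)       # place the camera where the device was
    scene = bpy.context.scene
    scene.camera = cam_obj
    scene.render.resolution_x = res_x             # pixel resolution, consistent with the device
    scene.render.resolution_y = res_y
    return cam_obj
```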
An exemplary embodiment virtual camera and workpiece model is shown in fig. 8, where reference numeral 401 represents a virtual camera and reference numeral 402 represents a workpiece model.
It will be appreciated that the foregoing parameters may also include other parameters of the camera, such as the pitch angle, azimuth angle, roll angle, etc. of the camera.
S104, capturing the workpiece model with the virtual camera to obtain a virtual point cloud of the workpiece model under the base coordinate system.
In the embodiment in which the position of the virtual camera corresponds to the position of the camera device relative to the robot and parameters such as the field of view are consistent with those of the camera device, the workpiece model surface point cloud obtained in step S104 has the same viewing angle as the actual point cloud.
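One common way to realize such a virtual shot, given here only as an illustrative assumption since the application leaves the mechanism open, is to render a depth image from the virtual camera and back-project it through the same pinhole model:

```python
import numpy as np

def depth_to_base_points(depth: np.ndarray, fov_rad: float, cam_to_base: np.ndarray) -> np.ndarray:
    """Back-project a rendered depth image (in meters) into a base-frame point cloud.

    Assumes a pinhole camera with the principal point at the image center and a
    horizontal field of view fov_rad (e.g. 0.87), i.e. the virtual camera above.
    """
    h, w = depth.shape
    fx = (w / 2.0) / np.tan(fov_rad / 2.0)        # focal length in pixels from the FOV
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - w / 2.0) * depth / fx
    y = (v - h / 2.0) * depth / fx                # square pixels assumed: fy = fx
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    pts = pts[pts[:, 2] > 0]                      # keep only pixels that hit the model
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    return (pts_h @ cam_to_base.T)[:, :3]         # virtual point cloud in the base frame
```

Because the virtual camera reuses camView = 0.87 and the 1280×1024 resolution, the back-projected cloud covers exactly the surface the real camera device can see.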
S105, matching the actual point cloud with the virtual point cloud to obtain the position of the workpiece.
In an exemplary embodiment, in step S105, an iterative closest point algorithm is used to register the actual point cloud and the virtual point cloud to obtain the position of the workpiece.
It will be appreciated that the iterative closest point (ICP) algorithm, used for point cloud registration, takes two point clouds as input (the source data and the data to be registered) and outputs a transformation (a compensation matrix) such that the overlap between the two is as high as possible. The transformation may or may not be rigid, and includes rotation and translation. How to match the virtual point cloud to the position of the actual point cloud using the iterative closest point algorithm is prior art and is not described in detail here.
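By way of illustration only (the application names no library; Open3D and the threshold value are illustrative assumptions), the registration of step S105 might be sketched as:

```python
import numpy as np
import open3d as o3d

def register_icp(actual_pts: np.ndarray, virtual_pts: np.ndarray, max_dist: float = 0.01) -> np.ndarray:
    """Point-to-point ICP between the actual and virtual point clouds.

    Returns the 4x4 transformation aligning the actual cloud to the virtual one;
    the workpiece position follows from this compensation matrix. The 1 cm
    correspondence threshold is an illustrative choice.
    """
    source = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(actual_pts))
    target = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(virtual_pts))
    result = o3d.pipelines.registration.registration_icp(
        source, target, max_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation
```

In practice the virtual point cloud of step S104 shares the viewing angle of the actual point cloud, which provides a good initial alignment and is what allows plain point-to-point ICP to converge.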
FIG. 9 is an interface diagram of matching an actual point cloud with a virtual point cloud according to an exemplary embodiment. As shown in fig. 9, the virtual point cloud and the actual point cloud lie on the same surface and can be completely matched together.
Referring again to fig. 2, a robot 100 is provided to implement the workpiece positioning method of the embodiments of the present application. The robot 100 includes a robot body 101, a camera device 102, an end tool 103, and a controller (not shown). The robot body 101 has a plurality of motion axes, and the camera device 102 and the end tool 103 are disposed at the end of the robot body 101. The controller is connected to the robot body 101, the camera device 102, and the end tool 103, and controls them so that the robot 100 can perform all or part of the steps of the workpiece positioning method shown in any one of figs. 3 to 7.
For example, the robot body 101 has six axes of motion, i.e., the robot 100 is a six-axis robot.
For example, the end tool 103 is a welding gun.
Fig. 10 shows a flow chart of a robotic work method of an exemplary embodiment. As shown in fig. 10, the robot working method includes the following steps S201 to S203.
S201, controlling a camera device to collect the workpiece image.
S202, performing workpiece positioning by adopting the workpiece positioning method to obtain the position of the workpiece.
S203, controlling the end tool to move to the position of the workpiece to perform work.
In one exemplary embodiment, as shown in fig. 2, the end tool 103 is a welding gun and the workpiece is an object to be welded. The position of the weld seam in the workpiece is obtained in step S202; in step S203, the end tool is controlled to move to the position of the weld seam, and the weld seam of the workpiece is welded using the end tool 103.
Referring next to fig. 11, fig. 11 is a block diagram of a computer device 200 according to an exemplary embodiment, which may be used in a robot to perform all or part of the steps of the workpiece positioning method shown in any one of figs. 3 to 7. As shown in fig. 11, the computer device 200 includes, but is not limited to: a first acquisition module 201, a first construction module 202, a second construction module 203, a second acquisition module 204, and a point cloud matching module 205.
The first obtaining module 201 is configured to obtain an actual point cloud of the workpiece under a base coordinate system of the robot based on the image of the workpiece acquired by the camera device.
The first building module 202 is configured to obtain a workpiece model corresponding to the workpiece in the base coordinate system.
The second construction module 203 is configured to construct a virtual camera and a virtual scene, place the workpiece model in the virtual scene, and position the workpiece model within the field of view of the virtual camera.
The second acquisition module 204 is configured to capture the workpiece model with the virtual camera and obtain a virtual point cloud of the workpiece model under the base coordinate system.
The point cloud matching module 205 is configured to match the actual point cloud and the virtual point cloud, and obtain a position of the workpiece.
The implementation of the functions and roles of each module in the computer device 200 is described in detail in the corresponding steps of the workpiece positioning method above and will not be repeated here.
The computer device 200 may be any terminal having an information processing function, such as a desktop computer, a notebook computer, or the like.
FIG. 12 schematically shows a block diagram of a computer system of a computer device for implementing a workpiece positioning method according to an embodiment of the application.
It should be noted that, the computer system 300 of the computer device shown in fig. 12 is only an example, and should not impose any limitation on the functions and the application scope of the embodiments of the present application.
As shown in fig. 12, the computer system 300 includes a central processing unit 301 (Central Processing Unit, CPU) which can execute various appropriate actions and processes according to a program stored in a Read-Only Memory 302 (ROM) or a program loaded from a storage section 303 into a random access Memory 304 (Random Access Memory, RAM). In the random access memory 304, various programs and data required for the system operation are also stored. The central processing unit 301, the read only memory 302, and the random access memory 304 are connected to each other via a bus 305. An Input/Output interface 306 (i.e., an I/O interface) is also connected to bus 305.
The following components are connected to the input/output interface 306: an input portion 307 including a keyboard, a mouse, and the like; an output section 308 including a Cathode Ray Tube (CRT), a liquid crystal display (Liquid Crystal Display, LCD), and the like, and a speaker, and the like; a storage section 303 including a hard disk or the like; and a communication section 309 including a network interface card such as a local area network card, a modem, or the like. The communication section 309 performs communication processing via a network such as the internet. The driver 310 is also connected to the input/output interface 306 as needed. A removable medium 311 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is installed on the drive 310 as needed, so that a computer program read out therefrom is installed into the storage section 303 as needed.
In particular, the processes described in the various method flowcharts may be implemented as computer software programs according to embodiments of the application. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via the communication portion 309, and/or installed from the removable medium 311. The computer program, when executed by the central processor 301, performs the various functions defined in the system of the present application.
It should be noted that, the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-Only Memory (ROM), an erasable programmable read-Only Memory (Erasable Programmable Read Only Memory, EPROM), flash Memory, an optical fiber, a portable compact disc read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
Those skilled in the art will appreciate that in one or more of the examples described above, the functions described in the present application may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, these functions may be stored on or transmitted over as one or more instructions or code on a computer-readable storage medium.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the division of the above functional modules is illustrated as an example; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative: the division into modules is merely a division by logical function, and other divisions are possible in actual implementation. For example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not performed.
It is to be understood that the application is not limited to the precise arrangements and instrumentalities shown in the drawings and described above, and that various modifications and changes may be effected without departing from its scope. The scope of the application is limited only by the appended claims.

Claims (5)

1. A workpiece positioning method for a scene of a robot working on a workpiece, the robot being provided with a camera device, the camera device being provided at a tip of the robot, the method comprising:
based on the workpiece image acquired by the camera device, acquiring an actual point cloud of the workpiece under a base coordinate system of the robot, wherein the method comprises the following steps: acquiring a workpiece image acquired by the camera device and a pose matrix of the robot when the camera device acquires the workpiece image; according to the pose matrix and the hand-eye matrix of the robot, converting coordinates of a point cloud in the workpiece image under a camera coordinate system into the base coordinate system, and obtaining the actual point cloud; the hand-eye matrix represents the conversion relation of a camera coordinate system relative to a tool coordinate system of the robot;
obtaining a corresponding workpiece model of the workpiece under the base coordinate system, wherein the method comprises the following steps: acquiring image data of a plurality of angles of the workpiece; integrating the image data of the plurality of angles into a three-dimensional model by utilizing three-dimensional modeling software; and converting the three-dimensional model into the base coordinate system to obtain the workpiece model;
constructing a virtual camera and a virtual scene, placing the workpiece model in the virtual scene, and ensuring that the workpiece model lies within the field of view of the virtual camera; the constructing a virtual camera includes: determining the position of the virtual camera in the virtual scene according to the pose matrix and the hand-eye matrix; and configuring parameters of the virtual camera according to the parameters of the camera device to construct the virtual camera, the parameters including the field of view and the pixel resolution, wherein the configuring parameters of the virtual camera according to the parameters of the camera device includes: configuring the field of view of the virtual camera to be consistent with the field of view of the camera device; and configuring the pixel resolution of the virtual camera to be consistent with the pixel resolution of the camera device; and the determining the position of the virtual camera in the virtual scene according to the pose matrix and the hand-eye matrix includes: determining a position of the camera device relative to the robot according to the pose matrix and the hand-eye matrix; and setting the position of the virtual camera in the virtual scene according to the position of the camera device relative to the robot, so that the position of the virtual camera corresponds to the position of the camera device;
capturing the workpiece model with the virtual camera, and obtaining a virtual point cloud of the workpiece model under the base coordinate system;
and matching the actual point cloud with the virtual point cloud to obtain the position of the workpiece.
2. The workpiece positioning method according to claim 1, wherein the matching the actual point cloud and the virtual point cloud to obtain the position of the workpiece includes:
and registering the actual point cloud and the virtual point cloud by adopting an iterative closest point algorithm to obtain the position of the workpiece.
3. A robot, comprising:
a robot body having a plurality of axes of motion;
a camera device disposed at the robot body; and
a controller connected to the robot body and the camera device, for controlling the robot body and the camera device and performing the steps of the workpiece positioning method according to any one of claims 1 to 2.
4. A robot working method, the robot being provided with an end tool and a camera device, the working method comprising:
controlling the camera device to acquire a workpiece image;
positioning a workpiece by the workpiece positioning method according to any one of claims 1 to 2 to obtain the position of the workpiece;
and controlling the end tool to move to the position of the workpiece for operation.
5. The robotic work method of claim 4, wherein the end tool is a welding gun and the camera device is a three-dimensional camera.
CN202211063956.7A 2022-08-31 2022-08-31 Workpiece positioning method, robot, and robot working method Active CN115284297B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211063956.7A CN115284297B (en) 2022-08-31 2022-08-31 Workpiece positioning method, robot, and robot working method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211063956.7A CN115284297B (en) 2022-08-31 2022-08-31 Workpiece positioning method, robot, and robot working method

Publications (2)

Publication Number Publication Date
CN115284297A CN115284297A (en) 2022-11-04
CN115284297B (en) 2023-12-12

Family

ID=83832337

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211063956.7A Active CN115284297B (en) 2022-08-31 2022-08-31 Workpiece positioning method, robot, and robot working method

Country Status (1)

Country Link
CN (1) CN115284297B (en)


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4238256B2 (en) * 2006-06-06 2009-03-18 ファナック株式会社 Robot simulation device
US10659768B2 (en) * 2017-02-28 2020-05-19 Mitsubishi Electric Research Laboratories, Inc. System and method for virtually-augmented visual simultaneous localization and mapping

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109986255A (en) * 2017-12-29 2019-07-09 深圳中集智能科技有限公司 Mix visual servo parallel robot and operational method
CN108724190A (en) * 2018-06-27 2018-11-02 西安交通大学 A kind of industrial robot number twinned system emulation mode and device
CN110634161A (en) * 2019-08-30 2019-12-31 哈尔滨工业大学(深圳) Method and device for quickly and accurately estimating pose of workpiece based on point cloud data
CN110842918A (en) * 2019-10-24 2020-02-28 华中科技大学 Robot mobile processing autonomous locating method based on point cloud servo
WO2022061673A1 (en) * 2020-09-24 2022-03-31 西门子(中国)有限公司 Calibration method and device for robot
CN113222940A (en) * 2021-05-17 2021-08-06 哈尔滨工业大学 Method for automatically grabbing workpiece by robot based on RGB-D image and CAD model
CN114202566A (en) * 2022-02-17 2022-03-18 常州铭赛机器人科技股份有限公司 Glue path guiding and positioning method based on shape coarse registration and ICP point cloud fine registration

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Fast recognition and localization of CAD models of random workpieces in point cloud scenes; Zhao Gang; Guo Xiaokang; Liu Dezheng; Wang Zhongren; Laser & Infrared (No. 12); full text *

Also Published As

Publication number Publication date
CN115284297A (en) 2022-11-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Room 201, Building A, No. 1, Qianwan Road, Qianhai-Shenzhen-Hong Kong Cooperation Zone, Shenzhen, Guangdong Province, 518000

Applicant after: SHENZHEN QIANHAI RUIJI TECHNOLOGY CO.,LTD.

Applicant after: CIMC Container (Group) Co.,Ltd.

Applicant after: CHINA INTERNATIONAL MARINE CONTAINERS (GROUP) Ltd.

Address before: Room 201, Building A, No. 1, Qianwan Road, Qianhai-Shenzhen-Hong Kong Cooperation Zone, Shenzhen, Guangdong Province, 518000

Applicant before: SHENZHEN QIANHAI RUIJI TECHNOLOGY CO.,LTD.

Applicant before: CIMC CONTAINERS HOLDING Co.,Ltd.

Applicant before: CHINA INTERNATIONAL MARINE CONTAINERS (GROUP) Ltd.

GR01 Patent grant