CN117549315A - Path generation method and system of packaging robot - Google Patents

Path generation method and system of packaging robot

Info

Publication number
CN117549315A
Authority
CN
China
Prior art keywords
clamping
path
clamped
clampable
end effector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202410032129.4A
Other languages
Chinese (zh)
Other versions
CN117549315B (en)
Inventor
欧浩 (Ou Hao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pizhou Huanhang Packaging Materials Co ltd
Original Assignee
Pizhou Huanhang Packaging Materials Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pizhou Huanhang Packaging Materials Co ltd filed Critical Pizhou Huanhang Packaging Materials Co ltd
Priority to CN202410032129.4A priority Critical patent/CN117549315B/en
Publication of CN117549315A publication Critical patent/CN117549315A/en
Application granted granted Critical
Publication of CN117549315B publication Critical patent/CN117549315B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a path generation method and system of a packaging robot, applied in the technical field of control of non-electric variables. The method comprises the following steps: determining the clamping structure of the end effector to obtain the characteristics of a plurality of clamping claws; obtaining an image dataset of the article to be clamped based on the visual recognition module of the packaging robot; carrying out clamping feature identification according to the image dataset and region analysis according to the obtained clamping features to obtain a clampable region; carrying out clamping position change identification in the clampable region according to the characteristics of the clamping claws to obtain the position change fitness; and inputting the position change fitness into the path planning module of the end effector, regulating the error constraint conditions of the path planning module, and outputting a first planned path. This solves the technical problems in the prior art that robot actions have a low degree of intelligence and low universality across various industrial products.

Description

Path generation method and system of packaging robot
Technical Field
The invention relates to the field of non-electric variable control, in particular to a path generation method and system of a packaging robot.
Background
Industrial robots are widely used in industry: they are deployed on various production lines to transport products and to perform various processing and manufacturing operations. However, in the prior art, robot actions are realized through pre-programming, so the degree of intelligence is low and the universality of the robot across various industrial products is low.
Therefore, robot actions in the prior art suffer from the technical problems of a low degree of intelligence and low universality across various industrial products.
Disclosure of Invention
The path generation method and system of the packaging robot provided by the present application solve the technical problems that, in the prior art, robot actions have a low degree of intelligence and low universality across various industrial products.
The application provides a path generation method of a packaging robot, which comprises the following steps: acquiring an end effector of a packaging robot, and determining a clamping structure of the end effector; acquiring characteristics of a plurality of clamping claws according to a clamping structure of the end effector; acquiring an image dataset of an article to be clamped based on a visual identification module of the packaging robot; carrying out clamping characteristic identification according to the image dataset, and carrying out region analysis according to the obtained clamping characteristic to obtain a clampable region; carrying out clamping position change identification in the clampable area according to the characteristics of the clamping claws to obtain position change fitness, wherein the position change fitness is the fitness of changing clamping positions between the clamping claws and the object to be clamped; and inputting the position change fitness into a path planning module of the end effector, regulating and controlling error constraint conditions of the path planning module, and outputting a first planned path.
The application also provides a path generation system of the packaging robot, the system comprising: the clamping structure acquisition module is used for acquiring an end effector of the packaging robot and determining the clamping structure of the end effector; the clamping claw feature acquisition module is used for acquiring the characteristics of a plurality of clamping claws according to the clamping structure of the end effector; the image data acquisition module is used for acquiring an image data set of the object to be clamped based on the visual identification module of the packaging robot; the clamping area acquisition module is used for carrying out clamping characteristic identification according to the image dataset and carrying out area analysis according to the obtained clamping characteristics to obtain a clampable area; the fitness acquisition module is used for carrying out clamping position change identification in the clampable area according to the characteristics of the clamping claws to obtain the position change fitness, wherein the position change fitness is the fitness of changing clamping positions between the clamping claws and the article to be clamped; and the planned path acquisition module is used for inputting the position change fitness into a path planning module of the end effector, regulating and controlling error constraint conditions of the path planning module and outputting a first planned path.
The application also provides an electronic device, comprising:
a memory for storing executable instructions;
and the processor is used for realizing the path generation method of the packaging robot when executing the executable instructions stored in the memory.
The present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements a path generation method of a packaging robot provided by the present application.
According to the path generation method and system of the packaging robot, the characteristics of the clamping claws are obtained by determining the clamping structure of the end effector. Based on the visual recognition module of the packaging robot, an image dataset of the article to be clamped is obtained. Clamping feature identification is carried out according to the image dataset, and region analysis is carried out according to the obtained clamping features to obtain a clampable region. Clamping position change identification is carried out in the clampable region according to the characteristics of the clamping claws to obtain the position change fitness. The position change fitness is input into the path planning module of the end effector, the error constraint conditions of the path planning module are regulated, and a first planned path is output. Automatic grasp planning and path planning for different types of packaged products are thereby realized, the degree of intelligence of the robot's work is improved, and the universality of the robot across various industrial products is improved. This solves the technical problems in the prior art that robot actions have a low degree of intelligence and low universality across various industrial products.
The foregoing is only an overview of the technical solutions of the present application. To make the technical means of the present application more clearly understood, so that they can be implemented according to the content of the specification, and to make the above and other objects, features and advantages of the present application more comprehensible, a detailed description of the present application is given below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings of the embodiments of the present disclosure will be briefly described below. It is apparent that the figures in the following description relate only to some embodiments of the present disclosure and are not limiting of the present disclosure.
Fig. 1 is a flow chart of a path generating method of a packaging robot according to an embodiment of the present application.
Fig. 2 is a schematic flow chart of a path generating method of a packaging robot for obtaining characteristics of a gripper according to an embodiment of the present application.
Fig. 3 is a schematic flow chart of a path generating method of a packaging robot for obtaining a clampable area according to an embodiment of the present application.
Fig. 4 is a schematic structural diagram of a system of a path generating method of a packaging robot according to an embodiment of the present application.
Fig. 5 is a schematic structural diagram of a system electronic device of a path generating method of a packaging robot according to an embodiment of the present invention.
Description of reference numerals: clamping structure acquisition module 11, clamping claw feature acquisition module 12, image data acquisition module 13, clamping area acquisition module 14, fitness acquisition module 15, planned path acquisition module 16, processor 31, memory 32, input device 33, output device 34.
Detailed Description
Example 1
To make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be construed as limiting the present application, and all other embodiments obtained by those skilled in the art without inventive effort fall within the scope of protection of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is to be understood that "some embodiments" can be the same subset or different subsets of all possible embodiments and can be combined with one another without conflict.
In the following description, the terms "first", "second", "third" and the like are used merely to distinguish similar objects and do not denote a particular ordering of the objects. It should be understood that, where permitted, "first", "second" and "third" may be interchanged in a particular order or sequence so that the embodiments of the application described herein can be practiced in orders other than those illustrated or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only.
Although the present application makes various references to certain modules in a system according to embodiments of the present application, any number of different modules may be used and run on a user terminal and/or a server. The modules are merely illustrative, and different aspects of the system and method may use different modules.
A flowchart is used in this application to describe the operations performed by a system according to embodiments of the present application. It should be understood that the preceding or following operations are not necessarily performed precisely in order. Rather, the various steps may be processed in reverse order or simultaneously, as required. Moreover, other operations may be added to or removed from these processes.
As shown in fig. 1, an embodiment of the present application provides a path generating method of a packaging robot, including:
acquiring an end effector of a packaging robot, and determining a clamping structure of the end effector;
acquiring characteristics of a plurality of clamping claws according to a clamping structure of the end effector;
acquiring an image dataset of an article to be clamped based on a visual identification module of the packaging robot;
an end effector of a packaging robot is acquired and a gripping structure of the end effector is determined. According to the clamping structure of the end effector, characteristics of a plurality of clamping claws are obtained, wherein the characteristics of the clamping claws comprise the structure, the appearance size characteristics and the like of the clamping mechanism. Based on the visual recognition module of the packaging robot, an image data set of the object to be clamped is obtained through the visual recognition module.
As shown in fig. 2, the method provided in the embodiment of the present application further includes:
according to the clamping structure of the end effector, clamping angle intervals of the plurality of clamping claws are obtained, wherein the clamping angle intervals comprise clamping expanding angles and clamping shrinking angles;
obtaining the geometric dimensions of the plurality of clamping claws and the clamping contact points of the plurality of clamping claws;
outputting the clamping angle intervals, the geometric dimensions and the clamping contact points of the clamping claws as the characteristics of the clamping claws.
According to the clamping structure of the end effector, a clamping angle interval of the plurality of clamping claws is obtained; the clamping angle interval comprises a clamping expanding angle and a clamping shrinking angle, where the clamping expanding angle is the maximum opening angle of the clamping claws and the clamping shrinking angle is the minimum closing angle of the clamping claws. The geometric dimensions of the plurality of clamping claws and the clamping contact points of the plurality of clamping claws are obtained. The clamping angle interval, the geometric dimensions and the clamping contact points of the plurality of clamping claws are output as the characteristics of the plurality of clamping claws.
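These three kinds of clamping claw characteristics can be represented, for example, by a simple record such as the sketch below; the field names and units are assumptions made for illustration and are not prescribed by the application.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class ClampingClawFeatures:
    """Characteristics of one clamping claw: angle interval, geometry, contact points."""
    expanding_angle_deg: float    # clamping expanding angle (maximum opening)
    shrinking_angle_deg: float    # clamping shrinking angle (minimum closing)
    dimensions_mm: Tuple[float, float, float]             # length, width, thickness of the claw
    contact_points_mm: List[Tuple[float, float, float]]   # clamping contact points in the claw frame

    def clamping_angle_interval(self) -> Tuple[float, float]:
        """Return the (shrinking, expanding) angle interval of the claw."""
        return self.shrinking_angle_deg, self.expanding_angle_deg
```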
Carrying out clamping characteristic identification according to the image dataset, and carrying out region analysis according to the obtained clamping characteristic to obtain a clampable region;
carrying out clamping position change identification in the clampable area according to the characteristics of the clamping claws to obtain position change fitness, wherein the position change fitness is the fitness of changing clamping positions between the clamping claws and the object to be clamped;
and inputting the position change fitness into a path planning module of the end effector, regulating and controlling error constraint conditions of the path planning module, and outputting a first planned path.
Clamping feature identification is carried out according to the image dataset, and region analysis is carried out according to the obtained clamping features to obtain a clampable region, where the clampable region is the region of the object to be clamped that is available for clamping. Further, clamping position change identification is carried out in the clampable region according to the characteristics of the plurality of clamping claws to obtain the position change fitness, where the position change fitness is the fitness of changing clamping positions between the plurality of clamping claws and the article to be clamped. Finally, the position change fitness is input into the path planning module of the end effector and the error constraint conditions of the path planning module are regulated; different sizes of the article to be clamped are preset to correspond to different error constraint conditions, with smaller articles allowed smaller errors and larger articles allowed larger errors, and the first planned path is output. In this way, automatic grasp planning and path planning for different types of packaged products are realized, the degree of intelligence of the robot's work is improved, and the universality of the robot across various industrial products is improved.
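The rule that smaller articles are preset tighter error constraints could be realized, for instance, with a lookup such as the sketch below; the size thresholds and tolerance values are illustrative assumptions, and only the monotonic relationship is stated in the text.

```python
def preset_error_constraint(article_max_dimension_mm: float) -> float:
    """Return a preset path-error tolerance (mm) for an article of the given size.

    Smaller articles map to smaller allowed errors and larger articles to
    larger allowed errors; the concrete numbers here are assumed.
    """
    if article_max_dimension_mm < 50.0:
        return 0.5
    if article_max_dimension_mm < 200.0:
        return 1.0
    return 2.0
```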
The method provided by the embodiment of the application further comprises the following steps:
carrying out clamping characteristic identification on the image dataset to obtain geometric data of the article to be clamped and surface data of the article to be clamped;
and carrying out clamping stability analysis by taking the geometric data and the surface data of the object to be clamped as clamping characteristics to obtain a clampable area, wherein the clampable area is an area with the clamping stability larger than preset stability on the surface of the object to be clamped.
Obtaining the clampable region comprises: carrying out clamping feature recognition on the image dataset to obtain geometric data of the object to be clamped and surface data of the object to be clamped, where the geometric data of the object to be clamped include its specific geometric dimensions and geometric shape. This recognition may be performed with an image recognition method commonly used in the prior art. Then, clamping stability analysis is carried out with the geometric data and surface data of the object to be clamped as clamping features to obtain the clampable region, where the clampable region is the region of the surface of the object to be clamped whose clamping stability is greater than a preset stability.
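A commonly used image recognition approach of the kind referred to above could look like the following OpenCV sketch, which extracts coarse geometric data and a simple surface-texture proxy; the Otsu thresholding and the Laplacian-variance roughness measure are assumptions for illustration, not part of the application.

```python
import cv2
import numpy as np


def extract_clamping_features(image: np.ndarray) -> dict:
    """Extract coarse geometric and surface data of the article from one image."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return {}
    article = max(contours, key=cv2.contourArea)  # assume the largest blob is the article
    x, y, w, h = cv2.boundingRect(article)
    roughness = cv2.Laplacian(gray[y:y + h, x:x + w], cv2.CV_64F).var()
    return {
        "bounding_box_px": (x, y, w, h),                     # geometric data (pixel units)
        "contour_area_px": float(cv2.contourArea(article)),  # geometric data
        "surface_roughness": float(roughness),               # crude surface-data proxy
    }
```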
As shown in fig. 3, the method provided in the embodiment of the present application further includes:
acquiring mass sensing data of the article to be clamped based on a mass sensing module of the packaging robot, wherein the mass sensing module comprises a mass sensor which is arranged on a conveyor mechanism of the article to be clamped;
acquiring a full-connection simulation system, calling a clamping stability analysis sample by using the full-connection simulation system, and training a clamping stability analysis module by using the clamping stability analysis sample;
and inputting the geometric data and the surface data of the object to be clamped as input variables and the mass sensing data as influence variables into the clamping stability analysis module to carry out clamping stability analysis, so as to obtain a clampable area.
When the clamping stability analysis is performed, the mass sensing data of the article to be clamped are obtained based on the mass sensing module of the packaging robot. The mass sensing module comprises a mass sensor arranged on the conveyor mechanism of the article to be clamped, and the mass data of the article to be clamped are obtained through the mass sensor. A full-connection simulation system is acquired, and the clamping stability analysis samples are called up by the full-connection simulation system; the clamping stability analysis samples comprise the geometric data, surface data and specific mass data of various clamped objects, together with the clampable regions corresponding to those objects. The clamping stability analysis module is then trained with the clamping stability analysis samples: the clamping stability analysis module is constructed on the basis of a neural network model, the clamping stability analysis samples are used as training data for supervised training of the neural network model, and training is complete once the clampable region output by the model meets a preset accuracy, yielding the clamping stability analysis module. Finally, the geometric data and surface data of the object to be clamped are input as input variables, and the mass sensing data as an influence variable, into the clamping stability analysis module for clamping stability analysis, and the clampable region is obtained.
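A fully connected clamping stability analysis module of the kind described above could be sketched as follows; the PyTorch framework, the 6-dimensional input (geometric data, surface data and mass) and the layer sizes are assumptions for illustration.

```python
import torch
from torch import nn


class ClampingStabilityNet(nn.Module):
    """Fully connected network scoring the clamping stability of a candidate region."""

    def __init__(self, in_features: int = 6):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(in_features, 32), nn.ReLU(),
            nn.Linear(32, 16), nn.ReLU(),
            nn.Linear(16, 1), nn.Sigmoid(),  # stability score in [0, 1]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.layers(x)


def train_stability_module(samples: torch.Tensor, labels: torch.Tensor,
                           epochs: int = 200) -> ClampingStabilityNet:
    """Supervised training on clamping stability analysis samples.

    Training runs for a fixed number of epochs here; the text instead stops
    once the output clampable region meets a preset accuracy.
    """
    model = ClampingStabilityNet(samples.shape[1])
    optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.BCELoss()
    for _ in range(epochs):
        optimiser.zero_grad()
        loss = loss_fn(model(samples), labels)
        loss.backward()
        optimiser.step()
    return model
```

In this sketch, samples is a float tensor of shape (N, 6) and labels a float tensor of shape (N, 1) holding stability targets in [0, 1].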
The method provided by the embodiment of the application further comprises the following steps:
matching the clampable areas according to the characteristics of the clamping claws, and screening the clampable areas once to obtain screened clampable areas;
identifying the clamping positions of the clamping claws according to the screened clampable areas, and obtaining the coordinate number of the variable clamping positions;
and obtaining the position change fitness by using the ratio of the number of coordinates of the variable clamping positions to the number of preset coordinates.
Carrying out clamping position change identification in the clampable area according to the characteristics of the clamping claws to obtain the position change fitness further comprises the following. The clampable area is matched against the characteristics of the plurality of clamping claws; that is, the clampable area is screened once according to the clamping angle interval, the geometric dimensions and the clamping contact points of the plurality of clamping claws. During screening, the clamping width range corresponding to the plurality of clamping claws is obtained from the clamping angle interval, the clamping contact points and the geometric dimensions, and the portions of the clampable area whose width falls within that clamping width range are retained, giving the screened clampable area. Further, the clamping positions of the plurality of clamping claws are identified in the screened clampable area, and the number of variable clamping position coordinates is obtained: the screened clampable area is mapped into a three-dimensional coordinate system, and the coordinate positions at which all clamping contact points of the plurality of clamping claws can be placed simultaneously within the screened clampable area are determined, which gives the number of variable clamping position coordinates. Finally, the position change fitness is obtained as the ratio of the number of variable clamping position coordinates to a preset number of coordinates, where the preset number of coordinates is the preset minimum number of variable clamping positions.
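The position change fitness therefore reduces to a simple ratio; a minimal sketch follows, in which the 3-D coordinate representation of the variable clamping positions is an assumed layout.

```python
from typing import List, Tuple


def position_change_fitness(variable_positions: List[Tuple[float, float, float]],
                            preset_count: int) -> float:
    """Ratio of the number of variable clamping position coordinates to the
    preset minimum number of variable clamping positions."""
    if preset_count <= 0:
        raise ValueError("the preset number of coordinates must be positive")
    return len(variable_positions) / preset_count


# Example: 18 feasible clamping-position coordinates against a preset minimum of 12
fitness = position_change_fitness([(0.0, 0.0, 0.0)] * 18, 12)  # 1.5
```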
The method provided by the embodiment of the application further comprises the following steps:
acquiring a plurality of degrees of freedom of the packaging robot;
decomposing the plurality of degrees of freedom, and outputting a joint degree of freedom node and an end degree of freedom node;
the error control output based on the joint degree of freedom node is a joint error constraint condition, and the error control output based on the end degree of freedom node is an end error constraint condition;
and inputting the position change fitness into a path planning module of the end effector, and respectively carrying out error control on a first planning path of the path planning module according to the joint error constraint condition and the end error constraint condition.
A plurality of degrees of freedom of the packaging robot are acquired and decomposed, and a joint degree of freedom node and an end degree of freedom node are output, where the joint degree of freedom node is a joint node of the robot and the end degree of freedom node is an end execution component node of the robot. The degrees of freedom of the packaging robot are realized by different execution components, and constraining the error of each component constrains the overall error of the robot's motion path. Further, the error control output based on the joint degree of freedom node is the joint error constraint condition, and the error control output based on the end degree of freedom node is the end error constraint condition; both are percentage constraints, and the smaller the article, the smaller the error allowed by the constraint, while the larger the article, the larger the error allowed. Finally, the position change fitness is input into the path planning module of the end effector, and error control is performed on the first planned path of the path planning module according to the joint error constraint condition and the end error constraint condition respectively.
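One way to hold the separate percentage constraints for the two kinds of degree of freedom nodes, and to scale them with article size, is sketched below; the 200 mm reference size and the base percentages are assumptions, only the percentage form and the size relationship are stated above.

```python
from dataclasses import dataclass


@dataclass
class ErrorConstraints:
    joint_error_pct: float  # constraint applied to the joint degree of freedom node
    end_error_pct: float    # constraint applied to the end degree of freedom node


def constraints_for_article(article_max_dimension_mm: float,
                            base_joint_pct: float = 1.0,
                            base_end_pct: float = 0.5) -> ErrorConstraints:
    """Scale the percentage constraints with article size: smaller articles
    receive tighter constraints, larger articles looser ones."""
    scale = min(1.0, article_max_dimension_mm / 200.0)  # 200 mm reference size assumed
    return ErrorConstraints(joint_error_pct=base_joint_pct * scale,
                            end_error_pct=base_end_pct * scale)
```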
The method provided by the embodiment of the application further comprises the following steps:
determining the stacking position of the article to be clamped according to the packaging robot;
taking the stacking position of the article to be clamped as an end point and the real-time position of the article to be clamped as a starting point to determine a first planning path;
dividing the first planning path into a node execution path and an end execution path according to the execution object;
performing error control on the node execution path according to the joint error constraint condition;
and performing error control on the end execution path according to the end error constraint condition.
When error control is performed, the stacking position of the article to be clamped, that is, its final placement position, is determined according to the packaging robot. The first planned path is then determined with the stacking position of the article to be clamped as the end point and the real-time position of the article to be clamped as the starting point. According to the execution object, the first planned path is divided into a node execution path and an end execution path; that is, the first planned path is converted into execution parameters of the respective execution objects and error control is applied to them. Error control is performed on the node execution path according to the joint error constraint condition, and on the end execution path according to the end error constraint condition.
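The split of the first planned path and the per-constraint check could be sketched as follows; the waypoint representation and the way the percentage deviation is measured are assumed interpretations, not details given in the text.

```python
from dataclasses import dataclass
from typing import List, Tuple

Waypoint = Tuple[float, float, float]  # x, y, z in the robot base frame (assumed)


@dataclass
class FirstPlannedPath:
    node_execution_path: List[Waypoint]  # executed by the joint degree of freedom nodes
    end_execution_path: List[Waypoint]   # executed by the end effector


def within_constraint(planned: List[Waypoint], executed: List[Waypoint],
                      error_pct: float) -> bool:
    """Check each executed waypoint against its planned counterpart using a
    percentage constraint relative to the planned point's distance from the origin."""
    for (px, py, pz), (ex, ey, ez) in zip(planned, executed):
        reference = max((px * px + py * py + pz * pz) ** 0.5, 1e-9)
        deviation = ((px - ex) ** 2 + (py - ey) ** 2 + (pz - ez) ** 2) ** 0.5
        if deviation / reference * 100.0 > error_pct:
            return False
    return True
```

In this sketch the joint error constraint condition would be applied to node_execution_path and the end error constraint condition to end_execution_path.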
According to the technical scheme provided by the embodiment of the invention, the end effector of the packaging robot is acquired and its clamping structure is determined. The characteristics of a plurality of clamping claws are obtained according to the clamping structure of the end effector. An image dataset of the object to be clamped is acquired based on the visual recognition module of the packaging robot. Clamping feature identification is carried out according to the image dataset, and region analysis is carried out according to the obtained clamping features to obtain a clampable region. Clamping position change identification is carried out in the clampable region according to the characteristics of the clamping claws to obtain the position change fitness, where the position change fitness is the fitness of changing clamping positions between the clamping claws and the article to be clamped. The position change fitness is input into the path planning module of the end effector, the error constraint conditions of the path planning module are regulated, and the first planned path is output. Automatic grasp planning and path planning for different types of packaged products are thereby realized, the degree of intelligence of the robot's work is improved, and the universality of the robot across various industrial products is improved. This solves the technical problems in the prior art that robot actions have a low degree of intelligence and low universality across various industrial products.
Example two
Based on the same inventive concept as the path generation method of the packaging robot in the foregoing embodiment, the present invention also provides a path generation system of the packaging robot, which may be implemented by hardware and/or software, may generally be integrated in an electronic device, and is used to execute the method provided by any embodiment of the present invention. As shown in fig. 4, the system includes:
a clamping structure acquisition module 11 for acquiring an end effector of a packaging robot and determining a clamping structure of the end effector;
a clamping claw feature acquisition module 12 for acquiring the characteristics of a plurality of clamping claws according to the clamping structure of the end effector;
an image data acquisition module 13, configured to acquire an image dataset of an article to be clamped based on a visual identification module of the packaging robot;
a clamping area acquisition module 14, configured to perform clamping feature identification according to the image dataset, and perform area analysis according to the obtained clamping feature, so as to obtain a clampable area;
the fitness acquisition module 15 is configured to identify a clamping position change in the clampable area according to the characteristics of the plurality of clamping claws, so as to obtain a position change fitness, where the position change fitness is the fitness of changing clamping positions between the plurality of clamping claws and the object to be clamped;
and the planned path obtaining module 16 is configured to input the position change fitness into a path planning module of the end effector, regulate and control an error constraint condition of the path planning module, and output a first planned path.
Further, the clamping claw feature acquisition module 12 is further configured to:
according to the clamping structure of the end effector, clamping angle intervals of the plurality of clamping claws are obtained, wherein the clamping angle intervals comprise clamping expanding angles and clamping shrinking angles;
obtaining the geometric dimensions of the plurality of clamping claws and the clamping contact points of the plurality of clamping claws;
outputting the clamping angle intervals, the geometric dimensions and the clamping contact points of the clamping claws as the characteristics of the clamping claws.
Further, the clamping area acquisition module 14 is further configured to:
carrying out clamping characteristic identification on the image dataset to obtain geometric data of the article to be clamped and surface data of the article to be clamped;
and carrying out clamping stability analysis by taking the geometric data and the surface data of the object to be clamped as clamping characteristics to obtain a clampable area, wherein the clampable area is an area with the clamping stability larger than preset stability on the surface of the object to be clamped.
Further, the clamping area acquisition module 14 is further configured to:
acquiring mass sensing data of the article to be clamped based on a mass sensing module of the packaging robot, wherein the mass sensing module comprises a mass sensor which is arranged on a conveyor mechanism of the article to be clamped;
acquiring a full-connection simulation system, calling a clamping stability analysis sample by using the full-connection simulation system, and training a clamping stability analysis module by using the clamping stability analysis sample;
and inputting the geometric data and the surface data of the object to be clamped as input variables and the mass sensing data as influence variables into the clamping stability analysis module to carry out clamping stability analysis, so as to obtain a clampable area.
Further, the fitness acquisition module 15 is further configured to:
matching the clampable areas according to the characteristics of the clamping claws, and screening the clampable areas once to obtain screened clampable areas;
identifying the clamping positions of the clamping claws according to the screened clampable areas, and obtaining the coordinate number of the variable clamping positions;
and obtaining the position change fitness by using the ratio of the number of coordinates of the variable clamping positions to the number of preset coordinates.
Further, the planned path acquisition module 16 is further configured to:
acquiring a plurality of degrees of freedom of the packaging robot;
decomposing the plurality of degrees of freedom, and outputting a joint degree of freedom node and an end degree of freedom node;
the error control output based on the joint degree of freedom node is a joint error constraint condition, and the error control output based on the end degree of freedom node is an end error constraint condition;
and inputting the position change fitness into a path planning module of the end effector, and respectively carrying out error control on a first planning path of the path planning module according to the joint error constraint condition and the end error constraint condition.
Further, the planned path acquisition module 16 is further configured to:
determining the stacking position of the article to be clamped according to the packaging robot;
taking the stacking position of the article to be clamped as an end point and the real-time position of the article to be clamped as a starting point to determine a first planning path;
dividing the first planning path into a node execution path and an end execution path according to the execution object;
performing error control on the node execution path according to the joint error constraint condition;
and performing error control on the end execution path according to the end error constraint condition.
The units and modules included above are divided only according to functional logic, and the division is not limited to that described above provided the corresponding functions can be realized; in addition, the specific names of the functional units are used only to distinguish them from one another and are not intended to limit the protection scope of the present invention.
Example III
Fig. 5 is a schematic structural diagram of an electronic device provided in a third embodiment of the present invention, and shows a block diagram of an exemplary electronic device suitable for implementing an embodiment of the present invention. The electronic device shown in fig. 5 is only an example and should not be construed as limiting the functionality and scope of use of the embodiments of the present invention. As shown in fig. 5, the electronic device includes a processor 31, a memory 32, an input device 33, and an output device 34; the number of processors 31 in the electronic device may be one or more, and one processor 31 is taken as an example in fig. 5. The processor 31, the memory 32, the input device 33 and the output device 34 in the electronic device may be connected by a bus or in other ways; connection by a bus is taken as an example in fig. 5.
The memory 32 is a computer readable storage medium, and may be used to store software programs, computer executable programs and modules, such as the program instructions/modules corresponding to the path generation method of a packaging robot in an embodiment of the present invention. The processor 31 runs the software programs, instructions and modules stored in the memory 32 to execute the various functional applications and data processing of the computer device, that is, to implement the path generation method of the packaging robot described above.
Note that the above is only a preferred embodiment of the present invention and the technical principle applied. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, while the invention has been described in connection with the above embodiments, the invention is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit or scope of the invention, which is set forth in the following claims.

Claims (10)

1. A method of path generation for a packaging robot, the method comprising:
acquiring an end effector of a packaging robot, and determining a clamping structure of the end effector;
acquiring characteristics of a plurality of clamping claws according to a clamping structure of the end effector;
acquiring an image dataset of an article to be clamped based on a visual identification module of the packaging robot;
carrying out clamping characteristic identification according to the image dataset, and carrying out region analysis according to the obtained clamping characteristic to obtain a clampable region;
carrying out clamping position change identification in the clampable area according to the characteristics of the clamping claws to obtain position change fitness, wherein the position change fitness is the fitness of changing clamping positions between the clamping claws and the object to be clamped;
and inputting the position change fitness into a path planning module of the end effector, regulating and controlling error constraint conditions of the path planning module, and outputting a first planned path.
2. The method of claim 1, wherein the characteristics of the plurality of clamping claws are acquired according to the clamping structure of the end effector, the method comprising:
according to the clamping structure of the end effector, clamping angle intervals of the plurality of clamping claws are obtained, wherein the clamping angle intervals comprise clamping expanding angles and clamping shrinking angles;
obtaining the geometric dimensions of the plurality of clamping claws and the clamping contact points of the plurality of clamping claws;
outputting the clamping angle intervals, the geometric dimensions and the clamping contact points of the clamping claws as the characteristics of the clamping claws.
3. The method of claim 1, wherein clamping feature identification is carried out according to the image dataset and region analysis is carried out according to the obtained clamping features to obtain a clampable region, the method comprising:
carrying out clamping characteristic identification on the image dataset to obtain geometric data of the article to be clamped and surface data of the article to be clamped;
and carrying out clamping stability analysis by taking the geometric data and the surface data of the object to be clamped as clamping characteristics to obtain a clampable area, wherein the clampable area is an area with the clamping stability larger than preset stability on the surface of the object to be clamped.
4. The method according to claim 3, wherein the geometric data and the surface data of the article to be clamped are used as clamping features for clamping stability analysis, the method further comprising:
acquiring mass sensing data of the article to be clamped based on a mass sensing module of the packaging robot, wherein the mass sensing module comprises a mass sensor which is arranged on a conveyor mechanism of the article to be clamped;
acquiring a full-connection simulation system, calling a clamping stability analysis sample by using the full-connection simulation system, and training a clamping stability analysis module by using the clamping stability analysis sample;
and inputting the geometric data and the surface data of the object to be clamped as input variables and the mass sensing data as influence variables into the clamping stability analysis module to carry out clamping stability analysis, so as to obtain a clampable area.
5. The method of claim 1, wherein clamping position change identification is carried out in the clampable region according to the characteristics of the plurality of clamping claws to obtain the position change fitness, the method further comprising:
matching the clampable areas according to the characteristics of the clamping claws, and screening the clampable areas once to obtain screened clampable areas;
identifying the clamping positions of the clamping claws according to the screened clampable areas, and obtaining the coordinate number of the variable clamping positions;
and obtaining the position change fitness by using the ratio of the number of coordinates of the variable clamping positions to the number of preset coordinates.
6. The method of claim 1, wherein the method further comprises:
acquiring a plurality of degrees of freedom of the packaging robot;
decomposing the plurality of degrees of freedom, and outputting a joint degree of freedom node and an end degree of freedom node;
the error control output based on the joint degree of freedom node is a joint error constraint condition, and the error control output based on the end degree of freedom node is an end error constraint condition;
and inputting the position change fitness into a path planning module of the end effector, and respectively carrying out error control on a first planning path of the path planning module according to the joint error constraint condition and the end error constraint condition.
7. The method of claim 6, wherein error control is performed on the first planned path of the path planning module according to the error constraint conditions, the method further comprising:
determining the stacking position of the article to be clamped according to the packaging robot;
taking the stacking position of the article to be clamped as an end point and the real-time position of the article to be clamped as a starting point to determine a first planning path;
dividing the first planning path into a node execution path and an end execution path according to the execution object;
performing error control on the node execution path according to the joint error constraint condition;
and performing error control on the end execution path according to the end error constraint condition.
8. A path generation system of a packaging robot, the system comprising:
the clamping structure acquisition module is used for acquiring an end effector of the packaging robot and determining the clamping structure of the end effector;
the clamping claw feature acquisition module is used for acquiring the characteristics of a plurality of clamping claws according to the clamping structure of the end effector;
the image data acquisition module is used for acquiring an image data set of the object to be clamped based on the visual identification module of the packaging robot;
the clamping area acquisition module is used for carrying out clamping characteristic identification according to the image dataset and carrying out area analysis according to the obtained clamping characteristics to obtain a clampable area;
the fitness acquisition module is used for carrying out clamping position change identification in the clampable area according to the characteristics of the clamping claws to obtain the position change fitness, wherein the position change fitness is the fitness of changing clamping positions between the clamping claws and the article to be clamped;
and the planned path acquisition module is used for inputting the position change fitness into a path planning module of the end effector, regulating and controlling error constraint conditions of the path planning module and outputting a first planned path.
9. An electronic device, the electronic device comprising:
a memory for storing executable instructions;
a processor for implementing a path generation method of a packaging robot according to any one of claims 1 to 7 when executing executable instructions stored in said memory.
10. A computer-readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements a path generating method of a packaging robot according to any one of claims 1-7.
CN202410032129.4A 2024-01-10 2024-01-10 Path generation method and system of packaging robot Active CN117549315B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410032129.4A CN117549315B (en) 2024-01-10 2024-01-10 Path generation method and system of packaging robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410032129.4A CN117549315B (en) 2024-01-10 2024-01-10 Path generation method and system of packaging robot

Publications (2)

Publication Number Publication Date
CN117549315A true CN117549315A (en) 2024-02-13
CN117549315B CN117549315B (en) 2024-03-26

Family

ID=89818911

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410032129.4A Active CN117549315B (en) 2024-01-10 2024-01-10 Path generation method and system of packaging robot

Country Status (1)

Country Link
CN (1) CN117549315B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110202581A (en) * 2019-06-28 2019-09-06 南京博蓝奇智能科技有限公司 Compensation method, device and the electronic equipment of end effector of robot operating error
CN111275063A (en) * 2018-12-04 2020-06-12 广州中国科学院先进技术研究所 Robot intelligent grabbing control method and system based on 3D vision
US20210053230A1 (en) * 2019-08-21 2021-02-25 Mujin, Inc. Robotic multi-gripper assemblies and methods for gripping and holding objects
CN113967911A (en) * 2019-12-31 2022-01-25 浙江大学 Follow control method and system of humanoid mechanical arm based on tail end working space
US20220193894A1 (en) * 2020-12-21 2022-06-23 Boston Dynamics, Inc. Supervised Autonomous Grasping
CN116061173A (en) * 2022-11-04 2023-05-05 云南电网有限责任公司保山供电局 Six-degree-of-freedom redundant task track planning method for mechanical arm for live working
US20230339118A1 (en) * 2022-04-20 2023-10-26 eBots Inc. Reliable robotic manipulation in a cluttered environment

Also Published As

Publication number Publication date
CN117549315B (en) 2024-03-26

Similar Documents

Publication Publication Date Title
US4920572A (en) Object recognition system for a robot
US20190152054A1 (en) Gripping system with machine learning
Chang et al. Collision avoidance of two general robot manipulators by minimum delay time
CN110599544B (en) Workpiece positioning method and device based on machine vision
CN109176521A (en) A kind of mechanical arm and its crawl control method and system
Morales et al. Vision-based computation of three-finger grasps on unknown planar objects
CN112847375B (en) Workpiece grabbing method and device, computer equipment and storage medium
CN113858188A (en) Industrial robot gripping method and apparatus, computer storage medium, and industrial robot
Nguyen et al. A novel vision-based method for 3D profile extraction of wire harness in robotized assembly process
CN113269723A (en) Unordered grasping system for three-dimensional visual positioning and mechanical arm cooperative work parts
CN114025928A (en) End effector control system and end effector control method
CN117549315B (en) Path generation method and system of packaging robot
Somani et al. Object detection using boundary representations of primitive shapes
Hefner et al. Vision-based adjusting of a digital model to real-world conditions for wire insertion tasks
Martinez et al. Automated 3D vision guided bin picking process for randomly located industrial parts
CN114310892B (en) Object grabbing method, device and equipment based on point cloud data collision detection
CN108284075B (en) Method and device for sorting articles by robot and robot
Su et al. Pose-estimation and reorientation of pistons for robotic bin-picking
CN106200541B (en) Method for converting function block diagram into AOV structure
CN114800533B (en) Sorting control method and system for industrial robot
CN108145712B (en) Method and device for sorting articles by robot and robot
US10395360B2 (en) Inspection system, controller, inspection method, and inspection program
Weng et al. The task-level evaluation model for a flexible assembly task with an industrial dual-arm robot
DETEŞAN The path planning of rttrr small-sized industrial robot in a process of microprocessor packing
JPH05127723A (en) High-speed picking device for stacked component

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant