CN117301077B - Mechanical arm track generation method and device, electronic equipment and readable storage medium - Google Patents


Info

Publication number: CN117301077B (granted publication; earlier publication CN117301077A)
Application number: CN202311574113.8A
Authority: CN (China)
Prior art keywords: model, target, mechanical arm, grabbing, production line
Legal status: Active (application granted)
Inventors: 许震洲, 熊海飞, 李飞
Current and original assignee: Shenzhen Xinrun Fulian Digital Technology Co Ltd
Other languages: Chinese (zh)

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1656: Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664: Programme controls characterised by programming, planning systems for manipulators, characterised by motion, path, trajectory planning
    • B25J9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J9/161: Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The application relates to a mechanical arm track generation method and device, an electronic device, and a readable storage medium. The method comprises the following steps: acquiring a target production line model and importing a target workpiece model into the target production line model; controlling the mechanical arm model to grasp the target workpiece model; performing motion control on the mechanical arm model to adjust the posture of the target workpiece model; and determining a target track of the target mechanical arm according to the motion control performed on the mechanical arm model. By simulating the production line environment in a virtual environment, and performing posture adjustment in the simulated environment using the mechanical arm model corresponding to the real production line and a target workpiece model based on the real workpiece, occupation of the real production line is avoided and its normal production is not affected; at the same time, compared with a real mechanical arm, control of the mechanical arm model in the virtual environment is simpler and more precise, so the generation efficiency and accuracy of the mechanical arm track can be improved.

Description

Mechanical arm track generation method and device, electronic equipment and readable storage medium
Technical Field
The present disclosure relates to the field of product testing, and in particular, to a method and apparatus for generating a mechanical arm track, an electronic device, and a readable storage medium.
Background
The steering knuckle is an important component of an automobile, and the quality of the knuckle casting process directly influences vehicle safety. During casting, various flaws and defects arise under the influence of the production process, so quality detection is required during production. At present, automatic detection based on computer vision mainly adopts a scheme in which the surface of the steering knuckle is photographed by a 2D industrial camera and defects are then identified through deep learning. During shooting, the steering knuckle must be photographed from different angles to obtain a complete image of it; generally, the position of the industrial camera is fixed, and the posture of the steering knuckle is adjusted by a mechanical arm to achieve the angle changes. However, the steering knuckle comes in many models, so each model requires a custom track template for the mechanical arm that manipulates it. At present, track templates are produced by manually operating the mechanical arm; this occupies production time on the line, affects normal production, is time-consuming, and has low accuracy.
Disclosure of Invention
The application provides a mechanical arm track generation method and device, an electronic device, and a readable storage medium, aiming to solve the technical problems that, in the prior art, producing track templates for the mechanical arm affects normal production, is time-consuming, and has low accuracy.
To solve the above technical problems or at least partially solve the above technical problems, the present application provides a method for generating a trajectory of a mechanical arm, the method including the steps of:
acquiring a target production line model, and importing a target workpiece model into the target production line model, wherein the target production line model comprises a mechanical arm model and a camera model, the mechanical arm model corresponds to a target mechanical arm in a real production line, and the camera model corresponds to a target camera in the real production line;
controlling the mechanical arm model to grasp the target workpiece model;
performing motion control on the mechanical arm model to adjust the posture of the target workpiece model so that the target camera can acquire complete appearance data of the target workpiece model under the corresponding posture of the camera model;
and determining a target track of the target mechanical arm according to the action control performed on the mechanical arm model.
Optionally, the step of obtaining the target production line model includes:
creating a virtual three-dimensional scene and acquiring a production line device model, wherein the production line device model comprises the mechanical arm model and the camera model;
importing the line device model into the virtual three-dimensional scene;
acquiring setting parameters corresponding to the device models of the production lines, wherein the setting parameters are obtained based on the corresponding devices in the actual production lines;
and setting each production line device model according to the setting parameters to obtain the target production line model.
Optionally, the step of setting each production line device model according to the setting parameters includes:
acquiring kinematic parameters corresponding to the target mechanical arm in the setting parameters;
and setting a kinematic model corresponding to the mechanical arm model through the kinematic parameters.
Optionally, the step of controlling the robotic arm model to grasp the target workpiece model includes:
determining a grabbing position in the target workpiece model, and determining the grabbing gesture of the mechanical arm model according to the grabbing position;
and controlling the mechanical arm model according to the grabbing gesture so as to grab the target workpiece model.
Optionally, the step of determining a grabbing position in the target workpiece model and determining the grabbing gesture of the mechanical arm model according to the grabbing position includes:
determining plane positioning points in the target workpiece model, wherein the number of the plane positioning points is 3;
determining a grabbing plane according to the plane locating points, and determining the midpoints between the plane locating points on the grabbing plane to obtain grabbing positions;
and determining a vertical vector of the grabbing position, and determining the grabbing gesture of the mechanical arm model according to the vertical vector and the grabbing position.
Optionally, the step of determining the target track of the target mechanical arm according to the motion control performed on the mechanical arm model includes:
if a movement instruction is received, controlling the mechanical arm model to act according to the movement instruction, wherein the mechanical arm model acts based on a kinematic model of the target mechanical arm;
if a point position instruction is received, taking the current gesture of the mechanical arm model as a track sub gesture according to the point position instruction;
and if a completion instruction is received, determining the target track through the track sub-gesture according to the completion instruction.
Optionally, the step of determining the target track through the track sub-gesture according to the completion instruction includes:
acquiring a first grabbing coordinate of a grabbing point of the target workpiece model;
converting the first grabbing coordinate into a workpiece coordinate system corresponding to the target workpiece model to obtain a second grabbing coordinate;
taking the associated coordinate of the second grabbing coordinate in a mechanical arm coordinate system as a starting point posture of the target track;
and taking the track sub-gesture as a process gesture of the target track to obtain the target track.
To achieve the above object, the present invention further provides a mechanical arm track generation device, including:
the first acquisition module is used for acquiring a target production line model and importing a target workpiece model into the target production line model, wherein the target production line model comprises a mechanical arm model and a camera model, the mechanical arm model corresponds to a target mechanical arm in a real production line, and the camera model corresponds to a target camera in the real production line;
the first control module is used for controlling the mechanical arm model to grasp the target workpiece model;
the first adjusting module is used for performing action control on the mechanical arm model so as to adjust the gesture of the target workpiece model, so that the target camera can acquire complete appearance data of the target workpiece model under the corresponding gesture of the camera model;
And the first determining module is used for determining the target track of the target mechanical arm according to the action control performed on the mechanical arm model.
To achieve the above object, the present invention also provides an electronic device including a memory, a processor, and a computer program stored on the memory and executable on the processor; when executed by the processor, the computer program implements the steps of the mechanical arm track generation method as described above.
To achieve the above object, the present invention also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the mechanical arm track generation method as described above.
The invention provides a mechanical arm track generation method and device, an electronic device, and a readable storage medium. The method acquires a target production line model and imports a target workpiece model into it, wherein the target production line model comprises a mechanical arm model and a camera model, the mechanical arm model corresponds to a target mechanical arm in a real production line, and the camera model corresponds to a target camera in the real production line; controls the mechanical arm model to grasp the target workpiece model; performs motion control on the mechanical arm model to adjust the posture of the target workpiece model, so that the target camera can acquire complete appearance data of the target workpiece model at the corresponding posture of the camera model; and determines a target track of the target mechanical arm according to the motion control performed on the mechanical arm model. By simulating the production line environment in a virtual environment, and performing posture adjustment in the simulated environment using the mechanical arm model corresponding to the real production line and a target workpiece model based on the real workpiece, occupation of the real production line is avoided and its normal production is not affected; at the same time, compared with a real mechanical arm, control of the mechanical arm model in the virtual environment is simpler and more precise, so the generation efficiency and accuracy of the mechanical arm track can be improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the invention or the technical solutions of the prior art, the drawings which are used in the description of the embodiments or the prior art will be briefly described, and it will be obvious to a person skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a flowchart of a first embodiment of a method for generating a trajectory of a robot arm according to the present invention;
FIG. 2 is a schematic structural diagram of a target workpiece model in an embodiment of the mechanical arm track generation method of the present invention;
Fig. 3 is a schematic block diagram of an electronic device according to the present invention.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention. To enable those skilled in the art to better understand the solution of the present application, the technical solutions in the embodiments of the present application will be described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without inventive effort shall fall within the scope of protection of the present application.
The invention provides a method for generating a track of a mechanical arm, referring to fig. 1, fig. 1 is a schematic flow chart of a first embodiment of the method for generating a track of a mechanical arm, and the method comprises the following steps:
step S10, a target production line model is obtained, a target workpiece model is imported into the target production line model, the target production line model comprises a mechanical arm model and a camera model, the mechanical arm model corresponds to a target mechanical arm in a real production line, and the camera model corresponds to a target camera in the real production line;
the target production line model is a model created based on real production line equal proportion; namely, the relative position, the relative size, the physical property, the motion property and the like of the object in the target production line model are consistent with the real production line.
It will be appreciated that the specific devices included in the target line model may be configured based on the actual line, and may include, for example, conveyor belts, work machines, inspection machines, personnel stations, etc.; the embodiment is applied to the generation of the mechanical arm track, so that the target production line model at least comprises a mechanical arm model and a camera model; the mechanical arm model corresponds to a target mechanical arm in reality, and the camera model corresponds to a target camera in reality. In practical application, the actual setting of the actual production line is various and the complexity is different, so that when the target production line model is constructed, devices having influence on production, test and mechanical arm track can be considered, and the target production line model can be created according to the devices.
The target workpiece model is a model corresponding to a target workpiece needing surface shooting; it will be appreciated that after the target workpiece model is imported into the target line model, the relative sizes of the target workpiece model and the devices in the target line model are the same as the relative sizes of the workpieces and the devices in the real line.
Step S20, controlling the mechanical arm model to grasp the target workpiece model;
it can be appreciated that the grasping of the manipulator model is based on a kinematic model of the target manipulator; the grabbing logic of the mechanical arm model on the target workpiece model is consistent with the grabbing logic of the target mechanical arm on the target workpiece.
Step S30, performing motion control on the mechanical arm model to adjust the gesture of the target workpiece model, so that the target camera can acquire complete appearance data of the target workpiece model under the corresponding gesture of the camera model;
it can be understood that when the posture of the mechanical arm model is adjusted under the condition that the mechanical arm model grabs the target workpiece model, the posture of the target workpiece model changes along with the posture change of the mechanical arm model; therefore, the gesture of the mechanical arm model is adjusted, so that the gesture of the target workpiece model can be adjusted, the surface of the target workpiece model, which is exposed in the photographable view of the camera model, is adjusted, and the surface of the target workpiece model can be completely acquired by the camera model based on multiple adjustments; in contrast, the target robot is controlled based on motion control of the robot model, so that the target workpiece can be photographed by the target camera for a complete surface. The motion control may be controlled by a user or based on a preset program.
The complete appearance data of the target workpiece model is data that includes the complete surface of the target workpiece model.
The target camera can acquire the complete appearance data of the target workpiece model under the corresponding posture of the camera model, and the target production line model corresponds to the real production line, so that the target camera can acquire the complete appearance data of the target workpiece in the real production line.
And S40, determining a target track of the target mechanical arm according to action control performed on the mechanical arm model.
The motion control of the mechanical arm model enables the target camera, at the posture of the camera model, to acquire the complete appearance data of the target workpiece model; therefore, when the target mechanical arm is controlled to drive the target workpiece based on the target track obtained from this motion control, the target camera can acquire the complete appearance data of the target workpiece.
In this embodiment, the production line environment is simulated in a virtual environment, and posture adjustment is performed in the simulation environment using the mechanical arm model corresponding to the real production line and the target workpiece model based on the real workpiece. This avoids occupying the real production line and affecting its normal production; meanwhile, compared with a real mechanical arm, control of the mechanical arm model in the virtual environment is simpler and more accurate, so the generation efficiency and accuracy of the mechanical arm track can be improved.
Further, in the second embodiment of the method for generating a trajectory of a mechanical arm according to the present invention set forth in the first embodiment of the present invention, the step S10 includes the steps of:
step S11, creating a virtual three-dimensional scene and acquiring a production line device model, wherein the production line device model comprises the mechanical arm model and the camera model;
step S12, importing the production line device model into the virtual three-dimensional scene;
step S13, obtaining setting parameters corresponding to the device models of the production lines, wherein the setting parameters are obtained based on the corresponding devices in the actual production lines;
and step S14, setting each production line device model according to the setting parameters to obtain the target production line model.
The specific creation method and tool for the virtual three-dimensional scene can be chosen based on the actual application scenario; for example, the virtual three-dimensional scene can be created through three.js.
The production line device models are models corresponding to the devices contained in the real production line. It can be appreciated that the production line device models follow the actual production line setup and may include, but are not limited to, conveyor belt models, work machine models, inspection machine models, personnel station models, the mechanical arm model, and the camera model. Apart from the mechanical arm model and the camera model, which directly participate in determining the target track, the other models serve to simulate the working environment of the real production line. It can be understood that when the target mechanical arm is controlled to act, its posture and its position in space change, and it may collide with other devices in the production line. To avoid this problem, this embodiment provides the other devices that simulate the production line working environment; at the same time, when controlling the motion of the mechanical arm model, collisions between the mechanical arm model and the other production line device models are prevented, so the obtained target track cannot collide with the other production line device models, and further, when the target mechanical arm is controlled based on the target track, it will not collide with the other devices of the real production line.
The setting parameters indicate the characteristics of the devices corresponding to the production line device models and can be obtained from the specific setup of the devices in the real production line. Setting parameters include, but are not limited to, the size, physical parameters, appearance, position coordinates, and posture of a device; physical parameters include, but are not limited to, friction parameters and collision parameters; appearance includes, but is not limited to, color and texture.
And setting the production line device model through setting parameters to obtain a target production line model corresponding to the actual production line.
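The parameter application described in steps S13 and S14 can be sketched as follows. This is an illustrative Python sketch, not part of the patent: the `DeviceModel` fields and the `apply_setting_parameters` helper are hypothetical names chosen to mirror the listed setting parameters (size, physical parameters, appearance, position coordinates, posture).

```python
from dataclasses import dataclass, field

@dataclass
class DeviceModel:
    # Hypothetical container for one production-line device model.
    name: str
    size: tuple = (1.0, 1.0, 1.0)        # scaled dimensions (x, y, z)
    position: tuple = (0.0, 0.0, 0.0)    # position coordinates in the scene
    pose: tuple = (0.0, 0.0, 0.0)        # orientation (roll, pitch, yaw)
    physics: dict = field(default_factory=dict)     # e.g. friction, collision
    appearance: dict = field(default_factory=dict)  # e.g. color, texture

def apply_setting_parameters(models, params_by_name):
    """Apply per-device setting parameters measured from the real line."""
    for model in models:
        for key, value in params_by_name.get(model.name, {}).items():
            setattr(model, key, value)
    return models

# Usage: configure an arm model and a camera model from measured parameters.
line = [DeviceModel("robotic_arm"), DeviceModel("camera")]
settings = {
    "robotic_arm": {"position": (0.0, 2.0, 0.0),
                    "physics": {"friction": 0.6, "collision": True}},
    "camera":      {"position": (1.5, 2.0, 3.0), "pose": (0.0, -0.3, 0.0)},
}
apply_setting_parameters(line, settings)
```

In a three.js scene, as the embodiment suggests, the same parameters would be applied to `Object3D` positions, rotations, and materials instead of a plain data class.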
Further, the step S14 includes the steps of:
step S141, acquiring kinematic parameters corresponding to the target mechanical arm in the set parameters;
and step S142, setting a kinematic model corresponding to the mechanical arm model through the kinematic parameters.
The kinematic parameters are used to indicate the characteristics of the movement rules of the target mechanical arm. It can be understood that, based on the specific structural arrangement of the target mechanical arm, the target mechanical arm has a determined movement rule, such as driving the connecting rod to change positions through joint rotation, so as to realize the movement of the clamping jaw of the target mechanical arm; the kinematic parameters corresponding to the specific target mechanical arm may be determined by the actual setting of the target mechanical arm, which is not limited herein.
The kinematic model is a model reflecting the motion rule of the target mechanical arm; after the kinematic model is set through the kinematic parameters of the target mechanical arm, the kinematic model corresponds to the motion rule of the target mechanical arm, and the mechanical arm model can be controlled through the kinematic model so that the mechanical arm model acts based on the motion rule of the target mechanical arm.
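The idea of a kinematic model through which "joint rotation drives the connecting rod to change positions" can be illustrated with a minimal planar forward-kinematics sketch. This is a hypothetical simplification in Python, not the patent's implementation; a real arm would use its full kinematic parameters.

```python
import math

def forward_kinematics(link_lengths, joint_angles):
    """Planar forward kinematics: each joint rotation is accumulated and
    drives its link, moving the end effector (the clamping jaw)."""
    x = y = 0.0
    theta = 0.0
    for length, angle in zip(link_lengths, joint_angles):
        theta += angle                 # joint rotation accumulates
        x += length * math.cos(theta)  # link displaces the jaw
        y += length * math.sin(theta)
    return x, y

# Two unit links, both joints at 0: the jaw lies at (2, 0).
jaw = forward_kinematics([1.0, 1.0], [0.0, 0.0])
```

Controlling the arm model through such a model ensures every commanded posture respects the motion rule of the target mechanical arm.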
The embodiment can accurately construct the target production line model.
Further, in a third embodiment of the method for generating a trajectory of a mechanical arm according to the present invention set forth in the first embodiment of the present invention, the step S20 includes the steps of:
s21, determining a grabbing position in the target workpiece model, and determining the grabbing posture of the mechanical arm model according to the grabbing position;
and S22, controlling the mechanical arm model according to the grabbing gesture so as to grab the target workpiece model.
The grabbing position is a position where the mechanical arm model can grab the target workpiece model; it can be understood that the specific grabbing positions are different based on different structures of the target workpiece model and the mechanical arm model, so that the grabbing positions can be determined based on the target workpiece model and the mechanical arm model which are actually applied; referring to fig. 2, fig. 2 is a schematic structural diagram of a target workpiece model in an embodiment of a method for generating a trajectory of a robot arm according to the present invention; in fig. 2, the target workpiece model is a knuckle model of an automobile, and the mechanical arm model grabs the target workpiece model through a shaft hole of the knuckle model.
The grabbing gesture is a gesture of the mechanical arm model when grabbing the target workpiece model; namely, when the mechanical arm model is positioned in the grabbing gesture, the mechanical arm model can grab the target workpiece model. It is understood that the gripping gesture of the mechanical arm model for gripping the target workpiece model is not necessarily unique, for example, the gripping jaw of the mechanical arm model may grip the target workpiece model through different angles.
And controlling the mechanical arm model through the grabbing gesture so as to enable the gesture of the mechanical arm model to be the grabbing gesture, thereby grabbing the target workpiece model.
Further, the step S21 includes the steps of:
step S211, determining plane positioning points in the target workpiece model, wherein the number of the plane positioning points is 3;
step S212, a grabbing plane is determined according to the plane locating points, and the middle points among the plane locating points are determined on the grabbing plane to obtain grabbing positions;
step S213, determining a vertical vector of the grabbing position, and determining the grabbing gesture of the mechanical arm model according to the vertical vector and the grabbing position.
A plane locating point is a point on the grabbing plane; it is understood that three points determine a plane, so the grabbing plane can be determined through three plane locating points located on it. A plane locating point can be determined through user selection, for example by selecting it on the grabbing plane through a ray-casting collision function; alternatively, plane marks can be set in advance on the grabbing surface of the target workpiece model, and the plane locating points then determined based on these marks, wherein the grabbing surface is the surface of the target workpiece model lying on the grabbing plane.
The grabbing plane is a contact surface of the clamping jaw of the mechanical arm model and the target workpiece model when the mechanical arm model grabs the target workpiece model. For the knuckle model in fig. 2, the grabbing plane is the plane where the shaft hole is located, and when the mechanical arm model grabs the knuckle model, the mechanical arm model clamps the knuckle model by penetrating the shaft hole through the clamping jaw, and the clamping jaw contacts with the plane where the shaft hole is located, namely the grabbing plane.
Further, to facilitate determination of the grabbing position, the plane locating points A are disposed at the edge of the grabbing surface. As shown in fig. 2, the grabbing surface of the steering knuckle is a regular circle; therefore, when the plane locating points A are disposed at the edge of the grabbing surface, the center of the circle passing through the plane locating points A is taken as the midpoint C between them. This midpoint C is located at the center of the shaft hole, so the grabbing position is located at the center of the shaft hole.
The grabbing position is the projection position of the clamping jaw of the mechanical arm model on the grabbing plane. The vertical vector is a vector perpendicular to the grabbing plane and passing through the grabbing position, with its direction pointing into the shaft hole; together with the grabbing position, the vertical vector determines the grabbing gesture of the mechanical arm model.
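For the circular grabbing surface described above, the grabbing position is the center of the circle through the three edge locating points (their circumcenter), and the vertical vector is the unit normal of the grabbing plane. A minimal stdlib-Python sketch of this geometry (illustrative only; the patent specifies no implementation):

```python
import math

def sub(u, v): return (u[0]-v[0], u[1]-v[1], u[2]-v[2])
def add(u, v): return (u[0]+v[0], u[1]+v[1], u[2]+v[2])
def scale(u, s): return (u[0]*s, u[1]*s, u[2]*s)
def dot(u, v): return u[0]*v[0] + u[1]*v[1] + u[2]*v[2]
def cross(u, v):
    return (u[1]*v[2]-u[2]*v[1], u[2]*v[0]-u[0]*v[2], u[0]*v[1]-u[1]*v[0])

def grasp_pose(p1, p2, p3):
    """From three plane locating points on the circular grabbing-surface
    edge, return the grabbing position (the circumcenter, i.e. the shaft
    hole center) and the unit vector perpendicular to the grabbing plane."""
    a, b = sub(p1, p3), sub(p2, p3)
    axb = cross(a, b)
    n2 = dot(axb, axb)                 # |a x b|^2; zero if points collinear
    # Circumcenter formula: C + ((|a|^2 b - |b|^2 a) x (a x b)) / (2|a x b|^2)
    num = cross(sub(scale(b, dot(a, a)), scale(a, dot(b, b))), axb)
    center = add(p3, scale(num, 1.0 / (2.0 * n2)))
    normal = scale(axb, 1.0 / math.sqrt(n2))
    return center, normal
```

For three points on the unit circle in the z = 0 plane, this returns the origin as the grabbing position and (0, 0, 1) as the vertical vector; the grabbing gesture then aligns the jaw axis with that vector through the center.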
the embodiment can accurately obtain the grabbing gesture.
Further, in a fourth embodiment of the method for generating a trajectory of a mechanical arm according to the present invention set forth in the first embodiment of the present invention, the step S40 includes the steps of:
step S41, if a movement instruction is received, controlling the mechanical arm model to act according to the movement instruction, wherein the mechanical arm model acts based on a kinematic model of the target mechanical arm;
step S42, if a point position instruction is received, taking the current gesture of the mechanical arm model as a track sub gesture according to the point position instruction;
step S43, if a completion instruction is received, determining the target track through the track sub-gesture according to the completion instruction.
It can be appreciated that when the motion control is performed on the mechanical arm model, the actual motion track of the mechanical arm model may be disordered, so that in order to make the control efficiency of the target mechanical arm higher and more accurate, a more accurate target track needs to be constructed.
The movement instruction is used for instructing the mechanical arm model to move. The movement instruction may indicate a movement direction, in which case the mechanical arm model moves continuously in the direction corresponding to the movement instruction; the movement instruction may also indicate coordinates, in which case the mechanical arm model is controlled to move to the coordinates corresponding to the movement instruction.
The point position instruction is used for indicating the determination of the track sub-gesture; the track sub-gesture is a characteristic gesture that constitutes the target track.
The target track is composed of a plurality of track sub-gestures with a sequence relation among them. Each track sub-gesture specifies a determined gesture of the mechanical arm model, and the mechanical arm model transitions between two consecutive track sub-gestures based on its own kinematic model, thereby realizing continuous gesture change and movement along the target track.
Specifically, a user sends a movement instruction, and the mechanical arm model adjusts its gesture based on the movement instruction. When the user sends a point position instruction, the current gesture of the mechanical arm model is taken as a track sub-gesture, and each track sub-gesture is associated with a sequence identifier in the order of its acquisition. By analogy, track sub-gestures are continuously determined through alternating movement instructions and point position instructions, and when the completion instruction is received, all the track sub-gestures are combined based on the sequence identifiers to obtain the target track.
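The instruction flow of steps S41 to S43 can be illustrated with a minimal recorder sketch. All names below are illustrative assumptions, not the patent's implementation, and the "gesture" is simplified to a joint-space vector:

```python
class TrajectoryRecorder:
    """Minimal sketch of steps S41-S43: movement instructions adjust the
    model's gesture, point position instructions record track sub-gestures
    with sequence identifiers, and a completion instruction assembles the
    target track (illustrative only, not the patented implementation)."""

    def __init__(self, initial_gesture):
        self.gesture = list(initial_gesture)   # current gesture of the model
        self.sub_gestures = []                 # (sequence identifier, gesture)

    def move(self, delta):
        # S41: act on the movement instruction (here: a joint-space offset).
        self.gesture = [g + d for g, d in zip(self.gesture, delta)]

    def point(self):
        # S42: take the current gesture as a track sub-gesture and associate
        # it with a sequence identifier in the order of acquisition.
        self.sub_gestures.append((len(self.sub_gestures), list(self.gesture)))

    def complete(self):
        # S43: combine all track sub-gestures by sequence identifier.
        return [g for _, g in sorted(self.sub_gestures)]

rec = TrajectoryRecorder([0.0, 0.0])
rec.move([0.5, 0.0]); rec.point()
rec.move([0.0, 0.5]); rec.point()
track = rec.complete()   # [[0.5, 0.0], [0.5, 0.5]]
```

Between two consecutive sub-gestures of `track`, the real arm would interpolate using its kinematic model, as described above.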
Further, the step S43 includes the steps of:
step S431, obtaining a first grabbing coordinate of a grabbing point of the target workpiece model;
step S432, converting the first grabbing coordinate into a workpiece coordinate system corresponding to the target workpiece model to obtain a second grabbing coordinate;
step S433, taking the associated coordinate of the second grabbing coordinate in a mechanical arm coordinate system as the starting point posture of the target track;
step S434, the track sub-gesture is used as the process gesture of the target track to obtain the target track.
It can be understood that the target track is intended for a mechanical arm model that grabs the target workpiece model; therefore, the starting point gesture of the target track should be the grabbing gesture for the target workpiece model. In practical application, the gesture and position of the target workpiece are not fixed, while the relative gesture between the target mechanical arm and the target workpiece at the moment of grabbing is determined. Therefore, to ensure that the target mechanical arm can accurately grab the target workpiece, the first grabbing coordinate is converted into the workpiece coordinate system to obtain the second grabbing coordinate, so that through the second grabbing coordinate the target mechanical arm can always grasp the corresponding grabbing position on the target workpiece.
It can be understood that the first grabbing coordinate is determined in the mechanical arm coordinate system according to the current gesture of the target workpiece model, while the second grabbing coordinate is fixed to the target workpiece in the workpiece coordinate system. In the mechanical arm coordinate system, if the gesture of the target workpiece model changes, the first grabbing coordinate changes accordingly; in the workpiece coordinate system, owing to the association between the target workpiece and the second grabbing coordinate, the second grabbing coordinate remains unchanged even if the gesture of the target workpiece changes. That is, the conversion from the first grabbing coordinate to the second grabbing coordinate is not merely a conversion between different coordinate systems, but a conversion of a coordinate that is uncertain in the mechanical arm coordinate system into a coordinate that is determined in the workpiece coordinate system, so that the subsequent starting point gesture can adapt to different gestures of the target workpiece and the target mechanical arm is guaranteed to grab the target workpiece.
The associated coordinate does not indicate a fixed coordinate in the mechanical arm coordinate system; rather, it indicates an association with the second grabbing coordinate. That is, as the relative position between the mechanical arm coordinate system and the workpiece coordinate system differs, the conversion relationship between the associated coordinate and the second grabbing coordinate also differs.
When the target mechanical arm acts according to the target track, the associated coordinate corresponding to the second grabbing coordinate is first determined based on the conversion relationship between the actual mechanical arm coordinate system and the workpiece coordinate system, and the starting point gesture is then determined from the associated coordinate. After the target workpiece is grabbed in the starting point gesture, the target workpiece is photographed by the target camera; after each shot is completed, the target mechanical arm adjusts its gesture to the next track sub-gesture in sequence, and the target workpiece is photographed by the target camera at each track sub-gesture, until all track sub-gestures have been traversed and photographing of the target workpiece is completed.
The embodiment can adapt to target workpieces with different postures.
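The coordinate conversion of steps S431 to S433 — expressing the grabbing point in the workpiece coordinate system so that it stays fixed as the workpiece's gesture changes, then recovering the associated coordinate in the mechanical arm coordinate system for a new workpiece pose — can be sketched with homogeneous transforms. This is a simplified illustration under the assumption that the workpiece poses are known; the names and numeric values are ours, not the patent's:

```python
import numpy as np

def to_homogeneous(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Pose of the workpiece in the mechanical arm coordinate system
# (illustrative values: a 90-degree yaw plus an offset).
yaw = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
T_arm_workpiece = to_homogeneous(yaw, [1.0, 2.0, 0.0])

# S431: first grabbing coordinate, observed in the arm coordinate system.
first_grab = np.array([1.0, 2.5, 0.0, 1.0])

# S432: convert into the workpiece coordinate system; this second grabbing
# coordinate stays fixed however the workpiece later moves.
second_grab = np.linalg.inv(T_arm_workpiece) @ first_grab

# S433: for a new workpiece pose, the associated coordinate in the arm
# coordinate system is recovered from the same second grabbing coordinate.
T_arm_workpiece_new = to_homogeneous(np.eye(3), [0.0, -1.0, 0.5])
associated = T_arm_workpiece_new @ second_grab
# second_grab stays [0.5, 0, 0] in the workpiece frame; associated follows
# the new workpiece pose in the arm frame.
```

The design point this illustrates is the one made above: the second grabbing coordinate is invariant in the workpiece frame, so only the workpiece pose needs to be re-measured for the starting point gesture to adapt to a differently placed workpiece.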
It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of action combinations, but it should be understood by those skilled in the art that the present application is not limited by the order of actions described, as some steps may be performed in other order or simultaneously in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required in the present application.
From the description of the above embodiments, it will be clear to a person skilled in the art that the method according to the above embodiments may be implemented by means of software plus the necessary general hardware platform, or by means of hardware, but in many cases the former is the preferred embodiment. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, magnetic disk, or optical disk) and comprising several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the method described in the embodiments of the present application.
The application also provides a mechanical arm track generating device for implementing the mechanical arm track generating method, wherein the mechanical arm track generating device comprises:
the first acquisition module is used for acquiring a target production line model and importing a target workpiece model into the target production line model, wherein the target production line model comprises a mechanical arm model and a camera model, the mechanical arm model corresponds to a target mechanical arm in a real production line, and the camera model corresponds to a target camera in the real production line;
The first control module is used for controlling the mechanical arm model to grasp the target workpiece model;
the first adjusting module is used for performing action control on the mechanical arm model so as to adjust the gesture of the target workpiece model, so that the target camera can acquire complete appearance data of the target workpiece model under the corresponding gesture of the camera model;
and the first determining module is used for determining the target track of the target mechanical arm according to the action control performed on the mechanical arm model.
The mechanical arm track generation device simulates the production line environment in a virtual environment, and performs gesture adjustment on the target workpiece model corresponding to the real workpiece based on the mechanical arm model corresponding to the real production line. This avoids occupying the real production line and affecting its normal production; meanwhile, compared with a real mechanical arm, controlling the mechanical arm model in the virtual environment is simpler and more accurate, so the generation efficiency and accuracy of the mechanical arm track can be improved.
It should be noted that, the first obtaining module in this embodiment may be used to perform step S10 in the embodiment of the present application, the first control module in this embodiment may be used to perform step S20 in the embodiment of the present application, the first adjusting module in this embodiment may be used to perform step S30 in the embodiment of the present application, and the first determining module in this embodiment may be used to perform step S40 in the embodiment of the present application.
Further, the first acquisition module includes:
the first creating unit is used for creating a virtual three-dimensional scene and acquiring a production line device model, wherein the production line device model comprises the mechanical arm model and the camera model;
a first importing unit, configured to import the line device model into the virtual three-dimensional scene;
the first acquisition unit is used for acquiring setting parameters corresponding to the device models of the production line, wherein the setting parameters are obtained based on the corresponding devices in the actual production line;
and the first setting unit is used for setting the production line device models according to the setting parameters to obtain the target production line model.
Further, the first setting unit includes:
the first acquisition subunit is used for acquiring kinematic parameters corresponding to the target mechanical arm in the setting parameters;
the first setting subunit is used for setting the kinematic model corresponding to the mechanical arm model through the kinematic parameters.
Further, the first control module includes:
the first determining unit is used for determining a grabbing position in the target workpiece model and determining grabbing postures of the mechanical arm model according to the grabbing position;
And the first control unit is used for controlling the mechanical arm model according to the grabbing gesture so as to grab the target workpiece model.
Further, the first determination unit includes:
a first determining subunit, configured to determine plane positioning points in the target workpiece model, where the number of plane positioning points is 3;
the second determining subunit is used for determining a grabbing plane according to the plane locating points, determining the middle points between the plane locating points on the grabbing plane and obtaining grabbing positions;
and the third determination subunit is used for determining the vertical vector of the grabbing position and determining the grabbing gesture of the mechanical arm model according to the vertical vector and the grabbing position.
Further, the first determining module includes:
the second control unit is used for controlling the mechanical arm model to act according to the moving instruction if the moving instruction is received, wherein the mechanical arm model acts based on the kinematic model of the target mechanical arm;
the first execution unit is used for taking the current gesture of the mechanical arm model as a track sub gesture according to the point position instruction if the point position instruction is received;
And the second determining unit is used for determining the target track through the track sub-gesture according to the completion instruction if the completion instruction is received.
Further, the second determining unit includes:
the second acquisition subunit is used for acquiring first grabbing coordinates of grabbing points of the target workpiece model;
the first execution subunit is used for converting the first grabbing coordinate into a workpiece coordinate system corresponding to the target workpiece model to obtain a second grabbing coordinate;
the second execution subunit is used for taking the associated coordinate of the second grabbing coordinate in the mechanical arm coordinate system as the starting point posture of the target track;
and the third execution subunit is used for taking the track sub-gesture as the process gesture of the target track to obtain the target track.
It should be noted that the above modules are the same as examples and application scenarios implemented by the corresponding steps, but are not limited to what is disclosed in the above embodiments. It should be noted that, the above modules may be implemented in software as a part of the apparatus, or may be implemented in hardware, where the hardware environment includes a network environment.
Referring to fig. 3, in terms of hardware configuration, the electronic device may include components such as a communication module 10, a memory 20, and a processor 30. In the electronic device, the processor 30 is connected to the memory 20 and the communication module 10, the memory 20 stores a computer program, and the computer program, when executed by the processor 30, implements the steps of the method embodiments described above.
The communication module 10 is connectable to an external communication device via a network. The communication module 10 may receive a request sent by an external communication device, and may also send a request, an instruction, and information to the external communication device, where the external communication device may be other electronic devices, a server, or an internet of things device, such as a television, and so on.
The memory 20 is used for storing software programs and various data. The memory 20 may mainly include a memory program area and a memory data area, wherein the memory program area may store an operating system, an application program required for at least one function (such as performing motion control on the robot arm model), and the like; the storage data area may include a database, may store data or information created according to the use of the system, and the like. In addition, the memory 20 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 30, which is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 20, and calling data stored in the memory 20, thereby performing overall monitoring of the electronic device. Processor 30 may include one or more processing units; alternatively, the processor 30 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 30.
Although not shown in fig. 3, the electronic device may further include a circuit control module, where the circuit control module is used to connect to a power source to ensure normal operation of other components. Those skilled in the art will appreciate that the electronic device structure shown in fig. 3 is not limiting of the electronic device and may include more or fewer components than shown, or may combine certain components, or may be arranged in different components.
The present invention also proposes a computer-readable storage medium on which a computer program is stored. The computer-readable storage medium may be the memory 20 in the electronic device of fig. 3, or may be at least one of a ROM (Read-Only Memory)/RAM (Random Access Memory), magnetic disk, or optical disk; the computer-readable storage medium includes several instructions for causing a terminal device (which may be a television, an automobile, a mobile phone, a computer, a server, a terminal, or a network device) having a processor to perform the method according to the embodiments of the present invention.
In the present invention, the terms "first", "second", "third", "fourth", "fifth" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance, and the specific meaning of the above terms in the present invention will be understood by those of ordinary skill in the art depending on the specific circumstances.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Although embodiments of the present invention have been shown and described above, the scope of the present invention is not limited thereto, and it should be understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications and substitutions of the above embodiments may be made by those skilled in the art within the scope of the present invention, and are intended to be included in the scope of the present invention. Therefore, the protection scope of the invention is subject to the protection scope of the claims.

Claims (10)

1. The mechanical arm track generation method is characterized by comprising the following steps of:
acquiring a target production line model, and importing a target workpiece model into the target production line model, wherein the target production line model comprises a mechanical arm model and a camera model, the mechanical arm model corresponds to a target mechanical arm in a real production line, and the camera model corresponds to a target camera in the real production line;
controlling the mechanical arm model to grasp the target workpiece model;
performing motion control on the mechanical arm model to adjust the posture of the target workpiece model so that the target camera can acquire complete appearance data of the target workpiece model under the corresponding posture of the camera model;
determining a target track of the target mechanical arm according to action control performed on the mechanical arm model; the target track is used for enabling the controlled target mechanical arm to drive the target workpiece to act, so that the target camera can acquire complete appearance data of the target workpiece to act.
2. The method of claim 1, wherein the step of obtaining the target line model comprises:
Creating a virtual three-dimensional scene and acquiring a production line device model, wherein the production line device model comprises the mechanical arm model and the camera model;
importing the line device model into the virtual three-dimensional scene;
acquiring setting parameters corresponding to the device models of the production lines, wherein the setting parameters are obtained based on the corresponding devices in the actual production lines;
and setting each production line device model according to the setting parameters to obtain the target production line model.
3. The method of claim 2, wherein the step of setting each of the production line device models according to the setting parameters includes:
acquiring kinematic parameters corresponding to the target mechanical arm in the setting parameters;
and setting a kinematic model corresponding to the mechanical arm model through the kinematic parameters.
4. The robot trajectory generation method of claim 1, wherein the step of controlling the robot model to grasp the target workpiece model includes:
determining a grabbing position in the target workpiece model, and determining the grabbing gesture of the mechanical arm model according to the grabbing position;
And controlling the mechanical arm model according to the grabbing gesture so as to grab the target workpiece model.
5. The method of claim 4, wherein the step of determining a gripping position in the target workpiece model and determining a gripping posture of the robot model from the gripping position comprises:
determining plane positioning points in the target workpiece model, wherein the number of the plane positioning points is 3;
determining a grabbing plane according to the plane locating points, and determining the midpoints between the plane locating points on the grabbing plane to obtain grabbing positions;
and determining a vertical vector of the grabbing position, and determining the grabbing gesture of the mechanical arm model according to the vertical vector and the grabbing position.
6. The robot arm trajectory generation method according to claim 1, wherein the step of determining the target trajectory of the target robot arm based on motion control performed on the robot arm model includes:
if a movement instruction is received, controlling the mechanical arm model to act according to the movement instruction, wherein the mechanical arm model acts based on a kinematic model of the target mechanical arm;
If a point position instruction is received, taking the current gesture of the mechanical arm model as a track sub gesture according to the point position instruction;
and if a completion instruction is received, determining the target track through the track sub-gesture according to the completion instruction.
7. The robot arm trajectory generation method of claim 6, wherein the step of determining the target trajectory from the trajectory sub-pose according to the completion instruction includes:
acquiring a first grabbing coordinate of a grabbing point of the target workpiece model;
converting the first grabbing coordinate into a workpiece coordinate system corresponding to the target workpiece model to obtain a second grabbing coordinate;
taking the associated coordinate of the second grabbing coordinate in a mechanical arm coordinate system as a starting point posture of the target track;
and taking the track sub-gesture as a process gesture of the target track to obtain the target track.
8. The mechanical arm track generating device is characterized by comprising:
the first acquisition module is used for acquiring a target production line model and importing a target workpiece model into the target production line model, wherein the target production line model comprises a mechanical arm model and a camera model, the mechanical arm model corresponds to a target mechanical arm in a real production line, and the camera model corresponds to a target camera in the real production line;
The first control module is used for controlling the mechanical arm model to grasp the target workpiece model;
the first adjusting module is used for performing action control on the mechanical arm model so as to adjust the gesture of the target workpiece model, so that the target camera can acquire complete appearance data of the target workpiece model under the corresponding gesture of the camera model;
the first determining module is used for determining a target track of the target mechanical arm according to action control performed on the mechanical arm model; the target track is used for enabling the controlled target mechanical arm to drive the target workpiece to act, so that the target camera can acquire complete appearance data of the target workpiece to act.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, which when executed by the processor, implements the steps of the robot trajectory generation method of any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when executed by a processor, implements the steps of the robot trajectory generation method according to any one of claims 1 to 7.
CN202311574113.8A 2023-11-23 2023-11-23 Mechanical arm track generation method and device, electronic equipment and readable storage medium Active CN117301077B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311574113.8A CN117301077B (en) 2023-11-23 2023-11-23 Mechanical arm track generation method and device, electronic equipment and readable storage medium


Publications (2)

Publication Number Publication Date
CN117301077A (en) 2023-12-29
CN117301077B (en) 2024-03-26

Family

ID=89281348

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311574113.8A Active CN117301077B (en) 2023-11-23 2023-11-23 Mechanical arm track generation method and device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN117301077B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2500148A1 (en) * 2011-03-17 2012-09-19 KUKA Roboter GmbH Method and apparatus for controlling a robot using a virtual model of the robot
CN109719730A (en) * 2019-01-25 2019-05-07 温州大学 A kind of twin robot of number of breaker flexibility assembling process
CN112476434A (en) * 2020-11-24 2021-03-12 新拓三维技术(深圳)有限公司 Visual 3D pick-and-place method and system based on cooperative robot
CN113874844A (en) * 2019-07-29 2021-12-31 西门子股份公司 Simulation method, device and system of context awareness device
CN114888801A (en) * 2022-05-16 2022-08-12 南京邮电大学 Mechanical arm control method and system based on offline strategy reinforcement learning
CN115903541A (en) * 2022-11-16 2023-04-04 北京航空航天大学 Visual algorithm simulation data set generation and verification method based on twin scene



Similar Documents

Publication Publication Date Title
US20140288710A1 (en) Robot system and calibration method
CN114273726B (en) 3D vision guiding groove cutting method, device, equipment, system and storage medium
CN113664835B (en) Automatic hand-eye calibration method and system for robot
TW201927497A (en) Robot arm automatic processing system, method, and non-transitory computer-readable recording medium
CN109814434B (en) Calibration method and device of control program
CN113379849A (en) Robot autonomous recognition intelligent grabbing method and system based on depth camera
US20090228144A1 (en) Method For Calculating Rotation Center Point And Axis Of Rotation, Method For Generating Program, Method For Moving Manipulator And Positioning Device, And Robotic System
CN116147527A (en) Three-dimensional scanning system and scanning path planning method thereof
CN114139857A (en) Workpiece finishing process correcting method, system, storage medium and device
CN113211447A (en) Mechanical arm real-time perception planning method and system based on bidirectional RRT algorithm
CN115383256A (en) Automatic welding method, device and system
CN117301077B (en) Mechanical arm track generation method and device, electronic equipment and readable storage medium
CN114670189A (en) Storage medium, and method and system for generating control program of robot
US20230123629A1 (en) 3d computer-vision system with variable spatial resolution
TWI807990B (en) Robot teaching system
CN114833832B (en) Robot hand-eye calibration method, device, equipment and readable storage medium
US20230278196A1 (en) Robot system
CN109664273B (en) Industrial robot cursor dragging teaching method and system
CN112184819A (en) Robot guiding method and device, computer equipment and storage medium
JPH09323280A (en) Control method and system of manupulator
CN110315542A (en) A kind of Multi-axis motion control method of industrial robot
CN115319323B (en) Tube plate welding method, system, welding robot and storage medium
CN215701709U (en) Configurable hand-eye calibration device
CN112706161B (en) Gluing control system with intelligent sensing capability
CN116563491B (en) Digital twin scene modeling and calibration method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant