CN115016293A - Pig carcass segmentation robot path autonomous correction method based on force feedback - Google Patents
Pig carcass segmentation robot path autonomous correction method based on force feedback
- Publication number
- CN115016293A (application number CN202210872407.8A)
- Authority
- CN
- China
- Prior art keywords
- robot
- force
- path
- segmentation
- vector
- Prior art date
- Legal status: Pending (an assumption by Google Patents, not a legal conclusion)
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B13/00—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion
- G05B13/02—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
- G05B13/04—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric involving the use of models or simulators
- G05B13/042—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric involving the use of models or simulators in which a parameter or coefficient is automatically adjusted to optimise the performance
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Evolutionary Computation (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
Abstract
The invention provides a force-feedback-based method for autonomously correcting the path of a pig carcass segmentation robot, comprising the following steps: a visual identification system identifies the pig carcass dividing line to obtain a preset segmentation trajectory; the motion position vector of the segmentation robot is obtained from the three-dimensional coordinate points of the preset segmentation trajectory and substituted into a kinematic equation to calculate the matrix transformation quantity of each joint motor, through which each joint motor is controlled; the visual recognition system detects the cutting path of the cutter in real time and judges whether it deviates from the preset segmentation path; an impedance model established from a force-sense admittance control strategy calculates the correction quantity of the cutting path, which is continuously adjusted and corrected. The invention can replace manual pork segmentation on a pork product processing line, effectively reducing secondary microbial contamination of the meat during manual segmentation; it realizes accurate and flexible deboning cuts of the pig carcass, reduces cutting error and the generation of offcuts, and saves a large amount of labor cost.
Description
Technical Field
The invention relates to the technical field of intelligent control, in particular to a pig carcass segmentation robot path autonomous correction method based on force feedback.
Background
Pork and pork products are indispensable in people's daily life and an important component of the human diet, accounting for 65 percent of China's total meat production. In recent years, with social and economic development and rising living standards, market demand for nutritious, high-quality pork products has grown year by year, as have expectations of pork quality.
Carcass segmentation is an important process in pig slaughtering, the basis of pork carcass processing, and a key step in producing high-quality pork products. Most existing pork product processing lines rely on manual operation: workers divide the carcass into pork, trotters, chops and other cuts according to production requirements, and the direct contact between workers and raw meat easily compromises food health and safety. Using a robot to cut the pig carcass can significantly improve segmentation efficiency and guarantee hygiene and safety in production; however, uncontrolled cutting speed and direction easily cause the blade to strike bone repeatedly during cutting, shortening the service life of the end cutter. To solve these problems, a robot path planning method capable of finely planning and autonomously correcting the cutting path is needed.
The invention patent with application number CN201910691522.3 discloses a method for automatically correcting the path of an industrial robot: speed and position feedback parameters of each joint are obtained through servo motors, sensors and brakes and transmitted to a servo control system; position and speed error amplifiers compare the given values with the feedback values to obtain the corresponding control signals; and drivers complete the joint motion control and the adjustment of joint position and speed, thereby correcting the actual path of the industrial robot. However, that method is aimed at industrial manufacturing, mostly at correcting machining paths of parts. Pork carcass segmentation in the food processing field differs greatly from part machining in hardness, processing requirements and the like; applying such a path correction method to carcass segmentation cannot guarantee effective correction, and the segmentation process may produce large amounts of bone residue and crushed meat, causing waste, degrading the final quality of the processed meat, and even causing irreparable economic loss to enterprises.
Disclosure of Invention
Aiming at the technical problem that existing robot path correction methods cannot guarantee effective path correction during pig carcass processing, the invention provides a force-feedback-based method for autonomously correcting the path of a pig carcass segmentation robot.
In order to achieve the purpose, the technical scheme of the invention is realized as follows: a pig carcass splitting robot path autonomous correction method based on force feedback comprises the following steps:
step one: the visual identification system identifies the pig carcass dividing line to obtain a preset segmentation trajectory;
step two: and (3) planning and executing kinematics of a preset segmentation track: obtaining a motion position vector of the segmentation robot according to a three-dimensional coordinate point of a preset segmentation track, substituting the motion position vector into a kinematic equation of the segmentation robot to calculate a motor matrix transformation quantity corresponding to each joint, and controlling motors of each joint position of the segmentation robot;
step three: the vision recognition system detects the cutting path of the cutter of the segmentation robot in real time, judges whether the cutting path has deviation from the preset segmentation path, if so, enters the step four, otherwise, returns to the step two;
step four: the force admittance control system establishes an impedance model based on a force admittance control strategy to calculate the correction quantity of the cutting path, obtains the motor matrix transformation quantity corresponding to each joint through the inverse kinematics solution of the robot, and continuously adjusts and corrects the cutting path until the actual cutting path has no deviation from the preset cutting path.
Preferably, the preset segmentation track is implemented by: performing key feature extraction on a segmented part image of a pig carcass acquired by an industrial camera based on a yolov4 target detection method to obtain a large amount of image data of a key part of the pig carcass, and training the image data of the key part of the pig carcass to obtain a pig carcass segmentation line; a series of three-dimensional coordinate points are obtained through three-dimensional coordinate transformation of a depth camera of the visual recognition system, and the three-dimensional coordinate points jointly form a preset segmentation track.
Preferably, the industrial camera of the vision recognition system is fixed above the segmentation robot and remains stationary, and an eye-to-hand calibration method (camera outside the hand) is adopted: a nine-point calibration method based on camera calibration, in which the segmentation robot moves to 9 points of a grid on a plane parallel to the working plane, returns to the grid centre, and rotates a certain number of degrees to the left and right; the robot position is recorded after each move while the industrial camera takes a picture and records the same mark point in its field of view. This yields the mapping between the robot position and the point in the image coordinate system, and thereby establishes the coordinate transformation between camera and robot. Camera calibration parameters are then computed in MATLAB to obtain the transformation relations among the robot base coordinate system, the manipulator coordinate system, the camera coordinate system and the workpiece coordinate system.
Preferably, the kinematic planning and execution method in step two is as follows: the three-dimensional coordinate points (x, y, z) of the preset segmentation trajectory are composed into the robot motion position vector $x$, which is differentiated to obtain the robot motion velocity vector $\dot{x}$; from $\dot{x} = J(q)\dot{q}$ the joint velocity vector $\dot{q}$ is solved inversely, and further integration of $\dot{q}$ yields the angle vector $q$, which is substituted into the joint-space dynamic equation

$M_r(q)\ddot{q} + C_r(q,\dot{q})\dot{q} + G_r(q) = \tau_r + \tau_e$

to obtain the robot joint motor driving torque $\tau_r$; the control input force of the segmentation robot is $F_r = J(q)^{-T}\tau_r$. This gives the motor driving force matrix $\tau_r$ corresponding to each joint motor; the difference between successive driving force matrices is the motor driving force matrix transformation quantity, through which each joint motor is controlled. Here $J(q)$ is the Jacobian matrix, $\tau_r$ the joint driving torque, $\tau_e$ the external driving torque, $M_r(q)$ the robot inertia matrix in the joint-space coordinate system, $C_r(q,\dot{q})$ the Coriolis force matrix in the joint-space coordinate system, and $G_r(q)$ the gravity matrix in the joint-space coordinate system.
Preferably, the depth camera of the vision recognition system recognizes the cutting path on the pig carcass in real time and compares it with the preset segmentation trajectory; if a deviation appears, the six-dimensional force sensor of the vision recognition system acquires the contact force values at the two ends of the cutter, namely the force feedback value $F_e$ of the external environment on the cutter. When $F_e$ exceeds the set threshold of 0.8 N, the cutting path of the cutter is judged to deviate from the preset segmentation trajectory; when $F_e$ is below 0.8 N, the cutting path is judged to have no deviation from the preset segmentation path.
Preferably, the method for the cutting path correction quantity is as follows: the force-sense admittance control system adopts an inner loop based on position control and an outer loop strategy based on force control; the contact force between the six-dimensional force sensor of the vision recognition system and the outside generates a pose control quantity $x_u$ through a second-order admittance model, the pose control quantity $x_u$ corrects the preset position trajectory, and the result is finally sent into the position-control inner loop to complete the final position control.
wherein M is the inertia coefficient, K the stiffness coefficient, and B the damping coefficient; $F_e$ is the force feedback value between the outside and the cutter; $x_e = x - x_d$ is the difference between the robot end motion position vector $x$ and the planned displacement $x_d$ in the base coordinate system; $\dot{x}_e$ and $\ddot{x}_e$ are the first and second derivatives of the difference $x_e$, respectively.
Preferably, the pose control quantity $x_u$ is calculated as follows: the angle sensor on each joint motor detects the current feedback angle vector $q$, and the robot motion velocity vector is computed as $\dot{x} = J(q)\dot{q}$; integrating $\dot{x}$ gives the motion position vector $x$ of the segmentation robot, and the displacement deviation $x_e = x - x_d$ is calculated; differentiating $x_e$ gives the velocity deviation $\dot{x}_e$. With the force feedback value $F_e$ collected by the force sensor, the expected acceleration is obtained from the impedance model as $\ddot{x}_e = M^{-1}(F_e - B\dot{x}_e - K x_e)$; integrating $\ddot{x}_e$ twice gives the corrected pose deviation $x_e$, which is superimposed on the desired planned displacement $x_d$ to obtain the final pose control quantity $x_u = x_d + x_e$; wherein the planned displacement $x_d$ is the robot motion position vector of the preset segmentation trajectory and $J(q)$ is the Jacobian matrix.
Preferably, the impedance model is

$M\ddot{x}_e + B\dot{x}_e + K x_e = F_e$

wherein $F_e$ is the force feedback value in the base coordinate system acquired by the force sensor; K is the stiffness coefficient; B is the damping coefficient; M is the inertia coefficient;

and further integration gives the corrected pose deviation:

$x_e = \iint M^{-1}\left(F_e - B\dot{x}_e - K x_e\right)\,\mathrm{d}t\,\mathrm{d}t$
the force feedback-based pig carcass splitting robot path autonomous correction method of claim 9, characterized in that motion velocity vector is correctedAnd (3) reversely solving to obtain an angle vector q, substituting into a joint space kinetic equation: obtaining the driving torque tau of each joint motor of the robot r Controlling the motion of each joint of the robot to complete the self-correction of the cutting path to obtain a finally established correction track; wherein J (q) represents a Jacobian matrix, q is an angle vector in a space coordinate system,in the form of a velocity vector, the velocity vector,is an acceleration vector, τ r For joint driving torque, tau e For external driving torque, M r (q) is a robot inertia matrix under a joint space coordinate system,is a Coriolis force matrix in a joint space coordinate system, G r And (q) is a gravity matrix under a joint space coordinate system.
Compared with the prior art, the invention has the following beneficial effects. During self-correction of the pig carcass cutting path, a suitable impedance model is established based on the force-sense admittance control strategy; the contact force value between the force sensor and the outside and the relevant robot parameter values are substituted into the second-order admittance model to generate a trajectory correction quantity, which is superimposed on the preset trajectory to obtain the corrected trajectory, and the control system performs kinematic planning and execution on the corrected trajectory. Throughout the process, the small six-dimensional force sensors on the two sides of the cutter sense the force feedback value of the external environment on the cutter in real time; the force-sense admittance control system compares the real-time force feedback value with the set threshold and continuously outputs trajectory corrections to keep the force feedback value within the threshold, while performing the kinematic planning and execution of the corrected trajectory until the end of the trajectory is reached, completing the autonomous correction of the pig carcass segmentation path. The invention can replace manual pork segmentation on pork product processing lines, effectively reducing secondary microbial contamination of the pork during manual segmentation; by autonomously correcting the cutting path with a force-feedback-based method, it realizes accurate and flexible deboning cuts of the pig carcass, can provide finely cut products that suit Chinese consumption habits and cooking styles, reduces cutting error and the generation of offcuts, and saves a large amount of labor cost.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a schematic flow chart of the present invention.
FIG. 2 is a diagram illustrating path correction according to the present invention.
FIG. 3 is a schematic block diagram of a force admittance control system of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without inventive effort based on the embodiments of the present invention, are within the scope of the present invention.
As shown in fig. 1, a pig carcass segmentation robot path autonomous correction method based on force feedback includes the following steps:
the method comprises the following steps: the method comprises the following steps that a visual recognition system recognizes and extracts a pig carcass dividing line to obtain a preset dividing track; and a cutter of the splitting robot cuts the carcass to be split along a preset track.
The visual recognition system comprises a multi-dimensional perception system formed by a depth camera, an industrial camera and a six-dimensional force sensor: the depth camera extracts depth information of the key parts of the pig carcass, the industrial camera recognizes the pig carcass dividing line, and the six-dimensional force sensor acquires the contact force values at the two ends of the cutter. Key features of the segmented part are extracted based on the yolov4 target detection method to obtain a large amount of image data of the key parts of the pig carcass; this image data is trained to obtain the pig carcass dividing line, and finally a series of three-dimensional coordinate points is obtained through the three-dimensional coordinate transformation of the depth camera. These coordinate points together form the preset segmentation trajectory A-B-C, as shown in figure 2.
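As a rough illustration of how a dividing line detected in the image becomes a 3-D preset trajectory, the sketch below back-projects line pixels through a pinhole camera model. The intrinsic parameters (FX, FY, CX, CY) and the sample pixels are hypothetical placeholders, not values from this patent.

```python
import numpy as np

# Hypothetical depth-camera intrinsics; real values come from calibration.
FX, FY, CX, CY = 615.0, 615.0, 320.0, 240.0

def backproject(u, v, depth_m):
    """Map one pixel of the detected dividing line plus its depth
    reading to a 3-D point in the camera frame (pinhole model)."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.array([x, y, depth_m])

# Samples along a detected cut line: (u, v, depth in metres).  Their
# back-projections together form the preset trajectory A-B-C.
line_pixels = [(300, 200, 0.80), (310, 240, 0.81), (320, 280, 0.82)]
trajectory = np.array([backproject(u, v, d) for u, v, d in line_pixels])
```

A pixel at the principal point (CX, CY) maps to a point straight ahead of the camera at its measured depth, which is a quick sanity check for the intrinsics.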
Step two: kinematic planning and execution of the preset segmentation trajectory: the robot motion position vector $x_d$ is obtained from the three-dimensional coordinate points of the preset segmentation trajectory and substituted into the kinematic equation of the robot to calculate, step by step, the motor matrix transformation quantity corresponding to each joint and control the motor at each joint position of the robot.
In order to calculate the angle vector $q$ corresponding to each joint, the dynamic equation of the robot in the Cartesian coordinate system is established as

$M_r(x)\ddot{x} + C_r(x,\dot{x})\dot{x} + G_r(x) + F_f = F_r + F_e$

wherein $M_r$ is the robot inertia matrix, $C_r$ the Coriolis force matrix, and $G_r$ the gravity matrix; $x$ is the motion position vector of the robot end tool, $\dot{x}$ the robot motion velocity vector, and $\ddot{x}$ the robot motion acceleration vector; $F_f$ is the friction force, $F_r$ the control input force of the robot system, and $F_e$ the interaction force between the external environment and the robot.
The control of the robot end position is completed by the cooperative motion of all joints, so a dynamic coupling relation between the Cartesian coordinate system and the joint-space coordinate system must be established; the joint-space dynamic equation is

$M_r(q)\ddot{q} + C_r(q,\dot{q})\dot{q} + G_r(q) = \tau_r + \tau_e$

wherein $q$ is the angle vector of the robot joint coordinates, $\dot{q}$ the velocity vector, $\ddot{q}$ the acceleration vector, $\tau_r$ the joint driving torque, and $\tau_e$ the external driving torque. The parameter correspondences between the two coordinate systems are:
$M_r(x) = J^{-T} M_r(q) J^{-1}$

$C_r(x) = J^{-T}\left(C_r(q) - M_r(q) J^{-1}\dot{J}\right) J^{-1}$

$G_r(x) = J^{-T} G_r(q)$

$F_r = J^{-T} \tau_r$

$F_e = J^{-T} \tau_e$
wherein J is a velocity Jacobian matrix.
In order to make the motion of the robot end tool more accurate, the precise positional relations among the robot end position, the industrial camera and all robot joints must be obtained. These lie in different coordinate systems, namely the robot base coordinate system, the manipulator coordinate system, the camera coordinate system and the workpiece coordinate system. To obtain the transformations between these coordinate systems, robot hand-eye calibration is carried out: the industrial camera is fixed above the robot and remains stationary, and an eye-to-hand calibration method is adopted. In the nine-point calibration method based on camera calibration, the segmentation robot moves to 9 points of a grid on a plane parallel to the working plane, returns to the grid centre, and rotates a certain number of degrees to the left and right; the robot position is recorded after each move while the industrial camera takes a picture and records the same mark point in its field of view, yielding the mapping between the robot position and the point in the image coordinate system and thereby establishing the coordinate transformation between camera and robot. Finally, camera calibration parameters are computed in MATLAB to obtain the transformation relations among the robot base coordinate system, the manipulator coordinate system, the camera coordinate system and the workpiece coordinate system.
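The planar mapping recovered by the nine-point calibration can be sketched as a least-squares affine fit between image coordinates and robot plane coordinates (the text computes the calibration in MATLAB; this Python sketch and its synthetic grid points are illustrative assumptions).

```python
import numpy as np

def fit_affine(img_pts, robot_pts):
    """Least-squares 2-D affine map (image -> robot plane) from the
    nine point pairs recorded during nine-point calibration."""
    A = np.hstack([img_pts, np.ones((len(img_pts), 1))])  # rows [u, v, 1]
    M, *_ = np.linalg.lstsq(A, robot_pts, rcond=None)     # 3x2 affine matrix
    return M

def img_to_robot(M, u, v):
    """Apply the fitted map to one image point."""
    return np.array([u, v, 1.0]) @ M

# Synthetic example: a known scale + offset plays the role of the true map.
img = np.array([[u, v] for u in (100, 300, 500) for v in (100, 300, 500)], float)
robot = img * 0.5 + np.array([20.0, -10.0])   # ground-truth affine for the demo
M = fit_affine(img, robot)
```

With nine well-spread grid points the fit is over-determined, so small measurement noise in the recorded robot positions is averaged out by the least-squares solution.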
Motion planning and execution: the three-dimensional coordinate points (x, y, z) of the preset segmentation trajectory are composed into the robot motion position vector $x$, which is differentiated to obtain the robot motion velocity vector $\dot{x}$; from the formula $\dot{x} = J(q)\dot{q}$, where $J(q)$ is the Jacobian matrix, the joint velocity vector $\dot{q}$ is solved, and further integration of $\dot{q}$ yields the angle vector $q$, which is substituted into the joint-space dynamic equation

$M_r(q)\ddot{q} + C_r(q,\dot{q})\dot{q} + G_r(q) = \tau_r + \tau_e$

to obtain the robot joint motor driving torque $\tau_r$. Substituting the differences of the motion position vector $x$ before and after planning step by step into this formula gives the motor driving force matrix $\tau_r$ corresponding to each joint motor; the difference between successive matrices is the motor driving force matrix transformation quantity, and each joint motor is controlled specifically through program code, completing the robot kinematic planning and execution.
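The planning chain above (differentiate $x$, solve $\dot{x} = J(q)\dot{q}$ for $\dot{q}$, integrate to $q$) can be sketched as a resolved-rate loop. The Jacobian is passed in as a caller-supplied function, and the finite-difference derivative plus explicit Euler integration are implementation assumptions, not details from the patent.

```python
import numpy as np

def plan_joints(path, jacobian, q0, dt):
    """Resolved-rate sketch of step two: numerically differentiate the
    Cartesian trajectory, invert the Jacobian for joint rates, and
    integrate the rates into joint angle vectors q."""
    q = np.asarray(q0, float).copy()
    qs = [q.copy()]
    for k in range(1, len(path)):
        xdot = (path[k] - path[k - 1]) / dt          # motion velocity vector
        qdot = np.linalg.pinv(jacobian(q)) @ xdot    # solve xdot = J(q) qdot
        q = q + qdot * dt                            # integrate to angle vector
        qs.append(q.copy())
    return np.array(qs)
```

For a degenerate robot whose Jacobian is the identity, the joint angles simply track the Cartesian path, which gives a quick correctness check; the pseudo-inverse also keeps the step well defined near singular configurations.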
Step three: and (4) detecting the motion track of the cutter of the segmentation robot in real time by the vision recognition system, judging whether the actual cutting path has deviation from the preset segmentation path, if so, entering the step four, and if not, returning to the step two.
The multi-dimensional perception system composed of the depth camera, the industrial camera and the six-dimensional force sensor senses the cutter's motion trajectory and external contact state in real time. The depth camera recognizes the pig carcass segmentation path in real time and compares it with the preset segmentation trajectory A-B-C; if it deviates, the segmentation path changes from A-B-C to A-B'-C. When the contact force values at the two ends of the cutter acquired by the six-dimensional force sensor exceed the set threshold of 0.8 N, the robot control system judges that the cutter's running trajectory deviates from the preset segmentation trajectory, and the system enters the self-correction process of the cutting path.
Judging the deviation of the robot cutting path: small six-dimensional force sensors are arranged on the two sides of the cutter to continuously sense the force feedback value $F_e$ of the external environment on the cutter. When $F_e$ exceeds 0.8 N, the cutting path is judged to deviate from the preset segmentation path, i.e. the path deviates from A-B-C to A-B'-C; when $F_e$ is below 0.8 N, the cutting path is judged to have no deviation from the preset segmentation path.
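The deviation test can be sketched as below. The text does not state whether the 0.8 N threshold applies to the resultant force magnitude or to each axis individually, so the magnitude form here is an assumption.

```python
FORCE_THRESHOLD_N = 0.8  # deviation threshold used throughout the method

def path_deviates(wrench):
    """wrench: (Fx, Fy, Fz, Tx, Ty, Tz) from the six-dimensional force
    sensor; flag a deviation when the resultant contact force exceeds
    the 0.8 N threshold (assumed to apply to the force magnitude)."""
    fx, fy, fz = wrench[0], wrench[1], wrench[2]
    return (fx * fx + fy * fy + fz * fz) ** 0.5 > FORCE_THRESHOLD_N
```

In the control loop this predicate is evaluated on every sensor sample; a True result switches the system from trajectory tracking into the self-correction process of step four.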
Step four: the force admittance control system establishes an impedance model based on a force admittance control strategy to calculate the correction quantity of the cutting path, obtains the motor matrix transformation quantity corresponding to each joint through the inverse kinematics solution of the robot, and continuously adjusts and corrects the cutting path until the actual cutting path has no deviation from the preset cutting path.
When the force feedback value $F_e$ exceeds 0.8 N, the force-sense admittance control system begins calculating the path correction quantity and continuously outputs cutting path corrections; the motor matrix transformation quantity corresponding to each joint is obtained through the inverse kinematics solution of the robot, and the cutting path is continuously adjusted and corrected until the real-time force feedback value $F_e$ falls below 0.8 N. Throughout, force feedback and visual feedback are continuously cross-fused into a multi-dimensional perception network through which the robot senses changes in the external environment.
In the self-correction process of the robot cutting path, a proper impedance model is established based on a force sense admittance control strategy, a contact force value of a force sense sensor and the outside and a robot related parameter value are substituted into a second-order admittance model to generate a cutting path correction quantity, the cutting path correction quantity and a preset segmentation track are superposed to obtain a correction track, and a robot control system performs kinematic planning and execution in the step two on the correction track. In the whole process, the small six-dimensional force sensors on two sides of the cutter can sense the force feedback value of the external environment to the cutter in real time, the force admittance control system judges and compares the real-time force feedback value with a set threshold value, track correction is continuously output, the force feedback value is ensured to be within the set threshold value, and meanwhile, kinematics planning and execution of track correction are carried out until the tail end of the track is reached, and the autonomous correction of the pig carcass cutting path is completed.
The calculation method of the cutting path correction quantity comprises the following steps:
the force admittance control system employs an inner loop based on position control and an outer loop strategy of force control, as shown in fig. 3. The contact force between the six-dimensional force sensor and the outside is measured through a second-order admittance model:generating an additional position x u The additional position x u And then correcting the preset position track, and finally sending the position track into a position control inner ring to finish the final position control.
Initial parameter values are set: the stiffness coefficient K, the damping coefficient B and the inertia coefficient M are set to 1, 5 and 45, respectively.
Establishing the impedance model:

$M\ddot{x}_e + B\dot{x}_e + K x_e = F_e$

wherein $F_e$ is the force feedback value in the base coordinate system acquired by the force sensor; K is the stiffness coefficient; B is the damping coefficient; M is the inertia coefficient; $x_e = x - x_d$ is the difference between the actual displacement $x$ and the planned displacement $x_d$ in the base coordinate system; $\dot{x}_e$ and $\ddot{x}_e$ are the first and second derivatives of the difference $x_e$, respectively.
Further integration gives:

$x_e = \iint M^{-1}\left(F_e - B\dot{x}_e - K x_e\right)\,\mathrm{d}t\,\mathrm{d}t$
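A minimal discretisation of the impedance model, using explicit Euler integration (an implementation choice, not specified in the text) and the parameter values K = 1, B = 5, M = 45 set above:

```python
def admittance_step(F_e, x_e, xd_e, M=45.0, B=5.0, K=1.0, dt=0.001):
    """One Euler step of M*xdd_e + B*xd_e + K*x_e = F_e: solve the model
    for the acceleration, then integrate twice to update the pose
    deviation x_e and its rate xd_e."""
    xdd_e = (F_e - B * xd_e - K * x_e) / M
    xd_e = xd_e + xdd_e * dt
    x_e = x_e + xd_e * dt
    return x_e, xd_e

# Under a constant contact force the deviation settles toward F_e / K:
# the correction grows until the spring term balances the applied force.
x_e, xd_e = 0.0, 0.0
for _ in range(200000):          # 200 s of simulated time at dt = 1 ms
    x_e, xd_e = admittance_step(1.0, x_e, xd_e)
```

The steady-state value F_e / K shows why the stiffness coefficient sets how far the path yields per newton of contact force, while B and M shape how quickly and smoothly the correction develops.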
The angle sensor on each joint motor detects the current feedback angle vector $q$, and the robot motion velocity vector is computed as $\dot{x} = J(q)\dot{q}$; integrating $\dot{x}$ gives the motion position vector $x$ of the robot actuator end, and the displacement deviation $x_e = x - x_d$ is calculated, where the input planned displacement $x_d$ is the motion position vector of the robot end for the trajectory preset in step two. Differentiating $x_e$ gives the velocity deviation $\dot{x}_e$. With the force feedback value $F_e$ collected by the force sensor, the expected acceleration is obtained from the impedance model as $\ddot{x}_e = M^{-1}(F_e - B\dot{x}_e - K x_e)$; integrating $\ddot{x}_e$ twice gives the corrected pose deviation $x_e$, which is superimposed on the desired input planned displacement $x_d$ to obtain the final pose control quantity $x_u = x_d + x_e$.
The pose control quantity $x_u$ is differentiated and substituted into the formula $\dot{x}_u = J(q)\dot{q}$; solving gives the angle vector $q$, which is substituted into the joint-space dynamic equation $M_r(q)\ddot{q} + C_r(q,\dot{q})\dot{q} + G_r(q) = \tau_r + \tau_e$ to obtain the driving torque $\tau_r$ of each joint motor. Substituting the differences of the motion position vector $x$ before and after planning step by step yields the motor driving force matrix $\tau_r$ corresponding to each joint motor; the difference between successive matrices is the motor driving force matrix transformation quantity, through which each joint motor, and thus each robot joint, is controlled, completing the self-correction of the cutting path and finally establishing the corrected trajectory A-C'-C. In fig. 2, the trajectory A-B'-C is the deviated path; compared with it, the corrected trajectory A-C'-C can, thanks to force-sense and visual feedback, effectively avoid hard bones, reducing tool wear and the production of offcuts such as bone dregs and crushed meat caused by the deviated path, and ensuring precise segmentation of the pig carcass.
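The final mapping from corrected motion to motor torques follows directly from the joint-space dynamic equation. In the sketch below, $M_r$, $C_r$ and $G_r$ are supplied as caller-provided functions; the constant two-joint matrices in the example are placeholders for illustration, not a real robot model.

```python
import numpy as np

def joint_torque(q, qd, qdd, M_r, C_r, G_r, tau_e):
    """Solve the joint-space equation
    M_r(q) qdd + C_r(q, qd) qd + G_r(q) = tau_r + tau_e for tau_r."""
    return M_r(q) @ qdd + C_r(q, qd) @ qd + G_r(q) - tau_e

# Placeholder 2-joint model: constant inertia, no Coriolis, no gravity.
M_r = lambda q: np.diag([2.0, 1.0])
C_r = lambda q, qd: np.zeros((2, 2))
G_r = lambda q: np.zeros(2)

tau = joint_torque(np.zeros(2), np.zeros(2), np.array([1.0, 1.0]),
                   M_r, C_r, G_r, tau_e=np.zeros(2))
```

With zero external torque and this diagonal inertia, each joint torque is simply inertia times the commanded acceleration, which matches the equation term by term.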
The above description covers only preferred embodiments of the present invention and is not intended to limit it; any modification, equivalent substitution, improvement, or the like made within the spirit and principles of the present invention shall fall within its scope of protection.
Claims (10)
1. A force feedback-based autonomous path correction method for a pig carcass segmentation robot, characterized by comprising the following steps:
step one: a vision recognition system identifies the pig carcass dividing line to obtain a preset segmentation trajectory;
step two: kinematic planning and execution of the preset segmentation trajectory: the motion position vector of the segmentation robot is obtained from the three-dimensional coordinate points of the preset segmentation trajectory and substituted into the kinematic equations of the segmentation robot to calculate the motor matrix transformation quantity corresponding to each joint, through which the motor at each joint of the segmentation robot is controlled;
step three: the vision recognition system detects the cutting path of the segmentation robot's cutter in real time and judges whether it deviates from the preset segmentation trajectory; if so, proceed to step four; otherwise, return to step two;
step four: a force admittance control system establishes an impedance model based on a force admittance control strategy to calculate the correction amount of the cutting path, obtains the motor matrix transformation quantity corresponding to each joint through the robot's inverse kinematics solution, and continuously adjusts the cutting path until the actual cutting path no longer deviates from the preset segmentation path.
2. The force feedback-based pig carcass segmentation robot path autonomous correction method according to claim 1, characterized in that the preset segmentation trajectory is obtained as follows: key features are extracted from images of the pig carcass cutting region acquired by an industrial camera using the YOLOv4 target detection method, and the resulting large set of image data of key pig carcass parts is used for training to obtain the pig carcass dividing line; a series of three-dimensional coordinate points is then obtained through the three-dimensional coordinate transformation of the depth camera of the vision recognition system, and these points together form the preset segmentation trajectory.
3. The force feedback-based pig carcass segmentation robot path autonomous correction method according to claim 2, characterized in that the industrial camera of the vision recognition system is fixed above the segmentation robot and remains stationary, and an eye-to-hand calibration method is adopted: a camera-calibration-based nine-point method, in which the segmentation robot moves through 9 grid points on a plane parallel to the working plane, then returns to the grid center and rotates left and right by a certain number of degrees; the robot position after each move is recorded while the industrial camera takes a picture and records the same marker point in the field of view, giving the mapping between robot positions and points in the image coordinate system and thereby establishing the coordinate transformation between camera and robot; the camera calibration parameters are then computed in MATLAB to obtain the transformation relations among the robot base coordinate system, the manipulator coordinate system, the camera coordinate system, and the workpiece coordinate system.
4. The force feedback-based pig carcass segmentation robot path autonomous correction method according to claim 1 or 3, characterized in that the kinematic planning and execution method in step two is: vector synthesis of the three-dimensional coordinate points (x, y, z) of the preset segmentation trajectory gives the robot motion position vector x; differentiating x gives the robot motion velocity vector ẋ; from ẋ = J(q)q̇ the joint velocity vector q̇ is solved inversely; integrating q̇ further gives the angle vector q; substituting into the joint space dynamics equation M_r(q)q̈ + C_r(q, q̇)q̇ + G_r(q) = τ_r + τ_e gives the driving torque τ_r of each joint motor and the control input force of the segmentation robot F_r = J(q)^(-T) τ_r, yielding the motor driving force matrix τ_r corresponding to each joint motor; the difference of the motor driving force matrices is the motor driving force matrix transformation quantity, through which each joint motor is controlled; where J(q) denotes the Jacobian matrix, τ_r the joint driving torque, τ_e the external driving torque, M_r(q) the robot inertia matrix in the joint space coordinate system, C_r(q, q̇) the Coriolis force matrix in the joint space coordinate system, and G_r(q) the gravity matrix in the joint space coordinate system.
5. The force feedback-based pig carcass segmentation robot path autonomous correction method according to claim 4, characterized in that the depth camera of the vision recognition system recognizes the cutting path on the pig carcass in real time and compares it with the preset segmentation trajectory; if a deviation exists, the six-dimensional force sensor of the vision recognition system obtains the contact force at the cutter, i.e., the force feedback value F_e exerted on the cutter by the external environment; when F_e exceeds the set threshold of 0.8 N, the cutting path of the cutter is judged to deviate from the preset segmentation trajectory; when F_e is below 0.8 N, the cutting path of the cutter is judged to have no deviation from the preset segmentation path.
6. The force feedback-based pig carcass segmentation robot path autonomous correction method according to claim 1 or 5, characterized in that the cutting path correction amount is obtained as follows: the force admittance control system adopts a position-controlled inner loop and a force-controlled outer loop; the contact force measured between the six-dimensional force sensor of the vision recognition system and the environment generates a pose control quantity x_u through a second-order admittance model; x_u corrects the preset position trajectory, which is then fed into the position control inner loop to complete the final position control.
7. The force feedback-based pig carcass segmentation robot path autonomous correction method according to claim 6, characterized in that the second-order admittance model is:
M ẍ_e + B ẋ_e + K x_e = F_e
where M is the inertia coefficient, K the stiffness coefficient, and B the damping coefficient; F_e is the force feedback value between the environment and the cutter; x_e = x - x_d is the difference between the robot end motion position vector x and the planned displacement x_d in the base coordinate system; ẋ_e and ẍ_e are the first and second derivatives of x_e, respectively.
8. The force feedback-based pig carcass segmentation robot path autonomous correction method according to claim 7, characterized in that the pose control quantity x_u is calculated as follows: the current feedback angle vector q is detected with the angle sensor on each joint motor and the robot motion velocity vector ẋ = J(q)q̇ is computed; integrating ẋ gives the motion position vector x of the segmentation robot; the displacement deviation x_e = x - x_d is computed and differentiated to give the velocity deviation ẋ_e; with the force feedback value F_e collected by the force sensor, the desired acceleration ẍ_e is obtained from the impedance model; integrating ẍ_e twice gives the corrected pose deviation Δx, which is superimposed on the desired planned displacement x_d to obtain the final pose control quantity x_u = x_d + Δx; where the planned displacement x_d is the robot motion position vector of the preset segmentation trajectory, and J(q) denotes the Jacobian matrix.
9. The force feedback-based pig carcass segmentation robot path autonomous correction method according to claim 8, characterized in that the impedance model is
ẍ_e = M⁻¹(F_e - B ẋ_e - K x_e)
where F_e is the force feedback value in the base coordinate system collected by the force sensor; K is the stiffness coefficient; B is the damping coefficient; and M is the inertia coefficient;
integrating ẍ_e twice further gives the corrected pose deviation Δx.
10. The force feedback-based pig carcass segmentation robot path autonomous correction method according to claim 9, characterized in that the joint velocity vector q̇ is obtained from the motion velocity vector ẋ by the inverse solution of ẋ = J(q)q̇ and integrated to give the angle vector q, which is substituted into the joint space dynamics equation M_r(q)q̈ + C_r(q, q̇)q̇ + G_r(q) = τ_r + τ_e to obtain the driving torque τ_r of each joint motor; the motion of each robot joint is then controlled to complete the self-correction of the cutting path and obtain the finally established corrected trajectory; where J(q) denotes the Jacobian matrix, q is the angle vector in the joint space coordinate system, q̇ the velocity vector, q̈ the acceleration vector, τ_r the joint driving torque, τ_e the external driving torque, M_r(q) the robot inertia matrix in the joint space coordinate system, C_r(q, q̇) the Coriolis force matrix in the joint space coordinate system, and G_r(q) the gravity matrix in the joint space coordinate system.
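The eye-to-hand nine-point calibration of claim 3 reduces, in the planar case, to fitting an affine map from image coordinates to robot coordinates over the recorded point pairs. The following is a minimal least-squares sketch under that 2-D simplification; the point values in the usage test are illustrative, and the patent itself computes the full calibration in MATLAB.

```python
import numpy as np

def fit_affine_2d(img_pts, robot_pts):
    """Least-squares fit of robot = A @ img + t from recorded
    (image point, robot point) pairs, e.g. the nine calibration poses.
    Returns the 2x2 matrix A and the translation vector t."""
    img = np.asarray(img_pts, dtype=float)
    rob = np.asarray(robot_pts, dtype=float)
    X = np.hstack([img, np.ones((len(img), 1))])  # homogeneous rows [u, v, 1]
    P, *_ = np.linalg.lstsq(X, rob, rcond=None)   # solve X @ P ~= rob, P is 3x2
    return P[:2].T, P[2]
```

Feeding nine grid points related by a known affine map recovers that map exactly; with noisy measurements the fit is least-squares optimal.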
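The joint-space dynamics equation used in claims 4 and 10, M_r(q)q̈ + C_r(q, q̇)q̇ + G_r(q) = τ_r + τ_e, can be evaluated for τ_r once the robot's inertia, Coriolis, and gravity terms are available. A sketch with hypothetical callable interfaces for those terms (not part of the patent):

```python
import numpy as np

def joint_torque(q, dq, ddq, M_r, C_r, G_r, tau_e):
    """Inverse dynamics in joint space:
    tau_r = M_r(q) @ ddq + C_r(q, dq) @ dq + G_r(q) - tau_e,
    i.e. the joint driving torque satisfying
    M_r(q)*q'' + C_r(q,q')*q' + G_r(q) = tau_r + tau_e."""
    return M_r(q) @ ddq + C_r(q, dq) @ dq + G_r(q) - tau_e
```

For a real manipulator, M_r, C_r, and G_r come from its identified dynamic model; toy constant terms suffice to exercise the formula.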
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210872407.8A CN115016293A (en) | 2022-07-20 | 2022-07-20 | Pig carcass segmentation robot path autonomous correction method based on force feedback |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115016293A true CN115016293A (en) | 2022-09-06 |
Family
ID=83082211
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115016293A (en) |
Cited By (6)

Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115946129A * | 2023-03-10 | 2023-04-11 | 珞石(北京)科技有限公司 | Robot variable admittance control method for operating large-inertia object |
CN115946129B * | 2023-03-10 | 2023-05-09 | 珞石(北京)科技有限公司 | Robot variable admittance control method for operating large-inertia object |
CN117253198A * | 2023-11-20 | 2023-12-19 | 山东大学 | Intelligent manufacturing dynamic management method and system |
CN117253198B * | 2023-11-20 | 2024-01-26 | 山东大学 | Intelligent manufacturing dynamic management method and system |
CN117562098A * | 2024-01-15 | 2024-02-20 | 河南科技学院 | Pig small-lining separation method |
CN117562098B * | 2024-01-15 | 2024-05-03 | 河南科技学院 | Pig small-lining separation method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | | |
SE01 | Entry into force of request for substantive examination | | |