CN117562098A - Pork tenderloin separation method - Google Patents
Pork tenderloin separation method
- Publication number: CN117562098A (application CN202410052278.7A)
- Authority: CN (China)
- Prior art keywords: cutting, cutter, connecting rod, force, sequence
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06F—ELECTRIC DIGITAL DATA PROCESSING
      - G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
        - G06F17/10—Complex mathematical operations
          - G06F17/15—Correlation function computation including computation of convolution operations
          - G06F17/16—Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
- A—HUMAN NECESSITIES
  - A22—BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    - A22C—PROCESSING MEAT, POULTRY, OR FISH
      - A22C17/00—Other devices for processing meat or bones
        - A22C17/0073—Other devices for processing meat or bones using visual recognition, X-rays, ultrasounds, or other contactless means to determine quality or size of portioned meat
          - A22C17/0086—Calculating cutting patterns based on visual recognition
Abstract
The invention provides a method for separating the pork tenderloin (the small loin of the pig), comprising the following steps: generating a cutting line from the features of the region to be cut; deriving a motion sequence for the cutting tool from the tool's movement-speed information along the cutting line; training and updating the action sequence with a meta-learning network to obtain an optimal action sequence; optimizing the cutting line with the optimal action sequence to obtain an optimal cutting line; while the cutter follows the optimal cutting line, measuring the cutting force of the cutter with a six-dimensional force sensor and, simultaneously, capturing the real-time cutting path with a vision camera; and, from the cutting force and the real-time cutting path, controlling the cutter to regulate the cutting force along the path, completing the cut of the region to be cut. Meta-learning is introduced into the feedback-adjustment model of the six-dimensional force sensor: data returned by the sensor carried on the cutter during cutting are used to correct the cutting path, so that the region to be cut is cut accurately.
Description
Technical Field
The invention relates to pork tenderloin separation technology, and in particular to a method for separating the pork tenderloin.
Background
In pig slaughtering, separating the tenderloin is a critical step. The traditional method is manual separation, which is time- and labor-consuming; the result depends on the experience and skill of the worker, so the quality of the separated tenderloin is unstable, and manual handling easily contaminates the meat, affecting downstream processing and product quality. The conventional method therefore needs to be improved to raise separation efficiency and quality. Although some automatic separation equipment has appeared, pigs differ in weight and size, and conventional machine separation adapts poorly to carcasses of different meat quality, so it can hardly meet the needs of modern pig-slaughtering enterprises.
Disclosure of Invention
Aiming at the problems in the prior art, a method for separating the pork tenderloin is provided, intended to adapt the cut to different meat qualities.
A method for separating the pork tenderloin comprises the following steps:
step 1: generating a cutting line from the features of the region to be cut;
step 2: obtaining a motion sequence of the cutting tool from the tool movement-speed information along the cutting line;
step 3: training and updating the action sequence with a meta-learning network to obtain an optimal action sequence;
step 4: optimizing the cutting line with the optimal action sequence to obtain an optimal cutting line;
step 5: while the cutter cuts along the optimal cutting line, obtaining the cutting force of the cutter with the six-dimensional force sensor and, simultaneously, acquiring the real-time cutting path with a vision camera; and, from the cutting force and the real-time cutting path, controlling the cutter to regulate the cutting force along the real-time cutting path, so as to complete the cut of the region to be cut.
The method further comprises the following: step 1 comprises the following steps:
step 1.1: collecting pork images containing the tenderloin with a depth camera;
step 1.2: suppressing noise in the pork image using Gaussian filtering:
$$G(x,y)=\frac{1}{2\pi\sigma^{2}}\exp\left(-\frac{(x-x_{0})^{2}+(y-y_{0})^{2}}{2\sigma^{2}}\right)$$
where $(x,y)$ are the coordinates of any point in the pork image, $(x_{0},y_{0})$ are the coordinates of the window center, and $\sigma$ is the standard deviation;
step 1.3: building a probability histogram of the gray levels of the original image; with the input pixel gray value denoted $r_{k}$, the cumulative distribution function is:
$$c(r_{k})=\sum_{j=0}^{k}\frac{n_{j}}{N}$$
where $n_{j}$ is the number of pixels with gray value $r_{j}$ and $N$ is the total number of image pixels; with the output gray value denoted $s$ and the pixel range $[0,L-1]$, the desired output is a uniformly distributed histogram, namely $p_{s}(s)=\frac{1}{L-1}$;
transforming with the cumulative distribution function gives the new gray level after transformation, i.e. letting
$$s=T(r_{k})=(L-1)\,c(r_{k})$$
the old gray levels are replaced with the new ones, yielding the equalized image;
step 1.4: computing the image gradient of the tenderloin by differencing:
$$G_{x}(x,y)=I(x+1,y)-I(x-1,y),\qquad G_{y}(x,y)=I(x,y+1)-I(x,y-1)$$
where $G_{x}(x,y)$ is the horizontal gradient value at point $(x,y)$ and $G_{y}(x,y)$ is the vertical gradient value; the optimal adaptive threshold $T$ lies between the minimum gradient value $G_{\min}$ and the maximum gradient value $G_{\max}$ and is set according to the proportion $p$ of salient-edge pixels among the $N$ pixels of the pork picture; once the optimal adaptive threshold $T$ is obtained, the remaining pixels in the pork image are removed, only the pixels matching the optimal adaptive threshold are kept, and these pixels form the cutting line.
The method further comprises the following: the tool movement-speed information $v$ is determined from the material and tool parameters, where $R$ denotes the fracture toughness of the meat, $f$ is the friction force between the knife and the pork, $h$ is the thickness of the cutter, $\mu$ is the friction coefficient, and $E$ is the elastic modulus.
The method further comprises the following: the effort required for cutting the meat consists of a force $F_{x}$ on the x-axis and a force $F_{z}$ on the z-axis, whose resultant can be expressed as $F=\sqrt{F_{x}^{2}+F_{z}^{2}}$; combining this with the formula above gives the cutting force, where $F_{x}$ represents the force on the x-axis and $F_{z}$ represents the force on the z-axis.
While the mechanical arm controls the cutter to cut, the six-dimensional force sensor measures the acting force $F$ on the meat; at the same time a fixed cutting angle is maintained, and the cutting force is adjusted, or the cutting path corrected, according to the force feedback. The relation between the cutter displacement and the cutting force involves $F_{x}$ and $F_{z}$, the fracture toughness $R$ of the meat, the displacements $u_{x}$ and $u_{z}$ along the x- and z-axes, the actual displacement direction of the tool, and the width $w$ of the tool.
The method further comprises the following: step 3 comprises the following steps:
step 3.1: recording the tool movement-speed information as the feature vector $s=(s_{1},\dots,s_{n})$, where $n$ is the number of features;
step 3.2: when cutting the meat, taking $s$ as the environment state and setting a reward value $r$ for each executed action; at the start of the cut the system randomly selects an action command $a_{t}$, a vector representing the direction and speed of the tool at time $t$, and the output layer yields a reward $r_{t}$ that represents the degree of fit to the sample;
the action sequence corresponding to each task is fed into the meta-learning network, and meta-learning updates the model parameters $\theta$ and adjusts the action sequence $a$; the loss function is $L(a,\theta)$, where $a$ is the action sequence, $s_{t}$ is the tool movement-speed information at time $t$, and $r$ is the reward value of the sequence; $a$ and $\theta$ are updated by gradient steps on this loss, and training yields the optimal action sequence $a^{*}$ and parameters $\theta^{*}$.
The method further comprises the following: step 5 comprises the following steps:
step 5.1: taking the optimal action sequence $a^{*}$ as input, the network outputs the optimal value $Q(s,a)$ of the next-round action $a$; with $\gamma$ as the discount coefficient, selecting the action that maximizes the discounted return can be expressed as $a=\arg\max_{a'}Q(s,a')$;
during training the loss function $L$ is used, with an update at every iteration $k$; the feature output function of the $k$-th training round is determined by the new features that reinforcement learning extracts from the recognition target, the probability density distribution of the sequence $s$ and the action sequence $a$, and the expectation over that distribution;
step 5.2: setting network penalty coefficients for the cutting force and the real-time cutting path, thereby updating the action-network sequence parameters and obtaining the optimal execution control of the task.
The method further comprises the following steps: the six-dimensional force sensor is fixedly connected between the cutter and the mechanical arm, and the depth camera is fixedly connected on the mechanical arm and synchronously moves along with the cutter.
The method further comprises the following steps: the telescopic assembly comprises a vertical telescopic rod, a cylinder body of the telescopic rod is fixedly connected to the six-dimensional force sensor, a telescopic end of the telescopic rod is fixedly connected with the cutter after passing through a first connecting rod and a second connecting rod in sequence, one end of the first connecting rod is hinged with the telescopic rod, the other end of the first connecting rod is fixedly connected with one end of the second connecting rod, the other end of the first connecting rod and the one end of the second connecting rod form a V-shaped structure with an upward opening, the other end of the second connecting rod is fixedly connected with the top of the cutter, and the lower part of the cutter is a cutting edge end; a transverse sliding rod is fixedly connected to the cylinder body, a sliding seat is arranged on the sliding rod in a sliding manner, a node between the first connecting rod and the second connecting rod corresponds to the sliding seat up and down, a vertical third connecting rod is arranged between the node and the sliding seat, the lower end of the third connecting rod is hinged to the node, the upper end of the third connecting rod is fixedly connected to the sliding seat, and the first connecting rod drives the third connecting rod to slide on the sliding seat when swinging between the telescopic rod and the third connecting rod; a clamping jaw is fixedly arranged at the lower end of the telescopic rod, and when the telescopic rod is contracted upwards, the second connecting rod drives the cutter to descend downwards, and the cutter is lower than the clamping jaw and is used for cutting pork; when the telescopic rod extends downwards, the second connecting rod drives the cutter to lift upwards, the cutter is higher than the clamping jaw, and the clamping jaw is used for grabbing the cut small back ridge.
The beneficial effects of the invention are as follows: meta-learning is introduced into the feedback-adjustment model of the six-dimensional force sensor, and data returned by the sensor carried on the cutter during cutting are used to correct the cutting path, so that the region to be cut is cut accurately.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a schematic diagram of a connection structure of a cutter, a six-dimensional force sensor and a mechanical arm in the invention;
fig. 3 is a schematic diagram of a specific connection structure between a cutter and a six-dimensional force sensor in the present invention.
Detailed Description
The present invention will be described in detail with reference to the accompanying drawings. Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the invention. The terms left, middle, right, upper, lower, etc. in the embodiments of the present invention are merely relative concepts or references to the normal use state of the product, and should not be construed as limiting.
A method for separating the pork tenderloin comprises the following steps:
step 1: generating the cutting line from the features of the region to be cut, specifically:
step 1.1: collecting pork images containing the tenderloin with a depth camera;
step 1.2: suppressing noise in the pork image using Gaussian filtering:
$$G(x,y)=\frac{1}{2\pi\sigma^{2}}\exp\left(-\frac{(x-x_{0})^{2}+(y-y_{0})^{2}}{2\sigma^{2}}\right)$$
where $(x,y)$ are the coordinates of any point in the pork image and $(x_{0},y_{0})$ are the coordinates of the window center, treated as integers in image processing; $\sigma$ is the standard deviation;
step 1.3: building a probability histogram of the gray levels of the original image; with the input pixel gray value denoted $r_{k}$, the cumulative distribution function is:
$$c(r_{k})=\sum_{j=0}^{k}\frac{n_{j}}{N}$$
where $n_{j}$ is the number of pixels with gray value $r_{j}$ and $N$ is the total number of image pixels; with the output gray value denoted $s$ and the pixel range $[0,L-1]$, the desired output is a uniformly distributed histogram, namely $p_{s}(s)=\frac{1}{L-1}$;
transforming with the cumulative distribution function gives the new gray level after transformation, i.e. letting
$$s=T(r_{k})=(L-1)\,c(r_{k})$$
the old gray levels are replaced with the new ones, yielding the equalized image;
step 1.4: computing the image gradient of the tenderloin by differencing, obtaining the optimal gray threshold of the tenderloin region, removing the contours of the other pork parts, and generating a fine dividing line:
$$G_{x}(x,y)=I(x+1,y)-I(x-1,y),\qquad G_{y}(x,y)=I(x,y+1)-I(x,y-1)$$
where $G_{x}(x,y)$ is the horizontal gradient value at point $(x,y)$ and $G_{y}(x,y)$ is the vertical gradient value; the optimal adaptive threshold $T$ lies between the minimum gradient value $G_{\min}$ and the maximum gradient value $G_{\max}$ and is set according to the proportion $p$ of salient-edge pixels among the $N$ pixels of the pork picture; once the optimal adaptive threshold $T$ is obtained, the remaining pixels in the pork image are removed, only the pixels matching the optimal adaptive threshold are kept, and these pixels form the cutting line;
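The step 1 pipeline above (Gaussian smoothing, histogram equalization, difference gradients, adaptive gradient thresholding) can be sketched with plain numpy. This is a minimal illustration only: the function names, the 5x5 window, and the edge fraction are assumptions of the sketch, not values disclosed by the patent.

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    """2-D Gaussian window centered on the midpoint (step 1.2)."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()  # normalize so brightness is preserved

def smooth(img, k):
    """Convolve with edge padding (a simple stand-in for cv-style filtering)."""
    pad = k.shape[0] // 2
    f = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for i in range(k.shape[0]):
        for j in range(k.shape[1]):
            out += k[i, j] * f[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def equalize(img):
    """Histogram equalization via the cumulative distribution (step 1.3)."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum() / img.size              # c(r_k) = sum n_j / N
    lut = np.round(255 * cdf).astype(np.uint8)  # s_k = (L-1) c(r_k)
    return lut[img]

def gradient_magnitude(img):
    """Central-difference gradients G_x, G_y (step 1.4)."""
    f = img.astype(float)
    gx = np.zeros_like(f)
    gy = np.zeros_like(f)
    gx[:, 1:-1] = (f[:, 2:] - f[:, :-2]) / 2
    gy[1:-1, :] = (f[2:, :] - f[:-2, :]) / 2
    return np.hypot(gx, gy)

def cutting_line_mask(img, edge_fraction=0.05):
    """Keep only the strongest `edge_fraction` of gradient pixels, an
    adaptive threshold between the min and max gradient values."""
    den = smooth(img, gaussian_kernel(5, 1.0))
    eq = equalize(np.clip(den, 0, 255).astype(np.uint8))
    g = gradient_magnitude(eq)
    thresh = np.quantile(g, 1 - edge_fraction)
    return g >= thresh
```

On a synthetic image with a single vertical step edge, the mask concentrates on the columns around the step, which is the behavior the cutting-line extraction relies on.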
step 2: obtaining the motion sequence of the cutting tool from the tool movement-speed information along the cutting line;
step 3: updating the action sequence by meta-learning-network training and obtaining the optimal action sequence, specifically:
step 3.1: recording the tool movement-speed information as the feature vector $s=(s_{1},\dots,s_{n})$, where $n$ is the number of features;
step 3.2: when cutting the meat, taking $s$ as the environment state and setting a reward value $r$ for each executed action; at the start of the cut the system randomly selects an action command $a_{t}$, a vector representing the direction and speed of the tool at time $t$, and the output layer yields a reward $r_{t}$ that represents the degree of fit to the sample;
the action sequence corresponding to each task is fed into the meta-learning network, and meta-learning updates the model parameters $\theta$ and adjusts the action sequence $a$; the loss function is $L(a,\theta)$, where $a$ is the action sequence, $s_{t}$ is the tool movement-speed information at time $t$, and $r$ is the reward value of the sequence; $a$ and $\theta$ are updated by gradient steps on this loss, and training yields the optimal action sequence $a^{*}$ and parameters $\theta^{*}$;
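The per-task adaptation and meta-update of step 3 can be illustrated with a first-order MAML-style loop. The quadratic surrogate loss, the learning rates, and the toy task vectors below are placeholders, since the patent does not disclose the actual network or loss; the sketch only shows the two-level update structure.

```python
import numpy as np

def task_loss(theta, task):
    """Quadratic surrogate for the per-task loss L(a, theta)."""
    return np.sum((theta - task) ** 2)

def task_grad(theta, task):
    return 2 * (theta - task)

def maml_step(theta, tasks, alpha=0.1, beta=0.05):
    """One first-order MAML-style meta-update: each task adapts theta with
    an inner gradient step, then the meta-parameters move along the
    averaged post-adaptation gradient (the Jacobian term is dropped)."""
    meta_grad = np.zeros_like(theta)
    for task in tasks:
        theta_i = theta - alpha * task_grad(theta, task)  # inner adaptation
        meta_grad += task_grad(theta_i, task)             # outer gradient
    return theta - beta * meta_grad / len(tasks)

# Two toy "cutting tasks"; the meta-parameters drift toward a point that
# adapts quickly to both, here the task mean.
theta = np.zeros(2)
tasks = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
for _ in range(200):
    theta = maml_step(theta, tasks)
```

With this surrogate the iteration converges to the midpoint of the two tasks, the classic MAML behavior of finding an initialization that is one gradient step away from every task optimum.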
Step 4: optimizing the cutting line through the optimal action sequence, and obtaining an optimal cutting line;
step 5: the cutting force required by the robot when cutting meat is different due to the influence of uneven distribution of muscle fibers of the meat, and the cutting quality of the cutting is influenced by the fact that the given cutting force is too large or too small; aiming at the problems, when the cutter cuts along the optimal cutting line, the cutting force of the cutter is obtained through a six-dimensional force sensor; simultaneously, acquiring a real-time cutting path through a visual camera; according to the cutting force and the real-time cutting path, controlling the cutter to regulate and control the cutting force along with the real-time cutting path, and further completing the cutting of the area to be cut; the method comprises the following steps:
step 5.1: will optimize the action sequenceAs input, the sequence +.>Next round of action +.>Is the optimal value of (a)The method comprises the steps of carrying out a first treatment on the surface of the Is provided with->For discount coefficient, select action->Maximize->Can be expressed as:
using loss functions during trainingEvery iteration->The following updates were made:
first, theThe feature output function of the secondary training is:
in the method, in the process of the invention,representing new features of reinforcement learning extraction on the recognition target; />Is the sequence->And action sequence->Probability density distribution of (2); />Is expected;
step 5.2: setting network punishment coefficients for the cutting force and the real-time cutting path, thereby realizing updating of the sequence parameters of the action network and obtaining optimal execution control of the taskOptimally execute control->The method comprises the following steps: />。
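Step 5.1's discounted action selection can be illustrated with a tabular Q-learning toy. The two-state environment, the reward rule, and all hyperparameters are invented for the example; they stand in for the patent's action network only to show the max-Q update and greedy selection.

```python
import numpy as np

def q_update(Q, s, a, r, s_next, alpha=0.5, gamma=0.9):
    """Tabular Q-learning update:
    Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
    return Q

def greedy_action(Q, s):
    """Pick the action with the highest estimated discounted return."""
    return int(np.argmax(Q[s]))

# Tiny 2-state, 2-action example: only action 1 in state 0 is rewarded,
# so after training the greedy policy selects it.
Q = np.zeros((2, 2))
rng = np.random.default_rng(0)
for _ in range(100):
    s = int(rng.integers(2))
    a = int(rng.integers(2))
    r = 1.0 if (s == 0 and a == 1) else 0.0
    Q = q_update(Q, s, a, r, s_next=int(rng.integers(2)))
```

After the random exploration phase, `greedy_action(Q, 0)` returns the rewarded action, mirroring how step 5.1 picks the next cutting action from the learned value estimates.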
Here the tool movement-speed information $v$ is determined from the fracture toughness $R$ of the meat, the friction force $f$ between the knife and the pork, the thickness $h$ of the cutter, the friction coefficient $\mu$, and the elastic modulus $E$;
the effort required for cutting the meat consists of a force $F_{x}$ on the x-axis and a force $F_{z}$ on the z-axis, whose resultant can be expressed as $F=\sqrt{F_{x}^{2}+F_{z}^{2}}$;
while the mechanical arm controls the cutter to cut, the six-dimensional force sensor measures the acting force $F$ on the meat while a fixed cutting angle is maintained; according to the cutting-force feedback, the cutting force is adjusted or the cutting path corrected; the relation between the cutter displacement and the cutting force involves $F_{x}$ and $F_{z}$, the fracture toughness $R$ of the meat, the displacements $u_{x}$ and $u_{z}$ along the x- and z-axes, the actual displacement direction of the tool, and the width $w$ of the tool.
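The force-feedback loop of step 5 can be sketched as a simple proportional regulator: when the sensed force exceeds the reference, the feed rate is reduced, and it is raised when the meat resists less. The reference force, gain, and feed limits below are illustrative assumptions of the sketch, not values from the patent.

```python
def regulate_cutting_force(measured_forces, f_ref=20.0, kp=0.05, v0=1.0,
                           v_min=0.2, v_max=2.0):
    """Proportional force regulation along the cutting path.
    measured_forces: one six-dimensional-sensor force magnitude per
    control tick; returns the commanded feed rate at each tick."""
    v = v0
    feeds = []
    for f in measured_forces:
        v -= kp * (f - f_ref)            # proportional correction
        v = max(v_min, min(v_max, v))    # keep the feed rate in safe bounds
        feeds.append(round(v, 3))
    return feeds
```

For example, a run of samples `[20, 30, 30, 10]` around a 20 N reference slows the feed while the force is high and speeds it back up once the resistance drops, which is the qualitative behavior the hybrid force/position correction aims for.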
As shown in fig. 2, the six-dimensional force sensor 200 is fixedly connected between the cutter 300 and the mechanical arm 100, and the depth camera 4 is fixedly connected to the mechanical arm 100 and moves synchronously with the cutter 300. Referring to fig. 3, the cutter 300 is connected to the six-dimensional force sensor 200 through a telescopic assembly. The telescopic assembly comprises a vertical telescopic rod 1 whose cylinder body is fixedly connected to the six-dimensional force sensor 200; the telescopic end of the telescopic rod 1 is connected to the cutter 300 through a first connecting rod 31 and a second connecting rod 32 in sequence. One end of the first connecting rod 31 is hinged to the telescopic rod 1 and the other end is fixedly connected to one end of the second connecting rod 32, the two rods forming a V-shaped structure with an upward opening; the other end of the second connecting rod 32 is fixedly connected to the top of the cutter 300, whose lower part is the cutting-edge end. A transverse slide bar 21 is fixedly connected to the cylinder body and carries a sliding seat 2; the joint between the first connecting rod 31 and the second connecting rod 32 lies directly below the sliding seat 2, and a vertical third connecting rod 33 runs between them, its lower end hinged to the joint and its upper end fixedly connected to the sliding seat 2; when the first connecting rod 31 swings between the telescopic rod and the third connecting rod 33, it drives the sliding seat 2, and with it the third connecting rod 33, to slide along the slide bar 21. A clamping jaw 5 is fixedly arranged at the lower end of the telescopic rod 1: when the telescopic rod 1 contracts upward, the second connecting rod 32 drives the cutter 300 downward so that the cutter 300 sits lower than the clamping jaw 5 and cuts the pork; when the telescopic rod 1 extends downward, the second connecting rod 32 lifts the cutter 300 above the clamping jaw 5, and the clamping jaw 5 grasps the cut tenderloin.
The method automatically and accurately separates the tenderloin by fusing image processing, real-time force feedback, and meta reinforcement learning (meta-RL). For a known cut of meat, an image is collected by the depth camera and preprocessed; the image gradient is then computed by differencing, the optimal gray threshold of the region containing the tenderloin is determined, and the remaining background is removed; a Canny edge-detection algorithm produces an edge map of the segmentation path and generates a fine dividing line; the knife entry point, knife exit point, and cutting angle are determined, the operation path is planned, and the mechanical-arm actuator is controlled to start cutting. A cutting model is set up and meta-learning is introduced into the six-dimensional force-feedback adjustment model; data returned by the force sensor carried on the cutter during cutting, combined with the force/position hybrid-control principle, correct the cutting path and control the cutting depth. When generating the dividing line, the image acquired by the depth camera is Gaussian-filtered; Gaussian filtering is low-pass filtering that effectively preserves the edge information of the image while attenuating its high-frequency components, suppressing image noise and abrupt changes to the greatest extent. Histogram equalization then expands the range of pixel values and makes the image clearer. A gradient image is computed from the image differences, the optimal threshold is adjusted according to the maximum and minimum gradient values, and measurements beyond the threshold are discarded; the fine dividing line is generated with the Canny algorithm. A random feature matrix of yield stress, meat density, and meat deformation is constructed, and the relation between the running speed of the cutter and the resistance it bears is analyzed. The segmentation task sequence is trained by reinforcement learning to obtain the optimal action sequence; a stochastic gradient descent method trains the changes of the reinforcement-learning action-sequence parameters, improving the generalization of the algorithm and adjusting the cutting position in real time; and the cutting track of the cutter is corrected according to the force-feedback information.
The foregoing has shown and described the basic principles, principal features, and advantages of the invention. It will be understood by those skilled in the art that the invention is not limited to the embodiments described above; the embodiments and descriptions merely illustrate its principles, and various changes and modifications may be made without departing from its spirit and scope. The scope of the invention is defined by the appended claims and their equivalents.
Claims (8)
1. A method for separating the pork tenderloin, characterized by comprising the following steps:
step 1: generating a cutting line from the features of the region to be cut;
step 2: obtaining a motion sequence of the cutting tool from the tool movement-speed information along the cutting line;
step 3: training and updating the action sequence with a meta-learning network to obtain an optimal action sequence;
step 4: optimizing the cutting line with the optimal action sequence to obtain an optimal cutting line;
step 5: while the cutter cuts along the optimal cutting line, obtaining the cutting force of the cutter with the six-dimensional force sensor and, simultaneously, acquiring the real-time cutting path with a vision camera; and, from the cutting force and the real-time cutting path, controlling the cutter to regulate the cutting force along the real-time cutting path, so as to complete the cut of the region to be cut.
2. The method for separating the pork tenderloin according to claim 1, characterized in that step 1 comprises the following steps:
step 1.1: collecting pork images containing the tenderloin with a depth camera;
step 1.2: suppressing noise in the pork image using Gaussian filtering:
$$G(x,y)=\frac{1}{2\pi\sigma^{2}}\exp\left(-\frac{(x-x_{0})^{2}+(y-y_{0})^{2}}{2\sigma^{2}}\right)$$
where $(x,y)$ are the coordinates of any point in the pork image, $(x_{0},y_{0})$ are the coordinates of the window center, and $\sigma$ is the standard deviation;
step 1.3: building a probability histogram of the gray levels of the original image; with the input pixel gray value denoted $r_{k}$, the cumulative distribution function is:
$$c(r_{k})=\sum_{j=0}^{k}\frac{n_{j}}{N}$$
where $n_{j}$ is the number of pixels with gray value $r_{j}$ and $N$ is the total number of image pixels; with the output gray value denoted $s$ and the pixel range $[0,L-1]$, the desired output is a uniformly distributed histogram, namely $p_{s}(s)=\frac{1}{L-1}$;
transforming with the cumulative distribution function gives the new gray level after transformation, i.e. letting
$$s=T(r_{k})=(L-1)\,c(r_{k})$$
the old gray levels are replaced with the new ones, yielding the equalized image;
step 1.4: computing the image gradient of the tenderloin by differencing:
$$G_{x}(x,y)=I(x+1,y)-I(x-1,y),\qquad G_{y}(x,y)=I(x,y+1)-I(x,y-1)$$
where $G_{x}(x,y)$ is the horizontal gradient value at point $(x,y)$ and $G_{y}(x,y)$ is the vertical gradient value; the optimal adaptive threshold $T$ lies between the minimum gradient value $G_{\min}$ and the maximum gradient value $G_{\max}$ and is set according to the proportion $p$ of salient-edge pixels among the $N$ pixels of the pork picture; once the optimal adaptive threshold $T$ is obtained, the remaining pixels in the pork image are removed, only the pixels matching the optimal adaptive threshold are kept, and these pixels form the cutting line.
3. The method for separating the pork tenderloin according to claim 1, characterized in that the tool movement-speed information $v$ is determined from the fracture toughness $R$ of the meat, the friction force $f$ between the knife and the pork, the thickness $h$ of the cutter, the friction coefficient $\mu$, and the elastic modulus $E$.
4. The method for separating the pig tenderloin according to claim 3, wherein the force required for cutting the meat comprises a force $F_{x}$ and a force $F_{z}$, whose resultant force is denoted $F$; combining with the above formula obtains:
;
wherein $F_{x}$ represents the force on the x-axis and $F_{z}$ represents the force on the z-axis;
in the process of the mechanical arm controlling the cutter to cut, the six-dimensional force sensor obtains the acting force $F$ when cutting the meat; meanwhile a fixed cutting angle is maintained, and the cutting force is adjusted or the cutting path is corrected according to the cutting-force feedback; the relation between the cutter displacement and the cutting force is:
;
wherein $F_{x}$ represents the force on the x-axis; $F_{z}$ represents the force on the z-axis; $R$ represents the fracture toughness of the meat; $u_{x}$ and $u_{z}$ represent the displacements on the x-axis and z-axis; $d$ is the actual direction of displacement of the tool; and $w$ is the width of the tool.
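Claim 4 states only that cutting-force feedback adjusts the force or corrects the path; the patent's control law is not reproduced here. One minimal, hypothetical sketch of a single feedback step is a proportional lateral correction while the downward feed is held:

```python
def correct_path(fx_measured, fx_target, z_step, gain=0.002):
    """One feedback step: if the lateral force Fx drifts from its target,
    shift the next waypoint sideways while keeping the downward feed z_step.
    The proportional law and the gain value are illustrative assumptions."""
    dx = gain * (fx_target - fx_measured)   # lateral correction (meters)
    return dx, z_step
```

For example, if the measured lateral force is below target, the correction moves the cutter toward the tougher side; above target, away from it.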
5. The method for separating the pig tenderloin according to claim 4, wherein step 3 comprises the following steps:
step 3.1: the cutter movement speed information $v$ is recorded as a feature set $x=\{x_{1},x_{2},\dots,x_{n}\}$, where $n$ is the number of features;
step 3.2: when performing the meat cutting, the cutting scene is taken as the environment $s$, and a reward value $r$ is set for the executed action; at the beginning of cutting, the system randomly selects an action command $a_{t}$, a vector representing the direction and speed of the tool at moment $t$; the input/output layer obtains a reward $r$, which represents the degree of fitting to the sample;
the action sequence $\tau$ corresponding to each task is input into a meta-learning network, and the model parameters $\theta$ obtained by meta-learning are updated to adjust the action sequence $\tau$; the loss function is:
;
wherein $\tau$ is the action sequence; $v_{t}$ is the cutter movement speed information at moment $t$; $r(\tau)$ is the reward value of the sequence; updating $\tau$ and $\theta$ obtains:
;
wherein $\tau$ is the action sequence and $\theta$ are the model parameters; training obtains the optimal action sequence $\tau^{*}$ and parameters $\theta^{*}$.
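The claim does not name a specific meta-learning algorithm and its update equations are lost from this record. As a stand-in, a Reptile-style update (adapt the parameters separately on each task by gradient descent, then move the shared parameters toward the mean of the adapted copies) illustrates the kind of per-task update-and-adjust loop described:

```python
import numpy as np

def reptile_meta_update(theta, task_grads, inner_lr=0.1, meta_lr=0.5, inner_steps=3):
    """Reptile-style meta-update, used here only as an illustrative stand-in
    for the claim's unspecified meta-learning step.
    task_grads: list of callables, each returning the loss gradient for one task."""
    adapted = []
    for grad_fn in task_grads:
        th = theta.copy()
        for _ in range(inner_steps):
            th -= inner_lr * grad_fn(th)      # inner-loop task adaptation
        adapted.append(th)
    # move shared parameters toward the mean of the adapted copies
    return theta + meta_lr * (np.mean(adapted, axis=0) - theta)
```

With two quadratic tasks whose minima sit at 0 and 2, the meta-update pulls a shared parameter started at 5 toward the region between the task optima.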
6. The method for separating the pig tenderloin according to claim 5, wherein step 5 comprises the following steps:
step 5.1: take the optimal action sequence $\tau^{*}$ as input; in sequence state $s_{t}$, the next-round action is $a_{t+1}$ with reward $r_{t}$; let $\gamma$ be the discount coefficient; selecting the action $a$ that maximizes the action value $Q$ can be expressed as:
$$Q(s_{t},a_{t})=r_{t}+\gamma\max_{a}Q(s_{t+1},a);$$
the loss function $L$ is used during training; at every iteration $i$, the following update is made:
;
at the $n$-th training, the characteristic output function is:
;
wherein $\phi$ represents the new features extracted by reinforcement learning for the recognition target; $p(s,\tau)$ is the probability density distribution of the sequence $s$ and the action sequence $\tau$; and $\mathbb{E}$ denotes the expectation;
step 5.2: network penalty coefficients are set for the cutting force and the real-time cutting path, so that the sequence parameters of the action network are updated and the optimal execution control $u^{*}$ of the task is obtained.
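The action-value recursion in step 5.1 is the standard tabular Q-learning target; a minimal sketch of one update (the learning rate and discount values are illustrative):

```python
import numpy as np

def q_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.9):
    """One tabular Q-learning step: move Q(s,a) toward the target
    r + gamma * max_a' Q(s', a')."""
    target = r + gamma * np.max(Q[s_next])
    Q[s, a] += alpha * (target - Q[s, a])
    return Q
```

Starting from an all-zero table, a single rewarded transition raises only the visited state-action entry.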
7. The method for separating the pig tenderloin according to claim 1, wherein the six-dimensional force sensor is fixedly connected between the cutter and the mechanical arm, and the depth camera is fixedly connected to the mechanical arm and moves synchronously with the cutter.
8. The method for separating the pig tenderloin according to claim 7, wherein the telescopic assembly comprises a vertical telescopic rod whose cylinder body is fixedly connected to the six-dimensional force sensor; the telescopic end of the telescopic rod is connected to the cutter through a first connecting rod and a second connecting rod in sequence; one end of the first connecting rod is hinged to the telescopic rod, the other end of the first connecting rod is fixedly connected to one end of the second connecting rod, and the two rods form a V-shaped structure with an upward opening; the other end of the second connecting rod is fixedly connected to the top of the cutter, the lower part of the cutter being the cutting-edge end; a transverse sliding rod is fixedly connected to the cylinder body, and a sliding seat is slidably arranged on the sliding rod; the node between the first connecting rod and the second connecting rod corresponds vertically to the sliding seat, and a vertical third connecting rod is arranged between the node and the sliding seat, the lower end of the third connecting rod being hinged to the node and the upper end being fixedly connected to the sliding seat; when the first connecting rod swings between the telescopic rod and the third connecting rod, it drives the sliding seat, via the third connecting rod, to slide along the sliding rod; a clamping jaw is fixedly arranged at the lower end of the telescopic rod; when the telescopic rod contracts upwards, the second connecting rod drives the cutter to descend below the clamping jaw for cutting the pork; when the telescopic rod extends downwards, the second connecting rod drives the cutter to lift above the clamping jaw, and the clamping jaw grabs the cut tenderloin.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410052278.7A CN117562098B (en) | 2024-01-15 | 2024-01-15 | Pig small-lining separation method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117562098A true CN117562098A (en) | 2024-02-20 |
CN117562098B CN117562098B (en) | 2024-05-03 |
Family
ID=89884778
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410052278.7A Active CN117562098B (en) | 2024-01-15 | 2024-01-15 | Pig small-lining separation method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117562098B (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101556693A (en) * | 2009-03-30 | 2009-10-14 | 西安电子科技大学 | Division method for extracted watershed SAR image with threshold method and marking |
CN101826204A (en) * | 2009-03-04 | 2010-09-08 | 中国人民解放军63976部队 | Quick particle image segmentation method based on improved waterline algorithm |
CN108205667A (en) * | 2018-03-14 | 2018-06-26 | 海信集团有限公司 | Method for detecting lane lines and device, lane detection terminal, storage medium |
US20200126210A1 (en) * | 2018-10-19 | 2020-04-23 | Genentech, Inc. | Defect Detection in Lyophilized Drug Products with Convolutional Neural Networks |
TWI711485B (en) * | 2019-11-28 | 2020-12-01 | 國立臺東大學 | Method for cutting fascia in meat block and its device |
CN114387515A (en) * | 2021-12-31 | 2022-04-22 | 潮州三环(集团)股份有限公司 | Cutting path planning method and device based on machine vision |
CN114714419A (en) * | 2022-04-18 | 2022-07-08 | 青岛锐智智能装备科技有限公司 | Chicken middle wing cutting device and cutting method thereof |
CN115016293A (en) * | 2022-07-20 | 2022-09-06 | 河南科技学院 | Pig carcass segmentation robot path autonomous correction method based on force feedback |
CN115063438A (en) * | 2022-07-27 | 2022-09-16 | 河南科技学院 | Autonomous adjusting method applied to pig carcass splitting robot |
CN116362147A (en) * | 2023-02-24 | 2023-06-30 | 西北工业大学 | Aviation hydraulic pipeline joint sealing performance prediction method considering yield hardening effect |
Also Published As
Publication number | Publication date |
---|---|
CN117562098B (en) | 2024-05-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111325764B (en) | Fruit image contour recognition method | |
US20230281265A1 (en) | Method for estimating body size and weight of pig based on deep learning | |
CN110415230B (en) | CT slice image semantic segmentation system and method based on deep learning | |
CN112750106B (en) | Nuclear staining cell counting method based on incomplete marker deep learning, computer equipment and storage medium | |
CN112990103B (en) | String mining secondary positioning method based on machine vision | |
CN106919902B (en) | Vehicle identification and track tracking method based on CNN | |
CN108890692A (en) | A kind of material color identification method for industrial robot vision's sorting | |
CN109919036B (en) | Worker operation posture classification method based on time domain analysis deep network | |
CN109781737B (en) | Detection method and detection system for surface defects of hose | |
CN108491807B (en) | Real-time monitoring method and system for oestrus of dairy cows | |
CN111931654A (en) | Intelligent monitoring method, system and device for personnel tracking | |
CN108537751A (en) | A kind of Thyroid ultrasound image automatic segmentation method based on radial base neural net | |
CN104298993B (en) | A kind of bar number positioning and recognition methods suitable under complex scene along track | |
CN107153067A (en) | A kind of surface defects of parts detection method based on MATLAB | |
CN106295639A (en) | A kind of virtual reality terminal and the extracting method of target image and device | |
CN116935327A (en) | Aquaculture monitoring method, device, equipment and storage medium based on AI vision | |
CN114170292A (en) | Pig weight estimation method and device based on depth image | |
CN117562098B (en) | Pig small-lining separation method | |
CN116206194A (en) | Method, device, system and storage medium for shoal feeding | |
CN113793385A (en) | Method and device for positioning fish head and fish tail | |
CN114494295A (en) | Robot intelligent slaughter and segmentation method and device and storage medium | |
CN113989322A (en) | Guide wire tip tracking method and system | |
CN113763432B (en) | Target detection tracking method based on image definition and tracking stability conditions | |
Klaoudatos et al. | Development of an Experimental Strawberry Harvesting Robotic System. | |
CN114463831A (en) | Training method and recognition method of iris recognition model for eyelashes |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||