CN113643401B - Live pig carcass segmentation method and system based on machine learning - Google Patents

Live pig carcass segmentation method and system based on machine learning

Info

Publication number
CN113643401B
Authority
CN
China
Prior art keywords
live pig
segmentation
machine learning
bones
fat
Prior art date
Legal status
Active
Application number
CN202110994442.2A
Other languages
Chinese (zh)
Other versions
CN113643401A (en)
Inventor
Jiang Yiyu (江一宇)
Yang Yaoguo (杨耀国)
Current Assignee
Wuxi Fortec Automation Engineering Co., Ltd.
Original Assignee
Wuxi Fortec Automation Engineering Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Wuxi Fortec Automation Engineering Co., Ltd.
Priority to CN202110994442.2A
Publication of CN113643401A
Application granted
Publication of CN113643401B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/003 Reconstruction from projections, e.g. tomography
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a live pig carcass segmentation method based on machine learning, relating to the field of live pig carcass segmentation. The live pig carcass is scanned by tomography to obtain data on its bones, muscles and fat, a segmentation path is designed manually, and a neural network algorithm is then trained on these data to obtain a model that automatically generates a mechanical arm segmentation path from the characteristic parameters of the bones, muscles and fat. The trained model is applied in the carcass segmentation process: a mechanical arm segmentation path is generated from each carcass's parameters so that the mechanical arm can segment the carcass, thereby automating live pig carcass segmentation, improving segmentation efficiency and reducing segmentation cost.

Description

Live pig carcass segmentation method and system based on machine learning
Technical Field
The invention relates to a live pig carcass segmentation method, in particular to a live pig carcass segmentation method and system based on machine learning.
Background
After a pig is slaughtered, its carcass needs to be segmented, and the prior art generally combines automatic segmentation with manual segmentation. At present the degree of automation in live pig carcass segmentation reaches at most about 30%. Manual segmentation of live pig carcasses is time-consuming and labor-intensive, cannot be managed in a standardized way, and is inefficient; moreover, it must be carried out in a low-temperature environment, so fewer and fewer workers are willing to do the job and its labor costs keep rising.
Disclosure of Invention
To address these technical problems, the invention provides a live pig carcass segmentation method and system based on machine learning, which automate carcass segmentation, reduce human involvement, improve production efficiency and reduce production cost.
To achieve the above purpose, the technical solution provided by the invention is as follows:
the invention provides a live pig carcass segmentation method based on machine learning, which comprises the following steps:
1) Performing tomographic scanning on live pig carcasses;
2) Carrying out digital processing and three-dimensional modeling on the pictures of the tomography, and distinguishing areas such as bones, fat, muscles and the like in the three-dimensional modeling process;
3) According to the digitally processed picture, obtaining characteristic parameters of bones, muscles and fat, and storing the characteristic parameters into a database;
4) According to the bone, fat and muscle areas distinguished by the three-dimensional model, designing a mechanical arm segmentation path, and storing parameters of the mechanical arm segmentation path into a database;
5) Repeating steps 1) -5) and filling the database;
6) Using the data stored in the database, taking the characteristic parameters of bones, muscles and fat as input values, taking a mechanical arm segmentation path as an output value, and performing machine learning through a neural network algorithm;
7) After machine learning, a model capable of automatically generating mechanical arm segmentation path parameters according to the characteristic parameters of bones, muscles and fat is obtained.
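As an illustration of step 6), the following is a minimal sketch of how such a training step could look in Python, assuming the database has already been flattened into one fixed-length feature vector per carcass (bone, muscle and fat parameters) and one fixed-length vector of mechanical arm path parameters (for example, sampled waypoint coordinates). The file names, array shapes and the choice of an MLP regressor are illustrative assumptions, not details taken from the patent.

```python
# Hypothetical training sketch: feature parameters -> mechanical arm path parameters.
# File names, array shapes and the MLP architecture are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# X: one row per scanned carcass, columns = characteristic parameters of
#    bones, muscles and fat (e.g. per-region normalized intensities, sizes).
# Y: one row per carcass, columns = flattened mechanical arm path parameters
#    (e.g. N waypoints x (x, y, z) = 3N columns), as designed manually in step 4).
X = np.load("carcass_features.npy")   # shape (n_samples, n_features), hypothetical file
Y = np.load("arm_path_params.npy")    # shape (n_samples, n_path_params), hypothetical file

X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.2, random_state=0)

# A small fully connected network; the patent only says "neural network algorithm",
# so the layer sizes here are placeholders.
model = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=2000, random_state=0)
model.fit(X_train, Y_train)

print("held-out R^2:", model.score(X_test, Y_test))
```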
In the method provided by the invention, preferably, step 1) specifically comprises: performing the tomographic scan on the split live pig carcass using X-ray computed tomography.
In the method provided by the invention, preferably, step 2) specifically comprises: coloring the different tissues in the tomographic images according to their densities; and digitally processing the images and building the three-dimensional model, distinguishing regions such as bone, fat and muscle according to the density coloring during modeling.
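For illustration only, a density-based tissue labeling of a CT-like volume might look like the sketch below. The threshold values are loose Hounsfield-style assumptions (fat is less dense than water, muscle slightly denser, bone much denser) and are not taken from the patent; a real system would calibrate them per scanner.

```python
# Hypothetical tissue labeling by density; thresholds are illustrative assumptions.
import numpy as np

def label_tissues(volume_hu: np.ndarray) -> np.ndarray:
    """Label each voxel of a CT-like volume (in Hounsfield-style units) as
    0 = background/other, 1 = fat, 2 = muscle, 3 = bone."""
    labels = np.zeros(volume_hu.shape, dtype=np.uint8)
    labels[(volume_hu >= -150) & (volume_hu < -20)] = 1   # fat: below water density
    labels[(volume_hu >= 20) & (volume_hu < 100)] = 2     # muscle: slightly above water
    labels[volume_hu >= 200] = 3                          # bone: high density
    return labels

# Example with a random stand-in volume (a real one would come from the tomographic scan).
volume = np.random.uniform(-300, 1200, size=(64, 256, 256))
labels = label_tissues(volume)
print("voxels per class:", np.bincount(labels.ravel().astype(np.int64), minlength=4))
```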
In the method provided by the invention, preferably, step 3) specifically comprises: from the digitized images, the position coordinates and the back-calculated scale of the bones, calculating the characteristic parameters of the bones in several regions using a reference intensity method; calculating the characteristic parameters of muscle and fat in the same way; and storing the characteristic parameters of bones, muscles and fat in the database.
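The patent does not define the "reference intensity method" in detail. One plausible reading, sketched below, is that each tissue region's mean intensity is expressed relative to a reference intensity, together with its position and approximate size, giving scanner-independent characteristic parameters. The region definitions, reference value and returned quantities are all assumptions made for illustration.

```python
# Hypothetical "reference intensity" feature extraction; definitions are assumptions.
import numpy as np

def region_features(volume: np.ndarray, mask: np.ndarray,
                    reference_intensity: float, voxel_mm: float) -> np.ndarray:
    """Characteristic parameters of one tissue region (e.g. bone in one zone):
    relative mean intensity, centroid (mm) and approximate volume (mm^3)."""
    voxels = volume[mask]
    rel_intensity = float(voxels.mean()) / reference_intensity  # intensity ratio vs reference
    zyx = np.argwhere(mask)
    centroid_mm = zyx.mean(axis=0) * voxel_mm                   # position coordinates
    volume_mm3 = mask.sum() * voxel_mm ** 3                     # back-calculated size
    return np.concatenate([[rel_intensity], centroid_mm, [volume_mm3]])

# Usage with the labels from the previous sketch (1 = fat, 2 = muscle, 3 = bone):
# features = np.concatenate([region_features(volume, labels == k,
#                                            reference_intensity=1000.0, voxel_mm=0.8)
#                            for k in (1, 2, 3)])
```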
In the method provided by the invention, preferably, the live pig carcass is split (halved) before the tomographic scan is performed.
The invention also provides a live pig carcass segmentation system based on machine learning, comprising:
training and learning on live pig carcasses of different breeds using the machine learning-based segmentation method described above, to obtain mechanical arm segmentation paths for the different breeds at different body sizes;
collecting the mechanical arm segmentation paths of the different breeds at different body sizes into a database; and
taking the pig breed and body size as input values to retrieve the corresponding mechanical arm segmentation path.
Preferably, the live pig carcass segmentation system provided by the invention further comprises an input device; the input device is used to obtain the pig breed and the body size of the live pig carcass.
The above technical solution has the following advantages or beneficial effects:
The invention scans the live pig carcass by tomography to obtain data on its bones, muscles and fat, a segmentation path is then designed manually, and machine learning with a neural network algorithm yields a model that automatically generates a mechanical arm segmentation path from the characteristic parameters of the bones, muscles and fat. The trained model is applied in the carcass segmentation process: a mechanical arm segmentation path is generated from each carcass's parameters so that the mechanical arm can segment the carcass, thereby automating live pig carcass segmentation, improving segmentation efficiency and reducing segmentation cost.
Drawings
The invention and its features, aspects and advantages will become more apparent from the detailed description of non-limiting embodiments with reference to the following drawings. Like numbers refer to like parts throughout. The drawings are not intended to be drawn to scale, emphasis instead being placed upon illustrating the principles of the invention.
Fig. 1 is a schematic flow chart of a live pig carcass segmentation method based on machine learning.
Detailed Description
The technical solutions in the embodiments of the present invention are described below with reference to the accompanying drawings, which illustrate only some, not all, of the embodiments of the invention. The detailed description is therefore not intended to limit the scope of the claimed invention but merely represents selected embodiments; all other embodiments obtained by a person skilled in the art without inventive effort fall within the scope of the present invention.
Example 1:
As shown in Fig. 1, the live pig carcass segmentation method based on machine learning provided by the invention comprises the following steps:
1) splitting the live pig carcass;
2) performing a tomographic scan of the split live pig carcass using X-ray computed tomography;
3) coloring the different tissues in the tomographic images according to their densities; digitally processing the images and building a three-dimensional model, distinguishing regions such as bone, fat and muscle according to the density coloring during modeling;
4) from the digitized images, the position coordinates and the back-calculated scale of the bones, calculating the characteristic parameters of the bones in several regions using a reference intensity method; calculating the characteristic parameters of muscle and fat in the same way; and storing the characteristic parameters of bones, muscles and fat in a database;
5) designing a mechanical arm segmentation path according to the bone, fat and muscle regions distinguished in the three-dimensional model, and storing the path parameters in the database;
6) repeating steps 1) to 5) to populate the database;
7) using the data stored in the database, taking the characteristic parameters of bones, muscles and fat as inputs and the mechanical arm segmentation path as the output, and training with a neural network algorithm;
8) after training, obtaining a model that automatically generates mechanical arm segmentation path parameters from the characteristic parameters of bones, muscles and fat.
The invention scans the live pig carcass by tomography to obtain data on its bones, muscles and fat, a segmentation path is then designed manually, and machine learning with a neural network algorithm yields a model that automatically generates a mechanical arm segmentation path from the characteristic parameters of the bones, muscles and fat. The trained model is applied in the carcass segmentation process: a mechanical arm segmentation path is generated from each carcass's parameters so that the mechanical arm can segment the carcass, thereby automating live pig carcass segmentation, improving segmentation efficiency and reducing segmentation cost.
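A sketch of this application side is given below for illustration: the trained model generates a path for a new carcass from its feature vector, and the path is handed to the mechanical arm controller. The assumed waypoint layout and the ArmController interface are purely hypothetical; the patent does not specify how paths are encoded or transmitted to the arm.

```python
# Hypothetical inference sketch: generate a segmentation path for a new carcass.
# "model" is the trained regressor from the earlier training sketch; ArmController
# is a made-up stand-in for a real robot interface.
import numpy as np

def generate_path(model, features: np.ndarray, n_waypoints: int) -> np.ndarray:
    """Predict flattened path parameters and reshape them into (n_waypoints, 3) XYZ points."""
    flat = model.predict(features.reshape(1, -1))[0]
    return flat.reshape(n_waypoints, 3)

class ArmController:                      # stand-in for a real mechanical arm interface
    def follow(self, waypoints: np.ndarray) -> None:
        for i, (x, y, z) in enumerate(waypoints):
            print(f"waypoint {i}: move cutter to ({x:.1f}, {y:.1f}, {z:.1f}) mm")

# new_features would come from scanning and processing a fresh carcass (steps 2) to 4)).
# path = generate_path(model, new_features, n_waypoints=20)
# ArmController().follow(path)
```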
For live pigs of the same breed and similar body size, the carcasses are almost identical and so are their segmentation paths. This embodiment therefore also provides a live pig carcass segmentation system based on machine learning, comprising: training and learning on live pig carcasses of different breeds using the segmentation method of this embodiment, to obtain mechanical arm segmentation paths for the different breeds at different body sizes;
collecting the mechanical arm segmentation paths of the different breeds at different body sizes into a database; and
taking the pig breed and body size as input values to retrieve the corresponding mechanical arm segmentation path.
The live pig carcass segmentation system in this embodiment further comprises an input device for obtaining the pig breed and the body size of the carcass. In this embodiment the input device includes a keyboard, but it is not limited to a keyboard; it may also be any device capable of acquiring the pig breed, carcass dimensions and the like, such as a laser rangefinder connected to the system.
When using the live pig carcass segmentation system of this embodiment, the user only needs to enter the pig breed and body size through the input device to obtain the corresponding segmentation path, and the mechanical arm then segments the carcass along that path. No scanning device has to be installed at the cutting site, which avoids the problems of installing one there, such as the complex environment of the working site, the scanner being easily damaged, and scanner radiation being harmful to the workers' health.
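A minimal sketch of this lookup-based arrangement is shown below: paths learned offline are stored per breed and body-size class, and at the cutting site only the breed and measured size are needed to retrieve a path. The breed names, size classes and nearest-size matching rule are illustrative assumptions, not details from the patent.

```python
# Hypothetical path lookup keyed by (breed, body length class); values are assumptions.
import numpy as np

# Offline: paths produced by the trained model for each breed and body-size class,
# stored as {breed: {body_length_cm: waypoint array of shape (n_waypoints, 3)}}.
path_database = {
    "Landrace": {120: np.zeros((20, 3)), 140: np.zeros((20, 3))},
    "Duroc":    {110: np.zeros((20, 3)), 130: np.zeros((20, 3))},
}

def lookup_path(breed: str, body_length_cm: float) -> np.ndarray:
    """Return the stored path for the nearest recorded body size of the given breed."""
    sizes = path_database[breed]
    nearest = min(sizes, key=lambda s: abs(s - body_length_cm))
    return sizes[nearest]

# At the cutting site: breed and size come from the input device (keyboard or rangefinder).
path = lookup_path("Duroc", body_length_cm=127)
print("retrieved", len(path), "waypoints")
```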
The foregoing describes only preferred embodiments of the present invention and does not limit its scope; all equivalent structural changes made using the description and drawings of the invention, and any direct or indirect application in other related technical fields, likewise fall within the scope of the present invention.

Claims (6)

1. A live pig carcass segmentation method based on machine learning, characterized by comprising the following steps:
1) performing a tomographic scan of the live pig carcass;
2) digitally processing the tomographic images and building a three-dimensional model, distinguishing regions such as bone, fat and muscle during modeling;
3) obtaining characteristic parameters of the bones, muscles and fat from the digitally processed images, and storing them in a database;
4) designing a mechanical arm segmentation path according to the bone, fat and muscle regions distinguished in the three-dimensional model, and storing the path parameters in the database;
5) repeating steps 1) to 4) to populate the database;
6) using the data stored in the database, taking the characteristic parameters of bones, muscles and fat as input values and the mechanical arm segmentation path as the output value, and performing machine learning through a neural network algorithm;
7) after machine learning, obtaining a model that automatically generates mechanical arm segmentation path parameters from the characteristic parameters of bones, muscles and fat.
2. The machine learning-based live pig carcass segmentation method according to claim 1, characterized in that step 1) specifically comprises: performing the tomographic scan on the split live pig carcass using X-ray computed tomography.
3. The machine learning-based live pig carcass segmentation method according to claim 2, characterized in that step 2) specifically comprises: coloring the different tissues in the tomographic images according to their densities; and digitally processing the images and building the three-dimensional model, distinguishing regions such as bone, fat and muscle according to the density coloring during modeling.
4. The machine learning-based live pig carcass segmentation method according to claim 1, characterized in that step 3) specifically comprises: from the digitized images, the position coordinates and the back-calculated scale of the bones, calculating the characteristic parameters of the bones in several regions using a reference intensity method; calculating the characteristic parameters of muscle and fat in the same way; and storing the characteristic parameters of bones, muscles and fat in the database.
5. A live pig carcass segmentation system based on machine learning, characterized by comprising:
training and learning on live pig carcasses of different breeds using the machine learning-based live pig carcass segmentation method according to any one of claims 1 to 4, to obtain mechanical arm segmentation paths for the different breeds at different body sizes;
collecting the mechanical arm segmentation paths of the different breeds at different body sizes into a database; and
taking the pig breed and body size as input values to retrieve the corresponding mechanical arm segmentation path.
6. The machine learning-based live pig carcass segmentation system according to claim 5, characterized by further comprising an input device; the input device is used to obtain the pig breed and the body size of the live pig carcass.
CN202110994442.2A, filed 2021-08-27 (priority date 2021-08-27): Live pig carcass segmentation method and system based on machine learning. Status: Active. Granted as CN113643401B.

Priority Applications (1)

CN202110994442.2A (granted as CN113643401B): Live pig carcass segmentation method and system based on machine learning; priority date and filing date 2021-08-27

Publications (2)

CN113643401A, published 2021-11-12
CN113643401B, published 2023-07-14 (grant publication)

Family

ID=78424245

Family Applications (1)

CN202110994442.2A (Active): Live pig carcass segmentation method and system based on machine learning; priority date and filing date 2021-08-27

Country Status (1)

CN: CN113643401B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116502468B (en) * 2023-06-21 2023-11-28 查维斯机械制造(北京)有限公司 Method and system for controlling cattle carcass segmentation robot based on machine vision

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2361205C1 (en) * 2008-01-17 2009-07-10 Государственное образовательное учреждение высшего профессионального образования "Российский университет дружбы народов" (РУДН) Method of evaluation of morphologic composition of carcass of sheep
CN112164073A (en) * 2020-09-22 2021-01-01 江南大学 Image three-dimensional tissue segmentation and determination method based on deep neural network
CN113261582A (en) * 2021-05-17 2021-08-17 福建傲农生物科技集团股份有限公司 Online traceability system is cut apart to subsection position based on pig carcass segmentation technique

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Carcass image segmentation using CNN-based methods; Diogo Nunes Goncalves; Information Processing in Agriculture; pp. 560-572 *
Identification of target muscle regions in sheep hind-leg segmentation based on R2U-Net and dilated convolution; Liu Kaidong; Transactions of the Chinese Society for Agricultural Machinery; pp. 507-514 *

Also Published As

Publication number Publication date
CN113643401A (en) 2021-11-12

Similar Documents

Publication Publication Date Title
CN108921851B (en) Medical CT image segmentation method based on 3D countermeasure network
CN102422307B (en) For method, system, device and computer program that interactive liver vessel and biliary system are assessed
CN109949215B (en) Low-dose CT image simulation method
Tian et al. DCPR-GAN: dental crown prosthesis restoration using two-stage generative adversarial networks
CN110097557B (en) Medical image automatic segmentation method and system based on 3D-UNet
CN112037200A (en) Method for automatically identifying anatomical features and reconstructing model in medical image
Zachow et al. 3D reconstruction of individual anatomy from medical image data: Segmentation and geometry processing
EP3859605A3 (en) Image recognition method, apparatus, device, and computer storage medium
CN113643401B (en) Live pig carcass segmentation method and system based on machine learning
CN106462974B (en) Parameter optimization for segmenting images
WO2021027152A1 (en) Image synthesis method based on conditional generative adversarial network, and related device
CN111724389B (en) Method, device, storage medium and computer equipment for segmenting CT image of hip joint
CN108597017A (en) A kind of textured bone template construction method based on measurement parameter
CN112820399A (en) Method and device for automatically diagnosing benign and malignant thyroid nodules
CN110060315A (en) A kind of image motion artifact eliminating method and system based on artificial intelligence
Mu et al. Robotic 3D vision-guided system for half-sheep cutting robot
CN109544530B (en) Method and system for automatically positioning structural feature points of X-ray head radiography measurement image
CN114342986B (en) Intelligent splitting method for half-carcasses of pigs
CN115530762A (en) CBCT temporomandibular joint automatic positioning method and system
CN115937153A (en) Model training method and device, electronic equipment and computer storage medium
CN115311188B (en) Image recognition method and device, electronic equipment and storage medium
CN109509189B (en) Abdominal muscle labeling method and labeling device based on multiple sub-region templates
CN115439650A (en) Kidney ultrasonic image segmentation method based on CT image cross-mode transfer learning
CN111127636A (en) Intelligent desktop-level three-dimensional diagnosis system for complex intra-articular fracture
CN112102284A (en) Marking method, training method and device of training sample of image segmentation model

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant