CN117053875B - Intelligent poultry phenotype measuring device and method - Google Patents


Info

Publication number
CN117053875B
CN117053875B (application CN202311304678.4A)
Authority
CN
China
Prior art keywords
poultry
point
feature
layer
phenotype
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311304678.4A
Other languages
Chinese (zh)
Other versions
CN117053875A (en)
Inventor
肖德琴
闫志广
刘又夫
廖言易
刘克坚
康俊琪
王佳涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China Agricultural University
Original Assignee
South China Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China Agricultural University
Priority to CN202311304678.4A
Publication of CN117053875A
Application granted
Publication of CN117053875B
Legal status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00 Measuring or testing not otherwise provided for
    • G01D21/02 Measuring two or more variables by means not covered by a single other subclass
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K45/00 Other aviculture appliances, e.g. devices for determining whether a bird is about to lay
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 Methods or arrangements for sensing record carriers by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/10009 Sensing by radiation using wavelengths larger than 0.1 mm, e.g. radio-waves or microwaves
    • G06K7/10297 Arrangements for handling protocols designed for non-contact record carriers such as RFIDs NFCs, e.g. ISO/IEC 14443 and 18092
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/194 Segmentation; Edge detection involving foreground-background segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/806 Fusion of extracted features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Using neural networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Toxicology (AREA)
  • Electromagnetism (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Animal Husbandry (AREA)
  • Birds (AREA)
  • Image Processing (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention discloses an intelligent poultry phenotype measuring device and method, and relates to the field of intelligent poultry phenotype measurement. The measuring device comprises a chassis (1), a weighing scale (2), a first shooting mechanism, a second shooting mechanism, a fence, a nodding mechanism, a rotary opening-and-closing door (7), a wing mark detector (8), an electric control box (9), an RFID reader (10), a wiring box (11) and a power line (12); the chassis (1) comprises wheels (1.1), a first rotating shaft (1.2) and a second rotating shaft (1.3); the fence comprises a net fence (5.1) and a fixed bracket (5.2); and the rotary opening-and-closing door (7) comprises a fixed rotating shaft (7.1) and a door rail (7.2). The invention solves the problems of a single measurement object, single measurement data and low measurement precision in existing poultry phenotype measurement.

Description

Intelligent poultry phenotype measuring device and method
Technical Field
The invention relates to the field of intelligent determination of poultry phenotypes, in particular to an intelligent determination device and method for poultry phenotypes.
Background
The weight and phenotypic characteristics of poultry, such as body size and body temperature, are important indicators in enterprise farming, reflecting both the health of the poultry and the profits obtained by the enterprise. Currently, there are two main ways of measuring the weight and phenotypic characteristics of poultry on farms. The first is manual weighing: a breeder grabs the poultry, weighs it on a scale, and measures data such as body length by eye or with a tape measure. This manual method easily causes a stress reaction in the poultry and affects its growth and development; it also imposes a high workload on the breeder, makes data recording difficult, and, because live poultry are hard to hold still, introduces a certain error into the data obtained. The second is measurement by installed equipment: mounting cameras and floor scales does improve efficiency, but such sensors are complicated to install and difficult to maintain. The cameras are mostly mounted high on the roof at a top-down angle, so the images are not sharp, the viewing directions are few, the error in body-size calculation is large, and finer body measurements such as neck length, leg length and body height cannot be calculated accurately. Moreover, the number of birds on the scale at any moment is uncertain, so the weight data obtained are not representative.
At present, poultry phenotype measurement targets a single object: one set of equipment is designed for only one breed, and most devices can measure only one type of data. Body size and other data are mostly measured from a top-down view only, so the effective information obtained is limited and the measurement accuracy is low.
Disclosure of Invention
Aiming at the defects in the prior art, the intelligent poultry phenotype measuring device and method provided by the invention solve the problems of a single measurement object, single measurement data and low measurement precision in existing poultry phenotype measurement.
In order to achieve the aim of the invention, the invention adopts the following technical scheme: an intelligent poultry phenotype measuring device comprises a chassis, a weighing scale, a first shooting mechanism, a second shooting mechanism, a fence, a nodding mechanism, a rotary opening-and-closing door, a wing mark detector, an electric control box, an RFID reader, a wiring box and a power line. The chassis comprises wheels, a first rotating shaft and a second rotating shaft; the fence comprises a net fence and fixed brackets; and the rotary opening-and-closing door comprises a fixed rotating shaft and a door rail. Four wheels are arranged at the bottom of the chassis, and the first and second rotating shafts are arranged on the two sides of the upper part of the chassis. The weighing scale is arranged in the middle of the chassis; the first shooting mechanism is movably connected with the first rotating shaft, and the second shooting mechanism is movably connected with the second rotating shaft. Four fixed brackets are vertically fixed at the four corners of the chassis; the bottom and upper parts of the fixed brackets are all provided with grooves, and the upper part of each fixed bracket is provided with a pipe diameter. The two net fences are set into the grooves on the two sides of the upper part of the chassis; the nodding mechanism is connected with the fixed brackets through the pipe diameter; the rotary opening-and-closing door is mounted on the fixed brackets through the fixed rotating shaft; the wing mark detector is arranged on a fixed bracket; the electric control box, with a built-in control PC, is arranged on a net fence; the RFID reader is arranged above the side of a fixed bracket; the wiring box is arranged along the fixed brackets; and the power line is led out from inside the chassis for connection to an external power supply.
The beneficial effect of the above scheme is: the invention provides a poultry phenotype measuring device in which more comprehensive data are acquired by arranging a nodding mechanism together with left and right side-shooting mechanisms. Meanwhile, a plurality of grooves are provided to fix the net fence; clamping the net fence into different grooves changes the channel width, so the device suits poultry of different sizes, solving the problems of a single measurement object and low measurement precision in existing poultry phenotype measurement.
Further, the inlet and outlet of the chassis are in a slope shape, and a rechargeable battery is arranged in the chassis.
The beneficial effects of the above-mentioned further scheme are: the inlet and outlet of the chassis are in a slope shape, so that poultry can conveniently enter and exit, and a rechargeable battery is arranged in the chassis to provide power for the measuring device.
Further, the first shooting mechanism comprises a first rocker, a first support and a first camera, one end of the first rocker is movably connected with the first rotating shaft, the other end of the first rocker is fixedly connected with one end of the first support, and the other end of the first support is movably connected with the first camera;
the second shooting mechanism comprises a second rocker, a second support and a second camera, one end of the second rocker is movably connected with the second rotating shaft, the other end of the second rocker is fixedly connected with one end of the second support, and the other end of the second support is movably connected with the second camera.
The beneficial effects of the above further scheme are: the rocker in each side shooting mechanism can rotate about its rotating shaft, and the pillar connected to the camera can also rotate, so the shooting angle of the camera, its height and its front-back distance can be conveniently adjusted. During operation the rocker is rotated out beyond the vehicle body, and after the work is finished it can be rotated back. The device is provided with cameras on both sides for shooting, so comprehensive information data can be captured.
Further, the nodding mechanism comprises an upper fixing frame, a fixing plate, an upper camera, a connecting sleeve and a foot rest, wherein the fixing plate is connected with the upper fixing frame, the upper camera is arranged below the fixing plate, the foot rest is connected with the upper fixing frame through the connecting sleeve, and the foot rest is movably connected with the pipe diameter on the fixing support.
The beneficial effects of the above further scheme are: the upper camera is fixed by the upper fixing frame and the foot rest and is used to obtain a top view of the poultry; the whole nodding mechanism is detachable, which improves the flexibility of the device.
Further, the measuring device further comprises a swing gate motor, the swing gate motor is arranged on the fixed support and connected with a control PC arranged in the electric control box, the swing gate motor comprises a motor main body, a speed reducer, a transmission device and a limit switch, and the motor main body is respectively connected with the speed reducer, the transmission device and the limit switch.
The beneficial effects of the above-mentioned further scheme are: the swing gate motor is used for controlling the movement of the rotary opening and closing door, and the limit switch is used for controlling the movement range of the rotary opening and closing door, so that the rotary opening and closing door can only move within a specified range.
In addition, the invention adopts the following technical scheme: an intelligent determination method for poultry phenotype, comprising the following steps:
s1: the method comprises the steps of wearing light electronic wing marks on poultry to be detected, arranging an RFID tag on each individual poultry, detecting poultry wing mark information by using a wing mark detector, transmitting the poultry wing mark information to a control PC (personal computer) arranged in an electric control box, identifying the number of the poultry by using an RFID reader and transmitting the poultry number to the control PC, and controlling a green light of the wing mark detector to be turned on and a fence entering door in a rotary opening door to be opened outwards by using the control PC;
s2: when the poultry enters the passageway, the control PC is used for controlling the entrance gate to be closed and the wing mark detector to turn on the yellow lamp, and when the poultry reaches the weighing scale, the time and the weight of the poultry on the weighing scale are obtained by the weighing scale and are transmitted into the control PC;
s3: after the weight scale data are stable, acquiring multi-angle image data of the poultry by utilizing the first shooting mechanism, the second shooting mechanism and the pitching mechanism, and uploading the multi-angle image data to a control PC;
s4: based on the multi-angle image data, calculating phenotype data of the poultry through a control PC, and uploading the phenotype data to a database for storage;
s5: after the phenotype data is stored, a control PC is used for controlling the opening of the gate outlet in the rotary opening and closing gate, and after the poultry leaves the measuring device, the gate outlet is controlled to be closed and the wing mark detector is controlled to be lighted up in red, so that the next round of detection is carried out, and the intelligent measurement of the phenotype of the poultry is completed.
The beneficial effect of above-mentioned scheme is: the RFID reader is used for reading the number of the RFID tag on the poultry body, the wing mark detector is used for detecting wing mark information of the poultry, the weighing scale is used for recording the time and weight of the poultry passing through the channel, the nodding mechanism and the side shooting mechanism are used for measuring the body length, the body width, the neck length, the leg length, the body characteristics, whether injury, hair falling, lameness and other phenotype characteristics of the poultry, and meanwhile the acquired phenotype data are recorded and stored in a grading manner, so that the health state of the poultry can be more accurately and effectively analyzed and monitored.
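The S1 to S5 gating sequence above can be sketched as a small state machine. This is a minimal illustration under our own naming (the class `MeasurementStation`, its methods, and the event tuples are not from the patent), with the weighing and phenotype steps stubbed out:

```python
class MeasurementStation:
    """Minimal sketch of the S1-S5 measurement cycle (illustrative names)."""

    def __init__(self):
        self.events = []  # (wing-mark light colour, gate action) pairs

    def run_cycle(self, bird_id, weight_samples):
        # S1: RFID tag read and wing mark verified -> green light, entry opens
        self.events.append(("green", "entry_open"))
        # S2: bird inside the channel -> entry closes, yellow light, weighing
        self.events.append(("yellow", "entry_closed"))
        weight = sum(weight_samples) / len(weight_samples)  # plain mean here;
        # the patent's S2 rule additionally trims the two extreme samples
        # S3-S4: cameras capture multi-angle images, phenotype computed (stub)
        record = {"id": bird_id, "weight": weight}
        # S5: exit opens, bird leaves, exit closes, red light until next bird
        self.events.append(("none", "exit_open"))
        self.events.append(("red", "exit_closed"))
        return record
```

One cycle per bird keeps exactly one animal in the channel at a time, which is what makes the weight and image data attributable to a single RFID number.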
Further, in S2, a time-weight curve is drawn by the control PC, the two extreme values of the weight are removed, and the average is taken over the middle section of time, so that the average weight $\bar{W}$ is calculated by the following formula:

$$\bar{W}=\frac{W_{\mathrm{sum}}-W_{\max}-W_{\min}}{t_{2}-t_{1}}$$

where $W_{\mathrm{sum}}$ is the sum of the weight samples, $W_{\max}$ is the maximum weight value, $W_{\min}$ is the minimum weight value, and $t_{1}$ and $t_{2}$ are the start time and end time of the middle section (one weight sample per unit of time).
The beneficial effects of the above-mentioned further scheme are: the weight value of the poultry is calculated through the formula, extreme values at two ends are removed, and the average value is calculated in the middle section of time, so that the accuracy of the calculation result is improved.
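The averaging rule can be written directly, with the time interval expressed as a sample count (the function name is ours, and one sample per unit of time is assumed):

```python
def average_weight(samples):
    """Average the weight samples after dropping the single maximum and
    single minimum values: (sum - max - min) / (n - 2)."""
    if len(samples) < 3:
        raise ValueError("need at least 3 samples to trim both extremes")
    return (sum(samples) - max(samples) - min(samples)) / (len(samples) - 2)
```

Trimming the extremes discards the transient readings taken while the bird is stepping on or off the scale.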
Further, the step S4 includes the following steps:
s4-1: dividing an image background by using a RANSAC algorithm, wherein the iteration frequency formula in the RANSAC algorithm is as follows:
wherein,minimum number of iterations required for optimal model, +.>For confidence level->For which the data points in the image dataset are outliersProbability of->The minimum data required to construct the dataset model;
s4-2: denoising and smoothing the image after the image background segmentation by adopting an octree filtering algorithm;
s4-3: performing point cloud registration on the denoised and smoothed image, performing point cloud coarse registration by adopting a 4PCS algorithm, performing point cloud fine registration on the coarse registered point cloud image by adopting an ICP algorithm, and completing preprocessing of the image, wherein the evaluation function of the ICP algorithm is as followsThe method comprises the following steps:
wherein,for rotating the transformation matrix +.>Is a three-dimensional translation vector>For the corresponding set of points found in the source point cloud +.>For a sample point set selected in the target point cloud, +.>The number of data points in the point set;
s4-4: marking key feature areas and key feature points of the poultry body area, and establishing a standard three-dimensional template;
s4-5: registering the standard three-dimensional template with the preprocessed actual acquisition point cloud data to obtain global point cloud image data of the poultry;
s4-6: inputting global point cloud image data of the poultry into a bird-PointNet++ neural network to carry out local feature fine segmentation to obtain a poultry segmentation result;
s4-7: positioning measuring points of the poultry based on the poultry segmentation result to obtain phenotype data of the poultry.
The beneficial effects of the above further scheme are: to prevent the captured image from including point cloud data of ground points and fences, the background is removed from the image; denoising and smoothing are performed to reduce the noise present in the point cloud data; a point cloud matching algorithm correctly registers multiple point clouds in the same coordinate system to form a more complete point cloud, completing the preprocessing of the image; and the resulting global point cloud image is input into the Birds-PointNet++ neural network to obtain the poultry phenotype data.
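The two formulas used in the preprocessing steps can be sketched numerically. This is a toy illustration (NumPy only, names ours), not the full 4PCS/ICP pipeline; the RANSAC iteration count and the ICP evaluation follow the standard definitions given in S4-1 and S4-3:

```python
import math
import numpy as np

def ransac_iterations(p, eps, m):
    """Minimum RANSAC iterations k = log(1-p) / log(1-(1-eps)^m), where p is
    the confidence, eps the outlier probability, and m the minimal sample
    size for the model (e.g. 3 points for a ground plane)."""
    return math.ceil(math.log(1 - p) / math.log(1 - (1 - eps) ** m))

def icp_error(R, t, src, dst):
    """ICP evaluation f(R,t) = (1/n) * sum ||q_i - (R p_i + t)||^2, with
    q_i the rows of src and p_i the index-matched rows of dst."""
    diff = src - (dst @ R.T + t)
    return float(np.mean(np.sum(diff ** 2, axis=1)))
```

For a 99% confidence plane fit with half the points being outliers, the formula already demands a few dozen iterations, which is why the iteration count is computed rather than fixed.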
Further, in S4-6, local fine segmentation is performed by the Birds-PointNet++ neural network, comprising the following steps:
S4-6-1: Performing global feature extraction on the input global point cloud image data with an MLP layer; the input contains $N$ points, each with $d$ spatial feature dimensions, yielding the feature $F_{0}$ of size $N\times C$;
S4-6-2: Inputting the feature $F_{0}$ into an SA layer; sampling and grouping yield $N_{1}$ clusters of $K$ points each (size $N_{1}\times K\times C$); feature extraction is performed on each cluster, and downsampling with a PointNet layer gives the output feature $F_{1}$ of size $N_{1}\times C_{1}$;
S4-6-3: Taking the feature $F_{1}$ as the input of the next SA layer; sampling and grouping give a feature of size $N_{2}\times K\times C_{1}$, and downsampling with a PointNet layer gives the feature $F_{2}$ of size $N_{2}\times C_{2}$;
S4-6-4: Upsampling the feature $F_{2}$, inputting the upsampled point features into an FP module, and splicing them with the point features from before downsampling to obtain a feature of size $N_{1}\times(C_{1}+C_{2})$; an MLP layer then performs feature transformation and dimensionality reduction to obtain the feature $F_{3}$ of size $N_{1}\times C_{3}$;
S4-6-5: Taking the feature $F_{3}$ as the input of the next FP module; interpolation gives a feature of size $N\times(C+C_{3})$, and the feature is propagated to the original data point set through an MLP layer to obtain the feature output of the neural network, $F_{4}$ of size $N\times C_{4}$;
S4-6-6: Taking the output feature $F_{4}$ as the input of the final softmax layer, computing the class probability of each point and classifying, finally obtaining the segmentation result;
where $C$ is the feature dimension obtained through the MLP layer, $C_{1}$ is the feature dimension obtained by the first SA-layer downsampling, $C_{2}$ is the feature dimension obtained by the second SA-layer downsampling, $C_{3}$ is the feature dimension obtained by the MLP transformation after the first interpolation in the FP layer, $N_{1}$ is the number of clusters after the first sampling and grouping, $N_{2}$ is the number of clusters after the second sampling and grouping, $K$ is the number of points contained in each cluster, and $C_{4}$ is the feature dimension obtained by the MLP transformation after the second interpolation in the FP layer.
The beneficial effects of the above-mentioned further scheme are: the above-mentioned Birds-PointNet++ neural network includes a downsampling process called encoder and an upsampling process called decoder, and the local feature fine segmentation is performed on the point cloud data image through the above-mentioned neural network.
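The encoder-decoder data flow can be traced at the shape level. The sizes below are illustrative placeholders we chose (the patent does not fix them), and the tensors are zero arrays standing in for learned SA/FP layer outputs:

```python
import numpy as np

# Illustrative sizes: N input points with C-dim features after the MLP;
# cluster counts N1 > N2 for the two SA (downsampling) stages.
N, C = 1024, 64        # after the global MLP
N1, C1 = 256, 128      # after the first SA layer
N2, C2 = 64, 256       # after the second SA layer
C3, C4 = 128, 64       # after the two FP-module MLPs
num_classes = 8        # per-point softmax classes

F0 = np.zeros((N, C))                 # global features
F1 = np.zeros((N1, C1))               # first SA output
F2 = np.zeros((N2, C2))               # second SA output
F3_cat = np.zeros((N1, C1 + C2))      # FP: upsampled F2 spliced with skip F1
F3 = np.zeros((N1, C3))               # after the FP module's MLP
F4_cat = np.zeros((N, C + C3))        # second FP: interpolated + skip F0
F4 = np.zeros((N, C4))                # feature handed to the softmax layer
logits = np.zeros((N, num_classes))   # per-point class scores
```

The concatenation shapes make the skip connections explicit: each FP stage widens the feature to (skip dim + upsampled dim) before its MLP reduces it again.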
Further, in S4-6-4, the upsampled point features and the point features from before downsampling are spliced by interpolation; the interpolated feature is calculated by the following formula:

$$f(x)=\frac{\sum_{i=1}^{k} w_{i}(x)\,f_{i}}{\sum_{i=1}^{k} w_{i}(x)},\qquad w_{i}(x)=\frac{1}{d_{i}^{2}},\qquad k=3$$

and the interpolated feature $f(x)$ and the SA-layer skip feature $f_{\mathrm{SA}}$ are spliced by feature fusion: $F=\big[f(x);\,f_{\mathrm{SA}}\big]$,

where $f(x)$ is the new feature obtained by interpolation at the reference point, $d_{i}$ is the distance between the $i$-th neighbouring point and the reference point, $f_{i}$ is the feature of the $i$-th neighbouring point, $f_{\mathrm{SA}}$ is the feature the point originally had at the corresponding SA-layer downsampling, $d_{1}$, $d_{2}$ and $d_{3}$ are the distances of the first, second and third neighbouring points from the reference point, and $k$ is the number of neighbouring points.
The beneficial effects of the above-mentioned further scheme are: through the formula, the characteristics of each point before downsampling of the layer are obtained by adopting weighted neighbor interpolation, so that the point characteristics after upsampling and the point characteristics before downsampling are spliced conveniently.
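The weighted-neighbour interpolation can be written directly. A minimal NumPy sketch with $k=3$ and inverse-squared-distance weights (the function name and the zero-distance guard are our additions):

```python
import numpy as np

def interpolate_features(query, pts, feats, k=3):
    """Inverse-distance-weighted feature interpolation:
    f(x) = sum_i w_i f_i / sum_i w_i with w_i = 1 / d_i^2 over the k
    nearest points, where d_i is the distance from neighbour i to x."""
    d = np.linalg.norm(pts - query, axis=1)
    idx = np.argsort(d)[:k]                    # k nearest neighbours
    w = 1.0 / np.maximum(d[idx] ** 2, 1e-10)   # guard against d = 0
    return (w[:, None] * feats[idx]).sum(axis=0) / w.sum()
```

Because the weights decay with squared distance, a query that coincides with a stored point effectively recovers that point's own feature, which is exactly what the splice with the pre-downsampling features relies on.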
Drawings
Fig. 1 shows the intelligent poultry phenotype measuring device.
Wherein: 1. chassis; 1.1, wheels; 1.2, first rotating shaft; 1.3, second rotating shaft; 2. weighing scale; 3.1, first rocker; 3.2, first pillar; 3.3, first camera; 4.1, second rocker; 4.2, second pillar; 4.3, second camera; 5.1, net fence; 5.2, fixed bracket; 5.3, grooves; 6.1, upper fixing frame; 6.2, fixing plate; 6.3, upper camera; 6.4, connecting sleeve; 6.5, foot rest; 7. rotary opening-and-closing door; 7.1, fixed rotating shaft; 7.2, door rail; 8. wing mark detector; 9. electric control box; 10. RFID reader; 11. wiring box; 12. power line.
FIG. 2 is a flow chart of a method for intelligently determining the phenotype of poultry.
Fig. 3 is a schematic diagram of a breeding goose point cloud template fabrication.
FIG. 4 is a schematic diagram of a Birds-PointNet++ neural network model according to the present invention.
Detailed Description
The invention will be further described with reference to the drawings and specific examples.
Embodiment 1: as shown in Fig. 1, an intelligent poultry phenotype measuring device comprises a chassis 1, a weighing scale 2, a first shooting mechanism, a second shooting mechanism, a fence, a nodding mechanism, a rotary opening-and-closing door 7, a wing mark detector 8, an electric control box 9, an RFID reader 10, a wiring box 11 and a power line 12. The chassis 1 comprises wheels 1.1, a first rotating shaft 1.2 and a second rotating shaft 1.3; the fence comprises a net fence 5.1 and fixed brackets 5.2; and the rotary opening-and-closing door 7 comprises a fixed rotating shaft 7.1 and a door rail 7.2. Four wheels 1.1 are arranged at the bottom of the chassis 1, and the first rotating shaft 1.2 and the second rotating shaft 1.3 are respectively arranged on the two sides of the upper part of the chassis 1. The weighing scale 2 is arranged in the middle of the chassis 1; the first shooting mechanism is movably connected with the first rotating shaft 1.2, and the second shooting mechanism is movably connected with the second rotating shaft 1.3. Four fixed brackets 5.2 are vertically fixed at the four corners of the chassis 1; the bottom and upper parts of the four fixed brackets 5.2 are all provided with grooves 5.3, and the upper part of each fixed bracket 5.2 is provided with a pipe diameter. The two net fences 5.1 are set into the grooves 5.3 on the two sides of the upper part of the chassis 1; the nodding mechanism is connected with the fixed brackets 5.2 through the pipe diameter; the rotary opening-and-closing door 7 is mounted on the fixed brackets 5.2 through the fixed rotating shaft 7.1; the wing mark detector 8 is arranged on a fixed bracket 5.2; the electric control box 9 is arranged on the net fence 5.1 and has a built-in control PC for controlling the intelligent measurement of the measuring device; the RFID reader 10 is arranged above the side of a fixed bracket 5.2; the wiring box 11 is arranged along the fixed brackets 5.2; and the power line 12 is led out from inside the chassis 1 for connection to an external power supply.
In the embodiment, the design of the fixing bracket has two important functions, namely, a plurality of grooves 5.3 are arranged to fix the net fence 5.1, and the width of the net fence 5.1 can be changed by being clamped in different grooves so as to correspond to poultry of different sizes; and secondly, the pipe diameter is arranged at the upper part and is used for being connected with an upper fixing frame 6.1.
The inlet and outlet of the chassis 1 are in a slope shape, and a rechargeable battery is arranged in the chassis 1.
As shown in Fig. 1, the chassis of the measuring device is provided with 4 wheels 1.1, so the device can move freely and can work in different poultry houses, serving multiple purposes and reducing cost. The inlet and outlet of the chassis 1 are sloped, which is convenient for the poultry to go in and out. A rechargeable battery is arranged in the chassis 1 to power each module of the equipment; a power line is led out of the device for connection to an external power supply, so the internal battery can be charged and the whole device can be powered. The device provides a wiring box 11 for each sensor, with all wiring finally gathered at the electric control box 9 (with built-in control PC).
The first shooting mechanism comprises a first rocker 3.1, a first support column 3.2 and a first camera 3.3, one end of the first rocker 3.1 is movably connected with the first rotating shaft 1.2, the other end of the first rocker is fixedly connected with one end of the first support column 3.2, and the other end of the first support column 3.2 is movably connected with the first camera 3.3;
the second shooting mechanism comprises a second rocker 4.1, a second support column 4.2 and a second camera 4.3, one end of the second rocker 4.1 is movably connected with the second rotating shaft 1.3, the other end of the second rocker is fixedly connected with one end of the second support column 4.2, and the other end of the second support column 4.2 is movably connected with the second camera 4.3.
In the first and second shooting mechanisms, the rocker can rotate about its rotating shaft, and the pillar connecting the camera can also rotate, so the shooting angle of the camera, its height and its front-back distance can all be adjusted. During operation the rocker is rotated out beyond the vehicle body, and it can be turned back after the work is finished; cameras arranged on both sides allow comprehensive information data to be captured.
The nodding mechanism comprises an upper fixing frame 6.1, a fixing plate 6.2, an upper camera 6.3, a connecting sleeve 6.4 and a foot rest 6.5, wherein the fixing plate 6.2 is connected with the upper fixing frame 6.1, the upper camera 6.3 is arranged below the fixing plate 6.2, the foot rest 6.5 is connected with the upper fixing frame 6.1 through the connecting sleeve 6.4, and the foot rest 6.5 is movably connected with the pipe diameter on the fixing support 5.2. The whole nodding mechanism is detachable.
The measuring device further comprises a swing brake motor, the swing brake motor is arranged on the fixed support 5.2 and connected with a control PC arranged in the electric control box 9, the swing brake motor comprises a motor main body, a speed reducer, a transmission device and a limit switch, and the motor main body is respectively connected with the speed reducer, the transmission device and the limit switch.
In this embodiment, the wing mark detector 8 detects the wing mark information of the poultry in real time and transmits it to the control PC. The control PC judges whether access is permitted according to the poultry information and sends an instruction to the swing gate motor. After receiving the instruction from the control PC, the swing gate motor starts the motor main body. The speed reducer and the transmission convert the high-speed rotation of the motor into low-speed rotation of the rotating shaft. Meanwhile, the limit switch restricts the movement range of the door rail so that it can only move within the specified range. When the swing gate reaches the open or closed position, the limit switch detects its state and transmits the state information to the control PC. After the poultry passes through the swing gate, the swing gate motor is automatically turned off to wait for the next operation.
Embodiment 2: as shown in fig. 2, a method for intelligent determination of the phenotype of poultry comprises the following steps:
s1: the method comprises the steps of wearing light electronic wing marks on poultry to be detected, arranging an RFID tag on each individual poultry, detecting poultry wing mark information by using a wing mark detector, transmitting the poultry wing mark information to a control PC (personal computer) arranged in an electric control box, identifying the number of the poultry by using an RFID reader and transmitting the poultry number to the control PC, and controlling a green light of the wing mark detector to be turned on and a fence entering door in a rotary opening door to be opened outwards by using the control PC;
s2: when the poultry enters the passageway, the control PC is used for controlling the entrance gate to be closed and the wing mark detector to turn on the yellow lamp, and when the poultry reaches the weighing scale, the time and the weight of the poultry on the weighing scale are obtained by the weighing scale and are transmitted into the control PC;
s3: after the weight scale data are stable, acquiring multi-angle image data of the poultry by utilizing the first shooting mechanism, the second shooting mechanism and the pitching mechanism, and uploading the multi-angle image data to a control PC;
s4: based on the multi-angle image data, calculating phenotype data of the poultry through a control PC, and uploading the phenotype data to a database for storage;
s5: after the phenotype data is stored, a control PC is used for controlling the opening of the gate outlet in the rotary opening and closing gate, and after the poultry leaves the measuring device, the gate outlet is controlled to be closed and the wing mark detector is controlled to be lighted up in red, so that the next round of detection is carried out, and the intelligent measurement of the phenotype of the poultry is completed.
In S2, the control PC draws a time-weight curve, removes the extreme values at both ends of the weight series, and averages over the middle time segment to obtain the average weight W_avg, calculated as:

W_avg = (Σ_{t=1}^{n} W_t - W_max - W_min) / (n - 2)

wherein Σ W_t is the sum of the body weights, W_max is the maximum weight, W_min is the minimum weight, n is the number of time samples, and t is time.
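The S2 averaging step can be sketched as a trimmed mean. This is an illustrative reading in which only the single maximum and minimum readings are discarded; the function and variable names are not from the patent.

```python
# Hedged sketch of the S2 weight computation: drop the extreme values at
# both ends of the weight series, then average the middle time segment.
def average_weight(weights):
    """Trimmed mean: remove the maximum and minimum readings, average the rest."""
    n = len(weights)
    if n <= 2:
        raise ValueError("need more than two weight samples")
    return (sum(weights) - max(weights) - min(weights)) / (n - 2)

# Simulated scale readings (kg) with unstable values at both ends
readings = [2.1, 4.8, 5.0, 5.1, 4.9, 5.0, 7.3]
print(round(average_weight(readings), 2))  # -> 4.96, the stable middle average
```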
S4, the following steps are included:
s4-1: dividing an image background by using a RANSAC algorithm, wherein the iteration frequency formula in the RANSAC algorithm is as follows:
wherein,minimum number of iterations required for optimal model, +.>For confidence level->For the probability that a data point in the image dataset is an outlier, < +.>The minimum data required to construct the dataset model;
in one embodiment of the invention, the method comprises the steps of:
s4-1-1: from image dataset(poultry Point cloud data set acquired by depth camera) randomly selecting +.>Data of which +.>The data need to satisfy the constituent image dataset model +.>(poultry appearance) number, calculating parameters of the image dataset model;
s4-1-2: using image dataset modelsDistance calculation is performed for all matching points of the image dataset, if matching points are +.>When the distance between the two is smaller than or equal to the threshold value, the matching point is set as an inner point; if the distance is greater than the threshold value, the matching point is an outlier;
s4-1-3: performing loop iteration on the steps S4-1-1 and S4-1-2 until the iteration times meet the expected set iteration value, and marking the model with the largest inner point number as the optimal modelAnd saves the parameters of the model.
S4-2: denoising and smoothing the image after the image background segmentation by adopting an octree filtering algorithm;
in one embodiment of the invention, the method comprises the steps of:
s4-2-1: setting the maximum recursion depth, finding out the maximum size of a scene, and building a first cube according to the maximum size, wherein each node of the octree represents a volume element of a cube, each node has eight child nodes, and the volume elements represented by the eight child nodes are added together to be equal to the volume of a parent node;
s4-2-2: distributing the poultry point cloud into the first built cube according to the octree model, starting subdivision, numbering each node, and stopping subdivision if the number of unit elements distributed to the subcubes is found to be different from zero and the number of the unit elements distributed to the subcubes is the same as that of the parent cube;
s4-2-3: finding out all the minimum nodes according to the node numbers, regarding the node where the point with the minimum height is located as a ground point, performing plane fitting with the adjacent points in a certain threshold according to a least square method to obtain a synthetic bottom surface, regarding all the contained points as ground points, and removing the ground points;
s4-2-4: calculating the number of original points in the volume element, setting a quantity threshold, counting the number of data points in the volume element, if the number of the data points in the volume element is larger than the quantity threshold, retaining the data points contained in the node, and if the number of the data points in the volume element is smaller than or equal to the quantity threshold, deleting all the data points in the node;
s4-2-5: calculating the length of the data point in the volume element from the center point of the volume element, calculating the average value of the distances, deleting the points larger than the average value to smooth the surface of the poultry point cloud, and repeating the steps until all the point cloud data are processed.
S4-3: performing point cloud registration on the denoised and smoothed image, performing point cloud coarse registration by adopting a 4PCS algorithm, performing point cloud fine registration on the coarse registered point cloud image by adopting an ICP algorithm, and completing preprocessing of the image, wherein the evaluation function of the ICP algorithm is as followsThe method comprises the following steps:
wherein,for rotating the transformation matrix +.>Is a three-dimensional translation vector>For the corresponding set of points found in the source point cloud +.>For a sample point set selected in the target point cloud, +.>The number of data points in the point set;
in one embodiment of the invention, the coarse registration of the point cloud comprises the steps of:
s4-3-1: selecting a group of non-completely collinear coplanar four points in a target point cloudThe point set is taken as a group of basis、/>、/>And->) And calculates the distance between the two points +.>And->
S4-3-2: on line segmentAnd->Intersect at->Under the condition of points, respectively calculating independent ratios of two line segments, and finding out the energy and the +% in the source point cloud according to the characteristic that the line segment proportion in rigid transformation has unchanged, wherein the independent ratios of the line segments also remain unchanged>Four-point set with equal ratio->
S4-3-3: and calculating all four point sets corresponding to the coplanar point base in the source point cloud, obtaining a transformation matrix of the four point sets, and selecting the transformation matrix with the highest registration precision according to the principle of maximizing the common point set to perform global transformation.
After coarse registration by the 4PCS algorithm the two point clouds roughly overlap. To further improve registration accuracy, fine registration is performed with the ICP algorithm, whose basic principle is the least squares method: the algorithm repeatedly selects sample points and their corresponding points and calculates the transformation matrix between them until the accuracy requirement is met. The fine registration of the point clouds comprises the following steps:
s4-3-4: selecting in a target point cloudAnd searching for the corresponding point set +.>Make->And->Is the smallest;
s4-3-5: will beThe point cloud and the rotation transformation matrix in (2)>And three-dimensional translation vector->Recombination, obtaining a new point cloud set +.>
S4-3-6: calculate allCorresponding dot set->Average distance between>The formula is:
s4-3-7: if calculatedIf the number of the transformation matrix is smaller than a preset threshold value or larger than a preset maximum iteration number, stopping iteration, wherein the obtained transformation matrix meets the requirements; otherwise, returning to the step S4-3-4, and repeating the operation until the constraint condition is met.
S4-4: marking key feature areas and key feature points of the poultry body area, and establishing a standard three-dimensional template;
in one embodiment of the present invention, as shown in fig. 3, a geese point cloud template is established by taking a geese as an example, and the method includes the following steps:
s4-4-1: manufacturing a standard physical model of the breeding geese;
s4-4-2: a depth camera is adopted to acquire depth data information of the standard physical model from three directions of left view, right view and overlook respectively;
s4-4-3: preprocessing three point cloud images, registering point clouds of left and right views, and registering the obtained result with top view point cloud data;
s4-4-4: denoising and smoothing the registered images through an octree filtering algorithm to obtain a final point cloud standard three-dimensional template.
S4-5: registering the standard three-dimensional template with the preprocessed actual acquisition point cloud data to obtain global point cloud image data of the poultry;
s4-6: inputting global point cloud image data of the poultry into a bird-PointNet++ neural network to carry out local feature fine segmentation to obtain a poultry segmentation result;
s4-7: positioning measuring points of the poultry based on the poultry segmentation result to obtain phenotype data of the poultry.
As shown in fig. 4, the Birds-PointNet++ neural network model extracts point cloud features layer by layer through set abstraction (SA) modules, each mainly divided into a sampling layer, a grouping layer and a feature extraction layer. The sampling layer uses the farthest point sampling algorithm (FPS), which covers the whole space with sampling points more uniformly; the grouping layer takes the sampling points computed by the sampling layer as centers and uses Ball query, i.e. spherical retrieval, to select the points within a spherical neighborhood of a given radius, dividing the point cloud data into several local point sets; the feature extraction layer applies a multi-layer perceptron (MLP) to extract the features of each local point set computed by the preceding layers. By stacking multiple SA modules, Birds-PointNet++ realizes a structure on the point cloud similar to a convolutional neural network (CNN), obtaining deep global semantic features from shallow local features.
For the part segmentation task, the downsampled features are up-sampled to recover a feature for every point in the original point cloud. Birds-PointNet++ implements this feature up-sampling with feature propagation (FP) modules, each mainly divided into interpolation, splicing and linear transformation. Interpolation recovers the features of each point before the layer's downsampling by weighted nearest-neighbor interpolation; splicing concatenates the up-sampled point features with the point features before downsampling; the linear transformation passes the spliced features through a multi-layer perceptron to transform them and reduce their dimensionality. FP modules equal in number to the SA modules are stacked, so the final model outputs as many points as the original point cloud. A multi-layer perceptron then reduces the output features to the number of categories, and softmax yields the category probability of each point in the point cloud.
In S4-6, local fine segmentation is performed with the Birds-PointNet++ neural network, comprising the following steps:
s4-6-1: the method comprises the steps of carrying out global feature extraction on input global point cloud image data by adopting an MLP layer, wherein the global point cloud image data comprisesDots, each dot having +>The dimension of the spatial feature, the feature is obtained>
S4-6-2: features to be characterizedInputting to SA layer, sampling and combining to obtain characteristic ∈>Feature extraction is performed on each cluster, and downsampling is performed by using a PointNet layer to obtain output as a feature +.>
S4-6-3: features to be characterizedAs input to the next SA layer, the features are obtained by sampling and combiningDownsampling with PointNet layer to obtain feature +.>
S4-6-4: features to be characterizedUp-sampling is carried out, the up-sampled point characteristics are input into an FP module, and the up-sampled point characteristics and the point characteristics before down-sampling are spliced to obtain characteristics +.>Using MLP layer pair characteristicsPerforming feature transformation and reducing dimensionality to obtain features->
S4-6-5: features to be characterizedAs input of the next FP module, the characteristics are obtained by interpolationThe feature is transferred to the original data point set through the MLP layer to obtain the feature output of the neural network>
S4-6-6: outputting the characteristicsAs the input of the next softmax layer network, calculating the probability of each point and classifying to finally obtain a segmentation result;
wherein,for the feature dimension obtained through the MLP layer, < + >>For the characteristic dimension obtained by the first downsampling of SA layer,/a>For the characteristic dimension obtained by the second downsampling of SA layer, +.>For the characteristic dimension obtained by performing MLP conversion after interpolation in FP layer for the first time, the ++>For the number of clusters after the first sampling combination, < >>For the number of clusters after the second sampling combination, < > is>Contain the number of points for each cluster, +.>And performing interpolation in the FP layer for the second time and then performing MLP conversion to obtain characteristic dimensions.
In S4-6-4, the up-sampled point features are spliced with the point features before downsampling through interpolation, wherein the interpolated feature is calculated as:

f(x) = ( Σ_{i=1}^{k} w_i(x) · f_i ) / ( Σ_{i=1}^{k} w_i(x) ),  with w_i(x) = 1 / d_i²

The interpolated feature f(x) and the corresponding SA-layer feature are then combined by feature fusion.

Wherein f(x) is the new feature obtained by interpolation, d_i is the distance between the i-th neighboring point and the reference point, f_i is the feature the point had before the corresponding SA-layer downsampling, d1, d2 and d3 are the distances of the first, second and third neighboring points from the reference point, and k is the number of neighboring points (here k = 3).
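The FP module's inverse-distance-weighted interpolation with three neighbours can be sketched as below: a point's new feature is the weighted mean of its neighbours' features with weights w_i = 1/d_i². The toy data, the epsilon guard and all names are illustrative assumptions.

```python
import numpy as np

def interpolate_features(query, neighbors, feats, k=3, eps=1e-10):
    """Inverse-distance-weighted feature interpolation over k nearest points."""
    d = np.linalg.norm(neighbors - query, axis=1)
    nearest = np.argsort(d)[:k]                # indices of the k nearest points
    w = 1.0 / (d[nearest] ** 2 + eps)          # w_i = 1 / d_i^2 (eps avoids /0)
    return (w[:, None] * feats[nearest]).sum(0) / w.sum()

# Toy check: the query sits midway between the first two points, so their
# features dominate; the distant third point contributes almost nothing.
pts = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [0.0, 99.0, 0.0]])
feats = np.array([[1.0, 0.0], [3.0, 0.0], [100.0, 0.0]])
f = interpolate_features(np.array([1.0, 0.0, 0.0]), pts, feats)
print(f[0])  # approximately 2.005: dominated by the two equidistant near points
```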
In one embodiment of the invention, Birds-PointNet++ relies on a large-scale point cloud dataset. During early deployment of the device, the cameras first collect more than 5000 poultry images to build the dataset for training the network. After manual labeling, the poultry phenotype features are divided into head, neck, torso and lower limbs. Errors computed by the loss function of the Birds-PointNet++ model are then back-propagated to update the network parameters. The dataset is divided into a training set, a test set and a validation set in a 3:1:1 ratio.
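The 3:1:1 split described above can be sketched as follows; the file names and the shuffle seed are illustrative assumptions.

```python
import random

def split_3_1_1(samples, seed=42):
    """Shuffle and split a labelled dataset into train/test/validation at 3:1:1."""
    items = list(samples)
    random.Random(seed).shuffle(items)         # reproducible shuffle
    n = len(items)
    n_train, n_test = round(n * 3 / 5), round(n / 5)
    return (items[:n_train],
            items[n_train:n_train + n_test],
            items[n_train + n_test:])

clouds = [f"goose_{i:04d}.pcd" for i in range(5000)]  # hypothetical file names
train, test, val = split_3_1_1(clouds)
print(len(train), len(test), len(val))  # -> 3000 1000 1000
```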
The poultry are classified and graded by feeding the multidimensional information into the neural network, which outputs a rating evaluation. The poultry breeding performance system comprises poultry information management, poultry phenotype record management and poultry body-condition record management: poultry information management provides functions for adding, deleting, modifying and querying poultry information; poultry phenotype record management records the body measurements of each part of each poultry and how body size and weight change over time; and poultry body-condition record management provides the body-condition records of each poultry.
The measuring device provided by the invention solves the problems that manual weighing is time-consuming and labor-intensive, that body-size measurement has low accuracy, and that analysis data are not recorded automatically. The device is cost-effective and freely movable, so one set of equipment can serve several poultry houses; the width of the fence is adjustable, so the device can effectively handle poultry of different breeds and body sizes. The wing-mark-based detection method enables detection and tracking of individual meat poultry and prevents false results caused by missed detections; each poultry carries an RFID number, so data can be matched accurately and growth health can be monitored; the multidimensional meat-poultry data allow more accurate grading and effective analysis and monitoring of health status; and the depth-image-based body-size algorithm is highly robust, quickly yields accurate results, and, being contactless, reduces the stress response of the meat poultry.
Those of ordinary skill in the art will recognize that the embodiments described herein are for the purpose of aiding the reader in understanding the principles of the present invention and should be understood that the scope of the invention is not limited to such specific statements and embodiments. Those of ordinary skill in the art can make various other specific modifications and combinations from the teachings of the present disclosure without departing from the spirit of the invention, and such modifications and combinations are still within the scope of the invention.

Claims (4)

1. An intelligent poultry phenotype determination method, implemented with an intelligent poultry phenotype measuring device, the device comprising a chassis (1), a weight scale (2), a first shooting mechanism, a second shooting mechanism, a fence, a nodding mechanism, a rotary opening-and-closing door (7), a wing mark detector (8), an electric control box (9), an RFID reader (10), a wiring box (11) and a power cord (12), wherein the chassis (1) comprises wheels (1.1), a first rotating shaft (1.2) and a second rotating shaft (1.3), the fence comprises net fences (5.1) and fixing supports (5.2), the rotary opening-and-closing door (7) comprises a fixed rotating shaft (7.1) and gates (7.2), four wheels (1.1) are arranged at the bottom of the chassis (1), the first rotating shaft (1.2) and the second rotating shaft (1.3) are respectively arranged on the two sides of the upper part of the chassis (1), the weight scale (2) is arranged in the middle of the chassis (1), the first shooting mechanism is movably connected with the first rotating shaft (1.2), the second shooting mechanism is movably connected with the second rotating shaft (1.3), fixing supports (5.2) are arranged at the top corner parts of the chassis (1), grooves (5.3) are formed in the fixing supports (5.2), the two net fences (5.1) are arranged on the two sides of the upper part of the chassis (1) through the grooves (5.3), the nodding mechanism is connected with a fixing support (5.2) through a pipe, the rotary opening-and-closing door (7) is arranged on the fixing supports (5.2) through the fixed rotating shaft (7.1), the wing mark detector (8) is arranged on a fixing support (5.2), the electric control box (9) is arranged on a net fence (5.1), a control PC is arranged in the electric control box (9) and used for controlling the intelligent measurement of the measuring device, the RFID reader (10) is arranged above the side of a fixing support (5.2), the wiring box (11) is arranged along the fixing supports (5.2), and the power cord (12) is led out from the inside of the chassis (1) and used for connection with an external power supply;
the inlet and outlet of the chassis (1) are in a slope shape, and a rechargeable battery is arranged in the chassis (1);
the first shooting mechanism comprises a first rocker (3.1), a first support (3.2) and a first camera (3.3), one end of the first rocker (3.1) is movably connected with the first rotating shaft (1.2), the other end of the first rocker is fixedly connected with one end of the first support (3.2), and the other end of the first support (3.2) is movably connected with the first camera (3.3);
the second shooting mechanism comprises a second rocker (4.1), a second support (4.2) and a second camera (4.3), one end of the second rocker (4.1) is movably connected with the second rotating shaft (1.3), the other end of the second rocker is fixedly connected with one end of the second support (4.2), and the other end of the second support (4.2) is movably connected with the second camera (4.3);
the nodding mechanism comprises an upper fixing frame (6.1), a fixing plate (6.2), an upper camera (6.3), a connecting sleeve (6.4) and a foot rest (6.5), wherein the fixing plate (6.2) is connected with the upper fixing frame (6.1), the upper camera (6.3) is arranged below the fixing plate (6.2), the foot rest (6.5) is connected with the upper fixing frame (6.1) through the connecting sleeve (6.4), and the foot rest (6.5) is movably connected with the pipe diameter on the fixing support (5.2);
the measuring device further comprises a swing brake motor, the swing brake motor is arranged on the fixed support (5.2) and connected with a control PC (personal computer) arranged in the electric control box (9), the swing brake motor comprises a motor main body, a speed reducer, a transmission device and a limit switch, and the motor main body is respectively connected with the speed reducer, the transmission device and the limit switch;
the intelligent determination method for the phenotype of the poultry is characterized by comprising the following steps of:
s1: the method comprises the steps of wearing light electronic wing marks on poultry to be detected, arranging an RFID tag on each individual poultry, detecting poultry wing mark information by using a wing mark detector, transmitting the poultry wing mark information to a control PC (personal computer) arranged in an electric control box, identifying the number of the poultry by using an RFID reader and transmitting the poultry number to the control PC, and controlling a green light of the wing mark detector to be turned on and a fence entering door in a rotary opening door to be opened outwards by using the control PC;
s2: when the poultry enters the passageway, the control PC is used for controlling the entrance gate to be closed and the wing mark detector to turn on the yellow lamp, and when the poultry reaches the weighing scale, the time and the weight of the poultry on the weighing scale are obtained by the weighing scale and are transmitted into the control PC;
s3: after the weight scale data are stable, acquiring multi-angle image data of the poultry by utilizing the first shooting mechanism, the second shooting mechanism and the pitching mechanism, and uploading the multi-angle image data to a control PC;
s4: based on the multi-angle image data, calculating phenotype data of the poultry through a control PC, and uploading the phenotype data to a database for storage;
s5: after the phenotype data is stored, a control PC is used for controlling the opening of a gate outlet in the rotary opening and closing gate, and after the poultry leaves the measuring device, the gate outlet is controlled to be closed and the wing mark detector is controlled to be lighted up in red, so that the next round of detection is carried out, and the intelligent measurement of the phenotype of the poultry is completed;
the step S4 comprises the following steps:
s4-1: dividing an image background by using a RANSAC algorithm, wherein the iteration frequency formula in the RANSAC algorithm is as follows:
wherein,minimum number of iterations required for optimal model, +.>For confidence level->For the probability that a data point in the image dataset is an outlier, < +.>The minimum data required to construct the dataset model;
s4-2: denoising and smoothing the image after the image background segmentation by adopting an octree filtering algorithm;
s4-3: performing point cloud registration on the denoised and smoothed image, performing point cloud coarse registration by adopting a 4PCS algorithm, performing point cloud fine registration on the coarse registered point cloud image by adopting an ICP algorithm, and completing preprocessing of the image, wherein the evaluation function of the ICP algorithm is as followsThe method comprises the following steps:
wherein,for rotating the transformation matrix +.>Is a three-dimensional translation vector>For a corresponding set of points found in the source point cloud,for a sample point set selected in the target point cloud, +.>The number of data points in the point set;
s4-4: marking key feature areas and key feature points of the poultry body area, and establishing a standard three-dimensional template;
s4-5: registering the standard three-dimensional template with the preprocessed actual acquisition point cloud data to obtain global point cloud image data of the poultry;
s4-6: inputting global point cloud image data of the poultry into a bird-PointNet++ neural network to carry out local feature fine segmentation to obtain a poultry segmentation result;
s4-7: positioning measuring points of the poultry based on the poultry segmentation result to obtain phenotype data of the poultry.
2. The intelligent determination method according to claim 1, wherein in S2 the control PC draws a time-weight curve, removes the extreme values at both ends of the weight series, and averages over the middle time segment to obtain the average weight W_avg, calculated as:

W_avg = (Σ_{t=1}^{n} W_t - W_max - W_min) / (n - 2)

wherein Σ W_t is the sum of the body weights, W_max is the maximum weight, W_min is the minimum weight, n is the number of time samples, and t is time.
3. The intelligent determination method of poultry phenotype according to claim 1, wherein the local fine segmentation in S4-6 using the Birds-PointNet++ neural network comprises the following steps:
s4-6-1: An MLP layer performs global feature extraction on the input global point cloud image data, which contains N points each with d spatial dimensions, obtaining the feature F0 of size N × C;
s4-6-2: The feature F0 is input to an SA layer; sampling and grouping yield N1 clusters of k points each, feature extraction is performed on each cluster, and downsampling with a PointNet layer outputs the feature F1 of size N1 × C1;
s4-6-3: The feature F1 serves as input to the next SA layer; sampling and grouping yield N2 clusters, and downsampling with the PointNet layer gives the feature F2 of size N2 × C2;
s4-6-4: The feature F2 is up-sampled and input to an FP module, where the up-sampled point features are spliced with the point features before downsampling to give the feature F3 of size N1 × (C1 + C2); the MLP layer performs feature transformation on F3 and reduces its dimensionality, giving the feature F4 of size N1 × C3;
s4-6-5: The feature F4 serves as input of the next FP module; interpolation yields the feature F5, which is transferred to the original data point set through the MLP layer to obtain the feature output F6 of the neural network, of size N × C4;
s4-6-6: The feature output F6 is fed to the next softmax layer network, which calculates the probability of each point and classifies it, finally obtaining the segmentation result;
wherein C is the feature dimension obtained through the MLP layer, C1 is the feature dimension obtained by the first SA-layer downsampling, C2 is the feature dimension obtained by the second SA-layer downsampling, C3 is the feature dimension obtained by MLP transformation after the first interpolation in the FP layer, N1 is the number of clusters after the first sampling and grouping, N2 is the number of clusters after the second sampling and grouping, k is the number of points contained in each cluster, and C4 is the feature dimension obtained by MLP transformation after the second interpolation in the FP layer.
4. The intelligent determination method for poultry phenotype according to claim 3, wherein the up-sampled point features and the point features before down-sampling are spliced by interpolation in the step S4-6-4, and the calculation formula of the interpolation features is as follows:
to interpolate featuresAnd->By feature fusion->
Wherein,for new features obtained by difference, +.>Is the distance between the adjacent point and the reference point, +.>As a feature of the reference point,for the feature when the point originally corresponds to the SA layer downsampling, +.>For the distance of the first neighboring point from the reference point, -/-, is>For the distance of the second neighboring point from the reference point, -/-, is>For the distance of the third neighboring point from the reference point, +.>Is the number of the adjacent points.
CN202311304678.4A 2023-10-10 2023-10-10 Intelligent poultry phenotype measuring device and method Active CN117053875B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311304678.4A CN117053875B (en) 2023-10-10 2023-10-10 Intelligent poultry phenotype measuring device and method

Publications (2)

Publication Number Publication Date
CN117053875A CN117053875A (en) 2023-11-14
CN117053875B true CN117053875B (en) 2023-12-19

Family

ID=88653844

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311304678.4A Active CN117053875B (en) 2023-10-10 2023-10-10 Intelligent poultry phenotype measuring device and method

Country Status (1)

Country Link
CN (1) CN117053875B (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106918381A (en) * 2017-04-27 2017-07-04 宁夏壹加壹农牧股份有限公司 Weighing system and weighing method
CN110432906A (en) * 2018-05-05 2019-11-12 周庆民 A kind of simple livestock Physical examination car
CN110741964A (en) * 2019-11-19 2020-02-04 金昌牧旺养殖技术有限责任公司 Automatic sheep identification and sheep body determination device under computer assistance
CN110866969A (en) * 2019-10-18 2020-03-06 西北工业大学 Engine blade reconstruction method based on neural network and point cloud registration
CN112150523A (en) * 2020-09-24 2020-12-29 中北大学 Three-dimensional point cloud registration method with low overlapping rate
CN213282917U (en) * 2020-05-20 2021-05-28 清远市智慧农业研究院 Non-contact type pig body size parameter measuring system
CN113678751A (en) * 2021-08-19 2021-11-23 安徽大学 Intelligent passageway device for beef cattle body shape parameter acquisition and exercise health recognition
WO2022025354A1 (en) * 2020-07-30 2022-02-03 주식회사 에스티엔 Poultry weight measurement and weight estimation system
CN114187310A (en) * 2021-11-22 2022-03-15 华南农业大学 Large-scale point cloud segmentation method based on octree and PointNet++ network
CN114758222A (en) * 2022-03-09 2022-07-15 哈尔滨工业大学水资源国家工程研究中心有限公司 Concrete pipeline damage identification and volume quantification method based on PointNet++ neural network
CN218736577U (en) * 2022-10-27 2023-03-28 江苏省农业科学院 Integrated pig body size and weight measurement equipment
CN115886792A (en) * 2022-11-23 2023-04-04 内蒙古好快科技有限公司 Intelligent livestock body size measurement method, system and device
CN115918571A (en) * 2023-01-03 2023-04-07 合肥夔牛电子科技有限公司 Fence-passageway cattle body health data extraction device and intelligent extraction method thereof

Also Published As

Publication number Publication date
CN117053875A (en) 2023-11-14

Similar Documents

Publication Publication Date Title
CN105930819B Real-time urban traffic light recognition system based on monocular vision and an integrated GPS navigation system
CN109141248B Image-based pig weight measurement and estimation method and system
US20190339209A1 (en) A system for detecting crack growth of asphalt pavement based on binocular image analysis
CN104268138B Human motion capture method fusing depth maps and three-dimensional models
CN106384079B A real-time pedestrian tracking method based on RGB-D information
CN105787933B Waterfront three-dimensional reconstruction apparatus and method based on multi-view point cloud registration
CN110533722A A fast robot relocalization method and system based on a visual dictionary
CN109887020B (en) Plant organ separation method and system
CN107397658B Multi-scale fully convolutional network and visual blind-guidance method and device
CN108209926A Human height measurement system based on depth images
CN108961330A Image-based pig body length measurement method and system
Xu et al. A new clustering-based framework to the stem estimation and growth fitting of street trees from mobile laser scanning data
CN110837839B High-precision unmanned aerial vehicle orthophoto production and data acquisition method
CN115187803B (en) Positioning method for picking process of famous tea tender shoots
CN206741554U Indoor floor-plan 3D modeling system based on depth-camera 3D scanning devices
CN110084198A CNN-based airport indoor scene recognition method using Fisher feature analysis
CN107169961A A cigarette sorting and detection system and method based on CIS image acquisition
CN113743358A (en) Landscape visual feature recognition method based on all-dimensional acquisition and intelligent calculation
Bobrowski et al. Best practices to use the iPad Pro LiDAR for some procedures of data acquisition in the urban forest
CN115854895A Non-contact standing-tree diameter-at-breast-height measurement method based on target tree form
CN116682106A Deep-learning-based intelligent detection method and device for Diaphorina citri
CN114898405A Portable broiler chicken abnormality monitoring system based on edge computing
CN113469097B (en) Multi-camera real-time detection method for water surface floaters based on SSD network
CN117053875B (en) Intelligent poultry phenotype measuring device and method
CN104809688B Sheep body measurement method and system based on an affine-transformation registration algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant