CN113507490A - Multi-source information fusion live pig full-chain intelligent monitoring method and device - Google Patents

Multi-source information fusion live pig full-chain intelligent monitoring method and device

Info

Publication number
CN113507490A
CN113507490A
Authority
CN
China
Prior art keywords
live pig
node
sub
live
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110486279.9A
Other languages
Chinese (zh)
Inventor
黄汉英
赵思明
熊善柏
李鹏飞
牛猛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong Agricultural University
Original Assignee
Huazhong Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong Agricultural University filed Critical Huazhong Agricultural University
Priority to CN202110486279.9A priority Critical patent/CN113507490A/en
Publication of CN113507490A publication Critical patent/CN113507490A/en
Legal status: Pending (current)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/23 Clustering techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18 Status alarms
    • G08B21/24 Reminder alarms, e.g. anti-loss alarms
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L17/00 Speaker identification or verification techniques
    • G10L17/26 Recognition of special voice characteristics, e.g. for use in lie detectors; Recognition of animal voices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 Network architectures or network communication protocols for network security
    • H04L63/08 Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/0876 Network architectures or network communication protocols for network security for authentication of entities based on the identity of the terminal or configuration, e.g. MAC address, hardware or software configuration or device fingerprint

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Software Systems (AREA)
  • Evolutionary Biology (AREA)
  • Biomedical Technology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Physics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Emergency Management (AREA)
  • Human Computer Interaction (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • Power Engineering (AREA)
  • Medical Informatics (AREA)
  • Business, Economics & Management (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a multi-source information fusion live pig full-chain intelligent monitoring method and device, comprising the following steps: identifying the ear tag of a live pig and determining its identity information; monitoring the corresponding live pig appearance images, live pig cough sounds and live pig group videos in real time according to the identity information; determining a corresponding body type characteristic value from the appearance image; converting the cough sound into a signal waveform diagram and extracting a sound characteristic value from it; determining a corresponding activity characteristic value from the live pig group video; inputting the body type, sound and activity characteristic values into a trained neural network model and outputting a corresponding health grade; generating early warning information according to the health grade and associating it with the identity information; and visualizing the associated identity information and early warning information to remind relevant workers to take action. The invention collects information on multiple aspects, ensures the safety of live pig production, and can warn of hidden health risks of live pigs at low cost.

Description

Multi-source information fusion live pig full-chain intelligent monitoring method and device
Technical Field
The invention relates to the technical field of agricultural information, in particular to a multi-source information fusion live pig full-chain intelligent monitoring method and device.
Background
Pork is a staple food for most people in China, and the yield and quality of pork production are vital to the nation. The existing pork production chain includes the major nodes of cultivation, slaughter, processing, storage and circulation, and each node affects the yield and quality of the pork produced. In the prior art, a single node or a single production factor is usually monitored separately; the monitoring form and content are limited, and the whole industrial chain cannot be monitored and assessed as a whole. In addition, the prior art processes the monitoring information in a traditional data processing mode, which lacks efficiency and speed. Therefore, how to efficiently and comprehensively monitor the health condition of live pig production is an urgent problem to be solved.
Disclosure of Invention
In view of this, there is a need to provide a method and a device for intelligent monitoring of whole chain of live pigs with multi-source information fusion, so as to solve the problem of how to efficiently and comprehensively monitor the health condition of live pig production.
The invention provides a multisource information fusion live pig full-chain intelligent monitoring method, which comprises the following steps:
identifying the ear tag of the live pig and determining the identity information of the live pig;
monitoring corresponding live pig appearance images, live pig cough sounds and live pig group videos in real time according to the identity information of the live pigs;
determining a corresponding body type characteristic value according to the live pig appearance image;
converting the cough sound of the live pig into a signal oscillogram, and extracting a sound characteristic value of the signal oscillogram;
determining a corresponding activity characteristic value according to the live pig group video;
inputting the body type characteristic value, the sound characteristic value and the activity characteristic value into a trained neural network model, and outputting a corresponding health grade;
generating early warning information according to the health grade, and associating the early warning information with the identity information;
and carrying out visual processing on the associated identity information and the early warning information so as to remind related workers to process the live pigs with the hidden health risks.
Further, the determining the corresponding body type characteristic value according to the live pig appearance image comprises:
performing edge recognition according to the live pig appearance image, determining a corresponding edge contour, and determining the posture characteristics of the live pig according to the edge contour, wherein the posture characteristics comprise the body length and the body weight of the live pig;
carrying out target identification according to the live pig exterior image, identifying an abnormal part on the surface of the live pig, and determining the skin surface characteristics of the live pig according to the abnormal part, wherein the skin surface characteristics comprise the area of the abnormal part and the severity of the abnormal part;
and determining the body type characteristic value according to the body state characteristic and the skin surface characteristic.
Further, the converting the cough sound of the live pig into a signal waveform diagram and extracting the sound characteristic value of the signal waveform diagram include:
converting the live pig cough sounds into a signal waveform diagram;
extracting sound characteristic values of the signal oscillogram;
and comparing the sound characteristic value with a preset standard sound characteristic value.
Further, the determining, according to the live pig population video, a corresponding activity feature value includes:
counting the activity duration of each live pig according to the identity information and the live pig group video;
comparing the activity duration with a preset activity time standard value, and determining a corresponding activity numerical range;
and determining the activity characteristic value corresponding to the live pig according to the activity numerical range.
Further, the method also includes: performing video framing processing on the live pig group videos, judging from the inter-frame image differences whether fighting behavior exists between live pigs, and, if so, generating fighting early warning information to remind relevant workers to stop the fight.
Further, still include:
acquiring the weight of feed in a trough corresponding to the live pig;
and associating the identity information with the feed weight, comparing the feed weight with a preset standard feed amount to determine a corresponding feed difference value, and, if the feed difference value exceeds a first preset value, generating alarm information and transmitting it to relevant workers so that the corresponding live pigs can be checked in time.
Further, still include:
acquiring environmental factors, processing factors, circulation factors and operation images formed by each node in the production process of the live pigs;
comparing the environmental factors, the processing factors and the circulation factors with a prestored parameter database, and generating first early warning information according to a comparison result and sending the first early warning information to a corresponding node so as to remind related workers to regulate and control the production process;
and matching and comparing the operation image with a prestored operation standard image library, and generating second early warning information according to a comparison result and sending the second early warning information to the corresponding node so as to remind related workers to adjust the operation standard.
Further, the nodes comprise a breeding node, a slaughtering node, a processing node, a storage node and a circulation node, wherein the breeding node sequentially comprises a breeding sub-node, a feeding sub-node, an environment sub-node, a disease sub-node and a finished-pig slaughter-delivery sub-node; the slaughtering node sequentially comprises a live pig acceptance sub-node, a hot water flushing sub-node, a slitting sub-node, a precooling sub-node and a pork packaging sub-node; the processing node sequentially comprises a raw material receiving sub-node, a pickling sub-node, a chopping sub-node, a cooking and sterilization sub-node and a finished product packaging sub-node; the storage node comprises a warehouse-in sub-node, a storage sub-node and a warehouse-out sub-node; and the circulation node comprises a dispatch sub-node, a transportation sub-node and a delivery sub-node.
Further, still include:
acquiring the culture node parameter, the slaughter node parameter, the processing node parameter, the storage node parameter and the circulation node parameter;
performing data filtering on the breeding node parameter, the slaughter node parameter, the processing node parameter, the storage node parameter and the circulation node parameter, and determining a filtered parameter to be uploaded;
sequencing the processing time delay of each parameter to be uploaded in an ascending order to form a first sequence, and placing each newly added parameter to be uploaded at the edge node at the tail end of the first sequence;
adjusting the first sequence according to the transmission delay of each edge node, and determining the uploading sequence of the parameters to be uploaded according to the adjusted first sequence;
and sequentially carrying out data processing on the uploaded parameters to be uploaded, and visualizing the data processing result.
The invention also provides a multi-source information fusion live pig full-chain intelligent monitoring device, which comprises a processor and a memory storing a computer program; when the computer program is executed by the processor, the multi-source information fusion live pig full-chain intelligent monitoring method described above is implemented.
Compared with the prior art, the invention has the following beneficial effects. Firstly, the identity information of each live pig is determined through its ear tag, so that different live pigs can be distinguished. Then, for each live pig, the corresponding appearance images are monitored in real time to assess its external health, its cough sounds are collected for feature extraction, and live pig group video is acquired to judge its activity state. Further, a corresponding body type characteristic value is determined from the appearance image, reflecting the body-surface health of the pig; the cough sounds are converted into corresponding signal waveform diagrams and sound features are extracted, reflecting the acoustic health of the pig; and the pig's activity is judged from the live pig group video. Finally, the health of the live pig is judged from multiple angles according to the body type, sound and activity characteristic values, achieving efficient and accurate assessment. In conclusion, the system collects live pig information from many aspects, guarantees the safety of live pig production, eliminates hidden health hazards at low cost, avoids the complexity of manual diagnosis, provides timely preliminary diagnosis, monitors the live pig production process efficiently and accurately, supports timely feedback and early warning, and improves the safety of live pig production.
Drawings
FIG. 1 is a schematic flow chart of a multi-source information fusion live pig whole chain intelligent monitoring method provided by the invention;
fig. 2 is a schematic structural diagram of a multi-source information-fused live pig full-chain intelligent monitoring system provided by the invention.
Detailed Description
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate preferred embodiments of the invention and together with the description, serve to explain the principles of the invention and not to limit the scope of the invention.
Example 1
An embodiment of the invention provides a multi-source information fusion live pig full-chain intelligent monitoring method. With reference to fig. 1, a schematic flow diagram of the method provided by the invention, the method comprises steps S1 to S8, wherein:
in step S1, identifying the ear tag of the live pig, and determining the identity information of the live pig;
in step S2, monitoring the corresponding live pig appearance image, the live pig cough sound, and the live pig group video in real time according to the identity information of the live pig;
in step S3, determining a corresponding body type feature value according to the exterior image of the live pig;
in step S4, converting the cough sound of the live pig into a signal waveform diagram, and extracting sound features from the signal waveform diagram;
in step S5, determining a corresponding activity feature value according to the live pig group video;
in step S6, inputting the body type, sound, and activity characteristic values into a trained neural network model, and outputting corresponding health grades;
in step S7, generating early warning information according to the health level, and associating the early warning information with the identity information;
in step S8, the correlated identity information and the early warning information are visualized to remind relevant staff to treat live pigs with health risks.
In the embodiment of the invention, the identity information of each live pig is first determined through its ear tag, so that different live pigs can be distinguished. Then, for each live pig, the corresponding appearance images are monitored in real time to assess its external health, its cough sounds are collected for feature extraction, and live pig group video is acquired to judge its activity state. Further, a corresponding body type characteristic value is determined from the appearance image, reflecting the body-surface health of the pig; the cough sounds are converted into corresponding signal waveform diagrams and sound features are extracted, reflecting the acoustic health of the pig; and the pig's activity is judged from the live pig group video. Finally, the health of the live pig is judged from multiple angles according to the body type, sound and activity characteristic values, achieving efficient and accurate assessment.
Preferably, step S3 specifically includes: performing edge recognition according to the live pig appearance image, determining a corresponding edge contour, and determining the posture characteristics of the live pig according to the edge contour, wherein the posture characteristics comprise the body length and the body weight of the live pig; carrying out target identification according to the live pig exterior image, identifying abnormal parts on the live pig surface, and determining the skin surface characteristics of the live pig according to the abnormal parts, wherein the skin surface characteristics comprise the area of the abnormal parts and the severity of the abnormal parts; and determining a body type characteristic value according to the body state characteristic and the skin surface characteristic. As a specific embodiment, the embodiment of the invention carries out edge recognition on the live pig exterior image so as to monitor the body type characteristic value of the live pig and judge the health degree of the live pig exterior.
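The body-type feature extraction described above can be sketched in a few lines. The following is an illustrative Python sketch, not the patent's implementation: the binary-mask representation, the helper names, and the choice of bounding-box extent as "body length" are all assumptions made for the example.

```python
# Illustrative sketch: derive simple body-type features from binary masks,
# where mask[r][c] == 1 marks a live-pig pixel (the segmented edge contour's
# interior) and lesion[r][c] == 1 marks an abnormal skin patch.

def body_length(mask):
    """Horizontal extent of the foreground region, in pixels."""
    cols = [c for row in mask for c, v in enumerate(row) if v]
    return max(cols) - min(cols) + 1 if cols else 0

def abnormal_area(lesion):
    """Total abnormal-patch area, in pixels."""
    return sum(v for row in lesion for v in row)

mask = [
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 1],
    [0, 0, 1, 1, 0],
]
lesion = [
    [0, 0, 0, 0, 0],
    [0, 0, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
print(body_length(mask))      # 4
print(abnormal_area(lesion))  # 2
```

In practice these pixel measurements would be calibrated to physical units and combined, as the patent describes, with a severity estimate into the final body type characteristic value.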
Preferably, step S4 specifically includes: converting the coughing sounds of the live pigs into signal oscillograms; extracting sound characteristics of the signal oscillogram; and comparing the sound characteristic with a preset standard sound characteristic quantity. As a specific embodiment, the embodiment of the invention preliminarily judges the cough sound of the live pig, and extracts the corresponding sound characteristic by monitoring the sound of the live pig.
Preferably, step S5 specifically includes: counting the activity duration of each live pig according to the identity information and the live pig group video; comparing the activity duration with a preset activity time standard value, and determining a corresponding activity numerical range; and determining the activity characteristic value corresponding to the live pig according to the activity numerical range. As a specific example, the embodiment of the invention judges the health condition of the live pig by detecting the activity frequency of the live pig.
In one embodiment of the invention, monitoring the emergency response behavior of pigs comprises: collecting video image data of pig activity with a camera; collecting pig sound signals with a portable microphone; transmitting the video image data and the sound signal data to an edge server; and having the edge server process and analyze the data to obtain a pig health assessment. The emergency response behavior of the pigs is monitored in real time to safeguard their health and the quality of the pork. If a pig's health is assessed at a low grade, an alarm is generated to notify a manager to check in time whether disease or injury has occurred, preventing losses caused by harm to the pigs.
It should be noted that pigs show a number of characteristic emergency responses, briefly described as follows. When eating, pigs move about slowly and compete for food, and they generally sleep after feeding. Upon an external sound stimulus or the presence of a person, pigs run, stand still, and then resume moving freely. When a strange pig intrudes, the pigs crowd together and attack and bite it. Some sick pigs are attacked by the others. Healthy pigs excrete at a fixed place, while sick pigs defecate anywhere.
According to the emergency response behavior of the live pigs, the health state of the live pigs can be monitored by adopting video image recognition and voice recognition, and health grade assessment and early warning are realized. The specific method comprises the following steps:
Firstly, the live pig health grade identification method based on video images:
1) Collect video image data of live pig activity with a camera and transmit it to the edge server.
2) Preprocess the acquired video image data:
Segment the video image according to color features and separate the live pig from the background to obtain a primary segmentation image; convert the color image from RGB to HSV to perform identification and segmentation of the color features. The conversion formulas are:
R′ = R/255
G′ = G/255
B′ = B/255
Cmax = max(R′, G′, B′)
Cmin = min(R′, G′, B′)
Δ = Cmax − Cmin
H = 0, if Δ = 0
H = 60° × (((G′ − B′)/Δ) mod 6), if Cmax = R′
H = 60° × ((B′ − R′)/Δ + 2), if Cmax = G′
H = 60° × ((R′ − G′)/Δ + 4), if Cmax = B′
S = 0 if Cmax = 0, otherwise Δ/Cmax
V = Cmax
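The RGB-to-HSV conversion can be transcribed directly into code. The following is an illustrative Python sketch of the standard formulas (in production one would normally use a library routine, e.g. OpenCV's cvtColor):

```python
def rgb_to_hsv(r, g, b):
    """r, g, b in 0..255 -> (H in degrees, S in 0..1, V in 0..1)."""
    rp, gp, bp = r / 255, g / 255, b / 255           # R', G', B'
    cmax, cmin = max(rp, gp, bp), min(rp, gp, bp)
    delta = cmax - cmin
    if delta == 0:
        h = 0.0
    elif cmax == rp:
        h = 60 * (((gp - bp) / delta) % 6)
    elif cmax == gp:
        h = 60 * ((bp - rp) / delta + 2)
    else:
        h = 60 * ((rp - gp) / delta + 4)
    s = 0.0 if cmax == 0 else delta / cmax
    return h, s, cmax

print(rgb_to_hsv(255, 0, 0))  # (0.0, 1.0, 1.0) -- pure red
```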
Remove holes and noise points in the image using methods such as dilation, erosion and maximum connected-domain segmentation.
The dilation formula is:
A ⊕ B = { (x, y) | B(x, y) ∩ A ≠ ∅ }
That is, the image A is dilated with B, where B is a convolution template (convolution kernel) whose shape may be square or circular. Each pixel of the image is scanned by convolving template B with image A, and the template elements are combined with the binary image elements: if they are all 0, the target pixel is set to 0; otherwise it is set to 1. Thus the maximum pixel value of the area covered by B is computed and used to replace the pixel value of the reference point, which realizes dilation.
The erosion formula is:
A ⊖ B = { (x, y) | B(x, y) ⊆ A }
That is, the image A is eroded with the convolution template B: B is convolved with A, the minimum pixel value of the area covered by B is obtained, and it replaces the pixel value of the reference point.
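Dilation and erosion as described here reduce to a max/min filter over the template's footprint. A minimal Python sketch with a square (2k+1)×(2k+1) template (an illustration of the operations, not the patent's code):

```python
def _apply(img, k, reduce_fn):
    """Scan a (2k+1)x(2k+1) square template over binary image `img` and
    replace each pixel by reduce_fn (max -> dilation, min -> erosion)
    of the covered area, clipping the template at the image border."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            patch = [img[y][x]
                     for y in range(max(0, i - k), min(h, i + k + 1))
                     for x in range(max(0, j - k), min(w, j + k + 1))]
            out[i][j] = reduce_fn(patch)
    return out

def dilate(img, k=1):
    return _apply(img, k, max)

def erode(img, k=1):
    return _apply(img, k, min)

img = [[0, 0, 0],
       [0, 1, 0],
       [0, 0, 0]]
print(dilate(img))  # single foreground pixel grows to fill the 3x3 image
print(erode(img))   # single foreground pixel is removed entirely
```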
3) Acquire the centroid of the live pig image and extract the movement track of the live pig to obtain its moving speed and acceleration.
The moving speed of the live pig is calculated as:
vx(i) = x(i+1) − x(i)
vy(i) = y(i+1) − y(i)
v(i) = sqrt(vx(i)² + vy(i)²)
where x and y are the coordinates of the live pig's centroid; the moving speed is represented by the distance between the centroid coordinates of two adjacent frames.
The acceleration of the live pig's movement is calculated as:
a(i) = (v(i+n) − v(i)) / n
that is, the 2n speed values are differenced at an interval of n to obtain the acceleration.
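The speed and acceleration computation from a centroid track can be sketched as follows (an illustrative Python example; the track values are made up, and the time step between frames is taken as 1):

```python
import math

def speeds(centroids):
    """Per-frame speed: distance between centroid coordinates
    of two adjacent frames."""
    return [math.hypot(x2 - x1, y2 - y1)
            for (x1, y1), (x2, y2) in zip(centroids, centroids[1:])]

def accelerations(v, n):
    """Difference the speed sequence at interval n: a_i = (v_{i+n} - v_i) / n."""
    return [(v[i + n] - v[i]) / n for i in range(len(v) - n)]

track = [(0, 0), (3, 4), (6, 8), (9, 12)]  # centroid per frame (pixels)
v = speeds(track)        # [5.0, 5.0, 5.0] -- uniform motion
a = accelerations(v, 1)  # [0.0, 0.0]     -- hence zero acceleration
print(v, a)
```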
4) Classify the data samples with a hierarchical clustering algorithm, and compute the parameter range for each live pig health assessment from the cluster centre value of each class.
The specific process comprises the following steps:
Extract the moving speed and acceleration data of 300 samples and perform cluster analysis on each with hierarchical clustering; the algorithm selection, the choice of clustering data, and the division of the live pig health assessment are as follows.
The most effective algorithm is selected by comparing, for each method, the correlation coefficient between the cluster-tree distances of the elements and their actual distances. Five methods are compared: the shortest distance method, the longest distance method, the unweighted average distance method, the centroid distance method, and the inner square distance method, as shown in Table 1, where R1 is the correlation coefficient when the clustering is computed from the speed data and R2 when computed from the acceleration data.
TABLE 1
Correlation coefficient | Shortest distance method | Longest distance method | Unweighted average distance method | Centroid distance method | Inner square distance method
R1 | 0.9556 | 0.9341 | 0.9068 | 0.9068 | 0.9057
R2 | 0.9643 | 0.9545 | 0.9567 | 0.9567 | 0.8745
As can be seen from table 1, in the results of using the velocity calculation and using the acceleration calculation, the correlation coefficient of the shortest distance method is the largest, and therefore the systematic clustering tree calculated by the shortest distance method is used as the basis for the division. The data are properly divided into three types according to the system clustering tree and the scatter diagram, so that the health evaluation of the live pigs is divided into three grades of diseases, sub-health and health, and the central values of the grades are 7.7346, 31.6225 and 50.4169 respectively.
The moving speed range corresponding to the disease level is [0, 19.6786 ], the moving speed range corresponding to the sub-health level is [19.6786, 41.0197], and the moving speed corresponding to the health level is more than 41.0197. The live pig health assessment score is shown in table 2.
TABLE 2
Live pig health assessment | Speed (pixels/s)
Disease | 0–19.6786
Sub-health | 19.6786–41.0197
Health | >41.0197
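The grading by moving speed can be expressed as a simple threshold rule using the cluster boundaries from Table 2 (an illustrative Python sketch; the boundary handling at exactly 19.6786 and 41.0197 follows the ranges stated above):

```python
# Cluster boundaries from Table 2 (pixels/s).
DISEASE_MAX, SUBHEALTH_MAX = 19.6786, 41.0197

def health_grade(speed):
    """Map a measured moving speed to one of the three health grades."""
    if speed < DISEASE_MAX:
        return "disease"
    if speed <= SUBHEALTH_MAX:
        return "sub-health"
    return "health"

# The three cluster centre values fall in their respective grades:
print(health_grade(7.7346))   # disease
print(health_grade(31.6225))  # sub-health
print(health_grade(50.4169))  # health
```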
Secondly, calculate the identification accuracy of the pig emergency response actions, with the formulas:
Pk = Ak / Nk × 100%
P0 = (Σk Ak) / (Σk Nk) × 100%
where Pk is the identification accuracy for class-k samples, Ak is the number of correctly identified class-k samples, Nk is the total number of class-k samples, and P0 is the identification accuracy averaged over all samples.
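These accuracy formulas can be checked with a short example (illustrative Python; the per-class counts are made-up numbers, not results from the patent):

```python
def per_class_accuracy(correct, total):
    """P_k = A_k / N_k for each class k."""
    return [a / n for a, n in zip(correct, total)]

def overall_accuracy(correct, total):
    """P_0: accuracy averaged over all samples (total correct / total count)."""
    return sum(correct) / sum(total)

A = [45, 38, 52]  # correctly identified samples per class (hypothetical)
N = [50, 40, 60]  # total samples per class (hypothetical)
print(per_class_accuracy(A, N))  # per-class P_k values
print(overall_accuracy(A, N))    # overall P_0
```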
Thirdly, the method for recognizing the health grade of the live pig by voice is as follows:
1) collecting sound signal data of the live pigs by using a portable sound collector, and transmitting the sound signal data to an edge server;
2) preprocessing the sound signal, including denoising, pre-emphasis, windowing and framing, end point detection and the like;
1. spectral subtraction denoising
The actually collected signal is the superposition of the original signal and a noise signal, so the estimated average noise energy is subtracted from the energy of the input signal to remove the stationary noise component and finally obtain clean speech. The average energy D(k) of the noise segment is:
D(k) = (1/NIS) Σ_{i=1}^{NIS} x_i(k)²
where L is the frame length, NIS is the number of leading noise frames, and x_i(k) is the value of the k-th sample point of the i-th frame of data.
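The noise-energy estimate used by spectral subtraction is a per-sample-index mean square over the leading noise-only frames; a minimal Python sketch (the frame values are made up for illustration):

```python
def noise_average_energy(frames):
    """D(k): mean squared value of sample point k over the leading
    noise-only frames (NIS = len(frames), L = frame length)."""
    nis, L = len(frames), len(frames[0])
    return [sum(f[k] ** 2 for f in frames) / nis for k in range(L)]

noise_frames = [[1, 2], [3, 2]]            # two leading noise frames, L = 2
print(noise_average_energy(noise_frames))  # [5.0, 4.0]
```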
2. Pre-emphasis
To enhance the high frequency components of the sound signal, the audio signal is pre-emphasized. The pre-emphasis is to pass the sound signal through a digital filter, and the high-frequency characteristic of the pig sound signal can be compensated through the pre-emphasis.
The filter order is 1 and the transfer function is:
H(z) = 1 − αz⁻¹
where H(z) is the transfer function in the z-domain and α is the pre-emphasis factor, typically 0.9 < α < 1.0.
If the sample value of the pig sound signal at time n is x(n), the pre-emphasized signal is:
y(n) = x(n) − αx(n−1)
where y(n) is the signal sequence after pre-emphasis.
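The pre-emphasis difference equation is one line of code. An illustrative Python sketch (the convention y(0) = x(0) for the first sample is an assumption; α = 0.5 is used here only to give round numbers, while the text recommends 0.9 < α < 1.0):

```python
def pre_emphasis(x, alpha=0.97):
    """y(n) = x(n) - alpha * x(n-1), with y(0) = x(0)."""
    return [x[0]] + [x[n] - alpha * x[n - 1] for n in range(1, len(x))]

print(pre_emphasis([1.0, 1.0, 1.0, 1.0], alpha=0.5))  # [1.0, 0.5, 0.5, 0.5]
```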
Usually, framing of the sound signal is realized by multiplying the original signal s(n) by a window function w(n); the windowed signal s_w(n) is:
s_w(n) = s(n)·w(n)
A Hamming window is adopted as the window function: its weighting coefficients give smaller side lobes, at the cost of a slower side-lobe roll-off, and it performs better here than the rectangular and Hanning windows. The formula is:
w(n) = 0.54 − 0.46·cos(2πn/(l − 1)), 0 ≤ n ≤ l − 1; w(n) = 0 otherwise
where l is the window length, i.e., the number of signal points in the window.
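Framing plus Hamming windowing can be sketched as follows; the frame length and hop size are illustrative, not values from the patent:

```python
import numpy as np

def frame_and_window(signal, frame_len, hop):
    """Split a 1-D signal into overlapping frames and apply a Hamming window
    w(n) = 0.54 - 0.46*cos(2*pi*n/(l-1))."""
    n_frames = 1 + (len(signal) - frame_len) // hop
    n = np.arange(frame_len)
    w = 0.54 - 0.46 * np.cos(2 * np.pi * n / (frame_len - 1))
    frames = np.stack([signal[i * hop:i * hop + frame_len] * w
                       for i in range(n_frames)])
    return frames, w

sig = np.ones(1000)
frames, w = frame_and_window(sig, frame_len=256, hop=128)
```

The window tapers from 0.08 at the edges to ~1.0 in the middle, which suppresses the spectral leakage a rectangular window would cause at frame boundaries.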
The start and end points of the speech signal are detected with a double-threshold method, which reduces the influence of environmental noise on endpoint detection and intercepts the voice segment more accurately.
The short-time average energy is:
E_i = (1/L) Σ_{n=1}^{L} x_i(n)²
The short-time average zero-crossing rate is:
Z_i = (1/2L) Σ_{n=2}^{L} |sgn(x_i(n)) − sgn(x_i(n−1))|
wherein L is the frame length, x_i(n) is the value of the n-th sample point of the i-th frame, and sgn is the sign function:
sgn(x) = 1 for x ≥ 0; sgn(x) = −1 for x < 0
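The two per-frame statistics behind double-threshold endpoint detection can be sketched directly; the thresholds and the simple "either statistic exceeds its threshold" rule are illustrative assumptions:

```python
import numpy as np

def short_time_energy(frame):
    """Average energy of one frame."""
    return float(np.sum(frame.astype(float) ** 2)) / len(frame)

def zero_crossing_rate(frame):
    """(1/2L) * sum |sgn(x(n)) - sgn(x(n-1))|, with sgn(x)=1 for x>=0 else -1."""
    s = np.where(frame >= 0, 1, -1)
    return float(np.sum(np.abs(np.diff(s)))) / (2 * len(frame))

def is_speech(frame, energy_thresh, zcr_thresh):
    """Crude single-frame decision: voice if either statistic clears its threshold."""
    return (short_time_energy(frame) > energy_thresh
            or zero_crossing_rate(frame) > zcr_thresh)

silence = np.zeros(160)
voiced = np.sin(np.linspace(0, 20 * np.pi, 160))   # 10 cycles of a tone
```

A full detector tracks runs of frames against a low and a high threshold; the per-frame statistics above are the building blocks.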
A fast Fourier transform is applied to each frame of the acquired sound signal:
X(i, k) = Σ_{m=0}^{L−1} x_i(m)·e^(−j2πmk/L), k = 0, 1, …, L − 1
wherein X(i, k) is the spectrum of the signal, x_i(m) is the original sound signal, and i is the frame index. The short-time amplitude spectrum of the signal is |X(i, k)|, and the energy density function P(i, k) is:
P(i,k)=|X(i,k)|2
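The per-frame spectrum and energy density above map directly onto numpy's FFT; the test frames are synthetic:

```python
import numpy as np

def frame_spectrum(frames):
    """Per-frame FFT X(i,k), amplitude spectrum |X(i,k)| and
    energy density P(i,k) = |X(i,k)|^2."""
    X = np.fft.fft(frames, axis=1)
    mag = np.abs(X)
    return X, mag, mag ** 2

frames = np.stack([np.cos(2 * np.pi * 4 * np.arange(64) / 64),  # tone at bin 4
                   np.ones(64)])                                 # pure DC
X, mag, P = frame_spectrum(frames)
```

A length-64 cosine at 4 cycles per frame puts amplitude 32 at bin 4 (and its mirror bin 60), while the all-ones frame concentrates everything at bin 0.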
The spectral features extracted from the sound signals are then classified with an artificial-intelligence model to distinguish live pig cough sounds from normal sounds, and the corresponding pigs are marked.
Combining the health grade identified from the images with the voice-recognition result, the live pig health early warning level is constructed.
TABLE 3
Health grade   Voice recognition result   Early warning level   Feeding and activity situation
Disease        Live pig cough sound       Level three           Pig does not feed; no response to external sound
Sub-health     Normal live pig sound      Level two             Reduced feeding activity
Health         Normal live pig sound      Level one             Normal feeding and activity
The live pig health early warning levels are divided into 3 grades, with level three the lowest and level one the highest.
When the early warning level is three, the pig may be diseased: an alarm must be generated in time and a veterinarian notified to examine it for disease. When the level is two, the pig lacks activity: a prompt is generated and a manager is notified to check the specific cause. When the level is one, the pig feeds and moves normally and is in good condition.
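The Table 3 mapping from (image-based health grade, voice-recognition result) to warning level can be captured in a few lines; the string labels are illustrative names, not identifiers from the patent:

```python
def warning_level(health_grade, voice_result):
    """Combine the image-based health grade with the voice-recognition
    result into an early warning level, following Table 3."""
    if health_grade == "disease" and voice_result == "cough":
        return 3   # most severe: possible illness, alert a veterinarian
    if health_grade == "sub-health":
        return 2   # reduced activity: notify a manager to check the cause
    return 1       # healthy: normal feeding and activity

lvl = warning_level("disease", "cough")
```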
Preferably, the method further comprises: performing video framing on the live pig group videos, judging from the inter-frame image differences whether fighting behavior exists between pigs, and if so, generating fighting early-warning information to remind the relevant workers to intervene. As a specific embodiment, the embodiment of the invention effectively warns of fighting behavior among live pigs so that it can be stopped in time.
Preferably, the method further comprises: acquiring the weight of feed in the trough corresponding to the live pig; associating the identity information with the feed weight, comparing the feed weight with a preset standard feed amount to determine the feed difference, and, if the difference exceeds a first preset value, generating alarm information and sending it to the relevant workers so that the corresponding pig can be checked in time. As a specific example, the embodiment of the present invention further assesses the health status of the live pig by monitoring its feeding.
Preferably, the method further comprises: acquiring environmental factors, processing factors, circulation factors and operation images formed by each node in the production process of the live pigs; comparing the environmental factors, the processing factors and the circulation factors with a prestored parameter database, and generating first early warning information according to a comparison result and sending the first early warning information to a corresponding node so as to remind related workers to regulate and control the production process; and matching and comparing the operation image with a prestored operation standard image library, and generating second early warning information according to a comparison result and sending the second early warning information to the corresponding node so as to remind related workers to adjust the operation standard. As a specific embodiment, the embodiment of the invention monitors the parameters of a plurality of nodes for pig production, and gives early warning in time for regulation and control.
Preferably, each node comprises a breeding node, a slaughtering node, a processing node, a storage node and a circulation node, wherein the breeding node sequentially comprises a breeding sub-node, a feeding sub-node, an environment sub-node, a disease sub-node and a finished pig slaughtering sub-node, the slaughtering node sequentially comprises a live pig acceptance sub-node, a hot water showering sub-node, a slitting sub-node, a precooling sub-node and a pork packaging sub-node, the processing node sequentially comprises a raw material receiving sub-node, a pickling sub-node, a chopping sub-node, a cooking sterilization sub-node and a finished product packaging sub-node, the storage node comprises a warehousing sub-node, a storage sub-node and a ex-warehouse sub-node, and the circulation node comprises a goods feeding handover sub-node, a transportation sub-node and a goods outgoing handover sub-node. As a specific embodiment, in the embodiment of the present invention, each node is provided with a corresponding child node, so as to achieve comprehensive monitoring of the entire production chain.
Preferably, the method further comprises: acquiring a breeding node parameter, a slaughter node parameter, a processing node parameter, a storage node parameter and a circulation node parameter; carrying out data filtering on the breeding node parameters, the slaughter node parameters, the processing node parameters, the storage node parameters and the circulation node parameters, and determining filtered parameters to be uploaded; the processing time delay of each parameter to be uploaded is sorted in an ascending order to form a first sequence, and each newly added parameter to be uploaded of each edge node is placed at the tail end of the first sequence; adjusting the first sequence according to the transmission delay of each edge node, and determining the uploading sequence of a plurality of parameters to be uploaded according to the adjusted first sequence; and sequentially carrying out data processing on the uploaded parameters to be uploaded, and visualizing the data processing result. As a specific embodiment, the embodiment of the present invention implements real-time and efficient data uploading by performing big data processing on each node, ensures reasonable allocation of network resources, achieves the purpose of effectively monitoring each production link, and visualizes the data processing result, thereby facilitating information management and operation.
In a specific embodiment of the invention, the Internet-of-Things sensors transmit in parallel, and the bandwidth is adjusted by a goal-programming method to transmit the tasks to the edge server. When tasks are transmitted from the edge server to the cloud server, the tasks on the edge server are first queued in ascending order of processing delay, with each new task appended to the end of the queue. The transmission delay and queuing delay of each task are then calculated and summed, and tasks are transmitted in descending order of this sum, the task with the largest sum first. Whenever a new task joins the queue, the transmission and queuing delays of all current tasks are recalculated and the descending order refreshed, so that the task with the largest total delay is always transmitted first, reducing queuing delay.
The specific task transmission method comprises the following steps:
method for transmitting task to edge server by sensor of Internet of things
(1) Calculating transmission time delay
According to Shannon's theorem
C = B·log₂(1 + S/N)
Wherein, C is the maximum speed supported by the channel or the capacity of the channel, B is the bandwidth of the channel, S is the average signal power, and N is the average noise power; S/N is the signal to noise ratio.
The transmission delay can be expressed as
d_i^trans = D_i / (B_i·log₂(1 + S/N))
wherein D_i is the data amount of task i, B_i is the bandwidth allocated to task i, S is the average signal power (the product of the transmission power used by the mobile terminal hosting task i when sending it to the edge server and the gain of the transmission channel), and N is the average noise power in the channel.
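The Shannon capacity and the resulting per-task transmission delay are simple to compute; the bandwidth, SNR, and task size below are illustrative numbers:

```python
import math

def channel_capacity(bandwidth_hz, snr):
    """Shannon's theorem: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr)

def transmission_delay(data_bits, bandwidth_hz, snr):
    """Per-task delay d_i = D_i / C_i."""
    return data_bits / channel_capacity(bandwidth_hz, snr)

c = channel_capacity(1e6, 3)          # 1 MHz channel, SNR = 3 -> 2 Mbit/s
d = transmission_delay(4e6, 1e6, 3)   # 4 Mbit task over that channel
```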
(2) Bandwidth is then allocated with a goal-programming model:
The objective is:
min Σ_i D_i / (B_i·log₂(1 + S/N))
s.t. C1: Σ_i B_i ≤ B
C2: B_i = B_op
wherein B_op is the optimal transmission bandwidth and B is the total bandwidth of the wireless communication link used for transmitting data.
In this way, the task is transmitted to the edge server.
Method for transmitting task from edge server to cloud server
(1) Calculate the processing delay d_proc of each task:
d_proc = D_i / f_i^e
wherein D_i is the data amount of each task and f_i^e is the computing capacity of the edge server; their quotient is the processing delay d_proc of each task.
(2) Sort the tasks in ascending order of processing delay to form the queue q:
q = (D_1, D_2, …, D_i, …, D_n)
The first task in the queue is uploaded first, and new tasks are appended to its end.
Define the set "before" as the set of all tasks ahead of task i, with M the number of tasks in that set.
If no task has to queue once its processing finishes, transmission simply follows the ascending processing-delay order: ideally, the previous task finishes transmitting just as the next finishes processing. When tasks do queue, they are transmitted as follows.
(3) Calculate the transmission delay d_trans of each task:
d_trans = D_i / C
wherein D_i is the data amount of each task and C is the transmission rate; their quotient is the transmission delay d_trans of each task.
(4) Calculate the queuing delay d_q of each task:
d_q = Σ_{j ∈ before} D_j / C
wherein j ranges over the set "before" of all tasks ahead of task i; the sum of the transmission delays of those tasks is the queuing delay.
(5) Add the transmission delay and queuing delay of each task, sort in descending order, and upload the task with the largest sum first.
(6) When a new task joins the queue, recalculate the queuing delay of each task, re-sort in descending order of the new sum of transmission and queuing delays, and again upload the task with the largest sum first.
By computing the sum of each task's transmission and queuing delays and transmitting the task with the largest sum first, the queuing delay is reduced and tasks are transmitted quickly.
The task is transmitted to the cloud server through the method.
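The two-stage ordering described above can be simulated in a short sketch (the task tuples, capacity value, and helper name are illustrative assumptions): tasks are first queued in ascending processing-delay order, then, whenever queuing occurs, the task with the largest transmission-plus-queuing delay is uploaded first.

```python
def schedule_uploads(tasks, capacity):
    """Each task is (name, data_amount, compute_power).
    Stage 1: ascending processing delay D_i / f_i.
    Stage 2: repeatedly upload the task whose transmission delay plus
    queuing delay (sum of transmission delays of tasks ahead of it)
    is largest."""
    queue = sorted(tasks, key=lambda t: t[1] / t[2])
    order = []
    while queue:
        totals = []
        for idx, (name, d, f) in enumerate(queue):
            trans = d / capacity
            queuing = sum(t[1] / capacity for t in queue[:idx])
            totals.append((trans + queuing, name))
        _, pick = max(totals)                 # largest total delay goes first
        order.append(pick)
        queue = [t for t in queue if t[0] != pick]
    return order

tasks = [("a", 10.0, 1.0), ("b", 4.0, 2.0), ("c", 6.0, 3.0)]
order = schedule_uploads(tasks, capacity=2.0)
```

Here "a" has the largest transmission delay and the longest wait behind "b" and "c", so it is pulled to the front exactly as steps (5)–(6) prescribe.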
Preferably, the live pig whole-chain quality information intelligent detection method based on the Internet of Things further includes: comparing the breeding node parameters, storage node parameters, processing node parameters and circulation node parameters with the corresponding pre-stored parameter index libraries, and warning the corresponding nodes according to the comparison results. Early warning is thereby carried out through effective data comparison.
Preferably, the live pig whole chain quality information intelligent detection method based on the internet of things further includes: and converting the production information of the produced pigs into corresponding RFID tags. Therefore, direct information tracing of the consumer is facilitated through the arrangement of the RFID tag.
In a specific embodiment of the invention, 24 camera channels are set up for monitoring live pig health. The resources required for full video monitoring of the breeding process with these 24 channels include the uplink bandwidth for the cameras to transmit video data to the edge node, the downlink bandwidth for the edge node to receive it, and the storage space for the cameras and the edge node to hold one month of video locally, with that storage cleared once a month.
The method for uploading the video images of the 24-channel camera by using the edge calculation comprises the following steps:
the first step is as follows: detect moving objects in the operation video stream — analyze the video data, judge whether a moving object is present, extract the video segments that contain one, and pass only those segments to the next processing stage;
The video clips containing a moving object are extracted with the three-frame difference method, described by the following formulas:
d₁(x, y) = |f_k(x, y) − f_{k−1}(x, y)|
d₂(x, y) = |f_{k+1}(x, y) − f_k(x, y)|
g₁(x, y) = 1 if d₁(x, y) > T, else 0
g₂(x, y) = 1 if d₂(x, y) > T, else 0
G(x, y) = g₁(x, y) ∧ g₂(x, y)
wherein G(x, y) is the logical AND of g₁(x, y) and g₂(x, y); when three consecutive frames all change, a moving object is present and the clip is extracted. Before recognizing operation-behavior specifications it is therefore only meaningful to proceed when a moving target, i.e., an operator, is detected: the frame-difference method extracts the segments with motion, discards the redundant segments, and passes the rest to the next step.
The second step: preprocess with a frame filtering model, which admits a frame group for upload only when the network can carry it in time:
max O_{i,s}  s.t.  D_up ≤ D_max,  t_d ≤ t_dm,  T_e ≤ T_c
wherein O_{i,s} is the number of identified objects in the i-th frame of video stream S, D_up is the video frame data volume uploaded by the ECN, D_max is the maximum data volume the network allows per unit time, t_d is the completion time of the task and t_dm the maximum processing time allowed for it, T_e is the total delay of sending S to the cloud computing center, and T_c is the delay of direct transmission to the cloud platform. When both the transmission capacity and the total delay of the current network meet these conditions, the ECN Controller allocates an upload channel and starts task scheduling. When multiple cameras collect data, processing every video stream takes too long and wastes resources; for duplicated content — the same scene captured by different cameras — frame filtering selects the stream with the most identified objects for recognition and analysis, reducing data-processing time and eliminating video-data redundancy;
the third step: extracting the key frame, wherein the specific flow is as follows:
The joint histogram of two equal-size images I_i and I_j records the frequency of the gray-value combinations of the pixel pairs at corresponding locations. For two M×N images I_i(x, y) and I_j(x, y), the joint probability of the pixel-value pair (p, q) is:
F(p, q) = (1/(M·N)) Σ_{x=1}^{M} Σ_{y=1}^{N} T(x, y)
wherein
T(x, y) = 1 if I_i(x, y) = p and I_j(x, y) = q, else 0
Evaluating F(p, q) over all possible pixel-value pairs (p, q) yields the joint histogram of I_i(x, y) and I_j(x, y). Its symmetry is defined as
δ = Σ_{p=q} α·F(p, q) / (Σ_{p=q} α·F(p, q) + Σ_{p≠q} β·F(p, q))
wherein α is the weight on the diagonal of the joint histogram (a smaller-than-normal amount) and β = |p − q|ⁿ is the weight of elements far from the diagonal, with n an integer. δ directly reflects the similarity between two frames: as δ approaches 1, the joint histogram becomes more symmetric, i.e., the two images are more similar. When target content appears rapidly or brightness changes markedly, the inter-frame similarity changes accordingly; in general the similarity between adjacent frames satisfies δ ∈ [0, 1]. To avoid missing key frames, the threshold T′ is set to 0.9.
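A sketch of the joint histogram and one possible reading of the δ measure (diagonal mass weighted by α against off-diagonal mass weighted by |p − q|ⁿ); the exact weighting in the patent's figure is not recoverable, so this formulation is an assumption:

```python
import numpy as np

def joint_histogram(img_a, img_b, levels=256):
    """F(p, q): normalized count of pixel-value pairs at the same positions
    in two equal-size images."""
    h = np.zeros((levels, levels))
    for p, q in zip(img_a.ravel(), img_b.ravel()):
        h[p, q] += 1
    return h / img_a.size

def symmetry(h, alpha=1.0, n=1):
    """delta in [0, 1]: 1 when all mass sits on the diagonal (identical images)."""
    p_idx, q_idx = np.indices(h.shape)
    diag = alpha * h[p_idx == q_idx].sum()
    off = (np.abs(p_idx - q_idx) ** n * h)[p_idx != q_idx].sum()
    return diag / (diag + off)

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
d_same = symmetry(joint_histogram(img, img, levels=16))            # identical -> 1.0
d_diff = symmetry(joint_histogram(img, np.flipud(img), levels=16))  # dissimilar -> lower
```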
Given the continuity of surveillance video, the feature values of consecutive frames in a continuously changing sequence vary gradually, i.e., adjacent frames carry nearly the same image information. To reduce data redundancy, the frame with the largest image-information entropy among closely spaced candidates is selected as the key frame. The image-information entropy is:
H = −Σ_{i=0}^{N−1} p(x_i)·log₂ p(x_i)
wherein N denotes the number of gray levels of the image, x_i denotes the gray value of pixel (x, y), and p(x_i) is the probability of each gray level. To prevent key-frame redundancy caused by illumination changes and the like, the frame with the largest information entropy among adjacent, closely spaced candidate frames is chosen as the key frame. Entropy differences between non-adjacent frames 20 frames apart can be clearly distinguished, so key frames are extracted on the basis of the joint histogram, and when candidate key frames lie fewer than 20 frames apart, the one with the largest information entropy is kept.
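The entropy-based key-frame choice can be sketched as follows; the toy 8×8 frames are illustrative:

```python
import numpy as np

def image_entropy(img, levels=256):
    """H = -sum p(x_i) * log2 p(x_i) over the gray-level histogram."""
    hist = np.bincount(img.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]                         # 0 * log(0) terms contribute nothing
    return float(-(p * np.log2(p)).sum())

def pick_key_frame(frames):
    """Among nearby candidate frames, keep the one with the largest entropy."""
    return int(np.argmax([image_entropy(f) for f in frames]))

flat = np.zeros((8, 8), dtype=np.uint8)                # one gray level: H = 0
varied = np.arange(64, dtype=np.uint8).reshape(8, 8)   # 64 equiprobable levels: H = 6
key = pick_key_frame([flat, varied])
```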
The fourth step: and (3) scheduling the tasks, wherein the specific flow is as follows:
After the ECN Controller allocates the upload channel, the ECN feeds back the queue information of the uploaded video frames. Because the video-stream parameters transmitted by the ECN cluster are adjusted dynamically and differ across network environments, the data volume of a video frame group is denoted D_e,i to allow unified scheduling management by the ECN Controller. The ECN average number of identified targets can be expressed as:
Ō = (1/n) Σ_{i=1}^{n} O_i
wherein Ō is the ECN average number of identified targets, D_e,i is the data volume of a video frame group, B_c is the link capacity, and O_i is the number of identified targets in the i-th frame. The ECN Controller thus assigns each upload task the uniform scheduling metric:
φ = Ō·B_c / D_e,i
At time T, N ECN tasks in the cluster wait to be scheduled, and the completion time of each task after r rounds of scheduling is recorded as t_i. An ECN upload task is valid only if it is scheduled and completed before its deadline, i.e., t_i < τ_i, where τ_i is the latest completion time of the i-th ECN upload task. Task scheduling must also respect the system's limited available resources and may not exceed their threshold:
Σ_{i=1}^{N} r·D_i ≤ M_t
wherein r is the number of scheduling rounds, D_i is the i-th frame data volume, and M_t is the upper resource threshold for task scheduling.
Considering both the allocation of system resources and the requirement that ECN uploads meet their deadlines, the task-scheduling time model is:
min Σ_{i=1}^{N} t_i
s.t. t_i < τ_i,  Σ_{i=1}^{N} r·D_i ≤ M_t
wherein t_i is the scheduled completion time, r the number of scheduling rounds, D_i the i-th frame data volume, M_t the upper resource threshold for task scheduling, and τ_i the latest completion time of the i-th ECN upload task. The video streams collected by the multiple cameras therefore cannot all be uploaded together; upload resources and ordering must be allocated to achieve efficient transmission.
The fifth step: identifying operation behaviors, wherein the specific flow is as follows:
First, the position coordinates of the human joint points in each frame of the video are extracted: pose estimation is performed on each frame with the OpenPose method, giving the position coordinates of 15 joints — neck, chest, head, right shoulder, left shoulder, right hip, left hip, right elbow, left elbow, right knee, left knee, right wrist, left wrist, right ankle and left ankle — with the k-th joint denoted L_k = (x_k, y_k), k = 1, …, 15.
Then the position coordinates of each joint are normalized, and the 15 normalized coordinates form the coordinate matrix
P = [(x_1, y_1); (x_2, y_2); …; (x_15, y_15)]
wherein (x_k, y_k) are the normalized coordinates of the k-th joint;
Next, the distance-variation matrix of the human joint points between two adjacent frames is computed: from the coordinate matrices P_n and P_{n−1} of adjacent frames, the joint-position change matrix is calculated, and from it the joint distance-variation matrix D;
The video features are then generated: the video is divided evenly into 4 segments by duration; within each segment the matrices D of adjacent frame pairs are summed to give an accumulated distance-variation matrix D_i, i = 1, …, 4; each D_i is L2-normalized to D_i′; and the accumulated matrices are concatenated as the feature of the whole video: F = (D₁′, D₂′, D₃′, D₄′);
then, the videos are classified using neural networks: dividing video data into a training set and a testing set, inputting the characteristics of a training set video into a neural network for training to obtain a trained neural network classification model, and inputting the characteristics of a testing set video into the trained neural network classification model to obtain a classification result.
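The feature-construction pipeline (normalize joints, accumulate per-segment distance changes, L2-normalize, concatenate) can be sketched as below; the per-frame min–max normalization and random test data are assumptions, since the patent does not specify the normalization:

```python
import numpy as np

def normalize_joints(joints):
    """Scale one frame's 15 (x, y) joint coordinates into [0, 1]."""
    mn, mx = joints.min(axis=0), joints.max(axis=0)
    return (joints - mn) / np.where(mx - mn == 0, 1, mx - mn)

def video_features(frames, n_segments=4):
    """Accumulate joint distance changes between adjacent frames within each
    of 4 video segments and concatenate, mirroring F = (D1', ..., D4')."""
    frames = np.stack([normalize_joints(f) for f in frames])
    diffs = np.linalg.norm(np.diff(frames, axis=0), axis=2)  # per-joint move per step
    feats = []
    for seg in np.array_split(diffs, n_segments, axis=0):
        d = seg.sum(axis=0)                                  # accumulated D_i
        norm = np.linalg.norm(d)
        feats.append(d / norm if norm > 0 else d)            # L2-normalized D_i'
    return np.concatenate(feats)

rng = np.random.default_rng(1)
frames = rng.random((9, 15, 2))   # 9 frames x 15 joints x (x, y)
F = video_features(frames)        # 4 segments x 15 joints = 60-dim feature
```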
Preferably, for moving-target detection, when the camera acquires video data an algorithm judges whether a moving target is present, and only the video data containing one is uploaded; since not all video data need be uploaded, the data transmission volume and time are reduced and bandwidth is saved. In the target-detection stage the inter-frame difference method is used, so whether a moving target exists is judged from inter-frame image differences, the video clips containing motion are detected and extracted, and useless clips are left unprocessed.
Example 2
The embodiment of the invention provides a multisource information fusion live pig full-chain intelligent monitoring system, and by combining with fig. 2, fig. 2 is a schematic flow diagram of the multisource information fusion live pig full-chain intelligent monitoring system provided by the invention, the multisource information fusion live pig full-chain intelligent monitoring system comprises a plurality of monitoring devices and information tracing devices 6, the monitoring devices comprise breeding node monitoring devices 1, slaughter node monitoring devices 2, processing node monitoring devices 3, storage node monitoring devices 4 and circulation node monitoring devices 5 and are used for monitoring different node parameters, the node parameters comprise breeding node parameters, slaughter node parameters, processing node parameters, storage node parameters and circulation node parameters, and the internet-of-things-based live pig production full-chain information intelligent detection system specifically comprises:
the breeding node monitoring equipment 1 is used for monitoring a plurality of breeding node parameters under the breeding nodes and transmitting the parameters to the information tracing equipment so as to feed back the environment quality of the breeding place of the live pigs;
the slaughter node monitoring equipment 2 is used for monitoring a plurality of slaughter node parameters under the slaughter nodes and transmitting the parameters to the information tracing equipment so as to feed back whether the pork slaughtering process is standard or not;
the processing node monitoring equipment 3 is used for monitoring a plurality of processing node parameters under the processing nodes and transmitting the processing node parameters to the information tracing equipment so as to feed back whether the pork processing process is standard or not;
the storage node monitoring equipment 4 is used for monitoring a plurality of storage node parameters under the storage nodes and transmitting the storage node parameters to the information tracing equipment so as to feed back whether the pork storage process is standard or not;
the circulation node monitoring equipment 5 is used for monitoring a plurality of circulation node parameters under the circulation nodes and transmitting the circulation node parameters to the information tracing equipment so as to feed back whether the pork circulation process is standard or not;
the information tracing device 6 comprises a food safety big data platform 601 and an electronic tag device 602, wherein the food safety big data platform is used for carrying out big data processing on a cultivation node parameter, a slaughter node parameter, a processing node parameter, a storage node parameter and a circulation node parameter to form tracing information and visualizing a big data processing result; the electronic label equipment is used for converting the traceability information into a corresponding RFID label;
the food safety big data platform comprises a cloud platform, an edge node manager and a plurality of edge nodes corresponding to a plurality of monitoring devices respectively, wherein: the edge node is used for receiving the corresponding node parameters, filtering the data of the node parameters and determining the filtered parameters to be uploaded; the edge node manager is used for sequencing the processing time delay of each parameter to be uploaded in an ascending order to form a first sequence, and each edge node newly added parameter to be uploaded is placed at the tail end of the first sequence; the first sequence is adjusted according to the transmission delay of each edge node, and the uploading sequence of a plurality of parameters to be uploaded is determined according to the adjusted first sequence; and the cloud platform is used for sequentially carrying out data processing on the uploaded parameters to be uploaded and visualizing the data processing result.
In the embodiment of the invention, the states of all nodes on the live pig production chain are comprehensively monitored, the large data platform is utilized to realize the rapid processing of various monitoring data, the processing result is visually operated and displayed to related personnel, and the quality control and management of the live pig production are facilitated; in addition, the production information of the live pigs is burnt to the corresponding RFID labels through the electronic label equipment in the information tracing equipment, so that a consumer can quickly master the production information (batch number, manufacturer, production place and the like) of the live pigs through a way of scanning the RFID labels, the public opening degree and transparency of the production information of the live pigs are comprehensively ensured, the selection and supervision of the consumer are facilitated, and the safety is further enhanced; in addition, a plurality of node parameters are obtained, data redundancy is effectively avoided by using data filtering operation, meanwhile, ascending sequencing is carried out according to processing time delay, effective virtual machine resource allocation is carried out, a first sequence is reasonably planned, finally, the first sequence is adjusted according to transmission time delay, and finally the uploading sequence of each parameter to be uploaded is determined, so that efficient and rapid data processing and data uploading are guaranteed, and the uploading sequence of each node parameter is reasonably allocated.
Preferably, the breeding node parameters sequentially include breeding parameters, feeding parameters, environmental parameters, disease parameters and slaughter parameters of the adult pigs, and the breeding node monitoring equipment comprises: the breeding monitoring equipment is used for monitoring breeding parameters to feed back the breeding condition and the ear tag marking condition in the piglet breeding process, wherein the breeding parameters comprise at least one of piglet varieties, piglet body conditions and live pig ear tag marking operation images; the feeding monitoring equipment is used for monitoring feeding parameters to feed back the feeding condition and the environmental condition in the process of feeding the live pigs, wherein the feeding parameters comprise at least one of the quality of live pig feed, the quality of live pig drinking water and the feed intake of the live pigs; the environment monitoring equipment is used for monitoring environmental parameters to feed back the condition of the growing environment of the live pigs, wherein the environmental parameters comprise at least one of the temperature and the humidity of the pigsty, the air quality of the pigsty, the residual condition of solid waste of the pigsty, the residual condition of sewage of the pigsty, the residual condition of excrement of the pigsty, and the images of the live pig groups; the disease monitoring device is used for monitoring disease parameters to feed back whether the live pigs are healthy, wherein the disease parameters comprise at least one of live pig disease conditions, live pig body temperature, live pig body state, live pig cough sound and feeding operation images; the adult pig slaughtering monitoring equipment is used for monitoring adult pig slaughtering parameters so as to feed back the quality of live pig slaughtering, wherein the adult pig slaughtering parameters comprise at least one of weight of slaughtered live pigs, backfat thickness of slaughtered live pigs, appearance of slaughtered live pigs, activity of slaughtered live pigs, slaughter live pig quarantine information and slaughter operation images. As a specific embodiment, the multiple sub-nodes are reasonably arranged on the breeding nodes, achieving omnibearing monitoring.
Preferably, the slaughter node parameters include live pig acceptance parameters, hot water shower parameters, cutting parameters, precooling parameters and pork packing parameters, and the slaughter node monitoring equipment comprises: live pig acceptance monitoring equipment for monitoring the live pig acceptance parameters so as to feed back the quality of the accepted live pigs, wherein the live pig acceptance parameters comprise at least one of accepted live pig weight, accepted live pig backfat thickness, accepted live pig appearance, accepted live pig activity and acceptance operation images; hot water shower monitoring equipment for monitoring the hot water shower parameters so as to feed back the operation specification during hot water showering, wherein the hot water shower parameters comprise at least one of hot water temperature and shower operation images; cutting monitoring equipment for monitoring the cutting parameters so as to feed back the operation specification for cutting the pork, wherein the cutting parameters comprise at least one of cutting workshop temperature and humidity and cutting operation images; precooling monitoring equipment for monitoring the precooling parameters so as to feed back the operation specification for precooling the pork, wherein the precooling parameters comprise at least one of pork nutrient content, pork pollutant residue, precooling workshop environmental quality and precooling operation images; and pork packing monitoring equipment for monitoring the pork packing parameters so as to feed back the operation specification for packing the pork, wherein the pork packing parameters comprise at least one of pork metal residue, pork foreign matter residue and packing operation images. As a specific embodiment, a plurality of sub-nodes are reasonably arranged on the slaughter node, achieving omnibearing monitoring.
Preferably, the processing node parameters sequentially include raw material receiving parameters, pickling parameters, chopping parameters, cooking sterilization parameters and finished product packaging parameters, and the processing node monitoring equipment comprises: raw material receiving monitoring equipment for monitoring the raw material receiving parameters so as to feed back the quality of the raw materials received during processing and the operation specification at receiving, wherein the raw material receiving parameters comprise at least one of received pork quality and receiving operation images; pickling monitoring equipment for monitoring the pickling parameters so as to feed back the operation specification for pickling the pork, wherein the pickling parameters comprise at least one of pickling workshop temperature and humidity and pickling operation images; chopping monitoring equipment for monitoring the chopping parameters so as to feed back the operation specification for chopping the pork, wherein the chopping parameters comprise at least one of chopping workshop temperature and humidity and chopping operation images; cooking sterilization monitoring equipment for monitoring the cooking sterilization parameters so as to feed back the operation specification when the pork is cooked and sterilized, wherein the cooking sterilization parameters comprise at least one of cooking temperature and cooking operation images; and finished product packaging monitoring equipment for monitoring the finished product packaging parameters so as to feed back the packaging specification of the finished pork, wherein the finished product packaging parameters comprise at least one of finished pork nutrient content, finished pork pollutant residue, packaging workshop environmental quality and finished product packaging operation images. As a specific embodiment, a plurality of sub-nodes are reasonably arranged on the processing node, achieving omnibearing monitoring.
Preferably, the storage node parameters include warehousing parameters, storage parameters and ex-warehouse parameters, and the storage node monitoring equipment comprises: warehousing monitoring equipment for monitoring the warehousing parameters so as to feed back the warehousing operation specification, wherein the warehousing parameters comprise at least one of stored pork quality, storage environment quality and warehousing operation images; storage monitoring equipment for monitoring the storage parameters so as to feed back the quality of the storage environment, wherein the storage parameters comprise at least one of warehouse temperature and humidity and warehouse air quality; and ex-warehouse monitoring equipment for monitoring the ex-warehouse parameters so as to feed back the ex-warehouse operation specification, wherein the ex-warehouse parameters comprise at least one of ex-warehouse pork quality and ex-warehouse operation images. As a specific embodiment, a plurality of sub-nodes are reasonably arranged on the storage node, achieving omnibearing monitoring.
Preferably, the circulation node parameters include incoming delivery parameters, transportation parameters and outgoing delivery parameters, and the circulation node monitoring equipment comprises: incoming delivery monitoring equipment for monitoring the incoming delivery handover parameters so as to feed back the incoming handover operation specification, wherein the incoming delivery handover parameters comprise at least one of handed-over pork quality and handover operation images; transportation monitoring equipment for monitoring the transportation parameters so as to feed back the transportation operation specification, wherein the transportation parameters comprise at least one of transport vehicle storage space temperature and humidity, transportation time, transportation track, transport personnel information and transportation images; and outgoing delivery monitoring equipment for monitoring the outgoing delivery handover parameters so as to feed back the outgoing handover operation specification, wherein the outgoing delivery handover parameters comprise at least one of shipped pork quality and shipment operation images. As a specific embodiment, a plurality of sub-nodes are reasonably arranged on the circulation node, achieving omnibearing monitoring.
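The five nodes and their sub-nodes enumerated in the paragraphs above can be summarised as a simple lookup structure; the identifier names below are illustrative translations, not terms from the patent.

```python
# Hypothetical mapping of each production-chain node to its sub-nodes,
# following the enumeration in the description above.
CHAIN = {
    "breeding":    ["seed_selection", "feeding", "environment",
                    "disease", "adult_pig_slaughter"],
    "slaughter":   ["live_pig_acceptance", "hot_water_shower",
                    "cutting", "precooling", "pork_packing"],
    "processing":  ["raw_material_receiving", "pickling", "chopping",
                    "cooking_sterilization", "finished_packaging"],
    "storage":     ["warehousing", "storing", "ex_warehouse"],
    "circulation": ["incoming_handover", "transportation", "outgoing_handover"],
}

def sub_nodes(node: str) -> list:
    """Return the monitored sub-nodes of a production-chain node."""
    return CHAIN.get(node, [])
```

A monitoring device can then be registered per sub-node, so each of the five nodes receives the "omnibearing" coverage the embodiments describe.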
Example 3
The embodiment of the invention provides a multi-source information fusion live pig full-chain intelligent monitoring device, which comprises a processor and a memory, wherein a computer program is stored in the memory; when the computer program is executed by the processor, the multi-source information fusion live pig full-chain intelligent monitoring method described above is realized.
The invention discloses a multi-source information fusion live pig full-chain intelligent monitoring method and device. First, the identity information corresponding to each live pig is determined through its ear tag, so that different live pigs can be distinguished. Then, for each live pig, the corresponding live pig appearance image is monitored in real time to identify the external health of the pig, its cough sounds are collected for sound feature extraction, and live pig group video is acquired to judge the activity state of the pigs. Further, a corresponding body type characteristic value is determined from the appearance image, feeding back the body surface health of the pig; the cough sounds are converted into a corresponding signal waveform diagram from which sound features are extracted, feeding back the vocal health of the pig; and the activity of the pig is judged from the live pig group video. Finally, the health of the live pig is judged from multiple angles according to the body type characteristic value, the sound characteristic value and the activity characteristic value, achieving efficient and accurate judgment.
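A minimal sketch of the final multi-angle judgment step: fusing the body type, sound and activity characteristic values into a health grade. The weights, thresholds and grade labels below are illustrative assumptions; the patent does not specify a fusion formula.

```python
def health_grade(body_score: float, sound_score: float, activity_score: float,
                 weights=(0.4, 0.3, 0.3)) -> str:
    """Fuse three characteristic values (assumed normalised to [0, 1],
    higher = healthier) into a coarse health grade."""
    fused = (weights[0] * body_score
             + weights[1] * sound_score
             + weights[2] * activity_score)
    if fused >= 0.8:
        return "healthy"
    if fused >= 0.5:
        return "observe"
    # A low fused score would trigger the early-warning information that
    # is associated with the pig's ear-tag identity information.
    return "alert"
```

A weighted sum is only one plausible fusion rule; a rule-based or learned classifier over the same three characteristic values would fit the claims equally well.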
According to the technical scheme, live pig information is collected from multiple aspects, which guarantees the safety of live pig production, eliminates hidden health dangers of the pigs at low cost, avoids the complexity of manual diagnosis, and provides a timely preliminary diagnosis. This in turn makes monitoring of the live pig production process efficient and accurate, facilitates timely feedback and early warning, and improves the safety of live pig production.
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are also included in the scope of the present invention.

Claims (10)

1. A multi-source information fusion live pig full-chain intelligent monitoring method is characterized by comprising the following steps:
identifying the ear tag of the live pig and determining the identity information of the live pig;
monitoring corresponding live pig appearance images, live pig cough sounds and live pig group videos in real time according to the identity information of the live pigs;
determining a corresponding body type characteristic value according to the live pig appearance image;
converting the cough sound of the live pig into a signal oscillogram, and extracting a sound characteristic value of the signal oscillogram;
determining a corresponding activity characteristic value according to the live pig group video;
determining a corresponding health grade according to the body type characteristic value, the sound characteristic value and the activity characteristic value;
generating early warning information according to the health grade, and associating the early warning information with the identity information;
and carrying out visual processing on the associated identity information and the early warning information so as to remind related workers to process the live pigs with the hidden health risks.
2. The multi-source information fusion live pig full-chain intelligent monitoring method according to claim 1, wherein the determining the corresponding body type characteristic value according to the live pig appearance image comprises:
performing edge recognition according to the live pig appearance image, determining a corresponding edge contour, and determining the posture characteristics of the live pig according to the edge contour, wherein the posture characteristics comprise the body length and the body weight of the live pig;
carrying out target identification according to the live pig exterior image, identifying an abnormal part on the surface of the live pig, and determining the skin surface characteristics of the live pig according to the abnormal part, wherein the skin surface characteristics comprise the area of the abnormal part and the severity of the abnormal part;
and determining the body type characteristic value according to the posture characteristics and the skin surface characteristics.
3. The multi-source information fusion live pig full-chain intelligent monitoring method according to claim 1, wherein the converting the live pig cough sounds into the signal waveform diagram and extracting the sound characteristic value of the signal waveform diagram comprises:
converting the live pig cough sounds into a signal waveform diagram;
extracting sound characteristic values of the signal oscillogram;
and comparing the sound characteristic value with a preset standard sound characteristic value.
4. The multi-source information fusion live pig full-chain intelligent monitoring method according to claim 1, wherein the determining the corresponding activity characteristic value according to the live pig group video comprises:
extracting the moving track of the live pig according to the identity information and the live pig group video to obtain the moving speed and the moving acceleration of the live pig;
and determining the activity characteristic value corresponding to the live pig according to the moving speed and the moving acceleration.
5. The multi-source information fusion live pig full-chain intelligent monitoring method according to claim 4, further comprising: performing video framing processing on the live pig group video, judging whether a fighting behavior exists between live pigs according to the image difference between frames, and if so, generating fighting early warning information to remind related workers to intervene.
6. The multi-source information fusion live pig full-chain intelligent monitoring method according to claim 1, further comprising:
acquiring the weight of feed in a trough corresponding to the live pig;
and associating the identity information with the feed weight, comparing the feed weight with a preset standard feed amount, determining a corresponding feed difference value, and if the feed difference value exceeds a first preset value, generating alarm information and transmitting it to relevant workers so that the corresponding live pigs can be checked in time.
7. The multi-source information fusion live pig full-chain intelligent monitoring method according to claim 1, further comprising:
acquiring environmental factors, processing factors, circulation factors and operation images formed by each node in the production process of the live pigs;
comparing the environmental factors, the processing factors and the circulation factors with a prestored parameter database, and generating first early warning information according to a comparison result and sending the first early warning information to a corresponding node so as to remind related workers to regulate and control the production process;
and matching and comparing the operation image with a prestored operation standard image library, and generating second early warning information according to a comparison result and sending the second early warning information to the corresponding node so as to remind related workers to adjust the operation standard.
8. The multi-source information fusion live pig full-chain intelligent monitoring method according to claim 7, wherein each node comprises a breeding node, a slaughtering node, a processing node, a storage node and a circulation node, wherein the breeding node sequentially comprises a seed selection sub-node, a feeding sub-node, an environment sub-node, a disease sub-node and an adult pig slaughtering sub-node; the slaughtering node sequentially comprises a live pig acceptance sub-node, a hot water showering sub-node, a splitting sub-node, a pre-cooling sub-node and a pork packaging sub-node; the processing node sequentially comprises a raw material receiving sub-node, a pickling sub-node, a chopping sub-node, a cooking and sterilizing sub-node and a finished product packaging sub-node; the storage node comprises a warehousing sub-node, a storage sub-node and a delivery sub-node; and the circulation node comprises an incoming delivery sub-node, a transportation sub-node and an outgoing delivery sub-node.
9. The multi-source information fusion live pig full-chain intelligent monitoring method according to claim 8, further comprising:
acquiring the breeding node parameters, the slaughter node parameters, the processing node parameters, the storage node parameters and the circulation node parameters;
performing data filtering on the breeding node parameters, the slaughter node parameters, the processing node parameters, the storage node parameters and the circulation node parameters, and determining the filtered parameters to be uploaded;
sorting the parameters to be uploaded in ascending order of processing delay to form a first sequence, and placing each parameter to be uploaded newly added at an edge node at the tail of the first sequence;
adjusting the first sequence according to the transmission delay of each edge node, and determining the uploading sequence of the parameters to be uploaded according to the adjusted first sequence;
and sequentially performing data processing on the uploaded parameters, and visualizing the data processing results.
10. A multi-source information fusion live pig full-chain intelligent monitoring device, comprising a processor and a memory, wherein a computer program is stored in the memory, and when the computer program is executed by the processor, the multi-source information fusion live pig full-chain intelligent monitoring method according to any one of claims 1 to 9 is realized.
CN202110486279.9A 2021-04-30 2021-04-30 Multi-source information fusion live pig full-chain intelligent monitoring method and device Pending CN113507490A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110486279.9A CN113507490A (en) 2021-04-30 2021-04-30 Multi-source information fusion live pig full-chain intelligent monitoring method and device


Publications (1)

Publication Number Publication Date
CN113507490A true CN113507490A (en) 2021-10-15

Family

ID=78008373

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110486279.9A Pending CN113507490A (en) 2021-04-30 2021-04-30 Multi-source information fusion live pig full-chain intelligent monitoring method and device

Country Status (1)

Country Link
CN (1) CN113507490A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117936087A (en) * 2023-12-25 2024-04-26 甘肃省畜牧兽医研究所 Intelligent monitoring method and system for bovine nodular skin disease
CN117936087B (en) * 2023-12-25 2024-07-12 甘肃省畜牧兽医研究所 Intelligent monitoring method and system for bovine nodular skin disease

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150282457A1 (en) * 2014-04-08 2015-10-08 Medisim, Ltd. Cattle monitoring for illness
CN108990831A (en) * 2018-06-22 2018-12-14 成都睿畜电子科技有限公司 A kind of animal health monitoring method and system
CN109602421A (en) * 2019-01-04 2019-04-12 平安科技(深圳)有限公司 Health monitor method, device and computer readable storage medium
CN110264073A (en) * 2019-06-19 2019-09-20 贵州省东骏有机农业科技有限公司 Intelligent livestock and poultry cultivation production management traceability system and platform
CN110495405A (en) * 2019-07-12 2019-11-26 中国农业大学 A kind of pig-breeding monitoring node, system and method
CN110934088A (en) * 2019-12-10 2020-03-31 河南科技学院 Healthy breeding robot for live pigs
CN111034643A (en) * 2019-12-11 2020-04-21 华北水利水电大学 Physiological characteristic detection system and method for livestock breeding
CN111294565A (en) * 2020-03-10 2020-06-16 福建农业职业技术学院 Intelligent pig raising monitoring method and management terminal
CN112164408A (en) * 2020-10-26 2021-01-01 南京农业大学 Pig coughing sound monitoring and early warning system based on deep learning


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
李奇峰 (Li Qifeng) et al.: "Progress in the Application of Information Technology in Livestock and Poultry Breeding", China Agricultural Information *


Similar Documents

Publication Publication Date Title
Li et al. Automated techniques for monitoring the behaviour and welfare of broilers and laying hens: towards the goal of precision livestock farming
Liu et al. A computer vision-based method for spatial-temporal action recognition of tail-biting behaviour in group-housed pigs
Chen et al. Detection of aggressive behaviours in pigs using a RealSence depth sensor
Oczak et al. Classification of aggressive behaviour in pigs by activity index and multilayer feed forward neural network
Chen et al. Image motion feature extraction for recognition of aggressive behaviors among group-housed pigs
CN113938503A (en) Early warning system for diseases through live pig behavior sign monitoring and construction method
CN109543679A (en) A kind of dead fish recognition methods and early warning system based on depth convolutional neural networks
CN106778784B (en) Pig individual identification and drinking behavior analysis method based on machine vision
Tscharke et al. A brief review of the application of machine vision in livestock behaviour analysis
Garcia et al. Lameness detection challenges in automated milking systems addressed with partial least squares discriminant analysis
Subedi et al. Tracking floor eggs with machine vision in cage-free hen houses
Chen et al. A kinetic energy model based on machine vision for recognition of aggressive behaviours among group-housed pigs
CN108460370B (en) Fixed poultry life information alarm device
Suwannakhun et al. Estimating pig weight with digital image processing using deep learning
CN113298537A (en) Rice full-chain quality information intelligent detection system and method based on Internet of things
CN115861721A (en) Livestock and poultry breeding spraying equipment state identification method based on image data
Van der Voort et al. Invited review: Toward a common language in data-driven mastitis detection research
CN108805736A (en) Cultivating system and state monitoring method under two dot patterns of one kind
CN114898405A (en) Portable broiler chicken abnormity monitoring system based on edge calculation
Xu et al. Research on the lying pattern of grouped pigs using unsupervised clustering and deep learning
CN113507490A (en) Multi-source information fusion live pig full-chain intelligent monitoring method and device
CN113408334B (en) Crayfish full-chain data acquisition and intelligent detection method and device
CN113989538A (en) Depth image-based chicken flock uniformity estimation method, device, system and medium
CN111652084A (en) Abnormal laying hen identification method and device
CN113507491B (en) Method and system for uploading full-chain information of clean egg production in real time

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20211015