CN111209844A - Method and device for monitoring breeding place, electronic equipment and storage medium - Google Patents


Info

Publication number
CN111209844A
CN111209844A (application CN202010004759.2A)
Authority
CN
China
Prior art keywords: monitoring, image, target, colony house, data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010004759.2A
Other languages
Chinese (zh)
Inventor
沈翀
陶兴源
刘永霞
李芳媛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Miaozhen Information Technology Co Ltd
Original Assignee
Miaozhen Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Miaozhen Information Technology Co Ltd filed Critical Miaozhen Information Technology Co Ltd
Priority to CN202010004759.2A priority Critical patent/CN111209844A/en
Publication of CN111209844A publication Critical patent/CN111209844A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02 Agriculture; Fishing; Forestry; Mining
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Agronomy & Crop Science (AREA)
  • Mining & Mineral Resources (AREA)
  • Human Computer Interaction (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Psychiatry (AREA)
  • Animal Husbandry (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Social Psychology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Signal Processing (AREA)
  • Catching Or Destruction (AREA)

Abstract

The embodiments of the present application provide a monitoring method and apparatus for a breeding place, an electronic device, and a storage medium. The method includes: acquiring a monitoring image from a monitoring video of a target colony house; performing semantic segmentation on the monitoring image to obtain an image recognition result, where the image recognition result includes data of target organisms and data of harmful organisms, the target organisms being the breeding objects of the target colony house; and determining a control scheme for the target colony house according to the data of the harmful organisms. This addresses the prior-art problem that users find it difficult to discover, in time, the safety hazards faced by breeding objects in a breeding place.

Description

Method and device for monitoring breeding place, electronic equipment and storage medium
Technical Field
The application relates to the technical field of artificial intelligence, in particular to a monitoring method and device for a breeding place, electronic equipment and a storage medium.
Background
With the development of society, living standards have improved markedly. People's demand for meat products is no longer only a matter of quantity, as in the past; certain requirements are now also placed on product quality.
For breeding objects such as pigs, chickens, ducks, cattle, and rabbits, the complexity of the breeding environment in automated, large-scale farming makes it difficult for users to discover, in time, the safety hazards faced by the breeding objects in the breeding place.
Disclosure of Invention
An object of the embodiments of the present application is to provide a monitoring method and apparatus for a breeding place, an electronic device, and a storage medium, so as to solve the prior-art problem that it is difficult for users to discover, in time, the safety hazards faced by breeding objects in the breeding place.
In a first aspect, an embodiment of the present application provides a monitoring method for a farm, where the method includes:
acquiring a monitoring image in a monitoring video of a target colony house;
performing semantic segmentation on the monitoring image to obtain an image recognition result, wherein the image recognition result comprises data of target organisms and data of harmful organisms, and the target organisms are breeding objects of the target colony house;
and determining a control scheme of the target colony house according to the data of the pests.
In this method, semantic segmentation is performed on the monitoring image acquired from the monitoring video of the target colony house, and a control scheme for the target colony house is determined according to the harmful organisms in the image recognition result. This enables real-time monitoring of the breeding environment of the target organisms and rapid, pixel-level identification of harmful organisms in the monitoring image, with higher precision. Compared with manual periodic cleaning and periodic medication, a control scheme determined from a pixel-level image recognition result targets the harmful organisms more precisely and facilitates follow-up observation and rapid treatment of pests in concealed positions. Compared with indiscriminate cleaning, it reduces labor cost and cleaning-equipment cost; compared with indiscriminate medication, it reduces the impact of drugs on the target organisms.
In an optional embodiment, after performing semantic segmentation on the monitoring image to obtain an image recognition result, the method further includes:
displaying the outline of the harmful organism and the species attribute of the harmful organism in the monitoring image according to the data of the harmful organism.
In this implementation, the contour of the harmful organism and its species attribute are displayed in the monitoring image, so the user can learn from them which harmful organisms are present in the corresponding target colony house and where they are hiding. This facilitates subsequent treatment of the harmful organisms, strengthens control of the target colony house, and improves control efficiency.
In an optional embodiment, after performing semantic segmentation on the monitoring image to obtain an image recognition result, the method further includes:
and displaying the outline of the target organism in the monitoring image according to the data of the target organism.
In this implementation, different objects can be distinguished by the contours displayed in the image: the target organisms are marked, and various harmful organisms that may appear, or targets that do not belong to the target colony house (possibly people), are detected. When a harmful organism is detected, a corresponding control scheme is determined for the user to treat the target colony house.
In an optional embodiment, the performing semantic segmentation on the monitoring image to obtain an image recognition result includes:
performing semantic segmentation on multiple frames of monitoring images in the monitoring video respectively to obtain an identification result of each image in the multiple frames of monitoring images;
and acquiring the image identification result according to the identification result of each image in the multiple frames of monitoring images.
This implementation prevents harmful organisms from being missed because the movement of the target organisms temporarily blocks them from view, so the harmful organisms in the same monitored area can be detected more comprehensively. Compared with determining a control scheme once for every single frame, it also reduces the user's verification workload and avoids unnecessary control actions.
In an optional embodiment, the acquiring a monitoring image in a monitoring video of a target colony house includes:
when a detection request is received, extracting continuous multi-frame images from the monitoring video of the target colony house as the multi-frame monitoring images;
or when a detection request is received, extracting a plurality of frames of images from the monitoring video of the target colony house as the plurality of frames of monitoring images according to a preset number of interval frames.
Through this implementation, the target colony house can be inspected in a targeted manner whenever a detection need arises (for example, during the active growth period of certain harmful organisms).
In an alternative embodiment, the data of the pests includes a name of the pest, the method further comprising:
outputting prompt information according to the name of the pest and the control scheme;
and responding to the confirmation operation of the prompt message, and generating a control treatment work order of the target colony house.
This implementation avoids, as far as possible, issuing an inappropriate control scheme due to a machine recognition error, or issuing a control scheme unsuitable for the current target colony house, and thus improves the reliability of the control process.
In alternative embodiments, the target organism is a pig, chicken, duck, cow, or rabbit.
This implementation provides a monitoring scheme suitable for a variety of breeding objects and facilitates biological control of various target organisms raised in captivity, giving it strong universality.
In a second aspect, an embodiment of the present application provides a monitoring device for a farm, the device including:
the acquisition module is used for acquiring a monitoring image in a monitoring video of the target colony house;
the image processing module is used for performing semantic segmentation on the monitoring image to obtain an image recognition result, wherein the image recognition result comprises data of target organisms and data of harmful organisms, and the target organisms are breeding objects of the target colony house;
and the determining module is used for determining a control scheme of the target colony house according to the data of the pests.
The method of the first aspect can be implemented by this apparatus: the breeding environment of the target organisms is monitored effectively using machine vision, and harmful organisms in the monitoring image are identified quickly at the pixel level. Compared with manual observation, the precision is higher, and a control scheme determined from a pixel-level image recognition result targets the harmful organisms more precisely. Harmful organisms in concealed positions can therefore be observed and treated quickly, labor and cleaning-equipment costs can be reduced, and the impact of drugs on the target organisms can be lessened.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a memory;
a processor;
the memory stores a computer program executable by the processor, the computer program, when executed by the processor, performing the method of the first aspect as set forth above.
In a fourth aspect, an embodiment provides a storage medium, on which a computer program is stored, which, when executed by a processor, performs the method of the first aspect.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a flowchart of a monitoring method for a farm according to an embodiment of the present disclosure.
Fig. 2 is a flowchart of another monitoring method for a farm according to an embodiment of the present disclosure.
Fig. 3 is a functional block diagram of a monitoring device for a farm according to an embodiment of the present disclosure.
Fig. 4 is a block diagram of an electronic device according to an embodiment of the present disclosure.
Reference numerals: 300-monitoring devices of the breeding place; 301-an obtaining module; 302-an image processing module; 303-a determination module; 400-an electronic device; 401-a memory; 402-a processor; 403-display unit.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
In existing large-scale farming, because the colony-house environment of breeding objects is complex, colony houses are currently managed by measures such as manual feeding observation, periodic cleaning, and conventional medication. Periodic cleaning, however, struggles to reveal safety hazards in time: pests in concealed positions cannot be treated promptly if cleaning does not reach them, while overly frequent large-area cleaning consumes substantial labor and equipment cost. Medication, in turn, exposes the breeding objects to drug contact, which can easily affect them and disturb their normal breeding.
In view of the above, the embodiments of the present application propose the following embodiments to improve the above-mentioned drawbacks.
Referring to fig. 1, fig. 1 is a flowchart of a monitoring method for a farm according to an embodiment of the present disclosure. As shown in FIG. 1, the method includes steps S11-S13.
S11: and acquiring a monitoring image in the monitoring video of the target colony house.
The target colony house may be a house for raising breeding objects such as pigs, cattle, chickens, ducks, or rabbits. One or more cameras can be installed at designated positions of the target colony house, such as above it or on its side walls, to monitor the colony house and obtain its monitoring video. Those skilled in the art can choose the number and installation positions of the cameras according to actual needs, as long as the colony house is monitored from all directions; the installation manner of the cameras should not be understood as limiting the application.
As an implementation manner, each target colony house may have a colony house identifier, and each colony house may be distinguished by the colony house identifier, and a position of the target colony house is determined. When the colony house identification of the target colony house is selected, the monitoring video of the target colony house can be obtained, and then the monitoring image in the monitoring video is obtained.
As another implementation manner, the monitoring video of the target colony house monitored by the camera can be obtained by selecting the device identifier of the camera, so that the monitoring image in the monitoring video is obtained, and the position of the target colony house can be determined according to the device identifier of the camera.
S12: and performing semantic segmentation on the monitored image to obtain an image recognition result, wherein the image recognition result comprises data of target organisms and data of harmful organisms, and the target organisms are breeding objects of the target colony house.
The target organism may be a breeding object such as a pig, a chicken, a duck, a cow or a rabbit which is bred in a captive manner. Therefore, the monitoring scheme suitable for various breeding objects can be provided, biological control on various target organisms needing to be captive is facilitated, and the universality is good.
The semantic segmentation is an image processing technology in the field of computer vision, and refers to pixel-level recognition of an image to distinguish the object class of each pixel in the image. Semantic segmentation focuses on the class of each pixel in the image, but does not focus on individual discrimination within the same class.
Through the above S12, since semantic segmentation classifies objects at the pixel level, target organisms and harmful organisms in the same monitoring image can be distinguished, yielding a more accurate and intuitive image recognition result than conventional image classification (which only outputs what exists in the image) or object detection (which marks each object with a rectangular box). Moreover, because the method does not need to distinguish individuals of the same kind of target organism, it avoids cluttering the result with excessive individual marks that would interfere with the user's subsequent confirmation process and reduce processing efficiency.
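The per-pixel labelling that semantic segmentation produces can be illustrated with a minimal sketch. This is a hypothetical toy example, not the patent's model: a real system would run a trained segmentation network, whereas here per-pixel class scores are given directly and each pixel simply receives its highest-scoring class. The class names are assumptions.

```python
# Hypothetical sketch of pixel-level semantic segmentation: each pixel
# is assigned the class with the highest score. In practice the scores
# would come from a trained network; here they are hard-coded.

CLASSES = ["background", "target_organism", "pest", "other"]  # assumed labels

def segment(score_maps):
    """score_maps: dict class_name -> 2-D list of per-pixel scores.
    Returns a 2-D list of class names (one label per pixel)."""
    h = len(next(iter(score_maps.values())))
    w = len(next(iter(score_maps.values()))[0])
    label_map = []
    for y in range(h):
        row = []
        for x in range(w):
            row.append(max(CLASSES, key=lambda c: score_maps[c][y][x]))
        label_map.append(row)
    return label_map

# 2x2 example: the top row scores highest as a target organism (e.g. a
# pig), the lower-left pixel as a pest (e.g. part of an ant colony).
scores = {
    "background":      [[0.1, 0.2], [0.1, 0.9]],
    "target_organism": [[0.8, 0.7], [0.2, 0.0]],
    "pest":            [[0.1, 0.1], [0.7, 0.1]],
    "other":           [[0.0, 0.0], [0.0, 0.0]],
}
labels = segment(scores)
```

The point of the sketch is only that classification happens per pixel, so a target organism and a pest occupying different regions of the same image receive different labels.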
S13: and determining a control scheme of the target colony house according to the data of the pests.
In practical applications, the control scheme is related to the actual pest type and the target colony house, and those skilled in the art can implement setting of the control scheme, for example, the control scheme can be set for multiple pests according to the size, position, internal breeding object and other attributes of the target colony house. When harmful organisms are identified by S12, one or more schemes may be determined as the control scheme of the target colony house from among the preset control schemes.
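The selection of a preset control scheme described above can be sketched as a simple lookup keyed on pest type and house attributes. All scheme texts, keys, and the fallback behaviour here are illustrative assumptions, not taken from the patent.

```python
# Hypothetical mapping from (pest type, house type) to a preset control
# scheme. Scheme texts are illustrative only.

CONTROL_SCHEMES = {
    ("ant_colony", "pigsty"): "Apply bait stations along walls; clean feed spill daily.",
    ("rat", "pigsty"): "Seal entry gaps; place traps away from feeding troughs.",
    ("cockroach", "chicken_coop"): "Targeted gel bait in crevices; avoid open spraying.",
}

def determine_scheme(pest_name, house_type):
    """Return the preset scheme for this pest/house pair, or escalate
    to manual inspection when no preset scheme exists."""
    scheme = CONTROL_SCHEMES.get((pest_name, house_type))
    return scheme or "Escalate to manual inspection."

plan = determine_scheme("ant_colony", "pigsty")
```

In a real deployment the table would also account for house size, position, and the breeding objects inside, as the text notes.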
In order to better identify the harmful organisms in the monitored image, an image identification model can be trained in advance, and the trained image identification model is used for identifying and detecting the monitored image in the monitored video, so that the target organisms, the harmful organisms and other objects in the monitored image can be distinguished, wherein the other objects refer to objects which are not the target organisms but are harmless to the target organisms.
In one example, a monitoring image is acquired from a monitoring video of a pigsty, and semantic segmentation is performed on it by a pre-trained image recognition model. The resulting image recognition result includes live-pig data, breeder data, ant-colony data, and pigsty-background data: the live pig is the target organism, the ant colony is the harmful organism, and the breeder and the pigsty background are other objects. When ant-colony data appears in the image recognition result, the control scheme of the pigsty is determined in combination with the proportion of the whole pigsty that the ant colony occupies in the image.
In other examples, the identified pests may also be rats, cockroach groups, centipedes, and the like that are not readily observable by the user.
In the method, the semantic segmentation is carried out on the obtained monitoring images in the monitoring video of the target colony house, so that the control scheme of the target colony house is determined according to the pests in the image recognition result. The system not only can realize real-time monitoring on the breeding environment of the target organisms, but also can quickly identify the harmful organisms in the monitored images from the pixel level, and has higher precision compared with an artificial observation mode. Compared with the mode of manual regular cleaning and regular medication, the control scheme determined according to the pixel-level image recognition result has stronger pertinence to the pests, is beneficial to carrying out subsequent observation and rapid treatment on some pests with hidden positions, can reduce the labor cost and the cost of cleaning equipment compared with the undifferentiated blind cleaning mode, and can reduce the influence of the drugs on target organisms compared with the blind medication mode.
Optionally, after the step S12, the method may further include a step S121.
S121: according to the data of the harmful organisms, the outline of the harmful organisms and the species attributes of the harmful organisms are displayed in the monitoring image.
Each species attribute corresponds to one kind of organism; displaying the species attributes in the monitoring image lets the user know how many kinds of organisms are present in the monitored area.
As one implementation, the profile of one or more pests may be displayed in the same monitored image. For example, the outlines of two ant colonies may be displayed in one monitor image, and the outlines of one ant colony and two mice may be displayed in one monitor image.
For different pests in the same image, the pest may be marked in different contour marking modes, for example, for rats, one color may be used for contour marking, and for ant colonies, another color may be used for contour marking.
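The per-class contour marking described above can be sketched from the segmentation masks themselves. This is a hypothetical illustration under the assumption that each pest class has a binary mask: a mask pixel is a contour pixel if any of its 4-neighbours falls outside the mask. The colour table is likewise an assumption.

```python
# Hypothetical contour extraction from a per-class binary mask, plus a
# colour per pest class for display. Colour values are illustrative.

CONTOUR_COLORS = {"rat": (255, 0, 0), "ant_colony": (0, 255, 0)}  # assumed RGB

def contour_pixels(mask):
    """mask: 2-D list of 0/1. A mask pixel lies on the contour if any
    of its 4-neighbours is outside the mask (or outside the image)."""
    h, w = len(mask), len(mask[0])
    contour = []
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if ny < 0 or ny >= h or nx < 0 or nx >= w or not mask[ny][nx]:
                    contour.append((y, x))
                    break
    return contour

# 3x3 fully-filled mask: the centre pixel is interior, the 8 border
# pixels form the contour.
mask = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
edge = contour_pixels(mask)
```

A display step would then draw each class's contour pixels in that class's colour, matching the marking manners the text lists.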
In this implementation, the contours of the harmful organisms and their species attributes are displayed in the monitoring image, so the user can learn which harmful organisms are present in the corresponding target colony house and where they are hiding. This facilitates subsequent treatment of the harmful organisms, strengthens control of the target colony house, and improves control efficiency.
Optionally, after the step S12, the method may further include a step S122.
S122: and displaying the outline of the target living beings in the monitoring image according to the data of the target living beings.
For example, in the case of a pig farm, the outline of a target organism such as a live pig can be displayed in the monitoring image. For the breeding place of the chicken coop, the outline of the target organism of the chicken can be displayed in the monitoring image.
Alternatively, in addition to displaying the outline of the target living being in the monitored image, the species attribute of the target living being may be displayed in the monitored image according to the data of the target living being in the image recognition result, and some necessary items (e.g., a food tray, a fence, etc.) in the monitored image may be marked.
Alternatively, the above embodiments of S121 and S122 may be used in combination, for example, the outline of the target living being, the outline of the harmful organism, and the species attribute of the harmful organism may be displayed simultaneously in the same monitored image.
Contour marking manners include, but are not limited to: marking different objects with contours of different colors, marking the contour edges of different objects with lines of different thicknesses, and placing different labels within the contours of different objects.
In this implementation, different objects can be distinguished by the contours displayed in the image: the target organisms and the relevant necessary articles in the colony house are marked, and various harmful organisms that may appear, or targets that do not belong to the target colony house (possibly people), are detected. When a harmful organism is detected, a corresponding control scheme is determined for the user to treat the target colony house.
As an implementation manner, if the monitoring image in S11 is a multi-frame monitoring image including a current frame of monitoring image, S12 may include: performing semantic segmentation on multiple frames of monitoring images in the monitoring video respectively to obtain an identification result of each image in the multiple frames of monitoring images; and acquiring an image identification result according to the identification result of each image in the multi-frame monitoring image.
The identification result of each image in the multi-frame monitoring images can be subjected to combined processing such as duplication removal and splicing, so that the image identification result within a period of time can be obtained.
In one example, an ant colony at the lower left of the image is identified in two frames of the multi-frame monitoring images, and another ant colony in the middle of the image is identified in two other frames. Merging the four frames yields a combined result in which an ant colony appears both at the lower left and in the middle, so both ant colonies are included in the image recognition result of this round of multi-frame monitoring.
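The de-duplication and merging of per-frame results can be sketched as a set union. This is a hypothetical simplification: each frame's recognition result is reduced to a set of (pest name, region) pairs, so a pest hidden in some frames (for example, blocked by a moving pig) still appears in the combined result, while repeats collapse into one detection.

```python
# Hypothetical merge of per-frame recognition results: union with
# de-duplication over (pest_name, region) pairs.

def merge_results(per_frame):
    """per_frame: iterable of sets of (pest_name, region) tuples.
    Returns the de-duplicated union over all frames."""
    merged = set()
    for frame in per_frame:
        merged.update(frame)
    return merged

frames = [
    {("ant_colony", "lower_left")},
    {("ant_colony", "lower_left")},   # same colony seen again: duplicate
    {("ant_colony", "middle")},
    {("ant_colony", "middle")},
]
combined = merge_results(frames)
```

A real system would match regions by overlap rather than by an exact label, but the principle of combining frames before deciding on a control scheme is the same.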
This implementation avoids missing harmful organisms that are temporarily blocked by the movement of the target organisms. Through recognition over multiple frames of monitoring images, harmful organisms in the same monitored area are detected more comprehensively; and compared with determining a control scheme once per frame, determining one scheme from the multiple frames reduces the user's verification workload and the number of unnecessary control actions.
As an implementation manner of the S11, the S11 may include: when a detection request is received, extracting continuous multi-frame images from a monitoring video of a target colony house as multi-frame monitoring images; or when receiving the detection request, extracting the multi-frame image as the multi-frame monitoring image from the monitoring video of the target colony according to the preset number of interval frames.
The detection request may be initiated by a user, or may be triggered by an electronic device executing the method according to a preset detection mechanism.
The number of extracted images can be arbitrarily set by those skilled in the art according to actual requirements, for example, 5, 10, or 20 images may be continuously extracted as a multi-frame monitoring image for semantic segmentation, or 5, 10, or 20 images may be extracted as a plurality of monitoring images for semantic segmentation by extracting images according to the number of frame intervals of 2, 3, or 5.
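The two extraction modes above reduce to choosing frame indices: n consecutive frames, or n frames separated by a preset interval. The sketch below is a hypothetical helper; parameter names are assumptions.

```python
# Hypothetical frame-index selection for the two extraction modes:
# consecutive frames (interval=0) or frames spaced by a preset number
# of skipped frames (interval=k).

def frame_indices(start, count, interval=0):
    """Return `count` frame indices beginning at `start`, skipping
    `interval` frames between each extracted frame."""
    step = interval + 1
    return [start + i * step for i in range(count)]

consecutive = frame_indices(100, 5)            # 5 consecutive frames
spaced = frame_indices(100, 5, interval=2)     # every 3rd frame
```

Spacing the frames covers a longer stretch of video with the same segmentation cost, which helps when pests are only intermittently visible.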
Through the implementation mode, the multi-frame monitoring image of the monitoring video can be obtained when the detection request is received, so that the target colony house can be detected in a targeted manner when the detection requirement exists (for example, in the growth active period of some harmful organisms).
Optionally, considering that the control scheme determined in S13 is identified and determined by a machine, a confirmation step may be added to the method to avoid issuing an inappropriate control scheme due to a machine recognition error, or a scheme unsuitable for the current target colony house. The data of the harmful organisms may include the names of the harmful organisms, and as shown in fig. 2, the method may further include steps S14 to S15.
S14: and outputting prompt information according to the name of the pest and the control scheme.
In one example, a danger level (low, medium, or high) may be derived from the data of the harmful organisms; the danger level, the names of the harmful organisms, and the control scheme determined in S13 may then be returned as json data and displayed, together with a prompt message, on a display unit.
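Such a json payload could look like the following hypothetical sketch; the field names and level values are assumptions, since the patent only says the result is returned in a json data type.

```python
# Hypothetical JSON payload for the prompt message: danger level, pest
# name, and recommended control scheme. Field names are assumed.
import json

def build_prompt(pest_name, danger_level, scheme):
    payload = {
        "pest": pest_name,
        "danger_level": danger_level,   # "low" | "medium" | "high"
        "control_scheme": scheme,
    }
    return json.dumps(payload, ensure_ascii=False)

msg = build_prompt("ant_colony", "medium", "Apply bait stations along walls.")
```

The display unit would render this payload alongside the prompt message awaiting the user's confirmation.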
S15: and responding to the confirmation operation of the prompt information, and generating a control treatment work order of the target colony house.
After the user manually verifies the corresponding monitoring image and manually rechecks and confirms the control scheme determined in the step S13, confirmation operation can be performed on the prompt information, and after the confirmation operation on the prompt information is received, a control processing work order about the target colony house is generated according to the control scheme recommended in the step S13, so that field workers can perform field processing according to the control processing work order.
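The gate between the prompt and the work order can be sketched as follows; all field names and the status value are illustrative assumptions.

```python
# Hypothetical generation of a control treatment work order: an order
# is produced only after the user's confirmation operation.

def generate_work_order(house_id, pest_name, scheme, confirmed):
    """Return a work order dict after confirmation; an unconfirmed
    prompt produces no order."""
    if not confirmed:
        return None
    return {
        "house_id": house_id,
        "pest": pest_name,
        "task": scheme,
        "status": "pending_field_treatment",
    }

order = generate_work_order("pigsty-07", "ant_colony",
                            "Apply bait stations along walls.", confirmed=True)
```

Requiring the `confirmed` flag mirrors the manual recheck step: a machine recommendation alone never dispatches field workers.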
The above method will be described in detail below with reference to a pigpen for raising live pigs as an example.
First, each pigsty is labeled and several cameras are installed near it; the cameras are then networked so that the live pigs in every pigsty move about under camera surveillance. With cameras arranged for the different pigsties, live-pig monitoring video clips of each pigsty are obtained, multiple frames of monitoring images of each pigsty are acquired and semantically segmented, and the contours and species attributes of the live pigs and of any harmful organisms are displayed in the monitoring images.
When a detection request for a pigsty is received, multiple frames of monitoring images are obtained from the monitoring video and semantically segmented, so that live pigs and harmful organisms are distinguished.
After the category of the harmful organisms is determined from the image recognition result, a control scheme is recommended according to the hidden-danger level and the type of the harmful organisms, and prompt information is then output as an early warning.
After manually rechecking, the user confirms the prompt information, whereupon the electronic device executing the method generates a corresponding control treatment work order, and on-site biosafety control workers can treat the pigsty according to the work order.
With this method, harmful organisms can be detected promptly and accurately by machine vision, different breeding places can be inspected rapidly, and the type of harmful organism can be judged by pixel-level semantic segmentation, with a control scheme recommended accordingly. This reduces the errors caused by relying entirely on human observation, lowers labor cost and workload, improves the intelligence and accuracy of the control process, makes it convenient for users to investigate biosafety hazards in time, and reduces the potential risks introduced by workers repeatedly entering and leaving the target colony house to check for harmful organisms.
Based on the same inventive concept, please refer to fig. 3, an embodiment of the present application further provides a monitoring apparatus 300 for a farm, which is used for executing the aforementioned monitoring method for a farm.
As shown in fig. 3, the apparatus includes: an acquisition module 301, an image processing module 302, and a determination module 303.
The acquiring module 301 is configured to acquire a monitoring image in a monitoring video of a target house.
The target colony house may be a house for keeping breeding objects such as pigs, cows, chickens, ducks, or rabbits.
The image processing module 302 is configured to perform semantic segmentation on the monitored image to obtain an image recognition result, where the image recognition result includes data of a target organism and data of a pest, and the target organism is a breeding object of the target colony house.
Semantic segmentation is an image processing technique in the field of computer vision that performs pixel-level recognition of an image, distinguishing the object class of every pixel. Semantic segmentation focuses on the class of each pixel, not on discriminating individuals within the same class. Because it distinguishes objects at the pixel level, target organisms and harmful organisms in the same monitoring image can be separated, yielding a more accurate and intuitive image recognition result than traditional image classification (which only reports which objects exist in an image) or object detection (which marks each object with a rectangular box). Moreover, since individuals of the same species need not be discriminated, the interference that excessive discrimination marks would cause in the subsequent user confirmation process is avoided, which would otherwise reduce processing efficiency.
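The pixel-level result described above can be sketched as follows. In practice the per-pixel class map would come from a segmentation model (not shown here); the label values and the list-of-lists representation are illustrative assumptions.

```python
# Per-pixel class labels; the values are assumed for illustration.
BACKGROUND, TARGET, PEST = 0, 1, 2

def split_classes(class_map):
    """Given a 2D per-pixel class map (as a semantic segmentation model
    would produce), collect the pixel coordinates belonging to target
    organisms and to pests separately."""
    target_pixels, pest_pixels = [], []
    for y, row in enumerate(class_map):
        for x, label in enumerate(row):
            if label == TARGET:
                target_pixels.append((y, x))
            elif label == PEST:
                pest_pixels.append((y, x))
    return target_pixels, pest_pixels
```

This is what distinguishes the approach from whole-image classification: the same frame can simultaneously report target-organism pixels and pest pixels.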
The determining module 303 is configured to determine a control scheme for the target colony house according to the data of the pests.
In practical applications, the control scheme is related to the actual pest type and the target colony house, and the control scheme can be set by those skilled in the art.
The monitoring method of the breeding place can be realized through this device: the breeding environment of the target organisms is monitored in real time, and harmful organisms in the monitoring images are identified rapidly at the pixel level, with higher precision than manual observation. A control scheme determined from the pixel-level image recognition result is more specifically targeted at the pests, facilitates follow-up observation and rapid treatment of pests in hidden positions, reduces labor cost and the cost of cleaning equipment, and also reduces the influence of drugs on the target organisms.
Optionally, in order to better identify the harmful organisms in the monitoring image, an image recognition model may be trained in advance, and the trained image recognition model used to detect the monitoring images in the monitoring video, so as to distinguish target organisms, harmful organisms, and other objects in the monitoring image. Other objects are objects that are neither target organisms nor harmful to the target organisms.
Optionally, the image processing module 302 may also be used to display the outline of the pest and the species attribute of the pest in the monitoring image according to the data of the pest.
Here, one species attribute may correspond to one kind of organism; displaying the species attributes of the organisms in the monitoring image lets the user know how many kinds of organisms exist in the monitored area.
As an embodiment, the contour of one or more pests may be displayed in the same monitored image. Different pests in the same image can be marked in different outline marking modes.
By displaying the outlines of the pests and their species attributes in the monitoring image, the user can learn which pests exist in the corresponding target colony house and where they inhabit, which facilitates subsequent treatment of the pests, strengthens the control effect on the target colony house, and improves control efficiency.
Optionally, the image processing module 302 may be further configured to display the outline of the target organism in the monitoring image according to the data of the target organism, and to display the species attribute of the target organism in the monitoring image.
Among them, the ways of outline marking include but are not limited to: marking methods such as marking different objects with outlines of different colors, marking edges of outlines of different objects with lines of different thicknesses, and marking different labels within outlines of different objects.
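Whatever marking style is chosen, the outline itself can be derived from a segmentation mask. The following is a minimal sketch (pure Python, binary mask as nested lists — an assumed representation): a pixel belongs to the outline if it is inside the mask and has a 4-connected neighbour outside it or lies on the image border.

```python
def mask_outline(mask):
    """Return the outline pixels of a binary mask: mask pixels having at
    least one 4-connected neighbour outside the mask (or off the image)."""
    h, w = len(mask), len(mask[0])
    outline = set()
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if not (0 <= ny < h and 0 <= nx < w) or not mask[ny][nx]:
                    outline.add((y, x))
                    break
    return outline
```

The returned outline pixels can then be drawn in different colors, line thicknesses, or with labels, per the marking modes listed above.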
Optionally, the image processing module 302 may be further configured to, when the monitored image is a multi-frame monitored image including a current frame of monitored image, perform semantic segmentation on the multi-frame monitored image in the monitored video, respectively, to obtain an identification result of each image in the multi-frame monitored image; and acquiring an image identification result according to the identification result of each image in the multi-frame monitoring image.
Optionally, the image processing module 302 may be further configured to perform de-duplication and stitching on the recognition result of each image in the multiple frames of monitoring images to obtain an image recognition result within a period of time.
In this way, pests that are occluded in some frames by the movement of the target organisms are not missed, and the target colony house can be inspected more comprehensively.
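The de-duplicate-and-stitch step can be sketched as follows. The representation of a per-frame recognition result as (species name, region id) tuples is an assumption made for illustration; the publication does not fix a data format.

```python
def merge_frame_results(frame_results):
    """De-duplicate and stitch per-frame recognition results into one
    image recognition result covering the whole period.  Each frame result
    is assumed to be a list of (species_name, region_id) detections; a
    detection seen in several frames is kept once, so a pest occluded in
    some frames still appears in the merged result."""
    merged, seen = [], set()
    for detections in frame_results:
        for detection in detections:
            if detection not in seen:
                seen.add(detection)
                merged.append(detection)
    return merged
```

Merging over multiple frames is what gives the method its robustness to occlusion by moving target organisms.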
Optionally, the obtaining module 301 may be further configured to extract consecutive multiple frames of images from the monitoring video of the target colony house as the multiple frames of monitoring images when a detection request is received; or, when a detection request is received, to extract multiple frames of images from the monitoring video of the target colony house as the multiple frames of monitoring images according to a preset number of interval frames.
Optionally, the determination module 303 may also be configured to determine a risk level based on the data of the pest.
Optionally, the determining module 303 may be further configured to output prompt information according to the name of the pest and the control scheme, and to generate a control treatment work order for the target colony house in response to a confirmation operation on the prompt information.
For further details of the monitoring device 300 for a farm, please refer to the above description related to the monitoring method for a farm, which is not repeated herein.
Based on the same inventive concept, as shown in fig. 4, an embodiment of the present application further provides an electronic device 400, where the electronic device 400 may be used to perform the foregoing monitoring method for a farm. The electronic device 400 may be a computer, a server, or the like having an arithmetic processing capability.
As shown in fig. 4, the electronic device 400 includes: memory 401, processor 402, display unit 403. The memory 401, the processor 402 and the display unit 403 are directly or indirectly connected through a communication bus to realize data interaction.
The memory 401 is a storage medium, and may be, but is not limited to, a medium capable of storing a computer program, such as a random-access memory (RAM), a read-only memory (ROM), or an electrically erasable programmable read-only memory (EEPROM). The memory 401 stores a computer program executable by the processor 402, and the computer program, when executed by the processor 402, performs the method described above.
The processor 402 has arithmetic processing capability, and may be, but is not limited to, a central processing unit (CPU), a digital signal processor, an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, or discrete components. The processor 402 may execute the computer program stored in the memory 401, so as to perform the aforementioned monitoring method for the breeding place.
The display unit 403 is used to display text and images for reference of a user and provide an interactive interface for the user, and the display unit 403 may be a display component such as a liquid crystal display or a touch display. The display content can be monitoring video, monitoring image, prompt information and the like.
It will be appreciated that the configuration of fig. 4 is merely illustrative and that in practice, the electronic device 400 may have further components, for example, a speaker, for providing a voice prompt when a pest is detected.
In addition to the above embodiments, the present application further provides a storage medium on which a computer program is stored, and the computer program, when executed by the processor 402, performs the foregoing method. The storage medium includes, but is not limited to: a USB flash drive, a removable hard disk, the memory 401, a magnetic disk, and various other media that can store program code.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described apparatus embodiments are merely illustrative: for example, the division into units is only a logical division, and other divisions are possible in actual implementation. In other respects, the connections discussed may be indirect couplings or communication connections of apparatuses or units through communication interfaces, and may be electrical, mechanical, or in other forms.
Furthermore, the functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above embodiments are merely examples of the present application and are not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A method of monitoring a farm, the method comprising:
acquiring a monitoring image in a monitoring video of a target colony house;
performing semantic segmentation on the monitoring image to obtain an image recognition result, wherein the image recognition result comprises data of target organisms and data of harmful organisms, and the target organisms are breeding objects of the target colony house;
and determining a control scheme of the target colony house according to the data of the pests.
2. The method of claim 1, wherein after the semantically segmenting the monitoring image to obtain the image recognition result, the method further comprises:
displaying the outline of the harmful organism and the species attribute of the harmful organism in the monitoring image according to the data of the harmful organism.
3. The method of claim 1, wherein after the semantically segmenting the monitoring image to obtain the image recognition result, the method further comprises:
displaying the outline of the target organism in the monitoring image according to the data of the target organism.
4. The method according to claim 1, wherein the monitoring image is a multi-frame monitoring image including a current frame of monitoring image, and the semantic segmentation is performed on the monitoring image to obtain an image recognition result, including:
performing semantic segmentation on multiple frames of monitoring images in the monitoring video respectively to obtain an identification result of each image in the multiple frames of monitoring images;
and acquiring the image identification result according to the identification result of each image in the multiple frames of monitoring images.
5. The method of claim 4, wherein the obtaining of the monitoring image in the monitoring video of the target colony house comprises:
when a detection request is received, extracting continuous multi-frame images from the monitoring video of the target colony house as the multi-frame monitoring images;
or,
when a detection request is received, extracting multiple frames of images from the monitoring video of the target colony house as the multiple frames of monitoring images according to a preset number of interval frames.
6. The method of claim 1, wherein the data of the pests includes a name of the pest, the method further comprising:
outputting prompt information according to the name of the pest and the control scheme;
and generating a control treatment work order of the target colony house in response to a confirmation operation on the prompt message.
7. The method of any one of claims 1 to 6, wherein the target organism is a pig, chicken, duck, cow or rabbit.
8. A monitoring device for a farm, the device comprising:
the acquisition module is used for acquiring a monitoring image in a monitoring video of the target colony house;
the image processing module is used for performing semantic segmentation on the monitoring image to obtain an image recognition result, wherein the image recognition result comprises data of target organisms and data of harmful organisms, and the target organisms are breeding objects of the target colony house;
and the determining module is used for determining a control scheme of the target colony house according to the data of the pests.
9. An electronic device, comprising:
a memory;
a processor;
the memory stores a computer program executable by the processor, the computer program, when executed by the processor, performing the method of any of claims 1-7.
10. A storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, performs the method of any one of claims 1-7.
CN202010004759.2A 2020-01-02 2020-01-02 Method and device for monitoring breeding place, electronic equipment and storage medium Pending CN111209844A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010004759.2A CN111209844A (en) 2020-01-02 2020-01-02 Method and device for monitoring breeding place, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN111209844A true CN111209844A (en) 2020-05-29

Family

ID=70789540


Country Status (1)

Country Link
CN (1) CN111209844A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115937784A (en) * 2022-12-27 2023-04-07 正大农业科学研究有限公司 Farm monitoring method and device, electronic equipment and storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012135281A (en) * 2010-12-27 2012-07-19 Takenaka Komuten Co Ltd Apparatus and program for preventing wildlife damage
CN103461315A (en) * 2013-09-26 2013-12-25 四川宾吾谷科技有限公司 Mouse killing device
CN108509976A (en) * 2018-02-12 2018-09-07 北京佳格天地科技有限公司 The identification device and method of animal
JP2018143215A (en) * 2017-03-09 2018-09-20 株式会社エヌ・ティ・ティ・データ Harmful animal countermeasure support system and harmful animal countermeasure support method
US20180293444A1 (en) * 2017-04-05 2018-10-11 International Business Machines Corporation Automatic pest monitoring by cognitive image recognition with two cameras on autonomous vehicles
CN108664844A (en) * 2017-03-28 2018-10-16 爱唯秀股份有限公司 The image object semantics of convolution deep neural network identify and tracking
US20190141982A1 (en) * 2017-11-16 2019-05-16 Brian Wayne Carnell Methods and systems for directing animals away from an area
CN109934045A (en) * 2017-12-15 2019-06-25 北京京东尚科信息技术有限公司 Pedestrian detection method and device
CN110235890A (en) * 2019-05-14 2019-09-17 熵康(深圳)科技有限公司 A kind of detection of harmful organism and drive method, apparatus, equipment and system
US20190380325A1 (en) * 2018-06-18 2019-12-19 International Business Machines Corporation Manage and control pests infestation using machine learning in conjunction with automated devices



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200529