KR101694946B1 - Sensing estrus of poultry method and device for thereof - Google Patents

Sensing estrus of poultry method and device for thereof Download PDF

Info

Publication number
KR101694946B1
KR101694946B1 KR1020150101651A
Authority
KR
South Korea
Prior art keywords
animal
information
configurations
posture
behavior
Prior art date
Application number
KR1020150101651A
Other languages
Korean (ko)
Inventor
이중호
이철재
Original Assignee
주식회사 씨피에스글로벌
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 씨피에스글로벌 filed Critical 주식회사 씨피에스글로벌
Priority to KR1020150101651A
Application granted granted Critical
Publication of KR101694946B1

Links

Images

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61D - VETERINARY INSTRUMENTS, IMPLEMENTS, TOOLS, OR METHODS
    • A61D 17/00 - Devices for indicating trouble during labour of animals; Methods or instruments for detecting pregnancy-related states of animals
    • A61D 17/002 - Devices for indicating trouble during labour of animals; Methods or instruments for detecting pregnancy-related states of animals for detecting period of heat of animals, i.e. for detecting oestrus
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61D - VETERINARY INSTRUMENTS, IMPLEMENTS, TOOLS, OR METHODS
    • A61D 17/00 - Devices for indicating trouble during labour of animals; Methods or instruments for detecting pregnancy-related states of animals
    • A61D 17/004 - Devices for indicating trouble during labour of animals; Methods or instruments for detecting pregnancy-related states of animals for detecting mating action

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Pregnancy & Childbirth (AREA)
  • Engineering & Computer Science (AREA)
  • Animal Husbandry (AREA)
  • Wood Science & Technology (AREA)
  • Zoology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Image Analysis (AREA)

Abstract

Disclosed are a method for detecting animal estrus and an apparatus for performing the method. More specifically, the method for detecting animal estrus comprises the steps of: identifying, by the animal estrus detection apparatus, a moving object in a photographed image of a breeding area; determining, by the animal estrus detection apparatus, whether the moving object is an animal whose mounting behavior is to be confirmed; and generating, by the animal estrus detection apparatus, object posture pattern information based on the connection relationships among a plurality of object configurations constituting the identified animal object in the photographed image, and comparing it with pre-stored mounting posture pattern information to determine whether the detected movement of the animal object is mounting behavior.

Description

TECHNICAL FIELD [0001] The present invention relates to a method of detecting an animal's estrus and an apparatus for performing the method.

The present invention relates to an animal estrus detection method and an apparatus for carrying out the method, in which an animal that has reached estrus is identified by detecting the animal's mounting behavior.

In general, animals in estrus vocalize differently or behave differently from their normal states.

For this reason, breeders observe whether specific behaviors appear, such as changes in vocalization, changes in movement, and mounting behavior, in order to determine whether an animal is in estrus. In particular, mounting is a typical behavior of animals in the estrus period and is an important criterion for judging whether an animal is in estrus.

However, it is impractical for breeders to observe mounting behavior directly, because it appears only for a short period during estrus and each mounting event lasts only a few seconds.

In particular, since most farms raise animals by large-scale grazing or in large-scale housing, it is almost impossible to detect mounting behavior in such environments.

In addition, since the time at which mounting behavior starts serves as the temporal basis for follow-up procedures such as artificial insemination, follow-up becomes difficult if the starting point of the mounting behavior is not confirmed.

In addition, if an animal's estrus is missed, it will take a certain period of time for the animal to reach its estrus again, so the person raising the animal will lose time and money during that period.

Korean Patent No. 10-1329022 (Nov.

An embodiment of the present invention relates to an animal estrus detection apparatus and method for recognizing, in an image taken by a photographing apparatus that photographs a breeding area where animals are raised, an animal object whose estrus is to be detected, and for determining whether the recognized animal object performs mounting behavior.

The method for detecting animal estrus according to an embodiment of the present invention includes the steps of: identifying, by an animal estrus detection apparatus, a moving object in a captured image; determining, by the animal estrus detection apparatus, whether the moving object is an animal whose mounting behavior is to be confirmed; and generating, by the animal estrus detection apparatus, object posture pattern information according to connection relationships among a plurality of object configurations constituting the identified animal object in the captured image, and comparing the generated object posture pattern information with pre-stored mounting posture pattern information to determine whether the object performs mounting behavior.

The step of determining whether the moving object is the animal to be confirmed may include: extracting, by the animal estrus detection apparatus, object configurations of the moving object; and comparing the extracted object configurations with pre-stored object configurations of the animal object to be detected to confirm whether two or more object configurations are identical or similar.

Alternatively, the step of determining whether the moving object is the animal to be confirmed may include: extracting, by the animal estrus detection apparatus, object configurations of the moving object; generating, by the animal estrus detection apparatus, object shape information from the extracted object configurations; and comparing the generated object shape information with pre-stored object shape information of the animal object to be detected to determine whether the moving object is the animal to be detected.

The object posture pattern information is generated by extracting at least two object configurations among the head, torso, hip, leg, tail, and skeleton constituting the recognized animal object in the captured image and connecting the extracted object configurations to one another.

The animal estrus detection apparatus may store a preceding image and a subsequent image for a preset time period based on the time point at which the mounting behavior is determined in the captured image.

The method may further include, after storing the image, transmitting, by the animal estrus detection apparatus, at least one of a mounting behavior confirmation signal and the stored image to a user terminal, wherein the mounting behavior confirmation signal may include the date and time of the mounting behavior and/or object information of the recognized animal object.

The method may further include, when mounting behavior is determined, confirming, by the animal estrus detection apparatus, whether the mounting posture of the recognized animal object is maintained for a predetermined time.

The method may further include: when mounting behavior is determined, confirming, by the animal estrus detection apparatus, the unique identification information of the recognized animal object; and updating, in association with the confirmed unique identification information, at least one of the identification information of the mounted animal object and the date and time of the mounting behavior.

The method may further include, after the updating step: predicting, by the animal estrus detection apparatus, an estrous cycle using the accumulated mounting dates and times of the animal object; and transmitting, by the animal estrus detection apparatus, the predicted estrous cycle of the animal object to the user terminal.

The method may further include, when mounting behavior is determined, adding, by the animal estrus detection apparatus, the connection relationships among the object configurations determined to represent the mounting behavior to the mounting posture pattern information and storing them.

Meanwhile, the animal estrus detection apparatus according to an embodiment of the present invention includes: an object recognition unit that identifies a moving object in a photographed image of a breeding area and determines whether the moving object is an animal whose mounting behavior is to be confirmed; and a mounting behavior determination unit that generates object posture pattern information according to connection relationships among a plurality of object configurations constituting the identified animal object, and compares the generated object posture pattern information with pre-stored mounting posture pattern information to determine whether the object performs mounting behavior.

The object recognition unit may extract object configurations of the moving object, compare the extracted object configurations with pre-stored configurations of the animal object to be detected, and confirm whether two or more object configurations are identical or similar.

The object recognition unit may also extract object configurations of the moving object, generate object shape information from the extracted object configurations, and compare the generated object shape information with pre-stored object shape information of the animal object to be detected, thereby determining whether the moving object is the animal to be detected.

The object posture pattern information may be generated by extracting at least two object configurations among the head, torso, hip, leg, tail, and skeleton constituting the recognized animal object in the captured image and connecting the extracted object configurations to one another.

The mounting behavior determination unit may store a preceding image and a subsequent image for a predetermined time period based on the time point in the captured image at which the mounting behavior is determined.

The animal estrus detection apparatus may further include a communication unit for transmitting the mounting behavior confirmation signal and the stored image to the user terminal, wherein the mounting behavior confirmation signal may include the date and time of the mounting behavior and object information of the recognized animal object.

The mounting behavior determination unit may check whether the mounting posture of the recognized animal object is maintained for a preset time.

The animal estrus detection apparatus may further include an identification information confirmation unit for confirming unique identification information of the recognized animal object and of other animal objects, and an object management unit for updating, in association with the unique identification information confirmed by the identification information confirmation unit, at least one of the identification information of the mounted animal object and the date and time of the mounting behavior when the generated object posture pattern information is determined to be similar to the pre-stored mounting posture pattern information.

The object management unit may predict an estrous cycle using the accumulated mounting dates and times of the animal object, and the animal estrus detection apparatus may further include a communication unit for transmitting the predicted estrous cycle of the animal object to the user terminal.

The mounting behavior determination unit may add the generated object posture pattern information to the mounting posture pattern information and store it when the generated object posture pattern information is determined to be similar to the pre-stored mounting posture pattern information.

According to an embodiment of the present invention, an animal object whose estrus is to be detected is recognized in an image captured by a photographing device that photographs a breeding area where animals are raised, and it is determined whether the recognized animal object performs mounting behavior, so that a user raising the animals can easily confirm whether an animal has exhibited mounting behavior.

In addition, according to the embodiment of the present invention, information such as whether an animal is in estrus, as indicated by its mounting behavior, and the date and time of the mounting behavior is determined, so that it is possible to identify accurately which animal object is in estrus.

According to the embodiment of the present invention, mounting behavior and animals in estrus are identified using only camera images, so that the burden of husbandry management can be greatly reduced.

FIG. 1 is a block diagram of an animal estrus detection system according to an embodiment of the present invention.
FIG. 2 is a block diagram showing the detailed configuration of an animal estrus detection apparatus according to an embodiment of the present invention.
FIG. 3 is a flowchart showing an animal estrus detection method using an animal estrus detection apparatus according to an embodiment of the present invention.
FIG. 4 is a view showing an image taken by a photographing apparatus according to an embodiment of the present invention.
FIG. 5 illustrates a computing environment including an exemplary computing device suitable for use in the exemplary embodiments.

Hereinafter, specific embodiments of the present invention will be described with reference to FIGS. 1 to 5. However, these are exemplary embodiments only, and the present invention is not limited thereto.

In the following description, a detailed description of known functions and configurations incorporated herein will be omitted when it may make the subject matter of the present invention rather unclear. The following terms are defined in consideration of the functions of the present invention, and may be changed according to the intention or custom of the user, the operator, and the like. Therefore, the definition should be based on the contents throughout this specification.

The technical idea of the present invention is determined by the claims, and the following embodiments are merely a means for effectively explaining the technical idea of the present invention to a person having ordinary skill in the art to which the present invention belongs.

In the following description, terms such as "transmission," "sending," and "reception" of a signal or information refer not only to the direct transfer of the signal or information from one component to another, but also to transfer through intermediate components. In particular, "transmitting" or "sending" a signal or information to an element indicates the final destination of the signal or information, not necessarily a direct destination; the same applies to the "reception" of a signal or information. Also, in this specification, the fact that two or more pieces of data or information are "related" means that when one piece of data (or information) is acquired, at least a part of the other data (or information) can be acquired based on it.

FIG. 1 is a network configuration diagram illustrating an animal estrus detection system 10 and its peripheral configuration according to an embodiment of the present invention, and FIG. 2 is a block diagram of an animal estrus detection apparatus according to an embodiment of the present invention.

Referring to FIGS. 1 and 2, an animal estrus detection system 10 according to an embodiment may include a photographing apparatus 100, an animal estrus detection apparatus 200, and a user terminal 300.

The photographing apparatus 100 and the user terminal 300 may be connected to the animal estrus detection apparatus 200 through a network such as a local area network (LAN), a wide area network (WAN), a cellular network, or the Internet so that they can communicate with one another. Examples of the photographing apparatus 100 include a camera and a CCTV camera, but are not limited thereto. Examples of the user terminal 300 include a smart phone, a mobile phone, a personal digital assistant (PDA), an MP3 player, a tablet PC, a portable multimedia player (PMP), a laptop computer, a personal computer, and various other types of wired or wireless communication devices that can be connected to the animal estrus detection apparatus 200.

The photographing apparatus 100 photographs a breeding area 50 where animals are raised. Here, the breeding area 50 may be an area for raising animals, such as an outdoor area or a barn area zoned by a fence. The photographing apparatus 100 can transmit a photographed image of the breeding area 50 to the animal estrus detection apparatus 200, and one or more photographing apparatuses 100 may be provided to photograph the breeding area 50. The photographing apparatus 100 may be, for example, a visible-light camera, a thermal camera, an infrared camera, or a stereo camera, but is not limited thereto.

The animal estrus detection apparatus 200 recognizes an animal object whose estrus is to be detected in at least one of the photographed image received from the photographing apparatus 100 and a pre-stored photographed image, and confirms whether the recognized animal object performs mounting behavior. Hereinafter, the configuration and operation of the animal estrus detection apparatus 200 will be described in detail with reference to FIG. 2.

Referring to FIG. 2, the animal estrus detection apparatus 200 includes an object identification information recognition unit 202, an object recognition unit 204, a mounting behavior determination unit 206, an object management unit 208, and an information storage unit 210.

The object identification information recognition unit 202 can identify the unique identification information of a given object in an image captured of the breeding area 50 where animals are raised. Hereinafter, the photographed image may be an image received in real time from the photographing apparatus 100 or an image previously received from the photographing apparatus 100 and stored. For example, in the breeding area 50, unique identification information for identifying the animal in each specific area may be displayed as a QR code or a barcode, or unique identification information may be displayed on an attachment (e.g., a necklace, ankle band, ear tag, etc.) mounted on a specific part of the animal. The object identification information recognition unit 202 identifies the unique identification information in the photographed image and thereby identifies the corresponding object. That is, the object identification information recognition unit 202 can identify the unique identification information of a specific object (animal) in the breeding area 50, for example, an object performing mounting behavior.

The object recognition unit 204 can recognize, in the photographed image, an animal object whose estrus is to be detected. Specifically, the object recognition unit 204 can identify a moving object from either the captured image or the stored image. The object recognition unit 204 may compare the previous frame and the current frame and determine that an object showing a difference is a moving object. Next, the object recognition unit 204 extracts only the moving object from the photographed image and makes it the target of analysis. This reduces the amount of data processing required to recognize animal objects and increases processing speed.
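The frame-comparison step described above can be pictured with a short sketch. This is a minimal illustration only, assuming OpenCV (version 4) and NumPy are available; the function name detect_moving_regions and the threshold values are hypothetical and are not taken from the patent.

# Minimal frame-differencing sketch (illustrative only; not the patented implementation).
# Assumes OpenCV 4 (cv2); detect_moving_regions and its thresholds are hypothetical.
import cv2


def detect_moving_regions(prev_frame, curr_frame, min_area=500):
    """Return bounding boxes of regions that differ between two consecutive frames."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)

    diff = cv2.absdiff(prev_gray, curr_gray)              # pixel-wise difference
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, None, iterations=2)           # close small gaps in the mask

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Keep only regions large enough to be an animal-sized moving object.
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]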

The object recognition unit 204 may extract a plurality of object configurations (e.g., a head, a torso, a hip, a leg, a tail, a skeleton, etc.) constituting the moving object. In addition, the object recognition unit 204 can generate object shape information of the object from the extracted object configurations, that is, from the distances between the extracted object configurations, the shape and size of each object configuration, and the connection relationships between the object configurations. The object recognition unit 204 can also obtain object recognition assistance information such as the color, shading, contrast, and depth of the coat of the moving object in the photographed image.

The object recognition unit 204 determines whether the moving object is an animal object whose estrus is to be detected (hereinafter referred to as a detection target animal object) using at least one of the plurality of object configurations, the object shape information, and the object recognition assistance information. Specifically, the object recognition unit 204 can determine whether the moving object is a detection target animal object by comparing the plurality of object configurations extracted from the captured image with the pre-stored object configurations of the detection target animal object. For example, if at least two of the plurality of object configurations extracted from the photographed image are identical or similar to the object configurations of the pre-stored detection target animal object, the object recognition unit 204 may determine that the moving object is a detection target animal object.

The object recognition unit 204 may also compare the object shape information generated from the object configurations extracted from the captured image with the pre-stored object shape information of the detection target animal object to determine whether the moving object is a detection target animal object. In this case, the object recognition unit 204 may make the determination according to the degree of similarity between the generated object shape information and the object shape information of the detection target animal object.

The object recognition unit 204 may additionally use the object recognition assistance information, in addition to the plurality of object configurations or the generated object shape information, when determining whether the moving object is a detection target animal. For example, information on the skeleton of the moving object can be supplemented by the shading in the object recognition assistance information, and information on the distance between moving objects can be confirmed from the brightness in the object recognition assistance information.

The mounting behavior determination unit 206 can determine whether the animal object performs mounting behavior. Specifically, if the animal object is one whose mounting behavior the user wishes to confirm, the mounting behavior determination unit 206 may track the object in the captured image. The mounting behavior determination unit 206 can also generate object posture pattern information of the recognized animal object using the plurality of object configurations extracted by the object recognition unit 204. Here, the object posture pattern information may refer to a connection pattern describing how the object configurations (e.g., head, torso, hip, leg, tail, skeleton, etc.) are connected and at what angles. In an exemplary embodiment, the object posture pattern information may be a connection pattern linking the center points of two or more object configurations to one another. The mounting behavior determination unit 206 may compare the generated object posture pattern information with pre-stored mounting posture pattern information to determine whether the corresponding object is in a mounting posture. The mounting posture pattern information includes information indicating the connection relationships among the object configurations (e.g., head, torso, hip, leg, tail, skeleton, etc.) in the mounting posture, and may include connection relationship information for the mounting posture viewed from various angles. For example, in the case of cattle, when mounting occurs, the mounting animal climbs onto the animal that allows the mounting. In this case, the mounting object on the mounted object shows a slope between the head, torso, and hip that differs from the general posture. Accordingly, when the mounting behavior determination unit 206 compares the object posture pattern information of the object with the pre-stored mounting posture pattern information, it can confirm that the object is in the mounting posture.
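One way the comparison of object posture pattern information with stored mounting posture pattern information could be realized is to encode the connection pattern of component center points as a vector of angles and normalized distances and measure its similarity to stored patterns. The sketch below is a hedged illustration under that assumption; the keypoint names, feature layout, and cosine-similarity threshold are not specified by the patent.

# Illustrative encoding of an object posture pattern from component center points.
# The keypoints, feature layout, and similarity threshold are assumptions, not the patent's method.
import math


def posture_pattern(centers):
    """centers: dict of component name -> (x, y) center point.
    Returns a feature vector of angles and normalized distances between connected components."""
    chain = ["head", "torso", "hip"]                      # connection order along the spine
    feats = []
    for a, b in zip(chain, chain[1:]):
        (x1, y1), (x2, y2) = centers[a], centers[b]
        feats.append(math.atan2(y2 - y1, x2 - x1))        # slope between the two components
        feats.append(math.hypot(x2 - x1, y2 - y1))        # distance between the two components
    scale = max(feats[1::2]) or 1.0
    feats[1::2] = [d / scale for d in feats[1::2]]        # normalize distances for scale invariance
    return feats


def is_similar(pattern, stored_pattern, threshold=0.9):
    """Cosine similarity between a generated pattern and a stored mounting posture pattern."""
    dot = sum(p * q for p, q in zip(pattern, stored_pattern))
    norm = math.sqrt(sum(p * p for p in pattern)) * math.sqrt(sum(q * q for q in stored_pattern))
    return norm > 0 and dot / norm >= threshold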

When the object posture pattern information acquired from the photographed image is identical or similar to the pre-stored mounting posture pattern information, the mounting behavior determination unit 206 may add the acquired object posture pattern information to the mounting posture pattern information so that it can be used as mounting posture pattern information thereafter. Here, "similar" may mean that the degree of similarity between the acquired object posture pattern information and the pre-stored mounting posture pattern information is equal to or greater than a predetermined threshold similarity.

Also, when it is determined that the object is in the mounting posture, the mounting behavior determination unit 206 can check whether the object maintains the mounting posture for a predetermined time or longer. For example, a cow climbs onto another cow and maintains the posture for a certain period of time when performing mounting behavior. Therefore, the mounting behavior determination unit 206 can confirm whether the object actually performs mounting behavior by checking how long the posture determined to be identical or similar to the pre-stored mounting posture pattern information is maintained.

The mounting behavior determination unit 206 may also determine whether there is a mounted object (i.e., an animal object that allows the mounting) near the mounting object (i.e., the animal object determined to be in the mounting posture). For example, in the case of cattle, when mounting occurs, the mounting object generally climbs onto the mounted object. In this case, the heads of the mounting object and the mounted object mostly face the same direction (the head of the mounted object is level with the ground), and the head of the mounting object is positioned between the head and the hip (i.e., over the body) of the mounted object. That is, the mounting behavior determination unit 206 can determine whether mounting behavior occurs by checking whether a mounted object exists within a predetermined range of the mounting object.
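As an illustration of the geometric check described above, one could verify that a candidate mounted object lies within a fixed range of the mounting object and that the two heads face roughly the same direction. The sketch below assumes simple head and hip center coordinates; the distance and angle thresholds are hypothetical, not values from the patent.

# Illustrative check for a mounted object near the mounting object (assumed thresholds).
import math


def heads_same_direction(head_a, hip_a, head_b, hip_b, max_angle_deg=45.0):
    """Both animals' hip-to-head vectors should point in roughly the same direction."""
    ax, ay = head_a[0] - hip_a[0], head_a[1] - hip_a[1]
    bx, by = head_b[0] - hip_b[0], head_b[1] - hip_b[1]
    angle = math.degrees(abs(math.atan2(ay, ax) - math.atan2(by, bx)))
    return min(angle, 360.0 - angle) <= max_angle_deg


def mounted_object_nearby(mounting_center, candidate_center, max_distance=200.0):
    """The candidate mounted object must lie within a preset range of the mounting object."""
    dx = mounting_center[0] - candidate_center[0]
    dy = mounting_center[1] - candidate_center[1]
    return math.hypot(dx, dy) <= max_distance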

When it is determined that the object performs mounting behavior, the mounting behavior determination unit 206 may store the images before and after the corresponding scene in the information storage unit 210. Specifically, the mounting behavior determination unit 206 extracts a preceding image and a subsequent image (for example, from 15 seconds before the mounting behavior to 15 seconds after the mounting behavior) based on the scene determined as mounting behavior, and stores the extracted images in the information storage unit 210.
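Saving a clip from 15 seconds before to 15 seconds after the detected mounting behavior can be done with a rolling frame buffer, as in the hedged sketch below; the class name, buffer length, and frame rate are illustrative assumptions rather than details from the patent.

# Rolling buffer that keeps the last N seconds of frames so a pre/post clip can be saved.
# Class name, buffer length, and frame rate are illustrative assumptions.
from collections import deque


class EventClipBuffer:
    def __init__(self, fps=15, pre_seconds=15, post_seconds=15):
        self.fps = fps
        self.post_frames = post_seconds * fps
        self.pre_buffer = deque(maxlen=pre_seconds * fps)    # rolling pre-event frames
        self.post_remaining = 0
        self.clip = []

    def add_frame(self, frame):
        """Feed every incoming frame; returns True once a triggered clip is complete."""
        if self.post_remaining > 0:                           # still collecting post-event frames
            self.clip.append(frame)
            self.post_remaining -= 1
            return self.post_remaining == 0
        self.pre_buffer.append(frame)
        return False

    def trigger(self):
        """Call when mounting behavior is detected: freeze pre-event frames, start post capture."""
        self.clip = list(self.pre_buffer)
        self.post_remaining = self.post_frames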

The mounting behavior determination unit 206 may generate a mounting behavior confirmation signal and transmit it to the user terminal 300 through a communication unit (not shown) to notify the user of the mounting behavior. Here, the mounting behavior confirmation signal is a signal indicating the mounting behavior of the animal object and may include at least one of the date and time of the mounting behavior and the object information of the animal object. In addition, the mounting behavior determination unit 206 may transmit the images before and after the portion of the captured image in which the mounting behavior is confirmed (for example, the preceding 15 seconds and the following 15 seconds) to the user terminal 300 together with the mounting behavior confirmation signal. Here, the object information may be received from the object management unit 208, described below, and used to generate the mounting behavior confirmation signal; the object information is described in detail in connection with the object management unit 208.

The object management unit 208 may manage object information for each animal object. Specifically, the object management unit 208 can manage object information such as the unique identification information of each animal object and information related to its mounting behavior (identification information of the mounted object, date and time of the mounting behavior, etc.). The object management unit 208 receives the unique identification information of an object from the object identification information recognition unit 202 and updates the mounting-related information of the animal object corresponding to the received unique identification information. In addition, the object management unit 208 predicts the estrous cycle of each animal object using the accumulated mounting-related information for that object and informs the user terminal 300 of the predicted estrous cycle for each animal object.
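Predicting the next estrus from accumulated mounting dates could be as simple as averaging the intervals between recorded mounting events and projecting forward from the most recent one, falling back to a typical cycle length (about 21 days for cattle) when only one record exists. The sketch below is illustrative only; the function name, fallback value, and data layout are assumptions.

# Illustrative estrous-cycle prediction from accumulated mounting dates (assumed logic).
from datetime import datetime, timedelta


def predict_next_estrus(mounting_datetimes, default_cycle_days=21):
    """mounting_datetimes: list of datetime objects for one animal, in any order.
    Returns (estimated_cycle, predicted_next_estrus), or None if there is no record."""
    if not mounting_datetimes:
        return None
    events = sorted(mounting_datetimes)
    if len(events) >= 2:
        intervals = [(b - a).days for a, b in zip(events, events[1:])]
        cycle_days = sum(intervals) / len(intervals)       # average observed cycle length
    else:
        cycle_days = default_cycle_days                     # fall back to a typical cycle length
    cycle = timedelta(days=cycle_days)
    return cycle, events[-1] + cycle


# Example: two observed mounting events roughly three weeks apart.
cycle, next_estrus = predict_next_estrus(
    [datetime(2015, 7, 1, 6, 30), datetime(2015, 7, 22, 5, 10)]
)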

The information storage unit 210 may store the photographed images received from the photographing apparatus 100. The information storage unit 210 may also store the object configurations of the detection target animal object, which the object recognition unit 204 compares with the object configurations of the moving object extracted from the photographed image, as well as the object shape information of the detection target animal object. In addition, the information storage unit 210 may store the mounting posture pattern information that the mounting behavior determination unit 206 compares with the object posture pattern information generated from the configurations extracted by the object recognition unit 204, the object information to be matched with the identification information confirmed from the photographed image, and the connection relationships among the object configurations determined as mounting behavior in the photographed image as additional mounting posture pattern information. Although the information storage unit 210 is described here as storing the photographed images, the information on the animal object configurations, the mounting posture pattern information, and the object information used to determine mounting behavior, the present invention is not limited thereto, and the information storage unit 210 may further store other data required by the animal estrus detection apparatus 200. The information storage unit 210 may also be provided outside the animal estrus detection apparatus 200, in which case the animal estrus detection apparatus 200 can still store and retrieve the necessary data.

The animal estrus detection apparatus 200 may further include a communication unit (not shown) and a display unit (not shown).

The communication unit (not shown) can communicate with the photographing apparatus 100 and the user terminal 300. Specifically, the communication unit can receive the photographed image from the photographing apparatus 100 and transmit the mounting behavior confirmation signal to the user terminal 300.

The display unit (not shown) may display the status in which mounting behavior is confirmed in at least one of the captured image received from the photographing apparatus 100 and a previously stored captured image. Specifically, the display unit may show the object recognition unit 204 analyzing the captured image to extract the object configurations of the animal object and tracking the animal object in the received image, and may show the mounting behavior determination unit 206 confirming the mounting behavior. The display unit may display the received photographed image and the image in which the mounting behavior is confirmed together.

The user terminal 300 may be a terminal that receives the mounting behavior confirmation signal from the animal estrus detection apparatus 200. The user terminal 300 may also connect to the animal estrus detection apparatus 200 to check the photographed image of the photographing apparatus 100 in real time. In addition, the user terminal 300 can access the animal estrus detection apparatus 200 to confirm the object information of each animal object and can receive the estrous timing for each animal object predicted by the animal estrus detection apparatus 200.

FIG. 3 is a flowchart of an animal estrus detection method according to an embodiment of the present invention. In the drawing, the method is described as a plurality of steps; however, at least some of the steps may be performed in a different order, combined with other steps and performed together, omitted, or divided into sub-steps, and one or more steps not shown may be added, depending on the embodiment.

Referring to FIG. 3, the photographing apparatus 100 transmits a photographed image of the breeding area 50 where animals are kept to the animal estrus detection apparatus 200 (S310).

Next, the animal estrus detection apparatus 200 determines whether there is a moving object in the photographed image (S320). Specifically, the animal estrus detection apparatus 200 may compare the previous frame and the current frame and determine that an object showing a difference is a moving object.

Next, the animal estrus detection apparatus 200 determines whether the moving object is an animal whose mounting behavior is to be confirmed (S330). Specifically, the animal estrus detection apparatus 200 can extract the object configurations of the moving object identified in the photographed image, compare the extracted object configurations (e.g., head, torso, hip, leg, tail, skeleton, etc.) with the pre-stored object configurations of the detection target animal object, and determine whether the moving object is the animal to be confirmed by checking whether two or more object configurations are identical or similar. In addition, the animal estrus detection apparatus 200 can generate object shape information from the extracted object configurations and compare it with the pre-stored object shape information of the detection target animal object to determine whether the moving object is the detection target animal object. When making this determination, the animal estrus detection apparatus 200 may additionally use the object recognition assistance information in addition to the plurality of object configurations or the generated object shape information.

Next, the animal estrus detection apparatus 200 determines whether the posture of the animal is similar to the mounting posture (S340). Specifically, the animal estrus detection apparatus 200 generates object posture pattern information indicating the connections among the extracted object configurations and compares the generated object posture pattern information with the pre-stored mounting posture pattern information to determine whether the animal object is taking a posture corresponding to mounting behavior.

Next, the animal estrus detection apparatus 200 stores the photographed image including the mounting behavior (S350). Specifically, the animal estrus detection apparatus 200 stores the images for a predetermined time before and after (for example, from 15 seconds before the mounting behavior to 15 seconds after the mounting behavior) based on the scene determined as mounting behavior.

Next, the animal estrus detection apparatus 200 transmits a mounting behavior confirmation signal to the user terminal 300 (S360). Here, the mounting behavior confirmation signal is a signal indicating the mounting behavior of the animal and may include at least one of the date and time of the mounting behavior and the object information of the object. In addition, the animal estrus detection apparatus 200 may transmit the images before and after the behavior (for example, the preceding 15 seconds and the following 15 seconds) to the user terminal 300 together with the mounting behavior confirmation signal. If the animal estrus detection apparatus 200 transmitted the entire captured image to the user terminal 300, the user would have to scan the received image to determine whether actual mounting behavior occurred. Therefore, the animal estrus detection apparatus 200 stores the image received from the photographing apparatus 100, extracts the portion the user should check (for example, from 15 seconds before the mounting behavior to 15 seconds after the mounting behavior), and transmits the extracted portion to the user terminal 300.

FIG. 4 is a view showing an image taken by a photographing apparatus according to an embodiment of the present invention.

FIG. 4 shows an image received by the animal estrus detection apparatus 200 from the photographing apparatus 100, which photographs the breeding area 50 where animals are raised. The description here uses the photographed image received from the photographing apparatus 100, but a pre-stored image may of course be handled in the same way as shown in FIG. 4.

Referring to FIG. 4 (a), the animal estrus detection apparatus 200 identifies motion in the barn from the received image and extracts the object configurations of the moving object (for example, the head, torso, hip, legs, tail, skeleton, etc.). The animal estrus detection apparatus 200 can confirm whether the object is performing mounting behavior by comparing the object posture pattern information generated from the extracted object configurations with the pre-stored mounting posture pattern information. FIG. 4 (a) is assumed to capture an image in which a cow performs mounting behavior; when one cow climbs onto another, the mounting cow (i.e., the mounting object) rises above a certain height and tilts its body beyond a certain slope. That is, the animal estrus detection apparatus 200 generates the object posture pattern information of the mounting object from this height and slope, compares the generated object posture pattern information with the mounting posture pattern information to determine the mounting behavior, and can further confirm the mounting behavior by checking how long the posture determined as mounting behavior is maintained (i.e., the duration of the mounting behavior).

FIG. 4 (b) shows the mounting behavior of a cow, as in FIG. 4 (a). FIG. 4 (b) is an image of the photographing apparatus 100 photographing the breeding area 50 at night. That is, the animal estrus detection apparatus 200 can receive photographed images of the breeding area captured by the photographing apparatus 100 around the clock and determine whether mounting behavior occurs.

FIG. 5 illustrates a computing environment including an exemplary computing device suitable for use in the exemplary embodiments.

The exemplary computing environment 400 shown in FIG. 5 includes a computing device 410. Each component may have different functions and capabilities and may additionally include components appropriate for it, even if not described below. The computing device 410 may be a device for detecting an animal's estrus (e.g., the animal estrus detection apparatus 200) or a device for verifying animal estrus (e.g., the user terminal 300).

The computing device 410 includes at least one processor 412, a computer-readable storage medium 414, and a bus 460. The processor 412 is connected to the bus 460, and the bus 460 connects various other components of the computing device 410, including the computer-readable storage medium 414, to the processor 412.

The processor 412 may cause the computing device 410 to operate in accordance with the exemplary embodiments discussed above. For example, the processor 412 may execute computer-executable instructions stored in the computer-readable storage medium 414, and those instructions, when executed by the processor 412, may configure the computing device 410 to perform operations in accordance with certain exemplary embodiments.

The computer-readable storage medium 414 may store computer-executable instructions or program code (e.g., the instructions contained in the application 430), program data (e.g., data used by the application 430), and/or other suitable forms of information. The application 430 stored in the computer-readable storage medium 414 includes a predetermined set of instructions executable by the processor 412.

The memory 416 and the storage device 418 shown in FIG. 5 are examples of the computer-readable storage medium 414. The memory 416 may be loaded with computer-executable instructions that can be executed by the processor 412, and program data may also be stored in the memory 416. For example, the memory 416 may be volatile memory such as random access memory, non-volatile memory, or any suitable combination thereof. As another example, the storage device 418 may include one or more removable or non-removable components for storing information. For example, the storage device 418 may be a hard disk, flash memory, a magnetic disk, an optical disk, another type of storage medium that can be accessed by the computing device 410 and can store the desired information, or any suitable combination thereof.

The computing device 410 may also include one or more input/output interfaces 420 that provide an interface for one or more input/output devices 470. The input/output interface 420 is connected to the bus 460, and the input/output devices 470 may be connected to (other components of) the computing device 410 via the input/output interface 420. The input/output devices 470 may include input devices such as a pointing device, a keyboard, a touch input device, a voice input device, a sensor device, and/or a photographing device, and/or output devices such as a display device, a printer, and a speaker.

While the present invention has been shown and described with reference to exemplary embodiments thereof, it will be understood that the invention is not limited to the disclosed exemplary embodiments. Therefore, the scope of the present invention should not be limited to the above-described embodiments but should be determined by the appended claims and their equivalents.

10: Animal estrus detection system
100: Photographing apparatus
200: Animal estrus detection apparatus
202: Object identification information recognition unit
204: Object recognition unit
206: Mounting behavior determination unit
208: Object management unit
210: Information storage unit
300: User terminal
400: Computing environment
410: computing device
412: Processor
414: Computer readable storage medium
416: Memory
418: Storage device
420: input / output interface
430: Application
460: Bus
470: input / output device

Claims (21)

An animal estrus detection method comprising:
identifying, by an animal estrus detection apparatus, a moving object in a captured image;
determining, by the animal estrus detection apparatus, whether the moving object is an animal whose mounting behavior is to be confirmed; and
generating, by the animal estrus detection apparatus, object posture pattern information according to connection relationships among a plurality of object configurations constituting the identified animal object in the captured image, and comparing the generated object posture pattern information with pre-stored mounting posture pattern information to determine whether the object performs mounting behavior.
The method according to claim 1,
wherein the step of determining whether the moving object is the animal to be confirmed comprises:
extracting, by the animal estrus detection apparatus, object configurations of the moving object; and
comparing the extracted object configurations with pre-stored object configurations of the animal object to be detected and confirming whether two or more object configurations are identical or similar.
The method according to claim 1,
wherein the step of determining whether the moving object is the animal to be confirmed comprises:
extracting, by the animal estrus detection apparatus, object configurations of the moving object;
generating, by the animal estrus detection apparatus, object shape information from the extracted object configurations; and
comparing the generated object shape information with pre-stored object shape information of the animal object to be detected to determine whether the moving object is the animal to be detected.
The method according to claim 1,
wherein the object posture pattern information is generated by extracting at least two object configurations among the head, torso, hip, leg, tail, and skeleton constituting the identified animal object and connecting the extracted object configurations to one another.
The method according to claim 1,
further comprising, after the step of determining whether the object performs mounting behavior:
storing, by the animal estrus detection apparatus, a preceding image and a subsequent image for a preset time period based on the time point at which the mounting behavior is determined in the captured image.
The method of claim 5,
further comprising, after storing the images, transmitting, by the animal estrus detection apparatus, at least one of a mounting behavior confirmation signal and the stored images to a user terminal,
wherein the mounting behavior confirmation signal includes at least one of the date and time of the mounting behavior and object information of the identified animal object.
The method according to claim 1,
further comprising, when mounting behavior is determined in the step of determining whether the object performs mounting behavior:
confirming, by the animal estrus detection apparatus, whether the mounting posture of the identified animal object is maintained for a predetermined period of time.
The method according to claim 1,
further comprising, when mounting behavior is determined in the step of determining whether the object performs mounting behavior:
confirming, by the animal estrus detection apparatus, unique identification information of the identified animal object; and
updating, in association with the confirmed unique identification information of the identified animal object, at least one of the date and time of the mounting behavior and identification information of the mounted animal object.
The method of claim 8,
further comprising, after the updating step:
predicting, by the animal estrus detection apparatus, an estrous cycle using the accumulated mounting dates and times of the animal object; and
transmitting, by the animal estrus detection apparatus, the predicted estrous cycle of the animal object to a user terminal.
The method according to claim 1,
further comprising, when mounting behavior is determined in the step of determining whether the object performs mounting behavior:
storing, by the animal estrus detection apparatus, the connection relationships among the object configurations determined as the mounting behavior as part of the mounting posture pattern information.
An animal estrus detection apparatus comprising:
an object recognition unit that identifies a moving object in a photographed image of a breeding area and determines whether the moving object is an animal whose mounting behavior is to be confirmed; and
a mounting behavior determination unit that generates object posture pattern information according to connection relationships among a plurality of object configurations constituting the identified animal object, and compares the generated object posture pattern information with pre-stored mounting posture pattern information to determine whether the object performs mounting behavior.
The apparatus of claim 11,
wherein the object recognition unit extracts object configurations of the moving object, compares the extracted object configurations with pre-stored object configurations of the animal object to be detected, and confirms whether two or more object configurations are identical or similar.
The apparatus of claim 11,
wherein the object recognition unit extracts object configurations of the moving object, generates object shape information from the extracted object configurations, and compares the generated object shape information with pre-stored object shape information of the animal object to be detected to determine whether the moving object is the animal to be detected.
The apparatus of claim 11,
wherein the object posture pattern information is generated by extracting, from the captured image, at least two object configurations among the head, torso, hip, leg, and skeleton constituting the identified animal object and connecting the extracted object configurations to one another.
The apparatus of claim 11,
wherein the mounting behavior determination unit stores a preceding image and a subsequent image for a preset time period based on the time point in the captured image at which the mounting behavior is determined.
The apparatus of claim 15,
further comprising a communication unit that transmits a mounting behavior confirmation signal and the stored images to a user terminal,
wherein the mounting behavior confirmation signal includes at least one of the date and time of the mounting behavior and object information of the identified animal object.
The apparatus of claim 11,
wherein the mounting behavior determination unit confirms whether the mounting posture of the identified animal object is maintained for a preset time.
The apparatus of claim 11,
further comprising:
an identification information confirmation unit that confirms unique identification information of the identified animal object and of other animal objects; and
an object management unit that, when the mounting behavior determination unit determines that the generated object posture pattern information is similar to the pre-stored mounting posture pattern information, updates, in association with the unique identification information confirmed by the identification information confirmation unit, at least one of the identification information of the mounted animal object and the date and time of the mounting behavior.
The apparatus of claim 18,
wherein the object management unit predicts an estrous cycle using the accumulated mounting dates and times of the animal object, and
the apparatus further comprises a communication unit that transmits the predicted estrous cycle of the animal object to a user terminal.
The apparatus of claim 11,
wherein the mounting behavior determination unit adds the generated object posture pattern information to the mounting posture pattern information and stores it when the generated object posture pattern information is determined to be similar to the pre-stored mounting posture pattern information.
A device comprising:
one or more processors;
a memory; and
one or more programs,
wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs cause the device to perform:
an operation of identifying a moving object in a captured image of a breeding area;
an operation of determining whether the moving object is an animal whose mounting behavior is to be confirmed; and
an operation of generating object posture pattern information according to connection relationships among a plurality of object configurations constituting the identified animal object in the captured image, and comparing the generated object posture pattern information with pre-stored mounting posture pattern information to determine whether the object performs mounting behavior.
KR1020150101651A 2015-07-17 2015-07-17 Sensing estrus of poultry method and device for thereof KR101694946B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150101651A KR101694946B1 (en) 2015-07-17 2015-07-17 Sensing estrus of poultry method and device for thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150101651A KR101694946B1 (en) 2015-07-17 2015-07-17 Sensing estrus of poultry method and device for thereof

Publications (1)

Publication Number Publication Date
KR101694946B1 true KR101694946B1 (en) 2017-01-10

Family

ID=57811770

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150101651A KR101694946B1 (en) 2015-07-17 2015-07-17 Sensing estrus of poultry method and device for thereof

Country Status (1)

Country Link
KR (1) KR101694946B1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018043765A1 (en) * 2016-08-29 2018-03-08 주식회사 씨피에스글로벌 Method for detecting animal estrus and device for performing same
KR20200030765A (en) 2018-09-13 2020-03-23 박상열 Apparatus and method for managing livestock using machine learning
KR20220012629A (en) * 2020-07-23 2022-02-04 강원대학교산학협력단 method for detecting estrus of cattle based on object detection algorithm
WO2022167328A1 (en) * 2021-02-02 2022-08-11 Signify Holding B.V. System and method for analyzing mating behavior of an animal species
WO2022190923A1 (en) * 2021-03-11 2022-09-15 日本ハム株式会社 Swine rearing assistance apparatus, swine rearing assistance method, and swine rearing assistance program
KR102527058B1 (en) * 2022-12-01 2023-05-02 한국아이오티 주식회사 Apparatus for detecting mounting behavior of cattle

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090127762A (en) * 2008-06-09 2009-12-14 재단법인 포항산업과학연구원 Monitoring method of cattle's heat in network, system therof, and web-server used therin
KR20110007922A (en) * 2009-07-17 2011-01-25 경북대학교 산학협력단 An automatic alarm system for in-and-in breeding of cattle
JP2012190280A (en) * 2011-03-10 2012-10-04 Hiroshima Univ Action recognition device, action recognition method, and program
KR20130030615A (en) * 2011-09-19 2013-03-27 (주)터보소프트 System for providing domestic animal mounting information

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090127762A (en) * 2008-06-09 2009-12-14 재단법인 포항산업과학연구원 Monitoring method of cattle's heat in network, system therof, and web-server used therin
KR20110007922A (en) * 2009-07-17 2011-01-25 경북대학교 산학협력단 An automatic alarm system for in-and-in breeding of cattle
JP2012190280A (en) * 2011-03-10 2012-10-04 Hiroshima Univ Action recognition device, action recognition method, and program
KR20130030615A (en) * 2011-09-19 2013-03-27 (주)터보소프트 System for providing domestic animal mounting information
KR101329022B1 (en) 2011-09-19 2013-12-20 (주)터보소프트 System for providing domestic animal mounting information

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018043765A1 (en) * 2016-08-29 2018-03-08 주식회사 씨피에스글로벌 Method for detecting animal estrus and device for performing same
KR20200030765A (en) 2018-09-13 2020-03-23 박상열 Apparatus and method for managing livestock using machine learning
KR20220012629A (en) * 2020-07-23 2022-02-04 강원대학교산학협력단 method for detecting estrus of cattle based on object detection algorithm
KR102424901B1 (en) * 2020-07-23 2022-07-22 강원대학교산학협력단 method for detecting estrus of cattle based on object detection algorithm
WO2022167328A1 (en) * 2021-02-02 2022-08-11 Signify Holding B.V. System and method for analyzing mating behavior of an animal species
WO2022190923A1 (en) * 2021-03-11 2022-09-15 日本ハム株式会社 Swine rearing assistance apparatus, swine rearing assistance method, and swine rearing assistance program
KR102527058B1 (en) * 2022-12-01 2023-05-02 한국아이오티 주식회사 Apparatus for detecting mounting behavior of cattle
WO2024117665A1 (en) * 2022-12-01 2024-06-06 한국아이오티 주식회사 Device for detecting mounting behavior of cattle

Similar Documents

Publication Publication Date Title
KR101694946B1 (en) Sensing estrus of poultry method and device for thereof
US11080434B2 (en) Protecting content on a display device from a field-of-view of a person or device
CN111542222B (en) System and method for estimating the weight of livestock
US10058076B2 (en) Method of monitoring infectious disease, system using the same, and recording medium for performing the same
CN106104569B (en) For establishing the method and apparatus of connection between electronic device
JP5301973B2 (en) Crime prevention device and program
CN111107740A (en) Livestock information management system, livestock house, livestock information management program, and livestock information management method
JP2017112857A (en) Animal husbandry management system
JP6535196B2 (en) Image processing apparatus, image processing method and image processing system
WO2017158698A1 (en) Monitoring device, monitoring method, and monitoring program
JP5001808B2 (en) Crime prevention device and crime prevention program
KR101104656B1 (en) Pet image detection system and method for controlling operation thereof
CN110267010B (en) Image processing method, image processing apparatus, server, and storage medium
Joshi et al. A fall detection and alert system for an elderly using computer vision and Internet of Things
US20210216758A1 (en) Animal information management system and animal information management method
KR102527322B1 (en) Method and device for managing livestock through object recognition based on deep learning
US20230320328A1 (en) Pet status assessment system, pet camera, server, pet status assessment method, and program
KR102304109B1 (en) Animal tracking monitoring server and operation method thereof
EP4402657A1 (en) Systems and methods for the automated monitoring of animal physiological conditions and for the prediction of animal phenotypes and health outcomes
JP2017091552A (en) Behavior detection device, behavior detection method and monitored person monitoring device
CN115471916A (en) Smoking detection method, device, equipment and storage medium
KR102277853B1 (en) Control method for system of preventing pet loss
KR101718452B1 (en) Sensing estrus of poultry system and device for thereof
CN110036406B (en) Collection system, recording medium storing terminal program, and collection method
JP6893812B2 (en) Object detector

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant