CN111652911A - Target monitoring method, device and equipment

Target monitoring method, device and equipment

Info

Publication number
CN111652911A
CN111652911A (application CN202010521316.0A)
Authority
CN
China
Prior art keywords
information
tracking target
preset
motion state
position information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010521316.0A
Other languages
Chinese (zh)
Other versions
CN111652911B (en)
Inventor
张发恩
林国森
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ainnovation Nanjing Technology Co ltd
Original Assignee
Ainnovation Nanjing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ainnovation Nanjing Technology Co ltd filed Critical Ainnovation Nanjing Technology Co ltd
Priority to CN202010521316.0A
Publication of CN111652911A
Application granted
Publication of CN111652911B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/30 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The application provides a target monitoring method, device and equipment. The method includes: acquiring position information of a tracking target and recording time information; generating motion state information of the tracking target according to the position information and the time information; and analyzing the health index of the tracking target according to the motion state information and sending prompt information. By acquiring the position information of the tracking target in real time, recording the corresponding time information, generating the motion state information of the tracking target from the position and time information, analyzing the health index from the motion state information, and sending out corresponding prompt information, the health condition of the tracking target is monitored in a timely manner.

Description

Target monitoring method, device and equipment
Technical Field
The present application relates to the field of information processing technologies, and in particular, to a target monitoring method, apparatus, and device.
Background
As living standards continue to improve, demand for poultry products keeps growing, and large-scale farms are developing rapidly. At the same time, an outbreak of infectious disease among poultry can cause serious losses to farmers, so the health condition of the birds is of great concern to them.
In large-scale breeding, various instruments and systems are generally adopted to monitor the health of individual birds, for example mounting a small device on each bird to monitor its heart rate and record its steps and trajectory.
However, such monitoring schemes require equipment to be installed on every individual, which is costly, involves a large workload for installation and removal, and makes cleaning and maintenance difficult; moreover, abnormal targets cannot be located in a timely and accurate manner.
Disclosure of Invention
An object of the embodiments of the present application is to provide a target monitoring method, device and apparatus, so as to analyze and obtain a health index of a tracked target according to motion state information of the tracked target, and send out corresponding prompt information.
A first aspect of an embodiment of the present application provides a target monitoring method, including: acquiring position information of a tracking target and recording time information; generating motion state information of the tracking target according to the position information and the time information; and analyzing the health index of the tracking target according to the motion state information, and sending prompt information.
In an embodiment, the acquiring the position information of the tracking target and recording the time information includes: acquiring image information of the tracking target in a preset area; and performing image processing on the image information to obtain the position information and the time information of the tracking target in the preset area, and associating and marking the position information and the identity label of the tracking target.
In an embodiment, the generating the motion state information of the tracking target according to the position information and the time information includes: generating motion trail information of the tracking target according to the position information and the time information; and generating the motion state information of the tracking target in a preset time period according to the motion track information.
In an embodiment, the analyzing the health index of the tracking target according to the motion state information and sending a prompt message includes: analyzing the motion state information to obtain the motion distance of the tracking target in a first preset time period; judging whether the movement distance is within a preset distance threshold value; and if the movement distance is not within the preset distance threshold, sending out alarm information.
In an embodiment, the analyzing the health index of the tracking target according to the motion state information and sending a prompt message includes: analyzing the motion state information to obtain the motion intensity of the tracking target in a second preset time period; judging whether the motion intensity is within a preset intensity threshold value; and if the motion intensity is not within the preset intensity threshold value, sending out alarm information.
A second aspect of the embodiments of the present application provides a target monitoring apparatus, including: the acquisition module is used for acquiring the position information of the tracking target and recording time information; the generating module is used for generating the motion state information of the tracking target according to the position information and the time information; and the analysis module is used for analyzing the health index of the tracking target according to the motion state information and sending prompt information.
In one embodiment, the obtaining module is configured to: acquiring image information of the tracking target in a preset area; and performing image processing on the image information to obtain the position information and the time information of the tracking target in the preset area, and associating and marking the position information and the identity label of the tracking target.
In one embodiment, the generating module is configured to: generating motion trail information of the tracking target according to the position information and the time information; and generating the motion state information of the tracking target in a preset time period according to the motion track information.
In one embodiment, the parsing module is configured to: analyzing the motion state information to obtain the motion distance of the tracking target in a first preset time period; judging whether the movement distance is within a preset distance threshold value; and if the movement distance is not within the preset distance threshold, sending out alarm information.
In one embodiment, the parsing module is configured to: analyzing the motion state information to obtain the motion intensity of the tracking target in a second preset time period; judging whether the motion intensity is within a preset intensity threshold value; and if the motion intensity is not within the preset intensity threshold value, sending out alarm information.
A third aspect of the embodiments of the present application provides an electronic device, including: a memory configured to store a computer program; and a processor configured to perform the method of the first aspect of the embodiments of the present application, or any embodiment thereof, to monitor the tracking target.
According to the target monitoring method, device and equipment, the position information of the tracking target is acquired in real time and the corresponding time information is recorded; the motion state information of the tracking target is generated from the position information and the time information; the health index of the tracking target is analyzed from the motion state information and corresponding prompt information is sent out, so that the health condition of the tracking target is monitored in a timely manner.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required in the embodiments are briefly described below. It should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope; those skilled in the art can obtain other related drawings based on these drawings without inventive effort.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 2 is a schematic flow chart illustrating a target monitoring method according to an embodiment of the present application;
Fig. 3 is a schematic flow chart illustrating a target monitoring method according to an embodiment of the present application;
Fig. 4 is a schematic structural diagram of a target monitoring device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. In the description of the present application, the terms "first," "second," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
As shown in fig. 1, the present embodiment provides an electronic apparatus 1 including: at least one processor 11 and a memory 12, one processor being exemplified in fig. 1. The processor 11 and the memory 12 are connected by the bus 10, and the memory 12 stores instructions executable by the processor 11, and the instructions are executed by the processor 11, so that the electronic device 1 can execute all or part of the flow of the method in the embodiments described below to monitor the tracking target.
In an embodiment, the electronic device 1 may be a mobile phone, a notebook computer, or the like.
In one embodiment, the tracking target can be a living body such as poultry; in a large-scale poultry breeding scene, tracking and monitoring the birds facilitates the prevention and control of poultry diseases.
Please refer to fig. 2, which is a target monitoring method according to an embodiment of the present application, and the method may be executed by the electronic device 1 shown in fig. 1, and may be applied to a poultry disease prevention and control scenario in a farm, so as to analyze a health index of a tracked target according to motion state information of the tracked target, and send out corresponding prompt information. The method comprises the following steps:
step 201: and acquiring the position information of the tracking target and recording the time information.
In this step, taking a poultry breeding scene as an example, the tracked targets may be live birds in a farm. There may be a plurality of tracked targets; a unique identity tag may be set for each tracked target, and the position information and corresponding time information of each tracked target are acquired in real time.
Step 202: and generating the motion state information of the tracking target according to the position information and the time information.
In this step, the position information corresponds one to one with the time information; that is, step 201 acquires the position of the tracking target as it changes over time and associates it with the tracking target. From the position information and the time information, the motion state information of each tracking target can be derived.
Step 203: and analyzing the health index of the tracking target according to the motion state information, and sending prompt information.
In this step, the motion state information of the tracked target can represent its health condition, so the health index of the tracked target can be analyzed from the motion state information and corresponding prompt information can be sent out in time, allowing sudden disease conditions to be responded to promptly.
With the above target monitoring method, the position information of the tagged birds is acquired in real time and the corresponding time information is recorded; the motion state information of the birds is generated from the position and time information; the health index of the birds is analyzed from the motion state information and corresponding prompt information is sent out, so that the health condition of the birds is monitored in a timely manner.
Please refer to fig. 3, which is a target monitoring method according to an embodiment of the present application, and the method may be executed by the electronic device 1 shown in fig. 1, and may be applied to a poultry disease prevention and control scenario in a farm, so as to analyze a health index of a tracked target according to motion state information of the tracked target, and send out corresponding prompt information. The method comprises the following steps:
step 301: and acquiring the image information of the tracking target in a preset area.
In this step, taking a poultry farming scene as an example, the preset area may be the range over which the birds move within the farm. The image information of the farm can be collected by equipment such as monitoring cameras. For example, multiple cameras can be installed at different positions and angles of the farm, with each camera collecting image information of the farm separately.
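By way of illustration only, frame acquisition from several cameras could look like the following minimal sketch; OpenCV is one common choice but is not named in the application, and the camera sources are placeholders:

```python
import cv2  # OpenCV, used here only as an illustrative capture library

# Hypothetical camera sources: device indices or stream URLs of the farm cameras
CAMERA_SOURCES = [0, 1]

captures = [cv2.VideoCapture(src) for src in CAMERA_SOURCES]

def grab_frames():
    """Read one frame from every camera channel; returns (camera_index, frame) pairs."""
    frames = []
    for idx, cap in enumerate(captures):
        ok, frame = cap.read()
        if ok:
            frames.append((idx, frame))
    return frames
```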
Step 302: and carrying out image processing on the image information to obtain position information and time information of the tracking target in a preset area, and associating and marking the position information and the identity tag of the tracking target.
In this step, each channel of image information is processed to obtain the position information and local identity information of the tracking targets within the preset area; since the shooting time can be recorded with the image information, the corresponding time information is also obtained. A unique identity tag can be preset for each tracking target, for example by assigning an identity tag to each bird in the farm. The position information and time information of each tracked target are then associated with the corresponding identity tag. For example, the videos shot by the cameras are processed by mutually independent models of the same structure, using real-time target tracking (Real-Time MDNet). Each model outputs the identity information and position information of the tracking targets in its monitoring video, where the position information is represented by the coordinates of the upper-left corner (x_min, y_min) and the lower-right corner (x_max, y_max) of the bounding rectangle that locates the target in the picture.
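For illustration, the per-camera tracking output described above could be held in a record such as the sketch below; the class and field names are assumptions, and the tracking model itself is treated as a black box:

```python
from dataclasses import dataclass

@dataclass
class LocalDetection:
    """One camera's tracking output for one target in one frame."""
    camera_id: int          # which camera channel produced this detection
    local_identity: str     # identity under this camera's video
    x_min: float            # upper-left corner of the bounding rectangle (pixels)
    y_min: float
    x_max: float            # lower-right corner of the bounding rectangle (pixels)
    y_max: float
    capture_time: float     # shooting time recorded with the frame, in seconds

def box_center(det: LocalDetection) -> tuple[float, float]:
    """Center of the bounding rectangle, a convenient single-point position."""
    return ((det.x_min + det.x_max) / 2.0, (det.y_min + det.y_max) / 2.0)
```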
In one embodiment, after the image processing of the multiple channels of image information, the processing results are integrated to form a global target tracking result. In an actual scene, the fields of view of different cameras overlap to some extent; through calibration at camera installation time, the tracking results in the overlapping areas can be unified. After all the images are fused, the per-channel tracking results are mapped to globally unique identity information and global position information, and the time information at that moment is recorded. The global position information is obtained as follows: the position information from each channel of image information is translated according to the position of that camera's view within the global picture, so that the position of the tracking target in each channel is mapped to a global position. The global position information may be represented by the global coordinates of the upper-left corner (X_MIN, Y_MIN) and the lower-right corner (X_MAX, Y_MAX) of the bounding rectangle in the global picture.
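A minimal sketch of the translation-based mapping described above, assuming each camera's view offset within the global picture is known from calibration; the offset values and function name are illustrative assumptions:

```python
# Per-camera translation offsets obtained at calibration time (assumed values, in pixels)
CAMERA_OFFSETS = {
    0: (0.0, 0.0),       # camera 0's view starts at the global origin
    1: (1920.0, 0.0),    # camera 1's view sits to the right of camera 0
}

def to_global_box(camera_id, x_min, y_min, x_max, y_max):
    """Translate a per-camera bounding box into global coordinates
    (X_MIN, Y_MIN, X_MAX, Y_MAX) using the camera's calibrated offset."""
    dx, dy = CAMERA_OFFSETS[camera_id]
    return (x_min + dx, y_min + dy, x_max + dx, y_max + dy)
```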
Step 303: and generating the motion trail information of the tracking target according to the position information and the time information.
In this step, motion trajectory information may be generated for each tracked target from the globally detected position information and time information, and a corresponding trajectory database may be created. The trajectory database records in detail the global position information of each tracking target at each moment.
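The trajectory database could be as simple as a mapping from the global identity tag to its time-stamped global positions; the layout below is an assumed sketch, not a schema required by the application:

```python
from collections import defaultdict

# identity tag -> list of (timestamp, X, Y) samples, one per recorded moment
trajectory_db = defaultdict(list)

def record_position(identity_tag, timestamp, x_global, y_global):
    """Append one global observation to the target's trajectory."""
    trajectory_db[identity_tag].append((timestamp, x_global, y_global))

def trajectory_in_window(identity_tag, t_start, t_end):
    """Return the samples recorded for a target within [t_start, t_end]."""
    return [s for s in trajectory_db[identity_tag] if t_start <= s[0] <= t_end]
```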
Step 304: and generating motion state information of the tracking target in a preset time period according to the motion track information.
In this step, the motion trajectory information is essentially the trace left by the tracking target's motion, so the motion state of each tracking target can be analyzed based on the trajectory database of all tracking targets obtained in the previous step; a time period can be set, and the motion state information of the tracking target within that period is analyzed.
Step 305: and analyzing the motion state information to obtain the motion distance of the tracking target in a first preset time period.
In this step, the motion state information can reflect the health condition of the tracked target: healthy birds are generally lively and naturally leave a relatively active trace, so the health condition can be analyzed from the motion state information. For example, the moving distance of each tracked target within a first preset time period (e.g., 30 minutes) may be calculated.
In one embodiment, the moving distance may be obtained by accumulating, second by second, the distances between the successively updated positions of the tracking target. Let the position of tracking target 1 at the t-th second be (x_t, y_t) and its position at the (t-1)-th second be (x_{t-1}, y_{t-1}). The moving distance between the (t-1)-th second and the t-th second can then be calculated by the following formula:

L = sqrt((x_t - x_{t-1})^2 + (y_t - y_{t-1})^2)

where L is the moving distance between the t-th second and the (t-1)-th second.
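With per-second sampling, the moving distance over a window is simply the sum of the per-second displacements; the following sketch assumes positions are already expressed in global coordinates, and the function name is illustrative:

```python
import math

def moving_distance(samples):
    """Total distance travelled along a list of (timestamp, x, y) samples,
    accumulated pairwise as in the formula above."""
    total = 0.0
    for (_, x0, y0), (_, x1, y1) in zip(samples, samples[1:]):
        total += math.hypot(x1 - x0, y1 - y0)
    return total

# Example: four samples taken one second apart
samples = [(0, 0.0, 0.0), (1, 3.0, 4.0), (2, 3.0, 4.0), (3, 6.0, 8.0)]
print(moving_distance(samples))  # 5.0 + 0.0 + 5.0 = 10.0
```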
Step 306: and judging whether the movement distance is within a preset distance threshold value.
In this step, the movement distance over a period of time can represent the health state of the birds, and healthy birds typically cover a characteristic range of distance. The preset distance threshold may be obtained from statistics on empirical data; alternatively, positive and negative sample data of healthy and unhealthy birds may be accumulated, and a suitable preset distance threshold may be mined by training a data mining model such as an XGBoost (extreme gradient boosting) machine learning model. It is then determined whether the distance traveled by a bird is within the preset distance threshold: if so, the bird is considered healthy and no alarm is issued; otherwise, the method proceeds to step 307.
Step 307: and sending out alarm information.
In this step, if the moving distance is not within the preset distance threshold, the bird may be sick, and an alarm message is issued for that bird.
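A minimal sketch of the judgement in steps 306 and 307; the threshold bounds below are assumed values supplied externally (for example from empirical statistics or a trained model, as suggested above):

```python
def check_distance(identity_tag, distance, lower=50.0, upper=2000.0):
    """Return an alarm message when the distance over the first preset period
    falls outside the preset distance threshold [lower, upper] (assumed bounds)."""
    if lower <= distance <= upper:
        return None  # healthy: no alarm
    reason = "too short" if distance < lower else "too long"
    return f"ALARM: {identity_tag} moved {distance:.1f} units ({reason})"

print(check_distance("bird_001", 12.0))   # below the lower bound -> alarm message
print(check_distance("bird_002", 300.0))  # within threshold -> None
```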
In one embodiment, summary statistics may be computed on the movement distance of each tracked target over a past first preset time period (e.g., from 8:00 a.m. to 8:30 a.m. of the same day). If the movement distance is not within the preset distance threshold, the distance is either too short or too long, which may reflect a health risk for the tracked target, and an alarm is required.
In one embodiment, when a certain tracked target keeps its position relatively fixed for a certain period of time (e.g., 15 minutes), the tracked target may have a health abnormality and an alarm needs to be raised for handling.
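The "relatively fixed position" check can be sketched as a bound on how far the target strays from its first recorded position within the window; the 15-minute window and the tolerance are assumptions for illustration:

```python
import math

def is_stationary(samples, tolerance=5.0):
    """True when every (timestamp, x, y) sample stays within `tolerance` units
    of the first sample, i.e. the target's position is effectively fixed."""
    if not samples:
        return False
    _, x0, y0 = samples[0]
    return all(math.hypot(x - x0, y - y0) <= tolerance for _, x, y in samples)

# samples covering e.g. the last 15 minutes (900 seconds) of one target's trajectory
window = [(t, 10.0, 20.0) for t in range(0, 900, 60)]
if is_stationary(window):
    print("ALARM: target has not moved for the whole window")
```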
In an embodiment, after step 304, the method further includes: analyzing the motion state information to obtain the motion intensity of the tracking target in a second preset time period; judging whether the motion intensity is within a preset intensity threshold value; and if the motion intensity is not within the preset intensity threshold value, sending out alarm information.
In the above steps, the motion intensity can also represent the health status of the birds. Similarly, empirical data can be analyzed to obtain a suitable preset intensity threshold that characterizes healthy birds, and it is then determined whether the motion intensity of each bird is within the preset intensity threshold: if so, the bird is healthy; otherwise, an alarm message is issued for it so that staff can handle it in time. For example, poultry whose movement remains low for several consecutive days should trigger an alarm so that they can be fed in isolation to avoid disease. The alarm information can be presented as a message pop-up window and a voice broadcast, and the position and abnormality reason of the suspicious bird can be displayed in real time in the video picture, reminding staff to take corresponding measures and preventing the disease from spreading.
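The application does not define motion intensity numerically; one plausible proxy, shown purely as an assumption, is the mean per-second speed over the second preset time period, checked against the preset intensity threshold:

```python
import math

def motion_intensity(samples):
    """Mean speed over the window of (timestamp, x, y) samples:
    an assumed proxy for the motion intensity described above."""
    if len(samples) < 2:
        return 0.0
    dist = sum(math.hypot(x1 - x0, y1 - y0)
               for (_, x0, y0), (_, x1, y1) in zip(samples, samples[1:]))
    duration = samples[-1][0] - samples[0][0]
    return dist / duration if duration > 0 else 0.0

def check_intensity(identity_tag, intensity, lower=0.05, upper=3.0):
    """Return an alarm message when intensity falls outside the preset
    intensity threshold (bounds are assumed for illustration)."""
    if lower <= intensity <= upper:
        return None
    return f"ALARM: {identity_tag} intensity {intensity:.2f} outside threshold"
```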
With the above target monitoring method, individual birds do not need to wear additional equipment, so the workload of fitting and removing devices is greatly reduced and heavy work such as cleaning and maintaining the devices is avoided. Meanwhile, abnormal targets can be located at any time and staff can be guided to handle suspicious targets as soon as possible, making the method more targeted, effective and timely.
Please refer to fig. 4, which is a target monitoring apparatus 400 according to an embodiment of the present application, and the apparatus may be applied to the electronic device 1 shown in fig. 1, and may be applied to a poultry disease prevention and control scenario in a farm, so as to analyze the health index of the tracked target according to the motion state information of the tracked target, and send out corresponding prompt information. The device includes: the system comprises an acquisition module 401, a generation module 402 and an analysis module 403, wherein the principle relationship of each module is as follows:
the obtaining module 401 is configured to obtain position information of a tracking target and record time information. See the description of step 201 in the above embodiments for details.
A generating module 402, configured to generate motion state information of the tracking target according to the position information and the time information. See the description of step 202 in the above embodiments for details.
And the analyzing module 403 is configured to analyze the health index of the tracking target according to the motion state information, and send a prompt message. See the description of step 203 in the above embodiments for details.
In one embodiment, the obtaining module 401 is configured to: and acquiring the image information of the tracking target in a preset area. And carrying out image processing on the image information to obtain position information and time information of the tracking target in a preset area, and associating and marking the position information and the identity tag of the tracking target. See the description of step 301 to step 302 in the above embodiments in detail.
In one embodiment, the generating module 402 is configured to: and generating the motion trail information of the tracking target according to the position information and the time information. And generating motion state information of the tracking target in a preset time period according to the motion track information. See the description of steps 303 to 304 in the above embodiments in detail.
In one embodiment, the parsing module 403 is configured to: and analyzing the motion state information to obtain the motion distance of the tracking target in a first preset time period. And judging whether the movement distance is within a preset distance threshold value. And if the movement distance is not within the preset distance threshold, sending out alarm information. Refer to the description of step 305 to step 307 in the above embodiments in detail.
In one embodiment, the parsing module 403 is configured to: and analyzing the motion state information to obtain the motion intensity of the tracking target in a second preset time period. And judging whether the exercise intensity is within a preset intensity threshold value. And if the movement distance is not within the preset intensity threshold value, sending out alarm information. Reference is made in detail to the description of the relevant method steps in the above examples.
For a detailed description of the target monitoring apparatus 400, please refer to the description of the related method steps in the above embodiments.
An embodiment of the present invention further provides a non-transitory electronic device readable storage medium, including: a program that, when run on an electronic device, causes the electronic device to perform all or part of the procedures of the methods in the above-described embodiments. The storage medium may be a magnetic Disk, an optical Disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a Flash Memory (Flash Memory), a Hard Disk (Hard Disk Drive, abbreviated as HDD), a Solid State Drive (SSD), or the like. The storage medium may also comprise a combination of memories of the kind described above.
Although the embodiments of the present invention have been described in conjunction with the accompanying drawings, those skilled in the art may make various modifications and variations without departing from the spirit and scope of the invention, and such modifications and variations fall within the scope defined by the appended claims.

Claims (10)

1. A method of target monitoring, comprising:
acquiring position information of a tracking target and recording time information;
generating motion state information of the tracking target according to the position information and the time information;
and analyzing the health index of the tracking target according to the motion state information, and sending prompt information.
2. The method of claim 1, wherein the obtaining position information of the tracking target and recording time information comprises:
acquiring image information of the tracking target in a preset area;
and performing image processing on the image information to obtain the position information and the time information of the tracking target in the preset area, and associating and marking the position information and the identity label of the tracking target.
3. The method according to claim 1, wherein the generating motion state information of the tracking target according to the position information and the time information comprises:
generating motion trail information of the tracking target according to the position information and the time information;
and generating the motion state information of the tracking target in a preset time period according to the motion track information.
4. The method according to claim 1, wherein the analyzing the health index of the tracking target according to the motion state information and issuing a prompt message comprises:
analyzing the motion state information to obtain the motion distance of the tracking target in a first preset time period;
judging whether the movement distance is within a preset distance threshold value;
and if the movement distance is not within the preset distance threshold, sending out alarm information.
5. The method according to claim 1, wherein the analyzing the health index of the tracking target according to the motion state information and issuing a prompt message comprises:
analyzing the motion state information to obtain the motion intensity of the tracking target in a second preset time period;
judging whether the motion intensity is within a preset intensity threshold value;
and if the motion intensity is not within the preset intensity threshold value, sending out alarm information.
6. An object monitoring device, comprising:
the acquisition module is used for acquiring the position information of the tracking target and recording time information;
the generating module is used for generating the motion state information of the tracking target according to the position information and the time information;
and the analysis module is used for analyzing the health index of the tracking target according to the motion state information and sending prompt information.
7. The apparatus of claim 6, wherein the obtaining module is configured to:
acquiring image information of the tracking target in a preset area;
and performing image processing on the image information to obtain the position information and the time information of the tracking target in the preset area, and associating and marking the position information and the identity label of the tracking target.
8. The apparatus of claim 6, wherein the generating module is configured to:
generating motion trail information of the tracking target according to the position information and the time information;
and generating the motion state information of the tracking target in a preset time period according to the motion track information.
9. The apparatus of claim 6, wherein the parsing module is configured to:
analyzing the motion state information to obtain the motion distance of the tracking target in a first preset time period;
judging whether the movement distance is within a preset distance threshold value;
if the movement distance is not within the preset distance threshold, sending out alarm information; and/or
The parsing module is configured to:
analyzing the motion state information to obtain the motion intensity of the tracking target in a second preset time period;
judging whether the motion intensity is within a preset intensity threshold value;
and if the motion intensity is not within the preset intensity threshold value, sending out alarm information.
10. An electronic device, comprising:
a memory to store a computer program;
a processor configured to perform the method of any one of claims 1 to 5 for monitoring a tracked target.
CN202010521316.0A 2020-06-10 2020-06-10 Target monitoring method, device and equipment Active CN111652911B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010521316.0A CN111652911B (en) 2020-06-10 2020-06-10 Target monitoring method, device and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010521316.0A CN111652911B (en) 2020-06-10 2020-06-10 Target monitoring method, device and equipment

Publications (2)

Publication Number Publication Date
CN111652911A (en) 2020-09-11
CN111652911B (en) 2023-07-28

Family

ID=72349107

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010521316.0A Active CN111652911B (en) 2020-06-10 2020-06-10 Target monitoring method, device and equipment

Country Status (1)

Country Link
CN (1) CN111652911B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106682572A (en) * 2016-10-12 2017-05-17 纳恩博(北京)科技有限公司 Target tracking method, target tracking system and first electronic device
CN107135983A (en) * 2017-05-05 2017-09-08 北京农业信息技术研究中心 Poultry health monitoring method, motion pin ring, server and system
CN108157222A (en) * 2018-02-13 2018-06-15 福建思特电子有限公司 A kind of artificial intelligence domestic animals supervisory terminal
CN109388669A (en) * 2018-08-31 2019-02-26 上海奥孛睿斯科技有限公司 The acquisition of internet-of-things terminal motion state and analysis system and method

Also Published As

Publication number Publication date
CN111652911B (en) 2023-07-28

Similar Documents

Publication Publication Date Title
Fuentes et al. Deep learning-based hierarchical cattle behavior recognition with spatio-temporal information
Ratnayake et al. Tracking individual honeybees among wildflower clusters with computer vision-facilitated pollinator monitoring
De Chaumont et al. Computerized video analysis of social interactions in mice
WO2020003310A1 (en) Monitoring livestock in an agricultural pen
KR102506029B1 (en) Apparatus and method for monitoring growing progress of livestock individual based on image
US20230260327A1 (en) Autonomous livestock monitoring
Ratnayake et al. Towards computer vision and deep learning facilitated pollination monitoring for agriculture
KR102584357B1 (en) Apparatus for identifying a livestock using a pattern, and system for classifying livestock behavior pattern based on images using the apparatus and method thereof
CA3230401A1 (en) Systems and methods for the automated monitoring of animal physiological conditions and for the prediction of animal phenotypes and health outcomes
Sun et al. A visual tracking system for honey bee (hymenoptera: Apidae) 3D flight trajectory reconstruction and analysis
CN113724250A (en) Animal target counting method based on double-optical camera
Guo et al. Video-based Detection and Tracking with Improved Re-Identification Association for Pigs and Laying Hens in Farms.
CN116824626A (en) Artificial intelligent identification method for abnormal state of animal
CN111652911A (en) Target monitoring method, device and equipment
CN113516139A (en) Data processing method, device, equipment and storage medium
Xu et al. Automatic quantification and assessment of grouped pig movement using the XGBoost and YOLOv5s models
CN111523472A (en) Active target counting method and device based on machine vision
Eagan et al. Behaviour Real-Time spatial tracking identification (BeRSTID) used for cat behaviour monitoring in an animal shelter
TWI789598B (en) Livestock abnormality monitoring system, method, computer program product, and computer readable recording medium
Alon et al. Machine vision-based automatic lamb identification and drinking activity in a commercial farm
JP7368915B1 (en) Bee trajectory tracking and analysis device and method using learning
Nasiri et al. An automated video action recognition-based system for drinking time estimation of individual broilers
Borah et al. Animal Motion Tracking in Forest: Using Machine Vision Technology
SAITOH et al. A Static Video Summarization Approach for the Analysis of Cattle's Movement
WO2023037397A1 (en) Dead fowl detection method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP02 Change in the address of a patent holder

Address after: 19/F, building B, Xingzhi science and Technology Park, 6 Xingzhi Road, Nanjing Economic and Technological Development Zone, Jiangsu Province, 210000

Patentee after: AINNOVATION (NANJING) TECHNOLOGY Co.,Ltd.

Address before: Floor 19, building B, Xingzhi science and Technology Park, 6 Xingzhi Road, Jiangning Economic and Technological Development Zone, Nanjing, Jiangsu Province

Patentee before: AINNOVATION (NANJING) TECHNOLOGY Co.,Ltd.