CN115941895A - Wearable grazing system with unmanned aerial vehicle assistance - Google Patents


Info

Publication number
CN115941895A
CN115941895A (application CN202211397920.2A)
Authority
CN
China
Prior art keywords
livestock
module
unmanned aerial
aerial vehicle
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211397920.2A
Other languages
Chinese (zh)
Other versions
CN115941895B (en)
Inventor
张喜海
王浩
陈泽瑞
郭锐超
王杨
张宇
宋伟先
李鸿博
孟繁锋
龚鑫晶
张茹雯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northeast Agricultural University
Original Assignee
Northeast Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northeast Agricultural University
Priority claimed from CN202211397920.2A
Publication of CN115941895A
Application granted
Publication of CN115941895B
Legal status: Active (granted)

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00: Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/70: Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in livestock or poultry

Landscapes

  • Traffic Control Systems (AREA)

Abstract

The invention discloses a wearable grazing system with unmanned aerial vehicle (UAV) assistance and relates to the technical field of livestock monitoring. The technical points of the invention include: each animal wears a small Internet of Things device comprising a Beidou module, a six-axis attitude sensor, a voice playing module, an NB-IoT module, and the like; in addition, UAV auxiliary equipment with image transmission and GPS is provided. The Internet of Things device monitors the posture and geographic position of the livestock. When an animal leaves the designated area, the worn device simulates the bark of a dog to drive the animal back; when the animal has not returned to the electronic fence for a long time, or a danger is detected, the UAV automatically flies to the alarm area based on the animal's geographic position. The invention combines multiple sensors to realize the functions of an electronic fence and an electronic shepherd dog; the UAV serves as auxiliary equipment under feedback control according to the condition of the livestock and flies to the alarm area only when needed, which avoids the battery drain and the disturbance to the livestock caused by long flight times.

Description

Wearable grazing system with unmanned aerial vehicle assistance
Technical Field
The invention relates to the technical field of livestock monitoring, in particular to a wearable grazing system with unmanned aerial vehicle assistance.
Background
In recent years, animal husbandry has entered a new development stage as a branch of agriculture, moving from traditional animal husbandry to modern animal husbandry. Grazing monitoring is of increasing interest, mainly due to the increasing population and the increasing demand for food and animal products; both of these factors have led to a rapid increase in the size of livestock enterprises worldwide. However, this growth also increases the difficulty of herdsmen in managing livestock. Therefore, it is important to determine how to implement intelligent monitoring.
The Internet of Things connects people and things; it is a large network formed by combining sensor, communication, and automation technologies. UAV technology has been widely applied to agricultural monitoring and other monitoring applications owing to its ease of operation, high efficiency, and safety. Applying these information technologies to intelligent grazing is therefore a pressing problem to be solved; these technologies will be the key to implementing monitoring and alarm functions.
Currently, there is much research on applying new technologies to animal husbandry and realizing intelligent management. Wang et al. [1] developed a Wireless Sensor Network (WSN) system to monitor the feeding and drinking behavior of animals and proposed an energy-efficient mesh routing strategy to aggregate the monitoring data, demonstrating the novelty and feasibility of such a wireless monitoring system. Walker et al. [2] presented a low-cost telemetry system based on common Radio Frequency Identification (RFID) technology for continuous monitoring of physiological signals in small animals. Tsuichihara et al. [3] studied farm management systems based on drones and the Global Positioning System (GPS); their system uses a UAV to photograph the grassland and locates the livestock through GPS. In addition, Barbedo et al. [4,5] used a UAV to capture animal images and then applied deep learning methods to extract relevant information. Wearable devices collecting sensing data are an effective way to monitor livestock behavior, but the functions in current research are relatively simple and still require manual supervision; UAVs are also good monitoring tools, but long flights lead to endurance problems. Most prior work uses either Internet of Things or UAV technology alone to monitor animals, and research combining the two is lacking.
Drones are increasingly seen as valuable tools to aid farm management, but research on drone-assisted grazing is still in its infancy. The two main obstacles to an unmanned grazing system are: (1) the lack of practical drone-animal interaction and of a suitable drone-grazing platform [6]; and (2) the power consumption caused by long flight times.
Disclosure of Invention
To this end, the present invention proposes a wearable grazing system with drone assistance in an attempt to solve or at least alleviate at least one of the problems presented above.
A wearable grazing system with unmanned aerial vehicle assistance comprises a data acquisition and processing end, a cloud end, a user end, and an unmanned aerial vehicle end; wherein:
the data acquisition and processing end comprises a core controller, an attitude sensor, a Beidou positioning module, a voice playing module, and a first data transmission module; the attitude sensor acquires livestock attitude data, and the Beidou positioning module acquires livestock position data; the core controller judges whether the livestock is inside the electronic fence according to the position data and whether the livestock is in a feeding, running, or falling state according to the attitude data, sends an alarm instruction to the voice playing module when the livestock is outside the electronic fence, and sends the livestock position and attitude data to the cloud end through the first data transmission module; the attitude data carries livestock status information, including the feeding, running, or falling state;
the cloud end, after receiving livestock attitude data carrying livestock status information, judges whether the livestock is in a normal feeding state according to a preset number of periodic feedings and whether it is in a normal running state according to a preset time threshold; if the livestock is judged to be in an abnormal running state or a falling state, the cloud end sends a control instruction and the corresponding livestock position data to the unmanned aerial vehicle end, and sends warning information and the corresponding livestock position data to the user end; the warning information comprises abnormal-running, abnormal-feeding, and falling warnings;
the unmanned aerial vehicle end comprises an unmanned aerial vehicle, a main controller, a GPS module, an image acquisition module, an image transmission module, and a wireless communication module; the wireless communication module receives the control instruction and livestock position data sent by the cloud end; the main controller passes the livestock position data to the GPS module so that the unmanned aerial vehicle flies to the livestock's position, and sends an image acquisition instruction to the image acquisition module after arrival; upon receiving the instruction, the image acquisition module starts a camera to capture video containing the livestock and streams it to the user end in real time through the image transmission module.
Furthermore, the voice playing module emits a simulated dog bark after receiving the alarm instruction.
Further, the livestock attitude data comprises the yaw angle, roll angle, pitch angle, yaw rate, roll rate, and pitch rate; the core controller judges whether the livestock is feeding, running, or falling from the attitude data as follows:
1) Feeding judgment
A single feeding event requires that two conditions hold: a. the pitch angle drops below the first preset threshold and then automatically recovers to the initial state; b. the pitch rate changes from its positive maximum to its negative maximum;
2) Running judgment
The condition is that the pitch rate exceeds the second preset threshold multiple times;
3) Falling judgment
The following conditions must hold simultaneously: c. the roll angle is smaller than the third preset threshold; d. the roll rate is smaller than the fourth preset threshold.
Furthermore, the user end displays the video returned from the unmanned aerial vehicle end, the warning information and livestock position data sent by the cloud end, and the livestock attitude data.
Furthermore, the user end also allows input of the first, second, third, and fourth preset thresholds, the time threshold, and the number of periodic feedings.
Further, the first data transmission module is an NB-IoT module, and the communication protocol between the NB-IoT module and the cloud end is the Modbus protocol.
Further, the attitude sensor is an MPU6050 module, and the MPU6050 module has a three-axis accelerometer and a three-axis gyroscope.
Further, the first preset threshold is 45 °, the second preset threshold is 40 °/s, the third preset threshold is 50 °, the fourth preset threshold is 5 °/s, and the time threshold is 60s.
The beneficial technical effects of the invention are as follows:
the invention provides a monitoring system, and livestock only need to wear one small Internet of things device, and the monitoring system is composed of a Beidou module, a six-axis attitude sensor, a voice playing module, an NB-IoT and the like. In addition, the unmanned aerial vehicle auxiliary equipment with image transmission and GPS is provided. The Internet of things equipment is responsible for monitoring the postures and the geographic positions of the livestock. When the livestock leaves the designated area, the worn device can simulate the sound of barking of a dog to drive the livestock, thereby realizing the functions of the electronic fence and the electronic shepherd dog. When the livestock does not return to the electronic fence for a long time or danger is detected, no one can automatically go to an alarm area according to the geographic position of the livestock. The invention combines a plurality of sensors to realize the functions of the electronic fence and the electronic shepherd dog and testify the practical feasibility; secondly, unmanned aerial vehicle is as auxiliary assembly, will carry out feedback control according to the livestock condition, and unmanned aerial vehicle only just can go the warning region when needs, has avoided that unmanned aerial vehicle flight time is long and the battery power that causes is low and to the interference problem of livestock.
Drawings
The present invention may be better understood by reference to the following description taken in conjunction with the accompanying drawings, which are incorporated in and form a part of this specification, and which are used to further illustrate preferred embodiments of the present invention and to explain the principles and advantages of the present invention.
Fig. 1 is a schematic diagram of the general layout of a wearable grazing system with drone assistance according to an embodiment of the present invention;
FIG. 2 is a hardware diagram of an embedded system in an embodiment of the invention;
FIG. 3 is a flow chart of the software design for the electric fence and the electric shepherd according to the embodiment of the present invention;
FIG. 4 is a flow chart of livestock attitude monitoring in an embodiment of the present invention;
FIG. 5 is an exemplary diagram of the data display interface in an embodiment of the invention, wherein (a) corresponds to the PC end and (b) to the mobile phone end;
FIG. 6 is a schematic structural diagram of an auxiliary layer device in an embodiment of the present invention;
FIG. 7 is a schematic three-dimensional coordinate system of an apparatus in an embodiment of the invention;
FIG. 8 is an exemplary diagram of a feeding monitoring result according to an embodiment of the present invention, in which the vertical axes on both sides correspond to different curves, respectively, and the horizontal line is a reference line of 0 value of the right vertical axis;
FIG. 9 is an exemplary diagram of a running monitoring result according to an embodiment of the present invention, wherein the longitudinal axes on two sides respectively correspond to different curves;
fig. 10 is an exemplary diagram of fall monitoring results in an embodiment of the invention, in which the longitudinal axes on both sides correspond to different curves respectively;
FIG. 11 is an exemplary diagram of a stability test result of a Beidou system positioning module according to the embodiment of the present invention;
fig. 12 is a diagram illustrating an example of a stability test result of autonomous flight of the unmanned aerial vehicle in the embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the disclosure, exemplary embodiments or examples of the disclosure are described below with reference to the accompanying drawings. It is to be understood that the disclosed embodiments or examples are only some, but not all embodiments or examples of the invention. All other embodiments or examples obtained by a person of ordinary skill in the art based on the embodiments or examples of the present invention without any creative effort shall fall within the protection scope of the present invention.
The embodiment of the invention provides a wearable grazing system with unmanned aerial vehicle assistance, comprising a data acquisition and processing end, a cloud end, a user end, and an unmanned aerial vehicle end; wherein:
the data acquisition and processing end comprises a core controller, an attitude sensor, a Beidou positioning module, a voice playing module, and a first data transmission module, and can be mounted on the livestock, for example on the animal's neck; the attitude sensor acquires livestock attitude data, and the Beidou positioning module acquires livestock position data; the core controller judges whether the livestock is inside the electronic fence according to the position data and whether the livestock is in a feeding, running, or falling state according to the attitude data, sends an alarm instruction to the voice playing module when the livestock is outside the electronic fence, and sends the livestock position and attitude data to the cloud end through the first data transmission module; the attitude data carries livestock status information, including the feeding, running, or falling state;
the cloud end, after receiving livestock attitude data carrying livestock status information, judges whether the livestock is in a normal feeding state according to a preset number of periodic feedings and whether it is in a normal running state according to a preset time threshold; if the livestock is judged to be in an abnormal running state or a falling state, the cloud end sends a control instruction and the corresponding livestock position data to the unmanned aerial vehicle end, and sends warning information and the corresponding livestock position data to the user end; the warning information comprises abnormal-running, abnormal-feeding, and falling warnings;
the unmanned aerial vehicle end comprises an unmanned aerial vehicle, a main controller, a GPS module, an image acquisition module, an image transmission module, and a wireless communication module; the wireless communication module receives the control instruction and livestock position data sent by the cloud end; the main controller passes the livestock position data to the GPS module so that the unmanned aerial vehicle flies to the livestock's position, and sends an image acquisition instruction to the image acquisition module after arrival; upon receiving the instruction, the image acquisition module starts the camera to capture video containing the livestock and streams it to the user end in real time through the image transmission module.
In this embodiment, preferably, the voice playing module emits a simulated dog bark after receiving the alarm instruction.
In this embodiment, preferably, the livestock attitude data includes the yaw angle, roll angle, pitch angle, yaw rate, roll rate, and pitch rate; the core controller judges whether the livestock is feeding, running, or falling from the attitude data as follows:
1) Feeding judgment
A single feeding event requires that two conditions hold: a. the pitch angle drops below the first preset threshold and then automatically recovers to the initial state; b. the pitch rate changes from its positive maximum to its negative maximum;
2) Running judgment
The condition is that the pitch rate exceeds the second preset threshold multiple times;
3) Falling judgment
The following conditions must hold simultaneously: c. the roll angle is smaller than the third preset threshold; d. the roll rate is smaller than the fourth preset threshold.
In this embodiment, preferably, the user end displays the video returned from the unmanned aerial vehicle end, the warning information and livestock position data sent by the cloud end, and the livestock attitude data.
In this embodiment, preferably, the user side is further configured to input a first preset threshold, a second preset threshold, a third preset threshold, a fourth preset threshold, a time threshold, and a number of times of periodic eating.
In this embodiment, preferably, the first data transmission module is an NB-IoT module, and a communication protocol between the NB-IoT module and the cloud is a Modbus protocol.
In this embodiment, preferably, the attitude sensor is an MPU6050 module, and the MPU6050 module has a three-axis accelerometer and a three-axis gyroscope.
In this embodiment, preferably, the first preset threshold is 45 °, the second preset threshold is 40 °/s, the third preset threshold is 50 °, the fourth preset threshold is 5 °/s, and the time threshold is 60s.
In this embodiment, the number of periodic feedings is preferably related to the weight of the livestock and can be modified according to the actual situation, for example 500 times/day.
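Under the preferred thresholds above (45°, 40°/s, 50°, 5°/s), the three posture judgments can be sketched in C. The patent does not specify sign conventions, recovery margins, or helper names, so the head-down-is-negative convention, the −10° recovery margin, and the function names below are illustrative assumptions, not the patent's own code:

```c
#include <math.h>
#include <stdbool.h>

/* Illustrative thresholds from the preferred embodiment. */
#define PITCH_FEED_DEG   45.0f   /* first preset threshold, degrees  */
#define PITCH_RATE_RUN   40.0f   /* second preset threshold, deg/s   */
#define ROLL_FALL_DEG    50.0f   /* third preset threshold, degrees  */
#define ROLL_RATE_FALL    5.0f   /* fourth preset threshold, deg/s   */

/* Count single feeding events: head dips below -45 deg, then recovers. */
int count_feedings(const float *pitch_deg, int n)
{
    int count = 0;
    bool head_down = false;
    for (int i = 0; i < n; i++) {
        if (!head_down && pitch_deg[i] < -PITCH_FEED_DEG)
            head_down = true;                 /* head lowered to graze */
        else if (head_down && pitch_deg[i] > -10.0f) {
            head_down = false;                /* head back up: one feeding */
            count++;
        }
    }
    return count;
}

/* Running: |pitch rate| exceeds the threshold at least min_hits times. */
bool is_running(const float *pitch_rate, int n, int min_hits)
{
    int hits = 0;
    for (int i = 0; i < n; i++)
        if (fabsf(pitch_rate[i]) > PITCH_RATE_RUN)
            hits++;
    return hits >= min_hits;
}

/* Falling: large roll angle held while the roll rate is near zero. */
bool is_fallen(float roll_deg, float roll_rate)
{
    return fabsf(roll_deg) > ROLL_FALL_DEG && fabsf(roll_rate) < ROLL_RATE_FALL;
}
```

The feeding counter is stateful because a feeding is a dip-and-recover cycle, while the falling check is a snapshot of a sustained, motionless lying posture.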
To address the above problems of using drones in grazing systems, another embodiment of the invention provides a wearable grazing system with unmanned aerial vehicle assistance, i.e., a UAV-assisted grazing monitoring system, composed of wearable embedded equipment and a UAV. First, the system applies NB-IoT technology to realize data transmission between the embedded device and the remote server; this technology is characterized by low power consumption and wide coverage. Second, the geographic position of the livestock is acquired using the BeiDou Navigation Satellite System (BDS). The Beidou system adopts triple-frequency positioning and offers high positioning accuracy, reliability, and anti-interference capability. Furthermore, the system uses a six-axis attitude sensor to monitor the behavior of the animal. The sensor integrates a three-axis accelerometer and a three-axis gyroscope; it can monitor acceleration and acquire angles and angular velocities, greatly improving the accuracy of behavior monitoring. Finally, the drone acts as an auxiliary device: when an alarm is triggered, the herdsman can choose to send the UAV to the designated area and then analyze the cause of the alarm from the real-time images returned from the alarm area.
1. General structure of monitoring system
The system employs a modular approach. The whole structure is divided into four layers, namely an embedded system layer, a data transmission layer, a data display layer and an auxiliary layer. The general layout of the system is shown in figure 1.
The main function of the embedded-system layer is to collect information about the livestock, including attitude, longitude, and latitude. It also simulates the sound of a shepherd dog to repel livestock and transmits data to a remote server. The hardware composition of the embedded system is shown in Fig. 2. The main function of the data display layer is to display the data transmitted from the embedded-system layer on a mobile phone or web page for the herdsman to view. The main function of the data transmission layer is to transmit the data from the embedded device to the data display layer. The main function of the auxiliary layer is that, when the herdsman receives a Short Message Service (SMS) or WeChat early warning, the quad-rotor UAV equipped with the image transmission device flies to the designated area according to the longitude and latitude. The herdsman can then view the images in real time or control the UAV to drive the livestock.
The modular structure makes the grazing monitoring system easier to manage, and each layer cooperates with other layers to complete the functions required by the whole system.
2. Design of embedded system layer
1. Hardware model selection
1) Core controller
In selecting the core processor, the overall power consumption and the functional requirements of the system were considered; the STM32F103ZET6 (ARM Cortex-M3 core) was finally chosen as the main control chip. The chip has low power consumption, with a working frequency of up to 72 MHz. Its advantages are a rich set of I/O interfaces, timers, and serial ports, allowing multiple external devices to be connected simultaneously, which makes it satisfy the system requirements.
2) Attitude sensor
The MPU6050 module is selected as the sensor for the animal-attitude-detection section of the system. The module has a three-axis accelerometer and a three-axis gyroscope, so behavior can be monitored more accurately. In addition, it includes a Digital Motion Processor (DMP) that can directly output Euler angles without complex filtering and data-fusion processing, reducing the workload of the main control chip and the difficulty of system development. To make the attitude monitoring still more accurate, a simple Kalman filtering algorithm is provided. After the DMP completes the pose calculation, the system triggers an external interrupt, and the data are then output over I2C communication. The maximum transmission frequency is 400 kHz, ensuring the real-time performance of the system.
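The "simple Kalman filtering algorithm" is named but not given in the description. A minimal one-dimensional sketch for smoothing a single angle channel, with struct and function names of our own rather than the patent's, might look like:

```c
/* Minimal scalar Kalman filter for smoothing one Euler-angle channel. */
typedef struct {
    float x;  /* state estimate (e.g. pitch, degrees) */
    float p;  /* estimate variance                    */
    float q;  /* process noise variance               */
    float r;  /* measurement noise variance           */
} kalman1d_t;

void kalman1d_init(kalman1d_t *k, float x0, float p0, float q, float r)
{
    k->x = x0; k->p = p0; k->q = q; k->r = r;
}

float kalman1d_update(kalman1d_t *k, float z)
{
    k->p += k->q;                       /* predict: variance grows       */
    float gain = k->p / (k->p + k->r);  /* Kalman gain, always in (0, 1) */
    k->x += gain * (z - k->x);          /* correct toward measurement z  */
    k->p *= 1.0f - gain;                /* variance shrinks after update */
    return k->x;
}
```

Tuning q low relative to r makes the filter trust its prediction and smooth out gyro/accelerometer noise; raising q makes it track fast posture changes more quickly.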
3) Beidou positioning module
To realize the electronic-fence function that confines livestock to a designated area, the geographic location of the livestock must be acquired. Common positioning methods include base-station positioning and satellite positioning. The former is generally used for indoor positioning; it estimates distance from measured signal strength, so positioning is fast, but the signal strength is easily disturbed by external signals and the accuracy is low. The latter links multiple satellites simultaneously to provide accurate position information for the receiver; it offers high positioning accuracy over a wide range and meets the design-accuracy requirement. Therefore, a Beidou module, model ATGM336H-5N-21, is used to locate the livestock.
The Beidou system has the advantages of high sensitivity, low power consumption, and low cost, and is widely applied in many fields. The data transmission format of the ATGM336H is similar to that of GPS and adopts the NMEA-0183 international standard. This format transmits ASCII characters, with the information output through a UART as the primary channel. The default baud rate is 9600 bps.
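As background on consuming NMEA-0183 output like the ATGM336H's, a parser typically needs two routines: the XOR checksum over the bytes between '$' and '*', and the conversion of the (d)ddmm.mmmm coordinate fields to decimal degrees. Function names here are ours, for illustration:

```c
/* NMEA-0183 checksum: XOR of every byte between the leading '$'
 * and the '*' that precedes the two checksum hex digits. */
unsigned char nmea_checksum(const char *sentence)
{
    unsigned char cs = 0;
    for (const char *p = sentence + 1; *p != '\0' && *p != '*'; p++)
        cs ^= (unsigned char)*p;
    return cs;
}

/* Convert an NMEA (d)ddmm.mmmm latitude/longitude field to decimal
 * degrees: the last two integer digits are minutes. */
double nmea_to_degrees(double ddmm)
{
    int deg = (int)(ddmm / 100.0);
    return deg + (ddmm - deg * 100.0) / 60.0;
}
```

For example, the field 4530.0 means 45° 30.0', i.e. 45.5 decimal degrees, which is the form the fence comparison and the server upload would use.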
Continuous positioning consumes a lot of energy, so the Beidou module is configured in a sleep mode and woken when needed, which is a good way to reduce power consumption. There are two ways to wake it. The first is to wake it periodically via the RTC clock; after the geographic position of the livestock is obtained, the system judges whether the animal is in the designated area. If so, the module re-enters sleep mode; if not, it continues to acquire the animal's position. Because the UAV must fly to the alarm area according to the animal's geographic position, the module is also woken when the livestock's posture is abnormal.
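The wake/sleep policy just described can be condensed into a small state-transition function; the enum and function names are illustrative, not from the patent:

```c
#include <stdbool.h>

typedef enum { BDS_SLEEP, BDS_ACTIVE } bds_mode_t;

/* One step of the wake/sleep policy: wake on the periodic RTC alarm or
 * on a posture alarm; go back to sleep only once the animal is
 * confirmed inside the electronic fence. */
bds_mode_t bds_next_mode(bds_mode_t mode, bool rtc_alarm,
                         bool posture_alarm, bool inside_fence)
{
    if (mode == BDS_SLEEP)
        return (rtc_alarm || posture_alarm) ? BDS_ACTIVE : BDS_SLEEP;
    /* active: keep positioning until the animal is back inside */
    return inside_fence ? BDS_SLEEP : BDS_ACTIVE;
}
```

Keeping the policy in one pure function like this makes it easy to call from both the RTC interrupt handler and the posture-alarm path.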
4) Voice playing module
This embodiment aims to simulate the sound of a shepherd dog to drive livestock, so the ISD1820 voice playing module is selected to realize the function. The ISD1820 voice chip, developed by ISD in 2001, can record and play back a single segment of 8-20 seconds. Playback is triggered by edge or level; the working voltage is 3-5 V, and an automatic power-saving function keeps the standby current at 0.2 μA. The module is implemented in CMOS technology and comprises an oscillator, automatic gain control, an anti-aliasing filter, a loudspeaker driver, and a flash memory array. Compared with other similar record-and-playback systems, it has a simple structure and high running speed.
5) Data transmission module
The communication of the system mainly consists of data transmission between the embedded device and the cloud. The embedded system uploads the acquired data to the remote server, where the specific functions are then realized. The cloud stores data, generates history records, reports equipment problems, adjusts thresholds, and sends messages to managers. To provide the device with network communication, an NB-IoT module, model WH-NB73, was selected.
First, compared with short-range wireless technologies such as Bluetooth and ZigBee, the module supports 50-100 times more access connections, reflecting its strong-link characteristic. Second, the module's coverage is wide: compared with LTE, the gain is improved by 20 dB, i.e., the area coverage is increased by 100 times, reflecting its high-coverage characteristic. Finally, the whole module has low power consumption, with a standby time of up to 10 years. These features make it possible for one person to manage objects across multiple regions, reducing the consumption of human resources.
The firmware used in the system is of the transparent-transmission type, and the supported frequency bands are B5 and B8. The module uses a dedicated 4G IoT SIM card and supports multiple operators. The working voltage is 3.1-4.2 V, the data interface is UART, and the communication protocol is Modbus RTU.
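Every Modbus RTU frame ends with a CRC-16/Modbus checksum (initial value 0xFFFF, reflected polynomial 0xA001, low byte transmitted first) computed over the address, function code, and data bytes. A standard implementation, offered here as background rather than as the patent's own code, is:

```c
#include <stdint.h>
#include <stddef.h>

/* CRC-16/Modbus: init 0xFFFF, reflected polynomial 0xA001.
 * Appended (low byte first) to every Modbus RTU frame. */
uint16_t modbus_crc16(const uint8_t *data, size_t len)
{
    uint16_t crc = 0xFFFF;
    for (size_t i = 0; i < len; i++) {
        crc ^= data[i];                     /* fold in next byte        */
        for (int bit = 0; bit < 8; bit++)   /* process bit by bit       */
            crc = (crc & 1u) ? (crc >> 1) ^ 0xA001 : crc >> 1;
    }
    return crc;
}
```

The standard check value for the ASCII string "123456789" is 0x4B37, which is a convenient self-test when bringing up the NB-IoT link.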
2. Software design
Under the Windows operating system, the software for the host controller is written in C using Keil uVision 5. After successful compilation, the program is flashed to the microcontroller through an ST-LINK/V2. The main purpose of the program is to control the multiple sensors through the main control chip and collect the data to be sent to the cloud. The program also implements, according to the functional requirements, software designs such as electronic-fence monitoring, abnormal-posture detection, and the electronic shepherd dog.
1) Software design scheme of electronic fence and electronic shepherd dog
The software design of the electronic fence and electronic shepherd dog comprises main-program initialization, a UART communication mode, and timer service. First, the main program initializes the main control chip, the Beidou positioning module, the NB-IoT module, and so on. When the RTC alarm clock reaches the set time, the Beidou module wakes from low-power mode. The system then loops, checking whether the Beidou positioning module has acquired a signal; this is a critical step. If the module has no signal, the system waits for one and lights an LED. Once a signal is received, the system judges whether the longitude and latitude are within the specified area. If so, the alarm clock's wake-up time is reset and the Beidou module re-enters sleep. If not, the embedded device sends the data to the server through the NB-IoT module and outputs a high level to the voice playing module. A UART communication mode is used between the main control chip and the NB-IoT module. Finally, the server sends alarm information to the herdsman. The flow of this design is shown in Fig. 3.
2) Software design scheme for monitoring livestock posture
The design scheme for livestock posture monitoring is similar to that of the electronic fence and the electronic shepherd dog: when the monitored data exceeds a specified threshold, the device sends the data to the server through the NB-IoT module, and the server then sends alarm information to the herdsman. The difference is that I2C serial communication is used between the microcontroller and the sensor. Data collected by the attitude sensor is first filtered, and the quaternions are then converted into Euler angles. Only when the livestock runs abnormally and falls down does the system judge that the livestock is in danger, which avoids false alarms. At that point, the Beidou module is woken from low-power mode. The process of monitoring the posture of the livestock is shown in FIG. 4.
3. Design of data display layer
The data display layer consists of a mobile phone end and a Web end. The mobile phone end requires Android 4.0 or above as the software environment and a handset with a dual-core CPU of 2 GHz or more as the hardware environment; the programming language used is Java. The design of the Web end comprises building the server environment, creating a MySQL database and building the Web terminal, with the pages written in HTML, JS and related languages. First, the server environment is built on an Alibaba Cloud server. Second, because the hardware terminal of the system must collect livestock information and upload data in real time, the MySQL database is installed on the Alibaba Cloud server to store the hardware-terminal data, which facilitates access by the Web terminal and provides effective data support for the system. Finally, the Web end is built on Tomcat, a Servlet container developed by Apache that plays the role of a Web server. The data display layer allows managers to check the longitude and latitude, posture, feeding state and other information of the livestock. The data display interface is shown in FIG. 5.
4. Design of auxiliary layer
The auxiliary layer of the grazing monitoring system consists of a quad-rotor unmanned aerial vehicle and video transmission equipment. The unmanned aerial vehicle is equipped with a GPS and a data transmission device, and has an ultrasonic obstacle-avoidance function that protects it from low-altitude trees and power lines. PID control algorithms are written for flight control. After take-off, the aircraft hovers and records the longitude and latitude of the starting point, then waits for the manager to transmit, via the data transmission equipment, the livestock longitude and latitude returned by the embedded system. After receiving the instruction, the unmanned aerial vehicle automatically flies to the designated area using GPS. The drone then transmits back images of the grazing area so the herdsman can check the condition of the livestock. For image transmission, equipment produced by DJI using the OcuSync image-transmission technology is adopted, giving a transmission distance of up to 10 kilometers. Compared with the Lightbridge and Wi-Fi image-transmission technologies, OcuSync offers stable transmission, clear images, interference resistance and low latency. The herdsman can view the images shot by the unmanned aerial vehicle in real time through the mobile phone APP, or control the unmanned aerial vehicle to drive the livestock. After the unmanned aerial vehicle finishes its task, the administrator can start the one-key-return function and the unmanned aerial vehicle automatically flies back to the take-off point. The unmanned aerial vehicle auxiliary equipment is not only convenient to operate but also saves human resources. The auxiliary-layer device is shown in FIG. 6. Limited range, however, has long been a problem for drone applications.
Because image transmission requires high power, a switch circuit is designed to prolong flight time and alleviate the limited battery capacity. The power consumption of image transmission is related to the transmission distance: the farther the distance, the greater the power consumption. During autonomous flight the image transmission equipment is switched off, and it is automatically switched on after the destination is reached, saving battery power. With a battery capacity of 4920 mAh, the longest endurance of the unmanned aerial vehicle is 50 minutes, which meets the requirements of grazing tasks.
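The switch-circuit policy amounts to a small piece of flight-state logic: gate the image link off while in transit and on only over the target. A sketch under the assumption that the flight controller exposes a flight-phase state (the enum names are illustrative, not from the patent):

```c
/* Power gating for the image-transmission link: keep the high-power
   OcuSync link off during autonomous transit and return, and enable it
   only while hovering over the target area. */
typedef enum { PHASE_TAKEOFF, PHASE_TRANSIT, PHASE_AT_TARGET, PHASE_RETURN } FlightPhase;

int image_link_enabled(FlightPhase phase) {
    return phase == PHASE_AT_TARGET;  /* transmit only at the destination */
}
```

The flight controller would call this on each phase change and drive the switch circuit (e.g. a MOSFET on the transmitter's supply rail) accordingly.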
The technical effect of the invention is further verified through experiments.
In the experiment, three sets of equipment (the data acquisition and processing end) were hung on the necks of three adult bulls for parameter monitoring. The system monitors falling, running and feeding by using the Euler angles and angular velocities acquired by the MPU6050 module. The three-dimensional coordinate system of the device is shown in FIG. 7. The orientation of the coordinate axes is determined by the mounting angle of the sensor; therefore, the inclination angle when the animals stand upright is not 0° but 90°. In addition, the inclination angles are all positive and decrease gradually from 90°, while the angular velocity takes both positive and negative values. According to the set monitoring thresholds and judgment rules, the three devices alarmed 143 times; 15 of the alarms were caused by human intervention, and the accuracy was 97.2%. The specific test results and analysis are as follows.
First, the feeding behavior of the livestock was monitored and a line graph was generated, as shown in FIG. 8. The boxed area in the figure is the posture-change curve while the livestock feeds. The positive half-axis represents the animal lowering its head to feed, and the negative half-axis represents the animal raising its head after feeding. The pitch angle decreases to about 41° and then reverts to about 90°. It can also be seen from the figure that the inclination angle is essentially unchanged when the animals are not feeding. The feeding count is judged by two conditions. The first condition is that the angular velocity of the animal changes from a positive-half-axis maximum to a negative-half-axis maximum. The second condition is that the pitch angle is less than 45°. When both conditions are met, the system counts one feeding event.
Then, several sets of posture data were collected while the animal was running, as shown in FIG. 9. As can be seen from the figure, the angular velocity fluctuates up and down, and the minimum pitch angle is not less than 45°, so running does not affect the feeding count. The angular velocity threshold is set to 40°/s. When the pitch angular velocity exceeds this threshold a number of times, the system judges that the animal is running.
Finally, the posture data when the animal fell was monitored, as shown in FIG. 10. As can be seen from the figure, the roll angle decreases when the animal falls, but does not drop to 0°. When the roll angle is less than 50° and the roll angular velocity is less than 5°/s, the system judges that the animal has fallen.
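The three judgment rules above (feeding, running, falling) can be collected into a single classifier. The sketch below uses the thresholds quoted in the text; the `high_rate_count` input for the "multiple times" running rule and the minimum count of 3 are assumptions for illustration:

```c
#include <math.h>

/* Thresholds from the text: feed below 45° pitch, run above 40°/s pitch
   rate, fall below 50° roll with roll rate under 5°/s. */
#define PITCH_FEED_DEG 45.0
#define ROLL_FALL_DEG  50.0
#define ROLL_RATE_DPS   5.0
#define RUN_COUNT_MIN   3   /* "multiple times" -- assumed value */

typedef enum { POSE_NORMAL, POSE_FEEDING, POSE_RUNNING, POSE_FALLEN } PoseState;

/* high_rate_count: times the pitch rate exceeded 40°/s in the window;
   rate_sign_flipped: pitch rate swung from positive max to negative max. */
PoseState classify_pose(double pitch_deg, double pitch_rate_dps,
                        double roll_deg, double roll_rate_dps,
                        int high_rate_count, int rate_sign_flipped) {
    (void)pitch_rate_dps;  /* the rate itself enters via the two flags */
    if (roll_deg < ROLL_FALL_DEG && fabs(roll_rate_dps) < ROLL_RATE_DPS)
        return POSE_FALLEN;
    if (high_rate_count >= RUN_COUNT_MIN && pitch_deg >= PITCH_FEED_DEG)
        return POSE_RUNNING;
    if (pitch_deg < PITCH_FEED_DEG && rate_sign_flipped)
        return POSE_FEEDING;
    return POSE_NORMAL;
}
```

The fall test is checked first, matching the text's priority of danger states over the feeding count.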
Further, a stability test of the Beidou positioning system was carried out.
To verify the positioning accuracy and signal-acquisition ability of this design, three locations where most herdsmen often graze were chosen for testing: grassland, wasteland and mountain forest. At each site, 50 experimental trials were performed; the results are shown in FIG. 11. The averages of the 50 groups of data were calculated, and the results are recorded in Table 1. The positioning accuracy was tested after each signal test, since the accuracy test is only meaningful when the device has a signal.
Table 1
Location          Average error (m)    Average positioning time (s)
Grassland         1.841                12
Wasteland         2.551                13
Mountain forest   6.573                33
The average errors at these three locations were 1.841 m, 2.551 m and 6.573 m, respectively, and the average positioning times were 12 s, 13 s and 33 s. The results on grassland and wasteland are better; in the mountain forest, because of signal interference, the positioning module failed to find a signal in two of the tests and the positioning deviation was larger. Although the error there is large, the target remains within view and monitoring is still possible. Therefore, under normal conditions, the positioning module is stable, adapts to the current grazing environment, and meets the herdsmen's grazing requirements.
Further, feasibility tests of electronic fences and electronic shepherd dogs were performed.
These functions were tested for fifteen days while the animals were trained, and their responses to the sound were recorded over time, as shown in Table 2. The animals wearing the devices were driven to a designated area for testing; when they walked out of the area, the devices emitted a sound to stimulate them. During the first four days, the sound had no driving effect, and the animals had to be driven back manually so that they could form a memory. By about day 9 the experimental animals began to respond slightly, showing memory of the sound, although on days 6 and 10 they did not respond to the stimulus. On day 13, the animals responded clearly to the sound and returned by themselves when they crossed the designated boundary. On day 14, the animals responded to the sound but did not return to the designated area. As the table entries show, the animals gradually adapted to the stimulus over the long training period, demonstrating the feasibility of the electronic fence and the electronic shepherd dog.
Table 2
[Table 2: daily response of the animals to the sound stimulus over the 15-day training period; provided as an image in the original document.]
Further, a stability test of autonomous flight of the unmanned aerial vehicle was carried out.
To verify the flight stability of the unmanned aerial vehicle, the Northeast Agricultural University test field was selected for data transmission and image transmission tests, 20 experiments in total. The test pattern is shown in FIG. 12. Commands were sent from the ground station to the aircraft via the data transmission device, and all 20 experiments succeeded. In the image transmission test, however, the farthest flight distance was about 8 km at a flight height of about 200 m; images could still be received, but the remote-control signal was weak. To avoid losses the drone was forced to return automatically, yet because of insufficient battery power it could not fly back to the take-off point and was forced to land. Testing showed that at distances up to 6 km the image transmission is stable and the battery meets the return-home requirement. In three flights, image stutter or delay occurred because of signal interference and antenna problems at the receiving device; after the antenna was adjusted and the flight height reduced to about 150 m, the picture gradually became stable again.
In conclusion, to address the single-function limitation of existing Internet of Things devices and the endurance problem of long-duration UAV flight, wearable grazing equipment and an unmanned aerial vehicle are combined. The wearable equipment implements livestock abnormal-posture monitoring, the electronic fence, the electronic shepherd dog and other functions; the unmanned aerial vehicle automatically flies to the early-warning area according to the information fed back by the livestock-worn equipment and automatically returns after the task is completed. System tests and experimental analysis show that the monitoring and early-warning functions perform well, that the data transmission and auxiliary equipment are stable, and that the unmanned aerial vehicle and the wearable equipment achieve coordinated control. The design of the monitoring system is therefore feasible. This system fills the gap left by existing research that relies only on wearable equipment or only on an unmanned aerial vehicle, innovatively combining both. Combining the unmanned aerial vehicle with the Internet of Things can help herdsmen manage livestock efficiently, reduce human-resource consumption, and make herding intelligent and scientific.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having the benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. The disclosure is to be considered illustrative and not restrictive, and the scope of the invention is defined by the appended claims.
The documents cited in the present invention are as follows:
[1] H. Wang, A. O. Fapojuwo, and R. J. Davies, "A wireless sensor network for feedlot animal health monitoring," IEEE Sensors Journal, vol. 16, no. 16, pp. 6433-6446, 2016.
[2] T. Volk et al., "RFID technology for continuous monitoring of physiological signals in small animals," IEEE Transactions on Biomedical Engineering, vol. 62, no. 2, pp. 618-626, 2015.
[3] S. Tsuichihara et al., "Drone and GPS sensors-based grassland management using deep-learning image segmentation," in 2019 Third IEEE International Conference on Robotic Computing (IRC), 2019, pp. 608-611.
[4] J. G. A. Barbedo, L. V. Koenigkan, T. T. Santos, and P. M. Santos, "A Study on the Detection of Cattle in UAV Images Using Deep Learning," Sensors, vol. 19, no. 24, 2019.
[5] M. L. Zhou et al., "Improving Animal Monitoring Using Small Unmanned Aircraft Systems (sUAS) and Deep Learning Networks," Sensors, vol. 21, no. 17, 2021.
[6] X. H. Li, H. L. Huang, A. V. Savkin, and J. Zhang, "Robotic Herding of Farm Animals Using a Network of Barking Aerial Drones," Drones, vol. 6, no. 2, 2022.

Claims (8)

1. A wearable grazing system with unmanned aerial vehicle assistance, characterized by comprising a data acquisition and processing end, a cloud end, a user end and an unmanned aerial vehicle end; wherein:
the data acquisition and processing end comprises a core controller, an attitude sensor, a Beidou positioning module, a voice playing module and a first data transmission module; the attitude sensor is used for acquiring livestock posture data; the Beidou positioning module is used for acquiring livestock position data; the core controller is used for judging whether the livestock is within the electronic fence according to the livestock position data, judging whether the livestock is in a feeding, running or falling state according to the livestock posture data, sending an alarm instruction to the voice playing module when the livestock is not within the electronic fence, and sending the livestock position data and the livestock posture data to the cloud end through the first data transmission module; wherein the livestock posture data carries livestock status information, including the feeding, running or falling state;
the cloud end is used for judging whether the livestock is in a normal feeding state according to preset periodic feeding times after receiving livestock attitude data carrying livestock state information, judging whether the livestock is in a normal running state according to a preset time threshold value, and sending a control instruction and corresponding livestock position data to the unmanned aerial vehicle end if judging that the livestock is in an abnormal running state or a falling state; sending warning information and corresponding livestock position data to the user side; wherein the warning information comprises abnormal running warning, abnormal eating warning and falling warning;
the unmanned aerial vehicle end comprises an unmanned aerial vehicle, a main controller, a GPS (global positioning system), an image acquisition module, an image transmission module and a wireless communication module, and the wireless communication module is used for receiving a control instruction and livestock position data sent by the cloud end; the main controller is used for sending the livestock position data to the GPS, enabling the unmanned aerial vehicle to fly to the position of the livestock, and sending an image acquisition instruction to the image acquisition module after the unmanned aerial vehicle reaches the position of the livestock; the image acquisition module is used for starting a camera to acquire a video containing the livestock after receiving an image acquisition instruction and sending the video to the user side in real time through the image transmission module.
2. The wearable grazing system with unmanned aerial vehicle assistance of claim 1, wherein the voice playing module plays a simulated dog bark upon receiving an alarm instruction.
3. The wearable grazing system with drone assistance of claim 1, wherein the animal pose data includes yaw angle, roll angle, pitch angle, yaw angular velocity, roll angular velocity, pitch angular velocity; the process that the core controller judges whether the livestock eats, runs or falls according to the livestock posture data comprises the following steps:
1) Food intake judgment
The conditions for one feeding event are satisfied simultaneously: a. the pitch angle is smaller than a first preset threshold and then automatically recovers to the initial state; b. the pitch angular velocity changes from a positive maximum to a negative maximum;
2) Running state determination
The condition is that the pitch angular velocity exceeds a second preset threshold multiple times;
3) Judgment of falling state
The conditions are satisfied simultaneously: c. the roll angle is smaller than a third preset threshold; d. the roll angular velocity is smaller than a fourth preset threshold.
4. The wearable grazing system with unmanned aerial vehicle assistance of claim 3, wherein the user side is configured to display video transmitted back by the unmanned aerial vehicle side, display warning information and livestock position data transmitted by the cloud side, and display livestock attitude data.
5. The wearable grazing system with drone assistance of claim 4, wherein the user end is further configured to input a first preset threshold, a second preset threshold, a third preset threshold, a fourth preset threshold, a time threshold, and a number of periodic feedings.
6. The wearable grazing system with unmanned aerial vehicle assistance of claim 1, wherein the first data transmission module is an NB-IoT module, and a communication protocol between the NB-IoT module and the cloud is a Modbus protocol.
7. The wearable grazing system with drone assistance of claim 1, wherein the attitude sensor is a MPU6050 module, the MPU6050 module having a three-axis accelerometer and a three-axis gyroscope.
8. A wearable grazing system with drone assistance according to claim 3, characterized in that said first preset threshold is 45°, said second preset threshold is 40°/s, said third preset threshold is 50°, said fourth preset threshold is 5°/s, and said time threshold is 60 s.
CN202211397920.2A 2022-11-09 2022-11-09 Wearable grazing system with unmanned aerial vehicle is supplementary Active CN115941895B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211397920.2A CN115941895B (en) 2022-11-09 2022-11-09 Wearable grazing system with unmanned aerial vehicle is supplementary


Publications (2)

Publication Number Publication Date
CN115941895A true CN115941895A (en) 2023-04-07
CN115941895B CN115941895B (en) 2023-09-26

Family

ID=86653083

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211397920.2A Active CN115941895B (en) 2022-11-09 2022-11-09 Wearable grazing system with unmanned aerial vehicle is supplementary

Country Status (1)

Country Link
CN (1) CN115941895B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120122570A (en) * 2011-04-29 2012-11-07 (주)티엘씨테크놀로지 System and method for grazing based on Ubiquitous network
CN205546945U (en) * 2016-03-02 2016-09-07 中国农业科学院农业信息研究所 Domestic animal controlling and monitoring system
CA2996770A1 (en) * 2015-09-24 2017-03-30 Digi-Star, Llc Agricultural drone for use in livestock monitoring
WO2019039118A1 (en) * 2017-08-22 2019-02-28 ソニー株式会社 Livestock sensor device, livestock astasia inference method, livestock astasia inference program, and livestock management system
KR20190048161A (en) * 2017-10-30 2019-05-09 건국대학교 글로컬산학협력단 Method of grazing livestock using virtual fence and apparatuses performing the same
CN110197500A (en) * 2019-05-29 2019-09-03 南京信息工程大学 Herd unmanned plane and herds tracking
KR20200049642A (en) * 2018-10-30 2020-05-08 건국대학교 글로컬산학협력단 Method and system for livestock behavior analysis and grazing
CN111918209A (en) * 2020-08-14 2020-11-10 深圳华强技术有限公司 NB-IoT-based livestock breeding positioning management system
CN113784290A (en) * 2021-09-10 2021-12-10 北京市计量检测科学研究院 Early warning method and system for closed area based on electronic fence equipment


Also Published As

Publication number Publication date
CN115941895B (en) 2023-09-26

Similar Documents

Publication Publication Date Title
US11589559B2 (en) Adaptive sensor performance based on risk assessment
Jukan et al. Smart computing and sensing technologies for animal welfare: A systematic review
CN104049625B (en) Internet of Things irrigating facility regulation platform and method based on unmanned vehicle
US11778987B2 (en) Unmanned aerial vehicle (UAV)-based system for collecting and distributing animal data for monitoring
US20180146645A1 (en) System and method for monitoring livestock
Fahmy WSNs applications
CN106719051A (en) Grazing management system is raised scattered based on unmanned vehicle
EP3127253B1 (en) Position tracking method and apparatus
WO2018103716A1 (en) Composite flight control method and system, aircraft
CN105737897A (en) Distributed large-field meteorological remote data monitoring system
Petkovic et al. IoT devices VS. drones for data collection in agriculture
TWI662290B (en) Wearable system for aviation internet of things and captive animals
CN115941895B (en) Wearable grazing system with unmanned aerial vehicle is supplementary
Durán López et al. A Low-power, Reachable, Wearable and Intelligent IoT Device for Animal Activity Monitoring
CN213639266U (en) Wearable herbivorous domestic animal individual feature recognition device
Li et al. Smart Grazing in Tibetan Plateau: Development of a Ground‐Air‐Space Integrated Low‐Cost Internet of Things System for Yak Monitoring
Wang et al. Electronic Sheepdog: A Novel Method in With UAV-Assisted Wearable Grazing Monitoring
CN110296733A (en) Detect the unmanned plane apparatus and system of water quality
Shao et al. Situation Awareness Method and Simulation Test of UAVs and Observer Wards Air-Ground Cooperation
Nadzri Design issues and considerations for hardware implementation of wildlife surveillance system: a review
Deshpande et al. A survey on the role of IoT in agriculture for smart farming
Rodriguez III The Design and Implementation of Soil Monitoring Systems Using UAVs
CN116321011B (en) Wisdom pasture management system
Zhang et al. UAV Grazing Research [J]
CN115035629B (en) Unmanned aerial vehicle integrated control management system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant