CN115797867A - Intelligent track inspection method applied to farm


Info

Publication number
CN115797867A
CN115797867A
Authority
CN
China
Prior art keywords
farm
area
robot
trough
model
Prior art date
Legal status
Pending
Application number
CN202211563333.6A
Other languages
Chinese (zh)
Inventor
桂志明
汪强飞
郑伟
邹军
蔡翔
Current Assignee
Hefei Jiangxinduzhi Intelligent Technology Co ltd
Original Assignee
Hefei Jiangxinduzhi Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hefei Jiangxinduzhi Intelligent Technology Co ltd filed Critical Hefei Jiangxinduzhi Intelligent Technology Co ltd
Priority to CN202211563333.6A
Publication of CN115797867A
Legal status: Pending

Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses an intelligent track inspection method applied to a farm, belonging to the technical field of intelligent breeding. The method comprises the following steps: step one: acquiring farm data, and laying a patrol robot and constructing a scene according to the acquired farm data; step two: controlling the robot to patrol according to a preset farm patrol scheme; step three: acquiring images in the column (pen), the acquired images comprising a trough image and a cultured-object image; step four: analyzing the trough image to obtain a residual feed grade; step five: analyzing the cultured-object image and calculating a fat condition value; step six: recording the calculated residual feed grade and fat condition value into the corresponding column statistical table. By adopting a dual-system design, the system can cope with a complex field environment and automatically recover from problems such as abnormal program operation and system crashes; the field lighting is controlled automatically, so inspection does not depend on time periods and can be carried out at any time.

Description

Intelligent track inspection method applied to farm
Technical Field
The invention belongs to the technical field of intelligent breeding, and particularly relates to an intelligent track inspection method applied to a farm.
Background
To monitor animal growth in real time, a farm must pay attention to abnormal behaviour and to the diet of its animals. In modern large-scale breeding, site conditions should be monitored promptly and adjusted or treated at any time in order to improve efficiency. Human time and energy are limited, so a machine is urgently needed to replace manual patrols and collect the desired field data. The invention therefore provides an intelligent track inspection method applied to a farm.
Disclosure of Invention
In order to solve the problems in the prior art, the invention provides an intelligent track inspection method applied to a farm.
The purpose of the invention can be realized by the following technical scheme:
a track intelligent inspection method applied to a farm comprises the following specific steps:
the method comprises the following steps: acquiring farm data, and laying a patrol robot and setting up a scene according to the acquired farm data;
step two: controlling the robot to patrol according to a preset farm patrol scheme;
step three: acquiring images in the column, the acquired images comprising a trough image and a cultured-object image;
step four: analyzing the trough image to obtain a residual feed grade;
step five: analyzing the cultured-object image and calculating a fat condition value;
step six: recording the calculated residual feed grade and the fat condition value into the corresponding column statistical table.
Further, the method for paving the inspection robots according to the obtained farm data comprises the following steps:
establishing a farm model according to the obtained farm data, analyzing the farm model to obtain a robot track position, and establishing a corresponding track model and a robot model in the farm model according to the obtained robot track position; the robot track and the robot are installed according to the current farm model, corresponding RFID tags are arranged on the robot track according to the distribution of columns in the farm, and corresponding column information is stored in the RFID tags.
Further, the method for constructing the scene according to the obtained farm data comprises the following steps:
the method comprises the steps of obtaining a farm model, marking columns in the farm model as unit areas, combining the unit areas to obtain illumination areas, setting corresponding illumination devices in the illumination areas, and completing scene construction.
Further, the method for merging the unit areas includes:
identifying a single radiation area in a farm model, marking a unit area in the single radiation area as an analysis area, sequencing priorities of the analysis areas to obtain a first sequence, merging the analysis areas according to the first sequence, checking corresponding merging limiting conditions when merging one analysis area, and merging again according to the first sequence when the merging limiting conditions are met; and when the merging limiting condition is not met, the analysis area merging is cancelled, and the current merging area is marked as an illumination area.
Further, the method for setting the corresponding lighting device in the lighting area comprises the following steps:
and comparing the obtained illumination area with the maximum illumination range corresponding to the illumination device, determining the installation range of the illumination device, marking the obtained installation range in a farm model, and installing the illumination device by a corresponding installer in the marked area according to the farm environment.
Further, the robot adopts a dual system.
Further, the method for analyzing the trough image comprises the following steps:
establishing a corresponding trough analysis model based on a CNN network; analyzing the trough image through the trough analysis model to obtain the corresponding feed area, the feed position and the trough area; marking the obtained feed area as SA and the obtained trough area as LS; matching a corresponding adjustment coefficient according to the obtained feed position and marking it as α; and calculating the corresponding residual feed grade according to the residual feed grading function YL = α × SA / LS.
Further, the method for calculating the fat condition value comprises:
acquiring the top-view area and the height of the cultured object, marking them FM and H respectively, and calculating the corresponding fat condition value according to the fat condition calculation function Bf = FM × H × β, where β is a conversion coefficient.
Compared with the prior art, the invention has the beneficial effects that:
by adopting a dual-system design, the system can cope with a complex field environment and automatically recover from problems such as abnormal program operation and system crashes; the field lighting is controlled automatically, so inspection does not depend on time periods and can be performed at any time; dedicated analysis modes are designed for the trough and the fat condition, so target results can be analyzed quickly; the system does not depend on the field network, can run stably offline, and data can be exported locally through a mobile phone.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a flow chart of the method of the present invention;
fig. 2 is an exemplary diagram of a field of the present invention.
Detailed Description
The technical solutions of the present invention will be described below clearly and completely in conjunction with the embodiments, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1 to 2, an intelligent track inspection method applied to a farm comprises:
the method comprises the following steps: acquiring farm data, and laying a patrol robot and setting up a scene according to the acquired farm data;
the farm data comprises drawing data, environment data and the like, and the environment data comprises historical illumination data, temperature and humidity data and the like.
The method for paving the inspection robot according to the obtained farm data comprises the following steps:
establishing a farm model, namely a three-dimensional data model, according to the obtained farm data; analyzing the farm model to obtain the position of the robot track, and establishing a corresponding track model and robot model in the farm model according to the obtained track position; this makes the field installation work more convenient and the operation more intuitive;
the robot track and the robot are installed according to the current farm model, corresponding RFID tags are arranged on the robot track according to the distribution of columns in the farm, corresponding column information is stored in the RFID tags, the column information comprises column numbers, normal data ranges, farm animal information and the like, and the robot track can be specifically adjusted according to actual conditions.
Analyzing the farm model to obtain the position of the robot track means confirming the position according to the erection requirements of the robot track; this is done according to existing methods, and enables the corresponding workers to determine the track position and the corresponding installation mode with little or no need to visit the farm site for observation.
The robot is used to patrol the farm. In one embodiment, an existing robot providing the functions required by the invention can be used. In another embodiment, the robot comprises a main control board, an RFID card reader, a motor, a steering engine, sensors, an AI edge computing board, WIFI & BT, a visible light camera, a thermal infrared camera, a 3D camera, a radar, a wireless receiving module, a battery BMS and the like. The motor provides the robot's travel power; the steering engine adjusts the camera and radar orientation; the sensors include temperature, humidity, illumination, ammonia and carbon dioxide detection and the like. The charging chamber mainly comprises a wireless charging transmitting module for charging the robot.
The robot adopts dual systems, i.e., one system serves as a backup: when the first system fails, the backup system takes over. By default the first system is booted; after boot, whether the application runs normally is checked, and if there is no problem, operation proceeds normally while the run parameters (start time, application start-up process and other information) are updated into the root environment variables. Faults in the corresponding system can be judged through existing means such as detecting abnormal shutdown.
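A minimal sketch of this boot-and-failover logic is given below, assuming hypothetical names (application_ok, ROOT_ENV and so on are illustrative, not part of any real robot firmware):

```python
# Minimal sketch of the dual-system failover described above; all names
# are illustrative assumptions, not part of any real robot SDK.
import time

ROOT_ENV = {}  # stands in for the root environment variables

def application_ok(system_id: str) -> bool:
    # Placeholder health check: in practice this would verify that the
    # inspection application started and did not shut down abnormally.
    return system_id == "system_a"

def boot(primary="system_a", backup="system_b") -> str:
    # Default operation boots the primary system; if its application
    # check fails, the backup system takes over.
    for system_id in (primary, backup):
        if application_ok(system_id):
            # Update run parameters (start time, start-up process, etc.)
            # into the root environment variables.
            ROOT_ENV.update(system=system_id, start_time=time.time())
            return system_id
    raise RuntimeError("both systems failed the application check")

print(boot())  # -> "system_a" under the placeholder check
```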
The method for building the scene according to the obtained farm data comprises the following steps:
acquiring the farm model, namely the breeding model that includes the robot; marking the columns in the farm model as unit areas, merging the unit areas to obtain illumination areas, and arranging corresponding illumination devices in the illumination areas to complete the scene construction. The arranged illumination devices are specially matched with the robot to save energy: because the robot does not inspect continuously, only the area currently being inspected needs to be lit. For manual inspection, the farm's existing lighting system can be used, for example a light source arranged in each column.
The method for merging the unit areas comprises the following steps:
identifying a single radiation area in the farm model, marking a unit area in the single radiation area as an analysis area, performing priority ordering of the analysis areas to obtain a first sequence, merging the analysis areas according to the first sequence, performing corresponding merging restriction condition check on each merged analysis area, and merging again according to the first sequence when the merging restriction conditions are met; and when the merging limiting condition is not met, the analysis area merging is cancelled, and the current merging area is marked as an illumination area.
A single radiation area is an area that can be served by the same illumination device, as shown in the example of fig. 2. Columns 1 and 7 cannot belong to the same illumination area because a ceiling shade lies between them; if there were no ceiling shade, they could belong to the same illumination area. For fig. 2, columns 1, 2, 3, 4, 5, 6, 9 and 10 can belong to the same illumination area because no shade blocks them, so arranging an illumination device in the aisle can light all of these columns.
The priority ranking of the analysis areas is performed on a no-omission principle, for example starting from the boundary, so as to avoid the situation where, after starting from the middle, several boundary analysis areas are left over at the end and additional illumination devices have to be arranged. Specifically, a corresponding priority analysis model is established based on a neural network, a corresponding training set is built manually for training, and the priority of each analysis area is obtained by analysis with the successfully trained priority analysis model.
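As an illustration of the no-omission principle, the sketch below simply orders unit areas by their distance from the farm boundary, nearest first; the patent itself uses a trained priority analysis model, so this boundary-first heuristic is only an assumed stand-in:

```python
# Boundary-first ordering heuristic: an assumed stand-in for the
# trained priority analysis model described above. Areas are given
# as (area_id, distance_to_boundary) pairs.
def first_sequence(analysis_areas):
    # Lower distance to the boundary -> higher priority, so boundary
    # areas are merged first and are never left over at the end.
    return sorted(analysis_areas, key=lambda a: a[1])

areas = [("A1", 0.0), ("A5", 6.2), ("A3", 2.4), ("A7", 0.5)]
print(first_sequence(areas))  # order: A1, A7, A3, A5
```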
The method for checking the corresponding merging limitation condition comprises: acquiring the illumination range of the corresponding illumination device, which is set according to the specific illumination device type, such as fixed illumination equipment or turning illumination equipment (turning illumination is directed according to the robot's position); that is, the merged area cannot exceed the maximum illumination range of the corresponding illumination device. The illumination range can be set manually and compared against the merged area for the check.
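The greedy merge-and-check loop described above can be sketched as follows; areas are simplified to scalar sizes and the device range is an assumed value (a real implementation would compare geometric footprints):

```python
# Illustrative sketch of the unit-area merging procedure: areas are
# merged in priority order, and every merge is checked against the
# maximum illumination range of the device.
def merge_unit_areas(first_sequence, max_illumination_range):
    """first_sequence: analysis-area sizes already sorted by priority."""
    illumination_areas, current = [], []
    for area in first_sequence:
        current.append(area)
        # Merging limitation check: the merged region must stay
        # within the illumination device's maximum range.
        if sum(current) > max_illumination_range:
            current.pop()                       # cancel the last merge
            illumination_areas.append(current)  # close this lighting area
            current = [area]                    # restart from the failed area
    if current:
        illumination_areas.append(current)
    return illumination_areas

# Example: unit areas of size 4 with a device range of 10.
print(merge_unit_areas([4, 4, 4, 4, 4], 10))  # [[4, 4], [4, 4], [4]]
```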
The method for setting the corresponding lighting device in the lighting area comprises the following steps:
and comparing the obtained illumination area with the maximum illumination range corresponding to the illumination device, determining the installation range of the illumination device, marking the obtained installation range in a farm model, and installing the illumination device by a corresponding installer in the marked area according to the farm environment.
Determining the installation range of the illumination device can use existing analysis methods and mathematical algorithms. Specifically, a corresponding range analysis model is established based on a neural network, a corresponding training set is built manually for training, and analysis is performed with the range analysis model after successful training. The neural network can be an error back-propagation (BP) neural network, an RBF neural network or a deep convolutional neural network; the specific establishment and training process is common knowledge in the art and is therefore not described in detail.
Step two: controlling the robot to patrol according to a preset farm patrol scheme;
the method comprises the following steps that a farm inspection scheme is compiled into the prior art, a corresponding inspection scheme is compiled according to the actual inspection needs of the farm, and in the inspection process, an illuminating device is adjusted according to the acquisition environment of a robot, namely when the field is judged not to need light supplement, the illuminating device is not started, is judged according to the corresponding acquired image, and is judged through the existing image analysis technology and the manually set image conditions; the column location can be performed according to the corresponding RFID tag and the pulse; pulse positioning: and calculating the number of the motor pulses, pre-storing the number of pulses at corresponding positions in a planning task, and executing a current position task after the motor reaches a specified pulse. In order to prevent the abnormal phenomena of inaccurate pulse number, step loss and the like, a Hall device is arranged in the machine, the Hall device can be calibrated when the Hall device runs to a hardware barrier strip above a track, and the barrier strips are pasted on the guide rail at certain intervals.
Step three: acquiring images in the column, the acquired images comprising a trough image and a cultured-object image;
Step four: analyzing the trough image to obtain the residual feed grade;
the method for analyzing the trough image comprises the following steps:
establishing a corresponding trough analysis model based on a CNN network, and building a corresponding training set manually for training, the training set comprising trough images, the corresponding feed area, the feed position and the trough area (the specific establishment and training process is common knowledge in the art and is therefore not described in detail); analyzing the trough image through the trough analysis model to obtain the corresponding feed area, feed position and trough area; marking the obtained feed area as SA and the obtained trough area as LS; matching a corresponding adjustment coefficient according to the obtained feed position, the adjustment coefficients being set through discussion by an expert group according to the trough depth and the positions the feed may occupy, with a corresponding adjustment coefficient matching table established for lookup; marking the obtained adjustment coefficient as α; and calculating the corresponding residual feed grade according to the residual feed grading function YL = α × SA / LS.
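Assuming the trough analysis model has already returned SA, LS and the feed position, the grading step reduces to the sketch below; the adjustment coefficient table is a hypothetical example of the expert-set matching table:

```python
# Minimal sketch of the residual-feed grading step. The coefficient
# table is a hypothetical stand-in for the expert-set matching table.
ADJUSTMENT_TABLE = {"surface": 1.0, "mid_depth": 1.2, "bottom": 1.5}

def residual_feed_grade(sa: float, ls: float, feed_position: str) -> float:
    """Residual feed grading function YL = alpha * SA / LS."""
    alpha = ADJUSTMENT_TABLE[feed_position]  # match adjustment coefficient
    return alpha * sa / ls

# Example: 0.12 m^2 of feed at the trough bottom in a 0.50 m^2 trough.
print(residual_feed_grade(0.12, 0.50, "bottom"))  # 0.36
```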
Step five: analyzing the cultured-object image and calculating a fat condition value;
Analyzing the cultured-object image means obtaining the top-view area and the height of the cultured object by existing image processing methods.
The method for calculating the fat condition value comprises the following steps:
the overlooking area and the height of the cultured object are obtained and respectively marked as FM and H, and the corresponding fat condition value is calculated according to a fat condition calculation function Bf = FM multiplied by H multiplied by beta, wherein beta is a conversion coefficient and is discussed and set by an expert group.
Step six: recording the calculated residual feed grade and the fat condition value into the corresponding column statistical table.
The offline operation of the machine comprises the following steps:
a platform or an APP issues a planned task, and the device stores the plan to local storage;
when the scheduled time is reached, the device executes the stored plan and writes all measurement data and summary data to memory in real time (a circular queue is opened up in memory to guarantee first-in first-out storage);
the device traverses the summary information to find the data that needs to be stored, uploads it when the network connection is normal or when the APP connects, deletes the current summary entry after a successful upload, and continues with the next summary entry until all data has been uploaded successfully.
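A sketch of this store-and-forward loop follows, assuming a hypothetical upload callback that reports whether the transfer succeeded:

```python
# Sketch of the offline workflow above: a bounded circular queue keeps
# measurement and summary data first-in first-out in memory, and upload
# is retried until the network or the APP connection is available.
from collections import deque

MEASUREMENTS = deque(maxlen=10_000)  # circular queue: FIFO, bounded
SUMMARIES = deque()

def execute_plan(plan):
    for task in plan:              # plan was stored locally earlier
        data = {"task": task}      # placeholder measurement record
        MEASUREMENTS.append(data)  # oldest entries drop off first
        SUMMARIES.append(data)

def flush_summaries(upload):
    # Traverse summary entries; delete each one only after a
    # successful upload, so nothing is lost while offline.
    while SUMMARIES:
        if not upload(SUMMARIES[0]):
            break                  # no connection; retry later
        SUMMARIES.popleft()

execute_plan(["trough image", "cultured-object image"])
flush_summaries(lambda entry: True)  # pretend every upload succeeds
```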
All of the above formulas are calculated on dimensionless numerical values. Each formula was obtained by collecting a large amount of data and performing software simulation so as to approximate real conditions as closely as possible, and the preset parameters and preset thresholds in the formulas are set by those skilled in the art according to the actual situation or obtained by simulation over a large amount of data.
Although the present invention has been described in detail with reference to the preferred embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the spirit and scope of the present invention.

Claims (8)

1. An intelligent track inspection method applied to a farm, characterized by comprising the following specific steps:
the method comprises the following steps: acquiring farm data, and laying a patrol robot and setting up a scene according to the acquired farm data;
step two: controlling the robot to patrol according to a preset farm patrol scheme;
step three: acquiring images in the column, the acquired images comprising a trough image and a cultured-object image;
step four: analyzing the trough image to obtain a residual feed grade;
step five: analyzing the cultured-object image and calculating a fat condition value;
step six: recording the calculated residual feed grade and the fat condition value into the corresponding column statistical table.
2. The intelligent track inspection method applied to the farm according to claim 1, wherein the method for performing inspection robot laying according to the obtained farm data comprises:
establishing a farm model according to the obtained farm data, analyzing the farm model to obtain a robot track position, and establishing a corresponding track model and a robot model in the farm model according to the obtained robot track position; the robot track and the robot are installed according to the current farm model, corresponding RFID tags are arranged on the robot track according to the distribution of columns in the farm, and corresponding column information is stored in the RFID tags.
3. The intelligent track inspection method applied to farms according to claim 2, wherein the method for constructing the scene according to the obtained farm data comprises:
the method comprises the steps of obtaining a farm model, marking columns in the farm model as unit areas, combining the unit areas to obtain illumination areas, setting corresponding illumination devices in the illumination areas, and completing scene construction.
4. The intelligent track inspection method applied to the farm according to claim 3, wherein the method for merging the unit areas comprises the following steps:
identifying a single radiation area in a farm model, marking a unit area in the single radiation area as an analysis area, sequencing priorities of the analysis areas to obtain a first sequence, merging the analysis areas according to the first sequence, checking corresponding merging limiting conditions when merging one analysis area, and merging again according to the first sequence when the merging limiting conditions are met; when the merging limitation condition is not met, the analysis area merging is cancelled, and the current merging area is marked as an illumination area.
5. The intelligent track inspection method applied to the farm according to claim 4, wherein the method for arranging the corresponding lighting devices in the lighting area comprises the following steps:
and comparing the obtained illumination area with the maximum illumination range corresponding to the illumination device, determining the installation range of the illumination device, marking the obtained installation range in a farm model, and installing the illumination device by a corresponding installer in the marked area according to the farm environment.
6. The intelligent track inspection method applied to farms according to claim 1, wherein the robot adopts a dual system.
7. The intelligent track inspection method applied to the farm according to claim 1, wherein the method for analyzing the trough image comprises the following steps:
establishing a corresponding trough analysis model based on a CNN network; analyzing the trough image through the trough analysis model to obtain the corresponding feed area, the feed position and the trough area; marking the obtained feed area as SA and the obtained trough area as LS; matching a corresponding adjustment coefficient according to the obtained feed position and marking it as α; and calculating the corresponding residual feed grade according to the residual feed grading function YL = α × SA / LS.
8. The intelligent track inspection method applied to a farm according to claim 1, wherein the method for calculating the fat condition value comprises:
acquiring the top-view area and the height of the cultured object, marking them FM and H respectively, and calculating the corresponding fat condition value according to the fat condition calculation function Bf = FM × H × β, where β is a conversion coefficient.
CN202211563333.6A, priority and filing date 2022-12-07: Intelligent track inspection method applied to farm (CN115797867A, pending)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211563333.6A CN115797867A (en) 2022-12-07 2022-12-07 Intelligent track inspection method applied to farm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211563333.6A CN115797867A (en) 2022-12-07 2022-12-07 Intelligent track inspection method applied to farm

Publications (1)

Publication Number Publication Date
CN115797867A 2023-03-14

Family

ID=85417616

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211563333.6A Pending CN115797867A (en) 2022-12-07 2022-12-07 Intelligent track inspection method applied to farm

Country Status (1)

Country Link
CN (1) CN115797867A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination