CN114119736A - Control method and control device for agricultural machine, agricultural machine and processor


Info

Publication number
CN114119736A
Authority
CN
China
Prior art keywords
area
time interval
image
target object
actual position
Prior art date
Legal status
Pending
Application number
CN202010884997.7A
Other languages
Chinese (zh)
Inventor
杨强荣
方小永
何振军
高一平
贡军
方增强
刘辉
Current Assignee
Zhonglian Agricultural Machinery Co ltd
Zoomlion Heavy Industry Science and Technology Co Ltd
Original Assignee
Zhonglian Agricultural Machinery Co ltd
Zoomlion Heavy Industry Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhonglian Agricultural Machinery Co., Ltd. and Zoomlion Heavy Industry Science and Technology Co., Ltd.
Priority to CN202010884997.7A
Publication of CN114119736A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M 21/00 Apparatus for the destruction of unwanted vegetation, e.g. weeds
    • A01M 21/04 Apparatus for destruction by steam, chemicals, burning, or electricity
    • A01M 21/043 Apparatus for destruction by steam, chemicals, burning, or electricity by chemicals
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M 7/00 Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
    • A01M 7/0025 Mechanical sprayers
    • A01M 7/0032 Pressure sprayers
    • A01M 7/0042 Field sprayers, e.g. self-propelled, drawn or tractor-mounted
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M 7/00 Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
    • A01M 7/0089 Regulating or controlling systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration


Abstract

An embodiment of the invention provides a control method and control device for an agricultural machine, the agricultural machine itself, a storage medium, and a processor. The method comprises the following steps: acquiring a first actual position of a first target object in the current region of a farmland while the agricultural machine is operating; determining a first working time interval of the spray head according to the first actual position; acquiring a second actual position of a second target object in the next region of the farmland; determining a second working time interval of the spray head according to the second actual position; and determining the actual working time interval of the spray head according to the first and second working time intervals. In this way, the actual position of a target object in the farmland can be identified quickly and accurately, and the actual working time interval of the spray head can be determined.

Description

Control method and control device for agricultural machine, agricultural machine and processor
Technical Field
The invention relates to the technical field of agriculture, in particular to a control method and a control device for agricultural machinery, the agricultural machinery, a storage medium and a processor.
Background
Research and statistics from the planting division of the Ministry of Agriculture and Rural Affairs show that the yield of vegetables and food crops in China is increasing steadily, but some problems remain. Among them, weeds and pests are a major factor threatening crop yield and food safety.
The current mainstream method for controlling agricultural weeds and pests on a large scale is to apply pesticides, which damages the environment, reduces the edible safety of agricultural products, and increases production costs. For pesticide application, two approaches are commonly adopted: manual judgment and prescription maps. In the manual judgment approach, an operator relies on expert experience to identify the species and positions of weeds and pests in the field and decides when to open the spray head to apply pesticide. This greatly increases the operator's labor intensity: the operator must recognize weeds and pests while driving the agricultural machine, and because both driving and recognition depend on personal experience, recognition quality is uneven and missed spraying or over-spraying can occur. In the prescription-map approach, a field operation prescription map is generated from field images captured by a drone or satellite, and during operation the agricultural machine automatically controls the spray dosage according to its current position and the map. Although this approach achieves fully autonomous fertilization and pesticide application, it requires generating a high-precision prescription map from drone or satellite imagery and building a high-precision positioning system for the agricultural machine, so the economic cost is high and the approach is difficult to popularize on a large scale.
Disclosure of Invention
An object of an embodiment of the present invention is to provide a control method, a control device, an agricultural machine, a storage medium, and a processor for an agricultural machine, which can quickly and accurately identify an actual position of a target object in a farm field and determine an actual working time interval of a spray head.
In order to achieve the above object, a first aspect of the present invention provides a control method for an agricultural machine including a spray head, the control method comprising:
acquiring a first actual position of a first target object in a current region of a farmland in the operation process of agricultural machinery;
determining a first working time interval of the spray head according to the first actual position;
acquiring a second actual position of a second target object in a next region of the farmland;
determining a second working time interval of the spray head according to the second actual position; and determining the actual working time interval of the spray head according to the first working time interval and the second working time interval.
In the embodiment of the present invention, determining the actual working time interval of the spray head according to the first working time interval and the second working time interval includes: determining the actual working time interval as extending from the start time of the first working time interval to the end time of the second working time interval when the two intervals meet any one of the following conditions:
the first working time interval and the second working time interval are continuous;
the first working time interval overlaps the second working time interval;
the time interval between the first working time interval and the second working time interval is smaller than a preset threshold value.
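The three merge conditions above can be sketched as a small helper (an illustrative sketch; the interval representation and the gap threshold value are assumptions, not specified numerically in the source):

```python
def merge_working_intervals(first, second, gap_threshold):
    """Merge the current-region and next-region working intervals into one
    actual working interval when they are contiguous, overlapping, or
    separated by a gap smaller than gap_threshold (all times in seconds).
    Returns (start, end) of the merged interval, or None if they stay apart."""
    s1, e1 = first
    s2, e2 = second
    contiguous_or_overlapping = s2 <= e1        # covers conditions 1 and 2
    small_gap = 0 < s2 - e1 < gap_threshold     # condition 3
    if contiguous_or_overlapping or small_gap:
        # Spray head stays open from the first start to the second end.
        return (s1, e2)
    return None
```

For example, with `merge_working_intervals((0, 2), (2.3, 5), 0.5)` the 0.3 s gap is below the threshold, so the head would stay open from 0 to 5 rather than cycling off and on.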
In an embodiment of the invention, the first working time interval is determined according to a first target object in the current region and a target object in the previous region.
In the embodiment of the invention, the number of the spray heads is multiple; determining a first operating time interval of the spray head according to the first actual position comprises:
determining a corresponding target spray head in the plurality of spray heads according to the first actual position;
determining the actual distance between the target object and the target spray head according to the first actual position;
acquiring the moving speed of the agricultural machine and the control time delay of the agricultural machine;
and determining a first working time interval of the target spray head according to the actual distance, the moving speed and the control time delay.
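The computation in the steps above can be sketched as follows (a sketch under stated assumptions: the source only says the interval is determined from distance, speed, and control delay, so the spray duration parameter here is a hypothetical addition):

```python
def first_working_interval(distance_m, speed_mps, control_delay_s,
                           spray_duration_s=1.0):
    """Return (open_time, close_time) in seconds, measured from now, for the
    target spray head. The head must open control_delay_s early so that spray
    actually starts when the target passes under the boom."""
    travel_time = distance_m / speed_mps          # time until target is reached
    open_time = max(travel_time - control_delay_s, 0.0)
    return (open_time, open_time + spray_duration_s)
```

So a weed 5 m ahead at 2 m/s with a 0.5 s actuation delay yields an opening command at t = 2.0 s.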
In an embodiment of the present invention, determining a corresponding target spray head of the plurality of spray heads based on the first actual position includes:
dividing the current area into a plurality of sub-areas corresponding to the plurality of spray heads;
and determining the spray head corresponding to the sub-area where the first actual position is located as the target spray head.
In the embodiment of the present invention, determining the nozzle corresponding to the sub-area where the first actual position is located as the target nozzle includes:
determining a plurality of sub-areas where the first actual positions are located;
determining a target sub-region according to the ratio of the area of the first actual position in each sub-region to the area of the first actual position;
and determining the spray head corresponding to the target subregion as a target spray head.
In the embodiment of the present invention, determining the target sub-region according to the ratio of the area occupied by the first actual position in each sub-region to the area of the first actual position includes:
acquiring a first ratio and a second ratio of the area of the first actual position in the adjacent first sub-area and second sub-area to the area of the first actual position respectively;
in case the first ratio is higher than the first threshold and the second ratio is lower than the second threshold, the first sub-area is determined as the target sub-area.
In the embodiment of the invention, the spraying range of the spray head comprises a first spraying range and a second spraying range, and the second spraying range is larger than the first spraying range; the method further comprises the following steps:
and switching the spraying range of the target spray head corresponding to the target subregion from the first spraying range to the second spraying range.
In this embodiment of the present invention, determining the target sub-region according to the ratio of the area occupied by the first actual position in each sub-region to the area of the first actual position further includes:
and under the condition that the ratio of the first ratio to the second ratio is within a preset ratio interval, determining the first sub-area and the second sub-area as target sub-areas.
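The ratio-based selection described in the preceding embodiments can be sketched as one decision function (the threshold and ratio-interval values are illustrative placeholders; the patent does not give concrete numbers):

```python
def select_target_subareas(overlap_areas, total_area,
                           high_ratio=0.8, low_ratio=0.2,
                           ratio_interval=(0.5, 2.0)):
    """Decide which sub-area(s), and hence which spray head(s), serve a target
    whose detected region straddles two adjacent sub-areas.
    overlap_areas maps sub_area_id -> area of the target inside that sub-area;
    total_area is the full area of the target's first actual position."""
    (id1, a1), (id2, a2) = sorted(overlap_areas.items(),
                                  key=lambda kv: kv[1], reverse=True)[:2]
    r1, r2 = a1 / total_area, a2 / total_area
    if r1 > high_ratio and r2 < low_ratio:
        return [id1]              # target lies essentially in one sub-area
    if r2 > 0 and ratio_interval[0] <= r1 / r2 <= ratio_interval[1]:
        return [id1, id2]         # split evenly enough: both heads are targets
    return [id1]                  # fall back to the dominant sub-area
```

When both sub-areas are returned, the corresponding spray heads could also be switched to the wider second spraying range mentioned above.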
In an embodiment of the present invention, acquiring the first actual position of the first target object in the current area of the farmland during operation of the agricultural machine includes:
acquiring an area image corresponding to the current region of the farmland during operation of the agricultural machine;
inputting the area image to a perception machine model to determine an image position of the target object in the area image;
acquiring camera calibration information of the image acquisition device that captured the area image;
and determining the first actual position of the first target object according to the image position and the camera calibration information.
In an embodiment of the present invention, acquiring camera calibration information of an image capturing device for capturing the area image includes: calibrating a camera of the image acquisition equipment to obtain a plurality of coordinate data; and calibrating the camera of the image acquisition equipment through a preset algorithm and the plurality of coordinate data so as to determine the camera calibration information of the image acquisition equipment.
In the embodiment of the present invention, calibrating the camera of the image capturing device to obtain a plurality of coordinate data includes:
placing a calibration plate at a first angle of the image acquisition equipment, and recording a first camera coordinate and a corresponding first world coordinate of each feature point of the calibration plate in the image acquisition equipment;
placing the calibration plate at a second angle of the image acquisition equipment, and recording a second camera coordinate and a corresponding second world coordinate of each feature point of the calibration plate in the image acquisition equipment;
and placing the calibration plate at a third angle of the image acquisition equipment, and recording a third camera coordinate and a corresponding third world coordinate of each feature point of the calibration plate at the image acquisition equipment.
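One standard way to compute the calibration matrices R and T from the matched camera/world coordinate pairs recorded at the three angles above is a least-squares rigid alignment (Kabsch/Umeyama). The patent does not name an algorithm, so this is an illustrative sketch, not the claimed method:

```python
import numpy as np

def estimate_rt(cam_pts, world_pts):
    """Estimate rotation R and translation T such that world ~= R @ cam + T,
    from matched 3-D feature-point pairs, via the Kabsch SVD solution."""
    cam = np.asarray(cam_pts, dtype=float)
    world = np.asarray(world_pts, dtype=float)
    cam_c = cam - cam.mean(axis=0)            # center both point clouds
    world_c = world - world.mean(axis=0)
    H = cam_c.T @ world_c                     # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Correction term guarantees a proper rotation (det R = +1, no reflection).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    T = world.mean(axis=0) - R @ cam.mean(axis=0)
    return R, T
```

Feature points from all three calibration-board placements can simply be stacked into `cam_pts`/`world_pts` before calling the function.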
In an embodiment of the present invention, the method further includes one or both of: under the condition that a second target object is not detected in the next region of the farmland and the end time corresponding to the first working time interval is reached, closing the spray head;
and taking the earlier of the starting time of the first working time interval and the starting time of the second working time interval as the opening time of the spray head.
A second aspect of the invention provides a processor configured to perform a control method for an agricultural machine as described above.
A third aspect of the present invention provides a control device for an agricultural machine, the agricultural machine including a spray head, the control device including: an image acquisition device configured to acquire an area image of a farmland; and a processor according to the above.
A fourth aspect of the present invention provides an agricultural machine, comprising: a spray head; and a control device for an agricultural machine as described above.
A fifth aspect of the invention provides a machine-readable storage medium having stored thereon instructions for causing a machine to execute the control method for an agricultural machine described above.
According to the control method for the agricultural machine provided above, during operation the method continuously determines the first target object in the current area and the second target object in the next area, together with their first and second actual positions, and determines the actual working time interval of the spray head from the first and second working time intervals. The spray head can therefore be opened and closed according to the actual working time interval, which effectively avoids rapid, repeated opening and closing of the spray head as target objects are continuously detected.
Additional features and advantages of embodiments of the invention will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the embodiments of the invention without limiting the embodiments of the invention. In the drawings:
FIG. 1 schematically illustrates an application environment diagram of a method for determining a target object location according to an embodiment of the present invention;
FIG. 2 schematically illustrates a flow diagram of a method for determining a position of a target object in accordance with an embodiment of the present invention;
FIG. 3 schematically shows a flow diagram of a method of training a perceptual machine model according to an embodiment of the invention;
FIG. 4 schematically illustrates a flow diagram of a control method for an agricultural machine, according to an embodiment of the present disclosure;
FIG. 5 schematically illustrates a flow diagram of a control method for an agricultural machine, according to another embodiment of the present disclosure;
FIG. 6A schematically illustrates a region partitioning diagram according to an embodiment of the present invention;
FIG. 6B schematically shows an area distribution diagram of a first actual location according to an embodiment of the invention;
FIG. 6C is a schematic view of a plurality of spray ranges of a spray head according to an embodiment of the present invention;
FIG. 7 is a schematic flow chart illustrating the steps for determining a corresponding target spray head of the plurality of spray heads based on the first actual position in accordance with an embodiment of the present invention;
FIG. 8 schematically illustrates a flow chart of a method of controlling a spray head of an agricultural machine according to an embodiment of the present disclosure;
FIG. 9 is a block diagram schematically illustrating an apparatus for determining a position of a target object according to an embodiment of the present invention;
fig. 10 is a block diagram schematically showing the construction of a control apparatus for an agricultural machine according to an embodiment of the present invention;
fig. 11 is a block diagram schematically showing the construction of a control apparatus for an agricultural machine according to another embodiment of the present invention;
FIG. 12 is a block diagram schematically illustrating the construction of an agricultural machine according to an embodiment of the present invention;
fig. 13 schematically shows an internal configuration diagram of a computer apparatus according to an embodiment of the present invention.
Detailed Description
The following detailed description of embodiments of the invention refers to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating embodiments of the invention, are given by way of illustration and explanation only, not limitation.
The method for determining the position of the target object provided by the application can be applied to the application environment shown in fig. 1. In this environment, the agricultural machine 100 may include an image acquisition device 101 and a perception machine model 102 (e.g., a model included in or run by a processor). The image acquisition device 101 is configured to capture an area image of the farmland and send it to the perception machine model 102 through network communication, and the perception machine model 102 is configured to determine the image position of the target object within the input area image, so that the processor of the agricultural machine 100 can determine the actual position of the target object from the image position and the camera calibration information of the image acquisition device 101. Examples of the agricultural machine 100 may include, but are not limited to, various human-powered (manual) agricultural machines, animal-powered agricultural machines, small-power agricultural machines, tractor-mounted agricultural machines, self-propelled agricultural machines, aviation agricultural machines, and the like. Examples of the image acquisition device 101 may include, but are not limited to, a video camera, a still camera, and various electronic devices having a photographing function.
Fig. 2 schematically shows a flow diagram of a method for determining a position of a target object according to an embodiment of the invention. As shown in fig. 2, in an embodiment of the present invention, a method for determining a target object position is provided, which includes the following steps:
step 201, obtaining an area image of a farmland.
Step 202, inputting the area image into a perception machine model to determine the image position of the target object in the area image, wherein the perception machine model is trained on image training samples generated by a generative adversarial network.
Step 203, camera calibration information of the image acquisition device for shooting the area image is acquired.
And step 204, determining the actual position of the target object according to the image position and the camera calibration information.
The regional image of the farmland may refer to an image obtained by image-capturing the farmland by an image-capturing device on the agricultural machine. Because the image collected by the image collecting device has a certain area range limitation, the farmland image collected by the image collecting device can be called as an area image. The target object may refer to an object of interest that needs to be identified. For example, in a farmland, there are various plants including rice, corn, wheat, weeds and the like, and when farmland operation is required on rice, such as fertilizer or pesticide spraying, a target object in the scene can be rice; when the corn needs to be subjected to farmland operation, such as chemical fertilizer or pesticide spraying, a target object in the scene can be the corn; when weeds in a farm field need to be removed and pesticides need to be sprayed on the weeds, the target object may be the weeds.
The collected area image is input into the perception machine model, which determines the corresponding image position of the target object, i.e., the position the target object occupies in the area image when the image contains one. After the image position of the target object is determined, the camera calibration information of the image acquisition device can be acquired, and the actual position of the target object in the farmland is determined from the image position and the calibration information. The camera calibration information refers to the camera parameters obtained by calibrating the camera of the image acquisition device, namely the optimal calibration matrices R and T computed during calibration.
For example, suppose the target object is a weed. The position of the weed in the area image is determined to be (u1, v1), and from this the position of the weed relative to the image acquisition device (in the camera coordinate system) is determined to be (X1, Y1, Z1). The camera calibration information consists of the calibration matrices R and T, and the actual position of the weed in the field can be calculated by the following formula:
[Xw, Yw, Zw]ᵀ = R · [X1, Y1, Z1]ᵀ + T
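Applying the calibration matrices to map a camera-frame detection into field coordinates is then a one-line rigid transform (a sketch using numpy; the function and variable names are illustrative, not from the source):

```python
import numpy as np

def camera_to_world(p_cam, R, T):
    """Map a point from camera coordinates to field (world) coordinates
    via the rigid transform p_world = R @ p_cam + T."""
    return R @ np.asarray(p_cam, dtype=float) + np.asarray(T, dtype=float)
```

With R the identity and T a pure offset, a weed at the camera origin maps directly to the offset position in the field.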
In the embodiment of the invention, the perception machine model can be obtained by training on image training samples generated by a generative adversarial network. A Generative Adversarial Network (GAN) is a deep learning model. A large number of image training samples can be generated randomly through the GAN, so that there are more training samples covering more scenes of higher complexity, and the trained perception machine model therefore recognizes input area images with higher accuracy.
The method for determining the position of the target object trains the perception machine model on image training samples generated by a generative adversarial network. After an area image of the farmland is acquired, it is input into the trained perception machine model to determine the image position of the target object, and the actual position of the target object is then determined using the camera calibration information of the image acquisition device. Because the perception machine model is trained on GAN-generated samples, its recognition efficiency and accuracy are effectively improved, so when an acquired area image is input into the model, the target object contained in the image and its actual position in the farmland can be determined quickly and accurately.
In one embodiment, the perception machine model may be trained as follows: acquiring a plurality of area image samples; performing data augmentation on the area image samples through a generative adversarial network to obtain a plurality of image training samples; and inputting the image training samples into the perception machine model to train it.
Before the perception machine model is put to actual use, it can be trained, and the trained model identifies the target object with higher accuracy. A plurality of area image samples are acquired and augmented through the generative adversarial network, which generates a plurality of random image training samples from the input samples, increasing both the number and the complexity of the training samples available to the perception machine model.
In one embodiment, the generative adversarial network includes a generation network and a discrimination network. Performing data augmentation on the area image samples through the generative adversarial network to obtain a plurality of image training samples includes: inputting the area image samples into the generation network to generate a plurality of random area images; inputting the random area images into the discrimination network to produce a judgment of whether each random area image contains the target object; and having the generation network adjust its generation strategy according to the judgment result, so as to output a plurality of random image training samples from the input area image samples.
The generative adversarial network comprises two networks, a generation network and a discrimination network. When the target object is a weed in the farmland, the generation network and the discrimination network in the GAN can be trained simultaneously, exploiting the large color and shape differences between targets in crop-weed data sets. Specifically, inputting the area image samples into the GAN actually means inputting them into the generation network, which generates a plurality of random area images from them. The random area images output by the generation network are then used as input to the discrimination network, which judges whether each random area image contains the target object. The generation network adjusts its generation strategy according to the judgment result and generates further random area images, which are again fed to the discrimination network for recognition. Through this continuous game between the two networks, the recognition ability of the discrimination network and the image generation ability of the generation network both improve. In this way, the generative adversarial network augments the area image samples and generates more image training samples with more complex scenes.
In one embodiment, inputting an image into the perception machine model to determine the image position of the target object includes: inputting the area image into the perception machine model to extract a plurality of area image features; determining whether the area image contains the target object according to the plurality of area image features; when the area image is determined to contain the target object, determining the feature positions of the area image features corresponding to the target object; and determining the image position of the target object in the area image according to those feature positions.
After the shot area image is input into the perception machine model, the feature extraction can be carried out on the area image through the perception machine model so as to obtain a plurality of area image features, and the perception machine model can determine whether the area image contains the target object or not according to the area image features. When the perception machine model determines that a certain regional image contains a target object, the feature position of the regional image feature belonging to the target object can be obtained, so that the image position of the target object in the regional image can be determined according to the feature position.
In one embodiment, determining whether the area image contains the target object according to the plurality of area image features comprises: acquiring the labeling boxes of the area image and the corresponding labeling box parameters; judging whether each labeling box contains the target object; adjusting the labeling box parameters according to the judgment result to determine a detection box for the target object; determining, according to the detection box, the probability that an area image feature belongs to a feature of the target object; and determining whether the area image contains the target object according to that probability.
The perceptron model can perform feature extraction on the area image to obtain a plurality of area image features. The perceptron model can also acquire the labeling boxes of the area image and the labeling box parameters corresponding to each labeling box. The labeling boxes may be annotated manually in advance for each image training sample, marking whether the sample contains one or more target objects and, if so, the positions of those target objects in the sample. The labeling box parameters refer to properties such as the size and aspect ratio of the labeling box. After the perceptron model acquires the labeling boxes of the area image, it can judge whether each labeling box contains the target object. Specifically, the perceptron model may determine whether the labeling box includes a feature belonging to the target object, and adjust the labeling box parameters according to the judgment result to determine the detection box for the target object. The determined detection box is then used by the perceptron model to compute the probability that each area image sample feature belongs to a feature of the target object, so that whether the image training sample contains the target object can be determined according to that probability.
In one embodiment, determining the probability that an area image sample feature belongs to a feature of the target object according to the detection box comprises: performing bounding box regression on the detection boxes for which a feature belonging to the target object is detected, so as to correct the positions of those detection boxes; screening the corrected detection boxes according to preset conditions; and determining the probability that the area image sample feature belongs to a feature of the target object according to the screened detection boxes.
After the detection boxes for the target object are determined, the perceptron model may obtain the detection result of each detection box and perform bounding box regression on the detection boxes for which a feature belonging to the target object is detected; the purpose of the regression is to correct the positions of those detection boxes. The corrected detection boxes can then be screened according to preset conditions, and the probability that the area image sample feature belongs to a feature of the target object can be determined according to the screened detection boxes.
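As a concrete illustration, bounding box regression is often implemented by predicting small offsets that shift and rescale a candidate box. The sketch below uses the standard R-CNN-style parameterization as an assumption; the patent does not specify the exact regression form it uses.

```python
import math

def apply_box_deltas(box, deltas):
    """Refine a detection box (x1, y1, x2, y2) with predicted regression
    offsets (dx, dy, dw, dh). R-CNN-style convention (an assumption, not
    the patent's specified form): the centre moves by (dx*w, dy*h) and
    the width/height scale by exp(dw), exp(dh)."""
    x1, y1, x2, y2 = box
    dx, dy, dw, dh = deltas
    w, h = x2 - x1, y2 - y1
    cx, cy = x1 + w / 2.0, y1 + h / 2.0
    cx, cy = cx + dx * w, cy + dy * h          # shift the centre
    w, h = w * math.exp(dw), h * math.exp(dh)  # rescale the box
    return (cx - w / 2.0, cy - h / 2.0, cx + w / 2.0, cy + h / 2.0)
```

For example, `apply_box_deltas((0, 0, 10, 10), (0.1, 0, 0, 0))` shifts the box one pixel to the right, yielding `(1, 0, 11, 10)`, while zero deltas leave the box unchanged.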
As shown in fig. 3, in an embodiment of the present invention, a method for training a perceptron model is provided, which includes the following steps:
Step 301, acquiring a plurality of area image samples.
Step 302, inputting the area image samples into the generative network of the generative adversarial network to generate a plurality of random area images.
Step 303, inputting the random area images into the discriminative network of the generative adversarial network to produce a judgment of whether each random area image contains the target object.
Step 304, the generative network adjusts its generation strategy for the random area images according to the judgment result, so as to output a plurality of random image training samples from the input area image samples.
When an agricultural machine performs a spraying operation, the target objects in the farmland, and their specific positions in the farmland, need to be identified in time so that precise spraying can be achieved. When the target object is a weed, the crop/weed data is characterized by concentrated color differences and large shape differences. To improve the accuracy and efficiency with which the perceptron model identifies weeds, the acquired area image samples can be input into the generative adversarial network, so that a plurality of image training samples can be generated randomly by the network. The generative adversarial network produces more randomized, multi-scenario, and complex images when generating training samples. For example, given an input image of a farmland area in daytime, the generative adversarial network can generate images of the same kind of area in rain, under cloudy skies, in a rainstorm, at night, and so on.
The generative adversarial network includes two networks, namely a generative network and a discriminative network. Specifically, when a plurality of random area images are generated by the generative adversarial network, the area image samples are input to the generative network, which generates a plurality of random area images from them; the random area images are input to the discriminative network, which judges whether each random area image contains the target object; and the generative network adjusts its generation strategy according to the judgment result of the discriminative network. That is, the goal of the generative network is to generate images realistic enough to "fool" the discriminative network, while the goal of the discriminative network is to distinguish generated images from real ones as reliably as possible; in addition, the discriminative network must identify the target objects contained in the images produced by the generative network. The two networks thus form a dynamic game in which both are continuously optimized: the generative network keeps adjusting its generation strategy according to the judgment results of the discriminative network, and finally a plurality of random image training samples can be output from the input area image samples.
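The alternating optimization described above can be sketched in miniature. The toy example below is purely illustrative: it uses 1-D numbers in place of images, a two-parameter affine generator, and a logistic-regression discriminator updated with hand-derived gradients. It shows the structure of the adversarial game, not the patent's actual networks.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def real_batch(n):
    # "Real" data: scalars around 3.0, standing in for real field images.
    return rng.normal(3.0, 0.5, n)

theta_g = np.array([1.0, 0.0])  # generator g(z) = w*z + b
theta_d = np.array([0.0, 0.0])  # discriminator d(x) = sigmoid(a*x + c)

def generate(z, tg):
    return tg[0] * z + tg[1]

def discriminate(x, td):
    return sigmoid(td[0] * x + td[1])

lr, n = 0.05, 64
for _ in range(2000):
    z = rng.normal(0.0, 1.0, n)
    fake, real = generate(z, theta_g), real_batch(n)

    # Discriminator step: push d(real) toward 1 and d(fake) toward 0.
    for x, label in ((real, 1.0), (fake, 0.0)):
        g = discriminate(x, theta_d) - label           # dBCE/dlogit
        theta_d -= lr * np.array([np.mean(g * x), np.mean(g)])

    # Generator step: push d(fake) toward 1 ("fool" the discriminator).
    g = (discriminate(fake, theta_d) - 1.0) * theta_d[0]
    theta_g -= lr * np.array([np.mean(g * z), np.mean(g)])
```

After training, `generate` can be called on fresh noise to produce new "samples"; in the patent's setting the analogous generator outputs whole area images under varied scene conditions.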
In the embodiment of the present invention, after the plurality of area image samples are obtained, they may be preprocessed, for example by scaling and normalization, before the processed area image samples are input into the generative adversarial network.
Step 305, inputting the image training samples into the perceptron model to extract a plurality of area image sample features from each image training sample.
Step 306, acquiring the labeling box of each image training sample and the corresponding labeling box parameters, and judging whether the labeling box contains the target object.
Step 307, adjusting the labeling box parameters according to the judgment result to determine a detection box for the target object.
After a plurality of random image training samples have been produced from the input area image samples by the generative adversarial network, the perceptron model can be trained on them. Specifically, after a randomly generated image training sample is input into the perceptron model, the model extracts its features to obtain a plurality of corresponding area image sample features. To improve the prediction accuracy of the perceptron model during training, the image training samples may be labeled manually in advance, with labeling boxes marking the target objects contained in each sample and their positions in the image. The perceptron model can acquire the labeling box of each image training sample and the corresponding labeling box parameters, and judge whether each labeling box includes a target object. Feature extraction and acquisition of the labeling boxes may be carried out simultaneously, or feature extraction may be performed first and the labeling boxes acquired afterwards; the order of these two steps is flexible.
After the perceptron model judges whether each labeling box contains the target object, the labeling box parameters can be adjusted according to the judgment result to determine a detection box for the target object. The detection box is the box the perceptron model uses when identifying the target object; if it is too large or too small, the detection result may not be sufficiently accurate. The perceptron model can therefore size its detection box according to whether the pre-annotated labeling boxes contain the target object.
Step 308, performing bounding box regression on the detection boxes for which a feature belonging to the target object is detected, so as to correct the positions of those detection boxes.
Step 309, screening the corrected detection boxes according to preset conditions.
Step 310, determining the probability that the area image sample features belong to features of the target object according to the screened detection boxes.
After the size of the detection box is determined, the perceptron model may use the detection box to determine whether a feature belonging to the target object is present. The detection boxes for which such a feature is detected can then be obtained, and bounding box regression can be performed on them to correct their positions, so that the detection boxes are located more accurately. After the positions are corrected, the corrected detection boxes can be screened according to preset conditions. When determining through the detection boxes whether a feature belongs to the target object, the perceptron model also obtains, for each detection box, the probability with which it judges that a feature belonging to the target object has been detected; after screening, the probability that an area image sample feature belongs to a feature of the target object can be determined from the screened detection boxes and their individual probabilities.
In one embodiment, screening the corrected detection boxes according to the preset conditions comprises retaining the corrected detection boxes that satisfy a preset condition, wherein the preset condition includes any one of the following: the probability value of the corrected detection box is greater than a preset probability threshold; the size of the corrected detection box is larger than a preset size; and the area overlap ratio of any two of the corrected detection boxes is lower than a preset overlap threshold.
The corrected detection boxes can be screened in the following ways. In a first example, the probability value with which each detection box judges that a feature belonging to the target object has been detected can be obtained, a preset probability threshold can be set, the detection boxes whose probability value is less than or equal to the threshold can be removed, and the detection boxes whose probability value exceeds the threshold can be retained. That is, each detection box, when checking whether it contains a feature belonging to the target object, also yields the probability with which it judges such a feature to be present. For example, suppose the preset probability threshold is 55% and there are 100 detection boxes, of which 90 assign a probability higher than 55% to containing a feature of the target object while the remaining 10 assign a probability of 55% or less. When screening, the 10 boxes with probability values of 55% or less may be removed and only the 90 boxes with probability values above 55% retained.
A second example screens according to the size of the detection box. Specifically, the size of each detection box may be obtained and compared with a preset size. Because a detection box that is too small may fail to capture the relevant features, detection boxes whose size is less than or equal to the preset size may be removed, and those larger than the preset size retained.
In a third example, any two detection boxes can be compared to determine whether their area overlap ratio is below a preset overlap threshold. When the overlap ratio of two detection boxes is too high, one of them can be removed and only the other retained; specifically, the box with the higher probability value may be kept.
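The three screening conditions can be combined into a single filtering pass. The sketch below is an illustrative assumption (the patent gives neither concrete thresholds nor a box format): it drops low-probability and undersized boxes, then applies a greedy overlap check that keeps the higher-probability box of any pair whose intersection-over-union (IoU) is too high, i.e. standard non-maximum suppression.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def screen_boxes(boxes, probs, p_thresh=0.55, min_area=4.0, iou_thresh=0.5):
    """Keep boxes that pass the probability, size, and overlap conditions."""
    # Conditions 1 and 2: probability above threshold, area above minimum.
    cand = [(p, b) for p, b in zip(probs, boxes)
            if p > p_thresh and (b[2] - b[0]) * (b[3] - b[1]) > min_area]
    # Condition 3: greedy non-maximum suppression, highest probability first.
    cand.sort(key=lambda t: t[0], reverse=True)
    kept = []
    for p, b in cand:
        if all(iou(b, kb) < iou_thresh for _, kb in kept):
            kept.append((p, b))
    return kept
```

For instance, with boxes `[(0, 0, 10, 10), (1, 1, 11, 11), (50, 50, 60, 60)]` and probabilities `[0.9, 0.8, 0.7]`, the first and third boxes are kept while the second is suppressed (its IoU with the first is about 0.68).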
Step 311, determining whether the image training sample contains the target object according to the probability.
Step 312, when it is determined that the image training sample contains the target object, determining the feature position of the area image sample feature corresponding to the target object.
Step 313, determining the image position of the target object in the image training sample according to the feature position.
After the probability that the area image sample features belong to features of the target object has been determined from the screened detection boxes, whether the image training sample contains the target object can be decided from the finally computed probability. Specifically, a threshold may be set: when the final probability exceeds the threshold, the perceptron model determines that the image training sample contains the target object; otherwise it determines that the sample does not. For example, with the threshold set to 65%, if the final probability is greater than 65% the perceptron model may output the prediction 1, indicating that the input image training sample is judged to contain the target object; if the final probability is 65% or less, the model may output the prediction 0, indicating that the sample does not contain the target object. In other words, the probability values of the detection boxes are an intermediate stage of detection; the final output of the perceptron model is 0 or 1, i.e., no or yes, so whether the input image training sample contains the target object can be read directly from the model's output.
After the perceptron model has determined which image training samples contain the target object and which do not, it can further obtain the area image sample features of the detected target object and the feature positions of those features within the training samples, so that the image position of the target object in each sample can be determined from the feature positions. That is, the perceptron model outputs not only 0 or 1 but also, when the output is 1, i.e., when a target object is detected in the input sample, the image position of the target object in that sample. Inputting an image training sample into the perceptron model therefore reveals both whether it contains the target object and, if so, where in the sample the object lies. When the perceptron model is actually used, after an acquired area image of the farmland is input into it, the model confirms whether the area image contains the target object and determines the image position of any target object it contains.
Fig. 4 schematically shows a flow chart of a control method for an agricultural machine according to an embodiment of the present invention. As shown in fig. 4, in an embodiment of the present invention, there is provided a control method for an agricultural machine, including the steps of:
Step 401, acquiring a first actual position of a first target object in a current area of the farmland during operation of the agricultural machine.
Step 402, determining a first working time interval of the spray head according to the first actual position.
Step 403, acquiring a second actual position of a second target object in a next area of the farmland.
Step 404, determining a second working time interval of the spray head according to the second actual position.
Step 405, determining an actual working time interval of the spray head according to the first working time interval and the second working time interval.
During operation of the agricultural machine, target objects need to be identified in real time and their actual positions in the farmland determined, so that precise spraying can be carried out. In this embodiment, the first target object and the second target object may be the same kind of target object; for convenience of description, the target object contained in the current area is called the first target object and the target object contained in the next area is called the second target object. If there are two or more kinds of target object, for example weeds and pests, the first target object and the second target object may each be a weed or a pest. Correspondingly, the actual position of the first target object in the farmland is called the first actual position, and the actual position of the second target object is called the second actual position.
The division of the farmland into the current area and the next area is determined by the image acquisition region of the image acquisition device mounted on the agricultural machine. That is, the current area is the region that the image acquisition device can currently capture while the agricultural machine is operating; after the machine moves on, the region captured when the device acquires its next image is the next area.
Specifically, the image acquisition device on the agricultural machine continuously captures images of the crops in the farmland, and these images are recognized and localized to determine whether the captured farmland area images contain a target object. If a target object is determined to be present, its actual position in the farmland is determined. The target object contained in the area image captured at the machine's current position is called the first target object, so a first actual position of the first target object in the current area can be determined, and the first working time interval of the spray head can be determined from that position. In other words, once the actual position of the first target object is known, its distance from the agricultural machine can be calculated; from that distance and the machine's travel speed, it can be computed how long the machine will take to reach the first target object, at what moment the spraying operation should start, and at what moment it should end, so that the area where the first target object lies is sprayed precisely. The first working time interval of the spray head, i.e., its opening time and closing time, can thus be determined from the first actual position of the first target object.
In the embodiment of the present invention, the first target object actually occupies a certain area, so determining its actual position in practice means determining the position of the area it occupies. In one example, the distances along the travel direction between the agricultural machine (e.g., the sprayer) and the near point and the far point of the first target object may be calculated separately; from these two distances and the travel speed, the times at which the machine will reach the near point and the far point can be computed, thereby determining the opening time and the closing time of the spray head respectively.
In an embodiment of the invention, for ease of handling, especially when the first target object has an irregular shape in the field, a regular shape may be used to fit, approximate, or replace the shape of the first target object. Examples of regular shapes include, but are not limited to, rectangles, circles, and ellipses. In this embodiment, the distances along the travel direction between the agricultural machine (e.g., the sprayer) and the near point and the far point of the regular shape may be calculated, and from these two distances and the travel speed the times at which the machine will reach the near point and the far point can be computed, thereby determining the opening time and the closing time of the spray head respectively.
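Converting these near-point and far-point distances into an opening time and a closing time is a simple division by the travel speed. The helper below is a hypothetical sketch; its name, units, and constant-speed assumption are illustrative and not taken from the patent.

```python
def nozzle_interval(near_dist_m, far_dist_m, speed_m_s, now_s=0.0):
    """Return (open_time, close_time) in seconds: the spray head opens
    when the machine reaches the near edge of the target region and
    closes when it passes the far edge, assuming constant speed."""
    if speed_m_s <= 0:
        raise ValueError("travel speed must be positive")
    t_open = now_s + near_dist_m / speed_m_s
    t_close = now_s + far_dist_m / speed_m_s
    return t_open, t_close
```

For example, at 1.5 m/s a weed patch whose near and far edges lie 3 m and 6 m ahead yields the working interval (2.0, 4.0) seconds from now.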
As the agricultural machine operates continuously, it keeps advancing and the image acquisition device keeps capturing images. If an area image captured by the device contains a target object, the next area can be considered to contain a second target object; the second actual position of the second target object in the farmland can then be determined by the same method as for the first target object, and the second working time interval of the spray head for the second target object, i.e., the opening time and the closing time when spraying it, can be determined at the same time. In particular, the method for determining the position of a target object described above may be employed when identifying and locating target objects in the images acquired by the image acquisition device, both for determining the first target object contained in the current area and the second target object contained in the next area, and for determining the first actual position of the first target object and the second actual position of the second target object.
After the first target object and the second target object contained in the two areas, and their actual positions in the farmland, are determined, the actual working time interval of the spray head can be determined from the positions of the target objects. In practice, if target objects are detected continuously in the area covered by the same spray head, that head keeps receiving open commands and may be opened and closed repeatedly, which wastes a large amount of pesticide and greatly shortens the head's service life. To address this problem of repeated opening and closing, the control commands for the same spray head can be fused. That is, for a given spray head, if target objects are detected in both the current area and the next area within its spraying range, the first working time interval for the first target object and the second working time interval for the second target object can be fused to determine the actual working time interval of that head, so that the head opens and closes according to the actual working time interval.
In the embodiment of the present invention, the actual working time interval may be determined as running from the start time of the first working time interval to the end time of the second working time interval when any one of the following conditions is satisfied.
In one condition, the first working time interval is contiguous with the second working time interval. For example, assume the first working time interval is [t1, t2], where t1 is the moment the spray head opens and t2 is the moment it closes, and the second working time interval is [t3, t4], where t3 is the moment the spray head opens and t4 is the moment it closes. If t2 equals t3 (t2 = t3), the first and second working time intervals can be fused, and the actual working time interval is determined to be [t1, t4]. That is, the close command originally sent to the spray head at time t2 is cancelled (and, since the head then remains open, the open command originally sent at time t3 is cancelled as well), and the spray head is kept open until time t4.
In another condition, the first working time interval overlaps the second working time interval. For example, assume the first working time interval is [t1, t2], where t1 is the moment the spray head opens and t2 is the moment it closes, and the second working time interval is [t3, t4], where t3 is the moment the spray head opens and t4 is the moment it closes. If t2 is greater than t3 (t2 > t3), the two intervals overlap and can be fused, and the actual working time interval is determined to be [t1, t4]. That is, the close command originally sent at time t2 is cancelled (and, since the head remains open, the open command originally sent at time t3 is cancelled as well), and the spray head is kept open until time t4.
In a third condition, the gap between the first working time interval and the second working time interval is less than a preset threshold. For example, assume the first working time interval is [t1, t2], where t1 is the moment the spray head opens and t2 is the moment it closes, and the second working time interval is [t3, t4], where t3 is the moment the spray head opens and t4 is the moment it closes. If t2 is less than t3 (t2 < t3), there is a gap between the two intervals, namely the interval between t2 and t3 (i.e., t3 − t2). In this case a close command would normally be sent at time t2 to close the head and an open command sent at time t3 to open it again. However, if the gap is relatively small, for example less than a preset threshold (the threshold may be 1 s, 0.8 s, 0.6 s, 0.5 s, etc.), keeping the spray head open throughout the gap has little adverse effect on the spraying operation (beyond some spray wasted on non-target areas), and the first and second working time intervals may be fused.
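All three fusion conditions reduce to one comparison: merge whenever the gap t3 − t2 is below the preset threshold, since a contiguous or overlapping pair has a gap of zero or less. A minimal sketch, assuming the intervals are ordered so that the first one starts no later than the second:

```python
def fuse_intervals(first, second, gap_threshold_s=0.5):
    """Fuse two spray-head working intervals (t1, t2) and (t3, t4).

    Returns the single merged interval [(t1, t4)] when the intervals are
    contiguous (t2 == t3), overlapping (t2 > t3), or separated by a gap
    smaller than gap_threshold_s; otherwise returns both unchanged.
    """
    t1, t2 = first
    t3, t4 = second
    if t3 - t2 < gap_threshold_s:   # covers all three conditions at once
        return [(t1, t4)]
    return [first, second]
```

For example, `fuse_intervals((0.0, 2.0), (2.0, 5.0))` and `fuse_intervals((0.0, 2.0), (1.5, 5.0))` both return `[(0.0, 5.0)]`, whereas a 0.8 s gap with the default 0.5 s threshold leaves the intervals separate, so the head closes and reopens.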
With the control method for an agricultural machine provided above, during operation the first target object in the current area and the second target object in the next area are determined continuously, along with the first actual position of the first target object and the second actual position of the second target object; the actual working time interval of the spray head can then be determined from the first and second working time intervals, and the spray head opened and closed accordingly. This effectively avoids the problem of the spray head being opened and closed repeatedly because target objects are detected continuously, which wastes a large amount of pesticide and greatly shortens the head's service life.
In one embodiment, acquiring the first actual position of the first target object in the current area of the farmland during operation of the agricultural machine comprises: acquiring the area image corresponding to the current area of the farmland during operation; inputting the area image into the perceptron model to determine the image position of the target object in the area image; acquiring camera calibration information of the image acquisition device that captured the area image; and determining the first actual position of the first target object from the image position and the camera calibration information.
That is, in the present embodiment, the method for determining the position of the target object described above may be employed to determine the first actual position of the first target object. The detailed process is not repeated here.
In one embodiment, acquiring the camera calibration information of the image acquisition device used to capture the area image comprises: calibrating the camera of the image acquisition device to obtain a plurality of coordinate data; and processing the plurality of coordinate data with a preset algorithm to determine the camera calibration information of the image acquisition device.
Before shooting images with the image acquisition equipment, the camera of the image acquisition equipment may first be calibrated, so that barrel or pincushion distortion and similar defects in the shot images can be avoided; the calibrated camera parameters may be called camera calibration information. Specifically, the camera of the image acquisition equipment may be calibrated first, for example with a checkerboard calibration board.
In one embodiment, calibrating a camera of an image capture device to obtain a plurality of coordinate data includes: placing the calibration plate at a first angle of the image acquisition equipment, and recording a first camera coordinate and a corresponding first world coordinate of each characteristic point of the calibration plate in the image acquisition equipment; placing the calibration plate at a second angle of the image acquisition equipment, and recording a second camera coordinate and a corresponding second world coordinate of each feature point of the calibration plate in the image acquisition equipment; and placing the calibration plate at a third angle of the image acquisition equipment, and recording a third camera coordinate and a corresponding third world coordinate of each characteristic point of the calibration plate at the image acquisition equipment.
When calibrating the camera, multiple groups of data can be recorded by adjusting the angle between the calibration plate and the camera of the image acquisition equipment, so that the camera of the image acquisition equipment can be calibrated. Specifically, the calibration plate can be placed at a first angle of the image acquisition equipment, recording the first camera coordinate and the corresponding first world coordinate of each feature point of the calibration plate in the image acquisition equipment; then placed at a second angle, recording the second camera coordinate and the corresponding second world coordinate of each feature point; and then placed at a third angle, recording the third camera coordinate and the corresponding third world coordinate of each feature point. The first angle may refer to the angle directly in front of the image acquisition equipment, that is, aligned with the lens shooting direction; at this angle, the coordinates (x1, y1) of each feature point of the calibration plate in the camera image coordinates are the first camera coordinates, and the world coordinates (X1, Y1, Z1) relative to the camera are the first world coordinates. The second angle may be 45 degrees to the front left of the image acquisition equipment, recording the coordinates of each feature point of the calibration plate at this angle in the camera image coordinates, i.e. the second camera coordinates, and the world coordinates relative to the camera, i.e. the second world coordinates.
The third angle may be 45 degrees to the front right of the image acquisition equipment, recording the coordinates of each feature point of the calibration plate at this angle in the camera image coordinates, i.e. the third camera coordinates, and the world coordinates relative to the camera, i.e. the third world coordinates. Further, more angles can be set for calibration and recording.
After the camera is calibrated at a plurality of angles to obtain a plurality of coordinate data, the coordinate data can be calculated through a preset algorithm to obtain an optimal camera calibration matrix, namely, the camera calibration information of the image acquisition equipment is determined. Specifically, the preset algorithm may be a least squares method.
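As a rough illustration of this step, the following sketch fits a simple image-to-ground mapping from recorded feature-point correspondences by least squares. It is a simplified stand-in for full camera calibration (no lens distortion model, planar ground assumed), and all function and variable names are illustrative rather than taken from the patent.

```python
import numpy as np

def fit_image_to_ground_map(image_pts, world_pts):
    """Fit an affine map (x, y) -> (X, Y) from recorded correspondences
    by ordinary least squares; a simplified stand-in for computing the
    camera calibration matrix from multiple recorded coordinate groups."""
    image_pts = np.asarray(image_pts, dtype=float)
    world_pts = np.asarray(world_pts, dtype=float)
    # Design matrix [x, y, 1] for each recorded feature point.
    A = np.column_stack([image_pts, np.ones(len(image_pts))])
    # Solve A @ M ~= world_pts in the least-squares sense; M is 3x2.
    M, *_ = np.linalg.lstsq(A, world_pts, rcond=None)
    return M

def image_to_ground(M, x, y):
    """Map one image position to its ground-plane world position."""
    return np.array([x, y, 1.0]) @ M
```

In practice, feature points recorded from several calibration-plate angles would all be stacked into `image_pts`/`world_pts` before fitting, which is what makes the least-squares solution over-determined and robust.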
In one embodiment, the first operating time interval is determined based on a first target object in the current region and a target object in a previous region.
The agricultural machine is constantly moving forward during the operation, so that the first working time interval of the spray head determined according to the first target object contained in the current area is actually obtained by combining the working time intervals of the current area and the target object in the previous area. That is, the first operating time interval of the spray heads spraying on the first target object in the current area is actually determined according to the actual position of the target object included in the previous area and the actual position of the target object included in the current area.
In one embodiment, the number of the spray heads may be plural; determining a first operating time interval of the spray head according to the first actual position comprises: determining a corresponding target spray head in the plurality of spray heads according to the first actual position; determining the actual distance between the first target object and the target spray head according to the first actual position; acquiring the moving speed of the agricultural machine and the control time delay of the agricultural machine; and determining a first working time interval of the target spray head according to the actual distance, the moving speed and the control time delay.
The agricultural machine may contain multiple spray heads, and different spray heads can spray different spraying areas. When determining the first working time interval of the spray head according to the first actual position, the corresponding target spray head can be determined from the multiple spray heads according to the first actual position, and the actual distance between the first target object and the target spray head further determined; meanwhile, the moving speed of the agricultural machine and its control time delay can also be obtained, so that the first working time interval of the target spray head can be determined from the actual distance, the moving speed and the control time delay. Since the spray head opens or closes the spraying operation according to received commands, the control delay needs to be taken into account when determining the actual opening or closing time of the spray head. If the control delay were ignored, a turn-on command sent at time 10 might only take effect at time 12, so that part of the target object that needs to be sprayed would be missed. Therefore, the control delay is taken into account when determining the first working time interval of the target spray head.
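The timing computation described above can be sketched as follows. The subtraction of the control delay (issuing the open command early so the head is actually spraying when the target arrives) and the fixed spray duration are assumptions inferred from the surrounding description; the names are illustrative.

```python
def working_interval(distance_m: float, speed_m_s: float,
                     control_delay_s: float, spray_duration_s: float = 0.1):
    """Working time interval of the target spray head, relative to 'now'.

    The open command is issued control_delay_s early so that, after the
    actuation delay, the head is spraying when the target passes under it.
    """
    t_open = distance_m / speed_m_s - control_delay_s
    t_close = t_open + spray_duration_s
    return (t_open, t_close)
```

For example, a target 2 m ahead at 1 m/s with a 0.2 s control delay gives an interval of (1.8 s, 1.9 s).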
In one embodiment, determining a corresponding target sprinkler of the plurality of sprinklers based on the first actual location comprises: dividing the current area into a plurality of sub-areas corresponding to the plurality of spray heads; and determining the spray head corresponding to the sub-area where the first actual position is located as the target spray head.
In agricultural machinery, a plurality of image acquisition devices can be included, and each image acquisition device can correspond to a plurality of spray heads. For each image acquisition device, the area corresponding to the image currently captured by the agricultural machine during the operation process can be called as the current area. The current area can be divided into a plurality of sub-areas according to the spraying range of each sprayer, wherein each sub-area corresponds to one sprayer. According to actual needs, each subarea can also be set to correspond to a plurality of spray heads. Then, according to the first actual position of the first target object contained in the current area, the sub-area where the first target object is located is determined, so that the corresponding spray head can be determined, namely the target spray head.
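The sub-area lookup can be sketched as below, under the assumption that the current area is split into equal-width sub-areas across the camera's field of view; the widths, positions, and nozzle labels are illustrative.

```python
def target_nozzle(lateral_pos_m: float, region_width_m: float, nozzles):
    """Return the nozzle whose sub-area contains the given lateral position.

    The current area is divided into len(nozzles) equal-width sub-areas,
    one per nozzle, ordered left to right.
    """
    sub_width = region_width_m / len(nozzles)
    # Clamp so a position exactly on the right edge maps to the last nozzle.
    index = min(int(lateral_pos_m // sub_width), len(nozzles) - 1)
    return nozzles[index]
```

With four heads over a 2 m wide area, a target at 1.3 m from the left edge falls into the third sub-area and is assigned head N3, matching the fig. 6A example.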
In one embodiment, the control method further includes: and under the condition that the second target object is not detected in the next region of the farmland and the end time corresponding to the first working time interval is reached, closing the spray head.
If the second target object is not detected in the area image corresponding to the next area of the farmland, whether the spray head reaches the end time corresponding to the first working time interval or not can be detected, if so, the processor of the agricultural machine can send a closing instruction to the spray head so as to close the spray head and end the spraying operation.
In one embodiment, the control method further includes: and taking the earlier of the starting time of the first working time interval and the starting time of the second working time interval as the opening time of the spray head.
After the first working time interval and the second working time interval are fused to determine the final actual working time interval, the opening time of the spray head can be further determined: the earlier of the start time of the first working time interval and the start time of the second working time interval is used as the opening time. For example, the first target object detected in the current area may lie farther from the image acquisition equipment while the second target object detected in the area image of the next area lies closer to it, which may be caused by a missed detection or a blind zone of the image acquisition equipment. In that case, even if a target object was missed in the current area, the second target object detected in the next area can still be sprayed in time thanks to the earlier opening time, so that missed spraying is avoided and spraying accuracy is effectively improved.
As shown in fig. 5, in an embodiment of the present invention, there is provided a control method for an agricultural machine, including the steps of:
step 501, acquiring a first actual position of a first target object in a current region of a farmland in the operation process of the agricultural machine.
Step 502, divide the current region into a plurality of sub-regions corresponding to a plurality of nozzles.
Step 503, determining the spray head corresponding to the sub-area where the first actual position is located as the target spray head.
And step 504, determining the actual distance between the first target object and the target spray head according to the first actual position.
And 505, acquiring the moving speed of the agricultural machine and the control time delay of the agricultural machine.
Step 506, determining a first working time interval of the target nozzle according to the actual distance, the moving speed and the control time delay.
Step 507, obtaining a second actual position of a second target object in a next region of the farmland.
And step 508, determining a second working time interval of the spray head according to the second actual position.
In step 509, the actual operating time interval of the nozzle is determined according to the first operating time interval and the second operating time interval.
During the operation of the agricultural machine, as the machine continuously advances, the image acquisition equipment also continuously captures images of the farmland. The image acquired at the current time is called the image corresponding to the current area, and the image acquired at the next moment corresponds to the next area. Specifically, after the image acquisition equipment acquires the area image corresponding to the current area of the farmland, the area image can be input into the perception machine model to determine the first target object contained in the area image of the current area and the first actual position of the first target object in the farmland. The area image corresponding to the next area can likewise be input into the perception machine model to determine the second target object it contains and the second actual position of the second target object in the farmland.
For a given image acquisition equipment, after determining the first actual position of the first target object contained in the area image it shot, the shooting area of the equipment can be divided, that is, the current area is divided into a plurality of sub-areas. As shown in fig. 6A, the current area is divided into 4 sub-areas A1, A2, A3 and A4, whose corresponding spray heads are N1, N2, N3 and N4 respectively. If the first target object is detected in area A3, the spray head N3 corresponding to area A3 can be determined as the target spray head. Further, the actual distance d between the first target object and target spray head N3 can be determined according to the actual position of the first target object; d may refer to the longitudinal distance, along the direction of travel, between the first target object and target spray head N3.
Meanwhile, the moving speed V of the agricultural machine and the control time delay T_delay can be acquired, so that the opening time of target spray head N3 can be calculated as
t_open = d / V − T_delay
Since the spray head has a fixed opening/closing duration, for example 100 ms, the closing time of target spray head N3 is t_close = t_open + 0.1, and the first working time interval of target spray head N3 is
[t_open, t_close] = [d/V − T_delay, d/V − T_delay + 0.1]
As the operation continues to advance, the agricultural machine performs image acquisition and identification on the area following the current area. If the next area also contains a second target object, the second actual position of the second target object can be determined using the same method as for the first actual position, and the second working time interval of the target spray head can likewise be determined from the second actual position using the same method as for the first working time interval.
Assuming the first working time interval is [t1, t2] and the second working time interval is [t3, t4], the actual working time interval of the spray head can be determined from the two. Unlike a conventional control algorithm that directly sends the open command to the spray head, the control of the spray heads in this embodiment is different: the first working time interval and the second working time interval are merged, that is, the finally determined actual working time interval is T = [t1, t2] ∪ [t3, t4]. Assuming [t1, t2] = [1s, 3s] and [t3, t4] = [2s, 4s], fusing the two intervals gives a final actual working time interval of [1s, 4s]. That is, after the target spray head finishes the first working time interval it is not closed, but directly continues spraying the second target object; the spray head is only closed at the 4th second, ending the spraying operation.
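The interval fusion described here behaves like a standard merge of overlapping intervals; a minimal sketch under that assumption (names illustrative):

```python
def fuse_intervals(intervals):
    """Merge overlapping working time intervals so the spray head stays
    open across consecutive targets instead of toggling on and off."""
    merged = []
    for start, end in sorted(intervals):
        if merged and start <= merged[-1][1]:
            # Overlaps the previous interval: extend it instead of closing.
            merged[-1][1] = max(merged[-1][1], end)
        else:
            merged.append([start, end])
    return [tuple(iv) for iv in merged]
```

With the numbers above, fusing [1s, 3s] and [2s, 4s] yields the single interval [1s, 4s], so no close command is issued at the 3rd second.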
If, while the second target object is being sprayed, another target object is detected in the area image corresponding to the next area acquired by the image acquisition equipment, its actual position is determined in the same way, the working time interval corresponding to the newly detected target object is fused with the previously determined working time interval [1s, 4s], and the final actual working time interval of the target spray head is re-determined. That is, the first working time interval of the target spray head is actually determined from the actual positions of the target objects in the current area and the previous area; when the target spray head sprays the first target object, the determined actual working time interval combines the first working time interval (itself combining the previous area and the current area) with the second working time interval of the next area. In other words, if the current area is the fourth area of the operation, the first working time interval of the target spray head for the current area is actually determined by fusing the spraying control commands derived from the target objects identified, and their actual positions, in the first three areas together with the fourth area. Thus, in this embodiment, the spray head is closed according to the end time of the actual working time interval; as long as that end time has not been reached, no closing command is sent to the target spray head, so the problem of the spray head being repeatedly opened and closed does not occur.
It may happen that the determined first target object is located in a plurality of sub-areas. In this case, in an embodiment of the present invention, determining the head corresponding to the sub-area where the first actual position is located as the target head includes: determining a plurality of sub-areas where the first actual positions are located; determining a target sub-region according to the ratio of the area of the first actual position in each sub-region to the area of the first actual position; and determining the spray head corresponding to the target subregion as a target spray head.
For the first actual position where the first target object is located, the sub-area containing it can be determined. When the first actual position spans multiple sub-areas, that is, when its actual area falls in more than one sub-area, the target sub-area can be determined according to the ratio of the area the first actual position occupies in each sub-area to the area of the first actual position, and the spray head corresponding to the target sub-area determined as the target spray head.
In one embodiment, determining the target sub-area according to the ratio of the area occupied by the first actual position in each sub-area to the area of the first actual position comprises: acquiring the first ratio and the second ratio of the areas occupied by the first actual position in the adjacent first and second sub-areas, respectively, to the area of the first actual position; and, when the first ratio is higher than a first threshold and the second ratio is lower than a second threshold, determining the first sub-area as the target sub-area. As shown in fig. 6B, the first actual position is the shaded area, and its actual area is distributed over the sub-areas corresponding to spray heads N1 and N2. Assume the area of the first actual position is S = 1 square meter, the area it occupies in the first sub-area is S1 = 0.8 square meter, and the area it occupies in the second sub-area is S2 = 0.2 square meter. The first ratio is the ratio of the area occupied in the first sub-area to the area of the first actual position, that is, S1/S = 0.8; the second ratio is S2/S = 0.2. With the first threshold set at 75% and the second threshold at 25%, the first ratio 0.8 is higher than 75% and the second ratio 0.2 is lower than 25%, so the first sub-area corresponding to spray head N1 can be determined as the target sub-area.
In one embodiment, determining the target sub-region according to the ratio of the area occupied by the first actual position in each sub-region to the area of the first actual position further comprises: and under the condition that the ratio of the first ratio to the second ratio is within a preset ratio interval, determining the first sub-area and the second sub-area as target sub-areas.
Assume the actual area of the first actual position spans several sub-areas; for example, two sub-areas, a first sub-area and a second sub-area, which are adjacent. The processor of the agricultural machine can acquire the first ratio and the second ratio of the areas the first actual position occupies in the first and second sub-areas, respectively, to the area of the first actual position. When the first ratio is higher than the first threshold and the second ratio is lower than the second threshold, the first sub-area can be determined as the target sub-area; when the ratio of the first ratio to the second ratio falls within a preset ratio interval, both the first and second sub-areas are determined as target sub-areas. The first and second thresholds are set freely; in general they may be chosen so that first threshold + second threshold = 1, but they can also be set independently according to actual needs, in which case their sum need not equal 1.
In one embodiment, the spray range of the spray head includes a first spray range and a second spray range, the second spray range being greater than the first spray range. The method further comprises the following steps: and switching the spraying range of the target spray head corresponding to the target subregion from the first spraying range to the second spraying range.
Each spray head can also be provided with multiple spraying gears, each corresponding to a different spraying range. A spray head includes at least two spraying gears, corresponding to the first spraying range and the second spraying range respectively, the second spraying range being greater than the first. Fig. 6C shows a schematic diagram of the spraying ranges: spray head N1 has two spraying ranges, the one corresponding to the gray dotted line being the first spraying range and the one corresponding to the black dotted line the second. When the first ratio is higher than the first threshold and the second ratio is lower than the second threshold, the first sub-area is determined as the target sub-area; in this case only the spray head of the first sub-area is opened, not that of the second sub-area. The spraying range of the first sub-area's spray head can therefore be switched from the first spraying range to the second, enlarging its coverage. This effectively saves spraying agents such as pesticide or fertilizer, while allowing the spray head to be controlled flexibly and more accurately according to the actual area of the first target object.
In one embodiment, as shown in fig. 7, the step of determining a corresponding target showerhead of the plurality of showerheads based on the first actual position includes:
step 701, dividing the current area into a plurality of sub-areas corresponding to a plurality of spray heads.
Step 702, determining a plurality of sub-areas where the first actual position is located.
Step 703, acquiring a first ratio and a second ratio of the area of the first actual position in the adjacent first sub-area and second sub-area, respectively, to the area of the first actual position.
Step 704, determining whether the first ratio is higher than the first threshold and the second ratio is lower than the second threshold, if yes, performing step 705; if not, go to step 708.
Step 705, the first sub-region is determined as the target sub-region.
Step 706, determining the spray head corresponding to the first sub-area as a target spray head.
And 707, switching the spraying range of the target spray head corresponding to the target sub-area from the first spraying range to the second spraying range.
Step 708, determining whether the ratio of the first ratio to the second ratio is within a preset ratio range, if yes, executing step 709; if not, go to step 711.
Step 709, determine both the first sub-region and the second sub-region as target sub-regions.
And step 710, determining the sprayers corresponding to the first subregion and the second subregion as target sprayers.
Step 711, determine the second sub-region as the target sub-region.
And 712, determining the spray head corresponding to the second subregion as a target spray head.
After the first actual position of the first target object is determined, the control area corresponding to the image acquisition equipment, that is, the current area, can be divided into sub-areas corresponding to the plurality of spray heads, and the sub-area containing the first actual position determined. When the first actual position spans several sub-areas, the specific first and second sub-areas it occupies can be obtained. Assume the area of the first actual position is S, the area it occupies in the first sub-area is S1, and the area it occupies in the second sub-area is S2. The first ratio is the ratio of the area occupied in the first sub-area to the area of the first actual position, that is, S1/S; the second ratio is S2/S. It follows that first ratio + second ratio = 1.
Further, the first ratio and the second ratio may be compared, when the first ratio is higher than the first threshold and the second ratio is lower than the second threshold, it may be considered that the area occupied by the first actual position in the first sub-area is larger, the first sub-area may be considered as the sub-area corresponding to the first actual position, and the second sub-area with the smaller occupied area may be ignored. Therefore, the first subregion can be determined as the target subregion, and the spray head corresponding to the first subregion can be determined as the target spray head. For example, the first ratio is 95%, the second ratio is 5%, the first threshold is set to 94%, the second threshold is set to 6%, and at this time, the first ratio 95% is higher than the first threshold 94% and the second ratio 5% is lower than the second threshold 6%, so that the first sub-area may be determined as the target sub-area, and the nozzle corresponding to the first sub-area may be determined as the target nozzle.
The spraying range of the target spray head corresponding to the target subregion can be switched from the first spraying range to the second spraying range. That is, at this time, the area range corresponding to the first actual position actually occupies two sub-regions, but finally, according to the area occupation ratio, the first sub-region is regarded as the sub-region where the first actual position is located, and the second sub-region is ignored. In this case, the processor will initiate a nozzle opening command to the target nozzle corresponding to the first sub-area, and the nozzle corresponding to the second sub-area will not receive the opening control command, that is, the nozzle corresponding to the second sub-area will not open the spraying operation. In order to enable the target object actually located in the area where the second sub-area is located to be sprayed and covered, the spraying range of the target sprayer of the first sub-area can be enlarged, the first spraying range is switched to the second spraying range, and therefore when the second sub-area also contains the range occupied by the first actual position, all the target sprayers in the first sub-area only need to be opened, and all the target objects contained in all the first actual positions can be sprayed.
If the first ratio and the second ratio do not satisfy the above condition, that is, they cannot simultaneously satisfy first ratio > first threshold and second ratio < second threshold, the ratio between the first ratio and the second ratio can be examined. If this ratio is within the preset ratio interval, the areas occupied by the first actual position in the first sub-area and in the second sub-area can be considered substantially the same; in that case both sub-areas are determined as target sub-areas, and the spray heads corresponding to both are determined as target spray heads. If neither condition holds, that is, the thresholds are not satisfied and the ratio of the first ratio to the second ratio is not within the preset ratio interval, the second sub-area is determined as the target sub-area and its spray head as the target spray head.
As shown in the flowchart of fig. 8, first, an image captured by an image capturing device may be input, the image is input into a sensing machine model, a target object included in the image is determined by a sensing algorithm, and an image position of the target object in the image is determined. And then determining the actual position of the target object in the farmland according to the image position and the camera calibration information so as to determine a corresponding target sub-region, and determining a corresponding target spray head according to the target sub-region. Meanwhile, the working time interval of the target spray head, namely the calculated spraying area information in the graph, can be calculated according to the actual distance between the target object and the target spray head, the moving speed of the agricultural machine and the control time delay of the agricultural machine. The processor can obtain the number of the target sprayer and send an opening instruction to the target sprayer according to the starting time corresponding to the working time interval, so that the target sprayer can open the spraying operation according to the opening instruction. Meanwhile, the target object of the next area is continuously identified and positioned, and a plurality of control instructions for the same sprayer are fused to determine the actual working time interval of the sprayer. The vehicle control unit is a processor of the agricultural machine and can correspondingly control the sprayer according to the current state of the sprayer.
In one embodiment, as shown in fig. 9, there is provided an apparatus for determining a position of a target object, comprising an image acquisition module, an image position confirmation module, and an actual position determination module, wherein:
an image acquisition module 901 configured to acquire an area image of the farmland.
An image position confirmation module 902 configured to input the area image into a perception model to determine an image position of the target object in the area image, the perception model being trained with image training samples generated by a generative adversarial network.
An actual position determination module 903 configured to acquire camera calibration information of an image capture device for capturing an area image; and determining the actual position of the target object according to the image position and the camera calibration information.
In one embodiment, as shown in fig. 9, the apparatus for determining the position of the target object further includes a model training module 904 configured to: acquire a plurality of area image samples; perform data enhancement on the area image samples through a generative adversarial network to obtain a plurality of image training samples; and input the image training samples into the perception model to train the perception model.
In one embodiment, the generative adversarial network includes a generator network and a discriminator network. The model training module 904 is further configured to: input the area image samples into the generator network to generate a plurality of random area images; input the random area images into the discriminator network to produce a determination of whether each random area image contains the target object; the generator network then adjusts its generation strategy according to the determination result, so as to output a plurality of random image training samples from the input area image samples.
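The generator/discriminator interplay described above can be illustrated with a deliberately tiny one-dimensional adversarial loop. This is not the image GAN of this disclosure; the linear generator, logistic discriminator, learning rate, and step count are all illustrative assumptions:

```python
import math
import random

def sigmoid(s):
    s = max(-60.0, min(60.0, s))     # clamp for numerical safety
    return 1.0 / (1.0 + math.exp(-s))

random.seed(0)
REAL_MEAN, REAL_STD = 4.0, 0.5       # stand-in for the "real sample" statistics
a, b = 1.0, 0.0                      # generator:      G(z) = a*z + b
w, c = 0.1, 0.0                      # discriminator:  D(x) = sigmoid(w*x + c)
lr = 0.05

for _ in range(3000):
    z = random.gauss(0.0, 1.0)
    x_real = random.gauss(REAL_MEAN, REAL_STD)
    x_fake = a * z + b

    # Discriminator ascent step: push D(real) toward 1, D(fake) toward 0.
    d_real, d_fake = sigmoid(w * x_real + c), sigmoid(w * x_fake + c)
    w += lr * ((1.0 - d_real) * x_real - d_fake * x_fake)
    c += lr * ((1.0 - d_real) - d_fake)

    # Generator ascent step (non-saturating loss): push D(fake) toward 1,
    # i.e. adjust the generation strategy according to the judgment result.
    d_fake = sigmoid(w * x_fake + c)
    a += lr * (1.0 - d_fake) * w * z
    b += lr * (1.0 - d_fake) * w
```

After training, the generator's offset `b` has drifted toward the real-data mean, which is the adversarial feedback loop the embodiment describes, applied here to a scalar instead of a farmland image.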
In one embodiment, the model training module 904 is further configured to: input the image training sample into the perception model to extract a plurality of area image sample features from the image training sample; determine, according to the plurality of area image sample features, whether the image training sample contains the target object; when the image training sample is determined to contain the target object, determine the feature position of the area image sample feature corresponding to the target object; and determine the image position of the target object in the image training sample according to the feature position.
In one embodiment, the model training module 904 is further configured to: obtain annotation boxes of the image training sample and the corresponding annotation box parameters; judge whether each annotation box contains the target object; adjust the annotation box parameters according to the judgment result to determine a detection box for the target object; determine, according to the detection box, the probability that the area image sample features belong to features of the target object; and determine, according to the probability, whether the image training sample contains the target object.
In one embodiment, the model training module 904 is further configured to: perform bounding-box regression on detection boxes indicating that features of the target object were detected, so as to correct the positions of the detection boxes; screen the corrected detection boxes according to preset conditions; and determine, according to the screened detection boxes, the probability that the area image sample features belong to features of the target object.
In one embodiment, the model training module 904 is further configured to retain corrected detection boxes that satisfy a preset condition, where the preset condition includes any one of the following: the probability value of the corrected detection box is greater than a preset probability threshold; the size of the corrected detection box is larger than a preset size; and the area overlap ratio between any two corrected detection boxes is lower than a preset overlap threshold.
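The screening step can be sketched as a greedy non-maximum-suppression pass over the corrected detection boxes, combining the probability, size, and overlap conditions above. All threshold values are illustrative placeholders:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def filter_boxes(boxes, probs, prob_thr=0.5, min_area=4.0, overlap_thr=0.5):
    """Keep boxes passing the probability and size thresholds, then greedily
    drop any box overlapping a higher-scoring kept box (NMS)."""
    keep = []
    order = sorted(range(len(boxes)), key=lambda i: probs[i], reverse=True)
    for i in order:
        b = boxes[i]
        if probs[i] <= prob_thr:
            continue                                   # probability condition
        if (b[2] - b[0]) * (b[3] - b[1]) <= min_area:
            continue                                   # size condition
        if all(iou(b, boxes[j]) < overlap_thr for j in keep):
            keep.append(i)                             # overlap condition
    return [boxes[i] for i in keep]
```

The surviving boxes are the "screened detection boxes" from which the per-feature probabilities are then determined.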
In one embodiment, the actual position determining module 903 is further configured to calibrate a camera of the image capturing device, resulting in a plurality of coordinate data; and calibrating the camera of the image acquisition equipment through a preset algorithm and a plurality of coordinate data to determine the camera calibration information of the image acquisition equipment.
In one embodiment, the actual position determination module 903 is further configured to: place a calibration plate at a first angle relative to the image acquisition equipment, and record first camera coordinates and corresponding first world coordinates of each feature point of the calibration plate in the image acquisition equipment; place the calibration plate at a second angle relative to the image acquisition equipment, and record second camera coordinates and corresponding second world coordinates of each feature point; and place the calibration plate at a third angle relative to the image acquisition equipment, and record third camera coordinates and corresponding third world coordinates of each feature point.
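As a much-simplified stand-in for full camera calibration, assuming the field surface is planar and lens distortion is negligible, the recorded feature-point correspondences can fit a pixel-to-world mapping. Here is an affine fit from three correspondences via Cramer's rule; the full method would instead estimate intrinsics and extrinsics from all three plate poses:

```python
def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def fit_affine(pixels, worlds):
    """Fit x = a*u + b*v + c and y = d*u + e*v + f from three
    non-collinear pixel/world feature-point correspondences."""
    A = [[u, v, 1.0] for u, v in pixels]
    D = det3(A)
    rows = []
    for k in range(2):                 # solve for the x-row, then the y-row
        rhs = [w[k] for w in worlds]
        row = []
        for col in range(3):           # Cramer's rule, one unknown per column
            M = [r[:] for r in A]
            for i in range(3):
                M[i][col] = rhs[i]
            row.append(det3(M) / D)
        rows.append(row)
    return rows

def pixel_to_world(params, u, v):
    """Map an image position to a field position with the fitted params."""
    (a, b, c), (d, e, f) = params
    return a * u + b * v + c, d * u + e * v + f
```

Once `params` is stored as the calibration information, `pixel_to_world` plays the role of "determining the actual position according to the image position and the camera calibration information."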
The device for determining the position of the target object comprises a processor and a memory, wherein the image acquisition module, the image position confirmation module, the actual position determination module, and the like are stored in the memory as program units, and the processor executes the program units stored in the memory to implement the corresponding functions.
In one embodiment, as shown in fig. 10, there is provided a control device for an agricultural machine, including:
a target object determination module 1001 configured to acquire a first actual position of a first target object in a current area of the farmland during operation of the agricultural machine, wherein the target object determination module 1001 comprises the image acquisition module 901, the image position confirmation module 902, and the actual position determination module 903 shown in fig. 9.
A working time determination module 1002 configured to determine a first working time interval of the spray head according to the first actual position.
The target object determination module 1001 is further configured to obtain a second actual position of a second target object in a next area of the agricultural field.
The working time determination module 1002 is further configured to determine a second working time interval of the spray head according to the second actual position, and to determine the actual working time interval of the spray head according to the first working time interval and the second working time interval.
In one embodiment, the working time determination module 1002 is further configured to determine the actual working time interval as extending from the start time of the first working time interval to the end time of the second working time interval when the first working time interval and the second working time interval satisfy any one of the following conditions: the first working time interval is continuous with the second working time interval; the first working time interval overlaps with the second working time interval; or the time interval between the first working time interval and the second working time interval is smaller than a preset threshold.
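A minimal sketch of this fusion rule, merging two intervals when they are contiguous, overlap, or are separated by less than a threshold gap (the threshold value is illustrative):

```python
def fuse_intervals(first, second, gap_threshold_s=0.5):
    """Return the actual working interval(s) for one spray head.

    Merges the first and second working time intervals when they are
    contiguous (gap == 0), overlapping (gap < 0), or separated by a gap
    below gap_threshold_s; otherwise the head opens and closes twice.
    """
    (s1, e1), (s2, e2) = sorted([first, second])
    if s2 - e1 <= gap_threshold_s:
        return [(s1, max(e1, e2))]       # one continuous spraying pass
    return [(s1, e1), (s2, e2)]          # two separate passes
```

Merging avoids rapid open/close cycling of the valve when two targets sit close together along the travel direction.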
In one embodiment, the first working time interval is determined based on the first target object in the current area and a target object in a previous area.
In one embodiment, the control apparatus further includes a spray head confirmation module 1003 configured to determine a corresponding target spray head among the plurality of spray heads according to the first actual position, the number of spray heads being plural. The working time determination module 1002 is further configured to determine the actual distance between the first target object and the target spray head according to the first actual position; acquire the moving speed of the agricultural machine and the control time delay of the agricultural machine; and determine the first working time interval of the target spray head according to the actual distance, the moving speed, and the control time delay.
In one embodiment, the spray head confirmation module 1003 is further configured to divide the current area into a plurality of sub-areas corresponding to the plurality of spray heads, and to determine the spray head corresponding to the sub-area in which the first actual position is located as the target spray head.
In one embodiment, the spray head confirmation module 1003 is further configured to determine the plurality of sub-areas in which the first actual position is located; determine a target sub-area according to the ratio of the area occupied by the first actual position in each sub-area to the area of the first actual position; and determine the spray head corresponding to the target sub-area as the target spray head.
In one embodiment, the spray head confirmation module 1003 is further configured to obtain a first ratio and a second ratio of the area occupied by the first actual position in the adjacent first sub-area and second sub-area, respectively, to the area of the first actual position, and to determine the first sub-area as the target sub-area when the first ratio is higher than the first threshold and the second ratio is lower than the second threshold.
In one embodiment, the spray head confirmation module 1003 is further configured to determine both the first sub-area and the second sub-area as target sub-areas when the ratio of the first ratio to the second ratio is within a preset proportion interval.
In one embodiment, the control apparatus further includes a spray head control module 1004 configured to control the spray head to be turned on and off according to the determined actual working time interval after the working time determination module 1002 determines the actual working time interval of the spray head.
In one embodiment, the spray range of the spray head comprises a first spray range and a second spray range, the second spray range being greater than the first spray range; the spray head control module 1004 is further configured to switch the spray range of a target spray head corresponding to a target sub-region from a first spray range to a second spray range.
In one embodiment, the target object determination module 1001 is further configured to acquire an area image of the current area of the farmland, and the first actual position is determined using the method for determining the position of the target object described above. In one embodiment, the spray head control module 1004 is further configured to turn off the spray head when no second target object is detected in the next area of the farmland and the end time corresponding to the first working time interval has been reached.
In one embodiment, the working time determination module 1002 is further configured to take the earlier of the start time of the first working time interval and the start time of the second working time interval as the opening time of the spray head.
The control device for the agricultural machine comprises a processor and a memory, wherein the target object determination module, the working time determination module, and the like are stored in the memory as program units, and the processor executes the program units stored in the memory to implement the corresponding functions.
In one embodiment, the processor includes a kernel that retrieves the corresponding program unit from the memory. One or more kernels may be provided, and the position of the target object is determined by adjusting kernel parameters.
In one embodiment, a processor comprises:
an image acquisition module 901 configured to acquire an area image of the farmland.
An image position confirmation module 902 configured to input the area image into a perception model to determine an image position of the target object in the area image, the perception model being trained with image training samples generated by a generative adversarial network.
An actual position determination module 903 configured to acquire camera calibration information of an image capture device for capturing an area image; and determining the actual position of the target object according to the image position and the camera calibration information.
In one embodiment, the processor further comprises a model training module 904 configured to: acquire a plurality of area image samples; perform data enhancement on the area image samples through a generative adversarial network to obtain a plurality of image training samples; and input the image training samples into the perception model to train the perception model.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
An embodiment of the present invention provides a storage medium having stored thereon a program that, when executed by a processor, implements the above-described method for determining a position of a target object.
The embodiment of the invention provides a processor, which is used for running a program, wherein the program executes the method for determining the position of the target object when running.
In one embodiment, the processor includes a kernel that retrieves the corresponding program unit from the memory. One or more kernels may be provided, and the position of the target object is determined by adjusting kernel parameters.
In one embodiment, a processor comprises:
a target object determination module 1001 configured to obtain a first actual position of a first target object in a current area of an agricultural field during operation of the agricultural machine.
A working time determination module 1002 configured to determine a first working time interval of the spray head according to the first actual position.
The target object determination module 1001 is further configured to obtain a second actual position of a second target object in a next area of the agricultural field.
The working time determination module 1002 is further configured to determine a second working time interval of the spray head according to the second actual position, and to determine the actual working time interval of the spray head according to the first working time interval and the second working time interval.
A spray head confirmation module 1003 configured to determine a corresponding target spray head of the plurality of spray heads according to the first actual position; wherein the number of the spray heads is multiple.
The spray head control module 1004 is configured to control the spray head to be turned on and off according to the determined actual working time interval after the working time determination module 1002 determines the actual working time interval of the spray head.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
An embodiment of the present invention provides a storage medium having a program stored thereon, the program implementing the control method for an agricultural machine described above when executed by a processor.
The embodiment of the invention provides a processor, wherein the processor is used for running a program, and the program executes the control method for the agricultural machine during running.
In one embodiment, as shown in fig. 11, there is provided a control device 1100 for an agricultural machine, the agricultural machine including a spray head, the control device 1100 including: an image capturing device 1101 configured to capture an area image of the agricultural field; and a processor 1102 configured to execute the control method for the agricultural machine described above.
In one embodiment, as shown in fig. 12, there is provided an agricultural machine 1200 comprising: a spray head 1201; and a control device 1100 for an agricultural machine as described above.
In one embodiment, a computer device is provided, which may be a server, and whose internal structure may be as shown in fig. 13. The computer device includes a processor A01, a network interface A02, a memory (not shown), and a database (not shown) connected by a system bus. The processor A01 of the computer device provides computing and control capabilities. The memory of the computer device comprises an internal memory A03 and a non-volatile storage medium A04. The non-volatile storage medium A04 stores an operating system B01, a computer program B02, and a database (not shown in the figure). The internal memory A03 provides an environment for the operation of the operating system B01 and the computer program B02 in the non-volatile storage medium A04. The database of the computer device stores data relevant to performing the method for determining the position of the target object and the control method for the agricultural machine. The network interface A02 of the computer device communicates with an external terminal through a network connection. The computer program B02, when executed by the processor A01, implements the method for determining the position of the target object.
Those skilled in the art will appreciate that the architecture shown in fig. 13 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computer device to which the disclosed aspects apply; a particular computer device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
Embodiments of the present invention provide an apparatus comprising a processor, a memory and a program stored on the memory and executable on the processor, the processor implementing the steps of the method for determining a position of a target object as described above when executing the program.
The present application further provides a computer program product adapted to perform the steps of the method for determining a position of a target object as described above when executed on a data processing device.
Embodiments of the present invention provide an apparatus comprising a processor, a memory, and a program stored on the memory and executable on the processor, wherein the processor executes the program to implement the steps of the control method for an agricultural machine as described above.
The present application also provides a computer program product adapted to perform the steps of the control method for an agricultural machine as described above, when executed on a data processing device.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (17)

1. A control method for an agricultural machine including a spray head, characterized in that the control method comprises:
acquiring a first actual position of a first target object in a current region of a farmland in the operation process of the agricultural machine;
determining a first working time interval of the spray head according to the first actual position;
acquiring a second actual position of a second target object in a next region of the farmland;
determining a second working time interval of the spray head according to the second actual position;
and determining the actual working time interval of the spray head according to the first working time interval and the second working time interval.
2. The control method according to claim 1, wherein the determining an actual working time interval of the spray head according to the first working time interval and the second working time interval comprises: determining the actual working time interval as extending from the start time of the first working time interval to the end time of the second working time interval when the first working time interval and the second working time interval satisfy any one of the following conditions:
the first working time interval is continuous with the second working time interval;
the first working time interval overlaps with the second working time interval;
and the time interval between the first working time interval and the second working time interval is less than a preset threshold value.
3. The control method according to claim 1, wherein the first working time interval is determined based on the first target object in the current region and a target object in a previous region.
4. The control method according to claim 1, wherein the number of the spray heads is plural; the determining a first operating time interval of the spray head according to the first actual position comprises:
determining a corresponding target spray head in the plurality of spray heads according to the first actual position;
determining the actual distance between the first target object and the target spray head according to the first actual position;
acquiring the moving speed of the agricultural machine and the control time delay of the agricultural machine;
and determining a first working time interval of the target spray head according to the actual distance, the moving speed and the control time delay.
5. The control method of claim 4, wherein said determining a corresponding target spray head in the plurality of spray heads according to the first actual position comprises:
dividing the current region into a plurality of sub-regions corresponding to the plurality of spray heads;
and determining the spray head corresponding to the sub-area where the first actual position is located as the target spray head.
6. The control method according to claim 5, wherein the determining, as the target spray head, the spray head corresponding to the sub-area in which the first actual position is located comprises:
determining a plurality of sub-areas where the first actual positions are located;
determining a target sub-region according to the ratio of the area of the first actual position in each sub-region to the area of the first actual position;
and determining the spray head corresponding to the target subregion as the target spray head.
7. The control method according to claim 6, wherein the determining the target sub-area according to the ratio of the area occupied by the first actual position in each sub-area to the area of the first actual position comprises:
acquiring a first ratio and a second ratio of the area of the first actual position in the adjacent first sub-area and second sub-area to the area of the first actual position respectively;
and determining the first subregion as a target subregion when the first ratio is higher than a first threshold value and the second ratio is lower than a second threshold value.
8. The control method of claim 7, wherein the spray range of the spray head includes a first spray range and a second spray range, the second spray range being greater than the first spray range; the method further comprises the following steps:
and switching the spraying range of the target spray head corresponding to the target subregion from the first spraying range to the second spraying range.
9. The control method according to claim 7, wherein the determining the target sub-area according to the ratio of the area occupied by the first actual position in each sub-area to the area of the first actual position further comprises:
and under the condition that the ratio of the first ratio to the second ratio is within a preset proportion interval, determining the first sub-area and the second sub-area as target sub-areas.
10. The control method of claim 1, wherein said obtaining a first actual position of a first target object in a current area of an agricultural field during operation of the agricultural machine comprises:
acquiring a region image corresponding to the current region of the farmland in the agricultural machinery operation process;
inputting the region image to a perceptual machine model to determine an image position of a target object in the region image;
acquiring camera calibration information of image acquisition equipment for shooting the area image;
and determining a first actual position of the first target object according to the image position and the camera calibration information.
11. The control method according to claim 10, wherein the acquiring camera calibration information of an image capturing apparatus for capturing the area image includes:
calibrating a camera of the image acquisition equipment to obtain a plurality of coordinate data;
and calibrating the camera of the image acquisition equipment through a preset algorithm and the plurality of coordinate data so as to determine the camera calibration information of the image acquisition equipment.
12. The control method according to claim 11, wherein the calibrating the camera of the image capturing device to obtain a plurality of coordinate data comprises:
placing a calibration plate at a first angle of the image acquisition equipment, and recording a first camera coordinate and a corresponding first world coordinate of each feature point of the calibration plate in the image acquisition equipment;
placing the calibration plate at a second angle of the image acquisition equipment, and recording a second camera coordinate and a corresponding second world coordinate of each feature point of the calibration plate in the image acquisition equipment;
and placing the calibration plate at a third angle of the image acquisition equipment, and recording a third camera coordinate and a corresponding third world coordinate of each feature point of the calibration plate at the image acquisition equipment.
13. The control method of claim 1, further comprising one or both of:
closing the spray head under the condition that the second target object is not detected in the next region of the farmland and the end time corresponding to the first working time interval is reached;
and taking the earlier of the starting time of the first working time interval and the starting time of the second working time interval as the opening time of the spray head.
14. A processor, characterized by being configured to execute the control method for an agricultural machine according to any one of claims 1 to 13.
15. A control device for an agricultural machine, the agricultural machine including a spray head, the control device comprising:
an image acquisition device configured to acquire an area image of a farmland; and
the processor of claim 14.
16. An agricultural machine, comprising:
a spray head; and
a control device for an agricultural machine according to claim 15.
17. A machine-readable storage medium having instructions stored thereon which, when executed by a processor, cause the processor to perform the control method for an agricultural machine according to any one of claims 1 to 13.
CN202010884997.7A 2020-08-28 2020-08-28 Control method and control device for agricultural machine, agricultural machine and processor Pending CN114119736A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010884997.7A CN114119736A (en) 2020-08-28 2020-08-28 Control method and control device for agricultural machine, agricultural machine and processor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010884997.7A CN114119736A (en) 2020-08-28 2020-08-28 Control method and control device for agricultural machine, agricultural machine and processor

Publications (1)

Publication Number Publication Date
CN114119736A true CN114119736A (en) 2022-03-01

Family

ID=80374996

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010884997.7A Pending CN114119736A (en) 2020-08-28 2020-08-28 Control method and control device for agricultural machine, agricultural machine and processor

Country Status (1)

Country Link
CN (1) CN114119736A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115399139A (en) * 2022-08-12 2022-11-29 中联农业机械股份有限公司 Method, apparatus, storage medium, and processor for determining crop yield
CN115399139B (en) * 2022-08-12 2024-04-26 中联农业机械股份有限公司 Method, apparatus, storage medium and processor for determining crop yield

Similar Documents

Publication Publication Date Title
RU2767347C2 (en) Method of applying a spraying agent on a field
CN112839511B (en) Method for applying a spray to a field
US11468670B2 (en) Detection and management of target vegetation using machine vision
US11109585B2 (en) Agricultural spraying control system
US20210360850A1 (en) Automatic driving system for grain processing, automatic driving method, and path planning method
US12075769B2 (en) Agricultural sprayer with real-time, on-machine target sensor
US10405535B2 (en) Methods, systems and devices relating to real-time object identification
US20240324579A1 (en) Agricultural sprayer with real-time on-machine target sensor and confidence level generator
AU2019419580B2 (en) Grain processing self-driving system, self-driving method, and automatic recognition method
US11832609B2 (en) Agricultural sprayer with real-time, on-machine target sensor
US11712032B2 (en) Device to detect and exercise control over weeds applied on agricultural machinery
CN115443845B (en) Tea garden tea tree lesion and growth condition monitoring method based on unmanned aerial vehicle
CN114119736A (en) Control method and control device for agricultural machine, agricultural machine and processor
US20230028506A1 (en) Method for Processing Plants in a Field
DE102019218192A1 (en) Method of working crops in a field
Braun et al. Improving pesticide spray application in vineyards by automated analysis of the foliage distribution pattern in the leaf wall
Tangwongkit et al. Development of a real-time, variable rate herbicide applicator using machine vision for between-row weeding of sugarcane fields
RU2774651C1 (en) Automatic driving system for grain processing, automatic driving method and trajectory planning method
US20220406039A1 (en) Method for Treating Plants in a Field
CN118235586A (en) Machine vision-based pesticide spraying and fertilizer applying method and device for vine fruit trees
CN112766178A (en) Method, device, equipment and medium for positioning pests based on intelligent pest control system
Braun et al. Visual analysis of vineyard foliage distribution.

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination