WO2020157878A1 - Computer system, crop growth support method, and program - Google Patents

Computer system, crop growth support method, and program

Info

Publication number
WO2020157878A1
Authority
WO
WIPO (PCT)
Prior art keywords
position information
point
plant
herbicide
image
Prior art date
Application number
PCT/JP2019/003256
Other languages
English (en)
Japanese (ja)
Inventor
俊二 菅谷
Original Assignee
株式会社オプティム
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社オプティム
Priority to JP2020569247A (JP7068747B2)
Priority to PCT/JP2019/003256 (WO2020157878A1)
Publication of WO2020157878A1


Classifications

    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01G: HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
    • A01G 7/00: Botany in general
    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M: CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M 7/00: Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass

Definitions

  • The present invention relates to a computer system, a herbicide spraying support method, and a program for supporting the spraying of a herbicide.
  • Conventionally, weeds that have grown in a field have been removed by drones.
  • As one weeding method, there is a configuration in which the position of a weed growing in a field is specified and a drone is moved to that position to perform weeding.
  • There is also disclosed a configuration in which a drone is run along a ridge provided in the field and photographs the ridge, image analysis of the photographed image determines whether anything other than crops (for example, weeds) has been photographed, and, if so, weeding is performed using a weeding nail provided on the drone (see Patent Document 1).
  • However, the configuration of Patent Document 1 targets only plants growing on the ridges for weeding, and is therefore not suitable for weeding plants in other places.
  • An object of the present invention is to provide a computer system, a herbicide spraying support method, and a program with which a herbicide is sprayed on an arbitrary point and weeds can be easily removed.
  • the present invention provides the following solutions.
  • The present invention provides a computer system that supports the spraying of a herbicide, comprising: first acquisition means for acquiring position information of the seeding points of a crop; storage means for storing the position information of the seeding points; second acquisition means for acquiring a photographed image of the field and position information of the photographing point; detection means for performing image analysis on the photographed image and detecting a plant; specifying means for specifying position information of the plant based on the position information of the photographing point; and spraying means for spraying a herbicide on plants other than those at the seeding points, based on the stored position information of the seeding points and the specified position information of the plant.
  • According to the present invention, the computer system that supports the spraying of the herbicide acquires the position information of the seeding points of the crop, stores it, acquires a photographed image of the field and position information of the photographing point, performs image analysis on the photographed image to detect a plant, specifies position information of the plant based on the position information of the photographing point, and sprays the herbicide on plants other than those at the seeding points based on the stored position information of the seeding points and the specified position information of the plant.
  • Although the present invention is a system category, the same actions and effects according to the category are exhibited in other categories such as methods and programs.
  • Further, the present invention provides a computer system that supports the spraying of a herbicide, comprising:
  • first acquisition means for acquiring a first photographed image of the field and position information of the photographing point; first detection means for performing image analysis on the first photographed image and detecting sprouts of the crop; first specifying means for specifying position information of the sprout points where the sprouts are present, based on the position information of the photographing point; storage means for storing the position information of the specified sprout points;
  • second acquisition means for acquiring a second photographed image of the field, taken at a time different from the first photographed image, and position information of the photographing point;
  • second detection means for performing image analysis on the second photographed image and detecting a plant; second specifying means for specifying position information of the plant based on the position information of the photographing point; and spraying means for spraying a herbicide on plants other than those at the sprout points, based on the stored position information of the sprout points and the specified position information of the plant.
  • According to the present invention, the computer system that supports the spraying of the herbicide acquires a first photographed image of the field and position information of the photographing point, performs image analysis on the first photographed image to detect sprouts of the crop, specifies position information of the sprout points where the sprouts are present based on the position information of the photographing point, and stores the specified position information of the sprout points.
  • It then acquires a second photographed image of the field, taken at a time different from the first photographed image, together with position information of the photographing point, performs image analysis on the second photographed image to detect a plant, and specifies position information of the plant based on the position information of the photographing point.
  • Based on the stored position information of the sprout points and the specified position information of the plant, the herbicide is sprayed on plants other than those at the sprout points.
  • Although the present invention is a system category, the same actions and effects according to the category are exhibited in other categories such as methods and programs.
  • According to the present invention, it is possible to provide a computer system, a herbicide spraying support method, and a program with which a herbicide is sprayed on an arbitrary point and weeds are easily removed.
  • FIG. 1 is a diagram showing an outline of a herbicide spraying support system 1.
  • FIG. 2 is an overall configuration diagram of the herbicide spraying support system 1.
  • FIG. 3 is a diagram showing a flowchart of the seeding point storage processing executed by the computer 10.
  • FIG. 4 is a diagram showing a flowchart of the first herbicide spraying support process executed by the computer 10.
  • FIG. 5 is a diagram showing a flowchart of the first learning process executed by the computer 10.
  • FIG. 6 is a diagram showing a flowchart of the sprout point storage process executed by the computer 10.
  • FIG. 7 is a diagram showing a flowchart of the second herbicide application support process executed by the computer 10.
  • FIG. 8 is a diagram showing a flowchart of the second learning process executed by the computer 10.
  • FIG. 1 is a diagram for explaining the outline of a herbicide spraying support system 1 which is a preferred embodiment of the present invention.
  • the herbicide spraying support system 1 is a computer system including a computer 10 and supporting spraying of a herbicide.
  • The herbicide spraying support system 1 may also include a drone, agricultural machinery, a high-performance agricultural machine, a worker terminal owned by a worker who cultivates the crop (for example, a smartphone, a tablet terminal, or a personal computer), and other terminals and devices such as other computers. Further, the herbicide spraying support system 1 may be realized by a single computer such as the computer 10, or by a plurality of computers such as cloud computers.
  • the computer 10 is connected to a drone, an agricultural machine tool, a high-performance agricultural machine, a worker terminal, other computers, etc. so as to be able to perform data communication via a public line network, etc., and executes necessary data transmission/reception.
  • the computer 10 acquires the position information on the seeding point of the crop when sowing the crop.
  • the computer 10 acquires the position information of the seeding point from, for example, an agricultural machine tool that has performed seeding, a high-performance agricultural machine, or a worker terminal that is possessed by an operator who performed seeding.
  • the computer 10 stores the position information of this seeding point.
  • the computer 10 acquires a photographed image of the field and position information of the photographing point.
  • The computer 10 acquires, for example, a captured image obtained by photographing each point in the field with a drone.
  • The computer 10 acquires from the drone, for example, the position information of the photographing point at which the drone captured the photographed image.
  • the computer 10 analyzes the photographed image and detects the plants shown in this photographed image.
  • As the image analysis, the computer 10 extracts, for example, feature points (for example, shape, contour, hue) and feature amounts (for example, statistical values such as the average, variance, or histogram of pixel values) of the captured image.
  • The computer 10 detects the plant shown in the captured image based on the extracted feature points and feature amounts.
  • The plants here include the sown crop and weeds that are targets of weeding.
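As a minimal sketch of the kind of statistical feature extraction described above, the following uses NumPy to compute the pixel mean, variance, and histogram of an image patch; the function name and the exact feature set are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def extract_features(image: np.ndarray, bins: int = 16) -> dict:
    """Compute simple statistical features of an image patch.

    `image` is an H x W x 3 RGB array. The feature names mirror the
    examples in the text (average of pixel values, variance, histogram);
    the features actually used by the described system are not specified.
    """
    pixels = image.reshape(-1, image.shape[-1]).astype(np.float64)
    hist, _ = np.histogram(pixels, bins=bins, range=(0, 255))
    return {
        "mean": pixels.mean(axis=0),      # average pixel value per channel
        "variance": pixels.var(axis=0),   # per-channel variance
        "histogram": hist / hist.sum(),   # normalized intensity histogram
    }
```

Such a feature vector could then be compared against reference features of known plants, as the text goes on to describe.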
  • the computer 10 identifies the detected positional information of the plant based on the acquired positional information of the photographing location.
  • The computer 10 treats the position information of the photographing point as the position information of the detected plant. As a result, the computer 10 specifies the position of the detected plant in the field.
  • the computer 10 sprays the herbicide on the plants existing at positions other than this seeding point based on the stored position information of the seeding point and the stored position information of the plant.
  • If the position information of the plant matches the position information of a seeding point, the computer 10 determines that this plant is a crop and does not spray the herbicide.
  • If they do not match, the computer 10 determines that this plant is a weed and sprays the herbicide.
  • the computer 10 moves the drone to the position of this weed based on the position information of the plant judged to be a weed, and sends a command to the drone to spray the herbicide by the herbicide spraying device of this drone.
  • the drone receives this command and, on the basis of this command, sprays the herbicide to the plants existing at the position other than the sowing point.
  • the computer 10 sprays the herbicide on the plants other than the seeding point.
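The spraying decision above can be sketched as a position comparison against the stored seeding points. The matching radius is an assumed parameter; the text says only that the positions are compared for a match:

```python
from math import hypot

# Tolerance (metres) within which a detected plant is treated as the sown
# crop. The patent gives no concrete value; this is an assumed parameter.
MATCH_RADIUS_M = 0.15

def should_spray(plant_pos, seeding_points, radius=MATCH_RADIUS_M):
    """Return True when a detected plant is not at any stored seeding
    point, i.e. it is judged to be a weed and the herbicide is sprayed."""
    x, y = plant_pos
    return all(hypot(x - sx, y - sy) > radius for sx, sy in seeding_points)
```

A plant within the radius of a seeding point is judged to be the crop and left unsprayed; everything else is treated as a weed.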
  • the computer 10 can also learn the detected plant image.
  • the computer 10 detects plants by taking this learning result into consideration when performing image analysis on the acquired captured image.
  • the computer 10 acquires the first photographed image of the field and the position information of the photographing point.
  • the computer 10 acquires, for example, a first photographed image obtained by photographing each point in the field with a drone.
  • The computer 10 acquires from the drone, for example, the position information of the photographing point at which the drone captured the first photographed image.
  • the computer 10 analyzes the first photographed image and detects the sprout of the crop shown in this first photographed image.
  • the computer 10 extracts, for example, a feature point or a feature amount of the first captured image as image analysis.
  • the computer 10 detects the sprout of the crop shown in the first photographed image based on the extracted characteristic points and characteristic amounts.
  • the computer 10 identifies the detected position information of the sprout of the crop based on the acquired position information of the shooting point.
  • The computer 10 treats the position information of the photographing point as the position of the detected sprout of the crop and identifies it as the position information of the sprout point. As a result, the computer 10 identifies the position of the detected sprout in the field as a sprout point.
  • the computer 10 stores the position information of this sprout point.
  • the computer 10 acquires the second photographed image of the field photographed at a different time from the first photographed image and the position information of the photographing point.
  • the computer 10 acquires, for example, a second photographed image obtained by photographing each point in the field with a drone.
  • The computer 10 acquires from the drone, for example, the position information of the photographing point at which the drone captured the second photographed image.
  • the computer 10 analyzes the image of the second captured image, and detects the plants in the second captured image.
  • the computer 10 extracts the feature points and the feature amount of the second captured image as image analysis, for example.
  • the computer 10 detects the plant in the second captured image based on the extracted feature points and feature amounts.
  • The plants in this case include crops grown from the sprouts and weeds that are targets of weeding.
  • the computer 10 identifies the detected positional information of the plant based on the acquired positional information of the photographing location.
  • The computer 10 treats the position information of the photographing point as the position information of the detected plant. As a result, the computer 10 specifies the position of the detected plant in the field.
  • the computer 10 sprays the herbicide on the plants existing at the positions other than the sprout point based on the stored position information of the sprout point and the stored position information of the plant.
  • If the position information of the plant matches the position information of a sprout point, the computer 10 determines that this plant is a crop and does not spray the herbicide.
  • If they do not match, the computer 10 determines that this plant is a weed and sprays the herbicide.
  • the computer 10 moves the drone to the position of this weed based on the position information of the plant judged to be a weed, and sends a command to the drone to spray the herbicide by the herbicide spraying device of this drone.
  • The drone receives this command and, on the basis of this command, sprays the herbicide on the plants existing at positions other than the sprout points.
  • In this way, the computer 10 sprays the herbicide on the plants other than those at the sprout points.
  • the computer 10 can also learn the detected plant image.
  • the computer 10 detects plants by taking this learning result into consideration when performing image analysis on the acquired second captured image.
  • the computer 10 acquires the position information of the seeding point of the crop when sowing the crop (step S01).
  • the computer 10 acquires the position information of the seeding point from the agricultural machine tool that has sowed the seed, the high-performance agricultural machine, or the worker terminal possessed by the worker who has seeded the seed.
  • the computer 10 stores the position information of this seeding point (step S02).
  • the computer 10 acquires a photographed image of the field and position information of the photographing point as photographing data (step S03).
  • The computer 10 acquires, for example, a captured image obtained by photographing each point in the field with a drone.
  • The computer 10 acquires from the drone, for example, the position information of the photographing point at which the drone captured the photographed image.
  • the computer 10 acquires such photographed images and positional information of photographing points as photographing data.
  • the computer 10 analyzes the photographed image to detect the plants in the photographed image (step S04).
  • the computer 10 extracts, for example, a feature point or a feature amount of a captured image as image analysis.
  • the computer 10 detects the plant shown in the captured image based on the extracted characteristic points and characteristic amounts.
  • The plants include the sown crop and weeds that are targets of weeding.
  • the computer 10 specifies the detected position information of the plant based on the acquired position information of the photographing location (step S05).
  • The computer 10 treats the position information of the photographing point as the position information of the detected plant. As a result, the computer 10 specifies the position of the detected plant in the field.
  • the computer 10 sprays the herbicide on the plants existing at positions other than the seeding point based on the stored position information of the seeding point and the stored position information of the plant (step S06).
  • If the position information of the plant matches the position information of a seeding point, the computer 10 determines that this plant is a crop and does not spray the herbicide.
  • If they do not match, the computer 10 determines that this plant is a weed and sprays the herbicide.
  • the computer 10 moves the drone to the position of this weed based on the position information of the plant determined to be a weed, and transmits a command to the drone to spray the herbicide by the herbicide spraying device of this drone.
  • the drone receives this command and, on the basis of this command, sprays the herbicide to the plants existing at the position other than the sowing point.
  • the computer 10 sprays the herbicide on the plants other than the seeding point.
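The flow of steps S03 to S06 above can be sketched as a single loop; the callables stand in for the detection, specification, and spraying means described in the text, and the exact-match comparison is a simplification (in practice a distance tolerance would likely be used):

```python
def herbicide_support_cycle(seeding_points, imaging_data, detect, locate, spray):
    """One pass of the first herbicide spraying support process.

    imaging_data: iterable of (photographed_image, shooting_point) pairs
    detect(image)      -> list of detected plants        (step S04)
    locate(plant, pos) -> position of a detected plant   (step S05)
    spray(position)    -> spray herbicide at a position  (step S06)
    """
    for image, shooting_point in imaging_data:
        for plant in detect(image):
            plant_pos = locate(plant, shooting_point)
            # Spray only plants that are not at a stored seeding point;
            # exact equality is an illustrative stand-in for the match.
            if all(plant_pos != sp for sp in seeding_points):
                spray(plant_pos)
```

The second embodiment (FIG. 6 and FIG. 7) follows the same shape, with stored sprout points in place of seeding points.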
  • the computer 10 acquires the first photographed image of the field and the position information of the photographing point as the first photographing data (step S10).
  • the computer 10 acquires, for example, a first photographed image obtained by photographing each point in the field with a drone.
  • The computer 10 acquires from the drone, for example, the position information of the photographing point at which the drone captured the first photographed image.
  • the computer 10 analyzes the image of the first captured image and detects the sprout of the crop shown in the first captured image (step S11).
  • the computer 10 extracts, for example, a feature point or a feature amount of the first captured image as image analysis.
  • the computer 10 detects the sprout of the crop shown in the first photographed image based on the extracted characteristic points and characteristic amounts.
  • the computer 10 identifies the position information of the detected sprout of the crop based on the acquired position information of the shooting point (step S12).
  • The computer 10 treats the position information of the photographing point as the position of the detected sprout of the crop and identifies it as the position information of the sprout point. As a result, the computer 10 identifies the position of the detected sprout in the field as a sprout point.
  • the computer 10 stores the position information of this sprout point (step S13).
  • the computer 10 acquires the second captured image obtained by capturing the field at a time different from the first captured image and the position information of the capturing location as the second captured data (step S14).
  • the computer 10 acquires, for example, a second photographed image obtained by photographing each point in the field at a time different from the first photographed image using a drone.
  • The computer 10 acquires from the drone, for example, the position information of the photographing point at which the drone captured the second photographed image.
  • the computer 10 analyzes the image of the second captured image, and detects the plants in the second captured image (step S15).
  • the computer 10 extracts the feature points and the feature amount of the second captured image as image analysis, for example.
  • the computer 10 detects the plant in the second captured image based on the extracted feature points and feature amounts.
  • The plants in this case include crops grown from the sprouts and weeds that are targets of weeding.
  • the computer 10 identifies the detected positional information of the plant based on the acquired positional information of the photographing location (step S16).
  • The computer 10 treats the position information of the photographing point as the position information of the detected plant. As a result, the computer 10 specifies the position of the detected plant in the field.
  • the computer 10 sprays the herbicide on the plants existing at the positions other than the sprout point based on the stored position information of the sprout point and the specified position information of the plant (step S17).
  • If the position information of the plant matches the position information of a sprout point, the computer 10 determines that this plant is a crop and does not spray the herbicide.
  • If they do not match, the computer 10 determines that this plant is a weed and sprays the herbicide.
  • the computer 10 moves the drone to the position of this weed based on the position information of the plant determined to be a weed, and transmits a command to the drone to spray the herbicide by the herbicide spraying device of this drone.
  • The drone receives this command and, on the basis of this command, sprays the herbicide on the plants existing at positions other than the sprout points.
  • In this way, the computer 10 sprays the herbicide on the plants other than those at the sprout points.
  • FIG. 2 is a diagram showing a system configuration of a herbicide spraying support system 1 which is a preferred embodiment of the present invention.
  • the herbicide spraying support system 1 is a computer system which is composed of a computer 10 and supports spraying of the herbicide.
  • the computer 10 is connected to a drone, an agricultural machine tool, a high-performance agricultural machine, a worker terminal, and other computers so as to be able to perform data communication via a public line network, etc., and executes necessary data transmission/reception.
  • the herbicide spraying support system 1 may include a drone, an agricultural machine, a high-performance agricultural machine, a worker terminal, other computers, and other terminals and devices, which are not shown. Further, the herbicide spraying support system 1 may be realized by a single computer such as the computer 10, or may be realized by a plurality of computers such as a cloud computer.
  • The computer 10 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like, and includes, as a communication unit, a device for enabling communication with other terminals or devices, such as an IEEE 802.11-compliant Wi-Fi (Wireless Fidelity) device.
  • the computer 10 also includes, as a storage unit, a data storage unit such as a hard disk, a semiconductor memory, a recording medium, or a memory card. Further, the computer 10 includes various devices that execute various processes as a processing unit.
  • In the computer 10, the control unit reads a predetermined program and, in cooperation with the communication unit, realizes the seeding position acquisition module 20, the imaging data acquisition module 21, and the command transmission module 22. Further, in the computer 10, the control unit reads a predetermined program and, in cooperation with the storage unit, realizes the storage module 30. Further, in the computer 10, the control unit reads a predetermined program and, in cooperation with the processing unit, realizes the image analysis module 40, the position identification module 41, the crop identification module 42, the command creation module 43, and the learning module 44.
  • FIG. 3 is a diagram showing a flowchart of the seeding point storage processing executed by the computer 10. The processing executed by each module described above will be described together with this processing.
  • the seeding position acquisition module 20 acquires the position information of the seeding point of the crop (step S20).
  • the seeding position acquisition module 20 acquires the position information of the seeding point from the seeding equipment or device or the worker terminal possessed by the worker who seeded.
  • the worker terminal acquires its own position information from the GPS (Global Positioning System) or the like at the seeded position.
  • the worker terminal transmits the acquired position information of itself to the computer 10 as the position information of the seeding point.
  • the seeding position acquisition module 20 acquires the position information of the seeding point of this crop by receiving the position information of the seeding point transmitted by the worker terminal.
  • the storage module 30 stores the position information of the seeding point of this crop (step S21).
  • the storage module 30 may store only the position information of the seeding point of this crop, or may store it in association with the operator, the user, the identifier of the field, and the like.
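A minimal sketch of the storage module's behaviour, assuming an SQLite table whose schema is illustrative: the seeding point may be stored alone or associated with a worker, user, and field identifier, as described above:

```python
import sqlite3

def store_seeding_point(conn, lat, lon, worker=None, user=None, field=None):
    """Store one seeding point, optionally associated with a worker,
    user, and field identifier. The table layout is an assumption; the
    patent specifies only what is stored, not how."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS seeding_points "
        "(lat REAL, lon REAL, worker TEXT, user TEXT, field TEXT)"
    )
    conn.execute(
        "INSERT INTO seeding_points VALUES (?, ?, ?, ?, ?)",
        (lat, lon, worker, user, field),
    )
    conn.commit()
```

The optional columns simply remain NULL when the position is stored alone.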
  • FIG. 4 is a diagram showing a flowchart of the first herbicide spraying support process executed by the computer 10. The processing executed by each module described above will be described together with this processing.
  • the photographing data acquisition module 21 acquires the photographed image of the field and the position information of the photographing point as photographing data (step S30).
  • the imaging data acquisition module 21 acquires, as imaging data, for example, imaging images taken by the drone at a plurality of preset imaging points in the field and position information of the imaging points at which the drone images the imaging images.
  • The drone photographs the area directly below itself as the captured image. That is, the drone captures the image from directly above, perpendicular to the field.
  • the drone captures a captured image at the capturing location and acquires its own position information from the GPS or the like.
  • the drone handles the acquired position information of itself as the position information of the shooting point.
  • the drone transmits the photographed image and the position information of the photographing point to the computer 10 as photographing data.
  • the imaging data acquisition module 21 acquires imaging data by receiving the imaging data transmitted by the drone.
  • the computer 10 acquires the photographed image of the field and the position information of the photographing point.
  • the image analysis module 40 analyzes the photographed image based on the photographed data (step S31). In step S31, the image analysis module 40 extracts feature points and feature amounts in this captured image. The image analysis module 40 extracts, for example, the shape, hue, etc. of an object existing in a captured image as image analysis.
  • the image analysis module 40 determines whether or not a plant can be detected based on the result of image analysis (step S32).
  • In step S32, the image analysis module 40 compares the extracted feature points and feature amounts with a plant database in which feature points, feature amounts, and identifiers of plants are registered in advance, and determines whether a plant is present in the captured image.
  • The image analysis module 40 determines that a plant has been detected when the feature points or feature amounts extracted this time match those registered in the plant database, and determines that a plant could not be detected when they do not match.
  • the image analysis module 40 also determines the identifier of the plant based on the feature point and the feature amount of the plant and the plant database.
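The comparison against the plant database can be sketched as a nearest-match lookup over feature vectors; the distance metric and threshold are assumptions, since the text states only that matching features count as a detection and also yield the plant's identifier:

```python
import numpy as np

def match_plant(features, plant_db, threshold=1.0):
    """Return the identifier of the closest registered plant, or None.

    `plant_db` maps a plant identifier to a reference feature vector.
    The Euclidean distance and the threshold are illustrative choices,
    not specified by the patent.
    """
    best_id, best_dist = None, threshold
    for plant_id, ref in plant_db.items():
        dist = float(np.linalg.norm(np.asarray(features) - np.asarray(ref)))
        if dist < best_dist:
            best_id, best_dist = plant_id, dist
    return best_id
```

A return value of None corresponds to the "plant could not be detected" branch of step S32.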
  • step S32 when the image analysis module 40 determines that the plant could not be detected as a result of the image analysis (step S32 NO), the computer 10 ends this processing.
  • the computer 10 may be configured to execute the above-described processing of step S30 and acquire the shooting data different from the shooting data subjected to the image analysis this time.
  • In step S32, when the image analysis module 40 determines that a plant could be detected as a result of the image analysis (step S32: YES), the position specifying module 41 specifies the position information of the detected plant based on the position information in the photographing data (step S33).
  • step S33 the position specifying module 41 specifies the position information of the shooting point corresponding to the shot image subjected to the image analysis this time based on the shooting data.
  • the position specifying module 41 specifies the position information of the shooting point as the position information of this plant.
  • the position specifying module 41 specifies the detailed position of this plant based on the position information of this plant and the coordinates in the captured image.
  • the position specifying module 41 sets an orthogonal coordinate system for the picked-up image and specifies the position in the picked-up image in which this plant is detected as the X coordinate and the Y coordinate in this picked-up image.
  • the position specifying module 41 specifies the position information of the plant in the actual field on the basis of the position information at the shooting point and the X and Y coordinates.
  • the position specifying module 41 specifies the center position of the captured image as position information at the shooting point, and specifies the X coordinate and the Y coordinate as positions with respect to the center position.
  • the position specifying module 41 will specify the position of this plant in the field.
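The conversion from the X and Y coordinates in the captured image to a position in the actual field can be sketched as follows, assuming the image centre maps to the shooting point and a known ground resolution (metres per pixel); the resolution depends on flight altitude and camera optics and is not given in the text:

```python
def pixel_to_field(center_east, center_north, px, py, img_w, img_h, m_per_px):
    """Return field coordinates (east, north) in metres for pixel (px, py).

    The image centre maps to the shooting point; pixel offsets from the
    centre are scaled by the ground resolution `m_per_px` (assumed known).
    """
    east = center_east + (px - img_w / 2) * m_per_px
    # Image rows grow downward, so a larger py means further south.
    north = center_north - (py - img_h / 2) * m_per_px
    return east, north
```

With this mapping, the centre pixel of the image yields exactly the shooting-point position, matching the description above.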
  • The crop specifying module 42 determines whether or not this plant is the sown crop based on the position information of the seeding points stored by the process of step S21 described above and the specified position information of the plant (step S34).
  • In step S34, the crop identification module 42 compares the position information of the seeding points with the specified position information of the plant, and makes this determination based on whether or not they match. If the crop identification module 42 determines that they match, it determines that this plant is a crop (the sown crop has grown) (step S34: YES), and ends this processing without executing the subsequent processing.
  • the computer 10 may be configured to execute the process of step S30 described above and acquire the image data different from the image data subjected to the image analysis this time, similarly to the process of step S32 described above.
  • the computer 10 does not create or transmit the command necessary for spraying the herbicide, and thus does not spray the crop with the herbicide.
  • In step S34, if the crop identifying module 42 determines that they do not match, it determines that this plant is not a crop (not the grown sown crop but a weed) (step S34: NO), and the command creation module 43 creates a command for causing a drone to spray a herbicide to weed this plant (step S35).
  • In step S35, the command creation module 43 creates a flight command to fly to the position in the field corresponding to the position information of this plant, and a drive command to drive the herbicide spraying device of the drone.
  • the commands created by the command creation module 43 will be described.
  • As flight commands, the command creation module 43 creates the commands necessary to fly to the position information of this plant. Further, the command creation module 43 refers to a herbicide database, in which plant identifiers and the identifiers of effective herbicides are registered in advance, and identifies an effective herbicide for the plant detected this time. Furthermore, the command creation module 43 determines the required spraying amount of the herbicide based on the extracted feature points and feature amounts; for example, it determines the amount according to the size and shape of the extracted plant. As a drive command, the command creation module 43 creates the command necessary to apply the determined herbicide identifier and spraying amount to the herbicide spraying device.
  • Alternatively, the command creation module 43 may determine only the spraying amount of the herbicide, and create, as the drive command, only the command necessary for the herbicide spraying device to spray this amount.
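The command assembly of step S35 might look like the following sketch. The database layout, the command fields, and the size-based dosing rule are all illustrative assumptions made for this example:

```python
# Hypothetical sketch of step S35: build a flight command and a drive command.

# Assumed herbicide database: plant identifier -> effective herbicide identifier.
HERBICIDE_DB = {"barnyardgrass": "herbicide-A", "crabgrass": "herbicide-B"}

def spraying_amount_ml(plant_area_cm2):
    """Illustrative dosing rule: amount scales with the plant's apparent size."""
    return round(0.5 * plant_area_cm2, 1)

def create_commands(plant_id, plant_pos, plant_area_cm2):
    # Flight command: fly to the field position specified for this plant.
    flight_command = {"type": "fly_to", "lat": plant_pos[0], "lon": plant_pos[1]}
    # Drive command: herbicide identifier and spraying amount for the device.
    drive_command = {
        "type": "spray",
        "herbicide": HERBICIDE_DB[plant_id],
        "amount_ml": spraying_amount_ml(plant_area_cm2),
    }
    return flight_command, drive_command

fly, drive = create_commands("barnyardgrass", (35.0, 139.0), 40.0)
```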
  • The computer 10 executes the above-described processing on a preset area of the field or on the entire field. For example, the computer 10 acquires photographing data for the entire field, judges the presence or absence of a plant at each photographing point, judges whether the plant is a crop, and creates or does not create a command accordingly. When this processing is completed for all the photographing data, the computer 10 executes the process described later.
  • The computer 10 may also be configured to execute the processing described later at the point where it determines, for each piece of photographing data, that the plant is not a crop, rather than after the processing is completed for all the photographing data. That is, the computer 10 may be configured to execute the processing described later on individual photographing data whenever the plant is not a crop.
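The field-wide iteration described in the two bullets above can be sketched as a simple loop over the shooting data. Every callable below is a hypothetical stand-in for the modules described in this specification, not an API defined by it:

```python
# Hypothetical sketch: iterate over all shooting data for the field and collect
# commands only for plants that are not at a seeding point (i.e., weeds).

def process_field(shooting_data, detect_plant, locate, is_crop, make_commands):
    commands = []
    for shot in shooting_data:               # one entry per shooting point
        plant = detect_plant(shot)           # image analysis (steps S31-S32)
        if plant is None:
            continue                         # no plant at this point
        pos = locate(shot, plant)            # position specification (step S33)
        if is_crop(pos):
            continue                         # sown crop: do not spray (S34 YES)
        commands.append(make_commands(pos))  # weed: create commands (step S35)
    return commands

# Toy run with stand-in callables:
shots = [{"plant": None, "pos": None}, {"plant": "weed", "pos": (35.0, 139.0)}]
cmds = process_field(
    shots,
    detect_plant=lambda s: s["plant"],
    locate=lambda s, p: s["pos"],
    is_crop=lambda pos: False,
    make_commands=lambda pos: {"fly_to": pos},
)
```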
  • the command transmission module 22 transmits the created command to the drone and causes the drone to spray the herbicide (step S36).
  • step S36 the command transmission module 22 transmits the flight command and the drive command described above.
  • The drone receives this flight command and drive command. Based on the flight command, the drone flies to the position of the target plant and, based on the drive command, sprays the specified type of herbicide in the specified spraying amount.
  • In this way, the computer 10 causes the herbicide to be sprayed on plants located at points other than the seeding points.
  • the target of the command created and transmitted by the computer 10 is not limited to the drone, and may be other agricultural machinery, high-performance agricultural machinery, or the like.
  • Instead of the flight command, the computer 10 may create a command suited to such an instrument or machine, such as a travel command.
  • FIG. 5 is a diagram showing a flowchart of the first learning process executed by the computer 10. The processing executed by each module described above will be described together with this processing.
  • The learning module 44 learns the photographed image of the plant to which the herbicide was sprayed (step S40). In step S40, the learning module 44 learns the captured image of the plant that the crop identifying module 42 determined, by the process of step S34 described above, not to be the sown crop. The learning module 44 learns the feature points and feature amounts of this plant and the fact that this plant is not a sown crop.
  • the storage module 30 stores the learning result (step S41).
  • the computer 10 executes the processes of steps S32 and S34 described above in consideration of the stored learning result.
  • The image analysis module 40 determines whether or not a plant could be detected by adding the learning result to the image analysis result. For example, the image analysis module 40 can make this determination based on both the feature points and feature amounts extracted as a result of the image analysis and the feature points and feature amounts in the learning result.
  • The crop identifying module 42 determines whether or not the plant is a crop based on the feature points and feature amounts in the learning result, in addition to the comparison of the position information. For example, even if the position information matches, the crop identifying module 42 can determine that the plant is not a crop when the image of the plant matches or resembles a plant that the learning result identifies as not being a crop.
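One way to read this combined check is: the plant is a crop only if its position matches a stored point AND it does not resemble any plant learned as "not a crop". A hypothetical sketch follows; the similarity measure and the threshold are assumptions, not part of the specification:

```python
# Hypothetical sketch: combine the position comparison with learned weed features.

def similarity(features_a, features_b):
    """Toy similarity between two feature vectors (1.0 = identical)."""
    if len(features_a) != len(features_b):
        return 0.0
    matches = sum(1 for a, b in zip(features_a, features_b) if a == b)
    return matches / len(features_a)

def is_crop(position_matches, plant_features, learned_weed_features,
            threshold=0.9):
    """Crop only if the position matches AND the plant does not match or
    closely resemble any plant learned as 'not a crop'."""
    if not position_matches:
        return False
    return all(similarity(plant_features, w) < threshold
               for w in learned_weed_features)

weeds = [[1, 0, 1, 1]]                           # features learned as "not a crop"
assert is_crop(True, [0, 1, 0, 0], weeds)        # position ok, unlike any weed
assert not is_crop(True, [1, 0, 1, 1], weeds)    # looks like a learned weed
```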
  • the above is the first learning process.
  • FIG. 6 is a diagram showing a flowchart of the sprout point storage process executed by the computer 10. The processing executed by each module described above will be described together with this processing. Detailed description of the same processes as those described above is omitted. This process is executed at the time when the sown crop has grown and a new sprout has emerged.
  • the photographing data acquisition module 21 acquires the photographed image of the field and the position information of the photographing point as the first photographing data (step S50).
  • the process of step S50 is similar to the process of step S30 described above.
  • the image analysis module 40 performs image analysis on the captured image based on the first captured data (step S51).
  • the process of step S51 is similar to the process of step S31 described above.
  • the image analysis module 40 determines whether or not the sprout of the crop can be detected based on the result of the image analysis (step S52).
  • In step S52, the image analysis module 40 compares the extracted feature points and feature amounts with a sprout database, in which the feature points and feature amounts of crop sprouts and the crop identifiers are registered in advance, to determine whether a crop sprout exists in the captured image.
  • The image analysis module 40 judges that a crop sprout has been detected when the feature points or feature amounts extracted this time match those registered in the sprout database, and judges that no crop sprout has been detected when they do not match.
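The sprout-database lookup of step S52 can be sketched as follows. The database contents, the feature representation, and the tolerance-based matching rule are all illustrative assumptions:

```python
# Hypothetical sketch of step S52: match extracted features against a sprout
# database that maps crop identifiers to registered sprout feature vectors.

SPROUT_DB = {
    "rice": [0.2, 0.8, 0.5],
    "soybean": [0.7, 0.1, 0.4],
}

def detect_sprout(extracted, tolerance=0.05):
    """Return the crop identifier whose registered sprout features match the
    extracted features within the tolerance, or None if nothing matches."""
    for crop_id, registered in SPROUT_DB.items():
        if all(abs(a - b) <= tolerance for a, b in zip(extracted, registered)):
            return crop_id
    return None

assert detect_sprout([0.21, 0.79, 0.52]) == "rice"   # close to the rice entry
assert detect_sprout([0.9, 0.9, 0.9]) is None        # matches no registered sprout
```

Returning the crop identifier along with the match also covers the identifier determination mentioned in the next bullet.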
  • The image analysis module 40 also determines the identifier of the crop sprout based on its feature points and feature amounts and the sprout database.
  • step S52 when the image analysis module 40 determines that the new shoots of the crop could not be detected as a result of the image analysis (step S52: NO), the computer 10 ends this processing.
  • the computer 10 may be configured to execute the process of step S50 described above and acquire the first captured image data different from the first captured image data subjected to the image analysis this time.
  • In step S52, when the image analysis module 40 determines that a crop sprout could be detected as a result of the image analysis (step S52: YES), the position specifying module 41 specifies the position information of the detected crop sprout based on the position information in the first shooting data (step S53).
  • step S53 the position specifying module 41 specifies the position information of the shooting point corresponding to the shot image subjected to the image analysis this time based on the first shooting data.
  • the position specifying module 41 specifies the position information of the shooting point as the position information of the new shoots of this crop.
  • the position specifying module 41 specifies the detailed position of the sprout of this crop based on the position information of the sprout and the coordinates in the captured image. For example, the position specifying module 41 sets an orthogonal coordinate system on the captured image and specifies the position at which the sprout of this crop is detected as X and Y coordinates in this captured image. The position specifying module 41 then specifies the position information of the crop sprout in the actual field based on the position information at the shooting point and these X and Y coordinates.
  • the position specifying module 41 specifies the center position of the captured image as position information at the shooting point, and specifies the X coordinate and the Y coordinate as positions with respect to the center position. As a result, the position specifying module 41 specifies the position of the sprout of this crop in the field.
  • the storage module 30 stores the position information of the sprout point where the sprout of this crop exists (step S54).
  • the storage module 30 may store only the position information of this sprout point, or may store it in association with the identifiers of the worker, the user, and the field.
  • FIG. 7 is a diagram showing a flowchart of the second herbicide application support process executed by the computer 10. The processing executed by each module described above will be described together with this processing. The detailed description of the same processes as those described above will be omitted.
  • the photographing data acquisition module 21 acquires, as the second photographing data, the photographed image of the field at a time different from the first photographing data and the position information of the photographing point (step S60).
  • the process of step S60 is similar to the process of step S30 described above.
  • the image analysis module 40 performs image analysis on the captured image based on the second captured data (step S61).
  • the process of step S61 is similar to the process of step S31 described above.
  • the image analysis module 40 determines whether or not the plant can be detected based on the result of the image analysis (step S62).
  • the process of step S62 is similar to the process of step S32 described above.
  • step S62 when the image analysis module 40 determines that the plant cannot be detected as a result of the image analysis (step S62 NO), the computer 10 ends this processing.
  • the computer 10 may be configured to execute the process of step S60 described above and acquire the second shooting data different from the second shooting data subjected to the image analysis this time.
  • In step S62, when the image analysis module 40 determines that the plant could be detected as a result of the image analysis (step S62: YES), the position specifying module 41 specifies the position information of the detected plant based on the position information in the shooting data (step S63).
  • the process of step S63 is similar to the process of step S33 described above.
  • the crop identifying module 42 determines whether or not this plant is a crop based on the position information of the sprout point stored by the process of step S54 described above and the position information of the specified plant (step S64).
  • the crop identification module 42 compares the position information of this sprout point with the position information of the specified plant, and makes this determination based on whether or not they match.
  • If the crop identifying module 42 determines that they match, it determines that this plant is a crop (i.e., a sprout of the crop has grown) (step S64: YES), and the processes described later are not executed.
  • Similarly to the process of step S62 described above, the computer 10 may be configured to execute the process of step S60 described above and acquire second shooting data different from the second shooting data subjected to the image analysis this time.
  • the computer 10 does not create or transmit the command necessary for spraying the herbicide, and thus does not spray the crop with the herbicide.
  • In step S64, if the crop identifying module 42 determines that they do not match, it determines that this plant is not a crop (not a grown crop sprout but a weed) (step S64: NO), and the command creation module 43 creates a command for causing the drone to spray a herbicide to weed this plant (step S65).
  • the process of step S65 is similar to the process of step S35 described above.
  • the command transmission module 22 transmits the created command to the drone and causes the drone to spray the herbicide (step S66).
  • the process of step S66 is similar to the process of step S36 described above.
  • In this way, the computer 10 causes the herbicide to be sprayed on plants located at points other than the sprout points.
  • the target of the command created and transmitted by the computer 10 is not limited to the drone, and may be other agricultural machinery, high-performance agricultural machinery, or the like.
  • Instead of the flight command, the computer 10 may create a command suited to such an instrument or machine, such as a travel command.
  • FIG. 8 is a diagram showing a flowchart of the second learning process executed by the computer 10. The processing executed by each module described above will be described together with this processing.
  • The learning module 44 learns the photographed image of the plant to which the herbicide was sprayed (step S70). In step S70, the learning module 44 learns the captured image of the plant that the crop identifying module 42 determined, by the process of step S64 described above, not to be a crop. The learning module 44 learns the feature points and feature amounts of this plant and the fact that this plant is not a crop.
  • the storage module 30 stores the learning result (step S71).
  • the computer 10 executes the processes of steps S62 and S64 described above, taking into consideration the stored learning result.
  • The image analysis module 40 determines whether or not a plant could be detected by adding the learning result to the image analysis result. For example, the image analysis module 40 can make this determination based on both the feature points and feature amounts extracted as a result of the image analysis and the feature points and feature amounts in the learning result.
  • The crop identifying module 42 determines whether or not the plant is a crop based on the feature points and feature amounts in the learning result, in addition to the comparison of the position information. For example, even if the position information matches, the crop identifying module 42 can determine that the plant is not a crop when the image of the plant matches or resembles a plant that the learning result identifies as not being a crop.
  • the above means and functions are realized by a computer (including a CPU, an information processing device, various terminals) reading and executing a predetermined program.
  • The program is provided, for example, in the form of being delivered from a computer via a network (SaaS: Software as a Service).
  • the program is provided in a form recorded in a computer-readable recording medium such as a flexible disk, a CD (CD-ROM, etc.), a DVD (DVD-ROM, DVD-RAM, etc.).
  • The computer reads the program from the recording medium, transfers it to an internal or external recording device, records it, and executes it.
  • the program may be recorded in advance in a recording device (recording medium) such as a magnetic disk, an optical disk, a magneto-optical disk, and provided from the recording device to a computer via a communication line.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Ecology (AREA)
  • Forests & Forestry (AREA)
  • Botany (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Insects & Arthropods (AREA)
  • Pest Control & Pesticides (AREA)
  • Wood Science & Technology (AREA)
  • Zoology (AREA)
  • Catching Or Destruction (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The purpose of the invention is to provide a computer system, a herbicide spraying support method, and a program with which a herbicide can be sprayed and weeds removed easily at any location. The solution is a computer system that supports the spraying of a herbicide and that: obtains position information for a crop seeding point; stores the position information of the seeding point; obtains a captured image of a cultivated field together with position information of the imaging point; analyzes the captured image; detects a plant; specifies position information for the plant on the basis of the position information of the imaging point; and causes a herbicide to be sprayed on plants that are not at a seeding point, on the basis of the stored position information of the seeding points and the specified position information of the plant.
PCT/JP2019/003256 2019-01-30 2019-01-30 Système informatique, procédé d'aide à la croissance de cultures et programme WO2020157878A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2020569247A JP7068747B2 (ja) 2019-01-30 2019-01-30 コンピュータシステム、作物生育支援方法及びプログラム
PCT/JP2019/003256 WO2020157878A1 (fr) 2019-01-30 2019-01-30 Système informatique, procédé d'aide à la croissance de cultures et programme

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/003256 WO2020157878A1 (fr) 2019-01-30 2019-01-30 Système informatique, procédé d'aide à la croissance de cultures et programme

Publications (1)

Publication Number Publication Date
WO2020157878A1 true WO2020157878A1 (fr) 2020-08-06

Family

ID=71841293

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/003256 WO2020157878A1 (fr) 2019-01-30 2019-01-30 Système informatique, procédé d'aide à la croissance de cultures et programme

Country Status (2)

Country Link
JP (1) JP7068747B2 (fr)
WO (1) WO2020157878A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7000619B1 (ja) 2021-05-21 2022-02-14 株式会社オプティム プログラム、方法、射出装置、及びシステム
JP2022065581A (ja) * 2020-10-15 2022-04-27 西武建設株式会社 無人飛行体を用いた除草装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001292684A (ja) * 2000-04-11 2001-10-23 Hokkaido National Agricultural Experiment Station 農薬・肥料散布制御システム
JP2013059296A (ja) * 2011-09-14 2013-04-04 Chugoku Electric Power Co Inc:The 農地管理方法、及び農地管理システム
WO2017106903A1 (fr) * 2015-12-23 2017-06-29 Aerobugs Pty Ltd Ensemble de dispersion de particules et procédé d'utilisation
JP2018525976A (ja) * 2015-07-02 2018-09-13 エコロボティクス・ソシエテ・アノニム 植物有機体を自動処理するためのロボット車両及びロボットを使用する方法
WO2018173577A1 (fr) * 2017-03-23 2018-09-27 日本電気株式会社 Dispositif de calcul d'indice de végétation, procédé de calcul d'indice de végétation et support d'enregistrement lisible par ordinateur


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022065581A (ja) * 2020-10-15 2022-04-27 西武建設株式会社 無人飛行体を用いた除草装置
JP7000619B1 (ja) 2021-05-21 2022-02-14 株式会社オプティム プログラム、方法、射出装置、及びシステム
JP2022178975A (ja) * 2021-05-21 2022-12-02 株式会社オプティム プログラム、方法、射出装置、及びシステム

Also Published As

Publication number Publication date
JP7068747B2 (ja) 2022-05-17
JPWO2020157878A1 (ja) 2021-06-03

Similar Documents

Publication Publication Date Title
EP3741214A1 (fr) Procédé de traitement de plantations basé sur la reconnaissance d'images
JP7086203B2 (ja) 植物体耕作データ測定方法、作業経路計画方法及び装置、システム
CN109197278B (zh) 作业策略的确定方法及装置、药物喷洒策略的确定方法
JP2019520631A (ja) 自然環境における雑草の認識
US11632907B2 (en) Agricultural work apparatus, agricultural work management system, and program
WO2020157878A1 (fr) Système informatique, procédé d'aide à la croissance de cultures et programme
JP2022542764A (ja) 農業機器を用いて農場を処置するための適用マップを生成する方法
JP7075171B2 (ja) コンピュータシステム、病害虫検出方法及びプログラム
Kirkpatrick Technologizing agriculture
CN111479459A (zh) 生长状况或病虫害发生状况的预测系统、方法以及程序
KR20200077808A (ko) 영상기반 감시 장치 및 그 동작방법
CN106645147A (zh) 一种病虫害监测方法
JP7066258B2 (ja) コンピュータシステム、作物生育支援方法及びプログラム
Liu et al. Development of a proximal machine vision system for off-season weed mapping in broadacre no-tillage fallows
EP4187344A1 (fr) Prédiction de distance d'engin de chantier et commande d'action
WO2019244156A1 (fr) Système d'imagerie in situ de tissus végétaux
US20220392214A1 (en) Scouting functionality emergence
JP2022064532A (ja) 除草装置、自動除草方法及び自動除草プログラム
KR102371433B1 (ko) 인공지능 기반 농업 로봇용 농경지 경작지도 생성 시스템 및 방법
TW202219879A (zh) 植物生長辨識系統
WO2021149355A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
JP5935654B2 (ja) 作物推定方法,作物推定プログラムおよび作物推定装置
US20220406039A1 (en) Method for Treating Plants in a Field
WO2023105112A1 (fr) Robot de désherbage
van der Meer et al. Autonomous volunteer potato control: Farm of the future-voucher report: results activities 2021-2022

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19913309

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020569247

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19913309

Country of ref document: EP

Kind code of ref document: A1