WO2021255940A1 - Drone for diagnosing crop growth, and camera system for same - Google Patents

Drone for diagnosing crop growth, and camera system for same

Info

Publication number
WO2021255940A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
image
flight
information
camera
Prior art date
Application number
PCT/JP2020/024246
Other languages
French (fr)
Japanese (ja)
Inventor
宮崎忠喜
西片丈晴
和氣千大
加藤宏記
Original Assignee
株式会社ナイルワークス
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ナイルワークス
Priority to PCT/JP2020/024246
Priority to JP2022531240A
Publication of WO2021255940A1


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U50/00 Propulsion; Power supply
    • B64U50/10 Propulsion
    • B64U50/19 Propulsion using electrically powered motors
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01G HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
    • A01G7/00 Botany in general
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 Type of UAV
    • B64U10/10 Rotorcrafts
    • B64U10/13 Flying platforms
    • B64U10/14 Flying platforms with four distinct rotor axes, e.g. quadcopters
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U30/00 Means for producing lift; Empennages; Arrangements thereof
    • B64U30/20 Rotors; Rotor supports
    • B64U30/29 Constructional aspects of rotors or rotor supports; Arrangements thereof
    • B64U30/299 Rotor guards

Definitions

  • the present invention relates to a drone for diagnosing the growth of crops and a camera system thereof.
  • As background art in this technical field, there is International Publication No. WO 2017/221641 (Patent Document 1).
  • Patent Document 1 describes a plant growth index measuring device that generates a growth index indicating the degree of growth of plants from images of the field to be measured taken from above, as well as a corresponding method and program. Images are generated at time intervals, and it is determined for each generated image whether to store it in the image storage unit, so that only significant images are stored in order to obtain the growth index; the generated image is stored in the image storage unit only when it is determined that it should be stored (see the summary).
  • Patent Document 1 describes a drone that images a field.
  • In Patent Document 1, the captured image is stored when flight conditions such as the position, altitude, speed, and attitude of the aircraft (including a drone) satisfy preset conditions; however, there was a risk that the suitability of a captured image could not be fully judged from those conditions alone.
  • Even when an image suitable for growth diagnosis of the crops in the field was selected and stored, measures for unsuitable images may not have gone beyond simply excluding them from storage. Further, when an invalid image was extracted, means for re-acquiring the image may not have been sufficiently provided.
  • The present invention therefore provides a drone, and a camera system for it, that appropriately manages preprocessing of the captured image and transmission of the processed data to the outside according to the flight state of the drone, and that, when an invalid image is extracted, has appropriate means for re-acquiring the image.
  • The present application includes a plurality of means for solving the above problems. To give one example, there is provided a drone including a main body, a plurality of rotary blades, a camera, a transmission unit that transmits data generated from images acquired from the camera to the outside, and a storage device. When the flight state of the drone satisfies a predetermined condition, the image is flagged as an invalid image, and at least one of the following is performed: stopping storage of the image in the storage device, stopping preprocessing of the image, stopping transmission of the data to the outside, and deleting the image from the storage device.
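  • The flagging-and-handling logic described above can be sketched as follows. The threshold values, field names, and function names here are illustrative assumptions, not taken from the publication, which only speaks of "a predetermined condition" on the flight state.

```python
from dataclasses import dataclass

@dataclass
class FlightState:
    speed_mps: float   # ground speed
    altitude_m: float  # altitude above ground
    tilt_deg: float    # airframe tilt angle

@dataclass
class CapturedImage:
    image_id: int
    invalid: bool = False  # the "invalid image" flag

# Hypothetical thresholds; the publication does not specify values.
def flag_if_invalid(image, state, max_speed=6.0, alt_range=(1.5, 3.0), max_tilt=10.0):
    """Flag the image invalid when the flight state violates any condition."""
    lo, hi = alt_range
    if (state.speed_mps > max_speed
            or not lo <= state.altitude_m <= hi
            or state.tilt_deg > max_tilt):
        image.invalid = True
    return image.invalid

def handle_image(image, storage):
    """For a flagged image: skip storage/preprocessing/transmission and
    delete it from the storage device if it is already there."""
    if image.invalid:
        storage.pop(image.image_id, None)
        return "discarded"
    storage[image.image_id] = image  # stored, then preprocessed/transmitted
    return "stored"
```

A valid image passes through to storage; an image taken while flying too fast, too high or low, or too tilted is flagged and dropped.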
  • The following is an example of the drone's invalid-image judgment processing flow, in which an invalid image is extracted during the flight of a drone that diagnoses the growth of crops in a field.
  • Drones are an example of agricultural machinery.
  • Here, "drone" refers to any aircraft having a plurality of rotary blades, regardless of the power means (electric power, prime mover, etc.) or the maneuvering method (wireless or wired, autonomous-flight type, manually operated type, etc.).
  • FIG. 1 is an example of a plan view of a drone.
  • FIG. 2 is an example of a front view of the drone.
  • FIG. 3 is an example of a right side view of the drone.
  • FIG. 4 is an example of a rear view of the drone.
  • FIG. 5 is an example of a perspective view of the drone.
  • The rotors 101-1a, 101-1b, 101-2a, 101-2b, 101-3a, 101-3b, 101-4a, and 101-4b are the means for flying the drone 100. Eight rotors (four sets of two-stage rotary blades) are provided in consideration of the balance among flight stability, airframe size, and power consumption.
  • Each rotary blade 101 is arranged on one of the four sides of the main body 110 by an arm protruding from the main body 110 of the drone 100. That is, in the traveling direction, the rotary blades 101-1a and 101-1b are at the left rear, 101-2a and 101-2b at the left front, 101-3a and 101-3b at the right rear, and 101-4a and 101-4b at the right front.
  • Rod-shaped legs 107-1, 107-2, 107-3, and 107-4 extend downward from the rotation axis of the rotary blade 101, respectively.
  • The motors 102-1a, 102-1b, 102-2a, 102-2b, 102-3a, 102-3b, 102-4a, and 102-4b are means for rotating the rotary blades 101-1a, 101-1b, 101-2a, 101-2b, 101-3a, 101-3b, 101-4a, and 101-4b (typically electric motors, but engines or the like may also be used); one motor is provided per rotary blade.
  • the motor 102 is an example of a propulsion device.
  • The upper and lower rotary blades in each set (for example, 101-1a and 101-1b) and their corresponding motors (for example, 102-1a and 102-1b) have their axes on the same straight line and rotate in opposite directions, for the stability of drone flight and other reasons.
  • The radial members supporting the propeller guards, which are provided so that the rotors do not interfere with foreign objects, have an outward-sloping rather than horizontal structure. This is to encourage the members to buckle outward, away from the rotary blades, in the event of a collision, preventing them from interfering with the rotors.
  • The chemical nozzles 103-1, 103-2, and 103-3 are means for spraying the chemical downward (four nozzles are provided).
  • The chemical is a liquid, powder, or fine particles sprayed on a field, such as a pesticide, herbicide, liquid fertilizer, insecticide, seed, or water.
  • The chemical tank 104 is a tank for storing the chemical to be sprayed, and from the viewpoint of weight balance it is provided at a position close to, and lower than, the center of gravity of the drone 100.
  • The chemical hoses 105-1, 105-2, and 105-3 connect the chemical tank 104 to the chemical nozzles 103-1, 103-2, and 103-3.
  • The chemical hoses are made of a hard material and may also serve to support the chemical nozzles.
  • The pump 106 is a means for discharging the chemical from the nozzles.
  • FIG. 6 is an example of a block diagram showing the control function of the drone.
  • the flight controller 501 is a component that controls the entire drone, and may be an embedded computer including a CPU, a memory, related software, and the like.
  • Based on input information received from the mobile terminal 701 and input information obtained from the various sensors described later, the flight controller 501 controls the motors 102-1a, 102-1b, 102-2a, 102-2b, 102-3a, 102-3b, 102-4a, and 102-4b via control means such as ESCs (Electronic Speed Controls) to control the flight of the drone 100.
  • The actual rotation speeds of the motors 102-1a, 102-1b, 102-2a, 102-2b, 102-3a, 102-3b, 102-4a, and 102-4b are fed back to the flight controller 501 so that normal rotation can be monitored.
  • the rotary blade 101 may be provided with an optical sensor or the like so that the rotation of the rotary blade 101 is fed back to the flight controller 501.
  • The software used by the flight controller 501 can be rewritten through a storage medium or the like, or through communication means such as Wi-Fi or USB, for function expansion or change, problem correction, and so on. In this case, protection by encryption, checksums, digital signatures, virus-checking software, and the like is performed so that rewriting by unauthorized software does not occur. A part of the calculation processing used by the flight controller 501 for control may be executed by another computer located on the mobile terminal 701, on the management server 702, or elsewhere. Because of the high importance of the flight controller 501, some or all of its components may be duplicated.
  • The flight controller 501 communicates with the mobile terminal 701 via the Wi-Fi slave unit function 503 and, further, via the base station 710; it receives necessary commands from the mobile terminal 701 and can send necessary information to the mobile terminal 701. In this case, the communication may be encrypted to prevent fraudulent acts such as interception, spoofing, and device hijacking.
  • the base station 710 also has a function of an RTK-GPS base station in addition to a communication function by Wi-Fi. By combining the signal from the RTK base station and the signal from the GPS positioning satellite, the flight controller 501 can measure the absolute position of the drone 100 with an accuracy of about several centimeters.
  • Since the flight controller 501 is highly important, it may be duplicated or multiplexed, and each redundant flight controller 501 may be controlled to use a different satellite in order to cope with the failure of a specific GPS satellite. Communication among the flight controller 501, the base station 710, and the mobile terminal 701 may use a mobile network such as LTE instead of Wi-Fi.
  • The 6-axis gyro sensor 505 measures the acceleration of the drone airframe in three mutually orthogonal directions, and the velocity is calculated by integrating the acceleration.
  • the 6-axis gyro sensor 505 measures the change in the attitude angle of the drone aircraft in the above-mentioned three directions, that is, the angular velocity.
  • the geomagnetic sensor 506 measures the direction of the drone body by measuring the geomagnetism.
  • The barometric pressure sensor 507 measures the barometric pressure and can thereby indirectly measure the altitude of the drone.
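  • The indirect altitude measurement works because pressure falls predictably with height. A standard-atmosphere conversion, not given in the publication but commonly used with such sensors, might look like:

```python
def pressure_to_altitude_m(p_hpa, p0_hpa=1013.25):
    """Convert barometric pressure (hPa) to altitude (m) using the
    international barometric formula (ISA troposphere approximation)."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))
```

In practice the reference pressure `p0_hpa` would be calibrated at the takeoff point rather than fixed at sea level.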
  • the laser sensor 508 measures the distance between the drone body and the ground surface by utilizing the reflection of the laser light, and may be an IR (infrared) laser.
  • Sonar 509 measures the distance between the drone airframe and the ground surface using the reflection of sound waves such as ultrasonic waves. These sensors may be selected according to the drone's cost target and performance requirements. A gyro sensor (angular velocity sensor) for measuring the inclination of the airframe, a wind sensor for measuring wind force, and the like may also be added, and these sensors may be duplicated or multiplexed. When there are multiple sensors for the same purpose, the flight controller 501 may use only one of them and switch to an alternative sensor when it fails; alternatively, multiple sensors may be used simultaneously, and a failure may be assumed when their measurement results do not match.
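  • The cross-check policy described above, using several same-purpose sensors at once and treating disagreement as a failure, can be sketched as a small voter; the tolerance value is an assumption:

```python
def cross_check(readings, tolerance):
    """If all redundant readings agree within `tolerance`, return their
    mean; otherwise signal a suspected sensor failure by returning None,
    so the caller can fall back to an alternative sensor or raise an alarm."""
    if max(readings) - min(readings) > tolerance:
        return None
    return sum(readings) / len(readings)
```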
  • The flow rate sensor 510 measures the flow rate of the chemical and is provided at a plurality of locations on the route from the chemical tank 104 to the chemical nozzles 103.
  • The liquid-depletion sensor 511 is a sensor that detects that the remaining amount of the chemical is equal to or less than a predetermined amount.
  • the multispectral camera 512 is a means of photographing the field 720 and acquiring data for image analysis.
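  • The publication does not say which index is computed from the multispectral data, but a common growth indicator derived from such imagery is NDVI, sketched here per pixel as an illustrative assumption:

```python
def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - Red) / (NIR + Red); higher values indicate denser,
    healthier vegetation. `eps` guards against division by zero."""
    return (nir - red) / (nir + red + eps)

def mean_ndvi(nir_band, red_band):
    """Average NDVI over paired pixel reflectances as a crude
    per-image growth score."""
    values = [ndvi(n, r) for n, r in zip(nir_band, red_band)]
    return sum(values) / len(values)
```

Healthy vegetation reflects strongly in near-infrared and absorbs red, so crop pixels score high while bare soil or water scores near or below zero.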
  • The obstacle detection camera 513 is a camera for detecting obstacles; since its image characteristics and lens orientation differ from those of the multispectral camera 512, it is a device separate from the multispectral camera 512.
  • The switch 514 is a means for the user of the drone 100 to make various settings.
  • The obstacle contact sensor 515 is a sensor for detecting that the drone 100, particularly its rotor or propeller guard portion, has come into contact with an obstacle such as an electric wire, a building, a human body, a standing tree, a bird, or another drone.
  • the obstacle contact sensor 515 may be replaced with a 6-axis gyro sensor 505.
  • the cover sensor 516 is a sensor that detects that the operation panel of the drone 100 and the cover for internal maintenance are in an open state.
  • The chemical filling port sensor 517 is a sensor that detects that the filling port of the chemical tank 104 is open. These sensors may be selected according to the drone's cost target and performance requirements, and may be duplicated or multiplexed. Further, a sensor may be provided at the base station 710, the mobile terminal 701, or another place outside the drone 100, and the information it reads may be transmitted to the drone 100. For example, a wind sensor may be provided at the base station 710, and information on the wind force and direction may be transmitted to the drone 100 via Wi-Fi communication.
  • The flight controller 501 transmits control signals to the pump 106 to adjust the chemical discharge amount or to stop the discharge.
  • the current state of the pump 106 (for example, the number of revolutions) is fed back to the flight controller 501.
  • the LED 107 is a display means for notifying the drone operator of the state of the drone.
  • Display means such as a liquid crystal display may be used in place of or in addition to the LED.
  • the buzzer 518 is an output means for notifying the state of the drone (particularly the error state) by an audio signal.
  • The Wi-Fi slave unit function 519 is an optional component for communicating with, for example, an external computer for transferring software, in addition to the mobile terminal 701. Instead of or in addition to the Wi-Fi slave unit function, other wireless communication means such as infrared communication, Bluetooth®, ZigBee®, or NFC, or wired communication means such as a USB connection, may be used. Further, communication among the flight controller 501, the mobile terminal 701, and the base station 710 may be performed by a mobile communication system such as 3G, 4G, or LTE instead of the Wi-Fi slave unit function.
  • The speaker 520 is an output means for notifying the state of the drone (particularly an error state) by recorded human voice, synthetic voice, or the like. Depending on weather conditions, the visual displays of the drone 100 in flight may be difficult to see; in such cases it is effective to convey the situation by voice.
  • the warning light 521 is a display means such as a strobe light for notifying the state of the drone (particularly the error state).
  • FIG. 7 is an example of a connection configuration diagram of the entire drone management system 700.
  • the drone management system 700 includes a drone 100, a mobile terminal 701, a management terminal 703, and a base station 710, each of which is connected to the management server 702 via a network.
  • the network may be wired or wireless, and each terminal can send and receive information via the network.
  • The drone 100 and the mobile terminal 701 can communicate with each other in the field 720 via the base station 710, and the drone 100 performs a chemical-spraying flight.
  • the network may be a network that communicates according to one communication standard, or may be a network that is a combination of a plurality of communication standard networks.
  • the drone 100 and the mobile terminal 701 may be network-connected by Wi-Fi provided by the base station 710, respectively, or the drone 100 and the mobile terminal 701 may be network-connected by a mobile communication network such as LTE, respectively.
  • the drone 100 may be connected by Wi-Fi provided by the base station 710, and the base station 710 and the mobile terminal 701 may be connected by a mobile communication network.
  • The mobile terminal 701 sends commands to the drone 100 in response to user operations and displays information received from the drone 100 (for example, position, chemical amount, remaining battery level, and camera image). It is realized by, for example, a mobile information device such as a tablet terminal or a smartphone.
  • The drone 100 performs autonomous flight according to instructions from the management server 702, but manual operation from the mobile terminal 701 is possible for basic operations such as takeoff and return, and in an emergency.
  • the mobile terminal 701 is connected to the base station 710, and can communicate with the management terminal 703 via the base station 710 or directly.
  • The management server 702 is, for example, a server arranged on the cloud; it calculates the spraying flight route of the drone 100 based on the field management information 1300 and controls the autonomous flight of the drone 100. It can also collect information acquired from the camera and various sensors mounted on the drone 100 and perform various analyses, such as of the state of the fields and crops.
  • the management terminal 703 is a terminal that operates the management server 702, and makes various settings for the management server 702. It is also possible to control the drone 100 and the mobile terminal 701.
  • The base station 710 is a device installed at the field 720 that provides the master unit function for Wi-Fi communication; it also functions as an RTK-GPS base station, so that the accurate position of the drone 100 can be obtained. (The Wi-Fi master unit function and the RTK-GPS base station may be independent devices.)
  • the base station 710 can communicate with the management server 702 using a mobile communication network such as 3G, 4G, and LTE.
  • Each terminal and the management server 702 of the drone management system 700 may be, for example, a mobile terminal such as a smartphone, tablet, mobile phone, or personal digital assistant (PDA), or a wearable terminal such as a glasses type, wristwatch type, or clothing type. It may also be a stationary or portable computer, or a server located in the cloud or on a network. The function may also be provided by a VR (Virtual Reality), AR (Augmented Reality), or MR (Mixed Reality) terminal. Alternatively, it may be a combination of several of these terminals; for example, a combination of one smartphone and one wearable terminal can logically function as a single terminal. It may also be an information processing terminal other than these.
  • Each terminal and the management server 702 of the drone management system 700 comprises a processor (control unit) that executes an operating system, applications, programs, and the like; a main storage device such as a RAM (Random Access Memory); an auxiliary storage device such as an IC card, hard disk drive, SSD (Solid State Drive), or flash memory; a communication control unit such as a network card, wireless communication module, or mobile communication module; an input device such as a touch panel, keyboard, mouse, voice input, or motion-detection input using the camera unit; and an output device such as a monitor or display.
  • the output device may be a device or a terminal for transmitting information for output to an external monitor, display, printer, device, or the like.
  • Each module is stored in the main storage device, and each functional element of the entire system is realized by the processor executing these programs and applications.
  • each of these modules may be implemented by hardware by integrating them.
  • each module may be an independent program or application, but may be implemented in the form of a part of a subprogram or a function in one integrated program or application.
  • In the following, each module is described as the subject that performs processing, but in reality the processor executing the various programs and applications (modules) performs the processing.
  • a "database” is a functional element (storage unit) that stores a data set so that it can handle arbitrary data operations (for example, extraction, addition, deletion, overwriting, etc.) from a processor or an external computer.
  • the method of implementing the database is not limited, and may be, for example, a database management system, spreadsheet software, or a text file such as XML or JSON.
  • The mobile terminal 701 and the management server 702 may each be referred to as an information processing device.
  • FIG. 8 is an example of the field information display screen 800 displayed on the mobile terminal 701.
  • the screen display module 1011 of the mobile terminal 701 acquires the map information 1200 and the field management information 1300 stored in the mobile terminal 701, generates the field information display screen 800, and outputs the field information display screen 800 to the output device 1005 such as a screen.
  • The screen display module 1011 may be configured to acquire the map information 1200 and the field management information 1300 stored in the management server 702 via the network to generate the field information display screen 800.
  • A map 801 is displayed as the background of the field information display screen 800, and anchors 805 are displayed to indicate that information is registered for the fields 802, 803, and 804 whose field information is stored in the field management information 1300.
  • A field is a rice paddy, a field, or the like that is the target of chemical spraying by the drone 100. In reality, the terrain of a field is complicated: a topographic map may not be available in advance, or the topographic map may be inconsistent with the situation on site. Fields are usually adjacent to houses, hospitals, schools, other crop fields, roads, railroads, and the like, and there may be obstacles such as buildings and electric wires in the field.
  • the field is an example of a target area for chemical spraying.
  • When the screen display module 1011 receives the selection of the field 802 from the user via the input device 1004, by a tap on the screen or the like, it acquires the information corresponding to the field 802 from the field management information 1300 and displays it in the field information display area 810. The screen display module 1011 also displays a highlight indicating that the field 802 is selected, for example by changing the outline of the selected field 802 to a thick line in a bright color.
  • In the field information display area 810, information acquired from the field management information 1300 is displayed, such as the field name 811, the address 812, the area 813, and the planted crop name 814.
  • Information related to the spraying of the chemical is displayed in the spraying information display area 820.
  • The chemical to be sprayed changes depending on the crop name 814 and the spraying season, and information on the chemical to be sprayed in the near future is acquired from the chemical management information 1600 and displayed.
  • In the spraying information display area 820, information related to chemical spraying acquired or calculated by the spraying-related information management module 1114 of the management server 702 is displayed, for example the chemical name, the spraying amount, the dilution amount, and the amount of energy required for the spraying flight in the field.
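  • The relationship among the spraying amount, dilution amount, and chemical amount shown here can be illustrated with hypothetical formulas; the application rate and dilution factor are assumed inputs, not values from the publication:

```python
def spraying_plan(area_ha, rate_l_per_ha, dilution_factor):
    """Illustrative quantities for one spraying flight: total diluted
    spray volume, undiluted chemical volume, and dilution water."""
    total_l = area_ha * rate_l_per_ha       # diluted spray to apply
    chemical_l = total_l / dilution_factor  # undiluted chemical needed
    water_l = total_l - chemical_l          # water needed for dilution
    return {"total_l": total_l, "chemical_l": chemical_l, "water_l": water_l}
```

For instance, a 2 ha field at 10 L/ha with a 100x dilution needs 20 L of spray, made from 0.2 L of chemical and 19.8 L of water.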
  • the flight status display field 850 displays the current status of the drone's spray flight.
  • Compass 861 indicates the orientation displayed by map 801.
  • the screen display module 1011 changes the scale of the display so that the selected field fills the screen.
  • the screen display module 1011 changes the display so that the current location acquired by the GPS of the mobile terminal 701 becomes the center of the screen.
  • When the schedule display button 870 is pressed, the screen display module 1011 displays the chemical-spraying schedule for the day.
  • FIG. 9 is an example of the drone operation screen 900 displayed on the mobile terminal 701.
  • the drone battery display 901 displays the current remaining battery level of the drone.
  • the current position information of the drone 100 is displayed.
  • The spray flight progress information 912 displays the progress of the current spraying flight, for example the progress along the spraying flight route, the remaining amount of the chemical, and the remaining battery level.
  • the flight status display field 921 the current status of the spray flight of the drone 100 is displayed.
  • the message display field 922 a message indicating the communication content with the drone 100, the flight status, and the like is displayed.
  • The altitude change buttons 923 and 924 are buttons for changing the flight altitude of the drone 100; pressing the minus button lowers the altitude, and pressing the plus button raises it.
  • The emergency stop button 925 is a button for urgently stopping the drone 100 in flight; in addition to a temporary stop (hovering on the spot), options such as returning to the flight start point or urgently stopping the motors on the spot can also be displayed.
  • the field 930 to be sprayed with the chemical is displayed on the map, and the flight route 931 of the spraying flight on the field 930 is displayed.
  • the drone 100 sequentially flies at the designated flight coordinates according to the flight route management information 1800 stored in the mobile terminal 701 or the management server 702.
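  • One common way to generate such designated flight coordinates for a rectangular field is a back-and-forth (boustrophedon) pattern. The publication does not specify the route algorithm, so this sketch is an assumption:

```python
import math

def lawnmower_route(width_m, length_m, swath_m):
    """Waypoints for a back-and-forth coverage flight: one pass per
    spray swath across the width, alternating direction along the length."""
    passes = math.ceil(width_m / swath_m)
    route = []
    for i in range(passes):
        # center of the i-th swath, clamped inside the field
        x = min(i * swath_m + swath_m / 2, width_m - swath_m / 2)
        ends = [(x, 0.0), (x, length_m)]
        if i % 2 == 1:
            ends.reverse()  # reverse direction on odd passes
        route.extend(ends)
    return route
```

The drone would then fly these (x, y) waypoints in order; a real route would also account for field shape, obstacles, and turn radius.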
  • Upon receiving operations that act on the drone 100, such as the altitude change buttons 923 and 924 and the emergency stop button 925, the drone operation module 1012 transmits information such as commands corresponding to these operations to the drone 100, so that the drone 100 can be operated.
  • The next spraying schedule display button 940 is a button for displaying the schedule of the spraying flight that follows the one currently being executed. When this button is pressed, information about the next spraying flight acquired from the schedule management information 1900 is displayed.
  • FIG. 10 is an example of the hardware configuration of the mobile terminal 701.
  • the mobile terminal 701 is, for example, a terminal such as a tablet, a smartphone, or a head-mounted display.
  • Programs and applications such as the screen display module 1011, the drone operation module 1012, and the schedule management module 1013 are stored in the main storage device 1001, and each functional element of the mobile terminal 701 is realized by the processor 1003 executing these programs and applications.
  • the screen display module 1011 displays the field information display screen 800 and the drone operation screen 900 on an output device 1005 such as a display panel.
  • When the drone operation module 1012 receives user operations such as the altitude change buttons 923 and 924 and the emergency stop button 925, it transmits information such as commands corresponding to these operations to the drone 100 and operates the drone's flight.
  • the schedule management module 1013 manages the schedule of each spray flight when the spray flights are continuously performed in a plurality of fields.
  • The auxiliary storage device 1002 stores various information such as the map information 1200, field management information 1300, device management information 1400, user management information 1500, chemical management information 1600, energy management information 1700, flight route management information 1800, and schedule management information 1900.
  • FIG. 11 is an example of the hardware configuration of the management server 702.
  • the management server 702 is composed of, for example, a server arranged on the cloud.
  • the main storage device 1101 stores a screen output module 1111, a flight management module 1112, a user / equipment management module 1113, a spray-related information management module 1114, a flight route management module 1115, and a schedule management module 1116.
  • Each functional element of the management server 702 is realized by the processor 1103 executing these programs and applications.
  • the screen output module 1111 extracts and generates information for displaying the field information display screen 800 and the drone operation screen 900, and transmits the information to the mobile terminal 701.
  • the screen information itself may be generated and displayed on the mobile terminal 701 or the like.
  • the flight management module 1112 manages the spray flight of the drone 100 based on the information such as the field management information 1300 and the flight route management information 1800.
  • the user / device management module 1113 registers and manages information about a user who uses the drone 100 in the user management information 1500.
  • The spraying-related information management module 1114 manages the amount of chemical required for the spraying flight, the dilution amount, the amount of water required for dilution, the amount of energy such as the number of batteries, and the like.
  • the flight path management module 1115 calculates the flight path of the spray flight of the drone 100 based on the field management information 1300.
  • the schedule management module 1116 generates and manages a schedule of spray flights over a plurality of fields and a plurality of days. The generated drug spraying schedule is stored in the schedule management information 1900.
  • the auxiliary storage device 1102 stores various information such as map information 1200, field management information 1300, device management information 1400, user management information 1500, drug management information 1600, energy management information 1700, flight route management information 1800, and schedule management information 1900.
  • the map information 1200, field management information 1300, device management information 1400, user management information 1500, drug management information 1600, energy management information 1700, flight route management information 1800, and schedule management information 1900 may be stored in both the mobile terminal 701 and the management server 702. In that case, the information held by the mobile terminal 701 and the management server 702 may be synchronized with each other, or either information may simply be copied.
  • some or all of the information may be stored in the management server 702, and the information may be downloaded from the management server 702 from the mobile terminal 701 as needed.
  • FIG. 12 is an example of the hardware configuration of the management terminal 703.
  • the management terminal 703 is, for example, a terminal such as a desktop PC, a notebook PC, or a tablet.
  • Programs and applications such as the drone setting module 1211 and the management server setting module 1212 are stored in the main storage device 1201, and each functional element of the management terminal 703 is realized by the processor 1203 executing these programs and applications.
  • the drone setting module 1211 performs various operations and settings such as spray flight setting and initial setting of the drone 100.
  • the management server setting module 1212 makes various settings such as initial settings of the management server 702.
  • the auxiliary storage device 1202 stores various information such as drone setting information 1221 and management server setting information 1222.
  • FIG. 13 is an example of field management information 1300.
  • the field management information 1300 stores various information about the field to which the chemicals are sprayed, and stores information such as the field ID, the field name, the field position, the field peripheral coordinates, the field area, and the crops cultivated.
  • the field management information 1300 may be simply referred to as field information.
  • the field ID is identification information that uniquely identifies the field.
  • the field position 1311 indicates the position coordinates of the field, and has, for example, information on the latitude and longitude of the center of the field.
  • the field peripheral coordinates 1312 indicate the coordinates around the field, and are, for example, the position coordinates of the four corners in the case of a quadrangular field.
  • the sample value GC007 indicates a link to information in which the position coordinates are continuously stored separated by commas or the like.
  • the field area 1313 is the total area of the field corresponding to the field ID.
  • the cultivated crop 1314 stores information for identifying the crop or the like cultivated in the field.
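  • The field record of FIG. 13 can be sketched as a simple data structure, and the field area 1313 can be derived from the field peripheral coordinates 1312 with the shoelace formula. This is an illustrative sketch only: the class and function names are hypothetical, and it assumes the peripheral coordinates have already been projected to a planar metric system (meters).

```python
from dataclasses import dataclass

@dataclass
class FieldRecord:
    """Hypothetical mirror of the field management information 1300."""
    field_id: str        # e.g. "farm003"
    name: str
    position: tuple      # (latitude, longitude) of the field center
    perimeter: list      # [(x, y), ...] peripheral coordinates, in meters
    crop: str

def shoelace_area(perimeter):
    """Planar polygon area from peripheral coordinates (shoelace formula)."""
    n = len(perimeter)
    s = 0.0
    for i in range(n):
        x1, y1 = perimeter[i]
        x2, y2 = perimeter[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# A 100 m x 100 m square field has an area of 10,000 m^2 (1 ha).
square = [(0, 0), (100, 0), (100, 100), (0, 100)]
print(shoelace_area(square))
```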
  • FIG. 14 is an example of the device management information 1400.
  • the device management information 1400 stores information for managing the drone 100, and stores information such as a device ID, a device name, a model number, specifications, a user, energy, and flight time.
  • the device ID is identification information that uniquely identifies the drone 100.
  • the user is information on the user who is currently using the drone 100, and stores the user ID of the user management information 1500.
  • the energy 1411 is information about energy that can be mounted on the drone 100, and stores the energy ID of the energy management information 1700.
  • the flightable time 1412 indicates how long the drone 100 can fly on the energy it can carry. For example, information such as being able to fly for 15 minutes on a set of two batteries is stored.
  • FIG. 15 is an example of user management information 1500.
  • the user management information 1500 stores information on the user who operates the drone 100, and stores information such as a user ID, a user display ID, a name, an e-mail address, a date of birth, and a gender.
  • the user ID is identification information that uniquely identifies the user.
  • the user display ID is user information displayed on the mobile terminal 701 or the like, and is, for example, a nickname registered by the user.
  • FIG. 16 is an example of drug management information 1600.
  • the drug management information 1600 stores information on the drug to be sprayed, and stores the drug ID, drug name, product number, specifications, dilution rate, spray amount, and the like.
  • the drug ID is identification information that uniquely identifies the drug.
  • the drug name 1602 indicates the name of a product such as a liquid, powder or fine particle to be sprayed in a field such as a pesticide, a herbicide, a liquid fertilizer, an insecticide, or a seed.
  • the specification 1603 stores information such as a method of using the drug, a method of diluting the drug, a target crop, and a method of spraying, and the drug is diluted or sprayed according to the contents described in the specification 1603.
  • the dilution ratio 1604 stores the ratio of diluting the drug, for example, the ratio of the drug to water, the amount of the drug and water used for dilution, and the like.
  • the spraying amount 1605 stores the amount of the diluted drug (spray solution) to be sprayed. For example, it indicates that 10 L of spray solution is sprayed per hectare.
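  • The relationship among the dilution ratio 1604, the spraying amount 1605, and the field area can be illustrated with a short calculation. This is a hypothetical sketch: only the 10 L/ha spraying rate comes from the example above, while the 1:1000 dilution ratio and the function name are illustrative assumptions.

```python
def spray_requirements(area_ha, rate_l_per_ha=10.0, dilution_ratio=1000):
    """Total spray solution, undiluted chemical, and water for one field.

    rate_l_per_ha follows the 10 L/ha example in the text;
    dilution_ratio (1 part chemical : N parts solution) is a
    hypothetical sample value.
    """
    solution_l = area_ha * rate_l_per_ha      # diluted spray solution needed
    chemical_l = solution_l / dilution_ratio  # undiluted chemical portion
    water_l = solution_l - chemical_l         # water required for dilution
    return solution_l, chemical_l, water_l

# A 2.5 ha field at 10 L/ha needs 25 L of solution in total.
print(spray_requirements(2.5))
```

A schedule-level total (spraying-related information 1902) could then be the sum of this calculation over all fields in the schedule.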
  • FIG. 17 is an example of energy management information 1700.
  • the energy management information 1700 stores information on energy such as a battery required for the flight of the drone 100, and stores information such as an energy ID, an energy name, a model number, a type, and specifications.
  • the energy ID is identification information that uniquely identifies the energy.
  • the type indicates the type of energy, and for example, a battery, gasoline, jet fuel, or the like is stored.
  • FIG. 18 is an example of flight path management information 1800.
  • the flight route management information 1800 stores information indicating the flight route of the drone 100, and stores the route ID, the target ID, the route coordinates, the total route distance, and the like.
  • the route ID is identification information that uniquely identifies the flight route.
  • the target ID is information for specifying the field for which the flight route is calculated, the movement route between fields, and the like. For example, farm003 indicates that the target is a field, and route002 indicates that the target is a movement route outside the fields.
  • the route coordinate 1811 is a link to information indicating the route coordinate of the flight, and the route coordinate of the flight is represented by, for example, a combination of a plurality of continuous position coordinates.
  • the total route distance 1812 indicates the total distance of the route when the entire flight route is flown from start to finish.
  • FIG. 19 is an example of schedule management information 1900.
  • the schedule management information 1900 is information that defines a schedule when a plurality of fields are sprayed, and stores information such as a schedule ID, a schedule name, a date and time, a start place, and a schedule.
  • the schedule 1901 stores information for specifying the fields where the spray flight is performed, the movement routes between the fields, and the like. For example, in the sample value, the drone flies the two fields specified by farm006 and farm005, then flies the movement route indicated by route001 and the field specified by farm003; after the event specified by other001 (for example, lunch time) has passed, it flies the field specified by farm002.
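  • The sample schedule above mixes fields (farm...), movement routes (route...), and other events (other...), so a scheduler must first classify each entry by its ID prefix. A minimal sketch, assuming only the prefix convention visible in the sample values; the function name is hypothetical.

```python
def classify(item_id):
    """Map a schedule entry to its kind by ID prefix, as in the sample values."""
    for prefix, kind in (("farm", "field"), ("route", "move"), ("other", "event")):
        if item_id.startswith(prefix):
            return kind
    return "unknown"

# The sample schedule from the text, classified entry by entry.
schedule = ["farm006", "farm005", "route001", "farm003", "other001", "farm002"]
print([(item, classify(item)) for item in schedule])
```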
  • the spraying-related information 1902 stores the total drug spraying amount, dilution amount, energy amount, etc. of the entire schedule.
  • the amount of chemicals sprayed, the amount of dilution, the amount of energy, etc. for each field may be stored.
  • the method for defining the schedule is an example, and other schedule management methods may be used.
  • a drone 100 equipped with a camera is used to diagnose the growth of crops in the field.
  • the wavelength range in which chlorophyll can be used for photosynthesis is approximately 400 to 700 nm.
  • the contribution to photosynthesis differs depending on the wavelength range: in the wavelength absorption spectrum (distribution) of chlorophyll, absorption is relatively high in the short wavelength range (blue) and the long wavelength range (red), and relatively low in the medium wavelength range (green).
  • in crops where photosynthesis is active, chlorophyll absorption is high, so the reflectance in the long wavelength (red) range is low. Conversely, in crops where photosynthesis is inactive, chlorophyll absorption is low, so the reflectance in the long wavelength range is high. Therefore, the growth of a crop can be diagnosed by detecting, from an image of the crop in the field, the amount of reflected light at each wavelength and the ratios between them.
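  • As one common illustration of such a per-wavelength reflectance ratio (an illustrative example, not a method claimed by this document), the normalized difference vegetation index (NDVI) compares near-infrared and red reflectance: active photosynthesis absorbs red light strongly, pushing the index toward 1. A minimal per-pixel sketch with hypothetical reflectance values.

```python
def ndvi(nir, red):
    """Normalized difference vegetation index from per-pixel reflectances.

    High chlorophyll absorption -> low red reflectance -> index near 1.
    """
    denom = nir + red
    return 0.0 if denom == 0 else (nir - red) / denom

# Hypothetical reflectances: strong red absorption vs. weak absorption.
print(round(ndvi(0.50, 0.05), 3))  # active photosynthesis, index near 1
print(round(ndvi(0.40, 0.30), 3))  # inactive photosynthesis, index near 0
```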
  • FIG. 20A is an example of a conceptual diagram in which a crop growth diagnosis is performed based on the amount of reflected light (green, red, etc.) for each wavelength from the crop by the camera 512 of the flying drone 100.
  • the drone 100 photographs crops on the ground from a position at a predetermined altitude with a camera 512 oriented at a predetermined angle (not limited to directly below). Since the shooting range of the camera that shoots at one time is limited, it is configured to shoot all the crops in the field by continuously shooting the fields of view B1 and B2 in a predetermined range.
  • in region A1, an image of two crops 11 and 12 having relatively high reflectance in the long wavelength range can be taken, while in the adjacent region A2, an image of two different crops 13 and 14 having relatively low reflectance in the long wavelength range can be taken.
  • the captured images of the regions A1 and A2 may be combined in post-processing, enabling growth diagnosis of the crops in the entire field.
  • the crop growth diagnosis according to the embodiment of the present invention is not limited to the above contents, and the number of cameras 512 used is not limited to one.
  • the stems and leaves of the crop, its fruit, and the size and/or number of paddy ears, etc. may also be targeted.
  • two-dimensional or three-dimensional information may be acquired based on the images of the crops in the field, and growth diagnosis may be performed on the growth height, growth range, etc. of the crops.
  • growth diagnosis may also include determining whether or not the crop is diseased.
  • the preprocessing carried out on the drone 100 in the growth diagnosis of the crop may include, for example:
-Processing to label the captured image with shooting information,
-Processing to detect the intensity of the light reflected from the crop for each wavelength based on the captured image,
-Processing to determine the degree of growth of the crop based on the captured image,
-Processing to associate the acquired position information with the growth degree of the crop.
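  • The preprocessing steps above can be sketched as one pass over a captured image. Everything here is an illustrative assumption: the record layout, the per-channel mean standing in for "intensity per wavelength", and the toy green-over-red growth rule are not taken from this document.

```python
def preprocess(image_pixels, position, captured_at):
    """Sketch of the onboard preprocessing steps (hypothetical API)."""
    # 1) label the captured image with shooting information
    record = {"position": position, "time": captured_at}
    # 2) detect reflected-light intensity per wavelength (mean per RGB channel)
    n = len(image_pixels)
    record["mean_rgb"] = tuple(sum(p[c] for p in image_pixels) / n for c in range(3))
    # 3) determine a growth degree (toy rule: green dominance over red)
    r, g, b = record["mean_rgb"]
    record["growth"] = "active" if g > r else "inactive"
    # 4) the acquired position information is associated via the same record
    return record

pixels = [(30, 120, 40), (35, 110, 45)]  # two hypothetical RGB pixels
print(preprocess(pixels, (35.68, 139.76), "2020-06-18T10:00"))
```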
  • FIG. 20B shows a crop growth diagnosis using the camera 512 of the flying drone 100 based on the crop growth height (Z-axis direction) and / or the crop growth range (X-axis, Y-axis direction).
  • the drone 100 photographs crops on the ground from a position at a predetermined altitude with a camera 512 at a predetermined angle (not limited to directly below). Since the shooting range of the camera that shoots at one time is limited, it is configured to shoot all the crops in the field by continuously shooting the fields of view B3 and B4 in a predetermined range.
  • in region A3, crops 15 and 16 having a relatively low height Z1 but a relatively wide spread L1 can be photographed, while in the adjacent region A4, a crop 17 whose height Z2 is relatively high but whose spread L2 is relatively narrow can be photographed.
  • the captured images of the regions A3 and A4 may be combined in post-processing, enabling growth diagnosis of the crops in the entire field.
  • the crop growth diagnosis is not limited to the contents exemplified in FIGS. 20A and 20B.
  • any kind of camera suitable for the growth diagnosis of various crops can be used.
  • the camera 512 not only a digital camera for still image shooting but also a video camera for moving image shooting or a camera for both moving image and still image shooting can be used.
  • as the camera 512, a panoramic camera that captures an extremely wide width (angle of view) in one frame, a stereo camera that can record information in the depth direction by simultaneously photographing an object from a plurality of different directions, an infrared camera, an ultraviolet camera, an X-ray camera, or the like can also be used alone or in combination.
  • a crop in a field is photographed using a digital camera for taking one or more images.
  • the following operations may be included:
-Camera photography of the field crops,
-Flagging captured images as invalid or valid,
-Storage (saving) of captured images,
-Preprocessing of captured images,
-Flagging preprocessed data as invalid or valid,
-Storage of preprocessed data,
-Transmission of preprocessed data to the outside, etc.
  • the flight state of the drone 100 becomes a particular problem. For example, if the shooting angle or shooting distance of the drone 100 during flight is poor, or if the drone 100 flies in a time zone or weather conditions unsuitable for growth diagnosis, the quality of the captured images deteriorates, and it may become difficult to perform a growth diagnosis with the required accuracy. Therefore, when the growth diagnosis of crops in a field is performed using the drone 100, it is preferable to distinguish the flight state of the drone 100 at the time each image is captured according to a predetermined classification, and to extract and use only appropriate images taken while the flight state is good.
  • the camera mounted on the drone 100 has the problem that the data volume at the time of shooting is relatively large. The burden on the storage medium that stores captured images, on the control unit that preprocesses them, and on the transmission unit that transmits preprocessed data to the outside is therefore relatively high. When an image unsuitable for growth diagnosis is stored, preprocessed, or transmitted, an unnecessary burden is placed on the storage medium and the like, resulting in waste. Therefore, when the growth diagnosis of crops in a field is performed using the drone 100, it is preferable to automatically exclude images inappropriate for growth diagnosis from storage and the other processing.
  • Preprocessing refers to various image processing related to crop growth diagnosis. For example, it may include detection of the proportion and amount of light, such as green or red, reflected by a plant. Alternatively, it may include analysis of the plant distribution in two or three dimensions. Preferably, a predetermined image analysis is performed on the captured image for each pixel or for each group of pixels.
  • the preprocessing may include a process of labeling the image with shooting information, that is, a process of associating the shooting information. This shooting information may include, for example, the situation of the field at the time of shooting, the time, the temperature, the humidity, the weather, the situation of the camera at the time of shooting, the flight situation of the drone, the situation of the shot image, and the like.
  • Regarding image storage control, it is conceivable to store captured images when flight conditions such as the position, altitude, speed, and attitude of the aircraft (including the drone) satisfy preset conditions. However, whether a captured image should be stored cannot be sufficiently determined from these conditions alone, and conventionally no measure beyond excluding such images from storage has been taken.
  • the drone 100 can include a multispectral camera 512.
  • the multispectral camera 512 includes a lens and an image pickup unit, and can photograph crops in a field from a predetermined height.
  • the drone 100 includes a flight controller 501 that controls the entire drone, and the output from the multispectral camera 512 (FIG. 6) can be transmitted to the flight controller 501.
  • the flight controller 501 functions as a processor (processing device) that controls the flight of the drone, and can also preprocess the image acquired by the multispectral camera 512.
  • the preprocessed data and the like can be stored on the drone 100 and transmitted to an external management server 702, a mobile terminal 701, and the like.
  • FIG. 21 shows an example in which a dedicated camera system 20 is externally mounted on the drone 100.
  • the camera system 20 has a camera 21 with a lens and an image pickup unit.
  • the camera system 20 has a camera controller (camera control unit) 22 that performs overall control of the camera 21, and can include a camera capture 23 that gives shooting instructions to the camera.
  • the camera system 20 has an image processing unit (image processor) 24 that preprocesses the image taken by the camera.
  • the camera system 20 has a storage device 25 (a non-volatile memory card such as an SD card, an SSD (Solid State Drive), a hard disk, or the like) and can store captured images or preprocessed image data.
  • the camera system 20 illustrated in FIG. 21 further has a transmission device 26, and can transmit raw data of captured images, data of preprocessed images, and the like to an external management server or the like.
  • the data may also be transmitted to a cloud device 28 such as the management server 702 by using application software 27 or the like that synchronizes files and directories between remote locations.
  • the camera system 20 includes a connection portion for connecting to the drone 100 including the main body and a plurality of rotary wings. The camera system 20 may be connected at the same time as the drone 100 is manufactured, or may be retrofitted to the drone 100.
  • the camera system 20 illustrated in FIG. 21 can be further connected to a flight controller 501 (see FIG. 6). Therefore, the camera system 20 can transmit data such as captured images to the flight controller 501. Further, the camera system 20 can receive information such as the position, speed, attitude, time, flight status (mission status), and altitude above ground level of the drone 100 from the flight controller 501.
  • the camera of the drone 100 used for the growth diagnosis of the crop in the field both the multispectral camera 512 exemplified in FIG. 6 and the camera system 20 exemplified in FIG. 21 may be included.
  • the latter includes (1) a camera, (2) a control unit that preprocesses images acquired from the camera, (3) a transmission unit that transmits preprocessed data to the outside, and (4) a connection portion for connecting to a drone comprising a main body and a plurality of rotary wings; these are housed in one housing and attached to the drone 100.
  • the flight state of the drone 100 may be transmitted from the flight controller 501 of the drone to the camera control unit 22 via the connection unit; alternatively, the camera system 20 itself, or an external module separate from the flight controller 501, may have a plurality of sensors and be configured to determine the flight state of the drone 100. Further, the flight state information may be received from the management server 702 or the mobile terminal 701 via the transmission device 26 included in the camera system 20.
  • various data regarding the drone 100 are sent to the flight controller 501.
  • the actual rotation speeds of the motors 102-1a, 102-1b, 102-2a, 102-2b, 102-3a, 102-3b, 104-a, and 104-b are fed back to the flight controller 501 (see FIG. 6).
  • the flight controller 501 can measure the absolute position of the drone 100 with an accuracy of about several centimeters by combining the signal of the RTK base station and the signal from the GPS positioning satellite (504-1 in FIG. 6). See 504-2 and 504-3).
  • the flight controller 501 can calculate the acceleration and velocity of the drone aircraft in three mutually orthogonal directions with the 6-axis gyro sensor 505, and can measure the change in the attitude angle of the drone aircraft in the three directions, that is, the angular velocity (see FIG. 6). Further, the flight controller 501 can measure the heading of the drone aircraft by measuring the geomagnetism with the geomagnetic sensor 506 (see FIG. 6). In addition, the barometric pressure sensor 507 can measure the barometric pressure and thereby indirectly measure the altitude of the drone (see FIG. 6). Further, the flight controller 501 can measure the distance between the drone aircraft and the ground surface by using the laser sensor 508 and the sonar 509, the latter relying on the reflection of sound waves such as ultrasonic waves (see FIG. 6).
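  • The indirect altitude measurement by the barometric pressure sensor 507 can be illustrated with the international barometric formula, which converts a pressure reading into an approximate altitude above the standard-pressure level. A sketch under the standard-atmosphere assumption (1013.25 hPa at sea level); the function name is hypothetical.

```python
def pressure_altitude_m(p_hpa, p0_hpa=1013.25):
    """Indirect altitude in meters from barometric pressure.

    International barometric formula, assuming a standard atmosphere
    with reference pressure p0_hpa at the zero-altitude level.
    """
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

print(round(pressure_altitude_m(1013.25), 1))  # 0.0 at reference pressure
print(round(pressure_altitude_m(900.0), 1))    # roughly 1 km up
```

In practice the reference pressure would be calibrated at takeoff, and the result cross-checked against the laser sensor 508 and sonar 509 for altitude above ground level.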
  • the control unit, that is, the flight controller 501 (FIG. 6) or the camera controller 22 of the camera system 20 (FIG. 21), can determine whether or not the drone 100 is in a predetermined flight state.
  • the drone 100 is wirelessly connected to an external mobile terminal 701 or management server 702 (see FIG. 7), and flies to flight coordinates specified in order by referring to the flight route management information 1800 stored in the mobile terminal 701 or the management server 702 (see FIGS. 10, 11, and 18). At this time, the flight management module 1112 of the management server 702 manages the spray flight of the drone 100 based on information such as the field management information 1300 and the flight route management information 1800 (see FIG. 11).
  • the flight path management module 1115 of the management server 702 calculates the flight path of the spray flight of the drone 100 based on the field management information 1300 (see FIG. 11).
  • the field management information 1300 stores various information about the field to which the chemicals are sprayed, and stores information such as the field ID, the field name, the field position, the coordinates around the field, the field area, and the crops planted.
  • the flight route management information 1800 stores information indicating the flight route of the drone 100, and stores the route ID, the target ID, the route coordinates, the total route distance, and the like (see FIG. 18). Therefore, according to the instruction from the management server 702, the drone 100 can perform autonomous flight, and the flight controller 501 can determine the flight state based on the signal.
  • apart from instructions from the management server 702, the mobile terminal 701 (see FIG. 7) allows the drone 100 to perform basic operations such as takeoff and temporary return. Further, in an emergency, the drone 100 can be operated manually from the mobile terminal 701. For example, a hovering pause of the drone 100 can be instructed from the emergency stop button 925 (FIG. 9) on the drone operation screen 900 displayed on the mobile terminal 701. The same applies to the option of having the drone 100 return to the flight start point (temporary return, etc.) and the option of stopping the motors on the spot (landing, etc.).
  • the drone operation module 1012 transmits information such as commands corresponding to these operations to the drone 100 and operates it. Therefore, according to an instruction from the mobile terminal 701, the drone 100 can deviate from autonomous flight, and the flight controller 501 can determine this state based on the signal.
  • FIG. 22 is a conceptual diagram of the flight path of the drone 100 for diagnosing the growth of crops in the field.
  • the field is shown schematically as a square of equal length on all four sides, with the horizontal axis equally divided from X0 to X10 and the vertical axis equally divided from Y0 to Y10. It is assumed that an image is taken for each square defined by the vertical and horizontal axes. However, in actual shooting, adjacent captured images may partially overlap. For example, starting from the starting point (S) in the lower left region (X0, X1, Y0, Y1), the drone 100 first flies outward along the four sides of the field, as shown by the outer arrow.
  • after making almost one full lap along the edge of the field and reaching the region (X0, X1, Y1, Y2) in front of the starting point (S), the drone 100 flies in a zigzag manner over the inner area, excluding the outer area it has already flown, so as to capture all the crops in the field.
  • the drone 100 completes the shooting work and leaves the field.
  • the shape of the field in which the drone 100 actually flies is not limited to that illustrated in FIG. 22. Further, the photographed image (square) is not limited to a square.
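  • The interior zigzag pass of FIG. 22 is a boustrophedon ("lawnmower") ordering of grid cells. A minimal sketch that emits only the interior cell order, omitting the initial perimeter lap for brevity; the function name and grid abstraction are hypothetical.

```python
def lawnmower(cols, rows):
    """Zigzag (boustrophedon) cell order over a cols x rows grid.

    Each cell corresponds to one captured image square, as in the
    interior pass of FIG. 22 (the perimeter lap is omitted here).
    """
    order = []
    for y in range(rows):
        # alternate sweep direction on every row to avoid dead transits
        xs = range(cols) if y % 2 == 0 else range(cols - 1, -1, -1)
        order.extend((x, y) for x in xs)
    return order

print(lawnmower(3, 2))  # [(0, 0), (1, 0), (2, 0), (2, 1), (1, 1), (0, 1)]
```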
  • the flight controller 501 (FIG. 6) or the camera controller 22 of the camera system 20 (FIG. 21) can determine the flight state of the drone 100 at the time of shooting (based on the position, altitude, speed, attitude, autonomous flight status, etc.). In particular, the flight state can be subdivided into the following A to F.
  • the flight controller 501 or the camera controller 22 of the camera system 20 can classify the flight states A to F into the following two types. That is, in case A, the flight controller 501 determines that the captured image is a valid image. On the other hand, in cases B to F, since the crops in the field are difficult to photograph properly, the flight controller 501 or the camera controller 22 of the camera system 20 determines that the captured image is an invalid image. When valid and invalid images are extracted at this stage, in each case the flight controller 501 or the camera controller 22 of the camera system 20 flags each image (invalid/valid) together with position (latitude, longitude, altitude, etc.) and time information.
  • the flight controller 501 or the camera controller 22 of the camera system 20 determines that the image taken in the following cases is an invalid image.
  • B. When the drone 100 is flying into or out of the field, that is, when the drone 100 has not yet reached the field or has already left it, the crops in the field cannot be photographed properly. Therefore, it is determined that the flight state of the drone 100 does not satisfy the predetermined condition, that is, that the captured image is an invalid image.
  • the entry / exit flight to and from the field is an entry flight from the takeoff point of the drone to the shooting target area, or an exit flight from the shooting target area to the landing point.
  • C. During normal shooting flight in the field, the drone 100 flies in a zigzag or meandering manner, moving to the next area in order (see FIG. 22). However, the drone 100 can also fly diagonally across the field without being restricted by such movement. Therefore, if the flight state is tracked together with the position information of the drone 100, an image may be determined to be valid when a zigzag flight or the like is detected, and invalid when an oblique flight deviating from the predetermined route flight (zigzag flight) is detected.
  • the valid image and the invalid image may be distinguished by tracking the instruction from the mobile terminal 701, the speed and altitude of the drone 100, and the like together with the position information of the drone 100.
  • D. When the drone 100 is in an emergency stop (hovering), that is, when the camera is not photographing the crops in the field while moving at the predetermined position and altitude, all the crops in the field cannot be photographed properly. Therefore, it is determined that the flight state of the drone 100 does not satisfy the predetermined condition, that is, that the captured image is an invalid image.
  • the valid image and the invalid image may be distinguished by tracking the instruction from the mobile terminal 701, the speed and altitude of the drone 100, and the like together with the position information of the drone 100.
  • E. When the position or speed cannot be measured well due to disturbance of the input signal or the like, and the flight state of the drone 100 cannot be properly determined, the crops in the field cannot be photographed properly. Therefore, it is determined that the flight state of the drone 100 does not satisfy the predetermined condition, that is, that the captured image is an invalid image.
  • the input signal include various sensors of the drone 100, signals from the mobile terminal 701 and the management server 702, satellite signals, and the like.
  • valid and invalid signals may be distinguished by tracking the values of these signals and comparing them with a predetermined threshold value or the like.
  • F. Based on the position information of the drone 100, when the position coordinates are outside the target field, the flight controller 501 or the camera controller 22 of the camera system 20 may determine that the image is invalid. Further, in addition to the above B to F, an image may be determined to be invalid based on the following conditions. For example: G. If the altitude above ground level of the drone 100 is out of the predetermined range (too high or too low), appropriate shooting is difficult, so the image may be determined to be invalid. Furthermore: H. If the attitude angle of the drone 100 is out of the predetermined range, appropriate shooting is difficult, so the image may be determined to be invalid.
  • I When the flight controller 501 is abnormal (temporary abnormality, etc.) or malfunctions (non-temporary permanent equipment failure, etc.), it may be determined as an invalid image.
  • J. When the camera system 20 is abnormal (for example, a temporary abnormality of the camera controller 22) or has failed (for example, the lens of the camera 21 is dirty), the image may be determined to be invalid.
  • K. If the drone 100 flies over a position that has already been photographed on the shooting flight, the image may be determined to be invalid.
  • L. When the flight controller 501 detects an emergency intervention command, an abnormal state, or the like, the image may be determined to be invalid.
  • M. For the purpose of diagnosing the growth of crops in a field, if the growth diagnosis (for example, judging crop colors such as green or red) cannot be performed well on the captured image, the image may be determined to be invalid. In this case, whether the captured image has a sufficient amount of light is a particular concern. For example, a good judgment can be made on an image obtained during the day, but not on an image obtained at night, when the surroundings become dark and good color identification becomes difficult. Therefore, if the shooting time is out of a predetermined range (that is, other than daytime), the image may be determined to be invalid.
  • N. For example, if the drone 100 is operated in the evening and the surroundings gradually darken toward sunset, the image can be determined to be invalid when the degree of ambient illumination (light intensity, etc.) obtained from a sensor (an optical sensor, etc.) or from the camera image is out of a predetermined range. That is, instead of making a binary day-or-night judgment, the change in illumination from day to night is modeled in advance in multiple steps, and an image may be determined to be invalid when the illumination deviates from the predetermined range.
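As an illustration of the multi-step light-level judgment described in condition N, the sketch below compares a sensed illuminance against a predefined range; the function name `light_level_valid`, the threshold values, and the sensor readings are assumptions for illustration only, not part of the disclosed system.

```python
def light_level_valid(lux, min_lux=2000.0, max_lux=100000.0):
    """Return True if the measured ambient illuminance (in lux) lies
    within the assumed valid range for color-based growth diagnosis.

    Rather than a binary day/night decision, the gradual dimming at
    dusk is handled by comparing the sensed light level against a
    predefined range, as condition N describes. Thresholds are
    illustrative, not values from the disclosure.
    """
    return min_lux <= lux <= max_lux

# Hypothetical sensor values as the surroundings darken at sunset
readings = [50000.0, 20000.0, 5000.0, 800.0]
flags = ["valid" if light_level_valid(r) else "invalid" for r in readings]
print(flags)  # the last reading falls below the range -> invalid
```

The same range check could be driven by the mean brightness of the camera image instead of a dedicated light sensor.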
  • O. The captured image quality may deteriorate due to wind, insects, or the like. In such a case, the image obtained at that time may be determined to be invalid based on weather information, information obtained from sensors, and the like. That is, even in the daytime, if the captured image is stained or disturbed by wind, insects, or the like and appropriate image quality cannot be obtained, it may be determined to be an invalid image.
  • P. The captured image quality may also deteriorate due to weather conditions. In such a case, the image obtained at that time may be determined to be invalid based on weather information, information obtained from sensors, and the like. That is, even during the daytime, if sufficient light is unavailable due to weather conditions (rain, snow, etc.) and appropriate image quality cannot be obtained, the image may be determined to be invalid.
  • When an invalid image is extracted because any of conditions B to P above is satisfied, and particularly when any of conditions B to F is satisfied, a warning may be output from the drone 100 by sound or light, or a warning may be output to the mobile terminal 701 or the like.
  • For example, the drone operator may be notified by an optical signal (lighting, blinking, etc.) from a light such as the LED 107 or the warning light 521 (see FIG. 6) provided on the drone 100.
  • Alternatively, the notification may be displayed on a display unit provided on the drone.
  • The buzzer 518 and the speaker 520 may also be used to notify the drone operator of the state (particularly an error state) by an audio signal.
  • In addition, a warning may be issued on the mobile terminal 701 (see FIG. 7) used by the drone operator, and information notifying the state (particularly an error state) may be output.
  • In this way, the user can immediately identify, from the signal emitted by the main body of the drone 100 or by the mobile terminal 701, whether the drone 100 is acquiring valid or invalid images. The user can therefore quickly recognize during operation whether invalid images are mixed in with the valid images.
  • When the acquired flight information satisfies a predetermined condition (conditions B to F above, etc.), the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) of the camera system 20 preferably stops storage of the image, stops preprocessing of the image, stops storage of the image or the preprocessed data, stops transmission of the image or the preprocessed data to the outside, or deletes the image or the preprocessed data from the storage device.
  • Specifically, any one of the following may be performed, or an appropriate combination of them may be adopted:
  • -Set / do not set the invalid flag
    -Record / do not record the original image (raw data)
    -Preprocess / do not preprocess the image
    -Record / do not record the preprocessed data
    -Transmit / do not transmit the original image or the preprocessed data, etc.
  • Note that these combinations include the case where the original image is not recorded in the first place, the case where the original image is recorded as-is but the invalid flag is attached, and the case where preprocessing is performed but the invalid flag is attached.
  • That is, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) of the camera system 20 specifies not only whether to store the captured image, but also whether to preprocess the captured image, whether to store the preprocessed data, and whether to transmit the preprocessed data to the management server.
  • For example, the camera controller may store the captured raw data in the storage unit 25 of the camera system 20 in the drone 100 while prohibiting preprocessing by the image processor 24, or may prohibit the transmission unit 26 from transmitting the preprocessed data to the outside. As a result, preprocessing based on poor-quality captured images can be avoided and the burden of unnecessary data transmission can be reduced. By finely dividing the control in this way, the drone can be controlled in the manner optimal for each of the various embodiments.
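The per-image combinations of flagging, recording, preprocessing, and transmission described above can be represented as a small decision record. The sketch below is a hypothetical illustration; the class `ImageHandling`, the policy in `decide_handling`, and all field names are assumptions, with the invalid branch mirroring the example in which raw data is stored while preprocessing and external transmission are prohibited.

```python
from dataclasses import dataclass

@dataclass
class ImageHandling:
    """One row of the flag/record/preprocess/transmit decision table."""
    invalid_flag: bool
    record_raw: bool        # store the original (raw) image
    preprocess: bool        # run preprocessing on the image
    record_processed: bool  # store the preprocessed data
    transmit: bool          # send raw or preprocessed data to the server

def decide_handling(flight_state_ok: bool) -> ImageHandling:
    # Illustrative policy: keep the raw image even when the flight
    # state is bad (so nothing is lost), but skip preprocessing and
    # transmission, as in the example where raw data is stored while
    # preprocessing and external transmission are prohibited.
    if flight_state_ok:
        return ImageHandling(False, True, True, True, True)
    return ImageHandling(True, True, False, False, False)

h = decide_handling(flight_state_ok=False)
print(h.invalid_flag, h.record_raw, h.preprocess, h.transmit)
```

Other embodiments would simply return different rows of the table (for example, not recording the raw image at all).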
  • FIG. 23 is an example of an invalid image determination processing flow of the drone 100.
  • First, in step S02, the flight controller 501 acquires the flight state of the drone 100 from the various sensors (FIG. 6) or from an external server (FIG. 7).
  • In step S03, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) determines whether the flight state acquired in step S02 satisfies a predetermined invalid-image condition. For example, when any of conditions B to F is satisfied, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) flags the captured image as an invalid image in step S04.
  • In step S05, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) stops preprocessing of the invalid image, storage of the preprocessed data, transmission of the preprocessed data to the outside, and the like.
  • In step S03, when, for example, condition A is satisfied, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) flags the captured image as a valid image in step S06.
  • In step S07, preprocessing of the valid image, storage of the preprocessed data, transmission of the preprocessed data to the outside, and the like are executed.
  • The above processing may be performed by the flight controller 501 (FIG. 6) in cooperation with the camera controller 22 (FIG. 21), or may be carried out by the camera controller 22 (FIG. 21) instead of the flight controller 501 (FIG. 6).
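The determination flow of FIG. 23 (steps S02 to S07) can be sketched roughly as follows; the condition list, predicate functions, and returned action names are illustrative assumptions rather than the actual controller interface.

```python
def classify_image(flight_state, invalid_conditions):
    """Steps S03-S07 of FIG. 23 as a sketch: check each invalid-image
    condition (B to F, etc.) against the acquired flight state and
    return the flag plus the follow-up actions."""
    for name, predicate in invalid_conditions:
        if predicate(flight_state):            # S03: condition satisfied
            return {"flag": "invalid",          # S04: flag as invalid
                    "actions": [],              # S05: stop preprocessing,
                    "reason": name}             #      storage, transmission
    return {"flag": "valid",                    # S06: flag as valid
            "actions": ["preprocess", "store", "transmit"],  # S07
            "reason": None}

# Two hypothetical conditions standing in for B and G
conditions = [
    ("B: hovering", lambda s: s["speed"] == 0.0),
    ("G: altitude out of range", lambda s: not (1.0 <= s["altitude"] <= 5.0)),
]
print(classify_image({"speed": 0.0, "altitude": 2.0}, conditions)["flag"])
print(classify_image({"speed": 4.0, "altitude": 2.0}, conditions)["flag"])
```

In the real system the flight state would come from the sensors acquired in step S02, and the actions would drive the image processor and transmission unit.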
  • The management server 702 can recalculate a flight path over the areas where invalid images were acquired.
  • The control unit, that is, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) of the camera system 20, stores the position information of each invalid image and transmits the position information to the external management server 702.
  • The management server 702 may recalculate a flight path connecting the invalid images based on the received data and send an instruction to the flight controller 501 to rephotograph the area of each invalid image.
  • Specifically, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) of the camera system 20 flags the area corresponding to each invalid image and obtains its position (latitude and longitude, etc.). The management server 702 then calculates an economical flight path for the drone 100 based on the position information of each flagged image, or selects an appropriate flight path from predetermined flight paths, and the drone 100 flies again along that path.
  • FIG. 24 is a conceptual diagram showing a case where an invalid image is extracted during the flight of the drone 100 for diagnosing the growth of crops in the field illustrated in FIG. 22.
  • Each extracted invalid image is flagged.
  • The management server 702 can adopt, for example, the following means when calculating a new flight path for rephotographing the areas 31, 32, and 33 from which these invalid images were extracted.
  • FIG. 25 shows an example in which the same flight path, including the areas 31, 32, and 33 of the three invalid images illustrated in FIG. 24, is adopted again, and the flight continues until all the invalid images have been retaken.
  • That is, the management server 702 can also calculate a new flight path that follows the already adopted flight path in whole or in part, with images taken only at the points corresponding to the invalid images. In this case, the calculation of the flight path is simple. However, if the number of invalid images that need to be retaken is small, the flight path tends to be wasteful.
  • FIG. 26 shows an example of flying so as to connect the invalid images at the shortest distance in the flight path including the areas 31, 32, and 33 of the three invalid images illustrated in FIG. 24, without adopting the same flight path again.
  • That is, the flight path is calculated by connecting the plurality of positions corresponding to the plurality of invalid images in a single stroke and minimizing the length of the entire flight path.
  • In other words, the management server 702 may calculate the new flight path so as to connect the areas 31, 32, and 33 of the flagged images in a single stroke while minimizing the overall flight distance.
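Minimizing the total length of a single-stroke path through a handful of flagged positions is a small travelling-salesman-style computation, and brute force over all orderings suffices at this scale. The sketch below is an assumption-laden illustration: the starting point, the coordinates standing in for areas 31, 32, and 33, and the function name are all hypothetical.

```python
from itertools import permutations
from math import hypot

def shortest_single_stroke(start, points):
    """Return the ordering of `points` that minimizes the total
    flight distance when visiting all of them once, starting from
    `start` (brute force; practical only for a few invalid images)."""
    def length(order):
        path = [start] + list(order)
        return sum(hypot(b[0] - a[0], b[1] - a[1])
                   for a, b in zip(path, path[1:]))
    return min(permutations(points), key=length)

# Hypothetical local coordinates (in meters) for areas 31, 32, 33
areas = [(10.0, 0.0), (2.0, 1.0), (5.0, 4.0)]
order = shortest_single_stroke((0.0, 0.0), areas)
print(order)
```

For larger numbers of retake points, the server could switch to a heuristic such as nearest-neighbor ordering instead of exhaustive search.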
  • With this means, the flight time of the drone 100 can be shortened and waste in the flight path avoided, but a peculiar problem may occur because the path departs from the zigzag flight path (FIG. 22).
  • That is, on the zigzag flight path the drone 100 flies almost parallel to the vertical and horizontal axes, whereas, as illustrated in FIG. 26, when connecting the invalid images at the shortest distance, the drone 100 tends to fly at an angle to those axes.
  • As a result, a deviation (angle θ) is likely to occur between the orientation of the invalid image 31 taken on the first flight path (the zigzag flight path) and the orientation of the image 31' taken on the second flight path (the angled flight path). Therefore, when the first invalid image 31 is replaced with the subsequent valid image 31', the orientations of the corresponding images differ, so that when the plurality of images are merged (combined) into one, the images shift at that location (especially when the captured image 31 is not square).
  • Therefore, it is preferable to store each image in association with the camera direction at the time of the first shooting and the direction at the time of the second shooting. Later, when matching the first image with the second valid image, the flight controller 501 or the camera controller 22 preferably corrects the deviation (angle θ) of the retaken image (by rotation, etc.). This method is effective when the number of invalid images that need to be retaken is small.
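If the camera heading is stored with each shot, the deviation (angle θ) between the first shooting direction and the retake direction reduces to a simple angle difference. The helper below is a hedged sketch; it assumes headings stored in degrees measured clockwise, and the function name is hypothetical.

```python
def heading_deviation(first_heading_deg, retake_heading_deg):
    """Return the signed deviation theta (degrees, range -180..180)
    between the camera heading stored at the first shot and the
    heading at the retake. The retaken image must be rotated by
    -theta to align with the original orientation before merging."""
    theta = (retake_heading_deg - first_heading_deg) % 360.0
    if theta > 180.0:
        theta -= 360.0
    return theta

print(heading_deviation(0.0, 35.0))    # 35.0
print(heading_deviation(350.0, 10.0))  # 20.0 (wraps across north)
```

The actual pixel rotation would then be applied by the image processor when the retake is substituted for the invalid image.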
  • FIG. 27 shows an example in which, without adopting the same flight path again, the drone flies so as to connect the three invalid images illustrated in FIG. 24 at the shortest possible distance while photographing with the same camera orientation as on the first flight path. That is, the length of the entire flight path is minimized while the direction in which each invalid image is retaken is matched to that of the first flight path. In this means, for each invalid image the second flight includes a section 41 that flies toward the invalid image at the shortest possible distance (a section flown diagonally with respect to the first flight path) and a section 42 that corrects the direction of the drone 100 so that the image is taken with the same camera orientation as during the first flight (a section flown in the same manner as the first flight path).
  • The ratio of the lengths of the sections 41 and 42 can be changed according to the embodiment, but the section 42 is preferably as short as possible.
  • In this way, the orientation of the image taken on the first flight path (the zigzag flight path) can be matched with that of the image retaken on the second flight path (the angled flight path); the same applies when flying multiple times, three times or more. That is, the new flight path is calculated to be partially or wholly the same as the first flight path so as to include the plurality of positions corresponding to the plurality of invalid images. Further, when reacquiring images at the plurality of positions corresponding to the plurality of invalid images, the direction of the drone is made the same as the direction in which each invalid image was acquired before the image is acquired again. As another method, when reacquiring images at those positions, the reacquired image may be converted (by image processing) into the same composition as the corresponding invalid image, thereby reacquiring an image with the same composition as on the first flight.
  • FIG. 28 shows an example in which, in the flight path including the areas 31, 32, and 33 of the three invalid images illustrated in FIG. 24, the invalid images are connected at the shortest possible distance without adopting the same flight path again, while flying in the same direction as on the first flight path or in a direction rotated by 90 or 180 degrees. Comparing the example of FIG. 28 with that of FIG. 27, in the former, particularly when flying from the area 32 of the second invalid image to the area 33 of the third invalid image, the shooting direction is changed only once in order to correspond to the orientation of the image taken on the first flight path (reference numeral 43). The example of FIG. 28 is therefore more economical, since the flight path is shorter than in the example of FIG. 27. However, the image after shooting is inverted vertically or horizontally with respect to the first image.
  • When the captured image is rectangular, a 180-degree rotation leaves the vertical and horizontal dimensions unchanged, so the retaken valid image can be substituted for the invalid image by changing its orientation (rotating it) by 180 degrees. When the captured image is square, every side has the same length, so the retaken valid image can also be substituted for the invalid image by changing its orientation (rotating it) by 90 degrees.
  • In either case, the captured image is rotated in consideration of the shooting angle of the camera at the time of shooting.
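The 90- and 180-degree substitutions can be sketched with plain list operations on a pixel grid, without any imaging library; the helper names `rot180` and `rot90_cw` are illustrative assumptions.

```python
def rot180(grid):
    """Rotate a 2D pixel grid by 180 degrees; valid for any
    rectangle, since height and width are unchanged."""
    return [row[::-1] for row in reversed(grid)]

def rot90_cw(grid):
    """Rotate a 2D pixel grid 90 degrees clockwise; only a square
    grid keeps the same frame dimensions, as the text notes."""
    return [list(row) for row in zip(*reversed(grid))]

img = [[1, 2],
       [3, 4]]
print(rot180(img))    # [[4, 3], [2, 1]]
print(rot90_cw(img))  # [[3, 1], [4, 2]]
```

A real implementation would operate on the camera's pixel buffer, but the index arithmetic is the same.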
  • In summary, the management server 702 can adopt an appropriate one of the following means when obtaining a new flight path:
  • -Calculating a new flight path that is partially or wholly the same as the first flight path so as to include the invalid images (the case illustrated in FIG. 25)
    -Calculating a new flight path that connects the invalid images in a single stroke so as to minimize the length of the entire flight path (the case illustrated in FIG. 26; priority is given to connecting the invalid images at the shortest distance, even if the shooting direction of each invalid image does not match that of the first flight path)
  • Alternatively, the management server 702 may prepare a plurality of flight paths in a database in advance and obtain a new flight path by selecting from the database an appropriate flight path that includes the positions of the invalid images.
  • FIG. 29 is an example of the invalid image determination process and the second flight route calculation process flow of the drone 100.
  • First, in step S12, the flight controller 501 (FIG. 6) acquires the flight state of the drone 100 from the various sensors (FIG. 6) or from an external server (FIG. 7).
  • In step S13, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) determines whether the flight state acquired in step S12 satisfies the predetermined invalid-image condition. For example, when any of conditions B to F above is satisfied, the captured image is flagged as an invalid image in step S14.
  • In step S15, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) stops the processing of the invalid image (preprocessing of the image, storage of the preprocessed data, transmission of the preprocessed data, etc.).
  • In step S13, when, for example, condition A is satisfied, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) flags the captured image as a valid image in step S16.
  • In step S17, processing of the valid image (preprocessing of the image, storage of the preprocessed data, transmission of the processed data, etc.) is executed.
  • In step S18, the invalid image information (FIG. 24; the invalid image flag, position information, etc.) is transmitted to the external management server 702 (FIG. 7).
  • In step S19, the external management server 702 recalculates a new flight path connecting the invalid images (FIGS. 25 to 28).
  • In step S20, the recalculated flight path is transmitted from the external management server 702 to the drone 100 (FIG. 7). After that, the drone 100 flies again to retake the invalid images.
  • During the second flight, when the invalid images are retaken, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) of the camera system 20 again flags each retaken image as valid or invalid. As a result, there is a one-to-one correspondence between each image determined to be invalid during the first flight and the image retaken as a valid image during the second flight. At that time or thereafter, each invalid image from the first flight may be replaced with the corresponding valid image. If an invalid image occurs again during the second flight, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) of the camera system 20 takes the same measures as during the first flight, and the photograph may be taken again during a third flight if necessary.
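The one-to-one replacement of first-flight invalid images by their second-flight retakes can be sketched as a merge keyed by shooting position, with retakes that fail again carried forward to a possible third flight; the key format and function name below are assumptions for illustration.

```python
def merge_retakes(first_flight, retakes):
    """Replace each image flagged invalid on the first flight with the
    corresponding retake (keyed by shooting position). A retake that is
    itself invalid leaves the original entry in place and is returned
    separately, so it can be scheduled for a third flight."""
    merged, still_invalid = dict(first_flight), []
    for pos, (img, flag) in retakes.items():
        if flag == "valid":
            merged[pos] = (img, "valid")
        else:
            still_invalid.append(pos)   # retake failed again
    return merged, still_invalid

first = {(0, 0): ("img_a", "valid"), (0, 1): ("img_b", "invalid")}
second = {(0, 1): ("img_b2", "valid")}
merged, redo = merge_retakes(first, second)
print(merged[(0, 1)], redo)  # ('img_b2', 'valid') []
```

Keying by position (latitude/longitude or grid cell) matches the per-image position information that the controller stores and transmits to the management server.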
  • As described above, the present invention provides a drone 100 equipped with a camera 512, or a camera system 20 attached to the drone 100, for diagnosing the growth of crops in a field.
  • When the flight state of the drone satisfies a predetermined condition, preprocessing of the captured image, or transmission of the preprocessed data to the external server, is stopped. This eliminates wasted image processing and optimizes work efficiency.
  • Further, when an invalid image is extracted, an external server is made to calculate a new flight path so as to newly capture the invalid image. This makes an optimized flight path promptly available for retaking invalid images.
  • The present invention is not limited to the above-described embodiments and includes various modifications.
  • The above embodiments have been described in detail in order to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to those including all of the described configurations.
  • It is also possible to replace a part of the configuration of one embodiment with the configuration of another embodiment, and to add the configuration of another embodiment to the configuration of one embodiment.
  • Each of the above configurations, functions, processing units, processing means, etc. may be realized in hardware by designing a part or all of them as, for example, an integrated circuit. Each of the above configurations, functions, etc. may also be realized in software by a processor interpreting and executing a program that realizes each function. Information such as programs, tables, and files that realize each function can be stored in a memory, in a storage device such as a hard disk or SSD (Solid State Drive), or on a storage medium such as an IC card, SD card, or DVD.
  • The control lines and information lines shown are those considered necessary for explanation, and not all control lines and information lines in a product are necessarily shown. In practice, almost all configurations can be considered interconnected. It should be noted that the above-described embodiments disclose at least the configurations described in the claims.

Landscapes

  • Engineering & Computer Science (AREA)
  • Combustion & Propulsion (AREA)
  • Chemical & Material Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Botany (AREA)
  • Ecology (AREA)
  • Forests & Forestry (AREA)
  • Environmental Sciences (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Catching Or Destruction (AREA)

Abstract

Provided are a drone which, in accordance with its flight state, performs appropriate management relating to preprocessing of captured images or transmission of preprocessed data to the outside, and a drone provided with appropriate means for retaking an image when an invalid image is extracted. This drone includes a main body, a plurality of rotary blades, a camera, a transmission unit for transmitting data generated from an image acquired from the camera to the outside, and a storage device, wherein, if the flight state of the drone satisfies a predetermined condition, the drone performs at least one of: flagging the image as an invalid image; stopping storage of the image in the storage device, stopping preprocessing of the image, or stopping transmission of the data to the outside; and deleting the image from the storage device.

Description

A drone for diagnosing the growth of crops, and a camera system for the same
 The present invention relates to a drone for diagnosing the growth of crops, and a camera system for the same.
 As background art in this technical field, there is International Publication No. 2017/221641 (Patent Document 1). This publication describes "a plant growth index measuring device, method, and program for generating an image of a field to be measured, captured from above, in order to obtain a growth index indicating the degree of growth of plants, wherein images are generated at predetermined time intervals, it is determined whether or not to store each generated image in an image storage unit so that only images significant for obtaining the growth index are stored, and a generated image is stored in the image storage unit only when it is determined that it should be stored" (see the abstract).
International Publication No. 2017/221641
Patent Document 1 describes a drone that images a field. In the prior art, however, a captured image is stored when the flight state of the aircraft (including a drone), such as its position, altitude, speed, and attitude, satisfies preset conditions; with these conditions alone, it may not be possible to sufficiently judge whether the captured image should be stored. In addition, although images suitable for diagnosing the growth of crops in the field are selected and stored, no measures may be taken beyond excluding unsuitable images from storage. Further, when an invalid image is extracted, sufficient means for retaking the image may not be provided.
Therefore, the present invention provides a drone, and a camera system for the same, that appropriately manage preprocessing of captured images or transmission of processed data to the outside in accordance with the flight state of the drone. Further, when an invalid image is extracted, a drone provided with appropriate means for retaking the image, and a camera system for the same, are provided.
In order to solve the above problems, for example, the configurations described in the claims are adopted.
The present application includes a plurality of means for solving the above problems. To give one example, a drone includes a main body, a plurality of rotary blades, a camera, a transmission unit that transmits data generated from an image acquired from the camera to the outside, and a storage device, wherein, when the flight state of the drone satisfies a predetermined condition, the drone performs at least one of: flagging the image as an invalid image; stopping storage of the image in the storage device; stopping preprocessing of the image or stopping transmission of the data to the outside; and deleting the image from the storage device.
According to the present invention, it is possible to provide a drone, and a camera system for the same, that appropriately manage preprocessing of captured images or transmission of preprocessed data to the outside in accordance with the flight state of the drone. Further, when an invalid image is extracted, a drone provided with appropriate means for retaking the image, and a camera system for the same, can be provided.
Problems, configurations, and effects other than those described above will be clarified by the following description of the embodiments.
This is an example of a plan view of a drone.
This is an example of a front view of a drone.
This is an example of a right side view of a drone.
This is an example of a rear view of a drone.
This is an example of a perspective view of a drone.
This is an example of a block diagram showing the control functions of the drone.
This is an example of a connection configuration diagram of the entire drone management system 700.
This is an example of the field information display screen 800 displayed on the mobile terminal 701.
This is an example of the drone operation screen 900 displayed on the mobile terminal 701.
This is an example of the hardware configuration of the mobile terminal 701.
This is an example of the hardware configuration of the management server 702.
This is an example of the hardware configuration of the management terminal 703.
This is an example of field management information 1300.
This is an example of device management information 1400.
This is an example of user management information 1500.
This is an example of chemical management information 1600.
This is an example of energy management information 1700.
This is an example of flight path management information 1800.
This is an example of schedule management information 1900.
This is an example showing a growth diagnosis of crops in a field performed using a drone, divided into cases A and B.
This is an example of a dedicated camera system mounted on a drone.
This is an example of a flight path of a drone that diagnoses the growth of crops in a field.
This is an example of the drone's invalid image determination processing flow.
This is an example in which invalid images were extracted during the flight of a drone diagnosing the growth of crops in a field.
This is an example of retaking invalid images in the field during a second drone flight.
This is an example of retaking invalid images in the field during a second drone flight.
This is an example of retaking invalid images in the field during a second drone flight.
This is an example of retaking invalid images in the field during a second drone flight.
This is an example of the flow of the drone's invalid image determination processing and second flight path calculation processing.
Hereinafter, embodiments will be described with reference to the drawings.
A drone is an example of agricultural machinery. In this specification, the term "drone" refers to any aircraft having a plurality of rotary blades, regardless of its power means (electric power, prime mover, etc.) or maneuvering method (wireless or wired, autonomous flight type or manually piloted type, etc.).
FIG. 1 is an example of a plan view of a drone.
FIG. 2 is an example of a front view of the drone.
FIG. 3 is an example of a right side view of the drone.
FIG. 4 is an example of a rear view of the drone.
FIG. 5 is an example of a perspective view of the drone.
 The rotary blades 101-1a, 101-1b, 101-2a, 101-2b, 101-3a, 101-3b, 101-4a, and 101-4b (also called rotors) are the means for flying the drone 100; eight of them (four sets of two-stage rotary blades) are provided in consideration of the balance among flight stability, airframe size, and power consumption. Each rotary blade 101 is arranged on one of the four sides of the main body 110 of the drone 100 by an arm extending from the main body 110. That is, the rotary blades 101-1a and 101-1b are arranged at the left rear in the traveling direction, 101-2a and 101-2b at the left front, 101-3a and 101-3b at the right rear, and 101-4a and 101-4b at the right front. Note that the traveling direction of the drone 100 is downward on the paper in FIG. 1. Rod-shaped legs 107-1, 107-2, 107-3, and 107-4 extend downward from the rotation axes of the rotary blades 101.
Motors 102-1a, 102-1b, 102-2a, 102-2b, 102-3a, 102-3b, 102-4a, and 102-4b are the means for rotating the rotors 101-1a, 101-1b, 101-2a, 101-2b, 101-3a, 101-3b, 101-4a, and 101-4b (typically electric motors, but engines or the like may be used); one motor is provided for each rotor. The motor 102 is an example of a propulsion device. The upper and lower rotors within a set (for example, 101-1a and 101-1b) and their corresponding motors (for example, 102-1a and 102-1b) have coaxial rotation axes and rotate in opposite directions, for flight stability and other reasons.
As shown in FIGS. 2 and 3, the radial members that support the propeller guards, which are provided so that the rotors do not interfere with foreign objects, form a raised trestle-like structure rather than lying horizontally. This encourages the members to buckle outward, away from the rotors, in the event of a collision, preventing them from interfering with the rotors.
The drug nozzles 103-1, 103-2, and 103-3 are means for spraying a drug downward; four nozzles are provided. In this specification, "drug" refers to any liquid, powder, or fine particles sprayed on a field, such as pesticides, herbicides, liquid fertilizers, insecticides, seeds, and water.
The drug tank 104 is a tank for storing the drug to be sprayed and, from the viewpoint of weight balance, is located close to and below the center of gravity of the drone 100. The drug hoses 105-1, 105-2, and 105-3 connect the drug tank 104 to the drug nozzles 103-1, 103-2, and 103-3; the hoses may be made of a rigid material and double as supports for the nozzles. The pump 106 is a means for discharging the drug from the nozzles.
FIG. 6 is an example of a block diagram showing the control function of the drone.
The flight controller 501 is the component that governs control of the entire drone; specifically, it may be an embedded computer including a CPU, memory, and associated software. Based on input received from the mobile terminal 701 and on input from the various sensors described later, the flight controller 501 controls the flight of the drone 100 by regulating the rotation speeds of the motors 102-1a, 102-1b, 102-2a, 102-2b, 102-3a, 102-3b, 102-4a, and 102-4b via control means such as ESCs (Electronic Speed Controllers). The actual rotation speeds of the motors are fed back to the flight controller 501, so that it can monitor whether they are rotating normally. Alternatively, an optical sensor or the like may be provided on the rotors 101 so that their rotation is fed back to the flight controller 501.
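The monitoring side of this feedback loop can be sketched as follows. This is a minimal illustration, not the specification's implementation; the function name and the 10% tolerance are assumptions chosen for the example.

```python
def check_motor_feedback(commanded_rpm, actual_rpm, tolerance=0.10):
    """Compare commanded vs. actual motor RPM (as fed back to the
    flight controller) and return the indices of motors whose actual
    speed deviates by more than the given relative tolerance."""
    faulty = []
    for i, (cmd, act) in enumerate(zip(commanded_rpm, actual_rpm)):
        if cmd > 0 and abs(act - cmd) / cmd > tolerance:
            faulty.append(i)
    return faulty

# Example: motor 2 lags its command by 20% and is flagged.
commanded = [8000, 8000, 8000, 8000]
actual = [8010, 7950, 6400, 8020]
assert check_motor_feedback(commanded, actual) == [2]
```

A real controller would run such a check every control cycle and escalate (e.g., switch to a degraded flight mode) when a motor is flagged repeatedly.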
The software used by the flight controller 501 can be rewritten, for function extension, modification, or bug fixes, through a storage medium or through communication means such as Wi-Fi or USB. In that case, protection such as encryption, checksums, digital signatures, and virus-checking software is applied so that the software cannot be rewritten maliciously. Part of the computation that the flight controller 501 uses for control may be executed by another computer, for example on the mobile terminal 701, on the management server 702, or elsewhere. Because the flight controller 501 is highly critical, some or all of its components may be duplicated.
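The checksum part of such an update guard might look like the following sketch. The function name and digest scheme are illustrative assumptions; the specification names checksums and digital signatures but prescribes no particular algorithm, and a real updater would verify a signature, not just a hash.

```python
import hashlib

def verify_firmware(image: bytes, expected_sha256: str) -> bool:
    """Reject a firmware image whose SHA-256 digest does not match the
    digest distributed with the update (checksum check only; a signature
    check would additionally authenticate the publisher)."""
    return hashlib.sha256(image).hexdigest() == expected_sha256

image = b"example firmware payload"
good_digest = hashlib.sha256(image).hexdigest()
assert verify_firmware(image, good_digest)
assert not verify_firmware(image + b"tampered", good_digest)
```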
The flight controller 501 communicates with the mobile terminal 701 through the Wi-Fi slave function 503 and the base station 710, receiving necessary commands from, and sending necessary information to, the mobile terminal 701. This communication may be encrypted to prevent fraud such as interception, spoofing, and device hijacking. In addition to its Wi-Fi communication function, the base station 710 also functions as an RTK-GPS base station. By combining the signal from the RTK base station with signals from GPS positioning satellites, the flight controller 501 can measure the absolute position of the drone 100 with an accuracy of about a few centimeters. Because of its criticality, the flight controller 501 may be duplicated or multiplexed, and, to tolerate the failure of a particular GPS satellite, the redundant flight controllers 501 may be controlled so that each uses a different satellite. Communication among the flight controller 501, the base station 710, and the mobile terminal 701 may also use a mobile network such as LTE instead of Wi-Fi.
The 6-axis gyro sensor 505 measures the acceleration of the drone airframe in three mutually orthogonal directions and computes velocity by integrating the acceleration. It also measures the change in the attitude angle of the airframe about those three directions, that is, the angular velocity. The geomagnetic sensor 506 determines the heading of the airframe by measuring the geomagnetic field. The barometric pressure sensor 507 measures air pressure and can thereby indirectly measure the drone's altitude. The laser sensor 508 measures the distance between the airframe and the ground using the reflection of laser light, and may be an IR (infrared) laser.
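The velocity-from-acceleration step can be sketched with simple rectangular integration. This is a minimal illustration of the principle only; a real flight controller fuses other sources (GPS, etc.) to bound the drift that pure integration accumulates.

```python
def integrate_acceleration(samples, dt):
    """Accumulate velocity from successive accelerometer samples
    (ax, ay, az) taken at a fixed interval dt, by rectangular
    integration: v += a * dt for each axis."""
    vx = vy = vz = 0.0
    for ax, ay, az in samples:
        vx += ax * dt
        vy += ay * dt
        vz += az * dt
    return vx, vy, vz

# 1 s of constant 2 m/s^2 forward acceleration sampled at 100 Hz
samples = [(2.0, 0.0, 0.0)] * 100
vx, vy, vz = integrate_acceleration(samples, dt=0.01)
assert abs(vx - 2.0) < 1e-9 and vy == 0.0 and vz == 0.0
```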
The sonar 509 measures the distance between the drone airframe and the ground using the reflection of sound waves such as ultrasound. These sensors may be selected according to the drone's cost targets and performance requirements. A gyro sensor (angular velocity sensor) for measuring the tilt of the airframe, a wind sensor for measuring wind force, and the like may be added, and these sensors may be duplicated or multiplexed. When multiple sensors serve the same purpose, the flight controller 501 may use only one of them and switch to an alternative sensor when it fails. Alternatively, multiple sensors may be used simultaneously, and a failure may be assumed when their measurements disagree.
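The disagreement-based failure detection described above can be sketched as follows. The function, the 0.5 m spread threshold, and the sensor callables are all illustrative assumptions standing in for, e.g., the laser sensor 508 and sonar 509.

```python
def read_altitude(sensors, max_spread=0.5):
    """Read every available altitude sensor; skip sensors that raise
    an I/O error, flag a fault if the surviving readings disagree by
    more than max_spread metres, otherwise return their mean."""
    readings = []
    for read in sensors:
        try:
            readings.append(read())
        except IOError:
            continue  # a failed sensor is skipped; the others are used
    if not readings:
        raise RuntimeError("all altitude sensors failed")
    if max(readings) - min(readings) > max_spread:
        raise RuntimeError("sensor disagreement: possible fault")
    return sum(readings) / len(readings)

laser = lambda: 2.00  # stand-in for the laser rangefinder
sonar = lambda: 2.10  # stand-in for the sonar
assert abs(read_altitude([laser, sonar]) - 2.05) < 1e-9
```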
The flow sensor 510 measures the flow rate of the drug and is provided at multiple points along the path from the drug tank 104 to the drug nozzles 103. The empty sensor 511 detects that the amount of drug has fallen to or below a predetermined amount. The multispectral camera 512 is a means of photographing the field 720 and acquiring data for image analysis. The obstacle detection camera 513 is a camera for detecting obstacles; because its image characteristics and lens orientation differ from those of the multispectral camera 512, it is a separate device.
The switch 514 is a means by which the user of the drone 100 makes various settings. The obstacle contact sensor 515 detects that the drone 100, in particular its rotors or propeller guards, has come into contact with an obstruction such as an electric wire, a building, a human body, a standing tree, a bird, or another drone. The 6-axis gyro sensor 505 may substitute for the obstacle contact sensor 515. The cover sensor 516 detects that the operation panel or the internal-maintenance cover of the drone 100 is open.
The drug inlet sensor 517 detects that the inlet of the drug tank 104 is open. These sensors may be selected according to the drone's cost targets and performance requirements, and may be duplicated or multiplexed. Sensors may also be provided outside the drone 100, at the base station 710, the mobile terminal 701, or elsewhere, and their readings transmitted to the drone 100. For example, a wind sensor may be provided at the base station 710 and information on wind force and direction transmitted to the drone 100 via Wi-Fi communication.
The flight controller 501 sends control signals to the pump 106 to adjust the drug discharge rate or stop discharge. The current status of the pump 106 (for example, its rotation speed) is fed back to the flight controller 501.
The LED 107 is a display means for informing the drone operator of the drone's state. A display means such as a liquid crystal display may be used instead of, or in addition to, the LED. The buzzer 518 is an output means for signaling the drone's state (particularly error states) with sound. The Wi-Fi slave function 519 is an optional component for communicating, separately from the mobile terminal 701, with an external computer or the like, for example to transfer software. Instead of or in addition to the Wi-Fi slave function, other wireless communication means such as infrared communication, Bluetooth (registered trademark), ZigBee (registered trademark), or NFC, or wired communication means such as a USB connection, may be used. Communication among the flight controller 501, the mobile terminal 701, and the base station 710 may also be carried over mobile communication systems such as 3G, 4G, and LTE instead of the Wi-Fi slave function.
The speaker 520 is an output means that announces the drone's state (particularly error states) using recorded or synthesized voice. Because the visual indicators of the drone 100 in flight can be hard to see in some weather conditions, conveying the situation by voice is effective in such cases. The warning light 521 is a display means such as a strobe light for indicating the drone's state (particularly error states). These input/output means may be selected according to the drone's cost targets and performance requirements, and may be duplicated or multiplexed.
FIG. 7 is an example of a connection configuration diagram of the entire drone management system 700.
The drone management system 700 includes a drone 100, a mobile terminal 701, a management terminal 703, and a base station 710, each of which is connected to the management server 702 via a network. The network may be wired or wireless, and each terminal can send and receive information via the network.
The drone 100 and the mobile terminal 701 can communicate with each other in the field 720 via the base station 710, and the drone 100 performs a drug spraying flight.
The network may communicate according to a single communication standard, or may combine networks of multiple communication standards. For example, the drone 100 and the mobile terminal 701 may each be connected to the network via Wi-Fi provided by the base station 710, or each may be connected via a mobile communication network such as LTE. Alternatively, the drone 100 may be connected via Wi-Fi provided by the base station 710 while the base station 710 and the mobile terminal 701 are connected via a mobile communication network.
The mobile terminal 701, operated by the user, sends commands to the drone 100 and displays information received from the drone 100 (for example, position, drug amount, remaining battery charge, and camera images). It is realized, for example, by a portable information device such as a tablet or smartphone. The drone 100 flies autonomously according to instructions from the management server 702, but the mobile terminal 701 allows manual operation during basic operations such as takeoff and return, and in emergencies. The mobile terminal 701 is connected to the base station 710 and can communicate with the management terminal 703 either through the base station 710 or directly.
The management server 702 is, for example, a server deployed in the cloud; it calculates the spray flight route of the drone 100 based on the field management information 1300 and controls the drone's autonomous flight. It can also collect information acquired from the camera and various sensors mounted on the drone 100 and perform various analyses, such as of the state of the field and crops.
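Route calculation of the kind the management server performs can be illustrated with a back-and-forth coverage pattern. This is a simplified sketch under stated assumptions: a rectangular field in local metre coordinates and a fixed spray swath width, whereas real fields are polygons in geographic coordinates.

```python
def lawnmower_route(width, depth, swath):
    """Generate boustrophedon (back-and-forth) waypoints covering a
    rectangular field of width x depth metres, one pass per swath,
    with each pass centred within its swath."""
    waypoints = []
    y = swath / 2
    leftward = False
    while y < depth:
        xs = (width, 0.0) if leftward else (0.0, width)
        waypoints.append((xs[0], y))  # start of the pass
        waypoints.append((xs[1], y))  # end of the pass
        leftward = not leftward       # reverse direction each pass
        y += swath
    return waypoints

route = lawnmower_route(width=30.0, depth=20.0, swath=5.0)
assert route[0] == (0.0, 2.5) and route[1] == (30.0, 2.5)
assert route[2] == (30.0, 7.5) and route[3] == (0.0, 7.5)
assert len(route) == 8  # 4 passes, 2 waypoints each
```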
The management terminal 703 is a terminal that operates the management server 702, and makes various settings for the management server 702. It is also possible to control the drone 100 and the mobile terminal 701.
The base station 710 is a device installed at the field 720 that provides, among other things, the Wi-Fi master function; it also functions as an RTK-GPS base station and can thereby provide the accurate position of the drone 100 (the Wi-Fi master function and the RTK-GPS base station may be independent devices). The base station 710 can also communicate with the management server 702 over mobile communication networks such as 3G, 4G, and LTE.
Each terminal of the drone management system 700, as well as the management server 702, may be, for example, a mobile terminal such as a smartphone, tablet, mobile phone, or personal digital assistant (PDA), or a wearable terminal such as a glasses-type, wristwatch-type, or clothing-type device. It may also be a stationary or portable computer, or a server located in the cloud or on a network. Functionally, it may be a VR (Virtual Reality), AR (Augmented Reality), or MR (Mixed Reality) terminal, or a combination of several such terminals; for example, one smartphone and one wearable terminal in combination can logically function as a single terminal. Other information processing terminals are also possible.
Each terminal of the drone management system 700 and the management server 702 comprises: a processor (control unit) that executes an operating system, applications, programs, and the like; a main storage device such as RAM (Random Access Memory); an auxiliary storage device such as an IC card, hard disk drive, SSD (Solid State Drive), or flash memory; a communication control unit such as a network card, wireless communication module, or mobile communication module; input devices such as a touch panel, keyboard, mouse, voice input, or motion-detection input based on images captured by a camera unit; and output devices such as a monitor or display. The output device may also be a device or terminal that transmits information for output to an external monitor, display, printer, or other equipment.
Various programs and applications (modules) are stored in the main storage device, and each functional element of the overall system is realized by the processor executing these programs and applications. Each of these modules may instead be implemented in hardware, for example by integration. Each module may be an independent program or application, or may be implemented as a subprogram or function within a single integrated program or application.
In this specification, each module is described as the subject that performs processing, but in practice the processor executing the programs and applications (modules) performs that processing.
Various databases (DBs) are stored in the auxiliary storage device. A "database" is a functional element (storage unit) that stores a data set so that it can handle arbitrary data operations (for example, extraction, addition, deletion, overwriting, etc.) from a processor or an external computer. The method of implementing the database is not limited, and may be, for example, a database management system, spreadsheet software, or a text file such as XML or JSON.
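The text-file option mentioned above can be illustrated with a minimal JSON-backed store supporting the listed operations (extraction, addition, deletion, overwriting). The class name and file layout are assumptions for the sketch, not part of the specification.

```python
import json
import os
import tempfile

class JsonStore:
    """A minimal 'database' persisted as a JSON text file: a dict of
    records supporting get/put/delete, saved back to disk on change."""

    def __init__(self, path):
        self.path = path
        if os.path.exists(path):
            with open(path, encoding="utf-8") as f:
                self.data = json.load(f)
        else:
            self.data = {}

    def put(self, key, value):
        self.data[key] = value  # add or overwrite
        self._save()

    def get(self, key, default=None):
        return self.data.get(key, default)  # extract

    def delete(self, key):
        self.data.pop(key, None)
        self._save()

    def _save(self):
        with open(self.path, "w", encoding="utf-8") as f:
            json.dump(self.data, f, ensure_ascii=False)

# Usage: persist a hypothetical field record and read it back.
path = os.path.join(tempfile.gettempdir(), "field_db_demo.json")
db = JsonStore(path)
db.put("field_802", {"name": "North paddy", "area_ha": 1.2})
assert JsonStore(path).get("field_802")["area_ha"] == 1.2
```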
The mobile terminal 701 may be referred to as an information processing device, and the management server 702 may be referred to as an information processing device.
FIG. 8 is an example of the field information display screen 800 displayed on the mobile terminal 701.
The screen display module 1011 of the mobile terminal 701 acquires the map information 1200 and the field management information 1300 stored in the mobile terminal 701, generates the field information display screen 800, and outputs the field information display screen 800 to the output device 1005 such as a screen.
The screen display module 1011 may instead be configured to acquire the map information 1200 and the field management information 1300 stored in the management server 702 via the network and generate the field information display screen 800.
A map 801 is displayed as the background of the field information display screen 800. On it, an anchor 805 is displayed on each of the fields 802, 803, and 804, indicating that information for those fields is registered in the field management information 1300.
A field is a rice paddy, crop field, or the like that is the target of drug spraying by the drone 100. In practice, the terrain of a field is complicated: a topographic map may not be available in advance, or the map may disagree with conditions on site. Fields are usually adjacent to houses, hospitals, schools, fields of other crops, roads, railroads, and the like, and obstructions such as buildings and electric wires may be present within a field. The field is one example of a target area for drug spraying.
When the screen display module 1011 receives a selection of the field 802 from the user via the input device 1004, for example by a tap on the screen, it acquires the information corresponding to the field 802 from the field management information 1300 and displays it in the field information display area 810. The screen display module 1011 also highlights the selected field 802, for example by changing its outline to a thick, brightly colored line.
In the field information display area 810, information acquired from the field management information 1300, such as the field name 811, the address 812, the area 813, and the planted crop name 814, is displayed.
Information related to drug spraying is displayed in the spraying information display area 820. The drug to be sprayed varies with the planted crop name 814, the spraying season, and so on; information on drugs due to be sprayed soon is acquired from the drug management information 1600 and displayed.
The spraying information display area 820 displays information related to drug spraying acquired or calculated by the spraying-related information management module 1114 of the management server 702, for example the drug name, spray amount, dilution amount, and energy amount required for the spraying flight over the field.
In the state 830, information such as "surveyed" or "with flight path" is displayed as the current state for the selected field 802.
Information on the latest spray flight date and time is displayed in the latest flight date and time 840.
The flight status display field 850 displays the current status of the drone's spray flight.
The compass 861 indicates the orientation of the displayed map 801.
When the field-wide display button 862 is selected, the screen display module 1011 changes the scale of the display so that the selected field fills the screen.
When the current location movement button 863 is selected, the screen display module 1011 changes the display so that the current location acquired by the GPS of the mobile terminal 701 becomes the center of the screen.
When the schedule display button 870 is selected, the screen display module 1011 displays the drug spraying schedule for the day.
FIG. 9 is an example of the drone operation screen 900 displayed on the mobile terminal 701.
The drone battery display 901 displays the current remaining battery level of the drone.
At the drone position 902, the current position information of the drone 100 is displayed.
The spray flight progress information 912 displays the progress information of the current spray flight. For example, the progress of the flight route of the spray flight, the remaining amount of the sprayed drug, the remaining battery level, etc. are displayed.
In the flight status display field 921, the current status of the spray flight of the drone 100 is displayed.
In the message display field 922, a message indicating the communication content with the drone 100, the flight status, and the like is displayed.
The altitude change buttons 923 and 924 are buttons for changing the flight altitude of the drone 100. Press the minus to lower the altitude, and press the plus to raise the altitude.
The emergency stop button 925 is a button for stopping the flying drone 100 in an emergency. In addition to a pause in which the drone hovers in place, options such as returning to the flight start point or stopping the motors immediately on the spot can also be displayed.
In the example of the drone operation screen 900, the field 930 targeted for drug spraying is displayed on the map, together with the flight path 931 of the spraying flight over the field 930. The drone 100 flies to the designated flight coordinates in order, according to the flight route management information 1800 stored in the mobile terminal 701 or the management server 702.
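Flying the designated coordinates in order can be sketched as a simple sequential waypoint follower. This kinematics-free illustration is an assumption for exposition; the function name, step size, and arrival tolerance are not from the specification.

```python
import math

def fly_route(waypoints, position, step=1.0, tolerance=0.5):
    """Visit 2-D waypoints in order: advance toward the current target
    in increments of `step` metres until within `tolerance`, then
    switch to the next waypoint. Returns the waypoints reached."""
    visited = []
    for target in waypoints:
        while math.dist(position, target) > tolerance:
            dx, dy = target[0] - position[0], target[1] - position[1]
            d = math.hypot(dx, dy)
            move = min(step, d)  # do not overshoot the target
            position = (position[0] + move * dx / d,
                        position[1] + move * dy / d)
        visited.append(target)
    return visited

route = [(0.0, 5.0), (10.0, 5.0)]
assert fly_route(route, position=(0.0, 0.0)) == route
```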
When an operation that requires acting on the drone 100 is received, such as the altitude change buttons 923 and 924 or the emergency stop button 925, the drone operation module 1012 sends information such as the corresponding commands to the drone 100, allowing the drone 100 to be operated.
The next spraying schedule display button 940 displays the schedule of the spraying flight that follows the one currently being executed. When this button is pressed, information about the next spraying flight, obtained from the schedule management information 1900, is displayed.
FIG. 10 is an example of the hardware configuration of the mobile terminal 701.
The mobile terminal 701 is, for example, a terminal such as a tablet, a smartphone, or a head-mounted display.
The main storage device 1001 stores programs and applications such as the screen display module 1011, the drone operation module 1012, and the schedule management module 1013; each functional element of the mobile terminal 701 is realized by the processor 1003 executing these programs and applications.
The screen display module 1011 displays the field information display screen 800 and the drone operation screen 900 on an output device 1005 such as a display panel.
When the drone operation module 1012 receives user operations such as the altitude change buttons 923 and 924 or the emergency stop button 925, it sends information such as the corresponding commands to the drone 100 to operate the drone's flight.
The schedule management module 1013 manages the schedule of each spray flight when the spray flights are continuously performed in a plurality of fields.
The auxiliary storage device 1002 stores various information such as map information 1200, field management information 1300, device management information 1400, user management information 1500, drug management information 1600, energy management information 1700, flight route management information 1800, and schedule management information 1900. Remember.
FIG. 11 is an example of the hardware configuration of the management server 702.
The management server 702 is composed of, for example, a server arranged on the cloud.
The main storage device 1101 stores a screen output module 1111, a flight management module 1112, a user/device management module 1113, a spraying-related information management module 1114, a flight route management module 1115, and a schedule management module 1116; the processor 1103 executes these programs and applications to realize each functional element of the management server 702.
The screen output module 1111 extracts and generates the information needed to display the field information display screen 800 and the drone operation screen 900, and transmits it to the mobile terminal 701. Alternatively, the screen information itself may be generated and then displayed on the mobile terminal 701 or the like.
The flight management module 1112 manages the spraying flights of the drone 100 based on information such as the field management information 1300 and the flight route management information 1800.
The user/device management module 1113 registers and manages information about the users of the drone 100 in the user management information 1500.
The spraying-related information management module 1114 manages the quantities required for a spraying flight, such as the amount of chemical to be sprayed, the amount of chemical and of diluted solution, the amount of water required for dilution, and the amount of energy such as the number of batteries.
The flight route management module 1115 calculates the flight route of the spraying flight of the drone 100 based on the field management information 1300.
The schedule management module 1116 generates and manages schedules of spraying flights that span a plurality of fields or a plurality of days. A generated chemical spraying schedule is stored in the schedule management information 1900.
The auxiliary storage device 1102 stores various information such as map information 1200, field management information 1300, device management information 1400, user management information 1500, drug management information 1600, energy management information 1700, flight route management information 1800, and schedule management information 1900.
Although the mobile terminal 701 and the management server 702 store the same information, the two copies may be kept synchronized, or one may simply be a copy of the other. Alternatively, some or all of the information may be stored on the management server 702, and the mobile terminal 701 may download it from the management server 702 as needed.
FIG. 12 is an example of the hardware configuration of the management terminal 703.
The management terminal 703 is, for example, a desktop PC, a notebook PC, or a tablet.
The main storage device 1201 stores programs and applications such as a drone setting module 1211 and a management server setting module 1212; the processor 1203 executes these programs and applications to realize each functional element of the management terminal 703.
The drone setting module 1211 performs various operations and settings for the drone 100, such as spraying flight settings and initial settings.
The management server setting module 1212 performs various settings for the management server 702, such as its initial settings.
The auxiliary storage device 1202 stores various information such as drone setting information 1221 and management server setting information 1222.
FIG. 13 is an example of the field management information 1300.
The field management information 1300 stores various information about the fields on which chemicals are to be sprayed, such as a field ID, a field name, a field position, field perimeter coordinates, a field area, and the cultivated crop. The field management information 1300 may also be referred to simply as field information.
The field ID is identification information that uniquely identifies a field.
The field position 1311 indicates the position coordinates of the field and holds, for example, the latitude and longitude of the center of the field.
The field perimeter coordinates 1312 indicate the coordinates of the perimeter of the field; for a quadrangular field, for example, these are the position coordinates of its four corners. The sample value GC007 indicates a link to information in which the position coordinates are stored consecutively, separated by commas or the like.
The field area 1313 is the total area of the field corresponding to the field ID.
The cultivated crop 1314 stores information identifying the crop or the like cultivated in the field.
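As a non-limiting illustration, the field area 1313 could be derived from the field perimeter coordinates 1312 described above. The sketch below is an assumption, not part of the disclosure: it projects latitude/longitude corners onto a local plane and applies the shoelace formula, which is adequate for field-sized polygons only.

```python
import math

def field_area_m2(corners):
    """Approximate area of a field polygon whose perimeter is given as
    (latitude, longitude) pairs, as in the field perimeter coordinates 1312.
    Local equirectangular projection + shoelace formula (illustrative)."""
    lat0 = sum(lat for lat, _ in corners) / len(corners)
    m_per_deg_lat = 111_320.0  # rough metres per degree of latitude
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(lat0))
    pts = [(lon * m_per_deg_lon, lat * m_per_deg_lat) for lat, lon in corners]
    area = 0.0
    for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# A roughly 100 m x 100 m square field near 35 deg N (hypothetical values):
sq = [(35.0000, 139.0000), (35.0009, 139.0000),
      (35.0009, 139.0011), (35.0000, 139.0011)]
```

For fields spanning more than a few kilometres, a proper geodetic library would be preferable to this planar approximation.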
FIG. 14 is an example of the device management information 1400.
The device management information 1400 stores information for managing the drones 100, such as a device ID, a device name, a model number, specifications, a user, energy, and a possible flight time.
The device ID is identification information that uniquely identifies a drone 100.
The user field holds information on the user currently using the drone 100, and stores the corresponding user ID from the user management information 1500.
The energy 1411 is information about the energy that can be mounted on the drone 100, and stores the corresponding energy ID from the energy management information 1700.
The possible flight time 1412 indicates how long the drone 100 can fly on the energy it can carry. For example, it stores information such as being able to fly for 15 minutes on one set of two batteries.
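From the possible flight time 1412 (e.g. 15 minutes per set of two batteries), the number of battery sets needed for a planned flight could be computed as follows. This helper is a hypothetical sketch; the tables described above only store the flight-time value itself.

```python
import math

def battery_sets_needed(flight_minutes: float,
                        minutes_per_set: float = 15.0) -> int:
    """Number of battery sets required for a spraying flight, rounding up
    so that the last partial interval still gets a full set."""
    return math.ceil(flight_minutes / minutes_per_set)

assert battery_sets_needed(40.0) == 3   # 40 min at 15 min per set
assert battery_sets_needed(15.0) == 1   # exactly one set's worth
```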
FIG. 15 is an example of the user management information 1500.
The user management information 1500 stores information on the users who operate the drones 100, such as a user ID, a user display ID, a name, an e-mail address, a date of birth, and a gender.
The user ID is identification information that uniquely identifies a user.
The user display ID is the user information displayed on the mobile terminal 701 or the like, for example, a nickname registered by the user.
FIG. 16 is an example of the drug management information 1600.
The drug management information 1600 stores information on the chemicals to be sprayed, such as a drug ID, a drug name, a product number, specifications, a dilution rate, and a spraying amount.
The drug ID is identification information that uniquely identifies a chemical.
The drug name 1602 indicates the name of a product sprayed on a field as a liquid, powder, or fine particles, such as a pesticide, herbicide, liquid fertilizer, insecticide, or seed.
The specifications 1603 store information such as how to use and dilute the chemical, the target crops, and the spraying method; dilution and spraying are carried out according to the contents described in the specifications 1603.
The dilution rate 1604 stores the ratio at which the chemical is to be diluted, for example the ratio of chemical to water, or the amounts of chemical and water used for dilution.
The spraying amount 1605 stores the amount of diluted chemical (spraying chemical) to be sprayed. For example, it indicates that 10 L of spraying chemical is to be sprayed per hectare.
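Combining the dilution rate 1604 and the spraying amount 1605 with a field area gives the quantities to prepare before a flight. The following is a minimal sketch under the assumption that the dilution rate is expressed as chemical-to-water parts; the function name and signature are illustrative, not taken from the disclosure.

```python
def spraying_plan(field_area_ha: float, spray_l_per_ha: float,
                  chemical_parts: float, water_parts: float):
    """Return (total diluted spray, chemical, water) volumes in litres
    for a field, given a chemical:water dilution ratio."""
    total_spray_l = field_area_ha * spray_l_per_ha
    total_parts = chemical_parts + water_parts
    chemical_l = total_spray_l * chemical_parts / total_parts
    water_l = total_spray_l * water_parts / total_parts
    return total_spray_l, chemical_l, water_l

# 2 ha at 10 L/ha with a 1:9 chemical-to-water dilution:
total, chem, water = spraying_plan(2.0, 10.0, 1.0, 9.0)
assert (total, chem, water) == (20.0, 2.0, 18.0)
```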
FIG. 17 is an example of the energy management information 1700.
The energy management information 1700 stores information on the energy, such as batteries, required for the flight of the drone 100, and holds information such as an energy ID, an energy name, a model number, a type, and specifications.
The energy ID is identification information that uniquely identifies an energy source.
The type indicates the kind of energy; for example, battery, gasoline, or jet fuel is stored.
FIG. 18 is an example of the flight route management information 1800.
The flight route management information 1800 stores information indicating the flight routes of the drone 100, such as a route ID, a target ID, route coordinates, and a total route distance.
The route ID is identification information that uniquely identifies a flight route.
The target ID is information specifying the target for which the flight route was calculated, such as a field or a movement route between fields. For example, farm003 indicates that the target is a field, and route002 indicates that the target is a movement route outside the fields.
The route coordinates 1811 are a link to information indicating the route coordinates of a flight; the route coordinates are expressed, for example, as a sequence of consecutive position coordinates. A position coordinate may be a combination of latitude and longitude, or of latitude, longitude, and altitude.
The total route distance 1812 indicates the total distance flown when the entire flight route is covered from the start of the flight to the end of the schedule.
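The total route distance 1812 can be derived from the route coordinates 1811 by summing consecutive leg lengths. A minimal sketch follows, assuming latitude/longitude pairs and ignoring altitude; the haversine formula used here is one standard choice, not something specified by the disclosure.

```python
import math

def haversine_m(p1, p2):
    """Great-circle distance in metres between two (lat, lon) points."""
    r = 6_371_000.0  # mean Earth radius in metres
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = math.sin((lat2 - lat1) / 2) ** 2 + \
        math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def total_route_distance_m(route):
    """Sum of leg lengths over consecutive route coordinates, as in the
    total route distance 1812 (altitude is ignored in this sketch)."""
    return sum(haversine_m(a, b) for a, b in zip(route, route[1:]))

# Two legs of roughly 100 m each (hypothetical coordinates):
route = [(35.0000, 139.0000), (35.0009, 139.0000), (35.0009, 139.0011)]
```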
FIG. 19 is an example of the schedule management information 1900.
The schedule management information 1900 is information that defines the schedule for spraying flights over a plurality of fields, and stores information such as a schedule ID, a schedule name, a date and time, a start location, and a schedule.
The schedule 1901 stores information specifying the fields on which spraying flights are to be performed, the movement routes between fields, and so on. In the example of the sample value, the drone first flies the two fields specified by farm006 and farm005, then flies the movement route indicated by route001, then flies the field specified by farm003, and, after the other event specified by other001 (for example, a lunch break) has passed, flies the field specified by farm002.
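The sequence stored in the schedule 1901 could be interpreted as follows. This parser is a sketch based only on the sample values quoted above (farmNNN, routeNNN, otherNNN); the actual encoding of the schedule is not specified by the disclosure.

```python
def parse_schedule(items):
    """Split a schedule 1901 entry into typed steps, following the sample
    values: 'farm...' = field spraying, 'route...' = movement route,
    'other...' = other event such as a lunch break."""
    steps = []
    for item in items:
        if item.startswith("farm"):
            steps.append(("spray_field", item))
        elif item.startswith("route"):
            steps.append(("move", item))
        elif item.startswith("other"):
            steps.append(("event", item))
        else:
            raise ValueError(f"unknown schedule item: {item}")
    return steps

# The sample schedule from the text:
steps = parse_schedule(["farm006", "farm005", "route001",
                        "farm003", "other001", "farm002"])
assert steps[0] == ("spray_field", "farm006")
assert steps[4] == ("event", "other001")
```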
The spraying-related information 1902 stores the total amount of chemical to be sprayed, the amount of dilution, the amount of energy, and so on for the entire schedule. The amount of chemical sprayed, the amount of dilution, the amount of energy, and the like may also be stored for each individual field.
This way of defining a schedule is only an example, and other schedule management methods may be used.
In the embodiments of the present invention, a drone 100 equipped with a camera (imaging device) is used to diagnose the growth of crops in a field.
As a concrete example of crop growth diagnosis, one may focus on the light reactions of photosynthesis driven by the chlorophyll contained in the crop. The wavelength range in which chlorophyll can be used for photosynthesis is approximately 400 to 700 nm. Within that range, the contribution to photosynthesis differs by wavelength band: the absorption spectrum (distribution) of chlorophyll is relatively high in the short-wavelength (blue) and long-wavelength (red) bands, and tends to be relatively low in the middle-wavelength (green) band. Focusing on the long-wavelength band, for example, a crop in which photosynthesis is active absorbs strongly through its chlorophyll, so its reflectance in the long-wavelength band is low. Conversely, a crop in which photosynthesis is not active absorbs weakly, so its reflectance in the long-wavelength band is high. Therefore, by detecting the amount of light reflected from the crop at each wavelength, and its proportions, from images of the crops in the field, the growth of the crops can be diagnosed.
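The reflectance reasoning above can be sketched as a simple per-pixel index. This is an illustrative assumption, not the index claimed in the patent: it merely encodes "low red reflectance relative to green suggests active photosynthesis", with reflectances given as floats in [0, 1].

```python
def green_red_index(green: float, red: float) -> float:
    """Normalized green-red difference: higher values correspond to
    stronger red absorption by chlorophyll (active photosynthesis)."""
    denom = green + red
    if denom == 0.0:
        return 0.0
    return (green - red) / denom

def diagnose_pixels(pixels):
    """pixels: iterable of (green, red) reflectance pairs.
    Returns the mean index over the image region."""
    values = [green_red_index(g, r) for g, r in pixels]
    return sum(values) / len(values) if values else 0.0

# Active photosynthesis: red reflectance is low relative to green.
active = diagnose_pixels([(0.30, 0.05), (0.28, 0.06)])
# Weak photosynthesis: red reflectance is comparatively high.
weak = diagnose_pixels([(0.25, 0.20), (0.24, 0.22)])
assert active > weak
```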
FIG. 20(A) is an example of a conceptual diagram of diagnosing crop growth with the camera 512 of the flying drone 100 on the basis of the amount of light (green, red, and so on) reflected from the crops at each wavelength.
The drone 100 photographs crops on the ground from a position at a predetermined altitude, with the camera 512 oriented at a predetermined angle (not limited to straight down). Since the range a camera can capture at one time is limited, the fields of view B1 and B2 of a predetermined extent are photographed in succession so that the crops in the field are covered completely.
For example, in region A1 an image can be captured in which the two kinds of crops 11 and 12 have relatively high reflectance in the longer-wavelength band, whereas in the adjacent region A2 an image can be captured in which the two different kinds of crops 13 and 14 have relatively low reflectance in the longer-wavelength band. The captured images of the regions A1 and A2 may be combined in post-processing, enabling growth diagnosis over the entire field.
The crop growth diagnosis according to the embodiments of the present invention is not limited to the above, and the number of cameras 512 used is not limited to one.
For example, besides the amount of light reflected from the crop at each wavelength, growth diagnosis may target the growth height of the crop (Z-axis direction) and/or the growth extent of the crop (X-axis and Y-axis directions), or the size and/or number of stems, leaves, fruits, or rice grains. In this case, a plurality of cameras 512 may be used to acquire two-dimensional or three-dimensional information from images of the crops in the field and perform an appropriate growth diagnosis of the growth height, growth extent, and so on. Furthermore, growth diagnosis may be based on whether or not the crop is diseased.
The preprocessing carried out on the drone 100 for crop growth diagnosis may include, for example:
- labeling a captured image with its shooting information;
- detecting, from a captured image, the intensity of the light reflected from the crop at each wavelength;
- judging the degree of growth of the crop from a captured image;
- associating, on the basis of a captured image, the position information at which the image was acquired with the degree of growth of the crop.
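The labeling step above can be sketched as attaching a metadata record to each image. The schema below is hypothetical; the disclosure does not fix the field names or format.

```python
import datetime

def label_image(image_id: str, position, altitude_m: float,
                attitude_deg, timestamp=None) -> dict:
    """Attach shooting information to a captured image (hypothetical schema).
    position: (latitude, longitude); attitude_deg: (roll, pitch, yaw)."""
    ts = timestamp or datetime.datetime.now(datetime.timezone.utc)
    return {
        "image_id": image_id,
        "timestamp": ts.isoformat(),
        "position": {"lat": position[0], "lon": position[1]},
        "altitude_m": altitude_m,
        "attitude_deg": {"roll": attitude_deg[0],
                         "pitch": attitude_deg[1],
                         "yaw": attitude_deg[2]},
    }

label = label_image("IMG_0001", (35.68, 139.76), 2.0, (0.5, -0.3, 90.0))
```

Further keys (temperature, humidity, weather, camera state, and so on, as listed later in the text) could be added to the same record.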
FIG. 20(B) is an example of a conceptual diagram of diagnosing crop growth with the camera 512 of the flying drone 100 on the basis of the growth height of the crops (Z-axis direction) and/or the growth extent of the crops (X-axis and Y-axis directions).
The drone 100 photographs crops on the ground from a position at a predetermined altitude, with the camera 512 at a predetermined angle (not limited to straight down). Since the range a camera can capture at one time is limited, the fields of view B3 and B4 of a predetermined extent are photographed in succession so that the crops in the field are covered completely.
For example, in region A3 the crops 15 and 16, whose height Z1 is relatively low but whose spread L1 is relatively wide, can be photographed, whereas in the adjacent region A4 the crop 17, whose height Z2 is relatively high but whose spread L2 is relatively narrow, can be photographed. The captured images of the regions A3 and A4 may be combined in post-processing, enabling growth diagnosis over the entire field.
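Height and spread as in FIG. 20(B) could be summarized from a per-cell height map, for instance one reconstructed from stereo images. The sketch below is an assumption about how such metrics might be computed, not a method stated in the disclosure.

```python
def canopy_metrics(height_map, cell_size_m: float,
                   ground_level_m: float = 0.0, min_height_m: float = 0.05):
    """height_map: 2-D list of per-cell surface heights in metres.
    Returns (peak crop height, planted area in m^2); cells below
    min_height_m above ground are treated as bare soil."""
    peak = 0.0
    cells = 0
    for row in height_map:
        for h in row:
            h -= ground_level_m
            if h >= min_height_m:
                cells += 1
                peak = max(peak, h)
    return peak, cells * cell_size_m ** 2

# Toy 3x3 height map with 10 cm cells (hypothetical values):
hm = [[0.0, 0.4, 0.5],
      [0.0, 0.6, 0.5],
      [0.0, 0.0, 0.0]]
peak, area = canopy_metrics(hm, cell_size_m=0.1)
```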
The crop growth diagnosis is not limited to the contents illustrated in FIGS. 20(A) and 20(B). Moreover, any kind of camera appropriate for diagnosing the growth of the various crops can be used.
For example, the camera 512 is not limited to a digital camera for still images; a video camera, or a camera for both video and still images, may be used.
It is also possible to use a panoramic camera that captures an extremely wide field (angle of view) in a single frame.
It is also possible to use a stereo camera that photographs the subject from a plurality of different directions simultaneously and can thereby also record depth information.
Further, an infrared camera, an ultraviolet camera, an X-ray camera, or the like can be used, alone or in combination, as needed.
Hereinafter, as a specific example, a case will be described in which one or more digital cameras are used to take still images of the crops in a field and growth diagnosis is performed on the basis of the color of the crops; it should be understood, however, that the scope of the present invention is not limited to this.
Typically, in crop growth diagnosis, the crops in the field are photographed with one or more digital still cameras. This may involve the following operations:
- photographing the crops in the field with the camera;
- flagging captured images as invalid or valid;
- storing (saving) the captured images;
- preprocessing the captured images;
- flagging the preprocessed data as invalid or valid;
- storing the preprocessed data;
- transmitting the preprocessed data to the outside; and so on.
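The workflow above can be sketched as one capture cycle. All function arguments here are hypothetical stand-ins for the actual camera-system components, used only to make the ordering of the steps concrete.

```python
def run_capture_pipeline(camera, preprocess, store, transmit, is_valid):
    """One cycle of the capture workflow: photograph, flag, store,
    preprocess, store, transmit. Invalid images are dropped early."""
    image = camera()              # photograph the field
    if not is_valid(image):      # validity flag check
        return None
    store(image)                  # save the raw capture
    data = preprocess(image)      # e.g. per-pixel reflectance analysis
    store(data)                   # save the preprocessed result
    transmit(data)                # send to the management server
    return data

# Toy usage with in-memory stand-ins:
saved, sent = [], []
result = run_capture_pipeline(
    camera=lambda: {"pixels": [0.25, 0.75], "blurred": False},
    preprocess=lambda img: {"mean": sum(img["pixels"]) / len(img["pixels"])},
    store=saved.append,
    transmit=sent.append,
    is_valid=lambda img: not img["blurred"],
)
```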
In order to perform growth diagnosis properly, images of high quality must be acquired. For this reason, the flight state of the drone 100 is a particular concern. For example, if the shooting angle or shooting distance during the flight of the drone 100 is poor, or if the drone flies in a time period or under weather conditions unsuitable for growth diagnosis, the quality of the captured images deteriorates, and it may become difficult to perform growth diagnosis with the required accuracy. Therefore, when diagnosing the growth of crops in a field with the drone 100, it is preferable to classify the flight state of the drone 100 at the time each image was captured according to a predetermined classification, and to extract and use only suitable images captured while the flight state was good.
In addition, a camera mounted on the drone 100 has the problem that the data volume produced at the time of shooting is relatively large. The burden is therefore comparatively heavy in each of the following respects: the capacity of the storage medium that stores the captured images, the processing speed of the control unit that preprocesses them, and the speed of the transmission unit that transmits the preprocessed data to the outside. If images unsuitable for growth diagnosis are stored, preprocessed, or transmitted, an unnecessary load is placed on the storage medium and other components, resulting in waste. Therefore, when diagnosing the growth of crops in a field with the drone 100, it is preferable to automatically exclude images unsuitable for growth diagnosis from storage and the like.
It is preferable to reduce the data volume by preprocessing the captured images and storing or transmitting the preprocessed data instead of the raw data. Preprocessing refers to the various kinds of image processing involved in crop growth diagnosis. It may include, for example, detecting the proportion or amount of green or red light reflected by the plants, or analyzing the two-dimensional or three-dimensional distribution of the plants. Preferably, a predetermined image analysis is performed on the captured image for each pixel or each group of pixels.
Preprocessing may also include labeling an image with its shooting information, that is, associating the shooting information with the image. The shooting information may include, for example, the state of the field at the time of shooting, the time, the temperature, the humidity, the weather, the state of the camera at the time of shooting, the flight state of the drone, and the state of the captured image.
As one form of image storage control, it is conceivable to store a captured image when flight states of the aircraft (including a drone), such as position, altitude, speed, and attitude, satisfy preset conditions. In the case of a drone that flies autonomously along a preset flight route, however, these conditions alone may not be sufficient to judge whether a captured image should be stored. As another form of image storage control, it is conceivable to select and store only the images suitable for diagnosing the growth of the crops in the field. However, apart from excluding images from storage, such approaches may not provide an adequate solution.
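A storage-control predicate of the kind discussed above might look as follows. The field names and thresholds are illustrative assumptions, not values taken from the patent; the point is only that route conformance can be checked alongside the conventional position/altitude/speed/attitude conditions.

```python
def should_store(state: dict) -> bool:
    """Decide whether an image captured in the given flight state is worth
    storing (sketch with hypothetical thresholds)."""
    on_route = state.get("on_planned_route", False)   # autonomous-flight check
    level = (abs(state.get("roll_deg", 0.0)) < 5.0 and
             abs(state.get("pitch_deg", 0.0)) < 5.0)  # near-level attitude
    altitude_ok = 1.5 <= state.get("altitude_m", 0.0) <= 4.0
    slow = state.get("speed_mps", 0.0) <= 3.0         # limits motion blur
    return on_route and level and altitude_ok and slow

good = {"on_planned_route": True, "roll_deg": 1.2, "pitch_deg": -0.8,
        "altitude_m": 2.0, "speed_mps": 1.5}
banked = dict(good, roll_deg=12.0)   # same flight, but banking sharply
assert should_store(good) and not should_store(banked)
```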
The camera system of the drone 100 used for diagnosing the growth of crops in the field is described in detail below.
For example, as illustrated in FIG. 6, the drone 100 can include a multispectral camera 512. The multispectral camera 512 includes a lens and an imaging unit, and can photograph the crops in the field from a predetermined height.
The drone 100 also includes a flight controller 501 that controls the drone as a whole, and the output of the multispectral camera 512 (FIG. 6) can be transmitted to the flight controller 501.
The flight controller 501 serves as the processor (processing device) that controls the flight of the drone, and can also preprocess the images acquired by the multispectral camera 512.
As illustrated in FIG. 7, the preprocessed data and the like can be stored on the drone 100 and can also be transmitted to the external management server 702, the mobile terminal 701, and the like.
FIG. 21 shows an example in which a dedicated camera system 20 is externally attached to the drone 100. In this case as well, the camera system 20 has a camera 21 that includes a lens and an imaging unit.
The camera system 20 also has a camera controller (camera control unit) 22 that performs overall control of the camera 21, and may include a camera capture unit 23 that issues shooting instructions and the like to the camera.
The camera system 20 also has an image processing unit (image processor) 24 that preprocesses the images taken by the camera.
The camera system 20 further has a storage device 25 (a non-volatile memory card such as an SD card, an SSD (Solid State Drive), a hard disk, or the like) in which it can store the captured images, the data of preprocessed images, and so on.
The camera system 20 illustrated in FIG. 21 further has a transmission device 26, and can transmit raw data of the captured images, data of the preprocessed images, and the like to an external management server or the like.
For example, it can connect to a cloud device 28 such as the management server 702 by using application software 27 that synchronizes files and directories between remote locations.
The camera system 20 further includes a connection unit for connecting to the drone 100, which comprises a main body and a plurality of rotors. The camera system 20 may be connected when the drone 100 is manufactured, or may be retrofitted to the drone 100.
The camera system 20 illustrated in FIG. 21 can further be connected to the flight controller 501 (see FIG. 6). The camera system 20 can therefore transmit data such as captured images to the flight controller 501.
The camera system 20 can also receive information such as the position, speed, attitude, time, flight status (mission status), and altitude above ground of the drone 100 from the flight controller 501.
In the following, the camera of the drone 100 used for diagnosing the growth of crops in the field may include both the multispectral camera 512 illustrated in FIG. 6 and the camera system 20 illustrated in FIG. 21. The latter comprises (1) a camera, (2) a control unit that preprocesses the images acquired from the camera, (3) a transmission unit that transmits the preprocessed data to the outside, and (4) a connection unit for connecting to the drone, which comprises a main body and a plurality of rotors; these are housed in a single housing and attached to the drone 100.
The flight state of the drone 100 may be conveyed from the flight controller 501 of the drone to the camera control unit 22 via the connection portion. Alternatively, the camera system 20 itself, or an external module separate from the flight controller 501, may have a plurality of sensors and determine the flight state of the drone 100. Further, information on the flight state may be received from the management server 702 or the mobile terminal 701 via the transmission device 26 of the camera system 20.
As described above, various data regarding the drone 100 are sent to the flight controller 501.
For example, the actual rotation speeds of the motors 102-1a, 102-1b, 102-2a, 102-2b, 102-3a, 102-3b, 104-a, and 104-b are fed back to the flight controller 501 (see FIG. 6).
Further, by combining signals from an RTK base station with signals from GPS positioning satellites, the flight controller 501 can measure the absolute position of the drone 100 with an accuracy of about several centimeters (see 504-1, 504-2, and 504-3 in FIG. 6).
Further, using the 6-axis gyro sensor 505, the flight controller 501 can calculate the acceleration and velocity of the drone airframe in three mutually orthogonal directions, and can measure changes in the attitude angle of the airframe in those three directions, that is, the angular velocity (see FIG. 6).
Further, using the geomagnetic sensor 506, the flight controller 501 can determine the heading of the drone airframe by measuring geomagnetism (see FIG. 6).
Further, the barometric pressure sensor 507 can measure atmospheric pressure and thereby indirectly measure the altitude of the drone (see FIG. 6).
Further, using the laser sensor 508 and the sonar 509, the flight controller 501 can measure the distance between the drone airframe and the ground surface by using the reflection of sound waves such as ultrasonic waves (see FIG. 6).
Accordingly, information such as the position, altitude, speed, and attitude of the drone 100 is provided to the flight controller 501 based on signals from the drone's sensors. Based on this information, the control unit, that is, the flight controller 501 (FIG. 6) or the camera controller 22 of the camera system 20 (FIG. 21), can determine whether or not the drone 100 is in a predetermined flight state.
Further, the drone 100 is wirelessly connected to an external mobile terminal 701 or management server 702 (see FIG. 7), and flies through the designated flight coordinates in order in accordance with the flight route management information 1800 (see FIGS. 10, 11, and 18) stored in the mobile terminal 701 or the management server 702.
At this time, the flight management module 1112 of the management server 702 manages the spraying flight of the drone 100 based on information such as the field management information 1300 and the flight route management information 1800 (see FIG. 11).
Further, the flight route management module 1115 of the management server 702 calculates the flight route of the spraying flight of the drone 100 based on the field management information 1300 (see FIG. 11).
As described above, the field management information 1300 stores various information about the field in which chemicals are to be sprayed, such as the field ID, field name, field position, field perimeter coordinates, field area, and planted crops (see FIG. 13).
Further, the flight route management information 1800 stores information indicating the flight route of the drone 100, such as the route ID, target ID, route coordinates, and total route distance (see FIG. 18).
Accordingly, the drone 100 can fly autonomously in accordance with instructions from the management server 702, and the flight controller 501 can determine the flight state based on those signals.
Further, apart from instructions from the management server 702, the drone 100 can perform basic operations such as takeoff and temporary return via the mobile terminal 701 (see FIG. 7). In an emergency, the drone 100 can also be operated manually via the mobile terminal 701.
For example, a pause in which the drone 100 hovers can be requested via the emergency stop button 925 (FIG. 9) on the drone operation screen 900 displayed on the mobile terminal 701. The same applies to an option in which the drone 100 returns to the flight start point (temporary return and the like) and an option in which the motors are stopped immediately on the spot (landing and the like).
In response to user operations on the drone 100, such as the altitude change buttons 923 and 924 or the emergency stop button 925, the drone operation module 1012 (see FIG. 10) transmits information such as commands corresponding to these operations to the drone 100 and thereby operates the drone 100.
Accordingly, the drone 100 can leave autonomous flight in accordance with instructions from the mobile terminal 701, and the flight controller 501 can determine that state based on those signals.
FIG. 22 is a conceptual diagram of the flight path of the drone 100 for diagnosing the growth of crops in a field.
For ease of understanding, the field is shown schematically as a square with four sides of equal length; the horizontal axis is divided equally from X0 to X10, and the vertical axis is divided equally from Y0 to Y10. It is assumed that one image is captured for each cell defined by the vertical and horizontal axes. In actual shooting, however, adjacent captured images may partially overlap.
For example, starting from the start point (S) in the lower-left area (X0, X1, Y0, Y1), the drone 100 first flies along the outside of the field, following its four sides as shown by the outer arrows.
Next, after making almost one full circuit along the edge of the field and reaching the area (X0, X1, Y1, Y2) just before the start point (S), the drone 100 flies in a zigzag (meandering) pattern over the inner area, excluding the outer area it has already covered, so as to photograph all the crops in the field without omission.
Upon reaching the goal point (G) in the final area (X8, X9, Y1, Y2) at the lower right, the drone 100 completes the shooting work and leaves the field.
Note that the shape of the field over which the drone 100 actually flies is not limited to that illustrated in FIG. 22. Likewise, the captured images (cells) are not limited to squares.
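The flight pattern described above, a circuit of the outer cells followed by a zigzag (boustrophedon) sweep of the interior, can be sketched as follows. This is only an illustrative simplification, not the actual route planner of the flight route management module 1115; the cell indexing (lower-left corner of each area) is an assumption for the example.

```python
def perimeter_then_zigzag(n):
    """Visit the cells of an n-by-n grid the way FIG. 22 describes:
    first the outer ring of cells, counter-clockwise from the start
    cell (S) at the lower left, then the interior cells in a zigzag
    sweep. Returns the list of (x, y) cell indices in visiting order."""
    # Outer ring: bottom row, right column, top row, left column.
    ring = [(x, 0) for x in range(n)]
    ring += [(n - 1, y) for y in range(1, n)]
    ring += [(x, n - 1) for x in range(n - 2, -1, -1)]
    ring += [(0, y) for y in range(n - 2, 0, -1)]
    # Interior: column by column, alternating sweep direction.
    interior = []
    for i, x in enumerate(range(1, n - 1)):
        ys = range(1, n - 1)
        interior += [(x, y) for y in (ys if i % 2 == 0 else reversed(ys))]
    return ring + interior
```

For the 10-by-10 grid of FIG. 22, this visits all 100 cells exactly once, starting at cell (0, 0) (area X0, X1, Y0, Y1) and ending at cell (8, 1), which corresponds to the goal area (X8, X9, Y1, Y2).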
Here, the flight controller 501 (FIG. 6) or the camera controller 22 of the camera system 20 (FIG. 21) can subdivide the flight state of the drone 100 at the time of shooting (which can be determined based on the position, altitude, speed, attitude, autonomous-flight status, and the like) into, in particular, the following cases A to F.
A. The drone 100 is flying over the field (see FIG. 22).
B. The drone 100 is flying on an entry or exit route to or from the field.
C. The drone 100 is making a temporary return.
D. The drone 100 is landing or taking off.
E. The drone 100 is in an emergency stop (hovering).
F. The position signal or speed measurement of the drone 100 in particular cannot be obtained reliably because of disturbances in the input signals or the like, so that the flight state of the drone 100 cannot be properly determined.
Further, the flight controller 501 or the camera controller 22 of the camera system 20 can sort the above flight states A to F into the following two classes. In case A, the flight controller 501 determines that the captured image is a valid image. In cases B to F, on the other hand, it is difficult to photograph the crops in the field properly, so the flight controller 501 or the camera controller 22 of the camera system 20 determines that the captured image is an invalid image. When valid and invalid images are extracted at this stage, the flight controller 501 or the camera controller 22 of the camera system 20 flags each image as valid or invalid, together with information such as its position (latitude, longitude, altitude, and the like) and time.
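The two-way classification above can be sketched as follows. The state codes A to F are those from the text, and the record keeps the position and time information alongside the flag as described; the data structure itself is only an assumed illustration, not the controller's actual format.

```python
from dataclasses import dataclass
import datetime

# Flight states A-F from the text; only state A yields a valid image.
VALID_STATES = {"A"}
INVALID_STATES = {"B", "C", "D", "E", "F"}

@dataclass
class TaggedImage:
    image_id: str
    latitude: float
    longitude: float
    altitude: float
    captured_at: datetime.datetime
    valid: bool  # the valid/invalid flag attached to the image

def flag_image(image_id, lat, lon, alt, captured_at, flight_state):
    """Flag a captured image as valid (state A) or invalid (states B-F),
    keeping the position and time information alongside the flag."""
    if flight_state not in VALID_STATES | INVALID_STATES:
        raise ValueError(f"unknown flight state: {flight_state}")
    return TaggedImage(image_id, lat, lon, alt, captured_at,
                       valid=flight_state in VALID_STATES)
```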
In particular, it is preferable that the flight controller 501 or the camera controller 22 of the camera system 20 determine that an image captured in the following cases is an invalid image.
B. When the drone 100 is flying on an entry or exit route to or from the field, that is, when the drone 100 has not yet reached the field or is leaving the field, the crops in the field cannot be photographed properly. In this case, the flight state of the drone 100 does not satisfy the predetermined condition, and the captured image is therefore determined to be an invalid image. An entry or exit flight is a flight from the drone's takeoff point into the area to be photographed, or a flight out of that area to the landing point.
Note that, during normal photographing of a field, the drone 100 flies in a zigzag or meandering pattern: for example, to photograph all the crops in the field without omission, it moves back and forth substantially in parallel along the vertical axis of the field while shifting one area at a time along the horizontal axis (see FIG. 22). When entering or leaving the field, on the other hand, the drone 100 is not constrained to such a motion and can fly diagonally with respect to the field. Accordingly, the flight state may be tracked together with the position information of the drone 100, and an image may be determined to be a valid image when zigzag flight or the like is detected, and an invalid image when a diagonal flight or another deviation from the predetermined route flight (zigzag flight) is detected.
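One simple way to detect the diagonal flight mentioned above is to check whether each movement segment stays aligned with the grid axes of the zigzag sweep. This is only an assumed heuristic; the 15-degree tolerance is a placeholder value, not one taken from the text.

```python
import math

def is_on_zigzag_leg(prev_pos, cur_pos, tolerance_deg=15.0):
    """Return True if the movement from prev_pos to cur_pos (planar
    x, y coordinates) is roughly axis-aligned, as the zigzag sweep of
    FIG. 22 is; a diagonal movement suggests an entry/exit or return
    flight, whose images would be flagged invalid."""
    dx = cur_pos[0] - prev_pos[0]
    dy = cur_pos[1] - prev_pos[1]
    if dx == 0 and dy == 0:
        return False  # hovering: not a sweep leg
    heading = math.degrees(math.atan2(dy, dx)) % 90.0
    # Angular distance to the nearest grid axis (multiple of 90 deg).
    off_axis = min(heading, 90.0 - heading)
    return off_axis <= tolerance_deg
```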
C. When the drone 100 is making a temporary return, that is, when the drone 100 has left the field because of some problem or the like and is returning to the flight start point or to the location of the mobile terminal 701, the crops in the field cannot be photographed properly. In this case, the flight state of the drone 100 does not satisfy the predetermined condition, and the captured image is determined to be an invalid image. In the case of a temporary return, images captured during the temporary return flight from the point where work was interrupted, or during the work-resumption flight from the return point back to the point of interruption, are determined to be invalid images.
In this case as well, the flight state may be tracked together with the position information of the drone 100; an image may be determined to be valid when zigzag flight or the like is detected, and invalid when a diagonal flight or the like is detected.
D. When the drone 100 is landing or taking off, that is, when the camera cannot photograph the crops in the field from the predetermined altitude, the crops cannot be photographed properly. In this case, the flight state of the drone 100 does not satisfy the predetermined condition, and the captured image is determined to be an invalid image.
In this case, valid and invalid images may be distinguished by tracking instructions from the mobile terminal 701, the speed and altitude of the drone 100, and the like, together with the position information of the drone 100.
E. When the drone 100 is in an emergency stop (hovering), that is, when the camera does not move from a fixed position and altitude relative to the crops, the crops in the field cannot all be photographed properly. In this case, the flight state of the drone 100 does not satisfy the predetermined condition, and the captured image is determined to be an invalid image.
In this case, valid and invalid images may be distinguished by tracking instructions from the mobile terminal 701, the speed and altitude of the drone 100, and the like, together with the position information of the drone 100.
F. When the position signal or speed measurement in particular cannot be obtained reliably because of disturbances in the input signals or the like, and the flight state of the drone 100 therefore cannot be properly determined, the crops in the field cannot be photographed properly. In this case, the flight state of the drone 100 does not satisfy the predetermined condition, and the captured image is determined to be an invalid image.
Examples of the input signals include signals from the various sensors of the drone 100, signals from the mobile terminal 701 and the management server 702, and satellite signals. In this case, valid and invalid signals may be distinguished by tracking the values of these signals and comparing them against predetermined thresholds or the like.
Accordingly, based on the position information of the drone 100, the flight controller 501 or the camera controller 22 of the camera system 20 may determine that an image is invalid when the position coordinates are outside the target field.
Further, in addition to the above cases B to F, an image may be determined to be invalid based on the following conditions.
For example, G. when the altitude of the drone 100 above ground level is outside a predetermined range (too high or too low), proper photographing becomes difficult, and the image may therefore be determined to be invalid.
Further, H. when the attitude angle of the drone 100 is outside a predetermined range, proper photographing becomes difficult, and the image may therefore be determined to be invalid.
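Conditions G and H above amount to simple range checks on the flight information. A minimal sketch follows; the numeric bounds (altitude window, maximum tilt) are placeholder assumptions, not values from the text.

```python
def check_geometry(altitude_m, roll_deg, pitch_deg,
                   alt_range=(1.0, 5.0), max_tilt_deg=10.0):
    """Return a list of reasons the image should be flagged invalid,
    per condition G (altitude above ground out of range) and condition
    H (attitude angle out of range). An empty list means both passed."""
    reasons = []
    if not (alt_range[0] <= altitude_m <= alt_range[1]):
        reasons.append("G: altitude above ground out of range")
    if abs(roll_deg) > max_tilt_deg or abs(pitch_deg) > max_tilt_deg:
        reasons.append("H: attitude angle out of range")
    return reasons
```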
Further, I. when the flight controller 501 is in an abnormal state (such as a temporary abnormality) or has failed (such as a permanent, non-temporary equipment failure), the image may be determined to be invalid.
Further, J. when the camera system 20 is in an abnormal state (for example, a temporary abnormality of the camera controller 22) or has failed (for example, a dirty lens on the camera 21), the image may be determined to be invalid.
Further, K. when the drone 100 flies over a position it has already photographed, the image may be determined to be invalid.
Further, L. when the flight controller 501 detects an emergency intervention command, an abnormal-state detection, or the like, the image may be determined to be invalid.
Further, M. for the purpose of diagnosing the growth of crops in the field, when a growth diagnosis (for example, judging crop colors such as green or red) cannot be performed well on an image of the crops, that image may be determined to be invalid. In this case, the particular question is whether the captured image lacks an adequate amount of light.
For example, good image-based judgments can be made with images obtained during the day, but not with images obtained at night, when the surroundings become dark and good color discrimination becomes difficult. Accordingly, when the shooting time is outside a predetermined range (outside daytime hours), the image may be determined to be invalid.
Further, N. when the drone 100 is operated in the evening, for example, and the surroundings gradually darken toward sunset, an image may be determined to be invalid when the ambient illumination level (light intensity or the like), obtained from a sensor (such as a light sensor) or from the camera image, falls outside a predetermined range. That is, rather than making a binary day-or-night judgment, the change in illumination from day to night may be modeled in advance in multiple steps, and the image determined to be invalid when the level falls outside the predetermined range.
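The multi-step (rather than binary day/night) illumination check described under condition N can be sketched as follows. The lux bands and their names are arbitrary illustrative values; real thresholds would be chosen for the particular camera and color judgment in use.

```python
# Illustrative illumination bands (lux), from brightest to darkest.
ILLUMINATION_BANDS = [
    (10_000, "full daylight"),      # valid for color judgment
    (1_000, "overcast / evening"),  # valid, lower confidence
    (100, "dusk"),                  # invalid for color judgment
    (0, "night"),                   # invalid
]
VALID_BANDS = {"full daylight", "overcast / evening"}

def illumination_band(lux):
    """Map a sensor- or image-derived light level to a named band."""
    for threshold, name in ILLUMINATION_BANDS:
        if lux >= threshold:
            return name
    return "night"

def is_light_sufficient(lux):
    """True when the light level falls in a band where crop-color
    judgment is assumed to remain reliable."""
    return illumination_band(lux) in VALID_BANDS
```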
Further, O. even when the drone 100 is operated in fine weather (daytime), the captured image quality may deteriorate when the surrounding environment worsens due to weather-related changes (wind, insects, and the like). In this case, the image obtained at that time may be determined to be invalid based on weather information, information obtained from sensors, and the like. That is, even during the daytime, when the captured image is soiled or disturbed by wind, insects, or the like and adequate image quality cannot be obtained, the image may be determined to be invalid.
Further, P. even when the drone 100 is operated in fine weather, the surroundings may darken even during the daytime because of weather-related changes (rain, snow, and the like), and the captured image quality may deteriorate. In this case, the image obtained at that time may be determined to be invalid based on weather information, information obtained from sensors, and the like. That is, even during the daytime, when sufficient light is lacking because of weather conditions (rain, snow, and the like) and adequate image quality cannot be obtained, the image may be determined to be invalid.
When an invalid image is extracted because any of the above conditions B to P is met, and in particular because any of the conditions B to F is met, a warning may be output from the drone 100 by sound or light, or a warning may be output to the mobile terminal 701 or the like.
For example, when an invalid image is extracted, the state (in particular, an error state) may be signaled to the drone operator by an optical signal (steady or blinking light and the like) using the LED 107, the warning light 521 (see FIG. 6), or other lighting provided on the drone 100. The state may also be shown on a display unit provided on the drone.
Further, when an invalid image is extracted, the state (in particular, an error state) may be signaled to the drone operator by an audio signal using the buzzer 518, the speaker 520 (see FIG. 6), or the like provided on the drone 100.
Further, when an invalid image is extracted, a warning may be issued on the mobile terminal 701 (see FIG. 7) used by the drone operator, and information for notifying the operator of the state (in particular, an error state) may be output. For example, a text message or an optical signal may be shown on the display unit (display or indicator light) of the mobile terminal 701, or an audio signal (such as a warning sound) may be emitted from its speaker.
Accordingly, from the signals emitted by the drone 100 itself or by the mobile terminal 701 or the like, the user can immediately identify whether the drone 100 is acquiring valid images or invalid images. The user can therefore quickly see during operation whether invalid images are mixed in among the valid images.
To avoid placing an excessive load on the storage device, processing device, and the like of the camera system 20, when the captured images include invalid images unsuitable for growth diagnosis, it is preferable to exclude the invalid images from growth diagnosis.
For this reason, when the flight controller 501 (FIG. 6) or the camera controller 22 of the camera system 20 (FIG. 21) determines that an image is invalid because the acquired flight information (position, attitude, speed, altitude, flight state, and the like) matches a predetermined condition (such as the above cases B to F), it preferably performs at least one of the following: flagging the captured image as an invalid image; stopping storage of the image; stopping preprocessing of the image; stopping storage of the image or of the preprocessed data; stopping transmission of the image or of the preprocessed data to the outside; and deleting the image or the preprocessed data from the storage device.
Depending on the embodiment, any one of the following may be performed, or an appropriate combination may be selected from among them:
- flag the image as invalid, or not;
- record the original image (raw data), or not;
- preprocess the image, or not;
- record the preprocessed data, or not;
- transmit the original image or the preprocessed data, or not; and so on.
For example, these combinations include a case where the original image is not recorded at all, a case where the original image is recorded as-is but flagged as invalid, and a case where the image is preprocessed but flagged as invalid.
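The independent options enumerated above can be represented as a small policy structure, one boolean per option. The field names and the three example policies below are assumptions for illustration; they correspond to the three example combinations just mentioned.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class HandlingPolicy:
    """Which of the independent options listed in the text to apply
    when an image is judged invalid."""
    set_invalid_flag: bool
    record_raw: bool
    preprocess: bool
    record_preprocessed: bool
    transmit: bool

# Example combinations from the text:
DROP_RAW = HandlingPolicy(               # do not record the raw image
    set_invalid_flag=True, record_raw=False, preprocess=False,
    record_preprocessed=False, transmit=False)
KEEP_RAW_FLAG_ONLY = HandlingPolicy(     # keep raw, just flag it
    set_invalid_flag=True, record_raw=True, preprocess=False,
    record_preprocessed=False, transmit=False)
PREPROCESS_BUT_FLAG = HandlingPolicy(    # preprocess, but keep the flag
    set_invalid_flag=True, record_raw=True, preprocess=True,
    record_preprocessed=True, transmit=False)
```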
That is, the flight controller 501 (FIG. 6) or the camera controller 22 of the camera system 20 (FIG. 21) not only specifies whether or not to store a captured image, but further makes fine-grained decisions on whether or not to preprocess the image, whether or not to store the preprocessed data, and whether or not to send the preprocessed data to the management server.
Preferably, the camera controller stores the captured raw data in the storage unit 25 of the camera system 20 in the drone 100 while prohibiting preprocessing by the image processor 24, or prohibiting transmission of the preprocessed data to the outside by the transmission unit 26. This avoids preprocessing based on poor-quality captured images and reduces the burden of wasteful data transmission. Making these fine-grained distinctions makes it possible to control the drone in the manner best suited to each embodiment.
By contrast, a simple method would, when an invalid image is extracted, merely set a condition of whether or not to capture the image, or whether or not to store the captured image. In the present embodiment, on the other hand, even an invalid image is stored, which preserves the option of using it in later analysis and the like. For example, even when an image meets an invalid-image condition, part or all of it may still be usable for image processing, or part or all of it may have value as a record. In such cases, avoiding preprocessing or data transmission on the drone 100 side suppresses wasteful processing load while leaving open the option of processing the image later as needed.
 図23は、ドローン100の無効画像判定処理フローの例である。
 無効画像判定処理フローでは、ステップS01でドローン100の飛行指示を受け付けた場合に、ステップS02で、フライトコントローラー501が各種センサ(図6)または外部のサーバ(図7)などからドローン100の飛行状態を取得する。
 次に、ステップS03で、フライトコントローラー501(図6)またはカメラコントローラー22(図21)が、ステップS02で取得した飛行状態が所定の無効画像の条件を満たすか否かを判定する。例えば、上記B-Fなどの条件のいずれかを満たす場合、ステップS04で、フライトコントローラー501(図6)またはカメラコントローラー22(図21)が撮影画像に無効画像のフラグ付けを行う。この後、ステップS05で、フライトコントローラー501(図6)またはカメラコントローラー22(図21)が無効画像の前処理、前処理後のデータの記憶、または前処理後のデータの外部への伝送などを停止する。
FIG. 23 is an example of an invalid image determination processing flow of the drone 100.
In the invalid image determination processing flow, when the flight instruction of the drone 100 is received in step S01, the flight controller 501 receives the flight state of the drone 100 from various sensors (FIG. 6) or an external server (FIG. 7) in step S02. To get.
Next, in step S03, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) determines whether or not the flight state acquired in step S02 satisfies a predetermined invalid image condition. For example, when any of the conditions such as BF is satisfied, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) flags the captured image as an invalid image in step S04. After that, in step S05, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) performs preprocessing of the invalid image, storage of the preprocessed data, transmission of the preprocessed data to the outside, and the like. Stop.
On the other hand, when, for example, condition A above is satisfied in step S03, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) flags the captured image as a valid image in step S06. After that, in step S07, preprocessing of the valid image, storage of the preprocessed data, transmission of the preprocessed data to the outside, and the like are executed.
The above processing may be performed by the flight controller 501 (FIG. 6) in cooperation with the camera controller 22 (FIG. 21), or may be performed by the camera controller 22 (FIG. 21) instead of the flight controller 501 (FIG. 6).
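The decision in steps S03 to S07 can be sketched as follows. This is a minimal illustration only; the names `FlightState`, `is_invalid`, and `handle_capture`, and the specific condition fields, are assumptions for the example and do not appear in the patent.

```python
# Hypothetical sketch of steps S01-S07. Field and function names are
# illustrative assumptions, not taken from the specification.
from dataclasses import dataclass

@dataclass
class FlightState:
    gps_ok: bool              # analogue of "position/speed measurement possible"
    in_target_area: bool      # analogue of "inside the survey area"
    landing_or_takeoff: bool  # analogue of "landing or takeoff in progress"

def is_invalid(state: FlightState) -> bool:
    """True when any invalid-image condition (analogue of B to F) holds."""
    return (not state.gps_ok) or (not state.in_target_area) or state.landing_or_takeoff

def handle_capture(state: FlightState, image: dict) -> dict:
    if is_invalid(state):          # step S03
        image["flag"] = "invalid"  # step S04
        image["process"] = False   # step S05: skip preprocessing/storage/transmission
    else:
        image["flag"] = "valid"    # step S06
        image["process"] = True    # step S07: preprocess, store, transmit
    return image
```

Either the flight controller or the camera controller could host such a routine, consistent with the cooperation described above.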
After invalid images have been extracted, the management server 702 can recalculate a flight path that flies over the areas where the invalid images were acquired.
The control unit, that is, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) of the camera system 20, stores the position information and other data of each invalid image and transmits them to the external management server 702. Based on the received data, the management server 702 may recalculate a flight path connecting the invalid images and send an instruction to the flight controller 501 to retake the image of each invalid-image area.
For example, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) of the camera system 20 flags the areas corresponding to the invalid images and obtains their positions (latitude and longitude, etc.). The management server 702 then either calculates an economical flight path for the drone 100 based on the position information of each flagged image, or selects a suitable path from predetermined flight paths, and makes the drone 100 fly again along that path.
FIG. 24 is a conceptual diagram showing a case where invalid images are extracted during a flight of the drone 100 for diagnosing crop growth in the field illustrated in FIG. 22.
In this example, invalid-image flags are attached in the three colored regions 31 (X1, X2, Y5, Y6), 32 (X3, X4, Y8, Y9), and 33 (X5, X6, Y8, Y9).
When calculating a new flight path for re-shooting the regions 31, 32, and 33 from which these invalid images were extracted, the management server 702 can, for example, take the following approaches.
FIG. 25 shows an example in which, for the flight path including the three invalid-image regions 31, 32, and 33 illustrated in FIG. 24, the same flight path is adopted again and the flight continues until all invalid images have been retaken.
That is, for the newly obtained flight path, the management server 702 may calculate that shooting is performed only at the points corresponding to the invalid images, following, in whole or in part, the flight path already adopted.
In this case, the calculation of the flight path can be simplified. However, when the number of invalid images that need to be retaken is small, the flight path tends to be wasteful. On the other hand, since the first and second flight paths are identical, there is the advantage that the camera orientation and angle at the time of shooting are easy to keep consistent. This approach is effective when the number of invalid images that need to be retaken is large. The flight around the outer perimeter of the field may be omitted.
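The reuse-the-path approach of FIG. 25 reduces to pairing each waypoint of the original route with a shoot/skip decision. The following is a minimal sketch under that reading; waypoint tuples and the function name `retake_plan` are hypothetical.

```python
# Sketch of the FIG. 25 approach: fly the original waypoint list again,
# but trigger the camera only at waypoints flagged invalid on the first
# flight. Waypoints and flags are hypothetical examples.

def retake_plan(waypoints, invalid_positions):
    """Pair each waypoint of the reused path with a shoot/skip decision."""
    targets = set(invalid_positions)
    return [(wp, wp in targets) for wp in waypoints]
```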
FIG. 26 shows an example in which, without adopting the same flight path again, the drone flies so as to connect the invalid-image regions 31, 32, and 33 illustrated in FIG. 24 at the shortest distance.
In this case, a flight path is calculated that connects the positions corresponding to the invalid images in a single stroke while minimizing the total length of the flight path.
For example, for the newly obtained flight path, the management server 702 may calculate a path that connects the flagged image regions 31, 32, and 33 in a single stroke while keeping the overall flight distance to a minimum.
In this case, the flight time of the drone 100 can be shortened and waste in the flight path avoided, but a problem peculiar to departing from the zigzag flight path (FIG. 22) can arise.
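One way the server could realize the single-stroke path is a greedy nearest-neighbour ordering, as sketched below. Finding the truly minimal tour is a travelling-salesman problem, so this greedy version is only an approximation for illustration; the coordinates standing in for regions 31, 32, and 33 are hypothetical grid centres.

```python
# Greedy nearest-neighbour sketch of the "single stroke" path of FIG. 26.
# A guaranteed-minimum tour would require solving a travelling-salesman
# problem; this approximation only illustrates the idea.
import math

def single_stroke_path(start, points):
    """Order `points` by repeatedly flying to the nearest remaining one."""
    path, pos, remaining = [], start, list(points)
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(pos, p))
        remaining.remove(nxt)
        path.append(nxt)
        pos = nxt
    return path

def path_length(start, path):
    """Total distance flown along `path` starting from `start`."""
    total, pos = 0.0, start
    for p in path:
        total += math.dist(pos, p)
        pos = p
    return total

# Hypothetical centres for regions 31, 32, 33 of FIG. 24
invalid = [(1.5, 5.5), (3.5, 8.5), (5.5, 8.5)]
route = single_stroke_path((0.0, 0.0), invalid)
```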
That is, as illustrated in FIG. 22, on the zigzag flight path the drone 100 flies almost parallel to the vertical and horizontal axes, whereas, as illustrated in FIG. 26, when connecting the invalid images at the shortest distance the drone 100 tends to fly at an angle to those axes. As a result, as illustrated at the right end of FIG. 26, a deviation (angle α) is likely to occur between the orientation of the invalid image 31 taken on the first flight path (the zigzag path) and the orientation of the image 31' taken on the second flight path (the angled path). Therefore, when the initial invalid image 31 is replaced with the subsequent valid image 31', the orientations of the corresponding images differ, and merging (combining) the plural images into one may cause positional misalignment at that location (particularly when the captured image 31 is not square).
Therefore, in the embodiment illustrated in FIG. 26, because each invalid image is retaken away from the first flight path (the zigzag path), it is preferable to store the camera orientation at the first shooting in association with the orientation at the second shooting. Later, when associating the first valid image with the second valid image, it is preferable that the flight controller 501 or the camera controller 22 correct the deviation (angle α) of the retaken image, for example by rotating the captured image. This approach is effective when the number of invalid images that need to be retaken is small.
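The correction of angle α amounts to rotating the retaken image back by the measured deviation. The sketch below applies the standard 2D rotation to corner coordinates only, which is enough to illustrate the geometry; a real implementation would resample the raster, for example with an imaging library, and all names here are illustrative.

```python
# Sketch of undoing the orientation deviation (angle α) of a retaken
# image by rotating its coordinates by -α about the image centre/origin.
import math

def rotate_point(x, y, alpha_deg):
    """Rotate (x, y) about the origin by alpha_deg degrees (counterclockwise)."""
    a = math.radians(alpha_deg)
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a))

def correct_orientation(corners, alpha_deg):
    """Undo a measured deviation of +alpha_deg by rotating by -alpha_deg."""
    return [rotate_point(x, y, -alpha_deg) for x, y in corners]
```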
FIG. 27 shows an example in which, without adopting the same flight path again, the drone flies so as to connect the three invalid images illustrated in FIG. 24 at as short a distance as possible while shooting with the same camera orientation as on the first flight path. That is, the total length of the flight path is minimized while the orientation for retaking each invalid image is matched to that of the first flight path.
In this approach, the second flight includes sections 41 that fly toward each invalid image at the shortest possible distance (sections flown diagonally to the first flight path) and sections 42 that correct the heading of the drone 100 so that each invalid image is shot with the same camera orientation as on the first flight (sections flown in the same way as the first flight path). The ratio of the lengths of sections 41 and 42 can be changed according to the embodiment, but the length of sections 42 may be made as short as possible.
In this case, unlike the case illustrated in FIG. 26, no deviation (angle α in FIG. 26) occurs between the orientation of the images taken on the first flight path (the zigzag path) and that of the images retaken on the second flight path (the angled path). Therefore, when each initial invalid image is replaced with the subsequent valid image, the orientations of the images are almost the same, so the risk of misalignment when merging plural images into one can be minimized.
However, to match the orientation of the images taken on the first flight path, the shooting direction may have to be changed multiple times (three times or twice), which is wasteful, particularly when flying from the second invalid-image region 32 to the third invalid-image region 33 (reference numeral 43).
In this case, a flight path is calculated that is partially or wholly identical to the first flight path so as to include the positions corresponding to the invalid images.
Further, when reacquiring images at the positions corresponding to the invalid images, the images are reacquired after making the heading of the drone the same as when the invalid images were acquired.
As another method, when reacquiring images at the positions corresponding to the invalid images, the reacquired images may be changed (by image processing) to the same composition as the corresponding invalid images, thereby reacquiring images with the same composition as on the first flight.
FIG. 28 shows an example in which, without adopting the same flight path again, the drone connects the invalid-image regions 31, 32, and 33 illustrated in FIG. 24 at as short a distance as possible while shooting either in the same orientation as on the first flight path or in an orientation rotated by 90 or 180 degrees.
Comparing the example of FIG. 28 with that of FIG. 27, in the former the shooting direction is changed only once to correspond to the orientation of the images taken on the first flight path, particularly when flying from the second invalid-image region 32 to the third invalid-image region 33 (reference numeral 43). The example of FIG. 28 is therefore more economical than that of FIG. 27 because the flight path is shortened. However, this assumes that the captured image is vertically or horizontally symmetric.
Normally, when the captured image is rectangular, opposite sides have the same length, so a retaken valid image can be substituted for an invalid image by changing its orientation (rotating it) by 180 degrees.
When the captured image is square, all sides have the same length, so a retaken valid image can be substituted for an invalid image by changing its orientation (rotating it) by 90 degrees.
In either case, the captured image is rotated in consideration of the camera's shooting angle at the time of shooting.
Therefore, when newly obtaining a flight path, the management server 702 can adopt whichever of the following approaches is suitable:
 -the case illustrated in FIG. 25 (calculating a new flight path that is partially or wholly identical to the first flight path so as to include the invalid images);
 -the case illustrated in FIG. 26 (calculating a new flight path that connects the invalid images in a single stroke while minimizing the total path length, that is, giving priority to connecting the invalid images at the shortest distance without matching the shooting direction to that of the first flight path);
 -the case illustrated in FIG. 27 (calculating a new flight path that connects the invalid images at the shortest distance while matching the shooting direction of each invalid image to that of the first flight path);
 -the case illustrated in FIG. 28 (calculating a new flight path that connects the invalid images at the shortest distance while either matching the shooting direction to that of the first flight path or changing the orientation by 90 or 180 degrees).
Alternatively, the management server 702 may prepare a plurality of flight paths in a database in advance, select from them a suitable path that includes the invalid-image positions, and use it as the new flight path.
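The choice among the four approaches could follow the rules of thumb stated above: reusing the first path pays off when many retakes are needed, and shortest-distance paths when few are. The following selector is a hypothetical sketch; the threshold value, the strategy labels, and the function name are all illustrative assumptions, not part of the specification.

```python
# Hypothetical selector among the approaches of FIGS. 25-28, based on
# the effectiveness conditions stated in the text. All names and the
# threshold are illustrative assumptions.

def choose_replan_strategy(num_invalid, match_orientation, allow_rotation,
                           many_threshold=10):
    if num_invalid >= many_threshold:
        # Many retakes: reusing the first path (FIG. 25) is effective.
        return "fig25_reuse_first_path"
    if not match_orientation:
        # Few retakes, orientation corrected later in software (FIG. 26).
        return "fig26_shortest_single_stroke"
    if allow_rotation:
        # Orientation matched up to a 90/180-degree rotation (FIG. 28).
        return "fig28_shortest_with_90_180_rotation"
    # Orientation matched exactly by adjusting heading in flight (FIG. 27).
    return "fig27_shortest_with_matched_heading"
```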
FIG. 29 is an example of an invalid image determination process and second flight path calculation process flow of the drone 100.
In the invalid image determination processing flow, when a flight instruction for the drone 100 is received in step S11, the flight controller 501 (FIG. 6) acquires the flight state of the drone 100 from various sensors (FIG. 6) or an external server (FIG. 7) in step S12.
Next, in step S13, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) determines whether the flight state acquired in step S12 satisfies a predetermined invalid-image condition. For example, when any of the conditions B to F above is satisfied, the captured image is flagged as an invalid image in step S14. After that, in step S15, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) stops processing of the invalid image (image preprocessing, storage of the preprocessed data, transmission of the preprocessed data, etc.).
On the other hand, when, for example, condition A above is satisfied in step S13, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) flags the captured image as a valid image in step S16. After that, in step S17, processing of the valid image (image preprocessing, storage of the preprocessed data, transmission of the processed data, etc.) is executed.
Further, in the second flight path calculation processing flow, information on the invalid images (FIG. 24; invalid-image flags, position information, etc.) is transmitted to the external management server 702 in step S18 (FIG. 7).
Next, in step S19, the external management server 702 recalculates a new flight path connecting the invalid images (FIGS. 25-28).
Next, in step S20, the recalculated flight path is transmitted from the external management server 702 to the drone 100 (FIG. 7). The drone 100 then flies again and retakes the images corresponding to the invalid images.
On the second flight, in which the invalid images are retaken, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) of the camera system 20 again flags each retaken image as valid or invalid. In this way, each image determined to be invalid on the first flight is placed in one-to-one correspondence with the image retaken as valid on the second flight. At that time or afterwards, each invalid image from the first flight may be replaced with the corresponding valid image.
If an invalid image occurs again on the second flight, the flight controller 501 (FIG. 6) or the camera controller 22 (FIG. 21) of the camera system 20 takes the same measures as on the first flight and, if necessary, likewise retakes the image on a third flight.
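The one-to-one replacement described above can be keyed by the stored shooting position, with any still-invalid retakes carried over as candidates for a third flight. The sketch below assumes a simple dictionary record per image; the field names `pos` and `flag` and the function name are hypothetical.

```python
# Sketch of replacing first-flight invalid images with second-flight
# valid retakes, matched one-to-one by shooting position. Record field
# names are illustrative assumptions.

def merge_retakes(first_flight, retakes):
    """Replace each invalid image with the valid retake at the same position."""
    valid_retakes = {img["pos"]: img for img in retakes if img["flag"] == "valid"}
    merged, still_invalid = [], []
    for img in first_flight:
        if img["flag"] == "invalid" and img["pos"] in valid_retakes:
            merged.append(valid_retakes[img["pos"]])
        else:
            merged.append(img)
            if img["flag"] == "invalid":
                still_invalid.append(img["pos"])  # candidates for a third flight
    return merged, still_invalid
```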
Therefore, to diagnose crop growth in a field, the present invention provides a drone 100 equipped with a camera 512, or a camera system 20 attached to the drone 100.
When the drone 100 or the camera system 20 identifies a captured image as invalid, it stops the preprocessing of that image or the transmission of the preprocessed data to an external server. This eliminates wasted image processing and optimizes work efficiency.
Further, when the drone 100 or the camera system 20 identifies a captured image as invalid, it causes an external server to calculate a new flight path for retaking the invalid image. This makes an optimized flight path promptly available for retaking invalid images.
The present invention is not limited to the embodiments described above and includes various modifications. For example, the above embodiments have been described in detail to explain the present invention clearly, and the invention is not necessarily limited to configurations including all the described elements. Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment. Further, other configurations can be added to, deleted from, or substituted for part of the configuration of each embodiment.
Each of the above configurations, functions, processing units, processing means, and the like may be realized in hardware, in part or in whole, for example by designing them as integrated circuits. Each of the above configurations, functions, and the like may also be realized in software, with a processor interpreting and executing a program that realizes each function. Information such as programs, tables, and files realizing each function can be stored in a memory, a storage device such as a hard disk or SSD (Solid State Drive), or a storage medium such as an IC card, SD card, or DVD.
The control lines and information lines shown are those considered necessary for the explanation; not all control lines and information lines of a product are necessarily shown. In practice, almost all configurations may be considered interconnected.
The embodiments described above disclose at least the configurations described in the claims.
21… camera, 22… control unit (camera controller), 100… drone, 501… control unit (flight controller), 512… camera (multispectral camera), 701… mobile terminal, 702… management server, 703… management terminal, 710… base station

Claims (21)

  1.  A drone comprising:
     a main body;
     a plurality of rotors;
     a camera;
     a transmission unit that transmits, to the outside, data generated from an image acquired from the camera; and
     a storage device,
     wherein, when a flight state of the drone satisfies a predetermined condition, the drone performs at least one of: flagging the image as an invalid image; stopping storage of the image in the storage device; stopping preprocessing of the image; stopping transmission of the data to the outside; and deleting the image from the storage device.
  2.  The drone according to claim 1, wherein the predetermined condition is an entry flight from a takeoff point of the drone to a target area, or an exit flight from the target area to a landing point.
  3.  The drone according to claim 1, wherein the predetermined condition is a temporary return flight in which the drone temporarily returns from a work interruption point, or a work resumption flight from a return point to the work interruption point.
  4.  The drone according to claim 1, wherein the predetermined condition is landing or takeoff of the drone.
  5.  The drone according to claim 1, wherein the predetermined condition is that position measurement or speed measurement of the drone cannot be performed.
  6.  The drone according to any one of claims 1 to 5, wherein the preprocessing is processing that associates shooting information with the image.
  7.  The drone according to any one of claims 1 to 5, wherein the preprocessing of the image is detecting, based on the image, the proportion or amount of green or red light reflected by plants.
  8.  The drone according to any one of claims 1 to 5, wherein the preprocessing of the image is determining the degree of crop growth based on the image.
  9.  The drone according to any one of claims 1 to 5, wherein the preprocessing of the image is associating, based on the image, the position information at which the image was acquired with the degree of crop growth.
  10.  The drone according to any one of claims 1 to 9, wherein whether the flight state of the drone satisfies a predetermined condition is determined based on detection information from a sensor mounted on the drone.
  11.  The drone according to any one of claims 1 to 9, wherein whether the flight state of the drone satisfies a predetermined condition is determined based on information received from at least one of an external management server and a mobile terminal.
  12.  The drone according to any one of claims 1 to 11, wherein, when it is determined that the predetermined condition is satisfied, the drone performs at least one of outputting light from a light mounted on the drone, outputting sound from a speaker, and displaying on a display unit.
  13.  The drone according to any one of claims 1 to 12, wherein, when it is determined that the predetermined condition is satisfied, the drone transmits information for displaying a warning on a mobile terminal.
  14.  A drone comprising:
     a main body;
     a plurality of rotors;
     a camera;
     a control unit that preprocesses an image acquired from the camera; and
     a transmission unit that transmits the preprocessed data to the outside,
     wherein, when a flight state of the drone satisfies a predetermined condition, the control unit stores information on a scheduled re-shooting position at which the predetermined condition was satisfied, and transmits the stored information on the scheduled re-shooting position to the outside.
  15.  The drone according to claim 14, wherein a flight path that flies over the scheduled re-shooting position is calculated as a flight path for the drone to fly.
  16.  The drone according to claim 14, wherein, as a flight path for the drone to fly, a flight path that is partially or wholly identical to a first flight path is calculated so as to include the scheduled re-shooting position.
  17.  The drone according to claim 15 or 16, wherein, when reacquiring an image at the scheduled re-shooting position, the control unit reacquires the image after making the heading of the drone the same as on the first flight.
  18.  The drone according to claim 15 or 16, wherein, when reacquiring an image at the scheduled re-shooting position, the control unit changes the reacquired image to the same composition as the corresponding image acquired on the first flight.
  19.  A camera system comprising:
     a camera;
     a transmission unit that transmits, to the outside, data generated from an image acquired from the camera;
     a connection unit that connects to a drone having a main body and a plurality of rotors; and
     a storage device,
     wherein, when a flight state of the drone satisfies a predetermined condition, the camera system performs at least one of: flagging the image as an invalid image; stopping storage of the image in the storage device; stopping preprocessing of the image; stopping transmission of the data to the outside; and deleting the image from the storage device.
  20.  A camera system comprising:
     a camera;
     a control unit that preprocesses an image acquired from the camera;
     a transmission unit that transmits the preprocessed data to the outside; and
     a connection unit that connects to a drone having a main body and a plurality of rotors,
     wherein, when a flight state of the drone satisfies a predetermined condition, the control unit stores information on a scheduled re-shooting position at which the predetermined condition was satisfied, and transmits the stored information on the scheduled re-shooting position to the outside.
  21.  A server that receives at least one of an image captured by a camera mounted on a drone and data generated from the image,
     wherein the server performs at least one of: flagging an image captured while a flight state of the drone satisfies a predetermined condition as an invalid image; stopping recording of the invalid image in a recording device; stopping processing of the invalid image; and deleting the invalid image from the recording device.

PCT/JP2020/024246 2020-06-19 2020-06-19 Drone for diagnosing crop growth, and camera system for same WO2021255940A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2020/024246 WO2021255940A1 (en) 2020-06-19 2020-06-19 Drone for diagnosing crop growth, and camera system for same
JP2022531240A JPWO2021255940A1 (en) 2020-06-19 2020-06-19

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/024246 WO2021255940A1 (en) 2020-06-19 2020-06-19 Drone for diagnosing crop growth, and camera system for same

Publications (1)

Publication Number Publication Date
WO2021255940A1 true WO2021255940A1 (en) 2021-12-23

Family

ID=79267720

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/024246 WO2021255940A1 (en) 2020-06-19 2020-06-19 Drone for diagnosing crop growth, and camera system for same

Country Status (2)

Country Link
JP (1) JPWO2021255940A1 (en)
WO (1) WO2021255940A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115837994A (en) * 2023-02-16 2023-03-24 国网山西省电力公司电力科学研究院 Pod attitude detection and image compensation device and method based on MEMS gyroscope

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11252428A (en) * 1998-03-05 1999-09-17 Hitachi Ltd Super-high resolution camera
JP2003009664A (en) * 2001-06-29 2003-01-14 Minolta Co Ltd Crop growth level measuring system, crop growth level measuring method, crop growth level measuring program, and computer-readable recording medium recorded with the program
JP2006027331A (en) * 2004-07-13 2006-02-02 Hiroboo Kk Method for collecting aerial image information by utilizing unmanned flying object
JP2018152737A (en) * 2017-03-13 2018-09-27 ヤンマー株式会社 Unmanned flight camera
JP2019040383A (en) * 2017-08-25 2019-03-14 コニカミノルタ株式会社 Photographing schedule determination method, and photographing schedule determination control program
WO2020004029A1 (en) * 2018-06-26 2020-01-02 ソニー株式会社 Control device, method, and program


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115336520A (en) * 2022-08-26 2022-11-15 北大荒集团黑龙江七星农场有限公司 Big data classification processing system for rice deployment
CN115336520B (en) * 2022-08-26 2023-05-16 北大荒集团黑龙江七星农场有限公司 Big data classification processing system for rice deployment

Also Published As

Publication number Publication date
JPWO2021255940A1 (en) 2021-12-23

Similar Documents

Publication Publication Date Title
JP6762629B2 (en) Field crop photography method and drone for photography
WO2021255940A1 (en) Drone for diagnosing crop growth, and camera system for same
WO2019168047A1 (en) Drone, drone control method, and drone control program
JP6889502B2 (en) Drones, drone control methods, and drone control programs
JP7008999B2 (en) Driving route generation system, driving route generation method, and driving route generation program, and drone
JP7353630B2 (en) Drone control system, drone control method, and drone
JP7359464B2 (en) Crop growing system
JP6973829B2 (en) Field photography camera
WO2021205559A1 (en) Display device, drone flight propriety determination device, drone, drone flight propriety determination method, and computer program
WO2020075868A1 (en) Cultivated field image analysis method
WO2021152741A1 (en) Crop-growing system
JP7411259B2 (en) Plant pathology diagnosis system, plant pathology diagnosis method, plant pathology diagnosis device, and drone
JP2022088441A (en) Drone steering device and steering program
JP7011233B2 (en) Spraying system and spraying management device
JP2022084735A (en) Drone, drone control method, and drone control program
JP7412037B2 (en) How to define the drone system, controls and work area
JP7412041B2 (en) unmanned aircraft control system
JP7460198B2 (en) Drone for spraying liquid and drone control method
WO2021144988A1 (en) Method for controlling chemical spraying flight of drone and information processing terminal
WO2021255885A1 (en) Spraying system, spraying method, and drone
WO2021130817A1 (en) Agricultural field management system, agricultural field management method and drone
WO2021166101A1 (en) Operation device and drone operation program
JP7465580B2 (en) Drones and drone control methods
WO2021192233A1 (en) Liquid-dispersing drone and drone control method
WO2021220409A1 (en) Area editing system, user interface device, and work area editing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20940974

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022531240

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20940974

Country of ref document: EP

Kind code of ref document: A1