WO2021166008A1 - Drone and method for controlling drone - Google Patents

Drone and method for controlling drone

Info

Publication number
WO2021166008A1
WO2021166008A1 (application PCT/JP2020/005932)
Authority
WO
WIPO (PCT)
Prior art keywords
drone
distance
optical sensor
sensor
state
Prior art date
Application number
PCT/JP2020/005932
Other languages
French (fr)
Japanese (ja)
Inventor
敦規 西東
千大 和氣
宏記 加藤
Original Assignee
株式会社ナイルワークス
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ナイルワークス filed Critical 株式会社ナイルワークス
Priority to JP2022501389A priority Critical patent/JP7369485B2/en
Priority to PCT/JP2020/005932 priority patent/WO2021166008A1/en
Publication of WO2021166008A1 publication Critical patent/WO2021166008A1/en

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/10: Simultaneous control of position or course in three dimensions

Definitions

  • the present invention relates to a drone and a drone control method.
  • Altitude information may be used for drone flight control.
  • Patent Document 1 describes an altimeter 36 that emits a laser beam or an ultrasonic signal below the unmanned flying object 1 and generates an output indicating the altitude from the reflected wave ([0019]).
  • the altimeter 36 here appears to use only one of the laser beam and the ultrasonic signal.
  • because ultrasonic waves are only weakly reflected from soft objects that easily absorb sound, the detection accuracy may decrease when such objects are present.
  • the present invention has been made in consideration of the above problems, and an object of the present invention is to provide a drone that is robust against changes in the lower object such as the ground, and a control method thereof.
  • the drone according to the present invention includes an optical sensor that detects a first distance to a lower object and an ultrasonic sensor that detects a second distance to the lower object, and is characterized by further having a flight control unit that controls the flight of the drone using a control altitude set by selecting one of the first distance and the second distance or a combination of both.
  • that is, the flight of the drone is controlled using a control altitude set by selecting one of the first distance to the lower object (the ground, etc.) detected by the optical sensor and the second distance to the lower object detected by the ultrasonic sensor, or a combination of both.
  • this makes it possible to fly the drone in a manner that accounts for the respective advantages and disadvantages of optical sensors and ultrasonic sensors, and therefore to improve robustness against changes in the lower object such as the ground.
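  • the select-or-combine logic described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, the weighting scheme, and the weight value are assumptions introduced for the example.

```python
def control_altitude(d1, d2, optical_degraded, w_primary=0.8):
    """Set the control altitude Hc from the optical first distance d1
    and the ultrasonic second distance d2.

    optical_degraded: True when the lower object puts the optical
    sensor in an accuracy-reduction state (e.g., a mirror-like or
    white surface, or a shade-to-sun boundary at the measurement
    position).
    w_primary: illustrative weight given to the more trustworthy
    sensor when the two readings are blended rather than one being
    selected outright (w_primary=1.0 reduces to pure selection).
    """
    if optical_degraded:
        # weight the ultrasonic reading more heavily than the optical one
        return w_primary * d2 + (1.0 - w_primary) * d1
    # normal case: weight the optical reading more heavily
    return w_primary * d1 + (1.0 - w_primary) * d2

# normal state: Hc leans toward the optical value D1
hc = control_altitude(2.0, 2.4, optical_degraded=False)
```

With `w_primary=1.0` the same function expresses the "select one distance" alternative; intermediate values express the "combination of both" alternative.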
  • as the optical sensor, for example, a laser type can be used. In particular, a time-of-flight (TOF) type sensor may be used. A TOF sensor is often lighter in weight, and its use makes it possible to reduce the weight of the drone.
  • the optical sensor and the ultrasonic sensor may be arranged side by side in the lateral direction of the drone. Further, the light irradiation direction of the optical sensor and the ultrasonic irradiation direction of the ultrasonic sensor may be set so that at least a part of the light irradiation range of the optical sensor and the ultrasonic irradiation range of the ultrasonic sensor overlap.
  • the drone may move forward in a forward-leaning posture in the direction of travel. If the optical sensor and the ultrasonic sensor are arranged side by side in the lateral direction of the drone, the detection values of the two sensors are less likely to deviate from each other even in such a forward-leaning posture. Therefore, it is possible to suppress the discrepancy between the first distance from the optical sensor and the second distance from the ultrasonic sensor.
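  • a small geometry check illustrates why the lateral arrangement helps: with a forward pitch angle, two sensors separated fore-and-aft by a baseline sit at heights differing by roughly baseline × sin(pitch), while a lateral baseline produces no such offset. The function and the numeric values below are illustrative assumptions, not figures from the patent.

```python
import math

def height_offset(baseline_m, pitch_deg, lateral):
    """Approximate height difference between two downward-facing
    sensors on a drone pitched forward by pitch_deg degrees.

    lateral=True  -> sensors side by side (left-right); pitch does not
                     separate their heights.
    lateral=False -> sensors fore-and-aft; the front sensor sits lower
                     by about baseline * sin(pitch).
    """
    if lateral:
        return 0.0
    return baseline_m * math.sin(math.radians(pitch_deg))

# 0.2 m baseline, 10 degree forward lean (illustrative values):
fore_aft = height_offset(0.2, 10.0, lateral=False)      # ~0.035 m
side_by_side = height_offset(0.2, 10.0, lateral=True)   # 0.0 m
```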
  • the drone may have a support member that projects downward from the main body of the drone to support the optical sensor and the ultrasonic sensor.
  • the main body of the drone may contain heat generation sources such as a power distribution board, an inverter, a power supply, and the image processing unit of a camera. With the support member, the optical sensor and the ultrasonic sensor can be arranged away from these heat generation sources. Therefore, the influence of heat from the heat generation sources on the optical sensor and the ultrasonic sensor can be suppressed.
  • the drone may include a tank for storing the sprayed material and a discharge port (nozzle or the like) for spraying the sprayed material. Further, the optical sensor and the ultrasonic sensor may be arranged above the discharge port. Further, the optical sensor and the ultrasonic sensor may be arranged in front of the discharge port. As a result, it is possible to suppress the influence of the sprayed material sprayed from the discharge port during the flight of the drone on the measurement of the optical sensor or the ultrasonic sensor.
  • the drone may further include a camera that images the lower object and a lower state determination unit that determines the state of the lower object based on the image of the camera.
  • when the state of the lower object is a normal state, the flight control unit may set the control altitude by selecting the first distance detected by the optical sensor, or by weighting the first distance more heavily than the second distance detected by the ultrasonic sensor. Further, when the state of the lower object is a state in which the accuracy of the optical sensor is lowered, the flight control unit may set the control altitude by selecting the second distance, or by weighting the second distance more heavily than the first distance.
  • in this way, the ultrasonic sensor is preferentially used in situations where the detection accuracy of the optical sensor is low, making it possible to control the flight of the drone with high accuracy.
  • the state in which the accuracy of the optical sensor is lowered may include a state in which the lower object is a mirror surface or white, or a state in which the lower object has a boundary between shade and sun and the measurement position of the optical sensor straddles that boundary.
  • the measurement position referred to here means a position where the light emitted from the optical sensor hits the lower object and is reflected.
  • the flight control unit may reduce the flight speed of the drone before the measurement position of the optical sensor crosses the boundary between shade and sun.
  • Some optical sensors have a variable dynamic range of the amount of light received in order to improve the detection accuracy.
  • when the measurement position crosses from shade into sun, the dynamic range is temporarily saturated by the sudden increase in the amount of received light; the dynamic range is then adjusted, after which highly accurate measurement becomes possible even in the sun.
  • the flight speed of the drone is therefore reduced before the measurement position of the optical sensor crosses the boundary from shade to sun. As a result, the distance traveled by the drone while the amount of light received by the optical sensor is saturated can be shortened. Therefore, the applicable range of the optical sensor can be expanded, and robustness against changes in the lower object such as the ground can be improved.
  • ultrasonic waves travel much more slowly than light, so the Doppler effect accompanying the movement of the drone has a large influence on them. Therefore, when using the detection value (second distance) of the ultrasonic sensor, lowering the flight speed compared to the case of using the detection value (first distance) of the optical sensor can suppress the reduction in detection accuracy caused by the Doppler effect.
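  • the scale difference can be made concrete: the first-order fractional Doppler shift is roughly v/c of the carrier, so at the same flight speed the effect on a ~343 m/s sound wave is about six orders of magnitude larger than on light. The comparison below is illustrative; the flight speed chosen is an assumption, not a value from the patent.

```python
SPEED_OF_SOUND = 343.0          # m/s, in air at about 20 degrees C
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def fractional_doppler(v, c):
    """First-order fractional frequency shift v/c for a sensor moving
    at speed v relative to the reflecting surface."""
    return v / c

v = 5.0  # m/s, an illustrative drone flight speed
ultrasonic_shift = fractional_doppler(v, SPEED_OF_SOUND)  # ~1.5e-2
optical_shift = fractional_doppler(v, SPEED_OF_LIGHT)     # ~1.7e-8

# halving the flight speed halves the first-order ultrasonic shift,
# which is why slowing down helps when D2 is used
slow_shift = fractional_doppler(v / 2, SPEED_OF_SOUND)
```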
  • as described above, the measurement accuracy of an optical sensor may temporarily decrease due to its dynamic range; the following configuration can deal with such a case.
  • the drone according to the present invention includes an optical sensor that detects a first distance to the lower object, a flight control unit that controls the flight of the drone using the first distance as a control altitude, and a lower state determination unit that determines the surface state of the lower object.
  • it is characterized in that, when the surface state of the lower object determined by the lower state determination unit is an optical sensor accuracy reduction state in which the detection accuracy of the optical sensor is reduced, the flight control unit reduces the flight speed of the drone.
  • as a result, the applicable range of the optical sensor can be expanded, and robustness against changes in the lower object such as the ground can be improved.
  • the drone control method according to the present invention is a control method for a drone including an optical sensor that detects a first distance to a lower object and an ultrasonic sensor that detects a second distance to the lower object, in which a flight control unit controls the flight of the drone using a control altitude set by selecting one of the first distance and the second distance or a combination of both.
  • the drone control method according to the present invention is a control method for a drone including an optical sensor that detects a first distance to the lower object, a flight control unit that controls the flight of the drone using the first distance as a control altitude, and a lower state determination unit that determines the surface state of the lower object. It is characterized in that, when the surface state of the lower object determined by the lower state determination unit is an optical sensor accuracy reduction state in which the detection accuracy of the optical sensor is reduced, the flight control unit reduces the flight speed of the drone.
  • FIG. 1 is an overall configuration diagram showing an outline of a crop growing system 10 including a drone 24 according to an embodiment of the present invention.
  • the crop growing system 10 (hereinafter, also referred to as “system 10”) can diagnose the growth of the crop 502 growing in the field 500 and spray the chemicals on the crop 502.
  • the crop 502 of the present embodiment is rice (paddy rice), but other crops (for example, upland rice, wheat, barley) may also be used.
  • the system 10 has a field sensor group 20, a growth diagnosis server 22, and a user terminal 26 in addition to the drone 24.
  • the field sensor group 20, the drone 24, and the user terminal 26 can wirelessly communicate with each other via the communication network 30 (including the wireless base station 32) and can communicate with the growth diagnosis server 22.
  • for the wireless communication, LTE (Long Term Evolution), WiFi, or the like can be used, including communication that does not go through the wireless base station 32.
  • the field sensor group 20 is installed in the field 500, which is a paddy field, detects various data in the field 500, and provides the data to the growth diagnosis server 22 and the like.
  • the field sensor group 20 includes, for example, a water temperature sensor, a temperature sensor, a precipitation sensor, an illuminance meter, an anemometer, a barometer and a hygrometer.
  • the water temperature sensor detects the water temperature of the field 500, which is a paddy field.
  • the temperature sensor detects the air temperature of the field 500.
  • the precipitation sensor detects the amount of precipitation in the field 500.
  • the illuminance meter detects the amount of sunshine in the field 500.
  • the anemometer detects the wind speed of the field 500.
  • the barometer detects the barometric pressure in the field 500.
  • the hygrometer detects the humidity of the field 500.
  • the growth diagnosis server 22 (hereinafter, also referred to as "diagnosis server 22") performs growth diagnosis using the growth diagnosis model and gives work instructions to the user 600 and the like based on the diagnosis result. Work instructions include fertilizer application timing, fertilizer type and amount, pesticide application timing, pesticide type and amount, and the like.
  • the diagnostic server 22 has an input / output unit, a communication unit, a calculation unit, and a storage unit (none of which are shown).
  • the diagnosis server 22 executes growth diagnosis control for performing growth diagnosis using the growth diagnosis model, flight management control for managing the flight (flight timing, flight path, etc.) of the drone 24, and the like.
  • FIG. 2 is a configuration diagram simply showing the configuration of the drone 24 according to the present embodiment.
  • FIG. 3 is an external perspective view of the drone 24 according to the present embodiment.
  • FIG. 4 is a bottom view of the drone 24 according to the present embodiment.
  • FIG. 5 is a plan view simply showing the internal configuration of the main body 50 of the drone 24 and the arrangement around the main body 50 according to the present embodiment.
  • FIG. 6 is a side view simply showing the internal configuration of the main body 50 of the drone 24 of the present embodiment and the arrangement around the main body 50.
  • the drone 24 of the present embodiment functions as a means for acquiring an image of the field 500 (crop 502) and also as a means for spraying a chemical solution (including a liquid fertilizer) on the crop 502.
  • the drone 24 takes off from and lands at the departure and arrival point 510 (FIG. 1).
  • the drone 24 includes a drone sensor group 60, a communication unit 62, a flight mechanism 64, a photographing mechanism 66, a spraying mechanism 68, a drone control unit 70, and a power supply unit 72.
  • the drone sensor group 60 includes a global positioning system sensor (hereinafter referred to as "GPS sensor"), a gyro sensor, a liquid amount sensor (none of which are shown), a speedometer 80, an altimeter 82, and the like.
  • the gyro sensor detects the angular velocity of the drone 24.
  • the liquid amount sensor detects the amount of liquid in the tank 180 (FIG. 4) of the spraying mechanism 68.
  • the speedometer 80 detects the flight speed Vf of the drone 24.
  • the altimeter 82 detects the distance (the so-called altitude above ground level) to the object located below the drone 24 (hereinafter referred to as the "lower object Tlow").
  • the lower object Tlow includes the ground 520 (FIG. 1), the crop 502, and the like.
  • the altimeter 82 of this embodiment includes a TOF sensor 100 and an ultrasonic sensor 102.
  • the TOF sensor 100 is an optical sensor that detects a distance by a time-of-flight method using a laser.
  • the TOF sensor 100 faces downward from the drone 24 and detects the distance to the lower object Tlow, such as the ground 520 (FIG. 1) and the crop 502 (hereinafter also referred to as the "first distance D1" or "detection value D1").
  • the TOF sensor 100 has a variable dynamic range of the amount of received light in order to improve the detection accuracy.
  • the ultrasonic sensor 102 is a sensor that detects a distance using ultrasonic waves.
  • the ultrasonic sensor 102 faces downward from the drone 24 and detects the distance to the lower object Tlow, such as the ground 520 and the crop 502 (hereinafter also referred to as the "second distance D2" or "detection value D2").
  • the light irradiation direction of the TOF sensor 100 and the ultrasonic irradiation direction of the ultrasonic sensor 102 are set so that, in the measurement areas of both sensors, at least a part of the light irradiation range of the TOF sensor 100 and the ultrasonic irradiation range of the ultrasonic sensor 102 overlap. In the present embodiment, the light irradiation direction of the TOF sensor 100 and the ultrasonic irradiation direction of the ultrasonic sensor 102 are substantially the same (the central axes of the irradiation directions are substantially parallel).
  • the TOF sensor 100 and the ultrasonic sensor 102 are provided on the bottom surface portion 52 constituting the main body 50 of the drone 24. More specifically, the TOF sensor 100 and the ultrasonic sensor 102 are arranged on the left and right sides of the camera 160 (described later) on the front side of the bottom surface portion 52. In other words, the TOF sensor 100 and the ultrasonic sensor 102 are arranged side by side in the lateral direction of the drone 24. The TOF sensor 100 and the ultrasonic sensor 102 are arranged above and in front of the nozzles 186l1, 186l2, 186r1, and 186r2 (see FIGS. 3 and 4).
  • the communication unit 62 (FIG. 2) is capable of radio wave communication via the communication network 30 (FIG. 1), and includes, for example, a radio wave communication module.
  • the communication unit 62 can communicate with the field sensor group 20, the diagnosis server 22, the user terminal 26, and the like via the communication network 30 (including the wireless base station 32).
  • the radio wave communication module constituting the communication unit 62 is arranged in the control board 110 (FIG. 5).
  • the flight mechanism 64 is a mechanism for flying the drone 24. As shown in FIGS. 3 and 4, the flight mechanism 64 includes a plurality of propellers 130flu, 130fll, 130fru, 130frl, 130rlu, 130rll, 130rru, 130rrl (hereinafter collectively referred to as "propeller 130"), a plurality of propeller actuators 132flu, 132fll, 132fru, 132frl, 132rlu, 132rll, 132rru, 132rrl (hereinafter collectively referred to as "propeller actuator 132"), and propeller guards 134fl, 134fr, 134rl, 134rr (hereinafter collectively referred to as "propeller guard 134").
  • the flight mechanism 64 has a plurality of inverters 136 (FIG. 5). Arrows A in FIGS. 3 to 6 indicate the traveling direction of the drone 24.
  • the propeller 130 of the present embodiment is a so-called counter-rotating type, in which two propellers 130 (for example, propellers 130flu and 130fll) are arranged coaxially and the upper and lower propellers 130 rotate in opposite directions. In this embodiment, there are four sets of counter-rotating propellers 130.
  • each set of propellers 130 is arranged on one of the four sides of the main body 50 by arms extending from the main body 50 of the drone 24. That is, propellers 130flu and 130fll are arranged on the front left side, propellers 130fru and 130frl on the front right side, propellers 130rlu and 130rll on the rear left side, and propellers 130rru and 130rrl on the rear right side.
  • rod-shaped legs 142fl, 142fr, 142rl, and 142rr (hereinafter collectively referred to as "legs 142") extend downward.
  • the propeller actuator 132 is a means for rotating the propeller 130, and is provided for each propeller 130.
  • the propeller actuator 132 of the present embodiment is an electric motor, but other types of drive sources may be used.
  • a set of upper and lower propellers 130 (e.g., propellers 130flu and 130fll) and their corresponding propeller actuators 132 (e.g., propeller actuators 132flu and 132fll) rotate in opposite directions.
  • the inverter 136 converts the direct current from the power supply unit 72 into alternating current and supplies it to the propeller actuator 132; it is a so-called ESC (Electronic Speed Controller). Eight inverters 136 are provided, one for each propeller actuator 132. As shown in FIG. 5, the inverters 136 are provided on the bottom surface portion 52 constituting the main body 50 of the drone 24 and are arranged side by side in the front-rear direction on the left and right sides of the control board 110.
  • the photographing mechanism 66 (FIG. 2) is a mechanism for capturing an image of the field 500 or the crop 502, and has a camera 160 (FIGS. 2, 4 to 6).
  • the camera 160 of the present embodiment is a multispectral camera, and particularly acquires an image capable of analyzing the growth state of the crop 502.
  • the photographing mechanism 66 may further include an irradiation unit that irradiates the field 500 with light rays having a specific wavelength, and may be capable of receiving the reflected light from the field 500 with respect to the light rays.
  • the light rays having a specific wavelength may be, for example, red light (wavelength of about 650 nm) and near-infrared light (wavelength of about 774 nm).
  • the camera 160 is provided on the bottom surface portion 52 constituting the main body 50 of the drone 24. More specifically, the camera 160 is arranged between the TOF sensor 100 and the ultrasonic sensor 102 on the front side of the bottom surface portion 52. The camera 160 faces downward and images the lower object Tlow, such as the ground 520 and the crop 502.
  • the camera 160 outputs image data related to peripheral images taken around the drone 24.
  • the camera 160 is a video camera that shoots a moving image.
  • the camera 160 may be capable of capturing both moving images and still images, or only still images.
  • the orientation of the camera 160 (the posture of the camera 160 with respect to the main body 50 of the drone 24) can be adjusted by a camera actuator (not shown). Alternatively, the position of the camera 160 with respect to the main body 50 of the drone 24 may be fixed.
  • the spraying mechanism 68 (FIG. 2) is a mechanism for spraying a chemical (including liquid fertilizer). As shown in FIG. 4 and elsewhere, the spraying mechanism 68 includes a tank 180, a pump 182, a pipe 184, a flow rate adjusting valve (not shown), and drug nozzles 186l1, 186l2, 186r1, and 186r2 (hereinafter collectively referred to as "nozzle 186").
  • the tank 180 stores the chemicals (sprayed material) to be sprayed.
  • the pump 182 pushes the medicine in the tank 180 into the pipe 184.
  • the pipe 184 connects the tank 180 and each nozzle 186.
  • the pipe 184 is made of a hard material and may also serve to support the nozzle 186.
  • Each nozzle 186 is a means (discharge port) for spraying the drug downward.
  • the discharge port may be formed by providing one or more through holes in the pipe 184.
  • the drone control unit 70 (FIG. 2) controls the entire drone 24, such as flying, photographing, and spraying the drug.
  • the drone control unit 70 includes an input / output unit 190, a calculation unit 192, and a storage unit 194.
  • the input / output unit 190, the calculation unit 192, and the storage unit 194 are arranged on the control board 110 (FIG. 5).
  • the control board 110 is arranged near the center of the bottom surface portion 52 on the bottom surface portion 52 constituting the main body 50.
  • the input / output unit 190 inputs / outputs signals to / from each unit of the drone 24.
  • the calculation unit 192 includes a central processing unit (CPU) and operates by executing a program stored in the storage unit 194. Some of the functions executed by the calculation unit 192 can also be realized by using a logic IC (Integrated Circuit). Part of the functions of the calculation unit 192 may also be configured in hardware (circuit components) instead of the program. The same applies to the calculation unit of the diagnostic server 22 described above, the calculation unit of the user terminal 26 described later, and the like.
  • the calculation unit 192 includes a flight control unit 200, an imaging control unit 202, and a spray control unit 204.
  • the flight control unit 200 controls the flight of the drone 24 via the flight mechanism 64. Further, the flight control unit 200 executes a part of the altitude-related control related to the altitude above ground level (hereinafter, also referred to as “altitude H”) of the drone 24.
  • the shooting control unit 202 controls shooting by the drone 24 via the shooting mechanism 66.
  • the shooting control unit 202 also executes a part of altitude-related control.
  • the spraying control unit 204 controls the spraying of the drug by the drone 24 via the spraying mechanism 68.
  • the storage unit 194 stores programs and data used by the calculation unit 192, and includes a random access memory (hereinafter referred to as "RAM").
  • as the storage unit 194, a volatile memory such as a register and a non-volatile memory such as a hard disk or a flash memory can also be used.
  • the storage unit 194 may have a read-only memory (ROM) in addition to the RAM. The same applies to the storage unit of the diagnostic server 22 described above, the storage unit of the user terminal 26 described later, and the like.
  • the power supply unit 72 supplies electric power to each unit of the drone 24.
  • the power supply unit 72 has a power supply 210 (FIG. 6) and a power supply circuit 212 (FIGS. 5 and 6).
  • the power supply 210 is made of a secondary battery such as a lithium ion battery, for example.
  • the power supply circuit 212 includes a converter and the like, and distributes the electric power from the power supply 210 to each part of the drone 24.
  • the power supply circuit 212 is provided at the rear of the bottom surface portion 52 constituting the main body 50 of the drone 24. Further, as shown in FIG. 6, the power supply 210 is arranged above the power supply circuit 212.
  • the main body 50 has a removable cover 214. The power supply 210 can be attached, detached, or replaced with the cover 214 removed.
  • the user terminal 26 controls the drone 24 through the operation of the user 600 (FIG. 1), who serves as the operator in the field 500, and displays information received from the drone 24 (for example, the position, the amount of drug, the remaining battery level, the camera image, etc.).
  • the flight state (altitude, attitude, etc.) of the drone 24 is not remotely controlled by the user terminal 26, but is autonomously controlled by the drone 24. Therefore, when a flight command is transmitted from the user 600 to the drone 24 via the user terminal 26, the drone 24 performs autonomous flight.
  • manual operations may be possible during basic operations such as takeoff and return, and in emergencies.
  • the user terminal 26 includes an input / output unit (including a touch panel and the like), a communication unit, a calculation unit, and a storage unit (not shown), and is composed of, for example, a general tablet terminal.
  • the user terminal 26 of the present embodiment receives and displays a work instruction or the like from the growth diagnosis server 22.
  • another user terminal used by another user other than the operator may be provided.
  • the other user terminal may be a mobile information terminal that receives and displays flight information of the drone 24 (current flight status, scheduled flight end time, etc.), work instructions for the user 602, growth diagnosis information, and the like from the diagnosis server 22 or the drone 24.
  • the other user terminal may be a terminal used by the user 600 or the like in order to use the growth diagnosis by the growth diagnosis server 22 in a place other than the field 500 (for example, the company to which the user 600 belongs).
  • the diagnosis server 22 of the present embodiment performs growth diagnosis control and flight management control.
  • the growth diagnosis control is a control for performing a growth diagnosis using a growth diagnosis model.
  • the growth diagnosis referred to here includes, for example, an estimated value (estimated yield) of the yield for each field 500.
  • work instructions regarding water management, fertilization, chemical spraying, etc. of the field 500 as a paddy field are also given.
  • the work instruction is displayed on, for example, the display unit of the user terminal 26.
  • in the growth diagnosis, the yield of the crop 502 (paddy rice), the red light absorption rate, the number of paddy (grains), the effective light-receiving area ratio, the amount of accumulated starch in the paddy, and the protein content in the paddy can be calculated.
  • Flight management control is a control that manages the flight of the drone 24.
  • in the flight management control, the flight timing, flight path, target speed, target altitude, imaging method of the imaging mechanism 66, spraying method of the spraying mechanism 68, and the like of the drone 24 are set based on the work instructions in the growth diagnosis control.
  • in the drone 24 of the present embodiment, flight control, imaging control, and chemical spraying control are performed.
  • the flight control is a control for flying the drone 24 in the field 500 for photographing, spraying chemicals, and the like.
  • the flight control unit 200 controls the flight mechanism 64 based on a command from the diagnostic server 22.
  • altitude-related control is performed as part of flight control. Details of altitude-related control will be described later with reference to FIG.
  • the imaging control is a control that acquires an image of the field 500 (or crop 502) by the camera 160 of the drone 24 and transmits it to the diagnostic server 22.
  • the imaging control unit 202 controls the imaging mechanism 66 based on a command from the diagnostic server 22.
  • the field image transmitted to the diagnosis server 22 is image-processed and used for growth diagnosis.
  • the chemical spraying control is a control for spraying a chemical solution (including liquid fertilizer) using the drone 24.
  • the spraying control unit 204 controls the spraying mechanism 68 based on a command from the diagnostic server 22.
  • the altitude-related control is the control related to the altitude H of the drone 24 and is a part of the flight control.
  • the TOF sensor 100 and the ultrasonic sensor 102 are included as sensors for detecting the altitude H of the drone 24 (FIGS. 2, 4 and 5).
  • the drone control unit 70 sets the altitude H used in the flight control of the drone 24 (hereinafter also referred to as the "control altitude Hc" or "detected altitude Hc") based on the detection values of the TOF sensor 100 and the ultrasonic sensor 102 (the first distance D1 and the second distance D2).
  • basically, the detection value (first distance D1) of the TOF sensor 100 is used as the control altitude Hc, and in special situations the detection value (second distance D2) of the ultrasonic sensor 102 is used instead. Further, the drone control unit 70 reduces the flight speed Vf of the drone 24 in special situations. The detected altitude Hc is compared with the target altitude and used for altitude control of the drone 24.
  • FIG. 7 is a flowchart of the altitude-related control of the present embodiment.
  • the altitude-related control of the present embodiment is basically executed by the drone control unit 70 (FIG. 2) of the drone 24. Note that some of the altitude-related controls shown in FIG. 7 can overlap with other controls in flight control.
  • in step S11, the drone control unit 70 (shooting control unit 202) acquires the lower image Ilow (an image of the ground 520 or the like) from the camera 160.
  • in step S12, the drone control unit 70 (shooting control unit 202) performs image processing on the lower image Ilow to determine the state of the lower object Tlow (the ground 520 or the like), i.e., the lower state Slow.
  • the lower state Slow here includes, for example, a state in which the lower object Tlow is a mirror surface or white, and a state in which the lower object Tlow has a boundary from shade to sun and the measurement position of the TOF sensor 100 straddles that boundary.
  • whether or not the lower object Tlow is a mirror surface or white is determined, for example, by pattern determination on a region in which the received light amount of the light receiving element of the camera 160 is saturated. Similarly, the boundary from shade to sun is determined, for example, by pattern determination on a region in which the received light amount of the light receiving element of the camera 160 is saturated.
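  • the saturation-based determination could be sketched as follows. This is a deliberately crude illustration: the 8-bit saturation threshold, the area fractions, and the state labels are assumptions for the example, and a real implementation would examine the spatial pattern of the saturated region rather than only its size.

```python
def classify_lower_state(gray_image, sat_level=250,
                         mirror_frac=0.9, boundary_frac=0.2):
    """Crude lower-state (Slow) classifier from a grayscale image,
    given as a list of pixel rows with 8-bit values.

    Returns "mirror_or_white" when almost the whole frame is
    saturated, "shade_sun_boundary" when a substantial fraction is
    saturated (part of the frame in sun, part in shade), and
    "normal" otherwise.
    """
    pixels = [p for row in gray_image for p in row]
    saturated = sum(1 for p in pixels if p >= sat_level)
    frac = saturated / len(pixels)
    if frac >= mirror_frac:
        return "mirror_or_white"
    if frac >= boundary_frac:
        return "shade_sun_boundary"
    return "normal"

# a frame whose right half is in direct sun (saturated pixels)
frame = [[40] * 4 + [255] * 4 for _ in range(4)]
state = classify_lower_state(frame)   # "shade_sun_boundary"
```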
  • the determined downward state Slow is notified from the shooting control unit 202 to the flight control unit 200.
  • In step S13, the drone control unit 70 (flight control unit 200) determines, based on the lower state Slow, whether or not the detection accuracy of the TOF sensor 100 is reduced (TOF sensor accuracy reduction state).
  • The TOF sensor accuracy reduction state includes a state in which the lower target Tlow is a mirror surface or white, and a state in which the drone 24 straddles the boundary from the shade to the sun. In other cases, it is determined that the TOF sensor accuracy is not reduced.
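A minimal sketch of this saturation-based judgment, assuming an 8-bit lower image where full-scale pixels indicate that the light receiving element is saturated (the threshold value and all names are assumptions for illustration):

```python
import numpy as np

SATURATED = 255        # assumed full-scale value of an 8-bit light receiving element
AREA_THRESHOLD = 0.3   # assumed fraction of saturated pixels that triggers the state

def tof_accuracy_reduced(lower_image: np.ndarray) -> bool:
    """Rough sketch of the step S13 judgment: flag the TOF sensor accuracy
    reduction state when a large region of the lower image Ilow is saturated,
    as over a mirror-like or white target or at a shade-to-sun boundary."""
    saturated_fraction = np.mean(lower_image >= SATURATED)
    return bool(saturated_fraction >= AREA_THRESHOLD)

shade = np.full((8, 8), 60, dtype=np.uint8)     # uniformly dark image: normal
half_sun = shade.copy()
half_sun[:, 4:] = 255                           # right half saturated: boundary
print(tof_accuracy_reduced(shade))      # False
print(tof_accuracy_reduced(half_sun))   # True
```

A real implementation would use pattern determination (shape and position of the saturated region) rather than a bare area fraction, as the text describes.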
  • When the TOF sensor accuracy is not reduced (S13: true), the drone control unit 70 determines that it is a normal time and proceeds to step S14.
  • In step S14, the drone control unit 70 (flight control unit 200) sets the detection value of the TOF sensor 100 (first distance D1) as the control altitude Hc.
  • In step S15, the drone control unit 70 (flight control unit 200) determines whether or not the established TOF sensor accuracy reduction state is the state of straddling the boundary from the shade to the sun (in other words, whether or not the drone 24 is passing through the boundary). If the boundary from the shade to the sun is being passed (S15: true), the process proceeds to step S16.
  • In step S16, the drone control unit 70 (flight control unit 200) executes flight speed reduction control for reducing the flight speed Vf of the drone 24 to a predetermined value (boundary flight speed THvf).
  • the TOF sensor 100 of the present embodiment has a variable dynamic range of the amount of received light in order to improve the detection accuracy. Therefore, when the measurement position straddles the boundary from the shade to the sun, the dynamic range is temporarily saturated due to a sudden increase in the amount of light received, and then the dynamic range is adjusted to enable highly accurate measurement even in the sun.
  • the flight speed Vf of the drone 24 is reduced to the boundary flight speed THvf before the measurement position of the TOF sensor 100 crosses the boundary from the shade to the sun.
  • the distance traveled by the drone 24 can be shortened when the amount of light received by the TOF sensor 100 is saturated. Therefore, the applicable range of the TOF sensor 100 can be expanded.
  • Once the flight speed Vf drops to the boundary flight speed THvf, the flight speed Vf is maintained at the boundary flight speed THvf.
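The reduce-and-hold behavior described above might be sketched as follows (illustrative Python; the boundary flight speed value THvf is an assumed placeholder, not specified in the text):

```python
def commanded_speed(current_vf: float, boundary_ahead: bool,
                    th_vf: float = 1.0) -> float:
    """Sketch of step S16: before the TOF measurement position crosses the
    shade-to-sun boundary, reduce the flight speed Vf to the boundary flight
    speed THvf and hold it there (the THvf value of 1.0 m/s is assumed)."""
    if boundary_ahead:
        return min(current_vf, th_vf)
    return current_vf

print(commanded_speed(4.0, boundary_ahead=True))   # 1.0: reduced to THvf
print(commanded_speed(0.8, boundary_ahead=True))   # 0.8: already below, maintained
print(commanded_speed(4.0, boundary_ahead=False))  # 4.0: normal flight
```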
  • In step S17, the drone control unit 70 (flight control unit 200) sets the detection value of the ultrasonic sensor 102 (second distance D2) as the control altitude Hc.
  • Steps S18 and S19 are the same as steps S11 and S12.
  • In step S20, the drone control unit 70 (flight control unit 200) determines whether or not the measurement position of the TOF sensor 100 has completed the passage of the boundary from the shade to the sun. The determination is made, based on the lower image Ilow, by judging whether the measurement position of the TOF sensor 100 has moved a predetermined distance past the boundary line from the shade to the sun. Alternatively, it is made by first determining, based on the lower image Ilow, that the measurement position of the TOF sensor 100 has passed the boundary line, and then comparing the detection value D1 of the TOF sensor 100 with the detection value D2 of the ultrasonic sensor 102.
  • In that case, the comparison is performed based on whether or not the absolute value of the difference between the detected values D1 and D2, or the ratio of the difference, is below a predetermined threshold value.
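The comparison-based completion check could look like the following sketch (the threshold values are illustrative assumptions, not from the patent):

```python
def boundary_passage_complete(d1_tof: float, d2_ultra: float,
                              abs_threshold: float = 0.05,
                              ratio_threshold: float = 0.03) -> bool:
    """Sketch of the step S20 comparison: passage is judged complete when the
    TOF detection value D1 agrees with the ultrasonic detection value D2
    again, i.e. the absolute difference or its ratio to D2 falls below a
    threshold (both threshold values are assumed)."""
    diff = abs(d1_tof - d2_ultra)
    return diff <= abs_threshold or diff / d2_ultra <= ratio_threshold

print(boundary_passage_complete(2.00, 2.02))  # True: D1 has recovered
print(boundary_passage_complete(3.10, 2.00))  # False: D1 still unreliable
```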
  • When the passage of the boundary is completed (S20: true), the drone control unit 70 (flight control unit 200) sets the detection value of the TOF sensor 100 (first distance D1) as the control altitude Hc in step S21.
  • In step S15, when the established TOF sensor accuracy reduction state is other than passage of the boundary from the shade to the sun (S15: false), it is a state in which the lower target Tlow is a mirror surface or white. In that case, the process proceeds to step S22.
  • In step S22, the drone control unit 70 (flight control unit 200) sets the detection value of the ultrasonic sensor 102 (second distance D2) as the control altitude Hc.
  • The detection accuracy of the ultrasonic sensor 102 is not reduced even for a mirror-surfaced or white object, which the TOF sensor 100 is not good at. Therefore, it is possible to improve the robustness of the drone 24 as a whole against changes in the lower target Tlow such as the ground 520.
  • In the present embodiment, the TOF sensor 100 is used as the optical sensor (FIG. 2, etc.). Compared with other types of optical sensors (phase difference detection method, triangular ranging method, etc.), the TOF sensor 100 is often lighter in weight, so using the TOF sensor 100 makes it possible to reduce the weight of the drone 24.
  • The TOF sensor 100 (optical sensor) and the ultrasonic sensor 102 are arranged side by side in the lateral direction of the drone 24 (FIGS. 4 and 5). Further, the light irradiation direction of the TOF sensor 100 and the ultrasonic irradiation direction of the ultrasonic sensor 102 are set so that, in the measurement areas of both sensors, at least part of the light irradiation range of the TOF sensor 100 and the ultrasonic irradiation range of the ultrasonic sensor 102 overlap.
  • the drone 24 may move forward in a forward leaning posture in the direction of travel (arrow A in FIG. 3 and the like).
  • Since the TOF sensor 100 and the ultrasonic sensor 102 are arranged in the lateral direction of the drone 24, the detection values of both sensors (first distance D1 and second distance D2) are less likely to deviate from each other even in such a forward leaning posture. Therefore, it is possible to suppress the dissociation between the first distance D1 by the TOF sensor 100 and the second distance D2 by the ultrasonic sensor 102.
  • the drone 24 includes a tank 180 for storing the sprayed material and a nozzle 186 (discharge port) for spraying the sprayed material (FIGS. 3 and 4). Further, the TOF sensor 100 (optical sensor) and the ultrasonic sensor 102 are arranged above the nozzle 186 (FIGS. 3 and 4). Further, the TOF sensor 100 and the ultrasonic sensor 102 are arranged in front of the nozzle 186 (FIGS. 3 and 4). As a result, it is possible to suppress the influence of the sprayed material sprayed from the nozzle 186 on the measurement of the TOF sensor 100 or the ultrasonic sensor 102.
  • The drone 24 includes a camera 160 (FIGS. 2, 4 to 6) that images the lower target Tlow, and a shooting control unit 202 (lower state determination unit; FIG. 2) that determines the state of the lower target Tlow (lower state Slow) based on the images of the camera 160.
  • In the normal state, when the optical sensor accuracy is not reduced (S13: true in FIG. 7), the flight control unit 200 selects the first distance D1 detected by the TOF sensor 100 (optical sensor) and sets it as the control altitude Hc (S14). When the optical sensor accuracy is reduced (S13: false), the flight control unit 200 selects the second distance D2 detected by the ultrasonic sensor 102 and sets it as the control altitude Hc (S17, S22).
  • As a result, in situations where the detection accuracy of the TOF sensor 100 is reduced, the ultrasonic sensor 102 is used preferentially, making it possible to control the flight of the drone 24 with high accuracy.
  • The optical sensor accuracy reduction state includes a state in which a boundary from the shade to the sun exists in the lower target Tlow and the measurement position of the TOF sensor 100 (optical sensor) straddles the boundary (S13, S15 in FIG. 7).
  • the flight control unit 200 reduces the flight speed Vf of the drone 24 before the measurement position of the TOF sensor 100 crosses the boundary from the shade to the sun (S16).
  • The TOF sensor 100 has a variable dynamic range of the amount of received light in order to improve the detection accuracy. Therefore, when the measurement position straddles the boundary from the shade to the sun, the dynamic range is temporarily saturated due to a sudden increase in the amount of light received, and then the dynamic range is adjusted to enable highly accurate measurement even in the sun.
  • In the present embodiment, the flight speed Vf of the drone 24 is reduced before the measurement position of the TOF sensor 100 crosses the boundary from the shade to the sun. As a result, the distance traveled by the drone 24 while the amount of light received by the TOF sensor 100 is saturated can be shortened. Therefore, the applicable range of the TOF sensor 100 can be expanded, and the robustness against changes in the lower target Tlow such as the ground 520 can be improved.
  • Further, since ultrasonic waves are much slower than light, the Doppler effect due to the movement of the drone 24 is larger. Therefore, when using the detected value of the ultrasonic sensor 102 (second distance D2), lowering the flight speed Vf makes it possible to suppress a decrease in detection accuracy due to the Doppler effect, compared with the case of using the detected value of the TOF sensor 100 (first distance D1).
  • The crop growing system 10 of the above embodiment has the components shown in FIG. 1.
  • However, from the viewpoint of selectively using the detection values D1 and D2 of the TOF sensor 100 and the ultrasonic sensor 102 as the control altitude Hc, the present invention is not limited to this. For example, the crop growing system 10 may have only the drone 24 and the user terminal 26. In that case, the flight of the drone 24 may be controlled by the user terminal 26.
  • In the above embodiment, the drone 24 imaged the crop 502 and sprayed the chemical solution (FIG. 1). However, from the viewpoint of selectively using the detection values D1 and D2 of the TOF sensor 100 and the ultrasonic sensor 102 as the control altitude Hc, the present invention is not limited to this. For example, the drone 24 may only image the crop 502 or only spray the chemical solution.
  • the drone 24 may be used for other purposes (for example, aerial photography other than growth diagnosis).
  • In the above embodiment, the TOF sensor 100 is used as the optical sensor. However, from the viewpoint of selectively using the detection values D1 and D2 of the optical sensor and the ultrasonic sensor 102 as the control altitude Hc, the present invention is not limited to this. For example, another type of optical sensor (phase difference detection method, triangular ranging method, etc.) may be used.
  • In the above embodiment, the TOF sensor 100 and the ultrasonic sensor 102 are arranged side by side in the lateral direction of the drone 24 (FIGS. 4 and 5). However, from the viewpoint of selectively using the detection values D1 and D2 of the TOF sensor 100 and the ultrasonic sensor 102 as the control altitude Hc, the present invention is not limited to this. For example, the TOF sensor 100 and the ultrasonic sensor 102 may be arranged side by side in the front-rear direction of the drone 24.
  • In the above embodiment, the TOF sensor 100 and the ultrasonic sensor 102 are arranged in the main body 50 of the drone 24 (FIGS. 5 and 6). However, from the viewpoint of selectively using the detection values D1 and D2 of the TOF sensor 100 and the ultrasonic sensor 102 as the control altitude Hc, the present invention is not limited to this. For example, the TOF sensor 100 and the ultrasonic sensor 102 may be arranged outside the main body 50 of the drone 24.
  • FIG. 8 is a side view simply showing the internal configuration of the main body 50 of the drone 24 and the arrangement around the main body 50 according to the modified example.
  • the drone 24 has a support member 220 that projects downward from the main body 50 and supports a TOF sensor 100, an ultrasonic sensor 102, and a camera 160.
  • As a result, even when heat generation sources (control board 110, inverter 136, power supply 210, etc.) exist in the main body 50, the TOF sensor 100, the ultrasonic sensor 102, and the camera 160 can be arranged away from the heat generation sources. Therefore, it is possible to prevent the TOF sensor 100, the ultrasonic sensor 102, and the camera 160 from being affected by the heat from the heat generation sources.
  • In the above embodiment, the sprayed product sprayed by the drone 24 was a liquid drug.
  • However, from the viewpoint of selectively using the detected values of the TOF sensor 100 and the ultrasonic sensor 102 as the control altitude Hc, the present invention is not limited to this. For example, the sprayed material may be something other than a drug (water, etc.), or may be a gas or a solid (including powder).
  • In the above embodiment, the determination of the state of the lower target Tlow is performed in real time based on the lower image Ilow of the camera 160 (S12, S19 in FIG. 7).
  • the position coordinates and the lower state Slow may be stored in association with each other in advance, and the lower state Slow may be determined based on the position coordinates during the flight of the drone 24 and the stored information.
  • the downward state Slow may be determined using means other than the TOF sensor 100, the ultrasonic sensor 102, and the camera 160 (for example, the position coordinates of the satellite image and the drone 24).
  • In the above embodiment, the state straddling the boundary from the shade to the sun was used as the optical sensor accuracy reduction state. However, from the viewpoint of selectively using the detection values D1 and D2 of the TOF sensor 100 and the ultrasonic sensor 102 as the control altitude Hc, the present invention is not limited to this. For example, a state in which the drone 24 straddles the boundary from the sun to the shade may also be used.
  • In that case, the dynamic range of the TOF sensor 100 is not saturated, but the range actually used is narrowed, so the detection accuracy may be reduced. Therefore, by switching to the ultrasonic sensor 102 and/or reducing the flight speed Vf, it is possible to improve the robustness against changes in the lower target Tlow.
  • In the above embodiment, the detection value of the TOF sensor 100 (first distance D1) is used as the control altitude Hc in the normal state (when the TOF sensor accuracy is not reduced), and the detection value of the ultrasonic sensor 102 (second distance D2) is used as the control altitude Hc when the TOF sensor accuracy is reduced (FIG. 7).
  • However, from the viewpoint of selectively using the detection values D1 and D2 of the TOF sensor 100 and the ultrasonic sensor 102 as the control altitude Hc, the present invention is not limited to this. For example, the weighting (ratio) of the first distance D1 and the second distance D2 may be changed according to the state of the lower target Tlow (lower state Slow).
  • Specifically, in the normal state, first distance D1 × 0.9 + second distance D2 × 0.1 may be set as the control altitude Hc, and when the TOF sensor accuracy is reduced, first distance D1 × 0.3 + second distance D2 × 0.7 may be set as the control altitude Hc.
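This weighted combination can be sketched directly from the example ratios in the text (the state names and function name are illustrative assumptions):

```python
# Weights taken from the illustrative values in the text; they are an
# example of the weighting approach, not values fixed by the invention.
WEIGHTS = {
    "normal":      (0.9, 0.1),   # (w1 for D1, w2 for D2)
    "tof_reduced": (0.3, 0.7),
}

def fused_control_altitude(d1: float, d2: float, lower_state: str) -> float:
    """Set the control altitude Hc as a weighted combination of the TOF
    detection value D1 and the ultrasonic detection value D2, with weights
    chosen by the lower state Slow."""
    w1, w2 = WEIGHTS[lower_state]
    return w1 * d1 + w2 * d2

print(fused_control_altitude(2.0, 2.2, "normal"))       # ≈ 2.02
print(fused_control_altitude(2.0, 2.2, "tof_reduced"))  # ≈ 2.14
```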
  • Alternatively, the detection values D1 and D2 of the TOF sensor 100 and the ultrasonic sensor 102 may be used selectively as the control altitude Hc without using the lower image Ilow of the camera 160.
  • For example, either the detection value of the TOF sensor 100 (first distance D1) or the detection value of the ultrasonic sensor 102 (second distance D2) may be used as the control altitude Hc.
  • In the above embodiment, the flight speed Vf of the drone 24 is reduced when the measurement position of the TOF sensor 100 crosses the boundary from the shade to the sun (S16). However, the present invention is not limited to this, and the flight speed Vf may be maintained even when the measurement position of the TOF sensor 100 crosses the boundary from the shade to the sun.
  • In the above embodiment, the detection value of the TOF sensor 100 (first distance D1) is used as the control altitude Hc in the normal state (when the TOF sensor accuracy is not reduced), and the detection value of the ultrasonic sensor 102 (second distance D2) is used as the control altitude Hc when the TOF sensor accuracy is reduced (FIG. 7). That is, the detected values D1 and D2 of the TOF sensor 100 and the ultrasonic sensor 102 are used selectively as the control altitude Hc. However, it is also possible to use only the detection value of the TOF sensor 100 (first distance D1) as the control altitude Hc, without using the ultrasonic sensor 102.
  • FIG. 9 is a flowchart of altitude-related control according to a modified example.
  • In the control of FIG. 9, the detection value of the TOF sensor 100 (first distance D1) is used as the control altitude Hc without using the ultrasonic sensor 102.
  • Steps S31 and S32 in FIG. 9 are the same as S11 and S12 in FIG.
  • In step S33, the drone control unit 70 determines whether or not the lower state Slow is normal (whether or not the measurement position of the TOF sensor 100 is crossing the boundary from the shade to the sun). When the lower state Slow is normal (S33: true), the process proceeds to step S34.
  • In step S34, the drone control unit 70 sets the detection value D1 of the TOF sensor 100 as the control altitude Hc.
  • When the measurement position of the TOF sensor 100 crosses the boundary from the shade to the sun (S33: false), the process proceeds to step S35.
  • Step S35 is the same as step S16 in FIG.
  • In step S36, the drone control unit 70 (flight control unit 200) sets the detection value D1 of the TOF sensor 100 as the control altitude Hc. That is, the drone control unit 70 (flight control unit 200) continues to use the detected value D1 as the control altitude Hc.
  • Steps S37, S38, and S39 are the same as steps S18, S19, and S20 of FIG.
  • In step S40, the drone control unit 70 (flight control unit 200) sets the detection value D1 of the TOF sensor 100 as the control altitude Hc.
  • the drone control unit 70 continues to use the detected value (first distance D1) of the TOF sensor 100 as the control altitude Hc.
  • the processing of the control altitude Hc (S34, S36, S40) may be positioned as another control.
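The FIG. 9 modification can be summarized in a short sketch (illustrative Python; the THvf value is an assumed placeholder):

```python
def modified_altitude_control(d1: float, boundary_crossing: bool,
                              vf: float, th_vf: float = 1.0):
    """Sketch of the FIG. 9 modification: the ultrasonic sensor is not used,
    D1 is always the control altitude Hc, and only the flight speed is
    reduced while the shade-to-sun boundary is being crossed
    (the THvf value of 1.0 m/s is assumed)."""
    hc = d1                       # S34 / S36 / S40: Hc is always D1
    if boundary_crossing:         # S33: false branch
        vf = min(vf, th_vf)       # S35: flight speed reduction control
    return hc, vf

print(modified_altitude_control(2.0, False, 4.0))  # (2.0, 4.0)
print(modified_altitude_control(2.0, True, 4.0))   # (2.0, 1.0)
```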
  • In the control of FIG. 9, only the passage of the boundary from the shade to the sun was treated as the optical sensor accuracy reduction state (S33). However, as described above, the passage of the boundary from the sun to the shade may also be included in the accuracy reduction state.
  • the flow shown in FIG. 7 was used for the altitude-related control of the above embodiment, and the flow shown in FIG. 9 was used for the altitude-related control of the modification.
  • the content of the flow is not limited to this.
  • For example, the order of steps S16 and S17 in FIG. 7 can be exchanged.
  • 24 ... Drone, 100 ... TOF sensor (optical sensor), 102 ... Ultrasonic sensor, 160 ... Camera, 180 ... Tank, 186l1, 186l2, 186r1, 186r2 ... Nozzle (discharge port), 200 ... Flight control unit, 202 ... Shooting control unit (downward state determination unit), 220 ... Support member, D1 ... First distance, D2 ... Second distance, Hc ... Control altitude, Tlow ... Downward target, Vf ... Flight speed

Abstract

Provided are a drone that is robust with respect to changes in a lower object such as the ground, and a method for controlling said drone. A flight control unit (200) performs flight control of a drone (24) by using a control altitude that is set by selecting one, or combining both, of a first distance up to a lower object as detected by an optical sensor (100) and a second distance up to the lower object as detected by an ultrasonic sensor (102). Further, when the accuracy of the optical sensor is low, the flight control unit may set the control altitude by selecting the second distance or by assigning greater weight to the second distance than to the first distance, and further, may decrease flight velocity.

Description

Drone and drone control method
The present invention relates to a drone and a drone control method.
Altitude information may be used for drone flight control. In Patent Document 1, an altimeter 36 is used that emits a laser beam or an ultrasonic signal below the unmanned flying object 1 to generate an output indicating altitude from the reflected wave ([0019]).
Japanese Unexamined Patent Publication No. 2018-192932
As described above, Patent Document 1 uses an altimeter 36 that radiates a laser beam or an ultrasonic signal below the unmanned flying object 1 to generate an output indicating altitude from the reflected wave ([0019]). The altimeter 36 here appears to be assumed to use only one of the laser beam and the ultrasonic signal.
However, in the case of an optical sensor using a laser beam or the like, when reflected light from a mirror-like object (for example, a water surface with high reflectance) is received, the amount of received light may be so large that the measurement range (dynamic range) saturates and the detection accuracy decreases. Further, even with an optical sensor having a variable measurement range, if the sensor suddenly moves from a dark place (shade, etc.) to a bright place (sunlight, etc.), the change of the measurement range may lag and the detection accuracy may decrease.
Furthermore, since reflections of ultrasonic waves from soft objects that easily absorb sound are weak, the detection accuracy may decrease when such objects exist.
The present invention has been made in consideration of the above problems, and an object of the present invention is to provide a drone that is robust against changes in a lower object such as the ground, and a control method thereof.
The drone according to the present invention includes:
an optical sensor that detects a first distance to a lower object; and
an ultrasonic sensor that detects a second distance to the lower object;
and further includes a flight control unit that performs flight control of the drone using a control altitude set by selecting one of the first distance and the second distance or by combining both.
According to the present invention, the flight of the drone is controlled using a control altitude set by selecting one of, or combining, the first distance to the lower object (the ground, etc.) detected by the optical sensor and the second distance to the lower object detected by the ultrasonic sensor. This makes it possible to fly the drone taking into account the advantages and disadvantages of the optical sensor and the ultrasonic sensor. Therefore, it is possible to improve the robustness against changes in a lower object such as the ground. As the optical sensor, for example, a laser type can be used.
As the optical sensor, for example, a time-of-flight type sensor (TOF sensor) may be used. Compared with other types of optical sensors (phase difference detection method, triangular ranging method, etc.), a TOF sensor is often lighter in weight, and using a TOF sensor makes it possible to reduce the weight of the drone.
The optical sensor and the ultrasonic sensor may be arranged side by side in the lateral direction of the drone. Further, the light irradiation direction of the optical sensor and the ultrasonic irradiation direction of the ultrasonic sensor may be set so that at least part of the light irradiation range of the optical sensor and the ultrasonic irradiation range of the ultrasonic sensor overlap. The drone may move forward in a forward leaning posture in the direction of travel. If the optical sensor and the ultrasonic sensor are arranged in the lateral direction of the drone, the detected values of both sensors are less likely to deviate from each other even in such a forward leaning posture. Therefore, it is possible to suppress the discrepancy between the first distance by the optical sensor and the second distance by the ultrasonic sensor.
The drone may have a support member that projects downward from the main body of the drone to support the optical sensor and the ultrasonic sensor. As a result, even when a heat generation source (power distribution board, inverter, power supply, image processing unit of a camera, etc.) exists in the main body of the drone, the optical sensor and the ultrasonic sensor can be arranged away from the heat generation source. Therefore, it is possible to suppress the influence of the heat from the heat generation source on the optical sensor and the ultrasonic sensor.
The drone may include a tank for storing the sprayed material and a discharge port (nozzle, etc.) for spraying the sprayed material. Further, the optical sensor and the ultrasonic sensor may be arranged above the discharge port. Further, the optical sensor and the ultrasonic sensor may be arranged in front of the discharge port. As a result, it is possible to suppress the influence of the sprayed material sprayed from the discharge port during the flight of the drone on the measurement of the optical sensor or the ultrasonic sensor.
The drone may further include a camera that images the lower object, and a lower state determination unit that determines the state of the lower object based on the image of the camera. When the state of the lower object determined by the lower state determination unit is not an optical sensor accuracy reduction state in which the detection accuracy of the optical sensor is reduced (normal time), the flight control unit may set the control altitude by selecting the first distance detected by the optical sensor or by weighting the first distance more heavily than the second distance detected by the ultrasonic sensor. Further, when the state of the lower object is the optical sensor accuracy reduction state, the flight control unit may set the control altitude by selecting the second distance or by weighting the second distance more heavily than the first distance.
As a result, when the optical sensor normally has higher detection accuracy than the ultrasonic sensor, the ultrasonic sensor is used preferentially in situations where the detection accuracy of the optical sensor is reduced, making it possible to control the flight of the drone with high accuracy.
The optical sensor accuracy reduction state may include a state in which the lower object is a mirror surface or white, or a state in which a boundary between shade and sun exists in the lower object and the measurement position of the optical sensor straddles the boundary. The measurement position referred to here means the position where the light emitted from the optical sensor hits the lower object and is reflected.
When the optical sensor accuracy reduction state is a state in which a boundary between shade and sun exists in the lower object and the measurement position of the optical sensor straddles the boundary, the flight control unit may reduce the flight speed of the drone before the measurement position of the optical sensor crosses the boundary between the shade and the sun.
Some optical sensors have a variable dynamic range of the amount of received light in order to improve the detection accuracy. With such an optical sensor, when the measurement position straddles the boundary from the shade to the sun, the dynamic range is temporarily saturated by the sudden increase in the amount of received light, and after the dynamic range is adjusted, highly accurate measurement becomes possible even in the sun. In the present invention, the flight speed of the drone is reduced before the measurement position of the optical sensor crosses the boundary from the shade to the sun. As a result, the distance traveled by the drone while the amount of light received by the optical sensor is saturated can be shortened. Therefore, the applicable range of the optical sensor can be expanded, and the robustness against changes in a lower object such as the ground can be improved.
Further, ultrasonic waves are much slower than light, so the Doppler effect accompanying the movement of the drone has a larger influence. Therefore, when using the detected value of the ultrasonic sensor (second distance), lowering the flight speed makes it possible to suppress the decrease in detection accuracy due to the Doppler effect, compared with the case of using the detected value of the optical sensor (first distance).
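A rough order-of-magnitude check of this point: the relative Doppler shift scales with the ratio of the flight speed to the propagation speed of the wave. The propagation speeds below are standard approximate values, and the flight speed is an assumed example, not a value from the patent:

```python
V_SOUND = 343.0   # approximate speed of sound in air at ~20 °C, m/s
V_LIGHT = 3.0e8   # approximate speed of light, m/s

def relative_doppler(vf: float, wave_speed: float) -> float:
    """First-order relative Doppler shift v/c for a platform moving at vf."""
    return vf / wave_speed

vf = 5.0  # assumed example flight speed Vf, m/s
print(f"ultrasonic: {relative_doppler(vf, V_SOUND):.2%}")  # about 1.5 %
print(f"optical:    {relative_doppler(vf, V_LIGHT):.2e}")  # about 2e-8
```

The ultrasonic shift is several orders of magnitude larger, which is why reducing Vf matters mainly when the ultrasonic detection value is in use.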
Furthermore, even when the measurement position straddles the boundary from the sun to the shade, the measurement accuracy may temporarily decrease due to the dynamic range. The present invention may also address such a case.
The drone according to the present invention includes:
an optical sensor that detects a first distance to a lower object;
a flight control unit that performs flight control of the drone using the first distance as a control altitude; and
a lower state determination unit that determines the surface state of the lower object;
wherein, when the surface state of the lower object determined by the lower state determination unit is an optical sensor accuracy reduction state in which the detection accuracy of the optical sensor is reduced, the flight control unit reduces the flight speed of the drone.
Even with the above configuration, the applicable range of the optical sensor can be expanded, and the robustness against changes in a lower object such as the ground can be improved.
 The drone control method according to the present invention is a method of controlling a drone comprising:
 an optical sensor that detects a first distance to an object below; and
 an ultrasonic sensor that detects a second distance to the object below,
 wherein a flight control unit performs flight control of the drone using a control altitude set by selecting one of the first distance and the second distance or by combining both.
 The drone control method according to the present invention is a method of controlling a drone comprising:
 an optical sensor that detects a first distance to an object below;
 a flight control unit that performs flight control of the drone using the first distance as a control altitude; and
 a lower-state determination unit that determines the surface state of the object below,
 wherein, when the surface state of the object below determined by the lower-state determination unit is an optical-sensor accuracy degradation state in which the detection accuracy of the optical sensor is reduced, the flight control unit reduces the flight speed of the drone.
 According to the present invention, robustness against changes in the object below, such as the ground, can be improved.
FIG. 1 is an overall configuration diagram showing an outline of a crop growing system including a drone according to an embodiment of the present invention.
FIG. 2 is a configuration diagram schematically showing the configuration of the drone according to the embodiment.
FIG. 3 is an external perspective view of the drone according to the embodiment.
FIG. 4 is a bottom view of the drone according to the embodiment.
FIG. 5 is a plan view schematically showing the internal configuration of the main body of the drone according to the embodiment and the arrangement around it.
FIG. 6 is a side view schematically showing the internal configuration of the main body of the drone according to the embodiment and the arrangement around it.
FIG. 7 is a flowchart of the altitude-related control of the embodiment.
FIG. 8 is a side view schematically showing the internal configuration of the main body of a drone according to a modification and the arrangement around it.
FIG. 9 is a flowchart of altitude-related control according to a modification.
A. One Embodiment
<A-1. Configuration>
[A-1-1. Overall configuration]
 FIG. 1 is an overall configuration diagram showing an outline of a crop growing system 10 including a drone 24 according to an embodiment of the present invention. The crop growing system 10 (hereinafter also "system 10") can diagnose the growth of a crop 502 growing in a field 500 and spray chemicals on the crop 502. The crop 502 of the present embodiment is rice (paddy rice), but other crops (for example, upland rice, wheat, or barley) may also be used.
 As shown in FIG. 1, the system 10 has, in addition to the drone 24, a field sensor group 20, a growth diagnosis server 22, and a user terminal 26. The field sensor group 20, the drone 24, and the user terminal 26 can communicate wirelessly with one another via a communication network 30 (including a wireless base station 32) and can communicate with the growth diagnosis server 22. As the wireless communication, communication that does not pass through the wireless base station 32 (for example, LTE (Long Term Evolution) or WiFi) can be used.
[A-1-2. Field sensor group 20]
 The field sensor group 20 is installed in the field 500, which is a paddy field, detects various data in the field 500, and provides the data to the growth diagnosis server 22 and the like. The field sensor group 20 includes, for example, a water temperature sensor, an air temperature sensor, a precipitation sensor, an illuminance meter, an anemometer, a barometer, and a hygrometer. The water temperature sensor detects the water temperature of the field 500, which is a paddy field. The air temperature sensor detects the air temperature of the field 500. The precipitation sensor detects the amount of precipitation in the field 500. The illuminance meter detects the amount of sunshine in the field 500. The anemometer detects the wind speed in the field 500. The barometer detects the atmospheric pressure in the field 500. The hygrometer detects the humidity in the field 500. Some of these sensor values may instead be acquired from a weather information server or the like (not shown).
[A-1-3. Growth diagnosis server 22]
 The growth diagnosis server 22 (hereinafter also "diagnosis server 22") performs growth diagnosis using a growth diagnosis model and issues work instructions to a user 600 and others based on the diagnosis results. The work instructions include fertilization timing, fertilizer type and amount, pesticide spraying timing, pesticide type and amount, and the like. The diagnosis server 22 has an input/output unit, a communication unit, a computation unit, and a storage unit (none shown). The diagnosis server 22 also executes growth diagnosis control, which performs growth diagnosis using the growth diagnosis model, flight management control, which manages the flight of the drone 24 (flight timing, flight path, etc.), and the like.
[A-1-4. Drone 24]
(A-1-4-1. Overview)
 FIG. 2 is a configuration diagram schematically showing the configuration of the drone 24 according to the present embodiment. FIG. 3 is an external perspective view of the drone 24 according to the present embodiment. FIG. 4 is a bottom view of the drone 24 according to the present embodiment. FIG. 5 is a plan view schematically showing the internal configuration of a main body 50 of the drone 24 according to the present embodiment and the arrangement around it. FIG. 6 is a side view schematically showing the internal configuration of the main body 50 of the drone 24 of the present embodiment and the arrangement around it. The drone 24 of the present embodiment functions as a means of acquiring images of the field 500 (crop 502) and also as a means of spraying a chemical solution (including liquid fertilizer) on the crop 502. The drone 24 takes off and lands at a departure/arrival point 510 (FIG. 1).
 As shown in FIG. 2, the drone 24 has a drone sensor group 60, a communication unit 62, a flight mechanism 64, an imaging mechanism 66, a spraying mechanism 68, a drone control unit 70, and a power supply unit 72.
(A-1-4-2. Drone sensor group 60)
 The drone sensor group 60 includes a global positioning system sensor (hereinafter "GPS sensor"), a gyro sensor, a liquid amount sensor (none shown), a speedometer 80, an altimeter 82, and the like. The GPS sensor outputs current position information of the drone 24. The gyro sensor detects the angular velocity of the drone 24. The liquid amount sensor detects the amount of liquid in a tank 180 (FIG. 4) of the spraying mechanism 68. The speedometer 80 detects the flight speed Vf of the drone 24.
 The altimeter 82 detects the distance (the so-called altitude above ground) to an object located below the drone 24 (hereinafter "lower target Tlow"). The lower target Tlow includes the ground 520 (FIG. 1), the crop 502, and the like. The altimeter 82 of this embodiment includes a TOF sensor 100 and an ultrasonic sensor 102. The TOF sensor 100 is an optical sensor that detects distance by a time-of-flight method using a laser. The TOF sensor 100 faces downward from the drone 24 and detects the distance to the lower target Tlow, such as the ground 520 (FIG. 1) or the crop 502 (hereinafter also "first distance D1" or "detection value D1"). The TOF sensor 100 makes the dynamic range of the received light amount variable in order to improve detection accuracy.
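The time-of-flight principle the TOF sensor 100 relies on can be summarized in a minimal sketch (generic physics only; this is not the sensor's actual interface): the first distance D1 is half the laser round-trip time multiplied by the speed of light.

```python
# Generic time-of-flight distance calculation (illustrative, not sensor API).

C_LIGHT = 299_792_458.0  # speed of light, m/s

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance (e.g. the first distance D1) from a laser round-trip time."""
    return C_LIGHT * round_trip_time_s / 2.0

# A ~2 m altitude corresponds to a round trip of about 13.3 ns.
print(tof_distance_m(13.34e-9))  # roughly 2.0
```

The nanosecond-scale timing is one reason such sensors need careful dynamic-range control of the receiving element.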
 The ultrasonic sensor 102 is a sensor that detects distance using ultrasonic waves. The ultrasonic sensor 102 faces downward from the drone 24 and detects the distance to the lower target Tlow, such as the ground 520 or the crop 502 (hereinafter also "second distance D2" or "detection value D2"). The light irradiation direction of the TOF sensor 100 and the ultrasonic irradiation direction of the ultrasonic sensor 102 are set so that, in the measurement region of the two sensors, the light irradiation range of the TOF sensor 100 and the ultrasonic irradiation range of the ultrasonic sensor 102 at least partially overlap. For example, the light irradiation direction of the TOF sensor 100 and the ultrasonic irradiation direction of the ultrasonic sensor 102 are substantially the same (their central axes in the irradiation direction are substantially parallel).
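A companion sketch for the ultrasonic measurement (again generic physics, not the actual sensor interface): the second distance D2 follows from the echo round-trip time, with the speed of sound depending on air temperature, which is a commonly applied correction.

```python
# Generic ultrasonic echo-ranging calculation (illustrative, not sensor API).

def speed_of_sound_ms(temp_c: float) -> float:
    """Approximate speed of sound in air, m/s (linear approximation)."""
    return 331.3 + 0.606 * temp_c

def ultrasonic_distance_m(round_trip_time_s: float, temp_c: float = 20.0) -> float:
    """Distance (e.g. the second distance D2) from an echo round-trip time."""
    return speed_of_sound_ms(temp_c) * round_trip_time_s / 2.0

# A ~2 m altitude at 20 degC gives a round trip of about 11.6 ms,
# roughly a million times longer than the equivalent laser round trip.
print(ultrasonic_distance_m(11.6e-3))
```

The millisecond-scale round trip also illustrates why the drone's motion (Doppler effect, measurement position drift) matters more for the ultrasonic value than for the optical one.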
 As shown in FIG. 5, the TOF sensor 100 and the ultrasonic sensor 102 are provided on a bottom surface portion 52 constituting the main body 50 of the drone 24. More specifically, the TOF sensor 100 and the ultrasonic sensor 102 are arranged to the left and right of a camera 160 (described later) on the front side of the bottom surface portion 52. In other words, the TOF sensor 100 and the ultrasonic sensor 102 are arranged side by side in the lateral direction of the drone 24. The TOF sensor 100 and the ultrasonic sensor 102 are arranged above and forward of nozzles 186l1, 186l2, 186r1, and 186r2 (see FIGS. 3 and 4).
(A-1-4-3. Communication unit 62)
 The communication unit 62 (FIG. 2) is capable of radio communication via the communication network 30 (FIG. 1) and includes, for example, a radio communication module. Via the communication network 30 (including the wireless base station 32), the communication unit 62 can communicate with the field sensor group 20, the diagnosis server 22, the user terminal 26, and the like. The radio communication module constituting the communication unit 62 is arranged on a control board 110 (FIG. 5).
(A-1-4-4. Flight mechanism 64)
 The flight mechanism 64 is a mechanism for flying the drone 24. As shown in FIGS. 3 and 4, the flight mechanism 64 has a plurality of propellers 130flu, 130fll, 130fru, 130frl, 130rlu, 130rll, 130rru, 130rrl (hereinafter collectively "propellers 130"), a plurality of propeller actuators 132flu, 132fll, 132fru, 132frl, 132rlu, 132rll, 132rru, 132rrl (hereinafter collectively "propeller actuators 132"), and propeller guards 134fl, 134fr, 134rl, 134rr (hereinafter collectively "propeller guards 134").
 The flight mechanism 64 also has a plurality of inverters 136 (FIG. 5). Arrow A in FIGS. 3 to 6 indicates the traveling direction of the drone 24. As shown in FIG. 3, the propellers 130 of the present embodiment are of a so-called counter-rotating type: two propellers 130 (for example, propellers 130flu and 130fll) are arranged coaxially, and the upper and lower propellers 130 rotate in opposite directions. In the present embodiment, there are four sets of counter-rotating propellers 130.
 As shown in FIGS. 3 and 4, the propellers 130 are arranged on four sides of the main body 50 by arms 138u, 138l, 140ru, 140rl, 140ru, 140rl extending from the main body 50 of the drone 24. That is, propellers 130flu and 130fll are arranged at the front left, propellers 130fru and 130frl at the front right, propellers 130rlu and 130rll at the rear left, and propellers 130rru and 130rrl at the rear right. Rod-shaped legs 142fl, 142fr, 142rl, 142rr (hereinafter collectively "legs 142") extend downward from the rotation axes of the propellers 130.
 The propeller actuators 132 are means for rotating the propellers 130, one being provided for each propeller 130. The propeller actuators 132 of the present embodiment are electric motors, but may be engines or the like. A set of upper and lower propellers 130 (for example, propellers 130flu and 130fll) and their corresponding propeller actuators 132 (for example, propeller actuators 132flu and 132fll) are coaxial. The paired upper and lower propeller actuators 132 rotate in opposite directions.
 The inverters 136 convert the direct current from the power supply unit 72 into alternating current and supply it to the propeller actuators 132; they are so-called ESCs (Electric Speed Controllers). Eight inverters 136 are provided, one for each propeller actuator 132. As shown in FIG. 5, the inverters 136 are provided on the bottom surface portion 52 constituting the main body 50 of the drone 24. The inverters 136 are arranged in rows in the front-rear direction on the left and right of the control board 110.
(A-1-4-5. Imaging mechanism 66)
 The imaging mechanism 66 (FIG. 2) is a mechanism for capturing images of the field 500 or the crop 502, and has the camera 160 (FIGS. 2, 4 to 6). The camera 160 of the present embodiment is a multispectral camera and, in particular, acquires images from which the growth state of the crop 502 can be analyzed. The imaging mechanism 66 may further include an irradiation unit that irradiates the field 500 with light of specific wavelengths and may be capable of receiving the light reflected from the field 500. The light of specific wavelengths may be, for example, red light (wavelength of about 650 nm) and near-infrared light (wavelength of about 774 nm). By analyzing the reflected light, the nitrogen uptake of the crop 502 can be estimated, and the growth state of the crop 502 can be analyzed based on the estimated nitrogen uptake.
 As shown in FIG. 5, the camera 160 is provided on the bottom surface portion 52 constituting the main body 50 of the drone 24. More specifically, the camera 160 is arranged between the TOF sensor 100 and the ultrasonic sensor 102 on the front side of the bottom surface portion 52. The camera 160 faces downward and images the lower target Tlow, such as the ground 520 or the crop 502.
 The camera 160 outputs image data of images captured around the drone 24. The camera 160 is a video camera that shoots moving images. Alternatively, the camera 160 may be capable of capturing both moving images and still images, or only still images.
 The orientation of the camera 160 (the attitude of the camera 160 with respect to the main body 50 of the drone 24) can be adjusted by a camera actuator (not shown). Alternatively, the position of the camera 160 with respect to the main body 50 of the drone 24 may be fixed.
(A-1-4-6. Spraying mechanism 68)
 The spraying mechanism 68 (FIG. 2) is a mechanism for spraying chemicals (including liquid fertilizer). As shown in FIG. 4 and elsewhere, the spraying mechanism 68 has the tank 180, a pump 182, piping 184, a flow control valve (not shown), and chemical nozzles 186l1, 186l2, 186r1, 186r2 (hereinafter collectively "nozzles 186").
 The tank 180 stores the chemical (spray material) to be sprayed. The pump 182 pushes the chemical in the tank 180 into the piping 184. The piping 184 connects the tank 180 to each nozzle 186. The piping 184 is made of a hard material and may also serve to support the nozzles 186.
 Each nozzle 186 is a means (discharge port) for spraying the chemical downward. Instead of providing the nozzles 186, discharge ports may be formed by providing one or more through-holes in the piping 184.
(A-1-4-7. Drone control unit 70)
 The drone control unit 70 (FIG. 2) controls the drone 24 as a whole, including its flight, imaging, and chemical spraying. As shown in FIG. 2, the drone control unit 70 includes an input/output unit 190, a computation unit 192, and a storage unit 194. The input/output unit 190, the computation unit 192, and the storage unit 194 are arranged on the control board 110 (FIG. 5). As shown in FIG. 5, the control board 110 is arranged near the center of the bottom surface portion 52 constituting the main body 50.
 The input/output unit 190 inputs and outputs signals to and from each part of the drone 24. The computation unit 192 includes a central processing unit (CPU) and operates by executing programs stored in the storage unit 194. Some of the functions executed by the computation unit 192 can also be realized using a logic IC (Integrated Circuit). Parts of the programs of the computation unit 192 can also be implemented in hardware (circuit components). The same applies to the computation unit of the diagnosis server 22 described above, the computation unit of the user terminal 26 described later, and the like.
 As shown in FIG. 2, the computation unit 192 has a flight control unit 200, an imaging control unit 202, and a spray control unit 204. The flight control unit 200 controls the flight of the drone 24 via the flight mechanism 64. The flight control unit 200 also executes part of the altitude-related control relating to the altitude of the drone 24 above ground (hereinafter also "altitude H"). The imaging control unit 202 controls imaging by the drone 24 via the imaging mechanism 66 and also executes part of the altitude-related control. The spray control unit 204 controls chemical spraying by the drone 24 via the spraying mechanism 68.
 The storage unit 194 stores the programs and data used by the computation unit 192 and includes random access memory (hereinafter "RAM"). As the RAM, volatile memory such as registers and non-volatile memory such as a hard disk or flash memory can be used. The storage unit 194 may also have read-only memory (ROM) in addition to the RAM. The same applies to the storage unit of the diagnosis server 22 described above, the storage unit of the user terminal 26 described later, and the like.
(A-1-4-8. Power supply unit 72)
 The power supply unit 72 supplies electric power to each part of the drone 24. The power supply unit 72 has a power supply 210 (FIG. 6) and a power supply circuit 212 (FIGS. 5 and 6). The power supply 210 is, for example, a secondary battery such as a lithium-ion battery. The power supply circuit 212 includes a converter and the like and distributes the electric power from the power supply 210 to each part of the drone 24.
 As shown in FIGS. 5 and 6, the power supply circuit 212 is provided at the rear of the bottom surface portion 52 constituting the main body 50 of the drone 24. As shown in FIG. 6, the power supply 210 is arranged above the power supply circuit 212. The main body 50 has a removable cover 214. The power supply 210 can be attached, detached, or replaced with the cover 214 removed.
[A-1-5. User terminal 26]
 The user terminal 26 (FIG. 1) is a portable information terminal that, in the field 500, controls the drone 24 through operation by the user 600 (FIG. 1) as operator and displays information received from the drone 24 (for example, position, chemical amount, remaining battery level, and camera images). In the present embodiment, the flight state (altitude, attitude, etc.) of the drone 24 is controlled autonomously by the drone 24 rather than remotely by the user terminal 26. Therefore, when a flight command is transmitted from the user 600 to the drone 24 via the user terminal 26, the drone 24 flies autonomously. However, manual operation may be possible during basic operations such as takeoff and return, and in emergencies. The user terminal 26 includes an input/output unit (including a touch panel, etc.), a communication unit, a computation unit, and a storage unit (none shown), and is constituted by, for example, a general tablet terminal.
 The user terminal 26 of the present embodiment also receives and displays work instructions and the like from the growth diagnosis server 22. In addition to the user terminal 26 serving as the controller of the drone 24, another user terminal used by a user other than the operator (user 600) may be provided. That other user terminal can be a portable information terminal that receives and displays, from the diagnosis server 22 or the drone 24, flight information of the drone 24 (current flight status, scheduled flight end time, etc.), work instructions for a user 602, growth diagnosis information, and the like. Alternatively, the other user terminal may be a terminal used by the user 600 or others at a location other than the field 500 (for example, the company to which the user 600 belongs) to make use of growth diagnosis by the growth diagnosis server 22.
<A-2. Control of the present embodiment>
[A-2-1. Overview]
 The diagnosis server 22 of the present embodiment performs growth diagnosis control and flight management control. The growth diagnosis control performs growth diagnosis using the growth diagnosis model. The growth diagnosis here includes, for example, an estimated yield for each field 500. The growth diagnosis control also issues work instructions concerning water management, fertilization, chemical spraying, and the like for the field 500 as a paddy field. The work instructions are displayed, for example, on the display unit of the user terminal 26. The growth diagnosis model can calculate the yield of the crop 502 (paddy rice), its red light absorption rate, grain count, effective light-receiving area ratio, amount of starch accumulated in the grain, and protein content of the grain.
 The flight management control manages the flight of the drone 24. In the flight management control, the flight timing, flight path, target speed, and target altitude of the drone 24, the imaging method of the imaging mechanism 66, the spraying method of the spraying mechanism 68, and the like are set based on the work instructions of the growth diagnosis control.
 In the drone 24 of the present embodiment, flight control, imaging control, and chemical spray control are performed. The flight control flies the drone 24 over the field 500 for imaging, chemical spraying, and the like. In the flight control, the flight control unit 200 controls the flight mechanism 64 based on commands from the diagnosis server 22. In the present embodiment, altitude-related control is executed as part of the flight control. The altitude-related control is described in detail later with reference to FIG. 7.
 The imaging control acquires images of the field 500 (or the crop 502) with the camera 160 of the drone 24 and transmits them to the diagnosis server 22. In the imaging control, the imaging control unit 202 controls the imaging mechanism 66 based on commands from the diagnosis server 22. The field images transmitted to the diagnosis server 22 are image-processed and used for growth diagnosis. The chemical spray control sprays a chemical solution (including liquid fertilizer) using the drone 24. In the chemical spray control, the spray control unit 204 controls the spraying mechanism 68 based on commands from the diagnosis server 22.
[A-2-2. Altitude-related control]
(A-2-2-1. Overview)
 As noted above, the altitude-related control relates to the altitude H of the drone 24 and is part of the flight control. As noted above, the present embodiment has the TOF sensor 100 and the ultrasonic sensor 102 as sensors for detecting the altitude H of the drone 24 (FIGS. 2, 4, and 5). In the altitude-related control of the present embodiment, the drone control unit 70 sets the altitude H used in the flight control of the drone 24 (hereinafter also "control altitude Hc" or "detected altitude Hc") based on the detection values of the TOF sensor 100 and the ultrasonic sensor 102 (the first distance D1 and the second distance D2). In the present embodiment, the detection value of the TOF sensor 100 (the first distance D1) is basically used as the control altitude Hc. In special situations, the detection value of the ultrasonic sensor 102 (the second distance D2) is used as the control altitude Hc instead. Further, in special situations the drone control unit 70 reduces the flight speed Vf of the drone 24. The detected altitude Hc is compared with the target altitude and used for altitude control of the drone 24.
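The selection policy described above can be sketched as follows. The function and field names are illustrative assumptions (the patent does not define this interface), and the speed values are placeholders: D1 is used by default, while in the special situations where optical accuracy degrades, D2 is used and the flight speed Vf is reduced.

```python
# Hedged sketch of the control-altitude selection policy (assumed names/values).
from dataclasses import dataclass

@dataclass
class AltitudeDecision:
    control_altitude_m: float  # control altitude Hc
    speed_limit_ms: float      # upper limit on flight speed Vf

def decide_altitude(d1_m: float, d2_m: float,
                    optical_degraded: bool,
                    normal_speed_ms: float = 5.0,
                    reduced_speed_ms: float = 2.0) -> AltitudeDecision:
    """Use TOF value D1 normally; use ultrasonic value D2 and slow down
    when the lower state indicates degraded optical accuracy (e.g. a
    mirror-like/white surface or a shade-to-sun boundary)."""
    if optical_degraded:
        return AltitudeDecision(d2_m, reduced_speed_ms)
    return AltitudeDecision(d1_m, normal_speed_ms)

print(decide_altitude(2.0, 2.1, False))  # D1 at normal speed
print(decide_altitude(2.0, 2.1, True))   # D2 at reduced speed
```

A real implementation could also blend D1 and D2 (the claims allow "combining both"), but the simple switch above captures the default/fallback behavior of this embodiment.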
(A-2-2-2.具体的な流れ)
 図7は、本実施形態の高度関連制御のフローチャートである。本実施形態の高度関連制御は、基本的に、ドローン24のドローン制御部70(図2)が実行する。図7に示す高度関連制御の一部は、飛行制御における他の制御と重複し得ることに留意されたい。
(A-2-2-2. Specific flow)
FIG. 7 is a flowchart of the altitude-related control of the present embodiment. The altitude-related control of the present embodiment is basically executed by the drone control unit 70 (FIG. 2) of the drone 24. Note that some of the altitude-related controls shown in FIG. 7 can overlap with other controls in flight control.
 ステップS11において、ドローン制御部70(撮影制御部202)は、カメラ160の下方画像Ilow(地面520等の画像)を取得する。ステップS12において、ドローン制御部70(撮影制御部202)は、下方画像Ilowを画像処理して下方対象Tlow(地面520等)の状態(下方状態Slow)を判定する。ここでの下方状態Slowには、例えば、下方対象Tlowが鏡面又は白色である状態、下方対象Tlowに日陰から日向への境界が存在し、TOFセンサ100の測定位置が境界を跨ぐ状態が含まれる。 In step S11, the drone control unit 70 (imaging control unit 202) acquires the lower image Ilow (an image of the ground 520 or the like) from the camera 160. In step S12, the drone control unit 70 (imaging control unit 202) processes the lower image Ilow to determine the state (lower state Slow) of the lower target Tlow (the ground 520 or the like). The lower state Slow here includes, for example, a state in which the lower target Tlow is a mirror surface or white, and a state in which a shade-to-sun boundary exists on the lower target Tlow and the measurement position of the TOF sensor 100 straddles that boundary.
 下方対象Tlowが鏡面又は白色であるか否かの判定は、例えば、カメラ160の受光素子の受光量が飽和している領域のパターン判定により行う。同様に、日陰から日向への境界の判定は、例えば、カメラ160の受光素子の受光量が飽和している領域のパターン判定により行う。判定された下方状態Slowは、撮影制御部202から飛行制御部200に通知される。 Whether the lower target Tlow is a mirror surface or white is determined, for example, by pattern determination of a region in which the light-receiving elements of the camera 160 are saturated. Similarly, a shade-to-sun boundary is detected, for example, by pattern determination of a region in which the light-receiving elements of the camera 160 are saturated. The determined lower state Slow is notified from the imaging control unit 202 to the flight control unit 200.
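The saturation-based pattern determination described above can be illustrated with a short sketch. The patent only states that the judgment uses regions in which the camera's light-receiving elements are saturated; the grayscale pixel representation, full-scale value, and area threshold below are assumptions for this example, not the disclosed algorithm.

```python
# Illustrative sketch of the saturation-based pattern determination of
# steps S12/S13. Pixel format, full-scale value, and the area threshold
# are assumptions; the patent does not disclose these details.

def saturated_fraction(pixels, full_scale=255):
    """Fraction of pixels at or above the sensor's full-scale value."""
    flat = [p for row in pixels for p in row]
    return sum(p >= full_scale for p in flat) / len(flat)

def looks_mirror_or_white(pixels, area_threshold=0.5):
    """Rough judgment: a large saturated area over the lower target Tlow
    suggests a mirror-like or white surface (area only, no shape pattern)."""
    return saturated_fraction(pixels) >= area_threshold
```

A real implementation would examine the shape of the saturated region (a line of saturation for a shade-to-sun boundary versus a broad patch for a mirror-like surface); the area-only test here is a simplification.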
 ステップS13において、ドローン制御部70(飛行制御部200)は、TOFセンサ100の検出精度が低下する状態(TOFセンサ精度低下状態)でないか否かを、下方状態Slowに基づいて判定する。TOFセンサ精度低下状態としては、下方対象Tlowが鏡面又は白色である状態、及びドローン24が日陰から日向への境界を跨ぐ状態が含まれる。それら以外の場合、TOFセンサ精度低下状態ではないと判定される。TOFセンサ精度低下状態でない場合(S13:真)、ドローン制御部70は、通常時であると判定して、ステップS14に進む。 In step S13, the drone control unit 70 (flight control unit 200) determines, based on the lower state Slow, whether or not the detection accuracy of the TOF sensor 100 is in a reduced state (TOF sensor accuracy reduction state). The TOF sensor accuracy reduction state includes the state in which the lower target Tlow is a mirror surface or white, and the state in which the drone 24 straddles the shade-to-sun boundary. In other cases, it is determined that there is no TOF sensor accuracy reduction state. When there is no TOF sensor accuracy reduction state (S13: true), the drone control unit 70 determines that conditions are normal and proceeds to step S14.
 ステップS14において、ドローン制御部70(飛行制御部200)は、TOFセンサ100の検出値(第1距離D1)を制御用高度Hcとして設定する。 In step S14, the drone control unit 70 (flight control unit 200) sets the detection value (first distance D1) of the TOF sensor 100 as the control altitude Hc.
 TOFセンサ精度低下状態である場合(S13:偽)、ステップS15に進む。ステップS15において、ドローン制御部70(飛行制御部200)は、成立したTOFセンサ精度低下状態が、日陰から日向への境界を跨ぐ状態であるか否か(換言すると、ドローン24が当該境界を通過中であるか否か)を判定する。日陰から日向への境界を通過中である場合(S15:真)、ステップS16に進む。 When the TOF sensor accuracy reduction state holds (S13: false), the process proceeds to step S15. In step S15, the drone control unit 70 (flight control unit 200) determines whether or not the established TOF sensor accuracy reduction state is the state of straddling the shade-to-sun boundary (in other words, whether the drone 24 is passing through that boundary). When the shade-to-sun boundary is being passed (S15: true), the process proceeds to step S16.
 ステップS16において、ドローン制御部70(飛行制御部200)は、ドローン24の飛行速度Vfを所定値(境界飛行速度THvf)まで低下させる飛行速度低下制御を実行する。上記のように、本実施形態のTOFセンサ100は、検出精度を高めるために受光量のダイナミックレンジを可変とする。そのため、日陰から日向への境界を測定位置が跨ぐ場合、急激な受光量の増加によりダイナミックレンジが一時的に飽和し、その後、ダイナミックレンジの調整により、日向でも高精度な測定が可能となる。 In step S16, the drone control unit 70 (flight control unit 200) executes flight speed reduction control for reducing the flight speed Vf of the drone 24 to a predetermined value (boundary flight speed THvf). As described above, the TOF sensor 100 of the present embodiment has a variable dynamic range of the amount of received light in order to improve the detection accuracy. Therefore, when the measurement position straddles the boundary from the shade to the sun, the dynamic range is temporarily saturated due to a sudden increase in the amount of light received, and then the dynamic range is adjusted to enable highly accurate measurement even in the sun.
 そこで、本実施形態では、TOFセンサ100の測定位置が日陰から日向への境界を跨ぐ手前で、ドローン24の飛行速度Vfを境界飛行速度THvfまで低下させる。これにより、TOFセンサ100の受光量が飽和している状態でドローン24が進む距離を短くすることができる。従って、TOFセンサ100の適用範囲を広げることが可能となる。飛行速度Vfが境界飛行速度THvfまで低下した後は、飛行速度Vfを境界飛行速度THvfで維持する。 Therefore, in the present embodiment, the flight speed Vf of the drone 24 is reduced to the boundary flight speed THvf before the measurement position of the TOF sensor 100 crosses the boundary from the shade to the sun. As a result, the distance traveled by the drone 24 can be shortened when the amount of light received by the TOF sensor 100 is saturated. Therefore, the applicable range of the TOF sensor 100 can be expanded. After the flight speed Vf drops to the boundary flight speed THvf, the flight speed Vf is maintained at the boundary flight speed THvf.
 ステップS17において、ドローン制御部70(飛行制御部200)は、超音波センサ102の検出値(第2距離D2)を制御用高度Hcとして設定する。ステップS18、S19は、ステップS11、S12と同様である。 In step S17, the drone control unit 70 (flight control unit 200) sets the detection value (second distance D2) of the ultrasonic sensor 102 as the control altitude Hc. Steps S18 and S19 are the same as steps S11 and S12.
 ステップS20において、ドローン制御部70(飛行制御部200)は、TOFセンサ100の測定位置が、日陰から日向への境界の通過を終了したか否かを判定する。当該判定は、TOFセンサ100の測定位置が日陰から日向への境界線を通過した後、当該境界線から所定距離離れたか否かを下方画像Ilowに基づいて判定することで行う。或いは、TOFセンサ100の測定位置が日陰から日向への境界線を通過したと下方画像Ilowに基づいて判定した後、TOFセンサ100の検出値D1と超音波センサ102の検出値D2とを比較することで行う。より具体的には、TOFセンサ100の測定位置が日陰から日向への境界線を通過したと下方画像Ilowに基づいて判定した後、検出値D1、D2の差分の絶対値又は差分の割合が所定の閾値以下となったか否かに基づいて行う。 In step S20, the drone control unit 70 (flight control unit 200) determines whether or not the measurement position of the TOF sensor 100 has finished passing the shade-to-sun boundary. This determination is made, based on the lower image Ilow, by judging whether the measurement position of the TOF sensor 100 has moved a predetermined distance away from the boundary line after passing it. Alternatively, after it is determined, based on the lower image Ilow, that the measurement position of the TOF sensor 100 has passed the boundary line, the determination is made by comparing the detection value D1 of the TOF sensor 100 with the detection value D2 of the ultrasonic sensor 102. More specifically, after it is determined, based on the lower image Ilow, that the measurement position has passed the boundary line, the determination is made based on whether the absolute value of the difference between the detection values D1 and D2, or the ratio of the difference, has fallen to or below a predetermined threshold value.
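The comparison-based variant of the step S20 check can be sketched as follows. The specific threshold values are assumptions for this example; the patent leaves them as unspecified predetermined values.

```python
# Illustrative sketch of the boundary-passage completion check of step S20:
# after the image-based judgment that the boundary line has been passed, the
# TOF value D1 and the ultrasonic value D2 are compared. The threshold
# values below are assumptions; the patent does not specify them.

def boundary_passage_complete(d1, d2, abs_threshold=0.1, ratio_threshold=0.05):
    """True when D1 and D2 have re-converged: the absolute difference or
    the difference ratio is at or below a predetermined threshold."""
    diff = abs(d1 - d2)
    return diff <= abs_threshold or diff / max(d1, d2) <= ratio_threshold
```

When this returns true, the flow of FIG. 7 leaves the boundary-passing branch and returns to using the TOF value D1 (S21).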
 測定位置が同境界の通過を終了していない場合(S20:偽)、ステップS16に戻る。測定位置が同境界の通過を終了した場合(S20:真)、ステップS21に進む。ステップS21において、ドローン制御部70(飛行制御部200)は、TOFセンサ100の検出値(第1距離D1)を制御用高度Hcとして設定する。 If the measurement position has not finished passing through the same boundary (S20: false), the process returns to step S16. When the measurement position finishes passing through the same boundary (S20: true), the process proceeds to step S21. In step S21, the drone control unit 70 (flight control unit 200) sets the detection value (first distance D1) of the TOF sensor 100 as the control altitude Hc.
 ステップS15に戻り、成立したTOFセンサ精度低下状態が、日陰から日向への境界を通過中であること以外である場合(S15:偽)、成立したTOFセンサ精度低下状態は、下方対象Tlowが鏡面又は白色である状態である。その場合、ステップS22に進む。ステップS22において、ドローン制御部70(飛行制御部200)は、超音波センサ102の検出値(第2距離D2)を制御用高度Hcとして設定する。 Returning to step S15, when the established TOF sensor accuracy reduction state is something other than passing the shade-to-sun boundary (S15: false), the established TOF sensor accuracy reduction state is the state in which the lower target Tlow is a mirror surface or white. In that case, the process proceeds to step S22. In step S22, the drone control unit 70 (flight control unit 200) sets the detection value (second distance D2) of the ultrasonic sensor 102 as the control altitude Hc.
 ステップS14、S21、S22の後は、ステップS11に戻る。 After steps S14, S21, and S22, the process returns to step S11.
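The per-cycle selection logic of FIG. 7 (steps S13 to S22) can be summarized in a short sketch. The state constants and the return convention are assumptions for this example; the actual drone control unit 70 is not disclosed at this level of detail.

```python
# Illustrative sketch of one cycle of the altitude-related control of FIG. 7.
# State names and the (Hc, Vf) return convention are assumptions.

NORMAL = "normal"                  # no TOF sensor accuracy reduction (S13: true)
MIRROR_OR_WHITE = "mirror_white"   # lower target Tlow is mirror-like or white
SHADE_TO_SUN = "shade_to_sun"      # measurement position straddles the boundary

def altitude_control_cycle(lower_state, d1, d2, vf, boundary_vf):
    """Return (control altitude Hc, flight speed Vf) for one cycle."""
    if lower_state == NORMAL:
        return d1, vf                    # S14/S21: use the TOF value D1
    if lower_state == SHADE_TO_SUN:
        return d2, min(vf, boundary_vf)  # S16/S17: slow down, use D2
    return d2, vf                        # S22: mirror/white -> use D2
```

For example, in the normal state the TOF value is returned unchanged, while during boundary passage the speed is clamped to the boundary flight speed THvf and the ultrasonic value is used.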
<A-3.本実施形態の効果>
 本実施形態によれば、TOFセンサ100(光センサ)が検出した下方対象Tlow(地面520等)までの第1距離D1と超音波センサ102が検出した下方対象Tlowまでの第2距離D2の一方を選択して設定した制御用高度Hcを用いてドローン24の飛行制御を行う(図7)。これにより、TOFセンサ100及び超音波センサ102の長所及び短所を踏まえてドローン24を飛行させることが可能になる。例えば、TOFセンサ100は、超音波センサ102が苦手な土壌でも検出精度が低下しない。また、超音波センサ102は、TOFセンサ100が苦手とする鏡面物体、白色物体等でも検出精度が低下しない。従って、ドローン24全体として、地面520等の下方対象Tlowの変化に対する頑健さを向上することが可能となる。
<A-3. Effect of this embodiment>
According to the present embodiment, flight control of the drone 24 is performed using the control altitude Hc set by selecting one of the first distance D1 to the lower target Tlow (the ground 520 or the like) detected by the TOF sensor 100 (optical sensor) and the second distance D2 to the lower target Tlow detected by the ultrasonic sensor 102 (FIG. 7). This makes it possible to fly the drone 24 in a manner that exploits the respective strengths and weaknesses of the TOF sensor 100 and the ultrasonic sensor 102. For example, the detection accuracy of the TOF sensor 100 does not deteriorate even over soil, which the ultrasonic sensor 102 handles poorly. Conversely, the detection accuracy of the ultrasonic sensor 102 does not deteriorate even for mirror-surfaced objects, white objects, and the like, which the TOF sensor 100 handles poorly. Therefore, the drone 24 as a whole becomes more robust against changes in the lower target Tlow such as the ground 520.
 本実施形態において、光センサとしてTOFセンサ100を用いる(図2等)。他の種類の光センサ(位相差検出方式、三角測距方式等)と比較して、TOFセンサ100は重量が軽いことが多く、TOFセンサ100を用いることでドローン24の軽量化を図ることが可能となる。 In the present embodiment, the TOF sensor 100 is used as the optical sensor (FIG. 2, etc.). Compared with other types of optical sensors (phase difference detection type, triangulation type, etc.), the TOF sensor 100 is often lighter, so using the TOF sensor 100 makes it possible to reduce the weight of the drone 24.
 本実施形態において、TOFセンサ100(光センサ)と超音波センサ102は、ドローン24の横方向に並んで配置される(図4、図5)。また、TOFセンサ100による光の照射方向と、超音波センサ102による超音波の照射方向は、両センサの測定領域において、TOFセンサ100による光の照射範囲と、超音波センサ102による超音波の照射範囲の少なくとも一部が重なるように設定される。ドローン24は進行方向(図3等の矢印A)に向かって前傾姿勢で前進する場合がある。TOFセンサ100と超音波センサ102がドローン24の横方向に配置されていれば、そのような前傾姿勢の場合であっても、両センサの検出値(第1距離D1及び第2距離D2)には互いのずれが生じ難くなる。そのため、TOFセンサ100による第1距離D1と超音波センサ102による第2距離D2との乖離を抑制することが可能となる。 In the present embodiment, the TOF sensor 100 (optical sensor) and the ultrasonic sensor 102 are arranged side by side in the lateral direction of the drone 24 (FIGS. 4 and 5). The light irradiation direction of the TOF sensor 100 and the ultrasonic irradiation direction of the ultrasonic sensor 102 are set so that, in the measurement areas of the two sensors, at least a part of the light irradiation range of the TOF sensor 100 and of the ultrasonic irradiation range of the ultrasonic sensor 102 overlap. The drone 24 may advance in a forward-leaning posture in the direction of travel (arrow A in FIG. 3, etc.). If the TOF sensor 100 and the ultrasonic sensor 102 are arranged in the lateral direction of the drone 24, the detection values of the two sensors (first distance D1 and second distance D2) are unlikely to deviate from each other even in such a forward-leaning posture. Therefore, divergence between the first distance D1 from the TOF sensor 100 and the second distance D2 from the ultrasonic sensor 102 can be suppressed.
 本実施形態において、ドローン24は、散布物を保管するタンク180と、散布物を散布するノズル186(吐出口)とを備える(図3及び図4)。また、TOFセンサ100(光センサ)及び超音波センサ102は、ノズル186よりも上方に配置される(図3及び図4)。さらに、TOFセンサ100及び超音波センサ102は、ノズル186よりも前側に配置される(図3及び図4)。これらにより、ノズル186から散布される散布物による、TOFセンサ100又は超音波センサ102の測定に対する影響を抑制することが可能となる。 In the present embodiment, the drone 24 includes a tank 180 for storing the sprayed material and a nozzle 186 (discharge port) for spraying the sprayed material (FIGS. 3 and 4). Further, the TOF sensor 100 (optical sensor) and the ultrasonic sensor 102 are arranged above the nozzle 186 (FIGS. 3 and 4). Further, the TOF sensor 100 and the ultrasonic sensor 102 are arranged in front of the nozzle 186 (FIGS. 3 and 4). As a result, it is possible to suppress the influence of the sprayed material sprayed from the nozzle 186 on the measurement of the TOF sensor 100 or the ultrasonic sensor 102.
 本実施形態において、ドローン24は、下方対象Tlowを撮像するカメラ160(図2、図4~図6)と、カメラ160の画像に基づいて、下方対象Tlowの状態(下方状態Slow)を判定する撮影制御部202(下方状態判定部。図2)とを備える。飛行制御部200は、光センサ精度低下状態ではない通常時(図7のS13:真)には、TOFセンサ100(光センサ)が検出した第1距離D1を選択して制御用高度Hcを設定する(S14)。また、飛行制御部200は、光センサ精度低下状態の場合(S13:偽)には、超音波センサ102が検出した第2距離D2を選択して制御用高度Hcを設定する(S17、S22)。 In the present embodiment, the drone 24 includes the camera 160 (FIGS. 2 and 4 to 6) that images the lower target Tlow, and the imaging control unit 202 (lower state determination unit; FIG. 2) that determines the state of the lower target Tlow (lower state Slow) based on the image of the camera 160. In the normal case, when there is no optical sensor accuracy reduction state (S13: true in FIG. 7), the flight control unit 200 selects the first distance D1 detected by the TOF sensor 100 (optical sensor) and sets it as the control altitude Hc (S14). In the optical sensor accuracy reduction state (S13: false), the flight control unit 200 selects the second distance D2 detected by the ultrasonic sensor 102 and sets it as the control altitude Hc (S17, S22).
 これにより、通常時にはTOFセンサ100の方が超音波センサ102よりも検出精度が高い場合において、TOFセンサ100の検出精度が低下する場面では超音波センサ102を優先して用いることで、ドローン24の飛行を高精度に制御することが可能となる。 As a result, in a configuration where the TOF sensor 100 normally has higher detection accuracy than the ultrasonic sensor 102, the ultrasonic sensor 102 is used preferentially in situations where the detection accuracy of the TOF sensor 100 deteriorates, so that the flight of the drone 24 can be controlled with high accuracy.
 本実施形態において、光センサ精度低下状態は、下方対象Tlowに日陰から日向への境界が存在し、TOFセンサ100(光センサ)の測定位置が境界を跨ぐ状態を含む(図7のS13、S15)。飛行制御部200は、TOFセンサ100の測定位置が日陰から日向への境界を跨ぐ手前で、ドローン24の飛行速度Vfを低下させる(S16)。 In the present embodiment, the optical sensor accuracy reduction state includes a state in which a shade-to-sun boundary exists on the lower target Tlow and the measurement position of the TOF sensor 100 (optical sensor) straddles that boundary (S13 and S15 in FIG. 7). The flight control unit 200 reduces the flight speed Vf of the drone 24 before the measurement position of the TOF sensor 100 crosses the shade-to-sun boundary (S16).
 TOFセンサ100は、検出精度を高めるために受光量のダイナミックレンジを可変とする。そのため、日陰から日向への境界を測定位置が跨ぐ場合、急激な受光量の増加により測定レンジが一時的に飽和し、その後、ダイナミックレンジの調整により、日向でも高精度な測定が可能となる。本実施形態では、TOFセンサ100の測定位置が日陰から日向への境界を跨ぐ手前で、ドローン24の飛行速度Vfを低下させる。これにより、TOFセンサ100の受光量が飽和している状態でドローン24が進む距離を短くすることができる。従って、TOFセンサ100の適用範囲を広げることが可能となり、地面520等の下方対象Tlowの変化に対する頑健さを向上可能となる。 The TOF sensor 100 makes the dynamic range of the received-light amount variable in order to improve detection accuracy. Therefore, when the measurement position crosses the shade-to-sun boundary, the measurement range is temporarily saturated by a sudden increase in the amount of received light; thereafter, by adjusting the dynamic range, highly accurate measurement becomes possible even in the sun. In the present embodiment, the flight speed Vf of the drone 24 is reduced before the measurement position of the TOF sensor 100 crosses the shade-to-sun boundary. As a result, the distance the drone 24 travels while the received-light amount of the TOF sensor 100 is saturated can be shortened. Therefore, the applicable range of the TOF sensor 100 can be expanded, and robustness against changes in the lower target Tlow such as the ground 520 can be improved.
 また、超音波は、光よりも大幅に低速であり、ドローン24の移動に伴うドップラー効果の影響が大きい。そのため、超音波センサ102の検出値(第2距離D2)を利用する際にはTOFセンサ100の検出値(第1距離D1)を利用する場合よりも、飛行速度Vfを低下させることでドップラー効果による検出精度の低下を抑制することが可能となる。 In addition, ultrasonic waves are far slower than light, so the Doppler effect accompanying the movement of the drone 24 is larger for them. Therefore, when the detection value (second distance D2) of the ultrasonic sensor 102 is used, lowering the flight speed Vf more than when the detection value (first distance D1) of the TOF sensor 100 is used makes it possible to suppress the decrease in detection accuracy caused by the Doppler effect.
B.変形例
 なお、本発明は、上記実施形態に限らず、本明細書の記載内容に基づき、種々の構成を採り得ることはもちろんである。例えば、以下の構成を採用することができる。
B. Modifications It should be noted that the present invention is not limited to the above-described embodiment, and it goes without saying that various configurations can be adopted based on the contents described in the present specification. For example, the following configuration can be adopted.
<B-1.構成>
 上記実施形態の作物育成システム10は、図1に示すような構成要素を有していた。しかしながら、例えば、TOFセンサ100と超音波センサ102の検出値D1、D2を制御用高度Hcとして選択的に用いる観点からすれば、これに限らない。例えば、作物育成システム10は、ドローン24と、ユーザ端末26のみを有するものとしてもよい。その場合、ユーザ端末26によりドローン24の飛行を制御してもよい。
<B-1. Configuration>
The crop growing system 10 of the above embodiment has the components shown in FIG. 1. However, from the viewpoint of, for example, selectively using the detection values D1 and D2 of the TOF sensor 100 and the ultrasonic sensor 102 as the control altitude Hc, the configuration is not limited to this. For example, the crop growing system 10 may have only the drone 24 and the user terminal 26. In that case, the flight of the drone 24 may be controlled by the user terminal 26.
 上記実施形態において、ドローン24は、作物502の撮像及び薬液の散布を行った(図1)。しかしながら、例えば、TOFセンサ100と超音波センサ102の検出値D1、D2を制御用高度Hcとして選択的に用いる観点からすれば、これに限らない。例えば、ドローン24は、作物502の撮像及び薬液の散布の一方のみを行うものであってもよい。或いは、ドローン24は、その他の用途(例えば、生育診断以外の空撮)で用いるものであってもよい。 In the above embodiment, the drone 24 images the crop 502 and sprays the chemical solution (FIG. 1). However, from the viewpoint of, for example, selectively using the detection values D1 and D2 of the TOF sensor 100 and the ultrasonic sensor 102 as the control altitude Hc, the configuration is not limited to this. For example, the drone 24 may perform only one of imaging the crop 502 and spraying the chemical solution. Alternatively, the drone 24 may be used for other purposes (for example, aerial photography other than growth diagnosis).
 上記実施形態では、光センサとしてTOFセンサ100を用いた。しかしながら、例えば、光センサと超音波センサ102の検出値D1、D2を制御用高度Hcとして選択的に用いる観点からすれば、これに限らない。例えば、TOFセンサ100の代わりに、他の種類の光センサ(位相差検出方式、三角測距方式等)を用いてもよい。 In the above embodiment, the TOF sensor 100 is used as the optical sensor. However, the present invention is not limited to this, for example, from the viewpoint of selectively using the detection values D1 and D2 of the optical sensor and the ultrasonic sensor 102 as the control altitude Hc. For example, instead of the TOF sensor 100, another type of optical sensor (phase difference detection method, triangular ranging method, etc.) may be used.
 上記実施形態では、TOFセンサ100と超音波センサ102をドローン24の横方向に並べて配置した(図4、図5)。しかしながら、例えば、TOFセンサ100と超音波センサ102の検出値D1、D2を制御用高度Hcとして選択的に用いる観点からすれば、これに限らない。例えば、TOFセンサ100と超音波センサ102をドローン24の前後方向に並べて配置してもよい。 In the above embodiment, the TOF sensor 100 and the ultrasonic sensor 102 are arranged side by side in the horizontal direction of the drone 24 (FIGS. 4 and 5). However, the present invention is not limited to this, for example, from the viewpoint of selectively using the detection values D1 and D2 of the TOF sensor 100 and the ultrasonic sensor 102 as the control altitude Hc. For example, the TOF sensor 100 and the ultrasonic sensor 102 may be arranged side by side in the front-rear direction of the drone 24.
 上記実施形態では、TOFセンサ100と超音波センサ102をドローン24の本体50内に配置した(図5及び図6)。しかしながら、例えば、TOFセンサ100と超音波センサ102の検出値D1、D2を制御用高度Hcとして選択的に用いる観点からすれば、これに限らない。例えば、TOFセンサ100と超音波センサ102をドローン24の本体50の外側に配置してもよい。 In the above embodiment, the TOF sensor 100 and the ultrasonic sensor 102 are arranged in the main body 50 of the drone 24 (FIGS. 5 and 6). However, the present invention is not limited to this, for example, from the viewpoint of selectively using the detection values D1 and D2 of the TOF sensor 100 and the ultrasonic sensor 102 as the control altitude Hc. For example, the TOF sensor 100 and the ultrasonic sensor 102 may be arranged outside the main body 50 of the drone 24.
 図8は、変形例に係るドローン24の本体50の内部構成及びその周辺の配置を簡略的に示す側面図である。図8の例では、ドローン24は、本体50から下方に突出してTOFセンサ100、超音波センサ102及びカメラ160を支持する支持部材220を有する。これにより、ドローン24の本体50内に発熱源(制御基板110、インバータ136、電源210等)が存在する場合でも、TOFセンサ100、超音波センサ102及びカメラ160を発熱源から遠ざけて配置することができる。従って、発熱源からの熱によりTOFセンサ100、超音波センサ102及びカメラ160が影響を受けることを抑制することが可能となる。 FIG. 8 is a side view schematically showing the internal configuration of the main body 50 of the drone 24 according to a modification and the arrangement around it. In the example of FIG. 8, the drone 24 has a support member 220 that projects downward from the main body 50 and supports the TOF sensor 100, the ultrasonic sensor 102, and the camera 160. As a result, even when heat sources (the control board 110, the inverter 136, the power supply 210, etc.) exist in the main body 50 of the drone 24, the TOF sensor 100, the ultrasonic sensor 102, and the camera 160 can be arranged away from the heat sources. Therefore, it is possible to suppress the influence of heat from the heat sources on the TOF sensor 100, the ultrasonic sensor 102, and the camera 160.
 上記実施形態において、ドローン24が散布する散布物は、液体としての薬剤であった。しかしながら、例えば、TOFセンサ100と超音波センサ102の検出値を制御用高度Hcとして選択的に用いる観点からすれば、これに限らない。例えば、散布物は、薬剤以外のもの(水等)であってもよく、また、気体又は固体(粉体を含む。)であってもよい。 In the above embodiment, the sprayed product sprayed by the drone 24 was a drug as a liquid. However, the present invention is not limited to this, for example, from the viewpoint of selectively using the detected values of the TOF sensor 100 and the ultrasonic sensor 102 as the control altitude Hc. For example, the sprayed material may be something other than a drug (water, etc.), or may be a gas or a solid (including powder).
<B-2.制御>
 上記実施形態では、下方対象Tlowの状態(下方状態Slow)の判定は、リアルタイムでのカメラ160の下方画像Ilowに基づいて行った(図7のS12、S19)。しかしながら、例えば、下方状態Slowを判定する観点からすれば、これに限らない。例えば、位置座標と下方状態Slowとを予め関連付けて記憶しておき、ドローン24飛行中の位置座標と前記記憶情報とに基づいて下方状態Slowを判定してもよい。或いは、TOFセンサ100、超音波センサ102及びカメラ160以外の手段(例えば、衛星写真とドローン24の位置座標)を用いて下方状態Slowを判定してもよい。
<B-2. Control>
In the above embodiment, the determination of the state of the lower target Tlow (lower state Slow) is performed in real time based on the lower image Ilow of the camera 160 (S12 and S19 in FIG. 7). However, from the viewpoint of, for example, determining the lower state Slow, the method is not limited to this. For example, position coordinates and the lower state Slow may be stored in association with each other in advance, and the lower state Slow may be determined based on the position coordinates during flight of the drone 24 and the stored information. Alternatively, the lower state Slow may be determined using means other than the TOF sensor 100, the ultrasonic sensor 102, and the camera 160 (for example, satellite images and the position coordinates of the drone 24).
 上記実施形態では、TOFセンサ精度低下状態として、下方対象Tlowが鏡面又は白色である状態、及びドローン24が日陰から日向への境界を跨ぐ状態を用いた(図7のS13、S15)。しかしながら、例えば、TOFセンサ100と超音波センサ102の検出値D1、D2を制御用高度Hcとして選択的に用いる観点からすれば、これに限らない。例えば、ドローン24が日向から日陰への境界を跨ぐ状態を用いてもよい。この場合、TOFセンサ100におけるダイナミックレンジの飽和は起こらないが、利用するレンジが狭くなるため、検出精度が低下し得る。そこで、超音波センサ102への切替え及び/又は飛行速度Vfの低下により、下方対象Tlowの変化に対する頑健性を向上することが可能となる。 In the above embodiment, the state in which the lower target Tlow is a mirror surface or white and the state in which the drone 24 straddles the shade-to-sun boundary are used as the TOF sensor accuracy reduction state (S13 and S15 in FIG. 7). However, from the viewpoint of, for example, selectively using the detection values D1 and D2 of the TOF sensor 100 and the ultrasonic sensor 102 as the control altitude Hc, the states are not limited to these. For example, a state in which the drone 24 straddles the sun-to-shade boundary may be used. In this case, saturation of the dynamic range of the TOF sensor 100 does not occur, but the range actually used becomes narrow, so the detection accuracy may deteriorate. Switching to the ultrasonic sensor 102 and/or reducing the flight speed Vf therefore makes it possible to improve robustness against changes in the lower target Tlow.
 上記実施形態では、通常時(TOFセンサ精度低下状態以外の場合)にTOFセンサ100の検出値(第1距離D1)を制御用高度Hcとして用い、TOFセンサ精度低下状態の場合に超音波センサ102の検出値(第2距離D2)を制御用高度Hcとして用いた(図7)。しかしながら、例えば、TOFセンサ100と超音波センサ102の検出値D1、D2を制御用高度Hcとして選択的に用いる観点からすれば、これに限らない。例えば、第1距離D1と第2距離D2を組み合わせて制御用高度Hcを算出する構成において、下方対象Tlowの状態(下方状態Slow)に応じて第1距離D1と第2距離D2の重み付け(比率)を変化させてもよい。例えば、通常時は、第1距離D1×0.9+第2距離D2×0.1を制御用高度Hcとし、TOFセンサ精度低下状態の場合、第1距離D1×0.3+第2距離D2×0.7を制御用高度Hcとしてもよい。 In the above embodiment, the detection value (first distance D1) of the TOF sensor 100 is used as the control altitude Hc in the normal case (other than the TOF sensor accuracy reduction state), and the detection value (second distance D2) of the ultrasonic sensor 102 is used as the control altitude Hc in the TOF sensor accuracy reduction state (FIG. 7). However, from the viewpoint of, for example, selectively using the detection values D1 and D2 of the TOF sensor 100 and the ultrasonic sensor 102 as the control altitude Hc, the configuration is not limited to this. For example, in a configuration in which the control altitude Hc is calculated by combining the first distance D1 and the second distance D2, the weighting (ratio) of the first distance D1 and the second distance D2 may be changed according to the state of the lower target Tlow (lower state Slow). For example, in the normal case, the control altitude Hc may be set to first distance D1 × 0.9 + second distance D2 × 0.1, and in the TOF sensor accuracy reduction state, to first distance D1 × 0.3 + second distance D2 × 0.7.
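The weighted-combination variant above can be sketched directly, using the example weights given in the text (0.9/0.1 in the normal case, 0.3/0.7 in the TOF sensor accuracy reduction state):

```python
# Illustrative sketch of the weighted-combination variant. The weights are
# the example values from the text; the function name is an assumption.

def weighted_control_altitude(d1, d2, accuracy_reduced):
    """Control altitude Hc as a weighted blend of D1 (TOF) and D2 (ultrasonic)."""
    w1, w2 = (0.3, 0.7) if accuracy_reduced else (0.9, 0.1)
    return w1 * d1 + w2 * d2
```

With D1 = 2.0 m and D2 = 4.0 m, the normal case yields Hc = 2.2 m (TOF-dominated), while the accuracy reduction state yields Hc = 3.4 m (ultrasonic-dominated).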
 或いは、カメラ160の下方画像Ilowを用いずに、TOFセンサ100と超音波センサ102の検出値D1、D2を制御用高度Hcとして選択的に用いてもよい。例えば、通常時は、TOFセンサ100の検出値(第1距離D1)を制御用高度Hcとして用い、TOFセンサ100に異常が発生した場合に、超音波センサ102の検出値(第2距離D2)を制御用高度Hcとして用いてもよい。 Alternatively, the detection values D1 and D2 of the TOF sensor 100 and the ultrasonic sensor 102 may be selectively used as the control altitude Hc without using the lower image Ilow of the camera 160. For example, the detection value (first distance D1) of the TOF sensor 100 may normally be used as the control altitude Hc, and the detection value (second distance D2) of the ultrasonic sensor 102 may be used as the control altitude Hc when an abnormality occurs in the TOF sensor 100.
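The camera-free fallback variant described above reduces to a simple selection. The fault flag is an assumption for this example; the patent does not specify how a TOF sensor abnormality is detected.

```python
# Illustrative sketch of the camera-free fallback variant: D2 is used only
# when the TOF sensor itself reports an abnormality. The tof_abnormal flag
# is an assumption; its detection mechanism is not disclosed.

def fallback_control_altitude(d1, d2, tof_abnormal):
    """Normally D1; D2 when an abnormality occurs in the TOF sensor 100."""
    return d2 if tof_abnormal else d1
```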
 上記実施形態では、TOFセンサ100の測定位置が日陰から日向への境界を跨ぐ際(図7のS15:真)、ドローン24の飛行速度Vfを低下させた(S16)。しかしながら、例えば、TOFセンサ100と超音波センサ102の検出値D1、D2を制御用高度Hcとして選択的に用いる観点からすれば、これに限らず、TOFセンサ100の測定位置が日陰から日向への境界を跨ぐ際でも飛行速度Vfを維持してもよい。 In the above embodiment, the flight speed Vf of the drone 24 is reduced (S16) when the measurement position of the TOF sensor 100 crosses the shade-to-sun boundary (S15: true in FIG. 7). However, from the viewpoint of, for example, selectively using the detection values D1 and D2 of the TOF sensor 100 and the ultrasonic sensor 102 as the control altitude Hc, this is not a limitation, and the flight speed Vf may be maintained even when the measurement position of the TOF sensor 100 crosses the shade-to-sun boundary.
<B-3.その他>
 上記実施形態では、通常時(TOFセンサ精度低下状態でない場合)にTOFセンサ100の検出値(第1距離D1)を制御用高度Hcとして用い、TOFセンサ精度低下状態である場合に超音波センサ102の検出値(第2距離D2)を制御用高度Hcとして用いた(図7)。換言すると、TOFセンサ100と超音波センサ102の検出値D1、D2を制御用高度Hcとして選択的に用いた。しかしながら、TOFセンサ100の測定位置が日陰から日向への境界を跨ぐ際、ドローン24の飛行速度Vfを低下させる観点からすれば、超音波センサ102を用いずにTOFセンサ100の検出値(第1距離D1)のみを制御用高度Hcとして用いることも可能である。
<B-3. Others>
In the above embodiment, the detection value (first distance D1) of the TOF sensor 100 is used as the control altitude Hc in the normal case (when there is no TOF sensor accuracy reduction state), and the detection value (second distance D2) of the ultrasonic sensor 102 is used as the control altitude Hc in the TOF sensor accuracy reduction state (FIG. 7). In other words, the detection values D1 and D2 of the TOF sensor 100 and the ultrasonic sensor 102 are selectively used as the control altitude Hc. However, from the viewpoint of reducing the flight speed Vf of the drone 24 when the measurement position of the TOF sensor 100 crosses the shade-to-sun boundary, it is also possible to use only the detection value (first distance D1) of the TOF sensor 100 as the control altitude Hc, without using the ultrasonic sensor 102.
 図9は、変形例に係る高度関連制御のフローチャートである。図9の高度関連制御では、超音波センサ102を用いずにTOFセンサ100の検出値(第1距離D1)のみを制御用高度Hcとして用いる。 FIG. 9 is a flowchart of altitude-related control according to a modified example. In the altitude-related control of FIG. 9, only the detection value (first distance D1) of the TOF sensor 100 is used as the control altitude Hc without using the ultrasonic sensor 102.
 図9のステップS31、S32は、図7のS11、S12と同様である。ステップS33において、ドローン制御部70(飛行制御部200)は、下方状態Slowが通常であるか否か(TOFセンサ100の測定位置が日陰から日向への境界を跨ぐ場合以外であるか否か)を判定する。下方状態Slowが通常である場合(通常時である場合)(S33:真)、ステップS34に進む。ステップS34において、ドローン制御部70(飛行制御部200)は、TOFセンサ100の検出値D1を制御用高度Hcとして設定する。TOFセンサ100の測定位置が日陰から日向への境界を跨ぐ場合(S33:偽)、ステップS35に進む。 Steps S31 and S32 in FIG. 9 are the same as S11 and S12 in FIG. 7. In step S33, the drone control unit 70 (flight control unit 200) determines whether or not the lower state Slow is normal (that is, any case other than the measurement position of the TOF sensor 100 crossing the shade-to-sun boundary). When the lower state Slow is normal (S33: true), the process proceeds to step S34. In step S34, the drone control unit 70 (flight control unit 200) sets the detection value D1 of the TOF sensor 100 as the control altitude Hc. When the measurement position of the TOF sensor 100 crosses the shade-to-sun boundary (S33: false), the process proceeds to step S35.
 ステップS35は、図7のステップS16と同様である。続くステップS36において、ドローン制御部70(飛行制御部200)は、TOFセンサ100の検出値D1を制御用高度Hcとして設定する。すなわち、ドローン制御部70(飛行制御部200)は、検出値D1を制御用高度Hcとして利用し続ける。ステップS37、S38、S39は、図7のステップS18、S19、S20と同様である。 Step S35 is the same as step S16 in FIG. 7. In the following step S36, the drone control unit 70 (flight control unit 200) sets the detection value D1 of the TOF sensor 100 as the control altitude Hc. That is, the drone control unit 70 (flight control unit 200) continues to use the detection value D1 as the control altitude Hc. Steps S37, S38, and S39 are the same as steps S18, S19, and S20 in FIG. 7.
 TOFセンサ100の測定位置が、日陰から日向への境界の通過を終了していない場合(S39:偽)、ステップS35に戻る。測定位置が同境界の通過を終了した場合(S39:真)、ステップS40に進む。ステップS40において、ドローン制御部70(飛行制御部200)は、TOFセンサ100の検出値D1を制御用高度Hcとして設定する。 If the measurement position of the TOF sensor 100 has not completed the passage of the boundary from the shade to the sun (S39: false), the process returns to step S35. When the measurement position finishes passing through the same boundary (S39: true), the process proceeds to step S40. In step S40, the drone control unit 70 (flight control unit 200) sets the detection value D1 of the TOF sensor 100 as the control altitude Hc.
 以上のように、図9の高度関連制御では、ドローン制御部70(飛行制御部200)は、TOFセンサ100の検出値(第1距離D1)を制御用高度Hcとして利用し続ける。なお、図9の制御を、ドローン24の飛行速度Vfの制御と捉える場合、制御用高度Hcの処理(S34、S36、S40)は、別の制御として位置付けてもよい。また、図9の制御では、日陰から日向への境界の通過時のみを光センサ精度低下状態として用いたが(S33)、上記のように、日向から日陰への境界の通過時を光センサ精度低下状態に含めてもよい。 As described above, in the altitude-related control of FIG. 9, the drone control unit 70 (flight control unit 200) continues to use the detection value (first distance D1) of the TOF sensor 100 as the control altitude Hc. When the control of FIG. 9 is regarded as control of the flight speed Vf of the drone 24, the processing of the control altitude Hc (S34, S36, S40) may be positioned as separate control. Further, although the control of FIG. 9 uses only the passage of the shade-to-sun boundary as the optical sensor accuracy reduction state (S33), the passage of the sun-to-shade boundary may also be included in that state, as described above.
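The FIG. 9 variant can be sketched in the same style as the FIG. 7 flow: the control altitude Hc is always the TOF value D1, and only the flight speed Vf changes while the measurement position straddles the shade-to-sun boundary. The argument names are assumptions for this example.

```python
# Illustrative sketch of one cycle of the FIG. 9 variant. Only the flight
# speed reacts to the boundary; Hc is the TOF value D1 in every case.

def variant_control_cycle(at_boundary, d1, vf, boundary_vf):
    """Return (control altitude Hc, flight speed Vf) for one cycle."""
    hc = d1  # S34/S36/S40: D1 is used as Hc in every branch
    new_vf = min(vf, boundary_vf) if at_boundary else vf  # S35: slow down
    return hc, new_vf
```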
 The altitude-related control of the above embodiment uses the flow shown in FIG. 7, and the altitude-related control of the above modification uses the flow shown in FIG. 9. However, as long as the effects of the present invention are obtained, the content of the flow (the order of the steps) is not limited to these. For example, the order of steps S16 and S17 in FIG. 7 can be swapped.
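The FIG. 9 style loop described above (keep using the TOF detection value D1 as the control altitude Hc, and lower only the flight speed Vf while the measured spot crosses a shade/sun boundary) can be sketched as follows. This is a minimal illustrative sketch under stated assumptions, not the patented implementation; all names and the speed values are hypothetical.

```python
NORMAL_SPEED = 4.0   # m/s, assumed cruise speed Vf
REDUCED_SPEED = 1.0  # m/s, assumed Vf while crossing a shade/sun boundary

def altitude_control_step(tof_distance_d1, boundary_crossing):
    """One iteration of the modified control: the TOF detection value D1
    is always used as the control altitude Hc (S34/S37/S40), and only the
    flight speed Vf is lowered while the measurement position of the
    optical sensor is passing through the boundary (S13/S39)."""
    control_altitude_hc = tof_distance_d1
    if boundary_crossing:
        flight_speed_vf = REDUCED_SPEED
    else:
        flight_speed_vf = NORMAL_SPEED
    return control_altitude_hc, flight_speed_vf
```

For example, while crossing the boundary at a measured distance of 3.2 m, the sketch keeps Hc at 3.2 m and commands the reduced speed; once the crossing is finished, the normal speed is restored.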
24 ... Drone
100 ... TOF sensor (optical sensor)
102 ... Ultrasonic sensor
160 ... Camera
180 ... Tank
186l1, 186l2, 186r1, 186r2 ... Nozzles (discharge ports)
200 ... Flight control unit
202 ... Imaging control unit (lower-state determination unit)
220 ... Support member
D1 ... First distance
D2 ... Second distance
Hc ... Control altitude
Tlow ... Lower object
Vf ... Flight speed

Claims (14)

  1.  A drone comprising:
     an optical sensor that detects a first distance to a lower object;
     an ultrasonic sensor that detects a second distance to the lower object; and
     a camera that images the lower object,
     the drone further comprising:
      a lower-state determination unit that determines a state of the lower object based on an image from the camera; and
      a flight control unit that performs flight control of the drone using a control altitude set by selecting one of the first distance and the second distance or by combining both,
     wherein the flight control unit:
      when the state of the lower object determined by the lower-state determination unit is not a reduced-optical-sensor-accuracy state in which detection accuracy of the optical sensor is reduced, sets the control altitude by selecting the first distance detected by the optical sensor or by weighting the first distance more heavily than the second distance detected by the ultrasonic sensor; and
      when the state of the lower object is the reduced-optical-sensor-accuracy state, sets the control altitude by selecting the second distance or by weighting the second distance more heavily than the first distance, and reduces a flight speed of the drone.
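The selection/weighting rule of claim 1 can be sketched as a simple weighted fusion. The weights and function name are illustrative assumptions; the claim only requires that the optical first distance be weighted more heavily in the normal state and the ultrasonic second distance more heavily in the reduced-accuracy state.

```python
def set_control_altitude(d1, d2, accuracy_degraded,
                         w_optical=0.8, w_ultrasonic=0.8):
    """Weighted combination per claim 1: favor the optical (TOF) first
    distance d1 normally, and the ultrasonic second distance d2 when the
    lower object is in the reduced-optical-sensor-accuracy state.
    Setting a weight to 1.0 reproduces the pure selection case."""
    if accuracy_degraded:
        w1, w2 = 1.0 - w_ultrasonic, w_ultrasonic   # d2 weighted more
    else:
        w1, w2 = w_optical, 1.0 - w_optical         # d1 weighted more
    return w1 * d1 + w2 * d2
```

With d1 = 3.0 m and d2 = 5.0 m, the sketch yields a control altitude closer to d1 (3.4 m) in the normal state and closer to d2 (4.6 m) in the reduced-accuracy state.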
  2.  A drone comprising:
     an optical sensor that detects a first distance to a lower object; and
     an ultrasonic sensor that detects a second distance to the lower object,
     the drone further comprising a flight control unit that performs flight control of the drone using a control altitude set by selecting one of the first distance and the second distance or by combining both.
  3.  The drone according to claim 1 or 2, wherein the optical sensor is a time-of-flight sensor.
  4.  The drone according to any one of claims 1 to 3, wherein:
     the optical sensor and the ultrasonic sensor are arranged side by side in a lateral direction of the drone; and
     an irradiation direction of light from the optical sensor and an irradiation direction of ultrasonic waves from the ultrasonic sensor are set such that an irradiation range of the light and an irradiation range of the ultrasonic waves at least partially overlap.
  5.  The drone according to any one of claims 1 to 4, further comprising a support member that projects downward from a main body of the drone and supports the optical sensor and the ultrasonic sensor.
  6.  The drone according to any one of claims 1 to 5, wherein:
     the drone comprises a tank that stores a sprayed material and a discharge port that sprays the sprayed material; and
     the optical sensor and the ultrasonic sensor are arranged above the discharge port.
  7.  The drone according to claim 6, wherein the optical sensor and the ultrasonic sensor are arranged above and forward of the discharge port.
  8.  The drone according to any one of claims 1 to 5, wherein:
     the drone comprises a tank that stores a sprayed material and a discharge port that sprays the sprayed material; and
     the optical sensor and the ultrasonic sensor are arranged forward of the discharge port.
  9.  The drone according to claim 2, further comprising:
     a camera that images the lower object; and
     a lower-state determination unit that determines a state of the lower object based on an image from the camera,
     wherein the flight control unit:
      when the state of the lower object determined by the lower-state determination unit is not a reduced-optical-sensor-accuracy state in which detection accuracy of the optical sensor is reduced, sets the control altitude by selecting the first distance detected by the optical sensor or by weighting the first distance more heavily than the second distance detected by the ultrasonic sensor; and
      when the state of the lower object is the reduced-optical-sensor-accuracy state, sets the control altitude by selecting the second distance or by weighting the second distance more heavily than the first distance.
  10.  The drone according to claim 1 or 9, wherein the reduced-optical-sensor-accuracy state includes a state in which the lower object is a mirror surface or white, or a state in which a boundary between shade and sunlight exists on the lower object and a measurement position of the optical sensor straddles the boundary.
  11.  The drone according to claim 9, wherein:
     the reduced-optical-sensor-accuracy state is a state in which a boundary between shade and sunlight exists on the lower object and a measurement position of the optical sensor straddles the boundary; and
     the flight control unit reduces the flight speed of the drone before the measurement position of the optical sensor crosses the boundary between shade and sunlight.
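The anticipatory slow-down of claim 11 (reduce speed before the measurement position reaches the boundary) can be sketched as below. The distance to the boundary would come from the camera-based lower-state determination; the margin, speeds, and names are all hypothetical assumptions for illustration.

```python
def flight_speed_command(distance_to_boundary_m, slow_down_margin_m=2.0,
                         normal_vf=4.0, reduced_vf=1.0):
    """Per claim 11: the flight speed is reduced *before* the optical
    sensor's measurement position crosses the shade/sun boundary.
    distance_to_boundary_m is None when no boundary is detected ahead."""
    if distance_to_boundary_m is not None and \
       distance_to_boundary_m <= slow_down_margin_m:
        return reduced_vf
    return normal_vf
```

In this sketch, the speed command drops as soon as the detected boundary comes within the assumed 2 m margin, rather than waiting for the measured spot to actually straddle it.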
  12.  A drone comprising:
     an optical sensor that detects a first distance to a lower object;
     a flight control unit that performs flight control of the drone using the first distance as a control altitude; and
     a lower-state determination unit that determines a surface state of the lower object,
     wherein, when the surface state of the lower object determined by the lower-state determination unit is a reduced-optical-sensor-accuracy state in which detection accuracy of the optical sensor is reduced, the flight control unit reduces a flight speed of the drone.
  13.  A method for controlling a drone that comprises an optical sensor that detects a first distance to a lower object and an ultrasonic sensor that detects a second distance to the lower object, wherein a flight control unit performs flight control of the drone using a control altitude set by selecting one of the first distance and the second distance or by combining both.
  14.  A method for controlling a drone that comprises an optical sensor that detects a first distance to a lower object, a flight control unit that performs flight control of the drone using the first distance as a control altitude, and a lower-state determination unit that determines a surface state of the lower object, wherein, when the surface state of the lower object determined by the lower-state determination unit is a reduced-optical-sensor-accuracy state in which detection accuracy of the optical sensor is reduced, the flight control unit reduces a flight speed of the drone.
PCT/JP2020/005932 2020-02-17 2020-02-17 Drone and method for controlling drone WO2021166008A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2022501389A JP7369485B2 (en) 2020-02-17 2020-02-17 Drones and how to control them
PCT/JP2020/005932 WO2021166008A1 (en) 2020-02-17 2020-02-17 Drone and method for controlling drone

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/005932 WO2021166008A1 (en) 2020-02-17 2020-02-17 Drone and method for controlling drone

Publications (1)

Publication Number Publication Date
WO2021166008A1 2021-08-26

Family

ID=77391473

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/005932 WO2021166008A1 (en) 2020-02-17 2020-02-17 Drone and method for controlling drone

Country Status (2)

Country Link
JP (1) JP7369485B2 (en)
WO (1) WO2021166008A1 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170101087A (en) * 2016-02-26 2017-09-05 (주)스마트모션 Controlling apparatus of drone using multiple sensor and method thereof
JP2017529616A (en) * 2015-03-31 2017-10-05 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Mobile platform control method and system
WO2018189848A1 (en) * 2017-04-12 2018-10-18 株式会社ナイルワークス Method for spraying chemical by unmanned flight vehicle, and program
WO2019168042A1 (en) * 2018-02-28 2019-09-06 株式会社ナイルワークス Drone, control method thereof, and program
JP2019219778A (en) * 2018-06-18 2019-12-26 カシオ計算機株式会社 Flight device, flight method and program


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116609278A (en) * 2023-07-21 2023-08-18 华东交通大学 Method and system for collecting farmland heavy metal spectrum data
CN116609278B (en) * 2023-07-21 2023-10-17 华东交通大学 Method and system for collecting farmland heavy metal spectrum data

Also Published As

Publication number Publication date
JP7369485B2 (en) 2023-10-26
JPWO2021166008A1 (en) 2021-08-26


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20920147

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022501389

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20920147

Country of ref document: EP

Kind code of ref document: A1