WO2020044628A1 - Cultivated land imaging system and cultivated land imaging method - Google Patents

Cultivated land imaging system and cultivated land imaging method

Info

Publication number
WO2020044628A1
Authority
WO
WIPO (PCT)
Prior art keywords
field
image
chart
unit
flying object
Prior art date
Application number
PCT/JP2019/010050
Other languages
French (fr)
Japanese (ja)
Inventor
Tetsuya Katagiri (片桐 哲也)
Yasuo Koyanagi (小柳 康男)
Original Assignee
Konica Minolta, Inc. (コニカミノルタ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta, Inc.
Priority to JP2020540034A (JPWO2020044628A1)
Publication of WO2020044628A1

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01G HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
    • A01G7/00 Botany in general
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures
    • G01C11/06 Interpretation of pictures by comparison of two or more pictures of the same area
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 Type of UAV
    • B64U10/10 Rotorcrafts
    • B64U10/13 Flying platforms
    • B64U10/14 Flying platforms with four distinct rotor axes, e.g. quadcopters
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U50/00 Propulsion; Power supply
    • B64U50/10 Propulsion
    • B64U50/19 Propulsion using electrically powered motors
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U70/00 Launching, take-off or landing arrangements
    • B64U70/90 Launching from or landing on platforms

Definitions

  • the present invention relates to a field photographing system and a field photographing method for photographing a field from above.
  • in the system of Patent Document 1, the reflected light measurement unit measures the light intensity of the reflected light of a measurement target having a plurality of leaves at a first wavelength and a second wavelength, the sun angle acquisition unit acquires the angle of incidence of sunlight on the measurement target as the sun angle, and the sun direction acquisition unit acquires the direction of the sun with respect to the measurement direction of the reflected light measurement unit as the sun direction.
  • the growth index calculation unit then obtains a growth index representing the degree of growth of the measurement target, based on the light intensities of the reflected light at the first and second wavelengths measured by the reflected light measurement unit, the sun angle acquired by the sun angle acquisition unit, and the sun direction acquired by the sun direction acquisition unit.
  • NDVI (Normalized Difference Vegetation Index; normalized vegetation index)
  • the reflected light measurement unit is a so-called multi-spectral camera including a visible imaging unit that generates an image of visible light and an infrared imaging unit that generates an image of infrared light.
  • Each of the visible imaging unit and the infrared imaging unit includes a bandpass filter, a lens (imaging optical system), and a sensor.
  • in recent years, a multispectral camera is mounted on an unmanned aerial vehicle (also called a drone), and the drone is flown to photograph a field from above, thereby acquiring a visible image and an infrared image and measuring NDVI.
  • the above-mentioned field refers to a paddy field or a field where a crop is grown.
  • the NDVI value fluctuates due to the following causes.
  • the above-mentioned Patent Document 1 does not consider parallax correction for the case where the multispectral camera is mounted on a drone and flown. Even when parallax correction is performed, it is desirable, from the viewpoint of processing efficiency, to perform the parallax correction at the same time as the photographing of the field, and a system that can realize such parallax correction is desired.
  • the present invention has been made to solve the above problems, and its object is to provide a field photographing system and a field photographing method that, in a system in which a plurality of imaging units for acquiring images in different wavelength bands are mounted on a flying object to photograph a field from above, can reduce the influence of parallax between the images acquired by the plurality of imaging units and can perform the parallax correction for reducing that influence simultaneously with the photographing of the field.
  • a field imaging system according to one aspect includes: a flying object; a plurality of imaging units that are held by the flying object and that acquire field images having different wavelength bands by photographing, from above during the flight of the flying object, a field where a crop is grown; a chart photographed by the plurality of imaging units; and a parallax correction unit that corrects, based on chart images obtained by photographing the chart from above by the plurality of imaging units during the flight of the flying object, a positional shift of pixels indicating the same point caused by parallax between the field images. The chart is located in the field or around the field.
  • a field imaging method according to another aspect includes: a field image acquisition step in which a plurality of imaging units held by a flying object photograph, from above during the flight of the flying object, a field where a crop is grown, to acquire field images having different wavelength bands; a chart image acquisition step; and a parallax correction step of correcting, based on the chart images, a positional shift of pixels indicating the same point caused by parallax between the field images.
  • the parallax correction unit corrects a positional shift of a pixel indicating the same point caused by parallax between each field image based on each chart image obtained by photographing a chart with a plurality of imaging units.
  • the parallax correction unit can perform the parallax correction for reducing the influence of the parallax at the same time as the imaging of the field, and can improve the processing efficiency.
  • FIG. 1 is an explanatory diagram schematically showing the overall configuration of a field photographing system according to an embodiment of the present invention.
  • FIG. 2 is an explanatory diagram schematically showing the detailed configuration of the imaging device included in the field photographing system.
  • FIG. 3 is a block diagram illustrating a schematic configuration of a terminal device included in the field photographing system.
  • FIG. 4 is a plan view of a chart included in the field photographing system.
  • FIG. 5 is a plan view showing another configuration of the chart.
  • FIG. 6 is an explanatory diagram showing a state in which the chart is attached to a transport vehicle.
  • FIG. 7 is an explanatory diagram showing a state in which the chart is attached to a helmet.
  • FIG. 8 is an explanatory diagram schematically showing a missing-plant portion in a field that serves as the chart.
  • FIG. 10 is a block diagram illustrating a detailed configuration of the flying object included in the field photographing system.
  • FIG. 11 is a flowchart showing the flow of processing of the field photographing method implemented by the field photographing system.
  • FIG. 12 is a flowchart showing the flow of the parallax correction processing in the field photographing system.
  • FIG. 13 is an explanatory diagram schematically showing chart images extracted from a plurality of captured images.
  • FIG. 14 is an explanatory diagram schematically showing the projective transformation.
  • FIG. 1 is an explanatory diagram schematically showing the overall configuration of the field photographing system 1 of the present embodiment.
  • the field photographing system 1 is a system for photographing a field FD that grows a crop PL, and includes a flying object 10, an imaging device 20, a terminal device 30, and a chart C.
  • the flying object 10 and the terminal device 30 are communicably connected via a communication line NW.
  • the communication line NW is configured by a network (whether wired or wireless) such as a LAN (Local Area Network) or an Internet line.
  • the communication function of the flying object 10 may be provided to the imaging device 20, and the imaging device 20 and the terminal device 30 may be communicably connected via the communication line NW.
  • the imaging device 20 is configured by, for example, a multispectral camera, and is held by the flying object 10.
  • FIG. 2 schematically shows a detailed configuration of the imaging device 20.
  • the imaging device 20 includes a first imaging unit 21 and a second imaging unit 22.
  • the first imaging unit 21 is a visible imaging unit that acquires an image in the wavelength band of visible light (a visible image) by photographing the field FD.
  • the second imaging unit 22 is a near-infrared imaging unit that acquires an image in the wavelength band of near-infrared light (a near-infrared image) by photographing the field FD. Since the imaging device 20 includes the first imaging unit 21 and the second imaging unit 22, field images having different wavelength bands can be acquired by photographing the field FD from above while the flying object 10 is flying.
  • the first imaging unit 21 includes a first bandpass filter 21a, a first imaging optical system 21b, a first image sensor (optical sensor) 21c, a first digital signal processor (not shown), and the like.
  • the first bandpass filter 21a is an optical filter that transmits light in a relatively narrow band having a center wavelength of, for example, 650 nm.
  • the first imaging optical system 21b includes at least one lens, and forms an optical image of the visible light of the measurement target (the crop PL or the field FD) transmitted through the first bandpass filter 21a on a first imaging surface.
  • the first image sensor 21c is composed of, for example, a VGA (640 pixels × 480 pixels) CCD (Charge Coupled Device) sensor or a CMOS (Complementary Metal Oxide Semiconductor) sensor, is arranged such that its light receiving surface coincides with the first imaging surface, detects the relatively narrow band of light centered on the wavelength of 650 nm contained in the sunlight reflected by the measurement target, and converts the optical image of the visible light of the measurement target into an electric signal.
  • the first digital signal processor performs image processing on the output of the first image sensor 21c to form a visible image.
  • the image data Rv of the visible image is output to the control unit 11 of the flying object 10 (see FIG. 10).
  • although the transmission wavelength band of the first bandpass filter 21a is the red (R) wavelength band in the present embodiment, it may be another wavelength band such as blue (B) or green (G).
  • the second imaging unit 22 includes a second bandpass filter 22a, a second imaging optical system 22b, a second image sensor (optical sensor) 22c, a second digital signal processor (not shown), and the like.
  • the second bandpass filter 22a is an optical filter that transmits light in a relatively narrow band having a center wavelength of a predetermined wavelength of 750 nm or more (for example, a wavelength of 800 nm).
  • the second imaging optical system 22b includes at least one lens, and forms an optical image of the near-infrared light of the measurement target transmitted through the second bandpass filter 22a on a second imaging surface.
  • the second image sensor 22c is composed of, for example, a VGA type CCD sensor or a CMOS sensor, is arranged such that its light receiving surface coincides with the second imaging surface, detects the relatively narrow band of light with a center wavelength of 800 nm contained in the sunlight reflected by the measurement target, and converts the optical image of the near-infrared light of the measurement target into an electric signal.
  • the second digital signal processor performs image processing on the output of the second image sensor 22c to form a near-infrared image.
  • the image data Ri of the near-infrared image is output to the control unit 11 of the flying object 10.
  • FIG. 3 is a block diagram illustrating a schematic configuration of the terminal device 30.
  • the terminal device 30 is configured by, for example, a PC (personal computer), and includes an input unit 31, a display unit 32, a storage unit 33, a communication unit 34, and a control unit 35.
  • the terminal device 30 may be configured by a multifunctional portable terminal (for example, a smartphone or a tablet terminal).
  • the input unit 31 is operated by the user, and is provided for receiving input of information by the user.
  • Such an input unit 31 is specifically configured by an operation unit such as a keyboard, a mouse, and a touchpad.
  • the display unit 32 is a display that displays various types of information, and is configured by, for example, a liquid crystal display device.
  • the storage unit 33 is constituted by, for example, a hard disk, and stores various information in addition to a program for operating the control unit 35.
  • a calculation formula (distortion correction information) for correcting the distortion of the visible image and the near-infrared image acquired by the imaging device 20 (the first imaging unit 21 and the second imaging unit 22) is determined in advance according to the design of the first imaging optical system 21b and the second imaging optical system 22b. Such distortion correction information is stored in the storage unit 33 in advance and is provided to the flying object 10 via the communication line NW when distortion correction processing is performed inside the flying object 10. Note that the distortion correction described above and the parallax correction described below may instead be performed by the control unit 35 of the terminal device 30.
  • the communication unit 34 is an interface for communicating with the flying object 10; it includes a transmission circuit, a reception circuit, a modulation circuit, a demodulation circuit, an antenna, and the like, and is communicably connected to the communication line NW.
  • the control unit 35 is constituted by, for example, a CPU (Central Processing Unit), operates according to the operation program stored in the storage unit 33, controls the operation of each unit of the terminal device 30, and performs computation processing as necessary.
  • the chart C is located in the field FD or around the field FD, and is used when performing parallax correction described later.
  • the chart C is attached to an object OB located in the field FD or around the field FD, and is thereby located in the field FD or around the field FD.
  • FIG. 1 shows an example in which the take-off and landing platform 40 located around the field FD is used as the object OB, and the chart C is attached to the take-off and landing platform 40.
  • the take-off and landing platform 40 is a sheet (platform) installed on the ground in order to prevent the surrounding sand from being stirred up when the flying object 10 takes off and lands. Such a chart C is photographed from above by the first imaging unit 21 and the second imaging unit 22 during the flight of the flying object 10.
  • FIG. 4 is a plan view of the chart C.
  • the chart C has a pattern including at least one circular portion C1 (for example, a black circle). In order to distinguish the chart C for parallax correction from patterns that exist naturally (without being intentionally created) in the field FD or around the field FD, it is desirable that the chart C has a pattern including a plurality of circular portions C1, more preferably three or more circular portions C1, and it is further desirable that the pattern includes, in addition to the circular portions C1, a circular portion C2 having a different reflectance (for example, a circular portion colored with a color different from the black of the circular portions C1 and the background color of the sheet). Further, as shown in FIG. 5, the chart C may have a lattice pattern.
  • the overall shape of the chart C is preferably a square, but may be another shape such as a rectangle.
  • the object OB to which the chart C is attached is not limited to the take-off and landing platform 40 described above.
  • as shown in FIG. 6, a transport vehicle 41 that transports the flying object 10 may be used as the object OB, and the chart C may be attached to the transport vehicle 41 (for example, to its roof or hood).
  • as shown in FIG. 7, a helmet 42 worn by a pilot who remotely controls the flight of the flying object 10 may be used as the object OB, and the chart C (for example, one circular portion C1) may be attached to the top of the helmet 42.
  • an incident light sensor installed on the ground (around the field FD) may also be used as the object OB, and the chart C may be attached to this incident light sensor. The incident light sensor is provided to correct the image data Rv of the visible image and the image data Ri of the near-infrared image acquired by the imaging device 20 according to the weather (clear or cloudy), and corresponds to the sunlight measuring unit of Patent Document 1.
  • further, a generator (located around the field FD) for charging the battery 17 (see FIG. 10) of the flying object 10, a field server such as a thermometer or a water level gauge installed in the field FD, a base station for RTK-GPS (Real Time Kinematic Global Positioning System), agricultural machines such as tractors and combines, and irrigation equipment such as floodgates (located in the field FD or around the field FD) may be used as the object OB, and the chart C may be attached to any of these objects OB.
  • the lacking portion N of the crop PL located in the field FD may be used as the chart C.
  • the lacking portion N of the crop PL refers to a portion (region) where the crop PL does not grow from the beginning in the field FD.
  • for example, when the crop PL is rice, the missing portion N is a portion where rice has not been planted from the beginning.
  • in FIG. 8, a hatched portion indicates a region where the crop PL is growing in the field FD, and the white portions without hatching correspond to the missing portions N.
  • the reflection spot S of sunlight reflected on the water surface when the field FD is immersed in water may be used as the chart C.
  • the flying object 10 shown in FIG. 1 is configured by, for example, an unmanned aerial vehicle (drone) capable of autonomous flight.
  • the flying object 10 may be configured by a balloon, an airship, an airplane, a helicopter, or the like, in addition to the unmanned aircraft described above.
  • the flight of the flying object 10 is remotely controlled by a ground pilot (operator) operating the operation unit 50 (see FIG. 10).
  • by flying the flying object 10 along the field FD, a plurality of regions T constituting the field FD are photographed from the sky by the imaging device 20, and an image (for example, a visible image and a near-infrared image) can be obtained for each region T.
  • an image of the entire field FD can be finally obtained by stitching together the images of the respective regions T.
  • alternatively, by raising the altitude of the flying object 10 and photographing the entire field FD at once, it is also possible to acquire a single image of the entire field FD without stitching together the images of the respective regions T.
  • FIG. 10 is a block diagram showing a detailed configuration of the flying object 10.
  • the flying object 10 includes a control unit 11, a communication unit 12, a warning unit 13, a position information acquisition unit 14, an image storage unit 15, a flight drive unit 16, and a battery 17.
  • the flight drive unit 16 has, for example, a propeller, a motor, and the like, and causes the flying object 10 to fly by driving the motor to rotate the propeller.
  • the flight control of the flying object 10 is performed by the control unit 11.
  • the control unit 11 controls the flight drive unit 16 based on the operation of the operation unit 50 by the pilot on the ground.
  • the battery 17 is a battery that supplies driving power to each part of the flying object 10.
  • the control unit 11 includes, for example, a CPU, and includes a general control unit 11a, a distortion correction unit 11b, a parallax correction unit 11c, a growth index calculation unit 11d, and an image synthesis unit 11e. That is, the CPU functions as the overall control unit 11a, the distortion correction unit 11b, the parallax correction unit 11c, the growth index calculation unit 11d, and the image synthesis unit 11e.
  • the overall control unit 11a controls the operation of each unit of the flying object 10 according to an operation program stored in a program storage unit (not shown). The details of the distortion correcting unit 11b, the parallax correcting unit 11c, the growth index calculating unit 11d, and the image synthesizing unit 11e will be described later in the operation description.
  • the communication unit 12 is an interface for communicating with the terminal device 30; it includes a transmission circuit, a reception circuit, a modulation circuit, a demodulation circuit, an antenna, and the like, and is communicably connected to the communication line NW.
  • the warning unit 13 issues a warning under the control of the control unit 11 (for example, the overall control unit 11a) when the amount of change in the positional deviation of pixels due to parallax during a predetermined period (for example, one day or one week), calculated by the parallax correction unit 11c of the control unit 11, exceeds a threshold.
  • such a warning unit 13 can be configured with at least one of a display unit 13a (for example, a liquid crystal display device or a simple light source) for displaying a warning, a sound output unit 13b (for example, a speaker) for emitting a warning sound, and a communication unit (for example, the communication unit 12) for transmitting warning information to the outside.
  • the position information acquisition unit 14 is configured by, for example, a GPS receiver, and acquires position information (latitude X, longitude Y, height Z) of the flying object 10 (the imaging device 20). Thereby, the control unit 11 (for example, the overall control unit 11a) can control the flight drive unit 16 based on the position information and fly the flying object 10 to a predetermined height above a predetermined region T of the field FD.
  • the image storage unit 15 is configured by, for example, a nonvolatile memory, and stores the image data of the visible image and the near-infrared image acquired by the first imaging unit 21 and the second imaging unit 22 of the imaging device 20, as well as the growth index obtained by the growth index calculation unit 11d (in particular, the image data of the NDVI image obtained based on the pixel values after parallax correction).
  • FIG. 11 is a flowchart illustrating a flow of processing according to the field photographing method of the present embodiment.
  • the chart C is attached to the take-off and landing platform 40 located around the field FD.
  • the flying object 10 is caused to fly to photograph the field FD and the chart C (S1; photographing step). More specifically, the pilot takes off the flying object 10 from the take-off and landing platform 40 by operating the operation unit 50. Then, during the flight of the flying object 10, the first imaging unit 21 and the second imaging unit 22 held by the flying object 10 photograph the field FD from above and acquire field images having different wavelength bands (S1-1; field image acquisition step). That is, the first imaging unit 21 acquires a visible image of the field FD as one field image, and the second imaging unit 22 acquires a near-infrared image of the field FD as the other field image. The acquired image data Rv of the visible image and image data Ri of the near-infrared image are input from the first imaging unit 21 and the second imaging unit 22, respectively, to the control unit 11 of the flying object 10.
  • the chart C attached to the take-off and landing platform 40 is photographed from the sky by the first imaging unit 21 and the second imaging unit 22 to acquire chart images (S1-2; chart image acquisition step). That is, the first imaging unit 21 acquires a visible image of the chart C as one chart image, and the second imaging unit 22 acquires a near-infrared image of the chart C as the other chart image.
  • note that the flight of the flying object 10 is assumed to include the ascent of the flying object 10 from taking off from the take-off and landing platform 40 until reaching a predetermined altitude above the field, and the descent of the flying object 10 from the predetermined altitude in the sky until landing on the take-off and landing platform 40.
  • Image data of each chart image acquired by the first imaging unit 21 and the second imaging unit 22 is input to the control unit 11 of the flying object 10, respectively.
  • the first imaging unit 21 and the second imaging unit 22 simultaneously perform the field image acquiring step of S1-1 and the chart image acquiring step of S1-2. That is, the first imaging unit 21 and the second imaging unit 22 simultaneously capture the chart C while capturing the field FD from above. As a result, a field image showing the chart C (field image also serving as the chart image) is obtained for each of visible and near-infrared.
  • next, based on the distortion correction information (the calculation formula for correcting distortion) provided from the terminal device 30 via the communication unit 12, the distortion correction unit 11b of the control unit 11 corrects the image data of each image (each field image and each chart image) output from the first imaging unit 21 and the second imaging unit 22 (S2; distortion correction step). Thereby, distortion such as barrel distortion and pincushion distortion in each image is corrected, and the subsequent parallax correction can be appropriately performed using the images whose distortion has been corrected. A minimal sketch of this step is given below.
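The following sketch assumes Python with OpenCV and assumes that the design-derived correction information can be expressed as OpenCV's standard camera matrix and distortion coefficients; the patent itself only specifies a pre-determined calculation formula, so all names and values here are hypothetical.

```python
import cv2
import numpy as np

def undistort_image(image: np.ndarray,
                    camera_matrix: np.ndarray,
                    dist_coeffs: np.ndarray) -> np.ndarray:
    """Correct barrel/pincushion distortion of one captured image, using
    correction information determined in advance from the design of the
    imaging optical system (stored in the terminal device and provided
    to the flying object via the communication line NW)."""
    return cv2.undistort(image, camera_matrix, dist_coeffs)

# Hypothetical correction information for the first (visible) imaging unit.
K_visible = np.array([[600.0, 0.0, 320.0],
                      [0.0, 600.0, 240.0],
                      [0.0, 0.0, 1.0]])
dist_visible = np.array([-0.12, 0.05, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3
# visible_corrected = undistort_image(visible_raw, K_visible, dist_visible)
```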
  • next, based on the chart images obtained by photographing the chart C from above by the first imaging unit 21 and the second imaging unit 22, the parallax correction unit 11c of the control unit 11 performs parallax correction (calibration) for correcting the positional shift of pixels indicating the same point caused by parallax between the field images (S3; parallax correction step). The details of the parallax correction in S3 will be described later.
  • next, the overall control unit 11a determines whether or not the amount of change in the positional deviation during a predetermined period exceeds a threshold (S4). For example, when the visible image and the near-infrared image are superimposed on each other, the overall control unit 11a takes the difference, before and after the lapse of a predetermined period (for example, one day or one week), in the length of the straight line connecting the positions indicating the same point, treats this difference as the above-mentioned "amount of change in the positional deviation", and determines whether the amount of change exceeds the threshold. When the amount of change exceeds the threshold, the warning unit 13 issues a warning under the control of the overall control unit 11a (S5; warning step). A small sketch of this determination is given after this list.
  • in this warning step, at least one of displaying a warning on the display unit 13a, outputting a warning sound from the sound output unit 13b, and transmitting warning information to the outside via the communication unit 12 is performed.
  • when the amount of change does not exceed the threshold, the process proceeds directly to S6.
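As a concrete illustration of the determination in S4 and S5, the following sketch measures the positional deviation as the length of the straight line connecting the positions of the same point in the superimposed visible and near-infrared images, and compares its change over the predetermined period with a threshold; the function names and the threshold value are assumptions.

```python
import math

def deviation_length(p_visible, p_nir):
    """Length of the straight line connecting the positions that indicate
    the same point when the visible and near-infrared images are overlaid."""
    return math.hypot(p_nir[0] - p_visible[0], p_nir[1] - p_visible[1])

def needs_warning(deviation_before, deviation_after, threshold_px=5.0):
    """S4: the 'amount of change in the positional deviation' is the
    difference in the deviation length before and after the predetermined
    period (e.g. one day or one week); S5: warn when it exceeds the
    threshold (5.0 px is an assumed value)."""
    return abs(deviation_after - deviation_before) > threshold_px
```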
  • the growth index calculation unit 11d of the control unit 11 calculates the growth index of the crop PL based on each field image after the positional shift has been corrected by the parallax correction unit 11c (S6; growth index calculation step).
  • NDVI is used as the growth index.
  • NDVI is an index indicating the distribution and activity of vegetation, and is calculated as NDVI = (Ri − Rv) / (Ri + Rv), using the image data Rv of one field image (the visible image) and the image data Ri of the other field image (the near-infrared image) in which the positional shift due to distortion and parallax has been corrected as described above.
  • NDVI is a value normalized between −1 and 1, and a larger positive value indicates denser vegetation.
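The NDVI calculation of S6 can be sketched as follows, assuming the red (visible) image data Rv and the near-infrared image data Ri have already had their distortion and parallax corrected so that the same array index refers to the same point in the field; the helper name is hypothetical.

```python
import numpy as np

def ndvi(rv: np.ndarray, ri: np.ndarray) -> np.ndarray:
    """Per-pixel NDVI = (Ri - Rv) / (Ri + Rv) on registered images."""
    rv = rv.astype(np.float64)
    ri = ri.astype(np.float64)
    denom = ri + rv
    out = np.zeros_like(denom)
    # Guard against division by zero where both bands are 0.
    np.divide(ri - rv, denom, out=out, where=denom != 0)
    return out  # normalized to the range [-1, 1]
```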
  • the growth index calculation unit 11d may calculate, as the growth index, the vegetation cover rate (planting rate) instead of NDVI.
  • the vegetation cover rate indicates the rate at which the crop PL covers the ground surface of the field FD.
  • the growth index calculation unit 11d can calculate the planting rate by binarizing the near-infrared image, after the positional shift due to distortion and parallax has been corrected, into a binary image of white and black, and calculating the ratio of the white portion in the image; in the binary image, a white portion corresponds to the crop PL, and a black portion corresponds to the soil.
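The planting rate calculation could be realized, for example, as in the following sketch; Otsu's method is assumed here as one concrete way of choosing the binarization threshold, which the patent does not specify.

```python
import cv2
import numpy as np

def planting_rate(nir_corrected: np.ndarray) -> float:
    """Binarize the distortion/parallax-corrected near-infrared image into
    white (crop PL) and black (soil) and return the white-pixel ratio."""
    nir8 = cv2.normalize(nir_corrected, None, 0, 255,
                         cv2.NORM_MINMAX).astype(np.uint8)
    # Otsu's method picks the threshold automatically (an assumption).
    _, binary = cv2.threshold(nir8, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return float(np.count_nonzero(binary)) / binary.size
```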
  • next, the image combining unit 11e of the control unit 11 combines the NDVI images obtained for the respective regions T to generate one NDVI image as a whole (S7; image combining step).
  • “combining” here refers to connecting a plurality of NDVI images side by side. If the number of obtained NDVI images is originally one, image synthesis is unnecessary, and thus the process of S7 is skipped and the process ends.
  • FIG. 12 is a flowchart illustrating the flow of the parallax correction process.
  • first, for each of the first imaging unit 21 and the second imaging unit 22, the parallax correction unit 11c extracts, from the plurality of captured images acquired at different timings while the flying object 10 is moving in the sky, the images that include the image of the chart C as the above-described chart images (S11).
  • for example, the parallax correction unit 11c extracts, from the plurality of captured images, the images in which at least one (preferably four) image corresponding to a circular portion C1 of the chart C is present, as the chart images including the image of the chart C. Such chart image extraction is performed on the plurality of captured images obtained by each of the first imaging unit 21 and the second imaging unit 22.
  • FIG. 13 schematically illustrates the chart images extracted by the parallax correction unit 11c from the plurality of captured images. Here, the first imaging unit 21 acquires the chart image CR-1 at time t1, the chart image CR-2 at time t2, and the chart image CR-3 at time t3, while the second imaging unit 22 acquires the chart image Ci-1 at time t1, the chart image Ci-2 at time t2, and the chart image Ci-3 at time t3; the chart images are shown arranged in chronological order.
  • CR denotes a chart image in the visible wavelength band, and Ci denotes a chart image in the near-infrared wavelength band.
  • in FIG. 13, in each chart image, the image corresponding to the circular portion C1 of the chart C is indicated by C1′.
  • next, the parallax correction unit 11c extracts the area CA of the chart C from each of the extracted chart images (S12). For example, the parallax correction unit 11c calculates the positions (coordinates) of the images C1′ corresponding to the circular portions C1 in the extracted chart image, and counts the number of images C1′. When there are four images C1′, the parallax correction unit 11c extracts the area surrounded by the four images C1′ as the area CA of the chart C if the distances between adjacent images C1′ in the chart image are within a predetermined range and the shape surrounding the four images C1′ is a square. Further, for example, when there is one image C1′ in the extracted chart image, a square area of a predetermined size surrounding the image C1′ is extracted as the area CA of the chart C. One possible realization of S11 and S12 is sketched below.
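This sketch uses OpenCV's Hough circle transform to find candidate images C1′ in an 8-bit grayscale captured image and checks that four of them form a roughly square arrangement; the detector choice, parameters, and tolerances are all assumptions not fixed by the patent.

```python
import cv2
import numpy as np

def extract_chart_area(gray: np.ndarray):
    """Return the chart area CA as (x, y, w, h), or None when the image
    of the chart C cannot be identified in this captured image."""
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2,
                               minDist=10, param1=100, param2=30,
                               minRadius=3, maxRadius=40)
    if circles is None:
        return None
    centers = circles[0, :, :2]            # (x, y) of each candidate C1'
    if len(centers) < 4:
        return None
    pts = centers[:4]
    # Check that the mutual distances are within a predetermined range
    # (for a square, diagonal/side is sqrt(2); 3.0 is a loose tolerance).
    dists = [np.linalg.norm(pts[i] - pts[j])
             for i in range(4) for j in range(i + 1, 4)]
    if max(dists) > 3.0 * min(dists):
        return None
    x, y = pts.min(axis=0)
    w, h = pts.max(axis=0) - pts.min(axis=0)
    return float(x), float(y), float(w), float(h)
```

The reference point P of the next step can then be taken as the average of the four detected centers, that is, the center of the area CA.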
  • next, based on the chart images captured at the same time by the first imaging unit 21 and the second imaging unit 22, the parallax correction unit 11c calculates a projection matrix for converting the position coordinates of a reference point P included in one chart image (for example, the visible image) into the position coordinates of the corresponding point Q (indicating the same point in the field FD) in the other chart image (for example, the near-infrared image) (S13).
  • the reference point P and the corresponding point Q are, for example, the center positions, in the respective chart images, of the area CA of the chart C obtained in S12.
  • when the area CA contains four images C1′, the center position is the average of the positions of the four images C1′ (the coordinate position obtained by summing the x coordinates and the y coordinates respectively and dividing by 4); when the area CA contains one image C1′, the position of that image C1′ can be used as the center position.
  • here, a method of calculating the projection matrix will be described.
  • projecting a certain plane onto another plane using a projection matrix is called homography (projective transformation).
  • as the projective transformation, an affine transformation represented by the following Equation 1 is generally known (see, for example, http://zellij.hatenablog.com/entry/20120523/p1). In the present embodiment, the projective transformation is this affine transformation:

    x′ = a·x + b·y + tx
    y′ = c·x + d·y + ty   (Equation 1)

  • here, the coordinates of an arbitrary point O on the xy plane are (x, y), and the coordinates of the point O′ obtained by projecting the point O onto the x′y′ plane by the projection matrix (which corresponds, in homogeneous form, to the 3-row, 3-column matrix [[a, b, tx], [c, d, ty], [0, 0, 1]] of Equation 1) are (x′, y′).
  • the projective transformation is a transformation that combines a linear transformation and a translation.
  • the linear transformation includes scaling, shearing (skew), and rotation.
  • the coefficients tx and ty are the coefficients for the translation, and the remaining coefficients a, b, c, and d are the coefficients for the linear transformation.
  • calculating the projection matrix means calculating the six parameters of the projection matrix: the coefficients a, b, c, d, tx, and ty. Since there are six unknown parameters (coefficients), if six equations can be established, all six unknown parameters can be obtained (that is, the projection matrix can be calculated). Here, if the position coordinates of one point before and after the projective transformation are known, two equations (one for x′ and one for y′) can be established by substituting those coordinates into Equation 1. Therefore, by substituting the position coordinates before and after the projective transformation for three points into Equation 1, six equations can be established, and all six parameters can be obtained by solving them simultaneously.
  • in the present embodiment, the parallax correction unit 11c substitutes the position coordinates of three points into Equation 1 and calculates the projection matrix as follows. First, the coordinates (x1, y1) of the reference point P on the xy plane of the chart image CR-1 acquired at time t1 and the coordinates (x1′, y1′) of the corresponding point Q on the x′y′ plane of the chart image Ci-1 acquired at the same time are substituted into Equation 1.
  • next, the coordinates (x2, y2) of the reference point P on the xy plane of the chart image CR-2 acquired at time t2 and the coordinates (x2′, y2′) of the corresponding point Q on the x′y′ plane of the chart image Ci-2 acquired at the same time are substituted into Equation 1.
  • further, the coordinates (x3, y3) of the reference point P on the xy plane of the chart image CR-3 acquired at time t3 and the coordinates (x3′, y3′) of the corresponding point Q on the x′y′ plane of the chart image Ci-3 acquired at the same time are substituted into Equation 1.
  • since the flying object 10 is moving, the coordinates (x1, y1), (x2, y2), and (x3, y3) of the reference point P are different from each other, and likewise the coordinates (x1′, y1′), (x2′, y2′), and (x3′, y3′) of the corresponding point Q are different from each other, so that six independent equations are obtained and the six parameters of the projection matrix can be determined.
  • in the present embodiment, the projection matrix is calculated by establishing six equations through substituting the position coordinates of three pairs of the reference point P and the corresponding point Q into Equation 1. However, when, for example, the linear transformation is unnecessary, the coefficients a, b, c, and d of the projection matrix may all be set to 0, and the projection matrix may be calculated from the position coordinates of one pair of the reference point P and the corresponding point Q.
  • conversely, eight or more equations can be established by substituting the position coordinates of four or more pairs of the reference point P and the corresponding point Q into Equation 1.
  • in that case, a plurality of groups of six equations arbitrarily selected from the eight or more equations may be formed, the coefficients of the projection matrix may be obtained for each group, and the final coefficients may be determined from the coefficients obtained for the respective groups by the least squares method.
  • in this way, each coefficient, that is, the projection matrix, can be obtained with high accuracy. A least-squares sketch of this calculation follows.
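This sketch solves Equation 1 over all available point pairs in the least-squares sense, which coincides with solving the six equations exactly when exactly three pairs are given; the single least-squares fit stands in for the patent's group-wise selection of six equations, and all names and coordinate values are hypothetical.

```python
import numpy as np

def estimate_projection_matrix(ref_pts, cor_pts):
    """Estimate the matrix of Equation 1 from pairs of reference points
    P = (x, y) and corresponding points Q = (x', y').

    Each pair yields two equations:
        x' = a*x + b*y + tx
        y' = c*x + d*y + ty
    Three pairs determine the six parameters exactly; four or more pairs
    are solved in the least-squares sense.
    """
    ref = np.asarray(ref_pts, dtype=np.float64)
    cor = np.asarray(cor_pts, dtype=np.float64)
    n = len(ref)
    A = np.zeros((2 * n, 6))
    rhs = np.zeros(2 * n)
    for i in range(n):
        x, y = ref[i]
        A[2 * i] = [x, y, 1.0, 0.0, 0.0, 0.0]      # equation for x'
        A[2 * i + 1] = [0.0, 0.0, 0.0, x, y, 1.0]  # equation for y'
        rhs[2 * i], rhs[2 * i + 1] = cor[i]
    (a, b, tx, c, d, ty), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return np.array([[a, b, tx],
                     [c, d, ty],
                     [0.0, 0.0, 1.0]])

# Hypothetical reference points P from CR-1..CR-3 and corresponding
# points Q from Ci-1..Ci-3 (centers of the chart area CA at t1, t2, t3).
P = [(120.0, 80.0), (200.0, 150.0), (310.0, 90.0)]
Q = [(118.5, 83.0), (198.2, 152.8), (308.9, 93.1)]
H = estimate_projection_matrix(P, Q)
```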
  • the parallax correction unit 11c converts the position coordinates of all points included in one field image (for example, a visible image) by the projection matrix obtained in S13 (S14). Thereby, the positional deviation of the pixels indicating the same point, which is caused by the parallax between one field image and the other field image (for example, a near-infrared image), is corrected. That is, each pixel of one field image is associated with a pixel representing the same point in the field FD in the other field image.
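S14 then amounts to resampling one field image with the obtained matrix so that each of its pixels lines up with the pixel of the same field point in the other field image. A sketch using OpenCV's affine warp is shown below; since Equation 1 is affine, only the top two rows of the 3x3 matrix are needed, and the variable names continue the earlier sketches.

```python
import cv2
import numpy as np

def apply_projection(field_image: np.ndarray, H: np.ndarray) -> np.ndarray:
    """Convert the position coordinates of all points of one field image
    (e.g. the visible image) by the projection matrix H of Equation 1 (S14)."""
    h, w = field_image.shape[:2]
    return cv2.warpAffine(field_image, H[:2, :], (w, h))

# visible_aligned = apply_projection(visible_corrected, H)
# ndvi_image = ndvi(visible_aligned, nir_corrected)  # see the NDVI sketch
```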
  • as described above, in the present embodiment, in the field image acquisition step, the field FD is photographed from the sky by the first imaging unit 21 and the second imaging unit 22 during the flight of the flying object 10, and field images of the field FD having different wavelength bands (a visible image and a near-infrared image) are acquired (S1-1).
  • the chart image acquisition step the chart C is photographed from above by the first imaging unit 21 and the second imaging unit 22 during the flight of the flying object 10 to acquire each chart image (S1-2).
  • the parallax correction unit 11c corrects a positional shift of a pixel indicating the same point caused by parallax between each field image based on each chart image (S3).
  • the parallax correction unit 11c can perform parallax correction based on the chart image at the same time as the image of the field FD. Therefore, for example, the efficiency of processing can be improved as compared with a case where parallax correction is performed at a timing different from the imaging of the field FD. That is, the first image capturing unit 21 and the second image capturing unit 22 simultaneously perform the field image obtaining step (S1-1) and the chart image obtaining step (S1-2), thereby improving the processing efficiency. be able to. Further, since the field image and the chart image can be obtained simultaneously by one photographing (since one photographed image can serve as the field image and the chart image), an increase in the number of photographing steps can be avoided.
  • the above-mentioned flight of the flying object 10 includes the rising and falling of the flying object 10. Therefore, not only while the flying object 10 is flying over the field FD at a predetermined altitude, but also while the flying object 10 is rising or descending, the first imaging unit 21 and the second imaging unit 22 Since a chart image can be obtained, the parallax can be corrected by the parallax correction unit 11c even while the flying object 10 is rising or falling.
  • the chart C can be positioned using the object OB by attaching the chart C to the object OB. That is, in this case, the object OB can be effectively used in setting the chart C.
  • the take-off and landing platform 40 of the flying object 10, the transport vehicle 41 for transporting the flying object 10, and the helmet 42 worn by the pilot who remotely controls the flight of the flying object 10 are always required when the field FD is photographed using the flying object 10. Therefore, by attaching the chart C to any one of these objects OB (the take-off and landing platform 40, the transport vehicle 41, or the helmet 42), the chart C can be installed on the object OB quickly, without newly searching for an installation place for the chart C or separately providing one, and the photographing of the field can be started quickly.
  • the chart C may be directly mounted on the head of the pilot who remotely controls the flight of the flying object 10 without using the helmet 42. From this, it can be said that the chart C may be worn on the head of the pilot who remotely controls the flight of the flying object 10 regardless of whether the pilot wears the helmet 42 or not.
  • the above-described incident light sensor, generator, and RTK-GPS base station may also be required when photographing the field FD using the flying object 10 (for example, when the weather fluctuates drastically, when the field FD is very large and the battery 17 needs to be charged during photographing, or when the position information acquisition unit 14 needs to communicate with the base station to acquire position information).
  • the field server, the agricultural machine, and the irrigation equipment described above are indispensable for the management of the field FD, and are always present in the field FD or around the field FD. Therefore, the chart C may be attached to these objects OB. In these cases as well, the chart C can be installed on the object OB quickly while effectively utilizing the object OB, and the photographing of the field FD can be started quickly.
  • when the chart C has a pattern including at least one circular portion C1 or a lattice pattern, the image of the chart C (including the images C1′) clearly appears in each chart image when the chart C is photographed from above the field FD by the first imaging unit 21 and the second imaging unit 22. Accordingly, the parallax correction unit 11c can reliably obtain the position of the image of the chart C based on the pixel values, and can therefore perform the parallax correction accurately and reliably.
  • when the missing portion N of the crop PL located in the field FD or the reflection spot S of sunlight reflected on the water surface of the field FD is used as the chart C, the first imaging unit 21 and the second imaging unit 22 can acquire the chart images at the same time as photographing the field FD, and the parallax correction unit 11c can perform the parallax correction based on those chart images. Therefore, when performing the parallax correction, it is not necessary to attach the chart C to an object OB in or around the field FD. Further, even when such an object OB does not exist, the parallax correction can be performed by utilizing (effectively utilizing) the above-described missing portion N or the reflection spot S of sunlight as the chart C.
  • further, based on the chart images captured at the same time by the first imaging unit 21 and the second imaging unit 22, the parallax correction unit 11c calculates the projection matrix for converting the position coordinates of the reference point P included in one chart image into the position coordinates of the corresponding point Q in the other chart image, and corrects the positional shift by converting the position coordinates of all points included in one field image by the projection matrix (S13, S14). Accordingly, even if the internal state or the external state of the first imaging unit 21 and the second imaging unit 22 changes due to vibration of the flying object 10 during flight (for example, a deviation of the optical axes or of the mounting directions), each pixel of one field image (the visible image) can be associated with the pixel representing the same point in the field FD in the other field image (the near-infrared image).
  • in each of the first imaging unit 21 and the second imaging unit 22, the parallax correction unit 11c extracts, from the plurality of captured images acquired at different timings while the flying object 10 is moving in the sky, the images including the image of the chart C as the chart images, obtains the center position of the image of the chart C from each extracted chart image, and calculates the projection matrix using the center position of the image of the chart C in one chart image photographed at a given time as the reference point P, and the center position of the image of the chart C in the other chart image photographed at the same time as the corresponding point Q (S11 to S13). In this way, the projection matrix required for the parallax correction can be appropriately obtained.
  • further, the growth index calculation unit 11d calculates the growth index (for example, the NDVI value) of the crop PL based on each field image after the positional shift has been corrected by the parallax correction unit 11c (S6). Since the influence of parallax is reduced by the parallax correction of the parallax correction unit 11c described above, fluctuation of the NDVI value due to parallax can be effectively prevented.
  • thereby, the growth index calculation unit 11d can acquire an accurate NDVI value at an arbitrary point in the field FD, and deterioration in the image quality of the NDVI image can be avoided. Further, even when the optical systems included in the first imaging unit 21 and the second imaging unit 22 are contaminated by adhering dust and the pixel values of the field images contain noise, correcting the positional shift of the pixels due to parallax as in the present embodiment can suppress the fluctuation of the NDVI value due to the noise at an arbitrary point in the field FD, as compared with the case where such correction is not performed.
  • the warning unit 13 issues a warning when the amount of change in the positional deviation during a predetermined period exceeds a threshold (S5).
  • the warning in S5 promptly prompts the user to take necessary measures such as inspection and repair, and enables the user to quickly deal with a failure.
  • the warning unit 13 includes at least one of the display unit 13a, the sound output unit 13b, and the communication unit 12, and in S5, at least one of displaying the warning, outputting the warning sound, and transmitting the warning information is performed, so that the user can be reliably notified that an abnormality such as a failure has occurred and prompted to take necessary measures such as inspection.
  • in the parallax correction, it may be automatically determined whether or not there are three or more images with different chart positions; if the number of such images is less than three, a warning may be issued, and if there are three or more, it may be notified that the images are appropriate.
  • the field imaging system has a calibration mode, and in the calibration mode, the flying object 10 may be made to automatically fly along a predetermined route, or may be provided with a program for executing such an automatic flight.
  • the field photographing system and the field photographing method of the present embodiment described above can be expressed as follows.
  • the field imaging system of the present embodiment includes: a flying object; a plurality of imaging units that are held by the flying object and that acquire field images having different wavelength bands by photographing, from above during the flight of the flying object, a field where a crop is grown; a chart photographed by the plurality of imaging units; and a parallax correction unit that corrects, based on chart images obtained by photographing the chart from above by the plurality of imaging units during the flight of the flying object, a positional shift of pixels indicating the same point caused by parallax between the field images. The chart is located in the field or around the field.
  • the chart may be attached to an object located in the field or around the field.
  • the object may be a take-off and landing platform of the flying object.
  • the object may be a transport vehicle that transports the flying object.
  • the chart may be mounted on a head of a pilot who remotely controls the flight of the flying object.
  • the chart may have a pattern including at least one circular portion or a lattice pattern.
  • the chart may be a missing part of the crop located in the field, or a reflection spot of sunlight reflected on the water surface in the field.
  • in the field imaging system, the parallax correction unit may correct the positional shift by calculating, based on the chart images photographed and acquired at the same time by the plurality of imaging units, a projection matrix for converting the position coordinates of a reference point included in one chart image into the position coordinates of the corresponding point in the other chart image, and converting the position coordinates of all points included in one field image by the projection matrix.
  • in the field imaging system, the parallax correction unit may, in each of the plurality of imaging units, extract images including the image of the chart as the chart images from a plurality of captured images acquired at different timings while the flying object is moving in the sky, obtain the center position of the image of the chart from each extracted chart image, and calculate the projection matrix using the center position in one chart image photographed at a given time as the reference point and the center position in the other chart image photographed at the same time as the corresponding point.
  • the above-mentioned field photographing system may further include a growth index calculation unit that calculates a growth index of the crop based on each field image after the position shift has been corrected by the parallax correction unit.
  • the field imaging system described above may further include a warning unit that issues a warning when the amount of change in the positional deviation during a predetermined period exceeds a threshold.
  • the warning unit may include at least one of a display unit for displaying a warning, a sound output unit for generating a warning sound, and a communication unit for transmitting warning information to the outside.
  • the field photographing method of the present embodiment includes: a field image acquisition step in which a plurality of imaging units held by a flying object photograph, from above during the flight of the flying object, a field where a crop is grown, to acquire field images having different wavelength bands; a chart image acquisition step in which the plurality of imaging units photograph a chart located in the field or around the field from above during the flight of the flying object, to acquire chart images; and a parallax correction step of correcting, based on each of the chart images, a positional shift of pixels indicating the same point caused by parallax between the field images.
  • the plurality of imaging units simultaneously perform the field image obtaining step and the chart image obtaining step.
  • the flying of the flying object may include rising and falling of the flying object.
  • in the field photographing method, in the parallax correction step, the positional shift may be corrected by calculating, based on the chart images photographed and acquired at the same time by the plurality of imaging units, a projection matrix for converting the position coordinates of a reference point included in one chart image into the position coordinates of the corresponding point in the other chart image, and converting the position coordinates of all points included in one field image by the projection matrix.
  • in the field photographing method, in each of the plurality of imaging units, images including the image of the chart may be extracted as the chart images from a plurality of captured images acquired at different timings while the flying object is moving in the sky, the center position of the image of the chart may be obtained from each extracted chart image, and the projection matrix may be calculated using the center position in one chart image photographed at a given time as the reference point and the center position in the other chart image photographed at the same time as the corresponding point.
  • the above-mentioned field photographing method may further include a growth index calculating step of calculating a growth index of the crop based on each field image after the positional displacement has been corrected by the parallax correction step.
  • the above-mentioned field photographing method may further include a warning step of issuing a warning when a change amount of the positional deviation during a predetermined period exceeds a threshold value.
  • the warning step may include at least one of displaying a warning, outputting a warning sound, and transmitting warning information to the outside.
  • the present invention can be used for a system that acquires field images having different wavelength bands by photographing a field from above with a plurality of imaging units held by a flying object, and that calculates a growth index based on each acquired field image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Botany (AREA)
  • Remote Sensing (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Ecology (AREA)
  • Forests & Forestry (AREA)
  • Environmental Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Signal Processing (AREA)

Abstract

This cultivated land imaging system comprises a flying vehicle, a plurality of imaging units, a chart, and a parallax correction unit (11c). The plurality of imaging units are held by the flying vehicle and, during flight of the flying vehicle, acquire cultivated land images of differing wavelength bands by imaging, from the air, cultivated land where crops are grown. The parallax correction unit corrects the positional offset, resulting from parallax, of pixels indicating the same location among the cultivated land images, on the basis of chart images obtained by the plurality of imaging units imaging the chart from the air during flight of the flying vehicle. The chart is located inside of or in the vicinity of the cultivated land.

Description

Field photographing system and field photographing method
The present invention relates to a field photographing system and a field photographing method for photographing a field from above.
Conventionally, systems for measuring plant growth indices have been proposed. For example, in the system of Patent Document 1, the reflected light measurement unit measures the light intensity of the reflected light of a measurement target having a plurality of leaves at a first wavelength and a second wavelength, the sun angle acquisition unit acquires the angle of incidence of sunlight on the measurement target as the sun angle, and the sun direction acquisition unit acquires the direction of the sun with respect to the measurement direction of the reflected light measurement unit as the sun direction. The growth index calculation unit then obtains a growth index representing the degree of growth of the measurement target based on the light intensities of the reflected light at the first and second wavelengths measured by the reflected light measurement unit, the sun angle acquired by the sun angle acquisition unit, and the sun direction acquired by the sun direction acquisition unit. In Patent Document 1, for example, NDVI (Normalized Difference Vegetation Index; normalized vegetation index) is obtained as the growth index.
Here, the reflected light measurement unit is a so-called multispectral camera comprising a visible imaging unit that generates a visible-light image and an infrared imaging unit that generates an infrared-light image. The visible imaging unit and the infrared imaging unit each comprise a bandpass filter, a lens (imaging optical system), and a sensor.
International Publication WO2016/181743 (see claim 5, paragraphs [0021] to [0077], FIG. 5, etc.)
In recent years, a multispectral camera has also been mounted on an unmanned aerial vehicle (also called a drone), and the drone is flown to photograph a field from above, thereby acquiring a visible image and an infrared image and measuring NDVI. The field mentioned above refers to a paddy field or a dry field where a crop is grown. However, when a visible image and an infrared image are acquired by mounting a multispectral camera on a drone in this way, the NDVI value fluctuates for the following reasons.
(A) Changes in the internal state of the camera

Vibration and temperature changes during flight of the drone change, for example, the relative directions and focal lengths of the optical axes of the visible imaging unit and the infrared imaging unit. In this case, a shift (also called parallax) occurs between the positions indicating the same point in the visible image and the infrared image. Consequently, if the NDVI value is calculated using the pixel values at the same coordinate position in the visible image and the infrared image, it is actually calculated from pixel values corresponding to different points in the field, and an accurate NDVI value can no longer be obtained for an arbitrary point in the field.
(B) Changes in the external state of the camera

Vibration during flight of the drone and variations in gimbal control shift the mounting directions of the visible imaging unit and the infrared imaging unit. In this case as well, parallax occurs between the visible image and the infrared image, so that, as in (A) above, an accurate NDVI value cannot be obtained for an arbitrary point in the field.
In particular, if the optical systems included in the visible imaging unit and the infrared imaging unit are contaminated, for example by adhering dust, the pixel values of the visible image and the infrared image acquired by these units contain noise. The fluctuations of the NDVI value caused by (A) and (B) above then become even larger.
Therefore, when a multispectral camera is mounted on a drone and flown, it is desirable to perform parallax correction (calibration) for reducing the influence of the parallax arising between the visible image and the infrared image, so that fluctuations of the NDVI value due to the above causes can be prevented. Patent Document 1, however, gives no consideration to parallax correction in the case where a multispectral camera is mounted on a drone and flown. Moreover, even when parallax correction is performed, it is desirable from the viewpoint of processing efficiency to perform it at the same time as the photographing of the field, and the construction of a system that can realize such parallax correction is desired.
The present invention has been made to solve the above problems, and its object is to provide a field photographing system and a field photographing method which, in a system that photographs a field from above with a plurality of imaging units mounted on a flying object to acquire images in different wavelength bands, can reduce the influence of parallax between the images acquired by the plurality of imaging units, and can perform the parallax correction for reducing that influence at the same time as the photographing of the field.
A field photographing system according to one aspect of the present invention includes: a flying object; a plurality of imaging units which are held by the flying object and which, during flight of the flying object, photograph a field where crops are grown from above to acquire field images in different wavelength bands; a chart photographed by the plurality of imaging units; and a parallax correction unit which, based on chart images obtained by the plurality of imaging units photographing the chart from above during flight of the flying object, corrects the positional shift, caused by parallax, of pixels indicating the same point between the field images. The chart is located in the field or around the field.
A field photographing method according to another aspect of the present invention includes: a field image acquisition step of photographing, from above and during flight of a flying object, a field where crops are grown, using a plurality of imaging units held by the flying object, to acquire field images in different wavelength bands; a chart image acquisition step of photographing, from above and during flight of the flying object, a chart located in the field or around the field, using the plurality of imaging units, to acquire chart images; and a parallax correction step of correcting, based on the chart images, the positional shift, caused by parallax, of pixels indicating the same point between the field images.
According to the above field photographing system and field photographing method, the plurality of imaging units held by the flying object photograph the field from above, whereby field images in different wavelength bands are acquired. The parallax correction unit corrects the positional shift of pixels indicating the same point, caused by parallax between the field images, based on the chart images obtained by photographing the chart with the plurality of imaging units. Thus, even if parallax arises between the field images due to vibration of the flying object or the like, field images in which the pixel positional shift caused by that parallax has been corrected can be obtained, and the influence of the parallax can be reduced. Furthermore, since the chart is located in the field or around the field, the plurality of imaging units can photograph the chart from above at the same time as they photograph the field from above. The parallax correction unit can therefore perform the parallax correction that reduces the influence of parallax simultaneously with the photographing of the field, improving processing efficiency.
Brief Description of the Drawings

FIG. 1 is an explanatory diagram schematically showing the overall configuration of a field photographing system according to an embodiment of the present invention.
FIG. 2 is an explanatory diagram schematically showing the detailed configuration of an imaging device included in the field photographing system.
FIG. 3 is a block diagram showing the schematic configuration of a terminal device included in the field photographing system.
FIG. 4 is a plan view of a chart included in the field photographing system.
FIG. 5 is a plan view showing another configuration of the chart.
FIG. 6 is an explanatory diagram showing a state in which the chart is attached to a transport vehicle.
FIG. 7 is an explanatory diagram showing a state in which the chart is attached to a helmet.
FIG. 8 is an explanatory diagram schematically showing a missing-plant portion in the field that is used in place of the chart.
FIG. 9 is an explanatory diagram schematically showing a reflection spot of sunlight on the water surface in the field that is used in place of the chart.
FIG. 10 is a block diagram showing the detailed configuration of a flying object included in the field photographing system.
FIG. 11 is a flowchart showing the flow of processing of a field photographing method carried out by the field photographing system.
FIG. 12 is a flowchart showing the flow of parallax correction processing in the field photographing system.
FIG. 13 is an explanatory diagram schematically showing chart images extracted from a plurality of captured images.
FIG. 14 is an explanatory diagram schematically showing a projective transformation.
Embodiments of the present invention will be described below with reference to the drawings.
[Configuration of the field photographing system]

FIG. 1 is an explanatory diagram schematically showing the overall configuration of the field photographing system 1 of the present embodiment. The field photographing system 1 is a system for photographing a field FD in which crops PL are grown, and comprises a flying object 10, an imaging device 20, a terminal device 30, and a chart C. The flying object 10 and the terminal device 30 are communicably connected via a communication line NW. The communication line NW is constituted by a network (wired or wireless), such as a LAN (Local Area Network) or an Internet line. The communication function of the flying object 10 may instead be given to the imaging device 20, with the imaging device 20 and the terminal device 30 communicably connected via the communication line NW.
(Imaging device)

The imaging device 20 is constituted by, for example, a multispectral camera, and is held by the flying object 10. FIG. 2 schematically shows the detailed configuration of the imaging device 20. The imaging device 20 includes a first imaging unit 21 and a second imaging unit 22. The first imaging unit 21 is a visible imaging unit that photographs the field FD to acquire an image in the visible wavelength band (a visible image). The second imaging unit 22 is a near-infrared imaging unit that photographs the field FD to acquire an image in the near-infrared wavelength band (a near-infrared image). Because the imaging device 20 includes the first imaging unit 21 and the second imaging unit 22, it can photograph the field FD from above during flight of the flying object 10 and acquire field images in different wavelength bands.
The first imaging unit 21 includes a first bandpass filter 21a, a first imaging optical system 21b, a first image sensor (optical sensor) 21c, and a first digital signal processor (not shown). The first bandpass filter 21a is an optical filter that transmits light in a relatively narrow band centered on a wavelength of, for example, 650 nm. The first imaging optical system 21b is composed of at least one lens and forms, on a first imaging plane, an optical image of the visible light from the measurement target (the crop PL or the field FD) that has passed through the first bandpass filter 21a. The first image sensor 21c is constituted by, for example, a VGA-type (640 x 480 pixels) CCD (Charge Coupled Device) sensor or CMOS (Complementary Metal Oxide Semiconductor) sensor, is arranged so that its light-receiving surface coincides with the first imaging plane, detects the relatively narrow band of light centered on the 650 nm wavelength contained in the sunlight reflected by the measurement target, and converts the visible-light optical image of the measurement target into an electrical signal. The first digital signal processor performs image processing on the output of the first image sensor 21c to form a visible image. The image data Rv of the visible image is output to the control unit 11 of the flying object 10 (see FIG. 10). Although the transmission wavelength band of the first bandpass filter 21a is the red (R) wavelength band in the present embodiment, it may be another wavelength band such as blue (B) or green (G).
The second imaging unit 22 includes a second bandpass filter 22a, a second imaging optical system 22b, a second image sensor (optical sensor) 22c, and a second digital signal processor (not shown). The second bandpass filter 22a is an optical filter that transmits light in a relatively narrow band centered on a predetermined wavelength of 750 nm or more (for example, 800 nm). The second imaging optical system 22b is composed of at least one lens and forms, on a second imaging plane, an optical image of the near-infrared light from the measurement target that has passed through the second bandpass filter 22a. The second image sensor 22c is constituted by, for example, a VGA-type CCD or CMOS sensor, is arranged so that its light-receiving surface coincides with the second imaging plane, detects the relatively narrow band of light centered on the 800 nm wavelength contained in the sunlight reflected by the measurement target, and converts the near-infrared optical image of the measurement target into an electrical signal. The second digital signal processor performs image processing on the output of the second image sensor 22c to form a near-infrared image. The image data Ri of the near-infrared image is output to the control unit 11 of the flying object 10.
(Terminal device)

FIG. 3 is a block diagram showing the schematic configuration of the terminal device 30. The terminal device 30 is constituted by, for example, a PC (personal computer), and has an input unit 31, a display unit 32, a storage unit 33, a communication unit 34, and a control unit 35. The terminal device 30 may instead be constituted by a multifunctional portable terminal (for example, a smartphone or a tablet terminal).
The input unit 31 is operated by the user and is provided to receive input of information from the user. Specifically, the input unit 31 is constituted by operation devices such as a keyboard, a mouse, and a touchpad. The display unit 32 is a display that presents various kinds of information, and is constituted by, for example, a liquid crystal display device.
The storage unit 33 is constituted by, for example, a hard disk, and stores various information in addition to the programs for operating the control unit 35. For example, a calculation formula (distortion correction information) for correcting the distortion of the visible image and the near-infrared image acquired by the imaging device 20 (the first imaging unit 21 and the second imaging unit 22) is determined in advance according to the design of the first imaging optical system 21b and the second imaging optical system 22b. Such distortion correction information is stored in the storage unit 33 in advance and, when distortion correction processing is performed inside the flying object 10, is provided to the flying object 10 via the communication line NW. The distortion correction described above and the parallax correction described later may instead be performed by the control unit 35 of the terminal device 30.
The communication unit 34 is an interface for communicating with the flying object 10; it includes a transmission circuit, a reception circuit, a modulation circuit, a demodulation circuit, an antenna, and the like, and is communicably connected to the communication line NW. The control unit 35 is constituted by, for example, a CPU (Central Processing Unit), operates according to an operation program stored in the storage unit 33, controls the operation of each unit of the terminal device 30, and performs arithmetic processing as necessary.
(Chart)

The chart C is located in the field FD or around the field FD, and is used when the parallax correction described later is performed. For example, the chart C is attached to an object OB located in the field FD or around the field FD, and is thereby positioned in the field FD or around the field FD. FIG. 1 shows an example in which a take-off and landing platform 40 located around the field FD is used as the object OB and the chart C is attached to the take-off and landing platform 40. The take-off and landing platform 40 is a sheet (platform) installed on the ground so that surrounding sand is not whirled up when the flying object 10 takes off and lands. The chart C is photographed from above by the first imaging unit 21 and the second imaging unit 22 during flight of the flying object 10.
FIG. 4 is a plan view of the chart C. The chart C has a pattern including at least one circular portion C1 (for example, a black circle). In order to distinguish the parallax correction chart C from patterns that naturally exist (that is, are not intentionally created) in the field FD or around the field FD, the chart C desirably has a pattern including a plurality of circular portions C1, more desirably a pattern including three or more circular portions C1, and desirably includes, in addition to the circular portions C1, circular portions C2 of different reflectance (for example, circular portions colored in a color different from black and from the color of the background sheet). As shown in FIG. 5, the chart C may also have a lattice pattern. The overall shape of the chart C is desirably square, but may be another shape such as a rectangle.
The object OB to which the chart C is attached is not limited to the take-off and landing platform 40 described above. As shown in FIG. 6, a transport vehicle 41 that transports the flying object 10 may be used as the object OB, and the chart C may be attached to the transport vehicle 41 (for example, to its roof or hood). Furthermore, as shown in FIG. 7, a helmet 42 worn by the pilot who remotely controls the flight of the flying object 10 may be used as the object OB, and the chart C (for example, a single circular portion C1) may be attached to the top of the helmet 42.
In addition, an incident light sensor installed on the ground (around the field FD), for example, may be used as the object OB, and the chart C may be attached to this incident light sensor. The incident light sensor is provided to correct the image data Rv of the visible image and the image data Ri of the near-infrared image acquired by the imaging device 20 according to the weather (sunny or cloudy), and corresponds to the sunlight measurement unit of Patent Document 1. Furthermore, a generator for charging the battery 17 (see FIG. 10) mounted on the flying object 10 (located around the field FD), a field server such as a thermometer or a water level gauge installed in the field FD, a base station of an RTK-GPS (Real Time Kinematic - Global Positioning System) (located in or around the field FD), agricultural machines such as tractors and combines (located around the field FD), or irrigation equipment such as sluice gates (located in or around the field FD) may be used as the object OB, and the chart C may be attached to these objects OB.
Moreover, as shown in FIG. 8, a missing-plant portion N of the crop PL located in the field FD may be used in place of the chart C. Here, the missing-plant portion N of the crop PL refers to a portion (region) of the field FD in which the crop PL has not been grown from the beginning; for example, when the crop PL is rice, it refers to a portion that has not been planted from the beginning. In FIG. 8, the hatched portions indicate regions where the crop PL is growing in the field FD, and the unhatched white portion corresponds to the missing-plant portion N. Furthermore, as shown in FIG. 9, a reflection spot S of sunlight on the water surface when the field FD is flooded with water may be used in place of the chart C.
(Flying object)

The flying object 10 shown in FIG. 1 is constituted by, for example, an unmanned aerial vehicle (drone) capable of autonomous flight. Besides such an unmanned aerial vehicle, the flying object 10 may be constituted by a balloon, an airship, an airplane, a helicopter, or the like. The flight of the flying object 10 is remotely controlled by a pilot (operator) on the ground operating an operation unit 50 (see FIG. 10). By flying the flying object 10 along the field FD, the plurality of regions T constituting the field FD can be photographed from above by the imaging device 20, and an image (for example, a visible image and an infrared image) can be acquired for each region T. The images of the regions T can then be joined together to finally obtain an image of the entire field FD. Alternatively, by raising the altitude of the flying object 10 and photographing the entire field FD at once, a single image of the entire field FD can be obtained without joining the images of the individual regions T.
FIG. 10 is a block diagram showing the detailed configuration of the flying object 10. The flying object 10 includes a control unit 11, a communication unit 12, a warning unit 13, a position information acquisition unit 14, an image storage unit 15, a flight drive unit 16, and a battery 17. The flight drive unit 16 has, for example, a propeller and a motor, and flies the flying object 10 by driving the motor to rotate the propeller. Flight control of the flying object 10 is performed by the control unit 11, which controls the flight drive unit 16 based on the operation of the operation unit 50 by the pilot on the ground. The battery 17 supplies driving power to each part of the flying object 10.
The control unit 11 is constituted by, for example, a CPU, and has an overall control unit 11a, a distortion correction unit 11b, a parallax correction unit 11c, a growth index calculation unit 11d, and an image synthesis unit 11e. That is, the CPU functions as the overall control unit 11a, the distortion correction unit 11b, the parallax correction unit 11c, the growth index calculation unit 11d, and the image synthesis unit 11e. The overall control unit 11a controls the operation of each unit of the flying object 10 according to an operation program stored in a program storage unit (not shown). Details of the distortion correction unit 11b, the parallax correction unit 11c, the growth index calculation unit 11d, and the image synthesis unit 11e are given in the description of operation below.
The communication unit 12 is an interface for communicating with the terminal device 30; it includes a transmission circuit, a reception circuit, a modulation circuit, a demodulation circuit, an antenna, and the like, and is communicably connected to the communication line NW.
The warning unit 13 issues a warning under the control of the control unit 11 (for example, the overall control unit 11a) when the amount of change, over a predetermined period (for example, one day or one week), in the parallax-induced pixel positional shift calculated by the parallax correction unit 11c exceeds a threshold. The warning unit 13 can be constituted by at least one of a display unit 13a that displays a warning (which may be a liquid crystal display device or simply a light source), an audio output unit 13b (for example, a speaker) that emits a warning sound, and a communication unit (for example, the communication unit 12) that transmits warning information to the outside.
The position information acquisition unit 14 is constituted by, for example, a GPS receiver, and acquires position information (latitude X, longitude Y, height Z) of the flying object 10 (and thus of the imaging device 20). Based on this position information, the control unit 11 (for example, the overall control unit 11a) controls the flight drive unit 16 so as to fly the flying object 10 at a predetermined height above a predetermined region T of the field FD.
The image storage unit 15 is constituted by, for example, a nonvolatile memory, and stores the image data of the visible images and near-infrared images acquired by the first imaging unit 21 and the second imaging unit 22 of the imaging device 20, as well as the growth index obtained by the growth index calculation unit 11d (in particular, the image data of an NDVI image obtained from pixel values after parallax correction).
[Processing in the field photographing system]

Next, a field photographing method performed by the field photographing system 1 configured as above will be described. FIG. 11 is a flowchart showing the flow of processing of the field photographing method of the present embodiment. Here, it is assumed that the chart C is attached to the take-off and landing platform 40 located around the field FD.
First, the flying object 10 is flown, and the field FD and the chart C are photographed (S1; photographing step). More specifically, the pilot operates the operation unit 50 to make the flying object 10 take off from the take-off and landing platform 40. Then, during flight of the flying object 10, the first imaging unit 21 and the second imaging unit 22 held by the flying object 10 photograph the field FD from above and acquire field images in different wavelength bands (S1-1; field image acquisition step). That is, the first imaging unit 21 acquires a visible image of the field FD as one field image, and the second imaging unit 22 acquires a near-infrared image of the field FD as the other field image. The acquired image data Rv of the visible image and image data Ri of the near-infrared image are input from the first imaging unit 21 and the second imaging unit 22, respectively, to the control unit 11 of the flying object 10.
Also during flight of the flying object 10, the chart C attached to the take-off and landing platform 40 is photographed from above by the first imaging unit 21 and the second imaging unit 22 to acquire chart images (S1-2; chart image acquisition step). That is, the first imaging unit 21 acquires a visible image of the chart C as one chart image, and the second imaging unit 22 acquires a near-infrared image of the chart C as the other chart image. Here, "during flight of the flying object 10" includes the ascent of the flying object 10 from take-off at the take-off and landing platform 40 until it reaches a predetermined altitude above the field, and the descent of the flying object 10 from that predetermined altitude until it lands on the take-off and landing platform 40. The image data of the chart images acquired by the first imaging unit 21 and the second imaging unit 22 are input to the control unit 11 of the flying object 10.
In the present embodiment, the first imaging unit 21 and the second imaging unit 22 perform the field image acquisition step of S1-1 and the chart image acquisition step of S1-2 simultaneously. That is, the first imaging unit 21 and the second imaging unit 22 photograph the chart C at the same time as they photograph the field FD from above. As a result, for each of the visible and near-infrared bands, a field image in which the chart C appears (a field image that also serves as a chart image) is obtained.
Next, the distortion correction unit 11b of the control unit 11 corrects the image data of each image (each field image and each chart image) output from the first imaging unit 21 and the second imaging unit 22, based on the distortion correction information (the calculation formula for correcting distortion) provided from the terminal device 30 via the communication unit 12. This corrects barrel-type, pincushion-type, and other distortions in each image, so that the subsequent parallax correction can be performed appropriately on the distortion-corrected images.
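For illustration only, the following is a minimal sketch of such a distortion correction in Python, using OpenCV's standard lens model as a stand-in for the unspecified calculation formula; the camera matrix and distortion coefficients shown are hypothetical placeholders that would, in practice, be determined in advance from the design of each imaging optical system and held as the distortion correction information.

```python
import cv2
import numpy as np

# Hypothetical intrinsics for one VGA (640 x 480) imaging unit; real values
# would come from a prior calibration and be held in the storage unit 33.
CAMERA_MATRIX = np.array([[600.0,   0.0, 320.0],
                          [  0.0, 600.0, 240.0],
                          [  0.0,   0.0,   1.0]])
DIST_COEFFS = np.array([-0.25, 0.08, 0.0, 0.0, 0.0])  # e.g. barrel distortion

def correct_distortion(image: np.ndarray) -> np.ndarray:
    """Undistort one field/chart image before the parallax correction."""
    return cv2.undistort(image, CAMERA_MATRIX, DIST_COEFFS)
```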
Subsequently, the parallax correction unit 11c of the control unit 11 performs parallax correction (calibration), which corrects the positional shift of pixels indicating the same point caused by parallax between the field images, based on the chart images obtained by the first imaging unit 21 and the second imaging unit 22 photographing the chart C from above (S3; parallax correction step). Details of the parallax correction in S3 are described later.
After the parallax correction in S3, the overall control unit 11a determines whether the amount of change in the above positional shift over a predetermined period has exceeded a threshold (S4). For example, when the visible image and the near-infrared image are superimposed, the overall control unit 11a takes, as the "amount of change in positional shift", the difference, before and after a predetermined period (for example, one day or one week), in the length of the straight line connecting the positions indicating the same point, and determines whether this amount of change exceeds the threshold. If the amount of change exceeds the threshold, the warning unit 13 issues a warning under the control of the overall control unit 11a (S5; warning step). In this warning step, at least one of display of a warning by the display unit 13a, output of a warning sound by the audio output unit 13b, and transmission of warning information to the outside by the communication unit 12 is performed. If the amount of change is equal to or less than the threshold, the processing proceeds directly to S6.
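As a sketch of this S4 check, assuming the "amount of change in positional shift" is the difference, over the predetermined period, in the length of the line connecting the same point in the two images (the threshold value below is an illustrative assumption, not taken from the disclosure):

```python
import math

def shift_length(p_visible, p_nir):
    """Length of the line connecting the positions of the same ground point
    when the visible and near-infrared images are superimposed."""
    return math.hypot(p_nir[0] - p_visible[0], p_nir[1] - p_visible[1])

def needs_warning(shift_before, shift_after, threshold_px=5.0):
    """S4/S5: warn when the change in shift over the predetermined period
    (e.g. one day or one week) exceeds the threshold."""
    return abs(shift_after - shift_before) > threshold_px
```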
Next, the growth index calculation unit 11d of the control unit 11 calculates the growth index of the crop PL based on the field images after the above positional shift has been corrected by the parallax correction unit 11c (S6; growth index calculation step). In the present embodiment, NDVI is used as the growth index. NDVI is an index indicating the distribution and activity of vegetation and is given by NDVI = (Ri - Rv) / (Ri + Rv), using the image data Rv of one field image (the visible image) and the image data Ri of the other field image (the near-infrared image) in which the positional shifts due to distortion and parallax have been corrected as described above. NDVI takes a value normalized between -1 and 1, and a larger positive value indicates denser vegetation. When NDVI is calculated for each pixel of the field images, an NDVI image showing the per-pixel distribution of NDVI is obtained.
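The per-pixel NDVI computation of S6 follows directly from the formula above; a minimal sketch, assuming Rv and Ri are the parallax-corrected visible and near-infrared field images as NumPy arrays:

```python
import numpy as np

def ndvi_image(rv: np.ndarray, ri: np.ndarray) -> np.ndarray:
    """NDVI = (Ri - Rv) / (Ri + Rv) per pixel; values lie in [-1, 1]."""
    rv = rv.astype(np.float64)
    ri = ri.astype(np.float64)
    denom = ri + rv
    ndvi = np.zeros_like(denom)
    # Leave pixels where both bands are zero at 0 to avoid division by zero.
    np.divide(ri - rv, denom, out=ndvi, where=denom > 0)
    return ndvi
```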
The growth index calculation unit 11d may calculate the following values as the growth index instead of NDVI. Examples of indices other than NDVI include RVI (Ratio Vegetation Index, RVI = Ri / Rv), DVI (Difference Vegetation Index, DVI = Ri - Rv), TVI (Transformed Vegetation Index, TVI = NDVI^0.5 + 0.5), and IPVI (Infrared Percentage Vegetation Index, IPVI = Ri / (Ri + Rv) = (NDVI + 1) / 2).
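These alternative indices follow from the same two band images; a sketch, assuming rv and ri are nonzero floating-point arrays (the TVI line uses the document's formula NDVI^0.5 + 0.5, which presupposes non-negative NDVI):

```python
def vegetation_indices(rv, ri):
    """RVI, DVI, TVI and IPVI as defined above."""
    ndvi = (ri - rv) / (ri + rv)
    return {
        "RVI": ri / rv,
        "DVI": ri - rv,
        "TVI": ndvi ** 0.5 + 0.5,   # requires NDVI >= 0
        "IPVI": ri / (ri + rv),     # = (NDVI + 1) / 2
    }
```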
The growth index calculation unit 11d may also calculate the vegetation cover rate as the growth index instead of NDVI. The vegetation cover rate indicates the proportion of the ground surface of the field FD covered by the crop PL. For example, the growth index calculation unit 11d can binarize the near-infrared image after the positional shifts due to distortion and parallax have been corrected, form a black-and-white binarized image, and calculate the vegetation cover rate as the proportion of the image occupied by the white portion. In the binarized image, the white portion corresponds to the crop PL and the black portion corresponds to the soil.
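A sketch of the vegetation cover rate computation, using Otsu's method as one common thresholding rule (the disclosure does not specify how the binarization threshold is chosen):

```python
import cv2
import numpy as np

def vegetation_cover_rate(nir_u8: np.ndarray) -> float:
    """Binarize the corrected near-infrared image (8-bit, single channel)
    and return the fraction of white pixels (crop PL) over the whole image."""
    _, binary = cv2.threshold(nir_u8, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return float(np.count_nonzero(binary == 255)) / binary.size
```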
Next, when a plurality of NDVI images are obtained for each field FD, or when an NDVI image is obtained for each region T included in each field FD, the image synthesis unit 11e of the control unit 11 synthesizes the plurality of NDVI images to generate a single NDVI image as a whole (S7; image synthesis step). "Synthesis" here means joining a plurality of NDVI images side by side. When only one NDVI image is obtained to begin with, image synthesis is unnecessary, so step S7 is skipped and the processing ends.
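Since synthesis here means joining the NDVI images side by side, a minimal sketch is a horizontal concatenation, assuming all per-region images share the same height (alignment of overlapping regions, if any, is omitted):

```python
import numpy as np

def synthesize_ndvi_images(ndvi_images):
    """S7: join per-region NDVI images side by side into one NDVI image."""
    if len(ndvi_images) == 1:
        return ndvi_images[0]   # a single image needs no synthesis
    return np.hstack(ndvi_images)
```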
[Details of the parallax correction]

Next, the details of the parallax correction in S3 will be described. FIG. 12 is a flowchart showing the flow of the parallax correction processing. First, from the plurality of captured images acquired at different times by the first imaging unit 21 and the second imaging unit 22 while the flying object 10 is moving in the sky, the parallax correction unit 11c extracts the images containing the image of the chart C as the chart images described above (S11). For example, the parallax correction unit 11c extracts, from the plurality of captured images, those images in which at least one (preferably four) image corresponding to a circular portion C1 of the chart C is present, as chart images containing the image of the chart C. This chart image extraction is performed on the plurality of captured images acquired by each of the first imaging unit 21 and the second imaging unit 22.
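A sketch of this S11 extraction, using a blob detector to find dark, near-circular marks as stand-ins for the images C1' (the detector parameters are illustrative assumptions):

```python
import cv2

def make_mark_detector():
    params = cv2.SimpleBlobDetector_Params()
    params.filterByCircularity = True
    params.minCircularity = 0.8     # accept only near-circular marks
    params.filterByArea = True
    params.minArea = 20.0           # illustrative minimum mark size in pixels
    return cv2.SimpleBlobDetector_create(params)

def extract_chart_images(captured_images, min_marks=1):
    """S11: keep captured images (8-bit grayscale) containing at least one
    (preferably four) candidate circular marks of the chart C."""
    detector = make_mark_detector()
    chart_images = []
    for img in captured_images:
        keypoints = detector.detect(img)
        if len(keypoints) >= min_marks:
            chart_images.append((img, [kp.pt for kp in keypoints]))
    return chart_images
```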
FIG. 13 schematically shows the chart images extracted by the parallax correction unit 11c from the plurality of captured images. Here, the first imaging unit 21 acquires chart image CR-1 at time t1, chart image CR-2 at time t2, and chart image CR-3 at time t3, and the second imaging unit 22 acquires chart image Ci-1 at time t1, chart image Ci-2 at time t2, and chart image Ci-3 at time t3; these are shown arranged in time series. CR denotes a chart image in the visible wavelength band, and Ci denotes a chart image in the near-infrared wavelength band. In FIG. 13, the image corresponding to the circular portion C1 of the chart C in each chart image is denoted by C1'.
Next, the parallax correction unit 11c extracts the region CA of the chart C from each extracted chart image (S12). For example, the parallax correction unit 11c calculates the positions (coordinates) of the images C1' corresponding to the circular portions C1 in the extracted chart image, together with the number of images C1'. When four images C1' are present, the parallax correction unit 11c extracts the region surrounded by the four images C1' as the region CA of the chart C if the distances between neighboring images C1' fall within a predetermined range and the shape surrounding the four images C1' is square. When, for example, only one image C1' is present in the extracted chart image, a square region of a predetermined size surrounding that image C1' is extracted as the region CA of the chart C.
Next, based on the chart images captured at the same time by the first imaging unit 21 and the second imaging unit 22, the parallax correction unit 11c calculates a projection matrix that converts the position coordinates of a reference point P contained in one chart image (for example, the visible image) into the position coordinates of the corresponding point Q in the other chart image (for example, the near-infrared image) that indicates the same point in the field FD (S13). Here, the reference point P and the corresponding point Q are taken, in each chart image, as the center position of the region CA of the chart C obtained in S12. When there are four images C1' in the chart image, this center position can be taken as the average of the positions of the four images C1' (the coordinate position obtained by summing the x and y coordinates and dividing by 4); when there is one image C1' in the chart image, it can be taken as the position of that image C1'.
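A sketch of the geometric checks of S12 and the center computation of S13, assuming four detected mark positions; the distance range and squareness tolerance are illustrative assumptions:

```python
import itertools
import math

def chart_center(pts, side_range=(30.0, 300.0), tol=0.15):
    """S12/S13: if the marks form an approximate square whose side lies in
    side_range, return their mean position as the chart center (used as the
    reference point P or corresponding point Q); otherwise return None."""
    if len(pts) == 1:
        return pts[0]                     # single mark: use its position
    if len(pts) != 4:
        return None
    dists = sorted(math.dist(a, b)
                   for a, b in itertools.combinations(pts, 2))
    sides, diagonals = dists[:4], dists[4:]
    side = sum(sides) / 4.0
    if not side_range[0] <= side <= side_range[1]:
        return None
    # A square has four near-equal sides and two diagonals of side * sqrt(2).
    if max(sides) - min(sides) > tol * side:
        return None
    if any(abs(d - side * math.sqrt(2)) > tol * side for d in diagonals):
        return None
    return (sum(p[0] for p in pts) / 4.0, sum(p[1] for p in pts) / 4.0)
```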
Here, the method of calculating the projection matrix will be described. In general, projecting one plane onto another plane using a projection matrix is called homography (projective transformation). As one such projective transformation, the affine transformation expressed by the following equation is commonly known (see, for example, http://zellij.hatenablog.com/entry/20120523/p1). In the following, the projective transformation is assumed to be an affine transformation.
Equation (1):

$$
\begin{pmatrix} x' \\ y' \\ 1 \end{pmatrix}
=
\begin{pmatrix}
a & b & t_x \\
c & d & t_y \\
0 & 0 & 1
\end{pmatrix}
\begin{pmatrix} x \\ y \\ 1 \end{pmatrix}
$$
As shown in FIG. 14, let the coordinates of an arbitrary point O on the xy plane be (x, y), and let (x', y') be the coordinates of the point O' obtained by projecting O onto the x'y' plane by the projection matrix (the 3 x 3 matrix in Equation (1)). A projective transformation is a combination of a linear transformation and a translation, where the linear transformation includes scaling, shear (skew), and rotation. In the above projection matrix, the coefficients tx and ty relate to the translation, and the remaining coefficients a, b, c, and d relate to the linear transformation.
Calculating the projection matrix means calculating its six parameters, the coefficients a, b, c, d, tx, and ty. Since there are six unknown parameters (coefficients), all six can be obtained (that is, the projection matrix can be calculated) if six equations can be set up. If the position coordinates of one point before and after the projective transformation are known, substituting them into Equation (1) yields two equations, one for x and one for y. Therefore, substituting the position coordinates of three points before and after the projective transformation into Equation (1) yields six equations, and solving these simultaneously gives all six parameters.
Accordingly, in the present embodiment, the parallax correction unit 11c inputs the position coordinates of three points into Equation (1) as follows to calculate the projection matrix. First, the coordinates (x1, y1) of the reference point P on the xy plane of chart image CR-1 acquired at time t1 and the coordinates (x1', y1') of the corresponding point Q on the x'y' plane of chart image Ci-1 acquired at the same time are input into Equation (1). Similarly, the coordinates (x2, y2) of the reference point P on the xy plane of chart image CR-2 acquired at time t2 and the coordinates (x2', y2') of the corresponding point Q on the x'y' plane of chart image Ci-2 acquired at the same time are input into Equation (1). Further, the coordinates (x3, y3) of the reference point P on the xy plane of chart image CR-3 acquired at time t3 and the coordinates (x3', y3') of the corresponding point Q on the x'y' plane of chart image Ci-3 acquired at the same time are input into Equation (1). Solving the six equations obtained in this way simultaneously gives the projection matrix. Since the flying object 10 is moving through the sky, the coordinates (x1, y1), (x2, y2), and (x3, y3) of the reference point P at times t1, t2, and t3 differ from one another; likewise, the coordinates (x1', y1'), (x2', y2'), and (x3', y3') of the corresponding point Q differ from one another.
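Each pair (P, Q) contributes the two equations x' = a·x + b·y + tx and y' = c·x + d·y + ty, so the three pairs give a 6 x 6 linear system. A sketch of this S13 computation:

```python
import numpy as np

def affine_from_three_pairs(ps, qs):
    """Solve for (a, b, c, d, tx, ty) from three reference points P = (x, y)
    and their corresponding points Q = (x', y') at times t1, t2, t3."""
    A, rhs = [], []
    for (x, y), (xp, yp) in zip(ps, qs):
        A.append([x, y, 0, 0, 1, 0]); rhs.append(xp)  # x' = a*x + b*y + tx
        A.append([0, 0, x, y, 0, 1]); rhs.append(yp)  # y' = c*x + d*y + ty
    a, b, c, d, tx, ty = np.linalg.solve(np.array(A, float),
                                         np.array(rhs, float))
    return np.array([[a, b, tx],
                     [c, d, ty],
                     [0.0, 0.0, 1.0]])
```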
Here, the projection matrix is calculated by setting up six equations from the position coordinates of three reference points P and corresponding points Q substituted into Equation (1). However, for a projective transformation in which the linear transformation is unnecessary (that is, a translation-only projective transformation), the linear-transformation coefficients can be fixed to the identity (a = d = 1, b = c = 0), and the coefficients tx and ty can be obtained by substituting the position coordinates of a single reference point P and corresponding point Q into Equation (1), thereby determining the projection matrix.
It is also possible to substitute the position coordinates of four or more reference points P and corresponding points Q into Equation (1) to set up eight or more equations. In this case, a plurality of groups may be formed by arbitrarily selecting six equations from the eight or more equations, the coefficients of the projection matrix may be obtained for each group, and the final coefficients may be determined from the per-group coefficients by the least squares method. In this case, each coefficient, that is, the projection matrix, can be obtained with high accuracy.
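With four or more pairs the system is overdetermined; in place of the grouping procedure described above, a common equivalent (shown here as a sketch) is to solve all equations at once in the least-squares sense:

```python
import numpy as np

def affine_least_squares(ps, qs):
    """Least-squares fit of the six affine parameters from >= 3 point pairs."""
    A, rhs = [], []
    for (x, y), (xp, yp) in zip(ps, qs):
        A.append([x, y, 0, 0, 1, 0]); rhs.append(xp)
        A.append([0, 0, x, y, 0, 1]); rhs.append(yp)
    params, *_ = np.linalg.lstsq(np.array(A, float), np.array(rhs, float),
                                 rcond=None)
    a, b, c, d, tx, ty = params
    return np.array([[a, b, tx], [c, d, ty], [0.0, 0.0, 1.0]])
```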
Having calculated the projection matrix as described above, the parallax correction unit 11c converts the position coordinates of all points contained in one field image (for example, the visible image) using the projection matrix obtained in S13 (S14). This corrects the positional shift of pixels indicating the same point that is caused by the parallax between the one field image and the other field image (for example, the near-infrared image). That is, each pixel of the one field image is associated with the pixel representing the same point in the field FD in the other field image.
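Applying S14 to a whole field image can be sketched with OpenCV's affine warp, using the top two rows of the 3 x 3 matrix computed above:

```python
import cv2

def apply_parallax_correction(field_image, H):
    """S14: map every pixel of one field image (e.g. the visible image) into
    the coordinate frame of the other field image using the matrix H."""
    h, w = field_image.shape[:2]
    return cv2.warpAffine(field_image, H[:2, :], (w, h))
```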
[Effects]

As described above, in the present embodiment, in the field image acquisition step, the first imaging unit 21 and the second imaging unit 22 photograph the field FD from above during flight of the flying object 10 and acquire field images in different wavelength bands (a visible image and a near-infrared image) (S1-1). In the chart image acquisition step, the first imaging unit 21 and the second imaging unit 22 photograph the chart C from above during flight of the flying object 10 and acquire chart images (S1-2). Then, in the parallax correction step, the parallax correction unit 11c corrects, based on the chart images, the positional shift of pixels indicating the same point caused by parallax between the field images (S3). Thus, even if parallax arises between the field images due to vibration of the flying object 10 or the like, field images in which the pixel positional shift caused by that parallax has been corrected can be obtained, and the influence of the parallax can be reduced.
Further, since the chart C is located in the field FD or around the field FD, the first imaging unit 21 and the second imaging unit 22 can photograph the chart C simultaneously with the field FD during the flight of the flying object 10. This allows the parallax correction unit 11c to perform parallax correction based on the chart images at the same time as the field FD is photographed. Therefore, compared with, for example, performing parallax correction at a timing separate from the photographing of the field FD, the processing can be made more efficient. That is, by having the first imaging unit 21 and the second imaging unit 22 perform the field image acquisition step (S1-1) and the chart image acquisition step (S1-2) simultaneously, the processing can be made more efficient. In addition, since a field image and a chart image can be obtained in a single shot (one captured image can serve as both the field image and the chart image), an increase in the number of photographing operations can be avoided.
The above-mentioned flight of the flying object 10 includes ascent and descent of the flying object 10. Therefore, the first imaging unit 21 and the second imaging unit 22 can acquire chart images not only while the flying object 10 is flying over the field FD at a predetermined altitude but also while the flying object 10 is ascending or descending, so parallax correction by the parallax correction unit 11c is possible even during such ascent or descent.
When an object OB is located in or around the field FD, the chart C can be positioned by attaching it to the object OB. That is, in this case, the object OB can be put to effective use in installing the chart C.
In particular, the take-off and landing platform 40 of the flying object 10, the transport vehicle 41 that carries the flying object 10, and the helmet 42 worn by the pilot who remotely controls the flight of the flying object 10 are always required when the field FD is photographed using the flying object 10. Therefore, by attaching the chart C to any of these objects OB (the take-off and landing platform 40, the transport vehicle 41, or the helmet 42), there is no need to search for a new installation site for the chart C or to provide one separately; the chart C can be quickly installed on the object OB, and photographing of the field can be started quickly.
Note that the chart C may be mounted directly on the head of the pilot who remotely controls the flight of the flying object 10, without the helmet 42. It can therefore be said that, regardless of whether the pilot wears the helmet 42, the chart C may be mounted on the head of the pilot who remotely controls the flight of the flying object 10.
The incident light sensor, the generator, and the RTK-GPS base station described above may also be required when the field FD is photographed using the flying object 10 (for example, when the weather fluctuates greatly, when the field FD is so large that the battery 17 needs recharging partway through photographing, or when the position information acquisition unit 14 needs to communicate with the base station to acquire position information). Further, the field server, the agricultural machinery, and the irrigation equipment described above are indispensable for managing the field FD and are always present in or around the field FD. Therefore, the chart C may be attached to any of these objects OB; in that case as well, the chart C can be quickly installed on the object OB while putting the object OB to effective use, and photographing of the field FD can be started quickly.
When the chart C has a pattern including at least one circular portion C1, or a lattice pattern, the image of the chart C (including, for example, the image C1') appears clearly in each chart image when the chart C is photographed from above the field FD by the first imaging unit 21 and the second imaging unit 22. This enables the parallax correction unit 11c to reliably determine the position of the image of the chart C from the pixel values and to acquire that position information accurately and reliably, so the parallax correction based on the chart images can be performed with high accuracy.
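One way the center of a circular chart portion might be located from pixel values is a simple threshold-and-centroid computation, sketched below. The brightness threshold and the assumption of a bright chart against a darker background are illustrative, not taken from this description.

```python
import cv2

def find_chart_center(gray_img, thresh=200):
    """Locate the center of a bright circular chart portion in an
    8-bit grayscale frame via thresholding and image moments."""
    _, mask = cv2.threshold(gray_img, thresh, 255, cv2.THRESH_BINARY)
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        return None  # chart not visible in this frame
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # (cx, cy)
```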
Further, a missing-plant portion N of the crop PL located in the field FD, or a reflection spot S of sunlight on a water surface in the field FD, appears in each captured image as a position indicating the same point in the field FD when the field FD is photographed simultaneously by the first imaging unit 21 and the second imaging unit 22. Therefore, even when the missing-plant portion N or the sunlight reflection spot S is used as the chart C, the first imaging unit 21 and the second imaging unit 22 can acquire the chart images, and the parallax correction unit 11c can perform parallax correction based on them. Accordingly, the chart C need not be attached to an object OB in or around the field FD in order to perform parallax correction, and even when no such object OB exists, parallax correction can be performed by making effective use of the missing-plant portion N or the sunlight reflection spot S as the chart C.
In the parallax correction step, the parallax correction unit 11c calculates, based on the chart images captured at the same time by the first imaging unit 21 and the second imaging unit 22, a projection matrix that converts the position coordinates of a reference point P included in one chart image into the position coordinates of the corresponding point Q in the other chart image, and corrects the parallax by converting the position coordinates of all points included in one field image with the projection matrix (S13, S14). Thus, even if the internal or external state of the first imaging unit 21 and the second imaging unit 22 changes due to in-flight vibration of the flying object 10 or the like (for example, an optical-axis shift or a shift in mounting direction), each pixel of one field image (the visible image) can be associated with the pixel representing the same point in the field FD in the other field image (the near-infrared image).
Further, for each of the first imaging unit 21 and the second imaging unit 22, the parallax correction unit 11c extracts, from a plurality of captured images acquired at different timings while the flying object 10 is moving in the sky, the images containing the image of the chart C as chart images; it then obtains the center position of the image of the chart C from each extracted chart image, and calculates the projection matrix using, as the reference point P, that center position in one chart image captured at a given time and, as the corresponding point Q, that center position in the other chart image captured at the same time (S11 to S13). By calculating the projection matrix with the center positions of the chart C's image in simultaneously captured chart images as the reference point P and corresponding point Q in this way, the projection matrix needed for parallax correction can be obtained appropriately.
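Combining the pieces above, S11 to S13 might be sketched as follows: chart centers from frames captured at the same time by the two imaging units are paired, and the projection matrix is fitted from those pairs. The data layout (a mapping from capture time to grayscale frame) and the helper names (find_chart_center and fit_affine_least_squares from the earlier sketches) are assumptions.

```python
def build_projection(frames_cam1, frames_cam2):
    """Pair chart centers from simultaneously captured frames of the
    two imaging units and fit the projection matrix from the pairs."""
    ref_pts, cor_pts = [], []
    for t in sorted(set(frames_cam1) & set(frames_cam2)):
        p = find_chart_center(frames_cam1[t])  # reference point P
        q = find_chart_center(frames_cam2[t])  # corresponding point Q
        if p is not None and q is not None:
            ref_pts.append(p)
            cor_pts.append(q)
    if len(ref_pts) < 3:
        raise ValueError("need chart centers at 3 or more positions")
    return fit_affine_least_squares(ref_pts, cor_pts)
```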
In the growth index calculation step, the growth index calculation unit 11d calculates a growth index (for example, an NDVI value) of the crop PL based on the field images after the parallax correction unit 11c has corrected the positional shift (S6). Even when, as in the present embodiment, the first imaging unit 21 and the second imaging unit 22 are mounted on the flying object 10 and flown for the purpose of acquiring NDVI values, the parallax correction unit 11c described above can reduce the influence of parallax, so fluctuations in the NDVI value due to parallax can be effectively prevented. The growth index calculation unit 11d can therefore acquire an accurate NDVI value at any point in the field FD, and degradation of the NDVI image quality can be avoided. Moreover, even when the optical systems of the first imaging unit 21 and the second imaging unit 22 are soiled by adhering dust and the pixel values of the field images contain noise, correcting the parallax-induced positional shift of pixels as in the present embodiment suppresses noise-induced fluctuation of the NDVI value at any point in the field FD, compared with the case where no such correction is performed.
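NDVI here is the standard normalized difference vegetation index, NDVI = (NIR − Red) / (NIR + Red), computed per pixel on the aligned field images. A minimal sketch follows; the epsilon guard against division by zero is an illustrative addition, not part of the described method.

```python
import numpy as np

def ndvi(red_band, nir_band, eps=1e-6):
    """Per-pixel NDVI from the red channel of the visible field image
    and the near-infrared field image, assumed already aligned by the
    parallax correction."""
    red = np.asarray(red_band, dtype=float)
    nir = np.asarray(nir_band, dtype=float)
    return (nir - red) / (nir + red + eps)  # values in roughly [-1, 1]
```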
In the warning step, the warning unit 13 issues a warning when the amount of change in the positional shift over a predetermined period exceeds a threshold (S5). When the amount of change in the positional shift exceeds the threshold in S4, the change is more likely attributable to a failure of the first imaging unit 21 or the second imaging unit 22 than to vibration of the flying object 10 or the like. The warning in S5 therefore promptly prompts the user to take necessary measures such as inspection or repair, enabling the user to deal quickly with the failure. In particular, when the warning unit 13 includes at least one of the display unit 13a, the audio output unit 13b, and the communication unit 12, and at least one of displaying a warning, outputting a warning sound, and transmitting warning information is performed in S5, the user can be reliably notified that an abnormality such as a failure has occurred and reliably prompted to take necessary measures such as inspection.
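A hedged sketch of this monitoring: here the positional shift is summarized as the translation part of the projection matrix, and its change over the period is compared with a threshold. Both the scalar summary and the callback-style warning hook are illustrative assumptions, not the patent's own formulation.

```python
import numpy as np

def check_shift_drift(affine_prev, affine_now, threshold_px, warn):
    """Compare the positional shift estimated now with the one from
    the start of the period and warn if the change exceeds the
    threshold (a possible camera fault rather than mere vibration)."""
    drift = np.linalg.norm(affine_now[:, 2] - affine_prev[:, 2])
    if drift > threshold_px:
        warn(f"parallax shift changed by {drift:.1f} px; "
             "inspect the imaging units")
```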
[Others]
In parallax correction (calibration), the system may automatically determine whether there are three or more images in which the chart appears at different positions; if the number of such images is less than three, a warning may be issued, and if there are three or more, a notification that the image set is adequate may be given, as in the sketch below.
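A minimal sketch of this check, assuming chart centers have already been extracted from the day's calibration images; the pixel separation used to decide that two chart positions count as different is an illustrative parameter.

```python
def check_calibration_images(chart_centers, min_separation_px=20.0):
    """Count distinct chart positions among the calibration images and
    report whether the set is adequate (three or more positions)."""
    distinct = []
    for cx, cy in chart_centers:
        if all((cx - dx) ** 2 + (cy - dy) ** 2 >= min_separation_px ** 2
               for dx, dy in distinct):
            distinct.append((cx, cy))
    if len(distinct) < 3:
        print("warning: fewer than 3 images with different chart positions")
    else:
        print("calibration image set is adequate")
    return len(distinct)
```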
If no image usable for calibration has been found by the end of a day's photographing, a warning may be issued.
The field photographing system may have a calibration mode in which the flying object 10 is made to fly automatically along a predetermined route, and may include a program for executing such automatic flight.
The field photographing system and the field photographing method of the present embodiment described above can be expressed as follows.
The field photographing system of the present embodiment includes: a flying object; a plurality of imaging units that are held by the flying object and that, during the flight of the flying object, photograph from above a field where a crop is grown to acquire field images in different wavelength bands; a chart that is photographed by the plurality of imaging units; and a parallax correction unit that corrects, based on chart images obtained by photographing the chart from above with the plurality of imaging units during the flight of the flying object, the positional shift, between the field images, of pixels showing the same point caused by parallax; the chart is located in the field or around the field.
In the above field photographing system, the chart may be attached to an object located in the field or around the field.
In the above field photographing system, the object may be a take-off and landing platform for the flying object.
In the above field photographing system, the object may be a transport vehicle that carries the flying object.
In the above field photographing system, the chart may be mounted on the head of a pilot who remotely controls the flight of the flying object.
In the above field photographing system, the chart may have a pattern including at least one circular portion, or a lattice pattern.
In the above field photographing system, the chart may be a missing-plant portion of the crop located in the field, or a reflection spot of sunlight on a water surface in the field.
In the above field photographing system, the parallax correction unit may correct the positional shift by calculating, based on the chart images captured and acquired at the same time by the plurality of imaging units, a projection matrix that converts the position coordinates of a reference point included in one chart image into the position coordinates of the corresponding point in the other chart image, and converting the position coordinates of all points included in one field image with the projection matrix.
In the above field photographing system, the parallax correction unit may, for each of the plurality of imaging units, extract images containing the image of the chart as chart images from a plurality of captured images acquired at different timings while the flying object is moving in the sky, obtain the center position of the image of the chart from each extracted chart image, and calculate the projection matrix using the center position in the one chart image captured at a given time as the reference point and the center position in the other chart image captured at the same time as the corresponding point.
The above field photographing system may further include a growth index calculation unit that calculates a growth index of the crop based on the field images after the parallax correction unit has corrected the positional shift.
The above field photographing system may further include a warning unit that issues a warning when the amount of change in the positional shift over a predetermined period exceeds a threshold.
In the above field photographing system, the warning unit may include at least one of a display unit that displays a warning, an audio output unit that emits a warning sound, and a communication unit that transmits warning information to the outside.
The field photographing method of the present embodiment includes: a field image acquisition step of photographing, with a plurality of imaging units held by a flying object and during the flight of the flying object, a field where a crop is grown from above to acquire field images in different wavelength bands; a chart image acquisition step of photographing, from above with the plurality of imaging units during the flight of the flying object, a chart located in the field or around the field to acquire chart images; and a parallax correction step of correcting, based on the chart images, the positional shift, between the field images, of pixels showing the same point caused by parallax.
In the above field photographing method, it is desirable that the plurality of imaging units perform the field image acquisition step and the chart image acquisition step simultaneously.
In the above field photographing method, the flight of the flying object may include ascent and descent of the flying object.
In the above field photographing method, in the parallax correction step, the positional shift may be corrected by calculating, based on the chart images captured and acquired at the same time by the plurality of imaging units, a projection matrix that converts the position coordinates of a reference point included in one chart image into the position coordinates of the corresponding point in the other chart image, and converting the position coordinates of all points included in one field image with the projection matrix.
In the above field photographing method, in the parallax correction step, for each of the plurality of imaging units, images containing the image of the chart may be extracted as chart images from a plurality of captured images acquired at different timings while the flying object is moving in the sky, the center position of the image of the chart may be obtained from each extracted chart image, and the projection matrix may be calculated using the center position in the one chart image captured at a given time as the reference point and the center position in the other chart image captured at the same time as the corresponding point.
The above field photographing method may further include a growth index calculation step of calculating a growth index of the crop based on the field images after the positional shift has been corrected in the parallax correction step.
The above field photographing method may further include a warning step of issuing a warning when the amount of change in the positional shift over a predetermined period exceeds a threshold.
In the above field photographing method, the warning step may perform at least one of displaying a warning, outputting a warning sound, and transmitting warning information to the outside.
While an embodiment of the present invention has been described above, the scope of the present invention is not limited thereto, and the invention can be extended or modified without departing from its gist.
INDUSTRIAL APPLICABILITY: The present invention is applicable to a system that photographs a field from above with a plurality of imaging units held by a flying object to acquire field images in different wavelength bands and that calculates a growth index based on the acquired field images.
DESCRIPTION OF SYMBOLS
  1   Field photographing system
 10   Flying object
 11c  Parallax correction unit
 11d  Growth index calculation unit
 12   Communication unit (warning unit)
 13   Warning unit
 13a  Display unit
 13b  Audio output unit
 21   First imaging unit
 22   Second imaging unit
 40   Take-off and landing platform
 41   Transport vehicle
 42   Helmet
  C   Chart
  C1  Circular portion
  C2  Circular portion
 FD   Field
  P   Reference point
  Q   Corresponding point
 OB   Object
  N   Missing-plant portion
  S   Reflection spot

Claims (20)

  1.  A field photographing system comprising:
     a flying object;
     a plurality of imaging units that are held by the flying object and that, during the flight of the flying object, photograph from above a field where a crop is grown to acquire field images in different wavelength bands;
     a chart that is photographed by the plurality of imaging units; and
     a parallax correction unit that corrects, based on chart images obtained by photographing the chart from above with the plurality of imaging units during the flight of the flying object, a positional shift, between the field images, of pixels showing the same point caused by parallax,
     wherein the chart is located in the field or around the field.
  2.  The field photographing system according to claim 1, wherein the chart is attached to an object located in the field or around the field.
  3.  The field photographing system according to claim 2, wherein the object is a take-off and landing platform for the flying object.
  4.  The field photographing system according to claim 2, wherein the object is a transport vehicle that carries the flying object.
  5.  The field photographing system according to claim 1, wherein the chart is mounted on the head of a pilot who remotely controls the flight of the flying object.
  6.  The field photographing system according to any one of claims 1 to 5, wherein the chart has a pattern including at least one circular portion, or a lattice pattern.
  7.  The field photographing system according to claim 1, wherein the chart is a missing-plant portion of the crop located in the field, or a reflection spot of sunlight on a water surface in the field.
  8.  The field photographing system according to any one of claims 1 to 7, wherein the parallax correction unit corrects the positional shift by calculating, based on the chart images captured and acquired at the same time by the plurality of imaging units, a projection matrix that converts the position coordinates of a reference point included in one chart image into the position coordinates of the corresponding point in the other chart image, and converting the position coordinates of all points included in one field image with the projection matrix.
  9.  The field photographing system according to claim 8, wherein the parallax correction unit, for each of the plurality of imaging units, extracts images containing the image of the chart as chart images from a plurality of captured images acquired at different timings while the flying object is moving in the sky, obtains the center position of the image of the chart from each extracted chart image, and calculates the projection matrix using the center position in the one chart image captured at a given time as the reference point and the center position in the other chart image captured at the same time as the corresponding point.
  10.  The field photographing system according to any one of claims 1 to 9, further comprising a growth index calculation unit that calculates a growth index of the crop based on the field images after the parallax correction unit has corrected the positional shift.
  11.  The field photographing system according to any one of claims 1 to 10, further comprising a warning unit that issues a warning when the amount of change in the positional shift over a predetermined period exceeds a threshold.
  12.  The field photographing system according to claim 11, wherein the warning unit includes at least one of a display unit that displays a warning, an audio output unit that emits a warning sound, and a communication unit that transmits warning information to the outside.
  13.  A field photographing method comprising:
     a field image acquisition step of photographing, with a plurality of imaging units held by a flying object and during the flight of the flying object, a field where a crop is grown from above to acquire field images in different wavelength bands;
     a chart image acquisition step of photographing, from above with the plurality of imaging units during the flight of the flying object, a chart located in the field or around the field to acquire chart images; and
     a parallax correction step of correcting, based on the chart images, a positional shift, between the field images, of pixels showing the same point caused by parallax.
  14.  The field photographing method according to claim 13, wherein the plurality of imaging units perform the field image acquisition step and the chart image acquisition step simultaneously.
  15.  The field photographing method according to claim 13 or 14, wherein the flight of the flying object includes ascent and descent of the flying object.
  16.  The field photographing method according to any one of claims 13 to 15, wherein, in the parallax correction step, the positional shift is corrected by calculating, based on the chart images captured and acquired at the same time by the plurality of imaging units, a projection matrix that converts the position coordinates of a reference point included in one chart image into the position coordinates of the corresponding point in the other chart image, and converting the position coordinates of all points included in one field image with the projection matrix.
  17.  The field photographing method according to claim 16, wherein, in the parallax correction step, for each of the plurality of imaging units, images containing the image of the chart are extracted as chart images from a plurality of captured images acquired at different timings while the flying object is moving in the sky, the center position of the image of the chart is obtained from each extracted chart image, and the projection matrix is calculated using the center position in the one chart image captured at a given time as the reference point and the center position in the other chart image captured at the same time as the corresponding point.
  18.  The field photographing method according to any one of claims 13 to 17, further comprising a growth index calculation step of calculating a growth index of the crop based on the field images after the positional shift has been corrected in the parallax correction step.
  19.  The field photographing method according to any one of claims 13 to 18, further comprising a warning step of issuing a warning when the amount of change in the positional shift over a predetermined period exceeds a threshold.
  20.  The field photographing method according to claim 19, wherein the warning step performs at least one of displaying a warning, outputting a warning sound, and transmitting warning information to the outside.
PCT/JP2019/010050 2018-08-29 2019-03-12 Cultivated land imaging system and cultivated land imaging method WO2020044628A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020540034A JPWO2020044628A1 (en) 2018-08-29 2019-03-12 Field photography system and field photography method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018160538 2018-08-29
JP2018-160538 2018-08-29

Publications (1)

Publication Number Publication Date
WO2020044628A1 true WO2020044628A1 (en) 2020-03-05

Family

ID=69645081

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/010050 WO2020044628A1 (en) 2018-08-29 2019-03-12 Cultivated land imaging system and cultivated land imaging method

Country Status (2)

Country Link
JP (1) JPWO2020044628A1 (en)
WO (1) WO2020044628A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09161068A (en) * 1995-12-12 1997-06-20 Furukawa Electric Co Ltd:The Picture photographing method and picture edition device using the method
JP2014093037A (en) * 2012-11-06 2014-05-19 Canon Inc Image processing apparatus
JP2014183788A (en) * 2013-03-25 2014-10-02 Sony Corp Information processing system, and information processing method of information processing system, imaging device and imaging method, and program
WO2017221641A1 (en) * 2016-06-22 2017-12-28 コニカミノルタ株式会社 Plant growth index measurement device, method, and program

Also Published As

Publication number Publication date
JPWO2020044628A1 (en) 2021-09-24

Similar Documents

Publication Publication Date Title
EP2597422B1 (en) Aerial photograph image pickup method and aerial photograph image pickup apparatus
US9488630B2 (en) Integrated remote aerial sensing system
JP4970296B2 (en) Orthophoto image generation method and photographing apparatus
JP6996560B2 (en) Crop cultivation support device
CN110537365B (en) Information processing device, information processing method, information processing program, image processing device, and image processing system
JP7074126B2 (en) Image processing equipment, growth survey image creation system and program
US20180218534A1 (en) Drawing creation apparatus and drawing creation method
JP2003009664A (en) Crop growth level measuring system, crop growth level measuring method, crop growth level measuring program, and computer-readable recording medium recorded with the program
JP7299213B2 (en) Information processing equipment
JP7069609B2 (en) Crop cultivation support device
JP6796789B2 (en) Video recording device
KR102169687B1 (en) Method for acquisition of hyperspectral image using an unmanned aerial vehicle
US11181470B2 (en) Sensing system, sensing method, and sensing device
JP2016223934A (en) Position correction system, position correcting method, and position correction program
WO2020044628A1 (en) Cultivated land imaging system and cultivated land imaging method
JP7274978B2 (en) air vehicle support system
JP7318768B2 (en) Crop cultivation support device
KR101208621B1 (en) System and method for airborne multi-spectral scanner image data processing
JP6880380B2 (en) Image processing equipment, image processing methods, and programs
JP2021019546A (en) Agriculture support system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19854079

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020540034

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19854079

Country of ref document: EP

Kind code of ref document: A1