WO2023233832A1 - Method, program, and information processing device

Method, program, and information processing device

Info

Publication number
WO2023233832A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
area
spectral
information
specific state
Prior art date
Application number
PCT/JP2023/014996
Other languages
French (fr)
Japanese (ja)
Inventor
Takahiro Nakamura (中村 隆洋)
Original Assignee
株式会社ポーラスター・スペース
Priority date
Filing date
Publication date
Application filed by 株式会社ポーラスター・スペース
Publication of WO2023233832A1

Classifications

    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01G: HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
    • A01G7/00: Botany in general
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17: Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25: Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/27: Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands, using photo-electric detection; circuits for computing concentration
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00: Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/02: Agriculture; Fishing; Mining
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; Edge detection
    • G06T7/11: Region-based segmentation
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present disclosure relates to a method, a program, and an information processing device.
  • An object of the present invention is to provide an information processing device.
  • in one aspect, a method for operating a computer includes a first step in which a processor of the computer stores in memory information for detecting a specific state of a crop under observation, for comparison with a spectral image taken by a multispectral camera.
  • FIG. 1 is a diagram showing an overview of a system according to an embodiment.
  • FIG. 2 is a block diagram showing the configuration of the system according to the embodiment.
  • FIG. 3 is a front view showing a drone according to the embodiment.
  • FIG. 4 is a plan view showing the drone according to the embodiment.
  • FIG. 5 is a block diagram showing the hardware configuration of the drone according to the embodiment.
  • FIG. 6 is a block diagram showing the functional configuration of the drone according to the embodiment.
  • FIG. 7 is a conceptual diagram showing imaging of land by the drone according to the embodiment.
  • FIG. 8 is a block diagram showing the hardware configuration of a satellite according to the embodiment.
  • FIG. 9 is a block diagram showing the functional configuration of the satellite according to the embodiment.
  • FIG. 10 is a conceptual diagram showing imaging of land by the satellite according to the embodiment.
  • FIG. 11 is a block diagram showing the hardware configuration of an information processing device according to the embodiment.
  • FIG. 12 is a diagram showing the functional configuration of the information processing device according to the embodiment.
  • FIG. 13 is a diagram showing the data structure of a first imaging condition DB stored in the satellite according to the embodiment.
  • FIG. 14 is a diagram showing the data structure of an image DB stored in the satellite according to the embodiment.
  • FIG. 15 is a diagram showing the data structure of a second imaging condition DB stored in the drone according to the embodiment.
  • FIG. 16 is a diagram showing the data structure of a flight plan DB stored in the drone according to the embodiment.
  • FIG. 17 is a diagram showing the data structure of a wide-area image DB stored in the information processing device according to the embodiment.
  • FIG. 18 is a flowchart explaining an example of the operation of the satellite according to the embodiment.
  • FIG. 19 is a flowchart explaining an example of the operation of the information processing device according to the embodiment.
  • FIG. 20 is a flowchart explaining an example of the operation of the drone according to the embodiment.
  • FIG. 21 is a flowchart explaining another example of the operation of the information processing device according to the embodiment.
  • FIG. 22 is a diagram illustrating an example of the operation of wide-area image generation in the information processing device according to the embodiment.
  • FIG. 23 is a conceptual diagram showing a procedure for determining that a crop is in a specific state using a spectral image.
  • FIG. 24 is a diagram showing an example of a screen displayed on the information processing device according to the embodiment.
  • FIG. 25 is a diagram showing another example of a screen displayed on the information processing device according to the embodiment.
  • a "processor” refers to one or more processors.
  • the at least one processor is typically a microprocessor such as a CPU (Central Processing Unit), but may be another type of processor such as a GPU (Graphics Processing Unit).
  • At least one processor may be single-core or multi-core.
  • At least one processor may be a processor in a broad sense such as a hardware circuit (for example, FPGA (Field-Programmable Gate Array) or ASIC (Application Specific Integrated Circuit)) that performs part or all of the processing.
  • information that provides an output in response to an input may be explained using expressions such as an "xxx table", but this information may be data of any structure, and may be a learning model such as a neural network that generates an output in response to an input. Therefore, an "xxx table" can also be called "xxx information".
  • the configuration of each table is an example: one table may be divided into two or more tables, and all or part of two or more tables may be combined into one table.
  • processing may be explained with a "program" as the subject, but since a program is executed by a processor to carry out a prescribed process while using the storage unit and/or the interface unit as appropriate, the subject of the processing may equally be said to be the processor (or a device, such as a controller, that has the processor).
  • the program may be installed on a device such as a computer, or may be located on, for example, a program distribution server or a computer-readable (e.g., non-transitory) recording medium.
  • two or more programs may be realized as one program, or one program may be realized as two or more programs.
  • identification numbers are used as identification information for various objects, but other types of identification information than identification numbers (for example, identifiers containing alphabetic characters or codes) may be employed.
  • control lines and information lines are shown where they are necessary for the explanation; not all of the control lines and information lines in an actual product are necessarily shown. In practice, all configurations may be interconnected.
  • a "spectral image” means an image consisting only of reflected light in a specific wavelength band (including specific wavelengths) among the reflected light from an object.
  • a spectral image can be created by passing the reflected light from an object (a crop) through a filter that passes only a specific wavelength band (preferably a filter whose transmission wavelength band is variable), so that only the reflected light in that band is imaged on an image sensor, and then generating the image based on the imaging signal output from the image sensor.
  • alternatively, the reflected light from the object can be imaged on an image sensor without passing through a filter, and the image signal output from this image sensor can be converted (filtered) by signal processing into an image signal of only a specific wavelength band, from which the spectral image is generated.
  • an image that is a collection of a plurality of spectral images of the same object, each made up of only reflected light in a specific wavelength band, may also be referred to as a spectral image.
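  • as a concrete sketch of this definition, a multispectral capture can be represented as a cube of band images, from which each spectral image is one 2-D slice. The following minimal Python example is illustrative only; the wavelengths and image size are assumed values, not taken from the patent:

```python
# A minimal sketch: a multispectral capture as a (bands, height, width) cube;
# the spectral image for one band is a single 2-D slice of the cube.
# The wavelengths and image size below are assumed, illustrative values.
import numpy as np

wavelengths_nm = [550, 650, 720, 800, 850]            # hypothetical band centers
cube = np.random.rand(len(wavelengths_nm), 480, 640)  # stand-in sensor data

def spectral_image(cube, wavelengths_nm, target_nm):
    """Return the 2-D spectral image for the band closest to target_nm."""
    idx = int(np.argmin([abs(w - target_nm) for w in wavelengths_nm]))
    return cube[idx]

img_800 = spectral_image(cube, wavelengths_nm, 800)   # reflectance near 800 nm
print(img_800.shape)                                  # (480, 640)
```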
  • the term "flying object” refers to a mobile object that flies in the air and can be operated without a crew (e.g., an aircraft that can be remotely controlled wirelessly or autonomously, a drone, etc.). , satellite, airship).
  • the expression that a crop is in a "specific state" includes the following:
    - whether or not the crop is diseased, which disease it is infected with, and which disease it is likely to contract;
    - the growth status of the crop (growth status includes whether or not it is harvestable);
    - if the target area is a forest, the condition of the vegetation and the distribution of the vegetation.
  • crops include not only agricultural products but also plants in general. Therefore, the expression "a crop is in a specific state" also covers the condition of the plant on which the crop grows (for example, if the crop is an apple, the condition of the leaves of the tree on which the apple grows).
  • a first multispectral camera images the ground surface, including an area where crops are cultivated. An operator examines the spectral images obtained by this imaging and associates the intensity of a specific wavelength band in a certain area (corresponding to the relative strength of the reflected light in the specific wavelength band and the reflected light in other wavelength bands from the same object) with the specific state of the crops in that area. This association is saved as a library.
  • the area to be observed is similarly imaged by the second multispectral camera to obtain a spectral image.
  • a spectral image taken of the area to be observed is analyzed according to the above-mentioned library, and an area where crops are in a specific state in this area is identified.
  • the first multispectral camera and the second multispectral camera are preferably mounted on a moving body, and acquire spectral images by moving the moving body over the ground surface and the area to be observed.
  • Suitable examples of the moving body include flying objects such as drones and satellites, but there is no particular limitation as long as the body can move the first and second multispectral cameras and acquire spectral images.
  • the moving body does not need to be a flying object; a moving body carrying the first and second multispectral cameras may move over the ground surface.
  • spectral images may also be acquired by, for example, an operator carrying the first and second multispectral cameras while moving over the ground surface, or by a car equipped with the first and second multispectral cameras driving over the ground surface.
  • One example is a mode in which a multispectral camera realized by a smartphone is used as the spectroscopic terminal device disclosed in Japanese Patent No. 6,342,594.
  • the mobile body on which the first multispectral camera is mounted and the mobile body on which the second multispectral camera is mounted do not need to be the same.
  • in the embodiment described below, the first multispectral camera is mounted on a satellite and the second multispectral camera is mounted on a drone, but the first multispectral camera may instead be mounted on the drone and the second multispectral camera on a satellite.
  • both the first and second multispectral cameras may be mounted on a satellite, or both the first and second multispectral cameras may be mounted on a drone.
  • at least one of the first multispectral camera and the second multispectral camera may be mounted on a moving body that moves on the ground surface.
  • the first multispectral camera and the second multispectral camera both receive reflected light from the ground surface in multiple wavelength bands, and generate a spectral image based on this reflected light.
  • the first multispectral camera has a member capable of switching the imaging wavelength, such as a liquid crystal tunable wavelength filter, and can generate spectral images for a larger number of wavelengths than the second multispectral camera.
  • the second multispectral camera generates spectral images for specific wavelengths. Preferably, these specific wavelengths cannot be switched while the second multispectral camera is imaging the area to be observed; that is, the specific wavelengths are predetermined.
  • library herein refers to a highly versatile data set and/or program.
  • the library is implemented as training data in artificial intelligence technology.
  • a learning model is generated using the teacher data, and areas where crops are in a specific state are identified based on this learning model. In the following explanation, the term "library" is therefore used mainly to refer to the teacher data, but it may also be used without particularly distinguishing between the teacher data and the learning model.
  • the learning model includes a model that has already been learned, that is, a trained model.
  • preferably, the spectral images captured by the first multispectral camera are taken at a constant solar angle (as an example, the solar angle at culmination, i.e., around solar noon), and the library is created based on the spectral images at that constant solar angle.
  • preferably, when imaging with the second multispectral camera, the spectral image is acquired at the same solar angle as that at which the spectral image was acquired with the first multispectral camera; this minimizes the influence of the solar angle when identifying areas where crops are in a particular state.
  • the imaging range of the spectral images captured by the first and second multispectral cameras is determined by the angle of view of each multispectral camera and the altitude of the moving body carrying it above the ground surface. It is therefore preferable to connect multiple spectral images to generate a spectral image covering a wide range.
  • Techniques for connecting spectral images are well known; one example is to generate orthoimages from the spectral images and connect the orthoimages (a sketch follows below). Such connection may be applied to the spectral images obtained from either the first multispectral camera or the second multispectral camera.
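  • as a sketch of the connection step (illustrative only; it assumes the tiles are already orthorectified, georeferenced, and resampled to a common ground resolution, and is not the patent's specific algorithm), georeferenced orthoimage tiles can be pasted into one wide-area array by their ground-coordinate bounds:

```python
import numpy as np

def mosaic(tiles, bounds, resolution):
    """Paste georeferenced orthoimage tiles into one wide-area image.

    tiles      : list of 2-D arrays (single-band orthoimages), already
                 resampled so that shape matches bounds / resolution
    bounds     : list of (xmin, ymin, xmax, ymax) ground coordinates per tile
    resolution : ground size of one pixel (same units as bounds)
    """
    xmin = min(b[0] for b in bounds); ymin = min(b[1] for b in bounds)
    xmax = max(b[2] for b in bounds); ymax = max(b[3] for b in bounds)
    width = int(round((xmax - xmin) / resolution))
    height = int(round((ymax - ymin) / resolution))
    canvas = np.full((height, width), np.nan)       # NaN marks un-imaged ground
    for tile, (x0, y0, x1, y1) in zip(tiles, bounds):
        r0 = int(round((ymax - y1) / resolution))   # row 0 = north edge
        c0 = int(round((x0 - xmin) / resolution))
        h, w = tile.shape
        canvas[r0:r0 + h, c0:c0 + w] = tile         # later tiles overwrite overlap
    return canvas

# Two adjacent 100 m x 100 m tiles at 10 m/pixel -> one 200 m x 100 m image.
wide = mosaic([np.ones((10, 10)), np.zeros((10, 10))],
              [(0, 0, 100, 100), (100, 0, 200, 100)], resolution=10.0)
print(wide.shape)   # (10, 20)
```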
  • the first multispectral camera is mounted on a satellite and the second multispectral camera is mounted on a drone.
  • the moving object and the method of moving the first and second multispectral cameras are not limited to the disclosure of the embodiments.
  • the configurations of the first and second multispectral cameras are not limited to the disclosure of the embodiments.
  • in the embodiment, a spectral image for library creation is generated by the first multispectral camera mounted on a satellite 2, which is an example of a flying object, and the area to be observed is imaged by the second multispectral camera mounted on a drone 3, which is also an example of a flying object; the resulting spectral images are used to identify areas where crops are in a particular state.
  • the flying object for capturing the spectral image for library creation and the flying object for capturing the spectral image for region identification may be the same, and there are no restrictions on their types.
  • by capturing the spectral images for library creation using the satellite 2, there is the advantage that spectral images of a wider range of the earth's surface can be obtained efficiently.
  • by capturing the spectral images for area identification with the drone 3, spectral images can easily be obtained by flying the drone 3 over just the areas whose condition should be identified; moreover, the same area can be imaged multiple times, which has the advantage of lowering the cost.
  • the satellite 2 is equipped with the first multispectral camera, which uses a liquid crystal tunable filter (LCTF), while the drone 3 is equipped with the second multispectral camera, which can capture spectral images in a combination of multiple wavelength bands that can clearly identify that a crop is in a specific state (preferably, a set of cameras each capable of capturing only reflected light in one specific wavelength band).
  • conversely, the drone 3 may be equipped with the first multispectral camera using the LCTF, and the satellite 2 may be equipped with the second multispectral camera capable of capturing spectral images using a combination of a plurality of wavelength bands.
  • the first multispectral camera on board the satellite 2 can, through the action of the liquid crystal tunable wavelength filter, capture reflected light in a specific wavelength band out of the reflected light in a wide wavelength band spanning the visible to near-infrared ranges.
  • the system 1 of this embodiment scans a wide wavelength band and captures spectral images of the same area based on reflected light in multiple wavelength bands.
  • a spectral image captured by the first multispectral camera mounted on the satellite 2 is transmitted to the information processing device 4.
  • the operator of the information processing device 4 compares multiple spectral images (preferably of three or more wavelength bands) among the spectral images taken of the same region by the first multispectral camera mounted on the satellite 2, identifies from the shading patterns in each spectral image a shading pattern indicating that the crop is in a specific state, and associates this shading pattern with information indicating the specific state. It is assumed here that the operator of the information processing device 4 has grasped in advance the growth state of the crops on the ground surface in that region and, based on this growth state, knows which areas' crops are in which specific states.
  • the above-mentioned associations are stored in the memory of the information processing device 4 as a library.
  • this library is implemented as training data in artificial intelligence technology. More preferably, when creating the library, the angle of the sun with respect to the earth's surface is acquired or calculated from the time when the earth's surface was imaged by the satellite 2, and the library is created in association with this angle of the sun.
  • however, this association with the solar angle is not essential, particularly at the beginning of operation of the system 1.
  • the drone 3 is flown, based on a predetermined flight plan, over the observation area where it is to be determined whether or not the crops are in a specific state, and the area is imaged by the second multispectral camera mounted on the drone 3. As described above, since the imaging range of the second multispectral camera mounted on the drone 3 is relatively narrow, it is preferable to fly multiple drones 3 at the same time and have their second multispectral cameras capture images in parallel.
  • the flight plan be created in advance by the operator of the information processing device 4.
  • the operator grasps the topography of the area to be observed in advance and creates a flight plan in which the drone 3 is to fly focused on areas where crops are expected to be in a specific state.
  • it is also possible for the information processing device 4 to recommend areas where crops are expected to be in a specific state. Such areas are, for example, areas along rivers if the crop is sensitive to humidity, areas where fog is expected to occur for long periods, or, if the crop is expected to be in a particular state within a certain altitude range, the areas within that altitude range.
  • a plurality of wavelength bands is selected in advance so that the presence or absence of a specific state can be clearly grasped, and a plurality of cameras, each capable of imaging one of these wavelength bands (preferably three or more cameras, more preferably four cameras), is prepared; together, these cameras constitute the second multispectral camera.
  • a spectral image obtained by imaging the area to be observed using such a second multispectral camera is input to the information processing device 4, and is compared with the library to identify areas where crops are in a specific state.
  • the hatched areas are identified as areas where crops are in a specific state.
  • preferably, the information processing device 4 connects the individual spectral images taken by the second multispectral camera mounted on the drone 3 into one spectral image, and uses this connected spectral image to identify areas where crops are in a specific state. This is because the drone 3 flies at a low altitude compared to the satellite 2, so if the area to be observed is wide, it is difficult to cover the entire area with a single spectral image.
  • the spectral images obtained from the first multispectral camera mounted on the satellite 2 may also be connected as appropriate, and preferably are connected.
  • preferably, the solar angle of the area at the time the spectral images were captured is acquired or calculated, and this solar angle is taken into account when identifying the areas where the crop is in the particular state.
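  • the comparison with the library can be pictured with the following sketch (the band signatures and state names are invented for illustration; in the embodiments the library is operator-built and may be realized as a learning model rather than a lookup table): each pixel's band vector in the drone's spectral image stack is matched to the nearest library signature.

```python
import numpy as np

# Hypothetical library: a mean reflectance signature per crop state,
# as might be distilled from operator-labelled satellite spectral images.
library = {
    "healthy":       np.array([0.08, 0.05, 0.30, 0.45]),
    "disease_X":     np.array([0.10, 0.09, 0.18, 0.25]),
    "harvest_ready": np.array([0.12, 0.10, 0.35, 0.40]),
}

def identify(cube):
    """cube: (bands, H, W) spectral image stack -> (H, W) array of state labels."""
    states = list(library)
    sigs = np.stack([library[s] for s in states])     # (n_states, bands)
    pixels = cube.reshape(cube.shape[0], -1).T        # (H*W, bands)
    dist = np.linalg.norm(pixels[:, None, :] - sigs[None], axis=2)
    labels = np.array(states)[dist.argmin(axis=1)]
    return labels.reshape(cube.shape[1:])

labels = identify(np.random.rand(4, 60, 80))
print(labels.shape)   # (60, 80); contiguous regions of one label correspond
                      # to the hatched areas in the figure
```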
  • the system 1 of this embodiment includes an information processing device 10, a drone 20, and a satellite 30 that are connected via a network N.
  • the hardware configuration of the information processing device 10 is shown in FIG. 11, the hardware configuration of the drone 20 is shown in FIG. 5, and the hardware configuration of the satellite 30 is shown in FIG.
  • These drones 20 and satellites 30 are equipped with information processing devices.
  • the information processing device is composed of a computer equipped with an arithmetic unit and a storage device.
  • the basic hardware configuration of the computer and the basic functional configuration of the computer realized by the hardware configuration will be described later.
  • the network N is composed of the Internet, LANs, various mobile communication systems constructed by wireless base stations, and the like.
  • the network includes 3G, 4G, and 5G mobile communication systems, LTE (Long Term Evolution), wireless networks that can be connected to the Internet through a predetermined access point (e.g., Wi-Fi (registered trademark)), and the like.
  • communication protocols include, for example, Z-Wave (registered trademark), ZigBee (registered trademark), Bluetooth (registered trademark), and the like.
  • the network also includes a network that is directly connected using a USB (Universal Serial Bus) cable or the like.
  • the information processing device 10 receives spectral images captured by the satellite 30 and creates a library based on these spectral images. The information processing device 10 then receives spectral images captured by the drone 20 and, based on these spectral images and the library, identifies areas in the observation target area where crops are in a specific state.
  • the drone 20 of this embodiment is configured to be able to fly stationary and, as shown in FIGS. 3 and 4, is equipped with a spectral camera control system 23 that includes a spectral camera control device 21 and a spectral camera 22 controlled by the spectral camera control device 21. Each configuration is explained in detail below.
  • the drone 20 is a flying object that has a function of flying stationary in the air, a so-called hovering function; in this embodiment, it is a multicopter-type drone having a plurality of rotary wings. The drone 20 of this embodiment also has a function of flying autonomously along a pre-specified flight path and a function of flying by remote control from a communication device or the like. Furthermore, although not shown in FIGS. 3 and 4, the drone 20 is equipped with a GPS (Global Positioning System) receiver for detecting the position (latitude and longitude) and altitude of the drone during flight, and an attitude sensor that detects the drone's own attitude.
  • the drone 20 is equipped with a spectral camera control system 23 that includes a spectral camera control device 21 and a spectral camera 22, which is a second multispectral camera controlled by the spectral camera control device 21.
  • the spectral camera control system 23 mainly includes the spectral camera 22, an attitude position detector 24 that detects information on the attitude and position of the spectral camera 22, the spectral camera control device 21 that controls the spectral camera 22, and a battery 25 that supplies power to each of these devices.
  • the spectrum camera 22 is mounted on the drone 20 so as to face vertically downward so that the ground surface becomes the imaging target while the drone 20 is flying stationary.
  • the spectral camera 22 mounted on the drone 20 of this embodiment has an image sensor that can detect reflected light in the wavelength bands corresponding to each of the plurality of wavelength band combinations that, as a result of the library generation described later, can detect that crops are in a specific state.
  • specifically, the spectral camera 22 has an image sensor and a plurality of filters, each of which passes only light in one of the plurality of wavelength bands. When the spectral camera 22 images the ground surface, the filters are switched (replaced) as appropriate so that only the reflected light in the selected wavelength band reaches the image sensor; by switching the filters successively, imaging results based on the reflected light in the wavelength bands corresponding to each combination of wavelength bands are obtained.
  • that is, once the combinations of wavelength bands capable of detecting a specific state have been determined, a spectral camera 22 with a set of filters chosen based on that determination is mounted on the drone 20.
  • a plurality of cameras capable of imaging only light in a single wavelength band may be installed, and the spectral camera 22 may be constituted by the plurality of cameras.
  • the spectral camera 22 generates the spectral image 2104 based on reflected light in predetermined specific wavelength bands, and these wavelength bands are not switched while the drone 20 is imaging the area to be observed.
  • alternatively, the spectral camera 22 may be configured to be capable of arbitrarily switching between a plurality of wavelength bands, like the spectral camera 32 mounted on the satellite 30.
  • the image sensor captures a spectrum image using a snapshot method.
  • the image sensor 222 is a two-dimensional image sensor such as a CMOS image sensor or a CCD image sensor that can capture images within the field of view at the same timing. Further, the image sensor 222 is configured to perform imaging based on an imaging command signal transmitted from the spectral camera control device 21.
  • the attitude and position detector 24 is a device that detects the attitude and position of the spectrum camera 22.
  • the attitude position detector 24 in this embodiment includes a GPS receiver 240 that detects position information and altitude information of the spectrum camera 22, and an attitude sensor 241 that detects attitude information of the spectrum camera 22.
  • the GPS receiver 240 acquires current position information and altitude information by capturing the positions of multiple artificial satellites.
  • the GPS receiver 240 in this embodiment is configured to acquire longitude information and latitude information as position information, and altitude information as altitude information.
  • the position information and altitude information are not limited to those acquired by the GPS receiver 240, and may be acquired by other methods.
  • a reference point may be set and distance and altitude information from the reference point may be acquired using a distance measuring device that uses laser light or acoustic reflection.
  • the attitude sensor 241 detects attitude information such as the inclination angle, angular velocity, and acceleration of the spectrum camera 22.
  • attitude sensor 241 in this embodiment includes a gyro sensor that uses gyro characteristics and an acceleration sensor, and is configured to obtain tilt angle, angular velocity, and acceleration in three axial directions as attitude information.
  • in this embodiment, the position information and attitude information are acquired from the attitude position detector 24 provided as part of the spectral camera control system 23, but the configuration is not limited to this.
  • the position information and attitude information may be acquired from a GPS receiver and an attitude sensor that the drone 20 already has.
  • FIG. 6 shows a functional configuration realized by the spectrum camera control device 21 of the drone 20.
  • the spectral camera control device 21 includes a storage section 210, a control section 211, and a communication section 212.
  • the communication unit 212 is configured by an unillustrated communication IF of the spectral camera control device 21.
  • the storage unit 210 is configured by an unillustrated main storage device and auxiliary storage device of the spectral camera control device 21.
  • the control unit 211 is mainly constituted by an unillustrated processor of the spectral camera control device 21.
  • the communication unit 212 communicates with the information processing device 10 and the like via a network N (not shown).
  • the storage unit 210 of the spectral camera control device 21 includes a flight plan DB (DataBase) 2101, a second imaging condition DB 2102, an image DB 2103, and a spectral image 2104.
  • the database referred to here is a relational database, which manages data sets expressed as tables, structurally defined by rows and columns, in relation to each other.
  • in a relational database, the tabular data set is called a table, a column of the table is called a column, and a row of the table is called a record.
  • each table can have a column set as a primary key to uniquely identify a record, but setting a primary key on a column is not essential.
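  • these terms can be illustrated with a minimal sketch (the table and column names below are assumptions chosen for illustration, not the schema disclosed for the flight plan DB 2101):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE flight_plan (
        waypoint_id INTEGER PRIMARY KEY,  -- column set as the primary key
        latitude    REAL,                 -- a column
        longitude   REAL,
        altitude_m  REAL
    )""")
con.execute("INSERT INTO flight_plan VALUES (1, 35.6812, 139.7671, 150.0)")
for record in con.execute("SELECT * FROM flight_plan"):  # each row is a record
    print(record)   # (1, 35.6812, 139.7671, 150.0)
```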
  • the control unit 211 can cause the processor to add, delete, or update records to a specific table stored in the storage unit 210 according to various programs.
  • the flight plan DB 2101 is a database storing a flight plan for flying the drone 20 along a predetermined route when flying the drone 20 in the air.
  • the image DB 2103 is a database for managing spectral images 2104 obtained as a result of imaging by the spectral camera 22 according to the second imaging condition DB 2102.
  • the spectral image 2104 is a spectral image obtained as a result of imaging by the spectral camera 22 according to the second imaging condition DB 2102. Details of the flight plan DB 2101, the second imaging condition DB 2102, and the image DB 2103 will be described later.

<Configuration of control unit 211 of spectral camera control device 21>
  • the control unit 211 of the spectral camera control device 21 includes a reception control unit 2110, a transmission control unit 2111, a flight control unit 2112, an attitude position information acquisition unit 2113, and an imaging control unit 2114.
  • the control unit 211 implements functional units such as the reception control unit 2110 by executing the application program 2100 stored in the storage unit 210.
  • the reception control unit 2110 controls the process by which the spectral camera control device 21 receives signals from an external device according to a communication protocol.
  • the transmission control unit 2111 controls the process by which the spectral camera control device 21 transmits a signal to an external device according to a communication protocol.
  • the flight control unit 2112 controls the flight of the drone 20 toward the target point specified by the flight plan DB 2101, based on the geographical information of the target point and the attitude position information acquired by the attitude position information acquisition unit 2113.
  • the attitude position information acquisition unit 2113 acquires the position information, altitude information, and attitude information of the drone 20 detected by the attitude position detector 24, and provides the acquisition results to the flight control unit 2112 and the imaging control unit 2114.
  • the imaging control unit 2114 refers to the second imaging condition DB 2102, images the observation target area with the spectral camera 22 at the imaging positions specified by the second imaging condition DB 2102, stores the captured spectral image 2104 in the storage unit 210, and also stores the conditions under which the spectral image 2104 was captured in the image DB 2103.
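  • the guidance performed by the flight control unit 2112 might be sketched as follows (a deliberately simplified, hypothetical loop; the actual controller also uses the attitude information and is not disclosed at this level of detail):

```python
import math

def velocity_command(pos, target, speed=5.0, arrive_radius=2.0):
    """Head straight for the waypoint at a fixed speed.

    pos, target : (x, y, z) positions in a local metric frame, in metres.
    Returns a (vx, vy, vz) velocity command, or None on arrival.
    """
    dx, dy, dz = (t - p for t, p in zip(target, pos))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist < arrive_radius:
        return None                      # waypoint reached
    return (speed * dx / dist, speed * dy / dist, speed * dz / dist)

print(velocity_command((0.0, 0.0, 150.0), (100.0, 50.0, 150.0)))
```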
  • FIG. 7 conceptually shows a state in which the drone 20 of this embodiment images an area to be observed and obtains a spectral image.
  • the imaging area of the spectral camera 22 in a single capture is, for example, a rectangular (square) area with sides of about 150 m, with the drone 20 at an altitude of about 150 m (a worked example follows below).
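  • a short worked calculation (using only the example figures above; the sensor width is an assumed value) shows what such a footprint implies for the camera's angle of view and ground resolution:

```python
import math

altitude_m = 150.0   # drone altitude from the example above
side_m = 150.0       # one side of the square ground footprint

# Full angle of view needed for a nadir-pointing camera to cover side_m:
fov_deg = 2 * math.degrees(math.atan((side_m / 2) / altitude_m))
print(f"required field of view: {fov_deg:.1f} deg")        # ~53.1 deg

# Ground sampling distance for an assumed 1024-pixel-wide sensor:
gsd_cm = side_m / 1024 * 100
print(f"ground sampling distance: {gsd_cm:.1f} cm/pixel")  # ~14.6 cm
```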
  • the satellite 30 of this embodiment orbits at a substantially constant speed in an orbit set above the earth.
  • the satellite 30 is equipped with a spectral camera control system 33 that includes a spectral camera control device 31 and a spectral camera 32 that is a first multispectral camera controlled by the spectral camera control device 31.
  • the configurations of the spectrum camera control device 31, spectrum camera 32, and spectrum camera control system 33 are similar to those of the spectrum camera control device 21, spectrum camera 22, and spectrum camera control system 23 of the drone 20. Therefore, descriptions of substantially the same configurations will be omitted, and the description will focus on the main differences.
  • the spectral camera control system 33 mainly includes a spectral camera 32 equipped with a liquid crystal tunable wavelength filter 34, an attitude position detector 35 that detects information on the attitude and position of the spectral camera 32, a liquid crystal tunable wavelength filter control circuit 36 that controls the liquid crystal tunable wavelength filter 34 of the spectral camera 32, the spectral camera control device 31 that controls the spectral camera 32, and a battery 37 that supplies power to each of these devices.
  • the spectral camera 32 captures spectral images using a snapshot method and mainly includes a lens group 320, a depolarizing plate 321 for converting polarized light into non-polarized light, the liquid crystal tunable wavelength filter 34, whose transmission wavelength can be selected arbitrarily, and an image sensor 322 that captures a two-dimensional spectral image.
  • the lens group 320 uses light refraction to transmit the light from the imaging target to the liquid crystal variable wavelength filter 34, and focuses the transmitted light on the image sensor 322.
  • the lens group 320 in this embodiment includes a light entrance lens 320a that condenses the light from the imaging target and makes it enter the liquid crystal tunable wavelength filter 34, and a condensing lens 320b that focuses the light that has passed through the liquid crystal tunable wavelength filter 34 (and thus contains only the transmission wavelength) onto the image sensor 322. Note that the type and number of lenses are not particularly limited and may be selected as appropriate depending on the performance of the spectral camera 32, etc.
  • the depolarizing plate 321 depolarizes light, converting it into non-polarized light.
  • the depolarizing plate 321 is provided on the light-incident side of the liquid crystal tunable wavelength filter 34 and depolarizes the light before it passes through the filter, thereby reducing the influence of polarization characteristics.
  • the liquid crystal variable wavelength filter 34 is an optical filter that can arbitrarily select a transmission wavelength from within a predetermined wavelength range. Although not shown, the liquid crystal variable wavelength filter 34 has a structure in which a plurality of plate-shaped liquid crystal elements and plate-shaped polarizing elements are stacked alternately. The alignment state of each liquid crystal element is independently controlled by the applied voltage supplied from the liquid crystal variable wavelength filter control circuit 36. Therefore, the liquid crystal variable wavelength filter 34 is configured to transmit light of any wavelength depending on the alignment state of the liquid crystal element and the combination of the polarizing element.
  • the width of the transmission wavelength of the liquid crystal tunable wavelength filter 34 is about 20 nm or less, the transmission center wavelength can be set in 1 nm steps, and the wavelength switching time is about 10 ms to several hundred ms.
  • the image sensor 322 captures a spectrum image using a snapshot method.
  • the image sensor 322 is a two-dimensional image sensor such as a CMOS image sensor or a CCD image sensor that can capture images within the field of view at the same timing. Further, the image sensor 322 is configured to perform imaging based on an imaging command signal transmitted from the spectral camera control device 31.
  • the liquid crystal variable wavelength filter control circuit 36 controls the liquid crystal variable wavelength filter 34.
  • specifically, the liquid crystal tunable wavelength filter control circuit 36 is configured to supply an applied voltage corresponding to a wavelength specifying signal to the liquid crystal elements of the liquid crystal tunable wavelength filter 34.
  • the wavelength specifying signal includes information on the transmission wavelength to be passed by the liquid crystal tunable wavelength filter 34; based on this transmission wavelength information, the liquid crystal tunable wavelength filter control circuit 36 determines which liquid crystal elements to supply the applied voltage to, and supplies the applied voltage to the specified liquid crystal elements.
  • the liquid crystal tunable wavelength filter control circuit 36 in this embodiment is configured independently of the other components such as the spectral camera control device 31, but it is not limited to this; it may instead be provided in the spectral camera control device 31 or the spectral camera 32.
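  • a wavelength sweep with such a filter might look like the following sketch (the `lctf` and `camera` driver objects and their methods are assumptions made for illustration; they do not correspond to a real or disclosed API):

```python
import time

def scan(lctf, camera, start_nm=450, stop_nm=900, step_nm=20, settle_s=0.3):
    """Sweep the tunable filter across the visible-to-NIR range, capturing
    one snapshot-method frame per wavelength band."""
    frames = {}
    for wl in range(start_nm, stop_nm + 1, step_nm):
        lctf.set_center_wavelength(wl)  # center settable in 1 nm steps
        time.sleep(settle_s)            # switching takes ~10 ms to several 100 ms
        frames[wl] = camera.capture()   # 2-D spectral image for this band
    return frames
```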
  • FIG. 9 shows a functional configuration realized by the spectrum camera control device 31 of the satellite 30.
  • the spectral camera control device 31 includes a storage section 310, a control section 311, and a communication section 312.
  • the communication unit 312 is configured by an unillustrated communication IF of the spectral camera control device 31.
  • the storage unit 310 is configured by an unillustrated main storage device and auxiliary storage device of the spectral camera control device 31.
  • the control unit 311 is mainly constituted by an unillustrated processor of the spectral camera control device 31.
  • the communication unit 312 communicates with the information processing device 10 and the like via a network N (not shown).
  • the storage unit 310 of the spectral camera control device 31 includes a first imaging condition DB (DataBase) 3101, an image DB 3102, and a spectral image 3103.
  • the database referred to here is a relational database, which manages data sets expressed as tables, structurally defined by rows and columns, in relation to each other.
  • in a relational database, the tabular data set is called a table, a column of the table is called a column, and a row of the table is called a record.
  • each table can have a column set as a primary key to uniquely identify a record, but setting a primary key on a column is not essential.
  • the control unit 311 can cause the processor to add, delete, or update records to a specific table stored in the storage unit 310 according to various programs.
  • the first imaging condition DB 3101 is a database of the conditions for imaging the ground surface with the spectral camera 32 mounted on the satellite 30; the image DB 3102 is a database for managing the spectral images obtained as a result of imaging by the spectral camera 32 according to the first imaging condition DB 3101; and the spectral image 3103 is a spectral image obtained as a result of such imaging. Details of the first imaging condition DB 3101 and the image DB 3102 will be described later.

<Configuration of control unit 311 of spectral camera control device 31>
  • the control unit 311 of the spectral camera control device 31 includes a reception control unit 3110, a transmission control unit 3111, an imaging determination unit 3112, an attitude position information acquisition unit 3113, an imaging control unit 3114, and a wavelength setting unit 3115.
  • the control unit 311 implements functional units such as the reception control unit 3110 by executing the application program 3100 stored in the storage unit 310.
  • the reception control unit 3110 controls the process by which the spectral camera control device 31 receives a signal from an external device according to a communication protocol.
  • the transmission control unit 3111 controls the process by which the spectral camera control device 31 transmits a signal to an external device according to a communication protocol.
  • the imaging determination unit 3112 determines whether or not to image the ground surface with the spectral camera 32 mounted on the satellite 30, based on the imaging start time and imaging end time stored in the first imaging condition DB 3101, and sends the determination result to the imaging control unit 3114.
  • the attitude and position information acquisition unit 3113 acquires the position information, altitude information, and attitude information of the satellite 30 detected by the attitude and position detector 35, and provides the acquisition results to the imaging control unit 3114.
  • the imaging control unit 3114 receives the determination result from the imaging determination unit 3112, and if the determination result indicates that imaging is to be performed, the imaging control unit 3114 images the ground surface with the spectral camera 32, and stores the captured spectral image 3103 in the storage unit 310. At the same time, the conditions under which the spectrum image 3103 was captured are stored in the image DB 3102.
  • FIG. 10 conceptually shows a state in which the satellite 30 of this embodiment images an area to be observed and obtains a spectral image.
  • the imaging area of the spectral camera 32 in a single capture is, for example, a rectangular (square) area with sides of about 2 to 10 km, with the satellite 30 at an altitude of about 500 km.
  • FIG. 11 is a block diagram showing the basic hardware configuration of the information processing device 10.
  • the information processing device 10 includes at least a processor 101, a main storage device 102, an auxiliary storage device 103, a communication IF (Interface) 104, an input IF 105, and an output IF 106. These are electrically connected to each other by a communication bus 107. Further, an input device 110 and an output device 111 are connected to the information processing device 10 via an input IF 105 and an output IF 106, respectively.
  • the processor 101 is hardware for executing a set of instructions written in a program.
  • the processor 101 includes an arithmetic unit, registers, peripheral circuits, and the like.
  • the main storage device 102 temporarily stores programs, the data processed by the programs, and the like, and is a volatile memory such as a DRAM (Dynamic Random Access Memory).
  • the auxiliary storage device 103 is a storage device for storing data and programs. Examples include flash memory, SSD (Solid State Drive), HDD (Hard Disc Drive), magneto-optical disk, CD-ROM, DVD-ROM, semiconductor memory, and the like.
  • the communication IF 104 is an interface for inputting and outputting signals for communicating with other computers via a network using a wired or wireless communication standard.
  • the input IF 105 functions as an interface with the input device 110 for receiving input operations from the operator of the information processing device 10.
  • the output IF 106 functions as an interface with the output device 111 for presenting information to the operator.
  • the input device 110 is an input device (for example, a touch panel, a touch pad, a pointing device such as a mouse, a keyboard, etc.) for receiving input operations from an operator.
  • the output device 111 is an output device (display, speaker, etc.) for presenting information to the operator.
  • the information processing device 10 can be virtually realized by distributing all or part of each hardware configuration to a plurality of computers and interconnecting them via a network. In this way, the information processing device 10 is a concept that includes not only a computer housed in a single housing or case, but also a virtualized computer system.
  • FIG. 12 shows a functional configuration realized by the hardware configuration of the information processing device 10.
  • the information processing device 10 includes a storage section 120, a control section 130, and a communication section 140.
  • the communication unit 140 is configured by the communication IF 104
  • the storage unit 120 is configured by the main storage device 102 and the auxiliary storage device 103 of the information processing device 10
  • the control unit 130 is mainly configured by the processor 101 of the information processing device 10.
  • the communication unit 140 communicates with the drone 20 and the like via the network N.
  • the storage unit 120 of the information processing device 10 includes a flight plan DB (DataBase) 122, an image DB 123, a wide-area image DB 124, teacher data 125, a learning model 126, a spectral image 127, and a wide-area spectral image 128.
  • the flight plan DB 122, image DB 123, and wide area image DB 124 are databases.
  • the database referred to here is a relational database, which manages data sets expressed as tables, structurally defined by rows and columns, in relation to each other.
  • in a relational database, the tabular data set is called a table, a column of the table is called a column, and a row of the table is called a record.
  • each table can have a column set as a primary key to uniquely identify a record, but setting a primary key on a column is not essential.
  • the control unit 130 can cause the processor 101 to add, delete, or update records to a specific table stored in the storage unit 120 according to various programs.
  • the flight plan DB 122 is similar to the flight plan DB 2101 stored in the storage unit 210 of the drone 20.
  • the image DB 123 is similar to the image DB 2103 stored in the storage unit 210 of the drone 20 and the image DB 3102 stored in the storage unit 310 of the satellite 30.
  • an image DB 123 that is a combination of the image DB 2103 and the image DB 3102 is stored in the storage unit 120.
  • the wide-area image DB 124 is a database for managing the wide-area spectral image 128 generated by the image synthesis unit 136 of the control unit 130 based on the spectral image 127.
  • the control unit 130 of the information processing device 10 includes a reception control unit 131, a transmission control unit 132, a learning model generation unit 133, a flight plan creation unit 134, a flying object control unit 135, an image synthesis unit 136, an area identification unit 137, and a type identification unit 138.
  • the control unit 130 realizes functional units such as the reception control unit 131 by executing the application program 121 stored in the storage unit 120.
  • the reception control unit 131 controls a process in which the information processing device 10 receives a signal from an external device according to a communication protocol.
  • the transmission control unit 132 controls a process in which the information processing device 10 transmits a signal to an external device according to a communication protocol.
  • the learning model generation unit 133 generates the teacher data 125 based on the spectral images 127 taken by the spectral camera 32 mounted on the satellite 30 and the image DB 123 sent from the satellite 30, as well as the spectral images 127 taken by the spectral camera 22 mounted on the drone 20 and the image DB 123 sent from the drone 20, and generates the learning model 126 based on this teacher data 125.
  • the details of the operation of the learning model generation unit 133 are described below for the case of using the spectral images 127 taken by the spectral camera 32 mounted on the satellite 30. The operation is similar when using the spectral images 127 taken by the spectral camera 22 mounted on the drone 20.
  • the spectral camera 32 mounted on the satellite 30 images the same area of the earth's surface for each of the wavelength bands obtained by subdividing a wide wavelength range, so that a large number of spectral images 3103 (spectral images 127) of the same area, each with a different wavelength band, are acquired.
  • the learning model generation unit 133 presents the operator of the information processing device 10 with the plurality of spectral images 127 of the same region with different wavelength bands, and has the operator specify the spectral image 127 of a wavelength band in which areas where crops are in a specific state in this region can be clearly distinguished.
  • next, the learning model generation unit 133 has the operator specify, for the spectral image 127 of the specified wavelength band, the specific state of the crop that can be identified from that wavelength band, thereby associating the spectral image with the specific state of the crop. The learning model generation unit 133 then stores the specified wavelength band, the spectral image 127 of the specified wavelength band, the specified specific state of the crop, and their relationship in the storage unit 120 as the teacher data 125.
  • from the original images taken of the same area of the ground surface (in this embodiment, multiple spectral images 127 can be acquired for the same area, but to simplify the explanation only one original image is considered), spectral images 127 of a plurality of wavelength bands can be acquired (FIG. 23 shows three wavelengths, but the number of wavelength bands is not limited). Among these spectral images 127, a density distribution may appear for one of the wavelengths that differs from the distributions for the other wavelengths.
  • since each of these is a spectral image 127 for a specific wavelength band, when displayed on a display or the like it appears as a pattern of black-and-white shading (that is, a grayscale display).
  • the operator of the information processing device 10 knows in advance (for example, through a field survey) what specific state the crops in this ground surface area are in. Looking at the distinctive grayscale distribution that appears at one wavelength but not at the others (indicated by hatching in FIG. 23), the operator inputs the association between this grayscale distribution and the specific state of the crop.
  • the learning model generation unit 133 refers to the image DB 123 and calculates the angle between the earth's surface and the sun when the spectral image of the specified wavelength band is captured. This is because the spectrum of light reflected from crops differs depending on the angle of the sun.
  • the learning model generation unit 133 similarly stores the angle of the sun in the storage unit 120 as the teacher data 125.
  • more preferably, based on the calculated solar angle, the learning model generation unit 133 calculates a correction formula (or correction value) for converting the spectral image into the spectral image that would have been captured at the solar angle of a predetermined time, generates the teacher data 125 using the spectral image corrected with this correction formula, and stores it in the storage unit 120.
  • in this case, the correction formula itself may be stored in the learning model 126 described later, or may be stored separately in the storage unit 120. However, correction based on the solar angle is not essential.
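  • one way to obtain the solar angle from the imaging time and position, and to apply a simple correction, is sketched below (a textbook low-accuracy solar-position approximation and a plain cosine-style scaling; the patent does not disclose its specific correction formula):

```python
import math
from datetime import datetime, timezone

def solar_elevation_deg(lat_deg, lon_deg, t_utc):
    """Approximate solar elevation (degrees); ignores the equation of time."""
    doy = t_utc.timetuple().tm_yday
    hour = t_utc.hour + t_utc.minute / 60 + t_utc.second / 3600
    decl = -23.44 * math.cos(math.radians(360 / 365 * (doy + 10)))
    hour_angle = 15 * (hour + lon_deg / 15 - 12)       # degrees from solar noon
    lat, d, h = map(math.radians, (lat_deg, decl, hour_angle))
    sin_el = (math.sin(lat) * math.sin(d)
              + math.cos(lat) * math.cos(d) * math.cos(h))
    return math.degrees(math.asin(sin_el))

def normalize_to_reference(reflectance, elev_deg, ref_elev_deg):
    """Cosine-style correction: rescale as if lit at the reference sun angle."""
    return reflectance * (math.sin(math.radians(ref_elev_deg))
                          / math.sin(math.radians(elev_deg)))

t = datetime(2023, 4, 14, 3, 0, tzinfo=timezone.utc)   # around noon in Japan
print(solar_elevation_deg(35.68, 139.77, t))           # roughly 63 deg
```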
  • the learning model generation unit 133 generates a learning model 126 in the artificial intelligence technology based on the teacher data 125 generated by the above-described procedure.
  • as for the method of generating the learning model 126 and the type of the learning model 126, known methods can be suitably applied, so further explanation is omitted here.
  • the learning model generation unit 133 repeatedly performs the above-described teacher data 125 generation procedure to update the teacher data 125, and retrains the learning model 126 after the teacher data 125 is updated or at predetermined time intervals. This improves the accuracy with which the type identification unit 138 (described later) identifies that crops are in a particular state.
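  • generation of the learning model 126 from the teacher data 125 might be sketched as follows (the band vectors and labels are invented, and a random forest is used here purely as one instance of the "known methods" mentioned above):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical teacher data: per-pixel band vectors from operator-labelled
# regions of the spectral images, paired with the operator's state labels.
X = np.array([[0.08, 0.05, 0.30, 0.45],   # healthy samples
              [0.09, 0.06, 0.28, 0.44],
              [0.10, 0.09, 0.18, 0.25],   # disease samples
              [0.11, 0.08, 0.17, 0.24]])
y = np.array(["healthy", "healthy", "disease_X", "disease_X"])

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)   # the trained model plays the role of the learning model

# Retraining after the teacher data is updated is simply a new fit.
print(model.predict([[0.10, 0.08, 0.19, 0.26]]))   # -> ['disease_X']
```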
  • at the beginning of operation of the system 1, it is preferable to acquire spectral images 127 at a limited solar angle and to generate the teacher data 125 and the learning model 126 based on these spectral images 127. That is, the teacher data 125 and the learning model 126 are initially generated based on spectral images 127 at a limited solar angle; thereafter, spectral images are acquired at various solar angles from the spectral camera 32 mounted on the satellite 30 and/or the spectral camera 22 mounted on the drone 20, teacher data 125 that includes the influence of the solar angle is generated, and the learning model 126 is retrained based on this teacher data 125. As a result, the start of operation of the system 1 can be brought forward, and the accuracy of the identification results by the type identification unit 138 can be progressively improved. In addition, the actual specific state of the crops may be compared with the identification results by the type identification unit 138 (described later) and fed back to create or modify the teacher data 125, and the learning model 126 retrained; this also allows the accuracy of the identification results by the type identification unit 138 to be progressively improved.
  • The learning model generation unit 133 preferably acquires topographical data of the ground surface imaged by the spectral camera 32 of the satellite 30 and/or the spectral camera 22 of the drone 20, identifies, based on this topographical data, areas where crops are estimated to be in a specific state, extracts the spectral images 127 obtained by imaging these areas with the spectral cameras 22 and 32, and presents them to the operator. Whether a crop is in a particular state often depends on the terrain; as an example, it is known that crops are more susceptible to disease if the humidity of the area where they are grown is high.
  • Accordingly, the learning model generation unit 133 acquires topographical data of the ground surface, estimates from this data the environment (sunlight, humidity, wind direction, etc.) of the places where crops are grown, determines whether this environment is prone to producing the specific state, and presents to the operator the spectral images 127 obtained by imaging such locations.
  • the terrain data may be stored in advance in the storage unit 120 of the information processing device 10, or may be obtained from an external service.
  • It is also possible to identify whether a crop is in a particular state without using the learning model generation unit 133, the teacher data 125, and the learning model 126, based solely on the relationships among the specified wavelength band, the spectral image 127 of that band, the specific state of the crop, and the sun angle at the time the spectral image 127 was captured.
  • The flight plan creation unit 134 creates a flight plan for flying the drone 20 over the observation target area and imaging it with the spectrum camera 22 mounted on the drone 20, and stores the plan in the storage unit 120 as the flight plan DB 122. The flight plan creation unit 134 then transmits the created flight plan DB 122 to the drone 20. Note that when the flight of the drone 20 is controlled (the drone 20 is piloted) by the flying object control unit 135 described later, the flight plan creation unit 134 and the flight plan DB 122 need not be provided.
  • The flight plan creation unit 134 preferably creates a flight plan in which multiple drones 20 fly simultaneously over the observation target area so that the spectrum cameras 22 mounted on these drones image the area in parallel.
  • The flight plan creation unit 134 acquires topographical data of the area to be observed and creates the flight plan based on this data. Whether a crop is in a particular state often depends on the terrain; for example, crops are more susceptible to disease if the humidity of the area where they are grown is high. Therefore, the flight plan creation unit 134 estimates from the topographical data the environment (sunlight, humidity, wind direction, etc.) of the places where crops are grown, determines whether this environment is prone to producing the specific state, flies the drone 20 with a focus on such places, and acquires spectral images with the spectral camera 22.
  • the terrain data may be stored in advance in the storage unit 120 of the information processing device 10, or may be obtained from an external service.
  • The flight plan creation unit 134 may create the flight plan by having the operator of the information processing device 10 individually designate the flight target positions of the drone 20, or the operator may designate only the area to be observed and the unit 134 may generate the flight target positions from the flight speed of the drone 20 and the spectral image acquisition rate of the spectrum camera 22 (see the sketch below).
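As an illustration of generating flight target positions from the drone's speed and the camera's acquisition rate, the sketch below lays out a simple back-and-forth sweep. The waypoint record mirrors the elapsed-time/position/altitude columns described later for the flight plan DB, but the sweep pattern, spacing rule, and field names are assumptions, not the patent's actual format.

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    elapsed_s: float   # elapsed time from flight start ("elapsed time" column)
    x_m: float         # flight position within the observation area
    y_m: float
    alt_m: float       # flight altitude

def plan_sweep(area_w_m, area_h_m, speed_mps, frames_per_s, swath_m, alt_m):
    # One waypoint per exposure: spacing along track = speed / frame rate,
    # adjacent passes separated by the camera swath width.
    step_m = speed_mps / frames_per_s
    plan, t = [], 0.0
    y, direction = 0.0, 1
    while y <= area_h_m:
        xs = list(range(0, int(area_w_m) + 1, max(int(step_m), 1)))
        for x in (xs if direction > 0 else reversed(xs)):
            plan.append(Waypoint(t, float(x), y, alt_m))
            t += step_m / speed_mps
        y += swath_m
        direction *= -1   # boustrophedon sweep
    return plan
```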
  • The flying object control unit 135 controls the flight of the drone 20 based on the flight plan DB 122. However, if the drone 20 can fly autonomously based on the flight plan DB 2101 stored in its own storage unit 210, the flying object control unit 135 may be omitted. Alternatively, if the information processing device 10 pilots the drone 20, the flying object control unit 135 operates the drone 20 based on control signals input by the operator through a remote controller, which is an example of the input device 110 of the information processing device 10.
  • The image synthesis unit 136 stitches together the spectral images 2104 and 127 captured by the spectral camera 22 of the drone 20 to generate a wide-range spectral image 128 that preferably covers the entire area to be observed. This is because the imaging range of the spectral camera 22 of the drone 20 is often narrower than the area over which it is necessary to identify whether crops are in a specific state, so it is preferable to generate a single wide-range spectral image 128 for that identification processing.
  • The image synthesis unit 136 then stores the generated wide spectrum image 128 in the storage unit 120 and updates the wide range image DB 124. Similarly, the image synthesis unit 136 stitches together the spectral images 3103 and 127 captured by the spectral camera 32 of the satellite 30 to generate a wide spectrum image 128 that preferably covers a wide range of the ground surface.
  • The image synthesis unit 136 generates the wide spectrum image 128 based on a known orthoimage generation method.
  • An orthoimage eliminates positional displacement within an aerial photograph and converts it into an image displayed at the correct size and position without tilt, as if viewed from directly above, like a map (hereinafter, "orthographic conversion").
  • To orthographically convert an aerial photograph, positions on the photograph must be matched to horizontal positions on the ground. This orthographic conversion is performed using a digital elevation model (elevation data) that represents the three-dimensional shape of the earth's surface. Since the method of generating orthoimages is known, further explanation is omitted here.
  • FIG. 22 is a diagram showing an example of image synthesis of the spectrum images 2104 and 127 by the image synthesis unit 136.
  • Spectral images 127 obtained by one drone 20 imaging an area 2200 (square in the figure, but not limited to this) are arranged horizontally in the figure with a predetermined overlap, and these spectral images 127 are stitched to generate a wide spectrum image 128. The same operation is performed on the spectral images 127 obtained by imaging area 2201 with another drone 20, yielding a wide-area spectral image 128 that covers the area to be observed (a minimal stitching sketch follows).
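At its simplest, the stitching in FIG. 22 can be pictured as placing equally sized tiles with a known overlap and averaging the overlapping columns, as in the sketch below. Real orthoimage generation (described above) additionally uses a digital elevation model and image registration; the fixed-offset placement here is an illustrative assumption.

```python
import numpy as np

def stitch_row(tiles, overlap_px):
    # tiles: list of equally sized 2-D arrays captured left to right with a
    # known horizontal overlap. Overlapping columns are averaged.
    h, w = tiles[0].shape
    step = w - overlap_px
    out = np.zeros((h, step * (len(tiles) - 1) + w), dtype=np.float32)
    weight = np.zeros_like(out)
    for i, tile in enumerate(tiles):
        out[:, i * step:i * step + w] += tile
        weight[:, i * step:i * step + w] += 1.0
    return out / np.maximum(weight, 1.0)
```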
  • The area specifying unit 137 refers to the wide spectrum image 128 created by the image synthesizing unit 136 and receives, from the operator of the information processing device 10, a designation of the area in which the type specifying unit 138 is to determine whether crops are in a specific state.
  • The type identification unit 138 identifies, based on the wide spectrum image 128 and the learning model 126, whether crops in the area identified by the area identification unit 137 are in a specific state and, if so, which state. At this time, the type specifying unit 138 may receive from the operator of the information processing device 10 an input designating which specific state is to be identified, and determine whether crops in the area specified by the area specifying unit 137 are in that state.
  • The type identifying unit 138 calculates the angle between the earth's surface and the sun at the time the spectral images 127 underlying the wide-spectrum image 128 were captured, and, using the sun angle correction formula, corrects the spectrum so that the wide spectrum image 128 appears as if captured at the predetermined time.
  • The type identification unit 138 displays the identification result on a display, which is an example of the output device 111.
  • For example, the type identification unit 138 refers to a map database to display map data of the area to be observed, and superimposes on this map data the areas where crops are in a specific state.
  • The type specifying unit 138 may also receive an input specifying a time axis from the operator of the information processing device 10 and display how the region of crops in a specific state spreads along that time axis. Furthermore, the type identifying unit 138 may calculate and display on the display the cost required to remedy the specific state of crops identified as being in that state (for example, cutting down the affected crops or spraying pesticides) and the estimated amount of damage.
  • FIG. 13 is a diagram showing the data structure of the first imaging condition DB 3101 stored in the storage unit 310 of the satellite 30.
  • the first imaging condition DB 3101 is a table having columns of imaging start time, imaging end time, and wavelength condition using an imaging condition ID as a key for specifying imaging conditions by the spectrum camera 32 of the satellite 30.
  • the "imaging condition ID” is information for specifying the imaging condition by the spectrum camera 32 of the satellite 30.
  • the “imaging start time” is information regarding the time at which the spectrum camera 32 of the satellite 30 starts imaging.
  • the “imaging end time” is information regarding the time when imaging by the spectrum camera 32 of the satellite 30 ends.
  • “Wavelength conditions” is information regarding wavelength conditions when imaging is performed by the spectrum camera 32 of the satellite 30.
  • The spectral camera 32 mounted on the satellite 30 is capable of capturing images in multiple wavelength bands (and can therefore generate spectral images in multiple wavelength bands); in the example shown in FIG. 13, information about the range of the wavelength band and the wavelength width used when changing the band is stored.
  • Each column in the first imaging condition DB 3101 is generated by an information processing device including the information processing device 10, and is stored in the storage unit 310 by being sent to the satellite 30.
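Read as a schema, a row of the first imaging condition DB 3101 might look as follows. The field names mirror the columns just described; the concrete types and the representation of the wavelength condition as a start/end/step sweep in nanometres are assumptions.

```python
from dataclasses import dataclass

@dataclass
class ImagingCondition:          # one row of the first imaging condition DB 3101
    imaging_condition_id: str    # key identifying the imaging condition
    imaging_start_time: str      # ISO 8601 time at which imaging starts
    imaging_end_time: str        # time at which imaging ends
    wl_start_nm: float           # wavelength condition: band range start
    wl_end_nm: float             # band range end
    wl_step_nm: float            # width used when changing the wavelength band

def wavelength_bands(cond: ImagingCondition):
    # Enumerate the bands the camera sweeps through under this condition.
    wl = cond.wl_start_nm
    while wl <= cond.wl_end_nm:
        yield wl
        wl += cond.wl_step_nm
```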
  • FIG. 14 is a diagram showing the data structure of the image DBs 2103, 3102, and 123 stored in the storage unit 210 of the drone 20, the storage unit 310 of the satellite 30, and the storage unit 120 of the information processing device 10, respectively.
  • Image DBs 2103, 3102, and 123 are tables that use an image ID as a key for identifying the spectral images captured by the spectrum cameras 22 and 32 of the drones 20 and satellites 30, with columns of image file name, imaging time, position information, altitude information, attitude information, and wavelength.
  • Image ID is information for identifying the spectrum image captured by the spectrum cameras 22 and 32 of the drone 20 and the satellite 30.
  • “Image file name” is the file name of the spectral images 2104, 3103, and 127 identified by the image ID and stored in the storage unit 210 of the drone 20, the storage unit 310 of the satellite 30, and the storage unit 120 of the information processing device 10. This is information indicating.
  • "Imaging time” is information indicating the time when the spectrum images 2104, 3103, and 127 specified by the image ID were captured.
  • “Position information” is the position information of the drone 20 and the satellite 30 when the spectrum images 2104, 3103, and 127 specified by the image ID were captured.
  • Altitude information is altitude information of the drone 20 and the satellite 30 when the spectrum images 2104, 3103, and 127 specified by the image ID were captured.
  • Attitude information is attitude information of the drone 20 and the satellite 30 when the spectrum images 2104, 3103, and 127 specified by the image ID were captured.
  • "Wavelength" is information indicating the wavelength band set in the spectrum cameras 22 and 32 of the drone 20 and the satellite 30 when the spectrum images 2104, 3103, and 127 specified by the image ID were captured.
  • Each column in the image DBs 2103, 3102, and 123 is generated by the spectral camera control devices 21 and 31 of the drone 20 and the satellite 30 when the spectral images 2104 and 3103 are generated by the spectral cameras 22 and 32.
  • FIG. 15 is a diagram showing the data structure of the second imaging condition DB 2102 stored in the storage unit 210 of the drone 20.
  • the second imaging condition DB 2102 is a table that has columns of imaging position, imaging altitude, and imaging posture, using an imaging condition ID as a key for specifying imaging conditions by the spectrum camera 22 of the drone 20.
  • the "imaging condition ID” is information for specifying the imaging condition by the spectrum camera 22 of the drone 20.
  • “Imaging position” is position information of the drone 20.
  • “Imaging altitude” is altitude information of the drone 20.
  • “Imaging posture” is posture information of the drone 20.
  • Each column in the second imaging condition DB 2102 is generated by an information processing device including the information processing device 10, and is stored in the storage unit 210 by being sent to the drone 20.
  • FIG. 16 is a diagram showing the data structure of the flight plan DBs 122 and 2101 stored in the storage unit 120 of the information processing device 10 and the storage unit 210 of the drone 20.
  • The flight plan DBs 122 and 2101 are tables having columns of elapsed time, flight position, flight altitude, and flight attitude, using a flight plan ID for specifying the flight plan of the drone 20 as a key.
  • the "flight plan ID” is information for specifying the flight plan of the drone 20, more specifically, the goal of the drone 20.
  • the "elapsed time” is information indicating the elapsed time from the start of the flight of the drone 20 when the drone 20 should reach the destination specified by the flight plan ID.
  • "Flight position” is target position information that the drone 20 should reach in the target of the drone 20 specified by the flight plan ID.
  • “Flight altitude” is target altitude information that the drone 20 should reach in the target destination specified by the flight plan ID.
  • “Flight attitude” is target attitude information that the drone 20 should reach at the goal specified by the flight plan ID.
  • Each column in the flight plan DBs 122 and 2101 is generated by the flight plan creation unit 134 of the control unit 130 of the information processing device 10, stored in the storage unit 120, and sent to the drone 20.
  • The spectral camera control device 21 of the drone 20 stores the transmitted flight plan DB 122, 2101 in the storage unit 210.
  • FIG. 17 is a diagram showing the data structure of the wide area image DB 124 stored in the storage unit 120 of the information processing device 10.
  • The wide-area image DB 124 is a table that uses the wide-area image ID for specifying the wide-area spectrum image 128 stored in the storage unit 120 of the information processing device 10 as a key, with columns of wide-area image file name, imaging time, regional information, sun angle, position information, altitude information, attitude information, wavelength, and image ID.
  • the “wide area image ID” is information for identifying the wide area spectrum image 128 stored in the storage unit 120 of the information processing device 10.
  • the “wide area image file name” is information indicating the file name of the wide area spectrum image 128 that is specified by the wide area image ID and stored in the storage unit 120 of the information processing device 10.
  • Imaging time is information indicating the time when the wide spectrum image 128 specified by the wide area image ID was captured.
  • The wide spectrum image 128 is a composite of a plurality of spectrum images 127, and strictly speaking the imaging times of the individual spectrum images 127 differ. Therefore, when synthesizing a plurality of spectral images 127 into a wide spectrum image 128, the image synthesis unit 136 representatively uses the imaging time of one of the source spectral images 127 as the imaging time of the wide spectrum image 128. The same is done for "sun angle", "position information", "altitude information", and "attitude information".
  • "Regional information” is information indicating the area where the wide spectrum image 128, which is specified by the wide area image ID and stored in the storage unit 120 of the information processing device 10, was captured.
  • "Sun angle” is information indicating the angle of the sun when the wide spectrum image 128 specified by the wide area image ID was captured. At the beginning of operation of the system 1, the value of the "solar angle” field may be blank.
  • "Position information" is position information of the drone 20 when the wide spectrum image 128 specified by the wide area image ID was captured.
  • “Altitude information” is altitude information of the drone 20 when the wide spectrum image 128 specified by the wide area image ID was captured.
  • "Attitude information" is attitude information of the drone 20 when the wide spectrum image 128 specified by the wide area image ID was captured.
  • "Wavelength" is information indicating the wavelength band in which the wide spectrum image 128 specified by the wide area image ID was captured.
  • Image ID is information for specifying a plurality of spectral images 127 from which the wide-range spectrum image 128 specified by the wide-range image ID is synthesized, and is common to the image ID of the image DB 2103.
  • Each column in the wide-area image DB 124 is generated by the image synthesis unit 136 of the control unit 130 of the information processing device 10 and stored in the storage unit 120.
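The representative-value rule described for the wide-area image DB 124 can be summarised in a few lines. The sketch below assumes dict-shaped rows keyed like the columns above; the helper name and row format are hypothetical.

```python
def make_wide_area_record(wide_id, filename, region, tile_records):
    # tile_records: rows of the image DB (2103/123) for the source tiles.
    rep = tile_records[0]   # representative tile; unit 136 picks one source
    return {
        "wide_area_image_id": wide_id,
        "wide_area_image_file_name": filename,
        "imaging_time": rep["imaging_time"],        # representative values
        "regional_information": region,
        "sun_angle": rep.get("sun_angle"),          # may be blank at first
        "position_information": rep["position_information"],
        "altitude_information": rep["altitude_information"],
        "attitude_information": rep["attitude_information"],
        "wavelength": rep["wavelength"],
        "image_ids": [r["image_id"] for r in tile_records],
    }
```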
  • FIG. 18 is a flowchart showing spectral image capturing processing by the spectral camera control device 31 of the satellite 30.
  • the spectral camera control device 31 refers to the first imaging condition DB 3101 and waits for the imaging start time described in the first imaging condition DB 3101 to arrive (S1800).
  • the spectral camera control device 31 refers to the first imaging condition DB 3101, sets the wavelength band for imaging by the spectral camera 32 (S1801), and uses the set wavelength band.
  • the ground surface is imaged by the spectral camera 32, and a spectral image 3103 is acquired and stored in the storage unit 310, and the conditions at the time of imaging are stored in the image DB 3102 of the storage unit 310 (S1802).
  • The spectral camera control device 31 refers to the first imaging condition DB 3101 and determines whether the imaging operation in S1802 has been completed for all wavelength bands (S1803). If it is determined that the imaging operation has been completed for all wavelength bands (YES in S1803), the process advances to S1804; if it is determined that there is a wavelength band that has not yet been imaged (NO in S1803), the process returns to S1801, sets the next wavelength band, and performs the subsequent processing.
  • the spectral camera control device 31 refers to the first imaging condition DB 3101 and determines whether the imaging end time has arrived. If it is determined that the imaging end time has come (YES in S1804), the process advances to S1805, and if it is determined that the imaging end time has not yet arrived (NO in S1804), the process returns to S1800 and waits for the next imaging start time.
  • The spectral camera control device 31 determines whether the imaging operation has been completed for all imaging conditions stored in the first imaging condition DB 3101. If it is determined that the imaging operation has ended for all imaging conditions (YES in S1805), the spectrum images 3103 and the image DB 3102 stored in the storage unit 310 are transmitted to the information processing apparatus 10 (S1806); if it is determined that there is an imaging condition for which the imaging operation has not yet been completed (NO in S1805), the process returns to S1800 and waits for the next imaging start time.
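The control flow of FIG. 18 can be condensed into the following sketch, reusing the wavelength_bands helper from the schema sketch above. The camera, storage, and downlink objects are injected stand-ins whose methods are assumptions; only the loop structure follows the flowchart (S1800 to S1806).

```python
import time
from datetime import datetime, timezone

def satellite_imaging_loop(conditions, camera, storage, downlink,
                           now=lambda: datetime.now(timezone.utc)):
    # conditions: iterable of ImagingCondition rows (first imaging condition DB 3101).
    # camera / storage / downlink: hypothetical objects with the methods used below.
    for cond in conditions:
        while now().isoformat() < cond.imaging_start_time:   # S1800: wait for start
            time.sleep(1)
        while now().isoformat() < cond.imaging_end_time:     # S1804: until end time
            for band in wavelength_bands(cond):              # S1801/S1803: all bands
                camera.set_band(band)
                storage.save(camera.capture(), band, now())  # S1802: image 3103 + DB 3102
    downlink.send(storage.images(), storage.image_db())      # S1805/S1806: transmit
```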
  • FIG. 19 is a flowchart showing learning model generation processing by the learning model generation unit 133 of the control unit 130 of the information processing device 10.
  • The learning model generation unit 133 refers to the image DB 123 in the storage unit 120 and identifies and selects the location (area or region) where the spectral images 127 from which the learning model is to be generated were captured (S1900). Next, among the spectral images 127 stored in the storage unit 120, the learning model generation unit 133 identifies and selects the wavelength band in which the spectral images 127 from which the learning model is to be generated were captured (S1901).
  • The learning model generation unit 133 extracts from the storage unit 120 the spectral images 127 that match the conditions specified and selected in S1900 and S1901, and displays them on the display, which is an example of the output device 111 (S1902).
  • The learning model generation unit 133 refers to the image DB 123, obtains from the "imaging time" item the time at which the spectral image 127 extracted in S1902 was captured, and from this imaging time calculates the angle between the earth's surface and the sun (sun angle) at capture. Next, the learning model generation unit 133 calculates a correction formula (a correction value may also be used) for the spectrum this spectral image 127 would have if captured at a specific time (for example, 12:00 noon), and corrects the spectrum of the spectral image 127 using this formula (S1903). The correction formula and the like are stored as part of the learning model 126.
  • The learning model generation unit 133 receives, from the operator of the information processing device 10, a selection of a region to be learned in the spectral image 127 displayed on the display, made with a pointing device such as a mouse, which is an example of the input device 110 (S1904). Then, for the region selected in S1904, the learning model generation unit 133 receives from the operator, again via the pointing device, a designation of which specific state the crops are in (S1905).
  • The learning model generation unit 133 receives the inputs of S1900 to S1905 from the operator of the information processing device 10 and waits for an instruction indicating that the various inputs for generating the learning model have been completed (S1906).
  • When the instruction is received (YES in S1906), the learning model generation unit 133 generates the teacher data 125 based on the inputs of S1900 to S1905 and the image data of the target spectral images 127 (S1907).
  • the learning model generation unit 133 then generates the learning model 126 based on the teacher data 125 generated in S1907 (S1908).
  • the learning model generation unit 133 stores the teacher data 125 and the learning model 126 generated in S1907 and S1908 in the storage unit 120.
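S1900 to S1908 amount to assembling labelled per-pixel spectra and fitting a model. In the sketch below the operator's inputs (region mask, state label) become function arguments, and a generic scikit-learn classifier stands in for the unspecified learning model 126; the patent leaves the model type open, so all of this is illustrative. Pixels outside the selected region are given a "normal" label so the classifier has both classes to learn from.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def build_teacher_data(spectral_images, region_mask, state_label, normal_label=0):
    # spectral_images: dict {wavelength_nm: 2-D array} for one location (S1900/S1901).
    # region_mask: boolean array marking the operator-selected region 2405 (S1904).
    # state_label: integer code for the designated specific state (S1905).
    bands = sorted(spectral_images)
    stack = np.stack([spectral_images[b] for b in bands], axis=-1)
    X = np.concatenate([stack[region_mask], stack[~region_mask]])
    y = np.concatenate([np.full(region_mask.sum(), state_label),
                        np.full((~region_mask).sum(), normal_label)])
    return X, y                                    # teacher data 125 (S1907)

def train_model(X, y):
    model = RandomForestClassifier(n_estimators=100)   # stand-in for model 126
    model.fit(X, y)                                    # S1908
    return model
```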
  • FIG. 20 is a flowchart showing spectral image capturing processing by the spectral camera control device 21 of the drone 20.
  • the spectrum camera control device 21 flies the drone 20 so as to sequentially reach the target position according to the flight plan DB 2101 stored in the storage unit 210 (S2000).
  • The spectral camera control device 21 refers to the second imaging condition DB 2102 stored in the storage unit 210 and waits for the drone 20 to reach an imaging position described in the second imaging condition DB 2102 (S2001). When the drone 20 reaches the imaging position (YES in S2001), the spectral camera control device 21 causes the spectral camera 22 to image the area to be observed, stores the obtained spectral image 2104 in the storage unit 210, and stores the conditions at the time of imaging in the image DB 2103 of the storage unit 210 (S2002).
  • The spectral camera control device 21 determines whether the imaging operation has been completed for all imaging positions stored in the second imaging condition DB 2102 (S2003). If it is determined that the imaging operation has ended for all imaging positions (YES in S2003), the spectrum images 2104 and the image DB 2103 stored in the storage unit 210 are transmitted to the information processing apparatus 10 (S2004); if it is determined that there is an imaging position at which imaging has not yet been completed (NO in S2003), the process returns to S2001 and waits for the next imaging position to be reached.
  • the spectrum camera control device 21 ends the flight of the drone 20 (S2005).
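FIG. 20 in the same style: the flight controller, camera, and uplink are injected stand-ins, and the tolerance used to decide that an imaging position has been reached is an assumption (S2000 to S2005).

```python
import math

def drone_imaging_loop(flight_plan, imaging_positions, drone, camera,
                       storage, uplink, tol_m=2.0):
    # flight_plan: waypoints from flight plan DB 2101 (S2000).
    # imaging_positions: rows of the second imaging condition DB 2102,
    # assumed to carry x_m / y_m coordinates; drone.position() returns (x, y).
    pending = list(imaging_positions)
    for wp in flight_plan:
        drone.fly_to(wp)                                   # S2000
        pos = drone.position()
        for p in list(pending):                            # S2001: position reached?
            if math.dist(pos, (p.x_m, p.y_m)) <= tol_m:
                storage.save(camera.capture(), pos)        # S2002: image 2104 + DB 2103
                pending.remove(p)
        if not pending:                                    # S2003: all positions done
            break
    uplink.send(storage.images(), storage.image_db())      # S2004: transmit
    drone.land()                                           # S2005: end flight
```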
  • FIG. 21 is a flowchart illustrating processing performed by the area specifying unit 137 and type specifying unit 138 of the control unit 130 of the information processing device 10 to specify an area where crops are in a specific state.
  • The area specifying unit 137 refers to the image DB 123 in the storage unit 120 and identifies and selects the position (area or region) where the spectral images 127, which are the source for specifying the areas in which crops are in a specific state, were captured (S2100).
  • The type identification unit 138 extracts from the storage unit 120 the spectral images 127 that match the conditions specified and selected in S2100, and displays them on the display, which is an example of the output device 111 (S2101).
  • the image synthesis unit 136 generates a wide spectrum image 128 based on the spectrum image 127 extracted in S2101 (S2102), and displays the generated wide spectrum image 128 on a display or the like (S2103). Note that the operation of generating the wide spectrum image 128 by the image synthesis unit 136 may be performed prior to the processing shown in FIG. 21.
  • The type specifying unit 138 refers to the image DB 123, obtains from the "imaging time" item the time at which the spectral images 127 underlying the wide spectrum image 128 were captured, and from this imaging time calculates the angle between the earth's surface and the sun (sun angle) at capture. Then, based on the learning model 126 or the correction formula stored in the storage unit 120, the type identification unit 138 corrects the spectrum of each spectral image 127 so that the wide spectrum image 128 as a whole has the spectrum it would have if captured at a specific time (for example, 12:00 noon) (S2104).
  • The type identifying unit 138 uses the learning model 126 stored in the storage unit 120 to perform estimation processing (an inference operation) that determines whether crops in the observation target area captured in the wide spectrum image 128 are in a specific state and, if so, in which areas (S2105), and displays the inference result on a display or the like (S2106).
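S2104 to S2106 then reduce to correcting the composite and classifying it pixel by pixel. The sketch below reuses correct_to_reference from the earlier sun-angle sketch and a model trained as above; treating the output as a per-pixel label map for display is an assumption.

```python
import numpy as np

def identify_state_regions(wide_images, model, capture_elev_deg,
                           reference_elev_deg=90.0):
    # wide_images: dict {wavelength_nm: 2-D array} forming the wide image 128.
    bands = sorted(wide_images)
    corrected = {b: correct_to_reference(wide_images[b], capture_elev_deg,
                                         reference_elev_deg)    # S2104
                 for b in bands}
    stack = np.stack([corrected[b] for b in bands], axis=-1)
    h, w, c = stack.shape
    labels = model.predict(stack.reshape(-1, c)).reshape(h, w)  # S2105
    return labels   # per-pixel specific-state codes, for display (S2106)
```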
  • FIG. 24 is a diagram showing an example of a screen displayed on the display, which is an example of the output device 111 of the information processing device 10, in the learning model generation process of the learning model generation unit 133 shown in the flowchart of FIG. 19.
  • A button 2401 for specifying the spectral images 127 from which the learning model is to be generated is displayed on the screen 2400, and the operator of the information processing device 10 presses this button using a pointing device or the like, which is an example of the input device 110, to specify the spectral images 2402 to 2404.
  • The specified spectral images 2402 to 2404 are displayed on the screen 2400 for each wavelength band.
  • The operator of the information processing device 10 views these spectral images 2402 to 2404 and, based on previously acquired information about the specific state of the crops in this area and the areas where the crops are in that state, selects the spectral image that best indicates that the crops are in the specific state (in the example shown in FIG. 24, the spectral image 2403 at one of the wavelengths), and designates the region 2405 where the crops are in the specific state using a pointing device or the like, which is an example of the input device 110.
  • The operator of the information processing apparatus 10 then uses the pull-down menu 2406 to specify the specific state of the crops in the region 2405.
  • Finally, the operator of the information processing device 10 clicks the OK button 2407 with the pointing device or the like to indicate that the designation operation is complete.
  • FIG. 25 is a diagram showing an example of a screen displayed on the display, which is an example of the output device 111 of the information processing device 10, in the process of specifying areas where crops are in a specific state by the area specifying unit 137 and the type specifying unit 138, shown in the flowchart of FIG. 21.
  • A spectral image 2501 is displayed on the screen 2500, together with an area 2502 showing the imaging date and time of the spectral image 2501 and the imaged area. Map data 2503 corresponding to the area where the spectral image 2501 was captured is also displayed.
  • Within the spectral image 2501, a region 2505 that is estimated to be in the designated specific state is displayed. This region 2505 also corresponds to the density distribution of the spectral image 2501.
  • In the embodiment described above, the spectral camera 32 mounted on the satellite 30 images the ground surface to obtain the spectral images 3103, and the spectral camera 22 mounted on the drone 20 images the area to be observed to obtain the spectral images 2104. However, the mobile body on which the spectral camera is mounted is not limited to the satellite 30 or the drone 20; any mobile body capable of imaging the ground surface from the air may be used.
  • Furthermore, the mobile body carrying the spectral camera may move on the ground surface: for example, an operator holding a spectral camera may move over the ground on foot, or a car equipped with a spectral camera may drive over it.
  • One example is a mode in which a multispectral camera realized by a smartphone is used as the spectroscopic terminal device disclosed in Japanese Patent No. 6,342,594.
  • The LCTF camera that is the spectrum camera 32 mounted on the satellite 30 can receive reflected light in many (for example, several hundred) wavelength bands to obtain the spectral images 3103.
  • In contrast, the spectral camera 22 mounted on the drone 20 uses the library to limit its wavelength bands to the (plurality of) bands that can detect that crops in the observation target area are in a specific state, and obtains the spectral image 2104 from the reflected light in those bands.
  • Such a difference in configuration between the spectral cameras 22 and 32 is also reflected in their weight and price: the spectral camera 32 is more expensive and heavier than the spectral camera 22.
  • The difference in weight between the spectral cameras 22 and 32 also affects the configuration of the flying object on which each is mounted. If the spectral camera 32 were mounted on the drone 20, the drone 20 would have to be enlarged, and the time it could fly at one stretch might be limited.
  • If the spectral camera 32 is mounted on the satellite 30, the restriction due to its weight is somewhat relaxed, but the frequency with which spectral images 3103 of the area to be observed can be acquired is inevitably lower. That is, the satellite 30 may pass over the area to be observed only once every several months, so the frequency at which the spectral images 3103 can be obtained may also be about once every several months.
  • Given that the number of times the spectral camera 32 can acquire the spectral image 3103 of the area to be observed is limited, it is conceivable to fix the time at which the spectral camera 32 images the area (for example, 12:00 noon) and to create the library using the sun angle at that time in that area. In this case, no sun angle correction is performed when creating the library.
  • Preferably, the time at which the spectrum camera 22 images the area to be observed is then set to the same time at which the spectrum camera 32 imaged it (in the above example, 12:00 noon).
  • Alternatively, the drone 20 or the like may be flown repeatedly while changing the time at which the spectral camera 22 images the area, so that spectral images 2104 of the area at multiple times are obtained; the sun angle correction values may then be stored in the teacher data 125 and the learning model 126 retrained.
  • The advantage of acquiring the spectral images 3103 for library creation with the spectral camera 32, an LCTF camera, is that spectral images 3103 of reflected light in the large number of wavelength bands described above can be acquired with a small number of shots (in the extreme case, one).
  • Based on spectral images 3103 of reflected light in many wavelength bands, a wavelength band suitable for detecting whether crops in the area to be observed are in a specific state can be selected accurately.
  • As a result, the number of wavelength bands detected by the spectral camera 22 can be reduced without lowering the accuracy of detecting whether crops in the area to be observed are in a specific state.
  • The system 1 of the embodiment described above uses spectral images, but the temperature distribution of the area may be used in addition to the spectral images. To do so, the temperature of the area should be obtained with a sensor or the like both when acquiring the spectral images on which library creation is based and when acquiring the spectral images of the area to be observed. At this time, it is preferable to provide multiple temperature measurement points so that the temperature distribution has a two-dimensional spread. In this case, the library also includes temperature distribution data for the area (a sketch of how such a channel might be added follows).
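If the temperature distribution is included as suggested, the per-pixel feature vector simply gains one channel, assuming the sensor readings have already been interpolated onto the image grid:

```python
import numpy as np

def add_temperature_channel(spectral_stack, temperature_map):
    # spectral_stack: H x W x B array of band values; temperature_map: H x W
    # array interpolated from the temperature measurement points.
    return np.concatenate([spectral_stack,
                           temperature_map[..., np.newaxis]], axis=-1)
```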
  • In the embodiment described above, the learning model 126 is used to identify areas in the observation target area where crops are in a specific state, but such areas may instead be identified using so-called image analysis technology.
  • In the first step, an aircraft equipped with a first multispectral camera is flown and photographs the earth's surface to generate the spectral images used to store in memory the information for detecting that the crops under observation are in a specific state, for comparison with spectral images taken by a multispectral camera.
  • In the second step, an aircraft equipped with a second multispectral camera, different from the first multispectral camera, is flown in accordance with the flight plan over the area to be observed, and spectral images are generated by photographing with the second multispectral camera at the wavelengths used to detect that crops are in a specific state.
  • The aircraft carrying the first multispectral camera and the aircraft carrying the second multispectral camera are different aircraft.
  • The first multispectral camera has a member that can switch the imaging wavelength, such as the liquid crystal variable wavelength filter 34, and can generate spectral images for a large number of wavelengths.
  • The second multispectral camera can generate spectral images only for specific wavelengths. These specific wavelengths cannot be switched while the second multispectral camera is imaging the area to be observed; that is, they are predetermined wavelengths.
  • Compared with the first multispectral camera, the second multispectral camera may lack a dedicated member for capturing spectral images: for example, it has no wavelength-switching member and generates spectral images at specific wavelengths.
  • The second flying object carrying the second multispectral camera may have a smaller loadable weight than the first flying object carrying the first multispectral camera. This may mean that the first and second flying objects have different weights (the second being lighter than the first), or that, based on these differences in weight, the airspace (the geographical range determined by latitude and longitude, and the altitude) in which each can fly differs. The second flying object may also carry a plurality of (for example, about four) second multispectral cameras, each generating spectral images at a different wavelength.
  • A first multispectral camera 32 capable of acquiring spectral images in a large number of wavelength bands, such as one equipped with the liquid crystal variable wavelength filter 34, tends to be expensive and heavy.
  • On the other hand, the second multispectral camera 22, which can generate spectral images only for specific wavelengths, is inexpensive and can be made lightweight.
  • Since the first multispectral camera 32 generates the spectral images for library creation, it preferably generates spectral images based on reflected light in many wavelength bands.
  • The second multispectral camera 22 is used to identify areas in the observation target area where crops are in a specific state, and is preferably flown frequently on a flying object such as the drone 20. Being lighter and cheaper than the first multispectral camera 32 is therefore a great advantage.
  • each of the above-mentioned configurations, functions, processing units, processing means, etc. may be partially or entirely realized by hardware, for example, by designing an integrated circuit.
  • the present invention can also be realized by software program codes that realize the functions of the embodiments.
  • a storage medium on which a program code is recorded is provided to a computer, and a processor included in the computer reads the program code stored on the storage medium.
  • the program code itself read from the storage medium realizes the functions of the embodiments described above, and the program code itself and the storage medium storing it constitute the present invention.
  • Storage media for supplying such program code include, for example, flexible disks, CD-ROMs, DVD-ROMs, hard disks, SSDs, optical disks, magneto-optical disks, CD-Rs, magnetic tapes, non-volatile memory cards, and ROMs.
  • program code that implements the functions described in this embodiment can be implemented using a wide range of program or script languages, such as assembler, C/C++, Perl, Shell, PHP, and Java (registered trademark).
  • the software program code that realizes the functions of the embodiment can be stored in a storage means such as a computer's hard disk or memory, or a storage medium such as a CD-RW or CD-R.
  • a processor included in the computer may read and execute the program code stored in the storage means or the storage medium.
  • (Appendix 1) A method for causing a processor of a computer to execute: a first step of storing, in a memory, information for detecting that a crop under observation is in a specific state, for comparison with a spectral image taken by a multispectral camera; a second step of generating a spectral image by photographing the area under observation at a wavelength for detecting the specific state; a third step of identifying areas where crops are in the specific state based on the spectral image and the stored information; and a fourth step (S2106) of outputting information on the identified areas.
  • In the second step, an aircraft (20) equipped with the multispectral camera (22) is flown over the area to be observed in accordance with a flight plan, and images are taken with the multispectral camera (22) at wavelengths for detecting that crops are in a specific state.
  • In the detection information, information indicating the area where the target crops are grown, a spectral image (127) of that area, the wavelength information used to photograph it, and the sun angle information when the crops in the area were photographed are associated with one another.
  • A trained model (126) generated by machine learning is stored in the memory (120), and in the third step (S2105), areas where crops are in a specific state are identified based on the trained model. The method described in Appendix 1.
  • Machine learning is performed using, as the detection information, a database further associated with information on the temperature of the area when the area was imaged by the multispectral camera (22), as the teacher data (125); the trained model (126) thus generated is stored in the memory (120), and in the third step, the temperature of the area to be observed when it is imaged by the multispectral camera (22) is further used.
  • Information on the type of the specific state is further associated with the database as the teacher data (125); the trained model (126) generated from the database containing the information on the type of the specific state is stored in the memory (120), and in the third step (S2105), areas where crops are in a specific state and the type of that specific state are identified based on the trained model (126).
  • (Appendix 10) In the third step (S2105), the designation of a photographing time point is accepted from the user, and the geographical range in which crops are in a specific state is displayed based on the spectral image (127) photographed at the designated time point. The method according to any one of Appendices 1 to 9.
  • (Appendix 11) In the third step (S2105), based on the result of detecting that crops are in a specific state, information on the crops in the specific state and information on the costs of dealing with those crops are displayed. The method according to any one of Appendices 1 to 10.
  • In the first step, a flying object (30) equipped with a multispectral camera (32) is flown to photograph the ground surface; based on the obtained spectral images (127), information indicating the area where the crops to be observed are grown, a spectral image (127) obtained by photographing that area with the multispectral camera (32), and information on the wavelength used to photograph the area are associated with one another and stored in the memory (120); and the user can designate, in association with the spectral image (127), that the crops are in a particular state.
  • The method according to any one of Appendices 2 to 8. (Appendix 13)
  • the spectral images used to create the database are images taken of the same region for a first wavelength group consisting of a plurality of wavelengths
  • the spectral images in the second step are images taken of the same area for a second wavelength group consisting of a plurality of wavelengths.

Abstract

The present invention causes a processor to execute: a first step of storing, in a memory, a trained model 126 for detecting that a crop under observation is in a specific state, the trained model 126 being used for comparison with a spectrum image 127 captured by a multispectral camera; a second step of generating the spectrum image 127 by capturing an image of a region under observation by the multispectral camera at a wavelength for detecting that a crop is in the specific state; a third step of identifying a region including the crop in the specific state in the region under observation on the basis of the captured spectrum image 127 and the trained model 126 for detecting that the crop is in the specific state; and a fourth step of outputting information on the identified region.

Description

Method, program, and information processing device
 The present disclosure relates to a method, a program, and an information processing device.
 For purposes such as understanding the growth of agricultural crops (hereinafter generally referred to as "crops" in this specification) and the condition of pests, diseases, and soil, a technique is known in which the land where crops are grown is imaged with a spectral camera to acquire a spectral image of that land (see, for example, Patent Document 1).
International Publication No. 2017/179378
 For an agricultural business, the grower presumably knows where crops were planted when growing and harvesting them. On the other hand, when harvesting and shipping crops, whether the vegetation is merely dense may not be enough to judge whether the crops are growing well.
 Therefore, businesses that grow and harvest crops need a technology that allows them to manage growing conditions more appropriately.
 The present disclosure has been made to solve the above problems, and its object is to provide a method, a program, and an information processing device that enable businesses engaged in harvesting to manage growing conditions more appropriately.
 According to one embodiment, a method for operating a computer is provided. This method causes a processor of the computer to execute: a first step of storing in memory information for detecting that a crop under observation is in a specific state, for comparison with a spectral image taken by a multispectral camera; a second step of generating a spectral image by photographing the area under observation with the multispectral camera at a wavelength for detecting that the crop is in the specific state; a third step of identifying, based on the captured spectral image and the information for detecting that the crop is in the specific state, areas in the observation target region where crops are in the specific state; and a fourth step of outputting information on the identified areas.
 According to the present disclosure, businesses that conduct harvesting operations can manage growing conditions more appropriately.
FIG. 1 is a diagram showing an overview of a system according to an embodiment.
FIG. 2 is a block diagram showing the configuration of the system according to the embodiment.
FIG. 3 is a front view showing a drone according to the embodiment.
FIG. 4 is a plan view showing the drone according to the embodiment.
FIG. 5 is a block diagram showing the hardware configuration of the drone according to the embodiment.
FIG. 6 is a block diagram showing the functional configuration of the drone according to the embodiment.
FIG. 7 is a conceptual diagram showing how land is imaged by the drone according to the embodiment.
FIG. 8 is a block diagram showing the hardware configuration of a satellite according to the embodiment.
FIG. 9 is a block diagram showing the functional configuration of the satellite according to the embodiment.
FIG. 10 is a conceptual diagram showing how land is imaged by the satellite according to the embodiment.
FIG. 11 is a block diagram showing the hardware configuration of an information processing device according to the embodiment.
FIG. 12 is a diagram showing the functional configuration of the information processing device according to the embodiment.
FIG. 13 is a diagram showing the data structure of the first imaging condition DB stored in the satellite according to the embodiment.
FIG. 14 is a diagram showing the data structure of the image DB stored in the satellite according to the embodiment.
FIG. 15 is a diagram showing the data structure of the second imaging condition DB stored in the drone according to the embodiment.
FIG. 16 is a diagram showing the data structure of the flight plan DB stored in the drone according to the embodiment.
FIG. 17 is a diagram showing the data structure of the wide area image DB stored in the information processing device according to the embodiment.
FIG. 18 is a flowchart for explaining an example of the operation of the satellite according to the embodiment.
FIG. 19 is a flowchart for explaining an example of the operation of the information processing device according to the embodiment.
FIG. 20 is a flowchart for explaining an example of the operation of the drone according to the embodiment.
FIG. 21 is a flowchart for explaining another example of the operation of the information processing device according to the embodiment.
FIG. 22 is a diagram showing an example of the wide area image generation operation in the information processing device according to the embodiment.
FIG. 23 is a conceptual diagram showing a procedure for determining from a spectral image that a crop is in a specific state.
FIG. 24 is a diagram showing an example of a screen displayed on the information processing device according to the embodiment.
FIG. 25 is a diagram showing another example of a screen displayed on the information processing device according to the embodiment.
 Hereinafter, embodiments of the present disclosure will be described with reference to the drawings. In all the figures explaining the embodiments, common components are given the same reference numerals and repeated explanations are omitted. Note that the following embodiments do not unduly limit the content of the present disclosure described in the claims, and not all components shown in the embodiments are essential components of the present disclosure. Each figure is a schematic diagram and is not necessarily strictly illustrated.
 In the following description, a "processor" is one or more processors. At least one processor is typically a microprocessor such as a CPU (Central Processing Unit), but may be another type of processor such as a GPU (Graphics Processing Unit). At least one processor may be single-core or multi-core.
 At least one processor may also be a processor in a broad sense, such as a hardware circuit that performs part or all of the processing (for example, an FPGA (Field-Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit)).
 In the following description, information that yields an output for an input may be described with an expression such as "xxx table", but this information may be data of any structure, or may be a learning model such as a neural network that generates an output for an input. Therefore, an "xxx table" can also be called "xxx information".
 The configuration of each table in the following description is an example; one table may be divided into two or more tables, and all or part of two or more tables may be one table.
 In the following description, processing may be explained with a "program" as the subject. Since a program is executed by a processor and performs the prescribed processing while appropriately using a storage unit and/or an interface unit, the subject of the processing may also be the processor (or a device, such as a controller, having that processor).
 A program may be installed on a device such as a computer, or may reside, for example, on a program distribution server or a computer-readable (for example, non-transitory) recording medium. In the following description, two or more programs may be realized as one program, and one program may be realized as two or more programs.
 In the following description, identification numbers are used as identification information for various objects, but other types of identification information (for example, identifiers containing letters or symbols) may be employed.
 In the following description, when elements of the same kind are described without distinction, a reference sign (or a common part of reference signs) is used; when elements of the same kind are described with distinction, an element's identification number (or reference sign) may be used.
 Control lines and information lines shown in the following description are those considered necessary for the explanation, and not all control lines and information lines in the product are necessarily shown. All configurations may be interconnected.
 In this specification, a "spectral image" means an image consisting only of reflected light in a specific wavelength band (including a specific wavelength) out of the light reflected from an object. As one example, as shown in the embodiments described later, a spectral image is generated by passing light reflected from an object (a crop) through a filter that passes only a specific wavelength band (preferably with a variable transmission wavelength band), so that only the reflected light in that band is imaged on an image sensor, and by generating the image from the imaging signal output from this image sensor. Alternatively, the reflected light from the object may be imaged on an image sensor without a filter, and the imaging signal output from this image sensor may be converted (filtered) by signal processing into an imaging signal of only a specific wavelength band. Note that an image combining a plurality of spectral images of the same object, each consisting only of reflected light in a specific wavelength band, may also be called a spectral image.
In this specification, a "flying object" means a mobile object that flies through the air and can be operated without a crew (for example, an aircraft that can be piloted remotely by radio or autonomously, a drone, a satellite, or an airship).
Further, in this specification, a crop being in a "specific state" covers the following:
・whether the crop is diseased, which disease it has, and which disease it is likely to develop
・the growth state of the crop (including whether it is ready for harvest)
・if the target area is a forest, the condition and distribution of the vegetation
"Crops" here include not only agricultural produce but also plants and trees. Accordingly, a crop being in a specific state also covers the state of the plants on which the produce grows (for example, if the crop is apples, the state of the leaves of the trees bearing the apples).
<Embodiment>
<Overview of embodiment>
In the system according to the embodiment, a first multispectral camera images the ground surface, including an area where crops are cultivated; an operator examines the resulting spectral images and associates differences in how a given area appears in specific wavelength bands (which correspond to the relative strength of the light reflected by the same object in a specific wavelength band versus other wavelength bands) with a specific state of the crops in that area. These associations are stored as a library. Next, the area to be observed is likewise imaged with a second multispectral camera to obtain spectral images. The spectral images of the observed area are then analyzed against the library to identify the regions of that area in which the crops are in the specific state.
The first and second multispectral cameras are preferably mounted on mobile bodies, and spectral images are acquired by moving those mobile bodies over the ground surface and the area to be observed. Suitable examples of mobile bodies are flying objects such as drones and satellites, but there is no particular restriction as long as the mobile body can move the first and second multispectral cameras to acquire spectral images. Moreover, the mobile body need not be a flying object: a mobile body carrying the first or second spectral camera may travel over the ground surface. For example, spectral images may be acquired by an operator walking over the ground carrying the first or second spectral camera, or by a vehicle fitted with the cameras driving over the ground. One example is a mode that uses a multispectral camera implemented on a smartphone, as in the spectroscopic terminal device disclosed in Japanese Patent No. 6342594.
The mobile body carrying the first multispectral camera and the mobile body carrying the second multispectral camera need not be the same. In the embodiment described later, the first multispectral camera is mounted on a satellite and the second on a drone, but the first may be mounted on a drone and the second on a satellite. Both cameras may also be mounted on satellites, or both on drones. Further, as noted above, at least one of the first and second multispectral cameras may be mounted on a mobile body that travels over the ground surface.
The first and second multispectral cameras both receive light reflected from the ground surface in a plurality of wavelength bands and generate spectral images from that light. Preferably, the first multispectral camera has a component capable of switching the imaging wavelength, such as a liquid crystal tunable filter, and can generate spectral images for a larger number of wavelengths than the second multispectral camera. The second multispectral camera, on the other hand, generates spectral images for specific wavelengths. Preferably, those specific wavelengths cannot be switched while the second multispectral camera is imaging the area to be observed; that is, the specific wavelengths are predetermined. Naturally, there is no particular restriction on which of the two cameras can image the larger number of wavelength bands.
In this specification, a "library" means a highly reusable data set and/or program. Preferably, the library is implemented as teacher (training) data for artificial intelligence techniques. More preferably, the system of this embodiment generates a learning model from the teacher data and identifies the regions in which crops are in a specific state on the basis of that model. In the following description, "library" therefore mainly refers to the teacher data, but the term may also be used without distinguishing between the teacher data and the learning model. Note that "learning model" below also covers a model that has already been trained, that is, a trained model.
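To make the idea of the library as teacher data concrete, here is a minimal sketch, under assumed data, of fitting a classifier to labeled band-reflectance vectors. The feature layout, the labels, and the choice of a random forest are illustrative stand-ins, not the method fixed by this disclosure.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical teacher data: one row per image region, columns = mean
# reflectance in each observed wavelength band (b1, b2, b3).
# Labels encode the "specific state" (0 = healthy, 1 = diseased).
X = np.array([
    [0.12, 0.35, 0.48],   # healthy region
    [0.10, 0.33, 0.51],
    [0.22, 0.18, 0.30],   # diseased region
    [0.25, 0.20, 0.28],
])
y = np.array([0, 0, 1, 1])

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# A new region observed later by the second multispectral camera:
print(model.predict([[0.23, 0.19, 0.29]]))  # -> [1], i.e. the specific state
```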
Although details are given later, the solar altitude at the time a spectral image was captured must be taken into account when creating the library. This is because the wavelength distribution of the light reflected from a crop (especially from its leaves) varies with the angle at which sunlight strikes it, so the influence of the solar altitude (solar angle) on the library must be considered. However, when building the library from spectral images captured by the first multispectral camera, the variation among those images (in particular, variation across different solar angles) may be insufficient at the start of system operation, which can make it difficult to create a library that accounts for the solar angle.
Therefore, the spectral images captured by the first multispectral camera are taken at a fixed solar angle (for example, at culmination, when the sun is at its highest), and the library is created from spectral images at that fixed solar angle. Then, when spectral images are acquired with the second multispectral camera, they are acquired at the same solar angle as the images taken by the first multispectral camera, so that the regions where crops are in a specific state can be identified with the influence of the solar angle excluded as far as possible.
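For reference, the solar angle for a given capture time and place can be estimated with the standard declination/hour-angle formula; the sketch below is an approximation of that textbook computation and is not drawn from the disclosure itself.

```python
import math

def solar_elevation_deg(lat_deg, day_of_year, solar_hour):
    """Approximate solar elevation angle (degrees) from latitude,
    day of year, and local solar time, using the standard
    declination/hour-angle formula (accuracy of a degree or so)."""
    decl = math.radians(23.44) * math.sin(2 * math.pi * (284 + day_of_year) / 365)
    hour_angle = math.radians(15 * (solar_hour - 12))  # 15 degrees per hour
    lat = math.radians(lat_deg)
    sin_h = (math.sin(lat) * math.sin(decl)
             + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    return math.degrees(math.asin(sin_h))

# Solar noon at 35.7 N (around Tokyo) on day 170 (mid June):
print(round(solar_elevation_deg(35.7, 170, 12.0), 1))  # close to the culmination altitude
```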
Subsequently, the variation of the spectral images acquired by the first and second multispectral cameras can be increased, that is, spectral images can be acquired at various solar angles, and the influence of the solar angle on the library can then be formulated.
When creating the library, and also when identifying the regions where crops are in a specific state, there is a need to cover a wide observation area. On the other hand, the footprint of each spectral image captured by the first and second multispectral cameras is fixed by the camera's angle of view and by the altitude above ground of the mobile body carrying it. In such cases, it is preferable to join multiple spectral images into one wide-area spectral image. Techniques for joining spectral images are well known; one example is to generate orthoimages from the spectral images and join the orthoimages. Such joining may be applied to spectral images obtained from either the first or the second multispectral camera.
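A production system would use proper orthorectification, but the deliberately naive sketch below conveys the joining step: nadir tiles with known ground offsets and a common ground sampling distance are pasted into one geo-referenced grid. The numbers and the flat-terrain assumption are purely illustrative.

```python
import numpy as np

def paste_tiles(tiles, origins_m, gsd_m, mosaic_shape):
    """Naive mosaicking: paste nadir tiles into one geo-referenced grid.

    tiles        : list of 2-D arrays (single-band spectral images)
    origins_m    : (x, y) ground offset of each tile's top-left corner, meters
    gsd_m        : ground sampling distance (meters per pixel), same for all
    mosaic_shape : (rows, cols) of the output grid
    """
    mosaic = np.full(mosaic_shape, np.nan)
    for tile, (x, y) in zip(tiles, origins_m):
        r0, c0 = int(round(y / gsd_m)), int(round(x / gsd_m))
        mosaic[r0:r0 + tile.shape[0], c0:c0 + tile.shape[1]] = tile
    return mosaic

# Two adjacent 150 m x 150 m tiles at 1 m/pixel:
a, b = np.ones((150, 150)), np.ones((150, 150)) * 2
m = paste_tiles([a, b], [(0, 0), (150, 0)], 1.0, (150, 300))
print(np.nanmin(m), np.nanmax(m))  # 1.0 2.0
```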
An overview of a system to which the method of the embodiment is applied will now be described with reference to FIG. 1. The description centers on an example in which the first multispectral camera is mounted on a satellite and the second on a drone, but the mobile bodies carrying the first and second multispectral cameras, and the way those cameras are moved, are not limited to what is disclosed in the embodiment. Likewise, the configurations of the first and second multispectral cameras (in particular, the number of wavelength bands in which they can capture spectral images) are not limited to the disclosure of the embodiment.
In the system 1 according to the embodiment, the spectral images for library creation are generated by a first multispectral camera mounted on a satellite 2, an example of a flying object, while the area to be observed is imaged by a second multispectral camera mounted on a drone 3, another example of a flying object, and the resulting spectral images are used to identify the regions in which crops are in a specific state.
However, the flying object that captures the spectral images for library creation and the flying object that captures the spectral images for region identification may be the same, and their types are not restricted. That said, capturing the library images with the satellite 2 has the advantage that spectral images of a wider area of the ground surface can be obtained efficiently. Meanwhile, capturing the region-identification images with the drone 3 has the advantages that the drone can be flown over just the area whose state is to be determined, making the images easy to obtain, and that imaging the same area many times is inexpensive.
Although details are given later, the satellite 2 carries a first multispectral camera using a liquid crystal tunable filter (LCTF), while the drone 3 carries a second multispectral camera that can capture spectral images in the combination of wavelength bands which, in the library creation, clearly identifies that a crop is in a specific state (preferably a multispectral camera built from several cameras, each of which captures only the reflected light of one specific wavelength band). Naturally, any multispectral camera that can image reflected light in a plurality of wavelength bands may be used, whatever its configuration. It is also possible to mount the LCTF-based first multispectral camera on the drone 3 and the second multispectral camera, which images a combination of wavelength bands, on the satellite 2.
Thanks to the liquid crystal tunable filter, the first multispectral camera on the satellite 2 can image the reflected light of any specific wavelength band within a wide range from the visible to the near infrared. When imaging the reflected light from the ground surface, the system 1 of this embodiment scans across this wide wavelength range and captures spectral images of the same area in many wavelength bands.
The spectral images captured by the first multispectral camera on the satellite 2 are transmitted to the information processing device 4. The operator of the information processing device 4 compares several (preferably three or more wavelength bands) of the spectral images of the same area, identifies, from the patterns of light and shade in each image, the pattern that indicates that the crops are in a specific state, and associates that pattern with information indicating the specific state. This presupposes that the operator knows the growth state of the crops on the ground in that area and, from it, knows in advance which regions contain crops in which specific state. The associations are stored as a library in the memory of the information processing device 4. Preferably, this library is implemented as teacher data for artificial intelligence techniques. More preferably, when the library is created, the angle of the sun relative to the ground surface is obtained or calculated from the time at which the satellite 2 imaged the ground, and this solar angle is also associated and stored in the library. As already explained, however, the solar-angle dependence is not essential, including at the start of operation of the system 1.
Next, the drone 3 is flown, in accordance with a predetermined flight plan, over the observation area for which it is to be determined whether the crops are in a specific state, and the area is imaged by the second multispectral camera mounted on the drone 3. As noted above, the footprint of the second multispectral camera on the drone 3 is comparatively small, so it is preferable to fly several drones 3 at the same time and image the area with their second multispectral cameras in parallel.
The flight plan is preferably prepared in advance by the operator of the information processing device 4. In doing so, the operator preferably studies the topography of the observation area beforehand and draws up a flight plan that concentrates the drone 3 on the areas where crops are expected to be in the specific state. The information processing device 4 may also recommend such areas. Areas where crops are expected to be in a specific state include, for example, areas along rivers if the crop is affected by humidity, areas where fog is expected to persist for long periods, and, if the crop is expected to be in a specific state within a certain altitude range, the areas within that range.
In the system 1 of this embodiment, a plurality of wavelength bands in which the specific state can be recognized conspicuously are selected in advance, a plurality of cameras (preferably three or more, more preferably four), each of which can image the reflected light of one of those bands, are prepared, and these cameras together constitute the multispectral camera.
The spectral images obtained by imaging the observation area with this second multispectral camera are input to the information processing device 4 and compared against the library to identify the regions where crops are in the specific state. In FIG. 1, the hatched regions have been identified as such. Preferably, the information processing device 4 joins the individual spectral images captured by the second multispectral camera on the drone 3 into a single spectral image and uses it to identify the regions. This is because the drone 3 flies at a lower altitude than the satellite 2, so when the observation area is extensive it is difficult for a single spectral image to cover all of it. The same reasoning applies to the spectral images obtained from the first multispectral camera on the satellite 2, so, as already explained, those images are preferably joined as appropriate as well. Preferably, when identifying the regions where crops are in the specific state, the solar angle over the area at the time the spectral images were captured is obtained or calculated and taken into account.
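As one way to picture this identification step, the sketch below classifies a stitched two-band image pixel by pixel with a stand-in library model and returns a label map of the regions in the specific state. The tiny inline training set exists only to make the example self-contained (see the earlier training sketch).

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Stand-in for the library model: two bands, label 1 = specific state.
X = np.array([[0.10, 0.50], [0.12, 0.60], [0.45, 0.20], [0.50, 0.22]])
y = np.array([0, 0, 1, 1])
model = RandomForestClassifier(n_estimators=10, random_state=0).fit(X, y)

def classify_regions(stitched_cube, model):
    """Classify a stitched (rows, cols, n_bands) spectral image pixel
    by pixel and return a label map of the same spatial shape."""
    rows, cols, n_bands = stitched_cube.shape
    labels = model.predict(stitched_cube.reshape(-1, n_bands))
    return labels.reshape(rows, cols)

cube = np.dstack([np.full((4, 4), 0.47), np.full((4, 4), 0.21)])
print(classify_regions(cube, model))  # a 4x4 map of 1s: the patch is flagged
```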
<Basic configuration of system 1>
The basic configuration of the system 1 to which the method of the embodiment is applied will now be described with reference to FIG. 2.
The system 1 of this embodiment has an information processing device 10, a drone 20, and a satellite 30 connected via a network N. The hardware configuration of the information processing device 10 is shown in FIG. 11, that of the drone 20 in FIG. 5, and that of the satellite 30 in FIG. 8. The drone 20 and the satellite 30 each carry an information processing device.
Each information processing device is a computer comprising an arithmetic unit and a storage device. The basic hardware configuration of such a computer, and the basic functional configuration realized by that hardware, are described later.
The network N is made up of the Internet, LANs, and various mobile communication systems built from wireless base stations and the like. Examples include 3G, 4G, and 5G mobile communication systems, LTE (Long Term Evolution), and wireless networks that can reach the Internet through an access point (for example, Wi-Fi (registered trademark)). For wireless connections, communication protocols such as Z-Wave (registered trademark), ZigBee (registered trademark), and Bluetooth (registered trademark) may be used. Wired connections, including direct connection by a USB (Universal Serial Bus) cable or the like, are also included.
The information processing device 10 receives the spectral images captured by the satellite 30 and creates a library from them. The information processing device 10 then receives the spectral images captured by the drone 20 and, from those images and the library, identifies the regions of the observation area in which crops are in a specific state.
The configuration and operation of each device are described below.
<Hardware configuration of drone 20>
The drone 20 of this embodiment is capable of stationary flight and, as shown in FIGS. 3 and 4, carries a spectral camera control system 23 comprising a spectral camera control device 21 and a spectral camera 22 controlled by that device. Each component is described in detail below.
The drone 20 is a flying object with the ability to remain stationary in the air, a so-called hovering function; in this embodiment it is a multicopter-type drone with a plurality of rotors. The drone 20 of this embodiment can fly autonomously along a pre-specified flight path and can also be flown by remote control from a communication device or the like. Although not shown in FIGS. 3 and 4, the drone 20 further has a GPS (Global Positioning System) receiver for detecting its own position (latitude and longitude) and altitude in flight, and an attitude sensor for detecting its own attitude in flight.
As shown in FIG. 5, the drone 20 carries a spectral camera control system 23 comprising a spectral camera control device 21 and a spectral camera 22, the second multispectral camera, controlled by that device.
As shown in FIG. 5, the spectral camera control system 23 mainly comprises the spectral camera 22, an attitude/position detector 24 that detects the attitude and position of the spectral camera 22, the spectral camera control device 21 that controls the spectral camera 22, and a battery 25 that supplies power to these devices.
As shown in FIG. 5, the spectral camera 22 is mounted on the drone 20 facing vertically downward so that the ground surface is the imaging target while the drone 20 is in stationary flight.
The spectral camera 22 mounted on the drone 20 of this embodiment has an image sensor that can detect reflected light in the wavelength bands corresponding to each of the band combinations which, as a result of the library generation described later, make it possible to detect that crops are in a specific state. Specifically, the spectral camera 22 has an image sensor and a plurality of filters, each of which passes only the light of one of those wavelength bands. When the spectral camera 22 images the ground surface, the filters are switched (swapped) as appropriate so that only the reflected light of the individual band reaches the image sensor; by switching the filters in succession, imaging results are obtained for the reflected light of each band in the combination.
Before the drone 20 images the observation area, which specific states of the crops are to be detected is decided in advance, and a spectral camera 22 with the filter combination determined by that decision is mounted on the drone 20. Alternatively, several cameras, each capable of imaging light in only a single wavelength band, may be carried, and the spectral camera 22 may be constituted by those cameras. In this way the spectral camera 22 generates spectral images 2104 from the reflected light of predetermined specific wavelength bands, and does not switch wavelength bands while the drone 20 is imaging the observation area. This description does not, however, exclude a configuration in which the spectral camera 22, like the spectral camera 32 mounted on the satellite 30, can switch freely among a plurality of wavelength bands.
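The capture sequence implied here, stepping through a fixed filter set and grabbing one frame per band, could look like the following sketch. The two hardware callables and the preset band list are hypothetical, since the actual camera interface is not part of the disclosure.

```python
import time

# Hypothetical hardware hooks; the real camera interface is not disclosed.
def switch_filter(band_nm):
    """Swap in the band-pass filter for band_nm (allow it to settle)."""
    time.sleep(0.05)

def capture_frame():
    """Read one frame off the image sensor (placeholder pixels)."""
    return [[0] * 4 for _ in range(4)]

# The band combination is fixed before the survey and never changes in flight.
PRESET_BANDS_NM = (550, 660, 735, 790)

def capture_multispectral_set():
    """One multispectral exposure: cycle the preset filters, one frame each."""
    frames = {}
    for band in PRESET_BANDS_NM:
        switch_filter(band)
        frames[band] = capture_frame()
    return frames

print(sorted(capture_multispectral_set()))  # [550, 660, 735, 790]
```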
The image sensor captures spectral images in snapshot mode. In this embodiment, the image sensor 222 is a two-dimensional image sensor, such as a CMOS or CCD image sensor, that can image the whole field of view at the same instant. The image sensor 222 performs imaging in response to an imaging command signal sent from the spectral camera control device 21.
The attitude/position detector 24 is a device that detects the attitude and position of the spectral camera 22. In this embodiment it comprises a GPS receiver 240, which detects the position and altitude information of the spectral camera 22, and an attitude sensor 241, which detects its attitude information.
The GPS receiver 240 obtains the current position and altitude information by acquiring the positions of a plurality of satellites. In this embodiment, the GPS receiver 240 obtains longitude and latitude as position information and elevation as altitude information. The position and altitude information is not limited to what the GPS receiver 240 provides; it may be obtained by other methods, for example as distance and altitude from a fixed reference point measured by a rangefinder using laser light or acoustic reflection.
The attitude sensor 241 detects attitude information of the spectral camera 22: tilt angle, angular velocity, and acceleration. Although not shown, the attitude sensor 241 in this embodiment comprises a gyro sensor and an accelerometer, and obtains the tilt angle, the angular velocity, and the acceleration along three axes as attitude information.
In this embodiment, the position and attitude information is obtained from the attitude/position detector 24 provided as part of the spectral camera control system 23, but this is not a limitation; for example, the information may be obtained from a GPS receiver and an attitude sensor that the drone 20 already has.
<Functional configuration of spectral camera control device 21>
FIG. 6 shows the functional configuration realized by the spectral camera control device 21 of the drone 20. The spectral camera control device 21 comprises a storage unit 210, a control unit 211, and a communication unit 212. The communication unit 212 is constituted by a communication IF (not shown) of the spectral camera control device 21, the storage unit 210 by its main storage and auxiliary storage (not shown), and the control unit 211 mainly by its processor (not shown).
The communication unit 212 communicates with the information processing device 10 and other devices via the network N (not shown).
<Configuration of storage unit 210 of spectral camera control device 21>
The storage unit 210 of the spectral camera control device 21 holds a flight plan DB (DataBase) 2101, a second imaging condition DB 2102, an image DB 2103, and spectral images 2104.
Of these, everything except the spectral images 2104 is a database. "Database" here means a relational database, used to manage sets of data, called tables, structurally defined by rows and columns, in relation to one another. In a database, the table's columns are called columns and its rows are called records. A relational database allows relationships between tables to be defined and associated.
Normally, each table has a column designated as a primary key for uniquely identifying records, but setting a primary key on a column is not essential. The control unit 211 can, in accordance with various programs, cause the processor to add, delete, and update records in a specific table stored in the storage unit 210.
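As an illustration of such a table with a primary-key column, the following sketch creates a hypothetical image-DB table in SQLite and inserts one record. The column set is invented for the example and is not the schema of the actual image DB 2103.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE image_db (
        image_id   INTEGER PRIMARY KEY,   -- identification number
        file_name  TEXT NOT NULL,         -- stored spectral image
        band_nm    INTEGER,               -- wavelength band of the image
        latitude   REAL,
        longitude  REAL,
        altitude_m REAL,
        shot_time  TEXT                   -- when the frame was taken
    )
""")
conn.execute(
    "INSERT INTO image_db VALUES (?, ?, ?, ?, ?, ?, ?)",
    (1, "img_0001.tif", 660, 35.68, 139.76, 150.0, "2023-04-01T11:58:00"),
)
for row in conn.execute("SELECT image_id, band_nm FROM image_db"):
    print(row)  # (1, 660)
```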
The flight plan DB 2101 is a database holding the flight plan by which the drone 20 is flown along a predetermined route; the second imaging condition DB 2102 is a database of the conditions under which the spectral camera 22 on the drone 20 images the observation area; the image DB 2103 is a database for managing the spectral images 2104 obtained by the spectral camera 22 in accordance with the second imaging condition DB 2102; and the spectral images 2104 are the spectral images so obtained. The flight plan DB 2101, the second imaging condition DB 2102, and the image DB 2103 are described in detail later.
<Configuration of control unit 211 of spectral camera control device 21>
The control unit 211 of the spectral camera control device 21 comprises a reception control unit 2110, a transmission control unit 2111, a flight control unit 2112, an attitude/position information acquisition unit 2113, and an imaging control unit 2114. These functional units, such as the reception control unit 2110, are realized by the control unit 211 executing the application program 2100 stored in the storage unit 210.
The reception control unit 2110 controls the process by which the spectral camera control device 21 receives signals from external devices in accordance with communication protocols.
The transmission control unit 2111 controls the process by which the spectral camera control device 21 transmits signals to external devices in accordance with communication protocols.
The flight control unit 2112 controls the flight of the drone 20 toward the target points specified by the flight plan DB 2101, based on the geographic information of those target points and on the attitude/position information obtained by the attitude/position information acquisition unit 2113.
The attitude/position information acquisition unit 2113 obtains the position, altitude, and attitude information of the drone 20 detected by the attitude/position detector 24 and provides the results to the flight control unit 2112 and the imaging control unit 2114.
The imaging control unit 2114 refers to the second imaging condition DB 2102, images the observation area with the spectral camera 22 at the imaging positions specified by that DB, stores the captured spectral images 2104 in the storage unit 210, and stores the conditions under which each spectral image 2104 was captured in the image DB 2103.
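A simplified version of this position-triggered capture logic is sketched below: the drone's reported coordinates are compared against planned imaging points, and a capture is fired within a small tolerance. The flat-earth distance conversion and the plan contents are assumptions made for illustration.

```python
import math

def within(pos, target, tol_m=5.0):
    """True when the drone is within tol_m meters of a planned imaging
    point (flat-earth approximation, adequate at field scale)."""
    dx = (pos[1] - target[1]) * 111_320 * math.cos(math.radians(pos[0]))
    dy = (pos[0] - target[0]) * 110_540
    return math.hypot(dx, dy) <= tol_m

# Planned imaging positions, as might come from the imaging-condition DB:
plan = [(35.6800, 139.7600), (35.6800, 139.7617)]
captured = []

def on_position_update(pos):
    for target in plan:
        if target not in captured and within(pos, target):
            captured.append(target)   # trigger the spectral camera here
            print("capture at", target)

on_position_update((35.68001, 139.76001))  # close enough -> capture
```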
FIG. 7 conceptually shows the drone 20 of this embodiment imaging the observation area to acquire spectral images. One exposure of the spectral camera 22 covers, as an example, a rectangular (square) area roughly 150 m on a side, with the drone 20 at an altitude of about 150 m.
<Hardware configuration of satellite 30>
The satellite 30 of this embodiment circles the earth at a roughly constant speed in an orbit above it. As shown in FIG. 8, the satellite 30 carries a spectral camera control system 33 comprising a spectral camera control device 31 and a spectral camera 32, the first multispectral camera, controlled by that device. The configurations of the spectral camera control device 31, the spectral camera 32, and the spectral camera control system 33 are similar to those of the spectral camera control device 21, spectral camera 22, and spectral camera control system 23 of the drone 20. The description of the substantially identical parts is therefore omitted, and only the main differences are described.
The spectral camera control system 33 mainly comprises the spectral camera 32 equipped with a liquid crystal tunable filter 34, an attitude/position detector 35 that detects the attitude and position of the spectral camera 32, a liquid crystal tunable filter control circuit 36 that controls the liquid crystal tunable filter 34 of the spectral camera 32, the spectral camera control device 31 that controls the spectral camera 32, and a battery 37 that supplies power to these devices.
The spectral camera 32 captures spectral images in snapshot mode and mainly comprises a lens group 320, a depolarizer 321 for converting polarized light into unpolarized light, the liquid crystal tunable filter 34, whose transmission wavelength can be selected at will, and an image sensor 322 that captures two-dimensional spectral images.
The lens group 320 uses refraction to pass the light from the imaging target through the liquid crystal tunable filter 34 and to focus the transmitted light onto the image sensor 322. In this embodiment, the lens group 320 consists of an entrance lens 320a, which gathers the light from the imaging target into the liquid crystal tunable filter 34, and a condenser lens 320b, which focuses onto the image sensor 322 only the light of the transmission wavelength that has passed through the filter. The type and number of lenses are not particularly limited and may be chosen to suit the performance of the spectral camera 32 and other factors.
The depolarizer 321 removes polarization, turning the light into unpolarized light. In this embodiment, the depolarizer 321 is placed on the entrance side of the liquid crystal tunable filter 34 and depolarizes the light before it passes through the filter, reducing its polarization characteristics.
The liquid crystal tunable filter 34 is an optical filter whose transmission wavelength can be selected at will within a predetermined wavelength range. Although not shown, the liquid crystal tunable filter 34 has a structure in which plate-shaped liquid crystal elements and plate-shaped polarizing elements are stacked alternately. The alignment state of each liquid crystal element is controlled independently by the voltage applied from the liquid crystal tunable filter control circuit 36. The filter 34 can therefore transmit light of any desired wavelength through the combination of the alignment states of the liquid crystal elements and the polarizing elements.
In this embodiment, the transmission bandwidth of the liquid crystal tunable filter 34 is about 20 nm or less, the transmission center wavelength can be set in 1 nm steps, and the wavelength switching time is on the order of 10 ms to several hundred ms.
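Given those figures, a wavelength scan might be driven as in the sketch below, which steps the transmission center across the visible/near-infrared range while waiting out the worst-case switching time. The control call is a placeholder, as the real filter interface is hardware specific.

```python
import time

SWITCH_DELAY_S = 0.3  # allow for the worst-case switching time

def set_center_wavelength(nm):
    """Placeholder for the voltage command sent to the tunable-filter
    control circuit; waits out the switching time."""
    time.sleep(SWITCH_DELAY_S)

def scan(start_nm=450, stop_nm=800, step_nm=20):
    """Step the transmission center across the visible/NIR range;
    the filter itself can resolve 1 nm center positions."""
    for nm in range(start_nm, stop_nm + 1, step_nm):
        set_center_wavelength(nm)
        yield nm  # in practice, capture one frame per stop here

print(list(scan())[:3])  # [450, 470, 490]
```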
The image sensor 322 captures spectral images in snapshot mode. In this embodiment, the image sensor 322 is a two-dimensional image sensor, such as a CMOS or CCD image sensor, that can image the whole field of view at the same instant. The image sensor 322 performs imaging in response to an imaging command signal sent from the spectral camera control device 31.
The liquid crystal tunable filter control circuit 36 controls the liquid crystal tunable filter 34. In this embodiment, on receiving a wavelength specification signal from the spectral camera control device 31, the circuit 36 supplies the liquid crystal elements of the filter 34 with the voltages corresponding to that signal. The wavelength specification signal contains the transmission wavelength to be passed by the filter 34, and from that information the circuit 36 determines which liquid crystal elements should receive a voltage and supplies the voltage to the identified elements.
Although the liquid crystal tunable filter control circuit 36 in this embodiment is configured independently of the other components, such as the spectral camera control device 31, this is not a limitation; for example, the spectral camera control device 31 or the spectral camera 32 may incorporate it.
<Functional configuration of spectral camera control device 31>
FIG. 9 shows the functional configuration realized by the spectral camera control device 31 of the satellite 30. The spectral camera control device 31 comprises a storage unit 310, a control unit 311, and a communication unit 312. The communication unit 312 is constituted by a communication IF (not shown) of the spectral camera control device 31, the storage unit 310 by its main storage and auxiliary storage (not shown), and the control unit 311 mainly by its processor (not shown).
The communication unit 312 communicates with the information processing device 10 and other devices via the network N (not shown).
<Configuration of storage unit 310 of spectral camera control device 31>
The storage unit 310 of the spectral camera control device 31 holds a first imaging condition DB (DataBase) 3101, an image DB 3102, and spectral images 3103.
Of these, everything except the spectral images 3103 is a database. As above, "database" means a relational database, used to manage sets of data, called tables, structurally defined by rows and columns, in relation to one another, in which the table's columns are called columns and its rows are called records, and in which relationships between tables can be defined and associated.
Normally, each table has a column designated as a primary key for uniquely identifying records, but setting a primary key on a column is not essential. The control unit 311 can, in accordance with various programs, cause the processor to add, delete, and update records in a specific table stored in the storage unit 310.
The first imaging condition DB 3101 is a database of the conditions under which the spectral camera 32 on the satellite 30 images the ground surface; the image DB 3102 is a database for managing the spectral images obtained by the spectral camera 32 in accordance with the first imaging condition DB 3101; and the spectral images 3103 are the spectral images so obtained. The first imaging condition DB 3101 and the image DB 3102 are described in detail later.
<Configuration of control unit 311 of spectral camera control device 31>
The control unit 311 of the spectral camera control device 31 comprises a reception control unit 3110, a transmission control unit 3111, an imaging determination unit 3112, an attitude/position information acquisition unit 3113, an imaging control unit 3114, and a wavelength setting unit 3115. These functional units, such as the reception control unit 3110, are realized by the control unit 311 executing the application program 3100 stored in the storage unit 310.
The reception control unit 3110 controls the process by which the spectral camera control device 31 receives signals from external devices in accordance with communication protocols.
The transmission control unit 3111 controls the process by which the spectral camera control device 31 transmits signals to external devices in accordance with communication protocols.
The imaging determination unit 3112 determines, on the basis of the imaging start time and imaging end time stored in the first imaging condition DB 3101, whether the spectral camera 32 mounted on the satellite 30 should image the ground surface, and sends the result to the imaging control unit 3114.
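The time-window decision made by the imaging determination unit 3112 amounts to a simple interval test, sketched below with assumed start and end times.

```python
from datetime import datetime, timezone

def should_image(now, start, end):
    """Imaging decision of the kind made by the determination unit:
    capture only inside the configured start/end window."""
    return start <= now <= end

start = datetime(2023, 4, 1, 2, 50, tzinfo=timezone.utc)
end = datetime(2023, 4, 1, 3, 10, tzinfo=timezone.utc)
print(should_image(datetime(2023, 4, 1, 3, 0, tzinfo=timezone.utc), start, end))  # True
```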
The attitude/position information acquisition unit 3113 obtains the position, altitude, and attitude information of the satellite 30 detected by the attitude/position detector 35 and provides the results to the imaging control unit 3114.
The imaging control unit 3114 receives the determination result from the imaging determination unit 3112 and, if the result calls for imaging, images the ground surface with the spectral camera 32, stores the captured spectral images 3103 in the storage unit 310, and stores the conditions under which each spectral image 3103 was captured in the image DB 3102.
FIG. 10 conceptually shows the satellite 30 of this embodiment imaging the observation area to acquire spectral images. One exposure of the spectral camera 32 covers, as an example, a rectangular (square) area roughly 2 to 10 km on a side, with the satellite 30 at an altitude of about 500 km.
<Hardware configuration of information processing device 10>
FIG. 11 is a block diagram showing the basic hardware configuration of the information processing device 10. The information processing device 10 has at least a processor 101, a main storage device 102, an auxiliary storage device 103, a communication IF (Interface) 104, an input IF 105, and an output IF 106, electrically interconnected by a communication bus 107. An input device 110 is connected to the information processing device 10 through the input IF 105, and an output device 111 through the output IF 106.
The processor 101 is the hardware that executes the instruction sets described in programs, and comprises an arithmetic unit, registers, peripheral circuits, and so on.
The main storage device 102 temporarily stores programs and the data processed by them; it is, for example, a volatile memory such as a DRAM (Dynamic Random Access Memory).
The auxiliary storage device 103 is a storage device for holding data and programs, for example flash memory, an SSD (Solid State Drive), an HDD (Hard Disc Drive), a magneto-optical disk, a CD-ROM, a DVD-ROM, or semiconductor memory.
The communication IF 104 is an interface for inputting and outputting the signals used to communicate with other computers over a network, using wired or wireless communication standards.
The input IF 105 serves as the interface to the input device 110, which accepts input operations from the operator of the information processing device 10. The output IF 106 serves as the interface to the output device 111, which presents information to the operator. The input device 110 is, for example, a touch panel, a touch pad, a pointing device such as a mouse, or a keyboard; the output device 111 is, for example, a display or a speaker.
 なお、各ハードウェア構成の全部または一部を複数のコンピュータに分散して設け、ネットワークを介して相互に接続することにより情報処理装置10を仮想的に実現することができる。このように、情報処理装置10は、単一の筐体、ケースに収納されたコンピュータだけでなく、仮想化されたコンピュータシステムも含む概念である。 Note that the information processing device 10 can be virtually realized by distributing all or part of each hardware configuration to a plurality of computers and interconnecting them via a network. In this way, the information processing device 10 is a concept that includes not only a computer housed in a single housing or case, but also a virtualized computer system.
<Functional configuration of information processing device 10>
FIG. 12 shows the functional configuration realized by the hardware of the information processing device 10. The information processing device 10 includes a storage unit 120, a control unit 130, and a communication unit 140. The communication unit 140 is implemented by the communication IF 104, the storage unit 120 by the main storage device 102 and the auxiliary storage device 103, and the control unit 130 mainly by the processor 101.
The communication unit 140 communicates with the drone 20 and other devices via the network N.
<Configuration of storage unit 120 of information processing device 10>
The storage unit 120 of the information processing device 10 holds a flight plan DB (DataBase) 122, an image DB 123, a wide-area image DB 124, teacher data 125, a learning model 126, spectral images 127, and wide-area spectral images 128.
Of these, the flight plan DB 122, the image DB 123, and the wide-area image DB 124 are databases. A database here means a relational database, which manages interrelated sets of data organized as tabular structures defined by rows and columns. In a relational database, such a structure is called a table, its columns are called columns, and its rows are called records, and relationships between tables can be defined and associated.
Normally, each table has a column set as a primary key to uniquely identify each record, although setting a primary key is not mandatory. The control unit 130 can cause the processor 101 to add, delete, and update records in a specific table stored in the storage unit 120 in accordance with various programs.
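As an illustrative sketch only, the following shows how such a table with a primary-key column might be defined and how records could be added, updated, and deleted. The patent does not name a database engine, so SQLite and all table and column names here (modeled loosely on the first imaging condition DB 3101 described later) are assumptions.

```python
import sqlite3

# Hypothetical schema; the engine (SQLite) and all names are illustrative
# assumptions, not an implementation specified by this disclosure.
conn = sqlite3.connect("observation.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS imaging_condition ("
    " imaging_condition_id TEXT PRIMARY KEY,"  # primary key uniquely identifies a record
    " imaging_start_time   TEXT,"
    " imaging_end_time     TEXT,"
    " wavelength_condition TEXT)"
)

# The control unit 130 is described as adding, deleting, and updating records.
conn.execute(
    "INSERT INTO imaging_condition VALUES (?, ?, ?, ?)",
    ("C001", "2023-04-01T10:00Z", "2023-04-01T10:05Z", "400-1000nm / 10nm step"),
)
conn.execute(
    "UPDATE imaging_condition SET imaging_end_time = ? WHERE imaging_condition_id = ?",
    ("2023-04-01T10:06Z", "C001"),
)
conn.execute("DELETE FROM imaging_condition WHERE imaging_condition_id = ?", ("C001",))
conn.commit()
```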
The flight plan DB 122 is similar to the flight plan DB 2101 stored in the storage unit 210 of the drone 20. The image DB 123 is similar to the image DB 2103 stored in the storage unit 210 of the drone 20 and the image DB 3102 stored in the storage unit 310 of the satellite 30; in this embodiment, the storage unit 120 of the information processing device 10 holds an image DB 123 that merges the image DB 2103 and the image DB 3102. The wide-area image DB 124 is a database for managing the wide-area spectral images 128 that the image synthesis unit 136 of the control unit 130 generates from the spectral images 127.
<Configuration of control unit 130 of information processing device 10>
The control unit 130 of the information processing device 10 includes a reception control unit 131, a transmission control unit 132, a learning model generation unit 133, a flight plan creation unit 134, a flying object control unit 135, an image synthesis unit 136, an area identification unit 137, and a type identification unit 138. These functional units are realized by the control unit 130 executing the application program 121 stored in the storage unit 120.
The reception control unit 131 controls the processing by which the information processing device 10 receives signals from external devices in accordance with a communication protocol.
The transmission control unit 132 controls the processing by which the information processing device 10 transmits signals to external devices in accordance with a communication protocol.
The learning model generation unit 133 generates teacher data 125 based on the spectral images 127 taken by the spectral camera 32 mounted on the satellite 30 and the image DB 123 sent from the satellite 30, as well as on the spectral images taken by the spectral camera 22 mounted on the drone 20 and the image DB 123 sent from the drone 20, and generates a learning model 126 based on this teacher data 125.
The operation of the learning model generation unit 133 is described in detail below for the case where the spectral images 127 taken by the spectral camera 32 mounted on the satellite 30 are used; the operation is the same when the spectral images 127 taken by the spectral camera 22 mounted on the drone 20 are used.
The spectral camera 32 mounted on the satellite 30 images the same area of the ground surface in each of the many narrow wavelength bands into which its wide overall wavelength range is subdivided, acquiring for that area a large number of spectral images 3103 (spectral images 127) with differing wavelength bands. The learning model generation unit 133 presents to the operator of the information processing device 10 the multiple spectral images 127 of the same area taken in different wavelength bands and has the operator identify the spectral image 127 whose wavelength band most clearly separates the areas in which crops in that region are in a specific state. Next, the learning model generation unit 133 has the operator identify, for the spectral image 127 of the identified wavelength band, the specific crop state that that band makes identifiable, and associates the spectral image 127 of the identified wavelength band with the identified specific crop state. The learning model generation unit 133 then stores the identified wavelength band, the spectral image 127 of that band, the identified specific crop state, and their association in the storage unit 120 as teacher data 125.
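The disclosure describes which items are associated in one teacher-data entry but not a concrete storage format. A minimal sketch of such an entry, with all field names assumed purely for illustration, might look like this:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TeacherDataEntry:
    """One teacher-data record; the field names are hypothetical."""
    wavelength_band_nm: Tuple[float, float]  # band the operator identified
    spectral_image_path: str                 # spectral image 127 in that band
    crop_state_label: str                    # specific crop state named by the operator
    region_mask_path: Optional[str] = None   # operator-marked area within the image

entry = TeacherDataEntry(
    wavelength_band_nm=(700.0, 710.0),
    spectral_image_path="spectral_0123.tif",
    crop_state_label="disease_suspected",
)
```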
For example, as shown in FIG. 23, spectral images 127 in multiple wavelength bands can be acquired from an original image of the same area of the ground surface (in this embodiment multiple spectral images 127 can be acquired for the same area, but only one image is shown to simplify the explanation). FIG. 23 shows three wavelengths α, β, and γ, but there is no limit on the number of wavelength bands. Among these spectral images 127, a density distribution may appear at one wavelength (wavelength β in FIG. 23) that differs from those at the other wavelengths α and γ. Since each spectral image 127 covers a specific wavelength band, it is assumed to be displayed on a display or the like as a pattern of black-and-white shading (that is, in grayscale). The operator of the information processing device 10 knows in advance (for example, through a field survey) what specific state the crops in this area are in; on seeing the distinctive density distribution at wavelength β that does not appear at the other wavelengths α and γ (indicated by hatching in FIG. 23), the operator inputs the association between this density distribution and the specific crop state.
At this time, the learning model generation unit 133 refers to the image DB 123 and calculates the angle between the ground surface and the sun at the moment the spectral image of the identified wavelength band was captured, because the spectrum of the light reflected from crops varies with the angle of the sun. The learning model generation unit 133 likewise stores this sun angle in the storage unit 120 as part of the teacher data 125. Alternatively, based on the calculated sun angle, the learning model generation unit 133 may compute a correction formula (or correction value) for converting the spectral image into the spectral image that would have been captured at the sun angle of a predetermined time, generate teacher data 125 from the spectral image corrected with this formula, and store it in the storage unit 120. The correction formula itself may be stored in the learning model 126 described later, or may be stored separately in the storage unit 120. Correction based on the sun angle is, however, not essential.
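The disclosure does not give the formula by which the sun angle is computed from the imaging time. A common approximation, using solar declination and the hour angle and ignoring the equation of time, is sketched below; treat it as an assumed method rather than the one specified here.

```python
import math
from datetime import datetime, timezone

def solar_elevation_deg(lat_deg: float, lon_deg: float, when_utc: datetime) -> float:
    """Approximate solar elevation (degrees above the horizon)."""
    day = when_utc.timetuple().tm_yday
    # Solar declination, simple cosine approximation (degrees).
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day + 10)))
    # Local solar time in hours; the equation of time is omitted for brevity.
    solar_time = when_utc.hour + when_utc.minute / 60.0 + lon_deg / 15.0
    hour_angle = 15.0 * (solar_time - 12.0)
    lat, d, h = (math.radians(v) for v in (lat_deg, decl, hour_angle))
    return math.degrees(math.asin(
        math.sin(lat) * math.sin(d) + math.cos(lat) * math.cos(d) * math.cos(h)
    ))

# Example: sun angle at the moment a spectral image was captured near Tokyo.
angle = solar_elevation_deg(35.0, 139.7, datetime(2023, 4, 15, 3, 0, tzinfo=timezone.utc))
```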
Next, the learning model generation unit 133 generates a learning model 126, in the sense of artificial intelligence technology, based on the teacher data 125 produced by the above procedure. Known methods of generating the learning model 126, and known model types, can suitably be applied, so no further explanation is given here.
The learning model generation unit 133 then repeats the teacher data 125 generation procedure described above to update the teacher data 125, and retrains the learning model 126 whenever the teacher data 125 is updated, or at predetermined time intervals. This improves the accuracy with which the type identification unit 138, described later, identifies that crops are in a specific state.
Updating the teacher data 125 and retraining the learning model 126 in this way is particularly desirable when, at the start of operation of the system 1, spectral images 127 are available only for a limited range of sun angles and the teacher data 125 and learning model 126 are generated from them. That is, the teacher data 125 and learning model 126 may initially be generated from spectral images 127 at limited sun angles; thereafter, spectral images from the spectral camera 32 mounted on the satellite 30 and/or the spectral camera 22 mounted on the drone 20 may be acquired at various sun angles, teacher data 125 incorporating the influence of the sun angle may be generated, and the learning model 126 may be retrained on this teacher data 125. This brings forward the date at which operation of the system 1 can begin, while progressively improving the accuracy of the identification results produced by the type identification unit 138. In addition, the identification results of the type identification unit 138, described later, may be compared with the actual specific states of the crops, and the actual states fed back to create or revise the teacher data 125 and retrain the learning model 126. This too progressively improves the accuracy of the identification results of the type identification unit 138.
In doing so, it is preferable that the learning model generation unit 133 acquire terrain data for the ground surface imaged by the spectral camera 32 of the satellite 30 and/or the spectral camera 22 of the drone 20, identify from this terrain data areas in which crops can be presumed to be in a specific state, and extract and present to the operator the spectral images 127 obtained by imaging those areas with the spectral cameras 22 and 32. Whether crops are in a specific state often depends on the terrain; as one example, crops are known to become more susceptible to disease where the humidity of the growing site is high. The learning model generation unit 133 therefore acquires terrain data for the ground surface, estimates from it the environment of the growing site (sunlight, humidity, wind direction, and so on), and presents to the operator the spectral images 127 of locations where this environment matches particular conditions. The terrain data may be stored in advance in the storage unit 120 of the information processing device 10 or obtained from an external service.
Note that it is also possible to identify whether crops are in a specific state without using the learning model generation unit 133, the teacher data 125, or the learning model 126, based solely on the identified wavelength band, the spectral image 127 of that band, the identified specific crop state, the sun angle at the time the spectral image 127 was captured, and their associations.
The flight plan creation unit 134 creates a flight plan for flying the drone 20 over the area to be observed and imaging that area with the spectral camera 22 mounted on the drone 20, stores it in the storage unit 120 as the flight plan DB 122, and transmits the created flight plan DB 122 to the drone 20. When the flight of the drone 20 is controlled (that is, the drone 20 is piloted) by the flying object control unit 135 described later, the flight plan creation unit 134 and the flight plan DB 122 need not be provided.
Here it is preferable that the flight plan creation unit 134 create a flight plan that flies multiple drones 20 over the area to be observed at the same time and has the spectral cameras 22 mounted on these drones 20 image the area in parallel.
It is also preferable that the flight plan creation unit 134 acquire terrain data for the area to be observed and create the flight plan based on this terrain data. As noted above, whether crops are in a specific state often depends on the terrain; as one example, crops are known to become more susceptible to disease where the humidity of the growing site is high. The flight plan creation unit 134 therefore acquires terrain data for the area to be observed, estimates from it the environment of the growing site (sunlight, humidity, wind direction, and so on), flies the drone 20 with emphasis on locations where this environment matches particular conditions, and acquires spectral images with the spectral camera 22. The terrain data may be stored in advance in the storage unit 120 of the information processing device 10 or obtained from an external service.
The flight plan may be created by having the operator of the information processing device 10 individually designate the flight target positions of the drone 20, or by having the operator specify only the area to be observed and letting the flight plan creation unit 134 generate the flight target positions from the flight speed of the drone 20 and the rate at which the spectral camera 22 acquires spectral images.
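As a sketch of the second option, target positions can be spaced so that consecutive images overlap, using the flight speed and the image acquisition rate. The lawnmower pattern and all parameters below are illustrative assumptions; the text states only that the positions can be generated from these two quantities.

```python
def survey_waypoints(x0, y0, width_m, height_m, speed_mps, images_per_sec, swath_m):
    """Hypothetical lawnmower pattern over a rectangular observation area."""
    along_track = max(1, int(speed_mps / images_per_sec))  # metres flown between shots
    waypoints, y, leg = [], y0, 0
    while y <= y0 + height_m:
        xs = list(range(0, int(width_m) + 1, along_track))
        if leg % 2 == 1:
            xs.reverse()                      # alternate direction on each pass
        waypoints += [(x0 + x, y) for x in xs]
        y += swath_m                          # next line offset by the camera footprint
        leg += 1
    return waypoints

# Example: 2 m/s drone, 1 image/s, 10 m footprint, over a 100 m x 50 m field.
wps = survey_waypoints(0.0, 0.0, 100.0, 50.0, 2.0, 1.0, 10.0)
```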
The flying object control unit 135 controls the flight of the drone 20 based on the flight plan DB 122. If the drone 20 can fly autonomously based on the flight plan DB 2101 stored in its own storage unit 210, the flying object control unit 135 need not be provided. Alternatively, if the information processing device 10 pilots the drone 20, the flying object control unit 135 steers the drone 20 based on the control signals the operator enters through a remote controller, one example of the input device 110 of the information processing device 10.
The image synthesis unit 136 stitches together the spectral images 2104, 127 captured by the spectral camera 22 of the drone 20 to generate a wide-area spectral image 128, preferably one covering the entire area to be observed. The imaging footprint of the spectral camera 22 of the drone 20 is often narrower than the observation area over which crop states must be identified, so generating a wide-area spectral image 128 covering a broad extent is preferable for identifying crop states in a single pass. The image synthesis unit 136 stores the generated wide-area spectral image 128 in the storage unit 120 and updates the wide-area image DB 124. The image synthesis unit 136 likewise stitches together the spectral images 3103, 127 captured by the spectral camera 32 of the satellite 30 to generate a wide-area spectral image 128, preferably one covering a broad extent of the ground surface.
Here it is preferable that the image synthesis unit 136 generate the wide-area spectral image 128 using a known technique for producing orthoimages. An orthoimage removes the positional displacement of features within a photograph, converting an aerial photograph into an image that, like a map, shows everything at the correct size and position without tilt, as if viewed from directly above (hereinafter, "orthorectification"). Orthorectifying an aerial photograph requires matching positions in the photograph to horizontal positions on the ground, and this is done using a digital elevation model (elevation data) representing the three-dimensional shape of the ground surface. Since methods of generating orthoimages are known, no further explanation is given here.
FIG. 22 shows an example of how the image synthesis unit 136 composites the spectral images 2104, 127. The spectral images 127 obtained by one drone 20 imaging an area 2200 (square in the figure, though not limited to this) are laid out horizontally in the figure with a predetermined overlap region and composited into a wide-area spectral image 128. The same operation is performed on the spectral images 127 obtained by another drone 20 imaging an area 2201, yielding a wide-area spectral image 128 that encompasses the area to be observed.
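A minimal sketch of this compositing step, assuming equally sized single-band images and a fixed pixel overlap, is shown below. A real implementation would register the images and orthorectify them with a digital elevation model, as described above.

```python
import numpy as np

def stitch_row(images, overlap_px):
    """Concatenate same-sized single-band images left to right,
    dropping the fixed overlap strip from each subsequent image."""
    parts = [images[0]] + [img[:, overlap_px:] for img in images[1:]]
    return np.concatenate(parts, axis=1)

row = stitch_row([np.zeros((512, 512)) for _ in range(4)], overlap_px=64)
assert row.shape == (512, 512 + 3 * (512 - 64))
```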
The area identification unit 137 refers to the wide-area spectral image 128 created by the image synthesis unit 136 and accepts, from the operator of the information processing device 10, the designation of the area within which the type identification unit 138 is to identify whether crops are in a specific state.
The type identification unit 138 identifies, for the area designated by the area identification unit 137, whether crops are in a specific state and, if so, which state, based on the wide-area spectral image 128 and the learning model 126. The type identification unit 138 may also accept, via instruction input from the operator of the information processing device 10, the specific crop state to be identified, and then identify whether the crops in the area designated by the area identification unit 137 are in that concrete state. In doing so, the type identification unit 138 calculates the angle between the ground surface and the sun at the time the spectral images 127 underlying the wide-area spectral image 128 were captured and, using the sun angle correction formula stored in the learning model 126 or the storage unit 120, corrects the spectrum so that the wide-area spectral image 128 matches one captured at the predetermined time.
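A sketch of this identification step under strong assumptions: the sun angle correction is reduced to a single multiplicative factor, and the learning model 126 is stood in for by any classifier with a scikit-learn-style predict() method. The disclosure leaves both the correction formula and the model type open.

```python
import numpy as np

def identify_crop_state(wide_image, sun_angle_deg, reference_angle_deg, model):
    """wide_image: (H, W, n_bands) wide-area spectral image.
    Returns an (H, W) array of per-pixel crop state labels."""
    # Hypothetical correction: scale reflectance by the ratio of sun elevations.
    factor = np.sin(np.radians(reference_angle_deg)) / np.sin(np.radians(sun_angle_deg))
    corrected = wide_image * factor
    pixels = corrected.reshape(-1, corrected.shape[-1])  # (n_pixels, n_bands)
    labels = model.predict(pixels)                       # per-pixel inference
    return labels.reshape(corrected.shape[:-1])
```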
Next, the type identification unit 138 displays the identification result on a display, one example of the output device 111. There is no particular restriction on the display mode used by the type identification unit 138; one example is to refer to a map database, display map data of the area under observation, and superimpose on this map data the areas in which crops are in the specific state.
In addition, the type identification unit 138 may accept a time axis designation from the operator of the information processing device 10 and display how the area in which crops are in the specific state spreads along that time axis. Furthermore, the type identification unit 138 may display on the display the fact that the crops are in the specific state together with an estimate of the cost required to remedy that state (for example, felling the crops in the specific state or spraying agrochemicals), and may calculate and display the expected amount of damage.
<Data structure>
FIG. 13 is a diagram showing the data structure of the first imaging condition DB 3101 stored in the storage unit 310 of the satellite 30.
The first imaging condition DB 3101 is a table keyed by an imaging condition ID, which identifies an imaging condition for the spectral camera 32 of the satellite 30, with columns for imaging start time, imaging end time, and wavelength condition.
"Imaging condition ID" is information identifying an imaging condition for the spectral camera 32 of the satellite 30. "Imaging start time" is the time at which imaging by the spectral camera 32 of the satellite 30 starts, and "imaging end time" the time at which it ends. "Wavelength condition" is information on the wavelength conditions under which the spectral camera 32 of the satellite 30 images. The spectral camera 32 mounted on the satellite 30 can image in multiple wavelength bands (and can therefore generate spectral images for multiple wavelength bands); in the example shown in FIG. 13, the wavelength condition stores the range of wavelength bands and the wavelength step by which the band is changed.
Each column of the first imaging condition DB 3101 is generated by an information processing device, including the information processing device 10, and stored in the storage unit 310 by being transmitted to the satellite 30.
FIG. 14 is a diagram showing the data structure of the image DBs 2103, 3102, and 123 stored in the storage unit 210 of the drone 20, the storage unit 310 of the satellite 30, and the storage unit 120 of the information processing device 10, respectively.
The image DBs 2103, 3102, and 123 are tables keyed by an image ID, which identifies a spectral image captured by the spectral cameras 22, 32 of the drone 20 and the satellite 30, with columns for image file name, imaging time, position information, altitude information, attitude information, and wavelength.
"Image ID" is information identifying a spectral image captured by the spectral cameras 22, 32 of the drone 20 and the satellite 30. "Image file name" is the file name of the spectral image 2104, 3103, 127 identified by the image ID and stored in the storage unit 210 of the drone 20, the storage unit 310 of the satellite 30, or the storage unit 120 of the information processing device 10. "Imaging time" is the time at which the spectral image 2104, 3103, 127 identified by the image ID was captured. "Position information", "altitude information", and "attitude information" are, respectively, the position, altitude, and attitude of the drone 20 or satellite 30 when that spectral image was captured. "Wavelength" indicates the wavelength band set on the spectral camera 22, 32 of the drone 20 or satellite 30 when that spectral image was captured.
Each column of the image DBs 2103, 3102, and 123 is generated by the spectral camera control devices 21, 31 of the drone 20 and the satellite 30 when the spectral images 2104, 3103 are produced by the spectral cameras 22, 32.
FIG. 15 is a diagram showing the data structure of the second imaging condition DB 2102 stored in the storage unit 210 of the drone 20.
The second imaging condition DB 2102 is a table keyed by an imaging condition ID, which identifies an imaging condition for the spectral camera 22 of the drone 20, with columns for imaging position, imaging altitude, and imaging attitude.
"Imaging condition ID" is information identifying an imaging condition for the spectral camera 22 of the drone 20. "Imaging position", "imaging altitude", and "imaging attitude" are, respectively, the position, altitude, and attitude information of the drone 20.
Each column of the second imaging condition DB 2102 is generated by an information processing device, including the information processing device 10, and stored in the storage unit 210 by being transmitted to the drone 20.
FIG. 16 is a diagram showing the data structure of the flight plan DBs 122 and 2101 stored in the storage unit 120 of the information processing device 10 and the storage unit 210 of the drone 20.
The flight plan DBs 122, 2101 are tables keyed by a flight plan ID, which identifies a flight plan for the drone 20, with columns for elapsed time, flight position, flight altitude, and flight attitude.
"Flight plan ID" is information identifying a flight plan for the drone 20, more specifically a target the drone 20 is to reach. "Elapsed time" is the time from the start of the drone 20's flight by which the drone 20 should reach the target identified by the flight plan ID. "Flight position", "flight altitude", and "flight attitude" are, respectively, the target position, target altitude, and target attitude that the drone 20 should attain at the target identified by the flight plan ID.
Each column of the flight plan DBs 122, 2101 is generated by the flight plan creation unit 134 of the control unit 130 of the information processing device 10, stored in the storage unit 120, and sent to the drone 20; the spectral camera control device 21 of the drone 20 stores the received flight plan DB 122, 2101 in its storage unit.
FIG. 17 is a diagram showing the data structure of the wide-area image DB 124 stored in the storage unit 120 of the information processing device 10.
The wide-area image DB 124 is a table keyed by a wide-area image ID, which identifies a wide-area spectral image 128 stored in the storage unit 120 of the information processing device 10, with columns for wide-area image file name, imaging time, area information, sun angle, position information, altitude information, attitude information, wavelength, and image ID.
"Wide-area image ID" is information identifying a wide-area spectral image 128 stored in the storage unit 120 of the information processing device 10. "Wide-area image file name" is the file name of the wide-area spectral image 128 identified by the wide-area image ID and stored in the storage unit 120 of the information processing device 10.
"Imaging time" is the time at which the wide-area spectral image 128 identified by the wide-area image ID was captured. Because the wide-area spectral image 128 is a composite of multiple spectral images 127, the individual spectral images 127 strictly have different imaging times. When compositing multiple spectral images 127 into a wide-area spectral image 128, the image synthesis unit 136 therefore uses the imaging time of one of the source spectral images 127 as the representative imaging time of the wide-area spectral image 128. The same is done for "sun angle", "position information", "altitude information", and "attitude information".
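A small sketch of this representative-value policy, assuming each source image carries a metadata dictionary; taking the first record is an illustrative choice, since the text does not say which source image is used.

```python
def representative_metadata(component_records):
    """Use one source image's metadata to stand in for the whole mosaic."""
    rep = component_records[0]  # assumed policy: take the first component
    keys = ("imaging_time", "sun_angle", "position", "altitude", "attitude")
    return {key: rep[key] for key in keys}
```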
"Area information" indicates the area in which the wide-area spectral image 128, identified by the wide-area image ID and stored in the storage unit 120 of the information processing device 10, was captured. "Sun angle" indicates the angle of the sun when that wide-area spectral image 128 was captured; at the initial start of operation of the system 1 and similar times, the "sun angle" field may be blank. "Position information" is the position of the drone 20 or satellite 30 when the spectral images 2104, 3103, 127 identified by the image IDs were captured. "Altitude information" and "attitude information" are the altitude and attitude of the drone 20 when the wide-area spectral image 128 identified by the wide-area image ID was captured. "Wavelength" indicates the wavelength band in which that wide-area spectral image 128 was captured. "Image ID" is information identifying the multiple spectral images 127 from which the wide-area spectral image 128 identified by the wide-area image ID was composited, and is shared with the image ID of the image DB 2103.
Each column of the wide-area image DB 124 is generated by the image synthesis unit 136 of the control unit 130 of the information processing device 10 and stored in the storage unit 120.
<Operation of system 1>
The processing of the system 1 is described below with reference to the flowcharts of FIGS. 18 to 21.
FIG. 18 is a flowchart showing the spectral image capture processing performed by the spectral camera control device 31 of the satellite 30.
First, the spectral camera control device 31 refers to the first imaging condition DB 3101 and waits for the imaging start time described there (S1800). When the imaging start time arrives (YES in S1800), the spectral camera control device 31 refers to the first imaging condition DB 3101, sets the wavelength band for imaging by the spectral camera 32 (S1801), has the spectral camera 32 image the ground surface in the set wavelength band, acquires the spectral image 3103 and stores it in the storage unit 310, and stores the imaging conditions in the image DB 3102 of the storage unit 310 (S1802).
Next, the spectral camera control device 31 refers to the first imaging condition DB 3101 and determines whether the imaging operation of S1802 has been completed for all wavelength bands (S1803). If imaging has finished in all wavelength bands (YES in S1803), the process advances to S1804; if some wavelength band has not yet been imaged (NO in S1803), the process returns to S1801, sets the next wavelength band, and continues from there.
In S1804, the spectral camera control device 31 refers to the first imaging condition DB 3101 and determines whether the imaging end time has been reached. If it has (YES in S1804), the process advances to S1805; if not (NO in S1804), the process returns to S1800 and waits for the next imaging start time.
In S1805, the spectral camera control device 31 determines whether the imaging operation has been completed for all imaging conditions stored in the first imaging condition DB 3101. If it has (YES in S1805), the spectral camera control device 31 transmits the spectral images 3103 and the image DB 3102 stored in the storage unit 310 to the information processing device 10 (S1806); if some imaging condition remains (NO in S1805), the process returns to S1800 and waits for the next imaging start time.
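Paraphrased as plain Python, the S1800 to S1806 control flow might look as follows. The conditions, camera, and storage objects stand in for the first imaging condition DB 3101, the spectral camera 32, and the storage unit 310; their interfaces are assumptions made for illustration only.

```python
import time
from datetime import datetime, timezone

def run_imaging_schedule(conditions, camera, storage):
    for cond in conditions:                                     # one imaging condition each
        while datetime.now(timezone.utc) < cond["start_time"]:  # S1800: wait for start time
            time.sleep(1.0)
        for band in cond["wavelength_bands"]:                   # S1801/S1803: every band
            camera.set_band(band)                               # S1801: set wavelength band
            image = camera.capture()                            # S1802: image the ground
            storage.save(image, band=band, time=datetime.now(timezone.utc))
        # S1804/S1805: the end time and remaining conditions are handled by the loops above
    storage.transmit_all()                                      # S1806: send images and DB
```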
Next, FIG. 19 is a flowchart showing the learning model generation processing performed by the learning model generation unit 133 of the control unit 130 of the information processing device 10.
First, the learning model generation unit 133 refers to the image DB 123 in the storage unit 120 and designates and selects the position (area, region) at which the spectral images 127 from which the learning model is to be generated were captured (S1900). Next, the learning model generation unit 133 designates and selects the wavelength band of capture in order to identify, among the spectral images 127 stored in the storage unit 120, the spectral images 127 from which the learning model is to be generated (S1901).
Next, the learning model generation unit 133 extracts from the storage unit 120 the spectral images 127 matching the conditions designated and selected in S1900 and S1901, and displays them on the display, one example of the output device 111 (S1902).
Further, the learning model generation unit 133 refers to the image DB 123, obtains from the "imaging time" item the time at which each spectral image 127 extracted in S1902 was captured, and calculates from this imaging time the angle between the ground surface and the sun (the sun angle) at the moment of capture. The learning model generation unit 133 then calculates a correction formula (or a correction value) giving the spectrum the spectral image 127 would have had if captured at a specific time (for example, 12:00 noon), and corrects the spectrum of the spectral image 127 with this formula (S1903). This correction formula and the like are stored as part of the learning model 126.
After this, the learning model generation unit 133 accepts from the operator of the information processing device 10, via a pointing device such as a mouse (one example of the input device 110), the selection of the region to be learned within the spectral images 127 displayed on the display (S1904). The learning model generation unit 133 then accepts from the operator, again via the pointing device, the designation of which specific state the crops in the region designated in S1904 are in (S1905).
The learning model generation unit 133 then waits to receive from the operator of the information processing device 10, after the input designations of S1900 to S1905, an instruction that the various inputs for learning model generation are complete (S1906). When the instruction is received (YES in S1906), the learning model generation unit 133 generates the teacher data 125 based on the input designations of S1900 to S1905 and on the image data of the target spectral images 127 (S1907), and then generates the learning model 126 based on the teacher data 125 generated in S1907 (S1908). The learning model generation unit 133 stores the teacher data 125 and the learning model 126 generated in S1907 and S1908 in the storage unit 120.
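As a sketch of S1907 and S1908, any supervised learner can be substituted, since the text states that known model types apply; a random forest over per-pixel spectra is one assumed choice, not the method specified by this disclosure.

```python
from sklearn.ensemble import RandomForestClassifier

def train_learning_model(teacher_pixels, teacher_labels):
    """teacher_pixels: (n_samples, n_bands) corrected spectra from the
    operator-marked regions; teacher_labels: the crop states assigned to them."""
    model = RandomForestClassifier(n_estimators=100)
    model.fit(teacher_pixels, teacher_labels)
    return model
```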
Next, FIG. 20 is a flowchart showing the spectral image capture processing performed by the spectral camera control device 21 of the drone 20.
First, the spectral camera control device 21 flies the drone 20 so that it reaches the target positions in sequence, in accordance with the flight plan DB 2101 stored in the storage unit 210 (S2000).
Next, the spectral camera control device 21 refers to the second imaging condition DB 2102 stored in the storage unit 210 and waits for the drone 20 to reach an imaging position described there (S2001). When the drone 20 reaches the imaging position (YES in S2001), the spectral camera control device 21 has the spectral camera 22 image the area to be observed, stores the resulting spectral image 2104 in the storage unit 210, and stores the imaging conditions in the image DB 2103 of the storage unit 210 (S2002).
The spectral camera control device 21 then determines whether the imaging operation has been completed for all imaging positions stored in the second imaging condition DB 2102 (S2003). If it has (YES in S2003), the spectral camera control device 21 transmits the spectral images 2104 and the image DB 2103 stored in the storage unit 210 to the information processing device 10 (S2004); if some imaging position remains (NO in S2003), the process returns to S2001 and waits for the next imaging position to be reached.
After this, the spectral camera control device 21 ends the flight of the drone 20 (S2005).
FIG. 21 is a flowchart showing the processing by which the area identification unit 137 and the type identification unit 138 of the control unit 130 of the information processing device 10 identify areas in which crops are in a specific state.
First, the area identification unit 137 refers to the image DB 123 in the storage unit 120 and designates and selects the position (area, region) at which the spectral images 127 from which areas with crops in a specific state are to be identified were captured (S2100).
Next, the type identification unit 138 extracts from the storage unit 120 the spectral images 127 matching the conditions designated and selected in S2100, and displays them on the display, one example of the output device 111 (S2101).
Next, the image synthesis unit 136 generates a wide-area spectral image 128 from the spectral images 127 extracted in S2101 (S2102) and displays the generated wide-area spectral image 128 on a display or the like (S2103). The generation of the wide-area spectral image 128 by the image synthesis unit 136 may instead be performed before the processing shown in FIG. 21.
Next, the type identification unit 138 refers to the image DB 123, obtains from the "imaging time" item the times at which the spectral images 127 underlying the wide-area spectral image 128 were captured, and calculates from these imaging times the angle between the ground surface and the sun (the sun angle) at the moment of capture. Then, based on the correction formula or the like stored in the learning model 126 or the storage unit 120, the type identification unit 138 corrects the spectrum of each spectral image 127 to the spectrum it would have had if captured at a specific time (for example, 12:00 noon), thereby correcting the spectrum of the wide-area spectral image 128 as a whole (S2104).
Then, using the learning model 126 stored in the storage unit 120, the type identification unit 138 performs estimation processing (an inference operation) to determine whether crops in the observed area captured in the wide-area spectral image 128 are in a specific state and, if so, in which regions (S2105), and displays the inference result on a display or the like (S2106).
<Screen example>
FIG. 24 is a diagram showing an example of a screen displayed on the display, one example of the output device 111 of the information processing device 10, during the learning model generation processing of the learning model generation unit 133 shown in the flowchart of FIG. 19.
The screen 2400 shows a button 2401 for designating the spectral images 127 from which the learning model is to be generated; the operator of the information processing device 10 operates this button with a pointing device or the like, one example of the input device 110, to designate the spectral images 2402 to 2404. The designated spectral images 2402 to 2404 are displayed on the screen 2400 by wavelength band.
The operator of the information processing device 10 views these spectral images 2402 to 2404 and, based on previously acquired information about the specific state of the crops in this area and the areas where the crops are in that state, designates with a pointing device or the like (one example of the input device 110), on whichever of the spectral images 2402 to 2404 best shows that the crops are in the specific state (the spectral image 2403 at wavelength β in the example of FIG. 24), the area 2405 in which the crops are in the specific state. Next, the operator uses the pull-down menu 2406 to designate the concrete specific state of the crops in the area 2405. When the designation of the area 2405 and of the concrete crop state via the pull-down menu 2406 is finished, the operator clicks the OK button 2407 with the pointing device or the like to signal the end of the designation operation.
Next, FIG. 25 is a diagram showing an example of a screen displayed on the display, one example of the output device 111 of the information processing device 10, during the processing shown in the flowchart of FIG. 21 by which the area identification unit 137 and the type identification unit 138 identify areas in which crops are in a specific state.
A spectral image 2501 is displayed on the screen 2500, together with an area 2502 showing the date and time at which the spectral image 2501 was captured and the imaged region. Map data 2503 corresponding to the region in which the spectral image 2501 was captured is also displayed. When the operator of the information processing device 10 designates, via the pull-down menu 2504, the specific crop state to be identified, the spectral image 2501 shows the area 2505 estimated to be in the designated concrete state. This area 2505 is also the density distribution of the spectral image 2501.
 <Effects of the embodiment>
 As described above in detail, the system 1 of the present embodiment enables a business operator engaged in a harvesting business to manage the growing situation more appropriately.
 <Modifications>
 Note that the above embodiment describes its configurations in detail in order to explain the present disclosure clearly, and the disclosure is not necessarily limited to embodiments having all of the described configurations. Part of the configuration of each embodiment may also be added to, deleted from, or replaced with other configurations.
 As an example, in the system 1 of the embodiment described above, the spectral camera 32 mounted on the satellite 30 images the ground surface to acquire the spectral images 3103, and the spectral camera 22 mounted on the drone 20 images the observation target area to acquire the spectral images 2104. However, the mobile platform carrying the spectral camera is not limited to the satellite 30 or the drone 20; any mobile platform capable of imaging the ground surface from the air may be used. Furthermore, methods in which the platform carrying the spectral camera moves along the ground surface are also possible, for example an operator walking while holding a spectral camera, or an automobile equipped with a spectral camera driving over the ground. One example is a mode using a multispectral camera in which the spectroscopic terminal device disclosed in Japanese Patent No. 6342594 is realized by a smartphone.
 As described above, various combinations and variations exist for the method of acquiring the spectral images 3103 for library creation and the method of acquiring the spectral images 2104 by imaging the observation target area. Modifications along these lines are described in detail below.
 The LTCF camera, which is the spectral camera 32 mounted on the satellite 30, can by itself receive reflected light in many (as an example, several hundred) wavelength bands and acquire the spectral images 3103. In contrast, the spectral camera 22 mounted on the drone 20 is limited, based on the library, to the (plural) wavelength bands in which it can be detected that crops in the observation target area are in a specific state, and acquires the spectral images 2104 from reflected light in those wavelength bands.
 This difference in configuration between the spectral cameras 22 and 32 is also reflected in their weight and price: the spectral camera 32 is more expensive and heavier than the spectral camera 22. The weight difference also affects the configuration of the aircraft carrying the cameras. That is, if the spectral camera 32 were mounted on the drone 20, the drone 20 would have to be made larger, and its flight time per sortie could be limited to a short duration.
 On the other hand, if the spectral camera 32 is mounted on the satellite 30, the restriction imposed by the camera's weight is somewhat relaxed, but the frequency with which spectral images 3103 of the observation target area can be acquired is inevitably low. The satellite 30 may pass over the observation target area only about once every several months, so the spectral images 3103 may likewise be obtainable only about once every several months.
 For this reason, at the start of operation of the system 1, it is conceivable to limit the number of times the spectral camera 32 acquires spectral images 3103 of the observation target area, to fix the time of day at which the spectral camera 32 images the area (for example, 12:00 noon), and to create the library using the solar culmination altitude of the area at that time as the sun angle. In this case, no sun angle correction is performed when creating the library.
 Next, if the time at which the spectral camera 22 images the observation target area is matched to the time at which the spectral camera 32 imaged it (12:00 noon in the example above), it is possible to detect whether the crops in the area are in a specific state without correcting for solar altitude. Thereafter, the drone 20 or the like can be flown repeatedly while varying the imaging time, acquiring spectral images 2104 of the observation target area at multiple times of day; sun angle correction is then applied to these spectral images 2104 to detect whether the crops in the area are in a specific state. As described above, the sun angle correction values may be stored in the teacher data 125 and the learning model 126 retrained.
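 As a rough illustration of the fixed-time approach, the solar culmination (solar-noon) altitude used as the library's sun angle can be approximated from the date and the latitude alone. The following is a minimal sketch assuming the standard Cooper approximation for solar declination; the function names and the example location are illustrative, not from the patent.

```python
import math

def solar_declination_deg(day_of_year: int) -> float:
    """Approximate solar declination (degrees) for a given day of year."""
    return 23.44 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))

def culmination_elevation_deg(latitude_deg: float, day_of_year: int) -> float:
    """Solar elevation at local solar noon (degrees) for the given latitude."""
    dec = solar_declination_deg(day_of_year)
    return 90.0 - abs(latitude_deg - dec)

# Example: library imaging over a field at 35.0 deg N on day 152 (around June 1)
print(culmination_elevation_deg(35.0, 152))  # roughly 77 degrees
```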
 The merit of acquiring the spectral images 3103 for library creation with the spectral camera 32, an LTCF camera, is that spectral images 3103 covering the many wavelength bands described above can be obtained in a small number of imaging passes (in the extreme, a single pass). By acquiring spectral images 3103 from reflected light in many wavelength bands, the wavelength bands suitable for detecting whether the crops in the observation target area are in a specific state can be selected with high accuracy. This makes it possible to reduce the number of wavelength bands detected by the spectral camera 22 without lowering the detection accuracy. That in turn allows the spectral camera 22 to be lighter, allows a highly versatile aircraft such as the drone 20 to carry it, and allows the drone 20 or the like to be flown for long periods and at high frequency. If the drone 20 or the like can fly for long periods, the cost of acquiring the spectral images 2104 falls; if it can fly frequently, whether the crops in the observation target area are in a specific state can be detected early.
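 To illustrate how discriminative bands might be chosen from such a many-band library image, the sketch below scores each band by the separation between pixels labeled as being in the specific state and the remaining pixels, then keeps the top k bands. This is a minimal illustration, not the patent's prescribed algorithm; the cube layout and the Fisher-style separation score are assumptions.

```python
import numpy as np

def select_bands(cube: np.ndarray, state_mask: np.ndarray, k: int = 4) -> np.ndarray:
    """Pick the k bands that best separate 'specific state' pixels from the rest.

    cube: (bands, height, width) reflectance values from the library image.
    state_mask: (height, width) boolean mask annotated by the operator.
    Returns the indices of the k highest-scoring bands.
    """
    pos = cube[:, state_mask]        # (bands, n_pos) pixels in the state
    neg = cube[:, ~state_mask]       # (bands, n_neg) remaining pixels
    # Fisher-style separation: squared mean gap over pooled variance
    score = (pos.mean(axis=1) - neg.mean(axis=1)) ** 2 / (
        pos.var(axis=1) + neg.var(axis=1) + 1e-12
    )
    return np.argsort(score)[::-1][:k]

# Example with a synthetic 200-band cube
rng = np.random.default_rng(0)
cube = rng.random((200, 64, 64))
mask = np.zeros((64, 64), dtype=bool)
mask[20:40, 20:40] = True
cube[57, mask] += 0.5              # make band 57 artificially discriminative
print(select_bands(cube, mask))    # band 57 should rank first
```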
 Furthermore, by matching the acquisition time of the spectral images 2104 to that of the spectral images 3103 instead of correcting for sun angle when initially creating the library, the effort of starting up the system 1 can be saved and the lead time until the system 1 is deployed can be shortened. Then, by acquiring spectral images 2104 and 3103 at multiple times of day as the system 1 operates, the accuracy of detecting whether the crops are in a specific state can be improved.
 In addition, while the system 1 of the embodiment described above used spectral images to identify regions where the crops are in a specific state, the temperature distribution of the area may be used in addition to the spectral images. To do so, the temperature of the area is acquired with sensors or the like both when acquiring the spectral images that are the source of the library and when acquiring the spectral images of the observation target area. It is preferable to provide multiple temperature measurement points so that the temperature distribution has a two-dimensional spread. In this case, the library also includes the temperature distribution data of the area.
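 One plausible way to fold such a temperature distribution into the detection step is to interpolate the sparse point measurements onto the image grid and append the result as an extra feature channel. The sketch below assumes SciPy's griddata for the interpolation; the channel layout is an assumption, not the patent's specification.

```python
import numpy as np
from scipy.interpolate import griddata

def add_temperature_channel(spectral: np.ndarray,
                            sensor_xy: np.ndarray,
                            sensor_temp: np.ndarray) -> np.ndarray:
    """Append an interpolated temperature map to a (bands, H, W) spectral cube.

    sensor_xy: (n, 2) array of (x, y) pixel coordinates of the sensors.
    sensor_temp: (n,) temperature readings at those points.
    """
    _, h, w = spectral.shape
    gy, gx = np.mgrid[0:h, 0:w]
    # Interpolate sparse sensor readings onto the pixel grid (nearest fills edges)
    temp_map = griddata(sensor_xy, sensor_temp, (gx, gy), method="nearest")
    return np.concatenate([spectral, temp_map[None, :, :]], axis=0)
```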
 Furthermore, in the system 1 of the embodiment described above, the learning model 126 was used to identify the regions of the observation target area where the crops are in a specific state, but such regions may also be identified using so-called image analysis techniques.
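 As an example of such an image-analysis alternative, a simple band-ratio index can be thresholded to produce a candidate region mask without any learned model. The index, threshold, and band roles below are illustrative assumptions, not the patent's method.

```python
import numpy as np

def state_mask_by_ratio(band_a: np.ndarray, band_b: np.ndarray,
                        threshold: float = 0.2) -> np.ndarray:
    """Normalized-difference index between two bands, thresholded to a mask."""
    index = (band_a - band_b) / (band_a + band_b + 1e-12)
    return index > threshold
```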
 Furthermore, the following configuration is also possible in the system 1 according to the present disclosure.
 When storing in memory the information for detecting that the crops under observation are in a specific state, for comparison with spectral images taken by a multispectral camera, an aircraft equipped with a first multispectral camera is flown to photograph the ground surface and generate spectral images.
 Next, an aircraft equipped with a second multispectral camera, different from the first multispectral camera, is flown over the observation target area in accordance with an operation plan, and spectral images are generated by having the second multispectral camera photograph at the wavelengths for detecting that the crops are in a specific state. Preferably, the aircraft carrying the first multispectral camera and the aircraft carrying the second multispectral camera are different aircraft.
 The first multispectral camera has a member that can switch the imaging wavelength, such as the liquid crystal tunable filter 34, and can generate spectral images for a large number of wavelengths. The second multispectral camera, by contrast, can generate spectral images at specific wavelengths. These specific wavelengths cannot be switched while the second multispectral camera is imaging the observation target area; that is, they are predetermined wavelengths.
 The second multispectral camera may lack a specific member that the first multispectral camera has for capturing spectral images. For example, unlike the first multispectral camera, which has a wavelength-switching member, the second multispectral camera may have no such member and may generate spectral images only at specific wavelengths. The second aircraft carrying the second multispectral camera may have a lower payload capacity than the first aircraft carrying the first multispectral camera. Accordingly, the first and second aircraft may differ in weight (the second aircraft having a lighter airframe than the first), and, based on this weight difference, they may differ in the airspace in which they can fly (the geographic range determined by latitude and longitude, and the altitude). The second aircraft may also carry a plurality of (for example, about four) second multispectral cameras, each generating spectral images at a different wavelength.
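 Where several fixed-wavelength cameras fly on one airframe, their per-band frames must be combined into a single multi-band image before detection. The following is a minimal sketch assuming the frames have already been geometrically registered to one another; in practice a registration step would precede the stacking.

```python
import numpy as np

def stack_band_frames(frames: dict[float, np.ndarray]) -> tuple[np.ndarray, list[float]]:
    """Stack aligned single-wavelength frames into a (bands, H, W) cube.

    frames: mapping from wavelength in nm to a (H, W) image from one camera.
    Returns the cube and the wavelength list in ascending order.
    """
    wavelengths = sorted(frames)
    cube = np.stack([frames[w] for w in wavelengths], axis=0)
    return cube, wavelengths
```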
 A first multispectral camera 32 capable of acquiring spectral images in many wavelength bands, such as the multispectral camera 32 equipped with the liquid crystal tunable filter 34, tends to be expensive and heavy. By contrast, the second multispectral camera 22, which can generate spectral images only at specific wavelengths, can be made inexpensive and lightweight.
 Since the first multispectral camera 32 generates the spectral images for library creation, it preferably generates spectral images from reflected light in many wavelength bands. The second multispectral camera 22, on the other hand, is used to identify the regions of the observation target area where the crops are in a specific state, and is preferably flown frequently on an aircraft such as the drone 20. There is therefore a large benefit in making the second multispectral camera 22 lighter and cheaper than the first multispectral camera 32.
 Each of the configurations, functions, processing units, processing means, and the like described above may be partly or wholly realized in hardware, for example by designing them as integrated circuits. The present invention can also be realized by software program code that implements the functions of the embodiments. In this case, a storage medium on which the program code is recorded is provided to a computer, and a processor of the computer reads the program code stored on the storage medium. The program code itself read from the storage medium then realizes the functions of the embodiments described above, and the program code itself and the storage medium storing it constitute the present invention. Storage media for supplying such program code include, for example, flexible disks, CD-ROMs, DVD-ROMs, hard disks, SSDs, optical disks, magneto-optical disks, CD-Rs, magnetic tape, nonvolatile memory cards, and ROM.
 The program code that implements the functions described in the embodiments can be written in a wide range of programming or scripting languages, for example assembler, C/C++, Perl, Shell, PHP, and Java (registered trademark).
 Furthermore, the software program code that implements the functions of the embodiments may be distributed via a network and stored in storage means such as a computer's hard disk or memory, or on a storage medium such as a CD-RW or CD-R, and a processor of the computer may read and execute the program code stored in that storage means or storage medium.
 <Additional notes>
 The matters explained in each of the above embodiments are noted below.
 (Appendix 1)
 A method for operating a computer (10), the method causing a processor (101) of the computer (10) to execute: a first step (S1908) of storing in a memory (120) information (126) for detecting that a crop under observation is in a specific state, for comparison with spectral images (127) taken by a multispectral camera (22); a second step (S2002) of generating a spectral image (127) by having the multispectral camera (22) photograph the observation target area at wavelengths for detecting that the crop is in the specific state; a third step (S2105) of identifying, based on the photographed spectral image (127) and the information (126) for detecting that the crop is in the specific state, the areas of the observation target area where crops in the specific state are present; and a fourth step (S2106) of outputting information on the identified areas.
 (Appendix 2)
 The method according to Appendix 1, wherein, in the second step, an aircraft (20) equipped with the multispectral camera (22) is flown over the observation target area in accordance with an operation plan, and the spectral image (127) is generated by having the multispectral camera (22) photograph at the wavelengths for detecting that the crop is in the specific state.
 (Appendix 3)
 The method according to Appendix 1, wherein, in the first step (S1908), a trained model (126) is stored in the memory (120) as the information for detection, the trained model (126) being generated by machine learning using as teacher data a database that associates information indicating the area where the observed crop is grown, the spectral images (127) obtained by photographing that area with the multispectral camera (22) together with the wavelengths used for that photography, and information on the angle of the sun when the crop is photographed in that area; and, in the third step (S2105), the areas with crops in the specific state are identified based on the photographed spectral image (127) and the trained model (126).
 (Appendix 4)
 The method according to Appendix 3, wherein, in the first step (S1908), the trained model (126) stored in the memory (120) is generated by machine learning using as the teacher data (125) a database that further associates information on the temperature of the area when the area was imaged by the multispectral camera (22); and, in the third step, the areas with crops in the specific state are further identified based on the temperature of the observation target area when it was imaged by the multispectral camera (22).
 (Appendix 5)
 The method according to Appendix 3 or 4, wherein, in the first step (S1908), the database serving as the teacher data (125) further associates information on the type of the specific state, and the trained model (126) generated from the database including that information is stored in the memory (120); and, in the third step (S2105), the areas with crops in the specific state and the type of the specific state are identified based on the trained model (126).
 (Appendix 6)
 The method according to Appendix 2, wherein, in the second step (S2002), the operation plan causes a plurality of the aircraft (20) to fly over respectively different areas, and the spectral images (127) taken by the respective aircraft are superimposed based on the photographed areas to generate a spectral image (128).
 (Appendix 7)
 The method according to Appendix 2, wherein, in the second step (S2002), the operation plan specifies the wavelengths of the multispectral camera (22) for each observation target area according to the crop under observation, and the spectral images (127) are generated by the aircraft's multispectral camera (22) while the wavelengths are switched in each area according to the specification.
 (Appendix 8)
 The method according to Appendix 2, wherein, in the second step (S2002), when the terrain around the place where the crop is grown satisfies a predetermined condition, the operation plan gives priority to the areas satisfying that condition when having the aircraft's multispectral camera (22) photograph to generate the spectral images (127).
 (Appendix 9)
 The method according to any one of Appendices 1 to 8, wherein, in the third step (S2105), geographic ranges in which the crop is in the specific state are distinguished from those in which it is not, and the ranges in which the crop is in the specific state are displayed superimposed on a map image based on the locations at which the spectral images (127) were taken.
 (Appendix 10)
 The method according to any one of Appendices 1 to 9, wherein, in the third step (S2105), a designation of an imaging time point is accepted from a user, and the geographic range in which the crop is in the specific state is displayed based on the spectral image (127) taken at the designated time point.
 (Appendix 11)
 The method according to any one of Appendices 1 to 10, wherein, in the third step (S2105), based on the result of detecting that the crop is in the specific state, information on the crops in the specific state and information on the cost of dealing with the crop being in the specific state are displayed.
 (Appendix 12)
 The method according to any one of Appendices 2 to 8, wherein, in the first step (S1908), an aircraft (30) equipped with a multispectral camera (32) is flown to photograph the ground surface and, based on the obtained spectral images (127), information indicating the area where the observed crop is grown is stored in the memory (120) in association with the spectral images (127) obtained by photographing that area with the multispectral camera (32) and the wavelengths used for that photography; and the processor (101) of the computer (10) is further caused to execute a fifth step (S1904) of having a user view the stored spectral images (127) and accepting from the user, in association with a spectral image (127), a designation that the crop is in a specific state, and a sixth step (S1906) of storing in the memory (120), in accordance with the user's designation, a database (125) for detecting that the crop is in the specific state.
 (Appendix 13)
 The method according to Appendix 12, wherein the spectral images for creating the database are images of the same area taken for a first wavelength group consisting of a plurality of wavelengths, the spectral images in the second step are images of the same observation target area taken for a second wavelength group consisting of a plurality of wavelengths, and the number of wavelengths in the first wavelength group is greater than the number of wavelengths in the second wavelength group.
 (Appendix 14)
 The method according to Appendix 12 or 13, wherein, in the first step (S1908), information on the type of the specific state is further stored in the memory (120) in association with the information indicating the area and with the spectral images (127) and the wavelengths used for their photography.
 (Appendix 15)
 The method according to any one of Appendices 12 to 14, wherein the wavelengths for detecting from the spectral images (127) that the crop is in the specific state are identified based on the designation accepted from the user in the fifth step (S1904).
 (Appendix 16)
 The method according to any one of Appendices 12 to 15, wherein, in the first step (S1908), information on the temperature of the area when the area was imaged by the multispectral camera (23) is further stored in the memory in association with the other information.
 (Appendix 17)
 The method according to any one of Appendices 12 to 16, wherein, when the terrain around the place where the crop is grown satisfies a predetermined condition, the spectral images (127) of the areas satisfying that condition are extracted from the information stored in the memory (120), and the extracted spectral images (127) are presented to the user so that they can be designated.
 (Appendix 18)
 The method according to any one of Appendices 12 to 17, wherein, in response to accepting input of information about an area from the user, the spectral images (127) of the area concerned are extracted and the extracted spectral images (127) are presented to the user so that they can be designated.
 (Appendix 19)
 A program for causing a computer (10) to execute the method according to any one of Appendices 1 to 18.
 (Appendix 20)
 An information processing device that executes the method according to any one of Appendices 1 to 18.
 1... system; 2, 30... satellite; 3, 20... drone; 4, 10... information processing device; 101... processor; 102... main storage device; 103... auxiliary storage device; 120, 210, 310... storage unit; 121, 2100, 3100... application program; 122... flight plan DB; 123, 2103, 3102... image DB; 124... wide-area image DB; 125... teacher data; 126... learning model; 127, 2104, 3103... spectral image; 128... wide-area spectral image; 130, 211, 311... control unit; 133... learning model generation unit; 134... flight plan creation unit; 135... aircraft control unit; 136... image synthesis unit; 137... area identification unit; 138... type identification unit; 21, 31... spectral camera control device; 22, 32... spectral camera; N... network

Claims (20)

  1.  A method for operating a computer, the method causing a processor of the computer to execute:
     a first step of storing in a memory information for detecting that a crop under observation is in a specific state, for comparison with spectral images taken by a multispectral camera;
     a second step of generating the spectral image by having the multispectral camera photograph, in an observation target area, at wavelengths for detecting that the crop is in the specific state;
     a third step of identifying, based on the photographed spectral image and the information for detecting that the crop is in the specific state, the area of the observation target area where the crop in the specific state is present; and
     a fourth step of outputting information on the identified area.
  2.  The method according to claim 1, wherein, in the second step, an aircraft equipped with the multispectral camera is flown over the observation target area in accordance with an operation plan, and the spectral image is generated by having the multispectral camera photograph at the wavelengths for detecting that the crop is in the specific state.
  3.  The method according to claim 1, wherein, in the first step, the information for detection stored in the memory is a trained model generated by machine learning using as teacher data a database that associates: information indicating the area where the observed crop is grown; the spectral images obtained by photographing that area with the multispectral camera and the wavelengths used for that photography; and information on the angle of the sun when the crop is photographed in that area; and wherein, in the third step, the area with the crop in the specific state is identified based on the photographed spectral image and the trained model.
  4.  The method according to claim 3, wherein, in the first step, the trained model stored in the memory is generated by the machine learning using as the teacher data a database that further associates information on the temperature of the area when the area was imaged by the multispectral camera; and wherein, in the third step, the area with the crop in the specific state is further identified based on the temperature of the observation target area when it was imaged by the multispectral camera.
  5.  The method according to claim 3 or 4, wherein, in the first step, the database serving as the teacher data further associates information on the type of the specific state, and the trained model generated from the database including that information is stored in the memory; and wherein, in the third step, the area with the crop in the specific state and the type of the specific state are identified based on the trained model.
  6.  The method according to claim 2, wherein, in the second step, the operation plan causes a plurality of the aircraft to fly over respectively different areas, and the spectral images taken by the respective aircraft are superimposed based on the photographed areas to generate the spectral image.
  7.  The method according to claim 2, wherein, in the second step, the operation plan specifies the wavelengths of the multispectral camera for each observation target area according to the crop under observation, and the spectral images are generated by the aircraft's multispectral camera while the wavelengths are switched in each area according to the specification.
  8.  The method according to claim 2, wherein, in the second step, when the terrain around the place where the crop is grown satisfies a predetermined condition, the operation plan gives priority to the areas satisfying that condition when having the aircraft's multispectral camera photograph to generate the spectral images.
  9.  The method according to any one of claims 1 to 8, wherein, in the fourth step, geographic ranges in which the crop is in the specific state are distinguished from those in which it is not, and the ranges in which the crop is in the specific state are displayed superimposed on a map image based on the locations at which the spectral images were taken.
  10.  The method according to any one of claims 1 to 9, wherein, in the fourth step, a designation of an imaging time point is accepted from a user, and the geographic range in which the crop is in the specific state is displayed based on the spectral image taken at the designated time point.
  11.  The method according to any one of claims 1 to 10, wherein, in the fourth step, based on the result of detecting that the crop is in the specific state, information on the crop in the specific state and information on the cost of dealing with the crop being in the specific state are displayed.
  12.  The method according to any one of claims 2 to 8, wherein, in the first step:
     the aircraft equipped with the multispectral camera is flown to photograph the ground surface, and, based on the obtained spectral images, information indicating the area where the crop under observation is grown and the spectral images obtained by photographing that area with the multispectral camera, together with the wavelengths used for that photography, are stored in the memory in association with one another;
     and the processor of the computer is further caused to execute:
     a fifth step of having a user view the stored spectral images and accepting from the user, in association with a spectral image, a designation that the crop is in a specific state; and
     a sixth step of storing in the memory, in accordance with the user's designation, a database for detecting that the crop is in the specific state.
  13.  The method according to claim 12, wherein the spectral images for creating the database are images of the same area taken for a first wavelength group consisting of a plurality of wavelengths, the spectral images in the second step are images of the same observation target area taken for a second wavelength group consisting of a plurality of wavelengths, and the number of wavelengths in the first wavelength group is greater than the number of wavelengths in the second wavelength group.
  14.  The method according to claim 12 or 13, wherein, in the first step, information on the type of the specific state is further stored in the memory in association with the information indicating the area and with the spectral images and the wavelengths used for their photography.
  15.  The method according to any one of claims 12 to 14, wherein the wavelengths for detecting from the spectral images that the crop is in the specific state are identified based on the designation accepted from the user in the fifth step.
  16.  The method according to any one of claims 12 to 15, wherein, in the first step, information on the temperature of the area when the area was imaged by the multispectral camera is further stored in the memory in association with the other information.
  17.  The method according to any one of claims 12 to 16, wherein, when the terrain around the place where the crop is grown satisfies a predetermined condition, the spectral images of the areas satisfying that condition are extracted from the information stored in the memory, and the extracted spectral images are presented to the user so that they can be designated.
  18.  The method according to any one of claims 12 to 17, wherein, in response to accepting input of information about an area from the user, the spectral images of the area concerned are extracted and the extracted spectral images are presented to the user so that they can be designated.
  19.  A program for causing a computer to execute the method according to claim 1.
  20.  An information processing device that executes the method according to claim 1.
PCT/JP2023/014996 2022-05-31 2023-04-13 Method, program, and information processing device WO2023233832A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022088324A JP2023176173A (en) 2022-05-31 2022-05-31 Method, program, and information processing device
JP2022-088324 2022-05-31

Publications (1)

Publication Number Publication Date
WO2023233832A1 true WO2023233832A1 (en) 2023-12-07

Family

ID=89026219

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/014996 WO2023233832A1 (en) 2022-05-31 2023-04-13 Method, program, and information processing device

Country Status (2)

Country Link
JP (2) JP2023176173A (en)
WO (1) WO2023233832A1 (en)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7076969B2 (en) * 2017-09-06 2022-05-30 株式会社トプコン Fertilization equipment, fertilization methods and programs

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006250827A (en) * 2005-03-11 2006-09-21 Pasuko:Kk Analytical method for growth condition of crop
WO2017179378A1 (en) * 2016-04-14 2017-10-19 国立大学法人北海道大学 Spectral camera control device, spectral camera control program, spectral camera control system, aircraft equipped with said system, and spectral image capturing method
JP2022510487A (en) * 2018-12-10 2022-01-26 ザ、クライメイト、コーポレーション Cartography of field anomalies using digital images and machine learning models

Also Published As

Publication number Publication date
JP2023177352A (en) 2023-12-13
JP2023176173A (en) 2023-12-13

Similar Documents

Publication Publication Date Title
Shi et al. Unmanned aerial vehicles for high-throughput phenotyping and agronomic research
KR101793509B1 (en) Remote observation method and system by calculating automatic route of unmanned aerial vehicle for monitoring crops
Pajares Overview and current status of remote sensing applications based on unmanned aerial vehicles (UAVs)
De Oca et al. Low-cost multispectral imaging system for crop monitoring
US11542035B2 (en) Spectral camera control device, spectral camera control system, storage medium storing spectral camera control program, and network system for distributing spectral camera control program
EP3697686B1 (en) Unmanned aerial vehicle for agricultural field assessment
Laliberte et al. Image processing and classification procedures for analysis of sub-decimeter imagery acquired with an unmanned aircraft over arid rangelands
US20140163781A1 (en) Tree Metrology System
JP6390054B2 (en) Monitoring system
Honrado et al. UAV imaging with low-cost multispectral imaging system for precision agriculture applications
de Oca et al. The AgriQ: A low-cost unmanned aerial system for precision agriculture
JP6914874B2 (en) Flight route generator and flight route generation method
Zainuddin et al. Verification test on ability to use low-cost UAV for quantifying tree height
Heaphy et al. UAVs for data collection-plugging the gap
Belton et al. Crop height monitoring using a consumer-grade camera and UAV technology
WO2023233832A1 (en) Method, program, and information processing device
Wijesingha Geometric quality assessment of multi-rotor unmanned aerial vehicle borne remote sensing products for precision agriculture
Miura et al. Estimation of canopy height and biomass of Miscanthus sinensis in semi-natural grassland using time-series UAV data
Sugiura et al. Development of high-throughput field phenotyping system using imagery from unmanned aerial vehicle
Sevilmiş et al. ENS 491-492–Graduation Project
Madsen et al. RoboWeedSupport-Semi-Automated Unmanned Aerial System for Cost Efficient High Resolution in Sub-Millimeter Scale Acquisition of Weed Images
Trolove et al. Comparison of four off-the-shelf unmanned aerial vehicles (UAVs) and two photogrammetry programmes for monitoring pasture and cropping field trials
Ivošević et al. A drone view for agriculture
US20210072084A1 (en) Computer storage medium, network system for distributing spectral camera control program and spectral image capturing method using spectral camera control device
Yamamoto et al. Onion Bulb Counting in a Large-scale Field Using a Drone with Real-Time Kinematic Global Navigation Satellite System

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23815594

Country of ref document: EP

Kind code of ref document: A1