WO2019235240A1 - Information processing device - Google Patents
Information processing device
- Publication number
- WO2019235240A1 (PCT/JP2019/020508)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- index
- area
- image
- sunny
- unit
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01G—HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
- A01G7/00—Botany in general
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/27—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection ; circuits for computing concentration
- the present invention relates to a technology that supports the determination of the work content related to crops.
- Patent Document 1 discloses a technique for supporting farm work, including fertilization management such as determining the amount of fertilizer, based on observed data such as converted leaf color values calculated from image data obtained by photographing crops.
- an index indicating the growth state (for example, NDVI) is obtained using the output of a sensor (an image sensor or the like) that measures the amount of light reflected from the crops in a field, and is used as a guide for the timing of work.
- an object of this invention is to support appropriate judgment of the growth status in a field where sunlit and shaded areas are mixed.
- to this end, the present invention provides an information processing apparatus including: a determination unit that determines a sunny area of a field included in a captured field image; an index acquisition unit that acquires, as a sunny index, an index indicating the growth status of the crop in the determined sunny area; and an output unit that outputs the acquired sunny index as an index of the sunny area.
- the figure showing the overall configuration of the agricultural support system according to an embodiment
- the figure showing the hardware configuration of the server device
- the figure showing the hardware configuration of the drone
- the figure showing the functional configuration realized by the agricultural support system
- the figure showing an example of the method of photographing a field
- the figure showing an example of the determination result of the sunny area
- the figure showing an example of the NDVI map in pixel units
- the figure showing an example of the NDVI map in area units
- the figure showing an example of the growth information search screen
- the figure showing the functional configuration realized by a modification
- the figure showing an example of the index correction image
- the figure showing the functional configuration realized by another modification
- the figure showing an example of the input screen for shooting conditions
- FIG. 1 shows the overall configuration of the agricultural support system 1 according to an embodiment.
- the agricultural support system 1 is a system that supports a person who performs work in a farm (a place where crops such as rice, vegetables, and fruits are grown) by using an index that represents the growth status of the crop.
- the index representing the growth status represents one or both of the progress of the growing stage of the crop (for example, whether or not it is ready for harvesting) and its condition (also called activity), such as the presence or absence of disease.
- in this embodiment, NDVI (Normalized Difference Vegetation Index) is used as such an index.
- in this embodiment, the index representing the growth status of the crop in the field is calculated using an image of the field taken from above by a flying object.
- the flying body may be anything as long as it can photograph the field, and a drone is used in this embodiment.
- the agricultural support system 1 includes a network 2, a server device 10, a drone 20, and a user terminal 30.
- the network 2 is a communication system including a mobile communication network and the Internet, and relays data exchange between the devices that access it.
- the server device 10 is connected to the network 2 by wired communication (wireless communication is also possible), and the drone 20 and the user terminal 30 are connected by wireless communication (the user terminal 30 may use wired communication).
- the user terminal 30 is a terminal used by a user of the system (for example, a worker who performs work in a farm), and is, for example, a smartphone, a laptop computer, or a tablet terminal.
- the drone 20 is a rotorcraft type flying body that includes one or more rotor blades and flies by rotating the rotor blades.
- the drone 20 includes a photographing unit that photographs a farm field from above while flying.
- the drone 20 is carried to the field by a farm worker who is a user of the agricultural support system 1, for example, and begins flying and photographing when the worker performs an operation to start a shooting flight.
- the server device 10 is an information processing device that performs processing related to worker support.
- the server device 10 performs, for example, a process of calculating the above-described NDVI from the field image captured by the drone 20.
- NDVI exploits the property that the green leaves of plants absorb much red visible light and reflect much light in the near-infrared region (0.7 μm to 2.5 μm), and is represented by (IR − R) / (IR + R), where R is the red pixel value and IR is the near-infrared pixel value. The worker can determine the timing of watering, fertilizer application, pesticide application, and so on for the crops in the field where he or she works by referring to the growth status represented by the NDVI calculation results displayed on the user terminal 30.
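- as a minimal illustrative sketch (not part of the patent), the per-pixel NDVI computation could look like this in Python; the function name and array inputs are assumptions:

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Per-pixel NDVI = (IR - R) / (IR + R)."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    denom = nir + red
    # Guard against division by zero where no light was measured.
    return np.where(denom > 0, (nir - red) / denom, 0.0)
```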
- FIG. 2 shows the hardware configuration of the server device 10 and the user terminal 30.
- Each of the server device 10 and the user terminal 30 is a computer including each device such as a processor 11, a memory 12, a storage 13, a communication device 14, an input device 15, an output device 16, and a bus 17.
- the term “apparatus” here can be read as a circuit, a device, a unit, or the like. Each device may include one or a plurality of devices, or some of the devices may not be included.
- the processor 11 controls the entire computer by operating an operating system, for example.
- the processor 11 may include a central processing unit (CPU) including an interface with peripheral devices, a control device, an arithmetic device, a register, and the like. Further, the processor 11 reads programs (program codes), software modules, data, and the like from the storage 13 and / or the communication device 14 to the memory 12, and executes various processes according to these.
- the number of processors 11 that execute various processes may be one, two or more, and the two or more processors 11 may execute various processes simultaneously or sequentially. Further, the processor 11 may be implemented by one or more chips.
- the program may be transmitted from the network via a telecommunication line.
- the memory 12 is a computer-readable recording medium, and may include, for example, at least one of ROM (Read Only Memory), EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), and RAM (Random Access Memory).
- the memory 12 may be called a register, a cache, a main memory (main storage device), or the like.
- the memory 12 can store the above-described program (program code), software module, data, and the like.
- the storage 13 is a computer-readable recording medium, and may include, for example, at least one of an optical disk such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disk, a digital versatile disk, or a Blu-ray (registered trademark) disk), a smart card, flash memory (for example, a card, stick, or key drive), a floppy (registered trademark) disk, and a magnetic strip.
- the storage 13 may be called an auxiliary storage device.
- the above-described storage medium may be, for example, a database including the memory 12 and / or the storage 13, a server, or other suitable medium.
- the communication device 14 is hardware (transmission / reception device) for performing communication between computers via a wired and / or wireless network, and is also referred to as, for example, a network device, a network controller, a network card, or a communication module.
- the input device 15 is an input device (for example, a keyboard, a mouse, a microphone, a switch, a button, a sensor, etc.) that accepts an input from the outside.
- the output device 16 is an output device (for example, a display, a speaker, or the like) that performs output to the outside. Note that the input device 15 and the output device 16 may have an integrated configuration (for example, a touch screen).
- the devices such as the processor 11 and the memory 12 are accessible to each other via a bus 17 for communicating information.
- the bus 17 may be composed of a single bus or may be composed of different buses between devices.
- FIG. 3 shows the hardware configuration of the drone 20.
- the drone 20 is a computer that includes a processor 21, a memory 22, a storage 23, a communication device 24, a flying device 25, a sensor device 26, a photographing device 27, and a bus 28.
- the term “apparatus” here can be read as a circuit, a device, a unit, or the like. Each device may include one or a plurality of devices, or some of the devices may not be included.
- the processor 21, the memory 22, the storage 23, the communication device 24, and the bus 28 are the same type of hardware as the device of the same name shown in FIG. 2 (performance and specifications may be different).
- the communication device 24 can also perform wireless communication between drones in addition to wireless communication with the network 2.
- the flying device 25 is a device that includes a motor, a rotor, and the like and causes the aircraft to fly. The flying device 25 can move the aircraft in all directions in the air, or can stop (hover) the aircraft.
- the sensor device 26 is a device having a sensor group that acquires information necessary for flight control.
- the sensor device 26 includes a position sensor that measures the position (latitude and longitude) of the drone, a direction sensor that measures the direction the drone is facing (the drone has a defined front, and the direction that front faces is measured), an altitude sensor that measures the altitude of the drone, a velocity sensor that measures the velocity of the drone, and an inertial measurement unit (IMU) that measures triaxial angular velocity and acceleration in three directions.
- the photographing device 27 is a so-called digital camera that has a lens, an image sensor, and the like and records an image photographed by the image sensor as digital data.
- This image sensor has sensitivity not only to visible light but also to light having a wavelength in the near infrared region necessary for calculating NDVI.
- the photographing device 27 is attached to the lower part of the casing of the drone 20, has a fixed photographing direction, and photographs the area vertically below the drone during flight.
- the photographing device 27 has an autofocus function, and can automatically focus and photograph even if the flight altitude changes.
- the server device 10 and the drone 20 may be configured to include hardware such as a microprocessor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a programmable logic device (PLD), and a field programmable gate array (FPGA), and part or all of each functional block may be realized by such hardware. For example, the processor 11 may be implemented by at least one of these pieces of hardware.
- the server device 10 and the drone 20 included in the agricultural support system 1 store programs provided by this system, and the function group described below is realized by the processor of each device executing the program and controlling each unit.
- FIG. 4 shows a functional configuration realized by the agricultural support system 1.
- the server device 10 includes an agricultural field image generation unit 101, a sunny area determination unit 102, an index calculation unit 103, a growth information generation unit 104, and a growth information recording unit 105.
- the drone 20 includes a flight control unit 201, a flight unit 202, a sensor measurement unit 203, and an imaging unit 204.
- the flight control unit 201 controls the flight of the own aircraft when photographing a farm field.
- the flight control unit 201 stores field range information indicating the geographical range of the field (for example, latitude and longitude information indicating the outer edge of the field) registered in advance by the farmer who is the user, and, based on this information, controls the drone to fly along a flight path that covers the entire field at a constant altitude.
- the flight path in this case is, for example, a path that traces a wavy locus across a rectangular field from one side to the opposite side.
- alternatively, it may be a route that flies along the outer edge of the field and, after each lap, shifts inward to draw a spiral trajectory; any route that covers the field is acceptable. A sketch of generating such a wavy path is given below.
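- a small illustrative sketch (not from the patent) of generating waypoints for such a wavy path over a rectangular field; the coordinate frame and parameter names are assumptions:

```python
def lawnmower_waypoints(width_m: float, length_m: float,
                        spacing_m: float) -> list:
    """Waypoints for a back-and-forth (wavy) path over a rectangular
    field in field-local metres; `spacing_m` is the distance between
    passes, chosen so that neighbouring imaging ranges overlap."""
    waypoints = []
    x, forward = 0.0, True
    while x <= width_m:
        y0, y1 = (0.0, length_m) if forward else (length_m, 0.0)
        waypoints += [(x, y0), (x, y1)]  # fly one full pass
        forward = not forward            # turn back at the field edge
        x += spacing_m                   # shift for the next pass
    return waypoints
```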
- the flying unit 202 has a function of flying the drone.
- the flying unit 202 flies the drone by operating the motor, rotor, and the like included in the flying device 25.
- the sensor measurement unit 203 performs measurement with each sensor included in the sensor device 26 (the position, direction, altitude, speed, and inertial measurement sensors), repeatedly measuring the position, direction, altitude, speed, angular velocity, and acceleration of the drone at predetermined time intervals.
- the sensor measurement unit 203 supplies sensor information indicating the measured position, direction, altitude, speed, angular velocity, and acceleration to the flight control unit 201.
- the flight control unit 201 controls the flight unit 202 based on the supplied sensor information and causes the aircraft to fly along the above-described flight path.
- the sensor measurement unit 203 supplies sensor information indicating the measured position, direction, altitude, and speed to the imaging unit 204.
- the photographing unit 204 has a function of photographing a subject using the photographing device 27 and is an example of the “photographing unit” in the present invention.
- the imaging unit 204 captures the field as a subject.
- the imaging unit 204 captures an image of a field, and also captures an area where the crop is growing in the field (a crop area).
- each pixel of a still image captured by the imaging unit 204 is represented by pixel values (R, G, B) indicating visible red, green, and blue light and by a pixel value (IR) indicating light having a wavelength in the near-infrared region.
- the imaging unit 204 captures a plurality of still images based on the supplied sensor information so that all areas in the field are included.
- FIG. 5 shows an example of a method for photographing a farm field.
- FIG. 5 shows a path B1 when the drone 20 flies over the field A1 with a wavy locus.
- the imaging unit 204 calculates the imaging range on the ground (the field range included in the angle of view at 0 m above the ground) from the altitude indicated by the sensor information and the angle of view of the imaging device 27. From the speed and direction indicated by the sensor information, the imaging unit 204 then computes the ratio of the area where the current imaging range overlaps the previous one (for example, expressed as a percentage, with the area of one imaging range as 100%), and performs the next shot before this ratio falls below a threshold value.
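- a rough sketch of this footprint-and-overlap computation; all names and the flat-ground, straight-line-flight assumptions are illustrative, not from the patent:

```python
import math

def ground_footprint(altitude_m: float, fov_deg: float) -> float:
    """Ground length covered along one image axis by a camera
    pointing straight down."""
    return 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)

def overlap_ratio(altitude_m: float, fov_deg: float,
                  speed_mps: float, dt_s: float) -> float:
    """Fraction of the current shot overlapping the previous one,
    assuming level flight in a straight line between the shots."""
    footprint = ground_footprint(altitude_m, fov_deg)
    advance = speed_mps * dt_s  # ground distance flown since the last shot
    return max(0.0, 1.0 - advance / footprint)

# The next shot is triggered just before the overlap would fall
# below the threshold, e.g.:
# if overlap_ratio(alt, fov, v, elapsed) <= THRESHOLD: take_photo()
```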
- the photographing unit 204 first photographs the photographing region C1, and then photographs the photographing region C2 that slightly overlaps the photographing region C1.
- the imaging unit 204 notifies the flight control unit 201 of the calculated size of the imaging range when the drone 20 turns back.
- the flight control unit 201 folds the route, shifting it by a distance at which imaging ranges of the notified size overlap, as with the imaging areas C4 and C5 in FIG. 5.
- by repeating imaging in this way, the imaging unit 204 captures still images of the imaging regions C1 to C32 shown in FIG. 5, that is, a plurality of still images whose imaging ranges slightly overlap.
- in FIG. 5, the field A1 has a size and shape into which a whole number of imaging ranges fit, but this need not be the case; in other cases, every area in the field is covered by widening the overlap between shooting ranges or by shooting slightly beyond the field.
- the photographing method of the imaging unit 204 is not limited to this. For example, if the flight speed and altitude during shooting are fixed, the time interval at which imaging ranges overlap as in FIG. 5 can be calculated in advance, so shooting may simply be performed at that interval. If a map of the field and the shooting positions are determined in advance, the imaging unit 204 may shoot whenever the drone flies over a determined position. The imaging unit 204 may also capture moving images, whose data size is larger, as long as the storage capacity and communication speed of the drone are sufficient.
- a known method for photographing the ground using a drone may also be used.
- each unit of the drone 20 starts operating when the farmer performs the flight start operation described above.
- the drone 20 flies over the set flight path over the field, and the imaging unit 204 repeatedly performs imaging as described above.
- each time a photograph is taken, image data indicating the captured still image and the shooting information about the shot (the position, orientation, altitude, and time at which it was taken, and the angle of view of the photographing device 27) is transmitted to the server device 10.
- the farm field image generation unit 101 of the server device 10 receives the transmitted image data, and acquires a still image indicated by the image data as a farm field image captured by the drone 20.
- the farm field image generation unit 101 is an example of the “first image acquisition unit” in the present invention.
- the field image generation unit 101 also acquires the shooting information indicated by the received image data and generates an image of the entire field from the images of the individual shooting regions.
- the field image generation unit 101 calculates the overlapping part of each image using the position, orientation, altitude, and angle of view indicated by the acquired shooting information, and, for each overlapping part, adopts the pixels of one of the images, for example, to generate an image of the entire field.
- the field image generation unit 101 assigns a pixel ID to each pixel of the generated entire field image and supplies entire field image data indicating the pixel IDs and the entire field image to the sunny area determination unit 102 and the index calculation unit 103.
- the sunny area determination unit 102 determines the sunny area of the farm field included in the photographed field image.
- the sunny area determination unit 102 is an example of the “determination unit” in the present invention.
- the sunny area determination unit 102 determines the sunny area included in the field image captured by the imaging unit 204 of the drone 20 as described above.
- the sunny area determination unit 102 determines the sunny area based on the pixel value of each pixel of the entire field image indicated by the supplied entire field image data.
- the sunny area determination unit 102 calculates, for example, the HSV color space values (hue H: Hue, saturation S: Saturation, brightness V: Value) of each pixel from the R, G, and B pixel values.
- the sunny area determination unit 102 first determines, from the calculated HSV values, same-color object areas (for example, a crop area and a soil area) within which the differences in hue H and saturation S between pixels fall within a predetermined range.
- the sunny area determination unit 102 then extracts, within each same-color area so determined, the pixels at which the brightness V changes by a difference threshold or more.
- a pixel extracted in this way may indicate a boundary between sun and shade.
- within the same-color area, the side on which the brightness V is equal to or greater than a brightness threshold is determined to consist of sunny pixels, and that area is determined to be a sunny area; the sunny area determination unit 102 determines areas not determined to be sunny to be shaded areas.
- the sunny area determination unit 102 may determine the sunny and shaded areas using, for example, a difference threshold and a brightness threshold that depend on the hue H and saturation S of each area (for a leaf area, both thresholds are made larger than for a soil area).
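- a simplified sketch of this HSV-based sunny/shade separation (using matplotlib's RGB-to-HSV conversion; the thresholds, names, and single same-color area are illustrative assumptions):

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv

def sunny_mask(rgb: np.ndarray, hue: float, sat: float,
               hs_tol: float = 0.08, v_threshold: float = 0.5) -> np.ndarray:
    """Boolean mask of sunny pixels within one same-colour area.

    `rgb` is an (H, W, 3) float array in [0, 1]; (`hue`, `sat`)
    characterise the same-colour area (e.g. crop or soil), and
    `hs_tol`/`v_threshold` stand in for the difference and brightness
    thresholds described in the text."""
    hsv = rgb_to_hsv(rgb)
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    # Pixels of the same-colour area: hue and saturation close enough.
    same_colour = (np.abs(h - hue) < hs_tol) & (np.abs(s - sat) < hs_tol)
    # Within that area, bright pixels are taken to be in the sun;
    # the remaining pixels of the area would be treated as shaded.
    return same_colour & (v >= v_threshold)
```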
- FIG. 6 shows an example of the determination result of the sunny area.
- the shaded area G1 due to the forest R1 is shown on the southwest side, and the other areas are shown as the sunny area F1.
- having performed the determination as described above, the sunny area determination unit 102 generates sunny information indicating the pixel IDs of the pixels included in the areas determined to be sunny and shade information indicating the pixel IDs of the pixels included in the areas determined to be shaded.
- the sunny area determination unit 102 supplies the generated sunny information and shade information to the growth information generation unit 104 together with the entire field image data.
- the index calculation unit 103 calculates an index representing the growth status of the crop shown in the image from the field image acquired by the field image generation unit 101.
- the index calculation unit 103 is an example of the “calculation unit” in the present invention.
- the index calculation unit 103 calculates the above-described NDVI as an index indicating the growth status.
- the image capturing unit 204 of the drone 20 captures an image of the farm field, and the index calculation unit 103 calculates NDVI as described above, thereby measuring the state of crop cultivation in the field.
- the index calculation unit 103 generates an NDVI map in pixel units representing the NDVI at the position in the field corresponding to each pixel.
- FIG. 7 shows an example of an NDVI map in pixel units. In the example of FIG. 7, the NDVI map M1 in pixel units of the field A1 shown in FIG. 5 is represented.
- the index calculation unit 103 supplies the NDVI map M1 representing the calculated NDVI to the growth information generation unit 104 together with the entire field image data as index information representing the calculated crop growth status.
- the growth information generation unit 104 generates growth information indicating the growth status of the crop in the field using the supplied sunny information, shade information, index information (for example, the NDVI map M1) and the entire field image data.
- the growth information generation unit 104 generates, in particular, sunny growth information indicating the growth status of the crop in the sun and shade growth information indicating the growth status of the crop in the shade.
- the growth information generation unit 104 generates such information as follows, for example.
- the growth information generation unit 104 acquires, as a sunny index, the NDVI in the sunny area determined by the sunny area determination unit 102 (the NDVI at each position included in the sunny area) among the NDVIs of the pixels indicated by the NDVI map M1 supplied from the index calculation unit 103.
- a position here may be represented by a single pixel or by a plurality of pixels of the entire field image, and in either case denotes a region having a certain extent.
- the sunny index refers to an index representing the growth status of a crop measured from an image of the crop in the sunny area.
- specifically, the growth information generation unit 104 acquires, as the sunny index, the NDVIs associated with the pixel IDs indicated by the sunny information among the NDVIs indicated by the index information.
- the growth information generation unit 104 also acquires, as a shade index, the index of the area not determined to be a sunny area (the index at each position not determined to be in a sunny area).
- the shade index refers to an index representing the growth status of a crop measured from a crop image in the shaded area.
- the growth information generation unit 104 acquires, as a shade index, the NDVI associated with the pixel ID indicated by the shade information among the NDVIs indicated by the index information.
- the growth information generation unit 104 is an example of the “index acquisition unit” in the present invention.
- in other words, the growth information generation unit 104 acquires, as the sunny index, the NDVI calculated for the sunny area (the NDVI at positions included in the sunny area of the field) out of the NDVI the index calculation unit 103 calculated for the entire field image. Likewise, it acquires, as the shade index, the NDVI calculated for the shaded area (the NDVI at positions included in the shaded area of the field).
- the growth information generation unit 104 generates an NDVI map for each area representing the growth status of the crop for each of a plurality of areas that divide the field A1, for each of the sunny area and the shaded area, using the acquired sunny index and shade index.
- FIGS. 8A and 8B show examples of NDVI maps in units of areas. FIG. 8A shows the NDVI map Mf1 in area units for the sunny area F1 shown in FIG. 6, and FIG. 8B shows the NDVI map Mg1 in area units for the shaded area G1 shown in FIG. 6.
- each segmented area is represented by an eight-level pattern (Lv1 is the smallest and Lv8 is the largest) according to the average value of NDVI.
- the NDVI map Mf1 indicates that the NDVI is larger, and the growth better, toward the northeast side of the field A1; the NDVI map Mg1 shows the same tendency. In the NDVI map Mg1, however, the NDVI is smaller even in segmented areas representing the same shooting areas as in the NDVI map Mf1, because those areas are shaded.
- the NDVI of the segmented area Hc14 representing the imaging area C14 is Lv6 (third from the largest) in the NDVI map Mf1, but Lv4 (fifth from the largest) in the NDVI map Mg1.
- the NDVI of the segmented area Hc19 representing the imaging area C19 is Lv7 (second from the largest) in the NDVI map Mf1, but is Lv5 (fourth from the largest) in the NDVI map Mg1.
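- a sketch of building such an area-unit map, averaging per-pixel NDVI over the masked (sunny or shaded) pixels of each segment and binning into eight levels; the fixed square segment size is an assumption for illustration (the embodiment segments by shooting range):

```python
import numpy as np

def area_ndvi_levels(ndvi: np.ndarray, mask: np.ndarray,
                     block: int = 32, levels: int = 8) -> np.ndarray:
    """Average NDVI per segmented area over the masked pixels and bin
    the averages into `levels` steps (Lv1 smallest .. Lv8 largest)."""
    rows, cols = ndvi.shape[0] // block, ndvi.shape[1] // block
    out = np.zeros((rows, cols), dtype=int)  # 0 = no masked pixels
    for i in range(rows):
        for j in range(cols):
            sl = (slice(i * block, (i + 1) * block),
                  slice(j * block, (j + 1) * block))
            m = mask[sl]
            if m.any():
                mean = ndvi[sl][m].mean()
                # Map mean NDVI in [-1, 1] to a level in 1..levels.
                out[i, j] = min(levels, int((mean + 1.0) / 2.0 * levels) + 1)
    return out
```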
- the growth information generation unit 104 generates, as the sunny growth information described above, information in which the generated NDVI map Mf1 is associated with a symbol (a sunny identification symbol) indicating that it was generated from the acquired sunny index, together with the shooting information and shooting date and time of the underlying entire field image. Likewise, it generates, as the shade growth information described above, information in which the generated NDVI map Mg1 is associated with a symbol (a shade identification symbol) indicating that it was generated from the acquired shade index, together with the shooting information and shooting date and time of the underlying entire field image. The growth information generation unit 104 supplies the generated sunny growth information and shade growth information to the growth information recording unit 105.
- the growth information recording unit 105 records the sunny growth information and shade growth information generated by the growth information generation unit 104 as the growth information of the crop in the field (information indicating the growth status of the crop).
- the growth information recording unit 105 holds the recorded growth information in a state in which it can be browsed by the user (farmer), for example on a web page accessible via a URL (Uniform Resource Locator) transmitted to the user.
- the growth information recording unit 105 provides, for example, a screen for searching the held growth information.
- FIGS. 9A to 9D show an example of the growth information search screen.
- the user terminal 30 displays a growth information search screen including input fields for time, field name, and sunshine conditions.
- in FIG. 9A, the search conditions "sunny", "field A1", and "2017/5/15" have been entered.
- when the user presses the search button H1, the user terminal 30 transmits to the server device 10 request data requesting the sunny growth information generated from the captured image of the field A1 whose shooting date is "2017/5/15".
- the growth information recording unit 105 reads the requested sunny growth information from the recorded growth information, and transmits the read sunny growth information to the user terminal 30.
- the user terminal 30 displays the NDVI map Mf1 indicated by the transmitted sunny growth information as shown in FIG. 9B.
- when the shade condition is specified instead, the user terminal 30 transmits to the server device 10 request data requesting the shade growth information generated from the captured image of the field A1 whose shooting date is "2017/5/15".
- the growth information recording unit 105 reads the requested shade growth information from the recorded growth information, and transmits the read shade growth information to the user terminal 30.
- the user terminal 30 displays the NDVI map Mg1 indicated by the transmitted shade growth information as shown in FIG. 9D.
- the sunny growth information is information in which the NDVI map Mf1 representing the sunny index (NDVI of the sunny area) acquired by the growth information generating unit 104 is associated with the sunny identification symbol indicating that it is an index of the sunny area.
- by outputting (transmitting) the sunny growth information to the user terminal 30, the growth information recording unit 105 outputs the acquired sunny index in such a way that it can be identified as the index of the sunny area.
- the shade growth information is information in which the NDVI map Mg1 representing the shade index (the NDVI of the shaded area) acquired by the growth information generation unit 104 is associated with the shade identification symbol indicating that it is an index of the shaded area.
- by outputting (transmitting) the shade growth information to the user terminal 30, the growth information recording unit 105 outputs the acquired shade index in such a way that it can be identified as the index of the shaded area.
- the growth information recording unit 105 is an example of the “output unit” in the present invention.
- FIG. 10 shows an example of the operation procedure of each apparatus in the recording process. This operation procedure is started when a farmer who is a user takes the drone 20 to the field and performs an operation for starting a shooting flight. First, the drone 20 (flight control unit 201, flight unit 202, and sensor measurement unit 203) starts flying over the field based on the stored field range information (step S11).
- next, the drone 20 starts photographing each shooting region from above the field (step S12), and each time a photograph is taken, generates image data indicating the captured still image and the shooting information (the position, azimuth, and altitude at the time of shooting) and transmits it to the server device 10 (step S13).
- the server device 10 (field image generation unit 101) acquires the field image indicated by the transmitted image data (step S14).
- the server device 10 (the farm field image generation unit 101) generates an image of the entire farm field by combining the acquired farm field images (step S21). Subsequently, the server device 10 (the sunny area determination unit 102) determines the sunny area and the shade area of the field included in the generated image of the entire field (Step S22). Next, the server device 10 (index calculation unit 103) calculates an index (NDVI) indicating the growth status of the crop reflected in the image from the generated image of the entire field (step S23).
- steps S22 and S23 may be performed in the reverse order or in parallel.
- the server device 10 (growth information generation unit 104) generates sunny growth information using the acquired sunny index (step S24).
- the server device 10 (growth information generation unit 104) also generates shade growth information using the acquired shade index (step S25).
- the operations in steps S24 and S25 may be performed in the reverse order or in parallel.
- the server device 10 (growth information recording unit 105) records the sunny growth information and shade growth information generated in steps S24 and S25 as the growth information of the crop in the field (step S26). Then, when there is an access (request) from the user terminal 30, the server device 10 (growth information recording unit 105) outputs the growth information recorded in step S26 to the user terminal 30 (step S31).
- in the shaded area, the reflected light is weaker and the pixel values are smaller than in the sunny area, so even for the same pixel-value error, the resulting NDVI error is larger; the accuracy of NDVI therefore tends to be lower in the shaded area than in the sunny area, as the sketch below illustrates.
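- to make the error amplification concrete (illustrative numbers only): with the same starting NDVI of 0.6 and the same +5 error on the red pixel value, the shaded pixel's NDVI shifts almost five times as much:

```python
def ndvi(r: float, ir: float) -> float:
    return (ir - r) / (ir + r)

# Same absolute error (+5 on R) at sunny vs. shaded signal levels:
sunny_err = ndvi(55, 200) - ndvi(50, 200)  # ~ -0.031
shade_err = ndvi(15, 40) - ndvi(10, 40)    # ~ -0.145
```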
- in the present embodiment, by acquiring the NDVI of the sunny area separately, it is possible to support appropriate judgment of the growth status in a field where sun and shade are mixed, compared with the case where sunny and shaded areas are not distinguished.
- in the present embodiment, the NDVI of the shaded area is also acquired separately, and an NDVI map for the shaded area is also output.
- if shaded NDVI values were compared with sunny ones, the growth in the shade would always tend to be judged poor; by comparing NDVI values only within the shaded area, however, places with better and worse growth within the shade can be distinguished (for example, in the NDVI map Mg1 of FIG. 8B, the growth is better toward the northeast side).
- the modifications described below may be combined; where different modifications use different parameters to obtain a common value, that value may be obtained using those parameters together.
- in the embodiment, the growth information generation unit 104 acquires both the sunny index and the shade index, but the present invention is not limited to this; only the sunny index may be acquired.
- in that case, the growth information generation unit 104 generates only the NDVI map Mf1 representing the acquired sunny index, and outputs the sunny growth information indicating the NDVI map Mf1 to the user terminal 30. Even then, the NDVI within the sunny area alone can be compared, so appropriate judgment of the growth status of sunlit crops in a field where shade is mixed can still be supported.
- NDVI is calculated for each pixel of the entire field, but the present invention is not limited to this.
- the sunny area determination unit 102 determines only the sunny area and supplies the generated sunny information to the index calculation unit 103 together with the entire field image data.
- the index calculation unit 103 then calculates NDVI only for the pixels in the sunny area indicated by the supplied sunny information, and supplies a map representing the calculated NDVI of the sunny area, together with the entire field image data, to the growth information generation unit 104 as the index information.
- the growth information generation unit 104 acquires the NDVI supplied in this way, that is, the NDVI calculated only for the sunny area in the field image as the sunny index. Thereby, the load of the process (NDVI calculation process) of the server device 10 can be reduced as compared with the case where the NDVI of the shaded area is also calculated.
- the growth information generation unit 104 generates an NDVI map in units of areas using the area corresponding to the imaging range as the segmented area in the embodiment, but the segmented area is not limited to this.
- a plurality of shooting ranges may be used as one segmented region, or a region corresponding to a divided region obtained by dividing one shooting region into a plurality of segments may be used as the segmented region.
- the shape and size of the segmented regions may be uniform, but need not be.
- a rotary wing aircraft was used as a vehicle for autonomous flight, but this is not a limitation.
- it may be an airplane type aircraft or a helicopter type aircraft.
- the function of autonomous flight is not essential; as long as the aircraft can fly the assigned airspace in the assigned flight permission period, for example, a radio-controlled (wirelessly controlled) aircraft operated by a pilot from a remote location may be used.
- the determination of the sun area and the calculation of NDVI are performed based on the image taken by the drone 20 during the flight, but the present invention is not limited to this. These determinations and calculations may be performed based on, for example, an image manually captured by an operator using a digital camera, an image captured by a fixed digital camera installed on a farm field, or an image captured from a satellite.
- the NDVI is calculated using the measured value of the image sensor of the imaging device 27 of the drone 20, but the present invention is not limited to this.
- for example, NDVI may be calculated using the measured values of the infrared sensor of a handheld NDVI measuring instrument.
- a handheld instrument can calculate NDVI from the reflected light of the crop alone, so its accuracy tends to be high. Which method to use may be decided in consideration of labor, cost, and required accuracy.
- NDVI was used as an index indicating the growth status, but the present invention is not limited to this.
- for example, a leaf color value (a value indicating the color of a leaf), a planting rate (the occupation rate of the planted region per unit area), SPAD (chlorophyll content), plant height, the number of stems, or the like may be used.
- any value may be used as an index representing the growth status as long as it represents the growth status of the crop and can be calculated from the captured crop region image.
- the intensity of the reflected light from the crops that reaches the image sensor of the photographing device 27 can change not only with sun and shade but also with the surrounding environment, such as the seasonal intensity of sunlight, the amount of cloud, and atmospheric conditions. It may also change with the state of the lens of the photographing device 27, which is affected by temperature, humidity, and the like. A correction that removes the change in the index (NDVI) caused by such changes in the intensity of the reflected light reaching the image sensor may therefore be performed.
- FIG. 11 shows a functional configuration realized in this modification.
- a server device 10a including a corrected image acquisition unit 106 in addition to the units illustrated in FIG. 4 is illustrated.
- the corrected image acquisition unit 106 acquires an image for index correction taken by the drone 20.
- the corrected image acquisition unit 106 is an example of the “correction acquisition unit” in the present invention.
- the index correction image is, for example, an image obtained by photographing a panel having a plurality of regions in which the reflectance of light having a specific wavelength is known in advance.
- FIG. 12 shows an example of an index correction image.
- an image obtained by photographing the panel J1 in which the specific reflectance regions J11, J12, J13, and J14 are represented on the surface is represented as an index correction image.
- each specific reflectance region is a region on the panel whose reflectance of red light and infrared light changes stepwise (for example, the reflectances of red light and infrared light of J11, J12, J13, and J14 are both 20%, 40%, 60%, and 80%, respectively).
- the farmer causes the drone 20 to photograph the panel J1 before the operation for starting the photographing flight or after the completion of the photographing flight.
- the drone 20 may photograph the panel while flying, or the user may hold the drone up to photograph it.
- the imaging unit 204 of the drone 20 transmits image data indicating the captured image of the panel J1 to the server device 10a.
- the corrected image acquisition unit 106 acquires the image of the panel J1 indicated by the image data thus transmitted as an index correction image.
- the corrected image acquisition unit 106 supplies the acquired index correction image to the index calculation unit 103.
- the index calculation unit 103 calculates a corrected index based on the supplied image, that is, the index correction image acquired by the corrected image acquisition unit 106.
- the index calculation unit 103 reads out, as measured values, the red pixel values (r11, r12, r13, r14) and the near-infrared pixel values (ir11, ir12, ir13, ir14) of the specific reflectance regions J11, J12, J13, and J14 included in the acquired index correction image.
- the server device 10a stores, as reference values, the red pixel values (R11, R12, R13, R14) and the near-infrared pixel values (IR11, IR12, IR13, IR14) obtained when the specific reflectance regions J11, J12, J13, and J14 are photographed in an environment where NDVI can be measured satisfactorily. If the measured values differ from the reference values, the index calculation unit 103 determines a correction formula that maps the measured values to the reference values.
- having determined correction formulas for R and IR in this way, the index calculation unit 103 corrects the R and IR pixel values of each pixel of the field image using these formulas.
- the index calculation unit 103 uses the corrected R and IR pixel values of each pixel to calculate NDVI (corrected NDVI in this modification) as in the embodiment.
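- as an illustrative sketch (not the patent's own formula), the correction formula could be fitted as a least-squares line from measured to reference panel values and applied per pixel; all numbers below are placeholders:

```python
import numpy as np

# Measured panel values (r11..r14) and stored reference values (R11..R14);
# the numbers are placeholders for illustration.
measured_r = np.array([30.0, 60.0, 95.0, 120.0])
reference_r = np.array([40.0, 80.0, 120.0, 160.0])

# Straight line mapping measured values to reference values.
a, b = np.polyfit(measured_r, reference_r, 1)

def correct_red(pixels: np.ndarray) -> np.ndarray:
    """Apply the correction formula determined from the panel image."""
    return a * pixels + b

# The IR channel is corrected the same way, after which NDVI is
# computed from the corrected R and IR values as in the embodiment.
```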
- the correction of the pixel value may be performed by a function other than the index calculation unit 103.
- the pixel value of each pixel may be corrected when the agricultural field image generation unit 101 generates an entire agricultural field image.
- the NDVI correction method is not limited to the above-described method, and other known methods may be used.
- FIG. 13 shows a functional configuration realized in this modification.
- a server device 10b including a sunshine condition specifying unit 107 in addition to the units illustrated in FIG. 11 is illustrated.
- the sunshine condition specifying unit 107 specifies whether the shooting condition of the index correction image is sunny or shaded.
- the sunshine condition specifying unit 107 is an example of the “condition specifying unit” in the present invention.
- a screen for inputting shooting conditions is displayed on the user terminal 30.
- FIG. 14 shows an example of an imaging condition input screen.
- the user terminal 30 displays an agricultural support system screen including input fields for a shooting date, a field name, and a sunshine condition.
- in FIG. 14, the shooting conditions "2018/5/15", "field A1", and "shade" have been entered.
- the user terminal 30 transmits to the server device 10b shooting condition data indicating that the shooting date is "2018/5/15", the shooting place is "field A1", and the sunshine condition is "shade".
- the sunshine condition specifying unit 107 specifies the sunshine condition indicated by the transmitted shooting condition data ("shade" in the example of FIG. 14) as the shooting condition of the index correction image captured in the indicated field at the indicated date and time.
- the sunshine condition specifying unit 107 notifies the index calculation unit 103 of the specified shooting conditions together with the shooting date and time and the farm field.
- when the sunny shooting condition is specified, the index calculation unit 103 calculates, based on the index correction image acquired by the corrected image acquisition unit 106, a corrected index for the portions of the field image determined to be sunny areas (the index of the portions determined to be shaded areas is not corrected).
- the sunny area determination unit 102 supplies the generated sunny information and shade information to the index calculation unit 103 together with the entire field image data.
- in this case, the index calculation unit 103 corrects the pixel values of the pixels in the sunny area indicated by the supplied sunny information, among the pixels indicated by the entire field image data, as described above for the index correction image.
- the index calculation unit 103 calculates the NDVI in the same manner as in the embodiment, using the pixel value of each pixel indicating the corrected sunny area.
- conversely, when the shade shooting condition is specified, the index calculation unit 103 calculates, based on the index correction image acquired by the corrected image acquisition unit 106, a corrected index for the portions of the field image not determined to be sunny areas (the portions determined to be shaded areas); the index of the portions determined to be sunny areas is not corrected. In this case, the index calculation unit 103 calculates NDVI by correcting, in the same manner as above, the pixel values of the pixels in the shaded area indicated by the supplied shade information among the pixels indicated by the entire field image data.
- in this modification, the NDVI of the sunny area is thus corrected when the panel J1 was photographed in the sun, and the NDVI of the shaded area is corrected when the panel J1 was photographed in the shade, which improves the accuracy of the correction using the image of the panel J1 (the index correction image).
- the NDVI of the area whose sunshine condition does not match the specified one is not corrected by this method, but another correction, described later, may be performed for that area.
- FIG. 15 shows a functional configuration realized in this modification.
- a server device 10c including a flight instruction unit 108 in addition to the units illustrated in FIG. 4 is illustrated.
- the flight instruction unit 108 instructs the drone 20 on the shooting method for the position that is not determined to be the sunny area, that is, the position that is determined to be the shaded area.
- the flight instruction unit 108 is an example of the “instruction unit” in the present invention.
- the flight instruction unit 108 instructs the drone 20 to re-photograph the position determined to be the shaded area, for example, when the shaded area is determined from the field image captured after the shooting flight is completed.
- the flight instruction unit 108 instructs re-shooting so that it starts at a timing at which the shooting flight over the shaded area can be completed before the scheduled end time.
- to do so, the flight instruction unit 108 calculates the time required for shooting the shaded area (the shooting time) from the area of the shaded region and the distance from the shooting start position, and instructs the drone to start re-shooting at or before the time obtained by counting back the calculated shooting time from the scheduled end time. A sketch of this back-off calculation follows.
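- a minimal sketch of that calculation; the coverage-rate and speed parameters are illustrative stand-ins for values the flight instruction unit would know:

```python
from datetime import datetime, timedelta

def latest_reshoot_start(end_of_window: datetime,
                         shaded_area_m2: float,
                         dist_to_start_m: float,
                         coverage_m2_per_s: float,
                         speed_mps: float) -> datetime:
    """Latest start time at which re-shooting of the shaded area can
    still finish before `end_of_window`."""
    shooting_time = timedelta(
        seconds=shaded_area_m2 / coverage_m2_per_s  # time to cover the area
        + dist_to_start_m / speed_mps)              # time to reach it
    return end_of_window - shooting_time
```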
- the flight instructing unit 108 may instruct the drone 20 to shoot at a different shooting time on another day.
- in that case, when a shaded area is determined from the field image captured after a shooting flight, the flight instruction unit 108 stores the shooting time, the field ID (information identifying the field), and the device ID (information identifying the drone 20) in association with one another.
- on another day, the farmer takes the drone 20 to the field but, instead of performing the shooting flight start operation, performs, for example, a flight standby operation.
- the drone 20 transmits to the server device 10c state data indicating that it is in the flight standby state, the field ID (stored in advance by the farm worker), and the device ID.
- the flight instruction unit 108 then instructs the drone 20 to use, as the shooting start time, a time different from the shooting time stored in association with the field ID and device ID indicated by the state data.
- as the shooting start time, for example, a time separated from the past shooting time by a predetermined time (long enough for the shaded area to change sufficiently) is used. For example, if the past shooting time is 10:00 a.m. and the predetermined time is 5 hours, the flight instruction unit 108 instructs a shooting start time of 3:00 p.m.
- depending on the stored shooting time, the flight instruction unit 108 may likewise give an instruction with, for example, 11:00 a.m. as the shooting start time.
- the flight control unit 201 of the drone 20 starts the shooting flight when the shooting start time indicated by the received instruction data arrives. In this case as well, some of the positions that were in the shaded area at the past shooting time can be photographed as sunny areas, so more NDVI values calculated from sunny pixels can be obtained than when no shooting time change is instructed.
- in another modification, the index calculation unit 103 may correct an index calculated from a pixel in a shaded area (an index of the shaded area) to the index expected if the pixel were in a sunny area.
- the index calculation unit 103 performs this correction by comparing, for example, the pixel value of the image taken when the same crop is in the sunny area and the pixel value of the image taken in the shaded area.
- FIG. 16 shows an example of the photographing range of the photographing means installed in this modification.
- an entire field image E1 of the field A1 is shown.
- the entire field image E1 is an image taken early in the afternoon, when the shadow of the forest R1 shown in FIG. 5 (the shaded region G1) extends to the north.
- the fixed camera K1 is installed at a position from which it photographs the shooting region C41, which is included in the shaded region G1 but hardly includes the shaded region G2.
- the fixed camera K1 repeatedly photographs this region at a predetermined time in the morning (when the shooting region C41 is a sunny area) and at a predetermined time later in the day (when the shooting region C41 is a shaded area), for example every day or every week.
- FIG. 17 shows a functional configuration realized in this modification.
- a server device 10d including an image acquisition unit 109 in addition to the units illustrated in FIG. 4 is illustrated.
- the fixed camera K1 has a communication function, and transmits image data indicating a captured image to the server device 10d.
- the image acquisition unit 109 acquires an image indicated by the transmitted image data, that is, an image of a fixed area (a fixed area in which the sunny area and the shaded area in the field are switched) captured by the fixed camera K1.
- the image acquisition unit 109 is an example of the “second image acquisition unit” in the present invention.
- the image acquisition unit 109 supplies the acquired image of the fixed area to the index calculation unit 103.
- the index calculation unit 103 calculates a corrected index for the portions of the field image not determined to be sunny areas (the portions determined to be shaded areas) based on the correlation between the sunny-area index and the shaded-area index obtained from the images of the fixed area acquired by the image acquisition unit 109. For example, the index calculation unit 103 calculates the ratio (index ratio) between the NDVI calculated for a pixel showing a specific part of a crop when that part is in the sunny area and the NDVI calculated for the same part when it is in the shaded area.
- for this purpose, the index calculation unit 103 first converts the index values into the range 0.0 to 2.0, for example, so that the index ratio becomes a value from 0 to 1.0. The index calculation unit 103 then obtains an expression representing the correlation between the pixel value in the shaded area and the index ratio.
- FIG. 18 shows an example of the correlation of the index ratio.
- FIG. 18 shows a graph in which the pixel value in the shaded area is shown on the horizontal axis and the index ratio is shown on the vertical axis.
- a correlation is shown in which the index ratio decreases as the pixel value increases.
- the index calculation unit 103 obtains an approximate expression indicating this correlation using a known method.
- the correlation is represented linearly, but may be represented by a quadratic curve or may be represented by a cubic or higher curve.
- the index calculation unit 103 calculates the corrected NDVI for each pixel in the shaded area, using an expression representing the correlation thus obtained.
- specifically, for the value of each pixel in the shaded area, the index calculation unit 103 divides the index by the index ratio given by this expression for that pixel value, thereby correcting it to the index expected if the pixel were in a sunny area.
- by correcting the index of the shaded area in this way, even for an area that hardly ever becomes a sunny area throughout the day, such as the shaded area G2 shown in FIG. 16, the growth status can be shown with higher accuracy than without the correction of this modification.
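- an illustrative sketch of this ratio-based correction, with a linear fit as in FIG. 18; all observation values are placeholders:

```python
import numpy as np

# Paired observations from the fixed camera: shaded pixel values and
# the ratio (shaded NDVI / sunny NDVI) for the same crop spots, with
# NDVI shifted into the 0.0-2.0 range; values are placeholders.
shade_pixel_values = np.array([40.0, 60.0, 80.0, 100.0])
index_ratios = np.array([0.95, 0.90, 0.82, 0.75])

# Approximate the correlation of FIG. 18 with a straight line.
a, b = np.polyfit(shade_pixel_values, index_ratios, 1)

def corrected_index(shaded_ndvi: np.ndarray,
                    pixel_values: np.ndarray) -> np.ndarray:
    """Divide each shaded index (0.0-2.0 range) by the index ratio
    predicted from its pixel value, approximating the index expected
    if the pixel were in the sun."""
    ratio = np.clip(a * pixel_values + b, 0.05, 1.0)  # keep ratios sane
    return shaded_ndvi / ratio
```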
- the index calculation unit 103 may correct the index of the shaded area by a method different from the above modification.
- in this modification, the index calculation unit 103 performs the correction focusing on the boundary between the portion determined to be a sunny area and the portion not determined to be a sunny area (the portion determined to be a shaded area) in the field image acquired by the field image generation unit 101.
- here too, the index calculation unit 103 calculates a corrected index for the portions of the field image not determined to be sunny areas (the portions determined to be shaded areas). Specifically, the index calculation unit 103 identifies, as boundary pixels, the pixels on the boundary line between the sunny area and the shaded area determined by the sunny area determination unit 102, and compares the NDVI of a sunny pixel adjacent to the sunny-area side of a boundary pixel (an example of a sunny area within a predetermined range) with that of a shaded pixel adjacent to the shaded-area side of the boundary pixel (an example of a shaded area within a predetermined range).
- FIG. 19 shows an example of a sunny pixel and a shaded pixel.
- the sunny pixel Df11 on the sunny area F1 side and the shaded pixel Dg11 on the shaded area G1 side are represented.
- a plurality of sunny pixels and shaded pixels are represented along the boundary line between the sunny region F1 and the shaded region G1.
- reference signs are given to only some of the sunny pixels and shaded pixels, but sunny pixels and shaded pixels are also present between the ones shown.
- the index calculation unit 103 calculates NDVI for each of these sunny pixels and shaded pixels, and obtains an expression indicating the correlation between the index ratio and the pixel value of the shaded pixel, as in the example of FIG. 18. After that, the index calculation unit 103 calculates the corrected NDVI of the shaded area in the same manner as in the above modification.
- in the above example, the pixels adjacent to the boundary pixels are used as the sunny area and the shaded area (pixels) within the predetermined range of the boundary pixels, but the present invention is not limited to this. Pixels separated from the boundary pixel by one or more pixels may be used as the sunny area and the shaded area. In short, any range over which the growth status can be regarded as substantially uniform within the entire field image may be used as the predetermined range.
- with this method as well, the index of a pixel in the shaded area can be corrected to the index that would be expected to be calculated if the pixel were in the sunny area.
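- a minimal sketch of collecting the boundary sample pairs, assuming a boolean mask of the sunny area and, for brevity, only horizontally adjacent pixels (names are hypothetical):

```python
import numpy as np

def boundary_samples(sun_mask, ndvi, pixel_values):
    # sun_mask: 2-D boolean array, True = sunny; ndvi and pixel_values are
    # arrays of the same shape. Each sample pairs a shaded pixel with the
    # sunny pixel just across the boundary from it.
    samples = []
    height, width = sun_mask.shape
    for y in range(height):
        for x in range(width - 1):
            if sun_mask[y, x] != sun_mask[y, x + 1]:  # boundary crossing
                sun_x, shade_x = (x, x + 1) if sun_mask[y, x] else (x + 1, x)
                samples.append((pixel_values[y, shade_x],
                                ndvi[y, shade_x], ndvi[y, sun_x]))
    # Columns: shaded pixel value, shaded NDVI, adjacent sunny NDVI;
    # these feed the same ratio fit as in the previous sketch.
    return np.array(samples)
```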
- according to this modification, the correction can be performed even when a fixed imaging device for photographing the fixed area is not installed.
- the imaging device provided in the drone 20 is not limited to the above.
- it may have a zoom function (the resolution can be increased to improve the accuracy of NDVI), or it may have sensitivity specialized for red and infrared.
- in the latter case, the pixel values of light of other wavelengths (blue, green) may be restored by spectral correction.
- the growth information recording unit 105 may output the sunny index and the shade index by a method different from the embodiment.
- for example, the growth information recording unit 105 may output the sunny index to a folder or database prepared for sunny indices, and output the shade index to a folder or database prepared for shade indices.
- the growth information recording unit 105 may output the growth information not only to the user terminal 30 but also, for example, to a storage device that accumulates the sunny index and the shade index, to an analysis apparatus that analyzes the growth status and predicts its future course from the sunny index and the shade index, or to a visualization apparatus that performs visualization processing (such as generating graphs and maps) that makes the growth status easy to compare.
- in short, the growth information recording unit 105 may output the growth information to any destination, as long as doing so helps support the person who performs the work in the field.
- the apparatus that implements each function shown in FIG. 4 and the like may be different from the apparatus described above.
- the drone may have all or some of the functions of the server device.
- the drone processor is an example of the “information processing apparatus” of the present invention.
- the user terminal 30 may realize the function of the server device.
- the user terminal 30 is an example of the “information processing apparatus” of the present invention.
- an operation performed by one function may be performed by another function, or by a newly provided function.
- the growth information generation unit 104 may perform the operation performed by the index calculation unit 103 (index calculation operation).
- the output of the sunny index and the shade index performed by the growth information recording unit 105 may be performed by a newly provided output unit.
- Two or more devices may realize each function provided in the server device. In short, as long as these functions are realized as the entire agricultural support system, the agricultural support system may include any number of devices.
- the present invention can also be understood as an information processing system such as an agricultural support system equipped with a flying object.
- the present invention can be understood as an information processing method for realizing processing performed by each device, or as a program for causing a computer that controls each device to function.
- This program may be provided in the form of a recording medium, such as an optical disc, on which it is stored, or may be provided by being downloaded to a computer via a network such as the Internet and installed so that it can be used.
- Input/output information and the like may be stored in a specific location (for example, a memory) or managed by a management table. Input/output information and the like can be overwritten, updated, or appended. Output information and the like may be deleted. Input information and the like may be transmitted to another device.
- Software, whether referred to as software, firmware, middleware, microcode, hardware description language, or another name, should be interpreted broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executable files, execution threads, procedures, functions, and the like.
- software, instructions, and the like may be transmitted and received via a transmission medium. For example, when software is transmitted from a website, server, or other remote source using wired technology such as coaxial cable, fiber-optic cable, twisted pair, and digital subscriber line (DSL) and/or wireless technology such as infrared, radio, and microwave, these wired and/or wireless technologies are included within the definition of the transmission medium.
- notification of predetermined information is not limited to being performed explicitly, and may be performed implicitly (for example, by not performing notification of the predetermined information).
Landscapes
- Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Health & Medical Sciences (AREA)
- Forests & Forestry (AREA)
- Environmental Sciences (AREA)
- Botany (AREA)
- Engineering & Computer Science (AREA)
- Mathematical Physics (AREA)
- Theoretical Computer Science (AREA)
- Biodiversity & Conservation Biology (AREA)
- Ecology (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Biochemistry (AREA)
- General Health & Medical Sciences (AREA)
- General Physics & Mathematics (AREA)
- Immunology (AREA)
- Pathology (AREA)
- Investigating Or Analysing Materials By Optical Means (AREA)
- Image Processing (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
The objective of the present invention is to support appropriate assessment of a growth status in an agricultural field with mixed shade.
An agricultural field image generating unit (101) acquires an image of an agricultural field captured by means of a drone (20). A sunlit region determining unit (102) determines a sunlit region of the agricultural field included in the acquired image. An index calculating unit (103) calculates, from the acquired image of the agricultural field, an index representing the growth status of the produce appearing in the image. A growth information generating unit (104) acquires, as a sunlit index, the NDVI at each position included in the determined sunlit region from among the calculated NDVIs, and acquires, as a shaded index, the index at each position not determined to be in the sunlit region. Using the acquired sunlit index and shaded index, the growth information generating unit (104) generates, for each of the sunlit region and the shaded region, a region-unit NDVI map representing the growth status of the produce in each of a plurality of regions into which the agricultural field A1 is divided.
Description
The present invention relates to a technology for supporting the determination of the content of work related to crops.
Technologies that support the determination of the content of work related to crops are known. Patent Document 1 discloses a technology for supporting fertilization management, including fertilizing management such as determining the amount of fertilizer and other farm work, based on observed data such as converted leaf-color values calculated from image data obtained by photographing crops.
It is common practice to obtain an index indicating the growth state (for example, NDVI) by using the output of a sensor (such as an image sensor) that measures light-related quantities for the crops in a field, and to use the index as a guide for work timing. In that case, because the amount of light differs between the sun and the shade, the value of the obtained index changes even for the same growth state.
In view of this, an object of the present invention is to support appropriate assessment of the growth status in a field where shade is mixed in.
In order to achieve the above object, the present invention provides an information processing apparatus comprising: a determination unit that determines a sunny area of a field included in a captured image of the field; an index acquisition unit that acquires, as a sunny index, an index indicating the growth status of crops in the determined sunny area; and an output unit that outputs the acquired sunny index as the index of the sunny area.
According to the present invention, it is possible to support appropriate assessment of the growth status in a field where shade is mixed in.
1. Embodiment
FIG. 1 shows the overall configuration of an agricultural support system 1 according to an embodiment. The agricultural support system 1 is a system that supports people who work in fields (places where crops such as rice, vegetables, and fruit are grown) by using an index representing the growth status of the crops. The index representing the growth status expresses one or both of the progress of the crop's growth stage (for example, whether it is a suitable time for harvesting) and its condition, such as size and the presence or absence of disease (also called its activity).
In this embodiment, NDVI (Normalized Difference Vegetation Index), described later, is used, and an index representing the growth status of the crops in a field is calculated using images of the field captured from the sky by a flying object. The flying object may be anything capable of photographing the field; in this embodiment a drone is used. The agricultural support system 1 includes a network 2, a server device 10, a drone 20, and a user terminal 30.
The network 2 is a communication system including a mobile communication network, the Internet, and the like, and relays the exchange of data between the devices that access it. The server device 10 accesses the network 2 by wired communication (wireless communication is also possible), and the drone 20 and the user terminal 30 access it by wireless communication (the user terminal 30 may use wired communication).
The user terminal 30 is a terminal used by a user of the system (for example, a worker who works in a field), and is, for example, a smartphone, a laptop computer, or a tablet terminal. In this embodiment, the drone 20 is a rotorcraft-type flying object that includes one or more rotors and flies by rotating them. The drone 20 includes photographing means for photographing a field from the sky while flying. The drone 20 is carried to the field by, for example, a farm worker who is a user of the agricultural support system 1, and flies and photographs when an operation to start a photographing flight is performed.
The server device 10 is an information processing apparatus that performs processing related to supporting workers. For example, the server device 10 performs processing for calculating the above-mentioned NDVI from the images of the field captured by the drone 20. NDVI expresses the growth status of crops numerically by exploiting the property that the green leaves of plants absorb much of the red visible light and reflect much of the light at near-infrared wavelengths (0.7 μm to 2.5 μm). With reference to the growth status represented by the NDVI calculation results displayed on the user terminal 30, the worker can judge the timing of watering, fertilizer application, pesticide application, and the like for the crops in the field where he or she works.
FIG. 2 shows the hardware configuration of the server device 10 and the user terminal 30. The server device 10 and the user terminal 30 are each a computer including a processor 11, a memory 12, a storage 13, a communication device 14, an input device 15, an output device 16, and a bus 17. The term "device" here can be read as a circuit, a unit, or the like. One or more of each device may be included, and some devices may be omitted.
The processor 11 controls the computer as a whole by, for example, running an operating system. The processor 11 may be configured as a central processing unit (CPU) including interfaces with peripheral devices, a control device, an arithmetic device, registers, and the like. The processor 11 also reads programs (program code), software modules, data, and the like from the storage 13 and/or the communication device 14 into the memory 12, and executes various kinds of processing in accordance with them.
There may be one processor 11 that executes the various kinds of processing, or two or more, and two or more processors 11 may execute the various kinds of processing simultaneously or sequentially. The processor 11 may also be implemented as one or more chips. The programs may be transmitted from a network via a telecommunication line.
The memory 12 is a computer-readable recording medium and may be composed of, for example, at least one of ROM (Read Only Memory), EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), RAM (Random Access Memory), and the like. The memory 12 may also be called a register, a cache, a main memory (main storage device), or the like. The memory 12 can store the above-mentioned programs (program code), software modules, data, and the like.
The storage 13 is a computer-readable recording medium and may be composed of, for example, at least one of an optical disc such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory (for example, a card, a stick, or a key drive), a floppy (registered trademark) disk, a magnetic strip, and the like.
The storage 13 may also be called an auxiliary storage device. The above-mentioned storage media may be, for example, a database, a server, or another suitable medium including the memory 12 and/or the storage 13. The communication device 14 is hardware (a transmitting/receiving device) for communication between computers via a wired and/or wireless network, and is also called, for example, a network device, a network controller, a network card, or a communication module.
The input device 15 is an input device (for example, a keyboard, a mouse, a microphone, a switch, a button, or a sensor) that accepts input from outside. The output device 16 is an output device (for example, a display or a speaker) that performs output to the outside. The input device 15 and the output device 16 may be integrated (for example, as a touch screen). The devices such as the processor 11 and the memory 12 can access one another via a bus 17 for communicating information. The bus 17 may be composed of a single bus, or of different buses between the devices.
FIG. 3 shows the hardware configuration of the drone 20. The drone 20 is a computer that includes a processor 21, a memory 22, a storage 23, a communication device 24, a flying device 25, a sensor device 26, a photographing device 27, and a bus 28. The term "device" here can be read as a circuit, a unit, or the like. One or more of each device may be included, and some devices may be omitted.
The processor 21, the memory 22, the storage 23, the communication device 24, and the bus 28 are hardware of the same kind as the devices of the same names shown in FIG. 2 (their performance, specifications, and so on may differ). In addition to wireless communication with the network 2, the communication device 24 can also perform wireless communication between drones. The flying device 25 is a device that includes motors, rotors, and the like and causes the drone to fly. In the air, the flying device 25 can move the drone in any direction and can make it hover.
The sensor device 26 is a device having a group of sensors that acquire the information necessary for flight control. The sensor device 26 includes a position sensor that measures the position (latitude and longitude) of the drone; a direction sensor that measures the direction in which the drone is facing (a front direction is defined for the drone, and this is the direction that front faces); an altitude sensor that measures the drone's altitude; a speed sensor that measures the drone's speed; and an inertial measurement unit (IMU) that measures angular velocity about three axes and acceleration in three directions.
The photographing device 27 is a so-called digital camera that has a lens, an image sensor, and the like and records images captured by the image sensor as digital data. In addition to visible light, this image sensor is also sensitive to light at the near-infrared wavelengths necessary for calculating NDVI. The photographing device 27 is attached to the bottom of the housing of the drone 20 with a fixed shooting direction, and photographs vertically downward while the drone is in flight. The photographing device 27 also has an autofocus function and can automatically focus and shoot even when the flight altitude changes.
The server device 10 and the drone 20 may be configured to include hardware such as a microprocessor, a digital signal processor (DSP), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), and an FPGA (Field Programmable Gate Array), and some or all of the functional blocks may be realized by such hardware. For example, the processor 11 may be implemented with at least one of these pieces of hardware.
The server device 10 and the drone 20 included in the agricultural support system 1 store the programs provided by this system, and the group of functions described below is realized by the processor of each device executing the programs and controlling each unit.
FIG. 4 shows the functional configuration realized by the agricultural support system 1. The server device 10 includes a field image generation unit 101, a sunny area determination unit 102, an index calculation unit 103, a growth information generation unit 104, and a growth information recording unit 105.
The drone 20 includes a flight control unit 201, a flight unit 202, a sensor measurement unit 203, and an imaging unit 204. The flight control unit 201 controls the drone's flight when photographing a field. For example, the flight control unit 201 stores field range information indicating the geographical range of a field registered in advance by the farm worker who is the user (for example, latitude and longitude information indicating the outer edge of the field), and based on that information performs control to fly the drone along a flight path that covers the entire sky above the field evenly at a constant altitude.
For a rectangular field, for example, the flight path in this case is a path that flies in a wavy trajectory from one side of the field to the opposite side. Alternatively, it may be a path that flies along the outer edge of the field and, after each circuit, shifts the path inward so as to draw a spiral trajectory; in short, any flight path that covers the entire field evenly will do (a purely illustrative sketch of the wavy pattern follows below). The flight unit 202 has the function of flying the drone; in this embodiment, it flies the drone by operating the motors, rotors, and the like included in the flying device 25.
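As a purely illustrative sketch (the embodiment does not prescribe an algorithm), waypoints for the wavy path over a rectangular field could be generated as follows:

```python
def lawnmower_waypoints(x_min, x_max, y_min, y_max, pass_width):
    # Local field coordinates in metres. pass_width would be chosen so
    # that neighbouring passes overlap, as with shooting areas C4 and C5
    # in FIG. 5.
    waypoints, y, left_to_right = [], y_min, True
    while y <= y_max:
        if left_to_right:
            waypoints += [(x_min, y), (x_max, y)]
        else:
            waypoints += [(x_max, y), (x_min, y)]
        y += pass_width
        left_to_right = not left_to_right
    return waypoints
```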
The sensor measurement unit 203 performs measurement with each sensor included in the sensor device 26 (the position sensor, direction sensor, altitude sensor, speed sensor, and inertial measurement unit), repeatedly measuring the drone's position, direction, altitude, speed, angular velocity, and acceleration at predetermined time intervals. The sensor measurement unit 203 supplies sensor information indicating the measured position, direction, altitude, speed, angular velocity, and acceleration to the flight control unit 201. The flight control unit 201 controls the flight unit 202 based on the supplied sensor information and flies the drone along the flight path described above.
The sensor measurement unit 203 supplies sensor information indicating the measured position, direction, altitude, and speed to the imaging unit 204. The imaging unit 204 has the function of photographing a subject using the photographing device 27, and is an example of the "photographing means" of the present invention. While the flight control unit 201 is controlling the drone to fly over a field as described above, the imaging unit 204 photographs that field as the subject. By photographing the field, the imaging unit 204 also photographs the areas of the field where crops are growing (crop areas).
Since the image sensor of the photographing device 27 is also sensitive to light at near-infrared wavelengths as described above, each pixel forming a still image captured by the imaging unit 204 is represented by pixel values indicating red, green, and blue visible light (R, G, B) together with a pixel value indicating light at near-infrared wavelengths (IR). Based on the supplied sensor information, the imaging unit 204 captures a plurality of still images so that every area in the field is covered.
FIG. 5 shows an example of a method for photographing a field. FIG. 5 shows the path B1 along which the drone 20 flies over the field A1 in a wavy trajectory. The imaging unit 204 calculates the shooting range on the field (the range of the field included in the angle of view at ground level, 0 m) from the altitude indicated by the sensor information and the angle of view of the photographing device 27. Then, from the speed and direction indicated by the sensor information, the imaging unit 204 performs the next shot when the proportion of the area where the current shooting range and the previous shooting range overlap (expressed, for example, as a percentage of the overlapping area when the area of the shooting range is taken as 100%) falls below a threshold.
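A sketch of the footprint and overlap calculation under the assumptions stated above (a nadir-pointing camera at constant altitude; the function names and the 70% threshold are hypothetical):

```python
import math

def ground_footprint_m(altitude_m, view_angle_deg):
    # Width of the ground area covered by the camera at this altitude.
    return 2.0 * altitude_m * math.tan(math.radians(view_angle_deg) / 2.0)

def should_shoot(distance_since_last_shot_m, footprint_m, threshold=0.7):
    # Shoot again once the overlap with the previous shooting range,
    # as a fraction of the footprint, drops below the threshold.
    overlap = max(0.0, 1.0 - distance_since_last_shot_m / footprint_m)
    return overlap < threshold
```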
In the example of FIG. 5, the imaging unit 204 first photographs the shooting area C1, and next photographs the shooting area C2, which slightly overlaps C1. The imaging unit 204 also notifies the flight control unit 201 of the calculated size of the shooting range when the drone 20 turns back. The flight control unit 201 turns back after shifting the path by a distance at which shooting ranges of the notified size overlap, as with the shooting areas C4 and C5 in FIG. 5.
By repeating shooting in this way, the imaging unit 204 captures still images covering the shooting areas C1 to C32 shown in FIG. 5, that is, a plurality of still images whose shooting ranges overlap slightly. In the example of FIG. 5, the field A1 happens to have a size and shape that exactly accommodates the set of shooting ranges, but this need not be the case. In that case, every area in the field can still be included in one of the still images by widening the overlap between shooting ranges or by shooting so as to include areas outside the field.
The shooting method used by the imaging unit 204 is not limited to this. For example, if the flight speed and flight altitude during shooting are fixed, the time interval at which the shooting ranges overlap as shown in FIG. 5 can be calculated in advance, so shooting may be performed at that time interval. If the map of the field and the shooting positions are determined in advance, the imaging unit 204 may shoot when flying at those positions. The imaging unit 204 may also shoot moving images, which produce larger data, as long as the drone's storage capacity and communication speed are sufficient.
Besides these, any known method for photographing the ground using a drone may be used. The operation of each unit of the drone 20 starts when the farm worker performs the flight start operation described above. When the units start operating, the drone 20 flies over the field along the set flight path, and the imaging unit 204 repeatedly shoots as described above. Each time it shoots, the imaging unit 204 generates and transmits to the server device 10 image data indicating the captured still image and shooting information about the shot (information indicating the position, orientation, altitude, and time of shooting, and the angle of view of the photographing device 27).
By receiving the transmitted image data, the field image generation unit 101 of the server device 10 acquires the still images indicated by the image data as images of the field captured by the drone 20. The field image generation unit 101 is an example of the "first image acquisition unit" of the present invention. Having also acquired the shooting information indicated by the received image data, the field image generation unit 101 generates an image of the entire field from the images of the individual shooting areas.
The field image generation unit 101 calculates the overlapping portions of the images using the position, orientation, altitude, and angle of view indicated by the acquired shooting information, and generates the image of the entire field by, for example, adopting the pixels of one of the images for each overlapping portion. After assigning a pixel ID to each pixel of the generated image of the entire field, the field image generation unit 101 supplies whole-field image data indicating the pixel IDs and the image of the entire field to the sunny area determination unit 102 and the index calculation unit 103.
The sunny area determination unit 102 determines the sunny area of the field included in the captured images of the field. The sunny area determination unit 102 is an example of the "determination unit" of the present invention. In this embodiment, the sunny area determination unit 102 determines the sunny area included in the field images captured by the imaging unit 204 of the drone 20 as described above. The sunny area determination unit 102 determines the sunny area based on the pixel value of each pixel of the whole-field image indicated by the supplied whole-field image data.
For example, the sunny area determination unit 102 calculates, from the R, G, and B pixel values, each pixel's values in the HSV color space (hue H, saturation S, brightness V). Among the calculated HSV values, the sunny area determination unit 102 judges areas in which, for example, the differences in hue H and saturation S fall within predetermined ranges to be areas of objects of the same color (for example, crop areas and soil areas). Within each same-color area judged in this way, the sunny area determination unit 102 extracts the pixels at which the value of brightness V changes by at least a difference threshold.
The pixels extracted in this way potentially indicate the boundary between sun and shade. When, of the areas delimited by the extracted pixels, the area with the larger brightness V has an average brightness V equal to or greater than a brightness threshold, the sunny area determination unit 102 judges that those brightness values represent sunlit pixels and determines that area to be the sunny area. The sunny area determination unit 102 determines the areas not determined to be the sunny area to be the shaded area.
Depending on the color of an area, the brightness V in the sun and in the shade each tend to take different magnitudes, so the difference between the sunlit and shaded brightness V also tends to differ. For example, the brightness of a sunlit pixel in a green leaf area tends to be greater than that of a sunlit pixel in a soil area. The sunny area determination unit 102 may therefore determine the sunny and shaded areas using, for example, a difference threshold and a brightness threshold that depend on the hue H and saturation S values of each area (both thresholds being larger for leaf areas than for soil areas).
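A greatly simplified sketch of the brightness-based split, assuming OpenCV and a single global threshold (the per-color grouping and adaptive thresholds of the embodiment are omitted, and the threshold value is an arbitrary assumption):

```python
import cv2
import numpy as np

def sunny_shaded_masks(bgr_image, value_threshold=160):
    # Convert to HSV and treat pixels whose brightness V reaches the
    # threshold as sunny; everything else is shaded.
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    v = hsv[:, :, 2]
    sunny = v >= value_threshold
    return sunny, np.logical_not(sunny)  # boolean masks: sunny, shaded
```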
As shown in FIG. 5, it is assumed that a forest R1 lies along the south side of the field A1, extending from the west end for about two-thirds of the field's east-west length.
FIG. 6 shows an example of the determination result for the sunny area. In FIG. 6, the whole-field image E1 of the field A1 shows, as the determination result, a shaded area G1 caused by the forest R1 on the southwest side, and the remaining area as the sunny area F1.
Having made the determination as described above, the sunny area determination unit 102 generates sun information indicating the pixel IDs of the pixels included in the area determined to be sunny, and shade information indicating the pixel IDs of the pixels included in the area determined to be shaded. The sunny area determination unit 102 supplies the generated sun information and shade information, together with the whole-field image data, to the growth information generation unit 104.
The index calculation unit 103 calculates, from the field image acquired by the field image generation unit 101, an index representing the growth status of the crops appearing in the image. The index calculation unit 103 is an example of the "calculation unit" of the present invention. The index calculation unit 103 calculates the above-mentioned NDVI as the index representing the growth status. For example, for each pixel of the whole-field image indicated by the whole-field image data, the index calculation unit 103 calculates NDVI by substituting the red pixel value (R) and the near-infrared pixel value (IR) described above into the formula NDVI = (IR - R) / (IR + R).
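The per-pixel calculation maps directly onto array code; a sketch (the function name is hypothetical):

```python
import numpy as np

def ndvi_map(red, nir):
    # Per-pixel NDVI = (IR - R) / (IR + R). red and nir are 2-D arrays of
    # the red and near-infrared pixel values of the whole-field image.
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    total = nir + red
    safe_total = np.where(total > 0, total, 1.0)  # avoid division by zero
    return np.where(total > 0, (nir - red) / safe_total, 0.0)
```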
In this embodiment, the imaging unit 204 of the drone 20 captures the images of the field and the index calculation unit 103 calculates NDVI as described above, whereby the growth status of the crops in the field is measured. The growth information generation unit 104 generates a pixel-unit NDVI map representing the NDVI at the position on the field corresponding to each pixel.
FIG. 7 shows an example of a pixel-unit NDVI map. The example of FIG. 7 shows the pixel-unit NDVI map M1 of the field A1 shown in FIG. 5.
The NDVI map M1 is a rectangular map whose corners are the pixel D1 at the upper left of the whole-field image E1 shown in FIG. 6 (NDVI = 0.3), the pixel D2 at the lower left (NDVI = -0.5), the pixel D3 at the upper right (NDVI = 0.3), and the pixel D4 at the lower right (NDVI = 0.2). The index calculation unit 103 supplies the NDVI map M1 representing the calculated NDVI, together with the whole-field image data, to the growth information generation unit 104 as index information representing the calculated growth status of the crops.
The growth information generation unit 104 uses the supplied sun information, shade information, index information (for example, the NDVI map M1), and whole-field image data to generate growth information indicating the growth status of the crops in the field. In particular, the growth information generation unit 104 generates sunny growth information indicating the growth status of the crops in the sun, and shade growth information indicating the growth status of the crops in the shade. The growth information generation unit 104 generates this information, for example, as follows.
From among the NDVIs of the pixels indicated by the NDVI map M1 supplied from the index calculation unit 103, the growth information generation unit 104 acquires, as the sunny index, the NDVI in the sunny area determined by the sunny area determination unit 102 (the NDVI at each position included in the sunny area). A position here ranges from the position represented by a single pixel of the whole-field image to a position represented by a plurality of pixels, and in either case represents a region of some extent. The sunny index refers to an index representing the growth status of crops measured from images of the crops in the sunny area.
Specifically, from among the NDVIs indicated by the index information, the growth information generation unit 104 acquires, as the sunny index, the NDVI associated with the pixel IDs indicated by the sun information. The growth information generation unit 104 also acquires, as the shade index, the index of the shaded area not determined to be the sunny area (the index at positions not determined to be the sunny area). The shade index refers to an index representing the growth status of crops measured from images of the crops in the shaded area. From among the NDVIs indicated by the index information, the growth information generation unit 104 acquires, as the shade index, the NDVI associated with the pixel IDs indicated by the shade information. The growth information generation unit 104 is an example of the "index acquisition unit" of the present invention.
In this embodiment, as described above, the growth information generation unit 104 acquires, as the sunny index (the NDVI at positions included in the sunny area of the field), the NDVI calculated for the sunny area from among the NDVI calculated by the index calculation unit 103 for the entire field image. Likewise, the growth information generation unit 104 acquires, as the shade index (the NDVI at positions included in the shaded area of the field), the NDVI calculated for the shaded area from among the NDVI calculated by the index calculation unit 103 for the entire field image.
Using the acquired sunny index and shade index, the growth information generation unit 104 generates, for each of the sunny area and the shaded area, a region-unit NDVI map representing the growth status of the crops in each of a plurality of regions into which the field A1 is divided.
FIGS. 8A and 8B show examples of region-unit NDVI maps. FIG. 8A shows the region-unit NDVI map Mf1 for the sunny area F1 shown in FIG. 6, and FIG. 8B shows the region-unit NDVI map Mg1 for the shaded area G1 shown in FIG. 6.
In the examples of FIGS. 8A and 8B, each divided region is represented by one of eight patterns according to the magnitude of its average NDVI (Lv1 being the smallest and Lv8 the largest). The NDVI map Mf1 shows that NDVI is larger, and the growth status better, toward the northeast side of the field A1. The NDVI map Mg1 likewise shows that NDVI is larger, and the growth status better, toward the northeast side of the field A1. In the NDVI map Mg1, however, NDVI is smaller even for divided regions representing the same shooting areas as in the NDVI map Mf1, because they are in the shade.
For example, the NDVI of the divided region Hc14, which represents the shooting area C14, is Lv6 (third from the largest) in the NDVI map Mf1 but Lv4 (fifth from the largest) in the NDVI map Mg1. Similarly, the NDVI of the divided region Hc19, which represents the shooting area C19, is Lv7 (second from the largest) in the NDVI map Mf1 but Lv5 (fourth from the largest) in the NDVI map Mg1.
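A sketch of how the region-unit levels could be derived, assuming equal-width NDVI bands over [-1.0, 1.0] (the embodiment does not specify the band boundaries; names are hypothetical):

```python
import numpy as np

def region_ndvi_levels(ndvi, region_masks, n_levels=8, lo=-1.0, hi=1.0):
    # Average the NDVI over each divided region (each mask would already
    # be restricted to sunny or shaded pixels, giving maps like Mf1/Mg1),
    # then quantise the averages into n_levels bands: Lv1 ... Lv8.
    means = np.array([ndvi[mask].mean() for mask in region_masks])
    inner_edges = np.linspace(lo, hi, n_levels + 1)[1:-1]
    levels = np.digitize(means, inner_edges) + 1  # values 1..n_levels
    return means, levels
```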
The growth information generation unit 104 generates, as the sunny growth information described above, information in which the generated NDVI map Mf1 is associated with a symbol indicating that it was generated from the acquired sunny index (a sun identification symbol), with the shooting information of the underlying whole-field image, and with the shooting date and time. Likewise, the growth information generation unit 104 generates, as the shade growth information described above, information in which the generated NDVI map Mg1 is associated with a symbol indicating that it was generated from the acquired shade index (a shade identification symbol), with the shooting information of the underlying whole-field image, and with the shooting date and time. The growth information generation unit 104 supplies the generated sunny growth information and shade growth information to the growth information recording unit 105.
The growth information recording unit 105 records the sunny growth information and the shade growth information generated by the growth information generation unit 104 together as the growth information of the crops in the field (information representing the growth status of the crops). The growth information recording unit 105 holds the recorded growth information in a state in which the user (farm worker) can view it (for example, so that it is posted on a web page accessible via a URL (Uniform Resource Locator) communicated to the user).
When the user terminal 30 accesses the above URL, the growth information recording unit 105 displays, for example, a screen for searching the growth information it holds.
FIGS. 9A to 9D show an example of a growth information search screen. In FIG. 9A, the user terminal 30 displays a growth information search screen that includes input fields for the time, the field name, and the sunshine condition. In this example, the search conditions "2017/5/15", "field A1", and "sun" have been entered.
When the user presses the search button H1, the user terminal 30 transmits to the server device 10 request data requesting the sunny growth information generated from the images of the field A1 shot on "2017/5/15". The growth information recording unit 105 reads the requested sunny growth information from the recorded growth information and transmits it to the user terminal 30. The user terminal 30 displays the NDVI map Mf1 indicated by the transmitted sunny growth information, as shown in FIG. 9B.
When the user changes only the sunshine condition to "shade" as shown in FIG. 9C and presses the search button H1, the user terminal 30 transmits to the server device 10 request data requesting the shade growth information generated from the images of the field A1 shot on "2017/5/15". The growth information recording unit 105 reads the requested shade growth information from the recorded growth information and transmits it to the user terminal 30. The user terminal 30 displays the NDVI map Mg1 indicated by the transmitted shade growth information, as shown in FIG. 9D.
The sunny growth information is information in which the NDVI map Mf1 representing the sunny index acquired by the growth information generation unit 104 (the NDVI of the sunny area) is associated with the sun identification symbol indicating that it is the index of the sunny area. By outputting (transmitting) this sunny growth information to the user terminal 30, the growth information recording unit 105 outputs the acquired sunny index in association with the sunny area (output that makes it distinguishable as the index of the sunny area).
The shade growth information is information in which the NDVI map Mg1 representing the shade index acquired by the growth information generation unit 104 (the NDVI of the shaded area) is associated with the shade identification symbol indicating that it is the index of the shaded area. By outputting (transmitting) this shade growth information to the user terminal 30, the growth information recording unit 105 outputs the acquired shade index in association with the shaded area (output that makes it distinguishable as the index of the shaded area). The growth information recording unit 105 is an example of the "output unit" of the present invention.
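Purely as an illustration of this association (the embodiment does not specify a data format; all field names are hypothetical), one growth-information record could look like this:

```python
from dataclasses import dataclass
from datetime import datetime
import numpy as np

@dataclass
class GrowthInfo:
    ndvi_map: np.ndarray   # region-unit map, e.g. Mf1 or Mg1
    lighting: str          # "sun" or "shade" identification symbol
    shooting_info: dict    # position, orientation, altitude, view angle
    shot_at: datetime      # shooting date and time, used as a search key
```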
Based on the above configuration, each device of the agricultural support system 1 performs a recording process for recording the growth information of the crops.
FIG. 10 shows an example of the operating procedure of each device in the recording process. This procedure starts when the farm worker who is the user takes the drone 20 to the field and performs the operation to start the photographing flight. First, the drone 20 (the flight control unit 201, the flight unit 202, and the sensor measurement unit 203) starts flying over the field based on the stored field range information (step S11).
Next, the drone 20 (the imaging unit 204) starts photographing each shooting area from above the field (step S12), and each time it shoots, generates and transmits to the server device 10 image data indicating the captured still image and the shooting information (information indicating the position, orientation, and altitude at the time of shooting) (step S13). The server device 10 (the field image generation unit 101) acquires the field images indicated by the transmitted image data (step S14).
Next, the server device 10 (the field image generation unit 101) combines the acquired field images to generate an image of the entire field (step S21). Subsequently, the server device 10 (the sunny area determination unit 102) determines the sunny area and the shaded area of the field included in the generated image of the entire field (step S22). Next, the server device 10 (the index calculation unit 103) calculates, from the generated image of the entire field, an index (NDVI) indicating the growth status of the crop shown in the image (step S23).
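As a concrete illustration of the calculation in step S23, the following is a minimal sketch using the standard definition NDVI = (IR − R) / (IR + R); representing the field image as separate red and near-infrared NumPy arrays is an assumption for illustration, not part of the embodiment.

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Per-pixel NDVI = (IR - R) / (IR + R); values range from -1.0 to +1.0."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom != 0)  # guard against empty pixels
    return out
```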
Note that the operations in steps S22 and S23 may be performed in the reverse order or in parallel. Subsequently, the server device 10 (the growth information generation unit 104) generates sunny growth information indicating, among other things, the index of the sunny area of the field determined in step S22 (step S24). Next, the server device 10 (the growth information generation unit 104) generates shade growth information indicating, among other things, the index of the shaded area of the field determined in step S22 (step S25). The operations in steps S24 and S25 may likewise be performed in the reverse order or in parallel.
Subsequently, the server device 10 (the growth information recording unit 105) records the sunny growth information and the shade growth information generated in steps S24 and S25 together as the growth information of the crop in the field (step S26). Then, when there is an access (request) from the user terminal 30, the server device 10 (the growth information recording unit 105) outputs the growth information recorded in step S26 to the user terminal 30 (step S31).
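Taken together, the server-side part of the recording process (steps S21 to S26) can be sketched as follows; the callables stand in for the functional units named in the text, and their signatures are assumptions for illustration only.

```python
def recording_process(field_images, stitch, determine_areas, compute_index,
                      make_growth_info, record):
    """Sketch of steps S21-S26; each argument is a callable standing in for
    the corresponding functional unit of the server device 10."""
    whole_field = stitch(field_images)                     # S21: field image generation unit 101
    sunny_mask, shade_mask = determine_areas(whole_field)  # S22: sunny area determination unit 102
    index_map = compute_index(whole_field)                 # S23: index calculation unit 103 (NDVI)
    sunny_info = make_growth_info(index_map, sunny_mask, label="sunny")  # S24
    shade_info = make_growth_info(index_map, shade_mask, label="shade")  # S25
    record(sunny_info, shade_info)                         # S26: growth information recording unit 105
```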
In a shaded area, the reflected light is weaker and the pixel values are smaller than in a sunny area, so even if the pixel-value error is the same, the resulting NDVI error is larger. The accuracy of NDVI therefore tends to be lower in shaded areas than in sunny areas. In this embodiment, by acquiring the NDVI of the sunny area separately as described above, it is possible to support an appropriate judgment of the growth status in a field where shade is mixed in, compared with the case where sun and shade are not distinguished.
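A hypothetical worked example (the numbers are illustrative, not from the embodiment) makes the error amplification concrete:

```python
def ndvi_px(r: float, ir: float) -> float:
    return (ir - r) / (ir + r)

# The same scene, sunny vs. shaded: the pixel values scale down
# (R=80, IR=160 vs. R=8, IR=16) but the NDVI stays 0.333 in both.
# Now add the same absolute sensor error (+2 on the red channel) to both:
sunny_err = ndvi_px(80 + 2, 160)  # ~0.322, a shift of about 0.011
shade_err = ndvi_px(8 + 2, 16)    # ~0.231, a shift of about 0.103
# The identical pixel-value error distorts NDVI roughly ten times more in shade.
```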
In this embodiment, the NDVI of the shaded area alone is also acquired, and an NDVI map of the shaded area is also output. If the NDVI of a shaded area is compared with that of a sunny area, the growth status tends to be judged as poor; by comparing NDVI only within the shaded area, however, it is possible to judge where growth is good and where it is poor within the shaded area (for example, in the NDVI map Mg1 of FIG. 8B, the northeast side can be said to be growing better).
2 Modifications
The embodiment described above is merely one example of implementing the present invention and may be modified as follows. The embodiment and the modifications may also be combined as needed. In that case, the modifications may be prioritized (ranked so as to decide which takes precedence when implementing them together would cause conflicting behavior).
As a concrete way of combining them, for example, modifications that use different parameters to obtain a common value (for example, NDVI) may be combined so that the common value is obtained using those parameters together. Alternatively, values obtained separately may be summed according to some rule to obtain a single value. In doing so, a different weight may be applied to each parameter used.
2-1 Acquiring only the sunny index
In the embodiment, the growth information generation unit 104 acquires both the sunny index and the shade index, but this is not a limitation; for example, only the sunny index may be acquired. In that case, the growth information generation unit 104 generates only the NDVI map Mf1 representing the acquired sunny index and outputs the sunny growth information indicating that NDVI map Mf1 to the user terminal 30. Even in this case, since the NDVI values within the sunny area can still be compared with one another, it is possible to support an appropriate judgment of the growth status of the crops in the sun in a field where shade is mixed in.
2-2 How the index is acquired
In the embodiment, NDVI is calculated for every pixel of the entire field, but this is not a limitation. For example, when only the sunny growth information is generated as in the above modification, NDVI need only be calculated for the sunny area. In this case, the sunny area determination unit 102 determines only the sunny area and supplies the generated sunny-area information to the index calculation unit 103 together with the image data of the entire field.
The index calculation unit 103 calculates NDVI only for the pixels of the sunny area indicated by the supplied sunny-area information, and supplies a map representing the calculated NDVI of the sunny area to the growth information generation unit 104 as index information, together with the image data of the entire field. The growth information generation unit 104 acquires the NDVI supplied in this way, that is, the NDVI calculated only for the sunny area of the field image, as the sunny index. This makes the processing load of the server device 10 (the NDVI calculation) smaller than when the NDVI of the shaded area is also calculated.
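A minimal sketch of this sunny-only calculation, assuming the whole-field image is given as red/near-infrared arrays and the sunny area as a boolean mask (the names are illustrative):

```python
import numpy as np

def sunny_ndvi(red: np.ndarray, nir: np.ndarray, sunny_mask: np.ndarray) -> np.ndarray:
    """Compute NDVI only where sunny_mask is True; other pixels are left as NaN."""
    out = np.full(red.shape, np.nan)        # NaN marks "no index computed"
    r = red[sunny_mask].astype(np.float64)  # restrict the work to sunny pixels
    ir = nir[sunny_mask].astype(np.float64)
    out[sunny_mask] = (ir - r) / (ir + r)
    return out
```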
2-3 Segmented areas
In the embodiment, the growth information generation unit 104 generates the area-based NDVI map using the areas corresponding to the shooting ranges as segmented areas, but the segmented areas are not limited to this. For example, a plurality of shooting ranges may form one segmented area, or an area corresponding to one of several parts into which a shooting range is divided may be used as a segmented area. The shape and size of the segmented areas may be uniform or may vary.
2-4 Flying object
In the embodiment, a rotary-wing type flying object is used as the flying object that performs autonomous flight, but this is not a limitation. For example, it may be an airplane-type or a helicopter-type flying object. The autonomous flight function is not essential either; as long as the assigned airspace can be flown during the assigned flight permission period, a radio-controlled (remotely piloted) flying object operated by a pilot from a remote location, for example, may be used.
2-5 Source information for sunny area determination and index calculation
In the embodiment, the determination of the sunny area and the calculation of NDVI are performed based on images taken by the drone 20 in flight, but this is not a limitation. These determinations and calculations may instead be based on, for example, images taken manually by a worker with a digital camera, images taken by a fixed digital camera installed at the field, or images taken from a satellite.
In the embodiment, NDVI is calculated using the measured values of the image sensor of the imaging device 27 of the drone 20, but this is not a limitation; NDVI may instead be calculated using, for example, the measured values of the infrared sensor or the like of a handheld NDVI measuring instrument. Of course, using images taken by the drone 20 or the like takes less effort when the growth status of the entire field is to be grasped, but the handheld type can calculate NDVI from the light reflected by the crop alone, so its NDVI accuracy tends to be higher. Which method to use may be decided by weighing effort, cost, and the required accuracy.
2-6 Indices representing the growth status
In the embodiment, NDVI is used as the index representing the growth status, but this is not a limitation. For example, a leaf color value (a value indicating the color of the leaves), the vegetation cover ratio (the occupancy of the vegetated area per unit area), SPAD (chlorophyll content), plant height, the number of stems, or the like may be used. In short, any value may be used as an index representing the growth status as long as it represents the growth status of the crop and can be calculated from a captured image of the crop area.
2-7 Correcting the index
The intensity of the reflected light from the crop that reaches the image sensor of the imaging device 27 can vary not only with sun and shade but also with the surrounding environment, such as the seasonal intensity of sunlight, cloud cover, and atmospheric conditions. It can also vary with the state of the lens of the imaging device 27, which is affected by temperature, humidity, and the like. A correction may be performed to eliminate the change in the index (NDVI) caused by the change in the intensity of the reflected light reaching the image sensor due to these factors.
FIG. 11 shows the functional configuration realized in this modification. FIG. 11 shows a server device 10a that includes a corrected-image acquisition unit 106 in addition to the units shown in FIG. 4. The corrected-image acquisition unit 106 acquires an image for index correction taken by the drone 20. The corrected-image acquisition unit 106 is an example of the "correction acquisition unit" of the present invention. The image for index correction is, for example, an image of a panel having a plurality of regions whose reflectance for light of specific wavelengths is known in advance.
FIG. 12 shows an example of the image for index correction. In FIG. 12, an image of the panel J1, on whose surface the specific reflectance regions J11, J12, J13, and J14 appear, is shown as the image for index correction. Each specific reflectance region is a region on the panel whose reflectance for red light and for infrared light changes in steps (for example, the red-light and infrared reflectances of J11, J12, J13, and J14 are both 20%, 40%, 60%, and 80%, respectively).
In this modification, the farm worker has the drone 20 photograph the panel J1 before performing the operation to start the shooting flight or after the shooting flight ends. The drone 20 may be flying when it photographs the panel J1, or the user may hold it up to take the picture. The imaging unit 204 of the drone 20 transmits image data indicating the captured image of the panel J1 to the server device 10a. The corrected-image acquisition unit 106 acquires the image of the panel J1 indicated by the transmitted image data as the image for index correction.
The corrected-image acquisition unit 106 supplies the acquired image for index correction to the index calculation unit 103. The index calculation unit 103 calculates an index corrected based on the supplied image, that is, the image for index correction acquired by the corrected-image acquisition unit 106. The index calculation unit 103 reads out, as measured values, the red pixel values (r11, r12, r13, r14) and the near-infrared pixel values (ir11, ir12, ir13, ir14) of the specific reflectance regions J11, J12, J13, and J14 contained in the acquired image for index correction.
The server device 10a stores in advance, as reference values, the red pixel values (R11, R12, R13, R14) and the near-infrared pixel values (IR11, IR12, IR13, IR14) obtained when the specific reflectance regions J11, J12, J13, and J14 are photographed in an environment where NDVI can be measured well. If a measured value differs from the corresponding reference value, the index calculation unit 103 determines a correction formula that corrects the measured value to the reference value.
For example, if multiplying each measured value by one specific coefficient turns every one of them into its reference value, the index calculation unit 103 adopts a correction formula that multiplies by that coefficient (for example, measured value × 1.1 = reference value). If a single coefficient cannot bring all four specific reflectance regions to their reference values, the index calculation unit 103 may determine a coefficient for each specific reflectance region (for example, r11 × 1.1 = R11, r12 × 1.05 = R12, r13 × 1.1 = R13, r14 × 1.15 = R14).
In this case, if a pixel value lies between r11 and r12, a number between their correction coefficients 1.1 and 1.05 is used as the correction coefficient (for example, for the pixel value midway between r11 and r12, (1.1 + 1.05) ÷ 2 = 1.075 is used). Having determined correction formulas for R and for IR in this way, the index calculation unit 103 corrects the pixel value of each pixel using the determined formulas.
The index calculation unit 103 then uses the corrected R and IR pixel values of each pixel to calculate NDVI (in this modification, a corrected NDVI) in the same way as in the embodiment. Correcting the index in this way reduces the change in NDVI caused by changes in the surrounding environment and the state of the lens, compared with performing no correction, and so improves the accuracy of NDVI. The correction of the pixel values may be performed by a function other than the index calculation unit 103; for example, the field image generation unit 101 may correct the pixel value of each pixel when it generates the image of the entire field. The correction method for NDVI is not limited to the method described above, and other well-known methods may be used.
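The per-region coefficients and the interpolation between them can be sketched as follows for one channel; the piecewise-linear interpolation (clamped at the ends) and the numbers are illustrative assumptions, and the same procedure would be applied independently to the IR channel before computing the corrected NDVI.

```python
import numpy as np

# Hypothetical panel readings for one channel (e.g. red) and the stored
# reference values measured under good conditions (numbers are illustrative).
measured = np.array([50.0, 100.0, 150.0, 200.0])   # r11..r14
reference = np.array([55.0, 105.0, 165.0, 230.0])  # R11..R14
coeff = reference / measured                        # per-region coefficients: 1.1, 1.05, 1.1, 1.15

def correct_channel(pixels: np.ndarray) -> np.ndarray:
    """Interpolate the correction coefficient between the panel readings
    and multiply it into the raw pixel values."""
    k = np.interp(pixels, measured, coeff)          # clamps outside the panel's range
    return pixels * k
```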
2-8 Sun and shade
When performing the above correction, the correction target may be narrowed down depending on whether the panel J1 is in the sun or in the shade.
FIG. 13 shows the functional configuration realized in this modification. FIG. 13 shows a server device 10b that includes a sunshine condition specifying unit 107 in addition to the units shown in FIG. 11.
The sunshine condition specifying unit 107 specifies whether the shooting condition of the image for index correction is sun or shade. The sunshine condition specifying unit 107 is an example of the "condition specifying unit" of the present invention. In this modification, for example, a screen for entering the shooting conditions is displayed on the user terminal 30.
FIG. 14 shows an example of the shooting condition input screen. In FIG. 14, the user terminal 30 displays an agricultural support system screen that includes input fields for the shooting date, the field name, and the sunshine condition.
In this example, the shooting condition "shade" for "field A1" on "2018/5/15" has been entered. When the user presses the send button H2, the user terminal 30 transmits to the server device 10b shooting condition data indicating that the shooting date and time is "2018/5/15", the shooting location is "field A1", and the sunshine condition is "shade". The sunshine condition specifying unit 107 specifies the sunshine condition indicated by the transmitted shooting condition data ("shade" in the example of FIG. 14) as the shooting condition of the image for index correction taken in that field at the shooting date and time indicated by the data.
The sunshine condition specifying unit 107 notifies the index calculation unit 103 of the specified shooting condition together with the shooting date and time and the field. When the sun shooting condition is specified, the index calculation unit 103 calculates, as the index of the portion of the field image determined to be the sunny area, an index corrected based on the image for index correction acquired by the corrected-image acquisition unit 106 (the index of the portion determined to be the shaded area is not corrected).
In this modification, the sunny area determination unit 102 supplies the generated sunny-area information and shade-area information to the index calculation unit 103 together with the image data of the entire field. Among the pixels indicated by the image data of the entire field, the index calculation unit 103 corrects the pixel values of the pixels of the sunny area indicated by the supplied sunny-area information, as described with reference to FIG. 12. The index calculation unit 103 then uses the corrected pixel values of the pixels of the sunny area to calculate NDVI in the same way as in the embodiment.
When the shade shooting condition is specified, the index calculation unit 103 calculates, as the index of the portion of the field image not determined to be the sunny area (the portion determined to be the shaded area), an index corrected based on the image for index correction acquired by the corrected-image acquisition unit 106 (the index of the portion determined to be the sunny area is not corrected). In this case, the index calculation unit 103 calculates NDVI by correcting, in the same way as above, the pixel values of the pixels of the shaded area indicated by the supplied shade-area information among the pixels indicated by the image data of the entire field.
As described above, in this modification, the NDVI of the sunny area is corrected when the panel J1 is in the sun, and the NDVI of the shaded area is corrected when the panel J1 is in the shade. This improves the accuracy of the correction using the image of the panel J1 (the image for index correction), compared with correcting all pixels without considering the sunshine condition of the panel J1. In this modification, the NDVI of the area that does not share the specified sunshine condition is not corrected, but a different correction may be performed for that area; such corrections are described later.
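A minimal sketch of this conditional correction, reusing channel correctors such as the one above and assuming separate correctors were fitted for the red and IR channels (all names are illustrative):

```python
import numpy as np

def conditional_ndvi(red, nir, sunny_mask, shade_mask,
                     panel_condition, correct_red, correct_ir):
    """Correct only the pixels sharing the panel's sunshine condition
    ("sunny" or "shade"), then compute NDVI over the whole image."""
    target = sunny_mask if panel_condition == "sunny" else shade_mask
    red = red.astype(np.float64).copy()
    nir = nir.astype(np.float64).copy()
    red[target] = correct_red(red[target])  # pixels outside the target keep raw values
    nir[target] = correct_ir(nir[target])
    return (nir - red) / (nir + red)
```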
2-9 Shooting the shaded area
An area determined to be a shaded area may be a sunny area at a different time of day. The shaded area may therefore be photographed again at a different time.
FIG. 15 shows the functional configuration realized in this modification. FIG. 15 shows a server device 10c that includes a flight instruction unit 108 in addition to the units shown in FIG. 4.
The flight instruction unit 108 instructs the drone 20 on how to photograph the positions not determined to be the sunny area, that is, the positions determined to be the shaded area. The flight instruction unit 108 is an example of the "instruction unit" of the present invention. For example, when a shaded area is determined from the field image captured in a completed shooting flight, the flight instruction unit 108 instructs the drone 20 to re-photograph the positions determined to be that shaded area.
In this modification, it is assumed, for example, that the scheduled end time of the farm work is entered on the user terminal 30 and that the server device 10c stores that scheduled end time. The flight instruction unit 108 then instructs the re-shooting so that it starts at a time that allows the shooting flight over the shaded area to finish before the scheduled end time. The flight instruction unit 108 calculates the time required for shooting the shaded area (the shooting time) based on the area of the shaded region and the distance from the shooting start position, and instructs the re-shooting to start at a time earlier than the time obtained by going back from the scheduled end time by the calculated shooting time.
Since some of the positions that were in the shaded area are in the sunny area at the time of re-shooting, this increases the number of NDVI values calculated from sunny-area pixels compared with not instructing a re-shoot. When the work time is short, or when the shooting flight was performed near the end of the farm work, there may be no time to re-shoot on the same day. In that case, the flight instruction unit 108 may instruct the drone 20 to shoot on another day at a changed shooting time.
Specifically, when a shaded area is determined from the field image captured in a completed shooting flight, the flight instruction unit 108 stores the shooting time, the field ID (information identifying the field), and the device ID (information identifying the drone 20). In this modification, the farm worker who is the user takes the drone 20 to the field but does not perform the operation to start the shooting flight themselves; instead, they perform, for example, a flight standby operation. The drone 20 transmits to the server device 10c state data indicating that it has entered the flight standby state, together with the field ID (stored in advance by the farm worker) and the device ID.
On receiving the state data, the flight instruction unit 108 transmits to the drone 20 instruction data instructing it to use, as the shooting start time, a time different from the shooting time stored in association with the field ID and the device ID indicated by the state data. As the shooting start time, for example, a time separated from the past shooting time by a predetermined interval (long enough for the shaded area to change sufficiently) is used. For example, if the past shooting time is 10 a.m. and the predetermined interval is five hours, the flight instruction unit 108 designates 3 p.m. as the shooting start time.
Likewise, if the past shooting time is 4 p.m. and the predetermined interval is five hours, the flight instruction unit 108 designates 11 a.m. as the shooting start time. The flight control unit 201 of the drone 20 starts the shooting flight when the shooting start time indicated by the received instruction data arrives. In this case too, part of what was the shaded area at the past shooting can be photographed as the sunny area, so the number of NDVI values calculated from sunny-area pixels increases compared with not instructing a change of shooting time.
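The two timing rules above can be sketched as follows; the helper names, the five-hour interval, and the rule for avoiding an after-dark start (matching the 4 p.m. to 11 a.m. example) are illustrative assumptions:

```python
from datetime import datetime, timedelta

MIN_OFFSET = timedelta(hours=5)  # interval after which the shade has moved enough

def latest_same_day_start(end_of_work: datetime, shoot_duration: timedelta) -> datetime:
    """Latest start that still finishes the shaded-area re-shoot before work ends."""
    return end_of_work - shoot_duration

def offset_start_time(past_shoot: datetime) -> datetime:
    """Shooting start time at least MIN_OFFSET away from the past shot; when
    shifting forward would land after dark, shift backward instead."""
    candidate = past_shoot + MIN_OFFSET
    if candidate.hour >= 18:                 # e.g. 4 p.m. + 5 h = 9 p.m.: too late
        candidate = past_shoot - MIN_OFFSET  # use 11 a.m. on the next shooting day
    return candidate
```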
2-10 Index correction using a fixed image
The index calculation unit 103 may correct the index calculated from a pixel of the shaded area (the shade index) to the index that would be expected if that pixel were in the sunny area. The index calculation unit 103 performs this correction by, for example, comparing the pixel values of images of the same crop taken while it was in the sunny area with those taken while it was in the shaded area.
In this modification, when the field contains an area that is at times a sunny area and at times a shaded area, an imaging means that photographs that area is installed.
FIG. 16 shows an example of the shooting range of the imaging means installed in this modification. FIG. 16 shows the entire-field image E1 of the field A1. The entire-field image E1 was taken in the early afternoon, when the shadow of the forest R1 shown in FIG. 5 (the shaded area G1) extends to the north.
When the field A1 is photographed early in the morning with the sun in the east, the shadow of the forest R1 is shorter than the shaded area G1, as in the shaded area G2 shown in the figure. In this modification, a fixed camera K1 is installed at a position where it photographs a shooting area C41 that is shaded within the shaded area G1 but contains almost none of the shaded area G2. The fixed camera K1 shoots repeatedly, at a fixed interval (which may be daily or weekly), at a predetermined time in the morning (when the shooting area C41 is a sunny area) and at a predetermined time during the day (when the shooting area C41 is a shaded area).
In this way, the fixed camera K1 photographs a fixed area of the field that switches between being a sunny area and a shaded area. The fixed camera K1 is an example of the "imaging means" of the present invention.
FIG. 17 shows the functional configuration realized in this modification. FIG. 17 shows a server device 10d that includes an image acquisition unit 109 in addition to the units shown in FIG. 4. The fixed camera K1 has a communication function and transmits image data indicating the captured images to the server device 10d.
The image acquisition unit 109 acquires the image indicated by the transmitted image data, that is, the image of the fixed area photographed by the fixed camera K1 (the fixed area of the field that switches between sunny and shaded). The image acquisition unit 109 is an example of the "second image acquisition unit" of the present invention. The image acquisition unit 109 supplies the acquired image of the fixed area to the index calculation unit 103.
Based on the correlation between the sunny-area index and the shaded-area index obtained from the images of the fixed area acquired by the image acquisition unit 109, the index calculation unit 103 calculates a corrected index for the portion of the field image not determined to be the sunny area (the portion determined to be the shaded area). For example, for a pixel showing a specific part of a crop in the field, the index calculation unit 103 calculates the ratio (the index ratio) between the NDVI calculated when that pixel was in the sunny area and the NDVI calculated when it was in the shaded area.
Since NDVI takes values from -1.0 to +1.0, the index calculation unit 103 calculates the index ratio (which takes values from 0 to 1.0) after converting the NDVI values, for example, to values from 0.0 to 2.0. The index calculation unit 103 then obtains, for example, an expression representing the correlation between the pixel value in the shaded area and the index ratio.
FIG. 18 shows an example of the correlation of the index ratio. FIG. 18 shows a graph with the pixel value in the shaded area on the horizontal axis and the index ratio on the vertical axis.
In this example, a correlation in which the index ratio decreases as the pixel value increases is shown. The index calculation unit 103 obtains an approximate expression representing this correlation using a well-known method. Although the correlation is represented by a straight line in the example of FIG. 18, it may instead be represented by a quadratic curve or by a curve of third or higher order. Using the expression representing the correlation obtained in this way, the index calculation unit 103 calculates a corrected NDVI for each pixel of the shaded area.
Specifically, by dividing the pixel value of each shaded pixel by the index ratio that this expression indicates for that value, the index calculation unit 103 corrects it to the index that would be expected if the pixel were in the sunny area. By calculating corrected indices for the shaded area in this way, even when there is an area that is hardly ever a sunny area throughout the day, such as the shaded area G2 shown in FIG. 16, the growth status of the crop in that area can be shown with higher accuracy than without the correction of this modification.
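A minimal sketch of this ratio-based correction, assuming a linear fit of the index ratio against the shaded pixel value and the 0.0-2.0 shift described above; the names and the fitting choice are illustrative:

```python
import numpy as np

def fit_index_ratio(shade_pixels, shade_ndvi, sunny_ndvi):
    """Fit index_ratio = a * pixel_value + b from paired fixed-camera
    observations of the same crop in shade and in sun."""
    ratio = (shade_ndvi + 1.0) / (sunny_ndvi + 1.0)  # shift [-1,1] to [0,2], then take the ratio
    a, b = np.polyfit(shade_pixels, ratio, deg=1)    # FIG. 18 shows a roughly linear trend
    return a, b

def correct_shade_ndvi(pixel_value, shade_ndvi, a, b):
    """Correct one shaded pixel's NDVI to its expected sunny-area value."""
    ratio = np.clip(a * pixel_value + b, 1e-6, 1.0)  # the ratio lies between 0 and 1
    return (shade_ndvi + 1.0) / ratio - 1.0          # divide in shifted space, shift back
```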
2-11 Index correction using the boundary image
The index calculation unit 103 may correct the index of the shaded area by a method different from the above modification. In this modification, the index calculation unit 103 performs the correction by focusing on the boundary between the portion of the field image acquired by the field image generation unit 101 that is determined to be the sunny area and the portion that is not (the portion determined to be the shaded area).
Based on the correlation between the sunny-area index and the shaded-area index within a predetermined range of this boundary, the index calculation unit 103 calculates a corrected index for the portion of the field image not determined to be the sunny area (the portion determined to be the shaded area). Specifically, the index calculation unit 103, for example, identifies as boundary pixels the pixels lying on the boundary line between the sunny area and the shaded area determined by the sunny area determination unit 102, and compares the NDVI of the sunny pixels adjacent to the sunny-area side of the boundary pixels (an example of the sunny area within the predetermined range) with that of the shaded pixels adjacent to the shaded-area side of the boundary pixels (an example of the shaded area within the predetermined range).
FIG. 19 shows an example of sunny pixels and shaded pixels. In FIG. 19, in the entire-field image E1 of the field A1, the sunny pixel Df11 on the sunny area F1 side and the shaded pixel Dg11 on the shaded area G1 side are shown. A plurality of sunny pixels and shaded pixels are shown along the boundary line between the sunny area F1 and the shaded area G1. For readability, the figure does not show every sunny and shaded pixel, but sunny and shaded pixels also lie between the ones shown.
The index calculation unit 103 calculates NDVI for each of these sunny pixels and shaded pixels and, as in the example of FIG. 18, obtains an expression representing the correlation between the index ratio and the pixel value of the shaded pixels. The index calculation unit 103 then calculates the corrected NDVI of the shaded area in the same way as in the above modification. In the example of FIG. 19, the pixels adjacent to the boundary pixels are used as the sunny area and the shaded area (pixels) within the predetermined range of the boundary pixels, but this is not a limitation; pixels separated from the boundary pixels by one or more pixels may be used instead. In short, any range within which the growth status can be assumed to be roughly uniform in the entire-field image may be used as the predetermined range.
As described above, pixels adjacent to the boundary line between the sunny area and the shaded area are highly likely to show crops in similar growth conditions, so the NDVI values that a sunny pixel and a shaded pixel would be expected to yield if both were in the sunny area tend to be close. Therefore, as in the above modification, the index of a pixel of the shaded area can be corrected to the index that would be expected if that pixel were in the sunny area. Moreover, since this modification uses only the field images taken by the drone 20, the above correction can be performed without installing fixed imaging means such as that shown in FIG. 16.
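Extracting the adjacent sunny/shaded pixel pairs along the boundary can be sketched as follows, assuming the sunny area is given as a boolean mask; checking only horizontal neighbours is one simple choice among several. The NDVI values at the returned index pairs then supply the data for the correlation fit of FIG. 18.

```python
import numpy as np

def boundary_pairs(sunny_mask: np.ndarray):
    """Return index tuples of (sunny, shaded) pixels that are horizontally
    adjacent across the sun/shade boundary."""
    left, right = sunny_mask[:, :-1], sunny_mask[:, 1:]
    s_to_g = left & ~right  # sunny pixel with a shaded pixel to its right
    g_to_s = ~left & right  # shaded pixel with a sunny pixel to its right
    r1, c1 = np.nonzero(s_to_g)
    r2, c2 = np.nonzero(g_to_s)
    sunny_idx = (np.concatenate([r1, r2]), np.concatenate([c1, c2 + 1]))
    shade_idx = (np.concatenate([r1, r2]), np.concatenate([c1 + 1, c2]))
    return sunny_idx, shade_idx

# Usage: sunny_vals, shade_vals = ndvi_map[sunny_idx], ndvi_map[shade_idx]
```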
2-12 Imaging device
The imaging device of the drone 20 is not limited to the one described above. For example, it may have a zoom function (which can raise the resolution and thereby improve the accuracy of NDVI), or it may have sensitivity specialized for red and infrared light. In the latter case, for example, when the sunny area determination unit 102 determines the sunny area, the pixel values of light of other wavelengths (blue, green) may be restored by correcting the spectrum.
2-13 Output of the sunny index and the shade index
The growth information recording unit 105 may output the sunny index and the shade index by a method different from the embodiment. For example, the growth information recording unit 105 may output the sunny index to a folder, database, or the like prepared for the sunny index, and output the shade index to a folder, database, or the like prepared for the shade index.
The growth information recording unit 105 may also output the growth information not only to the user terminal 30 but also, for example, to a storage device that accumulates the sunny index and the shade index, an analysis device that analyzes the growth status and makes future predictions from the sunny index and the shade index, or a visualization device that performs visualization processing (such as generating graphs and maps) to make it easier to compare growth statuses. In short, the growth information recording unit 105 may output the growth information to any output destination, as long as doing so helps support those who work in the field.
2-14 Devices implementing the units
The devices that implement the functions shown in FIG. 4 and elsewhere may differ from those figures. For example, the drone may have all or some of the functions of the server device; in that case, the drone's processor is an example of the "information processing device" of the present invention. Alternatively, the user terminal 30 may implement the functions of the server device; in that case, the user terminal 30 is an example of the "information processing device" of the present invention.
The operation performed by one function may be performed by another function or by a newly provided function. For example, the growth information generation unit 104 may perform the operation performed by the index calculation unit 103 (the index calculation). A newly provided output unit may perform the output of the sunny index and the shade index performed by the growth information recording unit 105. Each function of the server device may also be implemented across two or more devices. In short, the agricultural support system may comprise any number of devices as long as these functions are realized by the system as a whole.
2-15 Categories of the invention
The present invention can be understood not only as the information processing devices such as the server device and the user terminal described above and a flying object such as a drone (which may also serve as the information processing device), but also as an information processing system, such as an agricultural support system, comprising those devices and the flying object. The present invention can also be understood as an information processing method for realizing the processing performed by each device, and as a program for causing the computer that controls each device to function. This program may be provided in the form of a recording medium, such as an optical disc, on which it is stored, or in a form in which it is downloaded to a computer via a network such as the Internet and installed to be made available for use.
2-16 Processing procedures, etc.
The order of the processing procedures, sequences, flowcharts, and the like of the embodiments described in this specification may be changed as long as no contradiction arises. For example, the methods described in this specification present the elements of the various steps in an exemplary order and are not limited to the specific order presented.
2-17 Handling of input and output information, etc.
Input and output information and the like may be stored in a specific location (for example, a memory) or managed in a management table. Input and output information and the like may be overwritten, updated, or appended. Output information and the like may be deleted. Input information and the like may be transmitted to another device.
2-18 Software
Software should be interpreted broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executable files, execution threads, procedures, functions, and the like, regardless of whether it is called software, firmware, middleware, microcode, hardware description language, or by another name.
Software, instructions, and the like may also be transmitted and received via a transmission medium. For example, when software is transmitted from a website, a server, or another remote source using wired technologies such as coaxial cable, optical fiber cable, twisted pair, and digital subscriber line (DSL), and/or wireless technologies such as infrared, radio, and microwave, these wired and/or wireless technologies are included within the definition of a transmission medium.
2-19 Information, Signals
The information, signals, and the like described in this specification may be represented using any of a variety of different technologies. For example, the data, instructions, commands, information, signals, bits, symbols, chips, and the like that may be referred to throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, optical fields or photons, or any combination thereof.
2-20 System, Network
As used in this specification, the terms "system" and "network" are used interchangeably.
2-21 Meaning of "Based on"
As used in this specification, the phrase "based on" does not mean "based only on" unless expressly specified otherwise. In other words, the phrase "based on" means both "based only on" and "based at least on".
2-22 "And", "Or"
In this specification, for a configuration that can be implemented either as "A and B" or as "A or B", a configuration described with one expression may be used as a configuration described with the other. For example, where "A and B" is described, it may be used as "A or B" as long as this is feasible without inconsistency with other descriptions.
2-23 Variations of Embodiments, etc.
The embodiments described in this specification may be used alone, may be used in combination, or may be switched between as execution proceeds. In addition, notification of predetermined information (for example, notification that "X holds") is not limited to being performed explicitly, and may be performed implicitly (for example, by not performing notification of the predetermined information).
Although the present invention has been described in detail above, it is apparent to those skilled in the art that the present invention is not limited to the embodiments described in this specification. The present invention can be implemented in modified and altered forms without departing from the spirit and scope of the present invention as defined by the claims. Accordingly, the description in this specification is for the purpose of illustration and has no restrictive meaning with respect to the present invention.
Description of Symbols
1 ... agricultural support system, 10 ... server apparatus, 20 ... drone, 30 ... user terminal, 101 ... field image generation unit, 102 ... sunny area determination unit, 103 ... index calculation unit, 104 ... growth information generation unit, 105 ... growth information recording unit, 106 ... correction image acquisition unit, 107 ... sunshine condition specifying unit, 108 ... flight instruction unit, 109 ... image acquisition unit, 201 ... flight control unit, 202 ... flight unit, 203 ... sensor measurement unit, 204 ... imaging unit.
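The symbol list above doubles as an architecture summary of the server-side processing. Purely as a reading aid — the wiring and every class, method, and parameter name below are assumptions reconstructed from the numerals, not code from the publication — the server units might compose like this:

```python
class AgricultureSupportServer:
    """Illustrative wiring of the server-side units (10, 101-105, 109).

    Each unit is passed in as a callable; all names are assumptions
    made for readability, not the publication's own identifiers.
    """

    def __init__(self, image_acquisition, field_image_generation,
                 sunny_area_determination, index_calculation,
                 growth_info_generation, growth_info_recording):
        self.image_acquisition = image_acquisition                # 109
        self.field_image_generation = field_image_generation      # 101
        self.sunny_area_determination = sunny_area_determination  # 102
        self.index_calculation = index_calculation                # 103
        self.growth_info_generation = growth_info_generation      # 104
        self.growth_info_recording = growth_info_recording        # 105

    def process_flight(self, drone_images):
        """One plausible flow: drone images -> stitched field image ->
        sunny mask -> growth indices -> recorded growth information."""
        captured = self.image_acquisition(drone_images)
        field_image = self.field_image_generation(captured)
        sunny = self.sunny_area_determination(field_image)
        indices = self.index_calculation(field_image, sunny)
        info = self.growth_info_generation(indices, sunny)
        self.growth_info_recording(info)
        return info
```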
Claims (8)
1. An information processing apparatus comprising: a determination unit that determines a sunny area of a field included in a captured image of the field; an index acquisition unit that acquires, as a sunny index, an index indicating a growth status of a crop in the determined sunny area; and an output unit that outputs the acquired sunny index as an index of the sunny area.
2. The information processing apparatus according to claim 1, wherein the index acquisition unit acquires, as a shade index, an index indicating the growth status of the crop in a shade area that has not been determined to be the sunny area, and the output unit outputs the acquired shade index as an index of the shade area.
3. The information processing apparatus according to claim 1 or 2, further comprising: a first image acquisition unit that acquires an image of the field captured by a flying object equipped with an imaging means; and a calculation unit that calculates, from the acquired image of the field, the index indicating the growth status of the crop, wherein the index acquisition unit acquires, as the sunny index, the index calculated for the sunny area out of the indices calculated for the entire image of the field, or acquires, as the sunny index, an index calculated only for the sunny area of the image of the field.
4. The information processing apparatus according to claim 3, further comprising: a correction acquisition unit that acquires an image for index correction captured by the flying object; and a condition specifying unit that specifies whether the capturing condition of the image for index correction is sunny or shaded, wherein, when the sunny capturing condition is specified, the calculation unit calculates, as the index for the portion of the image of the field determined to be the sunny area, an index corrected on the basis of the acquired image for index correction.
5. The information processing apparatus according to claim 4, wherein, when the shaded capturing condition is specified, the calculation unit calculates, as the index for the portion of the image of the field not determined to be the sunny area, an index corrected on the basis of the acquired image for index correction.
6. The information processing apparatus according to any one of claims 3 to 5, further comprising an instruction unit that instructs the flying object to re-capture a shade area that has not been determined to be the sunny area, or to capture it at a changed capturing time.
7. The information processing apparatus according to any one of claims 3 to 6, further comprising a second image acquisition unit that acquires an image of a fixed area of the field in which the sunny area and the shade area switch, the image being captured by an imaging means that images the fixed area, wherein the calculation unit calculates a corrected index for the portion of the image of the field not determined to be the sunny area, on the basis of a correlation between the index of the sunny area and the index of the shade area obtained from the acquired image of the fixed area.
8. The information processing apparatus according to any one of claims 3 to 7, wherein the calculation unit calculates a corrected index for the shade area portion of the image of the field, on the basis of a correlation between the index of the sunny area and the index of the shade area within a predetermined range of the boundary between the portion of the acquired image of the field determined to be the sunny area and the shade area portion not so determined.
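As a reading aid for claims 1 to 3, here is a minimal sketch of the claimed flow: determine a sunny mask, compute a growth index, and report it per area. It is not the patented implementation — the brightness-threshold classifier, the use of NDVI as the growth index, and all function names are illustrative assumptions.

```python
import numpy as np

def sunny_mask(rgb: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Label pixels sunny (True) or shaded (False) by luminance.

    A plain brightness threshold stands in for whatever test the claimed
    "determination unit" applies; 0.5 assumes channels scaled to [0, 1].
    """
    return rgb.mean(axis=2) >= threshold

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - R) / (NIR + R), a common growth index."""
    return (nir - red) / np.clip(nir + red, 1e-6, None)

def area_indices(rgb, red, nir):
    """Average the index separately over sunny and shaded areas.

    Mirrors claims 1-3: the sunny index comes either from masking a
    whole-field index map or from computing over the sunny area alone.
    """
    mask = sunny_mask(rgb)
    index = ndvi(red, nir)
    sunny_index = float(index[mask].mean())   # output as the sunny-area index
    shade_index = float(index[~mask].mean())  # claim 2: shade-area index
    return sunny_index, shade_index
```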
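Claims 4 and 5 correct the index using a separately captured "image for index correction" whose lighting condition (sunny or shaded) is known. One plausible reading — assumed here, not stated in the publication — is that the correction image shows a reference target of known reflectance, yielding per-band scale factors that are applied to the matching portion of the field image before the index is recomputed:

```python
import numpy as np

def band_factor(ref_band: np.ndarray, known_reflectance: float) -> float:
    """Scale factor that makes a reference target of known reflectance
    read correctly in one band (known_reflectance is an assumed
    calibration constant, not a value from the publication)."""
    return known_reflectance / float(ref_band.mean())

def corrected_index(red, nir, ref_red, ref_nir,
                    known_red: float = 0.2, known_nir: float = 0.2):
    """Recompute the index after per-band correction (claims 4 and 5).

    Apply the result to the sunny portion when the correction image was
    captured in sun, and to the remaining portion when it was captured
    in shade.
    """
    red_c = red * band_factor(ref_red, known_red)
    nir_c = nir * band_factor(ref_nir, known_nir)
    return (nir_c - red_c) / np.clip(nir_c + red_c, 1e-6, None)
```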
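Claims 7 and 8 correct shade-area indices via the correlation between sunny and shade indices observed where both are available: a fixed area that alternates between sun and shade (claim 7), or pixels within a set distance of the sun/shade boundary (claim 8). Below is a hedged sketch using a simple linear fit; the choice of a first-degree fit and of `numpy.polyfit` is an assumption, not taken from the publication.

```python
import numpy as np

def fit_shade_to_sun(shade_samples: np.ndarray, sunny_samples: np.ndarray):
    """Fit shade index -> sunny-equivalent index from paired samples.

    The pairs may come from a fixed area observed under both conditions
    (claim 7) or from a band around the sun/shade boundary (claim 8).
    """
    slope, intercept = np.polyfit(shade_samples, sunny_samples, deg=1)
    return slope, intercept

def correct_shade(index_map: np.ndarray, mask: np.ndarray,
                  slope: float, intercept: float) -> np.ndarray:
    """Replace values outside the sunny mask with their corrected form."""
    corrected = index_map.copy()
    corrected[~mask] = slope * corrected[~mask] + intercept
    return corrected
```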
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020523619A JP7218365B2 (en) | 2018-06-06 | 2019-05-23 | Information processing device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018108475 | 2018-06-06 | ||
JP2018-108475 | 2018-06-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019235240A1 (en) | 2019-12-12 |
Family
ID=68770226
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/020508 WO2019235240A1 (en) | 2018-06-06 | 2019-05-23 | Information processing device |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7218365B2 (en) |
WO (1) | WO2019235240A1 (en) |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2018357616A1 (en) * | 2017-10-26 | 2020-04-02 | Sony Corporation | Information processing device, information processing method, program, and information processing system |
- 2019
  - 2019-05-23 JP JP2020523619A patent/JP7218365B2/en active Active
  - 2019-05-23 WO PCT/JP2019/020508 patent/WO2019235240A1/en active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2006129492A1 (en) * | 2005-06-03 | 2006-12-07 | Honda Motor Co., Ltd. | Vehicle and road sign recognition device |
WO2009116613A1 | 2008-03-21 | 2009-09-24 | Ito En, Ltd. | Method and apparatus of evaluating fitness-for-plucking of tea leaf, system of evaluating fitness-for-plucking of tea leaf, and computer-usable medium |
JP2012183021A (en) * | 2011-03-04 | 2012-09-27 | Hitachi Ltd | Vegetation control device and plant growing system |
WO2018034166A1 | 2016-08-17 | 2018-02-22 | Sony Corporation | Signal processing device and signal processing method, and program |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021108650A | 2020-01-15 | 2021-08-02 | National Agriculture and Food Research Organization | Field crop-associated value derivation device, and field crop-associated value derivation method |
JP2021171057A | 2020-04-20 | 2021-11-01 | National Agriculture and Food Research Organization | Crop related value derivation device and crop related value derivation method |
JP7044285B2 | 2020-04-20 | 2022-03-30 | National Agriculture and Food Research Organization | Crop-related value derivation device and crop-related value derivation method |
WO2024190391A1 | 2023-03-13 | 2024-09-19 | Konica Minolta, Inc. | Identification device, identification method, and program |
Also Published As
Publication number | Publication date |
---|---|
JPWO2019235240A1 (en) | 2021-07-08 |
JP7218365B2 (en) | 2023-02-06 |
Similar Documents
Publication | Title |
---|---|
US20210217148A1 | Methods for agronomic and agricultural monitoring using unmanned aerial systems |
US11763441B2 | Information processing apparatus |
WO2019235240A1 | Information processing device |
US12111251B2 | Information processing apparatus, information processing method, program, and sensing system |
JP2018046787A | Agricultural management prediction system, agricultural management prediction method, and server apparatus |
KR20200065696A | system for monitoring agricultural produce using drone |
US11823447B2 | Information processing apparatus, information processing method, program, and information processing system |
JP7074126B2 | Image processing equipment, growth survey image creation system and program |
WO2021100430A1 | Information processing device, information processing method, and program |
AU2016339031A1 | Forestry information management systems and methods streamlined by automatic biometric data prioritization |
JP2020149201A | Method of presenting recommended spot for measuring growth parameters used for crop lodging risk diagnosis, method of lodging risk diagnosis, and information providing apparatus |
JP2019153109A | Agricultural management prediction system, agricultural management prediction method, and server device |
WO2019208538A1 | Information processing device |
US20220414362A1 | Method and system for optimizing image data for generating orthorectified image |
CN117014584B | Vegetation remote sensing product acquisition system and method |
Sorenson | Evaluation of unmanned aerial vehicles and analytical software for creation of a crop consulting business |
WO2021149355A1 | Information processing device, information processing method, and program |
Caballong et al. | Establishing an aerial mapping service for rice monitoring |
KR20240086048A | method and system for processing of growing image data using drone |
CN116128953A | Method, device and processor for determining crop leaf area index |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19815402; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2020523619; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 19815402; Country of ref document: EP; Kind code of ref document: A1 |