WO2024018559A1 - Shipping quantity prediction method and shipping quantity prediction system - Google Patents

Shipping quantity prediction method and shipping quantity prediction system

Info

Publication number
WO2024018559A1
Authority
WO
WIPO (PCT)
Prior art keywords
crops
harvest
target area
crop
amount
Prior art date
Application number
PCT/JP2022/028196
Other languages
French (fr)
Japanese (ja)
Inventor
由久 宇佐美
正裕 北島
Original Assignee
株式会社ファームシップ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ファームシップ
Priority to PCT/JP2022/028196
Publication of WO2024018559A1


Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01G HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
    • A01G 7/00 Botany in general
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/02 Agriculture; Fishing; Forestry; Mining

Definitions

  • The present invention relates to a shipping amount prediction method and a shipping amount prediction system that use images taken from above of target areas where crops are planted, and in particular to a method and system that estimate the amount of crops shipped to the market from multiple target areas.
  • Conventionally, crop yields have been predicted using various methods. For example, a surveyor goes to the site where crops are planted, observes the growing conditions of the crops, and predicts the yield based on experience. Such predictions by surveyors take time; in particular, when the planted area is large, predicting the yield takes even more time and cost. In addition, predictions based on the experience of surveyors differ from surveyor to surveyor, so the predicted yields vary and their reliability is poor. Besides prediction by surveyors, a harvest prediction device has been proposed, for example, in Patent Document 1.
  • The harvest prediction device of Patent Document 1 predicts the yield of a crop from image data of the crop. It includes a storage unit that stores type spectrum information (spectra specific to each crop type), growth stage spectrum information (spectra specific to each growth stage), growth period information (the period from the start of growth to harvest), and area yield information (the relationship between planted area and yield); a crop type identification unit that identifies the type of crop by comparing spectrum information obtained from the image data with the stored type spectrum information; a growth stage identification unit that identifies the growth stage of the crop by comparing the spectrum information with the stored growth stage spectrum information; a planted area calculation unit that calculates the planted area from the image data; and a harvest prediction unit that predicts the harvest time and yield of the crop from the crop type information, growth stage information, and planted area information together with the stored growth period information and area yield information.
  • In addition to Patent Document 1, Patent Document 2 proposes a cropping schedule calculation device that, in order to harmonize the demand for a crop with its harvest amount, calculates at least one planting date for the crop and the planting amount, which is the amount of the crop to be planted on that date. In Patent Document 2, the harvest amount of the crop is thereby brought closer to the demand amount.
  • Patent Document 1: JP 2003-006612 A; Patent Document 2: JP 2021-002110 A
  • When a specific region contains multiple fields and the harvest time and yield of crops are predicted for each field using, for example, Patent Document 1, the harvest time and yield of each individual field can be predicted. However, the amount of crops shipped from the fields of the specific region as a whole cannot be predicted. Likewise, if Patent Document 2 is used to bring the harvest amount of crops closer to the demand amount, the harvest amount of each field can be brought closer to the demand amount, but the amount of crops shipped from the fields of the entire region still cannot be predicted.
  • An object of the present invention is to provide a shipment amount prediction method and a shipment amount prediction system that predict the shipment amount of crops shipped from fields in the entire region to the market.
  • To achieve the above object, invention [1] provides a shipping amount prediction method comprising: a step of identifying the harvest time of crops in a target area based on an image taken from above of the target area where the crops are planted; a step of calculating the growth rate of the crops based on the harvest time; a step of predicting the yield of crops in the target area at the time of harvest based on the currently calculated growth rate and past growth rates; and a step of estimating, from the yields predicted for each of a plurality of target areas, the amount shipped to the market to which crops are shipped from the plurality of target areas.
  • Invention [2] is the shipping amount prediction method according to invention [1], wherein, in the step of identifying the harvest time of crops in the target area, the color distance between a current image taken from above of the target area where the crops are planted and a past image taken from above of the target area is calculated, and the harvest time of the crops in the target area is identified based on the color distance.
  • Invention [3] is the shipping amount prediction method according to invention [1] or [2], wherein the correlation between the crop yield in the target area at past harvest times and the growth rate is obtained in advance, and in the step of predicting the crop yield in the target area at the time of harvest, the yield in the target area at the time of harvest is predicted using that correlation.
  • Invention [4] is the shipping amount prediction method according to any one of inventions [1] to [3], wherein the crop is a leafy vegetable.
  • Invention [5] provides a shipping amount prediction system comprising: an identification unit that identifies the harvest time of crops in a target area based on an image taken from above of the target area where the crops are planted; a calculation unit that calculates the growth rate of the crops based on the harvest time; a prediction unit that predicts the crop yield in the target area at the time of harvest based on the currently calculated growth rate and past growth rates; and an estimation unit that estimates, from the yields predicted for each of a plurality of target areas, the amount shipped to the market to which crops are shipped from the plurality of target areas.
  • Invention [6] is the shipping amount prediction system according to invention [5], wherein the identification unit calculates the color distance between a current image taken from above of the target area where the crops are planted and a past image taken from above of the target area, and identifies the harvest time of the crops in the target area based on the color distance.
  • Invention [7] is the shipping amount prediction system according to invention [5] or [6], wherein the correlation between the crop yield in the target area at past harvest times and the growth rate is obtained in advance, and the prediction unit predicts the crop yield in the target area at the time of harvest using that correlation.
  • Invention [8] is the shipping amount prediction system according to any one of inventions [5] to [7], wherein the crop is a leafy vegetable.
  • According to the present invention, it is possible to provide a shipping amount prediction method that predicts the amount of crops shipped from the fields of an entire region to the market, and also a shipping amount prediction system that does the same.
  • FIG. 1 is a schematic diagram showing an example of a region targeted by the shipping amount prediction method according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram showing an example of a shipping amount prediction system according to an embodiment of the present invention.
  • FIG. 3 is a flowchart showing an example of the shipping amount prediction method according to an embodiment of the present invention.
  • FIGS. 4(a) to 4(d) are schematic diagrams for explaining an example of a method for identifying the planting time and the harvest time in the shipping amount prediction method according to the embodiment of the present invention.
  • FIGS. 5(a) and 5(b) are schematic diagrams for explaining an example of a method for identifying the harvest time of crops in the shipping amount prediction method according to the embodiment of the present invention.
  • FIG. 6 is a flowchart showing an example of the step of calculating the growth rate in the shipping amount prediction method according to the embodiment of the present invention.
  • FIG. 1 is a schematic diagram showing an example of a region targeted by the shipping amount prediction method according to the embodiment of the present invention.
  • FIG. 1 schematically shows a production area 10 in order to explain a shipping amount prediction method and a shipping amount prediction system.
  • the production area 10 is an area where crops are cultivated, and the target area is a cultivation area within the production area 10.
  • the unit (section unit) that defines the cultivation area may be a unit at the level of each prefecture, a unit at the municipal level, or a unit smaller than the municipality.
  • the production area 10 includes a plurality of cultivation areas, and the production area 10 shown in FIG. 1 includes, for example, three cultivation areas 10a, 10b, and 10c. These three cultivation areas 10a, 10b, and 10c correspond to target areas, respectively. Furthermore, the cultivation area 10a includes, for example, four fields 11a. Furthermore, in the cultivation area 10b, there are, for example, 12 fields 11b. Furthermore, the cultivation area 10c has 32 fields 11c. A field is a field where crops are grown. Note that there are no particular restrictions on the scale, planted area, etc. of each field in each cultivation region, and the fields may be arranged regularly or irregularly in each cultivation region. In the fields 11a, 11b, and 11c, crops are planted and harvested after growing to a predetermined size.
  • the crops are, for example, leafy vegetables.
  • Leafy vegetables are also called leaf vegetables. Examples of leafy vegetables include lettuce, komatsuna, cabbage, spinach, broccoli, garland chrysanthemum, bok choy, Chinese cabbage, mizuna, chive, and green onion.
  • FIG. 2 is a schematic diagram showing an example of a shipping amount prediction system according to an embodiment of the present invention.
  • the shipping amount prediction system 20 shown in FIG. 2 is an example of a system used in the shipping amount prediction method, and is configured using, for example, hardware such as a computer or software on the cloud.
  • As long as the shipping amount prediction method can be executed using hardware and software such as a computer, it is not limited to using the shipping amount prediction system 20 shown in FIG. 2. That is, the shipping amount prediction method of the present invention may be implemented using a configuration (system) other than the shipping amount prediction system 20 of FIG. 2. A program that causes a computer or the like to execute each step of the shipping amount prediction method may also be used.
  • the shipping amount prediction system 20 shown in FIG. 2 includes, for example, a processing section 22, an input section 24, and a display section 26.
  • The processing section 22 includes an acquisition unit 30, an identification unit 31, a calculation unit 32, a prediction unit 33, an estimation unit 34, a display control unit 35, a memory 36, and a control unit 37.
  • the shipping amount prediction system 20 also includes a ROM and the like.
  • the processing section 22 is controlled by a control section 37. Further, in the processing unit 22 , the acquisition unit 30 , the identification unit 31 , the calculation unit 32 , the prediction unit 33 , the estimation unit 34 , and the display control unit 35 are connected to the memory 36 . Furthermore, the data output from each of the acquisition section 30 , identification section 31 , calculation section 32 , prediction section 33 , estimation section 34 , and display control section 35 can be stored in the memory 36 .
  • In the shipping amount prediction system 20, the control unit 37 executes a program (computer software) stored in the ROM or the like, thereby functionally forming the acquisition unit 30, the identification unit 31, the calculation unit 32, the prediction unit 33, the estimation unit 34, and the display control unit 35.
  • The shipping amount prediction system 20 may be configured as a computer in which each part functions by executing a program, may be a dedicated device in which each part is configured with dedicated circuits, or may be configured on a server so as to run on the cloud.
  • The shipping amount prediction system 20 is constructed for the purpose of predicting the amount of crops to be shipped to the market from data related to the fields of an entire region, in particular image data of each field. Each part of the shipping amount prediction system 20 is explained below.
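Before each part is described, the following is a minimal, hypothetical sketch of how the processing flow of the system 20 (acquisition, identification, calculation, prediction, estimation) could be organized in software; the class, method, and data names are illustrative and not taken from the patent.

```python
# Hypothetical sketch of the processing flow of shipping amount prediction
# system 20: acquisition -> identification -> calculation -> prediction ->
# estimation.  Names and data shapes are assumptions for illustration.
from dataclasses import dataclass
from datetime import date


@dataclass
class FieldObservation:
    field_id: str      # which field in which cultivation area
    shot_date: date    # shooting date of the aerial image
    color_data: dict   # e.g. {"R": 120.3, "G": 98.1, "NIR": 160.0}


class ShippingAmountPredictor:
    def __init__(self, memory: dict):
        self.memory = memory  # plays the role of memory 36

    def identify_harvest_time(self, observations: list[FieldObservation]):
        """Identification unit 31: find the harvest date of each field."""
        ...

    def calculate_growth_rate(self, planting_date: date, harvest_date: date):
        """Calculation unit 32: growth rate from planting to harvest."""
        ...

    def predict_yield(self, growth_rate: float, past_correlation):
        """Prediction unit 33: yield at harvest time for a target area."""
        ...

    def estimate_shipment(self, yields_by_area: dict, days_to_market: int):
        """Estimation unit 34: total amount shipped to the market."""
        ...
```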
  • the input unit 24 is various input devices such as a mouse and a keyboard for inputting various information according to instructions from an operator.
  • The display section 26 displays, for example, the predicted amount of crops to be shipped from the fields of the entire region to the market, and various known display devices can be used. The display section 26 may also include a device, such as a printer, for outputting various information to an output medium.
  • An image taken from above of a target area where crops are planted is input from outside the shipping amount prediction system 20 in the form of image data, for example via the input unit 24, and is held in the acquisition unit 30.
  • Image data of an image taken from above of the target area where the above crops are planted, which is held in the acquisition unit 30 is read out to the identification unit 31 .
  • the above-described image data held in the acquisition unit 30 may be stored in the memory 36.
  • It is preferable that the image of the target area where crops are planted be taken from above, for example, once a day, and more preferably once a day at the same time. It is also preferable that the image include shooting date information and position information.
  • Level correction, atmospheric correction, geometric correction, etc. may be performed on the image data of an image taken of the target area from above.
  • Level correction is correction that removes variations in sensitivity or noise of the sensor that acquired the image data.
  • atmospheric correction is correction that removes noise caused by water vapor or the like when image data is acquired.
  • geometric correction is correction that removes geometric distortion and the like included in an image.
  • the image data of the image taken from above of the target area where crops are planted is not particularly limited as long as it is computer-readable data and computer-analyzable data.
  • the acquisition unit 30 may convert image data of an image taken from above of a target area where crops are planted into a data format that can be processed by the identification unit 31.
  • the image data of an image of the target area taken from above includes information on the specific wavelength. Specific wavelengths will be explained later.
  • the target areas where crops are planted are, for example, three cultivation areas 10a, 10b, and 10c of the production area 10 shown in FIG.
  • Images taken of the target area from above are not particularly limited, and can be taken and obtained using aircraft such as airplanes and helicopters, artificial satellites, or drones.
  • It is preferable that the photographed image include position information based on the photographing position. This makes it easier to identify the field in the target area in the photographed image by comparing the position information of the photographed image with the position information of the target area.
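As one illustrative way (not specified in the patent) to match the position information of a photographed image against the registered positions of fields, a simple point-in-polygon test could be used; the field outlines and coordinates below are assumptions.

```python
# Hypothetical example: determine which registered field an aerial image
# covers by comparing the image's position information (longitude/latitude)
# with the stored outline of each field.  Uses the shapely library.
from shapely.geometry import Point, Polygon

# Assumed field outlines (lon, lat) registered in advance for each field.
field_outlines = {
    "11b-01": Polygon([(138.10, 36.20), (138.11, 36.20),
                       (138.11, 36.21), (138.10, 36.21)]),
    # ... outlines of the other fields ...
}

def identify_field(image_lon: float, image_lat: float) -> str | None:
    """Return the field id whose outline contains the image position."""
    p = Point(image_lon, image_lat)
    for field_id, outline in field_outlines.items():
        if outline.contains(p):
            return field_id
    return None

print(identify_field(138.105, 36.205))  # -> "11b-01"
```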
  • the specifying unit 31 specifies the harvest time of crops in the target area based on an image (image data) taken from above of the target area where the crops are planted.
  • The harvest time of crops in a target area refers to the point at which the field no longer contains crops, i.e. when the crops have been harvested from the field, and is expressed as a harvest date and time. Seen from above, the appearance of a field changes depending on whether crops are present and on their growth condition, so the images (image data) before and after planting or harvesting differ. The identification unit 31 uses these differences between images (image data) to identify the harvest time.
  • Specifically, the identification unit 31 calculates the color distance between the current image taken from above of the target area where crops are planted and a past image taken from above of the same target area, and identifies the harvest time of the crops in the target area based on the color distance.
  • the past image is an image of the same area (field) as the current image, and is used to determine the growth status of crops in the current image or whether the crops in the current image have already been harvested.
  • As a procedure for identifying the harvest time using the current image and a past image, for example, when the image data is expressed in 8 bits (256 gradations), the photographing date and time of the current image is taken as the harvest time if the color distance changes by 10 or more. For this reason, in order to identify the harvest time, it is preferable to take an aerial image of the target area where crops are planted once a day. For the color distance, at least one wavelength of the image data may be used, but it is preferable to use a plurality of wavelengths.
  • the specifying unit 31 causes the memory 36 to store the information on the time of harvest of the crops described above and the information on the color distance.
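As an illustrative sketch of the above procedure (the field-level data structure and helper names are assumptions, not from the patent), comparing daily images against the previous day's image with the 10-gradation threshold might look like this:

```python
from datetime import date

THRESHOLD = 10  # a change of 10 or more gradations on the 8-bit (0-255) scale

def color_changed(current_value: float, past_value: float) -> bool:
    """True if the color distance for one wavelength component (e.g. R)
    between the current image and the past image is 10 or more."""
    return abs(current_value - past_value) >= THRESHOLD

def detect_change_dates(daily_r: dict[date, float]) -> list[date]:
    """Scan the daily mean R values of one field and return the dates on
    which the color changed from the previous day's image; depending on
    whether crops were present beforehand, such a date is interpreted as
    a planting time or a harvest time."""
    days = sorted(daily_r)
    return [d for prev, d in zip(days, days[1:]) if color_changed(daily_r[d], daily_r[prev])]
```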
  • the calculation unit 32 calculates the growth rate of crops based on the harvest time.
  • the growth rate is, for example, the number of days from planting to harvesting. For this reason, it is necessary to specify the planting date (time of planting). Planting is when crops are planted in fields where there are no crops.
  • The planting time can be identified by color distance, similarly to the harvest time described above. When the image data is expressed in 8 bits (256 gradations), the photographing date and time of the image is taken as the planting time if the color distance changes by 10 or more. For this reason, in order to identify the planting time, it is also preferable to take an aerial image of the target area where crops are planted once a day.
  • Specifically, an image of the field before planting (in which the soil is exposed) is taken in advance, and the color distance between this image and a newly photographed image of the same field is determined. When this color distance is 10 or more, the date and time at which the new image was taken is defined as the planting time. The growth rate will be explained in detail later.
  • the prediction unit 33 predicts the yield of crops in the target area at the time of harvest based on the currently calculated growth rate and past growth rates.
  • the correlation between crop yields and growth speeds in the target area at past harvest times is obtained in advance.
  • the yield amount when the number of days from planting to harvest date is 50, 70, and 100 days is obtained in advance for each type of crop or for each field.
  • the number of days from the planting date to the harvesting date represents the growth rate of the crop (more specifically, the past growth rate).
  • the correlations obtained in advance for each type of crop or for each field are stored in the memory 36, for example.
  • The prediction unit 33 reads the above-mentioned correlation from the memory 36 and predicts the yield of crops in the target area at the time of harvest. Specifically, based on the correlation, the prediction unit 33 predicts the harvest amount corresponding to the growth rate currently calculated by the calculation unit 32. The prediction unit 33 predicts the harvest amount at the time of harvest in each of the plurality of target areas; specifically, it predicts the yields of the three cultivation areas 10a, 10b, and 10c of the production area 10 shown in FIG. 1, that is, the yields of the fields 11a, 11b, and 11c shown in FIG. 1.
  • the method of determining the correlation is not particularly limited, and various known methods can be used.
  • For the correlation, for example, Pearson's correlation analysis or covariance is used.
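As an illustrative sketch of how such a correlation could be obtained and used (the linear fit and the numbers are assumptions for illustration, not data from the patent):

```python
import numpy as np

# Past data for one crop type (or one field): days from planting to harvest
# and the yield obtained at that time.  Values are illustrative only.
past_days = np.array([50, 70, 100, 60, 80])
past_yield_kg = np.array([4200, 3600, 2800, 3900, 3300])

# Strength of the relationship (Pearson's correlation coefficient).
r = np.corrcoef(past_days, past_yield_kg)[0, 1]

# Simple linear model mapping growth rate (days from planting to harvest) to yield.
slope, intercept = np.polyfit(past_days, past_yield_kg, 1)

def predict_yield(days_to_harvest: float) -> float:
    """Predict the yield corresponding to the currently calculated growth rate."""
    return slope * days_to_harvest + intercept

print(f"correlation r = {r:.2f}, predicted yield for 65 days = {predict_yield(65):.0f} kg")
```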
  • The estimation unit 34 estimates the amount of crops shipped from the plurality of target areas to the market based on the harvest amount predicted for each of the plurality of target areas. Specifically, the shipment amount is estimated using the yields of the three cultivation areas 10a, 10b, and 10c of the production area 10 shown in FIG. 1. Crops harvested in each cultivation area of the production area 10 (more specifically, in each field of each cultivation area) are shipped to the market. As described above, the harvest amount is predicted for each field in each cultivation area based on the identified harvest date; therefore, if the period from the harvest date to shipment to the market is known, it is possible to know on what date and in what quantity the crops harvested from each field will be shipped to the market.
  • In the situation where the harvest date has been identified for each of the fields 11a, 11b, and 11c in the three cultivation areas 10a, 10b, and 10c, the estimation unit 34 uses the harvest amounts predicted by the prediction unit 33 to estimate the amount of crops shipped to the market from the production area 10 on a specific date.
  • the amount of crops shipped to the market estimated by the estimation unit 34 is stored in the memory 36, for example.
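A hedged sketch of the estimation step, aggregating predicted yields whose harvest date plus an assumed shipping lag falls on the date of interest (field names, dates, and quantities are illustrative, not from the patent):

```python
from datetime import date, timedelta

# Predicted harvest date and yield for each field (illustrative values).
predictions = [
    {"field": "11a-1", "harvest": date(2022, 7, 20), "yield_kg": 1200},
    {"field": "11b-3", "harvest": date(2022, 7, 20), "yield_kg": 950},
    {"field": "11c-7", "harvest": date(2022, 7, 21), "yield_kg": 1800},
]

DAYS_TO_MARKET = 2  # assumed period from harvest to arrival at the market

def estimate_shipment(target_date: date) -> float:
    """Total amount shipped to the market on target_date from all target areas."""
    return sum(p["yield_kg"] for p in predictions
               if p["harvest"] + timedelta(days=DAYS_TO_MARKET) == target_date)

print(estimate_shipment(date(2022, 7, 22)))  # -> 2150 (fields harvested on 7/20)
```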
  • The display control unit 35 causes the display section 26 to display the image data acquired by the acquisition unit 30, the harvest time information identified by the identification unit 31, the growth rate calculated by the calculation unit 32, the crop yield in the target area predicted by the prediction unit 33, and the amount shipped to the market estimated by the estimation unit 34.
  • various information may be read out from the memory 36 and displayed.
  • the display control section 35 can also display various information inputted via the input section 24 on the display section 26. Further, the information to be displayed on the display section 26 may be transmitted using a transmitting section (not shown).
  • FIG. 3 is a flowchart showing an example of a shipping amount prediction method according to an embodiment of the present invention.
  • the harvest time of crops in the target area is specified based on an image taken from above of the target area where the crops are planted (step S10).
  • images are obtained by photographing the three cultivation areas 10a, 10b, and 10c of the production area 10 shown in FIG. 1 from above using an aircraft, an artificial satellite, or a drone.
  • the above-mentioned image acquisition frequency is, for example, once a day.
  • the image data of the obtained image is output to the acquisition unit 30 of the processing unit 22 of the shipping amount prediction system 20, for example.
  • RGB are the three primary colors used in regular cameras. There are various choices for the wavelengths of RGB, but typically R has a wavelength of 600 to 700 nm, G has a wavelength of 500 to 600 nm, and B has a wavelength of 400 to 500 nm.
  • NIR refers to near-infrared light and has a wavelength of 700 to 800 nm.
  • UV refers to ultraviolet light and has a wavelength of 200 to 400 nm.
  • Images taken from above may be, for example, images taken by a multispectral camera, images taken by an infrared camera or an ultraviolet camera, or images obtained from reflected radio waves; the type of image is not particularly limited. It is preferable to use images containing the red (R) wavelength or the near-infrared wavelength (NIR), because plants can then be detected easily. For this reason, it is preferable for an image taken from the sky to have only red (R), or only red (R) and near-infrared (NIR), as wavelength components.
  • the camera (imaging device) that captures images taken from above is not particularly limited as long as it has sensitivity to the above-mentioned wavelengths, but it is preferably one that can capture a wide range at once. For this reason, it is preferable to use a camera mounted on the above-mentioned artificial satellite. Note that the number of cameras (imaging devices) is not particularly limited, and may be one or more.
  • FIGS. 4(a) to 4(d) are schematic diagrams for explaining an example of a method for specifying the planting time and harvesting time in the shipping amount prediction method according to the embodiment of the present invention.
  • FIG. 4(a) shows a field 40 before crops are planted, FIG. 4(b) shows a field 41 in which crops 13 have been planted, FIG. 4(c) shows a field 42 during the harvest period of the crops 13, and FIG. 4(d) shows a field 43 after the crops have been harvested.
  • As shown in FIGS. 4(a) to 4(d), the degree of soil exposure and the like differ among the fields 40 to 43. Accordingly, the color distance can be calculated from images taken from the sky, and based on the color distance it is possible to identify the state of the target area before planting, the state in which crops have been planted, the state in which no crops remain after harvesting, and thus the time at which the crops were harvested.
  • FIGS. 5A and 5B are schematic diagrams for explaining an example of a method for specifying the harvest time of crops in the shipment amount prediction method according to the embodiment of the present invention.
  • FIGS. 5(a) and 5(b) show the cultivation area 10b shown in FIG. 1.
  • the cultivation area 10b has 12 fields 11b as described above.
  • position information has been acquired for the 12 fields 11b in the cultivation area 10b.
  • the image (image data) of the field 11b which will be described later, includes position information. Thereby, it is possible to identify which of the 12 fields 11b in the cultivation area 10b is in each photographed image.
  • an image (image data) of the field 11b shown in FIG. 5(a) is obtained before planting.
  • the color distance between a current image taken of the target area from above and a past image of the target area taken from above is used for the change in the image (image data).
  • The past image may be an image taken one day or more before the image to be compared (that is, the current image); however, since planting the crops 13 does not take much time, it is sufficient for the past image to be an image from one day earlier.
  • Past images are stored in the memory 36, for example.
  • color data of a specific area of an image (image data) is used for the color distance. The specific area will be explained later, but the specific area is, for example, the area in the middle of the photographed field.
  • an image obtained by averaging images from a normal year may be used as the past image.
  • normal color data including color data before planting in a normal year and color data immediately before harvest in a normal year, which will be described later, may be used.
  • the image data of the image of the field 12a in FIG. 5(a) is designated as D1
  • the image data of the image of the field 12a in which the crops 13 are planted in FIG. 5(b) is designated as D2.
  • Both the image data D1 and the image data D2 are expressed, for example, in 8 bits (256 gradations), with image data values ranging from gradation 0 to gradation 255.
  • For example, the color distance between the R wavelength component R1 of the image data D1 and the R wavelength component R2 of the image data D2 is calculated; if, on the 256-gradation scale, this distance is 10 or more, it is determined that crops have been planted. In this way, the planting date can be identified from the color distance calculated from the image data of each date and time.
  • the specified planting date is stored in the memory 36. Note that when the color distance is Dc, Dc is expressed by the following formula.
  • the color distance may be a combination of the wavelength component R and NIR.
  • the color distance Dc is expressed by the following formula.
  • the color distance may also be a combination of wavelength components R, G, and B.
  • the color distance Dc is expressed by the following formula.
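The formulas referred to above are not reproduced in this text. As an assumption consistent with the description (an absolute difference for a single component and a Euclidean-style distance when components are combined), the color distance Dc could be computed as follows; these definitions are illustrative rather than quoted from the patent:

```python
import math

def dc_r(r1: float, r2: float) -> float:
    """Color distance using only the R component (assumed |R1 - R2|)."""
    return abs(r1 - r2)

def dc_r_nir(r1: float, nir1: float, r2: float, nir2: float) -> float:
    """Assumed Euclidean distance over the R and NIR components."""
    return math.sqrt((r1 - r2) ** 2 + (nir1 - nir2) ** 2)

def dc_rgb(rgb1: tuple, rgb2: tuple) -> float:
    """Assumed Euclidean distance over the R, G, and B components."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(rgb1, rgb2)))

# A change of 10 or more (on the 0-255 scale) is treated as planting or harvest.
print(dc_rgb((120, 98, 60), (131, 95, 58)) >= 10)  # -> True
```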
  • the procedure for specifying the planting date has been described above, but the procedure for specifying the harvest time is also the same as the procedure for specifying the above-mentioned planting date.
  • That is, the color distance between the current image taken from above of the target area where the crops are planted and a past image taken from above of the same target area is calculated, and the harvest time of the crops in the target area is identified based on the color distance. Specifically, for example, the image data Ds of the field 41 shown in FIG. 4(b) and the image data Dh of the field 42 shown in FIG. 4(c) are compared, and, as described above, the color distance between the image data Ds and the image data Dh is determined to identify the harvest time.
  • The color distance may use only R, a combination of R and NIR, or a combination of R, G, and B. If, on the 256-gradation scale, the distance is 10 or more, it is determined that the crops in the target area in the current image have already been harvested. Based on this determination, the harvest time is identified from the date and time at which the current image was taken. The identified harvest time is stored in the memory 36, for example, as a harvest date.
  • Next, the growth rate of the crops is calculated based on the harvest time (step S12).
  • In step S12, for example, the method shown in FIG. 6 described below is used.
  • FIG. 6 shows an example of step S12, which is executed by the calculation unit 32 of FIG. 2, for example.
  • FIG. 6 is a flowchart showing an example of the process of calculating the growth rate of the shipping amount prediction method according to the embodiment of the present invention.
  • In calculating the growth rate of crops in the target area (step S12), first, a specific field is photographed from above on a specific day to obtain an image, and image data is obtained (step S20).
  • the specific date in step S20 is the photographing date of the image.
  • the image includes shooting date information and location information.
  • Next, color data of a specific area of the image is acquired (step S22).
  • the specific area in step S22 is the area in the middle of the field in the photographed image.
  • For the color data, the wavelength component R, the combination of R and NIR, or the combination of R, G, and B described above is used. When R and NIR are used, their average value is used as the color data; likewise, when R, G, and B are used, their average value is used as the color data.
  • the center of the field is the geometric center determined from the outline of the field in the image (image data). The outline of the field can be identified by processing the image data of the photographed image and extracting the outline of the field.
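A sketch of one way to obtain the geometric center of a field from its outline in the image, using OpenCV contour extraction (an illustrative choice; the patent does not name a library or a specific method):

```python
import cv2
import numpy as np

def field_center(image_bgr: np.ndarray) -> tuple[int, int]:
    """Return the (x, y) centroid of the largest contour in the image,
    taken here as the geometric center of the field."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    field = max(contours, key=cv2.contourArea)  # assume the field is the largest outline
    m = cv2.moments(field)
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
```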
  • The normal-year color data is the average value of the color data of the middle area (specific area) of the field in the photographed image. The normal-year color data includes color data before planting in a normal year and color data immediately before harvest in a normal year.
  • the color data before planting in a normal year is the color data on the day immediately before planting obtained from past data, and the wavelength components of the color data are as described above.
  • the color data immediately before harvest in a normal year is the color data on the day immediately before harvest obtained from past data, and the wavelength components of the color data are as described above.
  • the past data is, for example, image data of photographed fields for the past 10 years.
  • the normal color data is stored, for example, in the memory 36 shown in FIG. 2 or on the cloud.
  • Next, the amount of crop growth is calculated (step S26).
  • In step S26, it is calculated where the color of the field in the photographed image lies between the color on the planting date and the color on the day immediately before harvest, and the amount of growth is calculated based on the color of the field in the image.
  • When the amount of crop growth based on the color of the field in the image is denoted α1 and the RGB wavelength components of the image data are used, α1 is expressed by the following formula:
  • α1 = ((Cr − Fr)/(Hr − Fr) + (Cg − Fg)/(Hg − Fg) + (Cb − Fb)/(Hb − Fb)) / 3
  • Cr, Cg, and Cb are the values of the wavelength components RGB of the color data on the specific day obtained in step S20.
  • Fr, Fg, and Fb are the values of the wavelength components RGB of the planting day color data for a normal year.
  • The normal-year planting-date color data is the color data on the planting date obtained from past data. The planting date in a normal year is the average planting date determined from past data.
  • Hr, Hg, and Hb are the values of wavelength components RGB of color data on the day immediately before harvest in a normal year.
  • The color data on the day immediately before harvest in a normal year is the color data on that day obtained from past data. The day immediately before harvest in a normal year is the day before the normal-year harvest date, i.e. the average day immediately before harvest determined from past data.
  • Also in step S26, the amount of growth is calculated from the number of days of growth.
  • When the amount of growth based on the number of days of growth is denoted α2, it is expressed by the following formula using the planting date and the day immediately before harvest:
  • α2 = (D − Df)/(Dh − Df)
  • D is the specific date on which the image was taken.
  • Df is the planting date in a normal year, and is the average planting date obtained from past data.
  • Dh is the day immediately before harvest in a normal year, and is the average day immediately before harvest determined from past data.
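As a small sketch, the two growth amounts α1 and α2 can be computed directly from the formulas above; the input values below are illustrative, and the normal-year color data and dates are assumed to have been prepared from past data.

```python
from datetime import date

def growth_amount_color(c: tuple, f: tuple, h: tuple) -> float:
    """alpha1: growth amount based on the color of the field in the image.
    c, f, h are (R, G, B) tuples for the specific day, the normal-year
    planting day, and the normal-year day immediately before harvest."""
    return sum((ci - fi) / (hi - fi) for ci, fi, hi in zip(c, f, h)) / 3

def growth_amount_days(d: date, df: date, dh: date) -> float:
    """alpha2: growth amount based on the number of days of growth, where
    df and dh are the normal-year planting date and the normal-year day
    immediately before harvest."""
    return (d - df).days / (dh - df).days

alpha1 = growth_amount_color((140, 120, 70), (90, 80, 60), (180, 150, 80))
alpha2 = growth_amount_days(date(2022, 7, 1), date(2022, 6, 1), date(2022, 7, 20))
```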
  • Next, the growth rate of the crops is calculated (step S28). When α1 > α2, the period from the planting date to harvest is short, and the growth rate S is fast.
  • To obtain the growth rate of a specific area (specifically, a specific cultivation area), the growth rate is calculated at multiple locations in the area and the average value is taken. For example, in the three cultivation areas 10a, 10b, and 10c shown in FIG. 1, the growth rate is calculated for each of the fields 11a, 11b, and 11c, and the average of the growth rates of the fields 11a, 11b, and 11c is calculated for each of the three cultivation areas 10a, 10b, and 10c. Furthermore, to obtain the growth rate of a specific vegetable, the growth rate is calculated over the entire production area from which that vegetable is shipped during the period in question, and the average value is taken. For example, when the same vegetable is grown in the three cultivation areas 10a, 10b, and 10c shown in FIG. 1, the growth rate of each area is calculated, and their average is taken as the growth rate of that specific vegetable.
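The patent does not give an explicit formula for the growth rate S, only that S is fast when α1 > α2; one hedged reading is to express S as the ratio α1/α2 and to average it over the fields of a cultivation area, as sketched below (an assumption, not the claimed definition).

```python
def growth_rate(alpha1: float, alpha2: float) -> float:
    """Assumed definition: S > 1 means the crop is growing faster than in a
    normal year (alpha1 > alpha2); S < 1 means it is growing more slowly."""
    return alpha1 / alpha2

# Average growth rate of a cultivation area over its fields (illustrative values).
field_rates = [growth_rate(a1, a2) for a1, a2 in [(0.54, 0.61), (0.70, 0.61), (0.66, 0.61)]]
area_rate = sum(field_rates) / len(field_rates)
```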
  • the yield of crops in the target area at the time of harvest is predicted based on the growth rate calculated this time and the past growth rate (step S14).
  • the yield of crops in the target area is predicted, for example, using the correlation for each type of crop acquired in advance as described above.
  • the yields of the three cultivation areas 10a, 10b, and 10c of the production area 10 shown in FIG. 1 are predicted. That is, the yields of the fields 11a, 11b, and 11c shown in FIG. 1 are predicted.
  • Next, the shipment amount of the plurality of cultivation areas (that is, the plurality of target areas) is estimated (step S16).
  • In step S16, the yields of the three cultivation areas 10a, 10b, and 10c of the production area 10 shown in FIG. 1, predicted in step S14 described above, are used. If the period from the harvest date to shipment to the market is known, it is possible to know on what date and in what quantity the crops harvested from each field will be shipped to the market.
  • Therefore, in step S16, for each of the fields 11a, 11b, and 11c in the three cultivation areas 10a, 10b, and 10c, the amount of crops shipped to the market from the production area 10 on a specific date is estimated.
  • In this way, the amount shipped to the market to which crops are shipped from the plurality of cultivation areas is estimated.
  • By estimating the amount shipped to the market, it is possible to adjust the shipment amount of the crops being produced so as to avoid oversupply or undersupply to the market. This makes it possible to suppress sudden rises and collapses in crop prices.
  • For example, the amount shipped to the Tokyo market can be estimated if information on the amount shipped by prefecture is available.
  • the target areas are preferably Nagano, Ibaraki, Shizuoka, Hyogo, Nagasaki, Gunma, Tochigi, Kagawa, Chiba, and Fukuoka.
  • In general, the shipment volume can be predicted by investigating the cultivation areas (shipping sources) that account for more than half of the annual shipment volume, so it is preferable to select cultivation areas that account for more than half of the shipment volume as the target areas.
  • The cultivation areas to be investigated preferably account for 60% or more, and more preferably 70% or more, of the annual shipment amount. Since the areas in which a crop is cultivated change with the season, the cultivation areas to be investigated may be changed for each season; in this case, cultivation areas covering the above-mentioned ratios are selected as target areas for each month.
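As an illustrative sketch of selecting target areas by cumulative share of annual shipments (the prefecture shares below are made up for illustration, not actual shipment statistics):

```python
# Illustrative shares of annual shipment volume by prefecture (not real data).
shares = {"Nagano": 0.32, "Ibaraki": 0.12, "Gunma": 0.10, "Shizuoka": 0.07,
          "Hyogo": 0.06, "Nagasaki": 0.05, "Chiba": 0.04, "Fukuoka": 0.03}

def select_target_areas(shares: dict[str, float], coverage: float = 0.6) -> list[str]:
    """Pick prefectures in descending order of share until the chosen
    coverage (e.g. 60% of annual shipments) is reached."""
    selected, total = [], 0.0
    for name, share in sorted(shares.items(), key=lambda kv: kv[1], reverse=True):
        selected.append(name)
        total += share
        if total >= coverage:
            break
    return selected

print(select_target_areas(shares))  # -> ['Nagano', 'Ibaraki', 'Gunma', 'Shizuoka']
```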
  • The present invention is basically configured as described above. Although the shipping amount prediction method and shipping amount prediction system of the present invention have been described in detail, the present invention is not limited to the above-described embodiment, and various improvements and changes may of course be made without departing from the gist of the present invention.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Environmental Sciences (AREA)
  • Ecology (AREA)
  • Agronomy & Crop Science (AREA)
  • Animal Husbandry (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Mining & Mineral Resources (AREA)
  • Botany (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Economics (AREA)
  • Forests & Forestry (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

Provided are a shipping quantity prediction method and a shipping quantity prediction system for predicting the shipping quantity of crops to be shipped from fields in an entire region to a market. This shipping quantity prediction method comprises: a step for identifying the harvest timing of crops in a target region, in which the crops are planted, on the basis of an image obtained by photographing from the sky the target region; a step for calculating the growth rate of the crops on the basis of the harvest timing; a step for predicting a yield of the crops in the target region at the harvest timing on the basis of the growth rate having been calculated currently and growth rates in the past; and a step for estimating the shipping quantity of crops to be shipped from a plurality of the target regions to the market on the basis of yields predicted from the respective target regions.

Description

Shipping amount prediction method and shipping amount prediction system

 The present invention relates to a shipping amount prediction method and a shipping amount prediction system that use images taken from above of target areas where crops are planted, and in particular to a method and system that estimate the amount of crops shipped to the market from multiple target areas.

 Conventionally, crop yields have been predicted using various methods. For example, a surveyor goes to the site where crops are planted, observes the growing conditions of the crops, and predicts the yield based on experience. Such predictions by surveyors take time; in particular, when the planted area is large, predicting the yield takes even more time and cost. In addition, predictions based on the experience of surveyors differ from surveyor to surveyor, so the predicted yields vary and their reliability is poor. Besides prediction by surveyors, a harvest prediction device has been proposed, for example, in Patent Document 1.

 The harvest prediction device of Patent Document 1 predicts the yield of a crop from image data of the crop. It includes a storage unit that stores type spectrum information (spectra specific to each crop type), growth stage spectrum information (spectra specific to each growth stage), growth period information (the period from the start of growth to harvest), and area yield information (the relationship between planted area and yield); a crop type identification unit that identifies the type of crop by comparing spectrum information obtained from the image data with the stored type spectrum information; a growth stage identification unit that identifies the growth stage of the crop by comparing the spectrum information with the stored growth stage spectrum information; a planted area calculation unit that calculates the planted area from the image data; and a harvest prediction unit that predicts the harvest time and yield of the crop from the crop type information, growth stage information, and planted area information together with the stored growth period information and area yield information.

 In addition to Patent Document 1, Patent Document 2 proposes a cropping schedule calculation device that, in order to harmonize the demand for a crop with its harvest amount, calculates at least one planting date for the crop and the planting amount, which is the amount of the crop to be planted on that date. In Patent Document 2, the harvest amount of the crop is thereby brought closer to the demand amount.

 Patent Document 1: JP 2003-006612 A / Patent Document 2: JP 2021-002110 A
 When a specific region contains multiple fields and the harvest time and yield of crops are predicted for each field using, for example, Patent Document 1, the harvest time and yield of each individual field can be predicted. However, the amount of crops shipped from the fields of the specific region as a whole cannot be predicted.

 Likewise, if Patent Document 2 is used to bring the harvest amount of crops closer to the demand amount, the harvest amount of each field can be brought closer to the demand amount. However, the amount of crops shipped from the fields of the entire region still cannot be predicted.

 If the amount of crops shipped from the fields of the entire region cannot be predicted, a large quantity of the same type of crop may be shipped to the market and cause an oversupply, or conversely a supply shortage may occur, so that a constant amount of crops cannot be shipped to the market stably.

 For this reason, it is desirable to predict the amount of crops shipped from the fields of the entire region to the market.
 An object of the present invention is to provide a shipping amount prediction method and a shipping amount prediction system that predict the amount of crops shipped from the fields of an entire region to the market.

 To achieve the above object, invention [1] provides a shipping amount prediction method comprising: a step of identifying the harvest time of crops in a target area based on an image taken from above of the target area where the crops are planted; a step of calculating the growth rate of the crops based on the harvest time; a step of predicting the yield of crops in the target area at the time of harvest based on the currently calculated growth rate and past growth rates; and a step of estimating, from the yields predicted for each of a plurality of target areas, the amount shipped to the market to which crops are shipped from the plurality of target areas.
 Invention [2] is the shipping amount prediction method according to invention [1], wherein, in the step of identifying the harvest time of crops in the target area, the color distance between a current image taken from above of the target area where the crops are planted and a past image taken from above of the target area is calculated, and the harvest time of the crops in the target area is identified based on the color distance.

 Invention [3] is the shipping amount prediction method according to invention [1] or [2], wherein the correlation between the crop yield in the target area at past harvest times and the growth rate is obtained in advance, and in the step of predicting the crop yield in the target area at the time of harvest, the yield in the target area at the time of harvest is predicted using that correlation.

 Invention [4] is the shipping amount prediction method according to any one of inventions [1] to [3], wherein the crop is a leafy vegetable.
 Invention [5] provides a shipping amount prediction system comprising: an identification unit that identifies the harvest time of crops in a target area based on an image taken from above of the target area where the crops are planted; a calculation unit that calculates the growth rate of the crops based on the harvest time; a prediction unit that predicts the crop yield in the target area at the time of harvest based on the currently calculated growth rate and past growth rates; and an estimation unit that estimates, from the yields predicted for each of a plurality of target areas, the amount shipped to the market to which crops are shipped from the plurality of target areas.
 Invention [6] is the shipping amount prediction system according to invention [5], wherein the identification unit calculates the color distance between a current image taken from above of the target area where the crops are planted and a past image taken from above of the target area, and identifies the harvest time of the crops in the target area based on the color distance.

 Invention [7] is the shipping amount prediction system according to invention [5] or [6], wherein the correlation between the crop yield in the target area at past harvest times and the growth rate is obtained in advance, and the prediction unit predicts the crop yield in the target area at the time of harvest using that correlation.

 Invention [8] is the shipping amount prediction system according to any one of inventions [5] to [7], wherein the crop is a leafy vegetable.
 According to the present invention, it is possible to provide a shipping amount prediction method that predicts the amount of crops shipped from the fields of an entire region to the market, and also a shipping amount prediction system that does the same.
 FIG. 1 is a schematic diagram showing an example of a region targeted by the shipping amount prediction method according to an embodiment of the present invention.
 FIG. 2 is a schematic diagram showing an example of a shipping amount prediction system according to an embodiment of the present invention.
 FIG. 3 is a flowchart showing an example of the shipping amount prediction method according to an embodiment of the present invention.
 FIGS. 4(a) to 4(d) are schematic diagrams for explaining an example of a method for identifying the planting time and the harvest time in the shipping amount prediction method according to the embodiment of the present invention.
 FIGS. 5(a) and 5(b) are schematic diagrams for explaining an example of a method for identifying the harvest time of crops in the shipping amount prediction method according to the embodiment of the present invention.
 FIG. 6 is a flowchart showing an example of the step of calculating the growth rate in the shipping amount prediction method according to the embodiment of the present invention.
 DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS: The shipping amount prediction method and shipping amount prediction system of the present invention are described in detail below based on the preferred embodiments shown in the accompanying drawings.

 The figures described below are illustrative for explaining the present invention, and the present invention is not limited to them.

 In the following, a numerical range expressed with "to" includes the values written on both sides. For example, when ε ranges from a value εα to a value εβ, the range of ε includes εα and εβ as its lower and upper limits; in mathematical notation, εα ≤ ε ≤ εβ.
 FIG. 1 is a schematic diagram showing an example of a region targeted by the shipping amount prediction method according to the embodiment of the present invention. FIG. 1 schematically shows a production area 10 in order to explain the shipping amount prediction method and shipping amount prediction system. The production area 10 is an area where crops are cultivated, and a target area is a cultivation area within the production area 10. The unit (section unit) that defines a cultivation area may be a prefecture-level unit, a municipal-level unit, or a unit smaller than a municipality.
The production area 10 includes a plurality of cultivation areas; the production area 10 shown in FIG. 1 includes, for example, three cultivation areas 10a, 10b, and 10c. Each of these three cultivation areas 10a, 10b, and 10c corresponds to a target area.
The cultivation area 10a contains, for example, four fields 11a, the cultivation area 10b contains, for example, twelve fields 11b, and the cultivation area 10c contains thirty-two fields 11c. A field is a plot of farmland where crops are grown. There are no particular restrictions on the size, planted area, and the like of each field in each cultivation area, and the fields in each cultivation area may be arranged regularly or irregularly.
In the fields 11a, 11b, and 11c, crops are planted, grown to a predetermined size, and then harvested.
The crops are, for example, leafy vegetables (also called leaf vegetables). Examples of leafy vegetables include lettuce, komatsuna, cabbage, spinach, broccoli, garland chrysanthemum, bok choy, Chinese cabbage, mizuna, chive, and green onion.
(Shipping amount prediction system)
Next, the shipping amount prediction system will be explained.
FIG. 2 is a schematic diagram showing an example of a shipping amount prediction system according to an embodiment of the present invention. The shipping amount prediction system 20 shown in FIG. 2 is an example of a system used in the shipping amount prediction method, and is configured using, for example, hardware such as a computer, or software on the cloud. As long as the shipping amount prediction method can be executed using hardware and software such as a computer, the method is not limited to the shipping amount prediction system 20 shown in FIG. 2; a configuration (system) other than the shipping amount prediction system 20 of FIG. 2 may be used to implement the shipping amount prediction method of the present invention. A program for causing a computer or the like to execute each step of the shipping amount prediction method may also be used.
The shipping amount prediction system 20 shown in FIG. 2 includes, for example, a processing unit 22, an input unit 24, and a display unit 26. The processing unit 22 includes an acquisition unit 30, a specifying unit 31, a calculation unit 32, a prediction unit 33, an estimation unit 34, a display control unit 35, a memory 36, and a control unit 37. Although not shown, the shipping amount prediction system 20 also includes a ROM and the like.
The processing unit 22 is controlled by the control unit 37. In the processing unit 22, the acquisition unit 30, the specifying unit 31, the calculation unit 32, the prediction unit 33, the estimation unit 34, and the display control unit 35 are connected to the memory 36, and the data output from each of these units can be stored in the memory 36.
In the shipping amount prediction system 20, the control unit 37 executes a program (computer software) stored in a ROM or the like, thereby functionally forming the acquisition unit 30, the specifying unit 31, the calculation unit 32, the prediction unit 33, and the estimation unit 34. As described above, the shipping amount prediction system 20 may be configured as a computer in which each unit functions by executing a program, may be a dedicated device in which each unit is configured with a dedicated circuit, or may be configured on a server so as to run on the cloud.
The shipping amount prediction system 20 is constructed for the purpose of predicting the shipment amount of crops to be shipped to the market from data on the fields of an entire region, in particular from image data of each field. Each part of the shipping amount prediction system 20 will be explained below.
The input unit 24 comprises various input devices, such as a mouse and a keyboard, for inputting various kinds of information according to instructions from an operator.
The display unit 26 displays, for example, the predicted shipment amount of crops to be shipped to the market from the fields of the entire region, and any of various known displays may be used. The display unit 26 also includes devices, such as a printer, for outputting various kinds of information to an output medium.
In the acquisition unit 30, an image of a target area where crops are planted, taken from above, is input from outside the shipping amount prediction system 20, for example via the input unit 24 in the form of image data, and is held there. The image data held in the acquisition unit 30 is read out by the specifying unit 31. The image data held in the acquisition unit 30 may also be stored in the memory 36.
It is preferable that the image of the target area where crops are planted be taken from above, for example, once a day, and more preferably, once a day, at the same time. Further, it is preferable that the image includes shooting date information and location information.
Level correction, atmospheric correction, geometric correction, etc. may be performed on the image data of an image taken of the target area from above. Level correction is correction that removes variations in sensitivity or noise of the sensor that acquired the image data. In addition, atmospheric correction is correction that removes noise caused by water vapor or the like when image data is acquired. In addition, geometric correction is correction that removes geometric distortion and the like included in an image.
The image data of the image taken from above of the target area where crops are planted is not particularly limited as long as it is computer-readable and computer-analyzable data. The acquisition unit 30 may also convert the image data of the image taken from above of the target area where crops are planted into a data format that can be processed by the specifying unit 31.
As will be described later, information on specific wavelengths in the image data is used, so the image data of the image of the target area taken from above preferably includes information on those specific wavelengths. The specific wavelengths will be explained later.
Here, the target areas where crops are planted are, for example, the three cultivation areas 10a, 10b, and 10c of the production area 10 shown in FIG. 1.
The image of the target area taken from above is not particularly limited, and can be taken and obtained using an aircraft such as an airplane or helicopter, an artificial satellite, or a drone. In this case, the photographed image preferably includes position information based on the photographing position. This makes it easy to identify the fields of the target area in the photographed image by comparing the position information of the photographed image with the position information of the target area.
The specifying unit 31 specifies the harvest time of crops in the target area based on an image (image data) taken from above of the target area where the crops are planted.
The harvest time of crops in the target area is the point at which the crops are no longer in the field, for example because they have been cut and removed from the field, and is specified as a harvest date and time. In a field, the appearance of the soil from above changes depending on the presence or absence of crops and on the growth state of the crops, such as at planting or at harvest, so even for the same field the image (image data) taken from above differs depending on the growth state of the crops.
The specifying unit 31 uses the above-mentioned differences in images (image data) to specify the harvest time.
The specifying unit 31 calculates the color distance between the current image taken from above of the target area where crops are planted and a past image taken from above of the same target area, and specifies the harvest time of the crops in the target area based on that color distance.
The past image is an image of the same area (field) as the current image, and is used to determine the growth status of crops in the current image or whether the crops in the current image have already been harvested.
As a procedure for specifying the harvest time using the current image and a past image, for example, when the image data is expressed in 8 bits (256 gradations), the time at which the color distance has changed by 10 or more is taken as the harvest time. For this reason, in order to specify the harvest time, it is preferable to take an image of the target area where crops are planted from above once a day.
For the color distance, at least one wavelength of the image data may be used, but it is preferable to use a plurality of wavelengths.
The specifying unit 31 causes the memory 36 to store the information on the time of harvest of the crops described above and the information on the color distance.
The calculation unit 32 calculates the growth rate of crops based on the harvest time.
The growth rate is, for example, the number of days from planting to harvesting. For this reason, it is necessary to specify the planting date (time of planting).
Planting is when crops are planted in fields where there are no crops. The planting time can be identified by color distance, similar to the above-mentioned harvesting time.
For planting as well, when the image data is expressed in 8 bits (256 gradations), the date and time at which an image whose color distance has changed by 10 or more was taken is regarded as the planting time. For this reason, in order to specify the planting time, it is also preferable to take an image of the target area where crops are planted from above once a day.
When specifying the planting time, an image of the field before planting (with the soil exposed) is taken in advance. The color distance between this pre-planting image of the field and a newly photographed image of the same field is then calculated, and when this color distance is 10 or more, the date and time at which the new image of the field was taken is defined as the planting time.
Note that the growth rate will be explained in detail later.
The prediction unit 33 predicts the yield of crops in the target area at the time of harvest based on the currently calculated growth rate and past growth rates.
For prediction purposes, the correlation between crop yields and growth speeds in the target area at past harvest times is obtained in advance. As information based on the correlation, for example, the yield amount when the number of days from planting to harvest date is 50, 70, and 100 days is obtained in advance for each type of crop or for each field. The number of days from the planting date to the harvesting date represents the growth rate of the crop (more specifically, the past growth rate).
The correlations obtained in advance for each type of crop or for each field are stored, for example, in the memory 36. The prediction unit 33 reads the correlation from the memory 36 and predicts the yield of crops in the target area at the time of harvest. Specifically, based on the correlation, the prediction unit 33 predicts the harvest amount corresponding to the growth rate calculated this time by the calculation unit 32.
The prediction unit 33 predicts the harvest amount at the time of harvest in each of the plurality of target areas. Specifically, the yields of three cultivation areas 10a, 10b, and 10c of the production area 10 shown in FIG. 1 are predicted. That is, the prediction unit 33 predicts the yields of the fields 11a, 11b, and 11c shown in FIG. 1, respectively.
The method of determining the correlation is not particularly limited, and various known methods can be used; for example, Pearson's correlation analysis or covariance may be used.
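A minimal Python sketch of how such a pre-acquired relationship could be applied is shown below. The reference points (yields at 50, 70, and 100 days from planting to harvest) and the use of linear interpolation are illustrative assumptions; the description above only requires that a correlation obtained in advance (for example, by Pearson's correlation analysis or covariance) be stored and then looked up for the currently calculated growth rate.

```python
import numpy as np

# Pre-acquired reference data for one crop (or one field): the yield observed
# when the number of days from planting to harvest was 50, 70, and 100 days.
# The numbers below are placeholders, not values from the patent.
days_to_harvest = np.array([50.0, 70.0, 100.0])
yield_per_field = np.array([4.2, 3.6, 2.9])

def predict_yield(current_days_to_harvest):
    """Predict the yield for the currently estimated days-to-harvest by
    interpolating the pre-acquired relationship. Linear interpolation is
    only one simple way to use that relationship; the text itself only
    states that a correlation obtained in advance is applied."""
    return float(np.interp(current_days_to_harvest, days_to_harvest, yield_per_field))

print(predict_yield(60.0))  # e.g. a field estimated to need 60 days -> 3.9 (placeholder units)
```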
The estimating unit 34 estimates the amount of crops shipped from the plurality of target regions to the market based on the predicted harvest amount for each of the plurality of target regions.
Specifically, using the yields of the three cultivation areas 10a, 10b, and 10c of the production area 10 shown in FIG. 1 predicted by the prediction unit 33 described above, the amount shipped from the production area 10 to the market is estimated. Crops harvested in each cultivation area of the production area 10 (more specifically, in each field of each cultivation area) are shipped to the market.
As described above, the harvest amount is predicted for each field in each cultivation area with its harvest date specified. Therefore, if the period from the harvest date until shipment to the market is known, it is possible to know on what date and in what amount the crops harvested from each field will be shipped to the market. Using the harvest amounts with specified harvest dates predicted by the prediction unit 33 for the fields 11a, 11b, and 11c of the three cultivation areas 10a, 10b, and 10c, the estimation unit 34 can estimate the amount of crops shipped from the production area 10 to the market on a specific date.
The amount of crops shipped to the market estimated by the estimation unit 34 is stored in the memory 36, for example.
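The aggregation performed by the estimation unit 34 can be sketched as follows. The field data, predicted yields, and per-area lead times below are placeholders; the description above only assumes that the period from harvest to market shipment is known for each field.

```python
from collections import defaultdict
from datetime import date, timedelta

# Predicted harvests per field: (cultivation area, harvest date, predicted yield).
# Values are illustrative only.
predicted_harvests = [
    ("10a", date(2022, 8, 1), 3.5),
    ("10b", date(2022, 8, 1), 4.0),
    ("10c", date(2022, 8, 3), 2.5),
]

# Days from harvest until arrival at the market, per cultivation area.
# These lead times are assumptions; the text only states that this period must be known.
lead_time_days = {"10a": 1, "10b": 2, "10c": 1}

def estimate_market_shipments(harvests, lead_times):
    """Sum the predicted yields of all fields that arrive at the market on the
    same date, giving the estimated shipment amount per date."""
    shipments = defaultdict(float)
    for area, harvest_date, amount in harvests:
        arrival = harvest_date + timedelta(days=lead_times[area])
        shipments[arrival] += amount
    return dict(sorted(shipments.items()))

print(estimate_market_shipments(predicted_harvests, lead_time_days))
```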
The display control unit 35 causes the display unit 26 to display the image data acquired by the acquisition unit 30, the information on the harvest time specified by the specifying unit 31, the growth rate calculated by the calculation unit 32, the harvest amount of crops in the target area predicted by the prediction unit 33, and the shipment amount to the market estimated by the estimation unit 34.
When displaying on the display unit 26, the display control unit 35 may read out the various kinds of information from the memory 36 and display them. The display control unit 35 can also cause the display unit 26 to display various kinds of information input via the input unit 24. The information displayed on the display unit 26 may also be transmitted using a transmission unit (not shown).
(Shipping amount prediction method)
FIG. 3 is a flowchart showing an example of a shipping amount prediction method according to an embodiment of the present invention.
In the shipment amount prediction method, first, the harvest time of crops in the target area is specified based on an image taken from above of the target area where the crops are planted (step S10).
For example, images are obtained by photographing the three cultivation areas 10a, 10b, and 10c of the production area 10 shown in FIG. 1 from above using an aircraft, an artificial satellite, or a drone. Note that the above-mentioned image acquisition frequency is, for example, once a day.
The image data of the obtained image is output to the acquisition unit 30 of the processing unit 22 of the shipping amount prediction system 20, for example.
An image taken from above includes, for example, RGB, NIR, and UV as wavelength components.
RGB are the three primary colors used in regular cameras. There are various choices for the wavelengths of RGB, but typically R has a wavelength of 600 to 700 nm, G has a wavelength of 500 to 600 nm, and B has a wavelength of 400 to 500 nm.
NIR is near-infrared light, with a wavelength of 700 to 800 nm.
UV is ultraviolet light, with a wavelength of 200 to 400 nm.
In addition to normal RGB images, images taken from above include images taken by a multispectral camera, images taken by an infrared camera or an ultraviolet camera, and images taken by reflection of radio waves, and the images are not particularly limited. It is preferable to use images with red (R) wavelengths or near-infrared wavelengths (NIR) because plants can be easily detected. For this reason, it is preferable for an image taken from the sky to have only red (R) or only red (R) and near-infrared wavelengths (NIR) as wavelength components.
For this reason, the camera (imaging device) that captures images taken from above is not particularly limited as long as it has sensitivity to the above-mentioned wavelengths, but it is preferably one that can capture a wide range at once. For this reason, it is preferable to use a camera mounted on the above-mentioned artificial satellite. Note that the number of cameras (imaging devices) is not particularly limited, and may be one or more.
As described above, the harvest time of crops in the target area is determined based on the image taken from above (step S10).
Here, FIGS. 4(a) to 4(d) are schematic diagrams for explaining an example of a method for specifying the planting time and harvesting time in the shipping amount prediction method according to the embodiment of the present invention.
FIG. 4(a) shows a field 40 before crops are planted, FIG. 4(b) shows a field 41 in which crops 13 have been planted, FIG. 4(c) shows a field 42 at the harvest period of the crops 13, and FIG. 4(d) shows a field 43 after the crops have been harvested.
As shown in FIGS. 4(a) to 4(d), the fields 40 to 43 differ in the degree of soil exposure and the like before planting, in the state where the crops have been planted, at the harvest period, and after harvest. Utilizing this, for example, the color distance is calculated from the images taken from above, and based on the color distance, the pre-planting state of the target area, the state in which no crops remain after harvest, the state in which crops have been planted, and the harvest period of the crops are identified.
FIGS. 5(a) and 5(b) are schematic diagrams for explaining an example of a method for specifying the harvest time of crops in the shipping amount prediction method according to the embodiment of the present invention. FIGS. 5(a) and 5(b) show the cultivation area 10b shown in FIG. 1. As described above, the cultivation area 10b has 12 fields 11b.
Position information has been acquired in advance for each of the 12 fields 11b in the cultivation area 10b, and the images (image data) of the fields 11b described below include position information. This makes it possible to identify which of the 12 fields 11b in the cultivation area 10b appears in each photographed image.
In the cultivation area 10b, an image (image data) of the field 11b shown in FIG. 5(a) is obtained before planting. When the image (image data) of the field 11b shown in FIG. 5(a) is obtained continuously every day, there is no change in the image (image data) before planting.
On the other hand, for example, if crops 13 are planted in some fields 12a of the fields 11b shown in FIG. 5(a) as shown in FIG. 5(b), an image of the fields 12a where the crops 13 are planted (image data) changes. Changes in this image (image data) are detected using, for example, color distance.
For the change in the image (image data), for example, the color distance between the current image of the target area taken from above and a past image of the target area taken from above is used. Here, the past image may be any image taken one or more days before the image to be compared (that is, the current image); since planting the crops 13 does not take much time, an image from the previous day is sufficient as the past image. Past images are stored, for example, in the memory 36.
For example, color data of a specific area of an image (image data) is used for the color distance. The specific area will be explained later, but the specific area is, for example, the area in the middle of the photographed field.
Further, as the past image, an image obtained by averaging images from a normal year may be used. Further, as the past image, normal color data including color data before planting in a normal year and color data immediately before harvest in a normal year, which will be described later, may be used.
Let D1 be the image data of the image of the field 12a in FIG. 5(a), and let D2 be the image data of the image of the field 12a in which the crops 13 have been planted in FIG. 5(b). Note that the image data D1 and the image data D2 are both expressed, for example, in 8 bits (256 gradations), with their values on a common scale from gradation 0 to gradation 255.
When the planting date of the field 12a in FIGS. 5(a) and 5(b) is specified using the wavelength component R, it is determined that planting has taken place when the color distance between the R wavelength component R1 of the image data D1 and the R wavelength component R2 of the image data D2 is 10 or more on the 256-gradation scale. In this way, the planting date can be specified based on the color distance calculated from the image data of each date and time. The specified planting date is stored in the memory 36. When the color distance is denoted Dc, Dc is expressed by the following formula.
The color distance may also be based on a combination of the wavelength component R and NIR, instead of the wavelength component R alone. In this case, the color distance Dc is expressed by the following formula.
The color distance may also be based on a combination of the wavelength components R, G, and B. In this case, the color distance Dc is expressed by the following formula.
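The three formulas for Dc referred to above appear as images in the original publication and are not reproduced in this text. As one plausible reading, a Euclidean distance over the selected wavelength components could be computed as in the following Python sketch; the component combinations (R alone, R and NIR, and R, G, and B) follow the description above, but the exact form of Dc is an assumption.

```python
import math

def color_distance(d1, d2, components=("R",)):
    """Color distance Dc between two field color-data dicts on the 0-255 scale.

    d1, d2: dicts mapping a wavelength component name ("R", "G", "B", "NIR")
    to its 0-255 value for the specific area of the field.
    The Euclidean form below is an assumption; the publication's own formulas
    are given as images and are not reproduced here.
    """
    diffs = [float(d1[c]) - float(d2[c]) for c in components]
    return math.sqrt(sum(d * d for d in diffs))

# R only, R + NIR, and R + G + B variants described in the text
past = {"R": 120, "G": 95, "B": 80, "NIR": 60}     # e.g. pre-planting soil
today = {"R": 104, "G": 110, "B": 78, "NIR": 92}   # e.g. after planting
print(color_distance(past, today, ("R",)))
print(color_distance(past, today, ("R", "NIR")))
print(color_distance(past, today, ("R", "G", "B")))
```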
The procedure for specifying the planting date has been described above; the harvest time is specified in the same way. The color distance between the current image taken from above of the target area where crops are planted and a past image taken from above of the same target area is calculated, and the harvest time of the crops in the target area is specified based on the color distance.
Specifically, for example, the image data Ds of the field 41 shown in FIG. 4(b) and the image data Dh of the field 42 shown in FIG. 4(c) are compared to specify the harvest time. In this case, the color distance between the image data Ds and the image data Dh is determined as described above, and the harvest time is specified from it. As with the planting date, the color distance may use R alone, the combination of R and NIR, or the combination of R, G, and B. When the color distance is 10 or more on the 256-gradation scale, it is determined that the crops have already been harvested in the target area shown in the current image, and the harvest time is then specified from the date and time at which the current image was taken. The specified harvest time is stored in the memory 36, for example, as a harvest date.
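Building on the same assumed Euclidean distance as the previous sketch, the daily comparison that yields a planting date or harvest date could look like the following. The threshold of 10 on the 0-255 scale and the once-a-day imaging follow the description above; the data layout and helper names are illustrative.

```python
import math
from datetime import date

THRESHOLD = 10  # a change of 10 or more on the 0-255 scale, as described above

def color_distance(d1, d2, components):
    return math.sqrt(sum((float(d1[c]) - float(d2[c])) ** 2 for c in components))

def detect_change_date(daily_color_data, components=("R",)):
    """daily_color_data: list of (date, color_dict) for one field, one entry per day.

    Compares each day's image with the previous day's image of the same field and
    returns the first date whose color distance reaches THRESHOLD, i.e. the
    estimated planting or harvest date. Returns None if no such day is found.
    """
    for (_, c_prev), (d_cur, c_cur) in zip(daily_color_data, daily_color_data[1:]):
        if color_distance(c_prev, c_cur, components) >= THRESHOLD:
            return d_cur
    return None

series = [
    (date(2022, 7, 1), {"R": 95}),
    (date(2022, 7, 2), {"R": 96}),
    (date(2022, 7, 3), {"R": 140}),  # crop removed, soil exposed again
]
print(detect_change_date(series))  # -> 2022-07-03
```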
Next, the growth rate of the crops is calculated based on the harvest time (step S12). In step S12, for example, the method shown in FIG. 6 and described below is used. FIG. 6 shows an example of step S12, which is executed, for example, by the calculation unit 32 of FIG. 2.
<How to calculate crop growth rate>
FIG. 6 is a flowchart showing an example of the process of calculating the growth rate of the shipping amount prediction method according to the embodiment of the present invention.
In calculating the growth rate of crops in the target area (step S12), first, a specific field is photographed from above on a specific day to obtain an image, and image data is obtained (step S20). The specific date in step S20 is the photographing date of the image. As described above, it is preferable that the image includes shooting date information and location information.
Next, color data of a specific area of the image is acquired (step S22).
The specific area in step S22 is the central area of the field in the photographed image. From the image data of this central area, the color data of the wavelength component R, of the wavelength components R and NIR, or of the wavelength components R, G, and B described above is used. For the wavelength components R and NIR, the average of the R and NIR values is used as the color data; for the wavelength components R, G, and B, the average of the R, G, and B values is used as the color data.
The center of the field is the geometric center determined from the outline of the field in the image (image data). The outline of the field can be identified by processing the image data of the photographed image and extracting the outline of the field.
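A sketch of locating the center of a field in the image is shown below. The text above describes extracting the field outline by image processing and taking its geometric center; using the centroid of all field pixels in a binary field mask, as done here, is a simplifying assumption made for illustration.

```python
import numpy as np

def field_center_from_mask(field_mask):
    """Geometric center of a field, as (row, column) indices in the image.

    field_mask: 2-D array in which pixels belonging to the field are nonzero
    (e.g. obtained by segmenting the aerial image). Taking the centroid of the
    field pixels is a simplification of the outline-based center described above.
    """
    coords = np.argwhere(field_mask > 0)       # (row, col) of every field pixel
    if coords.size == 0:
        raise ValueError("empty field mask")
    center_row, center_col = coords.mean(axis=0)
    return int(round(center_row)), int(round(center_col))

# Toy example: a rectangular "field" inside a 100 x 100 image
mask = np.zeros((100, 100), dtype=np.uint8)
mask[20:80, 30:90] = 1
print(field_center_from_mask(mask))  # roughly the middle of the rectangle
```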
Next, normal year color data is acquired from the database (step S24).
The normal color data is the average value of the color data of the middle area (specific area) of the field in the photographed image.
The normal color data includes color data before planting in a normal year and color data immediately before harvest in a normal year.
The color data before planting in a normal year is the color data on the day immediately before planting obtained from past data, and the wavelength components of the color data are as described above.
The color data immediately before harvest in a normal year is the color data on the day immediately before harvest obtained from past data, and the wavelength components of the color data are as described above.
The past data is, for example, image data of photographed fields for the past 10 years. If the number of years that have passed since the field was created is less than 10 years, this is the period from the time the field was created until the image is taken for predicting the amount of shipment.
Note that the normal color data is stored, for example, in the memory 36 shown in FIG. 2 or on the cloud.
Next, the amount of crop growth is calculated (step S26).
In step S26, it is calculated where the color of the field in the photographed image is positioned between the planting date and the day immediately before harvest, and the amount of growth is calculated based on the color of the field in the image.
When the growth amount of the crops based on the color of the field in the image is denoted α1, the growth amount α1 based on the color of the field in the image is expressed by the following formula when the RGB wavelength components of the image data are used.
α1 = ((Cr-Fr)/(Hr-Fr) + (Cg-Fg)/(Hg-Fg) + (Cb-Fb)/(Hb-Fb))/3
In the above formula, Cr, Cg, and Cb are the values of the wavelength components RGB of the color data on the specific day obtained in step S20.
Fr, Fg, and Fb are the values of the wavelength components RGB of the planting day color data for a normal year. The normal planting date color data is the color data of the planting date obtained from past data.
Moreover, the planting date in a normal year is the day when planting was performed in a normal year, and is the average day of planting dates determined from past data (that is, the planting date in a normal year).
Hr, Hg, and Hb are the values of wavelength components RGB of color data on the day immediately before harvest in a normal year. The color data on the day immediately before harvest in a normal year is the color data on the day immediately before harvest obtained from past data.
Further, the day immediately before harvest in a normal year is the day before the harvest date in a normal year, and is the average day immediately before harvest determined from past data (that is, the day before harvest in a normal year).
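The formula for α1 can be written directly in code. The minimal Python sketch below implements the expression above as given; the numerical color values are placeholders.

```python
def growth_amount_from_color(c, f, h):
    """Growth amount alpha_1 based on the field color in the image.

    c: (Cr, Cg, Cb)  color data of the specific day (current image)
    f: (Fr, Fg, Fb)  normal-year color data on the planting date
    h: (Hr, Hg, Hb)  normal-year color data on the day immediately before harvest
    Implements alpha_1 = ((Cr-Fr)/(Hr-Fr) + (Cg-Fg)/(Hg-Fg) + (Cb-Fb)/(Hb-Fb)) / 3
    exactly as written above. The example values below are placeholders.
    """
    return sum((ci - fi) / (hi - fi) for ci, fi, hi in zip(c, f, h)) / 3.0

print(growth_amount_from_color(c=(80, 130, 70), f=(120, 100, 90), h=(60, 150, 60)))
```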
In step S26, the amount of growth is calculated from the number of days of growth.
When the growth amount based on the number of days of growth is denoted α2, the growth amount α2 based on the number of days of growth is expressed by the following formula using the planting date and the day immediately before harvest.
α2 = (D-Df)/(Dh-Df)
In the above formula, D is the specific date on which the image was taken. Df is the planting date in a normal year, and is the average planting date obtained from past data. Dh is the day immediately before harvest in a normal year, and is the average day immediately before harvest determined from past data.
Next, the growth rate of the crop is calculated (step S28).
When the growth rate of the crops is denoted S, the growth rate S is expressed as S = α1/α2, using α1 and α2 obtained in step S26 above.
When α1 > α2, the period from the planting date to harvest is short, and the growth rate S is fast.
In addition, in order to calculate the growth rate of a specific area (specifically, a specific cultivation area), the growth rate is calculated at multiple locations in the area, and the average value thereof is calculated. For example, in the three cultivation areas 10a, 10b, and 10c shown in FIG. 1, the growth rate is calculated for each field 11a, 11b, and 11c. Then, the average value of the growth speeds of the plurality of fields 11a, 11b, and 11c is calculated for each of the three cultivation areas 10a, 10b, and 10c.
Furthermore, in order to calculate the growth rate of a specific vegetable, the growth rate is calculated over the entire production area where it is shipped during that period, and the average value thereof is calculated. For example, in a situation where the same vegetable is grown in the three cultivation regions 10a, 10b, and 10c shown in FIG. 1, the growth speed of each is calculated, and the average value thereof is calculated as the growth speed of the specific vegetable.
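A sketch combining α2, the growth rate S = α1/α2, and the per-area averaging described above is shown below. The dates and α1 values are placeholders; the formulas follow the description above.

```python
from datetime import date
from statistics import mean

def growth_amount_from_days(d, df, dh):
    """alpha_2 = (D - Df) / (Dh - Df), using the formula above."""
    return (d - df).days / (dh - df).days

def growth_speed(alpha_1, alpha_2):
    """S = alpha_1 / alpha_2; S > 1 means the crop is ahead of the normal year."""
    return alpha_1 / alpha_2

# One entry per field in a cultivation area; the alpha_1 values and dates below
# are placeholders used only to show how the per-area average is taken.
fields = [
    {"alpha_1": 0.64, "shot": date(2022, 8, 1)},
    {"alpha_1": 0.58, "shot": date(2022, 8, 1)},
    {"alpha_1": 0.71, "shot": date(2022, 8, 1)},
]
df_normal = date(2022, 7, 1)   # normal-year planting date
dh_normal = date(2022, 8, 20)  # normal-year day immediately before harvest

speeds = [
    growth_speed(f["alpha_1"], growth_amount_from_days(f["shot"], df_normal, dh_normal))
    for f in fields
]
print(mean(speeds))  # growth rate of this cultivation area
```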
Next, as shown in FIG. 3, the yield of crops in the target area at the time of harvest is predicted based on the growth rate calculated this time and the past growth rate (step S14).
In step S14, the yield of crops in the target area is predicted, for example, using the correlation for each type of crop acquired in advance as described above. In this case, for example, the yields of the three cultivation areas 10a, 10b, and 10c of the production area 10 shown in FIG. 1 are predicted. That is, the yields of the fields 11a, 11b, and 11c shown in FIG. 1 are predicted.
Next, the shipping amount of a plurality of cultivation areas (that is, a plurality of target areas) is estimated (step S16).
In step S16, the yields of the three cultivation areas 10a, 10b, and 10c of the production area 10 shown in FIG. 1, predicted in step S14 described above, are used. Furthermore, if the period from the harvest date until shipment to the market is known, it is possible to know on what date and in what amount the crops harvested from each field will be shipped to the market. In step S16, using the harvest amounts with specified harvest dates predicted in step S14 for the fields 11a, 11b, and 11c of the three cultivation areas 10a, 10b, and 10c, the amount of crops shipped from the production area 10 to the market on a specific date is estimated.
In this way, based on the predicted harvest amount for each of the plurality of cultivation areas (plurality of target areas), the amount shipped to the market where crops are shipped from the plurality of cultivation areas is estimated. By estimating the amount shipped to the market, it is possible to adjust the amount shipped of the crops being produced to avoid oversupply or undersupply to the market. This makes it possible to suppress sudden rises and price collapses in crop prices.
<Application example of the present invention>
By applying the present invention, for example, in the case of lettuce, the amount shipped to the Tokyo market can be estimated if there is information on the amount shipped by prefecture shown below. Approximately 80% of shipments come from the following prefectures. Therefore, the target areas are preferably Nagano, Ibaraki, Shizuoka, Hyogo, Nagasaki, Gunma, Tochigi, Kagawa, Chiba, and Fukuoka.
For other markets as well, the shipment amount can be predicted by investigating the cultivation areas (shipping sources) that account for more than half of the usual annual shipment amount, so it is preferable to select, as the target areas, cultivation areas accounting for more than half of the shipment amount.
The cultivation area to be investigated is preferably a production area that accounts for 60% or more of the annual shipment amount, and more preferably a cultivation area that accounts for 70% or more of the annual shipment amount.
Since the cultivation area of crops changes depending on the season, the cultivation area to be investigated may be changed for each season. In this case, cultivation areas covering the above-mentioned ratios are selected as target areas for each month.
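The selection of target areas described above can be sketched as follows. The area names and shipment shares are placeholders; only the coverage thresholds (more than half, preferably 60% or 70%) come from the text.

```python
def select_target_areas(shipment_share, coverage=0.5):
    """Pick cultivation areas, largest share first, until their cumulative share
    of the usual annual shipment amount reaches the required coverage (0.5 here;
    the text recommends 0.6 or 0.7 where possible)."""
    total, selected = 0.0, []
    for area, share in sorted(shipment_share.items(), key=lambda kv: kv[1], reverse=True):
        selected.append(area)
        total += share
        if total >= coverage:
            break
    return selected

# Placeholder shares of the annual shipment amount for one crop and one month
shares = {"A": 0.30, "B": 0.22, "C": 0.15, "D": 0.10, "E": 0.08}
print(select_target_areas(shares, coverage=0.6))  # -> ['A', 'B', 'C']
```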
The present invention is basically configured as described above. Although the shipping amount prediction method and shipping amount prediction system of the present invention have been described in detail, the present invention is not limited to the above-described embodiments, and it goes without saying that various improvements or changes may be made without departing from the gist of the present invention.
10 production area
10a, 10b, 10c cultivation area
11a, 11b, 11c, 12a field
13 crop
20 shipping amount prediction system
22 processing unit
24 input unit
26 display unit
30 acquisition unit
31 specifying unit
32 calculation unit
33 prediction unit
34 estimation unit
35 display control unit
36 memory
37 control unit
40, 41, 42, 43 field

Claims (8)

  1.  A shipping amount prediction method comprising the steps of:
     specifying, based on an image taken from above of a target area where crops are planted, a harvest time of the crops in the target area;
     calculating a growth rate of the crops based on the harvest time;
     predicting, based on the growth rate calculated this time and a past growth rate, a harvest amount of the crops in the target area at the harvest time; and
     estimating, from the harvest amounts predicted for each of a plurality of target areas, a shipment amount to a market to which crops are shipped from the plurality of target areas.
  2.  The shipping amount prediction method according to claim 1, wherein, in the step of specifying the harvest time of the crops in the target area, a color distance between a current image of the target area where the crops are planted taken from above and a past image of the target area where the crops are planted taken from above is calculated, and the harvest time of the crops in the target area is specified based on the color distance.
  3.  The shipping amount prediction method according to claim 1 or 2, wherein a correlation between the harvest amount of the crops in the target area at a past harvest time and the growth rate is obtained in advance, and
     in the step of predicting the harvest amount of the crops in the target area at the harvest time, the harvest amount in the target area at the harvest time is predicted using the correlation.
  4.  The shipping amount prediction method according to claim 1 or 2, wherein the crops are leafy vegetables.
  5.  A shipping amount prediction system comprising:
     a specifying unit that specifies, based on an image taken from above of a target area where crops are planted, a harvest time of the crops in the target area;
     a calculation unit that calculates a growth rate of the crops based on the harvest time;
     a prediction unit that predicts, based on the growth rate calculated this time and a past growth rate, a harvest amount of the crops in the target area at the harvest time; and
     an estimation unit that estimates, from the harvest amounts predicted for each of a plurality of target areas, a shipment amount to a market to which crops are shipped from the plurality of target areas.
  6.  The shipping amount prediction system according to claim 5, wherein the specifying unit calculates a color distance between a current image of the target area where the crops are planted taken from above and a past image of the target area where the crops are planted taken from above, and specifies the harvest time of the crops in the target area based on the color distance.
  7.  The shipping amount prediction system according to claim 5 or 6, wherein a correlation between the harvest amount of the crops in the target area at a past harvest time and the growth rate is obtained in advance, and
     the prediction unit predicts the harvest amount of the crops in the target area at the harvest time using the correlation.
  8.  The shipping amount prediction system according to claim 5 or 6, wherein the crops are leafy vegetables.
PCT/JP2022/028196 2022-07-20 2022-07-20 Shipping quantity prediction method and shipping quantity prediction system WO2024018559A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/028196 WO2024018559A1 (en) 2022-07-20 2022-07-20 Shipping quantity prediction method and shipping quantity prediction system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/028196 WO2024018559A1 (en) 2022-07-20 2022-07-20 Shipping quantity prediction method and shipping quantity prediction system

Publications (1)

Publication Number Publication Date
WO2024018559A1 true WO2024018559A1 (en) 2024-01-25

Family

ID=89617506

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/028196 WO2024018559A1 (en) 2022-07-20 2022-07-20 Shipping quantity prediction method and shipping quantity prediction system

Country Status (1)

Country Link
WO (1) WO2024018559A1 (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003006612A (en) * 2001-06-20 2003-01-10 Ntt Data Corp Device and method for predicting harvest
JP2015000049A (en) * 2013-06-18 2015-01-05 株式会社日立製作所 Harvest-predicting system and harvest-predicting apparatus
US20180293671A1 (en) * 2015-03-27 2018-10-11 Omniearth, Inc. System and method for predicting crop yield

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SHENDRYK YURI, PAN LECHENG, CRAIGIE MATTHEW, STASOLLA MATTIA, TICEHURST CATHERINE, THORBURN PETER: "A Satellite-Based Methodology for Harvest Date Detection and Yield Prediction in Sugarcane", IGARSS 2020, 2020 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, IEEE, 26 September 2020 (2020-09-26) - 2 October 2020 (2020-10-02), pages 5167 - 5170, XP093130688, ISBN: 978-1-7281-6374-1, DOI: 10.1109/IGARSS39084.2020.9323418 *

Similar Documents

Publication Publication Date Title
Ballesteros et al. Onion biomass monitoring using UAV-based RGB imaging
Tu et al. Optimising drone flight planning for measuring horticultural tree crop structure
Wang et al. Estimating leaf area index and aboveground biomass of grazing pastures using Sentinel-1, Sentinel-2 and Landsat images
US11631166B2 (en) Crop yield prediction method and system based on low-altitude remote sensing information from unmanned aerial vehicle
Simic Milas et al. The importance of leaf area index in mapping chlorophyll content of corn under different agricultural treatments using UAV images
Dobrowski et al. Grapevine dormant pruning weight prediction using remotely sensed data
Herrmann et al. Ground-level hyperspectral imagery for detecting weeds in wheat fields
JP2007310463A (en) Farm field management support method and system
Jiang et al. Multi-sensor and multi-platform consistency and interoperability between UAV, Planet CubeSat, Sentinel-2, and Landsat reflectance data
Muhd-Ekhzarizal et al. Estimation of aboveground biomass in mangrove forests using vegetation indices from SPOT-5 image
Kefauver et al. RGB picture vegetation indexes for high-throughput phenotyping platforms (HTPPs)
Huang et al. Use of principal components of UAV-acquired narrow-band multispectral imagery to map the diverse low stature vegetation fAPAR
AU2019309839A1 (en) Information processing device, information processing method, and program
Jewan et al. The feasibility of using a low-cost near-infrared, sensitive, consumer-grade digital camera mounted on a commercial UAV to assess Bambara groundnut yield
CN114080540A (en) Information processing apparatus, information processing method, program, and sensing system
CN114821349A (en) Forest biomass estimation method considering harmonic model coefficients and phenological parameters
Borra-Serrano et al. Towards an objective evaluation of persistency of Lolium perenne swards using UAV imagery
JP2022060277A (en) Crop related value derivation device and crop related value derivation method
Wang et al. Combining both spectral and textural indices for alleviating saturation problem in forest LAI estimation using Sentinel-2 data
WO2024018559A1 (en) Shipping quantity prediction method and shipping quantity prediction system
JP2019153109A (en) Agricultural management prediction system, agricultural management prediction method, and server device
JP2003339238A (en) Method and apparatus for diagnosing crop growth
JP7386503B2 (en) Yield estimation program, yield estimation method, and yield estimation device
Kautz et al. Early detection of bark beetle (Ips typographus) infestations by remote sensing–A critical review of recent research
Gan et al. Multivariate regressions coupling colorimetric and textural features derived from UAV-based RGB images can trace spatiotemporal variations of LAI well in a deciduous forest

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22951946

Country of ref document: EP

Kind code of ref document: A1