WO2023074466A1 - Machine learning device, estimation device, and program - Google Patents

Machine learning device, estimation device, and program

Info

Publication number
WO2023074466A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
predetermined plant
image
root
field
Prior art date
Application number
PCT/JP2022/038750
Other languages
French (fr)
Japanese (ja)
Inventor
克彦 近藤
昭彦 松村
Original Assignee
Tesnology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Tesnology Co., Ltd.
Publication of WO2023074466A1


Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01G HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
    • A01G 7/00 Botany in general
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q 50/02 Agriculture; Fishing; Mining

Definitions

  • The present invention relates to technology for estimating plant growth.
  • Patent Document 1 discloses a technique in which rhizosphere elements in soil are measured automatically and non-destructively, without manual work such as visual inspection, in order to quantitatively and objectively grasp the distribution and dynamics of fine roots of trees and similar plants.
  • However, the technique of Patent Document 1 predicts the growth state of a specific field, and when a wide range of fields is targeted, introducing the mechanism becomes very costly.
  • Patent Document 2 discloses a technique in which photographed images of a field, such as aerial images, are classified based on the photographing altitude, and 3D image data and the like of the field, the divisions of the field, and the plants in the field are reconstructed from the classified image group and converted into visualization data based on the field conditions or the imaging conditions.
  • The present invention was made in view of the above problems, and its main purpose is to provide a technique that can easily estimate the state of plant roots.
  • A machine learning device in one aspect of the present invention comprises: a storage unit that stores soil data of a field in which a given plant is cultivated, an image feature amount extracted from a photographed image of the field, and at least one of the root growth data of the given plant at the time the photographed image was taken and the carbon dioxide absorption amount of the roots of the given plant; and a model generation unit that generates, by machine learning, an estimation model for estimating at least one of the root growth data of the given plant and the carbon dioxide absorption amount of the roots of the given plant at the time of photographing, from the soil data and the image feature amount extracted from the photographed image.
  • An estimation device in one aspect of the present invention comprises: a storage unit that stores soil data of a field in which a predetermined plant is cultivated; an image acquisition unit that acquires a photographed image of the field; an extraction unit that extracts an image feature amount from the photographed image; and an estimation unit that uses an estimation model machine-learned with the soil data of a field in which the predetermined plant was cultivated in the past and the image feature amount extracted from a photographed image of that field in the past cultivation as input data, and with at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the photographed image was taken in the past cultivation as output data, to estimate at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the acquired photographed image was taken, from the stored soil data and the image feature amount extracted from the acquired photographed image.
  • A machine learning device in another aspect of the present invention comprises: a storage unit that stores soil data of a field for cultivating a predetermined plant that can be cultivated in a second period with the roots remaining after the first harvest, an image feature amount extracted from a first photographed image of the field in the first period, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the first photographed image was taken, an image feature amount extracted from a second photographed image of the field in the second period, and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the second photographed image was taken; and a model generation unit that generates, by machine learning, a first estimation model for estimating at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time of the first photographing from the soil data and the image feature amount of the first photographed image, and a second estimation model for estimating at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time of the second photographing from the soil data and the image feature amount of the second photographed image.
  • An estimation device in another aspect of the present invention comprises: a storage unit that stores soil data of a field for cultivating a predetermined plant that can be cultivated in a second period with the roots remaining after the first harvest; an image acquisition unit that acquires a first photographed image of the field in the first period or a second photographed image of the field in the second period; an extraction unit that extracts an image feature amount from the first photographed image or the second photographed image; and an estimation unit. When the first photographed image is acquired, the estimation unit uses a first estimation model machine-learned with the soil data of a field in which the predetermined plant was cultivated in the past and the image feature amount extracted from a photographed image of that field in the first period of the past cultivation as input data, and with at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the photographed image was taken in the first period of the past cultivation as output data, to estimate at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the first photographed image was taken, from the stored soil data and the image feature amount extracted from the first photographed image. When the second photographed image is acquired, the estimation unit uses a second estimation model machine-learned with the soil data of the field in which the predetermined plant was cultivated in the past and the image feature amount extracted from a photographed image of that field in the second period of the past cultivation as input data, and with at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the photographed image was taken in the second period of the past cultivation as output data, to estimate at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the second photographed image was taken, from the stored soil data and the image feature amount extracted from the second photographed image.
  • A machine learning device in yet another aspect of the present invention comprises: a storage unit that stores soil data of a field for cultivating a predetermined plant that can be cultivated in a second period with the roots remaining after the first harvest, an image feature amount extracted from a first photographed image of the field in the first period, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the first photographed image was taken, an image feature amount extracted from a second photographed image of the field in the second period, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the second photographed image was taken, and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period; and a model generation unit that generates, by machine learning, a first estimation model for estimating at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time of the first photographing from the soil data and the image feature amount of the first photographed image, and a second estimation model for estimating at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time of the second photographing from the soil data, the image feature amount of the second photographed image, and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period.
  • An estimation device in yet another aspect of the present invention comprises: a storage unit that stores soil data of a field for cultivating a predetermined plant that can be cultivated in a second period with the roots remaining after the first harvest; an image acquisition unit that acquires a first photographed image of the field at the end of the first period and further acquires a second photographed image of the field in the second period; an extraction unit that extracts an image feature amount from the first photographed image and further extracts an image feature amount from the second photographed image; and an estimation unit. The estimation unit uses a first estimation model machine-learned with the soil data of a field in which the predetermined plant was cultivated in the past and the image feature amount extracted from a photographed image of that field in the first period of the past cultivation as input data, and with at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the photographed image was taken as output data, to estimate at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period, from the stored soil data and the image feature amount extracted from the first photographed image. The estimation unit further uses a second estimation model machine-learned with the soil data of the field in which the predetermined plant was cultivated in the past, the image feature amount extracted from a photographed image of that field in the second period of the past cultivation, and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period of the past cultivation as input data, and with at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the photographed image was taken in the second period of the past cultivation as output data, to estimate at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the second photographed image was taken, from the stored soil data, the image feature amount extracted from the second photographed image, and at least one of the estimated root growth data of the predetermined plant and carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period.
  • According to the present invention, the state of plant roots can be easily estimated.
  • FIG. 1 is a diagram showing an overview of the test stage. FIG. 2 is a diagram showing the growth period of three-season cultivation. FIG. 3 is a diagram showing an overview of the operation stage.
  • FIG. 12 is a data structure diagram of teacher data for the second period in the embodiment.
  • FIG. 13 is a flow chart showing the process in the estimation model generation phase.
  • FIG. 14 is a diagram showing the process in the estimation model utilization phase.
  • FIG. 15 is a configuration diagram of the neural network in Modification 1.
  • FIG. 16 is a data structure diagram of teacher data for the second period in Modification 1.
  • FIG. 17 is a data structure diagram of teacher data for the third period in Modification 1.
  • FIG. 18 is a block diagram of a growth management system.
  • Biomass, which is carbon neutral, is expected to serve as a measure for reducing carbon dioxide.
  • Sorghum, one example of a biomass crop, can be used as a raw material for producing ethanol.
  • Sorghum is a plant of the grass family. Sorghum grown on one hectare is said to absorb 90 tons of carbon dioxide through its leaves and stems, and 60 tons through its roots. In particular, roots play a role in retaining carbon dioxide in the soil.
  • Sorghum is cultivated in three successive periods (three-season cultivation).
  • The grass of the second period is grown from the roots left after cutting the grass of the first period.
  • The grass of the third period is grown from the roots left after cutting the grass of the second period. In other words, the grass is grown three times while the roots remain in place.
  • FIG. 1 is a diagram showing an overview of the testing stage.
  • Sorghum is cultivated in field 220a, field 220b, and field 220c, whose soils have different characteristics.
  • The artificial satellite 302 photographs the field 220a, the field 220b, and the field 220c from above.
  • The captured satellite images are transmitted to the image providing server 300.
  • The growth management device 100 acquires the satellite images of the field 220a, the field 220b, and the field 220c from the image providing server 300.
  • The growth management device 100 is a device used for managing the growth of a given plant, for example, in the fields 220a, 220b, and 220c.
  • Soil data is data representing characteristics of soil, such as acidity, drainage, fertility, and viscosity.
  • Data representing the state of root growth (hereinafter referred to as "root growth data") is sent from the field terminal 200 to the growth management device 100.
  • Root growth data are, for example, root length, root weight, root volume, and the like.
  • the growth management device 100 performs machine learning using the data obtained in the test stage to generate an estimation model for estimating the root growth state.
  • FIG. 2 is a diagram showing the growth period of three-season cultivation.
  • The growth of a given plant, for example sorghum, will be described; more specifically, sorghum that can be cultivated in the second and third periods with the roots left after the first harvest.
  • The second period is likewise cultivated until the 10th week.
  • Satellite images are acquired every seven days, and root measurements are performed.
  • When the fruit has formed, the leaves and stems are cut. Some of the grass is left growing as is for the week-10 photographing and measurements.
  • In Modification 1, described later, a satellite image is acquired and the roots are measured also at the time of the second reaping.
  • The third period is also cultivated until the 10th week.
  • Satellite images are acquired every seven days, and root measurements are performed.
  • When the fruit has formed, the leaves and stems are cut. Some of the grass is left growing as is for the week-10 photographing and measurements. The roots remain in the ground.
  • FIG. 3 is a diagram showing an overview of the operation stage. In this example, it corresponds to the year 2021. Soil data is sent from the farm field terminal 200 to the growth management apparatus 100 also in the operation stage. Also, as described in connection with FIG. 2, satellite images are similarly acquired in each week of each period. However, in the operation stage, it is not necessary to measure the roots. This is because the growth management apparatus 100 can obtain the root growth data and the amount of carbon dioxide absorbed by the root using the estimation model.
  • The amount of carbon dioxide absorbed by the roots can be obtained as root weight × a predetermined coefficient, or as root volume × a predetermined coefficient. Since the amount of carbon dioxide absorbed by the roots is also correlated with root length, it may instead be obtained from a correspondence table or a formula expressing the correlation between root length and the carbon dioxide absorption of the roots.
  • The amount of carbon dioxide absorbed by the roots (per unit area) can be obtained by multiplying the amount of carbon dioxide absorbed by the roots (per plant) by the grass density (plants per unit area). A unit area is, for example, a hectare.
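The arithmetic above is simple enough to express directly. The following Python sketch illustrates it with hypothetical coefficient, weight, and density values; the publication does not give the actual coefficients or the root-length correspondence table.

```python
# Hypothetical coefficients -- the publication does not disclose the actual values.
WEIGHT_TO_CO2 = 0.5   # CO2 absorption per unit of root weight (assumed)


def co2_per_plant_from_weight(root_weight: float) -> float:
    """Root weight x predetermined coefficient -> CO2 absorption per plant."""
    return root_weight * WEIGHT_TO_CO2


def co2_per_unit_area(co2_per_plant: float, grass_density: float) -> float:
    """CO2 absorption (per plant) x grass density (plants per unit area,
    e.g. plants per hectare) -> CO2 absorption per unit area."""
    return co2_per_plant * grass_density


per_plant = co2_per_plant_from_weight(root_weight=2.0)          # assumed weight
print(co2_per_unit_area(per_plant, grass_density=150_000.0))    # assumed density
```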
  • FIG. 4 is a diagram showing the phases of information processing.
  • S10: test data collection phase
  • S12: estimation model generation phase
  • S14: estimation model utilization phase
  • the generated estimation model is used to obtain root growth data and root carbon dioxide absorption in the operation stage (FIG. 3).
  • the estimation model for the first period is called a "first estimation model”.
  • the estimation model for the second period is called a "second estimation model”.
  • the estimation model for the third period is called a "third estimation model”.
  • In this embodiment, a neural network is used for machine learning. The machine learning processing using the neural network may be a conventional one. Means other than a neural network, such as natural language processing represented by morphological analysis, may also be used for machine learning.
  • FIG. 5 is a functional block diagram of the growth management device 100.
  • In the test data collection phase (S10) and the estimation model generation phase (S12), the growth management apparatus 100 functions as a machine learning device that generates an estimation model by machine learning. Furthermore, in the estimation model utilization phase (S14), the growth management apparatus 100 functions as an estimation device that acquires satellite images and obtains the root growth data and the carbon dioxide absorption amount of the roots using the estimation model.
  • Each component of the growth management apparatus 100 is implemented by hardware, including computing units such as a CPU (Central Processing Unit) and various coprocessors, storage devices such as memory and storage, and wired or wireless communication lines connecting them, and by software that is stored in the storage device and supplies processing instructions to the computing units.
  • a computer program may consist of a device driver, an operating system, various application programs located in their higher layers, and a library that provides common functions to these programs.
  • Each illustrated block mainly indicates a functional unit block. Each block may be implemented by causing a computer to execute a program stored in a storage device.
  • the growth management device 100 includes a user interface processing unit 102, a data processing unit 104, a communication processing unit 108 and a data storage unit 106.
  • the user interface processing unit 102 receives user operations via a mouse or a touch panel, and is in charge of user interface processing such as image display and audio output.
  • a communication processing unit 108 is in charge of communication processing via a network, short-range wireless communication, or the like.
  • a data storage unit 106 stores various data.
  • the data processing unit 104 executes various processes based on data acquired by the user interface processing unit 102 and the communication processing unit 108 and data stored in the data storage unit 106 .
  • the data processing unit 104 also functions as an interface for the user interface processing unit 102 , the communication processing unit 108 and the data storage unit 106 .
  • the user interface processing unit 102 has an input unit 122 for inputting data by user operation and an output unit 120 for outputting data to be provided to the user.
  • the communication processing unit 108 includes a transmitting unit 180 that transmits various data and a receiving unit 182 that receives various data.
  • the data processing unit 104 includes a soil data acquisition unit 140 , an image acquisition unit 142 , a feature amount extraction unit 144 , a root data acquisition unit 146 , a teacher data generation unit 148 , an estimation model generation unit 150 and a root growth estimation unit 154 .
  • the soil data acquisition unit 140 acquires soil data from the field terminal 200 using the transmission unit 180, for example.
  • the soil data acquisition unit 140 may acquire soil data received by the input unit 122 .
  • the image acquisition unit 142 acquires a photographed image of the field 220 (a satellite image in this example).
  • For example, the image acquisition unit 142 transmits a satellite image request specifying the position of the field 220 and the shooting date from the transmission unit 180 to the image providing server 300, and the receiving unit 182 receives the captured satellite image.
  • In the test stage, satellite images capturing the field 220a, the field 220b, and the field 220c are acquired at the timing of each week of the first period, each week of the second period, and each week of the third period.
  • the feature quantity extraction unit 144 extracts the feature quantity from the acquired photographed image of the farm field 220 (satellite image in this example).
  • the growth management apparatus 100 uses feature amounts (hereinafter referred to as “image feature amounts”) extracted from satellite images.
  • The image feature amounts are, for example, the average color of the field range, the percentage of green in the field range, the degree of green density in the field range, and the height of the plants.
  • the image feature amount has a correlation with the root growth data.
  • the image features are also correlated with root carbon dioxide uptake associated with root growth data.
  • The feature amount extraction unit 144 extracts image feature amounts from the captured image acquired in the first period (hereinafter referred to as the "first captured image"), the captured image acquired in the second period (hereinafter referred to as the "second captured image"), and the captured image acquired in the third period (hereinafter referred to as the "third captured image").
  • The R value of the average color of the field range is the average of the R values of the pixels included in the field range, the G value is the average of the G values of those pixels, and the B value is the average of the B values of those pixels.
  • The average color is one example of an index that indicates the color characteristics of an image. Instead of the average color, an index obtained by a method other than averaging, such as a color tone pattern or contrast, may also be used as an index indicating the color features of the image.
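As an illustration of how such an image feature amount could be computed, the following sketch averages the RGB values over the pixels of the field range and computes a simple green ratio; the array layout and the green criterion are assumptions, not details taken from the publication.

```python
import numpy as np


def average_color(field_pixels: np.ndarray) -> tuple[float, float, float]:
    """Average R, G, B values over all pixels in the field range.

    field_pixels: array of shape (H, W, 3) holding the RGB values of the part
    of the satellite image that covers the field.
    """
    r, g, b = field_pixels.reshape(-1, 3).mean(axis=0)
    return float(r), float(g), float(b)


def green_ratio(field_pixels: np.ndarray) -> float:
    """Fraction of pixels whose G channel dominates R and B -- a simple
    stand-in for the 'percentage of green in the field range'."""
    r, g, b = field_pixels[..., 0], field_pixels[..., 1], field_pixels[..., 2]
    return float(((g > r) & (g > b)).mean())
```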
  • the root data acquisition unit 146 acquires root growth data from the field terminal 200 .
  • the receiving unit 182 may receive the root growth data through network communication such as the Internet or LAN (Local Area Network), or may receive the root growth data through short-range wireless communication. Alternatively, the root data acquisition unit 146 may acquire root growth data received by the input unit 122 .
  • the teacher data generation unit 148 generates teacher data used for machine learning based on soil data, image feature values, and root data (at least one of root growth data and root carbon dioxide absorption).
  • the estimation model generator 150 includes a learning engine 152 .
  • the estimation model generation unit 150 uses the learning engine 152 to generate a learning model (estimation model) based on teacher data.
  • When the learning model uses a neural network, it has input nodes corresponding to each index of the soil data and the image feature amount, which are the input variables, output nodes corresponding to the indices of the root data, which are the output variables, and intermediate nodes.
  • In learning, the neural network generates weight data that indicates the strength of the connections between the nodes. A configuration example of the neural network will be described later with reference to FIG. 10.
  • In the utilization phase, each index value of the soil data and the image feature amount prepared as application data is set to the input node corresponding to that index, and the index values of the root data are obtained from the output nodes based on the weight data.
  • The estimation model generation unit 150 generates, by machine learning, an estimation model for estimating at least one of the root growth data of a given plant (sorghum in this example) and the carbon dioxide absorption amount of the roots of the given plant from the soil data and the image feature amount (in this example, the average color of the field range) extracted from the photographed image (in this example, the satellite image).
  • More specifically, for the first period, the estimation model generation unit 150 generates, by machine learning, a first estimation model for estimating at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the first captured image was taken, from the soil data and the image feature amount of the first captured image.
  • For the second period, the estimation model generation unit 150 generates, by machine learning, a second estimation model for estimating at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the second captured image was taken, from the soil data and the image feature amount of the second captured image.
  • For the third period, the estimation model generation unit 150 generates, by machine learning, a third estimation model for estimating at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the third captured image was taken, from the soil data and the image feature amount of the third captured image.
  • Root growth estimation unit 154 estimates root growth data in the estimation model use phase (S14).
  • Root growth estimation unit 154 includes estimation model utilization unit 156 .
  • the estimation model utilization unit 156 utilizes the estimation model for estimating root growth data.
  • The estimation model utilization unit 156 applies the soil data of the field 220 to be estimated and the image feature amount extracted from the acquired photographed image (a satellite image in this example) to the estimation model as input variables.
  • That is, the root growth estimation unit 154 uses an estimation model machine-learned with the soil data of the field 220 in which the predetermined plant (sorghum in this example) was cultivated in the past cultivation (the test stage in this example) and the image feature amounts extracted from photographed images of the field 220 in the past cultivation (satellite images in this example) as input data, and with at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the photographed images were taken in the past cultivation as output data. The root growth estimation unit 154 then estimates at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the photographed image was taken, from the soil data of the target field stored in the soil data storage unit 160 and the image feature amount extracted from the photographed image acquired by the image acquisition unit 142.
  • More specifically, when the first photographed image is acquired in the first period of the operation stage, the root growth estimation unit 154 uses a first estimation model machine-learned with the soil data of the field 220 in which the predetermined plant (sorghum in this example) was cultivated in the past cultivation and the image feature amount extracted from a photographed image of the field 220 in the first period of the past cultivation as input data, and with at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the photographed image was taken in the first period of the past cultivation as output data.
  • The root growth estimation unit 154 then estimates at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the first photographed image was taken, from the soil data of the target field stored in the soil data storage unit 160 and the image feature amount extracted from the first photographed image.
  • Similarly, when the second photographed image is acquired in the second period of the operation stage, the root growth estimation unit 154 uses a second estimation model machine-learned with the soil data of the field 220 in which the predetermined plant (sorghum in this example) was cultivated in the past cultivation (the test stage in this example) and the image feature amount extracted from a photographed image of the field in the second period of the past cultivation as input data, and with at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the photographed image was taken in the second period of the past cultivation as output data.
  • The root growth estimation unit 154 then estimates at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the second photographed image was taken, from the soil data of the target field stored in the soil data storage unit 160 and the image feature amount extracted from the second photographed image.
  • Likewise, when the third photographed image is acquired in the third period of the operation stage, the root growth estimation unit 154 uses a third estimation model machine-learned with the soil data of the field 220 in which the predetermined plant (sorghum in this example) was cultivated in the past cultivation (the test stage in this example) and the image feature amount extracted from a photographed image of the field in the third period of the past cultivation as input data, and with at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the photographed image was taken in the third period of the past cultivation as output data.
  • The root growth estimation unit 154 then estimates at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the third photographed image was taken, from the soil data of the target field stored in the soil data storage unit 160 and the image feature amount extracted from the third photographed image.
  • Data storage unit 106 includes soil data storage unit 160 , image storage unit 162 , feature amount data storage unit 164 , root data storage unit 166 , teacher data storage unit 168 and estimation model storage unit 170 .
  • the soil data storage unit 160 stores soil data.
  • The data structure of the soil data storage unit 160 will be described later with reference to FIG. 6.
  • The image storage unit 162 stores the photographed images (satellite images in this example) in association with the combination of year, field 220, and photographing date and time. The period and week for each field 220 can be identified from the photographing date and time.
  • the feature amount data storage unit 164 stores feature amount data.
  • The feature amount data will be described later with reference to FIG. 7.
  • the root data storage unit 166 stores root data.
  • The configuration of the root data storage unit 166 will be described later with reference to FIG. 8.
  • the teacher data storage unit 168 stores teacher data.
  • the teacher data will be described later with reference to FIGS. 11 and 12.
  • the estimated model storage unit 170 stores an estimated model for each period in each field 220 .
  • The data storage unit 106 also includes a field data storage unit (not shown) that stores field data such as the geographical position and range of each field 220 and the density (plants per unit area) of the given plant (sorghum in this example).
  • The data storage unit 106 stores the soil data of the field 220 in which a predetermined plant (sorghum in this example) is cultivated, the image feature amount extracted from a photographed image of the field 220 (a satellite image in this example), and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the photographed image was taken.
  • More specifically, the data storage unit 106 stores the soil data of a field 220 for cultivating a predetermined plant (sorghum in this example) that can be cultivated in the second and third periods with the roots remaining after the first harvest, the image feature amount extracted from the first photographed image of the field 220 in the first period, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the first photographed image was taken, the image feature amount extracted from the second photographed image of the field 220 in the second period, and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the second photographed image was taken.
  • FIG. 6 is a data structure diagram of the soil data storage unit 160.
  • Soil data storage unit 160 stores, for example, the illustrated table.
  • This table has a record for each combination of year and field ID.
  • a record corresponds to a "row” in a table.
  • This table also has items of acidity, drainage, fertility and viscosity, for example, for each combination of year and field ID.
  • An item corresponds to a "column" in the table.
  • FIG. 7 is a data structure diagram showing the time and average color in the growth process as feature data.
  • the feature amount data shown in FIG. 7 is shown in a table format.
  • the feature amount data has a record indicating the average color in each period for each combination of year and field ID.
  • the feature amount data is set as data of year, field ID, and weekly average color in each period. That is, the average color of the field range (example of image feature amount) is set in the field of each week in each period.
  • When the degree of greenness of the field range is used as the image feature amount, the degree of greenness of the field range is set in the field of each week in each period.
  • FIG. 7 shows an example in which the average color is used as the image feature amount. However, if the plant height is used as the image feature amount, the plant height is set in the field of each week in each period.
  • a field corresponds to a data value setting area and is also called a "cell".
  • FIG. 8 is a diagram showing an example of the data structure held by the root data storage unit 166.
  • root data storage unit 166 stores, for example, the table shown in FIG. This table has a record for each combination of year and field ID. In addition, this table is set with year, field ID, weekly root length (an example of root growth data), and root carbon dioxide absorption amount in each period as data.
  • When root weight is used as the root growth data, the root weight is set in the field of each week in each period instead of the root length.
  • When root volume is used as the root growth data, the root volume is set in the field of each week in each period instead of the root length.
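The three tables of FIGS. 6 to 8 could be represented, for example, by record classes like the following sketch; the class and attribute names are hypothetical and only mirror the items described above.

```python
from dataclasses import dataclass, field


@dataclass
class SoilRecord:
    """One row of the soil data table (FIG. 6): one record per year and field."""
    year: int
    field_id: str
    acidity: float
    drainage: float
    fertility: float
    viscosity: float


@dataclass
class FeatureRecord:
    """One row of the feature amount table (FIG. 7): weekly image feature
    (e.g. average color of the field range) for each period."""
    year: int
    field_id: str
    # average_color[(period, week)] -> image feature value
    average_color: dict[tuple[int, int], float] = field(default_factory=dict)


@dataclass
class RootRecord:
    """One row of the root data table (FIG. 8): weekly root length and
    root CO2 absorption for each period."""
    year: int
    field_id: str
    root_length: dict[tuple[int, int], float] = field(default_factory=dict)
    root_co2: dict[tuple[int, int], float] = field(default_factory=dict)
```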
  • FIG. 9 is a flow chart showing the process in the test data collection phase (S10).
  • the soil data acquisition unit 140 acquires soil data from each field 220 (S20).
  • the soil data acquiring unit 140 may acquire the soil data of the farm field 220 received from the farm field terminal 200 by the receiving unit 182 or the soil data of the farm field 220 accepted by the input unit 122 .
  • the growth management device 100 acquires satellite images and root growth data for each week in the first period of each field 220 (S22).
  • the image acquisition unit 142 uses the communication processing unit 108 to receive satellite images from the image providing server 300 . Satellite images are stored in the image storage unit 162 .
  • the feature amount extraction unit 144 extracts image feature amounts from the satellite image and stores them in the feature amount data storage unit 164 .
  • the root data acquisition unit 146 acquires root growth data for each week in the first period of each field 220 .
  • the root data acquisition unit 146 may acquire root growth data received by the communication processing unit 108 or may acquire root growth data received by the input unit 122 .
  • the growth management device 100 acquires satellite images and root growth data for each week in the second term of each field 220 (S26).
  • the data acquisition process and storage process are the same as in S22.
  • the growth management device 100 acquires satellite images and root growth data for each week in the third term of each field 220 (S30).
  • the data acquisition process and storage process are the same as in S22.
  • FIG. 10 is a configuration diagram of a neural network in the embodiment.
  • Acidity, drainage, fertility, and viscosity, which are examples of soil data, and the average color of the field range, which is an example of an image feature amount, are set as input nodes.
  • A plurality of intermediate nodes are provided in the intermediate layer of the neural network.
  • The root length, which is an example of root growth data, and the carbon dioxide absorption amount of the roots are set as output nodes. Only one of the root length and the carbon dioxide absorption amount of the roots may be used as an output node. The input nodes of the input layer and the intermediate nodes of the intermediate layer are fully connected.
  • This neural network represents an estimation model in which the root length and the carbon dioxide absorption amount of the roots are determined according to the values of the soil data and the image feature amount. The same network configuration is used in common for the first estimation model of the first period, the second estimation model of the second period, and the third estimation model of the third period.
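A minimal sketch of the network of FIG. 10, here written with PyTorch and with the average color encoded as a single input value as in the teacher-data table; the hidden-layer size and the choice of framework are assumptions.

```python
from torch import nn

# Input nodes: acidity, drainage, fertility, viscosity, and the average color
# of the field range (encoded here as a single value, as in the teacher-data
# table). Output nodes: root length and root CO2 absorption.
N_INPUTS, N_HIDDEN, N_OUTPUTS = 5, 16, 2   # hidden size is an assumption


def build_estimation_model() -> nn.Module:
    """Fully connected input -> intermediate -> output network (cf. FIG. 10)."""
    return nn.Sequential(
        nn.Linear(N_INPUTS, N_HIDDEN),
        nn.ReLU(),
        nn.Linear(N_HIDDEN, N_OUTPUTS),
    )
```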
  • FIG. 11 is a data structure diagram of teacher data for the first term in the embodiment.
  • the teacher data for the first period is prepared to generate the first estimation model for the first period.
  • the teacher data in this example is in table format.
  • the teacher data has records for each combination of year, field ID, period and week.
  • Each record has the items of acidity, drainage, fertility, viscosity, and average color of the field range, corresponding to the input nodes shown in FIG. 10, and the items of root length and root carbon dioxide absorption, corresponding to the output nodes.
  • One record corresponds to one sample in the teacher data. It is assumed that each sample in the teacher data can be identified according to each combination of year, field ID, period and week.
  • The first record shown corresponds to the combination of the year 2020, field ID: A1, the first period, and the first week. In this record, the acidity is B1, the drainage is C1, the fertility is D1, and the viscosity is E1.
  • These values are copied from the soil data storage section 160 by the teacher data generation section 148 .
  • the average color of the satellite image taken in the first week of the first term in the field 220a with field ID: A1 in 2020 was F1-1-1. This value is copied from the feature amount data storage unit 164 by the teacher data generation unit 148 .
  • the root length measured in the first week of the first term in the field 220a of the field ID: A1 in 2020 is G1-1-1, and the root carbon dioxide calculated based on the root length It shows that the amount of absorption was H1-1-1.
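One teacher-data record (a row of FIG. 11) could be assembled from the stored tables roughly as follows; the dictionary keys and helper name are hypothetical.

```python
def make_sample(soil: dict, weekly_color: dict, weekly_root: dict,
                period: int, week: int) -> dict:
    """Assemble one teacher-data sample for a given year/field, period and week.

    soil:         {"acidity": ..., "drainage": ..., "fertility": ..., "viscosity": ...}
    weekly_color: {(period, week): average_color}
    weekly_root:  {(period, week): (root_length, root_co2)}
    """
    root_length, root_co2 = weekly_root[(period, week)]
    return {
        # input variables (cf. the input nodes of FIG. 10)
        "acidity": soil["acidity"],
        "drainage": soil["drainage"],
        "fertility": soil["fertility"],
        "viscosity": soil["viscosity"],
        "average_color": weekly_color[(period, week)],
        # output variables (cf. the output nodes of FIG. 10)
        "root_length": root_length,
        "root_co2": root_co2,
    }
```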
  • the weight data that is the optimal solution is learned by the neural network.
  • the learned weight data is stored in the estimation model storage unit 170 .
  • the learning procedure itself using a neural network may be conventional technology.
  • FIG. 12 is a data structure diagram of teacher data for the second term in the embodiment.
  • the teacher data for the second period is prepared to generate the second estimation model for the second period.
  • The structure of the teacher data for the second period is the same as that of the teacher data for the first period shown in FIG. 11.
  • the illustrated first record corresponds to the year 2020, field ID: A1, the combination of the second period and the first week.
  • the teacher data for the third period has a similar configuration, and values related to the third period are set.
  • FIG. 13 is a flow chart showing the process in the estimation model generation phase (S12).
  • the teacher data generation unit 148 refers to the soil data storage unit 160, the feature amount data storage unit 164, and the root data storage unit 166 to generate the first term teacher data (FIG. 11) (S40). .
  • the estimation model generation unit 150 inputs the teacher data for the first period and activates the learning process by the learning engine 152 (S42).
  • the learning engine 152 executes learning processing and generates a first estimation model (S44).
  • the first estimation model including weight data is stored in estimation model storage section 170 .
  • the teacher data generation unit 148 similarly generates teacher data for the second term (S46).
  • the estimation model generation unit 150 inputs the second term teacher data and activates the learning process by the learning engine 152 (S48).
  • the learning engine 152 executes learning processing to generate a second estimation model (S50).
  • the second estimation model including weight data is stored in estimation model storage section 170 .
  • the teacher data generation unit 148 similarly generates teacher data for the third period (S52).
  • the estimation model generation unit 150 inputs the third term teacher data and activates the learning process by the learning engine 152 (S54).
  • the learning engine 152 executes learning processing and generates a third estimation model (S56).
  • the third estimation model including weight data is stored in estimation model storage section 170 .
  • This process is performed before the estimation model utilization phase. In this example, it takes place between the end of the third period in 2020 and the sowing of seeds in 2021. After that, the process moves to the operation stage.
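A hedged sketch of the learning processing (S40 to S56): one estimation model is fitted per period on that period's teacher data, using the hypothetical sample format shown above. The optimizer, loss, and epoch count are assumptions; the publication only states that a conventional neural-network learning procedure may be used.

```python
import torch
from torch import nn


def train_period_model(samples: list[dict], epochs: int = 500) -> nn.Module:
    """Fit one estimation model (first, second or third period) by regressing
    root length and root CO2 absorption on that period's teacher data."""
    x = torch.tensor(
        [[s["acidity"], s["drainage"], s["fertility"], s["viscosity"],
          s["average_color"]] for s in samples],
        dtype=torch.float32)
    y = torch.tensor(
        [[s["root_length"], s["root_co2"]] for s in samples],
        dtype=torch.float32)

    # Same shape as the FIG. 10 sketch: 5 inputs -> hidden layer -> 2 outputs.
    model = nn.Sequential(nn.Linear(5, 16), nn.ReLU(), nn.Linear(16, 2))
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    for _ in range(epochs):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
    return model


# S40-S56: one model per period, e.g.
# estimation_models = {p: train_period_model(teacher_data[p]) for p in (1, 2, 3)}
```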
  • FIG. 14 is a diagram showing the process in the estimation model utilization phase (S14).
  • the image acquisition unit 142 waits for the photographing timing (see FIG. 2) of each field 220 (S60).
  • the image acquisition unit 142 acquires a satellite image of the field (S62).
  • the satellite image acquisition method is the same as in the test data acquisition phase (FIG. 9).
  • the feature amount extraction unit 144 extracts the image feature amount (S64).
  • the extraction of the image feature quantity is also the same as in the test data collection phase (FIG. 9).
  • The root growth estimation unit 154 selects an estimation model according to the period at that time (S66). That is, the root growth estimation unit 154 selects the first estimation model if the photographing timing is in the first period, the second estimation model if it is in the second period, and the third estimation model if it is in the third period.
  • The estimation model utilization unit 156 applies the soil data of the field 220 and the image feature amount extracted in S64 to the estimation model. That is, the estimation model utilization unit 156 sets these values to the input nodes of the neural network that performs calculations using the estimation model (S68).
  • the root growth estimator 154 obtains values of root length and root carbon dioxide absorption from the output node as the calculation result of the estimation model (S70).
  • the output unit 120 outputs (for example, displays on a display) the root length and the amount of carbon dioxide absorbed by the root (S72).
  • the transmission unit 180 may transmit the root length and the carbon dioxide absorption amount of the root to the agricultural field terminal 200 .
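The utilization flow (S66 to S70) could look roughly like the following sketch: select the model for the current period, set the soil data and the image feature amount on the input nodes, and read the root length and root carbon dioxide absorption from the output nodes. Function and key names are hypothetical.

```python
import torch
from torch import nn


def estimate_root_state(models: dict[int, nn.Module], period: int,
                        soil: dict, average_color: float) -> tuple[float, float]:
    """S66-S70: pick the estimation model for the current period and run it on
    the soil data and the image feature extracted from the satellite image."""
    model = models[period]  # S66: first, second or third estimation model
    x = torch.tensor([[soil["acidity"], soil["drainage"], soil["fertility"],
                       soil["viscosity"], average_color]],
                     dtype=torch.float32)  # S68: set values on the input nodes
    with torch.no_grad():
        root_length, root_co2 = model(x)[0].tolist()  # S70: read the output nodes
    return root_length, root_co2
```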
  • As described above, machine learning provides a mechanism for estimating root growth data and the carbon dioxide absorption of roots.
  • Conventionally, in particular when the roots that grew in the first period are left in place, it has not been easy to grasp how those roots continue to grow in the second period.
  • The time of the first harvest is the end of the first period and the start of the second period.
  • In Modification 1, the first estimation model is used to estimate the root growth data at the time of the first reaping. The estimated root growth data is then added, as the initial value of the root state in the second period, to the elements used for estimating the root growth data in the second period.
  • FIG. 15 is a configuration diagram of a neural network in modification 1.
  • the first estimation model of modification 1 is generated using a neural network similar to that of the embodiment.
  • the second and third estimation models of modification 1 are generated using the neural network shown in FIG.
  • Compared with the embodiment, an input node corresponding to the root growth data or the carbon dioxide absorption amount of the roots at the end of the first period is added.
  • In this example, an input node for the root length at the end of the previous period is added.
  • The intermediate nodes and output nodes remain unchanged. As in the embodiment, the layers are fully connected.
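A minimal sketch of the FIG. 15 network used for the second and third estimation models in Modification 1, again in PyTorch with assumed layer sizes; the only structural change from the embodiment is the extra input node.

```python
from torch import nn

# Modification 1 (cf. FIG. 15): one additional input node for the root length
# at the end of the previous period; intermediate and output layers are the
# same as in the embodiment. Layer sizes remain assumptions.
N_INPUTS_MOD1 = 5 + 1   # soil data + average color + previous-period root length


def build_estimation_model_mod1() -> nn.Module:
    return nn.Sequential(
        nn.Linear(N_INPUTS_MOD1, 16),
        nn.ReLU(),
        nn.Linear(16, 2),   # root length, root CO2 absorption
    )
```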
  • the training data for the first period in modification 1 is the same as in the case of the embodiment (FIG. 11), and the generation of the first estimation model is also the same.
  • FIG. 16 is a data structure diagram of teacher data for the second period in Modification 1. In the second-period teacher data in Modification 1, an item for the root length at the end of the first period is added. The root length measured at the first reaping is set in the field of this item.
  • FIG. 17 is a data structure diagram of teacher data for the third period in Modification 1. Similarly, in the third-period teacher data in Modification 1, an item for the root length at the end of the second period is added. The root length measured at the second reaping is set in the field of this item.
  • the growth management device 100 acquires the satellite image and root growth data of each field 220 at the time of the first harvest (S24 ).
  • the growth management apparatus 100 acquires the satellite image and root growth data of each field 220 at the time of the second harvest (S28).
  • the acquisition method of the satellite image, the method of extracting the image feature amount from the satellite image, the acquisition method of the root growth data, and the like are the same as in the case of the timing of photographing each week.
  • the teacher data for the second term (FIG. 16) is generated by adding the root length at the end of the first term (at the time of the first reaping).
  • The estimation model generation unit 150 inputs this second-period teacher data and generates the second estimation model of this modification.
  • Teacher data for the third period (FIG. 17) is generated by adding the root length at the end of the second period (at the time of the second reaping).
  • The estimation model generation unit 150 inputs this third-period teacher data and generates the third estimation model of this modification.
  • the receiving unit 182 receives the notification from the field terminal 200 at the time of the first reaping.
  • the image acquisition unit 142 acquires the satellite image of the field 220 in the same manner as in S62, and the feature amount extraction unit 144 extracts the image feature amount in the same manner as in S64.
  • The estimation model utilization unit 156 applies the soil data of the field 220 and the extracted image feature amount to the first estimation model, and the root growth estimation unit 154 obtains the values of the root length and the carbon dioxide absorption amount of the roots at the time of the first reaping.
  • In the second period, the estimation model utilization unit 156 sets the root length obtained at the time of the first reaping to the input node for the root length at the end of the first period in the neural network. That is, the estimation model utilization unit 156 applies the root length at the time of the first reaping to the second estimation model in addition to the soil data and the image feature amount, and the root growth estimation unit 154 obtains the values of the root length and the carbon dioxide absorption amount of the roots in the second period, taking into account the root length at the end of the first period.
  • the receiving unit 182 receives a notification from the field terminal 200 at the time of the second harvesting.
  • the image acquisition unit 142 acquires a satellite image of the field in the same manner as in S62, and the feature amount extraction unit 144 extracts the image feature amount in the same manner as in S64.
  • The estimation model utilization unit 156 applies the soil data of the field 220 and the extracted image feature amount to the second estimation model, and the root growth estimation unit 154 obtains the values of the root length and the carbon dioxide absorption amount of the roots at the time of the second reaping.
  • In the third period, the estimation model utilization unit 156 sets the root length obtained at the time of the second reaping to the input node for the root length at the end of the second period in the neural network. That is, the estimation model utilization unit 156 applies the root length at the time of the second reaping to the third estimation model in addition to the soil data and the image feature amount, and the root growth estimation unit 154 obtains the values of the root length and the carbon dioxide absorption amount of the roots in the third period, taking into account the root length at the end of the second period.
  • As the teacher data for the first period (FIG. 11), the data storage unit 106 stores the soil data, the image feature amount extracted from the first photographed image of the field in the first period, and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the first photographed image was taken.
  • As the teacher data for the second period, the data storage unit 106 stores the soil data, the image feature amount extracted from the second photographed image of the field in the second period, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the second photographed image was taken, and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period.
  • As the teacher data for the third period, the data storage unit 106 stores the soil data, the image feature amount extracted from the third photographed image of the field in the third period, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the third photographed image was taken, and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the end of the second period.
  • The estimation model generation unit 150 uses the neural network of FIG. 10 and the first-period teacher data (FIG. 11) to generate, by machine learning, a first estimation model for estimating at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time of the first photographing, from the soil data and the image feature amount of the first photographed image.
  • The estimation model generation unit 150 uses the neural network of FIG. 15 and the second-period teacher data (FIG. 16) to generate, by machine learning, a second estimation model for estimating at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time of the second photographing, from the soil data, the image feature amount of the second photographed image, and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period.
  • The estimation model generation unit 150 uses the neural network of FIG. 15 and the third-period teacher data (FIG. 17) to generate, by machine learning, a third estimation model for estimating at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time of the third photographing, from the soil data, the image feature amount of the third photographed image, and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the end of the second period.
  • In the estimation for the first period, the root growth estimation unit 154 uses the first estimation model to estimate at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period, from the soil data and the image feature amount extracted from the first photographed image taken at the end of the first period. Then, in the estimation for the second period, the root growth estimation unit 154 uses the second estimation model to estimate at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the second photographed image was taken, based on the soil data, the image feature amount extracted from the second photographed image, and at least one of the estimated root growth data of the predetermined plant and carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period.
  • Similarly, the root growth estimation unit 154 uses the second estimation model to estimate at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the end of the second period, from the soil data and the image feature amount extracted from the second photographed image taken at the end of the second period. Then, in the estimation for the third period, the root growth estimation unit 154 uses the third estimation model to estimate at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the third photographed image was taken, based on the soil data, the image feature amount extracted from the third photographed image, and at least one of the estimated root growth data of the predetermined plant and carbon dioxide absorption amount of the roots of the predetermined plant at the end of the second period.
  • the state of the roots at the beginning of the second period is unknown because they are in the soil. Therefore, the first estimation model is used to estimate the state of the last root in the first period, and this is incorporated into the second estimation model. Furthermore, the state of the roots at the beginning of the third period is also unknown because it is in the soil. Therefore, the second estimation model is used to estimate the state of the last root in the second term, and this is incorporated into the third estimation model.
  • the growth management device 100 has the function of a machine learning device that generates an estimation model by machine learning, and the function of an estimation device that obtains root growth data and root carbon dioxide absorption from the estimation model.
  • a machine learning device and an estimation device may be provided separately.
  • FIG. 18 is a configuration diagram of a growth management system.
  • the machine learning device 190 performs processing in the test data collection phase (S10) and the estimation model generation phase (S12), and generates an estimation model by machine learning.
  • the estimation device 192 acquires satellite images and uses the estimation model to obtain root growth data and root carbon dioxide absorption. Therefore, the estimation model generated by the machine learning device 190 is transferred to the estimation device 192 before moving from the estimation model generation phase (S12) to the estimation model use phase (S14).
• the machine learning device 190 is used, for example, by a service provider that provides estimation models.
• the estimation device 192 is used, for example, by farmers who use the estimation models.
• the growth management device 100 may include a carbon price calculation unit (not shown) that calculates the carbon price (per plant) by multiplying the estimated carbon dioxide absorption amount of the roots (per plant) by a carbon price (rate) per unit amount.
• the carbon price calculation unit may also calculate the carbon price (per unit area) by multiplying the estimated carbon dioxide absorption amount of the roots (per unit area) by the carbon price (rate) per unit amount (a sketch of this calculation follows this list).
• the output unit 120 may output (for example, display on a display) the carbon price (per plant) or the carbon price (per unit area).
• the transmission unit 180 may transmit the carbon price (per plant) or the carbon price (per unit area) to the field terminal 200.
• the estimation device 192 may likewise include a carbon price calculation unit that calculates the carbon price (per plant) by multiplying the estimated carbon dioxide absorption amount of the roots (per plant) by the carbon price (rate) per unit amount.
• in that case, the carbon price calculation unit may calculate the carbon price (per unit area) by multiplying the estimated carbon dioxide absorption amount of the roots (per unit area) by the carbon price (rate) per unit amount.
• the output unit of the estimation device 192 may output (for example, display on a display) the carbon price (per plant) or the carbon price (per unit area).
• the transmission unit of the estimation device 192 may transmit the carbon price (per plant) or the carbon price (per unit area) to the field terminal 200.
• a drone image obtained by photographing the field 220 from above with a camera on a drone flying over the field 220 may be used.
• because drone images have more pixels than satellite images, they have the advantage that the ratio of green in the field range is easier to obtain.
• a camera image captured from obliquely above or from the horizontal direction by a camera installed in the field 220 (hereinafter referred to as an "installed camera") may also be used.
  • a camera image taken from the horizontal direction has the advantage that the height of the plant can be easily obtained.
  • the field 220 may be divided into a plurality of areas, each area may be treated as one field 220, and test cultivation may be performed by changing the timing of harvesting and the soil. In this way, many samples can be included in the training data.
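The chained estimation referred to above can be sketched as follows. This is an illustrative sketch only: the flat feature lists and the predict() interface of the models are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the chained estimation across periods. model1 and
# model2 stand for the first and second estimation models; their predict()
# API and the flat feature vectors are assumptions for illustration.

def estimate_second_period(model1, model2, soil, feats_end_p1, feats_p2):
    """Estimate the period-2 root state when the period-1 end state is unobservable."""
    # The root state at the start of period 2 is hidden in the soil, so it is
    # first estimated with the first model from the last period-1 image features.
    root_end_p1 = model1.predict([soil + feats_end_p1])[0]
    # The estimate is then fed into the second model as an additional input.
    return model2.predict([soil + feats_p2 + list(root_end_p1)])[0]

# Period 3 is handled the same way: the second model supplies the
# end-of-period-2 estimate, which is fed into the third model.
```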
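The carbon price calculation mentioned in the list above is a simple multiplication; a minimal sketch follows. The function names, units, and the example rate are hypothetical, since the disclosure only states that the estimated carbon dioxide absorption amount is multiplied by a price per unit amount.

```python
# Minimal sketch of the carbon price calculation; all numbers are placeholders.

def carbon_price_per_plant(co2_kg_per_plant: float, price_per_kg: float) -> float:
    """Carbon price (per plant) = estimated CO2 absorption (per plant) x rate per unit amount."""
    return co2_kg_per_plant * price_per_kg

def carbon_price_per_unit_area(co2_kg_per_plant: float,
                               plants_per_unit_area: float,
                               price_per_kg: float) -> float:
    """Carbon price (per unit area) = CO2 absorption (per unit area) x rate,
    where CO2 absorption (per unit area) = per-plant absorption x plant density."""
    co2_per_area = co2_kg_per_plant * plants_per_unit_area
    return co2_per_area * price_per_kg

# Example with arbitrary values: 2.5 kg CO2 per plant, 10,000 plants per hectare,
# 0.05 currency units per kg.
print(carbon_price_per_plant(2.5, 0.05))              # 0.125 per plant
print(carbon_price_per_unit_area(2.5, 10_000, 0.05))  # 1250.0 per hectare
```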

Abstract

A machine learning device according to the present invention comprises: a storage unit that stores soil data of cultivated land where plants of a prescribed kind are cultivated, image features extracted from captured images of the cultivated land, and growth data of the roots of the plants of the prescribed kind and/or the amounts of carbon dioxide absorption of the roots of the plants of the prescribed kind at the times when the images were captured; and a model generation unit that generates, through machine learning, an estimation model for estimating, from soil data and image features extracted from a captured image, growth data of the roots of a plant of the prescribed kind and/or the amount of carbon dioxide absorption of the roots of the plant of the prescribed kind at the time when the image was captured.

Description

Machine learning device, estimation device, and program
The present invention relates to a technique for estimating plant growth.
With recent global warming, tree-planting activities aimed at reducing the amount of carbon dioxide in the atmosphere through plant photosynthesis have become widespread. It is also known that plant roots have a carbon-fixing capability, and accurately grasping the length of the roots of plants planted in a field is considered important for estimating, in turn, the amount of carbon they absorb.
Patent Document 1 discloses a technique for automatically and non-destructively classifying rhizosphere elements in soil, which are effective and objective for quantitatively grasping the distribution and dynamics of the fine roots of trees and the like, without manual work such as visual inspection.
Patent Document 1: Japanese Patent No. 5078508; Patent Document 2: International Publication No. WO 2019/097892
However, the technique disclosed in Patent Document 1 predicts the growth situation in one specific field, and introducing such a mechanism over a wide range of fields requires a great deal of cost.
Patent Document 2 discloses a technique that uses aerial images or the like to classify photographed images of a field based on the photographing altitude, reconstructs 3D image data and the like of the field, the sections of the field, and the plants in the field from the classified image group, and converts them into visualization data based on the field conditions or the imaging conditions.
However, the technique disclosed in Patent Document 2 requires images photographed at different altitudes for each section, so a UAV (Unmanned Aerial Vehicle) such as a drone must be flown over a single field multiple times.
To address this point, a technique is needed that can grasp, from images captured by a satellite and without a large-scale mechanism, the amount of carbon dioxide absorbed by plants cultivated in a given area and, in particular, by their roots.
The present invention was completed based on the above recognition of the problems, and its main object is to provide a technique that can easily estimate the state of plant roots.
A machine learning device in one aspect of the present invention comprises:
a storage unit that stores soil data of a field in which a predetermined plant is cultivated, an image feature amount extracted from a photographed image of the field, and at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at the time the photographed image was captured; and
a model generation unit that generates, by machine learning, an estimation model for estimating, from the soil data and the image feature amount extracted from the photographed image, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time of imaging.
An estimation device in one aspect of the present invention comprises:
a storage unit that stores soil data of a field in which a predetermined plant is cultivated;
an image acquisition unit that acquires a photographed image of the field;
an extraction unit that extracts an image feature amount from the acquired photographed image; and
an estimation unit that estimates at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at the time the acquired photographed image was captured, from the stored soil data and the image feature amount extracted from the acquired photographed image, using an estimation model that has been machine-learned with, as input data, soil data of a field in which the predetermined plant was cultivated in past cultivation and image feature amounts extracted from photographed images of the field in the past cultivation, and with, as output data, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the times those photographed images were captured.
A machine learning device in another aspect of the present invention comprises:
a storage unit that stores soil data of a field in which a predetermined plant that can be cultivated for a second period with its roots left after a first-period harvest is cultivated, an image feature amount extracted from a first photographed image of the field in the first period, at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at a first imaging time of the first photographed image, an image feature amount extracted from a second photographed image of the field in the second period, and at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at a second imaging time of the second photographed image; and
a model generation unit that generates, by machine learning, a first estimation model for estimating, from the soil data and the image feature amount of the first photographed image, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the first imaging time, and generates, by machine learning, a second estimation model for estimating, from the soil data and the image feature amount of the second photographed image, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the second imaging time.
An estimation device in another aspect of the present invention comprises:
a storage unit that stores soil data of a field in which a predetermined plant that can be cultivated for a second period with its roots left after a first-period harvest is cultivated;
an image acquisition unit that acquires a first photographed image of the field in the first period or a second photographed image of the field in the second period;
an extraction unit that extracts an image feature amount from the first photographed image or the second photographed image; and
an estimation unit that, when the first photographed image is acquired, estimates at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at the time the first photographed image was captured, from the stored soil data and the image feature amount extracted from the first photographed image, using a first estimation model that has been machine-learned with, as input data, soil data of a field in which the predetermined plant was cultivated in past cultivation and image feature amounts extracted from photographed images of the field in the first period of the past cultivation, and with, as output data, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the times those first-period images were captured, and that, when the second photographed image is acquired, estimates at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at the time the second photographed image was captured, from the stored soil data and the image feature amount extracted from the second photographed image, using a second estimation model that has been machine-learned with, as input data, the soil data of the field in the past cultivation and image feature amounts extracted from photographed images of the field in the second period of the past cultivation, and with, as output data, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the times those second-period images were captured.
A machine learning device in yet another aspect of the present invention comprises:
a storage unit that stores soil data of a field in which a predetermined plant that can be cultivated for a second period with its roots left after a first-period harvest is cultivated, an image feature amount extracted from a first photographed image of the field in the first period, at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at a first imaging time of the first photographed image, an image feature amount extracted from a second photographed image of the field in the second period, at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at a second imaging time of the second photographed image, and at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period; and
a model generation unit that generates, by machine learning, a first estimation model for estimating, from the soil data and the image feature amount of the first photographed image, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the first imaging time, and generates, by machine learning, a second estimation model for estimating, from the soil data, the image feature amount of the second photographed image, and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the second imaging time.
An estimation device in yet another aspect of the present invention comprises:
a storage unit that stores soil data of a field in which a predetermined plant that can be cultivated for a second period with its roots left after a first-period harvest is cultivated;
an image acquisition unit that acquires a first photographed image of the field at the end of the first period and further acquires a second photographed image of the field in the second period;
an extraction unit that extracts an image feature amount from the first photographed image and further extracts an image feature amount from the second photographed image; and
an estimation unit that estimates at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period, from the stored soil data and the image feature amount extracted from the first photographed image, using a first estimation model that has been machine-learned with, as input data, soil data of a field in which the predetermined plant was cultivated in past cultivation and image feature amounts extracted from photographed images of the field in the first period of the past cultivation, and with, as output data, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the times those images were captured, and that estimates at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at the time the second photographed image was captured, from the stored soil data, the image feature amount extracted from the second photographed image, and the estimated at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period, using a second estimation model that has been machine-learned with, as input data, the soil data of the field in the past cultivation, image feature amounts extracted from photographed images of the field in the second period of the past cultivation, and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period of the past cultivation, and with, as output data, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the times those second-period images were captured.
According to the present invention, the state of the roots of a plant can be estimated easily.
FIG. 1 is a diagram showing an overview of the test stage.
FIG. 2 is a diagram showing the growth periods of three-period cultivation.
FIG. 3 is a diagram showing an overview of the operation stage.
FIG. 4 is a diagram showing the phases of information processing.
FIG. 5 is a functional block diagram of the growth management device.
FIG. 6 is a data structure diagram of the soil data storage unit.
FIG. 7 is a data structure diagram of feature amount data.
FIG. 8 is a data structure diagram of root data.
FIG. 9 is a flowchart showing the process in the test data collection phase.
FIG. 10 is a configuration diagram of a neural network in the embodiment.
FIG. 11 is a data structure diagram of first-period teacher data in the embodiment.
FIG. 12 is a data structure diagram of second-period teacher data in the embodiment.
FIG. 13 is a flowchart showing the process in the estimation model generation phase.
FIG. 14 is a diagram showing the process in the estimation model use phase.
FIG. 15 is a configuration diagram of a neural network in modification 1.
FIG. 16 is a data structure diagram of second-period teacher data in modification 1.
FIG. 17 is a data structure diagram of third-period teacher data in modification 1.
FIG. 18 is a configuration diagram of the growth management system.
[Embodiment]
Biomass, which is carbon neutral, is expected to serve as a measure for reducing carbon dioxide. Sorghum, one example of biomass, is a raw material for producing ethanol. Sorghum is a plant of the grass family. Sorghum grown on one hectare is said to absorb 90 tons of carbon dioxide through its leaves and stems and 60 tons through its roots. The roots, in particular, play a role in retaining carbon dioxide in the ground.
Here, a mechanism is provided for grasping the amount of carbon dioxide absorbed between the start of sorghum cultivation and the harvest. A field where sorghum is cultivated is photographed from above by an artificial satellite, and the captured satellite images are used. In addition, artificial intelligence technology is used to estimate the degree of growth of the sorghum during cultivation, and in particular the growth of its roots. In this example, supervised learning is performed, so test cultivation is carried out to collect the data that forms the basis of the teacher data.
Sorghum is cultivated over three cropping periods. In the second period, the second-period grass is grown from the roots left after the first-period grass is cut. In the third period, the third-period grass is grown from the roots left after the second-period grass is cut. In other words, the grass is grown three times while the roots remain in place.
FIG. 1 is a diagram showing an overview of the test stage.
Sorghum is cultivated in fields 220a, 220b, and 220c, which have soils with different characteristics. To determine the degree of growth, the artificial satellite 302 photographs the fields 220a, 220b, and 220c from above. The captured satellite images are transmitted to the image providing server 300. The growth management device 100 acquires the satellite images of the fields 220a, 220b, and 220c from the image providing server 300. The growth management device 100 is a device used, for example, to manage the growth of a predetermined plant in the fields 220a, 220b, and 220c.
Before cultivation starts, the soil data of the fields 220a, 220b, and 220c are sent from the field terminal 200 to the growth management device 100. The soil data represent characteristics of the soil, such as acidity, drainage, fertility, and viscosity.
During cultivation, data representing the state of root growth (hereinafter referred to as "root growth data") are sent from the field terminal 200 to the growth management device 100. Root growth data are, for example, root length, root weight, and root volume.
Soil characteristics affect how sorghum grows. In particular, the way the roots spread is easily affected by the soil, so soil data and root growth data are related. In this example, it is assumed that test cultivation was carried out in the fields 220a, 220b, and 220c in 2020 and basic data were obtained.
The growth management device 100 performs machine learning using the data obtained in the test stage and generates estimation models for estimating the growth state of the roots.
FIG. 2 is a diagram showing the growth periods of three-period cultivation.
Growth is described here for a predetermined plant (for example, sorghum) that can be cultivated in the second and third periods with the roots left after the first-period harvest.
The first period starts with sowing, and cultivation continues until day 70, that is, week 10. During this time, a satellite image is acquired every 7 days, and the roots are measured at the same timing. The plants bear grain shortly after week 9 and are cut together with the leaves and stems. Some plants are left growing for the week-10 imaging and measurement. In modification 1, described later, a satellite image is also acquired and the roots are also measured at the time of the first cutting.
The first cutting marks the start of the second period. The second period is also cultivated until week 10, with a satellite image acquired and the roots measured every 7 days. The plants bear grain shortly after week 9 and are cut together with the leaves and stems. Some plants are left growing for the week-10 imaging and measurement. In modification 1, described later, a satellite image is also acquired and the roots are also measured at the time of the second cutting.
The second cutting marks the start of the third period. The third period is also cultivated until week 10, with a satellite image acquired and the roots measured every 7 days. The plants bear grain shortly after week 9 and are cut together with the leaves and stems. Some plants are left growing for the week-10 imaging and measurement. The roots remain in the ground.
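For concreteness, the weekly imaging and measurement timing described above can be sketched as follows. The start date and the simplifying assumption that each period spans exactly 70 days are illustrative only; the disclosure only specifies weekly imaging and measurement over three 10-week periods.

```python
# Illustrative sketch of the test-cultivation timing: three periods, each run
# to week 10, with a satellite image and a root measurement every 7 days.
from datetime import date, timedelta

def imaging_schedule(sowing_date: date, periods: int = 3, weeks: int = 10):
    events = []  # (period, week, date) of each satellite image / root measurement
    for p in range(periods):
        period_start = sowing_date + timedelta(days=70 * p)  # assumed 70-day periods
        for w in range(1, weeks + 1):
            events.append((p + 1, w, period_start + timedelta(days=7 * w)))
    return events

# e.g. imaging_schedule(date(2020, 5, 1))[:2]
# -> [(1, 1, datetime.date(2020, 5, 8)), (1, 2, datetime.date(2020, 5, 15))]
```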
FIG. 3 is a diagram showing an overview of the operation stage.
In this example, the operation stage corresponds to 2021. In the operation stage as well, soil data are sent from the field terminal 200 to the growth management device 100, and, as described in connection with FIG. 2, a satellite image is acquired in each week of each period. In the operation stage, however, the roots do not have to be measured, because the growth management device 100 can obtain the root growth data and the root carbon dioxide absorption amount using the estimation models.
The carbon dioxide absorption amount of the roots (per plant) is obtained as root weight x a predetermined factor or as root volume x a predetermined factor. Because the carbon dioxide absorption amount of the roots correlates with root length, it may also be obtained from a correspondence table that associates root length with root carbon dioxide absorption, or from a formula expressing that correlation. The carbon dioxide absorption amount of the roots (per unit area) can be obtained as the carbon dioxide absorption amount of the roots (per plant) x the plant density (plants/unit area). The unit area is, for example, one hectare.
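A minimal sketch of these conversions is given below. The multipliers and the length-to-absorption table are placeholders; the disclosure only states that predetermined factors, a correspondence table, or a correlation formula are used.

```python
# Placeholder factors; the actual values are not specified in the disclosure.
WEIGHT_FACTOR = 1.5   # hypothetical kg-CO2 per kg of root
VOLUME_FACTOR = 0.8   # hypothetical kg-CO2 per liter of root

def co2_per_plant_from_weight(root_weight_kg: float) -> float:
    return root_weight_kg * WEIGHT_FACTOR

def co2_per_plant_from_volume(root_volume_l: float) -> float:
    return root_volume_l * VOLUME_FACTOR

# Correspondence table: root length (cm) -> CO2 absorption (kg per plant); placeholder values.
LENGTH_TABLE = [(50, 0.5), (100, 1.2), (150, 2.0)]

def co2_per_plant_from_length(root_length_cm: float) -> float:
    # Pick the nearest table entry; a correlation formula could be used instead.
    nearest = min(LENGTH_TABLE, key=lambda row: abs(row[0] - root_length_cm))
    return nearest[1]

def co2_per_unit_area(co2_per_plant: float, plants_per_unit_area: float) -> float:
    # Per-unit-area absorption = per-plant absorption x plant density (e.g. plants/ha).
    return co2_per_plant * plants_per_unit_area
```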
FIG. 4 is a diagram showing the phases of information processing.
In the test data collection phase (S10), actual data from the test stage (FIG. 1) are collected. In the estimation model generation phase (S12), learning models are generated using the collected actual data as teacher data; the learning models in this example are called estimation models. In the estimation model use phase (S14), the generated estimation models are used in the operation stage (FIG. 3) to obtain root growth data and root carbon dioxide absorption amounts.
Since root growth and carbon dioxide absorption differ depending on the type of plant, the growth characteristics of the specific plant are reflected in the estimation model. For sorghum cultivation, estimation models for sorghum are generated; for other plants, estimation models for those plants are generated.
In addition, the grass grows differently in the first, second, and third periods, so separate estimation models reflecting each way of growing are generated. The estimation model for the first period is called the "first estimation model", the model for the second period the "second estimation model", and the model for the third period the "third estimation model". In this example, a neural network is used for the machine learning, and general neural-network-based machine learning processing may be used. Means other than neural networks, such as natural language processing typified by morphological analysis, may also be used for the machine learning.
FIG. 5 is a functional block diagram of the growth management device 100.
The growth management device 100 has the function of a machine learning device that performs the processing in the test data collection phase (S10) and the estimation model generation phase (S12) and generates estimation models by machine learning. In the estimation model use phase (S14), the growth management device 100 further has the function of an estimation device that acquires satellite images and obtains root growth data and root carbon dioxide absorption amounts with the estimation models.
Each component of the growth management device 100 is realized by hardware, including computing units such as a CPU (Central Processing Unit) and various coprocessors, storage devices such as memory and storage, and the wired or wireless communication lines connecting them, and by software that is stored in the storage devices and supplies processing instructions to the computing units. The computer programs may consist of device drivers, an operating system, various application programs located in layers above them, and libraries that provide common functions to these programs. Each block shown in the figure mainly indicates a functional block, and each block may be realized by causing a computing unit to execute a program stored in a storage device.
The growth management device 100 includes a user interface processing unit 102, a data processing unit 104, a communication processing unit 108, and a data storage unit 106. The user interface processing unit 102 receives operations from the user via a mouse, a touch panel, or the like, and is in charge of user interface processing such as image display and audio output. The communication processing unit 108 is in charge of communication processing via a network, short-range wireless communication, or the like. The data storage unit 106 stores various data. The data processing unit 104 executes various processes based on the data acquired by the user interface processing unit 102 and the communication processing unit 108 and the data stored in the data storage unit 106, and also functions as an interface between the user interface processing unit 102, the communication processing unit 108, and the data storage unit 106.
The user interface processing unit 102 has an input unit 122 that receives data input through user operations and an output unit 120 that outputs data to be provided to the user.
The communication processing unit 108 includes a transmission unit 180 that transmits various data and a reception unit 182 that receives various data.
The data processing unit 104 includes a soil data acquisition unit 140, an image acquisition unit 142, a feature amount extraction unit 144, a root data acquisition unit 146, a teacher data generation unit 148, an estimation model generation unit 150, and a root growth estimation unit 154.
The soil data acquisition unit 140 acquires soil data from the field terminal 200 using, for example, the transmission unit 180. The soil data acquisition unit 140 may instead acquire soil data accepted by the input unit 122.
The image acquisition unit 142 acquires photographed images of the field 220 (satellite images in this example). In this example, the image acquisition unit 142 transmits, from the transmission unit 180 to the image providing server 300, a request for a satellite image that specifies the position of the field 220 and the shooting date, and receives the captured satellite image with the reception unit 182. Specifically, in the test stage and the operation stage, a satellite image of each of the fields 220a, 220b, and 220c is acquired at the imaging timing of each week of the first, second, and third periods.
The feature amount extraction unit 144 extracts feature amounts from the acquired photographed image of the field 220 (a satellite image in this example). The growth management device 100 uses feature amounts extracted from the satellite image (hereinafter referred to as "image feature amounts"). Image feature amounts are, for example, the average color of the field range, the ratio of green in the field range, the degree of green density in the field range, and the plant height. As a plant grows, its leaves and stems become larger, so the average color of the field range changes from earth-colored to green; the ratio of green in the field range increases, the degree of green density becomes stronger, and the plant height increases. As the leaves and stems grow, the roots also grow, so the image feature amounts correlate with the root growth data and, in turn, with the root carbon dioxide absorption amount associated with the root growth data. Specifically, the feature amount extraction unit 144 extracts image feature amounts from the photographed image acquired in the first period (hereinafter the "first photographed image"), the photographed image acquired in the second period (hereinafter the "second photographed image"), and the photographed image acquired in the third period (hereinafter the "third photographed image"). For example, when pixel colors are expressed in RGB format, the R value of the average color of the field range is the average of the R values of the pixels included in the field range, the G value is the average of their G values, and the B value is the average of their B values. The average color is one example of an index representing the color characteristics of the image; instead of the average color, an index obtained by a method other than averaging, such as a color tone pattern or contrast, may be used.
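A minimal sketch of extracting the image feature amounts described above is shown below, assuming the satellite image is an RGB array and the field range is given as a boolean mask. The green-pixel test and the returned keys are illustrative assumptions, not the disclosed implementation.

```python
import numpy as np

def field_features(image_rgb: np.ndarray, field_mask: np.ndarray) -> dict:
    """image_rgb: (H, W, 3) array; field_mask: (H, W) boolean mask of the field range."""
    pixels = image_rgb[field_mask]              # pixels inside the field range
    mean_color = pixels.mean(axis=0)            # average R, G, B over the field range
    r, g, b = pixels[:, 0], pixels[:, 1], pixels[:, 2]
    is_green = (g > r) & (g > b)                # crude "green pixel" test (assumption)
    green_ratio = float(is_green.mean())        # ratio of green in the field range
    green_depth = float(g[is_green].mean()) if is_green.any() else 0.0
    return {
        "mean_r": float(mean_color[0]),
        "mean_g": float(mean_color[1]),
        "mean_b": float(mean_color[2]),
        "green_ratio": green_ratio,
        "green_depth": green_depth,             # degree of green density in the field range
    }
```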
The root data acquisition unit 146 acquires root growth data from the field terminal 200. The reception unit 182 may receive the root growth data via network communication such as the Internet or a LAN (Local Area Network), or via short-range wireless communication. Alternatively, the root data acquisition unit 146 may acquire root growth data accepted by the input unit 122.
The teacher data generation unit 148 generates the teacher data used for machine learning based on the soil data, the image feature amounts, and the root data (at least one of the root growth data and the root carbon dioxide absorption amount). The estimation model generation unit 150 includes a learning engine 152 and uses it to generate a learning model (estimation model) based on the teacher data. When the learning model uses a neural network, weight data indicating the strength of the connections between nodes are generated using a neural network that has input nodes corresponding to the indices of the soil data and the image feature amounts (the input variables), output nodes corresponding to the indices of the root data (the output variables), and intermediate nodes. A configuration example of the neural network is described later with reference to FIG. 10. When a neural network is used, the index values of the soil data and the image feature amounts prepared as application data are set in the input nodes corresponding to those indices, and the index values of the root data are obtained from the output nodes based on the weight data.
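The following is a minimal sketch of such an estimation model built as a small feed-forward neural network. The feature columns, placeholder values, and the use of scikit-learn are assumptions for illustration; the disclosure only specifies input nodes (soil data and image feature amounts), output nodes (root data), and weights learned from teacher data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Teacher data rows: [acidity, drainage, fertility, viscosity, mean_r, mean_g, mean_b]
X_train = np.array([
    [6.0, 0.7, 0.8, 0.3, 110, 140, 90],
    [5.5, 0.5, 0.6, 0.5, 100, 155, 85],
])  # placeholder values
# Outputs: [root_length_cm, co2_absorption_kg_per_plant]
y_train = np.array([
    [40.0, 0.6],
    [55.0, 0.9],
])  # placeholder values

# Small network: input nodes for soil data + image features, one hidden layer,
# output nodes for the root data indices.
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(X_train, y_train)

# Estimation: apply the soil data and image features of the target field and week.
x_new = np.array([[5.8, 0.6, 0.7, 0.4, 105, 150, 88]])
root_length, co2 = model.predict(x_new)[0]
```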
That is, the estimation model generation unit 150 generates, by machine learning, an estimation model that estimates, from the soil data and the image feature amount (in this example, the average color of the field range) extracted from a photographed image (in this example, a satellite image), at least one of the root growth data of the predetermined plant (in this example, sorghum) and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the photographed image was captured. More specifically, for the first period, the estimation model generation unit 150 generates, by machine learning, a first estimation model that estimates, from the soil data and the image feature amount of the first photographed image, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of its roots at the time the first photographed image was captured. For the second period, it generates, by machine learning, a second estimation model that estimates, from the soil data and the image feature amount of the second photographed image, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of its roots at the time the second photographed image was captured. For the third period, it generates, by machine learning, a third estimation model that estimates, from the soil data and the image feature amount of the third photographed image, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of its roots at the time the third photographed image was captured.
The root growth estimation unit 154 estimates root growth data in the estimation model use phase (S14). The root growth estimation unit 154 includes an estimation model use unit 156, which uses the estimation models for estimating root growth data. The estimation model use unit 156 applies, as input variables to the estimation model, the soil data of the field 220 to be estimated and the image feature amount extracted from the acquired photographed image (in this example, a satellite image).
That is, the root growth estimation unit 154 uses an estimation model that has been machine-learned with, as input data, the soil data of a field 220 in which the predetermined plant (in this example, sorghum) was cultivated in past cultivation (in this example, the test stage) and the image feature amounts extracted from photographed images (in this example, satellite images) of the field 220 in the past cultivation, and with, as output data, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of its roots at the times those images were captured. Using this model, the root growth estimation unit 154 estimates, from the soil data of the target field stored in the soil data storage unit 160 and the image feature amount extracted from the photographed image acquired by the image acquisition unit 142, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of its roots at the time that photographed image was captured.
More specifically, when a first photographed image is acquired in the first period of the operation stage, the root growth estimation unit 154 uses the first estimation model, which has been machine-learned with, as input data, the soil data of the field 220 in which the predetermined plant was cultivated in the past cultivation and the image feature amounts extracted from the first-period photographed images of that field, and with, as output data, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of its roots at the times those first-period images were captured. It then estimates, from the soil data of the target field stored in the soil data storage unit 160 and the image feature amount extracted from the first photographed image, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of its roots at the time the first photographed image was captured.
Similarly, when a second photographed image is acquired in the second period of the operation stage, the root growth estimation unit 154 uses the second estimation model, which has been machine-learned with, as input data, the soil data of the field 220 in the past cultivation and the image feature amounts extracted from the second-period photographed images of that field, and with, as output data, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of its roots at the times those second-period images were captured. It then estimates, from the soil data of the target field stored in the soil data storage unit 160 and the image feature amount extracted from the second photographed image, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of its roots at the time the second photographed image was captured.
In addition, when a third photographed image is acquired in the third period of the operation stage, the root growth estimation unit 154 uses the third estimation model, which has been machine-learned with, as input data, the soil data of the field 220 in the past cultivation and the image feature amounts extracted from the third-period photographed images of that field, and with, as output data, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of its roots at the times those third-period images were captured. It then estimates, from the soil data of the target field stored in the soil data storage unit 160 and the image feature amount extracted from the third photographed image, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of its roots at the time the third photographed image was captured.
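The per-period application described above can be sketched as follows. The function and parameter names, the flat feature lists, and the predict() interface are illustrative assumptions; the disclosure only specifies that the model for the current period is applied to the stored soil data and the newly extracted image features.

```python
# Hypothetical sketch of the estimation model use phase: pick the model for the
# current period (1, 2, or 3) and apply it to soil data + image features.
def estimate_root_state(period: int, soil: list, image_features: list, models: dict):
    model = models[period]                      # first/second/third estimation model
    x = [soil + image_features]                 # one input row per estimation
    root_growth, co2_absorption = model.predict(x)[0]
    return root_growth, co2_absorption

# e.g. estimate_root_state(2, soil_data, features_week5, trained_models)
```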
The data storage unit 106 includes a soil data storage unit 160, an image storage unit 162, a feature amount data storage unit 164, a root data storage unit 166, a teacher data storage unit 168, and an estimation model storage unit 170.
The soil data storage unit 160 stores soil data and is described later with reference to FIG. 6. The image storage unit 162 stores photographed images (in this example, satellite images) in association with combinations of year, field 220, and shooting date and time; the shooting date and time identifies the period and week for each field 220. The feature amount data storage unit 164 stores feature amount data, described later with reference to FIG. 7. The root data storage unit 166 stores root data; its configuration is described later with reference to FIG. 8. The teacher data storage unit 168 stores teacher data, described later with reference to FIGS. 11 and 12. The estimation model storage unit 170 stores the estimation models of each period for each field 220. The data storage unit 106 also includes a field data storage unit (not shown) that stores field data such as the geographical position and extent of each field 220 and the density of the predetermined plant (in this example, sorghum) (plants/unit area).
That is, the data storage unit 106 stores the soil data of a field 220 in which the predetermined plant (in this example, sorghum) is cultivated, the image feature amounts extracted from photographed images (in this example, satellite images) of the field 220, and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of its roots at the times those images were captured.
More specifically, the data storage unit 106 stores the soil data of a field 220 in which a predetermined plant (in this example, sorghum) that can be cultivated in the second and third periods with its roots left after the first-period cutting is cultivated, the image feature amount extracted from the first photographed image of the field 220 in the first period, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of its roots at the time the first photographed image was captured, the image feature amount extracted from the second photographed image of the field 220 in the second period, and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of its roots at the time the second photographed image was captured.
FIG. 6 is a data structure diagram of the soil data storage unit 160.
The soil data storage unit 160 stores, for example, the illustrated table. The table has a record for each combination of year and field ID; a record corresponds to a row of the table. For each combination of year and field ID, the table has items for acidity, drainage, fertility, and viscosity. An item corresponds to a column of the table.
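As one illustration of the table just described, the sketch below models a soil-data record keyed by year and field ID. The class name, attribute names, and sample values are assumptions made for illustration only; the patent does not prescribe any particular implementation.

```python
from dataclasses import dataclass

@dataclass
class SoilRecord:
    year: int
    field_id: str
    acidity: float    # one value per (year, field ID), as in the table of FIG. 6
    drainage: float
    fertility: float
    viscosity: float

# One record corresponds to one row; the key mirrors the (year, field ID) combination.
soil_table = {
    (2020, "A1"): SoilRecord(2020, "A1", acidity=6.2, drainage=0.7,
                             fertility=0.8, viscosity=0.4),
}
```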
FIG. 7 shows a data structure diagram in which the timing in the growth process and the average color are used as the feature amount data.
The feature amount data shown in FIG. 7 is in table format. It has a record for each combination of year and field ID, giving the average color for each period; in other words, each record holds the year, the field ID, and the weekly average color within each period. That is, the average color of the field range (an example of an image feature amount) is set in the field for each week of each period. When the degree of greenness of the field range is used as the image feature amount, that value is set in the weekly fields instead; likewise, when plant height is used as the image feature amount, the plant height is set in the weekly fields. A field corresponds to an area where a data value is set and is also called a "cell".
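As a concrete illustration of how such an image feature amount might be computed, the sketch below averages the pixels inside the field range of a photographed image and also derives a greenness measure. The function names and the use of an explicit field mask are assumptions for illustration, not part of the patent.

```python
import numpy as np

def average_field_color(image: np.ndarray, field_mask: np.ndarray) -> np.ndarray:
    """image: H x W x 3 array; field_mask: H x W boolean array of pixels inside the field range."""
    field_pixels = image[field_mask]      # keep only pixels belonging to the field range
    return field_pixels.mean(axis=0)      # average color (R, G, B) over the field range

def green_ratio(image: np.ndarray, field_mask: np.ndarray) -> float:
    """Alternative feature mentioned in the text: degree of greenness of the field range."""
    r, g, b = average_field_color(image, field_mask)
    return float(g / (r + g + b + 1e-9))
```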
FIG. 8 is a diagram showing an example of the data structure held by the root data storage unit 166.
The root data storage unit 166 stores, for example, the table shown in FIG. 8. The table has a record for each combination of year and field ID, holding the year, the field ID, and, for each week of each period, the root length (an example of root growth data) and the carbon dioxide absorption amount of the roots. When root weight is used as the root growth data, the root weight is set in the weekly fields instead of the root length; when root volume is used, the root volume is set in the weekly fields instead.
FIG. 9 is a flowchart showing the processing in the test data collection phase (S10).
The soil data acquisition unit 140 acquires soil data for each field 220 (S20). The soil data acquisition unit 140 may acquire soil data of a field 220 received from the field terminal 200 by the receiving unit 182, or soil data of a field 220 accepted by the input unit 122.
The growth management device 100 acquires, for each field 220, the weekly satellite images and root growth data of the first period (S22). Specifically, the image acquisition unit 142 uses the communication processing unit 108 to receive satellite images from the image providing server 300, and the satellite images are stored in the image storage unit 162. At this time, the feature amount extraction unit 144 extracts image feature amounts from the satellite images and stores them in the feature amount data storage unit 164. The root data acquisition unit 146 also acquires root growth data for each week of the first period in each field 220; it may acquire root growth data received by the communication processing unit 108 or root growth data accepted by the input unit 122.
The processing of S24 is described later in relation to Modification 1; in the embodiment, S24 may be omitted.
The growth management device 100 acquires the weekly satellite images and root growth data of the second period for each field 220 (S26). The data acquisition and storage processing are the same as in S22.
The processing of S28 is described later in relation to Modification 1; in the embodiment, S28 may be omitted.
The growth management device 100 acquires the weekly satellite images and root growth data of the third period for each field 220 (S30). The data acquisition and storage processing are the same as in S22.
FIG. 10 is a configuration diagram of the neural network in the embodiment.
In the input layer of this neural network, the input nodes are acidity, drainage, fertility, and viscosity (examples of soil data) and the average color of the field range (an example of an image feature amount). The intermediate layer has a plurality of intermediate nodes. In the output layer, the output nodes are the root length (an example of root growth data) and the carbon dioxide absorption amount of the roots; only one of the two may be used as an output node. The input nodes and the intermediate nodes are fully connected, and so are the intermediate nodes and the output nodes. This neural network assumes an estimation model in which, once the values of the soil data and the image feature amount are determined, the root length and the root carbon dioxide absorption amount are determined accordingly. The same network structure is used for the first estimation model of the first period, the second estimation model of the second period, and the third estimation model of the third period.
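A minimal sketch of such a fully connected regressor is shown below, using scikit-learn's MLPRegressor. The hidden-layer width, activation, and other hyperparameters are assumptions; the patent does not specify them.

```python
from sklearn.neural_network import MLPRegressor

def build_estimation_model(hidden_nodes: int = 16) -> MLPRegressor:
    # Five inputs (acidity, drainage, fertility, viscosity, average color),
    # one fully connected hidden layer, two outputs (root length, root CO2 absorption).
    # The input and output widths are fixed implicitly by the data later passed to fit().
    return MLPRegressor(hidden_layer_sizes=(hidden_nodes,),
                        activation="relu",
                        max_iter=2000,
                        random_state=0)
```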
FIG. 11 is a data structure diagram of the teacher data for the first period in the embodiment.
The teacher data for the first period is prepared in order to generate the first estimation model. In this example the teacher data is in table format, with a record for each combination of year, field ID, period, and week. Each record has items for acidity, drainage, fertility, viscosity, and the average color of the field range, corresponding to the input nodes shown in FIG. 10, and items for the root length and the root carbon dioxide absorption amount, corresponding to the output nodes. One record corresponds to one sample of the teacher data, and each sample can be identified by its combination of year, field ID, period, and week.
For example, the first record shown corresponds to the combination of the year 2020, field ID A1, the first period, and the first week. It shows that in 2020 the field 220a with field ID A1 had acidity B1, drainage C1, fertility D1, and viscosity E1; these values are copied from the soil data storage unit 160 by the teacher data generation unit 148. It also shows that the average color of the satellite image taken in the first week of the first period in that field was F1-1-1; this value is copied from the feature amount data storage unit 164 by the teacher data generation unit 148. Further, it shows that the root length measured in the first week of the first period was G1-1-1 and that the root carbon dioxide absorption amount calculated from that root length was H1-1-1; these values are copied from the root data storage unit 166 by the teacher data generation unit 148.
Using this first-period teacher data, the weight data giving the optimal solution is learned by the neural network. The learned weight data is stored in the estimation model storage unit 170. The learning procedure itself, using a neural network, may be conventional.
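The sketch below illustrates this learning step on a table shaped like the teacher data of FIG. 11. The DataFrame column names and the use of scikit-learn are assumptions made for illustration only.

```python
import pandas as pd
from sklearn.neural_network import MLPRegressor

def train_period_model(teacher: pd.DataFrame) -> MLPRegressor:
    # Input-node columns and output-node columns follow the teacher data of FIG. 11
    # (column names are hypothetical).
    X = teacher[["acidity", "drainage", "fertility", "viscosity", "avg_color"]]
    y = teacher[["root_length", "root_co2_absorption"]]
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    model.fit(X, y)   # the fitted coefficients play the role of the stored weight data
    return model
```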
FIG. 12 is a data structure diagram of the teacher data for the second period in the embodiment.
The teacher data for the second period is prepared in order to generate the second estimation model. Its structure is the same as that of the first-period teacher data shown in FIG. 11; for example, the first record shown corresponds to the combination of the year 2020, field ID A1, the second period, and the first week. The teacher data for the third period has the same structure, with values for the third period.
FIG. 13 is a flowchart showing the processing in the estimation model generation phase (S12).
As described above, the teacher data generation unit 148 refers to the soil data storage unit 160, the feature amount data storage unit 164, and the root data storage unit 166 to generate the teacher data for the first period (FIG. 11) (S40).
The estimation model generation unit 150 inputs the first-period teacher data and starts the learning processing by the learning engine 152 (S42). The learning engine 152 executes the learning processing and generates the first estimation model (S44). The first estimation model, including its weight data, is stored in the estimation model storage unit 170.
The teacher data generation unit 148 similarly generates the teacher data for the second period (S46). The estimation model generation unit 150 inputs the second-period teacher data and starts the learning processing by the learning engine 152 (S48). The learning engine 152 executes the learning processing and generates the second estimation model (S50). The second estimation model, including its weight data, is stored in the estimation model storage unit 170.
The teacher data generation unit 148 similarly generates the teacher data for the third period (S52). The estimation model generation unit 150 inputs the third-period teacher data and starts the learning processing by the learning engine 152 (S54). The learning engine 152 executes the learning processing and generates the third estimation model (S56). The third estimation model, including its weight data, is stored in the estimation model storage unit 170.
This processing is performed before the estimation model utilization phase. In this example it takes place between the end of the third period of 2020 and the sowing season of 2021, after which operation begins.
FIG. 14 is a diagram showing the processing in the estimation model utilization phase (S14).
The image acquisition unit 142 waits for the photographing timing of each field 220 (see FIG. 2) (S60) and then acquires a satellite image of the field (S62); the satellite image is acquired in the same way as in the test data collection phase (FIG. 9). The feature amount extraction unit 144 extracts the image feature amount (S64), also in the same way as in the test data collection phase (FIG. 9).
The root growth estimation unit 154 selects the estimation model that matches the current period (S66). That is, it selects the first estimation model if the photographing timing falls in the first period, the second estimation model if it falls in the second period, and the third estimation model if it falls in the third period.
The estimation model utilization unit 156 applies the soil data of the field 220 and the image feature amount extracted from the satellite image acquired in S62 to the selected estimation model; that is, it sets these values on the input nodes of the neural network that performs the calculation of the estimation model (S68). The root growth estimation unit 154 obtains the values of the root length and the root carbon dioxide absorption amount from the output nodes as the calculation result of the estimation model (S70).
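A sketch of steps S66 to S70 under the same illustrative assumptions as the earlier sketches (one trained regressor per period, five input values) follows; the dictionary of models and the argument names are hypothetical.

```python
import numpy as np

def estimate_root_state(models, period, soil_values, avg_color):
    """soil_values: (acidity, drainage, fertility, viscosity); returns (root_length, co2_absorption)."""
    model = models[period]                               # S66: choose the model for the current period
    x = np.array([[*soil_values, avg_color]])            # S68: set the values on the input nodes
    root_length, co2_absorption = model.predict(x)[0]    # S70: read the output nodes
    return root_length, co2_absorption
```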
The output unit 120 then outputs the root length and the root carbon dioxide absorption amount (for example, displays them on a display) (S72). The transmission unit 180 may also transmit the root length and the root carbon dioxide absorption amount to the field terminal 200.
The embodiment thus provides a mechanism that estimates the root growth data and the root carbon dioxide absorption amount by machine learning. In particular, it addresses the problem that, when the roots grown in the first period are left in place, it is not easy to grasp how they continue to grow in the second period.
[Modification 1]
In the second period, the roots carried over from the first period continue to grow. Accordingly, the second period may start from small roots (for example, because the first harvest was early) or from large roots (for example, because the first harvest was late). Assuming a uniform root state at the start of the second period is therefore undesirable for correctly estimating root growth in the second period.
The time of the first harvest is both the end of the first period and the start of the second period. In Modification 1, the first estimation model is used to estimate the root growth data at the time of the first harvest, and the estimated root growth data is then added, as the initial value of the root state in the second period, to the inputs used for estimating the root growth data in the second period.
FIG. 15 is a configuration diagram of the neural network in Modification 1.
The first estimation model of Modification 1 is generated using the same neural network as in the embodiment. The second and third estimation models of Modification 1 are generated using the neural network shown in FIG. 15.
In the neural network of Modification 1 (FIG. 15), an input node is added that corresponds to the root growth data or the root carbon dioxide absorption amount at the end of the previous period; in this example, an input node for the root length at the end of the previous period is added. The intermediate nodes and output nodes are unchanged, and the layers are fully connected as in the embodiment.
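Under the same illustrative assumptions as the earlier sketches, the second- and third-period models of Modification 1 can be sketched by simply adding the extra input column; the column names are hypothetical and not taken from the patent.

```python
import pandas as pd
from sklearn.neural_network import MLPRegressor

def train_period_model_mod1(teacher: pd.DataFrame) -> MLPRegressor:
    X = teacher[["acidity", "drainage", "fertility", "viscosity",
                 "avg_color", "prev_period_end_root_length"]]   # added input node (FIG. 15)
    y = teacher[["root_length", "root_co2_absorption"]]
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    model.fit(X, y)
    return model
```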
At the time of the first harvest described with reference to FIG. 2, a photographed image is acquired and the roots are measured, just as at the weekly photographing timings. The same is done at the time of the second harvest. Extraction of image feature amounts from the photographed images and storage of the data are also performed in the same way as at the weekly photographing timings.
The teacher data for the first period in Modification 1 is the same as in the embodiment (FIG. 11), and the first estimation model is generated in the same way.
FIG. 16 is a data structure diagram of the teacher data for the second period in Modification 1.
In the second-period teacher data of Modification 1, an item for the root length at the end of the first period is added, and the root length measured at the first harvest is set in this field.
FIG. 17 is a data structure diagram of the teacher data for the third period in Modification 1.
Similarly, in the third-period teacher data of Modification 1, an item for the root length at the end of the second period is added, and the root length measured at the second harvest is set in this field.
Accordingly, in the test data collection phase shown in FIG. 9, after S22 the growth management device 100 acquires, for each field 220, the satellite image and root growth data at the time of the first harvest (S24), and after S26 it acquires the satellite image and root growth data at the time of the second harvest (S28). The satellite images, the image feature amounts extracted from them, and the root growth data are acquired in the same way as at the weekly photographing timings.
Regarding the processing in the estimation model generation phase (S12), in S46 of FIG. 13 the teacher data for the second period (FIG. 16), which includes the root length at the end of the first period (at the first harvest), is generated. In S48, the estimation model generation unit 150 inputs this second-period teacher data and generates the second estimation model of Modification 1. In S52, the teacher data for the third period (FIG. 17), which includes the root length at the end of the second period (at the second harvest), is generated. In S54, the estimation model generation unit 150 inputs this third-period teacher data and generates the third estimation model of Modification 1.
Regarding the processing in the estimation model utilization phase (S14), when the first estimation model is selected in S66 of FIG. 14, the processing is the same as in the embodiment.
In Modification 1, the receiving unit 182 receives a notification from the field terminal 200 at the time of the first harvest. Upon receiving this notification, the image acquisition unit 142 acquires a satellite image of the field 220 as in S62, and the feature amount extraction unit 144 extracts the image feature amount as in S64. The estimation model utilization unit 156 then applies the soil data of the field 220 and the extracted image feature amount to the first estimation model, and the root growth estimation unit 154 obtains the values of the root length and the root carbon dioxide absorption amount at the time of the first harvest.
When the second period then begins and the second estimation model is selected in S66, the estimation model utilization unit 156 sets the root length at the time of the first harvest on the input node of the neural network for the root length at the end of the previous period. That is, in addition to the soil data and the image feature amount, the estimation model utilization unit 156 applies the root length at the time of the first harvest to the second estimation model, and the root growth estimation unit 154 obtains second-period values of the root length and the root carbon dioxide absorption amount that take the root length at the first harvest into account.
The receiving unit 182 also receives a notification from the field terminal 200 at the time of the second harvest. Upon receiving this notification, the image acquisition unit 142 acquires a satellite image of the field as in S62, and the feature amount extraction unit 144 extracts the image feature amount as in S64. The estimation model utilization unit 156 then applies the soil data of the field 220 and the extracted image feature amount to the second estimation model, and the root growth estimation unit 154 obtains the values of the root length and the root carbon dioxide absorption amount at the time of the second harvest.
When the third period then begins and the third estimation model is selected in S66, the estimation model utilization unit 156 sets the root length at the time of the second harvest on the input node of the neural network for the root length at the end of the previous period. That is, in addition to the soil data and the image feature amount, the estimation model utilization unit 156 applies the root length at the time of the second harvest to the third estimation model, and the root growth estimation unit 154 obtains third-period values of the root length and the root carbon dioxide absorption amount that take the root length at the second harvest into account.
To summarize, the data storage unit 106 stores, as the teacher data for the first period (FIG. 11), the soil data, the image feature amount extracted from the first photographed image of the field in the first period, and at least one of the root growth data and the root carbon dioxide absorption amount of the predetermined plant at the time the first photographed image was taken. As the teacher data for the second period (FIG. 16), it stores the soil data, the image feature amount extracted from the second photographed image of the field in the second period, at least one of the root growth data and the root carbon dioxide absorption amount of the predetermined plant at the time the second photographed image was taken, and at least one of the root growth data and the root carbon dioxide absorption amount of the predetermined plant at the end of the first period. As the teacher data for the third period (FIG. 17), it stores the soil data, the image feature amount extracted from the third photographed image of the field in the third period, at least one of the root growth data and the root carbon dioxide absorption amount of the predetermined plant at the time the third photographed image was taken, and at least one of the root growth data and the root carbon dioxide absorption amount of the predetermined plant at the end of the second period.
The estimation model generation unit 150 uses the neural network of FIG. 10 and the first-period teacher data (FIG. 11) to generate, by machine learning, a first estimation model that estimates, from the soil data and the image feature amount of the first photographed image, at least one of the root growth data and the root carbon dioxide absorption amount of the predetermined plant at the first photographing time. Using the neural network of FIG. 15 and the second-period teacher data (FIG. 16), it generates, by machine learning, a second estimation model that estimates, from the soil data, the image feature amount of the second photographed image, and at least one of the root growth data and the root carbon dioxide absorption amount at the end of the first period, at least one of the root growth data and the root carbon dioxide absorption amount of the predetermined plant at the second photographing time. Further, using the neural network of FIG. 15 and the third-period teacher data (FIG. 17), it generates, by machine learning, a third estimation model that estimates, from the soil data, the image feature amount of the third photographed image, and at least one of the root growth data and the root carbon dioxide absorption amount at the end of the second period, at least one of the root growth data and the root carbon dioxide absorption amount of the predetermined plant at the third photographing time.
As a precondition for the second-period estimation in the operation stage, the root growth estimation unit 154 uses the first estimation model to estimate, from the soil data and the image feature amount extracted from the first photographed image at the end of the first period, at least one of the root growth data and the root carbon dioxide absorption amount of the predetermined plant at the end of the first period. In the second-period estimation, the root growth estimation unit 154 then uses the second estimation model to estimate, from the soil data, the image feature amount extracted from the second photographed image, and the estimated end-of-first-period value, at least one of the root growth data and the root carbon dioxide absorption amount of the predetermined plant at the time the second photographed image was taken.
Further, as a precondition for the third-period estimation, the root growth estimation unit 154 uses the second estimation model to estimate, from the soil data and the image feature amount extracted from the second photographed image at the end of the second period, at least one of the root growth data and the root carbon dioxide absorption amount of the predetermined plant at the end of the second period. In the third-period estimation, the root growth estimation unit 154 then uses the third estimation model to estimate, from the soil data, the image feature amount extracted from the third photographed image, and the estimated end-of-second-period value, at least one of the root growth data and the root carbon dioxide absorption amount of the predetermined plant at the time the third photographed image was taken.
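The chained use of the models described above can be sketched as follows, again under the illustrative assumptions of the earlier sketches (scikit-learn regressors with five or six input values); the function and variable names are hypothetical.

```python
import numpy as np

def estimate_second_period(model1, model2, soil_values, color_end_period1, color_period2):
    """Estimate period-2 root length and CO2 absorption, carrying over the estimated
    end-of-period-1 root length as an extra input of the second model."""
    x1 = np.array([[*soil_values, color_end_period1]])
    root_len_end_p1, _ = model1.predict(x1)[0]        # estimated root state at the first harvest
    x2 = np.array([[*soil_values, color_period2, root_len_end_p1]])
    return model2.predict(x2)[0]                      # period-2 (root length, CO2 absorption)
```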
The state of the roots at the start of the second period cannot be observed directly because the roots are in the soil; the first estimation model is therefore used to estimate the root state at the end of the first period, and that estimate is fed into the second estimation model. Likewise, the root state at the start of the third period cannot be observed, so the second estimation model estimates the root state at the end of the second period and that estimate is fed into the third estimation model. These measures address a problem specific to plants that can be cropped over multiple periods with the roots left in place, which does not arise for single-period crops or trees; such attention to the roots of these plants has not been paid before. The idea of estimating the root state in the second and third periods stems from the inventors' insight and contributes greatly to promoting carbon dioxide reduction businesses that make use of the roots of multi-period crops (for example, sorghum).
[Modification 2]
In the embodiment, the growth management device 100 has both the function of a machine learning device that generates the estimation models by machine learning and the function of an estimation device that uses the estimation models to obtain the root growth data and the root carbon dioxide absorption amount. The machine learning device and the estimation device may, however, be provided separately.
FIG. 18 is a configuration diagram of a growth management system.
The machine learning device 190 performs the processing of the test data collection phase (S10) and the estimation model generation phase (S12) and generates the estimation models by machine learning. In the estimation model utilization phase (S14), the estimation device 192 acquires the satellite images and uses the estimation models to obtain the root growth data and the root carbon dioxide absorption amount. The estimation models generated by the machine learning device 190 are therefore transferred to the estimation device 192 before moving from the estimation model generation phase (S12) to the estimation model utilization phase (S14). The machine learning device 190 is used, for example, by a service provider who supplies estimation models, and the estimation device 192 is used, for example, by a farmer who uses them.
[Modification 3]
The growth management device 100 may further include a carbon price calculation unit (not shown) that calculates a carbon price per plant by multiplying the estimated root carbon dioxide absorption amount per plant by a carbon price (rate) per unit amount. The carbon price calculation unit may instead calculate a carbon price per unit area by multiplying the estimated root carbon dioxide absorption amount per unit area by the carbon price (rate) per unit amount. The output unit 120 may output the carbon price per plant or per unit area (for example, display it on a display), and the transmission unit 180 may transmit it to the field terminal 200.
Similarly, the estimation device 192 may include a carbon price calculation unit that calculates the carbon price per plant by multiplying the estimated root carbon dioxide absorption amount per plant by the carbon price (rate) per unit amount, or the carbon price per unit area from the absorption amount per unit area. The output unit 120 may output the carbon price per plant or per unit area (for example, display it on a display), and the transmission unit 180 may transmit it to the field terminal 200.
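A minimal sketch of this calculation follows; the function name and the numeric values are placeholders for illustration, not figures from the patent.

```python
def carbon_price(co2_absorption: float, rate_per_unit: float) -> float:
    """Carbon price = estimated CO2 absorption (per plant or per unit area) x price per unit amount."""
    return co2_absorption * rate_per_unit

# Hypothetical example: 0.8 units of CO2 absorbed per plant at a rate of 50 per unit amount.
price_per_plant = carbon_price(0.8, 50.0)
```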
[Other Modifications]
Instead of satellite images, drone images of the field 220 taken from above by a camera on a drone flying over the field 220 may be used. Drone images have more pixels than satellite images, which makes it easier to obtain the proportion of green in the field range.
Instead of satellite images, camera images taken from diagonally above or from the horizontal direction by a camera installed in the field 220 (hereinafter, an "installed camera") may be used. Camera images taken from the horizontal direction have the advantage that the plant height is easy to obtain.
The embodiment and the modifications mainly show examples in which both the root growth data and the root carbon dioxide absorption amount are obtained, but only one of them may be used. Moreover, since the root growth data and the root carbon dioxide absorption amount are strongly correlated, the two may be used interchangeably as the indicator in the embodiment and the modifications.
In the test stage, the field 220 may be divided into a plurality of sections, each section may be treated as one field 220, and test cultivation may be performed while varying the harvest timing and the soil. In this way, many samples can be included in the teacher data.
The present invention is not limited to the above embodiment and modifications, and the constituent elements may be modified and embodied without departing from the gist of the invention. Various inventions may be formed by appropriately combining a plurality of the constituent elements disclosed in the above embodiment and modifications, and some constituent elements may be deleted from all of the constituent elements shown therein.

Claims (9)

  1.  A machine learning device comprising:
     a storage unit that stores soil data of a field in which a predetermined plant is cultivated, an image feature amount extracted from a photographed image of the field, and at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at the time the photographed image was taken; and
     a model generation unit that generates, by machine learning, an estimation model that estimates, from the soil data and the image feature amount extracted from the photographed image, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time of photographing.
  2.  An estimation device comprising:
     a storage unit that stores soil data of a field in which a predetermined plant is cultivated;
     an image acquisition unit that acquires a photographed image of the field;
     an extraction unit that extracts an image feature amount from the acquired photographed image; and
     an estimation unit that estimates, from the stored soil data and the image feature amount extracted from the acquired photographed image, at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at the time the acquired photographed image was taken, using an estimation model machine-learned with, as input data, soil data of a field in which the predetermined plant was cultivated in past cultivation and an image feature amount extracted from a photographed image of that field in the past cultivation, and with, as output data, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time that photographed image was taken.
  3.  A machine learning device comprising:
     a storage unit that stores soil data of a field in which a predetermined plant that can be cultivated for a second period with its roots left after a first-period harvest is cultivated, an image feature amount extracted from a first photographed image of the field in the first period, at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at a first photographing time of the first photographed image, an image feature amount extracted from a second photographed image of the field in the second period, and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at a second photographing time of the second photographed image; and
     a model generation unit that generates, by machine learning, a first estimation model that estimates, from the soil data and the image feature amount of the first photographed image, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the first photographing time, and generates, by machine learning, a second estimation model that estimates, from the soil data and the image feature amount of the second photographed image, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the second photographing time.
  4.  An estimation device comprising:
     a storage unit that stores soil data of a field in which a predetermined plant that can be cultivated for a second period with its roots left after a first-period harvest is cultivated;
     an image acquisition unit that acquires a first photographed image of the field in the first period or a second photographed image of the field in the second period;
     an extraction unit that extracts an image feature amount from the first photographed image or the second photographed image; and
     an estimation unit that,
     when the first photographed image is acquired, estimates, from the stored soil data and the image feature amount extracted from the first photographed image, at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at the time the first photographed image was taken, using a first estimation model machine-learned with, as input data, soil data of a field in which the predetermined plant was cultivated in past cultivation and an image feature amount extracted from a photographed image of that field in the first period of the past cultivation, and with, as output data, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time that first-period photographed image was taken, and
     when the second photographed image is acquired, estimates, from the stored soil data and the image feature amount extracted from the second photographed image, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the second photographed image was taken, using a second estimation model machine-learned with, as input data, the soil data of the field in which the predetermined plant was cultivated in the past cultivation and an image feature amount extracted from a photographed image of that field in the second period of the past cultivation, and with, as output data, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time that second-period photographed image was taken.
  5.  A machine learning device comprising:
     a storage unit that stores soil data of a field in which a predetermined plant that can be cultivated for a second period with its roots left after a first-period harvest is cultivated, an image feature amount extracted from a first photographed image of the field in the first period, at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at a first photographing time of the first photographed image, an image feature amount extracted from a second photographed image of the field in the second period, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at a second photographing time of the second photographed image, and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period; and
     a model generation unit that generates, by machine learning, a first estimation model that estimates, from the soil data and the image feature amount of the first photographed image, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the first photographing time, and generates, by machine learning, a second estimation model that estimates, from the soil data, the image feature amount of the second photographed image, and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the second photographing time.
  6.  An estimation device comprising:
     a storage unit that stores soil data of a field in which a predetermined plant that can be cultivated for a second period with its roots left after a first-period harvest is cultivated;
     an image acquisition unit that acquires a first photographed image of the field at the end of the first period and further acquires a second photographed image of the field in the second period;
     an extraction unit that extracts an image feature amount from the first photographed image and further extracts an image feature amount from the second photographed image; and
     an estimation unit that
     estimates, from the stored soil data and the image feature amount extracted from the first photographed image, at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period, using a first estimation model machine-learned with, as input data, soil data of a field in which the predetermined plant was cultivated in past cultivation and an image feature amount extracted from a photographed image of that field in the first period of the past cultivation, and with, as output data, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time that first-period photographed image was taken, and
     estimates, from the stored soil data, the image feature amount extracted from the second photographed image, and at least one of the estimated root growth data of the predetermined plant and the estimated carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time the second photographed image was taken, using a second estimation model machine-learned with, as input data, the soil data of the field in which the predetermined plant was cultivated in the past cultivation, an image feature amount extracted from a photographed image of that field in the second period of the past cultivation, and at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the end of the first period in the past cultivation, and with, as output data, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time that second-period photographed image was taken.
  7.  A program that causes a computer to implement:
     a function of storing soil data of a field in which a predetermined plant is cultivated, an image feature amount extracted from a photographed image of the field, and at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at the time the photographed image was taken; and
     a function of generating, by machine learning, an estimation model that estimates, from the soil data and the image feature amount extracted from the photographed image, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time of photographing.
  8.  A program that causes a computer to implement:
     a function of storing soil data of a field in which a predetermined plant is cultivated;
     a function of acquiring a photographed image of the field;
     a function of extracting an image feature amount from the acquired photographed image; and
     a function of estimating, from the stored soil data and the image feature amount extracted from the acquired photographed image, at least one of root growth data of the predetermined plant and a carbon dioxide absorption amount of the roots of the predetermined plant at the time the acquired photographed image was taken, using an estimation model machine-learned with, as input data, soil data of a field in which the predetermined plant was cultivated in past cultivation and an image feature amount extracted from a photographed image of that field in the past cultivation, and with, as output data, at least one of the root growth data of the predetermined plant and the carbon dioxide absorption amount of the roots of the predetermined plant at the time that photographed image was taken.
  9.  The estimation device according to claim 2, 4, or 6, further comprising a carbon price calculation unit that calculates a carbon price based on the estimated carbon dioxide absorption amount.
PCT/JP2022/038750 2021-10-29 2022-10-18 Machine learning device, estimation device, and program WO2023074466A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021177864A JP2023066966A (en) 2021-10-29 2021-10-29 Machine learning apparatus, estimation apparatus, and program
JP2021-177864 2021-10-29

Publications (1)

Publication Number Publication Date
WO2023074466A1 true WO2023074466A1 (en) 2023-05-04

Family

ID=86157735

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/038750 WO2023074466A1 (en) 2021-10-29 2022-10-18 Machine learning device, estimation device, and program

Country Status (2)

Country Link
JP (1) JP2023066966A (en)
WO (1) WO2023074466A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4445788A (en) * 1982-04-30 1984-05-01 The Board Of Regents Of The University Of Nebraska Soil probe and method of obtaining moisture, temperature and root distribution of a soil profile
JP2007166967A (en) * 2005-12-21 2007-07-05 Kyowa Engineering Consultants Co Ltd Method and apparatus for evaluating healthiness of tree
CN103141320A (en) * 2013-03-27 2013-06-12 西北农林科技大学 Automatic plant root system monitoring system


Also Published As

Publication number Publication date
JP2023066966A (en) 2023-05-16

Similar Documents

Publication Publication Date Title
Khan et al. Estimation of vegetation indices for high-throughput phenotyping of wheat using aerial imaging
CN109829234B (en) A kind of across scale Dynamic High-accuracy crop condition monitoring and yield estimation method based on high-definition remote sensing data and crop modeling
WO2011102520A1 (en) Method of generating paddy rice crop yield forecasting model, and method of forecasting crop yield of paddy rice
CN110909679B (en) Remote sensing identification method and system for fallow crop rotation information of winter wheat historical planting area
CN109211801A (en) A kind of crop nitrogen demand real time acquiring method
CN109757175A (en) A kind of corn water-fertilizer integral variable fertilization method based on unmanned plane monitoring
Sarkar et al. High‐throughput measurement of peanut canopy height using digital surface models
CN112836575A (en) Multi-time-sequence image rice yield estimation method based on crop phenological period
Yang et al. A VI-based phenology adaptation approach for rice crop monitoring using UAV multispectral images
CN112304902A (en) Real-time monitoring method and device for crop phenology
CN109523550B (en) Five-factor wheat seedling emergence condition evaluation method
WO2023074466A1 (en) Machine learning device, estimation device, and program
Zhou et al. Improved yield prediction of Ratoon rice using unmanned aerial vehicle-based multi-temporal feature method
CN107437262B (en) Crop planting area early warning method and system
CN115619286B (en) Method and system for evaluating quality of sample plot of breeding field district
KR20220123348A (en) information processing unit
CN115909063A (en) Medium-resolution rice extraction method and system
CN115761475A (en) Online monitoring and recognizing system for corn and wheat seedlings
Zobel Lolium perenne L. root systems are a collection of Gaussian curve shaped meso diameter class length distributions
Sheng et al. Evaluation of CLM-Crop for maize growth simulation over Northeast China
Zeng et al. Calibration of the Crop model in the Community Land Model
Huang et al. The estimation of winter wheat yield based on MODIS remote sensing data
CN116297243B (en) Method and device for estimating dressing amount of flue-cured tobacco nitrogenous fertilizer, electronic equipment and storage medium
Honkavaara et al. Precision agriculture in practice–utilisation of novel remote sensing technologies in grass silage production
CN115861827B (en) Decision method and device for crop water and fertilizer stress and mobile phone terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22886794

Country of ref document: EP

Kind code of ref document: A1