WO2020022215A1 - Information processing device, information processing method, and program - Google Patents
Information processing device, information processing method, and program
- Publication number
- WO2020022215A1 (PCT/JP2019/028464)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- feature
- target objects
- area
- estimation
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01G—HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
- A01G7/00—Botany in general
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
Definitions
- the present invention relates to an information processing device, an information processing method, and a program.
- Patent Document 1 proposes a method of detecting flower areas from a captured image and counting the flowers by using an image processing technique. Further, by using the partial detector of Patent Document 2, an object can be detected even when it is partially hidden (for example, when part of a crop, which is the object, is hidden by leaves or the like). Thus, even when objects are partially hidden, their number can be obtained with higher accuracy.
- However, Patent Documents 1 and 2 cannot support the realization of a mechanism for estimating the total number of objects when some or all of them cannot be detected.
- An information processing apparatus includes: a feature acquisition unit configured to acquire, from an image of a region that is a part of a field where a crop is grown, a feature amount of the region related to the number of target objects detected from the image; a number acquisition unit configured to acquire the actual number of target objects existing in a set area of the field; and a learning unit configured to learn an estimation parameter for estimating the actual number of target objects present in a designated area of the field, using as learning data the feature amount acquired by the feature acquisition unit from an image capturing the set area and the actual number acquired by the number acquisition unit.
- According to the present invention, it is possible to support the realization of a mechanism for estimating the total number of objects even when some or all of the objects whose number is to be obtained cannot be detected.
- FIG. 1 is a diagram illustrating an example of a hardware configuration of an estimation device. FIG. 2 is a diagram illustrating an example of a functional configuration of the estimation device.
- FIG. 3 is a diagram illustrating an example of a table for managing learning data.
- FIG. 4 is a diagram illustrating an example of a state in which a part of an object is hidden by a leaf.
- FIG. 5 is a diagram illustrating an example of a table for managing estimation data. FIG. 6 is a flowchart showing an example of a learning process. FIG. 7 is a flowchart showing an example of an estimation process.
- Further figures show, for the other embodiments, an example of a functional configuration of the estimation device, examples of tables for managing learning data and estimation data, and flowcharts showing examples of the learning process and the estimation process.
- A diagram illustrates an example of a table for managing correction information, and a flowchart shows an example of an estimation process.
- A diagram illustrates an example of a system configuration of an information processing system, and diagrams each show an example of a display screen of an estimation result.
- the estimation device 100 learns an estimation parameter, which is a parameter used for estimating the number of objects included in a designated area, and estimates the number of objects included in the designated area based on the learned estimation parameter; this process will be described below.
- FIG. 1 is a diagram illustrating an example of a hardware configuration of the estimation device 100 according to the present embodiment.
- the estimation device 100 is an information processing device, such as a personal computer, a server device, or a tablet device, that estimates the number of objects included in a designated area.
- the estimation device 100 includes a CPU 101, a RAM 102, a ROM 103, a network I / F 104, a VRAM 105, an input controller 107, an HDD 109, and an input I / F 110.
- the components are communicably connected to each other via a system bus 111.
- the CPU 101 is a central processing unit that controls the estimation device 100 as a whole.
- the RAM 102 is a Random Access Memory, and functions as a main memory of the CPU 101, a work memory necessary for loading an execution program and executing a program, and the like.
- the ROM 103 is a Read Only Memory, and stores, for example, various programs and various setting information.
- the ROM 103 includes a program ROM in which basic software (OS), which is a system program for controlling equipment of the computer system, is stored, and a data ROM in which information necessary for operating the system is stored. Also, the HDD 109 may store programs and information stored in the ROM 103.
- the network I / F 104 is a network interface, and is used for input / output control of data such as image data transmitted and received via a network such as a local area network (LAN). It is assumed that the network I / F 104 is an interface corresponding to a network medium such as a wired or wireless network.
- the VRAM 105 is a video RAM in which image data displayed on the screen of the display 106 as a display device is expanded.
- the display 106 is a display device, for example, a liquid crystal display or a liquid crystal panel.
- the input controller 107 is a controller used for controlling an input signal from the input device 108.
- the input device 108 is an external input device for receiving an operation instruction from a user, and is, for example, a touch panel, a keyboard, a pointing device, a remote controller, or the like.
- the HDD 109 is a hard disk drive and stores application programs and data such as moving image data and image data.
- the application program stored in the HDD 109 is, for example, a highlight moving image creation application or the like.
- the input I / F 110 is an interface used for connection with an external device such as a CD (DVD) -ROM drive, a memory card drive, etc., and is used, for example, for reading image data captured by a digital camera.
- the system bus 111 is an input / output bus for connecting the respective hardware components of the estimation device so as to be able to communicate with each other, and is, for example, an address bus, a data bus, a control bus, or the like.
- the CPU 101 executes processes based on programs stored in the ROM 103, the HDD 109, and the like, thereby realizing the functions of the estimation device 100 described later with reference to FIGS. 2, 8, and 15, and the processing of the flowcharts described later with reference to FIGS. 6, 7, 13, 14, and 17.
- the object whose number is to be estimated is a crop (for example, fruits, flowers, or bunches of grapes).
- the object whose number is to be estimated is referred to as a target object.
- an object that can hinder detection of the target object is referred to as an obstruction.
- in the present embodiment, the obstruction is a leaf.
- the obstructions may be not only leaves but also trees and stems.
- the target object is not limited to agricultural products, and may be a person or a car. In that case, the obstruction may be, for example, a building.
- the estimation device 100 detects target objects from a captured image of a target region for which the number of target objects is to be estimated, and acquires a feature amount indicating a feature of the region, determined based on the number of detected target objects. Then, based on the acquired feature amount and the number of target objects actually included in the region, the estimation device 100 learns an estimation parameter, that is, a parameter used for estimating the actual number of target objects included in the region.
- the actual number of target objects included in a region is defined as the actual number of target objects in the region.
- the estimation device 100 also detects target objects from an image of a designated region that is the target of number estimation, and obtains a feature amount indicating a feature of the region based on the number of detected target objects. Then, the estimation device 100 estimates the actual number of target objects included in the region based on the obtained feature amount and the learned estimation parameter.
- FIG. 2 is a diagram illustrating an example of a functional configuration of the estimation device 100 according to the present embodiment.
- the estimation device 100 includes a number acquisition unit 201, an image acquisition unit 202, a learning unit 203, a feature amount acquisition unit 204, a parameter management unit 205, an estimation unit 206, and a display control unit 207.
- the number acquisition unit 201 acquires the actual number of target objects included in a preset region, obtained by counting manually or the like.
- the number acquisition unit 201 acquires the actual number by reading, for example, a text file in which the actual number of target objects in a preset area is recorded from the HDD 109 or the like. Further, the number obtaining unit 201 may receive an input of the actual number via the input device 108.
- the image acquisition unit 202 acquires, for example, from an external imaging device or the like, an image in which a preset area including a target object has been photographed, and stores the acquired image in the HDD 109 or the like. In the present embodiment, it is assumed that each of the preset regions is the entire region photographed in the corresponding image.
- the feature amount acquisition unit 204 detects target objects from the image acquired by the image acquisition unit 202 using an object detection technique, and, based on the number of detected target objects, acquires a feature amount indicating a feature of the preset region in which the detected target objects exist. In the following, the number of target objects detected from a certain region by the feature amount acquisition unit 204 is called the number of detections in that region. In the present embodiment, the feature amount acquisition unit 204 acquires the number of detections in a region as the feature amount indicating the feature of that region.
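The feature acquisition described above can be sketched as follows. `detect_target_objects` is a hypothetical stand-in for whatever object detection technique is used (the description does not prescribe a specific detector), and the feature amount is simply the length of its detection list.

```python
def detect_target_objects(image):
    """Hypothetical stand-in for the object detection technique: returns one
    bounding box per visible target object (e.g. a grape bunch). A real
    system would call an actual detector here."""
    # Placeholder result: pretend three target objects were detected.
    return [(10, 10, 50, 50), (60, 20, 100, 70), (30, 80, 70, 120)]

def acquire_feature_amount(image):
    """In this embodiment, the feature amount of a region is simply the
    number of target objects detected in its image (the number of detections)."""
    return len(detect_target_objects(image))

print(acquire_feature_amount(None))  # 3 with the placeholder detector
```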
- the process of acquiring a feature amount by the feature amount acquisition unit 204 is an example of a feature acquisition process.
- the learning unit 203 performs the following processing for each image acquired by the image acquisition unit 202. That is, the learning unit 203 acquires the actual number of target objects included in the preset region corresponding to the image, acquired by the number acquisition unit 201, and the feature amount of that region, acquired by the feature amount acquisition unit 204. Then, the learning unit 203 learns, by machine learning, the estimation parameters used for estimating the actual number of target objects included in a designated region, based on the acquired actual number and feature amount. In the present embodiment, the learning unit 203 uses linear regression as the machine learning method and learns the parameters of the linear regression as the estimation parameters. However, the learning unit 203 may instead learn parameters of another method, such as a support vector machine, as the estimation parameters.
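As a concrete illustration of the learning step, the linear-regression parameters can be fitted by ordinary least squares on pairs of detection count and actual number. The data values below are made up for illustration, and NumPy is used purely as one possible tool (the description does not prescribe a library).

```python
import numpy as np

# Learning data as in table 301: pairs of (number of detections, actual number).
# The values below are illustrative, not taken from the description.
detections = np.array([5.0, 8.0, 11.0, 6.0, 9.0])
actuals = np.array([11.0, 17.0, 23.0, 13.0, 19.0])

# Learn the estimation parameters w0, w1 of the linear regression
#   estimated actual number = w0 + w1 * (number of detections)
# by ordinary least squares.
w1, w0 = np.polyfit(detections, actuals, 1)
print(w0, w1)  # approximately 1.0 and 2.0 for this sample data
```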
- the parameter management unit 205 stores the estimated parameters learned by the learning unit 203 in the HDD 109 or the like and manages them.
- the estimation unit 206 performs the following processing based on the feature amount acquired by the feature amount acquisition unit 204 from an image of the region whose number of target objects is to be estimated, and the learned estimation parameters managed by the parameter management unit 205. That is, the estimation unit 206 estimates the actual number of target objects included in that region.
- FIG. 3 is a diagram illustrating an example of a table that manages the actual number of target objects acquired by the number acquiring unit 201 and the number of target objects detected by the feature amount acquiring unit 204 as learning data.
- the table 301 includes items of ID, image file, number of detections, and actual number.
- the item of ID indicates identification information for identifying learning data.
- the item of the image file indicates which image the corresponding learning data is generated using.
- the item of the number of detections indicates the number of target objects detected from the image indicated by the item of the corresponding image file.
- the item of the actual number indicates the number of target objects actually included in the specific region photographed in the image indicated by the corresponding image file item (for example, a number that includes target objects hidden by leaves and not visible in the image).
- the table 301 is stored in, for example, the HDD 109 or the like.
- an image (IMG_0001.jpg) indicated by the image file item corresponding to the learning data with the ID of 1 will be described with reference to FIG. 4.
- the number acquisition unit 201 acquires the actual number of target objects in one or more specific areas in advance.
- the feature amount acquisition unit 204 detects target objects from each of a plurality of images in which any of the one or more specific regions is captured, and obtains the number of detections in advance. Then, the number acquisition unit 201 and the feature amount acquisition unit 204 store the acquired actual numbers and detection counts in the HDD 109 or the like as learning data in the format of the table 301 shown in FIG. 3. Thus, the learning data used for learning is prepared in advance.
- the image file or its feature amount is referred to as input data.
- the actual number corresponding to the input data is called correct data.
- the learned estimation parameter is also called a learned model.
- FIG. 5 is a diagram showing an example of a table for managing the number of target objects detected by the feature amount acquisition unit 204 from an image of a region whose actual number of target objects is to be estimated, and the actual number of target objects in that region estimated by the estimation unit 206.
- the table 401 includes items of ID, image file, number of detections, and estimated value.
- the item of ID indicates identification information for identifying an area in which the actual number of target objects is estimated.
- the item of the image file indicates the image used for estimating the actual number.
- the item of the number of detections indicates the number of target objects (the number of detections) detected by the feature amount acquiring unit 204 from the image indicated by the item of the corresponding image file.
- the item of the estimated value indicates the number of target objects estimated by the estimating unit 206.
- the table 401 is stored in, for example, the HDD 109 or the like.
- FIG. 6 is a flowchart showing an example of the estimation parameter learning process.
- in step S501, the number acquisition unit 201 acquires, for example from a text file stored in the HDD 109, the file name of an image file in which a predetermined region is captured and the actual number of target objects included in that region. Then, the number acquisition unit 201 registers the acquired file name and actual number in the table 301 stored in the HDD 109. It is assumed that the HDD 109 stores in advance a text file in which image file names and actual numbers are recorded in association with each other, in a format such as CSV.
- specifically, the number acquisition unit 201 acquires, for each of a plurality of preset regions, the file name of an image file in which the region is photographed and the actual number of target objects included in the region. Then, the number acquisition unit 201 registers each pair of file name and actual number acquired for the plurality of regions in the table 301.
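A minimal sketch of the CSV reading in step S501, under the assumption of a two-column layout of image file name and actual number (the exact column layout is not specified in the description, and the sample values are made up):

```python
import csv
import io

# Illustrative CSV content; in the description, a text file on the HDD 109
# records image file names and actual numbers in association with each other.
csv_text = """IMG_0001.jpg,18
IMG_0002.jpg,25
IMG_0003.jpg,9
"""

def read_actual_numbers(stream):
    """Return (image file name, actual number) pairs from a CSV stream."""
    return [(name, int(actual)) for name, actual in csv.reader(stream)]

records = read_actual_numbers(io.StringIO(csv_text))
print(records[0])  # ('IMG_0001.jpg', 18)
```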
- in step S502, the feature amount acquisition unit 204 detects, for each image file name registered in the table 301 in step S501, the target objects from the image indicated by that file name, and obtains the number of detections as the feature amount of the region captured in the image.
- step S503 the feature amount acquiring unit 204 registers, for example, the number of detections (feature amounts) acquired in step S502 in the table 301 stored in the HDD 109.
- in step S504, the learning unit 203 learns the estimation parameters (in this embodiment, the parameters of the linear regression) using the pairs of the number of detections (feature amount) and the actual number registered in the table 301.
- the linear regression is represented by the following Expression (1):
- Estimated actual number = w0 + w1 × (number of detections) … Expression (1)
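A worked example of Expression (1); the parameter values w0 and w1 used here are illustrative, not learned values from the description:

```python
def estimate_actual_number(w0, w1, num_detections):
    """Expression (1): estimated actual number = w0 + w1 * (number of detections)."""
    return w0 + w1 * num_detections

# With illustrative (made-up) learned parameters w0 = 1.2 and w1 = 2.0,
# a region with 8 detected target objects is estimated to hold 17.2 in total.
print(estimate_actual_number(1.2, 2.0, 8))  # 17.2
```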
- step S505 the parameter management unit 205 starts management by, for example, storing the estimated parameters learned in step S504 in the HDD 109.
- FIG. 7 is a flowchart showing an example of the estimation process using the estimation parameters learned by the process of FIG. 6.
- in step S601, the estimation unit 206 requests from the parameter management unit 205 the estimation parameters learned in the process of FIG. 6.
- the parameter management unit 205 acquires from the HDD 109 the estimation parameters learned in S504 and stored in S505, and transmits the acquired estimation parameters to the estimation unit 206.
- in step S602, the feature amount acquisition unit 204 detects target objects from an image capturing the region designated as the target for estimating the number of target objects, and acquires the number of detections.
- supplying an image obtained by photographing at least a part of the field as the processing target in S602 is equivalent to designating the region captured in that image as the target of the process of estimating the number of target objects. If a plurality of images are designated, the same processing is performed for all of them.
- the feature amount acquiring unit 204 registers the acquired number of detections in the table 401 stored in the HDD 109, for example, in association with the image file name of the image.
- in step S603, the estimation unit 206 estimates the number of target objects included in the target region based on the estimation parameters acquired in step S601 and the number of detections acquired in step S602.
- specifically, the estimation unit 206 obtains an estimated value of the number of target objects included in the region using Expression (1), based on the estimation parameters w0 and w1 acquired in S601 and the number of detections acquired in S602.
- the estimating unit 206 outputs the obtained estimated value by registering it in the table 401.
- the estimating unit 206 may output the obtained estimated value by displaying it on the display 106.
- the estimated values registered in the table 401 may be used, for example, to predict the yield of the crop that is the target object, and to visualize the predicted high- and low-yield areas on a map.
- as described above, the estimation device 100 detects target objects from an image capturing a preset region, and acquires a feature amount indicating a feature of the region based on the number of detected target objects (the number of detections). Then, the estimation device 100 learns the estimation parameters based on the acquired feature amount and the actual number of target objects included in the region. By using the learned estimation parameters, the actual number of target objects included in a region can be estimated from a feature amount corresponding to the number of target objects detected in the region. That is, by learning the estimation parameters, the estimation device 100 can support the realization of a mechanism for estimating the total number of objects even when some or all of the target objects cannot be detected.
- in addition, the estimation device 100 performs the following processing based on the learned estimation parameters and the feature amount of the target region obtained from the number of target objects detected in its captured image. That is, the estimation device 100 estimates the actual number of target objects included in the region. In this way, the estimation device 100 can estimate the actual number of target objects included in a region from the feature amount based on the number of target objects detected there. Accordingly, the estimation device 100 can support the realization of a mechanism for estimating the total number of objects even when some or all of the target objects cannot be detected.
- the estimating apparatus 100 generates learning data used for learning the estimation parameters in advance, and stores the learning data in the HDD 109 as the table 301.
- the estimation device 100 can support the realization of a mechanism for estimating the total number of objects even when some or all of the target objects cannot be detected by preparing learning data used for learning the estimation parameters.
- the number of detections is the number of target objects detected from the image.
- the number of detections may instead be, for example, the number of target objects counted visually by a person.
- the estimation device 100 accepts the designation of the number of detections, for example, based on a user operation via the input device 108.
- the estimation device 100 may use the number of people detected via a human sensor as the number of detections, and the number of people actually existing in the area as the actual number. For example, the estimation device 100 may learn, from pairs of the number of detections and the actual number at each of a plurality of time points in the area, estimation parameters for estimating the actual number of people included in the area from the feature amount acquired from the number of detections. In addition, the estimation device 100 may estimate the actual number of people included in the area at a designated time using the learned estimation parameters, and may generate in advance pairs of the detected number and the actual number at a plurality of time points as learning data used for learning the estimation parameters.
- <Usage example> An example of use of a system that presents to a user the number of target objects obtained by the processing of the estimation device 100 of the present embodiment, and the crop yield that can be predicted based on that number, will be described.
- This system includes an estimation device 100.
- the user of this system can utilize the number of target objects estimated by the estimation device 100 in subsequent work and in production plans for processed products.
- the processing of the present embodiment can be suitably applied to a case where grapes for wine production are cultivated as agricultural products.
- the production control of grapes for wine production will be described as an example.
- conventionally, sampling surveys at a plurality of locations in a field, or of a plurality of trees, have been performed.
- the growth condition of a given tree may vary depending on the place or the year, for example because the geographical and climatic conditions are not uniform.
- the learned model is trained so that when the number of target objects detected in an image is small, the estimated actual number also tends to be small. Therefore, even if a tree is in a worse growth state than the trees on which the sampling survey was performed, due for example to the influence of geographical conditions, the number of target objects estimated from an image of that tree will be smaller than for the sampled trees.
- the processing of the present embodiment thus enables more accurate estimation regardless of the positions where the sampling survey is performed.
- FIGS. 19 to 21 are diagrams each showing an example of a display screen of an estimation result output by the estimation device 100 when the system according to this usage example is introduced at a site producing grapes for wine.
- the display control unit 207 generates the display screens of FIGS. 19 to 21 based on the estimation result in S603 and displays the display screens on the display 106.
- the display control unit 207 may also generate the display screens of FIGS. 19 to 21 based on the estimation result in S603, transmit the generated display screens to an external device, and control the display unit (a display or the like) of the destination device to display them.
- the screen 1900 is a screen showing, for each of the seven blocks included in the field, an identifier (ID), an area, and an estimated value of the grape harvest amount for the corresponding block.
- the display control unit 207 obtains an estimated value of the weight (unit: t) of the grapes to be harvested based on the total of the results of the processing in S603 (the estimated number of grape bunches to be harvested) over all the images of the corresponding block. Then, the display control unit 207 includes the obtained weight in the screen 1900 as the estimated value of the grape harvest amount. Expressing the grape yield by weight rather than by the number of bunches makes it easier to use for estimating wine production. In the example of FIG. 19, a predicted value indicating that 19.5 t of grapes will be harvested in the block B-01, represented by the area 1901, is shown.
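The conversion from an estimated bunch count to a harvest weight is not detailed in the description; the sketch below assumes an average weight per bunch is known (for example, from past harvest records), which is an assumption introduced here for illustration.

```python
def estimate_yield_tonnes(estimated_bunches, avg_bunch_weight_kg):
    """Convert an estimated total bunch count into a harvest-weight estimate
    in tonnes. avg_bunch_weight_kg is an assumed per-bunch average; the
    description does not specify this conversion."""
    return estimated_bunches * avg_bunch_weight_kg / 1000.0

# For example, 130,000 estimated bunches at an assumed 0.15 kg per bunch
# give 19.5 t, matching the predicted value shown for block B-01.
print(estimate_yield_tonnes(130_000, 0.15))  # 19.5
```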
- when the display control unit 207 detects a pointing operation (for example, a selection operation such as a click or a tap) on the area 1901 via the input device 108, it switches the screen displayed on the display 106 to a screen 2000 shown in FIG. 20.
- the screen 2000 is a screen for presenting information that is the basis of the prediction of the yield for the block B-01.
- Screen 2000 includes area 2001 at a position corresponding to area 1901 on screen 1900.
- Each of the 66 squares in the area 2001 is a marker indicating one unit to be subjected to the counting survey.
- the target object is a grape cluster.
- the pattern of each marker corresponds to the average number of bunches detected in each unit. That is, the area 2001 shows the geographic distribution of the number of detected target objects existing in the block B-01.
- An area 2002 surrounded by a broken line shows detailed information on the selected block B-01.
- the area 2002 shows information indicating that the average number of detected bunches over all units in the block B-01 is 8.4.
- it also shows information indicating that the average estimated number of bunches over all units in the block B-01 is 17.1.
- the number of detected target objects detected by the feature amount acquisition unit 204 does not include the number of target objects whose detection is hindered by obstacles.
- the actual number acquired by the number acquisition unit 201 includes the number of target objects whose detection is inhibited by obstructions. That is, in the present embodiment, there may be a difference between the actual number and the number of detections that form a pair of learning data.
- the number of target objects estimated by the estimation unit 206 may therefore be larger than the number of detections, as indicated by the area 2002 in FIG. 20, by the number of target objects whose detection is inhibited by obstructions.
- the area 2003 indicates, for each of a plurality of stages into which the pairs of the detected bunch count and the estimated bunch count are divided, the total number of markers belonging to that stage.
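The per-stage tally shown in the area 2003 can be sketched as simple binning of per-unit averages into stages; the stage width and the sample counts below are assumptions made for illustration, not values from the description.

```python
from collections import Counter

def stage_of(avg_count, stage_width=5.0):
    """Assign a per-unit average bunch count to a stage index; counts are
    grouped into bands of width stage_width (the width is an assumption)."""
    return int(avg_count // stage_width)

# Illustrative per-unit average detected bunch counts for six markers.
unit_averages = [3.0, 7.5, 8.4, 12.1, 4.9, 9.9]
histogram = Counter(stage_of(a) for a in unit_averages)
print(histogram)  # stage index -> number of markers in that stage
```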
- the method of expressing the information shown in the area 2003 may be a histogram format as in the example of FIG. 20 or may be various graph formats.
- the area 2003 shows a histogram in which the pattern of the bin is changed for each stage.
- the pattern of the bins in this histogram corresponds to the pattern of the marker in the area 2001.
- the display control unit 207 assigns different patterns to the bins and the markers.
- the bins and the markers may be colored in different colors.
- in other words, the distribution of the number of detected bunches is represented as a pseudo heat map.
- the display in the heat map format allows the user to intuitively understand the magnitude of the number of detections and the distribution thereof.
- the estimated bunch count is also shown in the area 2002 in association with the detected bunch count. A user who actually looks at the field may have seen the vines before the leaves grew, and may therefore intuitively know the number of bunches hidden by the leaves. For such a user, the number of bunches detected from the image may feel smaller than the number known from their own observation.
- therefore, the display control unit 207 displays, as the basis of the predicted yield, not only the number of actually detected bunches but also the actual number estimated by the learned model, in association with the number of detections. For example, the user first looks at the screen 1900 and learns the predicted value of the yield. Then, when making a subsequent plan for each block, if the user wants to confirm the basis of the predicted value, he or she clicks the target block. The user can then use the screen 2000 corresponding to the clicked block to confirm both the number of bunches detected from the image (bunches that are certainly present) and the estimated number of bunches not detected from the image.
- for example, when a predicted value is lower than expected, it is possible to quickly determine whether the cause is a small number of detections or a small estimated number (the estimation processing).
- the virtual button 2004 in the area 2002 is a button used to clearly indicate the position where the sampling survey was actually performed among the markers shown in the area 2001.
- the display control unit 207 switches the screen displayed on the display 106 to a screen 2100 shown in FIG.
- 66 markers included in the block B-01 are displayed in the area 2001 of the screen 2100, continuously from the screen 2000. Then, among the 66 markers, the display control unit 207 highlights with a thick line, like the marker 2101, the ten markers corresponding to the positions where the sampling survey was actually performed.
- the display control unit 207 also changes the display state of the virtual button 2004 on the screen 2000, for example by changing its color. The display state of the virtual button 2004 thereby indicates whether or not the button is currently selected, so the user can determine its state at a glance.
- the function related to the virtual button 2004 is particularly effective when learning data from a sampling survey of the current year is used as the basis of the estimation processing. For example, when a learned model trained only on data obtained up to the previous year is used, there is no need to confirm the sampling positions, and the virtual button 2004 may be omitted. Likewise, once it is determined from past sampling survey results that sufficient learning data has been accumulated, the sampling survey for each subsequent year may be omitted, and only the counting survey by the estimation processing according to the present embodiment may be performed.
- <Embodiment 2> a case will be described in which the estimating apparatus 100 specifies an area of a predetermined specific section set in advance and estimates the actual number of target objects included in the specified area.
- the hardware configuration of the estimation device 100 of the present embodiment is the same as that of the first embodiment.
- FIG. 8 is a diagram illustrating an example of a functional configuration of the estimation device 100 according to the present embodiment.
- the estimating apparatus 100 of the present embodiment is different from the case of the first embodiment shown in FIG. 2 in that it includes a section specifying unit 801 for specifying a preset area of a section. Further, in the present embodiment, the image acquired by the image acquisition unit 202 is an image indicating a region of a preset section.
- the section identification unit 801 detects, from an image, an object indicating the area of a preset section (e.g., an area of a section set in a field), and identifies the area of the section based on the position of the detected object.
- the image acquisition unit 202 cuts out the area of the section specified by the section specifying unit 801 from the input image and stores the cut out area in the HDD 109 or the like.
- An image 901 in FIG. 9 is an image obtained by photographing a region of a section
- an image 902 is an image obtained by cutting out the area of a section.
- the section specifying unit 801 detects an object indicating the area of the section from the image 901 and specifies an area surrounded by the detected object in the image 901 as the area of the section.
- the image obtaining unit 202 cuts out the area specified by the section specifying unit 801 from the image 901, obtains the image 902, and stores the obtained image 902 in the HDD 109.
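As a rough sketch of this cut-out step (not the patent's actual implementation), the area enclosed by detected marker objects could be cropped as follows; the bounding boxes of the markers are assumed to be produced by a separate detector:

```python
import numpy as np

def crop_section(image, marker_boxes):
    """Crop the rectangular area enclosed by detected section markers.

    image: H x W x 3 array; marker_boxes: list of (x, y, w, h) boxes for
    the objects that indicate the section boundary (the detection itself
    is assumed to be done elsewhere).
    """
    xs = [x for x, y, w, h in marker_boxes] + [x + w for x, y, w, h in marker_boxes]
    ys = [y for x, y, w, h in marker_boxes] + [y + h for x, y, w, h in marker_boxes]
    left, right = min(xs), max(xs)
    top, bottom = min(ys), max(ys)
    return image[top:bottom, left:right]
```

The cropped array would then play the role of the image 902 stored in the HDD 109.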
- one section may not fit in a single image, in which case the section is captured as a plurality of images.
- the section specifying unit 801 arranges a plurality of images of the section, detects objects indicating both ends of the section from the plurality of images, and specifies the area of the section for each image.
- the image acquiring unit 202 combines the areas of the sections in each of the plurality of images specified by the section specifying unit 801 and stores the combined image in the HDD 109 or the like.
- a plurality of images 1001 in FIG. 10 are images obtained by capturing the area of the section, and an image 1002 is an image showing the area of the combined section.
- the section specifying unit 801 detects an object indicating the area of the section from the plurality of images 1001, and specifies an area sandwiched by the detected objects in the plurality of images 1001 as the area of the section.
- the image obtaining unit 202 synthesizes the area specified by the partition specifying unit 801 from the image 1001, obtains the image 1002, and stores the obtained image 1002 in the HDD 109.
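A minimal sketch of this combining step, assuming each cut-out area is an array of equal height, ordered along the row of the field:

```python
import numpy as np

def combine_section_crops(crops):
    """Join the per-image crops of one divided section side by side,
    producing a single image of the whole section (the role of image 1002).
    Assumes the crops share the same height and are given in row order."""
    return np.hstack(crops)
```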
- alternatively, the section specifying unit 801 may combine a plurality of images obtained by capturing the section into one combined image, detect objects indicating the area of the section from the combined image, and specify the area of the section in the combined image based on the detected objects.
- the feature amount acquiring unit 204 detects the target object from the image, obtained by the image obtaining unit 202, of the area of the preset section specified by the section specifying unit 801. Then, the feature amount acquiring unit 204 acquires the feature amount of the area of this section based on the number of detected target objects (the number of detections).
- each of the feature amounts included in the learning data used for learning the estimation parameter and the feature amounts used for the estimation process is a feature amount of any of the preset sections. That is, in the present embodiment, the estimation parameter is learned for each section defined in the field.
- the image acquiring unit 202 may acquire an image similar to that of the first embodiment without acquiring an image of the area of the section specified by the section specifying unit 801.
- the feature amount acquiring unit 204 detects the target object from the region of the preset section specified by the section specifying unit 801 in the image obtained by the image obtaining unit 202. Then, the feature amount acquiring unit 204 acquires the feature amount of the area of this section based on the number of detected target objects (the number of detections).
- the estimating apparatus 100 can specify an area of a preset section and estimate the actual number of target objects included in the specified area of the section.
- the estimation device 100 can reduce the influence of the target object that can be detected from an area other than the area in which the actual number of the target objects is to be estimated.
- in the present embodiment, the section specifying unit 801 specifies the area of the section by detecting an object indicating the area of the section set in advance. However, the area of the section may instead be specified using position information such as GPS data, or using an image measurement technique. Furthermore, the estimating apparatus 100 may display a virtual frame on the finder of the camera that generates the input image so that the user can confirm the specified area of the section at the time of shooting, or may generate a composite image in which the frame is superimposed. Further, the estimation device 100 may store the frame information in the HDD 109 or the like as metadata of the image.
- <Embodiment 3> In the present embodiment, the estimation device 100 acquires a feature amount indicating the features of a region based on other attributes of the region, in addition to the number of target objects detected from the region.
- the hardware configuration and the functional configuration of the estimation device 100 of the present embodiment are the same as those of the first embodiment.
- the estimation device 100 uses a set of the number of target objects detected from a region and other attributes of the region as a feature amount of the region. Then, the estimation device 100 performs a learning process of the estimation parameter using the feature amount, and a process of estimating the number of target objects using the estimation parameter.
- the table 1101 in FIG. 11 is a table used for registering learning data, and is stored in, for example, the HDD 109 or the like.
- the table 1101 is a table in which information used for learning the estimation parameter is added to the table 301 shown in FIG.
- the table 1101 includes items of ID, image file, number of detections, number of adjacent detections, soil, leaf volume, and actual number.
- the item of the number of adjacent detections indicates the average number of target objects detected from each of one or more areas adjacent to the area indicated by the corresponding ID.
- the target object is a crop.
- in this manner, the estimating apparatus 100 includes, in the feature amount of a region, a feature of the regions around that region (for example, a statistic such as the average, total, or variance of their numbers of detections).
- the item of soil indicates an index value indicating the goodness (easiness of fruiting) of the soil in the area indicated by the corresponding ID.
- the larger the index value the better the soil.
- the estimation device 100 causes the feature value to include an index value indicating the goodness of the soil in which the crop, which is the target object, is planted.
- accordingly, the estimating apparatus 100 can learn an estimation parameter capable of estimating the actual number of target objects in consideration of the characteristics of the soil, and can use that estimation parameter to estimate the actual number of target objects in consideration of the characteristics of the soil.
- the item of leaf amount indicates an index value indicating the amount of leaves detected from the area.
- the larger the index value the larger the amount of leaves.
- the estimation device 100 includes the amount of the detected obstruction (here, leaves) in the feature amount. Accordingly, the estimation device 100 can learn an estimation parameter capable of estimating the actual number of target objects in consideration of the amount of the obstruction, and can use that estimation parameter to estimate the actual number of target objects in consideration of the amount of the obstruction.
- the feature amount of the region is a set of the number of detections, the number of adjacent detections, the amount of leaves, and an index value indicating the goodness of soil.
- the feature amount of the region may instead be a set of the number of detections and a subset of the number of adjacent detections, the leaf amount, and the index value indicating the goodness of the soil.
- the feature amount of the region may include an attribute of the region other than the number of detections, the number of adjacent detections, the amount of leaves, and the index value indicating the goodness of the soil.
- Table 1201 in FIG. 12 is a table for managing the feature amount of the area in which the actual number of target objects is to be estimated, and the value of the actual number of target objects in the area estimated by the estimation unit 206.
- the table 1201 is a table in which information used for estimating the actual number of target objects is added to the table 401 in FIG.
- the table 1201 includes items of ID, image file, number of detections, number of adjacent detections, soil, leaf volume, and estimated value. The items of the number of adjacent detections, soil, and leaf amount are the same items as in the table 1101.
- FIG. 13 is a flowchart showing an example of the estimation parameter learning process.
- a table 1101 is used instead of the table 301.
- Steps S1301 to S1304 are processes in which a feature amount is acquired by the feature amount acquiring unit 204 for each image file registered in the table 1101, and the acquired result is registered.
- the feature amount acquiring unit 204 detects a target object and leaves from the images registered in the table 1101.
- the feature amount acquiring unit 204 may detect the leaf using the object detection technique, or may detect the leaf simply by detecting a pixel having a leaf color.
- the feature amount acquiring unit 204 registers the number of detected target objects and the leaf amount detected in step S1301 in the table 1101.
- the feature amount obtaining unit 204 obtains a leaf amount, which is an index value indicating a leaf amount, based on a ratio between the number of pixels of the detected leaf region and the number of pixels of the entire image.
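The leaf-amount index described here can be sketched as a simple pixel-ratio computation. The color test below (a "greener than red and blue" rule) is an illustrative placeholder for the color-based leaf detection mentioned in the text, not a value from the patent:

```python
import numpy as np

def leaf_amount(image_rgb):
    """Index of leaf coverage: fraction of pixels whose color is (crudely)
    leaf-green. Thresholds are illustrative assumptions only."""
    r = image_rgb[..., 0].astype(int)
    g = image_rgb[..., 1].astype(int)
    b = image_rgb[..., 2].astype(int)
    leaf_mask = (g > r) & (g > b) & (g > 60)  # pixel "has a leaf color"
    # Ratio of leaf pixels to the number of pixels of the entire image.
    return leaf_mask.sum() / leaf_mask.size
```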
- in step S1303, the feature amount obtaining unit 204 obtains position information such as GPS data from the metadata of the images registered in the table 1101. Then, the feature amount obtaining unit 204 obtains the number of adjacent detections and the information on the goodness of the soil based on the obtained position information. The feature amount obtaining unit 204 obtains, from the table 1101, the position information of the images corresponding to the IDs before and after the target ID, and determines whether each is an image of an adjacent area. Then, the feature amount obtaining unit 204 obtains the number of detections for each image determined to be of an adjacent area, and obtains their average value as the number of adjacent detections. For example, the feature amount acquiring unit 204 sets the number of adjacent detections of ID2 to 3.5, the average of 3, the number of detections of ID1, and 4, the number of detections of ID3.
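Reconstructed from the ID2 example (the average of the detection counts of ID1 and ID3), the averaging step can be sketched as follows; mapping consecutive IDs to consecutive list indices, and having already confirmed adjacency by position information, are assumptions:

```python
def adjacent_detection_count(detections, index):
    """Average number of detections over the rows adjacent to `index`
    (the previous and next IDs), skipping neighbours that do not exist."""
    neighbours = [detections[i] for i in (index - 1, index + 1)
                  if 0 <= i < len(detections)]
    return sum(neighbours) / len(neighbours)
```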
- in the present embodiment, the feature amount acquiring unit 204 determined which regions are actually adjacent by using the position information.
- however, the estimation device 100 may instead perform the following. That is, assuming that the table 1101 includes the position information of the area of the corresponding ID, the feature amount acquiring unit 204 may specify, from the table 1101, the data of the number of detections of the areas around a certain area based on the position information.
- the feature amount acquiring unit 204 acquires, for example, an index value indicating the goodness of the soil corresponding to the shooting position from the database or the like.
- in step S1304, the feature amount acquiring unit 204 registers, in the table 1101, the information on the number of adjacent detections acquired in step S1303 and the index value indicating the goodness of the soil.
- the learning unit 203 learns an estimation parameter using the number of detections, the number of adjacent detections, an index value indicating soil goodness, and the amount of leaves in the table 1101.
- the estimation parameter is a parameter of linear regression.
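A hedged sketch of such a linear-regression fit over the four features of Expression (2) (number of detections, number of adjacent detections, soil index, leaf amount), using ordinary least squares; all sample values are made up for illustration:

```python
import numpy as np

# Illustrative learning data: one row per surveyed area, columns are the
# four features of Expression (2). The numbers are invented examples.
X = np.array([
    [3.0, 3.5, 5.0, 0.30],
    [2.0, 3.0, 4.0, 0.50],
    [4.0, 3.0, 5.0, 0.20],
    [5.0, 4.5, 6.0, 0.10],
    [3.0, 4.0, 3.0, 0.40],
])
y = np.array([5.0, 5.0, 6.0, 7.0, 5.5])   # actual numbers from the survey

A = np.hstack([np.ones((len(X), 1)), X])  # prepend a bias column for w0
w, *_ = np.linalg.lstsq(A, y, rcond=None) # w = [w0, w1, w2, w3, w4]

def estimate(features):
    """Expression (2): w0 + w1*detections + w2*adjacent + w3*soil + w4*leaf."""
    return w[0] + np.asarray(features) @ w[1:]
```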
- in step S505 of FIG. 13, the parameter management unit 205 stores and manages, in the HDD 109 or the like, the estimation parameters learned in step S504.
- FIG. 14 is a flowchart illustrating an example of an estimation process for estimating the actual number of target objects using the estimation parameters learned in the process of FIG. 13.
- in step S601 in FIG. 14, the estimating unit 206 acquires the estimation parameters learned in step S504 of FIG. 13.
- the feature amount acquiring unit 204 detects the target object and the leaves from the image in which the region for which the actual number of target objects is to be estimated is photographed.
- in step S1402, the feature amount acquiring unit 204 registers, in the table 1201, the number of target objects and the leaf amount detected in step S1401, as in step S1302.
- the feature amount obtaining unit 204 obtains a leaf amount, which is an index value indicating a leaf amount, based on a ratio between the number of pixels of the detected leaf region and the number of pixels of the entire image.
- the feature amount obtaining unit 204 obtains position information such as GPS data from metadata of an image of a region in which the actual number of target objects is to be estimated, as in step S1303. Then, the feature amount obtaining unit 204 obtains information on the number of adjacent detections and the goodness of the soil based on the obtained position information. Then, the feature amount acquiring unit 204 registers information on the acquired number of detected neighbors and an index value indicating the goodness of soil in the table 1201.
- in step S603 of FIG. 14, the estimating unit 206 executes the following process using Expression (2), based on the estimation parameters acquired in step S601 and the feature amounts of the regions registered in the table 1201. That is, the estimating unit 206 estimates the number of target objects actually included in the region in which the actual number of target objects is to be estimated.
- the number of target objects detected (the number of detections) of ID836 is smaller than that of ID835 or ID837.
- however, the estimated value of ID836 is similar to those of ID835 and ID837. In the region of ID836, the number of target objects that happened to be hidden by leaves may have been larger than in ID835 or ID837, so that the number of detections decreased relative to those regions.
- the estimating apparatus 100 can supplement the information on the number of detections by using the number of adjacent detections as the feature quantity, so that the estimated value does not become too small even if the number of detections decreases.
- similarly, by using the leaf amount as a feature amount, the estimation device 100 can prevent the estimated value from becoming too small even when many target objects are hidden by leaves.
- in the present embodiment, the estimating apparatus 100 uses as the feature amount, in addition to the number of detections, other attributes of the region such as the positional bias of the yield (the goodness of the soil) and the amount of leaves. Thereby, the estimating apparatus 100 can estimate the actual number of the target objects with higher accuracy than in the first embodiment.
- for example, the estimating apparatus 100 may use, as a feature amount of the region, information on the size of the detected target objects, on the assumption that target objects grow larger in locations where crops bear well.
- the estimation device 100 may use, as a feature amount, a variety of agricultural crops, a fertilizer application state, the presence or absence of a disease, and the like, which are factors of the yield.
- <Embodiment 4> The yield of agricultural crops tends to be biased depending on the weather. In the present embodiment, a description will be given of a process of estimating the actual number of the crops that are the target objects and then correcting the estimate in consideration of weather conditions.
- the hardware configuration of the estimation device 100 is the same as that of the first embodiment.
- FIG. 15 is a diagram illustrating an example of a functional configuration of the estimation device 100 according to the present embodiment.
- the functional configuration of the estimation device 100 of the present embodiment is different from the functional configuration of FIG. 2 in that a correction information acquisition unit 1501 that acquires correction information is included.
- the correction information acquisition unit 1501 acquires the learned estimation parameter used by the estimation unit 206 for estimating the actual number of target objects, and correction information (for example, a coefficient by which the estimated value is multiplied).
- the estimating unit 206 estimates the actual number of target objects using the estimation parameters and the correction information.
- FIG. 16 is a diagram showing an example of a table for managing learned parameters and coefficients prepared in advance.
- the table 1601 includes items of year, average number of detections, and parameter.
- the item of the year is the year corresponding to the learning data registered in the table 301.
- the item of the average number of detections is the average number of detections in the corresponding year.
- the parameter is an estimated parameter learned by the learning unit 203 using the learning data corresponding to the corresponding year.
- the table 1601 is a table for managing a plurality of estimation parameters learned by the learning unit 203, each of which is associated with an index value of a preset index called an average detection number.
- the table 1601 is stored in, for example, the HDD 109 or the like.
- the table 1602 includes items of the percentage of fine weather and the coefficient.
- the item of sunny ratio indicates the ratio of sunny days during the period in which the crop, which is the target object actually included in a certain area, grew.
- the item of the coefficient indicates a value for correcting the estimated value, and the larger the ratio of the corresponding sunny day, the larger the value.
- FIG. 17 is a flowchart illustrating an example of an estimation process using estimation parameters.
- step S1701 the correction information acquisition unit 1501 averages the number of detections in the table 401 to acquire the average number of detections for the year corresponding to the estimation target of the actual number of target objects.
- the correction information acquisition unit 1501 acquires, from the estimation parameters registered in the table 1601, the estimation parameter whose corresponding average detection number value is closest to the acquired average detection number.
- the correction information acquisition unit 1501 selects the acquired estimation parameter as the estimation parameter used for the estimation processing.
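This nearest-match selection of S1701 could be sketched as follows, with the rows of table 1601 modelled as (year, average number of detections, parameters) tuples; that tuple structure is an assumption based on the items listed for the table:

```python
def select_parameter(table, avg_detections):
    """Pick, from (year, average detections, parameters) rows, the learned
    parameters whose associated average number of detections is closest to
    the average computed for the current estimation target."""
    return min(table, key=lambda row: abs(row[1] - avg_detections))[2]
```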
- by the process of S1701, the estimation device 100 can acquire an estimation parameter learned under conditions close to those of the region subjected to the estimation processing, without learning the estimation parameter again. Accordingly, the estimation device 100 can prevent the accuracy of the estimation process from decreasing while reducing the load of the processing related to learning.
- in S1702, the correction information acquisition unit 1501 uses, for example, weather information acquired from an external weather service or the like to acquire the ratio of sunny days during the growing period of the crop, which is the target object, in the year corresponding to the region to be estimated.
- the correction information acquisition unit 1501 acquires, from the table 1602, a coefficient corresponding to the acquired percentage of a sunny day.
- the estimating unit 206 obtains an estimated value of the actual number of target objects by using Expression (1), for example, using the estimation parameter selected in S1701. Then, the estimating unit 206 corrects the estimated value by multiplying the obtained estimated value by the coefficient obtained in S1702, and sets the corrected estimated value as a final estimated value.
- the estimation device 100 acquires a coefficient used for correcting the estimated value of the object based on the weather information (the ratio of a sunny day). Then, the estimation device 100 uses the obtained coefficient to correct the estimated value of the actual number of target objects. Thereby, the estimating apparatus 100 can obtain an estimated value of the actual number of target objects with higher accuracy than in the first embodiment.
- in the present embodiment, it is assumed that harvesting of agricultural products is performed once a year. However, the harvest is not limited to once a year; the present embodiment may also be applied to agricultural products harvested a plurality of times a year.
- the data managed for each year may be changed to be managed for each growth period.
- the estimation device 100 uses the ratio of sunny days to obtain the coefficient.
- however, the estimation device 100 may instead use an average value, an integrated value, or the like of sunshine hours, precipitation, temperature, and so on.
- the estimation device 100 may also use, as one of the feature amounts, the average number of detections used to acquire the estimation parameter and the ratio of sunny days used to acquire the correction information, as described in the present embodiment. Further, the estimation device 100 may acquire the estimation parameter and the correction information by using a part of the feature amounts described in the third embodiment.
- <Embodiment 5> In the first embodiment, the estimation device 100 performs the process of generating the learning data used for learning the estimation parameter, the process of learning the estimation parameter, and the process of estimating the actual number of target objects.
- these processes need not be executed by a single device.
- FIG. 18 is a diagram illustrating a system configuration of an information processing system that, in the present embodiment, executes the process of generating learning data used for learning an estimation parameter, the process of learning the estimation parameter, and the process of estimating the actual number of target objects using the estimation parameter.
- the information processing system includes a generating device 1801, a learning device 1802, and an estimating device 1803.
- the hardware configuration of each of the generating device 1801, the learning device 1802, and the estimating device 1803 is the same as the hardware configuration of the estimating device 100 of the first embodiment illustrated in FIG.
- the function of the generation device 1801 and the processing of the generation device 1801 illustrated in FIG. 18 are realized by the CPU of the generation device 1801 executing the processing based on the program stored in the ROM, the HDD, or the like of the generation device 1801.
- the function of the learning device 1802 and the processing of the learning device 1802 illustrated in FIG. 18 are realized by the CPU of the learning device 1802 executing the processing based on the program stored in the ROM, the HDD, or the like of the learning device 1802.
- the functions of the estimating device 1803 and the processing of the estimating device 1803 shown in FIG. 18 are realized by the CPU of the estimating device 1803 executing the processing based on the programs stored in the ROM, the HDD, or the like of the estimating device 1803.
- the generation device 1801 includes a number acquisition unit 1811, an image acquisition unit 1812, a feature amount acquisition unit 1813, and a generation unit 1814.
- the number acquisition unit 1811, the image acquisition unit 1812, and the feature amount acquisition unit 1813 are the same as the number acquisition unit 201, the image acquisition unit 202, and the feature amount acquisition unit 204 in FIG.
- the generation unit 1814 generates learning data, and stores the generated learning data in a format such as a table 301 or CSV in an HDD or the like of the generation device 1801.
- the generation unit 1814 generates learning data by executing, for example, processing similar to S501 to S503 in FIG.
- the learning device 1802 includes a learning unit 1821 and a parameter management unit 1822.
- the learning unit 1821 and the parameter management unit 1822 are functional components similar to the learning unit 203 and the parameter management unit 205 in FIG. 2, respectively. That is, the learning unit 1821 obtains the learning data generated by the generation device 1801 from the generation device 1801, and executes the same processing as S504 to S505 in FIG. 6 based on the obtained learning data (information in the table 301). Then, the estimation parameters are learned. Then, the parameter management unit 1822 stores the estimated parameters learned by the learning unit 1821 in the HDD or the like of the learning device 1802.
- the estimation device 1803 includes an image acquisition unit 1831, a feature amount acquisition unit 1832, an estimation unit 1833, and a display control unit 1834.
- the image acquisition unit 1831, the feature amount acquisition unit 1832, the estimation unit 1833, and the display control unit 1834 are functional components similar to the image acquisition unit 202, the feature amount acquisition unit 204, the estimation unit 206, and the display control unit 207 of FIG. 2, respectively. That is, the image acquisition unit 1831, the feature amount acquisition unit 1832, and the estimation unit 1833 execute the same processing as in FIG. 7, thereby estimating the actual number of target objects included in the target region.
- in the present embodiment, the respective devices execute the process of generating the learning data used for learning the estimation parameter, the process of learning the estimation parameter, and the process of estimating the actual number of target objects. This makes it possible to distribute the load of each process across a plurality of devices.
- <Other Embodiments> The present invention can also be realized by a process in which a program that realizes one or more functions of the above-described embodiments is supplied to a system or an apparatus via a network or a storage medium, and one or more processors in a computer of the system or the apparatus read out and execute the program. It can also be realized by a circuit (for example, an ASIC) that realizes one or more functions.
- the functional configuration of the estimation device 100 described above may be implemented as hardware in the estimation device 100, the generation device 1801, the learning device 1802, the estimation device 1803, and the like.
Abstract
Description
<First Embodiment> In the present embodiment, a description will be given of processing in which the estimation device 100 learns an estimation parameter, which is a parameter used for estimating the number of objects included in a designated area, and estimates the number of objects included in the designated area based on the learned estimation parameter.
In step S504, the learning unit 203 learns the estimation parameters w0 and w1 of, for example, the following linear regression:

Actual number (estimated value) = w0 + (w1 × number of detections) ... Expression (1)
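As an illustrative sketch (with made-up survey data, not values from the patent), w0 and w1 of Expression (1) can be fitted by least squares:

```python
import numpy as np

# Hypothetical sampling-survey data: detections from images vs. manually
# counted actual numbers for the same areas.
detections = np.array([3.0, 4.0, 2.0, 5.0])
actual = np.array([5.0, 6.0, 4.0, 7.0])

w1, w0 = np.polyfit(detections, actual, 1)   # slope w1, intercept w0

def estimate(n_detected):
    """Expression (1): actual number = w0 + w1 * number of detections."""
    return w0 + w1 * n_detected
```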
<Example of Use> An example of use of a system for presenting to a user the number of target objects obtained by the processing of the estimation device 100 of the present embodiment, and the crop yield that can be predicted based on that number, will be described. This system includes the estimation device 100. The user of this system can make use of the number of target objects estimated by the estimation device 100 in subsequent work and in production planning of processed goods. For example, the processing of the present embodiment is suitably applicable to the cultivation of grapes for wine production. In the following, production management of grapes for wine production will be described as an example.
<Embodiment 2> In the present embodiment, a case will be described in which the estimation device 100 specifies the area of a predetermined section set in advance and estimates the actual number of target objects included in the specified area.
<Embodiment 3> In the present embodiment, a case will be described in which the estimation device 100 acquires a feature amount indicating the features of a region based on other attributes of the region, in addition to the number of target objects detected from the region.
In S504 of FIG. 13, the learning unit 203 learns the estimation parameters w0 to w4 of, for example, the following linear regression:

Real number (estimated value) = w0 + (w1 × number of detections) + (w2 × number of adjacent detections) + (w3 × index value indicating goodness of soil) + (w4 × leaf amount) ... Expression (2)
<Embodiment 4> The yield of agricultural crops tends to be biased depending on the weather. In the present embodiment, a description will be given of a process of estimating the actual number of the crops that are the target objects and then correcting the estimate in consideration of weather conditions.
<Embodiment 5> In the first embodiment, the estimation device 100 performs the process of generating the learning data used for learning the estimation parameter, the process of learning the estimation parameter, and the process of estimating the actual number of target objects. However, these processes need not be executed by a single device.
<Other Embodiments> The present invention can also be realized by a process in which a program that realizes one or more functions of the above-described embodiments is supplied to a system or an apparatus via a network or a storage medium, and one or more processors in a computer of the system or the apparatus read out and execute the program. It can also be realized by a circuit (for example, an ASIC) that realizes one or more functions.
Claims (20)
- An information processing apparatus comprising: a feature acquisition unit configured to acquire, from an image of an area that is a part of a field where a crop is grown, a feature amount of the area related to the number of target objects detected from the image; a number acquisition unit configured to acquire the actual number of the target objects present in a set area of the field; and a learning unit configured to learn an estimation parameter for estimating the actual number of the target objects present in a designated area of the field, using, as learning data, the feature amount acquired by the feature acquisition unit from an image of the set area and the actual number acquired by the number acquisition unit.
- The information processing apparatus according to claim 1, wherein the target object is at least one of a bud, a fruit, a flower, and a bunch of the crop.
- The information processing apparatus according to claim 1, further comprising an estimation unit configured to estimate the actual number of the target objects included in the designated area, based on the feature amount acquired by the feature acquisition unit from an image of the designated area and the estimation parameter learned by the learning unit.
- The information processing apparatus according to claim 3, wherein the estimation unit estimates the actual number of the target objects included in the designated area based on the feature amount of the designated area and an estimation parameter selected, according to the index value of a preset index that corresponds to the feature amount, from a plurality of estimation parameters learned by the learning unit, each of which is associated with an index value of the preset index.
- The information processing apparatus according to claim 4, wherein the estimation unit estimates the actual number of the target objects included in the designated area based on the feature amount of the designated area, the estimation parameter learned by the learning unit, and correction information used for correcting a value estimated using the estimation parameter.
- An information processing apparatus comprising: a feature acquisition unit configured to acquire, from an image of an area that is a part of a field where a crop is grown, a feature amount of the area related to the number of target objects detected from the image; and an estimation unit configured to estimate the actual number of the target objects included in a designated area of the field, based on the feature amount acquired by the feature acquisition unit from an image of the designated area and an estimation parameter, the estimation parameter being a previously learned parameter used in an estimation process for estimating the actual number of the target objects included in an area of the field from the feature amount acquired by the feature acquisition unit from an image of that area.
- The information processing apparatus according to claim 5, wherein the estimation unit estimates the actual number of the target objects included in the set area based on the feature amount acquired by the feature acquisition unit from an image of the designated area and an estimation parameter selected, according to the index value of a preset index that corresponds to the feature amount, from a plurality of estimation parameters each of which is associated with an index value of the preset index.
- The information processing apparatus according to claim 5, wherein the estimation unit estimates the actual number of the target objects included in the set area based on the feature amount acquired by the feature acquisition unit from an image of the designated area, the estimation parameter, and correction information used for correcting a value estimated using the estimation parameter.
- The information processing apparatus according to any one of claims 3 to 8, further comprising a display control unit configured to display, on a predetermined display, a harvest amount of the crop predicted based on the actual number of the target objects estimated by the estimation unit.
- The information processing apparatus according to claim 9, wherein the display control unit further performs control to display, on the predetermined display, in response to an operation by a user, the number of the target objects detected in a predetermined range that served as the basis for the prediction of the harvest amount of the crop and the number of the target objects estimated by the estimation unit.
- An information processing apparatus comprising: a feature acquisition unit configured to acquire, from an image of an area that is a part of a field where a crop is grown, a feature amount of the area related to the number of target objects detected from the image; a number acquisition unit configured to acquire the actual number of the target objects present in a set area of the field; and a generation unit configured to generate learning data used for learning an estimation parameter by associating the feature amount acquired by the feature acquisition unit from an image of the set area with the actual number acquired by the number acquisition unit, the estimation parameter being used in an estimation process for estimating, from the feature amount acquired by the feature acquisition unit from an image of a designated area of the field, the actual number of the target objects included in the designated area.
- The information processing apparatus according to any one of claims 1 to 11, wherein the feature acquisition unit acquires the feature amount based on the detected number, the detected number being the number of the target objects detected from an area included in the image that is determined based on a predetermined object appearing in the image of the area that is the part of the field.
- The information processing apparatus according to claim 12, wherein the feature acquisition unit acquires the feature amount based on the detected number, the detected number being the number of the target objects detected from areas included in a plurality of images of the area that is the part of the field, the areas being determined based on one preset object that appears partially in each of the plurality of images.
- The information processing apparatus according to any one of claims 1 to 13, wherein the feature acquisition unit acquires the feature amount based on the detected number and information indicating the amount of a preset obstruction that can impede detection of the target objects present in the area that is the part of the field.
- The information processing apparatus according to any one of claims 1 to 14, wherein the feature acquisition unit acquires the feature amount based on the detected number and a characteristic of the soil in the area that is the part of the field.
- The information processing apparatus according to any one of claims 1 to 15, wherein the feature acquisition unit acquires the feature amount based on the detected number and a feature of an area set as an area surrounding the area that is the part of the field.
- An information processing method executed by an information processing apparatus, the method comprising: a feature acquisition step of acquiring, from an image of an area that is a part of a field where a crop is grown, a feature amount of the area related to the number of target objects detected from the image; a number acquisition step of acquiring the actual number of the target objects present in a set area of the field; and a learning step of learning an estimation parameter for estimating the actual number of the target objects present in a designated area of the field, using, as learning data, the feature amount acquired in the feature acquisition step from an image of the set area and the actual number acquired in the number acquisition step.
- An information processing method executed by an information processing apparatus, the method comprising: a feature acquisition step of acquiring, from an image of an area that is a part of a field where a crop is grown, a feature amount of the area related to the number of target objects detected from the image; and an estimation step of estimating the actual number of the target objects included in a designated area of the field, based on the feature amount acquired in the feature acquisition step from an image of the designated area and an estimation parameter, the estimation parameter being a previously learned parameter used in an estimation process for estimating the actual number of the target objects included in an area of the field from the feature amount acquired in the feature acquisition step from an image of that area.
- An information processing method comprising: a feature acquisition step of acquiring, from an image of an area that is a part of a field where a crop is grown, a feature amount of the area related to the number of target objects detected from the image; a number acquisition step of acquiring the actual number of the target objects present in a set area of the field; and a generation step of generating learning data used for learning an estimation parameter by associating the feature amount acquired in the feature acquisition step from an image of the set area with the actual number acquired in the number acquisition step, the estimation parameter being used in an estimation process for estimating, from the feature amount acquired in the feature acquisition step from an image of a designated area of the field, the actual number of the target objects included in the designated area.
- A program for causing a computer to function as each unit of the information processing apparatus according to any one of claims 1 to 16.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2019309839A AU2019309839A1 (en) | 2018-07-27 | 2019-07-19 | Information processing device, information processing method, and program |
US17/156,267 US20210142484A1 (en) | 2018-07-27 | 2021-01-22 | Information processing apparatus, information processing method, and storage medium |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018141430 | 2018-07-27 | ||
JP2018-141430 | 2018-07-27 | ||
JP2019-097874 | 2019-05-24 | ||
JP2019097874A JP2020024672A (en) | 2018-07-27 | 2019-05-24 | Information processor, information processing method and program |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/156,267 Continuation US20210142484A1 (en) | 2018-07-27 | 2021-01-22 | Information processing apparatus, information processing method, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020022215A1 true WO2020022215A1 (en) | 2020-01-30 |
Family
ID=69181562
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/028464 WO2020022215A1 (en) | 2018-07-27 | 2019-07-19 | Information processing device, information processing method, and program |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2020022215A1 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170228475A1 (en) * | 2016-02-05 | 2017-08-10 | The Climate Corporation | Modeling trends in crop yields |
US20180158207A1 (en) * | 2015-05-29 | 2018-06-07 | Université De Bordeaux | System and method for estimating a harvest volume in a vineyard operation |
- 2019-07-19: WO application PCT/JP2019/028464 (WO2020022215A1, en); active, Application Filing
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP2020024672A (en) | Information processor, information processing method and program | |
US10719787B2 (en) | Method for mapping crop yields | |
US11793119B2 (en) | Information processing device and information processing method | |
JP5729476B2 (en) | Imaging device and imaging support program | |
US11935282B2 (en) | Server of crop growth stage determination system, growth stage determination method, and storage medium storing program | |
JP5756374B2 (en) | Growth management method | |
JP7229864B2 | REMOTE SENSING IMAGE ACQUISITION TIME DETERMINATION SYSTEM AND CROP GROWTH ANALYSIS METHOD | |
JP2007310463A (en) | Farm field management support method and system | |
JP5657901B2 (en) | Crop monitoring method, crop monitoring system, and crop monitoring device | |
US20200311915A1 (en) | Growth status prediction system and method and computer-readable program | |
JP6760068B2 (en) | Information processing equipment, information processing methods, and programs | |
JPWO2016039176A1 (en) | Information processing apparatus, information processing method, and program | |
US20220405863A1 (en) | Information processing device, information processing method, and program | |
Lootens et al. | High-throughput phenotyping of lateral expansion and regrowth of spaced Lolium perenne plants using on-field image analysis | |
US20140009600A1 (en) | Mobile device, computer product, and information providing method | |
JP7313056B2 (en) | Fertilizer application amount determination device and fertilizer application amount determination method | |
CN107437262B (en) | Crop planting area early warning method and system | |
KR102114384B1 (en) | Image-based crop growth data measuring mobile app. and device therefor | |
WO2020022215A1 (en) | Information processing device, information processing method, and program | |
JP7191785B2 (en) | agricultural support equipment | |
WO2019163249A1 (en) | Color index value calculation system and color index value calculation method | |
CN108363851B (en) | Planting control method and control device, computer equipment and readable storage medium | |
CN115379150A (en) | System and method for automatically generating dynamic video of rice growth process in remote way | |
WO2021124815A1 (en) | Prediction device | |
JP6931418B2 (en) | Image processing methods, image processing devices, user interface devices, image processing systems, servers, and image processing programs |
Legal Events
Code | Title | Description
---|---|---
121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 19841848; Country of ref document: EP; Kind code of ref document: A1
NENP | Non-entry into the national phase | Ref country code: DE
ENP | Entry into the national phase | Ref document number: 2019309839; Country of ref document: AU; Date of ref document: 20190719; Kind code of ref document: A
122 | EP: PCT application non-entry in European phase | Ref document number: 19841848; Country of ref document: EP; Kind code of ref document: A1