WO2017047814A1 - Image determination method - Google Patents
Image determination method
- Publication number
- WO2017047814A1 (PCT/JP2016/077728)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- sensor
- server
- terminal
- image
- Prior art date
Classifications
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
- A01G7/00—Botany in general
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/5838—Retrieval characterised by using metadata automatically derived from the content, using colour
- G06F16/5866—Retrieval characterised by using metadata manually generated, e.g. tags, keywords, comments, manually generated location and time information
- G06T1/00—General purpose image data processing
- G06V20/68—Food, e.g. fruit or vegetables
Definitions
- the present invention relates to an image determination method for captured images.
- Patent Document 1 describes a disease identification system comprising an image capturing unit, an image processing unit, and a plant disease state database that stores the names of specific plant diseases in association with image information of the characteristic features of those diseases.
- the image capturing unit acquires the image information of the cultivated plant
- the image processing unit analyzes the cultivated plant image information acquired by the image capturing unit and acquires the image information of the suspicious area.
- the image processing unit compares the acquired image information of the suspected disease area with the image information of the disease feature in the plant disease state database. Then, when the image information of the suspected disease area matches the image information of the feature of the disease in the plant disease state database, the image processing unit acquires the name of the disease of the corresponding specific plant as the disease identification result.
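The matching logic of this prior-art system amounts to an exact lookup against stored disease features. A minimal sketch, with purely illustrative feature keys and disease names:

```python
# Prior-art style identification: the suspected-area image information is
# compared against stored disease-feature image information, and a disease
# name is returned only on a match. Keys and names below are hypothetical.
disease_db = {
    "feature_A": "rice blast",
    "feature_B": "powdery mildew",
}

def identify(suspected_area_features):
    # Exact-match lookup; returns None when no stored feature matches,
    # which is why accuracy suffers under varying shooting conditions.
    return disease_db.get(suspected_area_features)
```

An unseen input produces no result, illustrating the brittleness the invention addresses.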
- because the conventional system for determining pest and disease damage by image search performs image classification based only on pre-stored image information of pests and diseases, its classification accuracy declines markedly with differences in the shooting direction, brightness, resolution, and the like of the captured image.
- the present invention has been made to solve such a conventional problem, and an object thereof is to provide an image determination method capable of improving the accuracy of determination of a captured image.
- the image determination method is an image determination method by a server that includes a storage unit and communicates with a mobile terminal and a sensor terminal installed in a farm field.
- the image determination method comprises: a step of storing, in the storage unit, a plurality of biological images each associated with a name relating to an aspect of an organism, together with occurrence environment information indicating a range of environmental information suitable for that aspect; a step of receiving a captured image transmitted from the mobile terminal; a step of receiving environmental information measured by the sensor terminal from the sensor terminal; a step of searching the plurality of biological images for biological images similar to the received captured image and assigning points to the names corresponding to the retrieved biological images; a step of extracting the names associated with occurrence environment information whose range includes the environmental information received from the sensor terminal, and weighting the points of the extracted names; and a step of transmitting, as the determination result name, the name with the highest points to the mobile terminal that transmitted the captured image.
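The steps above can be sketched end to end as follows; the helper data and the single "temp" factor are assumptions for illustration, not details from the patent:

```python
from collections import Counter

# Hypothetical occurrence-environment database: name -> suitable range of
# one environmental factor (temperature, for illustration).
OCCURRENCE_DB = {
    "blast disease": {"temp": (15.0, 25.0)},
    "aphid damage": {"temp": (28.0, 35.0)},
}

def determine(image_votes, measured_env, weight=2.0):
    """image_votes: name -> points from the image-similarity search.
    measured_env: environmental information received from the sensor terminal.
    Names whose occurrence range contains the measured value have their
    points weighted; the highest-scoring name is the determination result."""
    points = Counter(image_votes)
    for name, ranges in OCCURRENCE_DB.items():
        low, high = ranges["temp"]
        if name in points and low <= measured_env["temp"] <= high:
            points[name] *= weight
    return points.most_common(1)[0][0]

# The image search alone favours "aphid damage", but a field temperature of
# 18 degrees falls only in the blast-disease range, tipping the result.
result = determine({"blast disease": 40, "aphid damage": 55}, {"temp": 18.0})
```

The same call with a temperature inside the aphid-damage range leaves "aphid damage" on top, showing how the environmental weighting and not the image score alone decides the result.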
- in the weighting step, it is preferable that the server extract the names relating to aspects of organisms associated with occurrence environment information whose range includes the environmental information received from the sensor terminal in the vicinity of the position at which the aspect of the organism was photographed by the mobile terminal.
- it is preferable that the image determination method further include a step in which the server stores, in the storage unit, a name relating to an aspect of an organism in association with the farm field in which that aspect was confirmed, and that, in the weighting step, the server extract the names relating to aspects of organisms confirmed in other fields around the field in which the sensor terminal in the vicinity of the photographing position is installed, and further weight the points of the extracted names.
- the image determination method makes it possible to improve the accuracy of determination of captured images.
- FIG. 3 is a diagram illustrating an example of a schematic configuration of a sensor terminal 2.
- FIG. 4 is a diagram illustrating an example of a schematic configuration of a sensor base station 3.
- FIG. 5 is a diagram illustrating an example of a schematic configuration of a mobile terminal 4.
- FIG. 6 is a diagram illustrating an example of a schematic configuration of a server 5.
- FIG. 7 is a diagram illustrating an example of the data structure of a user management table. FIG. 8 is a diagram illustrating an example of the data structure of a sensor management table. FIG. 9 is a diagram illustrating an example of the data structure of a range management table.
- FIG. 1 is a schematic diagram for explaining the outline of the image determination system.
- the image determination system includes a sensor terminal 2 installed for each of a plurality of farm fields, a portable terminal 4 owned by each user who operates each of the plurality of farm fields, and a server 5 that communicates with the portable terminal 4.
- the server 5 searches for a biological image similar to the captured image transmitted from the mobile terminal 4, and transmits the name related to the biological form corresponding to the searched biological image to the mobile terminal 4 as a determination result name.
- biological images are images of organisms such as plants, animals, and insects, and images of disease symptoms of plants, including crops.
- names relating to aspects of organisms are names of organisms, such as plant names, animal names, and insect names, as well as names of diseases.
- for example, a pest name or a disease name corresponding to a captured image of a pest or of a disease symptom of a crop is retrieved, and the retrieved pest name or disease name is transmitted to the mobile terminal 4 as the determination result name.
- the mobile terminal 4 is a multi-function mobile phone (a so-called "smartphone"), but may instead be a mobile phone (a so-called "feature phone"), a personal digital assistant (PDA), a portable game machine, a portable music player, a tablet PC, or the like.
- an example will be described of an image determination system that searches for a pest name or a disease name corresponding to a captured image of a pest or of a disease symptom of a crop, and transmits the retrieved pest name or disease name to the mobile terminal 4 as the determination result name.
- hereinafter, pests and diseases of agricultural crops are collectively referred to as "pests and diseases", and damage caused by them is referred to as "pest and disease damage".
- first, a pest inhabiting field A, which is operated by the user who owns the mobile terminal 4, or a disease symptom of a crop grown in field A, is photographed using the photographing function of the mobile terminal 4 (1).
- the mobile terminal 4 transmits the captured image to the server 5 (2).
- the server 5 stores a plurality of pest and disease images corresponding to each of a plurality of pests and diseases, together with occurrence environment information indicating the range of environmental information in which each pest or disease is likely to occur.
- environmental information in which pest and disease damage is likely to occur is an example of environmental information suitable for an aspect of an organism.
- a pest and disease image is an image obtained by photographing actual pest or disease damage, and is stored in the server 5 in association with the pest and disease name determined by an agricultural expert or the like.
- environmental information is data indicating environmental factors such as air temperature, humidity, soil temperature, soil moisture content, solar radiation, soil electrical conductivity, soil pH, wind direction and wind speed, saturation deficit, dew point temperature, water level, water temperature, and CO2 concentration measured by the sensor terminal 2 installed in each field, as well as rainfall information and the like acquired from an external server.
- the occurrence environment information is data indicating a range of environmental information, or field conditions, in which a pest or disease is likely to occur. For example, the occurrence environment information for blast disease specifies a temperature range of 15 to 25 degrees and a leaf-wetting time of 8 hours or longer.
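Testing whether measured environmental information falls inside occurrence environment information is a simple range check. A sketch using the blast-disease figures above (the upper bound on leaf-wetting time is an arbitrary placeholder):

```python
def env_in_range(occurrence_info, measured):
    """Return True when every factor named in the occurrence environment
    information lies within its range in the measured environment data."""
    for factor, (low, high) in occurrence_info.items():
        value = measured.get(factor)
        if value is None or not (low <= value <= high):
            return False
    return True

# Blast disease: air temperature 15-25 degrees, leaf wetting 8 hours or more.
blast = {"air_temp": (15, 25), "leaf_wet_hours": (8, 24)}

matches = env_in_range(blast, {"air_temp": 20, "leaf_wet_hours": 10})  # in range
too_hot = env_in_range(blast, {"air_temp": 30, "leaf_wet_hours": 10})  # out of range
```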
- the server 5 receives the captured image transmitted from the mobile terminal 4.
- the server 5 searches the plurality of stored pest and disease images for images similar to the captured image received from the mobile terminal 4, and assigns points to each pest and disease name based on the retrieved pest and disease images (3).
- the server 5 extracts feature points from each stored pest and disease image using the SURF (Speeded-Up Robust Features) method or the like, and calculates a local feature amount for each feature point.
- the server 5 expresses the calculated local feature amount by a feature vector such as a Fisher vector, and creates a classification model for the feature vector using machine learning such as Random Forest.
- the server 5 extracts feature points from the captured image transmitted from the mobile terminal 4, calculates the local feature amounts of those feature points, and expresses the calculated local feature amounts as a feature vector. Then, using the captured image's feature vector and the classification model, the server 5 retrieves pest and disease images similar to the captured image and assigns points to the pest and disease names corresponding to the retrieved images.
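As a stand-in for the Fisher-vector/Random-Forest pipeline described above, comparing a captured image's feature vector against each stored image's vector can be illustrated with plain cosine similarity (the vectors and names below are made up):

```python
import math

def cosine(u, v):
    # Cosine similarity between two equal-length feature vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical stored pest/disease images reduced to feature vectors.
stored = {
    "blast disease": [0.9, 0.1, 0.3],
    "leaf blight": [0.2, 0.8, 0.5],
}

def most_similar(query_vector):
    # Name of the stored image whose feature vector is closest to the query.
    return max(stored, key=lambda name: cosine(query_vector, stored[name]))
```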
- the classification model is, for example, a plurality of decision trees in Random Forest.
- the point is the number of votes of the disease and insect damage names corresponding to the disease and insect damage images determined in each of the plurality of decision trees.
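Treating points as per-tree votes can be shown in a few lines; the tree predictions below are invented stand-ins for the outputs of a trained Random Forest:

```python
from collections import Counter

def points_from_forest(tree_predictions):
    """Each decision tree votes for one pest/disease name given the captured
    image's feature vector; a name's points are its number of votes."""
    return Counter(tree_predictions)

# Hypothetical predictions of ten individual decision trees for one image:
votes = points_from_forest([
    "blast disease", "blast disease", "leaf blight", "blast disease",
    "leaf blight", "blast disease", "aphid damage", "blast disease",
    "leaf blight", "blast disease",
])
```

Here "blast disease" receives 6 points, "leaf blight" 3, and "aphid damage" 1; the weighting step described next can then adjust these counts using the field's environmental information.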
- the server 5 receives environment information measured by the sensor terminal 2 from the sensor terminal 2. Next, the server 5 identifies the occurrence environment information whose range includes the environment information of field A received from the sensor terminal 2. The server 5 then extracts the pest and disease names associated with that occurrence environment information and weights the points corresponding to the extracted names (4).
- the server 5 transmits to the mobile terminal 4, as the determination result name, the pest and disease name corresponding to the highest of the points assigned to the pest and disease names.
- by transmitting a captured image of a pest or disease taken with the mobile terminal 4 to the server 5, the user can learn, as the determination result name, the pest and disease name that is determined by image classification based on the captured image and that matches the environmental information of the field the user operates.
- the image determination system can thus search for pests and diseases based not only on pre-stored image information of pests and diseases but also on the environmental information of the field.
- FIG. 1 is merely an explanation for deepening the understanding of the contents of the present invention.
- the present invention may be implemented in various embodiments described below, and may be implemented in various modifications without substantially exceeding the principle of the present invention. All such variations are within the scope of the present disclosure and the specification.
- FIG. 2 is a diagram illustrating an example of a schematic configuration of the agricultural management system 1.
- the agricultural management system 1 includes one or a plurality of sensor terminals 2, a sensor base station 3, a mobile terminal 4, and a server 5.
- the agricultural management system 1 is an example of an image determination system.
- One or a plurality of sensor terminals 2 and the sensor base station 3 are connected to each other via a sensor network 7.
- the sensor terminal 2 and the server 5 are connected to each other via a communication network.
- for example, the sensor terminal 2 and the server 5 are connected to each other via the sensor network 7, the sensor base station 3, the base station 6, the backbone network 9, the gateway 10, and the Internet 11.
- the agricultural management system 1 may have a plurality of sensor base stations 3 according to the number of sensor terminals 2.
- the mobile terminal 4 and the base station 6 are connected to each other via the wireless communication network 8.
- the portable terminal 4 and the server 5 are connected to each other via a communication network, for example via the wireless communication network 8, the base station 6, the backbone network 9, the gateway 10, and the Internet 11.
- the base station 6 is a wireless device that connects the sensor base station 3 to the backbone network 9, connects mobile terminals 4 to each other, or connects the mobile terminal 4 to the backbone network 9. A plurality of base stations 6 are connected to the backbone network 9.
- FIG. 3 is a diagram illustrating an example of a schematic configuration of the sensor terminal 2.
- the sensor terminal 2 acquires environmental information indicating the measured environmental factors, transmits environmental information, and the like.
- the sensor terminal 2 includes a sensor terminal communication unit 21, a sensor terminal storage unit 22, a GPS (Global Positioning System) unit 23, a sensor connection unit 24, a sensor unit 25, and a sensor terminal processing unit 26.
- the sensor terminal communication unit 21 includes a communication interface circuit including an antenna whose sensitivity band is mainly the 920 MHz band, and connects the sensor terminal 2 to the sensor network 7.
- the sensor terminal communication unit 21 performs wireless communication with the sensor base station 3 based on a specific low-power wireless system using a specific channel.
- the frequency band of the sensor terminal communication unit 21 is not limited to the frequency band described above.
- the sensor terminal communication unit 21 transmits the environment information supplied from the sensor terminal processing unit 26 to the sensor base station 3.
- the sensor terminal storage unit 22 includes, for example, a semiconductor memory.
- the sensor terminal storage unit 22 stores a driver program, an operating system program, data, and the like used for processing in the sensor terminal processing unit 26.
- the sensor terminal storage unit 22 stores a wireless communication device driver program that controls the sensor terminal communication unit 21, a GPS driver program that controls the GPS unit 23, a sensor driver program that controls the sensor unit 25, and the like as driver programs.
- the sensor terminal storage unit 22 stores a wireless control program that executes a specific low-power wireless system or the like as an operating system program.
- the sensor terminal storage unit 22 stores environmental information indicating environmental factors measured by the sensor unit 25 as data.
- the GPS unit 23 has a GPS circuit including an antenna whose sensitivity band is mainly the 1.5 GHz band, and receives GPS signals from GPS satellites (not shown). The GPS unit 23 decodes the GPS signals and acquires time information and the like. Next, the GPS unit 23 calculates the pseudo distances from the GPS satellites to the sensor terminal 2 based on the time information and the like, and detects the position (latitude, longitude, altitude, etc.) of the sensor terminal 2 by solving the simultaneous equations obtained by substituting the pseudo distances. Then, the GPS unit 23 associates position information indicating the detected position with the acquired time information and periodically outputs them to the sensor terminal processing unit 26.
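The "solve simultaneous equations of pseudo distances" step can be illustrated in two dimensions: subtracting the first range equation from the others removes the quadratic terms, leaving a small linear solve. This is a simplified sketch with made-up coordinates, not the actual 3-D GPS computation:

```python
def trilaterate_2d(anchors, dists):
    """Recover (x, y) from three known anchor points and measured distances.
    Subtracting the first range equation from the other two cancels the
    quadratic terms, leaving a 2x2 linear system A @ [x, y] = b."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    a11, a12 = 2 * (x1 - x2), 2 * (y1 - y2)
    a21, a22 = 2 * (x1 - x3), 2 * (y1 - y3)
    b1 = d2**2 - d1**2 + x1**2 - x2**2 + y1**2 - y2**2
    b2 = d3**2 - d1**2 + x1**2 - x3**2 + y1**2 - y3**2
    det = a11 * a22 - a12 * a21  # nonzero when anchors are not collinear
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# Distances measured from three anchors to an unknown point at (3, 4):
position = trilaterate_2d([(0, 0), (10, 0), (0, 10)], [5.0, 65**0.5, 45**0.5])
```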
- the sensor connection unit 24 includes a connection terminal for the sensor unit 25, and is connected to a sensor unit 25 that measures one or more types of environmental factors.
- the sensor unit 25 measures environmental factors such as air temperature, humidity, soil temperature, soil moisture, solar radiation, soil electrical conductivity, soil pH, wind direction and speed, saturation deficit, dew point temperature, water level, water temperature, and CO2.
- the sensor unit 25 includes at least one of an air temperature sensor for measuring the air temperature, a humidity sensor for measuring the humidity, a soil temperature sensor for measuring the soil temperature, a soil moisture sensor for measuring the soil moisture content, a solar radiation sensor for measuring the amount of solar radiation, a soil EC (electrical conductivity) sensor for measuring the soil electrical conductivity, a soil pH sensor for measuring the soil pH, a wind direction sensor, a wind speed sensor, a dew point temperature sensor, a water level sensor, a water temperature sensor, and a CO2 sensor.
- the sensor terminal processing unit 26 includes one or a plurality of processors and their peripheral circuits.
- the sensor terminal processing unit 26 comprehensively controls the overall operation of the sensor terminal 2 and is, for example, a CPU (Central Processing Unit).
- the sensor terminal processing unit 26 controls the operations of the sensor terminal communication unit 21, the GPS unit 23, the sensor unit 25, and the like so that the various processes of the sensor terminal 2 are executed in an appropriate procedure according to a program stored in the sensor terminal storage unit 22.
- the sensor terminal processing unit 26 performs processing based on programs (driver program, operating system program, etc.) stored in the sensor terminal storage unit 22.
- the sensor terminal processing unit 26 includes a measurement information acquisition unit 261 and a measurement information transmission unit 262. Each of these units included in the sensor terminal processing unit 26 is a functional module implemented by a program executed on a processor included in the sensor terminal processing unit 26. Alternatively, these units included in the sensor terminal processing unit 26 may be mounted on the sensor terminal 2 as an independent integrated circuit, a microprocessor, or firmware.
- FIG. 4 is a diagram illustrating an example of a schematic configuration of the sensor base station 3.
- the sensor base station 3 receives environmental information from the sensor terminal 2, acquires environmental information indicating the measured environmental factors, transmits environmental information, and the like.
- the sensor base station 3 includes a first base station communication unit 31, a second base station communication unit 32, a base station storage unit 33, a GPS unit 34, a sensor connection unit 35, a sensor unit 36, and a base station processing unit 37.
- the first base station communication unit 31 has a communication interface circuit including an antenna whose sensitivity band is mainly the 920 MHz band, and connects the sensor base station 3 to the sensor network 7.
- the first base station communication unit 31 performs wireless communication with the sensor terminal 2 using a specific channel based on a specific low power wireless system or the like.
- the frequency band of the first base station communication unit 31 is not limited to the frequency band described above.
- the first base station communication unit 31 receives the environment information transmitted from the sensor terminal 2 and supplies the received environment information to the base station processing unit 37.
- the second base station communication unit 32 has a communication interface circuit including an antenna whose sensitivity band is mainly the 2.4 GHz band, the 5 GHz band, or the like, and performs wireless communication with the base station 6 of a wireless LAN (Local Area Network), not shown, based on the IEEE (The Institute of Electrical and Electronics Engineers, Inc.) 802.11 standard wireless communication system. Further, the frequency band of the second base station communication unit 32 is not limited to the frequency bands described above. The second base station communication unit 32 transmits the environment information supplied from the base station processing unit 37 to the base station 6.
- the base station storage unit 33 includes, for example, a semiconductor memory.
- the base station storage unit 33 stores a driver program, an operating system program, data, and the like used for processing in the base station processing unit 37.
- the base station storage unit 33 stores, as driver programs, a wireless communication device driver program for controlling the first base station communication unit 31, a wireless LAN communication device driver program for controlling the second base station communication unit 32, a GPS driver program for controlling the GPS unit 34, a sensor driver program for controlling the sensor unit 36, and the like.
- the base station storage unit 33 stores, as an operating system program, a wireless control program that executes a specific low-power wireless system, a connection control program that executes a wireless communication system of the IEEE 802.11 standard, and the like.
- the base station storage unit 33 stores, as data, environmental information indicating the environmental factors measured by the sensor unit 36 and environmental information received from the sensor terminal 2.
- the GPS unit 34 has a GPS circuit including an antenna whose sensitivity band is mainly the 1.5 GHz band, and receives GPS signals from GPS satellites (not shown). The GPS unit 34 decodes the GPS signals and acquires time information and the like. Next, the GPS unit 34 calculates the pseudo distances from the GPS satellites to the sensor base station 3 based on the time information and the like, and detects the position (latitude, longitude, altitude, etc.) of the sensor base station 3 by solving the simultaneous equations obtained by substituting the pseudo distances. Then, the GPS unit 34 associates position information indicating the detected position with the acquired time information and periodically outputs them to the base station processing unit 37.
- the sensor connection unit 35 includes a connection terminal for the sensor unit 36, and is connected to a sensor unit 36 that measures one or more types of environmental factors.
- the sensor unit 36 includes various sensors that measure environmental factors such as air temperature, humidity, soil temperature, soil moisture, solar radiation, soil electrical conductivity, soil pH, wind direction and speed, saturation deficit, dew point temperature, water level, water temperature, and CO2.
- the sensor unit 36 includes at least one of an air temperature sensor for measuring the air temperature, a humidity sensor for measuring the humidity, a soil temperature sensor for measuring the soil temperature, a soil moisture sensor for measuring the soil moisture content, a solar radiation sensor for measuring the amount of solar radiation, a soil EC (electrical conductivity) sensor for measuring the soil electrical conductivity, a soil pH sensor for measuring the soil pH, a wind direction sensor, a wind speed sensor, a dew point temperature sensor, a water level sensor, a water temperature sensor, and a CO2 sensor.
- the base station processing unit 37 has one or a plurality of processors and their peripheral circuits.
- the base station processing unit 37 controls the overall operation of the sensor base station 3, and is, for example, a CPU (Central Processing Unit).
- the base station processing unit 37 controls the operations of the first base station communication unit 31, the second base station communication unit 32, the GPS unit 34, the sensor unit 36, and the like so that the various processes of the sensor base station 3 are executed according to a program stored in the base station storage unit 33.
- the base station processing unit 37 executes processing based on programs (driver program, operating system program, etc.) stored in the base station storage unit 33.
- the base station processing unit 37 includes a measurement information acquisition unit 371, an environment information reception unit 372, and an environment information transmission unit 373. Each of these units included in the base station processing unit 37 is a functional module implemented by a program executed on a processor included in the base station processing unit 37. Alternatively, these units included in the base station processing unit 37 may be implemented in the sensor base station 3 as independent integrated circuits, microprocessors, or firmware.
- FIG. 5 is a diagram illustrating an example of a schematic configuration of the mobile terminal 4.
- the mobile terminal 4 performs transmission of user information, reception of field environment information history and range information, display of the environment information history and range information, and the like. To this end, the mobile terminal 4 includes a first wireless communication unit 41, a second wireless communication unit 42, a terminal storage unit 43, an operation unit 44, a display unit 45, a photographing unit 46, and a terminal processing unit 47.
- the first wireless communication unit 41 has a communication interface circuit including an antenna mainly having a sensitivity band of 2.1 GHz band, and connects the mobile terminal 4 to a communication network (not shown).
- the first radio communication unit 41 establishes a radio signal line by a CDMA (Code Division Multiple Access) method or the like with the base station 6 via a channel assigned by the base station 6, and communicates with the base station 6. Communicate.
- the communication method with the base station 6 is not limited to the CDMA method; other communication methods such as the W-CDMA (Wideband Code Division Multiple Access) method, the LTE (Long Term Evolution) method, or the PHS (Personal Handy-phone System) method may be used.
- the frequency band of the first wireless communication unit 41 is not limited to the frequency band described above.
- the first wireless communication unit 41 supplies the data received from the base station 6 to the terminal processing unit 47 and transmits the data supplied from the terminal processing unit 47 to the base station 6.
- the second wireless communication unit 42 has a communication interface circuit including an antenna whose sensitivity band is mainly the 2.4 GHz band, the 5 GHz band, or the like, and performs wireless communication with an access point of a wireless LAN (Local Area Network), not shown, based on the IEEE (The Institute of Electrical and Electronics Engineers, Inc.) 802.11 standard wireless communication system. Further, the frequency band of the second wireless communication unit 42 is not limited to the frequency bands described above. The second wireless communication unit 42 supplies the data received from the base station 6 to the terminal processing unit 47 and transmits the data supplied from the terminal processing unit 47 to the base station 6.
- the terminal storage unit 43 includes, for example, a semiconductor memory.
- the terminal storage unit 43 stores a driver program, an operating system program, an application program, data, and the like used for processing in the terminal processing unit 47.
- the terminal storage unit 43 controls the mobile phone communication device driver program that controls the first wireless communication unit 41, the wireless LAN communication device driver program that controls the second wireless communication unit 42, and the operation unit 44 as driver programs.
- An input device driver program, an output device driver program for controlling the display unit 45, and the like are stored.
- the terminal storage unit 43 stores, as an operating system program, a connection control program for executing a wireless communication scheme of the IEEE 802.11 standard, a connection control program for a mobile phone, and the like.
- the terminal storage unit 43 stores, as application programs, a web browser program that acquires and displays a web page, an e-mail program that transmits and receives e-mails, and the like.
- the computer programs may be installed in the terminal storage unit 43 from a computer-readable portable recording medium such as a CD-ROM (compact disc read only memory) or a DVD-ROM (digital versatile disc read only memory) using a known setup program or the like.
- the operation unit 44 may be any device that allows operation of the mobile terminal 4, for example a touch panel input device or a keypad.
- the owner can enter letters, numbers, etc. using this device.
- When operated by the owner, the operation unit 44 generates a signal corresponding to the operation.
- the generated signal is input to the terminal processing unit 47 as an instruction of the owner.
- the display unit 45 may be any device as long as it can output a moving image, a still image, and the like, such as a touch panel display device, a liquid crystal display, an organic EL (Electro-Luminescence) display, and the like.
- the display unit 45 displays a moving image corresponding to the moving image data supplied from the terminal processing unit 47, a still image corresponding to the still image data, and the like.
- the imaging unit 46 includes an imaging optical system, an imaging device, an image processing unit, and the like.
- the imaging optical system is an optical lens, for example, and forms an image of a light beam from a subject on the imaging surface of the imaging device.
- The imaging device is, for example, a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, and outputs an image signal of the subject image formed on the imaging surface.
- the image processing unit creates and outputs image data in a predetermined file format from the image signal generated by the image sensor.
- the terminal processing unit 47 has one or a plurality of processors and their peripheral circuits.
- the terminal processing unit 47 controls the overall operation of the mobile terminal 4 and is, for example, a CPU (Central Processing Unit).
- The terminal processing unit 47 controls the operations of the first wireless communication unit 41, the second wireless communication unit 42, the display unit 45, and the like so that the various processes of the mobile terminal 4 are executed in an appropriate procedure according to the programs stored in the terminal storage unit 43, the output from the operation of the operation unit 44, and the like.
- the terminal processing unit 47 executes processing based on programs (driver program, operating system program, application program, etc.) stored in the terminal storage unit 43. Further, the terminal processing unit 47 can execute a plurality of programs (such as application programs) in parallel.
- the terminal processing unit 47 includes a browsing execution unit 471, an acquisition unit 472, and a terminal transmission unit 473. Each of these units included in the terminal processing unit 47 is a functional module implemented by a program executed on a processor included in the terminal processing unit 47. Alternatively, these units included in the terminal processing unit 47 may be mounted on the portable terminal 4 as independent integrated circuits, microprocessors, or firmware.
- FIG. 6 is a diagram illustrating an example of a schematic configuration of the server 5.
- 7A to 7D are diagrams illustrating examples of data structures of various tables stored in the server storage unit 52.
- When the server 5 receives environmental information from the sensor terminal 2 or the sensor base station 3, the server 5 accumulates and manages the environmental information, and transmits the environmental information and its range information to the mobile terminal 4. In addition, when the server 5 receives a captured image of a pest or disease transmitted by the mobile terminal 4, the server 5 searches for the name of the pest or disease corresponding to the captured image, and transmits the retrieved pest or disease name to the mobile terminal 4 as the determination result.
- the server 5 includes a server communication unit 51, a server storage unit 52, and a server processing unit 53.
- the server communication unit 51 has a communication interface circuit for connecting the server 5 to the Internet 11.
- the server communication unit 51 receives the data transmitted from the sensor terminal 2 or the sensor base station 3 and the data transmitted from the portable terminal 4 and supplies each received data to the server processing unit 53.
- the server storage unit 52 includes, for example, at least one of a semiconductor memory, a magnetic disk device, and an optical disk device.
- the server storage unit 52 stores a driver program, an operating system program, an application program, data, and the like used for processing by the server processing unit 53.
- the server storage unit 52 stores a communication device driver program that controls the server communication unit 51 as a driver program.
- the computer program may be installed in the server storage unit 52 from a computer-readable portable recording medium such as a CD-ROM or DVD-ROM using a known setup program or the like.
- The server storage unit 52 stores, as data, the user management table shown in FIG. 7A, the sensor management table shown in FIG. 7B, the range management table shown in FIG. 7C, the evaluation history table shown in FIG. 7D, various image data related to screen display, and the like. Further, the server storage unit 52 may temporarily store temporary data related to a predetermined process.
- FIG. 7A shows a user management table for managing users.
- In the user management table, for each user, information such as the user identification number (user ID), the user name, the user mail address, and the identification numbers (sensor IDs) of the sensor terminal 2 and the sensor base station 3 is stored in association.
- the sensor terminal ID is an identification number of the sensor terminal 2 owned by the user.
- FIG. 7B shows a sensor management table for managing the sensor terminal 2 and the sensor base station 3.
- In the sensor management table, the identification number (sensor ID) of each sensor terminal 2 and sensor base station 3, the sensor position, the field identification number (field ID), the crop identification number (crop ID), the current growing season, the lower limit evaluation value, the environmental information for each record, and the like are stored in association with each other.
- the sensor position is the latitude and longitude acquired by the GPS unit of each sensor terminal 2 and sensor base station 3 transmitted from each sensor terminal 2 and sensor base station 3.
- the field ID is an identification number of the field where each sensor terminal 2 and sensor base station 3 is installed.
- the crop ID is an identification number of a crop cultivated in the field where each sensor terminal 2 and sensor base station 3 is installed.
- The growing seasons are a plurality of periods obtained by dividing the growing period, during which each crop is grown, according to a plurality of growing conditions, and a plurality of growing seasons is set for each type of crop.
- the growing period is, for example, a sowing period, a seedling period, an establishment period, or the like.
- the current growing season is the current growing season of the crop grown in the field where each sensor terminal 2 and sensor base station 3 is installed. Note that the current growing season is updated to the next growing season when the server 5 receives the growing season update request transmitted from the user.
- the lower limit evaluation value is the lower limit of the result value desired by the user among the result values of the crops grown in the field where each sensor terminal 2 and sensor base station 3 are installed. The lower limit evaluation value is set by the user, but may be a predetermined evaluation value.
- The environment information for each record is stored sequentially in order of measurement time for each sensor ID, based on the sensor ID, the environment information, and the measurement time transmitted from the sensor base station 3 at a predetermined server transmission cycle.
- The environment information of each measurement time is stored as one record for each sensor ID, and the records are stored in order as record 1, record 2, record 3, and so on.
- The environmental information is the temperature, humidity, soil temperature, soil moisture content, amount of solar radiation, soil electrical conductivity, soil pH, wind direction and speed, saturation deficit, dew point temperature, water level, water temperature, and/or CO2 concentration. Other environmental factors may also be included, and an integrated value of each environmental factor may be used.
- The growing season included in each record is the current growing season at the time when the environmental information is stored.
- FIG. 7C shows a range management table for managing range information of environment information.
- In the range management table, for each crop, the identification number (crop ID) of the crop, the range information of the environmental information necessary for harvesting the crop in each growing season, and the like are stored in association with each other.
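As a rough illustration of the range management table of FIG. 7C, it can be modeled as a nested mapping from crop ID and growing season to per-evaluation-value ranges. All identifiers, factor names, and values below are hypothetical, not taken from the specification:

```python
# Hypothetical sketch of the range management table (FIG. 7C):
# crop ID -> growing season -> evaluation value -> {factor: (low, high)}.
RANGE_TABLE = {
    "crop-001": {
        "seedling": {
            5: {"temperature": (20.0, 25.0), "humidity": (60.0, 75.0)},
            4: {"temperature": (18.0, 27.0), "humidity": (55.0, 80.0)},
        },
    },
}

def get_range_info(crop_id, season, evaluation_value):
    """Return the environmental range information registered for one
    crop, growing season, and evaluation value (None if unregistered)."""
    return RANGE_TABLE.get(crop_id, {}).get(season, {}).get(evaluation_value)
```

A lookup such as `get_range_info("crop-001", "seedling", 5)` would return the ranges required to harvest that crop at evaluation value 5 in the seedling season.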
- FIG. 7D shows an evaluation history table for managing evaluation values of crops grown in the past.
- In the evaluation history table, for each sensor terminal 2 and sensor base station 3, the identification number (sensor ID) of the sensor terminal 2 or sensor base station 3 and the evaluation history and the like of the crops harvested in the field where that sensor terminal 2 or sensor base station 3 is installed are stored in association with each other.
- the evaluation history includes the growing period of the crops harvested in the past, the evaluation value indicating the result value of the crops, and the like.
- The storage processing unit 535 may store, for each sensor ID, the evaluation history stored in the evaluation history table in association with one or both of the field ID and the crop ID stored in the sensor management table.
- the server processing unit 53 includes one or a plurality of processors and their peripheral circuits.
- the server processing unit 53 controls the overall operation of the server 5 in an integrated manner, and is, for example, a CPU.
- the server processing unit 53 controls the operation of the server communication unit 51 and the like so that various processes of the server 5 are executed in an appropriate procedure according to a program stored in the server storage unit 52 and the like.
- the server processing unit 53 executes processing based on programs (driver program, operating system program, application program, etc.) stored in the server storage unit 52. Further, the server processing unit 53 can execute a plurality of programs (such as application programs) in parallel.
- The server processing unit 53 includes a server reception unit 531, a registration unit 532, a screen creation unit 533, a server transmission unit 534, a storage processing unit 535, a warning unit 536, a correction unit 537, a specification unit 538, and an image determination unit 539.
- Each of these units included in the server processing unit 53 is a functional module implemented by a program executed on a processor included in the server processing unit 53.
- these units included in the server processing unit 53 may be implemented in the server 5 as independent integrated circuits, microprocessors, or firmware.
- FIG. 8 is a diagram illustrating an example of an operation sequence according to the agricultural management system 1.
- the operation sequence illustrated in FIG. 8 is an example of user information registration processing between the mobile terminal 4 and the server 5.
- the operation sequence described below is based on the programs stored in advance in the terminal storage unit 43 and the server storage unit 52, and the elements of the mobile terminal 4 and the server 5 are mainly performed by the terminal processing unit 47 and the server processing unit 53. It is executed in cooperation with.
- The browsing execution unit 471 of the mobile terminal 4 transmits the user information input by the user using the operation unit 44 to the server 5 via the second wireless communication unit 42 together with a user information registration request (step S101).
- the server reception unit 531 of the server 5 receives the user information transmitted from the mobile terminal 4 via the server communication unit 51.
- The user information includes the user's name, the user's e-mail address, the sensor IDs of the sensor terminal 2 and the sensor base station 3 owned by the user, the field ID corresponding to each sensor ID, the crop ID, the current growing season, the lower limit evaluation value, and the range information of the environmental information necessary for harvesting each crop in each growing season.
- the server 5 may assign a unique ID to the field ID based on the field name input by the user using the operation unit 44. Further, the server 5 may assign a unique ID to the crop ID based on the type of the crop input by the user using the operation unit 44.
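One way the server 5 might assign such unique IDs is a simple counter keyed by the user-entered name. The ID format ("field-001", ...) and function name are illustrative assumptions only:

```python
import itertools

# Minimal sketch of assigning unique field IDs to user-entered field
# names, as the server 5 may do at registration time. The "field-NNN"
# format is an assumption for illustration, not from the specification.
_counter = itertools.count(1)
_field_ids = {}

def assign_field_id(field_name):
    """Return a stable unique ID for a field name, creating one on first use."""
    if field_name not in _field_ids:
        _field_ids[field_name] = "field-%03d" % next(_counter)
    return _field_ids[field_name]
```

The same scheme would apply to crop IDs derived from the entered crop type.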
- the registration unit 532 executes user information registration processing for registering the user information received by the server reception unit 531 in various tables recorded in the server storage unit 52 (step S102).
- the screen creation unit 533 creates management screen display data including user information registered by the registration unit 532.
- the server transmission unit 534 sends the created management screen display data to the mobile terminal 4 via the server communication unit 51 (step S103).
- The browsing execution unit 471 of the portable terminal 4 displays a management screen (not shown) including the registered user information based on the management screen display data received via the second wireless communication unit 42 (step S104).
- FIG. 9 is a diagram illustrating an example of an operation sequence according to the agricultural management system 1.
- the operation sequence shown in FIG. 9 is an example of environment information storage processing among the sensor terminal 2, the sensor base station 3, and the server 5.
- the operation sequence described below is mainly based on programs stored in the sensor terminal storage unit 22, the base station storage unit 33, and the server storage unit 52 in advance, and the sensor terminal processing unit 26, the base station processing unit 37, and the server. It is executed by the processing unit 53 in cooperation with each element of the sensor terminal 2, the sensor base station 3, and the server 5.
- the measurement information acquisition unit 261 of the sensor terminal 2 acquires the environmental information indicating the environmental factor measured by the sensor unit 25 from the sensor unit 25 via the sensor connection unit 24 at a predetermined measurement cycle, and the GPS unit. 23, time information output periodically is acquired.
- the measurement information acquisition unit 261 associates the environmental information acquired from the sensor unit 25 and the time information acquired from the GPS unit 23 with the sensor ID for identifying the sensor terminal 2 in the sensor terminal storage unit 22.
- the measurement information transmission unit 262 transmits the sensor ID, environment information, and measurement time stored in the sensor terminal storage unit 22 to the sensor base station 3 via the sensor terminal communication unit 21 at a predetermined transmission cycle. (Step S201).
- the measurement information acquisition unit 371 of the sensor base station 3 acquires the environmental information indicating the environmental factor measured by the sensor unit 36 from the sensor unit 36 via the sensor connection unit 35 at a predetermined measurement cycle, and GPS The time information periodically output from the unit 34 is acquired.
- the measurement information acquisition unit 371 associates the environmental information acquired from the sensor unit 36 and the time information acquired from the GPS unit 34 in association with the sensor ID for identifying the sensor base station 3 into the base station storage unit 33.
- the environment information receiving unit 372 receives the sensor ID, the environment information, and the measurement time transmitted from the sensor terminal 2 at a predetermined transmission cycle via the first base station communication unit 31.
- the environment information receiving unit 372 stores the received sensor ID, environment information, and measurement time in the base station storage unit 33.
- the environment information transmission unit 373 transmits the sensor ID, environment information, and measurement time stored in the base station storage unit 33 to the server 5 via the second base station communication unit 32 at a predetermined server transmission cycle. (Step S202).
- the server reception unit 531 of the server 5 receives the sensor ID, environment information, and measurement time transmitted from the sensor base station 3 at a predetermined server transmission cycle via the server communication unit 51. Then, the storage processing unit 535 sequentially stores the received measurement time and environment information, one record at a time, in the sensor management table of the server storage unit 52 in association with the received sensor ID (step S203).
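The storage processing of steps S201 to S203 amounts to appending each received (measurement time, environment information) pair as one record under the transmitting sensor's ID. A minimal sketch, with illustrative field names and an ISO-8601 timestamp assumption:

```python
from collections import defaultdict

# Sketch of step S203: the storage processing unit appends each received
# (measurement time, environment information) pair as one record keyed by
# the sensor ID, keeping records ordered by measurement time.
sensor_records = defaultdict(list)  # sensor ID -> [record 1, record 2, ...]

def store_environment_info(sensor_id, measured_at, env_info):
    """Append one record and keep the sensor's records in time order."""
    sensor_records[sensor_id].append({"time": measured_at, "env": env_info})
    sensor_records[sensor_id].sort(key=lambda r: r["time"])
```

Out-of-order arrivals (e.g. delayed relays through the sensor base station 3) would still end up stored in measurement-time order.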
- FIG. 10 is a diagram illustrating an example of an operation sequence according to the agricultural management system 1.
- the operation sequence illustrated in FIG. 10 is an example of range information registration processing between the mobile terminal 4 and the server 5.
- the operation sequence described below is based on the programs stored in advance in the terminal storage unit 43 and the server storage unit 52, and the elements of the mobile terminal 4 and the server 5 are mainly performed by the terminal processing unit 47 and the server processing unit 53. It is executed in cooperation with.
- The browsing execution unit 471 of the mobile terminal 4 transmits the crop ID identifying the crop input by the user using the operation unit 44 and the range information of the environmental information necessary for harvesting the crop at each evaluation value to the server 5 via the second wireless communication unit 42 together with a registration request for the range information of the environmental information (step S301).
- the range information of the environmental information necessary for harvesting the crop of each evaluation value may be referred to as the range information of the environmental information corresponding to each evaluation value.
- the server reception unit 531 of the server 5 receives the crop ID transmitted from the mobile terminal 4 and the range information of the environment information corresponding to each evaluation value via the server communication unit 51. Then, the registration unit 532 associates the crop ID with the range information of the environment information corresponding to each evaluation value and records it in the range management table of the server storage unit 52 (step S302).
- The screen creation unit 533 creates management screen display data including the crop indicated by the crop ID recorded in the range management table by the registration unit 532 and the range information of the environmental information corresponding to that crop and to each evaluation value. Then, when the management screen display data is created by the screen creation unit 533, the server transmission unit 534 sends the created management screen display data to the mobile terminal 4 via the server communication unit 51 (step S303).
- The browsing execution unit 471 of the mobile terminal 4 displays a management screen (not shown) including the registered range information of the environmental information based on the management screen display data received via the second wireless communication unit 42 (step S304).
- FIG. 11 is a diagram illustrating an example of an operation sequence according to the agricultural management system 1.
- the operation sequence shown in FIG. 11 is an example of a warning mail output process between the mobile terminal 4 and the server 5.
- the operation sequence described below is based on the programs stored in advance in the terminal storage unit 43 and the server storage unit 52, and the elements of the mobile terminal 4 and the server 5 are mainly performed by the terminal processing unit 47 and the server processing unit 53. It is executed in cooperation with.
- The warning unit 536 and the screen creation unit 533 of the server 5 execute a warning mail output process at a predetermined warning cycle (step S401). Details of the warning mail output process will be described later. Then, when the warning mail output process is executed by the warning unit 536 and the screen creation unit 533, the server transmission unit 534 receives the warning mail from the warning unit 536 and transmits it to the mobile terminal 4 indicated by the transmission destination address included in the warning mail (step S402).
- FIG. 12 is a diagram illustrating an example of an operation flow of warning mail output processing by the warning unit 536 and the screen creation unit 533.
- the warning mail output process shown in FIG. 12 is executed in step S401 in FIG.
- the warning unit 536 refers to the sensor management table stored in the server storage unit 52, and acquires the latest record of environment information for each sensor ID at a predetermined warning cycle (step S501).
- the warning unit 536 refers to the sensor management table stored in the server storage unit 52, and extracts the crop ID and the lower limit evaluation value associated with each sensor ID for each sensor ID.
- The warning unit 536 refers to the range management table stored in the server storage unit 52 and acquires the range information of the environmental information for evaluation values equal to or higher than the lower limit evaluation value in the current growing season associated with the extracted crop ID (for example, if the lower limit evaluation value is 4, the range information of the environmental information for the evaluation values 4 and 5) (step S502).
- the lower limit evaluation value is an example of a predetermined evaluation value.
- the warning unit 536 determines whether or not the environment information of the latest record is within the range information of the environment information equal to or higher than the lower limit evaluation value corresponding to the crop ID associated with each sensor ID. Determination is made (step S503).
- If the warning unit 536 determines that the environment information of the latest record of each sensor ID is within the range information of the environment information at or above the lower limit evaluation value corresponding to the crop ID associated with that sensor ID (step S503-Yes), the process proceeds to step S506 described later.
- If the warning unit 536 determines that the environment information of the latest record of a sensor ID is not within that range information (step S503-No), the following processing is performed.
- the warning unit 536 creates environmental suitability information indicating an environment necessary for harvesting the crop of the lower limit evaluation value.
- the environmental compatibility information is information indicating a difference between the environmental information of the latest record and the range information of the environmental information equal to or higher than the lower limit evaluation value.
- the screen creation unit 533 creates management screen display data including the environmental suitability information created by the warning unit 536 (step S504).
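Since the environmental suitability information indicates the difference between the latest record and the acceptable range, it can be sketched as a per-factor gap: zero when the measurement lies within the range, and a signed distance otherwise. The exact representation is not fixed by the specification; the shapes below are assumptions:

```python
def suitability_info(latest_env, range_info):
    """Sketch of the environmental suitability information: for each
    environmental factor, how far the latest measurement lies outside
    the acceptable range (0.0 when within range)."""
    diff = {}
    for factor, (low, high) in range_info.items():
        value = latest_env[factor]
        if value < low:
            diff[factor] = value - low    # negative: below the range
        elif value > high:
            diff[factor] = value - high   # positive: above the range
        else:
            diff[factor] = 0.0
    return diff
```

For example, a latest temperature of 17.0 against a range of (20.0, 25.0) would yield a gap of -3.0, telling the user the environment is 3 degrees too cold.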
- the warning unit 536 creates a warning mail including information indicating that the current environmental information is not within the range information of the environmental information (step S505).
- the warning unit 536 refers to the user management table, specifies the user ID associated with the sensor ID corresponding to the environment information of the latest record determined not to be within the range information of the environment information of the lower limit evaluation value, An email address associated with the identified user ID is acquired.
- the warning unit 536 creates a warning mail in which the acquired mail address is set as the transmission destination, and information indicating that the current environment information is not within the range information of the environment information is set as the text. Then, the warning unit 536 passes the created warning mail to the server transmission unit 534.
- The warning unit 536 may include, in the warning mail, a URL (Uniform Resource Locator) indicating the storage location of the management screen display data including the environmental suitability information. Thereby, the user who has received the warning mail can display a management screen including the environmental suitability information on the mobile terminal 4.
- the warning unit 536 determines whether or not the processes in steps S503 to S504 have been executed for all the sensor IDs stored in the sensor management table (step S506).
- If the warning unit 536 determines that the processes in steps S503 to S504 have not been executed for all sensor IDs (step S506-No), the process returns to step S503. If the warning unit 536 determines that the processes in steps S503 to S504 have been executed for all sensor IDs (step S506-Yes), the series of steps is terminated.
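The loop of steps S501 to S506 can be sketched end to end: for each sensor, compare the latest record against every acceptable range at or above the lower limit evaluation value, and collect the sensors that match none of them as warning targets. All data shapes and names are illustrative assumptions:

```python
def warning_targets(latest_by_sensor, sensor_meta, range_lookup):
    """Sketch of steps S501-S506: return the sensor IDs whose latest
    environment information falls outside every acceptable range at or
    above the lower limit evaluation value (warning-mail candidates)."""
    out_of_range = []
    for sensor_id, latest_env in latest_by_sensor.items():
        meta = sensor_meta[sensor_id]  # crop ID, current season, lower limit
        acceptable = range_lookup(meta["crop_id"], meta["season"],
                                  meta["lower_limit"])
        ok = any(
            all(low <= latest_env[f] <= high for f, (low, high) in ranges.items())
            for ranges in acceptable
        )
        if not ok:
            out_of_range.append(sensor_id)
    return out_of_range
```

Here `range_lookup` stands in for the range management table query of step S502, returning one range dictionary per evaluation value at or above the lower limit.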
- FIG. 13 is a diagram illustrating an example of an operation sequence according to the agricultural management system 1.
- the operation sequence illustrated in FIG. 13 is an example of range information correction processing between the mobile terminal 4 and the server 5.
- the operation sequence described below is based on the programs stored in advance in the terminal storage unit 43 and the server storage unit 52, and the elements of the mobile terminal 4 and the server 5 are mainly performed by the terminal processing unit 47 and the server processing unit 53. It is executed in cooperation with.
- When the user who owns the mobile terminal 4 inputs, using the operation unit 44, that one of the growing seasons of a crop has ended, the browsing execution unit 471 of the mobile terminal 4 transmits a growing season update request including the user ID and the sensor ID to the server 5 via the second wireless communication unit 42 (step S601).
- Hereinafter, the growing season that has ended is referred to as the current growing season, and the growing season set by the growing season update request is referred to as the next growing season.
- the sensor ID is the sensor ID of the sensor terminal 2 and the sensor base station 3 installed in the field where the crops in which the growing season has ended are grown.
- When the server reception unit 531 of the server 5 receives the growing season update request, the correction unit 537 and the screen creation unit 533 execute range information correction processing (step S602). Details of the range information correction processing will be described later.
- the server transmission unit 534 receives the management screen display data from the screen creation unit 533, and sends the management screen display data to the portable terminal 4. Transmit (step S603).
- FIG. 14 is a diagram illustrating an example of an operation flow of range information correction processing by the correction unit 537 and the screen creation unit 533.
- the range information correction process shown in FIG. 14 is executed in step S602 in FIG.
- The correction unit 537 refers to the sensor management table and acquires, from the environmental information of each record associated with the sensor ID included in the growing season update request received by the server reception unit 531, the environmental information of the records in the current growing season (step S701).
- The correction unit 537 refers to the sensor management table to extract the crop ID and the lower limit evaluation value associated with the sensor ID included in the growing season update request, and refers to the range management table to acquire the range information of the environmental information at or above the lower limit evaluation value in the current growing season associated with the extracted crop ID (step S702).
- the range information of the environmental information equal to or higher than the lower limit evaluation value in the current growing season associated with the extracted crop ID may be referred to as target range information.
- the correcting unit 537 determines whether or not the average value of the environmental information of the record in the current growing season is within the target range information (step S703).
- The correction unit 537 may determine that the environmental information of the records in the current growing season is within the target range information when the number of environmental-information records included in the target range information is equal to or greater than a predetermined number.
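This count-based alternative criterion can be sketched directly: count the records whose every factor falls inside the target range, and compare against the threshold. Record shapes are illustrative assumptions:

```python
def within_target_by_count(records, target_range, min_records):
    """Sketch of the alternative criterion: the current growing season is
    treated as within the target range information when at least
    min_records of its environment-information records fall inside it."""
    hits = sum(
        1 for env in records
        if all(low <= env[f] <= high for f, (low, high) in target_range.items())
    )
    return hits >= min_records
```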
- If the average value of the environmental information of the records in the current growing season is within the target range information (step S703-Yes), the correction unit 537 advances the processing to step S706.
- If the average value is not within the target range information (step S703-No), the correction unit 537 extracts an evaluation value lower than the lower limit evaluation value extracted in step S702, and stores the extracted evaluation value in the sensor management table as the lower limit evaluation value associated with the sensor ID included in the growing season update request (step S704).
- The lower limit evaluation value and the evaluation value lower than the lower limit evaluation value are examples of the predetermined evaluation value and the second evaluation value, respectively. Note that the second evaluation value is not limited to an evaluation value lower than the lower limit evaluation value.
- That is, the correction unit 537 determines whether the average value of the environmental information of the records in the current growing season is included in the range information of the environmental information at or above the lower limit evaluation value acquired in step S702, and when it is, sets the evaluation value corresponding to that environmental information as the second evaluation value. In this way, a second evaluation value different from the predetermined evaluation value is extracted according to the environment information and stored in the sensor management table.
- The correction unit 537 acquires the range information of the environmental information of the second evaluation value in the next growing season associated with the crop ID extracted in step S702, and the screen creation unit 533 creates management screen display data for displaying a management screen including the acquired range information of the environmental information (step S705). Then, the screen creation unit 533 passes the created management screen display data to the server transmission unit 534 and ends the series of steps.
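The core of steps S703 to S704 is an average-against-range comparison followed by lowering the lower limit evaluation value when the season missed its target. A minimal sketch, assuming numeric per-factor records and a one-step decrement (both illustrative choices; the specification only requires "an evaluation value lower than the lower limit evaluation value"):

```python
def correct_lower_limit(season_records, target_range, lower_limit):
    """Sketch of steps S703-S704: compare the season's average environment
    information with the target range information; if the average falls
    outside, lower the lower limit evaluation value by one step."""
    avg = {
        f: sum(r[f] for r in season_records) / len(season_records)
        for f in target_range
    }
    within = all(low <= avg[f] <= high for f, (low, high) in target_range.items())
    return lower_limit if within else lower_limit - 1
```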
- the warning unit 536 performs the warning mail output process described above using an evaluation value lower than the lower limit evaluation value.
- FIG. 15 is a diagram illustrating an example of an operation sequence according to the agricultural management system 1.
- the operation sequence illustrated in FIG. 15 is an example of range information update processing between the mobile terminal 4 and the server 5.
- In the operation sequence described below, the elements of the mobile terminal 4 and the server 5 execute the sequence in cooperation with each other, mainly under the control of the terminal processing unit 47 and the server processing unit 53, based on programs stored in advance in the terminal storage unit 43 and the server storage unit 52.
- Using the operation unit 44, the user inputs the sensor IDs of the sensor terminal 2 and the sensor base station 3 installed in the field where the harvested crop was grown, the crop ID of the harvested crop, the evaluation value, and the growing period.
- the acquisition unit 472 of the mobile terminal 4 acquires the input sensor ID, the crop ID, the evaluation value, and the growing period.
- The terminal transmission unit 473 transmits, to the server 5 via the second wireless communication unit 42, harvest data including the user ID of the user who owns the mobile terminal 4 and the sensor ID, the crop ID, the evaluation value, and the growing period acquired by the acquisition unit 472 (step S801).
- When the server reception unit 531 of the server 5 receives the harvest data transmitted from the mobile terminal 4 via the server communication unit 51, the specifying unit 538 and the screen creation unit 533 execute the range information update process (step S802). Details of the range information update process will be described later.
- The server transmission unit 534 receives the management screen display data from the screen creation unit 533 and transmits the management screen display data to the mobile terminal 4 (step S803).
- FIG. 16 is a diagram illustrating an example of an operation flow of range information update processing by the specifying unit 538 and the screen creation unit 533.
- the range information update process shown in FIG. 16 is executed in step S802 in FIG.
- The specifying unit 538 refers to the sensor management table and acquires, for each growing period included in each record, the environmental information of all records associated with the sensor ID and the crop ID included in the harvest data received by the server reception unit 531 (step S901).
- The environment information acquired in step S901 relates to the sensor ID included in the harvest data received by the server reception unit 531; alternatively, the specifying unit 538 may acquire the environment information of all records that are associated with the crop ID across all sensor IDs stored in the server storage unit 52.
- Alternatively, the specifying unit 538 may classify the fields into a plurality of groups based on the latitude and longitude, altitude, climatic conditions, soil type, and the like of each field, and acquire the environment information of all records that are associated with the crop ID and with the sensor IDs corresponding to fields classified into the same group as the field corresponding to the sensor ID included in the harvest data received by the server reception unit 531.
- The specifying unit 538 acquires the sensor ID, the crop evaluation value, and the growing period included in the harvest data received by the server reception unit 531, performs an update process that stores the acquired evaluation value and growing period in the evaluation history table in association with the acquired sensor ID and crop ID, and then refers to the updated evaluation history table to acquire all evaluation histories associated with the sensor ID included in the received harvest data (step S902).
- the specifying unit 538 associates all the environmental information acquired in step S901 with evaluation values associated with the growing period including the measurement time of the environmental information (step S903).
- a data set in which an evaluation value is associated with a plurality of environment information is referred to as an environment information history D.
- the identifying unit 538 selects environmental information having a large influence on the evaluation value from the plurality of environmental information histories D (step S904).
- The specifying unit 538 performs principal component analysis using the environment information in the plurality of environment information histories D as variables, and calculates the principal component loading (factor loading) of each variable on the first principal component. Then, the specifying unit 538 selects the variables (environment information) whose principal component loading is larger than a predetermined value. Note that the specifying unit 538 may consider not only the principal component loadings on the first principal component but also those on the second principal component, selecting variables whose loading on either component is larger than a predetermined value.
- When no environment information having a large influence is selected in step S904, the range information update process may be terminated without executing the subsequent steps S905 to S907. Further, when there are few types of environment information in the environment information history D (for example, three or fewer types), the process may advance to step S905 without executing step S904.
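A minimal sketch of the loading-based variable selection in step S904, assuming the environment information history D is arranged as a records-by-variables matrix; the 0.5 loading threshold is an illustrative choice, not a value from the patent:

```python
import numpy as np

def select_influential_variables(D, threshold=0.5):
    """Return indices of variables whose absolute loading on the
    first principal component exceeds the threshold."""
    Dc = D - D.mean(axis=0)                 # center each variable
    cov = np.cov(Dc, rowvar=False)          # covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues ascending
    pc1 = eigvecs[:, -1]                    # first principal component
    # Principal component loading = eigenvector component scaled by
    # sqrt(eigenvalue), normalized by each variable's std. deviation
    # (i.e. the correlation between the variable and the PC1 score).
    loadings = pc1 * np.sqrt(eigvals[-1]) / Dc.std(axis=0, ddof=1)
    return np.where(np.abs(loadings) > threshold)[0]

# Example: two strongly correlated variables and one small-noise variable.
t = np.linspace(0.0, 1.0, 100)
rng = np.random.default_rng(0)
D_example = np.column_stack([t, t + 0.01 * rng.standard_normal(100),
                             0.01 * rng.standard_normal(100)])
selected = select_influential_variables(D_example)
```

On this example the first two variables load heavily on the first principal component and are selected, while the noise variable is not.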
- the specifying unit 538 determines whether or not there is a specific correlation between the evaluation value in the selected environment information history D and the environment information (step S905).
- the correlation between the evaluation value and the environmental information will be described.
- FIG. 17 is a schematic diagram in which the environment information history D is mapped onto a two-dimensional plane with the evaluation value and the environment information as axes.
- the environmental information shown in FIG. 17 is an integrated value of soil temperature, but may be an integrated value of other environmental factors (such as temperature and solar radiation).
- Crops with a high evaluation value are harvested at an accumulated soil temperature of around 2,300 degrees; as the accumulated soil temperature decreases below 2,300 degrees, and likewise as it increases above 2,300 degrees, crops with lower evaluation values are harvested.
- The correlation coefficient R_xy is obtained by the following equation (1), where n is the number of data points in the environment information history D, x_i and y_i are the environment information and the evaluation value of the i-th data point, and X and Y are the average values of x_i and y_i, respectively:
  R_xy = Σ_{i=1}^{n} (x_i − X)(y_i − Y) / √( Σ_{i=1}^{n} (x_i − X)² · Σ_{i=1}^{n} (y_i − Y)² ) … (1)
- The regression line is expressed as y = a x + b … (2), where a and b are represented by the following equations (3) and (4), respectively:
  a = Σ_{i=1}^{n} (x_i − X)(y_i − Y) / Σ_{i=1}^{n} (x_i − X)² … (3)
  b = Y − a X … (4)
- In step S905, the specifying unit 538 calculates a regression line indicating a positive correlation between the evaluation value and the environment information and a regression line indicating a negative correlation, and determines that there is a specific correlation between the evaluation value and the environment information when the absolute value of the correlation coefficient R_xy for each calculated regression line is 0.8 or more.
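The correlation test of step S905 can be sketched with the standard Pearson correlation coefficient (equation (1)) and the 0.8 threshold stated in the text; the function names and sample data are illustrative:

```python
import math

def correlation(xs, ys):
    """Pearson correlation coefficient R_xy between environment
    information xs and evaluation values ys."""
    n = len(xs)
    X, Y = sum(xs) / n, sum(ys) / n
    num = sum((x - X) * (y - Y) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - X) ** 2 for x in xs)
                    * sum((y - Y) ** 2 for y in ys))
    return num / den

def has_specific_correlation(xs, ys, threshold=0.8):
    # "Specific correlation" per the text: |R_xy| of 0.8 or more.
    return abs(correlation(xs, ys)) >= threshold

# Rising side of the data in FIG. 17: evaluation values grow with
# accumulated soil temperature up to the peak.
result = has_specific_correlation([1800, 1900, 2000, 2100, 2200],
                                  [1, 2, 3, 4, 5])
```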
- When the specifying unit 538 determines that there is no specific correlation between the evaluation value and the environment information (step S905-No), the process proceeds to step S907.
- When the specifying unit 538 determines that there is a specific correlation between the evaluation value and the environment information (step S905-Yes), the specifying unit 538 calculates the lower limit value of the environment information corresponding to each evaluation value according to the regression line indicating the positive correlation, and calculates the upper limit value of the environment information corresponding to each evaluation value according to the regression line indicating the negative correlation.
- the specifying unit 538 updates and stores the lower limit value and the upper limit value corresponding to each calculated evaluation value as the range information of the environment information in the range management table stored in the server storage unit 52 (step S906).
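A sketch of how the lower and upper limits stored in step S906 could be derived, assuming least-squares regression lines y = a·x + b (equations (2) to (4)) fitted separately to the rising (positive-correlation) and falling (negative-correlation) sides of the data, then inverted to x = (y − b) / a; the sample data are illustrative:

```python
def fit_line(xs, ys):
    """Least-squares regression line y = a*x + b (equations (3), (4))."""
    n = len(xs)
    X, Y = sum(xs) / n, sum(ys) / n
    a = (sum((x - X) * (y - Y) for x, y in zip(xs, ys))
         / sum((x - X) ** 2 for x in xs))
    b = Y - a * X
    return a, b

def range_information(pos_xy, neg_xy, evaluations):
    """For each evaluation value, the lower limit comes from the
    positive-correlation line and the upper limit from the
    negative-correlation line."""
    a_pos, b_pos = fit_line(*pos_xy)
    a_neg, b_neg = fit_line(*neg_xy)
    return {e: ((e - b_pos) / a_pos, (e - b_neg) / a_neg)
            for e in evaluations}

ranges = range_information(
    ([1800, 2000, 2200], [1, 3, 5]),   # rising side: low temp, low score
    ([2400, 2600, 2800], [5, 3, 1]),   # falling side: high temp, low score
    evaluations=[3, 5],
)
```

With these illustrative points, evaluation value 5 maps to the accumulated-temperature range (2200, 2400), matching the peak near 2,300 degrees in FIG. 17.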
- the screen creation unit 533 creates management screen display data for displaying a management screen including a notification that the range information of the environment information in the range management table has been updated (step S907). Then, the screen creation unit 533 passes the created management screen display data to the server transmission unit 534 and ends the series of steps.
- The specifying unit 538 may instead determine whether or not there is a specific correlation between the first principal component score obtained in the principal component analysis performed in step S904 and the evaluation value. First, the specifying unit 538 performs principal component analysis using the environment information in the plurality of environment information histories D as variables, and calculates the first principal component score corresponding to each variable (environment information). Then, in equation (1) above, the specifying unit 538 sets x_i to the first principal component score, y_i to the evaluation value associated with the variables (environment information) corresponding to that first principal component score, and X and Y to the average values of x_i and y_i, and calculates the correlation coefficient R_xy.
- The specifying unit 538 calculates a regression line indicating a positive correlation and a regression line indicating a negative correlation between the evaluation value and the first principal component score by equations (2) to (5) above, and determines that there is a specific correlation between the evaluation value and the first principal component score when the absolute value of the correlation coefficient R_xy for each calculated regression line is 0.8 or more.
- When the specifying unit 538 determines that there is a specific correlation between the evaluation value and the first principal component score, it calculates the lower limit value of the first principal component score corresponding to each evaluation value according to the regression line indicating the positive correlation, and calculates the upper limit value of the first principal component score corresponding to each evaluation value according to the regression line indicating the negative correlation.
- The specifying unit 538 multiplies the lower limit value and the upper limit value calculated for each evaluation value by the inverse matrix of the eigenvectors calculated in the principal component analysis to obtain the lower limit value and the upper limit value of each variable (environment information) selected in step S904, and updates and stores them as the range information of each piece of environment information in the range management table stored in the server storage unit 52. This makes it possible to calculate range information for a plurality of pieces of environment information that affect the evaluation value of the harvested crop.
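The inverse-eigenvector mapping described above can be sketched as follows; the eigenvector matrix and the principal-component-score bounds are illustrative values, and since the eigenvector matrix from a principal component analysis is orthogonal, its inverse equals its transpose (np.linalg.inv is used to keep the sketch general):

```python
import numpy as np

# Illustrative orthogonal eigenvector matrix (columns = principal components).
eigvecs = np.array([[0.8, -0.6],
                    [0.6,  0.8]])

# Lower/upper limit values expressed in principal-component coordinates.
pc_lower = np.array([1.0, 0.0])
pc_upper = np.array([2.0, 0.0])

# Multiply by the inverse of the eigenvector matrix to express the
# bounds in terms of the original variables (environment information).
inv = np.linalg.inv(eigvecs)
var_lower = inv @ pc_lower   # per-variable lower limits
var_upper = inv @ pc_upper   # per-variable upper limits
```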
- As described above, the farm management system 1 enables the server 5 to provide farm management information suitable for each field on the basis of the evaluation values of crops harvested in the past and the history of environmental information. Therefore, the server 5 can manage an environment suitable for the growth of crops for each field.
- the screen creation unit 533 of the server 5 plots the field environment in which the values obtained by integrating the soil temperature and the solar radiation amount of each record stored in the sensor management table are plotted on a two-dimensional plane with the integrated soil temperature and the solar radiation amount as axes. Display data for displaying a screen including a characteristic curve may be created, and the server transmission unit 534 may transmit the display data to the mobile terminal 4 via the server communication unit 51.
- FIG. 18 is a schematic diagram showing a field environment characteristic curve displayed on a two-dimensional plane with the accumulated soil temperature and the accumulated solar radiation as axes.
- FIG. 18 shows the field environment characteristic curve of an X experimental field where a specific crop with an evaluation value of 5 was harvested, and the field environment characteristic curve of a Z field in Y town where the same specific crop is grown. This makes it possible to visually recognize the discrepancy between the current field environment characteristic curve of the Z field in Y town and the field environment characteristic curve of the X experimental field.
- FIG. 19 is a diagram illustrating an example of an operation sequence according to the agricultural management system 1.
- the operation sequence illustrated in FIG. 19 is an example of an image determination process between the mobile terminal 4 and the server 5.
- The operation sequence shown in FIG. 19 is an example of an image determination system that performs the image determination method.
- In the operation sequence described below, the elements of the mobile terminal 4 and the server 5 execute the sequence in cooperation with each other, mainly under the control of the terminal processing unit 47 and the server processing unit 53, based on programs stored in advance in the terminal storage unit 43 and the server storage unit 52.
- the server storage unit 52 stores a pest and disease image management table in which a plurality of pest and disease images, pest and disease names, crop names, generation sites, and generation environment information are stored in association with each other.
- the name of the crop is the name of the crop where the pest damage has occurred.
- A plurality of pest and disease images are stored for each pest and disease name.
- the occurrence site is the site of the crop where the pest damage has occurred.
- the occurrence environment information is data indicating a range of environment information or a field condition in which pest damage is likely to occur. Further, the type of environment information in the generated environment information may be one type or a plurality of types.
- the acquisition unit 472 acquires a photographed image of a disease or pest obtained by photographing by the photographing unit 46 in accordance with an operation using the operation unit 44 by a user of the mobile terminal 4 (step S1001).
- the photographed pest damage is a pest that inhabits the field operated by the user who owns the mobile terminal 4 or a disease symptom of a crop grown in the field.
- The terminal transmission unit 473 of the mobile terminal 4 transmits, to the server 5 via the second wireless communication unit 42, an image determination request including the captured image of the pest or disease obtained by the photographing of the photographing unit 46 of the mobile terminal 4 and the user ID of the user who owns the mobile terminal 4 (step S1002).
- When the server reception unit 531 of the server 5 receives the image determination request transmitted from the mobile terminal 4 via the server communication unit 51, the image determination unit 539 and the screen creation unit 533 execute the captured image determination process (step S1003). Details of the captured image determination process will be described later.
- The server transmission unit 534 receives the determination result screen display data from the screen creation unit 533 and transmits the determination result screen display data to the mobile terminal 4 (step S1004).
- The browsing execution unit 471 of the mobile terminal 4 displays a determination result screen (not shown) containing the determination result name, based on the determination result screen display data received via the second wireless communication unit 42 (step S1005).
- FIG. 20 is a diagram illustrating an example of an operation flow of a captured image determination process performed by the image determination unit 539 and the screen creation unit 533.
- the captured image determination process shown in FIG. 20 is executed in step S1003 of FIG.
- the image determination unit 539 acquires a plurality of pest images stored in the pest image management table, and uses a local feature extraction method such as the SURF method for each of the acquired plurality of pest images. A plurality of feature points are extracted, and the local feature amount of each feature point is calculated (step S1101).
- The local feature extraction method is not limited to the SURF method; a SIFT (Scale-Invariant Feature Transform) method or the like may be used.
- The local feature amount calculated by the local feature amount extraction method is, for example, 128-dimensional, but is not limited to this; it may be 64-dimensional or have another number of dimensions.
- the image determination unit 539 expresses each disease and pest damage image with a feature vector based on the plurality of local feature amounts calculated for each of the plurality of disease and pest damage images (step S1102).
- the process of expressing the pest damage image with the feature vector is executed using a feature vector expression method such as a Fisher vector.
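A real Fisher vector encodes descriptor statistics against a Gaussian mixture model; as a much simpler stand-in that still illustrates how a variable number of local feature descriptors becomes one fixed-length feature vector, this sketch pools the per-dimension mean and standard deviation of the descriptors (this is an illustrative simplification, not the patent's Fisher vector):

```python
import numpy as np

def descriptors_to_feature_vector(descriptors):
    """Turn n local descriptors of dimension d into one fixed-length
    vector of dimension 2*d by concatenating per-dimension mean and
    standard deviation. A Fisher vector would instead encode gradients
    with respect to a Gaussian mixture model."""
    d = np.asarray(descriptors, dtype=float)   # shape: (n_points, dims)
    return np.concatenate([d.mean(axis=0), d.std(axis=0)])

# e.g. five 4-dimensional local descriptors -> one 8-dimensional vector
fv = descriptors_to_feature_vector(np.ones((5, 4)))
```

The fixed length is what allows images with different numbers of feature points to be compared by a single classification model.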
- the image determination unit 539 generates a classification model based on the feature vector representing each of the plurality of pest damage images (step S1103).
- The classification model is, for example, a plurality of decision trees in a Random Forest. Alternatively, the classification model may be the objective function of a Soft Margin Support Vector Machine into which a slack variable ξ is introduced, trained using the feature vectors representing each of the plurality of pest and disease images as learning data.
- the image determination unit 539 extracts a plurality of feature points from the captured image included in the image determination request received by the server reception unit 531 using a local feature amount extraction method, and outputs each feature point. Is calculated (step S1104).
- the image determination unit 539 represents the captured image with a feature vector based on the plurality of local feature amounts calculated for the captured image (step S1105).
- The image determination unit 539 assigns points to the pest and disease names associated with the pest and disease images whose feature vectors are similar to the feature vector representing the captured image (step S1106).
- In the Random Forest, the point is the number of votes for the pest and disease name corresponding to the pest and disease image determined in each of the plurality of decision trees.
- In the Soft Margin Support Vector Machine, the point is based on the parameter that controls the trade-off between the penalty on the slack variable ξ and the size of the margin.
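The Random Forest vote counting can be sketched as follows, with stand-in functions in place of learned decision trees (the tree outputs and pest names are illustrative):

```python
from collections import Counter

# Stand-in "decision trees": each maps a feature vector to a pest name.
trees = [
    lambda fv: "powdery mildew",
    lambda fv: "powdery mildew",
    lambda fv: "aphid",
]

def assign_points(feature_vector, forest):
    """Point for each pest name = its number of votes across the trees."""
    return Counter(tree(feature_vector) for tree in forest)

points = assign_points([0.1, 0.2], trees)
```

With these stand-in trees, "powdery mildew" receives two votes and "aphid" one.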
- The image determination unit 539 weights some of the points associated with the pest and disease names (step S1107).
- the image determination unit 539 refers to the user management table and acquires a sensor ID associated with the user ID included in the image determination request received by the server reception unit 531.
- the image determination unit 539 refers to the sensor management table, and acquires the environment information and the crop ID of the record within a predetermined period associated with the acquired sensor ID.
- the predetermined period is one year, but it may be the growing period of the crop corresponding to the crop ID associated with the sensor ID.
- The image determination unit 539 refers to the pest and disease image management table, extracts the occurrence environment information associated with the acquired crop ID, and determines whether or not the acquired environment information is included in the extracted occurrence environment information. If the image determination unit 539 determines that the acquired environment information is included in the extracted occurrence environment information, it refers to the pest and disease image management table, specifies the pest and disease name associated with the occurrence environment information that includes the acquired environment information, and weights the point of the specified pest and disease name. For example, the weighting process multiplies the point by 1.2.
- the image determination unit 539 compares the points associated with each disease and pest name, and determines the disease and pest name assigned with the highest point as the determination result name (step S1108).
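Steps S1107 and S1108 together can be sketched as follows, using the 1.2 weighting factor given in the text; the occurrence ranges, pest names, and point values are illustrative:

```python
# Illustrative occurrence environment information: temperature range in
# which each pest or disease is likely to occur.
occurrence_ranges = {"powdery mildew": (15.0, 25.0), "aphid": (25.0, 35.0)}

def determine(points, measured_temperature):
    """Weight the point of each pest name whose occurrence environment
    information contains the measured value (x1.2, per the text), then
    return the name with the highest point as the determination result."""
    weighted = dict(points)
    for name, (low, high) in occurrence_ranges.items():
        if name in weighted and low <= measured_temperature <= high:
            weighted[name] *= 1.2
    return max(weighted, key=weighted.get)

result = determine({"powdery mildew": 10, "aphid": 9},
                   measured_temperature=30.0)
```

Here the measured temperature falls inside the aphid occurrence range, so its weighted point (10.8) overtakes powdery mildew (10) and "aphid" becomes the determination result name.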
- the screen creation unit 533 creates determination result screen display data including the determination result name determined by the image determination unit 539 (step S1109), and passes the created determination result screen display data to the server transmission unit 534. To complete a series of steps.
- As described above, the image determination system enables the server 5 to determine pests and diseases based not only on pre-stored pest and disease image information but also on the environmental information of the field. Therefore, the server 5 can improve the accuracy of determination of captured pest and disease images.
- The pest and disease image obtained by the photographing of the photographing unit 46 in accordance with the user's operation of the operation unit 44 of the mobile terminal 4 may be an image cropped, in accordance with a further user operation of the operation unit 44, so as to contain only the pest- or disease-damaged portion.
- When the mobile terminal 4 transmits the image determination request including the captured pest and disease image to the server 5, the crop name and the occurrence site of the pest or disease input in accordance with the user's operation of the operation unit 44 may also be included in the image determination request. In that case, in step S1101, the plurality of pest and disease images acquired from the pest and disease image management table are limited to those corresponding to the crop name and/or the occurrence site included in the image determination request.
- In the above description, the sensor ID is acquired from the user management table using the user ID included in the image determination request received by the server reception unit 531 as a key; however, the sensor ID may instead be acquired based on the position of the mobile terminal 4 that transmitted the image determination request.
- the mobile terminal 4 has a GPS unit (not shown) that detects a position (latitude, longitude, altitude, etc.) where the mobile terminal 4 is present.
- In step S1002 of FIG. 19, the terminal transmission unit 473 of the mobile terminal 4 transmits, to the server 5 via the second wireless communication unit 42, an image determination request including the captured pest and disease image obtained by the photographing of the photographing unit 46 of the mobile terminal 4, the photographing position detected by the GPS unit, and the user ID of the user who owns the mobile terminal 4.
- the image determination unit 539 refers to the sensor management table and acquires a sensor ID associated with a sensor position near the shooting position received by the server reception unit 531. Thereby, the user can perform image determination even in fields other than the field operated by the user himself / herself.
- Alternatively, the server 5 may acquire, from the mobile terminal 4 that transmitted the image determination request, a sensor ID that the mobile terminal 4 acquired directly from the sensor terminal 2.
- In this case, the mobile terminal 4 that transmits the image determination request establishes communication with a sensor terminal 2 located within wireless range by terminal-to-terminal short-range wireless communication such as Bluetooth (registered trademark), and acquires the sensor ID directly from the sensor terminal 2 with which communication has been established.
- the mobile terminal 4 further includes a short-range wireless communication unit 48 in addition to the units shown in FIG.
- the short-range wireless communication unit 48 includes an interface circuit for performing short-range wireless communication according to a communication method such as Bluetooth (registered trademark).
- the acquisition unit 472 of the portable terminal 4 specifies the sensor terminal 2 located in a predetermined range in which communication is possible via the short-range wireless communication unit 48 and transmits a sensor ID request signal to the sensor terminal 2.
- When the sensor terminal processing unit 26 of the sensor terminal 2 receives the sensor ID request signal from the mobile terminal 4, it acquires its own sensor ID stored in the sensor terminal storage unit 22 and transmits the acquired sensor ID to the mobile terminal 4 that transmitted the sensor ID request signal.
- The terminal transmission unit 473 of the mobile terminal 4 includes the received sensor ID in the image determination request and transmits it to the server 5. Thereby, even when the accuracy of the positioning system such as GPS included in the mobile terminal 4 is poor, the sensor ID of a sensor terminal 2 located around the mobile terminal 4 can be acquired.
- image determination may be performed based on information on pest damage currently occurring.
- The server reception unit 531 of the server 5 receives, from each user operating a field, the names of pests and diseases whose occurrence has been confirmed in that field together with the field ID of the field where the pest or disease occurred, and the server storage unit 52 stores a pest and disease occurrence table in which the pest and disease names are stored in association with the field IDs.
- the image determination unit 539 refers to the sensor management table, and acquires the field ID associated with the sensor position in the vicinity of the imaging position received by the server reception unit 531.
- The image determination unit 539 refers to the pest and disease occurrence table, acquires the pest and disease names associated with the field ID, and weights the points of the acquired pest and disease names. This makes it possible to perform image determination based on pests and diseases occurring in fields around the field operated by the user, improving the accuracy of determination of captured pest and disease images.
- the server storage unit 52 stores the name of the disease or pest that has been confirmed to occur and the occurrence position information of the position where the disease or pest is occurring in association with each other.
- the mobile terminal 4 includes a GPS unit (not shown) that detects a position (latitude, longitude, altitude, etc.) where the mobile terminal 4 is present.
- In step S1002 of FIG. 19, the terminal transmission unit 473 of the mobile terminal 4 transmits, to the server 5 via the second wireless communication unit 42, an image determination request including the captured pest and disease image obtained by the photographing of the photographing unit 46 of the mobile terminal 4, the photographing position detected by the GPS unit, and the user ID of the user who owns the mobile terminal 4.
- the image determination unit 539 determines whether or not the acquired shooting position is included in a predetermined range centered on the generated position information stored in the server storage unit 52.
- When the acquired shooting position is included in the predetermined range, the image determination unit 539 specifies the pest and disease name associated with that occurrence position information and weights the point of the specified pest and disease name. Thereby, the user can perform image determination based on data on pest and disease occurrences currently arising throughout the country.
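A sketch of the position check described above: pest names whose stored occurrence position lies within a predetermined radius of the shooting position are selected for weighting. The equirectangular distance approximation and the 5 km radius are illustrative assumptions, not values from the patent:

```python
import math

def distance_km(lat1, lon1, lat2, lon2):
    """Approximate ground distance using an equirectangular projection;
    adequate for the short distances checked here."""
    mean_lat = math.radians((lat1 + lat2) / 2)
    dx = math.radians(lon2 - lon1) * math.cos(mean_lat)
    dy = math.radians(lat2 - lat1)
    return 6371.0 * math.hypot(dx, dy)   # Earth radius in km

def nearby_pest_names(shooting_pos, occurrences, radius_km=5.0):
    """occurrences: list of (pest_name, (lat, lon)) occurrence records."""
    lat, lon = shooting_pos
    return [name for name, (olat, olon) in occurrences
            if distance_km(lat, lon, olat, olon) <= radius_km]

names = nearby_pest_names((35.68, 139.76),
                          [("rice blast", (35.70, 139.75)),
                           ("aphid", (34.00, 135.00))])
```

Only "rice blast", occurring a couple of kilometers from the shooting position, would have its point weighted; the distant "aphid" record is ignored.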
- the image determination processing executed by the image determination system may be used for determination of animals and plants other than pests and insects.
- the animals and plants are plants, animals, insects and the like.
- the server 5 stores a plurality of animal and plant images corresponding to each of the plurality of animals and plants, and generation environment information indicating a range of environment information in which the animals and plants are likely to be generated.
- the server 5 receives the captured image transmitted from the mobile terminal 4.
- The server 5 searches the plurality of stored animal and plant images for an animal and plant image similar to the captured image received from the mobile terminal 4, and assigns points to each animal and plant name based on the retrieved animal and plant image. Note that the point assignment is executed by the image determination unit 539.
- the server 5 receives environment information measured by the sensor terminal 2 from the sensor terminal 2.
- The server 5 specifies the environmental information of the field received from the sensor terminal 2, and determines which occurrence environment information includes the specified environmental information.
- the server 5 extracts the names of animals and plants associated with the generated environment information determined to include the specified environment information, and weights the points corresponding to the extracted names of animals and plants.
- the server 5 may store position information where animals and plants tend to live, and weight points corresponding to the names of animals and plants based on the position information and the shooting position.
- the server 5 transmits the animal and plant name corresponding to the highest point among the points corresponding to each animal and plant name to the portable terminal 4 as a determination result name.
- Thus, merely by transmitting a captured image of an animal or plant photographed with the mobile terminal 4 to the server 5, the user can learn, as the determination result name, the name of the animal or plant obtained by image classification of the captured image.
Abstract
Description
2 Sensor terminal
21 Sensor terminal communication unit
22 Sensor terminal storage unit
23 GPS unit
24 Sensor connection unit
25 Sensor unit
26 Sensor terminal processing unit
261 Measurement information acquisition unit
262 Measurement information transmission unit
3 Sensor base station
31 First base station communication unit
32 Second base station communication unit
33 Base station storage unit
34 GPS unit
35 Sensor connection unit
36 Sensor unit
37 Base station processing unit
371 Measurement information acquisition unit
372 Environment information reception unit
373 Environment information transmission unit
4 Mobile terminal
41 First wireless communication unit
42 Second wireless communication unit
43 Terminal storage unit
44 Operation unit
45 Display unit
46 Photographing unit
47 Terminal processing unit
471 Browsing execution unit
472 Acquisition unit
473 Terminal transmission unit
5 Server
51 Server communication unit
52 Server storage unit
53 Server processing unit
531 Server reception unit
532 Registration unit
533 Screen creation unit
534 Server transmission unit
535 Storage processing unit
536 Warning unit
537 Correction unit
538 Specifying unit
539 Image determination unit
Claims (3)
- An image determination method performed by a server that has a storage unit and communicates with a mobile terminal and a sensor terminal installed in a field, wherein the server performs:
a step of storing, in the storage unit, a plurality of organism images each corresponding to an aspect of an organism, together with occurrence environment information indicating the range of environment information suited to that aspect of the organism;
a step of receiving a captured image transmitted from the mobile terminal;
a step of receiving, from the sensor terminal, the environment information measured by the sensor terminal;
a step of searching the plurality of organism images for an organism image similar to the received captured image and assigning points to the name relating to the aspect of the organism corresponding to the retrieved organism image;
a step of extracting the name relating to an aspect of an organism associated with the occurrence environment information that includes the environment information received from the sensor terminal, and weighting the points of the extracted name; and
a step of transmitting, to the mobile terminal that transmitted the captured image, the name with the highest points among the plurality of points as a determination result name. - The image determination method according to claim 1, wherein, in the weighting step, the server extracts the name relating to an aspect of an organism associated with the occurrence environment information that includes the environment information received from a sensor terminal in the vicinity of the photographing position of the mobile terminal at the time the aspect of the organism was photographed by the mobile terminal.
- The image determination method according to claim 2, further comprising a step in which the server stores, in the storage unit, the name relating to the aspect of the organism in association with the field in which that aspect of the organism was confirmed,
wherein, in the weighting step, the server extracts the names relating to aspects of organisms confirmed in other fields around the field in which the sensor terminal in the vicinity of the photographing position is installed, and further weights the points of the extracted names.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NZ74101616A NZ741016A (en) | 2015-09-18 | 2016-09-20 | Image evaluation method |
CN201680054114.1A CN108024505B (zh) | 2015-09-18 | 2016-09-20 | 图像判定方法 |
US15/760,561 US10262244B2 (en) | 2015-09-18 | 2016-09-20 | Image evaluation method |
EP16846685.2A EP3351089B1 (en) | 2015-09-18 | 2016-09-20 | Image evaluation method |
AU2016324766A AU2016324766B2 (en) | 2015-09-18 | 2016-09-20 | Image evaluation method |
HK18116068.3A HK1257173A1 (zh) | 2015-09-18 | 2018-12-14 | 圖像評估方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-185843 | 2015-09-18 | ||
JP2015185843A JP6357140B2 (ja) | 2015-09-18 | Image evaluation method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017047814A1 true WO2017047814A1 (ja) | 2017-03-23 |
Family
ID=58289435
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/077728 WO2017047814A1 (ja) | 2015-09-18 | 2016-09-20 | Image evaluation method |
Country Status (8)
Country | Link |
---|---|
US (1) | US10262244B2 (ja) |
EP (1) | EP3351089B1 (ja) |
JP (1) | JP6357140B2 (ja) |
CN (1) | CN108024505B (ja) |
AU (1) | AU2016324766B2 (ja) |
HK (1) | HK1257173A1 (ja) |
NZ (1) | NZ741016A (ja) |
WO (1) | WO2017047814A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107742290A (zh) * | 2017-10-18 | 2018-02-27 | 成都东谷利农农业科技有限公司 | Plant disease identification and early warning method and device |
US11403833B2 (en) | 2017-05-18 | 2022-08-02 | Semiconductor Energy Laboratory Co., Ltd. | Image detection module and information management system |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3361448B1 (en) * | 2015-10-08 | 2023-09-06 | Sony Group Corporation | Information processing device and information processing method |
CA3035074A1 (en) * | 2016-09-08 | 2018-03-15 | Walmart Apollo, Llc | Systems and methods for identifying pests in crop-containing areas via unmanned vehicles based on crop damage detection |
US10791037B2 (en) | 2016-09-21 | 2020-09-29 | Iunu, Inc. | Reliable transfer of numerous geographically distributed large files to a centralized store |
US11538099B2 (en) | 2016-09-21 | 2022-12-27 | Iunu, Inc. | Online data market for automated plant growth input curve scripts |
US10635274B2 (en) * | 2016-09-21 | 2020-04-28 | Iunu, Inc. | Horticultural care tracking, validation and verification |
US11244398B2 (en) | 2016-09-21 | 2022-02-08 | Iunu, Inc. | Plant provenance and data products from computer object recognition driven tracking |
JP6849950B2 (ja) * | 2017-04-06 | 2021-03-31 | 日本電気株式会社 | Ground control point device and SAR geodetic system |
JP6307680B1 (ja) * | 2017-05-24 | 2018-04-11 | 節三 田中 | Plant health diagnosis system |
JOP20190145A1 (ar) * | 2017-06-14 | 2019-06-16 | Grow Solutions Tech Llc | Systems and methods for harvest branching of a grow pod |
US10088816B1 (en) * | 2017-06-26 | 2018-10-02 | International Business Machines Corporation | Cognitive plant clinic |
KR101924393B1 (ko) * | 2017-10-30 | 2018-12-03 | 주식회사 엑스티엘 | System for identifying and monitoring pests through image analysis and monitoring method using the same |
US11062516B2 (en) | 2018-02-07 | 2021-07-13 | Iunu, Inc. | Augmented reality based horticultural care tracking |
CN108596254B (zh) * | 2018-04-25 | 2021-07-13 | 福州大学 | Method for detecting Pantana phyllostachysae damage by coupling multiple leaf characterizations |
JP2020106890A (ja) * | 2018-12-26 | 2020-07-09 | キヤノン株式会社 | Information processing apparatus, control method thereof, program, and system |
CN110310291A (zh) * | 2019-06-25 | 2019-10-08 | 四川省农业科学院农业信息与农村经济研究所 | Rice blast grading system and method |
CN110321868A (zh) * | 2019-07-10 | 2019-10-11 | 杭州睿琪软件有限公司 | Object recognition and display method and system |
JP2021033924A (ja) * | 2019-08-29 | 2021-03-01 | キヤノン株式会社 | Information processing system |
JP6974662B2 (ja) * | 2019-12-17 | 2021-12-01 | 株式会社ミライ菜園 | Prediction device |
CN111078732A (zh) * | 2019-12-31 | 2020-04-28 | 新疆农业科学院园艺作物研究所 | Almond variety identification method |
US11720980B2 (en) | 2020-03-25 | 2023-08-08 | Iunu, Inc. | Crowdsourced informatics for horticultural workflow and exchange |
JP7359073B2 (ja) * | 2020-04-14 | 2023-10-11 | トヨタ自動車株式会社 | Information processing system, information processing device, and program |
US11645733B2 (en) | 2020-06-16 | 2023-05-09 | Bank Of America Corporation | System and method for providing artificial intelligence architectures to people with disabilities |
CN112948608B (zh) * | 2021-02-01 | 2023-08-22 | 北京百度网讯科技有限公司 | Picture search method and apparatus, electronic device, and computer-readable storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6014451A (en) * | 1997-10-17 | 2000-01-11 | Pioneer Hi-Bred International, Inc. | Remote imaging system for plant diagnosis |
JP2012080790A (ja) * | 2010-10-07 | 2012-04-26 | Mega Chips Corp | Cultivation support system |
JP2013111078A (ja) * | 2011-11-24 | 2013-06-10 | Shijin Kogyo Sakushinkai | Plant disease identification method, system, and recording medium therefor |
JP2015204788A (ja) * | 2014-04-21 | 2015-11-19 | パナソニックIpマネジメント株式会社 | Cultivation support method, cultivation support device, and computer program |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6919959B2 (en) * | 1999-06-30 | 2005-07-19 | Masten Opto-Diagnostics Co. | Digital spectral identifier-controller and related methods |
JP2003004427A (ja) * | 2001-06-22 | 2003-01-08 | Hitachi Ltd | Defect inspection method by image comparison and apparatus therefor |
IL150915A0 (en) * | 2002-07-25 | 2003-02-12 | Vet Tech Ltd | Imaging system and method for body condition evaluation |
KR100601957B1 (ko) * | 2004-07-07 | 2006-07-14 | 삼성전자주식회사 | Method and apparatus for determining inter-image correspondence for face recognition, and image correction method and apparatus therefor |
JP2007241377A (ja) * | 2006-03-06 | 2007-09-20 | Sony Corp | Search system, imaging device, data storage device, information processing device, captured image processing method, information processing method, and program |
BRPI1009931A2 (pt) * | 2009-05-01 | 2016-03-15 | Univ Texas Tech System | sistema para estimar a massa estereoscópica sem contato remoto |
US8755570B2 (en) * | 2011-04-27 | 2014-06-17 | Steve Gomas | Apparatus and method for estimation of livestock weight |
JP5714452B2 (ja) * | 2011-08-29 | 2015-05-07 | 任天堂株式会社 | Information processing device, information processing program, information processing method, and information processing system |
JP5733158B2 (ja) * | 2011-11-02 | 2015-06-10 | 富士通株式会社 | Recognition support device, recognition support method, and program |
JP2014085114A (ja) * | 2012-10-19 | 2014-05-12 | Nikon Corp | Substance identification system, substance identification device, substance identification method, and program |
US20140168412A1 (en) * | 2012-12-19 | 2014-06-19 | Alan Shulman | Methods and systems for automated micro farming |
CN204612755U (zh) * | 2015-04-22 | 2015-09-02 | 仲恺农业工程学院 | Pest monitoring and early warning system based on cloud computing platform |
-
2015
- 2015-09-18 JP JP2015185843A patent/JP6357140B2/ja active Active
-
2016
- 2016-09-20 CN CN201680054114.1A patent/CN108024505B/zh active Active
- 2016-09-20 AU AU2016324766A patent/AU2016324766B2/en active Active
- 2016-09-20 EP EP16846685.2A patent/EP3351089B1/en active Active
- 2016-09-20 US US15/760,561 patent/US10262244B2/en active Active
- 2016-09-20 WO PCT/JP2016/077728 patent/WO2017047814A1/ja active Application Filing
- 2016-09-20 NZ NZ74101616A patent/NZ741016A/en unknown
-
2018
- 2018-12-14 HK HK18116068.3A patent/HK1257173A1/zh unknown
Non-Patent Citations (1)
Title |
---|
See also references of EP3351089A4 * |
Also Published As
Publication number | Publication date |
---|---|
EP3351089A1 (en) | 2018-07-25 |
EP3351089A4 (en) | 2019-05-01 |
JP6357140B2 (ja) | 2018-07-11 |
CN108024505B (zh) | 2019-11-22 |
US10262244B2 (en) | 2019-04-16 |
HK1257173A1 (zh) | 2019-10-18 |
US20180276504A1 (en) | 2018-09-27 |
NZ741016A (en) | 2019-09-27 |
EP3351089B1 (en) | 2020-04-08 |
JP2017055745A (ja) | 2017-03-23 |
CN108024505A (zh) | 2018-05-11 |
AU2016324766B2 (en) | 2019-06-20 |
AU2016324766A1 (en) | 2018-04-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6357140B2 (ja) | Image evaluation method | |
JP6609142B2 (ja) | Agricultural management system, server, agricultural management method, server control method, and server program | |
Raeva et al. | Monitoring of crop fields using multispectral and thermal imagery from UAV | |
US9706756B2 (en) | Animal movement mapping and movement prediction method and device | |
CN111767802B (zh) | Method and device for detecting an abnormal state of an object | |
Zerger et al. | Environmental sensor networks for vegetation, animal and soil sciences | |
JP6321029B2 (ja) | Method and system for logging and processing data relating to an activity | |
CN106776675B (zh) | Plant identification method and system based on a mobile terminal | |
KR102200314B1 (ko) | Crop monitoring system using drones | |
US20080157990A1 (en) | Automated location-based information recall | |
WO2019106733A1 (ja) | System, method, and program for predicting growth conditions or pest and disease occurrence conditions | |
US20150187109A1 (en) | Obtaining and displaying agricultural data | |
JP6035995B2 (ja) | Weather information generation device, program, and communication system | |
CN101790745A (zh) | Mobile-based recommendation system and method thereof | |
JP6760069B2 (ja) | Information processing device, information processing method, and program | |
JP6590417B2 (ja) | Discrimination device, discrimination method, discrimination program, and discrimination system | |
JP6704148B1 (ja) | Crop yield prediction program and crop quality prediction program | |
KR101481323B1 (ko) | System for providing plant information using plant images transmitted from a mobile terminal | |
Roslin et al. | Smartphone Application Development for Rice Field Management Through Aerial Imagery and Normalised Difference Vegetation Index (NDVI) Analysis. | |
Reed et al. | Predicting winter wheat grain yield using fractional green canopy cover (FGCC) | |
JP2014026378A (ja) | Terminal device, server device, information processing system, information processing method, and program | |
Parida et al. | Spatial mapping of winter wheat using C-band SAR (Sentinel-1A) data and yield prediction in Gorakhpur district, Uttar Pradesh (India) | |
CN106293053A (zh) | Portable electronic device, sensor control system, and sensor control method | |
Stitt et al. | Smartphone LIDAR can measure tree cavity dimensions for wildlife studies | |
JP5687995B2 (ja) | Subject search device, subject search method, subject search information providing program, and subject search program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16846685 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15760561 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2016846685 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2016324766 Country of ref document: AU Date of ref document: 20160920 Kind code of ref document: A |