WO2017061281A1 - Information processing apparatus and information processing method - Google Patents
Information processing apparatus and information processing method
- Publication number
- WO2017061281A1 (PCT/JP2016/077941)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- ecosystem
- vegetation
- field
- display
- information
- Prior art date
Classifications
- A—HUMAN NECESSITIES
  - A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    - A01B—SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
      - A01B79/00—Methods for working soil
        - A01B79/005—Precision agriculture
    - A01G—HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
      - A01G7/00—Botany in general
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
      - G06Q10/00—Administration; Management
        - G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
          - G06Q10/063—Operations research, analysis or management
      - G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
        - G06Q50/02—Agriculture; Fishing; Forestry; Mining
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T11/00—2D [Two Dimensional] image generation
      - G06T19/00—Manipulating 3D models or images for computer graphics
        - G06T19/006—Mixed reality
        - G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
      - G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
        - G06T2219/20—Indexing scheme for editing of 3D models
          - G06T2219/2004—Aligning objects, relative positioning of parts
  - G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    - G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
      - G09B5/00—Electrically-operated educational appliances
        - G09B5/06—Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
          - G09B5/065—Combinations of audio and video presentations, e.g. videotapes, videodiscs, television systems
Definitions
- The present technology relates to an information processing apparatus and an information processing method, and more particularly to an information processing apparatus and an information processing method that can support, for example, Synecoculture (registered trademark).
- Examples of ecosystem utilization include the use of ladybirds, rather than agricultural chemicals, for pest control on crops, and Synecoculture (registered trademark), a farming method based on the symbiotic effects of ecosystems and the use of useful species.
- Synecoculture (registered trademark) is a farming method that uses no tillage, no fertilization, and no pesticides, and brings nothing into the field other than seeds and seedlings; it achieves high yields by thinning-out harvesting from dense, mixed stands of plants.
- Because Synecoculture (registered trademark) is affected by the various ecosystem components that make up an ecosystem, field workers need the assistance of skilled practitioners in order to master it.
- The present technology has been made in view of this situation, and makes it possible to easily support Synecoculture (registered trademark).
- The information processing apparatus of the present technology includes an acquisition unit that acquires an ecosystem object representing an ecosystem component constituting the ecosystem of a field in which multiple types of vegetation are mixed, and a task object representing a task to be performed on the ecosystem component; the ecosystem object is displayed in AR (Augmented Reality) at a position in a predetermined background space corresponding to the real-world position of the ecosystem component, and the task object is displayed in AR in the background space.
- Similarly, the information processing method of the present technology acquires an ecosystem object representing an ecosystem component constituting the ecosystem of a field in which multiple types of vegetation are mixed, and a task object representing a task to be performed on the ecosystem component; the ecosystem object is displayed in AR at a position in a predetermined background space corresponding to the real-world position of the ecosystem component, and the task object is displayed in AR in the background space.
- That is, in the present technology, an ecosystem object representing an ecosystem component of a field in which multiple types of vegetation are mixed, and a task object representing a task to be performed on that component, are acquired; the ecosystem object is displayed in AR at the position in a predetermined background space corresponding to the component's real-world position, and the task object is displayed in AR in the background space.
- The information processing apparatus may be an independent apparatus or an internal block constituting a single apparatus.
- The information processing apparatus can be realized by causing a computer to execute a program.
- A program to be executed by a computer can be provided by transmission through a transmission medium or by recording on a recording medium.
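- As a concrete (non-limiting) illustration of the objects described above, the following Python sketch shows one way an ecosystem object and a task object might be modeled; the class and field names, and the `Vec3` position type, are assumptions for illustration, not structures disclosed by the present technology.

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    """A position in the background space (AR coordinates)."""
    x: float
    y: float
    z: float

@dataclass
class EcosystemObject:
    """Represents one ecosystem component (vegetation, an insect, etc.)
    observed in a field where multiple types of vegetation are mixed."""
    component_id: str    # e.g. an identifier in the vegetation DB
    kind: str            # "vegetation", "insect", "weather", ...
    real_position: Vec3  # real-world position of the component in the field
    comment: str = ""    # free-text observation attached by the user

@dataclass
class TaskObject:
    """Represents a task to be performed on an ecosystem component."""
    target: EcosystemObject
    task: str            # e.g. "harvest", "weeding", "sowing"

def ar_display(obj: EcosystemObject, background: str) -> None:
    """Stand-in for AR display: the object is drawn at the position in the
    background space corresponding to its real-world position."""
    p = obj.real_position
    print(f"[{background}] {obj.kind} '{obj.component_id}' at ({p.x}, {p.y}, {p.z})")
```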
- The present technology can support Synecoculture (registered trademark).
- FIG. 1 is a block diagram showing a configuration example of an embodiment of an ecosystem utilization system to which the present technology is applied.
- FIG. 2 is a diagram illustrating a configuration example of the network 10.
- FIG. 3 is a block diagram illustrating a configuration example of the terminal 12.
- FIG. 4 is a block diagram illustrating a configuration example of the server 13.
- FIG. 5 is a diagram showing a configuration example of the Synecoculture (registered trademark) DB; FIGS. 6 to 9 show configuration examples of the seed sowing DB, the vegetation DB, the yield DB, and the management record DB.
- Further figures show: an example of the site map of a web page serving as the Synecoculture (registered trademark) page; a display example of the distribution of fields on a map provided on that page; flowcharts explaining the processing when a user refers to field information, uploads a photograph, and registers a key event; a relation graph generated by the graph display control unit 72; display examples of AR display and VR display on the terminal 12; a display example of the time-lapse mode display of ecosystem objects on the terminal 12; a display example of the same-vegetation field information display performed on the terminal 12; display examples of the transition display and of the selected field finally reached via the transition display; display examples of related information and of analysis results of sensor data on the terminal 12; a diagram explaining copying of an ecosystem object; examples of changing the display scale of the background space and of the accompanying change in the display of AR objects on the terminal 12; and an example of the transition of the display screen of the display unit 35 when an AR object is registered or edited on the terminal 12.
- FIG. 1 is a block diagram showing a configuration example of an embodiment of an ecosystem utilization system to which the present technology is applied.
- The ecosystem utilization system includes a network 10, one or more sensor devices 11, one or more terminals 12, and one or more servers 13.
- The ecosystem utilization system collects various information observed in the ecosystem, obtains information for utilizing the ecosystem based on that information, and provides it to users.
- The sensor devices 11, the terminals 12, and the servers 13 are connected to the network 10 by wire or wirelessly and can communicate with one another.
- The sensor device 11 has sensors that sense various physical quantities, and a communication function that transmits the sensor data (data representing the sensed physical quantities) obtained as a result of the sensing. It also includes a position detection function that detects the position of the sensor device 11 itself using, for example, GPS (Global Positioning System).
- The sensor device 11 senses physical quantities with its sensors and transmits the resulting sensor data to the server 13 via the network 10 using its communication function. The sensor data is transmitted together with position information indicating the position of the sensor device 11, detected by the position detection function, as necessary.
- As the sensor of the sensor device 11, a sensor that senses electromagnetic waves including light, such as an image sensor that captures images, or a sensor that senses sound, such as a microphone, can be employed.
- The sensor devices 11 are installed at places where the ecosystem is to be observed (sensed) in the region where the ecosystem is to be utilized (hereinafter also referred to as the utilization region), such as a forest, a river, the sea, a lake, or a field (farm).
- The sensor devices 11 can be installed manually at predetermined positions.
- Alternatively, the sensor devices 11 can be installed by scattering them from a moving airplane, ship, automobile, or the like.
- At various places in the utilization region, the sensor devices 11 sense, for example, images of plants and insects; the sound of wind, of insects, and of rustling leaves; and physical quantities such as air temperature, soil temperature, humidity, and geomagnetism, and transmit the resulting sensor data to the server 13 via the network 10.
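- The transmissions described here can be pictured as small structured messages. The following Python sketch shows one plausible shape of the payload a sensor device 11 might send to the server 13; the field names and the JSON encoding are illustrative assumptions, not part of the disclosure.

```python
import json
import time

def build_sensor_payload(device_id: str, lat: float, lon: float,
                         readings: dict) -> str:
    """Bundle sensed physical quantities with the device's GPS position,
    as the sensor device 11 is described as doing."""
    payload = {
        "device_id": device_id,
        "timestamp": time.time(),
        "position": {"lat": lat, "lon": lon},  # from the position detection function
        "readings": readings,                  # e.g. air/soil temperature, humidity
    }
    return json.dumps(payload)

# Example: a device in a field reporting environmental quantities.
print(build_sensor_payload(
    "sensor-0042", 34.685, 136.509,
    {"air_temp_c": 11.6, "soil_temp_c": 8.3, "humidity_pct": 62},
))
```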
- The utilization region may be, for example, a municipality or a part of one, a prefecture, all of Japan, or the whole world.
- The utilization region may also consist of distant, separated areas, such as Hokkaido and Kyushu, or Japan and the United States.
- The terminal 12 is an information processing apparatus used by users who receive support for ecosystem utilization, including Synecoculture (registered trademark), and by users who cooperate in ecosystem utilization.
- A user who receives support for utilizing the ecosystem is, for example, a field worker who practices Synecoculture (registered trademark).
- A user who cooperates in ecosystem utilization is, for example, a local resident who is not a field worker practicing Synecoculture (registered trademark) but who cooperates in collecting information on the ecosystem of the field, or a scholar who cooperates by sending advice.
- As the terminal 12, a portable terminal such as a smartphone, a tablet, an HMD (Head Mounted Display), or a glasses-type wearable device can be employed, as can a notebook PC (Personal Computer), a desktop PC, or any other device having a communication function and an information input/output function (interface) for the user.
- The HMD serving as the terminal 12 may be either a see-through HMD or an immersive (non-transparent) HMD.
- However, the terminal 12 used by a user in the field is desirably a portable terminal that does not hinder the user's fieldwork.
- The user uses the terminal 12 to make observations at various places in the utilization region and transmits observation values representing the observation results to the server 13 via the network 10.
- As an observation value transmitted from the terminal 12 to the server 13, any information obtained by the user observing the ecosystem can be used: for example, the fact that a certain vegetation, insect, or other species was observed at a certain place, an image of seeds, the fact that a certain crop was harvested together with its yield, the rosette state of Chinese cabbage, or images (photographs, videos) and sounds obtained by the user operating the terminal 12.
- The terminal 12 also transmits data other than observation values to the server 13 via the network 10, and receives necessary data from the server 13 via the network 10. For example, the terminal 12 receives (acquires) information for utilizing the ecosystem from the server 13 and presents it to the user, for example by displaying an image or outputting sound.
- The server 13 is an information processing apparatus managed by a supporter who supports the utilization of the ecosystem.
- The server 13 receives and registers the sensor data transmitted from the sensor devices 11 via the network 10 and the observation values transmitted from the terminals 12 via the network 10. Furthermore, based on the sensor data from the sensor devices 11 (including their position information as necessary), the observation values from the terminals 12, and other necessary information, the server 13 generates information for utilizing the ecosystem and transmits it to the terminal 12 via the network 10.
- The terminal 12 receives the information transmitted from the server 13 via the network 10 and presents it to the user by displaying it as an image or outputting it as sound.
- Note that the processing of the terminal 12 and the processing of the server 13 described below can be divided between the terminal 12 and the server 13 to the extent possible, and the processing performed by the server 13 can be distributed across a plurality of servers.
- FIG. 2 is a diagram illustrating a configuration example of the network 10 in FIG. 1.
- The network 10 includes an arbitrary number of wireless relay devices 21, an arbitrary number of wireless LANs (Local Area Networks) 22, a mobile phone network 23, the Internet 24, and the like.
- The wireless relay device 21 is a device that performs wireless communication and has a router function.
- The wireless relay devices 21 are installed throughout the utilization region so that the sensor data obtained by the sensor devices 11 can be collected.
- Like the sensor devices 11, the wireless relay devices 21 can be installed manually or by scattering them from a moving airplane, ship, automobile, or the like.
- A wireless relay device 21 can also be installed in a moving vehicle such as an automobile (for example, a regularly operating bus), a motorcycle, or a ship.
- The wireless relay device 21 receives the sensor data transmitted from a sensor device 11 by communicating with that sensor device 11.
- By communicating with other wireless relay devices 21, the wireless relay device 21 also receives sensor data from, and transmits sensor data to, those other wireless relay devices 21.
- By communicating with the wireless LAN 22 or the mobile phone network 23, the wireless relay device 21 transmits the sensor data received from sensor devices 11 or other wireless relay devices 21 to the wireless LAN 22 or the mobile phone network 23.
- The wireless LAN 22 is built at the user's home or at an arbitrary place.
- By communicating with the terminal 12, the wireless relay devices 21, and the Internet 24, the wireless LAN 22 transmits data from the terminal 12 and sensor data from the wireless relay devices 21 to the server 13 via the Internet 24.
- The wireless LAN 22 also receives data transmitted from the server 13 via the Internet 24 and forwards it to the terminal 12.
- The mobile phone network 23 is, for example, a 3G line or the like, and communicates with the terminal 12, the server 13, the wireless relay devices 21, and the Internet 24.
- The Internet 24 communicates with the terminal 12, the server 13, the wireless LAN 22, and the mobile phone network 23.
- Sensor data relayed by the wireless relay devices 21, data transmitted via the wireless LAN 22, and data transmitted by the terminal 12 are sent to the server 13 via one or both of the mobile phone network 23 and the Internet 24.
- Conversely, data transmitted by the server 13 is sent to the terminal 12 via one or both of the mobile phone network 23 and the Internet 24.
- Because each wireless relay device 21 has a router function, even if a certain wireless relay device 21 becomes unable to communicate due to a failure or the like and the wireless communication path through it becomes unusable, the sensor data transmitted from a sensor device 11 can still be delivered to the server 13 over a wireless communication path passing through other wireless relay devices 21.
- That is, the sensor data obtained by the sensor devices 11 can be transmitted to the server 13 over various wireless communication paths passing through the wireless relay devices 21, so the server 13 can collect (receive) the sensor data via other wireless relay devices 21 even when one of them becomes unable to communicate.
- A user of an automobile in which a wireless relay device 21 is installed can contribute to collecting information for utilizing the ecosystem simply by driving on mountain roads and the like in the utilization region.
- That is, the wireless relay device 21 installed in the automobile forms, at each location the automobile passes, a wireless communication path together with nearby wireless relay devices 21, and thereby contributes to the server 13's collection of the sensor data obtained by the sensor devices 11.
- As the wireless relay device 21, a small, low-power wireless communication device that can carry a router function and perform wireless communication over a certain distance can be employed, for example a device compliant with ZigBee (registered trademark), one of the short-range wireless network standards.
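- The router function described above amounts to multi-hop forwarding over whatever relay devices are currently reachable. The sketch below illustrates that idea with a breadth-first search over a hypothetical relay topology: when one relay fails, a path through the remaining relays is found instead (the topology and names are illustrative assumptions).

```python
from collections import deque

def find_path(links, src, dst, failed):
    """Breadth-first search for a relay path from src to dst,
    skipping relay devices that have become unable to communicate."""
    queue = deque([[src]])
    visited = {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for nxt in links.get(node, []):
            if nxt not in visited and nxt not in failed:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # no usable wireless communication path

# Hypothetical topology: a sensor device, three relay devices, the server.
links = {
    "sensor11": ["relay21a", "relay21b"],
    "relay21a": ["relay21c"],
    "relay21b": ["relay21c"],
    "relay21c": ["server13"],
}
print(find_path(links, "sensor11", "server13", failed=set()))
print(find_path(links, "sensor11", "server13", failed={"relay21a"}))  # falls back via relay21b
```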
- FIG. 3 is a block diagram showing a configuration example of the terminal 12 in FIG. 1.
- The terminal 12 includes a CPU (Central Processing Unit) 31, a memory 32, a storage 33, an operation unit 34, a display unit 35, a speaker 36, a camera 37, a microphone 38, a position detection unit 39, a communication unit 40, an external I/F (Interface) 41, and a drive 42.
- The CPU 31 through the drive 42 are connected to a bus and communicate with one another as necessary.
- The CPU 31 performs various processes by executing the programs installed in the memory 32 and the storage 33.
- The memory 32 consists of, for example, volatile memory, and temporarily stores the programs executed by the CPU 31 and necessary data.
- The storage 33 consists of, for example, a hard disk or nonvolatile memory, and stores the programs executed by the CPU 31 and necessary data.
- The operation unit 34 includes physical keys (including a keyboard), a mouse, a touch panel, and the like; in response to a user operation, it outputs the corresponding operation signal onto the bus.
- The display unit 35 consists of, for example, an LCD (Liquid Crystal Display) and displays images according to data supplied from the bus.
- A touch panel serving as the operation unit 34 is made of a transparent member and can be integrated with the display unit 35, allowing the user to input information by operating icons, buttons, and the like displayed on the display unit 35.
- The speaker 36 outputs sound according to data supplied from the bus.
- The camera 37 captures images (still images (photographs) and moving images), that is, senses light, and outputs the corresponding image data onto the bus.
- The microphone 38 collects sound, that is, senses sound, and outputs the corresponding sound data onto the bus.
- The position detection unit 39 includes, for example, a circuit and an antenna using GPS (Global Positioning System); it detects the position of the terminal 12 as the position of the user and outputs position information representing that position onto the bus.
- The communication unit 40 includes a communication circuit, an antenna, and the like, and communicates with the wireless LAN 22, the mobile phone network 23, the Internet 24, and so on.
- The external I/F 41 is an interface for exchanging data with, for example, headphones and other external devices.
- A removable recording medium 42A, such as a memory card, can be attached to and detached from the drive 42, which drives the removable recording medium 42A attached to it.
- The programs executed by the CPU 31 can be recorded in advance in the storage 33, a recording medium built into the terminal 12.
- Alternatively, a program can be stored (recorded) on the removable recording medium 42A, provided as so-called packaged software, and installed on the terminal 12 from the removable recording medium 42A.
- A program can also be downloaded from the Internet 24 via the communication unit 40 and installed on the terminal 12.
- By executing a program installed on the terminal 12, the CPU 31 functions as an acquisition unit 51 and a display control unit 52.
- The acquisition unit 51 acquires various information (data), such as the AR (Augmented Reality) objects described later.
- The acquisition can be performed by receiving data via the communication unit 40, or by reading data from the memory 32, the storage 33, or the removable recording medium 42A.
- The display control unit 52 performs display control that presents information to the user by causing the display unit 35 to display the information acquired by the acquisition unit 51.
- The terminal 12 can also be provided with a sensor 43 that senses physical quantities other than light (sensed by the camera 37) and sound (sensed by the microphone 38), such as temperature and pressure.
- In that case, the terminal 12 can also serve as a sensor device 11.
- Operations on (input to) the terminal 12 can be performed not only through the operation unit 34 but also by voice, gestures, or any other means.
- FIG. 4 is a block diagram illustrating a configuration example of the server 13 in FIG. 1.
- The server 13 includes a CPU 61, a memory 62, a storage 63, an operation unit 64, a display unit 65, a speaker 66, a communication unit 67, an external I/F 68, and a drive 69.
- The CPU 61 through the drive 69 are configured in the same manner as the CPU 31 through the speaker 36 and the communication unit 40 through the drive 42 in FIG. 3, respectively.
- The programs executed by the CPU 61 can be recorded in advance in the storage 63, a recording medium built into the server 13.
- Alternatively, a program can be stored (recorded) on a removable recording medium 69A, provided as packaged software, and installed on the server 13 from the removable recording medium 69A.
- A program can also be downloaded from the Internet 24 via the communication unit 67 and installed on the server 13.
- By executing a program installed on the server 13, the CPU 61 functions as a Synecoculture (registered trademark) CMS (Content Management System) 71, a graph display control unit 72, a generation unit 73, an AR/VR display control unit 74, an editing unit 75, and an analysis unit 76.
- The Synecoculture (registered trademark) CMS 71 registers and manages, in DBs (databases) stored in the storage 63, the content (text, images, and the like) and layout information of web pages for exchanging information related to Synecoculture (registered trademark) (hereinafter also referred to as Synecoculture (registered trademark) pages).
- The Synecoculture (registered trademark) CMS 71 builds the Synecoculture (registered trademark) page and, acting as a web server on the Internet 24, transmits it from the communication unit 67 to the terminal 12 (or to any other device functioning as a web browser).
- In the terminal 12, the acquisition unit 51 acquires the Synecoculture (registered trademark) page from the Synecoculture (registered trademark) CMS 71 via the communication unit 40, and the display control unit 52 displays the page on the display unit 35.
- The graph display control unit 72 generates, from the DBs recorded in the storage 63 and elsewhere, a multipartite graph (graph model) such as a bipartite graph needed to generate the relation graph described later, and causes the terminal 12 to generate the relation graph from the bipartite graph and display it.
- Alternatively, the graph display control unit 72 itself generates the relation graph from the bipartite graph and transmits it from the communication unit 67 to the terminal 12, thereby causing the terminal 12 to display the relation graph.
- In the terminal 12, the acquisition unit 51 acquires the bipartite graph or the relation graph from the graph display control unit 72 via the communication unit 40; when a bipartite graph is acquired, the acquisition unit 51 obtains the relation graph by generating it from the bipartite graph.
- The display control unit 52 then causes the display unit 35 to display the relation graph acquired by the acquisition unit 51.
- The generation unit 73 generates AR objects and VR (Virtual Reality) spaces (images) on which the AR objects are superimposed and displayed, and registers (records) them in the storage 63.
- The AR/VR display control unit 74 performs the processing needed to cause the terminal 12 to perform AR display or VR display, in which an AR object is superimposed on a real space, on a photographed real space (an image obtained by photographing the real space), or on a VR space (an image obtained by modeling the real space).
- For example, the AR/VR display control unit 74 acquires from the storage 63 an AR object and information representing the photographed real space or VR space that serves as the background space on which the AR object is to be superimposed, and transmits them from the communication unit 67 to the terminal 12, causing the terminal 12 to perform AR display or VR display in which the AR object is superimposed on the background space.
- Alternatively, the AR/VR display control unit 74 acquires an AR object from the storage 63 and transmits it from the communication unit 67 to the terminal 12, causing the terminal 12 to perform AR display in which the real space itself serves as the background space and the AR object is superimposed on it.
- The editing unit 75 edits the AR objects registered in the storage 63 according to the user's operations on the terminal 12.
- The analysis unit 76 analyzes the sensor data from the sensor devices 11.
- Various DBs are registered in the storage 63, including, as part of them, DBs that support the practice of Synecoculture (registered trademark) (hereinafter also referred to as the Synecoculture (registered trademark) DB).
- FIG. 5 is a diagram showing a configuration example of the Synecoculture (registered trademark) DB.
- The Synecoculture (registered trademark) DB includes a seed sowing DB, a vegetation DB, a yield DB, a management record DB, a phenology DB, an insect fauna DB, a microbiota DB, a climate classification DB, a weather DB, a coordinate DB, a Synecoculture (registered trademark) assessment DB, an allelopathy DB, a crop rotation suitability DB, a plant name DB, a photo record DB, a meta DB, and the like.
- In the Synecoculture (registered trademark) DB, csv (comma-separated values) files (for example, files in two-dimensional matrix format) and image files are stored.
- All or part of the Synecoculture (registered trademark) DB and the other DBs stored in the storage 63 can be located separately from the server 13 and connected to the server 13 via the network 10.
- FIG. 6 is a diagram showing a configuration example of the seed sowing DB.
- The seed sowing DB is composed of, for example, csv files.
- In the seed sowing DB, the recording date, field section, ridge number, ridge subsection, distinction between seed and seedling, crop name, unified crop name, quantity, and producer information are recorded. Even for seeds and seedlings of the same variety, growth can differ depending on the producer from which they were obtained, so the producer name is itself a kind of cultivation condition and is preferably kept as a management record.
- The fields in the utilization region are divided into field sections.
- Ridges are formed in each field section, and each ridge is divided into one or more ridge subsections.
- Each ridge is given a ridge number that identifies it.
- In the crop name field, the name is recorded together with variety information, including kanji, for example "potato (Baron)".
- In the unified crop name field, the name is recorded only in katakana, without distinguishing varieties, for example simply "potato". Such a unified character representation facilitates searching. A sketch of this csv layout is shown below.
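- Since the seed sowing DB is kept as csv files, records like those described can be handled with standard tooling. The sketch below builds and parses a couple of illustrative rows; the column names and values are assumptions based on the fields listed above, not the actual file format.

```python
import csv
import io

# Illustrative rows: recording date, field section, ridge number, ridge
# subsection, seed/seedling, crop name (with variety), unified crop name
# (no variety), quantity, producer.
SOWING_CSV = """date,field_section,ridge_no,ridge_sub,seed_or_seedling,crop_name,crop_name_unified,quantity,producer
2012-01-19,NN,02,d3,seed,potato (Baron),POTATO,30,ProducerA
2012-01-19,NN,02,d4,seedling,Chinese cabbage,HAKUSAI,12,ProducerB
"""

def load_sowing_db(text):
    return list(csv.DictReader(io.StringIO(text)))

rows = load_sowing_db(SOWING_CSV)
# The unified variety-free name makes searching straightforward:
print([r["ridge_sub"] for r in rows if r["crop_name_unified"] == "POTATO"])
```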
- FIG. 7 is a diagram showing a configuration example of the vegetation DB.
- The vegetation DB is composed of, for example, csv files.
- In the vegetation DB, the recording date and position information in the form of observation section coordinates are recorded.
- For example, the following observations are recorded at observation section coordinates NE.
- It is recorded that broad beans have germinated, carrots can be harvested, radishes can be harvested, leeks have become established, broccoli seedlings have become established, cabbage seedlings have become established, and Chinese cabbage seedlings have become established and can be harvested.
- FIG. 8 is a diagram showing a configuration example of the yield DB.
- The yield DB is composed of, for example, csv files.
- In the yield DB, the yield of each harvested crop is recorded by harvest date.
- For example, it is recorded that 100 g of "horoniga lettuce" was harvested on January 14, 2012, and that radishes were harvested in amounts of 1700 g on January 24, 4000 g on January 29, 1500 g on January 30, 740 g on January 31, and 1500 g on February 20.
- Some of these crop names are not official common names but names given to the plants by the observer.
- In FIG. 8, the recording of coordinates as position information is omitted, but GPS coordinates or the like can be recorded as position information of the fields and ridges where the crops are observed.
- Input to the yield DB can reuse the information entered in the seed sowing DB.
- That is, in the yield DB, the information on the plants managed in the seed sowing DB can be carried over as it is, as sketched below.
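- The reuse of seed sowing DB entries for yield input can be pictured as keying yield records by the same crop names; a minimal aggregation sketch, with assumed record shapes and the radish figures quoted above, follows.

```python
from collections import defaultdict

# Yield records as described for FIG. 8: (harvest date, crop, grams).
yield_rows = [
    ("2012-01-24", "RADISH", 1700),
    ("2012-01-29", "RADISH", 4000),
    ("2012-01-30", "RADISH", 1500),
    ("2012-01-31", "RADISH", 740),
    ("2012-02-20", "RADISH", 1500),
]

def total_yield(rows):
    """Aggregate the yield per crop, as a view over the yield DB might."""
    totals = defaultdict(int)
    for _date, crop, grams in rows:
        totals[crop] += grams
    return dict(totals)

print(total_yield(yield_rows))  # {'RADISH': 9440}
```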
- FIG. 9 is a diagram showing a configuration example of the management record DB.
- The management record DB is composed of, for example, csv files.
- In the management record DB, the management work performed and the dates on which it was performed are recorded. For example, it is recorded that from January 19 to January 23, 2012, work such as planting seedlings and civil engineering work such as building windbreak shelves was carried out.
- FIG. 10 is a diagram showing a configuration example of a csv file as a phenology DB.
- The phenology DB is composed of, for example, image files and csv files.
- FIG. 10 shows an example of such a csv file.
- In the phenology DB, the observed phenology and the recording date and time are recorded as text. For example, it is recorded that on January 9, 2011, seeds of an unknown grass were observed, that growth in some places was better than in others, that the lower parts of the pea plants had withered, and that places where growth was clearly better than elsewhere were observed.
- FIG. 11 is a diagram showing a configuration example of an image file as a phenology DB.
- In FIG. 11, phenology 1 is shown: an image taken at Oiso on September 22, 2011, in field section NN, ridge number 02, ridge subsection d3.
- Phenology 1-2, an image taken at the same location, is shown with the comment "many seed leaves have germinated in NN02d3".
- In this way, the phenology observed by a worker (user) is recorded as text and images.
- FIG. 12 is a diagram showing a configuration example of an insect fauna DB.
- The insect fauna DB is composed of, for example, image files and csv files.
- In FIG. 12A, an image (No. 087) of insects photographed at a field named Ise New Farm on February 18, 2012 is shown.
- As comments, it is recorded that the observation site is Ise New Farm, that the insects are of the order Coleoptera, the family to which they appear to belong, and that the group was overwintering under a stone.
- FIG. 12B shows an image (No. 088) of insects photographed at Ise New Farm on the same day; as a comment, the same content as in FIG. 12A is recorded.
- FIG. 12C shows an image (No. 089) of an organism photographed at Ise New Farm on February 18, 2012.
- As comments, it is recorded that the observation site is Ise New Farm, that the organism is a spider, the most common species of its family, and that it often wanders on the ground surface.
- FIG. 13 is a diagram showing a configuration example of the weather DB.
- In the weather DB, meteorological information for the Tsu region in 2012, such as atmospheric pressure, precipitation, temperature, and humidity, is recorded for the beginning, middle, and end of each month.
- For example, it is recorded that the station-level average atmospheric pressure is 1018.7 hPa and the sea-level average is 1021.0 hPa; that the maximum precipitation is 0.5 mm in 10 minutes, 0.5 mm in 1 hour, and 0.5 mm in 1 day, with a total of 0.5 mm; that the maximum temperature is 11.6°C and the minimum is 0.2°C; that the average daily maximum is 9.2°C, the average daily minimum is 2.0°C, and the daily average is 5.2°C; and that the average humidity is 62% with a minimum of 24%.
- FIG. 14 is a diagram showing a configuration example of the allelopathy DB.
- The allelopathy DB is composed of, for example, csv files.
- In the allelopathy DB, allelopathies are recorded among plants such as leeks, watermelons and melons (Cucurbitaceae), carrots, millet, wheat, pumpkins, cucumbers, garlic, and onions. A "1" means that a symbiotic interaction (that is, a promoting effect) has been confirmed between the plants concerned, and a "0" means that it has not been confirmed.
- For example, a symbiotic interaction has been confirmed between leek and carrot, but not between leek and wheat.
- Note that the degree of interaction can also be expressed in steps, using numbers such as 0 to 10.
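- The 0/1 (or graded 0 to 10) records described here form a symmetric plant-by-plant matrix. The following sketch shows a lookup over such a matrix, populated with the two confirmed/unconfirmed pairs mentioned above; the data structure itself is an illustrative assumption.

```python
# Confirmed symbiotic interactions (1) and unconfirmed ones (0); a graded
# scheme could store values from 0 to 10 instead.
ALLELOPATHY = {
    frozenset(["leek", "carrot"]): 1,
    frozenset(["leek", "wheat"]): 0,
}

def interaction(plant_a, plant_b):
    """Return the recorded interaction, defaulting to 0 (not confirmed).
    frozenset keys make the lookup symmetric in the two plants."""
    return ALLELOPATHY.get(frozenset([plant_a, plant_b]), 0)

print(interaction("leek", "carrot"))  # 1: symbiotic interaction confirmed
print(interaction("wheat", "leek"))   # 0: not confirmed
```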
- FIG. 15 is a diagram showing a configuration example of the crop rotation suitability DB.
- The crop rotation suitability DB is composed of csv files.
- In FIG. 15, the crop rotation suitability of watermelon and melon (Cucurbitaceae) and of peanut is recorded. A "1" means that good rotation suitability has been confirmed in the field between the plants concerned, and a "0" means that it has not been confirmed.
- For example, watermelon and melon (Cucurbitaceae) and peanuts have been confirmed to have good rotation suitability.
- The allelopathy DB and the crop rotation suitability DB are created not only from information known in the literature but also from other information.
- For example, from the seed sowing DB, the vegetation DB, and the yield DB, vegetation combinations for which a symbiotic state was actually established in a Synecoculture farm, or combinations that caused vegetation transition (that is, temporal change of vegetation), can be recorded in the same format.
- The microbiota DB is composed of image files and csv files.
- The microbiota DB records information on microorganisms analyzed from soil samples taken at Synecoculture farms.
- The climate classification DB consists of csv files.
- The climate classification DB records information on the climate classification in which each farm (field) is located, using classifications such as laurel forest zone, deciduous forest zone, subtropical climate, and tropical climate.
- The weather DB records various types of meteorological data: for example, image files of graphs generated from the data of meteorological satellites and of AMeDAS, csv files, and meteorological data obtained as sensor data by the sensor devices 11 installed in the field as observation equipment.
- The coordinate DB consists of csv files.
- The coordinate DB records the GPS coordinates of each ridge in the field, with an accuracy of about 10 cm.
- The Synecoculture (registered trademark) assessment DB consists of pdf or image files.
- A Synecoculture (registered trademark) assessment document is a certificate that an examination for Synecoculture (registered trademark) has been passed; it is issued when, on application from the manager of a field, the server administrator examines the field and confirms that it satisfies the conditions of Synecoculture (registered trademark). Farms that have been issued this document are permitted to indicate that their crops are grown by Synecoculture (registered trademark).
- The plant name DB records the names and images of various plants.
- The photo record DB records various photographs.
- The meta DB records the key events described later.
- As described above, the Synecoculture (registered trademark) DB records the various information necessary to obtain a vegetation design for plants cultivated by Synecoculture (registered trademark).
- FIG. 16 is a flowchart for explaining an example of vegetation design support processing.
- The ecosystem utilization system of FIG. 1 supports vegetation design for Synecoculture (registered trademark) as one form of support for ecosystem utilization.
- When a user specifies the crops to cultivate, combinations of vegetation suitable for constructing a mixed, dense vegetation state with those crops are determined; that is, vegetation design for Synecoculture (registered trademark) is performed.
- Candidates are searched from the allelopathy DB and the crop rotation suitability DB, and a spatio-temporal arrangement of vegetation, that is, a vegetation design expected to give the lowest cost and a high yield, is output.
- Because Synecoculture (registered trademark) is based on mixed, dense vegetation, seeds of multiple crops are mixed and sown, and whatever grows is harvested. Which combinations allow mixed stands to grow at higher density depends on the plants and on the land conditions, and must be predicted both from the interactions between plants that are already known (allelopathy and crop rotation suitability) and from combinations that have already succeeded in the field.
- Since an ecosystem, including the weather, cannot be completely controlled, not all sown seeds and planted seedlings will yield a harvest; the challenge of vegetation design is to estimate the vegetation combinations that reduce cost and maximize yield. Because vegetation design is conceptually similar to composing a portfolio in equity investment, a vegetation design can be called a seed portfolio.
- In step S11, the acquisition unit 51 of the terminal 12 acquires the selection of the plant species (vegetation) to be cultivated. That is, when the user operates the operation unit 34 to specify a plant species to cultivate, the acquisition unit 51 acquires the specification.
- This input may be performed by having the user enter an arbitrary plant name, or by displaying a list of plant names prepared in advance on the display unit 35 and having the user select one from the list.
- In step S12, the communication unit 40 transmits the plant species (plant species name) acquired by the acquisition unit 51 to the server 13 via the network 10.
- In step S31, the communication unit 67 of the server 13 receives the plant species transmitted from the terminal 12 in step S12. The plant to be cultivated by the user is thus accepted by the server 13.
- In step S32, in the server 13, the Synecoculture (registered trademark) CMS 71 searches for vegetation designs that include the plant species received from the terminal 12. That is, the Synecoculture (registered trademark) CMS 71 exhaustively searches at least the allelopathy DB and the crop rotation suitability DB for combinations of the plant specified by the user (the plant species from the terminal 12) and vegetation suitable for constructing a mixed, dense vegetation state with it. The plant name DB is also consulted as necessary to confirm the received plant species.
- step S33 the cooperative farming method (registered trademark) CMS 71 calculates a symbiosis score of each vegetation design searched in step S32. That is, the symbiosis score of each combination of vegetation suitable for constructing a coexistence state with the plant designated by the user, which is one or more vegetation designs retrieved in step S32, is calculated.
- in the allelopathy DB, weighting scores that evaluate the interactions between plants as positive or negative values are recorded. The vegetation state of a plant recorded as sown in the seeding DB is recorded in the vegetation DB, and the yield obtained from the plant is recorded in the yield DB. A weighting score is added to the seeding DB, the vegetation DB, and the yield DB for each observation, so that combinations with higher yields ultimately receive larger weighting scores.
- in the rotation suitability DB, a large weighting score is recorded for combinations of plants suited to crop rotation. Symbiosis scores based on these weighting scores are recorded in the allelopathy DB.
- for example, the symbiosis score between peanuts and watermelon, one of the other plants recorded as a combination target for peanuts, is obtained from the various conditions, results, and so on observed when the two are cultivated in a mixed, dense state. The average of the weighting scores over those elements is calculated: an element is assigned a large weighting score when the yield is high and a small weighting score when the yield is low, and the average of these values is used as the symbiosis score. The calculation may be performed each time a plant is designated, or automatically at a predetermined timing.
- when the allelopathy DB and the rotation suitability DB are used to calculate the symbiosis score, cases in which the plant species that grow well change from year to year due to vegetation transition are averaged out. The symbiosis score can therefore also be evaluated as an average of differences over a variable long-term window, for example divided into the last several years, which makes it possible to respond to and take advantage of vegetation transition.
- step S34 the cooperative farming method (registered trademark) CMS 71 evaluates the symbiosis score of each vegetation design searched in step S32. That is, the symbiosis scores of the vegetation designs including the plant designated by the user are compared.
- step S35 the cooperative farming method (registered trademark) CMS 71 selects a vegetation design having a higher symbiosis score. That is, one or more combinations of plant species having a large value of the symbiosis score evaluated in step S34 are selected in descending order.
- all the vegetation designs searched can be presented to the user as they are.
- in that case, the evaluation of the symbiosis score and the selection of the vegetation designs with the higher symbiosis scores can be omitted; the evaluation of the symbiosis score can instead be performed by the user.
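- As a concrete illustration of steps S32 to S35, the following is a minimal sketch in Python, assuming a hypothetical data structure (a mapping from plant pairs to lists of per-observation weighting scores, standing in for the allelopathy DB and the rotation suitability DB): the symbiosis score is the average of the recorded weighting scores, and the designs with the highest scores are selected in descending order.

```python
from statistics import mean

# Hypothetical stand-in for the allelopathy DB / rotation suitability DB:
# per-observation weighting scores recorded for each plant pair.
weighting_scores = {
    ("peanut", "watermelon"): [0.8, 0.9, 0.7],  # mostly high-yield observations
    ("peanut", "corn"):       [0.4, 0.3],
    ("peanut", "mugwort"):    [0.1],
}

def symbiosis_score(pair):
    """Average of the weighting scores recorded for a plant pair (step S33)."""
    return mean(weighting_scores[pair])

def select_vegetation_designs(plant, top_n=2):
    """Search the pairs containing the designated plant (step S32), score and
    compare them (steps S33/S34), and return the top-N combinations in
    descending order of symbiosis score (step S35)."""
    candidates = [p for p in weighting_scores if plant in p]
    return sorted(candidates, key=symbiosis_score, reverse=True)[:top_n]

print(select_vegetation_designs("peanut"))
# [('peanut', 'watermelon'), ('peanut', 'corn')]
```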
- step S36 the communication unit 67 of the server 13 transmits the selected vegetation design to the terminal 12 that has transmitted the plant species received in step S31 to the server 13 via the network 10.
- step S13 the acquisition unit 51 of the terminal 12 acquires the vegetation design transmitted from the server 13 by causing the communication unit 40 to receive the vegetation design. Thereby, the vegetation design of the plant designated by the user in Step S11 is acquired.
- step S14 the display control unit 52 causes the display unit 35 to display the vegetation design acquired from the server 13.
- the user can know each combination of the plant species input in step S11 and the vegetation suitable for constructing the mixed vegetative state.
- the user can select a predetermined combination from the combinations proposed and displayed from the ecosystem utilization system and actually cultivate it in the field.
- the displayed combinations pair the plant species designated by the user with vegetation suitable for constructing a mixed vegetative state, so the yield can be improved compared with cultivating a random combination. As a result, cost can also be reduced compared with cultivating plants in a random combination.
- note that the information presented to the user here is not itself a prediction; it is reference information for prediction, based on past empirical rules. The prediction is made by the user on the basis of this reference information.
- in the cooperative farming method (registered trademark), since multiple types of plants are grown in a mixed, dense state, risk can be dispersed and a large total yield can be obtained compared with cultivating only a single type of plant. This is another reason the vegetation design of the ecosystem utilization system is called a seed portfolio. The user can therefore designate how many of the top combinations are presented, and of course a more appropriate number of plants can also be presented. This enables risk management.
- FIG. 17 is a diagram showing an output example of symbiotic allelopathy.
- FIG. 17 shows a display example in step S14 of FIG. 16.
- plants suitable for constructing a mixed, dense state with the 10 types of plants shown in the uppermost row, that is, companion plants, are shown in the rows below.
- plants suitable for building a mixed state with corn include watermelon and melon (Cucurbitaceae), pumpkin, legumes, lettuce and salad vegetables, sweet basil, vermicelli, geranium, parsley, soybean, morning glory, komatsuna (Japanese mustard spinach), and leafy vegetables.
- Plants suitable for building a mixed state with celery are tomatoes, cabbages, Chinese cabbage, turnips, and peas.
- the plant name shown at the bottom level is displayed as a plant suitable for constructing a mixed and dense state. Therefore, the user can select one or more plants from the display and vegetate together with the designated plants in a mixed and dense state.
- FIG. 18 is a schematic diagram showing a display example of an AR object.
- in AR (Augmented Reality) display, object data (as an image) is superimposed on a predetermined background space, such as the real space the user actually sees, and is displayed as if the object data existed in the background space.
- object data capable of AR display is also referred to as an AR object.
- as methods of AR display, there is a method in which image recognition is performed on the background space, a marker is detected, and an AR object is displayed at the position corresponding to the marker, and a method in which an AR object is displayed at a position given by absolute coordinates such as GPS coordinates.
- as the marker, an object originally existing in the background space may be used, or printed matter on which an image such as a predetermined pattern is printed may be used.
- an AR object can be associated with a marker or GPS coordinate
- AR display can be performed by superimposing the AR object on a position in the background space corresponding to the marker or GPS coordinate.
- for example, for a vegetable actually growing at the position represented by a predetermined GPS coordinate, the server 13 can associate an AR object representing its species name with that GPS coordinate; for sowing and management work performed at a predetermined GPS coordinate, it can associate an AR object representing the sowing and management work information with that GPS coordinate; and for indigenous vegetation at a predetermined GPS coordinate, it can associate an AR object representing the indigenous vegetation with that GPS coordinate.
- an AR object associated with a predetermined GPS coordinate can be displayed at the position corresponding to that GPS coordinate in the background space (image).
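- A minimal sketch of the association described above, assuming a hypothetical in-memory store on the server side: each AR object (here just a display name) is keyed by GPS coordinates, and the terminal looks up the objects falling inside the currently visible area in order to superimpose them.

```python
from dataclasses import dataclass

@dataclass
class ARObject:
    label: str   # e.g. species name, work information, indigenous vegetation
    lat: float   # GPS latitude of the vegetated position
    lon: float   # GPS longitude

# Hypothetical registry kept by the server 13.
ar_objects = [
    ARObject("Chinese cabbage", 34.5321, 136.8402),
    ARObject("mugwort community band", 34.5324, 136.8410),
]

def objects_in_view(lat_min, lat_max, lon_min, lon_max):
    """Return AR objects whose GPS coordinates fall inside the visible area,
    so the terminal can superimpose them on the background space."""
    return [o for o in ar_objects
            if lat_min <= o.lat <= lat_max and lon_min <= o.lon <= lon_max]

for obj in objects_in_view(34.53, 34.54, 136.84, 136.85):
    print(obj.label, obj.lat, obj.lon)
```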
- Chinese cabbage 101 to 104, leek 105, Japanese radish 106, cauliflower 107, komatsuna 108, burdock 109, and a mugwort community band 110 are each growing at positions represented by predetermined GPS coordinates in a certain field.
- the AR objects of the Chinese cabbage 101 through the mugwort community band 110 are associated with the GPS coordinates of the positions where the Chinese cabbage 101 through the mugwort community band 110 are growing, and are recorded and managed.
- in the server 13, an AR object representing the name input by the user (an image displaying that name) is associated with the GPS coordinates where the plant whose name was input is growing. The captured image is displayed on the display unit 35, and when a predetermined instruction is input, the name represented by the corresponding AR object, such as "Chinese cabbage", is superimposed, as shown in FIG. 18, on the image of the actual plant growing at the position of those GPS coordinates.
- the mugwort community band 110 is not vegetation resulting from sowing by the user but indigenous vegetation; in FIG. 18, such indigenous vegetation is also given an AR object by the user.
- the AR object is displayed using GPS coordinates, but the AR object can be displayed using a marker. That is, the AR object can be associated with a GPS coordinate and displayed at a position in the background space corresponding to the GPS coordinate, or can be associated with a marker and displayed at a position in the background space corresponding to the marker.
- FIG. 19 is a diagram showing an example of a site map of a web page as a cooperative farming method (registered trademark) page.
- the server 13 provides the user of the terminal 12 with a cooperative farming method (registered trademark) page, which is a web page in which the cooperative farming method (registered trademark) DB is linked according to the geographical hierarchy shown in FIG. 19.
- an icon for selecting the cooperative farming method (registered trademark) DB is displayed on the top page of the cooperative farming method (registered trademark) page.
- the cooperative farming method (registered trademark) page consists of a hierarchy of the top page, the distribution of fields on a map, overall information for each field, overall information for each field section, and overall information for each ridge section.
- the field distribution on the map, the overall field information, the overall field-section information, and the overall ridge-section information correspond to the geographical hierarchy of the earth, the field, the field section, and the ridge section, respectively. The display unit 35 of the terminal 12 therefore outputs and displays the fields according to this hierarchical structure. For example, the user can select the screens of field sections # 1-1, # 1-2, and so on.
- the entire field information is linked to the climate classification DB, the weather DB, the Kyosei Agricultural Method (registered trademark) assessment document DB, the photo record DB, and the coordinate DB that records GPS coordinates as position information.
- the entire field-section information is linked to the yield DB, the insect flora DB, the phenology DB, the photo record DB, and the coordinate DB.
- the entire ridge-section information is linked to the sowing DB, the vegetation DB, the phenology DB, the photo record DB, and the coordinate DB.
- FIG. 20 is a diagram showing a display example of field distribution on a map provided on the Kyosei Agricultural Method (registered trademark) page.
- the position of the field is indicated by a flag 121 on the map of the earth.
- the icons 122 to 129 correspond to the seeding DB, vegetation DB, yield DB, photo record DB, phenology DB, insect flora DB, climate classification DB, and weather DB, respectively, and are operated to read out the corresponding DB.
- the geographical hierarchy icon 130 is operated, for example, when selecting a field level such as a field section or a ridge section.
- the icon 131 is operated when a search is instructed, the icon 132 is operated when a sort is instructed, and the icon 133 is operated when a key event is instructed.
- the cooperative farming method (registered trademark) CMS 71 of the server 13 searches all words and file names.
- the cooperative farming method (registered trademark) CMS 71 also has a similar-word search function that identifies different notations of the same thing with one another.
- for example, different notations of the same date, such as “April 1, 2011”, “20110401”, “01042011”, “2011/4/1”, “1/4/2011”, and so on, are identified with one another.
- it is also possible to perform a collective search in which the kana notation, kanji notation, Japanese name, scientific name, English name, and common name of a species are identified with one another; for example, the kana and kanji notations of “potato” are treated as the same species.
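- A minimal sketch of the date-identification part of this collective search, assuming only the notations listed above (the actual CMS may recognize more, including Japanese era dates): each notation is parsed into a canonical ISO date so that different spellings match in search.

```python
from datetime import datetime

# Candidate formats for the notations listed above (an assumption).
DATE_FORMATS = ["%B %d, %Y", "%Y%m%d", "%d%m%Y", "%Y/%m/%d", "%d/%m/%Y"]

def canonical_date(text):
    """Parse a date written in any supported notation into ISO form."""
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(text.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    return None  # not a recognized date notation

for s in ["April 1, 2011", "20110401", "01042011", "2011/4/1", "1/4/2011"]:
    print(s, "->", canonical_date(s))   # all print 2011-04-01
```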
- the cooperative farming method (registered trademark) CMS 71 can sort on any parameter. For example, search results can be rearranged by date or alphabetically by species name.
- FIG. 21 is a flowchart for explaining an example of processing when the user refers to information on a farm field (field).
- step S41 the acquisition unit 51 of the terminal 12 acquires field-level information. That is, when referring to information about a field, the user operates the operation unit 34 and selects the geographical hierarchy icon 130 (see FIG. 20). When this operation is performed, the display unit 35 displays a screen for selecting a field level, that is, a list of field levels. The user operates the operation unit 34 to select the level to be referred to from this screen.
- the acquisition unit 51 acquires the selection information
- the communication unit 40 transmits the selection information to the server 13.
- step S61 the communication unit 67 of the server 13 receives the field level information selected by the terminal 12.
- step S62 a process of creating and outputting a list of fields at the level selected by the terminal 12 in step S41 is performed. That is, the cooperative farming method (registered trademark) CMS 71 searches the coordinate DB to generate a list of fields at the level received in step S61, and the communication unit 67 transmits the list to the terminal 12.
- step S42 a list is received and displayed. That is, the list output from the server 13 is received by the communication unit 40 of the terminal 12, and the display control unit 52 displays the list on the display unit 35.
- step S43 the communication unit 40 transmits information on the field selected from the list.
- step S63 the communication unit 67 of the server 13 receives the information on the field selected by the terminal 12.
- step S64 the cooperative farming method (registered trademark) CMS 71 searches the cooperative farming method (registered trademark) DB for a database related to the field received in step S63.
- that is, the DBs related to the field at the level designated by the user are searched from the cooperative farming method (registered trademark) DB.
- step S65 processing for outputting a list of searched field DBs is performed. That is, the cooperative farming method (registered trademark) CMS 71 creates a list of related DBs based on the search result, and the communication unit 67 outputs the list to the terminal 12.
- step S44 the communication unit 40 of the terminal 12 receives a list of searched field DBs.
- step S45 the display control unit 52 causes the display unit 35 to display a list of field DBs received by the communication unit 40.
- step S46 the acquisition unit 51 acquires the input DB and the field information to be referred to.
- step S47 the communication unit 40 transmits the information acquired in step S46 to the server 13.
- step S66 the communication unit 67 of the server 13 receives the information transmitted from the terminal 12.
- step S67 the cooperative farming method (registered trademark) CMS 71 reads the field information of the designated coordinates in the designated DB based on the received information. That is, the field information of the field of coordinates input by the user in the DB received in step S66 is read.
- step S68 the communication unit 67 transmits the read field information to the terminal 12.
- step S48 the communication unit 40 of the terminal 12 receives the field information read from the DB from the server 13.
- step S49 the display control unit 52 displays the field information received by the communication unit 40 on the display unit 35.
- step S50 the acquisition unit 51 acquires the date selection information of the information to be referred to. Then, the date selection information is transmitted to the server 13 by the communication unit 40.
- step S69 the communication unit 67 of the server 13 receives the date selection information of the information to be referred to from the terminal 12.
- the cooperative farming method (registered trademark) CMS 71 reads the information on the designated date from the cooperative farming method (registered trademark) DB.
- the communication unit 67 transmits the information on the read date to the terminal 12.
- step S51 the communication unit 40 of the terminal 12 receives the read date information from the server 13.
- step S52 the display control unit 52 displays the date information received in step S51 on the display unit 35.
- the field level is selected by the icon 130 of the geographical hierarchy, but the field to be referred to can be directly designated by operating the flag 121.
- FIG. 22 is a flowchart illustrating an example of a photo upload process.
- the acquisition unit 51 of the terminal 12 acquires the photograph data (image data) in step S81.
- step S82 the position detection unit 39 of the terminal 12 detects GPS coordinates as position information. That is, the coordinates of the subject photographed by the terminal 12 are acquired.
- the coordinates can be the current position of the terminal 12, or the distance and direction from the current position to the subject can be calculated and the current position can be corrected to obtain more accurate subject coordinates.
- the user can also input coordinates by operating the operation unit 34.
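- The coordinate correction described above can be sketched as follows, under a simple flat-earth approximation that is adequate at field scale; the function name and parameters are illustrative assumptions.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def offset_position(lat, lon, distance_m, bearing_deg):
    """Offset a GPS position by a distance (m) along a compass bearing
    (degrees clockwise from north), using a small-distance approximation,
    to estimate the subject's coordinates from the terminal's position."""
    bearing = math.radians(bearing_deg)
    dlat = distance_m * math.cos(bearing) / EARTH_RADIUS_M
    dlon = distance_m * math.sin(bearing) / (EARTH_RADIUS_M * math.cos(math.radians(lat)))
    return lat + math.degrees(dlat), lon + math.degrees(dlon)

# Terminal at (34.5321, 136.8402), subject 15 m away toward the north-east.
print(offset_position(34.5321, 136.8402, 15, 45))
```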
- step S83 the display control unit 52 displays a list of information on the cooperative farming method (registered trademark) DB to be linked to the photo data on the display unit 35.
- step S84 the acquisition unit 51 acquires information on the selection of information to be linked to the photo data. That is, when the user operates the operation unit 34 to select information to be linked to the photo data from the list displayed on the display unit 35, the acquisition unit 51 acquires the selection information.
- the user can input information (mainly character information) to be added as a comment to the uploaded photo by operating the operation unit 34.
- the name “Hakusai” is input as a comment.
- in step S85, the acquisition unit 51 acquires the information input as the comment.
- the communication unit 40 transmits the information acquired in steps S81, S82, S84, and S85 to the server 13.
- step S91 the communication unit 67 of the server 13 receives information transmitted from the terminal 12.
- step S92 the cooperative farming method (registered trademark) CMS 71 registers the information received in step S91 in the cooperative farming method (registered trademark) DB. That is, the photograph taken by the user is registered in the photograph recording DB together with the comment, and further linked to information on the cooperative farming method (registered trademark) DB selected by the user.
- the user can upload a predetermined photograph and comment from the terminal 12 to the server 13.
- the user can refer to the uploaded information later by the process of FIG. 21.
- FIG. 23 is a flowchart illustrating an example of key event registration processing.
- the user can register any event as a key event in the meta DB of the cooperative farming method (registered trademark) DB.
- as key events, events presumed to be important for the management of the cooperative farming method (registered trademark) can be adopted. A key event is defined by a name in natural language and links to the corresponding items in each DB of the cooperative farming method (registered trademark) DB.
- step S101 the acquisition unit 51 accepts selection of the key event icon 133.
- step S102 the acquisition unit 51 acquires photo data and date. That is, for example, when a user photographs a plant or the like as an event to be recorded as a key event with the camera 37 and operates the operation unit 34 to input a date, the information is acquired by the acquisition unit 51.
- step S103 the position detection unit 39 acquires GPS coordinates as position information. That is, coordinates corresponding to the photographed photograph (GPS coordinates of a plant or the like as an event reflected in the photograph) are acquired.
- step S104 the acquisition unit 51 acquires an input character. That is, when the user operates the operation unit 34 to input character information as a key event, this is acquired. For example, when a user finds a rosette-like Chinese cabbage, he / she can take a picture of the Chinese cabbage and input “Chinese cabbage rosette” as a key event.
- step S105 the communication unit 40 transmits the information acquired in steps S102 to S104 to the server 13.
- step S121 the communication unit 67 of the server 13 receives the information transmitted from the terminal 12.
- step S122 the cooperative farming method (registered trademark) CMS 71 records the information received by the communication unit 67 in the meta DB. That is, the information acquired by the terminal 12 in steps S102 to S104 is recorded (registered) in the meta DB as the key event DB.
- step S106 the acquisition unit 51 of the terminal 12 acquires the field level. That is, when recording a key event, the user operates the operation unit 34 to select a geographic hierarchy icon 130 (FIG. 20). When this operation is performed, the display unit 35 displays a screen for selecting a field level. The user operates the operation unit 34 to select a field level to be referred to from the screen. The acquisition unit 51 acquires the selection information, and the communication unit 40 transmits the selection information to the server 13.
- step S123 the communication unit 67 of the server 13 receives the field level information selected by the terminal 12.
- step S124 a list of fields at the level selected by the terminal 12 in step S106 is created and output. That is, the cooperative farming method (registered trademark) CMS 71 searches the coordinate DB to generate a list of fields at the level received in step S123, and the communication unit 67 transmits the list to the terminal 12.
- step S107 a list is received and displayed. That is, the list output from the server 13 is received by the communication unit 40 of the terminal 12, and the display control unit 52 displays the list on the display unit 35.
- step S108 the communication unit 40 transmits the field information selected from the list to the server 13.
- step S125 the communication unit 67 of the server 13 receives information on the field selected by the terminal 12.
- step S126 the cooperative farming method (registered trademark) CMS 71 searches the cooperative farming method (registered trademark) DB for the DB in which the field information received in step S125 is registered.
- that is, the DBs related to the field at the level designated by the user are searched from the cooperative farming method (registered trademark) DB.
- step S127 processing for outputting a list of searched DBs is performed. That is, the cooperative farming method (registered trademark) CMS 71 creates a list of DBs related to the field at the level designated by the user based on the search result, and the communication unit 67 transmits the list to the terminal 12.
- step S109 the communication unit 40 of the terminal 12 receives the DB list from the server 13.
- step S110 the display control unit 52 displays the list of DBs from the server 13 on the display unit 35.
- step S111 the acquiring unit 51 acquires the input DB and field coordinate information linked to the key event.
- step S112 the communication unit 40 transmits the information acquired in step S111 to the server 13.
- step S128 the communication unit 67 of the server 13 receives the information transmitted from the terminal 12.
- in step S129, the cooperative farming method (registered trademark) CMS 71 reads the field information of the designated coordinates from the designated DB of the cooperative farming method (registered trademark) DB. That is, the field information of the field at the coordinates input by the user is read from the DB likewise input by the user.
- step S130 the communication unit 67 transmits the read field information to the terminal 12.
- step S113 the communication unit 40 of the terminal 12 receives the read field information. This information is displayed on the display unit 35.
- by looking at this display, the user confirms that the field he or she specified (input) is the field to be linked to the key event. After this confirmation, the user operates the operation unit 34 to instruct that the field information specified in the DB specified (input) by the user be linked with the key event. Based on this instruction, in step S114, the communication unit 40 transmits a link command to the server 13.
- step S131 the communication unit 67 of the server 13 receives the link command transmitted from the terminal 12.
- step S132 the cooperative farming method (registered trademark) CMS 71 links the newly recorded key event to the designated field information. That is, the key event newly recorded in the meta DB in step S122 is linked to the field information designated by the user in step S111.
- by operating the operation unit 34 of the terminal 12, the user can refer, from a key event, to the field information linked to that key event, and conversely, from field information, can access the key events linked to it.
- below, specific examples of key events are described in the order of the key event name, the record in free language, and the related DBs, that is, the DBs in which the field information linked to the key event is registered.
- Key event name: Rosette. Record in free language: in winter, some vegetable species change into a flat, crumpled form on the surface of the ground and survive until spring in a shape that does not wither even in cold weather; this form can also be harvested. Related DBs: vegetation DB, phenology DB, yield DB, weather DB.
- Key event name: Late frost. Record in free language: if the ground surface falls below 4 °C immediately after germination in spring, the foliage is annihilated by late frost. Related DBs: seeding DB, management record DB, vegetation DB, phenology DB, weather DB.
- the following describes the relation graph generated by the graph display control unit 72 of the server 13 (or the acquisition unit 51 of the terminal 12).
- conversely to the case described above, a relation graph representing the relationship between items of the information i2 can also be generated using the information i1 as a scale.
- as the DB used to generate the relation graph, in addition to a DB in which the information i1 and i2 are explicitly associated, a DB in which the information i1 and i2 are implicitly associated can be used.
- for example, when the information i1 is observed vegetation (the name of the vegetation) and the information i2 is the field (the name of the field) where the vegetation was observed, a DB in which the information i1 and i2 are explicitly associated is a DB in which vegetation and the field where that vegetation was observed are registered in association with each other.
- the DB in which the information i1 and i2 are implicitly associated is, for example, a DB in which a natural language such as “vegetation # 1 was observed in the field # 1” is registered.
- the graph display control unit 72 generates, for example, a bipartite graph as a multipartite graph from the DB in which the information i1 and i2 are associated, and generates a relation graph from the bipartite graph.
- FIG. 24 is a diagram showing an example of a bipartite graph generated from the DB.
- here, a vegetation / field DB, in which vegetation and the fields where the vegetation was observed are associated, is adopted as the DB.
- a relationship score representing the relationship (strength) between vegetation #i and #j is obtained using field #k as a scale (i ⁇ j ).
- FIG. 25 is a diagram illustrating an example of a relation score obtained from the bipartite graph of the vegetation / field DB in FIG.
- as the relationship score between vegetation #i and another vegetation #j, a value corresponding to (for example, proportional to) the number of fields associated with both vegetation #i and #j, that is, the number of fields where both vegetation #i and #j were observed, can be adopted.
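- A minimal sketch of this relation-score computation, assuming the vegetation / field DB is given as a simple mapping from each vegetation to the set of fields where it was observed (one side of the bipartite graph): the score of a pair is the number of shared fields.

```python
# Hypothetical bipartite graph of the vegetation / field DB:
# each vegetation maps to the set of fields where it was observed.
vegetation_fields = {
    "vegetation#1": {"field#1", "field#2", "field#3"},
    "vegetation#2": {"field#1", "field#2"},
    "vegetation#3": {"field#1", "field#3"},
    "vegetation#4": {"field#3"},
    "vegetation#6": set(),
}

def relation_score(i, j, bipartite=vegetation_fields):
    """Number of fields associated with both vegetation #i and #j."""
    return len(bipartite[i] & bipartite[j])

print(relation_score("vegetation#1", "vegetation#2"))  # 2
print(relation_score("vegetation#1", "vegetation#6"))  # 0
```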
- the graph display control unit 72 obtains a relationship score from the bipartite graph, generates a relationship graph using the relationship score, and generates a graph display screen displaying the relationship graph.
- FIG. 26 is a diagram illustrating an example of a graph display screen on which a relation graph generated using the bipartite graph of FIG. 24 and the relation score of FIG. 25 is displayed.
- the relationship graph is composed of nodes indicated by circles in the figure and links indicated by line segments connecting the nodes.
- nodes correspond to vegetation
- links represent nodes, that is, here, relationships between vegetation.
- the relation graph of FIG. 26 shows that the node of vegetation # 1 (the node corresponding to vegetation # 1) is the attention node of interest, and vegetation # 1 represented by the attention node and the other vegetation # 2 to # 6 respectively. Represents the relationship.
- the relationship graph is arranged such that, for example, the node of interest, that is, the vegetation # 1 node is located at (approximately) the center of the screen.
- the length of the link between the node of vegetation # 1, which is the node of interest, and the nodes of the other vegetation # 2 to # 6 corresponds to the relationship score, shown in FIG. 25, between vegetation # 1 and each of vegetation # 2 to # 6.
- the link between the node of vegetation # 1, which is the node of interest, and the node of another vegetation #j is shorter as the relationship score between vegetation # 1 and #j is larger, that is, as the relationship between vegetation # 1 and #j is stronger.
- the strong relationship between vegetation # 1 and #j corresponds to the large number of fields where both vegetation # 1 and #j are observed.
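- The inverse relationship between relation score and link length can be sketched as follows; the exact mapping is an assumption, since the text only states that a larger score gives a shorter link.

```python
def link_length(score, base=200.0, min_length=30.0):
    """Map a relation score to a link length in pixels: the larger the
    score (the stronger the relationship), the shorter the link."""
    if score <= 0:
        return base  # unrelated nodes are drawn far from the node of interest
    return max(min_length, base / (1.0 + score))

for s in [0, 1, 2, 3]:
    print(s, link_length(s))   # lengths shrink as the score grows
```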
- vegetation #j represented by a node near the node of vegetation # 1, which is the node of interest, is presumed to be in a symbiotic relationship with vegetation # 1, and a user who views the relation graph of FIG. 26 can easily recognize (estimate) vegetation suitable for constructing a mixed vegetation state with vegetation # 1.
- thereby, vegetation design can be supported as one form of support for ecosystem utilization.
- the node of vegetation # 1 is the target node, but any node can be selected as the target node on the graph display screen.
- for example, when the user selects the node of vegetation # 3 as a new node of interest, the display control unit 52 displays a graph display screen on which the relation graph with the vegetation # 3 node as the node of interest is displayed.
- FIG. 27 is a diagram showing an example of a graph display screen on which a relation graph with the vegetation # 3 node as a target node is displayed.
- the relationship graph is arranged so that the node of vegetation # 3, which is the node of interest, is located at the center of the screen.
- the length of the link between the node of vegetation # 3, which is the node of interest, and the nodes of the other vegetation # 1, # 2, # 4 to # 6 corresponds to the relationship score, shown in FIG. 25, between vegetation # 3 and each of vegetation # 1, # 2, # 4 to # 6.
- in the relation graph, in addition to the relationship between items of the information i1 with the information i2 as a scale, the relationship between the information i1 and the information i2 associated with it can also be represented.
- FIG. 28 is a diagram showing an example of a graph display screen on which a relationship graph representing a relationship between vegetation and a field associated with the vegetation is displayed in addition to the relationship between vegetation.
- that is, in the relation graph of FIG. 28, field nodes (indicated by triangles in the figure) associated with vegetation, and links (indicated by dotted lines in the figure) representing the relationship between the vegetation and the fields, have been added to the relation graph of FIG. 26.
- according to the relation graph of FIG. 28, the user can easily recognize not only the vegetation suitable for constructing a mixed vegetation state with vegetation # 1 but also the fields where vegetation # 1 was observed.
- the user can estimate the environment in which vegetation # 1 is observed by accessing the cooperative farming method (registered trademark) DB and examining the environment of the fields in which vegetation # 1 was observed.
- when nodes of information of different categories, such as vegetation and fields, are represented in the relation graph, the nodes can be displayed so that vegetation nodes and field nodes are distinguishable.
- the vegetation node and the field node can be displayed by adopting different colors, sizes, shapes, patterns, and the like, for example.
- the graph display screen on which a relation graph representing the relationship between vegetation and the fields associated with the vegetation is displayed has been described with reference to FIG. 28.
- on such a graph display screen, not only a vegetation node but also a field node can be selected as the node of interest.
- when the user selects a field node, the display control unit 52 displays a graph display screen on which a relation graph with the selected field node as the node of interest is displayed.
- the relation graph with a field node as the node of interest represents the relationship between fields, using the vegetation associated with the fields in the vegetation / field DB as a scale. Therefore, when a graph display screen displaying a relation graph with a field node as the node of interest is displayed, a relationship score representing the relationship between fields #i and #j is obtained using vegetation #k as a scale (i ≠ j).
- FIG. 29 is a diagram showing an example of relation scores obtained from the bipartite graph of FIG. 24.
- whereas FIG. 25 showed relationship scores between vegetation, FIG. 29 shows relationship scores between fields.
- as the relationship score between field #i and another field #j, a value corresponding to (for example, proportional to) the number of vegetation associated with both fields #i and #j, that is, the number of vegetation observed in both fields #i and #j, can be adopted.
- the vegetation observed in both the fields # 1 and # 2 is two vegetation # 1 and # 2.
- the vegetation observed in both fields # 1 and # 3 is one of vegetation # 1, and zero vegetation is observed in both fields # 1 and # 4.
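- This field-to-field score is the same computation as the vegetation-to-vegetation score with the two sides of the bipartite graph swapped. A minimal sketch, using the transposed mapping (each field maps to the set of vegetation observed there), reproduces the values above.

```python
# Transpose of the hypothetical vegetation -> fields mapping:
# each field maps to the set of vegetation observed there.
field_vegetation = {
    "field#1": {"vegetation#1", "vegetation#2", "vegetation#3"},
    "field#2": {"vegetation#1", "vegetation#2"},
    "field#3": {"vegetation#1"},
    "field#4": set(),
}

def field_relation_score(i, j):
    """Number of vegetation observed in both field #i and field #j."""
    return len(field_vegetation[i] & field_vegetation[j])

print(field_relation_score("field#1", "field#2"))  # 2
print(field_relation_score("field#1", "field#3"))  # 1
print(field_relation_score("field#1", "field#4"))  # 0
```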
- the graph display control unit 72 obtains a relationship score from the bipartite graph, generates a relationship graph using the relationship score, and generates a graph display screen displaying the relationship graph.
- FIG. 30 is a diagram showing an example of a graph display screen on which a relation graph generated using the relation score of FIG. 29 is displayed.
- FIG. 30 shows an example of a graph display screen of a relation graph displayed when, for example, the user selects the node of the field # 1 as the target node in the relation graph of FIG.
- the node of the field # 1 is set as the node of interest, and the relationship between the field # 1 represented by the node of interest and each of the other fields # 2 to # 4 is represented.
- the relation graph is arranged so that the node of the field # 1, which is the node of interest, is located at the center of the screen, as described with reference to FIG. 26.
- the length of the link between the node of the field # 1, which is the node of interest, and the nodes of the other fields # 2 to # 4 corresponds to the relationship score, shown in FIG. 29, between the field # 1 and each of the fields # 2 to # 4.
- the link between the node of the field # 1, which is the node of interest, and the node of another field #j is shorter as the relationship score between the fields # 1 and #j is larger, that is, as the relationship between the fields # 1 and #j is stronger.
- it can be estimated that the field # 1 represented by the node of interest and a field #j represented by a node near the node of interest are fields that share much of an environment suitable for the vegetation observed in both fields # 1 and #j.
- therefore, by accessing the cooperative farming method (registered trademark) DB and examining the environment common to the fields # 1 and #j, the user can estimate the environment suitable for the vegetation observed in both fields # 1 and #j.
- the relation graph of FIG. 30 also displays the nodes of vegetation # 1 and # 2 observed in the field # 1, linked to the field # 1 represented by the node of interest.
- the user can select an arbitrary node as a node of interest by operating the operation unit 34.
- for example, when the user selects the node of the field # 3 as a new node of interest, the display control unit 52 displays a graph display screen on which the relation graph with the field # 3 node as the node of interest is displayed.
- FIG. 31 is a diagram showing an example of a graph display screen on which a relation graph with the node of the field # 3 as a target node is displayed.
- the relationship graph is arranged so that the node of the field # 3 which is the attention node is located at the center of the screen.
- the length of the link between the node of the field # 3, which is the node of interest, and the nodes of the other fields # 1, # 2, and # 4 corresponds to the relationship score, shown in FIG. 29, between the field # 3 and each of the fields # 1, # 2, and # 4.
- in the relation graph of FIG. 31, the nodes of vegetation # 1 to # 3 observed in the field # 3 are also displayed, linked to the field # 3 represented by the node of interest.
- in the relation graph of FIG. 31, for example, when the node of vegetation # 1 is newly selected as the node of interest, the relation graph becomes as shown in FIG. 28.
- by referring to the relation graph of FIG. 28, the user can easily recognize both the other vegetation that coexists with vegetation # 1 represented by the node of interest and the fields where vegetation # 1 was observed.
- when the user newly selects a vegetation node other than vegetation # 1 as the node of interest in the relation graph of FIG. 28, a relation graph with the newly selected node as the node of interest is displayed, as described with reference to FIGS. 26 and 27, and the user can easily recognize the other vegetation that coexists with the vegetation represented by the new node of interest.
- similarly, when a relation graph representing the relationship between fields and the vegetation observed in those fields is displayed as shown in FIG. 30, the user can easily recognize other fields that share an environment with the field represented by the node of interest, as well as the vegetation observed in the field represented by the node of interest.
- FIG. 32 is a diagram showing an example of a bipartite graph generated from the vegetation / recipe DB.
- the vegetation / recipe DB is a DB in which vegetation is associated with a recipe for cooking using the vegetation as a material.
- the bipartite graph in FIG. 32 represents that vegetation # 1 was observed in (is a material of) recipes # 1 to # 3, vegetation # 2 was observed in recipes # 1 and # 2, vegetation # 3 was observed in recipes # 1 and # 3, vegetation # 4 and # 5 were each observed in recipe # 3, and vegetation # 6 was not observed in any of recipes # 1 to # 3.
- a relationship score representing the relationship (strength) between vegetation #i and #j is obtained using recipe #k as a scale (i ⁇ j).
- FIG. 33 is a diagram showing an example of a relation score obtained from the bipartite graph of the vegetation / recipe DB of FIG.
- as the relationship score between vegetation #i and another vegetation #j, a value corresponding to the number of recipes associated with both vegetation #i and #j, that is, the number of recipes in which both vegetation #i and #j are observed (in the example below, divided by the total number of recipes), can be adopted.
- for vegetation # 1, the recipes in which it is observed together with vegetation # 2 are the two recipes # 1 and # 2; with vegetation # 3, the two recipes # 1 and # 3; with vegetation # 4, the one recipe # 3; and with vegetation # 5, the one recipe # 3. Accordingly, the relationship score of vegetation # 1 with each of vegetation # 2 and # 3 is 2/3, the relationship score with each of vegetation # 4 and # 5 is 1/3, and the relationship score with vegetation # 6 is 0.
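- The 2/3 and 1/3 values above correspond to dividing the number of shared recipes by the total number of recipes; a minimal sketch under that normalization assumption:

```python
# Hypothetical bipartite graph of the vegetation / recipe DB (FIG. 32).
vegetation_recipes = {
    "vegetation#1": {"recipe#1", "recipe#2", "recipe#3"},
    "vegetation#2": {"recipe#1", "recipe#2"},
    "vegetation#3": {"recipe#1", "recipe#3"},
    "vegetation#4": {"recipe#3"},
    "vegetation#5": {"recipe#3"},
    "vegetation#6": set(),
}
TOTAL_RECIPES = 3

def recipe_relation_score(i, j):
    """Shared-recipe count normalized by the total number of recipes."""
    return len(vegetation_recipes[i] & vegetation_recipes[j]) / TOTAL_RECIPES

print(recipe_relation_score("vegetation#1", "vegetation#2"))  # 0.666...
print(recipe_relation_score("vegetation#1", "vegetation#4"))  # 0.333...
print(recipe_relation_score("vegetation#1", "vegetation#6"))  # 0.0
```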
- the graph display control unit 72 obtains a relationship score from the bipartite graph, generates a relationship graph using the relationship score, and generates a graph display screen displaying the relationship graph.
- FIG. 34 is a diagram showing an example of a graph display screen on which a relation graph generated using the bipartite graph of FIG. 32 and the relation score of FIG. 33 is displayed.
- the relationship graph of FIG. 34 represents the relationship between vegetation # 1 represented by the node of interest and the other vegetation # 2 to # 6, with the node of vegetation # 1 as the node of interest.
- the length of the link between the node of vegetation # 1, which is the node of interest, and the nodes of the other vegetation # 2 to # 6 corresponds to the relationship score, shown in FIG. 33, between vegetation # 1 and each of vegetation # 2 to # 6.
- the link between the node of vegetation # 1, which is the node of interest, and the node of another vegetation #j is shorter as the relationship score between vegetation # 1 and #j is larger, that is, as the relationship between vegetation # 1 and #j is stronger.
- the strong relationship between vegetation # 1 and #j corresponds to a large number of recipes in which both vegetation # 1 and #j are observed.
- vegetation #j represented by a node near the node of vegetation # 1, which is the node of interest, is often used in cooking together with vegetation # 1, so a user who views the relation graph of FIG. 34 can easily recognize vegetation that is often used in cooking together with vegetation # 1.
- tomato and basil are often used together for cooking, but vegetation often used together for cooking may be in a symbiotic relationship.
- the user operates the operation unit 34 to select a vegetation node other than vegetation # 1 as a target node and display a relation graph having the selected vegetation node as a target node. be able to.
- the user can select a recipe node as the attention node, and display a relation graph with the selected recipe node as the attention node.
- in this case, a recipe represented by a node close to the node of the recipe that is the node of interest shares many vegetation (ingredients) with the recipe represented by the node of interest.
- the relationship graph can be generated from a vegetation / field DB or a vegetation / recipe DB (bipartite graph) as described above, or from a DB in which vegetation is associated with other arbitrary information.
- a relation graph can also be generated from a DB in which (biological) species other than vegetation are associated with information other than the species.
- the relationship graph can be generated from a single DB such as a vegetation / field DB and a vegetation / recipe DB, or from a plurality of DBs such as a first DB and a second DB.
- FIG. 35 is a diagram illustrating an example of a relation graph generated from the two DBs described above, the vegetation / field DB and the vegetation / recipe DB.
- the relationship graph of FIG. 35 represents the relationship between vegetation # 1 represented by the node of interest and other vegetation # 2 to # 6, with the node of vegetation # 1 as the node of interest.
- the relationship score between vegetation # 1, which is the node of interest, and each of the other vegetation # 2 to # 6 can be obtained using as a scale the field #k associated with vegetation #i in the vegetation / field DB, the recipe #k associated with vegetation #i in the vegetation / recipe DB, or both the field #k and the recipe #k associated with vegetation #i.
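- One way to combine the two scales is a weighted sum of the per-DB scores; the weights and the linear rule below are assumptions, since the text only states that both DBs can serve as the scale.

```python
def combined_score(i, j, fields_of, recipes_of, w_field=0.5, w_recipe=0.5):
    """Weighted combination of the field-based and recipe-based relation
    scores; fields_of / recipes_of map each vegetation to its set of
    associated fields / recipes (as in the earlier sketches)."""
    return (w_field * len(fields_of[i] & fields_of[j])
            + w_recipe * len(recipes_of[i] & recipes_of[j]))

# e.g. combined_score("vegetation#1", "vegetation#2",
#                     vegetation_fields, vegetation_recipes)
```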
- the relation graph represents the relationship between vegetation #i and other vegetation #j; that is, based on the relationship scores, all other vegetation #j can be ranked with respect to vegetation #i.
- the other vegetation #j closer to the vegetation #i represented by the node of interest is the vegetation with the higher ranking.
- in the vegetation / field DB, for example, a value corresponding to the number of fields where both vegetation #i and #j were observed is adopted as the relationship score of vegetation #i and #j. As other relationship scores of vegetation #i and #j, any value that uses the fields associated with the vegetation as a scale can be adopted, such as the number of times both vegetation #i and #j were observed in the same field, or the proximity between a field where vegetation #i was observed and a field where vegetation #j was observed.
- the display of the relationship graph can be controlled based on various factors.
- for example, in a relation graph representing the relationship between vegetation, the higher the ranking of a vegetation, the thicker the link between that vegetation's node and the node of interest can be displayed, or the color of the link can be changed.
- further, for example, a recipe node whose recipe uses a larger amount of the vegetation #i represented by the node of interest can be displayed at a larger size, or arranged at a position closer to the node of interest.
- it is also possible to perform an animated display in which the node of vegetation #j having a strong relationship (a large relationship score) with vegetation #i represented by the node of interest approaches the node of interest, while the node of vegetation #j' having a weak relationship (a small relationship score) with vegetation #i moves away from the node of interest.
- in a relation graph representing the relationship between vegetation, all the vegetation nodes registered in the vegetation / field DB or the vegetation / recipe DB can be displayed, or only the node of vegetation #i that is the node of interest and the nodes of vegetation #j whose relationship score with vegetation #i is greater than 0, that is, vegetation #j for which a field #k observed together with vegetation #i exists, can be displayed.
- FIG. 36 is a diagram illustrating an example of a DB used to generate a relation graph by the graph display control unit 72 (or the acquisition unit 51).
- the relation graph can be generated using the cooperative farming method (registered trademark) DB (the various DBs constituting it) registered in the storage 63 of the server 13.
- the relationship graph can be generated using various DBs on the Internet 24 such as a DB in which a species and other information are associated with each other.
- for example, the relation graph can be generated using various DBs on the Internet 24, such as a book DB that is a DB of books, a Web DB provided on web pages, an academic DB in which academic information is registered, and a nutrition DB in which nutritional information is registered.
- the user of the terminal 12 can select the DB used to generate the relationship graph by operating the operation unit 34.
- that is, the DB used to generate the relation graph can be selected from the plurality of DBs shown in FIG. 36. However, when the DB used to generate the relation graph can be selected from a plurality of DBs, it may be difficult for the user to understand from which DB the currently displayed relation graph was generated.
- one or more of the background color of the relation graph, the node shape, the node color, and the like can be changed according to the selection (switching) of the DB used to generate the relation graph.
- the terminal 12 can output different sounds from the speaker 36 in accordance with the selection of the DB used for generating the relationship graph.
- the user can recognize which DB is used to generate the relationship graph.
- FIG. 37 is a diagram illustrating a configuration example of a graph display screen on which a relation graph is displayed.
- the graph display screen can be composed of a graph display area 201 and a list display area 202.
- the left end of the graph display screen is the list display area 202, and the rest is the graph display area 201.
- an overhead view display area 211 can be provided in the graph display area 201.
- an overhead view display area 211 is provided at the lower right of the graph display area 201.
- in the overhead view display area 211, the entire relation graph is displayed, including the nodes of all the information i1 registered in the DB (in which the information i1 and i2 are associated) that was used to generate the relation graph.
- the entire relation graph, in which nodes exist for all the information i1 registered in the DB, may have an enormous number of nodes; if such an entire relation graph were displayed in the graph display area 201, the relation graph might be difficult to see.
- a part of the entire relationship graph can be displayed in a large size.
- a display frame 212 representing a part displayed in the graph display area 201 out of the entire relation graph displayed in the bird's-eye view display area 211 can be displayed.
- the display frame 212 allows the user to easily recognize which part of the entire relation graph is displayed in the graph display area 201.
- in the list display area 202, a ranking list is displayed. As described above, based on the relationship score between vegetation #i and other vegetation #j, all other vegetation #j can be ranked with respect to vegetation #i. The ranking list is a list of the names of vegetation #j ranked in this way.
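- The ranking list can be derived directly from the relation scores; a minimal sketch using the set-based score defined in the earlier sketches:

```python
def ranking_list(i, bipartite):
    """Rank all other vegetation #j against vegetation #i by relation score,
    in descending order, as shown in the list display area 202."""
    others = [j for j in bipartite if j != i]
    return sorted(others,
                  key=lambda j: len(bipartite[i] & bipartite[j]),
                  reverse=True)

# e.g. ranking_list("vegetation#1", vegetation_fields)
```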
- the display of the overhead view display area 211 and the display of the ranking list can be turned on / off according to the operation of the operation unit 34 by the user.
- the graph display screen as described above can be generated using Gephi, which is an open source software package, for example.
- FIG. 38 is a diagram showing an example of a graph display screen by Gephi.
- the relation graph generated using the vegetation / farm field DB is displayed in the graph display area 201 with the vegetation “Diospyros” node as an attention node.
- a ranking list of other vegetation ranked with respect to the vegetation “Diospyros” represented by the node of interest is displayed in the list display area 202.
- a field where the vegetation "Diospyros" represented by the node of interest is observed can be displayed.
- FIG. 39 is a diagram illustrating an example of a graph display screen by Gephi when the field “20120329ise” in which the vegetation “Diospyros” represented by the node of interest is observed in the relationship graph of FIG. 38 is selected as a new node of interest. .
- FIG. 40 is a diagram showing another example of the graph display screen by Gephi.
- a search box is provided at the top of Gephi.
- when the user clicks (or taps) the search box, a list of the vegetation and fields represented by the nodes is displayed in pull-down form. By selecting a vegetation or a field from the list, the user can set the node of the selected vegetation or field as the node of interest.
- FIG. 41 is a diagram showing still another example of the graph display screen by Gephi.
- a display frame 212 is displayed so as to indicate a portion of the relationship graph displayed in the graph display region 201 out of the entire relationship graph displayed in the overhead view display region 211.
- a user profile related to the user of the terminal 12 can be registered in, for example, the storage 33 of the terminal 12, and the relationship graph can be changed based on the user profile.
- for example, the relation graph (its display) can be changed based on the years of experience registered in the user profile.
- for example, for an experienced user, a relation graph having nodes for all the vegetation registered in the DB can be displayed, while for a beginner-level user with few (or zero) years of experience, a relation graph limited to the nodes of, for example, the top three vegetation in the ranking (the three vegetation with the highest relationship scores) among the vegetation registered in the DB can be displayed. This prevents a beginner-level user from being confused by the display of many vegetation (nodes).
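- A minimal sketch of this profile-dependent filtering, assuming a hypothetical years-of-experience field in the user profile: beginners see only the top-ranked nodes, experienced users see all of them.

```python
def visible_nodes(node_of_interest, bipartite, years_of_experience, top_n=3):
    """For beginner-level users (few or zero years of experience), limit the
    displayed nodes to the top-N ranked vegetation; show all nodes otherwise."""
    ranked = sorted((j for j in bipartite if j != node_of_interest),
                    key=lambda j: len(bipartite[node_of_interest] & bipartite[j]),
                    reverse=True)
    if years_of_experience < 1:  # beginner threshold (an assumption)
        ranked = ranked[:top_n]
    return [node_of_interest] + ranked

# e.g. visible_nodes("vegetation#1", vegetation_fields, years_of_experience=0)
```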
- the DB used to generate the relationship graph can be selected based on the user profile.
- for example, for a user whose user profile indicates a housewife, the vegetation / recipe DB can be selected as the DB used to generate the relation graph.
- for other users, for example, the vegetation / field DB can be selected as the DB used to generate the relation graph.
- in addition to the user profile, the relation graph can be changed based on a content profile.
- for example, for a scholar-level user with many years of experience, only vegetation with low recognition or high rarity among the vegetation registered in the DB can be selected, and a relation graph limited to the nodes of the selected vegetation can be displayed. This is because scholar-level users are not very interested in widely recognized vegetation or vegetation of low rarity.
- the relationship between species, that is, the relationship score, can also be corrected based on the user profile and the content profile.
- for example, among the vegetation registered in the DB, the relationship scores of vegetation with low recognition or high rarity can be corrected upward.
- in this case, vegetation with low recognition or high rarity is ranked higher in the ranking list and displayed at a position closer to the node of interest in the relation graph.
- FIG. 42 is a flowchart illustrating an example of processing for displaying a graph display screen.
- step S201 the user of the terminal 12 operates the operation unit 34 to select the attention DB, which is the DB used to generate the relationship graph, from the plurality of DBs.
- step S202 the communication unit 40 of the terminal 12 transmits information on the attention DB (information representing the attention DB) to the server 13.
- the communication unit 67 of the server 13 receives information on the attention DB from the terminal 12, and in step S211, the graph display control unit 72 accesses the attention DB and generates a bipartite graph from the attention DB.
- step S212 the graph display control unit 72 generates a graph display screen displaying a relation graph from the bipartite graph of the target DB.
- step S213 the communication unit 67 transmits the graph display screen generated by the graph display control unit 72 to the terminal 12.
- step S203 the communication unit 40 of the terminal 12 receives the graph display screen from the server 13, and the acquisition unit 51 acquires the graph display screen.
- step S204 the display control unit 52 of the terminal 12 displays the graph display screen acquired by the acquisition unit 51 on the display unit 35.
- in FIG. 42, the server 13 generates the graph display screen from the bipartite graph of the attention DB and transmits it to the terminal 12.
- alternatively, the server 13 can transmit the bipartite graph of the attention DB itself to the terminal 12.
- in that case, in the terminal 12, the acquisition unit 51 can acquire the graph display screen by generating it from the bipartite graph received from the server 13.
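- the following is a minimal sketch of what steps S211 and S212 might compute; purely for illustration, the relation score between two species is taken to be the number of other-information nodes (here, fields) that they share in the bipartite graph, although the actual score computation may differ.

    from collections import defaultdict

    def build_bipartite(db_rows):
        """db_rows: (species, other_info) pairs, e.g. ('mugwort', 'field A')."""
        links = defaultdict(set)
        for species, other in db_rows:
            links[species].add(other)
        return links

    def relation_graph(links):
        """Relation score of two species = number of shared other-info nodes."""
        species = list(links)
        scores = {}
        for i, a in enumerate(species):
            for b in species[i + 1:]:
                shared = len(links[a] & links[b])
                if shared:
                    scores[(a, b)] = shared
        return scores

    rows = [("mugwort", "field A"), ("leek", "field A"), ("mugwort", "field B"),
            ("chinese cabbage", "field B"), ("leek", "field B")]
    print(relation_graph(build_bipartite(rows)))
    # {('mugwort', 'leek'): 2, ('mugwort', 'chinese cabbage'): 1,
    #  ('leek', 'chinese cabbage'): 1}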
- as described above, a relation graph composed of nodes and links, which represents the relationship between species with the other information as a scale, is obtained from a DB in which species are associated with information other than species.
- the user can thus obtain the relationships between species as knowledge, and use that knowledge for ecosystem utilization such as the cooperative farming method (registered trademark).
- FIG. 43 is a diagram for explaining the outline of support for cooperative farming method (registered trademark) using AR display / VR display of AR objects.
- AR display / VR display of AR objects can be performed on the terminal 12, and this AR display / VR display can be used to support the cooperative farming method (registered trademark).
- that is, the server 13 collects various subsets of ecosystem information from the sensor device 11 and the terminal 12, organizes the collected subsets with various scales, associates them with a multipartite graph or the like, and provides them to the user of the terminal 12 by AR display or VR display.
- an AR object is displayed superimposed on a background space (image), which is an actual real space, a photographed real space (image) obtained by photographing the real space, or a VR space (image) obtained by modeling the real space.
- here, the AR display means that the real space of the user's current location, its photographed real space, or its VR space is used as the background space, and AR objects are superimposed and displayed in that background space.
- the VR display means that an arbitrary place (and, as necessary, date and time) is designated, and the photographed real space or VR space of the designated place (and date and time) is used as the background space, with AR objects superimposed and displayed in that background space.
- the AR display can be performed using a real space that the user directly sees with the see-through HMD as the background space.
- the AR display can also use, as the background space (image), a photographed real space (image) obtained by photographing the real space on the spot, or a VR space (image) obtained by modeling that real space.
- the VR display can be performed by, for example, designating an arbitrary position (a farm field, a region, or the like) on the terminal 12 as a designated position, and acquiring from the server 13 the photographed real space or VR space of the designated position as the background space (image thereof).
- for the VR display, the date and time can be designated as necessary, in addition to the location of the photographed real space or VR space used as the background space.
- this makes it possible, for example, to display an AR object superimposed on a past image of a field.
- the display mode for displaying an AR object in AR display is also referred to as AR mode
- the display mode for displaying an AR object in VR display is also referred to as VR mode.
- the display mode is set by, for example, a user operation or the specification of the terminal 12.
- in some cases, the display mode is set to the AR mode in accordance with the specifications of the terminal 12.
- in other cases, the display mode can be set to the VR mode or the AR mode in accordance with a user operation or the like.
- the AR object and the background space are provided separately from the server 13 to the terminal 12, and the AR object can be displayed superimposed on the background space on the terminal 12.
- alternatively, the server 13 can superimpose the AR object on the background space in advance and provide the result to the terminal 12, and the terminal 12 can display the background space in which the AR object has been superimposed in advance.
- users who use the terminal 12 can be roughly divided into two types of users: teacher users and student users.
- the student user is a user who receives advice on cooperative farming method (registered trademark) by working on a farm field and viewing AR display / VR display of an AR object, for example.
- the teacher user is a user who recognizes the situation of the field by viewing the AR display / VR display of AR objects at a remote location away from the field, and provides advice on the cooperative farming method (registered trademark) by, for example, editing the AR objects.
- in the ecosystem utilization system, the teacher user can easily support the student user's practice of the cooperative farming method (registered trademark), and the student user can easily obtain support for the cooperative farming method (registered trademark) from the teacher user.
- the student user may become a teacher user
- the teacher user may become a student user
- in the terminal 12, the acquisition unit 51 acquires from the server 13 the ecosystem objects, task objects, background space, and other information necessary for display, and the display control unit 52 performs display using the information acquired by the acquisition unit 51.
- the terminal 12 can download a predetermined AR object or background space from the server 13 in advance.
- the terminal 12 can perform the AR display / VR display in a stand-alone manner without communicating with the server 13 within the range of the AR object or background space previously downloaded from the server 13.
- FIG. 44 is a diagram showing a display example of the AR display on the terminal 12.
- an AR object is superimposed and displayed in the background space with the real space or the photographed real space of a certain field as the background space.
- AR objects used in the support of the cooperative farming method (registered trademark) by AR display / VR display are divided into ecosystem objects and task objects according to the object represented by the AR object.
- an ecosystem object represents an ecosystem component constituting the ecosystem of a field in which the cooperative farming method (registered trademark) is practiced, that is, a field in which a plurality of types of vegetation are mixed, and is given to that ecosystem component.
- examples of the ecosystem components constituting the field ecosystem include various elements that make up the ecosystem of the field, such as vegetation, the climate of the field, insects and other organisms, and sensor data such as temperature sensed by the sensor device 11 disposed in the field. For example, a stone in a field, the reach of shade at the summer solstice, and places with good or poor drainage also correspond to ecosystem components.
- a task object represents a task to be performed on an ecosystem component.
- Examples of tasks performed on ecosystem components include harvesting vegetation as an ecosystem component, planting vegetation, and cutting (weeding).
- as described above, AR objects are divided into ecosystem objects and task objects according to the object they represent, and can also be divided, according to the category of data, into, for example, photographs (data), icons (data), and sensor data (symbols representing it).
- as the ecosystem object, for example, a photograph, an icon, or sensor data is adopted, and as the task object, for example, an icon is adopted.
- in FIG. 44, an actual farm field or its photographed real space is used as the background space, and photographs as ecosystem objects are displayed in the background space (so as to appear to exist there).
- a printed matter on which an image as a marker is printed is installed in the field.
- in the terminal 12, the marker is detected from an image obtained by photographing the field with the camera 37, and the ecosystem object of a photograph is displayed at a relative position with the marker as a reference.
- the relative position with reference to the marker where the ecosystem object of the photograph is displayed is, for example, the position (near the position) where the vegetation or the like shown in the photograph is (observed).
- the user can easily recognize that the vegetation reflected in the photograph is at the position (near) of the ecosystem object by looking at the ecosystem object of the photograph.
- the user can easily recognize the arrangement of a plurality of vegetation reflected in each of the photographs as the plurality of ecosystem objects based on the positional relationship between the plurality of ecosystem objects.
- a marker is associated with an ecosystem object, and the ecosystem object is displayed at a relative position based on the marker associated with the ecosystem object.
- One marker can be associated with one ecosystem object, or can be associated with a plurality of ecosystem objects. In addition, a plurality of markers can be associated with one ecosystem object.
- when a plurality of markers are associated with one ecosystem object, the position where the ecosystem object is displayed can be determined based on each of the plurality of markers.
- the ecosystem object can be accurately displayed at the position of the ecosystem component to which the ecosystem object is assigned, that is, the ecosystem component represented by the ecosystem object.
- the markers installed in the field may be broken or tilted over time, and some may be hidden by growing vegetation.
- the marker is not necessarily photographed from the front so that the whole is clearly visible, and may be photographed from an oblique direction.
- in such cases, the ecosystem object to be displayed may not be displayed, or may be displayed at a position deviated from the position where it should originally be displayed.
- by associating a plurality of markers with one ecosystem object, such situations, in which the ecosystem object to be displayed is not displayed or is displayed at a shifted position, can be prevented.
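- one plausible way to realize this robustness is sketched below: a candidate display position is computed from every marker detected in the current frame, and the candidates are averaged; the averaging rule is an assumption for illustration.

    def display_position(detected, offsets):
        """detected: {marker_id: (x, y)} marker positions found in the current frame.
        offsets:  {marker_id: (dx, dy)} per-marker relative offset of the object.
        Any single detected marker yields a candidate position, so the object
        can still be placed when other markers are hidden, broken, or tilted;
        averaging the candidates also smooths detection noise."""
        candidates = []
        for marker_id, (x, y) in detected.items():
            if marker_id in offsets:
                dx, dy = offsets[marker_id]
                candidates.append((x + dx, y + dy))
        if not candidates:
            return None  # no associated marker visible: object cannot be placed
        n = len(candidates)
        return (sum(p[0] for p in candidates) / n, sum(p[1] for p in candidates) / n)

    # two markers associated with one ecosystem object, one of them occluded:
    print(display_position({"M1": (0.0, 0.0)}, {"M1": (2.0, 1.0), "M2": (-3.0, 1.0)}))
    # (2.0, 1.0)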
- FIG. 45 is a diagram illustrating a display example of the AR display on the terminal 12.
- a real space of an actual office room or a photographed real space is used as a background space, and a photograph as an ecosystem object is displayed in the background space.
- a printed matter as a marker is installed in one room of the office.
- in the terminal 12, the marker is detected from an image obtained by photographing the office with the camera 37, and the ecosystem object of a photograph is displayed at a relative position with the marker as a reference.
- that is, also in this case, a marker is associated with an ecosystem object, and the ecosystem object is displayed at a relative position with reference to the marker associated with it.
- thereby, the user can easily recognize, for example, the vegetation planted in the field and the arrangement of that vegetation in the field, at an office remote from the field.
- for example, the teacher user can recognize, at the office, the vegetation planted in the field and the arrangement of the plurality of vegetation in the field, and can give the student user of the field advice on, for example, vegetation design (strategy).
- a plurality of users can use the respective terminals 12 to simultaneously view a field ecosystem object (an ecosystem object representing a field ecosystem component).
- thereby, a plurality of users can discuss, for example, whether certain vegetation should be present and how vegetation should be arranged, while recognizing the actual vegetation and its arrangement without leaving the office.
- in FIG. 45, a plurality of markers are installed in the office.
- Each of the plurality of markers installed in the office is a marker having the same content as, for example, a marker installed in a different field. That is, in FIG. 45, for example, the marker M is a marker having the same content as a marker installed in a certain field F, and the marker M ′ is a marker having the same content as a marker installed in another field F ′. It has become.
- thereby, the user can easily recognize, for example, the vegetation of the fields F and F′ and its arrangement at an office remote from the fields F and F′, and can compare the vegetation and vegetation arrangement between the fields F and F′.
- here, the ecosystem object of the field F can be associated with the marker M, and can further be associated with an office marker, which is a marker different from the marker M.
- in this case, the ecosystem object of the field F can be displayed not only by the marker M but also by installing the office marker.
- similarly, the ecosystem object of the field F′ can be associated with the marker M′, and can further be associated with the above-described office marker.
- when the office marker is associated with both the ecosystem object of the field F and the ecosystem object of the field F′, installing the office marker in the office allows both the ecosystem object of the field F and the ecosystem object of the field F′ to be displayed.
- the size of the ecosystem object can be changed and displayed according to the size of the marker.
- the ecosystem object can be displayed in a size proportional to the size of the marker, for example.
- in FIG. 45, the ecosystem object is arranged (displayed) on the front side of the marker, but the arrangement position of the ecosystem object is not limited to the front side of the marker; that is, the ecosystem object can be arranged at an arbitrary position, such as behind the marker (on its back side).
- for example, by placing a marker at a position where it can be photographed from an observation platform and photographing the marker from the observation platform, all the ecosystem objects given to the field can be displayed.
- FIG. 46 is a diagram showing a display example of VR display on the terminal 12.
- in FIG. 46, a VR space or photographed real space of a certain field is used as the background space, and ecosystem objects representing vegetation are superimposed and displayed in the background space.
- also in the VR display, a marker is placed in the background space, and the ecosystem objects associated with the marker can be displayed at relative positions with the marker as a reference.
- thereby, the user can view, on the terminal 12, at an arbitrary place and at an arbitrary timing, ecosystem objects representing ecosystem components such as vegetation, using a VR space modeling the field as the background space.
- as a result, the user can easily recognize, for example, the vegetation of the field and its arrangement at an arbitrary place and timing, and can examine the vegetation design of the field and advice to the student user.
- FIG. 47 is a diagram showing a display example of the AR display on the terminal 12.
- an AR object is superimposed and displayed in the background space with a real space or a photographed real space of a certain field as a background space.
- in FIG. 47, as in FIG. 44, in the terminal 12, an actual farm field or its photographed real space is used as the background space, and photographs as ecosystem objects are displayed in the background space according to the markers (so as to appear to exist there).
- furthermore, in FIG. 47, task objects are displayed in the background space of the field (so as to appear to exist there).
- as task objects, an icon depicting a fruit and an arrow-shaped icon are displayed.
- the icon depicting a fruit as a task object represents the task of harvesting the vegetation represented by the ecosystem object adjacent to that task object.
- by looking at the icon depicting a fruit as a task object in the actual field, the user of the field can recognize that the vegetation represented by the ecosystem object adjacent to that task object should be harvested.
- like the icon depicting a fruit, the arrow-shaped icon as a task object represents the task of harvesting vegetation, and additionally represents the order of harvesting.
- by looking at the arrow-shaped icons as task objects in the actual field, the user of the field can recognize the order in which the vegetation should be harvested.
- FIG. 48 is a diagram illustrating a display example of VR display on the terminal 12.
- an ecosystem object representing sensor data is displayed in the background space according to the marker.
- environmental information of the field, such as the nitric acid concentration, electrical conductivity, and soil hardness of the soil at each location, is sensed by the sensor device 11, and in FIG. 48, ecosystem objects representing the sensor data obtained by this sensing are displayed, in accordance with the marker, at positions in the background space corresponding to the actual positions where the sensor data were obtained.
- a hemispherical symbol is adopted as an ecosystem object of the sensor data.
- the hemispherical symbol can be displayed with a size or color intensity according to the value of the sensor data, such as the nitric acid concentration, electrical conductivity, or soil hardness. Moreover, the hemispherical symbol can be displayed in a color according to the type of sensor data, such as nitric acid concentration, electrical conductivity, or soil hardness.
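- a minimal sketch of such a value-to-appearance mapping follows; the hue table and the linear scaling are illustrative assumptions.

    # Map one sensor reading to the hemispherical symbol's appearance.
    SENSOR_HUE = {"nitric_acid": "red", "conductivity": "blue", "soil_hardness": "brown"}

    def symbol_style(sensor_type, value, v_min, v_max, r_min=5.0, r_max=50.0):
        """Radius and color intensity grow linearly with the normalized value;
        the hue encodes the type of sensor data."""
        t = (value - v_min) / (v_max - v_min) if v_max > v_min else 0.0
        t = max(0.0, min(1.0, t))
        return {"radius": r_min + t * (r_max - r_min),
                "color": SENSOR_HUE.get(sensor_type, "gray"),
                "intensity": t}

    print(symbol_style("nitric_acid", 75, 0, 100))
    # {'radius': 38.75, 'color': 'red', 'intensity': 0.75}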
- the teacher user can recognize the environment of the field by looking at the ecosystem objects of the sensor data, with the field as the background space, at a remote place away from the field, and can advise the student users of the field on, for example, a vegetation design suited to the environment or changes to it.
- in the above case, the ecosystem object associated with a marker is displayed at a relative position with respect to the marker, but it is also possible to associate an ecosystem object with position information representing an absolute position, such as GPS coordinates, and to display the ecosystem object at the absolute position represented by that position information.
- FIG. 49 is a diagram showing a display example of the display of the time lapse mode of the ecosystem object on the terminal 12.
- in the time lapse mode, ecosystem objects of photographs taken at one place (range), such as a certain field, are displayed on the terminal 12 along one axis, for example, a time axis.
- a log scale time axis is provided in which the direction from the bottom to the top indicates the passage of time.
- a field at a predetermined position is set as a field of interest, and a VR space (image thereof) of the field of interest is displayed at a position where the time axis is substantially divided into two equal parts.
- ecosystem objects of photographs taken in the field of interest are displayed in a spiral arrangement.
- as the field of interest, for example, the field at the current location of the terminal 12 (of the user), or a field at a position designated by the user operating the terminal 12, can be employed.
- alternatively, the field in which a photograph as an ecosystem object designated by the user operating the terminal 12 was taken can be employed as the field of interest.
- in the time lapse mode display, with a certain date as the attention date, an ecosystem object of a photograph taken in the field of interest on the attention date is displayed at the position where the VR space of the field of interest is displayed.
- below that position, ecosystem objects of photographs taken in the field of interest before the attention date are displayed.
- above that position, ecosystem objects of photographs taken in the field of interest after the attention date are displayed.
- the attention date can be changed, for example, by dragging (swiping) the display screen of the time lapse mode display in FIG. 49 in the vertical direction.
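- the following is a minimal sketch of a spiral arrangement along a log-scale time axis as described above; the exact geometry (turns per decade, radii, and axis scaling) is an illustrative assumption.

    import math

    def timelapse_layout(photo_days, turns_per_decade=1.0, radius=80.0):
        """photo_days: signed day offsets of each photo from the attention date
        (negative = past, positive = future). Returns one (x, y) screen offset
        per photo: y follows a log-scale time axis (up = future, down = past),
        while x sweeps around a spiral as the distance in time grows."""
        positions = []
        for d in photo_days:
            if d == 0:
                positions.append((0.0, 0.0))  # shown at the field-of-interest position
                continue
            mag = math.log10(1 + abs(d))          # log-scale distance along the axis
            y = 100.0 * mag * (1 if d > 0 else -1)
            angle = 2 * math.pi * turns_per_decade * mag
            x = radius * math.cos(angle)          # a 3-D spiral would also use sin()
            positions.append((x, y))
        return positions

    print(timelapse_layout([-30, 0, 365])[1])  # (0.0, 0.0): photo on the attention date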
- the user can operate the terminal 12 to select any one of the ecosystem objects of the photograph displayed in the time lapse mode as the selection object.
- the ecosystem object can be linked with related information related to the ecosystem component represented by the ecosystem object.
- as the related information, information stored in a DB such as the cooperative farming method (registered trademark) DB accessible by the cooperative farming method (registered trademark) CMS 71, or information constituting a node of a relation graph, can be adopted.
- the terminal 12 can display information on other fields in which the ecosystem component represented by the selected object exists.
- for example, when the selected object represents certain vegetation, a same-vegetation field information display can be performed, which displays information on other fields, close to the field of interest, in which the same vegetation as that vegetation exists.
- FIG. 50 is a diagram showing a display example of the same vegetation field information display performed on the terminal 12.
- the VR space of the field of interest is used as a background space, and a photo ecosystem object as a selected object is superimposed and displayed in the background space.
- that is, a VR space modeling the actual field is used as the background space, and photographs as ecosystem objects are displayed according to the markers (so as to appear to exist there).
- furthermore, the terminal 12 displays, superimposed on the background space, the same-vegetation field information, which is information on other fields in which the same vegetation as the vegetation represented by the selected object of the field of interest exists.
- the same-vegetation field information in FIG. 50 includes the character strings “Ise” and “Oiso”, indicating that there are Ise and Oiso fields as other fields, close to the field of interest, in which the same vegetation as the vegetation represented by the selected object exists. Further, the same-vegetation field information in FIG. 50 includes an arrow indicating that the Ise field is located to the left rear of the background space of the same-vegetation field information display, and an arrow indicating that the Oiso field is located to the right front of that background space.
- by looking at the same-vegetation field information in FIG. 50, the user can easily recognize other fields in which the same vegetation as the vegetation represented by the selected object exists.
- the user can operate the terminal 12 to select another field displayed in the same vegetation field information display in FIG. 50 as the selected field.
- when a selected field is selected, the display of the display unit 35 transitions from the same-vegetation field information display of the field of interest in FIG. 50 to the display of the selected field via a predetermined transition display.
- a user of a certain field A can easily confirm the state of another field B in which the same vegetation as the vegetation existing in the field A exists.
- for example, the teacher user can compare the states of a certain field A and another field B in which the same vegetation exists, and can make use of the comparison for vegetation design and the like.
- FIG. 51 is a diagram showing a display example of the transition display performed on the terminal 12.
- in the transition display, a screen that gradually zooms out from the display of the field of interest in FIG. 50 is shown, followed by a bird's-eye view screen overlooking the country in which the field of interest is located and the entire earth.
- thereafter, the screen gradually zooms in from the bird's-eye view screen toward the selected field, and finally the selected field is displayed, that is, the VR space of the selected field is displayed as the background space.
- FIG. 51 shows an example of an overhead screen that is one scene of the transition display as described above.
- on the bird's-eye view screen of FIG. 51, the whole of Japan (and a part of the Eurasian continent) is displayed. Furthermore, environmental information such as temperature and precipitation is displayed, as bar graphs and colors, at the position of each region shown on the bird's-eye view screen.
- the user can visually grasp, for example, the environment of the target field and the selected field and the outline of the environment of the region from the target field to the selected field by looking at the transition display.
- for the transition display, for example, Google Earth (registered trademark) provided by Google Inc. can be used.
- FIG. 52 is a diagram showing a display example of the selected field that is finally reached via the transition display of FIG. 51.
- the VR space of the selected field is used as a background space, and ecosystem objects E1, E2, E3, E4, and E5 are displayed in the background space according to the markers.
- the ecosystem object E1 is, for example, an ecosystem object of a photograph in which vegetation as an ecosystem component is photographed, and is displayed at the position of the vegetation.
- the ecosystem objects E2 and E3 are, for example, icon ecosystem objects representing the quality of the soil as an ecosystem component (for example, soil in which vegetation grows easily or with difficulty), and are displayed at the positions of the corresponding soil.
- the ecosystem object E4 is, for example, an icon ecosystem object representing a beehive as an ecosystem component, and is displayed at the position where the beehive is located.
- the ecosystem object E5 is, for example, an icon ecosystem object representing, as an ecosystem component, a dangerous area of the field or an area requiring caution, and is displayed at the position (range) of the danger area or the warning area.
- a circular area centered on the beehive represented by the ecosystem object E4 is a danger area and a warning area represented by the ecosystem object E5.
- specifically, the donut-shaped area at the outer periphery of the circular area centered on the beehive represented by the ecosystem object E4 is the warning area, and the circular area at its center is the danger area.
- in FIG. 52, VR display is performed, but the ecosystem objects E1 to E5 in FIG. 52 can also be shown by AR display in the selected field itself.
- in this case, a user actually working in the selected field can, for example, examine where vegetation should be planted by looking at the ecosystem objects E2 and E3 representing the quality of the soil.
- the user can work safely while avoiding bees, for example, by looking at the ecosystem object E4 representing the beehive and the ecosystem object E5 representing the danger area and the warning area.
- the terminal 12 can notify the user that the terminal 12 has entered the danger area or the warning area by sound or vibration.
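- a minimal sketch of such a zone check follows; the radii and the suggested alert actions are illustrative assumptions.

    import math

    def zone_alert(user_pos, hive_pos, danger_radius=5.0, warning_radius=10.0):
        """Concentric zones around a beehive: the inner disc is the danger area
        and the outer ring (the donut shape of FIG. 52) is the warning area."""
        d = math.dist(user_pos, hive_pos)
        if d <= danger_radius:
            return "danger"   # e.g. trigger strong vibration and a warning sound
        if d <= warning_radius:
            return "warning"  # e.g. trigger a short notification sound
        return None

    print(zone_alert((3.0, 0.0), (0.0, 0.0)))  # 'danger'
    print(zone_alert((8.0, 0.0), (0.0, 0.0)))  # 'warning'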
- FIG. 53 is a diagram illustrating a display example of related information on the terminal 12.
- an ecosystem object is linked to related information related to an ecosystem component represented by the ecosystem object in the information of a CMS database such as a cooperative farming method (registered trademark) DB. I can leave.
- a CMS database such as a cooperative farming method (registered trademark) DB.
- the terminal 12 can display related information linked to the selected object.
- FIG. 53 shows a display example of related information displayed on the terminal 12 as described above.
- a list of related information is displayed in the form of a web page.
- FIG. 54 is a diagram showing a display example of the sensor data analysis result on the terminal 12.
- the terminal 12 can request the server 13 to perform various analyzes of sensor data, for example, in accordance with user operations.
- the analysis unit 76 analyzes the sensor data in response to a request from the terminal 12, and provides (transmits) the analysis result to the terminal 12.
- the terminal 12 displays the analysis result of the sensor data provided from the server 13.
- the analysis unit 76 can perform analysis or the like that divides one year into, for example, three seasons, four seasons, five seasons, and the like according to environmental information such as temperature.
- FIG. 54 shows a display example of the analysis result of such an analysis.
- FIG. 55 is a diagram for explaining a copy of an ecosystem object.
- the editing unit 75 can edit the AR object registered in the storage 63 in accordance with the operation of the terminal 12 by the user.
- examples of editing AR objects include copying of ecosystem objects.
- in copying an ecosystem object, for example, an ecosystem object of a certain field A is copied to another field B.
- when copying an ecosystem object, the terminal 12 performs VR display using a plurality of VR spaces, for example, the two VR spaces of the two fields A and B, as background spaces.
- here, no ecosystem object has yet been given to the field B, and therefore no ecosystem object is arranged in the background space of the field B.
- the copying of an ecosystem object is performed in accordance with an operation by the user of the terminal 12 in a state where two or more fields, such as the two fields A and B, are displayed in VR on the terminal 12.
- the user of the terminal 12 performs a copy operation for designating an ecosystem object to be copied and designating a copy destination position of the ecosystem object.
- for example, by dragging the ecosystem object representing the mugwort of the field A to a position in the background space of the field B, the user of the terminal 12 can designate the ecosystem object to be copied and the copy destination position of that ecosystem object.
- the terminal 12 recognizes the dragged ecosystem object representing the mugwort of the field A as the ecosystem object to be copied, and recognizes the position in the background space of the field B where the drag ended as the copy destination position of that ecosystem object.
- the ecosystem object representing mugwort among the ecosystem objects of the farm A can be copied to the position where the mugwort of the farm B is observed.
- the user of the farm B can easily give an ecosystem object representing the mugwort to the mugwort of the farm B.
- further, for example, when a teacher user instructs a student user working in the field B to plant mugwort as vegetation design advice, the teacher user can copy the ecosystem object representing mugwort among the ecosystem objects of the field A to the position in the field B where the mugwort should be planted.
- thereby, the teacher user can easily give the student user advice to plant mugwort.
- when an ecosystem object is copied, the server 13 can add (register) a task object representing a task to be performed on the ecosystem component represented by the copied ecosystem object.
- that is, in the server 13, the editing unit 75 specifies the task to be performed on the mugwort represented by the ecosystem object copied to the field B (hereinafter also referred to as a copy object).
- then, the editing unit 75 causes the generation unit 73 to generate a task object representing the task to be performed on the mugwort, and registers it in the storage 63.
- the AR / VR display control unit 74 causes the terminal 12 to display the task object registered in the storage 63.
- in the terminal 12, the task object representing the task to be performed on the mugwort is displayed near the ecosystem object as the copy object representing the mugwort of the field B.
- a task object representing planting of mugwort is displayed as a task to be performed on mugwort.
- the ecosystem object as the copy object representing mugwort and the task object representing planting, both given to the field B, can be AR-displayed on the terminal 12 of the user of the field B when that user actually works in the field B.
- by looking at the ecosystem object as the copy object representing mugwort and the task object representing planting, the user of the field B can easily recognize that mugwort should be planted at the position where the copy object representing mugwort is displayed.
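- the following condenses this copy-and-task flow into a minimal sketch; the task table, the dictionary object shape, and the in-memory storage list are illustrative stand-ins for the editing unit 75, the generation unit 73, and the storage 63.

    # The dicts stand in for AR objects; TASK_FOR_COPIED maps a species to the
    # task implied by copying its ecosystem object (here: copied mugwort must
    # still be planted at the destination).
    TASK_FOR_COPIED = {"mugwort": "plant"}

    def copy_ecosystem_object(obj, dest_field, dest_pos, storage):
        copy = {**obj, "field": dest_field, "pos": dest_pos, "is_copy": True}
        storage.append(copy)                       # register the copy object
        task_name = TASK_FOR_COPIED.get(obj["species"])
        if task_name:                              # register a task object near it
            storage.append({"type": "task", "task": task_name,
                            "field": dest_field, "pos": dest_pos})
        return copy

    storage = []
    mugwort_a = {"type": "ecosystem", "species": "mugwort", "field": "A", "pos": (2, 3)}
    copy_ecosystem_object(mugwort_a, "B", (5, 1), storage)
    print([o.get("task", o.get("species")) for o in storage])  # ['mugwort', 'plant']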
- FIG. 56 is a diagram showing an example of changing the display scale of the background space on the terminal 12.
- in the terminal 12, the display scale of the background space (the image thereof) can be changed in accordance with a user operation or the like.
- FIG. 56 shows an example of a background space Ba with a reduced scale as a display scale and a background space Bb with a larger scale.
- the background space Ba is an image overlooking a certain farm field
- the background space Bb is an image obtained by enlarging a part of the background space Ba.
- FIG. 57 is a diagram illustrating an example of changing the display of the AR object in accordance with the change of the display scale of the background space on the terminal 12.
- in FIG. 57, AR objects are displayed superimposed on the same background spaces Ba and Bb as in FIG. 56.
- when the display scale is small, that is, for example, when the background space Ba, which is an image overlooking a certain field, is displayed, the AR objects given to the field reflected in the background space Ba can be displayed with thinning out.
- when the display scale is large, that is, for example, when the background space Bb is displayed, the AR objects given to the field reflected in the background space Bb can be displayed without thinning out.
- how much the AR object is thinned out can be set according to, for example, the display scale and the number of AR objects assigned to the field reflected in the background space.
- for example, a maximum number of AR objects to be displayed on one screen can be set, and the AR objects can be thinned out so that their number is equal to or less than the maximum number.
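- a minimal sketch of such thinning follows; the budget formula and the hypothetical per-object priority field are illustrative assumptions.

    def thin_out(objects, display_scale, max_on_screen=20):
        """Keep at most max_on_screen objects; the smaller the display scale
        (zoomed out), the fewer objects survive. 'priority' is a hypothetical
        per-object field used to decide which objects to keep."""
        budget = max(1, int(max_on_screen * min(display_scale, 1.0)))
        ranked = sorted(objects, key=lambda o: o.get("priority", 0), reverse=True)
        return ranked[:budget]

    objs = [{"name": f"obj{i}", "priority": i} for i in range(30)]
    print(len(thin_out(objs, display_scale=0.25)))  # 5
    print(len(thin_out(objs, display_scale=1.0)))   # 20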
- FIG. 58 is a diagram illustrating an example of VR display in which each of a plurality of farm fields is used as a background space.
- the terminal 12 can perform VR display using each of the VR spaces of the two fields as a background space.
- the terminal 12 can simultaneously perform VR display using not only two but also three or more VR spaces of the field as background spaces.
- the terminal 12 can, for example, display the VR displays of the plurality of fields side by side, or display them superimposed on one another with a predetermined transparency.
- the terminal 12 by performing VR display using each of the VR spaces of a plurality of fields as a background space, it is possible to easily compare, for example, vegetation existing in each of the plurality of fields.
- the display scale of the background space in which each of the plurality of fields is reflected (the VR display with AR objects superimposed thereon) can be changed as described with reference to FIGS. 56 and 57.
- for example, a wide field and a narrow field can be displayed so that their display sizes match, making it easy to compare, for example, the distribution of vegetation in the wide field with that in the narrow field.
- when performing VR display using each of a plurality of fields as a background space (hereinafter also referred to as multiple field display), it is of course possible to adopt, as the plurality of fields to be displayed, fields at different positions (locations), and it is also possible to adopt the same field at different times.
- that is, for example, VR display using the current state of a certain field and a past state of that field as background spaces can be performed. Further, for example, VR display using the state of a certain field at one past time point and its state at another past time point as background spaces can be performed.
- FIG. 59 is a diagram for explaining an example of associating markers with ecosystem objects.
- the generation unit 73 generates an ecosystem object, for example, according to the operation of the terminal 12 by the user, and associates the marker with the ecosystem object as necessary.
- an ecosystem object can be displayed (arranged) at a relative position in the background space with the marker associated with that ecosystem object as a reference.
- one marker M1 can be associated with one ecosystem object obj1.
- one marker can be associated with a plurality of ecosystem objects. That is, for example, as shown in FIG. 59, one marker M1 can be associated with a plurality of three ecosystem objects obj1, obj2, and obj3.
- when the marker M1 is associated with one ecosystem object obj1 and the marker M1 is detected from an image photographed with the camera 37 at the terminal 12, the one ecosystem object obj1 associated with the marker M1 is displayed at a relative position with the marker M1 as a reference.
- when the marker M1 is associated with the three ecosystem objects obj1 to obj3 and the marker M1 is detected from an image photographed with the camera 37 at the terminal 12, the three ecosystem objects obj1 to obj3 associated with the marker M1 are displayed at their respective relative positions with the marker M1 as a reference.
- the relative positions of the ecosystem objects obj1 to obj3 displayed with reference to the marker M1 can be set separately for each of the ecosystem objects obj1 to obj3.
- since one marker can be associated with a plurality of ecosystem objects, for example, one marker can be associated with some or all of the ecosystem objects given to one field.
- FIG. 60 is a diagram for explaining an example of associating markers with ecosystem objects.
- a plurality of different markers can be associated with one ecosystem object.
- in FIG. 60, each of the two markers M1 and M2 is associated with the ecosystem objects obj1 to obj3.
- here, for each ecosystem object obj#i, the relative position at which it is displayed with respect to the marker M1 and the relative position with respect to the marker M2 are set to the same position, but they can also be set to different positions.
- of the markers M1 and M2, for example, the marker M1 can be installed in the field where the ecosystem component represented by the ecosystem object obj#i exists, so that the ecosystem object obj#i can be AR-displayed in that field.
- the other marker M2 can be installed at an arbitrary location away from the field, so that the ecosystem object obj#i can be AR-displayed at that arbitrary location.
- in this case, the teacher user can view the ecosystem object obj#i representing an ecosystem component existing in the field at a remote place away from that field.
- as a result, the teacher user can easily recognize, at a remote place away from the field, the vegetation planted in the field and its arrangement, and can give, for example, vegetation design advice to the student user of the field.
- FIG. 61 is a diagram for explaining the change of the marker installation position.
- as described above, an ecosystem object is displayed (arranged) at a relative position in the background space based on the marker associated with the ecosystem object.
- in FIG. 61, an ecosystem object is displayed at a position Pb separated by a vector Va from the marker position Pa.
- since the ecosystem object is displayed at a relative position with respect to the marker, when the marker is moved, the absolute position at which the ecosystem object is displayed also moves in accordance with the movement of the marker.
- when the position where the ecosystem object is displayed moves according to the movement of the marker in this way, the ecosystem object may be displayed at a position unrelated to the ecosystem component it represents.
- therefore, the marker can be moved in a position-invariant mode, in which the (absolute) position where the ecosystem object associated with the marker is displayed does not change before and after the movement of the marker.
- in the position-invariant mode, after the marker is moved to a position Pc, the ecosystem object is displayed at a position separated by a vector Vc from the marker position Pc, that is, at the same position Pb as before the movement.
- this prevents the ecosystem object from being displayed at a position unrelated to the ecosystem component represented by the ecosystem object.
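- in the position-invariant mode, the recomputation reduces to simple vector arithmetic, as in the following sketch (two-dimensional coordinates are assumed for brevity).

    def move_marker(marker_pos_old, offset_old, marker_pos_new):
        """Position-invariant marker move: keep the object's absolute position
        Pb = Pa + Va fixed when the marker moves from Pa to Pc, by recomputing
        the relative vector as Vc = Pb - Pc."""
        pb = (marker_pos_old[0] + offset_old[0], marker_pos_old[1] + offset_old[1])
        return (pb[0] - marker_pos_new[0], pb[1] - marker_pos_new[1])

    pa, va, pc = (0.0, 0.0), (2.0, 1.0), (3.0, 3.0)
    vc = move_marker(pa, va, pc)
    print(vc)                              # (-1.0, -2.0)
    print((pc[0] + vc[0], pc[1] + vc[1]))  # (2.0, 1.0): Pb, unchanged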
- FIG. 62 is a diagram showing a display example of an ecosystem object representing an ecosystem component related to the entire field.
- an ecosystem object is displayed with a certain field A (real space, photographing real space, or VR space) as a background space.
- in FIG. 62, ecosystem objects representing the Chinese cabbage, mugwort, leek, beehive, nitric acid concentration, moisture content, and climate of the field A are displayed.
- the ecosystem objects of the Chinese cabbage, mugwort, leek, and beehive are displayed in the field A near the positions where the Chinese cabbage, mugwort, leek, and beehive are present.
- the ecosystem objects representing the nitric acid concentration, the moisture content, and the climate of the field A are ecosystem objects representing ecosystem components related to the entire field A (hereinafter also referred to as entire objects), and, regardless of the background space of the field A, are displayed at fixed positions on the display screen of the display unit 35 (the right-end position of the display screen in FIG. 62).
- now, suppose that the terminal 12 is a see-through HMD, and that in the see-through HMD the real space of the field A is used as the background space with ecosystem objects superimposed and displayed on it.
- when the user turns his or her head while wearing the see-through HMD as the terminal 12, the background space appearing on the display screen (entering the user's field of view) moves, for example, to a range of the field A further to the right than before.
- even after the user turns his or her head, the entire objects representing the ecosystem components related to the whole field A remain displayed at the same position, that is, at the right-end position of the display screen in FIG. 62.
- therefore, when AR display or VR display is performed with the field A as the background space, the user can recognize, at any timing, the ecosystem components related to the entire field A, for example, the nitric acid concentration, the moisture content, and the climate.
- the display of the entire object can be turned on / off in accordance with, for example, a user operation.
- FIG. 63 is a diagram for explaining display of related information on the terminal 12.
- an ecosystem object can be linked to related information related to the ecosystem component represented by the ecosystem object.
- as the related information, for example, information stored in a CMS database such as the cooperative farming method (registered trademark) DB accessible by the cooperative farming method (registered trademark) CMS 71, or information constituting the nodes of the relation graphs described above, can be employed.
- for example, when the user selects a certain ecosystem object as a selection object in the time lapse mode display of FIG. 49, the terminal 12 can display the related information linked to the selection object.
- a certain ecosystem object is selected as a selection object from the time-lapse mode display (FIG. 49) in accordance with the user operation.
- the selected object is an ecosystem object representing a certain vegetation.
- the terminal 12 transmits the selected object to the server 13 in accordance with, for example, a user operation, and the server 13 identifies, from the relation graph, vegetation (hereinafter referred to as related vegetation) as related information related to the vegetation represented by the selected object from the terminal 12.
- that is, the graph display control unit 72 identifies, as related vegetation, the vegetation (nodes) linked to the vegetation (node) represented by the selected object in a multipartite graph such as the relation graph (bipartite graph).
- further, the cooperative farming method (registered trademark) CMS 71 searches the CMS database for information on the related vegetation, that is, for example, photographs and names of the related vegetation, and transmits it to the terminal 12 in the form of a web page.
- in the terminal 12, the list of the related vegetation is displayed in the form of a web page together with the information (photographs) on the related vegetation.
- in this way, the user of the terminal 12 can easily obtain information on vegetation related to the vegetation represented by the selected object simply by selecting the selected object.
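- a minimal sketch of such a lookup follows, reusing the bipartite representation from the earlier sketch; vegetation is treated as related to the selected vegetation when they share at least one other-information node (here, a field), which is an illustrative linkage rule.

    from collections import defaultdict

    def related_vegetation(db_rows, selected):
        """Vegetation linked to `selected` through shared nodes (e.g. fields)
        of a bipartite graph, mirroring how related vegetation could be read
        off the relation graph."""
        by_other = defaultdict(set)
        for veg, other in db_rows:
            by_other[other].add(veg)
        related = set()
        for veg, other in db_rows:
            if veg == selected:
                related |= by_other[other]
        related.discard(selected)
        return sorted(related)

    rows = [("mugwort", "field A"), ("leek", "field A"),
            ("chinese cabbage", "field B"), ("mugwort", "field B")]
    print(related_vegetation(rows, "mugwort"))  # ['chinese cabbage', 'leek']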
- FIG. 64 is a diagram for explaining an example of selection of an AR object for performing AR display or VR display.
- the AR objects to be superimposed on the background space on the display screen of the display unit 35 in AR display or VR display (hereinafter also referred to as display target objects) can be appropriately selected, that is, changed.
- the display target object can be selected (changed) according to the skill level of the user of the terminal 12 with respect to the cooperative farming method (registered trademark).
- the level of proficiency of the user of the terminal 12 can be recognized based on the profile by having the user input the user's profile in advance and registering it in the terminal 12 or the server 13.
- FIG. 64A shows a display example of AR display or VR display for a novice user.
- in FIG. 64A, ecosystem objects objES1, objES2, objES3, and objES4 and task objects objT1, objT2, objT3, objT12, and objT23 are displayed with a certain field A as the background space.
- Ecosystem objects objES1 and objES3 represent Chinese cabbage.
- Ecosystem object objES2 represents leek, and ecosystem object objES4 represents mugwort.
- Task objects objT1 to objT3 represent vegetation harvesting, and task objects objT12 and objT23 represent harvesting order.
- by looking at the ecosystem objects objES1 to objES4, a novice user can easily recognize the positions in the field A where the vegetation such as Chinese cabbage represented by each of the ecosystem objects objES1 to objES4 exists. Furthermore, from the task objects objT1 to objT3 and the task objects objT12 and objT23, the novice user can easily recognize that the Chinese cabbage represented by the ecosystem object objES1, the leek represented by the ecosystem object objES2, and the Chinese cabbage represented by the ecosystem object objES3 should be harvested in that order.
- in the AR display or VR display for novice users, in order to teach beginners the cooperative farming method (registered trademark), ecosystem objects representing ecosystem components that are not highly specialized, task objects that carefully represent the tasks to be performed on the ecosystem components, and the like are selected and displayed as display target objects.
- FIG. 64B shows a display example of AR display or VR display for expert users.
- in FIG. 64B, ecosystem objects objES1, objES2, objES3, and objES4 and task objects objT1, objT2, and objT3 are displayed with the field A as the background space.
- the AR display or VR display of FIG. 64B differs from the AR display or VR display for novice users of FIG. 64A in that the task objects objT12 and objT23 are not selected as display target objects and are therefore not displayed.
- the expert user has already acquired, as knowledge, for example, that the Chinese cabbage and leek should be harvested in the order of the Chinese cabbage represented by the ecosystem object objES1, the leek represented by the ecosystem object objES2, and the Chinese cabbage represented by the ecosystem object objES3.
- for such an expert user, the task objects objT12 and objT23 representing the order of harvesting the Chinese cabbage and leek are redundant information, and the display of such task objects objT12 and objT23 may feel bothersome.
- therefore, in the AR display or VR display for expert users, of the ecosystem objects objES1 to objES4 and the task objects objT1 to objT3, objT12, and objT23, the task objects objT12 and objT23 are excluded, and the ecosystem objects objES1 to objES4 and the task objects objT1 to objT3 are selected and displayed as display target objects.
- rare vegetation may be important for experienced users, but not important for novice users.
- therefore, according to the skill level of the user, the terminal 12 can, for example, display ecosystem objects representing rare vegetation only for users with a high skill level, and display only ecosystem objects representing major vegetation for users with a low skill level.
- the selection of display target objects according to the skill level can be performed, for example, by assigning a level representing a skill level to each AR object and selecting the AR objects to which a level corresponding to the skill level of the user is assigned.
- FIG. 65 is a diagram for explaining another example of selection of an AR object for AR display or VR display.
- the display target objects can be selected not only according to the skill level of the user but also by designating a category of data.
- as described above, AR objects can be divided into, for example, photographs (data), icons (data), and sensor data (symbols representing it) according to the category of data as the AR object.
- in the terminal 12, one or more categories can be selected from the above-described categories of photographs, icons, and sensor data in accordance with a user operation, and the AR objects of the selected categories can be chosen as display target objects.
- FIG. 65A shows a display example of AR display or VR display when all AR objects in the categories of photos, icons, and sensor data are selected as display target objects.
- in FIG. 65A, photograph ecosystem objects representing Chinese cabbage and leek, icon task objects representing the tasks of harvesting the Chinese cabbage and leek, and sensor data ecosystem objects representing the nitric acid concentration and the moisture content are selected and displayed as display target objects.
- FIG. 65B shows a display example of AR display or VR display when only the AR object of the photo category is selected as the display target object among the photo, icon, and sensor data categories.
- in FIG. 65B, photograph ecosystem objects representing Chinese cabbage and leek are selected and displayed as display target objects.
- FIG. 65C shows a display example of AR display or VR display when only AR objects of the sensor data category are selected as display target objects among the photograph, icon, and sensor data categories.
- in FIG. 65C, sensor data ecosystem objects representing the nitric acid concentration and the moisture content are selected and displayed as display target objects.
- FIG. 65D shows a display example of AR display or VR display when AR objects of the photograph and icon categories are selected as display target objects among the photograph, icon, and sensor data categories.
- in FIG. 65D, photograph ecosystem objects representing Chinese cabbage and leek and icon task objects representing the tasks of harvesting the Chinese cabbage and leek are selected and displayed as display target objects.
- as described above, one or more categories can be selected from the categories of AR objects, and the AR objects of those categories can be selected as display target objects; therefore, only AR objects of the categories required by the user can be shown in AR display or VR display.
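- the following sketch combines the two selection mechanisms just described, selection by skill level and selection by data category; the per-object levels and category fields are hypothetical.

    def select_display_targets(ar_objects, user_level, categories):
        """Filter AR objects by (a) the skill levels each object is tagged with
        and (b) the data categories chosen by the user."""
        return [o for o in ar_objects
                if user_level in o["levels"] and o["category"] in categories]

    objects = [
        {"name": "chinese cabbage", "category": "photo",  "levels": {"novice", "expert"}},
        {"name": "harvest order",   "category": "icon",   "levels": {"novice"}},
        {"name": "rare vegetation", "category": "photo",  "levels": {"expert"}},
        {"name": "nitric acid",     "category": "sensor", "levels": {"novice", "expert"}},
    ]
    # An expert who selected only the photo category:
    print([o["name"] for o in select_display_targets(objects, "expert", {"photo"})])
    # ['chinese cabbage', 'rare vegetation']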
- FIG. 66 is a flowchart for explaining an example of processing for registering an ecosystem object in the ecosystem utilization system of FIG. 1 (FIG. 43).
- in step S311, the sensor device 11 and the terminal 12 acquire registration target information for an ecosystem component to which an ecosystem object is to be given (hereinafter also referred to as an ecosystem component of interest).
- that is, the sensor device 11 acquires, as registration target information, sensor data obtained as a result of sensing a predetermined physical quantity, such as temperature or moisture, as the ecosystem component of interest.
- in the terminal 12, the acquisition unit 51 acquires, as registration target information, for example in response to a user operation, a photograph of the ecosystem component of interest, such as vegetation photographed in the field, or a comment (for example, one about a bee, or “What is this plant?”).
- in step S312, the sensor device 11 and the terminal 12 acquire position information of the ecosystem component of interest.
- the sensor device 11 acquires, for example, GPS coordinates representing the position of the sensor device 11 itself as position information of the ecosystem component of interest.
- the terminal 12 acquires, for example, GPS coordinates representing the position of the terminal 12 itself as position information of the ecosystem component of interest.
- alternatively, in the terminal 12, the relative position of the ecosystem component of interest with respect to the terminal 12 itself can be detected, and coordinates obtained by correcting the GPS coordinates of the terminal 12 itself by that relative position can be acquired as the position information of the ecosystem component of interest.
- the user can operate the operation unit 34 to input the position information of the ecosystem component of interest.
- in step S313, the sensor device 11 and the terminal 12 transmit the registration target information and the position information acquired for the ecosystem component of interest to the server 13.
- in step S321, the server 13 receives the registration target information and the position information of the ecosystem component of interest transmitted from the sensor device 11 or the terminal 12.
- in step S322, the generation unit 73 of the server 13 (FIG. 4) acquires an ecosystem object to be given to the ecosystem component of interest.
- that is, when a photograph is included in the registration target information of the ecosystem component of interest, the generation unit 73 adopts the photograph as the ecosystem object to be given to the ecosystem component of interest.
- when sensor data is included in the registration target information of the ecosystem component of interest, the generation unit 73 generates a symbol representing the sensor data as the ecosystem object to be given to the ecosystem component of interest.
- in addition, for example, the icon ecosystem object representing a beehive and the icon ecosystem object representing a danger area described in FIG. 52, and the like, can be displayed on the terminal 12 as candidate objects, which are candidates for the ecosystem object to be given to the ecosystem component of interest.
- in this case, the candidate object selected by the user from the candidate objects displayed on the terminal 12 can be adopted as the ecosystem object to be given to the ecosystem component of interest.
- in step S323, the generation unit 73 generates object information associating the registration target information and the position information of the ecosystem component of interest, the ecosystem object given to the ecosystem component of interest, and the like; the generation unit 73 then registers the object information in the object DB for registering object information, which is stored in, for example, the storage 63.
- FIG. 67 is a diagram showing an outline of object information.
- the object information includes an ecosystem object representing an ecosystem component, location information of the ecosystem component, and registration target information.
- object information can include field information, marker information, task objects, and the like.
- the farm field information is information related to the farm field where the ecosystem components are observed (the farm field existing at the position represented by the position information).
- the field information can include the field name, field address, field soil information, and the like.
- by referring to the field information, ecosystem objects representing the ecosystem components existing in a certain field can be picked up.
- Marker information is information related to markers that display ecosystem objects.
- the marker information includes, for example, an image as a marker, relative position information indicating a relative position where an ecosystem object is displayed with reference to the marker, and the like.
- the image as the marker can be uploaded from the terminal 12 to the server 13, for example. Also, an image as a marker can be generated by the server 13 and downloaded to the terminal 12, for example.
- the relative position information of the marker can be, for example, input by the user operating the terminal 12, or calculated from the position of the marker and the position information of the ecosystem object.
- in the AR display and the VR display, the ecosystem object is displayed at the position represented by the relative position information included in the marker information, with the marker as a reference.
- the same image can be adopted as the marker image included in the marker information of a plurality of pieces of object information.
- one marker can be associated with a plurality of ecosystem objects, and a plurality of ecosystem objects respectively representing a plurality of ecosystem components at close positions can be displayed by the one marker.
- the task object represents a task to be performed on the ecosystem component represented by the ecosystem object included in the object information.
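- one possible in-memory shape for the object information of FIG. 67 is sketched below; the field names and types are illustrative and do not reproduce the registered schema.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class MarkerInfo:
        image_id: str        # identifies the marker image
        relative_pos: tuple  # (dx, dy): where the object appears relative to the marker

    @dataclass
    class ObjectInfo:
        ecosystem_object: str              # e.g. a photo id, icon id, or sensor symbol
        position: tuple                    # absolute position, e.g. GPS coordinates
        registration_info: dict            # photo / comment / sensor data
        field_info: dict = field(default_factory=dict)  # field name, address, soil...
        markers: list = field(default_factory=list)     # list of MarkerInfo
        task_object: Optional[str] = None  # task to perform on the component

    info = ObjectInfo(
        ecosystem_object="photo:mugwort_001",
        position=(35.0, 139.0),
        registration_info={"comment": "What is this plant?"},
        field_info={"name": "field A"},
        markers=[MarkerInfo("marker_M1", (1.0, 0.5))],
    )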
- FIG. 68 is a flowchart for explaining an example of AR object display processing in the AR mode in the ecosystem utilization system of FIG. 1 (FIG. 43).
- In step S331, the acquisition unit 51 of the terminal 12 (FIG. 3) acquires, as the background space, a real space, a photographed real space, or a VR space.
- Specifically, when the terminal 12 is an immersive HMD or a smartphone, a photographed real space obtained by shooting the real space with the camera 37, or a VR space (an image thereof) obtained by modeling the real space, is acquired as the background space (image).
- When the terminal 12 is a see-through HMD, the real space that the user of the terminal 12 sees through the display is used as the background space as it is.
- In step S332, the acquisition unit 51 of the terminal 12 transmits (via the communication unit 40) an object request to the server 13, requesting the ecosystem objects and task objects to be placed in the background space.
- For example, the acquisition unit 51 detects a marker from the background space and transmits the object request to the server 13 together with the marker.
- Alternatively, the acquisition unit 51 acquires the current location of the terminal 12 and transmits the object request to the server 13 together with the current location.
- In step S341, the AR/VR display control unit 74 of the server 13 (FIG. 4) transmits (via the communication unit 67) the ecosystem object and the task object to the terminal 12 in response to the object request transmitted from the terminal 12.
- That is, the AR/VR display control unit 74 searches the object DB stored in the storage 63, as the object information of interest, for object information including the marker transmitted from the terminal 12 together with the object request, or object information including position information (hereinafter also referred to as absolute position information) representing a position close to the current location of the terminal 12 transmitted together with the object request.
- The AR/VR display control unit 74 then transmits to the terminal 12 the ecosystem object included in the object information of interest, together with the relative position information referenced to the marker or the absolute position information (hereinafter also referred to as object position information).
- When a task object is included in the object information of interest, the AR/VR display control unit 74 also transmits the task object to the terminal 12.
- In step S333, the acquisition unit 51 of the terminal 12 acquires the ecosystem object, the task object, and the object position information transmitted from the server 13.
- In step S334, the display control unit 52 of the terminal 12 performs AR display by superimposing the ecosystem object acquired by the acquisition unit 51 on the background space acquired in step S331, at the position represented by the object position information.
- Further, the display control unit 52 superimposes the task object acquired by the acquisition unit 51 on the background space and displays it in AR in the vicinity of the ecosystem object.
- In this way, the terminal 12 performs, for example, the AR display described above with reference to FIGS. 44 and 45.
- Note that the terminal 12 can always download from the server 13 object information including absolute position information representing positions close to the current location of the terminal 12.
- In this case, the terminal 12 can display the ecosystem objects and task objects included in the downloaded object information without making an object request to the server 13, that is, without communicating with the server 13.
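- The search that the server performs in step S341 can be sketched as follows, reusing the hypothetical ObjectInfo record above. This is an illustrative outline of the described behavior, not the actual implementation; the proximity threshold is an arbitrary assumption.

```python
def handle_object_request(object_db, request, threshold=0.001):
    """Server side (step S341): pick the object information of interest by marker,
    or by absolute position information close to the terminal's current location,
    and return what the terminal needs for the AR display of step S334."""
    if "marker" in request:
        hits = [oi for oi in object_db
                if oi.marker_info is not None
                and oi.marker_info.marker_image == request["marker"]]
    else:
        lat, lon = request["location"]
        hits = [oi for oi in object_db
                if abs(oi.position[0] - lat) < threshold
                and abs(oi.position[1] - lon) < threshold]
    # Each result: the ecosystem object, its object position information
    # (marker-relative if a marker is used, otherwise absolute), and task objects.
    return [(oi.ecosystem_object,
             oi.marker_info.relative_position if oi.marker_info else oi.position,
             oi.task_objects)
            for oi in hits]
```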
- FIG. 69 is a flowchart for explaining an example of processing of displaying an AR object in the VR mode in the ecosystem utilization system of FIG. 1 (FIG. 43).
- In step S351, the terminal 12 waits for the user to designate a position, and transmits the designated position to the server 13.
- In step S361, the server 13 receives the designated position from the terminal 12.
- In step S362, the server 13 transmits to the terminal 12 a photographed real space obtained by shooting the real space in a predetermined range including the designated position, or a VR space (an image thereof) modeling that real space.
- That is, the storage 63 stores photographed real spaces obtained by shooting real spaces and VR spaces modeling those real spaces.
- The AR/VR display control unit 74 reads, from the photographed real spaces and VR spaces stored in the storage 63, the photographed real space or VR space of the real space including the designated position, and transmits it to the terminal 12.
- In step S352, the acquisition unit 51 of the terminal 12 acquires the photographed real space or VR space from the server 13 as the background space.
- In step S353, the acquisition unit 51 of the terminal 12 transmits to the server 13 an object request requesting the ecosystem objects and task objects to be placed in the background space.
- For example, the acquisition unit 51 detects a marker from the background space and transmits the object request to the server 13 together with the marker; alternatively, the acquisition unit 51 transmits the object request to the server 13 together with the designated position.
- In step S363, the AR/VR display control unit 74 of the server 13 transmits the ecosystem object and the task object to the terminal 12 in response to the object request transmitted from the terminal 12.
- That is, the AR/VR display control unit 74 searches the object DB stored in the storage 63, as the object information of interest, for object information including the marker transmitted from the terminal 12 together with the object request, or object information including absolute position information representing a position close to the designated position transmitted together with the object request.
- The AR/VR display control unit 74 then transmits to the terminal 12 the ecosystem object included in the object information of interest, together with the relative position information referenced to the marker or the absolute position information as the object position information.
- When a task object is included in the object information of interest, the AR/VR display control unit 74 also transmits the task object to the terminal 12.
- In steps S354 and S355, the terminal 12 performs the same processing as in steps S333 and S334 of FIG. 68, respectively, whereby the ecosystem objects and task objects are superimposed and displayed in VR on the photographed real space or VR space acquired as the background space in step S352.
- In this way, the terminal 12 performs, for example, the VR display described above with reference to FIGS. 46 and 48.
- In the above, the terminal 12 superimposes the ecosystem object and the task object on the background space and performs the VR display; alternatively, for example, the server 13 can superimpose the ecosystem object and the task object on the background space and transmit the result to the terminal 12.
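- The VR-mode exchange of FIG. 69 differs from the AR-mode flow mainly in that the background space itself comes from the server. A minimal sketch under the same assumptions as above, where terminal and server stand for thin, hypothetical wrappers around the described units:

```python
def vr_mode_display(terminal, server):
    # Steps S351/S361: the user designates a position, which is sent to the server.
    designated = terminal.wait_for_designated_position()

    # Steps S362/S352: the server returns the stored photographed real space or
    # VR space covering the designated position; it becomes the background space.
    background = server.load_background(designated)

    # Steps S353/S363: request objects by a detected marker, or by the designated
    # position, then superimpose them as in steps S354/S355.
    marker = terminal.detect_marker(background)
    request = {"marker": marker} if marker is not None else {"location": designated}
    for eco_obj, pos, tasks in server.handle_object_request(request):
        terminal.draw(eco_obj, at=pos)          # ecosystem object at its position
        for task_obj in tasks:
            terminal.draw(task_obj, near=pos)   # task object in the vicinity
```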
- FIG. 70 is a flowchart for explaining an example of processing for editing an AR object in the ecosystem utilization system of FIG. 1 (FIG. 43).
- FIG. 70 shows an example of processing when, as editing of an AR object, a comment is input to an ecosystem object, or a task object representing a task to be performed on the ecosystem component represented by the ecosystem object is associated with that ecosystem object.
- In step S371, the terminal 12 displays the AR object in the AR mode or the VR mode as described with reference to FIG. 68 or FIG. 69.
- In step S372, when the user performs an operation of inputting editing information as editing of the AR object displayed in step S371, the acquisition unit 51 of the terminal 12 accepts the editing information.
- For example, when the user inputs a comment for an ecosystem object, the acquisition unit 51 accepts the comment as editing information.
- When the user performs an operation of associating a task object with an ecosystem object, the acquisition unit 51 accepts the task object as editing information.
- Here, the terminal 12 can display, for example, candidates for the task object that can be associated with the ecosystem object.
- The user can select the task object to be associated with the ecosystem object from among the task object candidates displayed on the terminal 12.
- The task object candidates can be acquired from the server 13, for example.
- In step S373, the acquisition unit 51 of the terminal 12 transmits the editing information accepted in step S372 to the server 13, together with identification information identifying the corresponding AR object.
- Here, identification information for identifying each AR object is assigned to the AR objects.
- That is, the terminal 12 transmits the editing information together with the identification information of the ecosystem object to which the comment was input, or of the ecosystem object with which the task object was associated.
- In step S381, the editing unit 75 of the server 13 receives the identification information and the editing information transmitted from the terminal 12.
- In step S382, the editing unit 75 of the server 13 edits the AR object according to the editing information from the terminal 12.
- For example, when the editing information is a comment, the editing unit 75 adds the comment to the object information (FIG. 67) including the ecosystem object identified by the identification information transmitted from the terminal 12 together with the editing information.
- Likewise, when the editing information is a task object, the editing unit 75 adds the task object to the object information including the ecosystem object identified by the identification information.
- In step S374, the terminal 12 redisplays the AR object in the AR mode or the VR mode as described with reference to FIG. 68 or FIG. 69.
- That is, the acquisition unit 51 acquires the edited AR object from the server 13 again, and the display control unit 52 displays the edited AR object.
- As a result, in step S374, when a comment was input, the ecosystem object is displayed together with the comment input by the user.
- When a task object was associated, the ecosystem object is displayed together with that task object.
- In FIG. 70, the case where an ecosystem object is edited has been described, but it is likewise possible to edit a task object, that is, to input a comment to a task object.
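- Server-side handling of the editing information in step S382 amounts to appending either a comment or a task object to the matching object information. A sketch under the same assumptions as the ObjectInfo example above; identification is simplified here to an "id" field, which is a hypothetical convention:

```python
def apply_edit(object_db, identification_info, editing_info):
    """Step S382: reflect editing information in the object DB.
    editing_info is assumed to be {"comment": text} or {"task_object": name}."""
    for oi in object_db:
        if oi.registration_info.get("id") != identification_info:
            continue
        if "comment" in editing_info:
            # A comment is added to the object information (FIG. 67).
            oi.registration_info.setdefault("comments", []).append(editing_info["comment"])
        if "task_object" in editing_info:
            # A task object is associated with the ecosystem object.
            oi.task_objects.append(editing_info["task_object"])
```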
- FIG. 71 is a flowchart for explaining another example of the editing process of the AR object in the ecosystem utilization system of FIG. 1 (FIG. 43).
- FIG. 71 shows an example of processing when, as editing of an AR object, an ecosystem object is copied.
- In step S391, the terminal 12 displays the AR objects of two fields A and B in the VR mode, as described with reference to FIG. 69.
- That is, the terminal 12 waits for the user to designate the two fields A and B, and then performs VR display using the VR spaces of the two fields A and B designated by the user as the background spaces, as shown in FIG. 55, for example.
- In step S392, when the user performs an operation of inputting copy information as editing of the AR object displayed in step S391, the acquisition unit 51 of the terminal 12 accepts the copy information.
- For example, the user can designate an ecosystem object displayed in the background space of field A as the copy target and copy it to a desired position in the background space of field B.
- In this case, the acquisition unit 51 accepts, as the copy information, the copy object (the ecosystem object to be copied) and the copy-destination field and position to which the copy object is copied.
- In step S393, the acquisition unit 51 of the terminal 12 transmits the copy information accepted in step S392 to the server 13.
- In step S401, the editing unit 75 of the server 13 receives the copy information transmitted from the terminal 12.
- In step S402, the editing unit 75 of the server 13 copies the ecosystem object according to the copy information from the terminal 12.
- That is, the editing unit 75 generates new object information including the copy object contained in the copy information, as object information for the copy-destination field contained in the copy information. Further, the editing unit 75 sets position information representing the copy-destination position contained in the copy information as the (absolute) position information of the new object information, and additionally registers the new object information in the object DB of the storage 63.
- In step S403, the editing unit 75 determines a task to be performed on the ecosystem component represented by the copy object.
- When the editing unit 75 determines that there is no task to be performed on the ecosystem component represented by the copy object, the server 13 does not perform the subsequent processing.
- On the other hand, when the copy object represents, for example, vegetation, the editing unit 75 determines planting of that vegetation as the task to be performed on the ecosystem component represented by the copy object.
- The editing unit 75 then includes (registers) a task object representing planting in the new object information additionally registered in the object DB in step S402, whereby the task object is associated with the copy object included in the new object information.
- In step S394, the terminal 12 redisplays the AR object in the VR mode as described with reference to FIG. 69.
- That is, the acquisition unit 51 acquires the edited AR object from the server 13 again, and the display control unit 52 displays the edited AR object.
- As a result, in step S394, the copy object is displayed at the desired position in the background space of field B.
- Further, when a task object has been associated with the copy object, the task object is displayed together with the copy object in step S394.
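- The copy of steps S402 and S403 can be sketched in the same vein; the copy_info keys and the vegetation check below are illustrative assumptions, not the actual data layout:

```python
import copy

def copy_ecosystem_object(object_db, copy_info):
    """Steps S402/S403: register the copy object as new object information for the
    copy-destination field, then associate a planting task when appropriate."""
    new_oi = copy.deepcopy(copy_info["source_object_info"])   # ObjectInfo of the copy object
    new_oi.field_info = copy_info["destination_field"]        # copy-destination field
    new_oi.position = copy_info["destination_position"]       # (absolute) position of the copy

    # Step S403: when the copy object represents vegetation, planting becomes
    # the task to be performed at the copy destination.
    if new_oi.registration_info.get("kind") == "vegetation":
        new_oi.task_objects.append("planting")

    object_db.append(new_oi)                                  # additional registration
    return new_oi
```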
- FIG. 72 is a diagram illustrating an example of transition of the display screen of the display unit 35 when the terminal 12 performs registration or editing of an AR object.
- the terminal 12 changes the display screen of the display unit 35 as shown in FIG. 72 by exchanging necessary information with the server 13.
- the “Main Menu” screen is displayed on the display screen of the display unit 35.
- the “Main Menu” screen has a [Camera Capture] button and an [Edit Object] button.
- When the [Camera Capture] button is operated, the terminal 12 takes a photograph with the camera 37 and acquires the GPS coordinates (position information) at the time of shooting. Then, a “comment entry” screen for inputting a comment on the photograph is displayed on the display screen of the display unit 35.
- the “comment entry” screen has a “registration” button.
- When the [Register] button is operated, a “registration complete” screen is displayed on the display screen of the display unit 35.
- At this time, the photograph taken by the user on the “Photograph” screen and the comment entered on the “comment entry” screen are transmitted from the terminal 12 to the server 13 as the registration target information described above. Furthermore, the GPS coordinates acquired at the time of shooting are transmitted from the terminal 12 to the server 13 as the position information described above. In the server 13, object information including the position information and registration target information from the terminal 12 is registered.
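- What the [Register] operation sends along this camera-capture path can be pictured as a small payload; the field names below are illustrative only and not the actual wire format:

```python
def register_from_camera_capture(server, photo, comment, gps_coordinates):
    """Terminal side: 'Photograph' -> 'comment entry' -> [Register].
    The photo and comment travel as registration target information,
    the GPS coordinates as position information."""
    payload = {
        "registration_target_info": {"photo": photo, "comment": comment},
        "position_info": gps_coordinates,    # acquired at shooting time
    }
    return server.register_object(payload)   # server builds and stores object information
```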
- the “Action List” screen has a [Generate] button, a [Browse] button, a [Create Task] button, a [Task List] button, a [Data Import] button, and a [Registered Data] button.
- The coordinates input by the user on the “coordinate designation” screen are transmitted from the terminal 12 to the server 13 as the position information described above, and the comment entered by the user on the “comment entry” screen is transmitted from the terminal 12 to the server 13 as the registration target information described above.
- In the server 13, object information including the position information and registration target information from the terminal 12 is registered. That is, in this case, object information without a photograph as the ecosystem object, in other words, object information whose photograph is empty, is registered.
- Thereafter, the display screen of the display unit 35 returns to the “coordinate designation” screen.
- On the “object selection” screen, for example, a list of photographs registered as ecosystem objects in the object DB of the server 13 is displayed.
- When the user selects an ecosystem object on the “object selection” screen, the field where the ecosystem component represented by that ecosystem object was observed (the field represented by the field information included in the object information (FIG. 67) of the selected ecosystem object) is set as the target field, and the time-lapse mode display of ecosystem objects described with reference to FIG. 49 is performed.
- The “Time Lapse - Relation Record” screen has a [Select Object] button, a [Select Other Farm Name] button, a [Shoot] button, an [Edit Comment] button, and a [Delete] button.
- When the user selects an ecosystem object (hereinafter also referred to as the selected object), the server 13 generates a relation graph (such as FIG. 26) in which the ecosystem component node represented by the selected object is the node of interest, and a “Gephi graph model search” screen presenting that relation graph in Gephi is displayed on the display screen of the display unit 35 of the terminal 12.
- The “Gephi graph model search” screen has a [URL selection] button.
- When a node is selected on the “Gephi graph model search” screen and the [URL selection] button is operated, a “CMS database search” screen is displayed on the display screen of the display unit 35.
- On the “CMS database search” screen, for example, as described with reference to FIG. 53, related information on the vegetation and the like represented by the node selected by the user on the “Gephi graph model search” screen is retrieved from the CMS database of the server 13 and displayed.
- The display screen of the display unit 35 can also show, for example, the same-vegetation field information display described above.
- Furthermore, the display screen of the display unit 35 switches to the “target plantation object” screen via, for example, the transition display using Google Earth described above.
- On the “target plantation object” screen, the selected field is displayed as shown in FIG. 52.
- the user can perform an operation of replacing a photo (including an empty photo) as a selected object with another photo.
- the “Comment entry” screen has a [Register] button.
- When the [Register] button is operated, the server 13 replaces the photograph as the selected object included in the object information (FIG. 67) with the other photograph.
- the user can perform an operation of editing a comment included in the object information (FIG. 67) of the selected object.
- the “Edit” screen has a [Register] button.
- When the [Register] button is operated, the server 13 reflects the edited comment in the object information (FIG. 67).
- the “Erase Confirmation” screen has an [Erase] button.
- When the [Erase] button is operated, the server 13 deletes the object information (FIG. 67) of the selected object.
- Then, an “erase completion” screen indicating completion of the deletion of the object information is displayed.
- the “icon list” screen has buttons corresponding to tasks that can be performed on ecosystem components.
- In FIG. 72, for simplicity of explanation, two tasks, harvesting and cutting, are adopted as the tasks that can be performed on ecosystem components, and the “icon list” screen has a [Harvest] button representing harvesting and a [Weed Control] button representing cutting.
- When the [Harvest] button is operated, an “object designation” screen having a [Designate] button is displayed on the display screen of the display unit 35.
- On the “object designation” screen, ecosystem objects representing ecosystem components that can be targets of harvesting, the task corresponding to the [Harvest] button, are displayed.
- When the user selects an ecosystem object and operates the [Designate] button, a “comment edit” screen having a [Register] button is displayed on the display screen of the display unit 35.
- On the “comment edit” screen, the user can input a comment for the task object representing the task corresponding to the [Harvest] button (hereinafter also referred to as the harvest object), which is associated with the ecosystem object selected on the “object designation” screen.
- When the [Register] button is operated, the server 13 registers (adds) the harvest object and the comment input by the user in the object information (FIG. 67) of the ecosystem object selected by the user on the “object designation” screen.
- Then, a “registration completion” screen indicating completion of the harvest object registration is displayed.
- This “registration completion” screen has a [Complete] button.
- When the [Complete] button is operated, the display screen of the display unit 35 returns to the “icon list” screen.
- Similarly, on the “object designation” screen displayed when the [Weed Control] button is operated, ecosystem objects representing ecosystem components that can be targets of cutting, the task corresponding to the [Weed Control] button, are displayed.
- When the user selects an ecosystem object and operates the [Designate] button, a “comment edit” screen having a [Register] button is displayed on the display screen of the display unit 35, where a comment for the task object representing cutting (hereinafter also referred to as the cutting object) can be input.
- When the [Register] button is operated, a “registration completion” screen indicating completion of the cutting object registration is displayed.
- This “registration completion” screen has a [Complete] button.
- When the [Complete] button is operated, the display screen of the display unit 35 returns to the “icon list” screen.
- In this way, the task object is registered in association with the ecosystem object selected by the user on the “object designation” screen.
- In the above, the buttons for the task objects representing harvesting and cutting have been referred to as the [Harvest] button and the [Weed Control] button for convenience.
- On the “object designation” screen displayed when the [Harvest] button on the “icon list” screen is operated, the user can select the ecosystem objects representing the ecosystem components to be harvested and, at the same time, designate the order in which those ecosystem components are to be harvested. In the server 13, arrow-shaped icons representing the harvesting order can then be generated.
- an icon as a task object representing harvesting is displayed as an operable [harvest icon] button.
- The detailed harvest information displayed on the “Details” screen includes, for example, the name of the vegetation whose fruit is to be harvested, selection criteria for the fruit to be harvested (such as harvesting from the red ripe fruit and leaving the green immature fruit), and information on the balance with other work (such as leaving 20% of the fruit unharvested for self-seeding rather than harvesting everything).
- Such detailed information can be input as a comment, for example, on the “comment edit” screen for inputting a comment for the harvest object.
- Similarly, on the “object designation” screen displayed when the [Weed Control] button on the “icon list” screen is operated, the user can select the ecosystem objects representing the ecosystem components to be cut and also designate the order in which those ecosystem components are to be cut.
- In the server 13, arrow-shaped icons representing the cutting order can then be generated.
- In this case, an icon as a task object representing cutting is displayed as an operable [cutting icon] button.
- The detailed cutting information displayed on the “Details” screen includes, for example, the name of the vegetation to be cut, the cutting method (such as whether to remove the roots, cut at ground level, or cut at the height of the vegetables), strategies concerning other vegetation, and other work-related information.
- When the [Data Import] button is operated on the “Action List” screen, the “Import” screen is displayed on the display screen of the display unit 35.
- The sensor data selected by the user on the “import data list” screen are imported into the object DB of the storage 63 of the server 13.
- the “Import” screen has a [Next] button, and when this [Next] button is operated, a [Registered Data List] screen is displayed on the display screen of the display unit 35.
- the [Registered Data List] screen is also displayed when the [Registered Data] button is operated on the “Action List” screen.
- The [Registered Data List] screen displays, for example, the name of each field together with a [Nitric Acid] button, an [Electric Conductivity] button, and a [Soil Hardness] button.
- When the [Nitric Acid] button is operated, a “visual data display” screen is displayed showing symbols of the sensor data of the nitrate concentration sensed in the field selected by the user on the [Registered Data List] screen.
- Similarly, when the [Electric Conductivity] button is operated, a “visual data display” screen for electrical conductivity is displayed on the display screen of the display unit 35.
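- Data import can be thought of as turning each sensor sample into object information whose ecosystem object is a sensor-data symbol, so that the “visual data display” screen can place the symbols by position. A sketch reusing the hypothetical ObjectInfo record above; the kind names are assumptions:

```python
def import_sensor_data(object_db, field_info, samples, kind="nitric_acid"):
    """Import step: register one piece of object information per sensor sample.
    samples is an iterable of ((latitude, longitude), value) pairs."""
    for position, value in samples:
        object_db.append(ObjectInfo(
            ecosystem_object=f"{kind}_symbol",       # symbol shown on the visual data display
            position=position,
            registration_info={"kind": kind, "value": value},
            field_info=field_info,
        ))
```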
- Note that the processing performed by the computer (CPU) according to the program need not be performed in time series in the order described in the flowcharts. That is, the processing performed by the computer according to the program includes processing executed in parallel or individually (for example, parallel processing or object-based processing).
- The program may be processed by one computer (processor) or may be processed in a distributed manner by a plurality of computers. Furthermore, the program may be transferred to and executed by a remote computer.
- In this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether all the components are in the same housing. Accordingly, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
- The embodiments of the present technology are not limited to the above-described embodiments, and various modifications can be made without departing from the gist of the present technology. For example, the present technology can be applied to ecosystem management in general, in addition to supporting the cooperative farming method (registered trademark).
- For example, the present technology can take a cloud computing configuration in which one function is shared and processed jointly by a plurality of devices via a network.
- Each step described in the above flowcharts can be executed by one device or shared among a plurality of devices.
- Furthermore, when a plurality of processes are included in one step, those processes can be executed by one device or shared among a plurality of devices.
- Note that the present technology can take the following configurations.
- <1> An information processing apparatus including: an acquisition unit that acquires an ecosystem object representing an ecosystem component constituting an ecosystem of a field in which a plurality of types of vegetation coexist, and a task object representing a task to be performed on the ecosystem component; and a display control unit that performs display control to display the ecosystem object in AR (Augmented Reality) at a position in a predetermined background space corresponding to the actual position of the ecosystem component, and to display the task object in AR in the background space.
- <2> The information processing apparatus according to <1>, wherein a predetermined marker or GPS (Global Positioning System) position information is associated with the ecosystem object, and the display control unit performs display control to display the ecosystem object at a relative position referenced to the marker in the background space, or at the position represented by the position information.
- ⁇ 3> The information processing apparatus according to ⁇ 1> or ⁇ 2>, wherein the background space is a real real space, a photographing real space in which the real space is photographed, or a VR (Virtual Reality) space that models the real space .
- <4> The information processing apparatus according to any one of <1> to <3>, wherein the ecosystem object or the task object is edited according to a user operation, and the acquisition unit acquires the edited ecosystem object and task object.
- ⁇ 5> The information processing apparatus according to ⁇ 4>, wherein the ecosystem object is copied to another field as an edit of the ecosystem object.
- ⁇ 6> The information processing apparatus according to ⁇ 5>, wherein a task object representing a task to be performed on an ecosystem constituent represented by the ecosystem object copied to the other field is added.
- ⁇ 7> The information processing apparatus according to any one of ⁇ 1> to ⁇ 6>, wherein a display scale of the background space is changeable.
- ⁇ 8> The information processing apparatus according to any one of ⁇ 1> to ⁇ 7>, wherein two or more farmland images can be simultaneously displayed as the background space.
- ⁇ 9> The information processing apparatus according to ⁇ 2>, wherein one marker is associated with part or all of the ecosystem object of one field.
- ⁇ 10> The information processing apparatus according to ⁇ 2>, wherein two different markers are associated with the same ecosystem object.
- <11> The information processing apparatus according to <2>, wherein the marker associated with the ecosystem object is movable, and after the marker is moved, the ecosystem object is displayed at the same position as before the move.
- <12> The information processing apparatus according to any one of <1> to <11>, wherein the display control unit further performs display control to display the ecosystem object along one axis.
- <13> The information processing apparatus according to <12>, wherein the display control unit further performs display control to display the ecosystem object along a time axis.
- <14> The information processing apparatus according to any one of <1> to <13>, wherein the ecosystem object is a photograph of the ecosystem component, a symbol representing sensor data obtained by sensing the ecosystem component, or an icon representing the ecosystem component.
- ⁇ 15> The information processing apparatus according to any one of ⁇ 1> to ⁇ 14>, wherein the display control unit further performs display control for displaying the ecosystem object representing the ecosystem constituent relating to the entire field in a fixed position.
- <16> The information processing apparatus according to any one of <1> to <15>, wherein related information on the ecosystem component represented by the ecosystem object is linked to the ecosystem object, and the display control unit further performs display control to display the related information.
- ⁇ 17> The information processing apparatus according to any one of ⁇ 1> to ⁇ 16>, wherein the displayed ecosystem object or the task object is changed according to a user's skill level.
- ⁇ 18> The information processing apparatus according to any one of ⁇ 1> to ⁇ 17>, wherein the ecosystem component includes a dangerous area of a farm field or an area to be alerted.
- ⁇ 19> The information processing apparatus according to any one of ⁇ 1> to ⁇ 18>, wherein the ecosystem objects of one or more categories selected from a plurality of categories are displayed.
- <20> An information processing method including: acquiring an ecosystem object representing an ecosystem component constituting an ecosystem of a field in which a plurality of types of vegetation coexist, and a task object representing a task to be performed on the ecosystem component; and performing display control to display the ecosystem object in AR (Augmented Reality) at a position in a predetermined background space corresponding to the actual position of the ecosystem component, and to display the task object in AR in the background space.
- <21> A program causing a computer to function as: an acquisition unit that acquires an ecosystem object representing an ecosystem component constituting an ecosystem of a field in which a plurality of types of vegetation coexist, and a task object representing a task to be performed on the ecosystem component; and a display control unit that performs display control to display the ecosystem object in AR (Augmented Reality) at a position in a predetermined background space corresponding to the actual position of the ecosystem component, and to display the task object in AR in the background space.
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Human Resources & Organizations (AREA)
- Strategic Management (AREA)
- Economics (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- General Business, Economics & Management (AREA)
- Tourism & Hospitality (AREA)
- Computer Hardware Design (AREA)
- Computer Graphics (AREA)
- Marketing (AREA)
- Environmental Sciences (AREA)
- Primary Health Care (AREA)
- Health & Medical Sciences (AREA)
- Mining & Mineral Resources (AREA)
- Marine Sciences & Fisheries (AREA)
- Animal Husbandry (AREA)
- Agronomy & Crop Science (AREA)
- General Health & Medical Sciences (AREA)
- Entrepreneurship & Innovation (AREA)
- Biodiversity & Conservation Biology (AREA)
- Soil Sciences (AREA)
- Forests & Forestry (AREA)
- Ecology (AREA)
- Botany (AREA)
- Mechanical Engineering (AREA)
- Quality & Reliability (AREA)
- Architecture (AREA)
- Development Economics (AREA)
- Educational Administration (AREA)
- Operations Research (AREA)
- Game Theory and Decision Science (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Description
Free-text record: Even under the same land conditions, vegetable growth is enhanced in places where surrounding structures such as walls block the wind.
Related DBs: coordinate DB, yield DB, management record DB, vegetation DB, phenology DB
Free-text record: Lettuce germinates well in tilled land, but once no-till cultivation continues and a soil structure forms, it becomes difficult for it to germinate.
Related DBs: sowing DB, management record DB, vegetation DB, phenology DB
Free-text record: Even with the same nutrients in the soil, vegetables grow larger by competing with other vegetation.
Related DBs: vegetation DB, phenology DB, yield DB
Specific examples of such competitive growth include a carrot that grew large in competition with black nightshade, and a ridge where the summer grasses were not cut, in which the germination rate was poor but the individual vegetables grew larger.
Free-text record: In winter, depending on the vegetable species, plants change to a form that creeps flat along the ground and survive until spring in a shape that does not wither in the cold. They can be harvested even in this form.
Related DBs: vegetation DB, phenology DB, yield DB, weather DB
Free-text record: In winter, vegetables such as carrots and cabbages sometimes turn red. They can still be harvested after turning red.
Related DBs: vegetation DB, phenology DB, yield DB, weather DB
Free-text record: If the ground surface falls to 4°C or below immediately after germination in spring, late frost wipes out the seed leaves, so it is necessary to respond by re-sowing the entire area or transplanting seedlings.
Related DBs: sowing DB, management record DB, vegetation DB, phenology DB, weather DB
Claims (20)
- An information processing apparatus comprising: an acquisition unit that acquires an ecosystem object representing an ecosystem component constituting an ecosystem of a field in which a plurality of types of vegetation coexist, and a task object representing a task to be performed on the ecosystem component; and a display control unit that performs display control to display the ecosystem object in AR (Augmented Reality) at a position in a predetermined background space corresponding to the actual position of the ecosystem component, and to display the task object in AR in the background space.
- The information processing apparatus according to claim 1, wherein a predetermined marker or GPS (Global Positioning System) position information is associated with the ecosystem object, and the display control unit performs display control to display the ecosystem object at a relative position referenced to the marker in the background space, or at the position represented by the position information.
- The information processing apparatus according to claim 1, wherein the background space is an actual real space, a photographed real space obtained by shooting the real space, or a VR (Virtual Reality) space modeling the real space.
- The information processing apparatus according to claim 1, wherein the ecosystem object or the task object is edited according to a user operation, and the acquisition unit acquires the edited ecosystem object and task object.
- The information processing apparatus according to claim 4, wherein, as editing of the ecosystem object, the ecosystem object of one field is copied to another field.
- The information processing apparatus according to claim 5, wherein a task object representing a task to be performed on the ecosystem component represented by the ecosystem object copied to the other field is added.
- The information processing apparatus according to claim 1, wherein a display scale of the background space is changeable.
- The information processing apparatus according to claim 1, wherein images of two or more fields can be displayed simultaneously as the background space.
- The information processing apparatus according to claim 2, wherein one marker is associated with part or all of the ecosystem objects of one field.
- The information processing apparatus according to claim 2, wherein two different markers are associated with the same ecosystem object.
- The information processing apparatus according to claim 2, wherein the marker associated with the ecosystem object is movable, and after the marker is moved, the ecosystem object is displayed at the same position as before the move.
- The information processing apparatus according to claim 1, wherein the display control unit further performs display control to display the ecosystem object along one axis.
- The information processing apparatus according to claim 12, wherein the display control unit further performs display control to display the ecosystem object along a time axis.
- The information processing apparatus according to claim 1, wherein the ecosystem object is a photograph of the ecosystem component, a symbol representing sensor data obtained by sensing the ecosystem component, or an icon representing the ecosystem component.
- The information processing apparatus according to claim 1, wherein the display control unit further performs display control to display, at a fixed position, the ecosystem object representing an ecosystem component relating to the entire field.
- The information processing apparatus according to claim 1, wherein related information on the ecosystem component represented by the ecosystem object is linked to the ecosystem object, and the display control unit further performs display control to display the related information.
- The information processing apparatus according to claim 1, wherein the displayed ecosystem object or task object is changed according to the user's skill level.
- The information processing apparatus according to claim 1, wherein the ecosystem component includes a dangerous area of a field or an area requiring caution.
- The information processing apparatus according to claim 1, wherein the ecosystem objects of one or more categories selected from a plurality of categories are displayed.
- An information processing method comprising: acquiring an ecosystem object representing an ecosystem component constituting an ecosystem of a field in which a plurality of types of vegetation coexist, and a task object representing a task to be performed on the ecosystem component; and performing display control to display the ecosystem object in AR (Augmented Reality) at a position in a predetermined background space corresponding to the actual position of the ecosystem component, and to display the task object in AR in the background space.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017544447A JP6834966B2 (ja) | 2015-10-08 | 2016-09-23 | Information processing device, information processing method, and program |
EP16853435.2A EP3361448B1 (en) | 2015-10-08 | 2016-09-23 | Information processing device and information processing method |
US15/765,042 US11058065B2 (en) | 2015-10-08 | 2016-09-23 | Information processing device and information processing method |
US17/369,786 US11793119B2 (en) | 2015-10-08 | 2021-07-07 | Information processing device and information processing method |
US18/364,152 US20230380349A1 (en) | 2015-10-08 | 2023-08-02 | Information processing device and information processing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-199862 | 2015-10-08 | ||
JP2015199862 | 2015-10-08 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/765,042 A-371-Of-International US11058065B2 (en) | 2015-10-08 | 2016-09-23 | Information processing device and information processing method |
US17/369,786 Continuation US11793119B2 (en) | 2015-10-08 | 2021-07-07 | Information processing device and information processing method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017061281A1 true WO2017061281A1 (ja) | 2017-04-13 |
Family
ID=58487611
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/077941 WO2017061281A1 (ja) | 2015-10-08 | 2016-09-23 | Information processing device and information processing method |
Country Status (4)
Country | Link |
---|---|
US (3) | US11058065B2 (ja) |
EP (1) | EP3361448B1 (ja) |
JP (3) | JP6834966B2 (ja) |
WO (1) | WO2017061281A1 (ja) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- WO2017061281A1 (ja) | 2015-10-08 | 2017-04-13 | ソニー株式会社 | Information processing device and information processing method |
- CN113574538A (zh) * | 2019-03-07 | 2021-10-29 | 立体修剪股份公司 | System and method for assisting plant pruning |
US11270189B2 (en) * | 2019-10-28 | 2022-03-08 | International Business Machines Corporation | Cognitive decision platform for honey value chain |
EP4175459A1 (en) * | 2020-07-03 | 2023-05-10 | Signify Holding B.V. | Methods and systems for determining the growth stage of a plant |
US20230046882A1 (en) * | 2021-08-11 | 2023-02-16 | Deere & Company | Obtaining and augmenting agricultural data and generating an augmented display |
- KR20240023297A (ko) * | 2022-08-11 | 2024-02-21 | 붐앤드림베케이션 주식회사 | Meta space-time product trading device and method based on a meta space-time product coordinate generation device, and meta space-time product search and access device |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2013230088A (ja) * | 2012-04-27 | 2013-11-14 | Mitsubishi Electric Corp | Agricultural management system |
- WO2014007109A1 (ja) * | 2012-07-04 | 2014-01-09 | ソニー株式会社 | Farm work support device and method, program, recording medium, and farm work support system |
Family Cites Families (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5699244A (en) * | 1994-03-07 | 1997-12-16 | Monsanto Company | Hand-held GUI PDA with GPS/DGPS receiver for collecting agronomic and GPS position data |
US5566069A (en) * | 1994-03-07 | 1996-10-15 | Monsanto Company | Computer network for collecting and analyzing agronomic data |
US5721679A (en) * | 1995-12-18 | 1998-02-24 | Ag-Chem Equipment Co., Inc. | Heads-up display apparatus for computer-controlled agricultural product application equipment |
US20060106539A1 (en) * | 2004-11-12 | 2006-05-18 | Choate Paul H | System and method for electronically recording task-specific and location-specific information, including farm-related information |
EP2391881B1 (en) | 2009-02-02 | 2023-07-05 | Planetary Emissions Management | System of systems for monitoring greenhouse gas fluxes |
US8830267B2 (en) * | 2009-11-16 | 2014-09-09 | Alliance For Sustainable Energy, Llc | Augmented reality building operations tool |
CA3108569A1 (en) * | 2010-10-05 | 2012-04-12 | Bayer Cropscience Lp | A system and method of establishing an agricultural pedigree for at least one agricultural product |
US8511936B2 (en) * | 2010-12-02 | 2013-08-20 | Rensselaer Polytechnic Institute | Method and apparatus for coastline remediation, energy generation, and vegetation support |
US8694454B2 (en) * | 2011-02-17 | 2014-04-08 | Superior Edge, Inc. | Methods, apparatus and systems for generating, updating and executing a vegetation control plan |
US8721337B2 (en) * | 2011-03-08 | 2014-05-13 | Bank Of America Corporation | Real-time video image analysis for providing virtual landscaping |
US10699222B2 (en) * | 2011-04-15 | 2020-06-30 | Basf Agro Trademarks Gmbh | Visual information system and computer mobility application for field personnel |
- WO2013030965A1 (ja) * | 2011-08-30 | 2013-03-07 | 富士通株式会社 | Imaging device, imaging support program, information provision method, and information provision program |
US8941560B2 (en) * | 2011-09-21 | 2015-01-27 | Google Inc. | Wearable computer with superimposed controls and instructions for external device |
US8957916B1 (en) * | 2012-03-23 | 2015-02-17 | Google Inc. | Display method |
US8761811B2 (en) * | 2012-04-27 | 2014-06-24 | Oracle International Corporation | Augmented reality for maintenance management, asset management, or real estate management |
- KR20130132050 (ko) * | 2012-05-25 | 2013-12-04 | 한국전자통신연구원 | Platform device for an agricultural environment control system |
US10520482B2 (en) * | 2012-06-01 | 2019-12-31 | Agerpoint, Inc. | Systems and methods for monitoring agricultural products |
US20130321245A1 (en) * | 2012-06-04 | 2013-12-05 | Fluor Technologies Corporation | Mobile device for monitoring and controlling facility systems |
US9113590B2 (en) * | 2012-08-06 | 2015-08-25 | Superior Edge, Inc. | Methods, apparatus, and systems for determining in-season crop status in an agricultural crop and alerting users |
US9129429B2 (en) * | 2012-10-24 | 2015-09-08 | Exelis, Inc. | Augmented reality on wireless mobile devices |
WO2014100502A1 (en) * | 2012-12-19 | 2014-06-26 | Alan Shulman | Methods and systems for automated micro farming |
US20140200690A1 (en) * | 2013-01-16 | 2014-07-17 | Amit Kumar | Method and apparatus to monitor and control conditions in a network - integrated enclosed ecosytem for growing plants |
- JP6059027B2 (ja) * | 2013-01-21 | 2017-01-11 | 株式会社クボタ | Farm work machine and farm work management program |
WO2014146046A1 (en) * | 2013-03-15 | 2014-09-18 | Alain Poivet | Intelligent energy and space management |
US9552675B2 (en) * | 2013-06-03 | 2017-01-24 | Time Traveler App Llc | Display application and perspective views of virtual space |
- JP5825328B2 (ja) * | 2013-11-07 | 2015-12-02 | コニカミノルタ株式会社 | Information display system having a transmissive HMD, and display control program |
US10068354B2 (en) * | 2014-01-02 | 2018-09-04 | Deere & Company | Obtaining and displaying agricultural data |
CA2932744A1 (en) * | 2014-01-08 | 2015-07-16 | Precisionhawk Inc. | Method and system for generating augmented reality agricultural presentations |
- JP6265027B2 (ja) * | 2014-04-22 | 2018-01-24 | 富士通株式会社 | Display device, position specifying program, and position specifying method |
US9996976B2 (en) * | 2014-05-05 | 2018-06-12 | Avigilon Fortress Corporation | System and method for real-time overlay of map features onto a video feed |
US20150325047A1 (en) * | 2014-05-06 | 2015-11-12 | Honeywell International Inc. | Apparatus and method for providing augmented reality for maintenance applications |
- AR101678A1 (es) | 2014-09-11 | 2017-01-04 | Sony Corp | Information processing device, information processing method, and non-transitory computer-readable storage medium storing a program |
EP3193303B1 (en) * | 2014-09-11 | 2024-01-31 | Sony Group Corporation | System and method for generating a relationship graphs |
US9652840B1 (en) * | 2014-10-30 | 2017-05-16 | AgriSight, Inc. | System and method for remote nitrogen monitoring and prescription |
US9667710B2 (en) * | 2015-04-20 | 2017-05-30 | Agverdict, Inc. | Systems and methods for cloud-based agricultural data processing and management |
- JP6357140B2 (ja) * | 2015-09-18 | 2018-07-11 | Psソリューションズ株式会社 | Image determination method |
WO2017061281A1 (ja) | 2015-10-08 | 2017-04-13 | ソニー株式会社 | 情報処理装置、及び、情報処理方法 |
US10628895B2 (en) * | 2015-12-14 | 2020-04-21 | The Climate Corporation | Generating digital models of relative yield of a crop based on nitrate values in the soil |
US10331931B2 (en) * | 2016-02-05 | 2019-06-25 | The Climate Corporation | Modeling trends in crop yields |
2016
- 2016-09-23 WO PCT/JP2016/077941 patent/WO2017061281A1/ja active Application Filing
- 2016-09-23 JP JP2017544447A patent/JP6834966B2/ja active Active
- 2016-09-23 US US15/765,042 patent/US11058065B2/en active Active
- 2016-09-23 EP EP16853435.2A patent/EP3361448B1/en active Active
2021
- 2021-02-04 JP JP2021016251A patent/JP7060119B2/ja active Active
- 2021-07-07 US US17/369,786 patent/US11793119B2/en active Active
2022
- 2022-04-06 JP JP2022063269A patent/JP7405179B2/ja active Active
2023
- 2023-08-02 US US18/364,152 patent/US20230380349A1/en active Pending
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- WO2018198316A1 (ja) * | 2017-04-28 | 2018-11-01 | 株式会社オプティム | System, method, and program for providing information for good agricultural practice (GAP) management |
- KR20200046381A (ko) * | 2018-10-24 | 2020-05-07 | 디에스글로벌 (주) | Method and system for providing AR content based on location information and time information |
- KR102199686B1 (ko) | 2018-10-24 | 2021-01-07 | 디에스글로벌(주) | Method and system for providing AR content based on location information and time information |
US11212954B2 (en) * | 2019-05-08 | 2022-01-04 | Deere & Company | Apparatus and methods for field operations based on historical field operation data |
US11825761B2 (en) | 2019-05-08 | 2023-11-28 | Deere & Company | Apparatus and methods for field operations based on historical field operation data |
- WO2021177186A1 (ja) | 2020-03-06 | 2021-09-10 | ソニーグループ株式会社 | Information processing device, information processing method, and information processing program |
- WO2021176892A1 (ja) | 2020-03-06 | 2021-09-10 | ソニーグループ株式会社 | Information processing device, information processing method, and information processing program |
- JP2021141855A (ja) * | 2020-03-12 | 2021-09-24 | 本田技研工業株式会社 | Information processing device, information provision system, and information processing method |
- JP7337011B2 (ja) | 2020-03-12 | 2023-09-01 | 本田技研工業株式会社 | Information processing device, information provision system, and information processing method |
- WO2023002658A1 (ja) * | 2021-07-20 | 2023-01-26 | ソニーグループ株式会社 | Program, information processing device, and information processing method |
- WO2023157687A1 (ja) * | 2022-02-21 | 2023-08-24 | ソニーグループ株式会社 | Information processing device, information processing method, and program |
- WO2023190916A1 (ja) * | 2022-03-30 | 2023-10-05 | 都市緑地株式会社 | Method of interaction with producers/sellers via a farm in virtual space, server device, and system linking farm-related products with virtual space |
- WO2024135338A1 (ja) * | 2022-12-23 | 2024-06-27 | ソニーグループ株式会社 | Information processing device, information processing method, and program |
- WO2024154382A1 (ja) * | 2023-01-16 | 2024-07-25 | ソニーグループ株式会社 | Information processing device, information processing method, and program |
Also Published As
Publication number | Publication date |
---|---|
JP7060119B2 (ja) | 2022-04-26 |
EP3361448B1 (en) | 2023-09-06 |
US20180271027A1 (en) | 2018-09-27 |
US11793119B2 (en) | 2023-10-24 |
JP2022120848A (ja) | 2022-08-18 |
US11058065B2 (en) | 2021-07-13 |
EP3361448A1 (en) | 2018-08-15 |
US20210329846A1 (en) | 2021-10-28 |
US20230380349A1 (en) | 2023-11-30 |
JP7405179B2 (ja) | 2023-12-26 |
JPWO2017061281A1 (ja) | 2018-08-30 |
JP6834966B2 (ja) | 2021-02-24 |
EP3361448A4 (en) | 2019-02-20 |
JP2021073616A (ja) | 2021-05-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- JP7060119B2 (ja) | Information processing device, information processing method, information processing system, and program | |
- JP6512463B2 (ja) | Farm work support method, farm work support system, and program | |
- WO2016039176A1 (ja) | Information processing device, information processing method, and program | |
- WO2016039174A1 (ja) | Information processing device, information processing method, and program | |
US20140009600A1 (en) | Mobile device, computer product, and information providing method | |
- WO2016039175A1 (ja) | Information processing device, information processing method, and program | |
Wyckhuys et al. | Influence of extra-field characteristics to abundance of key natural enemies of Spodoptera frugiperda Smith (Lepidoptera: Noctuidae) in subsistence maize production | |
Mohapatra et al. | NRRI'riceXpert'APP: TAKING RICE TECHNOLOGIESIN THE DOORSTEP OF FARMERS | |
Rajasekaran et al. | Intelligent smart farming and crop visualization | |
Lalrochunga et al. | GIS-Based Image Processing for Farmland Site Selection: An Eastern Himalayan Region Perspective | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16853435; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2017544447; Country of ref document: JP; Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref document number: 15765042; Country of ref document: US |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 2016853435; Country of ref document: EP |