WO2020261028A1 - Information Processing System and Information Processing Method - Google Patents
Information processing system and information processing method
- Publication number: WO2020261028A1 (application PCT/IB2020/055509)
- Authority: WO, WIPO (PCT)
Classifications
- G06V 20/20: Scenes; scene-specific elements in augmented reality scenes
- G06V 20/68: Type of objects; food, e.g. fruit or vegetables
- G06V 10/82: Image or video recognition or understanding using pattern recognition or machine learning using neural networks
- G06F 1/163: Constructional details or arrangements for portable computers; wearable computers, e.g. on a belt
- G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F 3/012: Head tracking input arrangements
- G06F 3/013: Eye tracking input arrangements
- G06F 3/017: Gesture-based interaction, e.g. based on a set of recognized hand gestures
- G06F 3/0482: Interaction with lists of selectable items, e.g. menus
- G06F 3/167: Audio in a user interface, e.g. using voice commands for navigating, audio feedback
- G06F 16/538: Presentation of query results (information retrieval of still image data)
- G06F 16/583: Retrieval characterised by using metadata automatically derived from the content
- G06Q 50/10: Services (ICT specially adapted for specific business sectors)
- G09B 5/02: Electrically-operated educational appliances with visual presentation of the material to be studied
- G09B 19/003: Repetitive work cycles; sequence of movements
Definitions
- One aspect of the present invention relates to an information processing system, an information processing device, and an information processing method. Further, one aspect of the present invention relates to a cooking assist system, a cooking assist device, and a cooking assist method.
- In recent years, cooking recipe services delivered through information terminals such as smartphones and tablets have gained popularity (Patent Document 1).
- The user can keep an information terminal displaying the recipe at hand and cook while referring to it.
- Object detection is a technique that extracts, as a bounding box, a portion of an image in which an object is expected to appear, and recognizes the object within that rectangle (Patent Document 2). Methods such as R-CNN (Regions with Convolutional Neural Networks), YOLO (You Only Look Once), and SSD (Single Shot MultiBox Detector) have been proposed.
- A technique called semantic segmentation has also been proposed, in which an image is divided into regions using a neural network and a label is assigned to each region (Patent Document 3). Methods such as FCN (Fully Convolutional Network), SegNet, U-Net, and PSPNet (Pyramid Scene Parsing Network) have been proposed.
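While the patent only names these methods, inference with such a segmentation network can be sketched as follows; this is a minimal illustration using torchvision's pretrained FCN (torchvision 0.13 or later assumed; the input file name is hypothetical):

```python
import torch
from torchvision import models, transforms
from PIL import Image

# Minimal semantic-segmentation inference with a pretrained FCN backbone.
model = models.segmentation.fcn_resnet50(weights="DEFAULT").eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open("kitchen_scene.jpg").convert("RGB")  # hypothetical input
batch = preprocess(image).unsqueeze(0)

with torch.no_grad():
    output = model(batch)["out"]            # (1, num_classes, H, W)
labels = output.argmax(dim=1).squeeze(0)    # per-pixel class label map
```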
- Information terminals such as smartphones and tablets are generally operated through a touch panel. When the user cooks while looking at such a terminal, the terminal must be operated at each step of the cooking process. However, operating a touch panel with wet hands or with hands to which ingredients are attached is difficult and inconvenient. Touching the terminal with such hands also soils the terminal, and the dirt can cause the terminal to malfunction, which is undesirable.
- A kitchen also contains a water supply and fire sources such as a stove, so bringing an information terminal into the kitchen may cause it to malfunction due to water or fire. Electromagnetic fields, such as those generated by induction heating cookers, may likewise cause the information terminal to malfunction or fail.
- One aspect of the present invention is to provide an information processing system, an information processing device, and an information processing method capable of obtaining information without using the hands. Another aspect of the present invention is to provide a cooking assist system, a cooking assist device, and a cooking assist method capable of obtaining information without using the hands.
- One aspect of the present invention is an information processing system comprising a wearable device having a display means and an imaging means, and a database connected to the wearable device via a network, the database holding at least one of information about cooking recipes, cooking methods, and ingredients. When the wearable device detects a first material with the imaging means, the wearable device collects information about the first material from the database. The information about the first material is displayed on the display means while the first material is present in a specific area of the range imaged by the imaging means, and is hidden from the display means when the first material is not present in that area.
- In the above aspect, the information about the first material preferably includes the cutting position of the first material.
- The information about the first material preferably includes the positions of bones contained in the first material.
- The wearable device is preferably a glasses-type wearable device.
- The database is preferably stored on a server.
- Another aspect of the present invention is an information processing system comprising a wearable device having a display means and an imaging means, and a cooking utensil having a temperature sensor. The wearable device and the temperature sensor are connected via a first network. The wearable device collects information about the temperature inside the cooking utensil from the temperature sensor, displays the temperature information on the display means while the cooking utensil is present in a specific area of the range imaged by the imaging means, and hides the temperature information from the display means when no cooking utensil is present in that area.
- In the above aspect, the information processing system preferably further has a database connected to the wearable device and the temperature sensor via a second network that includes the first network. The database receives the temperature information via the second network, calculates the time required to heat the cooking utensil from the temperature information, and displays it on the display means.
- The wearable device is preferably a glasses-type wearable device.
- The database is preferably stored on a server.
- Another aspect of the present invention is an information processing method using a wearable device provided with a display means and an imaging means, in which the user sees materials and cooking utensils through the display means. The method includes a step of detecting, with the imaging means, a cutting board present in the user's line of sight; a step of identifying a first material placed on the cutting board; a step of displaying a cooking method on the display means; and a step of displaying the cutting position of the first material on the display means so that it overlaps the first material in the user's line of sight.
- The information processing method preferably further includes a step of displaying, on the display means, the position of a foreign substance present on the surface of or inside the material so that it overlaps the material in the user's line of sight.
- The foreign substance is preferably one selected from bones, scales, parasites, and hair.
- The wearable device is preferably a glasses-type wearable device.
- According to one aspect of the present invention, it is possible to provide an information processing system, an information processing device, and an information processing method capable of obtaining information without using the hands. It is likewise possible to provide a cooking assist system, a cooking assist device, and a cooking assist method capable of obtaining information without using the hands.
- FIG. 1 is a schematic view illustrating an example of how to use the AR glass according to one aspect of the present invention.
- FIG. 2 is a block diagram illustrating one aspect of the present invention.
- FIG. 3 is a flowchart illustrating one aspect of the present invention.
- FIGS. 4A and 4B are schematic views illustrating one aspect of the present invention.
- FIGS. 5A and 5B are schematic views illustrating one aspect of the present invention.
- FIGS. 6A and 6B are schematic views illustrating one aspect of the present invention.
- FIG. 7 is a schematic diagram illustrating an example of a method for displaying a guideline, which is one aspect of the present invention.
- FIG. 8A is a schematic diagram illustrating a method of causing a neural network to learn teacher data by using machine learning, which is one aspect of the present invention.
- FIG. 8B is a schematic diagram illustrating a method of generating an image using a trained neural network, which is one aspect of the present invention.
- FIG. 9 is a schematic diagram illustrating a method of generating an image according to one aspect of the present invention.
- FIG. 10A is a schematic diagram illustrating a method of causing a neural network to learn teacher data by using machine learning, which is one aspect of the present invention.
- FIG. 10B is a schematic diagram illustrating a method of generating an image using a trained neural network, which is one aspect of the present invention.
- FIGS. 11A and 11B are schematic views illustrating one aspect of the present invention.
- FIGS. 12A and 12B are schematic views illustrating an example of a method of displaying a guideline, which is one aspect of the present invention.
- FIGS. 13A and 13B are schematic views illustrating one aspect of the present invention.
- FIGS. 14A and 14B are schematic views illustrating a method for detecting a cooking utensil, which is one aspect of the present invention.
- FIGS. 15A and 15B are schematic views illustrating one aspect of the present invention.
- (Embodiment 1) In the present embodiment, the user cooks while using an information terminal. As the information terminal, one that can be worn by the user is preferable, and an example of using a wearable device as the information terminal is shown. The wearable device shown in the present embodiment is preferably a glasses-type wearable device; such a wearable device may be referred to as an AR glass.
- FIG. 1 shows an example in which a user cooks while wearing the AR glass 10. Here, AR is an abbreviation for Augmented Reality. The AR glass 10 can superimpose information such as images and text on the outside world seen by the user, and can present to the user the information necessary for cooking, such as information on cooking recipes, cooking methods, and ingredients.
- Here, the ingredients include plant-based materials such as grains, vegetables, fruits, and seaweed; animal-based materials such as seafood, meat, eggs, dairy products, and bones; and seasonings, flavorings, oils, food additives, and other materials necessary for cooking. A material may also be called a food material. The user can acquire the information necessary for cooking through the AR glass 10 even when both hands are occupied.
- The AR glass 10 is composed of a glass portion 10a, a housing 10b, and wiring 10c that connects the glass portion 10a and the housing 10b.
- In FIG. 1, the housing 10b is stored in a pocket of an apron. If the glass portion 10a and the housing 10b are capable of wireless communication, the wiring 10c may be omitted. Further, when all the elements required for the AR glass 10 fit in the glass portion 10a, the housing 10b and the wiring 10c may be omitted.
- FIG. 2 is a block diagram showing a configuration example of the AR glass 10.
- The AR glass 10 includes a first camera module 11, a second camera module 12, a controller 13, a display unit 14, a processor 15, a memory 16, a communication module 17, an audio controller 18, a microphone 19, a speaker 20, a battery 21, a sensor 25, and a bus 22.
- The first camera module 11, the second camera module 12, the controller 13, the processor 15, the memory 16, the communication module 17, the sensor 25, and the audio controller 18 exchange data via the bus 22.
- the first camera module 11 has a function of acquiring the user's field of view as an image. Further, the second camera module 12 has a function of capturing the movement of the user's eyeball and detecting the user's line of sight.
- The imaging direction of the first camera module 11 substantially matches the user's line of sight, and its imaging range preferably includes the field of view that the user can see through the AR glass 10. That is, it is preferable that a specific region of the imaging range of the first camera module 11 matches the field of view that the user sees through the AR glass 10.
- the audio controller 18 has a function of analyzing the audio signal acquired by the microphone 19 and converting it into a digital signal. It also has a function of generating an audio signal to be output to the speaker 20.
- the controller 13 has a function of generating an image to be displayed on the display unit 14.
- the communication module 17 has a function of communicating with a database via a network such as the Internet.
- the AR glass 10 can communicate with the database via the network.
- the data downloaded from the database is stored in the memory 16.
- the database is preferably stored on the server 23.
- the AR glass 10 uses the communication module 17 to connect to the server 23 via the network.
- The AR glass 10 may also connect to the server 23 indirectly, via a device that is connected to both the AR glass 10 and the server 23 over a network. As such a device, a desktop computer 26, a laptop computer 27, a tablet computer 28 (tablet terminal), a smartphone 29, or the like can be used.
- the database may be stored in the above device.
- the user can register user information in the AR glass 10 and the database, set usage conditions, register cooking utensils used by the user, and the like using the above device.
- the server 23 preferably performs machine learning using the teacher data.
- the AR glass 10 can download the learning result obtained by the above learning from the server 23 and store it in the memory 16.
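As a rough illustration of this flow, in which the AR glass 10 downloads a learning result from the server 23 and stores it locally, the following Python sketch assumes a PyTorch model; the URL, local path, and file name are hypothetical:

```python
import requests
import torch

# Hypothetical endpoint on server 23 serving the trained weights.
WEIGHTS_URL = "https://server23.example/models/guideline_net.pt"
LOCAL_PATH = "/data/ar_glass/guideline_net.pt"  # stand-in for memory 16

def fetch_trained_weights(model: torch.nn.Module) -> torch.nn.Module:
    response = requests.get(WEIGHTS_URL, timeout=30)
    response.raise_for_status()
    with open(LOCAL_PATH, "wb") as f:
        f.write(response.content)           # store the result locally
    state = torch.load(LOCAL_PATH, map_location="cpu")
    model.load_state_dict(state)            # use the server-trained result
    return model.eval()
```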
- the battery 21 has a function of supplying power to each device constituting the AR glass 10.
- the processor 15 has a function of integrally controlling the device connected to the bus 22.
- the processor 15 comprehensively determines the information acquired from the first camera module 11, the second camera module 12, the communication module 17, and the audio controller 18, and gives an instruction to the controller 13.
- the controller 13 receives an instruction from the processor 15 to generate image data and display it on the display unit 14.
- For the display unit 14, a see-through panel that transmits external light can be used. Examples of the see-through panel include an organic EL (Electro Luminescence) display and a liquid crystal display.
- the pixel density of the see-through panel can be 1000 ppi or more and 50000 ppi or less, preferably 2000 ppi or more and 50000 ppi or less, more preferably 3000 ppi or more and 50000 ppi or less, and further preferably 5000 ppi or more and 50000 ppi or less.
- the pixel density can be 4500 ppi or more and 5500 ppi or less, 5500 ppi or more and 6500 ppi or less, or 6500 ppi or more and 7500 ppi or less.
- the number of pixels in the see-through panel can be, for example, 1000 or more and 20000 or less, preferably 2000 or more and 20000 or less, and more preferably 3000 or more and 20000 or less in the scanning line direction or the signal line direction.
- The shape of the display area can be close to a square (a ratio of horizontal to vertical length of 0.8 or more and 1.2 or less), or can be a horizontally long rectangle (for example, a ratio of horizontal to vertical length of 1.5 or more and 5.0 or less).
- The see-through panel may also conform to a television standard with an aspect ratio of 16:9, in which case its resolution may be the FHD standard, the 4K2K standard, or the 8K4K standard.
- the display unit 14 may have a device for projecting an image on a reflector arranged in front of the user's eyes.
- In this case, the display unit 14 has an optical member such as a light guide plate and a half mirror, and a light-emitting element. Examples of the light-emitting element include an organic EL element, an LED (Light Emitting Diode) element, and an inorganic EL element.
- the display unit 14 may have a retinal projection type display device.
- the retinal projection type display device is a device that projects an image on the retina by irradiating the user's retina with a low-intensity laser beam.
- the display unit 14 includes a laser oscillator, an optical system (light guide plate, reflector, half mirror, etc.) and the like. Further, the laser oscillator is preferably controlled by MEMS (Micro Electro Mechanical System).
- The user wearing the AR glass 10 can see the ingredients and utensils necessary for cooking through the display unit 14. The display unit 14 displays information about the material or cooking utensil that the user is looking at so that, from the user's point of view, the information overlaps with the ingredients and cooking utensils.
- the AR glass 10 may also have a sensor 25 such as an acceleration sensor, a gyro sensor, a temperature sensor, and an electro-oculography sensor.
- the electro-oculography sensor is a sensor that detects a potential change caused by the movement of the eye, and is provided on the glass portion 10a. By having the electro-oculography sensor, the AR glass 10 can analyze the movement of the user's eyes and track the user's line of sight.
- the AR glass 10 may include the processor 15, the memory 16, the communication module 17, and the battery 21 in the housing 10b, and may include other components in the glass portion 10a. By doing so, the weight of the glass portion 10a can be reduced, and the burden felt by the user can be reduced.
- only the battery 21 may be included in the housing 10b, and other components may be included in the glass portion 10a, if it does not burden the user.
- By arranging the devices connected to the bus 22 close to one another, the processing speed can be improved and power can be saved. Furthermore, if the battery 21 can be miniaturized sufficiently, all the structural elements of the AR glass 10 may be included in the glass portion 10a; in that case, the housing 10b and the wiring 10c are unnecessary.
- FIG. 3 is a flowchart for explaining the processing flow of the system.
- The flowchart of FIG. 3 includes steps S0 to S10. Below, the details of the system are described step by step, starting from step S0.
- Here, an example in which the AR glass 10 is used to prepare fish is described.
- The field of view 50 shown in FIG. 4A represents the field of view that the user sees through the AR glass 10 immediately after the system is started. In FIG. 4A, the cutting board 60 and the material 62, as well as the kitchen knife 63 and the dish towel 64, are present in the field of view 50. Here, an example in which a fish is used as the material 62 is shown.
- the imaging range of the first camera module 11 preferably includes the field of view 50. That is, it is preferable that a specific region of the imaging range of the first camera module 11 coincides with the field of view 50.
- The cutting board 60 is preferably a plain one with few stains and scratches. In FIG. 4A, markers 61a, 61b, 61c, and 61d are arranged at the four corners of the cutting board 60. The markers 61a to 61d may be printed directly on the cutting board 60, or may be attached to it by the user as stickers or the like. The number of markers is not limited to four; for example, markers may be arranged at only three corners of the cutting board 60.
- In step S1, the system detects the position of the cutting board 60 by detecting the markers 61a to 61d using the first camera module 11. The system then restricts the analysis target to the region surrounded by the markers 61a to 61d and excludes the region outside it; for example, as shown in FIG. 4B, the shaded areas in the figure are excluded from analysis.
- The system recognizes the material 62 by image analysis. If objects present around the cutting board 60, such as the kitchen knife 63 and the dish towel 64, were included in the image analysis, they would adversely affect the recognition accuracy of the material 62. As shown in FIG. 4B, excluding the surroundings of the cutting board 60 from the analysis target improves the recognition accuracy of the material 62, and narrowing the image area to be analyzed also enables faster image analysis.
- Alternatively, the region of the cutting board 60 may be detected without using markers, by using an object detection method such as R-CNN, YOLO, or SSD described in the background art, or a semantic segmentation method such as FCN, SegNet, U-Net, or PSPNet.
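The patent does not specify the marker format, but the marker-based detection of step S1 can be sketched with ArUco fiducials; a minimal Python/OpenCV example (OpenCV 4.7+ ArUco API assumed; the marker dictionary and file name are illustrative):

```python
import cv2
import numpy as np

# Detect fiducial markers at the board corners and keep only the
# region they enclose, excluding everything around the cutting board.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary)

frame = cv2.imread("camera_frame.png")          # image from camera module 11
corners, ids, _ = detector.detectMarkers(frame)

if ids is not None and len(ids) >= 3:
    # Use the center of each marker as a corner of the analysis region.
    centers = np.array([c[0].mean(axis=0) for c in corners], dtype=np.int32)
    mask = np.zeros(frame.shape[:2], dtype=np.uint8)
    cv2.fillPoly(mask, [cv2.convexHull(centers)], 255)
    roi = cv2.bitwise_and(frame, frame, mask=mask)  # outside area excluded
```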
- In step S2, the system identifies the type of the material 62. A rectangle 65 is displayed so as to surround the material 62, and candidates for the material 62 are displayed in a menu 70. The object detection methods described above are preferably used to identify the material 62, and the menu 70 preferably lists candidates in descending order of probability.
- The user can select the type of the material 62 from the candidates displayed in the menu 70.
- The user can select a menu item by voice input. For example, when the user utters "horse mackerel", the system analyzes the user's voice and determines that "horse mackerel" has been selected.
- The user can also select a menu item by gaze. For example, the user stares at the "horse mackerel" item displayed in the menu 70; the AR glass 10 detects the user's line of sight and determines that "horse mackerel" has been selected.
- The user's line of sight can be detected using the second camera module 12. From the image of the user's eyeball acquired by the second camera module 12, the line of sight can be detected by analyzing the positional relationship between the inner corner of the eye, used as a reference point, and the iris.
- The line of sight can also be detected by irradiating the user's eye with infrared rays emitted from an infrared source and analyzing the positional relationship between the reflection position on the cornea (called the corneal reflex), used as a reference point, and the pupil.
- In this case, an infrared source such as an infrared LED and an infrared detector such as an infrared camera are used. Alternatively, the AR glass 10 may track the user's line of sight by analyzing the user's electro-oculography.
- A pointer 51 may be displayed on the menu 70 in accordance with the user's line of sight. By displaying the pointer 51, the user can confirm whether the intended item is selected. The pointer 51 may be displayed only while an item is being selected from the menu 70.
- The sensor 25 may be used to detect a movement of the user that shows or hides the pointer 51. For example, a user wearing the AR glass 10 can display the pointer 51 by nodding the head up and down, and hide it by shaking the head left and right.
- In this way, the user can select a menu item without using the hands, and can therefore operate the system even when both hands are occupied during cooking.
- Alternatively, the movement of the user's hand may be detected using the first camera module 11. The user can select a desired item by positioning a hand, particularly a finger, so that it overlaps the desired item of the menu 70 on the display unit 14. A known gesture recognition method, such as finger gesture recognition, may be combined to confirm the selected item.
- In any of these ways, the user can operate the system without touching the information terminal. This suppresses failures and malfunctions caused by touching the terminal with a wet hand or a hand with ingredients attached, and avoiding contact with the terminal while cooking is also preferable in terms of food hygiene.
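The gaze-based selection described above can be sketched as a dwell-time hit test: an item is selected when the detected line of sight stays on it long enough. A minimal Python sketch (the menu geometry, gaze source, and dwell threshold are illustrative assumptions):

```python
import time

# An item is chosen when the gaze point stays inside its bounding
# box for a fixed duration.
DWELL_SECONDS = 1.0

def hit_test(gaze_xy, items):
    """Return the menu item whose box contains the gaze point, if any."""
    x, y = gaze_xy
    for item in items:
        x0, y0, x1, y1 = item["box"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return item
    return None

def select_by_dwell(gaze_stream, items):
    current, since = None, None
    for gaze_xy in gaze_stream:          # e.g. from the eye-tracking camera
        hit = hit_test(gaze_xy, items)
        if hit is not current:
            current, since = hit, time.monotonic()
        elif hit is not None and time.monotonic() - since >= DWELL_SECONDS:
            return hit["label"]          # e.g. "horse mackerel"
```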
- The menu 70 and the pointer 51 are preferably displayed while the material 62 is present in the field of view 50. When the material 62 leaves the field of view 50 because the user moves the line of sight or the material 62 is moved, the menu 70 and the pointer 51 may be hidden, or they may be hidden after a certain period of time has elapsed with the material 62 absent from the field of view 50.
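This show/hide behavior, which is reused later for the recipe display and the cooking-utensil information, amounts to a small state machine; a minimal Python sketch (the grace period is an assumed value):

```python
import time

# Information about a target is shown while the target is detected inside
# the specific display area, and hidden after a grace period once it leaves.
HIDE_AFTER_SECONDS = 3.0

class OverlayState:
    def __init__(self):
        self.visible = False
        self.last_seen = None

    def update(self, target_in_view: bool) -> bool:
        now = time.monotonic()
        if target_in_view:
            self.visible, self.last_seen = True, now
        elif self.visible and self.last_seen is not None:
            if now - self.last_seen >= HIDE_AFTER_SECONDS:
                self.visible = False
        return self.visible  # caller draws the menu/pointer only if True
```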
- In step S3, the system displays recipe candidates that use the ingredient 62; recipe candidates using horse mackerel are displayed in the menu 70. The user can select a recipe from the menu 70 in the same manner as in step S2. Each recipe preferably displays health-related information such as calories, salt, and sugar, together with the necessary ingredients and seasonings.
- The user can also register the number of servings, taste preferences, and the like when selecting a recipe.
- In step S4, the system displays the recipe selected in step S3. The recipe 71 is displayed in the upper part of the field of view 50; in the recipe 71, the number of each step and an image explaining each step are displayed.
- When a step in the recipe 71 is selected in the same manner as in step S2, the image of the selected step is displayed in a larger size (FIG. 6B). In addition, a text 72 explaining the step and a text 73 giving one-point advice are displayed in the lower part of the field of view 50.
- The images displayed in the recipe 71 are not limited to still images; moving images may also be used.
- The recipe 71, the text 72, the text 73, and the pointer 51 are preferably displayed while at least one of the material 62 and the cutting board 60 is present in the field of view 50. When neither the material 62 nor the cutting board 60 is present in the field of view 50, because the user moves the line of sight or the material 62 or the cutting board 60 is moved, these items may be hidden, or they may be hidden after a certain period of time has elapsed in that state.
- In step S5, the user selects whether to enable or disable the kitchen knife guideline, a display function that indicates the cutting position of the material. If the kitchen knife guideline is enabled, in step S6 the system displays the guideline indicating the proper cutting position of the material.
- FIG. 7 shows the field of view 50 when the kitchen knife guideline is displayed. In FIG. 7, the user is filleting the fish (material 62) with a kitchen knife, and the system displays the guideline 75 so as to overlap the material 62. The user can cut and process the material 62 correctly by inserting the knife blade along the guideline 75. By displaying the guideline 75 three-dimensionally along the shape of the material 62, the user can cut or process the material 62 more easily.
- A method by which the system displays the kitchen knife guideline is described with reference to FIGS. 8A and 8B. The kitchen knife guideline is preferably generated using machine learning with a neural network.
- FIG. 8A shows a method of training a neural network with teacher data; the training is preferably performed on the server 23 shown in FIG. 2. FIG. 8B shows a method of generating a target image using the trained neural network; this process is performed inside the AR glass 10. The AR glass 10 downloads the trained neural network from the server 23 and generates the target image.
- A data set 80 is prepared as teacher data. The data set 80 has an image set 81 and an image set 82. The image set 81 contains images of a plurality of materials, and the image set 82 is obtained by adding to each image of the image set 81 a start point 83 and an end point 84, each represented by a circle. The neural network 85 is trained on the data set 80.
- As the neural network 85, a neural network used for image generation, such as an autoencoder, CAE (Convolutional Autoencoder), VAE (Variational Autoencoder), U-Net, or GAN (Generative Adversarial Networks), is preferably used.
- An image 86 of the material 62 acquired by the system is input to the trained neural network 85, and an image 87 in which a start point 88 and an end point 89 are added to the material 62 is obtained. From the start point and the end point, the target image 90, in which the kitchen knife guideline is added to the material, can be obtained.
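As a rough sketch of this image-to-image setup, the following assumes paired training images as in the data set 80 and a small PyTorch convolutional autoencoder; the architecture, loss, and sizes are illustrative, not taken from the patent:

```python
import torch
import torch.nn as nn

# Toy convolutional autoencoder: the input is a material image, the target
# is the same image annotated with start/end points (image sets 81 and 82).
class GuidelineCAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = GuidelineCAE()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(plain_images, annotated_images):
    # plain_images ~ image set 81, annotated_images ~ image set 82
    optimizer.zero_grad()
    loss = loss_fn(model(plain_images), annotated_images)
    loss.backward()
    optimizer.step()
    return loss.item()
```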
- In step S7, the user selects whether to enable or disable the bone detection function. If bone detection is enabled, the system performs bone detection in step S8.
- The bone detection function is particularly effective for detecting small bones; by using it, serving dishes in which small bones remain can be suppressed. The detection function is also effective for detecting scales and parasites. Inedible objects such as bones, scales, and parasites are called foreign substances, so the bone detection function can also be called a foreign substance detection function.
- FIG. 9 shows how the field of view 50 looks when bone detection is enabled. The material 76 represents a fish fillet, that is, a part of the body cut out from the material 62.
- In FIG. 9, the bones 78 contained in the material 76 are highlighted, so the user can easily find and remove the bones contained in the fish fillet regardless of their size.
- A text 73 may be displayed for a detected bone 78. The text 73 preferably contains information about the bone 78 and advice on how to deal with it. For example, when the detected bone 78 is a small bone to be removed, information indicating that a small bone has been detected and advice prompting its removal may be displayed as the text 73.
- A method for detecting the bones contained in fish is described with reference to FIGS. 10A and 10B. For bone detection, a semantic segmentation method is preferably used.
- FIG. 10A shows a method of training a neural network with teacher data; the training is preferably performed on the server 23 shown in FIG. 2. FIG. 10B shows a method of generating a target image using the trained neural network; this process is performed inside the AR glass 10. The AR glass 10 downloads the trained neural network from the server 23 and generates the target image.
- A data set 100 is prepared as teacher data. The data set 100 includes an image set 91 and an image set 92. The image set 91 comprises images of a plurality of materials (in this case, fish fillets), and in the image set 92 each image of the image set 91 is segmented, with the bone regions and the other regions painted separately. The neural network 95 is trained on the data set 100.
- As the neural network 95, a neural network used for semantic segmentation, such as FCN, SegNet, U-Net, or PSPNet, is preferably used.
- An image 96 of the material 76 acquired by the system is input to the trained neural network 95, and an image 97 in which the material 76 is segmented into bone regions and other regions is obtained. By synthesizing the image 97 and the image 96, an image 98 in which the bones of the material 76, particularly the small bones, are emphasized can be obtained.
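The synthesis of image 97 and image 96 into the emphasized image 98 can be sketched as an alpha blend plus a contour outline; a minimal Python/OpenCV example (the color, alpha, and function name are illustrative):

```python
import cv2
import numpy as np

# Emphasize detected bone pixels by blending a highlight color over the
# original frame. `mask` is assumed to be the segmentation output
# (non-zero where the network labeled bone), as in image 97.
def highlight_bones(frame_bgr, mask, color=(0, 0, 255), alpha=0.6):
    overlay = frame_bgr.copy()
    overlay[mask > 0] = color                      # paint the bone region
    blended = cv2.addWeighted(overlay, alpha, frame_bgr, 1 - alpha, 0)
    contours, _ = cv2.findContours(
        (mask > 0).astype(np.uint8), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE
    )
    cv2.drawContours(blended, contours, -1, color, 2)  # crisp outline
    return blended
```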
- Using techniques such as object detection and semantic segmentation, the above system can detect not only bones but also fish scales and parasites on fish. When the material 76 is not a fish, hair, feathers, and the like are also treated as foreign substances, as are shells, shell fragments, and sand. Foreign substances further include hair, lint, fibers, and other fragments from the clothing of the user or of people involved in cooking, such as chefs.
- The detection function of the system preferably detects any such inedible foreign matter. In particular, hair adhering to or mixed into the material is unhygienic, and it is preferable to detect and remove it using the above system.
- In step S9, the user selects "terminate" to end the system (step S10).
- As described above, according to one aspect of the present invention, an information processing system, an information processing device, an information processing method, a cooking assist system, a cooking assist device, and a cooking assist method capable of obtaining information without using the hands can be provided.
- (Embodiment 2) In the previous embodiment, fish is shown as an example of the material, but the present invention is not limited to this. As materials, seafood other than fish; meat of mammals such as cows, pigs, and sheep; meat of birds such as chickens, ducks, and turkeys; reptiles such as snakes and lizards; amphibians such as frogs; insects such as crickets; and vegetables, fruits, mushrooms, and the like can be used.
- In the present embodiment, an example in which a vegetable is used as the material 77 is shown.
- FIG. 11A shows the field of view 50 of a user looking through the AR glass 10 while the material 77 is placed on the cutting board 60. The recipe 71 and the guideline 75 are displayed on the display unit 14 of the AR glass 10, and the user can cut the material 77 while aligning the kitchen knife with the guideline 75.
- FIG. 11B shows how the AR glass 10 detects the user's line of sight and a desired step is selected from the recipe 71. A pointer 51 is displayed on the recipe 71 in accordance with the user's line of sight, and the user can select a step while checking the position of the pointer 51.
- The image of the selected step is displayed in a larger size. In addition, a text 72 explaining the step and a text 73 giving one-point advice are displayed in the lower part of the field of view 50, so the user can grasp the points to note in each step. The images displayed in the recipe 71 are not limited to still images; moving images may also be used.
- A method by which the system displays the kitchen knife guideline is described with reference to FIGS. 12A and 12B. The guideline is preferably displayed using machine learning with a neural network.
- First, the system detects the markers 61a to 61d using the first camera module 11 and thereby detects the position of the cutting board 60 (FIG. 12A). Next, the material 77 on the cutting board is detected, and candidates for the material 77 and recipe candidates are presented to the user. The user identifies the material 77 and selects a recipe using the same methods as in the previous embodiment. The system then obtains, from the database, information about the material 77 and the recipe for the material 77, that is, the cooking and processing methods.
- FIG. 12B shows an example of cutting the material 77 into thin strips (julienne). The system displays the guideline 75 on the display unit 14 so as to overlap the desired positions on the material 77; for example, the guideline 75 is displayed at a predetermined distance from the end of the material 77. The number of guidelines 75 displayed is not limited: one guideline 75 may be displayed for each cut, or the plurality of guidelines 75 required for a series of cuts may be displayed at once.
- To display the guideline 75, the neural network may be trained on teacher data as in the previous embodiment; a neural network used for image generation, such as an autoencoder, CAE, VAE, U-Net, or GAN, can be used. Alternatively, an image 99 can be obtained by performing image processing based on the information about the material 77 and adding the guideline 75 to the material 77.
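The image-processing route, placing guidelines at a predetermined spacing from the end of the material, might look like the following Python/OpenCV sketch (vertical cuts and the spacing value are assumptions for illustration):

```python
import cv2
import numpy as np

# Geometric alternative to the learned approach: place evenly spaced cut
# lines starting a fixed distance from the end of the detected material.
def draw_julienne_guidelines(frame_bgr, material_mask, spacing_px=20):
    ys, xs = np.nonzero(material_mask)   # pixels belonging to the material
    if xs.size == 0:
        return frame_bgr
    x_min, x_max = int(xs.min()), int(xs.max())
    y_min, y_max = int(ys.min()), int(ys.max())
    out = frame_bgr.copy()
    # One vertical guideline every `spacing_px` from the material's end.
    for x in range(x_min + spacing_px, x_max, spacing_px):
        cv2.line(out, (x, y_min), (x, y_max), (0, 255, 0), 2)
    return out
```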
- As described above, according to one aspect of the present invention, an information processing system, an information processing device, an information processing method, a cooking assist system, a cooking assist device, and a cooking assist method capable of obtaining information without using the hands can be provided.
- (Embodiment 3) The heating time in cooking varies depending on the dish, the ingredients, and the amount of ingredients. In the present embodiment, the system performs object detection on the cooking utensils being used while the cooking and processing shown in the previous embodiments are carried out, and displays on the AR glass 10 the heating time of each cooking utensil and the time remaining until the end of cooking. Because the system detects each utensil as an object, even when cooking proceeds simultaneously with a plurality of cooking utensils, the heating time and the remaining time can be displayed on the AR glass 10 for each cooking utensil.
- FIG. 13A shows the field of view 50 of a user observing, through the AR glass 10, a stove 200 that has a plurality of heating means and on which a cooking utensil 210, a cooking utensil 212, and a cooking utensil 214 are arranged and heated by the respective heating means.
- As the heating means of the stove 200, heating by gas, induction heating (IH), heating by electric resistance, or the like can be used, and in each case the heating intensity can be adjusted. With gas heating, the heating intensity, that is, the thermal power, can be adjusted by the amount of gas introduced; with electric heating, it can be adjusted by the electric power supplied.
- FIG. 13A shows an example in which the cooking utensil 210 is used to heat a material that requires 15 minutes of cooking, and 10 minutes have passed since the start of cooking. A text 220a indicating the elapsed cooking time and the remaining heating time is displayed on the display unit 14 of the AR glass 10.
- FIG. 13A also shows an example in which the cooking utensil 212 is used to heat a material that requires 10 minutes of cooking, and 2 minutes have passed since the start of cooking. A text 220b indicating the elapsed cooking time and the remaining heating time is displayed on the display unit 14 of the AR glass 10.
- FIG. 13A further shows an example in which water is heated to 100°C using the cooking utensil 214, the water currently having been heated to 80°C.
- The cooking utensil 214 is provided with a temperature sensor 216 capable of wireless communication with the AR glass 10. The temperature inside the cooking utensil 214 detected by the temperature sensor 216 is transmitted to the AR glass 10 and can be displayed on the display unit 14.
- The system predicts the heating time required to reach the desired temperature from the amount of water contained in the cooking utensil 214, the temperature of the water before heating, and the rate at which the water temperature changes during heating. The prediction is preferably performed in the database, and machine learning or a data sheet describing the energy required to heat a given amount of water can be used for it. In FIG. 13A, the current temperature of the water in the cooking utensil 214 and the prediction that it will reach 100°C in 2 minutes are displayed in a text 220c.
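For the data-sheet approach, the remaining heating time follows from Q = mcΔT divided by the effective heating power; a minimal Python sketch (the burner power and the 70% transfer efficiency are assumed values):

```python
# Estimate remaining heating time from Q = m * c * dT and effective power.
WATER_SPECIFIC_HEAT = 4186.0  # J/(kg*K)

def remaining_heating_seconds(mass_kg, temp_now_c, temp_target_c,
                              burner_watts, efficiency=0.7):
    """Time to reach the target temperature; efficiency is an assumed
    fraction of burner power actually transferred to the water."""
    if temp_now_c >= temp_target_c:
        return 0.0
    energy_j = mass_kg * WATER_SPECIFIC_HEAT * (temp_target_c - temp_now_c)
    return energy_j / (burner_watts * efficiency)

# Example: 1.5 kg of water at 80 degC heated to 100 degC on a 1.5 kW burner
# gives about 120 s, matching the "2 minutes" shown in the text 220c.
print(remaining_heating_seconds(1.5, 80.0, 100.0, 1500.0))
```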
- The cooking utensil 210 and the cooking utensil 212 may also be provided with a temperature sensor 216 in the same manner as the cooking utensil 214. For example, when oil is heated in a cooking utensil, the temperature sensor 216 can acquire the temperature of the oil and transmit it to the AR glass 10. The AR glass 10 can display the internal temperature of each cooking utensil on the display unit 14, so the user can cook at the optimum temperature while checking the temperature inside each utensil.
- Excessive heating of the oil in a cooking utensil may cause the oil to ignite. In such a case, the AR glass 10 can display a warning on the display unit 14 to alert the user. The AR glass 10 may also be connected to the stove 200 via a network; in this case, heating of the relevant heating portion is preferably stopped by a signal sent from the AR glass 10.
- The temperature information acquired from the temperature sensor 216 and the cooking recipe may be analyzed in the database, and the time required for cooking may be calculated and displayed in the texts 220a, 220b, and 220c.
- FIG. 13B shows the user's field of view 50 looking through the AR glass 10 at the stove 200 after 5 minutes have passed from the state of FIG. 13A.
- In FIG. 13B, the predetermined heating time of the cooking utensil 210 has ended, and the text 220a indicates that the heating time has ended. At this time, the AR glass 10 may transmit a signal to the stove 200, and the stove 200 receiving the signal may end the heating of the cooking utensil 210.
- For the cooking utensil 212, FIG. 13B shows a state in which 7 minutes have passed from the start of cooking, and the text 220b on the display unit 14 of the AR glass 10 shows the updated elapsed cooking time and remaining heating time.
- The AR glass 10 may analyze the temperature obtained from the temperature sensor 216 provided on the cooking utensil 212 and transmit to the stove 200 a signal for adjusting the heating intensity (thermal power or the like) applied to the cooking utensil 212. For example, if the inside of the cooking utensil 212 is overheated, the AR glass 10 can send the stove 200 a signal to reduce the heating; when the internal temperature of the cooking utensil 212 is low, it can send a signal to increase the heating. The stove 200 that receives the signal adjusts the heating intensity of the corresponding heating unit.
- FIG. 13B also shows the user moving the cooking utensil 214 off the stove 200. Since the cooking utensil 214 no longer exists in the field of view 50, the text 220c is hidden on the display unit 14.
- The system may end the heating when the water inside the cooking utensil 214 reaches the desired temperature. The system may also detect, by the image processing described below, that the cooking utensil 214 has been moved off the heating means, and end the heating accordingly. At the end of heating, the AR glass 10 may send a heating-end signal to the stove 200, and the stove 200 may receive the signal and end the heating of the corresponding heating means.
- for detecting the cooking utensils, object detection methods such as R-CNN, YOLO, and SSD described in the background art, and semantic segmentation methods such as FCN, SegNet, U-Net, and PSPNet, can be used.
- the system can analyze image features such as the shape and size of a cooking utensil to identify its type.
- the shape features include the shape and number of the handles of the cooking utensil and the shape of the spout.
- the size features include the area of the bottom surface, the height, the volume, and the like.
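For reference, a sketch of the detection step using a COCO-pretrained Faster R-CNN from torchvision; this is one possible realization rather than the disclosed implementation, the image path is a placeholder, and since COCO lacks cookware classes a practical system would be fine-tuned on labeled images of pans, pots, and kettles:

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# COCO-pretrained detector as a stand-in for the cookware detector.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = Image.open("kitchen_view.jpg").convert("RGB")  # placeholder path
with torch.no_grad():
    detections = model([to_tensor(image)])[0]

# Each detection yields a bounding rectangle like 222a to 222c in FIG. 14A.
for box, label, score in zip(detections["boxes"], detections["labels"],
                             detections["scores"]):
    if score > 0.7:  # confidence threshold (assumed value)
        print(f"class={label.item()} score={score:.2f} box={box.tolist()}")
```

A semantic segmentation network such as U-Net could instead assign a per-pixel label, which matches the region-division variant described below.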
- a rectangle 222a surrounding the cooking utensil 210, a rectangle 222b surrounding the cooking utensil 212, and a rectangle 222c surrounding the cooking utensil 214 are displayed on the display unit 14.
- from the image in the rectangle 222a, the system determines that the cooking utensil 210 is a pan and displays the text 224a.
- from the image in the rectangle 222b, it is determined that the cooking utensil 212 is a frying pan, and the text 224b is displayed.
- from the image in the rectangle 222c, it is determined that the cooking utensil 214 is a kettle, and the text 224c is displayed.
- the determination of each cooking utensil may be performed by machine learning using teacher data, or may be based on registered data on the cooking utensils used by the user, registered in advance in the database or in the memory 16 of the AR glass 10.
- each cooking utensil may be divided into regions.
- the type of cooking utensil may be specified from the shape of each region and a label may be given to each region, or the type of cooking utensil may be specified by the image analysis described above and then a label may be given to each region.
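A minimal sketch of the registered-data variant; the feature set (handle count, bottom-surface area) and the distance weighting are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class CookwareRecord:
    name: str               # e.g. "frying pan"
    handles: int            # number of handles
    bottom_area_cm2: float  # area of the bottom surface

# Hypothetical entries the user registered in advance in the database
# or in the memory 16 of the AR glass 10.
REGISTRY = [
    CookwareRecord("pan", 2, 500.0),
    CookwareRecord("frying pan", 1, 620.0),
    CookwareRecord("kettle", 1, 280.0),
]

def identify(handles: int, bottom_area_cm2: float) -> str:
    """Return the registered utensil whose features best match the
    measurements extracted from the camera image."""
    def distance(rec: CookwareRecord) -> float:
        # Weight handle-count mismatches heavily (assumed weighting).
        return (abs(rec.handles - handles) * 100.0
                + abs(rec.bottom_area_cm2 - bottom_area_cm2))
    return min(REGISTRY, key=distance).name

print(identify(handles=1, bottom_area_cm2=600.0))  # -> "frying pan"
```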
- the system displays information about the cooking utensil on the display unit 14.
- the information displayed on the display unit 14 includes the type of cooking utensil, the ingredients contained in the cooking utensil, the cooking time, and the like.
- the information about the cooking utensil may be hidden.
- the information about the cooking utensil may be hidden after the cooking utensil has been absent from the field of view 50 for a certain period of time.
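A sketch of this grace-period behavior; the three-second delay is an assumed value:

```python
import time

HIDE_DELAY_S = 3.0  # assumed grace period before hiding the overlay

class OverlayTimer:
    """Keep a utensil's overlay visible until the utensil has been
    absent from the field of view for HIDE_DELAY_S seconds."""

    def __init__(self) -> None:
        self.last_seen = time.monotonic()

    def should_display(self, in_view: bool) -> bool:
        now = time.monotonic()
        if in_view:
            self.last_seen = now
        return (now - self.last_seen) < HIDE_DELAY_S
```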
- one aspect of the present invention can provide an information processing system with which information can be obtained without using the hands.
- one aspect of the present invention can provide an information processing device with which information can be obtained without using the hands.
- one aspect of the present invention can provide an information processing method with which information can be obtained without using the hands.
- one aspect of the present invention can provide a cooking assist system with which information can be obtained without using the hands.
- one aspect of the present invention can provide a cooking assist device with which information can be obtained without using the hands.
- one aspect of the present invention can provide a cooking assist method with which information can be obtained without using the hands.
- FIG. 15A shows a perspective view of the glasses-type information terminal 900.
- the information terminal 900 has a pair of display panels 901, a pair of housings (housing 902a, housing 902b), a pair of optical members 903, a pair of mounting portions 904, and the like.
- the information terminal 900 can project the image displayed on the display panel 901 onto the display area 906 of the optical member 903. Since the optical member 903 is translucent, the user sees the image displayed in the display area 906 superimposed on the scene transmitted through the optical member 903. The information terminal 900 is therefore an information terminal capable of AR display.
- the display unit 14 described in the previous embodiment can include not only the display panel 901 but also the optical member 903 including the display area 906 and an optical system having the lens 911, the reflector 912, and the reflecting surface 913 described later.
- as the display panel 901, an organic EL display, an LED display, an inorganic EL display, a liquid crystal display, or the like can be used. When a liquid crystal display is used as the display panel 901, it is preferable to provide a light source that functions as a backlight.
- the information terminal 900 is provided with a pair of cameras 905 capable of photographing the front and a pair of cameras 909 capable of photographing the user side.
- the camera 905 is a part of the component of the first camera module 11, and the camera 909 is a part of the component of the second camera module 12.
- the arrangement of the cameras 905 is not limited to that of the present embodiment.
- the number of cameras 905 provided in the information terminal 900 may be one.
- a single camera 905 may be provided in the central portion of the front surface of the information terminal 900, or in the front surface of one of the housing 902a and the housing 902b. Alternatively, two cameras 905 may be provided in the front surfaces of the housing 902a and the housing 902b, respectively.
- the camera 909 can detect the line of sight of the user. Therefore, two cameras 909 are preferably provided, one for the right eye and one for the left eye. However, if a single camera can detect the line of sight of both eyes, only one camera 909 may be provided. Further, the camera 909 may be an infrared camera.
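The gaze-detection algorithm itself is not detailed here; as one illustrative building block (an assumption, not the disclosed method), the pupil center can be located in an infrared eye image as the darkest roughly circular blob, since infrared illumination gives the pupil high contrast:

```python
import cv2
import numpy as np

def pupil_center(eye_gray: np.ndarray):
    """Rough pupil-center estimate: threshold the dark pixels and take
    the centroid of the largest dark blob. Returns (x, y) or None."""
    _, dark = cv2.threshold(eye_gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(dark, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    blob = max(contours, key=cv2.contourArea)
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```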
- the housing 902a has a wireless communication device 907, by which a video signal or the like can be supplied to the housing 902. It is preferable that the wireless communication device 907 has the communication module 17 and communicates with the database. Instead of, or in addition to, the wireless communication device 907, a connector may be provided to which a cable 910 supplying a video signal or a power supply potential can be connected. The cable 910 may also function as the wiring 10c connected to the housing 10b. Further, by providing the housing 902 with an acceleration sensor, a gyro sensor, or the like as the sensor 25, the direction of the user's head can be detected and an image corresponding to that direction can be displayed in the display area 906. The housing 902 is also preferably provided with the battery 21, which can be charged wirelessly or by wire.
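As a minimal sketch of using the sensor 25 for this, the head yaw can be tracked by integrating the gyro's angular rate; a real device would fuse this with accelerometer or magnetometer data to cancel drift, and the sampling interval is an assumed parameter:

```python
def integrate_yaw(yaw_deg: float, gyro_z_dps: float, dt_s: float) -> float:
    """Advance the head-yaw estimate by one gyro sample.

    gyro_z_dps: angular rate around the vertical axis in deg/s.
    dt_s: time since the previous sample in seconds.
    """
    return (yaw_deg + gyro_z_dps * dt_s) % 360.0

# Example: turning at 10 deg/s for 0.5 s moves the head 5 degrees.
print(integrate_yaw(0.0, 10.0, 0.5))  # 5.0
```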
- the housing 902b is provided with an integrated circuit 908.
- the integrated circuit 908 includes the controller 13, the processor 15, the memory 16, and the audio controller 18, and has a function of controlling the various components of the information terminal 900, such as the camera 905, the wireless communication device 907, the pair of display panels 901, the microphone 19, and the speaker 20, as well as a function of generating images.
- the integrated circuit 908 may have a function of generating a composite image for AR display.
- the wireless communication device 907 can communicate data with an external device.
- data transmitted from the outside can be output to the integrated circuit 908, and the integrated circuit 908 can generate image data for AR display based on the data.
- examples of data transmitted from the outside include data containing information necessary for cooking transmitted from the database, and data containing cooking-related information transmitted from the various sensors provided in the cooking utensils.
- a display panel 901, a lens 911, and a reflector 912 are provided inside the housing 902. Further, a portion of the optical member 903 corresponding to the display area 906 has a reflecting surface 913 that functions as a half mirror.
- the light 915 emitted from the display panel 901 passes through the lens 911 and is reflected by the reflector 912 toward the optical member 903. Inside the optical member 903, the light 915 is repeatedly totally reflected at the surfaces of the optical member 903 and reaches the reflecting surface 913, whereby an image is projected on the reflecting surface 913. As a result, the user can visually recognize both the light 915 reflected on the reflecting surface 913 and the transmitted light 916 that passes through the optical member 903 (including the reflecting surface 913).
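For reference (standard waveguide optics, not specific to this disclosure), the light remains guided only while it strikes the surfaces of the optical member 903 beyond the critical angle

$$\theta_c = \arcsin\!\left(\frac{n_2}{n_1}\right),$$

where $n_1$ is the refractive index of the optical member and $n_2$ that of the surrounding air; for instance, $n_1 = 1.5$ and $n_2 = 1.0$ give $\theta_c \approx 41.8^\circ$.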
- when a liquid crystal display is used, the light source that functions as a backlight is preferably provided so that its light enters the lens 911 through the display panel 901; that is, the liquid crystal panel is preferably located between the light source and the lens 911.
- FIG. 15B shows an example in which the reflector 912 and the reflecting surface 913 each have a curved surface.
- by curving them, the degree of freedom in optical design can be increased and the optical member 903 can be made thinner than when these are flat surfaces.
- the reflector 912 and the reflection surface 913 may be flat.
- as the reflector 912, a member having a mirror surface can be used, and its reflectance is preferably high. As the reflecting surface 913, a half mirror utilizing reflection by a metal film may be used; however, using a prism or the like that utilizes total reflection can increase the transmittance of the transmitted light 916.
- the housing 902 has a mechanism for adjusting the distance between the lens 911 and the display panel 901 and the angles thereof. This makes it possible to adjust the focus and enlarge / reduce the image.
- the lens 911 and the display panel 901 may be configured to be movable in the optical axis direction.
- the housing 902 has a mechanism capable of adjusting the angle of the reflector 912. By changing the angle of the reflector 912, it is possible to change the position of the display area 906 in which the image is displayed. This makes it possible to arrange the display area 906 at an optimum position according to the position of the user's eyes.
- the display device of one aspect of the present invention can be applied to the display panel 901. Therefore, the information terminal 900 can perform display with extremely high definition.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Business, Economics & Management (AREA)
- Multimedia (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Databases & Information Systems (AREA)
- Tourism & Hospitality (AREA)
- Data Mining & Analysis (AREA)
- Computer Hardware Design (AREA)
- Evolutionary Computation (AREA)
- Library & Information Science (AREA)
- Economics (AREA)
- Human Resources & Organizations (AREA)
- General Business, Economics & Management (AREA)
- Strategic Management (AREA)
- Primary Health Care (AREA)
- Marketing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Artificial Intelligence (AREA)
- Computing Systems (AREA)
- Software Systems (AREA)
- Medical Informatics (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- User Interface Of Digital Computer (AREA)
- Medical Treatment And Welfare Office Work (AREA)
Abstract
Description
FIG. 2 is a block diagram illustrating one aspect of the present invention.
FIG. 3 is a flowchart illustrating one aspect of the present invention.
FIGS. 4A and 4B are schematic diagrams illustrating one aspect of the present invention.
FIGS. 5A and 5B are schematic diagrams illustrating one aspect of the present invention.
FIGS. 6A and 6B are schematic diagrams illustrating one aspect of the present invention.
FIG. 7 is a schematic diagram illustrating an example of a method for displaying guidelines according to one aspect of the present invention.
FIG. 8A is a schematic diagram illustrating a method, according to one aspect of the present invention, for training a neural network on teacher data by machine learning. FIG. 8B is a schematic diagram illustrating a method, according to one aspect of the present invention, for generating an image using the trained neural network.
FIG. 9 is a schematic diagram illustrating an image generation method of one aspect of the present invention.
FIG. 10A is a schematic diagram illustrating a method, according to one aspect of the present invention, for training a neural network on teacher data by machine learning. FIG. 10B is a schematic diagram illustrating a method, according to one aspect of the present invention, for generating an image using the trained neural network.
FIGS. 11A and 11B are schematic diagrams illustrating one aspect of the present invention.
FIGS. 12A and 12B are schematic diagrams illustrating an example of a method for displaying guidelines according to one aspect of the present invention.
FIGS. 13A and 13B are schematic diagrams illustrating one aspect of the present invention.
FIGS. 14A and 14B are schematic diagrams illustrating a method for detecting cooking utensils according to one aspect of the present invention.
FIGS. 15A and 15B are schematic diagrams illustrating one aspect of the present invention.
In this embodiment, an information processing system of one aspect of the present invention is described. The information processing system of this embodiment can assist a user in cooking.
FIG. 1 shows an example in which a user cooks while wearing the AR glass 10. AR is an abbreviation of Augmented Reality, and the AR glass 10 can project information such as images and text onto the external world the user is looking at.
FIG. 2 is a block diagram showing a configuration example of the AR glass 10. The AR glass 10 includes a first camera module 11, a second camera module 12, a controller 13, a display unit 14, a processor 15, a memory 16, a communication module 17, an audio controller 18, a microphone 19, a speaker 20, a battery 21, a sensor 25, and a bus 22.
Details of the information processing system using the AR glass 10 are described with reference to FIGS. 3 to 10. In the following description, the information processing system is simply abbreviated as "the system".
In the previous embodiment, fish was shown as an example of the material, but the present invention is not limited to this. As the material, seafood other than fish; meat of mammals such as cattle, pigs, and sheep; meat of birds such as chickens, ducks, and turkeys; reptiles such as snakes and lizards; amphibians such as frogs; insects such as crickets; vegetables; fruits; mushrooms; and the like can be used. In this embodiment, an example of using a vegetable as the material 77 is described.
In this embodiment, a method for displaying, on the AR glass 10, the processing time for a cooking utensil during cooking is described.
In this embodiment, a configuration example of an information terminal including a display device that can be used as the AR glass 10 is described.
Claims (14)
- A wearable device including a display means and an imaging means; and
a database connected to the wearable device via a network,
wherein
the database has at least one of a cooking recipe, a cooking method, and information on a material,
the wearable device detects a first material by the imaging means,
the wearable device collects information on the first material from the database,
when the first material is present in a specific region of the range imaged by the imaging means, the information on the first material is displayed on the display means, and
when the first material is not present in the region, the information on the first material is hidden from the display means: an information processing system. - The information processing system according to claim 1, wherein a cooking method using the first material is displayed on the display means on the basis of the cooking recipe.
- The information processing system according to claim 1 or 2, wherein the information on the first material includes a cutting position of the first material.
- The information processing system according to any one of claims 1 to 3, wherein the information on the first material includes a position of a bone contained in the first material.
- The information processing system according to any one of claims 1 to 4, wherein the wearable device is a glasses-type wearable device.
- The information processing system according to any one of claims 1 to 5, wherein the database is stored in a server.
- A wearable device including a display means and an imaging means; and
a cooking utensil having a temperature sensor,
wherein
the wearable device and the temperature sensor are connected via a first network,
the wearable device detects the cooking utensil by the imaging means,
the wearable device collects information on the temperature inside the cooking utensil from the temperature sensor,
when the cooking utensil is present in a specific region of the range imaged by the imaging means, the information on the temperature is displayed on the display means, and
when the cooking utensil is not present in the region, the information on the temperature is hidden from the display means: an information processing system. - In the information processing system according to claim 7,
the information processing system further includes a database,
the database is connected to the wearable device and the temperature sensor via a second network including the first network,
the database receives the information on the temperature via the second network, and
the database calculates, from the information on the temperature, the time required for heating the cooking utensil and displays it on the display means: an information processing system. - The information processing system according to claim 7 or 8, wherein the wearable device is a glasses-type wearable device.
- The information processing system according to claim 8, wherein the database is stored in a server.
- An information processing method using a wearable device including a display means and an imaging means,
wherein the wearable device is worn by a user so that the user can see a material and a cooking utensil through the display means,
the information processing method comprising:
a step of detecting, by using the imaging means, a cutting board present in the line of sight of the user;
a step of identifying a first material placed on the cutting board;
a step of displaying a cooking method on the display means; and
a step of displaying, on the display means, a cutting position of the first material so as to overlap with the first material in the line of sight of the user.
- The information processing method according to claim 11, further comprising
a step of displaying, on the display means, a position of a foreign substance present on the surface of or inside the material so as to overlap with the material in the line of sight of the user. - The information processing method according to claim 12, wherein the foreign substance is one selected from a bone, a scale, a parasite, and a hair.
- The information processing method according to any one of claims 11 to 13, wherein the wearable device is a glasses-type wearable device.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202080043825.5A CN114026592A (zh) | 2019-06-25 | 2020-06-12 | 信息处理系统及信息处理方法 |
JP2021528031A JPWO2020261028A1 (ja) | 2019-06-25 | 2020-06-12 | |
KR1020227001953A KR20220025817A (ko) | 2019-06-25 | 2020-06-12 | 정보 처리 시스템 및 정보 처리 방법 |
US17/619,623 US11922690B2 (en) | 2019-06-25 | 2020-06-12 | Data processing system and data processing method |
DE112020003104.7T DE112020003104T5 (de) | 2019-06-25 | 2020-06-12 | Datenverarbeitungssystem und Verfahren zur Datenverarbeitung |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019-117639 | 2019-06-25 | ||
JP2019117639 | 2019-06-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020261028A1 true WO2020261028A1 (ja) | 2020-12-30 |
Family
ID=74060027
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2020/055509 WO2020261028A1 (ja) | 2019-06-25 | 2020-06-12 | 情報処理システム、および情報処理方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US11922690B2 (ja) |
JP (1) | JPWO2020261028A1 (ja) |
KR (1) | KR20220025817A (ja) |
CN (1) | CN114026592A (ja) |
DE (1) | DE112020003104T5 (ja) |
WO (1) | WO2020261028A1 (ja) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006323590A (ja) * | 2005-05-18 | 2006-11-30 | Nippon Telegr & Teleph Corp <Ntt> | Receipt and receipt issuing device, card and card recording device, terminal device, household account book data management system, household account book data management method, computer program, and recording medium |
JP2011058782A (ja) * | 2009-09-14 | 2011-03-24 | Keio Gijuku | Cooking system, cooking utensil used in the cooking system, and cooking system set |
JP2017120329A (ja) * | 2015-12-28 | 2017-07-06 | 株式会社ブリリアントサービス | Head-mounted display for cooking and program for the head-mounted display for cooking |
JP2017120164A (ja) * | 2015-12-28 | 2017-07-06 | 株式会社ブリリアントサービス | Cooking-adjustment head-mounted display, program for the cooking-adjustment head-mounted display, and cooking-adjustment head-mounted display system |
JP2018128979A (ja) * | 2017-02-10 | 2018-08-16 | パナソニックIpマネジメント株式会社 | Kitchen support system |
WO2019031020A1 (ja) * | 2017-08-09 | 2019-02-14 | 株式会社DSi | Weighing system, electronic scale, and marker for electronic scale |
KR20190100496A (ko) * | 2018-02-05 | 2019-08-29 | 충북대학교 산학협력단 | Cooking guidance system and method |
JP6692960B1 (ja) * | 2019-03-29 | 2020-05-13 | 株式会社エヌ・ティ・ティ・データ | Cooking support system |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DK1887874T3 (da) * | 2005-05-31 | 2012-10-15 | Teknologisk Inst | Fremgangsmåde og anvendelse af en database til automatisk at bestemme kvalitetskarakteristika for en slagtekrop i en slagtelinje |
EP3116358B2 (en) * | 2014-03-14 | 2022-12-07 | Spectrum Brands, Inc. | Wirelessly operable cooking appliance |
WO2016079960A1 (en) * | 2014-11-18 | 2016-05-26 | Seiko Epson Corporation | Image processing apparatus, control method for image processing apparatus, and computer program |
JP6509686B2 (ja) * | 2015-09-04 | 2019-05-08 | 株式会社東芝 | 電子機器及び方法 |
JP6444835B2 (ja) | 2015-09-07 | 2018-12-26 | 株式会社東芝 | 画像処理装置、画像処理プログラムおよび画像処理システム |
US9704257B1 (en) | 2016-03-25 | 2017-07-11 | Mitsubishi Electric Research Laboratories, Inc. | System and method for semantic segmentation using Gaussian random field network |
JP6327682B1 (ja) | 2017-01-30 | 2018-05-23 | クックパッド株式会社 | Information processing system, information processing device, information processing method, and program |
US10360734B2 (en) * | 2017-05-05 | 2019-07-23 | Unity IPR ApS | Contextual applications in a mixed reality environment |
JP7338184B2 (ja) * | 2018-03-28 | 2023-09-05 | 株式会社リコー | Information processing device, information processing system, moving object, information processing method, and program |
JP7076007B2 (ja) * | 2018-11-29 | 2022-05-26 | マクセル株式会社 | Video display device and method |
CN113939223A (zh) | 2019-06-07 | 2022-01-14 | 株式会社半導体能源研究所 | Composite device |
-
2020
- 2020-06-12 US US17/619,623 patent/US11922690B2/en active Active
- 2020-06-12 CN CN202080043825.5A patent/CN114026592A/zh active Pending
- 2020-06-12 DE DE112020003104.7T patent/DE112020003104T5/de active Pending
- 2020-06-12 JP JP2021528031A patent/JPWO2020261028A1/ja active Pending
- 2020-06-12 KR KR1020227001953A patent/KR20220025817A/ko unknown
- 2020-06-12 WO PCT/IB2020/055509 patent/WO2020261028A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN114026592A (zh) | 2022-02-08 |
KR20220025817A (ko) | 2022-03-03 |
DE112020003104T5 (de) | 2022-03-17 |
JPWO2020261028A1 (ja) | 2020-12-30 |
US11922690B2 (en) | 2024-03-05 |
US20220351509A1 (en) | 2022-11-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6299744B2 (ja) | Information processing apparatus and storage medium | |
KR102179142B1 (ko) | 착용식 식품 영양 피드백 시스템 | |
US10458845B2 (en) | Mobile device for food identification an quantification using spectroscopy and imaging | |
EP4011274A1 (en) | Eye tracking using eyeball center position | |
RU2668408C2 (ru) | Устройства, системы и способы виртуализации зеркала | |
US9135508B2 (en) | Enhanced user eye gaze estimation | |
US20170227754A1 (en) | Systems and applications for generating augmented reality images | |
CN112507799A (zh) | 基于眼动注视点引导的图像识别方法、mr眼镜及介质 | |
CN105446474B (zh) | 可穿戴智能设备及其交互的方法、可穿戴智能设备系统 | |
CN108416703A (zh) | 厨房支援系统 | |
CN112181152A (zh) | 基于mr眼镜的广告推送管理方法、设备及应用 | |
US11763437B2 (en) | Analyzing apparatus and method, and image capturing system | |
CN110021404A (zh) | 用于处理与食物相关的信息的电子设备和方法 | |
CN103648366A (zh) | 用于远程测量光学焦点的系统和方法 | |
JP2014167716A (ja) | 情報処理装置および記憶媒体 | |
JPWO2018066190A1 (ja) | 情報処理装置、情報処理方法、およびプログラム | |
JP2010061381A (ja) | Automatic recipe creation device and program | |
JP2017120329A (ja) | Head-mounted display for cooking and program for the head-mounted display for cooking | |
JP2021165622A (ja) | Food storage system, preference information acquisition method, and preference information acquisition device | |
WO2020261028A1 (ja) | Information processing system and information processing method | |
KR20190048922A (ko) | 스마트 테이블 및 스마트 테이블의 제어방법 | |
JP2016162124A (ja) | Cooking method determination method and cooking method determination device | |
JP6895830B2 (ja) | Cooking support device | |
CN113631094A (zh) | System, device, method, and computer program for determining a physical and mental state | |
US11864834B1 (en) | Multi-tiled plenoptic system for the detection and correction of ocular defects and for improved foveated rendering |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20833441 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2021528031 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20227001953 Country of ref document: KR Kind code of ref document: A |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20833441 Country of ref document: EP Kind code of ref document: A1 |