WO2021166673A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2021166673A1
WO2021166673A1 (PCT/JP2021/004222; JP2021004222W)
Authority
WO
WIPO (PCT)
Prior art keywords
texture
cooking
information processing
information
parameters
Prior art date
Application number
PCT/JP2021/004222
Other languages
French (fr)
Japanese (ja)
Inventor
暢彦 向井
佳恵 永野
Original Assignee
ソニーグループ株式会社 (Sony Group Corporation)
Priority date
Filing date
Publication date
Application filed by ソニーグループ株式会社 (Sony Group Corporation)
Publication of WO2021166673A1

Classifications

    • A: HUMAN NECESSITIES
    • A23: FOODS OR FOODSTUFFS; TREATMENT THEREOF, NOT COVERED BY OTHER CLASSES
    • A23P: SHAPING OR WORKING OF FOODSTUFFS, NOT FULLY COVERED BY A SINGLE OTHER SUBCLASS
    • A23P 30/00: Shaping or working of foodstuffs characterised by the process or apparatus
    • A23P 30/30: Puffing or expanding
    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47J: KITCHEN EQUIPMENT; COFFEE MILLS; SPICE MILLS; APPARATUS FOR MAKING BEVERAGES
    • A47J 44/00: Multi-purpose machines for preparing food with several driving units
    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24: HEATING; RANGES; VENTILATING
    • F24C: DOMESTIC STOVES OR RANGES; DETAILS OF DOMESTIC STOVES OR RANGES, OF GENERAL APPLICATION
    • F24C 15/00: Details
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y: INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y 10/00: Economic sectors
    • G16Y 10/80: Homes; Buildings

Definitions

  • This technology relates in particular to an information processing device, an information processing method, and a program that make it possible to easily realize a desired texture.
  • Techniques are being studied for reproducing, on a cooking robot, a dish prepared by a cook, by sensing the cook's movements during cooking and saving or transmitting the resulting sensor data.
  • The cooking operations of the cooking robot are performed so as to reproduce, for example, the same movements as those of the cook's hands, based on the sensing results.
  • Patent Document 1 describes a technique for reproducing the texture of an original food by combining a plurality of taste elements based on numerical data related to the texture. It also describes forming a food having the same shape as the original food with a 3D printer, based on shape data obtained by reading the shape of the food with a 3D scanner.
  • This technology was made in view of such a situation, and makes it possible to easily realize a desired texture.
  • An information processing device according to one aspect of the present technology includes a parameter setting unit that sets parameters specifying cooking operations of a cooking device, based on texture designation data that designates the texture of each part of the ingredient to be cooked.
  • In an information processing method according to one aspect of the present technology, parameters specifying the cooking operations of the cooking device are set based on texture designation data that designates the texture of each part of the ingredient to be cooked.
  • FIG. 1 is a block diagram showing a configuration example of a cooking system according to an embodiment of the present technology. FIG. 2 is a diagram showing an installation example of the information processing terminal and the cooking device. FIG. 3 is a diagram showing an example of operating the cooking device using the information processing terminal. FIG. 4 is a diagram showing an example of control of the cooking device. FIG. 5 is an enlarged perspective view showing the appearance of the cooking device. FIG. 6 is a diagram showing an example of laser light irradiation. FIG. 7 is a diagram showing an example of an ingredient after cooking. FIG. 8 is a diagram showing an example of texture designation data. FIG. 9 is a diagram showing another example of texture designation data.
  • FIG. 1 is a diagram showing a configuration example of a cooking system according to an embodiment of the present technology.
  • The cooking system of FIG. 1 is configured by connecting an information processing terminal 1 and a cooking device 2 via wireless communication. Communication between the information processing terminal 1 and the cooking device 2 may instead be performed via a wire.
  • The information processing terminal 1 is connected to an information processing server 11 via a network 12 such as the Internet.
  • The information processing terminal 1 is a tablet terminal provided with a display on the front of its housing.
  • The information processing terminal 1 detects the user's operation on a button or the like displayed on the display, and controls the operation of the cooking device 2, cooperating with the information processing server 11 as appropriate.
  • The information processing terminal 1 thus functions as an information processing device that controls the operation of the cooking device 2 and causes the cooking device 2 to perform cooking.
  • Although the information processing terminal 1 is a tablet terminal here, it may instead be configured by another type of device having a display, such as a PC, a TV, or a smartphone.
  • The cooking device 2 has drive-system devices and various sensors, and is a device for cooking using ingredients.
  • Ingredients include vegetable ingredients such as vegetables and fruits, animal ingredients such as meat and fish, processed ingredients, seasonings, and beverages such as water and liquor. Not only solid ingredients but also powdered ingredients are included.
  • The information processing terminal 1 and the cooking device 2 constituting the cooking system are installed in a kitchen where, for example, a chef or another cook works.
  • The users of the information processing terminal 1 and the cooking device 2 are the chef and the people who support the chef. Although only the chef is shown in FIG. 2, many people other than the chef are present in the kitchen where the information processing terminal 1 and the cooking device 2 are installed.
  • The information processing terminal 1 may also be operated by a person other than the chef.
  • The information processing terminal 1 is located near the chef who is cooking, and the cooking device 2 is installed on a table at a remote position.
  • The chef operates the cooking device 2 using the information processing terminal 1 and causes the cooking device 2 to cook.
  • FIG. 3 is a diagram showing an example of operation of the cooking apparatus 2 using the information processing terminal 1.
  • The chef inputs the desired texture to the information processing terminal 1 by operating the screen displayed on its display.
  • The texture may also be input by voice.
  • When the texture of the chocolate is specified, the information processing terminal 1 generates recipe data for realizing the specified texture, as shown in the lower part of FIG. 3.
  • The recipe data is information used to control the cooking device 2.
  • In the recipe data, information on the ingredients to be cooked, information on the cooking operations of the cooking device 2, and control parameters for controlling the cooking device 2 are described for each cooking process.
  • The cooking device 2 is controlled according to the description of the recipe data, and the cooking operations are performed by the cooking device 2.
  • As a result, the texture of the object to be cooked, for example chocolate, becomes the texture specified by the chef.
  • In this way, the texture of the ingredient to be cooked is specified by the user, and the cooking device 2 performs cooking so as to realize the specified texture.
  • The user can thus easily obtain an ingredient having the desired texture.
  • The texture is specified by selecting a numerical value for each texture descriptor expressed in words such as "fluffy", "karikari" (crunchy), "purunpurun" (jiggly), "melty", and "crispy".
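The numeric, per-descriptor texture selection described above can be modeled as a small data structure. The following is a hypothetical Python sketch: the descriptor names come from the text, but the 0-10 scale, class name, and field names are illustrative assumptions, not part of the patent.

```python
# Hypothetical model of a texture specification: one numeric score per
# descriptor. The 0-10 scale and clamping behavior are assumptions.
from dataclasses import dataclass, field

DESCRIPTORS = ("fluffy", "karikari", "purunpurun", "melty", "crispy")

@dataclass
class TextureSpec:
    scores: dict = field(default_factory=dict)  # descriptor -> 0..10

    def set(self, name, value):
        if name not in DESCRIPTORS:
            raise ValueError(f"unknown descriptor: {name}")
        # Clamp to the assumed 0..10 range.
        self.scores[name] = max(0, min(10, value))

spec = TextureSpec()
spec.set("melty", 8)
spec.set("crispy", 3)
spec.set("fluffy", 15)  # clamped to 10
```

Such a specification would then be attached to each region of the map information described later, rather than to the ingredient as a whole.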
  • FIG. 5 is an enlarged perspective view showing the appearance of the cooking apparatus 2.
  • The cooking device 2 is a laser cooking device that cooks by irradiating a cooking object with laser light.
  • The cooking device 2 basically consists of a main body 21 to which a table-shaped countertop 22 and a drive unit 24 composed of an XY stage or the like are attached.
  • A substantially square plate of chocolate C is placed on the countertop 22 as the ingredient to be cooked.
  • Three-dimensional chocolate may also be used as the ingredient to be cooked.
  • The main body 21 communicates with the information processing terminal 1 and controls the operation of each unit according to control by the information processing terminal 1.
  • Control parameters related to the cooking operations are transmitted from the information processing terminal 1.
  • The output (characteristics) of the laser light is adjusted according to the control parameters transmitted from the information processing terminal 1.
  • The cooking operations controlled by the main body 21 include adjustment of the position of the laser head 34, adjustment of the output of the laser light emitted from the laser head 34, and the like.
  • Pipes 23-1 and 23-2 are provided between the main body 21 and the countertop 22.
  • The pipe 23-1 is used to send hot water or cooling water from the main body 21 to the countertop 22.
  • The pipe 23-2 is used to return the water that has flowed in from the pipe 23-1 and circulated through the piping inside the countertop 22 back to the main body 21.
  • The temperature of each portion of the chocolate C placed on the countertop 22 is adjusted by circulating water, whose flow rate and temperature are controlled, through the piping inside the countertop 22.
  • The drive unit 24 is configured by connecting a rod-shaped support member 31 and a support member 32 in a cross shape at a connecting portion 33.
  • The support member 31 and the support member 32 are mounted substantially horizontally.
  • A laser head 34 is provided, incorporating a laser light source (including an optical fiber) and an optical system.
  • By adjusting the connecting position of the support member 31 and the support member 32 at the connecting portion 33, the position of the laser head 34 in the horizontal plane is adjusted. The height of the laser head 34 is adjusted by adjusting the height of the support member 31 with an internal mechanism of the main body 21.
  • A non-contact thermometer 41 and a fan 42 are attached to a member (not shown) of the cooking device 2.
  • The non-contact thermometer 41 measures the temperature of the chocolate C in a non-contact manner.
  • The fan 42 adjusts the temperature of the chocolate C by blowing hot or cold air.
  • The fan 42 is provided with a heater for generating warm air.
  • By controlling the cooking device 2 configured in this way, the information processing terminal 1 controls the irradiation position (trajectory) of the laser light on the surface of the chocolate C placed on the countertop 22.
  • The information processing terminal 1 can control the temperature of the chocolate C part by part, by selecting an arbitrary portion of the chocolate C and irradiating it with laser light, or by applying cold air from the fan 42.
  • The output of the laser light emitted from the laser head 34 is feedback-controlled using the output of the non-contact thermometer 41, which measures the surface temperature of the chocolate C being irradiated. The rotation speed of the fan 42 and the flow rate of the cooling water passing through the inside of the countertop 22 are also feedback-controlled using the output of the non-contact thermometer 41. As a result, the heating state and cooling state of each portion of the chocolate C are adjusted.
  • A cooking operation is also possible in which the fat content (cocoa butter) of only a selected part of the chocolate C is changed to an arbitrary crystalline state.
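The temperature feedback loop described above (laser output regulated from the non-contact thermometer 41 reading toward a set temperature) can be sketched as a simple proportional controller. The patent does not specify the control law; the proportional form, gain, and power limits below are illustrative assumptions.

```python
# Minimal sketch of temperature feedback on laser output.
# Proportional control with clamping; gain and limits are assumptions.
def update_laser_power(current_power_w, measured_temp_c, target_temp_c,
                       gain=0.05, max_power_w=5.0):
    """Nudge laser power toward the target surface temperature."""
    power = current_power_w + gain * (target_temp_c - measured_temp_c)
    return max(0.0, min(max_power_w, power))

# Surface is cooler than the set point, so power increases.
p = update_laser_power(1.0, measured_temp_c=25.0, target_temp_c=35.0)
```

In practice the same loop structure would also drive the fan speed and cooling-water flow rate mentioned in the text, each toward its own set point.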
  • FIG. 6 is a diagram showing an example of laser light irradiation.
  • Inside the laser head 34, a laser light source 51 and a condensing optical system 52 composed of a condensing lens are provided.
  • The laser light is applied to the surface of the chocolate C.
  • The small circle below the chocolate C shown on the left side of FIG. 6 represents the irradiation range of the laser light.
  • The position of the chocolate C may also be adjusted so as to be offset from the focal (imaging) point.
  • In this case, a defocused laser beam is applied.
  • The circle below the chocolate C shown in the center of FIG. 6 represents the irradiation range of this laser light.
  • The laser head 34 may also be provided with a homogenizer 53 composed of a cylindrical lens or the like.
  • By providing the homogenizer 53, laser light with a homogenized light amount (including a line-shaped beam) is applied.
  • The size of the irradiation range of the laser light is a circle or line having a diameter (or length) of 1 mm or more.
  • By setting the diameter of the laser beam, or the length of the linear laser beam, to 1 mm or more, it becomes possible to prevent evaporation (ablation) of the chocolate C due to excessive heating. Furthermore, by uniformly heating a wide area, more accurate heating control becomes possible in the temperature range of 20 to 50°C, which is important in the crystal transition process described later.
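The ablation argument above can be motivated with a quick irradiance calculation: at fixed laser power, power density falls with the square of the spot diameter, so a wider spot heats far more gently. The numbers below are illustrative, not values from the patent.

```python
import math

# Irradiance (power density) for a circular spot of given diameter.
def irradiance_w_per_mm2(power_w, spot_diameter_mm):
    area_mm2 = math.pi * (spot_diameter_mm / 2) ** 2
    return power_w / area_mm2

# Widening the spot from 0.1 mm to 1 mm cuts irradiance by a factor of 100,
# which is why a >= 1 mm spot avoids local overheating (ablation).
ratio = irradiance_w_per_mm2(1.0, 0.1) / irradiance_w_per_mm2(1.0, 1.0)
```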
  • FIG. 7 is a diagram showing an example of ingredients after cooking.
  • The whole piece of meat may be cooked before the cooking with laser light. Cooking to bake the whole piece of meat may be performed on the countertop 22 of the cooking device 2, or may be performed using another cooking appliance such as an oven or a microwave oven.
  • Cooking by the cooking device 2 is performed according to the user's designation of the texture of the ingredient.
  • The texture is designated by the user separately for each part of the ingredient.
  • Cooking of each part of the ingredient as shown in FIG. 7 is performed so as to realize the texture designated by the user for that part.
  • Texture designation data, which designates the texture of each part of the ingredient, is generated in the information processing terminal 1 according to the user's designation, and the cooking operations of the cooking device 2 are controlled based on the generated texture designation data.
  • FIG. 8 is a diagram showing an example of texture designation data.
  • The texture designation data is configured as map information representing the texture of each part of the ingredient.
  • Each grid-like region constituting the texture designation data shown on the left side of FIG. 8 corresponds to a portion of the ingredient. The difference in shading of each region indicates that the texture of each region is different.
  • In this example, a texture that gradually changes from left to right is realized.
  • For example, a texture of meat that becomes gradually firmer from left to right is realized.
  • In this case, the cooking operation by the cooking device 2 is performed so as to increase the output of the laser light from left to right, for example.
  • The difference in color of each part indicates that the texture is different.
  • Since the cooking device 2 can cook each part individually, it can realize such a texture that changes from part to part.
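A minimal sketch of how such map information could drive the laser output, producing a left-to-right gradient as in FIG. 8. The grid of texture intensities in 0..1 and the linear mapping to laser power are both illustrative assumptions.

```python
# Map a 2-D grid of texture intensities (0..1) to per-cell laser powers.
# The linear mapping and the power range are assumptions for illustration.
def texture_map_to_power(texture_grid, min_power_w=0.5, max_power_w=3.0):
    """texture_grid: rows of intensities in 0..1 -> grid of laser powers [W]."""
    return [[min_power_w + t * (max_power_w - min_power_w) for t in row]
            for row in texture_grid]

# One row whose intensity rises left to right, as in the gradient example:
# the commanded laser output rises accordingly.
powers = texture_map_to_power([[0.0, 0.5, 1.0]])
```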
  • FIG. 9 is a diagram showing another example of texture designation data.
  • For a single ingredient such as a cake, the texture designation data may be configured as map information representing the texture of each of a plurality of layers.
  • The texture designation data D1 shown in FIG. 9 is information designating the texture of the lowest layer, and the texture designation data D2 is information designating the texture of the second layer from the bottom.
  • The texture designation data D3 is information designating the texture of the third layer from the bottom, and the texture designation data D4 is information designating the texture of the uppermost layer.
  • The texture designation data D1 to D4 each represent the texture of each portion within the corresponding layer.
  • In this way, the user can design the change in texture as a two-dimensional or three-dimensional map and have the cooking device carry it out.
  • The user can also design the texture with the eating direction (the way the food will be eaten) in mind, such as whether the eater is right-handed or left-handed.
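The layered data D1 to D4 can be modeled as a stack of 2-D maps, bottom layer first. A hypothetical sketch follows; the grid shapes and intensity values are assumptions, not taken from the patent.

```python
# Represent layered texture-designation data as a list of 2-D grids,
# bottom layer first (D1 .. D4 in the cake example).
def make_layered_texture(layers):
    """layers: list of 2-D grids of equal shape (bottom first)."""
    rows, cols = len(layers[0]), len(layers[0][0])
    for layer in layers:
        assert len(layer) == rows and all(len(r) == cols for r in layer)
    return layers

d1 = [[0.2, 0.2], [0.2, 0.2]]   # bottom layers: uniformly soft
d4 = [[0.9, 0.9], [0.9, 0.9]]   # top layer: crispier
cake = make_layered_texture([d1, d1, d1, d4])
```

Indexing the stack as `cake[layer][row][col]` gives the designated texture for one portion of one layer, which is what the per-part cooking control needs.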
  • FIG. 10 is a block diagram showing a configuration example of hardware of the information processing terminal 1.
  • The information processing terminal 1 is composed of a computer.
  • A CPU (Central Processing Unit) 101, a ROM (Read Only Memory) 102, and a RAM (Random Access Memory) 103 are interconnected by a bus 104.
  • The CPU 101 loads a program stored in the storage unit 111 into the RAM 103 via the input/output interface 105 and the bus 104 and executes it, whereby various processes such as control of the cooking device 2 are performed.
  • The input/output interface 105 is connected to the bus 104.
  • A microphone 106, a camera 107, an operation unit 108, a display 109, a speaker 110, a storage unit 111, a communication unit 112, and a drive 113 are connected to the input/output interface 105.
  • The microphone 106 detects the user's voice and outputs the voice data to the CPU 101.
  • The camera 107 captures the surrounding situation, including the user's movements.
  • The image data captured by the camera 107 is supplied to the CPU 101.
  • The operation unit 108 is composed of a touch panel provided on the display 109, buttons provided on the housing of the information processing terminal 1, and the like.
  • The operation unit 108 detects the user's operation and outputs information indicating the content of the operation to the CPU 101.
  • The display 109 is composed of an LCD, an organic EL display, or the like.
  • The display 109 displays various screens, such as the screen used for designating the texture, under the control of the CPU 101.
  • The speaker 110 outputs synthesized voice under the control of the CPU 101.
  • The storage unit 111 is composed of a hard disk, a non-volatile memory, or the like.
  • The storage unit 111 stores various information, such as the programs executed by the CPU 101.
  • The communication unit 112 is composed of a network interface and the like.
  • The communication unit 112 communicates with the information processing server 11 over the network 12.
  • The communication unit 112 receives information transmitted from the information processing server 11 and outputs it to the CPU 101, and transmits information supplied from the CPU 101 to the information processing server 11.
  • The drive 113 drives a removable medium 114 to read data from and write data to the removable medium 114.
  • FIG. 11 is a block diagram showing a functional configuration example of the information processing terminal 1.
  • An information processing unit 121 is realized in the information processing terminal 1.
  • The information processing unit 121 is composed of a presentation unit 131, a texture designation data generation unit 132, an inference result acquisition unit 133, a control parameter setting unit 134, an ingredient state detection unit 135, a recipe data generation unit 136, and a control unit 137.
  • At least some of the functional units shown in FIG. 11 are realized by the CPU 101 of FIG. 10 executing a predetermined program.
  • The presentation unit 131 controls the display 109 and presents various types of information to the user. In response to the presentation by the presentation unit 131, the user makes various selections, such as the selection of a texture. Information representing the user's selection results is supplied to the texture designation data generation unit 132.
  • The texture designation data generation unit 132 generates texture designation data according to the user's texture selection, and outputs it to the inference result acquisition unit 133.
  • The inference result acquisition unit 133 controls the communication unit 112 and communicates with the information processing server 11.
  • The inference result acquisition unit 133 transmits the texture designation data to the information processing server 11 and acquires the result of inference performed based on the texture designation data.
  • The information processing server 11 uses an inference model generated by machine learning to infer a cooking method and the like that realize the texture selected by the user.
  • The cooking method is inferred by estimating a cooking method for each part.
  • The inference result acquisition unit 133 outputs the inference result, including the information on the cooking method, to the control parameter setting unit 134 and the recipe data generation unit 136.
  • The design of the ingredient is also selected by the user.
  • The design of the ingredient is the design expressed in the appearance of the ingredient.
  • The user selects a basic design, which is a rough design.
  • Information on the basic design is transmitted to the information processing server 11.
  • In the information processing server 11, candidates for a detailed design, which refines the basic design and can realize the texture selected by the user, are also inferred. If no cooking for a design is to be performed, the user's selections of the basic design and the detailed design may be omitted.
  • Information on the detailed design candidates acquired by the inference result acquisition unit 133 is supplied to the presentation unit 131, and the candidates are presented to the user.
  • The user selects a preferred detailed design from the presented candidates.
  • An inference result including information on the cooking method for the detailed design selected by the user is acquired by the inference result acquisition unit 133 and supplied to the control parameter setting unit 134 and the recipe data generation unit 136.
  • The control parameter setting unit 134 sets control parameters that define the content of the cooking operation for each part of the ingredient, based on the information supplied from the inference result acquisition unit 133, and outputs them to the recipe data generation unit 136.
  • The control parameters output to the recipe data generation unit 136 are used to generate the recipe data.
  • The control parameter setting unit 134 also updates the control parameters based on the sensing of the ingredient's state performed by the ingredient state detection unit 135 during the cooking operation of the cooking device 2, and sends the updated control parameters to the control unit 137.
  • Information indicating the state of the ingredient being cooked in the cooking device 2 is supplied from the ingredient state detection unit 135.
  • The control parameter setting unit 134 outputs the control parameters, adjusted according to the state of the ingredient, to the control unit 137, where they are reflected in the cooking operation of the cooking device 2.
  • In this way, the control parameters are updated in real time according to the state of the ingredient being cooked.
  • For example, the laser output set as a control parameter is corrected according to differences in the absorption rate of the laser light at the portion to be irradiated (the portion to be cooked). As a result, stable cooking is possible even when ingredients in which different components are mixed are the cooking target.
  • The ingredient state detection unit 135 detects the state of the ingredient being cooked.
  • The ingredient state detection unit 135 detects the state of the ingredient being cooked by analyzing images of the ingredient taken by the cooking device 2, or sensor data measured by various sensors including the non-contact thermometer 41.
  • A camera or the like for photographing the ingredient is also provided at a predetermined position of the cooking device 2. The state of the ingredient being cooked may also be detected by analyzing an image taken by the camera 107 (FIG. 10) provided in the information processing terminal 1.
  • For example, the ingredient state detection unit 135 detects the degree of progress of the cooking by laser light.
  • The ingredient state detection unit 135 outputs the sensing result of the ingredient's state to the control parameter setting unit 134.
  • The state of the ingredient may instead be sensed by the cooking device 2, with the sensing result acquired by the ingredient state detection unit 135.
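The absorption-rate correction described above can be sketched as follows: to deposit the same absorbed power into parts with different absorptivity, the commanded laser output is scaled by the inverse of the local absorption rate. The function, its clamping, and all values are assumptions for illustration.

```python
# Correct commanded laser output for the local absorption rate of the
# portion being irradiated, so absorbed power stays roughly constant.
# Clamping to a maximum output is an assumed safety limit.
def corrected_output(target_absorbed_w, absorption_rate, max_output_w=5.0):
    if not 0.0 < absorption_rate <= 1.0:
        raise ValueError("absorption rate must be in (0, 1]")
    return min(max_output_w, target_absorbed_w / absorption_rate)

# A part absorbing only 40% of the light needs 2.5x the commanded output
# to receive the same absorbed power as a fully absorbing part.
out = corrected_output(1.0, 0.4)
```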
  • The recipe data generation unit 136 generates recipe data based on the information supplied from the texture designation data generation unit 132 and the control parameters set by the control parameter setting unit 134. Recipe data is prepared for each dish. In the recipe data, information such as the ingredients used in each cooking process up to the completion of the dish is described.
  • The recipe data generation unit 136 thus functions as a cooking process information generation unit that generates information on each cooking process.
  • A "dish" here means a product completed after cooking. "Cooking" means the process of making a dish and the act (work) of making a dish.
  • FIG. 12 is a diagram showing an example of the description content of the recipe data.
  • One piece of recipe data is composed of a plurality of cooking process data sets.
  • A cooking process data set for cooking process #1, a cooking process data set for cooking process #2, ..., and a cooking process data set for cooking process #N are included.
  • Each cooking process data set includes ingredient information, cooking operation information, and the control parameters described above, as shown in the balloon.
  • Ingredient information is information about the ingredients used in the cooking process.
  • The information about an ingredient includes information indicating the type of the ingredient, the amount of the ingredient, the size of the ingredient, and the like.
  • Ingredients include not only ingredients that have not been cooked at all, but also cooked (prepared) ingredients obtained by performing some cooking.
  • The ingredient information included in the cooking process data set of a given cooking process includes information on the ingredients that have undergone the preceding cooking processes.
  • Cooking operation information is information on the cooking operations for realizing the cooking process.
  • One cooking process data set is composed of time-series data of cooking operation information for realizing one cooking process.
  • For example, the cooking operation information represents the coordinates of the irradiation position of the laser light at each time.
  • The operation of the drive unit 24 is controlled according to the coordinates represented by the cooking operation information.
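The recipe data structure just described (process data sets holding ingredient information, time-series cooking operation information, and control parameters) might be modeled as follows. All class and field names are hypothetical, chosen only to mirror the text.

```python
# Hypothetical data model for recipe data: one recipe = N cooking-process
# data sets, each with ingredients, a laser trajectory over time, and
# control parameters. Field names are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class CookingProcess:
    ingredients: List[str]
    trajectory: List[Tuple[float, float, float]]  # (time_s, x_mm, y_mm)
    control_params: dict

@dataclass
class RecipeData:
    processes: List[CookingProcess] = field(default_factory=list)

recipe = RecipeData()
recipe.processes.append(CookingProcess(
    ingredients=["chocolate (plate)"],
    trajectory=[(0.0, 0.0, 0.0), (1.0, 10.0, 0.0)],  # move 10 mm in 1 s
    control_params={"laser_output_w": 1.5, "scan_mode": "raster"},
))
```

A controller would then iterate over `recipe.processes` one data set at a time, which matches how the control unit 137 selects and executes process data sets below.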
  • The control unit 137 of FIG. 11 communicates with the cooking device 2 by controlling the communication unit 112.
  • The control unit 137 controls the operation of the cooking device 2 based on the recipe data generated by the recipe data generation unit 136.
  • The control unit 137 selects the cooking process data sets described in the recipe data one by one, and controls the operation of the cooking device 2 by transmitting commands according to the content of the cooking operation information included in the selected cooking process data set.
  • The commands transmitted by the control unit 137 also include the control parameters.
  • When the control parameters are updated, the control unit 137 transmits the updated control parameters to the cooking device 2.
  • FIG. 13 is a diagram showing an example of control parameters.
  • The control parameters are composed of, for example, the following information.
  • Pulse width: By shortening the pulse width, the diffusion of heat to regions not irradiated with the laser can be suppressed, and finer processing becomes possible. By lengthening the pulse width, the cumulative irradiation amount increases and processing can be performed in a short time.
  • Pulse frequency: By lowering the pulse frequency, the diffusion of heat to regions not irradiated with the laser can be suppressed, and finer processing becomes possible. By raising the pulse frequency, the cumulative irradiation amount increases and processing can be performed in a short time.
  • Spot diameter [nm/μm]: By reducing the spot diameter, finer processing becomes possible. By increasing the spot diameter, a wide area can be processed in a short time. The spot diameter is set by adjusting the focal length.
  • Scan mode (raster scan / vector scan):
  • In some cases, the vector scan mode is selected.
  • Vector scanning enables machining in a short time.
  • In other cases, the raster scan mode is selected. Raster scanning makes it possible to draw various patterns.
  • Scan speed [mm/s]: By slowing the scan speed, more heat can be applied and larger holes can be made. By raising the scan speed, processing can be performed in a shorter time.
  • DPI (dots per inch)
  • Pre-irradiation temperature [°C]: The pre-irradiation temperature is the temperature before laser irradiation (during preheating).
  • Irradiation temperature [°C]: The irradiation temperature is the set temperature when feedback control of the output by the temperature sensor (the non-contact thermometer 41) is used.
  • Post-irradiation temperature [°C]: The post-irradiation temperature is the temperature after laser irradiation (during cooling).
  • Fan ON/OFF:
  • The fan 42 is used (turned on) when it is necessary to exhaust smoke or debris during laser irradiation, or when cooling the ingredient.
  • The fan is turned off when the ingredient liquefies on heating and its surface could be deformed by the airflow.
  • The control parameters include various information that defines the characteristics of the laser light.
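The parameters enumerated above can be collected into a single structure, as in the sketch below. The field names and units follow the text, but the default values are illustrative assumptions, not values from the patent.

```python
# Hedged sketch of a control-parameter record for the laser cooking device.
# Defaults are illustrative assumptions; units follow the text.
from dataclasses import dataclass

@dataclass
class ControlParams:
    pulse_width_s: float = 1e-6
    pulse_frequency_hz: float = 1e3
    spot_diameter_mm: float = 1.0        # >= 1 mm to avoid ablation (see text)
    scan_mode: str = "raster"            # "raster" or "vector"
    scan_speed_mm_s: float = 10.0
    dpi: int = 300
    pre_irradiation_temp_c: float = 20.0
    irradiation_temp_c: float = 35.0     # set point for feedback control
    post_irradiation_temp_c: float = 25.0
    fan_on: bool = False

# Example: fast vector scanning with the other fields left at defaults.
params = ControlParams(scan_mode="vector", scan_speed_mm_s=50.0)
```

Such a record is what the cooking process data sets above would carry, one per process (or per part of the ingredient).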
  • To realize a "rough" texture, for example, the following control parameters are set.
  • Laser wavelength: Short (ultraviolet, blue-violet, etc.)
  • Pulse width: Short (on the order of nanoseconds)
  • Pulse frequency: Low (a few kHz)
  • Spot diameter: Small (a few μm)
  • Scan mode: Raster scan; repeat count set to a predetermined number of times
  • For the remaining parameters, predetermined values are set. With such settings, a "rough" texture can be realized.
  • The degree of "roughness" can be adjusted by adjusting the output and changing the amount of charring.
  • To realize a "melty/sticky" texture, the following control parameters are set.
  • These settings target an ingredient that melts when heated, such as chocolate or cheese.
  • Laser wavelength: Far infrared
  • Spot diameter: Wide
  • Irradiation temperature: Kept constant by feedback control
  • Laser output: Low
  • The degree of "melting/stickiness" can be adjusted, for example, by adjusting the irradiation temperature and the post-irradiation temperature.
  • In the case of chocolate, the cocoa butter can be brought into a crystalline state with a low melting point (types I to IV) by adjusting the post-irradiation temperature. This makes it possible to maintain the melted state even at room temperature.
  • The following control parameters are set for an ingredient that melts by heating, such as chocolate or cheese.
  • Laser wavelength: Far infrared
  • Spot diameter: Wide
  • Irradiation temperature: Kept constant by feedback control
  • Laser output: Low
  • The surface of the food can be melted without burning, and then cooled so that the surface flattens to a mirror finish and solidifies.
  • If the fan 42 is used, waves are generated on the liquid surface and flattening becomes difficult, so the fan 42 is turned off.
  • A method of forming innumerable small holes in the surface of the foodstuff is used, by setting the following control parameters.
  • DPI: Set appropriately
  • The degree of "crispness" can be adjusted by, for example, changing the number of repetitions to change the depth of the holes, or changing the DPI to change the number of holes.
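The texture-to-parameter presets described above can be sketched as data, for example as follows; every field name and value here is a hypothetical illustration, not the patent's actual parameter set:

```python
# Hypothetical illustration of the texture-to-control-parameter presets
# described in the text. The field names and values are assumptions.

TEXTURE_PRESETS = {
    "rough": {
        "wavelength": "short (ultraviolet / blue-violet)",
        "pulse_width": "nanosecond",
        "spot_diameter_um": 5,
        "scan_mode": "raster",
    },
    "melting": {
        "wavelength": "far infrared",
        "spot_diameter": "wide",
        "temperature_feedback": True,
        "output": "low",
    },
    "crispy": {
        "dpi": 300,          # hole density: more holes per inch -> crispier
        "repeat_count": 3,   # deeper holes with more repetitions
    },
}

def adjust_crispness(preset: dict, level: int) -> dict:
    """Scale hole depth (repeat count) and hole density (DPI) with a level 1-10."""
    p = dict(preset)  # shallow copy so the preset itself is left unchanged
    p["repeat_count"] = preset["repeat_count"] * level
    p["dpi"] = preset["dpi"] + 50 * level
    return p
```

Raising the level deepens the holes (more repetitions) and packs them more densely (higher DPI), which is how the text says "crispness" is adjusted.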
  • The information processing server 11 has basically the same hardware configuration as the information processing terminal 1 shown in FIG. 10.
  • Hereinafter, the configuration shown in FIG. 10 will be referred to, as appropriate, as the configuration of the information processing server 11.
  • FIG. 14 is a block diagram showing a functional configuration example of the information processing server 11.
  • the information processing unit 151 is realized in the information processing server 11.
  • The information processing unit 151 is composed of an input information acquisition unit 161, an inference unit 162, a learning unit 163, and a texture DB 164. At least some of the functional units shown in FIG. 14 are realized by the CPU 101 of FIG. 10, constituting the information processing server 11, executing a predetermined program.
  • the input information acquisition unit 161 controls the communication unit 112 and communicates with the information processing terminal 1.
  • the input information acquisition unit 161 receives the information transmitted from the information processing terminal 1 and outputs it to the inference unit 162.
  • the information processing terminal 1 transmits texture designation data representing the texture selected by the user and information on the basic design.
  • the inference unit 162 infers the detailed design, the cooking method, etc. that realize the texture selected by the user by using the information supplied from the input information acquisition unit 161 as the input to the inference model.
  • FIG. 15 is a diagram showing an example of an inference model.
  • an inference model M1 composed of a neural network or the like is prepared in advance in the inference unit 162.
  • the inference model M1 prepared in the inference unit 162 is generated by, for example, performing machine learning based on the texture information stored in the texture DB 164.
  • The inference unit 162 inputs the texture designation data and the basic design selected by the user into the inference model, based on the information supplied from the input information acquisition unit 161, and obtains information on the ingredients, the cooking method, and the detailed design as the output. For example, a plurality of types of detailed designs corresponding to the basic design are output.
  • the ingredients acquired as the output of the inference model are the ingredients used to achieve the target texture.
  • the ingredient used in combination with the main ingredient may be output as the ingredient used to realize the target texture.
  • The cooking method represents cooking operations for realizing each cooking process of the cooking device 2. The cooking method indicates how to drive each part of the cooking apparatus 2 to achieve the target texture.
  • The cooking method also includes, for example, the control parameters.
  • Such an inference model is prepared, for example, for each ingredient to be cooked and for each cooking device controlled by the information processing terminal 1.
  • When the inference model is prepared for each cooking device, the inference of the cooking method, including the control parameters, is performed according to the specifications of that cooking device.
  • An inference model may be prepared for each basic design selected by the user.
  • the inference unit 162 transmits the output of the inference model as an inference result to the information processing terminal 1.
  • the inference unit 162 functions as a texture inference device that infers the cooking method and the like from the texture and infers the detailed design from the texture and the basic design.
  • FIG. 16 is a diagram showing another example of the inference model.
  • the same inference as the inference using the inference model M1 shown in FIG. 15 may be performed using a plurality of inference models.
  • the inference model M11 shown in FIG. 16 is a model in which texture designation data is input and ingredients and a cooking method that realize the texture specified by the texture designation data are output.
  • the inference model M11 is generated, for example, by learning using texture information described later.
  • the inference model M12 is a model that inputs the texture designation data and the basic design and outputs the detailed design that realizes the texture specified by the texture designation data.
  • The inference model M12 is generated by preparing a plurality of detailed designs based on the basic design and performing learning using information on the subjective texture obtained when eating the ingredients prepared with each detailed design.
  • The learning unit 163 of FIG. 14 performs learning based on the texture information stored in the texture DB 164 and generates the parameters constituting the inference models. Based on the parameters generated by the learning unit 163, inference models such as those shown in FIGS. 15 and 16 are prepared in the inference unit 162.
  • FIG. 17 is a diagram showing an example of texture information.
  • the texture information shown in FIG. 17 is the texture information of chocolate. Based on the texture information of chocolate, for example, an inference model for chocolate is learned.
  • One piece of texture information associates the ingredient, the state, the serving temperature, the laser cooking process, and numerical values representing the texture.
  • As the numerical values representing the texture, values for "melting", "fluffy", and "crispy" are associated. More types of numerical values representing the texture may be included in the texture information.
  • In the texture information at the top, the ingredient is "white chocolate", the state is "liquid", the serving temperature is "40°C", the laser cooking process is "output 20%, speed 30 mm/s, no cooling", the numerical value representing "melting" is "90", the numerical value representing "fluffy" is "10", and the numerical value representing "crispy" is "0".
  • Texture information consisting of such information is generated by, for example, having a person taste ingredients cooked in various ways using the same device as the cooking device 2 and express the subjective texture numerically, for each ingredient and each basic design.
  • the texture DB 164 stores texture information corresponding to various ingredients and various basic designs.
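One record of the texture information in FIG. 17 could be represented, for example, as follows; the field names are assumptions based on the description, not the actual schema of the texture DB 164:

```python
from dataclasses import dataclass

# A minimal sketch of one texture-information record as described for FIG. 17.
# Field names are assumptions inferred from the text.

@dataclass
class TextureInfo:
    ingredient: str          # e.g. "white chocolate"
    state: str               # e.g. "liquid"
    serving_temp_c: float    # e.g. 40.0
    laser_process: str       # e.g. "output 20%, speed 30 mm/s, no cooling"
    melting: int             # subjective score, here on a 0-100 scale
    fluffy: int
    crispy: int

white_choc = TextureInfo("white chocolate", "liquid", 40.0,
                         "output 20%, speed 30 mm/s, no cooling", 90, 10, 0)
```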
  • The learning of the neural network is performed using the respective numerical values of "melting", "fluffy", and "crispy" as the input and the ingredients and the laser cooking process as the output. The inference model M11 is thereby generated.
  • The laser cooking process inferred using the inference model corresponds to the cooking method provided to the information processing terminal 1 as the inference result.
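The patent generates inference model M11 by neural-network learning; purely as a dependency-free stand-in (not the patent's method), the following sketch returns the stored record whose texture scores are closest to the requested values, yielding an ingredient and laser cooking process for a target texture:

```python
# Hedged stand-in for inference model M11: nearest-neighbor lookup over a toy
# texture DB. The records and score values are illustrative assumptions.

TEXTURE_DB = [
    {"ingredient": "white chocolate",
     "process": "output 20%, speed 30 mm/s, no cooling",
     "scores": (90, 10, 0)},   # (melting, fluffy, crispy)
    {"ingredient": "dark chocolate",
     "process": "output 60%, speed 10 mm/s, fan on",
     "scores": (10, 5, 85)},
]

def infer(melting: int, fluffy: int, crispy: int) -> dict:
    """Return the record whose texture scores are closest (squared distance)."""
    target = (melting, fluffy, crispy)
    return min(TEXTURE_DB,
               key=lambda r: sum((a - b) ** 2 for a, b in zip(r["scores"], target)))
```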
  • the texture information representing the texture obtained by cooking by combining a plurality of ingredients may be used for learning.
  • When the texture information is insufficient, the relationship between the texture and the other information may be modeled in the form of a linear model or the like, and the missing texture information may be generated by linear interpolation.
  • the texture information itself may be inferred using the inference model generated by machine learning, and the insufficient texture information may be generated.
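The linear interpolation mentioned above can be sketched as follows; the sample values (a "melting" score of 90 at 20% output and 30 at 60% output) are purely illustrative assumptions:

```python
# Generating a missing texture score by linear interpolation, as the text
# suggests for a sparse texture DB. Values below are illustrative only.

def interpolate_score(x0: float, y0: float, x1: float, y1: float, x: float) -> float:
    """Linearly interpolate a texture score y at laser output x, given two
    measured points (x0, y0) and (x1, y1)."""
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# "melting" measured as 90 at 20% output and 30 at 60% output;
# estimate the score at the unmeasured 40% output:
estimated = interpolate_score(20, 90, 60, 30, 40)
```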
  • In step S1, the presentation unit 131 (FIG. 11) displays the selection screen of the basic design corresponding to the foodstuff to be cooked on the display 109.
  • FIG. 19 is a diagram showing an example of a selection screen of the basic design.
  • an image showing a sample of the basic design is displayed.
  • images 201 to 203 representing three types of basic design samples are displayed.
  • the user can select his / her favorite basic design by selecting any of the images 201 to 203.
  • In step S2, the presentation unit 131 displays the texture selection screen on the display 109.
  • FIG. 20 is a diagram showing an example of a texture selection screen.
  • Image 203 showing the selected basic design is displayed in the upper center of the texture selection screen.
  • a slide bar 211 used for selecting a numerical value of texture is displayed for each type of texture. By operating the slide bar 211, the user can select a desired numerical value for each type of texture.
  • the texture of each part of the food material may be selected by using the display on the slide bar 211.
  • FIG. 21 is a diagram showing another example of the texture selection screen.
  • The foodstuff image 221, which shows the foodstuff to be cooked, is displayed substantially in the center of the screen.
  • A grid is superimposed on the foodstuff image 221. The user can select the texture of each part of the foodstuff by selecting an arbitrary part formed by the grid and then selecting a numerical value representing the texture of the selected part using the display of the slide bar 211.
  • Templates of textures may be prepared so that the user does not have to select each part to which a texture is assigned. In this case, the user can select the texture of each part of the foodstuff by selecting a template.
  • FIG. 22 is a diagram showing an example of a texture template.
  • A in FIG. 22 represents a template in which the texture changes in the left-right direction.
  • B in FIG. 22 represents a template in which the texture changes in the vertical direction.
  • C in FIG. 22 represents a template in which the texture changes from the center outward.
  • the texture of each part of the ingredients may be selected by voice instead of by operating the screen display.
  • In step S3, the texture designation data generation unit 132 generates texture designation data based on the texture selected by the user for each part of the foodstuff. The generated texture designation data is output to the inference result acquisition unit 133.
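Texture designation data for per-part selection might be structured, for example, as a grid of score dictionaries laid over the food image; the structure and score names here are assumptions, not the patent's actual data format:

```python
# Hedged sketch of per-part texture designation data: one score dictionary per
# grid cell, mirroring the grid superimposed on the foodstuff image.

def make_texture_designation(rows: int, cols: int, default: dict) -> list:
    """Build a rows x cols grid of per-part texture scores (one dict per cell)."""
    return [[dict(default) for _ in range(cols)] for _ in range(rows)]

grid = make_texture_designation(2, 3, {"melting": 50, "fluffy": 0, "crispy": 20})
grid[0][2]["crispy"] = 90   # the user raises "crispy" for the top-right part only
```

Because each cell gets its own copy of the default scores, changing one part leaves the others untouched, which is what per-part designation requires.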
  • In step S4, the inference result acquisition unit 133 transmits the texture designation data supplied from the texture designation data generation unit 132 to the information processing server 11 and acquires the inference result. The result of the inference performed in the information processing server 11 based on the texture designation data is transmitted from the information processing server 11.
  • the inference result includes the ingredients and the cooking method that realize the target texture.
  • the inference result acquisition unit 133 transmits the information of the basic design selected by the user to the information processing server 11 and acquires the candidate of the detailed design corresponding to the basic design.
  • In step S5, the presentation unit 131 presents the candidates for the detailed design.
  • FIG. 23 is a diagram showing an example of a presentation screen of a candidate for detailed design.
  • An image showing a sample of detailed design is displayed on the presentation screen.
  • images 231 to 236 representing six types of detailed design samples are displayed. The user can select his / her favorite detailed design by selecting any of the images 231 to 236.
  • In step S6, the control parameter setting unit 134 sets the control parameters based on the inference result supplied from the inference result acquisition unit 133 and the detailed design selected by the user.
  • In step S7, the recipe data generation unit 136 generates recipe data based on the control parameters set by the control parameter setting unit 134.
  • In step S8, the control unit 137 controls the operation of the cooking device 2 based on the recipe data generated by the recipe data generation unit 136.
  • a cooking operation for realizing a texture designated by the user is performed according to the control by the control unit 137.
  • the control by the control unit 137 is continued until the cooking based on the recipe data is completed.
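The steps S1 through S8 above can be condensed into a hedged pipeline sketch; every value and helper here is a hypothetical stand-in for the corresponding unit (presentation unit 131, control parameter setting unit 134, and so on), not the actual implementation:

```python
# Illustrative end-to-end sketch of the automatic cooking flow S1-S8.
# All concrete values are placeholder assumptions.

def automatic_cooking(user_texture: dict, basic_design: str) -> list:
    # S1-S3: present selection screens and collect the texture designation data
    designation = {"texture": user_texture, "design": basic_design}
    # S4: the server would infer ingredient and cooking method from `designation`;
    # fixed values stand in for the inference result here
    inference = {"ingredient": "chocolate",
                 "method": {"output_pct": 20, "speed_mm_s": 30}}
    # S5-S6: the user picks a detailed design and control parameters are set
    detailed_design = designation["design"] + "-variant-1"
    params = {**inference["method"], "design": detailed_design}
    # S7: recipe data is generated from the control parameters
    recipe = [("irradiate", params)]
    # S8: the recipe would drive the cooking device 2
    return recipe
```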
  • In step S11, the inference unit 162 inputs the texture designation data supplied from the input information acquisition unit 161 into the inference model, and acquires the inference result of the ingredients, the cooking method, and the detailed design.
  • In step S12, the inference unit 162 transmits the output of the inference model as the inference result to the information processing terminal 1.
  • As described above, by specifying a desired texture in words or numerical values, the user can receive a proposal for the final form (detailed design) of the foodstuff that realizes that texture.
  • The user can also generate recipe data describing the ingredients, the cooking method, and the like necessary for cooking that achieves the desired texture, and, by having the cooking apparatus 2 cook based on such recipe data, can actually receive the foodstuff with the desired texture.
  • Candidates for ingredients to be cooked may be selected by the information processing server 11 and presented to the user. For example, the user specifies a desired texture, and selects a favorite ingredient from the candidates for the ingredient presented according to the designation of the texture.
  • the ingredients to be cooked may be automatically selected by the information processing server 11.
  • The information processing server 11 uses the texture inference device described above to infer design candidates capable of realizing the specified texture, and presents them to the user.
  • It is also possible to select a plurality of ingredients as the ingredients to be cooked.
  • In this case, the ratio of each ingredient (such as 6:4) is also selected.
  • the processing after the ratio of ingredients and the design are selected is the same as the processing described above. That is, the recipe data is generated in the information processing terminal 1 based on the inference result of the cooking method. Further, the cooking operation of the cooking apparatus 2 is controlled based on the recipe data, and the texture specified by the user is realized.
  • Tempering temperature control operation
  • Chocolate tempering is a temperature operation in which chocolate melted at 50°C or higher is cooled while stirring, held at 27°C for a certain period of time, and then raised to about 31°C.
  • The structure of chocolate is such that fine particles of sugar and cacao solids are dispersed in cocoa butter fats and oils, with the fats and oils forming the continuous phase. Chocolate solidifies when the continuous-phase fats and oils crystallize, and becomes liquid when they melt. Cocoa butter crystals exhibit polymorphism, with six polymorphs called type I to type VI; the type most suitable for chocolate products is type V. It is necessary to selectively crystallize only the type V polymorph out of the six that cocoa butter has.
  • Tempering produces V-type crystal nuclei of cocoa butter in liquid chocolate, and subsequent cooling crystallizes all cocoa butter molecules as V-type.
  • the cocoa butter molecule crystallizes as a V-type, which makes the chocolate smooth.
  • the tempering temperature varies depending on the proportion of sugar and milk powder added to the chocolate product.
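The tempering temperature operation described above can be written as a simple schedule; the phase names and data layout are assumptions, while the target temperatures follow the description (melt at 50°C or higher, hold at 27°C, raise to about 31°C):

```python
# The tempering operation as a simple temperature schedule. Phase names are
# illustrative assumptions; real temperatures vary with sugar and milk powder
# content, as noted in the text.

TEMPERING_SCHEDULE = [
    ("melt",  50.0, "heat while stirring until fully liquid"),
    ("cool",  27.0, "cool while stirring and hold to seed stable crystal nuclei"),
    ("raise", 31.0, "re-warm to melt unstable (type I-IV) crystals, keeping type V"),
]

def target_temperature(phase: str) -> float:
    """Look up the target temperature [°C] for a tempering phase."""
    return next(t for name, t, _ in TEMPERING_SCHEDULE if name == phase)
```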
  • Typical tempering methods are the manual method on a marble top plate and the method using a dedicated device called a tempering machine, but neither method can be used for chocolate that has already been molded. Also, they can only be applied to the chocolate as a whole.
  • With the cooking apparatus 2, it is possible to temper an already molded chocolate product by partially irradiating it with a laser beam. Specifically, the fat content (cocoa butter) in only the area irradiated with the laser beam is selectively melted and then cooled; by repeating partial heating and cooling with the laser beam as necessary, it is possible to change the fat content of only a selected portion of the chocolate product to an arbitrary crystalline state.
  • a fan 42 or the like may be used to heat or cool the entire chocolate.
  • FIG. 25 is a diagram showing an example of a control device.
  • In the example of FIG. 25, the information processing unit 121 of FIG. 11 is provided in the information processing server 11.
  • In the information processing server 11, inferences such as that of the cooking method are performed according to the texture selected by the user, and recipe data is generated. Further, the cooking apparatus 2 is controlled via the network 12 according to the description of the recipe data.
  • Although the cooking device to be controlled has been described as the cooking device 2, which cooks by irradiating a laser beam, the above-mentioned control may be performed for another cooking device.
  • FIG. 26 is a diagram showing another configuration example of the control system.
  • In the configuration of FIG. 26, a heating cooker 301 such as a microwave oven is provided instead of the cooking device 2.
  • The heating cooker 301 performs a cooking operation (heating) according to commands supplied from the information processing unit 121, and cooks.
  • As the control parameters, for example, information indicating the heating intensity, the heating time, and changes in heating is used.
  • In this way, it is possible to use recipe data to control various devices that automatically perform cooking operations.
  • It is also possible to control, based on recipe data, various cooking devices such as a 3D printer that molds ingredients and an arm robot that cooks using an arm.
  • Cooking may be performed using a cooking device 2 in which a plurality of laser heads 34 are arranged in an array or a plane. This makes it possible to efficiently cook and to bake the whole food.
  • Although the case where the ingredient to be cooked is chocolate has mainly been described, it is also possible to have the cooking apparatus 2 cook various other ingredients. For example, by irradiating a raw egg whose shell has been broken with a laser beam, it is possible to engrave characters on the yolk portion. Further, by irradiating a banana peel with a laser beam, it is possible to cook the banana so that characters appear after an arbitrary time elapses due to oxidation of the peel.
  • the series of processes described above can be executed by hardware or software.
  • the programs constituting the software are installed on a computer embedded in dedicated hardware, a general-purpose personal computer, or the like.
  • the installed program is recorded and provided on the removable media 114 shown in FIG. 10, which consists of an optical disk (CD-ROM (Compact Disc-Read Only Memory), DVD (Digital Versatile Disc), etc.), a semiconductor memory, or the like. It may also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital broadcasting.
  • the program can be pre-installed in the ROM 102 or the storage unit 111.
  • The program executed by the computer may be a program in which processing is performed in chronological order according to the order described in this specification, or a program in which processing is performed in parallel or at a necessary timing, such as when a call is made.
  • Here, the system means a set of a plurality of components (devices, modules (parts), etc.), regardless of whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
  • this technology can have a cloud computing configuration in which one function is shared by a plurality of devices via a network and jointly processed.
  • each step described in the above flowchart can be executed by one device or shared by a plurality of devices.
  • When one step includes a plurality of processes, the plurality of processes included in that one step can be executed by one device or shared and executed by a plurality of devices.
  • An information processing device including a parameter setting unit that sets parameters that specify cooking operations by a cooking device based on texture specification data that specifies the texture of each part of the food to be cooked.
  • the information processing device according to (1) wherein the parameter setting unit sets the parameters according to the specifications of the cooking device.
  • The information processing apparatus according to any one of (1) to (3) above, wherein the parameter setting unit sets the parameters based on a cooking method inferred by using the texture designation data as the input of an inferencer generated by learning based on texture information including information on cooking methods and information on the texture of ingredients after cooking by each cooking method.
  • The information processing device described above, wherein the inferencer is configured by a neural network that takes the texture designation data as input and outputs a cooking method that realizes the texture represented by the texture designation data.
  • the information processing apparatus according to any one of (1) to (5) above, further comprising a presentation unit that presents a selection screen used for texture selection.
  • the presenting unit presents design candidates for the ingredients that realize the selected texture according to the specifications of the cooking device.
  • the information processing device according to (6) above, wherein the parameter setting unit sets the parameters according to the selected candidate for the design.
  • the information processing device adjusts the parameters according to the state of the foodstuff.
  • the texture designation data is data for designating the texture of the food material, which is different for each portion.
  • the texture designation data is described in any one of (1) to (9) above, which is data for designating the texture of each part on the plane of the food material or the texture of each part on the three-dimensional space.
  • The information processing device according to any one of (1) to (10) above, wherein the parameter setting unit sets the parameters that define the characteristics of the laser light with which the foodstuff is irradiated.
  • (12) The information processing apparatus according to any one of (1) to (11) above, wherein the parameter setting unit sets the parameters that define the content of each cooking operation included in the cooking process, the information processing apparatus further comprising a cooking process information generation unit that generates cooking process information, including the parameters, as information related to the cooking operation.
  • a texture designation data generation unit that generates the texture designation data according to the texture selected by the user.
  • An information processing method in which an information processing device sets parameters that specify the cooking operation performed by a cooking device, based on texture designation data that designates the texture of each part of the foodstuff to be cooked.
  • (15) A program for causing a computer to execute a process of setting parameters that specify the cooking operation performed by a cooking device, based on texture designation data that designates the texture of each part of the foodstuff to be cooked.


Abstract

The technology of the present invention relates to an information processing device, an information processing method, and a program by which it is possible for a desired food texture to easily be realized. The information processing device according to one aspect of the present technology sets a parameter defining a cooking operation to be performed by a cooking appliance, on the basis of food texture designation data which designates a food texture for each portion of a food to be cooked. The present technology can be applied to a computer for controlling cooking performed by a cooking robot.

Description

Information processing device, information processing method, and program

 The present technology relates particularly to an information processing device, an information processing method, and a program that make it possible to easily realize a desired texture.

 Techniques are being studied for reproducing, on a cooking robot, a dish prepared by a cook, by sensing the movement of the cook during cooking and saving and transmitting the sensing result data. The cooking operation by the cooking robot is performed so as to realize, for example, the same movement as that of the cook's hands based on the sensing result.

 In addition to reproducing cooking operations, various techniques for reproducing the actual texture are also being studied.

 For example, Patent Document 1 describes a technique for reproducing the texture of an original food by combining a plurality of taste elements based on numerical data related to the texture. It also describes forming, with a 3D printer, a food having the same shape as the original food, based on shape data obtained by reading the shape of the food with a 3D scanner.

JP-A-2017-163916

 It would be convenient if a texture expressed in words by a user such as a cook could be reproduced.

 The present technology has been made in view of such a situation, and makes it possible to easily realize a desired texture.

 An information processing device according to one aspect of the present technology includes a parameter setting unit that sets parameters specifying the cooking operation performed by a cooking device, based on texture designation data that designates the texture of each part of the foodstuff to be cooked.

 In one aspect of the present technology, parameters specifying the cooking operation performed by a cooking device are set based on texture designation data that designates the texture of each part of the foodstuff to be cooked.
Block diagram showing a configuration example of a cooking system according to an embodiment of the present technology.
Diagram showing an installation example of the information processing terminal and the cooking device.
Diagram showing an example of operating the cooking device using the information processing terminal.
Diagram showing an example of control of the cooking device.
Enlarged perspective view showing the appearance of the cooking device.
Diagram showing an example of laser light irradiation.
Diagram showing an example of an ingredient after cooking.
Diagram showing an example of texture designation data.
Diagram showing another example of texture designation data.
Block diagram showing a hardware configuration example of the information processing terminal.
Block diagram showing a functional configuration example of the information processing terminal.
Diagram showing an example of the description contents of recipe data.
Diagram showing an example of control parameters.
Block diagram showing a functional configuration example of the information processing server.
Diagram showing an example of an inference model.
Diagram showing another example of an inference model.
Diagram showing an example of texture information.
Flowchart explaining the automatic cooking process of the information processing terminal.
Diagram showing an example of the basic design selection screen.
Diagram showing an example of the texture selection screen.
Diagram showing another example of the texture selection screen.
Diagram showing an example of texture templates.
Diagram showing an example of the presentation screen of detailed design candidates.
Flowchart explaining the inference processing of the information processing server.
Diagram showing an example of the control device.
Diagram showing another configuration example of the control system.
Hereinafter, modes for implementing the present technology will be described. The explanation will be given in the following order.
1. Cooking system configuration
2. Configuration of each device
3. Operation of each device
4. Selection of the main ingredient
5. Application example for tempering
6. Others
<Cooking system configuration>

 FIG. 1 is a diagram showing a configuration example of a cooking system according to an embodiment of the present technology.
 The cooking system of FIG. 1 is configured by connecting the information processing terminal 1 and the cooking device 2 via wireless communication. The communication between the information processing terminal 1 and the cooking device 2 may instead be performed via a wire. The information processing terminal 1 is connected to the information processing server 11 via a network 12 such as the Internet.

 In the example of FIG. 1, the information processing terminal 1 is a tablet terminal provided with a display on the front of the housing. The information processing terminal 1 detects user operations on buttons and the like displayed on the display, and controls the operation of the cooking device 2 in cooperation with the information processing server 11 as appropriate. The information processing terminal 1 functions as an information processing device that controls the operation of the cooking device 2 and causes the cooking device 2 to perform cooking.

 In the example of FIG. 1, the information processing terminal 1 is a tablet terminal, but the information processing terminal 1 may instead be configured by another type of device having a display, such as a PC, a TV, or a smartphone.

 The cooking device 2 has drive-system devices and various sensors, and is a device that cooks using ingredients. Ingredients include vegetable ingredients such as vegetables and fruits, animal ingredients such as meat and fish, processed ingredients, seasonings, and beverages such as water and liquor. Not only solid ingredients but also powdered ingredients are included.

 As shown in FIG. 2, the information processing terminal 1 and the cooking device 2 constituting the cooking system are installed in a kitchen where a cook such as a chef works. The users of the information processing terminal 1 and the cooking device 2 are the chef and the people who support the chef. Although only one chef is shown in FIG. 2, many people other than the chef are present in the kitchen where the information processing terminal 1 and the cooking device 2 are installed. The information processing terminal 1 may also be operated by a person other than the chef.

 In the example of FIG. 2, the information processing terminal 1 is located near the chef who is cooking, and the cooking device 2 is installed on a table at a distance. The chef operates the cooking device 2 using the information processing terminal 1 and causes the cooking device 2 to cook.
 図3は、情報処理端末1を用いた調理装置2の操作の例を示す図である。 FIG. 3 is a diagram showing an example of operation of the cooking apparatus 2 using the information processing terminal 1.
 図3の上段に示すように、調理を行っているシェフが、次に使う食材であるチョコレートの食感として、周りがふわふわな食感を望んでいるものとする。例えば、周りがふわふわな食感のチョコレートがデザートの飾り付けなどに用いられる。 As shown in the upper part of Fig. 3, it is assumed that the chef who is cooking wants a fluffy texture as the texture of chocolate, which is the next ingredient to be used. For example, chocolate with a fluffy texture is used to decorate desserts.
 この場合、シェフは、情報処理端末1のディスプレイに表示される画面を操作するなどして、自分が望んでいる食感を情報処理端末1に入力する。食感の入力が音声によって行われるようにしてもよい。 In this case, the chef inputs the desired texture to the information processing terminal 1 by operating the screen displayed on the display of the information processing terminal 1. The texture may be input by voice.
 チョコレートの食感が指定された場合、情報処理端末1は、図3の下段に示すように、指定された食感を実現するためのレシピデータを生成する。レシピデータは、調理装置2を制御することに用いられる情報である。後に詳述するように、レシピデータには、調理対象とする食材の情報、調理装置2の調理動作の情報、および、調理装置2を制御するための制御パラメータが、調理工程毎に記述される。 When the texture of chocolate is specified, the information processing terminal 1 generates recipe data for realizing the specified texture, as shown in the lower part of FIG. 3. The recipe data is information used to control the cooking apparatus 2. As will be described in detail later, the recipe data describes, for each cooking process, information on the ingredients to be cooked, information on the cooking operation of the cooking device 2, and control parameters for controlling the cooking device 2.
 図4に示すように、レシピデータの記述に従って調理装置2が制御され、調理装置2において調理動作が行われる。調理装置2において調理動作が行われることにより、調理対象となっている例えばチョコレートの食感は、シェフにより指定された食感のものとなる。 As shown in FIG. 4, the cooking device 2 is controlled according to the description of the recipe data, and the cooking operation is performed in the cooking device 2. By performing the cooking operation in the cooking apparatus 2, the texture of chocolate, for example, to be cooked becomes that of the texture specified by the chef.
 このように、図1の調理システムにおいては、調理対象とする食材の食感がユーザにより指定され、ユーザにより指定された食感を実現するような調理が調理装置2により行われる。ユーザは、食感を言語や数値で指定することにより、希望する食感の食材を容易に得ることができる。後述するように、食感の指定は、「ふわふわ」、「かりかり」、「ぷるんぷるん」、「Melty」、「Crispy」などの言語で表される食感に対して、それぞれの数値を選択することによって行われる。 As described above, in the cooking system of FIG. 1, the texture of the ingredient to be cooked is specified by the user, and the cooking device 2 performs cooking so as to realize the texture specified by the user. By specifying the texture in words or numerical values, the user can easily obtain an ingredient having the desired texture. As will be described later, the texture is specified by selecting a numerical value for each texture expressed in words such as "fluffy", "karikari", "purunpurun", "Melty", and "Crispy".
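As a rough, non-authoritative sketch (not part of the patent disclosure), the texture specification described above can be modeled as a set of named texture axes, each paired with a user-selected numerical intensity. The axis names, the 0-100 scale, and the function below are all hypothetical:

```python
# Hypothetical model of a texture specification: each texture word
# ("fluffy", "crispy", ...) is paired with a user-selected intensity.
# The 0-100 scale and the axis names are assumptions for illustration.

TEXTURE_AXES = ("fluffy", "crispy", "melty", "chewy")

def make_texture_spec(**values):
    """Build a texture specification from per-axis intensities (0-100)."""
    spec = {axis: 0 for axis in TEXTURE_AXES}
    for axis, value in values.items():
        if axis not in TEXTURE_AXES:
            raise ValueError(f"unknown texture axis: {axis}")
        if not 0 <= value <= 100:
            raise ValueError(f"intensity out of range: {value}")
        spec[axis] = value
    return spec

spec = make_texture_spec(fluffy=80, crispy=20)
print(spec)  # {'fluffy': 80, 'crispy': 20, 'melty': 0, 'chewy': 0}
```

In the described system such a specification would be entered through the selection screens and sent to the information processing server; here it is only a plain dictionary.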
 図5は、調理装置2の外観を拡大して示す斜視図である。 FIG. 5 is an enlarged perspective view showing the appearance of the cooking apparatus 2.
 図5に示すように、調理装置2は、調理対象の食材にレーザー光を照射して調理を行うレーザー調理装置である。 As shown in FIG. 5, the cooking device 2 is a laser cooking device that irradiates a cooking object with a laser beam to cook.
 調理装置2は、基本的に、本体部21に対して、テーブル状の調理台22と、XYステージ等で構成される駆動部24が取り付けられることによって構成される。図5の例においては、略正方形の板状のチョコレートCが調理対象の食材として調理台22に置かれている。立体的なチョコレートが調理対象の食材として用いられるようにしてもよい。 The cooking device 2 is basically configured by attaching a table-shaped countertop 22 and a drive unit 24 composed of an XY stage or the like to the main body 21. In the example of FIG. 5, a substantially square plate-shaped chocolate C is placed on the countertop 22 as the ingredient to be cooked. Three-dimensional chocolate may also be used as the ingredient to be cooked.
 本体部21は、情報処理端末1との間で通信を行い、情報処理端末1による制御に従って各部の動作を制御する。情報処理端末1からは、調理動作に関する制御パラメータが送信されてくる。情報処理端末1から送信されてきた制御パラメータに従って、レーザー光の出力(特性)が調整される。本体部21が制御する調理動作には、レーザーヘッド34の位置の調整、レーザーヘッド34から照射されるレーザー光の出力の調整などが含まれる。 The main body 21 communicates with the information processing terminal 1 and controls the operation of each unit according to the control by the information processing terminal 1. A control parameter related to the cooking operation is transmitted from the information processing terminal 1. The output (characteristics) of the laser beam is adjusted according to the control parameters transmitted from the information processing terminal 1. The cooking operation controlled by the main body 21 includes adjustment of the position of the laser head 34, adjustment of the output of the laser light emitted from the laser head 34, and the like.
 本体部21と調理台22の間には配管23-1,23-2が設けられる。配管23-1は、本体部21から調理台22に対して温水または冷却水を送出することに用いられる。配管23-2は、配管23-1から流入し、調理台22の内部の配管を循環した水を調理台22から本体部21に戻すことに用いられる。流量と温度が調整された水が調理台22の内部の配管を循環することにより、調理台22に置かれたチョコレートCの部分毎の温度が調整される。 Piping 23-1, 23-2 is provided between the main body 21 and the countertop 22. The pipe 23-1 is used to send hot water or cooling water from the main body 21 to the countertop 22. The pipe 23-2 is used to return the water flowing from the pipe 23-1 and circulating in the pipe inside the countertop 22 from the countertop 22 to the main body 21. The temperature of each portion of chocolate C placed on the countertop 22 is adjusted by circulating the water whose flow rate and temperature are adjusted through the piping inside the countertop 22.
 駆動部24は、棒状の支持部材31と支持部材32が連結部33において十字に連結することによって構成される。支持部材31と支持部材32は略水平に取り付けられる。支持部材32の先端には、レーザー光源(光ファイバーを含む)と光学系を内蔵したレーザーヘッド34が設けられる。 The drive unit 24 is configured by connecting a rod-shaped support member 31 and a support member 32 in a cross shape at a connecting portion 33. The support member 31 and the support member 32 are mounted substantially horizontally. At the tip of the support member 32, a laser head 34 incorporating a laser light source (including an optical fiber) and an optical system is provided.
 支持部材31と支持部材32の連結位置が連結部33により調整されることにより、レーザーヘッド34の水平面上の位置が調整される。また、支持部材31の高さが本体部21の内部の機構によって調整されることにより、レーザーヘッド34の高さが調整される。 By adjusting the connecting position of the support member 31 and the supporting member 32 by the connecting portion 33, the position of the laser head 34 on the horizontal plane is adjusted. Further, the height of the laser head 34 is adjusted by adjusting the height of the support member 31 by the internal mechanism of the main body 21.
 調理装置2の図示せぬ部材には、非接触式温度計41とファン42が取り付けられる。非接触式温度計41は、チョコレートCの温度を非接触で測定する。ファン42は、温風または冷風を送ることによってチョコレートCの温度を調整する。ファン42には、温風を発生させるためのヒーターが設けられる。 A non-contact thermometer 41 and a fan 42 are attached to a member (not shown) of the cooking apparatus 2. The non-contact thermometer 41 measures the temperature of chocolate C in a non-contact manner. The fan 42 adjusts the temperature of chocolate C by sending hot or cold air. The fan 42 is provided with a heater for generating warm air.
 情報処理端末1は、このような構成を有する調理装置2を制御することにより、調理台22に置かれたチョコレートCの表面に対するレーザー光の照射位置(軌跡)を制御する。チョコレートCの任意の箇所を選択してレーザー光を照射したり、ファン42から冷風をあてたりすることにより、情報処理端末1は、チョコレートCの温度を部分毎に制御することが可能となる。 The information processing terminal 1 controls the irradiation position (trajectory) of the laser beam on the surface of the chocolate C placed on the countertop 22 by controlling the cooking apparatus 2 having such a configuration. The information processing terminal 1 can control the temperature of chocolate C for each portion by selecting an arbitrary portion of chocolate C and irradiating it with a laser beam or by applying cold air from a fan 42.
 レーザーヘッド34から照射されるレーザー光の出力は、照射対象物であるチョコレートCの表面温度を測定する非接触式温度計41からの出力を用いて、フィードバック制御される。また、ファン42の回転数や、調理台22の内部を通過する冷却水の流量についても、非接触式温度計41からの出力を用いてフィードバック制御される。これにより、チョコレートCの部分毎の加熱状態と冷却状態が調整される。 The output of the laser beam emitted from the laser head 34 is feedback-controlled using the output from the non-contact thermometer 41 that measures the surface temperature of the chocolate C to be irradiated. Further, the rotation speed of the fan 42 and the flow rate of the cooling water passing through the inside of the countertop 22 are also feedback-controlled using the output from the non-contact thermometer 41. As a result, the heating state and the cooling state of each portion of chocolate C are adjusted.
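The feedback loop above can be sketched as a simple proportional controller: the laser output is nudged toward the value that drives the measured surface temperature to the target. The control law, gain, and power limits below are assumptions; the document only states that feedback control is performed using the output of the non-contact thermometer 41.

```python
# Minimal sketch of the temperature feedback loop described above.
# A proportional controller is assumed here; the patent does not
# specify the actual control law, gain, or power limits.

def laser_power_update(target_temp_c, measured_temp_c,
                       current_power_w, gain=0.5, max_power_w=10.0):
    """Return the next laser output from the surface-temperature error."""
    error = target_temp_c - measured_temp_c  # non-contact thermometer reading
    next_power = current_power_w + gain * error
    # Clamp to the physically valid output range of the laser source.
    return max(0.0, min(max_power_w, next_power))

# Too cold: power increases; too hot: power is driven down (clamped at 0).
print(laser_power_update(45.0, 30.0, 2.0))  # 9.5
print(laser_power_update(45.0, 50.0, 2.0))  # 0.0
```

The same structure would apply to the fan speed and the cooling-water flow rate, which the document says are also feedback-controlled from the thermometer output.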
 なお、ファン42にヒーターを内蔵させたり、冷却水に温水を用いたりすることにより、チョコレートCの全体の加温状態を制御することも可能である。 It is also possible to control the overall heating state of chocolate C by incorporating a heater in the fan 42 or using hot water as the cooling water.
 チョコレートCの任意の箇所を選択的に加熱した後、製品全体の正確な温度制御を行うことにより、チョコレートCの選択した箇所のみの脂肪分(ココアバター)を任意の結晶状態に変更するといったような調理動作も可能となる。 A cooking operation is also possible in which, after an arbitrary portion of the chocolate C is selectively heated, accurate temperature control of the entire product changes the fat content (cocoa butter) of only the selected portion of the chocolate C to an arbitrary crystalline state.
 図6は、レーザー光の照射の例を示す図である。 FIG. 6 is a diagram showing an example of laser light irradiation.
 レーザーヘッド34の内部には、図6の左側に示すように、レーザー光源51と、集光レンズにより構成される集光光学系52が設けられる。 Inside the laser head 34, as shown on the left side of FIG. 6, a laser light source 51 and a condensing optical system 52 composed of a condensing lens are provided.
 レーザー光源51から出力され、集光光学系52において集光されたレーザー光の結像点をチョコレートC上に設定することにより、レーザー光がチョコレートCの表面に照射される。図6の左側に示すチョコレートCの下に示す小円は、レーザー光の照射範囲を表す。 By setting the imaging point of the laser light output from the laser light source 51 and focused by the focusing optical system 52 on the chocolate C, the laser light is irradiated to the surface of the chocolate C. The small circle below the chocolate C shown on the left side of FIG. 6 represents the irradiation range of the laser beam.
 また、図6の中央の双方向の矢印で示すように、結像点からオフセットされた位置にくるようにチョコレートCの位置が調整される。これにより、デフォーカスされたレーザー光が照射される。図6の中央に示すチョコレートCの下に示す円は、レーザー光の照射範囲を表す。 Further, as shown by the double-headed arrow in the center of FIG. 6, the position of chocolate C is adjusted so as to be offset from the imaging point. As a result, the defocused laser beam is irradiated. The circle below the chocolate C shown in the center of FIG. 6 represents the irradiation range of the laser beam.
 図6の右側に示すように、レーザーヘッド34には、円筒レンズ等から構成されるホモジナイザ53が設けられる。ホモジナイザ53を設けることにより、光量を均質化したレーザー光(ライン状を含む)が照射される。 As shown on the right side of FIG. 6, the laser head 34 is provided with a homogenizer 53 composed of a cylindrical lens or the like. By providing the homogenizer 53, laser light (including a line shape) having a homogenized amount of light is irradiated.
 このように、制御パラメータを調整することにより、レーザー光の照射範囲、照射特性などが調整される。チョコレートの加熱を安定的・均一的に行うために、レーザー光の照射範囲の大きさは、直径1mm以上の円または線とされる。 By adjusting the control parameters in this way, the irradiation range, irradiation characteristics, etc. of the laser beam can be adjusted. In order to heat chocolate stably and uniformly, the size of the irradiation range of the laser beam is a circle or line having a diameter of 1 mm or more.
 レーザー光の直径またはライン状のレーザー光の長さを1mm以上にすることにより、チョコレートCの過度な加熱による蒸発(アブレーション)が起こりにくくすることが可能となる。また、広い面積を均一に加熱することで、後述する結晶転移プロセスで重要な20~50度の温度範囲において、より高精度な加熱制御が可能となる。 By setting the diameter of the laser beam, or the length of a line-shaped laser beam, to 1 mm or more, evaporation (ablation) of the chocolate C due to excessive heating becomes less likely to occur. Further, by uniformly heating a wide area, more accurate heating control becomes possible in the temperature range of 20 to 50°C, which is important in the crystal transition process described later.
 図7は、調理後の食材の例を示す図である。 FIG. 7 is a diagram showing an example of ingredients after cooking.
 図7のAの矢印の先に示すように、レーザー光の照射を制御することにより、チョコレートの表面に図柄などのデザインを施すことが可能となる。 As shown at the tip of the arrow A in FIG. 7, by controlling the irradiation of the laser beam, it is possible to apply a design such as a pattern to the surface of the chocolate.
 また、図7のBの矢印の先に示すように、レーザー光の照射を制御することにより、ステーキ用の肉などの表面に焦げ目をつけることが可能となる。レーザー光を用いた調理の前に、全体を焼く調理が肉に対して施されるようにしてもよい。肉全体を焼く調理は、調理装置2の調理台22上で行われるようにしてもよいし、オーブンや電子レンジなどの他の調理装置を用いて行われるようにしてもよい。 Further, as shown at the tip of the arrow B in FIG. 7, by controlling the irradiation of the laser beam, it is possible to brown the surface of meat for steak or the like. Before the cooking using the laser beam, cooking to grill the whole meat may be applied. The cooking to grill the whole meat may be performed on the countertop 22 of the cooking device 2, or may be performed using another cooking device such as an oven or a microwave oven.
 オーブンなどによる調理後にレーザー光を照射することにより、肉の切断形状に依存せず、任意のパターンの焦げ目をつけることが可能になる。全体を調理した後に網状のパターンの焦げ目を表面につけることにより、焼き網風の食感を実現することが可能となる。 By irradiating laser light after cooking in an oven etc., it is possible to brown any pattern regardless of the cut shape of the meat. By browning the surface with a net-like pattern after cooking the whole, it is possible to realize a grill-like texture.
 フライパンなどを用いて焼き目をつける場合、通常は油を使用する必要があるが、調理装置2を用いることにより、油を使わずに、焼き目をつけて食感を変えることが可能になる。 When browning with a frying pan or the like, it is usually necessary to use oil, but by using the cooking device 2, it is possible to brown the food and change its texture without using oil.
 上述したように、調理装置2による調理は、食材の食感をユーザが指定することに応じて行われる。ユーザによる食感の指定は、食材の部分毎の食感を指定するようにして行われる。 As described above, cooking by the cooking device 2 is performed according to the user's designation of the texture of the ingredients. The texture is specified by the user so as to specify the texture for each part of the food.
 図7に示すような食材の各部分を対象とした調理は、ユーザにより指定された部分毎の食感を実現するようにして行われることになる。食材の部分毎の食感を指定する食感指定データがユーザの指定に応じて情報処理端末1において生成され、生成された食感指定データに基づいて、調理装置2の調理動作が制御される。 Cooking for each part of the foodstuff as shown in FIG. 7 is performed so as to realize the texture of each part specified by the user. Texture designation data that specifies the texture of each part of the food material is generated in the information processing terminal 1 according to the user's designation, and the cooking operation of the cooking device 2 is controlled based on the generated texture designation data.
 図8は、食感指定データの例を示す図である。 FIG. 8 is a diagram showing an example of texture designation data.
 図8の左側に示すように、食感指定データは、食材の部分毎の食感を表すマップ情報として構成される。図8の左側に示す食感指定データを構成する格子状の各領域は、食材の各部分に対応する。各領域の濃淡が異なることは、それぞれの食感が異なることを表す。 As shown on the left side of FIG. 8, the texture designation data is configured as map information representing the texture of each part of the food material. Each grid-like region constituting the texture designation data shown on the left side of FIG. 8 corresponds to each portion of the food material. The difference in the shade of each region indicates that the texture of each region is different.
 図8の例においては、基本的に、左から右に向かうにつれて色が濃くなる、すなわち、左から右に向かうにつれて食感が変化することを指定する情報が食感指定データとして生成されている。 In the example of FIG. 8, information is generated as the texture designation data that basically specifies that the color becomes darker from left to right, that is, that the texture changes from left to right.
 このような食感指定データに基づいて調理が行われることにより、図8の右側に示すように、左から右に向かうにつれて徐々に変化する食感が実現される。例えば、左から右に向かうにつれて歯ごたえが徐々に増すような肉の食感が実現される。この場合、調理装置2による調理動作は、例えば、左から右に向かうにつれてレーザー光の出力を上げるようにして行われる。なお、図8の右側に示す食材の肉において、各部分の色が異なることは、食感が異なることを表す。 By cooking based on such texture designation data, a texture that gradually changes from left to right is realized, as shown on the right side of FIG. 8. For example, a meat texture in which the chewiness gradually increases from left to right is realized. In this case, the cooking operation by the cooking device 2 is performed, for example, by increasing the output of the laser beam from left to right. In the meat shown on the right side of FIG. 8, the difference in color of each part indicates a difference in texture.
 調理装置2によれば部分毎の調理ができることから、このような、部分毎に変化する食感を実現することが可能となる。 According to the cooking device 2, it is possible to cook each part, so that it is possible to realize such a texture that changes for each part.
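A minimal sketch of this part-by-part control, assuming a linear mapping from texture intensity to laser output (both the mapping and the power range are invented for illustration):

```python
# Hypothetical mapping of a 2-D texture-designation grid to per-cell
# laser output, as in the left-to-right example above. The linear
# mapping and the power range are assumptions, not patent content.

def texture_map_to_power(texture_grid, min_power_w=1.0, max_power_w=8.0):
    """Map texture intensities (0.0-1.0) to a laser power per grid cell."""
    return [[min_power_w + t * (max_power_w - min_power_w) for t in row]
            for row in texture_grid]

# A 2x4 grid whose texture intensity increases from left to right.
grid = [[0.0, 0.25, 0.5, 1.0],
        [0.0, 0.25, 0.5, 1.0]]
print(texture_map_to_power(grid)[0])  # [1.0, 2.75, 4.5, 8.0]
```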
 図9は、食感指定データの他の例を示す図である。 FIG. 9 is a diagram showing another example of texture designation data.
 図9の左側に示すように、1つの食材であるケーキを対象とした食感指定データが、複数のレイヤ(層)の部分毎の食感を表すマップ情報として構成されるようにしてもよい。 As shown on the left side of FIG. 9, the texture designation data for a single ingredient, a cake, may be configured as map information representing the texture of each part of a plurality of layers.
 図9に示す食感指定データD1は、最下層の食感を指定する情報であり、食感指定データD2は、下から2番目の層の食感を指定する情報である。食感指定データD3は、下から3番目の層の食感を指定する情報であり、食感指定データD4は、最上層の食感を指定する情報である。食感指定データD1乃至D4は、それぞれ、対象とする層における部分毎の食感を表している。 The texture designation data D1 shown in FIG. 9 is information for designating the texture of the lowest layer, and the texture designation data D2 is information for designating the texture of the second layer from the bottom. The texture designation data D3 is information for designating the texture of the third layer from the bottom, and the texture designation data D4 is information for designating the texture of the uppermost layer. The texture designation data D1 to D4 each represent the texture of each portion in the target layer.
 このような食感指定データに基づいて下の層から順に調理が行われることにより、図9の右側に示すように、平面的にも、立体的にも変化する食感が実現される。図9の例においては、平面方向と高さ方向の両方において食感が変化するケーキが調理後の食材として作られている。 By cooking in order from the lower layer based on such texture designation data, as shown on the right side of FIG. 9, a texture that changes both in a plane and in a three-dimensional manner is realized. In the example of FIG. 9, a cake whose texture changes in both the plane direction and the height direction is made as an ingredient after cooking.
 このように、ユーザは、食感の変化を2次元または3次元のマップとしてデザインし、調理を行わせることが可能となる。 In this way, the user can design the change in texture as a two-dimensional or three-dimensional map and have the cooking performed accordingly.
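The layered designation of FIG. 9 can be sketched as a stack of 2-D maps processed from the bottom layer up. The traversal order below mirrors the description, while the data values and the callback interface are invented:

```python
# Hypothetical sketch of the layered (3-D) texture designation:
# one 2-D map per layer, processed from the bottom layer up, as in
# the D1-D4 layers of FIG. 9. The values here are invented.

def cook_layers(layer_maps, cook_cell):
    """Apply cook_cell(layer_index, row, col, texture) bottom-up."""
    log = []
    for z, layer in enumerate(layer_maps):  # z=0 is the lowest layer
        for r, row in enumerate(layer):
            for c, texture in enumerate(row):
                log.append(cook_cell(z, r, c, texture))
    return log

d1 = [[0.2, 0.4]]  # lowest layer, one row of two cells
d2 = [[0.6, 0.8]]  # second layer from the bottom
order = cook_layers([d1, d2], lambda z, r, c, t: (z, c, t))
print(order)  # [(0, 0, 0.2), (0, 1, 0.4), (1, 0, 0.6), (1, 1, 0.8)]
```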
 例えばステーキ用の肉を焼く場合、ユーザは、食べる人が右利きであるのか左利きであるのかといったように、食べる方向(食べ方)を想定して食感をデザインすることができる。 For example, when grilling meat for steak, the user can design the texture by assuming the eating direction (how to eat), such as whether the eater is right-handed or left-handed.
 また、図9に示すような切り分けられたケーキの場合、中心側(鋭角側)から外側に向かって食べる人が多いことから、ユーザは、そのような食べ方を想定して食感をデザインすることもできる。ユーザは、アフォーダンスに合わせて食感の変化をデザインすることが可能となる。 Further, in the case of a cut cake as shown in FIG. 9, many people eat it from the center side (the acute-angle side) toward the outside, so the user can also design the texture assuming such a way of eating. The user can design the change in texture according to the affordance.
 食感指定データを生成し、調理装置2の調理動作を制御する情報処理端末1の動作についてはフローチャートを参照して後述する。 The operation of the information processing terminal 1 that generates texture designation data and controls the cooking operation of the cooking device 2 will be described later with reference to the flowchart.
<各装置の構成>
・情報処理端末1の構成
 図10は、情報処理端末1のハードウェアの構成例を示すブロック図である。
<Configuration of each device>
・Configuration of the information processing terminal 1
 FIG. 10 is a block diagram showing a hardware configuration example of the information processing terminal 1.
 図10に示すように、情報処理端末1はコンピュータにより構成される。CPU(Central Processing Unit)101、ROM(Read Only Memory)102、RAM(Random Access Memory)103は、バス104により相互に接続される。 As shown in FIG. 10, the information processing terminal 1 is composed of a computer. The CPU (Central Processing Unit) 101, ROM (Read Only Memory) 102, and RAM (Random Access Memory) 103 are connected to each other by the bus 104.
 CPU101が、例えば記憶部111に記憶されているプログラムを入出力インタフェース105およびバス104を介してRAM103にロードして実行することにより、調理装置2の制御などの各種の処理が行われる。 The CPU 101 loads the program stored in the storage unit 111 into the RAM 103 via the input / output interface 105 and the bus 104 and executes the program, so that various processes such as control of the cooking device 2 are performed.
 バス104には、入出力インタフェース105が接続される。入出力インタフェース105には、マイクロフォン106、カメラ107、操作部108、ディスプレイ109、スピーカ110、記憶部111、通信部112、およびドライブ113が接続される。 The input / output interface 105 is connected to the bus 104. A microphone 106, a camera 107, an operation unit 108, a display 109, a speaker 110, a storage unit 111, a communication unit 112, and a drive 113 are connected to the input / output interface 105.
 マイクロフォン106は、ユーザの音声を検出し、音声データをCPU101に出力する。 The microphone 106 detects the user's voice and outputs the voice data to the CPU 101.
 カメラ107は、ユーザの動作を含む周囲の状況を撮影する。カメラ107により撮影された画像データはCPU101に供給される。 The camera 107 captures the surrounding situation including the user's movement. The image data captured by the camera 107 is supplied to the CPU 101.
 操作部108は、ディスプレイ109に重ねて設けられたタッチパネル、情報処理端末1の筐体に設けられたボタンなどにより構成される。操作部108は、ユーザの操作を検出し、操作の内容を表す情報をCPU101に出力する。 The operation unit 108 is composed of a touch panel provided on the display 109, a button provided on the housing of the information processing terminal 1, and the like. The operation unit 108 detects the user's operation and outputs information indicating the content of the operation to the CPU 101.
 ディスプレイ109は、LCD、有機ELディスプレイなどにより構成される。ディスプレイ109は、CPU101による制御に従って、食感の指定に用いられる画面などの各種の画面を表示する。 The display 109 is composed of an LCD, an organic EL display, and the like. The display 109 displays various screens such as a screen used for designating the texture according to the control by the CPU 101.
 スピーカ110は、CPU101による制御に従って合成音声を出力する。 The speaker 110 outputs synthetic voice according to the control by the CPU 101.
 記憶部111は、ハードディスクや不揮発性のメモリなどにより構成される。記憶部111は、CPU101が実行するプログラムなどの各種の情報を記憶する。 The storage unit 111 is composed of a hard disk, a non-volatile memory, or the like. The storage unit 111 stores various information such as a program executed by the CPU 101.
 通信部112は、ネットワークインタフェースなどより構成される。通信部112は、ネットワーク12上の情報処理サーバ11との間で通信を行う。通信部112は、情報処理サーバ11から送信されてきた情報を受信してCPU101に出力し、CPU101から供給された情報を情報処理サーバ11に対して送信する。 The communication unit 112 is composed of a network interface and the like. The communication unit 112 communicates with the information processing server 11 on the network 12. The communication unit 112 receives the information transmitted from the information processing server 11 and outputs it to the CPU 101, and transmits the information supplied from the CPU 101 to the information processing server 11.
 ドライブ113は、リムーバブルメディア114を駆動し、リムーバブルメディア114からのデータの読み出しと、リムーバブルメディア114に対するデータの書き込みを行う。 The drive 113 drives the removable media 114 to read data from the removable media 114 and write data to the removable media 114.
 図11は、情報処理端末1の機能構成例を示すブロック図である。 FIG. 11 is a block diagram showing a functional configuration example of the information processing terminal 1.
 図11に示すように、情報処理端末1においては情報処理部121が実現される。情報処理部121は、提示部131、食感指定データ生成部132、推論結果取得部133、制御パラメータ設定部134、食材状態検出部135、レシピデータ生成部136、および制御部137により構成される。図11に示す機能部のうちの少なくとも一部は、図10のCPU101により所定のプログラムが実行されることによって実現される。 As shown in FIG. 11, an information processing unit 121 is realized in the information processing terminal 1. The information processing unit 121 is composed of a presentation unit 131, a texture designation data generation unit 132, an inference result acquisition unit 133, a control parameter setting unit 134, a food material state detection unit 135, a recipe data generation unit 136, and a control unit 137. At least a part of the functional units shown in FIG. 11 is realized by the CPU 101 of FIG. 10 executing a predetermined program.
 提示部131は、ディスプレイ109を制御し、各種の情報をユーザに提示する。提示部131による提示に対して、食感の選択などの各種の選択がユーザにより行われる。ユーザの選択結果を表す情報は食感指定データ生成部132に供給される。 The presentation unit 131 controls the display 109 and presents various types of information to the user. With respect to the presentation by the presentation unit 131, the user makes various selections such as selection of texture. Information representing the user's selection result is supplied to the texture designation data generation unit 132.
 食感指定データ生成部132は、ユーザによる食感の選択結果に応じて食感指定データを生成し、推論結果取得部133に出力する。 The texture designation data generation unit 132 generates texture designation data according to the result of selection of texture by the user, and outputs it to the inference result acquisition unit 133.
 推論結果取得部133は、通信部112を制御し、情報処理サーバ11との間で通信を行う。推論結果取得部133は、食感指定データを情報処理サーバ11に送信し、食感指定データに基づいて行われた推論の結果を取得する。後述するように、情報処理サーバ11においては、機械学習によって生成された推論モデルを用いて、ユーザにより選択された食感を実現する調理方法などの推論が行われる。 The inference result acquisition unit 133 controls the communication unit 112 and communicates with the information processing server 11. The inference result acquisition unit 133 transmits the texture designation data to the information processing server 11 and acquires the result of the inference performed based on the texture designation data. As will be described later, the information processing server 11 uses an inference model generated by machine learning to infer a cooking method or the like that realizes a texture selected by the user.
 食感の選択が上述したように部分毎の食感を選択するようにして行われるから、調理方法の推論についても、部分毎の調理方法を推定するようにして行われる。推論結果取得部133は、調理方法の情報を含む推論結果を制御パラメータ設定部134とレシピデータ生成部136に出力する。 Since the texture is selected part by part as described above, the cooking method is also inferred part by part. The inference result acquisition unit 133 outputs the inference result, which includes information on the cooking method, to the control parameter setting unit 134 and the recipe data generation unit 136.
 なお、食感の選択とともに、食材のデザインもユーザにより選択される。食材のデザインは、図7を参照して説明したように、食材の外観に表現される図柄などである。 In addition to selecting the texture, the design of the ingredients is also selected by the user. As described with reference to FIG. 7, the design of the food material is a design expressed in the appearance of the food material.
 例えば、大まかなデザインである基本デザインがユーザにより選択され、基本デザインの情報が情報処理サーバ11に対して送信される。情報処理サーバ11においては、基本デザインをより詳細にしたデザインであり、かつ、ユーザにより選択された食感を実現することが可能な詳細デザインの候補の推論も行われる。デザインを施すための調理が行われない場合、基本デザインおよび詳細デザインのユーザによる選択が行われないようにしてもよい。 For example, a basic design, which is a rough design, is selected by the user, and information on the basic design is transmitted to the information processing server 11. The information processing server 11 also infers candidates for a detailed design, which is a more detailed version of the basic design and which can realize the texture selected by the user. When no cooking for applying a design is performed, the user selection of the basic design and the detailed design may be omitted.
 推論結果取得部133により取得された詳細デザインの候補の情報は提示部131に供給され、詳細デザインの候補がユーザに対して提示される。ユーザは、提示された候補の中から、好みの詳細デザインを選択する。制御パラメータ設定部134とレシピデータ生成部136に対しては、ユーザにより選択された詳細デザインに関する、調理方法の情報を含む推論結果が推論結果取得部133から供給される。 The information on the detailed design candidates acquired by the inference result acquisition unit 133 is supplied to the presentation unit 131, and the detailed design candidates are presented to the user. The user selects a preferred detailed design from the presented candidates. An inference result including information on the cooking method for the detailed design selected by the user is supplied from the inference result acquisition unit 133 to the control parameter setting unit 134 and the recipe data generation unit 136.
 制御パラメータ設定部134は、推論結果取得部133から供給された情報に基づいて、食材の各部分に対する調理動作の内容を規定する制御パラメータを設定し、レシピデータ生成部136に出力する。レシピデータ生成部136に出力された制御パラメータは、レシピデータの生成に用いられる。 The control parameter setting unit 134 sets control parameters that define the content of the cooking operation for each part of the food material based on the information supplied from the inference result acquisition unit 133, and outputs the control parameter to the recipe data generation unit 136. The control parameters output to the recipe data generation unit 136 are used to generate the recipe data.
 また、制御パラメータ設定部134は、調理装置2による調理動作中に食材状態検出部135により行われる食材の状態のセンシング結果に基づいて制御パラメータを更新し、更新後の制御パラメータを制御部137に出力する。食材状態検出部135からは、調理装置2において調理対象となっている食材の状態を表す情報が供給されてくる。 Further, the control parameter setting unit 134 updates the control parameters based on the result of sensing the state of the ingredient performed by the food material state detection unit 135 during the cooking operation of the cooking device 2, and outputs the updated control parameters to the control unit 137. Information representing the state of the ingredient being cooked in the cooking device 2 is supplied from the food material state detection unit 135.
 制御パラメータ設定部134は、食材の状態に応じて調整した調整後の制御パラメータを制御部137に出力し、調理装置2の調理動作に反映させる。制御パラメータの更新は、調理対象の食材の状態に応じてリアルタイムで行われる。 The control parameter setting unit 134 outputs the adjusted control parameter adjusted according to the state of the food to the control unit 137 and reflects it in the cooking operation of the cooking device 2. The control parameters are updated in real time according to the state of the ingredients to be cooked.
 例えば、食材としてチョコレートを用いる場合は、カメラにより撮影された画像に基づいて色調を検出することにより、調理対象のチョコレートがホワイト系のチョコレートであるのか、ダーク系のチョコレートであるのかが判断される。この場合、レーザー光の照射対象となっている部分(調理対象となっている部分)のレーザー光の吸収率の違いに応じて、制御パラメータとしてのレーザー出力が補正される。これにより、異なる素材が混ざっている食材が調理対象となっている場合でも、安定した調理が可能となる。 For example, when chocolate is used as an ingredient, whether the chocolate to be cooked is white chocolate or dark chocolate is determined by detecting the color tone based on an image taken by a camera. In this case, the laser output as a control parameter is corrected according to the difference in laser-light absorptance of the portion irradiated with the laser beam (the portion to be cooked). This enables stable cooking even when an ingredient in which different materials are mixed is to be cooked.
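A minimal sketch of this correction, under the assumption that the goal is equal absorbed power regardless of chocolate color. The absorptivity values are invented for illustration; real values would depend on the laser wavelength and the actual material:

```python
# Sketch of the absorption-based laser-output correction described
# above. The absorptivity values are hypothetical; the patent only
# says the output is corrected for the difference in absorption rate.

ASSUMED_ABSORPTIVITY = {"dark": 0.9, "white": 0.6}  # invented values

def corrected_power(nominal_power_w, chocolate_type,
                    reference_absorptivity=0.9):
    """Scale the nominal output so absorbed power stays constant."""
    a = ASSUMED_ABSORPTIVITY[chocolate_type]
    return nominal_power_w * (reference_absorptivity / a)

print(corrected_power(3.0, "dark"))   # 3.0 (reference case, unchanged)
print(corrected_power(3.0, "white"))  # more power to offset lower absorption
```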
 食材状態検出部135は、調理対象の食材の状態を検出する。例えば、食材状態検出部135は、調理装置2により撮影された、調理対象の食材が写る画像、または、非接触式温度計41を含む各種のセンサにより計測されたセンサデータを解析することにより、調理対象の食材の状態を検出する。調理装置2の所定の位置には、食材を撮影するカメラなども設けられる。情報処理端末1に設けられるカメラ107(図10)により撮影された画像を解析することによって、調理対象となっている食材の状態が検出されるようにしてもよい。 The food material state detection unit 135 detects the state of the ingredient to be cooked. For example, the food material state detection unit 135 detects the state of the ingredient to be cooked by analyzing an image of the ingredient captured by the cooking device 2, or sensor data measured by various sensors including the non-contact thermometer 41. A camera or the like for photographing the ingredient is also provided at a predetermined position of the cooking device 2. The state of the ingredient to be cooked may also be detected by analyzing an image captured by the camera 107 (FIG. 10) provided in the information processing terminal 1.
 食材状態検出部135により、レーザー光による調理の進行の程度などが検出される。食材状態検出部135は、食材の状態のセンシング結果を制御パラメータ設定部134に出力する。食材の状態のセンシングが調理装置2により行われ、そのセンシング結果が食材状態検出部135により取得されるようにしてもよい。 The food condition detection unit 135 detects the degree of progress of cooking by laser light. The food material state detection unit 135 outputs the sensing result of the food material state to the control parameter setting unit 134. The state of the food may be sensed by the cooking apparatus 2, and the sensing result may be acquired by the food state detection unit 135.
 The recipe data generation unit 136 generates recipe data based on the information supplied from the texture designation data generation unit 132 and the control parameters set by the control parameter setting unit 134. Recipe data is data prepared for each dish, and describes information such as the ingredients used in each cooking process up to the completion of the dish. The recipe data generation unit 136 thus functions as a cooking process information generation unit that generates information on each cooking process.
 Note that a dish means the finished product obtained through cooking, while cooking means the process of making a dish, or the act (work) of making a dish.
 FIG. 12 is a diagram showing an example of the contents described in the recipe data.
 As shown in FIG. 12, one piece of recipe data is composed of a plurality of cooking process data sets. The example of FIG. 12 includes a cooking process data set for cooking process #1, a cooking process data set for cooking process #2, ..., and a cooking process data set for cooking process #N.
 As indicated by the balloon, each cooking process data set includes ingredient information, cooking operation information, and the control parameters described above.
 The ingredient information is information about the ingredients used in the cooking process, and includes information indicating the type, amount, and size of each ingredient.
 Note that the ingredients include not only ingredients to which no cooking has been applied, but also cooked (pre-processed) ingredients obtained by applying some cooking. The ingredient information included in the cooking process data set of a given cooking process includes information on ingredients that have passed through the preceding cooking processes.
 The cooking operation information is information about the cooking operations that realize the cooking process. For example, one cooking process data set is composed of time-series data of cooking operation information for realizing one cooking process. The cooking operation information expresses the coordinates of the laser-light irradiation position at each time, and the operation of, for example, the drive unit 24 is controlled in accordance with those coordinates.
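 The structure described above — recipe data as an ordered sequence of cooking process data sets, each bundling ingredient information, time-stamped irradiation coordinates, and control parameters — can be sketched as follows. This is a minimal illustration only; the field names and values are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class CookingProcessDataSet:
    """One cooking process: ingredient info, operations, control parameters."""
    ingredients: List[dict]                       # type, amount, size per ingredient
    operations: List[Tuple[float, float, float]]  # (time [s], x [mm], y [mm]) irradiation positions
    control_params: dict                          # e.g. {"laser_output_w": 5.0, ...}

@dataclass
class RecipeData:
    """Recipe data: cooking process data sets #1 .. #N, in order."""
    processes: List[CookingProcessDataSet] = field(default_factory=list)

# Example: a two-process recipe for plate-shaped chocolate (values hypothetical)
recipe = RecipeData(processes=[
    CookingProcessDataSet(
        ingredients=[{"type": "white chocolate", "amount_g": 50, "size": "plate"}],
        operations=[(0.0, 10.0, 10.0), (0.1, 10.5, 10.0)],
        control_params={"laser_output_w": 5.0, "scan_speed_mm_s": 30.0},
    ),
    CookingProcessDataSet(
        ingredients=[{"type": "white chocolate (melted)", "amount_g": 50, "size": "plate"}],
        operations=[(0.0, 0.0, 0.0)],
        control_params={"laser_output_w": 2.0, "fan": False},
    ),
])
```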
 The control unit 137 of FIG. 11 communicates with the cooking device 2 by controlling the communication unit 112, and controls the operation of the cooking device 2 based on the recipe data generated by the recipe data generation unit 136.
 For example, the control unit 137 selects the cooking process data sets described in the recipe data one by one in order, and controls the operation of the cooking device 2 by transmitting instruction commands corresponding to the contents of the cooking operation information included in the selected cooking process data set. The instruction commands transmitted by the control unit 137 also include the control parameters.
 Further, when the control parameters are updated by the control parameter setting unit 134 based on the state of the ingredient being cooked, the control unit 137 transmits the updated control parameters to the cooking device 2.
 FIG. 13 is a diagram showing an example of the control parameters.
 The control parameters are composed of, for example, the following items of information.
 1. Laser wavelength [nm/μm]
 Using an infrared or far-infrared laser (780 nm to 1 mm; for example, a 10.6 μm carbon dioxide laser) makes it possible to pass heat into the interior of the ingredient. Using far infrared rays may even make it possible to cook transparent ingredients. Using an ultraviolet laser (1 to 400 nm; for example, a 351 nm excimer laser) makes it possible to apply fine machining to the surface of the ingredient.
 2. Laser output [W]
 Increasing the laser output makes it possible to apply more heat or to open larger holes.
 3. Pulse width [s]
 Shortening the pulse width suppresses the diffusion of heat into regions not irradiated by the laser, enabling finer machining. Lengthening the pulse width increases the cumulative dose, enabling machining in a shorter time.
 4. Pulse frequency [Hz]
 Lowering the pulse frequency suppresses the diffusion of heat into regions not irradiated by the laser, enabling finer machining. Raising the pulse frequency increases the cumulative dose, enabling machining in a shorter time.
 5. Spot diameter [nm/μm]
 Reducing the spot diameter enables finer machining. Enlarging the spot diameter makes it possible to machine a wide area in a short time. The spot diameter can be set by adjusting the focal length.
 6. Scan mode (raster scan/vector scan)
 When applying a design that can be drawn in a single stroke, the vector scan mode is selected; vector scanning enables machining in a short time. When applying a design containing a complicated pattern, the raster scan mode is selected; raster scanning makes it possible to draw a wide variety of patterns.
 7. Scan speed [mm/s]
 Slowing the scan speed makes it possible to apply more heat or to open larger holes. Raising the scan speed enables machining in a shorter time.
 8. DPI (dots per inch)
 For example, when a design is applied by forming dots on the surface of the ingredient, the DPI makes it possible to specify the fineness of the pattern.
 9. Number of repetitions [times]
 Repeating the cooking operation with a small amount of heat/machining per pass suppresses the diffusion of heat into regions not irradiated by the laser, enabling finer machining.
 10. Pre-irradiation temperature [°C]
 The pre-irradiation temperature represents the temperature before laser irradiation (during preheating).
 11. Irradiation temperature [°C]
 The irradiation temperature represents the set temperature when feedback control of the output based on the temperature sensor (non-contact thermometer 41) is used.
 12. Post-irradiation temperature [°C]
 The post-irradiation temperature represents the temperature after laser irradiation (during cooling).
 13. Fan (ON/OFF)
 The fan 42 is used (turned ON) when smoke or debris generated during laser irradiation must be exhausted, or when the ingredient must be cooled. The fan is turned OFF in cases where, for example, heating has liquefied the ingredient and the airflow could deform its surface.
 As described above, the control parameters include various items of information that define the characteristics of the laser light.
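 The thirteen items above can be gathered into a single record. The sketch below is illustrative only; the field names, types, and example values are assumptions chosen for the sketch, not values specified in the disclosure.

```python
from dataclasses import dataclass
from enum import Enum

class ScanMode(Enum):
    RASTER = "raster"
    VECTOR = "vector"

@dataclass
class ControlParameters:
    """Laser control parameters (items 1-13 described above)."""
    wavelength_nm: float       # 1. laser wavelength
    output_w: float            # 2. laser output
    pulse_width_s: float       # 3. pulse width
    pulse_freq_hz: float       # 4. pulse frequency
    spot_diameter_um: float    # 5. spot diameter
    scan_mode: ScanMode        # 6. raster scan / vector scan
    scan_speed_mm_s: float     # 7. scan speed
    dpi: int                   # 8. dots per inch
    repetitions: int           # 9. number of repetitions
    pre_temp_c: float          # 10. pre-irradiation (preheat) temperature
    irradiation_temp_c: float  # 11. feedback-controlled set temperature
    post_temp_c: float         # 12. post-irradiation (cooling) temperature
    fan_on: bool               # 13. fan ON/OFF

# Example: hypothetical settings for fine surface machining with a UV laser
params = ControlParameters(
    wavelength_nm=351.0, output_w=1.0, pulse_width_s=1e-9,
    pulse_freq_hz=3000.0, spot_diameter_um=5.0, scan_mode=ScanMode.RASTER,
    scan_speed_mm_s=100.0, dpi=600, repetitions=10,
    pre_temp_c=20.0, irradiation_temp_c=25.0, post_temp_c=20.0, fan_on=True,
)
```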
 For example, there are two methods of realizing a "rough/gritty" texture: a method of finely machining the surface of the ingredient in three dimensions, and a method of scorching the surface.
 When the former method, finely machining the surface of the ingredient in three dimensions, is used, the control parameters are set, for example, as follows.
 ・Laser wavelength: short (ultraviolet, blue-violet, etc.)
 ・Pulse width: short (on the order of nanoseconds)
 ・Pulse frequency: low (around a few kHz)
 ・Spot diameter: small (around a few μm)
 ・Scan mode: raster scan
 ・Number of repetitions: set to a predetermined number
 With these settings, a fine texture such as pyramid shapes on the scale of a few μm can be formed on the surface of the ingredient, realizing a "rough/gritty" mouthfeel. The degree of roughness can be adjusted by changing the machining size of the texture.
 When the latter method, scorching the surface, is used, the control parameters are set, for example, as follows.
 ・Laser wavelength: far infrared
 ・Laser output: high (around several tens of watts)
 Predetermined values are set for each of the other control parameters. These settings also make it possible to realize a "rough/gritty" texture, the degree of which can be adjusted by varying the output to change the amount of scorching.
 To realize a "melting/sticky" texture, the control parameters are set, for example, as follows. In this case, an ingredient that melts when heated, such as chocolate or cheese, is selected as the ingredient to be cooked.
 ・Laser wavelength: far infrared
 ・Spot diameter: wide
 ・Irradiation temperature: kept constant by feedback control
 ・Laser output: low
 With these settings, a "melting/sticky" texture can be realized by melting the surface of the ingredient without scorching it. The degree of melting/stickiness can be adjusted, for example, via the irradiation temperature and the post-irradiation temperature. When the ingredient to be cooked is chocolate, adjusting the post-irradiation temperature allows the cocoa butter to be brought into a low-melting-point crystal state (forms I to IV), which makes it possible to keep the chocolate in a melted state even at room temperature.
 To realize a "smooth" texture, the control parameters are set, for example, as follows. In this case as well, an ingredient that melts when heated, such as chocolate or cheese, is selected as the ingredient to be cooked.
 ・Laser wavelength: far infrared
 ・Spot diameter: wide
 ・Irradiation temperature: kept constant by feedback control
 ・Laser output: low
 With these settings, the surface of the ingredient is melted without scorching and then cooled, so that the surface flattens into a mirror-like finish and solidifies. Since using the fan 42 at this stage would raise waves on the liquid surface and make flattening difficult, the fan 42 is turned OFF.
 To realize a "crispy" texture, a method of machining countless holes into the surface of the ingredient is used, with the control parameters set, for example, as follows.
 ・Laser wavelength: short (ultraviolet, blue-violet, etc.)
 ・Pulse width: relatively short (on the order of nanoseconds)
 ・Pulse frequency: relatively high (around a few kHz)
 ・Spot diameter: small (around a few μm)
 ・DPI: set appropriately
 With these settings, many holes on the scale of several to several hundred μm can be opened in the surface of the ingredient. The degree of crispness can be adjusted, for example, by changing the number of repetitions to vary the depth of the holes, or by changing the DPI to vary the number of holes.
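 The texture-to-parameter correspondences described above can be organized as a preset table that overrides a set of defaults. The sketch below is illustrative only; the preset names and numeric values are stand-ins, not values disclosed in the specification.

```python
# Hypothetical presets mapping a target texture to partial control-parameter
# overrides; parameters not listed keep their default values.
TEXTURE_PRESETS = {
    "rough_machined": {   # fine 3D surface machining (UV, short pulses)
        "wavelength_nm": 351.0, "pulse_width_s": 1e-9,
        "pulse_freq_hz": 3000.0, "spot_diameter_um": 5.0,
        "scan_mode": "raster", "repetitions": 10,
    },
    "rough_scorched": {   # scorching the surface (far IR, high output)
        "wavelength_nm": 10600.0, "output_w": 30.0,
    },
    "melting": {          # melt without scorching (wide spot, low output)
        "wavelength_nm": 10600.0, "spot_diameter_um": 500.0,
        "feedback_temp_c": 35.0, "output_w": 2.0,
    },
    "crispy": {           # countless micro-holes (UV, DPI-controlled)
        "wavelength_nm": 351.0, "pulse_width_s": 1e-9,
        "pulse_freq_hz": 3000.0, "spot_diameter_um": 5.0, "dpi": 600,
    },
}

def parameters_for(texture: str, defaults: dict) -> dict:
    """Merge a texture preset over the default control parameters."""
    return {**defaults, **TEXTURE_PRESETS[texture]}

defaults = {"output_w": 5.0, "fan_on": True, "repetitions": 1}
p = parameters_for("rough_scorched", defaults)
```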
 ・Configuration of the information processing server 11
 The information processing server 11 has basically the same configuration as the hardware configuration of the information processing terminal 1 shown in FIG. 10. In the following, the configuration of the information processing terminal 1 shown in FIG. 10 is cited as the configuration of the information processing server 11 as appropriate.
 FIG. 14 is a block diagram showing a functional configuration example of the information processing server 11.
 As shown in FIG. 14, an information processing unit 151 is realized in the information processing server 11. The information processing unit 151 is composed of an input information acquisition unit 161, an inference unit 162, a learning unit 163, and a texture DB 164. At least some of the functional units shown in FIG. 14 are realized by the CPU 101 of FIG. 10, which constitutes the information processing server 11, executing a predetermined program.
 The input information acquisition unit 161 controls the communication unit 112 to communicate with the information processing terminal 1, receives information transmitted from the information processing terminal 1, and outputs it to the inference unit 162. At the start of cooking using the cooking device 2, the information processing terminal 1 transmits texture designation data representing the texture selected by the user, together with information on the basic design.
 The inference unit 162 uses the information supplied from the input information acquisition unit 161 as input to an inference model, thereby inferring a detailed design, a cooking method, and the like that realize the texture selected by the user.
 FIG. 15 is a diagram showing an example of the inference model.
 As shown in FIG. 15, an inference model M1 composed of a neural network or the like is prepared in advance in the inference unit 162. The inference model M1 is generated, for example, by machine learning based on the texture information stored in the texture DB 164.
 Based on the information supplied from the input information acquisition unit 161, the inference unit 162 inputs the texture designation data and the basic design selected by the user into the inference model, and obtains information on the ingredients, the cooking method, and the detailed design. A plurality of detailed designs corresponding to the basic design are output, for example.
 The ingredients obtained as the output of the inference model are the ingredients used to realize the target texture. When the main ingredient to be cooked has been selected by the user, the ingredients to be used in combination with the main ingredient may be output as the ingredients used to realize the target texture.
 The cooking method represents the cooking operations that realize each cooking process of the cooking device 2; that is, it expresses how each part of the cooking device 2 should be driven to realize the target texture. The cooking method also includes, for example, the control parameters.
 Such an inference model is prepared, for example, for each ingredient to be cooked and for each cooking device that the information processing terminal 1 controls. When an inference model is prepared for each cooking device, the inference of the cooking method including the control parameters is performed according to the specifications of that cooking device.
 An inference model may also be prepared for each basic design selectable by the user. The inference unit 162 transmits the output of the inference model to the information processing terminal 1 as the inference result.
 In this way, the inference unit 162 functions as a texture inference engine that infers a cooking method and the like from a texture, or infers a detailed design from a texture and a basic design.
 FIG. 16 is a diagram showing another example of the inference model.
 The same inference as that performed with the inference model M1 shown in FIG. 15 may instead be performed using a plurality of inference models.
 The inference model M11 shown in FIG. 16 is a model that takes the texture designation data as input and outputs the ingredients and cooking method that realize the texture specified by the texture designation data. The inference model M11 is generated, for example, by learning using the texture information described later.
 The inference model M12 is a model that takes the texture designation data and the basic design as input and outputs a detailed design that realizes the texture specified by the texture designation data. The inference model M12 is generated by preparing a plurality of detailed designs based on the basic design and learning from information on the subjective texture experienced when eating ingredients to which each detailed design has been applied.
 The learning unit 163 of FIG. 14 performs learning based on the texture information and the like stored in the texture DB 164, and generates the parameters constituting the inference models. Based on the parameters generated by the learning unit 163, inference models such as those shown in FIGS. 15 and 16 are prepared in the inference unit 162.
 FIG. 17 is a diagram showing an example of the texture information.
 The texture information shown in FIG. 17 is texture information for chocolate. Based on the texture information for chocolate, for example, an inference model for chocolate is trained.
 One piece of texture information is formed by associating an ingredient, a state, a serving temperature, a laser cooking process, and numerical values representing textures. In the example of FIG. 17, numerical values for "melting", "fluffy", and "crispy" are associated as the values representing texture. More kinds of texture values may be included in the texture information.
 For example, the texture information in the top row associates the ingredient "white chocolate", the state "liquid", the serving temperature "40°C", the laser cooking process "output 20%, speed 30 mm/s, no cooling", and the texture values "90" for "melting", "10" for "fluffy", and "0" for "crispy".
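 A row of this table can be represented as a simple record. The sketch below mirrors the top row of FIG. 17; the field names are assumptions made for illustration.

```python
# One texture-information record, mirroring the top row of FIG. 17.
record = {
    "ingredient": "white chocolate",
    "state": "liquid",
    "serving_temp_c": 40,
    "laser_process": {"output_pct": 20, "speed_mm_s": 30, "cooling": False},
    # Subjective texture scores collected by taste-testing, per ingredient
    # and per basic design.
    "texture": {"melting": 90, "fluffy": 10, "crispy": 0},
}
```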
 Texture information of this kind is generated, for example, by having testers taste ingredients cooked in various ways using a device similar to the cooking device 2 and express their subjective texture as numerical values, for each ingredient and each basic design. The texture DB 164 stores texture information corresponding to various ingredients and various basic designs.
 For example, in the learning unit 163, a neural network that takes the "melting", "fluffy", and "crispy" values as input and outputs an ingredient and a laser cooking process is trained, thereby generating the inference model M11 shown in FIG. 16. The laser cooking process inferred using this inference model corresponds to the cooking method provided to the information processing terminal 1 as the inference result.
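 As a stand-in for the trained neural network, the mapping from target texture values to an ingredient and laser cooking process can be illustrated with a nearest-neighbor lookup over texture DB rows. This is a sketch under stated assumptions: the records, field names, and distance metric are inventions for illustration, not the disclosed model.

```python
import math

# Hypothetical texture DB rows: texture scores -> (ingredient, laser process)
TEXTURE_DB = [
    ({"melting": 90, "fluffy": 10, "crispy": 0},
     ("white chocolate", {"output_pct": 20, "speed_mm_s": 30, "cooling": False})),
    ({"melting": 5, "fluffy": 10, "crispy": 85},
     ("dark chocolate", {"output_pct": 60, "speed_mm_s": 100, "cooling": True})),
]

def infer_process(target: dict):
    """Return the (ingredient, process) whose texture scores lie closest
    to the user's target, by Euclidean distance over the score vector."""
    def dist(scores):
        return math.sqrt(sum((scores[k] - target.get(k, 0)) ** 2 for k in scores))
    return min(TEXTURE_DB, key=lambda row: dist(row[0]))[1]

# A target close to "melting" should select the white-chocolate row
ingredient, process = infer_process({"melting": 80, "fluffy": 20, "crispy": 5})
```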
 Not only texture information for a single ingredient, but also texture information representing the texture obtained by cooking a combination of plural ingredients may be used for learning. When texture information is lacking, the relationship between texture and the other information may be modeled, for example as a linear model, so that missing texture information is generated by linear interpolation. Missing texture information may also be generated by inferring the texture information itself with an inference model generated by machine learning.
 <Operation of each device>
 ・Operation of the information processing terminal 1
 Here, the automatic cooking process of the information processing terminal 1 will be described with reference to the flowchart of FIG. 18. The process of FIG. 18 is started, for example, when an ingredient to be cooked is selected. Here, the case where plate-shaped chocolate is selected as the ingredient to be cooked will be described.
 In step S1, the presentation unit 131 (FIG. 11) causes the display 109 to display a basic design selection screen corresponding to the ingredient to be cooked.
 FIG. 19 is a diagram showing an example of the basic design selection screen.
 The basic design selection screen displays images representing samples of basic designs. In the example of FIG. 19, images 201 to 203 representing three kinds of basic design samples are displayed. The user can choose a preferred basic design by selecting one of the images 201 to 203.
 When a basic design has been selected, in step S2 the presentation unit 131 causes the display 109 to display a texture selection screen.
 FIG. 20 is a diagram showing an example of the texture selection screen.
 In the upper center of the texture selection screen, an image 203 representing the selected basic design is displayed. In the lower part of the screen, a slide bar 211 used to select a texture value is displayed for each kind of texture. By operating the slide bars 211, the user can select a preferred value for each kind of texture.
 The texture of each part of the ingredient may also be made selectable using the slide bar 211 display.
 FIG. 21 is a diagram showing another example of the texture selection screen.
 In the example of FIG. 21, an ingredient image 221, which is an image representing the ingredient to be cooked, is displayed in approximately the center of the screen, with a grid superimposed on it. The user can select the texture of each part of the ingredient by selecting an arbitrary cell formed by the grid and then selecting a value representing the texture of the selected part using the slide bar 211 display.
 Texture templates may also be prepared so that no selection of the part whose texture is to be specified is required. In this case, the user can select the texture of each part of the ingredient by selecting a template.
 FIG. 22 is a diagram showing examples of texture templates.
 A of FIG. 22 represents a template in which the texture changes in the left-right direction, and B of FIG. 22 represents a template in which the texture changes in the up-down direction. C of FIG. 22 represents a template in which the texture changes from the center toward the outside.
 The texture of each part of the ingredient may also be selected by voice, rather than by operations on the screen display.
 Returning to the description of FIG. 18, after the texture of each part of the ingredient has been selected, in step S3 the texture designation data generation unit 132 generates texture designation data based on the texture selected by the user for each part of the ingredient, and outputs the generated texture designation data to the inference result acquisition unit 133.
 In step S4, the inference result acquisition unit 133 transmits the texture designation data supplied from the texture designation data generation unit 132 to the information processing server 11 and acquires an inference result. The information processing server 11 returns the result of the inference it performed based on the texture designation data. As described above, the inference result includes the ingredients and cooking method that realize the target texture.
 The inference result acquisition unit 133 also transmits information on the basic design selected by the user to the information processing server 11 and acquires candidates for the detailed design corresponding to the basic design.
 In step S5, the presentation unit 131 presents the detailed design candidates.
 FIG. 23 is a diagram showing an example of a screen presenting the detailed design candidates.
 The presentation screen displays images representing samples of detailed designs. In the example of FIG. 23, images 231 to 236 representing six kinds of detailed design samples are displayed. The user can choose a preferred detailed design by selecting one of the images 231 to 236.
 When a detailed design has been selected, in step S6 of FIG. 18 the control parameter setting unit 134 sets the control parameters based on the inference result supplied from the inference result acquisition unit 133 and the detailed design selected by the user.
 In step S7, the recipe data generation unit 136 generates recipe data based on the control parameters set by the control parameter setting unit 134.
 In step S8, the control unit 137 controls the operation of the cooking device 2 based on the recipe data generated by the recipe data generation unit 136. In the cooking device 2, a cooking operation for realizing the texture designated by the user is performed under the control of the control unit 137. The control by the control unit 137 continues until the cooking based on the recipe data is completed.
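 The device-driving loop of step S8 — selecting cooking process data sets one by one and issuing commands that carry the control parameters — can be sketched as follows. The command format and the `send_command` transport are hypothetical; the disclosure does not specify them.

```python
def run_recipe(recipe: list, send_command) -> int:
    """Drive the cooking device by selecting cooking process data sets
    one by one and sending one command per irradiation coordinate.
    `recipe` is a list of dicts with 'operations' and 'control_params';
    `send_command` is the (hypothetical) transport to the device."""
    sent = 0
    for process in recipe:                      # one cooking process at a time
        for t, x, y in process["operations"]:   # time-series irradiation positions
            send_command({
                "time_s": t, "x_mm": x, "y_mm": y,
                "params": process["control_params"],  # commands carry the parameters
            })
            sent += 1
    return sent

# Example with a stub transport that merely records the commands
log = []
n = run_recipe(
    [{"operations": [(0.0, 1.0, 2.0), (0.1, 1.5, 2.0)],
      "control_params": {"laser_output_w": 5.0}}],
    log.append,
)
```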
・Operation of the information processing server 11
Next, the inference processing of the information processing server 11 will be described with reference to the flowchart of FIG. 24. This process starts when the texture designation data and the basic design information transmitted from the information processing terminal 1 are acquired by the input information acquisition unit 161.
In step S11, the inference unit 162 inputs the texture designation data and other information supplied from the input information acquisition unit 161 into the inference model and obtains inference results for the ingredients, the cooking method, and the detailed designs.
In step S12, the inference unit 162 transmits the output of the inference model to the information processing terminal 1 as the inference result.
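Steps S11 and S12 treat the inferencer as a function from texture designation data to a cooking method; per configuration (5) below, it may be a neural network. A minimal, self-contained sketch, assuming a three-dimensional texture vector and a toy set of cooking methods (the weights are untrained random values and all names are illustrative):

```python
import numpy as np

# Minimal sketch of the inferencer of steps S11-S12: a small feed-forward
# network mapping a texture designation vector (assumed dimensions:
# hardness, stickiness, crispness in [0, 1]) to scores over cooking
# methods. Real weights would come from training on the texture DB;
# these are fixed toy values purely for illustration.

rng = np.random.default_rng(0)
METHODS = ["bake_strong", "bake_weak", "no_heat"]

W1 = rng.normal(size=(3, 8))
b1 = np.zeros(8)
W2 = rng.normal(size=(8, len(METHODS)))
b2 = np.zeros(len(METHODS))

def infer_cooking_method(texture_vec):
    h = np.maximum(0.0, texture_vec @ W1 + b1)   # ReLU hidden layer
    scores = h @ W2 + b2
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()                         # softmax over methods
    return METHODS[int(np.argmax(probs))], probs

method, probs = infer_cooking_method(np.array([0.9, 0.1, 0.8]))
print(method, np.round(probs, 3))
```

In the described system the chosen method, not the raw scores, is what gets returned to the information processing terminal 1.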
Through the above series of processes, by specifying the desired texture in words or as numerical values, the user can be presented with candidates for the final form (detailed design) of the ingredient that realizes that texture.
The user can also have recipe data generated that describes the ingredients, cooking methods, and so on needed to achieve the desired texture. By having the cooking device 2 cook based on such recipe data, the user can actually be served food with the desired texture.
<Selection of the main ingredient>
Candidates for the ingredient to be cooked may be selected by the information processing server 11 and presented to the user. For example, the user specifies a desired texture and then chooses a preferred ingredient from the candidates presented in response to that specification. The ingredient to be cooked may also be selected automatically by the information processing server 11.
After the ingredient to be cooked is selected, the information processing server 11 uses the texture inferencer described above to infer design candidates capable of realizing the specified texture, and these are presented to the user.
It is also possible to select multiple ingredients to be cooked. When multiple ingredients are selected, such as white chocolate and dark chocolate, the ratio of the ingredients (6:4, for example) is also selected.
The processing after the ingredient ratio and the design are selected is the same as described above. That is, recipe data is generated in the information processing terminal 1 based on the inferred cooking method, the cooking operation of the cooking device 2 is controlled based on the recipe data, and the texture specified by the user is realized.
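When multiple ingredients and a ratio such as 6:4 are selected, the amount of each ingredient follows directly from the ratio. A trivial sketch (the function name and gram units are assumptions for illustration):

```python
# Sketch: computing per-ingredient amounts from a selected blend ratio,
# e.g. white and dark chocolate at 6:4. Names and units are illustrative.

def split_by_ratio(total_g, ingredients, ratio):
    """Divide total_g grams among ingredients according to ratio."""
    whole = sum(ratio)
    return {name: total_g * r / whole for name, r in zip(ingredients, ratio)}

amounts = split_by_ratio(100.0, ["white_chocolate", "dark_chocolate"], (6, 4))
print(amounts)  # {'white_chocolate': 60.0, 'dark_chocolate': 40.0}
```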
<Application example: tempering>
Cooking with the laser-based cooking device 2 can also be applied to the tempering of chocolate. Tempering is a temperature-control operation in which chocolate melted at 50°C or higher is cooled while being stirred, held at 27°C for a fixed time, and then reheated to about 31°C.
Chocolate consists of fine particles of sugar and cocoa solids dispersed in cocoa butter; the continuous phase is the fat. Chocolate solidifies when the continuous-phase fat crystallizes and becomes liquid when it melts. Cocoa butter crystals are polymorphic, taking six forms known as Form I through Form VI, and Form V is the one best suited to chocolate products. Of the six polymorphs of cocoa butter, only Form V must be selectively crystallized.
Tempering produces Form V crystal nuclei of cocoa butter in the liquid chocolate, and subsequent cooling crystallizes all the cocoa butter molecules as Form V. Because the cocoa butter crystallizes as Form V, the chocolate has a smooth mouthfeel. Note that the tempering temperatures differ depending on the proportions of sugar and milk powder added to the chocolate product.
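The temperature operation described above (melt at 50°C or above, cool to 27°C while stirring, hold, then reheat to about 31°C) can be written as a simple setpoint schedule of the kind a temperature controller might follow. Only the temperatures come from the text; the phase durations below are made-up placeholders.

```python
# Tempering setpoint schedule from the description above: melt at >= 50 C,
# cool to 27 C while stirring, hold, then reheat to about 31 C.
# Temperatures follow the text; the durations are arbitrary placeholders.

SCHEDULE = [
    # (phase name, target temperature in C, duration in minutes)
    ("melt", 50.0, 10),
    ("cool_stir", 27.0, 15),
    ("hold", 27.0, 5),
    ("reheat", 31.0, 5),
]

def setpoint_at(minutes):
    """Return (phase, target C) for an elapsed time in minutes."""
    t = 0
    for phase, temp, dur in SCHEDULE:
        t += dur
        if minutes < t:
            return phase, temp
    return "done", None

print(setpoint_at(0))    # ('melt', 50.0)
print(setpoint_at(12))   # ('cool_stir', 27.0)
print(setpoint_at(33))   # ('reheat', 31.0)
```

In practice the schedule would also be adjusted for the sugar and milk powder content, as noted above.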
Typical tempering methods are manual work on a marble slab or the use of a dedicated machine called a tempering machine, but both methods can only be applied to chocolate before molding, and only to the chocolate as a whole.
With the cooking device 2, tempering can be performed on an already-molded chocolate product by partially irradiating it with laser light. Specifically, the fat (cocoa butter) only at the irradiated spot is selectively melted and then cooled, and this partial laser heating and cooling is repeated as necessary. This makes it possible to change the fat at only selected spots of the chocolate product into an arbitrary crystalline state. The fan 42 or the like may also be used to heat or cool the chocolate as a whole.
<Others>
・Example control devices
Although control of the cooking device 2 has been described as being performed by the information processing terminal 1, it may be performed by another device.
FIG. 25 shows examples of control devices.
As shown in A of FIG. 25, the cooking device 2 can be controlled by the information processing server 11.
In this case, the information processing unit 121 of FIG. 11 is provided in the information processing server 11. The information processing server 11 provided with the information processing unit 121 infers the cooking method and the like according to the texture selected by the user and generates recipe data. The cooking device 2 is then controlled via the network 12 according to the description in the recipe data.
As shown in B of FIG. 25, the information processing unit 121 can also be provided within the cooking device 2.
In this way, the cooking device 2 can be controlled by various devices other than the information processing terminal 1.
・Example cooking devices
Although the cooking device to be controlled has been described as the cooking device 2, which cooks by irradiating laser light, the control described above may be performed on other cooking devices.
FIG. 26 shows another configuration example of the control system.
In the control system shown in FIG. 26, a heating cooker 301 such as a microwave oven is provided in place of the cooking device 2. The heating cooker 301 performs a cooking operation (heating) according to commands supplied from the information processing unit 121. As control parameters, information representing, for example, the heating intensity, the heating time, and changes in heating is used.
In this way, recipe data can be used to control various appliances that perform cooking operations automatically. Besides microwave ovens, various cooking devices such as 3D printers that mold ingredients and arm robots that cook using arms can be controlled based on recipe data.
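For a heating cooker like the one in FIG. 26, the control parameters named above (heating intensity, heating time, changes in heating) map naturally onto a command sequence. The command format in this sketch is an assumption for illustration, not something defined in the publication.

```python
# Sketch: turning recipe steps into commands for a heating cooker such as
# a microwave oven, using the parameters named above: heating intensity,
# heating time, and changes in heating. The command format is hypothetical.

def to_commands(recipe_steps):
    """Each step gives an intensity (watts), a time (seconds), and
    optionally a ramp describing how the heating changes over the step."""
    commands = []
    for step in recipe_steps:
        cmd = {"op": "heat", "watts": step["watts"], "seconds": step["seconds"]}
        if "ramp_to_watts" in step:  # a change in heating over the step
            cmd["ramp_to_watts"] = step["ramp_to_watts"]
        commands.append(cmd)
    return commands

steps = [
    {"watts": 500, "seconds": 60},
    {"watts": 500, "seconds": 30, "ramp_to_watts": 200},
]
print(to_commands(steps))
```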
Cooking may also be performed using a cooking device 2 in which multiple laser heads 34 are arranged in an array or in a plane. This makes it possible to cook efficiently and to cook different parts of the ingredient differently across its entire surface.
・Examples of ingredients to be cooked
Although the case where the ingredient to be cooked is chocolate has been mainly described, the cooking device 2 can also cook various other ingredients. For example, by irradiating a shelled raw egg with laser light, characters can be engraved on the yolk. By irradiating a banana peel with laser light, the peel can be treated so that, through oxidation, characters appear after an arbitrary length of time.
・About the program
The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, the program constituting the software is installed on a computer built into dedicated hardware, a general-purpose personal computer, or the like.
The program to be installed is provided recorded on the removable medium 114 shown in FIG. 10, consisting of an optical disc (CD-ROM (Compact Disc-Read Only Memory), DVD (Digital Versatile Disc), or the like), a semiconductor memory, or the like. It may also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital broadcasting. The program can also be installed in advance in the ROM 102 or the storage unit 111.
The program executed by the computer may be a program in which processing is performed in time series in the order described in this specification, or a program in which processing is performed in parallel or at necessary timings, such as when a call is made.
In this specification, a system means a set of multiple components (devices, modules (parts), and so on), regardless of whether all the components are in the same housing. Therefore, multiple devices housed in separate housings and connected via a network, and a single device in which multiple modules are housed in one housing, are both systems.
Note that the effects described in this specification are merely examples and are not limiting; other effects may also be obtained.
Embodiments of the present technology are not limited to the embodiments described above, and various modifications are possible without departing from the gist of the present technology.
For example, the present technology can take the form of cloud computing, in which a single function is shared and processed jointly by multiple devices via a network.
Each step described in the flowcharts above can be executed by a single device or shared among multiple devices.
Furthermore, when a single step includes multiple processes, those processes can be executed by a single device or shared among multiple devices.
・Example combinations of configurations
The present technology can also take the following configurations.
(1)
An information processing device including a parameter setting unit that sets parameters defining cooking operations by a cooking device, based on texture designation data that specifies the texture of each part of an ingredient to be cooked.
(2)
The information processing device according to (1), in which the parameter setting unit sets the parameters according to the specifications of the cooking device.
(3)
The information processing device according to (1) or (2), in which the parameter setting unit sets the parameters based on the texture designation data specifying a texture selected by a user.
(4)
The information processing device according to any one of (1) to (3), in which the parameter setting unit sets the parameters based on a cooking method inferred by using the texture designation data as input to an inferencer generated by learning based on texture information that includes information on cooking methods and information on the texture of the ingredient after cooking by each cooking method.
(5)
The information processing device according to (4), in which the inferencer is configured as a neural network that takes the texture designation data as input and outputs a cooking method that realizes the texture represented by the texture designation data.
(6)
The information processing device according to any one of (1) to (5), further including a presentation unit that presents a selection screen used for selecting a texture.
(7)
The information processing device according to (6), in which the presentation unit presents candidates for the design of the ingredient that realize the selected texture, according to the specifications of the cooking device, and the parameter setting unit sets the parameters according to the selected design candidate.
(8)
The information processing device according to any one of (1) to (7), further including an ingredient state detection unit that detects the state of the ingredient being cooked by the cooking device, in which the parameter setting unit adjusts the parameters according to the state of the ingredient.
(9)
The information processing device according to any one of (1) to (8), in which the texture designation data specifies a texture of the ingredient that differs from part to part.
(10)
The information processing device according to any one of (1) to (9), in which the texture designation data specifies the texture of each part of the ingredient on a plane or of each part in three-dimensional space.
(11)
The information processing device according to any one of (1) to (10), in which the cooking device is a device that cooks by irradiating laser light, and the parameter setting unit sets the parameters defining the characteristics of the laser light with which the ingredient is irradiated.
(12)
The information processing device according to any one of (1) to (11), in which the parameter setting unit sets the parameters defining the content of each cooking operation included in a cooking process, the device further including a cooking process information generation unit that generates cooking process information including the parameters as information on each cooking operation.
(13)
The information processing device according to any one of (1) to (12), further including a texture designation data generation unit that generates the texture designation data according to a texture selected by a user.
(14)
An information processing method in which an information processing device sets parameters defining cooking operations by a cooking device, based on texture designation data that specifies the texture of each part of an ingredient to be cooked.
(15)
A program for causing a computer to execute a process of setting parameters defining cooking operations by a cooking device, based on texture designation data that specifies the texture of each part of an ingredient to be cooked.
1 information processing terminal, 2 cooking device, 11 information processing server, 121 information processing unit, 131 presentation unit, 132 texture designation data generation unit, 133 inference result acquisition unit, 134 control parameter setting unit, 135 ingredient state detection unit, 136 recipe data generation unit, 137 control unit, 161 input information acquisition unit, 162 inference unit, 163 learning unit, 164 texture DB

Claims (15)

1. An information processing device including a parameter setting unit that sets parameters defining cooking operations by a cooking device, based on texture designation data that specifies the texture of each part of an ingredient to be cooked.
2. The information processing device according to claim 1, in which the parameter setting unit sets the parameters according to the specifications of the cooking device.
3. The information processing device according to claim 1, in which the parameter setting unit sets the parameters based on the texture designation data specifying a texture selected by a user.
4. The information processing device according to claim 1, in which the parameter setting unit sets the parameters based on a cooking method inferred by using the texture designation data as input to an inferencer generated by learning based on texture information that includes information on cooking methods and information on the texture of the ingredient after cooking by each cooking method.
5. The information processing device according to claim 4, in which the inferencer is configured as a neural network that takes the texture designation data as input and outputs a cooking method that realizes the texture represented by the texture designation data.
6. The information processing device according to claim 3, further including a presentation unit that presents a selection screen used for selecting a texture.
7. The information processing device according to claim 6, in which the presentation unit presents candidates for the design of the ingredient that realize the selected texture, according to the specifications of the cooking device, and the parameter setting unit sets the parameters according to the selected design candidate.
8. The information processing device according to claim 1, further including an ingredient state detection unit that detects the state of the ingredient being cooked by the cooking device, in which the parameter setting unit adjusts the parameters according to the state of the ingredient.
9. The information processing device according to claim 1, in which the texture designation data specifies a texture of the ingredient that differs from part to part.
10. The information processing device according to claim 9, in which the texture designation data specifies the texture of each part of the ingredient on a plane or of each part in three-dimensional space.
11. The information processing device according to claim 1, in which the cooking device is a device that cooks by irradiating laser light, and the parameter setting unit sets the parameters defining the characteristics of the laser light with which the ingredient is irradiated.
12. The information processing device according to claim 1, in which the parameter setting unit sets the parameters defining the content of each cooking operation included in a cooking process, the device further including a cooking process information generation unit that generates cooking process information including the parameters as information on each cooking operation.
13. The information processing device according to claim 1, further including a texture designation data generation unit that generates the texture designation data according to a texture selected by a user.
14. An information processing method in which an information processing device sets parameters defining cooking operations by a cooking device, based on texture designation data that specifies the texture of each part of an ingredient to be cooked.
15. A program for causing a computer to execute a process of setting parameters defining cooking operations by a cooking device, based on texture designation data that specifies the texture of each part of an ingredient to be cooked.
PCT/JP2021/004222 2020-02-20 2021-02-05 Information processing device, information processing method, and program WO2021166673A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020027199 2020-02-20
JP2020-027199 2020-02-20

Publications (1)

Publication Number Publication Date
WO2021166673A1

Family

ID=77392052

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/004222 WO2021166673A1 (en) 2020-02-20 2021-02-05 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2021166673A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023042286A1 (en) * 2021-09-15 2023-03-23 日本電気株式会社 Food development support device, food development support method, and food development support program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016116497A (en) * 2014-12-24 2016-06-30 Tdk株式会社 Food production system
WO2018206810A1 (en) * 2017-05-12 2018-11-15 Koninklijke Philips N.V. Cooking appliance
CN108897245A (en) * 2018-07-16 2018-11-27 华中农业大学 A kind of intelligent cooking system
WO2019102509A1 (en) * 2017-11-21 2019-05-31 Illycaffe' S.P.A. Machine to dispense coffee-based beverages, and corresponding dispensing method and program
US20190364952A1 (en) * 2016-12-14 2019-12-05 Nestec S.A. Food processing system and associated method



Similar Documents

Publication Publication Date Title
Hertafeld et al. Multi-material three-dimensional food printing with simultaneous infrared cooking
CN212157290U (en) Cooking aid
CN107427044B (en) Apparatus and method for heating and cooking food using laser beam and electromagnetic radiation
KR102131161B1 (en) Microwave voice control method and microwave
EP3665419A1 (en) Configurable cooking systems and methods
US20180338354A1 (en) Pattern recognizing appliance
JP2018517532A (en) Autonomous cooking device for preparing food from recipe file and method for creating recipe file
US11339971B2 (en) Oven with automatic control of pan elevation
JP5026081B2 (en) Intelligent cooking method
CN110431916A (en) The method of cooking stove component and monitoring cooking process with monitoring system
WO2021166673A1 (en) Information processing device, information processing method, and program
AU2017294574A1 (en) Generating a cooking process
CN110806699A (en) Control method and device of cooking equipment, cooking equipment and storage medium
KR20180080307A (en) Cooking method for electric cooking apparatus having stirring means
Abd Rahman et al. Response surface optimization for hot air‐frying technique and its effects on the quality of sweet potato snack
CN103284620B (en) Frying and roasting machine
JP2006304796A (en) Method for cooking or boiling in frying pan
CN112361387A (en) Gas stove fire power control method, fire power control system and gas stove
Blutinger et al. The future of software-controlled cooking
US20220047109A1 (en) System and method for targeted heating element control
Blutinger et al. Characterization of CO2 laser browning of dough
KR20210134619A (en) cooking robot, cooking robot control device, control method
CN111552259A (en) Intelligent kitchen ware control system based on Internet of things
CN109691903B (en) Heating method of frying and baking machine, heating device, frying and baking machine and computer storage medium
WO2019171618A1 (en) Cooking information system and server

Legal Events

- 121 (Ep): The EPO has been informed by WIPO that EP was designated in this application. Ref document number: 21756984; country of ref document: EP; kind code of ref document: A1.
- NENP: Non-entry into the national phase. Ref country code: DE.
- 122 (Ep): PCT application non-entry in European phase. Ref document number: 21756984; country of ref document: EP; kind code of ref document: A1.
- NENP: Non-entry into the national phase. Ref country code: JP.