WO2019065304A1 - Food provision system, food provision method, and device for managing food provision system - Google Patents

Food provision system, food provision method, and device for managing food provision system

Info

Publication number
WO2019065304A1
Authority
WO
WIPO (PCT)
Prior art keywords
food
data
unit
taste
robot
Prior art date
Application number
PCT/JP2018/034162
Other languages
French (fr)
Japanese (ja)
Inventor
茂憲 蛭田
大輔 村松
史郎 北村
祐至 齋藤
修司 仲山
Original Assignee
本田技研工業株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 本田技研工業株式会社 filed Critical 本田技研工業株式会社
Priority to JP2019544586A
Publication of WO2019065304A1

Classifications

    • A HUMAN NECESSITIES
    • A23 FOODS OR FOODSTUFFS; TREATMENT THEREOF, NOT COVERED BY OTHER CLASSES
    • A23P SHAPING OR WORKING OF FOODSTUFFS, NOT FULLY COVERED BY A SINGLE OTHER SUBCLASS
    • A23P 20/00 Coating of foodstuffs; Coatings therefor; Making laminated, multi-layered, stuffed or hollow foodstuffs
    • A23P 20/20 Making of laminated, multi-layered, stuffed or hollow foodstuffs, e.g. by wrapping in preformed edible dough sheets or in edible food containers
    • A HUMAN NECESSITIES
    • A23 FOODS OR FOODSTUFFS; TREATMENT THEREOF, NOT COVERED BY OTHER CLASSES
    • A23P SHAPING OR WORKING OF FOODSTUFFS, NOT FULLY COVERED BY A SINGLE OTHER SUBCLASS
    • A23P 30/00 Shaping or working of foodstuffs characterised by the process or apparatus
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B29 WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29C SHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
    • B29C 64/00 Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
    • B29C 64/30 Auxiliary operations or equipment
    • B29C 64/386 Data acquisition or data processing for additive manufacturing
    • B29C 64/393 Data acquisition or data processing for additive manufacturing for controlling or regulating additive manufacturing processes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B33 ADDITIVE MANUFACTURING TECHNOLOGY
    • B33Y ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y 30/00 Apparatus for additive manufacturing; Details thereof or accessories therefor
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B33 ADDITIVE MANUFACTURING TECHNOLOGY
    • B33Y ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y 50/00 Data acquisition or data processing for additive manufacturing
    • B33Y 50/02 Data acquisition or data processing for additive manufacturing for controlling or regulating additive manufacturing processes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics

Definitions

  • The present invention relates to a food providing system that provides food manufactured by a 3D printer or the like, a food providing method, and a management apparatus for the food providing system.
  • Conventionally, an apparatus that manufactures food using a 3D printer is known (see, for example, Patent Document 1).
  • In the device of Patent Document 1, food material is laminated based on cross sections of CAD data while coloring and flavoring agents are added, producing a food of a predetermined shape with a desired color and flavor.
  • Patent Document 1 International Publication No. 2015/106059
  • A food providing system according to one aspect of the present invention includes: a data output unit that outputs shape data, color data, and taste data of a food; a three-dimensional modeling apparatus that laminates food material based on the shape data and taste data output from the data output unit to manufacture a shaped food having a shape and taste corresponding to the food; a user device worn by the user and having a display unit that displays a three-dimensional image matched to the real space; and a display control unit that controls the display unit, based on the color data output from the data output unit, so that a three-dimensional image of the shaped food manufactured by the three-dimensional modeling apparatus is displayed in the same color as the food.
  • A food providing method according to another aspect of the present invention includes: outputting shape data, color data, and taste data of a food; laminating food material based on the output shape data and taste data to manufacture a shaped food having a shape and taste corresponding to the food; and controlling a display unit of a user device worn by the user, based on the output color data, so that a three-dimensional image of the manufactured shaped food matched to the real space is displayed in the same color as the food.
  • A management apparatus for a food providing system according to still another aspect of the present invention includes: a signal input unit that receives data transmitted from a user device and a robot; a data output unit that specifies and outputs shape data, color data, and taste data of a food based on the data received through the signal input unit; a printer control unit that outputs the shape data and taste data output from the data output unit to an external three-dimensional modeling device; and a user device control unit that outputs the color data output from the data output unit to an external user device so that a three-dimensional image of the shaped food manufactured by the three-dimensional modeling device is displayed in the same color as the food.
  • FIG. 2 is a diagram showing the configuration of the user device of FIG. 1.
  • FIG. 3 is a diagram showing the schematic configuration of the 3D printer of FIG. 1.
  • FIG. 4 is a diagram showing the configuration of the robot of FIG. 1.
  • FIG. 5 is a block diagram showing the schematic configuration of the management server of FIG. 1.
  • FIG. 6 is a flowchart showing an example of the processing executed by the computation unit of FIG. 5.
  • The food providing system is applied to a robot system and provides food to a user using information acquired by a robot that operates according to commands from the user. More specifically, the user is made aware, through image signals from a camera on a robot at a remote place, of a food that can be ordered at that remote place (the ordered food); in response to an instruction from the user, a food corresponding to the ordered food is manufactured by a 3D printer, and the manufactured food (the shaped food) is provided to the user.
  • The ordered food and the shaped food have the same shape and taste but differ in color.
  • The robot can act alone according to commands from the user, but the following describes an example in which the robot acts not alone but together with a third party, such as the user's family.
  • The robot is operated according to commands from the user, and the information acquired by the robot is provided to the user.
  • Information about food at the travel destination is provided to the user via the robot, and the food is manufactured by the 3D printer using that information. This lets the user feel as if eating together with the family at the travel destination.
  • FIG. 1 is a view schematically showing the overall configuration of a food providing system 100 according to an embodiment of the present invention.
  • The food providing system 100 includes a user device 10 worn by a user 1 located at point A, a 3D printer 20 disposed at point A, a robot 30 located at point B and operated by the user 1 at point A, and a management server 40.
  • The user device 10, the 3D printer 20, the robot 30, and the management server 40 are communicably connected via a network 2 including a wireless communication network such as an Internet line.
  • Point A is, for example, the user's home, and point B is a point distant from point A, for example in a different region. Points A and B may even be in different countries.
  • The robot 30 at point B is rented from a store 3 at or near point B. That is, the robot 30 is rented by the family of the user 1, who visit the store 3, and acts with the family of the user 1 at point B. The robot 30 is returned to the store 3 by the family when the trip at point B ends.
  • Each robot 30 of the store 3 is previously given a unique identification ID.
  • FIG. 2 is a diagram showing the configuration of the user device 10.
  • The user device 10 is, for example, a wearable computer having a generally helmet-like shape, and is worn on the user's head.
  • The user device 10 includes a plurality of sensors 11 that detect brain activity, such as the user's brain waves, magnetoencephalographic signals, and cerebral blood flow. That is, the user device 10 is provided with a so-called brain machine interface (BMI), which detects the user's thoughts or intentions from brain activity signals and realizes machine operation without use of the body.
  • the user device 10 further includes a display 12, a microphone 13, a speaker 14, an input device 15, a controller 16, a wireless unit 17, and a camera 18.
  • The display 12 is, for example, a non-transmissive head mounted display; it is disposed so as to cover the area around the user's eyes, and camera images from the robot 30 are displayed on it. The user's nose and mouth are left exposed, not covered by the user device 10.
  • The outer surface of the display 12 is provided with a pair of cameras 18, each having an imaging element such as a CCD, at positions corresponding to the user's eyes.
  • the display 12 can also display a three-dimensional image of the real space taken by the camera 18. A virtual image can be superimposed and displayed on this three-dimensional image.
  • the microphone 13 is provided so as to be movable toward the user's mouth, and inputs an audio signal according to the user's speech.
  • the speaker 14 is disposed at the user's ear and outputs sound.
  • the input device 15 is constituted by a switch, a touch panel or the like operated by the user, and can input various information such as personal information of the user via the input device 15.
  • the controller 16 includes a microcomputer having a CPU, a ROM, a RAM, etc., and controls the wireless unit 17 to communicate with the management server 40.
  • For example, the controller 16 causes the signals from the sensor 11 and the microphone 13 to be transmitted to the management server 40.
  • the controller 16 outputs a control signal to the display 12 or the speaker 14 based on the signal transmitted from the management server 40.
  • FIG. 3 is a block diagram showing a schematic configuration of the 3D printer 20.
  • the 3D printer 20 creates an object by stacking materials based on shape data of a three-dimensional model, and in the present embodiment, food is stacked to manufacture a food (a shaped food).
  • the 3D printer 20 includes an input device 21, a food material storage unit 22, a flavor storage unit 23, a mixing unit 24, a nozzle 25, a controller 26, and a wireless unit 27.
  • the food material storage unit 22 stores a plurality of food materials having different hardnesses such as agar, konjac, jelly and the like in a different container in a state of being cut into pieces.
  • the foodstuff storage part 22 can also store powdered foodstuff in a container.
  • the flavor storage unit 23 stores basic flavors such as sweet, bitter, sour, salty and umami in the form of powder or liquid in different containers.
  • The mixing unit 24 mixes a predetermined amount of a predetermined type of food material supplied from the food material storage unit 22 with a predetermined amount of a predetermined type of flavor supplied from the flavor storage unit 23.
  • The nozzle 25 moves according to coordinate data output from the controller 26 and laminates the flavored food material supplied from the mixing unit 24 while ejecting it from its tip, thereby manufacturing a three-dimensionally shaped food.
  • the controller 26 includes a microcomputer having an arithmetic unit 26A such as a CPU, a storage unit 26B such as a ROM and a RAM, and various peripheral circuits.
  • Arithmetic unit 26A has a data input unit 261 and a signal output unit 262 as a functional configuration.
  • the data input unit 261 takes in a signal input from the input device 21 and takes in a signal (food data) transmitted from the management server 40 via the wireless unit 27.
  • The signal output unit 262 outputs control signals to the operation units (actuators and the like) of the food material storage unit 22, the flavor storage unit 23, the mixing unit 24, and the nozzle 25 according to the signals taken in by the data input unit 261, thereby controlling their operations.
  • FIG. 4 is a view showing the configuration of the robot 30.
  • the robot 30 is a humanoid robot having a head, a body, two arms, and two legs, and can move by itself by bipedal walking.
  • the height of the robot 30 is close to the height of an adult, for example, about 140 to 160 cm.
  • The robot 30 has a plurality of sensors with detection functions corresponding to the five human senses for perceiving the external world: a vision sensor 311, an auditory sensor 312, a tactile sensor 313, an odor sensor 314, and a taste sensor 315.
  • The sensors 311 to 315 output signals corresponding to the human five senses (five-sense signals) as detection signals.
  • The vision sensor 311 is a camera comprising an imaging unit, with a lens and an image sensor such as a CMOS or CCD sensor, provided at the eye position of the robot 30; a drive unit that turns the imaging unit vertically and horizontally; and a zoom mechanism that magnifies and reduces the subject. It acquires images (moving images) of the surroundings of the robot 30.
  • the auditory sensor 312 is configured, for example, by a microphone provided at the position of the ear of the robot 30, and acquires audio around the robot 30.
  • the tactile sensor 313 is, for example, a force sensor provided at the position of the hand of the robot 30 or the like, and detects an external force acting on the hand of the robot 30.
  • the odor sensor 314 is provided at the position of the nose of the robot 30, and detects an odor.
  • the taste sensor 315 is provided at the position of the mouth of the robot 30, and detects taste.
  • the robot 30 further includes an actuator 32, a speaker 33, an input device 34, a GPS sensor 35, a controller 36, and a wireless unit 37.
  • the actuator 32 is constituted by, for example, a plurality of servomotors provided at the joint of the robot 30, and the robot 30 operates by driving the actuator 32.
  • the speaker 33 is provided at the position of the mouth of the robot 30 and outputs sound.
  • the input device 34 includes various switches such as a power switch.
  • the GPS sensor 35 receives GPS signals from GPS satellites. The position of the robot 30 can be detected by the signal from the GPS sensor 35.
  • the controller 36 includes a microcomputer having a CPU, a ROM, a RAM, and the like, and controls the wireless unit 37 to communicate with the management server 40.
  • For example, the controller 36 causes the signals from the sensors 311 to 315, which output the five-sense signals, and from the GPS sensor 35 to be transmitted to the management server 40.
  • the controller 36 outputs a control signal to the actuator 32, the speaker 33, etc. based on the signal transmitted from the management server 40.
  • FIG. 5 is a block diagram showing an example of a schematic configuration of the management server 40.
  • the management server 40 includes an input device 41, a display device 42, a wireless unit 43, and a controller 44.
  • the input device 41 and the display device 42 can be omitted.
  • the wireless unit 43 may be configured to be connected not only by wireless communication but also by wired communication.
  • The controller 44 includes an arithmetic unit 44A such as a CPU, a storage unit 44B such as a ROM, a RAM, and a hard disk, and other peripheral circuits, and controls the wireless unit 43 to communicate with the user device 10, the 3D printer 20, and the robot 30.
  • Arithmetic unit 44A has a signal input unit 441, an order reception unit 442, a data output unit 443, a robot control unit 444, a printer control unit 445, and a user device control unit 446 as functional components.
  • the storage unit 44B has a food database 447 as a functional configuration.
  • the food database 447 stores food data including shape data, color data, and taste data of a plurality of foods (dishes etc.) formed by a single or a plurality of foods and flavors, together with the name of the food.
  • The food data includes data on foods the user has recognized via the robot 30 (the vision sensor 311), for example foods that can be ordered at a restaurant or a store selling food (ordered foods), and data on the shaped food produced when the 3D printer 20 manufactures a food (the shaped food) imitating the shape and taste of the ordered food.
  • The data of the shaped food is stored in the food database 447 in association with the data of the ordered food.
  • the food database 447 stores information of food of various genres, and this information is periodically updated or sequentially updated via, for example, the input device 41.
  • The signal input unit 441 acquires, via the wireless unit 43, the data transmitted from the user device 10 (the sensor 11, the microphone 13, etc. in FIG. 2) and the data transmitted from the robot 30 (the sensors 311 to 315, etc. in FIG. 4).
  • the signal input unit 441 also acquires an operation completion signal output from the 3D printer 20 when the 3D printer 20 completes manufacturing of the food.
  • The order receiving unit 442 receives an order for food to be manufactured by the 3D printer 20. For example, when food is photographed by the camera of the robot 30 (the vision sensor 311) and displayed on the display 12 of the user device 10, and the food is then ordered via the robot 30 according to a command from the user, the order receiving unit 442 accepts the order. Since the robot 30 cannot eat, the ordered food need not actually be prepared at the restaurant.
  • The robot 30 may actually place the order according to the user's command, or may merely pretend to order. The order receiving unit 442 may even accept the order, judging the user's intention to order, without any pretend ordering. That is, the user may command a food order via the sensor 11 or the microphone 13, without an ordering action by the robot 30, and the order receiving unit 442 may accept it.
  • When the robot 30 performs an ordering action, the order receiving unit 442 determines the name of the ordered food based on the signals from the sensor 11 and the vision sensor 311, and thereby specifies the ordered food.
  • When the name of the ordered food is unknown, food data matching the shape data and color data of the food acquired by the vision sensor 311 can be retrieved from the food database 447 to specify the ordered food.
  • The data output unit 443 refers to the food database 447 and extracts and outputs the shape data, color data, and taste data of the ordered food specified by the order receiving unit 442.
  • The robot control unit 444 transmits operation signals for the actuator 32 of the robot 30 to the robot 30 via the wireless unit 43, based on the signal (brain activity signal) from the sensor 11 of the user device 10 read by the signal input unit 441.
  • the controller 36 of the robot 30 outputs a control signal to the actuator 32 in response to the operation signal. Thereby, the robot 30 can be operated according to the user's intention.
  • the robot control unit 444 can also output sound based on a signal from the microphone 13 of the user device 10 from the speaker 33 of the robot 30.
  • the printer control unit 445 transmits the shape data and the taste data of the ordered food among the food data output from the data output unit 443 to the 3D printer 20 via the wireless unit 43.
  • The controller 26 (signal output unit 262) of the 3D printer 20 outputs control signals to the operation units of the food material storage unit 22, the flavor storage unit 23, the mixing unit 24, and the nozzle 25 based on the transmitted shape data and taste data. This produces a shaped food whose shape and taste match those of the ordered food.
  • The user device control unit 446 transmits operation signals to the user device 10 via the wireless unit 43, based on the signals (five-sense signals) from the sensors 311 to 315 of the robot 30 read by the signal input unit 441. For example, the image signal detected by the vision sensor 311 is transmitted.
  • the controller 16 of the user device 10 outputs a control signal to the display 12 in response to the image signal, and causes the display 12 to display a three-dimensional image obtained from the visual sensor 311.
  • The user device control unit 446 can also cause the speaker 14 of the user device 10 to output sound based on the signal from the auditory sensor 312.
  • The user device control unit 446 determines which image is to be shown on the display 12: the image acquired by the vision sensor 311 of the robot 30, or the image acquired by the camera 18 of the user device 10. The display 12 is switched according to this determination. For example, while the robot 30 is operated according to commands from the user, the user device control unit 446 transmits the signal from the vision sensor 311 to the user device 10, and the display 12 shows the image based on that signal. On the other hand, when the 3D printer 20 completes manufacture of the shaped food after the food is ordered, the user device control unit 446 transmits an image switching signal to the user device 10 so that the image based on the signal from the camera 18 is displayed. The display image can also be switched by operating the input device 15 of the user device 10.
  • the user device control unit 446 transmits the color data of the ordered food output from the data output unit 443 to the user device 10 when transmitting the image switching signal.
  • the controller 16 of the user device 10 controls the color of the display image of the shaped food based on the color data. That is, when displaying the image of the shaped food taken by the camera 18 on the display 12, the controller 16 controls the display color of the image so that the shaped food is displayed in the same color as the ordered food.
  • FIG. 6 is a flowchart showing an example of the processing executed by the computation unit 44A of the management server 40 according to a program stored in advance in the storage unit 44B, in particular the processing pertaining to display control on the display 12 of the user device 10.
  • The processing shown in this flowchart starts, for example, when a robot operation start command is input from the user device 10, and is repeated at a predetermined cycle. (A code sketch of this loop appears after this list.)
  • In step S1, it is determined whether the robot 30 has ordered food according to an instruction from the user, that is, whether the order receiving unit 442 has received a food order. If the result in step S1 is negative, the process proceeds to step S2, and the image signal acquired by the vision sensor 311 of the robot 30 and read by the signal input unit 441 is transmitted to the user device 10 via the wireless unit 43. As a result, the image (moving image) acquired via the robot 30 is displayed on the display 12 of the user device 10.
  • If the result in step S1 is affirmative, the ordered food is specified in step S3, and in step S4 the data output unit 443 refers to the food database 447 and outputs the food data of the specified ordered food, that is, its shape data, color data, and taste data.
  • In step S5, the printer control unit 445 transmits the food data (shape data and taste data) of the ordered food to the 3D printer 20 via the wireless unit 43.
  • In step S6, it is determined whether production of the shaped food has been completed, based on whether the operation completion signal has been transmitted from the 3D printer 20. If the result is affirmative, the process proceeds to step S7; if not, it proceeds to step S2.
  • In step S7, the user device control unit 446 transmits a signal (for example, an image switching signal) to the user device 10 via the wireless unit 43 so that the image acquired by the camera 18 is displayed on the display 12. The image display of the display 12 is thereby switched, and the image of the real space photographed by the camera 18 (the image of the shaped food) is displayed on the display 12.
  • In step S7, the user device control unit 446 also transmits to the user device 10 the color data of the ordered food output from the data output unit 443 in step S4.
  • the controller 16 of the user device 10 controls the display color of the shaped food based on the color data.
  • The shape and taste of the shaped food correspond to (for example, match) those of the ordered food, and the shaped food displayed on the display 12 looks the same as the ordered food, so the user feels as if tasting the ordered food itself.
  • The shape data and taste data of the ordered food 5 are transmitted to the 3D printer 20 (step S5).
  • the 3D printer 20 operates based on the shape data and the taste data, and a shaped food 6 (cake) having the same shape and the same taste as the ordered food 5 is manufactured at the point A.
  • Whereas the surface of the ordered food 5 is formed with a plurality of colors and patterns, the surface of the shaped food 6 is formed, for example, in a single color with no pattern.
  • When production of the shaped food 6 is completed, the image display of the display 12 is switched (step S7).
  • color data of the ordered food 5 is transmitted to the user device 10, and the shaped food 6 is displayed on the display 12 in the same color and pattern as the ordered food 5.
  • The user 1 perceives the shaped food 6, which actually has a different color, as identical to the ordered food 5, and by eating the shaped food 6 obtains the feeling of eating the ordered food 5.
  • As described above, the food providing system 100 includes: a data output unit 443 that outputs the shape data, color data, and taste data of a food (the ordered food); a 3D printer 20 that laminates food material based on the shape data and taste data output from the data output unit 443 to manufacture a shaped food having a shape and taste corresponding to the ordered food; a display 12 that is worn by the user and displays a three-dimensional image matched to the real space; and a user device control unit 446 and controller 16 that control the display 12, based on the color data output from the data output unit 443, so that the three-dimensional image of the shaped food manufactured by the 3D printer 20 is displayed in the same color as the ordered food (FIGS. 1, 2 and 5).
  • the shape and taste of the ordered food can be easily reproduced by the 3D printer 20.
  • Meanwhile, the shaped food is displayed on the display 12 in a color artificially matched to the ordered food.
  • The shaped food can therefore be manufactured, for example, in whatever color the food material supplied from the food material storage unit 22 happens to have, without concern for the color of the shaped food, and its manufacturing cost can be reduced. That is, the 3D printer 20 can provide the user with a highly satisfying shaped food at low cost.
  • the food provision system 100 further includes a movable robot 30 that operates in response to an instruction from the user via wireless communication and has a visual sensor 311 (FIG. 1).
  • the user device control unit 446 and the controller 16 further control the display 12 so that the three-dimensional image captured by the visual sensor 311 is displayed in place of the three-dimensional image matching the real space (step S2).
  • the user can obtain various information via the robot 30 at a remote location. For example, from the appearance of the food displayed on the display 12, it can be determined which food the user orders.
  • the food provision system 100 further includes a food database 447 in which food data including shape data, color data and taste data of a plurality of food is stored (FIG. 5).
  • The data output unit 443 extracts the shape data, color data, and taste data corresponding to the food from the food database 447 and outputs them (step S4).
  • This makes it possible to provide the user with a shaped food that reproduces the food ordered through the robot 30. The user can therefore feel as if eating together with the family, and high satisfaction is obtained.
  • The data output unit 443 can also output the shape data and color data of the food photographed by the vision sensor 311, extracting and outputting the corresponding taste data from the food database 447.
  • In this way, food data can be obtained from the shape data and color data of the food photographed by the vision sensor 311, and the 3D printer 20 can easily produce the desired food using them.
  • The food providing method outputs the shape data, color data, and taste data of the ordered food (step S4); laminates food material based on the output shape data and taste data to manufacture a shaped food having a shape and taste corresponding to the ordered food (step S5); and controls the display 12 of the user device 10 worn by the user, based on the output color data, so that a three-dimensional image of the manufactured shaped food matched to the real space is displayed in the same color as the ordered food (step S7).
  • Although the food providing system 100 is applied to a robot system in the above embodiment, the food providing system of the present invention can also be applied outside robot systems. That is, the data output unit 443 may output the shape data, color data, and taste data of a food whose order has been received without intervention of the robot 30.
  • In the above embodiment, the 3D printer 20 is used to manufacture the shaped food, but the configuration of the three-dimensional modeling apparatus is not limited to the above, as long as it laminates food material based on the shape data and taste data output from the data output unit and produces a shaped food having a shape and taste corresponding to the food.
  • the configuration of the display unit for displaying a three-dimensional image that matches the real space is not limited to this.
  • the configuration of the user device attached to the user is not limited to that described above.
  • In the above embodiment, the controller 16 of the user device 10 switches the display of the display 12 according to instructions from the user device control unit 446 and controls the display 12 based on the color data transmitted according to those instructions. However, the configuration of the display control unit is not limited to the above, as long as it controls the display unit, based on the color data output from the data output unit, so that the three-dimensional image of the shaped food is displayed in the same color as the ordered food.
  • In the above embodiment, the humanoid robot 30 capable of bipedal walking is used, but the configuration of the robot is not limited to the above, as long as it operates according to commands from the user via wireless communication and has a photographing unit such as the vision sensor 311.
  • In the above embodiment, food data including the shape data, color data, and taste data of a plurality of foods is stored in the food database 447 of the management server 40, but the configuration of the storage unit is not limited to this.
  • In the above embodiment, signals are transmitted and received among the management server 40, the user device 10, the 3D printer 20, and the robot 30. That is, the user device 10, the 3D printer 20, and the robot 30 communicate through the management server 40, but they may instead communicate directly, without the management server 40. In that case, the functions of the management server 40 may be provided in the controllers 16, 26, and 36 of the user device 10, the 3D printer 20, and the robot 30.
  • Although the robot 30 is rented from the store 3 in the above embodiment, the present invention can be configured similarly even when, for example, the user uses a robot owned at home. The robot 30 may also act alone instead of acting with the family.
  • the management server 40 and the terminal of the store 3 may be configured to be communicable, and application for rental reservation of the robot 30, payment of a rental fee, and the like may be performed via the management server 40.
  • Reference Signs List: 10 user device; 12 display; 16 controller; 20 3D printer; 30 robot; 100 food providing system; 311 vision sensor; 443 data output unit; 446 user device control unit; 447 food database
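
The display-control loop of FIG. 6 (steps S1 to S7) can be summarized in code. The following is a minimal Python sketch under stated assumptions: the `server` object and its helper methods (`order_received`, `specify_ordered_food`, `send_to_printer`, and so on) are hypothetical stand-ins for the signal input unit 441, order receiving unit 442, data output unit 443, printer control unit 445, and user device control unit 446; the patent does not define such an API.

    from dataclasses import dataclass

    @dataclass
    class FoodData:
        name: str
        shape: bytes   # shape data (stacked layers / 3D model)
        color: bytes   # color data (surface colors and patterns)
        taste: bytes   # taste data (flavor composition)

    def display_control_cycle(server):
        """One iteration of the loop repeated at a predetermined cycle (FIG. 6)."""
        if not server.order_received():                        # S1: order accepted?
            # S2: relay the robot's camera image to the user device.
            server.send_to_user_device(image=server.read_robot_camera())
            return
        food_name = server.specify_ordered_food()              # S3: specify the order
        data = server.food_database[food_name]                 # S4: output food data
        server.send_to_printer(shape=data.shape, taste=data.taste)  # S5
        if server.printer_done():                              # S6: completion signal?
            # S7: switch the display to the local camera 18 and send the ordered
            # food's color data so the shaped food is shown recolored.
            server.send_to_user_device(switch_to="camera18", color=data.color)
        else:
            server.send_to_user_device(image=server.read_robot_camera())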

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Materials Engineering (AREA)
  • Manufacturing & Machinery (AREA)
  • Polymers & Plastics (AREA)
  • Food Science & Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Optics & Photonics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Manipulator (AREA)
  • Formation And Processing Of Food Products (AREA)

Abstract

This food provision system comprises: a data output unit (443) that outputs shape data, color data, and flavor data pertaining to a requested food; a three-dimensional modeling device that, on the basis of the shape data and flavor data outputted from the data output unit, layers food ingredients and produces a model food having a shape and a flavor corresponding to those of the requested food; a user machine that is mounted on a user and has a display unit for displaying a three-dimensional image that is consistent with real space; and a display control unit that controls the display unit on the basis of the color data outputted from the data output unit so that a three-dimensional image of the model food produced by the three-dimensional modeling device is displayed in the same colors as the requested food.

Description

Food provision system, food provision method, and management device for food provision system
 The present invention relates to a food providing system that provides food manufactured by a 3D printer or the like, a food providing method, and a management apparatus for the food providing system.
 Apparatuses that manufacture food using a 3D printer are conventionally known (see, for example, Patent Document 1). In the device described in Patent Document 1, food material is laminated based on cross sections of CAD data while coloring and flavoring agents are added, producing a food of a predetermined shape with a desired color and flavor.
 Patent Document 1: International Publication No. 2015/106059
 Some actual foods are served in a colorful presentation using ingredients of many colors. If such a food were reproduced using the 3D printer of Patent Document 1 or the like, the manufacturing cost of the food could rise significantly.
 A food providing system according to one aspect of the present invention includes: a data output unit that outputs shape data, color data, and taste data of a food; a three-dimensional modeling apparatus that laminates food material based on the shape data and taste data output from the data output unit to manufacture a shaped food having a shape and taste corresponding to the food; a user device worn by the user and having a display unit that displays a three-dimensional image matched to the real space; and a display control unit that controls the display unit, based on the color data output from the data output unit, so that a three-dimensional image of the shaped food manufactured by the three-dimensional modeling apparatus is displayed in the same color as the food.
 A food providing method according to another aspect of the present invention includes: outputting shape data, color data, and taste data of a food; laminating food material based on the output shape data and taste data to manufacture a shaped food having a shape and taste corresponding to the food; and controlling a display unit of a user device worn by the user, based on the output color data, so that a three-dimensional image of the manufactured shaped food matched to the real space is displayed in the same color as the food.
 A management apparatus for a food providing system according to still another aspect of the present invention includes: a signal input unit that receives data transmitted from a user device and a robot; a data output unit that specifies and outputs shape data, color data, and taste data of a food based on the data received through the signal input unit; a printer control unit that outputs the shape data and taste data output from the data output unit to an external three-dimensional modeling device; and a user device control unit that outputs the color data output from the data output unit to an external user device so that a three-dimensional image of the shaped food manufactured by the three-dimensional modeling device is displayed in the same color as the food.
 According to the present invention, there is no need to be concerned about the color of the shaped food itself manufactured by the three-dimensional modeling apparatus, so the shaped food can be manufactured inexpensively.
BRIEF DESCRIPTION OF THE DRAWINGS
 FIG. 1 schematically shows the overall configuration of a food providing system according to an embodiment of the present invention.
 FIG. 2 shows the configuration of the user device of FIG. 1.
 FIG. 3 shows the schematic configuration of the 3D printer of FIG. 1.
 FIG. 4 shows the configuration of the robot of FIG. 1.
 FIG. 5 is a block diagram showing the schematic configuration of the management server of FIG. 1.
 FIG. 6 is a flowchart showing an example of the processing executed by the computation unit of FIG. 5.
 FIG. 7 shows an example of the operation of the food providing system according to the embodiment of the present invention.
 An embodiment of the present invention is described below with reference to FIGS. 1 to 7. The food providing system according to the embodiment is applied to a robot system and provides food to a user using information acquired by a robot that operates according to commands from the user. More specifically, the user is made aware, through image signals from a camera on a robot at a remote place, of a food that can be ordered at that remote place (the ordered food); in response to an instruction from the user, a food corresponding to the ordered food is manufactured by a 3D printer, and the manufactured food (the shaped food) is provided to the user. The ordered food and the shaped food have the same shape and taste but differ in color.
 The robot can act alone according to commands from the user, but the following describes an example in which the robot acts not alone but together with a third party, such as the user's family. For example, suppose that it is difficult for the user to go out (for example, to travel) with the family, and a robot travels with the family in the user's place. In that case, the robot is operated according to commands from the user, and the information acquired by the robot is provided to the user. This lets the user feel as if traveling with the family while, for example, staying at home. In particular, in the present embodiment, information about food at the travel destination is provided to the user via the robot, and the food is manufactured by the 3D printer using that information, as described below. This lets the user feel as if eating together with the family at the travel destination.
 FIG. 1 schematically shows the overall configuration of a food providing system 100 according to an embodiment of the present invention. As shown in FIG. 1, the food providing system 100 includes a user device 10 worn by a user 1 located at point A, a 3D printer 20 disposed at point A, a robot 30 located at point B and operated by the user 1 at point A, and a management server 40.
 The user device 10, the 3D printer 20, the robot 30, and the management server 40 are communicably connected via a network 2 including a wireless communication network such as an Internet line. Point A is, for example, the user's home, and point B is a point distant from point A, for example in a different region. Points A and B may even be in different countries.
 The robot 30 at point B is rented from a store 3 at or near point B. That is, the robot 30 is rented by the family of the user 1, who visit the store 3, and acts with the family of the user 1 at point B. The robot 30 is returned to the store 3 by the family when the trip at point B ends. Each robot 30 of the store 3 is given a unique identification ID in advance.
 FIG. 2 shows the configuration of the user device 10. As shown in FIG. 2, the user device 10 is, for example, a wearable computer having a generally helmet-like shape, and is worn on the user's head. The user device 10 includes a plurality of sensors 11 that detect brain activity, such as the user's brain waves, magnetoencephalographic signals, and cerebral blood flow. That is, the user device 10 is provided with a so-called brain machine interface (BMI), which detects the user's thoughts or intentions from brain activity signals and realizes machine operation without use of the body.
 The user device 10 further includes a display 12, a microphone 13, a speaker 14, an input device 15, a controller 16, a wireless unit 17, and a camera 18. The display 12 is, for example, a non-transmissive head mounted display; it is disposed so as to cover the area around the user's eyes, and camera images from the robot 30 are displayed on it. The user's nose and mouth are left exposed, not covered by the user device 10.
 The outer surface of the display 12 is provided with a pair of cameras 18, each having an imaging element such as a CCD, at positions corresponding to the user's eyes. The display 12 can also display a three-dimensional image of the real space photographed by the cameras 18, and a virtual image can be superimposed on this three-dimensional image.
 The microphone 13 is movably positioned toward the user's mouth and picks up the audio signal of the user's speech. The speaker 14 is disposed at the user's ear and outputs sound. The input device 15 consists of switches, a touch panel, or the like operated by the user, and various information, such as the user's personal information, can be entered through it.
 The controller 16 includes a microcomputer having a CPU, a ROM, a RAM, and the like, and controls the wireless unit 17 to communicate with the management server 40. For example, the controller 16 causes the signals from the sensors 11 and the microphone 13 to be transmitted to the management server 40, and outputs control signals to the display 12 and the speaker 14 based on signals transmitted from the management server 40.
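 As an illustration only, the role of the controller 16 described above could look like the following Python sketch. The transport and the `sensors`, `display`, and `speaker` objects are assumptions; the patent specifies only that brain-activity and microphone signals go up to the management server and that display and speaker control signals come back.

    def user_device_loop(sensors, microphone, display, speaker, server_link):
        """Hypothetical main loop of controller 16 (all names are illustrative)."""
        while True:
            # Upload the BMI sensor readings and the user's voice to the server.
            server_link.send({
                "brain_activity": sensors.read(),   # EEG / MEG / cerebral blood flow
                "voice": microphone.read(),
            })
            # Apply whatever control signals the management server sends back.
            msg = server_link.receive()
            if "image" in msg:
                display.show(msg["image"])          # e.g. the robot's camera view
            if "audio" in msg:
                speaker.play(msg["audio"])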
 FIG. 3 is a block diagram showing the schematic configuration of the 3D printer 20. The 3D printer 20 creates an object by stacking material based on the shape data of a three-dimensional model; in the present embodiment, it stacks food material to manufacture a food (the shaped food). As shown in FIG. 3, the 3D printer 20 includes an input device 21, a food material storage unit 22, a flavor storage unit 23, a mixing unit 24, a nozzle 25, a controller 26, and a wireless unit 27.
 The food material storage unit 22 stores several food materials of different hardness, such as agar, konjac, and jelly, cut into small pieces, in separate containers. It can also store powdered food material. The flavor storage unit 23 stores basic flavors, such as sweet, bitter, sour, salty, and umami, in powder or liquid form in separate containers. The mixing unit 24 mixes a predetermined amount of a predetermined type of food material supplied from the food material storage unit 22 with a predetermined amount of a predetermined type of flavor supplied from the flavor storage unit 23. The nozzle 25 moves according to coordinate data output from the controller 26 and laminates the flavored food material supplied from the mixing unit 24 while ejecting it from its tip, thereby manufacturing a three-dimensionally shaped food.
 The controller 26 includes a microcomputer having an arithmetic unit 26A such as a CPU, a storage unit 26B such as a ROM and a RAM, and various peripheral circuits. The arithmetic unit 26A has, as functional components, a data input unit 261 and a signal output unit 262. The data input unit 261 takes in signals input from the input device 21 and signals (food data) transmitted from the management server 40 via the wireless unit 27. The signal output unit 262 outputs control signals to the operation units (actuators and the like) of the food material storage unit 22, the flavor storage unit 23, the mixing unit 24, and the nozzle 25 according to the signals taken in by the data input unit 261, thereby controlling their operations.
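 A sketch of how the signal output unit 262 might drive the storage, mixing, and nozzle units from received food data follows. The layer-by-layer representation of the shape data is an assumption; the patent states only that the nozzle follows coordinate data while ejecting the flavored material, and all method names here are hypothetical.

    def print_shaped_food(printer, layers, taste):
        """Hypothetical print job: one mixed, flavored batch per layer."""
        for layer in layers:                          # shape data as stacked layers
            material = printer.food_storage.dispense(layer.material, layer.amount)
            flavors = [printer.flavor_storage.dispense(kind, amount)
                       for kind, amount in taste.flavors_for(layer)]
            blend = printer.mixing_unit.mix(material, flavors)
            for x, y in layer.path:                   # coordinate data from controller 26
                printer.nozzle.move_to(x, y, layer.z)
                printer.nozzle.eject(blend)
        printer.send_completion_signal()              # operation completion signal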
 FIG. 4 shows the configuration of the robot 30. As shown in FIG. 4, the robot 30 is a humanoid robot having a head, a torso, two arms, and two legs, and can move by itself by bipedal walking. Its height is close to that of an adult, for example about 140 to 160 cm.
 The robot 30 has a plurality of sensors with detection functions corresponding to the five human senses for perceiving the external world: a vision sensor 311, an auditory sensor 312, a tactile sensor 313, an odor sensor 314, and a taste sensor 315. The sensors 311 to 315 output signals corresponding to the human five senses (five-sense signals) as detection signals.
 More specifically, the vision sensor 311 is a camera comprising an imaging unit, with a lens and an image sensor such as a CMOS or CCD sensor, provided at the eye position of the robot 30; a drive unit that turns the imaging unit vertically and horizontally; and a zoom mechanism that magnifies and reduces the subject. It acquires images (moving images) of the surroundings of the robot 30. The auditory sensor 312 is, for example, a microphone provided at the ear position of the robot 30, and acquires the sound around the robot 30. The tactile sensor 313 is, for example, a force sensor provided at the hand position of the robot 30 or the like, and detects external forces acting on the robot's hands. The odor sensor 314 is provided at the nose position of the robot 30 and detects odors. The taste sensor 315 is provided at the mouth position of the robot 30 and detects taste.
 The robot 30 further includes an actuator 32, a speaker 33, an input device 34, a GPS sensor 35, a controller 36, and a wireless unit 37. The actuator 32 consists of, for example, a plurality of servomotors provided at the joints of the robot 30, and the robot 30 moves as the actuator 32 is driven. The speaker 33 is provided at the mouth position of the robot 30 and outputs sound. The input device 34 includes various switches such as a power switch. The GPS sensor 35 receives GPS signals from GPS satellites; the position of the robot 30 can be detected from its signal.
 The controller 36 includes a microcomputer having a CPU, a ROM, a RAM, and the like, and controls the wireless unit 37 to communicate with the management server 40. For example, the controller 36 causes the signals from the sensors 311 to 315, which output the five-sense signals, and from the GPS sensor 35 to be transmitted to the management server 40, and outputs control signals to the actuator 32, the speaker 33, and so on based on signals transmitted from the management server 40.
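 For illustration, the loop of the controller 36 mirrors that of the user device: the five-sense signals and the GPS position go up, and actuator and speaker commands come back. All method names in this sketch are assumptions, not part of the patent.

    def robot_loop(robot, server_link):
        """Hypothetical main loop of controller 36 (all names are illustrative)."""
        while True:
            server_link.send({                     # the five-sense signals plus GPS
                "vision":  robot.vision_sensor.frame(),
                "hearing": robot.auditory_sensor.audio(),
                "touch":   robot.tactile_sensor.force(),
                "smell":   robot.odor_sensor.value(),
                "taste":   robot.taste_sensor.value(),
                "gps":     robot.gps_sensor.position(),
            })
            cmd = server_link.receive()
            for joint, angle in cmd.get("servo_targets", {}).items():
                robot.actuators[joint].set_angle(angle)   # servomotors at the joints
            if "audio" in cmd:
                robot.speaker.play(cmd["audio"])          # user's voice, if any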
 FIG. 5 is a block diagram showing an example of the schematic configuration of the management server 40. As shown in FIG. 5, the management server 40 includes an input device 41, a display device 42, a wireless unit 43, and a controller 44. The input device 41 and the display device 42 can be omitted. The wireless unit 43 may also be connected by wired communication rather than wirelessly.
 The controller 44 includes an arithmetic unit 44A such as a CPU, a storage unit 44B such as a ROM, a RAM, and a hard disk, and other peripheral circuits, and controls the wireless unit 43 to communicate with the user device 10, the 3D printer 20, and the robot 30. The arithmetic unit 44A has, as functional components, a signal input unit 441, an order receiving unit 442, a data output unit 443, a robot control unit 444, a printer control unit 445, and a user device control unit 446. The storage unit 44B has a food database 447 as a functional component.
 The food database 447 stores food data, including the shape data, color data, and taste data of a plurality of foods (dishes and the like) formed from one or more food materials and flavors, together with the names of the foods. The food data includes data on foods the user has recognized via the robot 30 (the vision sensor 311), for example foods that can be ordered at a restaurant or a store selling food (ordered foods), and data on the shaped food produced when the 3D printer 20 manufactures a food (the shaped food) imitating the shape and taste of the ordered food. The data of the shaped food is stored in the food database 447 in association with the data of the ordered food. The food database 447 stores information on foods of various genres, and this information is updated periodically or as needed, for example via the input device 41.
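 The ordered-food/shaped-food association could be represented as follows. This is a minimal sketch assuming records keyed by name, which the patent does not prescribe; the record contents are placeholders.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class FoodRecord:
        name: str
        shape_data: bytes
        color_data: bytes
        taste_data: bytes
        shaped_food: Optional["FoodRecord"] = None  # printable counterpart, if any

    # The shaped-food record (single color, same shape and taste) is stored in
    # association with the ordered-food record it imitates.
    cake_shaped = FoodRecord("cake (shaped)", b"<shape>", b"<single color>", b"<taste>")
    cake = FoodRecord("cake", b"<shape>", b"<colors and patterns>", b"<taste>", cake_shaped)

    food_database = {record.name: record for record in (cake, cake_shaped)}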
 The signal input unit 441 acquires, via the wireless unit 43, the data transmitted from the user device 10 (the sensor 11, the microphone 13, and the like in FIG. 2) and the data transmitted from the robot 30 (the sensors 311 to 315 and the like in FIG. 4). The signal input unit 441 also acquires the operation completion signal output from the 3D printer 20 when the 3D printer 20 has finished producing a food.
 The order reception unit 442 receives orders for foods to be produced by the 3D printer 20. For example, when a food is photographed by the camera of the robot 30 (visual sensor 311) and displayed on the display 12 of the user device 10, and the user issues a command to order that food via the robot 30, the order reception unit 442 accepts the order. Since the robot 30 cannot eat, the ordered food does not actually have to be prepared at the restaurant.
 In response to a command from the user, the robot 30 may actually place the order, or it may merely pretend to do so. The order reception unit 442 may even accept an order without any pretense of ordering, by determining the user's intention to order a food. That is, the user may command a food order via the sensor 11 or the microphone 13, and the order reception unit 442 may accept it without any ordering action by the robot 30.
 When the robot 30 performs a food ordering action, the order reception unit 442 determines the name of the ordered food based on the signals from the sensor 11 and the visual sensor 311, and thereby specifies the ordered food. When the name of the ordered food is unknown, the ordered food can also be specified by searching the food database 447 for food data that matches the shape data and color data of the food acquired by the visual sensor 311.
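 A minimal sketch of this fallback lookup, assuming the shape and color acquired by the visual sensor 311 can be reduced to fixed-length numeric feature vectors; both the feature extraction and the distance metric are hypothetical choices for illustration.

```python
import math


def feature_distance(a, b):
    # Euclidean distance between equal-length numeric feature vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def identify_food(captured_shape, captured_color, records):
    """Return the name of the stored food whose shape/color features are
    closest to the captured ones. `records` maps each food name to a
    (shape_vector, color_vector) pair taken from food database 447."""
    best_name, best_distance = None, math.inf
    for name, (shape_vec, color_vec) in records.items():
        d = (feature_distance(captured_shape, shape_vec)
             + feature_distance(captured_color, color_vec))
        if d < best_distance:
            best_name, best_distance = name, d
    return best_name
```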
 The data output unit 443 refers to the food database 447 and extracts and outputs the shape data, color data, and taste data of the ordered food specified by the order reception unit 442.
 Based on the signal (brain activity signal) from the sensor 11 of the user device 10 read by the signal input unit 441, the robot control unit 444 transmits an operation signal for the actuator 32 to the robot 30 via the wireless unit 43. The controller 36 of the robot 30 outputs a control signal to the actuator 32 in response to this operation signal, so that the robot 30 can be operated according to the user's intention. The robot control unit 444 can also cause the speaker 33 of the robot 30 to output sound based on the signal from the microphone 13 of the user device 10.
 The printer control unit 445 transmits, via the wireless unit 43, the shape data and taste data of the ordered food out of the food data output from the data output unit 443 to the 3D printer 20. Based on the transmitted shape data and taste data, the controller 26 (signal output unit 262) of the 3D printer 20 outputs control signals to the operating units, namely the food material storage unit 22, the flavor storage unit 23, the mixing unit 24, and the nozzle 25. A shaped food whose shape and taste match those of the ordered food is thereby produced.
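 The division of labor on the printer side could look like the following sketch, in which the server sends only shape and taste data and the printer's controller drives the four operating units in turn. Every method name here is invented for illustration; the actual control signals of controller 26 are not disclosed.

```python
def print_shaped_food(printer, shape_data, taste_data):
    """Reproduce the ordered food's geometry and flavor layer by layer.
    Color is deliberately not reproduced: it is supplied later by the
    display-side color control."""
    for layer, flavor in zip(shape_data, taste_data["layers"]):
        printer.food_store.dispense(layer)     # food material storage unit 22
        printer.flavor_store.dispense(flavor)  # flavor storage unit 23
        printer.mixer.blend()                  # mixing unit 24
        printer.nozzle.deposit(layer)          # nozzle 25
    printer.notify_completion()  # operation completion signal to the server
```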
 Based on the signals (five-sense signals) from the sensors 311 to 315 of the robot 30 read by the signal input unit 441, the user device control unit 446 transmits an operation signal to the user device 10 via the wireless unit 43. For example, it transmits the image signal detected by the visual sensor 311. The controller 16 of the user device 10 outputs a control signal to the display 12 in response to this image signal, causing the display 12 to show the three-dimensional image obtained from the visual sensor 311. The user device control unit 446 can also cause the speaker 14 of the user device 10 to output sound based on the signal from the auditory sensor 312.
 The user device control unit 446 determines which of the image acquired by the visual sensor 311 of the robot 30 and the image acquired by the camera 18 of the user device 10 is to be shown on the display 12, and the display 12 is switched according to this determination. For example, while the robot 30 is being operated by commands from the user, the user device control unit 446 transmits the signal from the visual sensor 311 to the user device 10 and causes the display 12 to show the image based on that signal. On the other hand, when the 3D printer 20 finishes producing the shaped food after the food has been ordered, the user device control unit 446 transmits an image switching signal to the user device 10 so that the image based on the signal from the camera 18 is displayed. The displayed image can also be switched by operating the input device 15 of the user device 10.
 When transmitting the image switching signal, the user device control unit 446 also transmits the color data of the ordered food output from the data output unit 443 to the user device 10. The controller 16 of the user device 10 controls the color of the displayed image of the shaped food based on this color data. That is, when displaying the image of the shaped food captured by the camera 18 on the display 12, the controller 16 controls the display color of the image so that the shaped food appears in the same color as the ordered food.
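 A minimal sketch of this color control in the controller 16, assuming the camera image is available as a NumPy array and that a mask marking the shaped food's pixels has already been computed (how such a mask would be obtained is not specified in the patent):

```python
import numpy as np


def recolor_shaped_food(camera_frame, food_mask, order_color_rgb):
    """Repaint only the shaped food's pixels so that it appears on the
    display 12 in the same color as the ordered food."""
    frame = camera_frame.copy()
    r, g, b = order_color_rgb
    frame[food_mask] = (b, g, r)  # OpenCV-style images store BGR
    return frame


# Toy usage: a 2x2 BGR frame in which only the top-left pixel belongs
# to the shaped food and is repainted with the ordered food's red.
frame = np.zeros((2, 2, 3), dtype=np.uint8)
mask = np.array([[True, False], [False, False]])
shown = recolor_shaped_food(frame, mask, (255, 0, 0))
```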
 FIG. 6 is a flowchart showing an example of processing executed by the arithmetic unit 44A of the management server 40 according to a program stored in advance in the storage unit 44B, in particular processing related to display control for the display 12 of the user device 10. The processing shown in this flowchart starts, for example, when a robot operation start command is input from the user device 10, and is repeated at a predetermined cycle.
 First, in step S1, it is determined whether the robot 30 has ordered a food in response to a command from the user, that is, whether the order reception unit 442 has accepted a food order. If the determination in step S1 is negative, the process proceeds to step S2, in which the image signal acquired by the visual sensor 311 of the robot 30 and read by the signal input unit 441 is transmitted to the user device 10 via the wireless unit 43. The image (moving image) acquired via the robot 30 is thereby displayed on the display 12 of the user device 10.
 If, on the other hand, the determination in step S1 is affirmative, the process proceeds to step S3, in which the order reception unit 442 specifies the name of the ordered food. Next, in step S4, the data output unit 443 refers to the food database 447 and outputs the food data of the ordered food specified in step S3, that is, its shape data, color data, and taste data. Next, in step S5, the printer control unit 445 transmits the food data of the ordered food (the shape data and taste data) to the 3D printer 20 via the wireless unit 43.
 Next, in step S6, it is determined whether production of the shaped food is complete, based on whether the operation completion signal has been transmitted from the 3D printer 20. If the determination in step S6 is affirmative, the process proceeds to step S7; if negative, it proceeds to step S2. In step S7, the user device control unit 446 transmits a signal (for example, an image switching signal) to the user device 10 via the wireless unit 43 so that the image acquired by the camera 18 is shown on the display 12. The image shown on the display 12 is thereby switched, and the real-space image captured by the camera 18 (the image of the shaped food) is displayed.
 Furthermore, in step S7, the user device control unit 446 transmits the color data of the ordered food output from the data output unit 443 in step S4 to the user device 10. The controller 16 of the user device 10 controls the display color of the shaped food based on this color data, so that the user perceives the shaped food as having the same color as the ordered food. A food identical to the ordered food can therefore be reproduced simply and provided to the user. Since the shape and taste of the shaped food correspond to (for example, match) those of the ordered food, and the shaped food shown on the display 12 looks identical to the ordered food, the user feels as if he or she had tasted the ordered food itself.
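 Steps S1 to S7 can be condensed into a single server-side polling cycle, as in the sketch below. The server object and its unit methods are hypothetical names for the functional components of the arithmetic unit 44A described above.

```python
def display_control_cycle(server):
    """One pass of the FIG. 6 flowchart, repeated at a fixed cycle."""
    if not server.order_reception.order_received():        # S1
        server.send_robot_camera_image()                   # S2: show sensor 311's view
        return
    name = server.order_reception.identify_food()          # S3
    shape, color, taste = server.data_output.lookup(name)  # S4
    server.printer_control.send(shape, taste)              # S5
    if not server.printer_done():                          # S6: completion signal?
        server.send_robot_camera_image()                   # back to S2
        return
    server.user_device_control.switch_to_camera()          # S7: real-space view
    server.user_device_control.send_color(color)           # S7: color overlay
```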
 The operation of the food provision system 100 according to the present embodiment will now be described more concretely. As shown in FIG. 7, when the robot 30 orders the ordered food 5 (a cake) at point B, the shape data and taste data of the ordered food 5 are transmitted to the 3D printer 20 (step S5). The 3D printer 20 operates based on these shape data and taste data, and a shaped food 6 (a cake) having the same shape and taste as the ordered food 5 is produced at point A. While the surface of the ordered food 5 is formed with a plurality of colors and patterns, the surface of the shaped food 6 is formed in, for example, a single color with no pattern.
 When production of the shaped food 6 is complete, the image shown on the display 12 is switched (step S7). At this time, the color data of the ordered food 5 is transmitted to the user device 10, and the shaped food 6 is displayed on the display 12 in the same color and pattern as the ordered food 5. The user 1 therefore recognizes the shaped food 6, which actually differs in color, as identical to the ordered food 5, and eating the shaped food 6 gives the user the sensation of eating the ordered food 5.
 According to the present embodiment, the following advantageous effects can be obtained.
(1) The food provision system 100 according to the present embodiment includes: the data output unit 443, which outputs the shape data, color data, and taste data of a food (ordered food); the 3D printer 20, which laminates food material based on the shape data and taste data output from the data output unit 443 to produce a shaped food having a shape and taste corresponding to the ordered food; the user device 10, which is worn by the user and has the display 12 for displaying a three-dimensional image matching the real space; and the user device control unit 446 and the controller 16, which control the display 12 based on the color data output from the data output unit 443 so that the three-dimensional image of the shaped food produced by the 3D printer 20 is displayed in the same color as the ordered food (FIGS. 1, 2, and 5).
 With this configuration, the shape and taste of the ordered food can easily be reproduced by the 3D printer 20. The color of the ordered food is not reproduced physically; instead, the shaped food is displayed in pseudo fashion via the display 12 in a color matching the ordered food. The shaped food can therefore be produced without regard to its color, for example in the color of the food material supplied from the food material storage unit 22 as it is, which reduces the production cost of the shaped food. In other words, a shaped food that is highly satisfying to the user can be provided with an inexpensive configuration using the 3D printer 20.
(2) The food provision system 100 further includes the movable robot 30, which operates in response to commands from the user via wireless communication and has the visual sensor 311 (FIG. 1). The user device control unit 446 and the controller 16 further control the display 12 so that the three-dimensional image captured by the visual sensor 311 is displayed in place of the three-dimensional image matching the real space (step S2). This allows the user to obtain various information via the robot 30 at a remote location; for example, the user can decide which food to order from the appearance of the foods shown on the display 12.
(3) The food provision system 100 further includes the food database 447, in which food data including the shape data, color data, and taste data of a plurality of foods is stored (FIG. 5). When a food is ordered via the robot 30, the data output unit 443 extracts the shape data, color data, and taste data corresponding to that food from the food database 447 and outputs them (step S4). A shaped food reproducing the food ordered via the robot 30 can thereby be provided to the user, who can thus enjoy the feeling of dining together with his or her family and obtain a high degree of satisfaction.
(4) The data output unit 443 can also output the shape data and color data of a food photographed by the visual sensor 311 while extracting the taste data corresponding to that food from the food database 447 and outputting it. Thus, even when, for example, the name of the food to be ordered is unknown, the food data (taste data) can be extracted from the shape data and color data of the food photographed by the visual sensor 311, and the desired food can easily be produced using the 3D printer 20.
(5) The food provision method according to the embodiment of the present invention includes: outputting the shape data, color data, and taste data of an ordered food (step S4); laminating food material based on the output shape data and taste data to produce a shaped food having a shape and taste corresponding to the ordered food (step S5); and controlling the display 12 of the user device 10 worn by the user, based on the output color data, so that the three-dimensional image of the produced shaped food in the real space is displayed in the same color as the ordered food (step S7). A food equivalent to the ordered food can thereby be provided to the user while keeping the food production cost low.
 In the above embodiment, the food provision system 100 is applied to a robot system, but the food provision system of the present invention can likewise be applied to systems other than robot systems. That is, the data output unit 443 may output the shape data, color data, and taste data of a food whose order was accepted without the intervention of the robot 30. In the above embodiment, the shaped food is produced using the 3D printer 20, but the configuration of the three-dimensional modeling apparatus is not limited to that described above, as long as it laminates food material based on the shape data and taste data output from the data output unit to produce a shaped food having a shape and taste corresponding to the food.
 In the above embodiment, a head-mounted display is used as the display 12, but the configuration of the display unit that displays a three-dimensional image matching the real space is not limited to this, and the configuration of the user device worn by the user is likewise not limited to that described above. In the above embodiment, the display of the display 12 of the user device 10 is switched by a command from the user device control unit 446, and the controller 16 of the user device 10 controls the color of the displayed image of the shaped food based on the color data transmitted by a command from the user device control unit 446; however, the configuration of the display control unit is not limited to that described above, as long as the display unit is controlled based on the color data output from the data output unit so that the three-dimensional image of the shaped food is displayed in the same color as the ordered food.
 In the above embodiment, the humanoid robot 30 capable of bipedal walking is used, but the configuration of the robot is not limited to that described above, as long as it operates in response to commands from the user via wireless communication and has an imaging unit such as the visual sensor 311. In the above embodiment, food data including the shape data, color data, and taste data of a plurality of foods is stored in the food database 447 of the management server 40, but the configuration of the storage unit is not limited to this.
 In the above embodiment, signals are transmitted and received between the management server 40 and the user device 10, the 3D printer 20, and the robot 30; that is, the user device 10, the 3D printer 20, and the robot 30 communicate with one another via the management server 40. However, they may also communicate directly, without the management server 40. In this case, the functions of the management server 40 may be provided in the controllers 16, 26, and 36 of the user device 10, the 3D printer 20, and the robot 30, and the like.
 In the above embodiment, the robot 30 is rented at the store 3, but the present invention can be configured in the same manner even when, for example, the user uses a robot owned at home. The robot 30 may also act alone rather than accompanying the family. The management server 40 and a terminal of the store 3 may be configured to communicate with each other so that applications for rental reservations of the robot 30, payment of rental fees, and the like are handled via the management server 40.
 The above description is merely an example, and the present invention is not limited by the above-described embodiment and modifications, as long as the features of the present invention are not impaired. One or more of the above embodiment and modifications may be combined arbitrarily, and the modifications may also be combined with one another.
Reference Signs List: 10 user device; 12 display; 16 controller; 20 3D printer; 30 robot; 100 food provision system; 311 visual sensor; 443 data output unit; 446 user device control unit; 447 food database

Claims (6)

  1.  A food provision system comprising:
     a data output unit that outputs shape data, color data, and taste data of a food;
     a three-dimensional modeling apparatus that laminates food material, based on the shape data and the taste data output from the data output unit, to produce a shaped food having a shape and a taste corresponding to the food;
     a user device that is worn by a user and has a display unit that displays a three-dimensional image matching a real space; and
     a display control unit that controls the display unit, based on the color data output from the data output unit, such that a three-dimensional image of the shaped food produced by the three-dimensional modeling apparatus is displayed in the same color as the food.
  2.  The food provision system according to claim 1, further comprising
     a movable robot that operates in response to a command from the user via wireless communication and has an imaging unit,
     wherein the display control unit further controls the display unit such that a three-dimensional image captured by the imaging unit is displayed in place of the three-dimensional image matching the real space.
  3.  The food provision system according to claim 2, further comprising
     a storage unit in which food data including shape data, color data, and taste data of a plurality of foods is stored,
     wherein, when a food is ordered via the robot, the data output unit extracts the shape data, the color data, and the taste data corresponding to the ordered food from the storage unit and outputs them.
  4.  The food provision system according to claim 2, further comprising
     a storage unit in which food data including shape data, color data, and taste data of a plurality of foods is stored,
     wherein the data output unit outputs shape data and color data of a food photographed by the imaging unit, and extracts taste data corresponding to the food from the storage unit and outputs it.
  5.  A food provision method comprising:
     outputting shape data, color data, and taste data of a food;
     laminating food material, based on the output shape data and taste data, to produce a shaped food having a shape and a taste corresponding to the food; and
     controlling a display unit of a user device worn by a user, based on the output color data, such that a three-dimensional image matching the produced shaped food in a real space is displayed in the same color as the food.
  6.  A management device for a food provision system, comprising:
     a signal input unit that receives data transmitted from a user device and a robot;
     a data output unit that specifies and outputs shape data, color data, and taste data of a food based on the data received via the signal input unit;
     a printer control unit that outputs the shape data and the taste data output from the data output unit to an external three-dimensional modeling apparatus; and
     a user device control unit that outputs the color data output from the data output unit to an external user device such that a three-dimensional image of a shaped food produced by the three-dimensional modeling apparatus is displayed in the same color as the food.
PCT/JP2018/034162 2017-09-29 2018-09-14 Food provision system, food provision method, and device for managing food provision system WO2019065304A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019544586A JPWO2019065304A1 (en) 2017-09-29 2018-09-14 Food delivery system, food delivery method and management device for food delivery system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-189797 2017-09-29
JP2017189797 2017-09-29

Publications (1)

Publication Number Publication Date
WO2019065304A1 true WO2019065304A1 (en) 2019-04-04

Family

ID=65902430

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/034162 WO2019065304A1 (en) 2017-09-29 2018-09-14 Food provision system, food provision method, and device for managing food provision system

Country Status (2)

Country Link
JP (1) JPWO2019065304A1 (en)
WO (1) WO2019065304A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008532107A (en) * 2005-01-18 2008-08-14 インタッチ・テクノロジーズ・インコーポレーテッド Mobile video conferencing platform with automatic shut-off
JP2010517875A (en) * 2007-01-19 2010-05-27 ネステク ソシエテ アノニム Autonomous food and beverage dispensing machine
JP2014211748A (en) * 2013-04-18 2014-11-13 ソニー株式会社 Information processing apparatus and storage medium
JP2016131507A (en) * 2015-01-16 2016-07-25 株式会社リコー Method for producing three-dimensional molded food product, and three-dimensional molded food product
JP2017076295A (en) * 2015-10-16 2017-04-20 富士フイルム株式会社 Augmented reality provision system, information processing device and program

Also Published As

Publication number Publication date
JPWO2019065304A1 (en) 2020-12-03

Similar Documents

Publication Publication Date Title
JP7109408B2 (en) Wide range simultaneous remote digital presentation world
JP6906580B2 (en) Viewport-based augmented reality tactile effects systems, methods and non-transient computer-readable media
JP6663972B2 (en) How to customize objects in additional machining
EP2793151B1 (en) Peripheral apparatus and method of construction
CN104820498B (en) The man-machine interaction method and system that the virtual ornaments of hand are tried on
CN104796806A (en) System and method for producing a personalized earphone
WO2003063086A1 (en) Image processing system, image processing apparatus, and display apparatus
JP7239916B2 (en) Remote control system, information processing method, and program
US11279037B2 (en) Force-sense visualization apparatus, robot, and force-sense visualization program
JP7160669B2 (en) Program, Information Processing Apparatus, and Method
WO2019065304A1 (en) Food provision system, food provision method, and device for managing food provision system
JP2007130691A (en) Communication robot
US20160027099A1 (en) Presentation device for carrying out a product presentation
JP2004042151A (en) Communication robot
JP5079108B2 (en) 3D surface figure production method
JP7023971B2 (en) Service provision system, service provision method, and management device for service provision system
JP6892928B2 (en) Information provision system, information provision method and management device for information provision system
NL1023561C2 (en) Method for the custom manufacture of a handle, a measuring form for use in the method, as well as a handle manufactured with this method and component parts thereof.
WO2022044843A1 (en) Information processing device, information processing method, and program
WO2024089870A1 (en) Portable information terminal, virtual space system, and control method for portable information terminal
WO2021117533A1 (en) Healthcare system and processing device
JP2019063905A (en) Robot control system, robot control method and user apparatus for robot control system
WO2022149496A1 (en) Entertainment system and robot
WO2017125783A1 (en) Remote interactive system of augmented reality and associated method
WO2019150331A1 (en) Systems and methods for a common operator to control multiple vehicles cross-reference to related applications

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18862106; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2019544586; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18862106; Country of ref document: EP; Kind code of ref document: A1)