WO2018150756A1 - Information processing device, information processing method, and storage medium

Publication number: WO2018150756A1
Authority: WIPO (PCT)
Application number: PCT/JP2017/047343
Other languages: French (fr), Japanese (ja)
Prior art keywords: food, beverage, information processing, display, information
Inventors: 裕士 瀧本, 岩村 厚志
Original assignee: ソニー株式会社
Application filed by ソニー株式会社
Publication of WO2018150756A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 - Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 - Services
    • G06Q50/12 - Hotels or restaurants
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators

Description

  • the present disclosure relates to an information processing apparatus, an information processing method, and a storage medium.
  • Devices such as smartphones and tablet terminals, which display various information in response to user operations on a touch panel, have become widespread.
  • As for tablet terminals, screen sizes have been increasing, and a usage style in which a plurality of users operate the device at the same time is being considered.
  • a projector is conventionally used as a device for displaying information.
  • Patent Document 1 discloses a technique for displaying information in accordance with the environment in which the information is to be displayed and the state of the information being displayed.
  • the present disclosure provides a mechanism that can further improve the quality of services related to foods or beverages provided according to user interaction.
  • an information processing apparatus is provided that includes a control unit that controls display of a display object related to a food or beverage based on information related to the food or beverage obtained as a result of sensing and setting information associated with the food or beverage.
  • an information processing method is also provided, in which the display of the display object related to the food or beverage is controlled by a processor.
  • a storage medium is also provided, storing a program for causing a computer to function as a control unit that controls display of the display object related to the food or beverage based on the information related to the food or beverage obtained as a result of sensing and the setting information associated with the food or beverage.
  • As described above, according to the present disclosure, the display of the display object related to the food or beverage (hereinafter also simply referred to as food and drink) is controlled based on information obtained as a result of sensing, so it is possible to provide a real-time service that follows the state of the meal. Furthermore, since the display of the display object is controlled based on the setting information associated with the food and drink, it is possible to provide a fine-grained service for each item of food and drink. Thus, according to the present disclosure, it is possible to provide a real-time and fine-grained service.
  • FIG. 1 is a diagram illustrating a configuration example of an information processing system according to an embodiment of the present disclosure.
  • the term "system" may mean a configuration for executing a predetermined process; the system as a whole can be regarded as one device, or the system can be regarded as being configured by a plurality of devices.
  • the information processing system according to the present embodiment illustrated in FIG. 1 is likewise configured to execute predetermined processing (for example, processing realized by the functional configuration illustrated in FIG. 4) as a whole, and which of its components is regarded as a single device may be determined arbitrarily.
  • an information processing system 100a includes an input unit 110a and an output unit 130a.
  • the output unit 130a visually notifies the user of the information by displaying various types of information on the table 140a.
  • a projector is used as the output unit 130a.
  • the output unit 130a is disposed above the table 140a, for example, spaced from the table 140a by a predetermined distance while being suspended from the ceiling, and projects information on the top surface of the table 140a.
  • a method for displaying information on the top surface of the table 140a from above is also referred to as a “projection type”.
  • the entire area where information is displayed by the output unit 130a is also referred to as a display screen.
  • the output unit 130a displays information presented to the user as the application is executed by the information processing system 100a on the display screen.
  • the displayed information is, for example, an operation screen of each application.
  • each display area in which the operation screen of such an application is displayed on the display screen is also referred to as a display object.
  • the display object may be a so-called GUI (Graphical User Interface) component (widget).
  • the output unit 130a may include a lighting device.
  • the information processing system 100a may control states of the lighting device, such as turning it on or off, based on the content of information input by the input unit 110a and/or the content of information displayed by the output unit 130a.
  • the output unit 130a may include a speaker and may output various kinds of information as sound.
  • the number of speakers may be one or plural.
  • the output unit 130a includes a plurality of speakers, the information processing system 100a may limit the speakers that output sound or adjust the direction in which sound is output.
  • the output unit 130a may include a plurality of output devices, and may include, for example, a projector, a lighting device, and a speaker.
  • the input unit 110a is a device that inputs operation details of a user who uses the information processing system 100a.
  • the input unit 110a is provided above the table 140a, for example, in a state suspended from the ceiling.
  • the input unit 110a is provided apart from the table 140a on which information is displayed.
  • the input unit 110a may be configured by an imaging device that can capture the top surface of the table 140a, that is, the display screen.
  • As the imaging device, a camera that images the table 140a with one lens, a stereo camera that can record depth information by imaging the table 140a with two lenses, or the like can be used.
  • When the input unit 110a is a stereo camera, for example, a visible light camera or an infrared camera can be used.
  • By analyzing the image (captured image) captured by the camera, the information processing system 100a can detect an object physically located on the table 140a (hereinafter also referred to as a real object), for example, the position of a user's hand.
  • When a stereo camera is used as the input unit 110a, the information processing system 100a can analyze the captured image and acquire, in addition to the position of an object located on the table 140a, depth information on the object (in other words, three-dimensional information).
  • the information processing system 100a can detect contact or proximity of the user's hand to the table 140a in the height direction and separation of the hand from the table 140a based on the depth information.
  • In the following description, the act of the user touching or bringing an operating body such as a hand into contact with information on the display screen is also simply referred to as "contact".
  • In the present embodiment, the position of the operating body, for example, the user's hand, on the display screen (that is, the top surface of the table 140a) is detected, and various information is input based on the detected position of the operating body.
  • That is, the user can perform various operation inputs by moving the operating body on the display screen. For example, when contact of the user's hand with a display object is detected, an operation input for that display object is performed.
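  • As a non-limiting reference, the contact determination and hit-testing described above can be sketched as follows. This is a minimal illustration only; the threshold value and helper names are assumptions and are not part of the publication.

        # Sketch of judging contact between an operating body and the table surface
        # from overhead depth data, and of hit-testing a touch against display objects.
        # The 10 mm tolerance is an assumed value.

        TOUCH_THRESHOLD_MM = 10.0

        def is_contact(hand_depth_mm: float, table_depth_mm: float) -> bool:
            """True when the hand is effectively on the table top.

            Both depths are distances from the overhead sensor, so a touching
            hand has a depth close to that of the table surface itself.
            """
            return (table_depth_mm - hand_depth_mm) <= TOUCH_THRESHOLD_MM

        def hit_test(touch_xy, display_objects):
            """Return the display object (if any) whose bounds contain the touch point."""
            x, y = touch_xy
            for obj in display_objects:
                ox, oy, w, h = obj["bounds"]  # rectangle on the table-top display screen
                if ox <= x <= ox + w and oy <= y <= oy + h:
                    return obj
            return None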
  • In the following, a case where a user's hand is used as the operating body will be described as an example, but the present embodiment is not limited to this example, and various operating members such as a stylus may be used as the operating body.
  • the input unit 110a may capture not only the top surface of the table 140a but also a user existing around the table 140a.
  • the information processing system 100a can detect the position of the user around the table 140a based on the captured image.
  • the information processing system 100a may perform personal recognition of the user by extracting physical features that can identify the individual, such as the size of the user's face or body included in the captured image.
  • the present embodiment is not limited to such an example, and user operation input may be executed by other methods.
  • the input unit 110a may be provided as a touch panel on the top surface of the table 140a, and the user's operation input may be detected by contact of the user's finger or the like with respect to the touch panel.
  • the touch panel may be realized by various methods such as a pressure-sensitive type, a capacitance type, and an optical type.
  • the input unit 110a may detect a user operation on the top surface of the table 140a by recognizing the spatial position of an object using reflection of ultrasonic waves, or by detecting and analyzing vibration of an object to detect the contact position between that object and another object.
  • the input unit 110a may employ any one of these techniques, or any combination thereof, for detecting a user operation on the top surface of the table 140a. Further, the user's operation input may be detected as a gesture by the imaging device constituting the input unit 110a.
  • the input unit 110a may include a voice input device such as a microphone that picks up sounds produced by the user and environmental sounds of the surrounding environment.
  • a microphone array for collecting sound in a specific direction can be suitably used. Further, the microphone array can be configured such that the sound collection direction can be adjusted to an arbitrary direction.
  • an operation input may be performed using the collected voice.
  • the information processing system 100a may perform individual recognition based on the voice by analyzing the collected voice.
  • the input unit 110a may be configured by a remote control device (so-called remote control).
  • the remote controller may be one in which a predetermined instruction is input by operating a predetermined button arranged on the remote controller, or one in which the movement or posture of the remote controller is detected by a sensor such as an acceleration sensor or a gyro sensor mounted on the remote controller, and a predetermined instruction is input by the user's operation of moving the remote controller.
  • the information processing system 100a may include other input devices such as a mouse, a keyboard, buttons, switches, and levers (not shown) as the input unit 110a, and user operations may be input through these input devices.
  • the configuration of the information processing system 100a according to the present embodiment has been described above with reference to FIG. Although not shown in FIG. 1, another device may be connected to the information processing system 100a.
  • an illumination device for illuminating the table 140a may be connected to the information processing system 100a.
  • the information processing system 100a may control the lighting state of the lighting device according to the state of the display screen.
  • the configuration of the information processing system is not limited to that shown in FIG.
  • the information processing system according to the present embodiment only needs to include an output unit that displays various types of information on a display screen and an input unit that can accept at least an operation input for the displayed information; the specific configuration is not limited.
  • Referring to FIG. 2 and FIG. 3, other configuration examples of the information processing system according to the present embodiment will be described.
  • FIGS. 2 and 3 are diagrams showing other configuration examples of the information processing system according to the present embodiment.
  • an output unit 130a is provided below the table 140b.
  • the output unit 130a is a projector, for example, and projects information from below toward the top plate of the table 140b.
  • the top plate of the table 140b is formed of a transparent material such as a glass plate or a transparent plastic plate, and the information projected by the output unit 130a is displayed on the top surface of the table 140b.
  • a method in which the output unit 130a projects information from below the table 140b and displays the information on the top surface of the table 140b is also referred to as a "rear projection type".
  • the input unit 110b is provided on the top surface (front surface) of the table 140b.
  • the input unit 110b is configured by, for example, a touch panel, and the operation input by the user is performed when the touch of the operating body on the display screen on the top surface of the table 140b is detected by the touch panel.
  • the configuration of the input unit 110b is not limited to this example, and the input unit 110b may be provided below the table 140b and separated from the table 140b, similarly to the information processing system 100a shown in FIG.
  • the input unit 110b is configured by an imaging device, for example, and can detect the position of the operation body on the top surface of the table 140b through a top plate formed of a transparent material.
  • a touch panel display is installed on a table with its display screen facing upward.
  • the input unit 110c and the output unit 130c can be integrally configured as the touch panel display. That is, various types of information are displayed on the display screen of the display, and the operation input by the user is performed by detecting the touch of the operating body on the display screen of the display by the touch panel.
  • an imaging device may be provided above the touch panel display as the input unit 110c. The position of the user around the table can be detected by the imaging device.
  • the information processing system according to the present embodiment can be realized by various configurations.
  • the present embodiment will be described by taking as an example the configuration of the information processing system 100a in which the input unit 110a and the output unit 130a are provided above the table 140a shown in FIG.
  • the information processing system 100a, the input unit 110a, and the output unit 130a are simply referred to as the information processing system 100, the input unit 110, and the output unit 130.
  • FIG. 4 is a block diagram illustrating an example of a functional configuration of the information processing system 100 according to the present embodiment.
  • the information processing system 100 includes an input unit 110, a processing unit 120, an output unit 130, a storage unit 150, and a communication unit 160 as its functions.
  • the input unit 110 is an input interface for inputting various information to the information processing system 100. A user can input various types of information to the information processing system 100 via the input unit 110.
  • the input unit 110 corresponds to the input units 110a to 110c shown in FIGS.
  • the input unit 110 can include various sensors.
  • the input unit 110 performs sensing on the user in the sensing target range, user actions, real objects, and the relationship between these and display objects, generates sensing information indicating the sensing result, and outputs the sensing information to the processing unit 120.
  • the sensing target range may not be limited to the top surface of the table 140, and may include, for example, the periphery of the table 140.
  • the input unit 110 includes an imaging device and captures an image including the user's body, face, and hands and objects positioned on the top surface of the table 140.
  • Information acquired via the input unit 110 (for example, information about the captured image) is provided to the processing unit 120 described later.
  • the imaging device may be a visible light camera or an infrared camera, for example.
  • the input unit 110 may be configured as an imaging device including a function as a depth sensor capable of acquiring depth information such as a stereo camera.
  • the depth sensor may be configured separately from the imaging device as a sensor using an arbitrary method such as a time of flight method or a structured light method.
  • the input unit 110 may include a touch sensor. In that case, the touch sensor detects touches on the display screen, while detection of a user's hand that is not touching the display screen and of objects on the display screen may be handled by a depth sensor and/or an imaging device that images the display screen from above.
  • the input unit 110 includes a sound collection device and collects a user's voice, a sound accompanying a user's operation, an environmental sound, and the like. Information input via the input unit 110 is provided to the processing unit 120 described later, and a user's voice input is recognized, or movement of an object is detected.
  • the sound collection device may be an array microphone, and the sound source direction may be detected by the processing unit 120.
  • the sound collection device is typically configured by a microphone, but may be configured as a sensor that detects sound from vibration by using light.
  • the processing unit 120 includes various processors such as a CPU and a DSP, and controls the operation of the information processing system 100 by executing various arithmetic processes. For example, the processing unit 120 processes various types of information obtained from the input unit 110 or the communication unit 160 and stores the information in the storage unit 150 or causes the output unit 130 to output the information.
  • the processing unit 120 may be regarded as an information processing apparatus that processes various types of information.
  • the processing unit 120 includes a setting unit 121, an acquisition unit 123, a display control unit 125, and a notification unit 127 as its functions. Note that the processing unit 120 may have functions other than these functions.
  • each function of the processing unit 120 is realized by a processor constituting the processing unit 120 operating according to a predetermined program.
  • the setting unit 121 has a function of setting setting information in association with food and drink.
  • the acquisition unit 123 has a function of acquiring the information regarding food and drink obtained as a result of sensing and the setting information associated with the food and drink.
  • the display control unit 125 has a function of controlling the display of the display object related to the food and drink based on the acquired information regarding the food and drink and the setting information.
  • the notification unit 127 has a function of notifying an external device of information about the food and drink based on the acquired information regarding the food and drink and the setting information. Details of the processing by each of these components will be described later.
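  • As a non-limiting reference, the functional split described above (setting unit 121, acquisition unit 123, display control unit 125, and notification unit 127) can be sketched as follows. Class names, method names, and dictionary keys are illustrative assumptions only.

        # Minimal sketch of the four functions of the processing unit 120.
        from dataclasses import dataclass, field

        @dataclass
        class SettingUnit:                      # corresponds to the setting unit 121
            settings: dict = field(default_factory=dict)

            def set_for_item(self, item_id: str, setting: dict) -> None:
                """Associate setting information with a food or drink item."""
                self.settings[item_id] = setting

        @dataclass
        class AcquisitionUnit:                  # corresponds to the acquisition unit 123
            setting_unit: SettingUnit

            def acquire(self, sensing_result: dict) -> tuple:
                """Return the sensed item information and its associated settings."""
                setting = self.setting_unit.settings.get(sensing_result["item_id"], {})
                return sensing_result, setting

        class DisplayControlUnit:               # corresponds to the display control unit 125
            def control(self, item_info: dict, setting: dict) -> None:
                if item_info.get("remaining_ratio", 1.0) <= setting.get("reorder_threshold", 0.2):
                    print(f"show reorder display object for {item_info['item_id']}")

        class NotificationUnit:                 # corresponds to the notification unit 127
            def notify(self, item_info: dict, setting: dict) -> None:
                if item_info.get("finished"):
                    print(f"notify clerk terminal: clear {item_info['item_id']}")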
  • the output unit 130 is an output interface for notifying the user of various types of information processed by the information processing system 100.
  • the output unit 130 includes a display device such as a display, a touch panel, or a projector, and displays various types of information on the display screen under the control of the display control unit 125.
  • the output unit 130 corresponds to the output units 130a to 130c shown in FIGS. 1 to 3, and displays a display object on the display screen as described above.
  • the present embodiment is not limited to this example, and the output unit 130 may further include an audio output device such as a speaker, and may output various types of information as audio.
  • Storage unit 150 is a storage device that temporarily or permanently stores information for the operation of the information processing system 100.
  • the storage unit 150 stores setting information.
  • the communication unit 160 is a communication interface for transmitting / receiving data to / from an external device by wire / wireless.
  • the communication unit 160 communicates with an external device directly or via a network access point using a method such as wireless LAN (Local Area Network), Wi-Fi (registered trademark), infrared communication, or Bluetooth (registered trademark).
  • For example, the communication unit 160 communicates information with a user device such as the user's smartphone or a wearable device worn by the user.
  • the communication unit 160 may acquire information from an SNS (Social Networking Service) or the like by communicating with a server on the Web, for example.
  • FIG. 5 is a diagram for explaining an overview of the information processing system 100 according to the present embodiment.
  • FIG. 5 shows a scene at an eating establishment such as a restaurant, a bar, or a cafe.
  • users (that is, guests) 10A and 10B are seated at the table 140, and food and drink 20A, 20B, and 20C are placed on the top surface (that is, the display screen) of the table 140.
  • These food and drink 20A, 20B, and 20C have been ordered by the user 10A or 10B and served by a restaurant clerk.
  • the information processing system 100 is installed in such a restaurant, for example. The information processing system 100 then senses, via the input unit 110, the eating and drinking of the food and drink 20A, 20B, and 20C by the users 10A and 10B. Then, as illustrated in FIG. 5, the information processing system 100 displays display objects 30A and 30B on the top surface of the table 140 via the output unit 130.
  • the information processing system 100 displays a display object on the table 140 on which food and drink are arranged by a display device (that is, the output unit 130) installed in a restaurant that provides food and drink.
  • the display object may be displayed on a user terminal such as a user's smartphone or wearable device, or may be displayed on a wall or floor of a restaurant.
  • FIG. 6 is a diagram illustrating an example of the flow of operation processing of the information processing system 100 according to the present embodiment.
  • the information processing system 100 performs sensing using the input unit 110 and acquires information about the foods 20A, 20B, and 20C (step S102).
  • the information processing system 100 acquires setting information associated with the foods 20A, 20B, and 20C (step S104).
  • the information processing system 100 controls the display of the display object based on the information on the foods 20A, 20B, and 20C and the setting information associated with the foods 20A, 20B, and 20C (step S106).
  • the information processing system 100 displays a display object 30A including a beverage menu for prompting the user 10A who eats the meat dish 20A to order additional beverages.
  • the information processing system 100 displays a display object 30B including a food menu for prompting the user 10B, who eats the fried shrimp 20B and drinks the beer 20C, to order additional side dishes.
  • the information processing system 100 accepts additional orders in response to user operations on these display objects 30A or 30B.
  • In this way, by sensing food and drink and displaying display objects corresponding to the setting information of the food and drink, the information processing system 100 can, for example, simplify the user's ordering operation or propose combinations of food and drink. This makes it possible to enrich the user's eating and drinking experience.
  • the information processing system 100 controls the notification of information to the external device based on the information on the foods 20A, 20B, and 20C and the setting information associated with the foods 20A, 20B, and 20C ( Step S108).
  • Examples of external devices include various devices in restaurants, terminals held by store clerk, and servers on a network.
  • For example, the information processing system 100 transmits, to the store clerk's terminal, an instruction to clear the dishes at the timing when the food and drink have been finished, and an instruction to serve the next dish of a course meal.
  • the information processing system 100 transmits the order information to the kitchen apparatus at the timing when the additional order is made.
  • the information processing system 100 transmits an accounting instruction to the accounting apparatus of the restaurant at the timing when all the food and drink on the table 140 have been finished. Thereby, the restaurant can provide an appropriate service at an appropriate timing.
  • The information processing system 100 then determines whether or not an end condition is satisfied (step S110). If it is determined that the condition is not satisfied, the process returns to step S102; if it is determined that the condition is satisfied, the process ends. As the end condition, for example, the user standing up from the seat can be considered.
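  • As a non-limiting reference, the flow of steps S102 to S110 can be sketched as the following loop. The helper names (sense, get_settings, update_display, notify_external, end_condition_met) are hypothetical stand-ins for the input unit 110, storage unit 150, output unit 130, and communication unit 160.

        # Rough sketch of the processing flow of FIG. 6.
        def run(system) -> None:
            while True:
                item_info = system.sense()                   # step S102: sensing via the input unit 110
                settings = system.get_settings(item_info)    # step S104: setting info for each food/drink
                system.update_display(item_info, settings)   # step S106: control the display objects
                system.notify_external(item_info, settings)  # step S108: notify clerk terminal, kitchen, register
                if system.end_condition_met():               # step S110: e.g. the user stands up
                    break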
  • Information processing system 100 acquires information about food and drink based on sensing information.
  • the information processing system 100 acquires information on food and drink based on sensing information such as a captured image, an infrared image, user voice, or depth information.
  • the information processing system may acquire information on food and drink associated with the user.
  • the user here is, for example, a person to whom the food or drink is provided (for example, the person who ordered it or the person who eats or drinks it) or a person who provides the food or drink (for example, a person who cooks the food or drink in the kitchen, or a person who carries the food or drink from the kitchen to the table).
  • the information regarding food and drink may be acquired for each item of food and drink, may be acquired collectively for a plurality of items corresponding to one user, or may be acquired collectively for each table.
  • food and drink may be captured as ingested items. An ingested item is a food, a beverage, or a combination thereof that is taken orally.
  • information on food and drink may include information unique to food and drink.
  • information unique to food and drink for example, names of food and drink, raw materials, cooking methods, and the like are conceivable.
  • information on food and drink may include information on the state of food and drink.
  • information regarding the state of food and drink for example, the temperature, remaining amount and consumption speed of food and drink can be considered.
  • information on food and drink may include information on provision of food and drink.
  • Examples of information related to the provision of food and drink include the elapsed time since the food or drink was provided, whether or not it is part of a course meal, and related food and drink (for example, other dishes included in the course meal, or a wine offered to accompany a main dish).
  • the information regarding food and drink may include information regarding persons related to the food and drink.
  • As information about persons related to food and drink, for example, which user the food or drink is associated with (the orderer or the person who eats or drinks it), whether the user who eats or drinks belongs to a group of one or more people, the role of the user in the group, the degree of congestion of the restaurant, the number of store clerks, and the like can be considered.
  • Information processing system 100 (for example, acquisition part 123) acquires setting information matched with food and drink. For example, the information processing system 100 identifies food and drink by recognizing food and drink, and acquires setting information associated with the identified food and drink from the storage unit 150.
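  • As a non-limiting reference, the step of identifying a food or drink item and fetching its setting information from the storage unit 150 can be sketched as follows. The recognizer is a stub, and the store contents are assumed example values.

        # Sketch of "recognize the item, then look up its setting information".
        SETTINGS_STORE = {
            "beer": {"reorder_threshold": 0.2, "content": "refill order button"},
            "meat_dish": {"content": "origin of the meat", "trigger": "fork contact"},
        }

        def recognize_item(captured_image) -> str:
            """Stub for an image-recognition step that is not specified here."""
            return "beer"  # placeholder result for illustration

        def settings_for(item_id: str) -> dict:
            """Return the setting information associated with the identified item."""
            return SETTINGS_STORE.get(item_id, {})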
  • (Setting information related to display execution conditions) The setting information may include information related to a display execution condition of the display object.
  • the information processing system 100 (for example, the display control unit 125) displays a display object when a display execution condition included in the setting information is satisfied.
  • the display execution condition includes a condition for starting display and a condition for ending.
  • the information processing system 100 displays the display object for a period from when the condition for starting display is satisfied until the condition for ending is satisfied.
  • the setting information may include a plurality of display execution conditions. For example, there may be a plurality of display object candidates that can be displayed for one food and drink, and display execution conditions can be set for each candidate. Then, the information processing system 100 displays a display object corresponding to the satisfied display execution condition.
  • An example of setting information regarding display execution conditions will be described below.
  • the setting information related to the display execution condition is based on at least one of those described below.
  • the display execution condition may be related to food and drink. An example will be described below.
  • the display execution condition may be based on the remaining amount of food and drink. For example, when the remaining amount of the beverage is equal to or less than the threshold value, it may be determined that display of the display object that prompts the additional order should be executed.
  • the display execution condition may be based on contact between food and drink and a predetermined real object. For example, when a fork comes into contact with a meat dish, it may be determined that display of a display object that explains the production area of the meat should be executed.
  • the display execution condition may be based on the relationship between the food and drink and other food and drink. For example, regarding a plurality of beverages provided to a plurality of users, when the remaining amount of some of the beverages is 20% or less, it may be determined that display of a display object that prompts an additional order should be executed.
  • the display execution condition may be based on whether food or drink is provided as a single item or as part of a course meal. For example, it may be determined that different display objects should be displayed for food and drink provided as a single item (for example, in a la carte) and food and drink provided as part of a course meal.
  • the display execution condition may be related to time. An example will be described below.
  • the display execution condition may be based on time. For example, it may be determined that a different display object should be displayed depending on whether the time belongs to a lunch time zone or a dinner time zone.
  • the display execution condition may be based on the elapsed time since the food or drink is provided. For example, when the elapsed time from the provision of food and drink exceeds a threshold, it can be determined that display of a display object that prompts an additional order should be executed.
  • the display execution condition may be based on the elapsed time since the user started eating. For example, when the elapsed time since the user started eating for a seat exceeds a threshold value, it may be determined that display of a display object that prompts an additional order should be executed.
  • the display execution condition may be related to the user. An example will be described below.
  • the display execution condition may be based on user attribute information.
  • As user attribute information, for example, identification information such as a user name, as well as sex, age, occupation, and the like are conceivable. For example, it may be determined that different display objects should be displayed depending on whether the user is male or female.
  • the display execution condition may be based on the user's state.
  • As the user's state, for example, the user's biometric information, emotion, seat position, and the like can be considered.
  • As biometric information, the pulse, body temperature, blood pressure, complexion, and the like can be considered. For example, it may be determined that different display objects should be displayed when the user's emotion is high and when it is calm.
  • the display execution condition may be based on the relationship between the user and another user (for example, a companion at the table). For example, when the user is the host of the group, it may be determined that the display object that prompts an additional order should be displayed, and when the user is a guest, it may be determined that the display object that prompts an additional order should not be displayed.
  • the display execution condition may be based on the user's eating history or preference.
  • the user's eating history can be acquired from, for example, a database that stores, in association with the user's identification information, the food and drink that the user has eaten in the past.
  • the user's preference can be estimated from the user's eating history, for example. For example, it may be determined that display of a display object recommending food and drink that matches the user's preference should be executed.
  • the display execution condition may be based on the user's voice. For example, it may be determined that display of a display object recommending food and drink that fits the topic should be executed, based on a voice recognition result of a conversation between the user and another user (for example, a companion) or a store clerk.
  • the display execution conditions related to the user described above may be set for one user or may be set for a group. For example, it may be determined that different display objects should be displayed depending on whether the user is seated alone or a group of people is seated.
  • (User operation) The display execution condition may relate to a user operation. For example, when a predetermined operation such as a single tap, a double tap, or a drag on a displayed display object is detected, it can be determined that display of a display object should be executed.
  • the setting information related to the display execution condition may be information indicating the display execution condition itself.
  • the information indicating the display execution condition itself is, for example, information indicating a threshold for the remaining amount of beverage.
  • the setting information related to the display execution condition may be information for changing the display execution condition itself.
  • the information for changing the display execution condition itself is, for example, information that lowers the threshold value of the remaining amount of beverage according to the elapsed time since the meal was started.
  • Generally, the consumption of food and drink is fast in the period immediately after the meal is started, and the consumption becomes slower as time passes. Based on this, in the first half of the meal a display object that prompts an additional order may be displayed even if the remaining amount of beverage is large, while in the second half of the meal the display object that prompts an additional order may be displayed only when the remaining amount of beverage has become small.
  • the setting information related to the display execution condition may include information indicating a time lag from when the display execution condition is satisfied until the display object is actually displayed.
  • the information processing system 100 displays the display object after the set time lag has elapsed after the display execution condition is satisfied.
  • For example, the information processing system 100 displays a display object that prompts an additional order as soon as the remaining amount of beverage falls below the threshold in the first half of the meal, and displays the display object that prompts an additional order only after a predetermined time has elapsed since the remaining amount of beverage fell below the threshold in the second half of the meal.
  • the same thing can be realized by changing the display execution condition such as lowering the threshold value of the remaining amount of beverage according to the elapsed time since the meal was started.
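  • As a non-limiting reference, a display execution condition whose remaining-amount threshold is relaxed as the meal progresses, combined with a time lag before the display object actually appears, can be sketched as follows. All numeric values are assumptions.

        # Sketch of evaluating a remaining-amount display execution condition.
        def effective_threshold(base_threshold: float, minutes_since_meal_start: float) -> float:
            """Lower the remaining-amount threshold as the meal progresses."""
            relaxation = min(minutes_since_meal_start / 60.0, 1.0) * 0.10  # up to 10 points lower
            return max(base_threshold - relaxation, 0.05)

        def should_display(remaining_ratio: float, minutes_since_meal_start: float,
                           minutes_since_condition_met: float,
                           base_threshold: float = 0.20, time_lag_min: float = 0.0) -> bool:
            """True when the display object that prompts an additional order should appear now."""
            if remaining_ratio > effective_threshold(base_threshold, minutes_since_meal_start):
                return False                                     # condition not yet satisfied
            return minutes_since_condition_met >= time_lag_min   # respect the configured time lag

        print(should_display(0.18, 10, 0))                  # True: early in the meal, show immediately
        print(should_display(0.08, 70, 2, time_lag_min=5))  # False: 5-minute lag not yet elapsed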
  • the setting information may include information regarding the contents of the display object.
  • In that case, the information processing system 100 (for example, the display control unit 125) displays a display object having the content indicated by the setting information.
  • the setting information related to the content of the display object can include at least one of image (moving image / still image) data, audio data, and text data.
  • the setting information related to the content is at least one of the items described below.
  • the content of the display object may be information on additional orders.
  • Examples of information relating to additional orders include an order issue button, information indicating food and drink candidates that can be ordered, information indicating the order history up to that point (which may include the total amount at that point), and information indicating options that can be customized at the time of ordering, such as quantity, size, and other options.
  • the display object related to an additional order may be in a format for accepting selection of the food and options desired by the user, or may accept an order for the same food or drink that the user is currently eating or drinking, that is, a request for a refill. By displaying information regarding additional orders, the user can easily place an additional order.
  • the content of the display object may be information related to the elapsed time since the food and drink were provided.
  • By displaying the elapsed time since the food or drink was provided, it is possible to motivate the user to eat sooner. This makes it possible for the restaurant to improve its seat turnover rate.
  • the content of the display object may be information regarding the temperature of food and drink.
  • By displaying the temperature of the food or drink, it is possible to motivate the user to eat it at an appropriate temperature.
  • the content of the display object may be information related to the association between the food and drink and the user.
  • By displaying the association between the food or drink and the user, the user can be prevented from losing track of his or her beverage when, for example, changing seats.
  • the content of the display object may be information explaining food and drink.
  • As information explaining food and drink, for example, the production area, ingredients, allergy information, nutrients, calorie content, how to eat the dish, and the like can be considered.
  • Thereby, the user can eat the dish in an appropriate way even if it is, for example, the first time the user has had it.
  • the setting information may include information regarding the display style of the display object.
  • In that case, the information processing system 100 (for example, the display control unit 125) displays the display object in the display style indicated by the setting information.
  • the setting information regarding the display format is at least one of the following items.
  • the setting information regarding the display style of the display object may be setting information regarding the display position of the display object.
  • As setting information regarding the display position, for example, the position on the display screen where the display object is to be displayed (for example, an absolute position or a relative position based on the food or drink), its range and size, and whether or not overlap with other display objects or real objects is allowed can be considered.
  • the setting information regarding the display style of the display object may be setting information regarding the display orientation of the display object.
  • the setting information regarding the display posture may include, for example, information indicating the orientation of characters included in the display object, and the information processing system 100 can, for example, rotate the display object so that the characters included in the display object face the user.
  • the setting information related to the display style of the display object may be setting information related to animation.
  • animation for example, movement, rotation, size change, color change, and the like of the display object can be considered.
  • the setting information regarding the display style of the display object may be setting information regarding the degree of detail. For example, a simple display object may be displayed when a sufficient display area cannot be secured, and a detailed display object may be displayed when a sufficient display area is secured. As a simple display object, an icon indicating that there is new information can be considered.
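  • As a non-limiting reference, two of the display-style decisions mentioned above, choosing the degree of detail from the free area available on the display screen and orienting the characters of a display object toward the user, can be sketched as follows. The area threshold is an assumed value.

        import math

        def choose_detail_level(free_area_cm2: float) -> str:
            """Return a simple icon when space is tight, otherwise a detailed display object."""
            return "detailed" if free_area_cm2 >= 150.0 else "icon"

        def rotation_toward_user(object_xy, user_xy) -> float:
            """Angle in degrees that makes the characters in the display object face the user."""
            dx = user_xy[0] - object_xy[0]
            dy = user_xy[1] - object_xy[1]
            return math.degrees(math.atan2(dy, dx))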
  • the setting information related to notification to an external device may include information similar to setting information related to display of a display object.
  • the setting information related to the notification to the external device may include information similar to the setting information related to the display execution condition as setting information related to the notification execution condition to the external device.
  • the information processing system 100 (for example, the setting unit 121) can set setting information in various ways.
  • the information processing system 100 may first set default setting information, and change the setting information by performing machine learning according to the subsequent usage history.
  • the default setting information may be learned based on the usage history in other restaurants, for example.
  • the information processing system 100 may change the setting information according to input by the restaurant. Specifically, each item of the setting information (for example, the threshold value of the remaining amount of beverage related to the display execution condition) may be input by the restaurant. Moreover, each item of setting information may be changed according to a rough input by the restaurant. For example, the restaurant inputs, for a food or drink item, a flag for setting the display timing of the display object, such as "normal", "early", or "late". Then, the information processing system 100 raises or lowers, for example, the threshold value of the remaining amount of beverage related to the display execution condition so as to satisfy the request to make the display timing earlier or later. The input of the flag may be performed individually for each food and drink item, or may be performed commonly for a plurality of items, for example, commonly for drinks or commonly for small dishes.
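  • As a non-limiting reference, the conversion of a rough "early / normal / late" flag input by the restaurant into a concrete change of the remaining-amount threshold can be sketched as follows. The adjustment amounts are assumptions.

        TIMING_ADJUSTMENT = {"early": +0.10, "normal": 0.0, "late": -0.10}

        def adjusted_threshold(base_threshold: float, timing_flag: str) -> float:
            """A higher threshold makes the reorder display object appear sooner."""
            adjusted = base_threshold + TIMING_ADJUSTMENT.get(timing_flag, 0.0)
            return round(min(max(adjusted, 0.0), 1.0), 2)

        print(adjusted_threshold(0.20, "early"))  # 0.3 -> display object appears earlier
        print(adjusted_threshold(0.20, "late"))   # 0.1 -> display object appears later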
  • the information processing system 100 may flexibly change the setting information according to circumstances for each user or convenience for each restaurant.
  • the setting information may be variably set based on at least one of a user or a restaurant that provides food and drink.
  • the information processing system 100 can change the setting information according to the past use history for a user who has a past visit history (for example, a regular customer).
  • the information processing system 100 can change the setting information according to various seasonal conditions such as recommended items for each season, the purchase status of the day, and sales.
  • the information processing system 100 may change the setting information according to the weather or the like.
  • Whether the setting information can be changed may be set for each item. For example, items that cannot be changed by the restaurant side can be fixed.
  • a changeable range may be set for changeable items. For example, with respect to items having a preferable range for the restaurant side, it is possible to keep the change within the preferable range.
  • The information processing system 100 controls the display of a display object related to the food and drink based on the information about the food and drink obtained as a result of sensing and the setting information associated with the food and drink.
  • the information processing system 100 controls the display object to be displayed, the display start / end timing of the display object, and / or the display style based on the information about the food and drink provided to the user and the setting information.
  • the information processing system 100 may control the display of display objects according to the amount of food and drink. For example, the information processing system 100 may make a display object that prompts an additional order stand out as the remaining amount of food or drink decreases.
  • FIG. 7 is a diagram illustrating an example of display control by the information processing system 100 according to the present embodiment.
  • As shown in FIG. 7, the information processing system 100 displays a small menu icon 30A when the remaining amount of the beer 20 provided to the user 10 is plentiful, displays a large menu icon 30B when the remaining amount decreases, and displays a refill order button 30C when the beer has been drunk up. By making the display more noticeable as the remaining amount of food or drink decreases in this way, it is possible to prevent the user from forgetting to order and, by prompting an additional order, to increase sales for the restaurant.
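  • As a non-limiting reference, the FIG. 7 behaviour, in which the display object becomes more prominent as the remaining amount of the beverage decreases, can be sketched as follows. The ratio boundaries are assumed values.

        def display_object_for_remaining(remaining_ratio: float) -> str:
            """Map the sensed remaining amount of a beverage to a display object."""
            if remaining_ratio <= 0.0:
                return "refill order button 30C"   # drunk up: one-tap reorder
            if remaining_ratio < 0.3:
                return "large menu icon 30B"       # running low: noticeable menu
            return "small menu icon 30A"           # plenty left: unobtrusive icon

        for ratio in (0.9, 0.2, 0.0):
            print(ratio, "->", display_object_for_remaining(ratio))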
  • the information processing system 100 may control the display of display objects according to the consumption pace of food and drink. For example, the information processing system 100 may display a display object that prompts an additional order at an earlier timing as the consumption pace of food and drink is faster and at a later timing as the rate is slower. Specifically, when the time from serving to the end of eating or drinking is short, the information processing system 100 displays a display object that prompts an additional order after a short time from the end of eating or the end of drinking. In addition, when the time from serving to the end of eating or drinking is long, the information processing system 100 displays a display object that prompts an additional order after a long time from the end of eating or drinking. Thereby, it becomes possible to receive an additional order at an appropriate timing according to the user's consumption pace.
  • the information processing system 100 may control the display of display objects according to the order in which the user eats and drinks the food and drink. For example, the information processing system 100 may estimate the user's preference based on the order of eating and drinking, in other words, how the user reduces or leaves the food, and display a display object that prompts an additional order for other food or drink matching the user's preference. Note that the information processing system 100 may acquire profile information from the database and determine whether food is left because the user does not like it or is left intentionally.
  • the information processing system 100 may control the display of display objects according to the progress of food and beverage provision.
  • the progress of food and beverage provision may be, for example, the progress of a course meal (how many of all the dishes have been provided) or the progress of a plurality of individually ordered foods and beverages.
  • the information processing system 100 displays the display object earlier in the first half of the course dish and displays the display object later in the second half of the course dish.
  • For example, the information processing system 100 displays a display object that instructs the user to clear space in the center of the table.
  • the information processing system 100 may control the display of display objects according to the correspondence between food and drink and the user.
  • the information processing system 100 may display a display object that indicates the correspondence between the food and drink and the user.
  • Such a display object may clearly indicate to the user the food and drink ordered by the user. In this case, for example, it is possible to prevent the user from losing track of his or her beverage when moving seats.
  • the display object may clearly indicate to the store clerk which user ordered the food or drink. In that case, the store clerk can know the place where the food or drink is to be served.
  • FIG. 8 is a diagram illustrating an example of display control by the information processing system 100 according to the present embodiment.
  • As shown in FIG. 8, the information processing system 100 displays a display object 30A indicating that the beer 20A belongs to the user 10A and a display object 30B indicating that the beer 20B belongs to the user 10B.
  • Further, the information processing system 100 displays a display object 30C indicating that the orderer of the dish 20C is the user 10A and that the dish should be placed in front of the user 10A, and a display object 30D indicating that the orderer of the dish 20D is the user 10B and that the dish should be placed in front of the user 10B.
  • the information processing system 100 may control the display of display objects according to the end of eating and drinking of food and drink.
  • For example, the information processing system 100 may display a display object that accepts an evaluation of the food or drink at the timing when the food or drink is finished (eaten up or drunk up).
  • the evaluation may be in a scoring format or a free-form questionnaire format. This enables finer-grained evaluation for each item of food and drink than the per-restaurant evaluations that have conventionally been widely performed.
  • the information processing system 100 may post the input evaluation to the user's SNS together with the captured image of the food and drink.
  • the information processing system 100 may control the display of the display object according to the elapsed time after the display object is displayed. For example, the information processing system 100 makes the display area less noticeable, for example, by reducing the display area over time after the display of the display object is started, and finally ends the display.
  • the information processing system 100 may control the display of the display object according to the congestion degree of the restaurant.
  • the degree of congestion of the restaurant may be the degree of order congestion or the degree of customer congestion. For example, when the degree of order congestion (for example, the order frequency or the number of unfulfilled orders) exceeds a threshold value, the information processing system 100 delays the display timing of the display object related to additional orders, or displays it inconspicuously. As a result, new orders are suppressed and the degree of congestion can be reduced.
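  • As a non-limiting reference, suppressing additional-order display objects while orders are congested can be sketched as follows. The congestion metric (number of pending orders) and the delay formula are assumptions.

        def reorder_display_delay_min(pending_orders: int, congestion_threshold: int = 15) -> float:
            """Extra delay in minutes applied to additional-order display objects under congestion."""
            if pending_orders <= congestion_threshold:
                return 0.0
            # one extra minute of delay for every three orders above the threshold
            return (pending_orders - congestion_threshold) / 3.0

        print(reorder_display_delay_min(10))  # 0.0 -> not congested, no delay
        print(reorder_display_delay_min(24))  # 3.0 -> delay the reorder display by three minutes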
  • the information processing system 100 can control the display of display objects according to the role of the user in the group. For example, when the user is a group host, the information processing system 100 displays a display object that prompts an additional order and a display object that includes accounting information. Thereby, for example, the user can appropriately treat the guest.
  • the information processing system 100 may control the display of the display object according to the user action. For example, the information processing system 100 starts or ends display of the display object when detecting a specific gesture such as shaking a glass.
  • the information processing system 100 may control the display of the display object according to the user's voice.
  • the information processing system 100 can recognize the content of the user's conversation by voice to estimate what the user wants to eat or drink and can control the display of the display object according to the estimation result.
  • (Biometric information) The information processing system 100 may control the display of a display object according to the user's biometric information. For example, the information processing system 100 displays a display object that recommends warm food when the user's body temperature is lower than a threshold.
  • the information processing system 100 (for example, the notification unit 127) controls notification of information about the food and drink to an external device based on the information about the food and drink provided to the user obtained as a result of sensing and the setting information associated with the food and drink. For example, the information processing system 100 controls the information to be notified, the notification timing, and/or the notification format based on the information about the food and drink provided to the user and the setting information.
  • the information processing system 100 may notify the external device of order information when an order for a predetermined amount of food or drink is made in units of tables. In other words, the information processing system 100 may wait for notification of order information until there is an order for a predetermined amount of food and drink in units of tables. Thereby, it is possible to improve the work efficiency of the restaurant side.
  • the information processing system 100 may issue an alert or stop accepting an order when an order exceeding a threshold set for each user or each table is performed.
  • Thereby, the user can limit the bill amount or the amount of alcohol consumed.
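  • As a non-limiting reference, holding orders per table until a minimum batch is reached and raising an alert when a per-table spending limit is exceeded can be sketched as follows. The batch size and the spending limit are assumed values.

        class TableOrderBuffer:
            def __init__(self, batch_size: int = 3, spend_limit: float = 15000.0):
                self.batch_size = batch_size      # notify the kitchen only every N orders
                self.spend_limit = spend_limit    # per-table limit set by the user or restaurant
                self.pending = []
                self.total_spent = 0.0

            def add_order(self, item: str, price: float) -> dict:
                """Buffer an order and return the action the system should take."""
                if self.total_spent + price > self.spend_limit:
                    return {"action": "alert", "reason": "table spending limit reached"}
                self.pending.append(item)
                self.total_spent += price
                if len(self.pending) >= self.batch_size:
                    batch, self.pending = self.pending, []
                    return {"action": "notify_kitchen", "orders": batch}
                return {"action": "hold"}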
  • FIG. 9 is a block diagram illustrating an example of a hardware configuration of the information processing apparatus according to the present embodiment.
  • the information processing apparatus 900 illustrated in FIG. 9 can realize the information processing system 100 illustrated in FIG. 4, for example.
  • Information processing by the information processing system 100 according to the present embodiment is realized by cooperation between software and hardware described below.
  • the information processing apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, a RAM (Random Access Memory) 903, and a host bus 904a.
  • the information processing apparatus 900 includes a bridge 904, an external bus 904b, an interface 905, an input device 906, an output device 907, a storage device 908, a drive 909, a connection port 911, and a communication device 913.
  • the information processing apparatus 900 may include a processing circuit such as an electric circuit, a DSP, or an ASIC instead of or in addition to the CPU 901.
  • the CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 900 according to various programs. Further, the CPU 901 may be a microprocessor.
  • the ROM 902 stores programs used by the CPU 901, calculation parameters, and the like.
  • the RAM 903 temporarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
  • the CPU 901 can form the processing unit 120 illustrated in FIG. 4.
  • the CPU 901, ROM 902, and RAM 903 are connected to each other by a host bus 904a including a CPU bus.
  • the host bus 904a is connected to an external bus 904b such as a PCI (Peripheral Component Interconnect/Interface) bus via the bridge 904.
  • the host bus 904a, the bridge 904, and the external bus 904b do not necessarily have to be configured separately, and these functions may be mounted on one bus.
  • the input device 906 is realized by devices through which the user inputs information, such as a mouse, a keyboard, a touch panel, buttons, a microphone, switches, and levers.
  • the input device 906 may be, for example, a remote control device using infrared rays or other radio waves, or may be an external connection device such as a mobile phone or a PDA that supports the operation of the information processing device 900.
  • the input device 906 may include, for example, an input control circuit that generates an input signal based on information input by the user using the above-described input means and outputs the input signal to the CPU 901.
  • a user of the information processing apparatus 900 can input various data and instruct a processing operation to the information processing apparatus 900 by operating the input device 906.
  • the input device 906 can be formed by a device that detects information about the user.
  • the input device 906 can include various sensors such as an image sensor (for example, a camera), a depth sensor (for example, a stereo camera), an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, a sound sensor, a distance sensor, and a force sensor.
  • the input device 906 may acquire information related to the state of the information processing apparatus 900 itself, such as the posture and movement speed of the information processing apparatus 900, and information related to the surrounding environment of the information processing apparatus 900, such as the brightness and noise around the information processing apparatus 900.
  • the input device 906 may include a GNSS (Global Navigation Satellite System) module that receives GNSS signals from GNSS satellites (for example, GPS signals from GPS (Global Positioning System) satellites) and measures position information including the latitude, longitude, and altitude of the device.
  • as for position information, the input device 906 may detect the position by transmission to and reception from Wi-Fi (registered trademark), a mobile phone, a PHS, a smartphone, or the like, or by near field communication.
  • the input device 906 can form, for example, the input unit 110 shown in FIG. 4.
  • the output device 907 is formed of a device that can notify the user of acquired information visually or audibly. Examples of such devices include display devices such as CRT display devices, liquid crystal display devices, plasma display devices, EL display devices, laser projectors, LED projectors, and lamps; audio output devices such as speakers and headphones; and printer devices.
  • the output device 907 outputs results obtained by various processes performed by the information processing device 900. Specifically, the display device visually displays results obtained by various processes performed by the information processing device 900 in various formats such as text, images, tables, and graphs.
  • the audio output device converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs it aurally.
  • the output device 907 can form, for example, the output unit 130 shown in FIG. 4.
  • the storage device 908 is a data storage device formed as an example of a storage unit of the information processing device 900.
  • the storage apparatus 908 is realized by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • the storage device 908 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
  • the storage device 908 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
  • the storage device 908 can form, for example, the storage unit 150 shown in FIG. 4.
  • the drive 909 is a storage medium reader / writer, and is built in or externally attached to the information processing apparatus 900.
  • the drive 909 reads information recorded on a removable storage medium such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 903.
  • the drive 909 can also write information to a removable storage medium.
  • the connection port 911 is an interface for connecting to an external device, for example, a connection port to an external device capable of transmitting data via USB (Universal Serial Bus).
  • the communication device 913 is a communication interface formed by a communication device or the like for connecting to the network 920, for example.
  • the communication device 913 is, for example, a communication card for wired or wireless LAN (Local Area Network), LTE (Long Term Evolution), Bluetooth (registered trademark), or WUSB (Wireless USB).
  • the communication device 913 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various communication, or the like.
  • the communication device 913 can transmit and receive signals and the like according to a predetermined protocol such as TCP / IP, for example, with the Internet and other communication devices.
  • the communication device 913 can form, for example, the communication unit 160 illustrated in FIG. 4.
  • the network 920 is a wired or wireless transmission path for information transmitted from a device connected to the network 920.
  • the network 920 may include a public line network such as the Internet, a telephone line network, and a satellite communication network, various LANs including the Ethernet (registered trademark), a wide area network (WAN), and the like.
  • the network 920 may include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
  • each of the above components may be realized using a general-purpose member, or may be realized by hardware specialized for the function of each component. Therefore, it is possible to change the hardware configuration to be used as appropriate according to the technical level at the time of carrying out this embodiment.
  • a computer program for realizing each function of the information processing apparatus 900 according to the present embodiment as described above can be created and implemented on a PC or the like.
  • a computer-readable recording medium storing such a computer program can be provided.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
  • the above computer program may be distributed via a network, for example, without using a recording medium.
  • as described above, the information processing system 100 according to the present embodiment controls the display of display objects related to a food or beverage based on information about the food or beverage provided to the user obtained as a result of sensing and the setting information associated with the food or beverage. Thereby, the information processing system 100 can provide, in real time and according to the state of the meal, a fine-tuned service set for each food or beverage.
  • the information processing system 100 can be applied in any place where food and drink can be provided, such as an ordinary home or a cafeteria.
  • the processing unit 120 and the storage unit 150 may be provided in an apparatus such as a server that is connected to the input unit 110, the output unit 130, and the communication unit 160 via a network or the like. In that case, information obtained by the input unit 110 or the communication unit 160 is transmitted to the apparatus such as the server via the network or the like, the processing unit 120 generates the drawing information, and the information to be output by the output unit 130 is sent from the apparatus such as the server to the output unit 130 via the network or the like.
  • (1) An information processing apparatus comprising:
  • (2) The information processing apparatus according to (1), wherein the setting information relates to a display execution condition for the display object.
  • (3) The information processing apparatus according to (2), wherein the display execution condition is based on at least one of the remaining amount of the food or beverage, contact between the food or beverage and a predetermined real object, or a relationship between the food or beverage and another food or beverage.
  • (4) The information processing apparatus according to (2) or (3), wherein the display execution condition is based on whether the food or beverage is provided as a single item or as part of a course meal.
  • (5) The information processing apparatus according to any one of (2) to (4), wherein the display execution condition is based on at least one of a time, an elapsed time since the food or beverage was provided, or an elapsed time since the user to whom the food or beverage was provided started eating.
  • (6) The information processing apparatus according to any one of (1) to (5), wherein the setting information relates to the content of the display object.
  • The content includes information on additional orders, information on the elapsed time since the food or beverage was provided, information on the temperature of the food or beverage, and the correspondence between the food or beverage and the user to whom the food or beverage was provided.
  • The setting information relates to a display format of the display object.
  • The display format is at least one of a display position, a display posture, an animation, and a level of detail of the display object.
  • The information processing apparatus according to any one of (1) to (9), wherein the setting information is variably set based on at least one of the user to whom the food or beverage is provided or the restaurant that provides the food or beverage.
  • The information processing apparatus according to any one of (1) to (11), wherein the control unit displays the display object that prompts an additional order at an earlier timing as the pace of consumption of the food or beverage is faster, and at a later timing as the pace is slower.
  • The information processing apparatus according to any one of (1) to (12), wherein the control unit displays the display object that prompts an additional order of another food or beverage according to the order in which the user to whom the food or beverage is provided eats or drinks the food or beverage.
  • (14) The information processing apparatus according to any one of (1) to (13), wherein the control unit displays the display object indicating a correspondence relationship between the food or beverage and the user to whom the food or beverage is provided.
  • The information processing apparatus wherein the control unit displays the display object that receives an evaluation of the food or beverage at the timing when the eating or drinking of the food or beverage is completed.
  • An information processing method including: (20) A storage medium storing a program for causing a computer to function as a control unit that controls display of a display object related to the food or beverage based on information related to the food or beverage obtained as a result of sensing and setting information associated with the food or beverage.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Tourism & Hospitality (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Strategic Management (AREA)
  • Primary Health Care (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Human Resources & Organizations (AREA)
  • Economics (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

[Problem] To provide a technology capable of improving the quality of service associated with food or drink provided according to user interaction. [Solution] An information processing device is provided with a control unit that controls the presentation of a display object associated with food or drink on the basis of information associated with the food or drink obtained as a result of sensing and preset information associated with the food or drink.

Description

Information processing apparatus, information processing method, and storage medium
The present disclosure relates to an information processing apparatus, an information processing method, and a storage medium.
Devices that display various kinds of information according to user operations on a touch panel, such as smartphones and tablet terminals, are in widespread use. With regard to tablet terminals, screen sizes have been increasing, and usage in which a plurality of users operate the device at the same time is being considered. In addition, projectors have conventionally been used as devices for displaying information.
Many techniques for efficiently displaying information according to user interaction have been proposed. For example, Patent Document 1 below discloses a technique for displaying information according to the environment in which the information is to be displayed and the status of the information being displayed.
International Publication No. 2015/098188
In recent years, an increasing number of devices, including the various information processing devices described above, provide various services based on interaction with users. One example is the provision of services related to food and drink at restaurants and the like. Although the above-mentioned Patent Document 1 also considers providing services related to food and drink, there is room for improvement in the quality of the services provided.
Therefore, the present disclosure provides a mechanism that can further improve the quality of services related to food or beverages provided according to user interaction.
According to the present disclosure, there is provided an information processing apparatus including a control unit that controls display of a display object related to a food or beverage based on information related to the food or beverage obtained as a result of sensing and setting information associated with the food or beverage.
According to the present disclosure, there is also provided an information processing method including controlling, by a processor, display of a display object related to a food or beverage based on information related to the food or beverage obtained as a result of sensing and setting information associated with the food or beverage.
According to the present disclosure, there is also provided a storage medium storing a program for causing a computer to function as a control unit that controls display of a display object related to a food or beverage based on information related to the food or beverage obtained as a result of sensing and setting information associated with the food or beverage.
According to the present disclosure, the display of a display object related to a food or beverage (hereinafter also simply referred to as food or drink) is controlled based on information about the food or drink obtained as a result of sensing, so that a real-time service corresponding to the state of the meal can be provided. Furthermore, according to the present disclosure, since the display of the display object related to the food or drink is controlled based on setting information associated with that food or drink, it is possible to provide a fine-tuned service for each food or drink. In this way, according to the present disclosure, it is possible to provide a real-time and fine-tuned service.
As described above, according to the present disclosure, a mechanism capable of further improving the quality of services related to food and drink provided according to user interaction is provided. Note that the above effects are not necessarily limiting; together with or instead of the above effects, any of the effects described in this specification, or other effects that can be grasped from this specification, may be achieved.
FIG. 1 is a diagram illustrating a configuration example of an information processing system according to an embodiment of the present disclosure.
FIG. 2 is a diagram illustrating another configuration example of the information processing system according to the embodiment.
FIG. 3 is a diagram illustrating another configuration example of the information processing system according to the embodiment.
FIG. 4 is a block diagram illustrating an example of the functional configuration of the information processing system according to the embodiment.
FIG. 5 is a diagram for explaining an overview of the information processing system according to the embodiment.
FIG. 6 is a diagram illustrating an example of the flow of operation processing of the information processing system according to the embodiment.
FIG. 7 is a diagram illustrating an example of display control by the information processing system according to the embodiment.
FIG. 8 is a diagram illustrating an example of display control by the information processing system according to the embodiment.
FIG. 9 is a block diagram illustrating an example of the hardware configuration of the information processing apparatus according to the embodiment.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description is omitted.
The description will be made in the following order.
1. Overview of the information processing system
2. Functional configuration example
3. Technical features
 3.1. Overview
 3.2. Information about food and drink
 3.3. Setting information
  3.3.1. Content of setting information
  3.3.2. Setting of setting information
 3.4. Display control
 3.5. Notification to external devices
4. Hardware configuration example
5. Summary
 <<1. Overview of the Information Processing System>>
 A configuration of an information processing system according to an embodiment of the present disclosure will be described with reference to FIG. 1. FIG. 1 is a diagram illustrating a configuration example of an information processing system according to an embodiment of the present disclosure. In this specification, a system may mean a configuration for executing predetermined processing; the system as a whole can be regarded as one device, or the system can be regarded as being configured by a plurality of devices. The information processing system according to the present embodiment illustrated in FIG. 1 likewise only needs to be configured to be able to execute predetermined processing as a whole (for example, the processing realized by the functional configuration illustrated in FIG. 4), and which configuration within the information processing system is regarded as one device may be arbitrary.
Referring to FIG. 1, an information processing system 100a according to an embodiment of the present disclosure includes an input unit 110a and an output unit 130a.
The output unit 130a visually notifies the user of various kinds of information by displaying the information on the table 140a. For example, a projector is used as the output unit 130a. As illustrated, the output unit 130a is disposed above the table 140a, for example, suspended from the ceiling at a predetermined distance from the table 140a, and projects information onto the top surface of the table 140a. Such a method of displaying information on the top surface of the table 140a from above is also referred to as a "projection type".
In the following description, the entire area in which information is displayed by the output unit 130a is also referred to as a display screen. For example, the output unit 130a displays, on the display screen, information presented to the user as an application is executed by the information processing system 100a. The displayed information is, for example, the operation screen of each application. Hereinafter, each display area in which such an application operation screen is displayed on the display screen is also referred to as a display object. A display object may be a so-called GUI (Graphical User Interface) component (widget).
When the information processing system 100a is of the projection type, the output unit 130a may include a lighting device. In that case, the information processing system 100a may control states of the lighting device, such as turning it on and off, based on the content of the information input by the input unit 110a and/or the content of the information displayed by the output unit 130a.
The output unit 130a may also include a speaker and output various kinds of information as sound. When the output unit 130a is configured by speakers, the number of speakers may be one or more. When the output unit 130a includes a plurality of speakers, the information processing system 100a may limit which speakers output sound or adjust the direction in which sound is output. Of course, the output unit 130a may include a plurality of output devices, for example, a projector, a lighting device, and a speaker.
The input unit 110a is a device that inputs the operation content of a user who uses the information processing system 100a. In the example illustrated in FIG. 1, the input unit 110a is provided above the table 140a, for example, suspended from the ceiling. In this way, the input unit 110a is provided apart from the table 140a on which information is displayed. The input unit 110a may be configured by an imaging device capable of capturing the top surface of the table 140a, that is, the display screen. As the input unit 110a, for example, a camera that images the table 140a with one lens, or a stereo camera that can image the table 140a with two lenses and record information in the depth direction, may be used. When the input unit 110a is a stereo camera, for example, a visible light camera or an infrared camera may be used.
When a camera that images the table 140a with one lens is used as the input unit 110a, the information processing system 100a can detect the position of a physical object located on the table 140a (hereinafter also referred to as a real object), for example the user's hand, by analyzing the image captured by the camera (the captured image). When a stereo camera is used as the input unit 110a, the information processing system 100a can acquire, by analyzing the image captured by the stereo camera, depth information (in other words, three-dimensional information) of an object located on the table 140a in addition to its position information. Based on the depth information, the information processing system 100a can detect contact or proximity of the user's hand to the table 140a in the height direction, and separation of the hand from the table 140a. In the following description, the user bringing an operating body such as a hand into contact with or close to the display screen is collectively referred to simply as "contact".
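As an illustration of how contact or proximity in the height direction could be derived from depth information, the following is a minimal Python sketch. The threshold values, and the assumption that depth has already been converted into the hand's height above the table surface, are illustrative and not taken from the disclosure.

```python
def classify_hand_state(hand_height_mm: float,
                        touch_threshold_mm: float = 10.0,
                        proximity_threshold_mm: float = 50.0) -> str:
    """Classify the hand relative to the table top from its height above the surface.

    hand_height_mm: estimated height of the hand above the table surface,
                    derived from the depth information of a stereo camera.
    """
    if hand_height_mm <= touch_threshold_mm:
        return "contact"      # treated as touching the display screen
    if hand_height_mm <= proximity_threshold_mm:
        return "proximity"    # close to the surface but not yet touching
    return "separated"        # the hand has left the table surface

if __name__ == "__main__":
    for h in (5.0, 30.0, 120.0):
        print(h, "->", classify_hand_state(h))
```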
In the present embodiment, the position of an operating body, for example the user's hand, on the display screen (that is, on the top surface of the table 140a) is detected based on the image captured by the input unit 110a, and various kinds of information are input based on the detected position of the operating body. That is, the user can perform various operation inputs by moving the operating body on the display screen. For example, when contact of the user's hand with a display object is detected, an operation input for that display object is performed. In the following description, a case where the user's hand is used as the operating body will be described as an example, but the present embodiment is not limited to this example, and various operation members such as a stylus may be used as the operating body.
When the input unit 110a is configured by an imaging device, the input unit 110a may capture not only the top surface of the table 140a but also users present around the table 140a. For example, the information processing system 100a can detect the position of a user around the table 140a based on the captured image. In addition, the information processing system 100a may perform personal recognition of a user by extracting physical features that can identify the individual, such as the size of the user's face or body included in the captured image.
The present embodiment is not limited to this example, and the user's operation input may be performed by other methods. For example, the input unit 110a may be provided as a touch panel on the top surface of the table 140a, and the user's operation input may be detected by contact of the user's finger or the like with the touch panel. The touch panel may be realized by various methods such as a pressure-sensitive type, a capacitive type, or an optical type. The input unit 110a may also detect a user operation on the table top surface 140a by recognizing the spatial position of an object using ultrasonic reflection, or by detecting and analyzing the vibration of an object to detect the contact position between the object and another object. The input unit 110a may employ any one or any combination of these as a technique for detecting a user operation on the table top surface 140a. The user's operation input may also be detected by a gesture toward the imaging device constituting the input unit 110a. Alternatively, the input unit 110a may include a voice input device such as a microphone that picks up the voice uttered by the user and the ambient sound of the surrounding environment. As the voice input device, a microphone array for collecting sound from a specific direction can suitably be used, and the microphone array can be configured so that the sound collection direction can be adjusted to an arbitrary direction. When a voice input device is used as the input unit 110a, operation input may be performed by the collected voice. The information processing system 100a may also perform personal recognition based on the collected voice by analyzing it. Alternatively, the input unit 110a may be configured by a remote control device (a so-called remote controller). The remote controller may be one through which a predetermined instruction is input by operating a predetermined button arranged on it, or one through which a predetermined instruction is input by the user moving the remote controller, with its movement or posture detected by a sensor such as an acceleration sensor or a gyro sensor mounted on it. Furthermore, the information processing system 100a may include other input devices such as a mouse, a keyboard, buttons, switches, and levers (not shown) as the input unit 110a, and user operations may be input via these input devices.
The configuration of the information processing system 100a according to the present embodiment has been described above with reference to FIG. 1. Although not illustrated in FIG. 1, other devices may be connected to the information processing system 100a. For example, a lighting device for illuminating the table 140a may be connected to the information processing system 100a. The information processing system 100a may control the lighting state of the lighting device according to the state of the display screen.
In the present embodiment, the configuration of the information processing system is not limited to that illustrated in FIG. 1. The information processing system according to the present embodiment only needs to include an output unit that displays various kinds of information on a display screen and an input unit that can accept at least operation inputs for the displayed information, and its specific configuration is not limited. Other configuration examples of the information processing system according to the present embodiment will be described with reference to FIGS. 2 and 3. FIGS. 2 and 3 are diagrams illustrating other configuration examples of the information processing system according to the present embodiment.
In the information processing system 100b illustrated in FIG. 2, the output unit 130a is provided below the table 140b. The output unit 130a is, for example, a projector, and projects information from below toward the top plate of the table 140b. The top plate of the table 140b is formed of a transparent material such as a glass plate or a transparent plastic plate, and the information projected by the output unit 130a is displayed on the top surface of the table 140b. Such a method of projecting information from the output unit 130a below the table 140b and displaying the information on the top surface of the table 140b is also referred to as a "rear projection type".
In the example illustrated in FIG. 2, the input unit 110b is provided on the top surface of the table 140b. The input unit 110b is configured by, for example, a touch panel, and an operation input by the user is performed when contact of the operating body with the display screen on the top surface of the table 140b is detected by the touch panel. The configuration of the input unit 110b is not limited to this example; similarly to the information processing system 100a illustrated in FIG. 1, the input unit 110b may be provided below the table 140b and apart from it. In this case, the input unit 110b is configured by, for example, an imaging device, and can detect the position of the operating body on the top surface of the table 140b through the top plate formed of a transparent material.
In the information processing system 100c illustrated in FIG. 3, a touch panel display is installed on a table with its display screen facing upward. In the information processing system 100c, the input unit 110c and the output unit 130c can be integrally configured as this touch panel display. That is, various kinds of information are displayed on the display screen of the display, and an operation input by the user is performed by detecting contact of the operating body with the display screen via the touch panel. In the information processing system 100c as well, as in the information processing system 100a illustrated in FIG. 1, an imaging device may be provided above the touch panel display as the input unit 110c. The positions of users around the table and the like can be detected by the imaging device.
Other configuration examples of the information processing system according to the present embodiment have been described above with reference to FIGS. 2 and 3. As described above, the information processing system according to the present embodiment can be realized by various configurations. In the following, the present embodiment will be described taking as an example the configuration of the information processing system 100a illustrated in FIG. 1, in which the input unit 110a and the output unit 130a are provided above the table 140a. However, other configurations that can realize the information processing system according to the present embodiment, such as the configurations illustrated in FIG. 2 or FIG. 3 described above, can also realize functions similar to those described below. In the following description, for simplicity, the information processing system 100a, the input unit 110a, and the output unit 130a are simply referred to as the information processing system 100, the input unit 110, and the output unit 130.
 <<2. Functional Configuration>>
 Hereinafter, a functional configuration that can realize the information processing system 100 according to the present embodiment described above will be described with reference to FIG. 4. FIG. 4 is a block diagram illustrating an example of the functional configuration of the information processing system 100 according to the present embodiment.
Referring to FIG. 4, the information processing system 100 according to the present embodiment includes, as its functions, an input unit 110, a processing unit 120, an output unit 130, a storage unit 150, and a communication unit 160.
 (1) Input unit 110
 The input unit 110 is an input interface for inputting various kinds of information to the information processing system 100. The user can input various kinds of information to the information processing system 100 via the input unit 110. The input unit 110 corresponds to the input units 110a to 110c illustrated in FIGS. 1 to 3.
The input unit 110 may include various sensors. The input unit 110 senses the users in the sensing target range, user actions, real objects, and the relationships between these and display objects, generates sensing information indicating the sensing results, and outputs the sensing information to the processing unit 120. The sensing target range is not limited to the top surface of the table 140 and may include, for example, the area around the table 140.
For example, the input unit 110 includes an imaging device and captures images including the user's body, the user's face, the user's hands, objects located on the top surface of the table 140, and so on. Information input via the input unit 110 (for example, information about the captured image) is provided to the processing unit 120 described later, where the user is identified, the user's operation input is recognized, or objects are detected. The imaging device may be, for example, a visible light camera or an infrared camera. As described above, the input unit 110 may also be configured as an imaging device that includes a function as a depth sensor capable of acquiring depth information, such as a stereo camera. Alternatively, the depth sensor may be configured separately from the imaging device, using an arbitrary method such as a time-of-flight method or a structured-light method. The input unit 110 may also include a touch sensor. In that case, the touch sensor detects touches on the display screen, and the detection of the user's hand not touching the display screen and of objects on the display screen may be handled by the depth sensor and/or an imaging device that images the display screen from above.
For example, the input unit 110 includes a sound collection device and collects the user's voice, sounds accompanying the user's actions, environmental sounds, and so on. Information input via the input unit 110 is provided to the processing unit 120 described later, where the user's voice input is recognized or the movement of an object is detected. The sound collection device may be an array microphone, and the sound source direction may be detected by the processing unit 120. The sound collection device is typically configured by a microphone, but may also be configured as a sensor that detects sound through the vibration of light.
 (2) Processing unit 120
 The processing unit 120 includes various processors such as a CPU or a DSP, and controls the operation of the information processing system 100 by executing various arithmetic processes. For example, the processing unit 120 processes various kinds of information obtained from the input unit 110 or the communication unit 160, stores information in the storage unit 150, and causes the output unit 130 to output information. The processing unit 120 may also be regarded as an information processing apparatus that processes various kinds of information. The processing unit 120 includes, as its functions, a setting unit 121, an acquisition unit 123, a display control unit 125, and a notification unit 127. The processing unit 120 may have functions other than these. Each function of the processing unit 120 is realized by the processor constituting the processing unit 120 operating according to a predetermined program.
For example, the setting unit 121 has a function of setting setting information in association with a food or beverage. The acquisition unit 123 has a function of acquiring information about the food or beverage obtained as a result of sensing and the setting information associated with that food or beverage. The display control unit 125 has a function of controlling the display of display objects related to the food or beverage based on the acquired information about the food or beverage and the setting information. The notification unit 127 has a function of notifying external devices of information about the food or beverage based on the acquired information about the food or beverage and the setting information. The details of the processing by each of these components will be described in detail later.
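To make the division of roles among the setting unit 121, the acquisition unit 123, the display control unit 125, and the notification unit 127 concrete, the following is a hedged Python sketch of the data flow between them. The class and method names, the data fields, and the refill rule are illustrative assumptions and do not reproduce the actual implementation.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class FoodInfo:
    name: str
    remaining_ratio: float   # 0.0 (empty) .. 1.0 (untouched), obtained from sensing

@dataclass
class SettingInfo:
    prompt_refill_below: float   # remaining ratio below which a refill prompt is shown

class SettingUnit:                      # corresponds to the setting unit 121
    def __init__(self) -> None:
        self.settings: Dict[str, SettingInfo] = {}

    def register(self, name: str, setting: SettingInfo) -> None:
        self.settings[name] = setting

class AcquisitionUnit:                  # corresponds to the acquisition unit 123
    def acquire(self, sensing_result: Dict[str, float]) -> List[FoodInfo]:
        return [FoodInfo(name, ratio) for name, ratio in sensing_result.items()]

class DisplayControlUnit:               # corresponds to the display control unit 125
    def decide(self, foods: List[FoodInfo], settings: Dict[str, SettingInfo]) -> List[str]:
        objects = []
        for food in foods:
            setting = settings.get(food.name)
            if setting and food.remaining_ratio < setting.prompt_refill_below:
                objects.append(f"additional_order_prompt:{food.name}")
        return objects

class NotificationUnit:                 # corresponds to the notification unit 127
    def notify(self, foods: List[FoodInfo]) -> List[str]:
        return [f"clear_plate:{f.name}" for f in foods if f.remaining_ratio == 0.0]

if __name__ == "__main__":
    setting_unit = SettingUnit()
    setting_unit.register("beer", SettingInfo(prompt_refill_below=0.2))
    foods = AcquisitionUnit().acquire({"beer": 0.1, "fried shrimp": 0.0})
    print(DisplayControlUnit().decide(foods, setting_unit.settings))
    print(NotificationUnit().notify(foods))
```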
 (3) Output unit 130
 The output unit 130 is an output interface for notifying the user of various kinds of information processed by the information processing system 100. The output unit 130 includes a display device such as a display, a touch panel, or a projector, and displays various kinds of information on the display screen under the control of the display control unit 125. The output unit 130 corresponds to the output units 130a to 130c illustrated in FIGS. 1 to 3, and displays display objects on the display screen as described above. The present embodiment is not limited to this example; the output unit 130 may further include an audio output device such as a speaker and may output various kinds of information as sound.
 (4) Storage unit 150
 The storage unit 150 is a storage device that temporarily or permanently stores information for the operation of the information processing system 100. For example, the storage unit 150 stores the setting information.
 (5) Communication unit 160
 The communication unit 160 is a communication interface for transmitting and receiving data to and from external devices in a wired or wireless manner. For example, the communication unit 160 communicates with external devices directly or via a network access point using methods such as wireless LAN (Local Area Network), Wi-Fi (Wireless Fidelity, registered trademark), infrared communication, or Bluetooth (registered trademark). For example, the communication unit 160 communicates information with a user device such as the user's smartphone or a wearable device worn by the user. The communication unit 160 may also communicate with a server on the Web to acquire information from an SNS (Social Networking Service) or the like.
 <<3. Technical Features>>
 <3.1. Overview>
 First, an overview of the operation of the information processing system 100 according to the present embodiment will be described with reference to FIGS. 5 and 6.
FIG. 5 is a diagram for explaining an overview of the information processing system 100 according to the present embodiment. FIG. 5 shows a scene at an eating and drinking establishment such as a restaurant, bar, or cafe. As shown in FIG. 5, users (that is, customers) 10A and 10B are seated at the table 140, and food and drink items 20A, 20B, and 20C are placed on the top surface (that is, the display screen) of the table 140. These items 20A, 20B, and 20C were ordered by the user 10A or 10B and served by a member of the restaurant staff.
The information processing system 100 is installed, for example, at such a restaurant. The information processing system 100 senses, via the input unit 110, how the users 10A and 10B are eating and drinking the items 20A, 20B, and 20C. Then, as shown in FIG. 5, the information processing system 100 displays the display objects 30A and 30B on the top surface of the table 140 via the output unit 130. Although FIG. 5 describes an example in which the information processing system 100 displays display objects on the table 140 on which the food and drink are served, using a display device (that is, the output unit 130) installed in the restaurant providing the food and drink, the present technology is not limited to this example. For example, display objects may be displayed on a user terminal such as the user's smartphone or wearable device, or may be displayed on a wall, floor, or the like of the restaurant.
Hereinafter, the operation processing by the information processing system 100 in the example shown in FIG. 5 will be described in detail with reference to FIG. 6.
FIG. 6 is a diagram illustrating an example of the flow of operation processing of the information processing system 100 according to the present embodiment. As shown in FIG. 6, the information processing system 100 performs sensing using the input unit 110 and acquires information about the food and drink items 20A, 20B, and 20C (step S102). The information processing system 100 also acquires the setting information associated with the items 20A, 20B, and 20C (step S104).
Next, the information processing system 100 controls the display of display objects based on the information about the items 20A, 20B, and 20C and the setting information associated with them (step S106). For example, the information processing system 100 displays, for the user 10A eating the meat dish 20A, a display object 30A including a beverage menu for prompting an additional beverage order. The information processing system 100 also displays, for the user 10B eating the fried shrimp 20B and drinking the beer 20C, a display object 30B including a food menu for prompting an additional order of side dishes. The information processing system 100 accepts additional orders in response to user operations on these display objects 30A and 30B. In this way, by sensing the food and drink and displaying display objects according to the setting information of each item, the information processing system 100 can, for example, simplify the user's ordering operation or suggest combinations of food and beverages. This makes it possible to enrich the user's eating and drinking experience.
Next, the information processing system 100 controls the notification of information to external devices based on the information about the items 20A, 20B, and 20C and the setting information associated with them (step S108). Possible external devices include various devices in the restaurant, terminals held by the staff, servers on a network, and so on. For example, the information processing system 100 transmits to a staff member's terminal an instruction to clear the dishes at the timing when an item has been finished, and an instruction to serve the next item of a course meal. The information processing system 100 also transmits order information to a kitchen device at the timing when an additional order is placed. Furthermore, the information processing system 100 transmits an accounting instruction to the restaurant's accounting device at the timing when all the food and drink on the table 140 have been eaten and drunk. This enables the restaurant to provide appropriate services at appropriate timings.
Thereafter, the information processing system 100 determines whether an end condition is satisfied (step S110). If it is determined that the condition is not satisfied, the processing returns to step S102 above; if it is determined that the condition is satisfied, the processing ends. A possible end condition is, for example, that the user has left the table.
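A minimal sketch of the loop in steps S102 to S110 might look as follows in Python. The sensing, display, and notification calls are hypothetical placeholders; only the control flow mirrors the description above.

```python
import time

def main_loop(sense, get_settings, update_display, notify_external, end_condition_met,
              interval_s: float = 1.0) -> None:
    """Skeleton of the flow in FIG. 6: S102 -> S104 -> S106 -> S108 -> S110."""
    while True:
        food_info = sense()                         # S102: sensing of food and drink
        settings = get_settings(food_info)          # S104: acquire associated setting info
        update_display(food_info, settings)         # S106: control display objects
        notify_external(food_info, settings)        # S108: notify external devices
        if end_condition_met():                     # S110: e.g. the user has left the table
            break
        time.sleep(interval_s)

if __name__ == "__main__":
    # Trivial placeholders so that the skeleton runs once and exits.
    main_loop(sense=lambda: {},
              get_settings=lambda info: {},
              update_display=lambda info, s: None,
              notify_external=lambda info, s: None,
              end_condition_met=lambda: True,
              interval_s=0.0)
```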
 The outline of the operation of the information processing system 100 according to the present embodiment has been described above. The technical features of the information processing system 100 are described in detail below.
 <3.2. Information on food and drink>
 The information processing system 100 (for example, the acquisition unit 123) acquires information on food and drink based on sensing information. For example, the information processing system 100 acquires information on food and drink based on sensing information such as a captured image, an infrared image, user voice, or depth information. In particular, the information processing system 100 may acquire information on food and drink associated with a user. The user here may be, for example, a person to whom the food or drink is provided (for example, the person who ordered it or the person who eats or drinks it) or a person who provides the food or drink (for example, a person who cooks it in the kitchen or a person who carries it from the kitchen to the table). In the following, the case where the user is a person to whom food and drink are provided is described. The information on food and drink may be acquired for each item, may be acquired collectively for a plurality of items corresponding to one user, or may be acquired collectively for each table. Food and drink may also be regarded as an ingested item, that is, a food, a drink, or a combination thereof that is taken orally.
 A wide variety of information on food and drink is conceivable. Examples are given below.
 For example, the information on food and drink may include information specific to the item, such as its name, ingredients, and cooking method.
 For example, the information on food and drink may include information on the state of the item, such as its temperature, remaining amount, and consumption speed.
 For example, the information on food and drink may include information on how the item is provided, such as the elapsed time since it was served, whether it is part of a course meal, and whether related items (for example, other dishes included in the course, or a wine paired with the main dish) are to be provided.
 For example, the information on food and drink may include information on people related to the item, such as which user the item is associated with (the person who ordered it or the person eating or drinking it), whether that user is alone or belongs to a group, the user's role in the group, the degree of congestion of the restaurant, and the number of staff.
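As one way of holding together the kinds of information listed above, the following is a simple sketch of an in-memory representation. The field names and types are assumptions made for illustration and are not defined in the text.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class FoodItemInfo:
    # Information specific to the item
    name: str
    ingredients: list[str] = field(default_factory=list)
    cooking_method: Optional[str] = None
    # State of the item
    temperature_c: Optional[float] = None
    remaining: float = 1.0            # fraction left, 1.0 = untouched
    consumption_rate: float = 0.0     # fraction consumed per minute
    # Information on how the item is provided
    minutes_since_served: float = 0.0
    part_of_course: bool = False
    # Information on people related to the item
    ordered_by: Optional[str] = None  # identifier of the ordering user
    group_size: int = 1

shrimp = FoodItemInfo(name="fried shrimp", ingredients=["shrimp", "breadcrumbs"],
                      remaining=0.5, minutes_since_served=12.0, ordered_by="user_10B")
```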
 <3.3. Setting information>
 The information processing system 100 (for example, the acquisition unit 123) acquires setting information associated with food and drink. For example, the information processing system 100 identifies a food or drink item by image recognition and acquires the setting information associated with the identified item from the storage unit 150.
 <3.3.1. Contents of the setting information>
 (1) Setting information on display execution conditions
 The setting information may include information on the display execution conditions of a display object. The information processing system 100 (for example, the display control unit 125) displays the display object when a display execution condition included in the setting information is satisfied.
 The display execution conditions include a condition for starting display and a condition for ending display. The information processing system 100 displays the display object during the period from when the start condition is satisfied until the end condition is satisfied.
 The setting information may include a plurality of display execution conditions. For example, there may be a plurality of candidate display objects that can be displayed for one food or drink item, and a display execution condition may be set for each candidate. The information processing system 100 then displays the display object corresponding to the display execution condition that has been satisfied.
 - Examples of setting information on display execution conditions
 Examples of setting information on display execution conditions are described below. The setting information on display execution conditions is based on at least one of the following.
 ・Display execution conditions related to food and drink
 The display execution conditions may relate to the food and drink. Examples are given below.
 A display execution condition may be based on the remaining amount of the food or drink. For example, when the remaining amount of a beverage falls to or below a threshold, it may be determined that a display object prompting an additional order should be displayed.
 A display execution condition may be based on contact between the food or drink and a predetermined real object. For example, when a fork touches a meat dish, it may be determined that a display object describing where the meat was produced should be displayed.
 A display execution condition may be based on the relationship between the food or drink and other food and drink. For example, for a plurality of beverages provided to a plurality of users, it may be determined that a display object prompting an additional order should be displayed when the remaining amount of several of those beverages has fallen to 20% or less.
 A display execution condition may be based on whether the food or drink was provided as a single item or as part of a course meal. For example, it may be determined that different display objects should be displayed for an item provided on its own (for example, à la carte) and an item provided as part of a course meal.
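The group-beverage example above can be expressed as a simple predicate. The sketch below assumes the remaining amount of each beverage has already been estimated from sensing; the 20% threshold is taken from the example, while the minimum count of two is an assumption.

```python
def should_prompt_group_reorder(remaining_amounts: list[float],
                                threshold: float = 0.2,
                                min_count: int = 2) -> bool:
    """True when at least `min_count` of the group's beverages are at or below `threshold`."""
    low = sum(1 for r in remaining_amounts if r <= threshold)
    return low >= min_count

# Two of the three beverages are at 15% and 10%, so an additional order is prompted.
print(should_prompt_group_reorder([0.15, 0.10, 0.80]))  # True
```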
 ・Display execution conditions related to time
 The display execution conditions may relate to time. Examples are given below.
 A display execution condition may be based on the time of day. For example, it may be determined that different display objects should be displayed depending on whether the current time falls in the lunch period or the dinner period.
 A display execution condition may be based on the elapsed time since the food or drink was served. For example, when the elapsed time since serving exceeds a threshold, it may be determined that a display object prompting an additional order should be displayed.
 A display execution condition may be based on the elapsed time since the user started the meal. For example, when the elapsed time since the user took a seat and started eating exceeds a threshold, it may be determined that a display object prompting an additional order should be displayed.
 ・Display execution conditions related to the user
 The display execution conditions may relate to the user. Examples are given below.
 A display execution condition may be based on the user's attribute information. Examples of attribute information include identification information such as the user's name, as well as sex, age, and occupation. For example, it may be determined that different display objects should be displayed depending on whether the user is male or female.
 A display execution condition may be based on the user's state. Examples of the user's state include the user's biological information, emotion, and seat position, and examples of biological information include pulse, body temperature, blood pressure, and complexion. For example, it may be determined that different display objects should be displayed when the user is excited and when the user is calm.
 A display execution condition may be based on the relationship between the user and another user (for example, a companion at the table). For example, it may be determined that a display object prompting an additional order should be displayed when the user is acting as the host of the group, and should not be displayed when the user is a guest.
 A display execution condition may be based on the user's eating and drinking history or preferences. The user's eating and drinking history can be acquired, for example, from a database that stores the food and drink the user has consumed in the past in association with the user's identification information. The user's preferences can be estimated, for example, from that history. For example, it may be determined that a display object recommending food or drink that matches the user's preferences should be displayed.
 A display execution condition may be based on the user's speech. For example, based on the result of speech recognition of a conversation between the user and another user (for example, a companion) or a member of staff, it may be determined that a display object recommending food or drink related to the topic of conversation should be displayed.
 The user-related display execution conditions described above may be set for an individual user or for a group. For example, it may be determined that different display objects should be displayed depending on whether the user is seated alone or seated with others as a group.
 ・Display execution conditions related to user operations
 The display execution conditions may relate to user operations. For example, when a predetermined operation such as a single tap, a double tap, or a drag on a displayed display object is detected, it may be determined that a display object should be displayed.
 - Supplement
 The setting information on a display execution condition may be information indicating the display execution condition itself, for example information indicating a threshold for the remaining amount of a beverage.
 Alternatively, the setting information on a display execution condition may be information that modifies the display execution condition itself, for example information that lowers the threshold for the remaining amount of a beverage according to the elapsed time since the meal started. Typically, food and drink are consumed quickly immediately after a meal starts, and consumption slows as time passes. Based on this, in the first half of the meal a display object prompting an additional order may be displayed even when a large amount of the beverage remains, whereas in the second half of the meal such a display object may be displayed only when little of the beverage remains.
 The setting information on a display execution condition may also include information indicating a time lag between the condition being satisfied and the display object actually being displayed. In that case, the information processing system 100 displays the display object after the set time lag has elapsed from when the display execution condition was satisfied. For example, the information processing system 100 displays a display object prompting an additional order as soon as the remaining amount of the beverage falls to or below the threshold in the first half of the meal, but only after a predetermined time has passed since the remaining amount fell to or below the threshold in the second half of the meal. The same effect can also be achieved by modifying the display execution condition itself, for example by lowering the threshold for the remaining amount of the beverage according to the elapsed time since the meal started.
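One way to realize the adjustment described above is to interpolate the threshold (or the time lag) over the course of the meal. The linear interpolation, the assumed meal length, and the concrete values below are illustrative assumptions only.

```python
def reorder_threshold(elapsed_min: float,
                      early_threshold: float = 0.5,
                      late_threshold: float = 0.1,
                      meal_length_min: float = 90.0) -> float:
    """Remaining-amount threshold that starts high and falls as the meal progresses."""
    progress = min(max(elapsed_min / meal_length_min, 0.0), 1.0)
    return early_threshold + (late_threshold - early_threshold) * progress

def display_delay_sec(elapsed_min: float, max_delay_sec: float = 120.0,
                      meal_length_min: float = 90.0) -> float:
    """Time lag between the condition being satisfied and the display object appearing."""
    progress = min(max(elapsed_min / meal_length_min, 0.0), 1.0)
    return max_delay_sec * progress

print(round(reorder_threshold(10.0), 2), round(display_delay_sec(10.0), 1))  # early in the meal: 0.46, 13.3 s
print(round(reorder_threshold(80.0), 2), round(display_delay_sec(80.0), 1))  # late in the meal: 0.14, 106.7 s
```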
 (2) Setting information on content
 For example, the setting information may include information on the content of the display object. When a display execution condition included in the setting information is satisfied, the information processing system 100 (for example, the display control unit 125) displays a display object whose content corresponds to the satisfied condition.
 Specifically, the setting information on the content of the display object may include at least one of image (moving image or still image) data, audio data, and text data.
 - Examples of setting information on the content of a display object
 Examples of setting information on the content of a display object are described below. The setting information on content is at least one of the following.
 For example, the content of the display object may be information on an additional order. Examples of such information include an order button, information indicating orderable food and drink, information indicating the order history up to that point (which may include the total amount so far), and information indicating customizable options such as quantity, size, and other options. A display object relating to an additional order may be in a form that accepts the user's selection of the desired food, drink, and options, or in a form that accepts an order for the same item the user is currently having, that is, a request for a refill. Displaying information on additional orders allows the user to place an additional order easily.
 For example, the content of the display object may be information on the elapsed time since the food or drink was served. Displaying this elapsed time can motivate the user to eat promptly, which allows the restaurant to improve table turnover.
 For example, the content of the display object may be information on the temperature of the food or drink. Displaying the temperature can motivate the user to eat the item while it is at a suitable temperature.
 For example, the content of the display object may be information on the association between the food or drink and a user. Displaying which user in a group ordered a beverage allows the user to avoid losing track of his or her own drink when, for example, changing seats.
 For example, the content of the display object may be information describing the food or drink, such as its place of origin, ingredients, allergy information, nutrients, energy content, and how to eat it. Displaying such information allows the user to eat even an unfamiliar ingredient in an appropriate way.
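As an illustration, the additional-order content described above could be represented roughly as follows; the structure and field names are assumptions and not a format defined in the text.

```python
from dataclasses import dataclass, field

@dataclass
class AdditionalOrderContent:
    candidates: list[str]                     # orderable food and drink
    order_history: list[tuple[str, int]]      # (item, price) ordered so far
    options: dict[str, list[str]] = field(default_factory=dict)  # size, quantity, etc.

    def total_so_far(self) -> int:
        """Total amount of the order history, shown alongside the menu."""
        return sum(price for _, price in self.order_history)

content = AdditionalOrderContent(
    candidates=["draft beer", "edamame"],
    order_history=[("draft beer", 500), ("fried shrimp", 800)],
    options={"size": ["small", "medium", "large"]},
)
print(content.total_so_far())  # 1300
```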
 (3) Setting information on display style
 For example, the setting information may include information on the display style of the display object. When a display execution condition included in the setting information is satisfied, the information processing system 100 (for example, the display control unit 125) displays a display object whose content corresponds to the satisfied condition in the set display style.
 - Examples of setting information on display style
 Examples of setting information on display style are described below. The setting information on display style is at least one of the following.
 The setting information on the display style of a display object may be setting information on its display position, for example the position on the display screen where the display object should be displayed (for example, an absolute position or a position relative to the food or drink), its range and size, and whether overlap with other display objects or real objects is permitted.
 The setting information on the display style of a display object may be setting information on its display orientation. The setting information on display orientation may include, for example, information indicating the direction of the characters contained in the display object, and the information processing system 100 may rotate the display object so that, for example, those characters face the user.
 The setting information on the display style of a display object may be setting information on animation, for example movement, rotation, resizing, and color changes of the display object.
 The setting information on the display style of a display object may be setting information on the level of detail. For example, a simplified display object may be displayed when a sufficient display area cannot be secured, and a detailed display object may be displayed when a sufficient display area is available. A simplified display object may be, for example, an icon indicating that new information is available.
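For the orientation setting mentioned above, the following is a minimal sketch of computing the rotation that makes a display object's text readable from a user's seat, assuming simple 2-D table coordinates; the coordinate and angle conventions are assumptions for illustration.

```python
import math

def rotation_to_face_user(object_xy: tuple[float, float],
                          user_xy: tuple[float, float]) -> float:
    """Angle in degrees by which to rotate the object so that its 'up' direction
    points away from the user, i.e. the text reads correctly from the user's seat."""
    dx = object_xy[0] - user_xy[0]
    dy = object_xy[1] - user_xy[1]
    return math.degrees(math.atan2(dx, dy))

# The user sits directly "below" the object in table coordinates, so no rotation is needed.
print(round(rotation_to_face_user((0.5, 0.5), (0.5, 0.0)), 1))  # 0.0
```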
 (4) Setting information on notification to external devices
 The setting information on notification to external devices may include information similar to the setting information on the display of display objects. For example, it may include information similar to the setting information on display execution conditions as setting information on the conditions for executing a notification to an external device.
 <3.3.2. Setting of the setting information>
 The information processing system 100 (for example, the setting unit 121) can set the setting information in various ways.
 For example, the information processing system 100 may first set default setting information and then modify it by machine learning based on the subsequent usage history. The default setting information may itself have been learned, for example, from usage history at other restaurants.
 The information processing system 100 may change the setting information in response to input from the restaurant. Specifically, individual items of the setting information (for example, the remaining-amount threshold used in a display execution condition) may be entered by the restaurant. The items of the setting information may also be changed in response to rough input from the restaurant. For example, the restaurant enters, for a food or drink item, a flag such as "normal", "early", or "late" for setting the display timing of the display object. The information processing system 100 then raises or lowers, for example, the remaining-amount threshold in the display execution condition so that the request for earlier or later display timing is satisfied. The flag may be entered individually for each item, or in common for a plurality of items, for example for all beverages or for all dishes served in small portions.
 The information processing system 100 may flexibly change the setting information according to the circumstances of each user or the convenience of each restaurant. In other words, the setting information may be variably set based on at least one of the user and the restaurant that provides the food and drink. Specifically, for a user with a past visit history (for example, a regular customer), the information processing system 100 can change the setting information according to that usage history. The information processing system 100 can also change the setting information according to the restaurant's various circumstances, such as seasonal recommendations, the day's stock, and sales, and may further change it according to, for example, the weather.
 Whether or not each item of the setting information can be changed may be set per item. For example, items that the restaurant cannot accept being changed can be fixed.
 For the changeable items of the setting information, a permissible range of change may be set. For example, for items for which the restaurant has a preferred range, changes can be kept within that range.
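The flag-based input and the permissible-range constraint described above might be combined as in the sketch below; the offset values and the clamping range are assumptions made for illustration.

```python
def adjust_threshold(base_threshold: float, timing_flag: str) -> float:
    """Adjust a remaining-amount threshold from a rough 'early'/'normal'/'late' flag."""
    offsets = {"early": +0.15, "normal": 0.0, "late": -0.15}
    if timing_flag not in offsets:
        raise ValueError(f"unknown timing flag: {timing_flag}")
    adjusted = base_threshold + offsets[timing_flag]
    return min(max(adjusted, 0.0), 1.0)  # keep the value within the permissible range

print(round(adjust_threshold(0.2, "early"), 2))  # 0.35: prompt while more is left
print(round(adjust_threshold(0.2, "late"), 2))   # 0.05: prompt only when nearly empty
```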
 <3.4. Display control>
 The information processing system 100 (for example, the display control unit 125) controls the display of a display object relating to a food or drink item based on the information on that item obtained as a result of sensing and the setting information associated with it. For example, based on the information on the food and drink provided to the user and the setting information, the information processing system 100 controls which display object is displayed, the timing at which its display is started and ended, and/or its display style.
 Specific examples of display control by the information processing system 100 are described below.
 (1) Remaining amount of food and drink
 The information processing system 100 may control the display of the display object according to the amount of the food or drink. For example, the information processing system 100 may make a display object prompting an additional order more conspicuous as the remaining amount decreases. An example is shown in FIG. 7. FIG. 7 is a diagram illustrating an example of display control by the information processing system 100 according to the present embodiment. In the example shown in FIG. 7, the information processing system 100 displays a small menu icon 30A while a large amount of the beer 20 provided to the user 10 remains, displays a large menu icon 30B as the remaining amount decreases, and displays a refill order button 30C when the beer has been drunk up. By prompting an additional order more strongly as the remaining amount decreases in this way, the user is less likely to forget to order, and the restaurant can increase its sales.
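A minimal sketch of the staged behaviour in FIG. 7 follows; the breakpoints at 50% and 0% are assumptions, since the text does not specify the exact amounts at which the displayed object changes.

```python
def reorder_widget(remaining: float) -> str:
    """Choose the display object for a beverage according to its remaining amount."""
    if remaining <= 0.0:
        return "refill order button (30C)"
    if remaining <= 0.5:
        return "large menu icon (30B)"
    return "small menu icon (30A)"

for r in (0.9, 0.3, 0.0):
    print(r, "->", reorder_widget(r))
```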
 (2) Consumption pace
 The information processing system 100 may control the display of the display object according to the pace at which the food or drink is consumed. For example, the information processing system 100 may display a display object prompting an additional order at an earlier timing the faster the consumption pace is, and at a later timing the slower it is. Specifically, when the time from serving to finishing eating or drinking is short, the information processing system 100 displays the display object prompting an additional order a short time after the item is finished; when that time is long, it displays the display object a long time after the item is finished. This makes it possible to receive additional orders at a timing appropriate to the user's consumption pace.
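The sketch below expresses this pacing as a simple function of how long the previous item took to finish; the proportionality factor and the clamping bounds are assumptions for illustration.

```python
def prompt_delay_min(minutes_to_finish: float, factor: float = 0.5,
                     min_delay: float = 1.0, max_delay: float = 15.0) -> float:
    """Delay in minutes, after an item is finished, before prompting an additional order."""
    return min(max(minutes_to_finish * factor, min_delay), max_delay)

print(prompt_delay_min(6.0))   # finished quickly  -> prompt after 3 minutes
print(prompt_delay_min(40.0))  # finished slowly   -> prompt after 15 minutes
```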
 (3) Order in which food and drink are consumed
 The information processing system 100 may control the display of the display object according to the order in which the user consumes the food and drink. For example, the information processing system 100 may estimate the user's preferences based on the order of consumption, in other words on how items are being used up or left, and display a display object prompting an additional order of other items, such as items that match those preferences. The information processing system 100 may also acquire profile information from a database and determine, for example, whether an item is being left because the user dislikes it or is being left deliberately.
 (4) Progress of serving
 The information processing system 100 may control the display of the display object according to the progress of serving. The progress of serving may be, for example, the progress of a course meal (how many of the total number of dishes have been served) or the progress of a plurality of individually ordered items. For example, the information processing system 100 displays the display object earlier in the first half of a course meal and later in the second half. As another example, when a large shared dish is to be served after a predetermined time, the information processing system 100 displays a display object instructing the users to clear the center of the table.
 (5) Correspondence between food and drink and users
 The information processing system 100 may control the display of the display object according to the correspondence between food and drink and users. For example, the information processing system 100 may display a display object indicating this correspondence. Such a display object may indicate to a user which items that user ordered, which prevents the user from losing track of his or her own drink when, for example, changing seats. Such a display object may also indicate to a member of staff which user ordered an item, so that the staff member knows where the item should be placed. An example is shown in FIG. 8. FIG. 8 is a diagram illustrating an example of display control by the information processing system 100 according to the present embodiment. As shown in FIG. 8, beers 20A and 20B are placed on the table 140 at which users 10A and 10B are seated, and a staff member 40 is about to serve dishes 20C and 20D. As shown in FIG. 8, the information processing system 100 displays a display object 30A indicating that the beer 20A belongs to the user 10A and a display object 30B indicating that the beer 20B belongs to the user 10B. Also as shown in FIG. 8, the information processing system 100 displays a display object 30C indicating that the dish 20C was ordered by the user 10A and should be placed in front of the user 10A, and a display object 30D indicating that the dish 20D was ordered by the user 10B and should be placed in front of the user 10B.
 (6) Finishing a food or drink item
 The information processing system 100 may control the display of the display object according to the completion of eating or drinking. For example, the information processing system 100 may display a display object that accepts an evaluation of an item at the timing when the item has been finished. The evaluation may be in a scoring format or a free-form questionnaire format. This enables evaluation per dish, which is finer-grained than the per-restaurant evaluation that has conventionally been common. The information processing system 100 may also, for example, post the entered evaluation to the user's SNS together with a captured image of the item.
 (7) Elapsed time after display
 The information processing system 100 may control the display of the display object according to the elapsed time since the display object was displayed. For example, after starting to display a display object, the information processing system 100 makes it less conspicuous over time, for example by shrinking its display area, and eventually ends the display.
 (8) Degree of congestion of the restaurant
 The information processing system 100 may control the display of the display object according to the degree of congestion of the restaurant. The degree of congestion may be the degree of order congestion or the degree of customer congestion. For example, when the degree of order congestion (for example, the order frequency or the number of orders not yet fulfilled) exceeds a threshold, the information processing system 100 delays the display timing of display objects relating to additional orders or displays them less conspicuously. This suppresses new orders and eases the congestion.
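A minimal sketch of this throttling policy follows, using the number of unfulfilled orders as the congestion measure; the threshold and the additional delay are assumptions for illustration.

```python
def reorder_prompt_policy(pending_orders: int, threshold: int = 10) -> dict:
    """Decide how to show additional-order objects given the current order congestion."""
    congested = pending_orders > threshold
    return {
        "extra_delay_min": 5.0 if congested else 0.0,  # show the object later when congested
        "prominent": not congested,                    # and less conspicuously
    }

print(reorder_prompt_policy(3))   # not congested: no extra delay, prominent display
print(reorder_prompt_policy(14))  # congested: delayed, inconspicuous display
```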
 (9) Role in a group
 The information processing system 100 may control the display of the display object according to the user's role in a group. For example, when the user is acting as the host of the group, the information processing system 100 displays a display object prompting an additional order or a display object containing billing information. This allows the user, for example, to entertain the guests appropriately.
 (10) User actions
 The information processing system 100 may control the display of the display object according to user actions. For example, the information processing system 100 starts or ends the display of a display object when a specific gesture, such as shaking a glass, is detected.
 (11) Voice
 The information processing system 100 may control the display of the display object according to the user's voice. For example, by performing speech recognition on the content of the user's conversation, the information processing system 100 can estimate what the user wants to eat or drink and control the display of the display object according to the estimation result.
 (12) Biological information
 The information processing system 100 may control the display of the display object according to biological information. For example, when the user's body temperature is lower than a threshold, the information processing system 100 displays a display object recommending warm food or drink.
 <3.5. Notification to external devices>
 The information processing system 100 (for example, the notification unit 127) controls the notification of information on the food and drink to external devices based on the information on the food and drink provided to the user obtained as a result of sensing and the setting information associated with it. For example, based on the information on the food and drink provided to the user and the setting information, the information processing system 100 controls the information to be notified, the timing of the notification, and/or the notification format.
 For example, the information processing system 100 may notify an external device of order information when a predetermined amount of food and drink has been ordered at a table. In other words, the information processing system 100 may hold back the notification of order information until a predetermined amount has been ordered for the table. This can improve the work efficiency of the restaurant.
 The information processing system 100 may also issue an alert or stop accepting orders when an order exceeding a threshold set per user or per table is placed. This allows the user to limit, for example, the bill amount or the amount of alcohol consumed.
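The two behaviours above might be combined as in the following sketch, which batches orders per table and raises an alert when a per-table limit is exceeded; the batch size and the amount limit are assumptions for illustration.

```python
def table_notifications(pending_orders: list[str], total_amount: int,
                        batch_size: int = 3, amount_limit: int = 10000) -> list[str]:
    """Notifications to send to external devices for one table."""
    messages = []
    if len(pending_orders) >= batch_size:  # hold orders back until a batch has accumulated
        messages.append(f"send to kitchen: {', '.join(pending_orders)}")
    if total_amount > amount_limit:        # per-table limit set by the user or the restaurant
        messages.append("alert: table limit exceeded, stop accepting orders")
    return messages

print(table_notifications(["beer", "edamame", "pizza"], total_amount=4200))
print(table_notifications(["beer"], total_amount=12000))
```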
 <<4. Hardware configuration example>>
 Finally, the hardware configuration of the information processing apparatus according to the present embodiment is described with reference to FIG. 9. FIG. 9 is a block diagram showing an example of the hardware configuration of the information processing apparatus according to the present embodiment. The information processing apparatus 900 shown in FIG. 9 can realize, for example, the information processing system 100 shown in FIG. 4. Information processing by the information processing system 100 according to the present embodiment is realized by cooperation between software and the hardware described below.
 As shown in FIG. 9, the information processing apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, a RAM (Random Access Memory) 903, and a host bus 904a. The information processing apparatus 900 also includes a bridge 904, an external bus 904b, an interface 905, an input device 906, an output device 907, a storage device 908, a drive 909, a connection port 911, and a communication device 913. The information processing apparatus 900 may include a processing circuit such as an electric circuit, a DSP, or an ASIC instead of or in addition to the CPU 901.
 The CPU 901 functions as an arithmetic processing device and a control device, and controls the overall operation in the information processing apparatus 900 according to various programs. The CPU 901 may also be a microprocessor. The ROM 902 stores programs used by the CPU 901, calculation parameters, and the like. The RAM 903 temporarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during that execution, and the like. The CPU 901 can form, for example, the processing unit 120 shown in FIG. 4.
 The CPU 901, the ROM 902, and the RAM 903 are connected to one another by the host bus 904a, which includes a CPU bus. The host bus 904a is connected via the bridge 904 to the external bus 904b, such as a PCI (Peripheral Component Interconnect/Interface) bus. The host bus 904a, the bridge 904, and the external bus 904b do not necessarily have to be configured separately, and their functions may be implemented on a single bus.
 The input device 906 is realized by a device through which the user inputs information, such as a mouse, a keyboard, a touch panel, buttons, a microphone, switches, and levers. The input device 906 may also be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a PDA that supports operation of the information processing apparatus 900. Furthermore, the input device 906 may include, for example, an input control circuit that generates an input signal based on the information input by the user using the above input means and outputs it to the CPU 901. By operating the input device 906, the user of the information processing apparatus 900 can input various data to the information processing apparatus 900 and instruct it to perform processing operations.
 The input device 906 may also be formed by devices that detect information about the user. For example, the input device 906 may include various sensors such as an image sensor (for example, a camera), a depth sensor (for example, a stereo camera), an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, a sound sensor, a distance sensor, and a force sensor. The input device 906 may also acquire information on the state of the information processing apparatus 900 itself, such as its posture and moving speed, and information on the surrounding environment of the information processing apparatus 900, such as the brightness and noise around it. The input device 906 may also include a GNSS module that receives GNSS signals from GNSS (Global Navigation Satellite System) satellites (for example, GPS signals from GPS (Global Positioning System) satellites) and measures position information including the latitude, longitude, and altitude of the apparatus. As for position information, the input device 906 may detect the position by transmission to and reception from Wi-Fi (registered trademark), a mobile phone, a PHS, a smartphone, or the like, or by short-range communication. The input device 906 can form, for example, the input unit 110 shown in FIG. 4.
 The output device 907 is formed by a device capable of visually or audibly notifying the user of acquired information. Examples of such devices include display devices such as CRT display devices, liquid crystal display devices, plasma display devices, EL display devices, laser projectors, LED projectors, and lamps, audio output devices such as speakers and headphones, and printer devices. The output device 907 outputs, for example, results obtained by various kinds of processing performed by the information processing apparatus 900. Specifically, the display device visually displays the results obtained by the various kinds of processing performed by the information processing apparatus 900 in various formats such as text, images, tables, and graphs. The audio output device converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs it audibly. The output device 907 can form, for example, the output unit 130 shown in FIG. 4.
 The storage device 908 is a data storage device formed as an example of the storage unit of the information processing apparatus 900. The storage device 908 is realized by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, or a magneto-optical storage device. The storage device 908 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like. The storage device 908 stores programs executed by the CPU 901, various data, various data acquired from outside, and the like. The storage device 908 can form, for example, the storage unit 150 shown in FIG. 4.
 The drive 909 is a reader/writer for storage media and is built into or externally attached to the information processing apparatus 900. The drive 909 reads information recorded on a mounted removable storage medium such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs it to the RAM 903. The drive 909 can also write information to the removable storage medium.
 The connection port 911 is an interface connected to external devices, and is a connection port to external devices capable of data transmission by, for example, USB (Universal Serial Bus).
 The communication device 913 is a communication interface formed by, for example, a communication device for connecting to the network 920. The communication device 913 is, for example, a communication card for wired or wireless LAN (Local Area Network), LTE (Long Term Evolution), Bluetooth (registered trademark), or WUSB (Wireless USB). The communication device 913 may also be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various kinds of communication, or the like. The communication device 913 can transmit and receive signals and the like to and from, for example, the Internet and other communication devices in accordance with a predetermined protocol such as TCP/IP. The communication device 913 can form, for example, the communication unit 160 shown in FIG. 4.
 The network 920 is a wired or wireless transmission path for information transmitted from devices connected to the network 920. For example, the network 920 may include public networks such as the Internet, a telephone network, and a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), and WANs (Wide Area Networks). The network 920 may also include a dedicated network such as an IP-VPN (Internet Protocol-Virtual Private Network).
 An example of a hardware configuration capable of realizing the functions of the information processing apparatus 900 according to the present embodiment has been shown above. Each of the above components may be realized using general-purpose members, or may be realized by hardware specialized for the function of each component. Accordingly, the hardware configuration to be used can be changed as appropriate according to the technical level at the time the present embodiment is carried out.
 A computer program for realizing each function of the information processing apparatus 900 according to the present embodiment as described above can be created and implemented on a PC or the like. A computer-readable recording medium storing such a computer program can also be provided. The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, or a flash memory. The computer program may also be distributed, for example, via a network without using a recording medium.
 <<5. Summary>>
 An embodiment of the present disclosure has been described in detail above with reference to FIGS. 1 to 9. As described above, the information processing system 100 according to the present embodiment controls the display of a display object relating to food and drink provided to a user based on the information on that food and drink obtained as a result of sensing and the setting information associated with it. This makes it possible for the information processing system 100 to provide a fine-grained service, set for each food and drink item, in real time according to the state of the meal.
 以上、添付図面を参照しながら本開示の好適な実施形態について詳細に説明したが、本開示の技術的範囲はかかる例に限定されない。本開示の技術分野における通常の知識を有する者であれば、請求の範囲に記載された技術的思想の範疇内において、各種の変更例または修正例に想到し得ることは明らかであり、これらについても、当然に本開示の技術的範囲に属するものと了解される。 The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to such examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can come up with various changes or modifications within the scope of the technical idea described in the claims. Of course, it is understood that it belongs to the technical scope of the present disclosure.
 For example, in the above embodiment, an example in which the information processing system 100 is applied in a restaurant has been described, but the present technology is not limited to such an example. For example, the information processing system 100 can be applied in any place where food and drink can be served, such as an ordinary home or the dining hall of a training lodge.
 Note that the devices described in this specification may be realized as a single device, or some or all of them may be realized as separate devices. For example, in the functional configuration example of the information processing system 100 shown in FIG. 4, the processing unit 120 and the storage unit 150 may be provided in a device such as a server that is connected to the input unit 110, the output unit 130, and the communication unit 160 via a network or the like. When the processing unit 120 and the storage unit 150 are provided in a device such as a server, information obtained by the input unit 110 or the communication unit 160 is transmitted to the server or other device via the network, the processing unit 120 associates and processes the drawing information set, and the information to be output by the output unit 130 is sent from the server or other device to the output unit 130 via the network.
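 Purely as a hedged illustration of this client-server arrangement, and not as the actual implementation, the sketch below separates a client-side step that packages information obtained by the input unit from a server-side step in which the processing unit combines that information with stored setting information and returns display information for the output unit. The JSON message layout, function names, and reorder threshold are hypothetical.

```python
import json

def package_sensing_result(item_id: str, remaining_ratio: float) -> str:
    """Client side: package information obtained by the input unit for transmission
    (the JSON layout is a hypothetical example)."""
    return json.dumps({"item_id": item_id, "remaining_ratio": remaining_ratio})

def process_on_server(request_json: str, settings_store: dict) -> str:
    """Server side: combine the sensed state with stored setting information and
    return display information for the output unit."""
    request = json.loads(request_json)
    settings = settings_store.get(request["item_id"], {})
    show_prompt = request["remaining_ratio"] <= settings.get("reorder_threshold", 0.2)
    return json.dumps({"item_id": request["item_id"], "show_reorder_prompt": show_prompt})

# Example round trip; the network transport itself is omitted so the sketch stays self-contained.
settings_store = {"draft_beer": {"reorder_threshold": 0.3}}
request = package_sensing_result("draft_beer", 0.25)
response = process_on_server(request, settings_store)
print(response)  # {"item_id": "draft_beer", "show_reorder_prompt": true}
```

 In an actual deployment the two functions would sit on opposite sides of a network transport, with the drawing information returned by the server forwarded to the output unit as described above.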
 In addition, the processes described using the flowcharts in this specification do not necessarily have to be executed in the order shown. Some processing steps may be executed in parallel. Additional processing steps may be employed, and some processing steps may be omitted.
 The effects described in this specification are merely explanatory or illustrative, and are not limiting. That is, the technology according to the present disclosure may exhibit other effects that are apparent to those skilled in the art from the description of this specification, in addition to or instead of the above effects.
 Note that the following configurations also belong to the technical scope of the present disclosure.
(1)
 An information processing apparatus comprising:
 a control unit that controls display of a display object relating to a food or beverage, based on information relating to the food or beverage obtained as a result of sensing and on setting information associated with the food or beverage.
(2)
 The information processing apparatus according to (1), wherein the setting information relates to a display execution condition for the display object.
(3)
 The information processing apparatus according to (2), wherein the display execution condition is based on at least one of the remaining amount of the food or beverage, contact between the food or beverage and a predetermined real object, or a relationship between the food or beverage and another food or beverage.
(4)
 The information processing apparatus according to (2) or (3), wherein the display execution condition is based on whether the food or beverage is provided as a single item or as part of a course meal.
(5)
 The information processing apparatus according to any one of (2) to (4), wherein the display execution condition is based on at least one of a time of day, an elapsed time since the food or beverage was provided, or an elapsed time since the user to whom the food or beverage was provided started the meal.
(6)
 The information processing apparatus according to any one of (1) to (5), wherein the setting information relates to the content of the display object.
(7)
 The information processing apparatus according to (6), wherein the content is at least one of information relating to an additional order, information relating to an elapsed time since the food or beverage was provided, information relating to a temperature of the food or beverage, information relating to an association between the food or beverage and the user to whom the food or beverage was provided, or information describing the food or beverage.
(8)
 The information processing apparatus according to any one of (1) to (7), wherein the setting information relates to a display style of the display object.
(9)
 The information processing apparatus according to (8), wherein the display style is at least one of a display position, a display orientation, an animation, or a level of detail of the display object.
(10)
 The information processing apparatus according to any one of (1) to (9), wherein the setting information is variably set based on at least one of the user to whom the food or beverage is provided or the restaurant that provides the food or beverage.
(11)
 The information processing apparatus according to any one of (1) to (10), wherein the control unit makes the display object prompting an additional order more prominent as the remaining amount of the food or beverage decreases.
(12)
 The information processing apparatus according to any one of (1) to (11), wherein the control unit displays the display object prompting an additional order at an earlier timing when the consumption pace of the food or beverage is faster, and at a later timing when it is slower.
(13)
 The information processing apparatus according to any one of (1) to (12), wherein the control unit displays the display object prompting an additional order of another food or beverage according to the order in which the user to whom the food or beverage was provided eats or drinks the food or beverage.
(14)
 The information processing apparatus according to any one of (1) to (13), wherein the control unit displays the display object indicating the correspondence between the food or beverage and the user to whom the food or beverage was provided.
(15)
 The information processing apparatus according to any one of (1) to (14), wherein the control unit displays the display object that receives an evaluation of the food or beverage at the timing when the eating or drinking of the food or beverage is finished.
(16)
 The information processing apparatus according to any one of (1) to (15), wherein the control unit controls display of the display object according to the degree of congestion of the restaurant that provides the food or beverage.
(17)
 The information processing apparatus according to any one of (1) to (16), wherein the control unit displays the display object according to the role of the user to whom the food or beverage is provided.
(18)
 The information processing apparatus according to any one of (1) to (17), wherein the control unit displays the display object, by a display device installed in the restaurant that provides the food or beverage, on a table on which the food or beverage is served.
(19)
 An information processing method including:
 controlling, by a processor, display of a display object relating to a food or beverage, based on information relating to the food or beverage obtained as a result of sensing and on setting information associated with the food or beverage.
(20)
 A storage medium storing a program for causing a computer to function as:
 a control unit that controls display of a display object relating to a food or beverage, based on information relating to the food or beverage obtained as a result of sensing and on setting information associated with the food or beverage.
DESCRIPTION OF SYMBOLS
 10  User
 20  Food and drink
 30  Display object
 100  Information processing system
 110  Input unit
 120  Processing unit
 121  Setting unit
 123  Acquisition unit
 125  Display control unit
 127  Notification unit
 130  Output unit
 140  Table
 150  Storage unit
 160  Communication unit

Claims (20)

  1.  An information processing apparatus comprising:
    a control unit that controls display of a display object relating to a food or beverage, based on information relating to the food or beverage obtained as a result of sensing and on setting information associated with the food or beverage.
  2.  The information processing apparatus according to claim 1, wherein the setting information relates to a display execution condition for the display object.
  3.  The information processing apparatus according to claim 2, wherein the display execution condition is based on at least one of the remaining amount of the food or beverage, contact between the food or beverage and a predetermined real object, or a relationship between the food or beverage and another food or beverage.
  4.  The information processing apparatus according to claim 2, wherein the display execution condition is based on whether the food or beverage is provided as a single item or as part of a course meal.
  5.  The information processing apparatus according to claim 2, wherein the display execution condition is based on at least one of a time of day, an elapsed time since the food or beverage was provided, or an elapsed time since the user to whom the food or beverage was provided started the meal.
  6.  The information processing apparatus according to claim 1, wherein the setting information relates to the content of the display object.
  7.  The information processing apparatus according to claim 6, wherein the content is at least one of information relating to an additional order, information relating to an elapsed time since the food or beverage was provided, information relating to a temperature of the food or beverage, information relating to an association between the food or beverage and the user to whom the food or beverage was provided, or information describing the food or beverage.
  8.  The information processing apparatus according to claim 1, wherein the setting information relates to a display style of the display object.
  9.  The information processing apparatus according to claim 8, wherein the display style is at least one of a display position, a display orientation, an animation, or a level of detail of the display object.
  10.  The information processing apparatus according to claim 1, wherein the setting information is variably set based on at least one of the user to whom the food or beverage is provided or the restaurant that provides the food or beverage.
  11.  The information processing apparatus according to claim 1, wherein the control unit makes the display object prompting an additional order more prominent as the remaining amount of the food or beverage decreases.
  12.  The information processing apparatus according to claim 1, wherein the control unit displays the display object prompting an additional order at an earlier timing when the consumption pace of the food or beverage is faster, and at a later timing when it is slower.
  13.  The information processing apparatus according to claim 1, wherein the control unit displays the display object prompting an additional order of another food or beverage according to the order in which the user to whom the food or beverage was provided eats or drinks the food or beverage.
  14.  The information processing apparatus according to claim 1, wherein the control unit displays the display object indicating the correspondence between the food or beverage and the user to whom the food or beverage was provided.
  15.  The information processing apparatus according to claim 1, wherein the control unit displays the display object that receives an evaluation of the food or beverage at the timing when the eating or drinking of the food or beverage is finished.
  16.  The information processing apparatus according to claim 1, wherein the control unit controls display of the display object according to the degree of congestion of the restaurant that provides the food or beverage.
  17.  The information processing apparatus according to claim 1, wherein the control unit displays the display object according to the role of the user to whom the food or beverage is provided.
  18.  The information processing apparatus according to claim 1, wherein the control unit displays the display object, by a display device installed in the restaurant that provides the food or beverage, on a table on which the food or beverage is served.
  19.  An information processing method including:
    controlling, by a processor, display of a display object relating to a food or beverage, based on information relating to the food or beverage obtained as a result of sensing and on setting information associated with the food or beverage.
  20.  A storage medium storing a program for causing a computer to function as:
    a control unit that controls display of a display object relating to a food or beverage, based on information relating to the food or beverage obtained as a result of sensing and on setting information associated with the food or beverage.
PCT/JP2017/047343 2017-02-15 2017-12-28 Information processing device, information processing method, and storage medium WO2018150756A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-025770 2017-02-15
JP2017025770 2017-02-15

Publications (1)

Publication Number Publication Date
WO2018150756A1 true WO2018150756A1 (en) 2018-08-23

Family

ID=63169800

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/047343 WO2018150756A1 (en) 2017-02-15 2017-12-28 Information processing device, information processing method, and storage medium

Country Status (1)

Country Link
WO (1) WO2018150756A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002117186A (en) * 2000-10-10 2002-04-19 Soft Service:Kk Questionnaire managing device and questionnaire method
JP2012108282A (en) * 2010-11-17 2012-06-07 Nikon Corp Electronic apparatus
WO2015098188A1 (en) * 2013-12-27 2015-07-02 ソニー株式会社 Display control device, display control method, and program
JP2016194762A (en) * 2015-03-31 2016-11-17 ソニー株式会社 Information processing system, information processing method, and program

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020116290A1 (en) * 2018-12-06 2020-06-11 株式会社アーティフィス Table projection device
JPWO2020116290A1 (en) * 2018-12-06 2021-02-15 株式会社アーティフィス Table projection device
CN113170071A (en) * 2018-12-06 2021-07-23 株式会社阿提菲斯 Desktop projection device
JP7193790B2 (en) 2018-12-06 2022-12-21 株式会社アーティフィス table projection device
JP7346900B2 (en) 2019-05-10 2023-09-20 東京電力ホールディングス株式会社 Food and beverage service management system, food and beverage service management method and program
JPWO2021095768A1 (en) * 2019-11-15 2021-05-20
WO2021095768A1 (en) * 2019-11-15 2021-05-20 株式会社Nttドコモ Information processing device
JP7344307B2 (en) 2019-11-15 2023-09-13 株式会社Nttドコモ information processing equipment
WO2024019099A1 (en) * 2022-07-19 2024-01-25 ダイキン工業株式会社 Thermal index estimation device and thermal environment control device
JP7473721B2 (en) 2022-07-19 2024-04-23 ダイキン工業株式会社 Thermal index estimation device and thermal environment control device

Similar Documents

Publication Publication Date Title
WO2018150756A1 (en) Information processing device, information processing method, and storage medium
JP6586758B2 (en) Information processing system, information processing method, and program
JP6777201B2 (en) Information processing equipment, information processing methods and programs
KR102445720B1 (en) Device location based on machine learning classifications
US20220215258A1 (en) Devices, Systems, and Methods that Observe and Classify Real-World Activity Relating to an Observed Object, and Track and Disseminate State Relating the Observed Object
US10355947B2 (en) Information providing method
US20170046800A1 (en) Systems and Methods of Automatically Estimating Restaurant Wait Times Using Wearable Devices
US10832041B2 (en) Information processing apparatus, information processing method, and program for detecting a target motion
KR20160128017A (en) Electronic apparatus, server and control method thereof
EP2435992A1 (en) Map guidance for the staff of a service-oriented business
CN105286423A (en) System and method for identifying identity of intelligent cups
CN205083178U (en) Novel intelligence cup device
CN108920125B (en) It is a kind of for determining the method and apparatus of speech recognition result
JP6572629B2 (en) Information processing apparatus, information processing method, and program
JP2016015071A (en) Information output device, order system, order presentation method, and program
JP2014123214A (en) Electronic apparatus
JP6590005B2 (en) Electronic device and program
JP6277582B2 (en) Electronics
US11307877B2 (en) Information processing apparatus and information processing method
CN104347017B (en) The system and method that a kind of advertisement is played
JP2014123215A (en) Electronic apparatus
WO2019171866A1 (en) Information processing device, method for determining dish provision timing, and program
KR102515140B1 (en) Digital tissue box that can be linked to calendar for each user account
US11678442B2 (en) Voice-activated electronic device assembly with separable base
EP3519916B1 (en) Planar electrical connector for an electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17896793

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17896793

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP