EP3308224A1 - Device for assisting a user in a household - Google Patents
- Publication number
- EP3308224A1 (application EP16728843A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- user
- unit
- interaction unit
- control unit
- interaction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L12/2816—Controlling appliance services of a home automation network by calling their functionalities
- H04L12/282—Controlling appliance services of a home automation network by calling their functionalities based on user interaction within the home
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F24—HEATING; RANGES; VENTILATING
- F24C—DOMESTIC STOVES OR RANGES ; DETAILS OF DOMESTIC STOVES OR RANGES, OF GENERAL APPLICATION
- F24C7/00—Stoves or ranges heated by electric energy
- F24C7/08—Arrangement or mounting of control or safety devices
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B15/00—Systems controlled by a computer
- G05B15/02—Systems controlled by a computer electric
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
- G09B19/0092—Nutrition
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B5/00—Electrically-operated educational appliances
- G09B5/06—Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
- G09B5/067—Combinations of audio and projected visual presentation, e.g. film, slides
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/26—Speech to text systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L12/2823—Reporting information sensed by appliance or service execution status of appliance services in a home automation network
- H04L12/2827—Reporting to a device within the home network; wherein the reception of the information reported automatically triggers the execution of a home appliance functionality
- H04L12/2829—Reporting to a device within the home network; wherein the reception of the information reported automatically triggers the execution of a home appliance functionality involving user profiles according to which the execution of a home appliance functionality is automatically triggered
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/3173—Constructional details thereof wherein the projection device is specially adapted for enhanced portability
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/26—Pc applications
- G05B2219/2642—Domotique, domestic, home control, automation, smart house
Definitions
- the invention relates to a device for supporting a user in a household, in particular for controlling and monitoring household appliances.
- Households typically have a variety of home appliances, e.g. a refrigerator, an oven, a stove, etc.
- the home appliances can be used, for example, to temper (i.e., heat or cool) foods and to prepare meals from the foods.
- there are a variety of different tasks such as the receipt and management of a supply of food, the selection of recipes for the preparation of meals, the preparation of meals, etc.
- the present document addresses the technical problem of providing a device that efficiently supports a person in a household in performing the multitude of tasks in a household.
- an apparatus for assisting a user in a household is described.
- the device is also referred to in this document as a Personal Kitchen Assistant or PKA for short.
- the device comprises a base with which the device can be placed on a standing surface (eg on a worktop in a kitchen).
- the base can be immovable in relation to the standing surface in the erected state of the device.
- by means of the base, a user can place the device on a standing surface so that the device stands securely and stably (even if parts of the device, such as the interaction units mentioned below, move).
- the device comprises a first interaction unit, which has an optical sensor (eg, an image or video camera), which is set up to capture image data from a detection area of an environment of the device.
- the detection area typically covers a specific, limited horizontal angle range of the environment of the device. That is, the optical sensor typically cannot capture the entire 360° horizontal angle range of the surroundings of the device at once.
- Typical detection areas have a horizontal angle range of 120 °, 90 ° or less.
- the first interaction unit may be moved relative to the base (eg by means of a first actuator, such as by means of a first electric motor) to change the detection area (in particular in the horizontal direction).
- the apparatus comprises a second interaction unit comprising a projector (e.g., a pico projector) configured to project an image onto a projection surface in the environment of the apparatus.
- the projection surface is typically limited to a particular horizontal angle range (e.g., 60 ° or less).
- the second interaction unit may be moved separately from the first interaction unit (e.g., by means of a second actuator, such as by a second electric motor) to alter the projection surface of the projector.
- the device further comprises a control unit, which e.g. includes a processor and control software.
- the control unit is set up to determine a position of a user of the device in the vicinity of the device. In particular, the position relative to the device can be determined.
- the control unit is configured to cause the first interaction unit and the second interaction unit to each be moved depending on the position of the user.
- the control unit is arranged to detect an input of the user (for example, on the basis of the image data of the optical sensor) and, in response to the input, to cause the projector to project an image onto the projection surface.
- the device enables effective support of a user in the household.
- the provision of (at least) two separate interaction units, which can be moved independently of each other, allows a user to effectively make inputs (e.g., instructions) via a first interaction unit facing the user and to receive corresponding outputs (e.g., via a second interaction unit facing away from the user).
- the control unit may be configured to cause the first interaction unit to be moved in such a way that the user is at least partially in the detection range of the optical sensor.
- the first interaction unit can thus be moved towards the user.
- effective inputs are enabled by the user (e.g., by evaluating the image data).
- the second interaction unit can be moved in such a way that both the projection surface and the device are in the field of vision of the user (based on the current position of the user).
- the second interaction unit (in particular the projector of the second interaction unit) can thus be moved away from the user.
- it can thus be ensured that the user can view the projected output from his current position, while inputs to the device remain possible.
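The opposed alignment of the two interaction units described above can be sketched as follows. This is a minimal illustration: the function name, the representation of positions as device-relative bearings in degrees, and the selection heuristic are assumptions made for illustration, not taken from the patent.

```python
def unit_targets(user_bearing_deg: float,
                 wall_bearings_deg: list[float]) -> tuple[float, float]:
    """Given the user's bearing and the bearings of candidate flat
    projection surfaces (degrees, device-relative), return target
    orientations for the two interaction units."""
    def angular_dist(a: float, b: float) -> float:
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)

    # First unit: face the user directly, so the user is in the
    # detection range of the optical sensor.
    first = user_bearing_deg % 360.0
    # Second unit: prefer the surface angularly farthest from the user,
    # i.e. roughly behind the device as seen from the user's position.
    second = max(wall_bearings_deg,
                 key=lambda w: angular_dist(w, user_bearing_deg)) % 360.0
    return first, second
```

With this heuristic, both the device and the projection surface tend to lie in the user's field of vision when the user looks toward the device.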
- the apparatus may include a first actuator (e.g., a first motor) configured to move the first interaction unit about a first rotation axis in response to a first control signal of the control unit, in order to enable different detection ranges in a horizontal angle range of 360° around the apparatus.
- the device may comprise a second actuator (e.g., a second motor) arranged to move the second interaction unit about a second rotation axis in response to a second control signal of the control unit, in order to enable different projection surfaces in a horizontal angle range of 360° around the device.
- the first and the second axis of rotation may possibly be identical.
- the rotation of the interaction units enables flexible alignment of the device with respect to the position of the user.
- the device may include a plurality of acoustic sensors (e.g., as part of the first interaction unit), each configured to acquire acoustic data relating to acoustic signals in the environment of the device.
- An acoustic sensor may include a microphone.
- the acoustic sensors can be arranged at different locations of the device. It can thus be achieved that acoustic signals caused by the user (e.g., voice instructions of the user) have different transit times to the different acoustic sensors.
- the control unit can be set up to detect the presence of the user in the surroundings of the device on the basis of the acoustic data. For example, it can be detected that the user has given a voice instruction to the device. Furthermore, the control unit may be configured to determine the position of the user on the basis of the acoustic data of the plurality of acoustic sensors. In particular, the durations of acoustic signals can be evaluated for this purpose.
- the use of acoustic sensors thus allows the determination of the position of the user. The position can be determined independently of a current orientation of the first interaction unit. Furthermore, the use of at least one acoustic sensor enables comfortable interaction with a user via human speech.
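The evaluation of transit-time differences mentioned above can be illustrated with a minimal sketch. It assumes a far-field sound source and a single microphone pair; the speed-of-sound constant, the microphone spacing, and the function name are illustrative assumptions, not taken from the patent.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def bearing_from_tdoa(delta_t: float, mic_spacing: float) -> float:
    """Estimate the bearing of a sound source (degrees, relative to the
    axis perpendicular to the microphone pair) from the arrival-time
    difference delta_t (seconds) between two microphones spaced
    mic_spacing metres apart (far-field approximation)."""
    # Path-length difference corresponding to the time difference.
    path_diff = SPEED_OF_SOUND * delta_t
    # Clamp to the physically possible range before taking the arcsine.
    ratio = max(-1.0, min(1.0, path_diff / mic_spacing))
    return math.degrees(math.asin(ratio))

# Example: sound arrives 0.2 ms earlier at one microphone of a pair
# spaced 15 cm apart, yielding a coarse bearing of about 27 degrees.
angle = bearing_from_tdoa(0.0002, 0.15)
```

A coarse estimate like this would correspond to the "first position" described below, which is then refined using the image data.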
- the control unit may be configured to determine a first position of the user on the basis of the acoustic data of the plurality of acoustic sensors.
- the first position may correspond to a relatively rough estimate of the actual position of the user.
- the control unit can then cause the first interaction unit to be moved in dependence on the first position of the user, so that the user is at least partially located in the detection range of the optical sensor.
- a second position of the user can then be determined on the basis of the image data. On the basis of the image data, typically the position of the user can be determined with increased precision. The second position thus typically provides a more accurate estimate of the user's actual position than the first position.
- the control unit may then cause the first interaction unit and the second interaction unit to be moved in response to the second position of the user.
- the device may include a storage unit configured to store profile data relating to one or more predefined users.
- the profile data may comprise characteristics (e.g., a language profile and/or a visual appearance profile) of the one or more predefined users, the characteristics enabling an identification of the users.
- the control unit can be set up to determine on the basis of the profile data and on the basis of the acoustic data and / or the image data whether the user corresponds to a predefined user.
- the device can effectively identify a user and be customized for that user.
- the profile data may optionally include further information related to the user, such as information regarding preferences, habits, etc. of the user.
- the functionality of a unique identification of a user can optionally be provided as an option that can be deactivated by a user (eg for privacy reasons).
- for data protection reasons, profile data for identifying a user may possibly be stored only locally on the storage unit of the device.
- the control unit can be set up to transfer the device from a sleep mode to an active mode in dependence on the acoustic data.
- thus, a convenient activation of the device via acoustic signals can take place.
- the control unit can be set up to determine the input of the user on the basis of the acoustic data by means of intuitive voice control, e.g. based on natural language processing.
- the projector typically has a fixed projection direction relative to the second interaction unit.
- the second interaction unit can comprise distance sensors which are set up to detect distance data which indicates a distance of the respective distance sensor in the direction of projection to a surface in the vicinity of the device.
- the distance sensors are arranged at different locations of the second interaction unit.
- the control unit may be configured to cause the second interaction unit to also be moved in response to the distance data.
- on the basis of the distance data, a flat surface in the vicinity of the device (e.g., a wall in a room) can be identified.
- This flat surface can then possibly be used (taking into account the position of the user) as a projection surface for the projector.
- the first interaction unit may include an input / output unit configured to detect a touch input of the user and / or to generate an optical output to the user via a screen.
- the input / output unit may in particular have a touch-sensitive screen.
- the device, in particular the first interaction unit, may comprise an acoustic actuator (e.g., a loudspeaker) which is set up to generate an acoustic output (e.g., speech).
- the device may comprise a communication unit which is set up to communicate via a communication link with a domestic appliance (in particular a household appliance, such as an oven, a stove, a refrigerator, etc.) and/or with a server (e.g., an Internet server and/or a server outside the household).
- the communication connection may include a wireless and / or a wired communication connection (eg LAN, WLAN, Bluetooth, UMTS, LTE, etc.).
- the control unit may be configured to receive information from the home appliance (eg, a state of the home appliance) and / or from the server (eg, a recipe) in response to the user's input. Furthermore, the control unit may be configured to cause the projector to display the information in the projected image.
- the provision of a communication unit enables effective interaction (in particular control and / or monitoring) of household appliances.
- the control unit may be configured to obtain instructions for producing a food (in particular a recipe) from a server (eg from an Internet server). The control unit may then control the home appliance depending on the instructions and in response to an input by the user. Thus, the production of a food as a task in a household for the user can be facilitated.
- the control unit may be configured to determine, on the basis of the sensor data with respect to the user, a process progress in the production of the food.
- the control unit may be further configured to determine profile data relating to the user.
- the profile data may be stored on a storage unit of the device. Depending on the profile data and on an input of the user, a shopping list can then be created. If necessary, this shopping list can be sent via the communication unit to a remote electronic device (e.g., a smartphone).
- the control unit may be configured to generate profile data for the user on the basis of a multiplicity of inputs of the user and to store it on the storage unit of the device.
- the profile data may, e.g., indicate characteristics for identifying the user, preferences of the user and/or habits of the user.
- the device can be efficiently adapted to one or more users.
- Figure 1 shows an exemplary personal assistant for a household.
- Figure 2 shows exemplary communication partners of a personal assistant.
- Fig. 1 shows a device 100, which can be used in particular for the control of household appliances in a household.
- the device 100 is also referred to in this document as Personal Kitchen Assistant, or PKA for short.
- the device 100 is typically the size of a (relatively small) food processor and can be placed on the counter of a kitchen, for example.
- the PKA 100 comprises a base 130 and at least two interaction units 110, 120, which are arranged movably on the base 130.
- the two interaction units 110, 120 can move independently of one another on the base 130.
- the PKA 100 includes a first interaction unit 110 that can be rotated about an axis of rotation that is perpendicular to the base 130.
- the PKA 100 shown in FIG. 1 comprises a second interaction unit 120, which can likewise be rotated about the axis of rotation (independently of the first interaction unit 110).
- the movement of the interaction units 110, 120 can each be done by dedicated actuators (e.g., motors) (not shown in Figure 1).
- the first interaction unit 110 includes one or more interaction modules 111, 112, 113, 114 for interaction with a user of the PKA 100, wherein the one or more interaction modules 111, 112, 113, 114 of the first interaction unit 110 should face the user for interaction with the user.
- the first interaction unit 110 may include a screen 111 (for example, a touch-sensitive screen) for outputting information and optionally for inputting instructions.
- the first interaction unit 110 may include a camera 112 configured to capture image data, e.g. from a detection area of the environment of the PKA 100.
- the first interaction unit 110 may include a speaker 113 for acoustic output (e.g., for the output of speech and/or sounds).
- the first interaction unit 110 may include one or more microphones 114 to capture acoustic data from the environment of the PKA 100 (e.g., spoken instructions of the user).
- the second interaction unit 120 may include one or more interaction modules 121 that should be facing away from the user for interaction with the user of the PKA 100.
- the second interaction unit 120 may include a projector 121 that is configured to project an image onto a projection area in the vicinity of the PKA 100. The image can be projected in such a way that it can be seen by the user from a current position of the user.
- the second interaction unit 120 may be suitably moved (in particular rotated) to project the image to a suitable projection surface.
- the second interaction unit 120 can also comprise one or more distance sensors 122, which are set up to determine the distance to a surface in the vicinity of the PKA 100 (e.g., to a wall in a room) for determining a suitable projection surface.
- the PKA 100 further includes a control unit 131 that is configured to control the movement of the first and second interaction units 110, 120 and to control one or more functions of the PKA 100. Furthermore, the PKA 100 includes a communication unit 132 that is configured to communicate with other electronic devices via a communication network. This is exemplified in Fig. 2.
- the communication unit 132 may be configured for this purpose to set up wired (such as LAN) and / or wireless (such as WLAN, Bluetooth, UMTS, LTE, etc.) communication links.
- the control unit 131 and / or the communication unit 132 can, as shown in FIG. 1, be arranged in the base 130 of the PKA 100.
- the control unit 131 may be configured to detect a user of the PKA 100.
- the control unit 131 may be configured to determine a position of the user relative to the position of the PKA 100.
- a user can be detected on the basis of the acoustic data and / or on the basis of the image data.
- it can be recognized on the basis of the acoustic data that a user addresses the PKA 100.
- a position of the user can be determined (at least roughly) on the basis of travel time shifts of the individual acoustic signals.
- the first interaction unit 110 can then be caused by the control unit 131 to be moved such that the camera 112 points in the direction of the determined position of the user.
- the position of the user can be determined in a precise manner.
- the first interaction unit 110 can be moved further in order to ensure that the screen 111 of the first interaction unit 110 faces the user as precisely as possible.
- the user is thus enabled to efficiently view outputs on the screen 111 and/or to make inputs via the screen 111.
- the camera 112 may also face the user in order to enable reliable input via gestures or facial expressions of the user.
- the second interaction unit 120 can be moved such that the projector 121 of the second interaction unit 120 can project an image onto a projection surface, which can be viewed by the user from his current position. In the projected image, for example, information about the state of one or more domestic appliances 201 and / or about procedural steps of a recipe for a meal to be prepared can be displayed.
- the PKA 100 can be set up to capture instructions of the user (eg by input via the screen 1 1 1, by inputting speech and / or by gestures or facial expressions). Further, the PKA 100 may be configured to perform actions in response to the instructions. In particular, depending on the instructions, one or more domestic appliances 201 of the household may be controlled. For this purpose, suitable control signals can be transmitted via the communication unit 132 to the one or more home appliances 201. In the following, exemplary functions of the PKA 100 are shown. These functions may optionally be provided individually by the PKA 100, respectively. The PKA 100 may enable bi-directional communication between PKA 100 and one or more home appliances 201 or other electronic devices 203 via the communication unit 132.
- state information regarding the state of a domestic appliance 201 or electronic device 203 can be transmitted to the PKA 100.
- Bi-directional communication between the PKA 100 and one or more users may take place by projection (via the projector 121) and/or by voice (by means of the one or more microphones 114).
- the presence detection of a user and the user identification can be done by facial and / or speech recognition on the basis of the image data and / or on the basis of the acoustic data.
- an intuitive voice control, in particular by means of NLP (Natural Language Processing), can be used.
- the PKA 100 may include a memory unit 133 in which profiles for one or more users of the PKA 100 may be stored.
- preferences and/or habits of a user can be stored.
- preferred foods may be stored that may be taken into account when creating a shopping list (for example, after actual inventory of the contents of a refrigerator).
- the PKA 100 may include a battery and / or an accumulator configured to store electrical energy for operation of the PKA 100.
- the PKA 100 can thus be mobile and portable.
- the PKA 100 can be controlled by a user by voice, gestures and/or facial expressions (by face detection). Further, the PKA 100 may be configured to determine a state of mind of the user on the basis of the image data and/or the acoustic data (such as "satisfied", "dissatisfied", "approving", "disapproving"). The operation of the PKA 100 may then be adapted to the determined state of mind (e.g., the colors used for projection may be adjusted). This can improve the interaction with a user.
- the PKA 100 may be configured to project content onto a surface by means of the projector 121 (eg by means of a pico projector).
- the projected contents can be requested beforehand by the user (eg by voice).
- the content may be determined at the user's request (possibly in dependence on a current context) and then projected.
- the results of a search query may be determined and projected by the PKA 100.
- the user may, upon instruction, have a shopping list created by the PKA 100. For this purpose, a standard shopping list in the storage unit 133 can be accessed if necessary.
- the content of a refrigerator 201 can be determined.
- a shopping list can then be determined (e.g., by subtracting the contents of the refrigerator from the standard shopping list) and output via the projector 121.
- This list can be adjusted depending on inputs (eg gestures).
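The subtraction of the refrigerator contents from a standard shopping list, as described above, could look like the following sketch; the item names and the quantity bookkeeping are illustrative assumptions.

```python
def build_shopping_list(standard_list: dict[str, int],
                        fridge_contents: dict[str, int]) -> dict[str, int]:
    """Subtract the detected refrigerator contents from a stored
    standard shopping list: only items (and quantities) not already
    in stock end up on the shopping list."""
    shopping = {}
    for item, wanted in standard_list.items():
        missing = wanted - fridge_contents.get(item, 0)
        if missing > 0:
            shopping[item] = missing
    return shopping
```

The resulting list could then be projected and adjusted interactively, e.g. by gestures, as the text describes.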
- current prices for the items on the shopping list can be determined (e.g., from different suppliers), and a suitable supplier can then be selected. If necessary, the shopping list can be transmitted from the PKA 100 to the personal device 203 of another person, with the request to buy the listed items at the selected supplier.
- the PKA 100 may be configured to assist a user in preparing a food (e.g., a pastry or a meal).
- in the following, further functions of the PKA 100 are described, which can also be provided by the PKA 100 independently of this application example.
- the PKA 100 may provide a wake-up function, i.e., a transition from a rest state to an active state.
- the PKA 100 can automatically identify possible free projection surfaces and effect an independent mechanical rotation of the projector 121 or of the second interaction unit 120 into the correct projection position.
- in this context, the projection system refers to the second interaction unit 120, and the gesture recognition system refers to the first interaction unit 110.
- the control unit 131 of the PKA 100 can thus be set up to identify the head of the user on the basis of the image data of the optical sensor 112 and to cause the first interaction unit 110 to be moved such that the user's head remains within the detection range of the optical sensor 112.
- the PKA 100 may be configured to generate gestures for communication with the user, e.g. by turning the screen 111 of the first interaction unit 110 or by horizontally shaking / vertically nodding the first interaction unit 110 as interactive feedback for the user. For example, deliberate ignoring or agreeing, looking forward, etc. can be suggested by movement of the first interaction unit 110.
- the PKA 100 may be configured to communicate with a user by moving an interaction unit 110.
- the PKA 100 may include a source of vibration as additional feedback.
- the PKA 100 may be configured to detect the presence of a user based on acoustic data and / or image data. Furthermore, user input can be made via voice input (in particular via intuitive voice control, for example by means of natural language processing).
- the PKA 100 may establish a communication connection to a local and / or external recipe database.
- recipes adapted to the user can be determined and output in list or image form via the projector 121.
- recipe suggestions can be displayed in a differentiated manner, e.g. distinguishing between ingredients and utensils available in the household on the one hand, and unavailable ingredients and utensils still to be purchased on the other hand.
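The differentiated display described above presupposes a comparison between each recipe's requirements and the household inventory, for instance as in the following sketch. The representation of recipes and inventory as sets of item names is an illustrative assumption.

```python
def classify_recipes(recipes: dict[str, set[str]],
                     available: set[str]) -> tuple[list[str], dict[str, list[str]]]:
    """Split recipe suggestions into those fully cookable with the
    ingredients/utensils available in the household, and those that
    still require purchases (with the missing items listed)."""
    ready, needs_shopping = [], {}
    for name, required in recipes.items():
        missing = required - available
        if missing:
            needs_shopping[name] = sorted(missing)
        else:
            ready.append(name)
    return sorted(ready), needs_shopping
```

The missing-items lists could feed directly into the shopping-list function described earlier in the document.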
- the recipe suggestions may possibly be customized for an upcoming occasion, e.g. a birthday, a dinner, a brunch, etc.
- the PKA 100 can be set up to compare the planned preparation time for a selected recipe with the user's calendar and, if necessary, to inform the user that the required preparation time collides with his calendar. The user may then choose another recipe if necessary. A comparison with other PKAs 100 (e.g., in other households) is also possible, e.g. regarding the availability of ingredients. Thus, the user can be made aware that a particular ingredient is available in a neighboring household.
- the PKA 100 may provide an input capability for, or an automatic detection of, the equipment available in the household, e.g. by RFID tags and/or by direct image recognition and/or by verbal description by the user.
- Inputs, such as a selection or interaction, of the user can be made by voice control and / or gesture recognition.
- the PKA 100 may be configured to control home appliances 201 via the communication unit 132 or to request a status with respect to the home appliances 201.
- the home appliances 201 may include, e.g., a refrigerator, a stove, a vacuum cleaner, a blender, a food processor, a multi-cooker, small appliances, etc.
- home appliances 201 can be controlled according to the selected recipe. For example, a fill level or a content of a refrigerator can be determined.
- a cooker may be controlled according to the process progress of the recipe, e.g. by interactive preheating, by program selection, by the selection of multi-level programs, by the setting of a timer, by deactivation, etc.
- a mixer can be controlled, e.g.
- the PKA 100 may be further configured to cause a baking process to be ended when an oven door is opened and/or a telescopic extension is extended.
- the PKA 100 may provide the individual functions in isolation (i.e., independently of the illustrated example). As already stated, the PKA 100 can be brought into an active state by voice control.
- the PKA 100 may be portable and / or mobile.
- the PKA 100 may be powered by a battery (which may optionally be charged with a stationary charging station).
- the PKA 100 may further include an alarm clock function.
- the position of a user / speaker can be determined (eg with an accuracy of +/- 10 °). Objects in the vicinity of the PKA 100 may possibly be recognized by the PKA 100 by means of RFID tags.
- the PKA 100 may be configured to gain access to media databases (e.g., a message channel) via a communication connection.
- Information from a media database may be determined and displayed via the projector 121. In doing so, user preferences can be taken into account (which may possibly be learned automatically by the PKA 100).
- the displayed information can be selected depending on the persons present in the environment of the PKA 100.
- the contents may be distributed according to areas of interest of the persons present in the vicinity of the PKA 100.
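Selecting and distributing content according to the interests of the persons detected nearby can be sketched as a filter over stored interest profiles. The person names, topics, and data layout below are invented for illustration:

```python
def select_content(items, present_persons, interests):
    """Keep only media items whose topic matches an interest of at least
    one person currently detected near the device."""
    wanted = set()
    for person in present_persons:
        wanted.update(interests.get(person, []))
    return [item for item in items if item["topic"] in wanted]

# Hypothetical data: two known persons with stored interest profiles.
items = [{"topic": "news"}, {"topic": "sports"}, {"topic": "cooking"}]
interests = {"alice": ["news"], "bob": ["cooking"]}
```

The same filter can be applied per person to distribute different contents to different areas of the projection.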
- the PKA 100 may provide a reminder function. For example, in response to a weather forecast, a reminder to take along an umbrella may be given.
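The weather-dependent reminder reduces to a threshold rule on the forecast. The field name and the 50% threshold in this Python sketch are assumptions for illustration:

```python
def umbrella_reminder(forecast):
    """Return a reminder string if the forecast suggests rain, else None.
    The 'rain_probability' key and 0.5 threshold are hypothetical."""
    if forecast.get("rain_probability", 0.0) >= 0.5:
        return "Remember to take an umbrella."
    return None
```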
- the PKA 100 may interact, via the communication unit 131, with entertainment systems in the household, such as a TV, a radio, etc. In particular, a remote control of these devices can be performed by the PKA 100.
- the PKA 100 may interact with personal electronic devices 203. For example, the location of an owner of the electronic device 203 can be determined via a personal electronic device 203. The location may then be output by the PKA 100.
- the PKA 100 can communicate via the communication unit 132 with the home automation of a household. For example, pictures from a camera at the house entrance can be determined and output via the PKA 100.
- the PKA 100 may be configured to establish a connection with a door intercom system, in order to communicate directly via the PKA 100 with the house entrance and, if necessary, to activate a door opener.
- the PKA 100 can provide a videoconferencing system for interaction with other people via microphone, projection, and an Internet connection.
- conference data incoming from the conference partner can be output via the projector 121 and via the loudspeaker 113.
- the PKA 100 may be configured to access a SW database for acquiring software updates and / or software applications for extended functionality.
- the PKA 100 thus allows a variety of different functions to support a user in a household.
- an automatic interaction with home appliances 201 is enabled, such as the control of an oven, a dishwasher, a food processor, etc.
- the user typically intervenes in the control only indirectly, e.g. by selecting a cake recipe and starting to prepare a corresponding dough.
- the PKA 100 analyzes actions of the user, draws conclusions regarding the timing of the device control, and controls the home appliances 201 interactively with the user. For this purpose, the PKA 100 can quasi-continuously evaluate image data regarding the user in order to determine the process progress.
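Deriving the process progress from quasi-continuously evaluated observations can be sketched as a step counter that advances only when the recognized user action matches the next expected action of the recipe. The action labels in this Python sketch are hypothetical:

```python
def update_progress(progress, expected_actions, observed_action):
    """Advance the recipe progress by one step when the action recognized
    from the image data matches the next expected action; otherwise hold."""
    if progress < len(expected_actions) and observed_action == expected_actions[progress]:
        return progress + 1
    return progress
```

Running this on each frame-level recognition result keeps the progress state robust against repeated or unrelated observations.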
- autonomous communication can take place between multiple PKAs 100, e.g. for a reconciliation of the refrigerator contents of neighboring households, for a reconciliation of cooked recipes, etc. Furthermore, an adaptive interaction with the user can take place, such as learning and subsequently recognizing the user's state of mind based on visual and/or acoustic features.
- the PKA 100 may be configured to communicate with the user via voice output and/or a simulation of "facial expressions/gestures" (e.g. by a real or virtual movement of HW or SW components of the PKA 100 that simulates a natural human reaction).
- the PKA 100 may be set up to perform a comparison with a calendar and/or habits of the user(s), as well as with online services, recurring tasks, etc., and to output relevant information regarding matches via voice output or projection. Furthermore, an automatic comparison of needs of the user, e.g. a meal request, a recipe request, etc., with sources on the Internet, e.g. with order platforms for food, with online shops, etc., can take place.
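The automatic comparison of a meal or recipe request with order platforms can be sketched as set subtraction between the recipe's ingredients and the known refrigerator contents. The ingredient names in this Python sketch are invented for illustration:

```python
def order_list(recipe_ingredients, fridge_contents):
    """Ingredients required by the recipe but missing from the refrigerator:
    candidates for an automatic order on a food-ordering platform."""
    return sorted(set(recipe_ingredients) - set(fridge_contents))
```

The resulting list could then be matched against the catalog of an online shop before an order is proposed to the user.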
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102015210879.1A DE102015210879A1 (en) | 2015-06-15 | 2015-06-15 | Device for supporting a user in a household |
PCT/EP2016/061404 WO2016202524A1 (en) | 2015-06-15 | 2016-05-20 | Device for assisting a user in a household |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3308224A1 true EP3308224A1 (en) | 2018-04-18 |
Family
ID=56121024
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP16728843.0A Withdrawn EP3308224A1 (en) | 2015-06-15 | 2016-05-20 | Device for assisting a user in a household |
Country Status (5)
Country | Link |
---|---|
US (1) | US20180176030A1 (en) |
EP (1) | EP3308224A1 (en) |
CN (1) | CN107969150A (en) |
DE (1) | DE102015210879A1 (en) |
WO (1) | WO2016202524A1 (en) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102017213427A1 (en) * | 2017-08-02 | 2019-02-07 | BSH Hausgeräte GmbH | Food processor with display |
DE102017215279A1 (en) | 2017-08-31 | 2019-02-28 | BSH Hausgeräte GmbH | household assistant |
DE102017218162A1 (en) | 2017-10-11 | 2019-04-11 | BSH Hausgeräte GmbH | Household assistant with projector |
ES2713598A1 (en) * | 2017-11-15 | 2019-05-22 | Bsh Electrodomesticos Espana Sa | Household appliance system (Machine-translation by Google Translate, not legally binding) |
DE102018001509A1 (en) * | 2018-02-27 | 2019-08-29 | MChef GmbH & Co.KG | Method and system for preparing food |
DE102018205558A1 (en) | 2018-04-12 | 2019-10-17 | BSH Hausgeräte GmbH | Kitchen device with imaging surface element for digital content and kitchen arrangement with kitchen device |
CN110579985A (en) * | 2018-06-07 | 2019-12-17 | 佛山市顺德区美的电热电器制造有限公司 | control method, device and system |
EP3736497A1 (en) * | 2019-05-06 | 2020-11-11 | Electrolux Appliances Aktiebolag | Cooking appliance |
CN110393922A (en) * | 2019-08-26 | 2019-11-01 | 徐州华邦益智工艺品有限公司 | A kind of acoustic control down toy dog with projection function |
CN110891165B (en) * | 2019-12-03 | 2021-05-25 | 珠海格力电器股份有限公司 | Projection device |
EP4271969A1 (en) * | 2020-12-30 | 2023-11-08 | InterProducTec Consulting GmbH & Co. KG | Food processing system |
EP4086726A1 (en) * | 2021-05-06 | 2022-11-09 | BSH Hausgeräte GmbH | Electronic device for providing information regarding an appliance |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3819856A (en) * | 1972-04-17 | 1974-06-25 | D Pearl | Camera capsule |
DE10027963A1 (en) * | 2000-06-08 | 2001-12-13 | O F A Line Gmbh | Food produce identification through a refrigerator with transponder to maintain control of conditions |
US7134080B2 (en) * | 2002-08-23 | 2006-11-07 | International Business Machines Corporation | Method and system for a user-following interface |
KR20070029794A (en) * | 2004-07-08 | 2007-03-14 | 코닌클리케 필립스 일렉트로닉스 엔.브이. | A method and a system for communication between a user and a system |
US20100231506A1 (en) * | 2004-09-07 | 2010-09-16 | Timothy Pryor | Control of appliances, kitchen and home |
US7249708B2 (en) * | 2005-02-04 | 2007-07-31 | The Procter & Gamble Company | Household management systems and methods |
JP4563863B2 (en) * | 2005-05-12 | 2010-10-13 | クリナップ株式会社 | System kitchen |
ITBO20100193A1 (en) * | 2010-03-27 | 2011-09-28 | Chanan Gardi | MULTIMEDIA EQUIPMENT |
US8751049B2 (en) * | 2010-05-24 | 2014-06-10 | Massachusetts Institute Of Technology | Kinetic input/output |
US8322863B1 (en) * | 2010-06-18 | 2012-12-04 | Samuel Seungmin Cho | Apparatus and method for automated visual distortion adjustments for a portable projection device |
US20130052616A1 (en) * | 2011-03-17 | 2013-02-28 | Sears Brands, L.L.C. | Methods and systems for device management with sharing and programming capabilities |
JP5961945B2 (en) * | 2011-08-18 | 2016-08-03 | 株式会社リコー | Image processing apparatus, projector and projector system having the image processing apparatus, image processing method, program thereof, and recording medium recording the program |
CN102999282B (en) * | 2011-09-08 | 2015-11-25 | 北京林业大学 | Based on data object logic control system and the method thereof of real-time stroke input |
KR20130096539A (en) * | 2012-02-22 | 2013-08-30 | 한국전자통신연구원 | Autonomous moving appartus and method for controlling thereof |
US20130279706A1 (en) * | 2012-04-23 | 2013-10-24 | Stefan J. Marti | Controlling individual audio output devices based on detected inputs |
US8990274B1 (en) * | 2012-05-10 | 2015-03-24 | Audible, Inc. | Generating a presentation associated with a set of instructions |
US9836590B2 (en) * | 2012-06-22 | 2017-12-05 | Microsoft Technology Licensing, Llc | Enhanced accuracy of user presence status determination |
US8983662B2 (en) * | 2012-08-03 | 2015-03-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Robots comprising projectors for projecting images on identified projection surfaces |
US20140095479A1 (en) * | 2012-09-28 | 2014-04-03 | Sherry S. Chang | Device, method, and system for recipe recommendation and recipe ingredient management |
US9239627B2 (en) * | 2012-11-07 | 2016-01-19 | Panasonic Intellectual Property Corporation Of America | SmartLight interaction system |
CN103170980B (en) * | 2013-03-11 | 2016-04-20 | 常州铭赛机器人科技股份有限公司 | A kind of navigation system of household service robot and localization method |
US10391636B2 (en) * | 2013-03-15 | 2019-08-27 | Sqn Venture Income Fund, L.P. | Apparatus and methods for providing a persistent companion device |
US20140365225A1 (en) * | 2013-06-05 | 2014-12-11 | DSP Group | Ultra-low-power adaptive, user independent, voice triggering schemes |
US9494683B1 (en) * | 2013-06-18 | 2016-11-15 | Amazon Technologies, Inc. | Audio-based gesture detection |
US20150146078A1 (en) * | 2013-11-27 | 2015-05-28 | Cisco Technology, Inc. | Shift camera focus based on speaker position |
EP2975908A1 (en) * | 2014-06-18 | 2016-01-20 | Toshiba Lighting & Technology Corporation | Lighting device |
- 2015
  - 2015-06-15 DE DE102015210879.1A patent/DE102015210879A1/en not_active Withdrawn
- 2016
  - 2016-05-20 US US15/736,388 patent/US20180176030A1/en not_active Abandoned
  - 2016-05-20 CN CN201680035193.1A patent/CN107969150A/en active Pending
  - 2016-05-20 WO PCT/EP2016/061404 patent/WO2016202524A1/en active Application Filing
  - 2016-05-20 EP EP16728843.0A patent/EP3308224A1/en not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
CN107969150A (en) | 2018-04-27 |
US20180176030A1 (en) | 2018-06-21 |
WO2016202524A1 (en) | 2016-12-22 |
DE102015210879A1 (en) | 2016-12-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3308224A1 (en) | Device for assisting a user in a household | |
Fischinger et al. | Hobbit, a care robot supporting independent living at home: First prototype and lessons learned | |
US20170364828A1 (en) | Multifunction mobile units | |
CN111839371B (en) | Ground sweeping method and device, sweeper and computer storage medium | |
CN105301997B (en) | Intelligent prompt method and system based on mobile robot | |
DE202017107611U1 (en) | Design for compact home assistants with combined sound waveguide and heat sink | |
CN109360559A (en) | The method and system of phonetic order is handled when more smart machines exist simultaneously | |
DE102017129939A1 (en) | Conversational proactive notifications for a voice interface device | |
DE202016007875U1 (en) | Systems for the automatic monitoring of real-time activities in a location to determine waiting times using portable devices | |
CN105446162A (en) | Intelligent home system and intelligent home control method of robot | |
KR20190079669A (en) | Social robot with environmental control feature | |
CN105361429A (en) | Intelligent studying platform based on multimodal interaction and interaction method of intelligent studying platform | |
Goldberg et al. | Collaborative online teleoperation with spatial dynamic voting and a human" Tele-Actor" | |
CN110578994A (en) | operation method and device | |
CN111077786B (en) | Intelligent household equipment control method and device based on big data analysis | |
EP2975908A1 (en) | Lighting device | |
CN110535735A (en) | Playback equipment control method and device based on Internet of Things operating system | |
CN113251610A (en) | Method and device for air conditioner control and air conditioner | |
CN109357366A (en) | Adjustment control method, device, storage medium and air-conditioning system | |
CN109558004A (en) | A kind of control method and device of human body auxiliary robot | |
US20150169834A1 (en) | Fatigue level estimation method, program, and method for providing program | |
CN110533898B (en) | Wireless control learning system, method, apparatus, device and medium for controlled device | |
KR102227427B1 (en) | Cleaning robot, home monitoring apparatus and method for controlling the same | |
DE202018101233U1 (en) | Systems and devices for activity monitoring via a home assistant | |
CN106303701A (en) | Intelligent television content recommendation method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| 17P | Request for examination filed | Effective date: 20180115 |
| AK | Designated contracting states | Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| AX | Request for extension of the european patent | Extension state: BA ME |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
| 17Q | First examination report despatched | Effective date: 20190705 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
| 18D | Application deemed to be withdrawn | Effective date: 20201201 |