WO2006120636A2 - Method and device for providing help to a user in using a device by using a robot - Google Patents
- Publication number
- WO2006120636A2 PCT/IB2006/051442
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- help
- robot
- user
- request
- input
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
- G06F9/453—Help systems
Definitions
- the present invention relates to a method and a device for providing help to a user in using a device, wherein said help is provided by a robot.
- the present invention further relates to a robot adapted to provide help to a user in using said device.
- manufacturers of electronic and other goods provide buyers with user manuals including installation and operation methods, function descriptions, notices, etc., in a booklet or electronic form, along with their products.
- the typical help function for personal computers is included in the software package at purchase, where "Help" is in most cases a special function provided alongside the package's other main functions.
- by pressing this "Help" function, the user can e.g. receive animated help, with characters shown on the screen indicating which buttons/menus to select in case of a problem, etc.
- many electronic devices have no display and are not connected to a screen/television. An example of such devices is a CD player. The only way for the user to find out how to perform various functions is therefore to go through the manual that came with the device.
- many users consider such procedures very tedious and too time-consuming, especially since for many high-tech devices the manual consists of several hundred pages. This often makes the search for help extremely difficult and in some cases even impossible. It can of course also happen that the user cannot find the manual when needed.
- the present invention relates to a method of providing help to a user in using a device, wherein said help is provided by a robot, the method comprising the steps of:
- the robot may be adapted to either ask the user directly about the problem he/she needs to have solved, or be adapted to communicate with the device and request the device for the help the user needs.
- the help can e.g. be an animated help comprising pointing towards the button(s) the user should press, or simply illustrating the procedure of performing the function.
- the device does not even have to be very "intelligent", since, in principle, it only needs to be capable of generating and sending said help request to the robot, wherein subsequently the helping procedure can be based on a direct communication between the user and the robot.
- the help request further comprises an identification of the device, so that the robot is aware of what kind of device is requesting help.
- said received input is obtained via a push or touch button comprised in said device, whereby pressing said button provides said indication for help. In that way, the user can request said help in a very easy and comfortable way.
- said received input comprises an oral input from said user.
- the user can therefore communicate orally with the device, wherein he/she in a very precise way can formulate the type of problem, and this problem can then be presented to the robot by the device.
- said help request comprises the ID number of said device for uniquely identifying said device. In that way, in cases where there is more than one device, the robot can uniquely identify which device sends the request.
- the request might further include the coordinate position of the device, so that the robot also knows the exact location of the device.
- the robot could of course have all the IDs of the devices along with their locations pre-stored in a memory.
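The help request carrying a device ID and coordinates, with a robot-side fallback to a pre-stored ID/location map, can be sketched as follows. All field names, IDs, and coordinates here are illustrative assumptions, not taken from the patent text:

```python
# Sketch of a help request identifying the device and its location, plus the
# robot-side lookup when the request omits coordinates. Hypothetical fields.

def make_help_request(device_id, position, topic=None):
    """Build a help request uniquely identifying the device."""
    return {"device_id": device_id, "position": position, "topic": topic}

# The robot may instead keep device IDs and locations pre-stored in memory:
ROBOT_DEVICE_MAP = {"VCR-001": (2.0, 3.5), "CD-002": (0.5, 1.0)}

def locate_device(request):
    """Prefer the coordinates in the request; fall back to the pre-stored map."""
    if request.get("position") is not None:
        return request["position"]
    return ROBOT_DEVICE_MAP.get(request["device_id"])
```

The fallback mirrors the two variants the text describes: a request that carries its own coordinate position, and a robot that already knows the location of the device with that ID.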
- the method further comprises transmitting help scripts to said robot, wherein said help scripts are assembled from pre-stored help scripts in said device.
- the robot can be provided with the help scripts necessary to help the user, even before the robot has approached the device/user.
- the help script may be included in the help request, or be sent as a separate document.
- the assembling of said help scripts is based on the user's input or the frequency of the user's attempts to perform a particular function on said device. Therefore, based on the oral communication between the user and the device where the user orally describes the problem, e.g. "I don't know how to record on my VCR", the device can process the user's oral request using e.g. a speech recognizer, and subsequently assemble the relevant help scripts, e.g. those relating to the recording procedure.
- the device can estimate that the user's intention is to record.
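The frequency-based intent estimation described above can be sketched as follows; the press threshold and the button-to-intent table are assumptions made for illustration only:

```python
# Illustrative sketch: the device counts repeated presses of a function button
# and, past a threshold, estimates the user's intent and can raise a help
# request for that topic. Threshold and button names are hypothetical.

from collections import Counter

class IntentEstimator:
    def __init__(self, threshold=3):
        self.presses = Counter()
        self.threshold = threshold

    def press(self, button):
        """Record a button press; return an estimated help topic once the
        same button has been pressed `threshold` times without success."""
        self.presses[button] += 1
        if self.presses[button] >= self.threshold:
            return {"REC": "recording"}.get(button, button.lower())
        return None
```

This matches the VCR example: several fruitless presses of "REC" let the device estimate that the user's intention is to record.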
- the robot might e.g. communicate directly with the user.
- said robot is selected from a number of robots based on at least one selection criterion. This could be necessary where the user possesses more than one robot, so that only one robot is selected.
- said selection criteria are based on at least one selection parameter from a group of selection parameters consisting of:
- the robot can be selected so that the waiting time for receiving help is minimized, where the selection is based on the robot which is closest to the device; so that the provided help is the most comfortable, where the robot has the most favorable technical features (e.g. a color display); or so that the provided help is very personal, where the selected robot is the user's favorite robot. Also, some of the robots might simply not be available because they are solving other tasks.
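One possible way to combine these selection parameters is a simple scoring function over the available robots; the weights and field names below are illustrative assumptions, not part of the disclosure:

```python
# Hedged sketch of selecting one robot by the listed criteria: availability,
# distance to the device, favorable features (e.g. a color display), and the
# user's favorite robot. Scoring weights are illustrative.

import math

def select_robot(robots, device_pos, favorite=None):
    """Return the best available robot: closest first, with small bonuses
    for richer features and for being the user's favorite."""
    candidates = [r for r in robots if not r["busy"]]
    if not candidates:
        return None  # all robots are solving other tasks

    def score(r):
        distance = math.dist(r["position"], device_pos)
        feature_bonus = 1.0 if r.get("color_display") else 0.0
        favorite_bonus = 0.5 if r["id"] == favorite else 0.0
        return -distance + feature_bonus + favorite_bonus

    return max(candidates, key=score)
```

As the text notes, this selection could equally run on the device side or among the robots themselves.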
- the present invention relates to a computer readable medium having stored therein instructions for causing a processing unit to execute said method.
- the present invention relates to a device adapted to provide help to a user in using said device, wherein said help is provided by a robot, comprising: input means for receiving an input from said user indicating a need for help in using said device,
- the present invention relates to a robot adapted to provide a help to a user in using a device, comprising:
- a receiver for receiving a help request from said device indicating a need for help in using said device, a processor for processing the received help request, and - a communication means for communicating with said user when providing said help.
- processing capacity and the intelligence in such robots can further be utilized in providing animated help functions for the users.
- said processor is, in case of more than one robot being present, adapted to select the robot to provide said animated help function.
- the robots communicate with each other and, e.g. based on the previously mentioned selection criteria, determine which one should help the user in the specific case.
- figures 1 and 2 show one embodiment according to the present invention where a user who is using a device requests help
- figure 3 shows a device according to the present invention
- figure 4 shows one embodiment of a robot according to the present invention
- figure 5 is a flowchart according to one embodiment of the present invention, where, after starting the device, the user needs help to perform a function on the device
- figure 6 is a flowchart according to another embodiment of the present invention
- figure 7 is a flowchart seen from the robot point of view.
- Figure 1 shows one embodiment according to the present invention where a user 102, who is using a device 101, requests help.
- the device 101 can be any kind of electronic device, such as a CD player or the like, which is typically not connected to a screen/television to show help animations.
- the user 102 indicates that he/she needs help in using the device 101 by e.g. pressing a "help- button", whereby a help request 104 is issued or generated and transmitted to a robot 103. In that way the robot 103 is informed that help is needed on this device 101.
- the help request 104 preferably contains information that uniquely identifies the device, which can be issued simultaneously when the button is pressed.
- a help script is extracted from a pre-stored help file within the device 101 and transmitted to the robot 103, either as a separate document or as an additional data file comprised in said help request 104.
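Extracting a complete or partial help script from a pre-stored help file might look like this sketch, where the topic keys and script texts are hypothetical examples:

```python
# Minimal sketch: the device keeps pre-stored help scripts keyed by topic and
# extracts either the full script set or a partial script for the estimated
# problem. All keys and texts are hypothetical.

PRE_STORED_HELP = {
    "recording": "Press REC and PLAY simultaneously.",
    "timer": "Hold MENU, then select Timer with the arrow keys.",
}

def extract_help_script(topic=None):
    """Return the partial script for `topic`, or the complete script set
    when no specific topic could be estimated."""
    if topic is not None and topic in PRE_STORED_HELP:
        return {topic: PRE_STORED_HELP[topic]}
    return dict(PRE_STORED_HELP)
```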
- the robot can be any kind of machine or mechanical device, which possesses high processing capacity and preferably can operate automatically with humanlike skills.
- the received input from the user 102 is in one embodiment based on the frequency with which the user performs particular functions on the device 101, e.g. the user 102 has been trying to record on his/her VCR by pressing the "REC" button a number of times.
- this frequency is detected by the device 101, which based thereon issues said help request 104.
- the device 101 might further be adapted to trigger different help scripts relating to the recording procedure. In such a case, the device 101 estimates that the user 102 needs help for recording on the device 101, and based thereon extracts a partial help script relating to this subject.
- the robot 103 might be adapted to communicate directly with the user 102 to find out what kind of help is needed by the user.
- the help scripts comprise a complete help script for this particular device 101 or, as mentioned previously, a partial help script relating to the particular help that the user 102 needs, e.g. a script relating to recording on the user's VCR.
- the help request 104 preferably comprises an ID number of the device 101 for uniquely identifying the device along with its coordinate position.
- the robot 103 may also be preprogrammed in a way that it is already aware of the location of the device 101 with this particular ID number.
- the help request 104 and help scripts are preferably in a markup language for this particular robot 103, such as for creating physical animations.
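The patent does not specify the markup language, so the XML shape below is purely a hypothetical example of how a help script for physical animations could be encoded and parsed with the standard library:

```python
# Hypothetical markup for a robot help script; the element and attribute
# names are assumptions for illustration, not from the patent.

import xml.etree.ElementTree as ET

SCRIPT = """
<helpscript device="VCR-001" topic="recording">
  <step action="point" target="REC"/>
  <step action="point" target="PLAY"/>
  <step action="say" text="Press REC and PLAY simultaneously."/>
</helpscript>
"""

def parse_steps(markup):
    """Return (action, target-or-text) pairs the robot would animate."""
    root = ET.fromstring(markup)
    return [(s.get("action"), s.get("target") or s.get("text"))
            for s in root.findall("step")]
```

A script like this lets the robot point at the relevant buttons and speak the instruction, matching the animation shown in Fig. 2.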
- the help request 104 comprises only a request to help the user 102.
- the actual help needed by the user 102 is established orally from the user 102 towards the robot 103, wherein, via a speech recognition process, the robot 103 processes the user's request. This could be the case where the device 101 cannot find out what kind of help is needed.
- the robot 103 interacts directly with the device 101 by requesting the help script from the device 101 in order to assist the user 102 with the user's help request.
- the robot 103 may also be adapted to request said script from a database 105 over a wireless communication channel 106 such as the internet.
- the robot 103 has the help script for this device 101 already pre-stored, so after processing the oral help request from the user, the robot 103 can immediately help the user 102, e.g. by animating to the user how to perform the function the user 102 was not able to perform.
- Such an animation is shown in Fig. 2 where the robot 103 illustrates to the user 102 via pointing or animating and/or orally how to perform the specific function on the device 101 that the user 102 was not able to perform.
- the user shows the robot some domestic appliance, and then the robot explains how to use that appliance. For instance, the user might need help on a "hair dryer".
- the robot recognizes the appliance by some kind of a sensor (e.g. a camera) and based thereon helps the user.
- in the case where there is more than one robot, the robot 103 must be selected from said number of robots based on various selection criteria.
- the criteria can comprise which one of the robots is closest to the device 101, which of the robots is the most suitable to be used for this particular device (e.g. it can be preferred that the robot comprises a color display), or whether the user 102 prefers a particular robot, e.g. his/her favorite robot.
- Such a selection procedure can either be performed on the device 101 side, or on the robot side. In the former case, the device 101 determines, based on said one or more selection criteria, which one to use.
- the device might transmit a request to all the robots for their coordinate positions and, based on the responses from all the robots, select the closest one.
- the device 101 might also be adapted to receive a request from the user 102, such as an oral request, where the user can instruct the device 101 which robot he/she prefers.
- the device 101 transmits a help request 104 to all the robots.
- by communicating with each other, e.g. by requesting from each other data giving the coordinate positions or locations of the surrounding robots along with their identifications, the robots internally determine which one has the shortest distance to the device 101.
- the selected robot 103 transmits data identifying itself to the device and confirms that it is the one that will assist the user 102.
- Another selection criterion used by the robots to select a robot to help the user 102 is whether or not the robot(s) is/are available, and/or which one of the robots has the help script for this device 101 pre-stored.
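A sketch of this robot-side election, assuming each robot broadcasts its ID, position, availability, and whether it has the script pre-stored (all message fields are illustrative); a deterministic tie-break by ID ensures every robot reaches the same decision:

```python
# Sketch of the robots deciding among themselves which one responds: prefer an
# available robot with the script pre-stored, then the closest one. The
# message shape and preference order are illustrative assumptions.

import math

def elect_helper(peers, device_pos):
    """Given every robot's broadcast state, pick one responder ID
    deterministically so all robots reach the same decision."""
    available = [p for p in peers if p["available"]]
    if not available:
        return None
    chosen = min(
        available,
        key=lambda p: (not p["has_script"],
                       math.dist(p["position"], device_pos),
                       p["id"]),
    )
    return chosen["id"]
```

The elected robot would then transmit its identification to the device and confirm that it will assist the user, as described above.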
- the robot 103 can start providing the user with the help needed. If it was not possible to estimate what kind of help the user 102 needs, the robot may be adapted to receive oral information from the user 102 about the help needed, e.g. "I don't know how to record on my VCR", whereby this user input is then processed and used for providing said animated help function for the user.
- FIG. 3 shows a device 101, 301 according to the present invention comprising an input means (I_M) 305 for receiving help request from the user 102, a memory (M) 302 having e.g. pre-stored help scripts, a processor (P) 303 which is e.g. adapted to generate said help request 104 based on a command from said user 102, wherein the processor (P) 303 may further be adapted to trigger different help scripts from the memory (M) 302.
- the command from the user 102 can be an oral command wherein the processor 303 processes the oral command into a request, which is to be transmitted to the particular robot.
- the device 101, 301 comprises a transmitter (T) 304 for transmitting said help request and (when needed) said help scripts 104 to said robot 103.
- the memory 302 is not essential for the present invention, since the robot might, as mentioned previously, obtain all the help scripts from a server (database) 105 over a wireless communication channel 106, e.g. the internet.
- Figure 4 shows one embodiment of a robot 103, 401 according to the present invention comprising a receiver (R) 404, a memory (M) 402, a processor (P) 403 and communication means (C_M) 405.
- the receiver (R) 404 is e.g. adapted for receiving said help request 104 from the device 101, 301.
- the processor (P) 403 is e.g. adapted to process the various help requests received by the robot or for e.g. determining which robot from said number of robots should be used in helping the user 102.
- the communication means (C_M) 405 is adapted to provide any kind of help for the user 102 and can e.g. comprise a display for displaying a help function for the user or for displaying an animated help.
- the communication means (C_M) can further comprise movable parts, e.g. an artificial hand, for creating physical animation for the user, such as to animate for the user how to record on the user's VCR.
- the communication means (C_M) 405 can also comprise a speech recognizer for communicating orally with the user 102. In that way the user 102 can, as previously discussed, formulate the problem orally to the robot, i.e. "how do I record on this VCR". Shown is also a database 406 which the robot 103, 401 may access, e.g. in cases where the device does not have help scripts pre-stored.
- FIG. 5 is a flowchart according to one embodiment of the present invention, where after starting (S) 501 the device, the user needs help to perform a function on the device, e.g. to record or to adjust the timer.
- This can be indicated by the user by means of e.g. pressing a help button on the device and in that way generating a help request (H_R) 503.
- Another way of indicating this need for assistance could be based on the number of attempts to perform said function, e.g. the user has pressed the "REC" button on the VCR several times, whereas the correct way of recording would be to press the "REC" and "PLAY" buttons simultaneously.
- the attempts made by the user might indicate the user's interest to record.
- the device can be adapted to trigger different help scripts (T_H) 505 relating to the recording procedure.
- a downloadable help script (T_H) 505 is formulated for the robot, preferably in a markup language for robots, e.g. for creating physical or any other kind of animations for the user.
- the device further selects the robot (S_R) 507 to be used for helping the user.
- the selection may be based on at least one selection parameter indicating which of the robots is closest to the device, which robot has the technical features most favorable for this particular device (e.g. the robot has a color display, arms, etc.), or which robot is the user's favorite robot (the device could e.g. be adapted to ask the user which robot he/she prefers).
- help request (H_R) 503 and help scripts (T_H) 505 are now transmitted by the device (T_H_S) 509 to the selected robot, which then executes the request/script and approaches the user and helps him/her (H_U) 511.
- the robot might further be adapted to request the user whether he/she needs more help (M_H) 513, if yes the processing steps from 505-511 are repeated, otherwise the helping function provided by the robot ends (E) 515.
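The device-side flow of Figure 5 can be sketched with hypothetical callback stubs standing in for the real device functions (none of these function names come from the patent):

```python
# Compact sketch of the Figure 5 device-side flow: receive the help
# indication, assemble the script, select a robot, transmit, and repeat
# while the user asks for more help. All callbacks are hypothetical stubs.

def help_loop(get_request, assemble_script, select_robot, transmit, more_help):
    request = get_request()                 # H_R 503: e.g. help button pressed
    while True:
        script = assemble_script(request)   # T_H 505: trigger matching scripts
        robot = select_robot()              # S_R 507: pick the robot to use
        transmit(robot, request, script)    # T_H_S 509: send request + script
        if not more_help():                 # M_H 513: does the user need more?
            return                          # E 515: helping function ends
```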
- Figure 6 is a flowchart according to another embodiment of the present invention, where after starting (S) 601 the device, the user needs help to perform a function on the device, where the user indicates this by e.g.
- the device receives a help request (H_R) 603. Subsequently, the device triggers different help scripts (T_H) 605, and subsequently transmits (T_H_S) 607 a help script to one or more robots.
- the difference between this embodiment and the embodiment in Fig. 5 is that, in cases where one or more robots exist in the surroundings of the device, the robots select which one should be used to assist the user. After selection, the robot approaches the user and offers him/her help (H_U) 609. The robot might further be adapted to ask the user whether he/she needs more help (M_H) 611; if yes, the processing steps from 605-609 are repeated, otherwise the helping function provided by the robot ends (E) 613.
- the robot could be adapted to assist the user by communicating directly with the user.
- said help button could be used for "calling" the robot, i.e. only transmitting the help request, whereby the device is simultaneously uniquely identified by the robot.
- the robot which receives the "call" approaches the device and asks the user what the problem is.
- the device itself could therefore only be adapted to call the robot, so that the intelligence is more or less on the robot side.
- Figure 7 is a flowchart seen from the robot's point of view, beginning with start (S) 701: the robot receives a help request (R_H_S) 703 from the device, indicating which device is requesting help. Also, a help script (H_S) 705 from the device may be received, which can either be sent as a separate document or be comprised in the help request. After processing the received documents, the robot approaches the device and helps the user (H_U) 707. This help can be in the form of e.g. pointing, animating, displaying, etc. as discussed previously. The robot may further be adapted to ask the user whether he/she needs more help (M_H) 709. If the answer is yes, the process steps from 703-707 are repeated, otherwise the help is ended (E) 711.
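The robot-side flow of Figure 7 can likewise be sketched with hypothetical callbacks standing in for the robot's real I/O:

```python
# Robot-side counterpart of the Figure 7 flow: receive the request and an
# optional script, help the user, and repeat from reception while more help
# is wanted. The callbacks are hypothetical stand-ins.

def robot_help_loop(receive, help_user, more_help):
    helped = 0
    while True:
        request, script = receive()    # R_H_S 703 / H_S 705
        help_user(request, script)     # H_U 707: point, animate, display...
        helped += 1
        if not more_help():            # M_H 709: does the user need more?
            return helped              # E 711: help is ended
```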
Abstract
This invention relates to a method and a device for supplying help to a user when using a device, wherein the help is provided by a robot. When the user requests such help, the request is processed and subsequently transmitted to a robot. After receiving the help request, the robot approaches the device and helps the user, either by asking the user directly or via communicating with the device.
Description
METHOD AND DEVICE FOR PROVIDING HELP TO A USER IN USING A DEVICE BY USING A ROBOT
Field of the invention
The present invention relates to a method and a device for providing help to a user in using a device, wherein said help is provided by a robot. The present invention further relates to a robot adapted to provide help to a user in using said device.
Background of the invention
Generally, manufacturers of electronic and other goods provide buyers with user manuals including installation and operation methods, function descriptions, notices, etc., in a booklet or electronic form, along with their products.
For personal computers, the typical help function is included in the software package at purchase, where "Help" is in most cases a special function provided alongside the package's other main functions. By pressing this "Help" function, the user can e.g. receive animated help, with characters shown on the screen indicating which buttons/menus to select in case of a problem, etc. On the other hand, many electronic devices have no display and are not connected to a screen/television. An example of such devices is a CD player. The only way for the user to find out how to perform various functions is therefore to go through the manual that came with the device. However, many users consider such procedures very tedious and too time-consuming, especially since for many high-tech devices the manual consists of several hundred pages. This often makes the search for help extremely difficult and in some cases even impossible. It can of course also happen that the user cannot find the manual when needed.
Object and summary of the invention
It is an object of the present invention to solve these problems by providing an animated help function for users when using such devices.
According to one aspect, the present invention relates to a method of providing help to a user in using a device, wherein said help is provided by a robot, the method comprising the steps of:
- receiving an input from said user indicating a need for a help in using said device,
- processing said received input for generating a help request, and
- transmitting said help request to said robot, wherein said robot is adapted to receive said help request and based thereon provide said help.
In that way, the user can be provided with help when using all kinds of devices, such as those having no display screen, where the user would otherwise manually have to find out how to solve a particular function. Since today's robots, and especially future models, possess enormous processing capacity and intelligence, a very comfortable help function can be obtained. The robot may be adapted to either ask the user directly about the problem he/she needs to have solved, or be adapted to communicate with the device and request the device for the help the user needs.
The help can e.g. be an animated help comprising pointing towards the button(s) the user should press, or simply illustrating the procedure of performing the function. The device does not even have to be very "intelligent", since, in principle, it only needs to be capable of generating and sending said help request to the robot, wherein subsequently the helping procedure can be based on a direct communication between the user and the robot. Preferably, the help request further comprises an identification of the device, so that the robot is aware of what kind of device is requesting help.
In an embodiment said received input is obtained via a push or touch button comprised in said device, whereby pressing said button provides said indication for help. In that way, the user can request said help in a very easy and comfortable way.
In an embodiment, said received input comprises an oral input from said user. The user can therefore communicate orally with the device, wherein he/she in a very precise way can formulate the type of problem, and this problem can then be presented to the robot by the device. Furthermore, through such an oral input the user can even request for a particular robot (in case the user possesses more than one robot). This makes the whole communication very user friendly.
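As a sketch of this oral-input path: a real speech recognizer would supply the recognized utterance, and the keyword-to-topic table below is an assumption made purely for illustration:

```python
# Illustrative sketch of turning a recognized utterance into a help topic the
# device can present to the robot. The keyword table is hypothetical.

KEYWORDS = {"record": "recording", "timer": "timer", "play": "playback"}

def topic_from_utterance(utterance):
    """Map e.g. "I don't know how to record on my VCR" to a help topic."""
    for word, topic in KEYWORDS.items():
        if word in utterance.lower():
            return topic
    return None  # the device cannot estimate; the robot may ask directly
```

When no topic can be estimated, the text's fallback applies: the robot communicates directly with the user to find out what help is needed.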
In an embodiment, said help request comprises the ID number of said device for uniquely identifying said device. In that way, in cases where there is more than one device, the robot can uniquely identify which device sends the request. The request might further include the coordinate position of the device, so that the robot also knows the exact location of the device. The robot could of course have all the IDs of the devices along with their locations pre-stored in a memory.
In an embodiment, the method further comprises transmitting help scripts to said robot, wherein said help scripts are assembled from pre-stored help scripts in said device. In that way, the robot can be provided with the help scripts necessary to help the user, even before the robot has approached the device/user. The help script may be included in the help request, or be sent as a separate document.
In an embodiment, the assembling of said help scripts is based on the user's input or the frequency of the user's attempts to perform a particular function on said device. Therefore, based on the oral communication between the user and the device where the user orally describes the problem, e.g. "I don't know how to record on my VCR", the device can process the user's oral request using e.g. a speech recognizer, and subsequently assemble the relevant help scripts, e.g. those relating to the recording procedure. In case the user has been pressing the "REC" button, but is not aware of the fact that by pressing the "REC" button and the "PLAY" button simultaneously the recording feature is obtained, the device can estimate that the user's intention is to record. However, in cases where such estimation fails, or the device is not able to estimate what the problem is, the robot might e.g. communicate directly with the user.
In an embodiment, said robot is selected from a number of robots based on at least one selection criterion. This could be necessary where the user possesses more than one robot, so that only one robot is selected.
In an embodiment, said selection criteria are based on at least one selection parameter from a group of selection parameters consisting of:
- the coordinate position of said robot relative to said device,
- the functional features provided by the robots,
- the user's favorite robot, and
- whether or not the robots are busy.
Therefore, the robot can be selected so that the waiting time for receiving help is minimized, where the selection is based on the robot which is closest to the device; so that the provided help is the most comfortable, where the robot has the most favorable technical features (e.g. a color display); or so that the provided help is very personal, where the selected robot is the user's favorite robot. Also, some of the robots might simply not be available because they are solving other tasks.
In a further aspect, the present invention relates to a computer readable medium having stored therein instructions for causing a processing unit to execute said method.
According to another aspect, the present invention relates to a device adapted to provide help to a user in using said device, wherein said help is provided by a robot, comprising:
- input means for receiving an input from said user indicating a need for help in using said device,
- a processor for processing said received input and generating a help request, and
- a transmitter for transmitting said help request to said robot, wherein said robot is adapted to receive said help request and based thereon provide said help.
In an embodiment, said input means comprises a speech recognizer for receiving an oral input from said user. In that way the device is capable of communicating with the user or the robot.
According to yet another aspect, the present invention relates to a robot adapted to provide help to a user in using a device, comprising:
- a receiver for receiving a help request from said device indicating a need for help in using said device,
- a processor for processing the received help request, and
- a communication means for communicating with said user when providing said help.
Therefore, the processing capacity and the intelligence in such robots can further be utilized in providing animated help functions for the users.
In an embodiment, said processor is, in case of more than one robot being present, adapted to select the robot to provide said animated help function. In such cases, the robots communicate with each other and, e.g. based on the previously mentioned selection criteria, determine which one should help the user in the specific case.
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.
Brief description of the drawings
In the following, preferred embodiments of the invention will be described with reference to the figures, where
figures 1 and 2 show one embodiment according to the present invention where a user who is using a device requests help,
figure 3 shows a device according to the present invention,
figure 4 shows one embodiment of a robot according to the present invention,
figure 5 is a flowchart according to one embodiment of the present invention, where, after starting the device, the user needs help to perform a function on the device,
figure 6 is a flowchart according to another embodiment of the present invention, and
figure 7 is a flowchart seen from the robot's point of view.
Description of preferred embodiments
Figure 1 shows one embodiment according to the present invention where a user 102, who is using a device 101, requests help. The device 101 can be any kind of electronic device, such as a CD player or the like, which is typically not connected to a screen/television that could show help animations. As illustrated here, the user 102 indicates that he/she needs help in using the device 101 by e.g. pressing a "help button", whereby a help request 104 is issued or generated and transmitted to a robot 103. In that way the robot 103 is informed that help is needed on this device 101. The help request 104 preferably contains information which uniquely identifies the device, and can be issued at the moment the button is pressed. In a preferred embodiment, a help script is extracted from a pre-stored help file within the device 101 and transmitted to the robot 103, either as a separate document or as an additional data file comprised in said help request 104. The robot can be any kind of machine or mechanical device which possesses high processing capacity and preferably can operate automatically with humanlike skills.
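The help request described above — a device identifier, a coordinate position, and an optional help script — could be sketched as a simple message builder. This is a minimal illustration only; the field names, the JSON encoding, and the message layout are assumptions, not taken from the patent:

```python
import json

def build_help_request(device_id, position, help_script=None):
    """Assemble a help request that uniquely identifies the device and
    optionally bundles a help script extracted from the device's
    pre-stored help file. All field names are illustrative."""
    request = {
        "type": "help_request",
        "device_id": device_id,   # uniquely identifies the device
        "position": position,     # coordinate position of the device
    }
    if help_script is not None:
        # complete or partial script, e.g. in a robot markup language
        request["help_script"] = help_script
    return json.dumps(request)

# A device might issue such a request when the help button is pressed:
msg = build_help_request("VCR-042", (3.0, 1.5),
                         help_script="<animate>press REC and PLAY</animate>")
```

The optional `help_script` field mirrors the patent's two cases: the script may travel inside the help request or be fetched separately by the robot.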
The received input from the user 102 is in one embodiment based on the frequency with which the user performs particular functions on the device 101, e.g. the user 102 has been trying to record on his/her VCR by pressing the "REC" button a number of times. This frequency is detected by the device 101, which based thereon issues said help request 104. The device 101 might further be adapted to trigger different help scripts relating to the recording procedure. In such a case, the device 101 estimates that the user 102 needs help with recording on the device 101, and based thereon extracts a partial help script relating to this subject. In cases where it is impossible to estimate what kind of help the user 102 needs, the robot 103 might be adapted to communicate directly with the user 102 to find out what kind of help is needed.
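The frequency-based trigger above can be sketched as a per-function attempt counter. The threshold value, the class name, and the returned topic string are all illustrative assumptions; the patent only states that repeated attempts are detected and used to select a partial help script:

```python
from collections import Counter

class HelpTrigger:
    """Counts repeated attempts at a device function; once the attempts
    reach a threshold, the device infers the user needs help with that
    function and can extract the matching partial help script.
    The threshold of 3 is an illustrative choice."""

    def __init__(self, threshold=3):
        self.threshold = threshold
        self.attempts = Counter()

    def record_attempt(self, function_name):
        self.attempts[function_name] += 1
        if self.attempts[function_name] >= self.threshold:
            self.attempts[function_name] = 0  # reset after triggering
            # topic that selects a partial help script, e.g. recording
            return f"help_request:{function_name}"
        return None

# Pressing "REC" three times without success would trigger a request:
trigger = HelpTrigger()
for _ in range(3):
    result = trigger.record_attempt("REC")
```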
In one embodiment the help scripts comprise a complete help script for this particular device 101 or, as mentioned previously, a partial help script relating to the particular help that the user 102 needs, e.g. a script relating to recording on the user's VCR. Furthermore, the help request 104 preferably comprises an ID number of the device 101 for uniquely identifying the device, along with its coordinate position. The robot 103 may also be preprogrammed in such a way that it is already aware of the location of the device 101 with this particular ID number. The help request 104 and help scripts are preferably written in a markup language suitable for this particular robot 103, e.g. one for creating physical animations.
In another embodiment the help request 104 comprises only a request to help the user 102. In this embodiment the actual help needed by the user 102 is established orally by the user 102 towards the robot 103, wherein the robot 103 processes the user's request via a speech recognition process. This could be the case where the device 101 cannot find out what kind of help is needed. In that way, the robot 103 interacts directly with the device 101 by requesting the help script from the device 101 in order to assist the user 102 with the user's help request. The robot 103 may also be adapted to request said script from a database 105 over a wireless communication channel 106 such as the internet. In one embodiment the robot 103 has the help script for this device 101 already pre-stored, so that after processing the oral help request from the user the robot 103 can immediately help the user 102, e.g. via
animating to the user how to perform the function the user 102 was not able to perform. Such an animation is shown in Fig. 2, where the robot 103 illustrates to the user 102, via pointing or animating and/or orally, how to perform the specific function on the device 101 that the user 102 was not able to perform. In another embodiment the user shows the robot some domestic appliance, and the robot then explains how to use that appliance. For instance, the user might need help with a "hair dryer". The robot recognizes the appliance by means of a sensor (e.g. a camera) and based thereon helps the user.
In the case where more than one robot is present, the robot 103 must be selected from said number of robots based on various selection criteria. The criteria can comprise which one of the robots is closest to the device 101, which of the robots is the most suitable for this particular device (e.g. it can be preferred that the robot comprises a color display), or the user 102 might prefer a particular robot, i.e. his/her favorite robot. Such a selection procedure can be performed either on the device 101 side or on the robot side. In the former case, the device 101 determines, based on said one or more selection criteria, which robot to use. In case the selection criteria comprise which robot is closest to the device 101, the device might transmit a request to all the robots for their coordinate positions and, based on the responses from all the robots, select the closest one. The device 101 might also be adapted to receive a request from the user 102, such as an oral request, where the user can instruct the device 101 which robot he/she prefers.
In the latter case, where the robots select which one to use, the device 101 transmits a help request 104 to all the robots. By communicating with each other, e.g. by requesting from each other data giving the coordinate position or location of the surrounding robots along with their identification, the robots internally determine which one has the shortest distance to the device 101. In one embodiment, the selected robot 103 then transmits data to the device identifying itself and confirming that it is the robot that will assist the user 102. Other selection criteria used by the robots to select a robot to help the user 102 are whether or not the robot(s) is/are available, and/or which one of the robots has the help script for this device 101 pre-stored. Now, the device 101 can e.g. transmit a complete or partial help script to this selected robot (assuming the robot does not already have it stored), or the robot might download the script directly from a server (database) 105 over the
internet 106 or even from one of the surrounding robots. During or after executing the received help script, the robot 103 can start providing the user with the help needed. If it was not possible to estimate what kind of help the user 102 needs, the robot may be adapted to receive oral information from the user 102 about the help needed, e.g. "I don't know how to record on my VCR", whereby this user input is then processed and used for providing said animated help function for the user.
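The selection procedure discussed above — preferring the user's favorite robot when available, otherwise the closest robot that is not busy — can be sketched as a small selection function. The function name, the data layout, and the tie-breaking order (preference before distance) are illustrative assumptions:

```python
import math

def select_robot(device_pos, robots, user_preference=None):
    """Pick a robot for the help task using the selection criteria
    discussed in the text: the user's preferred robot if it is named
    and available, otherwise the closest available robot.

    `robots` maps a robot id to a (position, busy) pair; `position`
    is a coordinate tuple as reported by the robot."""
    if (user_preference is not None
            and user_preference in robots
            and not robots[user_preference][1]):
        return user_preference
    # keep only robots that are not busy (availability criterion)
    available = {rid: pos for rid, (pos, busy) in robots.items() if not busy}
    if not available:
        return None  # no robot can assist right now
    # distance criterion: shortest Euclidean distance to the device
    return min(available, key=lambda rid: math.dist(device_pos, available[rid]))
```

Whether this runs on the device (device-side selection) or is computed by the robots among themselves (robot-side selection) only changes where the position data is gathered; the criterion itself is the same.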
In one embodiment, if the user 102 is not able to perform the task despite the help provided by the robot, the robot 103 communicates directly with the device, which then carries out the function for the user. Figure 3 shows a device 101, 301 according to the present invention comprising an input means (I_M) 305 for receiving a help request from the user 102, a memory (M) 302 having e.g. pre-stored help scripts, and a processor (P) 303 which is e.g. adapted to generate said help request 104 based on a command from said user 102, wherein the processor (P) 303 may further be adapted to trigger different help scripts from the memory (M) 302. The command from the user 102 can be an oral command, wherein the processor 303 processes the oral command into a request which is then transmitted to the particular robot. Finally, the device 101, 301 comprises a transmitter (T) 304 for transmitting said help request and (when needed) said help scripts 104 to said robot 103. The memory 302 is not essential for the present invention, since the robot
103 might, as mentioned previously, obtain all the help scripts from a server (database) 105 over a wireless communication channel 106, e.g. the internet.
Figure 4 shows one embodiment of a robot 103, 401 according to the present invention comprising a receiver (R) 404, a memory (M) 402, a processor (P) 403 and communication means (C_M) 405. The receiver (R) 404 is e.g. adapted for receiving said help request 104 from the device 101, 301. The processor (P) 403 is e.g. adapted to process the various help requests received by the robot, or to determine which robot from said number of robots should be used in helping the user 102. The communication means (C_M) 405 is adapted to provide any kind of help for the user 102 and can e.g. comprise a display for displaying a help function or an animated help for the user. The communication means (C_M) can further comprise movable parts, e.g. an artificial hand, for creating a physical animation for the user, such as animating for the user how to record on the user's VCR. The
communication means (C_M) 405 can also comprise a speech recognizer for communicating orally with the user 102. In that way the user 102 can, as previously discussed, formulate the problem orally to the robot, e.g. "how do I record on this VCR?". Shown is also a database 406 which the robot 103, 401 may access, e.g. in cases where the device does not have help scripts pre-stored.
Figure 5 is a flowchart according to one embodiment of the present invention, where, after starting (S) 501 the device, the user needs help to perform a function on the device, e.g. to record or to adjust the timer. This can be indicated by the user by means of e.g. pressing a help button on the device, thereby generating a help request (H_R) 503. Another way of indicating the need for assistance could be based on the number of attempts to perform said function, e.g. the user has pressed the "REC" button on the VCR several times, whereas the correct way of recording would be to press the "REC" and the "PLAY" buttons simultaneously. The attempts made by the user nevertheless indicate the user's interest in recording. In that way, the device can be adapted to trigger different help scripts (T_H) 505 relating to the recording procedure. A downloadable help script (T_H) 505 is then formulated for the robot, preferably in a markup language for robots, e.g. for creating physical or any other kind of animation for the user. In this embodiment, where more than one robot is in the surroundings of the device, the device further selects the robot (S_R) 507 to be used for helping the user. As discussed previously under Figs. 1-3, the selection may be based on at least one selection parameter indicating which of the robots is closest to the device, which robot has the technical features most favorable for this particular device (e.g. the robot has a color display, arms etc.), or which robot is the user's favorite robot (the device could e.g. be adapted to ask the user which robot he/she prefers).
The help request (H_R) 503 and help scripts (T_H) 505 are now transmitted by the device (T_H_S) 509 to the selected robot, which then executes the request/script, approaches the user and helps him/her (H_U) 511. The robot might further be adapted to ask the user whether he/she needs more help (M_H) 513; if yes, the processing steps 505-511 are repeated, otherwise the helping function provided by the robot ends (E) 515.
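The device-side flow of the Fig. 5 flowchart (steps 501-515) can be sketched as a loop, with each flowchart step passed in as a callable. This is a structural sketch only; the function names and parameters are assumptions chosen to mirror the reference signs in the text:

```python
def device_help_flow(get_request, trigger_scripts, select_robot, send, ask_more_help):
    """Device-side loop mirroring the Fig. 5 flowchart:
    503 receive help request, 505 trigger/assemble help script,
    507 select robot, 509 transmit (robot then helps, 511),
    513 ask whether more help is needed, 515 end."""
    request = get_request()                  # 503: button press or attempt count
    while True:
        script = trigger_scripts(request)    # 505: assemble help script
        robot = select_robot()               # 507: pick robot per criteria
        send(robot, request, script)         # 509: transmit; robot helps (511)
        if not ask_more_help():              # 513: more help needed?
            break                            # 515: end of help function
```

In the Fig. 6 variant only step 507 moves: the device broadcasts to all robots and the robots select among themselves, so `select_robot` would run on the robot side instead.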
Figure 6 is a flowchart according to another embodiment of the present invention, where, after starting (S) 601 the device, the user needs help to perform a function on the device and indicates this by e.g. pressing a help button, whereby the device receives a help request (H_R) 603. Subsequently, the device triggers different help scripts (T_H) 605 and transmits (T_H_S) 607 a help script to one or more robots. The difference between this embodiment and the embodiment in Fig. 5 is that, in cases where more than one robot exists in the surroundings of the device, the robots select which one should assist the user. After the selection, the robot approaches the user and offers him/her help (H_U) 609. The robot might further be adapted to ask the user whether he/she needs more help (M_H) 611; if yes, the processing steps 605-609 are repeated, otherwise the helping function provided by the robot ends (E) 613.
In still another embodiment, the robot could be adapted to assist the user by communicating directly with the user. In that way, said help button could be used for "calling" the robot, i.e. only transmitting the help request, whereby the device is simultaneously uniquely identified to the robot. In this embodiment, the robot which receives the "call" approaches the device and asks the user what the problem is. The device itself could therefore be adapted only to call the robot, so that the intelligence is more or less on the robot side. Figure 7 is a flowchart seen from the robot's point of view, beginning with
(S) 701 the robot receiving a help request (R_H_S) 703 from the device, indicating which device is requesting help. A help script (H_S) 705 from the device may also be received, either sent as a separate document or comprised in the help request. After processing the received documents the robot approaches the device and helps the user (H_U) 707. This help can be in the form of e.g. pointing, animating, displaying etc., as discussed previously. The robot may further be adapted to ask the user whether he/she needs more help (M_H) 709. If the answer is yes, the processing steps 703-707 are repeated, otherwise the help is ended (E) 711. It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word 'comprising' does not exclude the presence of other elements or steps
than those listed in a claim. The invention can be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a device claim enumerating several means, several of these means can be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
Claims
1. A method of providing help to a user (102) in using a device (101, 301), wherein said help is provided by a robot (103, 401), the method comprising the steps of:
- receiving an input from said user (102) indicating a need for help in using said device (101, 301),
- processing said received input for generating a help request (104), and
- transmitting said help request (104) to said robot (103, 401),
wherein said robot is adapted to receive said help request (104) and based thereon provide said help.
2. A method according to claim 1, wherein said received input is obtained via push or touch button functions comprised in said device (101, 301), whereby, upon pressing said push or touch button, said indication for help is obtained.
3. A method according to claim 1, wherein said received input comprises an oral input from said user (102).
4. A method according to claim 1, wherein said help request (104) comprises the ID number of said device (101, 301) for uniquely identifying said device.
5. A method according to any of the preceding claims, further comprising transmitting help scripts to said robot, wherein said help scripts are assembled from pre-stored help scripts in said device (101, 301).
6. A method according to claim 5, wherein assembling said help scripts is based on the user's input or the frequency of the user's attempts to perform a particular function on said device (101, 301).
7. A method according to any of the preceding claims, wherein said robot (103, 401) is selected from a number of robots based on at least one selection criterion.
8. A method according to claim 7, wherein said selection criterion is based on at least one selection parameter from a group of selection parameters consisting of:
- the coordinate position of said robot (103, 401) relative to said device (101, 301),
- the functional features provided by the robots,
- the user's favorite robot, and
- whether or not the robots are busy.
9. A computer-readable medium having stored therein instructions for causing a processing unit to execute a method according to any of claims 1-8.
10. A device (101, 301) adapted to provide help to a user (102) in using said device (101, 301), wherein said help is provided by a robot (103, 401), comprising:
- input means (305) for receiving an input from said user (102) indicating a need for help in using said device (101, 301),
- a processor (303) for processing said received input and generating a help request (104), and
- a transmitter (304) for transmitting said help request (104) to said robot (103, 401),
wherein said robot is adapted to receive said help request (104) and based thereon provide said help.
11. A device according to claim 10, wherein said input means (305) for receiving said help request comprises a speech recognizer for receiving an oral input from said user (102).
12. A robot (103, 401) adapted to provide help to a user (102) in using a device (101, 301), comprising:
- a receiver (404) for receiving a help request (104) from said device (101, 301) indicating a need for help in using said device (101, 301),
- a processor (403) for processing the received help request (104), and
- a communication means (405) for communicating with said user (102) when providing said help.
13. A robot according to claim 12, wherein said processor (403), in case of more than one robot being present, is adapted to select the robot (103, 401) to provide said animated help function.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP05103851.1 | 2005-05-10 | ||
EP05103851 | 2005-05-10 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2006120636A2 true WO2006120636A2 (en) | 2006-11-16 |
WO2006120636A3 WO2006120636A3 (en) | 2007-05-03 |
Family
ID=37054751
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2006/051442 WO2006120636A2 (en) | 2005-05-10 | 2006-05-09 | Method and device for providing help to a user in using a device by using a robot |
Country Status (2)
Country | Link |
---|---|
TW (1) | TW200712807A (en) |
WO (1) | WO2006120636A2 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9534906B2 (en) | 2015-03-06 | 2017-01-03 | Wal-Mart Stores, Inc. | Shopping space mapping systems, devices and methods |
US10017322B2 (en) | 2016-04-01 | 2018-07-10 | Wal-Mart Stores, Inc. | Systems and methods for moving pallets via unmanned motorized unit-guided forklifts |
US10346794B2 (en) | 2015-03-06 | 2019-07-09 | Walmart Apollo, Llc | Item monitoring system and method |
US11046562B2 (en) | 2015-03-06 | 2021-06-29 | Walmart Apollo, Llc | Shopping facility assistance systems, devices and methods |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002337079A (en) * | 2001-05-11 | 2002-11-26 | Sony Corp | Device/method for processing information, recording medium and program |
EP1477908A1 (en) * | 2003-05-14 | 2004-11-17 | Wilfried Beck | Communication system and method for improved use of electrical device |
US6879862B2 (en) * | 2000-02-28 | 2005-04-12 | Roy-G-Biv Corporation | Selection and control of motion data |
-
2006
- 2006-05-09 WO PCT/IB2006/051442 patent/WO2006120636A2/en active Application Filing
- 2006-05-10 TW TW095116582A patent/TW200712807A/en unknown
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6879862B2 (en) * | 2000-02-28 | 2005-04-12 | Roy-G-Biv Corporation | Selection and control of motion data |
JP2002337079A (en) * | 2001-05-11 | 2002-11-26 | Sony Corp | Device/method for processing information, recording medium and program |
EP1477908A1 (en) * | 2003-05-14 | 2004-11-17 | Wilfried Beck | Communication system and method for improved use of electrical device |
Non-Patent Citations (2)
Title |
---|
VAN BREEMEN A J N: "Animation engine for believable interactive user-interface robots" INTELLIGENT ROBOTS AND SYSTEMS, 2004. (IROS 2004). PROCEEDINGS. 2004 IEEE/RSJ INTERNATIONAL CONFERENCE ON SENDAI, JAPAN 28 SEPT.-2 OCT., 2004, PISCATAWAY, NJ, USA,IEEE, vol. 3, 28 September 2004 (2004-09-28), pages 2873-2878, XP010766156 ISBN: 0-7803-8463-6 * |
YOSHIMI T ET AL: "Development of a concept model of a robotic information home appliance, aprialpha" INTELLIGENT ROBOTS AND SYSTEMS, 2004. (IROS 2004). PROCEEDINGS. 2004 IEEE/RSJ INTERNATIONAL CONFERENCE ON SENDAI, JAPAN 28 SEPT.-2 OCT., 2004, PISCATAWAY, NJ, USA,IEEE, vol. 1, 28 September 2004 (2004-09-28), pages 205-211, XP010765635 ISBN: 0-7803-8463-6 * |
Cited By (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10287149B2 (en) | 2015-03-06 | 2019-05-14 | Walmart Apollo, Llc | Assignment of a motorized personal assistance apparatus |
US9875502B2 (en) | 2015-03-06 | 2018-01-23 | Wal-Mart Stores, Inc. | Shopping facility assistance systems, devices, and methods to identify security and safety anomalies |
US9801517B2 (en) | 2015-03-06 | 2017-10-31 | Wal-Mart Stores, Inc. | Shopping facility assistance object detection systems, devices and methods |
US10280054B2 (en) | 2015-03-06 | 2019-05-07 | Walmart Apollo, Llc | Shopping facility assistance systems, devices and methods |
US9875503B2 (en) | 2015-03-06 | 2018-01-23 | Wal-Mart Stores, Inc. | Method and apparatus for transporting a plurality of stacked motorized transport units |
US9896315B2 (en) | 2015-03-06 | 2018-02-20 | Wal-Mart Stores, Inc. | Systems, devices and methods of controlling motorized transport units in fulfilling product orders |
US9908760B2 (en) | 2015-03-06 | 2018-03-06 | Wal-Mart Stores, Inc. | Shopping facility assistance systems, devices and methods to drive movable item containers |
US9994434B2 (en) | 2015-03-06 | 2018-06-12 | Wal-Mart Stores, Inc. | Overriding control of motorize transport unit systems, devices and methods |
US11840814B2 (en) | 2015-03-06 | 2023-12-12 | Walmart Apollo, Llc | Overriding control of motorized transport unit systems, devices and methods |
US10071891B2 (en) | 2015-03-06 | 2018-09-11 | Walmart Apollo, Llc | Systems, devices, and methods for providing passenger transport |
US10239739B2 (en) | 2015-03-06 | 2019-03-26 | Walmart Apollo, Llc | Motorized transport unit worker support systems and methods |
US10071893B2 (en) | 2015-03-06 | 2018-09-11 | Walmart Apollo, Llc | Shopping facility assistance system and method to retrieve in-store abandoned mobile item containers |
US9534906B2 (en) | 2015-03-06 | 2017-01-03 | Wal-Mart Stores, Inc. | Shopping space mapping systems, devices and methods |
US10130232B2 (en) | 2015-03-06 | 2018-11-20 | Walmart Apollo, Llc | Shopping facility assistance systems, devices and methods |
US10138100B2 (en) | 2015-03-06 | 2018-11-27 | Walmart Apollo, Llc | Recharging apparatus and method |
US10189691B2 (en) | 2015-03-06 | 2019-01-29 | Walmart Apollo, Llc | Shopping facility track system and method of routing motorized transport units |
US10189692B2 (en) | 2015-03-06 | 2019-01-29 | Walmart Apollo, Llc | Systems, devices and methods for restoring shopping space conditions |
US11761160B2 (en) | 2015-03-06 | 2023-09-19 | Walmart Apollo, Llc | Apparatus and method of monitoring product placement within a shopping facility |
US10239740B2 (en) | 2015-03-06 | 2019-03-26 | Walmart Apollo, Llc | Shopping facility assistance system and method having a motorized transport unit that selectively leads or follows a user within a shopping facility |
US10239738B2 (en) | 2015-03-06 | 2019-03-26 | Walmart Apollo, Llc | Apparatus and method of monitoring product placement within a shopping facility |
US10071892B2 (en) | 2015-03-06 | 2018-09-11 | Walmart Apollo, Llc | Apparatus and method of obtaining location information of a motorized transport unit |
US9757002B2 (en) | 2015-03-06 | 2017-09-12 | Wal-Mart Stores, Inc. | Shopping facility assistance systems, devices and methods that employ voice input |
US10081525B2 (en) | 2015-03-06 | 2018-09-25 | Walmart Apollo, Llc | Shopping facility assistance systems, devices and methods to address ground and weather conditions |
US10315897B2 (en) | 2015-03-06 | 2019-06-11 | Walmart Apollo, Llc | Systems, devices and methods for determining item availability in a shopping space |
US10336592B2 (en) | 2015-03-06 | 2019-07-02 | Walmart Apollo, Llc | Shopping facility assistance systems, devices, and methods to facilitate returning items to their respective departments |
US10346794B2 (en) | 2015-03-06 | 2019-07-09 | Walmart Apollo, Llc | Item monitoring system and method |
US10351399B2 (en) | 2015-03-06 | 2019-07-16 | Walmart Apollo, Llc | Systems, devices and methods of controlling motorized transport units in fulfilling product orders |
US10351400B2 (en) | 2015-03-06 | 2019-07-16 | Walmart Apollo, Llc | Apparatus and method of obtaining location information of a motorized transport unit |
US10358326B2 (en) | 2015-03-06 | 2019-07-23 | Walmart Apollo, Llc | Shopping facility assistance systems, devices and methods |
US10435279B2 (en) | 2015-03-06 | 2019-10-08 | Walmart Apollo, Llc | Shopping space route guidance systems, devices and methods |
US10486951B2 (en) | 2015-03-06 | 2019-11-26 | Walmart Apollo, Llc | Trash can monitoring systems and methods |
US10508010B2 (en) | 2015-03-06 | 2019-12-17 | Walmart Apollo, Llc | Shopping facility discarded item sorting systems, devices and methods |
US10570000B2 (en) | 2015-03-06 | 2020-02-25 | Walmart Apollo, Llc | Shopping facility assistance object detection systems, devices and methods |
US10597270B2 (en) | 2015-03-06 | 2020-03-24 | Walmart Apollo, Llc | Shopping facility track system and method of routing motorized transport units |
US10611614B2 (en) | 2015-03-06 | 2020-04-07 | Walmart Apollo, Llc | Shopping facility assistance systems, devices and methods to drive movable item containers |
US10633231B2 (en) | 2015-03-06 | 2020-04-28 | Walmart Apollo, Llc | Apparatus and method of monitoring product placement within a shopping facility |
US10669140B2 (en) | 2015-03-06 | 2020-06-02 | Walmart Apollo, Llc | Shopping facility assistance systems, devices and methods to detect and handle incorrectly placed items |
US10815104B2 (en) | 2015-03-06 | 2020-10-27 | Walmart Apollo, Llc | Recharging apparatus and method |
US10875752B2 (en) | 2015-03-06 | 2020-12-29 | Walmart Apollo, Llc | Systems, devices and methods of providing customer support in locating products |
US11034563B2 (en) | 2015-03-06 | 2021-06-15 | Walmart Apollo, Llc | Apparatus and method of monitoring product placement within a shopping facility |
US11046562B2 (en) | 2015-03-06 | 2021-06-29 | Walmart Apollo, Llc | Shopping facility assistance systems, devices and methods |
US11679969B2 (en) | 2015-03-06 | 2023-06-20 | Walmart Apollo, Llc | Shopping facility assistance systems, devices and methods |
US10214400B2 (en) | 2016-04-01 | 2019-02-26 | Walmart Apollo, Llc | Systems and methods for moving pallets via unmanned motorized unit-guided forklifts |
US10017322B2 (en) | 2016-04-01 | 2018-07-10 | Wal-Mart Stores, Inc. | Systems and methods for moving pallets via unmanned motorized unit-guided forklifts |
Also Published As
Publication number | Publication date |
---|---|
TW200712807A (en) | 2007-04-01 |
WO2006120636A3 (en) | 2007-05-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8554250B2 (en) | Remote object recognition | |
JP5605725B2 (en) | Information notification system, information notification method, information processing apparatus, control method thereof, and control program | |
JP2007116270A (en) | Terminal and apparatus control system | |
US20180104816A1 (en) | Robot device and non-transitory computer readable medium | |
CN113287175B (en) | Interactive health state assessment method and system thereof | |
CN103717358A (en) | Control system, display control method, and non-transitory computer readable storage medium | |
KR102466438B1 (en) | Cognitive function assessment system and method of assessing cognitive funtion | |
CN110459211A (en) | Interactive method, client, electronic equipment and storage medium | |
WO2006120636A2 (en) | Method and device for providing help to a user in using a device by using a robot | |
CN112860059A (en) | Image identification method and device based on eyeball tracking and storage medium | |
CN109840119A (en) | A kind of terminal applies starting method, terminal and computer readable storage medium | |
CN113064642A (en) | Method and system for starting working process of electric control equipment, terminal and electric control equipment | |
WO2005106633A2 (en) | Method and system for control of an application | |
CN111400539A (en) | Voice questionnaire processing method, device and system | |
CN104239842A (en) | Visual sense identification realization method, device and system | |
CN111309992B (en) | Intelligent robot response method, system, robot and storage medium | |
CN107077696B (en) | Man-machine matching device, matching system, man-machine matching method, and man-machine matching program | |
EP4057606A1 (en) | Information processing device, information processing system, of information processing method, and carrier means | |
JP6167675B2 (en) | Human-machine matching device, matching system, human-machine matching method, and human-machine matching program | |
CN113325722B (en) | Multi-mode implementation method and device for intelligent cooking and intelligent cabinet | |
CN112424731B (en) | Information processing apparatus, information processing method, and recording medium | |
JP6721023B2 (en) | Information processing device and program | |
CN105912308A (en) | Method and device for displaying application program fitting | |
JP6721024B2 (en) | Information processing device and program | |
CN113442135A (en) | Robot programming method, robot programming apparatus, robot, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
NENP | Non-entry into the national phase in: |
Ref country code: DE |
WWW | Wipo information: withdrawn in national office |
Country of ref document: DE |
NENP | Non-entry into the national phase in: |
Ref country code: RU |
WWW | Wipo information: withdrawn in national office |
Country of ref document: RU |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 06744881 Country of ref document: EP Kind code of ref document: A2 |
122 | Ep: pct application non-entry in european phase |
Ref document number: 06744881 Country of ref document: EP Kind code of ref document: A2 |