US10987804B2 - Robot device and non-transitory computer readable medium - Google Patents
- Publication number
- US10987804B2 (application US15/642,665)
- Authority
- US
- United States
- Prior art keywords
- function
- robot device
- solution
- robot
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/163—Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1661—Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
- B25J11/00—Manipulators not otherwise provided for
- B25J11/008—Manipulators for service tasks
Definitions
- the present invention relates to a robot device and a non-transitory computer readable medium.
- a robot device including a detector and a controller.
- the detector detects a surrounding situation.
- when the robot device is unable to solve a detected problem by itself, the controller performs control so that a solution for the problem will be executed by using an element other than the robot device.
- FIG. 1 is a block diagram of a device system according to the exemplary embodiment
- FIG. 2 illustrates the external appearance of a robot device according to the exemplary embodiment
- FIG. 3 is a block diagram of the robot device according to the exemplary embodiment
- FIG. 4 is a block diagram of a terminal device
- FIGS. 5 and 6 illustrate characteristics of wireless communication technologies
- FIG. 7 illustrates a provided-function management table
- FIG. 8 illustrates a non-provided-function management table
- FIG. 9 illustrates a solution management table
- FIG. 10 illustrates a device function management table
- FIG. 11 illustrates a collaborative function management table
- FIG. 12 is a flowchart illustrating an overview of situation checking processing
- FIG. 13 is a flowchart illustrating details of the situation checking processing
- FIG. 14 is a flowchart illustrating processing executed in a user decision mode
- FIG. 15 is a flowchart illustrating processing executed in an autonomous decision mode
- FIG. 16 is a flowchart illustrating control processing for executing a solution
- FIG. 17 is a view for explaining an application scene 1
- FIG. 18 is a view for explaining an application scene 2
- FIG. 19 is a view for explaining an application scene 3
- FIG. 20 is a view for explaining an application scene 4
- FIGS. 21 through 30 illustrate examples of screens
- FIG. 31 is a schematic view illustrating the external appearance of a multifunction device
- FIGS. 32 through 35 illustrate examples of screens
- FIG. 36 is a sequence diagram illustrating connection processing
- FIG. 37 illustrates a collaborative function management table
- FIGS. 38A through 39B illustrate examples of screens
- FIG. 40 illustrates a device function management table
- FIGS. 41A and 41B illustrate examples of screens
- FIG. 42 illustrates a device function management table
- FIG. 43 illustrates a collaborative function management table
- FIGS. 44A through 46B illustrate examples of screens.
- FIG. 1 illustrates an example of such a device system.
- the device system includes a robot device 10 , one or plural devices 12 , and a terminal device 14 .
- the robot device 10 , the device 12 , and the terminal device 14 may communicate with another device via a communication path N, such as a network, or via different individual communication paths.
- the configuration of the device system shown in FIG. 1 is only an example, and the robot device 10 , the device 12 , and the terminal device 14 may not necessarily communicate with another device.
- Although the single device 12 is included in the device system in the example in FIG. 1, plural devices 12 may be included in the device system. In this case, plural devices 12 having the same function may be included, or plural devices 12 having different functions may be included.
- Similarly, although the single terminal device 14 is included in the device system in FIG. 1, plural terminal devices 14 may be included in the device system. Another device, such as a server, may also be included.
- the robot device 10 has the function of detecting the situation around the robot device 10 and finding a problem in the detected situation.
- the robot device 10 may detect the surrounding situation while moving or while stationary. When the robot device 10 has found a problem, it determines whether it is able to solve the problem by itself and performs processing according to the determination result. If the robot device 10 is able to solve the problem by itself, it solves the problem by executing a solution for solving this problem. If the robot device 10 is unable to solve the problem by itself, it performs control so that this solution can be executed by using an element other than the robot device 10. Examples of the element are a human, a device other than the robot device 10, and a robot device other than the robot device 10. The robot device 10 itself may be an example of the element.
- the robot device 10 may cause another device to solve the problem, request a human to solve the problem, cause another device and a human to solve the problem, or solve the problem by collaborative work with another device and a human.
- Examples of problems to be detected by the robot device 10 are those which can be solved only by the robot device 10 , those which can be solved only by a device other than the robot device 10 , those which can be solved only by a human, those which can be solved by collaborative work between another device and a human, those which can be solved by collaborative work between the robot device 10 and a human, those which can be solved by collaborative work between another device and the robot device 10 , those which can be solved by collaborative work among another device, another robot device, and the robot device 10 , and those which can be solved by collaborative work among another device, a human, and the robot device 10 .
- examples of the solution are a solution implemented only by the robot device 10, a solution implemented only by another device, a solution implemented only by a human, a solution implemented by collaborative work between another device and a human, a solution implemented by collaborative work between the robot device 10 and a human, a solution implemented by collaborative work between another device and the robot device 10, a solution implemented by collaborative work among another device, another robot device, and the robot device 10, and a solution implemented by collaborative work among another device, a human, and the robot device 10.
- the device 12 is a device having a specific function.
- Examples of the device 12 are an image forming device having an image forming function, a personal computer (PC), a display device such as a liquid crystal display or a projector, an aroma diffuser that diffuses aromas, a telephone set, a clock, a watch, a camera, a monitor camera, an automatic vending machine, an air conditioner, an electric fan, and a humidifier.
- a device other than these examples may be included in the device system. Some devices may be considered as robot devices, depending on the functions of such devices.
- the terminal device 14 is a PC, a tablet PC, a smartphone, or a cellular phone, for example.
- the terminal device 14 is used by a user to execute a solution for solving a problem, for example.
- the robot device 10 detects a surrounding situation and finds a problem in the detected situation. The robot device 10 then performs control so that a solution for solving the problem can be executed in accordance with the type of problem.
- FIG. 2 illustrates the external appearance of the robot device 10 .
- the robot device 10 is a humanoid robot, for example.
- the robot device 10 may alternatively be another type of robot.
- the robot device 10 has a torso part 16 , a head part 18 provided above the torso part 16 , a leg part 20 provided below the torso part 16 , arm parts 22 provided on both sides of the torso part 16 , and finger parts 24 provided at the tips of the arm parts 22 .
- the robot device 10 has various sensors, such as a visual sensor, a hearing sensor, a touch sensor, a taste sensor, and an odor sensor, and thus has functions corresponding to the five senses of humans: sight, hearing, touch, taste, and smell. Concerning the sense of touch, for example, the robot device 10 has the capability to understand and distinguish superficial sensation (such as touch, pain, and temperature), deep sensation (such as pressure, position, and vibration), and cortical sensation (such as two-point perception and stereo perception) from each other.
- the robot device 10 also has the sense of balance.
- a sensor, such as a camera 26, is provided in the head part 18 of the robot device 10.
- the sense of sight of the robot device 10 is achieved by recognizing images captured by the camera 26 .
- a voice collector, such as a microphone, is provided in the robot device 10.
- the sense of hearing of the robot device 10 is achieved by recognizing voice obtained by the microphone.
- the robot device 10 may include a detector for detecting the brain waves of a human.
- a brain wave detector is attached to a human, and the detector provided in the robot device 10 receives information transmitted from the brain wave detector.
- the leg part 20 corresponds to a moving unit, and is driven by a driving force from a drive source, such as a motor.
- the robot device 10 is able to move by using the leg part 20 .
- the leg part 20 may have the shape of human legs or may be a roller or a tire.
- the leg part 20 may have another shape.
- the leg part 20 is only an example of the moving unit.
- the robot device 10 may include another moving unit, for example, a component for flying, such as a propeller, a wing, or an airplane engine, or a component for moving under the water, such as an underwater engine. That is, the robot device 10 may include, as a moving unit, at least one of a component for land moving, a component for flying, and a component for moving under the water.
- the robot device 10 may not necessarily include any moving unit.
- the robot device 10 may have the capability to catch or carry an object by using the arm parts 22 and the finger parts 24 .
- the robot device 10 may have the capability to move while catching or holding an object.
- the robot device 10 may have the function of outputting sound.
- the robot device 10 may have a communication function to send and receive data with another device.
- the robot device 10 may have the capability to communicate with a human, another device, or another robot device by emitting sound or sending a communication message.
- the robot device 10 may have the capability to make decisions similar to those achieved by a human through machine learning using artificial intelligence (AI). Neural-network deep learning or reinforcement learning for partially reinforcing learning fields may be utilized.
- the robot device 10 may have the function of searching for information (solutions for solving a problem, for example) by using the Internet, for example.
- the robot device 10 may control the operation of another device by communicating with this device using a communication function.
- the robot device 10 may operate another device by using a remote controller or directly operate this device without using a remote controller.
- the robot device 10 manipulates an operation unit (such as buttons or a panel) provided in this device. If the robot device 10 is unable to control the operation of another device by communicating with this device, it may operate the device by using a remote controller or may directly operate this device.
- the robot device 10 can identify the operation unit of another device or a remote controller to operate this device or the remote controller.
- the robot device 10 may include a display 28 .
- On the display 28, information concerning problems, information concerning solutions, various messages, and so on is displayed.
- a communication unit 30, which is a communication interface, has the functions of sending data to another device and receiving data from another device.
- the communication unit 30 may be a communication interface having a wireless communication function or a communication interface having a wired communication function.
- the communication unit 30 supports one or plural communication methods, and communicates with a communication party in accordance with the communication method suitable for the communication party (that is, the communication method supported by the communication party). Examples of the communication methods are infrared communication, visible light communication, Wi-Fi (registered trademark) communication, near field communication (such as Bluetooth (registered trademark)), and a radio frequency identifier (RFID).
- the communication unit 30 switches the communication methods in accordance with the communication party or the surrounding environments (the distance between the robot device 10 and the communication party or the presence or the absence of an obstacle between the robot device 10 and the communication party, for example).
- the frequency band used for communication may be a low frequency band of 800 to 920 MHz, such as low power wide area (LPWA), or a high frequency band of 2.4 GHz or 5 GHz (such as MuLTEfire).
- the communication unit 30 switches the frequency band in accordance with the communication party or the communication method in accordance with the surrounding environments.
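The switching behavior described above can be sketched as follows. This is an illustrative sketch, not an implementation from the patent; the preference ordering, range limit, and line-of-sight rules are all assumptions.

```python
# Hypothetical sketch of communication-method selection in the communication
# unit 30: pick a method the communication party supports, then rule out
# methods unsuited to the surrounding environment (distance to the party,
# presence of an obstacle). Ordering and limits are invented.
SUPPORTED_METHODS = ["wifi", "bluetooth", "infrared", "visible_light", "rfid"]

def choose_method(party_methods, distance_m, obstacle_present):
    for method in SUPPORTED_METHODS:
        if method not in party_methods:
            continue  # the communication party does not support this method
        if method in ("infrared", "visible_light") and obstacle_present:
            continue  # optical methods assumed to need a clear line of sight
        if method == "bluetooth" and distance_m > 10:
            continue  # near field communication assumed to be short range
        return method
    return None  # no mutually usable method
```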
- a storage unit 32 is a storage device, such as a hard disk or a memory (solid-state drive (SSD), for example).
- In the storage unit 32, provided-function management information 34, non-provided-function management information 36, solution management information 38, device function management information 40, and collaborative function management information 42 are stored.
- Various items of data and various programs are also stored.
- Device address information indicating the addresses of other devices may also be stored in the storage unit 32 . The above-described items of information may be stored in the same storage device or in different storage devices.
- the provided-function management information 34 is information indicating functions provided in the robot device 10 .
- the provided-function management information 34 also indicates the association between the individual functions provided in the robot device 10 and individual operations (including processing and manipulation) executable by using these functions. Referring to the provided-function management information 34 makes it possible to specify (identify) operations executable by the robot device 10 . Functions that are not indicated in the provided-function management information 34 are those that are not provided in the robot device 10 . Referring to the provided-function management information 34 may thus make it possible to specify (identify) operations that are not executable by the robot device 10 .
- the non-provided-function management information 36 is information indicating functions that are not provided in the robot device 10 .
- the non-provided-function management information 36 also associates the individual functions that are not provided in the robot device 10 and individual operations (including processing and manipulation) that are not executable by the robot device 10 by not having these functions. Referring to the non-provided-function management information 36 makes it possible to specify (identify) operations that are not executable by the robot device 10 . Functions that are not indicated in the non-provided-function management information 36 may be those that are provided in the robot device 10 . Referring to the non-provided-function management information 36 may thus make it possible to specify (identify) operations that are executable by the robot device 10 .
- Both of the provided-function management information 34 and the non-provided-function management information 36 may be stored in the storage unit 32 so that operations that are executable or those that are not executable by the robot device 10 can be specified based on the provided-function management information 34 and the non-provided-function management information 36 .
- one of the provided-function management information 34 and the non-provided-function management information 36 may be stored in the storage unit 32 so that operations that are executable or those that are not executable by the robot device 10 can be specified based on the provided-function management information 34 or the non-provided-function management information 36 stored in the storage unit 32 .
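As a concrete illustration, the two tables and the lookups they enable might look like the following sketch; the function and operation names are invented, not taken from the patent.

```python
# Sketch of the provided-function management information 34 and the
# non-provided-function management information 36: each function is
# associated with the operations it makes executable (or inexecutable).
provided_functions = {
    "lifting function": ["carry a cup", "pick up trash"],
    "speech function": ["greet a user", "read a message aloud"],
}
non_provided_functions = {
    "print function": ["print a document"],
}

def is_executable(operation):
    # An operation is executable if some provided function covers it.
    return any(operation in ops for ops in provided_functions.values())

def is_known_inexecutable(operation):
    return any(operation in ops for ops in non_provided_functions.values())
```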
- the solution management information 38 is information indicating solutions for solving problems (how to solve a problem and how to act to solve the problem).
- the solution management information 38 also indicates the association between individual problems which are likely to occur and solutions for solving these problems.
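The association between problems and solutions might be sketched as a simple lookup table; the entries below are invented examples, not taken from the patent.

```python
# Sketch of the solution management information 38: problems likely to
# occur, each associated with a solution for solving it.
solution_management = {
    "room temperature too high": "turn on the air conditioner",
    "paper copy requested": "use the image forming device",
}

def specify_solution(problem):
    # Returns None when no solution is registered for the problem.
    return solution_management.get(problem)
```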
- the device function management information 40 is information for managing the functions of devices other than the robot device 10 .
- the device function management information 40 indicates the association between device identification information for identifying devices and function information indicating the functions provided in the devices.
- Examples of the device identification information are a device ID, a device name, a device type, a model number of a device, a position at which a device is installed (device position information), and an image representing the external appearance of a device.
- Examples of the function information are a function ID and a function name.
- an image forming device which is a device other than the robot device 10 , has a scan function, a copy function, and a scan transfer function
- function information indicating the scan function, function information indicating the copy function, and function information indicating the scan transfer function are associated with the device identification information for identifying the image forming device.
- the device function management information 40 makes it possible to specify (identify) the functions of individual devices.
- Examples of devices managed by the device function management information 40 are those included in the device system, such as the device 12 . Devices which are not included in the device system may also be managed by the device function management information 40 .
- the robot device 10 may obtain information (including device identification information and function information) concerning a new device which is not included in the device system and register the obtained information in the device function management information 40 . Information concerning a new device may be obtained by using the Internet, for example, or as a result of an administrator, for example, inputting such information.
- the robot device 10 may update the device function management information 40 regularly or at a certain timing or a timing specified by the administrator, for example.
- function information concerning a new function which has not been provided in a device before the updating operation may be registered in the device function management information 40 after the updating operation.
- function information concerning a function which has been provided in a device before the updating operation may be deleted from the device function management information 40 or may be registered as information indicating that this function is disabled. Updating information may be obtained by using the Internet, for example, or as a result of the administrator, for example, inputting such information.
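The registration and updating operations described above might be sketched as follows; the device IDs and function names are assumptions for illustration.

```python
# Sketch of the device function management information 40: device
# identification information mapped to function information, with the
# registration and updating operations described above.
device_function_management = {
    "image_forming_device_01": {"scan", "copy", "scan transfer"},
}

def register_device(device_id, functions):
    # Register a device, e.g. a new one not yet included in the device system.
    device_function_management[device_id] = set(functions)

def update_device(device_id, added=(), removed=()):
    funcs = device_function_management.setdefault(device_id, set())
    funcs |= set(added)    # functions newly provided after the update
    funcs -= set(removed)  # functions deleted (or disabled) by the update
```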
- the collaborative function management information 42 is information for managing collaborative functions that are executable by combining plural functions. By combining plural functions, one or plural collaborative functions are executed. A collaborative function may be executed by combining plural functions of one device or by combining plural functions of plural devices. A device that provides an operation instruction (terminal device 14 , for example) may be included in devices to be identified, and functions provided in such a device may be used as part of a collaborative function. Functions provided in the robot device 10 may also be used as part of a collaborative function.
- a collaborative function may be a function that is executable without using a hardware device.
- a collaborative function may be a function that is executable by combining plural software items.
- a collaborative function may be a function that is executable by combining a function of a hardware device and a function of software.
- the collaborative function management information 42 indicates the association between a combination of items of function information concerning functions to be combined for a collaborative function and collaborative function information indicating this collaborative function.
- the collaborative function information indicates a collaborative function ID and a collaborative function name, for example.
- the collaborative function management information 42 is also updated accordingly. Because of this updating operation, a collaborative function that is not executable by combining plural functions before the updating operation may become executable after the updating operation. In contrast, a collaborative function that is executable by combining plural functions before the updating operation may become inexecutable after the updating operation.
- Collaborative function information indicating a collaborative function that has become executable after the updating operation may be registered in the collaborative function management information 42 .
- Collaborative function information indicating a collaborative function that has become inexecutable after the updating operation may be deleted from the collaborative function management information 42 or may be registered as information indicating that this collaborative function is disabled.
- the collaborative function management information 42 is information for managing collaborative functions that are executable by combining plural functions of plural devices.
- the collaborative function management information 42 indicates the association between a combination of items of device identification information for identifying individual devices used for a collaborative function and collaborative function information indicating this collaborative function.
- the collaborative function management information 42 is also updated accordingly. Because of this updating operation, a collaborative function that is not executable by combining plural functions of plural devices before the updating operation may become executable after the updating operation. In contrast, a collaborative function that is executable by combining plural functions of plural devices before the updating operation may become inexecutable after the updating operation.
- a collaborative function may be a function executed by combining different functions or a function executed by combining the same functions of different devices.
- a collaborative function may be a function that is not executable unless different functions are combined or the same functions of different devices are combined.
- Such a collaborative function may be a function that is executable by combining different functions or a function that is executable by combining the same functions of different devices. For example, by combining a device (printer) having a print function and a device (scanner) having a scan function, a copy function is implemented as a collaborative function. That is, by combining a print function and a scan function, a copy function is implemented.
- in this case, collaborative function information indicating a collaborative function (that is, a copy function) is associated with a combination of device identification information for identifying a device having a print function and device identification information for identifying a device having a scan function.
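Using the copy example above, the association might be sketched like this; the table structure and names are assumptions for illustration.

```python
# Sketch of the collaborative function management information 42: a
# combination of functions is associated with the collaborative function
# it implements (here, print + scan -> copy, as in the example above).
collaborative_function_management = {
    frozenset(["print function", "scan function"]): "copy function",
}

def specify_collaborative_function(functions):
    # frozenset makes the lookup order-independent.
    return collaborative_function_management.get(frozenset(functions))
```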
- available function management information may be stored.
- the available function management information is information for managing functions available to an individual user, and indicates the association between user identification information for identifying a user and function information (may include collaborative function information) indicating functions available to this user.
- Functions available to a user may be those provided to the user free of charge and those purchased by the user. Such functions may be functions to be singly used or collaborative functions.
- the user identification information is user account information indicating a user ID and a user name, for example. Referring to the available function management information makes it possible to identify (specify) functions available to an individual user.
- the available function management information is updated every time a function is provided to a user (every time a function is provided to a user for a charge or free of charge, for example).
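A minimal sketch of this per-user bookkeeping follows; the user IDs and function names are invented.

```python
# Sketch of the available function management information: user account
# information associated with the functions available to that user.
available_function_management = {"user_a": {"copy function"}}

def provide_function(user_id, function):
    # Called every time a function is provided to a user (charged or free).
    available_function_management.setdefault(user_id, set()).add(function)

def is_available_to(user_id, function):
    return function in available_function_management.get(user_id, set())
```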
- At least one of the provided-function management information 34, the non-provided-function management information 36, the solution management information 38, the device function management information 40, the collaborative function management information 42, and the available function management information may be stored in a device other than the robot device 10 (such as a server, which is not shown, the device 12, or the terminal device 14). In this case, such information stored in another device may not necessarily be stored in the storage unit 32 of the robot device 10.
- a situation information collector 44 has a function of collecting information concerning situations around the robot device 10 by using various sensors. Hereinafter, such information will be called “situation information”.
- As the situation information collector 44, the above-described visual sensor, hearing sensor, touch sensor, taste sensor, and odor sensor may be used.
- the robot device 10 performs image recognition as a result of the visual sensor capturing images (such as video images and still images) around the robot device 10 .
- the robot device 10 performs voice recognition as a result of the hearing sensor picking up sound (including voice) around the robot device 10 .
- the temperature, humidity, and odor around the robot device 10 are detected by using other sensors. Sensors other than the above-described sensors may also be used to collect information concerning the situations around the robot device 10 .
- the situation information collector 44 may collect situation information from devices and sensors other than the robot device 10 .
- a moving unit 46 has the function of moving the robot device 10 by using at least one of a component for land moving, a component for flying, and a component for moving under the water.
- the moving unit 46 is constituted by the leg part 20 shown in FIG. 2 , for example.
- An operating unit 48 has the functions of operating devices other than the robot device 10 and lifting and carrying objects.
- the operating unit 48 is constituted by the leg part 20 , the arm parts 22 , and the finger parts 24 shown in FIG. 2 , for example.
- a user interface (UI) 50 includes a display (display 28 shown in FIG. 2 , for example) and an operation unit.
- the display is a display device, such as a liquid crystal display.
- the operation unit is an input device, such as a touch panel or a keyboard.
- the UI 50 may be a user interface serving as both of a display and an operation unit, such as a touch display and a device which displays a digital keyboard on a display.
- the robot device 10 may not necessarily include the UI 50 , or may include only hardware keys, such as various buttons, without a display. Examples of buttons as hardware keys are buttons dedicated to the use of numeric input, such as a numeric keypad, and buttons dedicated to the use of indicating directions, such as direction indicator keys.
- a speaker 52 has the function of outputting sound and voice. Voice for a solution, such as a message for requesting a human to solve a problem, is output from the speaker 52 .
- a controller 54 controls operations of the individual elements of the robot device 10 .
- the controller 54 includes a detector 56 , a solution specifying unit 58 , a judging unit 60 , a search unit 62 , and an identifying unit 64 .
- the detector 56 has the function of finding a problem by detecting the situation around the robot device 10 , based on the situation information collected by the situation information collector 44 (values of the various sensors, for example) and by determining whether a problem has occurred around the robot device 10 .
- the detector 56 detects, as the situation, information concerning people around the robot device 10 and information concerning the surroundings other than people. Examples of information concerning people are images of people captured by the visual sensor (images of the faces, images of the entire bodies, and images showing the movements of people, for example) and the voice picked up by the hearing sensor. Examples of information concerning the surroundings other than people are temperature information obtained by a temperature sensor and humidity information obtained by a humidity sensor.
- the detector 56 finds a problem by detecting the surrounding situation by using a combination of images captured by the visual sensor, sound and voice picked up by the hearing sensor, information concerning the sense of touch obtained by the touch sensor, information concerning the taste obtained by the taste sensor, and information concerning the odor obtained by the odor sensor.
- the detector 56 may use information collected by a sensor which is not included in the robot device 10 .
- the detector 56 may obtain information from a sensor installed outside the robot device 10 , such as a sensor installed in the room and a sensor provided in another device, and may find a problem by using such information.
- the detector 56 may determine that a problem has occurred when a value obtained by a sensor (the temperature obtained by a temperature sensor, for example) is equal to or exceeds a threshold.
- the threshold may be changed in accordance with the age or the gender detected by the various sensors provided in the robot device 10 , thereby making it possible to find a problem suited to the individuals. That is, people may feel differently about problems according to their age or gender, and varying the threshold according to the age or the gender makes it possible to find problems suited to the individuals.
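The demographic-dependent threshold described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the concrete values, the age threshold, and the adjustment amounts are all hypothetical.

```python
# Hypothetical sketch: the detector's temperature threshold varies with the
# estimated age and gender of nearby people, so that a "problem" is found
# according to the individuals. All numeric values are illustrative.

DEFAULT_THRESHOLD_C = 30.0

def temperature_threshold(age, gender, age_threshold=65):
    """Return the temperature threshold (deg C) for one detected person."""
    threshold = DEFAULT_THRESHOLD_C
    if age >= age_threshold:
        threshold -= 2.0  # illustrative: react at a lower temperature for older people
    if gender == "female":
        threshold -= 1.0  # illustrative gender-based adjustment
    return threshold

def problem_detected(measured_temp, people):
    """people: list of (age, gender) tuples estimated by the sensors.
    A problem occurs when the temperature reaches any person's threshold."""
    if not people:
        return measured_temp >= DEFAULT_THRESHOLD_C
    return any(measured_temp >= temperature_threshold(a, g) for a, g in people)
```

With these assumed values, a room at 28.5 deg C is flagged as a problem when a 70-year-old is detected but not when only a 30-year-old is present.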
- the solution specifying unit 58 has the function of specifying (identifying) a solution for solving a problem detected by the detector 56 by referring to the solution management information 38 .
- the solution specifying unit 58 may select a solution based on the priority level determined for the first detection result or that for the second detection result. In this case, the solution specifying unit 58 selects a solution based on the first detection result so that a problem occurring to people can be solved preferentially over that in the surroundings.
- the judging unit 60 has the function of judging whether the robot device 10 can solve a problem detected by the detector 56 .
- Examples of problems that the robot device 10 is unable to solve are problems that are not solvable only by the functions of the robot device 10 , problems that require more than a certain amount of time (a preset time) to solve only by using the functions of the robot device 10 , and problems whose solving results do not reach a satisfactory quality (a preset quality) when only the functions of the robot device 10 are used. If a set of functions provided in the robot device 10 covers the solution specified by the solution specifying unit 58 , the judging unit 60 judges that it is possible to solve the problem only by the robot device 10 .
- If the set of functions provided in the robot device 10 does not cover the solution, the judging unit 60 judges that it is not possible to solve the problem only by the robot device 10 . That is, if the robot device 10 lacks any or some of the functions required for executing the selected solution, the judging unit 60 judges that it is not possible to solve the problem only by the robot device 10 .
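The covering judgement above can be sketched as a set-inclusion test, assuming a solution is represented by the set of functions it requires. The function names are hypothetical, not taken from the patent's tables.

```python
# Minimal sketch of the judging unit: the robot can solve a problem alone
# only if its function set covers every function the solution requires.
# Function names are illustrative.

ROBOT_FUNCTIONS = {"lifting", "moving", "speech"}

def can_solve_alone(required_functions, robot_functions=ROBOT_FUNCTIONS):
    """True if the robot device's function set covers the whole solution."""
    return required_functions <= robot_functions

def missing_functions(required_functions, robot_functions=ROBOT_FUNCTIONS):
    """Functions the robot lacks; a non-empty result means another device
    or a human is needed to execute the solution."""
    return required_functions - robot_functions
```

The set of missing functions is what would drive the search for another device or a request to a human.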
- the judging unit 60 may refer to the provided-function management information 34 .
- the judging unit 60 may refer to the non-provided-function management information 36 .
- the robot device 10 executes the solution specified by the solution specifying unit 58 under the control of the controller 54 .
- the controller 54 performs control so that the solution can be executed by using an element other than the robot device 10 (another device or a human).
- the controller 54 may cause a device other than the robot device 10 to execute the solution, request a human to execute the solution, or cause the robot device 10 and another element (a human or another device, for example) to execute the solution together.
- the search unit 62 has the function of searching for a solution which is used for solving a problem detected by the detector 56 and which is not registered in the solution management information 38 .
- the search unit 62 searches for such a solution by using the Internet, for example.
- the identifying unit 64 has the function of identifying (specifying) a device other than the robot device 10 and also identifying the functions provided in this device.
- the identifying unit 64 may identify a device, based on an image of the device (image of the external appearance of the device, for example) captured by the visual sensor, or based on device identification information indicated in the device obtained from an image captured by the visual sensor.
- the identifying unit 64 may alternatively obtain position information indicating the position at which the device is installed.
- the identifying unit 64 also identifies the functions of the identified device. To identify the functions, the identifying unit 64 refers to the device function management information 40 stored in the storage unit 32 and specifies function information indicating the functions associated with the device identification information concerning the identified device.
- the identifying unit 64 may identify plural devices to be combined as collaborative devices.
- the identifying unit 64 may refer to the collaborative function management information 42 stored in the storage unit 32 and specify collaborative function information indicating a collaborative function associated with a combination of items of device identification information concerning these identified plural devices. This makes it possible to identify (specify) a collaborative function to be executed by combining the functions of the identified devices.
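The collaborative function lookup can be sketched as a mapping keyed by an unordered combination of device IDs, in the spirit of the collaborative function management information 42. The IDs and function names below mirror the examples given later in the text but are otherwise illustrative.

```python
# Sketch of a collaborative function lookup: a combination of device IDs
# maps to the collaborative functions executable by those devices together.
# A frozenset key makes the combination order-independent.

COLLABORATIVE_FUNCTIONS = {
    frozenset({"A", "B"}): ["scan transfer function", "print function"],
}

def collaborative_functions(device_ids):
    """Return collaborative functions for the identified device combination."""
    return COLLABORATIVE_FUNCTIONS.get(frozenset(device_ids), [])
```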
- the identifying unit 64 may receive user identification information for identifying a user.
- the identifying unit 64 may then refer to the available function management information stored in the storage unit 32 and specify function information indicating the functions available to the user indicated by the user identification information. This makes it possible to identify (specify) a set of functions available to this user. For example, user identification information is sent from the terminal device 14 to the robot device 10 , and the identifying unit 64 specifies function information indicating the functions associated with the user identification information. More specifically, the identifying unit 64 receives device identification information and user identification information.
- the identifying unit 64 then refers to the device function management information 40 and specifies function information indicating the functions associated with the device identification information, and also refers to the available function management information and specifies function information indicating the functions associated with the user identification information. This makes it possible to specify the functions which are provided in the device specified by the device identification information and which are available to the user specified by the user identification information.
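The two-table lookup described above amounts to an intersection: the functions presented are those that the identified device provides and that are also available to the identified user. The table contents below are hypothetical.

```python
# Sketch of combining the device function management information with the
# available function management information. Entries are illustrative.

DEVICE_FUNCTIONS = {"B": {"print", "scan", "copy"}}        # per device ID
AVAILABLE_FUNCTIONS = {"user1": {"print", "scan", "fax"}}  # per user ID

def usable_functions(device_id, user_id):
    """Functions provided in the device AND available to the user."""
    device_fns = DEVICE_FUNCTIONS.get(device_id, set())
    user_fns = AVAILABLE_FUNCTIONS.get(user_id, set())
    return device_fns & user_fns
```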
- the controller 54 may execute function purchase processing and manage the purchase history. For example, if a charged function is purchased by a user, the controller 54 may perform billing processing for the user.
- the controller 54 includes an intelligence unit, and controls the individual elements of the controller 54 by using AI of this intelligence unit.
- At least one of the detector 56 , the solution specifying unit 58 , the judging unit 60 , the search unit 62 , and the identifying unit 64 may be stored in a device other than the robot device 10 (such as a server, which is not shown, the device 12 , or the terminal device 14 ). In this case, such an element stored in another device may not necessarily be included in the controller 54 of the robot device 10 .
- the configuration of the terminal device 14 will be described below in detail with reference to the block diagram of FIG. 4 .
- a communication unit 66 , which is a communication interface, has the functions of sending data to another device and receiving data from another device.
- the communication unit 66 may be a communication interface having a wireless communication function or a communication interface having a wired communication function.
- a camera 68 , which serves as an imaging unit, captures an image of an object so as to generate image data (still image data and video image data, for example). Not only the camera 68 , but also an external camera connected to a communication path, such as a network, may be used.
- the communication unit 66 may receive image data indicating images captured by such an external camera, and a UI 72 may display the image data so that the user can process or use the image data.
- the terminal device 14 may not necessarily include the camera 68 .
- a storage unit 70 is a storage device, such as a hard disk or a memory (SSD, for example).
- various programs and various items of data are stored in the storage unit 70 .
- Address information concerning the address of the robot device 10 and the addresses of other devices (the device 12 , for example), information concerning identified devices, information concerning identified devices to be combined as collaborative devices, information concerning the functions of identified devices, and information concerning collaborative functions may also be stored in the storage unit 70 .
- the above-described items of information may be stored in the same storage device or in different storage devices.
- the UI 72 includes a display and an operation unit.
- the display is a display device, such as a liquid crystal display.
- the operation unit is an input device, such as a touch panel, a keyboard, and a mouse.
- the UI 72 may be a user interface serving as both a display and an operation unit, such as a touch display or a device which displays a digital keyboard on a display.
- a controller 74 controls the operations of the individual elements of the terminal device 14 .
- the controller 74 , which serves as a display controller (control unit), for example, causes various items of information to be displayed on the display of the UI 72 .
- On the display of the UI 72 , images captured by the robot device 10 , images captured by the camera 68 , images linked with identified devices to be used (devices to be used singly and devices to be combined), and images linked with functions are displayed.
- An image linked with a device may be an image of this device captured by the robot device 10 (a still image or a video image, for example), an image of this device captured by the camera 68 (a still image or a video image, for example), or an image schematically representing this device (an icon, for example).
- Image data indicating a schematic image of a device may be stored in the robot device 10 and provided from the robot device 10 to the terminal device 14 . Such image data may alternatively be stored in the terminal device 14 in advance or may be stored in another device and provided from such a device to the terminal device 14 .
- An image linked with a function is an image representing this function, such as an icon.
- FIG. 5 illustrates the characteristics (advantages and disadvantages) of wireless communication technologies according to the frequency.
- FIG. 6 illustrates the characteristics of wireless communication technologies according to the communication method.
- One of the major standards of wireless communication technologies using a frequency of 900 MHz is RFID.
- Some of the advantages of RFID are having high resistance to obstacles and having few interference frequency bands such as that of microwave ovens.
- Some of the disadvantages of RFID are a large-size antenna and a short coverage range.
- Some of the major standards of wireless communication technologies using a frequency of 2.4 GHz are ZigBee (registered trademark) and Bluetooth. Some of the advantages of such a communication technology are high power saving, high speed, and a small-size antenna, while one of the disadvantages is having many interference frequency bands.
- Some of the major standards of wireless communication technologies using a frequency of 5 GHz are IEEE802.11a and MuLTEfire. Some of the advantages of such a communication technology are having few interference frequency bands and high speed, while one of the disadvantages is having low resistance to obstacles.
- NFC (near field communication) is another available wireless communication technology.
- the communication unit 30 of the robot device 10 communicates with the communication party by using a wireless communication technology having characteristics suitable for the surrounding environments and the communication party. More particularly, the communication unit 30 communicates with a communication party by changing the wireless communication technology in accordance with the distance between the robot device 10 and the communication party, the presence or the absence of an obstacle therebetween, and the communication method supported by the communication party.
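The selection logic above can be sketched as a simple rule over the characteristics summarized in FIGS. 5 and 6 (obstacle resistance at 900 MHz, short range for NFC, speed at 5 GHz). The distance cutoffs and the rules themselves are hypothetical, chosen only to illustrate the trade-offs named in the text.

```python
# Hypothetical selection of a wireless communication technology based on
# the distance to the communication party and the presence of obstacles.
# Cutoff values are illustrative, not from the patent.

def choose_technology(distance_m, obstacle_present):
    if distance_m < 0.1:
        return "NFC"                  # party is touching or nearly touching
    if obstacle_present:
        return "RFID (900 MHz)"       # high resistance to obstacles
    if distance_m < 10:
        return "Bluetooth (2.4 GHz)"  # power saving, small antenna
    return "IEEE802.11a (5 GHz)"      # high speed, few interference bands
```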
- FIG. 7 shows an example of a provided-function management table as the provided-function management information 34 .
- FIG. 8 shows an example of a non-provided-function management table as the non-provided-function management information 36 .
- the management number, information indicating a function provided in the robot device 10 , and information indicating an operation (including processing and manipulation) executable by using this function are associated with each other.
- the robot device 10 has a lifting function of lifting an object by using the arm parts 22 , and is able to lift and carry an object up to 30 kg by using this lifting function.
- the robot device 10 also has a moving function, and is movable by changing the speed within 10 km per hour by using this moving function. Referring to the provided-function management table makes it possible to specify (identify) the functions provided in the robot device 10 and operations executable by using the functions.
- Functions and operations that are not registered in the provided-function management table may be functions that are not provided in the robot device 10 and operations that are not executable by the robot device 10 . Referring to the provided-function management table can thus specify (identify) functions that are not provided in the robot device 10 and operations that are not executable by the robot device 10 .
- the management number, information indicating a function that is not provided in the robot device 10 , and information indicating an operation (including processing and manipulation) that is not executable by the robot device 10 due to the absence of this function are associated with each other.
- the robot device 10 does not have a cooling function for the external environments (a function of cooling around the robot device 10 ), and is thus not capable of cooling the room.
- the robot device 10 does not have a print function and is thus not capable of printing the words of voice picked up by the robot device 10 or a document seen by the robot device 10 .
- Referring to the non-provided-function management table makes it possible to specify (identify) the functions that are not provided in the robot device 10 and operations that are not executable by the robot device 10 .
- Functions and operations that are not registered in the non-provided-function management table may be functions provided in the robot device 10 and operations executable by the robot device 10 .
- Referring to the non-provided-function management table may thus specify (identify) functions provided in the robot device 10 and operations executable by the robot device 10 .
- Both the data indicating the provided-function management table and the data indicating the non-provided-function management table may be stored, and operations that are executable by the robot device 10 or operations that are not executable by the robot device 10 may be specified based on these two items of data.
- one of the two items of data may be stored in the storage unit 32 , and operations that are executable by the robot device 10 or operations that are not executable by the robot device 10 may be specified based on the stored item of data.
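The role of the two tables (FIGS. 7 and 8) can be sketched as a three-valued lookup: executable, not executable, or unknown when a function appears in neither table. The entries below follow the examples in the text; the representation itself is an assumption.

```python
# Sketch of consulting the provided-function and non-provided-function
# management tables. Entries mirror the lifting/moving and cooling/print
# examples given in the text.

PROVIDED = {
    "lifting function": "lift and carry an object up to 30 kg",
    "moving function": "move at up to 10 km per hour",
}
NOT_PROVIDED = {
    "cooling function": "cool the room",
    "print function": "print picked-up words or a seen document",
}

def executable(function_name):
    """Return True, False, or None (unknown) for the given function."""
    if function_name in PROVIDED:
        return True
    if function_name in NOT_PROVIDED:
        return False
    return None
```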
- FIG. 9 shows an example of a solution management table as the solution management information 38 .
- the management number, information indicating a situation (problem), and information indicating a solution for solving this problem (situation) are associated with each other.
- the detector 56 detects a situation (problem).
- the solution specifying unit 58 refers to the solution management table and specifies a solution for solving the problem detected by the detector 56 .
- a solution for solving this problem is to reduce the temperature in the room, and more specifically, solutions are “(1) cooling the room by the air conditioner” and “(2) opening the window”.
- the room temperature is detected by a certain sensor (a temperature sensor, for example) provided in the robot device 10 .
- the temperature threshold may be changed according to the age or the gender of people around the robot device 10 .
- the detector 56 of the robot device 10 detects people around the robot device 10 and also estimates the age and the gender of people, based on information (image and voice information, for example) obtained by various sensors (a visual sensor and a hearing sensor, for example).
- the temperature threshold used for older people (the age is equal to or higher than an age threshold) and that for younger people (the age is lower than the age threshold) may be different. For example, if older people are detected, a lower temperature threshold may be used than that for younger people. This makes it possible to execute a suitable solution according to the age.
- the temperature threshold may also be changed according to the gender. This makes it possible to execute a suitable solution according to the gender.
- a solution for solving this problem is to raise the temperature in the room, and more specifically, solutions are “(1) turning on the heating” and “(2) closing the window”.
- the detector 56 of the robot device 10 detects the situation (3) by observing the movement of people around the robot device 10 by using various sensors. More specifically, the detector 56 detects the situation (3) from the facial expressions, movements of the legs and the feet, voice, and sweat of people in the room, based on images showing the faces and the movements of people.
- the detector 56 may preferentially select one of the multiple situations, and the solution specifying unit 58 may specify solutions for solving the preferential situation. For example, if the situation (1) “the room temperature is 30° C. or higher” is detected and if the situation (3) “people in the room look cold” is also detected, the detector 56 preferentially selects the situation (3) over the situation (1), and the solution specifying unit 58 specifies solutions for solving the situation (3).
- the priority levels for preferentially selecting solutions are determined in advance.
- a situation detected based on information concerning people is preferentially selected over a situation detected based on information concerning the surroundings other than people.
- the situation (3) is a situation based on information concerning people (a situation detected by analyzing images or voice of people, for example), and the situation (1) is a situation based on information concerning the surroundings other than people.
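The priority rule above can be sketched by tagging each detected situation with its information source and selecting people-based situations first. The tuple representation is an assumption for illustration.

```python
# Sketch of prioritized situation selection: a situation detected from
# information concerning people is selected over one detected from the
# surroundings other than people.

def select_situation(situations):
    """situations: list of (description, source), source being
    'people' or 'surroundings'. Returns the situation to solve first."""
    if not situations:
        return None
    people_first = [s for s in situations if s[1] == "people"]
    return (people_first or situations)[0]
```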
- FIG. 10 shows an example of a device function management table as the device function management information 40 .
- the device ID, information indicating the device name (device type, for example), information indicating the functions of a device (function information), and the image ID are associated with each other.
- the device ID and the device name are examples of the device identification information.
- the image ID is an example of image identification information for identifying an image representing a device (an image showing the external appearance of the device or an image schematically representing the device (such as an icon), for example).
- the image ID may not necessarily be included in the device function management table.
- the device having a device ID “B”, for example, is a multifunction device (image forming device including multiple image forming functions), and has a print function and a scan function, for example.
- the image ID for identifying an image representing this device is associated with the device.
- Image data indicating an image of a device is stored in the storage unit 32 of the robot device 10 or in another device.
- the detector 56 detects the device ID for identifying a device around the robot device 10 .
- the identifying unit 64 refers to the device function management table and specifies the device name, functions, and image ID associated with the device ID. This makes it possible to identify devices around the robot device 10 .
- Information indicating the device name and image data of an image of the device may be sent from the robot device 10 to the terminal device 14 and be displayed on the terminal device 14 .
- An image representing a device is displayed as the image linked with this device.
- the image linked with a device may be an image captured by a camera or an image schematically representing the device (an icon, for example). If a user specifies the image linked with the device on the terminal device 14 , information concerning the functions of this device (function information or description information concerning the functions) may be sent from the robot device 10 to the terminal device 14 and be displayed on the terminal device 14 .
- FIG. 11 shows an example of a collaborative function management table as the collaborative function management information 42 .
- a combination of device IDs, information indicating the names of devices to be combined (device types, for example), and information indicating collaborative functions (collaborative function information) are associated with each other.
- the device having a device ID “A”, for example, is a PC.
- the device having a device ID “B” is a multifunction device.
- The scan transfer function is the function of transferring image data generated by a scanning operation of the multifunction device (B) to the PC (A).
- The print function is the function of sending data (image data or document data, for example) stored in the PC (A) to the multifunction device (B) and printing the data in the multifunction device (B).
- In step S01, the situation information collector 44 collects situation information (the values of various sensors) concerning the surroundings around the robot device 10 by using various sensors.
- the detector 56 then detects the situation around the robot device 10 based on the situation information and finds a problem occurring in this situation.
- In step S02, the judging unit 60 judges whether the robot device 10 can solve the problem detected by the detector 56 .
- Examples of problems that the robot device 10 is unable to solve are problems that are not solvable only by the functions of the robot device 10 , problems that require more than a certain amount of time (a preset time) to solve only by using the functions of the robot device 10 , and problems whose solving results do not reach a satisfactory quality (a preset quality) when only the functions of the robot device 10 are used.
- If the robot device 10 can solve the problem (YES in step S02), it executes a solution for solving the problem in step S03.
- This solution is specified by the solution specifying unit 58 .
- the robot device 10 executes the specified solution by using the functions of the robot device 10 without using another device or receiving human assistance.
- the robot device 10 may cause the UI 50 of the robot device 10 or the UI 72 of the terminal device 14 to display information concerning the problem and information concerning the solution.
- the robot device 10 may execute the solution when receiving an execution instruction from a user.
- If the robot device 10 cannot solve the problem (NO in step S02), search processing for solutions is executed. It is determined in step S04 whether the robot device 10 will search for solutions by itself. If the result of step S04 is YES, the robot device 10 executes an autonomous decision mode in step S05. If the robot device 10 will not search for solutions by itself (NO in step S04), the robot device 10 executes a user decision mode in step S06. Whether the autonomous decision mode or the user decision mode will be executed has been determined in advance. That is, if the robot device 10 is unable to solve the problem detected by the detector 56 , it executes a predetermined one of the autonomous decision mode and the user decision mode. The mode to be executed may also be specified by the user.
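The branching across steps S02 through S06 can be condensed into the following control-flow sketch. The flag names are hypothetical stand-ins for the judgements made by the judging unit 60 and the predetermined mode setting.

```python
# Control-flow sketch of steps S02-S06: solve directly if possible,
# otherwise fall back to the predetermined decision mode.

def handle_problem(robot_can_solve, autonomous_mode):
    if robot_can_solve:
        return "execute solution (S03)"
    if autonomous_mode:
        return "autonomous decision mode (S05)"
    return "user decision mode (S06)"
```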
- In step S10, the situation information collector 44 collects situation information (the values of various sensors) concerning the surroundings around the robot device 10 by using various sensors.
- In step S11, the detector 56 anticipates a problem that may occur by using a combination of the values of the sensors.
- the detector 56 anticipates a problem by using a combination of information obtained by analyzing the images of the surroundings around the robot device 10 and the values of the sensors. Voice picked up by a hearing sensor may also be used, and the detector 56 may anticipate a problem by using the results of analyzing voice, as well as the result of analyzing the images and the values of the sensors.
- the detector 56 detects that the temperature needs to be decreased by an air conditioner or an electric fan. If the detector 56 attempts to detect a surrounding situation by using only one sensor, it may fail to detect the situation correctly.
- the use of an image representing the surroundings of the robot device 10 together with the values of sensors makes it possible to detect more complicated situations with higher precision.
- the use of such an image also makes it possible to detect a situation from a personal point of view, as well as from a general point of view.
- a sensor is able to detect a situation that cannot be detected by the sense of sight or the sense of smell.
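The multi-sensor reasoning above can be sketched as requiring agreement between a sensor value and the result of image analysis before a problem is anticipated. The rule and the cutoff are hypothetical; they only illustrate why a single sensor may misjudge the situation.

```python
# Sketch of step S11's combined detection: a high temperature reading
# alone (e.g. a sensor near a heat source) is not enough; confirmation
# from analyzing images of the people raises confidence. Values are
# illustrative.

def anticipate_problem(temperature_c, people_look_hot):
    return temperature_c >= 30.0 and people_look_hot
```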
- the solution specifying unit 58 estimates a solution that can return the values of the sensors to the normal values.
- the solution specifying unit 58 refers to the solution management table shown in FIG. 9 , for example, and specifies a solution for the problem detected by the detector 56 .
- Information concerning the specified solution may be presented upon receiving an inquiry from a user, which will be discussed later, or as an instruction (control) request when the problem is solved by using a human or a device other than the robot device 10 .
- the judging unit 60 judges whether the robot device 10 can solve the problem detected by the detector 56 . To make this judgement, in step S13, the judging unit 60 compares the functions of the robot device 10 with the solution specified by the solution specifying unit 58 .
- It is then determined in step S14 whether a set of functions provided in the robot device 10 covers the solution specified by the solution specifying unit 58 , that is, whether the robot device 10 has a function for executing the solution. If the result of step S14 is YES, the process proceeds to step S03 shown in FIG. 12 . In this case, the robot device 10 executes the specified solution by using the corresponding function of the robot device 10 without using another device or receiving human assistance. The robot device 10 may execute the solution upon receiving an execution instruction from a user.
- If the result of step S14 is NO, the robot device 10 executes the autonomous decision mode or the user decision mode in step S15.
- In step S20, the communication unit 30 of the robot device 10 sends situation information (the values of various sensors) collected by the situation information collector 44 to the terminal device 14 under the control of the controller 54 .
- the communication unit 30 may switch the communication method in accordance with the presence or the absence of obstacles or the distance between the robot device 10 and the terminal device 14 .
- the communication unit 30 may send situation information to a terminal device 14 which has been registered as a destination, or to a terminal device 14 identified by the robot device 10 , or to a terminal device 14 which has sent a request to communicate with the robot device 10 .
- the identifying unit 64 of the robot device 10 may obtain device identification information concerning a terminal device 14 from an image of the terminal device 14 captured by the visual sensor, and may identify the terminal device 14 based on the device identification information.
- the communication unit 30 may send situation information to a terminal device 14 located within a preset range from the position of the robot device 10 (for example, a terminal device 14 located within a range where the robot device 10 can perform near field communication).
- the identifying unit 64 may identify a user who seems to have a problem and send situation information to the terminal device 14 of this user.
- the robot device 10 may collect and send additional situation information to the terminal device 14 in response to a request from the user of the terminal device 14 .
- the robot device 10 captures additional images of the surroundings by using the visual sensor, such as a camera, and sends the resulting images (video images and still images) to the terminal device 14 , or recollects data concerning the information (the temperature, for example) requested by the user and resends the data to the terminal device 14 .
- In step S22, the robot device 10 presents one or plural solutions for solving the problem detected by the detector 56 (one or plural solutions specified by the solution specifying unit 58 ).
- the communication unit 30 sends information indicating one or plural solutions to the terminal device 14 under the control of the controller 54 .
- the information indicating one or plural solutions is displayed on the UI 72 of the terminal device 14 .
- the controller 54 may cause information indicating one or plural solutions to be displayed on the UI 50 of the robot device 10 .
- the controller 54 of the robot device 10 determines in step S23 whether the solution selected by the user is a solution to be executed by using a device (the device 12 , for example) other than the robot device 10 . For example, the user selects one of the solutions by using the terminal device 14 , and information indicating the selected solution is sent from the terminal device 14 to the robot device 10 . The controller 54 then determines, based on this information, whether the solution selected by the user is a solution to be executed by using a device other than the robot device 10 . If information indicating one or plural solutions is displayed on the UI 50 of the robot device 10 and the user has selected one of the solutions there, the controller 54 likewise determines whether the solution selected by the user will be executed by using another device.
- In step S24, the communication unit 30 sends information concerning a device that can execute the solution to the terminal device 14 under the control of the controller 54 .
- the controller 54 refers to the device function management table and the collaborative function management table, and specifies one or multiple devices having a function that can execute the solution selected by the user. If “printing” is selected as a solution, a multifunction device having a print function is specified as a device that can execute the solution.
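The device lookup in step S24 can be sketched as scanning the device function management table for devices whose registered functions include the one the selected solution requires. The table contents below follow the multifunction device example but are otherwise illustrative.

```python
# Sketch of specifying devices that can execute the selected solution:
# return every device whose registered functions include the required one.
# Table entries are illustrative.

DEVICE_FUNCTION_TABLE = {"A": {"data storage"}, "B": {"print", "scan"}}

def devices_for_solution(required_function):
    """IDs of devices having the function needed for the solution."""
    return sorted(dev for dev, fns in DEVICE_FUNCTION_TABLE.items()
                  if required_function in fns)
```

If "printing" is the selected solution, the multifunction device "B" is specified, matching the example in the text.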
- Examples of information concerning a device that can execute the solution are an image showing the external appearance of the device, address information indicating the address of the device for connecting to the device, and information concerning the specifications of the device.
- the controller 54 may cause information concerning the device to be displayed on the UI 50 of the robot device 10 . In this case, the information may not necessarily be sent to the terminal device 14 .
- In step S 25 , the solution is executed in accordance with a user operation. That is, the function of the device for executing the solution is performed.
- a function included in a single device may be executed, or a collaborative function using functions of plural devices may be executed.
- a function included in the robot device 10 may be used. An instruction to execute the function may be provided from the robot device 10 or from the terminal device 14 to the device. Details of the operation for performing the function will be discussed later.
- If the solution selected by the user is not a solution to be executed by using another device (NO in step S 23 ), the robot device 10 requests a person around the robot device 10 to execute the solution in step S 26 . Then, in step S 27 , the robot device 10 tells the person how to execute the solution (the execution procedure).
- For example, the controller 54 causes a speaker to output the procedure as voice, causes the UI 50 to display the procedure, or causes the robot device 10 to move to the person and touch him or her. If the procedure is presented to the person as voice from the speaker, the volume may be increased so that the person will understand the procedure better.
- the communication unit 30 may alternatively send information indicating the procedure to the terminal device 14 , in which case, the information is displayed on the UI 72 of the terminal device 14 .
- If the controller 54 determines in step S 23 that the solution selected by the user is a solution to be executed by using both a device and a person, steps S 24 through S 27 are executed.
- a function of the robot device 10 may also be used as part of the solution.
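- The branch of steps S 23 through S 27 described above can be summarized as the following Python sketch. All names (handle_selected_solution, uses_device, and so on) are illustrative assumptions made for explanation and are not taken from the disclosure.

```python
# Hypothetical sketch of the user decision mode branch (steps S23-S27).
# A selected solution either uses another device (report capable
# devices, step S24) or is carried out by a person (request the person
# and present the procedure, steps S26-S27).
def handle_selected_solution(solution, device_registry, notify):
    if solution["uses_device"]:
        # Step S24: list devices whose functions can execute the solution.
        capable = [d for d in device_registry
                   if solution["required_function"] in d["functions"]]
        notify("devices", capable)
        return "execute_with_device"
    # Steps S26-S27: ask a nearby person and present the procedure.
    notify("request_person", solution["procedure"])
    return "request_person"
```

The notify callable stands in for whichever output channel is used (the terminal device 14 , the UI 50 , or voice).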
- In step S 30 , if necessary, the controller 54 of the robot device 10 specifies an additional function or a request to a user that is necessary for executing a solution specified by the solution specifying unit 58 .
- the controller 54 determines in step S 31 whether the solution specified by the solution specifying unit 58 is a solution to be executed by using a device other than the robot device 10 (the device 12 , for example).
- In step S 32 , the controller 54 searches for devices around the robot device 10 that can execute the solution.
- the controller 54 searches for such devices, based on images obtained by the visual sensor, position information concerning the positions of the devices, and the wireless communication status, for example.
- the controller 54 refers to the device function management table and the collaborative function management table, and specifies one or plural devices having a function that can execute the solution. If the solution is “printing”, the controller 54 searches for a multifunction device having a print function.
- the controller 54 determines in step S 33 whether there is a device around the robot device 10 that can execute the solution. If there is such a device (YES in step S 33 ), the process proceeds to step S 34 .
- In step S 34 , the communication unit 30 of the robot device 10 sends information indicating an instruction to execute the solution (an instruction to perform the function for executing the solution) to the device found in the search, under the control of the controller 54 . Upon receiving this information, the device executes the solution in response to the instruction.
- the controller 54 may check whether it is possible to obtain a control right for the device that can execute the solution. That is, the controller 54 checks whether address information concerning the device or a driver for controlling the device is stored in the robot device 10 . If such a driver can be obtained by using a network, the controller 54 downloads the driver.
- the robot device 10 may provide an instruction to execute the solution to the device by directly operating the operation panel of the device or by operating a remote controller of the device.
- a determination as to whether the solution has successfully been executed can be made according to whether the problem has been solved after the device has performed the function. If the detector 56 no longer finds the problem, the controller 54 determines that the problem has been solved. If the problem has not been solved, the controller 54 resends an instruction to execute the solution to the device, or searches for another solution.
- If there is no device around the robot device 10 that can execute the solution (NO in step S 33 ), the controller 54 searches for another solution in step S 35 . In this case, the controller 54 may switch the autonomous decision mode to the user decision mode.
- If the solution is not a solution to be executed by using another device (NO in step S 31 ), the robot device 10 requests a person around the robot device 10 to execute the solution in step S 36 .
- In step S 37 , the robot device 10 tells the person about the content of the request (the procedure to execute the solution, for example).
- the controller 54 causes the speaker to output the content of the request as voice or the UI 50 to display the content of the request, or causes the robot device 10 to move to the person to touch him or her.
- the communication unit 30 may alternatively send information indicating the content of the request to the terminal device 14 . In this case, the information is displayed on the UI 72 of the terminal device 14 .
- the controller 54 determines in step S 38 whether the person has accepted the request. If the person has accepted the request (YES in step S 38 ), the controller 54 finishes the processing. The user then executes the solution. If it is found that the person has not accepted the request (NO in step S 38 ), the controller 54 searches for another solution in step S 35 .
- the controller 54 may switch the autonomous decision mode to the user decision mode.
- the controller 54 may make the above-described determination by voice recognition. If the controller 54 acknowledges a reply from the person accepting the request by voice recognition, it determines that the person has accepted the request. In another example, if the operation that is likely to be performed by the user for executing the solution is performed within a preset time and if this operation is recognized by a certain sensor (visual sensor, for example), the controller 54 may determine that the person has accepted the request.
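- The autonomous decision flow of steps S 31 through S 38 can likewise be sketched in Python. The names and data shapes below are assumptions for illustration only, not the disclosed implementation.

```python
# Hypothetical sketch of the autonomous decision mode (steps S31-S38):
# search nearby devices for the required function (S32-S33), instruct a
# found device (S34), otherwise ask a person (S36-S38) or fall back to
# searching for another solution (S35).
def autonomous_decide(solution, nearby_devices, ask_person):
    if solution["uses_device"]:
        for device in nearby_devices:
            if solution["required_function"] in device["functions"]:
                return ("instructed", device["name"])   # step S34
        return ("search_other_solution", None)          # step S35
    if ask_person(solution):                            # steps S36-S38
        return ("person_accepted", None)
    return ("search_other_solution", None)              # step S35
```

Here ask_person abstracts the voice or touch request and the acceptance check described above.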
- the situation around the robot device 10 may change over time, and the problem which may occur is also likely to change accordingly.
- the robot device 10 thus detects the situation (problem) almost in real time and specifies the solution based on the detection results.
- the problems that can be solved by the robot device 10 also change as the functions of the robot device 10 are updated.
- the functions of the robot device 10 are updated by changing at least one of hardware and software of the robot device 10 .
- the solution specifying unit 58 may preferentially specify a solution to be executed by using a device (a solution which does not require human assistance) over a solution to be executed by a person.
- a solution to be executed by a person is not always executed (may be rejected).
- the solution specifying unit 58 may specify solutions which do not require human assistance without searching for solutions requiring human assistance. This can reduce the load for searching for solutions.
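- The preference described above can be expressed as a simple stable ordering of candidate solutions, sketched below with an assumed needs_human field.

```python
# Order candidate solutions so that device-only solutions (which cannot
# be rejected by a person) come before solutions requiring human help.
# Python's sort is stable, so the original order is kept within groups.
def rank_solutions(solutions):
    return sorted(solutions, key=lambda s: s["needs_human"])
```

Dropping the human-assisted candidates entirely, as the text suggests, is just a filter on the same field.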
- Control processing for executing a solution will be described below in detail with reference to the flowchart of FIG. 16 . It is assumed that the robot device 10 is unable to solve a problem by itself.
- In step S 40 , the controller 54 of the robot device 10 determines whether a solution selected by the user in the user decision mode or a solution specified by the robot device 10 in the autonomous decision mode requires control of a device. That is, the controller 54 determines whether the solution is a solution to be executed by using a device or whether it is to be implemented only by a person without using a device.
- If the solution does not require control of a device (NO in step S 40 ), the robot device 10 requests, in step S 41 , a person around the robot device 10 to execute the solution.
- the communication unit 30 sends information indicating the content of a request (procedure taken to execute the solution, for example) to the terminal device 14 . This information is displayed on the UI 72 of the terminal device 14 .
- In step S 42 , the communication unit 30 sends this information to the terminal device 14 by using a communication method suitable for the surrounding environment. Examples of the surrounding environment are the distance between the robot device 10 and the terminal device 14 and the presence or absence of obstacles therebetween.
- the controller 54 may alternatively cause the speaker to output the content of the request as voice or the UI 50 to display the content of the request, or cause the robot device 10 to move to the person to touch him or her.
- In step S 43 , the controller 54 checks whether the operation that is likely to be performed for executing the solution is performed by the user within a preset time. The controller 54 then determines in step S 44 whether the problem has been solved, based on whether this operation has been performed. If this operation has been performed within the preset time and if a certain sensor (the visual sensor, for example) has detected this operation, the controller 54 determines that the problem has been solved (YES in step S 44 ) and finishes the processing.
- If this operation has not been performed within the preset time, or has not been detected by the sensor, the controller 54 determines that the problem has not been solved (NO in step S 44 ). In this case, the robot device 10 may search for another solution or resend the same request to the user in step S 45 .
- If the solution requires control of a device (YES in step S 40 ), the controller 54 determines in step S 46 whether the use of the device for executing the solution is free. More specifically, the controller 54 searches for (identifies) a device around the robot device 10 and determines whether the use of this device is free. Information concerning whether the use of a device is charged or free is managed for each device, and may be managed by the device function management table. The controller 54 may alternatively obtain, from an identified device, information indicating whether the use of this device is charged or free.
- If the use of the device is not free (NO in step S 46 ), that is, if the use of the device is charged, the controller 54 determines in step S 47 whether the robot device 10 possesses a payment instrument for paying for the device. Examples of payment instruments are electronic money (digital currency), virtual money, cash, and credit cards; payment may also be made by using an instrument other than these examples. If the robot device 10 does not possess a payment instrument for paying for the device (NO in step S 47 ), the controller 54 searches for another solution in step S 48 . If the robot device 10 possesses a payment instrument for paying for the device (YES in step S 47 ), the process proceeds to step S 49 , and the controller 54 controls the payment operation of the robot device 10 . That is, the robot device 10 makes payment when using the device. If the robot device 10 does not possess the payment instrument for paying for the device, it may receive payment support from at least one of people and devices other than the robot device 10 (borrow or receive money from a person or a device).
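- The charged/free decision of steps S 46 through S 48 reduces to a small check, sketched below. The wallet and device representations are assumptions of this description, not the disclosed data structures.

```python
# Hypothetical sketch of steps S46-S48: use a free device directly,
# pay for a charged device if a sufficient payment instrument is held,
# and otherwise search for another solution.
def decide_payment(device, wallet):
    if device["usage_fee"] == 0:
        return "use_device"                    # YES in step S46
    if any(wallet.get(inst, 0) >= device["usage_fee"]
           for inst in device["accepted_instruments"]):
        return "pay_and_use"                   # YES in step S47
    return "search_other_solution"             # step S48
```

The payment-support case (borrowing from a person or another device) would simply top up the wallet before this check.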
- the controller 54 determines in step S 49 whether the device for executing the solution can be controlled via a communication function.
- Information indicating whether a device can be controlled via a communication function is managed for each device. This information may be managed by the device function management table.
- If the device can be controlled via a communication function (YES in step S 49 ), the controller 54 selects, in step S 50 , a communication method that can be employed for communication with this device (a communication method supported by this device).
- the communication method supported by a device is managed for each device.
- the communication method supported by each device may be managed by the device function management table. If communication errors occur when performing communication by using the communication method supported by the device, the process may proceed to step S 57 .
- the robot device 10 may attempt to communicate with the device by using each of the plural communication methods. In this case, the robot device 10 may communicate with the device by using the optimal communication method (the communication method having the highest speed or producing the least noise).
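- Selecting the optimal communication method in step S 50 can be sketched as choosing the fastest method supported by both sides. The method names and speed ranking below are placeholders, not values from the disclosure.

```python
# Pick the fastest communication method shared by the robot device and
# the target device; None means no shared method, in which case the
# process falls through to the remote-controller path (step S57).
def pick_method(robot_methods, device_methods):
    speed = {"wifi": 3, "bluetooth": 2, "nfc": 1}  # illustrative ranking
    shared = set(robot_methods) & set(device_methods)
    if not shared:
        return None
    return max(shared, key=lambda m: speed[m])
```

A noise-based ranking would use the same structure with a different key function.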
- In step S 51 , the controller 54 obtains address information concerning the device and an access password.
- the controller 54 may obtain address information from another device, such as a server, storing address information or via the Internet. Address information may alternatively be stored in the robot device 10 .
- In step S 52 , if a driver for controlling the device is required, the controller 54 obtains the driver and installs it in the robot device 10 .
- the controller 54 may obtain the driver from a device, such as a server, storing drivers or via the Internet.
- In step S 53 , the communication unit 30 sends information indicating an instruction to execute the solution (an instruction to perform the function for executing the solution) to the device. Upon receiving this information, the device executes the solution in accordance with the instruction.
- In step S 54 , the controller 54 checks whether the device is solving the problem.
- the controller 54 determines in step S 55 whether the problem has been solved. If the operation that is likely to be performed for executing the solution is performed by the device within a preset time and if a certain sensor (the visual sensor, for example) has detected this operation, the controller 54 determines that the problem has been solved (YES in step S 55 ) and finishes the processing. If this operation has not been performed within the preset time, or has not been detected by the sensor, the controller 54 determines that the problem has not been solved (NO in step S 55 ). In this case, the robot device 10 may search for another solution or resend the same instruction to the device in step S 56 .
- the controller 54 may determine whether it is possible to execute another solution. For example, if the room temperature is high and a device other than an air conditioner, such as an electric fan, is installed in the room, the controller 54 may search for another solution using the electric fan.
- the search unit 62 may search for another solution by using the Internet.
- If the device cannot be controlled via a communication function (NO in step S 49 ), the robot device 10 searches for a remote controller for operating the device in step S 57 .
- the robot device 10 may identify the remote controller by analyzing an image captured by the visual sensor, for example.
- the robot device 10 determines in step S 58 whether a remote controller has been found. If the robot device 10 fails to find a remote controller (NO in step S 58 ), it searches for another solution in step S 59 .
- If a remote controller has been found (YES in step S 58 ), the robot device 10 operates the remote controller to input an instruction for executing the solution in step S 60 . Upon receiving this instruction, the device executes the solution in accordance with the instruction.
- the controller 54 checks whether the device is solving the problem, and determines in step S 61 whether the problem has been solved. If the operation that is likely to be performed for executing the solution is performed by the device within a preset time and if a certain sensor (the visual sensor, for example) has detected this operation, the controller 54 determines that the problem has been solved (YES in step S 61 ) and finishes the processing. If this operation has not been performed within the preset time, or has not been detected by the sensor, the controller 54 determines that the problem has not been solved (NO in step S 61 ). In this case, the robot device 10 may search for another solution or resend the same execution instruction to the device by using the remote controller in step S 62 .
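- The two control paths (network control in step S 53 versus remote controller operation in steps S 57 through S 60 ) can be sketched as follows, with all callables supplied by the caller as illustrative assumptions.

```python
# Hypothetical sketch: control the device over a network when possible
# (step S53); otherwise search for its remote controller (step S57) and
# operate it (step S60), or give up and look for another solution (S59).
def execute_via_device(device, send, operate_remote, find_remote):
    if device["network_controllable"]:
        send(device, "execute_solution")
        return "sent_over_network"
    remote = find_remote(device)
    if remote is None:
        return "search_other_solution"
    operate_remote(remote, "execute_solution")
    return "operated_remote"
```

In the disclosure, find_remote corresponds to identifying the remote controller by analyzing an image captured by the visual sensor.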
- In FIG. 17 , staff members 76 and the robot device 10 are shown.
- the staff members 76 (three members) are holding a meeting, for example, and the robot device 10 is with the staff members 76 .
- the situation information collector 44 of the robot device 10 collects situation information concerning the surroundings of the robot device 10 by using various sensors. More specifically, the situation information collector 44 collects the content of the conversation among the staff members 76 as voice, images (images of the faces and the entire bodies of the staff members 76 ), the temperature, and the humidity as situation information. The detector 56 of the robot device 10 detects the surrounding situation based on the situation information (the state of the staff members 76 (such as the content of the conversation, expressions, and attitudes) and the temperature). If a staff member 76 says “I want to have the content of our conversation written on paper”, the situation information collector 44 collects voice information concerning this conversation as situation information. The detector 56 then determines based on the conversation whether any problem (issue) is occurring in this situation. In this example, the detector 56 detects a problem (issue) “a staff member 76 wants to have the content of their conversation printed on paper”.
- the solution specifying unit 58 refers to the solution management information 38 and specifies a solution for solving this problem.
- This problem can be solved by a combination of “a function of collecting the content of conversation as voice information and converting it into a character string” and “a print function”, for example. That is, the solution for this problem is constituted by a combination of “a function of collecting the content of conversation as voice information and converting it into a character string” and “a print function”.
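- A collaborative function of this kind is simply a chain of single-device functions, which can be sketched as a generic composition helper (an illustration of the idea, not the disclosed mechanism).

```python
# Chain single-device functions into one collaborative function: the
# output of each step (e.g. transcribed conversation text) feeds the
# next step (e.g. the print function of the multifunction device).
def compose(*steps):
    def collaborative(data):
        for step in steps:
            data = step(data)
        return data
    return collaborative
```

For example, compose(transcribe, print_document) would realize the combination of "collecting the content of conversation as voice information and converting it into a character string" and "a print function".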
- the robot device 10 searches for a device having a print function.
- the robot device 10 captures an image of a device around the robot device 10 by using a visual sensor (camera), and the identifying unit 64 identifies this device by analyzing the image.
- a multifunction device 78 having a print function is installed near the robot device 10 .
- the identifying unit 64 identifies the multifunction device 78 and also identifies the function (print function, for example) of the multifunction device 78 .
- the solution is executed by using the robot device 10 and the multifunction device 78 .
- This solution is a solution implemented by a collaborative function using the robot device 10 and the multifunction device 78 , that is, a solution implemented by combining the robot device 10 and the multifunction device 78 as collaborative devices.
- the robot device 10 may execute the solution automatically or in response to an instruction from a user.
- the robot device 10 communicates with the multifunction device 78 , sends information indicating the content of conversation among the staff members 76 to the multifunction device 78 , and provides an instruction to print the information to the multifunction device 78 .
- the content of conversation among the staff members 76 is printed on paper.
- the robot device 10 may move to the multifunction device 78 , fetch paper on which the content of conversation is printed, and hand it to a staff member 76 .
- the robot device 10 may output voice to inform the staff members 76 that the solution has been executed.
- In the above-described example, the robot device 10 is used as one of the devices to execute the solution.
- the robot device 10 may not necessarily be used, and instead, the solution may be executed by using a device other than the robot device 10 .
- the robot device 10 provides an instruction to execute the solution to another device.
- the robot device 10 may directly operate the multifunction device 78 .
- In another example, if a staff member 76 wants to make a copy of a document, the solution specifying unit 58 specifies “a copy function” as the solution.
- the robot device 10 searches for a device having a copy function (such as the multifunction device 78 ), and causes the multifunction device 78 to make a copy of the document.
- the robot device 10 receives the document from a staff member 76 , sets it in the multifunction device 78 , and directly operates the operation panel of the multifunction device 78 , thereby providing a copy instruction to the multifunction device 78 .
- the robot device 10 identifies the operation panel by analyzing an image captured by the visual sensor, for example, and then provides a copy instruction.
- Application scene 2 will be discussed below with reference to FIG. 18 . In FIG. 18 , three persons 82 and the robot device 10 are shown, and a person 80 is lying down.
- the situation information collector 44 of the robot device 10 collects an image of the person 80 lying down as situation information, and the detector 56 determines based on this image whether a problem is occurring. In the example in FIG. 18 , a problem “someone is lying down” is detected.
- the solution specifying unit 58 refers to the solution management information 38 and specifies a solution for solving this problem.
- This problem can be solved by “asking someone for help to rescue the person lying down”. That is, an example of the solution for this problem is “asking someone for help”.
- the robot device 10 asks a person 82 other than the person 80 lying down for help. To do so, the robot device 10 may output sound or move to the person 82 to touch him or her.
- the controller 54 may display information concerning the rescue procedure on the UI 50 of the robot device 10 or the UI 72 of the terminal device 14 of the person 82 .
- the robot device 10 may identify the terminal device 14 by using a certain sensor and send information concerning the rescue procedure to the terminal device 14 .
- the solution specifying unit 58 specifies a solution “carrying the person lying down to a safe place”, for example. If the robot device 10 has the function of carrying objects, it may carry the person 80 by itself, together with another person 82 , together with another device, or together with both another device and another person 82 . The robot device 10 identifies a device which may carry the person 80 . If the robot device 10 does not have the function of carrying objects, it may instruct a person 82 to carry the person 80 lying down to a safe place. Information indicating this instruction may be displayed on the UI 50 of the robot device 10 , or may be sent to the terminal device 14 of the person 82 and displayed on the UI 72 of the terminal device 14 .
- Application scene 3 will be discussed below with reference to FIG. 19 .
- In FIG. 19 , three users 84 and the robot device 10 are shown. If a user 84 says “I'm thirsty”, the situation information collector 44 of the robot device 10 collects voice information concerning this remark as situation information, and the detector 56 determines based on this remark whether a problem is occurring. In this example, a problem “being thirsty” is detected.
- the solution specifying unit 58 refers to the solution management information 38 and specifies a solution for solving the problem. This problem can be solved by “buying a drink”, for example. That is, the solution for this problem is constituted by “a function of providing a drink”.
- the robot device 10 searches for a device for providing a drink.
- the robot device 10 captures an image of a device around the robot device 10 by using a visual sensor (camera), and the identifying unit 64 identifies this device by analyzing the image.
- In the example in FIG. 19 , a vending machine 86 (an example of the device) is installed near the robot device 10 .
- the identifying unit 64 identifies the vending machine 86 and identifies the function (the function of providing a drink for a charge, for example) of the vending machine 86 .
- the identifying unit 64 may identify the payment instrument (such as electronic money or cash) for the vending machine 86 , based on the image captured by the visual sensor, or may communicate with the vending machine 86 to identify the payment instrument.
- the controller 54 determines whether the robot device 10 has this payment instrument.
- If the robot device 10 has this payment instrument, the controller 54 performs the payment operation of the robot device 10 .
- the robot device 10 then buys a drink from the vending machine 86 by paying with this payment instrument.
- the robot device 10 may move to the vending machine 86 and directly operate it to buy a drink.
- the robot device 10 identifies purchase buttons of the vending machine 86 by analyzing the image captured by the visual sensor, and buys a drink.
- the robot device 10 may deliver this drink to the user 84 .
- the robot device 10 may buy a drink in response to an instruction from the user 84 or without an instruction from the user 84 .
- the controller 54 causes information indicating a solution (the user 84 can buy a drink) to be displayed on the UI 50 . If the user 84 provides a purchase instruction by using the UI 50 , the robot device 10 buys a drink by paying with the payment instrument. Information indicating the solution may be sent to the terminal device 14 of the user 84 and displayed on the UI 72 of the terminal device 14 . If the user 84 provides a purchase instruction by using the UI 72 , information concerning the purchase instruction is sent from the terminal device 14 to the robot device 10 , and the robot device 10 buys a drink in response to the purchase instruction.
- the robot device 10 may pay by receiving payment support from at least one of devices other than the robot device 10 and the users 84 .
- the robot device 10 may buy a drink by receiving money from a user 84 concerned with the solution for solving the problem or by borrowing the payment instrument from another device.
- a user 84 concerned with the solution for solving the problem is the user 84 who said “I'm thirsty”. This user 84 is identified by a certain sensor of the robot device 10 .
- An example of the user decision mode (user decision mode 1) will be described below in detail with reference to FIG. 20 .
- In FIG. 20 , staff members 87 are talking, and plural devices (such as a multifunction device 78 , a projector 88 , a camera 90 , a display 92 , and an aroma diffuser 94 ) are installed near the robot device 10 .
- the situation information collector 44 of the robot device 10 collects situation information concerning the surroundings of the robot device 10 by using various sensors, and the detector 56 detects the surrounding situation based on the situation information.
- the identifying unit 64 identifies devices around the robot device 10 . For example, the detector 56 detects a situation (problem) “three staff members are holding a meeting and seem to be arguing about something”. In the example in FIG. 20 , the identifying unit 64 identifies the multifunction device 78 , the projector 88 , the camera 90 , the display 92 , and the aroma diffuser 94 .
- the identifying unit 64 may identify the terminal device 14 of a staff member 87 (who seems to have a problem, for example).
- the communication unit 30 sends the situation information to a terminal device 14 registered in the robot device 10 or a terminal device 14 identified by the identifying unit 64 (the terminal device 14 of the staff member 87 who seems to have a problem, for example) under the control of the controller 54 .
- the situation information includes information concerning the situation (problem) detected by the detector 56 and information concerning the devices identified by the identifying unit 64 .
- FIG. 21 shows an example of the screen.
- a situation explanation screen 96 is displayed.
- information concerning the situation (problem) detected by the detector 56 and information concerning the devices identified by the identifying unit 64 are displayed.
- a character string indicating “three members are holding a meeting and seem to be arguing about something” is displayed as the situation (problem)
- a character string “there are a multifunction device B, a projector C, a camera D, a display E, and an aroma diffuser F” is displayed.
- If the user requests additional information by using the terminal device 14 , the communication unit 30 of the robot device 10 sends additional information to the terminal device 14 in response to this request. If an image representing the surrounding situation is requested by the user as additional information, the communication unit 30 sends image data indicating the surrounding situation to the terminal device 14 .
- the communication unit 30 sends image data linked with the staff members 87 who seem to have a problem and image data linked with the devices identified by the identifying unit 64 to the terminal device 14 as the image data indicating the surrounding situation.
- the image data linked with the staff members 87 who seem to have a problem may be image data indicating an image of the staff members 87 captured by the visual sensor or image data schematically indicating the staff members 87 .
- the image data linked with the devices may be image data indicating images of the devices captured by the visual sensor (camera) when the identifying unit 64 has identified the devices or image data schematically representing the identified devices (icons).
- the image data schematically representing the identified devices may be stored in the robot device 10 in advance, or may be stored in another device, such as a server, in advance and be sent to the robot device 10 .
- the communication unit 30 sends image data (image data linked with the staff members 87 and image data linked with the devices, for example) as the additional information to the terminal device 14 .
- FIG. 22 shows an example of the image.
- a situation explanation screen 98 is displayed.
- a set of images is displayed as the additional information.
- an image 100 linked with the staff members 87 , a device image 102 linked with the multifunction device 78 , a device image 104 linked with the projector 88 , a device image 106 linked with the camera 90 , a device image 108 linked with the display 92 , and a device image 110 linked with the aroma diffuser 94 are displayed.
- If the user specifies a device image on the situation explanation screen 98 and provides an instruction to execute a solution by using the device linked with the specified device image, information indicating this instruction is sent to the specified device.
- Upon receiving the instruction, this device executes the solution for solving the problem detected by the detector 56 .
- Information indicating the instruction may be sent to the device from the terminal device 14 or from the robot device 10 .
- the user specifies the device images 108 and 110 , for example.
- the solution specifying unit 58 refers to the solution management information 38 and specifies a solution which is used for solving the problem “three members are holding a meeting and seem to be arguing about something” detected by the detector 56 and which uses the display 92 linked with the device image 108 and the aroma diffuser 94 linked with the device image 110 .
- the solution specifying unit 58 specifies (identifies) a solution for solving the problem “three members are holding a meeting and seem to be arguing about something” detected by the detector 56 by referring to the solution management information 38 , and also specifies a set of devices used for the identified solution (a set of devices having functions for executing the solution) by referring to the device function management information 40 and the collaborative function management information 42 . If the user specifies the display 92 and the aroma diffuser 94 , the solution specifying unit 58 selects a solution to be executed by using the display 92 and the aroma diffuser 94 .
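- Restricting the candidate solutions to the devices the user specified can be sketched as a subset filter over the collaborative function management information. The table shape below is an assumption for illustration, not the disclosed format.

```python
# Keep only registered collaborative functions whose whole device set
# is covered by the devices the user specified on the screen, mirroring
# the lookup in the collaborative function management information 42.
def solutions_for_devices(collab_table, specified):
    chosen = set(specified)
    return [entry["solution"] for entry in collab_table
            if set(entry["devices"]) <= chosen]
```

With the display 92 and the aroma diffuser 94 specified, only collaborative functions registered for exactly those devices (or a subset of them) remain.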
- an execution instruction screen 112 is displayed on the UI 72 of the terminal device 14 , as shown in FIG. 23 .
- information indicating a function (solution) that is executable by the devices specified by the user (the devices linked with the device images 108 and 110, for example) is displayed.
- the solution that is executable by using the display 92 and the aroma diffuser 94 is “displaying a soothing image on the display 92 and diffusing a soothing aroma from the aroma diffuser 94 ”, for example.
- This solution is a collaborative function executable by combining the display 92 and the aroma diffuser 94 , and is registered in the collaborative function management information 42 .
- information indicating this instruction is sent from the terminal device 14 to the display 92 and the aroma diffuser 94 .
- This information may alternatively be sent to the display 92 and the aroma diffuser 94 via the robot device 10 .
- the display 92 displays a soothing image
- the aroma diffuser 94 diffuses a soothing aroma.
- Data indicating a soothing image may be stored in the robot device 10 in advance and may be sent from the robot device 10 to the display 92 .
- the data may alternatively be stored in another device, such as a server, and be sent from this device to the display 92 .
- the screens 96 , 98 , and 112 shown in FIGS. 21 through 23 may be displayed on the UI 50 of the robot device 10 .
- the screens 96 , 98 , and 112 may not necessarily be displayed on the UI 72 of the terminal device 14 .
- Information displayed on the screens 96 , 98 , and 112 may be output as voice information.
- Another example of the user decision mode (user decision mode 2) will be discussed below in detail.
- the communication unit 30 sends situation information to a terminal device 14 under the control of the controller 54 .
- a notifying screen 114 is displayed, as shown in FIG. 24 .
- a message that a problem (accident) that the robot device 10 cannot handle has occurred is displayed.
- a situation explanation screen 116 is displayed on the UI 72 of the terminal device 14 .
- an explanation of the situation (problem) detected by the robot device 10 is displayed.
- a message asking the user whether the user has understood the situation is displayed. If the user presses a “NO” button, the user can make an inquiry to the robot device 10 , that is, the user can make a request for additional information.
- an inquiry screen 118 is displayed, as shown in FIG. 26 .
- the user can input inquiries (issues about which the user additionally wants to know) by using the terminal device 14 .
- the user has input some inquiries, such as "I want data about XXX" and "I want to see video of XXX". Information indicating the inquiries is sent from the terminal device 14 to the robot device 10.
- the situation information collector 44 of the robot device 10 collects information (image data and voice data, for example) in response to the user inquiries.
- the communication unit 30 then sends the information collected by the situation information collector 44 to the terminal device 14 .
- The information is displayed on the situation explanation screen 116 on the UI 72 of the terminal device 14.
- a solution display screen 120 is displayed on the UI 72 of the terminal device 14 , as shown in FIG. 27 .
- information indicating solutions specified by the solution specifying unit 58 (such as explanations and names of the solutions) is displayed. For example, if, as discussed above, the problem “three staff members are holding a meeting and seem to be arguing about something” is detected, the solution specifying unit 58 refers to the solution management information 38 and specifies solutions for solving the problem.
- Information indicating the specified solutions is sent from the robot device 10 to the terminal device 14, and is displayed on the UI 72 of the terminal device 14. In the example in FIG. 27, the solutions may be displayed in a random order or in descending order of effectiveness or feasibility. For example, a solution using a device located closer to the terminal device 14 or the robot device 10 has a higher feasibility and is thus displayed in a higher position in the list.
- Information concerning the positions of the robot device 10 , the terminal device 14 , and the other devices may be obtained by using a global positioning system (GPS).
- the controller 54 calculates the distance between each device and the robot device 10 or the terminal device 14 by using the GPS position information. Based on the calculation results, the order of the solutions displayed on the solution display screen 120 is determined.
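The distance-based ordering described above can be sketched as follows; the haversine great-circle formula and the coordinate values are illustrative assumptions, as the patent only says that GPS position information is used and that a closer device yields a higher list position.

```python
import math

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) pairs in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2)
         * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def order_solutions(solutions, terminal_pos):
    """Closer device => higher feasibility => higher position in the list."""
    return sorted(solutions,
                  key=lambda s: haversine_km(s["device_pos"], terminal_pos))

# Hypothetical solutions, each using a device at a GPS position.
solutions = [
    {"name": "diffuse a citrus aroma", "device_pos": (35.6800, 139.7700)},
    {"name": "display a soothing image", "device_pos": (35.6810, 139.7670)},
]
terminal = (35.6810, 139.7660)
ordered = order_solutions(solutions, terminal)
```

With these coordinates the display sits a few dozen meters from the terminal, so its solution is listed first.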
- a screen 122 is displayed on the UI 72 , as shown in FIG. 28 .
- a message such as “please select a solution”, that is, a message instructing the user to select a solution, is displayed. If the user does not find a suitable solution (a desirable solution) among the recommended solutions (see FIG. 27 ) presented to the user, the user can provide another instruction.
- a checking screen 124 is displayed on the UI 72 , as shown in FIG. 29 .
- information indicating a solution selected by the user is displayed.
- the solution (2) “displaying a soothing image on the display” and the solution (3) “diffusing a citrus aroma from the aroma diffuser” are selected by the user.
- the user may specify another solution.
- a user input screen 126 is displayed on the UI 72 of the terminal device 14 , as shown in FIG. 30 .
- the user specifies a device and a solution to be executed by using this device.
- a multifunction device is specified as the device for executing the solution
- “XXX processing” is specified as the solution using this multifunction device.
- the user may specify a device used for the solution by inputting the name of this device with characters or by specifying a device image linked with this device.
- the user selects the device image linked with a device used for executing the solution.
- By selecting a device image in this manner, a user unfamiliar with the name of a device used for a solution can still specify the device.
- a list of functions included in the device is displayed on the UI 72 . Functions included in a device can be specified by referring to the device function management information 40 . The user then selects a function used for the solution from the list.
- the user provides an instruction to execute the solution by using the terminal device 14 , information indicating the instruction is sent to the device specified by the user, and the device executes the solution.
- the screens 114 , 116 , 118 , 120 , 122 , 124 , and 126 shown in FIGS. 24 through 30 may be displayed on the UI 50 of the robot device 10 .
- the screens 114 , 116 , 118 , 120 , 122 , 124 , and 126 may not necessarily be displayed on the UI 72 of the terminal device 14 .
- Information displayed on the screens 114 , 116 , 118 , 120 , 122 , 124 , and 126 may be output as voice information.
- information to be displayed on the UI 72 of the terminal device 14 is sent from the robot device 10 to the terminal device 14 .
- the information may alternatively be sent from another device, such as a server, to the terminal device 14 under the control of the robot device 10 .
- a device is identified by obtaining device identification information indicating this device by using the augmented reality (AR) technology.
- a device to be singly used is identified by obtaining device identification information indicating this device, and collaborative devices are identified by obtaining device identification information concerning these devices.
- As the AR technology, a known AR technology may be utilized. Examples of known AR technologies are a marker-based AR technology using a marker, such as a two-dimensional barcode, a markerless AR technology using an image recognition technology, and a position information AR technology using position information.
- device identification information may be obtained without using the AR technology.
- a device connected to a network may be identified based on the IP address of this device or by reading the device ID.
- For devices and terminals having various wireless communication functions, such as infrared communication, visible light communication, Wi-Fi, and Bluetooth, the device IDs of the devices or terminals to be used as collaborative devices or terminals may be obtained by using the corresponding wireless communication functions, and then a collaborative function may be executed.
- FIG. 31 shows the schematic external appearance of the multifunction device 78 .
- a marker 128 such as a two-dimensional barcode, is provided on the housing of the multifunction device 78 .
- the marker 128 is coded device identification information of the multifunction device 78 .
- the robot device 10 captures an image of the marker 128 by using a visual sensor so as to generate image data indicating the marker 128 .
- the controller 54 of the robot device 10 performs decoding processing on the marker image indicated by the image data so as to extract device identification information.
- the robot device 10 can thus identify the multifunction device 78 .
- the identifying unit 64 of the robot device 10 then refers to the device function management information 40 and specifies function information indicating functions associated with the extracted identification information. As a result, the functions provided in the multifunction device 78 are specified (identified).
- the marker 128 may also include coded function information indicating the functions of the multifunction device 78 .
- By the controller 54 performing decoding processing on the image data indicating the marker 128, the device identification information of the multifunction device 78 is extracted, and function information indicating the functions of the multifunction device 78 is also extracted.
- As a result, the multifunction device 78, as well as the functions provided in the multifunction device 78, is specified (identified).
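The marker-based identification path can be sketched as below. The decoded payload format (JSON with a `device_id` field and an optional `functions` field) is an assumption, as is the stand-in for the device function management information 40; the patent specifies only that the marker encodes the device identification information and may additionally encode function information.

```python
import json

# Hypothetical stand-in for the device function management information 40.
DEVICE_FUNCTION_MANAGEMENT_INFO = {
    "MFD-78": ["print", "scan", "copy", "fax"],
}

def identify_from_marker(decoded_payload):
    """Extract the device ID (and optional coded function info) from a
    decoded marker, then resolve the device's functions from the
    management information when the marker itself carries none."""
    data = json.loads(decoded_payload)
    device_id = data["device_id"]
    # Prefer function information coded directly in the marker, if present.
    functions = data.get("functions") or DEVICE_FUNCTION_MANAGEMENT_INFO[device_id]
    return device_id, functions

# Marker carrying only device identification information.
device_id, functions = identify_from_marker('{"device_id": "MFD-78"}')
```

The same routine handles a marker that also codes function information: `data["functions"]` then takes precedence over the table lookup.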
- the robot device 10 captures an image of the entirety or part of the external appearance of a device (the multifunction device 78 , for example) by using the visual sensor so as to generate external-appearance image data.
- Capturing an image of information for specifying the device, such as the device name (the product name, for example) and the model number, contributes to specifying the device.
- the controller 54 of the robot device 10 then identifies the device based on the external-appearance image data.
- external-image associating information indicating the association between external-appearance image data indicating the entirety or part of the external appearance of a device and device identification information concerning this device is stored.
- the controller 54 compares the external-appearance image data obtained by capturing an image of the device with each item of external-appearance image data included in the external-image associating information, and specifies the device identification information concerning the device to be used based on the comparison results. For example, the controller 54 extracts features of the external appearance of the device from the external-appearance image data obtained by capturing an image of the device, and specifies external-appearance image data having the same or similar features from a set of external-appearance image data included in the external-image associating information. The controller 54 then specifies the device identification information associated with the specified external-appearance image data. As a result, the device (the multifunction device 78 , for example) is specified.
- the device may be specified based on the device name and the model number indicated by the external-appearance image data.
- the identifying unit 64 refers to the device function management information 40 and specifies function information indicating the functions associated with the specified device identification information. As a result, the functions provided in the device (the multifunction device 78 , for example) are specified.
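The appearance-based comparison can be sketched as a nearest-match search over stored feature vectors. A real system would extract learned image features; the plain numeric vectors, similarity measure, and table layout below are all assumptions standing in for the external-image associating information.

```python
# Hypothetical stand-in for the external-image associating information:
# each entry pairs a device ID with a feature vector of its appearance.
EXTERNAL_IMAGE_ASSOCIATING_INFO = [
    {"device_id": "multifunction device B", "features": (0.9, 0.1, 0.3)},
    {"device_id": "projector C",           "features": (0.2, 0.8, 0.5)},
]

def identify_by_appearance(captured_features):
    """Return the device ID whose stored features are closest (squared
    Euclidean distance) to the features of the captured image."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(EXTERNAL_IMAGE_ASSOCIATING_INFO,
               key=lambda e: sq_dist(e["features"], captured_features))
    return best["device_id"]

# Features extracted from the captured external-appearance image data.
match = identify_by_appearance((0.85, 0.15, 0.3))
```

The captured features differ slightly from the stored ones, matching the "same or similar features" comparison described above.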
- position information indicating a position at which a device is installed is obtained by using a GPS function.
- Each device has a GPS function and obtains device position information indicating the position of the device.
- the robot device 10 then outputs information indicating a request to obtain device position information to a device, and receives the device position information from this device as a response to the request.
- the controller 54 of the robot device 10 identifies the device based on the device position information.
- position associating information indicating the association between device position information indicating a position at which a device is installed and device identification information concerning this device is stored.
- the controller 54 specifies the device identification information associated with the device position information, based on the position associating information. As a result, the device is specified (identified).
- the identifying unit 64 refers to the device function management information 40 and specifies function information indicating the functions associated with the specified device identification information. As a result, the functions provided in the device (the multifunction device 78 , for example) are specified (identified).
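The position associating lookup can be sketched as matching the reported device position against registered installation positions. The coordinate tolerance and the table layout are assumptions; the patent says only that device position information is associated with device identification information.

```python
# Hypothetical stand-in for the position associating information.
POSITION_ASSOCIATING_INFO = [
    {"position": (35.6800, 139.7700), "device_id": "multifunction device B"},
    {"position": (35.6810, 139.7670), "device_id": "projector C"},
]

def identify_by_position(reported, tolerance=0.0005):
    """Return the device ID registered at (approximately) the reported
    GPS position, or None when no installation position matches."""
    for entry in POSITION_ASSOCIATING_INFO:
        lat, lon = entry["position"]
        if (abs(lat - reported[0]) <= tolerance
                and abs(lon - reported[1]) <= tolerance):
            return entry["device_id"]
    return None

# Device position information received in response to the request.
found = identify_by_position((35.6801, 139.7699))
```

The small tolerance absorbs GPS jitter; a reported position far from every registered installation returns None, i.e. no device is identified.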
- a device image linked with the multifunction device 78 is displayed on the terminal device 14 as situation information.
- the device image 102 linked with the multifunction device 78 is displayed on the UI 72 of the terminal device 14 , as shown in FIG. 32 .
- the device image 102 may be an image captured by the visual sensor of the robot device 10 or may be an image schematically representing the multifunction device 78 .
- information indicating the name of this device may be sent from the robot device 10 to the terminal device 14 , and the name of the device may be displayed on the UI 72 of the terminal device 14 .
- the name “multifunction device B” is displayed.
- For example, if the user specifies the device image 102 by using the terminal device 14, information indicating the solutions to be executed by using the multifunction device 78 linked with the device image 102 (button images for specifying solutions to be executed, for example) is displayed on the UI 72 of the terminal device 14, as shown in FIG. 33.
- the multifunction device B has a print function, a scan function, a copy function, and a fax function, for example. Button images for executing these functions as solutions are displayed on the UI 72 .
- execution instruction information indicating an instruction to execute the print function is sent from the terminal device 14 or the robot device 10 to the multifunction device 78 .
- the execution instruction information includes control data for executing the print function and data such as image data to be printed by using the print function.
- Upon receiving the execution instruction information, the multifunction device 78 performs printing in accordance with the execution instruction information.
- a device image linked with the multifunction device 78 and a device image linked with the projector 88 are displayed on the terminal device 14 as situation information.
- the device image 102 linked with the multifunction device 78 and the device image 104 linked with the projector 88 are displayed on the UI 72 of the terminal device 14, as shown in FIG. 34.
- the device images 102 and 104 may be images captured by the visual sensor of the robot device 10 or images schematically representing the multifunction device 78 and the projector 88 .
- information indicating the name of the device may be sent from the robot device 10 to the terminal device 14 and may be displayed on the UI 72 of the terminal device 14.
- the name of the multifunction device 78 “multifunction device B” and the name of the projector 88 “projector C” are displayed.
- If the user specifies the device images 102 and 104 by using the terminal device 14, information indicating solutions using the multifunction device 78 linked with the device image 102 and the projector 88 linked with the device image 104 (button images for executing the solutions) is displayed on the UI 72 of the terminal device 14, as shown in FIG. 35.
- Each of the solutions is a solution implemented by a collaborative function using the multifunction device 78 and the projector 88 .
- a collaborative function of projecting a scanned image generated by the multifunction device 78 by using the projector 88 and a collaborative function of printing an image projected by the projector 88 by using the multifunction device 78 are executable.
- Button images for executing these collaborative functions are displayed on the UI 72 of the terminal device 14. If the user specifies a button image to instruct the execution of the solution (collaborative function), execution instruction information indicating an instruction to execute this solution is sent from the terminal device 14 or the robot device 10 to the multifunction device 78 and the projector 88. Upon receiving the execution instruction information, the multifunction device 78 and the projector 88 execute the collaborative function specified by the user.
- the multifunction device 78 and the projector 88 may be specified as collaborative devices to be combined as a result of the user touching the device image 102 with a finger, for example, and then sliding the finger to the device image 104 to specify the device images 102 and 104 .
- the user may touch the device image 104 first with a finger and then slide the finger to the device image 102 .
- a screen contact medium such as a pen, may be used instead of a finger.
- the user may specify the device images 102 and 104 by connecting them so as to specify the multifunction device 78 and the projector 88 as collaborative devices.
- the user may specify the device images 102 and 104 by superposing them on each other so as to specify the multifunction device 78 and the projector 88 as collaborative devices.
- the user may draw a figure such as a circle around device images linked with collaborative devices, or may specify device images linked with collaborative devices within a preset time.
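The touch-and-slide specification of collaborative devices can be sketched as hit-testing the indicator's path against the on-screen device images and recording the order in which images are touched. The bounding-box layout and coordinates are illustrative assumptions.

```python
# Hypothetical on-screen layout of device images as bounding boxes
# (x0, y0, x1, y1); values are made up for illustration.
DEVICE_IMAGES = {
    "multifunction device B": (100, 100, 160, 160),
    "projector C":            (300, 100, 360, 160),
}

def hit(point):
    """Return the device image containing the point, or None."""
    x, y = point
    for name, (x0, y0, x1, y1) in DEVICE_IMAGES.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def connecting_order(path):
    """Return device names in the order the indicator touched their
    images while sliding across the screen."""
    order = []
    for point in path:
        name = hit(point)
        if name and (not order or order[-1] != name):
            order.append(name)
    return order

# Indicator path: touch the first image, slide across, reach the second.
order = connecting_order([(120, 130), (200, 130), (320, 130)])
```

The resulting order determines both the collaborative devices and their connecting order; reversing the path reverses the order.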
- the user may specify devices to be canceled on the screen or press a collaboration cancel button.
- the user may specify devices to be canceled by performing a preset operation, such as drawing a cross mark.
- a collaborative device may be set as a basic collaborative device in advance.
- The multifunction device 78, for example, is set as a basic collaborative device in advance.
- Device identification information concerning a basic collaborative device may be stored in the robot device 10 or another device, such as a server, in advance.
- the user may alternatively specify a basic collaborative device by using the terminal device 14 . If a basic collaborative device is set, the user specifies a device image linked with a device other than the basic collaborative device so as to select a device to be combined with the basic collaborative device.
- the functions are implemented by using hardware devices.
- the functions may be implemented by software (software applications).
- function images are displayed on the UI 72 of the terminal device 14 .
- a function linked with a function image or a collaborative function using plural functions linked with plural function images may be specified.
- Device images linked with hardware devices and function images linked with functions implemented by software may be displayed together on the UI 72 .
- a collaborative function using a device linked with the device image and a function linked with the function image may be specified.
- a connection request is sent from the terminal device 14 to collaborative devices to be combined, and the terminal device 14 and these devices are connected to each other.
- a connection request may be sent from the robot device 10 to collaborative devices, and the robot device 10 and these devices are connected to each other. This connection processing will be described below with reference to the sequence diagram of FIG. 36 .
- In step S70, the user specifies a collaborative function to be executed by using the terminal device 14.
- In step S71, the terminal device 14 sends information indicating a connection request to the devices (the multifunction device 78 and the projector 88, for example) executing this collaborative function.
- the terminal device 14 obtains the address information indicating the addresses of these devices from the robot device 10 .
- the address information may be stored in the terminal device 14 .
- the terminal device 14 may obtain the address information by another approach.
- the terminal device 14 then sends information indicating a connection request to the collaborative devices (multifunction device 78 and projector 88 , for example) by using the obtained address information.
- Upon receiving the information indicating the connection request, in step S72, the multifunction device 78 and the projector 88 accept or do not accept this connection request. If the multifunction device 78 and the projector 88 are not allowed to connect to another device, or if the number of devices that have requested to connect to the multifunction device 78 and the projector 88 exceeds a maximum number, the multifunction device 78 and the projector 88 do not accept the connection request. If the multifunction device 78 and the projector 88 accept the connection request, they may prohibit the operation for changing settings information concerning the multifunction device 78 and the projector 88 in order to protect the settings information from being changed by the terminal device 14.
- changing of color parameters and the preset time to shift to a power-saving mode in the multifunction device 78 may be prohibited. This enhances the security for collaborative devices.
- more limitations may be imposed on changing of settings information for collaborative devices than for singly used devices. For example, changing of fewer setting items is allowed for collaborative devices than for singly used devices. Reading of personal information concerning the other users, such as operation records, may be prohibited. This enhances the security for personal information concerning the users.
- In step S73, result information indicating whether the connection request has been accepted is sent from the multifunction device 78 and the projector 88 to the terminal device 14. If the connection request has been accepted, communication is established between the terminal device 14 and each of the multifunction device 78 and the projector 88.
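The accept/reject decision of step S72 can be sketched as below: a device declines the request when connections are disallowed or a maximum connection count would be exceeded, and locks its settings on acceptance. Class and field names are assumptions for illustration.

```python
class Device:
    """Minimal model of a collaborative device handling connection requests."""

    def __init__(self, name, allow_connections=True, max_connections=2):
        self.name = name
        self.allow_connections = allow_connections
        self.max_connections = max_connections
        self.active_connections = 0
        self.settings_locked = False

    def handle_connection_request(self):
        """Accept the request only if connections are allowed and the
        maximum number of connections is not exceeded; on acceptance,
        lock settings changes to protect them from the terminal."""
        if not self.allow_connections:
            return False
        if self.active_connections >= self.max_connections:
            return False
        self.active_connections += 1
        self.settings_locked = True
        return True

mfd = Device("multifunction device B", max_connections=1)
first = mfd.handle_connection_request()   # accepted
second = mfd.handle_connection_request()  # rejected: limit reached
```

The boolean returned here corresponds to the result information sent back to the terminal device 14 in step S73.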
- In step S74, the user provides an instruction to execute the collaborative function by using the terminal device 14.
- In step S75, execution instruction information indicating an instruction to execute the collaborative function is sent from the terminal device 14 to the multifunction device 78 and the projector 88.
- the execution instruction information sent to the multifunction device 78 includes information (job information, for example) indicating processing to be executed by the multifunction device 78 .
- the execution instruction information sent to the projector 88 includes information (job information, for example) indicating processing to be executed by the projector 88 .
- the multifunction device 78 and the projector 88 execute the functions in accordance with the execution instruction information.
- If the collaborative function involves processing for sending and receiving data between the multifunction device 78 and the projector 88, such as transferring scanned data from the multifunction device 78 (multifunction device B) to the projector 88 (projector C) and projecting the scanned data by the projector 88, communication is established between the multifunction device 78 and the projector 88.
- the execution instruction information sent to the multifunction device 78 includes address information indicating the address of the projector 88
- the execution instruction information sent to the projector 88 includes address information indicating the address of the multifunction device 78 . Communication is established between the multifunction device 78 and the projector 88 by using these items of address information.
- In step S77, information indicating the completion of the execution of the collaborative function is sent from the multifunction device 78 and the projector 88 to the terminal device 14.
- In step S78, the information indicating the completion of the execution of the collaborative function is displayed on the UI 72 of the terminal device 14. If this information is not displayed after the lapse of a preset time after the execution instruction information has been sent, the terminal device 14 displays information indicating an error on the UI 72, and may resend the execution instruction information or the connection request information to the multifunction device 78 and the projector 88.
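The timeout-and-resend behavior around steps S77 and S78 can be sketched as a simple retry loop; the polling structure and retry count are assumptions, since the patent specifies only "a preset time" and that the instruction may be resent.

```python
def await_completion(check_completed, resend, max_retries=2):
    """Return 'completed' once the device reports completion; after each
    missed deadline, resend the execution instruction (or connection
    request) up to max_retries times, then report an error."""
    for _ in range(max_retries + 1):
        if check_completed():
            return "completed"
        resend()
    return "error"

# Simulated device that reports completion only after one resend.
state = {"resends": 0}

def check_completed():
    return state["resends"] >= 1

def resend():
    state["resends"] += 1

result = await_completion(check_completed, resend)
```

In a real terminal device the timeout would be wall-clock based; the counter here only makes the control flow visible.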
- In step S79, the user checks whether to cancel the collaboration state between the multifunction device 78 and the projector 88, and executes processing in accordance with the checking result in step S80. If the user decides to cancel the collaboration state, the user provides an instruction to cancel it by using the terminal device 14 so as to cancel the communication between the terminal device 14 and each of the multifunction device 78 and the projector 88. The communication between the multifunction device 78 and the projector 88 is also canceled. If the user decides not to cancel the collaboration state, the terminal device 14 may continue to provide an execution instruction.
- the above-described execution instruction information may be sent from the robot device 10 to collaborative devices.
- the order of displaying items of information concerning collaborative functions may be switched in accordance with the order of connecting device images linked with devices. This processing will be discussed below in detail with reference to FIGS. 37 through 39B .
- FIG. 37 illustrates a collaborative function management table, which is another example of the collaborative function management information 42 .
- information indicating a combination of device IDs, information indicating the names of the collaborative devices (device types, for example), information indicating collaborative functions (collaborative function information), information indicating the connecting order, and information indicating the priority level are associated with each other.
- the connecting order indicates the order of connecting device images linked with devices.
- the priority level indicates the priority of displaying items of information concerning collaborative functions.
- the device having a device ID “A”, for example, is a PC, and the device having a device ID “B” is a multifunction device.
- “scan transfer function” and “print function” are implemented as collaborative functions.
- “Scan transfer function” is the function of transferring image data generated by a scanning operation of the multifunction device (B) to the PC (A).
- "Print function" is the function of sending data (image data or document data, for example) stored in the PC (A) to the multifunction device (B) and printing the data in the multifunction device (B). If the user connects the multifunction device (B) to the PC (A), that is, if the user connects the device image linked with the multifunction device (B) to the device image linked with the PC (A), "scan transfer function" takes the first priority level, and "print function" takes the second priority level.
- information concerning “scan transfer function” is displayed preferentially over information concerning “print function”.
- the user connects the PC(A) to the multifunction device (B), that is, if the user connects the device image linked with the PC (A) to the device image linked with the multifunction device (B), “print function” takes the first priority level, and “scan transfer function” takes the second priority level.
- information concerning “print function” is displayed preferentially over information concerning “scan transfer function”.
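The mapping from connecting order to display priority described above can be sketched as a table keyed by the ordered device pair; the table contents follow the PC (A) / multifunction device (B) example in the text, while the dictionary layout is an illustrative assumption about the collaborative function management information 42.

```python
# Sketch of the collaborative function management table of FIG. 37:
# the same device pair maps to different priority orders depending on
# the connecting order of the device images.
COLLABORATIVE_FUNCTION_TABLE = {
    # (first-connected device, second-connected device) -> functions
    # listed in display-priority order.
    ("B", "A"): ["scan transfer function", "print function"],
    ("A", "B"): ["print function", "scan transfer function"],
}

def functions_in_display_order(connecting_order):
    """connecting_order is a tuple of device IDs in the order the user
    connected their device images on the screen."""
    return COLLABORATIVE_FUNCTION_TABLE[connecting_order]

# User connected the multifunction device (B) to the PC (A).
display_order = functions_in_display_order(("B", "A"))
```

Connecting B to A lists "scan transfer function" first; connecting A to B lists "print function" first, exactly as in the table description.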
- FIGS. 38A through 39B illustrate examples of screens displayed on the UI 72 of the terminal device 14 .
- the multifunction device (B) and the PC (A), for example, are identified.
- a device image 102 linked with the multifunction device (B) and a device image 130 linked with the PC (A) are displayed as situation information on the UI 72 of the terminal device 14 , as shown in FIG. 38A .
- the user connects device images representing devices to be combined by using an indicator (a user's finger, a pen, or a stylus, for example).
- the controller 74 of the terminal device 14 detects the positions at which the indicator contacts the screen to identify the movement of the indicator on the screen. For example, as indicated by an arrow 132 in FIG.
- the user touches the device image 102 on the screen by using the indicator and slides the indicator on the screen to the device image 130 so as to connect the device images 102 and 130 .
- This operation specifies the multifunction device (B) linked with the device image 102 and the PC (A) linked with the device image 130 as collaborative devices and also specifies the connecting order of the devices.
- the connecting order of device images corresponds to the connecting order of devices.
- the multifunction device (B) corresponds to a first device, while the PC (A) corresponds to a second device.
- the user connects the device image 102 to the device image 130 , and thus, the multifunction device (B) is connected to the PC (A).
- the controller 74 of the terminal device 14 may cause an image representing a path followed by the user to be displayed on the screen. After collaborative devices are connected to each other, the controller 74 may replace the path by a preset straight line and display it on the screen.
- the identifying unit 64 of the robot device 10 refers to the collaborative function management table shown in FIG. 37 , and identifies collaborative functions corresponding to a combination of the PC (A) and the multifunction device (B). Collaborative functions executable by combining the PC (A) and the multifunction device (B) are identified in this manner.
- the identifying unit 64 refers to the collaborative function management table and identifies the priority level associated with the connecting order. This will be explained more specifically with reference to FIG. 37 .
- the PC (A) and the multifunction device (B) are specified as collaborative devices, and thus, the collaborative functions executable by the PC (A) and the multifunction device (B) are “scan transfer function” and “print function”.
- the user has connected the multifunction device (B) to the PC (A) (B→A), and thus, "scan transfer function" takes the first priority level, while "print function" takes the second priority level.
- Information concerning the collaborative functions identified as described above and information concerning the priority levels are sent from the robot device 10 to the terminal device 14 .
- the controller 74 of the terminal device 14 causes the information concerning the collaborative functions to be displayed on the UI 72 according to the priority levels.
- the controller 74 of the terminal device 14 causes information concerning collaborative function candidates to be displayed on the UI 72 .
- “Scan transfer function” takes the first priority level, while “print function” takes the second priority level.
- the information concerning “scan transfer function” is displayed preferentially over the information concerning “print function”.
- the information concerning “scan transfer function” is displayed above the information concerning “print function”.
- a description “transfer scanned data generated by the multifunction device (B) to the PC (A)” is displayed.
- As the information concerning “print function”, a description “print data stored in the PC (A)” is displayed.
- the specified collaborative function is executed. If the user presses a “YES” button, for example, the collaborative function linked with the “YES” button is executed.
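- the priority-ordering step above can be sketched as a table lookup keyed by the connecting order of the devices. The table contents and names below are illustrative assumptions modeled on the FIG. 37 example, not the patent's actual implementation:

```python
# Hypothetical sketch of the collaborative function management table (FIG. 37).
# The key is the connecting order (source device -> destination device);
# the value lists collaborative functions from highest to lowest priority.
COLLABORATIVE_FUNCTIONS = {
    ("multifunction device (B)", "PC (A)"): ["scan transfer function", "print function"],
    ("PC (A)", "multifunction device (B)"): ["print function", "scan transfer function"],
}

def candidates_by_priority(source, destination):
    """Return collaborative function candidates, highest priority first."""
    return COLLABORATIVE_FUNCTIONS.get((source, destination), [])

# Connecting B to A (B -> A): the scan transfer function takes first priority.
print(candidates_by_priority("multifunction device (B)", "PC (A)"))
```

The terminal device 14 would then display the returned candidates top to bottom in this order.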
- Identification of collaborative functions and priority levels may be performed by the terminal device 14 instead of the robot device 10 .
- the user may draw a figure, such as a circle, instead of sliding an indicator between the device images.
- the drawing order corresponds to the connecting order.
- the user may output voice to specify collaborative devices and the connecting order thereof.
- FIGS. 39A and 39B illustrate another example of the operation.
- the user touches the device image 130 on the screen by using an indicator and slides the indicator on the screen to the device image 102 in the direction indicated by an arrow 134 , thereby connecting the device images 130 and 102 .
- This operation specifies the PC (A) linked with the device image 130 and the multifunction device (B) linked with the device image 102 as collaborative devices and also specifies the connecting order of the devices.
- the user connects the device image 130 to the device image 102 , and thus, the PC (A) is connected to the multifunction device (B).
- the display order of items of information concerning collaborative functions is changed in accordance with the connecting order of device images, that is, the connecting order of devices.
- the connecting order of devices serves as the order of utilizing the functions of the devices and the order of transferring data between the devices.
- the operation for connecting devices (that is, the operation for connecting device images) serves as the operation for specifying the order of utilizing the functions of the devices and the order of transferring data between the devices.
- information concerning a collaborative function which is highly likely to be used by the user is preferentially displayed. If the user connects the device image 102 linked with the multifunction device (B) to the device image 130 linked with the PC (A), it can be assumed that the user will utilize the collaborative function “using the function of the multifunction device (B) first to transfer data from the multifunction device (B) to the PC (A)”. If the user connects the device image 130 linked with the PC (A) to the device image 102 linked with the multifunction device (B), it can be assumed that the user will utilize the collaborative function “using the function of the PC (A) first to transfer data from the PC (A) to the multifunction device (B)”.
- the above-described display switching processing may be applied to the use of function images linked with functions. For example, displaying of items of information concerning collaborative functions is switched in accordance with the order of specifying a device image linked with a first function and a device image linked with a second function.
- Different functions may be assigned to different positions within a device image linked with a device to be used as a collaborative device.
- if a user specifies a certain position within a device image, information concerning a collaborative function implemented by using a function assigned to this position is preferentially displayed. This processing will be described below in detail.
- FIG. 40 shows an example of the device function management table.
- Data in the device function management table is stored in the robot device 10 as the device function management information 40 .
- the device ID, information indicating the device name (device type, for example), information indicating the position within a device image (device image position), information indicating the function assigned to the device image position (function information), and the image ID are associated with each other.
- the device image position is a specific position (specific portion) within a device image linked with a device.
- the device image position is, for example, a specific position within a device image schematically representing a device or a specific position within a device image captured by a camera. Different functions are assigned to the specific positions within a device image.
- FIGS. 41A and 41B illustrate examples of screens displayed on the UI 72 of the terminal device 14 .
- the multifunction device (B) and the PC (A), for example, are identified.
- the device images 102 and 130 are displayed as situation information on the UI 72 of the terminal device 14 , as shown in FIG. 41A .
- the following functions, for example, are assigned to specific positions of the device image 102 .
- a print function is assigned to a specific position (portion image 102 a ) corresponding to the body of the multifunction device (B).
- a scan function is assigned to a specific portion (portion image 102 b ) corresponding to a document cover, document glass, and an automatic document feeder of the multifunction device (B).
- a stapling function is assigned to a specific position (portion image 102 c ) corresponding to a post-processor of the multifunction device (B).
- the stapling function is the function of stapling sheets output from the multifunction device (B).
- the following functions, for example, are assigned to portions of the device image 130 .
- a data storage function is assigned to a specific position (portion image 130 a ) corresponding to the body of the PC (A).
- a screen display function is assigned to a specific position (portion image 130 b ) corresponding to the display of the PC (A).
- the data storage function is the function of storing data sent from another device in the PC (A).
- the screen display function is the function of displaying data sent from another device on the PC (A).
- the controller 74 of the terminal device 14 may cause the names of functions (such as print and scan) assigned to specific positions within a device image to be displayed on the UI 72 . This enables the user to understand which functions are assigned to specific positions.
- the names of functions may not necessarily be displayed.
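- the assignment of functions to positions within a device image can be sketched as a hit test over regions of the image. The region coordinates and portion names below are assumptions for illustration, loosely following the portion images 102 a through 102 c:

```python
# Hypothetical sketch: functions assigned to regions within a device image.
# Each entry maps a region rectangle (x, y, width, height) to its function.
DEVICE_IMAGE_FUNCTIONS = {
    "multifunction device (B)": [
        ((0, 40, 100, 60), "print function"),      # body (portion image 102a)
        ((0, 0, 100, 40), "scan function"),        # reader (portion image 102b)
        ((100, 40, 30, 60), "stapling function"),  # post-processor (portion image 102c)
    ],
}

def function_at(device, x, y):
    """Return the function assigned to the position (x, y) within the device image."""
    for (rx, ry, rw, rh), function in DEVICE_IMAGE_FUNCTIONS.get(device, []):
        if rx <= x < rx + rw and ry <= y < ry + rh:
            return function
    return None

print(function_at("multifunction device (B)", 50, 20))  # inside the reader region
```

Touching a point inside the reader region resolves to the scan function; a point outside every region resolves to no function.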
- this function is specified as a collaborative function.
- the user connects specific positions (portion images) to which functions are assigned by using an indicator. For example, as indicated by an arrow 136 in FIG. 41A , the user touches the portion image 102 b with the indicator and slides the indicator to the portion image 130 b so as to connect the portion images 102 b and 130 b .
- This connecting operation specifies the multifunction device (B) linked with the device image 102 including the portion image 102 b and the PC (A) linked with the device image 130 including the portion image 130 b as collaborative devices, and also specifies the scan function assigned to the portion image 102 b and the screen display function assigned to the portion image 130 b .
- This connecting operation may also specify the connecting order of the devices.
- the connecting order of the portion images corresponds to the connecting order of the devices.
- the user connects the portion image 102 b to the portion image 130 b , and thus, the multifunction device (B) is connected to the PC (A).
- the scan function and the screen display function are specified.
- Information indicating the connecting order of the devices and information indicating the positions within the device images specified by the user are sent from the terminal device 14 to the robot device 10 .
- the identifying unit 64 of the robot device 10 refers to the collaborative function management table shown in FIG. 11 and identifies collaborative functions to be implemented by combining the PC (A) and the multifunction device (B).
- the identifying unit 64 also refers to the device function management table shown in FIG. 40 , and identifies the functions assigned to the positions specified by the user.
- the identifying unit 64 then adjusts the priority levels of the collaborative functions implemented by combining the PC (A) and the multifunction device (B). More specifically, the identifying unit 64 raises the priority level of the collaborative function using the functions assigned to the positions specified by the user and lowers the priority levels of collaborative functions which do not use these functions.
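- this priority adjustment can be sketched as a re-ranking: candidates that use every function assigned to the user-specified positions are moved ahead of those that do not. The candidate names and structure below are illustrative assumptions:

```python
# Hypothetical sketch of the priority adjustment performed by the identifying
# unit 64: collaborative functions using the user-specified functions are
# ranked first; the others keep their relative order behind them.
def rank_candidates(candidates, specified_functions):
    """Stable sort: candidates using all specified functions come first."""
    specified = set(specified_functions)
    return sorted(candidates,
                  key=lambda c: 0 if specified <= set(c["uses"]) else 1)

candidates = [
    {"name": "scan transfer storage function",
     "uses": ["scan function", "data storage function"]},
    {"name": "scan transfer display function",
     "uses": ["scan function", "screen display function"]},
]
ranked = rank_candidates(candidates, ["scan function", "screen display function"])
print([c["name"] for c in ranked])
```

With the scan function and the screen display function specified, the scan transfer display function is ranked above the scan transfer storage function, matching the FIG. 41B example.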
- Information concerning the collaborative functions and information concerning the priority levels are sent from the robot device 10 to the terminal device 14 .
- the controller 74 of the terminal device 14 causes information concerning the collaborative functions to be displayed on the UI 72 as information concerning collaborative function candidates in accordance with the priority levels.
- the controller 74 causes information concerning the collaborative function candidates to be displayed on the UI 72 .
- the user has specified the scan function and the screen display function in this order. Consequently, information concerning a collaborative function “scan transfer display function” to be executed by combining the scan function and the screen display function is displayed preferentially over information concerning the other collaborative function.
- the information concerning “scan transfer display function” is displayed above the other items of information. For example, the information concerning “scan transfer display function” is displayed preferentially over information concerning a collaborative function “scan transfer storage function” executed by combining the scan function and the data storage function.
- the scan transfer display function is the function of transferring scanned data generated by the multifunction device (B) to the PC (A) and displaying the scanned data on the screen of the PC (A).
- the scan transfer storage function is the function of transferring scanned data generated by the multifunction device (B) to the PC (A) and storing the scanned data in the PC (A).
- a description of each collaborative function is displayed as information concerning the collaborative function.
- the above-described collaboration processing using portion images enables a user to individually specify the functions assigned to portions of collaborative devices and makes it possible to preferentially display information concerning a collaborative function implemented by the functions specified by the user. As a result, a collaborative function which is likely to be used by the user is preferentially displayed.
- a collaborative function may be a function using a combination of portions of the same device, a function using a combination of portions of different devices, a function using a combination of the entirety of a device and a portion of another device, or a function using a combination of the entirety of a device and the entirety of another device.
- the above-described collaboration processing using portion images may be applicable to the use of function images linked with functions. For example, different functions are assigned to different positions within a function image, and a collaborative function implemented by using the functions assigned to the positions specified by the user is identified.
- FIG. 42 shows an example of the device function management table.
- Data in the device function management table is stored in the robot device 10 as the device function management information 40 .
- the device ID, information indicating the device name (device type, for example), information indicating the name of a portion of a device (type of portion, for example), the portion ID serving as portion identification information for identifying this portion, information indicating the function assigned to this portion (function of this portion), and the portion image ID for identifying a portion image linked with this portion are associated with each other.
- the portion image is an image representing the external appearance of a portion of a device captured by a camera.
- the portion image may alternatively be a portion image schematically representing a portion of a device. Different functions are assigned to different portions of a device.
- a screen display function is assigned to the display of the PC (A), and information concerning the screen display function is associated with the portion image ID of a portion image linked with the display.
- the screen display function is the function of displaying information on the PC (A).
- a data storage function is assigned to the body of the PC (A), and information concerning the data storage function is associated with the portion image ID of a portion image linked with the body.
- the data storage function is the function of storing data in the PC (A).
- a print function is assigned to the body of the multifunction device (B), and information concerning the print function is associated with the portion image ID of a portion image linked with the body.
- a scan function is assigned to a reader (a portion corresponding to a document cover, document glass, and an automatic document feeder) of the multifunction device (B), and information concerning the scan function is associated with the portion image ID of a portion image linked with the reader.
- a stapling function is assigned to a post-processor of the multifunction device (B), and information concerning the stapling function is associated with the portion image ID of a portion image linked with the post-processor.
- the stapling function is the function of stapling sheets output from the multifunction device (B).
- the function assigned to a portion of a device may be specified (identified) by using the markerless AR technology. If image data is generated as a result of a camera (a visual sensor of the robot device 10 , for example) capturing an image of a portion of a device, the identifying unit 64 of the robot device 10 refers to the device function management table and specifies (identifies) a function associated with this image data. This operation makes it possible to specify (identify) the function associated with the portion of the device. For example, if image data is generated as a result of the visual sensor capturing an image of the body of the multifunction device (B), the identifying unit 64 refers to the device function management table and specifies (identifies) the print function associated with this image data. It is thus possible to specify that the function assigned to the body of the multifunction device (B) is the print function.
- the function assigned to a portion of a device may be specified (identified) by using the marker-based AR technology.
- a marker obtained by coding portion identification information (a portion ID, for example), such as a two-dimensional barcode, is provided on each portion of a device. An image of the marker is captured by the visual sensor and is processed by using the marker-based AR technology so as to obtain the portion identification information (portion ID) indicating this portion.
- the identifying unit 64 of the robot device 10 then refers to the device function management table and specifies (identifies) the function associated with the portion identification information.
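- once the marker has been decoded into a portion ID, the remaining step is a lookup in the device function management table (FIG. 42). The portion IDs below are illustrative assumptions; the decode step itself is outside this sketch:

```python
# Hypothetical sketch of the marker-based identification path: a decoded
# portion ID is mapped to the function assigned to that portion (FIG. 42).
PORTION_FUNCTIONS = {
    "b01": "print function",     # body of the multifunction device (B)
    "b02": "scan function",      # reader of the multifunction device (B)
    "b03": "stapling function",  # post-processor of the multifunction device (B)
}

def identify_function(decoded_portion_id):
    """Look up the function assigned to the portion identified by the marker."""
    return PORTION_FUNCTIONS.get(decoded_portion_id)

print(identify_function("b02"))  # the reader's marker resolves to the scan function
```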
- FIG. 43 shows an example of the collaborative function management table.
- Data in the collaborative function management table is stored in the robot device 10 as the collaborative function management information 42 .
- the collaborative function management table indicates collaborative functions which are each executable by using functions of plural portions.
- information indicating a combination of portions of devices, information indicating a combination of portion IDs, and information indicating a collaborative function executable by using plural functions of the combination of the portions of the devices are associated with each other.
- information indicating a combination of a portion of a device and the entirety of another device and information indicating a collaborative function executable by using the function of this portion of the device and the function of the entirety of the device may be associated with each other.
- the collaborative function management table will be explained more specifically.
- a print function is assigned to a combination of the display of the PC (A) and the body of the multifunction device (B).
- Information indicating the print function as a collaborative function is associated with information indicating a combination of the portion ID of the display of the PC (A) and the portion ID of the body of the multifunction device (B).
- the print function as a collaborative function is the function of sending data stored in the PC (A) to the multifunction device (B) and printing the data in the multifunction device (B).
- a print function is assigned to a combination of the body of the multifunction device (B) and the body of the projector (C).
- Information indicating the print function as a collaborative function is associated with information indicating a combination of the portion ID of the body of the multifunction device (B) and the portion ID of the body of the projector (C).
- the print function as a collaborative function is the function of sending data projected by the projector (C) to the multifunction device (B) and printing the data in the multifunction device (B).
- a scan projecting function is assigned to a combination of the reader of the multifunction device (B) and the body of the projector (C).
- Information indicating the scan projecting function as a collaborative function is associated with information indicating a combination of the portion ID of the reader of the multifunction device (B) and the portion ID of the body of the projector (C).
- the scan projecting function as a collaborative function is the function of sending scanned data generated by the multifunction device (B) to the projector (C) and projecting the data by the projector (C).
- a collaborative function may be a function using functions of plural portions of the same device or a function using functions of plural portions of different devices.
- a collaborative function may be a function using functions of three or more portions.
- the identifying unit 64 of the robot device 10 refers to the collaborative function management table and specifies (identifies) a collaborative function associated with a combination of the identified plural portions. This operation makes it possible to specify (identify) a collaborative function using functions of plural portions identified by capturing images thereof, for example. If the body of the multifunction device (B) and the body of the projector (C) are identified, the robot device 10 refers to the collaborative function management table and specifies the print function as a collaborative function associated with a combination of the body of the multifunction device (B) and the body of the projector (C).
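- this combination lookup can be sketched with an unordered key over the identified portion IDs, since the FIG. 43 table is keyed by the combination of portions rather than their order. The portion IDs below are assumptions for illustration:

```python
# Hypothetical sketch of the collaborative function management table (FIG. 43),
# keyed by an unordered combination of portion IDs (hence frozenset keys).
COLLABORATIVE_BY_PORTIONS = {
    frozenset({"a02", "b01"}): "print function",           # display of (A) + body of (B)
    frozenset({"b01", "c01"}): "print function",           # body of (B) + body of (C)
    frozenset({"b02", "c01"}): "scan projecting function", # reader of (B) + body of (C)
}

def collaborative_function(portion_ids):
    """Return the collaborative function assigned to the combination of portions."""
    return COLLABORATIVE_BY_PORTIONS.get(frozenset(portion_ids))

print(collaborative_function(["b02", "c01"]))
```

Identifying the reader of the multifunction device (B) and the body of the projector (C), in either order, resolves to the scan projecting function.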
- FIGS. 44A through 45B illustrate examples of screens displayed on the UI 72 of the terminal device 14 .
- the multifunction device (B) and the PC (A), for example, are identified.
- the device images 102 and 130 associated with the identified devices are displayed as situation information on the UI 72 of the terminal device 14 , as shown in FIG. 44A .
- by using an indicator (a user's finger, a pen, or a stylus, for example), the user superposes one device image (a first image) on another device image (an associating second image).
- the user specifies the device image 102 with the indicator and superposes the device image 102 on the device image 130 , as indicated by an arrow 138 .
- the user superposes the device images 102 and 130 by performing a drag-and-drop operation, for example. That is, the user drags the device image 102 and drops it on the device image 130 .
- the drag-and-drop operation is a known operation.
- the user may provide a voice instruction to specify device images to be superposed on each other.
- the device images 102 and 130 may be specified and superposed on each other according to a voice instruction provided from the user.
- the multifunction device (B) linked with the device image 102 and the PC (A) linked with the device image 130 are specified as collaborative devices.
- While a device image is being dragged, it may be displayed on the UI 72 in an identifiable manner, for example, translucently or in a specific color.
- a checking screen 140 is displayed on the UI 72 of the terminal device 14 , as shown in FIG. 44C .
- the checking screen 140 is a screen for checking whether the user wishes to combine the devices specified as collaborative devices. If the user provides a collaboration instruction (if the user presses a “YES” button) on the checking screen 140 , information concerning the collaborative functions is displayed on the UI 72 .
- the controller 74 of the terminal device 14 causes information concerning collaborative function candidates to be displayed on the UI 72 .
- the scan transfer function and the print function are implemented. Consequently, information concerning the scan transfer function and information concerning the print function are displayed on the UI 72 .
- a connection request is sent from the terminal device 14 to the collaborative devices.
- a standby screen is displayed on the UI 72 of the terminal device 14 while the connection request is being sent. After the connection between the terminal device 14 and the collaborative devices has been successfully established, the specified collaborative function is executed.
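- the drag-and-drop superposition described above can be sketched as a rectangle-overlap test: when the dragged device image's rectangle overlaps another device image's rectangle, the two linked devices are specified as collaborative devices. The rectangles and coordinates below are assumptions for illustration:

```python
# Hypothetical sketch of detecting the superposition of device images.
# Rectangles are (x, y, width, height) in screen coordinates.
def overlaps(a, b):
    """Return True if the two rectangles intersect."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

DEVICE_IMAGE_RECTS = {
    "multifunction device (B)": (10, 10, 80, 60),  # device image 102 (after drag)
    "PC (A)": (50, 30, 80, 60),                    # device image 130
}

dragged = "multifunction device (B)"
targets = [name for name, rect in DEVICE_IMAGE_RECTS.items()
           if name != dragged and overlaps(DEVICE_IMAGE_RECTS[dragged], rect)]
print(targets)  # the PC (A) is specified as the collaborative device
```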
- a collaborative function may be specified by superposing a portion image on a device image or on a portion image. This processing will be described below with reference to FIGS. 46A and 46B .
- FIGS. 46A and 46B illustrate examples of screens displayed on the UI 72 of the terminal device 14 .
- the multifunction device (B) and the PC (A), for example, are identified.
- the device images 102 and 130 are displayed as situation information on the UI 72 of the terminal device 14 , as shown in FIG. 46A .
- Portion images 102 a , 102 b , 102 c , 130 a , and 130 b are displayed as images that can be separated from the other portion images and moved individually.
- the multifunction device (B) linked with the device image 102 including the portion image 102 b and the PC (A) linked with the device image 130 including the portion image 130 b are specified as collaborative devices.
- the scan function assigned to the portion image 102 b and the screen display function assigned to the portion image 130 b are also specified as a collaborative function.
- Functions assigned to portion images are managed in the robot device 10 .
- identification information for identifying portion images, function information indicating functions assigned to the portion images, and collaborative function information indicating collaborative functions to be executed by combining functions are stored in the robot device 10 in association with each other.
- identification information indicating these superposed portion images is sent from the terminal device 14 to the robot device 10 .
- identification information indicating the portion image 102 b and that indicating the portion image 130 b are sent from the terminal device 14 to the robot device 10 .
- the identifying unit 64 of the robot device 10 specifies the functions assigned to the portion images 102 b and 130 b based on the identification information so as to specify a collaborative function using these functions. Information concerning the collaborative function is sent from the robot device 10 to the terminal device 14 and is displayed.
- the above-described processing enables a user to individually specify functions of collaborative devices and makes it possible to preferentially display information concerning a collaborative function using the functions specified by the user. As a result, a collaborative function which is likely to be used by the user is preferentially displayed.
- the priority levels of displaying of collaborative functions may be changed in accordance with the order of superposing portion images.
- information concerning a collaborative function using the functions linked with superposed portion images is preferentially displayed.
- Each of the robot device 10 and the terminal device 14 may be implemented as a result of software and hardware operating together. More specifically, each of the robot device 10 and the terminal device 14 includes one or plural processors, such as a central processing unit (CPU), which is not shown. As a result of this processor or these processors reading and executing a program stored in a storage device, which is not shown, the functions of the robot device 10 and the terminal device 14 are implemented.
- This program is stored in a storage device by using a recording medium, such as a compact disc (CD) or a digital versatile disc (DVD), or via a communication path, such as a network.
- the functions of the robot device 10 and the terminal device 14 may be implemented by using hardware resources, such as a processor, an electronic circuit, and an application specific integrated circuit (ASIC).
- the functions of the robot device 10 and the terminal device 14 may be implemented by using a digital signal processor (DSP) or a field programmable gate array (FPGA).
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
JP2016-205160 | 2016-10-19 | |
JPJP2016-205160 | 2016-10-19 | |
JP2016205160A JP6179653B1 (en) | 2016-10-19 | 2016-10-19 | Information processing apparatus and program
JP2017002408A JP6439806B2 (en) | 2017-01-11 | 2017-01-11 | Robot apparatus and program
JPJP2017-002408 | 2017-01-11 | |
JP2017-002408 | 2017-01-11 | |
Publications (2)
Publication Number | Publication Date
---|---
US20180104816A1 (en) | 2018-04-19
US10987804B2 (en) | 2021-04-27
Family
ID=61902968
Family Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
US15/642,665 Active 2038-01-30 US10987804B2 (en) | 2016-10-19 | 2017-07-06 | Robot device and non-transitory computer readable medium
Country Status (1)
Country | Link |
---|---
US (1) | US10987804B2 (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---
US10987804B2 (en) * | 2016-10-19 | 2021-04-27 | Fuji Xerox Co., Ltd. | Robot device and non-transitory computer readable medium |
JP6447689B1 (en) | 2017-09-11 | 2019-01-09 | 富士ゼロックス株式会社 | Information processing apparatus and program |
US11240180B2 (en) * | 2018-03-20 | 2022-02-01 | Fujifilm Business Innovation Corp. | Message providing device and non-transitory computer readable medium |
JP7099092B2 (en) * | 2018-07-03 | 2022-07-12 | 富士フイルムビジネスイノベーション株式会社 | Information processing equipment and programs |
JP7187922B2 (en) * | 2018-09-25 | 2022-12-13 | 富士フイルムビジネスイノベーション株式会社 | Information processing device and program |
US20210339401A1 (en) * | 2018-10-03 | 2021-11-04 | Sony Group Corporation | Mobile unit control device, mobile unit control method, and program |
WO2020164734A1 (en) * | 2019-02-15 | 2020-08-20 | Telefonaktiebolaget Lm Ericsson (Publ) | Technique for controlling wireless command transmission to a robotic device |
US11119713B2 (en) * | 2019-10-29 | 2021-09-14 | Kyocera Document Solutions Inc. | Systems, processes, and computer program products for delivery of printed paper by robot |
US11584004B2 (en) * | 2019-12-17 | 2023-02-21 | X Development Llc | Autonomous object learning by robots triggered by remote operators |
US11882129B2 (en) * | 2020-07-15 | 2024-01-23 | Fenix Group, Inc. | Self-contained robotic units for providing mobile network services and intelligent perimeter |
US20220244683A1 (en) * | 2021-02-02 | 2022-08-04 | Kyocera Document Solutions Inc. | Integration of printing device to a smart space |
Citations (96)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---
US5084826A (en) * | 1989-07-27 | 1992-01-28 | Nachi-Fujikoshi Corp. | Industrial robot system |
JPH0565766A (en) | 1991-09-06 | 1993-03-19 | Chiyoda Corp | Construction robot system |
US5963712A (en) * | 1996-07-08 | 1999-10-05 | Sony Corporation | Selectively configurable robot apparatus |
US6266577B1 (en) * | 1998-07-13 | 2001-07-24 | Gte Internetworking Incorporated | System for dynamically reconfigure wireless robot network |
US6324443B1 (en) * | 1998-02-26 | 2001-11-27 | Fanuc Ltd. | Robot control apparatus |
US6364026B1 (en) * | 1998-04-01 | 2002-04-02 | Irving Doshay | Robotic fire protection system |
US6411055B1 (en) * | 1997-11-30 | 2002-06-25 | Sony Corporation | Robot system |
US6584375B2 (en) * | 2001-05-04 | 2003-06-24 | Intellibot, Llc | System for a retail environment |
JP2003291083A (en) | 2002-03-28 | 2003-10-14 | Toshiba Corp | Robot device, robot controlling method, and robot delivery system |
- 2017-07-06: US application US15/642,665 filed; granted as US10987804B2 (status: Active)
Patent Citations (108)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5084826A (en) * | 1989-07-27 | 1992-01-28 | Nachi-Fujikoshi Corp. | Industrial robot system |
JPH0565766A (en) | 1991-09-06 | 1993-03-19 | Chiyoda Corp | Construction robot system |
US5963712A (en) * | 1996-07-08 | 1999-10-05 | Sony Corporation | Selectively configurable robot apparatus |
US6411055B1 (en) * | 1997-11-30 | 2002-06-25 | Sony Corporation | Robot system |
US6324443B1 (en) * | 1998-02-26 | 2001-11-27 | Fanuc Ltd. | Robot control apparatus |
US6364026B1 (en) * | 1998-04-01 | 2002-04-02 | Irving Doshay | Robotic fire protection system |
US6266577B1 (en) * | 1998-07-13 | 2001-07-24 | Gte Internetworking Incorporated | System for dynamically reconfigure wireless robot network |
US6584375B2 (en) * | 2001-05-04 | 2003-06-24 | Intellibot, Llc | System for a retail environment |
US6636781B1 (en) * | 2001-05-22 | 2003-10-21 | University Of Southern California | Distributed control and coordination of autonomous agents in a dynamic, reconfigurable system |
JP2003291083A (en) | 2002-03-28 | 2003-10-14 | Toshiba Corp | Robot device, robot controlling method, and robot delivery system |
US20040024490A1 (en) * | 2002-04-16 | 2004-02-05 | Mclurkin James | System amd methods for adaptive control of robotic devices |
US20070179669A1 (en) * | 2002-04-16 | 2007-08-02 | Mclurkin James | System and methods for adaptive control of robotic devices |
US20060079997A1 (en) * | 2002-04-16 | 2006-04-13 | Mclurkin James | Systems and methods for dispersing and clustering a plurality of robotic devices |
US20040162638A1 (en) * | 2002-08-21 | 2004-08-19 | Neal Solomon | System, method and apparatus for organizing groups of self-configurable mobile robotic agents in a multi-robotic system |
US8112176B2 (en) * | 2002-08-21 | 2012-02-07 | Neal Solomon | System for self-organizing mobile robotic collectives |
US20060112034A1 (en) * | 2003-06-02 | 2006-05-25 | Matsushita Electric Industrial Co., Ltd. | Article handling system and method and article management system and method |
US20060195226A1 (en) * | 2003-08-07 | 2006-08-31 | Matsushita Electric Industrial Co., Ltd. | Mobile robot system and program for controlling the same |
US7174238B1 (en) * | 2003-09-02 | 2007-02-06 | Stephen Eliot Zweig | Mobile robotic system with web server and digital radio links |
US7467026B2 (en) * | 2003-09-22 | 2008-12-16 | Honda Motor Co. Ltd. | Autonomously moving robot management system |
US20050065652A1 (en) * | 2003-09-22 | 2005-03-24 | Honda Motor Co., Ltd. | Autonomously moving robot management system |
US20050113974A1 (en) * | 2003-09-30 | 2005-05-26 | Kabushiki Kaisha Toshiba | Cooperative robot system and navigation robot system |
JP2005111637A (en) | 2003-10-10 | 2005-04-28 | Ntt Data Corp | Network robot service system |
US7096090B1 (en) * | 2003-11-03 | 2006-08-22 | Stephen Eliot Zweig | Mobile robotic router with web server and digital radio links |
US7304581B2 (en) * | 2004-08-31 | 2007-12-04 | Kabushiki Kaisha Toshiba | Mobile information apparatus and moving method therefor, and information system and position estimation method |
US20080161970A1 (en) * | 2004-10-19 | 2008-07-03 | Yuji Adachi | Robot apparatus |
US7539558B2 (en) * | 2004-10-19 | 2009-05-26 | Panasonic Corporation | Robot apparatus |
US8200700B2 (en) * | 2005-02-01 | 2012-06-12 | Newsilike Media Group, Inc | Systems and methods for use of structured and unstructured distributed data |
US20070061487A1 (en) * | 2005-02-01 | 2007-03-15 | Moore James F | Systems and methods for use of structured and unstructured distributed data |
US7236861B2 (en) * | 2005-02-16 | 2007-06-26 | Lockheed Martin Corporation | Mission planning system with asynchronous request capability |
US20070031217A1 (en) * | 2005-05-31 | 2007-02-08 | Anil Sharma | Track Spiders Robotic System |
US20070021867A1 (en) * | 2005-07-22 | 2007-01-25 | Lg Electronics Inc. | Home networking system using self-moving robot |
US20070061040A1 (en) * | 2005-09-02 | 2007-03-15 | Home Robots, Inc. | Multi-function robotic device |
US20070061043A1 (en) * | 2005-09-02 | 2007-03-15 | Vladimir Ermakov | Localization and mapping system and method for a robotic device |
US7860614B1 (en) * | 2005-09-13 | 2010-12-28 | The United States Of America As Represented By The Secretary Of The Army | Trainer for robotic vehicle |
US20070192910A1 (en) * | 2005-09-30 | 2007-08-16 | Clara Vu | Companion robot for personal interaction |
US8138868B2 (en) * | 2005-11-28 | 2012-03-20 | University Of Florida Research Foundation, Inc. | Method and structure for magnetically-directed, self-assembly of three-dimensional structures |
US7912633B1 (en) * | 2005-12-01 | 2011-03-22 | Adept Mobilerobots Llc | Mobile autonomous updating of GIS maps |
US20070250212A1 (en) * | 2005-12-02 | 2007-10-25 | Halloran Michael J | Robot system |
US20070150098A1 (en) * | 2005-12-09 | 2007-06-28 | Min Su Jang | Apparatus for controlling robot and method thereof |
US20070208442A1 (en) * | 2006-02-27 | 2007-09-06 | Perrone Paul J | General purpose robotics operating system |
US20070271002A1 (en) * | 2006-05-22 | 2007-11-22 | Hoskinson Reed L | Systems and methods for the autonomous control, automated guidance, and global coordination of moving process machinery |
US20080009968A1 (en) * | 2006-07-05 | 2008-01-10 | Battelle Energy Alliance, Llc | Generic robot architecture |
US20080009969A1 (en) * | 2006-07-05 | 2008-01-10 | Battelle Energy Alliance, Llc | Multi-Robot Control Interface |
US20080009970A1 (en) * | 2006-07-05 | 2008-01-10 | Battelle Energy Alliance, Llc | Robotic Guarded Motion System and Method |
US20080009967A1 (en) * | 2006-07-05 | 2008-01-10 | Battelle Energy Alliance, Llc | Robotic Intelligence Kernel |
US7211980B1 (en) * | 2006-07-05 | 2007-05-01 | Battelle Energy Alliance, Llc | Robotic follow system and method |
US8983883B2 (en) * | 2006-08-17 | 2015-03-17 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Autonomic and apoptotic, aeronautical and aerospace systems, and controlling scientific data generated therefrom |
US20080269948A1 (en) * | 2006-11-13 | 2008-10-30 | Solomon Research Llc | Hybrid control system for collectives of evolvable nanorobots and microrobots |
US20120150345A1 (en) * | 2007-01-12 | 2012-06-14 | Hansjorg Baltes | Method and system for robot generation |
US9671786B2 (en) * | 2007-01-12 | 2017-06-06 | White Magic Robotics Inc. | Method and system for robot generation |
US20090306823A1 (en) * | 2007-01-12 | 2009-12-10 | Hansjorg Baltes | Method and System for Robot Generation |
US20090234788A1 (en) * | 2007-03-31 | 2009-09-17 | Mitchell Kwok | Practical Time Machine Using Dynamic Efficient Virtual And Real Robots |
US7966093B2 (en) * | 2007-04-17 | 2011-06-21 | Yefim Zhuk | Adaptive mobile robot system with knowledge-driven architecture |
US20090015404A1 (en) * | 2007-07-13 | 2009-01-15 | Industrial Technology Research Institute | Method for coordinating cooperative robots |
US8108071B2 (en) * | 2007-07-13 | 2012-01-31 | Industrial Technology Research Institute | Method for coordinating cooperative robots |
US8131839B2 (en) * | 2007-08-01 | 2012-03-06 | Motorola Solutions, Inc. | Method and apparatus for resource assignment in a sensor network |
US20090248200A1 (en) * | 2007-10-22 | 2009-10-01 | North End Technologies | Method & apparatus for remotely operating a robotic device linked to a communications network |
US20130218346A1 (en) * | 2007-10-22 | 2013-08-22 | Timothy D. Root | Method & apparatus for remotely operating a robotic device linked to a communications network |
US8160746B2 (en) * | 2007-12-04 | 2012-04-17 | Industrial Technology Research Institute | System and method for graphically allocating robot's working space |
JP2010244222A (en) | 2009-04-03 | 2010-10-28 | Mekiki:Kk | Cooperative purchasing system and method using sns |
US20110054679A1 (en) * | 2009-08-27 | 2011-03-03 | Sakura Finetek U.S.A., Inc. | Integrated tissue processing and embedding systems, and methods thereof |
US8355818B2 (en) * | 2009-09-03 | 2013-01-15 | Battelle Energy Alliance, Llc | Robots, systems, and methods for hazard evaluation and visualization |
US20110135189A1 (en) * | 2009-12-09 | 2011-06-09 | Electronics And Telecommunications Research Institute | Swarm intelligence-based mobile robot, method for controlling the same, and surveillance robot system |
US8586410B2 (en) * | 2010-01-25 | 2013-11-19 | University Of Florida Research Foundation, Inc. | Enhanced magnetic self-assembly using integrated micromagnets |
US20120185094A1 (en) * | 2010-05-20 | 2012-07-19 | Irobot Corporation | Mobile Human Interface Robot |
US20130218339A1 (en) * | 2010-07-23 | 2013-08-22 | Aldebaran Robotics | "humanoid robot equipped with a natural dialogue interface, method for controlling the robot and corresponding program" |
US20130217421A1 (en) * | 2010-10-27 | 2013-08-22 | Kt Corporation | System, method and robot terminal apparatus for providing robot interaction service using location information of mobile communication terminal |
US9020636B2 (en) * | 2010-12-16 | 2015-04-28 | Saied Tadayon | Robot for solar farms |
US20120165984A1 (en) * | 2010-12-23 | 2012-06-28 | Electronics And Telecommunications Research Institute | Mobile robot apparatus, door control apparatus, and door opening and closing method therefor |
US20130054029A1 (en) * | 2011-04-15 | 2013-02-28 | Irobot | Auto-reach method for a remote vehicle |
US8984136B1 (en) * | 2011-05-06 | 2015-03-17 | Google Inc. | Systems and methods for object recognition |
US9026248B1 (en) * | 2011-05-06 | 2015-05-05 | Google Inc. | Methods and systems for multirobotic management |
US20140136302A1 (en) * | 2011-05-25 | 2014-05-15 | Se Kyong Song | System and method for operating a smart service robot |
US20150127155A1 (en) * | 2011-06-02 | 2015-05-07 | Brain Corporation | Apparatus and methods for operating robotic devices using selective state space training |
US9043012B2 (en) * | 2011-08-29 | 2015-05-26 | Neil S. Davey | Pharmacy automation using autonomous robot |
US9004200B2 (en) * | 2011-09-09 | 2015-04-14 | Pinhas Ben-Tzvi | Mobile robot with hybrid traction and mobility mechanism |
US8307061B1 (en) * | 2011-10-27 | 2012-11-06 | Google Inc. | System and method for determining manufacturer instructions executable by a robotic device |
US8982217B1 (en) * | 2012-01-31 | 2015-03-17 | Google Inc. | Determining states and modifying environments according to states |
US20130261796A1 (en) * | 2012-04-03 | 2013-10-03 | Knu-Industry Cooperation Foundation | Intelligent robot apparatus responsive to environmental change and method of controlling and reconfiguring intelligent robot apparatus |
US20150088310A1 (en) * | 2012-05-22 | 2015-03-26 | Intouch Technologies, Inc. | Social behavior rules for a medical telepresence robot |
US20130345876A1 (en) * | 2012-06-20 | 2013-12-26 | Irobot Corporation | Suspended robot systems and methods for using same |
US8958912B2 (en) * | 2012-06-21 | 2015-02-17 | Rethink Robotics, Inc. | Training and operating industrial robots |
US9222205B2 (en) * | 2013-03-15 | 2015-12-29 | A&P Technology, Inc. | Rapidly configurable braiding machine |
JP2014188597A (en) | 2013-03-26 | 2014-10-06 | Advanced Telecommunication Research Institute International | Robot service cooperation system and platform |
US20160046025A1 (en) | 2013-03-26 | 2016-02-18 | Advanced Telecommunications Research Institute International | Robot service cooperation system, platform and method |
US20140336818A1 (en) * | 2013-05-10 | 2014-11-13 | Cnh Industrial America Llc | Control architecture for multi-robot system |
US9720414B1 (en) * | 2013-07-29 | 2017-08-01 | Vecna Technologies, Inc. | Autonomous vehicle providing services at a transportation terminal |
US20150190925A1 (en) * | 2014-01-07 | 2015-07-09 | Irobot Corporation | Remotely Operating a Mobile Robot |
US9358685B2 (en) * | 2014-02-03 | 2016-06-07 | Brain Corporation | Apparatus and methods for control of robot actions based on corrective user inputs |
US20150217449A1 (en) * | 2014-02-03 | 2015-08-06 | Brain Corporation | Apparatus and methods for control of robot actions based on corrective user inputs |
US9330286B2 (en) * | 2014-07-31 | 2016-05-03 | Accenture Global Services Limited | Test automation for automated fare management systems |
US20160059412A1 (en) * | 2014-09-02 | 2016-03-03 | Mark Oleynik | Robotic manipulation methods and systems for executing a domain-specific application in an instrumented environment with electronic minimanipulation libraries |
US20160082298A1 (en) * | 2014-09-19 | 2016-03-24 | William Kelly Dagenhart | Forest Fire Control System |
US20160114488A1 (en) * | 2014-10-24 | 2016-04-28 | Fellow Robots, Inc. | Customer service robot and related systems and methods |
US20160325127A1 (en) * | 2014-11-07 | 2016-11-10 | Kenneth William Billman | Laser System Module-Equipped Firefighting Aircraft |
US9656806B2 (en) * | 2015-02-13 | 2017-05-23 | Amazon Technologies, Inc. | Modular, multi-function smart storage containers |
US20160250752A1 (en) * | 2015-02-27 | 2016-09-01 | Toyota Motor Engineering & Manufacturing North America, Inc. | Modular robot with smart device |
US9486921B1 (en) * | 2015-03-26 | 2016-11-08 | Google Inc. | Methods and systems for distributing remote assistance to facilitate robotic object manipulation |
US20170282375A1 (en) * | 2015-08-31 | 2017-10-05 | Avaya Inc. | Operational parameters |
US20170286916A1 (en) * | 2015-08-31 | 2017-10-05 | Avaya Inc. | Communication systems for multi-source robot control |
US20170286651A1 (en) * | 2015-08-31 | 2017-10-05 | Avaya Inc. | Authentication |
US20170088205A1 (en) * | 2015-09-25 | 2017-03-30 | California Institute Of Technology | Puffer: pop-up flat folding explorer robot |
US20170128759A1 (en) * | 2015-11-05 | 2017-05-11 | Lockheed Martin Corporation | Methods and systems of applying fire retardant based on onboard sensing and decision making processes |
US20170282362A1 (en) * | 2016-03-31 | 2017-10-05 | Avaya Inc. | Command and control of a user-provided robot by a contact center |
US20180104816A1 (en) * | 2016-10-19 | 2018-04-19 | Fuji Xerox Co., Ltd. | Robot device and non-transitory computer readable medium |
US20190248016A1 (en) * | 2017-02-06 | 2019-08-15 | Cobalt Robotics Inc. | Mobile robot with arm for door interactions |
US20190242916A1 (en) * | 2018-02-02 | 2019-08-08 | HighRes Biosolutions, Inc. | Auto-navigating robotic processing vehicle |
US20190369641A1 (en) * | 2018-05-31 | 2019-12-05 | Carla R. Gillett | Robot and drone array |
Non-Patent Citations (3)
Title |
---|
Jan. 28, 2020 Notice of Reasons for Refusal issued in Japanese Patent Application No. 2018-208458. |
Jan. 28, 2020 Notice of Reasons for Refusal issued in Japanese Patent Application No. 2018-208618. |
May 29, 2018 Office Action issued in Japanese Patent Application No. 2017-002408. |
Also Published As
Publication number | Publication date |
---|---|
US20180104816A1 (en) | 2018-04-19 |
Similar Documents
Publication | Title |
---|---|
US10987804B2 (en) | Robot device and non-transitory computer readable medium |
US11059179B2 (en) | Robot device and non-transitory computer readable medium |
CN108297092B (en) | Robot apparatus and control method thereof |
US10495878B2 (en) | Mobile terminal and controlling method thereof |
EP2945043B1 (en) | Eyewear-type terminal and method of controlling the same |
US10120992B2 (en) | Mobile terminal and method for controlling the same |
CN112243510A (en) | Implementation of biometric authentication |
US10593322B2 (en) | Electronic device and method for controlling the same |
US9317113B1 (en) | Gaze assisted object recognition |
US20160260086A1 (en) | Mobile terminal and method for controlling the same |
EP3182265B1 (en) | Mobile terminal and method for controlling the same |
US10349245B2 (en) | Information processing apparatus and non-transitory computer readable medium for communicating with a robot |
US20170243054A1 (en) | Mobile terminal and control method thereof |
US11860991B2 (en) | Information processing apparatus and non-transitory computer readable medium |
US11153427B2 (en) | Mobile terminal and method for controlling the same |
JP6721023B2 (en) | Information processing device and program |
JP2019144825A (en) | Information processing device and program |
JP6721024B2 (en) | Information processing device and program |
US11095784B2 (en) | Information processing apparatus and non-transitory computer readable medium for setting function for entity in real space |
JP2019040468A (en) | Information processing apparatus and program |
US11900929B2 (en) | Electronic apparatus providing voice-based interface and method for controlling the same |
US10792819B2 (en) | Information processing apparatus and non-transitory computer readable medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: FUJI XEROX CO., LTD., JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: TOKUCHI, KENGO; REEL/FRAME: 042922/0649. Effective date: 2017-07-03 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: FUJIFILM BUSINESS INNOVATION CORP., JAPAN. Free format text: CHANGE OF NAME; ASSIGNOR: FUJI XEROX CO., LTD.; REEL/FRAME: 056078/0098. Effective date: 2021-04-01 |