JP6439806B2 - Robot apparatus and program - Google Patents

Robot apparatus and program

Info

Publication number
JP6439806B2
Authority
JP
Japan
Prior art keywords
device
function
robot apparatus
means
example
Prior art date
Legal status
Active
Application number
JP2017002408A
Other languages
Japanese (ja)
Other versions
JP2018111154A (en
Inventor
賢吾 得地
Original Assignee
富士ゼロックス株式会社
Application filed by 富士ゼロックス株式会社 filed Critical 富士ゼロックス株式会社
Priority to JP2017002408A
Priority claimed from US15/642,665 (US20180104816A1)
Publication of JP2018111154A
Application granted
Publication of JP6439806B2


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems

Description

  The present invention relates to a robot apparatus and a program.

  Patent Documents 1-3 disclose a robot apparatus that executes processing in cooperation with other devices.

Patent Document 1: JP 2014-188597 A
Patent Document 2: JP-A-5-65766
Patent Document 3: JP 2005-1111637 A

  In general, the means by which a robot apparatus solves a problem that has occurred are determined in advance. Consequently, there are cases in which the problem cannot be solved because the means cannot be changed flexibly according to the situation in which the robot apparatus is placed.

  An object of the present invention is to enable a robot apparatus to solve a problem in cooperation in accordance with a situation.

The invention according to claim 1 is a robot apparatus comprising: a situation collecting means for detecting a surrounding situation and a person existing around the own apparatus; a storage means for storing information indicating a threshold value, determined for each attribute of a person, for determining whether or not a problem of the situation has occurred; a detection means for determining that the problem has occurred when a value representing a detection result of the situation detected by the situation collecting means is equal to or greater than the threshold value; and a control means for performing control to execute a solution means that uses an element other than the own apparatus when the solution means for solving the problem cannot be executed with only the functions of the own apparatus, wherein the detection means further changes the threshold value according to the attribute of the person detected by the situation collecting means by referring to the information stored in the storage means.

  The invention according to claim 2 is the robot apparatus according to claim 1, wherein the element includes at least one of a device other than the own apparatus and a person.

  The invention according to claim 3 is a robot apparatus characterized in that, when no person is detected, the control means preferentially performs control for executing a solution means that does not use a person.

  The invention according to claim 4 is the robot apparatus according to claim 2 or 3, wherein the control means causes a device other than the own apparatus to execute the solution means.

  The invention according to claim 5 is the robot apparatus according to claim 4, wherein the control means causes a device other than the own apparatus to execute the solution means by communicating with the device other than the own apparatus and controlling that device.

  The invention according to claim 6 is the robot apparatus according to claim 4, wherein the control means causes a device other than the own apparatus to execute the solution means by controlling a direct operation on an operation unit of the device other than the own apparatus.

  The invention according to claim 7 is the robot apparatus according to claim 6, wherein the control means controls the direct operation when the device other than the own apparatus cannot be controlled by communication.

  The invention according to claim 8 is the robot apparatus according to claim 7, wherein the solution means is a means executed by collaborative work of at least two of the own apparatus, a device other than the own apparatus, and a person.

  The invention according to claim 9 is the robot apparatus according to claim 1, wherein the control means controls output of information indicating the solution means as the control for executing the solution means.

  The invention according to claim 10 is the robot apparatus according to claim 9, wherein the information indicating the solving means includes information indicating the situation.

  The invention according to claim 11 is the robot apparatus according to claim 9 or 10, wherein the control means controls output of information indicating the solution means to the terminal device.

  The invention according to a twelfth aspect is the robot apparatus according to the eleventh aspect, wherein the solving means is executed based on an instruction of a user who uses the terminal device.

  The invention according to claim 13 is the robot apparatus according to claim 12, wherein the instruction is an instruction to execute a function that can be executed by using a device existing around the own apparatus.

The invention according to claim 14 is the robot apparatus according to claim 13, further comprising an identifying means for identifying a device existing around the own apparatus based on at least one of position information of the device, a surrounding image, and a communication status.

  The invention according to claim 15 is the robot apparatus according to any one of claims 12 to 14, further comprising a transmission means for transmitting information indicating the instruction to a device existing around the own apparatus.

  The invention according to claim 16 is the robot apparatus according to any one of claims 9 to 15, wherein the control means controls display of information indicating the solution means.

  The invention according to claim 17 is the robot apparatus according to claim 16, wherein the control unit further controls display of a device image associated with a device capable of executing the solution unit.

  The invention according to claim 18 is the robot apparatus according to claim 17, characterized in that a function of using a device associated with a device image designated by a user is executed.

  The invention according to claim 19 is a robot apparatus wherein, when a plurality of device images are designated by the user, a cooperation function that uses the plurality of devices associated with the plurality of device images is executed.

The invention according to claim 20 is the robot apparatus according to any one of claims 1 to 19, wherein, when a problem of the situation based on a first detection result based on information about a person differs from a problem of the situation based on a second detection result based on information other than a person, a solution means for solving the problem of the situation is selected based on a predetermined priority of the first detection result or the second detection result.

The invention according to claim 21 is the robot apparatus according to claim 20, wherein the information about the person is information including an image representing the person.

The invention according to claim 22 is the robot apparatus according to claim 21, wherein the image is an image representing at least a part of a human body.

The invention according to claim 23 is the robot apparatus according to claim 22, wherein the image is an image representing a human face.

The invention according to claim 24 is the robot apparatus according to any one of claims 20 to 23, wherein the information about the person is information including a human voice.

The invention according to claim 25 is characterized in that, when the elements other than the own apparatus can be used for a fee, the control means controls a payment operation for using the elements other than the own apparatus. The robot apparatus according to any one of claims 1 to 24.

The invention according to claim 26 is the robot apparatus according to claim 25 , wherein the payment operation includes a payment operation using at least one of electronic currency and virtual currency.

The invention according to claim 27 is the robot apparatus according to claim 25 or 26, wherein, when the own apparatus does not have a payment capability, the control means performs the payment operation by receiving payment assistance from an element other than the own apparatus.

The invention according to claim 28 is the robot apparatus according to claim 27 , wherein the control means acquires money for payment from a person who can be involved in an action for solving the problem. .

The invention according to claim 29 is the robot apparatus according to any one of claims 1 to 28, further comprising a search means for searching for a solution means for solving the problem.

The invention according to claim 30 is the robot apparatus according to claim 29, wherein the search means searches for the solution means by using the Internet.

The invention according to claim 31 is the robot apparatus according to any one of claims 1 to 30, wherein, when the solution means cannot be determined by the own apparatus alone, the control means controls execution of a mode for asking the user.

The invention according to claim 32 is the robot apparatus according to claim 31, wherein, in the mode for asking the user, the control means controls further collection of information on the situation in response to a request from the user.

The invention according to claim 33 is the robot apparatus according to any one of claims 1 to 32, wherein a problem that the own apparatus can solve changes according to an update of the functions that the own apparatus has.

The invention according to claim 34 is the robot apparatus according to claim 33, wherein the update of the functions is performed by changing at least one of the hardware and the software that the own apparatus has.

The invention according to claim 35 is the robot apparatus according to any one of claims 1 to 34, wherein the control means repeats the control for executing the solution means until the problem is solved.

The invention according to claim 36 is the robot apparatus according to any one of claims 1 to 35, wherein, when the solution means is not executed, the control means performs control to execute another solution means.

The invention according to claim 37 is the robot apparatus according to any one of claims 1 to 36, further comprising a moving means for moving to another location.

The invention according to claim 38 is the robot apparatus according to claim 37, wherein the moving means has at least one of a function of moving on land, a function of flying, and a function of moving through water.

The invention according to claim 39 is the robot apparatus according to any one of claims 1 to 38, further comprising a communication means for communicating with a device other than the own apparatus while changing the communication scheme according to the surrounding environment.

The invention according to claim 40 is the robot apparatus according to claim 39, wherein the surrounding environment corresponds to at least one of a distance to a device existing in the surroundings and the presence or absence of an obstacle.

The invention according to claim 41 is the robot apparatus according to any one of claims 1 to 40, wherein, when the solution means is a means for urging a person to act to solve the problem, the control means controls notification of a procedure of actions for solving the problem.

The invention according to claim 42 is the robot apparatus according to any one of claims 1 to 41, wherein the problem that cannot be solved by the own apparatus is a problem that cannot be solved by the functions of the own apparatus alone, a problem for which the time required for solution by the functions of the own apparatus alone is equal to or longer than a threshold time, or a problem for which the quality of the result of work by the functions of the own apparatus is equal to or lower than a predetermined quality.

The invention according to claim 43 is a program for causing a computer to function as: a situation collecting means for detecting a surrounding situation and a person existing around the own apparatus; a detection means for referring to a storage means that stores information indicating a threshold value, determined for each attribute of a person, for determining whether or not a problem of the situation has occurred, and for determining that the problem has occurred when a value representing a detection result of the situation detected by the situation collecting means is equal to or greater than the threshold value; and a control means for performing control to execute a solution means that uses an element other than the own apparatus when the solution means for solving the problem cannot be executed with only the functions of the own apparatus, wherein the detection means further changes the threshold value according to the attribute of the person detected by the situation collecting means by referring to the information stored in the storage means.

According to the inventions of claims 1, 9, 10, 42, and 44, a robot apparatus can solve a problem in cooperation according to the situation.

  According to the second aspect of the present invention, a solution that uses at least one of a device and a person is executed.

  According to the third aspect of the present invention, the load on the robot apparatus is reduced as compared with the case of performing control for executing the solution means using humans.

  According to the invention of claim 4-6, the solving means is executed by a device other than the robot apparatus.

  According to the seventh aspect of the invention, even when the device cannot be controlled by communication, the solution means that uses the device is executed.

  According to the eighth aspect of the invention, the solving means is executed using the robot apparatus.

  According to the inventions according to claims 11, 12, and 16, the solution means is presented to the user.

  According to the invention according to claims 13-15, the solving means is executed by utilizing the devices present in the surroundings.

  According to the invention of claim 17, a device capable of executing the solving means is presented to the user.

  According to the eighteenth and nineteenth aspects, the solving means is executed by manipulating the device image.

According to the inventions of claims 21 to 24, the accuracy of detection can be improved.

According to the invention of claim 20, a solution means is specified even when the detection results are inconsistent with each other.

According to the inventions of claims 25 to 28, an element that can be used for a fee can be utilized.

According to the inventions of claims 29 and 30, a solution means is searched for.

According to the invention of claim 31, when the solution means cannot be determined by the robot apparatus alone, a state in which an instruction from the user can be obtained is established.

According to the invention of claim 32, information required for the user to determine the solution means is provided to the user.

According to the inventions of claims 33 and 34, the problems that the robot apparatus can solve change according to updates of the functions of the robot apparatus.

According to the inventions of claims 35 and 36, solution means are executed until the problem is solved.

According to the inventions according to claims 37 and 38 , the robot apparatus can detect the situation and control the solution means while moving.

According to the inventions of claims 39 and 40, communication can be performed by using a more appropriate communication method.

According to the invention of claim 41, it becomes easier for a person to solve the problem.

FIG. 1 is a block diagram showing a device system according to the present embodiment.
FIG. 2 is a diagram showing the external appearance of a robot apparatus according to the present embodiment.
FIG. 3 is a block diagram showing the robot apparatus according to the present embodiment.
FIG. 4 is a block diagram showing a terminal device.
FIG. 5 is a diagram showing characteristics of wireless communication technologies.
FIG. 6 is a diagram showing characteristics of wireless communication technologies.
FIG. 7 is a diagram showing an owned function management table.
FIG. 8 is a diagram showing a non-owned function management table.
FIG. 9 is a diagram showing a solution means management table.
FIG. 10 is a diagram showing a device function management table.
FIG. 11 is a diagram showing a cooperation function management table.
FIG. 12 is a flowchart showing an outline of a status confirmation process.
FIG. 13 is a flowchart showing details of the status confirmation process.
FIG. 14 is a flowchart showing processing in a user judgment mode.
FIG. 15 is a flowchart showing processing in an autonomous determination mode.
FIG. 16 is a flowchart showing execution control of a solution means.
FIG. 17 is a diagram for explaining application scene 1 of the robot apparatus.
FIG. 18 is a diagram for explaining application scene 2 of the robot apparatus.
FIG. 19 is a diagram for explaining application scene 3 of the robot apparatus.
FIG. 20 is a diagram for explaining application scene 4 of the robot apparatus.
FIG. 21 is a diagram showing a screen.
FIG. 22 is a diagram showing a screen.
FIG. 23 is a diagram showing a screen.
FIG. 24 is a diagram showing a screen.
FIG. 25 is a diagram showing a screen.
FIG. 26 is a diagram showing a screen.
FIG. 27 is a diagram showing a screen.
FIG. 28 is a diagram showing a screen.
FIG. 29 is a diagram showing a screen.
FIG. 30 is a diagram showing a screen.
FIG. 31 is a diagram schematically showing the external appearance of a device.
FIG. 32 is a diagram showing a screen.
FIG. 33 is a diagram showing a screen.
FIG. 34 is a diagram showing a screen.
FIG. 35 is a diagram showing a screen.
FIG. 36 is a flowchart showing a connection process.
FIG. 37 is a diagram showing a cooperation function management table.
FIG. 38 is a diagram showing a screen.
FIG. 39 is a diagram showing a screen.
FIG. 40 is a diagram showing a device function management table.
FIG. 41 is a diagram showing a screen.
FIG. 42 is a diagram showing a device function management table.
FIG. 43 is a diagram showing a cooperation function management table.
FIG. 44 is a diagram showing a screen.
FIG. 45 is a diagram showing a screen.
FIG. 46 is a diagram showing a screen.

  With reference to FIG. 1, a device system as an information processing system according to an embodiment of the present invention will be described. FIG. 1 shows an example of a device system according to the present embodiment.

  The device system according to the present embodiment includes, for example, a robot apparatus 10, one or a plurality of devices 12, and a terminal device 14. For example, the robot apparatus 10, the device 12, and the terminal device 14 may communicate with other apparatuses via a communication path N such as a network, or may communicate with other apparatuses via separate communication paths. Of course, the configuration shown in FIG. 1 is merely an example, and the robot apparatus 10, the device 12, and the terminal device 14 do not have to communicate with other apparatuses.

  In the example illustrated in FIG. 1, one device 12 is included in the device system, but a plurality of devices 12 may be included in the device system. In this case, a plurality of devices 12 having the same function may be included in the device system, or a plurality of devices 12 having different functions may be included in the device system. Further, although one terminal device 14 is included in the device system, a plurality of terminal devices 14 may be included in the device system. Other devices such as a server may be included in the device system.

  The robot apparatus 10 is an apparatus having a function of detecting the situation around the robot apparatus 10 and detecting a problem that has occurred. The robot apparatus 10 may detect the surrounding situation while moving, or may detect the surrounding situation while stopped. When a problem is detected, the robot apparatus 10 determines whether the problem can be solved by the robot apparatus 10 itself, and performs processing according to the determination result. When the robot apparatus 10 can solve the problem, the robot apparatus 10 solves the problem by executing a solution means for solving the problem. When the problem cannot be solved by the robot apparatus 10, the robot apparatus 10 performs control for executing a solution means that uses elements other than the robot apparatus 10. The elements are, for example, a person, a device other than the robot apparatus 10, another robot apparatus, and the like. Of course, the elements may include the robot apparatus 10 itself. For example, the robot apparatus 10 may cause another device to solve the problem, may ask a person for cooperation, may cause another device and a person to solve the problem together, or may itself work with another device or a person to solve the problem.
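
  As a rough sketch (not the patent's actual implementation), the branching described above, namely whether the problem can be handled with the robot's own functions or must be handed to elements other than the robot, could be organized as follows in Python. All names (RobotApparatus, Solution, the function names) are hypothetical.

    # Hypothetical sketch of the problem-handling flow described above.
    from dataclasses import dataclass, field

    @dataclass
    class Solution:
        name: str
        required_functions: set      # functions needed to execute this solution means

    @dataclass
    class RobotApparatus:
        own_functions: set = field(default_factory=set)

        def can_solve_alone(self, solution: Solution) -> bool:
            # The robot can solve the problem by itself only if it has every
            # function the solution means requires.
            return solution.required_functions <= self.own_functions

        def handle_problem(self, solution: Solution) -> str:
            if self.can_solve_alone(solution):
                return f"robot executes '{solution.name}' by itself"
            # Otherwise, delegate to elements other than the robot
            # (another device, a person, or a combination of them).
            return f"robot asks other devices or people to execute '{solution.name}'"

    robot = RobotApparatus(own_functions={"move", "speak"})
    print(robot.handle_problem(Solution("open the window", {"grip", "move"})))
    # -> robot asks other devices or people to execute 'open the window'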

  The problems detected by the robot apparatus 10 are, for example, problems that can be solved only by the robot apparatus 10, problems that can be solved only by devices other than the robot apparatus 10, problems that can be solved only by a person, problems that can be solved by collaborative work between another device and a person, problems that can be solved by collaborative work between the robot apparatus 10 and a person, problems that can be solved by collaborative work between another device and the robot apparatus 10, problems that can be solved by collaborative work among another device, another robot apparatus, and the robot apparatus 10, and problems that can be solved by collaborative work among another device, a person, and the robot apparatus 10. Correspondingly, the solution means is, for example, a means realized only by the robot apparatus 10, a means realized only by another device, a means realized only by a person, a means realized by collaborative work between another device and a person, a means realized by collaborative work between the robot apparatus 10 and a person, a means realized by collaborative work between another device and the robot apparatus 10, a means realized by collaborative work among another device, another robot apparatus, and the robot apparatus 10, or a means realized by collaborative work among another device, a person, and the robot apparatus 10.

  The device 12 is a device having a specific function, for example, an image forming apparatus having an image forming function, a personal computer (PC), a display device such as a liquid crystal display or a projector, an aroma device that generates a scent, a telephone, a watch, a camera, a surveillance camera, a vending machine, an air conditioning device (an air conditioner, a cooler, etc.), an electric fan, a humidifier, and the like. Of course, devices other than these may be included in the device system. Note that, depending on its functions, a device may also fall into the category of a robot apparatus.

  The terminal device 14 is a device such as a personal computer (PC), a tablet PC, a smartphone, a mobile phone, or the like. The terminal device 14 is used by the user, for example, when executing a solution means for solving a problem.

  In the present embodiment, a surrounding situation is detected by the robot apparatus 10, and control for executing a solution means for solving the problem is performed according to the problem of the situation.

  Hereinafter, each device included in the device system according to the present embodiment will be described in detail.

  FIG. 2 shows the appearance of the robot apparatus 10. The robot apparatus 10 is, for example, a humanoid robot. Of course, the robot apparatus 10 may be a robot of a type other than a humanoid. In the example illustrated in FIG. 2, the robot apparatus 10 includes a body portion 16, a head 18 provided on the body portion 16, leg portions 20 provided at a lower portion of the body portion 16, arm portions 22 provided on both side portions of the body portion 16, and finger portions 24 provided at the tip of each arm portion 22.

  The robot apparatus 10 includes various sensors such as a visual sensor, an auditory sensor, a tactile sensor, a taste sensor, and an olfactory sensor, and has abilities related to vision, hearing, touch, taste, and smell corresponding to the human senses. The robot apparatus 10 also has, for example, the ability to distinguish and understand superficial sensations (tactile sensation, pain sensation, temperature sensation, etc.), deep sensations (pressure sensation, position sensation, vibration sensation, etc.), and cortical sensations (two-point discrimination, three-dimensional discrimination, etc.). The robot apparatus 10 further has a sense of balance. For example, a sensor such as a camera 26 is provided on the head 18, and vision is realized by image recognition using images obtained by the camera 26 or the like. The robot apparatus 10 is also provided with a sound collecting means such as a microphone, and hearing is realized by voice recognition using sounds obtained by the microphone or the like.

  The robot apparatus 10 may include means for detecting a human brain wave (for example, a means in which a human brain wave detection device is provided and information transmitted from the brain wave detection device is received).

  The leg portion 20 corresponds to an example of a moving means, and is driven by a driving force from a driving source such as a motor, for example. The robot apparatus 10 can be moved by the legs 20. The leg 20 may have a shape like a human foot, a roller, a tire, or the like, or may have another shape. The leg portion 20 is merely an example of a moving unit, and the robot apparatus 10 may include, for example, a configuration for flying (for example, a propeller, a wing, a flying engine, etc.) as a moving unit other than the leg unit 20. Alternatively, a configuration for moving in water (for example, an engine for moving in water) may be provided. That is, the robot apparatus 10 only needs to include at least one of a means for moving on land, a means for flying, and a means for moving underwater as the moving means. Of course, the robot apparatus 10 may not include a moving unit.

  The robot apparatus 10 may have a capability of grasping and operating an object by the arm part 22 and the finger part 24. The robot apparatus 10 may have an ability to move while grasping or holding an object.

  The robot apparatus 10 may have a function of emitting a voice. The robot apparatus 10 may have a communication function and transmit / receive data to / from other apparatuses. The robot apparatus 10 may have an ability to communicate with a person, another apparatus, or another robot apparatus by making a sound or sending a communication message.

  The robot apparatus 10 may have an ability to make a judgment close to a person by machine learning using artificial intelligence (AI), for example. Neural network type deep learning may be used, or reinforcement learning that partially enhances the learning field may be used.

  The robot apparatus 10 may have a function of searching for information (for example, solution means for solving a problem) using, for example, the Internet.

  The robot apparatus 10 may communicate with another device using its communication function and control the operation of the other device, may operate another device by using a remote controller or the like, or may directly operate another device. In the direct operation, for example, an operation unit (for example, a button or a panel) provided on the other device is operated by the robot apparatus 10. For example, when the operation of another device cannot be controlled by communication, the robot apparatus 10 may operate the other device by using a remote controller or the like, or may operate the other device directly. For example, the robot apparatus 10 analyzes an image obtained by the visual sensor to identify the operation unit or the remote controller of the other device, and then operates the operation unit or the remote controller.

  In addition, the robot apparatus 10 may include a display unit 28. The display unit 28 displays information about the problem that has occurred, information about the solution, various messages, and the like.

  Hereinafter, the configuration of the robot apparatus 10 will be described in detail with reference to FIG. FIG. 3 is a block diagram showing a configuration of the robot apparatus 10.

  The communication unit 30 is a communication interface, and has a function of transmitting data to other devices and a function of receiving data from other devices. The communication unit 30 may be a communication interface having a wireless communication function or a communication interface having a wired communication function. The communication unit 30 supports, for example, one or more types of communication methods, and communicates with a communication partner according to a communication method suitable for that partner (that is, a communication method supported by the partner). The communication methods are, for example, infrared communication, visible light communication, Wi-Fi (registered trademark) communication, and short-range wireless communication (for example, Bluetooth (registered trademark), RFID (Radio Frequency Identifier), or the like). The communication unit 30 switches the communication method according to, for example, the communication partner or the surrounding environment (for example, the distance between the robot apparatus 10 and the communication partner, or the presence or absence of an obstacle between the robot apparatus 10 and the communication partner). As the communication frequency band, for example, a long wavelength band such as 800 MHz to 920 MHz (LPWA (Low Power Wide Area) or the like) or a short wavelength band such as 2.4 GHz or 5 GHz (MuLTEfire or the like) may be used. The communication unit 30 switches the frequency band according to the communication partner, for example, or switches the communication method according to the surrounding environment.
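
  A minimal sketch of such switching is shown below; the candidate methods, the distance and obstacle rules, and the preference order are assumptions made only for illustration, not taken from the patent.

    # Illustrative only: pick a communication method from those the partner
    # supports, based on the surrounding environment (distance, obstacles).
    def choose_communication_method(partner_methods, distance_m, obstacle_present):
        if distance_m < 10 and not obstacle_present and "Bluetooth" in partner_methods:
            return "Bluetooth"           # short range, clear line of sight
        if "Wi-Fi" in partner_methods:
            return "Wi-Fi"               # longer range, tolerates obstacles
        if "infrared" in partner_methods and not obstacle_present:
            return "infrared"
        return None                      # no usable method found

    print(choose_communication_method({"Bluetooth", "Wi-Fi"}, 3, False))   # Bluetooth
    print(choose_communication_method({"Bluetooth", "Wi-Fi"}, 3, True))    # Wi-Fi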

  The storage unit 32 is a storage device such as a hard disk or a memory (for example, an SSD). The storage unit 32 stores, for example, owned function management information 34, non-owned function management information 36, solution means management information 38, device function management information 40, cooperation function management information 42, and other various data, various programs, and the like. In addition, device address information indicating the addresses of other devices may be stored in the storage unit 32. The above pieces of information may be stored in separate storage devices, or may be stored in the same storage device.

  The owned function management information 34 is information indicating the functions that the robot apparatus 10 has. In the owned function management information 34, for example, information indicating a function of the robot apparatus 10 is associated with information indicating operations (operations, processing, and the like) that can be executed by using that function. By referring to the owned function management information 34, what the robot apparatus 10 can execute is specified (identified). Functions not shown in the owned function management information 34 are functions that the robot apparatus 10 does not have, so what the robot apparatus 10 cannot execute is also specified (identified) by referring to the owned function management information 34.

  The non-owned function management information 36 is information indicating functions that the robot apparatus 10 does not have. In the non-owned function management information 36, for example, information indicating a function that the robot apparatus 10 does not have is associated with information indicating operations that the robot apparatus 10 cannot execute because it does not have that function. By referring to the non-owned function management information 36, what the robot apparatus 10 cannot execute is specified (identified). Functions not shown in the non-owned function management information 36 may be functions that the robot apparatus 10 has, so what the robot apparatus 10 can execute may also be specified (identified) by referring to the non-owned function management information 36.

  The storage unit 32 may store both the owned function management information 34 and the non-owned function management information 36, and what the robot apparatus 10 can and cannot execute may be specified based on these pieces of information. Alternatively, only one of the two pieces of information may be stored, and what the robot apparatus 10 can and cannot execute may be specified based on that information.
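
  As a simple illustration, the owned and non-owned function management information can be pictured as two lookup tables; the dictionary representation and the entries below are assumptions made only for this sketch.

    # Hypothetical owned / non-owned function tables for the robot apparatus 10.
    owned_function_management = {
        "speak": "can output voice messages",
        "move":  "can move to another location",
    }
    non_owned_function_management = {
        "print": "cannot print because no print engine is installed",
    }

    def robot_can_execute(function_name):
        # A function listed as non-owned (or not listed at all) is treated as
        # not executable; only owned functions are treated as executable.
        if function_name in non_owned_function_management:
            return False
        return function_name in owned_function_management

    print(robot_can_execute("move"))   # True
    print(robot_can_execute("print"))  # False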

  The solution management information 38 is information indicating solution means (solution, solution action) for solving the problem that has occurred. In the solution management information 38, for example, information indicating a problem assumed to occur and information indicating a solution for the problem are associated with each other.
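
  The association between an assumed problem and its solution means can likewise be sketched as a small table; the problems and solutions below are made-up examples, not entries from the patent.

    # Hypothetical problem -> solution table corresponding to the
    # solution means management information 38.
    solution_management = {
        "room temperature is too high": "turn on the air conditioner",
        "loud noise detected":          "ask nearby people to check the source",
    }

    def specify_solution(problem):
        # Returns None when no solution means is registered for the problem
        # (in that case a search, e.g. on the Internet, could be attempted).
        return solution_management.get(problem)

    print(specify_solution("room temperature is too high"))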

  The device function management information 40 is information for managing the functions of devices other than the robot apparatus 10, and is, for example, information indicating an association between device identification information for identifying a device and function information indicating the functions that the device has. The device identification information is, for example, a device ID, a device name, information indicating the type of the device, a model number of the device, information indicating the position where the device is installed (device position information), an appearance image representing the appearance of the device, and the like. The function information is, for example, a function ID or a function name. For example, when an image forming apparatus as a device has a scan function, a copy function, and a scan transfer function, the device identification information of the image forming apparatus is associated with function information indicating the scan function, function information indicating the copy function, and function information indicating the scan transfer function. By referring to the device function management information 40, the functions of each device are specified (identified).
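
  A device-to-function lookup of this kind might be sketched as follows; the device IDs and function names are hypothetical.

    # Hypothetical device function management table: device ID -> functions.
    device_function_management = {
        "image_forming_apparatus_01": {"scan", "copy", "scan transfer"},
        "aroma_device_02":            {"emit scent"},
    }

    def functions_of(device_id):
        # Look up the functions associated with the device identification information.
        return device_function_management.get(device_id, set())

    print(functions_of("image_forming_apparatus_01"))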

  The devices managed by the device function management information 40 are, for example, devices included in the device system (for example, the device 12). Of course, devices that are not included in the device system may also be managed by the device function management information 40. For example, the robot apparatus 10 may acquire information related to a new device that is not included in the device system (information including device identification information and function information) and newly register it in the device function management information 40. The information about the device may be acquired by using the Internet or the like, or may be input by an administrator or the like. Further, the robot apparatus 10 may update the device function management information 40 at an arbitrary timing, periodically, or at a timing designated by an administrator or the like. As a result, function information indicating a function that a device did not have before the update but has after the update may be registered in the device function management information 40. Similarly, function information indicating a function that a device had before the update but no longer has after the update may be deleted from the device function management information 40 or registered as unusable information. The update information may be acquired by using the Internet or the like, or may be input by an administrator or the like.

  The cooperation function management information 42 is information for managing a cooperation function executed by linking a plurality of functions. One or a plurality of cooperation functions are executed by linking a plurality of functions. For example, a cooperation function may be executed by linking a plurality of functions of one device, or may be executed by linking a plurality of functions of a plurality of devices. In addition, a device that issues an operation instruction (for example, the terminal device 14) may be included in the devices to be identified, and the function of the terminal device or the function of the robot apparatus 10 may be used as a part of the cooperation function.

  The linkage function may be a function that is executed without using a device as hardware. For example, the cooperation function may be a function executed by linking a plurality of software. Of course, the linkage function may be a function executed by linking a function of a device as hardware and a function realized by software.

  The cooperation function management information 42 is, for example, information indicating an association between a combination of function information indicating the functions used for a cooperation function and cooperation function information indicating that cooperation function. The cooperation function information is, for example, a cooperation function ID or a cooperation function name. When a single function is updated, the cooperation function management information 42 is also updated along with the update. As a result, a cooperation function using a plurality of functions that could not be linked to each other before the update may become available after the update, and conversely, a cooperation function that was available before the update may become unavailable after the update. Cooperation function information indicating a cooperation function that has become available after the update is registered in the cooperation function management information 42, and cooperation function information indicating a cooperation function that has become unavailable after the update is deleted from the cooperation function management information 42 or registered as unusable information.

  When a plurality of devices are linked, the cooperation function management information 42 is information for managing cooperation functions that use a plurality of functions of the plurality of devices, and is information indicating an association between a combination of device identification information for identifying the devices used for a cooperation function and the cooperation function information. As described above, when the device function management information 40 is updated, the cooperation function management information 42 is also updated along with the update. As a result, a cooperation function using a plurality of devices that could not be linked to each other before the update may become available after the update, and conversely, a cooperation function that was available before the update may become unavailable after the update.

  The cooperation function may be a function executed by linking a plurality of different functions to each other, or may be a function executed by linking the same function. The cooperation function may be a function that could not be used before the cooperation. The functions that could not be used before cooperation may be functions that can be used by using the same functions among the functions of the devices to be linked, or functions that can be used by combining different functions. May be. For example, by coordinating a device having a print function (printer) and a device having a scan function (scanner), a copy function as a cooperation function is realized. That is, the copy function is realized by linking the print function and the scan function. In this case, a copy function as a linkage function is associated with a combination of a print function and a scan function. In the cooperation function management information 42, for example, cooperation function information indicating a copy function as a cooperation function, device identification information for identifying a device having a print function, and device identification information for identifying a device having a scan function. Are associated with each other.
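
  The print-plus-scan-equals-copy example can be expressed as a lookup keyed by the combination of single functions; using a frozenset as the key is an assumption made only for this sketch.

    # Hypothetical cooperation function table: a combination of single
    # functions maps to the cooperation function realized by linking them.
    cooperation_function_management = {
        frozenset({"print", "scan"}): "copy",
    }

    def cooperation_function(functions):
        return cooperation_function_management.get(frozenset(functions))

    # A device with a print function (printer) linked with a device with a
    # scan function (scanner) realizes a copy function, as in the text above.
    print(cooperation_function({"scan", "print"}))   # copy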

  The storage unit 32 may store usable function management information. The usable function management information is information for managing the functions that can be used by each user, and is, for example, information indicating an association between user identification information for identifying a user and function information indicating the functions that the user can use (which may include cooperation function information). The functions that can be used by a user are, for example, functions provided to the user free of charge, functions purchased by the user, and the like, and may be single functions or cooperation functions. The user identification information is, for example, user account information such as a user ID and a name. By referring to the usable function management information, the functions available to each user are identified (specified). The usable function management information is updated, for example, every time a function is provided to a user (for example, every time a function is provided to a user free of charge or for a fee).

  At least one of the owned function management information 34, the non-owned function management information 36, the solution means management information 38, the device function management information 40, the cooperation function management information 42, and the usable function management information may be stored in a device other than the robot apparatus 10 (for example, a server (not shown), the device 12, the terminal device 14, or the like). In this case, the at least one piece of information may not be stored in the storage unit 32 of the robot apparatus 10.

  The situation collection unit 44 has a function of collecting information about the situation around the robot apparatus 10 (hereinafter referred to as “situation information”) using various sensors. As the sensor, the above-described visual sensor, auditory sensor, tactile sensor, taste sensor, olfactory sensor, or the like is used. An image (for example, a moving image or a still image) around the robot apparatus 10 is acquired by the visual sensor, and image recognition is performed. Sounds around the robot apparatus 10 (for example, human conversations and noises) are collected by the auditory sensor, and voice recognition is performed. In addition, the ambient temperature, humidity, smell, etc. of the robot apparatus 10 are detected. Of course, other sensors may be used to collect information related to the situation around the robot apparatus 10. The status collection unit 44 may acquire status information from devices other than the robot device 10, sensors, and the like.

  The moving unit 46 has a function of moving the robot apparatus 10 by at least one of a means for moving on land, a means for flying, and a means for moving underwater. The moving unit 46 includes, for example, the leg portions 20 shown in FIG. 2.

  The working unit 48 has a function of operating devices other than the robot apparatus 10 and lifting or moving objects. The working unit 48 includes, for example, the leg portions 20, the arm portions 22, and the finger portions 24 shown in FIG. 2.

  The UI unit 50 is a user interface unit, and includes a display unit (for example, the display unit 28 shown in FIG. 1) and an operation unit. The display unit is a display device such as a liquid crystal display. The operation unit is an input device such as a touch panel or a keyboard. Of course, it may be a user interface that serves both as a display unit and an operation unit (for example, including a touch-type display or a device that electronically displays a keyboard or the like on the display). The robot apparatus 10 may not include the UI unit 50 or may include a hardware key (for example, various buttons) without including the display unit. The buttons as hardware keys are, for example, buttons specialized for numeric input (for example, numeric keys), buttons specialized for direction instructions (for example, direction instruction keys), and the like.

  The speaker 52 has a function of outputting sound. For example, a sound related to the solving means (for example, a message for requesting cooperation from a person) is emitted from the speaker 52.

  The control unit 54 controls the operation of each unit of the robot apparatus 10. The control unit 54 includes a detection unit 56, a solution means identification unit 58, a determination unit 60, a search unit 62, and an identification unit 64.

  The detection unit 56 has a function of detecting the situation around the robot apparatus 10 based on the situation information (for example, the values of various sensors) collected by the situation collection unit 44, and determining whether or not a problem has occurred around the robot apparatus 10. The detection unit 56 detects, as the situation, for example, information related to a person existing around the robot apparatus 10 or information other than a person. The information about a person is, for example, an image of the person obtained by the visual sensor (for example, an image of the person's face, an image of the whole body, or an image representing the person's movement), the person's voice collected by the auditory sensor, and the like. The information other than a person is, for example, temperature information obtained by a temperature sensor, humidity information obtained by a humidity sensor, and the like. Specifically, the detection unit 56 detects the surrounding situation and the problem that has occurred by combining the image obtained by the visual sensor, the sound collected by the auditory sensor, information on touch obtained by the tactile sensor, information on taste obtained by the taste sensor, information on smell obtained by the olfactory sensor, and the like. Note that the detection unit 56 may use information collected by a sensor that the robot apparatus 10 does not have. For example, the detection unit 56 may acquire information collected by an external sensor (for example, a sensor installed in a room or a sensor included in another device) and detect a problem by using that information.

  For example, the detection unit 56 may determine that a problem has occurred when a value obtained by a sensor (for example, a temperature detected by a temperature sensor) is equal to or greater than a threshold value. The threshold value may be changed according to the age and sex of a person detected by various sensors included in the robot apparatus 10. By changing the threshold according to the age and sex of the person, the problem can be detected according to the individuality of the person. That is, since the way of feeling the problem may differ depending on the age and sex, the problem corresponding to the age and sex can be detected by changing the threshold according to the age and sex.
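
  One way to change the threshold according to a detected person's attributes is a small table keyed by an attribute such as age group; the categories and temperature values below are purely illustrative assumptions.

    # Illustrative only: temperature thresholds (degrees Celsius) at or above
    # which the detection unit would judge that a problem has occurred,
    # varied according to the attribute of the detected person.
    temperature_threshold_by_attribute = {
        "elderly": 26.0,   # assumed to be affected by heat at a lower temperature
        "child":   27.0,
        "adult":   29.0,
    }

    def problem_occurred(measured_temperature, person_attribute):
        threshold = temperature_threshold_by_attribute.get(person_attribute, 29.0)
        # A problem is judged to have occurred when the sensor value is
        # equal to or greater than the threshold.
        return measured_temperature >= threshold

    print(problem_occurred(27.5, "elderly"))  # True
    print(problem_occurred(27.5, "adult"))    # False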

  The solving means specifying unit 58 has a function of referring to the solving means management information 38 and specifying (identifying) a solving means for solving the problem detected by the detecting unit 56.

  When a problem of the situation based on a detection result based on information about a person (for example, a person's image or voice; corresponding to a first detection result) differs from a problem of the situation based on a detection result based on information other than a person (corresponding to a second detection result), the solution means specifying unit 58 may select a solution means for solving the problem that has occurred based on a predetermined priority of the first detection result or the second detection result. In this case, for example, by selecting a solution means based on the first detection result, a solution means for solving the problem that is actually assumed to be occurring for the person is selected.
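
  The selection between conflicting detection results can be sketched as a simple priority rule; giving precedence to the person-based (first) detection result is only the example mentioned above, not a fixed requirement.

    # Illustrative: when the person-based result and the non-person result
    # indicate different problems, pick one according to a predetermined priority.
    def select_problem(first_result, second_result, prefer_first=True):
        if first_result == second_result:
            return first_result
        return first_result if prefer_first else second_result

    print(select_problem("person appears to be too hot", "temperature is normal"))
    # -> person appears to be too hot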

  The determination unit 60 has a function of determining whether or not the problem detected by the detection unit 56 can be solved by the robot apparatus 10. Problems that cannot be solved by the robot apparatus 10 are, for example, problems that cannot be solved by the functions of the robot apparatus 10 alone, problems for which the time required for solution by the functions of the robot apparatus 10 alone is equal to or longer than a preset time, and problems for which the quality of the result of work by the functions of the robot apparatus 10 falls below a preset quality. When the solution means specified by the solution means specifying unit 58 is included in the function group of the robot apparatus 10, the determination unit 60 determines that the problem can be solved by the robot apparatus 10 alone. That is, when the robot apparatus 10 has all the functions necessary for executing the solution means, the determination unit 60 determines that the problem can be solved by the robot apparatus 10 alone. On the other hand, when the solution means specified by the solution means specifying unit 58 is not included in the function group of the robot apparatus 10, the determination unit 60 determines that the problem cannot be solved by the robot apparatus 10 alone. That is, when the robot apparatus 10 does not have all or some of the functions necessary for executing the solution means, the determination unit 60 determines that the problem cannot be solved by the robot apparatus 10 alone. For example, the determination unit 60 may specify the functions that the robot apparatus 10 has (or does not have) by referring to the owned function management information 34, or may specify the functions that the robot apparatus 10 does not have (or has) by referring to the non-owned function management information 36.
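
  The three conditions listed here (missing functions, excessive time, insufficient quality) can be combined into a single check, roughly as sketched below; the numeric thresholds and estimates are assumptions for illustration.

    # Illustrative check corresponding to the determination by the determination unit 60.
    def unsolvable_by_robot_alone(required_functions, owned_functions,
                                  estimated_time_s, time_threshold_s,
                                  estimated_quality, quality_threshold):
        lacks_functions = not required_functions.issubset(owned_functions)
        too_slow        = estimated_time_s >= time_threshold_s
        low_quality     = estimated_quality < quality_threshold
        return lacks_functions or too_slow or low_quality

    print(unsolvable_by_robot_alone({"move"}, {"move", "grip"},
                                    estimated_time_s=120, time_threshold_s=300,
                                    estimated_quality=0.9, quality_threshold=0.6))
    # -> False (the robot can handle this problem by itself)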

  When the determination unit 60 determines that the problem can be solved by the robot device 10, the robot device 10 executes the solution means specified by the solution means specification unit 58 under the control of the control unit 54. . On the other hand, when the determination unit 60 determines that the problem cannot be solved by the robot apparatus 10, the control unit 54 provides a solution unit that uses elements (other devices or people) other than the robot apparatus 10. Control to execute. For example, the control unit 54 may cause a device other than the robot apparatus 10 to execute the solving means, or may request cooperation from a person. Of course, the robot device 10 and other elements (for example, a person or other equipment) may jointly execute the solving means.

  The search unit 62 is a solution means for solving the problem detected by the detection unit 56 and has a function of searching for a solution means that is not registered in the solution management information 38. The search unit 62 searches for the solution by using, for example, the Internet.

  The identification unit 64 has a function of identifying a device other than the robot apparatus 10 and identifying (specifying) a function of the device. For example, the identification unit 64 may identify the device based on an image (for example, an image representing the appearance of the device) obtained by photographing the device with a visual sensor, or by photographing the device with the visual sensor. The device identification information provided in the device may be acquired to identify the device, or the device may be identified by acquiring position information indicating the position where the device is installed. The identification unit 64 further identifies the function of the device. For example, the identification unit 64 specifies function information indicating a function associated with the device identification information of the identified device in the device function management information 40 stored in the storage unit 32. As a result, the function of the device is identified (specified).

  Further, the identification unit 64 may identify a plurality of devices to be linked and specify, in the cooperation function management information 42 stored in the storage unit 32, the cooperation function associated with the combination of the device identification information of the plurality of devices. Thereby, the cooperation function executed by linking the functions of the devices to be linked is identified (specified).

  When the functions that can be used by a user are managed, the identification unit 64 may receive user identification information for identifying the user and specify, in the usable function management information stored in the storage unit 32, the function information indicating each function associated with that user identification information. Thereby, the function group that can be used by the user is identified (specified). For example, the user identification information is transmitted from the terminal device 14 to the robot apparatus 10, and the identification unit 64 specifies the function information indicating each function associated with the user identification information. Alternatively, for example, the identification unit 64 receives the device identification information and the user identification information, specifies, in the device function management information 40, the function information indicating the functions associated with the device identification information, and specifies, in the usable function management information, the function information indicating the functions associated with the user identification information. Thereby, the functions that the device specified by the device identification information has and that the user specified by the user identification information can use are specified.
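
  Combining the device function management information with the usable function management information amounts to intersecting two lookups; the sketch below assumes simple dictionaries keyed by a device ID and a user ID.

    # Hypothetical: the functions a given user can use on a given device are
    # the intersection of the device's functions and the user's usable functions.
    device_functions = {"image_forming_apparatus_01": {"scan", "copy", "print"}}
    user_usable_functions = {"user_a": {"copy", "print", "translate"}}

    def usable_functions(device_id, user_id):
        return (device_functions.get(device_id, set())
                & user_usable_functions.get(user_id, set()))

    print(usable_functions("image_forming_apparatus_01", "user_a"))
    # -> {'copy', 'print'} (order may vary)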

  The control unit 54 may execute a function purchase process and manage the purchase history. For example, when a paid function is purchased by a user, the control unit 54 may apply a charging process to the user.

  The control unit 54 has artificial intelligence (AI) as an intelligent unit, and the functions of the respective units of the control unit 54 are realized by the artificial intelligence.

  It should be noted that at least one of the detection unit 56, the solving means specifying unit 58, the determination unit 60, the search unit 62, and the identification unit 64 may be provided in a device other than the robot apparatus 10 (for example, a server (not shown), the device 12, or the terminal device 14). In this case, the at least one unit may not be included in the control unit 54 of the robot apparatus 10.

  Hereinafter, the configuration of the terminal device 14 will be described in detail with reference to FIG. FIG. 4 shows the configuration of the terminal device 14.

  The communication unit 66 is a communication interface, and has a function of transmitting data to other devices and a function of receiving data from other devices. The communication unit 66 may be a communication interface having a wireless communication function or a communication interface having a wired communication function.

  A camera 68 serving as an imaging unit generates image data (for example, still image data or moving image data) by imaging an imaging target. In addition to the camera of the terminal device 14 being used, the communication unit 66 may receive image data captured by an external camera connected to a communication path such as a network, the UI unit 72 may display that image data, and the user may manipulate it. Note that the terminal device 14 may not include the camera 68.

  The storage unit 70 is a storage device such as a hard disk or a memory (for example, an SSD), and stores various programs, various data, and the like. The storage unit 70 may store the address information of the robot apparatus 10, the address information of each device (for example, the device 12), information about identified devices, information about identified devices to be linked, information about the functions of identified devices, information related to linkage functions, and the like. These pieces of information may be stored in separate storage devices, or may be stored in the same storage device.

  The UI unit 72 is a user interface unit and includes a display unit and an operation unit. The display unit is a display device such as a liquid crystal display. The operation unit is, for example, an input device such as a touch panel, a keyboard, and a mouse. Of course, a user interface that serves as both a display unit and an operation unit (for example, a touch-type display or a device that electronically displays a keyboard or the like on the display) may be used.

  The control unit 74 controls the operation of each unit of the terminal device 14. The control unit 74 functions as a display control unit (control unit), for example, and displays various types of information on the display unit of the UI unit 72.

  The display unit of the UI unit 72 displays, for example, an image captured by the robot apparatus 10, an image captured by the camera 68, an image associated with a device to be used (for example, a device used alone or devices to be linked), an image associated with an identified device, an image associated with a function, and the like. The image associated with a device may be, for example, an image (for example, a still image or a moving image) representing the device captured by the robot apparatus 10, an image (for example, an icon) schematically representing the device, or an image (for example, a still image or a moving image) representing the device captured by the camera 68. For example, the image data schematically representing the device may be stored in the robot apparatus 10 and provided from the robot apparatus 10 to the terminal device 14, may be stored in advance in the terminal device 14, or may be stored in another apparatus and provided from that apparatus to the terminal device 14. The image associated with a function is, for example, an image such as an icon representing the function.

  Hereinafter, each wireless communication technique will be described with reference to FIGS. FIG. 5 shows the characteristics for each frequency as the characteristics (merits and demerits) of each wireless communication technology, and FIG. 6 shows the characteristics for each communication method.

  As shown in FIG. 5, a main standard of wireless communication technology using the 900 MHz band is, for example, RFID. Its merits are that it is resistant to obstacles and that there are few interfering frequency bands; its demerits are that the antenna is large and the communicable distance is short.

  The main standard of wireless communication technology with a frequency of 2.4 GHz is, for example, ZigBee (registered trademark), Bluetooth, or the like. Advantages include power saving, high communication speed, and small antenna, and disadvantages include frequent interference.

  Main standards of wireless communication technology using the 5 GHz band are, for example, IEEE802.11a and MuLTEfire. Its merits are that there are few interfering frequency bands and that the communication speed is high; its demerit is that it is vulnerable to obstacles.

  As shown in FIG. 6, the advantages of infrared communication include low power consumption and easy miniaturization, and the disadvantage is that infrared rays are not visible.

  A merit of visible light communication is that the communication path can be understood by visual observation of visible light, and a disadvantage is that directivity is strong.

  An advantage of near field communication (NFC) is that pairing between a plurality of devices is easy, and a disadvantage is that it can only be used at a short distance.

  When communicating with a communication partner using wireless communication technology, the communication unit 30 of the robot apparatus 10 communicates with the communication partner using a wireless communication technology whose characteristics suit the surrounding environment and the communication partner. More specifically, the communication unit 30 switches the wireless communication technology according to, for example, the distance to the communication partner, the presence or absence of an obstacle between the robot apparatus 10 and the communication partner, and the communication methods supported by the communication partner, and communicates with the communication partner accordingly.
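
  As a rough illustration of this selection logic, the following sketch chooses a wireless communication technique from characteristics like those summarized in FIGS. 5 and 6. The distance limits, obstacle tolerance flags, and technique names are illustrative assumptions, not values defined by the embodiment.

```python
# Hypothetical sketch: selecting a wireless communication technique from the
# surrounding environment and the partner's supported methods (cf. FIGS. 5, 6).
# All numeric limits and tolerance flags below are assumed for illustration.

TECHNIQUES = [
    # (name, assumed max distance in meters, assumed tolerance to obstacles)
    ("NFC", 0.1, False),
    ("RFID (900 MHz)", 5.0, True),
    ("Bluetooth (2.4 GHz)", 10.0, False),
    ("IEEE802.11a (5 GHz)", 50.0, False),
]

def choose_technique(distance_m, obstacle_present, partner_supports):
    """Return the first technique that suits the environment and the partner."""
    for name, max_distance, tolerates_obstacles in TECHNIQUES:
        if name not in partner_supports:
            continue  # the communication partner does not support this method
        if distance_m > max_distance:
            continue  # too far for this method
        if obstacle_present and not tolerates_obstacles:
            continue  # an obstacle blocks this method
        return name
    return None  # no suitable technique found

print(choose_technique(3.0, True, {"RFID (900 MHz)", "Bluetooth (2.4 GHz)"}))
# -> "RFID (900 MHz)": the only supported method here that tolerates obstacles
```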

  Hereinafter, the functions of the robot apparatus 10 will be described in detail with reference to FIGS. FIG. 7 shows an example of the owned function management table as the owned function management information 34, and FIG. 8 shows an example of the non-owned function management table as the non-owned function management information 36.

  In the owned function management table, as an example, a management number, information indicating a function of the robot apparatus 10, and information indicating what the robot apparatus 10 can execute by using that function (operation, processing, work, and the like) are associated with each other. For example, the robot apparatus 10 has a function of lifting an object with the arm portion 22, and by this function it can lift and move an object weighing up to 30 kg. The robot apparatus 10 also has a function of moving, and by this function it can move while varying its speed up to 10 km/h. By referring to the owned function management table, the functions that the robot apparatus 10 has and what the robot apparatus 10 can execute are specified (identified). Functions and actions that are not registered in the owned function management table are functions that the robot apparatus 10 does not have or actions that it cannot execute, so referring to the owned function management table also specifies (identifies) what the robot apparatus 10 does not have or cannot execute.

  In the non-owned function management table, as an example, a management number, information indicating a function that the robot apparatus 10 does not have, and information indicating what the robot apparatus 10 cannot execute because it lacks that function (operation, processing, work, and the like) are associated with each other. For example, the robot apparatus 10 does not have an external cooling function (for example, a function of cooling the surroundings of the robot apparatus 10), and therefore the robot apparatus 10 cannot cool the room. Further, the robot apparatus 10 does not have a print function, and therefore it cannot print speech it has heard or a document it has viewed. By referring to the non-owned function management table, the functions that the robot apparatus 10 does not have and what the robot apparatus 10 cannot execute are specified (identified). Functions and actions that are not registered in the non-owned function management table may be functions that the robot apparatus 10 has or actions that it can execute, so referring to the non-owned function management table may also specify (identify) what the robot apparatus 10 has or can execute.

  The storage unit 32 may store the data of both the owned function management table and the non-owned function management table, in which case what the robot apparatus 10 can and cannot execute is specified based on both sets of data. Alternatively, only one of the tables may be stored, and what the robot apparatus 10 can or cannot execute may be specified based on that data.
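
  A minimal data-structure sketch of these two tables and the lookup they support is shown below; the entries follow the examples of FIGS. 7 and 8 described above, but the field names and dictionary layout are assumptions made for illustration.

```python
# Hypothetical sketch of the owned / non-owned function management tables
# (cf. FIGS. 7 and 8). Field names and layout are illustrative assumptions.

OWNED_FUNCTIONS = {
    1: {"function": "lift object with arm", "executable": "lift and move objects up to 30 kg"},
    2: {"function": "move", "executable": "move while varying speed up to 10 km/h"},
}

NON_OWNED_FUNCTIONS = {
    1: {"function": "external cooling", "not_executable": "cannot cool the room"},
    2: {"function": "print", "not_executable": "cannot print heard speech or viewed documents"},
}

def robot_can_execute(function_name):
    """True/False when a table decides it; None when neither table registers it."""
    if any(e["function"] == function_name for e in OWNED_FUNCTIONS.values()):
        return True
    if any(e["function"] == function_name for e in NON_OWNED_FUNCTIONS.values()):
        return False
    return None  # undetermined: not registered in either table

print(robot_can_execute("move"))   # True
print(robot_can_execute("print"))  # False
```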

  Hereinafter, a specific example of the solution management information 38 will be described with reference to FIG. FIG. 9 shows an example of a solution management table as the solution management information 38. In the solution management table, as an example, a management number, information indicating a situation (problem), and information indicating a solving means for that situation (means for solving the problem) are associated with each other. The situation (problem) is detected by the detection unit 56. The solving means specifying unit 58 specifies, in the solution management table, the solving means corresponding to the problem detected by the detection unit 56.
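
  The lookup performed by the solving means specifying unit 58 can be pictured as the following sketch; the table contents mirror the examples of FIG. 9 described below, and the keys and wording are assumptions.

```python
# Hypothetical sketch of the solution management table (cf. FIG. 9) and the
# lookup performed by the solving means specifying unit 58. Keys are assumed.

SOLUTION_TABLE = {
    "room temperature is at or above the temperature threshold":
        ["cool with a cooler", "open the window"],
    "person in the room is acting as if it is cold":
        ["turn on the stove", "close the door"],
}

def specify_solving_means(detected_problem):
    """Return the registered solving means, or an empty list if none is registered."""
    return SOLUTION_TABLE.get(detected_problem, [])

print(specify_solving_means("room temperature is at or above the temperature threshold"))
```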

  For example, when the situation (1) “the room temperature is equal to or higher than the temperature threshold (for example, 30°C or higher)” is detected, the means for solving the situation are means for lowering the room temperature, specifically “(1) cool with a cooler” and “(2) open the window”. The room temperature is detected by, for example, various sensors (for example, a temperature sensor) provided in the robot apparatus 10. The temperature threshold may be changed according to the age and sex of a person around the robot apparatus 10. For example, the detection unit 56 of the robot apparatus 10 detects surrounding people based on information (for example, image and audio information) obtained by various sensors (for example, a visual sensor and an auditory sensor), and estimates the age, gender, and the like of the detected people. For example, different temperature thresholds may be used when an older person (a person whose age is equal to or greater than the age threshold) is detected and when a younger person (a person whose age is less than the age threshold) is detected. For example, when an older person is detected, a smaller temperature threshold may be used than when a younger person is detected. Thereby, a solving means suited to the age is executed. Of course, the threshold may also be changed for each gender, so that a solving means suited to the gender is executed.
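
  A sketch of this attribute-dependent threshold, under assumed numeric values, might look as follows; the age boundary and the lowered threshold are illustrative, not values fixed by the embodiment.

```python
# Hypothetical sketch: adjusting the temperature threshold according to the
# attributes of detected people. All numeric values are assumptions.

DEFAULT_THRESHOLD_C = 30.0
LOWERED_THRESHOLD_C = 28.0   # assumed value used when an older person is present
AGE_THRESHOLD = 65           # assumed boundary between "older" and "younger"

def temperature_threshold(detected_people):
    """Use a smaller threshold when an older person is detected."""
    if any(p.get("age", 0) >= AGE_THRESHOLD for p in detected_people):
        return LOWERED_THRESHOLD_C
    return DEFAULT_THRESHOLD_C

def problem_detected(room_temperature_c, detected_people):
    return room_temperature_c >= temperature_threshold(detected_people)

print(problem_detected(29.0, [{"age": 70}]))  # True: lowered threshold applies
print(problem_detected(29.0, [{"age": 30}]))  # False: default threshold applies
```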

  As another example, when the situation (3) “the person in the room is acting as if it is cold” is detected, the means for solving the situation are means for raising the room temperature, specifically “(1) turn on the stove” and “(2) close the door”. The detection unit 56 of the robot apparatus 10 detects the situation (3) by detecting the actions of surrounding people with various sensors. Specifically, the detection unit 56 detects the situation (3) by detecting a person's facial expression, the presence or absence of sweat, the movement of arms and legs, and the like based on an image representing the person's face or body movement, or by detecting the person's voice.

  When a plurality of mutually contradictory situations are detected, the detection unit 56 may select one of them as the priority situation, and the solving means specifying unit 58 may specify the solving means for solving that priority situation. For example, when the situation (1) “the room temperature is 30°C or higher” is detected and the situation (3) “the person in the room is acting as if it is cold” is also detected, the situation (3) is selected in preference to the situation (1), and a solving means for solving the situation (3) is specified. This priority is set in advance; for example, a situation detected based on information about a person is selected in preference to a situation detected based on information other than a person. The situation (3) is a situation detected based on information about a person (for example, a situation detected by analyzing a person's image or voice), whereas the situation (1) is a situation detected based on information other than a person. By selecting the situation (3) over the situation (1), the solving means for the problem that is assumed to have actually occurred for the surrounding people is preferentially selected, so that the problem can be solved more appropriately.
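
  The preference for person-based situations can be sketched as follows; tagging each situation with its information source is an assumption made for illustration.

```python
# Hypothetical sketch: selecting a priority situation when contradictory
# situations are detected. Person-based situations are preferred, matching the
# example above; the (description, source) tuple layout is assumed.

def select_priority_situation(detected_situations):
    """detected_situations: list of (description, source) tuples."""
    person_based = [s for s in detected_situations if s[1] == "person"]
    if person_based:
        return person_based[0]
    return detected_situations[0] if detected_situations else None

situations = [
    ("room temperature is 30 degrees C or higher", "sensor"),     # situation (1)
    ("person in the room is acting as if it is cold", "person"),  # situation (3)
]
print(select_priority_situation(situations))  # situation (3) wins
```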

  Hereinafter, the device function management information 40 will be described in detail with reference to FIG. FIG. 10 shows an example of a device function management table as the device function management information 40. In the device function management table, as an example, a device ID, information indicating a device name (for example, the device type), information indicating the functions of the device (function information), and an image ID are associated with each other. The device ID and the device name correspond to examples of device identification information. The image ID is an example of image identification information for identifying an image representing the device (for example, an image representing the appearance of the device or an image schematically representing the device, such as an icon). The device function management table may not include the image ID. For example, the device whose device ID is “B” is a multifunction peripheral (an image forming apparatus having a plurality of image forming functions) and has functions such as a print function and a scan function. The device is associated with an image ID for identifying an image representing the device. The image data representing the device is stored, for example, in the storage unit 32 of the robot apparatus 10 or in another apparatus.
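
  The association described above can be pictured as the following sketch; the entry for device “B” follows the example in the text, while the entry for device “A” and the dictionary layout are assumptions.

```python
# Hypothetical sketch of the device function management table (cf. FIG. 10).
# Device "B" follows the example in the text; device "A" and the layout are assumed.

DEVICE_FUNCTION_TABLE = {
    "A": {"name": "PC", "functions": ["store data", "display"], "image_id": "img_A"},
    "B": {"name": "multifunction peripheral", "functions": ["print", "scan"], "image_id": "img_B"},
}

def identify_device(device_id):
    """Return (device name, functions, image ID) for an acquired device ID."""
    entry = DEVICE_FUNCTION_TABLE.get(device_id)
    if entry is None:
        return None  # unknown device: cannot be identified from the table
    return entry["name"], entry["functions"], entry["image_id"]

print(identify_device("B"))
```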

  For example, the device ID for identifying a device existing around the robot apparatus 10 is acquired by the various sensors of the robot apparatus 10, and the identification unit 64 of the robot apparatus 10 specifies the device name, the functions, and the image ID associated with that device ID by referring to the device function management table. Thereby, the devices existing around the robot apparatus 10 are identified. Information indicating the device name and image data representing the device may be transmitted from the robot apparatus 10 to the terminal device 14 and displayed on the terminal device 14. The image representing the device is displayed as an image associated with the device. The image associated with the device may be an image generated by photographing with a camera or the like, or may be an image (for example, an icon) schematically representing the device. In addition, when the user designates an image associated with a device on the terminal device 14, information related to the functions of the device (for example, function information or function description information) may be transmitted from the robot apparatus 10 to the terminal device 14 and displayed on the terminal device 14.

  Hereinafter, the linkage function management information 42 will be described in detail with reference to FIG. FIG. 11 shows an example of a linkage function management table as the linkage function management information 42. In the linkage function management table, as an example, a combination of device IDs, information indicating the names of the devices to be linked (for example, the type of each device), and information indicating a linkage function (linkage function information) are associated with each other. For example, the device with the device ID “A” is a PC (personal computer), and the device with the device ID “B” is a multifunction peripheral. By linking the PC (A) and the multifunction peripheral (B), for example, a “scan transfer function” and a “print function” are realized as linkage functions. The “scan transfer function” is a function of transferring image data generated by scanning with the multifunction peripheral (B) to the PC (A). The “print function” is a function of transmitting data (for example, image data or document data) stored in the PC (A) to the multifunction peripheral (B) and printing the data on the multifunction peripheral (B).
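
  A sketch of this table and its lookup is shown below; the use of an unordered pair of device IDs as the key is an assumption, while the registered linkage functions follow the example in the text.

```python
# Hypothetical sketch of the linkage function management table (cf. FIG. 11).
# The key layout (unordered set of device IDs) is assumed; the example entry
# for PC (A) and multifunction peripheral (B) follows the text.

LINKAGE_FUNCTION_TABLE = {
    frozenset({"A", "B"}): ["scan transfer function", "print function"],
}

def identify_linkage_functions(device_ids):
    """Return the linkage functions registered for this combination of devices."""
    return LINKAGE_FUNCTION_TABLE.get(frozenset(device_ids), [])

print(identify_linkage_functions(["A", "B"]))  # both linkage functions
print(identify_linkage_functions(["A", "C"]))  # [] -> no linkage registered
```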

  Hereinafter, an outline of the situation confirmation process will be described with reference to FIG. FIG. 12 is a flowchart showing the processing.

  First, the situation collection unit 44 collects, with various sensors, situation information (the values of the various sensors) relating to the situation around the robot apparatus 10, and the detection unit 56 detects the situation around the robot apparatus 10 based on the situation information, determines whether or not a problem has occurred around the robot apparatus 10, and detects the problem that has occurred (S01).

  The determination unit 60 determines whether or not the robot apparatus 10 can solve the problem detected by the detection unit 56 (S02). As described above, problems that cannot be solved by the robot apparatus 10 are, for example, problems that cannot be solved by the functions of the robot apparatus 10 alone, problems for which the time required for a solution by the functions of the robot apparatus 10 alone exceeds a predetermined time, and problems for which the quality of the result of work performed by the functions of the robot apparatus 10 does not reach a predetermined quality.

  If the problem detected by the detection unit 56 does not correspond to a problem that cannot be solved by the robot apparatus 10 (S02, No), the robot apparatus 10 executes a solving means for solving the problem (S03). The solving means is specified by the solving means specifying unit 58. In this case, the robot apparatus 10 executes the solving means by using its own functions, without using devices other than the robot apparatus 10 and without receiving assistance from a person. For example, the robot apparatus 10 may display information indicating the problem or information indicating the solving means on the UI unit 50 of the robot apparatus 10 or the UI unit 72 of the terminal device 14, and may execute the solving means when the user gives an instruction to execute it.

  On the other hand, when the problem detected by the detection unit 56 corresponds to a problem that cannot be solved by the robot apparatus 10 (S02, Yes), a search process for a solving means is executed. When the solving means is to be searched for by the robot apparatus 10 alone (S04, Yes), the robot apparatus 10 executes the autonomous determination mode (S05). On the other hand, when the solving means is not to be searched for by the robot apparatus 10 alone (S04, No), the robot apparatus 10 executes the user determination mode (S06). Whether to execute the autonomous determination mode or the user determination mode is set in advance. That is, when the problem detected by the detection unit 56 corresponds to a problem that cannot be solved by the robot apparatus 10, the robot apparatus 10 executes whichever of the autonomous determination mode and the user determination mode has been set in advance. The mode to be executed may also be specified by the user.
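
  The overall flow of FIG. 12 (steps S01 to S06) can be sketched as follows; the helper functions passed in are assumed stubs standing in for the situation collection unit 44, the detection unit 56, and the determination unit 60.

```python
# Hypothetical sketch of the situation confirmation flow in FIG. 12 (S01-S06).
# The callables passed as arguments are assumed stand-ins for the actual units.

def situation_confirmation(collect_situation, detect_problem, robot_can_solve,
                           autonomous_mode_enabled):
    situation = collect_situation()              # S01: collect sensor values
    problem = detect_problem(situation)          # S01: detect a problem, if any
    if problem is None:
        return "no problem detected"
    if robot_can_solve(problem):                 # S02
        return "robot executes solving means"    # S03
    if autonomous_mode_enabled:                  # S04
        return "autonomous determination mode"   # S05
    return "user determination mode"             # S06

result = situation_confirmation(
    collect_situation=lambda: {"temperature": 32.0},
    detect_problem=lambda s: "room too hot" if s["temperature"] >= 30.0 else None,
    robot_can_solve=lambda p: False,
    autonomous_mode_enabled=True,
)
print(result)  # "autonomous determination mode"
```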

  Hereinafter, the status confirmation process will be described in detail with reference to FIG. FIG. 13 is a flowchart showing the processing.

  First, the situation collection unit 44 collects, with various sensors, situation information (the values of the various sensors) about the situation around the robot apparatus 10 (S10), and the detection unit 56 assumes a problem that has occurred by combining the values of the various sensors (S11). For example, the detection unit 56 assumes a problem by combining information obtained by analyzing an image of the surroundings with the value of a sensor other than the image. Of course, it is also possible to collect a person's voice and detect the problem by additionally combining the result of analyzing the voice. For example, when the detected temperature is equal to or higher than the temperature threshold (for example, 30°C or higher) and a sweating person appears in the image representing the surroundings, the detection unit 56 detects that the ambient temperature needs to be lowered by using a cooler, a fan, or the like. When a situation is judged using only one sensor, the judgment may be erroneous, but by combining the sensor value with an image representing the surroundings, a more complicated situation can be detected with higher accuracy. For example, by combining the image representing the surroundings and the sensor value, not only a uniform determination but also an individual determination becomes possible. Of course, using sensors has the advantage that situations that cannot be perceived by the human eye or nose can also be detected.
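
  A sketch of this combination of sensor values, using the temperature example above, is shown below; the threshold is assumed, and the image analysis result is passed in as a precomputed flag because image analysis itself is outside the scope of the sketch.

```python
# Hypothetical sketch of steps S10-S11: assuming a problem by combining a
# temperature reading with an image analysis result. The threshold and the
# precomputed image flag are assumptions for illustration.

TEMPERATURE_THRESHOLD_C = 30.0

def assume_problem(temperature_c, image_shows_sweating_person):
    """Combine the sensor value and the image analysis result."""
    if temperature_c >= TEMPERATURE_THRESHOLD_C and image_shows_sweating_person:
        return "ambient temperature needs to be lowered (cooler, fan, etc.)"
    return None  # a single indicator alone is not treated as a problem here

print(assume_problem(31.5, True))   # problem assumed
print(assume_problem(31.5, False))  # None
```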

  Next, the solving means specifying unit 58 assumes a solving means for returning the values of the various sensors to normal values (S12). For example, the solving means specifying unit 58 specifies, in the solution management table shown in FIG. 9, the solving means corresponding to the problem detected by the detection unit 56. Information on the solving means specified here may be presented when the user is consulted, as described later, or may be used as the content of an instruction (control) request when the problem is solved by using a device or a person other than the robot apparatus 10.

  Next, the determination unit 60 determines whether or not the robot apparatus 10 can solve the problem detected by the detection unit 56. To that end, the determination unit 60 compares the functions of the robot apparatus 10 with the solving means specified by the solving means specifying unit 58 (S13).

  When the solving means specified by the solving means specifying unit 58 is included in the functions of the robot apparatus 10, that is, when the robot apparatus 10 has a function for executing the solving means (S14, Yes), the process proceeds to step S03 shown in FIG. In this case, the robot apparatus 10 executes the solving means by using its own functions, without using devices other than the robot apparatus 10 and without receiving assistance from a person. The robot apparatus 10 may execute the solving means when an execution instruction is given by the user.

  When the solving means specified by the solving means specifying unit 58 is not included in the functions of the robot apparatus 10, that is, when the robot apparatus 10 does not have a function for executing the solving means (S14, No), the robot apparatus 10 executes the autonomous determination mode or the user determination mode (S15).
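
  Steps S13 to S15 amount to a set comparison between the functions of the robot apparatus 10 and the functions required by the solving means; the following sketch illustrates this with assumed function names.

```python
# Hypothetical sketch of steps S13-S15: comparing the robot's own functions
# with the functions required by the solving means. Function names are assumed.

ROBOT_FUNCTIONS = {"collect speech and convert to text", "move", "lift object"}

def decide_execution(required_functions, autonomous_mode_enabled=True):
    if set(required_functions) <= ROBOT_FUNCTIONS:           # S14, Yes
        return "robot executes solving means by itself"       # proceed to S03
    if autonomous_mode_enabled:                                # S14, No -> S15
        return "autonomous determination mode"
    return "user determination mode"

print(decide_execution(["move"]))
print(decide_execution(["collect speech and convert to text", "print"]))
```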

  Hereinafter, the process in the user determination mode will be described in detail with reference to FIG. FIG. 14 is a flowchart showing the processing.

  First, the communication unit 30 of the robot apparatus 10 transmits the situation information (the values of the various sensors) collected by the situation collection unit 44 to the terminal device 14 under the control of the control unit 54 (S20). At this time, the communication unit 30 may switch the communication method according to the presence or absence of an obstacle, the distance between the robot apparatus 10 and the terminal device 14, and the like. For example, the communication unit 30 may transmit the situation information to a terminal device 14 registered in advance as a transmission destination, may transmit it to a terminal device 14 identified by the robot apparatus 10, or may transmit it to a terminal device 14 that has requested communication with the robot apparatus 10. For example, the identification unit 64 of the robot apparatus 10 may acquire the device identification information of the terminal device 14 from an image obtained by photographing the terminal device 14 with a visual sensor, and identify the terminal device 14 based on that device identification information. As another example, the communication unit 30 may transmit the situation information to a terminal device 14 existing within a preset range around the position of the robot apparatus 10 (for example, a terminal device 14 within the range in which communication by short-range wireless communication is possible). As yet another example, the identification unit 64 of the robot apparatus 10 may identify the user for whom the problem is assumed to have occurred and transmit the situation information to the terminal device 14 possessed by that user.

  The robot apparatus 10 may further collect situation information and transmit it to the terminal device 14 in response to a request from the user of the terminal device 14 (S21). For example, the robot apparatus 10 additionally photographs the surroundings with a visual sensor such as a camera while moving and transmits the image (moving image or still image) obtained by the photographing to the terminal device 14, or, in accordance with an instruction from the user, collects the data (for example, the temperature) again and retransmits it to the terminal device 14.

  Next, the robot apparatus 10 presents to the user one or more solving means (one or more solving means specified by the solving means specifying unit 58) for solving the problem detected by the detection unit 56 (S22). For example, the communication unit 30 of the robot apparatus 10 transmits information indicating the solving means to the terminal device 14 under the control of the control unit 54. In this case, the information indicating the solving means is displayed on the UI unit 72 of the terminal device 14. As another example, the control unit 54 of the robot apparatus 10 may display the information indicating the solving means on the UI unit 50 of the robot apparatus 10 itself.

  When the user selects the solving means to be executed from the one or more solving means presented to the user, the control unit 54 of the robot apparatus 10 determines whether or not the selected solving means is a solving means that uses a device other than the robot apparatus 10 (for example, the device 12) (S23). For example, when the user selects a solving means using the terminal device 14, information indicating the solving means is transmitted from the terminal device 14 to the robot apparatus 10, and the control unit 54 of the robot apparatus 10 determines, based on that information, whether or not the solving means selected by the user uses a device other than the robot apparatus 10. When information indicating the solving means is displayed on the UI unit 50 of the robot apparatus 10 and the user selects the solving means using the UI unit 50, the control unit 54 makes the determination according to that selection.

  When the solving means selected by the user corresponds to a solving means that uses a device other than the robot apparatus 10 (S23, Yes), the communication unit 30 of the robot apparatus 10 transmits, under the control of the control unit 54, information about the device that realizes the solving means to the terminal device 14 (S24). For example, the control unit 54 identifies one or more devices having a function capable of realizing the solving means selected by the user by referring to the device function management table and the linkage function management table. For example, when “print” is selected as the solving means, a multifunction peripheral having a print function is specified as the device that realizes the solving means. The information about the device may include, for example, an appearance image representing the appearance of the device, device address information for connecting to the device, and information indicating the device specifications. As another example, the control unit 54 of the robot apparatus 10 may cause the UI unit 50 of the robot apparatus 10 to display the information about the device. In this case, the information about the device need not be transmitted to the terminal device 14.

  Next, the solving means is executed by the user's operation (S25). That is, the function of the device for realizing the solution is executed. As a function for realizing the solving means, a function possessed by a single device may be executed, or a cooperation function using functions possessed by a plurality of devices may be executed. Of course, the function of the robot apparatus 10 may be used as a function for realizing the solving means. The function execution instruction for the device may be given from the robot device 10 or from the terminal device 14. The operation for executing the function will be described in detail later.

  On the other hand, when the solving means selected by the user does not correspond to a solving means that uses a device other than the robot apparatus 10 (S23, No), the robot apparatus 10 requests cooperation from the surrounding people in order to execute the solving means (S26). At this time, the robot apparatus 10 informs the user of the procedure for executing the solving means (the procedure of the work to be performed by the user) (S27). For example, the control unit 54 of the robot apparatus 10 outputs the procedure as voice from a speaker, displays the procedure on the UI unit 50 of the robot apparatus 10, or moves to where a person is and touches the person. If the procedure is difficult to convey, the volume of the voice may be increased. The communication unit 30 of the robot apparatus 10 may transmit information indicating the procedure to the terminal device 14. In this case, the information indicating the procedure is displayed on the UI unit 72 of the terminal device 14.

  Note that, when the solving means realized by both the device and the person is selected by the user, the above steps S24 to S27 are executed. Even in this case, the function of the robot apparatus 10 may be used as a part of the solving means.

  Hereinafter, the process in the autonomous determination mode will be described in detail with reference to FIG. FIG. 15 is a flowchart showing the processing.

  First, the control unit 54 of the robot apparatus 10 specifies, as necessary, the additional functions and the support requests (requests for user support) required to realize the solving means specified by the solving means specifying unit 58 (S30).

  The control unit 54 of the robot apparatus 10 determines whether or not the solving means specified by the solving means specifying unit 58 is a solving means using a device other than the robot device 10 (for example, the device 12) (S31).

  When the solving means specified by the solving means specifying unit 58 corresponds to a solving means that uses a device other than the robot apparatus 10 (S31, Yes), the control unit 54 of the robot apparatus 10 searches the surroundings for a device that can execute the solving means (S32). The control unit 54 searches for the device based on, for example, an image obtained by a visual sensor, the position information of each device, the wireless communication status, and the like. For example, the control unit 54 identifies one or more devices having a function capable of realizing the solving means by referring to the device function management table and the linkage function management table. For example, when the solving means is “print”, the control unit 54 of the robot apparatus 10 searches for a multifunction peripheral having a print function.

  When there is a device that can execute the solving means (S33, Yes), the communication unit 30 of the robot apparatus 10 transmits, under the control of the control unit 54, information indicating an instruction to execute the solving means (an instruction to execute the functions that realize the solving means) to that device (S34). The device that has received the information indicating the execution instruction executes the solving means according to the instruction.

  The control unit 54 of the robot apparatus 10 may confirm whether or not the control right for the device that can execute the solving means can be acquired. That is, the control unit 54 confirms whether the address information of the device and a driver capable of controlling the device are stored in the robot apparatus 10. When the driver or the like can be acquired via a network or the like, the control unit 54 downloads it. The robot apparatus 10 may give the execution instruction to the device by directly operating the operation panel of the device, or by operating a remote control of the device.

  Whether or not the solving means has been completed successfully is determined by whether or not the problem has been solved after the function is executed by the device. If the problem is not detected by the detection unit 56, the control unit 54 determines that the problem has been solved. When the problem has not been solved, the control unit 54 of the robot apparatus 10 gives an instruction to execute the solution means again to the device or searches for another solution means.

  When there is no device that can execute the solution (S33, No), the control unit 54 of the robot apparatus 10 searches for another solution (S35). At this time, the control unit 54 may switch the mode to the user determination mode.

  When the solving means specified by the solving means specifying unit 58 does not correspond to a solving means that uses a device other than the robot apparatus 10 (S31, No), the robot apparatus 10 requests cooperation from the surrounding people in order to execute the solving means (S36). At this time, the robot apparatus 10 informs the user of the content of the cooperation request (for example, the procedure for executing the solving means) (S37). For example, the control unit 54 of the robot apparatus 10 outputs the content of the cooperation request as voice from a speaker, displays it on the UI unit 50 of the robot apparatus 10, or moves to where a person is and touches the person. The communication unit 30 of the robot apparatus 10 may transmit information indicating the content of the cooperation request to the terminal device 14. In this case, the information indicating the content of the cooperation request is displayed on the UI unit 72 of the terminal device 14.

  If the cooperation is accepted (S38, Yes), the process ends. In this case, the user executes the solving means. When the cooperation is not accepted (S38, No), the control unit 54 of the robot apparatus 10 searches for another solving means (S35). The control unit 54 may also switch the mode to the user determination mode. For example, the control unit 54 may determine whether or not the cooperation has been accepted by recognizing the user's reply with voice recognition; when a reply indicating acceptance of the cooperation is recognized, the control unit 54 determines that the cooperation has been accepted. As another example, when an operation that the user is assumed to perform in order to realize the solving means is executed within a preset time and the operation is detected by various sensors (for example, a visual sensor), the control unit 54 may determine that the cooperation has been accepted.
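
  The acceptance check of step S38 can be sketched as follows; the accepted reply words, the timeout, and the polling interval are assumptions, and the observation callback stands in for detection by the various sensors.

```python
# Hypothetical sketch of the acceptance check in step S38: cooperation is
# treated as accepted when a positive spoken reply is recognized, or when the
# assumed operation is observed within a preset time. Words, timeout, and the
# observation callback are assumptions for illustration.

import time

ACCEPTING_REPLIES = {"yes", "ok", "sure"}

def cooperation_accepted(recognized_reply, observe_operation, timeout_s=30.0):
    if recognized_reply and recognized_reply.lower() in ACCEPTING_REPLIES:
        return True  # acceptance recognized by voice recognition
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if observe_operation():   # e.g. the visual sensor detects the assumed action
            return True
        time.sleep(0.5)
    return False  # not accepted: search for another solving means (S35)

# Stubbed example in which the expected action is observed immediately:
print(cooperation_accepted(None, observe_operation=lambda: True, timeout_s=1.0))
```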

  As time passes, the surrounding situation also changes, so that the problem that occurs may change. In this case, the robot apparatus 10 detects a situation (problem) that changes every moment, and specifies a solution means according to the detection result.

  Further, the problem that the robot apparatus 10 can solve is changed according to the update of the function of the robot apparatus 10. The update of the function is performed, for example, by changing at least one of hardware and software included in the robot apparatus 10.

  The solving means specifying unit 58 may preferentially specify a solving means that is executed by using a device rather than by a person (a solving means that does not require human cooperation) over a solving means that is executed by a person. A solving means that relies on a person is not necessarily carried out by that person, so preferentially specifying a solving means that does not require human cooperation identifies a solving means that can be executed more reliably. In addition, when no person is detected, the solving means specifying unit 58 may specify a solving means that does not require human cooperation without searching for solving means that require human cooperation. Thereby, the load required for searching for a solving means is reduced.

  Hereinafter, the execution control of the solving means will be described in detail with reference to FIG. FIG. 16 is a flowchart showing the control. In the following, it is assumed that the robot apparatus 10 cannot solve the problem.

  First, the control unit 54 of the robot apparatus 10 determines whether or not device control is included in the solving means selected by the user in the user determination mode or in the solving means specified by the robot apparatus 10 in the autonomous determination mode (S40). That is, the control unit 54 determines whether or not the solving means corresponds to a solving means that uses a device. In other words, the control unit 54 determines whether or not the solving means is a solving means realized only by a person without using a device.

  If the solving means does not include device control (S40, No), that is, if it corresponds to a solving means realized only by a person without using a device, the robot apparatus 10 requests cooperation from the surrounding people in order to execute the solving means (S41). For example, the communication unit 30 of the robot apparatus 10 transmits information indicating the content of the cooperation request (for example, the procedure for executing the solving means) to the terminal device 14, and the information is displayed on the UI unit 72 of the terminal device 14. At this time, the communication unit 30 of the robot apparatus 10 transmits the information indicating the content of the cooperation request to the terminal device 14 using a communication method suited to the surrounding environment (S42). The surrounding environment includes, for example, the distance between the robot apparatus 10 and the terminal device 14 and the presence or absence of an obstacle between them. The control unit 54 of the robot apparatus 10 may also output the content of the cooperation request as voice from a speaker, display it on the UI unit 50 of the robot apparatus 10, or move to where a person is and touch the person. The control unit 54 of the robot apparatus 10 observes whether or not the operation that the user is assumed to perform in order to realize the solving means is executed within a preset time (S43), and determines based on the observation whether or not the problem has been solved (S44). When the assumed operation is executed within the preset time and the operation is detected by various sensors (for example, a visual sensor), the control unit 54 of the robot apparatus 10 determines that the problem has been solved (S44, Yes), and the process ends. On the other hand, when the assumed operation is not executed within the preset time and the operation is not detected by the various sensors, the control unit 54 of the robot apparatus 10 determines that the problem has not been solved (S44, No). In this case, the robot apparatus 10 may search for another solving means or may notify the user of the same cooperation request content again (S45).

  When the solving means includes device control (S40, Yes), that is, when it corresponds to a solving means that uses a device, the control unit 54 of the robot apparatus 10 determines whether or not the device used for the solving means is a device that can be used free of charge (S46). Specifically, the control unit 54 of the robot apparatus 10 searches the surroundings for (identifies) a device that can execute the solving means and determines whether or not that device can be used free of charge. Information indicating whether use is free of charge or for a fee is managed for each device; for example, it may be managed in the device function management table. As another example, the control unit 54 may acquire information indicating whether the device can be used free of charge or for a fee from the identified device itself.

  If the device used for the solving means does not correspond to a device that can be used free of charge (S46, No), that is, if it corresponds to a device that must be paid for, the control unit 54 of the robot apparatus 10 determines whether the robot apparatus 10 has a payment means accepted by the device (S47). The payment means is, for example, electronic money (electronic currency), virtual currency, cash, or a credit card. Of course, payment may be made by other means. When the robot apparatus 10 does not have a payment means (S47, No), the control unit 54 of the robot apparatus 10 searches for another solving means (S48). When the robot apparatus 10 has a payment means (S47, Yes), the process proceeds to step S49. In this case, the control unit 54 of the robot apparatus 10 controls the payment operation for using the paid device; that is, the robot apparatus 10 pays with the payment means when using the device. In addition, when the robot apparatus 10 does not have a payment means, it may perform the payment operation by receiving payment assistance from at least one of a device other than the robot apparatus 10 and a person (for example, by borrowing money from a person or a device).
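
  Steps S46 to S48 can be sketched as follows; the billing entries and the payment means held by the robot apparatus are assumptions made for illustration.

```python
# Hypothetical sketch of steps S46-S48: checking whether the device used for
# the solving means is free, and whether the robot holds an accepted payment
# means. Billing entries and payment means are illustrative assumptions.

DEVICE_BILLING = {
    "multifunction peripheral": {"free": True, "accepted_payments": []},
    "drink vending machine": {"free": False,
                              "accepted_payments": ["electronic money", "cash"]},
}

ROBOT_PAYMENT_MEANS = {"electronic money"}

def decide_device_use(device_name):
    entry = DEVICE_BILLING.get(device_name)
    if entry is None:
        return "search for another solving means"             # device unknown
    if entry["free"]:                                          # S46, Yes
        return "use the device"
    if ROBOT_PAYMENT_MEANS & set(entry["accepted_payments"]):  # S47, Yes
        return "pay and use the device"                        # proceed to S49
    return "search for another solving means"                  # S48

print(decide_device_use("drink vending machine"))  # "pay and use the device"
```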

  Next, the control unit 54 of the robot apparatus 10 determines whether or not the device used for the solution means corresponds to a device that can be controlled via the communication function (S49). Information indicating whether control is possible via the communication function is managed for each device. For example, in the device function management table, it may be managed whether each device can be controlled via the communication function.

  When the device used for the solving means corresponds to a device that can be controlled via the communication function (S49, Yes), the control unit 54 of the robot apparatus 10 selects a communication method capable of communicating with the device (S50). The communication methods supported by each device are managed for each device; for example, they may be managed in the device function management table. Note that if a communication error occurs even when communication with the device is attempted with a communication method supported by the device, the processing from step S57 onward may be executed. When the device supports a plurality of communication methods, the robot apparatus 10 may attempt communication with the device using each of them. In this case, the robot apparatus 10 may communicate with the device using the optimum one of the plurality of communication methods (for example, the communication method with the fastest communication speed or with the least noise).

  Further, the control unit 54 of the robot apparatus 10 acquires the address information, the access password, etc. of the device used for the solving means (S51). The control unit 54 of the robot apparatus 10 may acquire address information or the like from a device such as a server that stores address information or the like, or may acquire address information or the like via the Internet or the like. As another example, address information and the like may be stored in the robot apparatus 10. Further, when the driver of the device is necessary to control the device used for the solving means, the control unit 54 of the robot device 10 acquires the driver and installs it in the robot device 10 (S52). The control unit 54 of the robot apparatus 10 may acquire a driver from an apparatus such as a server that stores the driver, or may acquire the driver via the Internet or the like.

  Next, the communication unit 30 of the robot apparatus 10 transmits information indicating an execution instruction of the solving means (execution instruction of a function for executing the solving means) to a device used for the solving means (S53). The device that has received the information indicating the execution instruction executes the solving means according to the instruction.

  The control unit 54 of the robot apparatus 10 observes whether or not the device has solved the problem (S54), and determines whether or not the problem has been resolved based on the observation (S55). For example, when an operation that is assumed to be performed by a device to realize the solution is executed by the device within a preset time and the operation is detected by various sensors (for example, a visual sensor), The control unit 54 of the robot apparatus 10 determines that the device has solved the problem (S55, Yes). In this case, the process ends. On the other hand, when the assumed operation is not executed within the preset time and the operation is not detected by various sensors, the control unit 54 of the robot apparatus 10 determines that the problem has not been solved (S55, No). ). In this case, the robot apparatus 10 may search for another solution, or may transmit the same execution instruction to the device (S56). Note that the control unit 54 of the robot apparatus 10 may determine whether or not another new solving means can be executed. For example, when the temperature of the room is high and a fan is newly installed in the room as a device other than the cooler, the control unit 54 may search for a new solution using the fan. The search unit 62 may search for a new solution by using the Internet or the like.

  If the device used for the solving means does not correspond to a device that can be controlled via the communication function (S49, No), the robot apparatus 10 searches for a remote controller for operating the device (S57). For example, the robot apparatus 10 identifies a remote controller by analyzing an image obtained by a visual sensor.

  If the remote controller is not found (S58, No), the robot apparatus 10 searches for another solution (S59).

  When the remote controller is found (S58, Yes), the robot apparatus 10 inputs an instruction to execute the solving means by operating the remote controller (S60). The device that has received the instruction executes the solving means according to the instruction.

  The control unit 54 of the robot apparatus 10 observes whether or not the device has solved the problem, and determines whether or not the problem has been resolved based on the observation (S61). For example, when an operation that is assumed to be performed by a device to realize the solution is executed by the device within a preset time and the operation is detected by various sensors (for example, a visual sensor), The control unit 54 of the robot apparatus 10 determines that the device has solved the problem (S61, Yes). In this case, the process ends. On the other hand, when the assumed operation is not executed within the preset time and the operation is not detected by various sensors, the control unit 54 of the robot apparatus 10 determines that the problem has not been solved (S61, No). ). In this case, the robot apparatus 10 may search for another solution, or may transmit the same execution instruction to the device using a remote controller (S62).

  Hereinafter, application scenes of the robot apparatus 10 according to the present embodiment will be described in detail.

(Application scene 1)
Application scene 1 will be described with reference to FIG. FIG. 17 shows a person, the robot apparatus 10 and the like. For example, it is assumed that a meeting is held by people 76 (three people) and the robot apparatus 10 is present at the meeting.

  The situation collecting unit 44 of the robot apparatus 10 collects surrounding situation information using various sensors. For example, the situation collecting unit 44 collects, as the situation information, the voice of the conversation of the surrounding people 76, images representing the people 76 (for example, images representing their faces or whole bodies), the temperature, the humidity, and the like. The detection unit 56 of the robot apparatus 10 detects the surrounding situation based on the situation information (for example, the state of the people 76 (conversation, facial expression, posture, and the like), the temperature, and so on). For example, when a person 76 says “I want the remarks just made to be written down on paper”, the situation collecting unit 44 of the robot apparatus 10 collects the voice information of that remark as situation information, and the detection unit 56 of the robot apparatus 10 determines, based on the remark, whether a problem has occurred. From the above remark, the problem “the content of the remarks should be printed on paper” is detected.

  The solution means identification unit 58 refers to the solution means management information 38 and identifies a solution means for solving the problem. This problem can be solved by combining, for example, “a function for collecting utterances as voice information and converting the content of the utterances into character strings” and a “printing function”. That is, a solution to this problem is configured by a combination of “a function for collecting utterances as voice information and converting the utterance contents into a character string” and a “printing function”.

  Here, it is assumed that the robot apparatus 10 has the “function of collecting utterances as voice information and converting the utterance contents into a character string” but does not have the “print function”. In this case, the above problem cannot be solved by the robot apparatus 10 alone. To cope with this, the robot apparatus 10 searches for a device having a print function. For example, the robot apparatus 10 photographs surrounding devices with a visual sensor (for example, a camera), and the identification unit 64 of the robot apparatus 10 identifies the photographed devices by analyzing the images obtained by the photographing. For example, it is assumed that the multifunction device 78, which has a print function, is installed around the robot apparatus 10. The identification unit 64 of the robot apparatus 10 identifies the multifunction device 78 and identifies the functions (for example, the print function) that the multifunction device 78 has. In this case, the solving means is executed by the robot apparatus 10 and the multifunction device 78. This solving means is realized by a linkage function that uses the robot apparatus 10 and the multifunction device 78, that is, by the cooperation of the robot apparatus 10 and the multifunction device 78. For example, the robot apparatus 10 may execute the solving means automatically, or may execute it upon receiving an execution instruction from the user. When executing the solving means, the robot apparatus 10 communicates with the multifunction device 78, transmits information indicating the content of the remarks of the people 76 to the multifunction device 78, and gives the multifunction device 78 an instruction to print that information. Thereby, the content of the remarks of the people 76 is printed on paper.
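
  The cooperation in this scene can be sketched as follows; the stub functions for speech-to-text conversion, device search, and the print request are assumptions standing in for the robot's own function, the identification unit 64, and the communication with the multifunction device 78.

```python
# Hypothetical sketch of application scene 1: the robot converts collected
# speech to text and asks a print-capable device to print it. The stub
# callables are assumptions standing in for the actual units and devices.

def solve_note_taking(speech_to_text, find_device_with, send_print_job, utterance_audio):
    text = speech_to_text(utterance_audio)     # function the robot apparatus owns
    printer = find_device_with("print")        # e.g. the multifunction device 78
    if printer is None:
        return "no print-capable device found: request cooperation or keep searching"
    send_print_job(printer, text)              # linkage of the two devices' functions
    return "remarks printed on paper"

result = solve_note_taking(
    speech_to_text=lambda audio: "minutes of the meeting",
    find_device_with=lambda function: "multifunction device 78",
    send_print_job=lambda device, text: None,
    utterance_audio=b"...",
)
print(result)
```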

  The robot apparatus 10 may move to the multi-function device 78, obtain a sheet on which the content of the message is printed, and give it to the user. Further, the robot apparatus 10 may notify the user that the solving means has been executed by voice or the like.

  In the example illustrated in FIG. 17, the robot apparatus 10 is used as the solving means. However, the solving means using equipment other than the robot apparatus 10 may be executed without using the robot apparatus 10. In this case, the robot apparatus 10 gives an instruction to execute the solving means to devices other than the robot apparatus 10.

  The robot apparatus 10 may directly operate the multi-function device 78. For example, when a problem such as “I want to copy paper” is detected by the detection unit 56 of the robot apparatus 10, the solution means specifying unit 58 of the robot apparatus 10 specifies “copy function” as the solution means. In this case, the robot apparatus 10 searches for a device having a copy function (for example, the multifunction device 78), and causes the multifunction device 78 to copy a sheet. For example, the robot apparatus 10 receives a sheet from the user, sets it in the multifunction device 78, and directly operates the operation panel of the multifunction device 78 to give a copy instruction to the multifunction device 78. The robot apparatus 10 analyzes an image acquired by, for example, a visual sensor to identify an operation panel and give a copy instruction.

  As described above, even when a problem that cannot be solved only by the robot apparatus 10 is detected, it is possible to solve the problem by using another device. Thereby, it becomes possible to widen the range of problems that the robot apparatus 10 can solve in cooperation according to the situation.

(Application scene 2)
Application scene 2 will be described with reference to FIG. FIG. 18 shows a person, the robot apparatus 10, and the like. For example, it is assumed that a person 80 has fallen. The situation collection unit 44 of the robot apparatus 10 collects an image of the fallen person 80 as situation information, and the detection unit 56 of the robot apparatus 10 determines, based on the image, whether or not a problem has occurred. In the example illustrated in FIG. 18, the problem “a person has fallen” is detected.

  The solution means identification unit 58 refers to the solution means management information 38 and identifies a solution means for solving the problem. This problem can be solved by, for example, “asking for help from a person to rescue a fallen person”. In other words, the solution to this problem includes “seeking people for help”. In this case, the robot apparatus 10 requests help from a person 82 other than the person 80 who has fallen. For this purpose, the robot apparatus 10 may generate sound, or may move to the person 82 and contact the person 82. At this time, the control unit 54 of the robot apparatus 10 may display information indicating a procedure for rescue on the UI unit 50 of the robot apparatus 10 or may display the information on the terminal apparatus 14 possessed by the person 82. Good. For example, the robot device 10 may identify the terminal device 14 possessed by the person 82 using various sensors, and transmit information indicating a procedure for rescue to the identified terminal device 14.

  When the solving means “carry the fallen person to a safe place” is specified by the solving means specifying unit 58 and the robot apparatus 10 has a function of carrying an object, the robot apparatus 10 may carry the person 80 by itself, may carry the person 80 together with another person, may carry the person 80 together with another device, or may carry the person 80 together with both another person and another device. The other device is a device identified by the robot apparatus 10. If the robot apparatus 10 does not have a function of carrying an object, the robot apparatus 10 may instruct the person 82 to carry the fallen person 80 to a safe place. Information indicating the instruction may be displayed on the UI unit 50 of the robot apparatus 10, or may be transmitted to the terminal device possessed by the person 82 and displayed on that terminal device.

  As described above, when a problem that cannot be solved only by the robot apparatus 10 is detected, it is possible to request cooperation from a person and solve the problem.

(Application scene 3)
Application scene 3 will be described with reference to FIG. 19. FIG. 19 shows the user, the robot apparatus 10, and the like. For example, when the person 84 says "I am thirsty", the situation collection unit 44 of the robot apparatus 10 collects the voice information of that statement as situation information, and the detection unit 56 of the robot apparatus 10 determines, based on the statement, whether or not a problem has occurred. From the above statement, the problem "thirsty" is detected.

  The solution means specifying unit 58 refers to the solution means management information 38 and identifies a solution means for solving the problem. This problem can be solved by, for example, "purchasing a drink". That is, the solution means for this problem is constituted by "a function of providing a drink".

  Here, it is assumed that the robot apparatus 10 does not have a "function of providing a drink". In this case, the above problem cannot be solved by the robot apparatus 10 alone. To cope with this, the robot apparatus 10 searches for a device that provides a drink. For example, the robot apparatus 10 images surrounding devices with a visual sensor, and the identification unit 64 of the robot apparatus 10 identifies an imaged device by analyzing the image obtained by the imaging. For example, it is assumed that a drinking water vending machine 86 (an example of a device) is installed around the robot apparatus 10. The identification unit 64 of the robot apparatus 10 identifies the vending machine 86 and identifies the function of the vending machine 86 (for example, the function of providing drinks for a fee). The identification unit 64 of the robot apparatus 10 may also identify the payment methods (for example, electronic money or cash) supported by the vending machine 86 based on the image obtained by the visual sensor. The control unit 54 of the robot apparatus 10 prepares a payment method supported by the vending machine 86.

  When the robot apparatus 10 has a payment means (for example, electronic money, money, or a credit card) accepted by the vending machine 86, the control unit 54 of the robot apparatus 10 controls the payment operation. Thereby, the robot apparatus 10 purchases a drink by paying with the payment means at the vending machine 86. For example, the robot apparatus 10 purchases a drink by moving to the vending machine 86 and directly operating the vending machine 86. The robot apparatus 10 identifies, for example, a purchase button of the vending machine 86 by analyzing an image acquired by a visual sensor, and thereby purchases a drink. The robot apparatus 10 may deliver the drink to the person 84. In addition, the robot apparatus 10 may purchase a drink in accordance with a user's instruction, or may purchase a drink without receiving a user's instruction. When the purchase process is performed in accordance with a user instruction, the control unit 54 of the robot apparatus 10 displays information indicating the solution means (for example, information indicating that a drink is to be purchased) on the UI unit 50 of the robot apparatus 10. When the user gives a purchase instruction using the UI unit 50, the robot apparatus 10 purchases a drink by paying with the payment means. The information indicating the solution means may instead be transmitted to the terminal device 14 of the user and displayed on the terminal device 14. When the user gives a purchase instruction using the terminal device 14, information indicating the purchase instruction is transmitted from the terminal device 14 to the robot apparatus 10, and the robot apparatus 10 purchases a drink according to the purchase instruction.

  When the robot apparatus 10 does not have a payment means accepted by the vending machine 86, the payment operation may be performed with payment assistance from at least one of the person 84 and other equipment. For example, the robot apparatus 10 may receive money for the payment from a person who may be involved in the action for solving the problem and purchase a drink with that money, or may perform the payment operation by borrowing a payment means from another device that has one. The person who may be involved in the action for solving the problem is, for example, the person 84 who said that he or she is thirsty, and this person 84 is identified by the various sensors of the robot apparatus 10.
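  The branching described in the two preceding paragraphs (pay with a payment means the robot apparatus holds; otherwise request assistance) can be sketched as follows. This is a minimal illustration with hypothetical names, not the implementation of the embodiment.

# Hypothetical sketch: deciding how the drink purchase would proceed.
def purchase_drink(robot_payment_means, machine_accepts, helper_nearby):
    usable = set(robot_payment_means) & set(machine_accepts)
    if usable:
        # The robot apparatus pays directly with a payment means it holds.
        return "pay with %s and deliver the drink" % sorted(usable)[0]
    if helper_nearby:
        # Otherwise it requests assistance from the person concerned or another device.
        return "request money or a payment means from a nearby person or device"
    return "report to the user that the drink cannot be purchased"


print(purchase_drink({"electronic money"}, {"money", "electronic money"}, True))
print(purchase_drink(set(), {"money"}, True))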

(User judgment mode 1)
Hereinafter, an example of the user determination mode (user determination mode 1) will be described in detail. FIG. 20 shows persons, the robot apparatus 10, and the like. For example, persons 87 (three persons) are having a conversation, and a plurality of devices (for example, a multifunction device 78, a projector 88, a camera 90, a display 92, and an aroma device 94) are installed around the robot apparatus 10.

  The situation collection unit 44 of the robot apparatus 10 collects surrounding situation information using various sensors, and the detection unit 56 detects the surrounding situation based on the situation information. Further, the identification unit 64 identifies the devices present in the vicinity. For example, it is assumed that the detection unit 56 detects the situation (problem) that "three people are in a meeting and seem to be giving up". In the example illustrated in FIG. 20, the identification unit 64 identifies the multifunction device 78, the projector 88, the camera 90, the display 92, and the aroma device 94. The identification unit 64 may also identify the terminal device 14 possessed by a person (for example, a person 87 having the problem).

  When the determination unit 60 of the robot apparatus 10 determines that the problem detected by the detection unit 56 cannot be solved by the robot apparatus 10, the communication unit 30 of the robot apparatus 10, under the control of the control unit 54, transmits the status information to a terminal device 14 registered in advance or to a terminal device 14 identified by the identification unit 64 (for example, the terminal device 14 possessed by a person 87 having the problem). The status information includes information indicating the situation (problem) detected by the detection unit 56, information indicating the devices identified by the identification unit 64, and the like.
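  The status information can be thought of as a small structured message; the following is an illustrative sketch with hypothetical field names, not a format defined by the embodiment.

# Hypothetical sketch of the status information sent to the terminal device 14.
import json

status_information = {
    "problem": "three people are in a meeting and seem to be giving up",
    "identified_devices": ["multifunction device 78", "projector 88", "camera 90",
                           "display 92", "aroma device 94"],
}

# The terminal device 14 would render this on the situation explanation screen.
print(json.dumps(status_information, indent=2))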

  A screen for explaining the situation is displayed on the UI unit 72 of the terminal device 14. FIG. 21 shows an example of this screen. A situation explanation screen 96 is displayed on the UI unit 72 of the terminal device 14. On the situation explanation screen 96, information indicating the situation (problem) detected by the detection unit 56, information indicating the devices identified by the identification unit 64, and the like are displayed. In the example shown in FIG. 21, a character string indicating that "three people are in a meeting and seem to be giving up" is displayed as the situation (problem), and a character string indicating that "a multifunction machine, a projector, a camera, a display, and an aroma device are present in the surroundings" is displayed.

  When the user requests additional information using the terminal device 14, information indicating the request is transmitted from the terminal device 14 to the robot device 10. The communication unit 30 of the robot apparatus 10 transmits additional information to the terminal apparatus 14 in response to the request. For example, when an image representing the surrounding situation is requested by the user as additional information, the communication unit 30 of the robot apparatus 10 transmits image data representing the surrounding situation to the terminal device 14.

  The communication unit 30 of the robot apparatus 10 transmits, as image data representing the surrounding situation, for example, image data associated with the person 87 who is assumed to have the problem, image data associated with the devices identified by the identification unit 64, and the like to the terminal device 14. The image data associated with the person 87 may be image data generated by photographing with a visual sensor, or image data schematically representing a person. The image data associated with a device may be image data obtained by photographing with a visual sensor (camera) when the identification unit 64 identified the device, or image data schematically representing the identified device (for example, an icon). For example, the schematic image data may be stored in advance in the robot apparatus 10, or may be stored in advance in an apparatus such as a server and transmitted to the robot apparatus 10.

  The communication unit 30 of the robot apparatus 10 transmits image data (for example, the image data associated with the person 87 and the image data associated with the devices) as additional information to the terminal device 14, and the images are displayed on the UI unit 72 of the terminal device 14. FIG. 22 shows an example of these images. A situation explanation screen 98 is displayed on the UI unit 72 of the terminal device 14. On the situation explanation screen 98, an image group serving as the additional information is displayed. For example, an image 100 associated with the person 87, a device image 102 associated with the multifunction device 78, a device image 104 associated with the projector 88, a device image 106 associated with the camera 90, a device image 108 associated with the display 92, and a device image 110 associated with the aroma device 94 are displayed.

  When the user designates a device image on the situation explanation screen 98 and gives an execution instruction of the solution means using the device associated with the device image, information indicating the execution instruction is transmitted to the device. The device executes solution means for solving the problem detected by the detection unit 56 in accordance with the execution instruction. Information indicating the execution instruction may be transmitted from the terminal device 14 to the device, or may be transmitted from the robot device 10 to the device.

  For example, it is assumed that the device images 108 and 110 are designated by the user. In this case, the solution means specifying unit 58 refers to the solution means management information 38 to identify solution means for solving the problem "three people are in a meeting and seem to be giving up" detected by the detection unit 56, and, by referring to the device function management information 40 and the cooperation function management information 42, identifies the device group used for each identified solution means (that is, the device group having the functions used to execute each solution means). When the display 92 and the aroma device 94 are designated by the user, the solution means specifying unit 58 selects solution means that use the display 92 associated with the device image 108 and the aroma device 94 associated with the device image 110.

  When the display 92 and the aroma device 94 are designated by the user, an execution instruction screen 112 is displayed on the UI unit 72 of the terminal device 14, for example, as shown in FIG. 23. On the execution instruction screen 112, the device images designated by the user (for example, the device images 108 and 110) are displayed, and information indicating functions (solution means) that can be executed using the devices designated by the user is displayed. For example, a solution means that can be executed by using the display 92 and the aroma device 94 is to "display a healing image on the display 92 and emit a healing scent from the aroma device 94". This solution means is a cooperation function that can be executed by linking the display 92 and the aroma device 94, and this cooperation function is registered in the cooperation function management information 42. When the user gives an execution instruction for the solution means using the terminal device 14, information indicating the execution instruction is transmitted from the terminal device 14 to the display 92 and the aroma device 94. Of course, the information indicating the execution instruction may be transmitted to the display 92 and the aroma device 94 via the robot apparatus 10. The display 92 that has received the information indicating the execution instruction displays a healing image, and the aroma device 94 that has received the information indicating the execution instruction emits a healing scent. The healing image data may be stored in advance in the robot apparatus 10 and transmitted from the robot apparatus 10 to the display 92, or may be stored in an apparatus such as a server and transmitted from that apparatus to the display 92.
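  The selection of a cooperation function registered for the combination of devices designated by the user can be sketched as follows. The table contents are hypothetical; only the display 92 / aroma device 94 combination from the example above is shown.

# Hypothetical sketch: cooperation function registered for a combination of devices.
COOPERATION_FUNCTIONS = {
    frozenset({"display 92", "aroma device 94"}):
        "display a healing image on the display and emit a healing scent from the aroma device",
}


def execute_solution(designated_devices):
    function = COOPERATION_FUNCTIONS.get(frozenset(designated_devices))
    if function is None:
        print("no cooperation function registered for", sorted(designated_devices))
        return
    for device in sorted(designated_devices):
        # In the embodiment the instruction may go via the terminal device 14 or the robot apparatus 10.
        print("send execution instruction to %s: %s" % (device, function))


execute_solution({"display 92", "aroma device 94"})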

  Note that the screens shown in FIGS. 21 to 23 may be displayed on the UI unit 50 of the robot apparatus 10. In this case, each screen may not be displayed on the UI unit 72 of the terminal device 14. Information displayed on the screen may be output as audio information.

  The device selection operation for executing the solving means will be described in detail later.

(User judgment mode 2)
Hereinafter, another example of the user determination mode (user determination mode 2) will be described in detail.

  When the determination unit 60 of the robot apparatus 10 determines that the problem detected by the detection unit 56 cannot be solved by the robot apparatus 10, the communication unit 30 of the robot apparatus 10, under the control of the control unit 54, transmits the status information to the terminal device 14.

  For example, as illustrated in FIG. 24, a notification screen 114 is displayed on the UI unit 72 of the terminal device 14. The notification screen 114 displays a message indicating that a problem (situation) that cannot be solved by the robot apparatus 10 has occurred.

  When the user presses the “Yes” button on the notification screen 114, the screen transitions to the next screen. For example, as illustrated in FIG. 25, a situation explanation screen 116 is displayed on the UI unit 72 of the terminal device 14. On the situation explanation screen 116, an explanation of the situation (problem) detected by the robot apparatus 10 is displayed.

  The situation explanation screen 116 displays a message asking whether the user has understood the situation. When the user presses the "No" button, the user can make an inquiry to the robot apparatus 10, that is, request additional information. In this case, as shown in FIG. 26, an inquiry screen 118 is displayed on the UI unit 72 of the terminal device 14. On the inquiry screen 118, the user can use the terminal device 14 to input the inquiry contents (for example, items that the user wants to know more about). In the example shown in FIG. 26, inquiry contents such as "I want the data of XX" and "I want to see the video of △△" are input by the user. Information indicating the inquiry contents is transmitted from the terminal device 14 to the robot apparatus 10. The situation collection unit 44 of the robot apparatus 10 collects information (for example, image and audio data) in response to the inquiry, and the communication unit 30 of the robot apparatus 10 transmits the information collected by the situation collection unit 44 to the terminal device 14. The additionally acquired information is displayed on the situation explanation screen 116 of the UI unit 72 of the terminal device 14.

  When the user presses the "Yes" button on the situation explanation screen 116, as shown in FIG. 27, the solution display screen 120 is displayed on the UI unit 72 of the terminal device 14. On the solution display screen 120, information indicating the solution means specified by the solution means specifying unit 58 of the robot apparatus 10 (a descriptive text or name of each solution means, or the like) is displayed. For example, as described above, when the problem "three people are in a meeting and seem to be giving up" is detected, the solution means specifying unit 58 refers to the solution means management information 38 and identifies solution means for solving the problem. Information indicating the identified solution means is transmitted from the robot apparatus 10 to the terminal device 14 and displayed on the UI unit 72 of the terminal device 14. In the example shown in FIG. 27, four solution means plans (1) to (4) are specified as recommended solution means for solving the above-described problem, and information indicating them is displayed. The solution means may be displayed in a random order, in an order in which higher effects can be expected in solving the problem, or in an order determined by ease of execution. For example, a solution means that uses a device located closer to the terminal device 14 or the robot apparatus 10 is easier to execute, and such a solution means is displayed higher in the list. The position information of the robot apparatus 10, the terminal device 14, and each device is obtained by using, for example, GPS (Global Positioning System). The control unit 54 of the robot apparatus 10 calculates the distance between the terminal device 14 or the robot apparatus 10 and each device using the position information obtained by GPS, and determines the ranking based on the calculation result.
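  The ordering by ease of execution described above can be illustrated with a short sketch that ranks solution means by the distance between the terminal device 14 and the device each solution means uses. The coordinates and plan names are hypothetical; the embodiment only states that GPS position information is used to calculate the distances.

# Hypothetical sketch: rank solution means so that those using closer devices appear higher.
import math


def distance_m(p1, p2):
    """Approximate great-circle distance in metres between two (lat, lon) positions."""
    lat1, lon1, lat2, lon2 = map(math.radians, (p1[0], p1[1], p2[0], p2[1]))
    a = math.sin((lat2 - lat1) / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000.0 * math.asin(math.sqrt(a))


terminal_position = (35.6581, 139.7414)
solution_means = [
    {"name": "(2) display a healing image on the display", "device_position": (35.6583, 139.7417)},
    {"name": "(3) provide a citrus scent from the aroma device", "device_position": (35.6581, 139.7415)},
    {"name": "(1) another proposed solution means", "device_position": (35.6590, 139.7430)},
]

# Closer device -> easier to execute -> displayed higher on the solution display screen.
ranked = sorted(solution_means, key=lambda s: distance_m(terminal_position, s["device_position"]))
for rank, s in enumerate(ranked, start=1):
    print(rank, s["name"])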

  When the solution display screen 120 is displayed on the UI unit 72 of the terminal device 14 and the user instructs a screen transition, or when a preset time has elapsed, the screen 122 is displayed on the UI unit 72 of the terminal device 14 as shown in FIG. 28. The screen 122 displays a message such as "Please select a solution" (for example, a message prompting the user to select a solution means). In addition, when an appropriate solution means (for example, a solution means that the user desires to execute) is not included in the proposed solution means (see FIG. 27), the user can give another instruction.

  For example, when an appropriate solution means is included in the proposed solution means (see FIG. 27) and the user selects it using the terminal device 14, the confirmation screen 124 is displayed on the UI unit 72 of the terminal device 14 as shown in FIG. 29. The confirmation screen 124 displays information indicating the solution means selected by the user. In the example shown in FIG. 29, the user has selected solution means (2) "display a healing image on the display" and (3) "provide a citrus scent on the aroma apparatus". When the user presses the "Yes" button on the confirmation screen 124, information indicating the execution instruction of the solution means (2) and (3) is transmitted from the terminal device 14 or the robot apparatus 10 to the devices used to execute the solution means (2) and (3) (for example, the display 92 and the aroma device 94). Thereby, the solution means (2) is executed by the display 92, and the solution means (3) is executed by the aroma device 94.

  When the user presses the “No” button on the confirmation screen 124, the screen returns to the previous screen 122 (see FIG. 28).

  When an appropriate solution means is not included in the proposed solution means (see FIG. 27), the user may specify a solution means other than the proposed ones. When the user uses the terminal device 14 to give an instruction to display the input screen for designating a solution means, the user input screen 126 is displayed on the UI unit 72 of the terminal device 14, for example, as shown in FIG. 30. On the user input screen 126, the user designates a device used for the solution means and the solution means using that device. In the example shown in FIG. 30, a multifunction device is designated as the device used for the solution means, and "XX" processing is designated as the solution means using the multifunction device. The user may specify the device used for the solution means by entering the name of the device in characters, or by specifying the device image associated with the device. For example, the UI unit 72 of the terminal device 14 displays one or a plurality of device images associated with one or a plurality of devices identified by the robot apparatus 10, and the user selects, from the one or plurality of device images, the device image associated with the device used for the solution means. By specifying the device by its device image in this way, the user can specify the device used for the solution means without knowing the detailed name of the device. In addition, when a device used for the solution means is designated by the user, a list of functions of the device is displayed on the UI unit 72 of the terminal device 14. The functions of the device are specified by referring to, for example, the device function management information 40. The user selects a function used for the solution means from the list of functions. When the user gives an instruction to execute the solution means using the terminal device 14, information indicating the execution instruction is transmitted to the device designated by the user, and the device executes the solution means.

  Note that the screens shown in FIGS. 24 to 30 may be displayed on the UI unit 50 of the robot apparatus 10. In this case, each screen may not be displayed on the UI unit 72 of the terminal device 14. Information displayed on the screen may be output as audio information.

  In the user determination modes 1 and 2 described above, information displayed on the UI unit 72 of the terminal device 14 is transmitted from the robot device 10 to the terminal device 14, for example. Of course, the information may be transmitted from a device such as a server to the terminal device 14 under the control of the robot device 10.

(Device identification processing)
The device identification process will be described below. As an example, device identification information is acquired and a device is identified by applying AR (Augmented Reality) technology. For example, by applying AR technology, device identification information of a device used alone is acquired to identify that device, and device identification information of a device to be linked is acquired to identify the device to be linked. A known AR technique is used. For example, a marker-type AR technique using a marker such as a two-dimensional barcode, a markerless-type AR technique using an image recognition technique, a position-information AR technique using position information, and the like are used. Of course, the device identification information may be acquired and the device may be identified without applying AR technology. For example, if the device is connected to the network, the device may be identified based on its IP address, or the device ID may be read to identify the device. Furthermore, in the case of a device or a terminal device having various wireless communication functions such as infrared communication, visible light communication, Wi-Fi, or Bluetooth, the device may be identified by acquiring the device ID of the device to be linked using these wireless communication functions, and the cooperation function may be executed.

  The device identification information acquisition process will be described in detail below with reference to FIG. 31. As an example, a case where the multifunction device 78 is installed around the robot apparatus 10 and the robot apparatus 10 acquires the device identification information of the multifunction device 78 will be described. FIG. 31 schematically shows the appearance of the multifunction device 78. Here, a process for acquiring the device identification information by applying the marker-type AR technology will be described. A marker 128 such as a two-dimensional barcode is provided on the housing of the multifunction device 78. The marker 128 is information in which the device identification information of the multifunction device 78 is encoded. The robot apparatus 10 photographs the marker 128 with a visual sensor, whereby image data representing the marker 128 is generated. In the robot apparatus 10, the control unit 54 extracts the device identification information by applying a decoding process to the marker image represented in the image data. Thereby, the multifunction device 78 is identified. The identification unit 64 of the robot apparatus 10 specifies function information indicating the functions associated with the extracted device identification information in the device function management information 40. Thereby, the functions of the multifunction device 78 are specified (identified).
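  A minimal sketch of this lookup is shown below. The decoding of the two-dimensional barcode is represented by a stub, since the embodiment does not specify a particular decoding library; the table entries and identifiers are hypothetical.

# Hypothetical sketch: device identification information -> functions, after marker decoding.
DEVICE_FUNCTION_MANAGEMENT_INFO = {
    "MFP-B": ["print function", "scan function", "copy function", "facsimile function"],
    "PROJECTOR-C": ["projection function"],
}


def decode_marker(marker_image_bytes):
    """Stub standing in for the decoding process applied to the captured marker image."""
    return "MFP-B"


def identify_device(marker_image_bytes):
    device_id = decode_marker(marker_image_bytes)
    return device_id, DEVICE_FUNCTION_MANAGEMENT_INFO.get(device_id, [])


print(identify_device(b"<captured marker image>"))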

  The marker 128 may include encoded function information indicating the function of the multi-function device 78. In this case, by applying a decoding process to the image data representing the marker 128, the device identification information of the multifunction device 78 is extracted, and the function information indicating the function of the multifunction device 78 is also extracted. As a result, the multi-function device 78 is specified (identified) and the function of the multi-function device 78 is specified (identified).

  When the device identification information is acquired by applying the markerless-type AR technology, for example, the robot apparatus 10 photographs all or part of the appearance of the device (for example, the multifunction device 78) with a visual sensor. It is, of course, useful to capture information for identifying the device, such as the device name (for example, the product name) or model number, from the appearance. Appearance image data representing all or part of the appearance of the device is generated by the photographing. In the robot apparatus 10, the control unit 54 identifies the device to be used based on the appearance image data. For example, the storage unit 32 of the robot apparatus 10 stores, for each device, appearance image association information indicating the association between appearance image data representing all or part of the appearance of the device and the device identification information of the device. For example, the control unit 54 compares the appearance image data obtained by the photographing with each piece of appearance image data included in the appearance image association information, and specifies the device identification information of the device to be used based on the comparison result. For example, the control unit 54 extracts features of the appearance of the device from the appearance image data obtained by the photographing, specifies, in the appearance image data group included in the appearance image association information, the appearance image data representing features that are the same as or similar to those features, and specifies the device identification information associated with that appearance image data. As a result, the device (for example, the multifunction device 78) is identified. As another example, when the device name (for example, the product name) or model number is photographed and appearance image data representing the name or model number is generated, the device may be identified based on the name or model number represented in the appearance image data. The identification unit 64 of the robot apparatus 10 specifies function information indicating each function associated with the identified device identification information in the device function management information 40. As a result, the functions of the device (for example, the multifunction device 78) are specified.

  When the device identification information is acquired by applying the position-information AR technology, for example, position information indicating the position where a device (for example, the multifunction device 78) is installed is acquired by using a GPS function. For example, each device has a GPS function and acquires device position information indicating its own position. The robot apparatus 10 outputs information indicating a device position information acquisition request to the device, and receives the device position information of the device from the device as a response to the acquisition request. In the robot apparatus 10, the control unit 54 identifies the device based on the device position information. For example, the storage unit 32 of the robot apparatus 10 stores, for each device, position association information indicating the association between device position information indicating the position where the device is installed and the device identification information of the device. The control unit 54 specifies the device identification information associated with the device position information in the position association information. Thereby, the device is specified (identified). The identification unit 64 of the robot apparatus 10 specifies function information indicating each function associated with the identified device identification information in the device function management information 40. As a result, the functions of the device (for example, the multifunction device 78) are specified (identified).
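  The position-based identification can be sketched as a nearest-neighbour lookup against the stored position association information; the coordinates, tolerance, and identifiers below are hypothetical.

# Hypothetical sketch: match a reported device position against stored position association information.
import math

POSITION_ASSOCIATION_INFO = [
    {"device_id": "MFP-B", "position": (35.6581, 139.7414)},
    {"device_id": "PROJECTOR-C", "position": (35.6585, 139.7420)},
]


def identify_by_position(device_position, tolerance_deg=0.0005):
    """Return the device ID whose registered position is closest, within a tolerance."""
    best_id, best_dist = None, float("inf")
    for entry in POSITION_ASSOCIATION_INFO:
        lat, lon = entry["position"]
        dist = math.hypot(lat - device_position[0], lon - device_position[1])
        if dist < best_dist:
            best_id, best_dist = entry["device_id"], dist
    return best_id if best_dist <= tolerance_deg else None


print(identify_by_position((35.65812, 139.74142)))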

  For example, when the multifunction device 78 is identified by the robot apparatus 10, in the user determination mode 1 described above, a device image associated with the multifunction device 78 is displayed on the terminal device 14 as status information. For example, as illustrated in FIG. 32, the device image 102 associated with the multifunction device 78 is displayed on the UI unit 72 of the terminal device 14. The device image 102 may be, for example, an image generated by photographing with a visual sensor of the robot apparatus 10 or an image schematically representing the multi-function device 78.

  Further, when a device is identified, information indicating the name of the device may be transmitted from the robot apparatus 10 to the terminal device 14 and the name of the device may be displayed on the UI unit 72 of the terminal device 14. In the example shown in FIG. 32, the name “MFP (B)” is displayed.

  For example, when the user designates the device image 102 using the terminal device 14, information indicating solution means that use the multifunction device 78 associated with the device image 102 (for example, button images for instructing execution of the solution means) is displayed on the UI unit 72 of the terminal device 14, as illustrated in FIG. 33. The multifunction device (B) has, for example, a print function, a scan function, a copy function, and a facsimile function. When these functions are used as solution means, button images for executing these functions are displayed on the UI unit 72 of the terminal device 14. For example, when the user designates the button image representing the print function using the terminal device 14 and instructs execution of the print function, execution instruction information indicating the instruction to execute the print function is transmitted from the terminal device 14 or the robot apparatus 10 to the multifunction device 78. The execution instruction information includes control data for executing the print function, data such as image data to which the print function is applied, and the like. Upon receiving the execution instruction information, the multifunction device 78 executes printing according to the execution instruction information.

  When a plurality of devices (for example, the multifunction device 78 and the projector 88) are identified by the robot apparatus 10, in the user determination mode 1, the device image associated with the multifunction device 78 and the device image associated with the projector 88 are displayed on the terminal device 14 as status information. For example, as shown in FIG. 34, the device image 102 associated with the multifunction device 78 and the device image 104 associated with the projector 88 are displayed on the UI unit 72 of the terminal device 14. The device images 102 and 104 may be, for example, images generated by photographing with a visual sensor of the robot apparatus 10, or images that schematically represent the multifunction device 78 and the projector 88.

  Further, when a device is identified, information indicating the name of the device may be transmitted from the robot device 10 to the terminal device 14 and displayed on the UI unit 72 of the terminal device 14. In the example shown in FIG. 34, the name “multifunction machine (B)” of the multifunction machine 78 and the name “projector (C)” of the projector 88 are displayed.

  For example, when the user designates the device images 102 and 104 using the terminal device 14, information indicating solution means that use the multifunction device 78 associated with the device image 102 and the projector 88 associated with the device image 104 (for example, button images for instructing execution of the solution means) is displayed on the UI unit 72 of the terminal device 14, as shown in FIG. 35. These solution means are means realized by cooperation functions that use the multifunction device 78 and the projector 88. By linking the multifunction device 78 and the projector 88, for example, a cooperation function of projecting, by the projector 88, an image generated by scanning with the multifunction device 78, or a cooperation function of printing, by the multifunction device 78, an image projected by the projector 88, can be executed. Button images for instructing execution of these cooperation functions are displayed on the UI unit 72 of the terminal device 14. For example, when the user designates a button image using the terminal device 14 and instructs execution of the solution means (cooperation function), execution instruction information indicating the execution instruction of the solution means is transmitted from the terminal device 14 or the robot apparatus 10 to the multifunction device 78 and the projector 88. Upon receiving the execution instruction information, the multifunction device 78 and the projector 88 execute the cooperation function designated by the user.

  When the user touches the device image 102 and moves the finger to the device image 104 (for example, by tracing with the finger), the device images 102 and 104 are designated, and the multifunction device 78 and the projector 88 may be designated as devices to be linked. The order in which the device images 102 and 104 are touched and the direction of tracing may be the reverse of the above example. Of course, a screen contact means other than a finger, such as a pen for tracing the screen, may be used. When the user connects the device image 102 and the device image 104, the device images 102 and 104 may be designated, and the multifunction device 78 and the projector 88 may be designated as devices to be linked. When the user overlaps the device image 102 and the device image 104, the device images 102 and 104 may likewise be designated, and the multifunction device 78 and the projector 88 may be designated as devices to be linked. The devices to be linked may also be designated by a drawing operation such as drawing a circle, or by specifying the device images associated with the devices to be linked within a preset time. When canceling the cooperation, the user may specify the device to be canceled on the screen, or may press a cooperation cancellation button. The device to be released may be designated by a preset operation such as drawing a cross mark.

  As another example, a device to be linked may be set in advance as a basic linked device. For example, it is assumed that the multi-function device 78 is preset as a basic cooperation device. The device identification information of the basic cooperation device may be stored in advance in an apparatus such as the robot apparatus 10 or a server. The user may specify the basic cooperation device using the terminal device 14. When the basic linkage device is set, the user designates a device linked to the basic linkage device by designating a device image associated with a device other than the basic linkage device.

  In the above example, the stand-alone function and the cooperation function are functions that use devices as hardware, but the stand-alone function and the cooperation function may also be functions realized by software (applications). For example, instead of a device image, a function image (for example, an image such as an icon) associated with a function realized by software may be displayed on the UI unit 72 of the terminal device 14, and, by the user designating one or a plurality of function images, a function associated with a function image, or a cooperation function that uses a plurality of functions associated with a plurality of function images, may be designated. Of course, a device image associated with a device as hardware and a function image associated with a function realized by software may be displayed on the UI unit 72 of the terminal device 14, and, when the device image and the function image are designated by the user, a cooperation function that uses the device associated with the device image and the function associated with the function image may be designated.

  Hereinafter, processing when a function of a device is executed will be described. As an example, processing when executing the cooperation function will be described. In this case, a connection request is transmitted from the terminal device 14 to the cooperation target devices, and the terminal device 14 and the cooperation target devices are connected. Of course, the connection request may be transmitted from the robot apparatus 10 to the cooperation target devices, and the robot apparatus 10 and the cooperation target devices may be connected. Hereinafter, this connection process will be described with reference to FIG. 36, which is a sequence diagram showing the processing.

  In the terminal device 14, when the cooperation function to be executed is designated by the user (S70), the terminal device 14 transmits information indicating a connection request to the cooperation target devices (for example, the multifunction device 78 and the projector 88) that execute the cooperation function (S71). For example, when address information indicating the addresses of the cooperation target devices is stored in the robot apparatus 10, the terminal device 14 acquires the address information of the cooperation target devices from the robot apparatus 10. As another example, the terminal device 14 may store the address information of the devices to be linked. Of course, the terminal device 14 may acquire the address information of the cooperation target devices by another method. The terminal device 14 transmits the information indicating the connection request to the devices to be linked (for example, the multifunction device 78 and the projector 88) using their address information.

  Upon receiving the information indicating the connection request, the multi-function device 78 and the projector 88 permit or do not permit the connection with the terminal device 14 (S72). For example, when the multi-function device 78 and the projector 88 correspond to devices that are not permitted to be connected, or when the number of devices that request connection exceeds the upper limit, the connection is not permitted. When the connection from the terminal device 14 is permitted, the change operation may be prohibited so that the unique setting information of the multifunction device 78 and the projector 88 is not changed from the terminal device 14. For example, it may be prohibited to change the color parameters of the multi-function device 78 or the set time when shifting to power saving. As a result, the security for the cooperation target device is improved. As another example, when the devices are linked, the change of the setting information may be limited as compared to the case where the devices are used alone without being linked. For example, fewer setting item changes may be permitted as compared to the case where the device is used alone. In addition, browsing of personal information of other users such as operation history may be prohibited. Thereby, the security with respect to a user's personal information improves.

  Result information indicating permission or non-permission of connection is transmitted from the multi-function device 78 and the projector 88 to the terminal device 14 (S73). When the connection to the multifunction device 78 and the projector 88 is permitted, communication is established between the terminal device 14 and the multifunction device 78 and the projector 88.

  Next, the user uses the terminal device 14 to instruct execution of the cooperation function (S74). In response to this instruction, execution instruction information indicating an instruction to execute the cooperation function is transmitted from the terminal device 14 to the multi-function device 78 and the projector 88 (S75). The execution instruction information transmitted to the multi-function device 78 includes information (for example, job information) indicating processing executed by the multi-function device 78, and the execution instruction information transmitted to the projector 88 includes the projector 88. Information (for example, job information) indicating processing to be executed is included.

  Upon receiving the execution instruction information, the multifunction device 78 and the projector 88 execute their functions according to the execution instruction information (S76). For example, when the cooperation function includes processing in which data is exchanged between the multifunction device 78 and the projector 88, such as a function of transferring scan data from the multifunction device 78 to the projector 88 and projecting the data by the projector 88, communication is established between the multifunction device 78 and the projector 88. In this case, for example, the execution instruction information transmitted to the multifunction device 78 includes the address information of the projector 88, the execution instruction information transmitted to the projector 88 includes the address information of the multifunction device 78, and communication is established between the multifunction device 78 and the projector 88 using their address information.

  When the execution of the cooperation function is completed, information indicating the completion of the execution of the cooperation function is transmitted from the multifunction device 78 and the projector 88 to the terminal device 14 (S77). Information indicating that the execution of the cooperation function has been completed is displayed on the UI unit 72 of the terminal device 14 (S78). If the information indicating completion of execution is not displayed even after a preset time has elapsed from the time when the execution instruction was given, the terminal device 14 may cause the UI unit 72 to display information indicating an error, and may again transmit the execution instruction information or information indicating a connection request to the multifunction device 78 and the projector 88.

  Next, the user confirms whether or not to cancel the cooperative state of the multi-function device 78 and the projector 88 (S79), and performs processing according to the presence or absence of the cancellation (S80). When canceling the cooperation state, the user uses the terminal device 14 to give a cancellation instruction. As a result, communication between the terminal device 14 and the multi-function device 78 and the projector 88 is released. Similarly, communication between the multifunction device 78 and the projector 88 is also released. If the cooperation state is not canceled, an execution instruction may be given continuously.
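  The sequence of FIG. 36 can be summarised as the following sketch, in which the devices are simulated by plain objects. Real devices would be reached over the network using their address information; the class and job names are hypothetical.

# Hypothetical sketch of the S71-S77 message flow with simulated devices.
class SimulatedDevice:
    def __init__(self, name, accept_connections=True):
        self.name = name
        self.accept_connections = accept_connections

    def handle_connection_request(self):   # S72: permit or refuse the connection
        return self.accept_connections

    def execute(self, job):                # S76: execute the instructed processing
        return "%s executed: %s" % (self.name, job)


def run_cooperation(devices, jobs):
    for d in devices:                      # S71 / S73: connection requests and results
        if not d.handle_connection_request():
            return "connection to %s not permitted" % d.name
    results = [d.execute(job) for d, job in zip(devices, jobs)]   # S75 / S76
    return results                         # S77: completion reported back


mfp = SimulatedDevice("multifunction device 78")
projector = SimulatedDevice("projector 88")
print(run_cooperation([mfp, projector], ["scan document", "project scanned image"]))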

  When a single function is executed, information indicating an instruction to execute the single function is transmitted from the terminal device 14 to a device that executes the single function. The device executes a single function according to the execution instruction.

  The execution instruction information may be transmitted from the robot apparatus 10 to each device.

  In the following, processing related to the device designation operation used for the solving means will be described.

(Switching display of information related to linkage function)
In the present embodiment, the display of information related to the cooperation function may be switched according to the order in which the device images associated with the devices are connected. Hereinafter, this processing will be described in detail with reference to FIGS. 37 to 39.

  FIG. 37 shows a cooperation function management table as another example of the cooperation function management information 42. In this cooperation function management table, as an example, information indicating a combination of device IDs, information indicating the names of the devices to be linked (for example, the type of each device), information indicating a cooperation function (cooperation function information), information indicating the connection order, and information indicating the priority order are associated with one another. The connection order corresponds to the order in which the device images associated with the devices are connected. The priority order is the display priority order of the information related to the cooperation function. For example, the device with the device ID "A" is a PC (personal computer), and the device with the device ID "B" is a multifunction device. By linking the PC (A) and the multifunction device (B), for example, a "scan transfer function" and a "print function" are realized as cooperation functions. The "scan transfer function" is a function of transferring image data generated by scanning with the multifunction device (B) to the PC (A). The "print function" is a function of transmitting data (for example, image data or document data) stored in the PC (A) to the multifunction device (B) and printing the data on the multifunction device (B). For example, when the devices are connected from the multifunction device (B) to the PC (A), that is, when the device images are connected from the device image associated with the multifunction device (B) to the device image associated with the PC (A), the priority of the "scan transfer function" is "1st" and the priority of the "print function" is "2nd". In this case, information related to the "scan transfer function" is displayed with priority over information related to the "print function". Conversely, when the devices are connected from the PC (A) to the multifunction device (B), that is, when the device images are connected from the device image associated with the PC (A) to the device image associated with the multifunction device (B), the priority of the "print function" is "1st" and the priority of the "scan transfer function" is "2nd". In this case, information related to the "print function" is displayed with priority over information related to the "scan transfer function".
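  A possible encoding of such a table is sketched below; the keys and wording are hypothetical, but the priorities follow the example just described.

# Hypothetical sketch of the cooperation function management table of FIG. 37.
COOPERATION_FUNCTION_TABLE = {
    # (device combination, connection order) -> cooperation functions in priority order
    (("A", "B"), "B->A"): ["scan transfer function", "print function"],
    (("A", "B"), "A->B"): ["print function", "scan transfer function"],
}


def functions_in_display_order(first_device, second_device):
    combination = tuple(sorted((first_device, second_device)))
    order = "%s->%s" % (first_device, second_device)
    return COOPERATION_FUNCTION_TABLE.get((combination, order), [])


# Connecting from the multifunction device (B) to the PC (A) lists the scan transfer
# function first; connecting in the opposite direction lists the print function first.
print(functions_in_display_order("B", "A"))
print(functions_in_display_order("A", "B"))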

  FIGS. 38 and 39 show examples of screens displayed on the UI unit 72 of the terminal device 14. For example, it is assumed that the multifunction device (B) and the PC (A) have been identified. In the above-described user determination mode 1, as shown in FIG. 38A, the device image 102 associated with the multifunction device (B) and the device image 130 associated with the PC (A) are displayed on the UI unit 72 of the terminal device 14 as status information. In this state, the user connects the device images representing the devices to be linked using an indicator (for example, the user's finger, a pen, or a stylus). The control unit 74 of the terminal device 14 detects the contact of the indicator on the screen and detects the movement of the indicator on the screen. For example, as indicated by an arrow 132, when the user touches the device image 102 on the screen using the indicator and moves it to the device image 130 (for example, by tracing on the screen with the indicator), the device image 102 and the device image 130 are connected. As a result, the multifunction device (B) associated with the device image 102 and the PC (A) associated with the device image 130 are designated as devices to be linked, and the connection order is designated. The order in which the device images are connected corresponds to the connection order. Note that the multifunction device (B) corresponds to the first device, and the PC (A) corresponds to the second device. In the example shown in FIG. 38A, since the images are connected from the device image 102 to the device image 130, the devices are connected from the multifunction device (B) to the PC (A). Information indicating the connection order of the devices is transmitted from the terminal device 14 to the robot apparatus 10. The control unit 74 of the terminal device 14 may display on the screen an image representing the locus traced by the user, or, after the devices are connected, may replace the locus with a preset straight line or the like and display it on the screen.

  As described above, when the devices to be linked (for example, the multifunction device (B) and the PC (A)) are designated, the identification unit 64 of the robot apparatus 10 refers to the cooperation function management table shown in FIG. 37 and specifies the cooperation functions associated with the combination of the PC (A) and the multifunction device (B). Thereby, the cooperation functions executed by linking the PC (A) and the multifunction device (B) are specified. When the connection order of the devices is designated by the user, the identification unit 64 specifies the priority order associated with that connection order in the cooperation function management table. A specific example will be described with reference to FIG. 37. Since the PC (A) and the multifunction device (B) are designated as devices to be linked, the cooperation functions executed by them are the "scan transfer function" and the "print function". In addition, since the devices are connected from the multifunction device (B) to the PC (A) (B → A), the priority of the "scan transfer function" is "1st" and the priority of the "print function" is "2nd".

  Information relating to the cooperation function specified as described above and information indicating the priority order are transmitted from the robot apparatus 10 to the terminal apparatus 14. The control unit 74 of the terminal device 14 causes the UI unit 72 to display information related to the cooperation function according to the priority order.

  For example, as illustrated in FIG. 38B, the control unit 74 of the terminal device 14 causes the UI unit 72 to display information related to the cooperation function candidates. Since the priority of the "scan transfer function" is "1st" and the priority of the "print function" is "2nd", the information related to the "scan transfer function" is displayed with priority over (for example, above) the information related to the "print function". For example, as the information related to the "scan transfer function", the explanatory text "transfer data scanned by the MFP (B) to the PC (A)" is displayed. In addition, as the information related to the "print function", the explanatory text "print data in the PC (A)" is displayed.

  When the cooperation function is designated by the user and an execution instruction is given, the designated cooperation function is executed. For example, when a “YES” button is pressed by the user, a cooperation function associated with the “YES” button is executed.

  In addition, the identification process of a cooperation function and the identification process of a priority may be performed in the terminal device 14.

  As an operation other than tracing between device images with an operator, a device to be linked may be specified by a drawing operation such as adding a circle, and a connection order may be specified. For example, the order of drawing operations corresponds to the connection order. As another example, a device to be linked and a connection order may be specified in accordance with a user's voice instruction.

  FIG. 39 shows another example of the operation. For example, as shown in FIG. 39A, the user touches the device image 130 using the indicator and moves it to the device image 102 in the direction indicated by the arrow 134, thereby connecting the device image 130 and the device image 102. As a result, the PC (A) associated with the device image 130 and the multifunction device (B) associated with the device image 102 are designated as devices to be linked, and the connection order is designated. In this example, since the images are connected from the device image 130 to the device image 102, the devices are connected from the PC (A) to the multifunction device (B). Referring to the cooperation function management table shown in FIG. 37, the priority of the "print function" is "1st" and the priority of the "scan transfer function" is "2nd". In this case, as shown in FIG. 39B, the UI unit 72 of the terminal device 14 displays the information related to the "print function" with priority over (for example, above) the information related to the "scan transfer function".

  As described above, a cooperation function that uses the functions of devices is specified by connecting the device images associated with the devices. Further, the display order of the information related to the cooperation functions is changed according to the order in which the images are connected, that is, the order in which the devices are connected. The connection order of the devices also serves as the order of the functions used in each device and the order in which data moves between the linked devices, and the operation of connecting the devices (that is, the operation of connecting the images) also serves as an operation of designating the order of the functions and the order of the data movement. Therefore, by changing the display order of the information related to the cooperation functions according to the connection order, information related to the cooperation function that the user is predicted to use is displayed with priority; that is, information related to the cooperation function that the user is more likely to use is displayed with priority. For example, when the images are connected from the multifunction device (B) to the PC (A), it is anticipated that the user will use a cooperation function of "using the function of the multifunction device (B) before the PC (A) and transferring data from the multifunction device (B) to the PC (A)". Conversely, when the images are connected from the PC (A) to the multifunction device (B), it is anticipated that the user will use a cooperation function of "using the function of the PC (A) before the multifunction device (B) and transferring data from the PC (A) to the multifunction device (B)". Therefore, by changing the display order of the information related to the cooperation functions according to the order in which the images are connected, information related to the cooperation function that the user is more likely to use is displayed with priority. In addition, the order of the functions to be used and the order of data movement are designated without performing any operation other than the operation of connecting the device images, and information related to the cooperation function that the user is predicted to use is displayed.

  The display switching process described above may also be applied when function images associated with functions are used. For example, the display of the information related to the cooperation function is switched according to the order in which the function image associated with a first function and the function image associated with a second function are designated.

(Cooperative processing using partial images)
Depending on the position within the device image associated with a device, the function of the device assigned to the cooperation function may differ. When a specific position in the device image is designated by the user, information related to a cooperation function that uses the function corresponding to that specific position is preferentially displayed. Hereinafter, this process will be described in detail.

  FIG. 40 shows an example of the device function management table. The data of this device function management table is stored in the robot apparatus 10 as the device function management information 40. In this device function management table, as an example, a device ID, information indicating a device name (for example, a device type), information indicating a position in the device image, information indicating the function corresponding to that position in the device image (function information), and an image ID are associated with each other. The position in the device image is a specific position (specific part) in the device image associated with the device, for example, a specific position in a device image schematically representing the device or a specific position in a device image captured by the camera. A different function is associated with each specific position in the device image.
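
  As an illustration only, the following minimal Python sketch shows one way such a device function management table could be represented and queried with a designated position. The device IDs, region coordinates, and the helper function_at are hypothetical assumptions, not items defined in the specification.

# Hypothetical sketch: a device function management table that associates a
# position (region) in a device image with a function, and a lookup that
# returns the function assigned to a designated position.
DEVICE_FUNCTION_TABLE = [
    {"device_id": "B", "device_name": "multifunction peripheral (B)",
     "region": (0, 60, 200, 160), "function": "print function", "image_id": "102a"},
    {"device_id": "B", "device_name": "multifunction peripheral (B)",
     "region": (0, 0, 200, 60), "function": "scan function", "image_id": "102b"},
]

def function_at(device_id, x, y, table=DEVICE_FUNCTION_TABLE):
    """Return the function assigned to position (x, y) in the device image, if any."""
    for row in table:
        left, top, right, bottom = row["region"]
        if row["device_id"] == device_id and left <= x < right and top <= y < bottom:
            return row["function"]
    return None

print(function_at("B", 100, 30))  # -> "scan function"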

  FIG. 41 shows an example of a screen displayed on the UI unit 72 of the terminal device 14. For example, it is assumed that the multifunction peripheral (B) and the PC (A) have been identified. In the user determination mode 1 described above, as shown in FIG. 41A, the device images 102 and 130 are displayed as the situation information on the UI unit 72 of the terminal device 14. For example, a "print function" is assigned to the specific position (partial image 102a) corresponding to the main body of the multifunction peripheral (B) in the device image 102. A "scan function" is assigned to the specific position (partial image 102b) corresponding to the document cover, document glass, or automatic document feeder of the multifunction peripheral (B) in the device image 102. A "stapling function" is assigned to the specific position (partial image 102c) corresponding to the post-processing device in the device image 102; the "stapling function" is a function for binding output paper with a stapler. Further, a "data storage function" is assigned to the specific position (partial image 130a) corresponding to the main body of the PC (A) in the device image 130, and a "screen display function" is assigned to the specific position (partial image 130b) corresponding to the display unit of the PC (A) in the device image 130. The "data storage function" is a function for storing, in the PC (A), data sent from another device, and the "screen display function" is a function for displaying, on the PC (A), data sent from another device.

  Note that the control unit 74 of the terminal device 14 may cause the UI unit 72 to display the name of a function (for example, printing or scanning) assigned to a specific position in the device image. As a result, the user is provided with easy-to-understand information about what function corresponds to a specific position. Of course, the name of the function may not be displayed.

  When a position to which a function is assigned is designated by the user in the device image, the function assigned to the designated position is designated as a function to be linked. The user uses the indicator to connect, to one another, the specific positions (partial images) to which functions are assigned in the device images representing the devices to be linked. For example, as indicated by the arrow 136, the user touches the partial image 102b with the operation element and moves it to the partial image 130b, thereby connecting the partial image 102b and the partial image 130b. As a result, the multifunction peripheral (B) associated with the device image 102 including the partial image 102b and the PC (A) associated with the device image 130 including the partial image 130b are designated as devices to be linked, and the "scan function" assigned to the partial image 102b and the "screen display function" assigned to the partial image 130b are also designated. Moreover, the order of connection may be designated by this connection operation; in this case, the order in which the partial images are connected corresponds to the order of connection. In the example shown in FIG. 41A, since the images are connected from the partial image 102b to the partial image 130b, the devices are connected from the multifunction peripheral (B) to the PC (A), and the "scan function" and the "screen display function" are designated as the functions used for the cooperation function. Information indicating the connection order of the devices and information indicating the specific positions designated by the user in the device images are transmitted from the terminal device 14 to the robot apparatus 10.

  When the devices to be linked (for example, the PC (A) and the multifunction peripheral (B)) are identified, the identification unit 64 of the robot apparatus 10 identifies, in the cooperation function management table shown in FIG. 37, the group of cooperation functions realized by linking the PC (A) and the multifunction peripheral (B). The identification unit 64 also identifies the function assigned to the position designated by the user in the device image by referring to the device function management table shown in FIG. 40. Then, within the group of cooperation functions realized by linking the PC (A) and the multifunction peripheral (B), the identification unit 64 lowers the priority of the cooperation functions that do not use the function assigned to the position designated by the user.
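
  As an illustration only, the following minimal Python sketch shows one way cooperation functions that do not use the designated functions could be demoted so that matching candidates are presented first. The candidate entries and the helper prioritize are hypothetical.

# Hypothetical sketch: demote cooperation functions that do not use every
# function designated by the user, so that matching candidates are shown first.
CANDIDATES = [
    {"name": "scan transfer storage function", "uses": {"scan function", "data storage function"}},
    {"name": "scan transfer display function", "uses": {"scan function", "screen display function"}},
]

def prioritize(designated_functions, candidates):
    """Candidates whose functions include all designated functions come first."""
    return sorted(candidates, key=lambda c: 0 if designated_functions <= c["uses"] else 1)

for candidate in prioritize({"scan function", "screen display function"}, CANDIDATES):
    print(candidate["name"])  # "scan transfer display function" is printed first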

  Information relating to the cooperation function specified as described above and information indicating the priority order are transmitted from the robot apparatus 10 to the terminal apparatus 14. The control unit 74 of the terminal device 14 causes the UI unit 72 to display information related to the cooperative function as information related to the cooperative function candidate according to the priority order.

  For example, as illustrated in FIG. 41B, the control unit 74 of the terminal device 14 causes the UI unit 72 to display the information related to the cooperation function candidates. Since the "scan function" and the "screen display function" were designated by the user in that order, the information related to the cooperation function "scan transfer display function", which is executed by linking the "scan function" and the "screen display function", is displayed preferentially (for example, at a higher rank) over the information related to the other cooperation functions. For example, the information related to the "scan transfer display function" is displayed preferentially over the information related to the cooperation function "scan transfer storage function", which is executed by linking the "scan function" and the "data storage function". The scan transfer display function is a function of transferring data generated by scanning on the multifunction peripheral (B) to the PC (A) and displaying it on the screen of the PC (A). The scan transfer storage function is a function of transferring data generated by scanning on the multifunction peripheral (B) to the PC (A) and storing it in the PC (A). In the example shown in FIG. 41B, explanatory text for each cooperation function is displayed as the information related to that cooperation function.

  According to the cooperation process using partial images, when a device to be linked has a plurality of functions, the functions are individually designated, and information related to the cooperation function that uses the designated functions is preferentially displayed. Thereby, the cooperation function predicted to be used by the user is preferentially displayed.

  Note that the cooperation function may be a function that uses a combination of parts of devices, a function that uses a combination of an entire device and a part of a device, or a function that uses a combination of entire devices.

  The cooperation process using the partial image may be applied when a function image associated with the function is used. For example, different functions are assigned depending on the position in the function image, and the cooperation function that uses the function assigned to the position specified by the user is specified.

(Another example of cooperative processing using partial images)
Hereinafter, another example of the cooperation process using the partial images will be described with reference to FIGS. 42 and 43.

  FIG. 42 shows an example of the device function management table. The data of this device function management table is stored in the robot apparatus 10 as the device function management information 40. In this device function management table, as an example, a device ID, information indicating a device name (for example, a device type), information indicating the name of a part of the device (for example, the type of the part), a part ID as part identification information for identifying the part, information indicating the function assigned to the part (the function possessed by the part), and a partial image ID for identifying the partial image associated with the part are associated with each other. The partial image is an image representing the appearance of the part of the device obtained by photographing with the camera. Of course, a partial image that schematically represents the part of the device may instead be associated with the part. For example, a different function is assigned to each part of the device.

  A specific example will be described. A screen display function is assigned to the display unit of the PC (A), and information indicating the screen display function is associated with the partial image ID of the partial image associated with the display unit. The screen display function is a function for displaying information on the PC (A). A data storage function is assigned to the main body of the PC (A), and information indicating the data storage function is associated with the partial image ID of the partial image associated with the main body. The data storage function is a function for storing data in the PC (A).

  Also, a print function is assigned to the main body of the multifunction peripheral (B), and information indicating the print function is associated with the partial image ID of the partial image associated with the main body. A scan function is assigned to the reading unit of the multifunction peripheral (B) (for example, the portion corresponding to the document cover, document glass, or automatic document feeder of the multifunction peripheral (B)), and information indicating the scan function is associated with the partial image ID of the partial image associated with the reading unit. A stapling function is assigned to the post-processing device of the multifunction peripheral (B), and information indicating the stapling function is associated with the partial image ID of the partial image associated with the post-processing device. The stapling function is a function of binding output paper by stapling.

  The function assigned to a part of a device is specified (identified) by using, for example, markerless AR technology. For example, when the part of the device is photographed by a camera (for example, the visual sensor of the robot apparatus 10), the identification unit 64 of the robot apparatus 10 specifies (identifies) the function associated with the appearance image data of that part in the device function management table. Thereby, the function assigned to the photographed part is specified (identified). For example, when the main body of the multifunction peripheral (B) is photographed by the visual sensor, the identification unit 64 of the robot apparatus 10 specifies the print function associated with the appearance image data of the main body in the device function management table. Thereby, it is specified that the function assigned to the main body of the multifunction peripheral (B) is the print function.

  Of course, the function assigned to a part of a device may be specified (identified) by using marker-type AR technology. For example, each part of the device is provided with a marker, such as a two-dimensional barcode, in which part identification information (for example, a part ID) for identifying that part is encoded. When the marker provided on a part is photographed by the visual sensor and the marker-type AR technology is applied, the part identification information (for example, the part ID) of that part is acquired. When the part identification information is acquired in this way, the identification unit 64 of the robot apparatus 10 specifies (identifies) the function associated with that part identification information (for example, the part ID) in the device function management table.
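
  As an illustration only, the following minimal Python sketch shows one way a function could be looked up from a part ID decoded from such a marker. The part IDs and the table contents are hypothetical.

# Hypothetical sketch: resolve the function assigned to a device part from a
# part ID decoded out of a marker (e.g. a two-dimensional barcode) on the part.
PART_FUNCTION_TABLE = {
    "B-main": "print function",
    "B-reader": "scan function",
    "B-postprocessor": "stapling function",
    "A-display": "screen display function",
    "A-main": "data storage function",
}

def identify_part_function(decoded_part_id):
    """Return the function assigned to the photographed part, or None if unknown."""
    return PART_FUNCTION_TABLE.get(decoded_part_id)

print(identify_part_function("B-main"))  # -> "print function"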

  FIG. 43 shows an example of the cooperation function management table. The data of this cooperation function management table is stored in the robot apparatus 10 as the cooperation function management information 42. This cooperation function management table is information indicating cooperation functions that use the functions of a plurality of parts. In the cooperation function management table, as an example, information indicating a combination of parts of devices, information indicating the combination of their part IDs, and information indicating the cooperation function that uses the functions of the plurality of parts included in that combination are associated with each other. Of course, in the cooperation function management table, information indicating a combination of a part of a device and an entire device may be associated with information indicating a cooperation function that uses the function of the part of the device and the function of the entire device.

  A specific example will be described. A print function as a cooperation function is assigned to the combination of the display unit of the PC (A) and the main body of the multifunction peripheral (B); information indicating the combination of the part ID of the display unit of the PC (A) and the part ID of the main body of the multifunction peripheral (B) is associated with information indicating the print function as the cooperation function. The print function as the cooperation function is, for example, a function of transmitting data stored in the PC (A) to the multifunction peripheral (B) and printing the data with the multifunction peripheral (B).

  A print function as a cooperation function is also assigned to the combination of the main body of the multifunction peripheral (B) and the main body of the projector (C); information indicating the combination of the part ID of the main body of the multifunction peripheral (B) and the part ID of the main body of the projector (C) is associated with information indicating the print function as the cooperation function. This print function as a cooperation function is, for example, a function of transmitting data projected by the projector (C) to the multifunction peripheral (B) and printing the data with the multifunction peripheral (B).

  Further, a scan projection function as a cooperation function is assigned to the combination of the reading unit of the multifunction peripheral (B) and the main body of the projector (C); information indicating the combination of the part ID of the reading unit of the multifunction peripheral (B) and the part ID of the main body of the projector (C) is associated with information indicating the scan projection function as the cooperation function. The scan projection function as a cooperation function is, for example, a function of transmitting data generated by scanning on the multifunction peripheral (B) to the projector (C) and projecting the data with the projector (C).

  Note that the cooperation function may be a function that uses a function of a plurality of parts included in the same device, or may be a function that uses a function of a part of a plurality of different devices. In addition, the cooperation function may be a function that uses functions of three or more parts.

  For example, when a plurality of parts of devices (for example, a plurality of parts of a plurality of different devices, or a plurality of parts of the same device) are specified (identified) using the marker-type AR technology or the markerless AR technology, the identification unit 64 of the robot apparatus 10 specifies (identifies) the cooperation function associated with the combination of the plurality of identified parts in the cooperation function management table. Thereby, the cooperation function that uses the functions possessed by the plurality of identified (for example, photographed) parts is specified (identified). For example, when the main body of the multifunction peripheral (B) and the main body of the projector (C) are identified, the robot apparatus 10 specifies, in the cooperation function management table, the print function or the like as the cooperation function associated with the combination of the main body of the multifunction peripheral (B) and the main body of the projector (C).
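
  As an illustration only, the following minimal Python sketch shows one way a cooperation function management table keyed by a combination of part IDs could be represented and queried once a plurality of parts has been identified. The part IDs and table entries are hypothetical, loosely following the examples above.

# Hypothetical sketch: a cooperation function management table keyed by an
# unordered combination of part IDs, and a lookup used once the parts have
# been identified by the marker-type or markerless AR technology.
COOPERATION_FUNCTION_TABLE = {
    frozenset({"A-display", "B-main"}): "print function",
    frozenset({"B-main", "C-main"}): "print function",
    frozenset({"B-reader", "C-main"}): "scan projection function",
}

def identify_cooperation_function(identified_part_ids):
    """Return the cooperation function associated with the identified parts, if any."""
    return COOPERATION_FUNCTION_TABLE.get(frozenset(identified_part_ids))

print(identify_cooperation_function({"B-main", "C-main"}))  # -> "print function"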

(Designation of devices to be linked by overlaying device images)
A device group to be linked may be designated by superimposing a plurality of device images. Hereinafter, this process will be described with reference to FIGS. 44 and 45. FIGS. 44 and 45 show examples of screens displayed on the UI unit 72 of the terminal device 14.

  For example, it is assumed that the multifunction peripheral (B) and the PC (A) have been identified. In the user determination mode 1 described above, as shown in FIG. 44A, the device images 102 and 130 associated with the identified devices are displayed as the situation information on the UI unit 72 of the terminal device 14. In this state, the user uses an indicator (for example, the user's finger, a pen, or a stylus) to superimpose the device image associated with a first device on the device image associated with the cooperation destination device (a second device). For example, as shown in FIG. 44B, the user designates the device image 102 with the operation element and superimposes it on the device image 130, as indicated by the arrow 138. The device images are superimposed, for example, by a drag-and-drop operation: the user drags the device image 102 and drops it at a position where it overlaps the device image 130. The drag-and-drop operation itself is a known technique. Note that the device images to be superimposed may instead be designated in accordance with a voice instruction from the user; for example, the device images 102 and 130 may be designated as the device images to be superimposed, and superimposed, in accordance with the user's voice instruction.

  By superimposing the device images 102 and 130 on each other, the multifunction device (B) associated with the device image 102 and the PC (A) associated with the device image 130 are designated as devices to be linked.
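
  As an illustration only, the following minimal Python sketch shows one way a terminal device could decide, after a drop, which devices have been designated as devices to be linked by testing whether the dropped device image overlaps another device image. The rectangles, device names, and the helper rects_overlap are hypothetical; an actual UI framework would supply such geometry itself.

# Hypothetical sketch: after a drop, test whether the dragged device image
# overlaps another device image, and if so treat the two associated devices
# as the devices designated to be linked.
def rects_overlap(a, b):
    """Axis-aligned overlap test for (left, top, right, bottom) rectangles."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

device_images = {
    "multifunction peripheral (B)": (320, 120, 520, 320),  # device image 102, after the drop
    "PC (A)": (300, 100, 500, 300),                        # device image 130
}

dropped = "multifunction peripheral (B)"
targets = [name for name, rect in device_images.items()
           if name != dropped and rects_overlap(device_images[dropped], rect)]
if targets:
    print("devices to be linked:", dropped, "and", targets[0])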

  Note that the control unit 74 of the terminal device 14 may display the device image being dragged on the UI unit 72 in a display mode that can be identified. For example, the device image being dragged may be displayed semi-transparently or displayed in a specific color.

  When the device image 102 is superimposed on the device image 130 and the PC (A) can execute a cooperation function together with the multifunction peripheral (B), a confirmation screen 140 is displayed on the UI unit 72 as shown in FIG. 44. The confirmation screen 140 is a screen for confirming whether or not the device group designated as the devices to be linked should actually be linked. When a cooperation instruction is given by the user on the confirmation screen 140 (for example, when the "YES" button is pressed by the user), information related to the cooperation functions is displayed on the UI unit 72 of the terminal device 14.

  For example, as illustrated in FIG. 45A, the control unit 74 of the terminal device 14 causes the UI unit 72 to display information related to the cooperation function candidates. By linking the PC (A) and the multifunction peripheral (B), for example, a "scan transfer function" and a "print function" are realized, so information related to the "scan transfer function" and information related to the "print function" are displayed.

  When the cooperation function is designated by the user and an execution instruction is given, the terminal device 14 issues a connection request to the cooperation target device. As shown in FIG. 45B, a standby screen is displayed on the UI unit 72 of the terminal device 14 during the connection request. When the connection between the terminal device 14 and the device to be linked is successful, the designated linkage function is executed.

  As described above, a cooperation function that uses a function of a device is specified by superimposing device images associated with the device. Therefore, the functions can be linked without performing an operation other than the image operation, and the functions are linked with a simple operation.

  The cooperation function may also be specified by superimposing a partial image on a device image or on another partial image. This process will be described with reference to FIG. 46. FIG. 46 shows an example of a screen displayed on the UI unit 72 of the terminal device 14.

  Similar to the cooperation process using the partial image described above, the function of the device differs depending on the position in the device image associated with the device. By superimposing the partial images in the device images on the partial images in the same or different device images, a cooperation function that uses a function associated with both partial images is specified. Hereinafter, this process will be described in detail.

  For example, it is assumed that the multifunction peripheral (B) and the PC (A) are identified. In the user determination mode 1 described above, as shown in FIG. 46A, the device images 102 and 130 are displayed as the situation information on the UI unit 72 of the terminal device 14. For example, the partial images 102a, 102b, 102c, 130a, and 130b are displayed as images that are separately movable from the other partial images.

  When a partial image is designated by the user and superimposed on another partial image, a cooperation function that uses the functions associated with both partial images is specified, and information related to that cooperation function is displayed on the UI unit 72 of the terminal device 14.

  For example, as indicated by the arrow 142 in FIG. 46B, when the user drags the partial image 102b with the operation element and drops it on the partial image 130b, the multifunction peripheral (B) associated with the device image 102 including the partial image 102b and the PC (A) associated with the device image 130 including the partial image 130b are designated as the cooperation target devices, and the "scan function" assigned to the partial image 102b and the "screen display function" assigned to the partial image 130b are designated as the functions to be linked.

  In the robot apparatus 10, the functions assigned to the partial images are managed. For example, identification information for identifying a partial image, function information indicating the function associated with that partial image, and cooperation function information indicating the cooperation functions executed by linking that function with other functions are associated with each other and stored in the robot apparatus 10. When a partial image is selected and superimposed on another partial image, identification information indicating those partial images is transmitted from the terminal device 14 to the robot apparatus 10. In the example shown in FIG. 46B, identification information indicating the partial images 102b and 130b is transmitted from the terminal device 14 to the robot apparatus 10. Based on the identification information, the identification unit 64 of the robot apparatus 10 identifies the function assigned to each of the superimposed partial images 102b and 130b, and identifies the cooperation function that uses those functions. Information related to the cooperation function is transmitted from the robot apparatus 10 to the terminal device 14 and displayed.
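
  As an illustration only, the following minimal Python sketch shows one way the robot apparatus could resolve a cooperation function from the identification information of two superimposed partial images, first mapping each partial image to its function and then mapping the pair of functions to a cooperation function. The identifiers and table contents are hypothetical.

# Hypothetical sketch: resolve a cooperation function from the identifiers of
# two superimposed partial images, via the functions assigned to each image.
PARTIAL_IMAGE_FUNCTIONS = {
    "102b": "scan function",
    "130b": "screen display function",
    "130a": "data storage function",
}

FUNCTION_PAIR_TO_COOPERATION = {
    frozenset({"scan function", "screen display function"}): "scan transfer display function",
    frozenset({"scan function", "data storage function"}): "scan transfer storage function",
}

def identify_from_partial_images(image_id_a, image_id_b):
    """Return the cooperation function for the two superimposed partial images, if any."""
    functions = {PARTIAL_IMAGE_FUNCTIONS[image_id_a], PARTIAL_IMAGE_FUNCTIONS[image_id_b]}
    return FUNCTION_PAIR_TO_COOPERATION.get(frozenset(functions))

print(identify_from_partial_images("102b", "130b"))  # -> "scan transfer display function"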

  According to the above processing, when a device to be linked has a plurality of functions, the functions are individually designated, and information related to the linked function using the designated function is preferentially displayed. Thereby, the cooperation function predicted to be used by the user is preferentially displayed.

  Further, the display priority of the cooperation function may be changed according to the order in which the partial images are superimposed. In this case, information related to the cooperation function that uses the function associated with the superimposed partial images is preferentially displayed.

  Each of the robot apparatus 10 and the terminal device 14 is realized, as an example, by cooperation of hardware and software. Specifically, each of the robot apparatus 10 and the terminal device 14 includes one or a plurality of processors such as a CPU (not shown). The functions of the units of the robot apparatus 10 and the terminal device 14 are realized when the one or plurality of processors read and execute a program stored in a storage device (not shown). The program is stored in the storage device via a recording medium such as a CD or DVD, or via a communication path such as a network. As another example, each unit of the robot apparatus 10 and the terminal device 14 may be realized by hardware resources such as a processor, an electronic circuit, or an ASIC (Application Specific Integrated Circuit); a device such as a memory may be used in that realization. As yet another example, each unit of the robot apparatus 10 and the terminal device 14 may be realized by a DSP (Digital Signal Processor), an FPGA (Field Programmable Gate Array), or the like.

  DESCRIPTION OF SYMBOLS 10 Robot apparatus, 12 Device, 14 Terminal device, 34 Owned function management information, 36 Non-owned function management information, 38 Solution means management information, 40 Device function management information, 42 Cooperation function management information, 44 Status collection unit, 46 Moving unit, 48 Working unit, 54 Control unit, 56 Detection unit, 58 Solution means identification unit, 60 Determination unit, 62 Search unit, 64 Identification unit.

Claims (43)

  1. A situation collecting means for detecting a surrounding situation and a person existing around the apparatus;
    a storage unit that stores information indicating a threshold that is determined for each attribute of a person and that is used to determine whether or not a problem with the situation has occurred;
    a detection means for determining that the problem has occurred when a value representing a detection result of the situation detected by the situation collecting means is equal to or greater than the threshold; and
    a control means for performing, when a solving means for solving the problem cannot be executed by the functions of the own apparatus alone, control for executing a solving means that utilizes an element other than the own apparatus,
    wherein the detection means further changes the threshold according to the attribute of the person detected by the situation collecting means by referring to the information stored in the storage unit.
    Robot apparatus.
  2. The element includes at least one of a device and a person other than the device.
    The robot apparatus according to claim 1.
  3. The control means preferentially executes control for executing the solution means not using a person when no person is detected,
    The robot apparatus according to claim 2.
  4. The control means causes a device other than the own apparatus to execute the solution means.
    The robot apparatus according to claim 2 or claim 3, wherein
  5. The control means communicates with a device other than the device itself to control the device other than the device itself, thereby causing the device other than the device itself to execute the solution means.
    The robot apparatus according to claim 4, wherein:
  6. The control means controls a direct operation to an operation unit of a device other than the device itself to cause the device other than the device itself to execute the solution means.
    The robot apparatus according to claim 4, wherein:
  7. In the case where control is not possible by communicating with a device other than the device itself, the control means controls the direct operation.
    The robot apparatus according to claim 6.
  8. The solving means is a means executed by collaborative work of at least two of the own device, a device other than the own device, and a person.
    The robot apparatus according to claim 2, wherein the robot apparatus is characterized.
  9. The control means controls output of information indicating the solving means as control for executing the solving means.
    The robot apparatus according to any one of claims 1 to 8, wherein the robot apparatus is characterized.
  10. The information indicating the solving means includes information indicating the situation,
    The robot apparatus according to claim 9.
  11. The control means controls output of information indicating the solving means to the terminal device;
    The robot apparatus according to claim 9 or 10, wherein:
  12. The solving means is executed based on an instruction of a user who uses the terminal device.
    The robot apparatus according to claim 11.
  13. The instruction is an instruction to execute a function that can be executed by using a device that exists around the device.
    The robot apparatus according to claim 12, wherein:
  14. Further comprising an identifying means for identifying a device existing around the own device based on at least one of position information of the device, a surrounding image, and a communication status,
    The robot apparatus according to claim 13.
  15. Further comprising transmission means for transmitting information indicating the instruction to devices existing around the device.
    The robot apparatus according to any one of claims 12 to 14, wherein the robot apparatus is characterized.
  16. The control unit controls display of information indicating the solution unit;
    The robot apparatus according to claim 9, wherein the robot apparatus is characterized.
  17. The control means further controls display of a device image associated with a device capable of executing the solution means.
    The robot apparatus according to claim 16.
  18. A function that uses the device associated with the device image specified by the user is executed.
    The robot apparatus according to claim 17.
  19. When a plurality of device images are designated by the user, a cooperation function that uses a plurality of devices associated with the plurality of device images is executed.
    The robot apparatus according to claim 17.
  20. When the problem of the situation based on a first detection result, which is based on information related to a person, differs from the problem of the situation based on a second detection result, which is based on information other than a person, the control means selects the solving means for solving the problem of the situation based on a priority, determined in advance, of the first detection result or the second detection result,
    The robot apparatus according to any one of claims 1 to 19.
  21. The information about the person is information including an image representing the person.
    The robot apparatus according to claim 20.
  22. The image is an image representing at least a part of a human body.
    The robot apparatus according to claim 21.
  23. The image is an image representing a human face,
    The robot apparatus according to claim 22.
  24. Information about a person is information that includes the voice of the person.
    The robot apparatus according to any one of claims 20 to 23.
  25. When the element other than the device itself is available for a fee, the control means controls a payment operation for using the element other than the device itself.
    The robot apparatus according to any one of claims 1 to 24, wherein:
  26. The payment operation includes a payment operation using at least one of electronic currency and virtual currency.
    The robot apparatus according to claim 25 , wherein:
  27. When the device itself does not have a payment capability, the control means performs the payment operation with payment assistance from an element other than the device itself.
    The robot apparatus according to claim 25 or claim 26 , wherein
  28. The control means obtains money for payment from a person who may be involved in an action for solving the problem;
    The robot apparatus according to claim 27 .
  29. Further comprising a search means for searching for a solving means for solving the problem,
    The robot apparatus according to any one of claims 1 to 28, wherein:
  30. The search means searches for the solution means using the Internet.
    30. The robot apparatus according to claim 29 .
  31. If the solution means cannot be determined only by the device itself, the control means controls execution of a mode for inquiring the user
    The robot apparatus according to any one of claims 1 to 30.
  32. In the mode for inquiring the user, the control means controls further collection of information on the situation in response to a request from the user.
    The robot apparatus according to claim 31.
  33. The problem that the device can solve is changed according to the update of the function of the device.
    The robot apparatus according to any one of claims 1 to 32.
  34. The update of the function is performed by changing at least one of hardware and software included in the device.
    The robot apparatus according to claim 33.
  35. The control means repeats control for executing the solution means until the problem is solved.
    The robot apparatus according to any one of claims 1 to 34.
  36. When the solving means is not executed, the control means performs control for executing another solving means.
    The robot apparatus according to any one of claims 1 to 35.
  37. It further has a moving means for moving to another place,
    The robot apparatus according to any one of claims 1 to 36.
  38. The moving means has at least one of a function of moving on land, a function of flying, and a function of moving underwater.
    The robot apparatus according to claim 37 , wherein:
  39. It further has a communication means for communicating with devices other than the device itself by changing the communication method according to the surrounding environment.
    The robot apparatus according to any one of claims 1 to 38, wherein:
  40. The surrounding environment relates to at least one of a distance between the surrounding devices and the presence or absence of an obstacle.
    40. The robot apparatus according to claim 39 .
  41. In the case where the solving means is a means for invoking a human action for solving the problem, the control means controls notification of a procedure of the action for solving the problem.
    The robot apparatus according to any one of claims 1 to 40.
  42. The problem that cannot be solved by the functions of the own device is a problem that cannot be solved by the functions of the own device alone, a problem for which the time required for solution by the functions of the own device alone exceeds a time threshold, or a problem for which the quality of the work result achieved by the functions of the own device alone falls below a predetermined quality,
    The robot apparatus according to any one of claims 1 to 41.
  43. A program causing a computer to function as:
    a situation collecting means for detecting a surrounding situation and a person existing around the apparatus;
    a detection means for determining, with reference to a storage means that stores information indicating a threshold that is determined for each attribute of a person and that is used to determine whether or not a problem with the situation has occurred, that the problem has occurred when a value representing a detection result of the situation detected by the situation collecting means is equal to or greater than the threshold; and
    a control means for performing, when a solving means for solving the problem cannot be executed by the functions of the apparatus alone, control for executing a solving means that utilizes an element other than the apparatus,
    wherein the detection means further changes the threshold according to the attribute of the person detected by the situation collecting means by referring to the information stored in the storage means.
JP2017002408A 2017-01-11 2017-01-11 Robot apparatus and program Active JP6439806B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017002408A JP6439806B2 (en) 2017-01-11 2017-01-11 Robot apparatus and program

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2017002408A JP6439806B2 (en) 2017-01-11 2017-01-11 Robot apparatus and program
US15/642,665 US20180104816A1 (en) 2016-10-19 2017-07-06 Robot device and non-transitory computer readable medium
CN201710924391.XA CN108297092A (en) 2017-01-11 2017-09-30 Robot device and its control method

Publications (2)

Publication Number Publication Date
JP2018111154A JP2018111154A (en) 2018-07-19
JP6439806B2 true JP6439806B2 (en) 2018-12-19

Family

ID=62869849

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2017002408A Active JP6439806B2 (en) 2017-01-11 2017-01-11 Robot apparatus and program

Country Status (2)

Country Link
JP (1) JP6439806B2 (en)
CN (1) CN108297092A (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003291083A (en) * 2002-03-28 2003-10-14 Toshiba Corp Robot device, robot controlling method, and robot delivery system
JP2005111637A (en) * 2003-10-10 2005-04-28 Ntt Data Corp Network robot service system
JP4588359B2 (en) * 2004-05-07 2010-12-01 富士通株式会社 Network robot function providing system and function providing method
JP2007249801A (en) * 2006-03-17 2007-09-27 Nippon Telegr & Teleph Corp <Ntt> Robot cooperation system
JP2007245317A (en) * 2006-03-17 2007-09-27 Nippon Telegr & Teleph Corp <Ntt> Robot controller, program, and robot control method
JP6069607B2 (en) * 2013-03-26 2017-02-01 株式会社国際電気通信基礎技術研究所 Robot service linkage system and platform

Also Published As

Publication number Publication date
CN108297092A (en) 2018-07-20
JP2018111154A (en) 2018-07-19


Legal Events

Date Code Title Description
A975 Report on accelerated examination

Free format text: JAPANESE INTERMEDIATE CODE: A971005

Effective date: 20180521

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20180529

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20180727

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20181023

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20181105

R150 Certificate of patent or registration of utility model

Ref document number: 6439806

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150