CN108297092A - Robot device and its control method - Google Patents
- Publication number
- CN108297092A (application no. CN201710924391.XA)
- Authority
- CN
- China
- Prior art keywords
- function
- robot device
- information
- solution
- robot
- Prior art date
- Legal status: Granted
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
Abstract
Robot device and its control method. A robot device includes a detector and a controller. The detector detects an ambient condition. If the robot device cannot execute, by using its own functions, a solution for solving a problem in the ambient condition, the controller performs control so that the solution is executed by using an element other than the robot device.
Description
Technical field
The present invention relates to a robot device and a control method therefor.
Background art
Japanese Unexamined Patent Application Publications No. 2014-188597, No. 5-65766, and No. 2005-111637 disclose robot devices that perform processing in cooperation with another device.
Summary of the invention
In general, the solutions with which a robot device solves problems are fixed. For this reason, the robot device may be unable to solve a problem by flexibly changing the solution in accordance with the situation it encounters.
An object of the present invention is thus to enable a robot device to solve a problem in cooperation with another element in accordance with the situation.
According to the first aspect of the invention, there is provided a robot device including a detector and a controller. The detector detects an ambient condition. If the robot device cannot execute, by using its own functions, a solution for solving a problem in the ambient condition, the controller performs control so that the solution is executed by using an element other than the robot device.
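The control flow of the first aspect can be sketched as follows. This is a minimal illustrative sketch, not code from the patent; the function names, the set-based capability model, and the element registry are all hypothetical.

```python
def choose_executor(required_functions, own_functions, external_elements):
    """Return 'robot' if the robot device can execute the solution by
    itself, otherwise the name of an element other than the robot device
    (a person or another device) that can supply the missing functions,
    or None if no element can."""
    missing = [f for f in required_functions if f not in own_functions]
    if not missing:
        return "robot"  # the robot's own functions suffice
    for name, functions in external_elements.items():
        if all(f in functions for f in missing):
            return name  # delegate to this external element
    return None

# Example: the robot can detect a fire but not extinguish it, so the
# solution is delegated to a hypothetical sprinkler element.
executor = choose_executor(
    required_functions={"detect", "extinguish"},
    own_functions={"detect", "move"},
    external_elements={"sprinkler": {"extinguish"}, "person": {"report"}},
)
```

Under these assumptions, `executor` is `"sprinkler"`.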
According to the second aspect of the invention, in the robot device according to the first aspect, the element may include at least one of a person and a device other than the robot device.
According to the third aspect of the invention, in the robot device according to the second aspect, the controller may control a device other than the robot device by communicating with the device, so that the device executes the solution.
According to the fourth aspect of the invention, in the robot device according to the second aspect, the controller may control a device other than the robot device by directly operating an operation unit of the device, so that the device executes the solution.
According to the fifth aspect of the invention, in the robot device according to the fourth aspect, if the controller cannot control the device other than the robot device by communicating with the device, the controller may control the device through the operation unit of the device.
According to the sixth aspect of the invention, in the robot device according to any one of the second to fifth aspects, the solution may be executed through cooperation among at least two of the robot device, the device other than the robot device, and the person.
According to the seventh aspect of the invention, in the robot device according to any one of the first to sixth aspects, the controller may perform control so that information indicating the solution is displayed.
According to the eighth aspect of the invention, in the robot device according to the seventh aspect, the controller may further perform control so that a device image associated with a device capable of executing the solution is displayed.
According to the ninth aspect of the invention, in the robot device according to the eighth aspect, a function of the device associated with a device image specified by a user may be executed.
According to the tenth aspect of the invention, in the robot device according to the eighth aspect, if the user specifies multiple device images, a cooperative function that uses the multiple devices associated with the multiple device images may be executed.
According to the eleventh aspect of the invention, in the robot device according to any one of the first to tenth aspects, the detector may detect, as the ambient condition, information concerning a person around the robot device and information concerning the surroundings other than the person.
According to the twelfth aspect of the invention, in the robot device according to the eleventh aspect, when a value representing the information concerning the surroundings other than the person is equal to or higher than a threshold, the detector determines that a problem has occurred in the ambient condition.
According to the thirteenth aspect of the invention, in the robot device according to the twelfth aspect, the threshold may be changed in accordance with the information concerning the person detected by the detector.
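The twelfth and thirteenth aspects together describe threshold-based problem detection in which the threshold depends on information about nearby people. The following is a hedged sketch; the field name and the adjustment rule (halving the threshold when a person appears distressed) are invented for illustration.

```python
def problem_detected(env_value, base_threshold, person_info=None):
    """Flag a problem when a measured environment value reaches a
    threshold; the threshold is lowered when information about a nearby
    person suggests reacting earlier (thirteenth aspect)."""
    threshold = base_threshold
    if person_info and person_info.get("distressed"):
        threshold *= 0.5  # hypothetical rule: react earlier for a person in trouble
    return env_value >= threshold
```

For example, a room temperature of 30 would not be flagged against a base threshold of 40, but with a distressed person nearby the lowered threshold of 20 flags it.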
According to the fourteenth aspect of the invention, in the robot device according to any one of the eleventh to thirteenth aspects, if a problem in the ambient condition detected from a first detection result based on the information concerning the person differs from a problem in the ambient condition detected from a second detection result based on the information concerning the surroundings other than the person, the controller may select a solution for solving the problem in accordance with a priority level determined for the first detection result or a priority level determined for the second detection result.
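The priority-based selection of the fourteenth aspect can be sketched as follows. The detection-result layout (a dict with `problem`, `solution`, and `priority` keys) is an assumption for illustration, not the patent's data model.

```python
def select_solution(first, second):
    """Given a person-based and an environment-based detection result,
    return the agreed solution when both detect the same problem, and
    otherwise the solution of the higher-priority detection result."""
    if first["problem"] == second["problem"]:
        return first["solution"]
    winner = first if first["priority"] >= second["priority"] else second
    return winner["solution"]
```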
According to the fifteenth aspect of the invention, in the robot device according to any one of the first to fourteenth aspects, if the element other than the robot device is available on a chargeable basis, the controller controls a payment operation of the robot device for using the element.
According to the sixteenth aspect of the invention, in the robot device according to the fifteenth aspect, if the robot device does not have any means of payment, the controller may cause the robot device to perform the payment operation by receiving support from an element.
According to the seventeenth aspect of the invention, the robot device according to any one of the first to sixteenth aspects may further include a communication unit that communicates with a device other than the robot device by a communication method that is changed in a manner suitable for the ambient environment.
According to the eighteenth aspect of the invention, in the robot device according to any one of the first to seventeenth aspects, if the solution is a solution to be executed by a person, the controller may perform control so that information concerning how to execute the solution is provided.
According to the nineteenth aspect of the invention, in the robot device according to any one of the first to eighteenth aspects, the case in which the robot device cannot solve the problem by using its own functions may be a case in which the problem cannot be solved by the functions of the robot device alone, a case in which solving the problem by using the functions of the robot device would take a threshold time or longer, or a case in which using the functions of the robot device would not produce a result of a sufficiently high quality.
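The nineteenth aspect's three criteria for "cannot solve by itself" (missing function, excessive time, insufficient quality) might be checked as in this sketch; the field names and default thresholds are illustrative only.

```python
def can_solve_alone(solution, own_functions,
                    time_threshold=60.0, quality_threshold=0.8):
    """Return True only if the robot has every function the solution
    requires, can finish in under the threshold time, and can reach the
    required quality; otherwise the solution must be delegated."""
    has_functions = all(f in own_functions for f in solution["functions"])
    fast_enough = solution["estimated_time"] < time_threshold
    good_enough = solution["expected_quality"] >= quality_threshold
    return has_functions and fast_enough and good_enough
```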
According to the twentieth aspect of the invention, there is provided a control method for a robot device. The control method includes the steps of: detecting an ambient condition; and, if the robot device cannot execute, by using the functions of the robot device, a solution for solving a problem in the ambient condition, performing control so that the solution is executed by using an element other than the robot device.
According to the first, nineteenth, and twentieth aspects, the robot device can solve a problem in cooperation with another element in accordance with the situation.
According to the second aspect, the solution is executed by at least one of a person and a device.
According to the third and fourth aspects, the solution is executed by a device other than the robot device.
According to the fifth aspect, even if the controller cannot control a device by communicating with it, the solution using that device can still be executed.
According to the sixth aspect, the solution is executed also by using the robot device.
According to the seventh aspect, information indicating the solution is presented to the user.
According to the eighth aspect, a device capable of executing the solution is presented to the user.
According to the ninth and tenth aspects, the solution is executed by performing an operation on a device image.
According to the eleventh and twelfth aspects, detection accuracy is enhanced.
According to the thirteenth aspect, a problem is detected in a manner suitable for the person.
According to the fourteenth aspect, a solution is specified even if there is some inconsistency between multiple detection results.
According to the fifteenth and sixteenth aspects, a chargeable element can be used.
According to the seventeenth aspect, communication with a device can be performed by using a more suitable communication method.
According to the eighteenth aspect, it becomes easier for a person to solve the problem.
Description of the drawings
Exemplary embodiments of the present invention will be described in detail based on the following figures, wherein:
Fig. 1 is a block diagram of a device system according to an exemplary embodiment;
Fig. 2 illustrates the appearance of a robot device according to the exemplary embodiment;
Fig. 3 is a block diagram of the robot device according to the exemplary embodiment;
Fig. 4 is a block diagram of a terminal device;
Figs. 5 and 6 illustrate characteristics of wireless communication technologies;
Fig. 7 illustrates a provided function management table;
Fig. 8 illustrates a non-provided function management table;
Fig. 9 illustrates a solution management table;
Fig. 10 illustrates a device function management table;
Fig. 11 illustrates a cooperative function management table;
Fig. 12 is a flowchart illustrating an overview of situation checking processing;
Fig. 13 is a flowchart illustrating details of the situation checking processing;
Fig. 14 is a flowchart illustrating processing executed in a user decision-making mode;
Fig. 15 is a flowchart illustrating processing executed in an autonomous decision-making mode;
Fig. 16 is a flowchart illustrating control processing for executing a solution;
Fig. 17 is a view for explaining application scenario 1;
Fig. 18 is a view for explaining application scenario 2;
Fig. 19 is a view for explaining application scenario 3;
Fig. 20 is a view for explaining application scenario 4;
Figs. 21 to 30 illustrate examples of screens;
Fig. 31 is a schematic view illustrating the appearance of a multi-function device;
Figs. 32 to 35 illustrate examples of screens;
Fig. 36 is a sequence diagram illustrating connection processing;
Fig. 37 illustrates a cooperative function management table;
Figs. 38A to 39B illustrate examples of screens;
Fig. 40 illustrates a device function management table;
Figs. 41A and 41B illustrate examples of screens;
Fig. 42 illustrates a device function management table;
Fig. 43 illustrates a cooperative function management table; and
Figs. 44A to 46B illustrate examples of screens.
Detailed description
A device system serving as an information processing system according to an exemplary embodiment of the invention will be described below with reference to Fig. 1, which illustrates an example of this device system.
The device system according to the exemplary embodiment includes a robot device 10, one or more devices 12, and a terminal device 14. The robot device 10, the device 12, and the terminal device 14 may communicate with one another via a communication path N (such as a network) or via separate individual communication paths. The configuration of the device system shown in Fig. 1 is only an example; the robot device 10, the device 12, and the terminal device 14 do not have to communicate with one another.
Although the device system in the example in Fig. 1 includes a single device 12, it may include multiple devices 12. In that case, the multiple devices 12 may have the same function, or they may have different functions. Likewise, although the device system in the example in Fig. 1 includes a single terminal device 14, it may include multiple terminal devices 14. The device system may also include another device, such as a server.
The robot device 10 has a function of detecting the condition around the robot device 10 and finding a problem in the detected condition. The robot device 10 may detect the ambient condition while moving or while standing still. When the robot device 10 finds a problem, it determines whether it can solve the problem by itself and executes processing in accordance with the determination result. If the robot device 10 can solve the problem by itself, it does so by executing a solution for solving the problem. If the robot device 10 cannot solve the problem by itself, it performs control so that the solution can be executed by using an element other than the robot device 10. Examples of such an element are a person, a device other than the robot device 10, and a robot device other than the robot device 10. The robot device 10 itself may also serve as an element. The robot device 10 may cause another device to solve the problem, ask a person to solve the problem, cause another device and a person to solve the problem together, or solve the problem by working in cooperation with another device or a person.
Examples of the problem detected by the robot device 10 are a problem that can be solved only by the robot device 10, a problem that can be solved only by a device other than the robot device 10, a problem that can be solved only by a person, a problem that can be solved by cooperative work between another device and a person, a problem that can be solved by cooperative work between the robot device 10 and a person, a problem that can be solved by cooperative work between another device and the robot device 10, a problem that can be solved by cooperative work among another device, another robot device, and the robot device 10, and a problem that can be solved by cooperative work among another device, a person, and the robot device 10. That is, examples of the solution are a solution implemented only by the robot device 10, a solution implemented only by another device, a solution executed only by a person, a solution implemented by cooperative work between another device and a person, a solution implemented by cooperative work between the robot device 10 and a person, a solution implemented by cooperative work between another device and the robot device 10, a solution implemented by cooperative work among another device, another robot device, and the robot device 10, and a solution implemented by cooperative work among another device, a person, and the robot device 10.
The device 12 is a device having a specific function. Examples of the device 12 are an image forming apparatus having an image forming function, a personal computer (PC), a display device (such as a liquid crystal display or a projector), an aroma diffuser that emits a fragrance, a telephone, a clock, a watch, a camera, a surveillance camera, a vending machine, an air conditioner, an electric fan, and a humidifier. Devices other than these examples may also be included in the device system. Depending on its functions, a device may also be regarded as a robot device.
The terminal device 14 is, for example, a PC, a tablet PC, a smartphone, or a cellular phone. The terminal device 14 is used by a user, for example, to execute a solution for solving a problem.
In the exemplary embodiment, the robot device 10 detects the ambient condition and finds a problem in the detected condition. The robot device 10 then performs control so that a solution for solving the problem can be executed in accordance with the type of problem.
The individual devices included in the device system according to the exemplary embodiment will be discussed below.
Fig. 2 illustrates the appearance of the robot device 10. The robot device 10 is, for example, a humanoid robot; it may alternatively be another type of robot. In the example shown in Fig. 2, the robot device 10 has a torso 16, a head 18 provided above the torso 16, legs 20 provided below the torso 16, arms 22 provided on both sides of the torso 16, and fingers 24 provided at the tips of the arms 22.
The robot device 10 has various sensors, such as a visual sensor, a hearing sensor, a touch sensor, a taste sensor, and an olfactory sensor, and thus has functions corresponding to the five human senses of vision, hearing, touch, taste, and smell. Concerning touch, for example, the robot device 10 is able to understand and distinguish among superficial sensations (such as touch, pain, and temperature), deep sensations (such as pressure, position, and vibration), and cortical sensations (such as two-point discrimination and three-dimensional perception). The robot device 10 also has a sense of balance. A sensor, such as a camera 26, is provided in the head 18 of the robot device 10. The vision of the robot device 10 is implemented by recognizing images captured by the camera 26. A sound collector, such as a microphone, is provided in the robot device 10. The hearing of the robot device 10 is implemented by recognizing sound obtained by the microphone.
The robot device 10 may include a detector for detecting human brainwaves. For example, a brainwave detector is attached to a person, and the detector provided in the robot device 10 receives information transmitted from the brainwave detector.
The legs 20 correspond to a moving unit and are driven by power from a driving source, such as a motor. The robot device 10 can move by using the legs 20. The legs 20 may have the shape of human legs, may be rollers or tires, or may have another shape. The legs 20 are only an example of the moving unit. The robot device 10 may include another moving unit, for example, a component for flying (such as a propeller, wings, or an aircraft engine) or a component for moving under water (such as an underwater engine). That is, the robot device 10 may include, as the moving unit, at least one of a component for moving on the ground, a component for flying, and a component for moving under water. The robot device 10 does not have to include any moving unit.
The robot device 10 may have the ability to grab or carry an object by using the arms 22 and the fingers 24. The robot device 10 may have the ability to move while grabbing or holding an object.
The robot device 10 may have a function of outputting sound. The robot device 10 may have a communication function of sending data to and receiving data from another device. The robot device 10 may have the ability to communicate with a person, another device, or another robot device by making a sound or sending a communication message.
The robot device 10 may have a human-like decision-making ability implemented by machine learning using artificial intelligence (AI). Deep learning using a neural network or reinforcement learning that reinforces particular learning fields may be used. The robot device 10 may also have a function of searching the Internet for information, such as a solution for solving a problem.
The robot device 10 may communicate with another device by using the communication function so as to control the operation of that device. The robot device 10 may operate another device by using a remote controller, or may directly operate the device without using a remote controller. When directly operating another device, the robot device 10 manipulates an operation unit (such as buttons or a panel) provided in that device. If the robot device 10 cannot control the operation of a device by communicating with it, the robot device 10 may operate the device by using a remote controller or may operate the device directly. By analyzing images obtained by the visual sensor, for example, the robot device 10 can identify the operation unit or the remote controller of another device and operate the device or the remote controller.
The robot device 10 may include a display 28. On the display 28, information concerning a problem, information concerning a solution, various messages, and so on are displayed.
The configuration of the robot device 10 will be described in detail below with reference to the block diagram of Fig. 3.
The communication unit 30, which is a communication interface, has a function of sending data to and receiving data from another device. The communication unit 30 may be a communication interface having a wireless communication function or a communication interface having a wired communication function. The communication unit 30 supports one or more communication methods and communicates with a communication partner in accordance with a communication method suitable for the partner (that is, a communication method supported by the partner). Examples of the communication methods are infrared communication, visible light communication, Wi-Fi (registered trademark) communication, near-field communication (such as Bluetooth (registered trademark)), and radio frequency identification (RFID). The communication unit 30 switches the communication method in accordance with the communication partner or the ambient environment (for example, the distance between the robot device 10 and the partner, or the presence or absence of an obstacle between them). The frequency band used for communication may be a short-wavelength band of 800 to 920 MHz (such as low power wide area (LPWA)) or a long-wavelength band of 2.4 GHz or 5 GHz (such as MuLTEfire). The communication unit 30 switches the frequency band in accordance with the communication partner, or switches the communication method in accordance with the ambient environment.
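The communication-method switching described above might look like the following sketch. The method names, ranges, and preference order are assumptions for illustration, not values from the patent.

```python
def choose_method(partner_methods, distance_m, obstacle):
    """Pick a communication method the partner supports, preferring
    shorter-range methods when the partner is close and there is no
    obstacle, and falling back to Wi-Fi otherwise."""
    if distance_m < 0.1 and not obstacle and "rfid" in partner_methods:
        return "rfid"
    if distance_m < 10 and not obstacle and "bluetooth" in partner_methods:
        return "bluetooth"
    if "wifi" in partner_methods:
        return "wifi"
    return None  # no supported method fits this environment
```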
The storage unit 32 is a storage device, such as a hard disk or a memory (for example, a solid state drive (SSD)). In the storage unit 32, provided function management information 34, non-provided function management information 36, solution management information 38, device function management information 40, and cooperative function management information 42 are stored. Various items of data and various programs are also stored. Device address information indicating the addresses of other devices may also be stored in the storage unit 32. The above-described items of information may be stored in the same storage device or in different storage devices.
The provided function management information 34 is information indicating the functions provided in the robot device 10. The provided function management information 34 also indicates associations between the individual functions provided in the robot device 10 and the individual operations (including processing and manipulation) executable by using these functions. Referring to the provided function management information 34 makes it possible to specify (identify) the operations that can be executed by the robot device 10. A function not indicated in the provided function management information 34 is a function not provided in the robot device 10; referring to the provided function management information 34 thus also makes it possible to specify (identify) the operations that cannot be executed by the robot device 10.
The non-provided function management information 36 is information indicating the functions not provided in the robot device 10. The non-provided function management information 36 also associates the individual functions not provided in the robot device 10 with the individual operations (including processing and manipulation) that the robot device 10, lacking these functions, cannot execute. Referring to the non-provided function management information 36 makes it possible to specify (identify) the operations that cannot be executed by the robot device 10. A function not indicated in the non-provided function management information 36 may be a function provided in the robot device 10; referring to the non-provided function management information 36 thus also makes it possible to specify (identify) the operations that can be executed by the robot device 10.
Both the provided function management information 34 and the non-provided function management information 36 may be stored in the storage unit 32, in which case the operations that can be executed and the operations that cannot be executed by the robot device 10 can be specified based on both items of information. Alternatively, only one of the provided function management information 34 and the non-provided function management information 36 may be stored in the storage unit 32, in which case the operations that can or cannot be executed by the robot device 10 are specified based on whichever item of information is stored.
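The provided and non-provided function management information could be modeled as in this sketch: two tables mapping functions to the operations they enable or preclude, with a lookup returning True, False, or None (not listed in either table). The table contents are invented for illustration.

```python
# Hypothetical table contents: function -> operations it enables/precludes.
PROVIDED = {"grasp": ["carry object"], "move": ["patrol"]}
NON_PROVIDED = {"print": ["print document"]}

def executable(operation):
    """True if some provided function enables the operation, False if a
    non-provided function is required for it, None if unknown."""
    if any(operation in ops for ops in PROVIDED.values()):
        return True
    if any(operation in ops for ops in NON_PROVIDED.values()):
        return False
    return None
```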
The solution management information 38 is information indicating solutions for solving problems, that is, how to take action to solve them. The solution management information 38 also indicates associations between individual problems that may occur and the solutions for solving these problems.
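A hedged sketch of the solution management information 38 follows: an association from problems to solutions, where each entry also names the functions the solution requires so that the robot can tell whether it can act alone. All entries are invented for illustration.

```python
# Hypothetical problem -> solution associations.
SOLUTIONS = {
    "room too hot": {"actions": ["turn on air conditioner"],
                     "functions": ["operate device"]},
    "trash on floor": {"actions": ["pick up trash"],
                       "functions": ["move", "grasp"]},
}

def lookup_solution(problem, own_functions):
    """Return (actions, solvable_alone) for a known problem, or
    (None, False) when no solution is registered."""
    entry = SOLUTIONS.get(problem)
    if entry is None:
        return None, False
    alone = all(f in own_functions for f in entry["functions"])
    return entry["actions"], alone
```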
The device function management information 40 is information for managing the functions of devices other than the robot device 10. The device function management information 40 indicates associations between device identification information for identifying a device and function information indicating the functions provided in that device. Examples of the device identification information are a device ID, a device name, a device type, a device model number, position information indicating the position at which the device is installed, and an image representing the appearance of the device. Examples of the function information are a function ID and a function name. If an image forming apparatus, which is a device other than the robot device 10, has a scan function, a copy function, and a scan-and-transfer function, function information indicating the scan function, function information indicating the copy function, and function information indicating the scan-and-transfer function are associated with the device identification information for identifying this image forming apparatus. Referring to the device function management information 40 makes it possible to specify (identify) the functions of each device.
Examples of the devices managed by the device function management information 40 are the devices included in the device system (such as the device 12). Devices not included in the device system may also be managed by the device function management information 40. The robot device 10 may obtain information (including device identification information and function information) concerning a new device not included in the device system and register the obtained information in the device function management information 40. The information concerning the new device may be obtained, for example, by using the Internet or as a result of an administrator inputting this information. The robot device 10 may update the device function management information 40 regularly or at a specific time, such as a time specified by the administrator. As a result of this update operation, function information concerning a function that was not provided in a device before the update operation may be registered in the device function management information 40 after the update operation. Conversely, function information concerning a function that was provided in a device before the update operation may be deleted from the device function management information 40, or information indicating that the function has been deactivated may be registered instead. The update information may likewise be obtained, for example, by using the Internet or as a result of an administrator inputting this information.
The cooperative function management information 42 is information for managing cooperative functions, each of which is executable by combining multiple functions. By combining multiple functions, one or more cooperative functions are executed. A cooperative function may be executed by combining multiple functions of a single device or by combining multiple functions of multiple devices. The device that gives the operation instruction (for example, the terminal device 14) may also be included among the devices to be identified, and a function provided in this device may be used as part of a cooperative function. A function provided in the robot device 10 may likewise be used as part of a cooperative function.
A cooperative function may be a function that is executable without using a hardware device. For example, a cooperative function may be a function executed by combining multiple items of software. Alternatively, a cooperative function may be a function executed by combining a function of a hardware device and a function implemented by software.
The cooperative function management information 42 indicates associations between combinations of items of function information concerning the functions to be combined for a cooperative function and cooperative function information indicating this cooperative function. The cooperative function information indicates, for example, a cooperative function ID and a cooperative function name. When a function to be used alone is updated, the cooperative function management information 42 is also updated accordingly. As a result of this update operation, a cooperative function that was not executable by combining certain functions before the update operation may become executable after the update operation. Conversely, a cooperative function that was executable by combining certain functions before the update operation may become non-executable after the update operation. Cooperative function information indicating a cooperative function that has become executable after the update operation may be registered in the cooperative function management information 42. Cooperative function information indicating a cooperative function that has become non-executable after the update operation may be deleted from the cooperative function management information 42, or information indicating that the cooperative function has been deactivated may be registered instead.
If multiple devices are combined to execute a synergistic function, the synergistic function management information 42 is information for managing synergistic functions executable by combining multiple functions of multiple devices. In this case, the synergistic function management information 42 indicates associations between combinations of items of device identification information for identifying the individual devices used for a synergistic function and synergistic function information indicating that synergistic function. When the device function management information 40 is updated, the synergistic function management information 42 is also updated accordingly. As a result of this update operation, a synergistic function that was not executable by combining multiple functions of multiple devices before the update operation may become executable after the update operation. Conversely, a synergistic function that was executable by combining multiple functions of multiple devices before the update operation may become unexecutable after the update operation.
A synergistic function may be a function executable by combining different functions, or a function executable by combining identical functions of different devices. A synergistic function may also be a function that is not executable unless different functions, or identical functions of different devices, are combined. Such a synergistic function may be a function executable by combining different functions, or a function executable by combining identical functions of different devices. For example, by combining a device having a print function (a printer) and a device having a scan function (a scanner), a copy function is implemented as a synergistic function. That is, by combining the print function and the scan function, the copy function is implemented. In this case, the copy function, which is a synergistic function, and the combination of the print function and the scan function are associated with each other. In the synergistic function management information 42, synergistic function information indicating this synergistic function (that is, the copy function) and the combination of the device identification information for identifying the device having the print function and the device identification information for identifying the device having the scan function are associated with each other.
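As a hedged illustration of the association just described (the patent specifies no data structures), the mapping from a combination of single functions to the synergistic function it implements could be modeled as follows; the function names `print`, `scan`, and `copy` follow the copier example, while the dictionary layout is an assumption:

```python
# Hypothetical sketch of one part of synergistic function management information 42:
# a combination of single functions maps to the synergistic function it implements.
# A frozenset key makes the lookup independent of the order of the functions.
SYNERGISTIC_FUNCTIONS = {
    frozenset({"print", "scan"}): "copy",
}

def find_synergistic_function(functions):
    """Return the synergistic function implemented by the given functions, if any."""
    return SYNERGISTIC_FUNCTIONS.get(frozenset(functions))
```

Under this sketch, combining a printer's print function with a scanner's scan function yields the copy function, and any unregistered combination yields nothing.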
In the storage unit 32, available function management information may be stored. The available function management information is information for managing the functions available to individual users, and indicates associations between user identification information for identifying a user and function information (which may include synergistic function information) indicating the functions available to that user. The functions available to a user may be functions provided to the user free of charge and functions purchased by the user. Such functions may be functions to be used alone or synergistic functions. The user identification information is, for example, user account information indicating a user ID and a user name. Referring to the available function management information makes it possible to identify (specify) the functions available to an individual user. The available function management information is updated every time a function is provided to a user (for example, every time a function is provided to a user for a charge or free of charge).
At least one of the provided function management information 34, the non-provided function management information 36, the solution management information 38, the device function management information 40, the synergistic function management information 42, and the available function management information may be stored in a device other than the robot device 10 (for example, a server, the device 12, or the terminal apparatus 14, not shown). In this case, such information stored in another device does not have to be stored in the storage unit 32 of the robot device 10.
The situation information collector 44 has a function of collecting, by using various sensors, information concerning the situation around the robot device 10. Hereinafter, such information will be referred to as "situation information". As the situation information collector 44, the above-described visual sensor, hearing sensor, touch sensor, taste sensor, and olfactory sensor may be used. As a result of the visual sensor capturing images around the robot device 10 (for example, video images and still images), the robot device 10 executes image recognition. As a result of the hearing sensor picking up sound (including voice) around the robot device 10, the robot device 10 executes speech recognition. Temperature, humidity, and smell around the robot device 10 are detected by using the other sensors. Sensors other than those described above may also be used for collecting information concerning the situation around the robot device 10. The situation information collector 44 may additionally collect situation information from devices and sensors other than those of the robot device 10.
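A minimal sketch of how the situation information collector 44 might aggregate readings from several sensors into a single situation record; the sensor names and the callable interface are illustrative assumptions, not part of the patent:

```python
# Hypothetical sketch of situation information collector 44: each registered
# sensor is a zero-argument callable returning its current value, and
# collect() gathers every value into one snapshot (the "situation information").
class SituationInfoCollector:
    def __init__(self):
        self.sensors = {}

    def register(self, name, read_fn):
        self.sensors[name] = read_fn

    def collect(self):
        """Return a snapshot of all current sensor values."""
        return {name: read() for name, read in self.sensors.items()}
```

A detector could then inspect one snapshot rather than polling each sensor separately.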
The moving unit 46 has a function of moving the robot device 10 by using at least one of a component for moving on the ground, a component for flying, and a component for moving under water. The moving unit 46 is constituted by, for example, the legs 20 shown in Fig. 2.
The operating unit 48 has a function of operating devices other than the robot device 10 and of lifting and carrying objects. The operating unit 48 is constituted by, for example, the legs 20, the arms 22, and the fingers 24 shown in Fig. 2.
The user interface (UI) 50 includes a display (for example, the display 28 shown in Fig. 2) and an operating unit. The display is a display device (such as a liquid crystal display). The operating unit is an input device (such as a touch pad or a keyboard). The UI 50 may be a user interface serving as both a display and an operating unit (such as a touch display, or a device that displays a numeric keypad on a display). The robot device 10 does not have to include the UI 50, or may include only hardware keys (such as various buttons) without a display. Examples of buttons serving as hardware keys are buttons dedicated to numeric input (such as a numeric keypad) and buttons dedicated to indicating directions (such as direction indicator keys).
The speaker 52 has a function of outputting sound and voice. Voice for a solution (such as a message requesting a person to solve a problem) is output from the speaker 52.
The controller 54 controls the operations of the individual elements of the robot device 10. The controller 54 includes a detector 56, a solution specifying unit 58, a judging unit 60, a search unit 62, and a recognition unit 64.
The detector 56 has a function of detecting the situation around the robot device 10 based on the situation information collected by the situation information collector 44 (for example, the values of the various sensors), and of finding a problem by determining whether a problem has occurred around the robot device 10. The detector 56 detects, as the situation, information concerning the people around the robot device 10 and information concerning the surroundings other than people. Examples of the information concerning people are images captured by the visual sensor (for example, face images, whole-body images, and images showing the movements of people) and voice picked up by the hearing sensor. Examples of the information concerning the surroundings other than people are temperature information obtained by a temperature sensor and humidity information obtained by a humidity sensor. More specifically, the detector 56 detects the surrounding situation and finds a problem by using a combination of the images captured by the visual sensor, the sound and voice picked up by the hearing sensor, information concerning touch obtained by the touch sensor, information concerning taste obtained by the taste sensor, and information concerning smell obtained by the olfactory sensor. The detector 56 may also use information collected by sensors that are not included in the robot device 10. For example, the detector 56 may obtain information from sensors installed outside the robot device 10, such as a sensor installed in a room or a sensor provided in another device, and may find a problem by using such information.
The detector 56 may determine that a problem has occurred when a value obtained by a sensor (for example, a temperature obtained by the temperature sensor) equals or exceeds a threshold. The threshold may be changed according to the age or gender of a person detected by the various sensors provided in the robot device 10, so that a problem is found on a per-person basis. That is, people may perceive problems differently depending on their age or gender, and changing the threshold according to age or gender makes it possible to find problems according to the individual person.
The solution specifying unit 58 has a function of specifying (identifying), by referring to the solution management information 38, a solution for solving the problem detected by the detector 56.
If a problem detected from a detection result concerning people (for example, images and voice of people), corresponding to a first detection result, differs from a problem detected from a detection result based on information concerning the surroundings other than people, corresponding to a second detection result, the solution specifying unit 58 may select a solution based on the priority level of the first detection result and the priority level of the second detection result. In this case, the solution specifying unit 58 selects a solution based on the first detection result so that a problem occurring to a person is solved prior to a problem in the surroundings.
The judging unit 60 has a function of judging whether the robot device 10 is able to solve the problem detected by the detector 56. Examples of problems that the robot device 10 is unable to solve are a problem that cannot be solved by the functions of the robot device 10 alone, a problem that would require a certain amount of time (a preset time) to solve by using the functions of the robot device 10, and a problem for which using the functions of the robot device 10 would not produce a result of satisfactory quality (a preset quality). If the set of functions provided in the robot device 10 covers the solution specified by the solution specifying unit 58, the judging unit 60 judges that the problem can be solved by the robot device 10 alone. That is, if the robot device 10 has all the functions required for executing the selected solution, the judging unit 60 judges that the problem can be solved by the robot device 10 alone. Conversely, if the set of functions provided in the robot device 10 does not fully cover the solution specified by the solution specifying unit 58, the judging unit 60 judges that the problem cannot be solved by the robot device 10 alone. That is, if the robot device 10 lacks any or some of the functions required for executing the selected solution, the judging unit 60 judges that the problem cannot be solved by the robot device 10 alone. In order to specify the functions provided in the robot device 10 (or the functions not provided in the robot device 10), the judging unit 60 may refer to the provided function management information 34. Alternatively, in order to specify the functions not provided in the robot device 10 (or the functions provided in the robot device 10), the judging unit 60 may refer to the non-provided function management information 36.
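The coverage test performed by the judging unit reduces to a subset check: the robot can execute a solution alone only if its provided functions include every function the solution requires. A hedged sketch, in which the function names are invented for illustration:

```python
# Hypothetical sketch of judging unit 60's coverage check: the robot device can
# execute the selected solution alone only when its provided functions cover
# every function the solution requires.
def can_solve_alone(provided_functions, required_functions):
    return set(required_functions) <= set(provided_functions)

def missing_functions(provided_functions, required_functions):
    """Functions required by the solution but not provided in the robot device."""
    return set(required_functions) - set(provided_functions)
```

When `can_solve_alone` is false, `missing_functions` names the gaps that another device or a person would have to fill.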
If the judging unit 60 judges that the problem can be solved by the robot device 10 alone, the robot device 10 executes the solution specified by the solution specifying unit 58 under the control of the controller 54. Conversely, if the judging unit 60 judges that the problem cannot be solved by the robot device 10 alone, the controller 54 executes control so that the solution will be executed by using an element (another device or a person) other than the robot device 10. The controller 54 may cause a device other than the robot device 10 to execute the solution, may request a person to execute the solution, or may cause the robot device 10 and another element (for example, a person or another device) to execute the solution together.
The search unit 62 has a function of searching for a solution that solves the problem detected by the detector 56 and that is not registered in the solution management information 38. The search unit 62 searches for such a solution by using the Internet, for example.
The recognition unit 64 has a function of identifying (specifying) devices other than the robot device 10 and of identifying the functions provided in those devices. The recognition unit 64 may identify a device based on an image of the device captured by the visual sensor (for example, an image showing the appearance of the device), or based on device identification information indicated on the device and obtained from the image captured by the visual sensor. The recognition unit 64 may alternatively identify a device by obtaining position information indicating the position where the device is installed. The recognition unit 64 also identifies the functions of a device. To identify a function, the recognition unit 64 refers to the device function management information 40 stored in the storage unit 32, and specifies function information indicating the functions associated with the device identification information concerning the identified device.
The recognition unit 64 may also identify multiple devices to be combined as devices for cooperation. In this case, the recognition unit 64 refers to the synergistic function management information 42 stored in the storage unit 32, and specifies synergistic function information indicating the synergistic function associated with the combination of the items of device identification information concerning the multiple identified devices. This makes it possible to identify (specify) the synergistic function to be executed by combining the identified devices.
If the functions available to individual users are managed, the recognition unit 64 may receive user identification information for identifying a user. The recognition unit 64 then refers to the available function management information stored in the storage unit 32, and specifies function information indicating the functions available to the user indicated by the user identification information. This makes it possible to identify (specify) the set of functions available to that user. For example, the user identification information is sent from the terminal apparatus 14 to the robot device 10, and the recognition unit 64 specifies function information indicating the functions associated with the user identification information. More specifically, the recognition unit 64 receives device identification information and user identification information. The recognition unit 64 then refers to the device function management information 40 and specifies function information indicating the functions associated with the device identification information, and also refers to the available function management information and specifies function information indicating the functions associated with the user identification information. This makes it possible to specify the functions that are provided in the device specified by the device identification information and that are available to the user specified by the user identification information.
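The two-step lookup described above, device functions from the device function management information 40 and user-available functions from the available function management information, ends in an intersection. A sketch under assumed table layouts (the IDs, table contents, and names are placeholders):

```python
# Hypothetical table contents; in the patent these would come from the stored
# device function management information 40 and available function management
# information, respectively.
DEVICE_FUNCTIONS = {"B": {"print", "scan"}}    # device ID -> provided functions
AVAILABLE_FUNCTIONS = {"user01": {"print"}}    # user ID -> functions free or purchased

def usable_functions(device_id, user_id):
    """Functions both provided in the specified device and available to the user."""
    return DEVICE_FUNCTIONS.get(device_id, set()) & AVAILABLE_FUNCTIONS.get(user_id, set())
```

The set intersection implements the "provided in the device and available to the user" condition directly.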
The controller 54 may execute function purchase processing and may manage purchase histories. For example, if a user purchases a function for a charge, the controller 54 may execute charging processing for that user.
The controller 54 includes an intelligent unit, and the individual elements of the controller 54 are controlled by using the AI of this intelligent unit.
At least one of the detector 56, the solution specifying unit 58, the judging unit 60, the search unit 62, and the recognition unit 64 may be provided in a device other than the robot device 10 (for example, a server, the device 12, or the terminal apparatus 14, not shown). In this case, such an element provided in another device does not have to be included in the controller 54 of the robot device 10.
The configuration of the terminal apparatus 14 will be described in detail below with reference to the block diagram of Fig. 4.
A communication unit 66, which is a communication interface, has a function of sending data to and receiving data from other devices. The communication unit 66 may be a communication interface having a wireless communication function or a communication interface having a wired communication function.
A camera 68, which serves as an imaging unit, captures an image of a subject to generate image data (for example, still image data and video image data). Instead of the camera 68, an external camera connected to a communication path (such as a network) may also be used. In that case, the communication unit 66 may receive image data indicating an image captured by the external camera, and the UI 72 may display the image data so that the user can handle or use it. The terminal apparatus 14 does not have to include the camera 68.
A storage unit 70 is a storage device such as a hard disk or a memory (for example, an SSD). In the storage unit 70, various programs and various items of data are stored. Address information concerning the address of the robot device 10 and the addresses of other devices (for example, the device 12), information concerning identified devices, information concerning identified devices to be combined as devices for cooperation, information concerning the functions of identified devices, and information concerning synergistic functions may also be stored in the storage unit 70. These items of information may be stored in the same storage device or in different storage devices.
A UI 72 includes a display and an operating unit. The display is a display device (such as a liquid crystal display). The operating unit is an input device, such as a touch pad, a keyboard, or a mouse. The UI 72 may be a user interface serving as both a display and an operating unit (such as a touch display, or a device that displays a numeric keypad on a display).
A controller 74 controls the operations of the individual elements of the terminal apparatus 14. The controller 74, serving as a display controller (control unit), causes various items of information to be displayed on the display of the UI 72, for example.
On the display of the UI 72, images captured by the robot device 10, images captured by the camera 68, images linked with identified devices to be used (devices to be used alone and devices to be combined), and images linked with functions are displayed, for example. An image linked with a device may be an image representing the device captured by the robot device 10 (for example, a still image or a video image), an image that schematically represents the device (for example, an icon), or an image representing the device captured by the camera 68 (for example, a still image or a video image). Image data of a schematic image representing a device may be stored in the robot device 10 and supplied from the robot device 10 to the terminal apparatus 14. Such image data may alternatively be stored in the terminal apparatus 14 in advance, or may be stored in another device and supplied from that device to the terminal apparatus 14. An image linked with a function is an image (such as an icon) representing that function.
Wireless communication technologies will be discussed below with reference to Figs. 5 and 6. Fig. 5 illustrates the characteristics (merits and demerits) of wireless communication technologies by frequency. Fig. 6 illustrates the characteristics of wireless communication technologies by communication method.
As shown in Fig. 5, one of the principal standards of wireless communication technology using the 900 MHz band is RFID. Some advantages of RFID are high tolerance to obstacles and few interference bands (such as the interference band of a microwave oven). Some disadvantages of RFID are a large antenna size and a short communication range.
Some principal standards of wireless communication technology using the 2.4 GHz band are ZigBee (registered trademark) and Bluetooth. Some advantages of these communication technologies are low power consumption, high speed, and a small antenna size, while one disadvantage is the presence of multiple interference bands.
Some principal standards of wireless communication technology using the 5 GHz band are IEEE 802.11a and MuLTEfire. Some advantages of these communication technologies are few interference bands and high speed, while one disadvantage is low tolerance to obstacles.
As shown in Fig. 6, some advantages of infrared communication are low power consumption and ease of miniaturizing devices, while one disadvantage is that infrared light is invisible.
One advantage of visible light communication is that the communication path is easy to trace, while one disadvantage is high directivity.
One advantage of near field communication (NFC) is easy pairing between multiple devices, while one disadvantage is that communication is limited to short distances.
When communicating with a communication partner by using a wireless communication technology, the communication unit 30 of the robot device 10 communicates with the partner by using a wireless communication technology whose characteristics suit the surrounding environment and the communication partner. More specifically, the communication unit 30 communicates with the partner by switching the wireless communication technology according to the distance between the robot device 10 and the communication partner, the presence or absence of obstacles between them, and the communication methods supported by the communication partner.
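Drawing on the trade-offs of Figs. 5 and 6, the switching rule above could be sketched as a few conditionals; the distance cutoff and the particular standards chosen at each branch are illustrative assumptions only:

```python
# Hypothetical selection of a wireless technology from distance, obstacles, and
# the partner's supported methods, loosely following the merits/demerits of
# Figs. 5 and 6 (e.g., RFID tolerates obstacles; 5 GHz does not).
def choose_wireless(distance_m, obstacle_present, partner_supports):
    if distance_m < 0.1 and "NFC" in partner_supports:
        return "NFC"              # easy pairing, but short range only
    if obstacle_present and "RFID" in partner_supports:
        return "RFID"             # high tolerance to obstacles
    if "IEEE802.11a" in partner_supports:
        return "IEEE802.11a"      # fast, few interference bands, poor with obstacles
    return "Bluetooth" if "Bluetooth" in partner_supports else None
```

A real implementation would also weigh power consumption and interference bands; this sketch only shows the shape of the switching decision.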
The functions of the robot device 10 will be described in detail below with reference to Figs. 7 and 8. Fig. 7 shows an example of a provided function management table serving as the provided function management information 34. Fig. 8 shows an example of a non-provided function management table serving as the non-provided function management information 36.
In the example of the provided function management table shown in Fig. 7, a management number, information indicating a function provided in the robot device 10, and information indicating the operations (including processing and manipulation) executable by using that function are associated with each other. For example, the robot device 10 has a lifting function for lifting an object by using the arms 22, and is able to lift and carry an object of up to 30 kg by using this lifting function. The robot device 10 also has a moving function, and is able to move at variable speeds of up to 10 km per hour by using this moving function. Referring to the provided function management table makes it possible to specify (identify) the functions provided in the robot device 10 and the operations executable by using those functions. Functions and operations that are not registered in the provided function management table are functions not provided in the robot device 10 and operations that the robot device 10 is unable to execute. Referring to the provided function management table thus also makes it possible to specify (identify) the functions not provided in the robot device 10 and the operations that the robot device 10 is unable to execute.
In the example of the non-provided function management table shown in Fig. 8, a management number, information indicating a function not provided in the robot device 10, and information indicating the operations (including processing and manipulation) that the robot device 10 is unable to execute because it lacks that function are associated with each other. For example, the robot device 10 does not have a cooling function for the external environment (a function for cooling the surroundings of the robot device 10), and is thus unable to cool a room. The robot device 10 does not have a print function, and is thus unable to print the words of voice picked up by the robot device 10 or a manuscript seen by the robot device 10. Referring to the non-provided function management table makes it possible to specify (identify) the functions not provided in the robot device 10 and the operations that the robot device 10 is unable to execute. Functions and operations that are not registered in the non-provided function management table may be functions provided in the robot device 10 and operations executable by the robot device 10. Referring to the non-provided function management table thus also makes it possible to specify (identify) the functions provided in the robot device 10 and the operations executable by the robot device 10.
In the storage unit 32, both data indicating the provided function management table and data indicating the non-provided function management table may be stored, and the operations executable by the robot device 10, or the operations that the robot device 10 is unable to execute, may be specified based on these two items of data. Alternatively, only one of the two items of data may be stored in the storage unit 32, and the operations executable by the robot device 10, or the operations that the robot device 10 is unable to execute, may be specified based on the stored item of data.
A specific example of the solution management information 38 will be described with reference to Fig. 9. Fig. 9 shows an example of a solution management table serving as the solution management information 38. In the example of the solution management table shown in Fig. 9, a management number, information indicating a situation (problem), and information indicating a solution for solving that problem (situation) are associated with each other. The detector 56 detects a situation (problem). The solution specifying unit 58 refers to the solution management table and specifies a solution for solving the problem detected by the detector 56.
For example, if the situation (problem) "the room temperature is equal to or higher than a temperature threshold (for example, 30°C)" is detected, the solution for solving this problem is to lower the temperature in the room; more specifically, the solutions are "(1) cool the room with an air conditioner" and "(2) open the window". The room temperature is detected by a particular sensor (for example, the temperature sensor) provided in the robot device 10. The temperature threshold may be changed according to the age or gender of the people around the robot device 10. For example, the detector 56 of the robot device 10 detects a person around the robot device 10 based on information (for example, image and voice information) obtained by the various sensors (such as the visual sensor and the hearing sensor), and estimates the age and gender of the person. The temperature threshold for an elderly person (whose age is equal to or higher than an age threshold) and the temperature threshold for a young person (whose age is lower than the age threshold) may be different. For example, if an elderly person is detected, a temperature threshold lower than that used for a young person may be applied. This makes it possible to execute a suitable solution according to age. The temperature threshold may also be changed according to gender, which makes it possible to execute a suitable solution according to gender.
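The age-dependent threshold described above amounts to selecting a threshold per detected person and comparing the measured temperature against it. In this sketch the age cutoff and the elderly-person threshold are assumptions; the patent gives only 30°C as an example threshold:

```python
# Hypothetical sketch of the per-person temperature check: a lower threshold is
# assumed for an elderly person, as the text suggests, so the problem is found
# sooner for them.
AGE_THRESHOLD = 65          # assumed age cutoff (not specified in the patent)
THRESHOLD_YOUNG_C = 30.0    # example threshold from the text
THRESHOLD_ELDERLY_C = 28.0  # assumed lower threshold for an elderly person

def too_hot(room_temp_c, age):
    """Detect the 'room temperature equals or exceeds the threshold' problem."""
    threshold = THRESHOLD_ELDERLY_C if age >= AGE_THRESHOLD else THRESHOLD_YOUNG_C
    return room_temp_c >= threshold
```

For a room at 29°C, the problem would be detected for an elderly person but not for a young one under these assumed values.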
In another example, if the situation (problem) (3) "a person in the room seems cold" is detected, the solution for solving this problem is to raise the temperature in the room; more specifically, the solutions are "(1) turn on the heater" and "(2) close the window". The detector 56 of the robot device 10 detects situation (3) by observing, with the various sensors, the movements of the people around the robot device 10. More specifically, the detector 56 detects situation (3) from images showing the face and movements of a person in the room, based on the person's facial expression, the movements of the legs and feet, and perspiration.
If multiple inconsistent situations are detected, the detector 56 may preferentially select one of them, and the solution specifying unit 58 may specify a solution for solving the preferentially selected situation. For example, if situation (1) "the room temperature is 30°C or higher" is detected and situation (3) "a person in the room seems cold" is also detected, the detector 56 selects situation (3) prior to situation (1), and the solution specifying unit 58 specifies a solution for solving situation (3). The priority levels used for preferentially selecting a solution are determined in advance. For example, a situation detected based on information concerning people is selected prior to a situation detected based on information concerning the surroundings other than people. Situation (3) is a situation based on information concerning a person, for example, a situation detected by analyzing an image or the voice of the person, while situation (1) is a situation based on information concerning the surroundings other than people. By selecting situation (3) prior to situation (1), a solution for a problem that may be occurring to a person around the robot device 10 is preferentially selected, and that problem is thus likely to be solved.
The device function management information 40 will be discussed in detail below with reference to Fig. 10. Fig. 10 shows an example of a device function management table serving as the device function management information 40. In the example of the device function management table shown in Fig. 10, a device ID, information indicating a device name (for example, the type of device), information indicating the functions of the device (function information), and an image ID are associated with each other. The device ID and the device name are examples of device identification information. The image ID is an example of image identification information for identifying an image representing a device (for example, an image showing the appearance of the device, or an image such as an icon that schematically represents the device). The image ID does not have to be included in the device function management table. For example, the device with device ID "B" is a multifunction device (an image forming apparatus having multiple image forming functions), and has, for example, a print function and a scan function. An image ID for identifying an image representing this device is associated with the device. Image data representing a device image is stored in the storage unit 32 of the robot device 10 or in another device.
By using the various sensors of the robot device 10, the detector 56 detects the device ID for identifying a device around the robot device 10, and the recognition unit 64 refers to the device function management table and specifies the device name, functions, and image ID associated with that device ID. This makes it possible to identify the devices around the robot device 10. The information indicating the device name and the image data representing the device image may be sent from the robot device 10 to the terminal apparatus 14 and displayed on the terminal apparatus 14. The image representing a device is displayed as an image linked with that device. The image linked with a device may be an image captured by a camera or an image (for example, an icon) that schematically represents the device. If the user specifies an image linked with a device on the terminal apparatus 14, information concerning the functions of that device (function information or description information concerning the functions) may be sent from the robot device 10 to the terminal apparatus 14 and displayed on the terminal apparatus 14.
The cooperative function management information 42 will be described in detail below with reference to FIG. 11. FIG. 11 shows an example of a cooperative function management table serving as the cooperative function management information 42. In the example of the cooperative function management table shown in FIG. 11, device IDs, information indicating the names of the devices to be combined (for example, the types of the devices), and information indicating cooperative functions (cooperative function information) are associated with each other. The device with device ID "A" is, for example, a PC, and the device with device ID "B" is a multi-function device. By combining the PC (A) and the multi-function device (B), a "scan-and-transfer function" and a "printing function" are implemented as cooperative functions. The "scan-and-transfer function" is a function of transferring image data generated by a scanning operation of the multi-function device (B) to the PC (A). The "printing function" is a function of sending data stored in the PC (A) (for example, image data or document data) to the multi-function device (B) and printing the data with the multi-function device (B).
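The cooperative-function lookup of FIG. 11 can likewise be sketched as an order-independent mapping from a combination of device IDs to the functions that combination implements; the key structure and function names here are illustrative assumptions:

```python
# Sketch of the cooperative function management table of Figure 11.
# A frozenset key makes the combination order-independent: (A, B) == (B, A).
COOPERATIVE_FUNCTION_TABLE = {
    frozenset({"A", "B"}): ["scan-and-transfer", "print"],
}

def cooperative_functions(device_ids):
    """Look up the cooperative functions implemented by combining devices."""
    return COOPERATIVE_FUNCTION_TABLE.get(frozenset(device_ids), [])

print(cooperative_functions(["A", "B"]))  # functions of the PC + MFD pair
```

A single device ID yields no entry, reflecting that cooperative functions exist only for registered combinations.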
An overview of the situation check processing will be described below with reference to the flowchart of FIG. 12.
In step S01, the situation information collector 44 collects, by using the various sensors, situation information (the values of the various sensors) related to the surroundings of the robot device 10. The detector 56 then detects the situation around the robot device 10 on the basis of the situation information and discovers a problem occurring in that situation.
In step S02, the judging unit 60 judges whether the robot device 10 can solve the problem detected by the detector 56. As described above, examples of problems that the robot device 10 cannot solve are a problem that cannot be solved solely by the functions of the robot device 10, a problem whose solution by the functions of the robot device 10 would take at least a certain amount of time (a preset time), and a problem for which the functions of the robot device 10 would not produce a result of satisfactory quality (a preset quality).
If the robot device 10 can solve the problem (YES in step S02), the solution for solving the problem is executed in step S03. The solution is specified by the solution specifying unit 58. In this case, the robot device 10 executes the specified solution by using its own functions, without using another device and without receiving assistance from a person. The robot device 10 may cause the UI 50 of the robot device 10 or the UI 72 of the terminal apparatus 14 to display information related to the problem and information related to the solution. The robot device 10 may execute the solution upon receiving an execution instruction from the user.
If the robot device 10 cannot solve the problem (NO in step S02), search processing for a solution is executed. In step S04, it is determined whether the robot device 10 will search for a solution by itself. If the result of step S04 is YES, the robot device 10 executes an autonomous decision mode in step S05. If the robot device 10 will not search for a solution by itself (NO in step S04), the robot device 10 executes a user decision mode in step S06. Whether the autonomous decision mode or the user decision mode is to be executed is determined in advance. That is, if the robot device 10 cannot solve the problem detected by the detector 56, it executes the predetermined one of the autonomous decision mode and the user decision mode. The mode to be executed may be specified by the user.
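The branching of the FIG. 12 flow (steps S01 through S06) can be summarized in a short sketch; the function names and string results are assumptions for illustration only:

```python
def situation_check(detect_problem, can_solve, autonomous_mode):
    """Sketch of the Figure 12 flow: detect a problem (S01), judge whether
    the robot can solve it alone (S02), then either execute the solution
    (S03) or fall back to the predetermined decision mode (S04-S06)."""
    problem = detect_problem()
    if problem is None:
        return "no problem"
    if can_solve(problem):
        return "execute solution"  # S03: robot acts alone
    # S04-S06: the mode to fall back to is determined in advance.
    return "autonomous mode" if autonomous_mode else "user decision mode"

# A problem the robot cannot solve, with the autonomous mode preconfigured.
print(situation_check(lambda: "room too hot", lambda p: False, True))
```

The callables stand in for the detector 56 and judging unit 60; in the patent these are components of the robot device, not function arguments.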
The situation check processing will be described in detail below with reference to the flowchart of FIG. 13.
In step S01, the situation information collector 44 collects, by using the various sensors, situation information (the values of the various sensors) related to the surroundings of the robot device 10. In step S11, the detector 56 predicts a problem that is likely to occur by using a combination of the sensor values. The detector 56 predicts the problem by using a combination of the sensor values and information obtained by analyzing images of the surroundings of the robot device 10. Speech picked up by an auditory sensor may also be used, and the detector 56 may predict the problem by using the result of analyzing the speech together with the results of analyzing the images and the sensor values. For example, if the detected temperature is equal to or higher than a temperature threshold (30°C) and if a captured image indicates that people around the robot device 10 are sweating, the detector 56 detects that the temperature needs to be lowered by an air conditioner or an electric fan. If the detector 56 attempted to detect the ambient situation by using only a single sensor, it might fail to detect the situation correctly. However, using images of the surroundings of the robot device 10 together with the sensor values makes it possible to detect more complex situations with higher precision. The use of such images also makes it possible to detect the situation from an individual perspective as well as from a general perspective. On the other hand, sensors can detect situations that cannot be detected visually or by smell.
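The temperature example above combines a raw sensor value with an image-analysis result; a minimal sketch of that combination (the threshold comes from the text, everything else is assumed) might look like this:

```python
TEMPERATURE_THRESHOLD = 30  # degrees Celsius, per the example in the text

def detect_overheating(temperature, people_sweating):
    """Combine a sensor value with an image-analysis result: a high
    temperature alone is ambiguous, but together with an image showing
    people sweating it indicates cooling (air conditioner or fan) is needed."""
    return temperature >= TEMPERATURE_THRESHOLD and people_sweating

print(detect_overheating(32, True))   # cooling needed
print(detect_overheating(32, False))  # the sensor value alone is not enough
```

This mirrors the point that a single sensor may misjudge the situation, while fusing modalities raises precision.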
Then, in step S12, the solution specifying unit 58 estimates a solution that can return the sensor values to normal values. The solution specifying unit 58 refers, for example, to the solution management table shown in FIG. 9, and specifies the solution for solving the problem detected by the detector 56. Information related to the specified solution may be presented when an inquiry is received from the user, as will be discussed later, or may be presented as an instruction (control) request when the problem is to be solved by the user or by a device other than the robot device 10.
Then, the judging unit 60 judges whether the robot device 10 can solve the problem detected by the detector 56. To make this judgment, in step S13, the judging unit 60 compares the functions of the robot device 10 with the solution specified by the solution specifying unit 58.
It is then determined in step S14 whether the set of functions provided in the robot device 10 covers the solution specified by the solution specifying unit 58 (that is, whether the robot device 10 has the functions for executing the solution). If the result of step S14 is YES, the processing proceeds to step S03 shown in FIG. 12. In this case, the robot device 10 executes the specified solution by using its corresponding functions, without using another device and without receiving assistance from a person. The robot device 10 may execute the solution upon receiving an execution instruction from the user.
If the set of functions provided in the robot device 10 does not cover the solution specified by the solution specifying unit 58, that is, if the robot device 10 does not have the functions for executing the solution (NO in step S14), the robot device 10 executes the autonomous decision mode or the user decision mode in step S15.
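The coverage check of step S14 is essentially a subset test between the functions a solution requires and the functions the robot provides; a sketch with assumed function names:

```python
def covers_solution(robot_functions, required_functions):
    """S14: the robot can execute the solution by itself only if its
    function set covers every function the solution requires."""
    return set(required_functions) <= set(robot_functions)

# The robot has speech-to-text but not printing, so it cannot cover a
# solution that also requires printing (illustrative function names).
print(covers_solution({"speech-to-text"}, {"speech-to-text", "print"}))
```

A NO result here is what routes processing into the autonomous decision mode or the user decision mode of step S15.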
The processing executed in the user decision mode will be described in detail below with reference to the flowchart of FIG. 14.
In step S20, the communication unit 30 of the robot device 10 sends, under the control of the controller 54, the situation information (the values of the various sensors) collected by the situation information collector 44 to the terminal apparatus 14. In this case, the communication unit 30 may switch the communication method in accordance with the presence or absence of obstacles or the distance between the robot device 10 and the terminal apparatus 14. The communication unit 30 may send the situation information to a terminal apparatus 14 registered in advance as a destination, to a terminal apparatus 14 identified by the robot device 10, or to a terminal apparatus 14 that has sent a request to communicate with the robot device 10. For example, the recognition unit 64 of the robot device 10 may obtain device identification information related to the terminal apparatus 14 from an image of the terminal apparatus 14 captured by the visual sensor, and may identify the terminal apparatus 14 on the basis of the device identification information. In another example, the communication unit 30 may send the situation information to a terminal apparatus 14 located within a preset range from the position of the robot device 10 (for example, a terminal apparatus 14 located within a range in which the robot device 10 can execute near-field communication). In another example, the recognition unit 64 may identify a user who appears to be in trouble, and the situation information may be sent to the terminal apparatus 14 of that user.
In step S21, the robot device 10 may collect additional situation information in response to a request from the user of the terminal apparatus 14 and send the additional situation information to the terminal apparatus 14. For example, while moving, the robot device 10 captures additional images of the surroundings by using the visual sensor (such as a camera) and sends the generated images (video images and still images) to the terminal apparatus 14, or collects data related to the information requested by the user (for example, the temperature) again and resends the data to the terminal apparatus 14.
Then, in step S22, the robot device 10 presents one or more solutions for solving the problem detected by the detector 56 (the one or more solutions specified by the solution specifying unit 58). For example, the communication unit 30 sends, under the control of the controller 54, information indicating the one or more solutions to the terminal apparatus 14. The information indicating the one or more solutions is displayed on the UI 72 of the terminal apparatus 14. In another example, the controller 54 may display the information indicating the one or more solutions on the UI 50 of the robot device 10.
When the user selects one of the solutions presented to the user, the controller 54 of the robot device 10 determines in step S23 whether the solution selected by the user is a solution to be executed by using a device other than the robot device 10 (for example, the device 12). For example, when the user selects one of the solutions by using the terminal apparatus 14, information indicating the selected solution is sent from the terminal apparatus 14 to the robot device 10. The controller 54 then determines, on the basis of this information, whether the solution selected by the user is a solution to be executed by using a device other than the robot device 10. If the information indicating the one or more solutions is displayed on the UI 50 of the robot device 10 and the user has selected one of the solutions there, the controller 54 determines whether the solution selected by the user is to be executed by using another device.
If the solution selected by the user is a solution to be executed by using a device other than the robot device 10 (YES in step S23), the processing proceeds to step S24. In step S24, the communication unit 30 sends, under the control of the controller 54, information related to a device capable of executing the solution to the terminal apparatus 14. For example, the controller 54 refers to the apparatus function management table and the cooperative function management table, and specifies, among multiple devices, one that has the function capable of executing the solution selected by the user. If "printing" is selected as the solution, a multi-function device having a printing function is specified as the device capable of executing the solution. Examples of the information related to the device capable of executing the solution are an image showing the appearance of the device, address information indicating the address of the device, and information related to the specifications of the device. Alternatively, the controller 54 may display the information related to the device on the UI 50 of the robot device 10. In this case, the information does not have to be sent to the terminal apparatus 14.
Then, in step S25, the solution is executed in accordance with a user operation. That is, the function of the device for executing the solution is executed. As the function for executing the solution, a function included in a single device may be executed, or a cooperative function using the functions of multiple devices may be executed. A function included in the robot device 10 may also be used as a function for executing the solution. The instruction to execute the function may be supplied to the device from the robot device 10 or the terminal apparatus 14. A detailed description of the operation for executing the function will be given later.
If the solution selected by the user is not a solution to be executed by using a device other than the robot device 10 (NO in step S23), the robot device 10 requests, in step S26, a person around the robot device 10 to execute the solution. Then, in step S27, the robot device 10 informs the user how to execute the solution (the execution procedure). In this case, the controller 54 causes a loudspeaker to output the procedure as speech, causes the UI 50 to display the procedure, or causes the robot device 10 to move to the person and touch him or her. If the procedure is presented to the person as speech from the loudspeaker, the volume may be increased so that the person can understand the procedure more easily. Alternatively, the communication unit 30 may send information indicating the procedure to the terminal apparatus 14, in which case the information is displayed on the UI 72 of the terminal apparatus 14.
If the controller 54 determines in step S23 that the solution selected by the user is a solution to be executed by using both a device and a person, steps S24 to S27 are executed. A function of the robot device 10 may also be used as part of the solution.
The processing executed in the autonomous decision mode will be described in detail below with reference to the flowchart of FIG. 15.
In step S30, if necessary, the controller 54 of the robot device 10 specifies another function required for executing the solution specified by the solution specifying unit 58, or a request to be made to the user.
The controller 54 then determines in step S31 whether the solution specified by the solution specifying unit 58 is a solution to be executed by using a device other than the robot device 10 (for example, the device 12).
If the solution specified by the solution specifying unit 58 is a solution to be executed by using a device other than the robot device 10 (YES in step S31), the processing proceeds to step S32. In step S32, the controller 54 searches for a device around the robot device 10 capable of executing the solution. The controller 54 searches for such a device on the basis of, for example, an image obtained by the visual sensor, position information related to the positions of devices, and wireless communication states. The controller 54 refers to the apparatus function management table and the cooperative function management table and specifies one or more devices that have the function capable of executing the solution. If the solution is "printing", the controller 54 searches for a multi-function device having a printing function.
The controller 54 determines in step S33 whether there is a device around the robot device 10 capable of executing the solution. If there is such a device (YES in step S33), the processing proceeds to step S34. In step S34, the communication unit 30 of the robot device 10 sends, under the control of the controller 54, information indicating an instruction to execute the solution (an execution instruction for the function for executing the solution) to the found device. Upon receiving this information, the device executes the solution in response to the instruction.
The controller 54 may detect whether control over the device capable of executing the solution can be obtained. That is, the controller 54 detects whether address information related to the device or a driver for controlling the device is stored in the robot device 10. If such a driver can be obtained via a network, the controller 54 downloads the driver. The robot device 10 may alternatively provide the instruction to execute the solution to the device by directly operating the operation panel of the device or by operating the remote controller of the device.
Whether the solution has been executed successfully may be determined according to whether the problem has been resolved after the device executes the function. If the detector 56 no longer detects the problem, the controller 54 determines that the problem has been solved. If the problem has not been solved, the controller 54 resends the instruction to execute the solution to the device, or searches for another solution.
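The execute-then-verify loop described above can be sketched as follows; the retry limit and return strings are assumptions, since the patent only says the instruction may be resent or another solution searched for:

```python
def execute_with_verification(execute, problem_persists, max_retries=2):
    """After instructing a device (S34), check whether the detector still
    finds the problem; if it does, resend the instruction up to a limit,
    then give up and search for another solution."""
    for _ in range(max_retries + 1):
        execute()                  # send the execution instruction
        if not problem_persists():  # detector 56 no longer finds the problem
            return "solved"
    return "search another solution"

attempts = []
result = execute_with_verification(lambda: attempts.append(1),
                                   lambda: len(attempts) < 2)
print(result)  # solved on the second attempt in this toy setup
```

The `problem_persists` callable stands in for re-running the detector against fresh sensor values.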
If there is no device around the robot device 10 capable of executing the solution (NO in step S33), the controller 54 searches for another solution in step S35. In this case, the controller 54 may switch from the autonomous decision mode to the user decision mode.
If the solution specified by the solution specifying unit 58 is not a solution to be executed by using a device other than the robot device 10 (NO in step S31), the robot device 10 requests, in step S36, a person around the robot device 10 to execute the solution. In step S37, the robot device 10 informs the user of the content of the request (for example, the procedure for executing the solution). For example, the controller 54 causes a loudspeaker to output the content of the request as speech, causes the UI 50 to display the content of the request, or causes the robot device 10 to move to the person and touch him or her. Alternatively, the communication unit 30 may send information indicating the content of the request to the terminal apparatus 14. In this case, the information is displayed on the UI 72 of the terminal apparatus 14.
The controller 54 then determines in step S38 whether the person has accepted the request. If the person has accepted the request (YES in step S38), the controller 54 completes the processing. The user then executes the solution. If the person is found not to have accepted the request (NO in step S38), the controller 54 searches for another solution in step S35. The controller 54 may switch from the autonomous decision mode to the user decision mode. The controller 54 may make the above determination by speech recognition. If the controller 54 recognizes, by speech recognition, an affirmative answer from the person who received the request, it determines that the person has accepted the request. In another example, if an operation that would be performed to execute the solution is performed within a preset time and if the operation is recognized by a particular sensor (for example, the visual sensor), the controller 54 may determine that the person has accepted the request.
The situation around the robot device 10 may change over time, and accordingly the problems that may arise may also change. The robot device 10 therefore detects the situation (problem) almost in real time, and specifies the solution on the basis of the detection result. The problems that can be solved by the robot device 10 may also change as the functions of the robot device 10 are updated. The functions of the robot device 10 are updated by changing at least one of the hardware and software of the robot device 10.
The solution specifying unit 58 may give priority to solutions that can be executed by using devices (solutions not requiring human assistance) over solutions to be executed by people. A solution to be executed by a person is not always executed (the request may be refused). By preferentially specifying solutions that do not require human assistance, the solution is more likely to be executed. If no people are detected around the robot device 10, the solution specifying unit 58 may specify a solution not requiring human assistance without searching for solutions requiring human assistance. This can lighten the load of the search for a solution.
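This prioritization can be sketched as a simple ordering and filtering rule; the solution records and their field names are assumptions for illustration:

```python
def order_solutions(solutions, people_nearby):
    """Prefer solutions executable by devices alone; when nobody is
    detected around the robot, drop human-assisted solutions entirely,
    which also lightens the search load."""
    device_only = [s for s in solutions if not s["needs_person"]]
    if not people_nearby:
        return device_only
    return device_only + [s for s in solutions if s["needs_person"]]

solutions = [{"name": "ask a person to open the window", "needs_person": True},
             {"name": "turn on the air conditioner", "needs_person": False}]
print([s["name"] for s in order_solutions(solutions, people_nearby=False)])
```

With no people nearby, only the device-executable solution survives; otherwise human-assisted solutions are merely deprioritized.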
The control process for executing a solution will be described in detail below with reference to the flowchart of FIG. 16. It is assumed that the robot device 10 cannot solve the problem by itself.
In step S40, the controller 54 of the robot device 10 determines whether the solution selected by the user in the user decision mode, or the solution specified by the robot device 10 in the autonomous decision mode, requires the control of a device. That is, the controller 54 determines whether the solution is a solution to be executed by using a device. In other words, the controller 54 determines whether the solution is a solution to be implemented only by a person without using a device.
If the solution does not require the control of a device (NO in step S40), that is, if the solution does not use a device and is implemented only by a person, the processing proceeds to step S41. In step S41, the robot device 10 requests a person around the robot device 10 to execute the solution. For example, the communication unit 30 sends information indicating the content of the request (for example, the procedure to be taken to execute the solution) to the terminal apparatus 14. This information is displayed on the UI 72 of the terminal apparatus 14. In step S42, the communication unit 30 sends the information to the terminal apparatus 14 by using a communication method suitable for the surrounding environment. Examples of the surrounding environment are the distance between the robot device 10 and the terminal apparatus 14 and the presence or absence of obstacles between them. Alternatively, the controller 54 may cause a loudspeaker to output the content of the request as speech, cause the UI 50 to display the content of the request, or cause the robot device 10 to move to the person and touch him or her. In step S43, the controller 54 observes whether the user performs, within a preset time, an operation that would be performed to execute the solution. The controller 54 then determines in step S44 whether the problem has been solved, on the basis of whether the operation has been performed. If the operation has been performed within the preset time and if a particular sensor (for example, the visual sensor) has detected the operation, the controller 54 determines that the problem has been solved (YES in step S44). The controller 54 then completes the processing. If the operation has not been performed within the preset time and no particular sensor (for example, the visual sensor) has detected the operation, the controller 54 determines that the problem has not been solved (NO in step S44). In this case, the robot device 10 may search for another solution in step S45 or resend the same request to the user.
If the solution requires the control of a device (YES in step S40), that is, if the solution is a solution to be executed by using a device, the controller 54 determines in step S46 whether use of the device for executing the solution is free of charge. More specifically, the controller 54 searches for (identifies) the device around the robot device 10 and determines whether use of the device is free of charge. Information indicating whether use of each device is charged or free is managed for each device. This information may be managed by the apparatus function management table. Alternatively, the controller 54 may obtain, from the identified device, information indicating whether use of the device is charged or free.
If use of the device is not free (NO in step S46), that is, if use of the device is charged, the controller 54 determines in step S47 whether the robot device 10 has means of payment for paying for the device. Examples of means of payment are electronic money (digital cash), virtual currency, cash, and a credit card. Payment may also be made by means other than these examples. If the robot device 10 does not have means of payment for paying for the device (NO in step S47), the controller 54 searches for another solution in step S48. If the robot device 10 has means of payment for paying for the device (YES in step S47), the processing proceeds to step S49. The controller 54 controls the payment operation of the robot device 10. That is, the robot device 10 pays when using the device. If the robot device 10 does not have means of payment for paying for the device, it may receive payment support from a person or from a device other than the robot device 10 (borrow money from, or collect money through, a person or a device).
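The charge check of steps S46 and S47 reduces to a small predicate; the representation of payment means as a list of strings is an assumption:

```python
def can_use_device(is_free, payment_means):
    """S46-S47: a free device may always be used; a charged device may be
    used only if the robot holds some means of payment (electronic money,
    virtual currency, cash, a credit card, ...)."""
    if is_free:
        return True
    return len(payment_means) > 0

print(can_use_device(False, ["electronic money"]))  # the robot can pay
print(can_use_device(False, []))  # no means of payment: step S48 applies
```

A False result corresponds to searching for another solution in step S48, or to requesting payment support as described above.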
The controller 54 then determines in step S49 whether the device for executing the solution can be controlled via a communication function. Information indicating whether each device can be controlled via a communication function is managed for each device. This information may be managed by the apparatus function management table.
If the device can be controlled via a communication function (YES in step S49), the controller 54 selects, in step S50, a communication method that can be adopted to communicate with the device (a communication method supported by the device). The communication methods supported by each device are managed for each device, and may be managed by the apparatus function management table. If a communication error occurs when communication is executed by using a communication method supported by the device, the processing may proceed to step S57. If the device supports multiple communication methods, the robot device 10 may attempt to communicate with the device by using each of the multiple communication methods. In this case, the robot device 10 may select the optimal communication method (the communication method having the highest speed or generating the least noise).
In step S51, the controller 54 obtains address information related to the device and an access password. The controller 54 may obtain the address information via the Internet from another apparatus (such as a server) that stores address information. Alternatively, the address information may be stored in the robot device 10. In step S52, if a driver for controlling the device is necessary, the controller 54 obtains the driver and installs it in the robot device 10. The controller 54 may obtain the driver from an apparatus that stores drivers (such as a server) or via the Internet.
Then, in step S53, the communication unit 30 sends information indicating an instruction to execute the solution (an execution instruction for the function for executing the solution) to the device. Upon receiving this information, the device executes the solution in accordance with the instruction. In step S54, the controller 54 observes whether the device is solving the problem. The controller 54 then determines in step S55 whether the problem has been solved. If an operation that would be performed by the device to execute the solution has been performed within a preset time and if a particular sensor (for example, the visual sensor) has detected the operation, the controller 54 determines that the problem has been solved (YES in step S55). The controller 54 then completes the processing. If the operation has not been performed within the preset time and no particular sensor (for example, the visual sensor) has detected the operation, the controller 54 determines that the problem has not been solved (NO in step S55). In this case, the robot device 10 may search for another solution in step S56 or resend the same instruction to the device. The controller 54 may determine whether another solution can be executed. For example, if the room temperature is high and if a device other than an air conditioner (such as an electric fan) is installed in the room, the controller 54 may search for another solution using the electric fan. The search unit 62 may search for another solution by using the Internet.
If the device for executing the solution cannot be controlled via a communication function (NO in step S49), the robot device 10 searches, in step S57, for a remote controller for operating the device. The robot device 10 may identify the remote controller by, for example, analyzing an image captured by the visual sensor. The robot device 10 then determines in step S58 whether a remote controller has been found. If the robot device 10 fails to find a remote controller (NO in step S58), it searches for another solution in step S59. If a remote controller is found (YES in step S58), the robot device 10 inputs, in step S60, the instruction for executing the solution by using the remote controller. Upon receiving the instruction, the device executes the solution in accordance with the instruction.
The controller 54 observes whether the device is solving the problem, and determines in step S61 whether the problem has been solved. If an operation that would be performed by the device to execute the solution has been performed within a preset time and if a particular sensor (for example, the visual sensor) has detected the operation, the controller 54 determines that the problem has been solved (YES in step S61). The controller 54 then completes the processing. If the operation has not been performed within the preset time and no particular sensor (for example, the visual sensor) has detected the operation, the controller 54 determines that the problem has not been solved (NO in step S61). In this case, the robot device 10 may search for another solution in step S62 or resend the same instruction to the device by using the remote controller.
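The FIG. 16 fallback chain for controlling a device can be condensed into one sketch; the return strings are illustrative labels for the branches, not text from the patent:

```python
def control_device(controllable_via_network, remote_controller_found):
    """Figure 16 fallback: instruct the device over its communication
    function (S50-S53); otherwise look for its remote controller
    (S57-S60); otherwise search for another solution (S59)."""
    if controllable_via_network:
        return "send instruction via communication"
    if remote_controller_found:
        return "input instruction via remote controller"
    return "search another solution"

print(control_device(False, True))  # falls back to the remote controller
```

Each branch is followed by the same verification step (S55 or S61): observe whether the problem is actually resolved, and retry or re-search if not.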
Scenarios (application scenarios) using the robot device 10 of this exemplary embodiment will now be discussed.

(Application Scenario 1)
Application scenario 1 will be discussed below with reference to FIG. 17. FIG. 17 shows office workers 76 and the robot device 10. The office workers 76 (three members) are, for example, in a meeting, and the robot device 10 is together with the office workers 76. The situation information collector 44 of the robot device 10 collects, by using the various sensors, situation information related to the surroundings of the robot device 10. More specifically, the situation information collector 44 collects, as the situation information, speech, the content of the conversation between the office workers 76, images (images of the faces and whole bodies of the office workers 76), the temperature, the humidity, and so on. The detector 56 of the robot device 10 detects the ambient situation on the basis of the situation information (the state of the office workers 76, such as the content of the conversation, their expressions, and their attitudes, and the temperature). If an office worker 76 says "I want to put the content of our conversation down on paper", the situation information collector 44 collects speech information related to this conversation as situation information. The detector 56 then determines, on the basis of the conversation, whether there is any problem in this situation. In this example, the detector 56 detects the problem "the office workers 76 want a printout of their conversation on paper".
The solution specifying unit 58 refers to the solution management information 38 and specifies the solution for solving the problem. This problem can be solved, for example, by a combination of "a function of collecting the content of a conversation as speech information and converting it into a character string" and "a printing function". That is, the solution to this problem is constituted by the combination of "the function of collecting the content of a conversation as speech information and converting it into a character string" and "the printing function".
It is now assumed that the robot device 10 has "the function of collecting the content of a conversation as voice information and converting it into a character string" but does not have "the printing function". That is, the problem cannot be solved by the robot device 10 alone. To deal with this situation, the robot device 10 searches for a device having the printing function. The robot device 10 captures images of the devices around the robot device 10 by using the visual sensor (camera), and the recognition unit 64 recognizes the devices by analyzing the images. For example, a multifunction device 78 having the printing function is installed near the robot device 10. The recognition unit 64 recognizes the multifunction device 78 and identifies the functions (for example, the printing function) of the multifunction device 78. In this case, the solution is executed by using the robot device 10 and the multifunction device 78. This solution is a solution implemented by a synergistic function using the robot device 10 and the multifunction device 78, that is, a solution implemented by treating the combination of the robot device 10 and the multifunction device 78 as collaborative devices. The robot device 10 may execute the solution automatically or in response to an instruction from the user. When executing the solution, the robot device 10 communicates with the multifunction device 78, sends information indicating the content of the conversation among the office workers 76 to the multifunction device 78, and sends a print instruction to the multifunction device 78. As a result, the content of the conversation among the office workers 76 is printed on paper.
The robot device 10 may move to the multifunction device 78, fetch the paper on which the conversation content is printed, and hand it to the office workers 76. The robot device 10 may output voice so as to notify the office workers 76 that the solution will be executed.
In the example in FIG. 17, the robot device 10 is used as one of the devices for executing the solution. However, the robot device 10 does not necessarily have to be used; instead, the solution may be executed by using only devices other than the robot device 10. In this case, the robot device 10 provides an instruction to execute the solution to another device.
The robot device 10 may directly operate the multifunction device 78. For example, if the detector 56 detects the problem "I want to copy a manuscript", the solution specifying unit 58 specifies "the copy function" as the solution. In this case, the robot device 10 searches for a device having the copy function (such as the multifunction device 78) and causes the multifunction device 78 to copy the manuscript. The robot device 10 receives the manuscript from an office worker 76, places it on the multifunction device 78, and directly operates the operation panel of the multifunction device 78 so as to provide a copy instruction to the multifunction device 78. The robot device 10 can recognize the operation panel by, for example, analyzing an image captured by the visual sensor, and then provide the copy instruction.
As described above, a problem that cannot be solved by the robot device 10 alone can be solved by using another device. This enables the robot device 10 to solve more problems by working together with another device according to the situation.
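The collaboration described in this scenario can be sketched as follows: the robot checks which required functions it lacks and assigns each missing function to a recognized nearby device. This is an illustrative sketch only; the data (`ROBOT_FUNCTIONS`, `NEARBY_DEVICES`) and the function `plan_collaboration` are assumptions, not part of the specification:

```python
# Functions the robot device 10 itself provides (assumed data).
ROBOT_FUNCTIONS = {"collect conversation as voice and convert to character string"}

# Functions of devices recognized by the recognition unit 64 (assumed data).
NEARBY_DEVICES = {
    "multifunction device 78": {"printing function", "copy function"},
}

def plan_collaboration(required: list[str]) -> dict[str, str]:
    """Assign each required function to the robot or to a nearby device.

    Raises LookupError if some function is available nowhere, i.e. the
    problem cannot be solved even collaboratively.
    """
    plan = {}
    for func in required:
        if func in ROBOT_FUNCTIONS:
            plan[func] = "robot device 10"
            continue
        for device, functions in NEARBY_DEVICES.items():
            if func in functions:
                plan[func] = device  # first nearby device providing it
                break
        else:
            raise LookupError(f"no device provides: {func}")
    return plan
```

In the hard-copy example, the plan assigns the voice-to-text function to the robot and the printing function to the multifunction device, matching the synergistic execution described above.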
(Application Scenario 2)
Application scenario 2 is discussed below with reference to FIG. 18. In FIG. 18, a person 80, a person 82, and the robot device 10 are shown. The person 80 is lying down. The situation information collector 44 of the robot device 10 collects an image of the person 80 lying down as situation information, and the detector 56 determines, based on the image, whether a problem has occurred. In the example in FIG. 18, the problem "someone is lying down" is detected.
The solution specifying unit 58 refers to the solution management information 38 and specifies a solution for solving this problem. The problem can be solved by "asking someone for help in order to rescue the person lying down". That is, an example of the solution to this problem is "asking someone for help". In this case, the robot device 10 asks the person 82, other than the person 80 lying down, for help. To do so, the robot device 10 may output sound or may move to the person 82 and touch him or her. The controller 54 may display information concerning the rescue procedure on the UI 50 of the robot device 10 or on the UI 72 of the terminal device 14 of the person 82. The robot device 10 may identify the terminal device 14 by using a particular sensor and send information concerning the rescue procedure to the terminal device 14.
The solution specifying unit 58 may specify, for example, the solution "carrying the person lying down to a safe place". If the robot device 10 has the function of carrying an object, it may carry the person 80 by itself alone, together with the person 82, together with another device, or together with both another device and the person 82. The robot device 10 identifies a device that can carry the person 80. If the robot device 10 does not have the function of carrying an object, it may instruct the person 82 to carry the person 80 lying down to a safe place. Information indicating this instruction may be displayed on the UI 50 of the robot device 10, or may be sent to the terminal device 14 of the person 82 and displayed on the UI 72 of the terminal device 14.
As described above, a problem that cannot be solved by the robot device 10 alone can be solved by requesting help from someone.
(Application Scenario 3)
Application scenario 3 is discussed below with reference to FIG. 19. In FIG. 19, a user 84 and the robot device 10 are shown. If the user 84 says "I am thirsty", the situation information collector 44 of the robot device 10 collects voice information concerning these words as situation information, and the detector 56 determines, based on these words, whether a problem has occurred. In this example, the problem "thirsty" is detected.
The solution specifying unit 58 refers to the solution management information 38 and specifies a solution for solving this problem. The problem can be solved by, for example, "buying a beverage". That is, the solution to this problem is constituted by "the function of providing a beverage".
It is assumed that the robot device 10 does not have "the function of providing a beverage". That is, the problem cannot be solved by the robot device 10 alone. To deal with this situation, the robot device 10 searches for a device that provides beverages. The robot device 10 captures images of the devices around the robot device 10 by using the visual sensor (camera), and the recognition unit 64 recognizes the devices by analyzing the images. For example, an automatic vending machine 86 (an example of a device) for bottled water is installed near the robot device 10. The recognition unit 64 recognizes the automatic vending machine 86 and identifies the functions (for example, the function of providing a beverage for a charge) of the automatic vending machine 86. The recognition unit 64 may identify the payment means (such as electronic money or cash) usable for the automatic vending machine 86 based on an image captured by the visual sensor, or may communicate with the automatic vending machine 86 so as to identify the payment means. The controller 54 determines whether the robot device 10 has such a payment means.
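The payment-means check performed by the controller can be sketched as a simple set intersection between the means held by the robot and the means accepted by the recognized machine. The function name and data below are illustrative assumptions only:

```python
# Hypothetical sketch: the controller 54 checks whether any payment means
# held by the robot device 10 is accepted by the recognized vending machine.
def usable_payment_means(robot_means: set[str], machine_accepts: set[str]) -> set[str]:
    """Return the payment means the robot can actually use at this machine."""
    return robot_means & machine_accepts
```

An empty result corresponds to the case, discussed below, in which the robot must obtain payment support from the user or from another device.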
If the robot device 10 has a payment means usable for the automatic vending machine 86 (such as electronic money, cash, or a credit card), the controller 54 executes a payment operation of the robot device 10. The robot device 10 then buys a beverage from the automatic vending machine 86 by paying with the payment means. The robot device 10 may move to the automatic vending machine 86 and directly operate it so as to buy a beverage. The robot device 10 recognizes the purchase buttons of the automatic vending machine 86 by analyzing an image captured by the visual sensor, and buys a beverage. The robot device 10 may deliver the beverage to the user 84. The robot device 10 may buy the beverage in response to an instruction from the user 84 or without an instruction from the user 84. If the robot device 10 buys the beverage based on an instruction from the user 84, the controller 54 causes information indicating the solution (buying a beverage for the user 84) to be displayed on the UI 50. If the user 84 provides a purchase instruction by using the UI 50, the robot device 10 buys the beverage by paying with the payment means. Information indicating the solution may instead be sent to the terminal device 14 of the user 84 and displayed on the UI 72 of the terminal device 14. If the user 84 provides a purchase instruction by using the UI 72, information concerning the purchase instruction is sent from the terminal device 14 to the robot device 10, and the robot device 10 buys the beverage in response to the purchase instruction.
If the robot device 10 does not have a payment means usable for the automatic vending machine 86, it may receive payment support from at least one of the user 84 and a device other than the robot device 10. For example, the robot device 10 may buy the beverage by collecting money from the user 84 related to the solution for solving the problem, or by borrowing a payment means from another device. The user 84 related to the solution for solving the problem is the user 84 who said "I am thirsty". This user 84 is identified by a particular sensor of the robot device 10.
(User Decision Mode 1)
An example of the user decision mode (user decision mode 1) is described in detail below with reference to FIG. 20. In FIG. 20, three office workers 87 and the robot device 10 are shown. The office workers 87 are talking, and multiple devices (such as a multifunction device 78, a projector 88, a camera 90, a display 92, and an aroma diffuser 94) are installed near the robot device 10.
The situation information collector 44 of the robot device 10 collects, by using various sensors, situation information concerning the surroundings of the robot device 10, and the detector 56 detects the ambient conditions based on the situation information. The recognition unit 64 recognizes the devices around the robot device 10. For example, the detector 56 detects the situation (problem) "three office workers are in a meeting and are arguing while looking at something". In the example in FIG. 20, the recognition unit 64 recognizes the multifunction device 78, the projector 88, the camera 90, the display 92, and the aroma diffuser 94. The recognition unit 64 may also recognize the terminal device 14 of an office worker 87 (for example, an office worker 87 who appears to have a problem).
If the determining unit 60 judges that the robot device 10 cannot solve the problem detected by the detector 56, the communication unit 30, under the control of the controller 54, sends situation information to the terminal device 14 registered in the robot device 10 or to the terminal device 14 recognized by the recognition unit 64 (for example, the terminal device 14 of an office worker 87 who appears to have a problem). The situation information includes information concerning the situation (problem) detected by the detector 56 and information concerning the devices recognized by the recognition unit 64.
On the UI 72 of the terminal device 14, a screen for explaining the situation is displayed. FIG. 21 shows an example of this screen. On the UI 72 of the terminal device 14, a situation explanation screen 96 is displayed. On the situation explanation screen 96, information concerning the situation (problem) detected by the detector 56 and information concerning the devices recognized by the recognition unit 64 are displayed. In the example in FIG. 21, a character string indicating "three members are in a meeting and are arguing while looking at something" is displayed as the situation (problem), and the character string "there are a multifunction device B, a projector C, a camera D, a display E, and an aroma diffuser F" is displayed.
If the user makes a request for additional information by using the terminal device 14, information indicating the request is sent from the terminal device 14 to the robot device 10. The communication unit 30 of the robot device 10 then sends additional information to the terminal device 14 in response to the request. If an image indicating the ambient conditions is requested by the user as the additional information, the communication unit 30 sends image data indicating the ambient conditions to the terminal device 14.
For example, the communication unit 30 sends, to the terminal device 14, image data associated with the office worker 87 who appears to have a problem and image data associated with the devices recognized by the recognition unit 64, as the image data indicating the ambient conditions. The image data associated with the office worker 87 who appears to have a problem may be image data representing an image of the office worker 87 captured by the visual sensor, or image data schematically representing the office worker 87. The image data associated with a device may be image data representing an image of the device captured by the visual sensor (camera) when the recognition unit 64 recognizes the device, or image data (an icon) schematically representing the recognized device. The image data schematically representing the recognized device may be stored in advance in the robot device 10, or may be stored in advance in another device such as a server and sent to the robot device 10.
The communication unit 30 sends image data (for example, the image data associated with the office worker 87 and the image data associated with the devices) to the terminal device 14 as the additional information. On the UI 72 of the terminal device 14, the images represented by the image data are displayed. FIG. 22 shows an example of the images. On the UI 72 of the terminal device 14, a situation explanation screen 98 is displayed. On the situation explanation screen 98, a group of images is displayed as the additional information. For example, an image 100 associated with the office workers 87, a device image 102 associated with the multifunction device 78, a device image 104 associated with the projector 88, a device image 106 associated with the camera 90, a device image 108 associated with the display 92, and a device image 110 associated with the aroma diffuser 94 are displayed.
If the user specifies a device image on the situation explanation screen 98 and provides an instruction to execute a solution by using the device associated with the specified device image, information indicating the instruction is sent to the specified device. The device then executes, in response to the instruction, the solution for solving the problem detected by the detector 56. The information indicating the instruction may be sent to the device from the terminal device 14 or from the robot device 10.
The user specifies, for example, the device images 108 and 110. In this case, the solution specifying unit 58 refers to the solution management information 38 and specifies a solution that solves the problem "three members are in a meeting and are arguing while looking at something" detected by the detector 56 and that uses the display 92 associated with the device image 108 and the aroma diffuser 94 associated with the device image 110. That is, the solution specifying unit 58 specifies (identifies), by referring to the solution management information 38, the solution for solving the problem "three members are in a meeting and are arguing while looking at something" detected by the detector 56, and specifies, by referring to the device function management information 40 and the synergistic function management information 42, a group of devices for the identified solution (a group of devices having the functions for executing the solution). If the user specifies the display 92 and the aroma diffuser 94, the solution specifying unit 58 selects the solution to be executed by using the display 92 and the aroma diffuser 94.
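The selection of a solution matching the user-specified device set can be sketched as a lookup over registered synergistic functions. The structure below is an illustrative assumption; the specification does not define the internal format of the synergistic function management information 42:

```python
# Hypothetical sketch of the synergistic function management information 42:
# each entry registers the device set and the solution it can execute.
SYNERGISTIC_FUNCTIONS = [
    {"devices": {"display 92", "aroma diffuser 94"},
     "solution": "display soothing image and diffuse soothing fragrance"},
    {"devices": {"multifunction device 78"},
     "solution": "print conversation content"},
]

def solutions_for(selected_devices: set[str]) -> list[str]:
    """Return the solutions executable with exactly the selected devices."""
    return [entry["solution"] for entry in SYNERGISTIC_FUNCTIONS
            if entry["devices"] == selected_devices]
```

With the display 92 and the aroma diffuser 94 selected, this lookup yields the combined soothing-image-and-fragrance solution described in the text.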
If the user specifies the display 92 and the aroma diffuser 94, an execution instruction screen 112 is displayed on the UI 72 of the terminal device 14, as shown in FIG. 23. On the execution instruction screen 112, the device images specified by the user (for example, the device images 108 and 110) are displayed, and information indicating the function (solution) that can be executed by the devices specified by the user is displayed. The solution that can be executed by using the display 92 and the aroma diffuser 94 is, for example, "displaying a soothing image on the display 92 and diffusing a soothing fragrance from the aroma diffuser 94". This solution is a synergistic function that can be executed by combining the display 92 and the aroma diffuser 94, and is registered in the synergistic function management information 42. When the user provides an instruction to execute the solution by using the terminal device 14, information indicating the instruction is sent from the terminal device 14 to the display 92 and the aroma diffuser 94. This information may instead be sent to the display 92 and the aroma diffuser 94 via the robot device 10. Upon receiving the information, the display 92 displays a soothing image, and the aroma diffuser 94 diffuses a soothing fragrance. Data representing the soothing image may be stored in advance in the robot device 10 and sent from the robot device 10 to the display 92. The data may instead be stored in another device (such as a server) and sent from that device to the display 92.
The screens 96, 98, and 112 shown in FIGS. 21 through 23 may be displayed on the UI 50 of the robot device 10. In this case, the screens 96, 98, and 112 do not have to be displayed on the UI 72 of the terminal device 14. The information displayed on the screens 96, 98, and 112 may be output as voice information.
A detailed description of the operation for selecting the devices for executing a solution will be given later.
(User Decision Mode 2)
Another example of the user decision mode (user decision mode 2) will now be discussed.
If the determining unit 60 judges that the robot device 10 cannot solve the problem detected by the detector 56, the communication unit 30 sends situation information to the terminal device 14 under the control of the controller 54.
For example, as shown in FIG. 24, a notification screen 114 is displayed on the UI 72 of the terminal device 14. On the notification screen 114, a message is displayed indicating that a problem (incident) that cannot be handled by the robot device 10 has occurred.
If the user presses the "Yes" button on the notification screen 114, the next screen appears. For example, as shown in FIG. 25, a situation explanation screen 116 is displayed on the UI 72 of the terminal device 14. On the situation explanation screen 116, an explanation of the situation (problem) detected by the robot device 10 is displayed.
On the situation explanation screen 116, a message asking the user whether the user understands the situation is also displayed. If the user presses the "No" button, the user can make an inquiry to the robot device 10, that is, the user can make a request for additional information. In this case, as shown in FIG. 26, an inquiry screen 118 is displayed on the UI 72 of the terminal device 14. On the inquiry screen 118, the user can input inquiries (items about which the user wishes to know more) by using the terminal device 14. In the example in FIG. 26, the user has input the inquiries "I want data about XXX" and "I want to watch a video of XXX". Information indicating the inquiries is sent from the terminal device 14 to the robot device 10. The situation information collector 44 of the robot device 10 collects information (for example, image data and voice data) in response to the user's inquiries. The communication unit 30 then sends the information collected by the situation information collector 44 to the terminal device 14. The information is displayed on the situation explanation screen 116 on the UI 72 of the terminal device 14.
If the user presses the "Yes" button on the situation explanation screen 116, a solution display screen 120 is displayed on the UI 72 of the terminal device 14, as shown in FIG. 27. On the solution display screen 120, information indicating the solutions specified by the solution specifying unit 58 (such as explanations and names of the solutions) is displayed. For example, if the problem "three office workers are in a meeting and are arguing while looking at something" is detected as discussed above, the solution specifying unit 58 refers to the solution management information 38 and specifies solutions for solving the problem. Information indicating the specified solutions is sent from the robot device 10 to the terminal device 14 and is displayed on the UI 72 of the terminal device 14. In the example in FIG. 27, four solutions (1) through (4) are specified and displayed as the solutions for solving the problem. The solutions may be displayed in random order or in descending order of effectiveness or feasibility. For example, a solution that uses a device located closer to the terminal device 14 or the robot device 10 has higher feasibility and is thus displayed at a higher position in the list. Information concerning the positions of the robot device 10, the terminal device 14, and the other devices can be obtained by using the Global Positioning System (GPS). The controller 54 calculates the distance between each device and the robot device 10 or the terminal device 14 by using the GPS position information. Based on the calculation results, the order in which the solutions are displayed on the solution display screen 120 is determined.
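The distance-based ordering of solutions can be sketched as follows. The use of the haversine formula and all names below are illustrative assumptions; the specification only states that distances are calculated from GPS position information:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def order_by_feasibility(solutions, reference):
    """Sort solutions so that those using closer devices are listed first.

    `reference` is the (lat, lon) of the robot device 10 or terminal
    device 14; each solution carries the position of the device it uses.
    """
    return sorted(solutions,
                  key=lambda s: haversine_m(*reference, *s["device_pos"]))
```

The sorted list corresponds to the display order on the solution display screen 120, with the most feasible (closest-device) solution at the top.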
If the user provides an instruction for a screen transition while the solution display screen 120 is displayed on the UI 72 of the terminal device 14, or if a preset time has elapsed, a screen 122 is displayed on the UI 72, as shown in FIG. 28. On the screen 122, a message such as "Please select a solution", that is, a message prompting the user to select a solution, is displayed. If the user does not find a suitable solution (a desired solution) among the solutions recommended to the user (see FIG. 27), the user can provide another instruction.
If the user has found a suitable solution among the recommended solutions (see FIG. 27) and selects it by using the terminal device 14, a check screen 124 is displayed on the UI 72, as shown in FIG. 29. On the check screen 124, information indicating the solutions selected by the user is displayed. In the example in FIG. 29, the user has selected the solution (2) "displaying a soothing image on the display" and the solution (3) "diffusing a citrus fragrance from the aroma diffuser". If the user presses the "Yes" button on the check screen 124, information indicating an instruction to execute the solutions (2) and (3) is sent from the terminal device 14 or the robot device 10 to the devices for executing the solutions (2) and (3) (for example, the display 92 and the aroma diffuser 94). The display 92 then executes the solution (2), and the aroma diffuser 94 executes the solution (3).
If the user presses the "No" button on the check screen 124, the previous screen 122 (see FIG. 28) is displayed.
If the user does not find a suitable solution among the recommended solutions (see FIG. 27) presented to the user, the user can specify another solution. If the user provides, by using the terminal device 14, an instruction to display an input screen for specifying a solution, a user input screen 126 is displayed on the UI 72 of the terminal device 14, as shown in FIG. 30. On the user input screen 126, the user specifies a device and the solution to be executed by using the device. In the example in FIG. 30, a multifunction device is specified as the device for executing the solution, and "XXX processing" is specified as the solution using this multifunction device. The user may specify a device for the solution by inputting the name of the device with characters or by specifying the device image associated with the device. On the UI 72 of the terminal device 14, one or more device images associated with one or more devices recognized by the robot device 10 are displayed, and the user selects the device image associated with the device for executing the solution. By selecting a device in this way, even a user who is unfamiliar with the name of the device for the solution can specify the device. When the user specifies a device, a list of the functions included in the device is displayed on the UI 72. The functions included in the device can be specified by referring to the device function management information 40. The user then selects the function for the solution from the list. When the user provides an instruction to execute the solution by using the terminal device 14, information indicating the instruction is sent to the device specified by the user, and the device executes the solution.
The screens 114, 116, 118, 120, 122, 124, and 126 shown in FIGS. 24 through 30 may be displayed on the UI 50 of the robot device 10. In this case, the screens 114, 116, 118, 120, 122, 124, and 126 do not have to be displayed on the UI 72 of the terminal device 14. The information displayed on the screens 114, 116, 118, 120, 122, 124, and 126 may be output as voice information.
In user decision modes 1 and 2 described above, the information to be displayed on the UI 72 of the terminal device 14 is sent from the robot device 10 to the terminal device 14. The information may instead be sent to the terminal device 14 from another device, such as a server, under the control of the robot device 10.
(Identifying Processing for Devices)
The identifying processing for devices is discussed below. In one example, a device is identified by obtaining device identification information indicating the device by using an augmented reality (AR) technology. For example, by using an AR technology, a device to be used alone is identified by obtaining the device identification information indicating that device, and collaborative devices are identified by obtaining the device identification information related to those devices. As the AR technology, a known AR technology may be used. Examples of known AR technologies are a marker-based AR technology using a marker (such as a two-dimensional barcode), a markerless AR technology using an image recognition technology, and a position information AR technology using position information. Alternatively, the device identification information may be obtained without using an AR technology. For example, a device connected to a network may be identified based on the IP address of the device or by reading the device ID. Concerning devices and terminals having various wireless communication functions (such as infrared communication, visible light communication, Wi-Fi, and Bluetooth), the device ID of a device or terminal to be used as a collaborative device or terminal may be obtained by using the corresponding wireless communication function, and the synergistic function may then be executed.
The processing for obtaining device identification information is described in detail below with reference to FIG. 31. A description will be given on the assumption that the multifunction device 78 is near the robot device 10 and the robot device 10 obtains device identification information related to the multifunction device 78. FIG. 31 shows the schematic appearance of the multifunction device 78. In this example, the processing for obtaining device identification information by using the marker-based AR technology will be discussed. A marker 128 (such as a two-dimensional barcode) is provided on the housing of the multifunction device 78. The marker 128 is the encoded device identification information of the multifunction device 78. The robot device 10 captures an image of the marker 128 by using the visual sensor, thereby generating image data representing the marker 128. The controller 54 of the robot device 10 executes decoding processing on the marker image represented by the image data so as to extract the device identification information. The robot device 10 can thereby identify the multifunction device 78. The recognition unit 64 of the robot device 10 then refers to the device function management information 40 and specifies the functional information indicating the functions associated with the extracted device identification information. The functions provided in the multifunction device 78 are thus specified (identified).
The marker 128 may also include the encoded functional information indicating the functions of the multifunction device 78. In this case, as a result of the controller 54 executing decoding processing on the image data representing the marker 128, the device identification information of the multifunction device 78 is extracted, and the functional information indicating the functions of the multifunction device 78 is also extracted. The multifunction device 78 and the functions provided in the multifunction device 78 are thus specified (identified).
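The decoding step can be sketched as follows. A real implementation would decode a two-dimensional barcode from camera image data; in this illustrative sketch the decoded payload is represented as a plain string of the assumed form `"device-id;function1,function2"`, which is not a format defined by the specification:

```python
# Hypothetical sketch of decoding the marker 128 into device identification
# information and optional functional information.
def decode_marker(payload: str) -> tuple[str, list[str]]:
    """Split a decoded marker payload into a device ID and a function list.

    If the payload carries no functional information, the function list is
    empty and the functions must be looked up in the device function
    management information 40 instead.
    """
    device_id, _, funcs = payload.partition(";")
    functions = [f for f in funcs.split(",") if f]
    return device_id, functions
```

For a marker that encodes only identification information, the second element is empty, corresponding to the first variant described above.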
If device identification information is obtained by using the markerless AR technology, the robot device 10 captures an image of the whole or a part of the appearance of a device (for example, the multifunction device 78) by using the visual sensor, thereby generating appearance image data. In this case, capturing an image of information for specifying the device (such as the device name (for example, the product name) and the model) is helpful in specifying the device. The controller 54 of the robot device 10 then identifies the device based on the appearance image data. In the storage unit 32 of the robot device 10, appearance image association information is stored for each device; this information indicates the association between appearance image data, which represents the whole or a part of the appearance of a device, and the device identification information related to that device. The controller 54 compares the appearance image data obtained by capturing the image of the device with each item of appearance image data included in the appearance image association information, and specifies the device identification information related to the device to be used based on the comparison results. For example, the controller 54 extracts features of the appearance of the device from the appearance image data obtained by capturing the image of the device, and specifies, from the group of appearance image data items included in the appearance image association information, the appearance image data having the same or similar features. The controller 54 then specifies the device identification information associated with the specified appearance image data. The device (for example, the multifunction device 78) is thus specified. If the appearance image data is generated as a result of capturing an image of the device name (for example, the product name) and the model, the device may be specified based on the device name and the model represented by the appearance image data. The recognition unit 64 refers to the device function management information 40 and specifies the functional information indicating the functions associated with the specified device identification information. The functions provided in the device (for example, the multifunction device 78) are thus specified.
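The feature comparison in the markerless case can be sketched as a nearest-neighbor match between the captured appearance features and the stored ones. Representing features as plain numeric vectors and using Euclidean distance are illustrative assumptions; the specification does not prescribe a particular feature representation or similarity measure:

```python
# Hypothetical sketch of the markerless comparison: the appearance image
# association information maps stored feature vectors to device IDs, and
# the closest stored vector identifies the captured device.
def match_appearance(captured: list[float],
                     association: dict[str, list[float]]) -> str:
    """Return the device ID whose stored features are closest (Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(association, key=lambda dev: dist(captured, association[dev]))
```

The returned device ID then serves as the device identification information used to look up the functions in the device function management information 40.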
If device identification information is obtained by using location information AR technologies, it indicates that mounting device is (for example, more work(
Can device 78) the location information of position obtained by using GPS functions.Each device has the function of GPS and obtains instruction dress
The device location information of seated position.Then robot device 10 exports the letter that instruction obtains the request of device location information to device
Breath, and as the response to request from the device reception device location information.The controller 54 of robot device 10 is based on dress
Seated position information recognition device.In storage device 32, for each device, storage instruction device location information is believed with device identification
Associated position related information between breath, the device location information indicate the position of mounting device, and device identification information
It is related with the device.Controller 54 is based on position related information and specifies and the associated device identification information of device location information.Cause
This, specifies (identification) device.64 comparable device function management information 40 of recognition unit, and specified instruction is known with defined device
The functional information of the function of other information association.It is therefore intended that the work(that (identification) provides in device (for example, multi-function device 78)
Energy.
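The position-based identification above amounts to two table lookups: device position information to device identification information, then device identification information to functional information. A minimal sketch follows; all names (`DEVICE_POSITIONS`, `DEVICE_FUNCTIONS`, `identify_device`) and the coordinate values are illustrative assumptions, not part of the patent.

```python
# Hypothetical sketch: identify a device from GPS position information, then
# look up its functions. Data values are illustrative placeholders.

DEVICE_POSITIONS = {            # position association information: location -> device ID
    (35.6895, 139.6917): "multifunction-device-B",
}
DEVICE_FUNCTIONS = {            # device function management information 40
    "multifunction-device-B": ["print", "scan", "copy", "fax"],
}

def identify_device(lat, lon, tolerance=1e-3):
    """Return (device_id, functions) for the device installed near the given position."""
    for (dlat, dlon), device_id in DEVICE_POSITIONS.items():
        if abs(dlat - lat) < tolerance and abs(dlon - lon) < tolerance:
            return device_id, DEVICE_FUNCTIONS.get(device_id, [])
    return None, []
```

Given these assumed tables, `identify_device(35.6895, 139.6917)` resolves to the multifunction device and its four functions, while an unknown position yields `(None, [])`.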
For example, in user decision mode 1, if the multifunction device 78 is identified by the robot device 10, a device image associated with the multifunction device 78 is displayed on the terminal apparatus 14 as situation information. For example, as shown in FIG. 32, the device image 102 associated with the multifunction device 78 is displayed on the UI 72 of the terminal apparatus 14. The device image 102 may be an image captured by the visual sensor of the robot device 10, or may be an image schematically representing the multifunction device 78.
When the device is identified, information indicating the name of the device may be sent from the robot device 10 to the terminal apparatus 14, and the device name may be displayed on the UI 72 of the terminal apparatus 14. In the example in FIG. 32, the name "multifunction device B" is displayed.
For example, if the user specifies the device image 102 by using the terminal apparatus 14, information indicating the solutions to be executed by using the multifunction device 78 associated with the device image 102 (for example, button images for executing the solutions) is displayed on the UI 72 of the terminal apparatus 14, as shown in FIG. 33. The multifunction device 78 has, for example, a print function, a scan function, a copy function, and a fax function. Button images for executing these functions as solutions are displayed on the UI 72. If the user provides an instruction to execute the print function by specifying the button image for the print function, execution instruction information indicating an instruction to execute the print function is sent from the terminal apparatus 14 or the robot device 10 to the multifunction device 78. The execution instruction information includes data such as control data for executing the print function and image data to be printed by using the print function. Upon receiving the execution instruction information, the multifunction device 78 executes printing in accordance with the execution instruction information.
If, in user decision mode 1, multiple devices (for example, the multifunction device 78 and the projector 88) are identified by the robot device 10, the device image associated with the multifunction device 78 and the device image associated with the projector 88 are displayed on the terminal apparatus 14 as situation information. For example, as shown in FIG. 34, the device image 102 associated with the multifunction device 78 and the device image 104 associated with the projector 88 are displayed on the UI 72 of the terminal apparatus 14. The device images 102 and 104 may be images captured by the visual sensor of the robot device 10, or images schematically representing the multifunction device 78 and the projector 88.
When the devices are identified, information indicating the names of the devices may be sent from the robot device 10 to the terminal apparatus 14, and the information may be displayed on the UI 72 of the terminal apparatus 14. In the example in FIG. 34, the name "multifunction device B" of the multifunction device 78 and the name "projector C" of the projector 88 are displayed.
If the user specifies the device images 102 and 104 by using the terminal apparatus 14, information indicating the solutions that use the multifunction device 78 associated with the device image 102 and the projector 88 associated with the device image 104 (button images for executing the solutions) is displayed on the UI 72 of the terminal apparatus 14, as shown in FIG. 35. These solutions are implemented by cooperative functions of the multifunction device 78 and the projector 88. By combining the multifunction device 78 and the projector 88, a cooperative function of projecting, by using the projector 88, a scanned image generated by the multifunction device 78, and a cooperative function of printing, by using the multifunction device 78, an image projected by the projector 88 become executable. Button images for executing these cooperative functions are displayed on the UI 72 of the terminal apparatus 14. If the user specifies a button image and provides an instruction to execute the solution (cooperative function), execution instruction information indicating an instruction to execute the solution is sent from the terminal apparatus 14 or the robot device 10 to the multifunction device 78 and the projector 88. Upon receiving the execution instruction information, the multifunction device 78 and the projector 88 execute the cooperative function specified by the user.
The multifunction device 78 and the projector 88 may be designated as cooperative devices to be combined as a result of the user, for example, touching the device image 102 with a finger and then sliding the finger to the device image 104, thereby specifying the device images 102 and 104. The user may instead touch the device image 104 with a finger first and then slide the finger to the device image 102. A screen contact medium other than a finger may be used. The user may designate the multifunction device 78 and the projector 88 as cooperative devices by connecting the device images 102 and 104, or by superimposing the device images 102 and 104 on each other. The user may draw a figure (such as a circle) around the device images associated with the cooperative devices, or may specify the device images associated with the cooperative devices within a preset time. When canceling the cooperation, the user may specify the devices to be canceled on the screen or press a cooperation cancel button. The user may specify a device to be canceled by performing a predetermined operation (such as marking a cross).
In another example, a cooperative device may be set in advance as a basic cooperative device. For example, the multifunction device 78 is set in advance as the basic cooperative device. Device identification information concerning the basic cooperative device may be stored in advance in the robot device 10 or in another device (such as a server). The user may instead specify the basic cooperative device by using the terminal apparatus 14. If a basic cooperative device is set, the user specifies the device images associated with devices other than the basic cooperative device so as to select the devices to be combined with the basic cooperative device.
In the examples described above, functions (single-device functions and cooperative functions) are implemented by using hardware devices. However, functions may instead be implemented by software (software applications). For example, instead of device images, function images (for example, icon images) associated with functions implemented by software are displayed on the UI 72 of the terminal apparatus 14. As a result of the user specifying one or more function images, the function associated with a function image, or a cooperative function that uses the multiple functions associated with multiple function images, may be specified. Device images associated with hardware devices and function images associated with functions implemented by software may be displayed together on the UI 72. As a result of the user specifying a device image and a function image, a cooperative function that uses the device associated with the device image and the function associated with the function image may be specified.
The processing for causing a device to execute a function is described below. In this example, the processing for executing a cooperative function will be discussed. A connection request is sent from the terminal apparatus 14 to the cooperative devices to be combined, and the terminal apparatus 14 and these devices are connected to each other. The connection request may instead be sent from the robot device 10 to the cooperative devices, in which case the robot device 10 and these devices are connected to each other. This connection processing is described below with reference to the sequence diagram of FIG. 36.
In step S70, the user specifies the cooperative function to be executed by using the terminal apparatus 14. Then, in step S71, the terminal apparatus 14 sends information indicating a connection request to the devices that execute the cooperative function (for example, the multifunction device 78 and the projector 88). If address information indicating the addresses of these devices is stored in the robot device 10, the terminal apparatus 14 obtains the address information from the robot device 10. The address information may instead be stored in the terminal apparatus 14, or the terminal apparatus 14 may obtain the address information by another method. The terminal apparatus 14 then sends the information indicating the connection request to the cooperative devices (for example, the multifunction device 78 and the projector 88) by using the obtained address information.
Upon receiving the information indicating the connection request, in step S72, the multifunction device 78 and the projector 88 accept or reject the connection request. If the multifunction device 78 and the projector 88 are not permitted to connect to other devices, or if the number of devices requesting connection to the multifunction device 78 and the projector 88 exceeds a maximum number, the multifunction device 78 and the projector 88 reject the connection request. If the multifunction device 78 and the projector 88 accept the connection request, they may prohibit changes to their setting information in order to prevent the setting information concerning the multifunction device 78 and the projector 88 from being changed from the terminal apparatus 14. For example, changes to the color parameters of the multifunction device 78 or to the preset time for shifting to a power saving mode may be prohibited. This enhances the security of the cooperative devices. In another example, changes to the setting information of a cooperative device may be restricted more than in a device used alone. For example, fewer setting items are allowed to be changed for a cooperative device than for a device used alone. Reading of personal information concerning other users (such as operation histories) may be prohibited. This enhances the security of personal information concerning the users.
In step S73, result information indicating whether the connection request has been accepted is sent from the multifunction device 78 and the projector 88 to the terminal apparatus 14. If the connection request has been accepted, communication is established between the terminal apparatus 14 and each of the multifunction device 78 and the projector 88.
Then, in step S74, the user provides an instruction to execute the cooperative function by using the terminal apparatus 14. In response to the instruction, in step S75, execution instruction information indicating an instruction to execute the cooperative function is sent from the terminal apparatus 14 to the multifunction device 78 and the projector 88. The execution instruction information sent to the multifunction device 78 includes information (for example, job information) indicating the processing to be executed by the multifunction device 78. The execution instruction information sent to the projector 88 includes information (for example, job information) indicating the processing to be executed by the projector 88.
Upon receiving the execution instruction information, in step S76, the multifunction device 78 and the projector 88 execute their functions in accordance with the execution instruction information. If the cooperative function involves processing for sending and receiving data between the multifunction device 78 and the projector 88 (for example, transferring scanned data from the multifunction device 78 (multifunction device B) to the projector 88 (projector C) and projecting the scanned data by the projector 88), communication is established between the multifunction device 78 and the projector 88. In this case, the execution instruction information sent to the multifunction device 78 includes address information indicating the address of the projector 88, and the execution instruction information sent to the projector 88 includes address information indicating the address of the multifunction device 78. Communication is established between the multifunction device 78 and the projector 88 by using these items of address information.
When the execution of the cooperative function is completed, in step S77, information indicating the completion of the execution of the cooperative function is sent from the multifunction device 78 and the projector 88 to the terminal apparatus 14. In step S78, the information indicating the completion of the execution of the cooperative function is displayed on the UI 72 of the terminal apparatus 14. If this information is not displayed after a preset time has elapsed since the execution instruction information was sent, the terminal apparatus 14 displays information indicating an error on the UI 72, and may send the execution instruction information or the connection request information to the multifunction device 78 and the projector 88 again.
Then, in step S79, the user checks whether to cancel the cooperation state between the multifunction device 78 and the projector 88, and in step S80, processing according to the check result is executed. If the user decides to cancel the cooperation state, the user provides an instruction to cancel the cooperation state by using the terminal apparatus 14, thereby canceling the communication between the terminal apparatus 14 and each of the multifunction device 78 and the projector 88. The communication between the multifunction device 78 and the projector 88 is also canceled. If the user decides not to cancel the cooperation state, the terminal apparatus 14 may continue to provide execution instructions.
If a single-device function is to be executed, information indicating an instruction to execute the function is sent from the terminal apparatus 14 to the device that executes the function. The device then executes the function in accordance with the execution instruction.
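The sequence of FIG. 36 can be summarized as: connect to every cooperative device (steps S71 through S73), and execute the cooperative function only if every device accepts (steps S75 and S76). A minimal sketch under stated assumptions follows; the `Device` class, the connection limit, and the return strings are all illustrative, not taken from the patent.

```python
# Hypothetical sketch of the connection sequence of FIG. 36 (steps S70-S76).

class Device:
    def __init__(self, name, max_connections=1):
        self.name = name
        self.max_connections = max_connections  # assumed per-device limit
        self.connections = 0

    def request_connection(self):
        # Step S72: the device accepts or rejects the connection request,
        # rejecting when the maximum number of connections is exceeded.
        if self.connections >= self.max_connections:
            return False
        self.connections += 1
        return True

def run_cooperative_function(devices):
    # Steps S71/S73: connect to every device; steps S75/S76: execute.
    if all(d.request_connection() for d in devices):
        return "executed cooperative function on " + " + ".join(d.name for d in devices)
    return "error: connection rejected"
```

With a fresh multifunction device and projector the function executes; a second attempt against the same already-connected devices is rejected, mirroring the maximum-connections check in step S72.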
The execution instruction information described above may instead be sent from the robot device 10 to the cooperative devices.
The processing for specifying the devices that execute a solution is described below.
(Processing for switching the display order of cooperative function information)
In this exemplary embodiment, the display order of items of information concerning cooperative functions may be switched in accordance with the order in which the device images associated with the devices are connected. This processing is discussed in detail below with reference to FIGS. 37 through 39B.
FIG. 37 illustrates a cooperative function management table, which is another example of the cooperative function management information 42. In the example of the cooperative function management table shown in FIG. 37, information indicating combinations of device IDs, information indicating the names of the cooperative devices (for example, the device types), information indicating the cooperative functions (cooperative function information), information indicating the connection order, and information indicating priority levels are associated with each other. The connection order indicates the order in which the device images associated with the devices are connected. The priority level indicates the priority with which items of information concerning the cooperative functions are displayed. The device with the device ID "A" is, for example, a PC, and the device with the device ID "B" is a multifunction device. By combining the PC (A) and the multifunction device (B), a "scan-and-transfer function" and a "print function" are implemented as cooperative functions. The "scan-and-transfer function" is a function of transferring image data generated by a scan operation of the multifunction device (B) to the PC (A). The "print function" is a function of sending data stored in the PC (A) (for example, image data or document data) to the multifunction device (B) and printing the data on the multifunction device (B). If the user connects the multifunction device (B) to the PC (A) (that is, if the user connects the device image associated with the multifunction device (B) to the device image associated with the PC (A)), the "scan-and-transfer function" takes the first priority level and the "print function" takes the second priority level. In this case, information concerning the "scan-and-transfer function" is displayed with higher priority than information concerning the "print function". Conversely, if the user connects the PC (A) to the multifunction device (B) (that is, if the user connects the device image associated with the PC (A) to the device image associated with the multifunction device (B)), the "print function" takes the first priority level and the "scan-and-transfer function" takes the second priority level. In this case, information concerning the "print function" is displayed with higher priority than information concerning the "scan-and-transfer function".
FIGS. 38A through 39B illustrate examples of screens displayed on the UI 72 of the terminal apparatus 14. For example, the multifunction device (B) and the PC (A) are identified. In user decision mode 1 described above, as shown in FIG. 38A, the device image 102 associated with the multifunction device (B) and the device image 130 associated with the PC (A) are displayed on the UI 72 of the terminal apparatus 14 as situation information. In this state, the user connects the device images representing the devices to be combined by using an indicator (for example, the user's finger, a pen, or a stylus). The controller 74 of the terminal apparatus 14 detects the position at which the indicator touches the screen, so as to detect the movement of the indicator on the screen. For example, as indicated by the arrow 132 in FIG. 38A, the user touches the device image 102 on the screen with the indicator and slides the indicator to the device image 130 on the screen, thereby connecting the device images 102 and 130. This operation designates, as cooperative devices, the multifunction device (B) associated with the device image 102 and the PC (A) associated with the device image 130, and also specifies the connection order of the devices. The order in which the device images are connected corresponds to the connection order of the devices. The multifunction device (B) corresponds to the first device, and the PC (A) corresponds to the second device. In the example in FIG. 38A, the user connects the device image 102 to the device image 130, and the multifunction device (B) is thereby connected to the PC (A). Information indicating the connection order of the devices is sent from the terminal apparatus 14 to the robot device 10. The controller 74 of the terminal apparatus 14 may display an image representing the path followed by the user on the screen. After the cooperative devices are connected to each other, the controller 74 may replace the path with a preset straight line and display it on the screen.
As described above, when the cooperative devices to be combined (for example, the multifunction device (B) and the PC (A)) are specified, the recognition unit 64 of the robot device 10 refers to the cooperative function management table shown in FIG. 37 and identifies the cooperative functions corresponding to the combination of the PC (A) and the multifunction device (B). The cooperative functions that can be executed by combining the PC (A) and the multifunction device (B) are identified in this way. When the user specifies the connection order of the devices, the recognition unit 64 refers to the cooperative function management table and identifies the priority levels associated with the connection order. This will be explained more specifically with reference to FIG. 37. The PC (A) and the multifunction device (B) are designated as cooperative devices, and the cooperative functions that can be executed by the PC (A) and the multifunction device (B) are thus the "scan-and-transfer function" and the "print function". The user has connected the multifunction device (B) to the PC (A) (B → A), so that the "scan-and-transfer function" takes the first priority level and the "print function" takes the second priority level.
Information concerning the cooperative functions identified as described above and information concerning the priority levels are sent from the robot device 10 to the terminal apparatus 14. The controller 74 of the terminal apparatus 14 displays the information concerning the cooperative functions on the UI 72 in accordance with the priority levels.
For example, as shown in FIG. 38B, the controller 74 of the terminal apparatus 14 displays information concerning the candidate cooperative functions on the UI 72. The "scan-and-transfer function" takes the first priority level, and the "print function" takes the second priority level. In the example in FIG. 38B, the information concerning the "scan-and-transfer function" is accordingly displayed with higher priority than the information concerning the "print function". As the information concerning the "scan-and-transfer function", the description "Transfer the scanned data generated by the multifunction device (B) to the PC (A)" is displayed. As the information concerning the "print function", the description "Print the data stored in the PC (A)" is displayed.
When the user specifies a cooperative function and provides an execution instruction, the specified cooperative function is executed. For example, if the user presses a "YES" button, the cooperative function associated with the "YES" button is executed.
The identification of the cooperative functions and the priority levels may be executed by the terminal apparatus 14 instead of by the robot device 10.
To specify the cooperative devices to be combined and their connection order, the user may draw figures (such as circles) instead of sliding the indicator between the device images. The drawing order corresponds to the connection order. Alternatively, the user may specify the cooperative devices and their connection order by voice.
FIGS. 39A and 39B illustrate another example of the operation. As shown in FIG. 39A, the user touches the device image 130 on the screen with the indicator and slides the indicator on the screen in the direction indicated by the arrow 134 to the device image 102, thereby connecting the device images 130 and 102. This operation designates, as cooperative devices, the PC (A) associated with the device image 130 and the multifunction device (B) associated with the device image 102, and also specifies the connection order of the devices. In the example in FIG. 39A, the user connects the device image 130 to the device image 102, and the PC (A) is thereby connected to the multifunction device (B). According to the cooperative function management table in FIG. 37, the "print function" takes the first priority level, and the "scan-and-transfer function" takes the second priority level. In this case, as shown in FIG. 39B, the information concerning the "print function" is displayed on the UI 72 of the terminal apparatus 14 with higher priority than the information concerning the "scan-and-transfer function".
As described above, by connecting the device images associated with devices, the cooperative functions implemented by the functions of the connected devices are specified. The display order of the items of information concerning the cooperative functions changes in accordance with the order in which the device images are connected (that is, the connection order of the devices). The connection order of the devices also serves as the order in which the functions of the devices are used and the order in which data is transferred between the devices, and the operation of connecting the devices (that is, the operation of connecting the device images) thus serves as the operation of specifying those orders. By changing the display order of the items of information concerning the cooperative functions in accordance with the connection order, information concerning the cooperative function that the user is most likely to use is displayed with priority. If the user connects the device image 102 associated with the multifunction device (B) to the device image 130 associated with the PC (A), it may be assumed that the user will use the cooperative function of "using the function of the multifunction device (B) first and transferring data from the multifunction device (B) to the PC (A)". If the user connects the device image 130 associated with the PC (A) to the device image 102 associated with the multifunction device (B), it may be assumed that the user will use the cooperative function of "using the function of the PC (A) first and transferring data from the PC (A) to the multifunction device (B)". The order in which the functions are used and the order in which data is transferred between the devices are specified simply by connecting the device images, and information concerning the cooperative function that the user is likely to use can therefore be displayed.
The display switching processing described above is also applicable to the use of function images associated with functions. For example, the display of items of information concerning cooperative functions is switched in accordance with the order in which the image associated with a first function and the image associated with a second function are specified.
(Cooperation processing using partial images)
Different functions may be assigned to different positions in the device image associated with a device to be used as a cooperative device. When the user specifies a particular position in the device image, information concerning the cooperative function implemented by using the function assigned to that position is displayed with priority. This processing is described in detail below.
FIG. 40 shows an example of a device function management table. Data in the device function management table are stored in the robot device 10 as the device function management information 40. In the example of the device function management table shown in FIG. 40, device IDs, information indicating device names (for example, device types), information indicating positions in the device images (device image positions), information indicating the functions assigned to the device image positions (functional information), and image IDs are associated with each other. A device image position is a particular position (particular part) in the device image associated with a device, for example, a particular position in a device image schematically representing the device or in a device image captured by a camera. Different functions are assigned to the particular positions in the device image.
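The device function management table of FIG. 40 can be pictured as a mapping from (device, device image position) to the assigned function. The sketch below uses the example assignments described in connection with FIG. 41A; the string identifiers and the lookup helper are illustrative assumptions.

```python
# Hypothetical sketch of the device function management table of FIG. 40:
# different functions are assigned to different positions (partial images)
# within a device image.

PART_FUNCTIONS = {
    ("multifunction-device-B", "body"): "print",            # partial image 102a
    ("multifunction-device-B", "document-cover"): "scan",   # partial image 102b
    ("multifunction-device-B", "post-processor"): "staple", # partial image 102c
    ("pc-A", "body"): "store-data",                         # partial image 130a
    ("pc-A", "display"): "show-on-screen",                  # partial image 130b
}

def function_at(device, part):
    """Return the function assigned to a particular position in a device image."""
    return PART_FUNCTIONS.get((device, part))
```

Specifying the document cover of the multifunction device thus selects the scan function, and specifying the display of the PC selects the screen display function.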
FIGS. 41A and 41B illustrate examples of screens displayed on the UI 72 of the terminal apparatus 14. For example, the multifunction device (B) and the PC (A) are identified. In user decision mode 1 described above, as shown in FIG. 41A, the device images 102 and 130 are displayed on the UI 72 of the terminal apparatus 14 as situation information. For example, the following functions are assigned to particular positions in the device image 102. The print function is assigned to the particular position corresponding to the body of the multifunction device (B) (partial image 102a). The scan function is assigned to the particular position corresponding to the document cover, document glass, and automatic document feeder of the multifunction device (B) (partial image 102b). The stapling function is assigned to the particular position corresponding to the post-processing device of the multifunction device (B) (partial image 102c). The stapling function is a function of stapling sheets output from the multifunction device (B). For example, the following functions are assigned to parts of the device image 130. The data storage function is assigned to the particular position corresponding to the body of the PC (A) (partial image 130a). The screen display function is assigned to the particular position corresponding to the display of the PC (A) (partial image 130b). The data storage function is a function of storing data sent from another device in the PC (A). The screen display function is a function of displaying data sent from another device on the PC (A).
The controller 74 of the terminal apparatus 14 may display the names of the functions assigned to the particular positions in the device images (such as print and scan) on the UI 72. This allows the user to understand which function is assigned to which position. The names of the functions do not necessarily have to be displayed.
When the user specifies a position in a device image to which a function is assigned, that function is specified as a function to be used for a cooperative function. In the device images representing the devices to be used as cooperative devices, the user connects, by using the indicator, the particular positions (partial images) to which the functions are assigned. For example, as indicated by the arrow 136 in FIG. 41A, the user touches the partial image 102b with the indicator and slides the indicator to the partial image 130b, thereby connecting the partial images 102b and 130b. This connecting operation designates, as cooperative devices, the multifunction device (B) associated with the device image 102 including the partial image 102b and the PC (A) associated with the device image 130 including the partial image 130b, and also specifies the scan function assigned to the partial image 102b and the screen display function assigned to the partial image 130b. The connecting operation may also specify the connection order of the devices. The order in which the partial images are connected corresponds to the connection order of the devices. In the example in FIG. 41A, the user connects the partial image 102b to the partial image 130b, and the multifunction device (B) is thereby connected to the PC (A). The scan function and the screen display function are specified as the functions to be used for the cooperative function. Information indicating the connection order of the devices and information indicating the positions in the device images specified by the user are sent from the terminal apparatus 14 to the robot device 10.
When the devices to be combined as cooperative devices (for example, the PC (A) and the multifunction device (B)) are identified, the recognition unit 64 of the robot device 10 refers to the cooperative function management table shown in FIG. 11 and identifies the cooperative functions to be implemented by combining the PC (A) and the multifunction device (B). The recognition unit 64 also refers to the device function management table shown in FIG. 40 and identifies the functions assigned to the positions specified by the user. The recognition unit 64 then adjusts the priority levels of the cooperative functions implemented by combining the PC (A) and the multifunction device (B). More specifically, the recognition unit 64 raises the priority levels of the cooperative functions that use the functions assigned to the positions specified by the user, and lowers the priority levels of the cooperative functions that do not use these functions.
Information related with synergistic function and information related with priority level are sent to terminal dress from robot device 10
Set 14.The controller 74 of terminal installation 14 makes information related with synergistic function be shown on UI 72 according to priority level, makees
For the related information with synergistic function candidate.
For example, as shown in Figure 41B, the controller 74 displays information about the synergistic function candidates on the UI 72. The user has specified the scanning function and the screen display function in this order. Accordingly, information about the synergistic function "scan transfer display function", which is executed by using the scanning function and the screen display function, is displayed with priority over information about the other synergistic functions; it is displayed above the other information items. For example, information about the "scan transfer display function" is displayed with priority over information about the "scan transfer store function", which is a synergistic function executed by using the scanning function and the data storage function. The scan transfer display function is a function of transferring scan data generated by the multi-function device (B) to the PC (A) and displaying the scan data on the screen of the PC (A). The scan transfer store function is a function of transferring the scan data generated by the multi-function device (B) to the PC (A) and storing the scan data in the PC (A). In the example in Figure 41B, a description of each synergistic function is displayed as the information about that synergistic function.
The collaboration processing using part images described above allows the user to individually specify the functions assigned to the parts of the collaborative devices, and allows information about the synergistic functions implemented by the functions specified by the user to be displayed with priority. The synergistic functions that the user is most likely to use are therefore displayed preferentially.
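As an illustrative aside (not part of the disclosed embodiments), the priority adjustment described above can be sketched as follows; the function names and the `uses` data layout are hypothetical stand-ins for the management-table entries.

```python
# Hypothetical sketch: synergistic-function candidates that use all of the
# user-specified functions are ranked ahead of those that do not.

def rank_candidates(candidates, specified_functions):
    """Sort candidates so that synergistic functions using every
    user-specified function come first (sorted() is stable)."""
    specified = set(specified_functions)

    def priority(candidate):
        # 0 = uses all specified functions (highest priority), 1 = otherwise
        return 0 if specified <= set(candidate["uses"]) else 1

    return sorted(candidates, key=priority)

candidates = [
    {"name": "scan transfer store function", "uses": ["scan", "store"]},
    {"name": "scan transfer display function", "uses": ["scan", "display"]},
]
ranked = rank_candidates(candidates, ["scan", "display"])
```

With the user having specified the scanning and display functions, the "scan transfer display function" candidate is listed first, matching the display order described for Figure 41B.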
A synergistic function may be a function using a combination of parts of the same device, a function using a combination of parts of different devices, a function using a combination of the whole of one device and a part of another device, or a function using a combination of the whole of one device and the whole of another device.
The collaboration processing using part images described above can also be applied to the use of function images associated with functions. For example, different functions may be assigned to different parts of a function image, and the synergistic function implemented by using the functions assigned to the positions specified by the user is identified.
(Another example of collaboration processing using part images)
Another example of collaboration processing using part images is discussed below with reference to Figures 42 and 43.
Figure 42 shows an example of the apparatus function management table. The data in the apparatus function management table are stored in the robot device 10 as the apparatus function management information 40. In the example of the apparatus function management table shown in Figure 42, a device ID, information indicating the device name (for example, the device type), information indicating the names of the parts of the device (for example, the part types), part IDs serving as part identification information for identifying the parts, information indicating the functions assigned to the parts (the functions of the parts), and part image IDs for identifying the part images associated with the parts are associated with each other. A part image is an image representing the appearance of a part of a device captured by a camera. A part image may alternatively be an image schematically representing a part of a device. Different functions are assigned to the different parts of a device.
This will be described in more detail. The screen display function is assigned to the display of the PC (A), and information about the screen display function is associated with the part image ID of the part image associated with the display. The screen display function is a function of displaying information on the PC (A). The data storage function is assigned to the main body of the PC (A), and information about the data storage function is associated with the part image ID of the part image associated with the main body. The data storage function is a function of storing data in the PC (A).
The printing function is assigned to the main body of the multi-function device (B), and information about the printing function is associated with the part image ID of the part image associated with the main body. The scanning function is assigned to the reader of the multi-function device (B) (the part corresponding to the document cover, document glass, and automatic document feeder), and information about the scanning function is associated with the part image ID of the part image associated with the reader. The binding function is assigned to the post-processor of the multi-function device (B), and information about the binding function is associated with the part image ID of the part image associated with the post-processor. The binding function is a function of binding the paper output from the multi-function device (B).
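As an illustrative sketch (not part of the disclosed embodiments), the apparatus function management table of Figure 42 could be held in memory as rows associating a device, a part, its part ID, its function, and its part image ID; all IDs and names below are hypothetical.

```python
# Hypothetical in-memory form of the apparatus function management table.
DEVICE_FUNCTION_TABLE = [
    {"device_id": "A", "device": "PC", "part": "display",
     "part_id": "a1", "function": "screen display", "part_image_id": "img-130b"},
    {"device_id": "A", "device": "PC", "part": "main body",
     "part_id": "a2", "function": "data storage", "part_image_id": "img-130a"},
    {"device_id": "B", "device": "multi-function device", "part": "main body",
     "part_id": "b1", "function": "printing", "part_image_id": "img-102a"},
    {"device_id": "B", "device": "multi-function device", "part": "reader",
     "part_id": "b2", "function": "scanning", "part_image_id": "img-102b"},
    {"device_id": "B", "device": "multi-function device", "part": "post-processor",
     "part_id": "b3", "function": "binding", "part_image_id": "img-102c"},
]

def function_of_part(part_id):
    """Look up the function assigned to a device part by its part ID."""
    for row in DEVICE_FUNCTION_TABLE:
        if row["part_id"] == part_id:
            return row["function"]
    return None  # unknown part: no function registered
```

For instance, looking up the reader's part ID returns the scanning function, mirroring the table description above.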
The function assigned to a part of a device can be specified (identified) by using marker-less AR technology. If image data is generated as a result of a camera (for example, the visual sensor of the robot device 10) capturing an image of a part of a device, the recognition unit 64 of the robot device 10 consults the apparatus function management table and specifies (identifies) the function associated with that image data. This operation makes it possible to specify (identify) the function associated with the part of the device. For example, if image data is generated as a result of the visual sensor capturing an image of the main body of the multi-function device (B), the recognition unit 64 consults the apparatus function management table and specifies (identifies) the printing function associated with that image data. It can thus be specified that the function assigned to the main body of the multi-function device (B) is the printing function.
The function assigned to a part of a device can also be specified (identified) by using marker-based AR technology. A marker (such as a two-dimensional barcode) encoding part identification information (for example, a part ID) is provided on each part of a device. An image of the marker is captured by the visual sensor and processed by using marker-based AR technology to obtain the part identification information (part ID) of the corresponding part. The recognition unit 64 of the robot device 10 then refers to the apparatus function management table and specifies (identifies) the function associated with the part identification information.
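The marker-based path can be sketched as follows; this is an illustrative aside, the decoder is a stand-in (a real system would decode the two-dimensional barcode from the captured image), and the part IDs and function names are hypothetical.

```python
# Hypothetical sketch of the marker-based identification path:
# decode the part ID from the marker payload, then consult the
# apparatus function management table (here a plain dict).

PART_FUNCTIONS = {"b1": "printing", "b2": "scanning"}  # part ID -> function

def decode_marker(marker_payload):
    # Stand-in for two-dimensional-barcode decoding of the captured
    # marker image; here it just normalizes the payload string.
    return marker_payload.strip()

def identify_function(marker_payload):
    """Return the function assigned to the part whose marker was captured."""
    part_id = decode_marker(marker_payload)
    return PART_FUNCTIONS.get(part_id)  # None if the part ID is unknown
```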
Figure 43 shows an example of the synergistic function management table. The data in the synergistic function management table are stored in the robot device 10 as the synergistic function management information 42. The synergistic function management table indicates the synergistic functions that can be executed by using the functions of multiple parts. In the example of the synergistic function management table shown in Figure 43, information indicating combinations of device parts, information indicating combinations of part IDs, and information indicating the synergistic functions that can be executed by using the functions of the combinations of device parts are associated with each other. In the synergistic function management table, information indicating a combination of a part of one device and the whole of another device and information indicating a synergistic function that can be executed by using the function of the device part and the function of the whole device may also be associated with each other.
The synergistic function management table will be described in more detail. As a synergistic function, a printing function is assigned to the combination of the display of the PC (A) and the main body of the multi-function device (B). Information indicating the printing function as a synergistic function is associated with information indicating the combination of the part ID of the display of the PC (A) and the part ID of the main body of the multi-function device (B). The printing function as a synergistic function here is a function of sending data stored in the PC (A) to the multi-function device (B) and printing the data with the multi-function device (B).
As a synergistic function, a printing function is also assigned to the combination of the main body of the multi-function device (B) and the main body of the projector (C). Information indicating this printing function as a synergistic function is associated with information indicating the combination of the part ID of the main body of the multi-function device (B) and the part ID of the main body of the projector (C). The printing function as a synergistic function here is a function of sending data projected by the projector (C) to the multi-function device (B) and printing the data with the multi-function device (B).
As a synergistic function, a scan projection function is assigned to the combination of the reader of the multi-function device (B) and the main body of the projector (C). Information indicating the scan projection function as a synergistic function is associated with information indicating the combination of the part ID of the reader of the multi-function device (B) and the part ID of the main body of the projector (C). The scan projection function as a synergistic function is a function of sending scan data generated by the multi-function device (B) to the projector (C) and projecting the data with the projector (C).
A synergistic function may be a function using the functions of multiple parts of the same device, or a function using the functions of multiple parts of different devices. A synergistic function may also be a function using the functions of three or more parts.
When multiple device parts (such as multiple parts of different devices or multiple parts of the same device) are specified (identified) by using marker-based AR technology or marker-less AR technology, the recognition unit 64 of the robot device 10 refers to the synergistic function management table and specifies (identifies) the synergistic function associated with the combination of the identified parts. This operation makes it possible to specify (identify) a synergistic function that uses the functions of multiple parts, for example parts recognized from captured images. If the main body of the multi-function device (B) and the main body of the projector (C) are identified, the robot device 10 refers to the synergistic function management table and specifies the printing function as the synergistic function associated with the combination of the main body of the multi-function device (B) and the main body of the projector (C).
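As an illustrative sketch (not part of the disclosed embodiments), the synergistic function management table of Figure 43 can be modeled as a mapping from an order-insensitive combination of part IDs to the synergistic function it enables; the part IDs and function labels below are hypothetical.

```python
# Hypothetical form of the synergistic function management table:
# frozenset keys make the part-ID combination order-insensitive.
SYNERGY_TABLE = {
    frozenset({"a1", "b1"}): "printing via PC display",   # PC display + MFD body
    frozenset({"b1", "c1"}): "printing via projector",    # MFD body + projector body
    frozenset({"b2", "c1"}): "scan projection",           # MFD reader + projector body
}

def synergistic_function(part_ids):
    """Return the synergistic function registered for this combination
    of identified part IDs, or None if no combination matches."""
    return SYNERGY_TABLE.get(frozenset(part_ids))
```

Identifying the reader of the multi-function device and the projector body, in either order, resolves to the scan projection function; an unregistered combination yields no synergistic function.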
(Specifying collaborative devices by superimposing device images)
Devices to be used as collaborative devices can also be specified by superimposing multiple device images. This processing is described below with reference to Figures 44A to 45B, which illustrate examples of screens displayed on the UI 72 of the terminal apparatus 14.
For example, the multi-function device (B) and the PC (A) are identified. Under user decision-making mode 1 described above, as shown in Figure 44A, the device images 102 and 130 associated with the identified devices are displayed on the UI 72 of the terminal apparatus 14 as situation information. In this state, by using an indicator (for example, the user's finger, a pen, or a stylus), the user superimposes one device image (a first image) on another device image (a second image). For example, as shown in Figure 44B, the user specifies the device image 102 with the indicator and, as indicated by the arrow 138, superimposes the device image 102 on the device image 130. The user superimposes the device images 102 and 130 by, for example, performing a drag-and-drop operation. That is, the user drags the device image 102 and drops it on the device image 130. The drag-and-drop operation is a known operation. Alternatively, the user may provide a voice instruction specifying that the device images are to be superimposed on each other. For example, the device images 102 and 130 may be specified and superimposed on each other in accordance with a voice instruction provided by the user.
By superimposing the device images 102 and 130 on each other, the multi-function device (B) associated with the device image 102 and the PC (A) associated with the device image 130 are designated as collaborative devices.
While a device image is being dragged, under the control of the controller 74 of the terminal apparatus 14, the device image may be displayed on the UI 72 in a recognizable manner, for example displayed semi-transparently or in a particular color.
After the device image 102 is superimposed on the device image 130, if the PC (A) can execute a synergistic function by using the multi-function device (B), a check screen 140 is displayed on the UI 72 of the terminal apparatus 14, as shown in Figure 44C. The check screen 140 is a screen for checking whether the user wishes to combine the devices designated as collaborative devices. If the user approves the synergistic function on the check screen 140 (if the user presses the "Yes" button), information about the synergistic functions is displayed on the UI 72.
As shown in Figure 45A, the controller 74 of the terminal apparatus 14 displays information about the synergistic function candidates on the UI 72. By combining the multi-function device (B) and the PC (A), the scan transfer function and the printing function can be implemented. Accordingly, information about the scan transfer function and information about the printing function are displayed on the UI 72.
When the user specifies a synergistic function and provides an execution instruction, a connection request is sent from the terminal apparatus 14 to the collaborative devices. As shown in Figure 45B, a standby screen is displayed on the UI 72 of the terminal apparatus 14 while the connection request is being sent. After the connections between the terminal apparatus 14 and the collaborative devices are successfully established, the specified synergistic function is executed.
As described above, by superimposing the device images associated with the collaborative devices on each other, the synergistic function implemented by using the functions of the superimposed devices is specified. A simple operation, superimposing images, is thus sufficient to combine functions.
A synergistic function can also be specified by superimposing a part image on a device image or on another part image. This processing is described below with reference to Figures 46A and 46B, which illustrate examples of screens displayed on the UI 72 of the terminal apparatus 14.
In a manner similar to the collaboration processing using part images described above, different functions are assigned to different positions in the device image associated with a device. By superimposing a part image in a device image on another part image in the same device image or in a different device image, a synergistic function that uses the functions assigned to these superimposed part images is specified. This processing is discussed below.
For example, the multi-function device (B) and the PC (A) are identified. Under user decision-making mode 1 described above, as shown in Figure 46A, the device images 102 and 130 are displayed on the UI 72 of the terminal apparatus 14 as situation information. The part images 102a, 102b, 102c, 130a, and 130b are displayed as images that can be separated from the other part images and moved independently.
When a part image is specified by the user and superimposed on another part image, a synergistic function that uses the functions assigned to these superimposed part images is specified, and information about the synergistic function is displayed on the UI 72 of the terminal apparatus 14.
For example, if the user drags the part image 102b by using the indicator and drops it on the part image 130b, then, as indicated by the arrow 142 in Figure 46B, the multi-function device (B) associated with the device image 102 including the part image 102b and the PC (A) associated with the device image 130 including the part image 130b are designated as collaborative devices. The scanning function assigned to the part image 102b and the screen display function assigned to the part image 130b are also designated as the functions of the synergistic function.
The functions assigned to the part images are managed in the robot device 10. For example, identification information for identifying the part images, function information indicating the functions assigned to the part images, and synergistic function information indicating the synergistic functions executed by combining functions are stored in the robot device 10 in association with each other. When a part image is selected and superimposed on another part image, identification information indicating these superimposed part images is sent from the terminal apparatus 14 to the robot device 10. In the example in Figure 46B, identification information indicating the part image 102b and identification information indicating the part image 130b are sent from the terminal apparatus 14 to the robot device 10. The recognition unit 64 of the robot device 10 specifies the functions assigned to the part images 102b and 130b on the basis of the identification information, and specifies the synergistic function that uses these functions. Information about the synergistic function is sent from the robot device 10 to the terminal apparatus 14 and is displayed.
The processing described above allows the user to individually specify the functions of the collaborative devices, and allows information about the synergistic functions that use the functions specified by the user to be displayed with priority. The synergistic functions that the user is likely to use are therefore displayed preferentially.
The priority levels at which the synergistic functions are displayed may be changed in accordance with the order in which the part images are superimposed. In this case, information about a synergistic function that uses the function associated with a superimposed part image is displayed with priority.
Each of the robot device 10 and the terminal apparatus 14 may be implemented as a result of software and hardware operating together. More specifically, each of the robot device 10 and the terminal apparatus 14 includes one or more processors (not shown), such as central processing units (CPUs). As a result of the processor or processors reading and executing a program stored in a storage device (not shown), the functions of the robot device 10 and the terminal apparatus 14 are implemented. The program is stored in the storage device by using a recording medium, such as a compact disc (CD) or a digital versatile disc (DVD), or via a communication path such as a network. Alternatively, the functions of the robot device 10 and the terminal apparatus 14 may be implemented by using hardware resources, such as processors, electronic circuits, and application-specific integrated circuits (ASICs). A device such as a memory may be used in that case. Alternatively, the functions of the robot device 10 and the terminal apparatus 14 may be implemented by using digital signal processors (DSPs) or field-programmable gate arrays (FPGAs).
The foregoing description of the exemplary embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.
Claims (20)
1. A robot device comprising:
a detector that detects ambient conditions; and
a controller that, if the robot device cannot execute a solution for solving a problem in the ambient conditions by using a function of the robot device, executes control so that the solution will be executed by using an element other than the robot device.
2. The robot device according to claim 1, wherein the element includes at least one of a person and a device other than the robot device.
3. The robot device according to claim 2, wherein the controller controls the device other than the robot device by communicating with the device, so that the device executes the solution.
4. The robot device according to claim 2, wherein the controller controls the device other than the robot device by directly operating an operating unit of the device, so that the device executes the solution.
5. The robot device according to claim 4, wherein, if the controller cannot control the device other than the robot device by communicating with the device, the controller controls the device by directly operating the operating unit of the device.
6. The robot device according to any one of claims 2 to 5, wherein the solution is executed by collaboration between at least two of the robot device, the device other than the robot device, and the person.
7. The robot device according to any one of claims 1 to 6, wherein the controller executes control so that information indicating the solution will be displayed.
8. The robot device according to claim 7, wherein the controller also executes control so that a device image associated with a device capable of executing the solution will be displayed.
9. The robot device according to claim 8, wherein a function that uses the device associated with the device image specified by a user is executed.
10. The robot device according to claim 8, wherein, if a user specifies a plurality of device images, a synergistic function that uses the plurality of devices associated with the plurality of device images is executed.
11. The robot device according to any one of claims 1 to 10, wherein the detector detects information about a person around the robot device and information about surroundings other than the person, as the ambient conditions.
12. The robot device according to claim 11, wherein, when a value representing the information about the surroundings other than the person is equal to or higher than a threshold, the detector determines that a problem has occurred in the ambient conditions.
13. The robot device according to claim 12, wherein the threshold changes in accordance with the information about the person detected by the detector.
14. The robot device according to any one of claims 11 to 13, wherein, if a problem in the ambient conditions detected from a first detection result based on the information about the person differs from a problem in the ambient conditions detected from a second detection result based on the information about the surroundings other than the person, the controller selects a solution for solving the problem in accordance with a priority level determined for the first detection result or a priority level determined for the second detection result.
15. The robot device according to any one of claims 1 to 14, wherein, if the element other than the robot device can be used for a charge, the controller controls a payment operation of the robot device in order to use the element.
16. The robot device according to claim 15, wherein, if the robot device does not have any payment means, the controller causes the robot device to execute the payment operation by receiving payment support from the element.
17. The robot device according to any one of claims 1 to 16, further comprising:
a communication unit that communicates with a device other than the robot device by using a communication method that is changed to suit the ambient environment.
18. The robot device according to any one of claims 1 to 17, wherein, if the solution is a solution to be executed by a person to solve the problem, the controller executes control so that information about how to execute the solution will be provided.
19. The robot device according to any one of claims 1 to 18, wherein a problem that the robot device cannot solve by using a function of the robot device is a problem that cannot be solved by the function of the robot device alone, a problem that requires a specific threshold time or longer to solve by using the function of the robot device, or a problem for which a result of sufficient quality cannot be produced by using the function of the robot device.
20. A control method for a robot device, the control method comprising the steps of:
detecting ambient conditions; and
if the robot device cannot execute a solution for solving a problem in the ambient conditions by using a function of the robot device, executing control so that the solution will be executed by using an element other than the robot device.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-002408 | 2017-01-11 | ||
JP2017002408A JP6439806B2 (en) | 2017-01-11 | 2017-01-11 | Robot apparatus and program |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108297092A true CN108297092A (en) | 2018-07-20 |
CN108297092B CN108297092B (en) | 2022-09-20 |
Family
ID=62869849
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710924391.XA Active CN108297092B (en) | 2017-01-11 | 2017-09-30 | Robot apparatus and control method thereof |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP6439806B2 (en) |
CN (1) | CN108297092B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112294208A (en) * | 2020-10-30 | 2021-02-02 | 江苏美的清洁电器股份有限公司 | Cleaning robot map display method and device |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109120968A (en) * | 2018-08-14 | 2019-01-01 | 上海常仁信息科技有限公司 | A kind of system and method for robot and display screen wireless transmission projection |
WO2020071235A1 (en) * | 2018-10-03 | 2020-04-09 | ソニー株式会社 | Control device for mobile body, control method for mobile body, and program |
WO2020149131A1 (en) | 2019-01-15 | 2020-07-23 | ソニー株式会社 | Remote control device, moving device, communication control method, and program |
KR102386353B1 (en) | 2020-09-17 | 2022-04-12 | 유진기술 주식회사 | Telepresence robot and manless store managing method using the same |
KR20240005511A (en) * | 2022-07-05 | 2024-01-12 | 삼성전자주식회사 | Robot of performing a specific service and control method thereof |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003291083A (en) * | 2002-03-28 | 2003-10-14 | Toshiba Corp | Robot device, robot controlling method, and robot delivery system |
CN1518489A (en) * | 2002-03-15 | 2004-08-04 | 索尼公司 | Robot behavior control system, behavior control method, and robot device |
JP2005111637A (en) * | 2003-10-10 | 2005-04-28 | Ntt Data Corp | Network robot service system |
CN105751196A (en) * | 2016-04-12 | 2016-07-13 | 华南理工大学 | Operating method on basis of master-slave industrial robot collaboration |
CN106056207A (en) * | 2016-05-09 | 2016-10-26 | 武汉科技大学 | Natural language-based robot deep interacting and reasoning method and device |
US20160311115A1 (en) * | 2015-04-27 | 2016-10-27 | David M. Hill | Enhanced configuration and control of robots |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4588359B2 (en) * | 2004-05-07 | 2010-12-01 | 富士通株式会社 | Network robot function providing system and function providing method |
JP2007249801A (en) * | 2006-03-17 | 2007-09-27 | Nippon Telegr & Teleph Corp <Ntt> | Robot cooperation system |
JP2007245317A (en) * | 2006-03-17 | 2007-09-27 | Nippon Telegr & Teleph Corp <Ntt> | Robot controller, program, and robot control method |
JP6069607B2 (en) * | 2013-03-26 | 2017-02-01 | 株式会社国際電気通信基礎技術研究所 | Robot service linkage system and platform |
- 2017-01-11: JP application JP2017002408A (granted as JP6439806B2, active)
- 2017-09-30: CN application CN201710924391.XA (granted as CN108297092B, active)
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1518489A (en) * | 2002-03-15 | 2004-08-04 | 索尼公司 | Robot behavior control system, behavior control method, and robot device |
JP2003291083A (en) * | 2002-03-28 | 2003-10-14 | Toshiba Corp | Robot device, robot controlling method, and robot delivery system |
JP2005111637A (en) * | 2003-10-10 | 2005-04-28 | Ntt Data Corp | Network robot service system |
US20160311115A1 (en) * | 2015-04-27 | 2016-10-27 | David M. Hill | Enhanced configuration and control of robots |
CN105751196A (en) * | 2016-04-12 | 2016-07-13 | 华南理工大学 | Operating method on basis of master-slave industrial robot collaboration |
CN106056207A (en) * | 2016-05-09 | 2016-10-26 | 武汉科技大学 | Natural language-based robot deep interacting and reasoning method and device |
Also Published As
Publication number | Publication date |
---|---|
JP2018111154A (en) | 2018-07-19 |
CN108297092B (en) | 2022-09-20 |
JP6439806B2 (en) | 2018-12-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108297092A (en) | | Robot device and its control method |
US20180104816A1 (en) | | Robot device and non-transitory computer readable medium |
CN108604342B (en) | | NFC-based data transmission method and mobile device |
CN106395198B (en) | | Control method and device for intelligent garbage bin |
US20190105771A1 (en) | | Display control device, display control method, computer program product, and communication system |
CN108848308A (en) | | Image pickup method and mobile terminal |
CN108182626B (en) | | Service pushing method, information acquisition terminal and computer readable storage medium |
KR20160076264A (en) | | Mobile terminal and control method thereof |
CN104077095A (en) | | Information processing apparatus and storage medium |
CN107909399A (en) | | Available resource recommendation method and apparatus |
JP2019212183A (en) | | Information processing device and program |
CN108009288A (en) | | Recipe pushing method and device |
CN108541015A (en) | | Signal strength reminding method and mobile terminal |
CN107748785A (en) | | Wardrobe management method and mobile terminal |
CN111752450A (en) | | Display method, device and electronic equipment |
CN108881979A (en) | | Information processing method, device, mobile terminal and storage medium |
CN108920572A (en) | | Bus information processing method and mobile terminal |
CN108510267A (en) | | Account information acquisition method and mobile terminal |
US20210397695A1 (en) | | Information processing apparatus and non-transitory computer readable medium |
CN110059627A (en) | | Display control method and terminal |
US11223729B2 (en) | | Information processing apparatus and non-transitory computer readable medium for instructing an object to perform a specific function |
CN107957838A (en) | | Method, apparatus and computer-readable recording medium for multi-option application interface interaction |
JP6721023B2 (en) | | Information processing device and program |
CN109711477A (en) | | Training method and device for autonomous driving model |
CN110489064A (en) | | Information processing unit and information processing system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
CB02 | Change of applicant information | | Address after: Tokyo, Japan; Applicant after: Fujifilm Business Innovation Co., Ltd. Address before: Tokyo, Japan; Applicant before: Fuji Xerox Co., Ltd. |
GR01 | Patent grant | ||