WO2012008553A1 - Robot system (Système de robot) - Google Patents
Robot system
- Publication number
- WO2012008553A1 (PCT/JP2011/066164)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- action
- user
- behavior
- control information
- Prior art date
Links
Images
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/02—Hand grip control means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1689—Teleoperation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/004—Artificial life, i.e. computing arrangements simulating life
- G06N3/008—Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
Definitions
- the present invention relates to a robot system.
- In order to facilitate communication between a user at a remote location and the people at the site, it is important to use the line-of-sight movement of a robot that acts on the user's behalf at the site to convey to the people there what the remote user is looking at. To achieve this, the remote user must control the posture of the mechanism on which the robot's camera is mounted so that the robot's line of sight is aligned with the desired object at the site.
- A user can have a robot attend a remote conference hall and operate the head of the robot, on which a camera is mounted, using an operation terminal for operating the robot from the remote location. By moving the head, the user controls the direction of the camera at the remote site and acquires an image in the desired direction.
- Participants in the conference hall can tell from the direction of the robot's head what the user is looking at.
- A remote user can use the head operation unit of the operation terminal to operate the head of a robot equipped with a camera in the field, and can view the video captured by the head camera on the operation terminal.
- Materials are displayed on a display that all participants at the site can view, and the participants discuss while viewing that screen.
- The user at the remote location, however, may participate in the discussion not by looking at the on-site display through the operation terminal, but by referring to the same material shown on another display installed at the remote location.
- The operation device includes a behavior association storage means that stores behavior sensing information and robot control information in association with each other, and an action determination means that inputs a measurement value from a sensor measuring the user's behavior, retrieves from the storage means the behavior sensing information matching the measured value, and outputs to the robot a control command for taking the action described in the robot control information corresponding to the retrieved behavior sensing information.
- An operation program causes a computer to execute a storage process in which behavior sensing information and robot control information are associated with each other and stored in the behavior association storage means, and a determination process in which a measurement value is input from a sensor measuring the user's behavior, the behavior sensing information matching the measured value is retrieved from the storage means, and a control command for taking the action described by the corresponding robot control information is output to the robot.
- In the corresponding operation method, behavior sensing information and robot control information are associated with each other and stored in the behavior association storage means; a measurement value is input from a sensor measuring the user's behavior; the behavior sensing information matching the measured value is retrieved from the storage means; and a control command for taking the action described by the robot control information corresponding to the retrieved behavior sensing information is output to the robot.
- The robot system of the present invention makes it easy for a user to cause a robot to perform a predetermined operation through the user's own action.
- FIG. 1 is a diagram illustrating an example of a first embodiment of a robot system 90 according to the present invention.
- the robot system 90 according to the first embodiment includes an operation device 10 and a robot 20.
- The operation device 10 is typically installed at a place remote from the robot 20, but may also be installed nearby, for example in an adjacent room.
- In the following description, the operation device 10 is assumed to be installed at a place remote from the robot 20.
- the operating device 10 and the robot 20 are connected via a network.
- The network may be either wireless or wired. A wireless network is, for example, a wireless local area network (LAN) or a mobile phone network; a wired network is, for example, serial communication, parallel communication, or a LAN.
- The network may also be any other alternative communication method.
- The robot 20 includes a voice input unit 21, such as a microphone, and a video input unit 22, such as a camera.
- The voice input by the voice input unit 21 and the video input by the video input unit 22 are transmitted via the network to the voice presentation unit 11, such as a speaker, and the video presentation unit 12, such as a display device, of the operation device 10, and are presented to the user of the operation device 10 at the remote location.
- The operation device 10 includes a voice input unit 17.
- The voice input by the voice input unit 17 is transmitted via the network to the voice presentation unit 25 of the robot 20 and presented to the people at the site, for example in a conference room.
- the operation device 10 includes an operation input unit 13.
- When the user of the operation device 10 wants to move the head of the robot 20 or the like, the user operates the joystick 31 or the like of the operation input unit 13.
- In response, the action determination unit 16 transmits a control command via the network to the operation control unit 24 of the robot 20 to operate the motion mechanism 23.
- The operation device 10 includes a behavior recognition unit 14.
- the behavior recognition unit 14 is a sensor that is mounted on the user's head, for example, and measures the user's motion and posture.
- When the action determination unit 16 recognizes from the input measurement values that the remote user has performed a certain action, it generates behavior sensing information corresponding to that action.
- The action determination unit 16 is configured by hardware such as a logic circuit.
- Alternatively, the action determination unit 16 may be realized by a processor (not shown) of the operation device 10, which is also a computer, reading and executing a program stored in memory.
- the program is stored in, for example, a hard disk device connected to the operation device 10 and is read into the memory when the operation device 10 is initially set.
- FIG. 6 shows an example of data recorded in the action association DB 15.
- the action association DB 15 includes data that associates the actions of the user and the robot 20.
- The action determination unit 16 refers to this data and recognizes that the remote user has turned his or her line of sight to an object at the remote location (for example, the material A at hand). The unit then sets the line of sight of the robot 20 at the site (for example, a conference room) by pointing the camera in the direction of the object (for example, the document A displayed on the screen) that is associated with what the user is looking at on the remote side.
- The action association DB 15 is a table including one or more records (rows) that associate the behavior sensing information 53 with the robot control information 54. Each record of the action association DB 15 may include an ID 51 and an action recognition type 52.
- the behavior sensing information 53 is data describing a user's behavior (posture, motion, etc.).
- The behavior sensing information 53 is, for example, a rotation angle, measured by the behavior recognition unit 14 (angle sensor 41) mounted on the user's head, about one or more virtual rotation axes of the user's head, relative to a predetermined position or to the current position.
- the behavior sensing information 53 includes a single angle, a change from one angle within a predetermined time to another angle, a holding time of the same angle, and the like.
- the robot control information 54 is data describing the behavior of the robot 20.
- the robot control information 54 is, for example, a rotation angle from a predetermined position or a current position around one or more operation rotation axes of the head of the robot 20.
- the robot control information 54 includes a single angle, a change from one angle within a predetermined time to another angle, a holding time of the same angle, and the like.
- The action recognition type 52 stores text indicating the type of action. The data stored in the action association DB 15 may be stored in advance, before the robot system 90 is activated, or may be stored by the user using the operation device 10 as described later.
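- For illustration only, a record of the action association DB 15 could be represented as in the following minimal Python sketch; the types, field names, and sign conventions are assumptions, not part of the patent:

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    """Rotation angles in degrees about the virtual rotation axes of a head."""
    pitch: float  # up/down; positive = up (assumed convention)
    yaw: float    # left/right; positive = right (assumed convention)

@dataclass
class AssociationRecord:
    """One row of the action association DB 15 (cf. FIG. 6)."""
    record_id: int              # ID 51
    action_type: str            # action recognition type 52, e.g. "gaze"
    behavior_sensing: HeadPose  # behavior sensing information 53 (user side)
    robot_control: HeadPose     # robot control information 54 (robot side)
```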
- The action determination unit 16 inputs one or more measurement values from the behavior recognition unit 14 (angle sensor 41) and generates behavior sensing information based on the measurement values themselves, changes in the measurement values, the holding time of the same value, or the like.
- When the generated information matches a record, the action determination unit 16 controls the robot 20 so that it takes the action described by the robot control information 54 of the matched record.
- For example, a record may hold data such as "value of angle sensor 41: up 10 degrees, left 100 degrees" as the behavior sensing information 53 and "head up 20 degrees, right 90 degrees" as the robot control information 54.
- Depending on the capabilities of the behavior recognition unit 14, the action determination unit 16, the operation control unit 24, and the like, a record may instead hold time-sequenced data, such as "moved the head downward by 5 deg within 1 sec and then moved it upward by 5 deg" as the behavior sensing information 53 and "move the head 10 deg downward in 0.5 sec and then 10 deg upward in 0.5 sec" as the robot control information 54.
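- Building on the sketch above, the first example record and an approximate-matching helper (used again for S6 below) might look as follows; the 3-degree tolerance and the helper names are assumptions:

```python
# Example record from the text: user "up 10 degrees, left 100 degrees"
# is associated with robot "head up 20 degrees, right 90 degrees".
EXAMPLE_DB = [
    AssociationRecord(1, "gaze",
                      HeadPose(pitch=10, yaw=-100),
                      HeadPose(pitch=20, yaw=90)),
]

def matches(measured: HeadPose, stored: HeadPose, tol_deg: float = 3.0) -> bool:
    """Approximate match: each axis within an assumed tolerance."""
    return (abs(measured.pitch - stored.pitch) <= tol_deg
            and abs(measured.yaw - stored.yaw) <= tol_deg)

def determine_action(measured: HeadPose, db: list) -> HeadPose | None:
    """Return the robot control information 54 of the first matching record."""
    for rec in db:
        if matches(measured, rec.behavior_sensing):
            return rec.robot_control
    return None
```

- A time-sequenced record such as the second example would need a richer representation (a list of timed poses) and matching over a sliding time window; the static sketch above covers only single-posture records.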
- The voice presentation unit 11, the video presentation unit 12, the operation input unit 13, the behavior recognition unit 14, and the voice input unit 17 may be provided in another device outside the operation device 10.
- The behavior recognition unit 14 may instead monitor the user's behavior from the video of a camera installed near the user and detect the position of the head or hands, eyelid movement, mouth movement, and the like.
- the behavior sensing information 53 may include data such as pointing a finger at the object, moving the object, or using a function of the object.
- the robot control information 54 may describe the behavior of the robot 20 such as moving an arm or self-running. In this case, it is assumed that the robot 20 can take the described action.
- FIG. 2 is a view showing the appearance of the robot 20.
- In order to realize a dialogue between the remote user and the people on the robot 20 side, the robot 20 has a video input unit 22 that captures video on the robot 20 side and a voice input unit 21 that records audio, and may also be provided with a voice presentation unit 25, such as a speaker, that outputs the voice input on the remote side.
- The robot 20 includes a motion mechanism 23 for conveying the movement of the user at the remote location.
- The motion mechanism 23 is driven by actuators and rotates vertically about a pitch axis and horizontally about a yaw axis as its operation rotation axes.
- The motion mechanism 23 may also have an operation axis about the direction of the neck, i.e., a roll axis.
- Alternatively, the motion mechanism 23 may have an actuator for only one of the pitch axis, the yaw axis, and the roll axis.
- The motion mechanism 23 may also include actuators capable of translational motion in the front/back, up/down, and left/right directions.
- In addition to the head, the robot 20 may further include, as part of the motion mechanism 23, a moving mechanism that moves the robot 20 itself.
- the moving mechanism may be either a wheel mechanism that can move in all directions or a wheel mechanism that includes two parallel driving wheels.
- The video input unit 22 may calculate distance information using a plurality of cameras and present it to the user on the operation device 10 side. The video input unit 22 may also have a zoom mechanism and capture video at a zoom magnification chosen by the user on the operation device 10 side.
- the video input unit 22 may be equipped with a plurality of cameras having different shooting directions so that the user on the operation device 10 side can select a video.
- The video input unit 22 may be equipped with a camera that captures the entire periphery of the robot 20, a super wide-angle camera, or a fisheye camera.
- The voice input unit 21 is a monaural or stereo microphone. It may also measure the direction of sound using a plurality of microphones and present it to the user on the operation device 10 side.
- the robot 20 may include a line-of-sight display unit 28 that enables a person observing the robot 20 to determine the line of sight of the robot 20.
- The line-of-sight display unit 28 is, for example, a set of raised features representing the eyes and nose of the robot 20.
- FIG. 3 is a diagram illustrating an appearance of the operation device 10.
- the operation device 10 is a device for a remote user to remotely operate the robot 20.
- The operation device 10 includes an operation input unit 13 for inputting movements of the robot 20, a video presentation unit 12 that presents the video shot on the robot 20 side, a voice presentation unit 11 that presents the voice input on the robot 20 side, and a voice input unit 17 for inputting the voice of the remote user.
- A user who uses the operation device 10 interacts with the people on the robot 20 side (the site) through voice, video, and the movements of the robot 20.
- The operation input unit 13 includes, for example, a two-axis joystick 31 for controlling the up/down and left/right axes of the head of the robot 20.
- the operation input unit 13 includes a line-of-sight recording button 32 for recording the line of sight of the robot 20.
- The behavior recognition unit 14 is, for example, an angle sensor 41 that is fixed to the head of the remote user and measures the posture of the head.
- The operation input unit 13 may include a plurality of joysticks 31 for inputting rotation about the pitch, yaw, and roll axes of the head of the robot 20 and translational motion in the front/back, up/down, and left/right directions, in accordance with the operation rotation axes of the motion mechanism 23 of the head of the robot 20.
- FIG. 5 is a flowchart explaining the processing flow of the action determination unit 16.
- The action determination unit 16 controls the motion mechanism 23 so that, when the user at the remote location looks at an object there, the robot 20 looks at the corresponding object at the site.
- To do so, the action determination unit 16 executes a process of associating the behavior of the remote user with the behavior of the robot 20, and a process of controlling the behavior of the robot 20 based on the recognition result of the remote user's behavior.
- The action determination unit 16 first determines whether or not there is an operation input from the joystick 31 (S1).
- If there is, the action determination unit 16 outputs a control command instructing the corresponding action to the operation control unit 24 of the robot 20 in accordance with the input of the joystick 31 (S2).
- For example, when the joystick 31 is tilted upward, the action determination unit 16 transmits a control command for moving the head of the robot 20 upward.
- The action determination unit 16 may change the motion speed of the head of the robot 20 according to the tilt amount of the joystick 31, or may use the tilt amount of the joystick 31 as the motion angle of the head of the robot 20.
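- The two interpretations of the tilt amount might be sketched as follows (an illustrative sketch; the gain and limit values are assumed):

```python
MAX_SPEED_DEG_S = 30.0  # assumed maximum head speed (deg/s)
MAX_ANGLE_DEG = 60.0    # assumed head travel limit (deg)

def clamp(x: float) -> float:
    """Limit a normalized joystick tilt to [-1, 1]."""
    return max(-1.0, min(1.0, x))

def tilt_to_speed(tilt: float) -> float:
    """Rate control: joystick tilt sets the head's angular velocity (S2)."""
    return clamp(tilt) * MAX_SPEED_DEG_S

def tilt_to_angle(tilt: float) -> float:
    """Position control: joystick tilt sets the target head angle directly."""
    return clamp(tilt) * MAX_ANGLE_DEG
```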
- Next, the action determination unit 16 determines whether the line-of-sight recording button 32 has been pressed (S3). When the button is pressed (Y in S3), the action determination unit 16 generates behavior sensing information based on the measurement values input from the behavior recognition unit 14 at the time of the press, reads the posture data of the motion mechanism 23 of the robot 20 (S4), associates the two, and records them in the action association DB 15 (S5).
- The behavior sensing information is generated from the measurement value of the angle sensor 41 at the time the line-of-sight recording button 32 is pressed, from the change of the measurement value within a predetermined time (for example, several seconds) before that time, or from the holding time of the same measurement value.
- An example of how the line-of-sight recording button 32 is used is as follows. As a first step, the user operates the joystick 31 or the like so that the line-of-sight display unit 28 on the head of the robot 20 faces a television screen or the like (the second object) at the site. When the robot 20 includes the video input unit 22, the user can easily perform this operation while viewing the image captured by the video input unit 22 through the video presentation unit 12 of the operation device 10.
- The user at the remote location then presses the line-of-sight recording button 32 with his or her head facing a television or the like (the first object) installed near the user.
- As a result, the posture data of the robot 20 (as the robot control information 54) and the measurement values of the head angle sensor 41 and the like (as the behavior sensing information 53) are associated with each other and recorded in the action association DB 15.
- When the line-of-sight recording button 32 is not pressed, the action determination unit 16 inputs the current measurement values of the angle sensor 41 and the like and compares the behavior sensing information based on those values against the behavior sensing information 53 registered in the action association DB 15 (S6). The action determination unit 16 may require a complete match, or it may determine an approximate match, i.e., whether the current behavior sensing information obtained from the angle sensor 41 is within a predetermined distance of a value registered in the action association DB 15. Through this process, the action determination unit 16 determines that the user at the remote location has taken the action of looking at the first object there.
- On a match, the action determination unit 16 causes the robot 20 to take the action described by the robot control information 54 registered in the action association DB 15 in correspondence with the matched behavior sensing information 53.
- As a result, the robot 20 performs the action of looking at the second object at the site that is associated with the first object at the remote location viewed by the remote user.
- When an action association DB 15 in which the behavior sensing information 53 and the robot control information 54 are already stored is prepared in advance, the line-of-sight recording button 32 need not be provided.
- In that case, the action determination unit 16 need not implement the processing for when the line-of-sight recording button 32 is pressed (S3 to S5).
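- Collecting S1 through S6, the main loop of the action determination unit 16 might be sketched as follows, building on the helpers above. The I/O callables are assumed stand-ins, injected as parameters because the patent leaves the concrete device interfaces unspecified:

```python
def action_determination_loop(db, read_joystick, read_angle_sensor,
                              record_button_pressed, read_robot_posture,
                              send_control_command):
    """Main loop of the action determination unit 16 (cf. FIG. 5, S1-S6)."""
    while True:
        # S1/S2: a joystick input is forwarded directly as a control command.
        tilt = read_joystick()          # assume None while the stick is centered
        if tilt is not None:
            send_control_command(tilt_to_speed(tilt))
            continue

        measured = read_angle_sensor()  # current posture of the user's head

        # S3-S5: on a press of the line-of-sight recording button 32, pair
        # the user's current posture with the robot's current posture and
        # record the pair in the action association DB 15.
        if record_button_pressed():
            db.append(AssociationRecord(len(db) + 1, "gaze",
                                        measured, read_robot_posture()))
            continue

        # S6: if the measured posture matches registered behavior sensing
        # information 53, output the associated robot control information 54.
        control = determine_action(measured, db)
        if control is not None:
            send_control_command(control)
```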
- Note that the behavior of the robot 20 (the content described by the robot control information 54) need not be the same as the user's behavior (the content described by the behavior sensing information 53). That is, the direction in which the user's head moves (up/down, left/right, etc.) and the direction in which the head of the robot 20 moves may differ, and so may the movement angles.
- As described above, the robot system 90 of the present invention makes it easy for the user to cause the robot 20 to perform a predetermined operation through the user's own action. This is because, when the operation device 10 recognizes that the user has performed the behavior described by the behavior sensing information 53, it causes the robot 20 to take the behavior described by the robot control information 54.
- The user can operate the robot 20 simply by acting according to his or her purpose and desire.
- The robot system 90 of the present invention also facilitates communication between the people at the site, such as a conference room, and users in other places, such as remote locations. Consider, for example, a remote user who looks at a display showing the document A at hand without pointing the head of the robot 20 toward the on-site screen displaying the document A (the head keeps facing another direction). The people at the site are then not informed that the remote user is viewing the document A, and the discussion between the user and the people at the site cannot proceed smoothly.
- the robot system 90 of the present invention prevents such a situation from occurring.
- FIG. 7 is a diagram illustrating a configuration of the operation input unit 13 and the action recognition unit 14 of the second embodiment.
- The behavior recognition unit 14 of the second embodiment includes a camera 42, fixed to the head of the remote user so that it captures what the user faces, and a barcode recognition unit 43 that detects and recognizes a barcode in the image captured by the camera 42.
- The barcode recognition unit 43 outputs the reading result to the action determination unit 16.
- For example, if a barcode is printed on the material at hand, the behavior recognition unit 14 can recognize, by reading the barcode, the behavior of the remote user looking at that material. Likewise, if a sheet with a pre-printed barcode is attached to an object at the remote location, such as a television or a whiteboard, the action determination unit 16 can determine, by reading the barcode, the behavior of the remote user looking at that object. What is attached to the object need not be a barcode sheet; it may also be an infrared transmitter, a radio transmitter, a QR code, or the like.
- In that case, the behavior recognition unit 14 may be configured to recognize the corresponding kind of tag; for example, it may include an IC tag reader instead of the camera 42.
- Conversely, the robot 20 may have a camera mounted on its head and detect tags attached to objects at the site.
- The action determination unit 16 may then control the robot 20 so that the camera on the head of the robot 20 detects a predetermined tag. With this configuration, the action of the user or the robot 20 looking at a certain object is realized as an operation of reading the tag attached to that object.
- The action association DB 15 may accordingly store, as the robot control information 54, the value of the barcode or tag that the robot 20 is to read.
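- A rough sketch of such tag-based recognition, using OpenCV's QR code detector as a stand-in for the barcode recognition unit 43 (the patent names QR codes as one alternative); the payload-to-action mapping is an assumed example:

```python
import cv2

# Assumed mapping from tag payloads to robot control information 54,
# e.g. the tag value the robot 20 should in turn look at and read.
TAG_TO_ROBOT_ACTION = {
    "material_A": "screen_tag_A",    # user looked at material A at hand
    "whiteboard": "site_whiteboard",
}

detector = cv2.QRCodeDetector()

def recognize_user_action(frame) -> str | None:
    """Decode a tag seen by the head camera 42 and map it to a robot action."""
    payload, _points, _raw = detector.detectAndDecode(frame)
    return TAG_TO_ROBOT_ACTION.get(payload) if payload else None
```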
- The second embodiment thus expands the degrees of freedom of user operation and of the action settings of the robot 20. This is because the camera 42 or IC tag reader provided in the operation device 10 recognizes the user's behavior by reading a barcode or tag, and the camera or IC tag reader provided in the robot 20 likewise determines the behavior of the robot 20 by reading a barcode or tag.
- FIG. 8 is a diagram illustrating a configuration of the operation device 10 according to the third embodiment.
- The operation device 10 includes an action association DB 15 and an action determination unit 16.
- The action association DB 15 stores behavior sensing information 53 and robot control information 54 in association with each other.
- The action determination unit 16 inputs a measurement value from a sensor that measures the user's behavior, searches the action association DB 15 for the behavior sensing information 53 matching the measured value, and outputs to the robot 20 a control command for taking the action described by the robot control information 54 corresponding to the retrieved behavior sensing information 53.
- With this minimal configuration as well, the robot system 90 of the present invention makes it easy for the user to cause the robot 20 to perform a predetermined operation through the user's own action: when the operation device 10 recognizes that the user has performed the behavior described by the behavior sensing information 53, it causes the robot 20 to take the behavior described by the robot control information 54.
- Each unit of the robot system 90 described above may be realized by hardware such as a logic circuit, or may be realized by a computer and software executed on the computer.
- the present invention is not limited to the above embodiment.
- Various changes that can be understood by those skilled in the art can be made to the configuration and details of the present invention within the scope of the present invention.
- This application claims priority based on Japanese Patent Application No. 2010-160440 filed on July 15, 2010, the entire disclosure of which is incorporated herein.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Computational Linguistics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Health & Medical Sciences (AREA)
- Manipulator (AREA)
Abstract
The invention relates to a robot that a user causes to execute a predetermined action through the user's own action or the like.
An operation device comprises a behavior association storage means that associates and stores behavior sensing information with robot control information, and an action determination means that inputs measured values from a sensor which measures the user's behavior, searches the behavior association storage means for the behavior sensing information that matches the measured values, and transmits to the robot a control command for performing the action described in the robot control information that corresponds to the retrieved behavior sensing information.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-160440 | 2010-07-15 | ||
JP2010160440 | 2010-07-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012008553A1 (fr) | 2012-01-19 |
Family
ID=45469548
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/066164 WO2012008553A1 (fr) | 2010-07-15 | 2011-07-08 | Système de robot |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2012008553A1 (fr) |
- 2011-07-08: PCT/JP2011/066164 filed as WO2012008553A1 (active Application Filing)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005123959A (ja) * | 2003-10-17 | 2005-05-12 | Nippon Telegr & Teleph Corp <Ntt> | High-realism communication conference device |
JP2005269549A (ja) * | 2004-03-22 | 2005-09-29 | Nippon Telegr & Teleph Corp <Ntt> | Dummy head and high-realism communication device using the same |
JP2008254103A (ja) * | 2007-04-03 | 2008-10-23 | Sky Kk | Presenter motion reproduction robot, and control method and control program therefor |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20200090022A1 (en) * | 2016-12-22 | 2020-03-19 | Intel Corporation | Efficient transferring of human experiences to robots and other autonomous machines |
US11615284B2 (en) * | 2016-12-22 | 2023-03-28 | Intel Corporation | Efficient transferring of human experiences to robots and other autonomous machines |
CN109199240A (zh) * | 2018-07-24 | 2019-01-15 | 上海斐讯数据通信技术有限公司 | Gesture-control-based cleaning robot control method and system |
CN109199240B (zh) * | 2018-07-24 | 2023-10-20 | 深圳市云洁科技有限公司 | Gesture-control-based cleaning robot control method and system |
CN108858207A (zh) * | 2018-09-06 | 2018-11-23 | 顺德职业技术学院 | Remote-control-based multi-robot cooperative target search method and system |
CN114488879A (zh) * | 2021-12-30 | 2022-05-13 | 深圳鹏行智能研究有限公司 | Robot control method and robot |
CN114488879B (zh) * | 2021-12-30 | 2024-05-31 | 深圳鹏行智能研究有限公司 | Robot control method and robot |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11005982B2 (en) | Information processing terminal | |
- JP6770061B2 | Method and apparatus for playing back video content anytime, anywhere | |
US9906725B2 (en) | Portable video communication system | |
- KR102184272B1 | Glass-type terminal and control method thereof | |
- CN111541845B | Image processing method and apparatus, and electronic device | |
- JP6229314B2 | Information processing device, display control method, and program | |
US20170357481A1 (en) | Method and apparatus for controlling surveillance system with gesture and/or audio commands | |
- JP6452440B2 | Image display system, image display device, image display method, and program | |
US20130265448A1 (en) | Analyzing Human Gestural Commands | |
- WO2020238831A1 | Photographing method and terminal | |
US11625858B2 (en) | Video synthesis device, video synthesis method and recording medium | |
- JP2011152593A | Robot operation device | |
US11151804B2 (en) | Information processing device, information processing method, and program | |
- WO2015072166A1 | Imaging device, imaging assistance method, and recording medium on which an imaging assistance program is recorded | |
- CN107437063A | Apparatus and method for sensing environment, and non-transitory computer-readable medium | |
- JP2012175136A | Camera system and control method thereof | |
- JP2017536024A | Method and apparatus for controlling video images | |
- WO2012008553A1 | Robot system | |
- CN116472715A | Display device and camera tracking method | |
US20160182860A1 (en) | Methods for performing image capture and real-time image display on physically separated or separable devices and apparatus therefor | |
- KR101672268B1 | Exhibition space control system and exhibition space control method | |
US7986336B2 (en) | Image capture apparatus with indicator | |
- JP4960270B2 | Intercom device | |
- JP6548802B1 | Video distribution system for live distribution of video including animation of character objects generated based on actor movements | |
- JP2021002145A | Virtual space providing system, virtual space providing method, and program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11806883 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 11806883 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: JP |