US20200030986A1 - Robotic camera control via motion capture - Google Patents
- Publication number
- US20200030986A1 (application Ser. No. 16/588,972)
- Authority
- US
- United States
- Prior art keywords
- robot
- movements
- motion capture
- operator
- control signals
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1689—Teleoperation
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/0011—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement
- G05D1/0016—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement characterised by the operator's input device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G06K9/00342—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/251—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
-
- H04N5/23203—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/35—Nc in input of data, input till input file format
- G05B2219/35444—Gesture interface, controlled machine observes operator, executes commands
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
Definitions
- Embodiments of the present invention relate generally to robotics and, more specifically, to robotic camera control via motion capture.
- a camera operator controls the position, orientation, and motion of a camera in order to frame shots, capture sequences, and transition between camera angles, among other cinematographic procedures.
- operating a camera can be difficult and/or dangerous in certain situations.
- the camera operator may be required to assume an awkward position for an extended period of time while physically supporting heavy camera equipment.
- the camera operator may be required to risk bodily injury in order to attain the requisite proximity to the action being filmed.
- camera operators usually must perform very complex camera movements and oftentimes must repeat these motions across multiple takes.
- Various embodiments of the present invention set forth a computer-implemented method for controlling a robot, including generating motion capture data based on one or more movements of an operator, processing the motion capture data to generate control signals for controlling the robot, and transmitting the control signals to the robot to cause the robot to mimic the one or more movements of the operator.
- At least one advantage of the approach discussed herein is that a human camera operator need not be subjected to discomfort or bodily injury when filming movies.
- FIG. 1 illustrates a system configured to implement one or more aspects of the present invention
- FIG. 2 is a more detailed illustration of the control engine of FIG. 1 , according to various embodiments of the present invention
- FIG. 3 illustrates how the articulation of a robotic arm can be controlled via a motion capture setup, according to various embodiments of the present invention
- FIG. 4 illustrates how the position and orientation of a robotic drone can be controlled via a motion capture setup, according to various embodiments of the present invention.
- FIG. 5 is a flow diagram of method steps for translating motion capture data into control signals for controlling a robot, according to various embodiments of the present invention.
- FIG. 1 illustrates a system configured to implement one or more aspects of the present invention.
- system 100 includes a motion capture setup 110 , a computer 120 , and a robot 140 .
- Motion capture setup 110 is coupled to computer 120
- computer 120 is coupled to robot 140 .
- Motion capture setup 110 includes sensors 112 ( 0 ) and 112 ( 1 ), configured to capture motion associated with an operator 114 .
- Motion capture setup 110 may include any number of different sensors, although generally motion capture setup 110 includes at least two sensors 112 in order to capture binocular data.
- Motion capture setup 110 outputs motion capture data 150 to computer 120 for processing.
- Computer 120 includes a processor 122 , input/output (I/O) utilities 124 , and a memory 126 , coupled together.
- Processor 122 may be any technically feasible form of processing device configured to process data and execute program code.
- Processor 122 could be, for example, a central processing unit (CPU), a graphics processing unit (GPU), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), any technically feasible combination of such units, and so forth.
- I/O utilities 124 may include devices configured to receive input, including, for example, a keyboard, a mouse, and so forth.
- I/O utilities 124 may also include devices configured to provide output, including, for example, a display device, a speaker, and so forth.
- I/O utilities 124 may further include devices configured to both receive input and provide output, including, for example, a touchscreen, a universal serial bus (USB) port, and so forth.
- Memory 126 may be any technically feasible storage medium configured to store data and software applications. Memory 126 could be, for example, a hard disk, a random access memory (RAM) module, a read-only memory (ROM), and so forth. Memory 126 includes a control engine 128 and database 130 . Control engine 128 is a software application that, when executed by processor 122 , causes processor 122 to interact with robot 140 .
- Robot 140 includes actuators 142 coupled to a sensor array 144 .
- Actuators 142 may be any technically feasible type of mechanism configured to induce physical motion of any kind, including linear or rotational motors, hydraulic or pneumatic pumps, and so forth.
- actuators 142 include rotational motors configured to articulate a robotic arm of robot 140 .
- actuators 142 include rotational motors configured to drive a set of propellers that propel robot 140 through the air.
- Sensor array 144 may include any technically feasible collection of sensors.
- sensor array 144 could include an optical sensor, a sonic sensor, and/or other types of sensors configured to measure physical quantities.
- sensor array 144 is configured to record multimedia data.
- sensor array 144 includes a video camera configured to capture a frame 146 . By capturing a sequence of such frames, sensor array 144 may record a movie.
- motion capture setup 110 captures motion associated with operator 114 and then transmits motion capture signals 150 to control engine 128 .
- Control engine 128 processes motion capture signals 150 to generate control signals 152 .
- Control engine 128 transmits control signals 152 to robot 140 to control the motion of robot 140 via actuators 142 .
- the resultant motion of robot 140 mimics and/or is derived from the motion of operator 114 .
- Robot 140 captures sensor data 154 via sensor array 144 and transmits that sensor data to control engine 128 .
- Control engine 128 may process the received data for storage in database 130 .
- operator 114 may control robot 140 to capture video sequences without being required to physically and/or directly operate any camera equipment.
- One advantage of this approach is that operator 114 need not be subjected to a difficult and/or dangerous working environment when filming. Control engine 128 is described in greater detail below in conjunction with FIG. 2 .
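The end-to-end data flow just described (motion capture signals 150 in, control signals 152 out) can be sketched in a few lines. The function names and the simple positional mapping below are illustrative assumptions, not details taken from the patent:

```python
def capture_to_control(marker_positions, scale=1.0):
    """Translate captured marker positions (operator space) into robot
    target positions: the capture -> translate -> actuate pipeline."""
    return [(scale * x, scale * y, scale * z) for (x, y, z) in marker_positions]

def control_cycle(capture_fn, send_fn, scale=1.0):
    """One cycle of the loop: read motion capture signals, derive
    control signals, and send them toward the robot's actuators."""
    markers = capture_fn()                        # motion capture signals 150
    targets = capture_to_control(markers, scale)  # control signals 152
    send_fn(targets)                              # drive actuators 142
    return targets
```

Here `send_fn` stands in for whatever transport (cable or wireless) connects computer 120 to robot 140.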
- FIG. 2 is a more detailed illustration of the control engine of FIG. 1 , according to various embodiments of the present invention.
- control engine 128 includes motion capture analyzer 200 , dynamics extractor 210 , dynamics translator 220 , and multimedia capture module 230 .
- Motion capture analyzer 200 is configured to receive motion capture signals 150 from motion capture setup 110 and to process those signals to generate raw data 202 .
- Motion capture signals 150 generally include video data captured by sensors 112 .
- Motion capture analyzer 200 processes this video data to identify the position and orientation of some or all of operator 114 over a time period.
- operator 114 wears a suit bearing reflective markers that can be tracked by motion capture analyzer 200 .
- Motion capture analyzer 200 may be included in motion capture setup 110 in certain embodiments.
- Motion capture analyzer 200 generates raw data 202 , which includes a set of quantities describing the position and orientation of any tracked portions of operator 114 over the time period.
- Dynamics extractor 210 receives raw data 202 and then processes this data to generate processed data 212 . In doing so, dynamics extractor 210 models the dynamics of operator 114 over the time period based on raw data 202 . Dynamics extractor 210 may determine various rotations and/or translations of portions of operator 114 , including the articulation of the limbs of operator 114 , as well as the trajectory of any portion of operator 114 and/or any object associated with operator 114 . Dynamics extractor 210 provides processed data 212 to dynamics translator 220 .
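As an illustration of the kind of quantity a dynamics extractor might derive, the sketch below recovers a single joint angle from three tracked marker positions. The planar marker layout and function name are assumptions, not part of the patent:

```python
import math

def joint_angle(p_shoulder, p_elbow, p_wrist):
    """Articulation angle (radians) at the elbow, computed from the
    planar positions of three tracked markers."""
    # Vectors from the elbow toward the shoulder and wrist markers.
    ax, ay = p_shoulder[0] - p_elbow[0], p_shoulder[1] - p_elbow[1]
    bx, by = p_wrist[0] - p_elbow[0], p_wrist[1] - p_elbow[1]
    cos_theta = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
    # Clamp against floating-point drift before taking the arccosine.
    return math.acos(max(-1.0, min(1.0, cos_theta)))
```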
- Dynamics translator 220 is configured to translate the modeled dynamics included in processed data 212 into control signals 152 for controlling robot 140 to have dynamics derived from the modeled dynamics.
- Dynamics translator 220 may generate some control signals 152 that cause robot 140 to directly mimic the dynamics and motion of operator 114 , or may generate other control signals 152 to cause only a portion of robot 140 to mimic the dynamics and motion of operator 114 .
- dynamics translator 220 could generate control signals 152 that cause actuators 142 within a robotic arm of robot 140 to copy the articulation of joints within an arm of operator 114 .
- dynamics translator 220 could generate control signals 152 that cause actuators 142 to perform any technically feasible set of actuations in order to cause sensor array 144 to trace a similar path as an object held by operator 114 .
- dynamics translator 220 may also amplify the movements of operator 114 when generating control signals 152 , so that the dynamics of robot 140 represent an exaggerated version of the dynamics of operator 114 .
- Multimedia capture module 230 generally manages the operation of sensor array 144 and processes incoming sensor signals such as sensor signals 154 . Based on these signals, multimedia capture module 230 generates multimedia data 232 for storage in database 130 .
- Multimedia data 232 may include any technically feasible type of data, although in practice multimedia data 232 includes frames of video captured by sensor array 144 , and possibly frames of audio data as well.
- control engine 128 is configured to translate movements performed by operator 114 into movements performed by robot 140 .
- operator 114 can control robot 140 by performing a set of desired movements.
- This approach may be specifically applicable to filming a movie, where a camera operator may wish to control robot 140 to film movie sequences in a particular manner.
- persons skilled in the art will understand that the techniques disclosed herein for controlling a robot may be applicable to other fields beyond cinema.
- FIG. 3 illustrates how the articulation of a robotic arm can be controlled via a motion capture setup, according to various embodiments of the present invention.
- the scenario depicted in FIG. 3 is provided for exemplary purposes only to illustrate one possible robot and one possible set of movements.
- the techniques described herein are applicable to any technically feasible robot and any technically feasible set of movements.
- Motion capture setup 110 is generally configured to track the motion of operator 114 , and, specifically, to track the articulation of joints in arm 300 of operator 114 . In one embodiment, motion capture setup 110 relies on markers indicating the positions of joints of arm 300 . Motion capture setup 110 may also track the position and location of motion capture object 302 . Motion capture setup 110 transmits motion capture signals 150 to control engine 128 that represent any and all captured data.
- Control engine 128 then generates control signals 152 to cause robot 140 to move from position 310 (B) to position 312 (B).
- Positions 310 (A) and 310 (B) are generally analogous and have a similar articulation of joints.
- positions 312 (A) and 312 (B) are analogous and have a similar articulation of joints.
- any intermediate positions are generally analogous.
- Robot 140 may effect these analogous articulations by actuating actuators 142 ( 0 ), 142 ( 1 ), 142 ( 2 ), and 142 ( 3 ) to perform similar joint rotations as those performed by operator 114 with arm 300 . In this manner, robot 140 performs a movement that is substantially the same as the movement performed by operator 114 .
- the motion of robot 140 represents an exaggerated version of the motion of operator 114 .
- control engine 128 could scale the joint rotations associated with arm 300 by a preset amount when generating control signals 152 for actuators 142 or perform a smoothing operation to attenuate disturbances and/or other perturbations in the mimicked movements.
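A minimal sketch of the scaling-plus-smoothing step just mentioned, assuming a per-joint gain for the exaggerated motion and simple exponential smoothing to attenuate perturbations (the gain and smoothing constants are illustrative, not values from the patent):

```python
def exaggerate_and_smooth(angles, gain=1.5, alpha=0.3):
    """Scale a sequence of operator joint rotations by a preset gain
    and exponentially smooth the result to suppress jitter."""
    smoothed = []
    prev = None
    for a in angles:
        target = gain * a
        # First sample passes through; later samples blend with history.
        prev = target if prev is None else alpha * target + (1 - alpha) * prev
        smoothed.append(prev)
    return smoothed
```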
- sensor array 144 captures data that is transmitted to control engine 128 as sensor data 154 .
- This sensor data is gathered from a sequence of locations that operator 114 indicates by performing movements corresponding to those locations, as described.
- robot 140 need not implement analogous dynamics as operator 114 .
- robot 140 may perform any technically feasible combination of movements in order to cause sensor array 144 to trace a similar path. This approach is described in greater detail below in conjunction with FIG. 4 .
- FIG. 4 illustrates how the position of a robotic drone can be controlled via a motion capture setup, according to various embodiments of the present invention. Similar to FIG. 3 , the scenario depicted in FIG. 4 is provided for exemplary purposes only to illustrate one possible robot and one possible set of movements. The techniques described herein are applicable to any technically feasible robot and any technically feasible set of movements.
- operator 114 moves motion capture object 302 along trajectory 400 (A) from initial position 410 (A) to final position 412 (A). In doing so, operator 114 articulates arm 300 , in like fashion as described above in conjunction with FIG. 3 .
- motion capture setup 110 need not capture the specific articulation of arm 300 .
- motion capture setup 110 tracks the trajectory of motion capture object 302 to generate motion capture signals 150 .
- control engine 128 determines specific dynamics for robot 140 that cause sensor array 144 to traverse a trajectory that is analogous to trajectory 400 (A).
- Control engine 128 generates control signals 152 that represent these dynamics and then transmits control signals 152 to robot 140 for execution.
- robot 140 moves sensor array 144 along trajectory 400 (B) from position 410 (B) to position 412 (B).
- control engine 128 may generate different control signals 152 .
- robot 140 is a quadcopter drone. Accordingly, control engine 128 generates control signals 152 that modulate the thrust of one or more propellers to cause the drone to move sensor array 144 along trajectory 400 (B).
- robot 140 could include an arm with a number of joints, similar to robot 140 shown in FIG. 3 . In this case, control engine 128 could determine a set of joint articulations that would cause robot 140 to move sensor array 144 along trajectory 400 (B). Again, those articulations need not be similar to the articulations of arm 300 .
- operator 114 indicates, via motion capture object 302 , a trajectory along which sensor array 144 should travel.
- Control engine 128 then causes robot 140 to trace a substantially similar trajectory with sensor array 144 .
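One way to read the trajectory-mimicking behavior of FIG. 4 is as a change of reference frame: the object's captured path is re-anchored in the robot's workspace. The sketch below assumes a simple translation with uniform scaling; the patent does not specify the actual mapping, so all names here are hypothetical:

```python
def map_trajectory(points, origin_op, origin_robot, scale=1.0):
    """Map a captured object trajectory (operator space) onto an
    analogous trajectory for the robot's sensor array by re-anchoring
    it at the robot's origin and optionally scaling it."""
    ox, oy, oz = origin_op
    rx, ry, rz = origin_robot
    return [(rx + scale * (x - ox),
             ry + scale * (y - oy),
             rz + scale * (z - oz)) for (x, y, z) in points]
```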
- sensor array 144 captures sensor data 154 for transmission to control engine 128 .
- control engine 128 could instruct robot 140 to copy the motions of operator 114 under certain circumstances, and to copy the trajectory of motion capture object 302 under other circumstances.
- control engine 128 could be trained to interpret the motions of operator 114 according to a gesture-based language. For example, operator 114 could perform specific hand motions to indicate camera operations such as “zoom in,” “pan left,” and so forth.
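Such a gesture-based language could be as simple as a lookup from recognized gestures to camera operations. The patent names “zoom in” and “pan left” as example operations; the gesture names below are purely hypothetical:

```python
# Hypothetical gesture vocabulary; the patent does not define a mapping.
GESTURES = {
    "pinch_out": "zoom_in",
    "pinch_in": "zoom_out",
    "swipe_left": "pan_left",
    "swipe_right": "pan_right",
}

def interpret_gesture(gesture):
    """Resolve a recognized hand gesture to a camera operation,
    returning None for gestures outside the vocabulary."""
    return GESTURES.get(gesture)
```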
- control engine 128 records any and all movements performed by operator 114 during real-time control of robot 140 for later playback to robot 140 . In this manner, robot 140 can be made to repeat a set of movements across multiple takes without requiring operator 114 to repeat the associated movements.
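Record-and-replay of control signals might be sketched as follows; the class name and storage format are assumptions, since the patent does not describe an implementation:

```python
class TakeRecorder:
    """Record the control signals sent during a live take so the robot
    can replay the same movements on later takes without the operator
    repeating them."""

    def __init__(self):
        self.frames = []

    def record(self, control_signal):
        # Append one control signal to the stored take.
        self.frames.append(control_signal)
        return control_signal

    def replay(self):
        # Yield the stored signals in their original order.
        for frame in self.frames:
            yield frame
```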
- FIG. 5 is a flow diagram of method steps for translating motion capture data into control signals for controlling a robot, according to various embodiments of the present invention. Although the method steps are described in conjunction with the systems of FIGS. 1-4 , persons skilled in the art will understand that any system configured to perform the method steps, in any order, is within the scope of the present invention.
- a method 500 begins at step 502 , where motion capture setup 110 captures motion capture data associated with operator 114 .
- Motion capture setup 110 may implement computer vision techniques to track head, body, and limb movements of operator 114 , or rely on marker tracking techniques to track the movements of operator 114 .
- Motion capture setup 110 transmits motion capture signals 150 to control engine 128 for processing.
- control engine 128 processes motion capture signals 150 to determine an articulation sequence to be performed by robot 140 and/or an end effector path for robot 140 to follow.
- An example of an articulation sequence is discussed above in conjunction with FIG. 3 .
- An example of an end effector path is described above in conjunction with FIG. 4 .
- control engine 128 generates control signals 152 based on the determined articulation sequence and/or end effector path.
- Control signals 152 may vary depending on the type of robot 140 implemented. For example, to control a robotic arm of a robot, control signals 152 could include joint position commands. Alternatively, to control a robotic drone, control signals 152 could include motor speed commands for modulating thrust produced by a propeller.
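The robot-type-dependent command shapes described in this step can be expressed as a small dispatch function; the dictionary command formats below are illustrative assumptions, not formats from the patent:

```python
def make_control_signals(robot_type, targets):
    """Build robot-specific control signals: joint position commands
    for a robotic arm, motor speed commands for a drone."""
    if robot_type == "arm":
        return [{"joint": i, "position": t} for i, t in enumerate(targets)]
    if robot_type == "drone":
        return [{"motor": i, "speed": t} for i, t in enumerate(targets)]
    raise ValueError(f"unknown robot type: {robot_type}")
```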
- control engine 128 transmits control signals 152 to actuators 142 of robot 140 .
- Control engine 128 may transmit control signals 152 via one or more physical cables coupled to robot 140 or transmit control signals 152 wirelessly.
- control engine 128 captures multimedia data from sensor array 144 coupled to robot 140 .
- Sensor array 144 may include any technically feasible type of sensor configured to measure physical quantities, including an optical sensor, sonic sensor, vibration sensor, and so forth. In practice, sensor array 144 generally includes a video camera and potentially an audio capture device as well.
- Multimedia capture module 230 processes the sensor data from sensor array 144 to generate the multimedia data.
- motion capture setup 110 and control engine 128 may operate in conjunction with one another to control robot 140 for any technically feasible purpose, beyond filming movies.
- a motion capture setup records the movements of an operator, and a control engine then translates those movements into control signals for controlling a robot.
- the control engine may directly translate the operator movements into analogous movements to be performed by the robot, or the control engine may compute robot dynamics that cause a portion of the robot to mimic a corresponding portion of the operator.
- At least one advantage of the techniques described herein is that a human camera operator need not be subjected to discomfort or bodily injury when filming movies. Instead of personally holding camera equipment and being physically present for filming, the camera operator can simply operate a robot, potentially from a remote location, that, in turn, operates camera equipment.
- aspects of the present embodiments may be embodied as a system, method or computer program product. Accordingly, aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “module” or “system.” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
- the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
- a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
- a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
- each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
Description
- This application is a continuation of the co-pending U.S. patent application titled, “ROBOTIC CAMERA CONTROL VIA MOTION CAPTURE,” filed on July 21, 2016 and having Ser. No. 15/216,583. The subject matter of this related application is hereby incorporated herein by reference.
- As the foregoing illustrates, what is needed in the art are more effective approaches to filming movie sequences.
- So that the manner in which the above recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.
-
FIG. 1 illustrates a system configured to implement one or more aspects of the present invention; -
FIG. 2 is a more detailed illustration of the control engine ofFIG. 1 , according to various embodiments of the present invention; -
FIG. 3 illustrates how the articulation of a robotic arm can be controlled via a motion capture setup, according to various embodiments of the present invention; -
FIG. 4 illustrates how the position and orientation of a robotic drone can be controlled via a motion capture setup, according to various embodiments of the present invention; and -
FIG. 5 is a flow diagram of method steps for translating motion capture data into control signals for controlling a robot, according to various embodiments of the present invention. - In the following description, numerous specific details are set forth to provide a more thorough understanding of the present invention. However, it will be apparent to one of skill in the art that the present invention may be practiced without one or more of these specific details.
-
FIG. 1 illustrates a system configured to implement one or more aspects of the present invention. As shown, system 100 includes a motion capture setup 110, a computer 120, and a robot 140. Motion capture setup 110 is coupled to computer 120, and computer 120 is coupled to robot 140. -
Motion capture setup 110 includes sensors 112(0) and 112(1), configured to capture motion associated with an operator 114. Motion capture setup 110 may include any number of different sensors, although generally motion capture setup 110 includes at least two sensors 112 in order to capture binocular data. Motion capture setup 110 outputs motion capture data 150 to computer 120 for processing. -
Computer 120 includes a processor 122, input/output (I/O) utilities 124, and a memory 126, coupled together. Processor 122 may be any technically feasible form of processing device configured to process data and execute program code. Processor 122 could be, for example, a central processing unit (CPU), a graphics processing unit (GPU), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), any technically feasible combination of such units, and so forth. I/O utilities 124 may include devices configured to receive input, including, for example, a keyboard, a mouse, and so forth. I/O utilities 124 may also include devices configured to provide output, including, for example, a display device, a speaker, and so forth. I/O utilities 124 may further include devices configured to both receive and provide input and output, respectively, including, for example, a touchscreen, a universal serial bus (USB) port, and so forth. - Memory 126 may be any technically feasible storage medium configured to store data and software applications.
Memory 126 could be, for example, a hard disk, a random access memory (RAM) module, a read-only memory (ROM), and so forth. Memory 126 includes a control engine 128 and a database 130. Control engine 128 is a software application that, when executed by processor 122, causes processor 122 to interact with robot 140. - Robot 140 includes
actuators 142 coupled to a sensor array 144. Actuators 142 may be any technically feasible type of mechanism configured to induce physical motion of any kind, including linear or rotational motors, hydraulic or pneumatic pumps, and so forth. In one embodiment (described by way of example in conjunction with FIG. 3), actuators 142 include rotational motors configured to articulate a robotic arm of robot 140. In another embodiment (described by way of example in conjunction with FIG. 4), actuators 142 include rotational motors configured to drive a set of propellers that propel robot 140 through the air. -
Sensor array 144 may include any technically feasible collection of sensors. For example, sensor array 144 could include an optical sensor, a sonic sensor, and/or other types of sensors configured to measure physical quantities. Generally, sensor array 144 is configured to record multimedia data. In practice, sensor array 144 includes a video camera configured to capture a frame 146. By capturing a sequence of such frames, sensor array 144 may record a movie. - In operation,
motion capture setup 110 captures motion associated with operator 114 and then transmits motion capture signals 150 to control engine 128. Control engine 128 processes motion capture signals 150 to generate control signals 152. Control engine 128 transmits control signals 152 to robot 140 to control the motion of robot 140 via actuators 142. In general, the resultant motion of robot 140 mimics and/or is derived from the motion of operator 114. Robot 140 receives sensor data 154 via sensor array 144 and transmits that sensor data to control engine 128. Control engine 128 may process the received data for storage in database 130. - In this manner,
operator 114 may control robot 140 to capture video sequences without being required to physically and/or directly operate any camera equipment. One advantage of this approach is that operator 114 need not be subjected to a difficult and/or dangerous working environment when filming. Control engine 128 is described in greater detail below in conjunction with FIG. 2. -
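The capture-process-transmit loop described above can be sketched in a few lines. All names here are illustrative assumptions, as is the centroid-based pose estimate; the patent does not specify an API or algorithm.

```python
# Hypothetical sketch of the data flow: raw marker positions are reduced to a
# pose estimate, which is mapped to robot control setpoints.

def process_motion_capture(marker_positions):
    """Reduce raw marker positions (x, y, z) to a compact pose estimate
    (here, simply the centroid of all markers)."""
    n = len(marker_positions)
    cx = sum(p[0] for p in marker_positions) / n
    cy = sum(p[1] for p in marker_positions) / n
    cz = sum(p[2] for p in marker_positions) / n
    return (cx, cy, cz)

def generate_control_signals(pose, gain=1.0):
    """Map the operator pose to robot setpoints, optionally scaled by a gain."""
    return tuple(gain * c for c in pose)

markers = [(0.0, 1.0, 2.0), (2.0, 1.0, 0.0)]
pose = process_motion_capture(markers)              # (1.0, 1.0, 1.0)
signals = generate_control_signals(pose, gain=2.0)  # (2.0, 2.0, 2.0)
```

In a real system, the setpoints would then be transmitted to the robot's actuators each capture frame.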
FIG. 2 is a more detailed illustration of the control engine of FIG. 1, according to various embodiments of the present invention. As shown, control engine 128 includes motion capture analyzer 200, dynamics extractor 210, dynamics translator 220, and multimedia capture module 230. -
Motion capture analyzer 200 is configured to receive motion capture signals 150 from motion capture setup 110 and to process those signals to generate raw data 202. Motion capture signals 150 generally include video data captured by sensors 112. Motion capture analyzer 200 processes this video data to identify the position and orientation of some or all of operator 114 over a time period. In one embodiment, operator 114 wears a suit bearing reflective markers that can be tracked by motion capture analyzer 200. Motion capture analyzer 200 may be included in motion capture setup 110 in certain embodiments. Motion capture analyzer 200 generates raw data 202, which includes a set of quantities describing the position and orientation of any tracked portions of operator 114 over the time period. -
Dynamics extractor 210 receives raw data 202 and then processes this data to generate processed data 212. In doing so, dynamics extractor 210 models the dynamics of operator 114 over the time period based on raw data 202. Dynamics extractor 210 may determine various rotations and/or translations of portions of operator 114, including the articulation of the limbs of operator 114, as well as the trajectory of any portion of operator 114 and/or any object associated with operator 114. Dynamics extractor 210 provides processed data 212 to dynamics translator 220. -
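One simple way a dynamics extractor might model movement over a time period is by finite-differencing time-stamped position samples to obtain velocities. This particular method is an assumption for illustration; the patent does not prescribe how the dynamics are computed.

```python
# Hypothetical sketch: extract per-axis velocities from time-stamped position
# samples, one possible building block of a dynamics model.

def finite_difference_velocities(samples):
    """samples: list of (t, (x, y, z)) tuples in time order.
    Returns one velocity vector per interval between consecutive samples."""
    velocities = []
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        dt = t1 - t0
        velocities.append(tuple((b - a) / dt for a, b in zip(p0, p1)))
    return velocities

samples = [(0.0, (0.0, 0.0, 0.0)),
           (0.5, (1.0, 0.0, 0.0)),
           (1.0, (1.0, 1.0, 0.0))]
vels = finite_difference_velocities(samples)
# vels[0] == (2.0, 0.0, 0.0): 1 m of x travel over 0.5 s
```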
Dynamics translator 220 is configured to translate the modeled dynamics included in processed data 212 into control signals 152 for controlling robot 140 to have dynamics derived from the modeled dynamics. Dynamics translator 220 may generate some control signals 152 that cause robot 140 to directly mimic the dynamics and motion of operator 114, or may generate other control signals 152 to cause only a portion of robot 140 to mimic the dynamics and motion of operator 114. - For example,
dynamics translator 220 could generate control signals 152 that cause actuators 142 within a robotic arm of robot 140 to copy the articulation of joints within an arm of operator 114. This particular example is described in greater detail below in conjunction with FIG. 3. Alternatively, dynamics translator 220 could generate control signals 152 that cause actuators 142 to perform any technically feasible set of actuations in order to cause sensor array 144 to trace a similar path as an object held by operator 114. This example is described in greater detail below in conjunction with FIG. 4. In one embodiment, dynamics translator 220 may also amplify the movements of operator 114 when generating control signals 152, so that the dynamics of robot 140 represent an exaggerated version of the dynamics of operator 114. - In response to control
signals 152, actuators 142 within robot 140 actuate and move sensor array 144 (and potentially robot 140 as a whole). Sensor array 144 captures sensor data 154 and transmits this data to multimedia capture module 230 within control engine 128. Multimedia capture module 230 generally manages the operation of sensor array 144 and processes incoming sensor signals such as sensor signals 154. Based on these signals, multimedia capture module 230 generates multimedia data 232 for storage in database 130. Multimedia data 232 may include any technically feasible type of data, although in practice multimedia data 232 includes frames of video captured by sensor array 144, and possibly frames of audio data as well. - As a general matter,
control engine 128 is configured to translate movements performed by operator 114 into movements performed by robot 140. In this manner, operator 114 can control robot 140 by performing a set of desired movements. This approach may be specifically applicable to filming a movie, where a camera operator may wish to control robot 140 to film movie sequences in a particular manner. However, persons skilled in the art will understand that the techniques disclosed herein for controlling a robot may be applicable to other fields beyond cinema. -
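The direct joint-mimicry translation described in this section, with the optional amplification and smoothing, can be sketched as below. Joint names, gains, the clamping to joint limits, and the choice of an exponential moving average filter are all illustrative assumptions; the patent names the operations but not an implementation.

```python
import math

# Hypothetical sketch: operator joint rotations become robot joint commands,
# optionally amplified (for exaggerated motion) and smoothed (to attenuate
# perturbations in the mimicked movements).

def translate_joint_angles(operator_angles, amplify=1.0, limits=(-math.pi, math.pi)):
    """Scale operator joint angles (radians) by `amplify` and clamp each
    command to the robot's joint limits."""
    lo, hi = limits
    return {joint: max(lo, min(hi, amplify * angle))
            for joint, angle in operator_angles.items()}

def smooth(signal, alpha=0.5):
    """Exponential moving average over a sequence of scalar joint commands
    (0 < alpha <= 1; smaller alpha means heavier smoothing)."""
    out = [signal[0]]
    for x in signal[1:]:
        out.append(alpha * x + (1.0 - alpha) * out[-1])
    return out

commands = translate_joint_angles({"shoulder": 0.5, "elbow": 1.0}, amplify=2.0)
# commands == {"shoulder": 1.0, "elbow": 2.0}
smoothed = smooth([0.0, 1.0, 0.0, 1.0])  # [0.0, 0.5, 0.25, 0.625]
```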
FIG. 3 illustrates how the articulation of a robotic arm can be controlled via a motion capture setup, according to various embodiments of the present invention. The scenario depicted in FIG. 3 is provided for exemplary purposes only to illustrate one possible robot and one possible set of movements. The techniques described herein are applicable to any technically feasible robot and any technically feasible set of movements. - As shown,
operator 114 moves arm 300 from a position 310(A) upwards to assume a position 312(A). Operator 114 may optionally hold a motion capture object 302. Motion capture setup 110 is generally configured to track the motion of operator 114, and, specifically, to track the articulation of joints in arm 300 of operator 114. In one embodiment, motion capture setup 110 relies on markers indicating the positions of joints of arm 300. Motion capture setup 110 may also track the position and orientation of motion capture object 302. Motion capture setup 110 transmits motion capture signals 150 to control engine 128 that represent any and all captured data. -
Control engine 128 then generates control signals 152 to cause robot 140 to move from position 310(B) to position 312(B). Positions 310(A) and 310(B) are generally analogous and have a similar articulation of joints. Likewise, positions 312(A) and 312(B) are analogous and have a similar articulation of joints. Additionally, any intermediate positions are generally analogous. Robot 140 may effect these analogous articulations by actuating actuators 142(0), 142(1), 142(2), and 142(3) to perform similar joint rotations as those performed by operator 114 with arm 300. In this manner, robot 140 performs a movement that is substantially the same as the movement performed by operator 114. In some embodiments, the motion of robot 140 represents an exaggerated version of the motion of operator 114. For example, control engine 128 could scale the joint rotations associated with arm 300 by a preset amount when generating control signals 152 for actuators 142 or perform a smoothing operation to attenuate disturbances and/or other perturbations in the mimicked movements. - During motion,
sensor array 144 captures data that is transmitted to control engine 128 as sensor data 154. This sensor data is gathered from a sequence of locations that operator 114 indicates by performing movements corresponding to those locations, as described. In some cases, though, robot 140 need not implement dynamics analogous to those of operator 114. In particular, when operator 114 moves motion capture object 302 along a certain path, robot 140 may perform any technically feasible combination of movements in order to cause sensor array 144 to trace a similar path. This approach is described in greater detail below in conjunction with FIG. 4. -
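One minimal way a control engine might cause a sensor array to trace an operator-indicated path is a proportional position controller that repeatedly steps toward the next trajectory waypoint. The control law and gain below are assumptions for illustration; the patent leaves the choice of dynamics open.

```python
# Hypothetical sketch: each control step moves the sensor array a fraction
# kp of the remaining per-axis error toward the target waypoint.

def step_toward(position, target, kp=0.5):
    """One proportional control step toward `target` (both (x, y, z) tuples)."""
    return tuple(p + kp * (t - p) for p, t in zip(position, target))

pos = (0.0, 0.0, 0.0)
target = (4.0, 0.0, 2.0)
for _ in range(3):
    pos = step_toward(pos, target)
# pos converges toward target: (3.5, 0.0, 1.75) after three steps
```

A real controller would also account for the robot's kinematics; for a drone, these position steps would be realized by modulating propeller thrust.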
FIG. 4 illustrates how the position of a robotic drone can be controlled via a motion capture setup, according to various embodiments of the present invention. Similar to FIG. 3, the scenario depicted in FIG. 4 is provided for exemplary purposes only to illustrate one possible robot and one possible set of movements. The techniques described herein are applicable to any technically feasible robot and any technically feasible set of movements. - In
FIG. 4, operator 114 moves motion capture object 302 along trajectory 400(A) from initial position 410(A) to final position 412(A). In doing so, operator 114 articulates arm 300, in like fashion as described above in conjunction with FIG. 3. However, in the exemplary scenario shown in FIG. 4, motion capture setup 110 need not capture the specific articulation of arm 300. - Instead,
motion capture setup 110 tracks the trajectory of motion capture object 302 to generate motion capture signals 150. Based on motion capture signals 150, control engine 128 determines specific dynamics for robot 140 that cause sensor array 144 to traverse a trajectory that is analogous to trajectory 400(A). Control engine 128 generates control signals 152 that represent these dynamics and then transmits control signals 152 to robot 140 for execution. In response, robot 140 moves sensor array 144 along trajectory 400(B) from position 410(B) to position 412(B). - Depending on the type of robot implemented,
control engine 128 may generate different control signals 152. In the example shown, robot 140 is a quadcopter drone. Accordingly, control engine 128 generates control signals 152 that modulate the thrust of one or more propellers to cause the drone to move sensor array 144 along trajectory 400(B). In another example, robot 140 could include an arm with a number of joints, similar to robot 140 shown in FIG. 3. In this case, control engine 128 could determine a set of joint articulations that would cause robot 140 to move sensor array 144 along trajectory 400(B). Again, those articulations need not be similar to the articulations of arm 300. - With the approach described herein,
operator 114 indicates, via motion capture object 302, a trajectory along which sensor array 144 should travel. Control engine 128 then causes robot 140 to trace a substantially similar trajectory with sensor array 144. During motion, sensor array 144 captures sensor data 154 for transmission to control engine 128. - Referring generally to
FIGS. 3-4, persons skilled in the art will understand that the techniques described above represent just two exemplary approaches for generating robot control signals based on motion capture data. Other approaches also fall within the scope of the invention. For example, the techniques described in conjunction with FIGS. 3 and 4 could be combined. According to this combined technique, control engine 128 could instruct robot 140 to copy the motions of operator 114 under certain circumstances, and to copy the trajectory of motion capture object 302 under other circumstances. Alternatively, control engine 128 could be trained to interpret the motions of operator 114 according to a gesture-based language. For example, operator 114 could perform specific hand motions to indicate camera operations such as "zoom in," "pan left," and so forth. With this approach, operator 114 may exercise fine-grained control over both the motions of robot 140 and more specific cinematographic operations. In one embodiment, control engine 128 records any and all movements performed by operator 114 during real-time control of robot 140 for later playback to robot 140. In this manner, robot 140 can be made to repeat a set of movements across multiple takes without requiring operator 114 to repeat the associated movements. -
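The record-and-replay idea above, where control signals from a live take are stored so the robot can repeat the movement on later takes without the operator, can be sketched as follows. The class and method names are illustrative assumptions, not from the patent.

```python
# Hypothetical sketch: log the time-ordered control signals for each take,
# then replay them so the robot repeats the movement exactly.

class TakeRecorder:
    def __init__(self):
        self.takes = {}

    def record(self, take_name, control_signals):
        """Store the time-ordered control signals for one take."""
        self.takes[take_name] = list(control_signals)

    def playback(self, take_name):
        """Yield the stored signals in order for re-execution by the robot."""
        yield from self.takes[take_name]

rec = TakeRecorder()
rec.record("take_1", [(0.1, 0.2), (0.3, 0.4)])
replayed = list(rec.playback("take_1"))  # [(0.1, 0.2), (0.3, 0.4)]
```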
FIG. 5 is a flow diagram of method steps for translating motion capture data into control signals for controlling a robot, according to various embodiments of the present invention. Although the method steps are described in conjunction with the systems of FIGS. 1-4, persons skilled in the art will understand that any system configured to perform the method steps, in any order, is within the scope of the present invention. - As shown, a method 500 begins at
step 502, where motion capture setup 110 receives motion capture data associated with operator 114. Motion capture setup 110 may implement computer vision techniques to track head, body, and limb movements of operator 114, or rely on marker tracking techniques to track the movements of operator 114. Motion capture setup 110 transmits motion capture signals 150 to control engine 128 for processing. - At
step 504, control engine 128 processes motion capture signals 150 to determine an articulation sequence to be performed by robot 140 and/or an end effector path for robot 140 to follow. An example of an articulation sequence is discussed above in conjunction with FIG. 3. An example of an end effector path is described above in conjunction with FIG. 4. - At
step 506, control engine 128 generates control signals 152 based on the determined articulation sequence and/or end effector path. Control signals 152 may vary depending on the type of robot 140 implemented. For example, to control a robotic arm of a robot, control signals 152 could include joint position commands. Alternatively, to control a robotic drone, control signals 152 could include motor speed commands for modulating thrust produced by a propeller. - At
step 508, control engine 128 transmits control signals 152 to actuators 142 of robot 140. Control engine 128 may transmit control signals 152 via one or more physical cables coupled to robot 140 or transmit control signals 152 wirelessly. - At
step 510, control engine 128 captures multimedia data from sensor array 144 coupled to robot 140. Sensor array 144 may include any technically feasible type of sensor configured to measure physical quantities, including an optical sensor, sonic sensor, vibration sensor, and so forth. In practice, sensor array 144 generally includes a video camera and potentially an audio capture device as well. Multimedia capture module 230 processes the sensor data from sensor array 144 to generate the multimedia data. - Persons skilled in the art will understand that although the techniques described herein have been described relative to various cinematographic operations, the present techniques also apply to robot control in general. For example,
motion capture setup 110 and control engine 128 may operate in conjunction with one another to control robot 140 for any technically feasible purpose, beyond filming movies. - In sum, a motion capture setup records the movements of an operator, and a control engine then translates those movements into control signals for controlling a robot. The control engine may directly translate the operator movements into analogous movements to be performed by the robot, or the control engine may compute robot dynamics that cause a portion of the robot to mimic a corresponding portion of the operator.
- At least one advantage of the techniques described herein is that a human camera operator need not be subjected to discomfort or bodily injury when filming movies. Instead of personally holding camera equipment and being physically present for filming, the camera operator can simply operate a robot, potentially from a remote location, that, in turn, operates camera equipment.
- The descriptions of the various embodiments have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments.
- Aspects of the present embodiments may be embodied as a system, method or computer program product. Accordingly, aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “module” or “system.” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
- Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
- Aspects of the present disclosure are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, enable the implementation of the functions/acts specified in the flowchart and/or block diagram block or blocks. Such processors may be, without limitation, general purpose processors, special-purpose processors, application-specific processors, or field-programmable processors or gate arrays.
- The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
- While the preceding is directed to embodiments of the present disclosure, other and further embodiments of the disclosure may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/588,972 US20200030986A1 (en) | 2016-07-21 | 2019-09-30 | Robotic camera control via motion capture |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/216,583 US10427305B2 (en) | 2016-07-21 | 2016-07-21 | Robotic camera control via motion capture |
US16/588,972 US20200030986A1 (en) | 2016-07-21 | 2019-09-30 | Robotic camera control via motion capture |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/216,583 Continuation US10427305B2 (en) | 2016-07-21 | 2016-07-21 | Robotic camera control via motion capture |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200030986A1 true US20200030986A1 (en) | 2020-01-30 |
Family
ID=59521648
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/216,583 Active 2037-01-10 US10427305B2 (en) | 2016-07-21 | 2016-07-21 | Robotic camera control via motion capture |
US16/588,972 Abandoned US20200030986A1 (en) | 2016-07-21 | 2019-09-30 | Robotic camera control via motion capture |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/216,583 Active 2037-01-10 US10427305B2 (en) | 2016-07-21 | 2016-07-21 | Robotic camera control via motion capture |
Country Status (4)
Country | Link |
---|---|
US (2) | US10427305B2 (en) |
EP (1) | EP3488308A1 (en) |
JP (3) | JP2019523145A (en) |
WO (1) | WO2018017859A1 (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10279470B2 (en) * | 2014-06-12 | 2019-05-07 | Play-i, Inc. | System and method for facilitating program sharing |
US10427305B2 (en) * | 2016-07-21 | 2019-10-01 | Autodesk, Inc. | Robotic camera control via motion capture |
KR102640420B1 (en) * | 2016-12-22 | 2024-02-26 | 삼성전자주식회사 | Operation Method for activation of Home robot device and Home robot device supporting the same |
CN109521927B (en) * | 2017-09-20 | 2022-07-01 | 阿里巴巴集团控股有限公司 | Robot interaction method and equipment |
CN111590567B (en) * | 2020-05-12 | 2021-12-07 | 北京控制工程研究所 | Space manipulator teleoperation planning method based on Omega handle |
EP4173773A4 (en) * | 2020-06-25 | 2024-03-27 | Hitachi High Tech Corp | Robot teaching device and method for teaching work |
US11794342B2 (en) * | 2020-07-17 | 2023-10-24 | Intrinsic Innovation Llc | Robot planning using unmanned aerial vehicles |
CN114454174B (en) * | 2022-03-08 | 2022-10-04 | 江南大学 | Mechanical arm motion capturing method, medium, electronic device and system |
Citations (78)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US388760A (en) * | 1888-08-28 | Machine | ||
US719583A (en) * | 1901-04-11 | 1903-02-03 | Adolph Rosenthal | Foot-gear for use in swimming. |
CN2124815U (en) * | 1992-07-08 | 1992-12-16 | 台湾恒基股份有限公司 | Net printer |
US5841258A (en) * | 1997-01-31 | 1998-11-24 | Honda Giken Kogyo Kabushiki Kaisha | Remote control system for legged moving robot |
US6115639A (en) * | 1996-12-24 | 2000-09-05 | Honda Giken Kogyo Kabushiki Kaisha | Remote control system for legged moving robot |
US6285380B1 (en) * | 1994-08-02 | 2001-09-04 | New York University | Method and system for scripting interactive animated actors |
US6347261B1 (en) * | 1999-08-04 | 2002-02-12 | Yamaha Hatsudoki Kabushiki Kaisha | User-machine interface system for enhanced interaction |
US6353773B1 (en) * | 1997-04-21 | 2002-03-05 | Honda Giken Kogyo Kabushiki Kaissha | Remote control system for biped locomotion robot |
US6425865B1 (en) * | 1998-06-12 | 2002-07-30 | The University Of British Columbia | Robotically assisted medical ultrasound |
JP3836466B2 (en) * | 2001-08-29 | 2006-10-25 | 本田技研工業株式会社 | Biped mobile robot remote control device |
US20070078466A1 (en) * | 2005-09-30 | 2007-04-05 | Restoration Robotics, Inc. | Methods for harvesting follicular units using an automated system |
US20080161970A1 (en) * | 2004-10-19 | 2008-07-03 | Yuji Adachi | Robot apparatus |
US20090234502A1 (en) * | 2008-03-12 | 2009-09-17 | Denso Wave Incorporated | Apparatus for determining pickup pose of robot arm with camera |
WO2010138083A1 (en) * | 2009-05-29 | 2010-12-02 | Nanyang Technological University | Robotic system for flexible endoscopy |
WO2011153570A1 (en) * | 2010-06-08 | 2011-12-15 | Keba Ag | Method, control system and movement setting means for programming or setting movements or sequences of an industrial robot |
KR20120043887A (en) * | 2010-10-27 | 2012-05-07 | 김동호 | Joint module of universal robot |
US20120239196A1 (en) * | 2011-03-15 | 2012-09-20 | Microsoft Corporation | Natural Human to Robot Remote Control |
US20130079905A1 (en) * | 2010-06-03 | 2013-03-28 | Hitachi, Ltd. | Human-Operated Working Machine System |
US20130268118A1 (en) * | 2012-04-05 | 2013-10-10 | Irobot Corporation | Operating A Mobile Robot |
US20140009561A1 (en) * | 2010-11-12 | 2014-01-09 | Crosswing Inc. | Customizable robotic system |
KR101414362B1 (en) * | 2013-01-30 | 2014-07-02 | 한국과학기술원 | Method and apparatus for space bezel interface using image recognition |
CN103987496A (en) * | 2011-08-24 | 2014-08-13 | 山崎马扎克公司 | Nc machine tool system |
US20140229005A1 (en) * | 2013-02-14 | 2014-08-14 | Canon Kabushiki Kaisha | Robot system and method for controlling the same |
US20140237587A1 (en) * | 2013-02-15 | 2014-08-21 | Microsoft Corporation | Managed Biometric Identity |
CN104428107A (en) * | 2012-07-10 | 2015-03-18 | 西门子公司 | Robot arrangement and method for controlling a robot |
US20150078621A1 (en) * | 2013-09-13 | 2015-03-19 | Electronics And Telecommunications Research Institute | Apparatus and method for providing content experience service |
US20150120048A1 (en) * | 2013-10-24 | 2015-04-30 | Harris Corporation | Control synchronization for high-latency teleoperation |
WO2015143968A1 (en) * | 2014-03-23 | 2015-10-01 | 余浪 | Method for remotely controlling robot, and robot avatar network |
US9242379B1 (en) * | 2015-02-09 | 2016-01-26 | The Trustees Of The University Of Pennysylvania | Methods, systems, and computer readable media for producing realistic camera motion for stop motion animation |
FR3027473A1 (en) * | 2014-10-16 | 2016-04-22 | Renault Sa | DEVICE AND METHOD FOR CONTROLLING THE ELECTRIC MACHINE OF A VEHICLE IN ORDER TO MAINTAIN IT IN THE IMMOBILIZED POSITION |
DE112014004307T5 (en) * | 2013-09-20 | 2016-07-07 | Denso Wave Incorporated | Robot operation device, robot system, and robot operation program |
US20160288332A1 (en) * | 2015-03-30 | 2016-10-06 | Seiko Epson Corporation | Robot, robot control apparatus and robot system |
CN106068174A (en) * | 2014-01-31 | 2016-11-02 | Abb高姆技术有限责任公司 | Robot controls |
US20160318187A1 (en) * | 2015-05-01 | 2016-11-03 | General Electric Company | Systems and methods for control of robotic manipulation |
US20160350589A1 (en) * | 2015-05-27 | 2016-12-01 | Hsien-Hsiang Chiu | Gesture Interface Robot |
US20170100838A1 (en) * | 2015-10-12 | 2017-04-13 | The Boeing Company | Dynamic Automation Work Zone Safety System |
US20170225336A1 (en) * | 2016-02-09 | 2017-08-10 | Cobalt Robotics Inc. | Building-Integrated Mobile Robot |
US20170352351A1 (en) * | 2014-10-29 | 2017-12-07 | Kyocera Corporation | Communication robot |
JP2017536247A (en) * | 2014-09-02 | 2017-12-07 | エムビーエル リミテッド | Robot operation method and system for executing domain specific application in instrumentation environment using electronic small-scale operation library |
US20180099407A1 (en) * | 2015-05-28 | 2018-04-12 | Hitachi, Ltd. | Robot Operation Device and Program |
US9945677B1 (en) * | 2015-07-23 | 2018-04-17 | X Development Llc | Automated lane and route network discovery for robotic actors |
US20180333862A1 (en) * | 2016-03-28 | 2018-11-22 | Groove X, Inc. | Autonomously acting robot that performs a greeting action |
US20180345495A1 (en) * | 2017-05-30 | 2018-12-06 | Sisu Devices Llc | Robotic point capture and motion control |
US20190022870A1 (en) * | 2017-07-18 | 2019-01-24 | Panasonic Intellectual Property Management Co., Ltd. | Apparatus, method, non-transitory computer-readable recording medium storing program, and robot |
US20190030723A1 (en) * | 2016-04-08 | 2019-01-31 | Groove X, Inc. | Autonomously acting robot exhibiting shyness |
JP2019503875A (en) * | 2015-12-16 | 2019-02-14 | MBL Limited | Robotic kitchen including a robot, a storage arrangement and containers therefor |
US10242501B1 (en) * | 2016-05-03 | 2019-03-26 | WorldViz, Inc. | Multi-user virtual and augmented reality tracking systems |
US20190091874A1 (en) * | 2016-06-14 | 2019-03-28 | Groove X, Inc. | Autonomously acting robot that seeks coolness |
US20190134818A1 (en) * | 2008-09-18 | 2019-05-09 | Intouch Technologies, Inc. | Mobile videoconferencing robot system with network adaptive driving |
US20190184567A1 (en) * | 2016-08-29 | 2019-06-20 | Groove X, Inc. | Autonomously acting robot that recognizes direction of sound source |
US20190202054A1 (en) * | 2016-09-02 | 2019-07-04 | Groove X, Inc. | Autonomously acting robot, server, and behavior control program |
US20190224854A1 (en) * | 2016-09-14 | 2019-07-25 | Keba Ag | Control device and control method for industrial machines having controlled movement drives |
US10403050B1 (en) * | 2017-04-10 | 2019-09-03 | WorldViz, Inc. | Multi-user virtual and augmented reality tracking systems |
US20190278295A1 (en) * | 2016-11-24 | 2019-09-12 | Kyoto University | Robot control system, machine control system, robot control method, machine control method, and recording medium |
US20190337166A1 (en) * | 2018-05-01 | 2019-11-07 | Misty Robotics | Robot neck mechanism |
US10471591B1 (en) * | 2018-06-01 | 2019-11-12 | X Development Llc | Object hand-over between robot and actor |
US20190375112A1 (en) * | 2017-02-09 | 2019-12-12 | Mitsubishi Electric Corporation | Position control device and position control method |
US20190389058A1 (en) * | 2018-06-25 | 2019-12-26 | Groove X, Inc. | Autonomously acting robot that imagines virtual character |
US20200030970A1 (en) * | 2017-02-09 | 2020-01-30 | Mitsubishi Electric Corporation | Position control device and position control method |
US20200101614A1 (en) * | 2018-10-01 | 2020-04-02 | Toyota Research Institute, Inc. | Methods and systems for implementing customized motions based on individual profiles for identified users |
US20200159229A1 (en) * | 2018-08-13 | 2020-05-21 | R-Go Robotics Ltd. | System and method for creating a single perspective synthesized image |
CN111716365A (en) * | 2020-06-15 | 2020-09-29 | 山东大学 | Immersive remote interaction system and method based on natural walking |
US20200368904A1 (en) * | 2019-05-20 | 2020-11-26 | Russell Aldridge | Remote robotic welding with a handheld controller |
US20210027236A1 (en) * | 2019-07-22 | 2021-01-28 | Invia Robotics, Inc. | Decoupled Order Fulfillment |
CN112704563A (en) * | 2020-12-25 | 2021-04-27 | 天津市第三中心医院 | Remote ultrasonic operation simulation system for hepatobiliary surgery based on ultrasonic knife |
WO2021092194A1 (en) * | 2019-11-05 | 2021-05-14 | Vicarious Surgical Inc. | Surgical virtual reality user interface |
WO2021165908A1 (en) * | 2020-02-21 | 2021-08-26 | Louwrens Jakobus Briel | Camera equipped cycle and coordinated punch exercise device and methods |
US20210294944A1 (en) * | 2020-03-19 | 2021-09-23 | Nvidia Corporation | Virtual environment scenarios and observers for autonomous machine applications |
US20210387301A1 (en) * | 2020-06-12 | 2021-12-16 | Hexagon Metrology, Inc. | Robotic Alignment Method for Workpiece Measuring Systems |
US20210405366A1 (en) * | 2014-03-26 | 2021-12-30 | Mark D. Wieczorek | System and method for distanced interactive experiences |
US20220040852A1 (en) * | 2020-07-31 | 2022-02-10 | Robert Bosch Gmbh | Method for controlling a robot device and robot device controller |
US20220043263A1 (en) * | 2014-03-26 | 2022-02-10 | Mark D. Wieczorek | System and method for distanced interactive experiences |
US20220252881A1 (en) * | 2014-03-26 | 2022-08-11 | Mark D. Wieczorek | System and method for haptic interactive experiences |
US11436869B1 (en) * | 2019-12-09 | 2022-09-06 | X Development Llc | Engagement detection and attention estimation for human-robot interaction |
US20220288791A1 (en) * | 2019-04-16 | 2022-09-15 | Sony Group Corporation | Information processing device, information processing method, and program |
US20220314112A1 (en) * | 2021-04-06 | 2022-10-06 | Sony Interactive Entertainment LLC | Adjustable robot for providing scale of virtual assets and identifying objects in an interactive scene |
US20220331946A1 (en) * | 2021-04-16 | 2022-10-20 | Dexterity, Inc. | Repositionable robot riser |
US20230073265A1 (en) * | 2020-02-19 | 2023-03-09 | Sony Interactive Entertainment Inc. | Information processing device and action mode setting method |
Family Cites Families (55)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6016385A (en) * | 1997-08-11 | 2000-01-18 | Fanu America Corp | Real time remotely controlled robot |
FR2839176A1 (en) * | 2002-04-30 | 2003-10-31 | Koninkl Philips Electronics Nv | ROBOT ANIMATION SYSTEM COMPRISING A SET OF MOVING PARTS |
US8600550B2 (en) * | 2003-12-12 | 2013-12-03 | Kurzweil Technologies, Inc. | Virtual encounters |
JP3920317B2 (en) * | 2004-08-02 | 2007-05-30 | 松下電器産業株式会社 | Goods handling robot |
US20100222925A1 (en) * | 2004-12-03 | 2010-09-02 | Takashi Anezaki | Robot control apparatus |
DE102005061211B4 (en) * | 2004-12-22 | 2023-04-06 | Abb Schweiz Ag | Method for creating a human-machine user interface |
EP2281668B1 (en) * | 2005-09-30 | 2013-04-17 | iRobot Corporation | Companion robot for personal interaction |
DE102005058867B4 (en) * | 2005-12-09 | 2018-09-27 | Cine-Tv Broadcast Systems Gmbh | Method and device for moving a camera arranged on a pan and tilt head along a predetermined path of movement |
JP5109192B2 (en) * | 2006-07-28 | 2012-12-26 | ソニー株式会社 | FACS (Facial Action Coding System) solving in motion capture |
CN101646534B (en) * | 2007-06-27 | 2012-03-21 | 松下电器产业株式会社 | Apparatus and method for controlling robot arm, and robot |
WO2009004772A1 (en) * | 2007-07-05 | 2009-01-08 | Panasonic Corporation | Robot arm control device and control method, robot and control program |
JP2009032189A (en) * | 2007-07-30 | 2009-02-12 | Toyota Motor Corp | Device for generating robot motion path |
EP2296068B1 (en) * | 2008-02-28 | 2015-06-24 | Panasonic Intellectual Property Management Co., Ltd. | Control apparatus and control method for a robot arm, robot, control program for a robot arm, and electronic integrated circuit for controlling a robot arm |
US8214098B2 (en) * | 2008-02-28 | 2012-07-03 | The Boeing Company | System and method for controlling swarm of remote unmanned vehicles through human gestures |
KR101494344B1 (en) * | 2008-04-25 | 2015-02-17 | 삼성전자주식회사 | method and system for motion control in humanoid robot |
JP4568795B2 (en) * | 2009-01-09 | 2010-10-27 | パナソニック株式会社 | Robot arm control device and control method, robot, robot arm control program, and integrated electronic circuit |
JP2010223932A (en) * | 2009-02-27 | 2010-10-07 | Toyota Motor Corp | Method of detecting defect |
US7983450B2 (en) * | 2009-03-16 | 2011-07-19 | The Boeing Company | Method, apparatus and computer program product for recognizing a gesture |
JP4759660B2 (en) * | 2009-08-21 | 2011-08-31 | パナソニック株式会社 | Robot arm control device, method, program, integrated electronic circuit, and assembly robot |
KR20110026211A (en) * | 2009-09-07 | 2011-03-15 | 삼성전자주식회사 | Humanoid robot |
US8878779B2 (en) * | 2009-09-21 | 2014-11-04 | Extreme Reality Ltd. | Methods circuits device systems and associated computer executable code for facilitating interfacing with a computing platform display screen |
JP4699572B2 (en) * | 2009-09-28 | 2011-06-15 | パナソニック株式会社 | Robot arm control apparatus and control method, robot, robot arm control program, and integrated electronic circuit for robot arm control |
US8600552B2 (en) * | 2009-10-30 | 2013-12-03 | Honda Motor Co., Ltd. | Information processing method, apparatus, and computer readable medium |
JP2011140077A (en) * | 2010-01-06 | 2011-07-21 | Honda Motor Co Ltd | Processing system and processing method |
US8918213B2 (en) * | 2010-05-20 | 2014-12-23 | Irobot Corporation | Mobile human interface robot |
US8781629B2 (en) * | 2010-09-22 | 2014-07-15 | Toyota Motor Engineering & Manufacturing North America, Inc. | Human-robot interface apparatuses and methods of controlling robots |
US8930019B2 (en) * | 2010-12-30 | 2015-01-06 | Irobot Corporation | Mobile human interface robot |
US9188973B2 (en) * | 2011-07-08 | 2015-11-17 | Restoration Robotics, Inc. | Calibration and transformation of a camera system's coordinate system |
TW201310339A (en) * | 2011-08-25 | 2013-03-01 | Hon Hai Prec Ind Co Ltd | System and method for controlling a robot |
WO2013035244A1 (en) * | 2011-09-06 | 2013-03-14 | パナソニック株式会社 | Robotic arm control device and control method, robot, control program and integrated electronic circuit |
EP2810748A4 (en) * | 2012-02-03 | 2016-09-07 | Nec Corp | Communication draw-in system, communication draw-in method, and communication draw-in program |
US8843236B2 (en) * | 2012-03-15 | 2014-09-23 | GM Global Technology Operations LLC | Method and system for training a robot using human-assisted task demonstration |
JP6021533B2 (en) * | 2012-09-03 | 2016-11-09 | キヤノン株式会社 | Information processing system, apparatus, method, and program |
US9025856B2 (en) * | 2012-09-05 | 2015-05-05 | Qualcomm Incorporated | Robot control information |
EP3932628A1 (en) * | 2012-12-10 | 2022-01-05 | Intuitive Surgical Operations, Inc. | Collision avoidance during controlled movement of image capturing device and manipulatable device movable arms |
CN104936748B (en) * | 2012-12-14 | 2017-10-27 | Abb技术有限公司 | Free-hand robot path teaching |
US9056396B1 (en) * | 2013-03-05 | 2015-06-16 | Autofuss | Programming of a robotic arm using a motion capture system |
JP6514681B2 (en) * | 2013-03-15 | 2019-05-15 | ウーバー テクノロジーズ,インコーポレイテッド | Method, system and apparatus for multi-perceptive stereo vision for robots |
US8903568B1 (en) * | 2013-07-31 | 2014-12-02 | SZ DJI Technology Co., Ltd | Remote control method and terminal |
US9381642B2 (en) * | 2014-01-13 | 2016-07-05 | Massachusetts Institute Of Technology | Wearable robot assisting manual tasks |
EP3107429B1 (en) * | 2014-02-20 | 2023-11-15 | MBL Limited | Methods and systems for food preparation in a robotic cooking kitchen |
DK2933604T3 (en) * | 2014-04-14 | 2017-03-13 | Softbank Robotics Europe | PROCEDURE FOR LOCATING A ROBOT IN A LOCATION PLAN |
CN106456145B (en) * | 2014-05-05 | 2020-08-18 | 维卡瑞斯外科手术股份有限公司 | Virtual reality surgical device |
US10613527B2 (en) * | 2014-08-18 | 2020-04-07 | Verity Studios Ag | Invisible track for an interactive mobile robot system |
JP6415190B2 (en) * | 2014-09-03 | 2018-10-31 | キヤノン株式会社 | ROBOT DEVICE, ROBOT CONTROL PROGRAM, RECORDING MEDIUM, AND ROBOT DEVICE CONTROL METHOD |
US9694496B2 (en) * | 2015-02-26 | 2017-07-04 | Toyota Jidosha Kabushiki Kaisha | Providing personalized patient care based on electronic health record associated with a user |
US9643314B2 (en) * | 2015-03-04 | 2017-05-09 | The Johns Hopkins University | Robot control, training and collaboration in an immersive virtual reality environment |
EP3272473B1 (en) * | 2015-03-20 | 2022-09-14 | FUJI Corporation | Teaching device and method for generating control information |
US20170039671A1 (en) * | 2015-08-07 | 2017-02-09 | Seoul National University R&Db Foundation | Robotic self-filming system |
CN105223957B (en) | 2015-09-24 | 2018-10-02 | 北京零零无限科技有限公司 | Method and apparatus for gesture control of an unmanned aerial vehicle |
DE102015221337A1 (en) * | 2015-10-30 | 2017-05-04 | Keba Ag | Method and control system for controlling the movements of articulated arms of an industrial robot as well as motion specification means used thereby |
US10434659B2 (en) * | 2016-03-02 | 2019-10-08 | Kindred Systems Inc. | Systems, devices, articles, and methods for user input |
US20190105779A1 (en) * | 2016-03-24 | 2019-04-11 | Polygon T.R Ltd. | Systems and methods for human and robot collaboration |
US10322506B2 (en) * | 2016-05-06 | 2019-06-18 | Kindred Systems Inc. | Systems, devices, articles, and methods for using trained robots |
US10427305B2 (en) * | 2016-07-21 | 2019-10-01 | Autodesk, Inc. | Robotic camera control via motion capture |
2016
- 2016-07-21 US US15/216,583 patent/US10427305B2/en active Active
2017
- 2017-07-20 JP JP2019502163A patent/JP2019523145A/en not_active Withdrawn
- 2017-07-20 EP EP17748602.4A patent/EP3488308A1/en active Pending
- 2017-07-20 WO PCT/US2017/043118 patent/WO2018017859A1/en unknown
2019
- 2019-09-30 US US16/588,972 patent/US20200030986A1/en not_active Abandoned
2021
- 2021-12-21 JP JP2021206918A patent/JP2022060201A/en active Pending
2023
- 2023-12-07 JP JP2023206828A patent/JP2024015262A/en active Pending
Patent Citations (81)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US388760A (en) * | 1888-08-28 | Machine | ||
US719583A (en) * | 1901-04-11 | 1903-02-03 | Adolph Rosenthal | Foot-gear for use in swimming. |
CN2124815U (en) * | 1992-07-08 | 1992-12-16 | 台湾恒基股份有限公司 | Screen printer |
US6285380B1 (en) * | 1994-08-02 | 2001-09-04 | New York University | Method and system for scripting interactive animated actors |
US6115639A (en) * | 1996-12-24 | 2000-09-05 | Honda Giken Kogyo Kabushiki Kaisha | Remote control system for legged moving robot |
US5841258A (en) * | 1997-01-31 | 1998-11-24 | Honda Giken Kogyo Kabushiki Kaisha | Remote control system for legged moving robot |
US6353773B1 (en) * | 1997-04-21 | 2002-03-05 | Honda Giken Kogyo Kabushiki Kaisha | Remote control system for biped locomotion robot |
US6425865B1 (en) * | 1998-06-12 | 2002-07-30 | The University Of British Columbia | Robotically assisted medical ultrasound |
US6347261B1 (en) * | 1999-08-04 | 2002-02-12 | Yamaha Hatsudoki Kabushiki Kaisha | User-machine interface system for enhanced interaction |
JP3836466B2 (en) * | 2001-08-29 | 2006-10-25 | 本田技研工業株式会社 | Biped mobile robot remote control device |
US20080161970A1 (en) * | 2004-10-19 | 2008-07-03 | Yuji Adachi | Robot apparatus |
US20070078466A1 (en) * | 2005-09-30 | 2007-04-05 | Restoration Robotics, Inc. | Methods for harvesting follicular units using an automated system |
US20090234502A1 (en) * | 2008-03-12 | 2009-09-17 | Denso Wave Incorporated | Apparatus for determining pickup pose of robot arm with camera |
US20190134818A1 (en) * | 2008-09-18 | 2019-05-09 | Intouch Technologies, Inc. | Mobile videoconferencing robot system with network adaptive driving |
WO2010138083A1 (en) * | 2009-05-29 | 2010-12-02 | Nanyang Technological University | Robotic system for flexible endoscopy |
US20130079905A1 (en) * | 2010-06-03 | 2013-03-28 | Hitachi, Ltd. | Human-Operated Working Machine System |
WO2011153570A1 (en) * | 2010-06-08 | 2011-12-15 | Keba Ag | Method, control system and movement setting means for programming or setting movements or sequences of an industrial robot |
KR20120043887A (en) * | 2010-10-27 | 2012-05-07 | 김동호 | Joint module of universal robot |
US20140009561A1 (en) * | 2010-11-12 | 2014-01-09 | Crosswing Inc. | Customizable robotic system |
US20120239196A1 (en) * | 2011-03-15 | 2012-09-20 | Microsoft Corporation | Natural Human to Robot Remote Control |
CN103987496A (en) * | 2011-08-24 | 2014-08-13 | 山崎马扎克公司 | Nc machine tool system |
US20130268118A1 (en) * | 2012-04-05 | 2013-10-10 | Irobot Corporation | Operating A Mobile Robot |
CN104428107A (en) * | 2012-07-10 | 2015-03-18 | 西门子公司 | Robot arrangement and method for controlling a robot |
US20150158178A1 (en) * | 2012-07-10 | 2015-06-11 | Siemens Aktiengesellschaft | Robot arrangement and method for controlling a robot |
KR101414362B1 (en) * | 2013-01-30 | 2014-07-02 | 한국과학기술원 | Method and apparatus for space bezel interface using image recognition |
US20140229005A1 (en) * | 2013-02-14 | 2014-08-14 | Canon Kabushiki Kaisha | Robot system and method for controlling the same |
US20140237587A1 (en) * | 2013-02-15 | 2014-08-21 | Microsoft Corporation | Managed Biometric Identity |
US20150078621A1 (en) * | 2013-09-13 | 2015-03-19 | Electronics And Telecommunications Research Institute | Apparatus and method for providing content experience service |
DE112014004307T5 (en) * | 2013-09-20 | 2016-07-07 | Denso Wave Incorporated | Robot operation device, robot system, and robot operation program |
US20160229052A1 (en) * | 2013-09-20 | 2016-08-11 | Denso Wave Incorporated | Robot operation apparatus, robot system, and robot operation program |
US20150120048A1 (en) * | 2013-10-24 | 2015-04-30 | Harris Corporation | Control synchronization for high-latency teleoperation |
CN106068174A (en) * | 2014-01-31 | 2016-11-02 | Abb高姆技术有限责任公司 | Robot control |
WO2015143968A1 (en) * | 2014-03-23 | 2015-10-01 | 余浪 | Method for remotely controlling robot, and robot avatar network |
US20220043263A1 (en) * | 2014-03-26 | 2022-02-10 | Mark D. Wieczorek | System and method for distanced interactive experiences |
US20220252881A1 (en) * | 2014-03-26 | 2022-08-11 | Mark D. Wieczorek | System and method for haptic interactive experiences |
US20210405366A1 (en) * | 2014-03-26 | 2021-12-30 | Mark D. Wieczorek | System and method for distanced interactive experiences |
JP2017536247A (en) * | 2014-09-02 | 2017-12-07 | MBL Limited | Robotic manipulation methods and systems for executing a domain-specific application in an instrumented environment with electronic minimanipulation libraries |
FR3027473A1 (en) * | 2014-10-16 | 2016-04-22 | Renault Sa | DEVICE AND METHOD FOR CONTROLLING THE ELECTRIC MACHINE OF A VEHICLE IN ORDER TO MAINTAIN IT IN THE IMMOBILIZED POSITION |
US20170352351A1 (en) * | 2014-10-29 | 2017-12-07 | Kyocera Corporation | Communication robot |
US9242379B1 (en) * | 2015-02-09 | 2016-01-26 | The Trustees Of The University Of Pennsylvania | Methods, systems, and computer readable media for producing realistic camera motion for stop motion animation |
US20160288332A1 (en) * | 2015-03-30 | 2016-10-06 | Seiko Epson Corporation | Robot, robot control apparatus and robot system |
US20160318187A1 (en) * | 2015-05-01 | 2016-11-03 | General Electric Company | Systems and methods for control of robotic manipulation |
US20160350589A1 (en) * | 2015-05-27 | 2016-12-01 | Hsien-Hsiang Chiu | Gesture Interface Robot |
US20180099407A1 (en) * | 2015-05-28 | 2018-04-12 | Hitachi, Ltd. | Robot Operation Device and Program |
US9945677B1 (en) * | 2015-07-23 | 2018-04-17 | X Development Llc | Automated lane and route network discovery for robotic actors |
US20170100838A1 (en) * | 2015-10-12 | 2017-04-13 | The Boeing Company | Dynamic Automation Work Zone Safety System |
JP2019503875A (en) * | 2015-12-16 | 2019-02-14 | MBL Limited | Robotic kitchen including a robot, a storage arrangement and containers therefor |
US20170225336A1 (en) * | 2016-02-09 | 2017-08-10 | Cobalt Robotics Inc. | Building-Integrated Mobile Robot |
US20180333862A1 (en) * | 2016-03-28 | 2018-11-22 | Groove X, Inc. | Autonomously acting robot that performs a greeting action |
US20190030723A1 (en) * | 2016-04-08 | 2019-01-31 | Groove X, Inc. | Autonomously acting robot exhibiting shyness |
US10242501B1 (en) * | 2016-05-03 | 2019-03-26 | WorldViz, Inc. | Multi-user virtual and augmented reality tracking systems |
US20190091874A1 (en) * | 2016-06-14 | 2019-03-28 | Groove X, Inc. | Autonomously acting robot that seeks coolness |
US20190184567A1 (en) * | 2016-08-29 | 2019-06-20 | Groove X, Inc. | Autonomously acting robot that recognizes direction of sound source |
US20190202054A1 (en) * | 2016-09-02 | 2019-07-04 | Groove X, Inc. | Autonomously acting robot, server, and behavior control program |
US20190224854A1 (en) * | 2016-09-14 | 2019-07-25 | Keba Ag | Control device and control method for industrial machines having controlled movement drives |
US20190278295A1 (en) * | 2016-11-24 | 2019-09-12 | Kyoto University | Robot control system, machine control system, robot control method, machine control method, and recording medium |
US20200030970A1 (en) * | 2017-02-09 | 2020-01-30 | Mitsubishi Electric Corporation | Position control device and position control method |
US20190375112A1 (en) * | 2017-02-09 | 2019-12-12 | Mitsubishi Electric Corporation | Position control device and position control method |
US10403050B1 (en) * | 2017-04-10 | 2019-09-03 | WorldViz, Inc. | Multi-user virtual and augmented reality tracking systems |
US20180345495A1 (en) * | 2017-05-30 | 2018-12-06 | Sisu Devices Llc | Robotic point capture and motion control |
US20190022870A1 (en) * | 2017-07-18 | 2019-01-24 | Panasonic Intellectual Property Management Co., Ltd. | Apparatus, method, non-transitory computer-readable recording medium storing program, and robot |
US20190337166A1 (en) * | 2018-05-01 | 2019-11-07 | Misty Robotics | Robot neck mechanism |
US10471591B1 (en) * | 2018-06-01 | 2019-11-12 | X Development Llc | Object hand-over between robot and actor |
US20190389058A1 (en) * | 2018-06-25 | 2019-12-26 | Groove X, Inc. | Autonomously acting robot that imagines virtual character |
US20200159229A1 (en) * | 2018-08-13 | 2020-05-21 | R-Go Robotics Ltd. | System and method for creating a single perspective synthesized image |
US20200101614A1 (en) * | 2018-10-01 | 2020-04-02 | Toyota Research Institute, Inc. | Methods and systems for implementing customized motions based on individual profiles for identified users |
US20220288791A1 (en) * | 2019-04-16 | 2022-09-15 | Sony Group Corporation | Information processing device, information processing method, and program |
US20200368904A1 (en) * | 2019-05-20 | 2020-11-26 | Russell Aldridge | Remote robotic welding with a handheld controller |
US20210027236A1 (en) * | 2019-07-22 | 2021-01-28 | Invia Robotics, Inc. | Decoupled Order Fulfillment |
WO2021092194A1 (en) * | 2019-11-05 | 2021-05-14 | Vicarious Surgical Inc. | Surgical virtual reality user interface |
US20220387128A1 (en) * | 2019-11-05 | 2022-12-08 | Vicarious Surgical Inc. | Surgical virtual reality user interface |
US11436869B1 (en) * | 2019-12-09 | 2022-09-06 | X Development Llc | Engagement detection and attention estimation for human-robot interaction |
US20230073265A1 (en) * | 2020-02-19 | 2023-03-09 | Sony Interactive Entertainment Inc. | Information processing device and action mode setting method |
WO2021165908A1 (en) * | 2020-02-21 | 2021-08-26 | Louwrens Jakobus Briel | Camera equipped cycle and coordinated punch exercise device and methods |
US20210294944A1 (en) * | 2020-03-19 | 2021-09-23 | Nvidia Corporation | Virtual environment scenarios and observers for autonomous machine applications |
US20210387301A1 (en) * | 2020-06-12 | 2021-12-16 | Hexagon Metrology, Inc. | Robotic Alignment Method for Workpiece Measuring Systems |
CN111716365A (en) * | 2020-06-15 | 2020-09-29 | 山东大学 | Immersive remote interaction system and method based on natural walking |
US20220040852A1 (en) * | 2020-07-31 | 2022-02-10 | Robert Bosch Gmbh | Method for controlling a robot device and robot device controller |
CN112704563A (en) * | 2020-12-25 | 2021-04-27 | 天津市第三中心医院 | Ultrasonic-scalpel-based remote ultrasound surgery simulation system for hepatobiliary surgery |
US20220314112A1 (en) * | 2021-04-06 | 2022-10-06 | Sony Interactive Entertainment LLC | Adjustable robot for providing scale of virtual assets and identifying objects in an interactive scene |
US20220331946A1 (en) * | 2021-04-16 | 2022-10-20 | Dexterity, Inc. | Repositionable robot riser |
Also Published As
Publication number | Publication date |
---|---|
EP3488308A1 (en) | 2019-05-29 |
JP2022060201A (en) | 2022-04-14 |
WO2018017859A1 (en) | 2018-01-25 |
JP2019523145A (en) | 2019-08-22 |
US20180021956A1 (en) | 2018-01-25 |
US10427305B2 (en) | 2019-10-01 |
JP2024015262A (en) | 2024-02-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20200030986A1 (en) | Robotic camera control via motion capture | |
CN110573308B (en) | Computer-based method and system for spatial programming of robotic devices | |
JP6420229B2 (en) | A robot system including a video display device that superimposes and displays an image of a virtual object on a video of a robot | |
US9300852B2 (en) | Controlling robotic motion of camera | |
Yip et al. | Model-less feedback control of continuum manipulators in constrained environments | |
Zuo et al. | Craves: Controlling robotic arm with a vision-based economic system | |
US20190086907A1 (en) | Programming a robot by demonstration | |
JP2016504077A5 (en) | ||
US20150273689A1 (en) | Robot control device, robot, robotic system, teaching method, and program | |
EP3740352A1 (en) | Vision-based sensor system and control method for robot arms | |
WO2016193781A1 (en) | Motion control system for a direct drive robot through visual servoing | |
JP2021000678A (en) | Control system and control method | |
CN110877334A (en) | Method and apparatus for robot control | |
Avalos et al. | Real-time teleoperation with the Baxter robot and the Kinect sensor | |
Cortes et al. | Increasing optical tracking workspace of VR applications using controlled cameras | |
CN105527980B (en) | Binocular vision system target following control method | |
US20180065247A1 (en) | Configuring a robotic camera to mimic cinematographic styles | |
Cai et al. | 6D image-based visual servoing for robot manipulators with uncalibrated stereo cameras | |
Yu et al. | Precise Robotic Needle-Threading with Tactile Perception and Reinforcement Learning | |
Rudd et al. | Intuitive gesture-based control system with collision avoidance for robotic manipulators | |
Liaw et al. | Target prediction to improve human errors in robot teleoperation system | |
Walęcki et al. | Control system of a service robot's active head exemplified on visual servoing | |
Hu et al. | Manipulator arm interactive control in unknown underwater environment | |
Li et al. | Image based approach to obstacle avoidance in mobile manipulators | |
US11900590B2 (en) | Inspection device articulation transformation based on image transformation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: AUTODESK, INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ATHERTON, EVAN PATRICK;THOMASSON, DAVID;CONTI, MAURICE UGO;AND OTHERS;SIGNING DATES FROM 20161208 TO 20161209;REEL/FRAME:050722/0577 |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |