WO2019046559A1 - Using augmented reality to control intelligent devices - Google Patents

Using augmented reality to control intelligent devices

Info

Publication number
WO2019046559A1
Authority
WO
WIPO (PCT)
Prior art keywords
robotic arm
user
environment
robotic
user interactions
Prior art date
Application number
PCT/US2018/048811
Other languages
English (en)
Inventor
Xinli Zou
Maximilian KLASSEN
Xiaodong Zhou
Jiantao PAN
Original Assignee
Linkedwyz
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Linkedwyz
Publication of WO2019046559A1

Links

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B25J9/1666Avoiding collision or forbidden zones
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0132Head-up displays characterised by optical features comprising binocular systems
    • G02B2027/0134Head-up displays characterised by optical features comprising binocular systems of stereoscopic type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39449Pendant, pda displaying camera images overlayed with graphics, augmented reality
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39451Augmented reality for robot programming

Definitions

  • This patent document relates generally to controlling intelligent devices in an augmented reality (AR) environment. For example, methods and systems for controlling a robotic arm using an AR device are disclosed.
  • a robotic arm is a typical type of intelligent device, capable of performing various tasks.
  • Today there is no good solution for a user to interact with a robotic arm.
  • Even some simple tasks require programming skills, which limits the applications in which a robotic arm can be utilized.
  • Instructing a robotic arm to perform a simple task, such as picking up an object, not only requires profound skills in programming the robotic arm, it also requires accurate positioning of the object in the digital coordinate system of the robotic arm.
  • a system for controlling a robotic arm in a first environment includes a user interaction unit installable on an Augmented Reality (AR) device and a device control system.
  • the user interaction unit is configured to display a simulation model of the robotic arm on a display of the AR device in a second environment, and receive one or more user interactions over the simulation model of the robotic arm, where the one or more interactions indicate a user desired operation of the robotic arm.
  • the device control system is configured to generate robotic arm programming instructions based on the one or more user interactions and transmit the robotic arm programming instructions to the robotic arm, where the robotic arm programming instructions cause the robotic arm to perform the user desired operation.
  • the one or more user interactions include a user selection of one or more spatial position tracking points, each tracking point corresponding to an actuatable part of the robotic arm and including one or more device parameters consisting of a movement, a speed, a velocity, a torque and a duration of a pause in time.
  • the user interaction unit may further display one or more device parameters on the display of the AR device while receiving the one or more user interactions, where the one or more device parameters correspond to each of the tracking points being selected.
  • the system also includes a robotic arm simulation system configured to receive from the user interaction unit the one or more user interactions over the simulation model of the robotic arm and generate a trajectory of movement of the robotic arm based on the one or more user interactions.
  • the user interaction unit is further configured to display the trajectory on the display of the AR device.
  • the system also includes a device calibration unit configured to generate an alignment between a first coordinate system in the first environment and a second coordinate system in the second environment, and the device control system is configured to use the alignment to generate the robotic arm programming instructions based on the one or more user interactions.
  • a method for controlling an intelligent device in a first environment includes displaying a simulation model of the intelligent device on a display of an AR device in a second environment.
  • the method also includes receiving one or more user interactions over the simulation model of the intelligent device, where the one or more interactions indicate a user desired operation of the intelligent device.
  • the method further includes generating programming instructions based on the one or more user interactions, transmitting the programming instructions to the intelligent device, and causing the intelligent device to perform the user desired operation in the first environment.
  • the intelligent device may include a robotic arm.
  • FIGs. 1A and 1B illustrate an example robotic arm and AR device in accordance with various examples described herein.
  • FIG. 2 illustrates a diagram of a device control system that includes various components in accordance with various examples described herein.
  • FIG. 3 illustrates an example of a simulated robotic arm overlaid on the physical robotic arm in accordance with some examples described herein.
  • FIG. 4 illustrates an example of a process of robotic arm planning in accordance with various examples described herein.
  • FIG. 5 illustrates an example of a process of controlling a robotic arm in accordance with various examples described herein.
  • An intelligent device may generally include any type of equipment, instrument, or machinery that includes its own computing and network communication capabilities. Although various methods and systems are described with examples of a robotic arm, the scope of this patent disclosure may not be limited to robotic arms. Examples of an intelligent device may also include a network-connected excavator, paver, or smart home devices, etc.
  • FIGs. 1A and 1B illustrate an example robotic system, such as a robotic arm, and AR device.
  • an intelligent device 120 may include one or more movable parts, such as 124(1, ..., n), 126(1, ..., n), and a control system 134 configured to control the one or more movable parts 124(1, ..., n), 126(1, ..., n).
  • intelligent device 120 may include a robotic arm system that includes multiple movable arms 124(1), 124(2), ... 124(n), which may be joined via one or more rotatable joints, e.g., 126(1), 126(2), ... 126(n).
  • Each of the rotatable joints may be actuated by an actuator that causes the one or more movable arms to move according to a desired path.
  • the actuators associated with the rotatable joints may be controlled by the control system 134.
  • Control system 134 may include a processing device and a non-transitory computer-readable medium that contains programming instructions configured to cause the processing device to cause the actuators to actuate the associated rotatable joints.
  • Control system 134 may also be configured to communicate with an external device to receive programming instructions for controlling various parts of the robotic arm 120.
  • robotic arm 120 may also have a base 128 which hosts the control system 134.
  • control system 134 may be installed in any part of the robot, such as one of the robotic arms.
  • the control system needs to know the destination position of the robotic arm, such as the position of the object that the user instructs the robotic arm to grab. This destination position is relative to the coordinate system 122 of the robotic arm.
  • the user needs to pass the destination position in the coordinate system of the robotic arm to the control system.
  • a user may also provide the parameters of the various actuators that control the movement of the robotic arm to reach its destination position.
  • an AR device 100 may be used to control the robotic arm 120.
  • the AR device may include one or more image sensors 106, one or more lenses 112 and one or more displays 104.
  • the image sensor(s) 106 may be configured to capture the physical environment as the user of the AR device 100 sees through the lens(es) 112.
  • image sensor 106 may include two or more sensors to achieve stereo vision.
  • display 104 may also include two displays to display 3D objects.
  • display 104 may include a 3D display.
  • AR device 100 may include a head-mounted headset 102, where each display is mounted to a respective lens 104 of the head-mounted headset 102.
  • AR device 100 may include other wearable devices.
  • the image sensor 106 may capture the robotic system 120 in the coordinate system of the AR device 110 and display the captured robotic system on the display 104 in 3D.
  • An example of an AR device may include the HoloLens from Microsoft.
  • an AR device may not be a wearable device.
  • an AR device may have a lens and a display configured to display what is being seen through the lens.
  • FIG. 1B illustrates a snapshot of a scene as seen on the display by a user wearing a head-mounted AR device 100.
  • the AR device 100 may be configured to display a physical robotic system 170.
  • the physical robotic system may be captured by the image sensor 106 (FIG. 1A) and rendered on the display of the AR device.
  • the lens of the AR device may allow lights to pass through so that the user can directly see the physical robotic system through the lens.
  • AR device 100 may include a control system 108.
  • the control system may be installed on other devices, e.g., robotic system 120 or on a cloud.
  • AR device 100 and the robotic system 120 may communicate over a communication link 140.
  • each of AR device 100 and robotic system 120 may be configured to communicate with other devices using any suitable communication protocols, e.g., Wi-Fi, Bluetooth, infrared, near-field communication (NFC) or any other wireless or wired communication protocols.
  • AR device 100 may be configured to communicate with robotic system 120 and align the digital coordinate of the AR device 100 to the digital coordinate of the robotic system 120.
  • AR device 100 may be configured to allow a user who wears the AR device 100 to define the desired movements of the robotic arms. Consequently, AR device 100 may generate digital position information in the AR device indicating the desired movement of the robotic arms, and transmit the digital position information to the robot, to cause the robotic system to operate according to the digital position information.
  • the control system 108 may be configured to perform a calibration process, for example, by using one or more markers 130(1, ..., n), 132 to align the digital coordinate of the robotic system to that of the AR device.
  • control system 108 may also be configured to receive desired movements of one or more robotic arms via user interactions.
  • the control system may cause the AR device to display a simulated intelligent device 170a on the display of the AR device.
  • the simulated intelligent device 170a is a simulated robotic system of the physical robotic system 170 that can be virtually rendered on the display of the AR device.
  • the system may render the virtual simulated robotic system 170a at the same position as the physical robotic system.
  • the system may also render one or more tracking points 160(1)...160(5), and trajectory 161 on the display of the AR device.
  • the system may also allow the user to define the desired movements of the robotic arms with the simulated robotic arm.
  • the AR device may receive user interactions that indicate the desired position of a particular robotic arm or the desired movements of one or more robotic arms.
  • the AR device may receive gestures from the user interactions.
  • the gestures may be the user's hand or finger gestures, or eye gazes, captured by a camera of the AR device.
  • the gestures may be the user's motions captured by one or more motion controllers.
  • the gestures may include special gesture data captured from a special input device, e.g., special digital gloves as input.
  • the AR device may be configured to display trajectories of the robotic arms on the display of the AR device, where the trajectories are generated based on the desired movements. This provides feedback to the user as to how the robotic arms will react according to the user defined movements to allow the user to make necessary adjustment to the desired movements.
  • FIG. 2 illustrates a diagram of a control system 200, which may be implemented as 108 in FIGs. 1A and 1B, and which includes various components in accordance with various examples described herein.
  • control system 200 may include a user interaction unit 206, a camera calibration unit 208, a device calibration unit 210, a coordinate alignment unit 212, a robotic simulation system 214, a device control system 216, a 3D spatial renderer 218, and/or a processing device 220.
  • camera calibration unit 208 may be configured to calibrate the image sensors 202, such as calibrating the camera distortion.
  • Existing calibration systems may be available.
  • the camera calibration system may cause the image sensor(s) to scan a normal chessboard to make sure that the captured images from the image sensors match the actual chessboard without significant bias. This step may be implemented using, for example, OpenCV: camera calibration.
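  • A minimal sketch of this chessboard calibration step, assuming OpenCV (cv2); the board size, square size, and helper name below are illustrative assumptions, not part of the disclosure.

```python
# Sketch of chessboard-based camera calibration with OpenCV; board size,
# square size, and the capture loop are illustrative assumptions.
import cv2
import numpy as np

def calibrate_camera(images, board_size=(9, 6), square_mm=25.0):
    # Reference 3D corner positions of the chessboard in its own plane (Z = 0).
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_mm

    obj_points, img_points = [], []
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, board_size)
        if found:
            corners = cv2.cornerSubPix(
                gray, corners, (11, 11), (-1, -1),
                (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001))
            obj_points.append(objp)
            img_points.append(corners)

    # Intrinsics and distortion coefficients used to undistort later frames
    # captured by the AR device's image sensor(s).
    _, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)
    return camera_matrix, dist_coeffs
```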
  • device calibration unit 210 may be configured to align the coordinate system of the camera space of the 2D image sensor to the 3D space coordinate system of the AR device.
  • device calibration unit 210 may be configured to calibrate the coordinate system 122 in FIG. 1A with the coordinate system 110 in FIG. 1A.
  • device calibration unit 210 may determine the correct relative position and rotation of the camera in the digital space of the device, and use this relative position and rotation to align the camera to the digital space of the intelligent device, such as the robotic arm.
  • the calibration unit may use optical markers placed in a neighborhood relative to the robotic arm.
  • the optical markers may be placed in the proximity of the robotic system 120.
  • one or more optical markers e.g., 130(1), 130(2), 130(3), 130(4) are placed on a platform 150 relative to the robotic system 120.
  • the optical markers may be of any suitable shape and color that would allow a camera to distinguish them from the environment background.
  • an optical marker may be a black square, a black circle, or a 2D barcode.
  • the relative positions among the optical markers are fixed and known. Additionally, the sizes of the optical markers are also fixed and known.
  • device calibration unit 210 may be configured to receive one or more images of the robotic system and the optical markers that are captured by the image sensor(s) 202, and use the position of the robot, e.g., the base (128 in FIG. 1A), relative to the optical markers to determine the relationship between the coordinate system of the robotic system (e.g., 122 in FIG. 1A) and the coordinate system of the AR device (e.g., 110 in FIG. 1A).
  • the base of the robotic system 128 may be placed in between the optical marker 130(2) and 130(3).
  • since the relative positions of 130(2) and 130(3) in the coordinate system 122 may be known (given), and the relative positions of 130(2) and 130(3) in the coordinate system 110 may also be determined from the captured image, the coordinate systems 122 and 110 may be aligned.
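  • A minimal sketch of how such an alignment could be computed once corresponding marker positions are known in both frames, using a standard SVD-based (Kabsch) rigid-transform fit with NumPy; the function and argument names are illustrative assumptions.

```python
# Kabsch/SVD rigid-transform fit between corresponding marker positions; the
# inputs are assumed to be Nx3 arrays of the same markers expressed in the
# robot frame (122) and the AR frame (110).
import numpy as np

def align_frames(markers_robot, markers_ar):
    """Estimate R, t such that p_ar ≈ R @ p_robot + t."""
    A = np.asarray(markers_robot, dtype=float)   # known positions in frame 122
    B = np.asarray(markers_ar, dtype=float)      # measured positions in frame 110
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t
```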
  • the optical marker may include a barcode that includes information about the size of the marker.
  • the optical marker may include a 2D barcode, such as a QR code, where the 2D barcode has a physical size (measured by the edges) and also includes data that contains information about the physical size of the barcode in the coordinate system (e.g., 122 in FIG. 1A).
  • a 2D barcode 132 is placed on the robotic arm 124(6).
  • device calibration unit 210 may be configured to receive an image of the 2D barcode (e.g., 132 in FIG. 1A) that is captured by the image sensor (e.g., 106 in FIG. 1A) and decode the data embedded in the barcode.
  • Device calibration unit 210 may further determine the physical size of the barcode that is embedded in the decoded data.
  • the physical size of the marker will be encoded in 2D format.
  • Some of the binary bits will be used to represent the unit (mm, cm, dm, m, 10m), and other bits will be used to describe the dimension of the marker.
  • a 5cm QR code or a 10cm QR code may be used.
  • a square 2D barcode with different sizes, or a non-square code, such as a rectangular or circular 2D marker with different dimensions or radii, may also be used.
  • various bit lengths may be used to embed size information in a barcode. For example, 8 bits may be used to indicate the physical size of the barcode. In some examples, 4 bits may be used to indicate the length and 4 bits may be used to indicate the width of the barcode. It is appreciated that other variations may be possible.
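  • One possible way to pack and unpack such a size payload, shown only as an illustration of the bit layout described above; the actual encoding carried by the markers is not specified here, so the 3-bit unit field and 4-bit length/width fields are assumptions.

```python
# Illustrative bit layout for the size payload: 3 bits choose the unit,
# 4 bits the length, 4 bits the width.  Not the marker's mandated format.
UNITS = ["mm", "cm", "dm", "m", "10m"]

def encode_marker_size(length, width, unit):
    u = UNITS.index(unit)
    assert 0 <= length < 16 and 0 <= width < 16
    return (u << 8) | (length << 4) | width      # 11-bit payload

def decode_marker_size(payload):
    unit = UNITS[(payload >> 8) & 0b111]
    length = (payload >> 4) & 0b1111
    width = payload & 0b1111
    return length, width, unit

# Example: a 5 cm x 5 cm QR code
print(decode_marker_size(encode_marker_size(5, 5, "cm")))   # (5, 5, 'cm')
```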
  • device calibration unit 210 may be configured to determine the size of the barcode in the coordinate system of the AR device (e.g., 110 in FIG. 1A) and compare that with the actual size of the barcode decoded from the captured image. The device calibration unit 210 may use the comparison to determine the relative distance of the camera to the optical marker, and use the relative distance between the camera and the optical marker to align the coordinate system of the intelligent device (e.g., 122 in FIG. 1A) and the coordinate system of the AR device (e.g., 110 in FIG. 1A).
  • coordinate alignment unit 212 may be configured to determine the relative position of the physical intelligent device, such as the robotic system (e.g., 120 in FIG. 1A), to the AR device (e.g., 100 in FIG. 1A). Coordinate alignment unit 212 may further be configured to generate a transformation between the 3D coordinates of the AR device (e.g., 110 in FIG. 1A) and the digital 3D coordinates of the intelligent device (e.g., 122 in FIG. 1A) by using visual alignment markers.
  • the relative position detected based on an optical marker may contain measurement errors, and multiple markers may be used to increase the calibration accuracy. In a non-limiting example, multiple markers may be laid out in a specific pattern to effectively reduce the error. For example, as shown in FIG. 1A, four markers 130(1), 130(2), 130(3), 130(4) may be placed on four corners of a square, with the base of the robotic system centered on the square.
  • the device calibration unit 210 may receive images captured from the 2D camera of the AR device, where the images contain the visual markers.
  • the calibration unit 210 may locate the position of each marker from the captured images.
  • Coordinate alignment unit 212 may be configured to perform a geometric mapping of the marker to locate the relative position of the marker based on the device calibration and map the marker to the 3D space coordinate system (e.g., 122 in FIG. 1A) of the robotic arm.
  • coordinate alignment unit 212 may be configured to additionally perform a dynamic robotic alignment procedure to achieve an alignment with higher accuracy.
  • an additional optical marker may be placed onto a robotic arm, such as an end-effector of a robotic arm.
  • Coordinate alignment unit 212 may send commands to control the robotic arm to move one or more joints to predefined positions.
  • a user may be prompted to move the wearable AR device to follow the position of the marker as the robotic arm is moving to the specified spatial position.
  • Coordinate alignment unit 212 may be configured to generate position data of the robotic arm based on the predefined position.
  • the robotic arm may be controlled to move to the left most reachable position, and then to the right most reachable position.
  • Coordinate alignment unit 212 may be configured to determine the middle point of the left most and right most positions of the optical marker that are captured by the image sensor(s) as the center of the robotic arm. This dynamic alignment process could be used multiple times to refine the result to a desired accuracy.
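  • A sketch of this dynamic refinement loop, assuming two placeholder callables, observe_marker() and command_arm(), that respectively read the marker position seen by the AR device and drive the arm to a predefined pose; both names are illustrative.

```python
# Iterative estimate of the arm's center in the AR frame, taking the midpoint
# of the marker positions observed at the leftmost and rightmost reachable
# poses.  observe_marker() and command_arm() are assumed placeholder callables.
import numpy as np

def refine_center(observe_marker, command_arm, iterations=3):
    center = None
    for _ in range(iterations):
        command_arm("leftmost")
        left = np.asarray(observe_marker(), dtype=float)
        command_arm("rightmost")
        right = np.asarray(observe_marker(), dtype=float)
        center = (left + right) / 2.0            # midpoint taken as the center
    return center
```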
  • robotic simulation system 214 may be configured to provide a simulation model of a robotic system (e.g., 120 in FIG. 1A).
  • the simulation of the robotic system may include a digitally simulated robotic system for displaying in the AR device (e.g., 100 in FIG. 1A).
  • the AR device may simulate the structure of the robotic arm, the joint behaviors and also the path generating algorithms.
  • the simulation system may be capable of parsing and processing the robotic programming instructions and simulating the result based on the instructions. In such case, the behaviors of the robotic arm could be predicted and verified with the simulated robotic system.
  • the simulated robotic system may include information of possible moves, positions, maneuvers of the target robotic arms in the digital device.
  • a digital simulated robotic arm for example, is created and displayed in the AR device.
  • This simulated robotic arm matches the physical dimensions of the actual robotic arm to be controlled, and also has similar actuator parameters.
  • robotic simulation system 214 may receive user interactions with gestures, digital gloves or other kinds of motion controllers that allow a user to manipulate the simulated robotic arm in the AR device to define a specified movement of the physical robotic arm, or to control the pose of the robotic arm directly.
  • the specified movement may include specified position of one or more robotic arms in the AR coordinates, e.g., 110 in FIG. 1A.
  • Robotic simulation system 214 may convert the specified position in the AR space to the position in the physical space of the robotic arm based on the alignment between the coordinates of the robotic arm (e.g., 122 in FIG. 1A) and the coordinates of the AR device (e.g., 110 in FIG. 1A).
  • the digital simulated robotic arm may be capable of demonstrating the desired movement of the physical robotic arm based on robotic programming instructions without actually powering on the robotic arm.
  • the robotic simulation system may be configured to receive a user-defined destination position and desired path, and based on the destination position and desired path, determine the required movements of one or more robotic arms to reach the destination position. In some examples, these determined movements may include actuations of the robotic arm joints and movements via inverse kinematic algorithms.
  • robotic simulation system may be configured to calculate a trajectory of the robotic arm and have the path of the movement rendered in the AR device based on the movements that are determined via inverse kinematic algorithms. As the simulated robotic arm facilitates the simulation of movement control, the behaviors of the physical robotic arm may be predicted and verified on the user's AR device without requiring the physical robotic arm to be powered on.
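  • As an illustration of the inverse-kinematic step, the sketch below solves the analytic IK of a planar 2-link arm and samples a Cartesian path into a joint-space trajectory; a real 6-DOF arm would use a full IK solver, so this is only a simplified stand-in with assumed names.

```python
# Planar 2-link analytic IK and a sampled joint-space trajectory; a real
# 6-DOF arm would use a full IK solver, so this is only a simplified stand-in.
import numpy as np

def two_link_ik(x, y, l1, l2):
    """Elbow-down IK for a planar 2-link arm with link lengths l1, l2."""
    cos_q2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    cos_q2 = np.clip(cos_q2, -1.0, 1.0)          # clamp numerical noise
    q2 = np.arccos(cos_q2)
    q1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(q2), l1 + l2 * np.cos(q2))
    return q1, q2

def joint_trajectory(start_xy, goal_xy, l1, l2, steps=50):
    """Interpolate the Cartesian path and solve IK at each sample."""
    pts = np.linspace(start_xy, goal_xy, steps)
    return [two_link_ik(x, y, l1, l2) for x, y in pts]
```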
  • 3D spatial renderer 218 may be configured to render the digital simulated robotic arm on the display of the AR device, such as 104 in FIG. 1A.
  • the rendering of the simulated robotic arm on the display will allow a user to observe and make adjustments on the robotic arms by defining the movement of the robotic arm.
  • the rendering may also allow the user to see a predicted trajectory of the robotic arm as the result of the defined movement. This provides user feedback to the controlling of the robotic arm and allow the user to make necessary changes to the movement of the robotic arm based on the behavior prediction.
  • Robotic simulation system 214 and 3D spatial renderer 218 facilitate the rendering of the various positions and the trajectory of the movements of the robotic arm as viewable to an operator in the AR device, allowing the operator to observe the trajectory in the 3D physical space around the robot, and also observe the 3D path from any angle. In such a way, the operator could determine, without powering on the robotic arm hardware, whether the robotic arm will reach the target or may hit obstacles unexpectedly, and change the movement of the simulated robot, as needed, before any accident could happen.
  • FIG. 3 further illustrates an example of a simulated robotic arm overlaid on the physical robotic arm according to some examples described herein.
  • both the simulated robotic arms 320 and the physical robotic arms 300 are displayed in the AR device.
  • the operator may actually see the robotic programming instructions being executed without turning on the physical robotic arm.
  • a user may interact with the simulated robotic arms to control the movements of the physical robotic arms.
  • the simulated robotic arms may be shown in holograms in the 3D space and overlaid on the physical robotic arms captured by the image sensor(s), such as 106 in FIG. 1A.
  • the user interaction unit may receive user interactions that allow a user to move a robotic arm of the simulated robotic arm to a desired position relative to a target object, e.g., 308, 310, which are captured by the image sensor(s) of the AR device.
  • the user interaction unit may capture user inputs, such as a gesture via the image sensor or controller inputs with spatial position information.
  • a user may move the user's finger to a starting position, e.g., point 328 on the robotic arm 324 of the simulated robotic arm which the user desires to move, and may make a gesture (e.g., a pinch) to indicate a start of a move from that position.
  • The user may subsequently move the finger to a destination position, e.g., the handle bar 330 of object 308, and pause at the destination position to indicate an end of the move.
  • the end of the move may also be indicated by a user gesture, e.g., a second pinch.
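  • A tiny sketch of one way to track this pinch-to-start / pinch-to-end interaction; the class and its event method are illustrative and assume pinch events and finger positions are already provided by the AR device's hand tracking.

```python
# Tiny state machine for the pinch-to-start / pinch-to-end move definition;
# pinch events and finger positions are assumed to come from the AR device's
# hand tracker.
class MoveGesture:
    def __init__(self):
        self.start = None

    def on_pinch(self, finger_pos):
        if self.start is None:
            self.start = finger_pos              # first pinch: begin the move
            return None
        move = (self.start, finger_pos)          # second pinch: end of the move
        self.start = None
        return move                              # (start_position, destination)
```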
  • the 3D spatial renderer (e.g., 218 in FIG. 2) may move the robotic arm 324 away from the original position towards the destination position 330.
  • the user defined movement of the robotic arm 324 may cause the simulated robotic arm to move to reach the destination position.
  • the user-defined movement may be already converted to robotic programming instructions that can be understood by the robotic simulation system (e.g., 214 in FIG. 2).
  • the user-defined movement may be converted by the user interaction unit (e.g., 206 in FIG. 2) to the robotic programming instructions to be received by the robotic simulation system.
  • the 3D spatial renderer (e.g., 218 in FIG. 2) may display a trajectory of the movement of the simulated robotic arm on the display of the AR device based on the robotic programming instructions.
  • user interaction unit (e.g., 206 in FIG. 2) may be configured to receive multiple tracking points (e.g., 160(1)...160(5) in FIG. 1B) from the user to facilitate a user-defined path of movement of the simulated robotic arm.
  • user may manipulate multiple spatial position tracking points on the hologram of the simulated robotic arms. These spatial position tracking points may be manipulated via user interaction unit (e.g., 206 in FIG. 2) to describe the 3D behaviors of the simulated robotic arms in 3D space.
  • these position tracking points are holograms that hold spatial position and rotation in 3D space, and each tracking point may include a position, a rotation, a velocity, a pause duration and/or a command to the end effector.
  • the end effector commands associated to these points may include opening or closing of the claws, checking the end effector sensors, etc.
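  • A minimal data-structure sketch for such a spatial position tracking point, written as a Python dataclass; the field names, types, and defaults are illustrative assumptions rather than the patent's required format.

```python
# Illustrative data structure for a spatial position tracking point.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TrackingPoint:
    position: Tuple[float, float, float]          # in the AR coordinate system
    rotation: Tuple[float, float, float, float]   # quaternion (x, y, z, w)
    velocity: float = 0.1                         # m/s along the path
    pause_s: float = 0.0                          # pause duration at the point
    end_effector_cmd: Optional[str] = None        # e.g. "open_claw", "close_claw"
```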
  • the spatial position tracking points may be manipulated and adjusted via user interaction unit of the AR device, such as controllers, menus, gestures or voice commands, etc.
  • the user sees through the lens of a head-mounted AR device, such as Microsoft's HoloLens, moves a finger to a desired spatial position tracking point, and selects the desired spatial position tracking point with a gesture, such as a pinch.
  • the user interaction unit may display a drop down menu on the display of the AR device to allow the user to select a movement (e.g., rotation or shift) and velocity.
  • the user may use a controller connected to the AR device to define and manipulate the spatial position tracking points.
  • the user interaction unit may also include a voice command unit configured to receive and recognize user voice commands.
  • the user interaction unit may be configured to receive the parameters for each spatial position tracking point and transmit these parameters to the robotic simulation system (e.g., 214 in FIG. 2).
  • the user interaction unit may allow the user to define the order of execution for the spatial position tracking points. For example, the user may point to each of the spatial position tracking points and select an order of execution. Alternatively, the user may select each of the spatial position tracking points in the order in which they are to be executed.
  • the user interaction unit may send the order of execution for multiple spatial position tracking points to the robotic simulation system (e.g., 214 in FIG. 2).
  • the robotic simulation system may also be configured to provide additional information that can be displayed on the display of the AR device.
  • the robotic simulation system may provide the electric current of each running actuator, the real-time torque of each motor, the motor wear of each joint, and/or the overall machine loads, etc., to be overlaid on the robotic arms. These data may be spatial, dynamic and real time. For example, with this information overlaid on top of the robotic arm on the display of the AR device, the operator could see the current loads and power consumption, and then optimize the robotic arm path to lower the power consumption to achieve a higher efficiency of robotic arm usage, which may result in at least an extension of the life span of the robotic arm.
  • the user may make some adjustments to the path or target positions (e.g., raise or lower the target position) so that the path no longer shows red, or shows green, which means that the power consumption of the new path is below a threshold.
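  • A short sketch of how rendered path segments might be colored against a power budget; power_estimate() stands in for the simulation system's current/torque model and is an assumed placeholder, as is the threshold parameter.

```python
# Tag each rendered path segment red or green against a power budget;
# power_estimate() stands in for the simulation system's current/torque model.
def color_segments(segments, power_estimate, threshold_w):
    return [
        (seg, "red" if power_estimate(seg) > threshold_w else "green")
        for seg in segments
    ]
```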
  • device control system 216 may be configured to serialize executable robotic programming instructions for controlling the robotic arm (e.g., 120 in FIG. 1A) based on the multiple spatial position tracking points and the order of execution of these points defined by the user.
  • Device control system 216 may transmit the robotic programming instructions to the robotic control system 222 of the robotic arm to control the physical robotic arm.
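  • A sketch of one possible serialization and transmission step, reusing the illustrative TrackingPoint fields sketched earlier; the JSON wire format, port, and TCP transport shown here are assumptions, since a real controller would define its own instruction set and link.

```python
# Serialize ordered tracking points into a simple JSON command list and send
# it over TCP; the wire format and field names reuse the illustrative
# TrackingPoint sketch above and are assumptions, not the controller's API.
import json
import socket

def serialize_plan(tracking_points, order):
    plan = [{
        "seq": i,
        "position": tp.position,
        "rotation": tp.rotation,
        "velocity": tp.velocity,
        "pause_s": tp.pause_s,
        "end_effector": tp.end_effector_cmd,
    } for i, tp in enumerate(tracking_points[j] for j in order)]
    return json.dumps({"commands": plan}).encode("utf-8")

def send_plan(payload, host, port=5000):
    # The physical link could be Wi-Fi, Bluetooth, or any protocol noted above.
    with socket.create_connection((host, port)) as sock:
        sock.sendall(payload)
```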
  • the physical robotic arm may perform the same movement as provided by the robotic simulation system.
  • the control system (e.g., 200 in FIG. 2) may be configured to generate a 3D environment to be overlaid on the display of the AR device.
  • the system may use one or more sensors (e.g., image or laser) to perform a 3D scanning to detect 3D features of the physical environment with the depth-sensing ability and generate a 3D mesh to represent the physical environment.
  • the system may use 3D scanning to determine the geometry, position, and size of some objects around the robot, or some unexpected obstacles that may be in the way of a robotic arm movement.
  • the robotic simulation system (e.g., 214 in FIG. 2) may be configured to detect whether a user-defined movement of a robotic arm will cause a collision of the robotic arm with an object in the physical environment based on the 3D geometry information of the physical environment. If a potential collision is detected, the robotic simulation system may use a collision avoidance algorithm to generate an alternative trajectory to get around the obstacles. Alternatively or additionally, the robotic simulation system may also display the obstacle mesh on the display of the AR device so that the user may make the adjustment on the planned path by moving the spatial position tracking points.
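  • A coarse sketch of such a collision test, approximating the scanned obstacle mesh by axis-aligned bounding boxes and checking sampled trajectory points against them; the box representation and safety margin are illustrative assumptions.

```python
# Coarse collision test: does any sampled trajectory point fall inside an
# obstacle's axis-aligned bounding box (plus a safety margin)?  The boxes are
# assumed to be derived from the scanned 3D mesh.
import numpy as np

def collides(trajectory_points, obstacle_boxes, margin=0.05):
    for p in np.asarray(trajectory_points, dtype=float):
        for lo, hi in obstacle_boxes:            # each box as (min_xyz, max_xyz)
            lo = np.asarray(lo, dtype=float) - margin
            hi = np.asarray(hi, dtype=float) + margin
            if np.all(p >= lo) and np.all(p <= hi):
                return True
    return False
```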
  • FIG. 4 further illustrates an example of a process for robotic planning.
  • the process may be implemented in the various systems described in FIGs. 1-3.
  • a process 400 may include receiving all spatial position tracking points at 402, generating joint positions for the current tracking point (which may be initialized as the first tracking point) at 404 and generating joint positions for the next tracking point at 406, and causing each joint to move for a time duration (e.g., a Δt) at 408.
  • the process may use an inverse kinematics (IK) algorithm to determine the joint positions associated with the movement to a tracking point.
  • the movement may be based on the device parameters selected by the user, such as the velocity and speed.
  • the process may include detecting collision at 410 to determine whether a movement may cause a potential collision with an obstacle in the physical environment at 412. If a collision is detected, the process may include adding a new tracking point to avoid the obstacle at 414 and continue at 402. If a collision is not detected, the process may proceed with adding the current tracking position to the trajectory at 416. The process may further include determining whether the last tracking point is reached at 418. If the last tracking point is reached, the process may end at 420. If the last tracking point is not reached, the process may update the current tracking point as the next tracking point and continue at 404.
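  • A sketch of the FIG. 4 loop in code form; ik_solve(), collides(), and avoid() are assumed helpers standing in for the IK solver, the collision test against the scanned environment, and the insertion of an avoidance tracking point.

```python
# Sketch of the FIG. 4 planning loop; ik_solve(), collides() and avoid() are
# assumed helpers for IK, collision testing, and obstacle-avoiding insertion.
def plan_trajectory(tracking_points, ik_solve, collides, avoid):
    points = list(tracking_points)
    trajectory = []
    if not points:
        return trajectory
    i = 0
    while i < len(points) - 1:
        current = ik_solve(points[i])            # joint positions, current point
        nxt = ik_solve(points[i + 1])            # joint positions, next point
        if collides(current, nxt):
            # Insert an avoidance tracking point and re-check (sketch assumes
            # avoid() eventually yields a collision-free intermediate point).
            points.insert(i + 1, avoid(points[i], points[i + 1]))
            continue
        trajectory.append(current)
        i += 1
    trajectory.append(ik_solve(points[-1]))      # last tracking point reached
    return trajectory
```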
  • FIG. 5 illustrates an example of a process for controlling a robot, which process may be implemented in various embodiments described in FIGs. 1-4.
  • a process 500 may include aligning the coordinate system of the AR device to the coordinate system of the robotic arm at 504. This process may be implemented by the coordinate alignment unit 212 in FIG. 2.
  • process 500 may optionally include refining the alignment of the coordinates at 506 by allowing the user to operate the robotic arm according to a predefined movement, and compare the position of the robotic arm to the predefined movement.
  • process 500 may include calibrating one or more cameras of the AR device at 502.
  • process 500 may further include displaying a simulated robotic arm in the AR device at 508.
  • This process may be implemented, for example, in user interaction unit 206 of FIG. 2.
  • the process may include obtaining the simulation model of the robotic arm at 510.
  • robotic simulation system 214 in FIG. 2 may be used to obtain such a simulation model of the robotic arm.
  • Process 500 may further include receiving user interaction in the AR device at 514.
  • user interaction unit 206 in FIG. 2 may implement such process to display the simulated robotic arm on the display of the AR device and also receive user commands.
  • process 500 may display device control parameters at 516, such as motor and torque control, to assist the user to define operations of the robotic arm.
  • process 500 may also include mapping the 3D physical environment to the AR device at 512 and displaying the physical environment (e.g., a mesh model) on the display of the AR device.
  • Process 500 may further include determining an operation plan of the robotic arm at 518 via user interaction.
  • user interaction unit 206 and robotic simulation system 214 in FIG. 2 may be configured to display a simulated robotic arm to the user, receive a user-defined operation plan (e.g., movement) of the robot, and display a trajectory of the simulated robotic arm according to the defined operation.
  • Process 500 may further include transmitting the operation plan to the robotic arm to cause the robotic arm to operate according to the user-defined operation.
  • the processes in FIGs. 4 and 5 may be implemented in the various systems described in FIGs. 1-3.
  • a process may function as a user interface component of the system that facilitates user interfaces to allow a user to control the intelligent device.
  • the control system may provide all of the 3D information of the robotic arm and the surrounding mesh so an operator could move the tracking points to plan the robotic arm movement in a remote location without requiring the operator to be on site where the actual robotic arm is located.
  • the operator may work on site, and use a remote assistant to help remotely.
  • the remote assistant may see what the onsite operator sees and may also control the tracking points and make changes to the plan to provide assistance.
  • the robotic programming instructions generated by the control system (e.g., device control system 216 in FIG. 2) may be transmitted wirelessly to the actual robotic arm for controlling the actual robotic arm.
  • the system may further be configured to limit the movement of the robotic arm.
  • the robotic arm may preferably be movable within a safe zone.
  • the system may be configured to use visual indicators to generate a safety bounding box on the display of the AR device.
  • the safety bounding box may be a spatial area within which the robotic arm may safely operate.
  • the system may create holographic areas to display the safe bounding box in the AR device.
  • the system may be configured to also keep track of the operator's position through the AR device and create the safety bounding box based on the real time position of the operator. This will create a better robot-human collaboration environment and keep the operators safe.
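  • A small sketch of how a planned arm position could be checked against a safety bounding box and the operator's tracked position; the box corners and keep-out radius are illustrative parameters, not values specified by the disclosure.

```python
# Check a planned arm position against a static safety box and a keep-out
# radius around the operator's tracked position; geometry is illustrative.
import numpy as np

def inside_safe_zone(point, zone_min, zone_max, operator_pos, keep_out=1.0):
    p = np.asarray(point, dtype=float)
    in_box = np.all(p >= np.asarray(zone_min)) and np.all(p <= np.asarray(zone_max))
    clear = np.linalg.norm(p - np.asarray(operator_pos)) >= keep_out
    return bool(in_box and clear)
```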
  • The methods and systems described in FIGs. 1-5 provide advantages in controlling an intelligent device, such as a robotic arm.
  • the described methods and systems may allow a user to control any suitable intelligent devices in an augmented reality (AR) environment with hand gestures, eye gaze and other natural interactions.
  • the present disclosure may facilitate locating the exact position of a robotic arm and accurately align the coordinate of the AR device to that of the robotic arm. This makes it feasible to plan the path of the physical robotic arm in the AR device without requiring the user to power on the robotic arm.
  • Robotic programming instructions for the planned operation may be transmitted to the robotic arm for controlling the robotic arm. High precision of the robotic arm may be achieved since all of the positions are generated from digital simulation. Further, the robotic planning may be done with the aid of an AR device without requiring the user to have programming skills to program the robotic arm.
  • advantages of the described system also include the capability to allow the user to pause in time, simulate a planned operation (by displaying a trajectory in the AR device) and tune a particular operation according to the trajectory of the robotic arm.
  • the capability of pausing a movement (during the movement) would not be possible when working with an actual robotic arm, because the part of a robotic arm that is moving cannot be stopped in time due to the momentum from the movement or mechanical limitations of the robotic arm.
  • the disclosed system and method are advantageous in cases in which the action is very time sensitive, or multiple robots are involved and need to collaborate based on timing.
  • Other advantages include the overlay of 3D mesh which makes it easy for the operators to plan for the complicated environment and avoid collision with obstacles.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Manipulator (AREA)

Abstract

A method for controlling an intelligent device in a first environment may include displaying a simulation model of the intelligent device on a display of an augmented reality (AR) device in a second environment. The method may receive one or more user interactions over the simulation model of the intelligent device, the interactions indicating a user-desired operation of the intelligent device. The method may generate programming instructions based on the one or more user interactions, transmit the programming instructions to the intelligent device, and cause the intelligent device to perform the user-desired operation in the first environment. In some examples, the method may be implemented in a system for controlling a robotic arm. The system may receive one or more user interactions to allow a user to define a desired operation of the robotic arm via natural user interactions.
PCT/US2018/048811 2017-08-30 2018-08-30 Using augmented reality to control intelligent devices WO2019046559A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762551917P 2017-08-30 2017-08-30
US62/551,917 2017-08-30

Publications (1)

Publication Number Publication Date
WO2019046559A1 true WO2019046559A1 (fr) 2019-03-07

Family

ID=65527849

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/048811 WO2019046559A1 (fr) 2017-08-30 2019-03-07 Using augmented reality to control intelligent devices

Country Status (1)

Country Link
WO (1) WO2019046559A1 (fr)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112418385A (zh) * 2020-12-10 2021-02-26 郑子龙 Bionic intelligent agent control method, device and system
JP2021062463A (ja) * 2019-10-17 2021-04-22 ファナック株式会社 Robot system
CN113478492A (zh) * 2021-09-07 2021-10-08 成都博恩思医学机器人有限公司 Method, system, robot and storage medium for avoiding collision of a robotic arm
US11534912B2 (en) 2019-04-26 2022-12-27 Fanuc Corporation Vibration display device, operation program creating device, and system
US11590663B2 (en) 2020-01-14 2023-02-28 International Business Machines Corporation Virtual reality enabled activity allocation
US11724388B2 (en) * 2018-10-02 2023-08-15 Fanuc Corporation Robot controller and display device using augmented reality and mixed reality
WO2024129802A1 (fr) * 2022-12-13 2024-06-20 Acumino Training robot testing and control system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100045701A1 (en) * 2008-08-22 2010-02-25 Cybernet Systems Corporation Automatic mapping of augmented reality fiducials
US7714895B2 (en) * 2002-12-30 2010-05-11 Abb Research Ltd. Interactive and shared augmented reality system and method having local and remote access
US20150005785A1 (en) * 2011-12-30 2015-01-01 St. Jude Medical, Atrial Fibrillation Division, Inc. System and method for detection and avoidance of collisions of robotically-controlled medical devices
US20150224650A1 (en) * 2014-02-12 2015-08-13 General Electric Company Vision-guided electromagnetic robotic system
US20160257000A1 (en) * 2015-03-04 2016-09-08 The Johns Hopkins University Robot control, training and collaboration in an immersive virtual reality environment

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7714895B2 (en) * 2002-12-30 2010-05-11 Abb Research Ltd. Interactive and shared augmented reality system and method having local and remote access
US20100045701A1 (en) * 2008-08-22 2010-02-25 Cybernet Systems Corporation Automatic mapping of augmented reality fiducials
US20150005785A1 (en) * 2011-12-30 2015-01-01 St. Jude Medical, Atrial Fibrillation Division, Inc. System and method for detection and avoidance of collisions of robotically-controlled medical devices
US20150224650A1 (en) * 2014-02-12 2015-08-13 General Electric Company Vision-guided electromagnetic robotic system
US20160257000A1 (en) * 2015-03-04 2016-09-08 The Johns Hopkins University Robot control, training and collaboration in an immersive virtual reality environment

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11724388B2 (en) * 2018-10-02 2023-08-15 Fanuc Corporation Robot controller and display device using augmented reality and mixed reality
US11534912B2 (en) 2019-04-26 2022-12-27 Fanuc Corporation Vibration display device, operation program creating device, and system
JP2021062463A (ja) * 2019-10-17 2021-04-22 ファナック株式会社 Robot system
JP7359633B2 (ja) 2019-10-17 2023-10-11 ファナック株式会社 Robot system
US11590663B2 (en) 2020-01-14 2023-02-28 International Business Machines Corporation Virtual reality enabled activity allocation
CN112418385A (zh) * 2020-12-10 2021-02-26 郑子龙 Bionic intelligent agent control method, device and system
CN113478492A (zh) * 2021-09-07 2021-10-08 成都博恩思医学机器人有限公司 Method, system, robot and storage medium for avoiding collision of a robotic arm
CN113478492B (zh) * 2021-09-07 2022-05-17 成都博恩思医学机器人有限公司 Method, system, robot and storage medium for avoiding collision of a robotic arm
WO2024129802A1 (fr) * 2022-12-13 2024-06-20 Acumino Training robot testing and control system

Similar Documents

Publication Publication Date Title
WO2019046559A1 (fr) Using augmented reality to control intelligent devices
US11220002B2 (en) Robot simulation device
US9919421B2 (en) Method and apparatus for robot path teaching
KR101978740B1 (ko) Remote control system and control method thereof
US10166673B2 (en) Portable apparatus for controlling robot and method thereof
US11039895B2 (en) Industrial remote control robot system
US9849595B2 (en) Contact force limiting with haptic feedback for a tele-operated robot
EP3055744B1 (fr) Method and device for checking one or more safety volumes for a movable mechanical unit
JP4680516B2 (ja) Method for fading in robot information into an image of the real environment, and device for visualizing robot information in an image of the real environment
JP5246672B2 (ja) Robot system
US11762369B2 (en) Robotic control via a virtual world simulation
US20190389066A1 (en) Visualization and modification of operational bounding zones using augmented reality
KR101876845B1 (ko) Robot control device
JP7396872B2 (ja) Simulation device and robot system using augmented reality
CN105058396A (zh) Robot teaching system and control method thereof
US20150298318A1 (en) Teleoperation Of Machines Having At Least One Actuated Mechanism
CN113211494A (zh) Method for checking a safety area of a robot
US9962835B2 (en) Device for dynamic switching of robot control points
CN107257946B (zh) System for virtual commissioning
Makita et al. Offline direct teaching for a robotic manipulator in the computational space
WO2016172718A1 (fr) Système et procédé de télécommande à distance à l'aide d'une scène 3d reconstruite
Fareed et al. Gesture based wireless single-armed robot in cartesian 3D space using Kinect
KR102403568B1 (ko) Robot teaching device and robot teaching method using the same
WO2022255206A1 (fr) Information processing apparatus, information processing method, and computer program
CN114728413A (zh) Method and system for controlling a graphical user interface of a remote robot

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18850041

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18850041

Country of ref document: EP

Kind code of ref document: A1