WO2019186551A1 - Augmented reality for industrial robotics - Google Patents

Augmented reality for industrial robotics

Info

Publication number
WO2019186551A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
display
real
virtual
user
Prior art date
Application number
PCT/IL2019/050348
Other languages
French (fr)
Inventor
Eran Katzir
Omri SOUDRY
Mirko BORICH
Original Assignee
Servotronix Automation Solutions Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Servotronix Automation Solutions Ltd. filed Critical Servotronix Automation Solutions Ltd.
Priority to CN201980030164.XA priority Critical patent/CN112105486B/en
Publication of WO2019186551A1 publication Critical patent/WO2019186551A1/en
Priority to IL277596A priority patent/IL277596A/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1671Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/35Nc in input of data, input till input file format
    • G05B2219/35506Camera images overlayed with graphics, model
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39014Match virtual world with real world
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39449Pendant, pda displaying camera images overlayed with graphics, augmented reality
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39451Augmented reality for robot programming
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40121Trajectory planning in virtual space
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40126Virtual landmarks, reference points for operator
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40129Virtual graphic 3-D pointer, manipulator commands real manipulator
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40131Virtual reality control, programming of manipulator

Definitions

  • the present invention, in some embodiments thereof, relates to using augmented reality for industrial robotics and, more particularly, but not exclusively, to using augmented reality or virtual reality for planning and/or testing a robot’s movements.
  • the present invention, in some embodiments thereof, relates to using augmented reality for industrial robotics and, more particularly, but not exclusively, to using augmented reality or virtual reality for planning a robot’s movements.
  • a system for displaying a virtual robot in a real-world environment including a display, a computer model of a robot, a computing unit for calculating a shape of the robot at a specific position according to the computer model, a camera for capturing an image of a real-world environment, and a man machine interface (MMI) for a user to provide commands to the computing unit to calculate the shape of the robot, wherein the display is arranged to display the shape of the robot as a virtual robot.
  • the model of the robot is a real-world robot.
  • the model of the robot is a computer model of a robot.
  • the computing unit is configured for controlling displayed movement of the model of the robot as movement of the virtual robot.
  • the display is a Virtual Reality (VR) display and further including the computing unit being arranged to control the VR display to display a virtual environment for the virtual robot.
  • the virtual environment is based on the image of the real-world environment.
  • the computing unit is configured for controlling displayed movement of all the motions possible for the model of the robot.
  • the computing unit is configured for controlling displayed movement of all the motions possible for the model of the robot to their full range of movement.
  • the display is configured to display the entire range of movement as a highlighted volume in space.
  • the display is an Augmented Reality (AR) display.
  • the sensors are arranged to detect the user’s gesture in the real-world environment in which the AR display apparently displays the virtual robot.
  • the sensors are arranged to detect a real object in a real-world space in which the display displays the robot.
  • the display is configured to refrain from displaying the robot in a same space as the real object.
  • the display is configured to display the entire range of movement of the virtual robot, except for when the virtual robot appears to occupy a same space as a real object, as a highlighted volume in space.
  • the display is configured to display only a portion of the range of movement of the virtual robot when the virtual robot appears to occupy a same space as a real object as a highlighted volume in space.
  • the computing unit is included within the display.
  • the AR display includes a head mounted display.
  • the display includes a tablet, and the screen of the tablet is arranged to display the virtual robot.
  • the screen of the tablet is further arranged to display a virtual environment of the virtual robot.
  • the screen of the tablet is further arranged to display the real-world environment as captured by the camera.
  • the screen of the tablet includes a touch screen and the system is arranged to use the touch screen to input user gestures for controlling movement of the virtual robot.
  • the system is arranged to use the touch screen to input which type of robot is to be displayed.
  • the system is arranged to use the touch screen to input which robot appendage is to be moved.
  • the system is arranged to use the touch screen to select which axis of a motion frame of a robot is to be controlled.
  • the system is arranged to use the touch screen to input by how much the selected axis of a selected robot is to be jogged.
  • At least one device sensor selected from a group consisting of a gyroscope, an accelerometer, and a GPS unit.
  • the system is used for delineating a space in a real-world environment spanned by movement of the robot.
  • a method for displaying a virtual robot in a real-world environment including providing a model of a robot, providing an image of a real-world environment, calculating a shape of the model of the robot at a specific position in the real-world environment according to the computer model, and using a display to display the shape of the robot as a virtual robot and to display an industrial environment based on the image of a real-world environment, wherein the virtual robot is displayed at the specific position in the real-world environment.
  • the display is an Augmented Reality (AR) display.
  • the sensing is performed by a camera capturing images in a real-world space in which the AR display displays the robot.
  • input sensors are arranged to detect the user’s gesture in a real-world space in which the AR display displays the robot.
  • the model of the robot is a real-world robot.
  • the model of the robot is a computer model of a robot.
  • calculating a shape of the model of the robot includes controlling displayed movement of the virtual robot.
  • the display is a Virtual Reality (VR) display and calculating a shape of the model of the robot includes controlling the VR display to display a virtual environment for the virtual robot.
  • the virtual environment is based on the image of the real-world environment.
  • the display is used for delineating a space in the real-world environment spanned by movement of the virtual robot.
  • the display is used to display the entire range of movement as a highlighted volume in space.
  • the sensing a user’s gesture includes sensing a real object in a real-world space in which the display displays the virtual robot.
  • a front camera of a tablet to detect a user’s eyes, calculating where the user is looking, tracking a shift in the user’s direction of looking and controlling displaying movement of a selected appendage of the virtual robot based on the shift in the user’s direction of looking.
  • some embodiments of the present invention may be embodied as a system, method or computer program product. Accordingly, some embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, some embodiments of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon. Implementation of the method and/or system of some embodiments of the invention can involve performing and/or completing selected tasks manually, automatically, or a combination thereof.
  • selected tasks could be implemented by hardware, by software or by firmware and/or by a combination thereof, e.g., using an operating system.
  • hardware for performing selected tasks according to some embodiments of the invention could be implemented as a chip or a circuit.
  • selected tasks according to some embodiments of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system.
  • one or more tasks according to some exemplary embodiments of method and/or system as described herein are performed by a data processor, such as a computing platform for executing a plurality of instructions.
  • the data processor includes a volatile memory for storing instructions and/or data and/or a non-volatile storage, for example, a magnetic hard-disk and/or removable media, for storing instructions and/or data.
  • a network connection is provided as well.
  • a display and/or a user input device such as a keyboard or mouse are optionally provided as well.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium and/or data used thereby may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for some embodiments of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • Some of the methods described herein are generally designed only for use by a computer, and may not be feasible or practical for performing purely manually, by a human expert.
  • a human expert who wanted to manually perform similar tasks such as delineating a volume spanned by a robot’s movements, might be expected to use completely different methods, e.g., making use of expert knowledge and/or the pattern recognition capabilities of the human brain, which would be vastly more efficient than manually going through the steps of the methods described herein.
  • FIG. 1 is a simplified block diagram illustration of a system for displaying a virtual robot in a real-world environment according to an example embodiment of the invention
  • FIG. 2 is a simplified flow chart illustration of a method for displaying a virtual robot in a real-world environment according to an example embodiment of the invention
  • FIG. 3 is a simplified image showing a system for displaying a virtual robot in a real-world environment according to an example embodiment of the invention
  • FIG. 4 is a simplified image showing a screen of a system for displaying a virtual robot in an environment according to an example embodiment of the invention.
  • FIG. 5 is a simplified illustration of a virtual robot according to an example embodiment of the invention.
  • the present invention, in some embodiments thereof, relates to using augmented reality for industrial robotics and, more particularly, but not exclusively, to using augmented reality or virtual reality for planning and/or testing a robot’s movements.
  • Augmented Reality is a way of displaying the physical, real-world environment with computer-generated 2D/3D objects displayed as part of the real-world environment.
  • the computer-generated 2D/3D virtual objects are optionally displayed to appear as interacting with the real world.
  • An AR display system typically uses sensors to sense locations of objects in the real-world environment, and image-processing technologies optionally in real-time.
  • An AR display system optionally interacts with a user, whether by the user providing input by computer controls, such as buttons, a keyboard or a touch screen, or by detecting user’s gestures in a volume of space where virtual objects are displayed.
  • the VR display system uses sensors to sense locations of real objects in a real-world environment, and optionally displays the real objects in the virtual world environment.
  • AR as used in the present specification and claims is intended to include VR, as will be understood by persons skilled in the art.
  • AR is already being used in a wide variety of industries, including gaming, medical industries and education. With more and more devices capable of displaying AR content coming out each year, the AR platform is becoming affordable for everyone - and it keeps getting better every year.
  • the IRI today uses traditional 3D software (e.g., RoboWorks, RoboDK) to visualize robotic applications without using an actual robot. While this may be a good solution for testing just the motion of a robot, this method doesn’t take into account factors such as the user’s and the environment’s interactions with the robot.
  • Teach Pendants - hand-held devices which allow users to move a robot and then save robot parameters associated with location of the robot for later use.
  • Such a method can work with AR and/or VR - a user uses an AR (or VR) display device and operates a real or a virtual robot (optionally displayed by an AR or VR display) controlled by a teach pendant.
  • a Teach Pendant is used to control movement of a virtual robot, displayed by an AR or a VR device.
  • a Teach Pendant is used to control movement of a real robot, where the real robot optionally serves to anchor or locate a virtual robot displayed by an AR or a VR device, and the movement is optionally displayed such that a portion of the displayed virtual robot coincides with the real robot, and a portion of the displayed virtual robot - for example the moving portion - is displayed as a virtual robot movement, optionally without the real robot moving.
  • a Teach Pendant such as described in the above-mentioned co-filed, co-pending and co-assigned U.S. Patent Application entitled “TEACH PENDANT” is optionally used to control movement of a real or virtual robot, and to display an AR or VR display on a screen of the TP.
  • the TP includes or is attached to a tablet, and the tablet screen optionally serves for displaying the AR or VR display, and/or for implementing a user interface such as a touch screen.
  • Teach pendants are devices, often hand-held, that can control movement of real or virtual (simulated) robots to a specific location, and optionally save the location and/or control commands for later use.
  • a user views the movement on a display, sometimes using a computer program for displaying robotic movement.
  • such display is optionally extended, so that users are able to see virtual robots and their interactions with the surrounding environment, optionally while using a teach pendant to control (and optionally save commands for) moving the virtual robots.
  • AR is optionally used instead of or in addition to a traditional teach pendant (which is hand-held), for example, by using AR Glasses that can free the user’s hands.
  • Jogging is optionally done by looking at certain points on an augmented world, for example looking at jog buttons that appear next to a real or virtual robot, thus potentially eliminating at least some disadvantages of hand-held devices.
  • the real-world location of virtual objects is optionally set by using physical predefined markers placed in the real world.
  • An example for such a marker could be a QR code on the ground.
  • real-time image processing can be used to detect surfaces or edges of objects in the real world.
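  • By way of a non-limiting illustration (not taken from the patent text), the sketch below shows one way such a marker could anchor a virtual robot: the marker corner pixels are assumed to come from a separate QR/fiducial detector, and OpenCV’s solvePnP recovers the camera-from-marker pose at which the virtual robot base would be drawn. The marker size, function names and calibration inputs are assumptions made for the example.

```python
# Minimal sketch (not from the patent): anchoring a virtual robot base to a
# physical marker.  Corner pixels are assumed to come from a separate
# QR/fiducial detector; the pose is recovered with OpenCV's solvePnP.
import cv2
import numpy as np

MARKER_SIZE_M = 0.10  # assumed 10 cm square marker printed on the floor

# 3D corners of the marker in its own coordinate frame (Z = 0 plane).
MARKER_CORNERS_3D = np.array([
    [0.0, 0.0, 0.0],
    [MARKER_SIZE_M, 0.0, 0.0],
    [MARKER_SIZE_M, MARKER_SIZE_M, 0.0],
    [0.0, MARKER_SIZE_M, 0.0],
], dtype=np.float32)

def marker_pose(corner_pixels, camera_matrix, dist_coeffs):
    """Return a 4x4 camera-from-marker transform from 4 detected corner pixels."""
    ok, rvec, tvec = cv2.solvePnP(
        MARKER_CORNERS_3D,
        np.asarray(corner_pixels, dtype=np.float32),
        camera_matrix,
        dist_coeffs,
    )
    if not ok:
        raise RuntimeError("marker pose estimation failed")
    rotation, _ = cv2.Rodrigues(rvec)
    transform = np.eye(4)
    transform[:3, :3] = rotation
    transform[:3, 3] = tvec.ravel()
    return transform

# Usage idea: the AR renderer would draw the virtual robot's base at
# marker_pose(detected_corners, K, dist), so moving the marker moves the robot.
```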
  • Augmented Reality and/or Virtual Reality devices used include mobile platforms, wearable tech, tablets, mobile phones, AR glasses, and VR glasses.
  • virtual objects such as a virtual robot are optionally displayed by lenses, such as in a Microsoft HoloLens or in a Google Glass, and not necessarily on a display screen or a touch screen.
  • AR glasses or displays are used which include a hand gesture interface, or a gaze interface.
  • a gesture is detected in space and serves as input to the AR or VR system.
  • a user’s gaze is detected as viewing in a specific direction, and the direction is used for interacting with an AR or VR system, for example detecting a gaze direction to a virtual object which the user is gazing at and optionally selecting the virtual object and/or moving the virtual object in a direction of the gaze.
  • the term “virtual robot” in all its grammatical forms is used throughout the present specification and claims to mean a robot as displayed by a display.
  • an AR display or a VR display is optionally integrated into Teach Pendant software - displaying a virtual robot in a Teach Pendant screen, eliminating a need for using external monitors to visualize virtual robots.
  • users can safely get close to location of a moving virtual robot displayed by an AR display or a VR display without the risk of getting harmed by a real robot.
  • users can step away from a computer/monitor and perform operations apparently directly on a virtual robot.
  • a sensing system optionally detects location of a user or a user’s hand(s), and when a user’s hand is at a location apparently touching the virtual robot, the sensing system senses movements of the user’s hands and translates the movements to act as if a user is Leading By Nose the virtual robot displayed by the AR display.
  • Real robots can be expensive to purchase and/or operate and/or fix when damaged.
  • Using a virtual robot in an AR setting can save money, allowing users to test robotic motion programs in the real world, to educate employees, and to work with a virtual robot when such a real robot may not be available.
  • a VR display is optionally integrated into Teach Pendant software - displaying a virtual robot and a virtual environment in a Teach Pendant screen, displaying for example a robot in an industrial setting.
  • an AR display system is used to display a virtual robot in real space.
  • an AR display system is optionally used to display a virtual volume in real space which is a volume of a full range of movement of the virtual robot.
  • a VR display system is used to display a virtual robot in a virtual space.
  • the VR display optionally displays a virtual robot and a dynamic virtual environment, displaying for example a robot in an assembly line with dynamically moving objects in the assembly line.
  • Denavit and Hartenberg (DH) parameters are used to represent industrial robots.
  • DH parameters are assigned for a robot, a workspace of the robot is optionally calculated, and the workspace is optionally displayed by an AR display.
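  • As a minimal illustration of the preceding bullets (the robot dimensions, joint limits and sampling scheme here are invented for the example, not taken from the patent), the sketch below chains standard DH transforms into forward kinematics and samples joint space to approximate a tip-position cloud that an AR display could render as a highlighted workspace volume.

```python
# Illustrative sketch (invented robot parameters): forward kinematics from
# Denavit-Hartenberg rows and a Monte-Carlo approximation of the tool-tip
# workspace that an AR display could show as a highlighted volume.
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform of one joint, standard DH convention."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def tip_position(joint_angles, dh_rows):
    """Chain per-joint transforms; dh_rows holds (d, a, alpha) for each joint."""
    pose = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_rows):
        pose = pose @ dh_transform(theta, d, a, alpha)
    return pose[:3, 3]

# Hypothetical 3-joint arm (lengths in metres) with joint limits in radians.
DH_ROWS = [(0.30, 0.0, np.pi / 2), (0.0, 0.25, 0.0), (0.0, 0.20, 0.0)]
LIMITS = [(-np.pi, np.pi), (-np.pi / 2, np.pi / 2), (-np.pi / 2, np.pi / 2)]

def sample_workspace(n_samples=20000, seed=0):
    """Return an (n, 3) cloud of reachable tip positions for display."""
    rng = np.random.default_rng(seed)
    lows = [lo for lo, _ in LIMITS]
    highs = [hi for _, hi in LIMITS]
    samples = rng.uniform(lows, highs, size=(n_samples, len(LIMITS)))
    return np.array([tip_position(q, DH_ROWS) for q in samples])
```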
  • an AR display system is optionally used to display a virtual robot in space, where the virtual robot appears in a position determined by user commands.
  • a VR display system is optionally used to display a virtual robot in virtual space, where the virtual robot appears in a position determined by user commands.
  • the commands are provided by a Teach Pendant.
  • the commands are provided by a Teach Pendant as described in the above-mentioned U.S. Provisional Patent Applications entitled “TEACH PENDANT AS AN ADD-ON”, the contents of which are incorporated herein by reference.
  • the virtual robot appears in a position determined by user commands, but only a specific time after the commands are given.
  • the display system optionally apparently delays virtual execution of the commands by the specific time, by displaying a corresponding movement later than commanded.
  • a delay optionally serves for training.
  • the delay optionally enables showing the virtual robot performing an incorrect move, such as a collision with an object, and optionally enables a user to react to correct the incorrect move.
  • a virtual robot is displayed to appear in positions determined by a program which includes several robot movement commands, and the program also causes a real robot to move in real space.
  • an AR display optionally displays a virtual robot performing the program commands a specific period of time before the real robot is caused to perform the program commands.
  • the display shows the virtual robot apparently performing movements before the real robot performs the same movements.
  • a preview of where the real robot will be after the specific period of time optionally serves for training.
  • the preview optionally enables showing the virtual robot performing an incorrect move, such as a collision with an object, and optionally enables a user to react to stop, for example by using an emergency stop (E-stop) button before the real robot performs the incorrect move.
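  • A minimal sketch of such a preview-and-delay scheme is shown below; it assumes virtual and real robot objects that expose an execute() method, and the class name and delay value are illustrative rather than the patent’s implementation.

```python
# Minimal sketch (not the patent's implementation): each command is shown on
# the virtual (AR) robot immediately and forwarded to the real robot only
# after a preview delay, giving the user time to press an E-stop.
import time
from collections import deque

class PreviewQueue:
    def __init__(self, virtual_robot, real_robot, preview_delay_s=2.0):
        self.virtual = virtual_robot      # assumed to expose .execute(command)
        self.real = real_robot            # assumed to expose .execute(command)
        self.delay = preview_delay_s
        self.pending = deque()            # (due_time, command) pairs
        self.estopped = False

    def issue(self, command):
        """Preview the command on the virtual robot now; schedule the real one."""
        self.virtual.execute(command)
        self.pending.append((time.monotonic() + self.delay, command))

    def emergency_stop(self):
        """Drop every command the real robot has not yet executed."""
        self.estopped = True
        self.pending.clear()

    def tick(self):
        """Call periodically; forwards commands whose preview delay has elapsed."""
        now = time.monotonic()
        while self.pending and self.pending[0][0] <= now and not self.estopped:
            _, command = self.pending.popleft()
            self.real.execute(command)
```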
  • a head mounted display, a computer display, a tablet display, a touch screen, a smart phone screen.
  • the display is optionally a tablet screen such as described in the above-mentioned U.S. Provisional Patent Applications entitled “TEACH PENDANT AS AN ADD-ON”, the contents of which are incorporated herein by reference.
  • the tablet camera serves to capture an image or images of the real world and the tablet display shows a virtual robot augmenting the real world image.
  • the tablet display dynamically displays what the tablet camera dynamically captures, with the virtual robot augmented onto a correct location in the display of the real world.
  • the tablet display displays what the tablet camera captures, and the image or images of the real world serve as background for the display of the virtual robot, even if or when the tablet is pointed elsewhere.
  • the tablet automatically detects when the tablet camera ceases to point at a location designated for the virtual robot, for example when losing sight of an anchoring marking (described in more detail below) or losing sight of a real robot.
  • the tablet provides a warning when the tablet camera loses sight of an intended location for the virtual robot.
  • the tablet displays a previous image or images of the real world as background for the display of the virtual robot.
  • an AR display system used to display a virtual robot includes information about real objects in the environment.
  • an AR display system used to display a virtual robot senses locations of real objects in the environment.
  • an AR display system used to display a virtual robot optionally does not display the virtual robot moving to positions which would cause a real robot to move into space occupied by the real objects in the environment.
  • Such an AR display system implements collision avoidance in the AR display of the virtual robot.
  • an AR display system used to display a virtual robot optionally displays a virtual volume in real space only where a real robot exercising a full range of movement would occupy space occupied by the real objects.
  • Such an AR display system displays spaces where collision of the virtual robot with real objects is detected.
  • Such a display can serve for warning and optionally fixing problems with a planned location or movement of a planned real robot.
  • an AR display system used to display a virtual robot includes information about real objects in the environment, and displays a virtual volume in real space only where a real robot exercising a range of movement would not crash into the real objects.
  • Such an AR display system displays spaces where collision of the virtual robot with real objects does not happen.
  • Such a display can serve for visualizing a safe volume for a planned real robot and/or planning a safe volume for movement for the real robot.
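  • One possible way to compute the colliding and collision-free portions of a displayed workspace is sketched below, under the simplifying assumption that sensed real objects are approximated by axis-aligned bounding boxes; the obstacle box and the sampled cloud are invented for the example.

```python
# Illustrative sketch: split a sampled workspace cloud into collision-free
# and colliding points against axis-aligned boxes around sensed real objects,
# so an AR overlay can highlight the safe volume (or only the collisions).
import numpy as np

def inside_box(points, box_min, box_max):
    """Boolean mask of points that fall inside one axis-aligned box."""
    return np.all((points >= box_min) & (points <= box_max), axis=1)

def split_workspace(points, obstacle_boxes):
    """Return (safe_points, colliding_points) for display."""
    colliding = np.zeros(len(points), dtype=bool)
    for box_min, box_max in obstacle_boxes:
        colliding |= inside_box(points, np.asarray(box_min), np.asarray(box_max))
    return points[~colliding], points[colliding]

# Example with an invented obstacle (a crate near the robot base):
workspace = np.random.default_rng(1).uniform(-0.5, 0.5, size=(5000, 3))
safe, hit = split_workspace(workspace, [((0.1, 0.1, 0.0), (0.4, 0.4, 0.3))])
# `safe` would be rendered as the allowed highlighted volume; `hit` marks
# where the virtual robot would occupy the same space as the real object.
```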
  • markers are placed in a real environment, for example on a floor or on a workbench.
  • the markers are optionally sensed by an AR display system and optionally serve as a location for displaying a specific part of a virtual robot.
  • the markers may correspond to a shape of a base of a robot
  • the AR display displays a virtual robot with a base of the virtual robot overlapping the marker
  • the marker optionally includes encoded digital data, such as a QR code.
  • the QR code includes information including one or more of: a code associated with a type of robot, data about the location, an identification code of the location - for example when more than one location is programmed or known by an AR display system.
  • the AR display system optionally displays a virtual robot corresponding to the type of robot encoded in the marker.
  • a marker is placed on a floor or workbench
  • an AR display system is optionally used to display operation of a virtual robot
  • a user optionally adjusts the location of the marker, thereby adjusting a location where the virtual robot is displayed.
  • the user optionally adjusts the location of the robot to verify that the virtual robot and/or a corresponding real robot is correctly placed in real space - to maintain safety; to prevent collision with real-world objects or other robots; and/or to reach locations which the robot is intended to reach.
  • a “Lead by Gesture” method for jogging a robot is optionally used instead of the above-mentioned “Lead by Nose” method.
  • instead of applying force to a robot, users optionally use hand gestures (e.g., drag, push) to select and virtually move a robot or a robot part in an AR display of a virtual robot, and cause the real robot or real robot part to move in a corresponding direction.
  • sensor(s) in a space of an AR display showing a virtual robot sense location of the user’s hand and interpret gestures in the space.
  • the gestures are made in a same volume where the AR display apparently displays the virtual robot movement.
  • the AR display shows the virtual robot on a touch screen, and gestures are made on the touch screen to cause the AR display to move the shape of the virtual robot.
  • a camera such as a tablet camera detects a user touching a real robot and optionally provides a command to the real robot to move in a direction away from the touch, providing a result of the user pushing the real robot.
  • a camera such as a tablet camera detects a user’s hand at a location apparently touching a virtual robot, and optionally provides a command to the virtual robot to move in a direction away from the touch, providing a result of the user apparently pushing the virtual robot.
  • the AR display optionally provides a user interface for the user to select a robot degree of freedom, and optionally uses a drag gesture on a touch screen to determine an extent by which the degree of freedom is moved.
  • the AR display optionally provides a user interface for the user to select a robot and/or one or more robot motion parameters such as a robot motion frame, or other parameters such as velocity, acceleration, jerk, blending parameters and torque command and optionally uses a drag gesture on a touch screen to determine an extent by which the motion frame is moved.
  • the AR display optionally provides a user interface for the user to select a robot appendage for movement, and optionally uses a drag gesture on a touch screen to determine an extent by which the appendage is moved.
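  • A minimal sketch of the drag-to-jog idea follows; it assumes a virtual robot object exposing a list of joint angles in degrees, and the pixel-to-degree gain and joint limits are illustrative values, not the patent’s.

```python
# Minimal sketch (illustrative gains and limits): converting a touch-screen
# drag into a jog of the currently selected axis of a displayed virtual robot.
DEG_PER_PIXEL = 0.05               # assumed drag sensitivity
JOINT_LIMITS_DEG = (-170.0, 170.0)

class TouchJog:
    def __init__(self, virtual_robot, axis_index=0):
        self.robot = virtual_robot     # assumed to expose joint_angles_deg (list)
        self.axis = axis_index

    def select_axis(self, axis_index):
        """Called when the user picks an axis or appendage in the UI."""
        self.axis = axis_index

    def on_drag(self, dx_pixels):
        """Map a horizontal drag (in pixels) to a clamped jog of the axis."""
        current = self.robot.joint_angles_deg[self.axis]
        target = current + dx_pixels * DEG_PER_PIXEL
        lo, hi = JOINT_LIMITS_DEG
        self.robot.joint_angles_deg[self.axis] = max(lo, min(hi, target))
        # The AR display would then re-render the virtual robot at the new angles.
```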
  • a front camera of a tablet or some other mobile or AR device optionally tracks a user’s eyes, and detects where the user is looking in order to select a robot appendage to move. In some embodiments the front camera optionally tracks where the user is looking, optionally moving a selected appendage in a direction where the user is looking.
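  • A small sketch of gaze-based selection is given below, assuming an eye-tracking library already supplies an estimated on-screen gaze point and that screen-space bounding boxes of the robot appendages are known; the gain value is illustrative.

```python
# Illustrative sketch: pick the appendage the user is looking at from an
# estimated on-screen gaze point, and turn a gaze shift into a small nudge.
# Gaze estimation itself is assumed to come from an eye-tracking library.

def pick_appendage(gaze_xy, appendage_boxes):
    """appendage_boxes: {name: (x0, y0, x1, y1)} in screen pixels."""
    gx, gy = gaze_xy
    for name, (x0, y0, x1, y1) in appendage_boxes.items():
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            return name
    return None

def follow_gaze(previous_gaze, current_gaze, metres_per_pixel=0.001):
    """Turn a gaze shift (pixels) into a displacement command for the appendage."""
    dx = (current_gaze[0] - previous_gaze[0]) * metres_per_pixel
    dy = (current_gaze[1] - previous_gaze[1]) * metres_per_pixel
    return dx, dy
```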
  • a VR display is used to implement the above methods - the VR display displays both the virtual robot and the industrial environment.
  • The “Lead by Gesture” method is potentially safer than a “Lead by Nose” method acting upon a real robot, because the “Lead by Gesture” method can optionally be used without approaching an enabled real moving robot.
  • users can step away from a computer/monitor and perform operations apparently directly on a virtual robot.
  • a sensing system optionally detects location of a user or a user’s hand(s), and when a user’s hand is at a location apparently touching the virtual robot, the sensing system senses movements or gestures of the user’s hands and translates the movements to a user Leading By Nose the virtual robot displayed by the AR display.
  • real-time live data can be displayed, augmented on top of a real (or a virtual) robot, displaying system information such as tasks, motion frames, robot properties, drive status, and more. Users can optionally change parameters of a real robot directly from an AR application, optionally without using an additional Human Machine Interface, a computer monitor or a keyboard.
  • a user interacts with the AR or VR display system to mark a safety zone around objects in the real environment or in the virtual environment.
  • the AR or VR display optionally refrains from moving the virtual robot into the safety zone.
  • the AR or VR display optionally provides warning such as audible or visual warning when a user’s command to move the virtual robot would move the virtual robot into the safety zone.
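  • A minimal sketch of such safety zones follows, modelling each zone as a sphere marked through the AR/VR user interface; the spherical zone shape and the move_to() interface are assumptions made for illustration.

```python
# Minimal sketch: user-marked safety zones modelled as spheres; a commanded
# move of the virtual robot into a zone is refused and reported instead.
import math

safety_zones = []   # list of ((x, y, z), radius_m) marked through the AR/VR UI

def mark_zone(centre_xyz, radius_m):
    safety_zones.append((tuple(centre_xyz), float(radius_m)))

def violates_zone(target_xyz):
    return any(math.dist(target_xyz, centre) < radius
               for centre, radius in safety_zones)

def request_move(virtual_robot, target_xyz):
    """Move the displayed robot only if the target stays outside every zone."""
    if violates_zone(target_xyz):
        return "warning: commanded move enters a marked safety zone"
    virtual_robot.move_to(target_xyz)   # assumed display-side move method
    return "ok"
```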
  • the AR display system displays a virtual robot located in a real environment.
  • the AR display system optionally includes sensors which locate real object(s) in the real environment and the AR display system optionally takes into account location of the real object(s), such as object avoidance.
  • a VR display system displays a virtual robot located in a virtual environment.
  • the VR display system optionally includes sensors which locate real object(s) in the real environment and the VR display system optionally displays a virtual robot and a virtual environment which reproduces the real objects in the real environments.
  • a camera of a tablet, mobile device, or AR device is optionally used to capture an image or images of the environment, and a corresponding virtual environment and a virtual robot in the virtual environment are subsequently displayed.
  • employees are optionally educated on a virtual robot, potentially not even on an industrial floor location, and potentially without being limited to a time when a real robot is free from industrial tasks - potentially saving time, money and company resources, and eliminating fear of damaging a real robot.
  • Figure 1 is a simplified block diagram illustration of a system for displaying a virtual robot in a real-world environment according to an example embodiment of the invention.
  • Figure 1 shows a computing unit 101, a display 103, a computer model 102 of a robot, a camera 104 and a user interface 105.
  • the computing unit 101 optionally has data describing the computer model 102, and is optionally configured to calculate movement of the model 102 based on possible movements for the specific model 102.
  • the computing unit 101 optionally controls the display 103 to display a virtual robot corresponding to a specific shape of the model 102. In some embodiments, the computing unit 101 optionally controls the display 103 to display the virtual robot at any shape possible for the model 102.
  • the computing unit 101 optionally controls the display 103 to display a real environment as captured by the camera 104.
  • Figure 1 shows an optional block 106 representing the environment.
  • the computing unit 101 optionally controls the display 103 to display a virtual environment, optionally un-related to the real environment of the system.
  • the computing unit 101 optionally controls the display 103 to display a virtual environment, optionally based on one or more images or a video clip of the real environment of the system.
  • the computing unit 101 optionally interfaces with a user by the user interface 105.
  • the user interface 105 serves to provide user input for:
  • a specific robot motion frame such as joint, base, world, or tool for which to control a display of movement by a displayed virtual robot
  • the user gesture is a drag or slide along a touch screen, optionally a touch screen of a tablet.
  • the user gesture is a hand or finger movement detectable by the camera 104.
  • the user gesture is an eye movement detectable by the camera 104.
  • the user gesture is an eye movement detectable by a front camera of a tablet.
  • the camera 104 serves to detect a marking in a real-world environment with which to align the virtual robot.
  • the computer 101 controls the display 103 to display a real-time image of the real world, and also an image of the virtual robot situated in the real world at a location indicated by the marking.
  • the computer 101 controls the display 103 to display an image of a virtual world, and also an image of the virtual robot situated in the virtual world at a location indicated by the marking.
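  • To make the wiring of Figure 1 concrete, the schematic sketch below (its interfaces and names are assumptions, not the patent’s API) shows a computing unit 101 turning a user command into a robot shape from the model 102, pairing it with a camera 104 frame of the environment 106, and handing both to the display 103.

```python
# Schematic sketch of the Figure 1 blocks (interfaces are assumptions, not
# the patent's API): the computing unit turns a user command into a robot
# shape and pairs it with a camera frame for the display to composite.
from dataclasses import dataclass, field

@dataclass
class RobotModel:                        # block 102
    joint_angles: list = field(default_factory=lambda: [0.0, 0.0, 0.0])

    def shape_at(self, joint_angles):
        """Return a drawable description of the robot at the given angles."""
        return {"joints": list(joint_angles)}

@dataclass
class ComputingUnit:                     # block 101
    model: RobotModel
    camera: object                       # block 104, assumed .capture() -> frame
    display: object                      # block 103, assumed .show(frame, overlay)

    def on_user_command(self, joint_angles):
        """The MMI (block 105) calls this with commanded joint angles."""
        overlay = self.model.shape_at(joint_angles)   # virtual robot shape
        frame = self.camera.capture()                 # real environment (block 106)
        self.display.show(frame, overlay)             # AR composite on the display
```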
  • Figure 2 is a simplified flow chart illustration of a method for displaying a virtual robot in a real-world environment according to an example embodiment of the invention.
  • the method of Figure 2 includes:
  • a display to display the shape of the robot as a virtual robot and to display an industrial environment (208) based on the image of a real-world environment.
  • Figure 3 is a simplified image showing a system for displaying a virtual robot in a real-world environment according to an example embodiment of the invention.
  • Figure 3 shows a system including computer 301 having a display, showing a virtual robot 305 based on a computer model of a robot, and also displaying a real-world image 303 of the real world 302, optionally obtained by a camera (not shown) in the computer 301.
  • Figure 3 also shows a user’s finger 304 manipulating a touchscreen on the computer 301 to control the system.
  • Figure 3 can be understood to show the user selecting a specific part of the virtual robot 305, with an identifier 306 associated with the selected specific part.
  • Figure 3 can alternatively be understood to show the user swiping the finger 304 to control an extent of movement of the specific part of the virtual robot 305, and optionally being shown data.
  • Figure 4 is a simplified image showing a screen of a system for displaying a virtual robot in an environment according to an example embodiment of the invention.
  • Figure 4 shows a screen 401 displaying a virtual robot 402 in an industrial environment 403.
  • the industrial environment 403 may be any industrial environment such as an assembly line or a laboratory bench, or the setting may optionally be non-industrial, such as a clinic or an operating room for a medical environment, or a game-playing robot in a game field such as soccer, or a table with a chess board for a game-playing robot environment.
  • Figure 4 also shows the screen 401 displaying optional controls 404 for controlling the virtual robot 402.
  • the screen 401 is a touch screen and the controls 404 are optionally manipulated by touch and/or swipe movements.
  • Figure 5 is a simplified illustration of a virtual robot according to an example embodiment of the invention.
  • Figure 5 shows symbols standing for actuation components 501, 502, 503 of a virtual robot.
  • the actuation components 501, 502, 503 may be motors, linear motors, pneumatic actuators, or any other type of actuation component.
  • a virtual robot may have one or more degrees of freedom of movement, and each one of the actuation components 501, 502, 503 potentially implements one or more degrees of freedom of movement of the virtual robot.
  • Figure 5 also shows a volume 505 in space which a tip 504 of the virtual robot spans during displayed virtual movement.
  • the tip of the virtual robot may include, by way of some non-limiting examples, a tool, a robotic hand, and a robotic hand grasping a tool.
  • the volume 505 is intended to depict a potential volume which systems according to example embodiments of the invention may optionally display.
  • the volume 505 may represent not a volume spanned by a tip of the virtual robot, but, by way of a non-limiting example, an entire volume spanned by any position possible for the virtual robot, or a specific portion of the entire volume.
  • the volume 505 potentially delineates, by way of some non-limiting examples:
  • the volume 505 is displayed as a highlighted volume:
  • AR display is intended to include all such new technologies a priori.
  • compositions, method or structure may include additional ingredients, steps and/or parts, but only if the additional ingredients, steps and/or parts do not materially alter the basic and novel characteristics of the claimed composition, method or structure.
  • the singular form “a”, “an” and “the” include plural references unless the context clearly dictates otherwise.
  • the term “a unit” or “at least one unit” may include a plurality of units, including combinations thereof.
  • range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the invention. Accordingly, the description of a range should be considered to have specifically disclosed all the possible sub-ranges as well as individual numerical values within that range. For example, description of a range such as from 1 to 6 should be considered to have specifically disclosed sub-ranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
  • whenever a numerical range is indicated herein (for example “10-15”, “10 to 15”, or any pair of numbers linked by another such range indication), it is meant to include any number (fractional or integral) within the indicated range limits, including the range limits, unless the context clearly dictates otherwise.
  • the phrases “range/ranging/ranges between” a first indicated number and a second indicated number and “range/ranging/ranges from” a first indicated number “to”, “up to”, “until” or “through” (or another such range-indicating term) a second indicated number are used herein interchangeably and are meant to include the first and second indicated numbers and all the fractional and integral numbers there between.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A system for displaying a virtual robot in a real-world environment including a display, a computer model of a robot, a computing unit for calculating a shape of the robot at a specific position according to the computer model, a camera for capturing an image of a real-world environment, and a man machine interface (MMI) for a user to provide commands to the computing unit to calculate the shape of the robot, wherein the display is arranged to display the shape of the robot as a virtual robot. Related apparatus and methods are also described.

Description

AUGMENTED REALITY FOR INDUSTRIAL ROBOTICS
RELATED APPLICATIONS
This application is a PCT application claiming the benefit of priority of U.S. Provisional Patent Application No. 62/647,871 filed March 26, 2018, and of U.S. Provisional Patent Application No. 62/647,861 filed March 26, 2018, the contents of which are incorporated herein by reference in their entirety.
This application is related to co-filed, co-pending and co-assigned PCT Patent Application entitled “TEACH PENDANT AS AN ADD-ON” (Attorney Docket No. 76996), the disclosures of which are incorporated herein by reference.
FIELD AND BACKGROUND OF THE INVENTION
The present invention, in some embodiments thereof, relates to using augmented reality for industrial robotics and, more particularly, but not exclusively, to using augmented reality or virtual reality for planning and/or testing a robot’s movements.
The disclosures of all references mentioned above and throughout the present specification, as well as the disclosures of all references mentioned in those references, are hereby incorporated herein by reference.
SUMMARY OF THE INVENTION
The present invention, in some embodiments thereof, relates to using augmented reality for industrial robotics and, more particularly, but not exclusively, to using augmented reality or virtual reality for planning a robot’s movements.
According to an aspect of some embodiments of the present invention there is provided a system for displaying a virtual robot in a real-world environment including a display, a computer model of a robot, a computing unit for calculating a shape of the robot at a specific position according to the computer model, a camera for capturing an image of a real-world environment, and a man machine interface (MMI) for a user to provide commands to the computing unit to calculate the shape of the robot, wherein the display is arranged to display the shape of the robot as a virtual robot.
According to some embodiments of the invention, the model of the robot is a real-world robot.
According to some embodiments of the invention, the model of the robot is a computer model of a robot. According to some embodiments of the invention, the computing unit is configured for controlling displayed movement of the model of the robot as movement of the virtual robot.
According to some embodiments of the invention, the display is a Virtual Reality (VR) display and further including the computing unit being arranged to control the VR display to display a virtual environment for the virtual robot.
According to some embodiments of the invention, the virtual environment is based on the image of the real-world environment.
According to some embodiments of the invention, the computing unit is configured for controlling displayed movement of all the motions possible for the model of the robot.
According to some embodiments of the invention, the computing unit is configured for controlling displayed movement of all the motions possible for the model of the robot to their full range of movement.
According to some embodiments of the invention, the display is configured to display the entire range of movement as a highlighted volume in space.
According to some embodiments of the invention, the display is an Augmented Reality (AR) display.
According to some embodiments of the invention, further including sensors for detecting a user’s gesture.
According to some embodiments of the invention, the sensors are arranged to detect the user’s gesture in the real-world environment in which the AR display apparently displays the virtual robot.
According to some embodiments of the invention, the sensors are arranged to detect a real object in a real-world space in which the display displays the robot.
According to some embodiments of the invention, the display is configured to refrain from displaying the robot in a same space as the real object.
According to some embodiments of the invention, the display is configured to display the entire range of movement of the virtual robot, except for when the virtual robot appears to occupy a same space as a real object, as a highlighted volume in space.
According to some embodiments of the invention, the display is configured to display only a portion of the range of movement of the virtual robot when the virtual robot appears to occupy a same space as a real object as a highlighted volume in space.
According to some embodiments of the invention, the computing unit is included within the display. According to some embodiments of the invention, the AR display includes a head mounted display.
According to some embodiments of the invention, the display includes a tablet, and the screen of the tablet is arranged to display the virtual robot.
According to some embodiments of the invention, the screen of the tablet is further arranged to display a virtual environment of the virtual robot.
According to some embodiments of the invention, the screen of the tablet is further arranged to display the real-world environment as captured by the camera.
According to some embodiments of the invention, the screen of the tablet includes a touch screen and the system is arranged to use the touch screen to input user gestures for controlling movement of the virtual robot.
According to some embodiments of the invention, the system is arranged to use the touch screen to input which type of robot is to be displayed.
According to some embodiments of the invention, the system is arranged to use the touch screen to input which robot appendage is to be moved.
According to some embodiments of the invention, the system is arranged to use the touch screen to select which axis of a motion frame of a robot is to be controlled.
According to some embodiments of the invention, the system is arranged to use the touch screen to input by how much the selected axis of a selected robot is to be jogged.
According to some embodiments of the invention, further including a marking in the real-world environment associated with a location planned for the robot in the real-world environment.
According to some embodiments of the invention, further including at least one device sensor selected from a group consisting of a gyroscope, an accelerometer, and a GPS unit.
According to some embodiments of the invention, the system is used for delineating a space in a real-world environment spanned by movement of the robot.
According to an aspect of some embodiments of the present invention there is provided a method for displaying a virtual robot in a real-world environment including providing a model of a robot, providing an image of a real-world environment, calculating a shape of the model of the robot at a specific position in the real-world environment according to the computer model, and using a display to display the shape of the robot as a virtual robot and to display an industrial environment based on the image of a real-world environment, wherein the virtual robot is displayed at the specific position in the real-world environment. According to some embodiments of the invention, the display is an Augmented Reality (AR) display.
According to some embodiments of the invention, further including sensing a user’s gesture and using the user’s gesture to change the display of the virtual robot.
According to some embodiments of the invention, the sensing is performed by a camera capturing images in a real-world space in which the AR display displays the robot.
According to some embodiments of the invention, input sensors are arranged to detect the user’s gesture in a real-world space in which the AR display displays the robot.
According to some embodiments of the invention, and further including using the user’s gesture to change the shape in which the AR display displays the model.
According to some embodiments of the invention, the model of the robot is a real-world robot.
According to some embodiments of the invention, the model of the robot is a computer model of a robot.
According to some embodiments of the invention, calculating a shape of the model of the robot includes controlling displayed movement of the virtual robot.
According to some embodiments of the invention, the display is a Virtual Reality (VR) display and calculating a shape of the model of the robot includes controlling the VR display to display a virtual environment for the virtual robot.
According to some embodiments of the invention, the virtual environment is based on the image of the real-world environment.
According to some embodiments of the invention, further including sensing a marking in the real-world environment associated with a location planned for the robot in the real-world environment.
According to some embodiments of the invention, the display is used for delineating a space in the real-world environment spanned by movement of the virtual robot.
According to some embodiments of the invention, the display is used to display the entire range of movement as a highlighted volume in space.
According to some embodiments of the invention, the sensing a user’s gesture includes sensing a real object in a real-world space in which the display displays the virtual robot.
According to some embodiments of the invention, further including refraining from displaying the virtual robot in a same space as the real-world space of the real object. According to some embodiments of the invention, further including displaying the entire range of movement of the virtual robot, except for when the virtual robot appears to occupy a same space as a real object, as a highlighted volume in space.
According to some embodiments of the invention, further including displaying only a portion of the range of movement of the virtual robot where the virtual robot is calculated to occupy a same space as a real-world space of the real object, as a highlighted volume in space.
According to some embodiments of the invention, further including using a front camera of a tablet to detect a user’s eyes, calculating where the user is looking, and selecting a virtual robot appendage to control based on where the user is looking.
According to some embodiments of the invention, further including using a front camera of a tablet to detect a user’s eyes, calculating where the user is looking, tracking a shift in the user’s direction of looking and controlling displaying movement of a selected appendage of the virtual robot based on the shift in the user’s direction of looking.
Unless otherwise defined, all technical and/or scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the invention pertains. Although methods and materials similar or equivalent to those described herein can be used in the practice or testing of embodiments of the invention, exemplary methods and/or materials are described below. In case of conflict, the patent specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and are not intended to be necessarily limiting.
As will be appreciated by one skilled in the art, some embodiments of the present invention may be embodied as a system, method or computer program product. Accordingly, some embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, microcode, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, some embodiments of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon. Implementation of the method and/or system of some embodiments of the invention can involve performing and/or completing selected tasks manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of some embodiments of the method and/or system of the invention, several selected tasks could be implemented by hardware, by software or by firmware and/or by a combination thereof, e.g., using an operating system. For example, hardware for performing selected tasks according to some embodiments of the invention could be implemented as a chip or a circuit. As software, selected tasks according to some embodiments of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In an exemplary embodiment of the invention, one or more tasks according to some exemplary embodiments of the method and/or system as described herein are performed by a data processor, such as a computing platform for executing a plurality of instructions. Optionally, the data processor includes a volatile memory for storing instructions and/or data and/or a non-volatile storage, for example, a magnetic hard-disk and/or removable media, for storing instructions and/or data. Optionally, a network connection is provided as well. A display and/or a user input device such as a keyboard or mouse are optionally provided as well.
Any combination of one or more computer readable medium(s) may be utilized for some embodiments of the invention. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium and/or data used thereby may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for some embodiments of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Some embodiments of the present invention may be described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
Some of the methods described herein are generally designed only for use by a computer, and may not be feasible or practical for performing purely manually, by a human expert. A human expert who wanted to manually perform similar tasks, such as delineating a volume spanned by a robot’s movements, might be expected to use completely different methods, e.g., making use of expert knowledge and/or the pattern recognition capabilities of the human brain, which would be vastly more efficient than manually going through the steps of the methods described herein.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING(S)
Some embodiments of the invention are herein described, by way of example only, with reference to the accompanying drawings and images. With specific reference now to the drawings and images in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of embodiments of the invention. In this regard, the description taken with the drawings and images makes apparent to those skilled in the art how embodiments of the invention may be practiced.
In the drawings:
FIG. 1 is a simplified block diagram illustration of a system for displaying a virtual robot in a real-world environment according to an example embodiment of the invention;
FIG. 2 is a simplified flow chart illustration of a method for displaying a virtual robot in a real-world environment according to an example embodiment of the invention;
FIG. 3 is a simplified image showing a system for displaying a virtual robot in a real- world environment according to an example embodiment of the invention;
FIG. 4 is a simplified image showing a screen of a system for displaying a virtual robot in an environment according to an example embodiment of the invention; and
FIG. 5 which is a simplified illustration of a virtual robot according to an example embodiment of the invention.
DESCRIPTION OF SPECIFIC EMBODIMENTS OF THE INVENTION
The present invention, in some embodiments thereof, relates to using augmented reality for industrial robotics and, more particularly, but not exclusively, to using augmented reality or virtual reality for planning and/or testing a robot’s movements.

Introduction
Augmented Reality (AR) is a way of displaying the physical, real-world environment with computer-generated 2D/3D objects displayed as part of the real-world environment. The computer-generated 2D/3D virtual objects are optionally displayed to appear as interacting with the real world. An AR display system typically uses sensors to sense locations of objects in the real-world environment, and image-processing technologies, optionally in real time. An AR display system optionally interacts with a user, whether by the user providing input through computer controls, such as buttons, a keyboard or a touch screen, or by detecting the user’s gestures in a volume of space where virtual objects are displayed.
Virtual Reality (VR) is a way of displaying a virtual world environment with computer generated 2D/3D objects. In some embodiments the VR display system uses sensors to sense locations of real objects in a real-world environment, and optionally displays the real objects in the virtual world environment.
The term AR as used in the present specification and claims is intended to include VR, as will be understood by persons skilled in the art.
AR is already being used in a wide variety of industries, including gaming, medical industries and education. With more and more devices capable of displaying AR content coming out each year, the AR platform is becoming affordable for everyone - and it keeps getting better every year.
According to an aspect of some embodiments there is provided a method for using AR in the Industrial Robotics Industry (IRI).
The IRI today uses traditional 3D software (e.g., RoboWorks, RoboDK) to visualize robotic applications without using an actual robot. While this may be a good solution for testing just the motion of a robot, this method doesn’t take into account factors such as the user’s and the environment’s interactions with the robot.
Some example embodiment methods for teaching positions to a robot are:
Teach Pendants - hand-held devices which allow users to move a robot and then save robot parameters associated with location of the robot for later use. Such a method can work with AR and/or VR - a user uses an AR (or VR) display device and operates a real or a virtual robot (optionally displayed by an AR or VR display) controlled by a teach pendant.
In some embodiments a Teach Pendant (TP) is used to control movement of a virtual robot, displayed by an AR or a VR device.
In some embodiments a Teach Pendant (TP) is used to control movement of a real robot, where the real robot optionally serves to anchor or locate a virtual robot displayed by an AR or a VR device, and the movement is optionally displayed such that a portion of the displayed virtual robot coincides with the real robot, and a portion of the displayed virtual robot - for example the moving portion - is displayed as a virtual robot movement, optionally without the real robot moving.
In some embodiments a Teach Pendant (TP) such as described in the above-mentioned co-filed, co-pending and co-assigned U.S. Patent Application entitled “TEACH PENDANT” is optionally used to control movement of a real or virtual robot, and to display an AR or VR display on a screen of the TP. In some embodiments the TP includes or is attached to a tablet, and the tablet screen optionally serves for displaying the AR or VR display, and/or for implementing a user interface such as a touch screen.
Lead By Nose - a technology which allows users to move a robot by applying force (for example with their hands) on the robot, and then save robot parameters associated with the location of the robot for later use.
Teach pendants are devices, often hand-held, that can control movement of real or virtual (simulated) robots to a specific location, and optionally save the location and/or control commands for later use.
In embodiments where the robot is a virtual robot, a user views the movement on a display, sometimes using a computer program for displaying robotic movement.
In some AR embodiments, such display is optionally extended, so that users are able to see virtual robots and their interactions with the surrounding environment, optionally while using a teach pendant to control (and optionally save commands for) moving the virtual robots.
In some AR embodiments, AR is optionally used instead of or in addition to a traditional teach pendant (which is hand-held), for example, by using AR Glasses that can free the user’s hands. Jogging is optionally done by looking at certain points on an augmented world, for example looking at jog buttons that appear next to a real or virtual robot, thus potentially eliminating at least some disadvantages of hand-held devices.
The real-world location of virtual objects, such as a virtual robot, is optionally set by using physical predefined markers placed in the real world. An example for such a marker could be a QR code on the ground. Alternatively, real-time image processing can be used to detect surfaces or edges of objects in the real world.
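By way of a non-limiting illustration, the following sketch shows one possible way of detecting such a marker in a camera frame and deriving an anchor point for a virtual robot. It assumes the OpenCV and NumPy libraries and uses a QR code as the marker; the function name anchor_from_frame is illustrative rather than part of any particular product.

```python
# Illustrative sketch: find a QR-code marker in a camera frame and return its
# decoded payload together with the pixel coordinates of its centre, which an
# AR display could use as the anchor for the virtual robot's base.
import cv2
import numpy as np

def anchor_from_frame(frame):
    """Return (payload, (x, y)) for the first QR marker found, else None."""
    detector = cv2.QRCodeDetector()
    payload, corners, _ = detector.detectAndDecode(frame)
    if corners is None or not payload:
        return None                                   # no marker in this frame
    centre = corners.reshape(-1, 2).mean(axis=0)      # centre of the 4 corners
    return payload, (float(centre[0]), float(centre[1]))
```

In a running system the frame would come from the tablet or AR-device camera, and the anchor would be re-estimated for every displayed frame while the marker remains in view.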
In various embodiments, Augmented Reality and/or Virtual Reality devices used include mobile platforms; wearable tech; tablets; mobile phones; AR Glasses, and VR glasses. In some embodiments, virtual objects such as a virtual robot are optionally displayed by lenses, such as in a Microsoft HoloLens or in a Google Glass, and not necessarily on a display screen or a touch screen.
In various embodiments, AR glasses or displays are used which include a hand gesture interface, or a gaze interface.
In some embodiments a gesture is detected in space and serves as input to the AR or VR system.
In some embodiments a user’s gaze is detected as viewing in a specific direction, and the direction is used for interacting with an AR or VR system, for example detecting a gaze direction to a virtual object which the user is gazing at and optionally selecting the virtual object and/or moving the virtual object in a direction of the gaze.
According to an aspect of some embodiments there are provided methods for using AR or VR displays to display virtual robotic movement in an Augmented Reality environment or in a Virtual Reality environment.
The term “virtual robot” in all its grammatical forms is used throughout the present specification and claims to mean a robot as displayed by a display.
In some embodiments, an AR display or a VR display is optionally integrated into Teach Pendant software - displaying a virtual robot in a Teach Pendant screen, eliminating a need for using external monitors to visualize virtual robots.
In some embodiments, users can safely get close to location of a moving virtual robot displayed by an AR display or a VR display without the risk of getting harmed by a real robot.
In some embodiments, users can step away from a computer/monitor and perform operations apparently directly on a virtual robot. A sensing system optionally detects location of a user or a user’s hand(s), and when a user’s hand is at a location apparently touching the virtual robot, the sensing system senses movements of the user’s hands and translates the movements to act as if a user is Leading By Nose the virtual robot displayed by the AR display.
Real robots can be expensive to purchase and/or operate and/or fix when damaged. Using a virtual robot in an AR setting can save money, allowing users to test robotic motion programs in the real world, educating employees, and working with a virtual robot when such a real robot may not be available.
Real robots are often part of a working assembly line. Using a virtual robot in an AR setting can save money, potentially enabling users to test robotic motion programs without pausing production. In some embodiments, a VR display is optionally integrated into Teach Pendant software - displaying a virtual robot and a virtual environment in a Teach Pendant screen, displaying for example a robot in an industrial setting.
Visualization
In some embodiments an AR display system is used to display a virtual robot in real space.
In some embodiments an AR display system is optionally used to display a virtual volume in real space which is a volume of a full range of movement of the virtual robot.
In some embodiments a VR display system is used to display a virtual robot in a virtual space.
In some embodiments, the VR display optionally displays a virtual robot and a dynamic virtual environment, displaying for example a robot in an assembly line with dynamically moving objects in the assembly line.
In some embodiments Denavit and Hartenberg (DH) parameters are used to represent industrial robots.
In some embodiments DH parameters are assigned for a robot, a workspace of the robot is optionally calculated, and the workspace is optionally displayed by an AR display.
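As a non-limiting illustration of how DH parameters can drive such a display, the following sketch computes a tool-tip position from a DH table and coarsely samples joint space to produce a point cloud approximating the workspace; the three-link parameter values are made up for the example and do not describe any specific robot.

```python
# Illustrative sketch: forward kinematics from Denavit-Hartenberg parameters,
# plus a coarse joint-space sweep to approximate the reachable workspace that
# an AR display could render as a volume.
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard DH link transform for one joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def tool_position(joint_angles, dh_table):
    """Chain the link transforms and return the tool-tip XYZ."""
    T = np.eye(4)
    for q, (d, a, alpha) in zip(joint_angles, dh_table):
        T = T @ dh_transform(q, d, a, alpha)
    return T[:3, 3]

# Example 3-joint arm: (d, a, alpha) per link -- illustrative values only.
DH_TABLE = [(0.3, 0.0, np.pi / 2), (0.0, 0.4, 0.0), (0.0, 0.3, 0.0)]

def sample_workspace(dh_table, steps=15):
    """Coarsely sample joint space and collect the tool positions reached."""
    grid = np.linspace(-np.pi, np.pi, steps)
    return np.array([tool_position((q1, q2, q3), dh_table)
                     for q1 in grid for q2 in grid for q3 in grid])

points = sample_workspace(DH_TABLE)   # point cloud an AR display could render
```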
In some embodiments an AR display system is optionally used to display a virtual robot in space, where the virtual robot appears in a position determined by user commands. In some embodiments a VR display system is optionally used to display a virtual robot in virtual space, where the virtual robot appears in a position determined by user commands. In some embodiments the commands are provided by a Teach Pendant. In some embodiments the commands are provided by a Teach Pendant as described in the above-mentioned U.S. Provisional Patent Applications entitled “TEACH PENDANT AS AN ADD-ON”, the contents of which are incorporated herein by reference.
In some embodiments the virtual robot appears in a position determined by user commands, but at a specific time following the commands. The display system optionally apparently delays virtual execution of the commands by the specific time, by displaying a corresponding movement later than commanded. By way of a non-limiting example, such a delay optionally serves for training. By way of a non-limiting example the delay optionally enables showing the virtual robot performing an incorrect move, such as a collision with an object, and optionally enables a user to react to correct the incorrect move.

In some embodiments a virtual robot is displayed to appear in positions determined by a program which includes several robot movement commands, and the program also causes a real robot to move in real space. In some embodiments, an AR display optionally displays a virtual robot performing the program commands a specific period of time before the real robot is caused to perform the program commands. The display shows the virtual robot apparently performing movements before the real robot performs the same movements. By way of a non-limiting example, such a preview of where the real robot will be after the specific period of time optionally serves for training. By way of a non-limiting example the preview optionally enables showing the virtual robot performing an incorrect move, such as a collision with an object, and optionally enables a user to react to stop, for example by using an emergency stop (E-stop) button before the real robot performs the incorrect move.
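A minimal sketch of this delayed-execution idea follows; it assumes caller-supplied send_to_virtual and send_to_real callables and an e_stop predicate, all of which are illustrative placeholders rather than a specific controller interface.

```python
# Illustrative sketch: the same command stream drives the virtual robot
# immediately and the real robot only after a fixed delay, so a user can hit
# an emergency stop before the real robot repeats a bad move.
import time
from collections import deque

def run_with_preview(commands, send_to_virtual, send_to_real, delay_s=2.0,
                     e_stop=lambda: False):
    pending = deque()                       # (release_time, command)
    for cmd in commands:
        send_to_virtual(cmd)                # virtual robot moves right away
        pending.append((time.monotonic() + delay_s, cmd))
        while pending and pending[0][0] <= time.monotonic():
            _, due = pending.popleft()
            if e_stop():
                return                      # user stopped before real execution
            send_to_real(due)               # real robot follows after the delay
    while pending:                          # flush any commands still pending
        release, due = pending.popleft()
        time.sleep(max(0.0, release - time.monotonic()))
        if e_stop():
            return
        send_to_real(due)
```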
Display devices
In some embodiments an AR display system optionally uses one or more of the following to display an Augmented Reality display with a virtual robot augmented onto an image of a real world:
A head mounted display, a computer display, a tablet display, a touch screen, a smart phone screen.
In some embodiments the display is optionally a tablet screen such as described in the above-mentioned U.S. Provisional Patent Applications entitled “TEACH PENDANT AS AN ADD-ON”, the contents of which are incorporated herein by reference. In some embodiments the tablet camera serves to capture an image or images of the real world and the tablet display shows a virtual robot augmenting the real world image.
In some embodiments the tablet display dynamically displays what the tablet camera dynamically captures, with the virtual robot augmented onto a correct location in the display of the real world.
In some embodiments the tablet display displays what the tablet camera captures, and the image or images of the real world serve as background for the display of the virtual robot, even if or when the tablet is pointed elsewhere.
In some embodiments the tablet automatically detects when the tablet camera ceases to point at a location designated for the virtual robot, for example when losing sight of an anchoring marking (described in more detail below) or losing sight of a real robot. In some embodiments the tablet provides a warning when the tablet camera loses sight of an intended location for the virtual robot. In some embodiments, when losing sight of an anchoring marking or a real robot the tablet displays a previous image or images of the real world as background for the display of the virtual robot.
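The following sketch illustrates one way such a fallback could be organized, assuming a marker detector like the one sketched earlier; the warn_user callable and the state dictionary are illustrative placeholders.

```python
# Illustrative sketch: keep the last frame in which the anchor marker was
# seen, warn when it is lost, and keep using that frame as the AR background.
def background_for_display(frame, state, anchor_from_frame, warn_user):
    anchor = anchor_from_frame(frame)
    if anchor is not None:
        state["last_good_frame"] = frame    # marker visible: use live frame
        state["anchor"] = anchor
        return frame
    warn_user("Anchor marker not in view")  # marker lost: warn and fall back
    return state.get("last_good_frame", frame)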
Virtual collisions
In some embodiments an AR display system used to display a virtual robot includes information about real objects in the environment.
In some embodiments an AR display system used to display a virtual robot senses locations of real objects in the environment.
In some embodiments an AR display system used to display a virtual robot optionally does not display the virtual robot moving to positions which would cause a real robot to move into space occupied by the real objects in the environment. Such an AR display system implements collision avoidance in the AR display of the virtual robot.
In some embodiments an AR display system used to display a virtual robot optionally displays a virtual volume in real space only where a real robot exercising a full range of movement would occupy space occupied by the real objects. Such an AR display system displays spaces where collision of the virtual robot with real objects is detected. Such a display can serve for warning and optionally fixing problems with a planned location or movement of a planned real robot.
In some embodiments an AR display system used to display a virtual robot includes information about real objects in the environment, and displays a virtual volume in real space only where a real robot exercising a range of movement would not crash into the real objects. Such an AR display system displays spaces where collision of the virtual robot with real objects does not happen. Such a display can serve for visualizing a safe volume for a planned real robot and/or planning a safe volume for movement for the real robot.
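A non-limiting sketch of both display modes follows: the workspace point cloud from the DH sketch above is split into points that would and would not collide with sensed real objects, here approximated as axis-aligned boxes given as ((xmin, ymin, zmin), (xmax, ymax, zmax)) tuples; that box representation is an assumption chosen for the example.

```python
# Illustrative sketch: split a workspace point cloud into "safe" and
# "colliding" subsets against real objects approximated as axis-aligned boxes.
import numpy as np

def in_box(p, box):
    lo, hi = box
    return all(lo[i] <= p[i] <= hi[i] for i in range(3))

def split_workspace(points, obstacle_boxes):
    """Return (safe_points, colliding_points) for display as two volumes."""
    colliding = np.array([any(in_box(p, b) for b in obstacle_boxes)
                          for p in points])
    return points[~colliding], points[colliding]

# Usage: highlight safe_points as the safe volume, or colliding_points as the
# warning volume, depending on which display mode is selected.
```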
Locating and/or anchoring a (virtual) robot
In some embodiments, markers are placed in a real environment, for example on a floor or on a workbench. In some embodiments the markers are optionally sensed by an AR display system and optionally serve as a location for displaying a specific part of a virtual robot.
By way of a non-limiting example, the markers may correspond to a shape of a base of a robot, and the AR display displays a virtual robot with a base of the virtual robot overlapping the marker.
In some embodiments, the marker optionally includes encoded digital data, such as a QR code. In some embodiments, the QR code includes information including one or more of: a code associated with a type of robot, data about the location, and an identification code of the location - for example when more than one location is programmed or known by an AR display system.
In some embodiments, the AR display system optionally displays a virtual robot corresponding to the type of robot encoded in the marker.
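By way of a non-limiting example, the encoded data could be a small JSON object; the field names robot_type and location_id below are assumptions chosen for the illustration, not a defined marker format.

```python
# Illustrative sketch: decode a marker payload carrying a robot-type code and
# a location identifier, so the AR display can show the corresponding robot.
import json

def parse_marker_payload(payload):
    """Decode the marker text into (robot_type, location_id), or None."""
    try:
        data = json.loads(payload)
        return data["robot_type"], data.get("location_id")
    except (ValueError, KeyError, TypeError):
        return None            # not a marker this sketch understands

# Example: parse_marker_payload('{"robot_type": "scara-4", "location_id": 2}')
```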
In some embodiments, a marker is placed on a floor or workbench, an AR display system is optionally used to display operation of a virtual robot, and a user optionally adjusts the location of the marker, thereby adjusting a location where the virtual robot is displayed. The user optionally adjusts location of the robot to verify that the virtual robot and/or a corresponding real robot, is correctly placed in real space - to maintain safety; to prevent collision with real world objects or other robots; and/or to reach locations which the robot is intended to reach.
Lead by gesture
In some embodiments a “Lead by Gesture” method for jogging a robot is optionally used instead of the above-mentioned “Lead by Nose” method. Instead of applying force to a robot, users optionally use hand gestures (e.g., drag, push) to select and virtually move a robot or a robot part in an AR display of a virtual robot, and cause the real robot or real robot part to move in a corresponding direction. In some embodiments sensor(s) in a space of an AR display showing a virtual robot sense location of the user’s hand and interpret gestures in the space.
In some embodiments the gestures are made in a same volume where the AR display apparently displays the virtual robot movement.
In some embodiments the AR display shows the virtual robot on a touch screen, and gestures are made on the touch screen to cause the AR display to move the shape of the virtual robot.
In some embodiments a camera such as a tablet camera detects a user touching a real robot and optionally provides a command to the real robot to move in a direction away from the touch, providing a result of the user pushing the real robot.
In some embodiments a camera such as a tablet camera detects a user’s hand at a location apparently touching a virtual robot, and optionally provides a command to the virtual robot to move in a direction away from the touch, providing a result of the user apparently pushing the virtual robot.
In some embodiments the AR display optionally provides a user interface for the user to select a robot degree of freedom, and optionally uses a drag gesture on a touch screen to determine an extent by which the degree of freedom is moved. In some embodiments the AR display optionally provides a user interface for the user to select a robot and/or one or more robot motion parameters such as a robot motion frame, or other parameters such as velocity, acceleration, jerk, blending parameters and torque command, and optionally uses a drag gesture on a touch screen to determine an extent by which the motion frame is moved.
In some embodiments the AR display optionally provides a user interface for the user to select a robot appendage for movement, and optionally uses a drag gesture on a touch screen to determine an extent by which the appendage is moved.
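A minimal sketch of mapping such a drag gesture to a jog increment follows; the degrees-per-pixel gain and the jog_joint callable are illustrative assumptions rather than a defined interface.

```python
# Illustrative sketch: turn a touch-screen drag into a jog command for the
# currently selected joint, scaling horizontal drag distance into degrees.
def drag_to_jog(start_xy, end_xy, selected_joint, jog_joint,
                degrees_per_pixel=0.1):
    dx = end_xy[0] - start_xy[0]           # horizontal drag in pixels
    increment = dx * degrees_per_pixel     # signed jog amount in degrees
    jog_joint(selected_joint, increment)   # move the (virtual) robot joint
    return increment
```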
In some embodiments a front camera of a tablet or some other mobile or AR device optionally tracks a user’s eyes, and detects where the user is looking in order to select a robot appendage to move. In some embodiments the front camera optionally tracks where the user is looking, optionally moving a selected appendage in a direction where the user is looking.
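The gaze-based selection can be sketched as choosing the displayed appendage nearest to the estimated gaze point on the screen; how the gaze point itself is estimated from the front-camera image is outside the scope of this sketch, and the dictionary format is an assumption for illustration.

```python
# Illustrative sketch: pick the appendage whose screen position is closest to
# the estimated gaze point.
def select_appendage(gaze_xy, appendage_screen_positions):
    """appendage_screen_positions: {name: (x, y)}; returns the nearest name."""
    def dist2(p):
        return (p[0] - gaze_xy[0]) ** 2 + (p[1] - gaze_xy[1]) ** 2
    return min(appendage_screen_positions,
               key=lambda name: dist2(appendage_screen_positions[name]))
```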
In some embodiments a VR display is used to implement the above methods - the VR display displays both the virtual robot and the industrial environment.
The “Lead by Gesture” method is potentially safer than a “Lead by Nose” method acting upon a real robot, because the “Lead by Gesture” method can optionally be used without approaching an enabled real moving robot.
In some embodiments, users can step away from a computer/monitor and perform operations apparently directly on a virtual robot. A sensing system optionally detects location of a user or a user’s hand(s), and when a user’s hand is at a location apparently touching the virtual robot, the sensing system senses movements or gestures of the user’s hands and translates the movements to a user Leading By Nose the virtual robot displayed by the AR display.
System monitoring
In some embodiments real-time live data can be displayed, augmented on top of a real (or a virtual) robot, displaying system information such as tasks, motion frames, robot properties, drive status, and more. Users can optionally change parameters of a real robot directly from an AR application, optionally without using an additional Human Machine Interface, a computer monitor or a keyboard.
Safety
In some embodiments a user interacts with the AR or VR display system to mark a safety zone around objects in the real environment or in the virtual environment. In some embodiments the AR or VR display optionally refrains from moving the virtual robot into the safety zone.
In some embodiments the AR or VR display optionally provides warning such as audible or visual warning when a user’s command to move the virtual robot would move the virtual robot into the safety zone.
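A minimal sketch of such a guarded move follows, with safety zones approximated as axis-aligned boxes; the warn and move_virtual_robot callables stand in for whatever display and warning mechanisms a concrete system provides.

```python
# Illustrative sketch: a commanded target position that falls inside any
# user-marked safety zone triggers a warning instead of a displayed move.
def guarded_move(target_xyz, safety_zones, move_virtual_robot, warn):
    for lo, hi in safety_zones:            # zones as ((xmin,ymin,zmin),(xmax,ymax,zmax))
        if all(lo[i] <= target_xyz[i] <= hi[i] for i in range(3)):
            warn("Commanded move enters a safety zone")
            return False                   # refuse to display the move
    move_virtual_robot(target_xyz)
    return True
```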
Real environment
In some embodiments the AR display system displays a virtual robot located in a real environment.
In some embodiments the AR display system optionally includes sensors which locate real object(s) in the real environment and the AR display system optionally takes into account location of the real object(s), such as object avoidance.
Virtual environment
In some embodiments a VR display system displays a virtual robot located in a virtual environment.
In some embodiments the VR display system optionally includes sensors which locate real object(s) in the real environment and the VR display system optionally displays a virtual robot and a virtual environment which reproduces the real objects in the real environment.
In some embodiments a camera of a tablet, or mobile device, or AR device is optionally used to capture an image or images of the environment, and the display subsequently shows a corresponding virtual environment and a virtual robot in the virtual environment.
Employee education
In some embodiments employees are optionally educated on a virtual robot, potentially not even at an industrial floor location, and potentially without being limited to a time when a real robot is free from industrial tasks - potentially saving time, money and company resources, and eliminating fear of damaging a real robot.
Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth in the following description and/or illustrated in the drawings and/or the Examples. The invention is capable of other embodiments or of being practiced or carried out in various ways.

Reference is now made to Figure 1, which is a simplified block diagram illustration of a system for displaying a virtual robot in a real-world environment according to an example embodiment of the invention.
Figure 1 shows a computing unit 101, a display 103, a computer model 102 of a robot, a camera 104 and a user interface 105.
In some embodiments, the computing unit 101 optionally has data describing the computer model 102, and is optionally configured to calculate movement of the model 102 based on possible movements for the specific model 102.
In some embodiments, the computing unit 101 optionally controls the display 103 to display a virtual robot corresponding to a specific shape of the model 102. In some embodiments, the computing unit 101 optionally controls the display 103 to display the virtual robot at any shape possible for the model 102.
In some embodiments, the computing unit 101 optionally controls the display 103 to display a real environment as captured by the camera 104.
Figure 1 shows an optional block 106 representing the environment.
In some embodiments, the computing unit 101 optionally controls the display 103 to display a virtual environment, optionally un-related to the real environment of the system.
In some embodiments, the computing unit 101 optionally controls the display 103 to display a virtual environment, optionally based on one or more images or a video clip of the real environment of the system.
In some embodiments, the computing unit 101 optionally interfaces with a user by the user interface 105.
In some embodiments, the user interface 105 serves to provide user input for:
selecting a specific robot model from a number of robot models 102 available in the system and/or obtainable by loading into the system;
selecting a specific robot appendage for which to control a display of movement by a displayed virtual robot;
selecting a specific robot motion frame such as joint, base, world, or tool for which to control a display of movement by a displayed virtual robot;
selecting a specific robot parameter for which to control a display of movement by a displayed virtual robot;
selecting a specific robot or type of robot for which to control a display of movement by a displayed virtual robot; and
an extent of movement to cause the robot to move. The extent is optionally provided by a user gesture. In some embodiments the user gesture is a drag or slide along a touch screen, optionally a touch screen of a tablet. In some embodiments the user gesture is a hand or finger movement detectable by the camera 104. In some embodiments the user gesture is an eye movement detectable by the camera 104. In some embodiments the user gesture is an eye movement detectable by a front camera of a tablet.
In some embodiments, the camera 104 serves to detect a marking in a real-world environment with which to align the virtual robot.
In some embodiments the computer 101 controls the display 103 to display a real-time image of the real world, and also an image of the virtual robot situated in the real world at a location indicated by the marking.
In some embodiments the computer 101 controls the display 103 to display an image of a virtual world, and also an image of the virtual robot situated in the virtual world at a location indicated by the marking.
Reference is now made to Figure 2, which is a simplified flow chart illustration of a method for displaying a virtual robot in a real-world environment according to an example embodiment of the invention.
The method of Figure 2 includes:
providing a model of a robot (202);
providing an image of a real-world environment (204);
calculating a shape of the model of the robot (206) at a specific position in the real-world environment according to the computer model; and
using a display to display the shape of the robot as a virtual robot and to display an industrial environment (208) based on the image of a real-world environment.
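As a non-limiting illustration, one iteration of the method can be sketched as below, where load_robot_model, capture_frame, compute_pose and render_overlay are placeholders for whichever model store, camera, kinematics routine and renderer a concrete system uses.

```python
# Illustrative sketch: one display-loop iteration tying together the four
# steps of Figure 2 (202, 204, 206, 208).
def display_iteration(load_robot_model, capture_frame, compute_pose,
                      render_overlay, commanded_joints, anchor_xy):
    model = load_robot_model()                       # step 202: robot model
    frame = capture_frame()                          # step 204: real-world image
    shape = compute_pose(model, commanded_joints)    # step 206: robot shape
    return render_overlay(frame, shape, anchor_xy)   # step 208: AR display
```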
Reference is now made to Figure 3, which is a simplified image showing a system for displaying a virtual robot in a real-world environment according to an example embodiment of the invention.
Figure 3 shows a system including computer 301 having a display, showing a virtual robot 305 based on a computer model of a robot, and also displaying a real-world image 303 of the real world 302, optionally obtained by a camera (not shown) in the computer 301.
Figure 3 also shows a user’s finger 304 manipulating a touchscreen on the computer 301 to control the system.
Figure 3 can be understood to show the user selecting a specific part of the virtual robot 305 to control, and optionally being shown an identifier 306 associated with the selected specific part.
Figure 3 can alternatively be understood to show the user swiping the finger 304 to control an extent of movement of the specific part of the virtual robot 305, and optionally being shown data 306 associated with the extent of movement indicated by the swipe.
Reference is now made to Figure 4, which is a simplified image showing a screen of a system for displaying a virtual robot in an environment according to an example embodiment of the invention.
Figure 4 shows a screen 401 displaying a virtual robot 402 in an industrial environment 403.
What is shown as the industrial environment 403 may be any industrial environment such as an assembly line or a laboratory bench, or the setting may optionally be non-industrial, such as a clinic or an operating room for a medical environment, a game field such as a soccer field for a game-playing robot, or a table with a chess board for a game-playing robot environment.
Figure 4 also shows the screen 401 displaying optional controls 404 for controlling the virtual robot 402. In some embodiments the screen 401 is a touch screen and the controls 404 are optionally manipulated by touch and/or swipe movements.
Reference is now made to Figure 5, which is a simplified illustration of a virtual robot according to an example embodiment of the invention.
Figure 5 shows symbols standing for actuation components 501, 502, 503 of a virtual robot. The actuation components 501, 502, 503 may be motors, linear motors, pneumatic actuators, or any other type of actuation component.
A virtual robot may have one or more degrees of freedom of movement, and each one of the actuation components 501, 502, 503 potentially implements one or more degrees of freedom of movement of the virtual robot.
Figure 5 also shows a volume 505 in space which a tip 504 of the virtual robot spans during displayed virtual movement. The tip of the virtual robot may include, by way of some non-limiting examples, a tool, a robotic hand, and a robotic hand grasping a tool.
In Figure 5 the volume 505 is intended to depict a potential volume which systems according to example embodiments of the invention may optionally display. In some embodiments the volume 505 may represent not a volume spanned by a tip of the virtual robot, but, by way of a non-limiting example, an entire volume spanned by any position possible for the virtual robot, or a specific portion of the entire volume. The volume 505 potentially delineates, by way of some non-limiting examples:
an entire range of movement of the virtual robot;
a range of movement of the robot except a volume where the virtual robot would occupy a same space as a real object;
a portion of the range of movement of the virtual robot where the virtual robot appears to occupy a same space as a real object, as a highlighted volume in space.
In some embodiments, the volume 505 is displayed as a highlighted volume.
It is expected that during the life of a patent maturing from this application many relevant AR displays will be developed and the scope of the term AR display is intended to include all such new technologies a priori.
It is expected that during the life of a patent maturing from this application many relevant robots will be developed and the scope of the term robot is intended to include all such new technologies a priori.
The terms “comprising”, “including”, “having” and their conjugates mean “including but not limited to”.
The term “consisting of” is intended to mean “including and limited to”.
The term “consisting essentially of” means that the composition, method or structure may include additional ingredients, steps and/or parts, but only if the additional ingredients, steps and/or parts do not materially alter the basic and novel characteristics of the claimed composition, method or structure.
As used herein, the singular forms “a”, “an” and “the” include plural references unless the context clearly dictates otherwise. For example, the term “a unit” or “at least one unit” may include a plurality of units, including combinations thereof.
The words “example” and “exemplary” are used herein to mean “serving as an example, instance or illustration”. Any embodiment described as an “example” or “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments and/or to exclude the incorporation of features from other embodiments.
The word “optionally” is used herein to mean “is provided in some embodiments and not provided in other embodiments”. Any particular embodiment of the invention may include a plurality of “optional” features unless such features conflict.
Throughout this application, various embodiments of this invention may be presented in a range format. It should be understood that the description in range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the invention. Accordingly, the description of a range should be considered to have specifically disclosed all the possible sub-ranges as well as individual numerical values within that range. For example, description of a range such as from 1 to 6 should be considered to have specifically disclosed sub-ranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
Whenever a numerical range is indicated herein (for example “10-15”, “10 to 15”, or any pair of numbers linked by these or another such range indication), it is meant to include any number (fractional or integral) within the indicated range limits, including the range limits, unless the context clearly dictates otherwise. The phrases “range/ranging/ranges between” a first indicated number and a second indicated number and “range/ranging/ranges from” a first indicated number “to”, “up to”, “until” or “through” (or another such range-indicating term) a second indicated number are used herein interchangeably and are meant to include the first and second indicated numbers and all the fractional and integral numbers there between.
Unless otherwise indicated, numbers used herein and any number ranges based thereon are approximations within the accuracy of reasonable measurement and rounding errors as understood by persons skilled in the art.
It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination or as suitable in any other described embodiment of the invention. Certain features described in the context of various embodiments are not to be considered essential features of those embodiments, unless the embodiment is inoperative without those elements.
Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims.
All publications, patents and patent applications mentioned in this specification are herein incorporated in their entirety by reference into the specification, to the same extent as if each individual publication, patent or patent application was specifically and individually indicated to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention. To the extent that section headings are used, they should not be construed as necessarily limiting.

Claims

WHAT IS CLAIMED IS:
1. A system for displaying a virtual robot in a real-world environment comprising: a display;
a computer model of a robot;
a computing unit for calculating a shape of the robot at a specific position according to the computer model;
a camera for capturing an image of a real-world environment; and
a man machine interface (MMI) for a user to provide commands to the computing unit to calculate the shape of the robot,
wherein the display is arranged to display the shape of the robot as a virtual robot.
2. The system of claim 1 wherein the model of the robot is a real-world robot.
3. The system of claim 1 wherein the model of the robot is a computer model of a robot.
4. The system of any one of claims 1-3 wherein the computing unit is configured for controlling displayed movement of the model of the robot as movement of the virtual robot.
5. The system of claim 1 wherein the display is a Virtual Reality (VR) display and further comprising the computing unit being arranged to control the VR display to display a virtual environment for the virtual robot.
6. The system of claim 5 wherein the virtual environment is based on the image of the real-world environment.
7. The system of any one of claims 1-6 wherein the computing unit is configured for controlling displayed movement of all the motions possible for the model of the robot.
8. The system of any one of claims 1-7 wherein the computing unit is configured for controlling displayed movement of all the motions possible for the model of the robot to their full range of movement.
9. The system of any one of claims 1-8 wherein the display is configured to display the entire range of movement as a highlighted volume in space.
10. The system of claim 1 wherein the display is an Augmented Reality (AR) display.
11. The system of claim 10 and further comprising sensors for detecting a user’s gesture.
12. The system of claim 11 wherein the sensors are arranged to detect the user’s gesture in the real-world environment in which the AR display apparently displays the virtual robot.
13. The system of any one of claims 11 and 12 wherein the sensors are arranged to detect a real object in a real-world space in which the display displays the robot.
14. The system of claim 13 wherein the display is configured to refrain from displaying the robot in a same space as the real object.
15. The system of any one of claims 13 and 14 wherein the display is configured to display the entire range of movement of the virtual robot, except for when the virtual robot appears to occupy a same space as a real object, as a highlighted volume in space.
16. The system of any one of claims 13-15 wherein the display is configured to display only a portion of the range of movement of the virtual robot when the virtual robot appears to occupy a same space as a real object as a highlighted volume in space.
17. The system of any one of claims 1-16 wherein the computing unit is comprised within the display.
18. The system of any one of claims 1-17 wherein the AR display comprises a head mounted display.
19. The system of any one of claims 1-17 wherein the display comprises a tablet, and the screen of the tablet is arranged to display the virtual robot.
20. The system of claim 19 wherein the screen of the tablet is further arranged to display a virtual environment of the virtual robot.
21. The system of claim 19 wherein the screen of the tablet is further arranged to display the real-world environment as captured by the camera.
22. The system of any one of claims 19-21 wherein the screen of the tablet comprises a touch screen and the system is arranged to use the touch screen to input user gestures for controlling movement of the virtual robot.
23. The system of claim 22 wherein the system is arranged to use the touch screen to input which type of robot is to be displayed.
24. The system of any one of claims 22 and 23 wherein the system is arranged to use the touch screen to input which robot appendage is to be moved.
25. The system of any one of claims 22-24 wherein the system is arranged to use the touch screen to select which axis of a motion frame of a robot is to be controlled.
26. The system of claim 25 wherein the system is arranged to use the touch screen to input by how much the selected axis of a selected robot is to be jogged.
27. The system of any one of claims 1-18 and further comprising a marking in the real-world environment associated with a location planned for the robot in the real-world environment.
28. The system of any one of claims 1-27 and further comprising at least one device sensor selected from a group consisting of:
a gyroscope;
an accelerometer; and
a GPS unit.
29. The system of any one of claims 1-28 used for delineating a space in a real-world environment spanned by movement of the robot.
30. A method for displaying a virtual robot in a real-world environment comprising: providing a model of a robot;
providing an image of a real-world environment;
calculating a shape of the model of the robot at a specific position in the real-world environment according to the computer model; and
using a display to display the shape of the robot as a virtual robot and to display an industrial environment based on the image of a real-world environment,
wherein the virtual robot is displayed at the specific position in the real-world environment.
31. The method of claim 30 wherein the display is an Augmented Reality (AR) display.
32. The method of claim 30 and further comprising sensing a user’s gesture and using the user’s gesture to change the display of the virtual robot.
33. The method of claim 32 wherein the sensing is performed by a camera capturing images in a real-world space in which the AR display displays the robot.
34. The method of claim 32 wherein input sensors are arranged to detect the user’s gesture in a real-world space in which the AR display displays the robot.
35. The method of any one of claims 32 and 34 and further comprising using the user’s gesture to change the shape in which the AR display displays the model.
36. The method of any one of claims 32-35 wherein the model of the robot is a real- world robot.
37. The method of any one of claims 32-35 wherein the model of the robot is a computer model of a robot.
38. The method of any one of claims 32-37 and wherein calculating a shape of the model of the robot comprises controlling displayed movement of the virtual robot.
39. The method of claim 30 wherein the display is a Virtual Reality (VR) display and calculating a shape of the model of the robot comprises controlling the VR display to display a virtual environment for the virtual robot.
40. The method of claim 39 wherein the virtual environment is based on the image of the real-world environment.
41. The method of any one of claims 32-40 and further comprising sensing a marking in the real-world environment associated with a location planned for the robot in the real-world environment.
42. The method of any one of claims 32-41 wherein the display is used for delineating a space in the real-world environment spanned by movement of the virtual robot.
43. The method of any one of claims 32-42 wherein the display is used to display the entire range of movement as a highlighted volume in space.
44. The method of claim 32 wherein the sensing a user’s gesture comprises sensing a real object in a real-world space in which the display displays the virtual robot.
45. The method of claim 44 and further comprising refraining from displaying the virtual robot in a same space as the real-world space of the real object.
46. The method of any one of claims 44 and 45 and further comprising displaying the entire range of movement of the virtual robot, except for when the virtual robot appears to occupy a same space as a real object, as a highlighted volume in space.
47. The method of any one of claims 44-46 and further comprising displaying only a portion of the range of movement of the virtual robot where the virtual robot is calculated to occupy a same space as a real-world space of the real object, as a highlighted volume in space.
48. The method of any one of claims 30-47 and further comprising:
using a front camera of a tablet to detect a user’s eyes;
calculating where the user is looking; and
selecting a virtual robot appendage to control based on where the user is looking.
49. The method of any one of claims 30-48 and further comprising:
using a front camera of a tablet to detect a user’s eyes;
calculating where the user is looking;
tracking a shift in the user’s direction of looking; and
controlling displaying movement of a selected appendage of the virtual robot based on the shift in the user’s direction of looking.
PCT/IL2019/050348 2018-03-26 2019-03-26 Augmented reality for industrial robotics WO2019186551A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980030164.XA CN112105486B (en) 2018-03-26 2019-03-26 Augmented reality for industrial robots
IL277596A IL277596A (en) 2018-03-26 2020-09-24 Augmented reality for industrial robotics

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201862647861P 2018-03-26 2018-03-26
US201862647871P 2018-03-26 2018-03-26
US62/647,871 2018-03-26
US62/647,861 2018-03-26

Publications (1)

Publication Number Publication Date
WO2019186551A1 true WO2019186551A1 (en) 2019-10-03

Family

ID=66102733

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/IL2019/050348 WO2019186551A1 (en) 2018-03-26 2019-03-26 Augmented reality for industrial robotics
PCT/IL2019/050349 WO2019186552A1 (en) 2018-03-26 2019-03-26 Teach pendant as an add-on

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/IL2019/050349 WO2019186552A1 (en) 2018-03-26 2019-03-26 Teach pendant as an add-on

Country Status (3)

Country Link
CN (1) CN112105486B (en)
IL (1) IL277596A (en)
WO (2) WO2019186551A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021149938A1 (en) * 2020-01-21 2021-07-29 Samsung Electronics Co., Ltd. Electronic device and method for controlling robot
WO2023127563A1 (en) * 2021-12-28 2023-07-06 富士フイルム株式会社 Information processing device, information processing method, and information processing program

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111531551A (en) * 2020-04-22 2020-08-14 实时侠智能控制技术有限公司 Safety demonstrator using universal tablet computer and demonstration method
CN113093914B (en) * 2021-04-21 2022-10-28 广东电网有限责任公司电力科学研究院 High-presence visual perception method and device based on VR
CN113492410B (en) * 2021-09-09 2022-03-18 成都博恩思医学机器人有限公司 Method, system, mechanical equipment and storage medium for displaying robot operation process
WO2024044891A1 (en) * 2022-08-29 2024-03-07 Abb Schweiz Ag Adjusting a virtual relative position in a virtual robot work cell

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140277737A1 (en) * 2013-03-18 2014-09-18 Kabushiki Kaisha Yaskawa Denki Robot device and method for manufacturing processing object
DE102015012732A1 (en) * 2015-10-01 2016-04-14 Daimler Ag System and method for controlling, in particular for commissioning, a production plant
US20160257000A1 (en) * 2015-03-04 2016-09-08 The Johns Hopkins University Robot control, training and collaboration in an immersive virtual reality environment
US20160346936A1 (en) * 2015-05-29 2016-12-01 Kuka Roboter Gmbh Selection of a device or object using a camera
US20170372139A1 (en) * 2016-06-27 2017-12-28 Autodesk, Inc. Augmented reality robotic system visualization
US9919427B1 (en) * 2015-07-25 2018-03-20 X Development Llc Visualizing robot trajectory points in augmented reality

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102010025781B4 (en) * 2010-07-01 2022-09-22 Kuka Roboter Gmbh Portable safety input device for a robot controller
CN103213125B (en) * 2011-11-04 2016-05-18 范努克机器人技术美国有限公司 There is robot teaching's device that 3D shows
KR102281233B1 (en) * 2013-03-14 2021-07-23 삼성전자 주식회사 Apparatus and method controlling display
JP6440745B2 (en) * 2014-08-25 2018-12-19 エックス デベロップメント エルエルシー Method and system for augmented reality displaying a virtual representation of the action of a robotic device
US9597807B2 (en) * 2014-10-24 2017-03-21 Hiwin Technologies Corp. Robot teaching device
DE102015206578B3 (en) * 2015-04-13 2016-08-04 Kuka Roboter Gmbh Robotic hand-held device, and associated method
DE102015206571B3 (en) * 2015-04-13 2016-08-04 Kuka Roboter Gmbh Robotic hand-held device with an adapter device for a mobile terminal
CN107580690B (en) * 2015-06-15 2022-03-22 康茂股份公司 Portable safety control device for an industrial machine, in particular a robot
JP6338617B2 (en) * 2016-05-31 2018-06-06 株式会社アスコ Teaching device
CN107097227B (en) * 2017-04-17 2019-12-06 北京航空航天大学 human-computer cooperation robot system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021149938A1 (en) * 2020-01-21 2021-07-29 Samsung Electronics Co., Ltd. Electronic device and method for controlling robot
US11906973B2 (en) 2020-01-21 2024-02-20 Samsung Electronics Co., Ltd Electronic device and method for controlling robot
WO2023127563A1 (en) * 2021-12-28 2023-07-06 富士フイルム株式会社 Information processing device, information processing method, and information processing program

Also Published As

Publication number Publication date
CN112105486B (en) 2024-05-10
WO2019186552A1 (en) 2019-10-03
CN112105486A (en) 2020-12-18
IL277596A (en) 2020-11-30

Similar Documents

Publication Publication Date Title
WO2019186551A1 (en) Augmented reality for industrial robotics
Eswaran et al. Challenges and opportunities on AR/VR technologies for manufacturing systems in the context of industry 4.0: A state of the art review
US11850755B2 (en) Visualization and modification of operational bounding zones using augmented reality
Eswaran et al. Augmented reality-based guidance in product assembly and maintenance/repair perspective: A state of the art review on challenges and opportunities
Solanes et al. Teleoperation of industrial robot manipulators based on augmented reality
Seo et al. Direct hand touchable interactions in augmented reality environments for natural and intuitive user experiences
Nee et al. Augmented reality applications in design and manufacturing
EP1709519B1 (en) A virtual control panel
Szafir Mediating human-robot interactions with virtual, augmented, and mixed reality
Frank et al. Realizing mixed-reality environments with tablets for intuitive human-robot collaboration for object manipulation tasks
CN105518575B (en) With the two handed input of natural user interface
EP3055744B1 (en) A method and a device for verifying one or more safety volumes for a movable mechanical unit
Leutert et al. A spatial augmented reality system for intuitive display of robotic data
EP3283938B1 (en) Gesture interface
US10166673B2 (en) Portable apparatus for controlling robot and method thereof
US20050251290A1 (en) Method and a system for programming an industrial robot
Re et al. Impact of monitor-based augmented reality for on-site industrial manual operations
CN102955568A (en) Input unit recognizing user's motion
Fang et al. Head-mounted display augmented reality in manufacturing: A systematic review
US20190163266A1 (en) Interaction system and method
Wu et al. Human-computer interaction based on machine vision of a smart assembly workbench
Huy et al. See-through and spatial augmented reality-a novel framework for human-robot interaction
WO2017021902A1 (en) System and method for gesture based measurement of virtual reality space
US20190243335A1 (en) Process planning apparatus based on augmented reality
Chan et al. A multimodal system using augmented reality, gestures, and tactile feedback for robot trajectory programming and execution

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 19716996

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 19716996

Country of ref document: EP

Kind code of ref document: A1